Your Instagram Algorithm is a Little… Too Good

Note: This article was originally published in February 2021.

The addictive rabbit holes of the Instagram Explore page and the TikTok For You page have captured countless hours from more than a billion users. With the emergence of data privacy concerns, more users are aware that social media giants track their activity to fuel engagement algorithms. But just how far do these corporate giants really go?


Algorithms and artificial intelligence are fueled by consumer data. The more input they receive, the better these algorithms become at driving app and ad engagement, simultaneously encouraging screen addiction and fast consumerism. The alarming part is how successfully these algorithms hold users’ attention and loyalty by knowing users better than they know themselves.

This user surveillance and personality profiling goes further than noting keywords in direct messages or the search bar. The computer is not just observing what users write about their day. The analysis closes in on more minute details, scrutinizing whether tasks were written as a bulleted list or a paragraph, and whether they used proper punctuation or emojis. It learns every detail of an individual personality, something a Myers-Briggs test could never capture.


The observation of every action is more pervasive than the public realizes. When a user summons “Hey Alexa,” the voice assistant is not just recording transcriptions; it is also estimating valence, or emotional value, in the voice. As part of Amazon’s effort to reliably detect users’ emotional states from their voices, Emotion AI is becoming increasingly advanced across the tech industry. Future versions of Apple’s Siri may go even further than voice recognition, using FaceTime cameras to simultaneously interpret users’ facial reactions and moods. In the near future, an AI could detect whether someone is in a manic episode while they ask about the weather, then use that learned data to further refine its customized actions and emotional recognition.


This data collection is constantly evolving to serve users more personally. Many find surrendering their privacy a small trade-off for a more personalized virtual assistant or a more entertaining feed. However, the normalization of civilian surveillance is troubling in certain respects.


In China, opaque data collection and analysis already has a place in everyday life thanks to its remarkable convenience and accuracy. From WeChat, the super-app used to message, pay, and book appointments, to the law-enforcement database that publicly shames jaywalkers, individual data is collected and stored everywhere. It is virtually impossible to avoid user surveillance in modern life.


Heavy dependence on artificial intelligence goes even further when it comes to determining loans. China’s money-lending app Yongqianbao runs through more than 1,200 data points, scouring an applicant’s phone for everything from the usual credit card information to more microscopic details such as the number of unanswered calls. Even a low phone battery can count against a loan application. A rejection could hinge on nothing more than cell phone habits, and unless users read through 3,000 words of terms and conditions, they would never know.
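To make the idea concrete, here is a minimal toy sketch of how behavioral phone signals could feed a loan-risk score. The feature names and weights below are invented for illustration only; they are assumptions, not Yongqianbao’s actual model or data points.

```python
# Toy illustration only: a hypothetical weighted score over behavioral
# phone signals. Feature names and weights are invented, not real.

def loan_risk_score(features: dict) -> float:
    """Combine weighted behavioral signals (each clamped to 0..1)
    into a single 0-1 risk score; higher means riskier."""
    weights = {
        "unanswered_call_ratio": 0.4,  # ignoring calls -> assumed riskier
        "low_battery_sessions": 0.3,   # habitually low battery -> riskier
        "new_device": 0.2,             # little history on this phone
        "night_activity_ratio": 0.1,   # heavy late-night usage
    }
    return sum(
        w * min(max(features.get(name, 0.0), 0.0), 1.0)
        for name, w in weights.items()
    )

applicant = {
    "unanswered_call_ratio": 0.8,
    "low_battery_sessions": 0.6,
    "new_device": 0.0,
    "night_activity_ratio": 0.2,
}
print(round(loan_risk_score(applicant), 2))  # 0.52
```

The unsettling part the article describes is exactly this shape: each weight is opaque to the applicant, yet any one signal, like battery habits, can tip the final number past a rejection threshold.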


Courts in China are also turning toward AI, using computerized analysis of past precedents to make rulings. In the US adversarial judicial system, lawmakers such as Pennsylvania state representative Todd Stephens have remarked that human-judged rulings can be arbitrary. In 2010, a state panel began devising an algorithm to predict how likely a defendant is to reoffend, saving trial time and reducing human liability. However, this has led to major controversy over data-collection bias and the lack of human empathy and moral reasoning. The possibility of a jail sentence could depend on how a computer calculates one’s risk, regardless of how well a lawyer presents a case.


While the US still holds some privacy-protection legislation, every virtual activity within apps is still being recorded. There is no certain protection for the future. As technology advances, the world could become easier and cheaper, but at what cost? With incessant scrutiny and formula-dependent decision-making, future society could be void of human mistakes. However, it could also be void of human empathy, ethics, and equity.


Sources

Wiggers, Kyle. “Amazon’s AI improves emotion detection in voices.” VentureBeat. May 21, 2019. <https://venturebeat.com/2019/05/21/amazons-ai-improves-emotion-detection-in-voices/>

Gallagher, William. “Future versions of Apple’s Siri may interpret your emotions.” AppleInsider. November 14, 2019. <https://appleinsider.com/articles/19/11/14/future-versions-of-apples-siri-may-read-interpret-your-facial-expressions>

Yuan, Li. “Want a Loan in China? Keep Your Phone Charged.” Wall Street Journal. April 6, 2017. <https://www.wsj.com/articles/want-a-loan-in-china-keep-your-phone-charged-1491474250>

Yu, Meng, and Guodong Du. “Why Are Chinese Courts Turning to AI?” The Diplomat. January 19, 2019. <https://thediplomat.com/2019/01/why-are-chinese-courts-turning-to-ai/>

Yu, Alan. “Can algorithms help judges make fair decisions?” WHYY. February 20, 2020. <https://whyy.org/segments/can-algorithms-help-judges-make-fair-decisions/>