All popular social media platforms, including those used heavily by minors such as TikTok,
Snapchat, Instagram, and Facebook, feature endless-scroll feeds strategically designed to
intermittently surface content that users are algorithmically predicted to engage with. An
internal TikTok document said that the app maximizes for two metrics: user retention and time
spent.81 Similarly, a product manager for YouTube's recommendation system explained that the
platform's recommendation algorithm "is designed to do two things: match users with videos
they're most likely to watch and enjoy, and . . . recommend videos that make them happy. . . .
[S]o our viewers keep coming back to YouTube, because they know that they'll find videos that
they like there."82 And Adam Mosseri of Instagram said, "[W]e make a set of predictions. These
are educated guesses at how likely you are to interact with a post in different ways. . . . The more
likely you are to take an action, and the more heavily we weigh that action, the higher up you'll
see the post."83
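The weighted-prediction ranking Mosseri describes can be illustrated with a minimal sketch. The action names, weights, and probabilities below are hypothetical illustrations chosen for this example, not Instagram's actual signals or values:

```python
# Sketch of engagement-weighted feed ranking, as described in the Mosseri
# quote above. All names and numbers here are hypothetical.

# Platform-chosen weights: how heavily each predicted action counts.
ACTION_WEIGHTS = {"like": 1.0, "comment": 3.0, "share": 5.0, "dwell": 0.5}

def rank_score(predicted_probs: dict) -> float:
    """Weighted sum of predicted per-action engagement probabilities."""
    return sum(ACTION_WEIGHTS[a] * p for a, p in predicted_probs.items())

def rank_feed(posts: list) -> list:
    """Order candidate posts so the highest predicted engagement appears first."""
    return sorted(posts, key=lambda post: rank_score(post["predictions"]), reverse=True)

feed = rank_feed([
    {"id": "A", "predictions": {"like": 0.30, "comment": 0.05, "share": 0.01, "dwell": 0.90}},
    {"id": "B", "predictions": {"like": 0.10, "comment": 0.20, "share": 0.15, "dwell": 0.40}},
])
print([post["id"] for post in feed])  # → ['B', 'A']
```

Note that nothing in this scoring loop asks whether a post is accurate, age-appropriate, or beneficial; the ordering depends only on predicted engagement, which is the dynamic the testimony describes.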
Tech companies know that variable rewards are a valuable tool to increase users' activity and
time spent online and, ultimately, to maximize profits. But they are similarly aware of the risks
associated with these types of rewards. For example, in 2020, responding to internal research
indicating that teen users had difficulty controlling their use of Facebook and Instagram, a Meta
employee wrote to a colleague: "I worry that driving [users to engage in more frequent]
sessions incentivizes us to make our product more addictive, without providing much more
value. . . . Intermittent rewards are the most effective (think slot machines), reinforcing behaviors
that become especially hard to extinguish."84 Ultimately, these sophisticated variable-reward
techniques prey upon minors' developmental sensitivity to rewards.
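The slot-machine dynamic the Meta employee references can be sketched by contrasting a predictable reward schedule with an intermittent (variable-ratio) one. This is purely illustrative; the parameters are hypothetical and not drawn from any platform's actual system:

```python
import random

# Illustrative contrast between a fixed reward schedule and the intermittent
# (variable-ratio) schedule referenced in the quote above. Hypothetical values.

def fixed_reward(pull: int) -> bool:
    """Predictable schedule: a reward on every 4th feed refresh."""
    return pull % 4 == 0

def variable_reward(rng: random.Random, hit_rate: float = 0.25) -> bool:
    """Unpredictable schedule: same average rate, but any refresh might pay off."""
    return rng.random() < hit_rate

rng = random.Random(0)  # seeded for reproducibility
pulls = 20
hits = sum(variable_reward(rng) for _ in range(pulls))
print(f"{hits} rewarding refreshes out of {pulls}, at unpredictable positions")
```

In behavioral terms, the fixed schedule lets a user stop once the expected reward arrives, while the variable schedule gives every refresh a chance of paying off, which is what makes the resulting habit "especially hard to extinguish."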
Algorithmic content recommendation systems
Algorithms designed to maximize engagement fill young people's feeds with the content that is
most likely to keep them online, even when that means exposing them to a post, image, or
video that is dangerous or abusive. Platforms such as YouTube, TikTok, and Instagram serve
users content based on automated suggestions. Algorithms choose which content to suggest to
children and teens based on the vast amount of data they collect on users, such as likes, shares,
comments, interests, geolocation, and information about the videos a user watches and for
how long. As described above, these algorithms are designed to extend engagement by
discerning which pieces of content a user is most likely to engage with — not whether the
content or overall online experience is beneficial to the user.85
81 Ben Smith, How TikTok Reads Your Mind, New York Times (Dec. 5, 2021),
https://www.nytimes.com/2021/12/05/business/media/tiktok-algorithm.html.
82 Creator Insider, Behind the Algorithms - How Search and Discovery Works on YouTube, YouTube (Apr. 16, 2021),
https://youtu.be/9Fn79gJa2Fc.
83 Adam Mosseri, Shedding More Light on How Instagram Works, Instagram (June 8, 2021),
https://about.instagram.com/blog/announcements/shedding-more-light-on-how-instagram-works.
84 Spence v. Meta Platforms, N.D. Cal. Case No. 3:22-cv-03294 at 82 (June 6, 2022) (citing Facebook Papers: "Teen
Girls Body Image and Social Comparison on Instagram—An Exploratory Study in the US" (March 2020), at p. 8).
85 A former YouTube engineer observed: "recommendations are designed to optimize watch time, there is no
reason that it shows content that is actually good for kids. It might sometimes, but if it does, it is coincidence."
K.G. Orphanides, Children's YouTube Is Still Churning Out Blood, Suicide and Cannibalism, Wired (Mar. 23, 2018),
https://www.wired.co.uk/article/youtube-for-kids-videos-problems-algorithm-recommend.
Testimony of Josh Golin, Fairplay, February 14, 2023