Some AI-generated videos are deliberately made to mimic reality, while others put the technology to more absurd uses.
Elephants drop-kicking crocodiles while breaking the laws of physics, bewildering deepfakes of politicians and deceased public figures, and seemingly animated children’s videos of Jesus fighting the Grinch: generative artificial intelligence (AI) is sweeping across online video platforms and may now account for a sizable portion of YouTube’s short-form video feed, recent research shows.
After being accused last year of driving some users into psychiatric wards and allegedly contributing to the suicides of multiple depressed teenagers, generative AI tools are also inspiring new genres of online content.
AI-generated images and clips were found in 21 percent of the 500 short-form videos screened in a study released last November by the video editing software company Kapwing, with some of the channels analyzed amassing millions of subscribers and billions of views.
Some, such as the India-based channel Bandar Apna Dost, were estimated to generate millions of dollars in YouTube ad revenue annually. These channels operate worldwide, with those based in Spain and South Korea garnering the “most devoted viewerships,” according to the study.
“Generative AI tools have dramatically lowered the barrier to entry for video production,” Rohini Lakshané, an interdisciplinary technology researcher, told The Epoch Times.
“So, the channel can churn out massive amounts of content and maintain a high frequency of posting. Channels using these methods can flood recommendation feeds simply by volume, irrespective of intrinsic quality.”
Here’s what we know about “brainrot” and “AI slop,” what’s at stake for viewers and content creators, and why you might want to pay closer attention when browsing social media.
‘Brainrot’ and ‘AI Slop’
Kapwing determined that 33 percent of the videos it screened after creating a new YouTube account appeared to have the hallmarks of “brainrot” content, which Oxford defines as “trivial or unchallenging” material considered to deteriorate a “person’s mental or intellectual state.”
Existing long before the advent of generative AI, “brainrot” includes memes, humor, nonsensical skits, videos of children or animals engaging in “silly” actions or behaviors, and other forms of content that minimally engage users intellectually or convey little or no meaning beyond randomness or absurdity.
Combining generative AI with “brainrot” characteristics gives rise to the emerging genre many refer to as “AI slop,” which Kapwing defines as “careless, low-quality content” generated with AI tools and intended to farm views and subscriptions or sway political opinion.