YouTube’s Spammy Sex Bots Make a Ton of Money

Here’s how scammers turn those ubiquitous, meaningless comments into profits

Photo: Leon Bublitz/Unsplash

My YouTube channel is extremely wholesome. I post helpful, friendly tutorial videos. I sign off, rather cheesily, with, “Happy writing!”

That’s why it’s so annoying when my videos get flooded with decidedly not-PG and nonsensical comments from what the YouTube community has dubbed “sex bots.”

If you’ve posted a YouTube video, or even scrolled through the comments of one, you’ll probably know what I’m talking about:

Screenshot of sex bot comments taken by the author.

These sex bots are pervasive and obviously not real people. But it’s such a weird strategy that, though I was irritated, I was also deeply curious. As I recently did with spammy Instagram “dm to collab” requests, I decided to investigate the strategy behind these YouTube sex bots.

What were these sex bots trying to accomplish? Why were they leaving oddly horny comments? Why were they targeting my videos? (How could I get them to stop?)

The answer was not as straightforward as I’d hoped. You can read below or check out the video I made about it to find out what I learned.

The primary aim is to drive views to a playlist of longer, monetized videos

For the rest of this story to make sense, you should understand three things:

  1. Most YouTubers make money from ads on their videos.
  2. This incentivizes YouTubers to do their best to game the algorithm in order to get more views.
  3. The YouTube algorithm is mysterious and capricious, leading to many bizarre strategies that are surprisingly effective, like “Elsagate,” and the one you’re about to read.

The very first thing I did was click on the YouTube accounts that had left these obviously spammy comments to see what they had in common.

Each profile had a couple of subscribers and had been created in August 2020. The names were all vaguely uncommon and feminine-sounding (for example, Izayah Emerson), and the profile pictures all featured dark-haired women with pale skin and improbably large bosoms.

Screenshot by the author of Izayah Emerson’s YouTube profile.

The weirdest thing was that none of them had uploaded any videos. Instead, they only had playlists on their profiles. The playlists all started with an NSFW thumbnail of what looks like, frankly, a low-budget porn film.

Screenshot from Fortun MJ xQue’s YouTube profile.

I thought I had pieced together their strategy at this point. These accounts, acting as modern-day digital sirens, would lure unsuspecting viewers who had clicked on their intriguingly sexual comments onto the proverbial rocks of these pornographic playlists.

But YouTube isn’t PornHub. It has pretty strict policies in place banning “sexually gratifying content.” This meant my hypothesis couldn’t be right. You can’t upload porn onto YouTube.

I needed to investigate further.

Bracing myself, I lowered the volume on my computer and clicked on one of the videos on the first playlist, entitled rather cryptically, “18 6.”

Nothing happened. After two seconds of looking at that image above, the playlist moved on to the next video: a 90-minute deep house music compilation featuring a young woman in a bikini dancing on a beach in slow motion.

I realized that the original provocative thumbnail belonged to a “video” that was literally a still image shown for two seconds: long enough to count as a video and entrap viewers, but short enough that it might escape YouTube’s automated flagging system. It was the bait to get viewers onto the playlist.

The playlist contained a bizarre mix of very short fake sex videos and ludicrously long and unrelated music compilations and atmospheric music.

Screenshot of playlists, one compiled by Lincoln Stoltenberg PhD Dr., the other by Top Playlist2.

I checked out other sex bots’ playlists and found a similar story: every playlist had been compiled by a separate account and featured videos uploaded by a few specific but totally unrelated channels, such as gaming or funny-video channels. All the playlists began with the faux-erotic thumbnail video.

It was bizarre. The plot was thickening so much it was like pudding at this point.

Where were these short porno-thumbnail videos coming from?

I was intrigued by “Lincoln Stoltenberg PhD Dr.,” the compiler of one of the playlists I examined. Their profile boasts 238 subscribers, far more than the sex bots’ handful, and it, too, was created in August 2020.

The sex bots hadn’t uploaded any content of their own; they only linked to content through playlists. But it appeared I had finally found the source of those short videos with the sexual thumbnails.

Lincoln Stoltenberg PhD Dr.’s videos were entirely faux-pornographic two-second videos, all with tens of thousands of views. Some, like the “18 6” video I’d clicked on earlier, had over 600,000 views. I want to point out here that YouTube doesn’t monetize videos that are under a minute long, so these short videos are not the end goal of whoever is behind this nefarious plot.

Screenshot of Lincoln Stoltenberg PhD Dr.’s video uploads.

Aside from faux porno videos, Lincoln Stoltenberg PhD Dr., who seems to have no online presence beyond this YouTube account, also compiled playlists featuring the longer videos I saw earlier, uploaded by accounts like Best Deep House.

The accounts with the long videos had an unexplained change in the kind of content they posted

These accounts all experienced a strange shift in marketing strategy very recently. For example, Best Deep House wasn’t always passionate about house music — look back just a few weeks, and you’ll see the account was posting videos of animated fruits. The new owner of the account Best Deep House didn’t even update their banner or about page.

Screenshot of Best Deep House.

Other sex bot comments led to a playlist featuring World of Warcraft gameplay by an account called KAIN, which again featured scantily clad women in the thumbnail. But if you scrolled back far enough, the World of Warcraft gameplay videos stopped, and the channel featured gaming videos with far fewer views and no woman gamer in sight.

Screenshot of KAIN’s and Libby Witting’s YouTube channels.

Lincoln Stoltenberg PhD Dr. also compiled playlists featuring content from an account called Libby Witting, with similar mixes of Abraham Hicks affirmations and misleading thumbnails for gaming videos.

Libby Witting’s account also changed its strategy about a month ago. It was at one point owned by a real person: its About page led me to an Instagram account belonging to a young man using the name Baruch Zilberman. I messaged him on Instagram, and he told me he’d sold his YouTube channel for $50 not long ago.

Screenshot of IG communications between me and Mr. Zilberman.

Many of the YouTube videos are obviously pirated — for instance, Libby Witting’s gaming videos actually feature a French gamer who goes by the handle Seckooh as well as an American gamer called Konazb. Other videos feature stock content of a woman’s silhouette before a sunset paired with affirmation audio from Abraham Hicks.

I reached out to both Seckooh and Konazb to let them know their videos were being stolen. Konazb told me that she hadn’t known the Libby Witting account had been using her videos.

Screenshots of me reaching out to Konazb and her reply.

It looks like the scheme uses playlists as a container for ad-stuffed videos of stolen content, served up for mindless listening or watching. These longer videos make the real money.

I suspect that Best Deep House, Libby Witting, and KAIN all used to be legitimate YouTube accounts that were successful enough to field inquiries for purchase — or were hacked. You can pinpoint the very moment these accounts pivoted from their original content to the money-making scheme.

Who’s profiting here?

I tried to follow the money. Who stood to profit? There were three account types here: the sex bots that left comments to lead you to a playlist, the playlist compilers, and the owners of the videos that ended up on the playlists.

The only accounts that make money from these views are the owners of the videos from the wacky playlists.

Here’s the full scheme as I understand it:

  1. YouTube sex bots leave horny comments based on videos that have tags like #short and #shorts in the title. That’s how they found me — I post videos with hashtags in the title to aid in discovery.
  2. The sex bot accounts themselves don’t post videos, but they do post playlists.
  3. The playlists are all created by a separate account, and all start with a misleading thumbnail that looks like pornography, but is actually a two-second video.
  4. The rest of the playlist features videos from yet another account: mostly a mix of faux sex videos and extremely long videos, pirated from their rightful owners, of attractive women gamers, positive-thinking affirmations, or mindless listening music like house or atmospheric tracks.
  5. The owners of the accounts with the longer videos used to be legitimate accounts with real content, but were either hacked or purchased a couple of months ago.
  6. These accounts earn money through ads on those views. Here’s an example of what one of the videos has earned, according to Influencer Marketing’s YouTube Money Calculator:
Screenshot from Influencer Marketing’s YouTube Money Calculator.
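To put the calculator’s numbers in context, here’s a minimal back-of-envelope sketch of how view counts translate into ad revenue. The RPM (revenue per 1,000 views) figure is purely an assumption on my part; real rates vary widely by niche, audience geography, and ad formats.

```python
def estimate_earnings(views: int, rpm_usd: float) -> float:
    """Estimate gross ad revenue: views divided by 1,000, times RPM."""
    return views / 1000 * rpm_usd

# e.g. a 600,000-view compilation at an assumed $2 RPM
print(f"${estimate_earnings(600_000, 2.0):,.2f}")  # $1,200.00
```

Even at a conservative RPM, a handful of 600,000-view compilations adds up, which is why the long videos, not the two-second bait clips, are the payload.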

My only remaining question is why YouTube continues to allow this.

My guess? They’re profiting. While it’s annoying for me, the end user, to receive these spammy comments on my video, the end goal for both the owner of those spam accounts and YouTube is the same: ad revenue.

It would be easy to implement a filter that stops bots from leaving multiple identical comments, or at least one that screens out the five or so comments these bots cycle through, like “Need Lovely😍💋 💝💖♥️❤️,” “Awesome 😍💋 💝💖♥️❤️,” and “Fantastic 😍💋 💝💖♥️❤️.”
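As a sketch of how simple such a filter could be, here’s one possible approach (the template list and normalization are my own assumptions, not anything YouTube actually runs): lowercase the text, strip the emoji, and compare against the handful of known templates.

```python
# Hypothetical spam filter: normalize a comment and check it against the
# few templates the bots cycle through. Not YouTube's actual system.
SPAM_TEMPLATES = {"need lovely", "awesome", "fantastic"}

def normalize(text: str) -> str:
    """Lowercase and keep only letters, digits, and spaces (drops emoji)."""
    kept = "".join(ch for ch in text.lower() if ch.isalnum() or ch.isspace())
    return " ".join(kept.split())

def is_spam_comment(comment: str) -> bool:
    """Flag a comment if, minus emoji and casing, it matches a template."""
    return normalize(comment) in SPAM_TEMPLATES

print(is_spam_comment("Need Lovely😍💋 💝💖♥️❤️"))  # True
print(is_spam_comment("Great tutorial, thanks!"))   # False
```

A real system would also need rate limits and account-level signals; a bare word like “Awesome” obviously has benign uses, which is exactly why exact-template matching alone would be too blunt.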

But for every view that earns the fraudulent Libby Witting money, YouTube takes its 45% share, too. The company has a very real incentive to let these bots keep doing their effective business of getting people to watch more videos. If Libby Witting is doing the hard work of shepherding people to these spammy videos, why should YouTube intervene?
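The incentive is easy to see in numbers. Under the standard YouTube Partner Program split, YouTube keeps 45% of watch-page ad revenue and the channel keeps 55%; the dollar figure below is purely illustrative.

```python
YOUTUBE_CUT = 0.45  # YouTube's standard share of watch-page ad revenue

def split_ad_revenue(gross_usd: float):
    """Return (creator_share, youtube_share) for a gross ad amount."""
    youtube_share = gross_usd * YOUTUBE_CUT
    return gross_usd - youtube_share, youtube_share

creator, youtube = split_ad_revenue(1000.0)
print(f"creator: ${creator:.2f}, youtube: ${youtube:.2f}")
# creator: $550.00, youtube: $450.00
```

Every fraudulent view pays the platform before it pays the scammer, which is the whole conflict of interest in a nutshell.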

I’m not the first person to remark on these comments, so YouTube is certainly aware of the issue. A YouTuber by the handle of @thatstarwarsgirl77 tweeted directly at YouTube and received only a canned response from the Support team.

YouTube simply doesn’t seem to care.

This sordid journey is also a commentary on how attractive women are marketed, sometimes without their knowledge. All the faux sex videos primarily feature women’s faces. The gamer content was stolen from attractive women. Best Deep House features footage of women in bikinis. Even the Abraham Hicks affirmations video showed a silhouette of a woman with long hair blowing gently in the breeze.

And of course, the entire scam starts with the sex bots portraying attractive, scantily clad women pictured in their profile.

When I began this investigation, I expected it to be as straightforward as my excursion into the world of Instagram “dm to collab” scams, a tale of marketing and social media in the digital age.

But these sex bots revealed two complex issues — one, YouTube has no incentive to stop this kind of spammy behavior when it’s profiting, even though the scheme breaks many of their rules against spam, impersonation, misleading content and thumbnails, copyrighted content, misleading playlists, and fake engagement. And two, sex still sells. Even though anyone can tell those sex bots are fraudulent, Lincoln Stoltenberg PhD Dr. is still driving tens or even hundreds of thousands of views to these videos through his spammy sex bot comment strategy.

So next time you see an attractive woman named something like Kyson Harper commenting on a YouTube video that she “Needs Lovely 😍💋 💝💖♥️❤️,” you know the why, the where, and the how. Now it’s down to YouTube to decide what — if anything — it’ll ever do about it.

Biology MSc. Psychology nerd. She/her.
