
Study finds ‘bot’ term used 16,232 times in 2.8M Telegram messages

Researchers have sifted through almost 2.8 million posts from 16 Telegram groups and channels operating in Italy and Spain, looking for patterns in how automated accounts are discussed. Their tally shows the word "bot" appearing tens of thousands of times, a volume that points to organized activity rather than scattered mentions. Beyond word frequency, the study also tracks the URLs that members share, many of which disguise their true destination.

Nearly half of those hidden redirects point to services that generate AI‑powered “girlfriend” avatars, and a sizable slice funnels users toward bots that alter images to add nudity. This blend of terminology and link‑sharing paints a picture of a monetized abuse network thriving on the platform. The findings underscore how commonplace the language of automation has become and why the underlying links matter.

The word "bot" appears 16,232 times across the nearly 2.8 million messages analyzed from 16 Italian and Spanish Telegram groups and channels. Among the disguised redirect links users shared within these groups, 49.71 percent led to AI girlfriend generators and 19.14 percent pointed to nudifying bots.

The word "bot" appears 16,232 times across the nearly 2.8 million messages analyzed from 16 Italian and Spanish Telegram groups and channels. Among the disguised redirect links users shared within these groups, 49.71 percent led to AI girlfriend generators and 19.14 percent pointed to nudifying bots, the study found. The researchers also documented how users share prompts for commercial chatbots like Grok or Gemini to manipulate images.

Under the hashtag #PornoTok, users deliberately create synthetic content of female TikTok influencers, including fake audio clips featuring their voices. According to the report, AI has lowered the technical barrier so far that the number of potential victims is growing dramatically.

The numbers are stark. They show that nudifying bots can turn ordinary photos into synthetic nude images at scale, and that the activity is tied to a clear revenue stream.

Archives of non‑consensual intimate images reportedly sell for 20 to 50 euros, while affiliates claim commissions of up to 40 percent. The researchers are calling for a ban on these tools, but enforcement mechanisms remain undefined.

Could enforcement ever keep pace? Given the fluid nature of Telegram's private channels, how effective such bans would be remains uncertain. The findings highlight both how easily AI can be weaponized for profit and the gap between technical capability and policy response.

Ultimately, the study documents a monetized abuse ecosystem, yet it leaves open how regulators might intervene.

Common Questions Answered

How many times was the term 'bot' mentioned in the Telegram messages analyzed?

The study found the term 'bot' appeared 16,232 times across nearly 2.8 million messages from 16 Italian and Spanish Telegram groups and channels. This high frequency suggests a significant and organized presence of automated accounts and bot-related discussions.

What types of links were most commonly shared in these Telegram groups?

According to the research, 49.71 percent of disguised redirect links led to AI girlfriend generators, while 19.14 percent pointed to nudifying bots. These links represent a concerning trend of potentially exploitative and non-consensual digital content generation.

What economic incentives exist for creating nudifying bots?

The study revealed that archives of non-consensual intimate images can sell for 20 to 50 euros, with affiliate marketers claiming up to a 40 percent commission. This financial motivation appears to drive the creation and distribution of synthetic nude image generation technologies.