The internet calls it AI slop. We think **AI pollution** is a better term.
“Slop” makes this sound like a matter of taste. “Pollution” names the mechanism: it lowers signal quality for everyone and pushes cleanup costs onto people who did not create it. That is what AI pollution is doing to the web.
The internet has always had waste streams: spam, clickbait, SEO farms, content mills, engagement bait, and so on. Generative AI did not invent any of that, but it changed the production curve. Writing a mediocre article, creating a set of images, or posting a reply thread used to require enough human effort to act as a speed limit. Now one prompt can produce dozens of variants in minutes. When generation becomes cheaper than judgment, systems built around visibility begin rewarding pollution.
That is why AI pollution is spreading so quickly. It has three defining traits: superficial competence, asymmetric effort, and effectively infinite reproducibility. It passes the skim test by sounding complete, yet fails when someone actually depends on it.
Search engines and social feeds were already vulnerable to this. Google’s own guidance is revealing: it does not ban AI-generated content in principle. It targets scaled, low-value, unoriginal content created to manipulate rankings, and it draws that distinction explicitly. In its March 2024 search update, Google tightened those anti-spam rules further. The lesson is straightforward: if the ranking function rewards cheap synthetic volume, AI pollution will fill the channel immediately.
There is a deeper systems problem underneath this. Models are trained on the internet, and the internet is becoming increasingly synthetic. Research on model collapse, the curse of recursion, and follow-up work on carefully mixing synthetic and real data keeps circling the same point: when models repeatedly learn from generated approximations of reality, they lose the tails of the distribution. Rare facts go first, edge cases disappear next, and niche viewpoints get compressed into generic averages. The web becomes easier to generate from, but harder to learn from. The problem is environmental, and that is why “pollution” is the right metaphor.
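The tail-loss dynamic can be sketched with a toy fit-and-resample loop. This is a simplified illustration of my own, not an experiment from the model-collapse papers: a Gaussian stands in for the "model," and each generation is fit to samples drawn from the previous generation's fit.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_collapse(n_samples=100, n_generations=500):
    """Fit a Gaussian to data, sample a synthetic dataset from the fit,
    refit on the synthetic data, and repeat. Returns the std estimate
    at each generation."""
    data = rng.normal(loc=0.0, scale=1.0, size=n_samples)  # the "real" data
    stds = [data.std()]
    for _ in range(n_generations):
        mu, sigma = data.mean(), data.std()      # "train" on current data
        data = rng.normal(mu, sigma, n_samples)  # next generation sees only synthetic output
        stds.append(data.std())
    return stds

stds = simulate_collapse()
print(f"std at generation 0: {stds[0]:.3f}, after 500 generations: {stds[-1]:.3f}")
```

Each refit slightly underestimates the spread on average, and the errors compound: the estimated standard deviation drifts toward zero, which is exactly the "tails go first" effect — rare values stop being sampled long before the bulk of the distribution degrades.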
AI pollution does not just create bad artifacts; it degrades the shared substrate that humans, search systems, and future models all rely on.