The internet is already very noisy.
To even be noticed, you have to value quantity over quality in some respect.
Figure 1: You have surely noticed such sites
If you search for "programmatic SEO", you will run across the early stages of generated content.
As an example, Tripadvisor has a page for almost everything travel related. Yelp has a page for all business searches. That's because they use programmatic SEO to reach their target audience.
If someone searches "top things to do in (city)," Tripadvisor is always in the top results.
— https://breaktheweb.agency/seo/programmatic-seo/
I notice this practice whenever I want to find out how to make two frameworks work together.
As a constructed example: let's say I want to know how to apply the Redux pattern in the Svelte framework.
I might google "svelte redux" and very often one hit will be something like "reduxjs vs svelte".
The comparison between reduxjs and svelte is obviously comparing apples to oranges.
No real person sat down and created that site.
It was auto-generated once and now clouds people's search results for eternity.
We can even push this idea a bit further and find sites that try to bend spacetime and compare CSS and SQL.
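A minimal sketch of how such pages come into being, assuming a hard-coded keyword pool and a hypothetical `render_page` template (real programmatic-SEO setups pull keywords from search-volume data instead):

```python
from itertools import combinations

# Hypothetical keyword pool; real setups harvest these from keyword research.
technologies = ["reduxjs", "svelte", "css", "sql"]

def render_page(a, b):
    # One template, filled in for every pair -- no human ever checks
    # whether comparing the two even makes sense.
    return (
        f"<h1>{a} vs {b}</h1>"
        f"<p>Wondering whether to choose {a} or {b}? Read on!</p>"
    )

# Every pairing gets its own URL, apples-to-oranges pairs included.
pages = {f"/{a}-vs-{b}": render_page(a, b)
         for a, b in combinations(technologies, 2)}

print(len(pages))            # 6 pages from just 4 keywords
print(pages["/css-vs-sql"])  # generated anyway, nonsense or not
```

Four keywords already yield six comparison pages; with a few hundred keywords, the page count explodes into the tens of thousands, which is exactly why these sites blanket the search results.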
I went with the programmatic SEO example here to drive my point home.
But even on various news sites, like the German finanzen.net, auto-generated articles are standard practice.
Feeding press releases into a simple NLP program, for example, is the easiest and cheapest way to add articles to your site.
You could also feed the current stock price into an NLP program and let it comment on it every other hour.
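Such "commentary" needs barely any NLP at all; a handful of fill-in-the-blank templates gets you most of the way. A minimal sketch, with hypothetical template strings and a made-up ticker:

```python
import random

# Hypothetical templates; rotating between a few of them is often all the
# variety these auto-generated market notes have.
TEMPLATES = [
    "{name} is trading at {price:.2f} EUR, {direction} {change:.1f}% on the day.",
    "Shares of {name} moved {direction} {change:.1f}% and now stand at {price:.2f} EUR.",
]

def generate_article(name, price, change, rng=random):
    # Derive the only "insight" in the article from the sign of the change.
    direction = "up" if change >= 0 else "down"
    template = rng.choice(TEMPLATES)
    return template.format(name=name, price=price,
                           direction=direction, change=abs(change))

print(generate_article("Example AG", 42.37, -1.3))
```

Run this on a timer against a price feed and you have an endless stream of "articles" that no human ever wrote or read before publication.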
Luckily, there is little to no AI involved in these tools.
It is blatantly auto-generated.
Maybe you chuckle a bit when you stumble across one of these sites and then go on with your life.
Maybe some news articles end up sounding very similar to their competition.
Only minor inconveniences.