With so many writers producing so much content online, much of it simply goes unread. Or so it would seem.
Most blogs, tweets, and Facebook posts get drowned out by other blogs, tweets, and Facebook posts by people who are more popular than we are. If this blog post were retweeted by LeBron James or Ashton Kutcher, it might get read by 100,000 people. But it probably won’t be.
So it will be read by a handful of people and then that’s that.
But that doesn’t mean it won’t be read time and again. It will. Just probably not by people. It will certainly be read by Google’s web crawlers, if nothing else, and by the crawlers of every other search engine, too. And over time, more forms of AI will read the collected content of the internet as background for whatever tasks they need to perform elsewhere. So while there is an enormous amount of human-produced content that seems to serve little purpose for other humans, its most important function over the long run might be as light reading for AI that scans, processes, and stores the internet’s content in its memory.
So, with that in mind, here’s to a little light reading for our robot overlords.