A Computing Age of Abundance
Back In The Day
Remember the Y2K Bug? The thing that was supposed to wreck tons of software as the millennium rolled over?
Remember why that was supposed to happen? Because, in the Computing Age of Scarcity, programmers saved precious memory by shortening years from four digits to two.
Yes, computing scarcity was serious enough that programmers weighed the cost of individual characters in their code.
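To see why two-digit years blow up, here's a minimal sketch (hypothetical function names, not any real system's code) of the date arithmetic those old programs did:

```python
def years_elapsed(start_yy: int, current_yy: int) -> int:
    """Compute elapsed years using two-digit years, the way
    scarcity-era records stored them (85 means 1985)."""
    return current_yy - start_yy

# In 1999, a record from 1985 looks fine:
print(years_elapsed(85, 99))  # 14
# At the rollover, the year 2000 is stored as 00 and the math goes negative:
print(years_elapsed(85, 0))   # -85
```

Two saved digits per date, multiplied across millions of records, was real money back then. It just came due at the millennium.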
Ch Ch Ch Ch Changes
Enter Google I/O: Google's annual developer conference and narrative-driving "look at us" moment. It's a big deal. Such a big deal, in fact, that there's a countdown to the kickoff on the event website.
It's fun, beautiful, and very Google.
It's also a perfect signal that we've entered a new age of computing. Those stylized numbers? They're intense. Just rendering the number two takes 2,931 characters.
Yep. Where our predecessors were pinching individual characters, we're now spending thousands of characters to make one character look neat-o.
We've known for a while that developers have been acting as if unconstrained by computing resources. Heck, it was way back in 2016 that the average web page grew larger than the entire install of the first-person shooter DOOM.
But still. That nonchalant, unnecessary extravagance, set on the stage of Google I/O, makes it feel official. We're in the Computing Age of Abundance.