As I tweeted out last week, it’s dangerous to lose sight of the role that noncommercialism played in the birth of the web, or as blockchain enthusiasts now call it, Web1. Some of the most important early internet software, including the Eudora email client and Mosaic, was developed in-house at major universities (in the latter case, by one of the high-profile venture capitalists I slagged a couple of paragraphs prior!). Later, open-source software like PHP and MySQL became some of the defining technologies of Web 2.0. Web 2.0, while often driven by purely commercial aims and eventually evolving into the boogeyman of “big tech,” started from a perfectly reasonable place. As much as we hate Facebook, they still open-source React.js, a tool that much of the internet uses.
With that in mind, it’s worth considering that a protocol like Project Gemini, inspired by Gopher, is more in the spirit of the original internet than many Web3 initiatives.
I of course see a lot of these conversations from the creator perspective, and I’ve seen it all: wild promises, little payoff. I remember vividly how, during the early Web 2.0 days, I would get pitched on adding a new piece of JavaScript to my website every single week. These tools had functional justifications, but they were really out for your data, and they always promised lots of additional revenue that rarely materialized.
The one tool of that nature I really liked (and which offered no revenue promises) was a contextual-research tool called Apture that was eventually bought by Google and killed off. And I wasn’t pitched by them; I found them. It’s fitting that the guy who was CEO of that company, Tristan Harris, eventually became one of Big Tech’s most consistent critics. To me, that’s telling.