VCS and the 1993 internet

Joey Hess suggests that current discussions about the proliferation of DVCSes have a parallel in how the internet protocol world, circa 1993, played out:

I’m reminded of 1993. Using the internet at that time involved using a mishmash of stuff — Telnet, FTP, Gopher, strange things called Archie and Veronica. Or maybe this CERN “web” thing that Tim Berners-Lee had just invented a few years before, but that mostly was useful to particle physicists.

Then in 1994 a few more people put up web sites, then more and more, and suddenly there was an inflection point. Suddenly we were all browsing the web and all that other stuff seemed much more specialised and marginalised.

I would disagree, a little. Back in the early ’90s, I was a sysadmin playing around with internet- and intranet-facing TCP/IP services (although in those days, the term “intranet” hadn’t been coined yet), so I gained a fair bit of experience at the coal-face in this regard. The mish-mash of protocols — telnet, gopher, Archie, WAIS, FTP, NNTP, and so on — all had their own worlds and their own views of the ‘net. What changed this in 1993 was not so much the arrival of HTTP, but TimBL’s other creation: the URL.

The URL allowed all those balkanized protocols to be supported by one WWW client, and allowed an HTML document to “link” to any other protocol —

The WWW browsers can access many existing data systems via existing protocols (FTP, NNTP) or via HTTP and a gateway. In this way, the critical mass of data is quickly exceeded, and the increasing use of the system by readers and information suppliers encourage each other.
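The unifying trick is visible in the URL’s scheme prefix: one client can parse any URL, look at the scheme, and dispatch to the right protocol handler. A minimal sketch using Python’s standard `urllib.parse` (the non-CERN example hosts are illustrative, not real servers):

```python
from urllib.parse import urlparse

# One parser handles URLs for many different protocols; the scheme
# component tells the client which protocol handler to dispatch to.
urls = [
    "http://info.cern.ch/hypertext/WWW/TheProject.html",
    "ftp://ftp.example.org/pub/readme.txt",
    "gopher://gopher.example.org/1/",
    "news:comp.infosystems.www",  # scheme-only URLs have no netloc
]

for url in urls:
    parts = urlparse(url)
    print(f"{parts.scheme:>6} | host={parts.netloc or '(none)'} | path={parts.path}")
```

Each protocol keeps its own server and wire format; only the addressing is shared, which is exactly what let one browser front all of them.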

This was a great “embrace and extend” manoeuvre by TimBL, in my opinion — by embracing the existing base of TCP/IP protocols, the WWW client became the ideal user interface to all of them. Once NCSA Mosaic came along, there really was no alternative to rival the Web’s ease of use. This was the case even if you didn’t have an HTTP server of your own; you could still access HTML documents and remote URLs.

In essence, HTML and the URL were the trojan horse, paving the way for HTTP (as HTML’s native distribution protocol) to succeed. It wasn’t the web sites that helped the WWW “win”, but embrace-and-extend via the URL.

For what it’s worth, I think there is an interesting parallel in today’s DVCS world: git-svn.


2 Comments

  1. Posted June 25, 2008 at 17:47 | Permalink

You’re quite right about the URL, and git-svn.

However it got there, though, the internet did reach a tipping point in approx. ’94 (or ’95?), where very few people were bothering to put up gopher servers and the “http” in spoken URLs became implicit.

  2. Posted June 26, 2008 at 12:30 | Permalink

    hi Joey! thanks for the comment.

Yeah, I agree it got there eventually, alright — I reckon that the “trojan horse” effect of HTML and URLs got the installed base of WWW browsers up and running, at which point it became a no-brainer for sites offering info to build their infrastructure using HTTP servers (rather than hosting it on gopher, or FTP, or whatever).