It's a Read/Write/Execute Web and We Just Live In It


I hesitate to put any kind of definition around the versioning of the web. The fact that the internet world has to quantify the differences between the so-called Web 1.0 and Web 2.0 is silly at best. However, there is no doubt that there is a vast difference between the web as it was known in, say, 1999 and the web we know in 2009.

Objectively speaking, the first generation of the internet was based around a premise of “Read only”. It was not, of course, termed that, but the technology did not exist to support anything else. People used the internet to read the news, find weather forecasts and catch up on sports scores. Blogs didn’t exist. Facebook and Twitter were but thoughts in their founders’ minds, and likely not even that yet. Who knew that a time would come when the most interactive thing on the web would not be shopping and ecommerce?

Somewhere in the middle of this decade, the web took on a more interactive approach. Tim O’Reilly began calling it Web 2.0 to mark the clear-cut difference between a “read only” web and a “read/write” web. Social networks and blogs gave internet users a chance to participate in its creation by generating content. Eventually, content generation expanded from the written word to video, podcasts and microcontent.

On the cusp of the next generation of the web, there is a movement toward metadata, that is, granular information that aids discoverability on the web. APIs allow developers to take content from, say, YouTube or Twitter, and repurpose it into something usable in other forms by humans, applications and mobile devices. It is, in essence, a “read/write/execute” version of the web, and we are already beginning to see it.
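To make the “execute” idea concrete, here is a minimal sketch of pulling structured content from a public API and repurposing it outside the page it came from. The endpoint URL and field names are hypothetical stand-ins, not the actual YouTube or Twitter APIs.

```python
import json
import urllib.request

# Hypothetical public API endpoint that returns a JSON list of items.
# A real service (YouTube, Twitter, an agency, etc.) would have its own
# URL and its own schema.
FEED_URL = "https://api.example.com/v1/updates.json"

def fetch_items(url=FEED_URL):
    """Read the machine-readable content once, so it can be reused anywhere."""
    with urllib.request.urlopen(url) as response:
        return json.load(response)

def to_plain_text(items):
    """Repurpose the same data for a non-browser context (SMS, widget, app)."""
    return "\n".join(f"{item['title']} - {item['link']}" for item in items)

if __name__ == "__main__":
    print(to_plain_text(fetch_items()))
```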

Ari Herzog, a longtime reader of this blog as well as a longtime opponent of mine, wrote a post declaring the Government 2.0-ish aspect of the EU’s site a win over the United States. See his post for his rationale.

He certainly makes a good point with his premise after the jump:

If I must, you can see information in multiple columns, and the data makes sense. It’s logically organized, providing intuitive links for wherever you might need to go for further environmental information in any of the EU member nations. Whereas the US list is, well, a list. How boring!

The problem, of course, is that the battle of websites is a tired one that caters to a “Read only” view of the web: that users engage content only on the website (true at this point), that they will continue to do so in the future, and that the future of the government-oriented portion of the web, also known as “Government 2.0” to denote a next-gen approach to internet media and government participation, is a “Read only” approach.

This argument is short-sighted and, while I agree that a usable and UI-focused approach to a government website is important, it does not address the larger hurdles faced in the government community. It does not consider that, as an example, NOAA might want to let constituents engage with its data in a mobile or iPhone app, or that DefenseLINK, the official website of the Defense Department, might want to make its official data accessible to other DoD websites via RSS or another API method.
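As a rough sketch of what that looks like in practice, the snippet below reads an RSS 2.0 feed and hands the headlines to whatever wants them: a mobile app, another site, an internal dashboard. The feed URL is a made-up placeholder, not an actual NOAA or DefenseLINK address.

```python
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder feed address; a real agency would publish its own RSS URL.
FEED_URL = "https://www.example.com/press-releases/rss.xml"

def latest_headlines(url=FEED_URL, limit=5):
    """Parse an RSS 2.0 feed and return (title, link) pairs for reuse
    anywhere the browser isn't."""
    with urllib.request.urlopen(url) as response:
        root = ET.parse(response).getroot()
    items = root.findall("./channel/item")[:limit]
    return [(item.findtext("title"), item.findtext("link")) for item in items]

if __name__ == "__main__":
    for title, link in latest_headlines():
        print(f"{title}\n  {link}")
```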

The next generation of the web, and of government participation on the web, is not about pixels and content presentation for humans using Internet Explorer! That is certainly an aspect, but it will not translate into anything that could be billed as Government 2.0. The assumption, the premise, and therefore the jumping-off point in terms of thinking should consistently be “How do we provide the most data to our constituents?” (where the constituent might be internal, or a machine, computer or web app, and not the U.S. citizen at all).

Peter Corbett wrote a post here several months ago about building apps to meet the needs of government and its constituency. He alluded to his Apps for Democracy project, which opened up vast amounts of District [of Columbia] data for developers to build real-life solutions with. The Sunlight Foundation has a similar project called Apps for America. Without a doubt, every developer who built an app as part of those projects considered the web beyond the browser. That… is what the keystone of Government 2.0 will be.

An understanding of, at minimum, the “Read/Write” web is necessary. Better yet is a firm grasp of the “Read/Write/Execute” web, where data discoverability is ubiquitous via microformats (read and subscribe to Chris Messina and the fine work of the DiSo Project for more on ubiquity, discoverability and findability on the web) and mobile devices.
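For a toy illustration of microformat-driven discoverability, the sketch below pulls the formatted names out of a snippet of hCard markup so a machine, not a human with a browser, can use them. The sample HTML and agency name are invented for the example.

```python
from html.parser import HTMLParser

# A snippet of hCard microformat markup: machine-readable metadata
# embedded in an ordinary page. The content here is purely illustrative.
SAMPLE_HTML = """
<div class="vcard">
  <span class="fn">Environmental Protection Agency</span>
  <a class="url" href="https://www.epa.gov/">epa.gov</a>
</div>
"""

class HCardNames(HTMLParser):
    """Collect the formatted names (class="fn") from hCard markup."""

    def __init__(self):
        super().__init__()
        self._in_fn = False
        self.names = []

    def handle_starttag(self, tag, attrs):
        classes = (dict(attrs).get("class") or "").split()
        if "fn" in classes:
            self._in_fn = True

    def handle_endtag(self, tag):
        self._in_fn = False

    def handle_data(self, data):
        if self._in_fn and data.strip():
            self.names.append(data.strip())

parser = HCardNames()
parser.feed(SAMPLE_HTML)
print(parser.names)  # ['Environmental Protection Agency']
```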

Of course, this would also mean that the technology community would get off their asses and actually innovate, but I digress.

Comments

  1. Sue Densmore says

    The Twitter conversation between you and Ari was interesting to follow today. I appreciate both of you, and your interest in educating the public about Web 2.0 and Gov 2.0 issues.

    You touch on something in this post, and Ari said something in one of his tweets, though, that could use some development. And perhaps shame on me for not doing a search of your blogs to see if either of you commented further on it.

    But for government sites it is important to remember the target audience, and the fact that many users are still just folks looking for info and using basic tech and browsers like IE6.

    So, many people don’t mind a simple list of logically codified links to info. They would say it’s “user friendly,” and that wins over some great looking site that takes forever and a day to load, and even longer to sort through. Your initial point about the visual attractiveness of the site being less important than functionality was really well taken.

• Aaron says

      Maybe the constituency, as I mentioned, is not the general public. Maybe it’s something internal? Maybe IE6 is the standard. Maybe it’s not even a browser-based dataset. Maybe it’s XML and machine readable. Gotta get out of the mindset that the web is for browsers. And the mindset that the government agency always exists to directly serve the public. Sometimes it doesn’t.

      • Sue Densmore says

        Point well taken! Of course, recent articles would indicate that perhaps not all the government agencies have the level of tech they need, and aren’t even up to IE6 yet… ;-)

        I guess, because I would be considered just an end user, I do not have enough technical knowledge to disassociate the web from a browser, because, except for RSS/XML feeds in a reader on my phone, that’s pretty much the only way I interact with the web. Even my ‘Berry has a browser for web interface. And the reader just gets stuff and converts it.

        I’ll be interested to see what all the more knowledgeable and inventive folks in the field of internet tech development come up with next. But I truly do appreciate the opportunity to learn from and be challenged by people like you.

• Aaron says

          RSS is a good example of content being repurposed in some other way outside of the browser (though sometimes a feed reader is in a browser, sometimes it’s on a Blackberry, iPhone or desktop app as well).

2. Ari Herzog says

    Thanks for the shout-out, kind sir, but I’d like to suggest a fault with your logic.

    First, some background: In the early 1980s, I was exposed to the internet through bulletin board forums and file transfer protocols on sites hosted by AOL, Prodigy, and CompuServe. In 1993, I was introduced to the world of Internet Relay Chat and the art of telnetting to external computer mainframes. In 1996, I worked as a tech support rep for an Internet Service Provider–a mere 12 months or so after the World Wide Web was officially launched.

    Looking back some 25+ years, the internet has been much more than “Read only.”

    * The early Prodigy forums–and later, IRC–enabled people from multiple locations to meet in a virtual place to share information and speak in a common voice.

* While my college experience lacked the notion of any Facebook or even BlackBoard, we used a text-based VAX bulletin board community in 1994, creating content on our intranet, allowing other people to post replies, and enabling anyone to email the content from the bulletin board to the world beyond.

    I don’t disagree with your sentiment that the internet (and its World Wide Web) are changing into a new generation, but I take fault with your logic that “Read/Write only” is from 2004 (when Tim O’Reilly coined Web 2.0) and not earlier. Clearly, you’re not suggesting the content creation I cite above from the mid-1990s, let alone the early 1980s, was only readable and not writeable.

    Oh, and for the record, I may be your opponent but you’re not mine. Another faulty figment of yours that I’d enjoy you respecting.

• Aaron says

      Ari-

I don’t want to argue semantics, but to clarify and, if you will, put context around “the internet” – none of the services you mention fall into the category of the public and open internet that I speak of. Each was a silo for member-only content. AOL didn’t even move away from this until a few years ago. CompuServe was slurped up and made extinct. Prodigy… who knows what the hell happened to them. But none of those were open-protocol services. All were walled gardens. The same could be said of BBSes, which I too used.

As well, DARPANET, which was the predecessor to the internet I speak of, is not to be construed as “the internet,” as it too was not based on open protocols.

• Ari Herzog says

        My apologies for comparing present and future technologies to the past, but I believe effective progress is helped with parallels, e.g. comparing Twitter to a telephonic party line.

        Yes, such systems were silos — well, IRC wasn’t and the Usenet wasn’t really either, but I digress.

  3. says

As a contractor helping the fantastic Army.mil team work on the current and future versions of the U.S. Army’s site, I do always feel torn between innovation, useful information, and…IE6. But I can assure you the Army, at least, is working on some pretty awesome and forward-thinking things :) (And we’re listening and taking notes out here in the greater community! Thanks Aaron for the feedback :))

  4. says

Excellent way to separate the understanding of Web 1.0 and 2.0, because so many people simply don’t understand the differences. I am on the same page with you; the “2.0” definition is kind of silly. I think we are more at 1.354 (just joking).

All the websites I designed 10 years ago were static HTML; now many of the sites I develop entail blogging, networking, video and more interactive opportunities.