Twitter is Dead, Long Live Twitter

A year ago today, Twitter was something that many communicators were just trying to wrap their heads around. It was a new form of communication that was threatening to upset the precious fiefdom that they had built up over years and that had been taught in universities.

A year ago today, Twitter was something that a fringe of the greater population used regularly to discuss the election and monitor debates and campaign stops. It was something used for grassroots organizing, and the biggest name was @BarackObama.

A year ago today, a handful of major media outlets were using Twitter. When @ricksanchezcnn adopted Twitter on air at CNN, using it to monitor conversations around the stories he was reporting, it was a coup de grâce for the stalwart journalism types who had refused to adopt this new form of communication.
Contrast these three scenarios with today's world. White House staffers use Twitter as part of their daily routine. Sports fans follow @QBKILLA (aka Warren Sapp) and @THE_REAL_SHAQ (aka Shaquille O'Neal) – and yes, you are not alone in noticing that sports figures type in ALL CAPS. Musicians like @johncmayer – John Mayer – and @davejmatthews – Dave Matthews – are also using Twitter and talking to fans.

With this massive uptake of Twitter, it’s easy to think that the platform has arrived. And it has. It is as mainstream as any social service could hope to be. At the same time, Twitter is dead.

I don’t mean Twitter is going away. In fact, I don’t think it will ever go away; I think it is part of the future of online communications, much as email was back in the 1990s. Back then, it was somewhat rare for people to have email addresses. Clearly, this changed toward the end of the decade, but for most of the decade, the fad of having email was clearly seen in the resurrection of the old chain letter. We would find funny things online and forward them to all our friends like email was going out of style. Those of us who had an email address were considered the rare few.

Over time, email revolutionized the workplace to the point where, at the start of this decade, it was unusual for people not to have email and businesses began to rely on it as a necessity for internal and external communication.

Spam proliferated on the email service as it became safe to assume there was a person attached to almost any email address.

Since 2006, Twitter has been like the email of yore. Relatively few (in the grand scheme of things) had a Twitter ID. It was seen as somewhat geeky and was dominated by early adopters (from the true early adopters at the start, to the earlier-but-not-quite-early adopters joining in late 2007 and 2008). We developed exclusive little circles to which we gave cutesy names like “tweetup” – a mashup of the words Twitter and meetup. We developed our own lexicon for the efficiency of 140 characters. Words like “failwhale” and “hashtag.” We would “at” people and “DM,” and we all knew what we were talking about. It was our little secret that would cause innocent bystanders to scratch their heads in collective confusion.

Sometime last year or early this year, perhaps with the election or the sudden rate of adoption thanks to celebrities such as Oprah and Ashton Kutcher joining the rank and file, Twitter became mainstream. It happened while we were asleep, and we all reveled in the fact that these well-known names were becoming part of us. Until it happened without our notice and we became part of them.

See, they used our tool to assimilate our culture into theirs – the same way they used tabloids and celebrity blogs to draw more attention to their worlds. More power to them. Twitter is not something that can be assigned rules of behavior or communication.

Excuse the long-winded article as I come in to land on my point.

Historically, tools come and go. Whether email or Twitter, the sex appeal of a service inevitably gives way to the practicality of simply being. Much like a marriage (and I’ve been through this), where a couple meets, dates, has fun, gets butterflies, but eventually settles into a more mature state of existence with their partner, platforms evolve into a mature offering that is critical to communications. It becomes the norm to have the tool, and the tool evolves from being the topic of conversation into the catalyst for conversation. The platform ceases to be the focus and just “becomes.”

This is where we are now, or rather, where we should be. We are not, and this needs to change. Twitter as a business offers much fodder for discussion, but Twitter as a tool needs to become just that tool and not the topic of conversation. When we get together, we need to stop having tweetups and start simply getting together. We need to put down our iPhones and BlackBerrys and stop sending 140-character messages to our friends in the ether. Instead of talking at them, get back to communicating with the people sitting across the table from you.

Instead of worrying about how to use Twitter, we need to just use it. Instead of having panels at conferences about Twitter, we should be having panels about the topics people are talking about on Twitter. Instead of worrying about what's the best way to use Twitter, we need to get back to our roots (whether in journalism or communications or customer service) and start doing the jobs we are meant to do, using Twitter to make our performances better.

Twitter is dead as a topic of conversation. It is dead as fodder for blogs. It is dead as a startup that is revolutionizing our way of life. It already has revolutionized our lives, and now we run the danger of over-committing to a way of life that will keep us in one place instead of looking forward to the next big thing. Twitter is important to help us get to that point, but, as Twitter co-founder Biz Stone says, it should be the pulse of the planet. And that’s it.

It's a Read/Write/Execute Web and We Just Live In It

I hesitate to put any kind of definition around the versioning of the web. The fact that the internet world has to quantify the differences between the so-called Web 1.0 and Web 2.0 is silly at best. However, there is no doubt that there is a vast degree of difference between the web as it was known in, say, 1999 and the web we know in 2009.

Objectively speaking, the first generation of the internet was based around a premise of “read only.” It was not, of course, termed that, but the technology did not exist to support anything else. People used the internet to read the news, find weather forecasts and catch up on sports scores. Blogs didn’t exist. Facebook and Twitter were but thoughts in their founders’ minds, if that. Who knew that a time would come when the most interactive thing on the web would not be shopping and ecommerce?

Somewhere in the middle of this decade, the web took on a more interactive approach. Tim O’Reilly began calling it Web 2.0 to note the clear-cut difference between a “read only” web and a “read/write” web. Social networks and blogs gave users of the internet a chance to participate in its creation by generating content. Eventually, content generation expanded from the written word to video, podcasts and microcontent.

On the cusp of the next generation of the web, there is a movement toward metadata: granular information that aids discoverability on the web. APIs allow developers to take content from, say, YouTube or Twitter, and repurpose it into something usable in other forms by humans, applications and mobile devices. It is, in essence, a “read/write/execute” version of the web, and we are already beginning to see it.
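To make the "execute" idea concrete, here is a minimal sketch in Python. The JSON payload and its field names are illustrative assumptions, not any real API's schema; the point is simply that structured content fetched from one service can be programmatically turned into a new artifact for another context:

```python
import json

# A hypothetical JSON payload, shaped like what a video or microblogging
# API might return. The field names ("items", "title", "url", "tags")
# are assumptions for illustration, not a real API's schema.
api_response = json.loads("""
{
  "items": [
    {"title": "Election night reactions", "url": "http://example.com/1", "tags": ["politics"]},
    {"title": "Halftime highlights", "url": "http://example.com/2", "tags": ["sports"]}
  ]
}
""")

def repurpose_as_html(payload):
    """The 'execute' step: take another service's content feed and
    re-render it as a simple HTML list that a different site,
    application or mobile device could display."""
    lines = ["<ul>"]
    for item in payload["items"]:
        tags = ", ".join(item["tags"])
        lines.append(f'  <li><a href="{item["url"]}">{item["title"]}</a> ({tags})</li>')
    lines.append("</ul>")
    return "\n".join(lines)

html = repurpose_as_html(api_response)
print(html)
```

A real mashup would fetch the payload over HTTP with the service's published API, but the transformation step looks much the same: read structured data, write a new representation.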

Ari Herzog, a longtime reader of this blog as well as a longtime opponent of mine, wrote a post declaring Europe’s Government 2.0ish aspect of their EU site a win over the United States. See his post for his rationale.

He certainly makes a good point with his premise after the jump.

Rant: Silicon Valley Fenetics

Yes, intentionally misspelled. Phonetics.

Phonetics and mashups are all the rage in Silicon Valley Web 2.0 start-up naming conventions right now.  When it was Digg, Facebook and Skype, this was different.  It was cool, fresh and neat.  You could not help but ask yourself, what's that?!

Now, it’s not cute anymore (‘sup Pownce and Jaiku!). Instead it signals, “Oh, another 2.0 dot-bomb.” OK, maybe we’re not there yet, but you get the point.

Branding gurus are charging clients tens, even hundreds, of thousands for not-so-cheeky plays on phonetics or for slamming two words together.  Read TechCrunch, and you’ll find posts littered with examples:

Out of the three of these, there’s only one I like: TasteBook. Why?  Because it tells you, or at least gives you an idea of, what it does.  TasteBook allows users to create and order custom hardback cookbooks (“tastebooks”). BTW, that’s what a company name is supposed to do: tell potential buyers, partners and investors what kind of business it is.

One must wonder how much longer this latest naming fad will continue.  And if you don’t think it’s a fad, how many eGoofy companies and .bombs can you name in five seconds? Pets.com, eHarmony, eLuminant, etc., etc.

P.S. As a result of this rant and as a tribute to Doug Haslam, I’ve decided to rename my PR firm Shazaaamr.