A Tale of Two Cities: How DC and San Francisco Are Handling Citywide 311

Without a doubt, I am a data whore. I love raw data. I love APIs. I love finding interesting ways to mash up data. With the newfound craze in government for openness, led in no small part at the Federal level by everything from work endorsed by the Obama Administration to work pushed forward by Sunlight Labs, Craigslist founder Craig Newmark, and others, I’d expect the openness to trickle down to the state and local levels. And it is.

On one level, you have Washington, DC (where I live), which has been making impressive strides through OCTO (the Office of the Chief Technology Officer) with the assistance of iStrategyLabs and the Apps for Democracy competition.

Washington, DC is rolling out its Open 311 API, a RESTful data API that they are careful to note is still in development. (We will be building a PHP library around this API shortly, so keep an eye out for that announcement over at Emmense.com.)

By exposing a REST API, DC is opening up the service sector of the city government for developers of all sorts to tap into and build applications around, all to meet the needs of city residents.
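To make the REST pattern concrete, here is a minimal sketch of what a client for such an API might look like. The base URL and resource path below are hypothetical, not DC’s actual endpoints; the point is only that any developer can fetch structured data over plain HTTP.

```python
import json
import urllib.request

# Hypothetical base URL -- DC's actual Open 311 endpoints may differ.
BASE_URL = "http://api.dc.gov/open311/v1"

def service_request_url(request_id):
    """Build the URL for fetching a single 311 service request."""
    return "%s/requests/%s.json" % (BASE_URL, request_id)

def fetch_service_request(request_id):
    """GET the service request and decode the JSON response."""
    with urllib.request.urlopen(service_request_url(request_id)) as resp:
        return json.load(resp)
```

Because the interface is just URLs and JSON, the same pattern works from PHP, JavaScript, or any other language with an HTTP client.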

San Francisco, on the other hand, just announced that it is utilizing Twitter to allow residents to submit issues directly from their favorite web application. Simply by following @sf311 (and being followed back), citizens are able to DM requests.
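For comparison, the DM submission step might look something like this sketch. The endpoint path follows the 2009-era Twitter REST API (v1); how SF311 actually parses the message text is an assumption on my part.

```python
import urllib.parse

# 2009-era Twitter REST API endpoint for sending a direct message (POST,
# HTTP Basic auth). How SF311 parses the free-text body is an assumption.
DM_ENDPOINT = "http://twitter.com/direct_messages/new.json"

def dm_body(recipient, text):
    """Encode the POST form body for a direct message."""
    return urllib.parse.urlencode({"screen_name": recipient, "text": text})
```

Note that the whole scheme depends on Twitter’s service being up and on @sf311 having followed the sender back, which is exactly the dependency I question below.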

Personally, I am partial to DC’s approach, but I applaud both cities for pushing the boundaries to bring city government closer to the people. Frankly, I’m a little concerned about San Francisco utilizing Twitter for this purpose, for the same reason that I am hesitant about any business building its business model on Twitter. Twitter has not proved, at least in my mind, that it has the business savvy to keep its service alive. Likewise, it has not proved the technical ability to build a failure-resistant system. It’s a game of Russian roulette to base a business (or government service) on this platform. San Francisco probably has failover plans, and this is just one approach among others, so arguably it’s not a significant risk.

However, the solution to the 311 problem becomes far more scalable when a pure API allows the direct submission and retrieval of data. The use of an API also keeps responsibility in-house: Twitter is not paid for by taxpayer money, so there is no expectation of quality control. A government-owned and -maintained API, on the other hand, provides safeguards that make sense.

All that aside, it is clear that both DC and San Francisco recognize that the accessibility of government to its citizens is a goal of the utmost importance in 2009. They are taking laudable steps to break down the barriers and solve real problems with modern technologies. For that, I can find no fault.

Crossing Over Technology With Government

In recent months, I’ve made a small fuss over the so-called Government 2.0 experts descending on Washington expecting to change the way of life in government. Of course, I’ve also been called out for not providing actual solutions. Probably rightly so, but understand that I don’t work in the government space. I am simply an outside observer who approaches problems with some degree of sobriety and realism.

Today, I figure I’ll offer some ideas that can move the conversation forward in some kind of constructive way. Wired’s Noah Shachtman covered a white paper released by the National Defense University that approaches Government 2.0 from the perspective of information sharing. While that is indeed a portion of the solution to the greater problem, the military in particular probably needs to look at broader (and more specific, less 50,000-foot-view) solutions as a more effective technology complement to its mission.

For instance, while simple communication across the various branches of the service is useful for any enterprise, it would pay to address the core war-fighting mission of the military. A recommendation more concrete than “information sharing” might propose mobile devices that utilize GPS information for tactical, war-theatre decision making.

Real-time use of video and photography immediately makes data available to analysts who must make split-second decisions (such as the Navy captain’s decision this weekend to order the sniper takedown of the Somali pirates).

It is not useful to simply put out generic information about “information sharing” and suggest that blogs, wikis, and the like are the solution to the problem. While I understand white papers are intended to provide a skeletal framework for further action, it is condescending to organizations that already value and understand the need for “information sharing”. What they are looking for is the “hows” and “whats” of achieving their mission.

As stated in previous articles, this is where the “experts” should be focusing. Realistically, those activities will be classified and not published for public consumption. That’s probably the way it should be. The real experts are working internally, inside their organizations, with their constituency – not in the public forum where context and value are lost.

New York Times Makes Massive Leap in Bringing Congressional Data to the Web

For all the talk in DC about transparency in government, the status quo has largely been wishing for transparency and talking about it. That talk (at least in my sense) really came to the forefront of everyone’s attention with the House rules on social media use last July, then escalated with the Senate, the bailouts, and finally the election of one of the most social media savvy presidents ever.


The New York Times decided to take it a step further today by actually providing data in the form of the Congress API. This data is pulled from the House and Senate websites, but I have to guess it also includes data mined from the Congressional Record, the daily public account of all official business that is still, ironically, published in print form en masse. Up until now, the Congressional Record has been available only upon request, and it is hard to extract real signal from it amidst the noise of process and procedure.

With the NY Times Congress API, it is now possible for developers to build tools that mine the Record for roll call votes, the members of each chamber, and information about members, including chairmanships and committee memberships.
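As a rough sketch, pulling the member list for a chamber might look like the following. The exact path, version number, and parameter names here are assumptions for illustration; check the Times’ developer documentation for the real endpoints, and note that an API key is required.

```python
import json
import urllib.request

# Illustrative only: the path and version below are assumptions, not the
# documented NYT Congress API -- consult the official developer docs.
BASE = "http://api.nytimes.com/svc/politics/v2/us/legislative/congress"

def members_url(congress, chamber, api_key):
    """Build the URL listing the members of one chamber of a given Congress."""
    return "%s/%d/%s/members.json?api-key=%s" % (BASE, congress, chamber, api_key)

def fetch_members(congress, chamber, api_key):
    """GET the member list and decode the JSON response."""
    with urllib.request.urlopen(members_url(congress, chamber, api_key)) as resp:
        return json.load(resp)
```

A watchdog site could poll an endpoint like this nightly and diff the results to surface new votes or committee changes.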

It will be interesting to see how this data is used and how it can be leveraged to keep the government honest. Developers can check out the technical details here.