simplejson and json modules

When dealing with JSON objects in Python, we have the excellent simplejson module to convert a JSON object into a Python object. But Python 2.6 (and 3.0) ships the json module in the standard library. Loading one or the other is as simple as:

    try:
        # Python 2.6/3.0 JSON parser
        import json
    except ImportError:
        # Fallback to simplejson
        import simplejson as json

And you would just need to use json.dump(). That still leaves the problem of declaring whether the application requires simplejson or not in the setup.py. The solution is to use sys.version_info, comparing versions with distutils.version.StrictVersion:

    import sys
    from distutils.version import StrictVersion

    version_number = '.'.join([str(a) for a in sys.version_info[:3]])

    if StrictVersion(version_number) < StrictVersion('2.6'):
        params['requires'] = ['simplejson']


  • First of all, params is a dictionary of parameters passed to setup() (as setup(**params); Python supports unpacking a dictionary as the keyword arguments of a function call.) Another possible solution would be to use a requires variable and set it to None when the version is 2.6 or higher.
  • The version_number uses just the first three elements of sys.version_info because the tuple also contains other information, like the release level ("final" or "candidate") and the release serial (e.g., Python 2.5rc1 has a sys.version_info of (2, 5, 0, 'candidate', 1))
  • StrictVersion provides proper comparison between versions, so you don't have to worry about major, minor and release numbers, or write three cases in an if.
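A simpler alternative worth noting (a sketch, not part of the original setup.py): sys.version_info is a tuple, and Python compares tuples element by element, so the check can skip StrictVersion entirely:

```python
import sys

params = {}  # hypothetical dictionary later passed to setup(**params)

# Tuples compare element-by-element, so no string parsing is needed.
if sys.version_info[:3] < (2, 6):
    params['requires'] = ['simplejson']
```

Since (2, 5, 0) < (2, 6) is True and (2, 6, 0) < (2, 6) is False, the comparison behaves exactly like the StrictVersion check above.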

Web 2.0 is not streamable

This week our connection at home is shaped. This means that, instead of the shiny 1Mbps we usually have, we now have to suffer to see pages with a bandwidth of just 64Kbps. But there is one thing that such limited bandwidth made me realize: the next web isn't streamable.

To reach that conclusion, I didn't have to go far: just opening Google Reader showed that it's impossible to live with a very limited bandwidth. Right now I have something like 1000 unread items across one hundred subscriptions, which means Reader has to download a large description file with all that information. The thing is, right now, it doesn't appear to do anything: it shows the default Google application header, the logo and that's it. But, knowing how things usually work in this Web 2.0 universe, I know that there is something going on:

Interactive sites like Google Reader and GMail use AJAX. AJAX relies on XML, which is structured plain-text data (the same can be said of JSON.) XML allows the data to appear in any order inside its structure. As an example, imagine a book information list: inside the “Book” element you can have a “Title”, which can come at the very beginning or the very end, and the result would be the same. So any application that uses XML needs to first receive the information, then convert it to some internal representation, and only then use it. Google Reader wasn't “doing nothing”: it was receiving the list of feeds and the initial 100-something feed items which, due to the small bandwidth, was taking very long. And, because it needed the whole thing, nothing was being displayed.
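To make the “it needs the whole thing” point concrete, here is a small sketch (with a made-up payload, not Reader's actual format) showing that the standard json parser simply fails on a truncated document; there is no way to get at the data before the last byte arrives:

```python
import json

# Hypothetical Reader-like payload: a list of items in one JSON document.
payload = '{"items": [{"title": "A"}, {"title": "B"}]}'
assert json.loads(payload)["items"][0]["title"] == "A"

# Cut the download off mid-stream: the parser raises an error instead of
# returning whatever it has managed to read so far.
try:
    json.loads(payload[:20])
    partial_ok = True
except ValueError:  # JSONDecodeError is a subclass of ValueError
    partial_ok = False
```

Even though part of the data has already arrived, the parser gives you nothing until the document is complete.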

Which is a problem I see with many XML/JSON results: you can't stream them in a way that lets you start using the information before you have it all. For example, in Mitter, we can't display tweets before we have received the whole response. If XML and JSON weren't so loosely defined and we had a way to ensure that after the “User” element we would have a “Message” element, then we could start displaying tweets before we had all of them (not that the format changes all the time, but since we can't ensure that ordering, we must be ready for the data appearing in a different order, or with some other data between the ones we need.)
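For XML specifically, incremental parsers do exist; the sketch below (using a made-up tweet format, not Mitter's actual one) parses with xml.etree.ElementTree.iterparse. It mostly illustrates the limitation above: because “user” and “text” can come in any order, each tweet is still only usable once its closing tag has arrived.

```python
import io
import xml.etree.ElementTree as ET

# Hypothetical timeline; note the swapped field order in the second status.
FEED = b"""<timeline>
  <status><user>alice</user><text>hello</text></status>
  <status><text>order swapped</text><user>bob</user></status>
</timeline>"""

def stream_statuses(stream):
    # "end" events fire when an element has been completely parsed.
    for event, elem in ET.iterparse(stream, events=("end",)):
        if elem.tag == "status":
            # Only now are both children guaranteed to be present.
            yield (elem.findtext("user"), elem.findtext("text"))
            elem.clear()  # drop already-handled items to keep memory flat

tweets = list(stream_statuses(io.BytesIO(FEED)))
```

Each status can be displayed as soon as it closes, but within a status we still have to wait for the whole element, exactly because the ordering of its fields isn't guaranteed.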

In a way, that's a complete reversal of roles for AJAX. In the very beginning, AJAX was used to prevent large downloads: if you had a page where it would be useful to display all options to help the user find data, you'd have to fill the page with that data (imagine, for example, a page with all your tags, plus all the possible suggestions from all the other users.) The use of AJAX meant the site could filter results, so you'd have a smaller page, which would make small requests to the webserver, returning small amounts of data. Overall, it meant that the user experience would be faster. Now, we have so much information packed in XML/JSON formats that the user experience is not as responsive as it should be.