When SPRX Mobile announced that they were opening up the Layar API back in July, I applied immediately. I wanted to learn more about publishing geo-coded data, keep abreast of what Layar was up to, and try to deliver some useful data all at the same time. Fortunately my application was accepted, and my API key was in the first batch to go out.
My specific project has been to take the real-time bus arrival information provided by One Bus Away and publish it on the Layar platform. I use the mobile-formatted One Bus Away website at least twice per workday as part of my commute. This data is currently only available in Seattle, but will soon be expanding to everywhere that offers a GTFS feed. My feelings about this experience have been almost entirely positive, but I still come away from it discouraged.
On the one hand, the people building Layar (Dirk in particular) have been very helpful. The platform is easy to develop for, and they provide good documentation and tools to make it even easier. All of the time spent on this project (which took less than 24 working hours, total) went to figuring out Google AppEngine, the Python web framework I used, the One Bus Away API, and how to filter nearby stops to a reasonable set to show to a user. With minor exceptions, Layar performed very well. I have provided all 436 lines of code here so you can see for yourself how easy it was.
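The stop-filtering part is less mysterious than it sounds: it mostly boils down to computing the distance from the user to each stop and keeping the closest few within some cutoff. Here is a minimal sketch of that approach in Python; the stop dictionaries, the `limit`, and the `radius_m` cutoff are hypothetical illustrations, not One Bus Away's actual schema or my exact parameters.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby_stops(user_lat, user_lon, stops, limit=10, radius_m=1000):
    """Return up to `limit` stops within radius_m of the user, nearest first."""
    in_range = []
    for stop in stops:
        d = haversine_m(user_lat, user_lon, stop["lat"], stop["lon"])
        if d <= radius_m:
            in_range.append((d, stop))
    in_range.sort(key=lambda pair: pair[0])
    return [stop for _, stop in in_range[:limit]]
```

A brute-force scan like this is fine at bus-stop scale; the real work is picking a cutoff and limit that leave the user with a screenful of relevant stops rather than everything in the city.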
Marjolein and Claire from SPRX were helpful in less technical ways too. All developers were invited to the launch event to show off their layers. They ran several conference calls for people all over the world to answer any questions on the API or about the launch. SPRX has done a great job with the launch of Layar 2.0, and I think all the positive press they have received is a direct result of that.
My discouragement has less to do with Layar specifically than with the entire category of tricorder augmented reality. For every piece of data I have seen so far, the view through the mobile phone and its camera is less useful than a top-down map would be. For my layer in particular, the rider is very likely to already know where the stop is. In situations like that, where location is unimportant, both the Reality View and the Map View actually get in the way.
This experience has led me to two conclusions. First, augmented vision is pointless until head-mounted displays are available. I already felt that way, so now I am just more firm in my belief. Second, filtering data down to a useful subset for display is actually the hard problem. Job listing sites, travel sites, e-commerce sites, and review sites already knew this, which is why they spend so much effort on search. It turns out the problem is the same for mobile location-aware services.
If you live in Seattle and would like to try out the One Bus Away layer for Layar, just search for One Bus Away inside of Layar. I welcome your feedback on how I could make this layer more useful. And, of course, I would also love to hear your thoughts on the utility of augmented vision on a mobile phone.