This week I was at SxSW Interactive, which, for those who aren't familiar, is a five-day conference about new technology. I was there to be on a panel about "Interoperable Location Data" with Josh Babetski from Mapquest, Adam DuVander of Programmable Web, Scott Raymond from Gowalla and Tyler Bell of Factual. The premise of the panel was that many organizations are currently creating point of interest databases, but how do you combine them or allow them to interact?

What was interesting was that the day before, Foursquare made an announcement that fit directly into this premise. They began partnering with other organizations to link information between checkins and other services. You can see this in action on the Thrillist section of Foursquare: there is a "More Info" link next to each of the displayed venues. The premise is that you can go read the Thrillist review of the venue and easily add it to your Foursquare to-do list. I think the idea of "pre-exploring" an area is interesting. Normally I tend to use Foursquare only when I am out and about: sometimes I look to see where my friends are, other times I look for popular places to go. Combining the Foursquare information with other sources lets me explore from my desktop and plan to do things later. Other partners for this launch were The New York Times, New York Magazine and MenuPages. For the full Foursquare blog post look here.

So Foursquare has started doing these integrations, but how can everyone do it? Services such as Factual are emerging to crawl and collect this data, while others are attempting to solve the problem on their own. The most idealistic suggestion was to create a foundation to hold canonical location information that everyone would link to. This was suggested as a possibility within our SxSW panel, but I think everyone agrees it would likely be impractical. I'm excited to see how GeoIQ analysis modules can further aggregate data from a variety of sources to help others create these links, and potentially serve as an outside link themselves, either through our APIs or through straight downloads. I'm excited because I think it means more cross-data analysis similar to Sean's recent analysis into San Francisco start-ups and his other analysis on DC Restaurant Inspections and Yelp Reviews.
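To make the linking problem concrete, here is a minimal sketch of how two point-of-interest datasets might be matched without a shared identifier. Everything here is illustrative: the field names, sample venues, name normalization, and the 100-meter distance threshold are assumptions for the example, not any provider's actual schema or algorithm.

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical POI records from two different sources with different schemas.
source_a = [
    {"name": "Blue Bottle Coffee", "lat": 37.7763, "lon": -122.4233},
    {"name": "Tartine Bakery", "lat": 37.7614, "lon": -122.4241},
]
source_b = [
    {"title": "Blue Bottle Coffee Co.", "latitude": 37.7764, "longitude": -122.4234},
    {"title": "Some Other Cafe", "latitude": 37.8000, "longitude": -122.4000},
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 6371000 * 2 * asin(sqrt(a))

def normalize(name):
    """Crude name normalization: lowercase and strip common business suffixes."""
    name = name.lower()
    for suffix in (" co.", " inc.", " llc"):
        name = name.replace(suffix, "")
    return name.strip()

def link_pois(a_records, b_records, max_dist_m=100):
    """Link records whose normalized names match and that lie within max_dist_m."""
    links = []
    for a in a_records:
        for b in b_records:
            close = haversine_m(a["lat"], a["lon"],
                                b["latitude"], b["longitude"]) <= max_dist_m
            if close and normalize(a["name"]) == normalize(b["title"]):
                links.append((a["name"], b["title"]))
    return links

print(link_pois(source_a, source_b))
```

Real-world matching is much messier (abbreviations, moved venues, chains with many branches), which is exactly why the panel's question of interoperable identifiers matters: a shared key would make this fuzzy-matching step unnecessary.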

As combining data continues to become easier, I can't wait to see what other insight is possible!


3 Responses to SxSW and Interoperable Location Data

  1. Carl Reed says:

    Interesting. Is the group aware of the W3C and OGC related standards activities related to PoI’s? The current focus is on an encoding independent abstract model for specifying a PoI.

    • kate says:

      Hi Carl,

This came up during the panel. One of the comments made was that OGC standards often end up being too complicated compared to community-developed standards. Start-ups are ready to link their data together now, and quickly.


  2. Carl Reed says:

Well, I am a bit slow in responding! Anyway, the OGC develops standards based on a community, consensus-driven process. We just happen to have very formal policies and procedures that guide the standards development and life cycle management process.

I have also worked in developing specifications, such as GeoRSS, using what I believe you are thinking of as a community process, as opposed to the "formal" standards organization processes. Interestingly enough, these community groups quickly begin talking about process and procedures, like how do we know we have consensus, or how do we resolve conflict.

As to OGC standards being "complex", well, maybe. There are now OGC standards, or standards in progress, that are very lightweight yet still consistent with various other international standards and agreed-upon abstract models for expressing location content. One example is Open GeoSMS. Another is the GML profile just agreed to by OASIS for use in all of their emergency service and alerting standards.

    The problem is that geospatial data refuses to stay simple! Sure, one can easily share points of interest through very simple encodings and interfaces. The problem is that these simple encodings provide no information related to accuracy, uncertainty, provenance, semantics, and on and on. Perhaps for simple social media applications such information is not required. But as soon as location data and properties related to that location are ingested into applications such as those for emergency services, a different set of rules apply.

Speaking of the properties related to location, too often different applications collect such information using different vocabularies and codelists. The net effect is that content from vendor A's application differs semantically from content from vendor B's application. In order to bring these two data sources into a third application (say, from a start-up), some level of semantic mediation is required. Suddenly simple linking of data is not so simple anymore.

    There is also the “fit for purpose” aspect of dealing with geospatial/location content. In order for users and applications to have the information necessary to determine whether a specific set of data are fit for purpose, some elements of additional information are required.

    Apologies for the slight ramble :-)