There is a lot of buzz about analyzing social media today, but I don’t think we spend much time thinking about how we can make analytics themselves more social. The Web has done a great job of making several things more social. At its heart, the Web 2.0 moniker is about making information more social: first making the written word more social with blogs and wikis, then more recently making data more social. Part of this has been the social act of crowdsourcing data, or remixing data through mash-ups. Emerging from the ability to remix data have been data communities like InfoChimps, Freebase, and OpenStreetMap, to name only a few. Each community has a slightly different spin on the concept, creating a vibrant ecosystem.
This movement to make data more social was the motivation to create GeoCommons – a community of shared geodata. Our first step down the path was the release of Finder – a tool to share and discover data on the Web. We then evolved GeoCommons with the addition of Maker to allow for the collaborative visualization of data – letting anyone be a cartographer, share their maps, and discover others’ work. The next step in the sequence is analytics and, in keeping with the mission of GeoCommons, making analysis social and collaborative.
One of the things I loved about academia was the social and collaborative nature of doing analysis. Whether it was whiteboarding equations with other grad students, getting feedback from a professor, or opening up a concept to the scrutiny of peer review, there was a constant feedback loop on analyses and the new ideas they generated. As we thought through bringing analysis to GeoCommons, we wanted to bring the same petri dish of innovation to the Web at a much larger scale. Can we make the ideas formulated in an analysis viral – spreading not only to different users but also to other data sets? Letting the ideas mutate – tweaking equations and changing data sets to discover new trends and insights over time and space. Then tracking the evolution of an analysis and the derivative works it creates.
Social alone, though, is not enough to keep up with where the Web is headed. Our analysis must be as dynamic as our data. Whether it is Tweets, check-ins, or my point-of-sale data, nothing is static. Just as my data changes, so should my analysis. We need to go beyond merely counting events as they fly through space and time and discover what is driving them. Knowing how many people have checked in is helpful, but knowing the correlation to store sales – and how I can optimize predictive promotions through that – is powerful.
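To make the check-ins-versus-sales idea concrete, here is a minimal sketch of the kind of calculation involved: a Pearson correlation between two daily series. All the numbers and names below are made up for illustration; in a real deployment both series would stream in from live feeds rather than sit in lists.

```python
# Hedged sketch: how strongly do daily check-in counts track daily store
# sales? The data below is entirely hypothetical.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

checkins = [12, 18, 9, 25, 30, 14, 22]          # hypothetical daily check-ins
sales = [310, 420, 250, 600, 690, 340, 510]     # hypothetical daily sales ($)

r = pearson(checkins, sales)  # close to 1.0 means check-ins predict sales well
```

A value of `r` near 1.0 suggests check-ins are a useful leading signal for sales; the interesting part, as the paragraph above argues, is recomputing this continuously as new events arrive rather than once on a static extract.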
We’ve already pushed out new analytic functionality to several of our enterprise customers for feedback, and we are excited to be launching the same on GeoCommons this January. For a hint of what is to come, check out the icons for a few of the analytics we’ll be launching then.
If you are an enterprise and see a need in your organization today – drop us a line.