A picture is worth a thousand words, and every photo now carries a wealth of data points that can be analysed to tell a whole story.
Startup Crowdoptic uses accessible data from photo uploads to draw conclusions about events such as concerts, sporting matches and natural disasters. Data includes:
- GPS co-ordinates and compass position (from smartphones and now some cameras) to show exactly where each photo was taken
- EXIF data showing which camera was used, and at what settings (“Was the photographer a pro or clueless amateur?”)
- Social metadata (#hashtags, Facebook/Flickr/Instagram album locations) can be scoured for more details
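To make the GPS point concrete: EXIF stores latitude and longitude as degrees, minutes and seconds (as rational numbers), plus a hemisphere reference letter. A minimal sketch of turning that into the decimal co-ordinates a map expects (the sample values below are illustrative, not from a real photo):

```python
from fractions import Fraction

def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style degrees/minutes/seconds to signed decimal degrees."""
    value = float(degrees) + float(minutes) / 60 + float(seconds) / 3600
    # South and West hemispheres are negative in decimal notation
    return -value if ref in ("S", "W") else value

# Example: the EXIF GPSLatitude tag holds three rationals plus a reference letter
lat = dms_to_decimal(Fraction(51), Fraction(30), Fraction(2628, 100), "N")
lon = dms_to_decimal(Fraction(0), Fraction(7), Fraction(3941, 100), "W")
print(round(lat, 4), round(lon, 4))  # → 51.5073 -0.1276
```

Any EXIF library (Pillow, exiftool, etc.) will hand you the raw rationals; this conversion is the only maths between a photo upload and a pin on a map.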
Why is this interesting?
Asking people to download Yet Another App is a barrier, so why not use natural photo-taking behaviour as an interface in itself?
Example: “Take photos of our activation staff at the Winter Olympics to enter our competition. Tag photos, or not – we’ll do the rest”. Technology like Crowdoptic’s could be used (along with image recognition) to identify all photos of the staff, based on the GPS coordinates of staff phones at the time, and display them in an interesting way.
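The matching step in that example boils down to a proximity-and-time join: a photo counts if it was taken close to where a staff phone was pinging at roughly the same moment. A minimal sketch, assuming hypothetical photo and staff-ping records (the field names, 50 m radius and 5-minute window are illustrative choices, not Crowdoptic’s actual method):

```python
import math
from datetime import datetime, timedelta

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def photos_near_staff(photos, staff_pings, radius_m=50, window=timedelta(minutes=5)):
    """Return photos taken within radius_m of any staff ping, inside the time window."""
    matches = []
    for photo in photos:
        for ping in staff_pings:
            close = haversine_m(photo["lat"], photo["lon"], ping["lat"], ping["lon"]) <= radius_m
            recent = abs(photo["time"] - ping["time"]) <= window
            if close and recent:
                matches.append(photo)
                break
    return matches

# Hypothetical data: one photo near a staff ping, one kilometres away
t = datetime(2014, 2, 10, 14, 0)
photos = [
    {"id": "a", "lat": 43.4020, "lon": 39.9552, "time": t},
    {"id": "b", "lat": 43.6000, "lon": 40.2000, "time": t},
]
staff = [{"lat": 43.4021, "lon": 39.9553, "time": t + timedelta(minutes=2)}]
print([p["id"] for p in photos_near_staff(photos, staff)])  # → ['a']
```

In practice the GPS filter would be a cheap first pass to shrink the candidate set before running the (much more expensive) image recognition over it.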