Color me impressed — the big idea behind Color

There’s been a lot of head-scratching in the past week about Color having raised $41 million for another photo-sharing application. One questioner on Quora asked “How does the Color photography app compare to Picplz, Path and Instagram?”

Although, on the surface, Color seems to be just another mobile photo-sharing app, it is really the first incarnation of a ubiquitous location-aware sensor network.

Today’s cell phones are in many ways more powerful than laptops and desktops because they are packed with sensors. A modern smartphone has GPS, WiFi, Bluetooth, compass, gyroscope, light sensor, microphone and camera — at a minimum. All of these sensors can capture data to be analyzed.
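As a rough illustration of what that sensor payload might look like (this is my own sketch, not Color’s actual data model or API), a single upload from such an app could bundle a snapshot of readings alongside each photo:

```python
from dataclasses import dataclass, field
from typing import Optional
import time

@dataclass
class SensorSnapshot:
    """Hypothetical bundle of sensor readings attached to one photo upload."""
    timestamp: float = field(default_factory=time.time)  # seconds since epoch
    latitude: float = 0.0                                 # from GPS
    longitude: float = 0.0
    heading_deg: Optional[float] = None                   # from compass
    nearby_wifi: list[str] = field(default_factory=list)        # visible access points
    nearby_bluetooth: list[str] = field(default_factory=list)   # other devices in range
    ambient_light_lux: Optional[float] = None
    photo_id: Optional[str] = None

# Example snapshot for one photo taken in San Francisco
snap = SensorSnapshot(latitude=37.7793, longitude=-122.4193, photo_id="example-123")
print(snap)
```

Individually each field is mundane; the interesting part is collecting them together, continuously, from many phones in the same place.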

Ever wonder how Google can show you traffic on side streets? It’s by crunching location data sent out by Android phones. Skyhook Wireless has used its WiFi location-lookup system to create visualizations that correlate location with time of day. (Scroll down on that page to watch a video of user flows in and out of Manhattan.)
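A toy sketch of the underlying idea (my assumption about how probe data becomes a traffic estimate, not Google’s or Skyhook’s actual pipeline): group anonymous speed reports by road segment and average them.

```python
from collections import defaultdict

def estimate_segment_speeds(pings):
    """
    pings: iterable of (road_segment_id, speed_kmh) tuples reported by phones.
    Returns a dict mapping each road segment to its mean observed speed.
    A real system would also map-match raw GPS fixes to segments, weight by
    recency, and filter outliers such as parked cars and pedestrians.
    """
    totals = defaultdict(lambda: [0.0, 0])   # segment -> [sum_of_speeds, count]
    for segment, speed in pings:
        totals[segment][0] += speed
        totals[segment][1] += 1
    return {seg: s / n for seg, (s, n) in totals.items()}

# Example: three phones crawling along one street, one moving freely on another
print(estimate_segment_speeds([
    ("oak_st_100", 18.0), ("oak_st_100", 22.0), ("oak_st_100", 20.0),
    ("fell_st_200", 45.0),
]))
# {'oak_st_100': 20.0, 'fell_st_200': 45.0}
```

With enough phones reporting, even side streets get covered without any dedicated traffic infrastructure.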

Color is trying to take all of those inputs and layer social networks onto them.

If Color’s vision is fully realized (or my vision of Color’s vision), we can expect to see applications like these:

  • Breaking news. By detecting abnormal usage spikes, Color could quickly identify where news is happening. Because the app is automatically location aware, it’s possible to distinguish between people who are actually at the scene and those elsewhere who may be reacting to the event. (A rough sketch of this kind of spike detection follows the list below.) See my post Adding Color to breaking news.
  • Security line timers. Get accurate times for various security checkpoints. Copenhagen International Airport is deploying technology that will use WiFi signals to track passenger traffic flows.
  • Race finders. Marathons and similar events today use chips to track runners. Imagine that Color is able to identify all of the spectators and runners with the app during Bay to Breakers. Based on your previous social interactions, Color would know who your favorite runners are. Not only would you be able to track their position on a map, you’d be able to zero in on the pictures being taken in the vicinity of those runners. It would also be able to provide you with a map to reconnect after the race.
  • Person-to-person transactions. Going to a game at AT&T Park, but don’t have a ticket? Fire up Color and see people nearby who have tickets for sale. Tickets from people you know would be prioritized. Instead of sitting next to strangers, you might end up next to friends who have an extra seat.
  • Person recognizer. This could be a huge boon to people with a poor memory for faces. The person at the party looks vaguely familiar. You know you’ve seen them before, but you’re too embarrassed to ask for the name. Pull up previous interactions and find out their name and the contexts in which you’ve met.
  • Bar finder. When I go out, I often have a mood in mind. I may want to be really social or I may want to chill. With Color, I could pull up a bar and see what the feel is right now by looking through the photostream. If there are no pictures, I could potentially ping someone there and ask them to take a picture. (It gives new meaning to “Would you mind taking a picture for me?”) Foursquare is providing a variant of this with Foursquare 3.0’s recommendations.
  • Search and rescue. Missions could be tracked automatically, making for more efficient operations. Pictures from a location could be used to identify victims, discover who may still be missing and to notify next of kin.
  • CalTrain tracker. Instead of the horribly inaccurate data provided by CalTrain, Color users would automatically crowdsource the data. You wouldn’t even have to check manually for updates. They would be automatically pushed to you.
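On the breaking-news idea above, here is a minimal sketch of the kind of spike detection involved. This is entirely my assumption, not Color’s code: compare the current photo-upload count in each location cell against its recent baseline and flag cells that deviate by more than a few standard deviations.

```python
from statistics import mean, stdev

def find_spiking_cells(history, current, threshold=3.0):
    """
    history: dict mapping location cell -> list of upload counts in past intervals.
    current: dict mapping location cell -> upload count in the latest interval.
    Flags cells whose current count exceeds baseline mean + threshold * stdev.
    """
    spikes = []
    for cell, counts in history.items():
        if len(counts) < 2:
            continue  # not enough baseline to judge
        baseline, spread = mean(counts), stdev(counts)
        if current.get(cell, 0) > baseline + threshold * max(spread, 1.0):
            spikes.append(cell)
    return spikes

# Example: a normally quiet cell suddenly produces 40 uploads in one interval
history = {"soma_cell_17": [2, 3, 1, 4, 2], "mission_cell_03": [20, 25, 22, 18, 21]}
current = {"soma_cell_17": 40, "mission_cell_03": 23}
print(find_spiking_cells(history, current))  # ['soma_cell_17']
```

A busy tourist spot stays quiet under this scheme because its baseline is already high; only cells that are unusually active relative to their own history get flagged.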

That’s the grand vision. In order for Color to accomplish any of these things, it will have to reach large scale. This is a challenge because Color is a separate application and not built into the OS. Google can use Android phones to detect traffic because that capability is baked into the OS. Likewise, Google and Apple get location and WiFi network information based on other things that people do on their devices.

Color needs to create an application that provides enough value that people launch it and enable all of those sensors. The application that’s out right now falls short of that goal. It doesn’t deliver an instant wow experience and by most accounts is confusing. Color has tremendous potential; we just need to see it demonstrated better.
