How Color can chart a course to avoid being the next Wave

Yesterday, I wrote about Color, a new app that has the promise to become a ubiquitous, location-aware sensor network. Its first incarnation is a photo sharing application available on iPhone and Android.

The initial launch has met with much criticism, including comparisons to Wave — a doomed social effort from Google.

Consider the similarities:

  • Enormous expectations. Wave was hyped by Google and given high-profile executive attention at Google I/O. Color’s expectations have been set by having raised $41 million.
  • Poor out-of-the-box experience. Both Wave and Color have poor first experiences for the casual user. Even industry luminaries are scratching their heads.
  • Big change in user behavior. Both Wave and Color go against established patterns of user behavior. Wave tried to replace email. Color is challenging the notion of manually creating friend lists.

Google made a number of key execution mistakes in the launch of Wave. Fortunately, Color has avoided the biggest one: Wave was opened slowly on an invite-only basis. Despite the fact that the product was based on group interactions, you couldn’t get enough invites. I know. I tried to get my entire team to use Google Wave, but I couldn’t secure enough invites. Color doesn’t require a special invite.

The Color app should provide an easy way for viewers to see where a picture was taken.

There are a couple of other key mistakes Google made. Here’s how Color can avoid them:

  • Lack of notifications. When I made a change in Google Wave, the other participants had no way of knowing about it, short of logging back into the service. After I did this a few times and got no response, I stopped using it. The same problem exists with Color. People’s photos appear randomly in my Color feed, but I don’t get notified when it happens. Getting notified when other people nearby are using Color would increase usage because you wouldn’t feel like you were talking to an empty room. It would also make face-to-face interactions easier. Notifications are an important part of ramping up any social network. At some point, your product will become so popular that people will use it all day unprompted. (I shut off Facebook email notifications long ago.) Until then, you need to nudge people to use it. A rough sketch of how such a nearby-activity nudge could work follows this list.
  • Lack of clear use cases. Most people have a really difficult time adopting radically new product concepts. You need to hold their hands and show them how it could apply to their lives. Wave didn’t do that. You were dumped into a blank canvas with a lot of unfamiliar controls. Color is much the same. There’s little guidance as to how Color can improve your life. The initial controls are so tight that you also can’t easily see how other people are using the product.
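To make the notification point concrete, here is a minimal sketch of how a nearby-activity nudge could work. Everything in it is an assumption rather than Color’s actual system: the recent_posts feed, the send_push hook, and the distance and recency thresholds are all hypothetical.

```python
import math
import time

NEARBY_RADIUS_M = 150        # what counts as "nearby" (assumed, not Color's value)
RECENCY_WINDOW_S = 15 * 60   # only nudge about activity from the last 15 minutes

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    r = 6_371_000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nudge_nearby_users(user, recent_posts, send_push):
    """Notify `user` when someone else has just posted a photo close by.

    `recent_posts` is an iterable of dicts with author, lat, lon and timestamp;
    `send_push` is whatever push hook the platform provides. Both are stand-ins
    for services this sketch assumes exist.
    """
    now = time.time()
    for post in recent_posts:
        if post["author"] == user["id"]:
            continue  # never notify people about their own photos
        if now - post["timestamp"] > RECENCY_WINDOW_S:
            continue  # too old to feel like a live conversation
        dist = haversine_m(user["lat"], user["lon"], post["lat"], post["lon"])
        if dist <= NEARBY_RADIUS_M:
            send_push(user["id"], f"Someone {int(dist)} m away just shared a photo on Color")
            return  # one nudge is enough; don't spam
```

The point isn’t the math; it’s that a small, rate-limited nudge like this is what keeps the room from feeling empty.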

Then there are a few things specific to Color and its goals that should be improved:

  • Lack of location/time liquidity. Color matches you with people based on photos taken at the same place at the same time. That’s overly restrictive. Outside of major events and cities like San Francisco, those coincidences are going to be infrequent at best at this stage of the product’s adoption. It’s as if you launched foursquare and could only see tips left by users in the last 5 minutes. Older content has value. A few years ago, I took a set of pictures at Liberty Tavern. Those pictures are valuable even now, and they’re certainly better than showing nothing. Showing older content would also encourage more people to take pictures. If privacy is a concern, older pictures with faces could be excluded with face detection software. (A sketch of this kind of time-decayed ranking follows this list.)
  • Locations aren’t visible. For a product that is focused on location, it doesn’t do a good job of showing it. I have random people in my Color feed, but I don’t know where I might have bumped into them — I have to guess at that. It would be better if I could select a person and see a map of where I met them with the date and time. Someone commented on one of my pictures asking where it was taken. That’s not a question they should have to ask. The data is already in the network; it should be accessible. (With the caveat that private places should be obscured so that someone doesn’t follow you home.)
  • People can’t connect with experts. One of the big reasons for the success of Twitter is that it works even if you don’t have any friends. When I’ve done user research on social products in the past, I inevitably had people who said “I don’t have any friends” or “my friends are stupid.” Social products need to work even in these scenarios. In fact, most of my real friends aren’t on Twitter. But I can still derive value from the people who are. With any social product, you’ll have a few people on the bleeding edge who can seed content for you. Exploit that. To fit into Color’s model of not requiring explicit follows, these experts could be followed automatically when someone browses their pictures.
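On the liquidity point, the fix doesn’t require abandoning location and time; it just means treating recency as a ranking signal instead of a hard cutoff. Here is a rough sketch of that idea, reusing the haversine_m helper from the earlier sketch. The photo fields, distance scale and half-life are illustrative assumptions, not tuned values.

```python
import math
import time

def relevance(photo, here_lat, here_lon, now=None,
              distance_scale_m=200.0, half_life_days=180.0):
    """Score a photo for the current viewer: nearby photos score high,
    and older photos fade gradually instead of disappearing entirely.

    `photo` is assumed to be a dict with lat, lon and timestamp fields.
    """
    now = now or time.time()
    dist = haversine_m(here_lat, here_lon, photo["lat"], photo["lon"])  # helper from the sketch above
    age_days = (now - photo["timestamp"]) / 86_400
    proximity = math.exp(-dist / distance_scale_m)   # 1.0 at the exact spot, fading with distance
    freshness = 0.5 ** (age_days / half_life_days)   # halves every ~6 months, never reaches zero
    return proximity * freshness

def feed_for(viewer_lat, viewer_lon, photos, limit=50):
    """Rank every candidate photo instead of filtering to 'right now only'."""
    ranked = sorted(photos,
                    key=lambda p: relevance(p, viewer_lat, viewer_lon),
                    reverse=True)
    return ranked[:limit]
```

Under a scheme like this, my years-old Liberty Tavern pictures would still surface for someone standing in the restaurant today; they’d just rank below anything taken this afternoon.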

Color me impressed — the big idea behind Color

There’s been a lot of head scratching in the past week about Color having raised $41 million for another photo sharing application. One questioner on Quora asked “How does the Color photography app compare to Picplz, Path and Instagram?”

Although Color seems on the surface to be just another mobile photo sharing app, it is really the first incarnation of a ubiquitous, location-aware sensor network.

Today’s cell phones are in many ways more powerful than laptops and desktops because they are packed with sensors. A modern smartphone has GPS, WiFi, Bluetooth, compass, gyroscope, light sensor, microphone and camera — at a minimum. All of these sensors can capture data to be analyzed.

Ever wonder how Google can show you traffic on side streets? It’s by crunching location data sent out by Android phones. Skyhook Wireless has used its WiFi location lookup system to create visualizations that correlate location with time of day. (Scroll down on that page to watch a video of user flows in and out of Manhattan.)
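The general idea behind phone-derived traffic is simple to sketch, even though the real pipelines are far more sophisticated. The following is a loose illustration, not Google’s or Skyhook’s actual system; it assumes raw fixes have already been matched to road segments, which is itself a hard problem.

```python
from collections import defaultdict

def estimate_segment_speeds(pings):
    """Turn anonymous location pings into rough per-road-segment speeds.

    `pings` is assumed to be a list of (device_id, segment_id, timestamp_s, position_m)
    tuples, where position_m is the distance along the road segment. A real system
    would first map raw GPS fixes onto segments and scrub identifying data.
    """
    tracks = defaultdict(list)
    for device, segment, ts, pos in pings:
        tracks[(device, segment)].append((ts, pos))

    segment_speeds = defaultdict(list)
    for (device, segment), track in tracks.items():
        track.sort()
        for (t0, p0), (t1, p1) in zip(track, track[1:]):
            if t1 > t0:
                segment_speeds[segment].append(abs(p1 - p0) / (t1 - t0))  # meters per second

    # Average across devices; production systems weight by recency and filter outliers.
    return {seg: sum(v) / len(v) for seg, v in segment_speeds.items() if v}
```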

Color is trying to take all of those inputs and layer social networks onto them.

If Color’s vision is fully realized (or my vision of Color’s vision), we can expect to see applications like these:

  • Breaking news. By detecting abnormal usage spikes, Color could quickly identify where news is happening. Because the app is automatically location aware, it’s possible to distinguish between people who are actually at the scene and those elsewhere who may be reacting to the event. See my post Adding Color to breaking news. (A toy sketch of this kind of spike detection follows this list.)
  • Security line timers. Get accurate times for various security checkpoints. Copenhagen International Airport is deploying technology that will use WiFi signals to track passenger traffic flows.
  • Race finders. Marathons and similar events today use chips to track runners. Imagine that Color is able to identify all of the spectators and runners with the app during Bay to Breakers. Based on your previous social interactions, Color would know who your favorite runners are. Not only would you be able to track their position on a map, you’d be able to zero in on the pictures that are being taken in the vicinity of those runners. It would also be able to provide you a map to reconnect after the race.
  • Person-to-person transactions. Going to a game at AT&T Park, but don’t have a ticket? Fire up Color and see people nearby who have tickets for sale. Tickets from people you know would be prioritized. Instead of sitting next to strangers, you might end up next to friends who have an extra seat.
  • Person recognizer. This could be a huge boon to people with a poor memory for faces. The person at the party looks vaguely familiar. You know you’ve seen them before, but you’re too embarrassed to ask for the name. Pull up previous interactions and find out their name and the contexts in which you’ve met.
  • Bar finder. When I go out, I often have a mood in mind. I may want to be really social or I may want to chill. With Color, I could pull up a bar and see what the feel is right now by looking through the photostream. If there are no pictures, I could potentially ping someone there and ask them to take a picture. (It gives new meaning to “Would you mind taking a picture for me?”) Foursquare is providing a variant of this with Foursquare 3.0’s recommendations.
  • Search and rescue. Missions could be tracked automatically, making for more efficient operations. Pictures from a location could be used to identify victims, discover who may still be missing and to notify next of kin.
  • CalTrain tracker. Instead of the horribly inaccurate data provided by CalTrain, Color users would automatically crowdsource the data. You wouldn’t even have to check manually for updates. They would be automatically pushed to you.
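The breaking-news item above hinges on spotting an unusual burst of activity in one place. Here is a toy version of that signal, assuming photo counts have already been bucketed into geographic cells per time interval; the cell format and thresholds are arbitrary choices for illustration.

```python
from statistics import mean, pstdev

def detect_spikes(counts_by_cell, min_sigma=3.0, min_count=10):
    """Flag geo-cells whose latest photo count is far above their own baseline.

    `counts_by_cell` maps a cell id to a list of per-interval photo counts,
    oldest first, most recent last.
    """
    alerts = []
    for cell, history in counts_by_cell.items():
        if len(history) < 5:
            continue  # not enough history to call anything abnormal
        *baseline, latest = history
        mu, sigma = mean(baseline), pstdev(baseline)
        threshold = mu + min_sigma * max(sigma, 1.0)  # floor sigma so quiet cells don't over-trigger
        if latest >= max(threshold, min_count):
            alerts.append((cell, latest, mu))
    return alerts
```

Cross-referencing the flagged cells with which users are physically inside them is what would separate eyewitnesses from people merely reacting online.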

That’s the grand vision. In order for Color to accomplish any of these things, it will have to reach large scale. This is a challenge because Color is a separate application and not built into the OS. Google can use Android phones to detect traffic because that capability is baked into the OS. Likewise, Google and Apple get location and WiFi network information based on other things that people do on their devices.

Color needs to create an application that provides enough value that people launch it and enable all of those sensors. The application that’s out right now falls short of that goal. It doesn’t deliver an instant wow experience, and by most accounts it is confusing. Color has tremendous potential; we just need to see it demonstrated better.
