Blip!

The Story Behind

It all started with a joke. We had a funny picture in the office that someone had tagged with a sticky note, just to poke fun at it. While the message itself came across quite well, someone made a sharp observation: the sticker did not match the look and feel of the picture; it somehow did not blend in. That was obviously not the planned outcome, but instead of moving on with the day, the design team proposed a differently colored sticker to make it fit. And it did. An idea was born. After thinking it over for a bit, we realized we could turn this scenario into a fun, entertaining little app and create additional value for upcoming projects: gaining experience with color recognition, and eventually building our own solution or engine.

The Blip Engine

After doing our research, we decided to use Google's Palette library and tweak it to gain expanded control over its behavior. We wanted not only full control of color recognition over an image; we also wanted to modify the outcome so that the applied content would use slightly different colors than the ones in the image (either a lighter or a darker tone). Corner cases turned out to be tricky too: some images were hard to work with, so trial and error proved a viable approach.
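
The post doesn't share the engine's code, but a minimal sketch of both ideas, pulling prominent colors with AndroidX's Palette and shifting them lighter or darker in HSL space, might look like the following (extractColors, shiftLightness, and their parameters are our illustrative names, not Blip's actual API):

```kotlin
import android.graphics.Bitmap
import androidx.core.graphics.ColorUtils
import androidx.palette.graphics.Palette

// Extract the most prominent colors from an image, ordered by how
// much of the image each one covers.
fun extractColors(bitmap: Bitmap, maxColors: Int = 8): List<Int> {
    val palette = Palette.from(bitmap)
        .maximumColorCount(maxColors)   // cap the quantization buckets
        .generate()                     // synchronous; run off the UI thread
    return palette.swatches
        .sortedByDescending { it.population }
        .map { it.rgb }
}

// Nudge a color lighter (positive delta) or darker (negative delta)
// so applied content stands apart from, yet still matches, the image.
fun shiftLightness(color: Int, delta: Float): Int {
    val hsl = FloatArray(3)
    ColorUtils.colorToHSL(color, hsl)
    hsl[2] = (hsl[2] + delta).coerceIn(0f, 1f)   // hsl[2] is lightness
    return ColorUtils.HSLToColor(hsl)
}
```

One plausible way to handle the tricky corner cases mentioned above is to choose the sign of the delta from the swatch's existing lightness: darken colors pulled from light images, lighten colors pulled from dark ones.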

Design Sprint

When it comes to design, we believe in trying out a few approaches before deciding on the final one. In Blip's case, we sketched a few UIs, prototyped them, and went with the most understandable one: the one that carried users seamlessly through the flow of creating, editing, and sharing an image. There were still challenges to overcome, though. For instance, users did not realize that our engine pulls colors from their pictures, mostly because, after selecting an image, they were immediately offered the option to add content.

The color extraction process is animated: extracted colors are associated with small cards that drop in and call out a bar with the corresponding colors. Blips can be dragged onto the image and thrown off of it. They also follow a fluid path depending on which corner they are dragged or rotated from, ensuring a smooth experience.
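
The write-up doesn't show the gesture code, but the drag part of that behavior on Android can be sketched with a plain OnTouchListener (a generic pattern, not Blip's actual implementation; a real fling-to-dismiss would also track velocity, e.g. with VelocityTracker, on release):

```kotlin
import android.view.MotionEvent
import android.view.View

// Generic drag-to-move handler for a sticker view.
class StickerDragListener : View.OnTouchListener {
    private var dX = 0f
    private var dY = 0f

    override fun onTouch(v: View, e: MotionEvent): Boolean = when (e.actionMasked) {
        MotionEvent.ACTION_DOWN -> {
            // Remember the offset between the finger and the view's origin.
            dX = v.x - e.rawX
            dY = v.y - e.rawY
            true
        }
        MotionEvent.ACTION_MOVE -> {
            // Follow the finger, preserving the original grab offset.
            v.x = e.rawX + dX
            v.y = e.rawY + dY
            true
        }
        else -> false
    }
}
```

Attaching it is a one-liner: stickerView.setOnTouchListener(StickerDragListener()).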

Outcome

Blip was developed using agile on both iOS and Android. As expected, given the fluid nature of the UI, iOS took less time to develop and behaved more smoothly, but we're happy with how both platforms turned out. As imagined, Blip was a playground for us: a small adventure into new UI challenges and a fun way to gain extensive knowledge about color recognition. Blip's engine will be reused and extended as a separate entity, become part of many more apps and components, and surface in future projects as we move forward. As for the app itself, it was a fun experiment that has come to its end.

What We Learned

As stated before, Blip was more an experiment than an actual project: it started as a lean design thinking workshop, translated into agile design/development cycles, and got a few tweaks along the way. We tried hard to do a lot in little time and to test it with real users. Bottlenecks did arise throughout the project cycle, but they did not stop us from finishing the MVP. The lessons learned mostly validate our internal process and the design thinking workshops we insist on. Blip is proof of how much can be accomplished in little time with limited resources, and, more than that, of how much experience can be gained from a fun, inspiring little project that started out as an internal collaboration.

Get Blip here

App Store
Google Play