I had the opportunity and privilege to get an early look at the new Amazon Fire phone. It delights in many ways, but I’ll focus on the shopping experience enabled through Firefly.

For those who may not remember, Amazon put a dedicated physical button on the left-hand side of the phone that launches directly into image recognition. If the image is recognized, then a web-based mCommerce experience launches. The user can then buy the product or put it on a wish list, among other things. From there, the experience is more ‘traditional Amazon.’ What’s new is the recognition of images, email addresses, URLs, and so on.

Why is selling mobile phones important for Amazon? mCommerce in the US alone will add up to nearly $100B by the end of 2014. The new battleground for retailers is in the mobile moment – the point in time and space when a consumer pulls out her phone to get something she needs immediately and in context. Amazon’s Firefly service facilitates two core types of mobile sales moments:

  • Impulse Sales Moments – these are often flash sales (e.g., WTSO.com, SteepAndCheap, etc.) or spontaneous purchases (e.g., Groupon). The opportunity for Amazon here is in minimizing the friction between consumers seeing something they want and their buying it – before they forget about it or find it later in a nearby store.
  • Replenishment Sales Moments – the phone (or something like an Amazon Dash) is with me when I realize the shampoo bottle or milk carton is empty or I need more toothpaste.

I tried several recognition scenarios with mixed success using the Firefly service. Generally, known items were recognized in about 4 seconds. Firefly seemed to give up after about 10 seconds if it couldn’t identify the product. Sometimes (especially with books) it would return a “this product is not known” message, and then find the item a few seconds later. It was fun to watch the analysis in action as the blue dots shifted their focus from the middle to the edges of the screen.

Here’s what I wanted the service to do for me:

  1. “I see that! I want that!” (an impulse sales moment). I scanned books, sneakers, food, a shirt, and a cat house. These items were mixed in terms of availability on Amazon.com. Some were. Some weren’t. What disappointed me was that items Amazon sells (e.g., Drip Drop – a hydration solution) weren’t recognized. But the experience was still inspiring.
  2. “Ack. I’m nearly out of shampoo!” (a replenishment sales moment). Firefly had a mixed ability to identify products like soap and shampoo. At times I wondered if it would be easier to just trigger recognition off of the words in the label.
  3. “Ah, that is cool. I want that.” (again, an impulse moment, but from a magazine). Understandably Firefly struggled to identify an item of clothing on a page – even with a label – but especially if there were a lot of other items on the page. I didn’t expect Firefly to do this, but I was really hopeful that it could.
  4. "I want that song!” Firefly very easily identified music playing on the radio and offered a seamless process for me to buy that music.
  5. “Switching from my TV to my phone.” I used the video identification portion of Firefly not only to identify a TV show, but also the chapter of the show, to be able to transition to that portion on my phone. This feature was very impressive.

Was shopping using Firefly faster than using the app on my iPhone? Not necessarily. But I can genuinely imagine the potential. It’s always hard to be first in this market. What Amazon has created with Firefly is impressive. Amazon clearly understands the potential of winning in customers’ mobile moments. Like all connected products, I expect to see continual updates and improvements. It’s good now, but I can see it being great in a few months’ or a year’s time.

What’s the next milestone for Amazon? We’ll have to wait and see if other retailers use the technology in their apps. But the number of developers creating apps for the Fire platform continues to grow, and I’m excited to see what they do next.