A class of apps is endowing smartphone users with what might have passed for superpowers a decade ago. They work by keeping our smartphones’ mics, GPS and even cameras listening, watching and seeking out every signal coming from our surroundings. I have chosen a few examples of these apps, which are listed below. In many ways, they are a logical step in the evolution of mobile devices.
Shazam is undoubtedly one of those “I have to have it” apps for anyone who has ever heard a song or tune they love – whether in a car with background noise, in an advert or in a crowded shop – but has not been able to identify the artist or song.
How it works:
Shazam fingerprints a comprehensive catalogue of music and stores the fingerprints in a database. As a user, I “tag” a song I hear, which fingerprints a 10-second sample of audio. The Shazam app uploads the fingerprint to Shazam’s service, which searches for a matching fingerprint in the database. If a match is found, the song info is returned to me; otherwise an error is returned.
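The tag-and-match flow above can be sketched in a few lines. This is an illustrative toy, not Shazam’s actual algorithm (which hashes pairs of spectrogram peaks): here a “fingerprint” is simply a set of hashes over short chunks of a sample sequence, and a match is whichever catalogue entry shares the most hashes with the tagged sample. The songs and sample values are invented.

```python
import hashlib

def fingerprint(samples, chunk=4):
    """Hash overlapping chunks of a sample sequence into a set of fingerprints."""
    hashes = set()
    for i in range(len(samples) - chunk + 1):
        hashes.add(hashlib.md5(str(samples[i:i + chunk]).encode()).hexdigest()[:8])
    return hashes

# Server-side database: song title -> fingerprint set (toy data).
catalogue = {
    "Song A": fingerprint([1, 5, 3, 7, 2, 8, 4, 6]),
    "Song B": fingerprint([9, 2, 6, 1, 5, 5, 3, 0]),
}

def identify(sample):
    """Return the best-matching song, or None (the 'error' case)."""
    query = fingerprint(sample)
    best, score = None, 0
    for title, prints in catalogue.items():
        overlap = len(query & prints)
        if overlap > score:
            best, score = title, overlap
    return best

print(identify([5, 3, 7, 2]))  # a short 'tag' overlapping Song A -> "Song A"
```

Because matching counts overlapping hashes rather than requiring the whole track, a short, partial sample is enough to identify the song – which is what makes a 10-second tag workable.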
More recently, a new app, ASAP54, was launched that works in a similar way to Shazam. It combines visual recognition technology with a social community and a team of personal stylists to take an outfit we see on the street or in a glossy magazine, match it against a database and instantly scout out the exact item we’re looking for. If it’s unable to find what we’re after, it suggests a range of similar products with click-through links to buy, and its stylists promise to send us five similar suggestions within 24 hours via email.
How it works:
I snap and upload a picture of an item or look I’m after – whether a designer dress seen in a magazine or a swatch of fabric – then select the type of item I’m interested in, such as shoes, a dress or trousers. The app searches for an exact match in its database, or suggests a range of related products, with the option to click through to buy. It’s designed to be faster and easier than a search engine.
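A minimal sketch of that “exact match or similar suggestions” logic, under loose assumptions: a real app would extract feature vectors from the photo with a trained vision model, whereas here the product database, categories and vectors are all invented, and similarity is plain cosine similarity over hand-made vectors.

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical product database: (name, category, feature vector).
products = [
    ("Red midi dress", "dress", [0.9, 0.1, 0.3]),
    ("Black ankle boots", "shoes", [0.2, 0.8, 0.5]),
    ("Scarlet gown", "dress", [0.85, 0.15, 0.35]),
]

def search(photo_features, category, threshold=0.99):
    """Return an exact-enough match, else a list of similar suggestions."""
    candidates = [(cosine(photo_features, f), name)
                  for name, cat, f in products if cat == category]
    candidates.sort(reverse=True)
    best_score, best_name = candidates[0]
    if best_score >= threshold:
        return best_name                      # "exact" match
    return [name for _, name in candidates]   # ranked similar products

print(search([0.9, 0.1, 0.3], "dress"))  # -> "Red midi dress"
```

The user-selected category (shoes, dress, trousers) narrows the search space first, which is why the app asks for it rather than matching against the entire catalogue.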
Even the aviation industry is not spared. Point our smartphones at the sky the next time a plane flies overhead, and the Flightradar24 app can tell us not only the make and model of the plane, but also where it departed from, where it’s headed, how fast it’s going, how high it is and more!
How it works:
It is an inspired combination of smartphone hardware, mapping and real-time access to massive data stores tracking every commercial flight. The app mixes that with data from the airlines, then matches it all against the GPS coordinates from our phones to determine which plane we must be looking up at.
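The matching step can be sketched as a nearest-neighbour lookup: given the phone’s GPS fix and a live feed of aircraft positions, pick the aircraft closest to the point overhead. This is a hedged simplification of whatever Flightradar24 actually does, and the flight data below is invented for illustration.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Toy snapshot of the live flight feed (positions are made up).
flights = [
    {"callsign": "BAW117", "lat": 51.49, "lon": -0.20, "dest": "JFK"},
    {"callsign": "EZY889", "lat": 52.10, "lon": 0.30, "dest": "AMS"},
]

def plane_overhead(phone_lat, phone_lon):
    """Return the flight nearest to the phone's GPS position."""
    return min(flights,
               key=lambda f: haversine_km(phone_lat, phone_lon, f["lat"], f["lon"]))

print(plane_overhead(51.50, -0.12)["callsign"])  # nearest to central London
```

A real implementation would also weigh altitude and the phone’s compass heading, but the core idea is the same: reduce “what am I looking at?” to a distance query over live position data.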
This evolutionary phase of apps integrates an ‘awareness’ of where we are and what is happening around us with highly personal contextual data, recognising human speech, motion, visual cues and audio.
This is where automatic content recognition (ACR) comes in. ACR relies on a watermark, signature or fingerprint embedded in content across all possible platforms. It opens many doors for content synchronisation across screens, making it possible to trace the traffic route of a specific piece of content no matter where it is viewed. In fact, ACR could prime the media landscape to track content even across future, yet-to-be-invented platforms.
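The platform-agnostic tracing idea can be illustrated simply: if every copy of a piece of content carries the same fingerprint, each playback can be logged against that fingerprint and the content’s traffic route reconstructed later, regardless of which screen it appeared on. The events below are invented.

```python
from collections import defaultdict

# fingerprint -> list of (platform, timestamp) view events
views = defaultdict(list)

def log_view(fingerprint, platform, timestamp):
    """Record that fingerprinted content was viewed on some platform."""
    views[fingerprint].append((platform, timestamp))

def traffic_route(fingerprint):
    """Return where and when this content was viewed, in time order."""
    return sorted(views[fingerprint], key=lambda event: event[1])

log_view("fp-42", "smart-tv", 100)
log_view("fp-42", "smartphone", 180)
log_view("fp-42", "tablet", 250)
print(traffic_route("fp-42"))
# [('smart-tv', 100), ('smartphone', 180), ('tablet', 250)]
```

Note that nothing here depends on the platform type, which is why the same scheme would extend to platforms that do not exist yet.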
Delivering contextual data is a tantalising opportunity for new revenue streams. ACR offers consumers deeper immersion and interaction through dynamic, seamless interlinking of devices, users, content and applications with advertising opportunities – for example, giving TV viewers the ability to buy the products featured in each scene they see on TV, matched via ACR.
In the multi-screen environment, ACR is also a tool that gives a smart device the ability to become “content-aware.” This automatic recognition can then be employed to trigger complementary content, without the need for the consumer to manually enter website addresses or search for the relevant information on multiple media channels. Advertisements can be triggered, allowing for the synchronisation of value-added functionality such as content-specific background information, hyperlinks and synchronised social news feeds, all within the application. In addition, such services enable application providers to work in close partnership with advertising agencies and brands to further monetise their application platforms.
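As a hedged sketch of that second-screen trigger: once the companion app recognises what is on the TV (by fingerprint), it looks up and surfaces the complementary material without the viewer typing anything. All IDs, URLs and payloads below are invented for illustration.

```python
# Hypothetical mapping from recognised fingerprints to companion material.
companion_content = {
    "ad-break-007": {
        "background": "Behind the scenes of the shoot",
        "link": "https://example.com/buy",
        "social_feed": "#ThatAdvert",
    },
}

def on_recognised(fingerprint):
    """Called when mic-captured audio matches a known content fingerprint."""
    extras = companion_content.get(fingerprint)
    if extras is None:
        return "No companion content"
    return f"Showing: {extras['background']} | {extras['link']} | {extras['social_feed']}"

print(on_recognised("ad-break-007"))
```

The key design point is that recognition, not user input, drives the lookup – the viewer never enters an address or searches; the app reacts to what the room is already playing.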
ACR also provides a vital strategic and tactical tool for the multi-screen environment in which today’s consumers consume content – the ability to track highly granular viewing habits and to identify detailed real-time information about where, when, for how long and on what device content is being consumed.
Nonetheless, what if, through this technology, we become the content? What would happen if these apps could identify someone speaking at the table across from us and automatically take us to their LinkedIn profile without their awareness? That could definitely be considered a breach of privacy – and that’s the power, promise and pitfall of these apps.
These apps may seem like magic now, but are we comfortable enough to embrace Big Brother? Because essentially, for these apps to be meaningful, they have to be always on.