In its early guise, Qsic was capitalising on a simple problem that many businesses didn’t know they had: playing background music in their public venues from streaming services using private subscriptions, which is illegal. Back then, in 2011, there were no smart speakers, talk of automatons was confined to Ex Machina-style dystopian worlds, and the idea that a self-driving car would be allowed on our roads was ludicrous.
Today, the number of smart-speaker users is growing at an annual rate of 47.9%, robots are being introduced into the workplace in droves, and self-driving cars are no longer the stuff of fantasy. In 2018, it’s hard to miss the buzz around AI and machine learning (ML). But what are these terms? How do machines with these capabilities work? And what has Qsic’s patented, industry-leading AI, AVA (Autonomous Volume Control) – which can learn, predict and adapt to changing business conditions – got to do with it all?
The concept of AI emerged along with the first computers, which – simply put – stored information and made calculations. As our understanding of how the human brain works improved, so did the technology around AI. While computers still make incredibly complex calculations, it’s not these developments that excite technologists. What gets them going is creating machines that can make decisions like humans and complete tasks intuitively.
Machine learning is essentially a branch of AI built on this idea of machines which process data and learn on their own – completely autonomously. After all, it’s more efficient for humans to teach computers how to think for themselves than to feed them the data needed for each task and then set them to complete as many tasks as possible. This idea, coupled with the unprecedented information storage the internet offered, meant machines could look at vast amounts of data, absorb it and then make decisions based on what they had learned. This is where we are today – creating machines which can learn without being explicitly programmed, discovering insights for themselves through various data touchpoints.
What’s this got to do with Qsic and AVA? Like many good ideas, AVA was one that Qsic’s co-founders, Matt Elsley and Nick Larkins, stumbled upon. Feedback from clients was that while being able to stream curated playlists legally into their venues was a real coup, a major issue for them was volume control. Specifically, their ever-changing foot traffic affected how their background music sounded: it often felt too soft when the space was busy, then blaringly loud when, say, the lunchtime rush finished. This left staff continuously adjusting the volume of their systems when they could be doing far more useful things.
Always at the forefront of developments in technology, the duo introduced decibel readers as a data touchpoint in these venues and began working on algorithms that would let the platform adjust volume autonomously based on its own learnings. This independence would allow businesses to vary volume levels across their venues without human involvement – a major win, not least because one store may be experiencing a very different level of foot traffic from another, but also because it would free staff to focus on customers rather than worrying about adjusting the background music.
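As an illustration only – the controller name, the target offset and the smoothing factor below are assumptions for the sake of the example, not Qsic’s patented AVA algorithm – a decibel-driven feedback loop of this kind can be sketched in a few lines of Python:

```python
class AmbientVolumeController:
    """Hypothetical sketch: keep playback a fixed margin above ambient noise."""

    def __init__(self, target_offset_db=5.0, smoothing=0.3,
                 min_volume=20.0, max_volume=80.0):
        self.target_offset_db = target_offset_db  # desired dB above ambient noise
        self.smoothing = smoothing                # damps sudden jumps (0 to 1)
        self.min_volume = min_volume
        self.max_volume = max_volume
        self.volume = min_volume

    def update(self, ambient_db):
        """Take one decibel reading and nudge the playback level toward it."""
        desired = ambient_db + self.target_offset_db
        # Clamp to the venue's acceptable range.
        desired = max(self.min_volume, min(self.max_volume, desired))
        # Exponential smoothing: the volume glides rather than jumps,
        # so a brief noise spike doesn't cause a jarring change.
        self.volume += self.smoothing * (desired - self.volume)
        return self.volume


ctrl = AmbientVolumeController()
# Simulated readings: a quiet morning, the lunchtime rush, then quiet again.
for ambient_db in [40, 40, 65, 70, 68, 45, 40]:
    level = ctrl.update(ambient_db)
```

The key design idea this sketch tries to capture is that the system responds to the room itself, not to a clock or a staff member: as foot traffic (and therefore ambient noise) rises and falls, playback volume follows it automatically.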
Add more data touchpoints, such as temperature, into the mix, plus the potential to feed back to clients the insights gleaned from the machine’s learning, and the possibilities for AVA are endless, steering Qsic into a whole new stratosphere of venue environment control and putting it firmly at the forefront of technology for businesses centred on making patrons feel good. Qsic saw the potential AI brought and, like all great innovators, evolved with it. Now, Qsic is not only a commercial music streamer but an industry-first, game-changing system, able to listen to a venue and change its environment autonomously.