The power of machine learning is becoming clearer to the general public every day, right before their eyes, often without them even realizing what they’re seeing. The virtual voice assistants in smartphones and smart homes are driven largely by machine learning, and more specifically by natural language processing. Siri, Alexa, and Cortana are all built to recognize speech and give an accurate response, but how does that happen? The machine listens for speech, usually activated by a trigger phrase (“Hey Siri”), and then, using voice recognition technology, converts what it hears into a series of numbers that it uses to interpret what the person said and to prepare a response accordingly.
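The pipeline described above — wake-word check, turning the utterance into a series of numbers, then matching a response — can be sketched in miniature. Everything here is hypothetical toy data: real assistants use neural acoustic and language models, not a hand-built vocabulary table.

```python
# Toy sketch of a voice-assistant pipeline: wake-word check, encoding the
# utterance as numbers, then picking the best-matching intent. The wake word,
# vocabulary, and intent vectors are all made-up illustrations.

WAKE_WORD = "hey siri"

# The "series of numbers": a bag-of-words vector over a tiny vocabulary.
VOCAB = ["weather", "today", "timer", "set", "minutes", "play", "music"]

INTENTS = {
    "get_weather": [1, 1, 0, 0, 0, 0, 0],
    "set_timer":   [0, 0, 1, 1, 1, 0, 0],
    "play_music":  [0, 0, 0, 0, 0, 1, 1],
}

def vectorize(text):
    """Encode an utterance as word counts over the toy vocabulary."""
    words = text.lower().split()
    return [words.count(w) for w in VOCAB]

def respond(utterance):
    """Return the best-matching intent, or None if the wake word is absent."""
    if not utterance.lower().startswith(WAKE_WORD):
        return None
    vec = vectorize(utterance[len(WAKE_WORD):])
    # Pick the intent whose vector overlaps most with the utterance's.
    def overlap(name):
        return sum(a * b for a, b in zip(vec, INTENTS[name]))
    return max(INTENTS, key=overlap)

print(respond("Hey Siri what is the weather today"))  # prints "get_weather"
```

The point of the sketch is the shape of the pipeline, not the matching rule: in a production assistant, both the vectorization and the intent scoring are learned from data rather than written by hand.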
But while those voice assistants are the most obvious instances of machine learning in our phones, they aren’t the only instances; modern smartphones are actually some of the most machine-learning-rich devices in the world. For example, one of Apple’s biggest advertising campaigns centers on improvements to the iPhone camera, such as blurring out the background of an image, digitally enhancing the picture, and even extending beyond what was actually captured, with the phone filling in the boundaries. Using image processing, it can detect where the person or main objects of the picture are and single out their defining characteristics. Even the face unlock feature of some newer phones is driven by AI, although facial recognition is hardly limited to cell phones; everyone from Facebook to the government has found a use for such applied science.
The most common and most popular of all machine learning programs is one that is probably used at least once a day by the readers themselves. The Google search engine improves its results every day by analyzing past searches and deciphering what a person was actually looking for. Only those who work for Google designing the search function know exactly how it works, but it was no doubt one of the first mainstream machine learning programs out there. Its training data is every Google search ever made! The amount of data that system has seen and processed is nearly unfathomable, making it one of the most well-refined functions in existence as well.
In a similar vein to Google, which suggests web pages based on keywords and past search histories, recommendation engines can view what a customer has purchased, viewed, enjoyed, etc. across a number of variables and categories and suggest related possibilities that others have also purchased, viewed, or enjoyed. The most familiar recommendation engines are used by retailers such as Neiman Marcus and Walmart, streaming services like Netflix and Disney+, and food services like Uber Eats and Postmates. But the list of industries that do or can benefit from these engines is long, and as the engines improve, industries become more and more reliant on them to keep customers around for as long as possible.
Even though recommendation engines were in use before machine learning became common, they became far more effective once switched from legacy rules-based systems to machine learning platforms. The difference lies in the machine learning system’s ability to make connections that aren’t intuitively obvious, while also using past purchases to guess at something less predictable that a specific customer may want. In essence, it not only predicts what users will need next; it tries to determine what users don’t even yet know they want. While the numbers aren’t exact, Amazon believes 35% of its sales have come thanks to its recommendations, and Netflix reports that almost 75% of the content watched on its streaming service can be attributed to the same.
One of the primary concerns that arises when discussing the rise of AI and machine learning is the possibility of machines taking over jobs that have been done by humans for years, thus shrinking the job pool and increasing unemployment, primarily among low-skilled workers. It’s still too early to tell whether those predictions will come true, but for some companies they already have. Factories have been almost entirely machine-run for decades now, and retail companies like Best Buy have experimented with robots replacing their in-store representatives, using smart machines to interact with customers and manage products.
But there are also many cases in which AI is used to augment existing employees, even in industries most often expected to use AI to replace the human labor force. Take, for example, Good Times Burgers & Frozen Custard, a fast-food chain based in Colorado, which created an AI platform named Holly that used machine learning to improve the customer experience and personalize customer interaction. In just a year of use, Good Times’ average customer wait time decreased by 20%, its average upsell rate increased by 40%, and spend per order increased by 6%.
According to a study by Microsoft, two out of three business leaders and employees believe that AI is built to augment jobs rather than replace them, and just 10% of that same pool believe that AI will significantly harm the human workforce. Holly is a perfect example of an enterprise using AI and machine learning to complement its employees rather than replace them. She is capable of taking orders quickly, accurately, and cheerfully, unfazed by long lulls between orders or a long string of consecutive orders. Holly removes some of the repetitiveness from the fast-food employee’s job, leaving them free to speak with the customer at the counter or window, focus on more specific customer service, and deliver better food more quickly than before. In this manner, humans and machines work together, specializing in what each does best, to build a more efficient working environment and business model.