
Machine Learning

  • Improvements in Capabilities Make AI Viable for the Enterprise

    Artificial Intelligence has long been viewed as the solution to all of the problems we will encounter a decade from now. Ever since the term was coined in the 1950s, it has been peering over our shoulder, seemingly within reach but simultaneously just beyond our grasp. Bold predictions about its power and proximity go unfulfilled, and we resign our hopes for AI to the future while maintaining the status quo in the present.

    Many of the difficulties in creating more advanced AI lie within a theory known as Polanyi’s Paradox: many of the things humans know, whether how to play a hand of poker or how to recognize a face, cannot be easily explained to another human or to a machine. They are tacit knowledge, acquired through some form of human experience. This is a natural advantage humans have long held over artificial intelligence, which has traditionally depended on explicit instruction rather than learning on its own.

    But the time may have finally come. Advances in AI, coupled with major steps in machine learning, have produced huge improvements in the ability of machines to accomplish goals previously left to the human mind. Tasks relating to perception and cognition have seen serious progress in just a few years; the error rate of speech recognition has dropped by nearly fifty percent since 2016, and companies like Google and PayPal are using machines to independently enhance and protect their systems.

    But how can enterprises more reliably and effectively take advantage of AI? Most of these businesses have remained profitable by controlling costs and exploiting economies of scale, which offers stability and predictability in costs and profit, but very little when it comes to creating growth or sensing opportunities and threats rising and falling within the market. They operate on a very large, very slow scale and are built on a foundation of huge wells of data, each one limited to one aspect of their internal operation.

    If enterprises, instead of relying on these large warehouses of data, can focus on specific use cases within industries, they will find an increase in demand, a lift in revenue, and an improvement in sales. Specific, valuable use cases are what drive data monetization, especially when they span multiple horizontally oriented functions.

    Another great advance in the development of AI is the shortening of the implementation cycle. Previous systems took years to develop and were borderline outdated by the time they were fully operational and tuned. Now, within three months, data can be brought together to create initial automated actions and construct a client-specific view of the market. And thanks to a continuous stream of data, the system is always updating and improving, fine-tuning and validating its recommendations in the market. Every piece of data helps the AI devise a more complete view of the market, and unlike old systems, there is no level-off at a certain point; the more data you have, the better the computer’s predictions will be.
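    As an illustration of that continuous refinement, here is a minimal sketch, assuming scikit-learn, in which each new batch of market data incrementally updates a model instead of rebuilding it from scratch; the feature names and the synthetic data stream are hypothetical stand-ins for a live pipeline.

```python
# A minimal sketch of continuous model refinement, assuming scikit-learn.
import numpy as np
from sklearn.linear_model import SGDRegressor

model = SGDRegressor(random_state=0)

def market_stream(n_batches=5, batch_size=100):
    """Stand-in for a live feed; a real system would read from a data pipeline."""
    rng = np.random.default_rng(0)
    for _ in range(n_batches):
        X = rng.normal(size=(batch_size, 4))  # e.g., price, promo, season, region
        y = X @ np.array([1.5, -2.0, 0.5, 3.0]) + rng.normal(scale=0.1, size=batch_size)
        yield X, y

# Each new batch refines the model rather than triggering a full rebuild.
for X_batch, y_batch in market_stream():
    model.partial_fit(X_batch, y_batch)

print(model.predict(np.zeros((1, 4))))  # current prediction for a baseline scenario
```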

    Enterprise AI has also been used to drive retail. Matching sales, production, and resources with consumer demand can be achieved through a cross-channel analysis of the market. Data tells you who your customers are and how to market to each of them individually, then helps you prioritize outlets and discover demand so you can efficiently meet consumers’ needs.

    Artificial intelligence has been used by trucking companies to optimize their routes and even to offer advice to drivers while in the cab, such as whether to speed up or slow down. By reducing delivery times and optimizing fuel use with sensors that monitor driver behavior and vehicle performance, one European trucking company has cut fuel costs by over 15%. Similarly, airlines have used AI to predict problems such as airport congestion and bad weather, helping them avoid cancellations that can prove extremely costly.

    Customer service and marketing have also benefitted greatly from the growing versatility of AI, thanks to improvements in both pattern recognition and voice recognition. Companies like Amazon and Netflix that use “Next Product” recommendations to target individual customers can see large sales increases by using machines to determine which products a consumer is more likely to buy based on purchasing history. And any company with an automated customer service system will benefit not only from better voice recognition, but from improving voice analysis that allows automated answering systems to recognize when a customer is becoming upset and automatically transfer them to a human representative.
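    To make the idea concrete, here is a minimal sketch of a “next product” recommender built on simple item co-occurrence counts; the baskets are hypothetical, and production recommenders at companies like Amazon and Netflix are far more sophisticated.

```python
# A minimal sketch of "next product" recommendation from purchase history,
# using item co-occurrence counts; baskets below are hypothetical.
from collections import Counter, defaultdict

purchase_histories = [
    ["laptop", "mouse", "dock"],
    ["laptop", "mouse"],
    ["phone", "case", "charger"],
    ["laptop", "dock", "monitor"],
]

co_counts = defaultdict(Counter)
for basket in purchase_histories:
    for item in basket:
        for other in basket:
            if other != item:
                co_counts[item][other] += 1

def recommend(item, k=3):
    """Items most often bought alongside the given item."""
    return [other for other, _ in co_counts[item].most_common(k)]

print(recommend("laptop"))  # e.g., ['mouse', 'dock', 'monitor']
```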

    Deep neural networks, the technology behind much of modern machine learning and advanced AI, have been shown to improve on the performance of other analytic techniques in 69% of cases. Use cases show that modern deep learning can boost value over traditional AI by an average of 62%, with industries like retail and transport seeing a boost of almost 90%, and travel enjoying a 128% benefit over traditional approaches. Aggregated, AI has the potential to create up to $5.8 trillion in value, which would be 40% of the overall potential impact of all analytics techniques.

    Retail stands to gain the most from advanced AI in marketing and sales areas such as pricing and promotion, where it could gain $500 billion a year. Combining that gain with advances in task automation and in inventory and parts optimization, the retail industry as a whole could see up to a trillion dollars in value from new AI techniques.

    Consumer packaged goods companies can see the greatest benefit in supply-chain management and manufacturing, where predictive maintenance and inventory optimization could yield $300 billion a year. There are dozens of areas in which goods enterprises can improve themselves with AI, among them yield optimization, sales and demand forecasting, budget allocation, and analytics-driven hiring and retention.

    Banking can see additional profits by applying enterprise AI to its marketing, sales, and risk functions; those three alone could be worth another $300 billion to the industry, with advanced analytics handling, as we’ve discussed, customer service, risk assessment, pricing, and customer acquisition.

    AI has also driven the discovery of farmer/customer archetypes. Every farmer has a “digital fingerprint,” created from characteristics such as crop yield, geography, and operational performance, among many others. Products are then rated for customers to reveal taste and preference down to specific product attributes, which in turn boosts products that share many attributes with the most popular products. As attributes rise and fall in popularity, they give farmers insight into consumer trends and give customers thematic farmer archetypes. The creation of these archetypes helps the underlying enterprise AI uncover the true drivers of demand for products, find lookalike customers, focus growth targeting, define targeted customer concepts, and target promotions to stimulate cross-sell and upsell.
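    As a rough illustration of how such archetypes might be derived, here is a minimal sketch, assuming scikit-learn, that clusters hypothetical “digital fingerprints” into archetypes; the features and values are invented for the example.

```python
# A minimal sketch of deriving farmer archetypes from "digital fingerprints,"
# assuming scikit-learn; feature names and values here are hypothetical.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Columns: crop yield, farm acreage, latitude (geography), operational score
fingerprints = np.array([
    [180.0,  500, 41.2, 0.82],
    [175.0,  620, 40.9, 0.79],
    [ 95.0, 1500, 33.5, 0.64],
    [100.0, 1400, 34.1, 0.61],
    [140.0,  300, 44.8, 0.90],
])

X = StandardScaler().fit_transform(fingerprints)   # put features on one scale
archetypes = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

print(archetypes.labels_)  # archetype assignment per farmer
# Lookalike customers are simply farmers assigned to the same archetype.
```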

    June 11th, 2019

    © 2019 RM Dayton Analytics, Ltd. Co.

      

  •  

    Cross-Enterprise AI 101 for Analytics and EPM People


    Where does Cross-Enterprise Artificial Intelligence (AI) take the baton from Enterprise Performance Management (EPM) and other business analytics technologies?  

    In this e-learning session, we’ll briefly review the scope and breadth of traditional EPM and Business Analytics towards the objective of becoming an intelligent agile enterprise.

    We’ll talk through examples of advanced analytics applications typically beyond the scope of classic EPM, where massive integration of different data silos is required and different skill sets, such as those of a data scientist, are often deployed.

    You’ll then learn about the key components of a cross-enterprise AI platform, as well as several use cases well suited to cross-enterprise AI. A prime objective of Cross-Enterprise Artificial Intelligence is to continuously sense what people want, and then deliver continuously adapting value.

     

    Register Here!

     

     

  •  

    Cross-Enterprise AI vs. Intelligent Automation: What's the Difference?


    Where do Intelligent Automation (IA) and Cross-Enterprise AI fill the gaps in enterprise software technologies such as Enterprise Resource Planning (ERP), Enterprise Performance Management (EPM), Business Intelligence (BI) and Business Analytics? Where do they help people spend more time on the creative and less on the mundane?

    You’ll learn about the key components of Intelligent Automation (IA), and how Robotic Process Automation (RPA) can free up more time for creativity while realizing more value from your existing technology investments.

     

    We’ll also review the key components of a Cross-Enterprise AI platform, highlighting the differences between Intelligent Automation and Cross-Enterprise AI.

    Learn how Cross-Enterprise AI and Intelligent Automation unleash business value in new ways, transforming core processes and business models in various industries.

     

    Register Here!

     

     

  •  

    Ignite Business Analytics & EPM with IA


    How can Intelligent Automation (IA) supercharge Enterprise Performance Management (EPM), Business Intelligence (BI) and Business Analytics technologies via Robotic Process Automation (RPA)? 

    In this e-learning session, we’ll briefly review the scope and breadth of traditional EPM and Business Analytics towards the objective of becoming an intelligent agile enterprise.

     

    We’ll talk through examples of where massive integration and data silos typically exist in BI, EPM and Business Analytics solutions.

    You’ll then learn about the key components of RPA, as well as several use cases well suited to Intelligent Automation.

     

    Register Here!

     

     

  •  

    What Goes Around Comes Around

          Not long ago, a legacy system was enterprise software running on a mainframe.  Wait, I thought those supercharged Teradata screaming machines were purchased just a few years ago?  Did someone say legacy Teradata?  Grab the gold coins, son, we’re going off the grid.  It’s a stirring revolution: some large industry leaders are taking their in-house MPP databases and shoving them, opting instead for IaaS and PaaS.  Part of what’s driving this movement is the desire to let someone else worry about infrastructure and platforms.  These initiatives are not little toy sandbox data marts, but large-scale behemoths being deployed by the largest retail, consumer goods and high tech companies in the world.  Sound familiar?  No, it’s not déjà vu.  This really is a lot like time-sharing on mainframes almost a half century ago.

     The Cloud People

          Whether on premise or off, you need to align the right set of skills when deploying big data in the cloud.   So what kind of skills should you bring on board?  Perhaps you’ll need experience hosting and performance-tuning Splunk applications on Azure to analyze product (machine) operations data for troubleshooting or analysis.  Or maybe you’ll need knowledge of tools such as SOASTA for load testing and log analysis as teams spin up thousands of servers across your cloud platform.  Going off premise may raise your bar on efficiency and effectiveness, but it will also require some team retooling.  SaaS experience should entail big queries on cloud services platforms.  You’ll need to import data into cloud storage and thereafter ingest into it from real-time sources.  IaaS and PaaS experience should involve heavy lifting with virtual machines, virtual networks and tools on cloud platforms.  Though many big data tools are open source, not all are created equal across cloud platforms.  Many skills are transferable, so don’t decide solely on your architects’ cloud platform religions as to whether you’ll be standing up Virtual Private Clouds (AWS) or Virtual Networks (Azure).

     It’s Real Time

         In ongoing cloud operations and performance, there is a need to process significant amounts of data from web servers, application servers, databases and external sources such as Akamai and SiteCatalyst.  Additionally, data may need to be correlated with applications such as Apigee or an Order Management System.  There is a need for real-time and near real-time visibility into operations.  Basket analysis, for example, may use Apache Hadoop, with the data ingested via Flume, after which custom programs do the analysis and prepare the results for reporting.  Tools such as Splunk may be used to collect logs from hundreds of servers and correlate them with data from the applications.
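         For a concrete feel, here is a minimal sketch of the pair-counting logic at the heart of basket analysis; in the pipeline described above, this kind of computation would run over Hadoop-scale data ingested via Flume, but the idea is the same. The baskets are hypothetical.

```python
# A minimal sketch of the pair-counting at the heart of basket analysis;
# at scale this logic runs over Hadoop, but the computation is the same.
from collections import Counter
from itertools import combinations

baskets = [
    {"milk", "bread", "eggs"},
    {"milk", "bread"},
    {"beer", "chips"},
    {"milk", "eggs", "butter"},
]

pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# Most frequently co-purchased pairs, ready for a reporting layer
for pair, count in pair_counts.most_common(3):
    print(pair, count)
```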

         Moving data in and out of cloud storage may end up being a blend of batch and real-time using cloud features such as Elasticsearch in an AWS cloud to search data lakes of mobile, social, consumer and business applications data.  For Google Cloud, you’ll want experience with Google Pub/Sub and Dataflow for real-time data integration into BigQuery. 
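         As one possible shape of that real-time path, here is a minimal sketch, assuming the google-cloud-pubsub and google-cloud-bigquery client libraries, that pulls messages from Pub/Sub and streams them into BigQuery; the project, subscription, and table names are placeholders, and a managed Dataflow pipeline would be the heavier-duty route mentioned above.

```python
# A minimal sketch of streaming Pub/Sub messages into BigQuery, assuming the
# google-cloud-pubsub and google-cloud-bigquery client libraries; project,
# subscription, and table names below are hypothetical placeholders.
import json
from google.cloud import bigquery, pubsub_v1

bq = bigquery.Client()
TABLE_ID = "my-project.analytics.events"  # hypothetical target table

subscriber = pubsub_v1.SubscriberClient()
subscription = subscriber.subscription_path("my-project", "events-sub")

def handle(message):
    row = json.loads(message.data.decode("utf-8"))  # one event per message
    errors = bq.insert_rows_json(TABLE_ID, [row])   # streaming insert
    if not errors:
        message.ack()                               # only ack on success

# Blocks and processes messages until cancelled.
streaming_pull = subscriber.subscribe(subscription, callback=handle)
streaming_pull.result()
```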

     Nothing but Cloud

         You wouldn’t do it, of course, just because the cool CIO getting all of the attention at the conference said she did, but between us, if you’re thinking about going off-premise, you’re not alone.  More than one top Fortune 500 company is reducing its in-house Teradata footprint and moving to cloud platforms.  Note:  Teradata now offers Database as a Service (DBaaS) in its own cloud, as well as on public cloud platforms.

         Your plans may be to go all-in on a public cloud platform.  You might end up, however, with some on premise deployment, even if that’s not the way it started out on the drawing board.  So if you end up with an Elasticsearch cluster running on Azure with an on premise connection to MongoDB inside the firewall, you’re not a loser.  And you can still tell your buddies at the bocce league that you’ve gone cloud.

         Make sure to get architects with experience in designing data flows into Hadoop and on a large-scale cloud platform.  Develop best practices for the day-to-day loads, which may involve MongoDB or Cassandra Collections.  Be prepared to have your team develop use case prototypes for demonstration to upper management.   Data lake technology is still in its early stages, and the skills to extract meaningful business value are in rather short supply.  Deployment guides don’t exist yet.  Aligning the right skills and governance is key to avoid projects getting bogged down.
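         As one small example of a day-to-day load, here is a minimal sketch, assuming the pymongo driver, that batch-loads a daily extract into a MongoDB collection; the connection string, database, collection, and file names are placeholders.

```python
# A minimal sketch of a day-to-day batch load into a MongoDB collection,
# assuming the pymongo driver; connection string and names are placeholders.
import csv
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # on premise or cloud URI
collection = client["lake"]["daily_sales"]

def load_daily_extract(path):
    """Read the day's extract and batch-insert it into the collection."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    if rows:
        collection.insert_many(rows)  # one batch write per daily file
    return len(rows)

print(load_daily_extract("sales_2016_07_09.csv"))
```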

    Master the Data Lake

         It’s intriguing that the new term data lake is named after a body of water rather than a place where we store stuff, a la the data warehouse.  That lets us use other water terms when the data gets all mucked up or otherwise permeated, for better or worse, calling them things like data bogs, swamps, creeks and oceans.  Sticking to the metaphor of the lucent body of data, a data lake is similar to the persisted storage area of a data warehouse.  The idea is to store massive amounts of data so that it is readily available.

         Back in the data warehouse glory days, you only wanted to have the really important data immediately available, because you simply could not handle making all of the data persistent.  It ended up though not being such a bad thing, because adding some governance to the process led to some discipline up front about what was more important and what was less important.  True, technology limited storing ‘everything’ from a cost and capabilities perspective.  The collective technology, which we might just call tools, really did not allow us to ‘have it all.’

         Alas, technology has advanced, and it is now cost effective to store gargantuan amounts of data – yes, unstructured too – in data lakes.  But is there a downside?  Time will tell.  The good news is, we get to store a bunch of data we don’t have a clue about.  So maybe, over time, we can make some sense out of it.  It’s sort of like data cryogenics, in that we want to keep the data alive until we find the cure to fix whatever ailments are being caused by something we know so little about.

     Decoder for Big Data & Cloud Terms Referenced

    SaaS – Software as a Service – function-specific on-demand applications. Example: SalesForce.com

    PaaS – Platform as a Service – middleware, virtualized databases, development platforms, operating systems. Examples: Azure and AWS

    IaaS – Infrastructure as a Service – memory, disk, network, virtual machines. Examples: Amazon EC2, Rackspace, IBM SoftLayer

    API Management Platform – Apigee

    Big Data Platforms – Hadoop, Hortonworks

    Cloud Services Platforms – Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform

    Content Delivery Network (CDN) – Akamai

    Data Ingestion Tool – Flume

    Massively Parallel Processing (MPP) Database – Teradata

    Monitoring and Analytics Platform – Splunk

    NoSQL Databases – Cassandra, MongoDB

    Performance Analytics – SOASTA

    Search Analytics – Elasticsearch, Google BigQuery

    Web Analytics – SiteCatalyst

    © 2016 RM Dayton Analytics, Ltd. Co.

     This article was originally published 6/27/16 as a CIO Review White Paper entitled ‘Meshing Big Data and The Cloud.’  

    July 9th, 2016

    Download White Paper

     

     
  • As the business world has grown in the 21st century, artificial intelligence (AI) has grown with it, to the point that it is no longer a rare occurrence or unique application. Companies are using AI in several different ways, among them improving customer relations, creating more efficient processes, and making smarter business decisions. Cross-enterprise AI has changed the rules regarding the way executives can tailor product, marketing and sales strategies. There are several different technologies underneath the broader umbrella of artificial intelligence: business analytics, data science, data engineering and machine learning.

    Business analytics is a wide, encompassing term that simply refers to the various ways in which we use data to derive meaningful patterns and draw conclusions. We use this data to tell us why things happened in the past and, hopefully, what will happen in the future. It’s a process that has been used for ages as businesses attempt to gain an upper hand, and it can be as simple or as complicated as you like. But the field of business analytics is constantly changing, evolving as better techniques become available and improvements are made every day.

    Data science is what allows business analytics to continue to improve. As the name implies, it is a scientific process focused on the study of data. Data scientists create new analytic processes in search of the best ways to obtain insights into the markets they serve. Companies that wish to stay ahead of the curve in artificial intelligence should ensure they have several data scientists on staff to push the boundaries of their analytic capabilities further than their competitors.

    Data engineering is what allows business analytics and data science to be as productive as they are. It makes data usable for analysts and scientists to work with. Data engineers convert masses of data that have been collected in systems and data silos into groups of useful data, which can then be fed into applications and algorithms to find the meaning you want. As more and more data become available – and the amount of data being extracted across the world grows every day – data engineering becomes more important if we want to sift through the excess and get to the data points that mean the most to us.
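    To show what that conversion looks like in practice, here is a minimal sketch, assuming pandas, that turns two hypothetical silo extracts into one analysis-ready table; every file and column name is a stand-in for a real source system.

```python
# A minimal sketch of data engineering's job: turning raw, silo-shaped data
# into an analysis-ready table. Assumes pandas; names are hypothetical.
import pandas as pd

orders = pd.read_csv("orders_silo.csv")   # e.g., from the order system
customers = pd.read_csv("crm_silo.csv")   # e.g., from the CRM

# Standardize keys and types so the silos can be joined at all.
orders["customer_id"] = orders["customer_id"].astype(str).str.strip()
customers["customer_id"] = customers["customer_id"].astype(str).str.strip()
orders["order_date"] = pd.to_datetime(orders["order_date"], errors="coerce")

# One tidy table the analytics and data science teams can actually use.
usable = (
    orders.dropna(subset=["order_date"])
          .merge(customers, on="customer_id", how="left")
)
usable.to_parquet("orders_enriched.parquet")
```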

    Machine learning is a field that runs through these other types of artificial intelligence, in which computers don’t just use data to spit out results but also learn from that data and use it to improve their own systems. The machine is given a set of rules, fed large quantities of data, and essentially set free. This version of AI does not require the structured data that analytics or data science does, and thus can find results in datasets that might otherwise be unusable.

    So, with this knowledge in hand, what might be the best ways to implement AI into your own enterprise? Almost every organization knows AI can help them, but they aren’t sure how. Determining the right kinds of analytics your company needs is essential. What kind of problems do you need to solve? What needs to improve? Is the proper organizational structure in place to take fullest advantage of this technology? Knowing why and how you want to utilize cross-enterprise AI is a vital first step.

    You next must ask how you plan to implement cross-enterprise AI, which generally offers two choices: buying an application that is already prepared or building your own to suit your needs. Both paths offer pros and cons, which is why deciding what is right for your company is even more important.

    Building your own AI can also make sense if your enterprise requires a more tailored solution not typical of cross-enterprise AI. Whether you’re in a niche industry or have a unique business process, cloud vendors may not offer the application you need. Luckily, investing in data scientists can help you develop your own AI applications, and many cloud platforms, even if they can’t provide a ready-made solution, do have tools that assist with some of the most daunting obstacles in creating your own artificial intelligence.

    Buying already-built cross-enterprise artificial intelligence, however, provides a much easier entry into the world of AI. Costs are lower and implementation time is shorter; there is no need for development platforms or for building your own integrations. Because it’s much easier to install, you will also see benefits from your investment sooner. This path is best for a company with a well-defined need that is common in the business world, and that wants to apply AI in common use cases such as Finance, Marketing, Operations and Sales, where AI systems for these use cases almost certainly already exist. For uses like these, which you had to build yourself five to ten years ago, building your own today would be akin to building your own ERP system in the mid-’90s. You generally get about 80% out of the box with a buy solution, with the remaining 20% being customization to your business and specific use cases.

    July 25th, 2019

    © 2019 RM Dayton Analytics, Ltd. Co.

      

  •  

    Cross-Enterprise AI vs. Intelligent Automation: 

    What's the Difference?

     

    As artificial intelligence (AI) has increased in efficiency and spread across the country and across the world, more and more managers have turned towards AI solutions to improve their processes and become more agile. The technology has drastically transformed the workplace, making it a much more intelligent, accurate, and productive space than it was even ten years ago.  

    Two of the most prominent systems used by companies today are Robotic Process Automation (RPA) and Cross-Enterprise AI. Organizations seeking to employ these systems often know that they can greatly improve internal operations, but not to what extent, and in many cases they aren’t sure how to deploy them in a value-maximizing way without extensive implementation consulting. RPA and Cross-Enterprise AI are extremely powerful, but their actual purpose must be understood, and proper expectations must be set, in order to fully harness them.

    Strictly speaking, RPA is not a form of AI, as it represents more automation and performance of prescribed tasks than the use of data to create new information. RPA is designed to perform the same actions a human would through user interface and descriptor technologies. RPA operates in place of a human via a “bot” that uses various pathways and triggers. It does not create new data or enhance existing business models, but rather accelerates the timeline upon which those models operate.
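    To make the distinction concrete, here is a minimal sketch of the RPA pattern: a “bot” that polls for a trigger and replays a fixed pathway of steps, creating no new data of its own. The trigger, steps, and invoice IDs are hypothetical stand-ins; real RPA suites drive actual user interfaces rather than print statements.

```python
# A minimal sketch of the RPA idea: a "bot" that watches for triggers and
# replays prescribed steps. Trigger and actions here are hypothetical.
import time

def invoice_arrived():
    """Trigger: stand-in for 'new file in folder' or 'new email' checks."""
    return True  # pretend a new invoice just landed

def prescribed_steps(invoice_id):
    # The same fixed pathway a human would click through, only faster.
    print(f"open ERP -> enter {invoice_id} -> attach PDF -> submit for approval")

def bot(poll_seconds=5, max_cycles=3):
    for cycle in range(max_cycles):
        if invoice_arrived():
            prescribed_steps(invoice_id=f"INV-{cycle:04d}")
        time.sleep(poll_seconds)

bot(poll_seconds=0, max_cycles=2)  # run two polling cycles for illustration
```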

    Download the Executive Briefing White Paper to learn about the key differences between Cross-Enterprise AI and RPA.

     

    I'd Like to Read this White Paper

     

     

  •  

    Enterprise AI Digital Transformation for the 2020's 

     

     

    As business moves through the 2020's, enterprises continually look for new ways to one-up the competition.

    Artificial Intelligence (AI) is now well-established as a transformation technology across various sectors of industry, from retail and manufacturing to transport, as well as in government and in scientific research. This executive briefing examines the factors that influence adoption of AI in industry and the opportunities and risks that such adoption will entail in the build versus buy decision. 

    The digital transformation impact of AI comes from both its effect on intelligent decision making and predictions as well as from its facilitation of greater automation. While increased automation has been a key component of technological progress since the industrial revolution, AI and Machine Learning (ML) techniques promise far greater automation than before. Automation can bring about lower costs and faster turnaround time on projects as well as free up human time for tasks that are less amenable to automation. The effort exerted towards automation can also identify bottlenecks in a project which cause friction and reduce efficiency.

    Download this RM Dayton Analytics Executive Briefing to get perspectives on enterprise-AI digital transformation for the 2020's.

     

    I'd Like to Read this Executive Briefing

     

  •  

    Getting on the Right Path to Intelligent Automation:

    An Approach to Robotic Processing Automation (RPA) 

     

     The first phase in the journey is planning, in which you determine how RPA can help you and whether your company is prepared to bring in this kind of software. You must develop your strategy, identify the benefits RPA could bring, and select a partner to help incorporate it. Key components of strategy should include building awareness of automation technology within the company, obtaining sources of funding, and creating your initial approach to implementation.

    You then have to confirm that RPA is indeed applicable and begin your pilots, selecting initial use cases, observing preliminary results, and measuring whatever benefits you can see. You must set up the basic infrastructure, begin building skills in RPA technology, and re-train your workers with new, more advanced skills that they will be able to use thanks to the freedom provided by robotic automation.

    The first major challenge can be the lack of a roadmap or well-developed strategy. As mentioned earlier, a robust RPA strategy is necessary for determining potential outcomes, finding potential fits for RPA tools within the company, and executing the choices needed to maximize the use of automation. A change management strategy is also needed for dividing roles, assuring company-wide buy-in, and addressing pushback that may come from skepticism of new technology.

    Download the Executive Briefing White Paper to learn about getting on the right path to effective and efficient Intelligent Automation.

     

    I'd Like to Read this White Paper

     

     
