
Data Science

  • “A long, long time ago, I can still remember, how the music used to make me smile. And I knew if I had my chance, I could make those people dance, and maybe they’d be happy for a while.” Don McLean never did reveal what “American Pie” meant, though others have provided interpretations. Thank you, Don. Perhaps you thought that meanings could be projected based upon individual experiences.

    In early ‘94, I happened to stumble upon OLAP as it was morphing from an elitist mainframe technology into one that could be put into the hands of us common folk. I’d never heard of Arbor or Essbase when I answered that ad in the Dallas Morning News, but I saw it was client-server and wanted some of that. I found out later on that I was going to be the first dedicated Essbase consultant hired by Comshare – and that Essbase was not the same thing as Sybase. Take a job on something no one’s ever heard of? Sign me up!

    One of the joys of being the first guy in is that you get to be a jack of all trades. A little training here, an implementation over there, and oh – by the way – we need some help selling new deals.

    Implementations were their own kind of fun. The early adopters generally were large companies that would at times create labs to explore this new OLAP toy. One of my first clients, Shell Oil in Houston, actually paid me to benchmark Essbase calculation performance on Windows NT vs. OS/2. Paid to play. How cool was that?

    But things were about to get a whole lot better. As a techie who avoided the dark side of sales, what lay ahead was beyond the reaches of my limited imagination. We didn’t know it, but we were about to become showmen.

    The glory was astounding.  As we were evangelizing OLAP, we developed quite the shtick.  We’d always get our prospect companies in Texas to give us a small extract out of their GL (in that mostly pre-ERP era).

    When I received a 3.5” diskette with their data extract in my hand, it was sort of like receiving a tech toy from Amazon today, but exponentially better. You just knew. What was about to come could not be stopped – an uncontrollable destiny with only one outcome – and you could hardly contain yourself.

    So, you’re thinking – data on a diskette – how much could it be?  There was a secret to the shows that we put on.  We demoed everything live.  So all of the data needed to run on circa ’95 laptops running both the client and server versions of Essbase on Windows NT.  Everything crashed all the time.  Nobody seemed to care much... 

    When show time came, the choreography included setting up the audience with a teaser, showing them their data on the ‘back-end’ in a graphical format. That was the perfect opening act, as hardly anyone had seen their business rendered in such a manner before. Then we’d hit them with the punch line – the show’s climactic moment, that magical ‘slice and dice’ in Excel. The ‘oohs’ and ‘aahs’ filled the air, and unconstrained joy would permeate the room. Conquering heroes, we were, as we pivoted the data from a row to a column, then drilled down to resounding applause. My buddy David and I would be jumping in front of each other, quite literally, to deliver the best lines. We may have been rehearsed, but we weren’t scripted.

    Audiences would clap, then just gush with unbridled excitement, as if every prior data frustration they had ever experienced had been somehow mystically lifted from their shoulders. Some of the best testimonials that I’ve ever heard came from some of those early OLAP audiences – guttural praise, straight from the heart. One of my all-time favorites was from a manager in the Lubes Division of Shell Oil in Houston, who declared in front of us showmen and a dozen or so colleagues, “This gives us answers to questions that we never even knew how to ask.”

    We got the deal. It worked every time. Even when our laptops would freeze up during the demo, and we’d have to re-start, we’d get the deal. We always got the deal. The early adopters willed OLAP into prime time.

    The people smiled. We made the people dance. They were so very happy – for a while. Then the novelty of that magical, pivoting two-step eventually ran its course, of course, as that is what happens to all new game-changing technologies. The world went on and grew up in a hurry. Oracle has Essbase these days, and their sales model is, well, um … different.

    In our showman days, Decision Support Systems (DSS) was the collective term used to encapsulate all of what we provided, as the then-current progression from Executive Information Systems (EIS). Since then, we’ve added Business Intelligence (BI), which back in the day was staunchly defended as different from DSS. Then, with a little help here and there from Hyperion and Gartner, along came Business Performance Management (BPM), which was the same as Corporate Performance Management (CPM), which is the same as what we have now mostly settled on as Enterprise Performance Management (EPM).

    Add Data Analytics, Data Science, Big Data, Data Mining, Predictive Analytics and Data Visualization since then, just to name a few. We were telling everyone 22 years ago that coding was not necessary for OLAP – even the users in Finance could handle it. The semantic layer is still alive and well, but there’s also been a renaissance in coding. We’ve got coding schemes and data dreams, and IoT not so much on the QT. We’ve got Hadoop and R and Machine Learning and Python and Pig. We’ve got Spark on the Apache. We’ve got a zillion ways to zap every bit of data on the planet into sub-atomic data particles. And we’re just getting started parsing data into oblivion.

    I wonder sometimes if we’ve not come full circle?  We’re still mincing, slicing and dicing – but we’re just doing a great deal more of it – on a heck of a lot more data.  Would not all of the above ‘Big Data’ technologies, without too much of a stretch, settle themselves just dandy under the classification of Decision Support?   I’m just sayin’.

    Perhaps my slice of the pie is that I got to be a showman for a while. Or perhaps it’s just that I’ve had a bird’s eye view of seeing everything come full circle. Either way, it makes me smile.

    Robert Dayton

    February 8th, 2016 

  • The world's top brands have hired RM Dayton to align top analytic talent.

  • CIO Review selects RM Dayton Analytics as a '20 Most Promising Enterprise Performance Management Solution Provider 2016,' its top-tier list. The merit-based award was earned after CIO Review's editorial research panel of experts evaluated over 300 solution providers. RM Dayton Analytics is changing the way the world's leading organizations deploy their X's and O's, aligning the world's most-prized intellectual capital.  

    "We are honored and humbled to receive this prestigious award," said Robert Dayton, Managing Partner, "EPM is the heart and soul of data analytics. To receive this award, in this category, at this juncture in our journey, not only is validation of our knowledge worker alignment mission, but also is a recognition award for our customers, who are at the forefront in the strategic use of data analytics."  CIO Review EPM Special Edition

  • Improvements in Capabilities Make AI Viable for the Enterprise

    Artificial Intelligence has long been viewed as the solution to all of the problems we will encounter a decade from now. Ever since the term was coined in the 1950s, it has been peering over our shoulder, teasing us as seemingly within reach but simultaneously just beyond our grasp. Bold predictions about its power and proximity of use have gone unfulfilled, and we resign our hopes for AI to the future while maintaining the norm in the now.

    Many of the difficulties in creating more advanced AI lie within a theory known as Polanyi’s Paradox: many of the things that humans know, whether it be how to play a hand of poker or how to recognize a face, cannot be easily explained to either another human or a machine; they are instead tacit knowledge, acquired through some form of human experience. This is a natural advantage humans have long held over artificial intelligence, which cannot simply absorb such knowledge and instead relies upon explicit instruction.

    But the time may have finally come. AI advances, coupled with major strides in machine learning, have resulted in huge improvements in the capabilities of machines to accomplish goals that previously were left to the human mind. Tasks relating to perception and cognition have seen serious progress in just a few years; the error rate of speech recognition has dropped by nearly fifty percent since 2016, and companies like Google and PayPal are using machines to independently enhance and protect their systems.

    But how can enterprises more reliably and effectively take advantage of AI? Most of these businesses have remained profitable by controlling costs and exploiting economies of scale, which offers stability and predictability when it comes to costs and profit, but very little when it comes to creating growth or sensing opportunities and threats rising and falling within the market. They operate on a very large, very slow scale and are built on a foundation of huge wells of data, each one limited to one aspect of their internal operation.

    If enterprises, instead of relying on these large warehouses of data, can focus on specific use cases within their industries, they will find an increase in demand, a lift in revenue, and an improvement in sales. Specific, valuable use cases are what drive data monetization, especially when they span multiple, horizontally oriented functions.

    Another great advance in the development of AI is the decrease in the length of the implementation cycle. Previous systems took years to develop and were borderline outdated by the time they were fully operational and maximized. Now, within three months, data can be brought together to create initial automated actions and construct a client-specific view of the market. And thanks to the continuous stream of data, the system will always be updating and improving to fine-tune and validate its recommendations in the market. Every piece of data helps the AI devise a more complete view of the market, and unlike old systems, there is no level-off at a certain point; the more data you have, the better the computer’s predictions will be.

    Enterprise AI has also been used to drive retail. Matching sales, production, and resources with consumer demand can be achieved through a cross-channel analysis of the market using artificial intelligence. Data tells you who your customers are and how to market to each of them individually, and then how to prioritize outlets and discover demand so you can efficiently meet consumers’ needs.

    Artificial intelligence has been used by trucking companies to optimize their routes and even to offer advice to their drivers while in the cab, such as whether to speed up or slow down. Using sensors that monitor driver behavior and vehicle performance to reduce delivery time and optimize fuel use has saved one European trucking company over 15% in fuel costs. Similarly, airlines have used AI to predict problems such as airport congestion and bad weather, helping them avoid cancellations that can prove extremely costly.

    Customer service and marketing fields have also benefitted greatly from the growing versatility of AI, thanks to the improvement of both AI pattern recognition and voice recognition. Companies like Amazon and Netflix that use “Next Product” recommendations in order to target individual customers can see large sales increases by using machines to determine what products a consumer is more likely to buy based on their purchasing history. And any company with an automated customer service system will benefit not only from better voice recognition, but from improving voice analysis that allows automated answering systems to recognize when a customer is becoming upset and automatically transfer them to a human representative.

    Deep neural networks, the technology behind machine learning and advanced AI systems, have been shown to improve upon the performance of other analytic techniques in 69% of cases. Use cases show that modern deep learning AI has the ability to boost value above traditional AI by an average of 62%, with industries like retail and transport seeing a boost of almost 90%, and travel enjoying a fantastic 128% benefit with modern AI as opposed to traditional. When aggregated, AI has the potential to create up to $5.8 trillion in value, which would be 40% of the overall potential impact of all analytics techniques.

    Retail stands to gain the most from advanced AI in marketing and sales areas such as pricing and promotion, where it could gain $500 billion a year. Combining that gain with advances in task automation and in inventory and parts optimization, the retail industry as a whole could see up to a trillion dollars in value as a result of new AI techniques.

    Consumer packaged goods companies can see the greatest benefit in supply-chain management and manufacturing, where predictive maintenance and inventory optimization could result in a net of $300 billion a year. There are dozens of areas in which goods enterprises can improve themselves with AI, among them yield optimization, sales and demand forecasting, budget allocation, and analytics-driven hiring and retention.

    Banking can see additional profits by applying enterprise AI to its marketing, sales, and risk sectors. Those three alone could allow the banking industry to see another $300 billion, using advanced analytics to handle, as we’ve discussed, customer service, risk assessment, pricing, and customer acquisition.

    The discovery of farmer/customer archetypes has been driven by AI. Every farmer has a “digital fingerprint,” created from characteristics such as crop yield, geography, and operational performance, among many others. Products are then rated for customers to reveal taste and preference, specific to product attributes, which in turn boosts products that share many attributes with the most popular products. As these attributes become more or less popular, they provide the farmers with insight into consumer trends, and provide the customer with thematic farmer archetypes. The creation of these archetypes helps the underlying enterprise AI uncover the true drivers of demand for products, find lookalike customers, focus growth targeting, define targeted customer concepts, and target promotions to stimulate cross-sell and upsell.
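    As a rough illustration of the “lookalike” idea, below is a minimal sketch of attribute-based matching using cosine similarity over standardized fingerprints. The attribute schema and the numbers are invented for illustration; the article does not describe the actual archetype models.

```python
# Illustrative sketch only: attribute-based "digital fingerprint" matching.
# The farmer attributes (crop yield, region code, operational performance)
# and values below are hypothetical.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.metrics.pairwise import cosine_similarity

# Rows = farmers, columns = [crop_yield, region_code, operational_perf]
fingerprints = np.array([
    [180.0, 2, 0.91],
    [175.0, 2, 0.88],
    [ 90.0, 5, 0.60],
    [185.0, 2, 0.93],
])

# Standardize so no single attribute dominates the similarity measure
scaled = StandardScaler().fit_transform(fingerprints)

# Pairwise cosine similarity; high values suggest "lookalike" farmers
sim = cosine_similarity(scaled)

target = 0                                  # find lookalikes for farmer 0
lookalikes = np.argsort(-sim[target])[1:3]  # top 2 matches, excluding self
print("Closest lookalike farmers:", lookalikes)
```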

    June 11th, 2019

    © 2019 RM Dayton Analytics, Ltd. Co.

      

  • Conference - Collaborate 19 -- Technology Forum


    Collaborate will be held in San Antonio, Texas from April 7th-11th, 2019. Let us know if you'll be heading out to San Antone. Schedule 30 minutes on the calendar below, and let's talk shop about where business analytics is today, where it's heading, and how that might impact your company initiatives and career.

     

  • Conference - Predictive Analytics World


    The Predictive Analytics World Conference will be held in Las Vegas, NV from June 16th - June 20th, 2019. We look forward to seeing you there to network with industry practitioners in business analytics. Let us know if you'll be heading out. Find 30 minutes on the calendar below, and we'll talk shop on business analytics over the java that you drink in Vegas.


  • Conference - Strata Data -- Make Data Work


    The Strata Data Conference will be held in San Jose, CA from March 15th-18th, 2020.   Let us know if you'll be heading out to Silicon Valley.  Schedule 30 minutes on the calendar below. Thank you.


    Thank you!

  • Conference - The AI Summit


    The AI Summit will be held in New York, NY from December 11th-12th, 2019.  We look forward to seeing you there.   Find 30 minutes on the below calendar, and we'll sync up in The City.

     

    Thank you!

  • Cross-Enterprise AI 101 for Analytics and EPM People


    Where does Cross-Enterprise Artificial Intelligence (AI) take the baton from Enterprise Performance Management (EPM) and other business analytics technologies?  

    In this e-learning session, we’ll briefly review the scope and breadth of traditional EPM and Business Analytics towards the objective of becoming an intelligent agile enterprise.

    We’ll talk through examples of advanced analytics applications typically beyond the scope of classic EPM, where massive integration across different types of data silos is required, and where different skill sets, such as those of a data scientist, are often deployed.

    You’ll then learn about the key components of a cross-enterprise AI platform, as well as several use cases well suited to cross-enterprise AI. A prime objective of Cross-Enterprise Artificial Intelligence is to continuously sense what people want, and then deliver continuously adapting value.

     

    Register Here!


  • Cross-Enterprise AI vs. Intelligent Automation: What's the Difference?


    Where do Intelligent Automation (IA) and Cross-Enterprise AI fill the gaps in enterprise software technologies, such as Enterprise Resource Planning (ERP), Enterprise Performance Management (EPM), Business Intelligence (BI) and Business Analytics? Where do they help people by allowing more time for the creative and less for the mundane?

    You’ll learn about the key components of Intelligent Automation (IA), including how Robotic Process Automation (RPA) can provide you with more time for creativity, realizing more value out of your existing technology investments.

     

    We’ll also review the key components of a Cross-Enterprise AI platform, highlighting the differences between Intelligent Automation and Cross-Enterprise AI.

    Learn how Cross-Enterprise AI and Intelligent Automation unleash business value in new ways, transforming core processes and business models in various industries.

     

    Register Here!


  • Ignite Business Analytics & EPM with IA


    How can Intelligent Automation (IA) supercharge Enterprise Performance Management (EPM), Business Intelligence (BI) and Business Analytics technologies via Robotic Process Automation (RPA)? 

    In this e-learning session, we’ll briefly review the scope and breadth of traditional EPM and Business Analytics towards the objective of becoming an intelligent agile enterprise.

     

    We’ll talk through examples of where massive integration needs and data silos typically exist in BI, EPM and Business Analytics solutions.

    You’ll then learn about the key components of RPA, as well as several use cases well suited to Intelligent Automation.

     

    Register Here!


  • What Goes Around Comes Around

          Not long ago, a legacy system was enterprise software running on a mainframe. Wait, I thought those supercharged Teradata screaming machines were purchased just a few years ago? Did someone say legacy Teradata? Grab the gold coins, son, we’re going off the grid. It’s a stirring revolution: some large industry leaders are taking their in-house MPP databases and shoving them, opting instead for IaaS and PaaS. Part of what’s driving this movement is the desire to let someone else worry about infrastructure and platforms. These initiatives are not little toy sandbox data marts, but large-scale behemoths being deployed by the largest retail, consumer goods and high-tech companies in the world. Sound familiar? No, it’s not déjà vu. This really is a lot like time-sharing on mainframes almost a half century ago.

     The Cloud People

          Whether on premise or off, you need to align the right set of skills when deploying big data in the cloud. So what kind of skills should you bring on board? Perhaps you’ll need experience in hosting and performance-tuning Splunk applications on Azure to analyze product (machine) operations data for troubleshooting or analysis. Or maybe you’ll need knowledge of tools such as SOASTA for load testing and log analysis as teams spin up thousands of servers across your cloud platform. Going off-premise may raise your bar on efficiency and effectiveness, but subsequently require some team retooling. SaaS experience should entail big queries on cloud services platforms. You’ll need to import data into cloud storage and thereafter ingest to cloud storage from real-time sources. IaaS and PaaS experience should involve virtual machine, virtual network and tools heavy lifting on cloud platforms. Though many big data tools are open source, not all are created equal across cloud platforms. Many skills are transferable, so don’t decide solely based on your architects’ cloud platform religions as to whether you’ll be standing up Virtual Private Clouds (AWS) or Virtual Networks (Azure).

     It’s Real Time

         In ongoing cloud operations and performance, there is a need to process significant amounts of data from web servers, application servers, databases and external sources, such as Akamai and SiteCatalyst. Additionally, data may need to be correlated with applications such as Apigee or an Order Management System. There is a need for real-time and near-real-time visibility into operations. Basket analysis, for example, may use Apache Hadoop, with the data ingested via Flume, where custom programs then do the analysis and prepare the results for reporting (see the sketch below). Tools such as Splunk may be used to collect logs from hundreds of servers and correlate them with the data from the applications.
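         The custom analysis programs aren’t shown here; as a stand-in, below is a minimal pure-Python sketch of pair-counting basket analysis. In the pipeline described above, equivalent logic would run at scale over the data Flume has landed in Hadoop (e.g., as a MapReduce or Spark job); the baskets are invented.

```python
# Minimal basket-analysis sketch: count item pairs that co-occur in orders.
# Baskets are hypothetical; at scale this would run as a distributed job.
from itertools import combinations
from collections import Counter

baskets = [
    {"milk", "bread", "eggs"},
    {"milk", "bread"},
    {"bread", "butter"},
    {"milk", "eggs"},
]

pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The most frequent pairs are candidates for cross-sell placement
for pair, count in pair_counts.most_common(3):
    print(pair, count)
```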

         Moving data in and out of cloud storage may end up being a blend of batch and real-time, using cloud features such as Elasticsearch in an AWS cloud to search data lakes of mobile, social, consumer and business application data. For Google Cloud, you’ll want experience with Google Pub/Sub and Dataflow for real-time data integration into BigQuery.
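         A minimal sketch of that Google Cloud path, assuming an Apache Beam streaming pipeline run on Dataflow, might look as follows. The project, topic, and table names are placeholders, and the target BigQuery table is assumed to already exist.

```python
# Hedged sketch: stream Pub/Sub messages into BigQuery via Apache Beam.
# Run with --runner=DataflowRunner (plus project/region options) on Dataflow.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadPubSub" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/events")  # placeholder topic
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteBigQuery" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",  # placeholder table, assumed to exist
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```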

     Nothing but Cloud

         You wouldn’t do it, of course, just because the cool CIO getting all of the attention at the conference said she did, but between us, if you’re thinking about going off-premise cloud, you’re not alone. More than one top Fortune 500 company is reducing its in-house Teradata footprint and moving to cloud platforms. Note: Teradata is now offering Database as a Service (DBaaS) in its own cloud, as well as on public cloud platforms.

         Your plans may be to go all-in on a public cloud platform. You might end up, however, with some on-premise deployment, even if that’s not the way it started out on the drawing board. So if you end up with an Elasticsearch cluster running on Azure with an on-premise connection to MongoDB inside the firewall, you’re not a loser. And you can still tell your buddies at the bocce league that you’ve gone cloud.
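         A hybrid setup like that might be glued together with a small sync job along these lines. The hostnames, database, collection, and index names are all placeholders.

```python
# Illustrative hybrid-cloud sketch: pull documents from an on-premise
# MongoDB and bulk-index them into a cloud-hosted Elasticsearch cluster.
from pymongo import MongoClient
from elasticsearch import Elasticsearch, helpers

mongo = MongoClient("mongodb://onprem-db.internal:27017")   # placeholder host
es = Elasticsearch("https://my-es.azure.example.com:9200")  # placeholder host

def actions():
    for doc in mongo["sales"]["orders"].find():
        doc_id = str(doc.pop("_id"))  # Mongo ObjectId isn't JSON-serializable
        yield {"_index": "orders", "_id": doc_id, "_source": doc}

# Bulk-index in batches; re-running the job overwrites documents by _id
helpers.bulk(es, actions())
```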

         Make sure to get architects with experience in designing data flows into Hadoop and onto a large-scale cloud platform. Develop best practices for the day-to-day loads, which may involve MongoDB or Cassandra collections. Be prepared to have your team develop use-case prototypes for demonstration to upper management. Data lake technology is still in its early stages, and the skills to extract meaningful business value are in rather short supply. Deployment guides don’t exist yet. Aligning the right skills and governance is key to avoiding projects getting bogged down.

    Master the Data Lake

         It’s intriguing that the new term data lake is named after a body of water rather than a place where we store stuff, a la the data warehouse. That lets us use other water terms when the data gets all mucked up or otherwise permeated, for better or worse, enabling us to call them things like data bogs, swamps, creeks and oceans. Sticking to the metaphor of the lucent body of data, a data lake is similar to the persisted storage area of a data warehouse. The idea is to store massive amounts of data so that it is readily available.

         Back in the data warehouse glory days, you only wanted to have the really important data immediately available, because you simply could not handle making all of the data persistent. That ended up not being such a bad thing, though, because adding some governance to the process led to some discipline up front about what was more important and what was less important. True, technology limited storing ‘everything’ from a cost and capabilities perspective. The collective technology, which we might just call tools, really did not allow us to ‘have it all.’

         Alas, technology has advanced, and it is now cost effective to store gargantuan amounts of data – yes, unstructured too – in data lakes.  But is there a downside?  Time will tell.  The good news is, we get to store a bunch of data we don’t have a clue about.  So maybe, over time, we can make some sense out of it.  It’s sort of like data cryogenics, in that we want to keep the data alive until we find the cure to fix whatever ailments are being caused by something we know so little about.

     Decoder for Big Data & Cloud Terms Referenced

    SaaS – Software as a Service – function-specific on-demand application. Example: SalesForce.com

    PaaS – Platform as a Service – middleware, virtualized databases, development platforms, operating systems. Examples: Azure and AWS

    IaaS – Infrastructure as a Service – memory, disk, network, virtual machine (computer). Examples: Amazon EC2, Rackspace, IBM SoftLayer

    API Management Platform – Apigee

    Big Data Platforms – Hadoop, Hortonworks

    Cloud Services Platforms – Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform

    Content Delivery Network (CDN) – Akamai

    Data Ingestion Tool – Flume

    Massively Parallel Processing (MPP) Database – Teradata

    Monitoring and Analytics Platform – Splunk

    NoSQL Databases – Cassandra, MongoDB

    Performance Analytics – SOASTA

    Search Analytics – Elasticsearch, Google BigQuery

    Web Analytics – SiteCatalyst

    © 2016 RM Dayton Analytics, Ltd. Co.

     This article was originally published 6/27/16 as a CIO Review White Paper entitled ‘Meshing Big Data and The Cloud.’  

    July 9th, 2016

    Download White Paper


  • Business Analytics in the Travel Industry

    The travel industry makes extensive use of business analytics to drive loyalty, promotions, revenue management – and ultimately, the customer experience.  

    These are not your grandfather’s type of slice-and-dice analytics; they use a combination of data science, big data and business analytics visualization tools. Below are a few examples of business modeling in travel and hospitality:

    The Booking Performance Model

    A predictive model of booking performance was redesigned from scratch, adding new functionality while reducing error and cutting the time to update and run the model twenty-fold compared to the previous version. This model is responsible for determining targets and forecasts of bookings and arrivals, calculated at the (book date, arrival date) pair level.

    The model is a multiplicative pickup model, expanded to include the following (a simplified sketch follows the list):

    - Stationarization
    - Seasonal decomposition
    - Outlier detection
    - Frequency domain analysis
    - SARIMAX
    - Synthetic booking and cancellation curves
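
    As a rough illustration of just the SARIMAX component named above, here is a minimal Python sketch using statsmodels. The series, orders, and seasonality are invented; the article does not disclose the production model’s specification.

```python
# Simplified SARIMAX sketch on a synthetic weekly bookings series.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

np.random.seed(0)

# Hypothetical weekly bookings with rough yearly seasonality (52 weeks)
idx = pd.date_range("2015-01-04", periods=156, freq="W")
y = pd.Series(
    100 + 20 * np.sin(2 * np.pi * np.arange(156) / 52)
    + np.random.normal(0, 5, 156),
    index=idx,
)

# Seasonal order (P, D, Q, s) with s=52; exogenous regressors (the "X"
# in SARIMAX) could carry promotions, holidays, or booking-pace features.
model = SARIMAX(y, order=(1, 1, 1), seasonal_order=(1, 0, 1, 52))
fit = model.fit(disp=False)

print(fit.forecast(steps=8))  # 8-week-ahead booking forecast
```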

    The Strategic Business Simulation

    This model "looks at" the data and business questions in a unique way, to answer questions that are largely immune to statistical and trend extrapolation models.

    The Deep Learning Model

    This is a long short-term memory (LSTM) recurrent neural network (RNN) model. It extracts and predicts spatial, temporal, and agent-level patterns of behavior.
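    A minimal Keras sketch of this kind of sequence model is below. The window size, feature count, and architecture are placeholders, since the article gives no details of the production network.

```python
# Toy LSTM sketch: predict the next value of a multivariate series from a
# sliding window. Data shapes and hyperparameters are illustrative only.
import numpy as np
import tensorflow as tf

# Hypothetical training data: 1000 windows of 30 timesteps, 4 features
X = np.random.rand(1000, 30, 4).astype("float32")
y = np.random.rand(1000, 1).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(30, 4)),  # sequence encoder
    tf.keras.layers.Dense(1),                       # next-step prediction
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)

print(model.predict(X[:1]))  # prediction for one window
```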

    Rapid Response Analytics

    The rapid response analytics process was improved to deliver better data faster, by creating high-quality, complex data extracts and aggregations that can be quickly customized for unusual data pulls. This saves days of effort on several very urgent descriptive / diagnostic analytics requests, while drastically reducing turnaround time.

    Data analytics revolve around these types of business subjects:

    - Sales promotion targeting and performance
    - Booking channels
    - Competitive analysis
    - A/B testing of new initiatives
    - Segmentation of guests, assets, and prices
    - Analysis of pricing strategies

    May 24th, 2017

    © 2017 RM Dayton Analytics, Ltd. Co.

      

  • Appointment Request - Phone Chat with Dayton Analytics


    Thank you for your interest in scheduling time with us. Please request 20 minutes at one of the available times listed on the calendar below. If something comes up, please use the confirmation email that you'll receive to reschedule.

     

  • Strata Data is where cutting-edge science and new business fundamentals intersect.

    The conference is a deep-immersion event where data scientists, analysts and executives get up to speed on emerging technologies.  

    The selected speakers have a focus on the data issues that are shaping the finance, media, fashion, retail, manufacturing and energy worlds.  

    Find 30 minutes on the calendar link, and we'll talk shop on business analytics over the java that you drink.

    The conference is being held in New York, NY, Sep. 23 - 26, 2019.

     

     Java Calendar for Strata Data Conference 2019

  • As the business world has grown in the 21st century, artificial intelligence (AI) has grown with it, to the point that it is no longer a rare occurrence or unique application. Companies are using AI in several different ways, among them improving customer relations, creating more efficient processes, and making smarter business decisions. Cross-enterprise AI has changed the rules regarding the way executives can tailor product, marketing and sales strategies. There are several different technologies underneath the broader umbrella of artificial intelligence: business analytics, data science, data engineering and machine learning.

    Business analytics is a wide, encompassing term that simply refers to the various ways in which we use data to derive meaningful patterns and draw conclusions. We use this data to tell us why things happened in the past and, hopefully, what will happen in the future. It’s a process that has been used for ages, as businesses attempt to gain an upper hand, and can be as simple or as complicated as you like. But the field of business analytics is constantly changing, evolving as better techniques become available and improvements are made every day.

    Data science is what allows business analytics to continue to improve. As the name implies, it is a scientific process focused on the study of data. Data scientists create new analytic processes in search of the best ways to obtain insights into, and understanding of, the markets they serve. Companies that wish to stay ahead of the curve in artificial intelligence should ensure they have several data scientists on staff in order to push the boundaries of their analytic capabilities further than their competitors.

    Data engineering is what allows data analytics and data science to be as productive as they are. It makes data usable for analytics and science to work with. Data engineers convert masses of data that have been collected in systems and data silos into groups of useful data, which can be put into applications and through algorithms in order to find the meaning you want. As more and more data become available – and the amount of data being extracted across the world grows every day – data engineering becomes more important if we want to sift through the excess and get to the data points that mean the most to us.

    Machine learning is more of a field contained within these different types of artificial intelligence, in which computers don’t just use data to spit out results, but also learn from the data and use it to improve their own systems. The machine is given a set of rules and then is essentially given large quantities of data and set free. This version of AI does not require the structured data that analytics or data science does, and thus can find results from datasets that might otherwise be unusable.
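    For readers who want to see the “rules plus data” idea in its simplest form, here is a toy supervised-learning example in scikit-learn. The bundled iris sample dataset stands in for business data and is purely illustrative.

```python
# Toy machine-learning sketch: a fixed algorithm (the "rules") improves
# its predictions from the examples it is given.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)  # "learning" from labeled examples

print("Held-out accuracy:", clf.score(X_test, y_test))
```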

    So, with this knowledge in hand, what might be the best ways to implement AI into your own enterprise? Almost every organization knows AI can help them, but they aren’t sure how. Determining the right kinds of analytics your company needs is essential. What kinds of problems do you need to solve? What needs to improve? Is the proper organizational structure in place to take full advantage of this technology? Knowing why and how you want to utilize cross-enterprise AI is a vital first step.

    You next must ask how you plan to implement cross-enterprise AI, which generally offers two choices: buying an application that is already prepared or building your own to suit your needs. Both paths offer pros and cons, which is why deciding what is right for your company is even more important.

    Building your own AI can make sense if your enterprise requires a more tailored solution not typical of cross-enterprise AI. Whether for a niche industry or a unique business process, most cloud vendors may not offer the application you need. Luckily, investing in data scientists can help you develop your own AI applications, and many cloud platforms, even if they can’t provide a ready-made solution, do have tools that assist with some of the most daunting obstacles in creating your own artificial intelligence.

    Buying already-built cross-enterprise artificial intelligence, however, provides a much easier entry into the world of AI. Costs are lower and implementation time is shorter; there won’t be any need for development platforms or custom-built solutions, just buying and integrating. Because it’s much easier to install, you will also see benefits from your investment sooner. This path is best for a company that has a very defined need that is common within the business world, and that wants to use AI in very common use cases such as Finance, Marketing, Operations and Sales, where AI systems for these use cases almost certainly already exist. For these types of uses, which you would have had to build yourself five or ten years ago, building your own today would be akin to starting to build your own ERP system in the mid-’90s. You generally will get about 80% out of the box with a buy solution, the remaining 20% being the customization to your business and specific use cases.

    July 25th, 2019

    © 2019 RM Dayton Analytics, Ltd. Co.

      

  • Cross-Enterprise AI vs. Intelligent Automation: What's the Difference?

     

    As artificial intelligence (AI) has increased in efficiency and spread across the country and across the world, more and more managers have turned towards AI solutions to improve their processes and become more agile. The technology has drastically transformed the workplace, making it a much more intelligent, accurate, and productive space than it was even ten years ago.  

    Two of the most prominent systems used by companies today are Robotic Process Automation (RPA) and Cross-Enterprise AI. Organizations seeking to employ these systems often know that they can greatly improve their internal operations, but not to what extent; in many cases, without the help of extensive implementation consulting, they aren’t even sure how to properly use these systems and install them in a value-maximizing way. RPA and Cross-Enterprise AI are extremely powerful, but their actual purpose must be understood, and proper expectations must be set, in order to fully harness them.

    Strictly speaking, RPA is not a form of AI, as it represents the automation and performance of prescribed tasks rather than the use of data to create new information. RPA is designed to perform the same actions a human would through user interface and descriptor technologies. RPA operates in place of a human via a “bot” that uses various pathways and triggers. It does not create new data or enhance existing business models, but rather accelerates the timeline upon which those models operate.
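    As a loose illustration of the “bot performs a human’s UI actions” idea, here is a minimal Python sketch using Selenium. The URL, element IDs, and CSV layout are invented; commercial RPA suites layer recorders, schedulers, queues and triggers on top of this basic pattern.

```python
# Illustrative RPA-style sketch: a "bot" fills a web form from a CSV,
# performing the same clicks and keystrokes a person would.
import csv

from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()

with open("invoices.csv", newline="") as f:       # hypothetical input file
    for row in csv.DictReader(f):
        driver.get("https://erp.example.com/invoices/new")  # placeholder URL
        driver.find_element(By.ID, "vendor").send_keys(row["vendor"])
        driver.find_element(By.ID, "amount").send_keys(row["amount"])
        driver.find_element(By.ID, "submit").click()  # the human's click

driver.quit()
```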

    Download the Executive Briefing White Paper to learn about the key differences between Cross-Enterprise AI and RPA.

     

    I'd Like to Read this White Paper


  • Enterprise AI Digital Transformation for the 2020's


    As business moves along in the 2020's, enterprises continually look for new ways to one-up the competition.

    Artificial Intelligence (AI) is now well-established as a transformation technology across various sectors of industry, from retail and manufacturing to transport, as well as in government and in scientific research. This executive briefing examines the factors that influence adoption of AI in industry and the opportunities and risks that such adoption will entail in the build versus buy decision. 

    The digital transformation impact of AI comes both from its effect on intelligent decision making and predictions and from its facilitation of greater automation. While increased automation has been a key component of technological progress since the industrial revolution, AI and Machine Learning (ML) techniques promise far greater automation than before. Automation can bring about lower costs and faster turnaround time on projects, as well as free up human time for tasks that are less amenable to automation. The effort exerted towards automation can also identify bottlenecks in a project which cause friction and reduce efficiency.

    Download this RM Dayton Analytics Executive Briefing to get perspectives on enterprise-AI digital transformation for the 2020's.

     

    I'd Like to Read this Executive Briefing

     

  • Getting on the Right Path to Intelligent Automation: An Approach to Robotic Process Automation (RPA)

     

     The first phase in the journey is planning, in which you must determine the ways in which RPA can help you and whether your company is prepared to bring in this kind of software. You must develop your strategy, identify benefits that RPA could bring, and select a partner to help incorporate RPA. Key components of strategy should include building awareness of automation technology within the company, obtaining sources of funding, and creating your initial approach for implementation.  

    You then have to confirm that RPA programs are indeed applicable and begin your pilot programs, selecting your initial use cases, observing preliminary results, and measuring whatever benefits you can see. You must set up your basic infrastructure, begin acquiring skill in RPA technology, and start retraining your workers with new, more advanced skills that they will be able to use as a result of the freedom provided by robotic automation.

    The first major challenge can be the lack of a roadmap or well-developed strategy. As mentioned earlier, a robust RPA strategy is necessary when it comes to determining potential outcomes, finding potential fits within the company for RPA tools, and executing the choices necessary to maximize the use of automation. A change management strategy is also needed when it comes to dividing roles, assuring company-wide buy-in, and addressing pushback that may come from skepticism of new technology.

    Download the Executive Briefing White Paper to learn about getting on the right path to effective and efficient Intelligent Automation.

     

    I'd Like to Read this White Paper

