
Big Data

  • “A long, long time ago, I can still remember, how the music used to make me smile.  And I knew if I had my chance, I could make those people dance, and maybe they’d be happy for a while.”   Don McLean never did reveal what “American Pie” meant, though others have provided interpretations.   Thank you Don.  Perhaps you thought that meanings could be projected based upon individual experiences. 

    In early ‘94, I happened to stumble upon OLAP as it was morphing from an elitist mainframe technology into one that could be put into the hands of we common folk.  I’d never heard of Arbor or Essbase when I answered that ad in the Dallas Morning News, but saw it was client-server and wanted some of that.  I found out later on that I was going to be the first dedicated Essbase consultant hired by Comshare – and that Essbase was not the same thing as Sybase.   Take a job on something no one’s ever heard of?  Sign me up!

    One of the joys of being the first guy in, is that you get to be a jack of all trades.  A little training here, an implementation over there, and oh – by the way – we need some help selling new deals.

    Implementations were their own kind of fun.  The early adopters generally were large companies that would at times create labs to explore this new OLAP toy.  One of my first clients, Shell Oil in Houston, actually paid me to benchmark Essbase calculation performance on Windows NT vs. OS/2.  Paid to play.  How cool was that?

    But things were about to get a whole lot better.  As a techie who avoided the dark side of sales, what lay ahead was beyond the reaches of my limited imagination.   We didn’t know it, but we were about to become showmen.

    The glory was astounding.  As we were evangelizing OLAP, we developed quite the shtick.  We’d always get our prospect companies in Texas to give us a small extract out of their GL (in that mostly pre-ERP era).

    When I received a 3.5” diskette with their data extract in my hand, it was sort of like receiving a tech toy from Amazon today, but exponentially better.  You just knew.  What was about to come could not be stopped, an uncontrollable destiny with only one outcome – and you could hardly contain yourself.

    So, you’re thinking – data on a diskette – how much could it be?  There was a secret to the shows that we put on.  We demoed everything live.  So all of the data needed to run on circa ’95 laptops running both the client and server versions of Essbase on Windows NT.  Everything crashed all the time.  Nobody seemed to care much... 

    When show time came, the choreography included setting up the audience with a teaser, showing them their data on the ‘back-end’ in a graphical format.  That was the perfect opening act, as hardly anyone had seen their business rendered in such a manner before.  Then we’d hit them with the punch line – the show’s climactic moment, that magical ‘slice and dice’ in Excel.  The ‘oohs’ and ‘aahs’ filled the air, and unconstrained joy would permeate the room.  Conquering heroes, we were, as we pivoted the data from a row to a column, then drilled down to resounding applause.   My buddy David and I would be jumping in front of each other, quite literally, to deliver the best lines.  We may have been rehearsed, but we weren’t scripted.

    Audiences would clap, then just gush with unbridled excitement, as if every prior data frustration they had ever experienced had been somehow mystically lifted from their shoulders.   Some of the best testimonials that I’ve ever heard came from some of those early OLAP audiences, guttural praise, straight from the heart.  One of my all-time favorites was from a manager in the Lubes Division of Shell Oil in Houston, as he witnessed in front of we showmen and a dozen or so colleagues, “This gives us answers to questions that we never even knew how to ask.”

    We got the deal.  It worked every time.  Even when our laptops would freeze up during the demo, and we’d have to re-start, we’d get the deal.  We always got the deal.  The early adopters willed OLAP into prime time.

    The people smiled.  We made the people dance.  They were so very happy – for a while.  Then the novelty of that magical, pivoting two-step eventually ran its course, of course, as that is what happens to all new game-changing technologies.  The world went on and grew up in a hurry.  Oracle has Essbase these days, and their sales model is well, um … different.

    In our showman days, Decision Support Systems (DSS) was the collective term used to encapsulate all of what we provided, as the then-current progression from Executive Information Systems (EIS).  Since then, we’ve added Business Intelligence (BI), which, back in the day, was staunchly defended as different from DSS.  Then with a little help here and there from Hyperion and Gartner, along came Business Performance Management (BPM), which was the same as Corporate Performance Management (CPM), which is the same as what we have now primarily settled on as Enterprise Performance Management (EPM).

    Add Data Analytics, Data Science, Big Data, Data Mining, Predictive Analytics, Data Visualization since then, just to name a few. We were telling everyone 22 years ago that coding was not necessary for OLAP – even the users in Finance could handle it.  The semantic layer is still alive and well, but there’s also been a renaissance in coding.  We’ve got good coding schemes and data dreams and IoT not so much on the QT.  We’ve got Hadoop and R and Machine Learning and Python and Pig.  We’ve got Spark on the Apache.  We’ve got a zillion ways to zap every bit of data on the planet into sub-atomic data particles.  And we’re just getting started parsing data into oblivion.

    I wonder sometimes if we’ve not come full circle?  We’re still mincing, slicing and dicing – but we’re just doing a great deal more of it – on a heck of a lot more data.  Would not all of the above ‘Big Data’ technologies, without too much of a stretch, settle themselves just dandy under the classification of Decision Support?   I’m just sayin’.

    Perhaps my slice of the pie is that I got to be a showman for a while. Or perhaps it’s just that I’ve had a bird’s eye view of seeing everything come full circle. Either way, it makes me smile.

    Robert Dayton

    February 8th, 2016 

  • The world's top brands have hired RM Dayton to align top analytic talent.

  • RM Dayton - Top Gun Analytic Talent

  •   

     

    PLANO, TX—August 24, 2017 – CIO Review selects RM Dayton Analytics as a '100 Most Promising Big Data Solution Provider 2017,' its top-tier list. The merit-based award was earned after CIO Review's editorial research panel of experts evaluated several hundred solution providers.  

    "We are honored and humbled to receive this prestigious award," said Robert Dayton, Managing Partner at RMDA. "To receive this accolade not only is validation of our knowledge worker alignment mission, but also is a recognition award for our incredible team and forward-thinking our customers, who are at the forefront in the strategic use of data analytics."  RM Dayton Analytics makes it to CIO Review’s Top 100 Big Data Solution Providers list for its expertise in helping clients align enterprise intelligence in a manner that transcends the iterative approach to deploying analytic talent in silos. 

    The annual list showcases the 100 Most Promising Big Data Solution Providers 2017. The positioning is based on evaluation of RM Dayton Analytics’ specialties in building teams of strategic and operations knowledge workers from an inside looking out perspective.  RM Dayton is changing the way the world's leading organizations deploy their X's and O's, aligning the world's most-prized intellectual capital in a new paradigm for data analytics.  The annual list of companies is selected by a panel of experts and members of CIO Review’s editorial board.

    RM Dayton Analytics has been selected after being evaluated across more than a dozen quantitative and qualitative elements. Experts made the decision by taking into consideration the company’s experience, industry recognition, technical certifications, market presence and positive client reviews. “The company continued to break new ground within the past year benefiting its customers,” according to Jeevan George, Managing Editor of CIO Review. “We are happy to showcase them this year due to their continuing excellence in delivering top-notch technology driven solutions.”

    Download the CIO Review Article

    About CIOReview

    CIO Review constantly endeavors to identify “The Best” in a variety of areas important to tech business. Through nominations and consultations with industry leaders, its editors choose the best in different domains.  The Big Data Special Edition is an annual listing of 100 Most Promising Big Data Solution Providers 2017 in the U.S.  For more information, visit the website at https://bigdata.cioreview.com/vendor/2017/rm_dayton_analytics

    About RM Dayton Analytics

    RM Dayton Analytics is trusted by Fortune 100 companies, as well as industry disruptors and other admired Global 2000 brands, to find the sharpest minds that embrace technology and enhance the customer experience. 

    We've been there ourselves as practitioners in our space. We've either been in your shoes as a hiring manager and/or we've worked on data analytics solutions like you have. We get it.  We know what team building takes.  And we know what makes someone a top gun.

    The world's most admired companies realize that it's the people that make the difference in driving their technology roadmap.  RMDA focuses exclusively on aligning knowledge workers in business analytics, data science and big data.   The top brands on the planet realize that their People Roadmap is as important as their technology roadmap.

    We align the X's and O's that are right at the heart of the world's most prized intellectual capital, right at the leading edge of the people-side of data analytics.  It is people, after all, that drive the analytic technology, i.e. the 1's and 0's.  We do the X's and O's™.

  • CIO Review selects RM Dayton Analytics as a '20 Most Promising Enterprise Performance Management Solution Provider 2016,' its top-tier list. The merit-based award was earned after CIO Review's editorial research panel of experts evaluated over 300 solution providers. RM Dayton Analytics is changing the way the world's leading organizations deploy their X's and O's, aligning the world's most-prized intellectual capital.  

    "We are honored and humbled to receive this prestigious award," said Robert Dayton, Managing Partner, "EPM is the heart and soul of data analytics. To receive this award, in this category, at this juncture in our journey, not only is validation of our knowledge worker alignment mission, but also is a recognition award for our customers, who are at the forefront in the strategic use of data analytics."  CIO Review EPM Special Edition

  • Improvements in Capabilities Make AI Viable for the Enterprise

    Artificial Intelligence has long been viewed as the solution to all of the problems we will encounter a decade from now. Ever since the term was coined in the 1950s, it has been peering over our shoulder, teasing us as seemingly within reach but simultaneously just beyond our grasp. Bold predictions about its power and proximity of use go unfulfilled, and we resign our hopes for AI to the future while maintaining the norm in the now.

    Many of the difficulties in creating more advanced AI lie within a theory known as Polanyi’s Paradox: many of the things that humans know, whether it be how to play a hand of poker or how to recognize a face, cannot be easily explained, to either another human or a machine, and are instead tacit knowledge acquired through some form of the human experience. This is a natural advantage humans have long held over artificial intelligence, which is not capable of such “unsupervised learning,” as it were, and instead relies upon explicit instruction.

    But the time may have finally come. AI advances, coupled with huge steps in machine learning, have resulted in huge improvements in the capabilities of machines to accomplish goals that previously were left to the human mind. Tasks relating to perception and cognition have seen serious progress in just several years; the error rate of speech recognition has dropped by nearly fifty percent since 2016, and companies like Google and PayPal are using machines to independently enhance and protect their systems.

    But how can enterprises more reliably and effectively take advantage of AI? Most of these businesses have remained profitable by controlling costs and exploiting economies of scale, which offers stability and predictability as it pertains to costs and profit, but very little when it comes to creating growth or sensing opportunity and threat rising and falling from within the market. They operate on a very large, very slow scale and are built on a foundation of huge wells of data, each one limited to one aspect of their internal operation.

    If enterprises, instead of relying on these large warehouses of data, can focus on specific use cases within industries, they will find an increase in demand, a lift in revenue, and an improvement in sales. Specific, valuable use cases are what drive data monetization, especially when spanning multiple, horizontally-oriented functions. 

    Another great advance in the development of AI is the decrease in the length of the implementation cycle. Previous systems took years to develop and were borderline outdated by the time they were fully operational and maximized. Now, within three months, data can be brought together to create initial automated actions and construct a client-specific view of the market. And thanks to the continuous stream of data, the system will always be updating and improving to fine-tune and validate its recommendations in the market. Every piece of data helps the AI devise a more complete view of the market, and unlike old systems, there is no level-off at a certain point; the more data you have, the better the computer’s predictions will be.

    Enterprise AI has also been used to drive retail. Matching sales, production, and resources with consumer demand can be achieved through a cross-channel analysis of the market using artificial intelligence. Data tells you who your customers are and how to market to each of them individually, then how to prioritize outlets and discover demand so that you can efficiently meet the consumer’s needs.

    Artificial intelligence has been used by trucking companies to optimize their routes and even to offer advice to their drivers while in the cab, such as whether to speed up or slow down. Reducing delivery time and optimizing fuel use has saved one European trucking company over 15% in fuel costs from sensors that monitor driver behavior and vehicle performance. Similarly, airlines have used AI to predict problems such as airport congestion and bad weather, helping them avoid cancellations that can prove extremely costly.

    Customer service and marketing fields have also benefitted greatly from the growing versatility of AI, thanks to the improvement of both AI pattern recognition and voice recognition. Companies like Amazon and Netflix that use “Next Product” recommendations in order to target individual customers can see large sales increases by using machines to determine what products a consumer is more likely to buy based on their purchasing history. And any company with an automated customer service system will benefit not only from better voice recognition, but from improving voice analysis that allows automated answering systems to recognize when a customer is becoming upset and automatically transfer them to a human representative.
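
    To make the “Next Product” idea concrete, here is a minimal sketch of item-to-item recommendation from purchase history. The baskets and product names below are made up for illustration; the real systems at the vendors mentioned above are, of course, far more sophisticated.

```python
from collections import defaultdict
from itertools import combinations

# Toy purchase histories (hypothetical data, for illustration only).
baskets = [
    {"router", "ethernet_cable", "switch"},
    {"router", "ethernet_cable"},
    {"laptop", "laptop_bag", "mouse"},
    {"laptop", "mouse"},
]

# Count how often each pair of products is bought together.
co_counts = defaultdict(int)
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def next_product(last_purchase, top_n=3):
    """Recommend the items most often co-purchased with the last purchase."""
    scores = {b: n for (a, b), n in co_counts.items() if a == last_purchase}
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(next_product("laptop"))  # e.g. ['mouse', 'laptop_bag']
```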

    Deep neural networks, the technology behind machine learning and advanced AI systems, have been shown to improve upon the performance of other analytic techniques in 69% of cases. Use cases show that modern deep learning AI has the ability to boost value above traditional AI by an average of 62%, with industries like retail and transport seeing a boost of almost 90%, and travel enjoying a fantastic 128% benefit with modern AI as opposed to traditional. When aggregated, AI has the potential to create up to $5.8 trillion in value, which would be 40% of the overall potential impact of all analytics techniques.

    Retail stands to gain the most from advanced AI in marketing and sales areas such as pricing and promotion, where it could gain $500 billion a year. Combining that gain with advances in task automation and in inventory and parts optimization, the retail industry as a whole could see up to a trillion dollars as a result of new AI techniques.

    Consumer packaged goods can see the greatest benefit in supply-chain management and manufacturing, where predictive maintenance and inventory optimization could result in a net of $300 billion a year. There are dozens of areas in which goods enterprises can improve themselves with AI, among them yield optimization, sales and demand forecasting, budget allocation, and analytics-driven hiring and retention.

    Banking can see additional profits by applying enterprise AI to its marketing, sales, and risk sectors. Just those three could allow the banking industry to see another $300 billion, by using advanced analytics to handle, as we’ve discussed, customer service, risk assessment, pricing, and customer acquisition.

    The discovery of farmer/customer archetypes has been driven by AI. Every farmer has a “digital fingerprint,” created from characteristics such as crop yield, geography, and operational performance, among many others. Products are then rated for customers to reveal taste and preference, specific to product attributes, which in turn boosts products that share many attributes with the most popular products. As these attributes become more or less popular, they provide the farmers with insight into consumer trends, and provide the customer with thematic farmer archetypes. The creation of these archetypes helps the underlying enterprise AI uncover the true drivers of demand for products, find lookalike customers, focus growth targeting, define targeted customer concepts, and target promotions to stimulate cross-sell and upsell.
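
    As a rough illustration of how such archetypes might be derived (a generic sketch, not the actual system described above), the digital fingerprints can be clustered so that “lookalike” farmers fall into the same group. The features, values and choice of k below are assumptions made purely for illustration.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical "digital fingerprints": one row per farmer
# (columns: crop yield, latitude of operation, operational performance score).
fingerprints = np.array([
    [180.0, 41.5, 0.72],
    [175.0, 41.9, 0.70],
    [ 90.0, 33.2, 0.55],
    [ 95.0, 32.8, 0.58],
    [140.0, 38.0, 0.90],
    [138.0, 37.5, 0.88],
])

# Standardize so no single characteristic dominates the distance measure.
X = StandardScaler().fit_transform(fingerprints)

# Group farmers into archetypes; k=3 is an arbitrary choice for illustration.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("archetype per farmer:", kmeans.labels_)

# "Lookalike" farmers are simply those assigned to the same archetype.
```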

    June 11th, 2019

    © 2019 RM Dayton Analytics, Ltd. Co.

      

  •  

    Conference - Collaborate 19 -- Technology Forum


    Collaborate will be held in San Antonio, Texas from April 7th-11th, 2019.  Let us know if you'll be heading out to San Antone.  Schedule 30 minutes on the calendar below, and let's talk shop about where business analytics is today, where it's heading, and how that might impact your company initiatives and career.

     

  •  

    Conference - Predictive Analytics World


    The Predictive Analytics World Conference will be held in Las Vegas, NV from June 16th - June 20th, 2019.  We look forward to seeing you there to network with industry practitioners in business analytics.   Let us know if you'll be heading out.  Find 30 minutes on the calendar below, and we'll talk shop on business analytics over the java that you drink in Vegas.

     

     

  •  

    Conference - Strata Data -- Make Data Work


    The Strata Data Conference will be held in San Jose, CA from March 15th-18th, 2020.   Let us know if you'll be heading out to Silicon Valley.  Schedule 30 minutes on the calendar below. Thank you.

     

     

    Thank you!

  •  

    Conference - The AI Summit 


    The AI Summit will be held in New York, NY from December 11th-12th, 2019.  We look forward to seeing you there.   Find 30 minutes on the below calendar, and we'll sync up in The City.

     

    Thank you!

  •  

    Cross-Enterprise AI 101 for Analytics and EPM People


    Where does Cross-Enterprise Artificial Intelligence (AI) take the baton from Enterprise Performance Management (EPM) and other business analytics technologies?  

    In this e-learning session, we’ll briefly review the scope and breadth of traditional EPM and Business Analytics towards the objective of becoming an intelligent agile enterprise.

    We’ll talk through examples of advanced analytics applications typically beyond the scope of classic EPM, where massive integration across different types of data silos is required, and different skill sets, such as those of a data scientist, are often deployed.

    You’ll then learn about the key components of a cross-enterprise AI platform, as well as several use cases highly adaptive for cross-enterprise AI. A prime objective of Cross-Enterprise Artificial Intelligence is to continuously sense what people want, and then deliver continuously adapting value.

     

    Register Here!

     

     

  •  

    Cross-Enterprise AI vs. Intelligent Automation: What's the Difference?


    Where do Intelligent Automation (IA) and Cross-Enterprise AI fill the gaps in enterprise software technologies, such as Enterprise Resource Planning (ERP), Enterprise Performance Management (EPM), Business Intelligence (BI) and Business Analytics technologies? Where do they help people by allowing more time for the creative, spending less on the mundane?

    You’ll learn about the key components of Intelligent Automation (IA), including how Robotic Process Automation (RPA) can provide you with more time for creativity, realizing more value out of your existing technology investments.

     

    We’ll also review the key components of a Cross-Enterprise AI platform, highlighting the differences between Intelligent Automation and Cross-Enterprise AI.

    Learn how Cross-Enterprise AI and Intelligent Automation unleash business value in new ways, transforming core processes and business models in various industries.

     

    Register Here!

     

     

  •  

    Ignite Business Analytics & EPM with IA


    How can Intelligent Automation (IA) supercharge Enterprise Performance Management (EPM), Business Intelligence (BI) and Business Analytics technologies via Robotic Process Automation (RPA)? 

    In this e-learning session, we’ll briefly review the scope and breadth of traditional EPM and Business Analytics towards the objective of becoming an intelligent agile enterprise.

     

    We’ll talk through examples of where massive integration and data silos typically exist in BI, EPM and Business Analytics solutions.

    You’ll then learn about the key components of RPA, as well as several use cases highly adaptive for Intelligent Automation.

     

    Register Here!

     

     

  • Eliminate the Fluff

    I have been working in Business Analytics for about 20 years, and for a good part of that, focused on aligning teams of consultants for projects. A lot of thought goes into putting these teams together.  The most important attributes to consider are technology expertise and domain knowledge. 

    Consultant profiles span a myriad of technologies, so what someone looks like ‘on paper’ from a technology expertise perspective can be deceiving. Technical expertise can be the most difficult to assess from a customer's point of view.  You're buying into the fact that the consulting firm may have done this successfully in the past, though perhaps with a vastly different set of characters.   

    After three or more projects, one generally has enough domain expertise so they no longer have to fake it.  In the ‘full-service’ consulting model, juniors would be put on the team as billable consultants, learning both technical skills and domain knowledge on the client’s nickel.

    Before founding RMDA, I worked on projects with EPM experts that involved Enterprise Performance Management (EPM), Business Intelligence (BI) and Business Analytics. Each resource billed between $150/hr at the low end and $225/hr+ at the upper end.  So in just one month, for a team of just three resources, the client was billed around $100,000.
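
    For the curious, here is the rough math behind that monthly figure, assuming full-time utilization of roughly 170 billable hours per resource per month (my assumption here, not a number from the engagement):

```python
# Assumed full-time utilization: ~170 billable hours per resource per month.
hours_per_month = 170
team_size = 3

low  = team_size * 150 * hours_per_month   # $ 76,500
high = team_size * 225 * hours_per_month   # $114,750
print(f"monthly bill: ${low:,} - ${high:,}")
```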

    The Less is More Paradigm

    During the build of the project, it took effort to make sure the right hand knew what the left hand was doing. Precious time and money at $200/hour could be spent, for example, on naming conventions. This overhead adds cost and risk to the project.

    With a 'less is more' mindset, RMDA challenges this resource alignment method and dares to ask, "Why not?"  Why not reduce the footprint related to business analytics to fewer, pinpointed resources?

    The paradox is, you can leverage the underlying technology investment in the 'to be' state better, while retaining more of the enhanced, prized intellectual capital you wanted to attain in the first place.

    Combining an agile approach with experience in the business analytics domain as practitioners, transformative results can be delivered without the layers.

    RM Dayton knowledge worker engagements consist of fewer, but laser focused resources, without the fluff - an approach that has earned the honor of being one of The Silicon Review 50 Most Admired Companies of the Year. Because you don’t have four or five experts sitting around, when one or two pinpointed resources would do, you save time and money while reducing project risk. 

    We can do this, as RMDA is not focused on 'all' IT, but rather, is focused solely on Business Analytics, Big Data and Data Science.

    Robert Dayton

    October 20th, 2017

    © 2017 RM Dayton Analytics, Ltd. Co.

    Related article on CIO Review - Enhance Customer Engagement from the Inside Looking Out.

      

  •  

    What Goes Around Comes Around

          Not long ago, a legacy system was enterprise software running on a mainframe.  Wait, I thought those supercharged Teradata screaming machines were purchased just a few years ago?  Did someone say legacy Teradata?  Grab the gold coins son, we’re going off the grid.  It’s a stirring revolution that some large industry leaders are taking their in-house MPP databases and shoving them, opting for the route of IaaS and PaaS.  Part of what’s driving this movement is the desire to let someone else worry about infrastructure and platforms. These initiatives are not little toy sandbox data marts, but large-scale behemoths being deployed by the largest retail, consumer goods and high tech companies in the world.  Sound familiar?  No, it’s not déjà vu.  This really is a lot like the time sharing on mainframes almost a half century ago.

     The Cloud People

     Whether on premise or off, you need to align the right set of skills when deploying big data in the cloud.   So what kind of skills should you bring on board?  Perhaps you’ll need experience in hosting and performance tuning Splunk applications on Azure to analyze product (machine) operations data for troubleshooting or analysis.  Or maybe you’ll need knowledge in tools such as SOASTA for load testing and log analysis as teams spin up thousands of servers across your cloud platform.  Going to an off-premise cloud may raise your bar on efficiency and effectiveness, but subsequently require some team retooling.  SaaS experience should entail big queries on cloud services platforms.  You’ll need to import data into cloud storage and thereafter ingest from real-time sources.  IaaS and PaaS experience should involve virtual machine, virtual network and tools heavy lifting on cloud platforms.  Though many big data tools are open source, not all are created equal across cloud platforms.  Many skills are transferable, so don’t decide solely based on your architects’ cloud platform religions as to whether you’ll be standing up Virtual Clouds (AWS) or Virtual Nets (Azure).
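
    As a small, hedged example of that first skill, here is what a batch import of an on-premise extract into cloud storage might look like with the AWS SDK for Python; the bucket, key and file names are placeholders, not references to any real environment.

```python
import boto3

# Placeholder names -- substitute your own bucket, key and local extract.
BUCKET = "example-data-lake-landing"
LOCAL_EXTRACT = "daily_orders.csv"
KEY = "landing/orders/daily_orders.csv"

s3 = boto3.client("s3")

# One-time / batch import of an on-premise extract into cloud storage.
s3.upload_file(LOCAL_EXTRACT, BUCKET, KEY)

# Downstream jobs (Hive/Spark external tables, ETL into an MPP store, etc.)
# can then read from s3://example-data-lake-landing/landing/orders/.
```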

     It’s Real Time

         In ongoing cloud operations and performance, there is a need to process significant amounts of data from web servers, application servers, databases and external sources, such as Akamai and SiteCatalyst. Additionally, data may need to be correlated with applications such as Apigee or an Order Management System.  There is a need for real-time and near real-time visibility into operations.  Basket analysis, for example, may use Apache Hadoop, with the data ingested via Flume, after which custom programs do the analysis and prepare the results for reporting.  Tools such as Splunk may be used to collect logs from hundreds of servers and correlate them with the data from the applications.
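
    To give a flavor of the basket-analysis step, here is a minimal PySpark sketch using FP-Growth over transaction data already landed in HDFS by the ingestion layer; the path, column names and thresholds are assumptions for illustration, not a production recipe.

```python
from pyspark.sql import SparkSession
from pyspark.ml.fpm import FPGrowth

spark = SparkSession.builder.appName("basket-analysis").getOrCreate()

# Assumed layout: one row per transaction, 'items' is an array of product ids,
# landed in HDFS by the ingestion layer (e.g. Flume). Path is a placeholder.
baskets = spark.read.parquet("hdfs:///data/clickstream/baskets")

fp = FPGrowth(itemsCol="items", minSupport=0.01, minConfidence=0.2)
model = fp.fit(baskets)

# Frequent itemsets and association rules feed the downstream reporting layer.
model.freqItemsets.show(10)
model.associationRules.show(10)
```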

         Moving data in and out of cloud storage may end up being a blend of batch and real-time using cloud features such as Elasticsearch in an AWS cloud to search data lakes of mobile, social, consumer and business applications data.  For Google Cloud, you’ll want experience with Google Pub/Sub and Dataflow for real-time data integration into BigQuery. 
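
    A stripped-down sketch of that Google Cloud path is below. For brevity it pulls from Pub/Sub and streams rows straight into BigQuery with the client libraries; in production, Dataflow would sit in the middle as described above. The project, subscription and table names are placeholders.

```python
import json
from google.cloud import pubsub_v1, bigquery

PROJECT = "example-project"             # placeholder
SUBSCRIPTION = "orders-sub"             # placeholder
TABLE = "example-project.sales.orders"  # placeholder, table assumed to exist

bq = bigquery.Client(project=PROJECT)
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(PROJECT, SUBSCRIPTION)

def handle(message):
    # Each Pub/Sub message is assumed to carry one JSON order record.
    row = json.loads(message.data.decode("utf-8"))
    errors = bq.insert_rows_json(TABLE, [row])  # streaming insert
    if not errors:
        message.ack()

# Listen until interrupted; Dataflow would replace this loop in production.
future = subscriber.subscribe(sub_path, callback=handle)
future.result()
```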

     Nothing but Cloud

         You wouldn’t do it of course just because the cool CIO getting all of the attention at the conference said she did, but between us, if you’re thinking about going off-premise cloud, you’re not alone.  More than one top Fortune-ranked company is reducing its in-house Teradata footprint and moving to cloud platforms.  Note: Teradata is now offering Database as a Service (DBaaS) in its own cloud, as well as on public cloud platforms.

         Your plans may be to go all-in on a public cloud platform.  You might end up, however, with some on-premise deployment, even if that’s not the way it started out on the drawing board. So if you end up with an Elasticsearch cluster running on Azure with an on-premise connection to MongoDB inside the firewall, you’re not a loser.  And you can still tell your buddies at the bocce league that you’ve gone cloud.

         Make sure to get architects with experience in designing data flows into Hadoop and on a large-scale cloud platform.  Develop best practices for the day-to-day loads, which may involve MongoDB or Cassandra Collections.  Be prepared to have your team develop use case prototypes for demonstration to upper management.   Data lake technology is still in its early stages, and the skills to extract meaningful business value are in rather short supply.  Deployment guides don’t exist yet.  Aligning the right skills and governance is key to avoid projects getting bogged down.
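
    A minimal sketch of the kind of day-to-day load mentioned above, here into a MongoDB collection; the connection string, field names and file name are placeholders, and the match key is an assumption for illustration.

```python
import csv
from pymongo import MongoClient

# Placeholder connection details and file name.
client = MongoClient("mongodb://analytics-node:27017/")
collection = client["datalake"]["daily_orders"]

def load_daily_extract(path):
    """Daily load: upsert each order document by its (assumed) order_id key."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            collection.replace_one(
                {"order_id": row["order_id"]},  # match key (assumed field)
                row,
                upsert=True,
            )

load_daily_extract("orders_2016_06_27.csv")
```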

    Master the Data Lake

         It’s intriguing that the new term data lake is named after a body of water rather than a place where we store stuff a la the data warehouse.  That lets us use other water terms when the data gets all mucked up or otherwise permeated, for better or worse, enabling us to call them things like data bogs, swamps, creeks and oceans.  Sticking to the metaphor of the lucent body of data, a data lake is similar to the persisted storage area of a data warehouse.  The idea is to store massive amounts of data so that it is readily available.

         Back in the data warehouse glory days, you only wanted to have the really important data immediately available, because you simply could not handle making all of the data persistent.  It ended up though not being such a bad thing, because adding some governance to the process led to some discipline up front about what was more important and what was less important.  True, technology limited storing ‘everything’ from a cost and capabilities perspective.  The collective technology, which we might just call tools, really did not allow us to ‘have it all.’

         Alas, technology has advanced, and it is now cost effective to store gargantuan amounts of data – yes, unstructured too – in data lakes.  But is there a downside?  Time will tell.  The good news is, we get to store a bunch of data we don’t have a clue about.  So maybe, over time, we can make some sense out of it.  It’s sort of like data cryogenics, in that we want to keep the data alive until we find the cure to fix whatever ailments are being caused by something we know so little about.

     Decoder for Big Data & Cloud Terms Referenced

    SaaS – Software as a Service – functional specific on-demand application. Example:  SalesForce.com

    PaaS – Platform as a Service – middleware, virtualized databases, development platforms, operating systems. Examples: Azure and AWS

    IaaS – Infrastructure as a Service – memory, disk, network, virtual machine (computer).  Examples: Amazon EC2, Rackspace, IBM SoftLayer

    API Management Platform - Apigee

    Big Data Platforms – Hadoop, Hortonworks

    Cloud Services Platforms – Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform

    Content Delivery Network (CDN) – Akamai

    Data Ingestion Tool – Flume

    Massively Parallel Processing (MPP) Database – Teradata

    Monitoring and Analyzing Platform - Splunk

    NoSQL Databases – Cassandra, MongoDB

    Performance Analytics – SOASTA

    Search Analytics – Elasticsearch, Google BigQuery

    Web Analytics - SiteCatalyst

    © 2016 RM Dayton Analytics, Ltd. Co.

     This article was originally published 6/27/16 as a CIO Review White Paper entitled ‘Meshing Big Data and The Cloud.’  

    July 9th, 2016

    Download White Paper

     

     
  • Business Analytics in the Travel Industry

    The travel industry makes extensive use of business analytics to drive loyalty, promotions, revenue management – and ultimately, the customer experience.  

    These are not your grandfather’s slice-and-dice analytics; they use a combination of data science, big data and business analytics visualization tools.  Below are a few examples of business modeling in travel and hospitality:

    The Booking Performance Model

    A predictive model of booking performance was redesigned from scratch: adding new functionality while reducing error and the time to update and run the model by twenty-fold compared to the previous version. This model is responsible for determining:

    - Targets and forecasts of bookings and arrivals
    - Calculated at the (book date, arrival date) pair level

    The model is a multiplicative pick-up model, expanded to include the items below (a minimal code sketch follows the list):

    - Stationarization
    - Seasonal decomposition
    - Outlier detection
    - Frequency domain analysis
    - SARIMAX
    - Synthetic booking and cancellation curves
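
    As noted above, here is a minimal SARIMAX sketch using statsmodels; the data file, exogenous regressor and model orders are stand-ins for illustration, not the production model's specification.

```python
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical daily bookings series with weekly seasonality and one
# exogenous regressor (e.g. a promotion flag); names are placeholders.
arrivals = pd.read_csv("arrivals.csv", index_col="date", parse_dates=True)

model = SARIMAX(
    arrivals["bookings"],
    exog=arrivals[["promo_flag"]],
    order=(1, 1, 1),              # non-seasonal (p, d, q), illustrative
    seasonal_order=(1, 1, 1, 7),  # weekly seasonality, illustrative
)
fit = model.fit(disp=False)

# Forecast the next 14 days; reuse the last observed regressor values
# as a stand-in for the future promotion calendar.
future_exog = arrivals[["promo_flag"]].tail(14)
print(fit.forecast(steps=14, exog=future_exog))
```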

    The Strategic Business Simulation

    This model "looks at" the data and business questions in a unique way, to answer questions that are largely immune to statistical and trend extrapolation models.

    The Deep Learning Model

    This is a long short-term memory (LSTM) recurrent neural network (RNN) model. It will extract and predict spatial, temporal, and agent patterns of behavior.  
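
    For readers who want to see the shape of such a model, a minimal Keras sketch is below; the sequence length, feature count and training data are stand-ins, not the production network.

```python
import numpy as np
import tensorflow as tf

# Hypothetical shapes: sequences of 30 daily observations, 8 features each
# (temporal, spatial and agent-level signals), predicting next-day bookings.
TIMESTEPS, FEATURES = 30, 8
X = np.random.rand(256, TIMESTEPS, FEATURES).astype("float32")  # stand-in data
y = np.random.rand(256, 1).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(TIMESTEPS, FEATURES)),
    tf.keras.layers.LSTM(64),   # long short-term memory layer
    tf.keras.layers.Dense(1),   # next-day prediction
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```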

    Rapid Response Analytics

    Improved the rapid response analytics process to give better data faster by creating high-quality, complex data extracts and aggregations that can be quickly customized for unusual data pulls. This saved days of effort on several very urgent descriptive / diagnostic analytics requests, while drastically reducing turnaround time.
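
    A small pandas sketch of the idea: build the heavy aggregation once, then let each urgent request become a quick filter or regroup. The file and column names are assumptions made for illustration.

```python
import pandas as pd

# Hypothetical detail extract; column names are assumptions for illustration.
detail = pd.read_csv("booking_detail.csv", parse_dates=["book_date"])

# Pre-built, reusable aggregation that ad-hoc requests can filter or regroup.
summary = (
    detail
    .groupby(["property", "channel", pd.Grouper(key="book_date", freq="W")])
    .agg(bookings=("confirmation_id", "count"),
         revenue=("room_revenue", "sum"))
    .reset_index()
)

# A one-off "unusual data pull" then becomes a quick filter on the aggregate.
print(summary[summary["channel"] == "OTA"].head())
```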

    Data analytics revolve around these types of business subjects:

    - Sales promotion targeting and performance
    - Booking channels
    - Competitive analysis
    - A/B testing of new initiatives
    - Segmentation of guests, assets, and prices
    - Analysis of pricing strategies

    May 24th, 2017

    © 2017 RM Dayton Analytics, Ltd. Co.

      

  •  

    Appointment Request - Phone Chat with Dayton Analytics


    Thank you for scheduling time on your calendar.  Please request to schedule 20 minutes at one of the available times listed on the calendar below.  If something comes up, please use the confirmation email that you'll receive to reschedule.  

     

  • Strata Data is where cutting-edge science and new business fundamentals intersect.

    The conference is a deep-immersion event where data scientists, analysts and executives get up to speed on emerging technologies.  

    The selected speakers have a focus on the data issues that are shaping the finance, media, fashion, retail, manufacturing and energy worlds.  

    Find 30 minutes on the calendar link, and we'll talk shop on business analytics over the java that you drink.

    The conference is being held in New York, NY US, Sep. 23 - 26, 2019. 

     

     Java Calendar for Strata Data Conference 2019

  •  

    Regardless of how resources are hired, one thing is for certain: knowledge workers must represent the interests of the company – what RM Dayton calls an ‘inside looking out’ mindset.   With a ‘less is more’ philosophy, we dare to ask, ‘why not?’ Have 20 resources on a project? Why not cut that to 10 or fewer, laser-focused, where you retain the prized intellectual capital?

    A business leader at one of the best-known brands in travel and hospitality, remarked after a top gun resource started, ‘he’s really good, but he knows about many things we are not currently doing.’  

    Bingo.  Over one year later, the resource is indispensable. A predictive model of performance was redesigned from scratch: adding new functionality while reducing error and the time to update and run the model twenty-fold compared to the previous version.  They’ve made new forays into strategic business simulation, where the model ‘looks at’ business questions in a unique way, to provide answers largely immune to trend and extrapolation models.  Then there’s that Deep Learning Model, a long short-term memory (LSTM) recurrent neural network (RNN) model that will predict spatial, temporal and agent patterns of behavior.

    The power of one.  Never underestimate it.  Just one person can make an immeasurable difference.

    Companies that previously understood perhaps only a small percentage of their customers well, at discrete points in time, are getting to the point where they know more about more of their customers on a continuum.   Clearly, being on the Big Data sidelines is not an option.

    About a dozen years ago, businesses of all sizes began using offshore resources in droves to reduce costs.  Initially, companies did save money with labor arbitrage.  Today, some companies are re-thinking this decision, deciding to re-shore, rather than have much of their prized intellectual capital locked up in the minds of behemoth offshore-based IT outsourcing organizations.

    Whether companies plunged head first into the gushing Big Data torrent or dipped their toes in with a POC, these IT outsourcing agreements and resources were largely in place.  Naturally, as a convenient convergence of time and circumstance, the hammers were handed to these resources to a large degree to go build some Big Data solutions.

    Some business leaders now wonder whether a commodity approach to their most-prized intellectual capital is truly in their best interest – and in the long run, they wonder, what are the true costs of outsourcing prized strategic and operational knowledge?

    This introspection is pronounced for big data analytics initiatives, where customer engagement is the goal, and success or failure may well foreshadow the ongoing viability and competitiveness of the organization in certain industry verticals.

    The approach of more low-cost resources seemed to work well for outsourcing call centers and technical support - you maybe really did need more bodies just to handle the volume.  But for transformative development leveraging Data Science and Big Data, the right elbow reflex of the non-focused IT outsourcing firms – who want to sell you more – may not be the right prescription. 

    RM Dayton knowledge worker engagements consist of fewer, but laser-focused resources, without the fluff - an approach that has earned recognition as one of The Silicon Review 50 Most Admired Companies of the Year.

    RM Dayton Analytics is singularly focused on Big Data, Data Science and Business Analytics.  A simple, yet powerful, mindset provides for a more effective and efficient approach to providing a world class experience for our customers.   

    Combining an agile approach with the firepower of those who dare to ask, ‘why not?’, the firm is able to deliver transformative results, providing strategic and operational advanced analytics intellectual capital at scale and speed, from the inside looking out.   RM Dayton has earned several accolades, including 20 Most Promising Enterprise Performance Management Solution Providers 2016, 100 Most Promising Big Data Solution Providers 2017 and 50 Most Admired Companies of the Year 2017.

    June 1st, 2017

    © 2017 RM Dayton Analytics, Ltd. Co.

      

  •  

    Cross-Enterprise AI vs. Intelligent Automation: 

    What's the Difference?

     

    As artificial intelligence (AI) has increased in efficiency and spread across the country and across the world, more and more managers have turned towards AI solutions to improve their processes and become more agile. The technology has drastically transformed the workplace, making it a much more intelligent, accurate, and productive space than it was even ten years ago.  

    Two of the most prominent systems used by companies today are Robotic Process Automation (RPA) and Cross-Enterprise AI. Organizations seeking to employ these systems often know that they can greatly improve internal operations, but not to what extent; in many cases, without extensive implementation consulting, they aren’t even sure how to properly use these systems and install them in a value-maximizing way. RPA and Cross-Enterprise AI are extremely powerful, but their actual purpose must be understood, and proper expectations must be set, in order to fully harness them.

    Strictly speaking, RPA is not a form of AI, as it represents more automation and performance of prescribed tasks than the use of data to create new information. RPA is designed to perform the same actions a human would through user interface and descriptor technologies. RPA operates in place of a human via a “bot” that uses various pathways and triggers. It does not create new data or enhance existing business models, but rather accelerates the timeline upon which those models operate.
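
    A toy, self-contained sketch of that distinction is below: the “bot” simply executes prescribed rules and triggers over records, exactly as a person following a checklist would. Every system and record in it is a stand-in, and nothing in it learns or creates new information.

```python
# Stand-in "systems"; in a real deployment these would be application UIs or
# APIs the bot keys data into. Nothing here learns -- it only follows rules.
source_system = [
    {"invoice_id": "INV-001", "amount": 420.00, "approved": True},
    {"invoice_id": "INV-002", "amount": 9800.00, "approved": False},
]
target_system = []     # stand-in for the system of record the bot posts into
exceptions_queue = []  # items routed to a human, a typical RPA trigger

def run_bot():
    for record in source_system:
        if record["approved"] and record["amount"] < 5000:
            # Prescribed action: re-key the record into the target system.
            target_system.append({"id": record["invoice_id"],
                                  "posted_amount": record["amount"]})
        else:
            # Prescribed trigger: anything outside the rule goes to a human.
            exceptions_queue.append(record["invoice_id"])

run_bot()
print(target_system, exceptions_queue)
```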

    Download the Executive Briefing White Paper to learn about the key differences between Cross-Enterprise AI and RPA.

     

    I'd Like to Read this White Paper

     

     
