ALIGNING TOMORROW'S KNOWLEDGE WORKER™            

Sr. Cloud Solutions Architect

About the job

The Sr. Cloud Solutions Architect brings the business acumen needed to interface directly with key stakeholders and understand their business challenges. You will be responsible for designing and implementing data architectures, data models, and data integration processes using the latest technologies across the cloud data platform, because your clients will need a flexible data architecture that quickly turns data of many types into meaningful, actionable intelligence.

Artificial intelligence (AI) is on the cusp of becoming the core of intelligent, human-centric business decisions. By separating the signal from the noise in customer needs and preferences, organizations can understand exactly what services, products, and experiences their consumers need. Within our AI & Analytics practice, we work to design the future of work: a future in which ad hoc, hit-or-miss business decisions are replaced by informed choices supported by ever more effective business models. RMDA surfaces insights buried in data and gives businesses a clear way to transform how they consume not only their own information, but also secure data clean room information that other organizations have agreed to share. Our mission is to help leading companies scale their AI and analytics initiatives in ways to which they may not be accustomed.

The position:

Beyond working directly with key stakeholders to understand their business challenges, the Sr. Cloud Solutions Architect has the skills and vision required to translate those needs into world-class technical solutions using the latest technologies.

This is a hands-on role on a development team responsible for building data engineering solutions for clients using cloud-based data platforms. The architect works closely with clients and takes the lead on day-to-day development. In this role, you need to be equally skilled with the whiteboard and the keyboard.

Functions:
  • Develop and deploy software networks.
  • Understand the current application infrastructure and suggest changes to it.
  • Set up a monitoring stack with alerts and dashboards.
  • Automate deployments and configuration management.
  • Help design provisioning and advise clients on what can be provisioned.
  • Secure the tech stack along with the team.
  • Understand data ingestion, downstream ML, and security.

Tech Skills:  Experience with AWS, Azure, and/or Google Cloud, including infrastructure-as-code and CI/CD tooling such as CloudFormation, CodePipeline, and CodeBuild.  Strong cloud data management and automation expertise.
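By way of illustration only, the kind of monitoring and alerting automation described above might look like the following minimal boto3 sketch; the alarm name, metric, function name, and SNS topic ARN are placeholders, not details of this role.

```python
# Illustrative only: create a CloudWatch alarm with boto3.
# Alarm name, metric, dimensions, and SNS topic ARN are placeholders.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

cloudwatch.put_metric_alarm(
    AlarmName="etl-load-failures",                                    # hypothetical alarm
    Namespace="AWS/Lambda",
    MetricName="Errors",
    Dimensions=[{"Name": "FunctionName", "Value": "nightly-load"}],   # placeholder function
    Statistic="Sum",
    Period=300,                                                       # 5-minute windows
    EvaluationPeriods=1,
    Threshold=1,
    ComparisonOperator="GreaterThanOrEqualToThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:data-alerts"],  # placeholder topic
)
```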

This role will also provide technical subject matter expertise for RM Dayton Analytics sales and account teams during the scoping of new cloud data platform opportunities.

This is a remote position, open to any qualified applicant in the United States, with the ability to travel to client sites as needed.  Transfer of sponsorship or support for US work authorization is not available for this position.

Practice – Cloud Data Platform


What will I be doing?

The primary focus of this position is to provide design and development support to clients.  This is a hands-on, intensive role, working side by side with our end-user partners to drive better analytics, better reporting, and, most importantly, client competitive advantage.

Key responsibilities include:

  • Work primarily with architects, and at times with business partners and data science teams, to understand business context and craft best-in-class solutions to their toughest problems.
  • Provide data modeling, process modeling, and data mart design support.
  • Create Python and SQL scripts in support of data platform load and batch processes.
  • Design and deploy consolidation and summary tables as required within the data warehousing environment.
  • Perform periodic performance assessments of the automated load processes.
  • Be proactive in identifying and resolving issues.
  • Provide specialized support for legacy platforms and EDWs.
  • Document deliverables.
  • Standardize deliverables.
  • Create robust, automated pipelines to ingest and process structured and unstructured data from source systems into analytical platforms, using batch and streaming mechanisms and a cloud-native toolset (a minimal sketch follows this list).
  • Implement automation to optimize data platform compute and storage resources.
  • Develop and enhance end-to-end monitoring of cloud data platforms.
  • Implement custom data applications as required to deliver actionable insights.
  • Provide regular status updates to all relevant stakeholders.
  • Participate in developing project plans and timelines, and provide estimates.
  • Provide hands-on technical assistance in all aspects of data engineering design and implementation, including data ingestion, data models, data structures, data storage, data processing, and data monitoring at scale.
  • Develop data engineering best practices with consideration for high data availability, fault tolerance, computational efficiency, cost, and quality.
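Much of the load and pipeline work described above comes down to straightforward scripting. As a flavor of what that might look like, here is a minimal Python sketch of a batch load into a cloud data warehouse; Snowflake is used purely as an example, and the connection parameters, stage, and table names are hypothetical.

```python
# Illustrative batch-load sketch only; connection details, stage, and table
# names are placeholders, and error handling is reduced to the essentials.
import snowflake.connector

def load_daily_batch(conn_params: dict, stage: str, table: str) -> int:
    """Copy staged files into a target table and return the resulting row count."""
    conn = snowflake.connector.connect(**conn_params)
    try:
        cur = conn.cursor()
        # Load logic lives in SQL so transformations stay close to the data.
        cur.execute(
            f"COPY INTO {table} FROM @{stage} "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1) ON_ERROR = 'ABORT_STATEMENT'"
        )
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]
    finally:
        conn.close()
```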


What are we looking for?

    • Cloud data warehouse experience; Snowflake is a plus.
    • Must have demonstrable working knowledge of modern information delivery and management practices for on-premises and cloud environments.
    • Must have demonstrable experience delivering robust information delivery and management solutions as part of a fast-paced data platform program.
    • MUST be able to apply business rules and logic using SQL scripts (a small illustrative sketch follows this list).
    • Must have working knowledge of various data modeling techniques (3NF, denormalized, star schema).
    • Position requires a self-starter, capable of quickly turning around vaguely defined projects with minimal supervision or assistance.
    • Ability to conduct analysis of source data sets to achieve target data set objectives.
    • STRONG VERBAL/WRITTEN COMMUNICATION is a MUST. Interacting with business community/users is a core requirement of the role.
    • Ability to travel an average of one to two weeks per month if necessary.
    • Candidate MUST possess a STRONG INITIATIVE.
    • Candidate MUST be able to run with a project on their own, starting from as little as a few sentences of direction.
    • Prior consulting experience is a plus.
    • Experience migrating data from an on-premises database to a cloud-based data warehouse platform is a strong plus.
    • A candidate with experience working with terabyte-sized data warehouses and complex ETL mappings that process 50+ million records per day is strongly preferred.
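To make the SQL and data modeling expectations above concrete, here is a small, self-contained sketch of a star schema with one business rule applied in SQL. SQLite is used only so the example runs anywhere; the tables, columns, and rule are invented for illustration.

```python
# Illustrative only: a tiny star schema and one business rule expressed in SQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, segment TEXT);
    CREATE TABLE fact_orders  (order_key INTEGER PRIMARY KEY,
                               customer_key INTEGER REFERENCES dim_customer(customer_key),
                               order_amount REAL);
    INSERT INTO dim_customer VALUES (1, 'retail'), (2, 'internal_test');
    INSERT INTO fact_orders  VALUES (10, 1, 250.0), (11, 1, 40.0), (12, 2, 999.0);
""")

# Business rule: exclude internal test accounts; flag orders over 100 as high value.
rows = conn.execute("""
    SELECT f.order_key,
           CASE WHEN f.order_amount > 100 THEN 'high_value' ELSE 'standard' END AS order_tier
    FROM fact_orders f
    JOIN dim_customer d ON d.customer_key = f.customer_key
    WHERE d.segment <> 'internal_test'
    ORDER BY f.order_key
""").fetchall()

print(rows)  # [(10, 'high_value'), (11, 'standard')]
```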


To fulfill this role successfully, you should demonstrate the following minimum qualifications:

  • Four-year college degree in Computer Science, Information Technology, or equivalent demonstrated experience. Master's degree preferred.
  • Strong SQL development skills using databases such as Oracle and Vertica.
  • Experience with cloud databases such as Snowflake, BigQuery, or Redshift is a plus.
  • Experience with core AWS services such as EC2 and S3.
  • Certification, preferably in GCP or another cloud data or big data platform.
  • 4+ years of experience in the data and analytics space.
  • Experience with workload automation tools such as Airflow or Autosys (a minimal sketch follows this list).
  • 6+ years of experience with RDBMS concepts, with strong data analysis and SQL skills.
  • 3+ years of proficiency with Linux command-line tools and bash scripting.
  • Google Cloud Professional Cloud Architect, AWS Certified Solutions Architect – Associate, AWS Certified Solutions Architect – Professional, or Azure Solutions Architect Expert certification.
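As an illustration of the workload automation experience mentioned above, a minimal Airflow DAG (assuming Airflow 2.4 or later) might look like the sketch below; the DAG id, schedule, and task bodies are placeholders.

```python
# Illustrative only: a minimal Airflow DAG with two placeholder tasks.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from a source system")       # placeholder task body

def load():
    print("load staged data into the warehouse")  # placeholder task body

with DAG(
    dag_id="nightly_warehouse_load",   # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",              # run nightly at 02:00
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task          # extract runs before load
```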

It would be useful in this position for you to demonstrate the following capabilities and distinctions:

  • Downstream ML knowledge
  • Prior working experience with a data science workbench.
  • Data modeling experience is a plus.
  • Nine (9)+ years of professional experience in a technology-related field.
  • Seven (7)+ years of experience with big data technologies and platforms such as Hadoop, Redshift, Scala, Spark, or Hive.
  • Snowflake experience is a plus.
  • SnowPro Core Certification or SnowPro Advanced: Architect Certification is a plus.
  • Master’s degree in a quantitative discipline is preferred.


About RM Dayton Analytics

A little bit about us…

Founded in 2014, RM Dayton Analytics is a professional services company. We transform clients’ business, operating, and technology models for the digital era.  As a Snowflake and Google Cloud partner, our mission is to help enterprises accelerate innovation by harnessing the power of the Cloud Data Platform. Our associates provide superior domain and technology expertise to drive business outcomes in a converging world.  Our consultative approach helps our customers build more innovative and efficient businesses.

RM Dayton Analytics is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law.

Job Category: Cloud Data Platform
Job Location: United States

Apply for this position
