AWS Cloud Engineer

at Strategic Staffing Solutions
Published June 16, 2022
Location Charlotte, NC
Category Default  
Job Type Contractor  

Description

STRATEGIC STAFFING SOLUTIONS HAS AN OPENING!

Strategic Staffing Solutions is currently looking for an AWS Cloud Engineer for a contract opening with one of our largest clients, located in Charlotte, NC!

This is a Contract Opportunity with our company that MUST be worked on a W2 Only. No C2C eligibility for this position. Visa sponsorship is available! The details are below.

Location: Charlotte, NC (Remote)

Duration:  12 Months

To apply: Please email your resume in Word format to Bob Cromer at [Click Here to Email Your Resumé] and reference Job Order #203269, or click the Apply button.

Job Description

  • Support or collaborate with application developers, database architects, data analysts and data scientists to ensure optimal data delivery architecture throughout ongoing projects/operations.
  • Design, build, and manage analytics infrastructure that can be utilized by data analysts, data scientists, and non-technical data consumers, enabling the analytics functions of the big data platform.
  • Develop, construct, test, and maintain architectures, such as databases and large-scale processing systems that help analyze and process data in the way the Analytics organization requires.
  • Develop highly scalable data management interfaces, as well as software components by employing programming languages and tools.
  • Work closely with a team of Data Science staff to take existing or new models and convert them into scalable analytical solutions.
  • Design, document, build, test and deploy data pipelines that assemble large complex datasets from various sources and integrate them into a unified view.
  • Identify, design, and implement operational improvements: automating manual processes, data quality checks, error handling and recovery, re-designing infrastructure as needed.
  • Create data models that will allow analytics and business teams to derive insights about customer behaviors.
  • Build new data pipelines, identify existing data gaps and provide automated solutions to deliver analytical capabilities and enriched data to applications.
  • Responsible for obtaining data from the System of Record and establishing batch or real-time data feed to provide analysis in an automated fashion.
  • Develop techniques supporting trending and analytic decision-making processes.
  • Apply technologies for a responsive front-end experience.
  • Ensure systems meet business requirements and industry practices.
  • Research opportunities for data acquisition and new uses for existing data.
  • Develop data set processes for data modeling, mining, and production.
  • Integrate data management technologies and software engineering tools into existing structures.
  • Employ a variety of languages and tools (e.g., scripting languages).

Core Technical Skills

  • 3 years of AWS experience
  • Hands-on experience with AWS services: S3, EMR, Glue jobs, Lambda, Athena, CloudTrail, SNS, SQS, CloudWatch, Step Functions
  • Extensive experience with AWS data services such as the Glue Data Catalog, Lake Formation, Redshift, DynamoDB, and Aurora

Required Tools and Languages  

  • Python, Spark, PySpark, and Pandas
  • Infrastructure as Code technology: Terraform/CloudFormation
  • Experience with secrets management platforms such as HashiCorp Vault and AWS Secrets Manager
  • Experience with DevOps pipelines (CI/CD): Bitbucket, Concourse
  • Experience with RDBMS platforms and strong proficiency with SQL
  • Deep knowledge of IAM roles and policies
  • Experience using AWS monitoring services such as CloudWatch, CloudTrail, and CloudWatch Events
  • Experience with workflow orchestration tools such as Airflow or AWS Step Functions
  • Experience with Kafka/messaging, preferably Confluent Kafka
  • Experience with event-driven architecture

Desired Technical Skills

  • Experience with native AWS technologies for data and analytics such as Kinesis and OpenSearch
  • Databases: DocumentDB, MongoDB
  • Hadoop platform (Hive, HBase, Druid)
  • Java, Scala, Node.js
  • Workflow automation
  • Experience transitioning on-premises big data platforms to cloud-based platforms such as AWS
  • Strong background in Kubernetes, distributed systems, microservice architecture, and containers
  • Experience with REST APIs and API Gateway
  • Deep understanding of networking: DNS, TCP/IP, and VPN

Core Responsibilities

  • Provides technical direction, guides the team on key technical aspects, and is responsible for product tech delivery
  • Leads the design, build, test, and deployment of components
  • Collaborates with lead developers (Data Engineer, Software Engineer, Data Scientist, Technical Test Lead)
  • Understands requirements and use cases to outline technical scope and lead delivery of the technical solution
  • Confirms the required developers and skill sets specific to the product
  • Provides leadership, direction, peer review, and accountability to developers on the product (key responsibility)
  • Works closely with the Product Owner to align on delivery goals and timing
  • Assists the Product Owner with prioritizing and managing the team backlog
  • Collaborates with Data and Solution Architects on key technical decisions and on the architecture and design needed to deliver the requirements and functionality

Core Experience and Abilities

  • Ability to perform hands-on development and peer review for certain components/tech stack on the product
  • Standing up development instances and migration paths (with required security and access/roles)
  • Develop components and related processes (e.g., data pipelines and associated ETL processes, workflows)
  • Lead implementation of an integrated data quality framework
  • Ensures optimal framework design and load-testing scope to optimize performance (specifically for big data)
  • Supports data scientists with testing and validation of models
  • Performs impact analysis and identifies risks from design changes
  • Ability to build new data pipelines, identify existing data gaps, and provide automated solutions to deliver analytical capabilities and enriched data to applications
  • Ensures test-driven development

Requirements

  • Degree in Computer Science, Engineering, or a related field
  • 8-15 years of experience
  • 3 years of experience leading teams to deliver complex products
  • Strong technical skills and communication skills
  • Strong skills with business stakeholder interactions
  • Strong solutioning and architecture skills
  • 5 years of experience building real-time data ingestion streams (event-driven)
  • Ensure data security and permissions solutions, including data encryption, user access controls, and logging

$$ WE OFFER A REFERRAL FEE FOR ANYONE REFERRED & HIRED WITH S3! $$

Strategic Staffing Solutions (S3), based in Detroit, Michigan, prides itself on being an international, woman-owned, $300 million IT and Business Services Corporation with 30 years of service. We are ranked 16th among the largest staffing firms in the US by Staffing Industry Report, 6th largest IT Diversity staffing firm, and are one of five companies nationally certified as a Charter Partner with Staffing Industry Analysts. S3 provides IT consulting, customized project solutions, vendor management programs and executive search services to financial institutions, insurance, energy, oil/gas, telecommunication, government, retail, and health care industries worldwide.  We have more than 3,600 consultants and 31 offices in the US and Europe. S3 is also proud to be nationally recognized as both a Military Friendly and Military Spouse Friendly Employer.
