AWS Data Architect

Overview

We are seeking an experienced AWS Data Architect with over 10 years of experience in modern data ecosystems, particularly on AWS/cloud platforms. The role involves designing technical architectures, setting software engineering standards, and leading team meetings and workshops. Key responsibilities include working with AWS ETL/File Movement tools, using AWS databases, programming in Python and Spark, and utilizing automation tools like Apache Airflow and Ansible. The ideal candidate should possess strong analytical and problem-solving skills, excellent communication abilities, and experience with quality, compliance, and security models on large datasets.

Job Description

Roles and responsibilities:

Work closely with the Product Owners and stakeholders to design the technical architecture for the data platform, meeting the requirements of the proposed solution.

Work with the leadership to set the standards for software engineering practices within the machine learning engineering team and support across other disciplines.

Play an active role in leading team meetings and workshops with clients.

Choose and use the right analytical libraries, programming languages, and frameworks for each task.

Help the Data Engineering team produce high-quality code that allows us to put solutions into production.

Create and own the technical product backlogs for products, and help the team close backlog items on time.

Refactor code into reusable libraries, APIs, and tools.

Help us to shape the next generation of our products.

What We’re Looking For:

10+ years of total experience in data management, including the implementation of modern data ecosystems on AWS/cloud platforms.

Strong experience with AWS ETL/file movement tools (Glue, Athena, Lambda, Kinesis, and other AWS integration services).

Strong experience with Agile development, CloudFormation templates, AWS CodeBuild, and AWS CodePipeline.

Strong experience with two or three AWS database technologies (Redshift, Aurora, RDS, S3, and other AWS data services), covering security, policies, and access management.

Strong programming experience with Python and Spark.

Experience with Apache Airflow, Ansible, and other automation tools.

Excellent oral and written communication skills.

A high level of intellectual curiosity, an external perspective, and an interest in innovation.

Strong analytical, problem-solving, and investigative skills.

Experience in applying quality and compliance requirements.

Experience with security models and development on large datasets.

Skills & Requirements

Technical Architecture Design, AWS ETL/File Movement Tools, Agile Development, AWS CodeBuild & CodePipeline, AWS Database Technologies, Python Programming, Apache Spark, Apache Airflow, Ansible Automation, Communication Skills

Apply Now
