Senior Data Architect

Overview

Acuity Knowledge Partners is a leading provider of high-value research, analytics, and business intelligence to over 500 financial institutions and consulting companies, delivered through our specialist workforce of over 6,000 analysts and delivery experts across our global delivery network.

Job Description

We are seeking a highly skilled Data Architect to join our Financial Business Automation Products group.

The ideal candidate will have a deep understanding of data architecture principles, extensive experience with data modeling, and the ability to design and implement scalable data solutions in financial domains. As a Data Architect, you will develop and maintain the data strategy, governance, security, and availability for the suite of products developed by Acuity and used by Acuity clients. The individual must be familiar with Gen AI technologies and adept at creating a data strategy that allows Gen AI-based LLMs to use the data infrastructure to deliver relevant outcomes.

Key Responsibilities

• Platform Design and Architecture:

o Lead the design and development of the data platform architecture, ensuring scalability, performance, reliability, and security.

o Define and implement standards for data modeling, data integration, and data lifecycle management.

o Apply end-to-end knowledge of the modern data platform stack to build large-scale data and AI solutions.

o Create blueprints for data pipelines, data lakes, data warehouses, and analytical systems.

o Provide technical leadership in choosing appropriate technologies for data processing, cloud compute, and storage solutions.

• Technical Solutions and Roadmap:

o Influence enterprise architecture design conversations and deliver sophisticated data solutions.

o Work closely with leaders, data engineers, data scientists, and analysts to define and refine data platform requirements.

o Lead cross-functional teams to develop and integrate new data products and solutions.

o Understand business needs and translate them into data solutions and an architecture roadmap that add value to the organization.

• Cloud Usage and Governance:

o Design and implement cloud-based solutions for data processing and storage (e.g., Azure, Snowflake, Databricks, GCP).

o Optimize cloud resources for cost efficiency, performance, and availability.

o Ensure the security and compliance of data platforms, addressing regulatory and privacy concerns.

o Develop strategies to enforce data governance policies, ensuring data quality, consistency, and integrity across systems.

o Design data security measures and control access to sensitive data through role-based access and encryption.

• Innovation & Continuous Improvement:

o Stay up-to-date with emerging technologies and trends in data architecture, big data, cloud computing, and AI.

o Recommend and lead initiatives to improve the performance, scalability, and efficiency of data processing and storage systems.

o Act as the data architecture subject matter expert, driving innovation for the company.

• Documentation and Technical Design:

o Produce detailed documentation for platform architecture, data models, and data workflows.

o Be well versed in technical design, diagrams, and documentation tools.

• AI Integration:

o Collaborate with AI/ML teams to ensure data architectures support Gen AI initiatives.

o Enable real-time and batch data processing for AI model training and deployment.

Technology and Tools:

• 10+ years of experience in designing and implementing end-to-end data platforms, including data lakes, data warehouses, and data integration pipelines.

• Experience designing and developing low-latency, high-throughput, enterprise-grade data architecture ecosystems

• Knowledge of relational and non-relational databases, and big data technologies (e.g., Hadoop, Spark, Kafka).

• Expertise in cloud platforms and tooling such as Azure, Snowflake, Databricks, GitHub, and Jenkins

• Strong knowledge of ETL processes and tools for real-time data processing

• Proficiency in building data solutions using tools such as Apache Kafka, Apache Airflow, dbt (data build tool), and Python

• Strong understanding of SQL and data querying best practices

• Proficiency in managing and deploying solutions on cloud platforms such as Azure, Snowflake, Databricks

• Experience with data encryption, privacy, and security best practices, including GDPR compliance

• Excellent problem-solving and communication skills

• Strong scripting skills in Python, Shell, or similar languages for automation and process optimization

• Familiarity with CI/CD pipelines, version control (Git), and deployment automation tools (Jenkins, Terraform)

• Familiarity with BI tools such as Tableau, Power BI, or Looker, as well as experience working with data scientists and analysts to support analytical workloads

Qualifications:

• Education:

o Bachelor’s degree in Computer Science, Information Technology, or a related field. Master’s degree preferred.

• Experience:

o 10+ years of overall experience, with at least 7 years in data architecture, data modeling, and database design.

o Proven experience with data warehousing, data lakes, and big data technologies.

o Expertise in SQL and experience with NoSQL databases.

o Experience with cloud platforms (e.g., AWS, Azure) and related data services.

• Skills:

o Strong understanding of data governance and data security best practices.

o Excellent problem-solving and analytical skills.

o Strong communication and interpersonal skills.

o Ability to work effectively in a collaborative team environment.

o Leadership experience with a track record of mentoring and developing team members.

• Certifications:

o Relevant certifications (e.g., Certified Data Management Professional, AWS Certified Big Data – Specialty) are a plus.

Behavioral Competencies

• A self-starter, excellent planner and executor, and above all, a good team player

• Excellent communication and interpersonal skills are a must

• Must possess organizational skills, including the ability to multi-task, set priorities, and meet deadlines.

• Ability to build collaborative relationships and effectively leverage networks to mobilize resources.

• An interest in and the initiative to learn the business domain is highly desirable.

• Comfortable in a dynamic environment with constantly evolving requirements.

Skills & Requirements

Data Modelling, Data Architecture, Designing and Implementing End-to-End Data Platforms, Big Data Technologies, BI Tools, Cloud Platforms.

Apply Now
