Acuity Knowledge Partners is a leading provider of high-value research, analytics, and business intelligence to over 500 financial institutions and consulting companies, delivered through our specialist workforce of over 6,000 analysts and delivery experts across our global delivery network.
Job Summary:
To design, implement, and optimize scalable data architectures that support complex financial data workflows, enabling analytics and insights in a highly dynamic environment. The role requires prior experience in a data architect role, hands-on expertise in Databricks, and a strong understanding of fixed income instruments or capital markets.
Desired Skills and Experience:
Qualifications:
Technical Expertise:
Proven hands-on experience with Databricks and Apache Spark.
Strong knowledge of cloud platforms (Azure, AWS, or GCP) and data lake architectures.
Proficiency in programming languages like Python and SQL.
Experience with database systems such as MongoDB, PostgreSQL, or SQL Server.
Deep understanding of data governance.
Domain Knowledge:
Strong understanding of fixed income instruments (e.g., bonds, credit default swaps) and capital market workflows.
Familiarity with market data sources, trading systems, and financial reporting.
Education & Experience:
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
8+ years of experience in data architecture or engineering, with 3+ years of hands-on experience in Databricks.
Previous experience in the financial services sector, particularly with capital markets or asset management.
Soft Skills:
Excellent problem-solving and analytical skills.
Strong communication and collaboration abilities to work across business and technical teams.
Preferred Qualifications:
Experience with Delta Lake for transactional data processing.
Familiarity with fixed income risk analytics and regulatory requirements.
Certification in Databricks, cloud platforms, or related technologies.
Key Responsibilities:
Data Architecture & Design:
Design and implement modern, scalable data architectures using Databricks.
Define and maintain data models, pipelines, and governance strategies tailored to capital market datasets.
Build robust ETL processes and optimize data workflows for performance and reliability.
Data Engineering & Integration:
Develop and manage end-to-end data pipelines on Databricks, integrating diverse data sources (structured and unstructured).
Leverage tools like Spark, Delta Lake, and Python/Scala for data transformation and enrichment.
Ensure compliance with data security and privacy regulations (e.g., GDPR, CCPA).
Domain-Specific Knowledge:
Utilize expertise in fixed income instruments (e.g., bonds, TIPS, interest rate swaps) or capital markets to design data solutions aligned with business needs.
Collaborate with quantitative analysts, portfolio managers, and traders to support analytics and decision-making.
Technology Leadership:
Provide technical guidance to data engineering and analytics teams.
Stay updated on the latest trends in Databricks and capital markets, driving innovation.
Implement best practices for data architecture, quality assurance, and DevOps in a cloud environment.
Stakeholder Collaboration:
Partner with business leaders, technology teams, and data scientists to align data strategies with organizational objectives.
Prepare documentation, architecture diagrams, and operational guidelines for cross-team collaboration.
Databricks, Apache Spark (capital markets / fixed income domain expertise is a must).