Qubiqon LLC is an innovative US-based company with a robust presence in the US, the Middle East, and India. We are a collective venture initiated by industry veterans, bringing together over eighty years of combined expertise in application development, data management, automation, and cloud technology.
Purpose of the Job
This role is responsible for managing and maintaining the data-management technologies used for ongoing operations and support, such as data quality, ETL, data integration, master data management, and data warehouse environments, ensuring that agreed service levels are met.
Dimensions
Customers: All PDO staff and contractors
Interfaces: Data management architects, IDD operations teams, EP Data Management groups, and PDO Assets and Functions.
Systems:
Primary: Data management technologies such as Microsoft SQL Server Big Data Clusters, SQL Server Integration Services (SSIS), and Master Data Services (MDS); Kafka for real-time integration with source systems; and PySpark programming.
Secondary: All data-producing, operational applications that serve as data sources for Bayanat (Data Lake), as well as all resulting datamart interfaces and consuming applications for business intelligence. All data management solutions that directly or indirectly affect the master data model of PDO.
Experience/Qualifications Needed
• Graduate degree in Computer Science or a related field, with a minimum of 7 years of experience.
• Knowledge of E&P processes and data and information management
• Experience in Microsoft SQL Server 2019
• Experience in SSIS administration
• Experience in Data warehouse and data lake environments
• Experience in Linux administration
• Experience in SQL, SQL scripting
• Experience in PySpark scripting
• Experience in Kafka administration
• Organizational and problem-solving skills
Principal Accountabilities
Administration activities
• Managing day-to-day administration of data management technologies such as SSIS, MDS, MDM, Kafka, Python, and the Data Lake.
• Managing the implementation of data management application(s) upgrades, infrastructure changes, and/or data migrations.
• Plan and schedule installation, configuration, and deployment activities; organize and execute functional, data-integrity, and integration testing.
• Manage and support the installation, configuration, maintenance, patching, and upgrading of data management applications.
• Monitor data management applications
• For given scope, ensure Service Level Agreement(s) (SLAs) are met, Key Performance Indicators (KPIs) are delivered, applicable ITIL processes are followed, and critical business needs are responded to appropriately.
• Build and maintain relationships with business resources, peers within IDD, and key service providers.
• Enforce HSE, IRM, and all other compliance requirements across the team.
• Document application functionalities, processes, and user and maintenance procedures.
Extract, Transform, Load activities
• Responsible for developing ETL/replication solutions using SQL Server Integration Services (SSIS), Kafka, and PySpark, and for setting up processing structures to facilitate periodic (monthly/daily/real-time) loads.
• Responsible for supporting the ETL and data replication processes that continuously collect data from multiple client systems, and the subsequent integration of that data into client repositories.
• Provide support in maintaining data integration pipelines. Proactively manage and monitor data integration/orchestration flows and scheduled tasks.
• Responsible for optimizing performance to support near-real-time data processing.
• Gather requirements related to either system and/or process change(s) by meeting with customers and listening actively.
• Help data warehouse end-users understand and query data to fulfill their business needs.
• Responsible for implementing data validation based on business requirements.
• Provide highly responsive analysis of issues, diagnosing and resolving them with a sense of urgency.
• Work collaboratively as part of the DM team.
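For illustration only, the periodic merge-style loads described above can be sketched in plain Python (standing in for the SSIS/PySpark tooling the role actually uses); all table, key, and column names here are hypothetical examples, not part of the PDO environment.

```python
# Hypothetical sketch of an incremental (upsert-style) load step, the kind of
# processing a periodic monthly/daily ETL run performs: newer source rows
# replace existing target rows matched by key.
def incremental_load(target, source_rows, key="id", ts="updated_at"):
    """Merge source_rows into target (a dict keyed by `key`), keeping the
    newest version of each record based on the `ts` timestamp field."""
    for row in source_rows:
        existing = target.get(row[key])
        if existing is None or row[ts] > existing[ts]:
            target[row[key]] = row
    return target

# Example: a daily batch updates one existing record and inserts a new one.
warehouse = {1: {"id": 1, "name": "well-A", "updated_at": "2024-01-01"}}
batch = [
    {"id": 1, "name": "well-A-renamed", "updated_at": "2024-02-01"},
    {"id": 2, "name": "well-B", "updated_at": "2024-02-01"},
]
warehouse = incremental_load(warehouse, batch)
```

In a real pipeline the same merge logic would typically be expressed as an SSIS data flow or a PySpark join against the Data Lake, with the timestamp comparison driving change-data-capture behaviour.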
Enterprise Data Quality activities
• Support the business in conducting data quality profiling using SQL Server Integration Services (SSIS), Kafka, PySpark, and Information Quality Metrics.
• Support the business in implementing data quality rules for standardizing and cleansing data.
• Perform business requirement gathering.
• Support the business in generating data quality reports.
• Document test plans and integration plans, and take the lead to perform these tests.
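As an illustrative sketch only, the column-level profiling mentioned above can be expressed in plain Python; the sample field names (`well_id`, `depth`) are hypothetical, and a production version would run in PySpark or SSIS over the actual datasets.

```python
# Hypothetical sketch of data quality profiling: compute a completeness
# metric (fraction of non-empty values) per column -- one of the simplest
# Information Quality Metrics reported in a data quality report.
def profile_completeness(rows, columns):
    """Return {column: completeness ratio in [0, 1]} over a list of row dicts."""
    total = len(rows)
    metrics = {}
    for col in columns:
        filled = sum(1 for r in rows if r.get(col) not in (None, ""))
        metrics[col] = filled / total if total else 0.0
    return metrics

# Example: profile a small sample with one missing value in each column.
sample = [
    {"well_id": "W1", "depth": 1200},
    {"well_id": "W2", "depth": None},
    {"well_id": "", "depth": 900},
]
report = profile_completeness(sample, ["well_id", "depth"])
```

Rule-based standardization and cleansing would build on the same pattern: each data quality rule is a predicate over a row, and the report aggregates pass/fail counts per rule.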
Challenges
• Managing competing priorities while maintaining sufficient technical depth and breadth is a key challenge.
• Maintaining in-depth technical skills across a range of rapidly changing data management technologies, in order to respond confidently and competently to globalisation and rapidly growing business demands.
Key Skills
Data management technologies, Microsoft SQL Server, Big Data Clusters, SQL Server Integration Services (SSIS), Master Data Services (MDS), Kafka, PySpark programming, SQL, SQL scripting, Linux administration, data warehouse and data lake environments, ETL processes, data replication, real-time data integration, data quality profiling, Information Quality Metrics, business requirement gathering, data validation, data integration pipelines, performance optimization, data management application upgrades, data migrations, Service Level Agreement (SLA) management, Key Performance Indicator (KPI) delivery, ITIL processes, HSE compliance, IRM compliance, documentation, troubleshooting, data cleansing, data standardization, business intelligence, application installation and configuration, application patching and upgrading, collaboration.