Senior Data Engineer Job at Emirates, Dubai, UAE

Emirates is looking for a Senior Data Engineer in Dubai, UAE

Job Title:

Senior Data Engineer – Quality Automation (12-Month Contract)

Job Purpose:

The Senior Data Quality & Automation Engineer is a fully participating member of a cross-functional team working autonomously on technology development and problem resolution in the Enterprise Data & Analytics space. The role involves hands-on contribution to quality practices along with designing and implementing data quality and automation platforms. The engineer will also support technical analytics solutions and products that serve Emirates Airlines and the Emirates Group businesses.

Job Outline:

·        Work closely with product owners, analysts, software engineers and architects to understand the technical landscape and context of deliveries, refine complex functional and non-functional requirements, and translate them into fit-for-purpose acceptance tests.

·        Work on problems of diverse scope where data analysis requires evaluating identifiable factors and selecting methods to validate and automate the solution.

·        Build test strategies and test plans that validate the core business problem, translating them into tests covering code functionality, data quality, performance, and security.

·        Build and support the generation or mocking of test data for exploratory analysis and test runs.

·        Design and build automated tests/jobs of moderate to high scope and complexity across all stages of the data pipeline, demonstrating good coding and design practices and adhering to published coding standards and guidelines (a sketch of such a check follows this list).

·        Build and enhance data quality rules and platforms for observability across the data pipeline.

·        Debug complex issues, resolve blockers and follow design documents with minimal or no supervision.

·        Conduct data analysis activities such as source system analysis, data modelling, data dictionary collection, data profiling, and source-to-target mapping, ensuring delivery on business needs.

·        Support updates to data inventories and registries as required to keep metadata and data lineage up to date, following agreed Data Governance standards, guidelines, and principles.
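
To make the kind of automated pipeline check described above concrete, the following is a minimal sketch in PySpark. The dataset path, column names, and the 1% threshold are invented placeholders for illustration, not anything specific to the Emirates platform.

from pyspark.sql import SparkSession, functions as F

# Minimal data quality check sketch; all names below are hypothetical placeholders.
spark = SparkSession.builder.appName("dq-check-example").getOrCreate()

df = spark.read.parquet("/data/landing/bookings")  # placeholder location
total = df.count()

# Rule 1: the business key must be present and unique.
null_keys = df.filter(F.col("booking_id").isNull()).count()
distinct_keys = df.select("booking_id").distinct().count()

# Rule 2: at most 1% of rows may be missing the fare amount.
missing_fare = df.filter(F.col("fare_amount").isNull()).count()

assert null_keys == 0, f"{null_keys} rows have a null booking_id"
assert distinct_keys == total, "booking_id values are not unique"
assert missing_fare <= 0.01 * total, "more than 1% of rows lack fare_amount"

print(f"Data quality checks passed for {total} rows")

A check like this can be scheduled as its own job (for example in Airflow) between pipeline stages, so that bad data is caught before it propagates downstream.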

Qualifications & Experience:

·        Degree in a relevant field such as Computer Science, Computational Mathematics, Computer Engineering or Software Engineering.

·        Specialization or electives in a Data & Analytics field (e.g. Data Warehousing, Data Science, Business Intelligence) is a nice-to-have.

·        2+ years of Data Engineering experience with a focus on quality and automation.

·        At least 2 years of testing, automation, and support experience in analytics applications such as data lakes and data warehouses (preferably using the Big Data stack and Microsoft Azure cloud infrastructure).

·        Seasoned in identifying quality issues across complex data pipelines running on big data technologies and in defining rules for validating the health of the data.

·        Experienced in a wide variety of testing methods and tools covering functional, performance, and security tests across individual jobs, pipelines, and end-to-end across the enterprise.

·        Well versed in building automated checks that run in a CI pipeline to validate that ETL/ELT jobs are performing as expected (see the sketch after this list).

·        Experience with batch and real-time data ingestion/integration tools and technologies handling massive quantities of data (structured and unstructured).

·        Exposed to data architecture concepts such as data modelling, Big Data storage, and dimensional modelling.

·        Exposed to working with jobs in data pipelines and defining metrics/measures to ensure correctness of the data.

·        Programming (Python or Scala) and SQL querying skills are required.

·        Exposure to Spark and airline industry experience are nice-to-haves.
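
As an illustration of the CI-run ETL/ELT checks mentioned above, here is a minimal pytest sketch. In-memory SQLite stands in for the real source and target systems, and the table and column names are invented for the example; a CI server such as Jenkins or Azure DevOps would simply execute pytest after the load job completes.

import sqlite3

import pytest

@pytest.fixture
def conn():
    # In-memory SQLite stands in for the real source and target systems.
    c = sqlite3.connect(":memory:")
    c.executescript(
        """
        CREATE TABLE src_flights (flight_no TEXT, dep_date TEXT);
        CREATE TABLE tgt_flights (flight_no TEXT, dep_date TEXT);
        INSERT INTO src_flights VALUES ('EK001', '2024-01-01'), ('EK002', '2024-01-01');
        INSERT INTO tgt_flights VALUES ('EK001', '2024-01-01'), ('EK002', '2024-01-01');
        """
    )
    return c

def test_row_counts_match(conn):
    # The load must not drop or duplicate rows.
    src = conn.execute("SELECT COUNT(*) FROM src_flights").fetchone()[0]
    tgt = conn.execute("SELECT COUNT(*) FROM tgt_flights").fetchone()[0]
    assert src == tgt

def test_no_null_business_keys(conn):
    # The business key must always be populated in the target.
    nulls = conn.execute(
        "SELECT COUNT(*) FROM tgt_flights WHERE flight_no IS NULL"
    ).fetchone()[0]
    assert nulls == 0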

Knowledge/Skills:

·        Ability to drive quality of data assets independently.

·        Able to deliver solutions (and associated value) iteratively.

·        Strong ability to conduct data analysis (e.g. source system identification, data dictionary/metadata collection, data profiling, source-to-target mapping) is preferred (a profiling sketch follows this list).

·        Operates with a "You Code It, You Own It" mindset (i.e. supports the products they build).

·        Team player, able to collaborate with others to remove blockers, solve complex design problems and debug/resolve issues.

·        Is accountable and displays a positive attitude.

·        Self-starter and has passion for exploring and learning new technologies, especially those in the Enterprise Data & Analytics space.
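
To illustrate the data profiling mentioned in the list above, here is a minimal pandas sketch; the column names and the small in-memory sample are invented for the example, and in practice the frame would be read from the source system or a landed file.

import pandas as pd

# Hypothetical sample standing in for a source extract.
df = pd.DataFrame({
    "booking_id": [101, 102, 103, None],
    "cabin_class": ["Y", "J", "Y", "F"],
    "fare_amount": [420.0, 3150.0, None, 8900.0],
})

# Basic per-column profile: data type, null percentage, distinct value count.
profile = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "null_pct": (df.isna().mean() * 100).round(1),
    "distinct": df.nunique(dropna=True),
})

print(profile)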

Key Technologies/Tools:

Big Data & distributed processing: Spark, Hadoop (HDFS, Hive, HBase, Oozie), Airflow, Apache NiFi, Azure (ADLS, Databricks, Azure Data Factory), Elasticsearch, AVRO/Parquet file formats.

Data Analysis, Modelling and Reporting: Snowflake, SQL, Data Vault 2.0, MicroStrategy, Power BI.

Cloud Technologies: Microsoft Azure and Cloudera technology stacks

Integration and Messaging: Streaming (e.g. Spark Streaming), SnapLogic, TIBCO, Kafka (a streaming sketch follows this section).

CI/CD: Git, Bitbucket, Jenkins, Azure DevOps, Kubernetes, Docker, SonarQube, Gatling.

Languages: Scala, Python.
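
As a small illustration of how the streaming pieces above fit together, here is a sketch of a Spark Structured Streaming job reading from Kafka and landing the events as Parquet. The broker address, topic, and output paths are placeholders, and the spark-sql-kafka connector package has to be available on the cluster for the Kafka read to work.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("streaming-ingest-example").getOrCreate()

# Read a Kafka topic as a streaming DataFrame; broker and topic are placeholders.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "booking-events")
    .load()
)

# Kafka delivers keys and values as bytes; cast them before downstream parsing.
parsed = events.select(
    F.col("key").cast("string"),
    F.col("value").cast("string"),
    "timestamp",
)

# Land the raw events as Parquet, with a checkpoint so output files are written exactly once.
query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "/data/raw/booking_events")           # placeholder path
    .option("checkpointLocation", "/chk/booking_events")  # placeholder path
    .start()
)

query.awaitTermination()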

Last Date:

How to Apply:
