Senior Data Engineer
Date: 1 May 2026
Location: Abu Dhabi, Abu Dhabi, AE
Company: G Forty Two General Trading LLC
Overview:
About AIQ:
AIQ is an Abu Dhabi-based joint venture between Presight and ADNOC focused on developing artificial intelligence technologies. AIQ develops and commercializes AI products and applications for the energy sector, aiming to provide end-to-end solutions by using its data, cloud, and talent to build AI solutions that reduce costs and generate revenue for its clients. AIQ embodies an innovative, entrepreneurial spirit that embraces challenges to push boundaries, and it welcomes professionals who share the desire to make meaningful and impactful contributions to its mission. Always on the cutting edge of technology, AIQ gives its talent every opportunity to thrive and excel. Working at AIQ means dealing with massive data sets, an AI infrastructure powered by the latest NVIDIA GPU cloud computing platform, and access to virtually limitless computing, storage, and network resources.
About the role
As a Senior Data Engineer at AIQ, you will be responsible for building and maintaining the infrastructure that supports data collection, processing, and storage, working closely with data scientists, analysts, and other stakeholders to ensure that data systems are reliable, scalable, and secure. Your work will be crucial to enabling data-driven decision-making across the organization. This is a key technical role focused on developing and optimizing the company's data infrastructure, which involves designing and implementing data pipelines, ensuring data quality, and collaborating with cross-functional teams to support various data initiatives.
Responsibilities:
As a Senior Data Engineer, you will be responsible for developing and maintaining data systems to support the company’s strategic goals. Your role will encompass a range of activities focused on data pipeline development, data quality, and cross-functional collaboration.
· Data Pipeline Architecture and Development
Design, construct, install, test, and maintain highly scalable data pipelines with a focus on machine learning models and analytics.
· Data Integration
Work closely with data scientists, ML engineers, and stakeholders to ensure that data is accessible, consistent, and reliable for ongoing projects.
· API and Data Services
Develop and maintain APIs for data access and manipulation, and integrate with external data services as needed.
· Data Storage
Manage and optimize data storage solutions for both structured data (relational databases) and unstructured data (text, image, audio, and video), including search engines such as Elasticsearch and NoSQL databases, to support the requirements of machine learning models.
Understand data engines and structures in order to design effective solutions for transactional, analytical, and search workloads.
· Data Quality and Governance
Implement processes to monitor data quality and ensure production data is always accurate and available for key stakeholders.
· Collaboration and Support
Collaborate with ML engineers to assist in data-related technical issues and provide architectural guidance and solutions.
· Security and Compliance
Ensure compliance with data security and privacy policies.
· Documentation
Maintain clear and up-to-date documentation including data dictionaries, metadata, and architectural diagrams.
Qualifications:
Skills and attributes for success
o Bachelor’s degree in Computer Science, Engineering, Mathematics, or a related field; or equivalent work experience.
o 7+ years of experience in a Data Engineering role.
o Proficiency in programming languages such as Python, Java, and Scala, and experience managing large-scale data, potentially in the terabyte-to-petabyte range.
o Hands-on experience with big data technologies such as Hadoop, Spark, and Flink.
o Familiarity with machine learning frameworks such as TensorFlow, PyTorch, or similar.
o Strong understanding of data warehousing concepts, ETL processes, and data modeling.
o Experience with API development and integration with data services.
o Experience with cloud platforms like Azure.
o Knowledge of DevOps, CI/CD practices, and containerization technologies such as Docker and Kubernetes.
o Experience with real-time / streaming data processing.
Technical stack
o Programming Languages: Python, Java, Scala, SQL, Bash
o Big Data Technologies: Hadoop, Spark, Flink
o Unstructured Data: Text, Image, Audio & Video
o Databases: MySQL, PostgreSQL, MongoDB, Cassandra, HBase, Redis
o Cloud Platforms: Azure
o API Development: RESTful APIs, GraphQL, OpenAPI
o Data Services: Kafka, RabbitMQ
o Containers: Docker, Kubernetes