
Data Engineer

Supertech Group

Dubai - United Arab Emirates


Description

Do you want to love what you do at work? Do you want to make a difference, an impact, and transform people's lives? Do you want to work with a team that believes in disrupting the normal, boring, and average? If yes, then this is the job you are looking for. webook.com is Saudi Arabia's #1 event ticketing and experience booking platform in terms of technology, features, agility, and revenue, serving some of the largest mega events in the Kingdom and surpassing 2 billion in sales. webook.com is part of the Supertech Group, which also includes UXBERT Labs, one of the best digital and user experience design agencies in the GCC, and Kafu Games, the largest esports tournament platform in MENA.

Key Responsibilities:

Data Integration and ETL Development:
Architect and implement robust data integration pipelines to extract, transform, and load data from various sources (e.g., databases, SaaS applications, APIs, and flat files) into a centralized data platform.
Design and develop complex ETL (Extract, Transform, Load) processes to ensure data quality, consistency, and reliability.
Optimize data transformation workflows to improve performance and scalability.

Data Infrastructure and Platform Management:
Implement and maintain data ingestion, processing, and storage solutions to support the organization's data and analytics requirements.
Ensure the reliability, security, and availability of the data infrastructure through effective monitoring, troubleshooting, and disaster recovery planning.

Data Governance and Metadata Management:
Collaborate with the data governance team to establish data policies, standards, and procedures.
Develop and maintain a comprehensive metadata management system to ensure data lineage, provenance, and traceability.
Implement data quality control measures and data validation processes to ensure the integrity and reliability of the data.

Requirements:
5-6 years of experience as a Data Engineer or in a related role in a data-driven organization.
Proficient in designing and implementing data integration and ETL pipelines using tools such as Apache Airflow, Airbyte, or cloud-based data integration services (see the pipeline sketch after this listing).
Strong experience in setting up and managing data infrastructure, including data lakes, data warehouses, and real-time streaming platforms (e.g., Elasticsearch, Google BigQuery, MongoDB).
Expertise in data modeling, data quality management, and metadata management.
Proficient in programming languages such as Python or Java, and experienced with SQL.
Familiarity with cloud computing platforms (e.g., AWS, Google Cloud) and DevOps practices.
Excellent problem-solving skills and the ability to work collaboratively with cross-functional teams.
Strong communication and presentation skills to effectively translate technical concepts to business stakeholders.

Preferred Qualifications:
Familiarity with data visualization and business intelligence tools (e.g., Tableau, Qlik).
Knowledge of machine learning and artificial intelligence concepts and their application in data-driven initiatives.
Project management experience and the ability to lead data integration and infrastructure initiatives.

If you are a seasoned Data Engineer with a passion for building scalable and robust data integration solutions, we encourage you to apply for this exciting opportunity.
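A minimal sketch of the kind of ETL pipeline the responsibilities above describe, assuming Apache Airflow 2.4+ with the TaskFlow API (Airflow is named in the requirements; the DAG id orders_etl, the sample records, and the target table analytics.orders are hypothetical placeholders, not the role's actual stack):

```python
# Hedged sketch: extract -> transform -> load with Airflow's TaskFlow API.
# All names and data below are illustrative placeholders.
import pendulum
from airflow.decorators import dag, task


@dag(
    dag_id="orders_etl",
    schedule="@daily",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
)
def orders_etl():
    @task
    def extract() -> list[dict]:
        # Pull raw records from a source system (API, database, or flat file).
        return [{"order_id": "1", "amount": "250.00", "currency": "SAR"}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Enforce types and a basic data-quality rule before loading.
        return [
            {
                "order_id": int(r["order_id"]),
                "amount": float(r["amount"]),
                "currency": r["currency"],
            }
            for r in rows
            if r.get("amount") is not None
        ]

    @task
    def load(rows: list[dict]) -> None:
        # A real pipeline would write to the warehouse (e.g. BigQuery)
        # via the relevant provider hook; here we only log the row count.
        print(f"Loading {len(rows)} rows into analytics.orders")

    load(transform(extract()))


orders_etl()
```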


Location
Dubai - United Arab Emirates

You may also like

BlackStone eIT
SAP SuccessFactors Consultant
Dubai - United Arab Emirates
BlackStone eIT is a leading computer software company specializing in SAP solutions. We are currently seeking a dedicated SAP SuccessFactors Consultant to join our team. In this role, you will assist clients with the implementation, configuration, and optimization of their SAP SuccessFactors systems, ensuring that their HR processes are efficient and aligned with business objectives.

Responsibilities:
Engage with clients to understand their HR needs and requirements for SAP SuccessFactors.
Assist in the design and configuration of SuccessFactors modules, including Employee Central, Performance & Goals, and Recruiting.
Conduct workshops to gather user requirements and define system functionality.
Support data migration from legacy systems to SuccessFactors.
Conduct system testing and manage defect resolution to ensure successful implementation.
Provide training and ongoing support to end users.
Stay current with the latest trends and updates in SAP SuccessFactors.

Requirements:
Bachelor's degree in Computer Science, Information Technology, Human Resources, or a related field.
Minimum of 2 years' experience in SAP SuccessFactors.
Familiarity with HR processes and SuccessFactors modules.
Strong configuration and troubleshooting skills in SuccessFactors.
Excellent analytical and problem-solving abilities.
Strong communication and interpersonal skills.
Ability to work collaboratively with cross-functional teams.
Certification in SAP SuccessFactors is a plus.

Benefits:
Paid Time Off
Work From Home
Performance Bonus
Training & Development
Negotiable Salary
Confidential
Full-Stack Mobile App Developer
X8V5+24 Dubai - United Arab Emirates
We are looking for a highly skilled and motivated full-stack developer to build and scale production-level mobile applications for a fast-growing digital startup in the UAE. You'll be responsible for end-to-end development: frontend (mobile-first), backend, API integration, admin panel, and preparing the app for App Store/Play Store deployment. This role suits someone who can work independently, solve problems, and deliver high-quality results on time. You'll start with one core app and potentially continue across a portfolio of upcoming B2C and B2B applications.

Responsibilities:
Build complete mobile/web apps using React, Next.js, or Flutter.
Set up and manage backend infrastructure using Supabase or Firebase.
Integrate third-party services: payments (Stripe/Tap), GPS/maps, notifications.
Implement complex logic flows such as loyalty, QR, booking, or gifting systems.
Build admin dashboards and internal tools.
Deploy apps to the iOS and Android stores.
Maintain code quality, version control (Git), and security best practices.

Required Skills:
Strong command of React.js, Next.js, or Flutter.
Deep experience with Supabase, Firebase, or similar BaaS platforms.
Proven ability to build and scale backend logic and API endpoints.
Payment integration: Stripe or Tap.
Real-time features: push notifications (FCM, OneSignal), webhooks.
Admin panel development (charts, filters, user control).
Version control (Git, GitHub/GitLab).
Strong communication skills and ability to work autonomously.

Bonus Skills (Nice to Have):
Experience converting no-code prototypes into production apps.
Familiarity with tools like Lovable, Airtable, Make, or Cursor.dev.
Mobile-first design and responsive UI/UX skills (Tailwind CSS, Figma handoff).
Prior work in delivery, commerce, or loyalty-based applications.
AED 4,000-5,999
Confidential
Business Process Analyst
Dubai Marina 2 - Dubai Marina - Dubai - United Arab Emirates
Process Discovery & Mapping: Lead workshops/interviews to document end-to-end workflows for each department; produce flowcharts/swimlanes showing inputs, outputs, roles, handoffs, and controls.
Requirements Gathering: Translate business needs into clear process requirements, acceptance criteria, and change requests.
Data & Metrics: Define process KPIs (cycle time, throughput, rework, defects, OTIF, assay turn-around, etc.); build simple dashboards and track performance.
Gap & Root-Cause Analysis: Identify bottlenecks, waste, and risks using techniques like SIPOC, value-stream mapping, 5 Whys, and Pareto analysis; recommend fixes.
SOP & Policy Development: Draft, standardize, and version-control SOPs, work instructions, and policies; ensure they reflect actual practice and regulatory needs.
Controls & Compliance: Embed quality, safety, and security controls in processes; align with relevant standards (e.g., ISO 9001/14001/45001, chain-of-custody).
Change Management: Prepare impact assessments, RACI charts, implementation plans, and cutover checklists; support pilots and phased rollouts.
Stakeholder Coordination: Bridge operations, quality, finance, HR, IT, and logistics to align timelines, dependencies, and responsibilities.
Tooling & Automation Support: Recommend process-enablement tools (workflow, forms, document control); work with IT to configure or automate steps.
Training & Enablement: Create process diagrams, SOP packs, and quick-reference guides; conduct training and certify process adherence.
Audit & Continuous Improvement: Run periodic process reviews and audits; maintain a living process repository and improvement backlog.
Confidentiality & Data Protection: Handle all documentation and process data with strict confidentiality and access controls.
AED 4,000-5,999
Deeplight
Data Engineer
Dubai - United Arab Emirates
About DeepLight: DeepLight is a pioneering AI company committed to pushing the boundaries of innovation in artificial intelligence. Our mission is to harness the power of data and machine learning to revolutionize industries and create a brighter future. With a dynamic team of experts and a culture of relentless innovation, we are at the forefront of AI research and development.

Position Overview: DeepLight is seeking an exceptional Data Engineer to join our team of AI specialists in the UAE. As an Expert Data Engineer, you will be responsible for designing, implementing, and optimizing data pipelines and infrastructure to support our cutting-edge AI systems. You will collaborate closely with our multidisciplinary team to ensure the efficient collection, storage, processing, and analysis of large-scale data, enabling us to unlock valuable insights and drive innovation across various domains.

Requirements
· Pipeline Development: Design, develop, and maintain scalable and reliable data pipelines to ingest, transform, and load diverse datasets from various sources, including structured and unstructured data, streaming data, and real-time feeds, with consideration for downstream AI/ML workloads (see the sketch after this listing).
· Data Integration: Implement robust data integration processes to seamlessly integrate data from different sources, ensuring consistency, reliability, and data quality for analytics and AI use cases.
· Data Storage: Design and optimize data storage solutions, including relational databases, NoSQL databases, data lakes, and cloud storage services, to efficiently store and manage large volumes of data for AI and machine learning model consumption.
· Performance Optimization: Optimize data processing and query performance to enhance system scalability, reliability, and efficiency, leveraging techniques such as indexing, partitioning, caching, and parallel processing, especially for AI model training and inference pipelines.
· Data Governance: Implement data governance frameworks to ensure data security, privacy, integrity, and compliance with regulatory requirements, including data encryption, access controls, and auditing, all of which are crucial for responsible AI deployment.
· Monitoring and Maintenance: Monitor data pipelines and infrastructure components, proactively identify and address issues, and perform routine maintenance tasks to ensure system stability and reliability across AI and data science environments.
· Collaboration: Collaborate closely with cross-functional teams, including data scientists, ML engineers, architects, and domain experts, to understand AI/ML requirements, gather insights, and deliver integrated, production-ready data solutions.
· Documentation: Create comprehensive documentation, including technical specifications, data flow diagrams, and operational procedures, to facilitate understanding, collaboration, and knowledge sharing across AI and analytics teams.
· Proven experience as a Data Engineer, with a track record of designing and implementing complex data pipelines and infrastructure solutions that support advanced analytics and AI initiatives.
· Expertise in data modeling, ETL (Extract, Transform, Load) processes, and data warehousing concepts, with proficiency in SQL and scripting languages (e.g., Python, Scala), especially in AI pipeline preparation and feature engineering.
· Strong hands-on experience with big data technologies and frameworks, such as Hadoop, Spark, Kafka, and Flink, as well as cloud platforms (e.g., AWS, Azure, GCP) commonly used for AI workloads.
· Familiarity with containerization and orchestration technologies, such as Docker and Kubernetes, and DevOps practices for CI/CD (Continuous Integration/Continuous Deployment), including ML model deployment and data pipeline automation.
· Experience supporting AI/ML initiatives, including the preparation of training datasets, model input/output data flows, MLOps integration, and experimentation tracking.
· Excellent analytical, problem-solving, and communication skills, with the ability to translate complex technical concepts into clear and actionable insights.
· Proven ability to work effectively in a fast-paced, collaborative environment, with a passion for innovation, continuous learning, and contributing to AI-driven solutions.

Benefits

Why Join DeepLight?
· Impact: Be part of a dynamic team that is shaping the future of AI and making a meaningful impact on industries and society.
· Innovation: Work on cutting-edge projects at the intersection of AI, data engineering, and machine learning, leveraging the latest technologies and methodologies.
· Collaboration: Collaborate with a diverse team of experts from various disciplines, fostering creativity, learning, and growth.
· Opportunity: Enjoy ample opportunities for professional development, career advancement, and leadership roles in a rapidly growing company.
· Culture: Join a culture of curiosity, excellence, and collaboration, where your ideas are valued, and your contributions are recognized and rewarded.

If you are passionate about data engineering, AI, and innovation, and you thrive in a dynamic and collaborative environment, we want to hear from you! Apply now to join DeepLight and be part of our journey to unlock the potential of AI for a brighter future.
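A minimal sketch of the kind of batch pipeline the Pipeline Development and Performance Optimization bullets describe, assuming PySpark (Spark is named in the requirements; the S3 paths, column names, and quality rule are hypothetical placeholders, not DeepLight's actual stack):

```python
# Hedged sketch: read raw JSON events, apply a simple quality filter,
# and write partitioned Parquet. All paths and columns are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_etl").getOrCreate()

# Hypothetical source: raw event dumps landed in object storage.
raw = spark.read.json("s3://example-bucket/raw/events/")

clean = (
    raw
    .filter(F.col("event_id").isNotNull())                 # basic data-quality rule
    .withColumn("event_ts", F.to_timestamp("event_time"))  # normalize timestamps
    .withColumn("dt", F.to_date("event_ts"))               # derive a partition column
)

(
    clean.write
    .mode("overwrite")
    .partitionBy("dt")                                      # partitioning for query performance
    .parquet("s3://example-bucket/curated/events/")         # hypothetical curated target
)
```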