CoDev

Data Engineer

Category Business Intelligence & Data Analytics
ID 2025-2719
Office Location
PH-Negros Occidental
Job Locations
PH-Bacolod-Negros Occidental | PH-Cebu-Cebu City | PH-Davao del Sur-Davao City | PH-Metro Manila-Makati
Shift Schedule
4 pm - 1 am PHT, 9 pm - 6 am PHT
Work Set Up
Remote

Position Summary
We are seeking an experienced Data Engineer to specialize in the orchestration and operational management of a new data warehouse project. This role centers on expert-level mastery of Google Cloud Composer (Apache Airflow) and DAG (Directed Acyclic Graph) authoring. You will be responsible for scheduling, automating, and monitoring all data pipelines, ensuring they run on time, in the correct order, and with robust error handling. Your expertise will be vital in building a resilient, observable, and highly automated data ecosystem that the entire team can depend on.

Key Responsibilities

  • Design, build, and manage the complete orchestration framework for all data pipelines
    using Google Cloud Composer.
  • Author complex, dynamic, and maintainable DAGs in Python, implementing
    sophisticated dependency management, triggers, and scheduling logic.
  • Serve as the team's subject matter expert on DAG authoring best practices,
    including idempotency, task modularity, backfilling, and performance tuning.
  • Develop a comprehensive monitoring and alerting strategy for all orchestrated pipelines
    to proactively identify and resolve issues.
  • Establish and enforce CI/CD processes for deploying and testing DAGs to ensure
    operational stability.
  • Collaborate with the ingestion and transformation engineers to understand pipeline
    dependencies and integrate their workflows into the master orchestration plan.
  • Own the operational health of the data platform, managing retries, handling failures, and
    communicating pipeline status to stakeholders.

Qualifications

Core Required Skills

  • Expert-Level DAG Authoring: Demonstrable mastery of authoring complex DAGs in
    Python, including deep knowledge of Airflow operators and sensors, dynamic DAG
    generation, managing task dependencies and SLAs, and creating idempotent tasks.
  • Cloud Composer (Apache Airflow): Deep, hands-on experience managing a
    production Cloud Composer environment, including configuration, security, scaling, and
    troubleshooting.
  • Python for Orchestration: Expert-level Python proficiency as it applies to writing clean,
    efficient, and testable Airflow DAGs.
  • CI/CD for Data Pipelines: Experience implementing continuous integration and
    deployment workflows for DAGs using tools like Git and Google Cloud Build.
  • Monitoring & Alerting: Strong knowledge of setting up and using monitoring tools
    (like Google Cloud's operations suite) to track pipeline performance and alert on
    failures.
