Engineering Backend Airflow SQL Python Data Modeling ETL Data Architecture Database
Based in our centrally located Ho Chi Minh City office in Vietnam, you'll be joining our Data team within our fast-growing tech organisation. Reporting to the Head of Data & Analytics, the Senior Data Engineer will own the design, build, and maintenance of key parts of the data architecture, with the opportunity to lead strategic projects that deliver value to the business. As a Senior Data Engineer, you will have hands-on experience across the different elements of the data engineering process, and you will be great at building the business context needed to inform technical decision making. You will:

- Be the technical leader and decision maker on data architecture tools and technologies
- Own the Data Warehouse, from ingestion of data sources through storage and serving
- Own the ideation and execution of architectural principles, guidelines, and technology standards for ETL and data pipelines
- Work with Analytics to ensure the data modeling is suitable for reporting needs
- Manage our GCP and Airflow instances
- Work with Analytics, Back-end, Front-end, DevOps, and QA engineers to ensure that data events are well designed and correctly integrated into the data pipeline services
- Conduct, lead, and implement proofs of concept to demonstrate new technologies in support of the architecture vision and guiding principles
- Collaborate with Data Engineers and Analysts to develop views, sources, and aggregations of data (see the sketch after this list)
- Review SQL and Python code from across the Data team to help maintain data and code integrity
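As a loose illustration of the "views, sources, and aggregations" work above, here is a minimal sketch using the google-cloud-bigquery Python client. The project, dataset, and table names are all hypothetical, invented for illustration; this is not the team's actual warehouse or code.

```python
# Hypothetical sketch: publish an analyst-ready aggregation as a BigQuery view.
# Project, dataset, and table names are invented for illustration only.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # assumes default credentials

view = bigquery.Table("example-project.reporting.daily_active_users")
view.view_query = """
    SELECT
        DATE(event_timestamp) AS event_date,
        COUNT(DISTINCT user_id) AS daily_active_users
    FROM `example-project.raw_events.app_events`
    GROUP BY event_date
"""

# create_table() with view_query set creates a logical view, not a table.
client.create_table(view, exists_ok=True)
```

Defining the aggregation as a view keeps the raw events table as the single source of truth while giving analysts a ready-made reporting interface.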
What we're looking for:

- An undergraduate degree in any quantitative field (physics, data science, computer science, engineering, business science, or anything with a maths or analysis focus)
- 5+ years in Data Engineering roles
- Excellent English communication skills
- Strong knowledge and experience of ETL and other aspects of data processing pipelines and data architecture
- Strong knowledge and experience of BigQuery, REST APIs, and data and model storage formats
- Strong experience with modular Python and Airflow (see the sketch after this list)
- Experience implementing event streaming with platforms such as Kafka, Kinesis, or SQS/SNS
- Experience building analyst-ready data models
- Solid SQL and query optimization skills
- Experience with unit and integration testing
- Experience with distributed and automated workflow technologies (Docker, K8S, Spark)
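As a loose illustration of the modular Python and Airflow experience asked for above, here is a minimal DAG sketch. The DAG name, schedule, and task callables are hypothetical stubs, not this team's actual pipeline, and the imports assume Airflow 2.x.

```python
# Hypothetical sketch of a modular three-step ETL DAG; every name here is
# invented for illustration. Imports assume Airflow 2.x.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Pull raw records from a source system (stubbed for illustration).
    print("extracting raw events")


def transform():
    # Shape the raw data into analyst-ready models (stubbed for illustration).
    print("transforming into reporting models")


def load():
    # Load the modelled data into the warehouse (stubbed for illustration).
    print("loading into the warehouse")


with DAG(
    dag_id="example_etl",             # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ keyword; older 2.x uses schedule_interval
    catchup=False,
) as dag:
    PythonOperator(task_id="extract", python_callable=extract) \
        >> PythonOperator(task_id="transform", python_callable=transform) \
        >> PythonOperator(task_id="load", python_callable=load)
```

Keeping each step a small, importable Python function rather than inline logic is what "modular Python" usually points at: each callable can be unit tested outside Airflow.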
What we offer:

- Competitive salary
- Flexible working
- 6 weeks remote working (work from anywhere) policy
- 25 days holiday, plus the option to buy more!
- Share options
- Private health insurance
- Wellbeing days
- Generous parental leave policies
- Family-friendly policies
- Employee Assistance Program (EAP)
- Lunch & learn sessions
- Team social budget
Nice to have:

- Knowledge of a range of third-party data sources and APIs (e.g. GA, Matomo, Mixpanel, Facebook, LinkedIn)
- Experience with consumer app data
- Some experience working with dbt