Data Platform Engineers (GCP/Azure/AWS)

Closed

Engineering Java Go Backend AWS CI/CD JSON Azure GCP Terraform Python Rust

Location
Ho Chi Minh, Vietnam

Benefits

Full social insurance
Flexible working time
Salary review
Travel/company trips
Laptop/desktop for work
Performance bonus
Extra health insurance
13th month salary
Others
Work-from-home policy

Job Overview and Responsibilities

· Build and run data processing pipelines on Google Cloud Platform (GCP)
· Work with implementation teams from concept to operations to provide deep technical expertise for successfully deploying large-scale data solutions in the enterprise, using modern data/analytics technologies on GCP
· Design pipelines and architectures for data processing
· Implement DevOps automation for all parts of the data pipelines to deploy from development to production
· Formulate business problems as technical data problems, ensuring that key business drivers are captured, in collaboration with product management
· Extract, load, transform, clean, and validate data (a minimal sketch follows this list)
· Support and debug data pipelines
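As a rough illustration of the extract-transform-validate-load work described above, here is a minimal, library-agnostic Python sketch; the file names, column names, and validation rule are hypothetical examples, not details taken from the role.

```python
# Minimal extract -> transform -> validate -> load sketch (standard library only).
# "raw_orders.csv", the "id"/"amount" columns, and the non-negative check are
# illustrative assumptions, not part of the job description.
import csv
import json


def extract(path: str) -> list[dict]:
    """Read raw records from a CSV file."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))


def transform(records: list[dict]) -> list[dict]:
    """Normalize field names and cast numeric fields."""
    return [{"user_id": r["id"].strip(), "amount": float(r["amount"])} for r in records]


def validate(records: list[dict]) -> list[dict]:
    """Drop rows that fail a basic sanity check."""
    return [r for r in records if r["amount"] >= 0]


def load(records: list[dict], path: str) -> None:
    """Write the cleaned records as JSON lines."""
    with open(path, "w") as f:
        for r in records:
            f.write(json.dumps(r) + "\n")


if __name__ == "__main__":
    load(validate(transform(extract("raw_orders.csv"))), "clean_orders.jsonl")
```

In production these stages would typically run on managed GCP services rather than local files, but the extract/transform/validate/load boundaries stay the same.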

Required Skills and Experience

We are trying to reimagine the way we help and interact with our customers, so we are looking for candidates with creativity, an open mind, and positive energy. Our detailed requirements are below.

Must have:
· Bachelor's degree in Computer Science, Software Engineering, or a related field
· At least 2 years of experience developing backend/platform applications using modern programming languages such as Python, Java, JS, Go, or Rust
· Experience provisioning services on cloud platforms (AWS, GCP, Azure) and with Infrastructure as Code (Terraform, CDK, Pulumi...)
· Experience constructing CI/CD workflows
· Experience with (or knowledge of) building data-intensive applications
· Excellent problem-solving skills and the ability to analyze and resolve complex technical issues
· Working proficiency in English, both verbal and written

Why candidates should apply for this position

· Hybrid working mode (3 working days at the office, flexible time)
· Attractive package including full salary + 13th-month salary + performance bonus
· 18 paid leave days/year (12 annual leave days and 6 personal leave days)
· Insurance plan based on full salary
· 100% full salary and benefits as an official employee from the 1st day of working
· Medical benefits for employee and family
· Working in a fast-paced, flexible, and multinational environment, with the chance to travel onsite (in 49 countries)
· Free snacks, refreshments, and parking
· Internal training (technical, functional, and English)
· Working time: 08:30 AM - 06:00 PM, Mondays to Fridays, hybrid mode (meal breaks included)

Preferred Skills and Experience

· Experience developing big-data pipelines (batch and streaming processing) is a huge plus (see the sketch below)
· Experience developing automation processes and observability is a huge plus
· Interest in working with data platforms and eagerness to learn more about data is a huge plus
· Familiarity with Docker and K8s

You are not expected to have 100% of these skills. At HCLTech, a growth mindset is at the heart of our culture, so if you have most of these things in your toolbox, we would love to hear from you.
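For readers unfamiliar with the batch vs. streaming distinction mentioned above, here is a toy Python sketch contrasting the two; the event data and the running-total aggregation are illustrative assumptions, not requirements of the role.

```python
# Toy contrast between batch and streaming-style processing of the same events.
# The (key, value) events and the running-total aggregation are made up for
# illustration only.
from collections import defaultdict
from typing import Iterable, Iterator


def batch_totals(events: list[tuple[str, float]]) -> dict[str, float]:
    """Batch: the whole dataset is available up front; aggregate it in one pass."""
    totals: dict[str, float] = defaultdict(float)
    for key, value in events:
        totals[key] += value
    return dict(totals)


def streaming_totals(events: Iterable[tuple[str, float]]) -> Iterator[dict[str, float]]:
    """Streaming: events arrive one at a time; emit an updated result after each."""
    totals: dict[str, float] = defaultdict(float)
    for key, value in events:
        totals[key] += value
        yield dict(totals)


if __name__ == "__main__":
    events = [("checkout", 10.0), ("signup", 1.0), ("checkout", 5.0)]
    print(batch_totals(events))        # one final result
    for snapshot in streaming_totals(events):
        print(snapshot)                # incremental results as events arrive
```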

Evelyn L

Headhunter | Recruiter
Verified
131 resumes
25 interviews
8 offers


Successfully closed deals (8)