Engineering Java Go Backend AWS CI/CD JSON Azure GCP Terraform Python Rust
· Build and run the data processing pipeline on Google Cloud Platform (GCP)
· Work with Implementation teams from concept to operations, providing deep technical expertise for successfully deploying large-scale data solutions in the enterprise using modern data/analytics technologies on GCP
· Design pipelines and architectures for data processing
· Implement methods for DevOps automation of all parts of the built data pipelines to deploy from development to production
· Formulate business problems as technical data problems while ensuring that key business drivers are captured in collaboration with product management
· Extract, load, transform, clean and validate data
· Support and debug data pipelines
We are trying to reimagine the way we help and interact with our customers, so we are looking for candidates with creativity, an open mind, and positive energy. Our detailed requirements are below:
Must have:
· Bachelor's degree in Computer Science, Software Engineering, or a related field
· At least 2 years of experience developing backend/platform applications using modern programming languages such as Python, Java, JS, Go, or Rust
· Experience provisioning services on cloud platforms (AWS, GCP, Azure) and with Infrastructure as Code (Terraform, CDK, Pulumi...)
· Experience constructing CI/CD workflows
· Experience with (or knowledge of) building data-intensive applications
· Excellent problem-solving skills and the ability to analyze and resolve complex technical issues
· Working proficiency in English, both verbal and written
· Hybrid working mode (3 working days at the office, flexible hours)
· Attractive package including full salary + 13th-month salary + performance bonus
· 18 paid leave days/year (12 annual leave days and 6 personal leave days)
· Insurance plan based on full salary
· 100% full salary and benefits as an official employee from the 1st day of working
· Medical benefits for employee and family
· Working in a fast-paced, flexible, and multinational environment, with the chance to travel onsite (in 49 countries)
· Free snacks, refreshments, and parking
· Internal training (Technical, Functional & English)
· Working time: 08:30 AM - 06:00 PM, Mondays to Fridays, hybrid mode (meal breaks included)
Nice to have:
· Experience in developing big-data pipelines (batch and streaming processing) is a huge plus
· Experience in developing automation processes and observability is a huge plus
· Interest in working with data platforms / eagerness to learn more about data is a huge plus
· Familiarity with Docker and K8s
You are not expected to have 100% of these skills. At HCLTech, a growth mindset is at the heart of our culture, so if you have most of these things in your toolbox, we would love to hear from you.