Engineering, SQL, Data, Confluence, Jira, Scrum, Apache, Python, Azure
Responsibilities:
- Be part of a multi-disciplinary, international Scrum team that takes ownership of designing and implementing our platform and its software features
- Design, develop, and maintain scalable data pipelines on Azure using Azure Data Factory, Databricks, and other relevant tools
- Build the data foundations needed for a strong BI service for business and R&D stakeholders
- Transform raw data into ready-to-use, high-quality datasets and build mechanisms to monitor their quality (see the sketch after this list)
- Suggest, discuss, and define process improvements to how we work
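As a rough illustration of the data-quality monitoring this role involves, here is a minimal PySpark sketch; it assumes a Databricks/Spark environment, and the dataset path, column names, and thresholds are hypothetical examples, not details of our actual platform.

# Minimal sketch, assuming a Databricks/PySpark environment.
# The dataset path, columns, and thresholds are hypothetical illustrations.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Load a raw dataset (hypothetical path).
df = spark.read.parquet("/mnt/raw/sales")

# Basic quality metrics: row count, null rate of a key column, duplicate keys.
total = df.count()
null_rate = df.filter(F.col("order_id").isNull()).count() / max(total, 1)
duplicates = total - df.dropDuplicates(["order_id"]).count()

# Fail the pipeline run if the data does not meet the quality thresholds.
assert null_rate < 0.01, f"order_id null rate too high: {null_rate:.2%}"
assert duplicates == 0, f"{duplicates} duplicate order_id values found"

Checks like these would typically run as a step in an orchestrated pipeline (e.g. an Azure Data Factory or Databricks job), so that low-quality data is caught before it reaches downstream BI datasets.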
Requirements:
- Self-responsibility and commitment in meeting project objectives and ensuring software validation activities are conducted with timeliness, thoroughness, and accuracy
- Determination and sound technical judgment in problem solving and analytical work; able to develop new, creative test designs and work independently with little supervision
- At least 5 years of experience in data engineering
- Fluency with Azure, Databricks, SQL, Python, Apache Spark, PySpark, orchestration tools, and BI tools
- Experience working in Scrum teams or with an agile approach
- Experience with continuous integration principles and related tools, e.g. Jira, Confluence
Benefits:
- Working time: 40 hours per week, Monday to Friday
- Paid days off: 17 days per year
- Work from everywhere: 24 WFE days per year
- Competitive salary and benefits: 13th-month salary, profit sharing, bonuses for excellent members and for excellent projects
- Social insurance: contributions based on 100% of your gross salary, up to the cap set by labor law
- 100% salary during probation
- Monthly lunch and gasoline allowances
- Performance salary review: annually after 12 months; after 6 months for exceptional performance
- 1 laptop + 1 additional LCD screen
- Company healthcare budget
- Annual health check
- Annual company trip
- Training opportunities and TechSoft Learning Budget
- English working environment
Nice to have:
- Certification in Azure Data Engineering or related areas
- Familiarity with DevOps practices and tools (e.g., Docker, Kubernetes)