This position is for a Senior Data Engineer to lead the design and implementation of Hunkemöller's enterprise data warehouse and data mesh on Google Cloud. You will guide and mentor a team of internal & external engineers, owning and driving key data transformation initiatives. The right candidate will be excited by the prospect of architecting and building a new, scalable data platform from the ground up to support next-generation analytics and AI initiatives.
Responsibilities:
- Architect and Build Data Pipelines: Design, develop, and own robust, scalable data pipelines using SQL and cloud technologies (GCP), ensuring the highest standards of data quality, reliability, and performance.
- Lead Data Modeling & Warehousing: Lead the design and implementation of scalable data models (e.g., star schemas, data vaults) within our enterprise data warehouse (BigQuery), ensuring they are optimized for analytics and BI consumption.
- Own GCP Data Services: Drive the design and implementation of our Google Cloud Platform (GCP) data infrastructure, focusing on automation, data delivery, security, and scalability improvements.
- Mentor and Collaborate: Partner with Product, Data, and Design teams to resolve complex technical data issues. Mentor junior engineers and guide the team on data engineering best practices.
- Contribute to Data Governance: Help establish and enforce data governance standards, data quality frameworks, and best practices across the data platform.
- Build for Analytics & AI: Build and optimize data platforms that power our BI, Data Science, and AI solutions, ensuring data is accessible, reliable, and ready for analysis.
Requirements:
- Extensive Data Engineering Experience: 5+ years of hands-on experience in data engineering, with a proven track record of designing and implementing large-scale data pipelines and data warehouse solutions.
- Expert SQL Proficiency: Expert-level skills in writing complex, highly optimized SQL queries for data manipulation and analysis.
- Cloud & Data Warehousing Expertise: Deep knowledge of cloud data services, preferably on Google Cloud Platform (BigQuery, Dataflow, etc.).
- Strong Programming Skills: Experience with OOP concepts (Python preferable) and applying those principles to build robust data engineering solutions.
- ETL/ELT Architecture: Extensive experience architecting and building ETL/ELT systems. Experience with dbt is a strong plus.
- Code Quality & Version Control: Proficient in writing clean, well-documented, and tested code, with strong experience using Git.
- Leadership Potential: Experience mentoring junior engineers and leading technical discussions is highly desirable.
- English Proficiency: Excellent written and verbal English communication skills.
Benefits:
- 25 Days of Annual Leave: As a full-time employee, you will receive 25 days of paid holiday each year, with the flexibility to buy or sell up to 4 additional days to meet personal needs.
- Hybrid Work Model: We offer a flexible hybrid work environment, enabling you to balance office and remote work for better productivity and work-life balance.
- Work From Abroad: Take advantage of the opportunity to work remotely from abroad for up to 2 weeks annually.
- International Work Environment: Join a dynamic, global team that fosters collaboration and cultural diversity, providing a rich and inclusive professional experience.
- Partial Travel Allowance: Benefit from a partial travel allowance to support your commuting costs.
- Hunkemöller Academy: Gain access to our Hunkemöller Academy, offering a wide range of professional development programs to support your career growth and skill enhancement.
- Employee Discounts: Enjoy a 25% discount on all company products.
Apply now with your CV (in English) or LinkedIn profile only.
For this role, we're not looking to work with external agencies.
€60,000 - €80,000 monthly
