The One Planet strategy we are committed to here at Eneco sets an ambitious goal: to be climate neutral by 2035. We want to achieve that goal both for ourselves and for our customers. To make it happen, we are dedicated to offering our customers innovative digital capabilities and smart solutions. Our Digital Core team is working to create an exceptional online customer experience by modernizing the Eneco chat, app and web environments. We strive to deliver a superior digital customer experience that encourages our customers to become greener every day and makes it easier for them to do so.
As a Data Engineer, you will play a crucial role in setting up and leading technical decisions for our cloud-based data platform. We are specifically looking for someone who will contribute across cloud infrastructure setup, API server maintenance, and the development of streaming and batch data processing pipelines. You will be working on exciting IoT products (smart thermostats, energy insights, smart charging) for our consumers.
Responsibilities
- Setting up projects and leading technical decisions involving real-time time-series data in Databricks (Scala) environments; an illustrative sketch follows this list.
- Empowering other departments by making data accessible and usable, driving Eneco’s digital innovations forward.
- Designing and implementing cloud solutions to meet product requirements.
- Shaping the product by providing technical advice to the product manager and other teams.
- Ensuring our solutions are robust, scalable, cost-efficient and ready to meet future challenges.
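To give a flavour of this work, below is a minimal, hedged sketch of a Structured Streaming job in Scala of the kind that could run in a Databricks environment: it reads thermostat telemetry from a stream, aggregates it per device over 15-minute windows, and appends the result to a Delta table. The Kafka topic, event schema, checkpoint path, and target table are illustrative assumptions, not details of Eneco's actual platform.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

// Illustrative sketch only: topic, schema, paths and table names are assumptions.
object ThermostatStreamJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("thermostat-telemetry-stream").getOrCreate()
    import spark.implicits._

    // Assumed raw event shape: { "deviceId": "...", "ts": "...", "tempC": 21.5 }
    val eventSchema = StructType(Seq(
      StructField("deviceId", StringType),
      StructField("ts", TimestampType),
      StructField("tempC", DoubleType)
    ))

    // Ingest JSON events from a (hypothetical) Kafka topic.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")   // assumed broker address
      .option("subscribe", "thermostat-telemetry")        // assumed topic name
      .load()
      .select(from_json($"value".cast("string"), eventSchema).alias("e"))
      .select("e.*")

    // Tolerate up to 10 minutes of late data, then aggregate per device and window.
    val perDevice15m = events
      .withWatermark("ts", "10 minutes")
      .groupBy($"deviceId", window($"ts", "15 minutes"))
      .agg(avg($"tempC").alias("avgTempC"), count(lit(1)).alias("readings"))

    // Append finalized windows to a Delta table (name is an assumption).
    perDevice15m.writeStream
      .format("delta")
      .outputMode("append")
      .option("checkpointLocation", "/mnt/checkpoints/thermostat")  // assumed path
      .toTable("iot.thermostat_15m")
      .awaitTermination()
  }
}
```

In a Databricks setup with Unity Catalog, the sink would typically be a governed catalog table, and the same aggregation logic could be reused for batch backfills.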
Requirements
- Previous experience with REST API development (e.g. Spring or FastAPI); see the API sketch after this list.
- Understanding of streaming data ingestion and processing.
- Previous experience working with MPP data platforms such as Spark. Working experience using Databricks and Unity Catalog is a plus.
- Proficiency in programming languages (Java, Scala, and Python).
- Knowledge of software engineering best practices: code reviews, version control, testing, and CI/CD.
- Genuine interest in DevOps/SRE principles for production deployment.
- Working experience with high volume time series data.
- Knowledge of data modeling and architecture patterns.
- Experience deploying applications to Kubernetes, with skills in monitoring (Grafana) and debugging.
- Knowledge of a cloud provider (e.g. AWS); infrastructure as code (IaC) experience is a plus.
- Experience with NoSQL databases (e.g. DynamoDB) and RDBMS (e.g. Postgres).
- Proficiency in SQL and DBT (Data Build Tool) with Snowflake.
- Familiarity with, or interest in, MLOps and data science techniques.
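On the API side, the posting names Spring and FastAPI as examples; purely as an illustration in Scala (to match the pipeline sketch above), here is a hedged sketch of a small read-only endpoint using Akka HTTP, a common Scala choice rather than necessarily the team's stack. The route, port, and response payload are assumptions.

```scala
import akka.actor.typed.ActorSystem
import akka.actor.typed.scaladsl.Behaviors
import akka.http.scaladsl.Http
import akka.http.scaladsl.server.Directives._
import scala.concurrent.ExecutionContext
import scala.io.StdIn

// Illustrative sketch only: route, port and payload are assumptions.
object InsightApi {
  def main(args: Array[String]): Unit = {
    implicit val system: ActorSystem[Nothing] = ActorSystem(Behaviors.empty, "insight-api")
    implicit val ec: ExecutionContext = system.executionContext

    // GET /devices/{deviceId}/insights returns a hard-coded example payload;
    // a real service would query a serving store (e.g. Postgres or DynamoDB).
    val route =
      pathPrefix("devices" / Segment / "insights") { deviceId =>
        get {
          complete(s"""{"deviceId":"$deviceId","avgTempC":21.4,"window":"15m"}""")
        }
      }

    val binding = Http().newServerAt("0.0.0.0", 8080).bind(route)
    println("Listening on http://localhost:8080 -- press ENTER to stop")
    StdIn.readLine()
    binding.flatMap(_.unbind()).onComplete(_ => system.terminate())
  }
}
```

A service like this would typically be containerized and deployed to Kubernetes, with Grafana dashboards over its metrics, in line with the deployment and monitoring skills listed above.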
You will be working together with other Data Engineers, Machine Learning Engineers, Data Scientists and Data Analysts. Together, you will shape IoT products that will transform how our consumers use their energy. Within the team, we encourage learning, actively seek out collaboration, celebrate successes, and learn from failures.
Contact our recruiter: Venetia.dewit@eneco.com
Impact & Why Join
- Make real-world impact at scale by building cloud-native, real-time data platforms that power Eneco’s IoT products and directly help millions of customers use energy more sustainably.
- Own end-to-end technical decisions across streaming, APIs, and cloud infrastructure, working with modern stacks such as Databricks, Spark, Scala, and Kubernetes in a highly autonomous engineering role.
- Grow with a collaborative, future-focused team where Data Engineers, ML Engineers, and Product work closely together to innovate, learn, and shape the next generation of smart energy solutions.
€60,000 - €80,000 per year
