IT Support Services Senior Data Engineer

Apply on the employer's website

What will your job look like?

The team

Our BI teams and data scientists deliver high-quality analytical models and dashboards to enable descriptive and diagnostic capabilities, focusing our efforts on improving predictive and forecasting abilities across all business groups.

As the “Data Lake” team, based in the Netherlands and Malaysia, we support these analytical efforts and products on a global level. We work closely with each other, regardless of location.

We speak the same language, whether that is Python, SQL, or whatever meets the need. Together we are building a state-of-the-art AWS-based data landscape, consisting of a data lake and a high-performance data warehouse, built largely on standard AWS services (S3, Redshift, CloudWatch, etc.).

Your role in the team

As Senior Data Engineer, you ensure our data pipelines are built to perform and scale. You manage our existing PySpark-based pipelines, running on standard AWS building blocks like S3, Lambda and EMR. You are responsible for orchestrating pipelines using Airflow, as well as driving continuous improvement of our CI/CD pipelines and data access configuration. Besides developing and operating your own pipelines, you will also review, test and help ensure the quality of those developed by your teammates. You explore new solutions and perform root-cause analysis when production data pipelines or CI/CD pipelines break down.
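The orchestration work described above centres on Airflow DAGs: tasks executed in dependency order. As a rough, stdlib-only illustration of that core idea (the task names below are invented for this sketch; a real pipeline would declare them as Airflow operators):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the upstream tasks it
# depends on. In Airflow this graph would be declared with operators
# and the >> dependency syntax; here we only illustrate the ordering.
pipeline = {
    "extract_s3": set(),
    "transform_spark": {"extract_s3"},
    "load_redshift": {"transform_spark"},
    "refresh_dashboards": {"load_redshift"},
}

def run_pipeline(dag):
    """Run tasks in topological (dependency-respecting) order."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        print(f"running {task}")
    return order

order = run_pipeline(pipeline)
```

An orchestrator like Airflow adds scheduling, retries and monitoring on top of this ordering, but the dependency graph is the heart of every DAG you would build and debug in this role.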

Your challenges

Your eagerness to get yourself started is key. There will be no one to tell you how to do your job or which solutions to follow. You should be keen to drive your own initiatives and ideas, and take the lead in educating the business on what to do with their data puzzles.

You take the lead on external consultants’ deliveries. You should feel responsible for quality (even if it was not built by you). You will be challenged constantly to find the balance between assuring an “Always On” production landscape and driving innovation with a fail-fast attitude.

Your profile

To succeed in this role, you should have demonstrable knowledge of both data engineering and some DevOps / Platform engineering skills. You should recognize these skills in your toolbox:

  • Proven experience with AWS services
  • Extensive experience with Python, PySpark, SQL and Linux scripting
  • Good understanding of CI/CD fundamentals, preferably using AWS CodeCommit and CodePipeline
  • Ideally, some familiarity with Airflow and knowing how to build and debug DAGs
  • If you already know how access controls are managed, e.g. using AWS IAM roles coupled with Microsoft Active Directory services, that is a big plus! If not, you will learn.

As an international company, our primary business language is English and we often work with remote teams. Bringing in excellent communication skills and good cultural awareness will accelerate your success!

Our offer

  • Competitive salary and benefits
  • Flexibility at work (yes you can work from home too)
  • 25 days of holiday, plus 1 extra free day every four working weeks
  • Free cheese and milk at lunch
  • Learning opportunities (on-the-job training, conferences, and certifications when they add value to your professional growth)
  • A team of talented and passionate data enthusiasts as sparring partners

*Please note that this is a permanent position; freelance contracting is not possible.

Hours per week: full-time



Fiona Wang
Phone number unknown