We are seeking an experienced and strategic Data Engineer to design and develop scalable data solutions in a high-stakes financial environment. This role requires deep technical expertise in Databricks, SQL, and cloud-based data warehousing, along with a strong understanding of financial data domains, regulatory compliance, and data governance.
What you will do
- Implement robust, scalable ETL/ELT pipelines using Databricks (Spark, Delta Lake) and SQL to support complex financial data workflows
- Design data models and warehouse schemas optimized for performance, auditability, and regulatory compliance
- Collaborate with data architects, analysts, and business stakeholders to translate financial requirements into technical solutions
- Contribute to data quality, lineage, and governance initiatives across the data platform
- Integrate data from core banking systems, trading platforms, market data providers, and regulatory sources
- Optimize data processing for cost-efficiency, latency, and scalability in Azure
- Implement and maintain CI/CD pipelines, automated testing, and monitoring for data workflows
- Stay current with emerging technologies and evaluate their applicability to the financial data ecosystem
- Help the team migrate to the new combined DAP platform
What we offer you
Our people are the driving force behind our organization. We value the knowledge and expertise you bring. We believe that your commitment can take our organization to a higher level. We offer you:
- Plenty of training and learning opportunities
Who you are
We are looking for someone with:
Required Qualifications
- Bachelor's in Computer Science, Engineering, Finance, or a related field
- 2+ years of experience in data engineering, including at least 2 years in a senior or lead role
- Proven expertise in:
  - Databricks (Spark, Delta Lake, Unity Catalog, MLflow)
  - SQL (T-SQL, Spark SQL) for complex transformations and performance tuning
  - Python or Scala for data processing and automation
  - Cloud platforms (Azure) and ETL orchestration tools (e.g., Azure Data Factory)
- Strong understanding of financial data domains: trade data, market data, risk metrics, compliance reporting, and financial KPIs
- Experience with data governance, metadata management, and security controls in regulated environments
- Familiarity with DevOps practices, infrastructure as code, and CI/CD pipelines
Preferred Qualifications
- Exposure to machine learning pipelines in Databricks for financial modeling or fraud detection
- Certifications in Azure Data Engineering, Databricks, or financial data management
- Experience leading cross-functional data initiatives or platform migrations
Who you will work with
At NN Bank, you will work within the mortgage data lake domain, where our mission is to collect and deliver high-quality data to our customers in the Mortgage, Finance, and Risk domains for risk modelling and reporting.
In the Data lake team, collaboration is key. We engage with other teams to achieve the best possible results, and everyone is very accessible and willing to help. Our team is composed of diverse, international individuals with varying backgrounds and areas of expertise.
Any questions?
If you have any questions about the job or the process, you can reach out by email to Bianca Schaareman (Talent Acquisition Specialist) at bianca.schaareman@nn-group.com.
The salary ranges from €3,908 to €5,583.
