For KPN, Harvey Nash is looking for a Medior Data Engineer

Start date: 01-11-2025

End date: 01-06-2026

Location: Amersfoort

Hours: 37 per week

Data Business Value TDO is the department responsible for transforming data from IT systems into usable data products. Its customers are located on the network side of the company. This work is carried out by multiple scrum teams.

DataFANs is the DevOps team responsible for all strategic and tactical data related to the rollout of KPN Fiber. The team contributes to accelerating the fiber rollout at optimal cost.
In addition to daily operational responsibilities, the team focuses on three core tasks:

  1. Creating value for stakeholders by unlocking and integrating data.
  2. Developing advanced analytics use cases to generate insights and optimizations.
  3. Migrating data to a new platform, including performing User Acceptance Testing (UAT).


Technical Skills
* Language Skills: Proficient in both Dutch and English, written and spoken.
* Cloud Expertise: Hands-on experience with Microsoft Fabric, including dataflows, pipelines, notebooks, Lakehouse and warehouse environments, and Azure Data Lake, as well as PySpark for large-scale data processing.
* ETL/ELT Development: Experience in developing, maintaining, and optimizing ETL/ELT pipelines using Informatica PowerCenter / Informatica Cloud Data Integration (CDI).
* Architecture Knowledge: Strong understanding of data warehouse concepts, Data Mesh principles, big data, and modern data architecture best practices.
* Programming: Practical coding expertise in Python or PySpark for data processing and transformation.
* SQL Proficiency: Experienced in writing complex SQL queries, performing in-depth data analysis, and optimizing query performance for efficiency and scalability.
* Data Modelling: Knowledge of designing efficient data models and structures across dimensional star and snowflake schemas and Lakehouse architectures.

Soft Skills
* Demonstrates a strong team-oriented and collaborative mindset, contributing effectively in cross-functional environments.
* Quick learner with a proactive attitude toward adopting new technologies and tools.
* Analytical mindset with keen attention to detail and strong problem-solving capabilities.
* Good communication skills and a willingness to work closely with stakeholders.

Responsibilities:
* Design, develop, and optimize data pipelines and workflows to support efficient ETL/ELT processes.
* Work collaboratively within an Agile team to develop, test, and release user stories, ensuring timely, high-quality delivery.
* Support day-to-day operations during weekdays, ensuring data systems run efficiently and issues are resolved promptly.
* Contribute ideas and collaborate closely with team members to improve data architecture, processes, and overall delivery.
* Participate in requirements gathering, solution design, and data analysis to support data-driven decision-making.

Apply