For KPN, Harvey Nash is looking for a DevOps Technical Solutions Architect

Start date: 3-2-2025
End date: 19-12-2025
Location: Amsterdam/Zoetermeer
Hours per week: 37
Deadline to respond: 24-01-2025, before 12:00

Your impact
We believe in data and the positive impact it can have on our business. While providing the best network in the Netherlands, KPN generates tremendous amounts of data in its applications and network components. It is critical for KPN to be able to monitor and analyze this data.

UDEX is KPN's internal Big Data platform for storing and exchanging all service and resource data. We need your help to grow and improve the platform by onboarding critical data sources, while also transforming towards a Data Mesh architecture and way of working.

Why
To make KPN a more data-driven company. It is essential that our data consumers can rely on always having the most recent data available, and on a stable platform to process data and create new insights.

Where
We work 3-4 days from home and 1-2 days at the KPN office. Currently this is in Zoetermeer, but KPN is moving to Amsterdam Sloterdijk in 2025.

Team

We are a combined DevOps team: 4 developers focusing mostly on Operations, 5 developers (including the Solutions Architect) focusing mostly on Innovation, plus a Product Owner and a Data Architect. You'll be part of the development team, focusing on complex challenges and innovation. We foster a highly collaborative environment because we are convinced that cooperation is essential for tackling these challenges.

Our Way of Working

Within the Data Office department of KPN we have adopted Agile and work according to the Scrum methodology. We are convinced of the value of "We Make-We Own" and have embedded DevOps in our development teams. We aim for the best results by hiring best-in-class developers and giving the teams as much autonomy as possible. We believe we have created an inspiring environment for personal development and a great and fun place to work!

Responsibilities

  • Onboarding and updating data deliveries on our Hadoop Datalake
  • Developing frameworks to process and deploy new data delivery methods or file formats (JSON, XML, CSV, ProtoBuf, Avro, Parquet, etc.)
  • Creating views and tables for our users with optimal performance for their use case
  • Connecting and interfacing with systems via APIs
  • Supporting both Development and Operations teams in quickly resolving complex issues that hinder the team and/or stakeholders
  • Coordinating with the source teams that produce data and with the consumers who use this data for their use cases
  • Supporting in the design, building, and testing of new services and extending existing services on our platform
  • Team collaboration on complex challenges, code reviews and testing new functionalities
  • Evaluating stakeholder requirements and guiding the innovation team towards the most suitable solution architecture
  • Identifying common development patterns and ensuring generic solutions and architecture are applied where possible
  • Taking the design lead for complex requirements that go beyond the team's standard work
  • Continuing to move the teams towards more automation and efficient ways of working, to improve quality, eliminate manual mistakes, and be more predictable with our deliveries

Education, Experience, and Licensing Requirements:

As an engineer, you ideally have the following qualifications and experience:

  • Extensive knowledge of data modelling: Data Warehousing, Relational DBs, and Hadoop Big Data stores
  • Hadoop, specifically HDFS, Hive, and SQL
  • Java/Scala and Spark, with 5+ years of general software design/development experience
  • Shell scripting / Shell commands
  • Git and Confluence
  • Bachelor's Degree in Computer Science, Information Systems or other area of academic study (IT experience can substitute for a Bachelor's Degree)
  • DevOps mindset
  • Able to learn quickly in a complex multitool environment

Extra assets are:

  • Python, Groovy, Ansible
  • HBase and Phoenix
  • Apache NiFi and Kafka
  • General hands-on experience with (Cloudera's) Hadoop Suite
  • Telecom domain knowledge, mainly Fault, Performance, Configuration, and Capacity Management data

How to respond