D&AI
Remote & On Site
Contract
Start date: 7 Apr 2026
You will develop and maintain secure, compliant, and high-performing data pipelines and data stores using Azure services—supporting modern data warehouse (MDW), big data, and lakehouse architectures.
Tasks:
• Design and implement data storage
• Develop data processing
• Implement a partition strategy for files in Azure Synapse Analytics
• Transform data by using Apache Spark and/or Transact-SQL (T-SQL) in Azure Synapse Analytics
Conditions:
• Azure Synapse workspace (all aspects), SQL, Python
• Nice to have skills: Airflow, Spark Delta Table Libs
Project details:
• Start: ASAP
• Duration: 6 months (+ option to extend)
• Utilisation: 2 days per week
• Location: Wuppertal, NRW
• Language: German
Telephone interview slots with our client can be arranged at short notice, with a quick decision to follow.
If you are interested in receiving further information about the role or can recommend a colleague or friend, please get in touch.