Freelance Data Engineer (Azure Data Lake / Databricks) – 6 months – €NEG per hour (DOE)

Noord Holland
Freelance
€NEG per hour (DOE)

For a long-term client of mine, I’m looking for a Freelance Data Engineer with strong experience in Azure Data Lake, Databricks, and data pipeline engineering.

You’ll support the transition of a business-built solution into a centrally managed data platform, ensuring it is scalable, secure, and maintainable.

Experience in FMCG, retail, or e-commerce environments is highly desirable, given the scale and complexity of the data.

This is a hands-on role where you’ll work closely with both business and IT stakeholders to deliver a clean, production-ready solution.


Responsibilities

  • Build and optimise data pipelines (Azure Data Lake / Databricks)

  • Support migration into a central data platform

  • Ensure compliance with engineering, security, and governance standards

  • Improve reliability, structure, and maintainability of data workflows


Requirements

  • Strong experience as a Data Engineer

  • Hands-on experience with Azure Data Lake and Databricks

  • Proven data pipeline engineering experience

  • Strong stakeholder communication skills

  • Fluent English required


Nice to Have

  • Experience in FMCG, retail, or e-commerce environments

  • Experience working with high-volume / complex data platforms


Details

  • Contract: 6 months

  • Start: April – June

  • Location: Noord Holland (2 days per week onsite, travel between locations required)


Interested? Please apply with your CV, rate, and availability.



Freelance Senior Data Engineer (Azure Fabric / Power BI) – 6 months – Up to €100,00 per hour

Amsterdam / Hybrid
Freelance
Up to €100,00 per hour

Freelance Senior Data Engineer (Azure Fabric / Power BI)

Data Platform & Reporting Infrastructure | 6-Month Contract

The Role

A client of mine is scaling its internal data platform and is looking for a Senior Data Engineer to take ownership of the data infrastructure and reporting layer.

This role focuses on Azure Fabric, Power BI, and data modelling, ensuring data is properly structured, usable, and able to support real business outcomes.


Key Responsibilities

• Own and improve data infrastructure within Azure Fabric

• Build and maintain data pipelines and workflows

• Design schemas and data models for reporting

• Enable and optimise Power BI reporting

• Build usable data products (dashboards / reporting layers)

• Work closely with stakeholders to deliver practical solutions


Must-Have Experience

• Strong Data Engineering / Analytics Engineering background

• Experience with Azure (Fabric preferred)

• Strong Power BI and data modelling experience

• Solid schema design capability

• Python for data processing

• Experience delivering end-to-end data solutions

• Strong stakeholder communication

• Dutch speaking


Nice to Have

• Scale-up experience

• Experience building data platforms


Contract Details

• Contract: 6 months (extension likely)

• Location: Hybrid (Amsterdam – Tuesdays preferred)

• Start: April 2026

• Process: Interviews over the next 2 weeks



Freelance Cloud / DevOps Infrastructure Engineer – 6 months – Up to €95,00 per hour

Rotterdam / Hybrid
Freelance
Up to €95,00 per hour

Freelance Cloud / DevOps Infrastructure Engineer

The Role

A client of mine is building out the infrastructure supporting a new automated trading platform and is looking for a Cloud / DevOps Infrastructure Engineer to help design and operate the core platform environment.

The role focuses on ensuring reliable connectivity between Databricks, market data providers, execution systems, and internal monitoring tools, while establishing the infrastructure foundations for a greenfield trading environment.

This is a hands-on DevOps / SRE role where automation, observability, and operational reliability are critical.

Key Responsibilities

Platform & Environment Management

• Configure and manage Databricks environments across development, testing, and production
• Implement Infrastructure as Code (Terraform) to support reproducible environments
• Establish environment separation for strategy development, testing, and live trading

External Integration

• Integrate external data feeds and execution APIs
• Design and implement secure authentication and secrets management
• Enable automated deployment and execution of trading strategies
• Ensure reliable data pipelines and platform connectivity

Reliability & SRE

• Implement monitoring, logging, and alerting across the trading platform
• Define SLAs / SLOs for strategy execution and data availability
• Support incident response and root cause analysis
• Improve platform resilience and operational stability

Automation & Governance

• Build CI/CD pipelines for platform and strategy deployments
• Implement cluster governance and infrastructure policies
• Monitor capacity, performance, and infrastructure usage

Must-Have Experience

• Strong Azure cloud infrastructure experience
• Terraform / Infrastructure as Code expertise
• Experience operating Databricks in production
• Experience integrating external APIs in high-reliability environments
• Strong understanding of monitoring, alerting, and SRE practices
• Fluent English

Nice to Have

• Experience in finance, trading, energy trading, or other time-sensitive systems
• Experience supporting data or ML platforms
• Experience working on greenfield platform builds

Contract Details

• Contract length: 6 months (extension possible)
• Working model: Hybrid (Netherlands)
• Process: Interviews starting early April
• Applications: CVs being reviewed over the next 2 weeks

If you’re interested, please apply with your CV, rate, and availability.

