Azure Data Engineer | EU Institution | Remote from EU
- Remote
- Brussels, Brussels, Belgium
- Trasys International
Remote from anywhere in EU | Freelance B2B | Azure Data Engineer | Microsoft Fabric, Synapse, PySpark, REST APIs
Job description
Who are we?
Trasys International is a dynamic global organization that takes pride in being a trusted partner of the EU Institutions. With a strong commitment to excellence and a 30-year track record of delivering high-quality solutions, we are dedicated to supporting the growth and success of our clients. Our mission is to help our clients keep up with the challenges of digital transformation by providing the right talent at the right time for the right job. To this end, we are constantly looking for talented professionals who are interested in working on challenging international projects and able to deliver high-quality results in multicultural environments. Our services include (but are not limited to) modernization of solutions, digital workspaces, cloud technologies, and IT security. Our headquarters are in Brussels, and we have active accounts and offices across Europe (e.g. Luxembourg, Amsterdam, Athens, Stockholm, Geneva).
For one of our esteemed clients in Brussels, Belgium, we are currently looking for an Azure Data Engineer.
Please note that, although the role can be performed remotely from any EU location, you will be expected to attend onboarding at the client’s premises in Brussels.
You will be mainly responsible for…
Develop, deploy, and maintain scalable and incremental data pipelines from REST APIs and databases using Python, PySpark, Azure Synapse, KNIME, SQL, and ETL tools to ingest, transform, and prepare data.
Process and transform complex JSON and GIS data into structured datasets optimized for analysis and reporting. This includes parsing, transforming, and validating JSON data to ensure data quality and consistency.
Load, organize, and manage data in Azure Data Lake Storage and Microsoft Fabric OneLake, ensuring accessibility, performance, and efficient storage using lakehouse and Delta Lake patterns.
Document ETL processes, metadata definitions, data lineage, and technical specifications to ensure transparency and reusability.
Collaborate with data analysts, BI developers, and business stakeholders to understand data requirements and deliver reliable, well-documented datasets aligned with organizational needs.
Implement data quality checks, logging, monitoring, and automated incremental load mechanisms within data pipelines to support maintainability, observability, and troubleshooting.
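To make the responsibilities above concrete, here is a minimal, stand-alone sketch of the core pattern they describe: parsing a JSON payload, applying data quality checks with logging, and filtering incrementally on a watermark. Plain Python standard library is used rather than PySpark so the sketch is self-contained; the field names (`id`, `modified`) are hypothetical, not from the posting.

```python
import json
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

REQUIRED_FIELDS = ("id", "modified")  # hypothetical schema, for illustration only

def validate(record):
    """Data quality check: every required field present and non-empty."""
    return all(record.get(f) not in (None, "") for f in REQUIRED_FIELDS)

def incremental_load(payload, watermark):
    """Parse a JSON API payload and keep only records newer than the watermark.

    ISO-8601 timestamps compare correctly as strings, so the watermark can be
    tracked lexicographically. Records failing validation are logged and
    skipped, supporting the observability/troubleshooting goal.
    """
    records = json.loads(payload)
    clean, new_watermark = [], watermark
    for rec in records:
        if not validate(rec):
            log.warning("rejected record: %r", rec)  # quality/monitoring hook
            continue
        if rec["modified"] > watermark:  # incremental filter
            clean.append(rec)
            new_watermark = max(new_watermark, rec["modified"])
    return clean, new_watermark
```

In a real Synapse or Fabric pipeline, the persisted watermark would drive the next API request (e.g. a `modified-since` query parameter), and the cleaned records would land in Delta Lake tables rather than a Python list.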
Job requirements
Bachelor's degree (EQF level 6)
Ability to understand, speak, and write English (C1/C2); French (B2) is an advantage
Excellent knowledge of data engineering tools: Azure Synapse Analytics, Microsoft Fabric, PySpark, and Python
Excellent knowledge of working with REST APIs, including ingestion and parsing of JSON and GIS data
Excellent knowledge of Azure Data Lake Storage and Oracle databases
Experience designing incremental loads, CDC processes, and automated schema evolution
Ability to implement robust data quality checks, logging, and monitoring in ETL processes
Ability to document ETL workflows, metadata, and technical specifications clearly and consistently
Familiarity with DevOps and version control best practices. Experience with CI/CD pipelines
Experience working in an Agile and Scrum framework
Analytical and problem-solving skills
Good communication skills and the ability to participate in technical meetings
Ability to participate in multilingual meetings
Ability to work in a multicultural environment on multiple large projects
Excellent team player
Specific expertise (mandatory)
At least 5 years of expert-level knowledge of Azure Data Lake Storage, Microsoft Fabric OneLake, and Oracle databases
At least 5 years of expert-level experience developing data pipelines from REST APIs and with integration tooling (such as Azure Synapse, PySpark, Microsoft Fabric, Python, SQL, KNIME)
At least 5 years of expert-level experience processing JSON and GIS data
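As an illustration of the JSON/GIS processing expertise named above, the following is a minimal sketch (standard-library Python only, hypothetical input) that flattens a GeoJSON FeatureCollection into tabular rows suitable for a lakehouse table:

```python
import json

def flatten_geojson(doc):
    """Flatten a GeoJSON FeatureCollection into flat dict rows.

    Each feature's properties become columns; Point coordinates become
    explicit longitude/latitude fields. Other geometry types are kept as
    JSON strings for downstream handling.
    """
    rows = []
    for feature in json.loads(doc)["features"]:
        row = dict(feature.get("properties") or {})
        geom = feature.get("geometry") or {}
        if geom.get("type") == "Point":
            # GeoJSON orders coordinates as [longitude, latitude]
            row["longitude"], row["latitude"] = geom["coordinates"]
        else:
            row["geometry_json"] = json.dumps(geom)
        rows.append(row)
    return rows
```

In practice the same flattening would run inside PySpark (e.g. over a DataFrame of raw API responses) before writing Delta Lake tables, but the parsing logic is the same.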
Certificates (mandatory)
Microsoft Azure Data Engineer Associate
The following certifications are a plus:
Microsoft Certified: Azure Solutions Architect Expert
Microsoft Certified: Azure Developer Associate
Microsoft Certified: Azure Database Administrator Associate