




**Job Summary**

We are seeking a data specialist with a strong technical foundation for a newly established team responsible for designing, building, and operating a data platform on Azure Databricks.

**Key Highlights**

1. Design and develop scalable data pipelines and transformations.
2. Ensure security, reliability, and observability of data processing.
3. Be part of a team with a career development and professional growth plan.

**Company Description**

Inetum is a European leader in digital services. The Inetum team of 28,000 consultants and experts strives daily to deliver digital impact for enterprises, public-sector entities, and society at large. Inetum’s solutions aim to drive customer performance and innovation, as well as serve the broader public interest. Present in 19 countries with an extensive network of delivery centers, Inetum collaborates with leading software vendors to address digital transformation challenges with proximity and flexibility. Driven by its ambition for growth and expansion, Inetum generated revenue of €2.5 billion in 2023.

**Job Description**

We are looking to hire a candidate with a strong technical background and enthusiasm for working within a newly formed team tasked with developing a new software product for a major international client. You will be responsible for designing, building, and operating the data platform on **Azure Databricks**, ensuring secure, reliable, and high-performance data ingestion and transformation.

**Your Day-to-Day Responsibilities**

* Design and develop scalable data pipelines and transformations on Azure Databricks.
* Implement technical data quality controls and ensure reliability and observability of data processing.
* Guarantee end-to-end platform security, including identity management, access control, and data protection.
* Manage data workload deployments via CI/CD.
* Support integration with source systems and analytical consumers, and contribute to preparation for testing and deployment.
**Requirements**

Candidates **must have experience** with the following technologies:

* **Azure Data Lake**
  + Build a Data Lake from scratch.
  + Mount folders/layers: RAW, BRONZE, SILVER, GOLD.
  + Configure access (permissions, roles, keys).
  + Understand how Delta Lake and versioning work.
  + Manage historical data retention (snapshots / time travel).
  + Optimize storage and partitioning.
* **Azure Data Factory**
  + Create pipelines to ingest data from SAP, P6, Excel, APIs, SFTP, etc.
  + Parameterize pipelines (variables, templates, loops).
  + Handle triggers: hourly, daily, weekly.
  + Manage error handling, logging, and retries.
  + Connect ADF with Key Vault to use secure credentials.
  + Orchestrate data loads and fully automate workflows.
* **Databricks**
  + Program in Python and/or SQL on Spark.
  + Create BRONZE, SILVER, and GOLD notebooks.
  + Clean, transform, and join data sources.
  + Implement simple business rules (e.g., “if EXW < PU, flag an error”).
  + Understand Delta Lake versioning.
  + Configure clusters and run jobs.
* **Interconnection** among Azure services: ADF, Databricks, SQL.
* Configuration and operation of **Azure security**.

Additionally, the following would be **advantageous**:

* Intermediate-to-advanced English level (B2–C1)
* Kafka / Spark Streaming
* Azure ML

**Additional Information**

**What We Offer**

* Company-sponsored training program to support your continued development and advancement within your personalized career path.
* Permanent contract and job stability.
* Flexible compensation and additional benefits.
* Flexible working hours.
* Hybrid remote work model; however, for this position, **100% remote work is possible**.
* Intensive (shortened) workday every Friday!
* Intensive summer schedule!
* A dynamic environment where your career path and professional growth are our top priorities.
* A positive, open, and inclusive work atmosphere.
* You will join a large team of professionals passionate about and motivated by technology.
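To illustrate the kind of “simple business rule” the Databricks requirement refers to, here is a minimal sketch in plain Python (on the platform itself this would typically run as a PySpark transformation). Only the column names `EXW` and `PU` come from the posting; the record layout and function name are hypothetical.

```python
# Sketch of the "if EXW < PU, error" data-quality rule mentioned in the posting.
# EXW and PU are column names from the posting; everything else is illustrative.

def check_exw_rule(record):
    """Return an error message if the record's EXW value is below its PU value,
    otherwise None (record passes the rule)."""
    if record["EXW"] < record["PU"]:
        return f"error: EXW {record['EXW']} < PU {record['PU']}"
    return None

rows = [
    {"id": 1, "EXW": 120.0, "PU": 100.0},  # passes the rule
    {"id": 2, "EXW": 80.0, "PU": 100.0},   # violates the rule
]

# Collect one result per row: None for valid rows, a message for violations.
errors = {row["id"]: check_exw_rule(row) for row in rows}
```

In a SILVER-to-GOLD notebook, a rule like this would usually be expressed as a filter or a flag column so that violating rows can be quarantined and logged rather than silently dropped.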
Are you ready to take on new challenges and further advance your professional career? Apply now; this could be the opportunity you’ve been waiting for!


