




**Company Description**

***Why Talan?***

**Talan – Positive Innovation**

Talan is an international consulting group specializing in innovation and business transformation through technology. With over 7,200 consultants in 21 countries and a turnover of €850M, we are committed to delivering impactful, future-ready solutions.

**Talan at a Glance**

Headquartered in Paris and operating globally, Talan combines technology, innovation, and empowerment to deliver measurable results for our clients. Over the past 22 years, we have built a strong presence in the IT and consulting landscape, and we are on track to reach €1 billion in revenue this year.

**Our Core Areas of Expertise**

* **Data & Technologies:** We design and implement large-scale, end-to-end architecture and data solutions, including data integration, data science, visualization, Big Data, AI, and Generative AI.
* **Cloud & Application Services:** We integrate leading platforms such as SAP, Salesforce, Oracle, Microsoft, AWS, and IBM Maximo, helping clients transition to the cloud and improve operational efficiency.
* **Management & Innovation Consulting:** We lead business and digital transformation initiatives through project and change management best practices (PM, PMO, Agile, Scrum, Product Ownership), and support domains such as Supply Chain, Cybersecurity, and ESG/Low-Carbon strategies.

We work with major global clients across diverse sectors, including Transport & Logistics, Financial Services, Energy & Utilities, Retail, and Media & Telecommunications.

**Job Description**

As a **Senior Data Lead Engineer**, your mission will be to lead the design, evolution and operation of **cloud-based Data, AI & BI platforms**, enabling scalable, secure and high-quality data products that drive business value. You will play a key role in defining the data roadmap, mentoring engineering teams and delivering advanced analytics and AI use cases in a complex, large-scale environment.

We need someone like you to take on the following responsibilities:

* Lead the **Data, AI & BI roadmap**, ensuring scalability, resilience, security and cost efficiency.
* Design, evolve and operate **cloud data lakehouse architectures**.
* Define and build **domain-oriented data products** aligned with data mesh principles (data-as-a-product, SLAs, ownership).
* Build and maintain **data ingestion, ETL and transformation pipelines**, including CDC-based and event-driven architectures.
* Integrate **cloud platforms with on-premise data platforms** in hybrid environments.
* Implement and enforce **data governance, data quality rules and data guardrails**.
* Deliver high-quality, well-modelled datasets and **semantic layers** for BI, reporting and analytics.
* Enable **AI/ML and LLM use cases** (feature engineering, training, RAG, fine-tuning, monitoring).
* Promote engineering best practices and act as a **technical leader and mentor** for data, ML and BI engineers.
* Collaborate with product, technology and business teams to prioritise and deliver high-impact initiatives.

**Qualifications**

EXPERIENCE

* 5+ years of experience in **Data Engineering, Data Platforms, AI Engineering or Advanced Analytics**.
* Proven experience designing and building **cloud data platforms** and lakehouse architectures (preferably AWS).
* Hands-on experience with **Databricks or EMR** for large-scale data processing.
* Strong background in **data ingestion, ETL and CDC-based pipelines**.
* Experience working with **hybrid architectures** (on-premise + cloud).
* Experience enabling **AI/ML solutions in production**.
* Experience collaborating with BI teams and business stakeholders on data modelling and KPI definition.

EDUCATION

* Bachelor's degree or higher in **Computer Science, Engineering, Mathematics** or a related technical discipline.
* Additional training in Data Engineering, AI/ML or Analytics is a plus.

SKILLS & KNOWLEDGE

* **AWS:** S3, Lake Formation, Glue, EMR.
* **Databricks:** Spark (PySpark/Scala), Delta tables, MLflow, feature store, performance optimisation.
* Strong **SQL and Python** skills for data processing and automation.
* Data formats and lakehouse concepts: **Parquet, Iceberg/Delta**, curated layers.
* Experience with **data quality, lineage, observability and monitoring**.
* Knowledge of **CDC patterns** and event-driven ingestion.
* Understanding of **data mesh principles** and federated governance.
* Experience supporting **ML workflows** (feature engineering, training, deployment, monitoring).
* Knowledge of **LLMs** (prompt engineering, fine-tuning, RAG, evaluation and guardrails).
* Strong understanding of **BI concepts**, semantic modelling and analytics consumption.
* Practical experience with **data governance, data quality rules and data guardrails**.

SOFT SKILLS

* Strong communication skills, with the ability to explain complex data and AI topics to technical and non-technical audiences.
* Ability to influence and align multiple teams without direct authority.
* Proven leadership and mentoring capabilities.
* Proactive, hands-on and outcome-oriented mindset.
* Collaborative and adaptable in complex environments.

OTHER INFORMATION / NICE TO HAVE

* AWS or Databricks certifications.
* Experience with orchestration tools and **CI/CD for data and ML pipelines**.
* Knowledge of **Infrastructure as Code**.
* Experience with BI tools such as **QuickSight, Power BI or Qlik**.
* Experience working with **Agile methodologies** (JIRA, Confluence).

**Additional Information**

What do we offer you?

* Hybrid position based in **Málaga, Spain**.
* Permanent, full-time contract.
* Training and continuous career development.
* Opportunity to work in a **multicultural team** on international projects.

If you are passionate about **data, AI and cloud technologies**, we want to meet you!

Talan Spain is committed to non-discrimination on the grounds of gender, race, ideology, or any other reason, in accordance with the company's "Equality Plan" and the current regulations on equality between women and men (Royal Decree-Law 6/2019).

#LI-CL1


