




**Job Summary**

The Specialist Data Engineer will maintain and enhance the Google Cloud Platform environment to ensure the smooth operation of the company's CORE processes, collaborating with business and BI teams.

**Key Highlights**

1. ETL/ELT process maintenance and optimization in GCP
2. Collaboration with business analysts and stakeholders
3. Implementation of best practices for data quality and security

**About the Role**

The **Specialist Data Engineer** will join the Business Intelligence team, maintaining the **Google Cloud Platform (GCP)** environment to ensure the correct functioning of the company's **CORE activity processes**. Maintenance includes Composer orchestration, creation of new processes, process re-engineering, and ongoing upkeep, ensuring timely and high-quality information availability.

**Key Responsibilities**

* Maintain and improve existing **ETL/ELT processes** in GCP.
* Monitor **pipeline execution** and resolve incidents promptly and effectively.
* **Optimize costs** and performance across GCP services (BigQuery, Cloud Storage, Dataflow, Composer, etc.).
* Manage and **evolve data models** and storage structures.
* **Collaborate with business analysts**, BI developers, and stakeholders to understand requirements and translate them into technical solutions.
* Implement best practices for **data quality, security, and versioning**.
* **Automate processes** and operational tasks wherever possible.
* Implement **alerts, metrics, and observability tools**.
* **Data Engineering**
  + Use cloud tools and platforms, especially **GCP** (**BigQuery, Cloud Storage, Looker Studio**, etc.).
  + Use programming languages for data manipulation.
* **Business Partnering**: Work hand-in-hand with the business unit to understand its needs, prioritize deliverables, and translate requirements into data models.
**What We're Looking For**

* Experience with **GCP**, especially:
  + **BigQuery**
  + **Cloud Composer / Airflow**
  + **Dataflow / Apache Beam**
  + **Cloud Storage**
* **Strong SQL skills** (tuning, modeling, complex queries).
* Experience with **ETL/ELT processes** and handling large volumes of data.
* **Python knowledge** applied to data processing.
* Version control (**GitHub**).
* **Bachelor's degree** in Engineering, Mathematics, Statistics, Economics, Data Science, or a related field.
* **Additional training** in data analytics or cloud computing is a plus.

**What We Offer**

* Flexible work: 60% remote and 40% office-based, with flexible hours!
* 20 additional remote workdays: Work from anywhere in Spain.
* 25 vacation days: Plus December 24th and 31st off, and the option to purchase up to 10 extra days.
* Restaurant vouchers: In addition to fixed compensation and a flexible remuneration plan.
* Free health insurance: Coverage via Adeslas, life insurance, and a comprehensive physical and mental health & wellbeing program.
* Pension plan: Secure your future with our pension plan options.
* 3 volunteer days per year: Dedicate time to volunteering initiatives with these additional days.
* Career development and training: Access to an AI-powered skill development platform and learning content from Skillsoft, MIT Horizon, Harvard, and more.
* Digital management app: Easily organize your daily office routine with our app.

If you'd like to learn more about us, visit our website to discover what it's like to work at Vodafone: https://www.vodafone.es/c/conocenos/es/vodafone-espana/trabaja-con-nosotros/


