




**Job Summary**

Devoteam G Cloud is seeking a GCP Data Engineer for a major banking client, focused on developing and maintaining data models, optimizing SQL queries, and collaborating on data solutions.

**Key Responsibilities**

1. Development and maintenance of data models in BigQuery.
2. Optimization of SQL queries for data extraction and transformation.
3. Collaboration in the initial design of analytical tables and data structures.

**Company Description**

Devoteam is a leading European consultancy focused on digital strategy, technology platforms, cybersecurity, and business transformation through technology. Concentrating on six areas of expertise, we address our clients' strategic challenges: Digital Business & Products, Data-driven Intelligence, Distributed Cloud, Business Automation, Cybersecurity, and Sustainability enabled by Digitalization.

Technology is in our DNA, and we believe in it as a lever for driving change and improvement, striking a balance that lets us provide our clients with top-tier technological tools while ensuring the proximity and professionalism of a team that guides them throughout the journey.

Our **30 years of experience** make us an innovative, established, and mature consultancy. We continuously develop our **12,000 professionals**, certifying our consultants in the latest technologies and counting on experts in Cloud, BI, Data Analytics, Business Process Excellence, Customer Relationship Management, Cybersecurity, Digital Marketing, Machine Learning, and Software Engineering and Development.

Devoteam has been awarded Partner of the Year 2021 by the five leading cloud providers: **AWS, Google Cloud, Microsoft, Salesforce, and ServiceNow.**

**Job Description**

Devoteam G Cloud is looking for a GCP Data Engineer for an important client in the banking sector.

**Responsibilities**:

* Assist in the **development and maintenance of simple data models** under supervision, primarily in **BigQuery**.
* Write and optimize **SQL queries** for data extraction and transformation, focusing on efficiency and data quality.
* Collaborate with the team to understand **data requirements** and help implement basic ingestion and transformation solutions.
* Participate in the **initial design of fact tables** and analytical data structures, learning about business needs.
* Document data transformation processes and the SQL queries developed.

**Requirements**:

* Interest in and/or initial experience with data engineering (internships, personal projects, courses).
* **Intermediate/advanced SQL proficiency**: ability to write complex queries (JOINs, window functions, basic optimization). **Mandatory**.
* **Preferred: knowledge of BigQuery** and familiarity with the Google Cloud Platform (GCP) environment.
* Understanding of fundamental **data modeling concepts** (e.g., the difference between fact and dimension tables).
* Proactivity, eagerness to learn quickly, and the ability to work collaboratively, seeking help when needed.

**Desirable**:

* Knowledge of or interest in data orchestration tools (Airflow, dbt) or automation scripting.
* Basic communication skills for reporting progress and issues to the technical team.
* Experience or familiarity with handling large volumes of data.
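To make the required skills concrete, here is a minimal illustrative sketch of the kind of query the role calls for: a JOIN between a fact table and a dimension table combined with a window function (a running total per customer). All table and column names are hypothetical, and SQLite is used here purely as a stand-in for BigQuery; it is an example of the skill set, not code from the client project.

```python
import sqlite3

# Hypothetical star-schema sample: one dimension table and one fact table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_txn (txn_id INTEGER, customer_id INTEGER,
                       amount REAL, txn_date TEXT);
INSERT INTO dim_customer VALUES (1, 'Ana'), (2, 'Luis');
INSERT INTO fact_txn VALUES
  (1, 1, 100.0, '2024-01-01'),
  (2, 1,  50.0, '2024-01-02'),
  (3, 2, 200.0, '2024-01-01');
""")

# JOIN the fact table to its dimension, then compute a per-customer
# running total with a window function ordered by transaction date.
rows = cur.execute("""
SELECT c.name,
       t.txn_date,
       t.amount,
       SUM(t.amount) OVER (PARTITION BY t.customer_id
                           ORDER BY t.txn_date) AS running_total
FROM fact_txn t
JOIN dim_customer c ON c.customer_id = t.customer_id
ORDER BY c.name, t.txn_date
""").fetchall()

for row in rows:
    print(row)
conn.close()
```

The same `SUM(...) OVER (PARTITION BY ... ORDER BY ...)` pattern works unchanged in BigQuery standard SQL; only the data-loading steps differ.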


