Data Engineer
Indeed
Full-time
Onsite
No experience limit
No degree limit
Carrer del Pont, 42, 17706 Pont de Molins, Girona, Spain
Description

Job Summary: Calsina Carré is seeking a Data Engineer to design and build the corporate data model within a medallion architecture on Databricks, ensuring data quality and performance.

Key Highlights:

1. Participate in building the corporate data platform.
2. Be part of a stable project at a leading company.
3. Collaborate with BI and business teams to define KPIs.

**Description:**
----------------

Do you have solid experience in data engineering and want to contribute to building a modern architecture within a digitally transforming enterprise? At Calsina Carré, a multinational leader in the logistics sector, we are looking for a Data Engineer to strengthen the Data & Analytics team within the IT department.

**Who We Are**

Calsina Carré is a company with over 50 years of experience in the logistics sector, headquartered in Pont de Molins (Figueres, Girona), with international presence across Europe and North Africa. We are currently undergoing a digital transformation and modernization of our data platform.

**What You’ll Do**

Reporting to the Head of Data, your mission will be to design and build the corporate data model within a medallion architecture on Databricks, ensuring quality, performance, and consistency. Your key responsibilities will include:

**Data Modeling**

* Define and maintain OLTP and OLAP models.
* Relational and dimensional modeling (normalization, star schema, snowflake schema).
* Ensure the model supports reporting, analytics, and operational needs.

**Medallion Architecture on Databricks**

* Design Bronze, Silver, and Gold layers following industry best practices.
* Implement Delta Lake tables, partitioning, optimizations, and time travel.
* Standardize ingestion and publication patterns.

**ETL/ELT Processes**

* Design, develop, and maintain integration pipelines.
* Automate data loads, transformations, and validations.
* Monitor job executions and ensure data availability and timeliness.
**Data Quality and Governance**

* Define quality rules and validate compliance.
* Document data models, lineage, and the data catalog.
* Guarantee data reliability, traceability, and consistency.

**Performance, Security, and Costs**

* Optimize queries and jobs (SQL tuning, partitioning, caching).
* Collaborate on security mechanisms and access control.
* Consider the cost implications of architecture and resource decisions.

**Collaboration with BI and Business**

* Work with the Power BI team to expose governed models.
* Support the definition of entities and KPIs.
* Translate business requirements into clear data requirements.

**Best Practices and Continuous Improvement**

* Apply DataOps/DevOps principles: Git, versioning, environments, deployments.
* Propose improvements to architecture, tools, and processes.
* Maintain and document existing models and pipelines.

This position allows remote work, with periodic travel to Pont de Molins expected.

**What We Offer**

* A stable project at a growing company and sector benchmark in logistics.
* Direct involvement in building the corporate data platform.
* Remote work opportunity.
* Competitive compensation based on experience and fit for the role.
* Permanent contract.

**Requirements:**
-----------------

**What We’re Looking For**

Mandatory experience (minimum 5 years):

* Advanced SQL: complex queries, joins, window functions, relational/analytical modeling.
* OLTP/OLAP data modeling (star schema, snowflake schema, normalization/denormalization).
* Solid hands-on experience with Azure (Databricks, Data Factory, Synapse, Fabric, or similar).
* Development using PySpark or Spark SQL.
* Knowledge of medallion architecture and Delta Lake.
* End-to-end ETL/ELT process design and operations.
* Experience with SQL Server, Azure SQL, or other relational databases.
* Understanding of Data Warehouse, Data Lake, and modern data architecture concepts.
* Experience integrating with BI tools (Power BI preferred).
* Proficiency with Git and CI/CD pipelines.
**Nice-to-Have**

* Experience with Agile methodologies.
* Participation in enterprise-scale data projects.

**Key Competencies**

* Analytical ability and structured thinking.
* Focus on quality and technical rigor.
* Autonomy to manage a backlog and make well-founded decisions.
* Teamwork with both technical and business stakeholders.
* Clear communication and the ability to explain complex concepts.
* Proactivity, documentation, and continuous improvement.
* Comfort working in hybrid/remote environments.

Source: Indeed
David Muñoz
Indeed · HR
