




Data & ETL Developer

Job Summary: We are seeking a Data & ETL Developer to build a career in data engineering, responsible for data cleansing, transformation, and standardization, ensuring quality and consistency.

Key Highlights:

1. Integration into a multidisciplinary team dedicated to process automation.
2. Development and maintenance of data transformation processes.
3. A personalized career plan with training and continuous learning.

WayOps is looking for a Data & ETL Developer eager to build a professional career in data engineering, handling data cleansing, transformation, and standardization to ensure data quality, availability, and consistency across international operations.

PROJECT & TEAM

You will join a multidisciplinary team, collaborating with technical profiles on web application and instant-messaging development within a unit dedicated to business process automation. Your primary responsibilities will include developing and maintaining data transformation processes, participating in quality control and inconsistency detection, gathering business requirements, and contributing to the creation of data dictionaries and catalogs.

FUNCTIONS & RESPONSIBILITIES

**On a day-to-day basis, you will actively participate in tasks such as:**

* Ingestion: extracting data from corporate sources, databases, and third-party services via APIs.
* ETL: cleansing, transforming, and standardizing data following Medallion-architecture best practices.
* Governance: data quality control and inconsistency detection; documentation and communication with business stakeholders.
* Pipeline orchestration: building and monitoring automated data flows.
* Technical documentation: producing documentation for processes, data flows, and technical structures to support project scalability.

REQUIREMENTS & EXPERIENCE

To be considered, you should bring foundational expertise in data technologies and strong motivation to specialize in data engineering environments.
Essential skills and knowledge include:

* Degree in engineering, mathematics, statistics, or a related field.
* 1–3 years of experience in a similar role.
* Proficiency in Python for data transformation and SQL for data querying.
* Data transformation using Pandas or similar libraries.
* Analytical capability to work with large volumes of information.
* A structured mindset, attention to detail, and a technical problem-solving orientation.

**Additionally, the following will be valued positively:**

* Data transformation using Big Data tools such as PySpark.
* Knowledge of data platforms and ETL tooling such as Snowflake.
* Familiarity with cloud environments (AWS, Azure, Google Cloud).
* Experience working with Agile methodologies.
* Knowledge of the pharmaceutical sector.

CONTRACT & LOCATION

Collaboration will preferably be as a self-employed professional, under annual contracts with tacit renewal, within a long-term project designed to offer stability and continuity. For candidates demonstrating an exceptional technical or cultural fit, indefinite-term salaried employment may be considered.

This is a full-time position (40 hours/week). The project requires close collaboration with the team, so a hybrid work model applies (attending the client's offices in Madrid/Sanchinarro three days per week) to facilitate communication and project momentum.

ABOUT WAYOPS

WayOps is a technology consultancy specializing in digital, data-driven, and cognitive transformation. We are passionate about working on innovative projects leveraging cutting-edge technologies, always within cloud environments and adhering to clean-code best practices. All our professionals, whether self-employed contractors or salaried employees, receive a personalized career plan, reflecting our commitment to individual development through investment in training, continuous learning, and opportunities to take on new challenges within real-world, disruptive projects.
**At WayOps, we believe in an accountability model:** responsibility and reward. Our work is grounded in competence and trust: facing challenges with leadership, pursuing excellence, adapting to the pace of business, and delivering tangible impact.


