





Job Function: Software & Cloud

The role: You will participate as a **Data & AI Architect** in projects being carried out.

What will you do?

* You will be responsible for designing and establishing the appropriate data architecture for our cloud projects (typically AWS and GCP). This involves understanding business and technical requirements, identifying storage, processing, and data analysis needs, and designing scalable and efficient solutions using AWS and GCP cloud services.
* You will work closely with Data Engineers and Data Scientists to understand client requirements and challenges, providing technical guidance and advice on implementing data solutions and ensuring that industry best practices and standards are followed.
* Evaluate and select the most suitable AWS and GCP services and tools for each project. This requires understanding the strengths and limitations of cloud services, as well as staying up to date with the latest trends and features in cloud data engineering and science.
* Design and develop efficient and reliable data pipelines using AWS and GCP data processing and storage capabilities. This includes integrating different data sources, data transformation, data cleansing, and loading into data warehouses or analytics platforms.
* Provide technical advice to Data Engineers and Data Scientists regarding algorithm selection, data modeling techniques, and analysis strategies. Participate in code reviews and ensure the quality and robustness of implemented solutions.
* Stay in continuous contact with third-party solutions that complement the Data & AI ecosystem, keeping the team aware of these solutions.

What we need to see from you:

If you have over **4-5 years of experience in the Data and AI field, especially as a data architect in cloud environments,** we are looking for you. We need you to have a strong technical background in **Data and AI**, along with the ability to understand business functional requirements.
We want to see in you:

* A solid understanding of data architecture design principles and best practices, including selecting appropriate data models, choosing data warehouses, and implementing efficient data pipelines.
* Experience designing and implementing data lakes, delta lakes, and/or data warehouses.
* In-depth knowledge of services and tools provided by **AWS** (Amazon Web Services) **and/or GCP** (Google Cloud Platform). This includes data storage services such as S3, Cloud Storage, and BigQuery; data processing services such as AWS Glue and Cloud Dataflow; and analytics and machine learning services such as Amazon Redshift and BigQuery ML.
* **Programming languages:** Experience with Python.
* **Database technologies:** Familiarity with different types of databases, such as relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra).
* **ETL/ELT and data processing:** Knowledge of extract, transform, load (ETL) or extract, load, transform (ELT) techniques and tools used to manipulate and transform data. Experience using tools such as AWS Glue, GCP Dataflow, or Apache Spark for distributed data processing.
* **MLOps:** Experience implementing MLOps practices.
* **Data analysis and visualization:** Understanding of data analysis concepts and techniques, including the use of popular tools such as Tableau, QuickSight, Looker, or Power BI for data analysis and visualization.
* **Security and regulatory compliance:** Knowledge of security practices and measures to protect sensitive data and ensure regulatory compliance.
* **Best practices and standards:** Familiarity with best practices and standards in data engineering and data science, including data quality management, data governance, code documentation, and collaborative teamwork.
* **Communication skills:** Ability to communicate effectively with both technical and non-technical teams.
* If your experience is primarily in Azure and you have limited experience with AWS or GCP, we may still consider your qualifications.

Why SoftwareOne?

SoftwareOne is a global leader in software and cloud solutions, redefining how businesses build, buy, and manage everything in the cloud. By helping customers migrate and modernize their workloads and applications, while simultaneously navigating and optimizing the resulting software and cloud changes, SoftwareOne unlocks the value of technology. The company's 8,900 employees are driven to deliver a portfolio of 7,500 software brands with sales and delivery capabilities across 60 countries. Headquartered in Switzerland, SoftwareOne is listed on the SIX Swiss Exchange under the symbol SWON. Visit us at https://www.softwareone.com/es-es


