




Design, deploy, and operate scalable, secure data platforms (Data Warehouse / Data Lake / Lakehouse), including batch and streaming pipelines, orchestration, code repositories, secret management, and containerized execution.

Responsibilities

1\. Data infrastructure architecture and construction
  * Define and deploy the Data Lake / Data Warehouse / Lakehouse (e.g., S3, Redshift/BigQuery/ClickHouse)
  * Data modeling

2\. Beyond ETL/DAGs
  * Understand and configure the execution systems behind Airflow
  * Integrate the Data Lake/DWH, the orchestrator, Git repositories, and secret management (AWS Secrets Manager); see the illustrative sketches after this posting

3\. Infrastructure deployment (DevOps)
  * Infrastructure as Code and CI/CD

4\. Containers and execution
  * Dockerize jobs/algorithms and run them through Airflow (DockerOperator or KubernetesExecutor); sketch below

Requirements

* Experience working with DWHs and data modeling
* Experience with end-to-end data analytics projects
* CI/CD

Benefits

* Hybrid work model
* Career development plans and training in technical and soft skills
* Flexible compensation system (restaurant and transportation vouchers)
* Weekly hours dedicated to learning and discussing innovative technologies
* Opportunity to join an innovative data science startup with exciting projects underway
* Extracurricular activities: paintball, afterworks, laser tag, beach volleyball, and much more
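To make the "containers and execution" responsibility concrete, here is a minimal sketch of a containerized Airflow task. It assumes Airflow 2.x with the `cncf.kubernetes` provider installed; the DAG id, namespace, image, and module path are hypothetical placeholders and not part of the posting.

```python
"""Minimal sketch: running a Dockerized job as a Kubernetes pod from Airflow.

Assumptions (not from the posting): Airflow 2.x, cncf.kubernetes provider,
and placeholder names for the DAG, namespace, image, and entrypoint.
"""
from datetime import datetime

from airflow import DAG
from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator

with DAG(
    dag_id="daily_ingest",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Execute the job inside its own container/pod rather than on the scheduler host.
    ingest = KubernetesPodOperator(
        task_id="ingest_raw_events",
        name="ingest-raw-events",
        namespace="data-pipelines",                    # assumed namespace
        image="registry.example.com/ingest:latest",    # assumed image
        cmds=["python", "-m", "ingest.main"],          # assumed entrypoint
        get_logs=True,
    )
```

The same pattern applies when the deployment uses the KubernetesExecutor instead of an operator: the job is packaged as an image, and Airflow only schedules and monitors it.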

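For the secret-management integration mentioned under "Beyond ETL/DAGs", a short sketch of reading pipeline credentials from AWS Secrets Manager with boto3. The secret id and its JSON keys are hypothetical placeholders.

```python
"""Minimal sketch: fetching DWH credentials from AWS Secrets Manager.

Assumptions (not from the posting): the secret is stored as a JSON string,
and the secret id and keys shown here are placeholders.
"""
import json

import boto3


def get_warehouse_credentials(secret_id: str = "prod/dwh/redshift") -> dict:
    """Fetch a JSON secret and return it as a dict of connection settings."""
    client = boto3.client("secretsmanager")
    response = client.get_secret_value(SecretId=secret_id)
    return json.loads(response["SecretString"])


if __name__ == "__main__":
    creds = get_warehouse_credentials()
    # e.g. creds["host"], creds["user"], creds["password"] feed the DWH connection
```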

