





**Our Vision**

We believe in a world where everyone, regardless of their country's wealth or borders, enjoys access to medicines and healthcare when they need them.

**Our Mission**

We work tirelessly to remove the access barriers that patients and caregivers across low- and middle-income countries (LMICs) face when seeking quality medicines and quality healthcare.

**Who We Are: Accelerating Access to Medicines for All**

Imagine a world where critical medicines are within reach, affordable, and synonymous with quality, no matter where you are. That is the world we're building with our unique demand-aggregation model, which unites healthcare providers across LMICs.

At the heart of our identity is a single, resolute commitment: to build a future where geography and income never stand between any individual and life-saving medicines. Our platform isn't merely a space for transactions; we unlock affordable access to medicines by aggregating demand across healthcare providers, and we help manufacturers build a sustainable, reliable global access strategy. Our business focuses on low- and middle-income countries and strengthens the purchasing power of these markets to ensure the supply of cost-effective, high-quality medicines. We partner exclusively with authorized, high-quality pharmaceutical companies and certified caregivers to accelerate sustainable access to medicines.

**About the Position:**

We are seeking a Data Engineer to advance and strengthen our data platform, ensuring scalable data availability, harmonisation, and analytics across the organisation. This role sits at the center of our digital ecosystem and directly powers our marketplace, AI products, and operational insights. You will operate autonomously and thrive in ambiguity, shaping data architecture, leading analytics delivery, and building AI-enabled data workflows.
A significant portion of your time will be spent enabling analytics through Metabase or similar BI systems, while also designing and maintaining the pipelines, models, and infrastructure required for high-quality data at scale. This role requires strong leadership, excellent stakeholder management, and the ability to say no when necessary to maintain focus and data quality. You will collaborate closely with engineering, product, and operations teams to deliver reliable, accessible, and insight-driven data systems.

**Key Responsibilities:**

**Data Platform & Pipeline Development**

* Build and maintain pipelines and data models using dbt
* Manage data ingestion, transformation, and harmonisation
* Develop Python-based workflows for structured and unstructured data
* Lead architectural improvement proposals with engineering peers

**Analytics Development**

* Build dashboards, datasets, and analytics layers in Metabase or similar BI tools
* Ensure trusted, consistent analytics for internal and external users
* Support product and operations teams with reliable decision-support data

**AI-Enabled Workflows**

* Use AI-in-the-loop workflows for development and pull requests
* Apply AI-assisted coding tools (Cursor or similar)
* Implement NLP and text-extraction features
* Process and structure unstructured data at scale

**Infrastructure, Orchestration & Deployment**

* Use orchestration tools for data workflows
* Build CI/CD for data using GitHub Actions
* Work with Docker-based deployments
* Operate within cloud environments (GCP/AWS)

**Stakeholder Alignment & Leadership**

* Partner with engineering, product, and operations teams
* Communicate clearly about data quality, limitations, and priorities
* Manage stakeholder expectations with clarity and confidence
* Say no when necessary to protect data quality and engineering focus

**Requirements:**

* 8–10 years of data engineering experience
* Strong Python development experience
* Strong experience with dbt
* Strong experience with Metabase (a major portion of the role) or similar BI tools
* Strong SQL and data-modelling skills
* Experience with orchestration tools (Airflow, Dagster, or Prefect)
* Experience with NLP, text extraction, and unstructured-data workflows
* Familiarity with AI-assisted development workflows
* Strong data and AI foundations
* Experience with B2B platforms
* Experience with Streamlit, n8n, and LLM/agent development
* Experience with GCP/AWS, Docker, and CI/CD (GitHub Actions)
* Operates autonomously and thrives in ambiguity
* Strong stakeholder management, leadership, and the ability to say no

**What is in it for you?**

At Axmed, we believe in creating a supportive and rewarding environment where our team can thrive. Here's what we offer:

* **Unlimited PTO**: Take the time you need to recharge and maintain work-life balance.
* **Monthly wellness allowance**: Prioritize your health and well-being with extra support.
* **Paid parental leave**: Time off to bond with your new family member without any added stress.
* **Flexible working hours**: Enjoy the freedom to structure your workday in a way that suits your lifestyle.
* **Annual off-site retreats**: Connect with the team and build lasting relationships during our company retreats.
* **Fully remote work**: Work from anywhere in the world and join our distributed team.
* **The opportunity to make a difference**: Be part of a mission-driven company working to improve healthcare equity.
* **Competitive salaries**: We offer a compensation package that reflects your skills and experience.
* **Plenty of room for growth**: We believe in nurturing talent and offering opportunities for professional development and advancement.


