




Job Summary:
We are seeking a Data Engineer with experience in Big Data technologies, programming, and database management for data projects.

Key Highlights:
1. Experience with Big Data technologies and data modeling.
2. Advanced programming skills (Python, SQL, Bash/Shell).
3. Knowledge of cloud data platforms (Azure, AWS, GCP) is a plus.

MANDATORY REQUIREMENTS:
- Bachelor’s degree in Computer Science, Telecommunications, Industrial Engineering, or other IT-related fields.
- Programming knowledge: Python, REST APIs, Bash/Shell scripting, SQL, and NoSQL.
- Data modeling expertise in Big Data technologies (e.g., Impala, Hive, HBase, Solr).
- Familiarity with technologies for database management, data transformation, data flow automation, data ingestion, data mining, ETL, real-time data processing, data buffering, cloud computing, or data visualization.
- UNIX operating system knowledge: Debian/Ubuntu, CentOS/RHEL.

DESIRED REQUIREMENTS:
- Experience with cloud data platforms (Azure, AWS, or GCP) and tools such as Databricks, Spark, or Airflow.
- Knowledge of DevOps/DataOps, continuous integration, and automated deployment.
- Skills in data modeling, data quality, and data governance.
- Technical certifications in data engineering or cloud (e.g., Azure DP-203, AWS Data Analytics).

REQUIRED QUALIFICATION:
Bachelor’s degree in Computer Engineering, Telecommunications, Mathematics, or equivalent practical experience in data projects.

YEARS OF EXPERIENCE IN THE REQUESTED PROFILE:
Minimum 2–3 years of experience in data projects.


