
MIGx is hiring a Data Engineer



**Job Title:** Data Engineer

**Company:** MIGx

**Location:** Batumi

**Job Description:**

MIGx is seeking a Data Engineer to join its Data and AI Engineering team. This role involves building and managing ETL/ELT pipelines, developing cloud-native data solutions, and implementing DataOps practices.

**Responsibilities:**

* Build and manage ETL/ELT pipelines using tools like Databricks, dbt, PySpark, and SQL.
* Contribute to scalable data platforms across cloud environments (Azure, AWS, GCP).
* Implement and maintain CI/CD workflows using tools such as GitHub Actions and Azure DevOps.
* Apply DataOps principles: pipeline versioning, testing, lineage, deployment automation, and monitoring.
* Integrate automated data quality checks, profiling, and validation into pipelines.
* Ensure strong data observability via logging, metrics, and alerting tools.
* Collaborate on infrastructure as code for data environments using Terraform or similar tools.
* Connect and orchestrate ingestion from APIs, relational databases, and file systems.
* Work in agile teams, contributing to standups, retrospectives, and continuous improvement.

**Requirements (Must have):**

* Experience with cloud-native data engineering using tools such as Databricks, dbt, PySpark, and SQL.
* Comfort working with at least one major cloud platform (Azure, AWS, GCP).
* Hands-on experience with CI/CD automation, especially with GitHub Actions or Azure Pipelines.
* Strong Python programming skills for transformation, scripting, and automation.
* Working knowledge of data quality, validation frameworks, and test-driven data development.
* Familiarity with observability practices including metrics, logging, and data lineage tools.
* Understanding of DataOps concepts, including reproducibility, automation, and collaboration.
* Team-first mindset and experience in agile environments (Scrum or Kanban).
* Professional working proficiency in English.

**Requirements (Nice to have):**

* Experience with Snowflake or similar cloud data warehouses.
* Knowledge of data lineage tools and frameworks.
* Infrastructure automation using Terraform, Bash, or PowerShell.
* Exposure to data modeling techniques like Data Vault or dimensional modeling.
* Familiarity with data testing tools.
* Understanding of GxP or other healthcare data regulations.
* Experience with non-relational data systems (e.g., MongoDB, CosmosDB).

**What we offer:**

* Excellent compensation package
* Family Insurance Package
* Modern office in a very good location
* Opportunities for career development and the chance to shape the company's future
* An employee-centric culture
* A range of training programs for personal and professional development
* Work in a fast-growing, international company
* Friendly atmosphere and a supportive management team

**Address:** Batumi, Tbel Abuseridze street 5-a

**To apply:** Submit your application via the provided link: https://migx.jobs.personio.com/job/1556143?_pc=2776478#apply
