
Danzas is hiring Data QA Engineer



**Application Deadline:** 6 October 2025

**Department:** Development & Engineering

**Location:** GE – Tbilisi

**Description**

**About the job**

**ALL CANDIDATES MUST BE LOCATED IN THE REPUBLIC OF GEORGIA**

**About Intermedia**
Are you looking for a company where **YOUR VOICE** is heard? Where you can **MAKE A DIFFERENCE**? Do you **THRIVE** in a **FAST-PACED** work environment? Do you wake every morning **EXCITED** to work with **GREAT PEOPLE** and create **SUCCESS** **TOGETHER**? Then Intermedia is the place for you.

Intermedia has established itself as a leading provider of cloud communications and collaboration tech that allows companies to connect better. We have a strong track record of growth, profitability, and creating an environment where everyone matters. Everyone. While we are fast-paced and admittedly a bit intense, we promise that you won’t be bored. You will find Intermedia is a place where you can indulge your passion for creating and supporting great cloud technology. What’s more, we always look to promote from within and have many employees who have been with us 10, 15, and 20+ years!

**Culture at Intermedia is built on teamwork and transparency. We hold each other accountable and always have each other's back!**

Are you ready to make your mark?

**Job description:**
**About the Role**
We are looking for a Data QA Engineer to ensure data quality across our stack (MSSQL, Snowflake, Power BI, Python). You will maintain standards, implement monitoring, and coordinate UAT to deliver accurate, reliable data.

**Key Responsibilities**

* Data Quality: Maintain standards and acceptance criteria; Create test cases and scenarios; Document requirements; Ensure transformations meet standards
* Monitoring: Track pipeline health; Monitor quality metrics and KPIs; Implement anomaly detection and alerting
* UAT: Design test plans for new data products; Coordinate and execute UAT; Validate reporting outputs; Document and resolve issues
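
As a flavor of the monitoring work above, anomaly detection on pipeline metrics can be as simple as flagging days whose row counts deviate sharply from a recent baseline. This is an illustrative sketch only; the window size, threshold, and sample counts are invented for the example and are not part of this posting.

```python
# Illustrative sketch: flag days whose pipeline row counts deviate sharply
# from the mean of the preceding window (a basic z-score-style check).
from statistics import mean, stdev

def detect_anomalies(daily_counts, window=7, threshold=3.0):
    """Return indices of days whose count is more than `threshold`
    standard deviations from the mean of the preceding `window` days."""
    anomalies = []
    for i in range(window, len(daily_counts)):
        baseline = daily_counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma and abs(daily_counts[i] - mu) > threshold * sigma:
            anomalies.append(i)
    return anomalies

counts = [1000, 1020, 990, 1010, 1005, 995, 1015, 40]  # day 7 collapses
print(detect_anomalies(counts))  # → [7]
```

In practice such a check would feed an alerting channel (email, Slack, PagerDuty) rather than a print statement.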

**Core Requirements**

* 3-5 years QA/data quality experience
* Test case creation and implementation (manual + automated)
* Basic automation experience required
* Strong SQL for data validation
* Basic scripting (Python or similar)
* Defect tracking (like JIRA, Azure DevOps)
* QA methodologies
* Version control (Git)
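
To illustrate "strong SQL for data validation": validation checks are commonly written as queries that return zero rows on clean data. The sketch below uses Python's built-in `sqlite3` as a stand-in for MSSQL/Snowflake, and the `orders` table is invented for the example.

```python
# Hypothetical example of SQL-driven data validation. sqlite3 stands in for
# MSSQL/Snowflake; the `orders` table and its rows are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, 10, 99.5), (2, NULL, 12.0), (3, 11, -4.0);
""")

# Each check is a query that should return zero rows on clean data.
checks = {
    "customer_id is never NULL":
        "SELECT id FROM orders WHERE customer_id IS NULL",
    "amount is non-negative":
        "SELECT id FROM orders WHERE amount < 0",
}
failures = {name: [row[0] for row in conn.execute(sql)]
            for name, sql in checks.items()}
print(failures)  # → {'customer_id is never NULL': [2], 'amount is non-negative': [3]}
```

The same zero-rows-on-success convention is what frameworks like dbt use for their tests, which makes hand-written checks easy to migrate later.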

**Data Platform Skills**

* Snowflake or similar data warehouse
* ETL/ELT processes
* Data lineage and UAT coordination
* Data quality tools (like Great Expectations, dbt tests, DataFold)
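
For context on the tools listed above: libraries such as Great Expectations package data-quality checks as declarative "expectations" with structured results. The helper below is a simplified plain-Python illustration that only echoes that style; it is not the actual Great Expectations API.

```python
# Simplified, hypothetical illustration of the declarative checks that tools
# like Great Expectations or dbt tests provide; the function name and result
# shape only echo that style and are not a real library API.
def expect_column_values_not_null(rows, column):
    """Check that `column` is populated in every row (rows = list of dicts)."""
    unexpected = [r for r in rows if r.get(column) is None]
    return {"success": not unexpected, "unexpected_count": len(unexpected)}

rows = [{"id": 1, "email": "a@example.com"}, {"id": 2, "email": None}]
print(expect_column_values_not_null(rows, "email"))
# → {'success': False, 'unexpected_count': 1}
```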

**Soft Skills**

* Analytical problem-solving
* Attention to detail
* Prioritization
* Cross-functional collaboration

**Nice to Have**

* BDD/TDD with frameworks (Cucumber, Behave, pytest)
* API testing
* BI tools (Power BI, Tableau)
* Advanced Python/R
* CI/CD (GitHub Actions, Jenkins)
* Agile/Scrum

**What We Offer**

* Modern data stack (Snowflake, dbt)
* Business-critical impact
* Professional development opportunities

**Key Challenge**
Implement and maintain a quality framework that ensures accurate data flow from source systems through Snowflake to reporting, with early issue detection and strong stakeholder coordination.

