Intetics Inc. is a global technology company providing custom software application development, distributed professional teams, software product quality assessment, and “all-things-digital” solutions. Based on its proprietary business model of Remote In-Sourcing®, advanced Technical Debt Reduction Platform (TETRA™), and measurable quality management platform (Predictive Software Engineering), Intetics enables clients to achieve measurable business results.
**Position:** Data Engineer / Apache Airflow Specialist
**Level:** Senior
**Technologies:** Apache Airflow, Python, Flask, Elasticsearch, Unix/Linux, Oracle, PostgreSQL, GitLab
**Workload:** 1100 hrs/year
**Location:** Remote — work from anywhere
**English level:** Advanced
**Education:** Technical degree
**Role Description**
Intetics is looking for an experienced **Data Engineer / Apache Airflow Specialist** to join our distributed team for a data-driven project focused on large-scale ETL workflows, data indexing, and performance optimization.
You will design, implement, and optimize data pipelines using Apache Airflow, manage database performance on Oracle and PostgreSQL, and support Elasticsearch integration for efficient data retrieval and search.
You will also keep GitLab deployment pipelines running smoothly and collaborate with a cross-functional engineering team to improve the quality and reliability of complex data processes.
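To make the pipeline work concrete, below is a minimal sketch of the kind of ETL DAG this role maintains, written with Airflow's TaskFlow API (Airflow 2.x). The DAG id, schedule, and task bodies are illustrative assumptions, not project code; a real pipeline would read from Oracle/PostgreSQL and push to Elasticsearch via provider hooks.

```python
# A minimal ETL DAG sketch using Airflow's TaskFlow API (Airflow 2.x).
# DAG id, schedule, and task bodies are illustrative assumptions only.
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["etl", "sketch"],
)
def example_etl_pipeline():
    @task
    def extract() -> list[dict]:
        # A real pipeline would read from Oracle/PostgreSQL via provider
        # hooks; stubbed with a static row here.
        return [{"id": 1, "value": "raw"}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Normalize records before loading.
        return [{**row, "value": row["value"].upper()} for row in rows]

    @task
    def load(rows: list[dict]) -> None:
        # A real pipeline would bulk-index into Elasticsearch or write
        # back to the warehouse; printed here for illustration.
        for row in rows:
            print(f"loaded record {row['id']}")

    load(transform(extract()))


example_etl_pipeline()
```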
**Requirements**
**What You’ll Do:**
* Develop, orchestrate, and maintain complex Apache Airflow DAGs for ETL and data-processing pipelines.
* Build and optimize Python-based ETL scripts, integrating with Flask APIs when needed.
* Design and manage Elasticsearch indexing and performance-tuning workflows (see the indexing sketch after this list).
* Handle Unix/Linux scripting and operations for automation and monitoring.
* Work with Oracle and PostgreSQL databases for large-scale data processing.
* Implement and maintain GitLab CI/CD pipelines for build, test, and deploy stages.
* Collaborate with the project team to ensure scalability, reliability, and quality of data solutions.
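As a flavor of the Elasticsearch work referenced above, here is a minimal indexing-and-search sketch using the official elasticsearch-py client; the cluster URL, index name, and document fields are hypothetical placeholders, not project specifics.

```python
# A minimal indexing/search sketch using the official elasticsearch-py
# client (8.x). The cluster URL, index name, and document fields are
# hypothetical placeholders.
from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk

es = Elasticsearch("http://localhost:9200")  # assumed local dev cluster


def index_records(records: list[dict], index: str = "example-records") -> None:
    """Bulk-index transformed records so they become searchable."""
    actions = (
        {"_index": index, "_id": rec["id"], "_source": rec}
        for rec in records
    )
    bulk(es, actions)


def find_by_value(value: str, index: str = "example-records") -> list[dict]:
    """Full-text match on the hypothetical 'value' field."""
    resp = es.search(index=index, query={"match": {"value": value}})
    return [hit["_source"] for hit in resp["hits"]["hits"]]
```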
**What We’re Looking For:**
* ≥ 3 years of Apache Airflow DAG orchestration.
* ≥ 5 years of Python with an ETL focus; Flask API experience is a plus.
* ≥ 3 years of Elasticsearch (data indexing & optimization).
* ≥ 3 years of Unix/Linux scripting & operations.
* ≥ 3 years with Oracle or PostgreSQL (ideally both).
* ≥ 3 years of GitLab pipelines (build/test/deploy).
* Advanced English and a technical degree.
**Nice-to-Have / Bonus:**
* Experience with Great Expectations or similar data-quality tools.
* Airflow on Kubernetes.
* Proven performance-tuning experience.
