TBC Bank Uzbekistan, the first digital bank in Uzbekistan and part of TBC Bank Group PLC, commenced operations in 2020. Today, the bank is growing rapidly, offering retail banking solutions through its innovative fintech platform, and over 4 million registered users access banking services via its mobile application. "Making People's Life Easier" is the core principle of TBC Bank's operations. By advancing digital technologies across all spheres of life, TBC Bank is dedicated to building a bank free from queues, physical offices, bureaucracy, complex terminology, and lengthy waits.
We are seeking a highly skilled DataOps Engineer to join our growing team. You will play a critical role in designing, implementing, and maintaining our data pipelines, ensuring the smooth and reliable flow of data throughout the organization. This role requires a strong understanding of data engineering principles and experience with modern data stack tools such as Airflow, Kafka, and Debezium, along with Change Data Capture (CDC) patterns.
Responsibilities:
- Design, develop, and maintain data pipelines using Airflow, orchestrating data movement and transformation processes (an illustrative sketch follows this list).
- Implement stream processing solutions using Apache Kafka for real-time data ingestion and delivery.
- Configure and manage Debezium connectors to capture change data from various databases and integrate it into the data pipeline.
- Develop and maintain data quality checks and monitoring tools to ensure data integrity and consistency.
- Automate data pipeline deployment, testing, and configuration management using infrastructure as code (IaC) tools.
- Collaborate with data analysts, data scientists, and software engineers to understand data needs and translate them into efficient data pipelines.
- Troubleshoot and resolve issues related to data pipelines, ensuring timely and accurate data delivery.
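To give a flavour of this kind of work, below is a minimal, illustrative Airflow DAG sketch, assuming Airflow 2.x. It is not the bank's actual pipeline: the DAG id "cdc_demo" and the tasks "extract_changed_rows" and "load_to_warehouse" are hypothetical placeholders standing in for steps that would, in practice, read CDC events and load them into a warehouse.

```python
# Hypothetical sketch of an Airflow DAG (Airflow 2.x assumed); names are illustrative.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_changed_rows(**context):
    # Placeholder: a real task would read CDC events (e.g. from Debezium
    # topics in Kafka) for the scheduled interval.
    return [{"id": 1, "status": "updated"}]


def load_to_warehouse(**context):
    # Placeholder: pull the extracted rows from XCom and write them to the
    # target warehouse or data lake.
    rows = context["ti"].xcom_pull(task_ids="extract_changed_rows")
    print(f"Loading {len(rows)} changed rows")


with DAG(
    dag_id="cdc_demo",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(
        task_id="extract_changed_rows",
        python_callable=extract_changed_rows,
    )
    load = PythonOperator(
        task_id="load_to_warehouse",
        python_callable=load_to_warehouse,
    )
    extract >> load
```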
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field (Master's degree preferred).
- 5+ years of experience in a data engineering or data operations role.
- Strong experience with Apache Airflow, including DAG development, scheduling, and monitoring.
- In-depth knowledge of Apache Kafka and its ecosystem, including producers, consumers, and connectors.
- Experience with Debezium and its use for Change Data Capture (CDC) (see the consumer sketch after this list).
- Familiarity with data warehousing and data lake architectures.
- Experience with scripting languages like Python or Bash for data processing tasks.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Ability to work independently and as part of a team.
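As a further illustration of the Kafka and CDC experience we are looking for, here is a minimal sketch of consuming Debezium change events from Kafka, assuming the confluent-kafka Python client. The broker address, topic name "pg.public.customers", and consumer group are hypothetical examples; a production consumer would also handle schemas, dead-letter topics, and offset management.

```python
# Hypothetical sketch of reading Debezium CDC events from a Kafka topic.
import json

from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # assumed broker address
    "group.id": "cdc-demo-consumer",          # illustrative consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["pg.public.customers"])   # illustrative Debezium topic

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        # Debezium envelopes carry the operation type ("c", "u", "d") and the
        # row state before and after the change.
        payload = event.get("payload", event)
        print(payload.get("op"), payload.get("after"))
finally:
    consumer.close()
```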
Working with us means:
- An open atmosphere and vibrant team life without unnecessary bureaucracy or dress code;
- Medical insurance with extended coverage after the probation period;
- Coverage for conference participation, discounts for English language courses, and access to educational platforms (Coursera, Udemy);
- Internal meetups: lightning & thunder talks;
- High-performance equipment and additional monitors for productive work;
- Class “A” office in the city center;
- A multicultural team (Georgia, Russia, Poland, Kazakhstan, Kyrgyzstan, Ukraine, Tajikistan);
- Referral program "Bring a Friend" with bonus opportunities;
- Coffee points for snacks at any time;
- Team-building and sports events that foster team spirit.