# Data Flow Engineer

> SquareDev · Warsaw, Poland (Hybrid) · Full-time · Posted 2026-05-11

**Workplace:** hybrid

**Department:** Consulting

## Description

**Why are you looking for a job?**

If your answer ticks all the boxes below, this could be the start of a great collaboration.

-   You have a curious mind - You won't understand what we're talking about if you don't 🤔
-   You want to learn more about technology - You won't survive if you don't 😱
-   You want to make the world a bit better - We won't like you if you don't 😎

We happen to be just like that as well. We like hacking things here and there (you included) and creating scalable solutions that bring value to the world.

### **SquareDev? 🐿️**

We use state-of-the-art technology to build solutions for our customers and our partners' customers. We make sure we stay best-in-class by participating in **research projects across Europe**, collaborating with **top universities and enterprises** on **AI, Data, and Cloud**.
### **About QnR Group**

SquareDev is a member of the **QnR Group**, a leading technology organization specializing in end-to-end custom software solutions, Artificial Intelligence, Cybersecurity, SAP S/4HANA, SAP Business One, ServiceNow, and FinTech solutions.

As part of QnR Group's ongoing expansion — both in Greece and internationally — we are continuously hiring across a wide range of tech roles. Successful candidates may be hired by QnR Group, or another company within the Group, depending on the role and project.

### **Role overview**

We are looking for a **Data Flow Engineer** to join a project with one of our public-sector clients in Warsaw. In this role, you will design and run complex data pipelines that move, transform and deliver data across our systems. Your main toolbox will be Apache NiFi (Cloudera DataFlow), together with Kafka, Iceberg and the wider Cloudera Data Platform. You will work closely with data engineers, architects and business teams to make sure our data flows are reliable, secure and well documented.

## Requirements

**The ideal candidate will be responsible for:**

-   Designing, building, testing and maintaining complex data flows in Cloudera DataFlow (Apache NiFi) — ingest, transform, enrich, route and deliver data.
-   Building and tuning Change Data Capture (CDC) pipelines in real time or near real time, using NiFi together with Kafka, Debezium or SQL-based CDC connectors.
-   Connecting external systems through REST APIs, JDBC, Kafka and other protocols.
-   Managing data schemas in Avro and keeping metadata and lineage clean in Apache Atlas.
-   Setting up security and governance for data flows through Apache Ranger policies.
-   Monitoring pipelines, setting up alerts and fixing performance or reliability issues.
-   Working with data engineers, architects and business stakeholders to gather requirements and shape the architecture of data flows.
-   Writing and maintaining up-to-date SOPs, runbooks and technical documentation.
-   Taking part in upgrades and migrations of CDP, NiFi and Kafka.

**To excel in this role, you'll need:**

-   At least **6 years** of relevant experience.
-   Bachelor’s or Master’s degree in Computer Science, Engineering or a related technical field.
-   At least one of the following certifications: Cloudera Certified Developer for Apache NiFi, a Cloudera DataFlow (CFM)-related certification, or equivalent.
-   Skills in designing, building and maintaining complex flows in Apache NiFi, **with 2-3 years of daily hands-on work**, ideally in a CDP environment, and at least one large delivered integration project where NiFi was the central tool.
-   Strong Python skills for data processing, custom NiFi logic, automation and integrations.
-   Solid experience with REST API integrations — endpoint calls, OAuth/JWT, rate limiting and error recovery.
-   Hands-on experience building CDC pipelines to and from relational databases, using native NiFi processors, connectors and SQL Builder.
-   Practical knowledge of Apache Iceberg (tables, schema evolution, partitioning) and its integration with NiFi, Spark or Flink, preferably in CDP.
-   Experience with data governance in CDP — Apache Atlas for metadata, lineage and tagging, and Apache Ranger for security policies and audit on NiFi flows.
-   Experience with Apache Kafka as a message broker (topics, producers and consumers, schema registry, NiFi integration) and Apache Avro for serialization and schema evolution.
-   English at **B2 level (CEFR)** or higher.

## Apply

[Apply at SquareDev](https://apply.workable.com/squaredev/j/C08973C654/apply)

