Data Engineer
We’re FOTC – a team of cloud enthusiasts helping companies get the most out of Google Cloud and Google Workspace. Whether it's moving to the cloud, building smarter workplaces, using AI, or just making everyday work easier – we're here for it.

We’ve been around for over 10 years, and in that time we’ve worked with more than 6,500 companies in 50+ countries. Big names, small teams, startups, scaleups – you name it. From our offices in Wrocław, Warsaw, Bucharest, Budapest – or from wherever we’re working remotely – we help businesses grow with the right tech.

We’re a Google Cloud Premier Partner, but more than that – we’re people who genuinely like solving problems, testing new ideas, and turning complex stuff into simple solutions.
What we believe in
We believe work should make sense – not just on paper, but in real life. That means innovation, partnership, responsibility, flexibility & adaptation, transparency, and a team you can count on. We support each other, share what we know, and celebrate wins (big and small).

If you're someone who likes figuring things out, isn’t afraid to take initiative, and wants to work with tech that actually makes a difference – you might just find your place with us. Because at FOTC, it’s not just about cloud. It’s about people.
The Data Engineer role calls for both independence and the ability to work in a team. Your main responsibilities will include designing and developing a data warehouse and building ETL processes, both on your own and together with the team. You will learn a professional approach to data management and follow the data life cycle all the way from the source to Business Intelligence systems.
Here are some things to expect in your day-to-day:
Building and maintaining data pipelines and workflows using Cloud Composer (a minimal sketch of what such a pipeline can look like follows this list)
Designing and implementing data models, database schemas, and ETL processes using core GCP data services such as BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Cloud Functions
Collaborating with cross-functional teams to understand business requirements and develop solutions
Ensuring data quality and reliability by monitoring and debugging data pipelines
Maintaining and improving our existing data infrastructure and processes
Staying up-to-date with industry trends and best practices in data engineering
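To give you a feel for the day-to-day work, here is a minimal sketch of the kind of Cloud Composer (Airflow) DAG you might build: it loads files from Cloud Storage into a BigQuery staging table. The bucket, dataset, and table names are hypothetical placeholders, not a description of our actual stack.

# Minimal Cloud Composer (Airflow) DAG sketch; all names below are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_orders_load",           # hypothetical pipeline name
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Load newly landed CSV files from Cloud Storage into a BigQuery staging table.
    load_orders = GCSToBigQueryOperator(
        task_id="load_orders_to_bq",
        bucket="example-landing-bucket",                           # hypothetical bucket
        source_objects=["orders/*.csv"],                           # hypothetical path
        destination_project_dataset_table="example_project.staging.orders",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_APPEND",
    )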
What are we looking for?
3+ years of experience in Data Engineering, with at least 2 years focused on Google Cloud Platform (GCP)
Proficiency in SQL and Python
Knowledge of Apache Airflow
Hands-on experience with core GCP data services: BigQuery, Dataflow, Pub/Sub (a short SQL-and-Python illustration follows this list)
Experience in building data warehouses on GCP
English level: communicative
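To show how the SQL and Python pieces meet in practice, here is an illustrative snippet using the google-cloud-bigquery client. The project, dataset, and table names are made up for the example.

# Illustrative only: querying BigQuery from Python; names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project id

query = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM `example-project.analytics.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY customer_id
    ORDER BY total_spend DESC
    LIMIT 10
"""

# Run the query and print the top customers by spend over the last 30 days.
for row in client.query(query).result():
    print(row.customer_id, row.total_spend)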
It would be great if:
You have experience with CI/CD processes
You have experience with version control systems, such as Git
You have experience in data modeling, database design, and data schema optimization
We will be super happy if:
You have experience with other cloud platforms such as AWS or Azure
You have experience with machine learning workflows and pipelines
You have experience with Docker and Kubernetes
You have experience with IaC (Terraform)
We offer:
B2B contract
compensated days off without a service delivery obligation (up to 31!)
UNUM group insurance
private medical care and sport card
option to cooperate fully remotely or from our offices: Rynek (Św. Mikołaja) in Wrocław or Przeskok in Warsaw
company retreats abroad or in Poland once a year (bonding time, yeah!)
company equipment provided
budget for your training and development
access to Google Cloud Skills Boost platform