For our client, we are looking for a Data Engineer.
Our client is an emerging leader in the $100B+ cloud communications platform market. Customers such as Airbnb, Viber, WhatsApp, Snapchat, and many others depend on the client's APIs and SDKs to connect with their own customers all over the world. As businesses continue to shift to a real-time, customer-centric communications model, the company is experiencing a period of impressive growth.
They're looking for a Data Analyst or Data Engineer to join the Engineering Productivity team and help them make smarter, data-driven decisions that improve developer productivity and engineering outcomes. You'll work closely with development, platform, and product teams to analyze, structure, and optimize data flows related to engineering metrics, DevOps performance, and platform usage. This role is ideal for someone who thrives on transforming ambiguous data into actionable insight and is excited about the potential of AI tools in developer platforms.
Key Responsibilities:
Analyze engineering, operational, and productivity data to uncover trends, risks, and opportunities.
Design and implement data models that improve accessibility, structure, and long-term maintainability of engineering metrics.
Build or enhance ETL pipelines to collect, transform, and export data from various systems (e.g., GitHub, Jira, security scans, cost-tracking tools).
Partner with stakeholders to define meaningful KPIs across engineering domains (e.g., reliability, security, velocity).
Explore and implement GenAI tooling to support automation, summarization, and pattern detection in engineering workflows.
Maintain data hygiene and enforce best practices in data governance and lineage within the API Engineering environment.
What You’ll Gain:
A unique opportunity to shape how engineering data is used across a large and evolving platform organization.
The chance to make a difference using GenAI tools in a real-world engineering context.
Collaboration with a team driving developer experience, reliability, and engineering consistency at scale.
Required Skills and Experience:
Proven experience as a Data Analyst or Data Engineer, preferably in a software engineering or DevOps context.
Strong SQL skills and experience with Python or another scripting language for data transformation and analysis.
Hands-on experience working with APIs and integrating data across SaaS tools (e.g., Jira, GitHub, Datadog).
Familiarity with dashboarding/visualization platforms such as Looker, Grafana, or Tableau.
Demonstrated experience structuring unorganized or siloed data into actionable reporting models.
Desirable:
Experience designing and building ETL pipelines and data lakes or warehouses (e.g., Snowflake).
Exposure to GenAI tooling and experience applying AI to engineering or operational workflows.
Knowledge of modern data orchestration tools (e.g., Airflow, dbt).
Understanding of software development lifecycle and metrics used in engineering productivity and platform health.
What we offer:
Contract: B2B directly with US company
Salary: up to 160 PLN/h
100% remote
Polish time zone
Polish public holidays
Long term cooperation
Recruitment process: 1-2 technical calls
Rate is quoted net per hour (B2B)