Our client is partnering with a leading programmatic media company that specialises in ingesting large volumes of data, modelling insights, and offering a range of products and services across Media, Analytics, and Technology. Well-known brands such as Walmart, Barclaycard, and Ford are among its clients.
The company has grown to over 700 employees, with 15 global offices spanning four continents. With the imminent opening of a new office in Warsaw, we are seeking experienced Data Engineers to join the team.
The Data Engineer will be responsible for designing, developing, and maintaining optimised, scalable, end-to-end Big Data pipelines for the company's products and applications. In this role, you will collaborate closely with team leads across various departments and receive support from peers and experts across multiple fields.
Responsibilities:
- Follow and promote best practices and design principles for Big Data ETL jobs.
- Support technology decisions for the business's future data management and analysis needs by conducting proofs of concept (POCs).
- Monitor and troubleshoot performance issues on data warehouse/lakehouse systems.
- Provide day-to-day support for data warehouse management.
- Assist in improving data organisation and accuracy.
- Collaborate with data analysts, scientists, and engineers to ensure best practices in coding, data processing, and storage technologies.
- Ensure that all deliverables adhere to the company's world-class standards.
Skills:
- 3+ years of overall experience in Data Warehouse development and database design.
- Deep understanding of distributed computing principles.
- Experience with the AWS cloud platform and its big data services, such as EMR, EC2, S3, and Redshift, as well as platforms like Databricks.
- Experience with Scala, Spark, Hive, YARN/Mesos, etc.
- Experience with SQL and NoSQL databases, as well as data modelling and schema design.
- Proficiency in programming languages such as Java, Scala, or Python for implementing data processing algorithms and workflows.
- Experience with Presto and Kafka is a plus.
- Experience with DevOps practices and tools for automating the deployment, monitoring, and management of big data applications is a plus.
- Excellent communication, analytical, and problem-solving skills.
- Knowledge of scalable service architecture.
- Experience building scalable data processing jobs on high-volume data.
- Self-starter, proactive, and able to work to deadlines.
Opportunities:
- Career and professional growth.
- Competitive salary.
- Hybrid work model (3 days per week in the office in the heart of Warsaw).
- Long-term employment with 20 working days of paid vacation, sick leave, and national holidays.