Design, develop, and maintain robust and scalable data pipelines on AWS.
Manage and optimize AWS services such as S3, Lambda, Glue, Redshift, and EMR.
Collaborate with data scientists, analysts, and software engineers to ensure efficient data integration and delivery.
Implement best practices in data architecture, security, and governance.
Monitor and troubleshoot data workflows and infrastructure.
Continuously improve system performance and reliability.
Our Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
Proven experience as a Data Engineer with a strong focus on AWS technologies.
Proficiency in programming languages such as Python, Java, or Scala.
Hands-on experience with AWS services including but not limited to S3, Glue, Lambda, Redshift, and RDS.
Strong knowledge of SQL and data modeling.
Familiarity with data warehousing concepts and tools.
Excellent problem-solving skills and a proactive attitude.
Ability to work in a fast-paced, collaborative environment.
Nice to have:
Experience with DevOps practices and tools (e.g., CI/CD, Docker, Kubernetes).
Knowledge of big data processing frameworks (e.g., Apache Spark).
Certifications in AWS (e.g., AWS Certified Solutions Architect, AWS Certified Data Engineer – Associate).
What we can offer:
Flexible working hours and forms of employment (CoE or B2B)
An interesting, challenging job in a dynamically growing company within the Capital Group
Possibility of remote work
Work on innovative projects using modern technologies
Direct impact on shaping the market image of the Capital Group’s companies
Opportunity to develop a wide range of competencies
Attractive salary
Stable employment and a friendly work atmosphere
Great benefits, including the Legimi platform, a Multisport card, a Company Social Fund, mentor and psychologist support, group insurance, technical training and conferences, language courses, integration meetings, internal company competitions, and much more…