Position Overview:
We are seeking an experienced ETL Developer to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust ETL processes to ensure the efficient flow of data from various sources to data warehouses or data lakes. This role involves collaborating with business and technical teams to support data-driven decision-making and analytics initiatives.
MD rate: 16,600 - 20,000 PLN
Roles and Responsibilities:
- Design, develop, and optimize ETL pipelines to extract, transform, and load data from multiple data sources.
- Collaborate with data architects and business analysts to gather and understand data requirements.
- Implement and maintain data integration workflows using ETL tools such as Informatica, Talend, SSIS, or Apache NiFi.
- Perform data validation and quality checks to ensure data integrity and accuracy.
- Troubleshoot and resolve issues related to ETL processes and data flows.
- Monitor and enhance the performance of ETL jobs to meet business SLA requirements.
- Maintain and document technical solutions, including data mappings, workflows, and procedures.
- Work closely with other data team members to support data warehouse and data lake initiatives.
Required Skills and Experience:
- Proficiency in SQL for querying and transforming data.
- Hands-on experience with ETL tools such as Informatica, Talend, SSIS, or similar.
- Strong knowledge of data modeling techniques, including star schema and snowflake schema.
- Experience in data integration and data warehousing concepts.
- Familiarity with cloud-based ETL services (e.g., AWS Glue, Azure Data Factory, Google Cloud Dataflow).
- Strong problem-solving skills and the ability to troubleshoot complex data issues.
- Experience with scripting languages such as Python or shell scripting (e.g., Bash) for automation.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
Nice to Have:
- Experience with big data tools such as Spark, Kafka, or Hadoop.
- Knowledge of NoSQL databases like MongoDB or Cassandra.
- Familiarity with DataOps practices and CI/CD pipelines for ETL workflows.
- Exposure to data governance and metadata management tools.
- Understanding of data security and compliance requirements.
- Experience with version control systems like Git.
- Exposure to Agile/Scrum methodologies.
Additional Information:
This role provides an opportunity to work on complex data integration projects and contribute to the development of scalable data solutions. If you are passionate about transforming raw data into actionable insights and thrive in a fast-paced environment, we encourage you to apply.