As an AWS Data Engineer you'll be part of a project focused on building a large, enterprise-scale Data Lake on top of AWS. You will participate in many aspects of the work, such as integrating data from multiple sources, applying transformations using Python and SQL, and taking care of performance. The ideal candidate can effectively translate business requirements into technical specifications, has experience with Python and/or SQL, and is willing to gain hands-on experience with AWS.
Your responsibilities:
- Effectively convert business requirements into technical solutions
- Promote software development best practices and a DevOps mindset
- Use AWS Data Engineering services such as AWS Glue, Lambda, and Athena to build scalable data solutions
- Design and build Data Lakes / Data Platforms using modern solution and data architectures
- Build data pipelines with Apache Airflow
- Schedule and execute jobs, and provide support during UAT and go-live
- Leverage AWS analytical services to build data applications
Our requirements:
- At least 2 years of experience with Python, including an understanding of OOP concepts and coding best practices
- Good knowledge of SQL for data transformation (including complex queries with analytical functions)
- Ability to extract, transform, and load data from multiple sources into a target schema
- Ability to work with multiple file formats (JSON, XML, CSV, reports, etc.) and analyze data where required for further processing
- Willingness to learn AWS architecture (we support certification!)
- Experience with Infrastructure as Code concepts (Terraform, AWS CDK/CloudFormation, etc.)
- Experience with Apache Spark (preferably Databricks) would be a plus
- Snowflake experience would be a plus (or experience with a similar cloud DWH technology, e.g. Amazon Redshift)
- Hands-on experience with Apache Airflow would be a plus
What we offer:
- 100% remote work
- B2B contract
- A competitive salary
- Multiple opportunities to gain new knowledge in the AWS / Data Engineering area through our internal knowledge-sharing meetups
- Reimbursement of Data Engineering certification expenses
- Training budget