PROJECT INFORMATION:
- Industry: Banking
- Location: Krakow (1-2 days per week in the office)
- Type of assignment: B2B
- Start: ASAP (to be agreed)
- Project duration: long term
We seek a skilled and experienced Google Cloud Data Engineer to design, build, test, and deploy data models and transformations in Google Cloud, specifically utilizing BigQuery.
This role will focus on developing and optimizing data pipelines and ensuring secure, scalable, and reliable solutions. The ideal candidate will have a deep understanding of database design, cloud architecture, and best practices in data management.
Job Duties:
- Design, build, test, and deploy Google Cloud data models and transformations in BigQuery (e.g., SQL, stored procedures, indexes, clustering, partitioning, triggers).
- Develop data warehouses and pipelines that follow abstraction and 'database refactoring' best practices to support evolutionary development and ongoing changes.
- Implement appropriate authorization and authentication models, data encryption, and other security measures to protect the solution, including consumer registration and change management considerations.
- Review and interpret business and technical requirements, refining and implementing them efficiently.
- Contribute to ongoing productivity and prioritize tasks by refining User Stories, Epics, and Backlogs in Jira.
- Use tools such as Git, Jenkins, Google Secrets Manager, and others to manage code artefacts and CI/CD pipelines.
- Estimate, commit, and deliver project requirements according to scope, quality, and time expectations.
- Deliver non-functional requirements, IT standards, and developer tooling so that applications are secure, compliant, scalable, reliable, and cost-effective.
- Write clean, well-commented, and maintainable code.
- Address defects, provide enhancements during development, and hand over knowledge, code, and support responsibilities to the support team.
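The BigQuery modelling work described above (partitioned and clustered tables, stored procedures for repeatable transformations) might look like the following sketch. All dataset, table, and column names here are illustrative assumptions, not part of the role description:

```sql
-- Hypothetical example: a date-partitioned, clustered table in BigQuery.
CREATE TABLE IF NOT EXISTS analytics.transactions (
  transaction_id STRING NOT NULL,
  account_id     STRING NOT NULL,
  amount         NUMERIC,
  booked_at      TIMESTAMP
)
PARTITION BY DATE(booked_at)   -- prune scans by day to control query cost
CLUSTER BY account_id;         -- co-locate rows on a common filter column

-- A stored procedure wrapping a repeatable daily transformation.
CREATE OR REPLACE PROCEDURE analytics.refresh_daily_totals(run_date DATE)
BEGIN
  MERGE analytics.daily_totals t
  USING (
    SELECT account_id, DATE(booked_at) AS day, SUM(amount) AS total
    FROM analytics.transactions
    WHERE DATE(booked_at) = run_date
    GROUP BY account_id, day
  ) s
  ON t.account_id = s.account_id AND t.day = s.day
  WHEN MATCHED THEN
    UPDATE SET total = s.total
  WHEN NOT MATCHED THEN
    INSERT (account_id, day, total) VALUES (s.account_id, s.day, s.total);
END;
```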
Experience & Skills Required:
- Expertise in database design, development, and administration, with a strong understanding of relational and dimensional data models (preferably Data Vault).
- Proven experience in developing and optimizing SQL/T-SQL procedures in traditional or cloud databases.
- Strong knowledge of GCP architecture and solution design.
- Experience with on-prem or cloud databases, warehouses, and lakes.
- Proficiency in coding and development of DDL and DML database components.
- Solid understanding and hands-on experience with DevOps tools (e.g., Ansible, Jenkins, GitHub, Puppet, Chef).
- Knowledge of IT methodologies and principles, including Agile/Scrum, DevOps, and ITIL.
- Strong analytical, problem-solving, and troubleshooting skills.
- Ability to work collaboratively in a team environment and communicate effectively with stakeholders.
- Detail-oriented with a focus on quality, security, and scalability in all deliverables.
Additional Beneficial Experience (Not Essential):
- Experience in pod leadership and leading Agile development teams.
- Involvement in solution architecture (technical design authority, establishing design blueprints, and principles).
- Experience in project planning and coordination.