Spyrosoft
Spyrosoft is an authentic, cutting-edge software engineering company, established in 2016. We have been included in the Financial Times ranking of the 1000 fastest-growing companies for three consecutive years: 2021, 2022 and 2023.
Requirements:
Minimum 2 years of experience as a Data Engineer working with Google Cloud Platform (GCP) and cloud-based infrastructure.
Hands-on experience with LookML, Looker's modelling language.
Proven experience in transforming and migrating data models from SSAS and Power BI Semantic Models to LookML.
Ability to translate DAX expressions into LookML formulas.
Deep understanding of GCP services and cloud computing architecture.
Strong background in designing, building, and deploying cloud-based data pipelines, including ingestion from various data sources (e.g., relational databases).
Proficiency in data modelling and database optimisation, including query tuning, indexing, and performance tuning for efficient data processing and retrieval.
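As an illustration of the DAX-to-LookML translation this role involves, a simple Power BI measure such as `Total Sales = SUM(Sales[Amount])` might map to a LookML measure along these lines (a hedged sketch; the view, table, and field names are hypothetical, not taken from any actual project):

```lookml
# Hypothetical view over a sales table; all names are illustrative only.
view: sales {
  sql_table_name: analytics.sales ;;

  dimension: amount {
    type: number
    sql: ${TABLE}.amount ;;
  }

  # Rough equivalent of the DAX measure:
  #   Total Sales = SUM(Sales[Amount])
  measure: total_sales {
    type: sum
    sql: ${amount} ;;
  }
}
```

In practice, more involved DAX constructs (e.g. `CALCULATE` with filters) typically translate into filtered measures or derived tables rather than a one-to-one formula swap.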
Nice to Have:
Experience with at least one orchestration and scheduling tool – Airflow is strongly preferred.
Familiarity with ETL/ELT processes and the ability to integrate data from multiple sources into a usable analytical format.
Working knowledge of modern data transformation tools such as dbt and Dataform.
Strong communication skills to collaborate effectively with cross-functional teams (data scientists, analysts, business stakeholders).
Ability to translate technical concepts into business-friendly language and present findings.
Experience leading or actively contributing to discussions with stakeholders to identify business needs and improvement opportunities.
Relevant certifications in big data technologies and/or cloud platforms (GCP, Azure).
Main responsibilities:
Design and develop data models in LookML, adhering to best practices.
Support business users in building Looker dashboards.
Migrate existing data structures and models to LookML.
Support and mentor team members in designing and managing LookML-based data models.
Build efficient and optimised aggregations and calculations for analytics purposes.
Optimise and continuously improve existing data models to enhance performance and usability.
Net per hour - B2B