#1 Job Board for tech industry in Europe

    Data Engineer with GCP

    Type of work: Full-time
    Experience: Mid
    Employment Type: Any
    Operating mode: Remote
    Lingaro

    Lingaro - an award-winning tech partner to global leaders. Our clients include Fortune 500 companies, to whom we provide cutting-edge solutions. We attract the best talent in the IT industry as well as the most promising rising stars.

    Company profile

    Tech stack

      • GCP: regular
      • BigQuery: regular
      • SQL: regular
      • Python: regular

    Job description

    Online interview
    Friendly offer

    Tasks:

    • You will be part of the team accountable for the design, modelling, and development of an entire GCP data ecosystem for one of our clients (Cloud Storage, Cloud Functions, BigQuery).
    • Involvement throughout the whole process, starting with gathering, analyzing, modelling, and documenting business/technical requirements. The role includes direct contact with clients.
    • Modelling data from various sources and technologies. Troubleshooting and supporting the most complex, high-impact problems to deliver new features and functionality.
    • Designing and optimizing data storage architectures, including data lakes, data warehouses, and distributed file systems. Implementing techniques like partitioning, compression, and indexing to optimize data storage and retrieval. Identifying and resolving bottlenecks, tuning queries, and implementing caching strategies to enhance data retrieval speed and overall system efficiency.
    • Identifying and resolving issues related to data processing, storage, or infrastructure. Monitoring system performance, identifying anomalies, and conducting root cause analysis to ensure smooth and uninterrupted data operations.
    • Training and mentoring junior data engineers, providing guidance and knowledge transfer.
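    To give a flavor of the storage-optimization work described above (partitioning and clustering in BigQuery), here is a minimal sketch. The dataset, table, and column names (analytics.events, event_ts, customer_id) are hypothetical placeholders, not part of this posting.

    ```python
    # Sketch: building BigQuery DDL that partitions a table by date and
    # clusters it on a high-cardinality filter column, so that typical
    # queries prune partitions and scan fewer bytes.
    # All identifiers below are invented for illustration.

    def partitioned_table_ddl(table: str, partition_col: str, cluster_cols: list[str]) -> str:
        """Return a CREATE TABLE statement with date partitioning and clustering."""
        return (
            f"CREATE TABLE IF NOT EXISTS {table} (\n"
            f"  {partition_col} TIMESTAMP,\n"
            "  customer_id STRING,\n"
            "  payload JSON\n"
            ")\n"
            f"PARTITION BY DATE({partition_col})\n"
            f"CLUSTER BY {', '.join(cluster_cols)}"
        )

    ddl = partitioned_table_ddl("analytics.events", "event_ts", ["customer_id"])
    print(ddl)
    ```

    In practice such DDL would be submitted through the BigQuery client or a migration tool; the point is that partition and cluster keys are chosen to match the dominant query filters.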


    Requirements:

    Must have:  

    • At least 4 years of experience as a Data Engineer, including a minimum of 3 years working with GCP cloud-based infrastructure & systems.
    • Strong knowledge of cloud computing platforms (Google Cloud). The candidate should be able to design, build, and deploy data pipelines in the cloud that ingest data from various sources such as databases, APIs, or streaming platforms.
    • Programming skills (SQL, Python, other scripting languages).
    • Proficiency in data modeling techniques and database optimization. Knowledge of query optimization, indexing, and performance tuning is necessary for efficient data retrieval and processing.
    • Proficiency in database management systems, both SQL (BigQuery is a must) and NoSQL. The candidate should be able to design, configure, and manage databases to ensure optimal performance and reliability.
    • Knowledge of at least one orchestration and scheduling tool (e.g., Airflow).
    • Experience with data integration tools and techniques such as ETL and ELT. The candidate should be able to integrate data from multiple sources and transform it into a format suitable for analysis.
    • Knowledge of modern data transformation tools (such as dbt, Dataform).
    • Excellent communication skills to collaborate effectively with cross-functional teams, including data scientists, analysts, and business stakeholders. Ability to convey technical concepts to non-technical stakeholders in a clear and concise manner.
    • Tools knowledge: Git, Jira, Confluence, etc.
    • Openness to learning new technologies and solutions.
    • Experience in a multinational environment and with distributed teams.
    • English: C1.
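    As a rough illustration of the ETL/ELT integration work listed above (ingesting records from multiple sources and transforming them into an analysis-ready format), here is a minimal, stdlib-only sketch. The source names and field names are invented for illustration.

    ```python
    # Sketch: normalizing records from heterogeneous sources (a database
    # extract and an API payload) onto one common schema, standardizing
    # timestamps to UTC ISO-8601. All field names are hypothetical.
    from datetime import datetime, timezone

    FIELD_MAP = {
        "db":  {"cust_id": "customer_id", "ts": "event_time"},
        "api": {"customerId": "customer_id", "timestamp": "event_time"},
    }

    def normalize(record: dict, source: str) -> dict:
        """Map source-specific field names onto the common schema."""
        out = {"source": source}
        for src_field, dst_field in FIELD_MAP[source].items():
            out[dst_field] = record[src_field]
        # The API sends epoch seconds; coerce them to ISO-8601 UTC.
        if isinstance(out["event_time"], (int, float)):
            out["event_time"] = datetime.fromtimestamp(
                out["event_time"], tz=timezone.utc
            ).isoformat()
        return out

    rows = [
        normalize({"cust_id": "c1", "ts": "2024-01-01T00:00:00+00:00"}, "db"),
        normalize({"customerId": "c2", "timestamp": 1704067200}, "api"),
    ]
    ```

    A production pipeline would do this inside an orchestrated job (e.g., an Airflow task) and load the normalized rows into the warehouse, but the schema-mapping step looks essentially like this.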


    Good to have:

    • Certifications in big data technologies and/or cloud platforms.
    • Experience with BI solutions (e.g., Power BI, Tableau).
    • Experience with ETL tools (e.g., Talend, Alteryx).
    • Experience with Azure cloud-based infrastructure & systems.


    We offer:

    • Stable employment. On the market since 2008, with 1,500+ talents currently on board across 7 global sites.
    • “Office as an option” model. You can choose to work remotely or in the office. 
    • Great Place to Work® certified employer.
    • Flexibility regarding working hours and your preferred form of contract. 
    • Comprehensive online onboarding program with a “Buddy” from day 1.  
    • Cooperation with top-tier engineers and experts. 
    • Unlimited access to the Udemy learning platform from day 1.
    • Certificate training programs. Lingarians earn 500+ technology certificates yearly. 
    • Upskilling support. Capability development programs, Competency Centers, knowledge sharing sessions, community webinars, 110+ training opportunities yearly.
    • Grow as we grow as a company. 76% of our managers are internal promotions.  
    • A diverse, inclusive, and values-driven community.  
    • Autonomy to choose the way you work. We trust your ideas. 
    • Create our community together. Refer your friends to receive bonuses. 
    • Activities to support your well-being and health.
    • Plenty of opportunities to donate to charities and support the environment. 
    • Modern office equipment. Purchased for you or available to borrow, depending on your location.
    Undisclosed Salary
