Link Group
Hundreds of IT opportunities are waiting for you. Let's make it happen! Since 2016, our team of tech enthusiasts has been building exceptional IT teams for Fortune 500 companies and startups worldwide. Join impactful projects across the BFSI, CPG, Industrial, and Life Sciences & Healthcare industries. Work with cutting-edge technologies such as Cloud, Business Intelligence, Data, and SAP. Unlock your potential, grow your skills, and collaborate with top global clients. Ready for your next big career move? Link up with us!
About the Role
We are looking for a skilled Adoption Engineer with strong hands-on experience in dbt (Core & Cloud), DevOps, Infrastructure as Code, and Airflow. This role is critical to supporting the successful rollout, scaling, and enablement of modern data transformation platforms within enterprise environments.
As an Adoption Engineer, you will serve as a technical bridge between platform engineering and data users, ensuring smooth adoption of tools and best practices across teams. You will be responsible for building robust, reusable infrastructure components, supporting automation, and guiding users through optimal implementation patterns.
Key Responsibilities
Act as a subject matter expert (SME) in dbt Core and Cloud implementation, configuration, and adoption strategy
Design and build scalable and reusable infrastructure components using IaC tools (Terraform, CloudFormation, etc.)
Enable and support Airflow DAG creation and scheduling to orchestrate dbt models and other workflows (see the illustrative sketch after this list)
Collaborate with data teams to improve workflows, CI/CD pipelines, and DevOps processes
Define and enforce best practices in data modeling, code versioning, and deployment automation
Provide onboarding and enablement support to teams migrating to dbt Cloud
Support observability, testing, and documentation standards for all data transformation processes
Ensure security, scalability, and operational excellence across the data transformation stack
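To give candidates a concrete picture of the orchestration work mentioned above, here is a minimal sketch of an Airflow DAG that runs dbt and then executes its tests. The DAG name, project paths, and schedule are hypothetical placeholders for illustration, not a description of any client's actual setup.

```python
# Minimal illustrative Airflow 2.x DAG: run dbt models, then dbt tests.
# dag_id, paths, and schedule are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_transformations",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )
    # Build the models first, then validate them with dbt tests.
    dbt_run >> dbt_test
```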
Required Skills
Proven experience with dbt Core and dbt Cloud (advanced user level; admin or platform support experience is a plus)
Strong proficiency in SQL and data modeling principles
Experience with DevOps tooling and CI/CD pipelines
Hands-on experience with Airflow and workflow orchestration
Experience with Infrastructure as Code (Terraform preferred, but others acceptable)
Cloud platform familiarity (AWS, Azure, or GCP)
Strong communication skills, with the ability to support teams across different technical maturity levels
Nice to Have
Experience with data platforms such as Snowflake, BigQuery, Databricks, or Redshift
Familiarity with monitoring & observability tools for data workflows
Understanding of data governance and access control concepts
Background in platform evangelism, internal consulting, or enablement
Gross per hour - Permanent