Link Group
Hundreds of IT opportunities are waiting for you—let’s make it happen! Since 2016, our team of tech enthusiasts has been building exceptional IT teams for Fortune 500 companies and startups worldwide. Join impactful projects in the BFSI, CPG, Industrial, and Life Sciences & Healthcare industries. Work with cutting-edge technologies like Cloud, Business Intelligence, Data, and SAP. Unlock your potential, grow your skills, and collaborate with top global clients. Ready for your next big career move? Link with us!
About the Role
We are looking for a Data Engineer experienced with Palantir Foundry to join a cross-functional team working on large-scale data integration, modeling, and analytics platforms. The ideal candidate is hands-on, proactive, and capable of navigating complex data ecosystems in an enterprise environment.
Key Responsibilities
Design and build data pipelines and models using Palantir Foundry
Integrate multiple data sources (structured and unstructured) into usable, high-quality data assets
Collaborate with data scientists, analysts, and business stakeholders to support advanced analytics initiatives
Apply data governance, lineage, and cataloging principles within Foundry
Develop and maintain Foundry “Objects”, Code Workbooks, and other tooling
Ensure the quality, performance, and scalability of implemented data solutions
Support and document platform usage and development best practices
Requirements
3+ years of experience in Data Engineering
Hands-on experience with Palantir Foundry in a commercial or enterprise setting
Proficiency in SQL, Python, and data transformation techniques
Good understanding of data modeling (dimensional, relational, and graph-based)
Familiarity with data governance and metadata management
Experience working in cloud-based environments (AWS, GCP, or Azure)
Excellent communication skills and ability to work with cross-functional teams
Nice to Have
Previous experience in highly regulated industries (finance, pharma, defense, etc.)
Experience integrating Foundry with external tools and systems via APIs
Knowledge of CI/CD, Git, and software engineering best practices
Exposure to tools like Airflow, dbt, Databricks, Snowflake, etc.
Experience with data privacy regulations (GDPR, HIPAA, etc.)