#1 Job Board for tech industry in Europe

    Data Engineer (Automotive Domain)
    Sigma Software

Employment Type: Permanent, B2B

Sigma Software is a global software development company that enables enterprises, startups, and product houses to meet their technology needs through end-to-end delivery. We have been operating since 2002, with teams all over the world.


    Job description

    Online interview
    Friendly offer

    Are you looking for a team of professionals on top of cutting-edge technologies? Are you looking for a place to boost your Big Data career?

We invite you to join our Big Data Competence Center, a dedicated unit within Sigma Software that brings together a variety of clients, interesting projects, and activities that help you grow your professional skills.


    Our client is a well-established company in Europe’s automotive market. We are working together to develop software that will be used worldwide.


The client provides a middleware application for the independent automotive aftermarket. The solution enables straightforward integration with end-user IT infrastructure and bi-directional product catalogue updates.

The solution supports various platforms with instant data processing and synchronization. It is a middleware component used in B2B integration scenarios, complemented by front-end components, a REST API, and Windows services running locally.


Requirements:

• 4+ years of strong experience with Python as a programming language for data pipelines and related tools (e.g., Pandas)
    • Experience as a Data Analyst
    • In-depth knowledge of AWS infrastructure and tools for creating serverless data pipelines, mainly for batch processing
• Hands-on experience with infrastructure-as-code tools, preferably AWS CDK or the Serverless Framework
    • Work experience building ETL pipelines for analytics and internal operations
    • Experience building internal APIs and integrating with external APIs
    • Practical experience with distributed application concepts and DevOps tooling
• Experience with the Linux operating system
• Effective communication skills, especially for explaining technical concepts to non-technical business leaders
    • Desire to work on a dynamic, research-oriented team
    • Troubleshooting and debugging ability



Would be a plus:

• AWS Solutions Architect or Data Analytics certification
    • 2+ years of experience with Hadoop, Spark, and Airflow
    • Experience with DAGs and orchestration tools
    • Experience with developing Snowflake-driven data warehouses
    • Experience with developing event-driven data pipelines


Responsibilities:

• Contributing to the investigation of new technologies and the design of complex solutions, supporting a culture of innovation that takes security, scalability, and reliability into account, with a focus on building out our ETL processes
• Working with a modern data stack, delivering well-designed technical solutions and robust code, and implementing data governance processes
• Collaborating and communicating professionally with the customer’s team
• Taking responsibility for delivering major solution features
• Participating in requirements gathering and clarification, proposing optimal architecture strategies, and leading the data architecture implementation
    • Developing core modules and functions, designing scalable and cost-effective solutions
    • Performing code reviews, writing unit and integration tests
    • Scaling the distributed system and infrastructure to the next level
• Building a data platform using the power of the AWS cloud