Big Data Engineer
EndySoft
Warszawa
5 780 - 7 060 USD net/month (B2B)
Type of work: Full-time
Experience: Mid
Employment type: B2B
Operating mode: Remote

Tech stack

  • Big Data: regular
  • Spark: regular
  • Hadoop: regular
  • Kafka: regular
  • AWS: regular
  • Azure: regular
  • GCP: regular
  • Python: regular
  • Java: regular
  • Scala: regular

Job description

Online interview

EndySoft, a rapidly expanding company headquartered in Central Europe, specializes in providing comprehensive IT resources and services. Our core offerings include body-leasing, team outsourcing, recruitment, and HR solutions, with a strong focus on business automation and software development. By leveraging our expertise, we empower our clients and partners to streamline operations, freeing them from time-consuming tasks and enabling them to concentrate on core business objectives.


Responsibilities:


  • Design, build, and maintain scalable big data solutions and infrastructure.
  • Develop and implement data pipelines for ingesting, processing, and analyzing large volumes of data.
  • Collaborate with data scientists and analysts to understand business requirements and translate them into technical solutions.
  • Optimize data processing and storage solutions for performance, reliability, and scalability.
  • Implement data governance and security best practices to ensure data integrity and compliance.
  • Monitor and troubleshoot data pipelines, identifying and resolving performance bottlenecks and issues.
  • Stay updated with the latest big data technologies and trends, incorporating them into projects where applicable.


Requirements:


  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • Proven experience as a Big Data Engineer or similar role, with a strong portfolio showcasing your contributions to big data projects.
  • Proficiency in big data technologies such as Hadoop, Spark, Kafka, Hive, and/or others.
  • Experience with cloud platforms such as AWS, Azure, or Google Cloud Platform.
  • Strong programming skills in languages such as Python, Java, or Scala.
  • Familiarity with data warehousing concepts, ETL processes, and data modeling.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and collaboration skills, with the ability to work effectively in a cross-functional team environment.


We Offer:


  • Remote Work: Enjoy the flexibility of working remotely from anywhere.
  • Professional Growth: Access to continuous learning opportunities, training programs, and certifications.
  • Exciting Projects: Work on innovative projects that challenge and expand your skills.
  • Competitive Compensation: Receive a competitive salary package that reflects your expertise and contributions.
  • Positive Work Environment: Join a supportive team of professionals who value collaboration, creativity, and excellence.

Please note that the data controller is EndySoft, with its registered office at ul. Zámostní 1155/27, Ostrava (hereinafter the "controller")…