
    ETRM Data Scientist

    373 - 427 USD/day (Net per day - B2B)
    Type of work: Full-time
    Experience: Mid
    Employment Type: B2B
    Operating mode: Remote

    Tech stack

      Data Science / Data Scientist: advanced
      Energy Trading and Risk Management (ETRM): advanced
      PySpark: regular
      Azure Databricks: regular
      Azure: regular
      MLOps: regular

    Job description

    Online interview

    Role: ETRM Data Scientist

    Location: Poland, Remote

    Type: Contract

    Duration: 6+ Months




    Job Description:

    • Education Requirements:
    • Master’s degree in Mathematics, Statistics, Data Science, or a related field is mandatory.
    • A Ph.D. in Mathematics, Statistics, Data Science, or similar areas is preferred but not mandatory.


    • Mandatory skills:
    • Data Science:
    • Extensive experience in time-series forecasting, predictive modelling, and deep learning.
    • Proficient in designing reusable and scalable machine learning systems.
    • Proficiency in implementing techniques such as ARIMA, LSTM, Prophet, Linear Regression, and Random Forest to ensure accurate forecasting and insights.
    • Strong command of machine learning libraries, including scikit-learn, XGBoost, Darts, TensorFlow, and PyTorch, along with data manipulation tools like Pandas and NumPy.
    • Proven expertise in designing and implementing ensemble techniques such as stacking, boosting, and bagging to improve model accuracy and robustness.
    • Proven track record of analysing and optimising the performance of operational machine learning models to ensure long-term efficiency and reliability.
    • Expertise in retraining and fine-tuning models based on evolving data trends and business requirements.
    • MLOps Implementation:
    • Proficiency in leveraging Python-based MLOps frameworks for automating machine learning pipelines, including model deployment, monitoring, and periodic retraining.
    • Advanced experience in using the Azure Machine Learning Python SDK to design and implement parallel model training workflows, incorporating distributed computing, parallel job execution, and efficient handling of large-scale datasets in managed cloud environments.
    • PySpark Proficiency:
    • Strong experience in PySpark for scalable data processing and analytics.


    • Azure Expertise:
    • Azure Machine Learning: Managing parallel model training, deployment, and operationalization using the Python SDK.
    • Azure Databricks: Collaborating on data engineering and analytics tasks using PySpark/Python.
    • Azure Data Lake: Implementing scalable storage and processing solutions for large datasets.


    • Preferred skills:
    • K-Means Clustering: Experience in applying k-means clustering for data segmentation and pattern identification.
    • Bottom-Up Forecasting: Skilled in creating granular bottom-up forecasting models for hierarchical insights.
    • Azure Data Factory: Designing, orchestrating, and managing pipelines for seamless data integration and processing.
    • Power Trading: Knowledge of power trading concepts.
    • Generative AI (GenAI): Experience in applying generative AI models, such as GPT or similar frameworks.

