Data Engineer

DCV Technologies
5 000 - 5 300 USD net/month - B2B
Type of work
Full-time
Experience
Senior
Employment Type
B2B
Operating mode
Remote

Tech stack

    MySQL (advanced)

Job description


Our client is a world-leading digital payments network that removes barriers and connects people.

We're urgently seeking a Data Engineer 🚀


Remote, Poland

Daily rate: 250-265 USD/day




🦾 In addition to creating and maintaining an optimal pipeline architecture, typical duties and responsibilities for a Data Engineer position may include: 

 

  • Assembling large, complex sets of data that meet non-functional and functional business requirements 
  • Identifying, designing and implementing internal process improvements including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes 
  • Building required infrastructure for optimal extraction, transformation and loading of data from various data sources using new or existing technologies 
  • Building analytical tools to utilize the data pipeline, providing actionable insight into key business performance metrics including operational efficiency and customer acquisition 
  • Working with stakeholders, including the data science, design, product, TPM, software, and executive teams, to support their data infrastructure needs and assist with data-related technical issues 

 

✅ Basic Qualifications: 

 

  • 5 years of work experience with a bachelor's degree or an advanced degree in Computer Science, Information Technology, Engineering, or a related field. 
  • Proficiency in SQL and experience with other programming languages such as Python, Java, or Scala. 
  • Experience with big data tools and frameworks such as Hadoop, Spark, or Hive. 
  • Strong experience in ETL (Extract, Transform, Load) tools and processes. 
  • Familiarity with databases and data warehousing solutions, both relational (like MySQL, Oracle) and non-relational (like MongoDB, Cassandra). 
  • Knowledge of data modeling and data architecture. 
  • Experience automating and scheduling jobs on Hadoop (e.g., Apache Tuber or similar tools) 
  • Experience with Kafka or Flink streaming data to Hadoop is a plus 
  • Experience with cloud services such as AWS, Google Cloud, or Microsoft Azure is a plus 
  • Experience with data visualization and BI tools such as Tableau or Power BI is a plus 


Innovation is in our DNA. If you feel the same way, send your CV ✔
