Majorel Global IT HUB
Take your career to the next level working with amazing people around the world!
By combining talent, data, and technology, we provide cloud data and analytics solutions for amazing projects with global clients. Driven by 78,000+ awesome employees across 41 countries in Europe, the Middle East, Africa, the Americas, and Asia, working in 60 languages, we create great customer experiences that people value and that we are proud of.
Here is what makes Majorel Global IT Hub a #dreamjob for you:
- Vibrant modern global culture driven by innovative, cutting-edge technologies
- Acquiring in-demand IT skills alongside top IT talent across the organization
- Competitive salary and significant growth opportunities in an international company
- Well established employment processes (Talent Review, Succession planning, Onboarding, Performance management)
The Data Engineer is responsible for developing and maintaining business intelligence solutions for the Operations, Human Resources, Finance, and Sales teams.
Responsibilities:
- Execute the full business intelligence life cycle, including:
  - Analysis and ETL design processes
  - ETL creation and development: Azure Data Factory and Databricks
  - Maintenance of Azure DevOps (boards, repos, pipelines)
- Develop, implement, and optimize procedures and functions (T-SQL, PySpark)
- Participate in data warehouse/data lake design
- Maintain, evaluate, and improve BI systems
- Maintain and support the cloud data platform
- Collaborate with teams to integrate systems
- Collaborate with teams to test and improve implementation processes
- Collaborate with teams to design new cloud BI solutions for multi-regional access to distributed data sets
Educational Background:
- BS/MS degree in Computer Science, Engineering or related subject
Technical Skills: (knowledge, experiences, IT tools/software, languages)
- Approx. 5-7 years of experience as a Data Engineer, BI Developer, SQL Developer, or in a similar role
- Strong skills in Azure Data Factory or Databricks, including an understanding of control flow and data flow tasks and auditing of data as it moves through the pipeline (or comparable tools such as SSIS, Pentaho, etc.)
- Good understanding of SQL/PySpark programming
- Ability to write custom scripts for complex assignments (Python preferred)
- Knowledge of SQL Server database (T-SQL) tasks, including jobs, data backups and redundancy, and database maintenance (indexing and statistics)
- Experience working with an Agile approach
- English – B2/C1
Soft Skills:
- Organizational, influencing, and communication skills
- High level of critical thinking and analytical skills
- Proven abilities to take initiative and be innovative