DXC Technology (NYSE: DXC) is the world’s leading independent, end-to-end IT services company, helping clients harness the power of innovation to thrive on change. Created by the merger of CSC and the Enterprise Services business of Hewlett Packard Enterprise, DXC Technology serves nearly 6,000 private and public sector clients across 70 countries. The company’s technology independence, global talent and extensive partner alliance combine to deliver powerful next-generation IT services and solutions. DXC Technology is recognized among the best corporate citizens globally. For more information, visit www.dxc.technology.
Job location: hybrid (Warsaw) or remote (Poland)
The person in this position will be responsible for developing, improving and maintaining data ingestion and applications in environments based on Azure Cloud and Google Cloud components. Working for us, you will be part of the Advanced Analytics Apps Big Data Team located in Poland, focused on the newest technologies and providing services to international clients and business users from across the globe.
We are looking for juniors/specialists and experts.
The Main Tasks Will Be
· Ensuring successful data delivery from external sources to the Microsoft Azure or Google Cloud Platform using the latest technologies, making the data available to the customer's data scientists and analysts.
· Adapting applications to changing customer expectations and to changes on the data providers' side.
· Continuously improving application effectiveness and implementing cost-cutting solutions.
· Responding to events that disrupt expected results and planned processes.
· Cooperating with multiple international and multicultural teams delivering data and microservices.
We Require
· High analytical skills and abstract thinking
· Strong SQL, Python and databases technologies
· Experience with:
· Spark and clusters computation
· Orchestration and job scheduling
· Relational and non-relational databases
· Linux
· Code versioning and CI/CD systems
· Containerization - locally hosted and/or clustered
· Familiarity with:
· Data transfer schemes and protocols
· Data file formats
· Good command of English and Polish
We Appreciate
· Fascination with cloud technologies
· Eagerness to learn and practice new areas independently
· Other IT experiences not only related strictly to data
· Experience with:
· Machine Learning implementation and productionization
· Data pipeline setup
· Application operationalization
· Familiarity with any of our technology stack:
Primary tech stack: Azure / Google Cloud, SQL, Python, Spark, Databricks, Airflow, NoSQL Databases, Linux, GitHub, Docker, Kubernetes, Data File Formats
Secondary tech stack: API, Azure Data Factory, Azure DevOps, Azure SQL Databases, BigQuery, CI/CD, Cloud Storages, Composer, Crontab, Data Lake Storage, Databricks,
DataProc, Docker, GitHub, Jira / Confluence, Kubernetes, PubSub, Scala, Transfer Service
Send us your CV. We are looking forward to meeting you!