We are looking for a Senior Data Platform Engineer to join, fully remotely, one of the exciting projects at a global, market-leading, NYSE-listed American trading platform.
- Salary: 26,880 – 35,000 PLN monthly on B2B + VAT
- 100% remote!
- Working hours: starting from 12:00, or ideally from 2 PM CET, to overlap with the US team in the Eastern Time zone (especially during onboarding; later 10:00–18:00 CET)
We (at RITS) work in a cooperation/partnership model, where we see our consultants as clients as well. We are currently the exclusive Polish vendor for this particular client: a very stable company that has existed for over 25 years and helps the world's leading asset managers, central banks, hedge funds, and other institutional investors access the liquidity they need through a range of electronic marketplaces. It trades an average of about 30 trillion dollars A MONTH.
Job Responsibilities
- Build and run the data platform using technologies such as public cloud infrastructure (AWS and GCP), Kafka, databases, and containers.
- Focus on providing the platform and secure access for data engineers, rather than doing the data engineering itself.
- Develop a data science platform based on open-source software and cloud services.
- Design, build, and run our Kafka messaging clusters.
- Work on performance optimization issues in applications around the messaging platform.
- Assist in the building and running of our K8s clusters and cloud environments.
- Build and run ETL pipelines to onboard data onto the platform: define schemas, build DAG processing pipelines, and monitor data quality.
- Help develop the machine learning development framework and pipelines.
- Manage and run mission-critical production services.
Qualifications
- 3+ years of experience in a Platform Engineering role, such as Systems Engineer or DevOps Engineer.
- Strong software engineering experience, particularly with Python.
- Strong experience working with SQL and databases/engines such as MySQL, PostgreSQL, SQL Server, Snowflake, Redshift, Presto, etc.
- Experience building ETL and stream processing pipelines using Kafka, Spark, Flink, Airflow/Prefect, etc.
- Familiarity with the data science stack: e.g. Jupyter, Pandas, Scikit-learn, Dask, PyTorch, MLflow, Kubeflow, etc.
- Experience with using AWS/GCP (S3/GCS, EC2/GCE, IAM, etc.), Kubernetes and Linux in production.
- Strong proclivity for automation and DevOps practices.
- Experience with managing increasing data volume, velocity and variety.
- Agile self-starter focused on getting things done.
- Ability to deal with ambiguity.
- Strong communicator.
- Willingness to participate in on-call duty outside regular business hours.
Nice to have
- Development skills in C++, Java, Go, Rust.
- Understanding of TCP/IP and distributed systems.
- Experience managing time-series data.
- Familiarity with working in open-source communities.
- Financial Services experience.
🎯 Job Details:
- Salary range: 26,880 – 35,000 PLN monthly on B2B + VAT, without paid vacations, depending on experience and technical testing
- Working hours: starting from 12:00, or ideally from 2 PM CET, to overlap with the US team in the Eastern Time zone
- Budget for materials and hardware: standing desks, laptops, monitors or screens, WeWork-type workspaces, etc.
- Free private medical insurance OR 💪 Medicover Sport membership
- Ready to have you on the team ASAP!
- Long-term cooperation!
- Integration trips to NY/London/Warsaw several times a year for 3–4 days (non-mandatory, expenses covered)
Interview Process:
- Introductory call with a RITS recruiter.
- Interview with the hiring manager (45 mins).
- Technical interview (90 mins) with developers from the team.