Team Connect is Poland’s leading nearshore and offshore IT provider. Since 2008, we have been successfully creating and developing software for our clients.
We are also a proud certified Salesforce Partner.
We specialize in Agile and DevOps-based software development, from the analysis stage through implementation. We develop backend, frontend, and mobile applications.
We are currently looking for an experienced Data Modeller with strong Financial Services knowledge to join our dynamic team. You will play a critical role in translating complex business requirements into efficient data structures that support various data platforms (including OLTP, OLAP, and Big Data environments). This is an exciting opportunity to be part of high-impact data initiatives in the financial sector, working closely with stakeholders to ensure high-quality, scalable, and well-governed data architecture.
KEY RESPONSIBILITIES:
- Translate business requirements into Conceptual, Logical, and Physical Data Models.
- Identify appropriate relational design models (OLTP, OLAP, Data Warehouse) from business requirements or Logical Data Models.
- Participate in the mapping between Logical and Physical Data Models.
- Create and manage models for large datasets sourced into Big Data platforms.
- Work collaboratively with stakeholders to deliver optimized data solutions that prevent negative business impact.
- Manage stakeholder expectations and maintain effective communication and escalation protocols.
- Favour strategic, long-term solutions over short-term workarounds wherever possible.
KEY SKILLS & QUALIFICATIONS:
- Proven experience as a Data Modeller within the Banking or Financial Services industry.
- Familiarity with data dictionaries, data governance, and financial domains such as:
  - Client/Party Data
  - Trades
  - Settlements
  - Payments
  - Instruments & Pricing
  - Market & Credit Risk
- Solid understanding of ER modeling, ETL/ELT processes, and data architecture (both on-prem and cloud-based).
- Experience with dimensional data modeling (e.g. Star Schema, Kimball methodology).
- Strong SQL and Python scripting skills; familiarity with Scala, Hive, and tools such as CMD, PuTTY, and Notepad++.
- Understanding of Agile methodologies and software development lifecycle (SDLC).
- Excellent verbal and written communication skills; strong documentation and stakeholder engagement.
- Ability to manage multiple priorities with attention to detail and precision.
Preferred/Good to Have:
- Experience with Hadoop or Google BigQuery.
- Knowledge of Data Quality and Data Governance best practices.
We offer:
- Long-term cooperation
- Benefits package: Multisport, private medical care, life insurance
- Training budget
- Free English lessons
- Individual support from a dedicated company mentor