Data Architect

Syncron
Wrocław
Type of work: Undetermined
Experience: Senior
Employment Type: Permanent, B2B
Operating mode: Remote

Tech stack

    Data architecture: advanced
    BigData: advanced
    Cloud: advanced
    Data Lake concept: advanced
    Data platforms: advanced
    RedShift/BigQuery: regular
    AWS: nice to have
    Redshift: nice to have
    BigQuery or Snowflake: nice to have
    Snowflake: nice to have

Job description

Online interview
Syncron is looking for a Data Architect to join its central architecture practice. The successful candidate will help shape Syncron's future end-to-end data processing architecture and will evangelize and harmonize data and its exploitation across the whole organization.



KEY EXPERIENCE AND SKILLS FOR THIS ROLE INCLUDE 



  • Data Architecture skills – the ability to take a leadership position in standards development and in the practical design of databases, data warehouses, and data lakes (as appropriate), and to understand the differences between them.
  • Data Processing skills – the ability to deal with issues around scalability, consistency, reliability, efficiency, and maintainability, with a good understanding of the pros and cons of various technologies for processing and storing large volumes of data.
  • Data Exploitation skills – an understanding of the role of analytics for Syncron and its customers, and the ability to work with the Analytics and product teams to ensure that the analytics strategy is well designed, well understood, and well executed across all products.
  • Data Futures – knowledge and experience of AI and ML techniques and practices, in order to evangelize, facilitate, and extend the use of ML across all products produced by Syncron.

ROLES AND RESPONSIBILITIES



  • Data warehousing experience, with exceptional skill at delivering enterprise-level, data-driven analytics solutions.
  • Experience in implementing full-stack data platforms.
  • Industry-proven expert in analyzing, re-architecting, and re-platforming on-premises databases and data artifacts into cloud data warehouses and data lakes.
  • Hands-on expert in real-time data processing and analytics, data ingestion (batch and streaming), and data storage solutions.
  • Proven expert at delivering end-to-end analytical solutions.
  • Building, optimizing, and maintaining conceptual and logical database models for operational databases, data warehousing, data integration, and data visualization patterns/platforms/systems/tools.
  • Defining enterprise-wide standards and best practices for data management.
  • Industry expert at creating SaaS, multi-tenant analytics solutions.

DESIRED CANDIDATE PROFILE



  • Ability to communicate complex technical issues to a non-technical audience. 
  • Expertise in articulating thoughts and ideas, and in leading through collaboration.
  • Competence to work with all levels of the organization and the ability to handle complex technical escalations.
  • Ability to provide guidance, status, and leadership within the scope of data management.
  • 10+ years of related experience in software development.
  • Strong data governance skills. 
  • Ability to meet key responsibilities.  
  • Strong problem solving, analytical, and troubleshooting skills. 
  • Familiarity with concepts such as Data Mesh and Data Fabric.

TECHNICAL BACKGROUND



  • Strong ‘hands-on’ experience as a Big Data Architect, with a solid design/development background in Java, Scala, or Python.
  • Good experience with Redshift, BigQuery, or Snowflake.
  • Experience delivering data analytics projects and architecture guidelines. 
  • Experience in big data solutions on-premises and in the cloud (ideally Amazon Web Services).
  • Production project experience with at least one of the following big data technologies:
    • Batch processing: Hadoop and MapReduce/Spark/Hive
    • NoSQL databases: MongoDB/Cassandra/HBase
    • Delta Lake technologies

NICE TO HAVE



  • Stream processing: Kafka Streams/Flink/Spark Streaming. 
  • Background in traditional data warehouse and business intelligence stacks (ETL, MPP Databases, Tableau, Microsoft Power BI). 

WE OFFER



  • Being yourself in an informal, low-ego, and open working environment where you can truly make a difference and enjoy working with positive, passionate, and collaborative people who are ready to share their knowledge with you
  • Scandinavian-style company culture with work-life balance and true care for your wellbeing
  • 100% remote / hybrid / work from the centrally located Warsaw office (Plac Grzybowski), depending on your plans
  • Freedom to choose the employment type: employment contract vs. B2B model
  • Copyright tax benefit on an employment contract (tax-deductible expenses for this role = 80%)
  • Fixed monthly rate on a B2B contract, including 33 days off in a given year (23 days for leisure and 10 days for unexpected events)
  • Flexible working hours and no micromanagement 
  • Fringe benefits (private medical insurance, multisport, life insurance) 
  • Employee referral program - a bonus of 1500 EUR if the referral gets hired 
  • Internal training sessions (Friday Seminars), conference and training budget in every team, free English & Swedish classes, LinkedIn Learning for all
  • Opportunity to work in a cross-functional and agile team you can learn from
  • Opportunity to take part in the development of “off-the-shelf” products, based on best practices (code review, automated tests, continuous integration) 
  • We respect one another and we enjoy working together – we play pool and board games (yes, we are back in the office for integration events) and organize charity activities, to name just a few
  • Remote recruitment, hiring and onboarding process