Data Architect with Kafka - Toronto, Canada - Compunnel Inc.

    Description

    PRIMARY SKILL: AWS Glue, Kafka, Redshift/Postgres, RDBMS/Data Warehousing

    SECONDARY SKILL: Power BI, REST APIs

    Relevant experience: Overall 10+ years of experience in SAP data management, data quality assurance, and data extraction from legacy systems.

    Detailed JD:

    • Hands-on Data Architect who provides guidance to the team and performs design, coding, or tool configuration as required. Also provides oversight as the senior on-shore resource, with direct client dealing and status reporting. Fills any gaps in development.
    • Creating data models that specify how data is formatted, stored, and retrieved inside an organisation.
    • This includes conceptual, logical, and physical data models.
    • Creating and optimising databases, including the selection of appropriate database management systems (DBMS) and the standardisation and indexing of data.
    • Creating and maintaining data integration processes, ETL (Extract, Transform, Load) workflows, and data pipelines to seamlessly transport data between systems.
    • Collaborating with business analysts, data scientists, and other stakeholders to understand data requirements and align architecture with business objectives.
    • Staying current with industry trends, best practices, and advancements in data management through continuous learning and professional development.
    • The architects should be fluent in explaining the master data templates to the process and tech teams, to ensure the field mappings and data fields are kept consistent between the current ERP and SAP S/4 HANA.
    • They should validate master data quality and fitment to the S/4 templates, provide expertise to reduce the time needed to map the data templates, assist in deriving field values, and reduce the time needed to generate new values that do not exist in the current ERP.
    • The architects should train the internal IT team on master data field mapping and the upload tools.
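    The ETL responsibilities above (extracting legacy master data, transforming it to fit target templates, and loading it into a warehouse) can be sketched as a minimal pipeline. This is a hypothetical illustration, not the client's actual workflow: the SAP-style field names (MATNR, MAKTX, MEINS) and the cleaning rules are assumptions for the example, and it uses an in-memory SQLite table in place of the Redshift/Postgres target and Kafka/Glue plumbing a real pipeline would use.

    ```python
    import sqlite3

    # Hypothetical source records, standing in for rows extracted from a legacy ERP.
    legacy_rows = [
        {"MATNR": "000123", "MAKTX": " Widget A ", "MEINS": "ea"},
        {"MATNR": "000456", "MAKTX": "Widget B", "MEINS": "EA"},
    ]

    def extract():
        """Extract: in a real pipeline this would consume from Kafka or query the legacy DB."""
        return legacy_rows

    def transform(rows):
        """Transform: strip leading zeros from keys, trim text, normalise units to upper case."""
        return [
            (row["MATNR"].lstrip("0"), row["MAKTX"].strip(), row["MEINS"].upper())
            for row in rows
        ]

    def load(records, conn):
        """Load: write cleaned records to the target table (Redshift/Postgres in practice)."""
        conn.execute(
            "CREATE TABLE IF NOT EXISTS material (id TEXT PRIMARY KEY, description TEXT, unit TEXT)"
        )
        conn.executemany("INSERT OR REPLACE INTO material VALUES (?, ?, ?)", records)
        conn.commit()

    conn = sqlite3.connect(":memory:")
    load(transform(extract()), conn)
    print(conn.execute("SELECT * FROM material ORDER BY id").fetchall())
    # → [('123', 'Widget A', 'EA'), ('456', 'Widget B', 'EA')]
    ```

    The same extract/transform/load split maps directly onto the field-mapping work described above: the transform step is where the master data template rules (field mappings, derived values) would live.
    
    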