Data Engineer - Ontario, Canada - FX Innovation

    Technology / Internet
    Description

    Field Data Engineer - Legacy data sources to GCP migration

    Specific projects: Assist with data transfers from legacy sources to GCP using Dataflow/Airflow/Apache Beam, in either batch or streaming mode.
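
    As an illustration of the kind of transfer described above, here is a minimal Apache Beam batch pipeline sketch in Python: it reads a CSV export from a legacy system staged in Cloud Storage and loads it into BigQuery via the Dataflow runner. The bucket, project, table, and column names are all hypothetical placeholders, not details from this posting.

        import apache_beam as beam
        from apache_beam.options.pipeline_options import PipelineOptions

        def parse_row(line):
            # Assumed legacy CSV layout: id,name,amount (hypothetical).
            fields = line.split(",")
            return {"id": int(fields[0]), "name": fields[1], "amount": float(fields[2])}

        options = PipelineOptions(
            runner="DataflowRunner",           # or "DirectRunner" for local testing
            project="example-project",         # hypothetical project ID
            region="northamerica-northeast2",  # GCP's Toronto region
            temp_location="gs://example-bucket/tmp",
        )

        with beam.Pipeline(options=options) as p:
            (
                p
                | "Read legacy export" >> beam.io.ReadFromText(
                    "gs://example-bucket/legacy/export.csv", skip_header_lines=1)
                | "Parse rows" >> beam.Map(parse_row)
                | "Load to BigQuery" >> beam.io.WriteToBigQuery(
                    "example-project:legacy_migration.transactions",
                    schema="id:INTEGER,name:STRING,amount:FLOAT",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                )
            )

    The same pipeline can be pointed at the DirectRunner for local testing before being submitted as a Dataflow job.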

    Location: Ontario

    Hybrid role: in the office three days a week (Tuesday, Wednesday, and Thursday), remote on Monday and Friday.

    Responsibilities:

    Responsible for developing and maintaining software applications in a cloud-based environment: designing, developing, testing, and deploying cloud-based applications that meet the needs of customers and stakeholders, and collaborating with other engineers and stakeholders to ensure that applications are scalable, secure, and reliable.

    Required skill set:

    Hands-on experience with data integration tools and technologies such as Google Dataflow, Airflow, Apache Beam, or equivalent (an Airflow orchestration sketch follows the list below).

    • Deploy and manage Big Data solutions on GCP, both batch and real-time.
    • Use GCP's infrastructure-as-code tools, particularly Terraform, to provision and configure cloud resources.
    • Collaborate with the data engineering team to integrate data pipeline code into CI/CD processes (GitLab).
    • Knowledge of networking components (VPC, subnets, firewall rules) for secure communication.
    • Strong programming skills in Python.
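
    To make the orchestration point referenced above concrete, here is a hedged sketch of an Airflow DAG that launches a Dataflow templated job on a daily schedule. It assumes the apache-airflow-providers-google package is installed; the DAG ID, template path, project ID, and bucket are hypothetical placeholders.

        from datetime import datetime

        from airflow import DAG
        from airflow.providers.google.cloud.operators.dataflow import (
            DataflowTemplatedJobStartOperator,
        )

        # All identifiers below (DAG ID, template path, project, bucket) are
        # hypothetical placeholders for illustration only.
        with DAG(
            dag_id="legacy_to_gcp_transfer",
            start_date=datetime(2024, 1, 1),
            schedule_interval="@daily",
            catchup=False,
        ) as dag:
            run_transfer = DataflowTemplatedJobStartOperator(
                task_id="run_dataflow_transfer",
                template="gs://example-bucket/templates/legacy_to_bq",
                project_id="example-project",
                location="northamerica-northeast2",
                parameters={"input": "gs://example-bucket/legacy/export.csv"},
            )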

    Location: Toronto - Adelaide (co-located with the rest of the team).

    Language: English.

    Looking for a contractor to assist with data transfers from legacy sources to GCP using Dataflow/Airflow/Apache Beam, in either batch or streaming mode.

    Top 3 skill sets:

    1. Hands-on experience with data integration tools and technologies such as Google Dataflow, Airflow, Apache Beam, or equivalent.
    2. Deploy and manage Big Data solutions on GCP, both batch and real-time. Use GCP's infrastructure-as-code tools, particularly Terraform, to provision and configure cloud resources.
    3. Collaborate with the data engineering team to integrate data pipeline code into CI/CD processes (GitLab). Knowledge of networking components (VPC, subnets, firewall rules) for secure communication. Strong programming skills in Python (see the pipeline unit-test sketch after this list).
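
    As a sketch of what "pipeline code in CI/CD" can look like in practice: pipelines merged through GitLab typically ship with unit tests that CI runs on each merge request. Below is a minimal pytest-style test using Apache Beam's built-in testing utilities against the hypothetical parse_row transform from the earlier sketch.

        import apache_beam as beam
        from apache_beam.testing.test_pipeline import TestPipeline
        from apache_beam.testing.util import assert_that, equal_to

        def parse_row(line):
            # Same hypothetical transform as in the earlier pipeline sketch.
            fields = line.split(",")
            return {"id": int(fields[0]), "name": fields[1], "amount": float(fields[2])}

        def test_parse_row():
            with TestPipeline() as p:
                rows = (
                    p
                    | beam.Create(["1,widget,9.99"])
                    | beam.Map(parse_row)
                )
                assert_that(rows, equal_to([{"id": 1, "name": "widget", "amount": 9.99}]))

    In a GitLab pipeline this would run in a test stage (e.g., a job invoking pytest) before any deployment job.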

    Interview process:

    1st interview: virtual.

    2nd interview: in person, if required.

    Behavioral and technical questions, approx. 1 hour in total.