ETL Engineer - Calgary, Canada - CP

    Full time
    Description

    POSITION ACCOUNTABILITIES

  • Utilize Informatica PowerCenter, IICS Data Integration, and IICS Application Integration to perform data integration tasks
  • Develop and maintain ETL workflows, adhering to team standards for naming, promotion, data flow, documentation, and code repository
  • Design, build, and optimize data pipelines and workflows for efficient data processing and analysis
  • Collaborate closely with project teams, including Business Analysts, Solution Architects, Project Managers, Application Developers, and BI Developers
  • Collaborate with operational teams such as DBAs, Service Operations Center, BI developers, and SAP developers
  • Participate in peer reviews of Informatica workflows and production deployments
  • Perform data modeling, logic development, performance tuning, scheduling, and validation of data flows
  • Provide immediate incident resolution for data, performance, and integration problems
  • Conduct incident follow-up with root cause analysis and manage incident change release and dev/acceptance reconciliation
  • Maintain and provide support for Informatica and Linux middleware, including patching, installations, shell scripting, updates, and configuration
  • Stay updated with emerging trends in data management, ETL, and data engineering
  • Availability to provide 24/7 on-call support (on a rotational basis)

    POSITION REQUIREMENTS

  • Bachelor's degree in Computer Science, Information Systems, or equivalent education or work experience
  • 5+ years' experience in ETL development
  • 5+ years' experience with SQL and scripting skills for data manipulation and transformation
  • 3+ years' experience with big data patterns, data modeling (Star, Snowflake, Relational), NoSQL, and data engineering concepts
  • 3+ years' experience with database systems, including Oracle or SQL Server
  • 3+ years' experience in data pipeline design, implementation, and optimization
  • Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and their data services
  • 3+ years' experience with data warehousing, data lakes, and data integration techniques
  • Recent experience with database systems, including Azure Synapse, Snowflake, and SAP HANA, is nice to have
  • Knowledge of system integration using APIs, MQ, FTP, and other integration software is nice to have
  • Working knowledge of Linux shell scripting, Python, and R is nice to have

    WHAT CPKC HAS TO OFFER

  • Flexible and competitive benefits package
  • Employer Funded Retirement Plan
  • Employee Share Purchase Plan
  • Performance Incentive Program
  • Annual Fitness Subsidy
  • Part-time Studies Program