TechDigital Corporation

Data Lab SME (BB-C30EA)

Found in: Neuvoo Bulk CA


Full Legal Name:  
Address with ZIP Code:  
Contact Number:  
Email ID:  
Willing to Relocate (Yes/No):  
Work Authorization:  
Current Rate:  
Earliest Availability to Join:  
LinkedIn Profile (Required):  
Current and Last Few Employers:  


Must-Have Skill: Azure Data tools  
Candidate Self-Rating (out of 5):  
Total Years of Experience:  
Last Used (Year):  
Client Project (where the candidate worked on this technology):  


Job description:


·         Analyze source system data to assess data quality. Work with technical and business representatives to determine strategies for handling data anomalies that are identified.

·         Design ETL processes and develop source-to-target data mappings, integration workflows, and load processes. Gather, understand, and validate project specifications, and participate in ETL architecture design reviews

·         Experience with Data Lake infrastructures, Azure Blob Storage, Azure Data Factory, Azure Databricks, Azure Event Hubs, and Event Grid

·         Identify problems, develop ideas, and propose solutions in situations requiring analytical, evaluative, or constructive thinking in daily work

·         Document designs, architect data maps, develop data-quality components, and establish and/or conduct unit tests

·         Perform reviews and quality checks after data has been loaded

·         Connect to data sources, import data, and transform data for business intelligence

·         Excellent analytical thinking for translating data into informative visuals and reports


·         5+ years of experience with data warehouse technical architectures, ETL/ELT, and BI reporting/analytic tools

·         Data Engineering/Data Modeling experience in a Dev/QA/Prod capacity

·         Expert-level Kusto Query Language (KQL) and/or SQL and data modeling skills

·         Enterprise-scale technical experience with cloud and hybrid infrastructures, architecture designs, database migrations, and technology management

·         Advanced analytics experience, including Azure Databricks and visualization tools such as Power BI

·         Experience in troubleshooting and resolving database integrity and performance issues

·         Follows data standards, resolves data issues, and completes unit testing and system documentation for Extract, Transform, Load (ETL) processes

·         Knowledge of Hadoop ecosystem an asset (Hive, Spark, HDFS, NiFi)

·         Knowledge of Python/Scala is an asset

·         Strong communication skills, including the ability to speak clearly to both technical and non-technical audiences

·         Self-driven, highly motivated and able to learn quickly

Posted: 1 day ago

Location: Montreal, Canada