Job title: Data Integration Developer
Location: Montreal (office attendance from day one – hybrid mode, 3x per week)

Description:
As a Data Integration / ETL Developer, you will be a member of the GRC Data Warehouse development team, with a specific focus on sourcing data and developing data solutions for the Legal, Compliance, Audit, and Risk Management functions. In this role you will be primarily responsible for developing data workflows, views, and stored procedures, in addition to performing data analysis and monitoring and tuning queries and data loads. You will work closely with data providers, data scientists, data developers, and data analytics teams to implement client-specific business requirements and requests.

YOUR KEY RESPONSIBILITIES:
• Develop ETLs, stored procedures, triggers, and views on our DB2-based Data Warehouse
• Perform data profiling and analysis on source system data to ensure it can be integrated and represented properly in our models
• Monitor the performance of queries and data loads and perform tuning as necessary
• Provide assistance and guidance during the QA and UAT phases to quickly confirm the validity of potential issues and to determine the root cause and best resolution of verified issues

SKILLS / QUALIFICATIONS:
• Bachelor’s degree in Computer Science, Software Engineering, Information Technology, or a related field required
• At least 7 years of experience in data development and solutions in highly complex data environments with large data volumes
• At least 7 years of experience developing complex ETLs with Informatica PowerCenter a must
• At least 7 years of SQL / PL/SQL experience, with the ability to write ad-hoc and complex queries to perform data analysis
• At least 7 years of experience developing complex stored procedures, triggers, MQTs, and views on IBM DB2 (experience with v10.5 a plus)
• Experience with performance tuning of DB2 tables, queries, and stored procedures
• Experience sourcing data from Kafka- and Splunk-based data sources
• Experience with Python for scripting
• Experience with Autosys or Airflow a plus
• An understanding of E-R data models (conceptual, logical, and physical)
• Strong understanding of advanced data warehouse concepts (factless fact tables, temporal and bi-temporal models, etc.)
• Strong analytical skills, including a thorough understanding of how to interpret customer business requirements and translate them into technical designs and solutions
• Experience with both Waterfall and Agile development methodologies
• Strong communication skills, both verbal and written
• Capable of collaborating effectively across a variety of IT and business groups, across regions and roles, and able to interact effectively with all levels
• Self-starter with a proven ability to manage multiple concurrent projects with minimal supervision; can manage a complex, ever-changing priority list and resolve conflicts between competing priorities
• Strong problem-solving skills, with the ability to identify where focus is needed and bring clarity to business objectives, requirements, and priorities
Education Level: Bachelor’s Degree
Experience Level: Level 4

514-525-5777 | info@fxinnovation.com
400, boul. De Maisonneuve Ouest, bureau 1100, Montréal (Québec)
© 2021 FX Innovation. All rights reserved.