Data Architect

  • Full Time
  • Toronto

GTT, LLC

Data Architect / Modeller
Contract Duration: 2 years
Job Overview:
As part of the modernization of the Social Assistance analytics platform, review the existing data model and redesign it to be scalable and flexible to meet current program and Ministry needs for analytical and operational reporting.
Support the data migration plan to redesign current on-premises or hybrid processes from a SQL Server environment to a cloud-based delta lake built on the Azure Data Lake service, using Azure Databricks
General Responsibilities:
Review business requirements and become familiar with business rules and the transactional data model
Define the conceptual, logical, and physical models, and the mapping from the data source to the curated model and data mart.
Analyze requirements and recommend changes to the physical model.
Develop scripts for the physical model and create the database and/or delta lake file structures.
Access Oracle DB environments and set up the necessary tools for developing solutions.
Implement data design methodologies and historical and dimensional models
Develop a curated model to store historical data captured incrementally from the source
Design dimensional data mart models, create source-to-target mapping documentation, and design and document data transformation from the curated model to the data mart
Perform data profiling, assess data accuracy, and design and document data quality and master data management rules
Perform functionality reviews, data load reviews, performance reviews, and data consistency checks.
Help troubleshoot data mart design issues
Review the performance of ETL with developers and suggest improvements
Participate in end-to-end integrated testing for Full Load and Incremental Load and advise on issues
Plan for Go Live and production deployment.
Work with the system administrator, ETL developers, and ministry team to define production deployment steps.
Configure parameters and scripts for Go Live. Test and review the instructions.
Review release documentation
Provide Go Live support and review after Go Live.
Review data models, ETL processes, and tools and provide recommendations on improving performance and reducing ETL timelines.
Review infrastructure and any performance issues for overall process improvement
Proactively communicate with stakeholders on any changes required to conceptual, logical, and physical models, and communicate and review dependencies and risks.
Transfer knowledge to Ministry staff and develop documentation on the work completed.
Share documentation and walk through end-to-end working knowledge of the architecture, troubleshooting steps, configuration, and script review.
Transfer documents, scripts, and document reviews.
Must Haves:
7+ years in data modeling and data warehouse design
2+ years Azure Data Lake and Azure Databricks SQL Warehouse
5+ years SQL
Assets:
Knowledge of the IBM Curam COTS solution (Social Assistance Management System)
ETL design concepts
Knowledge of Enterprise Architecture tools and frameworks (ArchiMate, TOGAF, Zachman)
Evaluation Criteria:
Design Documentation and Analysis Skills:
Demonstrated experience in creating both Functional Design Documents (FDD) & Detailed Design Documents (DDD).
Experience in Fit-Gap analysis, system use case reviews, requirements reviews, coding exercises, and reviews.
Experience in developing and maintaining a plan to address contract deliverables through the identification of significant milestones and expected results, with weekly status reporting.
Work with the Client & Developer(s) assigned to refine/confirm Business Requirements
Participate in defect fixing, testing support, and development activities for ETL pipelines. Assist with defect fixing and testing support for PowerBI reports.
Analyze and document solution complexity and interdependencies
BI Data Modelling and Technical Skills:
Understanding of Data Modelling for Business Intelligence including:
Expert knowledge of data warehouse design methodologies, delta lakes, and dimensional modeling in particular
Understanding of Extract/Transform/Load processes to transform data for reporting/BI purposes
Ability to define a schema for reporting databases
Experience with advanced modeling tools
Knowledge of BI tools for metadata modeling and report design (e.g. PowerBI, Cognos)
Good knowledge of and experience in MS SQL Server technology, Azure Databricks SQL Warehouse, and Azure Data Lake
Experience using T-SQL and PL/SQL for the development of Business Intelligence applications.
Demonstrated skills in writing and reverse engineering SQL stored procedures and packages for data marts and reporting.
Demonstrated experience in performance tuning of Business Intelligence applications, including data model and schema optimization
Quality Assurance:
Demonstrated experience in defining and executing tests across the development lifecycle (unit testing, system testing, user acceptance testing) and using results to refine database design
Knowledge Transfer:
The Architect / Modeller must have previous work experience in conducting Knowledge Transfer and training sessions, ensuring that resources receive the required knowledge to support the system. The resource must develop learning activities using the review-watch-do methodology and demonstrate the ability to prepare and present.
Development of documentation and materials as part of a review and knowledge transfer to other members
Development of specific activities as part of a review (handover to ministry staff) and a building-block approach, which builds on knowledge transfer and skills development from the previous stage to the next
Development and facilitation of classroom-based or virtual instructor demo-led sessions for developers
Monitor identified milestones and the submission of status reports to ensure Knowledge Transfer is fully completed
Note:
Hybrid Role – 3 days per week on-site

To apply, please visit the following URL: