Data Architect Modeler 5782

4 weeks ago


Toronto, Canada · Foilcon · Full time

General Responsibilities


  • Review business requirements; familiarize yourself with and understand the business rules and the transactional data model
  • Define conceptual, logical, and physical models, mapping from the data sources to the curated model and data marts.

a) Analyze requirements and recommend changes to the physical model.

b) Develop scripts for the physical model, create database and/or delta lake file structure.

c) Access Oracle DB environments, set up the tools necessary for developing the solution.

  • Implement data design methodologies, historical and dimensional models

a) Develop curated model to store historical data captured incrementally from source

b) Design dimensional data mart models, create source-to-target-mapping documentation, design and document data transformation from curated model to data mart

c) Perform data profiling, assess data accuracy, design and document data quality and master data management rules

  • Functionality review, data load review, performance review, and data consistency checks.

a) Help troubleshoot data mart design issues

b) Review performance of ETL with developers and suggest improvements

  • Participate in end-to-end integrated testing for Full Load and Incremental Load and advise on issues
  • Plan for Go Live and Production Deployment.

a) Work with system administrator, ETL developers and ministry team to define production deployment steps.

b) Configure parameters, scripts for go live. Test and review the instructions.

c) Review release documentation

  • Go Live Support and Review after Go Live.

a) Review data models, the ETL process, and tools, and provide recommendations on improving performance and reducing ETL timelines.

b) Review Infrastructure and any performance issues for overall process improvement

  • Proactively communicate with stakeholders on any changes required to conceptual, logical and physical models, communicate and review dependencies and risks.
  • Knowledge Transfer to Ministry staff, development of documentation on the work completed.

a) Share documentation and walk through end-to-end working knowledge of the architecture, troubleshooting steps, and configuration and script reviews.

b) Hand over documents and scripts, and review the documents with ministry staff.
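The responsibility above of developing a curated model that stores historical data captured incrementally from the source is commonly implemented as a Type 2 slowly changing dimension merge. The following is a minimal sketch of that pattern in plain Python, not the ministry's actual design; the table shape, the `client_id` key, and the `valid_from`/`valid_to` columns are hypothetical names chosen for illustration.

```python
from datetime import date

def scd2_merge(history, incoming, key="client_id", as_of=None):
    """Merge an incremental source extract into a Type 2 history table.

    `history` is a list of row dicts carrying `valid_from`/`valid_to`
    (valid_to=None marks the current version). Changed rows are closed
    out with `as_of` and a new current version is appended; unchanged
    rows are left alone, so reloading the same extract is idempotent.
    """
    as_of = as_of or date.today()
    # Index the currently-open version of each business key.
    current = {r[key]: r for r in history if r["valid_to"] is None}
    for row in incoming:
        old = current.get(row[key])
        attrs = {k: v for k, v in row.items() if k != key}
        if old is None or any(old.get(k) != v for k, v in attrs.items()):
            if old is not None:
                old["valid_to"] = as_of  # close the superseded version
            history.append({key: row[key], **attrs,
                            "valid_from": as_of, "valid_to": None})
    return history
```

In a Databricks environment the same close-and-append logic would typically be expressed as a Delta Lake `MERGE INTO` statement rather than row-by-row Python; the sketch only illustrates the versioning rule itself.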


Data Modeler Requirements:


  • 7+ years of BI Data Architect experience in enterprise application and solution design/development, including data warehousing, data lake implementations, and dimensional modelling.
  • Collect business-level questions and propose approaches to address business needs and provide data insights.
  • Expand documentation and knowledge of business processes relative to available data to provide contextual guidance for operation/project, reporting and insights generation.
  • Ability to design and translate complex technical concepts into executable development work packages.
  • Knowledge of BI tools for metadata modeling and report design (e.g. Power BI)
  • MS SQL Server Technology, Azure Data Lake, Azure Databricks
  • Expert knowledge developing data warehouse solutions on the Microsoft stack (Azure Data Lake, SQL, ADF, Databricks, Power BI) to store and retrieve centralized information. Experience designing data warehouses using dimensional and delta lake concepts.
  • Create/maintain enterprise data model and data dictionary. Help development team to optimize database performance. Coordinate with the Integration department to identify future needs and requirements.
  • Extensive knowledge of data modelling tools (e.g. SAP PowerDesigner, Visio)
  • Review, install and configure information systems to ensure functionality and security. Analyze structural requirements for new data warehouse and applications
  • Experience using Oracle database server and tools (12c, 19c), PL/SQL for development of Business Intelligence applications.
  • Demonstrated skills in writing SQL stored procedures and packages for datamarts and reporting.
  • Demonstrated experience in Azure DevOps
  • Demonstrated experience in performance tuning of Business Intelligence applications, including data model and schema optimization


Skills:

  • 7+ years in data modelling and data warehouse design (Must Have)
  • 2+ years Azure Data Lake and Azure Databricks SQL Warehouse (Must Have)
  • 5+ years SQL (Must Have)


Assets:

  • Knowledge of Curam IBM COTS solutions (Social Assistance Management System)
  • ETL design concepts
  • Knowledge of Enterprise Architecture tools and frameworks (ArchiMate, TOGAF, Zachman)


Evaluation Criteria:


Design Documentation and Analysis Skills (30 points)

  • Demonstrated experience in creating both Functional Design Documents (FDD) and Detailed Design Documents (DDD).
  • Experience in Fit-Gap analysis, system use case reviews, requirements reviews, coding exercises and reviews.
  • Experience developing and maintaining a plan to address contract deliverables through the identification of significant milestones and expected results, with weekly status reporting.
  • Work with the Client and Developer(s) assigned to refine/confirm Business Requirements
  • Participate in defect fixing, testing support and development activities for Informatica/ETL tool. Assist with defect fixing and testing support for Cognos reports.
  • Analyze and document solution complexity and interdependencies


BI Data Modelling and Technical Skills (40 points)

  • Understanding of Data Modelling for Business Intelligence including:

a. Expert Knowledge of data warehouse design methodologies, delta lake and dimensional modeling in particular

b. Understanding of Extract/Transform/Load processes to transform data for reporting/BI purposes

c. Ability to define schema for reporting databases

d. Experience with advanced modeling tools

  • Knowledge of BI tools for metadata modeling and report design (e.g. PowerBI, Cognos 10/11)
  • Extensive knowledge and experience in MS SQL Server Technology, Azure Databricks SQL Warehouse, Azure Data Lake
  • Experience using Oracle database server and tools (12c, 19c), PL/SQL for development of Business Intelligence applications. Demonstrated skills in writing and reverse engineering SQL stored procedures and packages for datamarts and reporting.
  • Demonstrated experience in performance tuning of Business Intelligence applications, including data model and schema optimization
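Item c above, defining a schema for reporting databases, usually means laying out a star schema: a fact table of measures surrounded by conformed dimensions. The sketch below illustrates that shape with SQLite purely for self-containment; the mart would really live in SQL Server or Databricks, and the payment/date/program names are hypothetical examples, not the actual mart design.

```python
import sqlite3

# A minimal star schema: one fact table keyed to two dimensions.
# Integer surrogate keys on the dimensions; measures on the fact.
DDL = """
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY,
                          full_date TEXT, fiscal_year INTEGER);
CREATE TABLE dim_program (program_key INTEGER PRIMARY KEY,
                          program_code TEXT, program_name TEXT);
CREATE TABLE fact_payment (
    date_key    INTEGER REFERENCES dim_date(date_key),
    program_key INTEGER REFERENCES dim_program(program_key),
    amount      REAL
);
"""

def build_mart(conn):
    """Create the schema, load one row per table, and run a typical
    reporting query: fact measures aggregated through a dimension."""
    conn.executescript(DDL)
    conn.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024)")
    conn.execute("INSERT INTO dim_program VALUES (1, 'SA', 'Social Assistance')")
    conn.execute("INSERT INTO fact_payment VALUES (20240101, 1, 250.0)")
    total, = conn.execute(
        "SELECT SUM(f.amount) FROM fact_payment f "
        "JOIN dim_program p ON p.program_key = f.program_key "
        "WHERE p.program_code = 'SA'").fetchone()
    return total
```

The design choice the schema illustrates is the one BI tools such as Power BI or Cognos expect: filters land on narrow dimension tables and aggregations run over the fact table, which is what makes the metadata modeling mentioned above straightforward.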


Quality Assurance (20 points)

  • Demonstrated experience in defining and executing tests across the development lifecycle (unit testing, system testing, user acceptance testing) and using results to refine database design


Knowledge Transfer (10 points)

  • The Architect / Modeler must have previous work experience in conducting Knowledge Transfer and training sessions, ensuring that resources receive the knowledge required to support the system. The resource must develop learning activities using a review-watch-do methodology and demonstrate the ability to prepare and present.
  • Development of documentation and materials as part of a review and knowledge transfer to other members
  • Development of specific activities as part of a review (hand-over to ministry staff) and a building-block approach that builds on knowledge transfer and skills development from the previous stage to the next
  • Development and facilitation of classroom-based or virtual instructor-led demo sessions for Developers
  • Monitor identified milestones and submit status reports to ensure Knowledge Transfer is fully completed
