Business Intelligence ETL 5781

2 weeks ago


Toronto, Canada · Foilcon · Full time

General Responsibilities

Design, develop and implement an ingestion framework from the Oracle source to Azure Data Lake, covering both the initial load and incremental ETL. The tools used are:
- Oracle GoldenGate (knowledge and experience are an asset but not required) for data ingestion and change data capture (currently in the final stages of a proof of concept)
- Azure Data Factory (expert knowledge) to maintain the pipeline from Oracle to Azure Data Lake
- Azure Databricks/PySpark (good Python knowledge required) to build transformations of raw data into the curated zone of the data lake (a PySpark sketch of this step appears after the responsibilities list below)
- Azure Databricks/Python/SQL (good SQL knowledge required) to develop and/or troubleshoot transformations of curated data into the data mart model

- Review the requirements, database tables, and database relationships; identify gaps and inefficiencies in the current production reporting environment and provide recommendations to address them in the new platform.
- Design the ingestion framework and CDC; preferred tools are Oracle GoldenGate and Azure Data Factory.
- Prepare design artifacts.
- Work with the IT partner on configuration of GoldenGate; responsible for providing direction and "how to" guidance.
- Maintain the dynamic pipeline for ETL ingestion to add new tables and data elements.
- Data design: physical model mapping from the data source to the reporting destination.
- Understand the requirements and recommend changes to the physical model.
- Assist with data modelling and updates of source-to-target mapping documentation.
- Develop scripts for the physical model and update database and/or data lake structures.
- Access Oracle DB and SQL Server environments; use SSIS, SQL Developer, Azure Data Studio, Azure Data Factory, Databricks and other tools to develop the solution.
- Proactively communicate with business and IT experts on any changes required to conceptual, logical and physical models; communicate and review dependencies and risks.
- Develop the ETL strategy and solution for different sets of data modules.
- Understand the tables and relationships.
- Create low-level design documents and test cases.
- Create the workflows of the package design.
- Develop and test data with incremental and full loads.
- Develop high-quality ETL mappings/scripts/jobs.
- Develop ETL from source to data warehouse.
- Develop ETL from data warehouse to data mart.
- Perform unit testing.
- Ensure performance monitoring and improvement.
- Perform performance reviews and data consistency checks.
- Troubleshoot performance and ETL load issues; log activity for each individual package and transformation.
- Review overall ETL performance.
- Perform end-to-end integrated testing for full and incremental loads.
- Plan for go-live and production deployment.
- Create production deployment steps.
- Configure parameters and scripts for go-live.
- Test and review the instructions.
- Create release documents and help build and deploy code across servers.
- Provide go-live support and post-go-live review.
- Review the existing ETL process and tools and provide recommendations for improving performance and reducing ETL timelines.
- Review infrastructure and any pain points for overall process improvement.
- Provide knowledge transfer to Ministry staff and develop documentation on the work completed.
- Document and share end-to-end ETL working knowledge, troubleshooting steps, configuration and scripts.
- Transfer documents and scripts and review documents.
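
The raw-to-curated Databricks/PySpark step above is only described at a high level; the sketch below shows one plausible shape for it, assuming parquet files landed in a raw zone, a Delta table in the curated zone, and an order_id business key. All paths, container names and columns are hypothetical, not taken from the posting.

```python
# Hypothetical sketch of an incremental raw -> curated load on Azure Databricks.
# Paths, the storage account, and the order_id / order_date columns are assumptions.
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()  # supplied automatically in Databricks notebooks

raw_path = "abfss://lake@storageaccount.dfs.core.windows.net/raw/orders"          # assumed raw zone
curated_path = "abfss://lake@storageaccount.dfs.core.windows.net/curated/orders"  # assumed curated zone

# Read the latest raw extract (as landed by ADF / GoldenGate) and apply light cleansing.
raw_df = (
    spark.read.format("parquet").load(raw_path)
    .withColumn("order_date", F.to_date("order_date"))
    .dropDuplicates(["order_id"])
)

if DeltaTable.isDeltaTable(spark, curated_path):
    # Incremental load: upsert new and changed rows into the curated Delta table by business key.
    (
        DeltaTable.forPath(spark, curated_path).alias("t")
        .merge(raw_df.alias("s"), "t.order_id = s.order_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )
else:
    # Initial load: create the curated table from the full raw extract.
    raw_df.write.format("delta").mode("overwrite").save(curated_path)
```

In a real pipeline the path and key columns would typically be parameterized so the same notebook can be driven table-by-table from an Azure Data Factory pipeline, which lines up with the "dynamic pipeline" responsibility above.
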
Experience:
- 7+ years of experience working with SQL Server, SSIS and T-SQL development, or similar
- 2+ years of experience working with Azure SQL Database, Azure Data Factory, Databricks and Python development
- Experience building data ingestion and change data capture using Oracle GoldenGate
- Experience building databases, data warehouses and dimensional data marts, and working with delta and full loads
- Experience with ETL tools such as SSIS, ADF, Databricks and other cloud tools
- Experience working with MS SQL Server on premises and within the Azure environment
- Experience with data modeling and related tools, e.g. SAP PowerDesigner and Visio
- Experience with snowflake and star schema models, and experience designing data warehouse solutions using slowly changing dimensions (a hypothetical Type 2 dimension sketch appears after the Assets list below)
- Experience working with SQL Server SSIS and other ETL tools; solid knowledge of and experience with SQL and other RDBMS (Oracle, PL/SQL)
- Understanding of data warehouse architecture with a delta lake and dimensional model
- Ability to analyze, design, develop, test and document ETL programs from detailed and high-level specifications, and to assist in troubleshooting
- Ability to use SQL for tasks other than data transformation (DDL, complex queries)
- Good knowledge of database performance optimization techniques
- Ability to assist in requirements analysis and subsequent development
- Ability to conduct unit tests and assist in test preparation to ensure data integrity
- Ability to work closely with designers, business analysts and other developers
- Ability to liaise with project managers, quality assurance analysts and business intelligence consultants
- Ability to design and implement technical enhancements of the data warehouse as required

Skills:
- 7+ years with ETL tools such as Microsoft SSIS and stored procedures (must have)
- 2+ years with Azure Data Lake and Data Warehouse, including building Azure Data Factory and Azure Databricks pipelines (must have)
- 2+ years of Python (must have)
- Oracle GoldenGate
- SQL Server
- Oracle
- Ability to present technical requirements to the business

Assets:
- Knowledge and experience building data ingestion, history and change data capture using Oracle GoldenGate is a major asset but is not required.
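
Since the Experience list above asks for slowly changing dimension design, the sketch below shows one common way a Type 2 dimension load from the curated zone into the data mart might look in Databricks/PySpark: expire the current dimension row when tracked attributes change, then append the new version. The dim_customer table, customer_id key and segment/city attributes are illustrative assumptions, not requirements from the posting.

```python
# Hypothetical Type 2 slowly changing dimension load (curated zone -> data mart).
# Table names, the business key and the tracked attributes are assumptions.
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

src = spark.table("curated.customers")  # assumed curated source table
dim_path = "abfss://lake@storageaccount.dfs.core.windows.net/mart/dim_customer"  # assumed mart location
dim = DeltaTable.forPath(spark, dim_path)

# 1) Source rows that are new, or whose tracked attributes differ from the current dimension row.
current = dim.toDF().filter("is_current = true")
changed = (
    src.alias("s")
    .join(current.alias("d"), F.col("s.customer_id") == F.col("d.customer_id"), "left")
    .filter("d.customer_id IS NULL OR s.segment <> d.segment OR s.city <> d.city")
    .select("s.*")
)
changed = changed.cache()
changed.count()  # materialize before the dimension table is modified below

# 2) Expire the existing current rows for those keys.
(
    dim.alias("d")
    .merge(changed.alias("c"), "d.customer_id = c.customer_id AND d.is_current = true")
    .whenMatchedUpdate(set={"is_current": F.lit(False), "end_date": F.current_date()})
    .execute()
)

# 3) Append the new versions as the current rows.
(
    changed
    .withColumn("is_current", F.lit(True))
    .withColumn("start_date", F.current_date())
    .withColumn("end_date", F.lit(None).cast("date"))
    .write.format("delta").mode("append").save(dim_path)
)
```

In practice this pattern is often driven from the source-to-target mapping documentation, with the business key and tracked-attribute list supplied as metadata per dimension rather than hard-coded.
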
Evaluation Criteria

Design Documentation and Analysis Skills (30 points)
- Demonstrated experience creating both Functional Design Documents (FDD) and Detailed Design Documents (DDD)
- Experience in fit-gap analysis, system use case reviews, requirements reviews, and coding exercises and reviews
- Experience developing and maintaining a plan to address contract deliverables through the identification of significant milestones and expected results, with weekly status reporting
- Work with the client and the assigned developer(s) to refine and confirm business requirements
- Participate in defect fixing, testing support and development activities for the ETL tool
- Assist with defect fixing and testing support for Power BI reports
- Analyze and document solution complexity and interdependencies by function, including providing support for data validation

Development, Database and ETL Experience (60 points)
- Demonstrated experience in Microsoft-specific software development, with a minimum of 7+ years of practical experience
- Experience developing in an agile Azure DevOps environment
- Experience in application mapping to populate data vault and dimensional data mart schemas
- Demonstrated experience in Extract, Transform and Load (ETL) and Extract, Load and Transform (ELT) software development, with a minimum of 7+ years of practical experience
- Experience providing ongoing support for Azure pipeline/configuration and SSIS development
- Experience building data ingestion and change data capture using GoldenGate
- Assist in the development of pre-defined and ad hoc reports that meet coding and accessibility requirements
- Demonstrated experience with Oracle and Microsoft interfaces
- Proficiency in SQL and Python
- Experience implementing logical and physical data models

Knowledge Transfer (10 points)
- The Developer must have previous work experience conducting knowledge transfer and training sessions, ensuring that resources receive the knowledge required to support the system. The resource must develop learning activities using a review-watch-do methodology and demonstrate the ability to prepare and present.
- Development of documentation and materials as part of a review and knowledge transfer to other team members
- Development and facilitation of classroom-based or virtual instructor-led demo sessions for developers
- Monitoring of identified milestones and submission of status reports to ensure knowledge transfer is fully completed


