RQ08100: Software Developer

2 days ago


Toronto, Canada · Rubicon Path · Full time

About the job

RQ08100: Software Developer - ETL

General Responsibilities

This role is responsible for designing, developing, maintaining, and optimizing ETL (Extract, Transform, Load) processes in Databricks for data warehousing, data lakes, and analytics. The developer will work closely with data architects and business teams to ensure the efficient transformation and movement of data to meet business needs, including handling Change Data Capture (CDC) and streaming data.

• Use Azure Databricks, Delta Lake, Delta Live Tables, and Spark to process structured and unstructured data.
• Use Azure Databricks/PySpark (good Python/PySpark knowledge required) to build transformations of raw data into the curated zone in the data lake (a sketch of this pattern follows).
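
To make the raw-to-curated responsibility concrete, here is a minimal PySpark sketch of a bronze-to-silver style transformation; the paths, column names, and cleanup rules are illustrative assumptions, not details from the posting.

```python
# A minimal raw-to-curated (bronze-to-silver) transformation sketch.
# All paths and column names are hypothetical.
from pyspark.sql import functions as F

# `spark` is the SparkSession Databricks provides in notebooks.
raw = spark.read.format("delta").load("/mnt/datalake/raw/encounters")

curated = (
    raw.dropDuplicates(["encounter_id"])                    # remove duplicate records
       .withColumn("admit_date", F.to_date("admit_date"))   # normalize string dates
       .filter(F.col("patient_id").isNotNull())             # drop rows missing the key
)

curated.write.format("delta").mode("overwrite").save("/mnt/datalake/curated/encounters")
```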

• Use Azure Databricks/PySpark/SQL (good SQL knowledge required) to develop and/or troubleshoot transformations of curated data into FHIR (see the sketch after this list).
• Understand the requirements.
• Recommend changes to models to support ETL design.
• Define primary keys, indexing strategies, and relationships that enhance data integrity and performance across layers.
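
The posting does not prescribe a FHIR mapping technique, but a curated-to-FHIR transformation of the kind it describes could be sketched as below, shaping curated rows into FHIR Patient resources; every path and field mapping here is a hypothetical assumption.

```python
# A hypothetical curated-to-FHIR shaping step; table paths and
# field mappings are assumptions, not the Ministry's actual model.
from pyspark.sql import functions as F

patients = spark.read.format("delta").load("/mnt/datalake/curated/patients")

fhir_patients = patients.select(
    F.to_json(
        F.struct(
            F.lit("Patient").alias("resourceType"),                     # FHIR resource type
            F.col("patient_id").alias("id"),
            F.col("birth_date").alias("birthDate"),
            F.array(F.struct(F.col("family_name").alias("family"))).alias("name"),
        )
    ).alias("resource")
)

fhir_patients.write.format("delta").mode("overwrite").save("/mnt/datalake/fhir/patient")
```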

• Define the initial schemas for each data layer.
• Assist with data modelling and updates of source-to-target mapping documentation.
• Document and implement schema validation rules to ensure incoming data conforms to expected formats and standards.
• Design data quality checks within the pipeline to catch inconsistencies, missing values, or errors early in the process (illustrated in the sketch below).
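
As an illustration of in-pipeline schema validation and quality checks, a minimal sketch follows; the table path, columns, and the 1% threshold are assumptions for illustration.

```python
# A minimal data quality gate; path, columns, and threshold are hypothetical.
from pyspark.sql import functions as F

df = spark.read.format("delta").load("/mnt/datalake/raw/patients")

total = df.count()
missing_keys = df.filter(F.col("patient_id").isNull()).count()
bad_dates = df.filter(
    F.col("birth_date").isNotNull() & F.to_date("birth_date").isNull()
).count()

# Fail fast so errors are caught early rather than propagated downstream.
if total == 0 or missing_keys / total > 0.01:
    raise ValueError(f"Quality gate failed: {missing_keys}/{total} rows missing patient_id")
if bad_dates:
    raise ValueError(f"Quality gate failed: {bad_dates} unparseable birth_date values")
```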

• Proactively communicate with business and IT experts on any changes required to conceptual, logical, and physical models; communicate and review timelines, dependencies, and risks.
• Develop the ETL strategy and solution for different sets of data modules.
• Understand the tables and relationships in the data model.
• Create low-level design documents and test cases for ETL development.
• Implement error catching, logging, retry mechanisms, and handling of data anomalies (see the sketch below).
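
A minimal sketch of the error-catching, logging, and retry pattern named above; the function name, attempt count, and backoff policy are assumptions.

```python
# A hypothetical retry/logging wrapper for an ETL step.
import logging
import time

log = logging.getLogger("etl")

def run_with_retry(step, attempts=3, backoff_seconds=30):
    """Run a zero-argument ETL step, logging and retrying on failure."""
    for attempt in range(1, attempts + 1):
        try:
            return step()
        except Exception:
            log.exception("Step failed (attempt %d/%d)", attempt, attempts)
            if attempt == attempts:
                raise                                   # surface the anomaly after final attempt
            time.sleep(backoff_seconds * attempt)       # linear backoff between retries
```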

• Create the workflow and pipeline designs.
• Develop and test data pipelines with incremental and full loads (a sketch of the incremental pattern follows).
• Develop high-quality ETL mappings/scripts/notebooks.
• Develop and maintain pipelines from the Oracle data source to Azure Delta Lake and FHIR.
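
A hedged sketch of an incremental (upsert) load into a Delta table via MERGE, with the full-load alternative noted as a plain overwrite; the paths and key column are assumptions.

```python
# Incremental load: merge staged changes into the target Delta table.
# Paths and the join key are hypothetical.
from delta.tables import DeltaTable

updates = spark.read.format("delta").load("/mnt/datalake/staging/encounter_changes")

target = DeltaTable.forPath(spark, "/mnt/datalake/curated/encounters")

(target.alias("t")
    .merge(updates.alias("s"), "t.encounter_id = s.encounter_id")
    .whenMatchedUpdateAll()       # apply changed rows
    .whenNotMatchedInsertAll()    # add brand-new rows
    .execute())

# A full load would instead overwrite the table wholesale:
# updates.write.format("delta").mode("overwrite").save("/mnt/datalake/curated/encounters")
```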

• Perform unit testing.
• Ensure performance monitoring and improvement, including performance reviews and data consistency checks.
• Troubleshoot performance and ETL issues; log activity for each pipeline and transformation.
• Review and optimize overall ETL performance.

• Perform end-to-end integrated testing for full and incremental loads.
• Plan for go-live and production deployment.
• Configure parameters and scripts for go-live; test and review the instructions.

• Create release documents and help build and deploy code across servers.
• Provide go-live support and post-go-live review.
• Review the existing ETL process and tools and provide recommendations for improving performance and reducing ETL timelines.
• Review the infrastructure and remediate issues for overall process improvement.

• Provide knowledge transfer to Ministry staff and develop documentation on the work completed.
• Document work and share the ETL end-to-end design, troubleshooting steps, configuration, and script reviews.
• Transfer documents and scripts to the Ministry and review the documents with Ministry staff.

Skills

Experience and Skill Set Requirements

Please note this role is part of a Hybrid Work Arrangement and resource(s) will be required to work a minimum of 3 days per week at the office location.

Must-Have Skills

• 7+ years using ETL tools such as Microsoft SSIS, stored procedures, and T-SQL.
• 2+ years with Delta Lake, Databricks, and Azure Databricks pipelines.
• 2+ years with Python and PySpark.
• Solid understanding of the Medallion Architecture (Bronze, Silver, Gold) and experience implementing it in production environments.
• Hands-on experience with CDC tools (e.g., GoldenGate) for managing real-time data.
• SQL Server and Oracle.

Experience

• Experience of 7+ years working with SQL Server, T-SQL, Oracle, PL/SQL development, or similar relational databases.
• Experience of 2+ years working with Azure Data Factory, Databricks, and Python development.
• Experience building data ingestion and change data capture using Oracle GoldenGate.
• Experience in designing, developing, and implementing ETL pipelines using Databricks and related tools to ingest, transform, and store large-scale datasets.
• Experience in leveraging Databricks, Delta Lake, Delta Live Tables, and Spark to process structured and unstructured data.
• Experience building databases and data warehouses and working with delta and full loads.
• Experience with data modeling and tools, e.g., SAP PowerDesigner, Visio, or similar.
• Experience working with SQL Server SSIS or other ETL tools, with solid knowledge of and experience with SQL scripting.
• Experience developing in an Agile environment.
• Understanding of data warehouse architecture with a delta lake.
• Ability to analyze, design, develop, test, and document ETL pipelines from detailed and high-level specifications, and assist in troubleshooting.
• Ability to utilize SQL to perform DDL tasks and complex queries.

• Good knowledge of database performance optimization techniques.
• Ability to assist in requirements analysis and subsequent development.
• Ability to conduct unit testing and assist in test preparations to ensure data integrity.
• Work closely with Designers, Business Analysts, and other Developers.

• Liaise with Project Managers, Quality Assurance Analysts, and Business Intelligence Consultants.
• Design and implement technical enhancements of the Data Warehouse as required.

Development, Database and ETL Experience (60 points)

• Experience in developing and managing ETL pipelines, jobs, and workflows in Databricks.
• Deep understanding of Delta Lake for building data lakes and managing ACID transactions, schema evolution, and data versioning.

• Experience automating ETL pipelines using Delta Live Tables, including handling Change Data Capture (CDC) for incremental data loads (see the sketch after this list).
• Proficient in structuring data pipelines with the Medallion Architecture to scale data pipelines and ensure data quality.
• Hands-on experience developing streaming tables in Databricks using Structured Streaming and readStream to handle real-time data.
• Expertise in integrating CDC tools like GoldenGate or Debezium for processing incremental updates and managing real-time data ingestion.
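
As a hedged illustration of the Delta Live Tables CDC pattern named above, the sketch below streams hypothetical change records and applies them with apply_changes; the table names, key, and sequencing column are assumptions, and create_streaming_table reflects the current DLT Python API.

```python
# A minimal Delta Live Tables CDC sketch; source table, key, and
# sequencing column are hypothetical.
import dlt
from pyspark.sql import functions as F

@dlt.view
def encounters_changes():
    # Stream the change records landed by the CDC tool (e.g., GoldenGate).
    return spark.readStream.table("cdc.encounters_changes")

# Declare the target streaming table, then apply the changes into it.
dlt.create_streaming_table("encounters_silver")

dlt.apply_changes(
    target="encounters_silver",
    source="encounters_changes",
    keys=["encounter_id"],
    sequence_by=F.col("change_ts"),  # orders changes so the latest wins
)
```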

• Experience using Unity Catalog to manage data governance and access control and to ensure compliance.
• Skilled in managing clusters, jobs, autoscaling, monitoring, and performance optimization in Databricks environments.
• Knowledge of using Databricks Autoloader for efficient batch and real-time data ingestion (illustrated below).
• Experience with data governance best practices, including implementing security policies, access control, and auditing with Unity Catalog.
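
An illustrative Autoloader ingestion sketch follows; every path, the file format, and the target table name are assumptions.

```python
# Autoloader (cloudFiles) ingestion of newly landed files into a bronze table.
# Paths, format, and table name are hypothetical.
stream = (
    spark.readStream.format("cloudFiles")
         .option("cloudFiles.format", "json")
         .option("cloudFiles.schemaLocation", "/mnt/datalake/_schemas/events")
         .load("/mnt/landing/events")
)

(stream.writeStream
       .option("checkpointLocation", "/mnt/datalake/_checkpoints/events")
       .trigger(availableNow=True)   # process all new files, then stop (batch-style)
       .toTable("bronze.events"))
```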

• Proficient in creating and managing Databricks Workflows to orchestrate job dependencies and schedule tasks.
• Strong knowledge of Python, PySpark, and SQL for data manipulation and transformation.
• Experience integrating Databricks with cloud storage solutions such as Azure Blob Storage, AWS S3, or Google Cloud Storage.
• Familiarity with external orchestration tools like Azure Data Factory.

• Implementing logical and physical data models.
• Knowledge of FHIR is an asset.

Design Documentation and Analysis Skills (20 points)

• Demonstrated experience in creating design documentation such as:
  • Schema definitions
  • Error handling and logging
  • ETL process documentation
  • Job scheduling and dependency management
  • Data quality and validation checks
  • Performance optimization and scalability plans
  • Troubleshooting guides
  • Data lineage
  • Security and access control policies applied within ETL
• Experience in fit-gap analysis, system use case reviews, requirements reviews, and coding exercises and reviews.
• Participate in defect fixing, testing support, and development activities for ETL.

• Analyze and document solution complexity and interdependencies, including providing support for data validation.
• Strong analytical skills for troubleshooting, problem-solving, and ensuring data quality.

Certifications (10 points)

• Databricks Certified Professional Data Engineer
• AWS Certified Data Analytics - Specialty
• Google Cloud Professional Data Engineer

Communication, Leadership Skills and Knowledge Transfer (10 points)

• Ability to collaborate effectively with cross-functional teams and communicate complex technical concepts to non-technical stakeholders.
• Strong problem-solving skills and experience working in an Agile or Scrum environment.

• Ability to provide technical guidance and support to other team members on Databricks best practices.
• Must have previous work experience in conducting knowledge transfer sessions, ensuring the resources receive the required knowledge to support the system.
• Must develop documentation and materials as part of a review and knowledge transfer to other members.


