Job description
- ETL analysis, design, and development
- Source-to-target mapping
- Peer review
- Data validation and solution testing
- DevOps and Code version control
- Metadata documentation
- Production implementation and support
You will work closely with cross-functional teams, including healthcare professionals, data architects, and IT specialists, to develop robust data pipelines, implement data quality controls, and generate insights to support clinical decision-making. The ideal candidate has a strong background in both data engineering and analytics, with a deep understanding of healthcare systems and clinical data standards.
This position will develop primarily in the Epic Caboodle Console and Microsoft SQL Server tools. However, duties may extend to IBM DataStage, Azure Data Factory, and various business intelligence tools. Azure DevOps will be used for version control, code management, and code deployment. All development will observe the OHSU BIAA data governance policies and SDLC.
- Design and develop data pipelines to extract, transform, and load (ETL) clinical data from various sources into formal structures suitable for analysis.
- Contribute to the growth of the OHSU Caboodle Data Warehouse architecture by designing, developing, testing and implementing custom clinical data models.
- Work with BI architects, developers, analysts, and customers (practice managers, data scientists, quality analysts, etc.) to design, create, and publish data models, ETL processes, and metadata using the Epic Caboodle Console, as well as third-party ETL tools.
- Provide complete and consistent documentation of data warehouse content in the Caboodle Console to ensure that users can determine what data is available in the warehouse, how this data is defined, and its lineage to Epic Clarity.
- Document assets in the Analytics Marketplace.
- Develop Epic Clarity and Caboodle data feeds using SSIS or similar tools, ensuring that feeds have undergone appropriate security review and are transported in accordance with the guidelines of the information privacy and security office, as well as any additional security requirements outlined in a business associate agreement or data use agreement.
- Coordinate data integration projects with system architects, DBAs, and vendors.
- Deploy data warehouse content using approved Azure DevOps systems and processes.
Systems Analysis & Operational Support
- Conduct comprehensive analysis of clinical data, employing statistical techniques and data mining methodologies
- Troubleshoot and analyze ETL process failures, data anomalies and other ETL or data warehouse issues identified by automated monitoring, other developers, and end users.
- Work through established processes to resolve any system or data problems and communicate updates and status to affected groups.
- Recommend improvements to ETL processes, tool sets, data models, and monitoring techniques to increase the reliability and efficiency of existing structures and code.
- Create and maintain metadata models that accurately describe custom warehouse data structures.
- Support end users by providing technical expertise, including business logic and information about data pipeline transformations.
- Respond to and track issues using Jira Service Desk.
- Gather additional information from customers, triage the issue, and resolve or escalate it by involving additional team members as required.
- Understand and utilize approved System Development Life Cycle (SDLC) systems, processes, and procedures.
- Manage projects by creating timelines, identifying risks and milestones, and providing status reporting to others as defined.
- Provide support to data warehouse developers, line-of-business analysts, and users to validate data in the data warehouse.
- Investigate reported incidents and work to ensure the validity of the data warehouse.
- Use approved development tools (BusinessObjects, SQL Server Management Studio, etc.) to identify data quality and data relationships to assist with the design of efficient and effective reporting solutions.
- Schedule and oversee report distributions to ensure that data are delivered only to the intended recipients at the intended time.
Bachelor’s degree in computer science, a related field, or a clinical field, and six years of work-related experience in the information technology field or a combination of clinical or operational healthcare environments; OR
Successful candidates must have experience and working knowledge of operating system software, relational and/or hierarchical databases, and technical application support for administrative and clinical systems.
Job-Related Knowledge, Skills and Abilities (Competencies):
- Knowledge of data warehousing architecture and dimensional modeling concepts.
- Ability to read, write and maintain SQL code at an advanced skill level.
- Proven communication, analytical, and problem-solving skills.
- Ability to manage multiple projects on an ongoing basis with excellent attention to detail.
- Ability to accurately document system technical artifacts at a level of detail sufficient for ongoing production support.
Registrations, Certifications and/or Licenses:
- Epic Clarity Data Model Certifications
- Epic Caboodle Developer Certification
- Minimum of three (3) years of SQL Server experience, including SSIS, T-SQL coding, performance tuning, and system optimization.
- Minimum of five (5) years of experience with Microsoft SQL Server T-SQL.
- Minimum of two (2) years of experience in a data warehouse environment utilizing Epic Healthcare Data, Epic Clarity Database, Epic Caboodle Data Warehouse and Epic ETL Processes.