Covid Cheq Integration Documentation
BI - Integration - CovidCheq (Full/Inc)
Last Updated 5/24/2021
Overview
Location
Schedule
Details
Change Log
Troubleshooting Notes
Overview:
The SQL Agent Job for Covid Cheq stages and processes the data needed for COVID-related tables and dashboards. It is the main job for processing COVID data; together with the ETL – Qualtrics Surveys job, it runs all the processes needed for COVID tracking.
The primary purpose of this job is the staging and processing for the COVID-related tables. A number of procedures run to build tables related to COVID tests, cases, and quarantines.
Location:
SQL Agent Job in the dataMarq data warehouse.
Schedule:
The jobs in this project run as part of two SQL Agent jobs:
- ETL - BI-Integration - CovidCheq Inc – Daily every hour from 9am to 6pm
- ETL - BI-Integration - CovidCheq Full – Daily at 12am
Job Details:
STEP 1: Stage Medicat Data
This step runs the master package in the DS-Medicat (link) project which stages all the required data from Medicat into the medicat schema in datamarq_staging.
STEP 2: Stage RHMS Data
This step runs the master package in the DS-RHMS (link) project which stages all the required data from Mercury into the rhms schema in datamarq_staging.
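Both staging steps simply execute a catalog-deployed SSIS master package. As a rough, hypothetical T-SQL equivalent (the folder, project, and package names below are placeholders, not the actual values configured in the job steps):

    -- Illustrative only: kick off a catalog-deployed SSIS package from T-SQL.
    -- Folder/project/package names are placeholders.
    DECLARE @exec_id BIGINT;
    EXEC SSISDB.catalog.create_execution
         @folder_name   = N'BI Integrations',
         @project_name  = N'DS-Medicat',
         @package_name  = N'Master.dtsx',
         @execution_id  = @exec_id OUTPUT;
    EXEC SSISDB.catalog.start_execution @execution_id = @exec_id;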
STEP 3: Execute Tests Procs
This step runs the procedures related to capturing MUMC testing data.
- sp_f_covid_tests
Populates med.f_covid_tests with details on tests conducted by the MUMC
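Conceptually the proc follows the usual stage-to-fact pattern; a minimal sketch, where the staging table and column names are assumptions rather than the real definitions:

    -- Illustrative only: rebuild the fact table from staged Medicat data.
    TRUNCATE TABLE med.f_covid_tests;

    INSERT INTO med.f_covid_tests (patient_id, test_date, test_type, test_result)
    SELECT s.patient_id, s.test_date, s.test_type, s.test_result
    FROM datamarq_staging.medicat.lab_results AS s
    WHERE s.test_type LIKE '%COVID%';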
STEP 4: Execute Tests Summary
This step runs the procedures that translate data from med.f_covid_tests into a summary table used by OIRA for its dashboard.
- sp_s_covid_tests
Populates oira.s_covid_tests with data similar to med.f_covid_tests but deidentified. This table may no longer be strictly necessary, but OIRA originally did not have rights to see identified test data, so it was created and is still used.
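A minimal sketch of the deidentification step, assuming hypothetical column names:

    -- Illustrative only: copy test facts to the OIRA summary table without
    -- any identifying columns (patient_id etc. deliberately left out).
    TRUNCATE TABLE oira.s_covid_tests;

    INSERT INTO oira.s_covid_tests (test_date, test_type, test_result)
    SELECT test_date, test_type, test_result
    FROM med.f_covid_tests;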
STEP 5: Execute Units Procs
This step runs the procedures that OIRA uses to track unit risk for the COVID gating criteria.
- sp_s_covid_units
Populates oira.s_covid_units, which takes data from a Qualtrics staging table containing responses to a critical units survey that identifies the risk level for each unit.
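A rough sketch of the idea, with hypothetical staging table and column names:

    -- Illustrative only: pull each unit's risk level from the latest
    -- critical units survey responses staged from Qualtrics.
    TRUNCATE TABLE oira.s_covid_units;

    INSERT INTO oira.s_covid_units (unit_name, risk_level, response_date)
    SELECT r.unit_name, r.risk_level, r.response_date
    FROM datamarq_staging.qualtrics.critical_units_survey AS r
    WHERE r.response_date = (SELECT MAX(response_date)
                             FROM datamarq_staging.qualtrics.critical_units_survey);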
STEP 6: Execute Quarantine Procs
This step runs the procedures that ResLife and OIRA use to track quarantine cases, COVID cases, and quarantine space.
- sp_f_quarantine_cases
Takes data from RHMS and populates the reslife dimension (d) and fact (f) schema tables that track quarantine rooms, students, and block use. It also populates the OIRA table oira.s_quarantine_cases and totals that are used in the dashboards.
- sp_f_covid_cases
Populates med.f_covid_cases; it runs here because it needs the quarantine data before it can run successfully.
- sp_f_quarantine_cases
Populates the med.f_quarantine_cases table that the med clinic uses for tracking quarantine.
- sp_f_covid_contacts
Populates the contact tracing table med.f_covid_contacts that the med clinic and OIRA use to track close contacts. It also runs here because it needs the quarantine/housing info.
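The procs in this step are order dependent: the quarantine/housing tables must be populated before the case and contact procs can run. Conceptually the step looks like the following (the schema on the proc names is an assumption):

    -- Illustrative only: run the procs in dependency order.
    EXEC dbo.sp_f_quarantine_cases;  -- reslife/oira quarantine tables first
    EXEC dbo.sp_f_covid_cases;       -- needs the quarantine data above
    EXEC dbo.sp_f_covid_contacts;    -- needs quarantine/housing info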
STEP 7: Run Covid Cheq Master
This step runs the master integration job in the DW-CovidCheq project in the BI-Integrations solution. This is the job that sends data to Qualtrics. It runs at this step because we return testing and quarantine data to the Qualtrics systems to power the logic that automatically sets someone’s quarantine status.
- Incremental – uses the modified date on the ops.d_covidcheq_list to send only recently modified records
- Full – sends every record on the table to Qualtrics
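In practice the difference between the two modes is just the filter on the list table, along these lines (the column name and lookback window are assumptions):

    -- Illustrative only: Incremental sends recently modified rows...
    SELECT *
    FROM ops.d_covidcheq_list
    WHERE modified_date >= DATEADD(HOUR, -2, GETDATE());

    -- ...while Full sends every row on the table.
    SELECT *
    FROM ops.d_covidcheq_list;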
STEP 8 and 9: Success or Failure Email
Depending on the status of the job, it exits by sending either a success or a failure message to the BI Team.
- ops.sp_etl_send_email
This is our custom notification proc; it is passed the job id, looks up the success/failure status, and sends the appropriate email and text message to the BI team.
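Both steps call the proc the same way; only the job outcome differs. A hypothetical call (the parameter name and value are assumptions):

    -- Illustrative only: the proc looks up the job's success/failure status
    -- and notifies the BI team by email and text.
    EXEC ops.sp_etl_send_email @job_id = '00000000-0000-0000-0000-000000000000';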
Change Log:
Date | Developer | Change Record |
5/24/2021 | N/A | Initial Documentation |
Troubleshooting Notes:
- Intermittent Failure on Step 7 - Covid Cheq Integration Master
Occasionally the integration with Qualtrics will fail due to a connection error or a timeout on the Qualtrics side. We use the API to interact with Qualtrics, and they sometimes throttle the connection. If there are more than two consecutive failures in this job, it is usually something on our end, not on the Qualtrics side, but an occasional failure is to be expected.
- CDC not capturing deletes
Although it has since been fixed, an issue has popped up in the past where the CDC jobs did not correctly mark soft deletes, so deleted records remained in dim and fact tables during the week. Full refreshes almost always fix this issue, and many procs have been updated to apply soft deletes during an incremental refresh.
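Where procs have been fixed, the usual change is to add a soft-delete branch to the incremental merge, roughly like this (the table, key, and column names are hypothetical):

    -- Illustrative only: flag rows missing from the source as soft deletes
    -- instead of leaving them untouched during an incremental refresh.
    MERGE med.f_covid_tests AS tgt
    USING datamarq_staging.medicat.lab_results AS src
        ON tgt.test_id = src.test_id                  -- natural key
    WHEN MATCHED THEN
        UPDATE SET tgt.test_result   = src.test_result,
                   tgt.modified_date = GETDATE()
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (test_id, test_result, modified_date)
        VALUES (src.test_id, src.test_result, GETDATE())
    WHEN NOT MATCHED BY SOURCE THEN
        UPDATE SET tgt.is_deleted    = 1,
                   tgt.modified_date = GETDATE();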
- Bad natural key joins on CDC procs
If the natural key is not correctly identified for the join, rows can be improperly updated, deleted, or inserted during the CDC merge process.
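A quick way to spot this is to check whether the chosen natural key is actually unique in the source before trusting the merge (table and key names are hypothetical):

    -- Illustrative only: duplicate natural keys in the source will cause
    -- rows to be matched and updated incorrectly during the CDC merge.
    SELECT test_id, COUNT(*) AS row_count
    FROM datamarq_staging.medicat.lab_results
    GROUP BY test_id
    HAVING COUNT(*) > 1;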