Quality Improvement, Program Evaluation, Education Outreach, Design Science, and Evaluation Overview

There are two other kinds of scholarship that fulfill the expectations of an R2 Scholarship project: Quality Improvement and Program Evaluation, and Education Outreach, Design Science, and Evaluation. According to the Fraser Health Authority, the purpose of a quality improvement scholarship endeavour is to improve internal processes, practices, costs, or productivity for a specific intervention (i.e., to determine how an intervention affected a particular participant group in a particular setting). The purpose of a program evaluation is to inform decisions, identify improvements (i.e., formative evaluation), and provide information about the success of programs (i.e., summative evaluation) according to predefined goals and objectives. Planning the evaluation may run concurrently with program planning.

The design of both kinds of projects is flexible (you can choose different methods for undertaking your QI or program evaluation project), and both, again according to Fraser Health, will ultimately have utility to decision-makers: program management, who use the findings to make improvements to the ‘practice’ being reviewed, to the benefit of current and future program participants.

Education Outreach, Design Science, and Evaluation is a broad umbrella descriptor of scholarship that, as summarized in contemporary reference sources such as Wikipedia, focuses on both the development and the performance of something (a website, an app, or a tool) with the intention of improving the functional performance of that something.

The Quality Improvement, Program Evaluation, Education Outreach, Design Science, and Evaluation project streams provide guidance to residents interested in designing or developing new tools as part of their resident project. A “tool” can be any of a number of types of artifacts produced by the design process. Some examples include:

  • A handout for patients
  • An educational resource for medical students/residents/allied health professionals
  • A new process or workflow
  • A design or prototype for an app or website
  • A product that promotes better hand hygiene
  • An educational health campaign

What we want to highlight for these types of projects is that a scholarly approach must be taken to understanding the problem, applying theories and methods as part of the design/development and evaluation processes.

The UBC Department of Family Practice encourages and supports residents who wish to pursue knowledge translation through the creative development of artifacts that address a clinical, health, or health-system problem.

Objective

The objective of a Quality Improvement, Program Evaluation, Education Outreach, and Design Science project is to contribute to Family Medicine by developing an understanding of a real-world problem and systematically creating a product or process that seeks to improve that problem in some measurable way.

Design science research (DSR) projects require abductive reasoning; that is, the generation of ideas. These ideas are then developed into tangible artifacts: tools, apps, processes, etc. However, such a project is not just building a tool; a DSR project requires clear scholarly foundations in the design process and the development of an evaluation plan.

The core of our program’s quality improvement efforts lies in its ability to evaluate multi-faceted components of the program systematically and critically. Residents may choose a topic of interest pertaining to our residency program, review current processes, develop an assessment plan, collect and analyze data, and discuss results and next steps.

  • Examples:
    • Assessing resident experience in using Field Notes as a learning tool.
    • Evaluating the relevance and efficacy of academic half days for residents.

Foundations for the Quality Improvement, Program Evaluation, Education Outreach, and Design Science Projects

To ensure scholarly work, we require all projects that are to be considered design science research projects to include the following aspects:

Clear Problem Definitions

What are you trying to address?

This can be stated as an objective for the work. In design research, objectives or more open-ended questions are more common than binary hypotheses. Problem definition will require some understanding, drawn from the literature, of the nature of the problem, its scope, etc.

NOTE: Proper problem definition itself is often a design challenge that can be taken up as a project. In this kind of project, the output would be a clearer description of the problem and some of the root causes discovered through the exploration process. Some possible topics to explore are: Needs assessment, environmental scan, and root cause analysis.

Example Methods:

  • Usability testing with think-aloud – appropriate when evaluating an interactive tool or prototype (e.g., an app or website) with representative users
  • Usability Inspection
  • Cognitive Walkthroughs
  • Paper Prototyping
  • Program design/evaluation frameworks, e.g., US CDC, PHAC, NCCHPP – suited to program evaluation projects
  • Logic Model
  • Kirkpatrick Model – to design evaluation of educational programs
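
A logic model, one of the methods listed above, lays out a program’s inputs, activities, outputs, and outcomes so that each can be tied to a measurable indicator. A minimal sketch of that structure in Python (all program elements below are invented placeholders, not drawn from any actual program):

```python
# A logic model expressed as a simple data structure. All entries are
# invented placeholders; a real model would come from your own evaluation plan.
logic_model = {
    "inputs":     ["faculty time", "teaching space", "curriculum materials"],
    "activities": ["weekly academic half-day sessions", "field note feedback"],
    "outputs":    ["sessions delivered", "field notes completed per resident"],
    "outcomes": {
        "short_term": ["improved knowledge of session topics"],
        "long_term":  ["improved clinical competency at program completion"],
    },
}

# Walking the model column by column keeps the evaluation plan explicit:
# every element listed here should have a measurable indicator attached.
for column, elements in logic_model.items():
    print(f"{column}: {elements}")
```

Writing the model down in this explicit form makes it easy to check, column by column, that every planned activity connects to an output and an outcome you intend to measure.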

A methodological foundation is critical to a scholarly project; that is, the project is not just about building a “thing,” but about taking a measured approach, using evidence, and synthesizing your learnings in the “thing” you build.

Output:

Typically, there are three main outputs from a design project:

  • The artifact that you developed
  • A report
  • Presentation / Demo

Project Artifact:

Design artifacts will range considerably, depending on interests, scope, etc. They can be paper-based, online tools, prototype app designs, etc. They should be sufficiently “tangible” to translate the work into reality, but they do not have to be fully functional systems. For example, a smaller DSR project could result in a paper prototype of an app (or a limited interactive mock-up), while a larger team could develop a clickable prototype of the app as the project artifact.

Handouts, websites, videos, and podcasts are all examples of artifacts.

Eligible projects should meet all the requirements for a Resident Scholar Project. The presentation is a good opportunity to demonstrate your artifact.

1. Program Evaluation

Program and site evaluation must take into consideration the ethnogeographic location of the site, including the diversity of its patient population as well as the social location and lived experience of its trainees, supervisors, faculty, and administrators. Chosen methods and reporting will depend on these considerations. If methods and reporting are expected to expand past traditional program evaluation methodologies and written reporting, early consultation with the site Scholar Lead and the Scholarship Committee is required.

Manuscripts must be prepared in accordance with the “Uniform Requirements for Manuscripts Submitted to Biomedical Journals” available on the International Committee of Medical Journal Editors (ICMJE) website.

Data Request/Resident Access

Residents may request access to Family Practice resident data and/or request access to fellow family practice residents (e.g., distribute surveys, conduct interviews) for their resident scholar project.

Please review the guidelines under ‘Data request/Resident Access’ and submit the completed Data request/Resident access application form to the Data Concierge Committee. For all other inquiries related to Data request/Resident access, please contact the Data Concierge Committee at: fmprpostgrad.research@familymed.ubc.ca

2. Continuous Quality Improvement (CQI)

In the healthcare system, there are always opportunities to optimize, streamline, develop, and test processes. Quality improvement (QI) is a proven, effective way to improve care for patients, residents, and clients, and to improve practice for staff[1]. QI utilizes structured improvement methods and models, such as the Model for Improvement and a testing model called Plan-Do-Study-Act[2] (PDSA; see Figure 1 below). The principles of CQI involve using PDSA cycles to continuously improve an aspect of practice.

  • Residents may partake in CQI projects as a continuation of the R1 project or as a new project, gathering data after implementing a quality improvement intervention and measuring whether that intervention has had its desired effect in practice.
  • Examples:
    • e.g., surveying patient experience before and after implementing increased appointment time allocation in a doctor’s office
    • e.g., comparing the number of healthcare-associated infections before and after introducing hand hygiene posters in a hospital
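
Before-and-after comparisons like those above lend themselves to simple statistical summaries. As an illustrative sketch only (the `two_proportion_z` helper and all counts below are invented for this example, not real data or a program requirement), a two-proportion z-test can compare event rates before and after an intervention:

```python
import math

# Illustrative sketch: a two-proportion z-test comparing event rates before
# and after an intervention. All counts are hypothetical.

def two_proportion_z(events_before, n_before, events_after, n_after):
    """Return (z, two-sided p-value) for the difference in two proportions."""
    p1 = events_before / n_before
    p2 = events_after / n_after
    pooled = (events_before + events_after) / (n_before + n_after)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_before + 1 / n_after))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Hypothetical counts: 30 infections in 400 admissions before the hand
# hygiene posters, 15 in 410 admissions after.
z, p = two_proportion_z(30, 400, 15, 410)
print(f"rate before={30/400:.3f}, rate after={15/410:.3f}, z={z:.2f}, p={p:.3f}")
```

For a real project, consult a biostatistician or use an established statistics package; this sketch only shows the shape of a before/after comparison.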

The goal of CQI is to learn how to identify and address safety, efficiency and/or quality concerns in practice.

Eligible projects should meet all the requirements for a Resident Scholar Project in addition to the following criteria.

  1. Evidence-based: A thorough health literature search must be undertaken to find evidence for and against the performance standard and the improvement plan. The evidence should be critically assessed for epistemic biases and contextualized to avoid perpetuating bias.  This evidence should be summarized in the written report.
  2. Evaluative: An evaluation of the Quality Improvement Project must be included. This means evaluating the practice before the improvement is implemented and repeating the evaluation at a reasonable time afterwards with a summary and comparison of the results.

Data Request/Resident Access

Residents may request access to Family Practice resident data or request access to fellow family practice residents (e.g., distribute surveys, conduct interviews) for their resident scholar project.

Please review the guidelines under ‘Data request/Resident Access’ and submit the completed Data request/Resident access application form to the Data Concierge Committee. For all other inquiries related to Data request/Resident access, please contact the Data Concierge Committee at: fmprpostgrad.research@familymed.ubc.ca

3. Validation of a Tool

When referring to measurement, validity is the degree to which a measurement measures what it purports to measure.[1] Validation is the process of establishing that a method is sound.

In validation studies, different measurements are made of the same variable (this might involve assessing a clinical condition, knowledge, risk, etc.), and the level of agreement between these measurements is determined.[2] It is important to note that one of the measurements is believed to be a “gold standard”: a method of measurement that reveals the true value or outcome. The other measurement (the “non-gold standard” measure) usually has some desirable aspect, such as being less invasive, cheaper, easier to administer, or more readily available, and the aim is to determine whether this other measure can determine the truth. In other words, the key question is: does the simpler/cheaper/easier-to-administer test actually measure what we want to measure?
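
Agreement with a gold standard is often summarized with sensitivity and specificity. As a hedged sketch (the `validity_measures` helper and the paired results are invented for illustration; 1 = positive, 0 = negative), the calculation looks like this:

```python
# Illustrative sketch: comparing a new, simpler test against a gold standard.
# The paired results below are invented for illustration only.

def validity_measures(new_test, gold_standard):
    """Return (sensitivity, specificity) of new_test against the gold standard."""
    pairs = list(zip(new_test, gold_standard))
    tp = sum(1 for n, g in pairs if n == 1 and g == 1)  # true positives
    tn = sum(1 for n, g in pairs if n == 0 and g == 0)  # true negatives
    fp = sum(1 for n, g in pairs if n == 1 and g == 0)  # false positives
    fn = sum(1 for n, g in pairs if n == 0 and g == 1)  # false negatives
    return tp / (tp + fn), tn / (tn + fp)

new_test      = [1, 1, 0, 1, 0, 0, 1, 0, 1, 0]
gold_standard = [1, 1, 0, 0, 0, 0, 1, 1, 1, 0]
sens, spec = validity_measures(new_test, gold_standard)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```

Depending on the measurement scale, other agreement statistics (e.g., Cohen’s kappa for categorical ratings) may be more appropriate; a real validation study should choose its statistics with methodological guidance.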

Examples of publications that have documented the validation of an assessment tool are presented below.

Olsen JR, Gallacher J, Piguet V, Francis NA. Development and validation of the Molluscum Contagiosum Diagnostic Tool for Parents: diagnostic accuracy study in primary care. Br J Gen Pract 2014; 64(625): e471-6.

Matlow AG, Cronin CMG, Flintoft V, Nijssen-Jordan C, Fleming M, Brady-Fryer B, et al. Description of the development and validation of the Canadian Paediatric Trigger Tool. BMJ Qual Saf 2011; 20(5): 416–423.

It is important to note that many, if not most, clinical tools are developed and validated using a dominant population as the participant population (i.e., White, cis-hetero male, middle-aged, middle-class, English-speaking). A critical assessment of the development of the tool being validated, and of its appropriateness for the population within which it is being validated, must be completed to avoid selection bias and the possible perpetuation of incorrectly identified population deficits. If a pre-test bias that may impact validation is identified in the development of the tool, early consultation with the site Scholar Lead and Scholarship Committee is recommended. Likewise, partnership with the participant population through combined qualitative methods is highly recommended.