Best practices in audit design

Best practices to avoid common audit design pitfalls

Written by Dan Bishop

It's great to make auditing easier with software. But not all audits are created equal, and it's easy to digitize poor designs that waste time. Here are some best practices we've identified over the years. Use them to tackle your next improvement project, and skip (re)learning some painful but common lessons in audit design!

Make sure you (and your team) have a clear goal

Picture this. You've put hours of testing, designing, and researching into creating an audit. Excited to feel some progress, you've even started collecting some initial data. At your next monthly meeting with a team of stakeholders, you share your early results and get the frustrating feedback:

"Sorry, but this isn't really what we need."

Yikes. Too detailed, too general, unactionable information, questions that people don't buy into — these are some of the most common pitfalls we've seen customers run into.

Here's a simple set of upfront questions that can get everyone on the same page.

  • What are you trying to accomplish through process improvements?

  • What do you need to understand about your process?

  • How is data collection going to happen?

  • How will the data be used?

Go out and observe

Some people love to work with patients. Others love using technology and tools. It takes all types to deliver great healthcare. However, a common pitfall is for audit designers to be just far enough away from the frontline work to miss important details. You see, humans are great at confidently remembering details we don't actually know. And it's these details that make the difference between rolling out an audit that fits with how people work and a design that sends auditors roaming around the unit for 10 minutes to answer every question. Take 30 minutes to observe your process of interest and sketch out how it happens today. You'll uncover considerations about workflow, timing, and question format that, if unnoticed, can lead to ineffective audit designs, unhelpful data, and lost buy-in from auditors.

Design & iterate for buy-in and effectiveness

"No man is an island" and neither is an improvement project. Your team can help you arrive at more useful designs, eliminate timewasting errors, and buy into process improvement if you engage them throughout the design process. The key is to shift your framing from "I need to design the right audit" to "the team needs arrive at an effective audit they're bought into using."

  • Start modestly with your first design

  • Involve key stakeholders in a review

  • Ask for help ("I'm sure this design needs work. I'll need your input to get there.")

  • Use Draft Mode to enable easy test data collection by key stakeholders

  • Use Enable Feedback to allow auditors to give you input on design issues

  • Schedule a reminder to solicit feedback in the future

Common Audit Styles

Here are some of the most common styles of process measurement.

Rounding

Audits designed to be part of regular rounding processes, typically conducted by managers or other leaders. Qualaris audits can be a helpful tool for structuring or guiding the rounding experience. However, buy-in and workflow considerations are particularly high. So, start small and pay attention to whether all of the desired information is easily accessible while rounding. Use audit note fields at the early stages: you can capture non-PHI comments that may need to become structured questions. These comment fields also avoid the frustrating situation of busy managers feeling like they can't record important information.

Standard work

Standard work is a common countermeasure implemented by organizations adopting lean principles. Bringing higher levels of standardization to key processes can make it easier to identify issues and to propose and test improvements. While lean experts beyond Qualaris will be your best source for in-depth education on the concept, we'd like to highlight some common practices adopted for Qualaris audits.

Often, teams have a goal to understand questions beyond a simple "Yes" or "No" for process components such as sequence, timing, and barriers to completing a process as specified. Choice fields placed at the bottom of an audit can be a helpful way to measure issues (e.g. "Sequence?" with answers "As specified" or "Not as specified"). And the note field can be used to capture additional non-PHI details about the deviations observed.

Teams also often have a goal to standardize the observation process itself. The audit form has a description section to capture helpful details about how the audit should be used. Checklist subsections can also be added as visual cues to ensure correct audit usage.

Bundles

Bundles are used to define a group of best practices related to improving a specific clinical outcome. Typically, these bundles tend to represent a few key processes or interventions rather than a singular process in greater detail. Often, bundles are informed by evidence-based recommendations from clinical experts as well as regulatory requirements. The most common pitfall is failing to adequately adapt these requirements from "outsiders" into something that can be effectively used.

For example, a recommended bundle can include clinical practices as well as documentation practices. When everything is added to a single audit design, the end result can be an unwieldy workflow for auditors.

Another common pitfall is overly lengthy audits. If a comprehensive list of bundle elements takes too long to audit, consider prioritizing the top 5-8 elements you'd like to improve first.

Lastly, adopted recommendations often come with their own vocabulary and jargon. Match the internal language of your organization, and consider adding help text for clarification if needed.

Interview surveys

Audits can be designed to gather survey responses directly from staff, patients, or family. An important limitation of Qualaris' audit tool is that it's designed for data collection by auditors. This design best fits workflows where a surveyor is asking questions of another person; it doesn't fit self-reported surveys.

Translating a list of survey questions into an interview can be challenging for auditors. There's a natural flow to conversation that is sometimes difficult to recreate with an audit. As always, testing is a great technique for reducing issues. Many customers have designed a standardized introduction that helps set up the survey, and this can be documented in the audit description. If questions don't easily follow a standardized order, checklist subsections can help auditors move quickly between sections as a conversation fluidly progresses.

Advanced Audit Design Practices

Staff names and room numbers

There are two common ways to note staff names in audit designs. If your list is a manageable length and doesn't change often, consider using a choice field. This approach allows you to easily view your data by name. If your list is very long or changes often, consider using a short note field. This simple write-in allows flexible name recording as needed without maintaining staff lists in Qualaris. However, be mindful of potential issues with misspellings.

The same approaches apply to room numbers. If you're leaning toward a short note field, recognize that audits also support an integer field for collecting room numbers.

Real-time coaching and follow-up

Some auditing projects want to support a workflow for immediate feedback from the auditor. A choice field placed at the bottom of your audit can prompt this action and document what happened (e.g. "Real-time coaching provided?").

Prevalence counts and Likert Scales

Some auditing projects want to collect information about the prevalence of patients by type or use of interventions. For example, you might want to learn how many audited patients are using a sitter, but you don't want to include this as part of your calculated compliance score. A choice field will allow you to add these prevalence questions. Similarly, Likert Scales are a popular way to collect feedback during interview survey-style audits that can be supported with a choice field. For analysis, use Explore or Dashboards to look at the counts rather than compliance by choice value.

Conditional Form Logic

Sometimes not all questions should be answered, depending on the situation an auditor encounters. Perhaps a certain procedure was not performed, so its question should be skipped; or you want auditors to write in a free-text answer only when a question was answered "other". To solve these issues, forms can be configured to show or hide a field, or show or hide specific options, based on the selected answer to another question. This opens the door for all kinds of powerful forms when designing Audit Forms.
