
Cancer Care Ontario

Designing a tool for medical professionals to create synoptic reporting checklists

Timeline September 2019 - April 2020
Role UX Designer + Front-End Developer
Responsibilities UX Research
Prototyping
UI Design
Front-End Development
Team Adham Zaki
Maxim Romanoff
Daniel Yong

How it started

In my final year at the University of Toronto, I worked in a team of four to complete a Capstone Design project for Cancer Care Ontario, the Ontario Government's principal advisor on cancer care. Over 8 months, my team worked closely with stakeholders at CCO to design and develop an MVP for a new tool to accelerate the adoption of synoptic reporting across the province.

Medical reporting lacks consistency

To treat patients more effectively, each hospital establishes its own set of standards for reporting the results of medical treatments. Although this works for one-time treatments, it becomes problematic for cancer and other chronic diseases, which take years and many procedures to treat. The problem arises when a patient's medical reports are transferred between hospitals. These reports are often recorded in various formats (paper, electronic, and voice notes) and lack consistency in the breadth and depth of the data captured. One way to eliminate this inconsistency between hospitals is synoptic reporting.

What is synoptic reporting?

Synoptic reporting is a documentation method that uses structured checklists to help medical practitioners and clinicians create more consistent and complete medical reports. Instead of each hospital standardizing its own reporting methods, all hospitals agree on a single reporting format. Shared reporting standards significantly simplify the transfer of knowledge between hospitals and ensure that doctors are well informed about their patients' medical history. An example of a structured checklist is shown below.

Structured Checklist

synoptic reporting checklist

Checklists take too long to create

You're probably thinking that synoptic reporting is the obvious answer to the problem, and you're right. The real problem is that these structured checklists take too long to create and publish. In Ontario, each checklist is the result of the collaboration of up to 30 clinicians from all across the province. This collaboration process requires the team of clinicians to discuss and vote on which questions and data elements need to be included in each checklist, which typically consists of more than 100 questions. This process of collaborating, voting, and publishing currently takes an entire year for each structured checklist.

Making the checklist development process easier

For our capstone project, we needed to design a solution that could reduce the time required to produce these structured checklists and accelerate the transition to synoptic reporting across hospitals in Ontario.

Why is the checklist development process so long?

Before we could solve the problem, we needed to understand why each structured checklist took an entire year to create and publish. To do this, we interviewed the people who were involved in the report development process. From our initial meetings, we learned that many people are involved: one Functional Lead, 1-2 representatives from Cancer Care Ontario, and 10-20 clinicians.

synoptic report development team

The Functional Lead

The Functional Lead is responsible for spearheading the checklist development process by creating the checklist, collecting votes from clinicians, and manually aggregating all these results to determine which questions and data elements to keep.

Persona - Functional Lead

functional lead persona

Clinicians

The Clinicians (10-20 people) are responsible for voting on which questions and data elements to keep in the checklist. Clinicians often choose different formats to submit these votes: email, Word, Excel, or PDF. The inconsistency of formats makes it difficult for the Functional Lead to aggregate these results.

Persona - Clinician

clinician persona

CCO Representative

The CCO representatives oversee the entire process and are responsible for publishing the final checklist. This requires them to code the checklist submitted by the Functional Lead in XML and share it with hospitals across Ontario.

Persona - CCO Representative

CCO representative persona

The current process

Once we understood the roles and responsibilities of everyone involved, we were able to map out a current state journey map of the checklist development process. The end-to-end process is divided into three distinct stages, shown in the journey map below.

The scope of our project only involved the Checklist Creation and Checklist Publication stages.

current state journey map

The problems with the current process

cartoon

Cluttered sources of information

The Functional Lead constantly transfers data between Microsoft Excel and Microsoft Word to send, receive, and aggregate results.

cartoon

Manual data aggregation

Clinicians often send their votes in different formats, and the Functional Lead has to manually transfer each response to Microsoft Excel and aggregate the results.

cartoon

Manual conversion to XML

Each checklist usually contains more than 100 questions once it's finalized, all of which must be manually converted to XML by CCO representatives before the checklist can be published.

The ideal state

The pain points identified in the journey map helped us create an ideal state journey for our users, which guided our brainstorming and design efforts later on. Ideally, we wanted to solve all the pain points:

ideal state journey map

Brainstorming

Once we had concluded our interviews, our team had a solid understanding of the problem to be solved and of our users' frustrations and needs. To kick off our brainstorming, we ran a Crazy 8's exercise to generate different ideas.

My sketches

sketch sketch

The first prototype

After generating some ideas, we put together a low-fidelity prototype to conduct initial testing with the Functional Lead and some clinicians. The first iteration of our solution was a unified tool that could be used by both the Functional Lead and the Clinicians.

Functional Lead view

The role of the Functional Lead is to create questions, share them with clinicians, collect the results, and aggregate them. For this reason, we designed a solution that would let them do all of this work in one place, without juggling email, Word, Excel, and whatever other tools they were using. By eliminating most of the manual work currently done by the Functional Lead, we could speed up the data collection and analysis processes.

Edit Form

The "Edit Form" page is a space where the Functional Lead can create the questions for the synoptic report. We followed a similar design pattern to popular form builders such as Google Forms and Typeform to make it more familiar to the user. The Functional Lead would also have the ability to share these questions with the clinicians from the same page and download the form in XML format.

Functional Lead - Edit view

edit view

View Results

When clinicians have voted on the questions, the results are displayed under the "View Results" tab. This view shows every question from the checklist along with its approval percentage (i.e. how many clinicians voted to keep the question). These percentages are colour-coded: any question that received less than 60% approval is marked in red so it can be easily spotted while scanning the page. These contentious questions are usually discussed in the monthly meetings between the Functional Lead, Clinicians, and CCO representatives.

Functional Lead - Results view

results view

Clinician view

When the Functional Lead uses the share functionality to send the form to Clinicians, the Clinicians need to vote on which questions they think should be kept. For each question, clinicians indicate its level of clarity and relevance, as well as an overall vote on whether it should be mandatory, optional, or excluded from the checklist. Once clinicians have individually completed their voting, they submit their answers, which are then automatically aggregated and shown under the "View Results" tab in the Functional Lead view.

Clinician - Voting view

voting view

Getting feedback from users

The participants

With our first prototype in hand, we started testing with users to validate and even challenge our initial designs and assumptions. Due to clinicians' busy schedules and limited availability, we were only able to test with one Functional Lead and four clinicians. To ensure the consistency of our results, we created a script to follow during each testing session.

The results

Although we had a small number of users to test with, we were able to identify numerous issues with our initial prototype. We ranked these issues based on their criticality using the following criteria:

Summary of user testing results

usability testing results

Iterating on the design

For the next iteration of our design, we wanted to solve all the issues that we found in our first prototype, even the less critical ones since they would improve the overall experience.

Changes to the Functional Lead view

The Functional Lead view did not have any significant issues, but the "View Results" screen needed to display more detailed results, such as the Clarity and Relevance scores submitted by Clinicians, instead of only the overall approval rate.

Updated Results view with Clarity and Relevance scores

functional lead updated view

Changes to the Clinician view

For the clinician view, we were able to identify some critical shortcomings in our design, as well as a few minor changes needed to improve the overall experience.

Switching background colours

In our first design, the question on the left-hand side had a white background while the voting side on the right had a grey background. This prompted clinicians to attempt to answer the question instead of completing the voting section. To focus users' attention on the voting side, we switched the background colours: the question was given a grey background and the voting side a white background. The text colour for questions and answers was also lightened to clarify that they are not clickable.

First iteration

old voting view

Second iteration

new voting view

Adding detailed instructions

During testing, 3 out of 4 clinicians expressed frustration at the lack of written instructions. Instructions were added at the top of the page to guide users on what to do. Additional instructions were also provided to explain the scales for "Clarity" and "Relevance".

Instructions

instructions

Adding a progress bar and save functionality

In our initial design, we failed to account for the extreme case where clinicians have to vote on over 50 questions. During testing, we learned that clinicians wanted to save their progress when, for example, they are called away for an emergency and need to finish their work at another time. In the next iteration, we added a progress bar and a "Save" button that remain visible as users scroll through the questions.

Progress bar and "Save" button

save button and progress bar
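The progress calculation behind the bar can be sketched in a few lines. This is a minimal illustration, not the MVP's actual code; it assumes a question counts as complete once all three scores are selected or the question has been opted out of, and the field names are hypothetical.

```javascript
// Minimal sketch of the progress-bar calculation (illustrative field names).
// A question is complete when all three scores are set, or when it is opted out.
function progress(questions, responses) {
  const done = questions.filter((q) => {
    const r = responses[q.id];
    return r && (r.optOut || (r.clarity && r.relevance && r.overall));
  }).length;
  return Math.round((done / questions.length) * 100);
}
```

Recomputing this value on every input event keeps the bar in sync as clinicians work through the form.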

Adding an "Opt-out" option

In some cases, clinicians can't vote on a particular question if it's not within their area of expertise. For this reason, they needed an "Opt-out" option to skip a question that they couldn't vote on. When the checkbox is selected, the voting side (right) turns grey to show that it is inactive.

Opt-out

opt-out functionality

Final Design + MVP

After iterating on the design, it was time to develop our final deliverable: an MVP with the technical functionality built in. I put on my front-end development hat to turn our design into an MVP that could realistically simulate how our solution works. While I worked on the front-end, my teammate worked extensively on the back-end to support the sharing and collaboration functionality, as well as the data storage and processing our solution required. The screenshots and screen recordings below were taken from our fully coded MVP.

Functional Lead view

Editing the form

Once the Functional Lead creates a new form, they are directed to the "Edit Form" page where they can start creating questions. The Functional Lead can create text, single select, or multi-select questions by clicking on one of the three blue buttons.

Adding a question

adding a question

To edit a question, the Functional Lead can click on the pencil icon in the top right. This allows them to edit the question and the answers, or delete them entirely.

Editing a question

editing a question

The Functional Lead can also add child questions by clicking the circular '+' button that appears when hovering over a question. Like other questions, child questions can be text, single-select, or multi-select. For single-select and multi-select child questions, the Functional Lead can select a conditional answer to add logic jumps to the form.

Adding a child question

adding a child question
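One way to represent this kind of conditional logic is to nest child questions under their parent, each tagged with the answer that triggers it. The shape below is a hypothetical sketch, not the MVP's actual data model; the field names and the sample question are illustrative only.

```javascript
// Hypothetical data shape for a question with a conditional child question.
const question = {
  id: "q1",
  type: "single-select", // "text", "single-select", or "multi-select"
  text: "Was a biopsy performed?",
  answers: ["Yes", "No"],
  children: [
    {
      id: "q1a",
      type: "text",
      text: "Describe the biopsy site.",
      showWhen: "Yes", // the conditional answer that reveals this child
    },
  ],
};

// Return the child questions that should be shown for a selected answer.
function visibleChildren(question, selectedAnswer) {
  return (question.children || []).filter(
    (child) => child.showWhen === undefined || child.showWhen === selectedAnswer
  );
}
```

With this structure, the form renderer only needs to call `visibleChildren` whenever the parent's answer changes.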

Sharing the form

Upon clicking the "Share" button in the top navigation, a modal appears that allows the Functional Lead to enter the email addresses of the checklist's recipients. Typically, these are all the clinicians who need to vote on the questions. The available sharing options are shown below.

Sharing

sharing with others

Viewing the results

When Clinicians submit their votes, the Functional Lead can see the results by navigating to the "View Results" tab. The default view will show the aggregated votes of all Clinicians for the Clarity, Relevance, and Overall scores, as well as any comments that were left. Contentious items that have received less than a 60% approval rate on any of the scores are highlighted in orange. Similarly, items that have received more than a 60% approval rate are highlighted in green. The results can also be sorted from highest to lowest and vice versa.

Viewing and sorting results

full results view
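The aggregation behind this view is straightforward to sketch. The function below is illustrative rather than the MVP's actual code, and it assumes that "mandatory" and "optional" votes count toward keeping a question while "exclude" counts against it.

```javascript
// Illustrative aggregation of clinician votes for a single question.
// Assumed semantics: "mandatory" and "optional" count as approval, "exclude" does not.
function aggregateVotes(votes) {
  const keep = votes.filter((v) => v === "mandatory" || v === "optional").length;
  const approval = votes.length ? (keep / votes.length) * 100 : 0;
  return {
    approval: Math.round(approval),
    // Items below the 60% threshold are flagged as contentious (highlighted in orange).
    contentious: approval < 60,
  };
}
```

Running this per question yields exactly the two pieces of information the "View Results" tab needs: the percentage to display and whether to highlight the row.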

In addition to the aggregated results of all Clinicians, the Functional Lead can view individual submissions from each clinician by using the dropdown menu to select a specific participant. This will display the detailed submission of each clinician.

Viewing individual submissions

individual results view

Downloading the XML

When the Functional Lead is ready to export the form in XML format, they can do so by clicking the XML button in the top navigation. The resulting XML file follows the same standards used by medical systems across Ontario. This eliminates the need for the manual XML coding that was required by the CCO representatives.

Exporting the form to XML

downloading an xml file
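Serializing the finalized questions to XML can be sketched as below. The real export follows the schema used by medical systems across Ontario, which is not reproduced here; the element names are purely illustrative, and only the escaping of reserved XML characters carries over to any real implementation.

```javascript
// Escape the XML-reserved characters &, <, and > in text content.
function escapeXml(s) {
  return s.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
}

// Serialize an array of question objects into a simple XML document.
// Element names are illustrative, not the actual Ontario schema.
function toXml(questions) {
  const items = questions
    .map((q) => {
      const answers = (q.answers || [])
        .map((a) => `    <answer>${escapeXml(a)}</answer>`)
        .join("\n");
      return `  <question id="${q.id}" type="${q.type}">\n` +
        `    <text>${escapeXml(q.text)}</text>\n` +
        (answers ? answers + "\n" : "") +
        `  </question>`;
    })
    .join("\n");
  return `<checklist>\n${items}\n</checklist>`;
}
```

Because the export is generated from the same data the Functional Lead already edited, no separate hand-coding step is needed.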

Clinician view

When the Functional Lead shares a form, clinicians receive a unique link to access and vote on the questions. They can then vote on the Clarity, Relevance, and Overall scores for every question, and even leave comments. The progress bar updates as the user inputs their selections.

Voting on questions

voting view

What I learned

Clearly explaining the purpose of testing to users

In our first prototype, we used Lorem Ipsum text as dummy content for the questions that clinicians needed to vote on. This caused some confusion among our participants, so we replaced it with more realistic questions. Surprisingly, this also caused confusion, since clinicians began focusing more on the content of the questions than on our prototype. We realized that the problem was not the content we used, but our failure to clearly explain the purpose of the test. Simply saying "The questions included in the prototype are placeholders, so don't worry if they seem odd to you" would have helped us avoid the confusion entirely.

Becoming better at front-end development

Before working on this, my only experience with front-end development was coding my portfolio. This project was much more robust and complex from a technical perspective and required me to collaborate closely with my teammate who was working on the back-end. Through this experience, I gained a deep understanding of how to perform DOM manipulation using JavaScript and a general understanding of how data is stored and transferred between the front-end and back-end.

Aligning goals and expectations with your client

Unlike our usual course projects, where we're given a clear problem to solve and predetermined goals and expectations, the Capstone Design project required us to work with a real client to define the problem we were addressing and what was expected of us. We spent the first 3 months of the project meeting weekly with our client and other stakeholders to dive deep into the challenges of implementing synoptic reporting across Ontario. Spending significant time talking and listening to our client and stakeholders was extremely valuable: it helped us understand the problem through their perspective and experience, and allowed us to align quickly on the project's goals and expectations.
