My roles in all projects below: information architecture, graphic design, interaction design, animation, content editing, quality assurance, user testing, LMS administration, tech support. Please note: All modules have been edited from their original state for client confidentiality.
Upon successful completion of this online module, a nurse is re-certified for assisting with patients who have suffered a stroke. The module tests the nurse and tracks their results to comply with government-mandated regulations. Working with the subject matter experts on content, I designed, developed, and deployed this module using various tools, including Articulate Storyline and the Pathlore Learning Management System.
Module Feature Requirements:
- Certifies nurses by providing evidence of their completion
- Plays a large number of video clips
- Allows user interaction to assess patients on a scale devised by the National Institutes of Health
- Randomly chooses two patients from a pool of eighteen patients each time the module is opened
- Allows user to rate each patient on fifteen assessment items
- Requires users to pass with a minimum score of 85%
- Allows users to review their answers
With 18 patients, each containing 15 assessment items, this module requires 270 assessment slides. Each assessment slide contains a video clip, numerous specific labels, a text entry field, a submit button, a correct-answer prompt, a wrong-answer prompt, and a prompt reminding users to type an answer before clicking submit. To make things even more challenging, the module needed to work in Internet Explorer 6 and Internet Explorer 8, the only browsers the nursing staff have access to on campus.
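The randomization and scoring requirements above can be sketched in JavaScript, the scripting language Storyline triggers can execute. This is a hypothetical illustration, not the module's actual trigger code: the patient IDs, function names, and the rounding of the 85% pass mark to a whole number of items are my assumptions.

```javascript
// Hypothetical sketch of the module's randomization and scoring rules.
// Patient IDs and names below are illustrative, not from the real module.

// Pool of 18 patients (Groups A–C, Patients 1–6).
const patients = [];
for (const group of ["A", "B", "C"]) {
  for (let n = 1; n <= 6; n++) {
    patients.push(group + n); // e.g. "A1", "C6"
  }
}

// Draw two distinct patients uniformly at random
// (Fisher–Yates shuffle on a copy, then take the first two).
function pickTwo(pool) {
  const copy = pool.slice();
  for (let i = copy.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [copy[i], copy[j]] = [copy[j], copy[i]];
  }
  return copy.slice(0, 2);
}

// Two patients × 15 assessment items = 30 answers per attempt.
// Assuming the 85% minimum rounds up to a whole number of items,
// the user must get at least 26 of 30 correct.
const itemsPerPatient = 15;
const totalItems = 2 * itemsPerPatient;
const passThreshold = Math.ceil(totalItems * 0.85);

const [first, second] = pickTwo(patients);
console.log(first, second, passThreshold);
```

Shuffling a copy of the pool rather than drawing with replacement guarantees the two patients are always distinct.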
The first element to tackle was the video, which was provided by the National Institutes of Health as 18 individual patient videos. Each needed to be cut into 15 segments, one for each scale item. Luckily, when my team’s A/V coordinator converted the videos from their original source, she placed markers at each clip. I used those markers to cut the videos and exported each segment as an MP4.
I experimented with the size and delivery of the videos. Since the staff would be using older computers, most likely running IE6, the videos had to hit a sweet spot between fast loading and crisp quality. After some testing, I decided to insert the videos onto each slide as web objects rather than embedding them in the slides. I discovered that Storyline re-converts embedded videos on publish, regardless of format. I didn’t want the videos compressed a second time, nor did I want the module bloated with 270 embedded clips, so web objects were the next best thing. The method has definite drawbacks, but it was the best strategy for the situation at hand.
Because the videos were web objects, they needed to be wrapped in a player and hosted on a server from which the module could load each one. I wrapped the videos in the JW Player and set each in a webpage that contained nothing but the video on a black background. These webpages were linked to the module as web objects. When the user lands on a slide, the video plays automatically inside the slide, quickly and clearly. The only caveat is that the webpages were not responsive, so if the user’s screen is too small, the video requires scrolling to be seen completely. I felt it was a fair trade-off, and I tested this method extensively on monitors the same size as those most staff use on campus.
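A wrapper page of this kind might look like the fragment below. This is a sketch, not the production page: the script path, file name, and player dimensions are placeholders, and I am assuming the standard JW Player `setup()` call of that era.

```html
<!-- Hypothetical wrapper page: one clip in a JW Player instance
     on a black background. Paths and dimensions are illustrative. -->
<html>
<head>
  <script src="jwplayer.js"></script>
  <style>body { margin: 0; background: #000; }</style>
</head>
<body>
  <div id="player"></div>
  <script>
    jwplayer("player").setup({
      file: "patient-a1-item01.mp4", /* placeholder filename */
      width: 640,
      height: 360,
      autostart: true /* plays as soon as the slide loads the web object */
    });
  </script>
</body>
</html>
```

Keeping the page to a bare player on a black background means the web object blends into the slide, so the clip appears to be part of the module itself.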
The design for the module went through a number of iterations. Previous ideas from other team members included using a tab system across the top of the page to simulate going page-by-page through the 15 scale items. Another team member created a mock-up in which the user is presented with a menu of buttons leading through each scale item. In both of these iterations, the user can move from item to item of their own accord.
It seemed more important to me that the user answer each question in order, so my design did away with self-directed navigation. I also did not want users to be tasked with keeping track of their progress, or to think that because they could flip through the scale items, there was an important reason to do so. They simply needed to watch the video and rate the patient, with as few clicks as possible.
Ultimately, to display the 15 scale items, I decided on a visual bookmarking list on the left-hand side to indicate where the user is on the scale. The list is only a visual cue and is not interactive. For the interactive element, text entry and submission, the user enters their assessment in a text field in the bottom right corner of the screen and clicks the submit button below it. Keeping these two elements close together makes the module easier to use: the interactive elements are easy to see and remember, and the cursor travels a shorter distance between them.
The video plays automatically when the user lands on the page, with the rating scale below it. In the center of the screen, the patient’s category is listed (Group A, B, or C, and Patient 1–6), a description defines the assessment, and at the bottom are instructions on how to complete it. All of this information differs from slide to slide, and the content is supplied by the NIH. See the image below for more details on the design.
The utmost attention was paid to ensuring that the staff could complete the stroke certification module as easily and quickly as possible. Staff are always busy while on shift and often have 5–7 modules assigned to them at any given time. Making the module easy to use, easy to learn from, and easy to recall in situations involving actual patients was at the top of the list of requirements.
Gallery of Modules