Usability Evaluation of a Property Management System: An Example
By Galen Collins
Introduction
Rubin
(1994) points out that the test plan is the foundation for the entire
evaluation because it addresses the how, when, where, why, and what of the
usability test. Following is the test plan that was used for conducting the
usability test of a first-generation, Windows-based property management system
(PMS). The plan covered the following sections (Rubin, 1994):
· Purpose.
· Problem statements.
· User profile.
· Methodology.
· Task list.
· Test environment and
equipment requirements.
· Test monitor role.
· Evaluation measures.
· Test report, findings, and recommendations.
Purpose
The purpose
of the test was twofold:
1. To predict the expected performance of naive, novice, and competent/expert reservationists using the system to input and modify reservation data and to access reservation-related information.
2. To collect data on difficulties incurred and user attitudes, and to determine how to refine future product releases.
Problem Statements
The specific questions that needed to be answered were:
1. After a 40-minute training session, were users able to successfully complete reservation transactions and access reservation-related information in a timely manner?
2.
Did users find the program easy to use?
User Profile
A total of three participants were tested during April 2000. The
participants selected represented the types of employees who might be hired
into reservationist positions. According to Shneiderman (1980), to operate a
computer, a person needs two kinds of knowledge - a general knowledge of how a
computer system works (semantic knowledge) and the specific knowledge needed to
operate a particular system (syntactic knowledge). Hospitality employees have different levels of computer-related knowledge. Those who have very little or no syntactic or semantic knowledge are classified as naive users. Novice users may have some experience with personal computers or other PMSs, but they lack the syntactic knowledge of how a particular system works.
Finally, competent/expert users have both the syntactic and semantic
knowledge needed. For the purposes of
this evaluation, a participant was selected to represent each of the
aforementioned user classifications.
1. User 1. This naive user has recent and
limited Windows experience with no hospitality-related computer experience.
2.
User 2. This novice user has
experience with central reservation systems but none with the PMS.
3.
User 3. This competent/expert user
has extensive experience with the PMS.
Characteristic | Range | Frequency Distribution
Windows (GUI) experience | 6 months - 6 years | 67% greater than 5 years; 33% less than 6 months
Hospitality experience | 0 - 20 years | 33% 20 years; 33% 13 years; 33% no experience
Experience with PMS | 0 - 7 years | 67% have never used MSI; 33% have 7+ years of experience using MSI
Education level | High school - college | 100% high school diploma; 33% college degree; 67% some college experience
Age | 28 - 53 | 67% 28-45; 33% 50+
Learning style preference | Read/write - aural | 67% read/write; 33% aural
Sex | Male/female | 33% male

Figure 1. User profile of participants.
Methodology
The usability test was designed to gather usability data via direct observation (see Appendix C for the data collection instrument). After the performance test, participants were asked to complete a user satisfaction survey. The survey questions were derived primarily from the Questionnaire for User Interaction Satisfaction (QUIS), developed by Shneiderman and refined by Norman and Chin (see Appendix A). The usability test consisted of four steps:
1. Participant greeting. Each participant was greeted by the test monitor and made to feel comfortable and relaxed.
2. Orientation and training. Prior to the test, the participants received a short introduction explaining the test's purpose and objective. They also learned that a satisfaction survey would be administered after all of the tasks were completed or the time had expired. The participants were informed that the test monitor would be observing them. They were encouraged to think aloud while performing the tasks to help them focus and concentrate, as well as to express their feelings (e.g., confusion, frustration, delight). Each participant received up to 40 minutes of training (show, do, and tell) prior to the test.
3. Performance test. After the training was complete, participants were asked to complete seven tasks, each of which had an upper time limit. See Appendix B for the task list.
· Task 1. Make a reservation for a guest based on the following data: name: Galen Collins; arrival date: 10/04/00; departure date: 10/05/00; market/package: rack rate; adults: 2; child: 1; room type: 2 queen beds; reservation source: telephone; guarantee: American Express 372795231002000; information: need one bed board; business address: NAU Box 5638, Flagstaff, AZ 86004; arrival: 1:00 PM; flight number: 1148; phone number: 928-523-7333; e-mail address: Galen.Collins@nau.edu.
· Task 2. Modify the reservation by changing the arrival date to
· Task 3. Cancel the reservation.
· Task 4. Identify the percentage of rooms available for sale today
and tomorrow.
· Task 5. Identify room types that are sold out for the next seven
days, beginning with today.
· Task 6. Identify rooms undergoing maintenance on 05/05/00.
· Task 7. Identify expected arrivals for 05/12/00 - 05/13/00.
4. Participant debriefing. After all tasks
were completed or the time had expired, each participant was debriefed by the
test monitor. This included the completion of a questionnaire pertaining to the
participant's subjective perceptions of program usability and the participant's
overall comments about his or her performance.
Test Environment
The test took place in the general manager's office.
Evaluation Measures
The following evaluation measures were collected and calculated (a computational sketch follows this list):
1. The average time to complete each task.
2.
The percentage of participants who finished each task successfully.
3.
Error classification: to the degree possible, each error was classified and a
source of error indicated. Error
classifications were as follows:
· Observations and comments. The test monitor noted when a participant had difficulty, when an unusual behavior occurred, or when a cause of error became obvious.
· Noncritical error. The participant made a mistake but was able to recover and complete the task within the allotted time.
· Critical error. The participant made a mistake and was unable to recover and complete the task on time.
4. Participant rankings of usability. Some survey questions were open-ended rather than rankings.
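The first two measures are simple descriptive statistics over the recorded task times and completion codes. The sketch below is a minimal illustration of how they could be computed, assuming hypothetical observation records coded with the Appendix C scheme; no such analysis script was part of the original study.

```python
# Hedged illustration: computes the evaluation measures described above from
# hypothetical observation records. Codes follow Appendix C:
# A = completed on time, R = noncritical error (recovered),
# E = critical error, F = did not complete.
from statistics import mean, median

observations = [
    {"participant": 1, "task": 1, "minutes": 2.0, "code": "A"},
    {"participant": 2, "task": 1, "minutes": 5.3, "code": "R"},
    {"participant": 3, "task": 1, "minutes": 5.9, "code": "R"},
]

def task_summary(records, task):
    """Summarize one task: timing statistics, success rate, error tally."""
    rows = [r for r in records if r["task"] == task]
    times = [r["minutes"] for r in rows]
    completed = [r for r in rows if r["code"] in ("A", "R")]
    return {
        "average_min": round(mean(times), 2),
        "median_min": round(median(times), 2),
        "range_min": (min(times), max(times)),
        "pct_successful": 100.0 * len(completed) / len(rows),
        "noncritical_errors": sum(r["code"] == "R" for r in rows),
    }

print(task_summary(observations, task=1))
```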
Test Monitor
The person
acting as the test monitor sat in the office while each participant performed
the tasks. This person also provided the training preceding the performance
test. The monitor did not help the participants unless a question about the
test procedures arose.
Test Report
In this final section, the results, findings, and recommendations are discussed, beginning with a summary of the performance and survey data.
Task Timings. Figure 2 below shows, for each task, the average, median, and range of completion times across all participants, along with the target time.
Task | Average Time to Complete (min.) | Median Time to Complete (min.) | Range (min.) | Target (min.)
1. Make a reservation | 4.4 | 3.95 | 2.0 - 5.9 | 6.0
2. Modify a reservation | 1.25 | 1.75 | 0.75 - 1.90 | 2.0
3. Cancel a reservation | 0.57 | 0.60 | 0.33 - 0.87 | 1.0
4. Identify percentage of rooms sold | 0.94 | 1.04 | 0.58 - 1.50 | 2.0
5. Identify rooms sold over next seven days | 0.65 | 0.685 | 0.62 - 0.75 | 1.0
6. Identify rooms undergoing maintenance | 1.0 | 1.0 | 1.0 - 1.0 | 1.0
7. Identify expected arrivals | 1.19 | 0.91 | 0.32 - 1.50 | 2.0

Figure 2. Task timings.
Error Types. Figure 3 below lists the noncritical errors committed while performing each task, along with a brief description of each. No critical errors were encountered.
Task | Noncritical Errors
1 | 1. Initially placed the phone number in the business address (software error); had to re-input it in the guest address field, although it was a business-related reservation. 2. Initially input incorrect credit card information. 3. Initially put airline information in the wrong field. 4. Initially clicked on the wrong field (room type rather than the arrow) to select the room type; took several attempts to identify the correct item. 5. Initially selected the wrong year for the reservation by clicking on ">>" rather than ">".
2 | 1. Initially selected the wrong menu choice. 2. Initially clicked on "search" rather than "OK" to pull up the reservation. 3. Initially entered the arrival date in the wrong format. 4. When using the calendar, highlighted the wrong departure date; for a two-night stay, three dates must be highlighted, but the user highlighted two.
3 | No errors encountered.
4 | Initially selected the wrong menu choice twice.
5 | No errors encountered.
6 | Two users initially selected the start date to identify rooms undergoing maintenance for a particular date; this feature does not work (software error).
7 | Initially entered an incorrect ending date for the guest arrival list.

Figure 3. Listing of noncritical errors.
Observations of Users/User Comments. Below is a discussion of the comments and observations recorded by the test monitor.
When making a reservation, the tabbing order was inconsistent. For example, the rate field (market/package) is the fourth field listed horizontally but the second field requiring information, and the default rate value is rarely used. Another example is the zip code field, which appears after the state and city fields; if the zip code were entered first, it would automatically populate those fields. Users also experienced some difficulty in keeping track of the blinking cursor.
The system was slow in responding to user requests. The drop-down boxes were helpful. One user felt that the credit card number field should be split into separate fields based on the grouping of numbers to reduce errors. One user felt that making a reservation could be expedited by incorporating the rate quoter screen (an option found in the main menu) into the "making a reservation" selection, because its focus is immediately on availability and the total cost of the stay. Furthermore, once the data is entered into this screen, it then prefills the reservation screen when the "book reservation" option is selected.
The test monitor looked at the "rate quoter" screen and found that it did not calculate the estimated cost of a total stay (software error). It was not clear what data had to be entered (mandatory data fields were not yellow, as in the reservation screen) or which of the screen's 10 buttons to click on to calculate availability and rate costs. Users occasionally were not sure what buttons to click on in other parts of the program, or whether a single or double click was required.
One user was confused by the field descriptions (e.g., mkt/pkg and source). She attempted to click on "help," which was not context-sensitive and, therefore, not useful.
Users commented on the use of color, particularly in the room availability tables, where red, purple, yellow, and blue text and numbers were set against a green background.
Survey results. Below is a summary of the survey results. Items were evaluated on a Likert rating scale ranging from 1 (lowest) to 9 (highest) to indicate the degree of user satisfaction.
User Interaction Satisfaction

Item Evaluated | Evaluation Rating Scale (1 - 9) | Avg. Rating
1. Screen text | hard to read - easy to read | 5.33
2. Use of color | distracting - helpful | 5.33
3. Tasks (menu choices/commands) | hard to locate - easy to locate | 7.67
4. Field descriptions (e.g., guest name) | confusing - clear | 8.67
5. Task descriptions (e.g., make a reservation) | confusing - clear | 8.67
6. Terminology | unfamiliar - appropriate | 8.33
7. Sequence of screens when executing a task | confusing - clear | 4.66
8. Tasks can be completed easily | never - always | 7.33
9. Computer keeps you informed of where you are and what you are doing | never - always | 4.33
10. Remembering what to enter into data fields | difficult - easy | 7.00
11. System speed | too slow - fast enough | 5.33
12. Screen layouts were helpful | never - always | 6.33
13. Amount of information presented on each screen | too little - adequate | 8.33
14. Amount of information presented on each screen | too much - adequate | 7.00
15. Similar information consistently placed | never - always | 6.67
16. Adequate feedback after completing a task or when an error is incurred | never - always | 8.67
17. You feel in control of the system | never - always | 6.00
18. Program navigation | difficult - easy | 7.33
19. Online help | unhelpful - helpful | 6.00
20. Overall reaction to system | difficult - easy | 6.33

* This checklist was based on information from several sources (Galitz, 1997; Rubin, 1994; Shneiderman, 1998).
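Each item's average is simply the mean of the three participants' 1-9 ratings. As a minimal sketch, the snippet below shows how the averages could be computed and low-scoring items flagged; the participant-level ratings are hypothetical, since only the item averages in the table above come from the study.

```python
# Hedged illustration: averages hypothetical per-participant Likert ratings
# and flags items scoring below the scale midpoint for redesign attention.
from statistics import mean

ratings = {
    # item -> the three participants' ratings (illustrative values only)
    "1. Screen text": [4, 5, 7],
    "7. Sequence of screens when executing a task": [3, 5, 6],
    "9. Keeps you informed of where you are": [3, 4, 6],
}

for item, scores in ratings.items():
    avg = mean(scores)
    flag = "  <- below midpoint, review" if avg < 5.0 else ""
    print(f"{item}: {avg:.2f}{flag}")
```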
Below
are the responses to the open-ended questions on the survey.
1. What part of the PMS program did you like best? Least?
· "Calendars were easy to access and select."
· "Graphs were colorful and gave relevant
information."
· "You can not check room availability from the
reservation screen."
· "Terminology is easy to understand."
· "Too many redundant commands."
· "Easy to make and modify a reservation."
2. Please describe your first reactions to it.
· "Color combinations hard on the eyes. The dark green,
red, and yellow difficult to view." Suggest pastel colors.
· "Font size could be larger."
· "Windows is great!"
· "At first, I was not sure if I could do it. After I got
started, it was fun."
3. What kinds of problems did you
experience using this product?
· "I got stuck when I checked dates that were not
available and then tried to check alternate dates."
· "Having an employee log on to the system at the
beginning of the shift requires additional terminals for tracking individual
performance. This poses a security risk."
· "Just trying to remember steps."
4. What product improvements would you recommend?
· "Consistent tabbing function. It was not
consistent."
· "Security and elimination of redundant fields."
· "Trying to click on room type in the reservation
screen."
5. Would you recommend this program to others?
· "It's OK but it would take a while to train
someone."
· "Yes."
· " Yes."
6. Did the organization of the screens match real-world tasks?
·"The rooms blocked screen was awkward to read. Some information
gathered does not flow smoothly."
· "No! The reservation process needs to be
reevaluated."
7. Is there anything else you would like to tell me about this program?
· " Phone numbers not accepted on business screen."
· "Computer response should be faster."
· "The fonts are too small."
· "The color scheme needs to be changed (green, on a
forest green is hard on the eyes).
· "The help screen needs to be expanded."
· "The screens are crowded, most screens have redundant
fields.
· "Enjoyed training and would like to work with this
program if I had the opportunity."
Findings and Recommendations
All of the users successfully completed the reservation tasks within the upper time limits, suggesting that any user with adequate training can use this program within a relatively short time frame. As denoted in the observations, comments, and survey results (average score of 6.33 for overall perceived ease of use), users generally were comfortable with the Windows-based PMS. The expert user stated that this system is much easier to use and learn than its predecessor, a DOS-based PMS. DOS-based PMSs are rapidly being replaced by Windows-based systems because the graphical user interface is more intuitive (Collins and Malik, 1999).
While the expert user was able to complete a reservation in two minutes, the other two participants took over five minutes to do so. Three minutes per reservation is typically the ideal time for achieving targeted booking confirmation rates while maintaining an acceptable labor cost per call. Because most hospitality employees will be naive or novice users, due to the industry's high turnover rate and shortage of skilled workers, performance targets could be achieved more quickly by providing an online tutorial and help function and by reengineering some of the screen designs, layouts, and tasks.
Online tutorial and help facility.
According to Shneiderman (1998), first-time users of a software package need an
interactive tutorial environment using the actual PMS. Shneiderman (1998) contends that this enables
new users to work alone at an individual pace, to practice the skills they need
to use the system in a non-threatening environment, and to eliminate the
distraction of shifting between the workstation and the instructional material.
While the system had an online help facility, it did not provide context-sensitive help. According to Galitz (1997), contextual help provides information within the context of the task being performed, or about the specific object being operated on. One study has shown that context-sensitive help improves the efficacy of online help facilities (Magers, 1983).
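For illustration, context-sensitive help ties the displayed text to whatever field currently has focus rather than opening one generic help page. The sketch below is an assumption about how such a mapping could look; the field names echo the mkt/pkg and source fields that confused one participant, and none of it reflects the vendor's actual implementation.

```python
# Hedged illustration of context-sensitive help: the text shown depends on
# the field in focus. Field names and help strings are illustrative only.
CONTEXT_HELP = {
    "mkt_pkg": "Market/package: the rate plan applied to the stay (e.g., rack rate).",
    "source": "Reservation source: how the booking arrived (e.g., telephone).",
    "room_type": "Room type: click the arrow to open the list, then select a type.",
}

def show_help(focused_field: str) -> str:
    """Return help for the field in focus, falling back to a general index."""
    return CONTEXT_HELP.get(focused_field, "See the general help index.")

print(show_help("mkt_pkg"))  # field-specific guidance
print(show_help("folio"))    # unknown field -> general help
```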
Tasks. The users pointed out that there is redundancy in task selections from the main menu (rate quoter, 7-day availability, or make reservation can each initiate the booking of a reservation) and within individual screens (there are three ways to select dates for identifying rooms that are undergoing maintenance). While there may be some value in providing more than one way of executing a task or subtask, it clutters the screen and may confuse the user. Although the users felt the amount of information presented within each screen was adequate (not too little, 8.33; not too much, 7.00), the test monitor observed them having difficulty in selecting command buttons (e.g., in the room blocking menu) in some of the screens. This was because of the large number of choices, the varying location of these buttons, and/or the presence of other screen information not needed to complete the task. Consequently, the execution of a specific task can be aided by limiting the number of screen commands, by using easily identifiable command buttons (e.g., enter a search date in the room block screen and click on OK to retrieve information), and by locating them in a place consistent across screens. This enables the user to more quickly spot the appropriate command or choice. It also prevents the user from becoming overwhelmed with too many choices. Information that is irrelevant to the task should also be eliminated. No more than 25 percent of the screen display should be covered with information (Danchak, 1976), a guideline the PMS has not completely achieved. Simplicity, clarity, and understandability are the desired outcomes in all screens (Galitz, 1989).
Tasks should be grouped according to their sequence, frequency, function, and importance of use (Galitz, 1997). According to the expert user, the rate quoter selection in the main menu is rarely used. Consequently, the number of main menu items could be reduced to five or six based on importance and listed from top to bottom in terms of usage. Furthermore, the sequence of screens/tasks, which received an average score of 4.66 for perceived clarity, could be reorganized to better match the real-world task of reservations processing. For example, it might make more sense to eliminate the rate quoter selection from the main menu and to include a scaled-down version as the first step in the task of "making a reservation."
Data fields and ordering. The order of the data fields does not match the sequence of actions in the reservation screen. Furthermore, some of the more frequently used data fields are located at the bottom of the screen, including a mandatory field (telephone number). According to Galitz (1997), frequently used fields should be located in the top half of the screen, starting with the upper left corner. Another measure to reduce keystrokes would be to list the zip code field first so that it automatically populates the city and state data fields.
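A minimal sketch of that keystroke-saving suggestion follows; the lookup table is hypothetical, and the PMS's actual data source for zip codes is unknown.

```python
# Hedged illustration: entering the zip code first lets the system prefill
# the city and state fields. The directory below is a stand-in; a real PMS
# would consult a postal database.
ZIP_DIRECTORY = {
    "86004": ("Flagstaff", "AZ"),  # the zip from the Task 1 business address
}

def prefill_city_state(zip_code: str) -> tuple[str, str]:
    """Return (city, state) for a known zip code, or blanks if unknown."""
    return ZIP_DIRECTORY.get(zip_code, ("", ""))

city, state = prefill_city_state("86004")
print(city, state)  # Flagstaff AZ -- two fields the clerk no longer types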
To reduce errors in entering credit card information, the "hold" data field could be separated into groups of numbers, because people are better at handling chunks of information. Incorporating separators also permits much easier visual checking of the data keyed within that field (Galitz, 1989).
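As a sketch of that chunking idea: the 4-6-5 grouping below matches the common American Express convention, but the group sizes a real implementation would use per card type are an assumption here, not something taken from the PMS.

```python
# Hedged illustration: render a card number in visually separated groups so
# keyed digits are easier to verify. Group sizes are illustrative.
def chunk_card_number(digits: str, groups: tuple[int, ...] = (4, 6, 5)) -> str:
    """Split a digit string into space-separated groups for display."""
    parts, start = [], 0
    for size in groups:
        parts.append(digits[start:start + size])
        start += size
    return " ".join(parts)

# The Amex number from Task 1, displayed in 4-6-5 groups:
print(chunk_card_number("372795231002000"))  # -> "3727 952310 02000"
```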
One user suggested that auto skip would also expedite data input. This would be useful, however, only if all screen fields are always completely filled; otherwise, manual tabbing typically results in keying that is faster and less error prone (Galitz, 1997). Using color to highlight the data field where the cursor is located, as suggested by one of the participants, may help the user to more quickly identify where to enter data.
In the rate quoter screen, for example, it is not obvious after entering data which button to click on or which data fields must be filled to execute the task. Coloring or highlighting mandatory fields, as in the reservation screen, and listing them in a logical order would greatly enhance the usability of this screen.
Most of the field descriptions are self-explanatory, as reflected in the high average score (8.67) for field descriptions, although the test monitor observed users struggling with several of them (e.g., Mkt/Pkg and Room Type). Renaming these fields or providing an online, context-sensitive help function would resolve this problem, especially for naive and novice users.
Color. The use of color (average score of 5.33) in some of the tables (e.g., red on green) made them difficult to read. Poor color combinations can impair the system's usability and lead to slower reading and visual fatigue. Furthermore, approximately eight percent of North American and European users have some degree of color deficiency in their vision; the most common deficiency is red-green blindness (Shneiderman, 1998). Some of the screens in the MSI PMS do not consider the needs of color-deficient users.
Font Size. The participants found the screen text (average score of 5.33) somewhat difficult to read because of the font size. However, the program did use the recommended size convention for text (10 point). Readability may have been negatively affected by a lack of visual lines. The eye should be guided vertically or horizontally through the screen with the use of white space and control alignments; this prevents unfettered wandering of the eye (Galitz, 1997).
Response time. The participants complained of the program's frequent sluggishness. When the system's response to a request exceeded 10 seconds, the participants unfamiliar with the program became slightly anxious, hindering their productivity. Most users prefer rapid interactions (Shneiderman, 1998).
References
Collins, G., & Malik, T. (1999). Hospitality information technology: Learning how to use it.
Danchak, M. M. (1976). CRT displays for power plants. Instrumentation Technology, 23(10), 27-32.
Galitz, W. O. (1989). Handbook of screen format design. QED Information Sciences, Inc.
Galitz, W. O. (1997). The essential guide to user interface design: An introduction to GUI design principles and techniques. John Wiley and Sons, Inc.
Magers, C. S. (1983). An experimental evaluation of online help for non-programmers. CHI '83 Conference Proceedings: Human Factors in Computing Systems.
Rubin, J. (1994). Handbook of usability testing: How to plan, design, and conduct effective tests.
Shneiderman, B. (1980). Software psychology.
Shneiderman, B. (1998). Designing the user interface. Addison-Wesley.
APPENDIX A: USER INTERACTION SATISFACTION

Item Evaluated | Evaluation Rating Scale (1 - 9) | NA
1. Screen text | hard to read - easy to read |
2. Use of color | distracting - helpful |
3. Tasks (menu choices/commands) | hard to locate - easy to locate |
4. Field descriptions (e.g., guest name) | confusing - clear |
5. Task descriptions (e.g., make a reservation) | confusing - clear |
6. Terminology | unfamiliar - appropriate |
7. Sequence of screens when executing a task | confusing - clear |
8. Tasks can be completed easily | never - always |
9. Computer keeps you informed of where you are and what you are doing | never - always |
10. Remembering what to enter into data fields | difficult - easy |
11. System speed | too slow - fast enough |
12. Screen layouts were helpful | never - always |
13. Amount of information presented on each screen | too little - adequate |
14. Amount of information presented on each screen | too much - adequate |
15. Similar information consistently placed | never - always |
16. Adequate feedback after completing a task or when an error is incurred | never - always |
17. You feel in control of the system | never - always |
18. Program navigation | difficult - easy |
19. Online help | unhelpful - helpful |
20. Overall reaction to system | difficult - easy |

* This checklist was based on information from several sources (Galitz, 1997; Rubin, 1994; Shneiderman, 1998).
1. What parts of the PMS program did you like best? Least?
2. Please describe your first reactions to it.
3. What kinds of problems did you experience using this product?
4. What product improvements would you recommend?
5. Would you recommend this program to others?
6. Did the organization of the screens match real-world tasks?
7. Is there anything else you would like to tell me about this program?
APPENDIX B: TASK LIST
(MTC = maximum time to complete; REQ = required action; SCC = successful completion criterion)

TASK 1
TASK DESCRIPTION: MAKE A RESERVATION (MTC = 6 minutes)
Subtask | Task Description | Task Detail
1 | Select reservation task. | REQ: Click on "make a reservation" from main menu. SCC: Accessed reservation screen.
2 | Enter reservation data. | REQ: Participant must enter the reservation data in the fields and click on "save." SCC: Generated confirmation ID.
3 | Go back to main menu. | REQ: Click on "close." SCC: Accessed main menu.
TASK 2
TASK DESCRIPTION: MODIFY RESERVATION (MTC = 2 minutes)

Subtask | Task Description | Task Detail
1 | Retrieve reservation. | REQ: Click on "retrieve" from main menu. SCC: Accessed "guest information" screen.
2 | Enter name and arrival date (Collins, Galen). | REQ: Participant must enter the name and date into the data fields and click on "search." SCC: Accessed Collins' reservation.
3 | Change arrival date. | REQ: Change the arrival date. SCC: Reservation modified.
4 | Go back to main menu. | REQ: Click on "close." SCC: Accessed main menu.
TASK 3
TASK DESCRIPTION: CANCEL RESERVATION (MTC = 1 minute)

Subtask | Task Description | Task Detail
1 | Retrieve reservation. | REQ: Click on "retrieve" from main menu. SCC: Accessed "guest information" screen.
2 | Enter name and arrival date (Collins, Galen). | REQ: Participant must enter the name and date into the data fields and click on "search." SCC: Accessed Collins' reservation.
3 | Cancel reservation. | REQ: Participant must click on "cancel" and answer "yes." SCC: Generated cancellation ID.
4 | Go back to main menu. | REQ: Click on "close." SCC: Accessed main menu.
TASK 4
TASK DESCRIPTION: IDENTIFY THE PERCENTAGE OF ROOMS AVAILABLE FOR SALE TODAY AND TOMORROW (MTC = 2 minutes)

Subtask | Task Description | Task Detail
1 | Access 1-day availability for today. | REQ: Click on "1-day availability" from main screen. SCC: Accessed "1-day availability" screen.
2 | Access 1-day availability % for today. | REQ: Click on "house view chart." SCC: Generated pie chart with availability statistics for today.
3 | Access 1-day availability % for tomorrow. | REQ: Click on "house view chart." SCC: Generated pie chart with availability statistics for tomorrow.
4 | Go back to main menu. | REQ: Click on "close." SCC: Accessed main menu.
TASK 5
TASK DESCRIPTION: IDENTIFY ROOM TYPES THAT ARE SOLD OUT FOR THE NEXT SEVEN DAYS, BEGINNING WITH TODAY (MTC = 1 minute)

Subtask | Task Description | Task Detail
1 | Access 7-day availability for today. | REQ: Click on "7-day availability" from main screen. SCC: Accessed "7-day availability" screen.
2 | Identify room types that are sold out for each day. | REQ: View the room availability table. SCC: Identified room types that are sold out for each day.
3 | Go back to main menu. | REQ: Click on "close." SCC: Accessed main menu.
TASK 6
TASK DESCRIPTION: IDENTIFY ROOMS UNDERGOING MAINTENANCE ON MAY 5TH, 2000 (MTC = 1 minute)

Subtask | Task Description | Task Detail
1 | Access reports. | REQ: Click on "reports," then "reservations," and then "room blocking." SCC: Accessed "room blocking at a glance" screen.
2 | Identify which rooms are undergoing maintenance beginning on May 5th. | REQ: Click on "May 5th" and scroll down to identify room numbers undergoing maintenance. SCC: Identified room numbers undergoing maintenance.
3 | Go back to main menu. | REQ: Click on "close." SCC: Accessed main menu.
TASK 7
TASK DESCRIPTION: IDENTIFY EXPECTED ARRIVALS FOR MAY 12TH AND 13TH, 2000 (MTC = 2 minutes)

Subtask | Task Description | Task Detail
1 | Access reports. | REQ: Click on "reports," then "reservations," and then "guest name list." SCC: Accessed "guest name list criteria" screen.
2 | Access reservation records for guests arriving on May 12th and 13th. | REQ: Change the date to 05/12/2000-05/13/2000 and click on "retrieve." SCC: Accessed reservation records.
3 | Go back to main menu. | REQ: Click on "close." SCC: Accessed main menu.
APPENDIX C: DATA COLLECTION FORM

Name: _____________   Date: _________   Time: _________   Page ____ of ______

Task # ___ | Elapsed Time | Observations/Comments/Codes*
 | Start Time: |
 | End Time: |
(The remainder of the form consists of blank rows for recording observations.)
*Codes:
A - Completed task successfully within time allotment.
F - Did not complete task.
R - Made a mistake but was able to recover from the error and successfully complete the task (noncritical error).
E - Made a mistake and was unable to recover from the error (critical error).