A pilot study to assess the usability of software for simulated office work


Internal Report (National Research Council of Canada. Institute for Research in Construction), 1995-09-01


For the publisher's version, please access the DOI link below.

https://doi.org/10.4224/20375326


A Pilot Study to Assess the Usability of Software for Simulated Office Work

IRC-IR-704

Scovil, C.Y.; Veitch, J.A.; Newsham, G.R.

Abstract

This pilot investigation tested the usability of custom-designed software for testing typing, proofreading, and reaction-time skills. The software will be used to determine the effects of lighting quality on office work performance in future studies. Six temporary office workers completed 36 ten-minute sessions of the three tasks during one 7.5-hour day. Both Proofreading Task and Typing Task used conditions of 3 screen colours x 2 font types x 2 font sizes. Conveyor Belt, the reaction-time test, used 5 symbol speeds x 3 symbol colours x 2 target frequencies x 2 times of day. Typing Task and Proofreading Task were familiar to the subjects, making the tasks too easy and the results less clearly related to font size and screen colour. The smaller font size and gray background made the most difficult typing conditions; small font and blue background were the most difficult for proofreading. Conveyor Belt was more sensitive to changes in other variables at high symbol speeds. Low target probability with blue symbols at the fastest speeds gave the poorest performance.


A Pilot Study to Assess the Usability of Software for Simulated Office Work

The Video Display Terminal (VDT) has become an integral part of most modern offices. Many jobs involve a significant amount of time working at a VDT. As a result, office design must include computer and VDT ergonomics as a component of creating pleasant and efficient workplaces. Lighting in the office is an important consideration because computer-based work is generally vertical whereas paper-based tasks are mostly horizontal. This makes avoiding glare and creating adequate illumination more difficult. The introduction of energy efficient lighting also poses new challenges in this regard in many workplaces.

This is a pilot study to examine the software that will be used in the "Experimental Investigations of Lighting Quality, Preferences, and Control Effects on Task Performance and Energy Efficiency" studies performed at the National Research Council of Canada (NRC). Recently, an Indoor Environment Research Facility was built at NRC's Institute for Research in Construction (IRC). It allows for lighting, acoustical, ventilation and indoor air quality research in a model office environment. The study hopes to quantify lighting quality and to measure to what extent poor lighting quality has a detrimental effect on the abilities of people to perform their work.

There are several factors involved in measuring lighting quality. One important element is looking at worker performance under various lighting conditions. To do this, custom software has been developed to measure three objective components of office work performance: typing, reaction time and proofreading. This pilot study was conducted to determine the usability of this software and the sensitivity of the tasks to measure performance differences in the population (English-speaking adult office workers) for which it is intended to be used.

The software includes three tasks to test each of the components of office work. These tasks provide the researcher with a quantitative description of the user's performance. They will be used to determine how the lighting quality affects workers' performance. It is necessary that the tasks be sensitive to differences in lighting quality. That is, the task must be at a difficulty level such that any changes in ambient conditions affect performance noticeably, avoiding insignificant results from tasks that are either too difficult or too easy. The purpose of this pilot study was to determine the appropriate settings for each of these tasks.

For the Typing Task, the operator re-types a model text, presented on screen, into a second window at the bottom of the screen, as shown in Figure 1. This task is similar to many typing proficiency tests used by employers of temporary office workers, and the model text can be presented on paper if desired. The program compares the texts at the end of each word, and the user must correct any mistakes before continuing. This error checking is one of the optional settings of Typing Task used in the pilot study. It was selected so that the time per paragraph is a measure of the time required for perfect performance. The software automatically stores the typed text to a file and also saves data pertaining to the speed of typing, the operator's error rate (number of backspace keys used), and other performance variables. Each program can optionally show performance feedback to participants during the session; this option was not used in the pilot study because feedback can improve performance over time (Davies & Parasuraman, 1982; Davies & Tune, 1970; Grandjean, 1982). The text presentation conditions (font, size, colour and contrast) can all be varied using the software. Several of these were selected to establish the conditions used in the pilot study to determine when the task is suitably sensitive to measure individual differences in performance.
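To make the recorded variables concrete, here is a minimal Python sketch of how Typing Speed (correct characters per second) and Error Rate (a count of correction keystrokes) might be derived from a logged keystroke stream. The original software was not written in Python; the event representation and names are illustrative assumptions, not the report's implementation.

    from dataclasses import dataclass

    @dataclass
    class KeyEvent:
        key: str      # e.g. "a", "BACKSPACE", "LEFT"
        time: float   # seconds since the session started

    CORRECTION_KEYS = {"BACKSPACE", "LEFT", "RIGHT", "UP", "DOWN"}

    def typing_metrics(events, final_text, model_text):
        """Return (typing speed in correct cps, error rate) for one session.

        Because the error checking forces the typed text to match the model,
        every character of the final text counts as correctly typed.
        """
        duration = events[-1].time - events[0].time if len(events) > 1 else 0.0
        correct = sum(1 for a, b in zip(final_text, model_text) if a == b)
        speed_cps = correct / duration if duration > 0 else 0.0
        error_rate = sum(1 for e in events if e.key in CORRECTION_KEYS)
        return speed_cps, error_rate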

The reaction time task, Conveyor Belt, is similar to a video game. Symbols (coloured geometric shapes) move horizontally across the computer screen on a "conveyor belt" to a marked window (see Figure 2). Certain symbols are designated as "targets". The task requires the user to press the space bar on the keyboard as quickly as possible after a target symbol enters the marked window. The software records reaction time, number of keystrokes, number of false positives (selecting a non-target), number of correct hits and number of missed targets. The presentation parameters (colour, number of targets and presentation speed) were systematically varied to ensure suitable statistical properties of the response distribution.

In the Proofreading Task, the operator compares as quickly and as accurately as possible two lists of random numbers presented side-by-side on the computer screen, as shown in Figure 3. Using the cursor keys, the user marks those lines in which the two numbers are not identical. This is a VDT model of the paper-based proofreading task developed by Rea (1981), and it can be used with either the mouse or the keyboard. The program records total elapsed time,

Arial fonts, two common variable-width fonts. The continuing questions as to whether a change in the type of font affects performance (Dillon, 1992; Gould, Alfaro, Barnes, Finn, Grischkowsky & Minuto, 1987; Gould, Alfaro, Finn, Haupt & Minuto, 1987) justified the use of both a serif and a sans-serif font, although it was difficult to predict which would give better results.

Conveyor Belt is a vigilance and reaction time task, requiring the user to watch all the symbols and quickly respond to the targets. The critical parameters are different from those that influence the proofreading and typing tasks. Both the total symbol rate (Davies & Parasuraman, 1982) and the percentage of symbols which are targets (Davies & Parasuraman, 1982; Davies & Tune, 1970; Grandjean, 1982) affect performance on vigilance tasks. Therefore, this experiment varied these factors, as well as symbol colour. The pilot test included a total of 60 conditions of 2 minutes each. There were five speeds (total symbol rate), three sets of symbols, two levels of target probability and two times of day used to evaluate the effect of each variable on the performance of the participants. A mid-range speed and higher target probability were expected to produce the best performance in terms of reaction time and percentage of targets detected.

Three sets of symbols were used: all red symbols, all blue symbols, and both red and blue symbols. The colours were chosen from the many possible combinations to make the task more sensitive to lighting quality. Red on black and blue on black are both less legible than most colour combinations (Nilsson, Connolly & Ireland, 1993). Red and blue symbols together require more accommodation of the eye because they are both spectrally extreme (Collins, Davis & Goode, 1994; Langen & Rau, 1990; Matthews, Lovasik & Mertins, 1989; Murch, 1984), making this a more visually difficult task. Nonetheless, it seemed likely that the best performance would result from the conditions with two different colours because there is more variance between symbols, with single-colour conditions causing more difficulty.

Davies and Parasuraman (1982) and Davies and Tune (1970) noted that vigilance performance is dependent on the time of day, and often improves during the afternoon. This suggests that performance may be better in the afternoon than the morning for this pilot test. For this reason, the conditions for Conveyor Belt were run twice, once in the morning and once in the afternoon.

Method

Participants

Six office workers (3 men and 3 women) were hired from an office temporary service supplier for one day to participate in the pilot study. All were over 18 years of age, had normal or corrected-to-normal vision (including good colour vision, verified using the Ishihara colour plates), and had experience with Windows-based word processing and spreadsheet software. Participants worked for one 7.5-hour day of clerical work at their standard rate of payment. Participants were provided with a description of the pilot study and the tasks they would be doing when presented with this possible assignment (on the day before). Agreement to accept the assignment indicated their willingness to participate in the experiment.

Setting

Testing took place in the Indoor Environment Research Facility (IERF) in Building M-24 on the Montreal Road Campus of the National Research Council of Canada in Ottawa, Ontario. The IERF is a mocked-up 12.2 x 7.3 m (40 x 24 ft) office designed for acoustics, lighting, ventilation, and indoor air quality research. The space is furnished as a typical mid-level clerical or administrative office similar to those currently being installed in Canadian government buildings. There are six open-plan workstations of approximately 6 m² (65 ft²), with space for shared file cabinets and printers at the end of the room. The workstations are standard modular systems furniture (175 cm [66 in] panel height) with computers, storage space, keyboard shelf, and adjustable-height chair. The desktop illuminance was between 360 and 500 lux for all stations during the study, which is within acceptable standards for office lighting. The room is windowless. Temperature, noise, and ventilation were within normal guidelines for office environments.

Materials

The text used in the Typing Task came from Canadian government reports, and had no references to lighting, the office environment, or anything that might affect the participants' understanding of the experimental session. The text appeared one paragraph at a time; once the paragraph was complete, the user moved on to the next by pressing a defined function key. There was more text than participants with excellent keyboard skills could type in the allotted time, so that everyone typed for the entire 10-minute period.


The number lists used in the Proofreading Task were a mixture of those used in the visual performance studies done by Rea (1981) and new lists created using a random number generator. Both sets of lists contained 20 five-digit numbers in two nearly identical columns, with 0 to 6 randomly distributed errors in each comparison column. More sets of lists were prepared than necessary, to prevent any participant from finishing early.
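The original list generator is not described beyond this; as an illustration, here is a minimal Python sketch that produces one pair of nearly identical columns with 0 to 6 discrepancies. How a discrepant digit is chosen (its position and replacement value) is an assumption.

    import random

    def make_proofreading_lists(n_rows=20, digits=5, seed=None):
        """Generate two columns of five-digit numbers in which 0 to 6
        randomly placed rows of the comparison column differ."""
        rng = random.Random(seed)
        col_a = [str(rng.randrange(10**(digits - 1), 10**digits))
                 for _ in range(n_rows)]
        col_b = list(col_a)
        for row in rng.sample(range(n_rows), rng.randint(0, 6)):
            pos = rng.randrange(digits)   # digit position to alter
            banned = {col_b[row][pos]} | ({"0"} if pos == 0 else set())
            new = rng.choice([d for d in "0123456789" if d not in banned])
            col_b[row] = col_b[row][:pos] + new + col_b[row][pos + 1:]
        return col_a, col_b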

For the Conveyor Belt task, 4 shapes were randomly chosen for each session from a set of 6 standard shapes. Two of these were targets. In the case of red and blue symbols, there were two symbols of each colour, one of each being a target. Varying the symbols that appeared and changing the targets helped to prevent the participants from learning the task and radically improving their performance as time progressed. The two target probabilities were 50% targets (high) and 20% targets (low). The Conveyor Belt software presented the symbols in a random manner, but controlled the numbers so that the probabilities were approximately equal to the high and low percentages required. The five speeds ranged in constant increments from just over one symbol per second at Speed 1 to twice that amount at the fastest speed.
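A minimal sketch of this symbol-selection scheme follows, assuming shape names and a count-controlled target rate; the exact mechanism the program used to keep the probabilities near their nominal values is not documented, so this is one plausible reading.

    import random

    SHAPES = ["circle", "square", "triangle", "diamond", "star", "cross"]  # assumed names

    def make_symbol_stream(n_symbols, target_prob, seed=None):
        """Choose 4 of the 6 shapes for a session (2 as targets) and build a
        sequence whose target proportion is held near the nominal rate
        (0.5 for the high condition, 0.2 for the low condition)."""
        rng = random.Random(seed)
        chosen = rng.sample(SHAPES, 4)
        targets, distractors = chosen[:2], chosen[2:]
        n_targets = round(n_symbols * target_prob)  # control the count directly
        stream = ([rng.choice(targets) for _ in range(n_targets)] +
                  [rng.choice(distractors) for _ in range(n_symbols - n_targets)])
        rng.shuffle(stream)
        return stream, set(targets)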

Procedure

Participants arrived at 8:45 a.m., and gathered in the reception area of the IERF. The experimenter greeted them, gave detailed instructions, and provided an opportunity for the participants to ask questions. The participants signed a consent form prior to entering the IERF. The participants were also tested for colour blindness at this point, using the Ishihara colour plates, because the ability to judge colour was important for this study.

Upon entering the office area, the experimenter assisted the participants in adjusting their workstation seating and keyboard tray to be comfortable and correctly positioned. A practice session, including each of the three tasks, helped familiarize the participants with the software they would use throughout the day.

Scheduler, a new software programme developed for this experiment, presented the three tasks in a predetermined random order, different for each participant. Six 10-minute blocks of each task, for a total of 18, were presented in the morning session, and the remaining 18 in the afternoon. The software saved all data directly to the hard disk for automated scoring. The workstation number served as the identifying code linking individuals and their scores. The Scheduler also indicated to the participants when it was time for a coffee break or lunch. There were two 15-minute coffee breaks, one in the morning and one in the afternoon. Lunch was 45 minutes long. The experimenter was present during these breaks to hear the participants' reactions to the tasks.
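The report describes Scheduler only at this level of detail. As a sketch, here is one way such a predetermined random order of the 36 blocks could be produced; seeding on the workstation number (so each participant gets a different fixed order) is an assumption.

    import random

    TASKS = ["Typing Task", "Proofreading Task", "Conveyor Belt"]

    def make_schedule(workstation_number):
        """Six 10-minute blocks of each task in the morning and six in the
        afternoon (36 blocks in all), shuffled per participant."""
        rng = random.Random(workstation_number)
        morning, afternoon = TASKS * 6, TASKS * 6
        rng.shuffle(morning)
        rng.shuffle(afternoon)
        return morning, afternoon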

Each 10-minute block included one or more combinations of presentation parameters. Both Typing Task and Proofreading Task required 12 sessions for a fully crossed three-factor design (3 screen colours x 2 fonts x 2 sizes). Each task had 12 ten-minute sessions throughout the day, with one combination of parameters per session. Conveyor Belt required 60 two-minute sessions to allow for all combinations of conditions (5 speeds x 3 colours x 2 target probabilities x 2 times of day). Five such sessions were included in one 10-minute interval, allowing all five speeds to be tested for each particular set of conditions. This produced 6 sessions of 10 minutes for the morning, which were repeated (in random order) in the afternoon to test the time dependency of Conveyor Belt performance.

After the testing, a short debriefing addressed any questions the participants may have had. The testing ended at 5:00 p.m. Participants had access to their scores by telephone, if they wished, two weeks after the testing day.

Results

At the end of the day, the data were retrieved from the individual workstation computers and compiled for analysis. Missing data due to incomplete or interrupted sessions were not included: two Typing Task sessions, seven Proofreading Task sessions and eight Conveyor Belt sessions were removed.

Typing Task Performance

Two variables were calculated from the recorded data. Typing Task automatically records a variable called Score, which is the Typing Speed, measured as the number of correctly typed characters per second (cps). Typing Task's tallies of the number of backspace and cursor keys sum to give the Error Rate variable. The means and standard deviations for these two variables over each condition, as well as summaries of the data for each experimental variable, are listed in Table 1. Figures 4 and 5 show the histograms, collapsed across font, size, and colour conditions, for both Typing Speed and Error Rate.


Proofreading Task Performance

Two variables were calculated from the data recorded by Proofreading Task. The Score is found by dividing the number of correct detections of discrepancies, less any false positives, by the total number of discrepancies. The following formula is used:

Score = (Hits - False Positives) / (Hits + Misses)

where Hits is the number of correctly marked discrepancies, False Positives is the number of times identical numbers were marked, and Misses is the number of unmarked discrepancies.
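Expressed as code, the Score computation is straightforward (a Python sketch; the function name is ours, not the software's):

    def proofreading_score(hits, false_positives, misses):
        """Score = (Hits - False Positives) / (Hits + Misses).

        Hits + Misses equals the total number of discrepancies presented,
        so an error-free session scores 1.0."""
        total = hits + misses
        return (hits - false_positives) / total if total else 0.0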

Proofreading Task also records the Number of Paragraphs (screens) completed. The means and standard deviations of both of these variables can be found in Table 2. Histograms of the entire data set for both Score and Number of Paragraphs, collapsed across font, size, and colour conditions, are shown in Figures 6 and 7.

Conveyor Belt Task Performance

Only one variable was calculated from the Conveyor Belt Task data. The Score is calculated using the following formula:

Score = (Hits - False Positives) / Targets

where Hits is the number of targets removed, False Positives is the number of attempts to remove non-targets, and Targets is the total number of targets in that session. A perfect Score is 1, and negative Scores are possible if there is a high rate of False Positives. There were two cases where the Score was greater than one, due to a slight overlap in scoring between speed conditions. The means and standard deviations for all conditions are listed in Table 3, and the histograms are found in Figure 8. The slower speeds (Speeds 1-3) were easy for the subjects, and the high scores reflect that. Performance drops sharply for Speeds 4 and 5, as the task became more difficult.
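The corresponding sketch for the Conveyor Belt Score (again, an illustration rather than the original code):

    def conveyor_belt_score(hits, false_positives, targets):
        """Score = (Hits - False Positives) / Targets.

        A perfect session scores 1; frequent false positives can push the
        score below zero, as seen for Speed 5 in Table 3."""
        return (hits - false_positives) / targets if targets else 0.0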

The graphs in Figures 9 and 10 show how the effects of two variables, target colour and target frequency, change as the speed increases. The scores for all colours are nearly equal until the higher speeds, where conditions with blue symbols result in much lower scores. The conditions with both red and blue symbols result in scores between those of the all-red and all-blue conditions (Figure 9). Again, the scores are nearly equal at lower speeds. At Speeds 4 and 5, the scores from the low-probability conditions drop much more quickly than those from the high target probability conditions (Figure 10).

Discussion

Typing Task and Proofreading Task appear to have been too easy for the highly practiced office workers. Their skill shifted the Proofreading results to the high end of the scale. This is particularly visible in Figure 6, the histogram of the Proofreading Task scores, which illustrates a ceiling effect. There were some hints about why this happened. The subjects mentioned that Proofreading Task was very similar to a test required for some Public Service jobs. It was obvious that many of the subjects were very proficient at this task, with the average score being 0.883 out of a maximum of 1.

The proofreading speed, measured in number of screens completed (see Figure 7), was much more varied, and may be the variable that can be used to determine effects of lighting quality. This is consistent with the findings of Gould, Alfaro, Barnes et al. (1987) and Gould and Grischkowsky (1984), who noted that people tend to change the speed rather than the accuracy of proofreading when confronted with more unfavourable conditions.

This factor seems particularly notable in the mean number of screens completed for each screen colour (Table 2). The subjects consistently proofread fewer screens in the white-on-blue conditions than in any others, even though the proofreading scores are nearly identical. White on blue was the only reverse-polarity condition, and the veiling reflections (noticed by all the subjects) may have influenced proofreading speed for this condition.

Typing Task had very little variability among the different conditions. The individual conditions seem to have almost no effect on the typing speed or error rate, although they are both distributed over a reasonable range of values (see Figures 4 and 5).

Screen colour seemed to create a consistent trend in Typing Task, although a different one from that of Proofreading Task. The gray-on-gray condition created the slowest typing speed and highest error rate, compared with the other screen colours. The effect of contrast was not as large as was expected, which may indicate that the contrast of the gray-on-gray condition was still above the performance threshold found by Rea (1982, 1986), Rea and Ouellette (1988, 1991), and Rea, Ouellette and Tiller (1990). The contrast could be reduced further to increase the sensitivity of Typing Task to lighting.

The typing speed may be the most useful variable for measuring performance effects for Typing Task, because the collected data fall into a reasonably normal distribution (Figure 4). The Error Rate is much more scattered (Figure 5), and it would be more difficult to observe effects from outside sources using this variable.

For both Typing Task and Proofreading Task the data shows no significant effects from the font type. The trends in Typing Task suggest that Times New Roman font increases speed and reduces the error rate, but this is a small effect, and is not consistent across all conditions.

For both tasks, the data suggests that a larger font size increases speed and performance and decreases the number of errors, as would be expected. This difference was not as pronounced as expected, likely due to the ceiling effect on both of these tasks and the small number of subjects.

Conveyor Belt Task was new and difficult for the subjects. Everyone remarked on its difficulty because of the symbol speed and the screen glare. The histograms in Figure 8 show how speed affects performance. Results range from all subjects getting a nearly perfect score at Speed 1, to Speed 5 where the average score is only 0.176 out of a perfect score of 1.

The three lower speeds were slow enough that few performance differences were noted, but as the task speed increased, effects of some of the other variables began to appear. Figure 9 shows how blue targets become much more difficult to remove as the speed increases. This confirmed comments by the subjects themselves, who found the blue symbols harder to see. The red targets were easier to remove, and performance on the combination of red and blue targets was intermediate between those for the two colours alone.

The effect of target probability was also dramatically affected by the speed of the task. As shown in Figure 10, the scores dropped off much more for low-probability conditions than for high-probability conditions. Once again, Speed 4 seems to be the threshold where performance becomes affected by other variables. The results were consistent throughout the day, as there is very little difference between morning and afternoon sessions.

The pilot study found that both Typing Task and Proofreading Task need to be more difficult before they will have enough sensitivity to be used in future experiments. Given the small

References

Bassani, G. (1980). NCR: From the first computer in Italy to the 1980s. Research and development on VDUs connected to EPD systems. In E. Grandjean & E. Vigliani (Eds.), Ergonomic aspects of visual display terminals (pp. 257-262). London: Taylor and Francis.

Beldie, I. P., Pastoor, S. & Schwarz, E. (1983). Fixed versus variable letter width for televised text. Human Factors, 25, 273-277.

Collins, M., Davis, B. & Goode, A. (1994). Steady-state accommodation response and VDT screen conditions. Applied Ergonomics, 25, 334-338.

Davies, D. R. & Parasuraman, R. (1982). The psychology of vigilance. London: Academic Press.

Davies, D. R. & Tune, G. S. (1970). Human vigilance performance. London: Staples Press.

Dillon, A. (1992). Reading from paper versus screen: A critical review of the empirical literature. Ergonomics, 35, 1297-1326.

Gould, J. D., Alfaro, L., Barnes, V., Finn, R., Grischkowsky, N. & Minuto, A. (1987). Reading is slower from CRT displays than from paper: Attempts to isolate a single-variable explanation. Human Factors, 29, 269-299.

Gould, J. D., Alfaro, L., Finn, R., Haupt, B. & Minuto, A. (1987). Reading from CRT displays can be as fast as reading from paper. Human Factors, 29, 497-517.

Gould, J. D. & Grischkowsky, N. (1984). Doing the same work with paper and cathode ray tube displays (CRT). In E. Grandjean (Ed.), Ergonomics and health in modern offices (pp. 329-338). London: Taylor and Francis.

Grandjean, E. (1982). Fitting the task to the man: An ergonomic approach (pp. 159-166). London: Taylor and Francis.

Kanaya, S. (1990). Vision and visual environment for VDT work. Ergonomics, 33, 775-785.

Langen, M. & Rau, G. (1990). Interactive colour design of interactive graphical displays using a prototyping tool based on colour metrics. Ergonomics, 33, 1043-1054.

Matthews, M. L., Lovasik, J. V. & Mertins, K. (1989). Visual performance and subjective discomfort in prolonged viewing of chromatic displays. Human Factors, 31, 259-271.

Murch, G. M. (1984, November). Physiological principles for the effective use of color. Computer Graphics, November, 49-54.

Nilsson, T., Connolly, K. & Ireland, W. (1993). Development of an industrial design service utilizing new technology to optimize visual information displays for both human and machine vision (Contract No. 9F006-2-0021101-OSC). Charlottetown, PEI: Traid Design and University of Prince Edward Island.

Radl, G. W. (1980). Experimental investigations for optimal presentation-mode and colours of symbols on the CRT screen. In E. Grandjean & E. Vigliani (Eds.), Ergonomic aspects of visual display terminals (pp. 127-137). London: Taylor and Francis.

Rea, M. S. (1982, November). An overview of visual performance. Lighting Design and Application, 12(11), 35-41.

Rea, M. S. (1986, Summer). Toward a model of visual performance: Foundations and data. Journal of the Illuminating Engineering Society, 15, 41-57.

Rea, M. S. & Ouellette, M. J. (1988). Visual performance using reaction times. Lighting Research and Technology, 20, 139-153.

Rea, M. S. & Ouellette, M. J. (1991). Relative visual performance: A basis for application. Lighting Research and Technology, 23, 135-144.

Rea, M. S., Ouellette, M. J. & Tiller, D. K. (1990). Effects of luminous surroundings on visual performance, pupil size and human preference. Journal of the Illuminating Engineering Society, 19, 45-58.

Schmidtke, H. (1980). Ergonomic design principles of alphanumeric displays. In E. Grandjean & E. Vigliani (Eds.), Ergonomic aspects of visual display terminals (pp. 265-269). London: Taylor and Francis.

Taptagaporn, S. & Saito, S. (1990). How display polarity and lighting conditions affect the pupil size of VDT operators. Ergonomics, 33, 201-208.

Table 1

Typing Task Speed and Error Rate

Typing Speed (cps)

Font Size  Font   Black on White  White on Blue  Gray on Gray  Size Marginal  Font Marginal
8 Point    Times  2.78 (0.94)     2.98 (0.84)    2.63 (0.67)   2.76 (0.77)    Times: 2.92 (0.81)
8 Point    Arial  2.75 (0.90)     2.88 (0.93)    2.54 (0.51)                  Arial: 2.85 (0.84)
12 Point   Times  3.01 (0.77)     3.10 (0.89)    3.00 (1.02)   3.00 (0.86)
12 Point   Arial  2.97 (0.98)     2.94 (0.74)    2.99 (1.13)
Colour Marginal   2.88 (0.84)     2.97 (0.80)    2.80 (0.85)   Grand Total: 2.89 (0.82)

Error Rate

Font Size  Font   Black on White  White on Blue  Gray on Gray  Size Marginal  Font Marginal
8 Point    Times  59.80 (33.74)   83.50 (41.42)  87.50 (74.66) 76.38 (47.31)  Times: 74.14 (46.26)
8 Point    Arial  68.83 (48.25)   89.83 (42.60)  64.00 (43.38)                Arial: 77.94 (41.19)
12 Point   Times  65.00 (36.27)   56.50 (37.88)  90.17 (47.87) 75.72 (40.29)
12 Point   Arial  92.50 (36.30)   76.50 (49.59)  73.67 (37.02)
Colour Marginal   72.04 (38.72)   76.58 (42.16)  79.48 (50.53) Grand Total: 76.04 (43.52)

Note. Times is short for Times New Roman font. Scores are listed as Mean (Standard Deviation).

Table 2

Proofreading Task Scores and Screens Completed

Score

Font Size  Font   Black on White  White on Blue  Gray on Gray   Size Marginal  Font Marginal
8 Point    Times  0.880 (0.082)   0.891 (0.089)  0.903 (0.077)  0.879 (0.075)  Times: 0.885 (0.082)
8 Point    Arial  0.906 (0.046)   0.874 (0.088)  0.845 (0.079)                 Arial: 0.897 (0.061)
12 Point   Times  0.865 (0.091)   0.875 (0.093)  0.903 (0.080)  0.904 (0.068)
12 Point   Arial  0.907 (0.059)   0.925 (0.037)  0.926 (0.046)
Colour Marginal   0.891 (0.067)   0.890 (0.078)  0.892 (0.074)  Grand Total: 0.883 (0.076)

Number of Screens Completed

Font Size  Font   Black on White  White on Blue  Gray on Gray   Size Marginal  Font Marginal
8 Point    Times  20.60 (5.68)    18.80 (4.66)   21.50 (4.04)   19.00 (4.64)   Times: 19.47 (4.88)
8 Point    Arial  19.67 (4.89)    16.83 (3.54)   18.83 (4.96)                  Arial: 19.03 (4.52)
12 Point   Times  21.00 (6.16)    18.17 (4.36)   18.50 (5.61)   19.50 (4.77)
12 Point   Arial  20.17 (4.31)    18.80 (4.92)   19.20 (5.76)
Colour Marginal   20.32 (4.87)    18.09 (4.12)   19.33 (4.93)   Grand Total: 19.29 (4.73)

Note. Times is short for Times New Roman font. Scores are listed as Mean (Standard Deviation).

Table 3

Conveyor Belt Performance Scores

                      Morning                           Afternoon
Condition             Low Freq.       High Freq.        Low Freq.        High Freq.       Marginal Mean

Red Symbols
Speed 1               0.846 (0.162)   0.922 (0.106)     0.939 (0.070)    0.963 (0.051)    0.918 (0.108)
Speed 2               0.820 (0.191)   0.840 (0.221)     0.915 (0.075)    0.800 (0.205)    0.844 (0.176)
Speed 3               0.747 (0.111)   0.833 (0.087)     0.615 (0.301)    0.721 (0.370)    0.729 (0.245)
Speed 4               0.449 (0.320)   0.588 (0.248)     0.522 (0.579)    0.727 (0.297)    0.572 (0.373)
Speed 5               0.057 (0.328)   0.360 (0.220)     0.229 (0.640)    0.566 (0.323)    0.303 (0.426)

Blue Symbols
Speed 1               0.936 (0.073)   0.919 (0.103)     0.964 (0.064)    0.935 (0.112)    0.938 (0.086)
Speed 2               0.830 (0.132)   0.903 (0.056)     0.903 (0.101)    0.864 (0.086)    0.875 (0.096)
Speed 3               0.650 (0.239)   0.771 (0.130)     0.678 (0.250)    0.768 (0.190)    0.717 (0.201)
Speed 4               0.185 (0.466)   0.457 (0.275)     0.350 (0.381)    0.389 (0.246)    0.345 (0.345)
Speed 5               -0.030 (0.204)  0.188 (0.214)     -0.228 (0.436)   0.203 (0.167)    0.033 (0.315)

Red and Blue Symbols
Speed 1               0.882 (0.162)   0.868 (0.230)     0.992 (0.018)    0.850 (0.130)    0.895 (0.157)
Speed 2               0.835 (0.142)   0.778 (0.198)     0.932 (0.081)    0.783 (0.188)    0.828 (0.163)
Speed 3               0.781 (0.198)   0.659 (0.190)     0.829 (0.176)    0.669 (0.193)    0.728 (0.190)
Speed 4               0.523 (0.401)   0.640 (0.202)     0.564 (0.235)    0.484 (0.345)    0.552 (0.294)
Speed 5               0.090 (0.448)   0.338 (0.172)     0.025 (0.475)    0.316 (0.305)    0.194 (0.367)

Marginal Means by Time of Day
                      Morning         Afternoon
Speed 1               0.896 (0.141)   0.939 (0.090)
Speed 2               0.834 (0.158)   0.864 (0.137)
Speed 3               0.739 (0.168)   0.710 (0.249)
Speed 4               0.474 (0.340)   0.504 (0.364)
Speed 5               0.167 (0.298)   0.186 (0.463)

Marginal Means by Target Frequency
                      Low Frequency   High Frequency
Speed 1               0.926 (0.108)   0.910 (0.130)
[Values for Speeds 2-5 and the Grand Total rows are missing in the source.]

Figure Captions

Figure 1. Layout of Typing Task. The Model Text Box appears at the top of the screen, and contains the text to be typed. The user retypes the text into the bottom Entry box, exactly as it appears above. The strikethrough text indicates to the user that there is an error in the retyped text.

Figure 2. Layout of Conveyor Belt Task. The Conveyor Belt is the shaded band across the screen. The symbols move across the screen from left to right along the Conveyor Belt. Targets must be removed once they enter the dotted line around the Removal Zone. The task used in the pilot study had a black Conveyor Belt with red or blue symbols.

Figure 3. The layout of Proofreading Task. Column B is compared to Column A, and any discrepancies between the two are marked in the Check column. The user can move the highlighted box in the Check column with the cursor keys to select the rows to mark. A box marked with an 'X' appears in the Check column to indicate the discrepancy when the space bar is pressed.

Figure 4. Typing speeds for all Typing Task sessions, in characters per second (cps).

Figure 5. Error rates for all Typing Task sessions. Error rates were determined by counting the number of cursor and backspace keys used.

Figure 6. Proofreading Task scores from all sessions.

Figure 7. The speed of each 10-minute session of Proofreading Task, measured by the number of screens completed in that time.

Figure 8. Conveyor Belt Task scores for each speed.

Figure 9. The effect of symbol colour by speed on Conveyor Belt performance.

Figure 10. The effect of target probability by speed on Conveyor Belt performance.

[Figure 1: Typing Task screen. The model text sample begins "energy consumption and electrical demand in buildings. Utilities, facing rapidly rising costs for additional generating capacity to meet . . ."]

[Figure 2: Conveyor Belt Task screen.]

[Figure 3: Proofreading Task screen, showing two columns of five-digit numbers and a message box: "Press F5 when list has been proofread to move on to the next screen."]

[Figures 4 and 5: Typing Task histograms.]

[Figures 6 and 7: Proofreading Task histograms; horizontal axis labelled "Paragraphs Completed".]

[Figure 8: Conveyor Belt Task score histograms, one panel per speed (Speed 1 through Speed 5) plus an All Speeds panel; horizontal axis labelled "Score".]

[Figures 9 and 10: Conveyor Belt performance plots.]
