Ives et al. (1983) defined 'User Information Satisfaction' as "the extent to which users believe the information system available to them meets their information requirements." Other terms for User Information Satisfaction are "system acceptance" (Igersheim, 1976), "perceived usefulness" (Larcker and Lessig, 1980), "MIS appreciation" (Swanson, 1974) and "feelings about information system" (Maish, 1979). Ang and Koh (1997) have described user information satisfaction (UIS) as "a perceptual or subjective measure of system success". This means that user information satisfaction will differ in meaning and significance from person to person. In other words, users who are equally satisfied with the same system according to one definition and measure may not be equally satisfied according to another.
such errors not only result in reduced sample sizes but can also distort the results, as those who return long questionnaires, properly completed, may have differing psychological traits from those who do not. Ives et al. thus developed the UIS, which requires the respondent to rate only 13 factors, and which remains in significant use at the present time. Two seven-point scales are provided per factor (one for each quality), requiring 26 individual responses in all. In a recent article, however, Islam, Koivulahti-Ojala and Käkölä (2010) argued that user satisfaction is difficult to measure in industry settings, as response rates often remain low; a simpler user satisfaction measurement instrument is therefore needed.
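The response burden that motivated the shorter instrument is simple arithmetic; as a minimal illustrative sketch (the function name is mine, not drawn from any of the cited instruments), the totals for the CUS and the UIS work out as follows:

```python
def response_count(factors: int, scales_per_factor: int) -> int:
    """Total individual scale responses a factor-based questionnaire demands."""
    return factors * scales_per_factor

# Bailey and Pearson's CUS: 39 factors, five seven-point scales each.
cus_responses = response_count(39, 5)   # 195
# Ives, Olson and Baroudi's UIS: 13 factors, two seven-point scales each.
uis_responses = response_count(13, 2)   # 26

print(cus_responses, uis_responses)  # 195 26
```

The roughly sevenfold reduction in required responses is what the authors credited with lowering errors of attrition.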
They identified end-users as users who interact only with a computer interface, whereas previously users also interacted with developers and operational staff. McKinney, Yoon and Zahedi (2002) developed a model and instruments for measuring web-customer satisfaction during the information phase. Cheung and Lee (2005), in developing an instrument to measure user satisfaction with e-portals, based their instrument on that of McKinney, Yoon and Zahedi (2002), which in turn was based primarily on instruments from prior studies.
Successful organisations have systems in place which they believe help maximise profits and minimise overheads. It is therefore desirable that all their systems succeed and remain successful, including their computer-based systems. According to key scholars such as DeLone and McLean (2002), user satisfaction is a key measure of computer system success, if not synonymous with it. However, the development of techniques for defining and measuring user satisfaction has been ad hoc and open to question. The term
(measured using the KAI scales) and user satisfaction, a more significant link was found at around 85 and 652 days into the systems' usage. This link shows that a large absolute gap between user and analyst cognitive styles often yields a higher rate of user dissatisfaction than a smaller gap. Furthermore, an analyst with a more adaptive cognitive style than the user at the early and late stages of system usage (approximately days 85 and 652) tends to reduce user dissatisfaction.
usefulness (or otherwise) of tools of the trade are contextually related, and so are special cases of hygiene factors. They consequently define user satisfaction as the absence of user dissatisfaction and complaint, as assessed by users who have had at least some experience of using the system. In other words, satisfaction is based on memories of the past use of a system. Motivation, conversely, is based on beliefs about the future use of the system. (Mullany et al., 2007, p. 464)
Others are replacing structured questionnaires with unstructured ones, in which the respondent is asked simply to write down or dictate all the factors about a system which either satisfy or dissatisfy them. One problem with this approach, however, is that such instruments tend not to yield quantitative results, making comparisons and statistical analysis difficult. Also, if scholars cannot agree on the precise meaning of the term
(the electronic data processing, or computing, department). However, the CUS requires 39 × 5 = 195 individual seven-point scale responses. Ives, Olson and Baroudi (1983), amongst others, thought that so many responses could result in errors of attrition: that is, the respondent's failure to return the questionnaire, or increasing carelessness as they fill in a long form. In
and so on. In a recent study, Islam (2011) found that the sources of dissatisfaction differ from the sources of satisfaction: environmental factors (e.g., system quality) were more critical in causing dissatisfaction, while outcome-specific factors (e.g., perceived usefulness) were more critical in causing satisfaction.
Several studies have investigated whether or not certain factors influence the UIS; for example, those by Yaverbaum (1988) and Ang and Soh (1997). Yaverbaum's (1988) study found that people who use their computer irregularly tend to be more satisfied than regular users. Ang and Soh's (1997) research, on
As none of the instruments in common use rigorously defines its construct of user satisfaction, some scholars, such as Cheyney, Mann and Amoroso (1986), have called for more research on the factors which influence the success of end-user computing. Little subsequent effort which sheds new light
short-form of
Baroudi, Olson and Ives are typical of instruments which one might term 'factor-based'. They consist of lists of factors, each of which the respondent is asked to rate on one or more multiple-point scales. Bailey and Pearson's CUS asked for five ratings for each of 39 factors. The
In the literature, the two terms 'user satisfaction' and 'user information satisfaction' are used interchangeably. According to Doll and Torkzadeh (1988), 'user satisfaction' is defined as the opinion of the user about a specific computer application which they use.
An early criticism of these measures was that the factors date as computer technology evolves and changes. This suggested the need for updates and led to a sequence of other factor-based instruments. Doll and
Torkzadeh (1988), for example, produced a factor-based instrument for a new type of user
Another difficulty with most of these instruments is their lack of theoretical underpinning by psychological or managerial theory. Exceptions to this were the model of web site design success developed by Zhang and von Dran (2000), and a measure of user satisfaction with e-portals, developed by
Mullany, Tan and Gallupe (2006) devised an instrument, the System Satisfaction Schedule (SSS), which uses almost exclusively user-generated factors and so avoids the problem of the dating of factors. Also aligning themselves with Herzberg, these authors argue that the perceived
A study by Mullany (2006) showed that, during the life of a system, user satisfaction will on average increase over time as the users' experience with the system increases. While the overall findings of the studies showed only a weak link between the gap in the users' and analysts'
on the matter exists, however. All factor-based instruments run the risk of including factors irrelevant to the respondent while omitting some that may be highly significant to them. This is further exacerbated by the ongoing changes in information technology.
Mullany, Tan and Gallupe (2006) do essay a definition of user satisfaction, claiming that it is based on memories of the past use of a system. Conversely, motivation, they suggest, is based on beliefs about the future use of the system (Mullany et al., 2006).
Bargas-Avila, J., Loetscher, J., Orsini, S. and Opwis, K. "Intranet
Satisfaction Questionnaire: Development and Validation of a Questionnaire to Measure User Satisfaction with the Intranet" Paper submitted to Information & Management.
Mullany, M. J., Tan, F. B. and Gallupe, R. B., 2006, "The S-Statistic: a measure of user satisfaction based on Herzberg's theory of motivation", Proceedings of the 17th Australasian Conference on Information Systems (ACIS),
Mullany, M. J., Tan, F. B. and Gallupe, R. B., 2007, "The Impact Of Analyst-User Cognitive Style Differences On User Satisfaction", Proceedings of the 11th Pacific-Asia Conference on Information Systems (PACIS),
The large number of studies over the past few decades, as cited in this article, shows that user information satisfaction remains an important research topic despite somewhat contradictory results.
Islam, A.K.M. Najmul, Koivulahti-Ojala, M., and Käkölä, T. "A lightweight, industrially-validated instrument to measure user satisfaction and service quality experienced by the users of a UML modeling tool",
Consequently, their factors were designed to measure both 'satisfiers' and 'hygiene factors'. However, Herzberg's theory is itself criticized for failing to distinguish adequately between the terms
Mullany, M. J. "The use of Analyst-User Cognitive Style Differentials to Predict Aspects of User Satisfaction with Information Systems", Auckland University of Technology, 2006.
first four scales were for quality ratings and the fifth was an importance rating. From the fifth rating of each factor, they found that their sample of users rated as most important:
) is the attitude of a user to the computer system they employ in the context of their work environment. Doll and Torkzadeh's (1988) definition of user satisfaction is,
, respondents will be highly unlikely to respond consistently to such instruments. Some newer instruments contain a mix of structured and unstructured items.
Currently, some scholars and practitioners are experimenting with other measurement methods and further refinements of the definition for
Baroudi, J.J., and Orlikowski, W.J. "A Short-Form Measure of User Information Satisfaction: A Psychometric Evaluation and Notes on Use",
McKinney, V., Yoon, K., and Zahedi, F.M. "The measurement of web-customer satisfaction: An expectation and disconfirmation approach",
Islam, A.K.M. Najmul, "Information Systems Post-adoption Satisfaction and Dissatisfaction: A Study in the E-Learning Context",
Cheung, C.M.K., and Lee, M.K.O. "The Asymmetric Effect of Website Attribute Performance on Satisfaction: An Empirical Study",
Yaverbaum, G. J. "Critical factors in the user environment - an experimental study of users, organizations and tasks",
Ang, J. and Soh, P. H. "User information satisfaction, job satisfaction and computer background: An exploratory study",
In a broader sense, the definition of user satisfaction can be extended to user satisfaction with any computer-based
Zhang, P., and Von Dran, G.M. "Satisfiers and dissatisfiers: a two-factor model for Website design and evaluation.",
Cheyney, P. H., Mann, R.L., and Amoroso, D.L. "Organisational factors affecting the success of end-user computing",
Doll, W.J., and Torkzadeh, G. "The measurement of end-user computing satisfaction: theoretical considerations",
DeLone, W.H., and McLean, E.R. "The DeLone and McLean Model of Information Systems Success: A Ten-Year Update",
Bailey, J.E., and Pearson, S.W. "Development of a tool for measuring and analysing computer user satisfaction",
Ang, J. and Koh, S. "Exploring the relationships between user information satisfaction and job satisfaction",
DeLone, W.H., and McLean, E.R. "Information Systems Success: The Quest for the Dependent Variable",
Larcker, D.F. and Lessig, V.P. "Perceived usefulness of information: a psychometric examination",
Ives, B., Olson, M.H., and Baroudi, J.J. "The measurement of user information satisfaction",
Cheung and Lee (2005). Both of these models drew upon Herzberg's two-factor theory of
Swanson, E.B. "Management and information systems: an appreciation and involvement",
Doll, W.J., and Torkzadeh, G. "The Measurement of End User Computing Satisfaction",
35th Hawaii International Conference on System Sciences, IEEE Computer Society Press
38th Hawaii International Conference on System Sciences, IEEE Computer Society Press
the other hand, could find no evidence that computer background affects UIS.
the opinion of the user about a specific computer application, which they use
DeLone, W.H., and McLean, E.R. "Information Systems Success Revisited",
appliance. However, scholars distinguish between user satisfaction and
Igersheim, R.H. "Management response to an information system",
Herzberg, F. "One more time: How do you motivate employees?",
Journal of the American Society for Information Science
The factors of least importance were found to be
446:Maish, A.M. "A user's behavior towards his MIS",
International Journal of Information Management
Proceedings AFIPS National Computer Conference
Herzberg, F., Mausner, B., and Snyderman, B.
Attitude to a computer system used for work
World Publishing, Cleveland, 1966, p. 203.
Journal of Management Information Systems
Journal of Management Information Systems
Journal of Management Information Systems
(46:1), January-February 1968, pp 53-62.
, Los Alamitos, CA, 2002, pp. 238-248.
The problem with the dating of factors
questionnaire and its derivative, the
Bailey and Pearson's (1983) 39-Factor
(and closely related concepts such as
(51:14), December 2000, pp 1253-1268.
(13:3), September 2002, pp 296-315.
User Information Satisfaction (UIS)
(26:10), October 1983, pp 785-793.
A lack of theoretical underpinning
. Wiley, New York, 1959, p. 257.
emerging at the time, called an
Computer User Satisfaction (CUS)
(12:2), June 1988, pp 258-274.
organisational position of EDP
(19:4), Spring 2003, pp 9-30.
(4:2), Spring 1988, pp 44-58.
(29:5), May 1983, pp 530-545.
Information Systems Research
(15:1), March 1991, pp 5-10.
(3:1), March 1992, pp 60-95.
Information Systems Research
, Hawaii, 2005, pp. 175-184.
Information & Management
computer system satisfaction
Work and the nature of man
Computer User Satisfaction
Human-Computer Interaction
Computer user satisfaction
(21:2), 1974, pp 178-188.
(11:1), 1980, pp 121-134.
Communications of the ACM
(32:5), 1997, pp 255-266.
(17:3), 1997, pp 169-177.
The problem of defining
confidence in the system
(12:1), 1988, pp 75-88.
Harvard Business Review
(3:1), 1979, pp 37-52.
The motivation to work
computing satisfaction
(3:1), 1986, pp 65-80.
, 1976, pp 877-882.
Future developments
feelings of control
The CUS and the UIS
system satisfaction
Management Science
Management Science
degree of training
is abbreviated to
Proceedings PACIS
Proceedings AMCIS
user satisfaction
user satisfaction
in this article.
user satisfaction
user satisfaction
Decision Sciences
job satisfaction
volume of output
cognitive style
Cognitive style
job motivation
vendor support
MIS Quarterly
MIS Quarterly
Herzberg, F.
MIS Quarterly
MIS Quarterly
psychometrics
satisfaction
satisfaction
reliability
as part of
References
motivation
motivation
timeliness
electronic
Auckland.
Adelaide.
relevancy
usability
end-user
accuracy
end user
, and
2011.
2010.
2008.
and
and
,
,
,
,
,
,
,
,
,
,
,