I think I have to switch to BPMN 2.0 to better handle exceptions (see the sketch below).
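In BPMN 2.0 an exception can be modelled directly with an error boundary event. Here is a minimal sketch, with all IDs, names and the error code being illustrative rather than taken from any real process: an interrupting boundary event catches the error thrown by an automated service task and routes the case to an explicit recovery step.

```xml
<!-- Minimal BPMN 2.0 sketch; all IDs and names are illustrative.
     Start/end events and the normal flows are omitted for brevity. -->
<bpmn:definitions xmlns:bpmn="http://www.omg.org/spec/BPMN/20100524/MODEL"
                  targetNamespace="http://example.com/bpmn">
  <bpmn:error id="chargeError" errorCode="CHARGE_FAILED"/>
  <bpmn:process id="orderProcess" isExecutable="true">
    <bpmn:serviceTask id="chargeCard" name="Charge credit card"/>
    <!-- Interrupting boundary event: aborts chargeCard when the error is thrown -->
    <bpmn:boundaryEvent id="catchChargeError" attachedToRef="chargeCard">
      <bpmn:errorEventDefinition errorRef="chargeError"/>
    </bpmn:boundaryEvent>
    <bpmn:serviceTask id="compensateOrder" name="Cancel order and notify customer"/>
    <bpmn:sequenceFlow id="errorFlow" sourceRef="catchChargeError"
                       targetRef="compensateOrder"/>
  </bpmn:process>
</bpmn:definitions>
```

Because the boundary event is interrupting, the failing task is aborted and the token continues along the recovery flow; the exception path is visible in the model itself rather than buried in application code.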
Thanks,
AS
Development efforts for automated systems are usually justified in part by their presumed impacts on human performance: a reduced workload, enhanced productivity, and fewer errors. Automation has generally failed to live up to these expectations, due not to the automation itself but to its inappropriate application and design [1]. Studies of failures in automation reveal an "epidemic of clumsy use of technology" [2]: automation that creates new cognitive demands on the operator rather than freeing his/her time; diverts the user's attention to the interface rather than focusing it on the job; creates the potential for new kinds of errors and "automation surprises" rather than reducing errors; and imposes new, more difficult knowledge and skill requirements rather than reducing what the operator needs to know.
A key finding [3] of these studies is that "strong, silent" systems, i.e. those implemented as black boxes, are difficult to direct and result in a system with only two modes: fully automatic and fully manual. In such cases the human operator is apt to interrupt the automated agent and take over the problem entirely if the agent is not solving it adequately. This situation results from the "substitution myth" [3], whereby developers assume that adding automation is a simple substitution of a machine activity for a human activity. Instead, partly because activities are highly interdependent or coupled, adding or expanding the machine's role changes the cooperation needed and the role of the human operator. Table 1 summarizes the apparent benefits of automation in contrast to empirical observations of operational personnel [3]; a BPMN sketch of an intermediate mode of cooperation follows the table.
Table 1: Apparent benefits of automation vs. empirical observations [3]
Apparent benefit | Empirical observation |
Better results are obtained from "substitution" of machine activity for human activity. | Practices are transformed; the roles of people change. |
Work is offloaded from the human to the machine. | New kinds of cognitive work are created for the human, often at the wrong times. |
Operator's attention will be focused on the correct answer. | More threads must be tracked; it becomes harder for operators to remain aware of and integrate all of the activities and changes around them. |
Less operator knowledge is required. | New knowledge and skill demands are imposed on the operator. |
Errors are reduced. | New problems and potentials for error are introduced. |
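To connect this finding to the subject of this blog: in BPM terms, the "fully automatic vs. fully manual" trap can be avoided by modelling intermediate modes of cooperation explicitly in the process. Below is a purely hypothetical BPMN 2.0 sketch: the automated step handles confident cases, and an exclusive gateway escalates only borderline cases to a human task, so the operator handles exceptions in cooperation with the automation instead of taking over the whole problem. The ${...} condition syntax is engine-specific (e.g. Camunda or Activiti style).

```xml
<!-- Hypothetical sketch: graded automation instead of all-or-nothing.
     Start/end events omitted for brevity; all names are illustrative. -->
<bpmn:definitions xmlns:bpmn="http://www.omg.org/spec/BPMN/20100524/MODEL"
                  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                  targetNamespace="http://example.com/bpmn">
  <bpmn:process id="claimProcess" isExecutable="true">
    <bpmn:serviceTask id="autoAssess" name="Assess claim automatically"/>
    <!-- The default flow goes to the human: low-confidence cases are escalated -->
    <bpmn:exclusiveGateway id="confidentEnough" name="Confidence high?"
                           default="toHuman"/>
    <bpmn:userTask id="humanReview" name="Review borderline claim"/>
    <bpmn:serviceTask id="settleClaim" name="Settle claim"/>
    <bpmn:sequenceFlow id="toGateway" sourceRef="autoAssess"
                       targetRef="confidentEnough"/>
    <bpmn:sequenceFlow id="toSettle" sourceRef="confidentEnough"
                       targetRef="settleClaim">
      <bpmn:conditionExpression xsi:type="bpmn:tFormalExpression">
        ${confidence &gt;= 0.9}
      </bpmn:conditionExpression>
    </bpmn:sequenceFlow>
    <bpmn:sequenceFlow id="toHuman" sourceRef="confidentEnough"
                       targetRef="humanReview"/>
    <bpmn:sequenceFlow id="afterReview" sourceRef="humanReview"
                       targetRef="settleClaim"/>
  </bpmn:process>
</bpmn:definitions>
```

The hand-over point can be tuned over time (e.g. by adjusting the confidence threshold), which is one way to provide the intermediate modes of cooperation these studies call for, rather than a binary auto/manual switch.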
Billings [4] enumerates several fundamental attributes which are common to occurrences of failure in automation and human/automation interaction:
Billings [4] offers the following “first principles” as essential elements of an over-arching philosophy for human-centered systems:
[1] Thurman, D. A., Brann, D. M., and Mitchell, C. M., “An Architecture to Support Incremental Automation of Complex Systems”, Proceedings of the 1997 IEEE International Conference on Systems, Man, and Cybernetics, Orlando, FL (to appear).
[2] Woods, D. D., Patterson, E. S., Corban, J. M., and Watts, J. C., “Bridging the Gap Between User-Centered Intentions and Actual Design Practice”, on web site http://csel.eng.ohio-state.edu:8080/~csel/BridgeGapUserCtrInt.html.
[3] Woods, D. D., “Human-Centered Software Agents: Lessons from Clumsy Automation”, position paper for National Science Foundation Workshop on Human-Centered Systems: Information, Interactivity, and Intelligence, Arlington VA, February 1997.
[4] Billings, C. E., “Issues Concerning Human-Centered Intelligent Systems: What’s ‘human-centered’ and what’s the problem?”, plenary talk at National Science Foundation Workshop on Human-Centered Systems: Information, Interactivity, and Intelligence, Arlington VA, February 1997.
[5] Brann, D. M., Thurman, D. A., and Mitchell, C. M., “Human Interaction with Lights-out Automation: A Field Study”, Proceedings of the 1996 Symposium on Human Interaction with Complex Systems, Dayton OH, August 1996, pp. 276-283.
[6] Callantine, T., “Intent Inferencing”, on web site http://www.isye.gatech.edu/chmsr/Todd_Callantine/CHII.html.
[7] Mitchell, C. M., “Models for the Design of Human Interaction with Complex Dynamic Systems”, Proceedings of the Cognitive Engineering Systems in Process Control, November 1996.
[8] Thurman, D. A. and Mitchell, C. M., “A Design Methodology for Operator Displays of Highly Automated Supervisory Control Systems”, Proceedings of the 6th Annual IFAC/IFORS/IFIP/SEA Symposium on Man-Machine Systems, Boston MA, July 1995.
[9] Thurman, D. A. and Mitchell, C. M., “A Methodology for the Design of Interactive Monitoring Interfaces”, Proceedings of the 1994 IEEE International Conference on Systems, Man, and Cybernetics, San Antonio TX, October 1994, pp. 1739-1744.
[10] “Field Guide for Designing Human Interaction with Intelligent Systems”, Draft, on web site http://tommy.jsc.nasa.gov/~clare/methods/methods.html, December 12, 1995.
Note: General thanks to Peter Schooff ( https://www.linkedin.com/in/pschooff ) and bpm.com.