



Session 4: Intelligent access to information: Part 1


Simulated man-machine systems as computer-aided information transfer and self-learning tools
Human-centred design of information systems
Designing interactive systems based on cognitive theories of human information processing
Personal hypermedia systems
Discussion


Chairperson: Hisao Yamada

Simulated man-machine systems as computer-aided information transfer and self-learning tools


Abstract
1. Introduction
2. Human interaction with integrated automation in man-machine systems
3. Knowledge-based information access by means of simulation and self-learning tools
4. Needs for future research and socio-technical development
References


Gunnar Johannsen

Abstract

Several negative side-effects of socio-technical changes in industrialized as well as in developing countries are recognized. The potential of information technology for bettering these situations is discussed, particularly with respect to improved training for acquainting people with technical systems and their practical use. Human interaction with integrated automation in man-machine systems is surveyed. This involves discussion of control and problem-solving tasks, decision-support systems, man-machine interfaces, and several stages of knowledge transfer. The main contribution of the paper deals with knowledge-based information access by means of simulation and self-learning tools. The author suggests simulated man-machine systems for training supervision and control tasks as well as computerized tools for self-learning of problem-solving tasks and for information transfer of broader man-machine systems engineering issues in a societal context. The cultural aspects of such tools as well as the needs for future research and for future socio-technical development are also emphasized.

1. Introduction

At present we are living in times of rapid socio-technical change, with certain dangers of possible instability. These changes relate, on the one hand, to

- progress in science and technology, particularly in information and communication technologies, and, on the other, to
- inequalities in the standard of living, for example, with respect to food supply, housing, health care, and mobility, as well as in individual and social freedom,
- environmental and social consequences of thoughtless technological developments,
- the need for conversion of many military and very high-risk technologies, and
- the need for trust, responsibility, tolerance, and mutual support in individual, social, and international relations.

One may come to the conclusion that during the last two to three decades, many problems on our globe have worsened. Certainly, parts of the world have attained higher standards of living, more individual and social freedom, and higher mobility than ever experienced before. However, larger parts of the world are far from reaching these standards and have even suffered deteriorations in several cases. Thus, the poor and the developing countries are sometimes lagging further behind, and they see much larger gaps between themselves and the highly developed countries than a few decades ago.

What about the industrialized countries themselves? There, technological developments have also brought about a large number of negative long-term effects. Environmental problems due to the consumption of energy and materials, traffic, and waste (radioactive, toxic, and non-toxic) have created many concerns with respect to the future quality of air, water, and soil. Other technological developments have led to social consequences such as unemployment, unreasonable distribution of work, and isolation. Traffic congestion in cities and on highways, as well as fatal accidents with cars, aircraft, power plants, and industrial processes, may remind us of our limited capabilities and understanding. Some experts have recently estimated that the delayed consequences of the reactor catastrophe at the Chernobyl nuclear power plant in 1986 will amount to between one and two million casualties over the coming years. Additional risks have only started to become visible to the general public, since the end of the Cold War between the West and East blocs of the industrialized countries makes most of the industrial-military complex obsolete on both sides. The need for conversion of institutions, systems, and materials has been recognized as a huge challenge.

It seems that the amount of clearing work to be done in the near future will be overwhelming. To distrust science and technology because of their negative side-effects will certainly not be an appropriate solution. Instead, science and technology are badly needed to perform the clearing work in an appropriate way and also to improve the standard of living - substantially in poor and developing countries, but also moderately in developed countries. Often, several degrees of development exist within a single country. Therefore, suitable strategies are required that try to avoid earlier errors when technologies and societies are developed further. In order to accomplish this enormous task for mankind in the near future of the next two to three decades, we need a spirit of trust, responsibility, tolerance, and mutual support in individual, social, and international relations. And we all need to share in the extra work to be done as well as in the financial and socio-psychological burdens. I believe that information, communication, and truthfulness, together with the adoption of a realistic long-term balance of costs and benefits, will contribute to building up this spirit and the willingness to share as a prerequisite for peaceful developments.

Information and communication technologies will therefore play an important role for humane development all over the world. However, the very many possible misuses of these technologies need to be banned internationally as much as does the violation of human rights. Each individual human being must have the possibility of access to any information he or she wishes to receive and to understand. Because of the many individual limits to understanding different kinds of information, special tools have to be available for navigating through the information.

In this paper, I will restrict myself to a special kind of tool for computer-aided information transfer and self-learning. Through a further restriction, the paper mainly concentrates on simulated man-machine systems that are to be used as information tools. They are dynamic and interactive and, thus, can demonstrate and teach how technical systems - the machines - are to be operated and maintained. Combined with regularly renewed licensing of human task responsibilities, such tools may possibly help to avoid at least some of the most fateful accidents. Before discussing these tools in more detail in section 3, the most prominent technical and cognitive aspects of man-machine systems will be introduced in section 2. The paper concludes in section 4 by pointing out the need for further work on realizing the suggested tools that now exist at best only in laboratory versions.

2. Human interaction with integrated automation in man-machine systems

2.1 Control and Problem-solving Tasks

All dynamic technical systems that are operated by one or several human beings can be viewed together with these human operators as man-machine systems. Thus, all kinds of vehicles, continuous and discrete industrial processes, power plants, robots and manipulators, business and public information systems, biomedical support systems, and many more are man-machine systems in this sense. To achieve the prescribed goals of safe and efficient operation of man-machine systems, two main categories of tasks, controlling and problem solving, have to be performed. In principle, both task categories can be performed by the human operator(s) as well as by automatic computerized systems.

The control activities comprise reaching, open-loop and closed-loop control in the narrower sense of control theory, monitoring, and lower supervisory control functions such as intervening in automated processes. On higher cognitive levels, problem-solving activities have to be performed. These include fault management, particularly fault diagnosis and correction, goal setting and hypothesis generation, planning, and the higher supervisory control functions such as teaching. The distinction between control and problem-solving tasks is especially advocated by Johannsen [11, 12]. It is a task-oriented concept, whereas the overlapping supervisory control paradigm is an interaction-oriented concept. The supervisory control paradigm consists of a hierarchical, decentralized structure with task-interactive computers at the lower level and human-interactive computer(s) at the coordinating, higher level [26]. The five generic supervisory control functions are planning, teaching, monitoring, intervening, and learning.
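To make this hierarchical structure more concrete, a minimal sketch in Python is given below. It models task-interactive computers at the lower level, coordinated by a human-interactive computer that cycles through the five generic supervisory functions. All names and the trivial planning rule are invented for illustration and are not taken from the cited references.

    # Minimal sketch of the supervisory control hierarchy [26].
    # All names and the trivial planning rule are illustrative assumptions.

    class TaskInteractiveComputer:
        """Lower level: directly controls one part of the technical process."""
        def __init__(self, name):
            self.name = name
            self.setpoint = None

        def teach(self, setpoint):          # receives local goals from above
            self.setpoint = setpoint

        def report(self):                   # status used for monitoring
            return {"name": self.name, "setpoint": self.setpoint}

    class HumanInteractiveComputer:
        """Higher level: mediates between operator and task computers."""
        def __init__(self, task_computers):
            self.task_computers = task_computers

        def plan(self, goal):
            # Trivial planning: split a global goal evenly over subsystems.
            return {tc.name: goal / len(self.task_computers)
                    for tc in self.task_computers}

        def supervise(self, goal):
            plan = self.plan(goal)                       # 1. planning
            for tc in self.task_computers:
                tc.teach(plan[tc.name])                  # 2. teaching
            status = [tc.report() for tc in self.task_computers]  # 3. monitoring
            for tc in self.task_computers:               # 4. intervening (only on failure)
                if tc.setpoint is None:
                    tc.teach(0.0)
            return status                                # 5. learning would update the plan model

    if __name__ == "__main__":
        hic = HumanInteractiveComputer([TaskInteractiveComputer("pump"),
                                        TaskInteractiveComputer("valve")])
        print(hic.supervise(goal=100.0))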

Table 1. Different levels and phases of human and automatic controlling and problem solving (after ref. 13)

Information-processing phase | State-oriented level | Context-oriented level | Structure-oriented level
Categorization | Signal (and alarm) detection, state estimation | Pattern recognition and matching, analysis of sequential observations | Situational and system identification
Planning | Fixed relation (between categorization and action) | Script selection | Plan generation and adaptation
Action | Stereotyped automatic control | Symptomatic rule application | Topographic rule application

Human and automatic controlling and problem solving show several similarities, but there are also differences between the two forms of these task categories. A further classification of the task categories is illustrated in table 1, which separates human and automatic activities into three successive information-processing phases, namely categorization, planning, and action (see also Rouse [21] and Sundström [30]). These information-processing phases can be cycled through on three distinct behavioural levels, namely the state-oriented, the context-oriented, and the structure-oriented levels, which correspond in some way to the cognitive levels of behaviour suggested by Rasmussen [18, 19].

The main domains of automation systems are the complete state-oriented level as well as at least some parts of the categorization phase of the context- and structure-oriented levels. Of course, human operators can also be engaged in these activities. However, they are particularly superior to automation systems in the planning and action phases of the context- and structure-oriented levels. These areas of table 1 correspond more to the problem-solving tasks and, thus, require higher cognitive behaviour. This is true even when prescribed plans, so-called scripts, have to be selected and rules for actions have to be applied on the basis of observed symptoms. The human operator is particularly indispensable when new plans have to be generated and rules have to be applied on the basis of topographic or completely structural relationships, because an unforeseen situation has to be dealt with.

The recent aircraft accident near Stockholm is a good example of the latter case, although it also has elements of the context-oriented level. After the engines no longer worked as the aircraft neared the ground, the pilots had only a very short time to plan and execute an emergency landing on a field just behind a forest. All passengers survived because of the excellent human performance. Where to glide down with the aircraft had to be decided on the structure-oriented level, whereas many of the subordinated guidance and control activities were probably done with some kind of script selection. These scripts were built up during previous simulator training. It is a general policy that pilots are well trained by means of simulated critical events in order to build up more automated scripted behaviour that can be reproduced much faster than the generation of new plans but will hopefully never be needed in reality.

The higher cognitive functions described in table 1 can also, at least partially, be handled with new information technologies. There are knowledge-based decision-support systems. Like human operators, these decision-support systems mainly process symbolic information about contexts and structures rather than numerical information about signals and states. Thus, some kind of signal-to-symbol transformation, and vice versa, has to be performed by the human operator or the computerized system when freely navigating through the whole plane of table 1.
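As a minimal illustration of such a signal-to-symbol transformation, the following sketch maps a numerical sensor signal onto symbolic states and a symbolic command back onto a numerical setpoint. The thresholds, state names, and setpoints are invented for the example.

    # Sketch of a signal-to-symbol transformation: numeric sensor readings
    # are mapped to symbols a knowledge-based system can reason about.
    # Thresholds, symbol names, and setpoints are illustrative assumptions.

    def signal_to_symbol(temperature_c):
        if temperature_c < 60.0:
            return "NORMAL"
        if temperature_c < 85.0:
            return "ELEVATED"
        return "ALARM"

    def symbol_to_signal(symbol):
        """Inverse direction: a symbolic command becomes a numeric setpoint."""
        setpoints = {"COOL_DOWN": 50.0, "HOLD": 70.0}
        return setpoints[symbol]

    if __name__ == "__main__":
        for reading in (42.0, 71.5, 93.2):
            print(reading, "->", signal_to_symbol(reading))
        print("COOL_DOWN ->", symbol_to_signal("COOL_DOWN"))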

Appropriate function and task allocations determine which tasks will best be performed by the human operators and which will best be assigned to the computerized systems. With an integration between traditional automation systems and the more recent information technologies of the decision-support systems, human-centred designs and dynamic forms of task allocation are possible (see, e.g., Rouse [22]).
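A simple, deliberately coarse function allocation can be read directly out of table 1, as the following sketch shows: the state-oriented level is assigned to automation, while planning and action on the higher levels remain with the human operator. The rule set is only one possible reading of the table, not a validated allocation policy; a human-centred design would adapt it dynamically.

    # Sketch of a (static, deliberately coarse) task allocation derived
    # from table 1; a human-centred design would adapt this dynamically.

    LEVELS = ("state", "context", "structure")
    PHASES = ("categorization", "planning", "action")

    def allocate(level, phase):
        if level == "state":
            return "automation"           # complete state-oriented level
        if phase == "categorization":
            return "automation (with human oversight)"
        return "human operator"           # planning/action on higher levels

    if __name__ == "__main__":
        for level in LEVELS:
            for phase in PHASES:
                print(f"{level:9s} / {phase:14s} -> {allocate(level, phase)}")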

2.2 Integration between Traditional Automation and Decision-Support Systems

Human-centred designs of integrated automation will lead to improved man-machine systems in the very near future. These prospects have opened up because of knowledge-based information technologies. The term integrated automation means that traditional automation systems and knowledge-based decision-support systems cooperate closely in a suitable way. The criteria for what is suitable, however, need to be defined and require the human-centred approach that is also part of the integrated automation concept.

As elaborated in more detail by Johannsen [11], an extended Operator (User) Interface Management System architecture for dynamic technical systems allows the description of the integrated automation concept with a number of separate levels (see also Alty and Johannsen [2]). Figure 1 shows these levels of the UIMS architecture. The features of man-machine interfaces will be reviewed in the next subsection. Here I discuss the different levels of the technical system as shown in figure 1. Information selector 1 processes numerical information, whereas information selector 2 transforms signals into symbols for the information processing at the decision-support level. This level cooperates with the traditional automation of the supervision and control level.

Decision-support systems can be subdivided into those that are more application oriented and those that are more human operator oriented. In figure 1, they are called application model support and operator model support, respectively. The application model support systems deal with situations in the technical process and the supervision and control system. The information that is processed for these situations is then communicated to the human operator as a decision support. Operators will normally perform a dialogue with such decision-support systems in order to satisfy their information needs. These application model support systems can also be viewed with respect to the behavioural levels as suggested in table 1. Examples for these three levels are the decision support for heuristic control [17], for fault diagnosis (e.g., Borndorff-Eccarius [3]), and for plant management; see Johannsen [11] for more details.

Similarly, the operator model support systems deal more directly with human operators' behaviour and help them to perform as well as possible. Again, three behavioural levels can be distinguished. Examples are human error evaluation (e.g., Hessler [9]), plan or intent recognition (e.g., Rubin et al. [24]), and procedural support (e.g., Sundström [30]); more details are given in Johannsen [11].

A mathematical framework for the interaction between traditional automation and decision-support systems is supplied by Johannsen [11] and explained with four case-studies. This mathematical framework requires much more precision and detail, which can only become available after considerable further multidisciplinary research.

2.3 Man-Machine Interfaces with Graphical and Dialogue Support

The presentation level and the dialogue level describe the two distinct aspects of any man-machine interface, as shown in figure 1. The presentation level concerns the form of the information and, thus, includes the displays and controls, which are more and more often computer graphics screens and computerized control input systems, as well as some knowledge-based graphics support modules. The dialogue system is a kind of central intermediary between the human operator(s) and all levels of the whole technical system, including the decision-support systems, and deals with the contents of the information flows in the man-machine communication. Loose ends of dialogue can occur, particularly in critical situations, when an urgent request from one subsystem of the technical system interrupts another subsystem's interaction with the human operator [2]. Handling these loose ends of dialogue can be supported by small-scale knowledge-based dialogue assistants for the interaction with the different subsystems of the technical system.
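The handling of such loose ends can be pictured as a small dialogue manager that suspends an interrupted interaction and resumes it later. The following sketch uses a priority queue for the suspended dialogues; the subsystem names and the priority scheme are invented.

    # Sketch of a dialogue manager that keeps "loose ends": when an urgent
    # request from one subsystem interrupts an ongoing dialogue, the current
    # dialogue is suspended and resumed later. Names/priorities are invented;
    # lower numbers mean higher urgency.

    import heapq

    class DialogueManager:
        def __init__(self):
            self.pending = []        # priority queue of (priority, dialogue)
            self.current = None

        def request(self, priority, dialogue):
            if self.current and priority < self.current[0]:
                heapq.heappush(self.pending, self.current)   # loose end
                self.current = (priority, dialogue)
            elif self.current is None:
                self.current = (priority, dialogue)
            else:
                heapq.heappush(self.pending, (priority, dialogue))

        def finish_current(self):
            self.current = heapq.heappop(self.pending) if self.pending else None

    if __name__ == "__main__":
        dm = DialogueManager()
        dm.request(5, "routine checklist with operator")
        dm.request(1, "URGENT: turbine vibration alarm")  # interrupts checklist
        print("active:", dm.current)
        dm.finish_current()
        print("resumed loose end:", dm.current)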

Figure 1 Extended Operator (User) Interface Management System structure for dynamic technical systems (after Johannsen, 1992)

The visual display units with colour graphics screens allow flexible and task-adapted forms of presentation. Many options for picture design exist, but the multitude of alternatives can also be misused if designers lack sufficient knowledge about ergonomics and the information needs of human operators [10]. Dynamic shifts of information needs in the problem space spanned by the two dimensions of degree of detail and level of abstraction must be taken into account even more with advanced information display concepts, as already suggested by Goodstein [8]. The design of graphics support can lead to knowledge-based graphical systems that dynamically generate new picture contents by using knowledge of the application, of the operator model based on information search needs, of graphical presentation techniques, and of dialogue techniques. Such concepts require an integration of computer graphics and knowledge-based technologies [5, 6]. A similar approach for knowledge-based designer support and intelligent man-machine interfaces for process control was suggested by Tendjaoui, Kolski, and Millot [32]. The cognitive engineering approach must also be applied here to answer the question of how to present information to the human operator(s) and to design appropriate graphical decision support for the goal-directed, spatial visualization of task-meaningful units and their relations [34]. The search for suitable interaction media has, among other things, to consider recent results of multimedia research [1].
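The two-dimensional problem space of degree of detail and level of abstraction can be made concrete with a small lookup that selects picture contents for a given operator situation, as in the following sketch; the example contents are invented placeholders.

    # Sketch: selecting picture content in the two-dimensional problem
    # space of (level of abstraction, degree of detail) after Goodstein [8].
    # The example contents are invented placeholders.

    CONTENT = {
        ("physical", "overview"):   "plant mimic diagram",
        ("physical", "detail"):     "component drawing with sensor values",
        ("functional", "overview"): "mass/energy flow diagram",
        ("functional", "detail"):   "control-loop diagram with setpoints",
    }

    def select_display(abstraction, detail):
        return CONTENT[(abstraction, detail)]

    if __name__ == "__main__":
        # During fault diagnosis the operator typically shifts towards
        # functional, detailed views:
        print(select_display("functional", "detail"))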

Advanced man-machine interfaces that integrate knowledge-based task and operator models as decision-support systems with presentation and dialogue submodules were particularly investigated by Tendjaoui, Kolski, and Millot [32] and by Sundström [30], and reviewed by Johannsen [11]. Application and process knowledge as well as ergonomic knowledge and knowledge about the information search needs of the human operator(s) in different operational situations determine which information needs to be displayed, when, and how. In principle, cooperative interactions between several human operators and the knowledge-based decision-support systems are possible. However, the state of the technology has not yet advanced to the level of rigorous industrial applications.

2.4 Transformation and Use of Knowledge by Knowledge Engineers, Designers, and Operators

Human operators do not use only their own knowledge in interactions within man-machine systems. In addition, a lot of knowledge is available to them directly or indirectly through the information acquisition system, the supervision and control system, and the decision-support systems. Much of this knowledge has been supplied by the designers of these systems. The maintenance personnel find themselves in a similar situation; however, the decision-support systems, as well as the man-machine interfaces designed for them, may be different from those for the human operators. This is indicated in the lower part of figure 2. The figure also shows that in addition, a tutoring system may have been designed to teach novices how to operate or to maintain the whole technical system.

Figure 2 Stages of knowledge transfer and man-machine interaction for knowledge engineers, designers, operators, and maintenance personnel

The designer creates knowledge support implemented in the different subsystems for the human operators and the maintenance personnel. Similarly, the designers themselves can be assisted by appropriate decision-support systems or, in the case of novices, by a tutoring system. This too is shown in figure 2, as well as the fact that the man-machine interface for the designer is normally completely different from those for the operator and the maintenance personnel. As mentioned above, the designers' decision-support systems may include an application model with knowledge and data about the technical process and its supervision and control system; an operator model will be provided with knowledge about the information search needs of the human operator as well as ergonomic and design procedural knowledge. The problem-solving strategies are mainly left with the designers because they determine the creative part of any design.

As is clear from figure 2, the knowledge contained in the designers' decision-support systems also needs to be generated. This is done in the knowledge acquisition process by the knowledge engineer and/or the domain experts, who supply their technological and operational know-how for the conceptualization and implementation of the decision-support systems. Thus, the whole process of knowledge and information transfer as shown in figure 2 starts with the know-how of some experts and is carried through successive stages, with their respective interactions, by knowledge engineers, designers, and operators or maintenance personnel. Human errors that are made at any of the earlier stages may be propagated to later stages and, finally, may adversely influence the operation of the man-machine system. Reason [20] pointed out, for example, that the human errors of designers can sometimes be compensated for by the human operators but may also be the main contributing factor in turning a certain critical situation into a catastrophe. Product liability regulations are more and more concerned with these aspects. Errors in early stages may eventually have severe consequences, which calls for careful avoidance and correction strategies as well as for training towards responsibility.

3. Knowledge-based information access by means of simulation and self-learning tools

3.1 Training Supervision and Control Tasks with Simulated Man-Machine Systems

Simulators of cars, trucks, aircraft, ships, power plants, chemical plants, and other technical processes are valuable tools for research and development as well as for training. They have a long tradition in aeronautics and astronautics, but nowadays are more often used in other application domains. The appropriate simulation fidelity is an important technical, psychological, and economic issue. Often, part-task simulators are sufficient, particularly for research. The simulation fidelity needs to be higher when professionals are to be trained in how to deal with difficult task situations. On the other hand, cheap simulators would be sufficient for less difficult tasks such as driving a car. A new technology for such inexpensive simulators for regular refresher training of drivers is needed, in my opinion, as one countermeasure to the far too many traffic accidents.

Simulated man-machine systems will be a technology that goes beyond the use of simulators. Here, the human operator's behaviour in supervision and control tasks is also simulated. This requires fairly accurate human performance models as well as models of mental workload. As Stassen et al. [29] summarized, such models are available for lower cognitive levels of human behaviour, such as those required in manual control, failure detection, and monitoring. The use of these models in closed loops with the technical system leads to completely simulated man-machine systems. They can demonstrate in real time to anyone interested how good or bad human interaction with a specific technical system may be. For training purposes, play-back capabilities for repeating any situation of interest, human interaction facilities with the possibility of comparing the trainee's behaviour and performance with the ideal or deficient simulated behaviour and performance, and explanations in separate text and graphics windows are necessary. Particularly with explanation facilities, such a simulated man-machine system becomes a tutoring system or self-learning tool for the training of supervision and control tasks for specific technical systems.
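A completely simulated man-machine system can be sketched in a few lines: a crude human operator model, here simply a gain with a reaction delay, loosely in the spirit of the quasi-linear models surveyed by Stassen et al. [29] but not any specific published model, closes the loop around a first-order technical process, and every state is logged for play-back. All parameter values are invented for illustration.

    # Sketch of a completely simulated man-machine system: a crude human
    # operator model (pure gain plus reaction delay) tracks a setpoint on a
    # first-order process. All parameters are illustrative, not validated.

    DT = 0.1            # simulation step [s]
    DELAY_STEPS = 3     # operator reaction delay (0.3 s)
    GAIN = 1.5          # operator control gain
    TAU = 2.0           # process time constant [s]

    def simulate(setpoint=1.0, steps=100):
        x, log, errors = 0.0, [], [0.0] * DELAY_STEPS
        for _ in range(steps):
            errors.append(setpoint - x)
            u = GAIN * errors.pop(0)          # delayed human control action
            x += DT * (-x + u) / TAU          # first-order process dynamics
            log.append((x, u))                # stored for play-back
        return log

    if __name__ == "__main__":
        log = simulate()
        print("final state: %.3f" % log[-1][0])
        # Play-back facility: any stored situation can be replayed stepwise.
        for x, u in log[:5]:
            print("x=%.3f u=%.3f" % (x, u))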

The three cognitive levels of human behaviour suggested by Rasmussen [19], namely skill-based, rule-based, and knowledge-based behaviour, also have to be addressed appropriately during any training process. As Sheridan [25] proposed, computer aiding has to supply different support and needs different forms of interaction at these three levels. Demonstrations seem to be appropriate at the lowest level, whereas rules and advice are called for, respectively, at the two higher levels. A simple example may illustrate this. We assume a simulated man-machine system for training in car-driving tasks and select a typical driving situation on a motorway. This situation may be characterized by driving into and through a construction zone with narrower lanes and dense oncoming traffic. The demonstrations may show the outside view in real time for different selected speeds and may exemplify certain critical situations or even accidents. Rules explaining the reasons behind the speed limits and the behavioural choices in critical situations may be shown in a separate window on the computer screen. This window needs to be visible while the simulated outside view is observed on the same colour graphics screen, but it must not impair the main driving task. The explanations may be further supported by additional demonstrations. Only after the online demonstrations are terminated is the knowledge-based advice given. The human operator can take as much time as needed to interrogate the computer support in an interactive dialogue fashion. In this way, knowledge-based behaviour can be trained and, through training, becomes rule-based behaviour.
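The level-dependent computer aiding proposed by Sheridan [25] can be condensed into a tiny dispatch, as the following sketch shows: the training tool selects demonstrations, rules, or interactive advice depending on the behavioural level being trained. The rule and advice texts for the construction-zone example are invented.

    # Sketch: level-dependent training support after Sheridan [25].
    # Demonstration at the skill-based level, rules at the rule-based level,
    # interactive advice at the knowledge-based level. Texts are invented.

    SUPPORT = {
        "skill":     lambda: "run real-time demonstration of construction zone",
        "rule":      lambda: "show rule window: 'narrow lanes -> max 60 km/h'",
        "knowledge": lambda: "open dialogue: trainee interrogates explanations",
    }

    def training_support(level):
        return SUPPORT[level]()

    if __name__ == "__main__":
        for level in ("skill", "rule", "knowledge"):
            print(level, "->", training_support(level))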

Time pressure in the latter training phase and, generally, on the knowledge-based behavioural level is dangerous because it can even lead the human operator to fall back on the skill-based behavioural level with, then, a high risk of human errors [4]. The very last phase of the Chernobyl nuclear power plant accident can be explained by the choice under dramatic time pressure of the wrong skill-based behaviour instead of the necessary knowledge-based behaviour [16]. It may even be possible to demonstrate such human errors online in the simulation by freezing certain situations and by giving additional explanations for the wrong behaviour and for the causes behind it.

3.2 Self-learning Tools for Problem-solving Tasks

Not only supervision and control tasks but also the higher cognitive problem-solving tasks need to be trained appropriately. Unfortunately, our research results, particularly with respect to modelling problem-solving tasks, are not as well developed as our knowledge about control tasks [15]. It is even arguable whether it will ever be possible to model higher cognitive human functioning. Human creativity is a major ingredient in these functions and is mainly responsible for keeping the human being in the man-machine system. Planning tasks have not as yet been modelled sufficiently. Some models exist for human fault diagnosis tasks [23]. The more important research results contribute evaluations and concepts for experimental fault-diagnosis situations. Also, knowledge-based decision-support systems as computer aids for fault diagnosis were developed, as mentioned above (see also Johannsen and Alty [14]; Tzafestas [33]).

The idea here is to use and to integrate concepts, experimental results, models, and computer support systems for developing tools that allow self-learning and the information transfer of problem-solving tasks. The combination of such tools with the on-line simulation tools described in the preceding subsection may further enhance the quality of the training system. It must be possible for human beings anywhere in the world to sit by themselves in a quiet place in order to learn, or just to understand basically, how a particular technical system can be managed and operated, and how problems with this system can be solved once they occur. A reasonably large personal computer should be sufficient for such self-learning and information transfer purposes. Only the recent hardware revolution makes it feasible to supply enough computer power for such new training endeavours, even in geographically very remote places and in fairly limited economic situations. In the latter cases, access to the computer tools has to be organized for shared use, in a way similar to the use of public library facilities.

Two off-line advisory systems for training maintenance personnel in fault-diagnosis capabilities will be described here as existing examples for self-learning tools in problem-solving tasks. These two systems are FAULT (Framework for Aiding the Understanding of Logical Trouble-shooting), developed by Rouse and Hunt [23] and their colleagues over several years, and ADVISOR (Advanced Video Instructor), developed by Tanaka and others [31].

The original idea of FAULT was to investigate whether generalizable capabilities for fault diagnosis exist in human beings. Therefore, an early experiment was designed with the very abstract task of diagnosing a single fault in a complex network of nodes. In a second, more realistic example, the faulty components had to be identified in an electrical network with AND and OR gates. The connections between the components also included feedbacks. Both examples are suitable for training human operators in fairly general, context-free skills. With the FAULT system, a series of further experiments was performed that were context-specific, as in real-life tasks. The chosen networks were now schematics of car, aircraft, and ship engines with their main subsystems and components. The human subjects started the fault diagnosis with rather general symptoms (e.g. the engine will not start). Then they had to check gauges, ask for information about specific components, make observations, or remove components for testing. The symptom, the status of all available gauges, and the selected actions with their costs were displayed in different windows on the computer screen. Several forms of computer aiding based on models of human problem solving were used to assist the maintenance personnel in their troubleshooting tasks. The results of the experiments showed that human problem solving, which depends on how well the human operators understand the problems, is not optimal; see Rouse and Hunt [23]. There seems to be a trend towards context-dominated problem-solving behaviour. This is, however, not completely context-specific, as can be seen from human beings' abilities to make transitions into unfamiliar problems and to cope with ill-defined and ambiguous situations.
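The flavour of such troubleshooting experiments can be conveyed with a minimal sketch: one faulty component hides in a series chain, every test has a cost, and a half-split strategy localizes the fault with few tests. This is only an idealized stand-in for the FAULT networks, with invented component counts and costs.

    # Sketch of a FAULT-like troubleshooting task: one faulty component in a
    # series chain; a test at position i reports whether the signal is still
    # good after component i. The half-split strategy minimizes the number
    # of tests. Component counts and test costs are invented.

    def half_split_diagnosis(n_components, faulty, test_cost=1.0):
        lo, hi, cost = 0, n_components - 1, 0.0
        while lo < hi:
            mid = (lo + hi) // 2
            cost += test_cost                   # bookkeeping: accumulate cost
            if faulty <= mid:                   # signal already bad at mid
                hi = mid
            else:
                lo = mid + 1
        return lo, cost

    if __name__ == "__main__":
        found, cost = half_split_diagnosis(n_components=16, faulty=11)
        print("faulty component:", found, "total test cost:", cost)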

The recognized human cognitive limitations in problem-solving tasks can be overcome by computer-aiding systems based on problem-solving performance models. Simpler computer aids may just be structure-oriented bookkeeping tools. With FAULT, it was shown that the bookkeeping methods consistently improved human performance, even after later transfer to unaided problem solving. FAULT was also used as a self-learning tool for maintenance personnel of ship engines. Like a tutor, the tool guided the human subjects through different knowledge-acquisition stages and problem-solving phases. Nowadays, similar tools are used in airlines for individual self-learning and for quick prescribed testing of the required knowledge status at freely selected assessment times. These tools supplement the use of aircraft simulators in pilot training.

The ADVISOR system is another learning environment for maintenance [31]. It was developed jointly by the Tokyo Electric Power Corporation and Mitsubishi Electric Corporation. An interface-centred design approach was chosen, in which multimedia techniques were also used. The system has some pedagogical interfaces to enhance the understanding of novices and to help them manage troubles and make appropriate guesses. Useful information and suggestions are provided in a format best adapted to the current understanding of the human subject, although the system does not provide a perfect model of human problem-solving performance. A prototype was developed for the maintenance of gas-insulated switchgear in the power industry.

ADVISOR is implemented on a workstation, a personal computer, a videodisc, and a TV display. A real TV picture of the relevant views of and into the equipment in the maintenance situation is shown on the TV display, with superimposed text windows for explanations and touch-sensitive virtual keys for interrogations. On the neighbouring colour graphics display of the workstation, learning from examples can be supported by explaining what the cause-consequence structures for a number of alarms look like in a multiple-fault situation. Thus, the novice can learn the principles behind the examples. Also, the pedagogical interfaces can be shown in several windows on the display of the workstation as supports for recognizing the current status in the whole learning space as well as the significance of each step, and the intention behind it, in the maintenance sequence. It is not known to this author whether the evaluation of the ADVISOR system has been completed, nor what kind of results were achieved.
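The kind of cause-consequence reasoning presented to the novice can be sketched as a small search from observed alarms back to candidate faults, preferring simple explanations. The example graph below is invented and not taken from the ADVISOR prototype.

    # Sketch of cause-consequence reasoning over alarms: given a mapping of
    # faults to the alarms they would raise, find the candidate faults that
    # explain all observed alarms. The example graph is invented.

    CAUSES = {                      # fault -> alarms it would raise
        "SF6 leak":        {"low gas pressure", "density alarm"},
        "contact wear":    {"overtemperature", "partial discharge"},
        "control failure": {"density alarm", "spurious trip"},
    }

    def candidate_faults(observed_alarms):
        observed = set(observed_alarms)
        # Prefer single-fault explanations, then fault pairs.
        singles = [f for f, alarms in CAUSES.items() if observed <= alarms]
        if singles:
            return singles
        return [{f1, f2} for f1 in CAUSES for f2 in CAUSES
                if f1 < f2 and observed <= CAUSES[f1] | CAUSES[f2]]

    if __name__ == "__main__":
        print(candidate_faults({"low gas pressure", "density alarm"}))
        print(candidate_faults({"density alarm", "overtemperature",
                                "spurious trip"}))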

3.3 Self-learning Tools for Understanding Man-Machine Systems Engineering

Any man-machine system is a part or subset of a larger system with broader boundaries in the systems engineering sense. This has already been shown in figure 2 with the relationships between knowledge engineering, designing, operation, and maintenance; several man-machine systems are involved in this case. Nowadays, even broader contexts and perspectives need to be understood by the different human agents dealing with man-machine systems; these include political decision makers, managers, designers, and operators. Different levels of understanding in different contexts are required. For example, environmental and economic issues, matters of legislation and liability, dependencies within a logistics network and on market developments, and many more technical and non-technical influences are to be considered in a real-life systems engineering approach towards decision support, design, and operation (see figure 3).

If knowledge about the needs and the functions of safe systems operation and careful problem solving in man-machine systems is better understood all over the world, then people will probably be more concerned about the selection and the proper use of many technical systems in their societies. The current challenges of technology transfer into developing countries and of conversion with risky systems in the industrialized countries, as mentioned already in the introductory section of this paper, may be better and more rapidly understood; possibly more competent opinions might be expressed if self-learning and information transfer tools were already available. Certainly a particularly strong need for such tools exists among high-level decision makers and public opinion moderators, namely among politicians, leaders of big and medium-sized companies and banks, and journalists (see figure 3). Their knowledge is often too limited and their time to learn something new very restricted, but they nevertheless have to make decisions quickly and as correctly as possible or to comment on actual events almost immediately. The suggested computerized tools for self-learning and information transfer will certainly complement libraries and more traditional information retrieval systems.

In a way similar to that of the prototypes mentioned in the preceding subsection, namely FAULT and ADVISOR, self-learning tools for understanding the broader issues of human interaction with technical systems in a societal context have to draw on a combination of decision-support systems, appropriate multimedia presentation and dialogue techniques in advanced man-machine interfaces, and explanation and justification facilities. The design approach for such tools has to start from requirements and cognitive task analyses that lead to the different decision-support functions needed for the different types of later users of the tools. The explanation and justification facilities, as well as the advanced man-machine interfaces, have to be adapted to the different information transfer and learning needs that are to be satisfied by the different decision-support systems.

Figure 3 Understanding man-machine systems engineering

3.4 Cultural Aspects of Using Simulation and Self-learning Tools

Simulation and self-learning tools, as mentioned in subsections 3.1 through 3.3, have to be used in free societies by different kinds of people with different kinds of education, different cognitive abilities for understanding systems, functions, and contexts, and different kinds of responsibilities. Thus, one may say that different cultural patterns for approaching problems and interacting with socio-technical systems exist even within one particular society. Sometimes the differences between two distinct views of the same system are cultivated unnecessarily, as often occurs with the distinction Snow [28] drew between the two cultures of science, namely one for the human and social sciences and one for the natural and engineering sciences. Bridging the gaps between all kinds of possible views seems to be more and more important for understanding our world and for contributing to a peaceful future.

These gaps between cultures can be much larger when we look at the differences between societies that developed quite independently of each other over many centuries. Take the Japanese and the German cultures as examples that show a completely distinct dimension of cultural differences from those within either of these two societies. Sheridan et al. [27] dealt with many facets of these cultural aspects.

Although the gaps between different cultures, individual as well as societal, have become much smaller through the influence of many technologies, particularly the information technologies, many unique traditions and language barriers remain. They have to be accepted and even preserved with patience, goodwill, and historical sense. However, they also need to be overcome mentally in order to allow worldwide cooperation, as well as the exchange of ideas and products, in a successful and mutually rewarding manner. The ideas and technologies of one culture have to be appropriately adapted to another culture if a high degree of user acceptance is to be reached. This requires sensitive tutoring for systems designers in the delivering culture as well as dedicated training for end-users in the receiving culture. Thus, two types of simulation and self-learning tools may well be suitable. For the designers, one type can aid the understanding of other cultures. For training purposes, the other type can support understanding of the culturally adapted technology. Multilingual explanation facilities, where appropriate and necessary, seem to be important aids to enhancing understanding for tools that have to take cultural differences into account. The use of voice output and input within a multimedia environment may even allow illiterate users to become acquainted with new technical systems by means of the suggested tools.

The need to investigate and to consider cultural aspects in the design and use of technologies, in particular automation and information technologies, has only recently been recognized, including by at least some engineers and computer scientists. In the International Federation of Automatic Control (IFAC), for example, a Working Group on Cultural Aspects of Automation was formed in 1990. This group organized its first workshop in October 1991 [7].

