
Command and Control


JIRI NEHNEVAJSA
Professor of Sociology, University of Pittsburgh


I will not attempt to analyze concrete operational experiences in the area of command and control systems. Such an evaluation calls for data on distributions of performances relative to system performance criteria.

Even if available, these data would not be altogether appropriate for a presentation at a general conference of this type.

As a consequence, this paper will be limited to the consideration of certain problems which I consider particularly salient in terms of all, or at least many, command and control systems. These are problems which have significant bearing upon the behavior of the operational system but are not, at the same time, identical with what might be viewed as specific operational experiences.

Furthermore, I propose merely to highlight some of these problems without subjecting them to the detailed analytical scrutiny which each singly may well deserve.

The concept of control implies a capability to monitor an on-going situation and to compare its properties with the characteristics of some corresponding intended state of affairs. This involves, of necessity, some effort at predicting the probable course of events over an appropriate time horizon.

The notion of command, in turn, implies that information on the relation between actual and intended situations and processes permits an evaluation which leads to the determination of appropriate courses of action. It also means a capacity to communicate decisions to those who are expected to execute them, as well as to those whose own actions will be affected by the decision in any significant manner. The command concept also entails the idea that the execution of a decision, as well as its effects, come to be monitored, and that the nature of the feedback leads either to reinforcing the initial choice or toward a reassessment and a new decision.
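The monitor-compare-decide-act cycle just described can be sketched as a simple loop. Everything concrete here, the variable names, the tolerance value, and the two situation descriptors, is a hypothetical illustration of mine, not anything specified in the paper:

```python
# Illustrative sketch of the control/command cycle described above.
# All names and values are invented to make the loop concrete.

def control_step(actual, intended, tolerance):
    """Control: compare the monitored situation with the intended one."""
    discrepancy = {k: abs(actual[k] - intended[k]) for k in intended}
    return {k: d for k, d in discrepancy.items() if d > tolerance}

def command_step(violations):
    """Command: turn detected discrepancies into a course of action."""
    if not violations:
        return "reinforce current course"   # feedback confirms the choice
    return f"reassess and redecide on {sorted(violations)}"

actual   = {"readiness": 0.70, "coverage": 0.95}
intended = {"readiness": 0.90, "coverage": 0.95}
print(command_step(control_step(actual, intended, tolerance=0.05)))
```

The point of the sketch is only the division of labor: the control step detects discrepancy, the command step converts it into reinforcement or reassessment.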

A few thoughts now about issues associated with the control functions of the systems.

The intended situation is generally some plan. On one end of the spectrum, this may be a war plan providing for patterns of force deployment under varieties of likely circumstances and for usually several alternative objectives. On the other end of the spectrum, this may be a plan implicit in any specific decision in that its objective, too, is to produce some desired state of affairs or to prevent some unwanted system state from occurring.

The difference is one of levels of complexity. But it is far more than that at the same time. One kind of plan refers to an environment which as yet does not exist. Another one is responsive to the here-and-now in a more direct manner. In military systems, of course, the interaction of these issues is quite direct and quite crucial. At any given time there exists some range of intended or desirable situations which ought to prevail right now to make for optimal transition to the nonexisting war environment, should it become realized in the next moment. Thus, one set of situational control functions is instrumental to major future objectives.

Now, data pertaining to the characteristics of a given intended state of affairs may be provided in varying levels of detail. Generally, the greater the level of detail and specificity in the definition of the situation that ought to prevail, the greater the likelihood that in some manner the actual situation will deviate from the model. Conversely, the more generalized the form in which plans are provided, the greater the likelihood that potentially serious discrepancies between plan and reality will go undetected, with severely degrading effects upon the system as a whole. How to strike a balance remains unsolved unless one is willing to accept diffuse user satisfaction or dissatisfaction as the main criterion.

Similarly, it is not altogether obvious whether plans as profiles of intended situations and processes are preferably generated within a given command and control system, or whether they are better viewed as an input into the system which could come from any appropriate source as a fait accompli. The former approach taxes the system heavily in that it must also involve complete planning capabilities. The latter approach alters the fabric of authority, at least at the highest levels of the organizational hierarchy, in that certain accustomed discretionary powers simply disappear.

In any event, no plans can genuinely provide for all contingencies, so that situational and on-the-spot replanning must be almost assumed as the rule rather than an exception. Replanning and planning, of course, are the same processes but viewed from a different point of departure.

The problem of off-line and on-line activities (and their interaction) becomes quite fascinating.

An actual situation keeps changing. Furthermore, the variables which are used to describe the on-going situation change at different rates and with dissimilar predictability. There is some time delay, no matter how apparently trivial, between acquisition of data by sensors and its generation in the form of a usable output. The profile of an actual situation at any given time has, therefore, two important and limiting characteristics: for one, it refers to some past situation in any case and not to the situation of the moment. Secondly, the individual descriptors of this actual situation are of varying obsolescence because of their different rates of change, different modes of acquisition and processing. The implications of this problem have really not been studied, and my suggesting it here as a serious problem does not prejudge the alternative outcome of appropriate studies. But time-tagging of information items has not been attempted on the whole in any systematic manner, nor do we know how this relates to the confidence which a decision-maker has in the information at his disposal.
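The time-tagging idea raised above can be made concrete: each descriptor of the situation carries its acquisition time and its own characteristic rate of change, so its staleness, and hence the confidence it deserves, can be computed per item. This is a sketch under assumptions of mine; in particular, the exponential-decay confidence model and all field names are invented for illustration:

```python
# Hypothetical sketch of time-tagged information items. Each descriptor
# records when it was acquired and how quickly it goes stale; confidence
# decays with age relative to that rate. The exponential model is an
# illustrative assumption, not something proposed in the text.
import math
from dataclasses import dataclass

@dataclass
class TaggedItem:
    value: float
    acquired_at: float   # seconds, on some common system clock
    change_rate: float   # characteristic time over which the value drifts

    def confidence(self, now: float) -> float:
        """1.0 when fresh, decaying toward 0 as age exceeds change_rate."""
        age = now - self.acquired_at
        return math.exp(-age / self.change_rate)

# Two descriptors of the "same" situation, of very different obsolescence:
fast = TaggedItem(value=12.0, acquired_at=0.0, change_rate=10.0)    # volatile
slow = TaggedItem(value=3.5,  acquired_at=0.0, change_rate=3600.0)  # near-static

now = 30.0
print(f"fast item confidence: {fast.confidence(now):.2f}")  # heavily aged
print(f"slow item confidence: {slow.confidence(now):.2f}")  # still fresh
```

At the same instant, the two items warrant very different confidence, which is exactly the "varying obsolescence" problem: a single timestamp on the whole situation profile would hide it.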

A discrepancy between the actual and intended state of affairs signifies some system problem. One issue along these lines has to do with the relative magnitude of deviation between intended and actual values which can be detected, given the system's modes of data acquisition, and the magnitude which can be processed as a function of equipment capabilities.

This is largely a technical problem.

The second issue has to do with some threshold magnitude of discrepancy which establishes a boundary between tolerable and no-longer-tolerable departures of the actual from the desired state of affairs. This, in turn, is chiefly a policy problem.

The third issue has to do with the possibility, or better yet the fact, that cumulative effects of otherwise tolerable discrepancies may not be tolerable. The criteria for making such choices seem lacking at the moment.

The last issue along these lines has to do with the possibility that joint effects of otherwise singly tolerable discrepancies may not be tolerable.

The criteria both for design and operations choices are largely lacking at the moment.
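The distinctions just drawn can be restated as three separate tests on a stream of discrepancies: a per-item tolerance, a cumulative bound over time, and a joint bound across simultaneous deviations. The threshold values below are arbitrary placeholders; the author's point is precisely that principled criteria for choosing them are lacking:

```python
# Sketch of the three discrepancy tests distinguished above. Every
# threshold is an arbitrary placeholder chosen for illustration.

def singly_tolerable(d, limit=1.0):
    """Second issue: is one discrepancy within the policy threshold?"""
    return abs(d) <= limit

def cumulatively_tolerable(history, limit=5.0):
    """Third issue: do otherwise tolerable discrepancies accumulate too far?"""
    return sum(abs(d) for d in history) <= limit

def jointly_tolerable(simultaneous, limit=2.0):
    """Fourth issue: are several simultaneous, singly tolerable
    discrepancies acceptable taken together?"""
    return sum(abs(d) for d in simultaneous) <= limit

history = [0.8, 0.9, 0.7, 0.9, 0.9, 0.9]          # each singly tolerable
print(all(singly_tolerable(d) for d in history))   # True
print(cumulatively_tolerable(history))             # False: they add up past 5
```

A system that checks only the per-item threshold would report no problem in this example, which is exactly the failure mode the third and fourth issues describe.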

Before I mention some of the overall system problems, a few remarks more specific to the command function seem appropriate.

A discrepancy which constitutes a system problem can be resolved either by altering the nature of the actual situation or by modifying the specifications of the intended state of affairs or by both to some extent.

The main issue has to do with the determination of the conditions under which it is necessary or preferable to seek to alter the actual state of affairs and bring it into harmony with the intended state, and those circumstances under which it becomes necessary or preferable to adapt the characteristics of the intended to the actual situation.

Generally, command and control systems lack the capability to provide data on projections of the most probable consequences of a given decision before it is firmed up, communicated, and its execution begun. Some such testing can be accomplished in simulated environments, but it raises the most serious methodological questions as to sampling of decisions, circumstances, and decision-makers to yield some confidence in the generalizability of the results to actual operating environments.

Indeed, it would seem at least theoretically possible to develop system capabilities to identify decision options appropriate for a given situation, to identify the probable immediate consequences of each alternative choice, and to identify the probable longer run consequences of each choice. But this raises the most serious question as to whether there would be anything left for the human decision-maker to decide.

I am not prepared to argue altogether that this may be undesirable under all circumstances. Yet even this is a more complicated problem than one concerning the role of men in the total process, or one that simply concerns the efficiency of allocating various functions to machines and others to men. The point I am willing to make, however, is somewhat as follows: even if feasible, computerized decision-making per se is not really quite computerized. What happens is simply a drastic redefinition as to who makes the decisions, and thus a revolutionary modification in existing patterns of authority. In effect, a data-processing specialist or a programmer will make a set of permanent decisions in the place of a decision-maker normally expected to make them.

This may be an improvement or not. But in any event, the importance of this shift cannot be overemphasized, and its implications certainly must not be overlooked. This is underscored by the tentative observation that much less attention is paid to the training of programmers in anything but programming than the corresponding attention which goes into processes whereby our society elevates certain men into significant decision-making roles. And I will be the last one to underestimate the centrality of the decisions which are made quite routinely by programmers of even very low professional calibre.

To argue that the decision-maker can control what is being done on his behalf seems to me somewhat unrealistic. For one, there are individual styles of decision-making, and these are not as readily transferable from person to person as are occupancies of various positions and roles in our social system. Secondly, we know very well that decision-makers may be unable to verbalize, or to verbalize in a manner directly understandable to the data-processing specialist, the criteria which actually guide them in using information and in reaching conclusions on the basis of it. Thirdly, in complex systems we are speaking of hundreds of thousands of programming instructions generated in segments and subsegments by whole teams of data-processing specialists. It does not seem possible to comprehend all this very adequately, any more than it seems likely that given decision-makers could effectively channel the development of these enormous information-handling systems.

Command and control systems are complex. They are also significantly real-time systems. They are expensive to design, install, maintain, and operate. They are expensive to modify, and despite the fetish made of flexibility, often too rigid to permit even small fixes without major effort.

Some consequences flow from these simple observations. First, the complexity tends to be so staggering that the system user must continue relying on the system designer throughout the life-cycle of the system, except for routine utilization. This is not implied as a critique. Rather, I am suggesting that this signifies the arrival of new partnerships, and the necessity for these partnerships might as well be recognized at the outset.

There is, I firmly believe, no such thing as the system user taking over a complex command and control system as a terminal package. The marriage of system user and system designer continues, and this might as well become an aspect of system planning.

Nor is it quite feasible for the system user to be his own designer. In theory this sounds perhaps plausible. In reality, some system is in existence which the user is quite busy employing on an on-going basis right now. He cannot suspend his operational responsibilities of today while developing a system for tomorrow. And I daresay that he cannot do both.

The cost associated with command and control systems is still another matter. It amounts to commitment. This tends to mean that once a development program is initiated, there are sufficient emotional, political, and other reasons to see it through even if alternative systems or alternative configurations became available. This holds above all in the area of equipment procurement, and the problem is accentuated by the fact that far too often equipment is acquired long before the realistic stage of system development would warrant it. Many systems are designed around hardware, and this normally means some off-the-shelf hardware or some modified equipment already fully available.

I should add that many research laboratories, too, are designed around hardware, with similar consequences. In both instances, instead of identifying the problem and the resulting equipment requirements, the problem and all other requirements are constrained by the hardware which, after all, must justify its cost.

This issue is, indeed, coupled with off-the-shelf thinking. Truly, an on-going battle rages between those who prefer to approach problems by blue-skying and those who prefer improvements of an existing situation.

Clearly, this is not an either-or problem, for if it were it might have already been resolved. It is obviously safer to avoid radical departures from current thought. It is therefore both safer and easier to simply superimpose modern equipment upon previously manual functions without significantly altering these functions, or even questioning their viability. The probability of success is greater, but the consequences of succeeding somewhat less than spectacular.

In the area of man-machine interactions, perhaps the major problem revolves around the determination of the type, amount, and timing of information which the human decision-maker is to receive, and at the same time, the determination of the information which he may have access to, even though it need not be presented to him under most circumstances.

Men are on the receiving end of an enormous quantity of information already; in fact, too much of it as it is. There does not seem to be much point in automating and speeding up this flow, and thus even increasing the effective amount per unit time. Selectivity rather than all-purposiveness would seem more appropriate, both in terms of access to data and of its actual presentation to decision-makers. It is consequently of great importance to identify the information which particular decision-makers ought not to receive.

Information which people say they want is often not the same as information they want. The information they want is generally quite in excess of information they need. At a given level of the decision-making hierarchy, an effort to provide detailed data on all aspects of the system and its operations would tend to lead to centralization of decision functions. At the least, it would degrade the use of imagination which goes with autonomy and fairly clear responsibility at more subordinate levels within the organization. No systematic data presently exist on relations between system outputs, the actual decisions in operational contexts, and the actual consequences of such decisions. The problems of determining these information needs therefore remain quite serious.
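The selectivity argument above amounts to a filtering rule: each decision-making role declares what it needs, and everything else is withheld from presentation while remaining accessible on request. The roles, item names, and the needs table in this sketch are all invented for illustration; nothing of the kind appears in the paper:

```python
# Hypothetical sketch of "selectivity rather than all-purposiveness":
# push only what a role needs; hold the rest back (still queryable,
# just not presented). All role and item names are invented.

NEEDS = {
    "theater_commander": {"force_status", "threat_summary"},
    "logistics_officer": {"supply_levels", "transport_status"},
}

def present(role, items):
    """Return (shown, withheld): what is pushed to the role vs. held back."""
    needed = NEEDS.get(role, set())
    shown = {k: v for k, v in items.items() if k in needed}
    withheld = sorted(set(items) - needed)
    return shown, withheld

items = {"force_status": "green", "supply_levels": "amber",
         "threat_summary": "low", "transport_status": "red"}
shown, withheld = present("theater_commander", items)
print(sorted(shown))   # ['force_status', 'threat_summary']
print(withheld)        # ['supply_levels', 'transport_status']
```

The hard problem the author identifies is not the mechanism, which is trivial, but populating the needs table: no systematic data relate system outputs to actual decisions and their consequences.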

The notion of real-time monitoring implies a system capability to be operative around the clock. This requirement seems to be always present, and it is the more critical the more the command and control domain of responsibility has to do with rapidly changing events rather than relatively slower ones. Indeed, some fallback provisions are an important ingredient of command and control systems. These may be provisions to return to some version of pre-electronic data-handling modes. Or else, multiplexing of the core equipment and the appropriate communications linkages may be used as an alternative.

Relatively little systematic thought has been actually given to multiplexing of equipment between and among various systems, rather than hardware duplication or multiplication within each system. Although this alternative may seem quite appealing, its consequences are not altogether clear. It may, for instance, involve using the same kind of equipment across a variety of systems, and this has something of the effect of monopolization in the hardware production and distribution field.

The same kind of an issue holds regarding intersystem compatibility of equipment, program languages, and resulting procedures. Yet some degree of compatibility is of great relevance because of the interfaces which invariably exist among several command and control systems, if not all of them.

This is further complicated by the fact that various systems are, at any particular point in time, in different stages of development, or else in different stages of their life cycle. In the rapidly changing field of data handling, these time differences in and of themselves make adequate compatibility of past with present, and present with future, systems quite difficult.

The sociological and social-psychological components of systems and their operations are also rather central in the eventual capacity of the systems to act on their objectives. Existing organizational forms significantly constrain the range of choices which are open in system design and utilization. Major departures from prevailing cultural patterns within an organization, such as the military establishment, may be so threatening as to make even good solutions less than acceptable. The problems associated with phasing people out of one type of working environment and an accustomed set of behaviors into another environment are ample, and their resolution rarely works in the direction of upgrading, rather than downgrading, system performance.

I would now like to bring my discussion to a close on a somewhat different theme. I have singled out a number of problems associated with the development and utilization of command and control systems. This has led me to exclude mention of the tremendous progress which I believe has been made in the course of the past two decades or so in the conceptual, methodological, and hardware aspects of these systems. Nor must we be oblivious of the fact that, starting from scratch, numbers of people from various disciplines have developed a truly impressive know-how, such that it at least provides assurance that past errors are unlikely to be repeated.

These individuals are heavily concentrated in relatively few organizations, but they are here, and they were not here only some ten to twenty years ago.
