Organization, Perception, and Control in Living Systems


The most coherent and extensive statement of Johnson’s theory of the interconnection among organization, perception, and control in living systems. Particularly important for its emphasis that there is no perception without active participation.

The author takes an organic view of management information systems. The firm is, after all, an organism, and exhibits in large measure the behavior of any other viable organism: you, for example. In particular, the impact of information on the firm’s behavior is the meaning of that information, and meaning arises only through participation in the process of perception. This article is dedicated to all who would design systems to gather information. We invite you to participate in it. Ed.


Ever since I came barging into the ranks of industrial management, there to pay attention to the structure and behavior of “intelligent” systems, I have relied, for the purposes of my own understanding of the issues involved, upon what I know of physiological organisms and of how they deal with their world. I have, at the same time, been somewhat appalled on occasion by the essayed anthropomorphic descriptions so often applied to the observed or intended behaviors of information-handling systems: those “aids” toward which we find ourselves turning in the endeavor to stem the tide of meaningless paper and to simplify our grasp of business processes. We are being inundated by the flow of data and statistics and carefully charted relationships to the point where what we seem to know about a situation threatens to exceed by far in complication the workings of the situation itself. Let us take another and quite different look at what we are doing and why.

Animals deal with a highly complex, ambiguous world while performing the simplest of behaviors. Let us consider how they collect and use their information, and then take a similar view toward improving the health and functioning of those larger systems of which we form but part. In what follows I am not going to suggest that we hook up our computers to animal brains in order somehow to put to use some abstract ability that we have as yet not learned to imitate in hardware. I will, however, suggest that we must seek far better ways to connect computers to people: to make their services much more congenial to obtain, more “conversational,” of the nature of a shared dialogue.[1] The alienation we are coming to feel in our confrontations with those superfast, infallible, compulsive, unforgiving electronic beasts is one that may stem largely from our own failure to demand that they not make such demands for sophistication from us in their use. Let us see what we ourselves are doing “on-line” in the world and note whether we would like our computing or management systems to be of the same ilk.

As we approach the eighth decade of the twentieth century, we find ourselves moving into a world in which we must learn to take account of “triadic” relationships.[2] The didactic attitude of “if this, then that,” irrespective of context, is no longer adequate. It served us fairly well in setting up an industrialized society of interlocking, working parts whose roles could be expected not to have to change much in the course of a human lifetime, and even less over the span of a year or of a professional commitment. We were once a monadic race, as are the other species: setting our sights on making satisfactory the objects and events of our immediate surroundings. We later became diadic, requiring the emergence of symbolisms and metaphoric embodiments of our identity: names, money, property, and clothing styles, to name but a few. We are now fast approaching the triadic, which will have its own forms and will serve to relate and give meaning to the other two. The “three-valued” world is one which demands the participation of the individual in a dialogue with his environment in order that he be able to generate for himself a meaningful existence. Psychologists are discovering that grasp is a more fundamental perceptual mode than mere touch, and that looking is more essentially descriptive of vision than is seeing. It is the active involvement of the organism which is paramount.[3]

I find the use of physiological analogy in teaching to be significant, partly because it allows me to elicit from the student a description of how he thinks he perceives his world, thereby providing the raw material for discussion in his terms, but more because it allows the student to develop in himself his own laboratory for experiment and for a start along the path to an active participation in the processes he is studying. Rigorous arguments employing equations and tables are akin to bricks and mortar; I am more interested in the thrust of architecture.


We seek to ask, then, as living examples of the organisms about which we theorize, what we are and how we go about our business. What is our highly developed central nervous system for anyway? How does it serve itself and the rest of the corporal system in which it resides? And ultimately, what can we learn from such considerations that will lead us to better formulations of the larger organizations of management?

If called upon to state the overriding purpose of the central nervous system as an evolutionary phenomenon, I would characterize its role as the organizing of the total behavior of the organism in the task of discovering the meanings of the objects and events surrounding it. The senses measure data flowing in from the environment, but of themselves those measurements are meaningless. Sensory input is metaphoric: there is an implied comparison between it and the event which produced it. The sensory input is not the event. I repeat: THE SENSORY INPUT IS NOT THE EVENT. It is a metaphoric statement about the event or object. It remains for the organism to discern the meaning of the metaphor through some further behavior which is more than simply a passive sensory observation.

What is meaning? For the most part, it is an identification, but it is far more than a simple naming. The identification of an object or event by name is the result of a common sense, an agreed upon correspondence between metaphors of an event-like kind, as experienced, and metaphors of a symbolic kind, as stated. The apprehension implies that some meaning has been assigned to the object and therefore that some appropriate response is available within the organism’s repertoire. In fact, that is the crux of it. MEANING IS TO BE FOUND IN THE RESPONSE WHICH THE ORGANISM FINDS APPROPRIATE. Note, however, that the response is not necessarily carried out; it may be latent or potential. Note also that we have as yet failed to consider how the response and its appropriateness came to reside within the available repertoire, and how these are elicited. There is, of course, the element of past experience to be considered, but of much more immediate importance are the elements of present experience, of participation in the event. No object or event can exist totally without relation to the observer and still have meaning. It is the manipulation of the relation of event to observer that allows him to discern meaning. This is a roundabout way of saying that the context of the event acts as an operator to assign meaning to the sensory metaphor, and that the discernment of context is an active process on the part of the observer.[4]

Let us try another tack.

Here is a man. I show him a photograph of an object: a piece of mechanical apparatus. He has never seen such a thing and so it has no meaning for him whatever. Since the photograph shows the object only, devoid of any context of use, he cannot even estimate its size, material, color, orientation, or where he might expect to see one. I tell him the object’s name. It is a dibble.[5] A what? A dibble. Sorry, still no enlightenment; the name means nothing to him yet.

Here, my friend, are a few hundred plant seedlings I would like you to put in the ground for me. Just make a little hole for each and stick it in. It’s easy, you can see how I do about ... Aha, he says, now if I had a gadget like the one in the photo about the right size on the tool belt to fit my hand, then the sharp end would make holes just the size I need for the plants.

Here you are, how does it feel?

Fine! Very comfortable to grasp.

How does it work? Try it a bit.

Works fine. Just what I wanted. What do you call it?

A dibble.

And he will never forget what a dibble is because now his mind knows. The feel of it in his hand, the sensation of the earth parting as it makes the hole, the satisfying moment of insight as he “reinvented” the dibble for himself are all his forever. His participation in dibbling and the changes wrought in him by that participation are his knowledge of it. Any metaphor implying the dibble — picture, name, hand-feel, seedling — can now evoke the context of the dibble through his experience and consequently that metaphor can take on meaning for him.

Ask a friend who can type where, on the typewriter, the Y is to be found relative to the H. Almost without exception you will observe his or her hand to rise to near chest height, there to make the motion in space of moving from the H to the Y. Where lies the most convenient route to the “memory” of that relationship? Somewhere in the functioning of the circuits that mediate the appropriate responsive behavior. The illustration gives us a clue to what we mean by “knowing.”

Spend some time in a supermarket and contemplate what the meaning of a food package is to each person who must relate to it. As the housewife reaches for it, she is already anticipating its use in cooking, its aspect in serving, and its flavor in the tasting of it. The meaning for the checkout girl is simply price; she probably goes home at night, looks at the shelf, and automatically thinks numbers rather than aromas — and her hand may start to tap out the keyboard pattern. The kid who puts the groceries in bags sees packages solely in terms of size, weight, and relative fragility. His reach and grasp must reflect the spaces remaining within the bag. Price and flavor are irrelevant to whether the bundle will maintain its integrity from there to your kitchen.

Everyone sees objects or events within a context that serves to provide meaning for him. Earlier I suggested that context be thought of as an operator which assigns meaning to metaphor. Now you will want to know what context is. You will want me to point to it as if “background” were a sufficient description. It isn’t! The context which you discern involves your participation in it, not mine. However, I think I can be of help to you here if you will let me make the argument sound circular.

Let us consider context to be everything which serves in some way to assign meaning to an object or event. Since we have already seen that meaning is embodied in the response which the organism finds appropriate to an object or event, context can be meaningful only insofar as it helps us to organize our responses appropriately. The usefulness of this obverse way of defining context will become more apparent later.


We talk a lot about information and its entropic nature. We talk about redundancy and channel capacity and all the gamut of description of data flow, yet seldom take account of how a part of the system may be changed by the information which it processes. A simulation model of a municipal government is formulated.[6] In one branch of the flow chart we find the mayor with information pouring in from many places and going out to others. For all the simulation runs, that mayor is imagined to have ears and eyes with which to take in data, and a mouth with which to give out orders, but he is never allowed to be changed himself. Most of the information flow into the mayor’s office is of high speed and of fairly low resolution: not erroneous, but lacking in detail, and so it should be. It is his job to sense a situation of growing novelty somewhere and to make the active effort to increase the resolution of his view in that specific area until he feels he knows what is happening and what the appropriate municipal response should be. His very participation in the process of refining the grain of the picture he sees — doing so in a limited area and to as high a resolution as he feels is necessary — is what assigns the meaning to the events therein. He is not a simple transformer of information. He is more than tactile flesh; he has a reach and grasp which can serve him to identify what he touches.

There is no doubting that data come at us in many forms, some of which will hit us even if we sit still and some of which we must go after for ourselves. To an extent we have predictive models of what our participation must be in order to sort out meaning from the mess. At the outset all is ambiguity. Redundancy is all of that which has been anticipated by the predictive model and for the perception of which no active participation was required. Information is ambiguity resolved by transaction with the environment. It is not just that the active transaction delivers us the information. OUR PARTICIPATION IN THE PROCESS IS THE PERCEPTION OF THE INFORMATION. The leftovers are generally called noise, but we should not lose sight of the possibility that through further transactions we may be able to diagnose their context and thus rescue them from meaninglessness. It is up to the organism to change itself for the purpose of acquiring elucidation. Discovery of an appropriate response takes trial and error, trial and success. Trial? Not as such, for no trial is carried to completion in the form with which it commenced. The organism changes itself and evolves continuously. No unitary error or success may be identified. The handy phrase, “cut and try,” is too simplistic and even misleading, because to cut the system in time, to take a stop-frame view of it, is tantamount to murder.

What, after all, do our piles of data and statistics and charts on hard copy in endless fan-fold stacks tell us of living, changing, growing organisms? Our computer technology has provided us the opportunity to have at every hand a responsive model of the systems we envision, one which is up-to-date, anticipating, predicting, resolving, and available at all times. No office should require a file or shelf wherein to store away a lot of paper. Any particular aspect of the system about which we want to know should be available at a terminal and should perhaps be printed out on a scope or in ink that has a half-life measurable in minutes or seconds, thus to underscore the relevance of the data only to the temporal context in which they were requested.

Our manners of taking data, of organizing them into “meaningful” statements, and then of applying the inferences of those statements in the processes of management are far from reflecting a real world of things, events, and people, and are also largely unsuited to use by the complex information-handling systems that are available. For the most part, the espoused theories and techniques are only slightly modified versions of those prevalent prior to the invention of the computer. In those days, the purpose of a statistical measure or test was to minimize the manual labor for the investigator while maximizing the probability of success of his interpretations. Who needs that first requirement any more? Let us stop lumping data which have been husked of their context of time and of the marks left by the hand that grasped them.

We are so accustomed to being asked to distort our vision so that a model will look like the original that we forget to ask for the model to be placed in our own hands that we may examine it ourselves. We are likewise prone to crow loudly when in visual aspect a real situation is found which appears to fit one of our models with ease. I am sorely reminded of expressions of aesthetic enthusiasm when a scene looks “pretty as a picture postcard.” I would rather we asked why postcards usually fail to convey more than a negligible aspect of the experience of being there.


We hear a lot about decision theory, but I wonder if we really have a clear notion about what a decision is and how it differs from “doing what comes naturally.” My principal interest is in the latter, the informal ways in which we generally go about getting things to work without ever thinking very explicitly about how and why we are taking each step along the way. There is a clear class of situations, however, in which it is appropriate to make a decision and then to act upon it. Let me illustrate with a somewhat extreme case from the psychophysiological realm.

There are moments in the life of any organism when a state of emergency arises. What is a state of emergency? Whence its urgency? How do we learn to recognize it? What do we do about it?

An emergency may be said to exist when it appears that there is not enough time available for an adequate exploration of the meaning of incoming data. Something must be done now; a decision has to be made. A commitment to responsive action must be commenced immediately and its selection must be based upon what we already know or can guess. Exploration takes time because it is an interactive dialogue and involves changes in both the perceiver and his environment. Emergencies are of a stop-frame character. Here is a snapshot of the world. Act upon it. Time has run out. The clock is stopped.

Animals do not enter an emergency mode of behavior for the sole reason that a threat to survival is present. If the threat is explicit, and its meaning in terms of an appropriate response is clear, then no decisionary procedures are necessary because, in a very real sense, the correct response is already under way. Emergencies are produced by an overload of ambiguity (some would say an “information overload,” but this is incorrect), and the danger is that hidden among the ambiguous data may be something which is of a threatening nature. One cannot take chances with survival. Action must be initiated which has the highest probability of moving one out of the ambiguity overload and back to a condition of dialogue. That action should be unitary in the sense that it is a single, well-formed pattern of behavior already available in the repertoire of such patterns, and commitment must be made at any one time to only one of them. Dialogue is a many-eyed, many-armed transaction. The response to emergency should be cyclopean, immediate, and uni-directional.

Such is the nature of decision. It is not a gesture which tells you more about the world. Quite the contrary, a decision is made as the shutter closes and no more metaphors will acquire meaning until the action to which you are now committed is either carried to completion or is otherwise terminated. Decisions freeze structure and proceed to turn the behavioral crank. They put the blinders on us, rendering us unaware of meaningful twists in the path. Decisions are for those moments when you cannot do otherwise.[7]

The next time that you find yourself in the act of “making a decision,” ask yourself: (a) whether you would not instead prefer the opportunity to explore the data more fully and then deal with them in smaller pieces, (b) whether the decision you are making has more the character of “timeless relevance” than of specific applicability to a one-of-a-kind situation, and (c) how soon the behavior you are commencing as a result of the decision will give way in its inflexibility to a more informal, congenial groping.


Lest you think I am inveighing totally against structure or policy, let me change my tune to a more positive one. I am concerned with the nature of self-organizing systems. To be sure, they have a structure or mechanism which is put together in a fairly immutable way and which cannot disobey the laws of physics in the use of its parts while engaged in a dialogue with its environment. However, there may and should be a highly flexible set of relationships possible between the sensors that detect changes and the effectors that produce them. The system itself must never be content to remain as a passive observer of incoming data; it must participate in exploration. In fact, it is becoming increasingly evident that any self-organizing system must, in effect, play with itself in a manner which includes part of the environment in the loop. Furthermore, the most meaningful information to be found anywhere in that loop is not the raw data with which the system is dealing but the behavior in which the system must engage in order to deal with the data.

Let us take a look at cybernetics from a somewhat historical perspective in order to see where this trend of thinking is taking us.

Initially, as the principles of feedback control gained in popularity and interest, purposive, “goal-seeking” systems were envisioned as altering their outputs in order to reduce the “error” discrepancies between their inputs and some desired state of the inputs. The sensory metaphors were to be analyzed by a central processor and then appropriate actions were to be taken by the effectors to make corrections (Figure 1). There was little serious concern as to how “appropriateness” was to be determined, since it was presumed that such considerations would be provided for by the designer. Performance assessment was to be prearranged; the systems had no need to learn for themselves what the meanings of the sensory metaphors were because someone had already formulated the task of translation. Where a symbolic manipulation was necessary, the symbols were delivered as symbols; where the error to be minimized was physically measurable, the notion of “good” or minimum was explicitly defined. These systems did not have to be born and nurtured; they were made of whole cloth and turned on. And if the gadget did not work as intended, the fault was laid at the designer’s feet because he had not foreseen all of the consequences of his design. The system was not supposed to be, or to seek, a favorable identity; it was simply supposed to do.
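The error-nulling arrangement described above can be caricatured in a few lines of code. This is only an illustrative sketch, not anything from the article; the function name, goal, gain, and step count are all my own assumptions.

```python
# A 1930s-style feedback controller: the designer fixes the goal and the
# correction rule in advance, and the system merely nulls the "error"
# discrepancy between its input and the desired state of that input.

def run_fixed_controller(goal, reading, gain, steps):
    """Repeatedly reduce the discrepancy between reading and goal."""
    for _ in range(steps):
        error = goal - reading      # sensory metaphor compared with the desired state
        reading += gain * error     # effector applies the prearranged correction
    return reading

# The system never asks what the error *means*; "appropriateness" was
# settled by the designer before the machine was switched on.
final = run_fixed_controller(goal=20.0, reading=5.0, gain=0.5, steps=20)
assert abs(final - 20.0) < 0.01
```

Everything meaningful here — the goal, the gain, the very definition of “good” as a small error — was decided before the first run, which is precisely the limitation the article goes on to describe.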

Even now, as we look at diagrams of management systems and at organizational charts and the descriptions of the information channels depicted, we know implicitly (and then promptly forget) that at the nodes, where information enters and is relayed onward, there are men, self-organizing human beings whose task it is to understand that information in its context, to know the policies of the company in relation to that context, and to transform the data into new symbols relevant to the next stage of processing. Fine. People are good at this task and have been doing it for a long time. But every day I see claims that there are systems that are doing it. I claim they are doing no such thing.

About 20 years later (Figure 2), around the time when cybernetics acquired its name and formulations, it became apparent that a new attitude was gaining popularity. Partly as a result of physiological studies and partly arising from the development of servomechanical controls, an advantage was recognized in the use of “interfacing” systems which would deal with the environment in a goal-seeking, “homeostatic” way and not belabor the central processor with the problems of calculating the necessary adjustments. The higher centers could then concern themselves with higher things and would in turn produce their effects upon the environment by altering the goals or null settings of the interfacing systems. In the use of such a mechanism, the designer no longer needed to know precisely what kind of environment the system would be called upon to face but had only to assure himself that the effector apparatus would be capable of finding a route toward satisfaction of the goal structure assigned to the interface.
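The 1950s arrangement sketched above can likewise be illustrated in code. All the names here (`HomeostaticInterface`, `HigherCenter`, the gain and disturbance values) are hypothetical, chosen only to show the division of labor: the interface finds its own route to a goal, and the higher center acts only by resetting that goal.

```python
# A homeostatic interface tracks whatever goal it is given, without
# belaboring the higher centers with the calculation of adjustments.

class HomeostaticInterface:
    """Holds its state near an assigned goal by local correction."""
    def __init__(self, state=0.0, gain=0.5):
        self.state = state
        self.gain = gain
        self.goal = state

    def step(self, disturbance=0.0):
        self.state += disturbance                            # environment nudges the state
        self.state += self.gain * (self.goal - self.state)   # local, goal-seeking correction
        return self.state

class HigherCenter:
    """Affects the environment only by altering the interface's null setting."""
    def set_goal(self, interface, new_goal):
        interface.goal = new_goal   # no computation of the necessary adjustments here

interface = HomeostaticInterface()
HigherCenter().set_goal(interface, 10.0)
for _ in range(30):
    interface.step(disturbance=0.1)   # a steady environmental push
# The interface has found its own way to the vicinity of the assigned goal.
```

Note that the designer of the higher center needed no model of the disturbance; he only had to trust that the interface could find a route toward the goal structure assigned to it.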

[Figures 1–3: the control-system arrangements of approximately 1930, 1950, and 1970.]

There was, however, still a reluctance to accept as meaningful descriptors of the environment anything but the sensory metaphors received as inputs. Looking appeared to be merely a useful adjunct to seeing; feeling seemed to come before touching; and shape was considered somehow to have a message of its own even before grasp came along to define it. In short, the organism was still being viewed as a passive observer of information and its active role in accessing that information for itself was no more than an additional feature. The interpretation of meaning by the system was still the a priori responsibility of the designer or programmer. One still needed people at the nodal points to make the transformations called for by the context of the data. The system had no way to explore such operations for itself.

A new trend is in the making.[8] It is becoming apparent that the necessity for the participation of the perceiver in the act of perception is more than fortuitous. IN FACT, THE PROCESS OF PARTICIPATION IS ITSELF THE MEANINGFUL STATEMENT OF OBJECTS AND EVENTS IN THE ENVIRONMENT. The interface required is not much different from what it was, but the useful information we derive from it has taken a profound shift. Any system which has the responsibility for organizing itself and for discovering the meanings of things must trust, at least at first, to what it can learn from the behavior of its interface with the world (Figure 3). After some experience with a sensory metaphor and the rediscovery of responses appropriate to it, the system can learn to apprehend meaning as a passive observer. At the outset, however, no such sophisticated interpretation is possible: the system must trust its interface to deal with a new aspect of the environment according to the goal structure (policies) which it has set for that interface. The meaningful data are to be found in the statement of resulting behavior, not in the raw environmental measurements. The topology of initial exploration and later of interpretation is that of an “effector map,” which in its turn has no meaning if removed from the context of the interfacing mechanism which produced it. If you turn off the system and disassemble it in order to describe the parts, you will not be able to discern that quality of responsiveness; if you tell me only of the data which produced it or of the structure which responded as described, you will likewise tell me nothing. Self-organizing systems are of a devious ilk and hide their colors when you try to isolate them. Instead, present the system in some active, responsive embodiment and let the observer explore its responsiveness. Allow him the opportunity to tweak with his interface — his reach, grasp, poke, stratagem, and perturbation — so that he in his turn may “know” this system as only an active participant can ever know it.

This shift of attention from a reliance upon the input as the source of environmental fact to a confidence in the behavior of an exploratory interface which is adept at dealing with the environment is not a trivial change in point of view. It says, for example, that a representative or ambassador to be selected should be a man who is familiar with the culture into which he is to go and trustworthy in his understanding of and belief in the policies that are to guide his actions. Thereafter he should not be asked for reports of the details of his negotiations but only of what he did about them. He is the one who knows the meaning of the details because he is immersed in the context of their happening. No authority further up the pyramid can be expected to grasp their significance (the ambassador himself is the grasper for us), nor should responsive actions be dictated from above, but only shifts of policy. Gradually he may educate us by his actions into the meanings of the events with which he must deal; but at the outset those events are meaningless in our context and only his actions may convey to us their import.

A mountaineer returns from a climb exhausted. Do we ask him metric questions about the mountain if what we wish to know is the effect the climb will have upon us? No, for his very exhaustion is message enough. The metric details would be meaningless by themselves.

Refer for a moment to our earlier considerations of the activities of the perceiver and his search for meaning. If, as was suggested there, we define context to be all of that which allows us to organize our responses appropriately, then it would appear immediately that the interfacing system described above is especially well suited to the exploration of context. The responsive behavior of the interface is the message to which the central processor attends. Since the interfacing system is going to manipulate its effectors so as to explore the ways in which to fulfill the role assigned to it by the next higher level, it will by that very process refrain from isolating a metaphor away from the context of its occurrence.

There lies within this notion of an autonomous interfacing system the seed of the direction which I think our considerations must take when we speak of decentralization of industrial management. A hierarchical structure may be imagined in which each higher level accepts as meaningful statements of reality the behavior of the next rank below and receives as instruction from above a slowly varying goal structure (role assignment). Each rank is trusted to be diligent in the pursuit of the role assigned to it, to be responsive to variation of goals, and to be capable of reporting the actions taken (goals set) upon the next rank below.
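The hierarchy just described can be sketched in code. This is a caricature under my own assumed names (`Rank`, `assign_role`, `report`), none of which come from the article: goals flow downward as role assignments, and what flows upward is the statement of actions taken, never the raw data.

```python
# Each rank receives a slowly varying goal from above and reports upward
# only the behavior it engaged in (the goals it set on the rank below).

class Rank:
    def __init__(self, name, subordinate=None):
        self.name = name
        self.subordinate = subordinate
        self.goal = None
        self.actions = []   # the behavior that will be reported upward

    def assign_role(self, goal):
        """Instruction from above: a goal structure, not a specific command."""
        self.goal = goal
        if self.subordinate:
            # Pursuing its role, this rank sets a goal upon the rank below.
            sub_goal = f"support: {goal}"
            self.actions.append(f"set goal '{sub_goal}' on {self.subordinate.name}")
            self.subordinate.assign_role(sub_goal)

    def report(self):
        """What flows upward is the statement of behavior, not raw data."""
        return {self.name: list(self.actions)}

plant = Rank("plant")
division = Rank("division", subordinate=plant)
division.assign_role("reduce delivery delays")
# division.report() describes only the goal-setting action the division took;
# the plant's raw operating data never travel up the pyramid.
```

The design choice mirrors the text: the upper level may vary goals and read reports of action, but it issues no direct commands for specific behavior below.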

I am not suggesting that it is out of bounds for an upper level to observe the raw data serving a lower one. I only submit the warning that that raw input may well be meaningless to one who is not in the habit of immediate and extensive participation in it. The lower ranks — “down where the rubber meets the road” — are more sophisticated in the meanings of events at their levels because they are more colloquial with the appropriate responses to them. The higher levels may learn those meanings, but only by observing the events in the context of the behavioral responses dealing with them. Direct command for specific action should rarely flow downward except under the conditions discussed above, when a decisionary procedure is mandatory. Even then, the decision reached should give evidence of the redundancy of potential command throughout the system. I will not elaborate here; Kilmer, McCulloch, and Blum have done so at length elsewhere.[9]


Another concept we hear and talk and write about, without ever being very specific about what we mean by the term, is “intelligent behavior.” Let us consider for a moment how we might arrive at a useful, operative definition.

An intelligent act certainly is one which takes account of the context in which it is performed. But there is more to it than that. One can imagine ritual acts carried out in conditions where the context never changes and which are therefore appropriate for all time. So we say, aha, there must be something of adaptability in the process of intelligence, something which takes account of changes in context. All right, but how would we as observers know that this had been the case? We cannot expect our apperception of conditions surrounding an event to be the same as that of some other complex organism. How would we know whether the change in system response were or were not appropriate to the change of conditions?

Consider an artificial situation and a human parallel. Someone brings a black box into your office and claims that it is a machine capable of intelligent behavior. He demonstrates the means of communicating with the device and then departs, leaving you to decide whether to believe his assertion. Note that we are not trying to establish whether there is a man or computer inside the box, but rather to exercise our concept of what constitutes intelligent behavior. Any good definition should be capable of statement, independent of the object to which it is to be applied. Note also that if we wait for a demonstration to take place which does not include us as an active participant, we will be disappointed, for if the definition allowed us to be passive observers, then the box might as well contain nothing more than a magnetic tape playback unit loaded with suitable prerecorded tapes. No, there is more to intelligence than utterances. There must occur a transaction between us and the system.

Consider a human parallel about which you may have fantasized at some time in the past. You have been unwillingly incarcerated in a mental institution and have been brought before a board of psychiatrists for examination. It is your task to demonstrate your sanity. How would you go about it?

Again we come to the immediate conclusion that there is no statement you could make which would be convincing. You may not even have a clear notion of what sanity is, inasmuch as the rest of your world appears to be insane at this very moment. You do know, however, that you want the board to ask you questions and that the onus will be upon you to provide responses which the board will consider appropriate. In the case of an investigation of sanity, the board will most likely be seeking affective (emotional) responses while our accustomed thoughts about intelligent behavior center around rational responses. Let us not quibble. The test is essentially the same. For the sake of brevity, let us consider one familiar form it might take.

A partial, though certainly insufficient, test for sanity is to tell the patient stories, some of which are funny and some of which are not. We will be watching for the appropriateness of his amusement and laughter, or lack of it. We must, of course, take into account the cultural context from which the patient comes, for otherwise we would have no estimate of appropriateness to him. Let us accept that as a “given” in the problem.

But what, after all, makes a story funny or straight? Wherein lies the humor of a joke? It is really quite simple when you think about it. A joke, prior to the punchline, is a story told in some context with which you are familiar and which appears to assign meaning to all of the metaphors involved. What the punchline does is to shift the context in such a way that some or all of the metaphors suddenly take on new meaning. Your amusement is the natural response evidencing your enjoyment at participating in the construction of the new meanings which have been assigned by the new context. Think up any adequate test for sanity or for the ability to perform intelligent behavior and you will find that the same fundamental structure underlies the test.

The test involves a transaction in which metaphoric (but not necessarily verbal or symbolic) statements are to be made by one party and the meanings are to be discerned by the other. You will remember that meaning, to an organism, is implied by its response. Assuming that the appropriateness of the response can be estimated, we may say that INTELLIGENT BEHAVIOR IS THE DETECTION OF THE CHANGE OF MEANING BROUGHT ABOUT BY A SHIFT OF CONTEXT. Great elaborations of this basic definition are possible. I will not indulge in them here, but will leave the definition in its simplest form for now.

Consider if you will, however, the fact that most of the more serious problems faced by management systems arise in the form of an apparent irrelevance of an older, previously successful system for dealing with new problems. It often appears that the problems are, in almost every sense, identical with the old ones, but they are now appearing in a different context. The system has failed to react intelligently. It has failed to detect the change of meaning brought about by a shift of context.


One idea which is becoming inescapably clear in many areas of human experience is that the sources of motivation for mankind are not limited to actions that promote physical survival or satisfy physiological appetites. Man has an undeniable need to maintain his “information level” and will go to great lengths to do so. Extreme examples are Saturday night fights at home, car theft, narcotics experiments, vandalism, exhibitionism, or even alcoholism, a reverse attempt to suppress the need to the point where it is relatively tolerable. It should be clear by now that “information” for the individual is not simply what is beamed at him for his consumption, but is rather that which invites his involvement: i.e., something that will respond to him, that will tell him he is there. In the years to come we are going to see many instances of people rebelling against a world which lacks courtesy, lacks a responsiveness to them. They will kick cigarette machines that swallow their money without giving anything back, and will kick government just as hard when it gives back irrelevance. We will see more students wondering why the university experience prepares them for nothing that they care to be and therefore demanding at least some role in the determination of that preparation. We will see people more prone to ignore traffic lights when the irrevocable rhythm is irrelevant to the present traffic conditions: unmindful, that is, of the original context and purpose of those lights.

We are also going to see arising the use of involvement itself as a motivator at all levels of industry and of society at large. I predict a higher demand for involvement than for financial compensation as an inducement to productive creativity. There is nothing mysterious about this trend except perhaps its late appearance. Its latency is most plausibly assignable to the lessons of the industrial revolution which pronounced the edict that each could afford the products that all enjoyed provided that he was willing to accept their sameness. We learned not to expect individual difference, but to be content with a ubiquitous value system. However, such conformity need no longer be the case.[10]

We now have the information handling systems that can keep track of routine matters, and if we can get away from the industrial notion that computers are there for the purpose of cranking out mountains of hard copy on call, and demand instead that they interface with us in a conversational way, then we can get on with the business of making this world a far more congenial one. Who needs a system that can spit back absolutely detailed and accurate statistics and correlations of data in pre-digested form about a past that is by now virtually irrelevant, when what we want are systems that allow us to play on-line and in real time with relationships that exist now? We don’t need ten-place precision! What we want are self-organizing models that update their structures continually and offer us a 4-bit dialogue in real or fast time so that we come to “feel” what the system is like. Our own involvement is crucial to our growth and to our knowledge of the current state of affairs and of the meanings of the events to follow.

I will also predict that there will be a gradual de-emphasis of the qualities of inclusion and precision in the writing of proposals or reports. Rather, a company will consider as its major salable asset the ability to adapt its services in a relevant way as conditions change. Artists’ conceptions of the “city of the future” will show some cranes and torn-up streets, and the urban designers will be asked to provide the means for making future changes more comfortable to effect. That is, the graphical presentation itself will show that the process of change has been taken into consideration. The areas of the social sciences should be the first to show a useful grasp of the principles of self-organization. Projects will be undertaken which start at high speed in many areas at once. They will have a clear, over-all purpose, but will be initially defocused as to the means for effecting that purpose. As a project progresses, there will be internal shifts and refocusing of attention and effort onto those specific active parts that are showing the best promise or achievement of the original intent. In this manner, the resolution or clarity of the picture of social change will arise in the form of effective action taken rather than “statements of the problem.”

Later the technique should become more pervasive in scope and will reach maximum fruition with the maturity of a generation of students who have been educated by a system that attends at all times to the relevance of the education to the individual. It could start now, and some attempts have been made, but their approaches are not sufficiently general and do not reach enough students on a continuing basis. Eventually they will, but it will be expensive, initially clumsy, and widely resisted except by the students themselves and their teachers. The principal focus should be upon ways to interface the system with the student for the maximum of multi-sensory, multi-effector dialogue so that the system can learn enough about the student to model him and to teach him concepts in the contexts with which he is the most facile.[11]

A far brighter, more fascinating future lies in store for us in a “turned on” world if we can get around to developing it. Its major product will be our own participation in the events around us and it will require that our informational technologies provide us with a maximum of on-line facility.


Barron, R.L. “Self-Organizing and Learning Control Systems.” In: H.L. Oestreicher and D.R. Moore (eds.), Cybernetic Problems in Bionics. New York, Gordon and Breach, 1968.

Bentley, A.F. Inquiry into Inquiries: Essays in Social Theory. Boston, Beacon Press, 1954.

Brodey, W.M. “Soft Architecture: The Design of Intelligent Environments,” Landscape, Vol. 7, no. 1 (Autumn 1967).

Gibson, J.J. The Senses Considered as Perceptual Systems. Boston, Houghton Mifflin, 1966.

Gurwitsch, A. The Field of Consciousness. Pittsburgh, Duquesne University Press, 1964.

Held, R. “Plasticity in Sensorimotor Systems,” Scientific American (November 1965).

Hermann, H., and Kotelly, J.C. “An Approach to Formal Psychiatry,” Perspectives in Biology and Medicine (Winter 1967).

Ittleson, W.H., and Cantril, H. “Perception: A Transactional Approach.” In: F.W. Matson and A. Montagu (eds.), The Human Dialogue. Toronto, Free Press, 1967.

Johnson, A.R. “A Structural, Preconscious Piaget: Heed Without Habit,” Proc. Nat’l Electronics Conf., Vol. 23 (1967).

Kilmer, W.L., McCulloch, W.S., and Blum, J. “Some Mechanisms for a Theory of the Reticular Formation.” In: M.D. Mesarovic (ed.), Systems Theory and Biology. New York, Springer-Verlag, 1968.

Kotelly, J.C. “Survey of Concepts of Content.” NASA Report, 1968.

MacKay, D.M. “On Comparing the Brain with Machines,” American Scientist, Vol. 42 (1954).

McCulloch, W.S. “Lekton.” In: L. Thayer (ed.), Communication: Theory and Research. New York, Charles C. Thomas, 1967.

Pask, G. “My Prediction for 1984.” Prospect, London, Hutchinson, 1962.

Storm, H.O. “Eolithism and Design,” Colorado Quarterly, Vol. 1, no. 3 (Winter 1953).

Wiener, N. The Human Use of Human Beings: Cybernetics and Society. Boston, Houghton Mifflin, 1950.


1. Pask, G. “My Prediction for 1984.” Prospect, London, Hutchinson, 1962.

2. McCulloch, W.S. “Lekton.” In: L. Thayer (ed.), Communication: Theory and Research. New York, Charles C. Thomas, 1967.

3. Held, R. “Plasticity in Sensorimotor Systems,” Scientific American (November 1965).

4. Hermann, H., and Kotelly, J.C. “An Approach to Formal Psychiatry,” Perspectives in Biology and Medicine (Winter 1967).

5. Webster’s New World Dictionary, College Edition, 1966, p. 406.

6. Paper presented by a member of the New York City Mayor’s office at the October 1968 meeting of the American Society for Cybernetics.

7. Kilmer, W.L., McCulloch, W.S., and Blum, J. “Some Mechanisms for a Theory of the Reticular Formation.” In: M.D. Mesarovic (ed.), Systems Theory and Biology. New York, Springer-Verlag, 1968.

8. Johnson, A.R. “A Structural, Preconscious Piaget: Heed Without Habit,” Proc. Nat’l Electronics Conf., Vol. 23 (1967).

9. Kilmer, W.L., McCulloch, W.S., and Blum, J. “Some Mechanisms for a Theory of the Reticular Formation.” In: M.D. Mesarovic (ed.), Systems Theory and Biology. New York, Springer-Verlag, 1968.

10. Brodey, W.M. “Soft Architecture: The Design of Intelligent Environments,” Landscape, Vol. 7, no. 1 (Autumn 1967).

11. Wiener, N. The Human Use of Human Beings: Cybernetics and Society. Boston, Houghton Mifflin, 1950.