The Active, Self-Organising Interface


A conference presentation that nicely illustrates how Johnson’s insights about perception and cognition could inform the design of computer interfaces. Notable for Johnson’s discussion of self-organizing controllers — one of the technologies they were hoping to implement in designing soft control materials.


Consider the three entities: Men, Machines, and Environments. The interfacing of any pair of these is the problem at issue and our interest is to provide a means whereby an active member of the pair may access information for itself about the other so that a predictive model may be evolved which will make future interactions more meaningful. How should the transaction be conducted?

Let us take as an example the design of prosthetic devices, sensory or motor. For too long we have considered sensory information to be received passively by the organism, to be processed by the central nervous system for the extraction of meaning, and thereupon to arrive at a decision for motor response. However, sensory data is at best metaphoric: it carries an implied comparison with external objects or events, but not an intrinsic signification. Meaning is to be found in the appropriate response of the organism, which must have taken account of the context in which the object or event occurred. Prosthetic systems should be active, self-organizing, sensorimotor extensions of the user and he a participant in the modification of the system's loop behavior. The prosthesis is itself engaged in a continuing search for an appropriate repertoire of responses to its environment. For the user, the meaningful data about the environment is to be found in the behavior of the interfacing device, not in the raw data with which it must deal, nor in any filtered or coded transformation of that data. The cortical representation would be that of an “effector map”.

Physiologically, a sense of touch and appreciation of texture are subsequent to grasp and to the making of tactile identifications; seeing is possible once we have learned how to look.


1.1 This paper has a starkly simple purpose. It is to promote the idea that a self-organizing interfacing system will deliver a more meaningful account of the environment with which it deals if it uses terms descriptive of its behavior in that environment than it would in terms of any presumed metric upon the sense data observable at its input (Johnson 1967, 1969a). There are many examples which might be considered; a few will be chosen.

1.2 We are concerned with the interactions between Men, Machines, and Environments. In particular our interest is focussed upon those interactions which allow one member of any pair of these entities to form a predictive model of the other so that purposive behaviors may evolve, and we must therefore enquire how best to conduct the transaction at the interface between them.


2.1 Imagine that you have been asked to hold your hand open behind you and that above all you are to maintain it passively immobile: outwardly unresponsive to any stimulus. An object which in other circumstances would be quite familiar to you will then be touched to your hand, moved across it, rotated upon it, so that in the end all of the surfaces of the object have come in contact with your “tactile” sensing system in some unexceptional manner --- except that it touched you rather than you touching it. You will very likely find it difficult if not impossible to identify the object. However, if we allow you but one fleeting grasp, initiated and carried out by your sensorimotor system, identification will be immediate. It is the process of grasp and the manner in which it organizes and is simultaneously organized by the thing grasped that becomes the embodiment of the recognition. Your grasp --- a self-referent event --- becomes the “behavior word” descriptive of the object.

2.2 Consider the shift of implication upon what one must know a priori for the conduct of a successful transaction or identification as we move from the single loop of sense-process-respond to the use of a self-organizing interface. The three diagrams below show the change schematically and depict roughly the evolution of cybernetic thought in the past four decades. The first is a familiar single-loop feedback system which takes in information, performs some sort of transformation upon it, and responds by way of its effectors. In such representations we make the implicit assumption that transformations are available at the processor which will allow meaningful interpretations of the input, i.e., the results of the transformations will be relevant responses. For simple control systems, where perception is unnecessary and relevance is timeless, the assumption is reasonable. However, it is not justified if there has been no sophisticated observer available to provide the appropriate transforming algorithms. The assumption is tantamount to having made a previous commitment to the context in which the data are to be interpreted (Hermann and Kotelly 1967, Gurwitsch 1964).

2.3 The second figure illustrates a shift in point of view. A control system has been made available as an interface which will manipulate the effectors in such a way as to optimize pursuit of the goals set for it by the central processor. No a priori assumption of “what must be done” to transform the sense data has been made for the self-organizing controller (Barron 1966, Barron and Schalkowsky 1967) acting as interface. It will carry out its task via a performance assessment criterion and a pseudo-random search process that makes it “reliable” in a dynamic sense: its behavior will tend toward an optimal solution. The implicit assumption of the former system, however, has not been relieved. The central processor is still dependent for its knowledge of the world upon sense data, descriptive of being touched rather than of grasp. It has no further means than in the first case for evolving a context for the sense data: a meaningful frame of reference in response-oriented terms. Its “words” must be selected from the list provided, and should the environment change so as to demand a shift of context, that list will rapidly become irrelevant.
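The performance-assessment-plus-pseudo-random-search scheme cited here can be suggested in miniature. Everything below --- the parameter vector, the quadratic performance measure, and the accept-if-improved rule --- is an illustrative assumption, not a reconstruction of Barron's actual controller:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def self_organizing_step(params, assess, step=0.1):
    """One pseudo-random search step: perturb the control parameters
    at random and keep the perturbation only if the performance
    assessment improves (lower cost is better here)."""
    trial = [p + random.uniform(-step, step) for p in params]
    return trial if assess(trial) < assess(params) else params

# Toy performance criterion: squared distance of the controller's
# parameters from an optimum it has no explicit model of.
optimum = [0.7, -0.3]
assess = lambda ps: sum((p - o) ** 2 for p, o in zip(ps, optimum))

params = [0.0, 0.0]
for _ in range(2000):
    params = self_organizing_step(params, assess)
# params has drifted toward the optimum without any a priori
# transforming algorithm having been supplied
```

The point of the sketch is the one made in the text: the controller is “reliable in a dynamic sense” because nothing but the assessment criterion is fixed in advance; the search itself discovers what must be done.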

2.4 The third diagram points the way in which cybernetics seems to be moving. A self-organizing interface device metabolizes data at the input and generates responses through its effectors which are relevant to the goals set for it by the central processor. The latter receives as input the behavioral outflow of the interface, which is necessarily response-oriented because it is data reduced to relevant responses in process. No transform could have been written to reduce the input data to this form unless it took into account all of the changing parameters of the environment in which the effectors must perform. The system is seeking to sort out the meaning of the metaphoric statements of objects and events received by its senses. Meaning, defined as behavioral relevance and embodied in appropriate response, has entered the necessary vocabulary of our description of cybernetic system design. This system, at the simplest level, is appropriately responsive to shifts in context of the metaphoric events of its immediate world. Let us turn to some practical possibilities of application.


3.1 Reflect on what should be the purposes embodied in a prosthetic replacement for lost sense or limb. Should a visual prosthesis provide the user with a sort of simulated sight as might a camera, or should it be primarily an aid to him in his use of visual space? Would he want to “see” a picture of the wall ahead of him or would he prefer a means toward awareness of the meaning of that wall in terms of future actions appropriate to him: its distance, orientation, height, texture, etc.?

3.2 What of prosthetic arms or hands? We who have a normal complement of sensation and articulation are generally quite unaware to what extent we have developed the ability to perform a snapshot intellectual assessment of a task, to set up a predictive model of what the experience of performance will be, and then to allow the extremities to organize themselves and to continue skillfully with a minimum of direct observation. We have scheduled the process of performance in terms of sensorimotor loops specifically appropriate to the task and have allowed them to pursue a sequence of self-referent goals. We are not engaged, as popular belief would have it, in digesting quantities of sensory minutiae for reduction into decisions for following steps. Rather, we assign our sensorimotor systems behavioral roles with foreordained results and then turn them loose to find out for themselves how to move from here to there through a complex parameter space. This point of departure defines the important demands we must meet in the design and deployment of the artificial-intelligence resources we can now provide for our prosthetic systems.

3.3 Why do we persist in describing prosthetic extremities as output devices and prosthetic sense as input? Let us instead refer to the latter as perceptual enhancement and to the former as somatic augmentation. No, it is not just a change of vocabulary that is proposed, but a fundamental change in our way of thinking about the dialogue between a self-organizing system and its environment (Gibson 1966, Held 1965).

3.4 Suppose, for example, that one might provide a blind person with a mechanical eyeball as a prosthetic replacement for one of his own eyes. Let us imagine that it contains within it an optical system to focus an image upon a photocell array and a miniaturized computer which can deliver stimuli to the ocular muscles sufficient to cause them to contract and move the “eye”. Let us also assume that no other output of stimuli is envisioned (Brodey and Johnson 1969). The computer, programmed as a multiple-goal, multiple-actuator, self-organizing controller (Barron and Schalkowsky 1967), would seek to move this system so as to maintain certain properties of the static and dynamic geometry of controlled boundaries on the photocell array within assigned limits. In doing so the resulting self-generated movement of the eye would interact with the user's voluntary directing of it and convey to him---by its way of looking---qualities of the visual space around him. If, as some have suggested, the lack of direct proprioception from the ocular muscles allows for too attenuated an awareness of direction of look, the prosthesis could be made sufficiently irregular in shape to provide enriched cues. The approach to the problem, however, should be broader: for the purpose of allowing a user to organize himself with respect to visual space, the information provided by a process of looking will be more useful than the properties of an image passively seen.
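The control loop of such an eye can be caricatured in a few lines. The one-dimensional photocell array, the single contrast edge, and the trial-and-accept muscle commands below are hypothetical stand-ins for the multiple-goal controller cited in the text; the sketch exists only to show that the movement log, not the image, is the behavioral outflow that reaches the user:

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible

def edge_position(gaze):
    """Toy photocell array: the scene holds one contrast edge at a
    fixed location, and its apparent position on the array depends on
    where the eye is pointed (a stand-in for the optical geometry)."""
    SCENE_EDGE = 37.0          # assumed scene feature, arbitrary units
    return SCENE_EDGE - gaze   # position of the edge on the array

def look(gaze, target=0.0, tolerance=0.5, trials=500):
    """Self-generated looking: pseudo-random ocular-muscle commands
    are accepted when they bring the edge on the array closer to the
    assigned limit. The accepted movements are logged; that log, not
    the raw photocell data, is what is reported upward."""
    movements = []
    for _ in range(trials):
        step = random.uniform(-2.0, 2.0)   # trial muscle command
        if abs(edge_position(gaze + step) - target) < abs(edge_position(gaze) - target):
            gaze += step
            movements.append(step)
        if abs(edge_position(gaze) - target) <= tolerance:
            break
    return gaze, movements

gaze, movements = look(0.0)
# The cumulative movement (sum of the log) tells the user where the
# feature lay; no image was ever delivered to him.
```

The design choice mirrors the paragraph above: the “quality of the visual space” is conveyed by how the eye had to move, which is exactly the self-referent information the text argues is the useful one.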

3.5 A similar approach to the organization of a prosthetic hand would provide the device with its own sense array directly coupled via a complex controller to the actuators (Johnson 1969b). Sequences of grasp and manipulation will then take place as immediate consequences of the process rather than having to be programmed in advance or having to be monitored continuously by an unnatural involvement of the sensory system. The fundamental concept is that the behavioral loop needs to contain only information of immediate relevance to the task, the latter having been set by the user's behavior as a role that the prosthesis is to perform with respect to the objects encountered. The information returned to the user is a pseudo-proprioception which describes the effector action taken rather than any sensory data gathered. Very likely such mechanical changes will be apparent at the surfaces of attachment, thereby obviating the provision of artificial monitoring. Again, however, the direction of design should be clear. The control of touch and movement at the extremity is undertaken within a complex parameter space relevant specifically to those variables. Its direction from “above” is that of a changing role; its feedback report is in “words” descriptive of performance.
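A toy version of the grasp loop makes the notion of pseudo-proprioception concrete. The single aperture actuator and binary contact sense below are both invented for illustration; they are not taken from the cited prosthetics work:

```python
def grasp(object_width, max_aperture=10.0, step=0.1):
    """Close the hand until the local touch sense reports contact.
    The value returned to the user is pseudo-proprioception: the
    aperture at which the grasp settled (the effector action taken),
    not the stream of raw contact readings consumed inside the loop."""
    aperture = max_aperture
    while aperture > 0:
        contact = aperture <= object_width   # local, binary touch sense
        if contact:
            break
        aperture -= step                     # actuator command
    return round(aperture, 2)                # the "word" describing the grasp
```

Calling `grasp(6.25)` settles at an aperture of about 6.2: the object is identified by how the hand closed on it --- the “behavior word” of paragraph 2.1 --- while the contact readings themselves never leave the loop.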


4.1 If a mountaineer returns exhausted from a climb you are intending to make, questioning him about metric details of the mountain will not aid you in your desire to assess the degree of difficulty of the climb. His behavior qua exhaustion will suffice.

4.2 You wish to send a representative or ambassador to another country with a culture different from your own. Select a man who is familiar with that culture and who understands and believes in the policies he is to represent, then turn him loose and request reports only about how he put his resources into action. That is, if he is engaged in negotiations do not expect that the details of the transactions will be meaningful to you. They are to him and he is to be trusted to respond relevantly because he is the one who is in a position to assess relevance: you are not, or at least not yet. He may educate you through his actions to the relevant meaning of the data with which he deals in the context of your policies, but for the moment his decisions are the only ones in context. He is, so to speak, an extra hand.

4.3 If you are attending an athletic event with which you are unfamiliar and you wish to learn the significance of the occurring events toward winning or losing, then watch the crowd more carefully than the game. Those other spectators are self-organizing systems who can interact with the raw data of their environment, and their responses indicate the meanings of its processes.

4.4 How do you read the intentions of a mob? Everyone in the crowd is shouting something different but they all seem to be there for the same reason. What is it? If we try to listen we will hear only fragments and they will be interpreted out of context. Better: we hold up signs or otherwise convey messages to the crowd as a whole, and then interpret the response to those messages. We will have no need to listen to individuals and average or correlate out the common sense; instead, we organize the diffuse relevance of the presence of each by directing all behavior toward a common metaphor.


5.1 It has been implied above that the self-organizing interface suggested renders to a higher-stage processor “words”, in metaphors of performance, descriptive of the environment with which it deals. Suppose that our interest is directed more to emergent patterns in time of purposive behaviors. Imagine a hierarchy of such interfacing systems, each dealing with some multi-dimensional aspect of a common environment, and suppose that some overlap occurs in their parameter spaces; suppose further that their goals are set by a similar system, hierarchically “above” those of the former group, which receives as sense data their behavioral outputs. How would the supra-controller manage the spatio-temporal relations of the set of behaviors under its command? What follows is a partial and somewhat inadequate glimpse at a few of the interactions that are emerging.

5.2 For example, quite apart from the familiar non-linear “heterodyning” effect which can produce energies at sum and difference frequencies of the interacting components, the very fact of cross-coupling, through changes in the physical environment, of self-organizing devices which are otherwise unaware of each other's presence will produce a time-phasing in their behaviors which was not present in the behavior of the separate individuals. It may be too early to predict whether the result will be a kind of syntax of responses, but further efforts in this direction are under way with such an interpretation in mind. On-line application of computer control to multi-dimensional interaction patterns in real time is essential for a clear understanding and “grasp” of the concepts involved. Simulations which present displays that one may only observe passively, instead of interact with, will prove inadequate for full apprehension of so complex a system. For the purposes of study, the dialogue between machine and man must be available in a sensorimotor mode dimensionally commensurate with the complexity of the system itself.


Barron, R.L. “Self-Organizing Learning and Control Systems.” Bionics Symposium, Dayton, Ohio, 1966.

Barron, R.L. and Schalkowsky, S. “On-Line Self-Organizing Control of Multiple-Goal, Multiple-Actuator Systems.” 1967 Joint Automatic Control Conference, Univ. of Pennsylvania, 1967.

Brodey, W.M. and Johnson, A.R. “A Visual Prosthesis that Looks.” 2nd Conf. on Visual Prosthesis, Assoc. for Computing Machinery, Chicago (in press), 1969.

Gibson, J.J. The Senses Considered as Perceptual Systems. New York: Houghton Mifflin, 1966.

Gurwitsch, A. The Field of Consciousness. Pittsburgh: Duquesne Univ. Press, 1964.

Held, R. “Plasticity in Sensory-Motor Systems.” Scientific American, Vol. 213, No. 5 (1965).

Hermann, H.T. and Kotelly, J.C. “An Approach to Formal Psychiatry.” Perspectives in Biology and Medicine, Vol. 10, No. 2 (1967).

Johnson, A.R. “A Structural, Preconscious Piaget: Heed Without Habit.” Proc. Nat'l. Electronics Conf., Vol. 23 (1967).

Johnson, A.R. “Organization, Perception, and Control in Living Systems.” Indus. Management Rev., Vol. 10, No. 2, M.I.T. (1969a).

Johnson, A.R. “Self-Organizing Control in Prosthetics.” 3rd Int'l. Symp. on External Control of Human Extremities, Dubrovnik, Yugoslavia (in press), 1969b.