
An alternative account

Here I will give a sketch of an alternative account of what it is to be a computer.

Leaving open the question of whether a nervous system is a computer, we can observe that artificial computers have two parts: a memory and a processor. Turing machines model this in the familiar way. The ``state-transition'' way of describing computation focusses on the processor, relegating the memory to an aspect of the state. Suppose, however, that we focus instead on the memory.

A computer's memory contains patterns (including the ``blank pattern'') which are stable but labile, and it has the rather special property that changes to the patterns are under the control of other patterns: that is, some of them describe changes to be made to others; and when they do, the memory changes those patterns in the way described by the first ones. One kind of change, for example, is that a pattern shall be created which denotes some property of another pattern, such as the number of symbols in it, or the number of prime factors its denotation has when it is considered to be a numeral. In this account, then, the central notions are those of patterns, changes to patterns, syntactic properties and denotations of patterns. A computer is a machine which is so constructed that patterns can be put in it, and when they are, the changes they describe will in fact occur to them. Notice that this is a perfectly accurate description of a Von Neumann machine.
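To make this concrete, here is a minimal sketch in Python (the function and instruction names are invented for illustration, not part of any actual machine): the memory is a list of patterns, and an instruction pattern describes a change to be made to another, here the article's own example of creating a pattern which denotes the number of symbols in another.

    def run(memory):
        """Apply every instruction pattern in memory to the pattern it names."""
        for pattern in list(memory):
            if isinstance(pattern, tuple) and pattern[0] == "count-symbols":
                # Create a new pattern denoting a syntactic property of
                # another pattern: the number of symbols in it.
                _, target = pattern
                memory.append(str(len(memory[target])))
                # The instruction has done its work and is removed.
                memory.remove(pattern)
        return memory

    # The instruction describes a change: make a new pattern denoting
    # the length of the pattern at index 0.
    print(run(["hello", ("count-symbols", 0)]))   # -> ['hello', '5']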

If it were paper, it would be ``magic paper'' on which writing might spontaneously change, or new writing appear. If some of the written text is about the writing itself, the text will rearrange itself so that the assertions made about it become true. Write ``Fish are lackiing in insight,'' then write ``Change `ii' to `i'''. The first sentence would then become correct, and the second would vanish.
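The fish example can be sketched in the same hypothetical vein: an edit instruction written on the ``paper'' is applied to the rest of the text and then erased, so that the assertion it makes becomes true.

    import re

    def magic_paper(lines):
        """Apply any Change-X-to-Y line to the other lines, then erase it."""
        instr = None
        for line in lines:
            m = re.match(r"Change '(.+)' to '(.+)'", line)
            if m:
                instr, old, new = line, m.group(1), m.group(2)
        if instr is None:
            return lines
        # The assertion the instruction makes becomes true, and the
        # instruction itself disappears from the paper.
        return [line.replace(old, new) for line in lines if line is not instr]

    paper = ["Fish are lackiing in insight.", "Change 'ii' to 'i'"]
    print(magic_paper(paper))   # -> ['Fish are lacking in insight.']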

Different kinds of computer can be defined abstractly in terms of the formal syntax of their pattern language, the kinds of syntactic change they can perform, and the semantic theory which relates the former to the latter. I have been careful to avoid calling these patterns ``symbols,'' but they have a syntax and some have denotations. Moreover, if we put such a computer in a `grounding' situation, some of the patterns can become causally related to aspects of the physical environment in systematic ways. For example, we might imagine a memory machine attached to a blood-pressure monitoring machine in such a way that the patterns are numerals which denote the pressure, ordered in the memory so as to denote the temporal history. Other symbols might specify sequences of syntactic changes which produce new numerals denoting the rates of change of blood pressure, and so on. Notice that it is not particularly strange to refer to the denotations of the symbols.
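A hypothetical sketch of such a grounded memory (the readings are invented for illustration): numerals in memory denote successive pressure readings, their order in memory denotes the temporal history, and a purely syntactic operation over them produces new numerals which, under the intended semantics, denote rates of change.

    # Hypothetical readings supplied by a blood-pressure monitor; the
    # numerals denote the pressure, and their order denotes the history.
    readings = ["120", "124", "131", "129"]

    # A purely syntactic change: from pairs of numerals, produce new
    # numerals which denote the rates of change of blood pressure.
    rates = [str(int(b) - int(a)) for a, b in zip(readings, readings[1:])]
    print(rates)   # -> ['4', '7', '-2']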

(Istvan Berkeley objected to ``pattern'': [Berkeley:] It seems to me that the term ``patterns'' is horrifically imprecise, given the importance that the term plays in the thesis - after all, wallpaper can be said to have patterns!

He suggests that we substitute ``representations,'' understood in a Dretskian fashion, i.e., an item which indicates how things stand with respect to some object, condition or magnitude.

I use ``pattern'' more or less as a synonym for ``symbol,'' while trying to indicate that such patterns might have complex and non-language-like syntax (what computer science calls ``data structures''), and that there is no presupposition that these symbols are meaningful. Particular computers may be configured so that some of the data structures denote or represent, but a computer programmed merely to play around with meaningless empty lists is still a computer, for all that.)

``Magic paper'' has some potential for being a more rewarding way of characterising computation, for several reasons.

First, it concedes the fact that computers manipulate formal patterns according to their form, without thereby also conceding that a full account of the computational process cannot or must not refer to the denotations of the symbols.

Second, it abstracts completely from any particular implementation structure. It is a higher-level description, referring only to the syntax of the expressions in memory, not the engineering minutiae.

Third, this account makes symbols - or patterns, at least - a central issue in computation, which they surely are. This is what people have always meant by the term, ever since a time when to be a computer was a matter of professional academic pride. That is why machines like EDSAC and JOHNNIAC were called computers: they were the first machines that could clearly perform symbolic computations.

Fourth, it emphasises that being a computer involves more than moving appropriately between states. Many state-transition machines are not computers, not because they fail to fit the finite-state-machine description, but because they do not encode symbols: nothing in them has a syntax which refers to their own state-transitions.

This is worth looking at more carefully, since in a sense this condition is trivially satisfied. In Putnam's spirit, we can define each state as being a `pattern' which denotes the transition that immediately follows it: the pattern-language then has a trivial, empty syntax, consisting only of a set of discrete symbols. Any finite-state system is then a ``computer'' in our sense; but note that the ``memory'' of such a computer has a recognisably trivial syntax, no matter how many states the machine may have. However complex the machine, it remains trivial as a computer unless it has a nontrivial memory language, so that it becomes useful to describe it as containing patterns.
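The degenerate case can itself be sketched (again hypothetically): each `pattern' is a bare state symbol whose sole denotation is the transition that follows it, so the memory language has no syntax at all, however many states we add.

    # Each 'pattern' is a structureless state symbol whose only
    # denotation is the transition out of that state.
    transitions = {"s0": "s1", "s1": "s2", "s2": "s0"}

    state = "s0"
    for _ in range(4):
        # The 'memory' holds one discrete symbol; 'reading' it can do
        # nothing except select the next state.  Adding more states
        # never makes this memory language any less trivial.
        state = transitions[state]
    print(state)   # -> 's1'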

Here ends the target article.

The target article gave rise to several issues in the subsequent discussion, beyond those in the comments already inserted above.

