Some gave highly non-Von-Neumannesque examples of computation which seem to escape any such attempt at a definition. While the ``magic paper'' account was intended to abstract as far as possible from low-level minutiae of how things are implemented, even talk of ``patterns'' may be too much for some. Rob Stufflebeam suggested that we simply define a computer as broadly as possible:
[Stufflebeam:] Something is a computer just in case it can be described as satisfying or implementing some function F. Since practically everything in the universe can be described as implementing some function or other, it's hard to imagine a more general - and unambiguous - definition.

Selmer Bringsjord replied (correctly, in my view) that this was too general, and that many real computers do not in fact compute functions, but perform such tasks as real-time control of ongoing processes.
McKee also offered a very general definition: something is a computer just when it can be programmed. This seems to concede Searle's view of original intentionality rather directly, until one realizes that ``programmed'' here means only that information from somewhere else influences the computer's behaviour:
[George McKee:] All that is needed is compartmentalization between programmer and target systems. One can imagine a purely biochemical computer that operates via enzyme-linked metabolic networks. The elements of these networks exist in the same (unpartitioned) reaction cell, operating by transforming one chemical species into another. The programming module could operate over one set of species (consider ``aminergic'') and the target system could operate over a different set (consider ``cholinergic'') as long as there were a defined (set of) communication channel(s) between them.

McKee finds that this recasting means ``it's an exciting time to be doing research.'' I confess I find the prospect depressing, but readers must make their own judgements.
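McKee's architecture - a programming module and a target system sharing one unpartitioned medium but operating over disjoint sets of species, coupled by a defined channel - can be caricatured in a few lines of code. Everything below (the species names, the update rule) is an illustrative assumption, not drawn from McKee:

```python
# Toy caricature of McKee's compartmentalized biochemical computer:
# one unpartitioned "reaction cell" (a shared dict of species counts),
# a programming module that writes only an "aminergic" species, and a
# target system that transforms "cholinergic" species at a rate
# modulated by that signal. All names here are illustrative.

cell = {"amin_signal": 0, "chol_a": 5, "chol_b": 0}

def program(cell, level):
    """Programming module: influences behaviour only via the aminergic species."""
    cell["amin_signal"] = level          # the defined communication channel

def target_step(cell):
    """Target system: transforms one cholinergic species into another."""
    rate = 1 + cell["amin_signal"]       # behaviour set by the programmer
    moved = min(cell["chol_a"], rate)
    cell["chol_a"] -= moved
    cell["chol_b"] += moved

program(cell, 2)    # "reprogram" the network without partitioning the cell
target_step(cell)
print(cell)
```

On McKee's reading, the system counts as programmed simply because information from the programming module influences the target system's behaviour through the channel, even though both sets of reactions run in the same cell.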
Another example might be the biochemical computers that use DNA and the polymerase chain reaction (PCR) to achieve parallelism dozens of orders of magnitude greater than has been achieved in silicon-based computers. These systems have significant problems with input/output and with universality, but instances have actually been built and used to solve small instances of NP-complete problems.
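The strategy behind these experiments (Adleman's Hamiltonian-path computation is the standard instance) is generate-and-filter: ligation forms every candidate path at once in solution, and PCR and gel-electrophoresis steps filter out the non-solutions. A sketch of that strategy, with an ordinary loop standing in for the molecular parallelism and an assumed toy graph:

```python
# Generate-and-filter sketch of an Adleman-style DNA computation for the
# Hamiltonian path problem. In the wet lab, ligation generates all
# candidate paths in parallel and PCR/gel steps do the filtering; here a
# brute-force loop stands in for that massive parallelism.
from itertools import permutations

edges = {(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)}   # assumed toy graph
nodes = range(4)

def hamiltonian_paths(nodes, edges):
    # "Ligation": form every ordering of the nodes (DNA does this at once).
    for path in permutations(nodes):
        # "Filtering": keep orderings whose consecutive pairs are all edges.
        if all((a, b) in edges for a, b in zip(path, path[1:])):
            yield path

print(list(hamiltonian_paths(nodes, edges)))
```

The i/o problem mentioned above is visible even in the caricature: reading the answer out of the final test tube is the hard, slow step, not the parallel search itself.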
This means that in order to have a notion of computation that is rigorously consistent with all that we know about the physical world, it is necessary to recast most of the common notions about finite state machines and truth-functional operators in terms of continuum mathematics and probability distributions.
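What such a recasting looks like in miniature: a truth-functional operator becomes an operation on probability distributions over continuous physical quantities, correct only with some probability. The voltage levels and Gaussian noise model below are assumptions for illustration, not a claim about any particular hardware:

```python
# A NAND gate recast in continuum terms: its inputs are logical bits
# realized as voltages with Gaussian noise, so the gate's output is
# correct only with some probability, estimated here by Monte Carlo.
# Voltage levels and noise sigma are illustrative assumptions.
import random

V_LOW, V_HIGH, THRESHOLD, SIGMA = 0.0, 1.0, 0.5, 0.2

def noisy(bit):
    """A logical bit realized as a noisy continuous voltage."""
    return (V_HIGH if bit else V_LOW) + random.gauss(0, SIGMA)

def nand_gate(va, vb):
    """Threshold the continuous inputs, then apply ideal NAND."""
    return not (va > THRESHOLD and vb > THRESHOLD)

def error_rate(a, b, trials=100_000):
    """Probability that the physical gate disagrees with the ideal one."""
    ideal = not (a and b)
    wrong = sum(nand_gate(noisy(a), noisy(b)) != ideal for _ in range(trials))
    return wrong / trials

random.seed(0)
print(f"P(error | inputs 1,1) = {error_rate(1, 1):.3f}")
```

The familiar finite-state description reappears only as a limit: as the noise shrinks relative to the gap between voltage levels, the error probability goes to zero and the distribution collapses onto the two discrete truth values.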