Systemic Computation is a bio-inspired computational paradigm.
Natural systems provide unique examples of computation in a form very different from contemporary computer architectures. Biology also demonstrates capabilities such as adaptation, self-repair and self-organisation that are becoming increasingly desirable for our technology. To bridge this gap, Peter Bentley created a new computer model and architecture with natural characteristics. Systemic computation is Turing-complete; it is designed to support biological algorithms such as neural networks, evolutionary algorithms and models of development, and shares the desirable capabilities of biology not found in conventional architectures.
Systemic computation recognises the practical incompatibilities between conventional computation and natural computation as performed by biological systems; the table below summarises the contrast.
Features of conventional vs natural computation

| Conventional | Natural |
|---|---|
| Deterministic | Stochastic |
| Synchronous | Asynchronous |
| Serial | Parallel |
| Heterostatic | Homeostatic |
| Batch | Continuous |
| Brittle | Robust |
| Fault-intolerant | Fault-tolerant |
| Human-reliant | Autonomous |
| Limited | Open-ended |
| Centralised | Distributed |
| Precise | Approximate |
| Isolated | Embodied |
| Linear causality | Circular causality |
| Simple | Complex |
Systemic computation is a method of computation (a paradigm), a computer architecture (realised both as virtual machines on conventional computers and as physical electronic designs), a formalism (a calculus) and a modelling approach (a programming language, graph notation, and holistic analysis and visualisation tools), all designed to support natural computation. Conventional computers are useful for deterministic, serial, centralised computation. Systemic computers are useful for applications involving stochastic, highly parallel, decentralised computation (with all the features listed in the right-hand column of the table above).
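To give a feel for the stochastic, decentralised style of computation described above, the toy sketch below (not Bentley's actual architecture or calculus, just an illustrative assumption-laden analogy) treats each value as an independent "system" and lets randomly chosen pairs interact under a single "context" (here, addition). Because the interaction is commutative and associative, the result is the same regardless of the random scheduling order, illustrating how a computation can be robust to asynchronous, non-deterministic execution:

```python
import random

def systemic_sum(values, seed=None):
    """Toy model of stochastic, decentralised computation:
    'systems' (numbers) interact pairwise in a random order under one
    'context' (addition) until a single system remains."""
    rng = random.Random(seed)
    systems = list(values)
    while len(systems) > 1:
        # Stochastic scheduling: pick two distinct systems at random.
        i, j = rng.sample(range(len(systems)), 2)
        merged = systems[i] + systems[j]  # the context's transformation
        # The two interacting systems are replaced by their product system.
        systems = [s for k, s in enumerate(systems) if k not in (i, j)]
        systems.append(merged)
    return systems[0]

print(systemic_sum([1, 2, 3, 4, 5]))  # 15, whatever the interaction order
```

In a real systemic computer the scheduling is genuinely parallel and the contexts are themselves systems, but even this sequential simulation shows the order-independence that makes such computation robust.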