Not that one, the other one!

"Algebraic Structure Theory of Sequential Machines" will seem hopelessly out of date to some people, but the contemporary interest in concurrency gives the book a newfound importance. I have a complete searchable scan of the book that I would post online if I knew how to get permission from Prentice-Hall. Does anyone have any ideas?

Here is the very interesting preface of the book.

The explosive development of information-processing technology during
the last two decades has stimulated the vigorous growth of an Information
Science. This new science is primarily concerned with the study of information
and the laws which govern its processing and transmission. A very
active part of this science is the study of sequential machines or finite automata
which are abstract models of digital computers. The aim of this research
is to provide a basic theoretical background for the study of digital computers
and to contribute to a deeper understanding of discrete or finite
information-processing devices. This area of research was started around
1954 by D. A. Huffman and E. F. Moore and has since undergone a considerable
growth in several diverse directions. In the period from 1960 to
1965, a body of results we call "structure theory" was created and developed
to a considerable degree of completeness and unity. This book is an exposition
on the foundations, techniques, and applications of this theory.

By a structure theory for sequential machines, we mean an organized
body of techniques and results which deal with the problems of how sequential
machines can be realized from sets of smaller component machines, how
these component machines have to be interconnected, and how "information"
flows in and between these machines when they operate. The importance of
machine structure theory lies in the fact that it provides a direct link between
algebraic relationships and physical realizations of machines. Many structure
results describe the organization of physical devices (or component machines)
from which a given machine can be synthesized. Stated differently, the
structure theory describes the patterns of possible realizations of a machine
from smaller units. It should be stressed, however, that although many
structure theory results describe possible physical realizations of machines,
the theory itself is independent of the particular physical components or
technology used in the realization. More specifically, this theory is concerned
with logical or functional dependence in machines and studies the information
flow of the machine independently of how the information is represented
and how the logical functions are to be implemented.

The mathematical foundations of this structure theory rest on an algebraization
of the concept of "information" in a machine and supply the algebraic
formalism necessary to study problems about the flow of this information
in machines as they operate. The formal techniques and results are
very closely related to modern algebra. Many of its results show considerable
similarity with results in universal algebra, and some can be directly derived
from such considerations. Nevertheless, the engineering motivation demands
that this theory go its own way and raises many problems which require
new mathematical techniques to be invented that have no counterpart in
the development of algebra. Thus, this theory has a characteristic flavor and
mathematical identity of its own. It has, we believe, an abstract beauty
combined with the challenge and excitement of physical interpretation and
application. It falls squarely in the interdisciplinary area of applied algebra
which is becoming a part of engineering mathematics.

This book is intended for people interested in information science
who have either an engineering or mathematical background. It can be read
by anyone who has either some mathematical maturity, achieved through
formal study, or engineering intuition developed through work in switching
theory or experience in practical computer design.

Enough concepts of machine theory and machine design are introduced
in the first chapter so that a mathematician may read the book without any
experience with computers or switching theory. A preliminary chapter on
basic algebraic concepts supplies enough mathematics to make the book
self-contained for a non-mathematician. A good number of examples are
given to supply the engineer with an interpretation or application of the


Nondeterministic state machines cannot implement concurrency

Nondeterministic state machines cannot implement concurrency.

For example, the following Actor system can compute an integer of unbounded size:

Unbounded ≡ 
   Start[ ]→
     Let aCounter←CreateCounter.[ ]  
        Prep aCounter.Go[ ], answer←aCounter.Stop[ ]  
CreateCounter.[ ] ≡
  Actor thisCounter  
      with count≔0, continue≔true 
      implements Counter using
         Stop[ ]→ count afterward continue≔false       
         Go[ ]→  continue ¿ True ↝ Exit thisCounter.Go[ ] afterward count≔count+1;
                            False ↝ Void ? 
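For readers who don't follow the notation, here is a rough Python sketch of the same program; the names are mine, a `threading.Lock` stands in for message-ordering arbitration, and this is an illustration of the idea rather than a faithful Actor semantics:

```python
import threading

class Counter:
    """Sketch of the Counter actor: Go keeps incrementing until Stop
    flips the running flag; the lock serializes the two "messages"."""
    def __init__(self):
        self.count = 0
        self.running = True
        self.lock = threading.Lock()

    def go(self):
        while True:
            with self.lock:
                if not self.running:
                    return
                self.count += 1

    def stop(self):
        with self.lock:
            self.running = False
            return self.count

def unbounded():
    """Start: run Go concurrently, then send Stop. Every run returns a
    finite integer, but no bound on the result can be given in advance."""
    counter = Counter()
    worker = threading.Thread(target=counter.go)
    worker.start()
    answer = counter.stop()
    worker.join()
    return answer

print(unbounded())
```

Each run halts with some finite count, yet the race between Go and Stop means no natural number bounds the possible outputs, which is exactly the property at issue below.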

However, a nondeterministic state machine cannot implement the above actor system. [Plotkin 1976] gave an informal proof as follows:

Now the set of initial segments of execution sequences of a
given nondeterministic program P, starting from a given state, will
form a tree. The branching points will correspond to the choice
points in the program.  Since there are always only finitely many
alternatives at each choice point, the branching factor of the tree
is always finite.  That is, the tree is finitary. Now König's lemma
says that if every branch of a finitary tree is finite, then so is
the tree itself. In the present case this means that if every
execution sequence of P terminates, then there are only finitely many
execution sequences. So if an output set of  P is infinite, it must
contain a non-terminating computation.
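Plotkin's argument can be made concrete with a toy enumeration of an execution tree. In this sketch (the program and its depth bound are invented for illustration), each state has finitely many successors and every branch halts, so, as König's lemma predicts, only finitely many execution sequences exist:

```python
def executions(choose, state, trace=()):
    """Enumerate all complete execution sequences of a nondeterministic
    program. choose(state) returns the finite list of successor states,
    or [] when the program halts in that state."""
    successors = choose(state)
    if not successors:                 # leaf: a terminating execution
        yield trace + (state,)
    else:
        for s in successors:
            yield from executions(choose, s, trace + (state,))

def bounded_choice(n):
    """Binary choice at each step over heap-style indices, forced to
    halt once n reaches 3, so every branch of the tree is finite."""
    return [2 * n + 1, 2 * n + 2] if n < 3 else []

traces = list(executions(bounded_choice, 0))
print(len(traces))   # → 4: finitely many execution sequences
```

To make the output set infinite, `choose` would have to admit at least one path that never returns `[]`, i.e. a non-terminating computation, which is the contrapositive Plotkin uses.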


I don't get it. What is the point?

Hartmanis & Stearns theory is *not* about concurrency

Hartmanis & Stearns theory concerns non-deterministic state machines, and consequently it is *not* about concurrency.

This is a point of view I

This is a point of view I don't agree with, but I would be interested to know whether this is a published or documented argument, and where the references are. The circuits in the book synchronize because of circuit delays; they work for the same reason linear analog circuits work. It is ironic to say this, but the math works because of state and measure. For someone like me this is so commonsense that I don't know what to say to people who don't get it; I need to see the problem the way they see it in order to compose a suitable answer. Also, in modern practice the circuits in H&S would be implemented with clocks and would be clearly deterministic.

Arbiters cause indeterminacy in computation

Arbiters cause indeterminacy in computation.

See the following publication: Formalizing common sense for scalable inconsistency-robust information integration


If you need arbiters you don't have a serial/parallel decomposition. The whole point of H&S is to solve the mathematical problem of a serial/parallel decomposition. The result is a concurrent "system" in space and time. The implementation doesn't make a difference; software works as well as anything else.

Since I can't reference H&S, the basic math can be found in Burris and Sankappanavar, paragraph 4, page 17. A serial/parallel decomposition is compact.
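Since the book itself is hard to cite, a minimal sketch may help show what a serial (cascade) decomposition means; the mod-4 counter below and all the names are my own illustration, not H&S's notation:

```python
def step_front(a, x):
    """Front component: the low bit, depending only on its own state
    and the input x (both 0 or 1)."""
    return (a + x) % 2

def step_tail(b, a, x):
    """Tail component: the high bit, which also reads the front
    component's state a (the carry) -- the "serial" connection."""
    return (b + (a & x)) % 2

def step_cascade(state, x):
    """One step of the cascade machine; the tail sees the front's
    current state before the front updates."""
    a, b = state
    return (step_front(a, x), step_tail(b, a, x))

def decode(state):
    """Recover the realized machine's state: a mod-4 count."""
    a, b = state
    return 2 * b + a

# The cascade realizes the mod-4 counter step by step.
state, count = (0, 0), 0
for x in [1, 1, 0, 1, 1, 1]:
    state, count = step_cascade(state, x), (count + x) % 4
    assert decode(state) == count
print("cascade realizes the mod-4 counter")
```

The point of the structure theory is that such a decomposition exists exactly when the machine's state set admits a partition with the substitution property; here the partition is by the low bit, and information flows one way, front to tail.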

Compactness in logic is undesirable in Computer Science

That a computer provides service (i.e. ∃[i:ℕ]→ ResponseBefore[i]) cannot be proved in a theory with compactness.

Proof: In order to obtain a contradiction, suppose that it is
possible to prove the theorem that a computer server provides service
(∃[i:ℕ]→ ResponseBefore[i]) in a compact theory T. Then the
following infinite set of sentences is inconsistent with T:
([i]→ ⌈¬ResponseBefore[i]⌉)⟦ℕ⟧. By the definition of compactness,
it follows that there is a finite subset of this set of sentences
that is inconsistent with T. But this is a contradiction, because
every finite subset is consistent with T, since the amount of time
before a server responds is unbounded
(∄[i:ℕ]→ ⊢T ResponseBefore[i]).
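In more conventional logical notation (my restatement, not the original's), the argument runs:

```latex
% Restatement of the compactness argument in standard notation.
\noindent\textbf{Claim.} Let $T$ be a compact theory such that, for each
fixed $i \in \mathbb{N}$, $T \nvdash \mathrm{ResponseBefore}(i)$. Then
$T \nvdash \exists i\,.\,\mathrm{ResponseBefore}(i)$.

\medskip
\noindent\textbf{Proof sketch.} Suppose
$T \vdash \exists i\,.\,\mathrm{ResponseBefore}(i)$, and let
$\Gamma = \{\neg\,\mathrm{ResponseBefore}(i) : i \in \mathbb{N}\}$.
Then $T \cup \Gamma$ is inconsistent, so by compactness some
$T \cup \Gamma_0$ with $\Gamma_0 \subseteq \Gamma$ finite is already
inconsistent. But $\Gamma_0$ mentions only finitely many $i$, and since
the response time is unbounded, each such finite set is consistent
with $T$ --- a contradiction. \qed
```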

Existentially closed

Existentially closed discursive linguistics: Proof is just a weapon, it has no significance in itself. Proof without an analytic/synthetic distinction is meaningless.

Note: Added a sentence.

Existentially closed minds

A mathematical proof is a very strong argument that has significance in itself.