How fundamental is information?
started 5/9/2001; 4:46:16 AM  last post 5/11/2001; 7:01:30 PM


Ehud Lamm  How fundamental is information?
5/9/2001; 4:46:16 AM (reads: 598, responses: 4)


How fundamental is information? 
Language, whether you think about syntax or semantics, is about information. This essay raises some questions about what information really is.
I recommend that anyone interested in modern science and thought read Shannon's classic paper, which created information theory.
I am among those who think that algorithmic information theory (Kolmogorov complexity) can be useful in answering questions about the relative complexity of organisms, etc. But one should be careful not to reason in circles (i.e., not to assume in advance that we already know what is more complex).
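Kolmogorov complexity itself is uncomputable, but a common practical proxy is the length of a compressed encoding, which gives a crude upper bound. A minimal sketch (the helper name and test strings are my own illustration, not anything from the thread):

```python
import os
import zlib

def complexity_proxy(data: bytes) -> int:
    """Length of the zlib-compressed form: a crude, computable
    upper bound on the (uncomputable) Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

repetitive = b"AB" * 500        # highly regular, 1000 bytes
random_ish = os.urandom(1000)   # incompressible with high probability

print(complexity_proxy(repetitive))  # small: the pattern compresses well
print(complexity_proxy(random_ish))  # near 1000: little structure to exploit
```

This is exactly where the circularity warning bites: the proxy only measures regularity relative to a fixed compressor, not "complexity" in any absolute biological sense.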
I once thought it was possible to create a sound theory of meaning (as opposed to information). I am not sure how I feel about it today. It seems a bit naive.
Posted to general by Ehud Lamm on 5/9/01; 4:46:36 AM




water  Re: How fundamental is information?
5/9/2001; 9:30:52 AM (reads: 624, responses: 2)


I'm working on an alternative viewpoint on information theory which does a little to support the idea of semantics. This work is Arrow, which is being coordinated with and for The TUNES Project. Basically, I take arrows (objects with a standard interface of 2 references, much like CONS cells, only information-centric instead of construction-centric) rather than bits as fundamental. My position paper is lacking, though, due to my lack of time for the project at the moment. However, this will change shortly. Anyway, the TUNES list has discussed this kind of thing several times.
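The Arrow design itself isn't spelled out in this post, but the basic object it describes can be sketched minimally: an object with exactly two references, like a CONS cell, except that arrows are read as relations and can themselves be referenced by other arrows. The class and example below are purely illustrative assumptions, not TUNES' actual design:

```python
class Arrow:
    """An object with a standard interface of exactly two references,
    analogous to a CONS cell but read as a directed relation rather
    than a construction step. Illustrative only."""
    __slots__ = ("head", "tail")

    def __init__(self, head, tail):
        self.head = head
        self.tail = tail

# Arrows can reference other arrows, so relations themselves
# become first-class subjects of further relations.
a = Arrow("socrates", "mortal")
b = Arrow(a, "asserted")  # an arrow about an arrow
print(b.head.head)        # -> socrates
```

The point of the exercise is the second arrow: statements *about* information get the same representation as the information itself, which is where the claimed support for semantics comes from.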


andrew cooke  Re: How fundamental is information?
5/9/2001; 9:37:11 AM (reads: 652, responses: 1)


So how do you represent a bit? As an arrow from a bit to the same bit? Doesn't that still beg the question?
Maybe you really mean that the fundamental representation is bits plus arrows? Or rather that you're representing a subclass of information (the relationship between "simple" sets; maybe I'm naive in thinking you can argue one is a subset of the other).


Oleg  Re: How fundamental is information?
5/11/2001; 7:01:30 PM (reads: 612, responses: 0)


<cite>I am among those who think that algorithmic information theory
(Kolmogorov complexity) can be useful in answering questions about
relative complexity of organisms etc.</cite>
Related to this is a question that is at the heart of the modern
debate with "creationist science": is the sequence of evolutionary
events the result of blind chance or of "intelligent design"? In a
more mathematical formulation: how can one tell whether a sequence of
numbers a[n] is truly random or can be generated by some deterministic
function f[n]?
This is a difficult question. Statistical tests are of little help:
deterministic pseudorandom generators can yield random-looking
results. Such "designed" algorithms as "decimal digits of PI" produce
irregular, random-looking, infinite sequences. Another example is the
Mandelbrot set: exquisitely complex, yet generated by one of the
simplest mathematical processes.
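In the same spirit as those examples, the logistic map is a one-line deterministic rule whose output at r=4 looks statistically random; a simple frequency test cannot distinguish it from chance (the function and test here are my own illustration):

```python
def logistic(x0: float, n: int, r: float = 4.0) -> list:
    """Iterate x -> r*x*(1-x), a trivially short deterministic rule
    that, at r=4, produces a chaotic, random-looking sequence."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

seq = logistic(0.123, 10_000)

# A naive frequency test: roughly half the values fall below 0.5,
# just as one would expect from a genuinely random sequence.
frac = sum(v < 0.5 for v in seq) / len(seq)
print(frac)
```

A test this crude says nothing about the generating process: a ten-line program reproduces the entire sequence exactly, which is precisely the gap Kolmogorov's definition below tries to formalize.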
Kolmogorov suggested that a sequence can be called random if no
description of it is shorter than the direct enumeration of its
elements. Alas, young Chaitin showed that this test is undecidable.
BTW, this example indicates how profoundly stupid questions such as
"given a sequence of several numbers, tell the next number" are; they
are typical of SAT, IQ, and other standardized tests. Any continuation
at all can be justified by some generating rule.


water  Re: How fundamental is information?
5/12/2001; 8:51:02 AM (reads: 708, responses: 0)


Representation in Arrow is relative; consider embeddings of elements in category theory. The implicit idea in bit-wise information theory is that finite inductive representations are all that need to be dealt with, and that no other abstraction can be handled as transparently.
But more to your point: selecting either reference of an arrow is the analogue of one bit of information, and a bit sequence is a concatenated chain of such selections.
As for the subset (or class) proposition, I'd rather say that bits force you to represent things with finite inductive forms, meaning that every type is non-transparently coerced into a fundamentally finite inductive type.
Finally, it is part and parcel of the arrow theory that information has no universal class: no single representation or perspective holds more appropriately than any other. Much of this comes from category theory, though category theory is not general enough.



