Site operation discussions
This is sort of silly, but just plain cool: Eli Fox-Epstein encoded Rule 110, an elementary cellular automaton, in HTML5 and CSS3. Rule 110 is Turing complete.
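For anyone who hasn't met it, here's a minimal sketch of what Rule 110 actually computes, written as ordinary Python rather than CSS (this is just the rule itself, not Eli Fox-Epstein's encoding):

```python
# Rule 110: each cell's next state is a function of its 3-cell
# neighborhood (left, self, right). The rule number's binary digits,
# 110 = 0b01101110, give the new state for each of the 8 neighborhoods.
RULE = 110

def step(cells):
    """Advance one generation; cells is a list of 0/1, edges wrap around."""
    n = len(cells)
    return [
        (RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# A single live cell grows a pattern leftward over successive generations.
row = [0] * 10 + [1]
for _ in range(5):
    print("".join(".#"[c] for c in row))
    row = step(row)
```

The CSS3 version expresses the same neighborhood lookup with selectors and checkbox state instead of arithmetic, which is what makes it such a neat trick.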
See one of his example tests on GitHub.
Information wants to be free. Computation wants to diverge.
Information wants to hide in a sea of useless data. Or at least it seems that way when working on robotic perception...
I guess trying to analyze programs to find semantics is in the same category as robotic perception.
Principles of action tend to be hidden in a sea of useless implementation details. Programming languages tend to specify HOW a thing is to be done, so WHAT is done becomes hard to observe.
The classic example is sorting, I guess. The semantics of sort is that you take orderable input in any sequence and produce sequenced output such that (a) each input element also appears in the output and (b) the output elements appear in nondecreasing order. That's what you're trying to "perceive" when you look for program semantics. But you have to fish that out from a sea of algorithmic implementation details, so many of them that Knuth wrote a whole volume about Sorting and Searching.
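The two clauses above can be written down as a checkable predicate, which is one way to see the WHAT/HOW split. A sketch in Python (the predicate is my own illustration, not anything from the post's sources): any sorting algorithm, whatever its internals, should satisfy it.

```python
# The semantics of sort as a property, independent of any algorithm:
# (a) every input is also an output (with multiplicity), and
# (b) the outputs appear in nondecreasing order.
from collections import Counter

def is_valid_sort(inp, out):
    same_elements = Counter(inp) == Counter(out)          # clause (a)
    in_order = all(a <= b for a, b in zip(out, out[1:]))  # clause (b)
    return same_elements and in_order

assert is_valid_sort([3, 1, 2], sorted([3, 1, 2]))
assert not is_valid_sort([3, 1, 2], [1, 2])        # drops an element
assert not is_valid_sort([3, 1, 2], [3, 1, 2])     # right elements, wrong order
```

Quicksort, mergesort, and heapsort all satisfy this same predicate; the predicate is the semantics, and everything that distinguishes them is implementation detail.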
And as the title of that book implies, sort usually isn't the semantics of the program itself. The sort happens only to facilitate a search, and returning the object of the search is the semantics of a slightly larger part of the program. And the search likewise is usually an implementation detail of some yet larger semantics.
Finding the semantics of a program, as opposed to a particular method or algorithm used to implement those semantics, is hard.
I wanted to use the examples for some security discussion, but it seems he's pulled the GitHub repo, and his page no longer discusses the topic. So we'll have to wait for someone to illustrate and document it.
This Stack Overflow thread links to a jsFiddle that reproduces the technique (direct link), and here's a video of the original.
Er, in fact, here's the original!