Synchronic Computation

May I draw members' attention to a novel approach to spatial programming, and a parallel formal model of computation. As far as I am aware, no pure λ-calculus interpreter and simulator for a high-level programming language such as Haskell has been constructed that has viable time and space complexity characteristics. That difficulty lends support to the spatial approach, which questions the outlook that software may be studied as a pure discipline in isolation from hardware. An 8-page summary paper on Synchronic Computation may be accessed at http://arxiv.org/abs/1008.1673. Further links to reports may be found at http://www.isynchronise.com, and links to discussions at http://www.linkedin.com/groupRegistration?gid=2842794.

Problems.

From the "intro to casual reader": "The grammar of any natural language is based on a hierarchical tree structure, whose tips are words in the language, and whose nodes represent various grammatical parts of speech such as main clause, relative clause, noun phrase, verb phrase, etc."

This is just a model of grammar, convenient enough to be used widely.

Also, just yesterday I happened to search for hypergraph grammar parsing and encountered this interesting paper: Parsing String Generating Hypergraph Grammars. The main idea there is to create a hypergraph structure for the words in a sentence and then infer information by parsing the hypergraph.

I think this somewhat counters the existence of a "serious defect" in natural languages. ;)

I set aside the energy problems with the Synchronic A-Ram, which allows any register to exchange information with any register in constant time, and is globally clocked.

[no longer relevant]

Since this gained a response, the subject line was: 'Serguey, fix your italics tag!' (due to a missing closing tag)

LTU needs new software.

Graphs, parallelism, and physics.

Hi Serguey, many thanks for your response. Natural language representations can be subjected to semantic processing and transformed into graphs (there is an extensive historical literature on semantic networks) that are not subject to what I term the Single Parent Restriction (SPR) associated with tree syntax. Semantic networks and graph rewriting systems, however, did not lead to mainstream forms of deduction and computing, partly because deduction/rewriting rules require, in general, the solution of the NP-complete subgraph isomorphism problem.
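To give a feel for why rule matching is costly, here is a minimal brute-force subgraph isomorphism check (an illustrative sketch, not anything from the paper or from graph rewriting practice): it tries every injective mapping of the pattern's nodes into the host's nodes, so the cost grows factorially with graph size.

```python
from itertools import permutations

def subgraph_isomorphic(pattern, host):
    """Brute-force check: does `pattern` occur as a subgraph of `host`?
    Graphs are dicts mapping each node to a set of neighbours (undirected).
    Every injective mapping of pattern nodes into host nodes is tried,
    which is why matching a rewrite rule's left-hand side is, in general,
    NP-complete."""
    p_nodes, h_nodes = list(pattern), list(host)
    for image in permutations(h_nodes, len(p_nodes)):
        mapping = dict(zip(p_nodes, image))
        # Every pattern edge must map onto a host edge.
        if all(mapping[v] in host[mapping[u]]
               for u in pattern for v in pattern[u]):
            return True
    return False

# A triangle occurs inside a 4-clique, but a square contains no triangle.
triangle = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
square   = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
clique4  = {i: {j for j in range(4) if j != i} for i in range(4)}
```

Practical systems use heavily pruned backtracking rather than exhaustive enumeration, but the worst-case complexity remains the point at issue.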

The other significant defect I wish to ascribe to natural language, which I argue has been templated into formal and programming languages, is its non-spatial character. A spatial language allows abstract spatial information, relating to data transfers and the allocation of jobs to resources, to be tagged at the level of syntax in a way that does not reference semantics directly, either on a semantic level or in a preceding abstraction layer. I argue that important benefits ensue from eliminating SPR and incorporating spatial information at the syntactic level of a parallel programming language, concerning the avoidance of resource contention and state explosion in general-purpose parallel computing.

Trying to implement a global clock on a large chip using H-trees and silicon alone is horrendously expensive in terms of chip area and energy dissipation. David Miller, in one of his recent optoelectronics surveys, however, stated: “It is likely possible to retain absolute timing accuracy in the delivery of optical clock signals of ~10-100 picoseconds over a computer room (tens of meters) without any special technology.” I believe optoelectronics is going to be something of a game changer in computer science, with regard to the perceived viability of globally clocked computing.

The Synchronic A-Ram, as currently presented in the 8-page summary and the main report, is indeed physically unrealistic because it assumes information transfer in constant time. My intention was to abstract away notions of propagation delay in order to focus on conceptual problems relating to parallel computing. The model is, however, quite amenable to including a notion of propagation delay, as described in section 3.5.2 of the main report (http://arxiv.org/abs/1005.5183), which I claim will not significantly impact the programming model of Space.
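The globally clocked, constant-time-transfer abstraction can be sketched as follows (a hypothetical illustration, not the actual Synchronic A-Ram instruction set or semantics): in each cycle every active instruction reads the previous state, and all writes commit simultaneously at the clock tick.

```python
def synchronous_step(registers, instructions):
    """One global clock tick, in the spirit of a globally clocked model
    (an illustrative sketch, not the Synchronic A-Ram's real semantics).
    Every instruction reads the *previous* state; all writes commit
    together, so any register can receive from any register in one cycle."""
    writes = {}
    for op, dst, src in instructions:
        if op == "copy":       # dst := src, reading the old state
            writes[dst] = registers[src]
        elif op == "not":      # dst := complement of a 1-bit register
            writes[dst] = registers[src] ^ 1
    new_state = dict(registers)
    new_state.update(writes)   # all updates land at the same tick
    return new_state

# Swapping two registers needs no temporary: both copies read old values.
state = {"a": 0, "b": 1}
state = synchronous_step(state, [("copy", "a", "b"), ("copy", "b", "a")])
# state is now {"a": 1, "b": 0}
```

Modelling propagation delay, as section 3.5.2 proposes, would amount to making a write take a number of ticks dependent on distance rather than committing in the same cycle.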

Hyperlinks and Abstracts

Alex, please wrap URIs with hyperlinks, and we traditionally include an abstract or summary of the papers we link. (Remember that laziness, impatience, and hubris are not unusual properties for effective programmers. ;-)

Abstract for the 8-page description:

Space is a circuit oriented, spatial programming language designed to exploit the massive parallelism available in a novel formal model of computation called the Synchronic A-Ram, and physically related FPGA and reconfigurable architectures. Space expresses variable grained MIMD parallelism, is modular, strictly typed, and deterministic. Barring operations associated with memory allocation and compilation, modules cannot access global variables, and are referentially transparent. At a high level of abstraction, modules exhibit a small, sequential state transition system, aiding verification. Space deals with communication, scheduling, and resource contention issues in parallel computing, by resolving them explicitly in an incremental manner, module by module, whilst ascending the ladder of abstraction. Whilst the Synchronic A-Ram model was inspired by linguistic considerations, it is also put forward as a formal model for reconfigurable digital circuits. A programming environment has been developed, that incorporates a simulator and compiler that transform Space programs into Synchronic A-Ram machine code, consisting of only three bit-level instructions, and a marking instruction. Space and the Synchronic A-Ram point to novel routes out of the parallel computing crisis.

This seems interesting. I'm a bit curious as to how synchronization between modules with shared dependencies is handled in this model. I suppose I'll read the paper.