Artificial Grammar Reveals Inborn Language Sense

It is often mentioned on LtU that there is not enough research done on human factors, so I thought this might be interesting for some of you.

How human children acquire language - which is so complex and has so many variations - remains largely a mystery. Fifty years ago, linguist and philosopher Noam Chomsky proposed an answer: Humans are able to learn language so quickly because some knowledge of grammar is hardwired into our brains. In other words, we know some of the most fundamental things about human language unconsciously at birth, without ever being taught.
Now, in a groundbreaking study, cognitive scientists at The Johns Hopkins University have confirmed a striking prediction of the controversial hypothesis that human beings are born with knowledge of certain syntactical rules that make learning human languages easier.

Perhaps it is possible to use such research to inform language and library design (or maybe we could use the same research techniques to conduct our own studies?).

I tried to take a look at the actual research by browsing Jennifer L. Culbertson's publications, and most of it goes way over my head, I'm afraid. However, her dissertation seems like it could be informative, and there is a nice graphic summary which is mildly titillating.


The science is more

The science is more complicated than what this suggests. See this for another recent important study:

The explanation in this news release is a bit flimsy, so if you are interested, try to get your hands on the actual paper.

Get your hands on the actual paper

No need to print it out; he linked to her dissertation above, and unless you want to keep an office door propped open, an e-reader is good enough.

I was talking about a

I was talking about a different paper. Pay attention! ;-)


A preprint is available: Dunn, Greenhill, Levinson & Gray (2011) Evolved structure of language shows lineage-specific trends in word-order universals.

The Chomskyan view accounts for language diversity by saying that universal grammar (UG), which corresponds to the innate syntactic faculty, is specialised to the grammar of particular natural languages by the fixing of parameters during language acquisition. It's not clear to me from a cursory glance how this paper's findings of near independence in the distribution of grammatical features are meant to serve as an argument against UG. There's a brief discussion of these findings running against predictions at the end of the paper.

Discussed on Language Log

Good link

Liberman links to an expository overview of the paper by two of the authors that answers my question.

Liberman's point about language contact seems very cogent to me. The paper seems to assume something like an "asexual reproduction plus mutation" view of language change, while language contact might suggest that Brownian motion is a better model.