In certain types of aphasia, such as Broca’s aphasia (also known as non-fluent aphasia), many people have trouble constructing grammatically correct sentences or understanding other people’s speech because they don’t grasp its syntactic structure. They may understand the meaning of the individual words in a sentence, just not their relationship, which is specified by the sentence’s syntax. For example, on hearing “the girl pushed the boy”, a person with this type of aphasia has a hard time telling who did the pushing and who got pushed. These kinds of problems have taught scientists that grammar rules are processed in the brain somewhat independently of, say, word meaning.

A better understanding of how the brain uses grammar rules to construct sentences and comprehend spoken language can aid both the diagnosis and treatment of people with Broca’s aphasia.

A recent study described in Science last week found that, quite unlike what we are taught in middle school, sentences are not always either grammatically right or wrong; they can also fall into a ‘gray zone’. As Science explains:

…linguists have long acknowledged that some grammar falls into a “gray zone,” a middle ground in which sentences are neither 100% right nor 100% wrong. Now, a new study shows that the linguists who map out the structure of grammar—syntacticians—rarely use this gray zone in their own studies. It also suggests a wide gap between their black-and-white views and those of ordinary people.

The researchers wanted to check how syntacticians’ strict judgments of sentence structure compare with the intuitions of ordinary people. The team took 100 sentences whose structure had been labeled clearly right or clearly wrong in grammar papers published in the journal Linguistic Inquiry and ran them by 65 native English speakers.

Their answers didn’t square with those of the linguists. On a scale of 1 to 7, participants ranked 40% of the black-and-white sentences between 3 and 5, putting them squarely in the “gray zone.”
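
For concreteness, here is a minimal sketch of that tally in Python. The data are randomly generated stand-ins, and averaging each sentence’s ratings across speakers before applying the 3–5 cutoff is my assumption about the aggregation; the paper’s exact procedure may differ.

```python
import numpy as np

# Stand-in data: 65 speakers each rate 100 sentences on a 1-7 scale.
# Real judgment data would be far less uniform; this only shows the arithmetic.
rng = np.random.default_rng(0)
ratings = rng.integers(1, 8, size=(65, 100))

mean_per_sentence = ratings.mean(axis=0)            # aggregate across speakers
gray_zone = (mean_per_sentence >= 3) & (mean_per_sentence <= 5)
print(f"{gray_zone.mean():.0%} of sentences land in the 3-5 'gray zone'")
```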

The results could affect everything from research into how the human brain processes language to building speech recognition software. By ignoring the gray zone, say the researchers, syntacticians are failing to describe how language really works.

Gray zone notwithstanding, grammar is hugely important for human language and our ability to communicate. One of the great advantages of human communication is that grammatical rules allow us to combine the same words into many different messages. As David Poeppel, a professor at New York University and the director of the Max Planck Institute for Empirical Aesthetics in Frankfurt, writes in a recent paper published in Nature Neuroscience:

The most critical attribute of human language is its unbounded combinatorial nature: smaller elements can be combined into larger structures on the basis of a grammatical system, resulting in a hierarchy of linguistic units, such as words, phrases and sentences.
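
As a rough illustration of that combinatorial property, here is a toy grammar in Python. The rules and vocabulary are invented for this example, not a serious linguistic analysis; the point is only that a handful of words plus a few combination rules yield many distinct sentences.

```python
import itertools

# Toy grammar: each symbol maps to its possible expansions (invented rules).
rules = {
    "S":   [["NP", "VP"]],
    "NP":  [["the", "N"], ["the", "Adj", "N"]],
    "VP":  [["V", "NP"]],
    "N":   [["girl"], ["boy"], ["dog"]],
    "Adj": [["small"], ["happy"]],
    "V":   [["pushed"], ["saw"]],
}

def expand(symbol):
    """Yield every word string derivable from `symbol`."""
    if symbol not in rules:                      # terminal: an actual word
        yield symbol
        return
    for production in rules[symbol]:
        # Combine every expansion of each constituent in the production.
        for parts in itertools.product(*(expand(s) for s in production)):
            yield " ".join(parts)

sentences = list(expand("S"))
print(len(sentences))      # 162 distinct sentences from just 8 words
print(sentences[0])        # "the girl pushed the girl"
```

With a recursive rule (say, NP → NP ‘and’ NP), the set of sentences becomes unbounded, which is exactly the “unbounded combinatorial nature” Poeppel describes (though this naive enumerator would then never terminate).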

One question that linguists and neuroscientists have been trying to answer is whether our brains have special faculties that process grammatical rules independently of other linguistic components, such as word meaning and knowledge of common phrases. Another is how exactly the brain employs those grammar rules to comprehend spoken language.

A good illustration of independent grammar processing in the brain is a sentence such as “colorless green ideas sleep furiously”. The words that compose this sentence rarely occur together in ordinary communication, and the sentence conveys no meaningful message, yet we sense that it is grammatically correct even without explicitly analyzing its syntax.
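
To see what “grammatically correct but meaningless” looks like mechanically, here is a toy context-free grammar, written with NLTK, that accepts the sentence. The rules are a minimal invention for this example, not a serious analysis of English.

```python
import nltk

# Minimal invented grammar that happens to cover the famous sentence.
grammar = nltk.CFG.fromstring("""
S   -> NP VP
NP  -> Adj NP | N
VP  -> V Adv
Adj -> 'colorless' | 'green'
N   -> 'ideas'
V   -> 'sleep'
Adv -> 'furiously'
""")

parser = nltk.ChartParser(grammar)
for tree in parser.parse("colorless green ideas sleep furiously".split()):
    tree.pretty_print()    # the sentence parses, meaningless or not
```

Reversing the word order (“furiously sleep ideas green colorless”) yields no parse tree under the same rules, mirroring our intuition that the reversed version is ungrammatical as well as meaningless.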

To find out more about the existence and potential use of such implicit grammar knowledge in the brain, Poeppel and colleagues conducted a study in which people listened to sentences while the researchers measured how their brain activity changed as they heard individual words, phrases, and entire sentences.

The researchers took special care to construct and record the sentences so that they carried no acoustic or experience-based cues that would help listeners figure out when words form a phrase and when phrases form a sentence. Listeners had to rely on their internal knowledge of grammar rules to recognize when they were hearing a mere string of words, a phrase, or a sentence.

Poeppel and colleagues found that the participants’ brain activity supported the notion that, when listening to spoken language, the brain uses implicit grammatical knowledge to parse the spoken message into words, phrases, and sentences in order to understand its meaning. Moreover, the specific brain regions involved in this process coincide with some of the regions commonly damaged in aphasia, such as Wernicke’s area (posterior temporal) and Broca’s area (inferior frontal).
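
The article above doesn’t spell out how the brain-activity measurement works, but a common technique in experiments like this is frequency tagging: present words at a fixed rate so that phrases and sentences recur at slower, predictable rates, then look for peaks in the brain signal’s spectrum at each rate. The sketch below demonstrates the idea on synthetic data; the specific rates, amplitudes, and the signal itself are made up for illustration and are not the authors’ pipeline.

```python
import numpy as np

# Suppose words arrive at 4 Hz, so two-word phrases recur at 2 Hz and
# four-word sentences at 1 Hz. A signal tracking each linguistic level
# shows spectral peaks at those rates (all values here are invented).
fs, dur = 100, 40                                   # sample rate (Hz), duration (s)
t = np.arange(0, dur, 1 / fs)

signal = (np.cos(2 * np.pi * 4 * t)                 # word-rate response
          + 0.6 * np.cos(2 * np.pi * 2 * t)         # phrase-rate response
          + 0.4 * np.cos(2 * np.pi * 1 * t)         # sentence-rate response
          + np.random.default_rng(1).normal(0, 1, t.size))  # noise

spectrum = np.abs(np.fft.rfft(signal)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)
for f in (1, 2, 4):
    peak = spectrum[np.argmin(np.abs(freqs - f))]
    print(f"amplitude at {f} Hz: {peak:.2f}")
```

On this logic, a listener who hears the word-level acoustics but cannot group words grammatically would show the word-rate peak but not the slower phrase- and sentence-rate peaks.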

Studies like these are important for our understanding of how the brain uses implicit grammar rules to deconstruct strings of syllables and words and parse them into meaningful messages. Such understanding can shed more light on why people with certain types of aphasia lose the ability to use grammatical cues to understand spoken language.