From:
To: Jeffrey Epstein <[email protected]>
Subject: Fwd: [Dewayne-Net] MIT claims to have found a "language universal" that ties all languages together
Date: Thu, 06 Aug 2015 17:02:56 +0000

FYI

Co-authored with iPhone auto-correct

Begin forwarded message:

From: Hendricks Dewayne
Date: August 6, 2015 at 9:08:29 AM PDT
To: Multiple recipients of Dewayne-Net
Subject: [Dewayne-Net] MIT claims to have found a "language universal" that ties all languages together
Reply-To:

MIT claims to have found a "language universal" that ties all languages together
A language universal would bring evidence to Chomsky's controversial theories.
By Cathleen O'Grady
Aug 6 2015
<http://arstechnica.com/science/2015/08/mit-claims-to-have-found-a-language-universal-that-ties-all-languages-together/>

Language takes an astonishing variety of forms across the world—to such a huge extent that a long-standing debate rages around the question of whether all languages have even a single property in common. Well, there's a new candidate for the elusive title of "language universal" according to a paper in this week's issue of PNAS. All languages, the authors say, self-organise in such a way that related concepts stay as close together as possible within a sentence, making it easier to piece together the overall meaning.

Language universals are a big deal because they shed light on heavy questions about human cognition. The most famous proponent of the idea of language universals is Noam Chomsky, who suggested a "universal grammar" that underlies all languages. Finding a property that occurs in every single language would suggest that some element of language is genetically predetermined and perhaps that there is specific brain architecture dedicated to language.

However, other researchers argue that there are vanishingly few candidates for a true language universal. They say that there is enormous diversity at every possible level of linguistic structure, from the sentence right down to the individual sounds we make with our mouths (that's without including sign languages). There are widespread tendencies across languages, they concede, but they argue that these patterns are just a signal that languages find common solutions to common problems. Without finding a true universal, it's difficult to make the case that language is a specific cognitive package rather than a more general result of the remarkable capabilities of the human brain.

Self-organising systems

A lot has been written about a tendency in languages to place words with a close syntactic relationship as closely together as possible. Richard Futrell, Kyle Mahowald, and Edward Gibson at MIT were interested in whether all languages might use this as a technique to make sentences easier to understand. The idea is that when sentences bundle related concepts in proximity, it puts less of a strain on working memory.

For example, adjectives (like "old") belong with the nouns that they modify (like "lady"), so it's easier to understand the whole concept of "old lady" if the words appear close together in a sentence. You can see this effect by deciding which of these two sentences is easier to understand: "John threw out the old trash sitting in the kitchen," or "John threw the old trash sitting in the kitchen out." To many English speakers, the second sentence will sound strange—we're inclined to keep the words "threw" and "out" as close together as we can.
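That intuition can be made concrete by adding up the distance between each word and the word it depends on. The sketch below is a minimal illustration in Python, not the researchers' code; the head indices for the two example sentences are hand-assigned for illustration rather than taken from a parsed treebank.

# Minimal sketch: total dependency length of a sentence, assuming each word
# is annotated with the 1-based position of its syntactic head (0 = root).
# The head indices below are hand-assigned for illustration only.

def total_dependency_length(heads):
    """Sum of linear distances between each word and its head."""
    return sum(abs((i + 1) - h) for i, h in enumerate(heads) if h != 0)

# "John threw out the old trash sitting in the kitchen"
#   1     2    3   4   5    6      7     8   9    10
close_version = [2, 0, 2, 6, 6, 2, 6, 7, 10, 8]

# "John threw the old trash sitting in the kitchen out"
#   1     2    3   4    5     6     7   8    9     10
far_version = [2, 0, 5, 5, 2, 5, 6, 9, 7, 2]

print(total_dependency_length(close_version))  # 14: "threw" and "out" sit side by side
print(total_dependency_length(far_version))    # 20: the long "threw ... out" arc inflates the total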
This process of limiting distance between related words is called dependency length minimisation, or DLM. Do languages develop grammars that force speakers to neatly package concepts together, making sentences easier to follow? Or, when we look at a variety of languages, do we find that not all of them follow the same pattern?

The researchers wanted to look at language as it's actually used rather than make up sentences themselves, so they gathered databases of language examples from 37 different languages. Each sentence in the database was given a score based on the degree of DLM it showed: those sentences where conceptually related words were far apart in the sentence had high scores, and those where related words sat snugly together had low scores.

[snip]

Dewayne-Net RSS Feed: <http://dewaynenet.wordpress.com/feed/>
Document Details

SHA-256: df767284dc16b74e14c33d711a6bad69578b1fdd3305fc507c150cd9d3610e50
Bates Number: EFTA00688507
Dataset: DataSet-9
Document Type: document
Pages: 2
