Computer Science – Computation and Language
Scientific paper: Measures of lexical distance between languages
2009-12-04
The idea of measuring distance between languages seems to have its roots in the work of the French explorer Dumont D'Urville \cite{Urv}. He collected comparative word lists of various languages during his voyages aboard the Astrolabe from 1826 to 1829 and, in his work on the geographical division of the Pacific, he proposed a method to measure the degree of relation among languages. The method used by modern glottochronology, developed by Morris Swadesh in the 1950s, measures distances from the percentage of shared cognates, which are words with a common historical origin. Recently, we proposed a new automated method which uses the normalized Levenshtein distance between words with the same meaning and averages over the words contained in a list. Another group of scholars \cite{Bak, Hol} subsequently proposed a refinement of our definition that includes a second normalization. In this paper we compare the information content of our definition with that of the refined version in order to decide which of the two can be applied with greater success to resolve relationships among languages.
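The first of the two measures discussed above can be sketched in a few lines of code. This is a minimal illustration, not the authors' implementation: it assumes the word lists are already aligned by meaning (as in a Swadesh-style list), normalizes each edit distance by the length of the longer word, and averages over the list. The second normalization of \cite{Bak, Hol} (dividing by an average distance between words of different meanings) is omitted here.

```python
def levenshtein(a: str, b: str) -> int:
    """Classic edit distance: minimum insertions, deletions, substitutions."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def normalized_levenshtein(a: str, b: str) -> float:
    """First normalization: divide by the length of the longer word."""
    return levenshtein(a, b) / max(len(a), len(b))

def lexical_distance(words1: list[str], words2: list[str]) -> float:
    """Average normalized distance over meaning-aligned word pairs."""
    pairs = list(zip(words1, words2))
    return sum(normalized_levenshtein(a, b) for a, b in pairs) / len(pairs)
```

For example, `lexical_distance(["uno", "due"], ["one", "two"])` averages 2/3 (for "uno"/"one") and 1.0 (for "due"/"two", which share no letters), giving 5/6.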
Petroni Filippo
Serva Maurizio