Admittedly, it is in this era, when everybody's buzzing about big data science, that I'd like to also ask you a more humanistic question: what makes language language, and what makes music music? Because language and music not only share common evolutionary origins and neurobiological processing, they're also pretty much the only things we humans do better than all other species.
Scientific and humanitarian reasons led me to develop the world's first automated translation services in the early days of the web, because languages are our trade routes: language lets us trade ideas, overcome barriers, and share common ground. But it remains one of the hardest challenges in big data science.
How do we learn complex relations between languages? Humans learn by seeing this round orange thing while hearing Mama say "cho" and "ti", and we correlate the Chinese we hear against a second representation: the environment we see.
Over many instances we gradually learn that "cho" means the round thing and "ti" means kick. Our robots today aren't good enough to do this.
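Here is a minimal sketch, in Python, of that kind of cross-situational learning: count how often each word the child hears co-occurs with each thing in the scene, and guess the meaning from the most reliable co-occurrence. The toy scenes and the romanizations "cho" and "ti" are illustrative stand-ins, not real child-language data.

from collections import defaultdict

# Toy scenes pairing the words a child hears with the things it sees.
# The romanizations "cho" and "ti" and the scene labels are illustrative
# stand-ins, not real child-language data.
observations = [
    (["mama", "ti", "cho"], {"mama", "kick", "round-thing"}),
    (["cho"],               {"round-thing"}),
    (["ti", "cho"],         {"kick", "round-thing"}),
    (["ti"],                {"kick", "stone"}),
    (["mama"],              {"mama"}),
]

# Count how often each heard word co-occurs with each thing in the scene.
cooc = defaultdict(lambda: defaultdict(int))
heard = defaultdict(int)
for words, scene in observations:
    for w in words:
        heard[w] += 1
        for thing in scene:
            cooc[w][thing] += 1

# Guess each word's meaning: the thing it most reliably co-occurs with.
for w in sorted(heard):
    best = max(cooc[w], key=lambda thing: cooc[w][thing] / heard[w])
    print(f"{w!r} -> {best} ({cooc[w][best]} of {heard[w]} scenes)")

Over these five toy scenes, "cho" pairs up with the round thing and "ti" with kicking, purely from co-occurrence statistics.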
But when I first got to HKUST many years ago, I noticed that Hong Kong requires all government proceedings to be kept in both English and Chinese.
And this led to the idea that we could approximate the picture by hearing Mama speak Chinese while seeing an English representation of that environment.
So the key question becomes: what kind of models can learn the relationships between any two natural languages?
And attacking that problem gets us closer to identifying the universal DNA of language.
Because language is what lets us think complex thoughts and those thoughts let us create new languages with which to think more complex thoughts in the great cycle of intelligence.
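To make that concrete, here is a minimal sketch in the spirit of IBM Model 1 word alignment: run expectation-maximization over sentence pairs, like the bilingual Hong Kong proceedings, and let translation probabilities emerge from co-occurrence alone. The tiny English/romanized-Chinese corpus below is an illustrative stand-in, not real data.

from collections import defaultdict

# Tiny English / romanized-Chinese sentence pairs; the Chinese tokens here
# are illustrative stand-ins, not a real parallel corpus.
corpus = [
    ("green house".split(), "lu fangzi".split()),
    ("the house".split(),   "zhe fangzi".split()),
    ("the flower".split(),  "zhe hua".split()),
]

eng_vocab = {e for es, _ in corpus for e in es}
chi_vocab = {f for _, fs in corpus for f in fs}

# t[e][f] approximates P(chinese word f | english word e), initialized uniformly.
t = defaultdict(lambda: defaultdict(lambda: 1.0 / len(chi_vocab)))

for _ in range(10):  # EM iterations, IBM Model 1 style
    count = defaultdict(lambda: defaultdict(float))
    total = defaultdict(float)
    for es, fs in corpus:
        for f in fs:
            # E-step: spread each Chinese word's alignment probability
            # over the English words in the same sentence.
            z = sum(t[e][f] for e in es)
            for e in es:
                count[e][f] += t[e][f] / z
                total[e] += t[e][f] / z
    # M-step: re-estimate translation probabilities from the soft counts.
    for e in count:
        for f in count[e]:
            t[e][f] = count[e][f] / total[e]

# Each English word's most probable Chinese translation after training.
for e in sorted(eng_vocab):
    best = max(t[e], key=t[e].get)
    print(f"{e} -> {best}  p={t[e][best]:.2f}")

After a few iterations "house" lines up with "fangzi" and "the" with "zhe", even though no word was ever labelled with its translation.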
Language structures thought. Now, the traditional way linguists and mathematicians have thought of modelling language is: tell me whether that sentence is grammatical.
But this would not have done your average caveman very much good. What's really needed to survive evolution is to translate "Look out, a tiger is attacking from behind you" into another visual or abstract representation that lets you decide quickly to run.
But learning traditional translation models has exponential complexity. In attacking that problem, I was inspired by a long-standing scientific mystery: why do languages across the world universally limit verbs' semantic roles to a maximum of four?
I discovered that inversion transduction grammars give rise mathematically to this magic number four, which keeps languages learnable as they evolve.
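One piece of the underlying mathematics can be sketched with the standard characterization that binary inversion transduction grammars realize exactly the "separable" reorderings, those avoiding the patterns 2413 and 3142; the sketch below simply enumerates permutations under that assumption.

from itertools import combinations, permutations

def itg_realizable(perm):
    # A binary ITG can realize a reordering exactly when it is a "separable"
    # permutation, i.e. it contains neither the pattern 2413 nor 3142.
    for idxs in combinations(range(len(perm)), 4):
        sub = [perm[i] for i in idxs]
        ranks = tuple(sorted(sub).index(x) + 1 for x in sub)
        if ranks in ((2, 4, 1, 3), (3, 1, 4, 2)):
            return False
    return True

for n in range(1, 6):
    perms = list(permutations(range(1, n + 1)))
    ok = sum(itg_realizable(p) for p in perms)
    print(f"length {n}: {ok} of {len(perms)} reorderings are ITG-realizable")

The counts it prints, 1, 2, 6, 22, 90, are the Schröder numbers: every reordering of up to three elements is ITG-realizable, and length four is exactly where the first forbidden reorderings, 2413 and 3142, appear.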
