50Webs Forum
Author Topic: On Learning Language-Invariant Representations for Universal Machine Translation  (Read 132 times)
kulpubkk
Noobie
Posts: 1
« on: November 28, 2020, 10:02:55 AM »

Despite recent improvements in neural machine translation (NMT), training a large NMT model with hundreds of millions of parameters usually requires a collection of parallel corpora at large scale, on the order of millions or even billions of aligned sentences for supervised training (Arivazhagan et al.). While it may be possible to automatically crawl the web to collect parallel sentences for high-resource language pairs, such as German-English and French-English, it is often infeasible or expensive to manually translate large numbers of sentences for low-resource language pairs, such as Nepali-English, Sinhala-English, etc. To this end, the goal of multilingual machine translation, a.k.a. universal machine translation (UMT), is to learn to translate between any pair of languages using a single system, given pairs of translated documents for some of these languages. The hope is that by learning a shared “semantic space” between multiple source and target languages, the model can leverage language-invariant structure from high-resource translation pairs and transfer it to translation between low-resource language pairs, or even enable zero-shot translation.
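To make the "single system for all pairs" idea concrete, here is a minimal sketch (not the paper's code) of the standard data-preparation trick used in massively multilingual NMT systems such as Arivazhagan et al.'s: each source sentence is tagged with a token naming the desired target language, so one shared model can be trained on all translation directions at once. The function name and token format below are illustrative assumptions, not taken from the paper.

```python
def make_training_example(src_sentence: str, tgt_lang: str) -> str:
    """Prepend a target-language token so the shared model knows
    which language to translate into."""
    return f"<2{tgt_lang}> {src_sentence}"

# Mixed-direction training corpus: high- and low-resource pairs share
# one model, which is what enables transfer and zero-shot translation.
corpus = [
    ("Guten Morgen", "de", "en"),   # high-resource: German-English
    ("नमस्ते",        "ne", "en"),   # low-resource: Nepali-English
    ("Good morning", "en", "ne"),
]

tagged = [make_training_example(src, tgt) for src, _src_lang, tgt in corpus]
# e.g. tagged[0] == "<2en> Guten Morgen"
```

Because every direction flows through the same encoder and decoder, the model is pushed toward a shared representation across languages; zero-shot translation then amounts to requesting a target token for a direction never seen paired in training.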

Indeed, training such a single massively multilingual model has achieved impressive empirical results, especially for low-resource language pairs (see Fig. 2). However, this success also comes at a cost: from Fig. 2 we observe that the translation quality achieved on high-resource language pairs by a single UMT system is worse than that of the corresponding bilingual baselines.
Anaaya0123
Noobie
Posts: 11
« Reply #1 on: January 14, 2021, 09:54:28 AM »

Machine learning is now among the top trending courses.

https://www.sevenmentor.com/machine-learning-course-in-pune.php