
Question to Paul Azunre: Cross-lingual Transfer Learning

 
Serge Yurk
Greenhorn
Posts: 9
Hello Paul.
I hope your book contains a lot of interesting info about cross-lingual models.
Could you please share some links to successful English-German and English-Russian stories in this field?
Thank you in advance,
Sergey
 
Paul Azunre
Author
Posts: 14

Serge Yurk wrote:Hello Paul.
I hope your book contains a lot of interesting info about cross-lingual models.
Could you please share some links to successful English-German and English-Russian stories in this field?
Thank you in advance,
Sergey

Sergey,

Yes, we will try to cover the basics of cross-lingual transfer, which is a complex topic and a burgeoning field of study right now.

Seq2Seq models with attention work pretty well on parallel datasets for pure translation, and the current trend is to replace them with transformer-based models.
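To make that trend concrete, here is a minimal sketch of transformer-based translation using the Hugging Face transformers library and the community Helsinki-NLP Marian checkpoints (neither is mentioned above, so treat the library and model names as my assumptions, not something from the book):

    from transformers import MarianMTModel, MarianTokenizer

    # English -> German checkpoint; swap in "Helsinki-NLP/opus-mt-en-ru" for English -> Russian.
    model_name = "Helsinki-NLP/opus-mt-en-de"
    tokenizer = MarianTokenizer.from_pretrained(model_name)
    model = MarianMTModel.from_pretrained(model_name)

    # Tokenize, generate a translation, and decode it back to text.
    batch = tokenizer(["Transfer learning is changing NLP."], return_tensors="pt", padding=True)
    generated = model.generate(**batch)
    print(tokenizer.decode(generated[0], skip_special_tokens=True))

These checkpoints were themselves trained on parallel corpora, so this is transformer-based translation rather than cross-lingual transfer per se; the transfer part comes in below.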

Where Transfer Learning has been incorporated into cross-lingual learning specifically is through embedding models that are simultaneously trained on multiple tasks in multiple languages, e.g., mBERT (https://arxiv.org/abs/1906.01502) and the Universal Sentence Encoder Multilingual (https://aihub.cloud.google.com/u/0/p/products%2F4e63f320-d774-4772-aaaa-ccbe8f3f09f2). The idea is that learning multiple tasks simultaneously across multiple languages (more than 100 languages in the case of mBERT) allows the model to learn features that transfer to cross-lingual tasks like translation.
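If it helps, here is a quick sketch of pulling sentence vectors out of mBERT via the Hugging Face transformers library (the checkpoint name and the mean-pooling choice are my assumptions, not something prescribed by the paper):

    import torch
    from transformers import AutoModel, AutoTokenizer

    # Multilingual BERT: one encoder and one shared vocabulary covering 100+ languages.
    tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
    model = AutoModel.from_pretrained("bert-base-multilingual-cased")

    def embed(sentence):
        # Mean-pool the final hidden states into a single sentence vector.
        inputs = tokenizer(sentence, return_tensors="pt")
        with torch.no_grad():
            hidden = model(**inputs).last_hidden_state
        return hidden.mean(dim=1).squeeze(0)

    en = embed("The weather is nice today.")
    de = embed("Das Wetter ist heute schön.")
    ru = embed("Сегодня хорошая погода.")

    # Translations of the same sentence should land closer together in the shared
    # space than unrelated sentences do. Raw mBERT alignment is rough, though;
    # in practice you usually fine-tune before relying on these similarities.
    print(torch.cosine_similarity(en, de, dim=0).item())
    print(torch.cosine_similarity(en, ru, dim=0).item())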

Apparently this works pretty well for high-resource languages like English, German, and Russian, so you should be able to work with these methods. For lower-resource languages, like many African languages, more work remains to be done, and I would say this is likely the most exciting line of research right now, with many open problems.

- Paul
 
Serge Yurk
Greenhorn
Posts: 9
Thank you
 