Bilingual children learn two languages by associating words with visually similar situations. We show how related ideas can be used to train bilingual word translation models with no access to paired training data. #CVPR20
— DeepMind (@DeepMind) June 18, 2020
Paper: https://t.co/W8v9C82q0f
Blog: https://t.co/JzZHCDZa4C pic.twitter.com/cmwAK9yCTI
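The idea in the tweet can be illustrated with a toy sketch: if words in each language are grounded in the visual features of the situations they co-occur with, translation falls out as nearest-neighbour search in the shared visual space, with no paired bilingual text. This is only a minimal illustration of the principle, not the paper's actual model; the vocabularies, situation features, and cosine-similarity matching here are all invented for the example.

```python
import numpy as np

# Hypothetical "visual" features: each row is one situation (toy data).
rng = np.random.default_rng(0)
situations = rng.normal(size=(3, 4))  # 3 situations, 4 visual features

# Each word lists the situations it co-occurs with -- monolingual data only;
# the two languages never appear in the same record.
english = {"dog": [0], "ball": [1], "tree": [2]}
french = {"chien": [0], "balle": [1], "arbre": [2]}

def ground(vocab):
    # Represent each word by the mean visual features of its situations.
    return {w: situations[idx].mean(axis=0) for w, idx in vocab.items()}

def translate(word, src, tgt):
    # Nearest neighbour in the shared visual space (cosine similarity).
    v = src[word]
    return max(tgt, key=lambda t: np.dot(v, tgt[t])
               / (np.linalg.norm(v) * np.linalg.norm(tgt[t])))

en, fr = ground(english), ground(french)
print(translate("dog", en, fr))  # -> chien
```

Because both vocabularies are grounded in the same visual feature space, no word-level supervision across languages is needed; the visual world acts as the pivot.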