Quynh Do
Applied Scientist, Amazon AI
Verified email at amazon.com
Title · Cited by · Year
Entailment above the word level in distributional semantics
M Baroni, R Bernardi, NQ Do, C Shan
Proceedings of the 13th Conference of the European Chapter of the …, 2012
Cited by 286 · 2012
Cross-lingual Transfer Learning with Data Selection for Large-Scale Spoken Language Understanding
Q Do, J Gaspers
Proceedings of the 2019 Conference on Empirical Methods in Natural Language …, 2019
Cited by 20 · 2019
Improving Implicit Semantic Role Labeling by Predicting Semantic Frame Arguments
QNT Do, S Bethard, MF Moens
The International Joint Conference on Natural Language Processing IJCNLP 2017, 2017
Cited by 20 · 2017
Cross-lingual transfer learning for spoken language understanding
QNT Do, J Gaspers
ICASSP 2019-2019 IEEE International Conference on Acoustics, Speech and …, 2019
Cited by 19 · 2019
Domain Adaptation in Semantic Role Labeling using a Neural Language Model and Linguistic Resources
QNT Do, S Bethard, MF Moens
IEEE/ACM Transactions on Audio, Speech and Language Processing 23 (11), 1812 …, 2015
Cited by 17 · 2015
Distributionally Robust Finetuning BERT for Covariate Drift in Spoken Language Understanding
S Broscheit, Q Do, J Gaspers
Proceedings of the 60th Annual Meeting of the Association for Computational …, 2022
Cited by 9 · 2022
Adapting Coreference Resolution for Narrative Processing
QNT Do, S Bethard, MF Moens
Proceedings of the Conference on Empirical Methods in Natural Language …, 2015
Cited by 8* · 2015
A Flexible and Easy-to-use Semantic Role Labeling Framework for Different Languages
QNT Do, A Leeuwenberg, G Heyman, MF Moens
Proceedings of the 27th International Conference on Computational …, 2018
Cited by 7 · 2018
Learning to Extract Action Descriptions from Narrative Text
O Ludwig, Q Do, C Smith, M Cavazza, MF Moens
IEEE Transactions on Computational Intelligence and AI in Games, 2017
Cited by 7 · 2017
Facing the most difficult case of Semantic Role Labeling: A collaboration of word embeddings and co-training
QNT Do, S Bethard, MF Moens
Proceedings of the International Conference on Computational Linguistics …, 2016
Cited by 7 · 2016
To What Degree Can Language Borders Be Blurred In BERT-based Multilingual Spoken Language Understanding?
Q Do, J Gaspers, T Röding, M Bradford
arXiv preprint arXiv:2011.05007, 2020
Cited by 5 · 2020
Data balancing for boosting performance of low-frequency classes in Spoken Language Understanding
J Gaspers, Q Do, F Triefenbach
Interspeech 2020, arXiv preprint arXiv:2008.02603, 2020
Cited by 5 · 2020
Machine understanding for interactive storytelling
W De Mulder, NQ Do Thi, P van den Broek, MF Moens
Proceedings of KICSS 2013: 8th international conference on knowledge …, 2013
Cited by 4 · 2013
The Impact of Intent Distribution Mismatch on Semi-Supervised Spoken Language Understanding.
J Gaspers, Q Do, D Sorokin, P Lehnen, AI Amazon Alexa
Interspeech, 4708-4712, 2021
Cited by 3 · 2021
Predicting Temporal Performance Drop of Deployed Production Spoken Language Understanding Models.
Q Do, J Gaspers, D Sorokin, P Lehnen
Interspeech, 1249-1253, 2021
Cited by 3 · 2021
Exploring Cross-Lingual Transfer Learning with Unsupervised Machine Translation
C Wang, J Gaspers, TNQ Do, H Jiang
Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021 …, 2021
Cited by 2 · 2021
Text mining for open domain semi-supervised semantic role labeling
NQ Do Thi, S Bethard, MF Moens
Proceedings of interactions between data mining and natural language …, 2014
Cited by 2* · 2014
The impact of domain-specific representations on BERT-based multi-domain spoken language understanding
J Gaspers, Q Do, T Röding, M Bradford
Proceedings of the Second Workshop on Domain Adaptation for NLP, 28-32, 2021
Cited by 1 · 2021
Visualizing the content of a children’s story in a virtual world: Lessons learned
NQ Do Thi, S Bethard, MF Moens
Proceedings of the EMNLP 2016 Workshop on Uphill Battles in Language …, 2016
2016
Improving word representations for semantic recognition in language by adding visual contextual knowledge
G Collell Talleda, Q Do, T Zhang, MF Moens
CHIST-ERA seminar, Date: 2016/04/28-2016/04/29, Location: Bern, Switzerland, 2016
2016