Naoki Wake
Other names: 和家尚希
Verified email at microsoft.com - Homepage
Title · Cited by · Year
Restoring movement representation and alleviating phantom limb pain through short‐term neurorehabilitation with a virtual reality system
M Osumi, A Ichinose, M Sumitani, N Wake, Y Sano, A Yozu, S Kumagaya, ...
European journal of pain 21 (1), 140-147, 2017
Cited by 114 · 2017
ChatGPT empowered long-step robot control in various environments: A case application
N Wake, A Kanehira, K Sasabuchi, J Takamatsu, K Ikeuchi
IEEE Access 11, 95060-95078, 2023
Cited by 84 · 2023
Tactile feedback for relief of deafferentation pain using virtual reality system: a pilot study
Y Sano, N Wake, A Ichinose, M Osumi, R Oya, M Sumitani, S Kumagaya, ...
Journal of neuroengineering and rehabilitation 13, 1-12, 2016
Cited by 47 · 2016
Multimodal virtual reality platform for the rehabilitation of phantom limb pain
N Wake, Y Sano, R Oya, M Sumitani, S Kumagaya, Y Kuniyoshi
2015 7th International IEEE/EMBS Conference on Neural Engineering (NER), 787-790, 2015
Cited by 47 · 2015
Agent AI: Surveying the horizons of multimodal interaction
Z Durante, Q Huang, N Wake, R Gong, JS Park, B Sarkar, R Taori, Y Noda, ...
arXiv preprint arXiv:2401.03568, 2024
Cited by 44 · 2024
GPT-4V(ision) for robotics: Multimodal task planning from human demonstration
N Wake, A Kanehira, K Sasabuchi, J Takamatsu, K Ikeuchi
IEEE Robotics and Automation Letters, 2024
Cited by 43 · 2024
A Learning-from-Observation Framework: One-Shot Robot Teaching for Grasp-Manipulation-Release Household Operations
N Wake, R Arakawa, I Yanokura, T Kiyokawa, K Sasabuchi, J Takamatsu, ...
arXiv preprint arXiv:2008.01513, 2020
Cited by 34 · 2020
Structured movement representations of a phantom limb associated with phantom limb pain
M Osumi, M Sumitani, N Wake, Y Sano, A Ichinose, S Kumagaya, ...
Neuroscience letters 605, 7-11, 2015
Cited by 31 · 2015
Reliability of phantom pain relief in neurorehabilitation using a multimodal virtual reality system
Y Sano, A Ichinose, N Wake, M Osumi, M Sumitani, S Kumagaya, ...
2015 37th annual international conference of the IEEE engineering in …, 2015
Cited by 30 · 2015
Task-oriented motion mapping on robots of various configuration using body role division
K Sasabuchi, N Wake, K Ikeuchi
IEEE Robotics and Automation Letters 6 (2), 413-420, 2020
Cited by 24 · 2020
Verbal focus-of-attention system for learning-from-observation
N Wake, I Yanokura, K Sasabuchi, K Ikeuchi
2021 IEEE International Conference on Robotics and Automation (ICRA), 10377 …, 2021
Cited by 18 · 2021
Semantic constraints to represent common sense required in household actions for multimodal learning-from-observation robot
K Ikeuchi, N Wake, K Sasabuchi, J Takamatsu
The International Journal of Robotics Research 43 (2), 134-170, 2024
Cited by 14 · 2024
Task-grasping from a demonstrated human strategy
D Saito, K Sasabuchi, N Wake, J Takamatsu, H Koike, K Ikeuchi
2022 IEEE-RAS 21st International Conference on Humanoid Robots (Humanoids …, 2022
Cited by 14* · 2022
Grasp-type Recognition Leveraging Object Affordance
N Wake, K Sasabuchi, K Ikeuchi
HOBI – IEEE RO-MAN Workshop 2020, 2020
Cited by 14 · 2020
Direct evidence of EEG coherence in alleviating phantom limb pain by virtual referred sensation: Case report
M Osumi, Y Sano, A Ichinose, N Wake, A Yozu, SI Kumagaya, ...
Neurocase 26 (1), 55-59, 2020
Cited by 14 · 2020
An interactive agent foundation model
Z Durante, B Sarkar, R Gong, R Taori, Y Noda, P Tang, E Adeli, ...
arXiv preprint arXiv:2402.05929, 2024
Cited by 13 · 2024
Enhancing listening capability of humanoid robot by reduction of stationary ego‐noise
N Wake, M Fukumoto, H Takahashi, K Ikeuchi
IEEJ Transactions on Electrical and Electronic Engineering 14 (12), 1815-1822, 2019
Cited by 12 · 2019
Bias in Emotion Recognition with ChatGPT
N Wake, A Kanehira, K Sasabuchi, J Takamatsu, K Ikeuchi
arXiv preprint arXiv:2310.11753, 2023
Cited by 10 · 2023
Position Paper: Agent AI Towards a Holistic Intelligence
Q Huang, N Wake, B Sarkar, Z Durante, R Gong, R Taori, Y Noda, ...
arXiv preprint arXiv:2403.00833, 2024
Cited by 9 · 2024
Text-driven object affordance for guiding grasp-type recognition in multimodal robot teaching
N Wake, D Saito, K Sasabuchi, H Koike, K Ikeuchi
Machine Vision and Applications 34 (4), 58, 2023
Cited by 9* · 2023
Articles 1–20