Project Reference: ITP/025/19LP
Project Title: AI-Enabled Nonverbal Expression for Chatbot Enhancement
Hosting Institution: LSCM R&D Centre (LSCM)
Abstract: Chatbot technology leverages widely used multimedia conversational interfaces, backed by AI-equipped software, to engage the public in dialogs on a variety of domain-specific issues. However, textual content (typed input or speech-recognized text) is currently the only data source chatbots use to devise their responses to conversational partners. Audible and visible nonverbal communication cues, such as nodding or a puzzled facial expression, have not been explored or incorporated into the series of relatively short information exchanges that make up a chat. Such nonverbal cues may carry supportive context that helps chatbots better engage in ongoing dialogs. This proposed work aims to explore the use of artificial intelligence and machine learning methods to train three nonverbal expression detectors that detect occurrences of the auditory and visual nonverbal cues representing immediate reactive responses in a dialog. Auditory nonverbal cues will be classified by a trained sound response detector, while visual nonverbal cues will be classified by both a trained facial response detector and an upper body gesture response detector.
Project Coordinator: Dr Dorbin Ng
Approved Funding Amount: HK$2.79M
Project Period: 01 Aug 2019 - 31 Jul 2020
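To illustrate how the three detectors described in the abstract might feed a chatbot, the sketch below shows one possible way to combine their per-turn outputs into a single context tag. The abstract does not specify detector interfaces, class labels, or a fusion method, so every name, label, and the threshold-based rule here is an assumption for illustration only, not the project's actual design.

```python
# Illustrative sketch only: the class names, cue labels, and fusion rule are
# hypothetical; the project abstract does not define these interfaces.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class CueDetection:
    """A single nonverbal cue detected during the current dialog turn."""
    source: str        # assumed channel names: "sound", "face", or "gesture"
    label: str         # assumed cue labels, e.g. "affirmative", "puzzled", "nod"
    confidence: float  # detector score in [0, 1]


def fuse_nonverbal_context(detections: List[CueDetection],
                           threshold: float = 0.6) -> Optional[str]:
    """Reduce per-channel detections to one context tag for the dialog manager.

    Simple rule: discard detections below a confidence threshold and return the
    label of the most confident remaining cue, so the chatbot can condition its
    next response on it. A real system might instead use a learned fusion model.
    """
    confident = [d for d in detections if d.confidence >= threshold]
    if not confident:
        return None
    return max(confident, key=lambda d: d.confidence).label


# Example: a nod (gesture) and a puzzled face detected in the same turn.
turn_cues = [
    CueDetection(source="gesture", label="nod", confidence=0.82),
    CueDetection(source="face", label="puzzled", confidence=0.71),
]
print(fuse_nonverbal_context(turn_cues))  # -> "nod"
```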