AI Provides Real-time Listening Feedback

in natural language, while the user is still typing a question

Can AI Show That It's Listening?

Imagine having a conversation with someone. In spoken conversation, people naturally provide signals to show they are listening, like nodding or saying “uh-huh.”

However, when chatting with an LLM, this kind of interaction isn’t possible. One reason is that the model can’t see the user’s input until they press enter. Another is that strict turn-taking has become the norm in chat-based interactions — not just with LLMs but also in human-to-human online conversations.

But what if text-based interactions with LLMs could include real-time listening signals? How would users react if the AI could indicate attentiveness while they were still typing?

Our Approach

To explore this, we developed a web-based interactive system, shown in the video. We fine-tuned an open-source model on a custom dataset and built a web interface that supports real-time backchanneling.

Here, as the user types, the AI responds with “Yeah,” signaling that it is listening.
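The trigger logic can be sketched roughly as follows. This is a minimal illustration, not the actual implementation: the function names are hypothetical, and the simple heuristic below stands in for the fine-tuned model, which in the real system decides when a listening signal is appropriate.

```python
from typing import Optional


def should_backchannel(partial_input: str) -> bool:
    """Stand-in for the fine-tuned model: decide whether the partial,
    still-being-typed input warrants a listening signal.

    This toy heuristic fires at a clause boundary after enough words;
    the actual system queries the fine-tuned model instead.
    """
    words = partial_input.split()
    return len(words) >= 5 and partial_input.rstrip().endswith((",", "and", "so"))


def on_user_keystroke(partial_input: str) -> Optional[str]:
    """Called as the user types (e.g. from a websocket handler on the server).

    Returns a short backchannel such as "Yeah", or None to stay silent.
    """
    if should_backchannel(partial_input):
        return "Yeah"
    return None


if __name__ == "__main__":
    # Mid-sentence pause after a clause: the AI signals it is listening.
    print(on_user_keystroke("I was wondering about my weekend plans, and"))
```

In the real system, the browser streams keystrokes to the server as the user types, so the model can react before the user presses enter.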

A user study revealed that, while this kind of backchanneling made the AI feel more human-like, it did not provide significant functional benefits. One possible reason is that human–AI conversations are often more instruction-based than casual.

More details can be found in the research paper.

Tech Stack

Python, LLaMA, HTML, CSS, JavaScript, Photoshop, and Illustrator.