Saturday, June 01, 2024

 

Relieving a fear of public speaking




IMAGE: SMU Associate Professor Kyong Jin Shim. Inspired by her own experience, Professor Shim is leading a research project that integrates virtual reality technology and AI to improve public speaking skills in students. (Credit: Singapore Management University)




By Alistair Jones

SMU Office of Research - If you dread public speaking, you are not alone. It is a leading social phobia, one that can cause a state of anxiety that reduces otherwise articulate people to nervous incoherence.

A strong fear of public speaking is known as glossophobia. Academic studies estimate it affects 20 per cent of the population, but depending on the sample and methodology, the figure could be as high as 40 per cent. 

As American writer and humorist Mark Twain said, "There are two types of speakers: those who get nervous and those who are liars."

But help may be on the way. Kyong Jin Shim, an Associate Professor of Information Systems at Singapore Management University (SMU), is leading a research project that explores the integration of virtual reality (VR) technology and AI to improve public speaking skills in students. 

And while the research specifically focuses on evaluating how effectively this technology develops public speaking skills in university students, the methodology could have wider applications. The project has been awarded an MOE Tertiary Education Research Fund (TRF) grant, and the proposed solution is called PresentationPro.

"[Through headsets], presenters will see a three-dimensional virtual environment that mimics a real-world presentation setting, complete with a crowd of AI-driven avatars representing an audience," Professor Shim says. 

"These avatars will display behaviours typical of a live audience, such as nodding, making eye contact, showing various expressions and providing real-time feedback to the presenter," Professor Shim says.

In a high-tech update on practice makes perfect, PresentationPro aims to provide a way for presenters to hone their public speaking skills without the logistical challenges of assembling a live audience for every student.

The team is collaborating with SMU’s Centre for English Communication (CEC) to translate their “presentation” know-how and best practices into a digital platform, and eventually to scale CEC’s communication coaching.

Avatar triggers

The VR content, including the audience avatars, is generated through a combination of advanced computer graphics and AI algorithms. Making the avatars responsive in real time is no small task.

"This is achieved through sophisticated AI programming that includes natural language processing (NLP) and behaviour modelling. The system uses machine learning to analyse the presenter’s speech and body language, allowing avatars to respond realistically in real time to both verbal and non-verbal cues," Professor Shim says.

By working with SMU’s Centre for Teaching Excellence (CTE), Professor Shim’s faculty team tapped into CTE’s expertise in classroom management and its knowledge of the many kinds of behaviour that can surface in classroom “presentation” scenarios. These observed behaviours of students and instructors play a crucial role in engineering PresentationPro’s “audience avatar” behaviours using AI.

But can the avatars interrupt the presenter?

"Yes, avatars can interrupt and ask questions, simulating a dynamic interaction typical of real audiences. This capability is enabled by integrating NLP and speech recognition technologies, allowing avatars to process spoken language and respond appropriately," Professor Shim says.

The physical cues of presenters will also be monitored.

"In addition to heart rate tracking with Fitbits, the system uses VR headsets such as Meta Quest equipped with head and gaze tracking technology to monitor where the presenter is looking, such as whether they are avoiding eye contact by staring at their feet. Gesture tracking is also employed to catch other physical behaviours like fidgeting," Professor Shim says.

Verbal triggers for the avatars are set up using a combination of speech recognition and sentiment analysis technologies. 

"These triggers are calibrated to recognise various speech patterns and anomalies such as tics, stutters, or deviations from the script, which then cue the avatars to react in specific ways that mimic a real audience's response," Professor Shim says.

Behavioural changes

The researchers have generated digital twins: highly detailed digital replicas of human behaviours and interactions that respond much like real individuals, ensuring a diverse and realistic audience simulation reflective of a typical SMU classroom.

"Using different avatars helps to avoid repetition and predictability in audience reactions, enhancing the realism of the virtual environment and mimicking a typical seminar or classroom setting," Professor Shim says.

"VR and AI can simulate realistic social interactions, which can help individuals practise and improve their public speaking skills in a low-risk environment. Repeated exposure and positive reinforcement through VR can reduce anxiety, build confidence and lead to behavioural changes.

"Improvements will be measured through both subjective evaluations (participant and instructor feedback) and objective metrics (performance data collected during VR sessions and traditional in-person assessments). Comparisons will be drawn between control and experimental groups to assess the efficacy of VR training," Professor Shim says.

Transformative tool

Interestingly, for a project about behavioural change, no psychologists were among the expert investigators when it began.

"The research team primarily consists of specialists in education technology, AI, and public speaking, focusing on the technological and instructional design aspects of the project," Professor Shim says. 

"Although psychologists play a crucial role in understanding and addressing anxiety, our project's current scope concentrates on developing and integrating AI-driven solutions for public speaking training. Nevertheless, we recognise the value of interdisciplinary collaboration and are very open to partnering with experts in the social sciences to enhance our understanding of anxiety management. 

"Such collaborations could lead to further refinements in our VR system, ultimately enriching the learner's experience by more effectively addressing public speaking anxiety."

Professor Shim has since added SMU Assistant Professor of Psychology Andree Hartanto to the team to explore:

  • Psychological mechanisms through which VR may reduce glossophobia;
  • Long-term impacts of VR training on public speaking anxiety; and
  • Differential effects of VR training across diverse demographic groups.

Professor Shim's journey into VR applications began in 2021 with a prototype designed to train new lecturers at SMU. 

"My personal experiences as a faculty member, grappling with the challenges of adapting to a new cultural and academic environment, deeply influenced this initiative. During my early years at SMU, I found lecturing to a seminar-style classroom of 45 students from diverse backgrounds to be particularly daunting," she says.

"As I transitioned into a mentorship role for newer faculty members, I realised how beneficial immersive technologies like VR could be in accelerating the on-boarding process for new lecturers. This technology allows them to practice lecturing in their own time and space, repeat sessions as needed, and eliminates the logistical challenges of scheduling real seminar rooms and audiences. 

"Inspired by the potential of this initial application, we set out to develop a similar VR system to enhance public speaking skills for students. This project not only leverages my teaching-related research in collaboration with CTE, but also builds upon our foundational work in VR, aiming to provide a transformative educational tool for a wider audience," Professor Shim says.

