Emergent Mind

Abstract

Social robotics researchers are increasingly interested in conversational agents trained for multi-party interaction. With a growing demand for real-world evaluations, our study presents LLMs deployed in a month-long live show at the Edinburgh Festival Fringe. This case study investigates human improvisers co-creating with conversational agents in a professional theatre setting. We explore the technical capabilities and constraints of on-the-spot multi-party dialogue, providing comprehensive insights from both audience and performer experiences with AI on stage. Our human-in-the-loop methodology underlines the challenges these LLMs face in generating context-relevant responses, stressing the user interface's crucial role. Audience feedback indicates a growing interest in AI-driven live entertainment and direct human-AI interaction, along with a diverse range of expectations about AI's conversational competence and its utility as a creativity support tool. Human performers express enthusiasm but varied satisfaction, and evolving public opinion reflects mixed feelings about AI's role in the arts.

AI-based improv theatre by Improbotics with LLM-generated lines curated via radio-connected earphones.

Overview

  • The paper explores the use of LLMs for co-creative improvised theatre, detailing the deployment of AI-driven conversational agents in live performances at the Edinburgh Festival Fringe.

  • Challenges encountered included speech recognition, understanding physical context, and ensuring timely responses in a dynamic, multi-party dialogue environment.

  • Surveys revealed that while audiences and performers found AI participation intriguing, they also noted its limitations in generating contextually nuanced responses, highlighting areas for improvement and future research.

Designing and Evaluating Dialogue LLMs for Co-Creative Improvised Theatre

Introduction

AI isn't just for your smart speaker or chess-playing algorithms anymore. AI has been making inroads into more creative, social, and interactive areas. One fascinating example of this is a study involving LLMs designed for interactive improvised theatre performances. Imagine watching a live improv show where one of the actors is not human, but an AI! This paper details the deployment of these AI-driven conversational agents during a month-long series of live performances at the Edinburgh Festival Fringe.

The Experiment: AI in Live Theatre

Setting the Stage

Improvised theatre is a dynamic and unpredictable environment, making it an excellent playground for experimenting with AI co-creativity. In these Fringe performances, teams of professional human improvisers shared the stage with conversational agents powered by three different LLMs: GPT-3.5 (OpenAI), PaLM 2 (Google), and Llama 2 (Meta). The AI's lines were delivered through a human actor referred to as the "Cyborg," who received the lines via an earpiece and acted them out on stage.

Challenges

The complexity of live, multi-party dialogue presented several hurdles:

  1. Speech Recognition: Multiple microphones were needed to identify different speakers on stage.
  2. Physical Context: AI needed to understand not just words but gestures, tone, and other physical cues.
  3. Timely Responses: The AI's responses had to be appropriately timed, so it relied on continuous speech recognition supplemented with metadata typed live by an operator to provide context.

A human-in-the-loop system allowed a curator to select the best response from the AI's generated lines during performances, ensuring the output was contextually relevant.
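The pipeline described above — a running transcript plus operator-typed scene metadata, fanned out to several LLMs, with a human curator picking one line to relay to the Cyborg — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the prompt wording, the `models` callable interface (standing in for real LLM API calls), and the curator callback are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    """One suggested next line, tagged with the model that produced it."""
    model: str
    line: str

def generate_candidates(transcript, metadata, models):
    """Query each model for a suggested next line of dialogue.

    `models` maps a model name to a callable standing in for a real
    LLM API call (a hypothetical interface for this sketch)."""
    prompt = (
        "You are an improviser on stage. Continue the scene.\n"
        f"Scene context (typed live by the operator): {metadata}\n"
        f"Transcript so far (from continuous speech recognition):\n{transcript}\n"
        "Next line:"
    )
    return [Candidate(name, call(prompt)) for name, call in models.items()]

def curate(candidates, choose):
    """Human-in-the-loop step: `choose` is the curator's selection
    (e.g. a UI callback returning an index into the candidate list);
    the picked line would then be relayed to the Cyborg's earpiece."""
    return candidates[choose(candidates)].line
```

A curator UI would display all candidates side by side, so the per-model tagging matters: it lets the operator learn which model tends to produce usable lines for a given game format.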

Putting AI to the Test: Formats

To explore how these AI systems could cope in the intense setting of live improvised theatre, various games were designed:

  1. Speed Dating: AI had to perform rapid-fire dialogues with different characters.
  2. Wedding Speech: AI helped generate coherent, humorous speeches incorporating both scripted and audience inputs.
  3. Couples' Therapy and Meet the Parents: AI had to juggle conversations involving multiple interaction dynamics.
  4. Hero's Journey: A complex narrative where AI had to participate in an evolving long-form story.

Audience and Performer Surveys

Surveys were conducted to evaluate the audience's perception of AI in live performance and to gauge the performers' experiences.

Audience Feedback

Audience responses revealed a mixed bag of fascination and skepticism:

  • People were generally curious about AI's role and capabilities.
  • There was excitement about AI's potential in creative fields, but less optimism about its storytelling abilities.
  • AI's responses were viewed as somewhat machine-like and often required human improvisers to work around its limitations.

Performer Feedback

Performers noted various challenges and enjoyments:

  • AI often provided non-sequiturs, adding a layer of unpredictability that improvisers had to creatively integrate.
  • Some performers found AI responses too mechanical, missing the nuanced understanding a human partner would bring.
  • Yet, AI often spurred unexpected and humorous outcomes, making scenes more dynamic.

Practical and Theoretical Implications

Practical Implications

From a practical standpoint, this experiment highlights several potential areas for enhancing human-AI collaboration in real-time creative settings:

  • Enhancing Context Understanding: Improved speech recognition and context-setting mechanisms could make AI interactions more fluid.
  • Refined Curatorial Tools: Developing better UI tools for real-time curation could allow faster, more intuitive scene management.

Theoretical Implications

The research also provides insights into AI's evolving role in social and creative contexts:

  • Human-Centered AI: Highlighted the importance of human-in-the-loop systems to guide AI, making the performances more enjoyable and coherent.
  • Public Perception: Showed that live exposure to AI can demystify its capabilities and limitations, contributing to a more informed public discourse around AI technologies.

Future Developments

Enhanced Multi-Party Dialogue

Future iterations could focus on:

  • Advanced Turn-Taking Algorithms: Improving the AI’s ability to manage and participate effectively in multi-party conversations.
  • Physically Interactive Systems: Incorporating non-verbal cues like gestures and facial expressions to make AI interactions more lifelike.

Application Beyond Theatre

These findings have implications beyond theatre:

  • Social Robotics: Use cases in social robots where AI can engage in authentic, multi-party dialogues.
  • Education and Training: AI-driven participation in creative learning environments to assist with social and communication skills.

Conclusion

By thrusting AI into the limelight of live theatre, this study sheds light on both the capabilities and limitations of conversational LLMs in complex, real-world settings. It opens up exciting avenues for future research and development, emphasizing the importance of human-AI collaboration. Whether for entertainment or more serious applications, AI's role in our social and creative lives is not just feasible but increasingly fascinating.
