Community is a reconfigurable, physically distributed system of interacting LLM-driven personas.
It seeks to combine immersive art with AI-driven interactivity.
Many identical grey 3D-printed heads are arranged in groups around a room, with one to six heads per group. The top portion of each head is cut away, and a colourful brain sits in the opening at the top of each head.
Each head contains a Raspberry Pi computer connected to an RFID tag reader that sits near the top surface of the head. An RFID tag is fixed to the underside of each brain, and a persona is associated with the ID number of each tag; in this way, a unique persona is linked to each brain. When a brain is placed on a head, that brain's persona, including a particular voice, personal history, interests, and opinions on certain topics, is ‘loaded’ into the head. The heads within a group then converse with one another.
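The loading step can be pictured as a small lookup on each Raspberry Pi: read a tag ID from the RFID reader and fetch the matching persona definition. The sketch below is illustrative only; it assumes the common `mfrc522` Python library (`SimpleMFRC522`) and a hypothetical `personas.yaml` file, neither of which is confirmed as the project's actual implementation.

```python
# Illustrative sketch: poll a head's RFID reader and load the matching persona.
# Assumes the widely used `mfrc522` library for the Raspberry Pi and a local
# YAML file mapping tag IDs to persona definitions (both are assumptions).
import yaml
from mfrc522 import SimpleMFRC522


def load_personas(path="personas.yaml"):
    """Return a dict mapping RFID tag IDs to persona definitions."""
    with open(path) as f:
        return yaml.safe_load(f)["personas"]


def wait_for_brain(reader, personas):
    """Block until a brain (tag) is placed on the head, then return its persona."""
    tag_id, _ = reader.read()  # blocks until a tag is detected
    return personas.get(tag_id)


if __name__ == "__main__":
    reader = SimpleMFRC522()
    personas = load_personas()
    persona = wait_for_brain(reader, personas)
    if persona is not None:
        print(f"Loaded persona: {persona['name']}")
    else:
        print("Unknown brain; no persona loaded")
```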
Visitors to the space can listen in on the ongoing conversations between the heads, like a ‘fly on the wall’: unnoticed observers. They can also shape the conversations by moving the brains between heads in different groups; in this way, they control who talks to whom.
Version 1, with around 15 heads, is underway and will be completed by May 2025.
The following animation visualises the system operation.
Progress images/videos:

Proof of concept video:


Community has been made possible with funding from Culture Liverpool and the Imperial College London AI SuperConnector.
Technical details:
- The software is all custom Python code.
- The system is orchestrated using ROS2 (Robot Operating System).
- The main computer uses the OpenAI ChatGPT API for LLM access; a minimal sketch of a ROS2 node calling this API follows the YAML example below.
- The system is configured via a series of YAML files. One file lets the user define each persona, including their name, age, reactions to certain conversation topics, and reaction to being alone in a group. Another file defines the timestamps at which the system moves from one conversation phase to the next.
- YAML file example:
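An illustrative sketch of what the two files described above might contain; all field names and values here are assumptions rather than the project's actual schema.

```yaml
# personas.yaml (illustrative; field names are assumptions)
personas:
  427839021456:              # RFID tag ID on the underside of a brain
    name: Ada
    age: 52
    voice: en-GB-female-1
    topic_reactions:
      football: dismissive
      gardening: enthusiastic
    alone_reaction: "mutters about being left on her own"

# phases.yaml (illustrative; field names are assumptions)
phases:
  - name: introductions
    start_time: 0            # seconds from the start of the session
  - name: debate
    start_time: 120
  - name: wind_down
    start_time: 540
```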

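As a rough illustration of how the ROS2 and OpenAI pieces might fit together, the sketch below shows a single conversation node of the kind that could run on the main computer: it listens for the latest utterance in a group, asks the OpenAI API for the responding persona's reply, and publishes the result. The topic names, model name, and prompt are assumptions made for illustration, not the project's actual configuration.

```python
# Illustrative sketch of a conversation node (not the project's actual code).
# Listens for an utterance, generates the responding persona's reply with the
# OpenAI API, and publishes it; a separate coordinator (not shown) is assumed
# to route messages between heads and manage turn-taking.
import rclpy
from rclpy.node import Node
from std_msgs.msg import String
from openai import OpenAI


class ConversationNode(Node):
    def __init__(self, persona_prompt: str):
        super().__init__("conversation_node")
        self.client = OpenAI()  # reads OPENAI_API_KEY from the environment
        self.persona_prompt = persona_prompt
        self.pub = self.create_publisher(String, "conversation_out", 10)
        self.sub = self.create_subscription(
            String, "conversation_in", self.on_utterance, 10
        )

    def on_utterance(self, msg: String):
        # Ask the LLM how this persona would respond to what was just said.
        response = self.client.chat.completions.create(
            model="gpt-4o-mini",  # model choice is an assumption
            messages=[
                {"role": "system", "content": self.persona_prompt},
                {"role": "user", "content": msg.data},
            ],
        )
        reply = String()
        reply.data = response.choices[0].message.content
        self.pub.publish(reply)


def main():
    rclpy.init()
    node = ConversationNode("You are Ada, 52, who loves gardening and dislikes football.")
    rclpy.spin(node)


if __name__ == "__main__":
    main()
```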
Future Directions:
- There is enormous scope for increasing the complexity of the personas and the ways in which they interact with one another.
- The system could employ speech recognition to enable human visitors to join the conversation. As AI technology improves, this will become easier to implement.
- The use of ROS2 paves the way for the heads to incorporate moving (robotic) components, rather than being static 3D prints.
- The system could become a kind of game – a game that encourages interaction between human players to solve a challenge (à la escape rooms).
- People could configure their own personas via an app or online interface, with each persona linked to a personal RFID card. They could then take their card to a venue, place it in the system, and watch what happens.