Community is a reconfigurable, physically distributed system of interacting LLM-driven personas.
It is inspired by two main lines of thinking:
1. How can we create a system for entertainment that uses novel technologies (AI/IoT) but is not designed to be used alone? That is, a system designed to be used by multiple people, in a public space. I wanted to make something that uses advanced tech but does not align with the possible dystopian future where we all sit at home in our VR headsets. Ideally, Community would lead people to feel more connected with one another. There are none of the usual computer peripherals (mouse/keyboard/screen/VR headset); instead, you interface with the system simply by picking things up and putting them down in specific locations.
2. How can we create a creative system that uses novel technologies, but that does not aim to replace the artist? There is a lot of discussion around AI’s ability to be creative (generative text/images/video) and the potential for this to replace creative jobs. Community will ideally be a system that can be used by writers to tell stories in an interesting new way (e.g. non-linear storytelling).
The idea is to use AI tools to craft novel formats for creative expression, rather than ‘taking over’ creative expression.
This line of thinking is related to the work I have done creating collaborative painting robots.
The project started with heads and brains. The full version 1 of the system is shown in the first video on this page.
Later on, I broadened the idea to ‘inanimate objects’ in general. The video below shows some inanimate objects talking to each other, powered by the Community system (captions are AI-generated, so may not be entirely accurate!!).
Each object has an RFID tag on the bottom. There are Raspberry Pis and RFID card readers attached under the table. The lights help to make it clear who is talking.
Community has been made possible with funding from Culture Liverpool and the Imperial College London AI SuperConnector.
How it works:
Each head contains a Raspberry Pi computer, which is connected to an RFID tag reader that sits near the top surface of the head. An RFID tag is stuck to the underside of each brain. A persona is associated with the ID number of each RFID tag; in this way, a unique persona is linked to each brain. When a particular brain is placed on a head, the persona for that brain is ‘loaded’ into the head. This persona includes a particular voice, personal history, interests, and opinions about certain topics. The heads within a group then converse.
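As a rough sketch of this tag-to-persona mapping (the tag IDs, persona fields, and helper names below are illustrative placeholders, not the actual Community code), the lookup on each head could look something like this:

```python
# Illustrative sketch of the brain-to-persona lookup on a head.
# Tag IDs, persona fields, and read_tag_id() are placeholders, not the real project code.

PERSONAS = {
    "04A1B2C3": {
        "name": "Ada",
        "voice": "female_uk",
        "history": "Retired lighthouse keeper who loves local history.",
        "opinions": {"modern technology": "suspicious"},
    },
    "04D5E6F7": {
        "name": "Bram",
        "voice": "male_nl",
        "history": "Former dockworker who restores old radios.",
        "opinions": {"modern technology": "enthusiastic"},
    },
}


def read_tag_id() -> str:
    """Return the ID of the RFID tag currently sitting on this head's reader."""
    return "04A1B2C3"  # in the real system this would come from the RFID reader


def load_persona(tag_id: str) -> dict | None:
    """Return the persona linked to a brain's tag, or None if the tag is unknown."""
    return PERSONAS.get(tag_id)


if __name__ == "__main__":
    persona = load_persona(read_tag_id())
    if persona is not None:
        print(f"Loaded persona: {persona['name']}")
```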
The following animation visualises the system operation.
Progress images/videos:

Proof-of-concept video (from September 2024):


The video below shows two prototype heads talking to each other (captions are AI-generated, so may not be entirely accurate!!).
Technical details:
- The software is all custom Python code.
- The system is orchestrated using ROS2 (Robot Operating System).
- The main computer uses the OpenAI ChatGPT API for LLM access (see the sketch after this list).
- The system is configured via a series of YAML files. One file enables the user to define each persona, including their name, age, reaction to certain conversation topics, and reaction to being alone in a group. Another file allows the user to define the timestamps at which the system moves from one conversation phase to the next.
- YAML file example:
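The real configuration files aren't reproduced here, so the snippet below is only an illustrative guess at their shape; every field name and value is made up:

```yaml
# personas.yaml (illustrative only; field names and values are invented)
personas:
  ada:
    name: Ada
    age: 67
    voice: female_uk
    history: Retired lighthouse keeper who loves local history.
    topic_reactions:
      modern_technology: suspicious
      the_sea: enthusiastic
    reaction_to_being_alone: starts reminiscing out loud

# phases.yaml (illustrative only)
phases:
  - name: introductions
    start_time: 0      # seconds from the start of the session
  - name: debate
    start_time: 120
  - name: winding_down
    start_time: 300
```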

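To make the ChatGPT API item above more concrete, here is a minimal sketch of how a head's loaded persona could be turned into a chat request using the OpenAI Python client; the model name, prompt wording, and function names are my assumptions rather than the project's actual code:

```python
# Minimal sketch: asking the LLM for a persona's next line.
# Model name, prompt wording, and helper names are assumptions, not the real code.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def persona_system_prompt(persona: dict) -> str:
    """Build a system prompt from a persona loaded from the YAML config."""
    return (
        f"You are {persona['name']}, age {persona['age']}. "
        f"Backstory: {persona['history']} "
        "Stay in character and keep each reply to one or two spoken sentences."
    )


def next_utterance(persona: dict, conversation: list[dict]) -> str:
    """Return the persona's next contribution to the group conversation."""
    messages = [{"role": "system", "content": persona_system_prompt(persona)}]
    messages += conversation  # earlier turns, as {"role": ..., "content": ...} dicts
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=messages,
    )
    return response.choices[0].message.content
```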
Future Directions:
- There is enormous scope for increasing the complexity of persona personalities and the way in which they interact with one another. This paper is a big inspiration for simulating personalities and interactions.
- The system could employ speech recognition to enable human visitors to join the conversation. As AI technology improves, this will become easier to implement.
- The use of ROS2 paves the way for the heads to incorporate moving (robotic) components, rather than being static 3D prints. It would be amazing if the heads were animatronic – e.g. see this guy’s work on YouTube.
- The system could become a kind of game – a game that encourages interaction between human players to solve a challenge (à la escape rooms).
- People could configure their own personas via an app or online interface, with each persona then associated with a personal RFID card. They could take their card to a venue, place it in the system, and watch what happens.