✘ Open Culture Tech: the latest technology for every artist
And: creating a low-key technology toolbox; experiences working with AR, avatars and motion capture in live performance
Hi all,
We cover a lot of emerging tech in this newsletter, but today we have a takeover from the team at Open Culture Tech. I have previously mentioned them in relation to ethical AI practices, specifically around the artist Fi, developed by Thunderboom Records. Here, they go deep into the practicalities, learnings, and actionable takeaways from the experiments they run through their new vehicle, Open Culture Tech. Reading the piece below gets me excited about the future this tech offers us, and about how we can make it available to everyone.
Love, Maarten
Grimes clones her voice with AI, Gorillaz stage an AR experience in Times Square, and Travis Scott performs as an avatar in Fortnite. These world-renowned artists have the amazing opportunity and resources to experiment with the most cutting-edge technology. But how can we ensure that up-and-coming musicians get the same opportunities and resources to work with the latest technology, in a way that doesn't sacrifice artistic freedom, intellectual property, data privacy or the transparent functioning of the technology?
To answer this question, the Thunderboom Records Foundation launched a unique project last summer, called Open Culture Tech, in close collaboration with Superposition, Reblika and the Netherlands Institute for Sound and Vision. Open Culture Tech is an initiative to make the latest Artificial Intelligence (AI), Augmented Reality (AR) and Avatar technology more accessible to emerging artists. It is all about creating and sharing knowledge and open-source tooling that musicians can immediately use in their live performances. The project is funded by the Dutch Ministry of Education, Culture and Science.
In this newsletter, we – the Open Culture Tech team – would love to take you through our creative process and share the lessons learned from our first two pilot shows.
The goal of Open Culture Tech is to create an open-source toolkit of easy-to-use AI, AR and Avatar software tools: a mix of existing tools and tools that we develop ourselves. As there are thousands of possible applications within the fields of AI, AR and Avatar technology, the starting points for our development process come from the artists themselves. In the summer of 2023, Open Culture Tech issued an Open Call through which musicians in the Netherlands could join the pilot programme and participate in the creative development of our toolkit.
From a large number of applications, 10 diverse artists were selected, ranging from punk to R&B and from folk to EDM. Each artist introduced a unique research question that could be answered by applying new technology. Questions such as: “I’ve put my heart and soul into creating an EP that’s 15 minutes long. I want to perform this new work live on stage, but I need at least 30 minutes of music to create a full show.” Or: “How can I enhance the interaction between me and the audience when I play guitar on stage?”
Together with these artists, we first dive into the current offering of AI, AR and Avatar technologies that can help answer their questions. Our goal is to build an inclusive toolkit and creative workflow based on public values, one that guarantees the privacy and autonomy of artists. Unfortunately, most existing AI, AR and Avatar tools on the market do not meet these principles. They are often expensive, technically complex and do not put the interests of the user first. This means that we often have to develop and test our own technology. Since our artists pose so many different questions, we look for common denominators in their wishes to prioritise our own tool development.
The second step is to develop prototypes and organise pilot shows to test the technology on stage, together with the collaborating artists. In this process, the artistic direction lies entirely with the artist. Each show leads to unique insights, which in turn lead to new iterations of the technology. After 10 shows, we will have a validated toolbox that brings together existing technology and technology we developed ourselves.
Below, we share the creation process and findings from our first two prototypes and pilot shows. All results will be shared bi-weekly in the Open Culture Tech newsletter or on the Open Culture Tech Discord channel.
Two use cases
1. OATS
The first Open Culture Tech pilot show was created in close collaboration with OATS. Merging punk and jazz, OATS are establishing themselves through compelling and powerful live shows. Their question was: “How can we translate the lead singer's expression into real-time visuals on stage?” To answer it, we decided to create and project a 3D avatar on stage using Unreal Engine and motion capture technology.
Unreal Engine is the global industry standard for real-time 3D, used by major companies in the music, gaming and film industries. The learning curve is steep and the costs are high. Reblika is a Rotterdam-based 3D company with years of experience in creating hi-res avatars for the creative industry; they are currently developing their own Unreal Engine-based avatar creator tool, called Reblium. For Open Culture Tech, Reblika will develop a free, stand-alone, open-source version with an easy-to-use interface, aimed at helping live musicians. In this way, any musician will be able to create and project a 3D avatar on stage. The findings from the OATS show will guide the development of this open-source tool.
The master plan was to capture the body movement of lead singer Jacob Clausen with a motion capture suit (MVN Awinda) and link that signal to a 3D avatar in a 3D environment that could be projected live on stage. In this way, we could test what it’s like to use live avatars on stage, and find out which functionalities our open-source avatar creation tool would need. In this case, the aesthetic had to be dystopian, alienating and glitchy.
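To make “linking the signal” concrete: a suit like this streams a continuous series of pose frames over the local network, and something has to relay those frames to whatever renders the avatar. Below is a minimal sketch of that relay idea in TypeScript. The packet layout and ports are our own assumptions for illustration; the real Xsens MVN stream uses its own documented format.

```typescript
// Hypothetical relay: receive motion-capture frames over UDP and fan them
// out to connected renderers over WebSocket. NOT the actual MVN protocol.
import dgram from "node:dgram";
import { WebSocketServer } from "ws";

const wss = new WebSocketServer({ port: 8080 }); // renderers connect here
const socket = dgram.createSocket("udp4");       // the suit streams here

socket.on("message", (packet) => {
  // Assumed layout: 23 joints x 4 floats (a quaternion each), little-endian.
  const joints: number[][] = [];
  for (let i = 0; i < 23; i++) {
    joints.push([
      packet.readFloatLE(i * 16),
      packet.readFloatLE(i * 16 + 4),
      packet.readFloatLE(i * 16 + 8),
      packet.readFloatLE(i * 16 + 12),
    ]);
  }
  const frame = JSON.stringify({ t: Date.now(), joints });
  for (const client of wss.clients) client.send(frame); // fan out live
});

socket.bind(9763); // port is an assumption; check your mocap software's docs
```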
Our first step was to create a workflow for finding the right 3D avatar and 3D environment. OATS preferred a gloomy character in a hazmat suit, moving through an abandoned factory building. For the character, we used the Unreal Engine Marketplace, a website that offers ready-made 3D assets. To create the 3D environment, Jacob Clausen used a tool called Polycam to scan an abandoned industrial area. Polycam is an easy-to-use app built on LiDAR, a laser-based form of 3D scanning: it lets you scan any physical object or space and render it into a 3D model.
The 3D scan (the factory) and the avatar (the hazmat suit) were imported into Unreal Engine, and the avatar was connected to the motion capture suit. In the video below, you can see Jacob Clausen become the main character on screen and test the experience live on stage at Popronde, at EKKO in Utrecht, on 19 October at 23:30. What followed was a show that taught us a lot.
The venue provided us with a standard projector and a white screen behind the stage. Due to an overactive smoke machine, an unstable internet connection and the projector's low resolution, the avatar was not always visible on screen. Nevertheless, there were certainly moments where everything came together: the synchronisation between Jacob and his avatar was super interesting, the storytelling was amazing and the technology showed a lot of potential.
The motion capture suit was very expensive and we had to borrow it from Reblika. That is not very sustainable, accessible or inclusive. For our next prototype, we will look at AI-based motion capture, such as Rokoko Vision, instead of suits.
The 3D avatar and environment were shown from different camera angles. To make this possible, someone had to keep changing the camera angle in real time within Unreal Engine. Going forward, we should add predefined camera angles that switch automatically, so you don’t need an extra person to control the visuals; a sketch of the idea follows below.
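To illustrate what we mean by predefined camera angles, here is a minimal sketch in TypeScript using three.js (our choice for illustration; in the actual show this would be authored inside Unreal Engine): a handful of authored camera presets that the show cycles through on a timer, with no operator.

```typescript
import * as THREE from "three";

// Authored camera presets: a position plus a point to look at. The values
// here are made up; in practice you'd tune one per song section.
const presets = [
  { position: new THREE.Vector3(0, 1.6, 4),  target: new THREE.Vector3(0, 1, 0) },
  { position: new THREE.Vector3(3, 2.5, 2),  target: new THREE.Vector3(0, 1, 0) },
  { position: new THREE.Vector3(-2, 0.5, 3), target: new THREE.Vector3(0, 1.2, 0) },
];

const camera = new THREE.PerspectiveCamera(50, 16 / 9, 0.1, 100);
let current = 0;

// Cut to the next preset every eight seconds; no camera operator needed.
setInterval(() => {
  current = (current + 1) % presets.length;
  camera.position.copy(presets[current].position);
  camera.lookAt(presets[current].target);
}, 8000);
```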
2. Ineffekt
The second use case of Open Culture Tech was provided by Ineffekt. Through a blend of glistening vocal pieces, strings of dreamy melodies and distinctive rhythms, Ineffekt has crafted a sound that feels both comfortable and elusive. His question was: “How can I translate my album artwork into a virtual experience that could transform any location into an immersive video clip?” To answer it, we decided to build and test the first version of our own AR creation tool, with the help of Superposition, an innovative design studio for interactive experiences.
For his latest album artwork and music video, Ineffekt used a 3D model of a greenhouse in which yellow organisms are growing. This 3D model formed the basis for the AR experience we tested on the streets during the Amsterdam Dance Event.
Our main goal was to create and test an intimate mobile AR experience built on open-source technologies, one that doesn’t require you to download a new app. To make this possible, Superposition experimented with App Clips on iOS and Google Play Instant on Android. These technologies let the audience open a lightweight mobile experience after scanning a QR code, without installing the actual app. Superposition also used open-source software to create the 3D models and the AR experience: Blender was used to create a hi-res 3D model, and WebXR was used to bring this model into mobile augmented reality.
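To give a feel for the WebXR side, here is a minimal sketch in TypeScript with three.js. The file name "greenhouse.glb" is a placeholder for a Blender-exported glTF model, and this is our simplified reconstruction of the approach, not Superposition's actual code.

```typescript
import * as THREE from "three";
import { GLTFLoader } from "three/examples/jsm/loaders/GLTFLoader.js";
import { ARButton } from "three/examples/jsm/webxr/ARButton.js";

const scene = new THREE.Scene();
scene.add(new THREE.HemisphereLight(0xffffff, 0x444444, 1));

const camera = new THREE.PerspectiveCamera();
const renderer = new THREE.WebGLRenderer({ antialias: true, alpha: true });
renderer.xr.enabled = true; // let the WebXR session drive the camera
document.body.appendChild(renderer.domElement);

// Load the Blender-exported model and place it in front of the viewer.
new GLTFLoader().load("greenhouse.glb", (gltf) => {
  gltf.scene.position.set(0, 0, -2); // two metres ahead
  scene.add(gltf.scene);
});

// ARButton starts an immersive-ar session on supported mobile browsers.
document.body.appendChild(ARButton.createButton(renderer));

renderer.setAnimationLoop(() => renderer.render(scene, camera));
```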
On October 20 and 21, we tested our first AR prototype in front of Black & Gold in Amsterdam, during ADE. After scanning the QR code on a poster, the audience was taken to a mobile website that explained the project. Then, the camera on your phone would switch on and you’d see the yellow organisms grow around you. In the back, someone was sitting quietly. A silent avatar. The overall experience was poetic and intimate. As with OATS, we learned a lot.
We are super happy that it is possible to create a mobile AR experience with open-source technology.
The experience was static and did not react to the presence of the viewer. Going forward, we should look into adding interactive elements; a small sketch of one possibility follows below.
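Continuing the sketch above, one simple form of interactivity would be to read the viewer's position each frame and let the scene respond. The “organisms grow as you approach” behaviour below is a hypothetical example, not something we shipped:

```typescript
// `organism` stands for the loaded glTF scene from the earlier sketch.
// During a WebXR session, three.js updates `camera` with the viewer's pose,
// so its distance to the model tells us how close the viewer is standing.
renderer.setAnimationLoop(() => {
  const distance = camera.position.distanceTo(organism.position);
  const scale = THREE.MathUtils.clamp(2 - distance * 0.5, 0.5, 1.5);
  organism.scale.setScalar(scale); // closer viewer, larger growth
  renderer.render(scene, camera);
});
```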
The AR experience was created by combining different existing software tools, which gave us a validated workflow. Our next step is to integrate these tools into our own application and combine everything into one AR Creator Tool user interface.
Our ultimate goal is to develop accessible AI, AR and Avatar creation tools that musicians can use without our support. In the above examples, this was not yet the case: we mainly tested workflows built from existing tools rather than our own tools and interfaces. Going forward, we will start building and testing our own software interfaces and let artists create their own AI, AR and Avatar experiences from scratch. In this way, we are building towards a future in which every musician has equal opportunities and resources to work safely and autonomously with the latest technology.
LINKS
✌️ Open Culture Tech
https://www.openculturetech.com
Hey, we know. Terrible self-promotion. But why the hell not? We are all in this together, right?
⚡️ LiDAR Scanning
https://poly.cam/
The 3D-scanning software we used to capture the environment in the OATS pilot.
🤖 Rokoko Vision
https://www.rokoko.com/products/vision
An AI-based alternative to motion capture suits.