Dossier 13: The Metaverse

Inside the Metaverse Experiments of Babusi Nyoni

Babusi Nyoni is a tech entrepreneur and innovator building tools for future Africa. In late 2021 and early 2022 he experimented with human digitisation and built an experimental platform for a decentralised metaverse, as a response to the metaverses Big Tech companies were presenting. In the context of the Toolkit for the Inbetween—a research project initiated by The Hmm, affect lab, and MU Hybrid Art House—Babusi developed his platform further to put together an avatar dance party, the METAVERSES CHA-CHA-CHA. Leading up to the dance party, Babusi hosted a series of workshops, both online and at physical locations, in which participants could create their own 3D avatar.

Sjef van Beers: Hi Babusi. You did the workshops in early October and we’re going to have the dance party later this month. Those things are further developments of something you were already working on, right? Could you tell us more about that?

Babusi Nyoni: Last year, I think around December 2021, I started exploring the idea of human digitisation in the hope of creating some form of a decentralised metaverse. This came in the wake of being made aware of all these metaverse platforms that were owned by companies like Facebook or game companies, and, in short, of a conversation that was already being closed off before it could be had. And I started thinking about ways that everyday people can digitise themselves so that they can take part in versions of a metaverse that are not tied to companies they might not ethically align with. I explored different machine learning techniques to turn pictures of myself into 3D models. I found some models that worked pretty well and I turned myself into a 3D model using machine learning. Then I applied a texture to myself using Blender, and then I built a little web platform for me to exist as an extension of myself. And this was in December / January, or right around that period early this year.
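
To make the last step of that pipeline concrete, here is a minimal, hedged sketch of how a textured avatar exported from Blender (as glTF/GLB) could be shown in the browser with three.js. The library choice, the file name "avatar.glb", and the scene setup are assumptions for illustration; Babusi's actual platform code may differ.

```typescript
// Illustrative sketch only: a browser scene that loads a textured avatar.
// "avatar.glb" is a placeholder for the model produced by the ML + Blender steps.
import * as THREE from "three";
import { GLTFLoader } from "three/examples/jsm/loaders/GLTFLoader.js";

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(50, window.innerWidth / window.innerHeight, 0.1, 100);
camera.position.set(0, 1.6, 3); // roughly eye height, a few metres back

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// Simple lighting so the textured avatar is visible.
scene.add(new THREE.HemisphereLight(0xffffff, 0x444444, 1.0));

// Load the digitised, textured 3D model and place it in the shared space.
new GLTFLoader().load("avatar.glb", (gltf) => {
  scene.add(gltf.scene);
});

renderer.setAnimationLoop(() => {
  renderer.render(scene, camera);
});
```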

SvB: You mentioned that you use Blender and you use machine learning. Is it important to you that these tools are freely available, or maybe even open source?

BN: Yeah, exactly. The accessibility factor was a big thing for me: accessibility in terms of the tech, and also the work to be done to achieve it. Those things were really important to me. Throughout my career, I’ve made a name for myself by working with emerging technology and centring it on the needs of underrepresented communities. And this, to me, was something that was kind of in line with that, and it was a question posed to myself, by myself, as to how I could maybe subvert the more, I guess, corporate notion of a metaverse and turn it into something that could be accessible.

A laptop displaying Mixamo, an online service that adds full-body animation to 3D avatars
Adding movement to the avatars during a test workshop
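
As the caption notes, Mixamo can rig an avatar and attach a full-body dance animation to it. The sketch below shows one plausible way such a Mixamo export could be played back in a browser scene with three.js; the FBX file name and the scene setup are placeholders rather than the workshop's actual pipeline.

```typescript
// Hedged illustration: play a Mixamo-rigged dance clip on an avatar in the browser.
// "avatar-dance.fbx" stands in for an avatar plus dance move exported from Mixamo.
import * as THREE from "three";
import { FBXLoader } from "three/examples/jsm/loaders/FBXLoader.js";

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(50, window.innerWidth / window.innerHeight, 0.1, 1000);
camera.position.set(0, 150, 400); // Mixamo FBX exports are typically in centimetres

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);
scene.add(new THREE.HemisphereLight(0xffffff, 0x444444, 1.0));

const clock = new THREE.Clock();
let mixer: THREE.AnimationMixer | undefined;

new FBXLoader().load("avatar-dance.fbx", (fbx) => {
  mixer = new THREE.AnimationMixer(fbx);
  mixer.clipAction(fbx.animations[0]).play(); // play the baked-in dance move
  scene.add(fbx);
});

renderer.setAnimationLoop(() => {
  mixer?.update(clock.getDelta()); // advance the animation each frame
  renderer.render(scene, camera);
});
```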

SvB: And you mentioned this experiment came in response to metaverses created by companies. Could you say more about why you wanted to start this?

BN: The motivation for me was that I think with conversations around new technology, a lot of people are unaware of the possibilities of it, because a lot of people experience technology when it trickles down from big corporations. It’s important for independent creators or technologists to educate the public about the possibilities of new advancements in technology that they might not be seeing. So even to very publicly say “I’m building my own metaverse” already opens up a conversation around questions like “What actually is the metaverse then, because I thought it was something that Facebook owned?” That definitely was the motivation. But then looking at the case study that I’ve built for it, it’s more around being able to connect with people on a platform that allows for a different type of interaction. And one that also allows people who might not have access to devices like virtual reality gadgets and whatnot to be able to still experience a metaverse without encountering any limits based on economic factors.

SvB: Were there some surprises that you ran into when you were trying to figure out and set up your own metaverse or your own avatar?

BN: I would say the most interesting thing was something that I instigated myself, which was in the later part of my experiment. I invited a guest, my best friend, to my metaverse space and I digitised him. And then, because we were dancing in the virtual space, in selecting his dance move I chose one that was slightly more extravagant than he would do in real life. And I knew that, but I wanted to pose a question to him around how he felt, well after the fact. He was happy to be digitised and to be in the space, etc. But then, a couple of weeks later we called each other and I asked him: “So how did you feel about your dance?” And he replied: “Yeah, I felt like it wasn’t really representative of me.” And I said: “Oh, okay, cool. How did that make you feel?” to which he replied: “It’s like, yeah, a little weird.” But I really wanted to explore, at his expense, the idea of agency when you are digitised and that version of you is, I guess, owned by whoever owns the platform. What does agency look like when the owners of a platform can essentially do whatever they want to the platform and you might experience yourself performing actions you haven’t approved? Agency and control, for me, were the most interesting things to experience, or at least explore and have conversations around, in the process of defining my own metaverse.

SvB: This actually brings me to another question: should people in the metaverse have an avatar that looks like them? Because you could say you also have the freedom to pick a bizarre sort of alien look or something?

BN: Yeah, I mean, that’s a really good question. And I would say it’s a very Eurocentric question, because a lot of people in Europe have been well represented by, you know, avatars and everything since the beginning of the internet. My focus was really around people who would struggle to customise an avatar. Like my hair would never look like this, so it’s wild to have to pick hair that looks a certain way. I wanted to have that metaverse conversation at a really grassroots level. I mean, I built it for participation by people who live in Zimbabwe, South Africa, and whatnot—the latter being where my best friend currently lives. It was more a conversation for people who, for the first time, can see themselves represented digitally. Whereas, you know, on a more Eurocentric level, it would be unimpressive being visible in this way, when many just want to be unicorns or something.

SvB: Right, and is it now possible, with the methods that you’re using, to have an accurate representation? Was it hard to get there?

BN: I would say it’s relatively easy. There are techniques that I explored on the sidelines of defining my pipeline for human digitisation to the metaverse. There are methods with a high level of fidelity that, unfortunately for the purposes of extending the work, would either come with a steeper learning curve or would have dependencies, like needing another person to create a scan of you, which would exclude anyone who lives alone from the digitisation. For the purposes of the workshop that we’ve been running, we had to make sure that the steps could be taken by someone who lives alone. Because of this, digitising someone with a high-fidelity body scanning app or something similar was out, since that would mean someone who is alone can’t participate. This version of the workshop had to be made in order to make the entire experience a little more accessible, and one of the aspects we had to compromise on was fidelity. But there are approaches I figured out that would make it very easy to have a one-to-one likeness of yourself in the metaverse, with a very high fidelity of portrayal and high-quality textures.

SvB: A lot of things about the metaverse and its various (corporate) iterations are still unclear. There are questions like “What are we going to use this for?” and “What are its benefits?” How do you think about the future and the potential of the metaverse?

BN: I think during the peak of the coronavirus pandemic we were forced to rethink what it meant to connect. It was like a run-through of all these things that we could use to connect in ways that are not physical. Being forced to do so showed us that we could connect through means that aren’t physical or tactile. And the whole metaverse, or let’s just call it virtual or extended reality, allows us to explore a different way of human connectivity through the digital that, you know, can potentially expand our own consciousness. And I think new forms of connection will be crucial in allowing people from around the world to connect on a basis that goes a little further than just simple texts or calls, or the most ideal, which is physical connection.

SvB: How do you want to further develop the metaverse and the human digitisation platform you made?

BN: What I have right now, what we’ve been working towards, is the event that we’re going to run on the 27th of October 2022 during Dutch Design Week. That will be the first use of this platform beyond my initial experiment. I hadn’t used it for anything after running the first experiment, and when the opportunity came to showcase the technology I started working towards turning it into a hybrid experience platform and building everything into it to make it as liminal as possible.

SvB: What do you hope will be the outcome of the experiments?

BN: I want it to be a different way to connect. We’ve built in things I’m not sure exist in this format already. As far as meta or, you know, augmented or extended experiences go, I think this will be the first time that we have people connecting and translating their movements in a physical space to a meta space without having to directly be in control of them. We are using mobile device sensors to translate people’s physical movements into the online environment, and also to help people connect with each other virtually by analysing their dance rhythm, both in the physical space of the event and remotely for everyone joining in. The main takeaway I would like is for people to reimagine what it means to be in the metaverse. I think the next iteration of this could start to blur that even further and see to what extent we can really merge the physical and the meta, or extended reality, or whatever that is.
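
As an illustration of the sensor idea Babusi describes, the sketch below uses the browser’s DeviceMotionEvent API to read a phone’s accelerometer while someone dances and estimates a rough beat interval with simple peak detection. This is a hedged approximation of the concept, not the actual METAVERSES CHA-CHA-CHA code; the threshold values, the function names, and the suggestion of sending the tempo to the shared space over a WebSocket are assumptions.

```typescript
// Illustrative sketch: estimate a dancer's rhythm from phone motion sensors.
// Note: on iOS Safari, motion access must first be granted via
// DeviceMotionEvent.requestPermission().
const peakTimes: number[] = [];
const THRESHOLD = 14;   // m/s^2; acceleration magnitude above this counts as a "bounce"
const MIN_GAP_MS = 250; // ignore peaks closer together than this

window.addEventListener("devicemotion", (event: DeviceMotionEvent) => {
  const a = event.accelerationIncludingGravity;
  if (!a || a.x == null || a.y == null || a.z == null) return;

  const magnitude = Math.sqrt(a.x * a.x + a.y * a.y + a.z * a.z);
  const now = performance.now();
  const last = peakTimes[peakTimes.length - 1] ?? 0;

  if (magnitude > THRESHOLD && now - last > MIN_GAP_MS) {
    peakTimes.push(now); // record one "beat" of the dancer's movement
  }
});

// Average interval between recent peaks as a rough dance tempo, which could
// then be shared with the virtual space (e.g. over a WebSocket) to drive or
// match avatars' dance moves for people joining physically or remotely.
function estimateBeatIntervalMs(): number | null {
  if (peakTimes.length < 2) return null;
  const recent = peakTimes.slice(-8);
  const gaps = recent.slice(1).map((t, i) => t - recent[i]);
  return gaps.reduce((sum, g) => sum + g, 0) / gaps.length;
}
```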