Sneak Peek: Animats Metaverse Viewer

The Second Life virtual world allows creation of extremely complicated and detailed models, but the tools used to view, explore, and manipulate them date from software developed almost twenty years ago, before the era of high-speed parallel graphics processing units (GPUs) that have allowed high-end games (so-called “AAA+” titles) to provide near-cinematic quality at high frame rates.

For more than six months, John Nagle of Animats has been developing, in stealth mode, a next-generation viewer for Second Life (and potentially other virtual worlds, including those based upon OpenSimulator), which uses the Vulkan cross-platform graphics API, allowing access to the multiple CPU and GPU cores available on today’s machines. It is implemented in the Rust programming language, whose features make development of code with multiple concurrent processes more straightforward and less prone to error than legacy languages such as C++.
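
To see why Rust is a good fit here, consider a minimal sketch, entirely my own illustration and not code from the Animats viewer, of the concurrency pattern the language makes safe: worker threads decode assets in parallel and hand the results to a render loop over a channel, with the compiler’s ownership and Send checks ruling out data races before the program ever runs. All type and function names below are invented.

```rust
use std::sync::mpsc;
use std::thread;

// A hypothetical decoded asset, ready for upload to the GPU.
struct DecodedMesh {
    name: String,
    vertex_count: usize,
}

fn decode_mesh(name: &str) -> DecodedMesh {
    // Stand-in for real work: parsing, decompression, and so on.
    DecodedMesh { name: name.to_string(), vertex_count: name.len() * 1000 }
}

fn main() {
    let (tx, rx) = mpsc::channel();

    // Spawn one worker per asset. Each Sender clone is moved into its
    // thread, so no mutable state is ever shared by accident.
    for name in ["tower", "bridge", "street_lamp"] {
        let tx = tx.clone();
        thread::spawn(move || {
            tx.send(decode_mesh(name)).expect("render thread hung up");
        });
    }
    drop(tx); // close the channel; the workers hold their own clones

    // The "render thread": drain results as they arrive, in any order.
    for mesh in rx {
        println!("uploading {} ({} vertices)", mesh.name, mesh.vertex_count);
    }
}
```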

Well, the viewer is not ready for people to try, and there are many features yet to be added, but the project has just peeked out from the stealth bunker with a discreet announcement on the Animats site, “Animats metaverse viewer”, accompanied by the following teaser image produced while walking through the City of New Babbage in Second Life.

Accompanying this is a recording of a walk around New Babbage, demonstrating the rendering speed and quality in a real environment. This video is hosted on the “Hardlimit” site, which Discourse does not know how to embed here, so simply click the title below to view the video.

This was run on a computer with a configuration typical of those used to run recent AAA+ game titles; it is not a high-end extreme gaming set-up by any means.

I expect we’ll see more demos as development progresses.

Many presentations of the “metaverse” show a distinctly cartoon-like world, as in this horror from Facebook (now calling itself “Meta”).

But it doesn’t have to be like that. What is now a modest and affordable computer used to play games can, with suitable software, allow exploration and creation of worlds with arbitrary levels of detail and complexity.

3 Likes

Here is another sneak peek, just published, of a walk-through of New Babbage at full resolution and a display rate of 60 frames per second.

This virtual world experience requires absolutely no modifications to Second Life: it is simply a matter of using a viewer implemented with contemporary software development methodologies, tools, and libraries that exploit the multi-threaded computing and graphics processing unit capabilities of today’s mid-market personal computers.
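
As a rough picture of what exploiting multi-threaded computing means in practice, here is a toy sketch (mine, not the viewer’s code) that splits per-vertex work across every available CPU core using Rust’s scoped threads; the workload itself is a stand-in:

```rust
use std::thread;

fn main() {
    // Pretend this is a mesh's vertex data.
    let mut vertices: Vec<f32> = (0..1_000_000).map(|i| i as f32).collect();

    // One chunk per available CPU core.
    let cores = thread::available_parallelism().map(|n| n.get()).unwrap_or(4);
    let chunk = (vertices.len() + cores - 1) / cores;

    // Scoped threads may borrow disjoint &mut slices; the borrow checker
    // proves the chunks never overlap, so no locks are needed.
    thread::scope(|s| {
        for slice in vertices.chunks_mut(chunk) {
            s.spawn(move || {
                for v in slice {
                    *v = v.sqrt(); // stand-in for real per-vertex work
                }
            });
        }
    });

    println!("processed {} vertices on up to {} cores", vertices.len(), cores);
}
```

The GPU half of the story, submitting work through Vulkan, is a separate and larger topic; this sketch shows only how the CPU cores are kept busy.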

It is fair to note that few of the destinations in Second Life are built with such painstaking attention to detail and depth of content as New Babbage, but if the ability to experience them at this level of realism became common, creators would doubtless be motivated to rise to the challenge.

2 Likes

I don’t have a Second Life, but I have seen screenshots (shudder). So, I’m glad its users will soon enjoy enhanced graphics. Perhaps I am missing something, but I do wonder how the so-called “metaverse” differs, if at all, from existing MMORPGs? Is the idea simply that online platforms will become more prevalent in the future?

Along the lines of technology that might enhance the metaverse experience, I came across this recently:

The EarIO works like a ship sending out pulses of sonar. A speaker on each side of the earphone sends acoustic signals to the sides of the face and a microphone picks up the echoes. As wearers talk, smile or raise their eyebrows, the skin moves and stretches, changing the echo profiles. A deep learning algorithm developed by the researchers uses artificial intelligence to continually process the data and translate the shifting echoes into complete facial expressions.

This seems like a neat technology that could be used to reproduce facial expressions in the metaverse. The advantages of using sonar rather than cameras or infrared motion sensors such as the Microsoft Kinect include privacy and energy efficiency.

The article continues:

“People may not realize how smart wearables are – what that information says about you, and what companies can do with that information,” Guimbretière said. With images of the face, someone could also infer emotions and actions. “The goal of this project is to be sure that all the information, which is very valuable to your privacy, is always under your control and computed locally.”

Using acoustic signals also takes less energy than recording images, and the EarIO uses 1/25 of the energy of another camera-based system the Zhang lab developed previously.
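
To make the “echo profile” idea quoted above concrete, here is a back-of-the-envelope sketch, my own illustration rather than anything from EarIO: cross-correlating the emitted pulse with the microphone recording shows when, and how strongly, echoes return, and it is shifts in those peaks that a learned model would decode into facial expressions.

```rust
// Naive cross-correlation: profile[lag] = sum over t of
// emitted[t] * recorded[t + lag]. Peaks mark echo arrival times.
fn echo_profile(emitted: &[f32], recorded: &[f32]) -> Vec<f32> {
    let lags = recorded.len().saturating_sub(emitted.len()) + 1;
    (0..lags)
        .map(|lag| {
            emitted
                .iter()
                .zip(&recorded[lag..lag + emitted.len()])
                .map(|(e, r)| e * r)
                .sum()
        })
        .collect()
}

fn main() {
    // A short pulse, and a recording containing a delayed, attenuated echo.
    let pulse = [1.0, -1.0, 1.0];
    let mut recording = vec![0.0f32; 32];
    for (i, &p) in pulse.iter().enumerate() {
        recording[10 + i] = 0.5 * p; // echo arriving at sample 10
    }

    let profile = echo_profile(&pulse, &recording);
    let peak = profile
        .iter()
        .enumerate()
        .max_by(|a, b| a.1.total_cmp(b.1))
        .map(|(i, _)| i);
    println!("strongest echo at lag {:?}", peak); // expect Some(10)
}
```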

1 Like

The key difference is that the content of Second Life is almost 100% user-created, as opposed to a game where the game studio builds the world and players interact with it in a largely passive way. In Second Life, users can buy, rent, and sell land, build on it, create new objects, including dynamic objects that interact with other users and one another, and extend the system in any way they wish.

Changes made by users are persistent and shared among everybody in the virtual space.

I got interested in Second Life as a way of building and delivering science simulations. Here are YouTube demos of some of my builds.

3 Likes

Nice simulations. I especially like the flocking birds. It occurs to me that very few video games have birds, and even fewer have realistic flocks of birds like those you created. Birds would bring a lot of life to the skies of virtual worlds, which are usually empty apart from the occasional volumetric clouds.
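
For anyone curious how such flocks are usually animated, the classic technique is Craig Reynolds’ 1987 “boids” model: each bird steers using only three local rules, separation, alignment, and cohesion, toward its neighbours. Here is a minimal generic sketch in Rust, not the code behind the simulations above:

```rust
#[derive(Clone, Copy)]
struct Boid {
    pos: [f32; 2],
    vel: [f32; 2],
}

fn step(boids: &mut Vec<Boid>, dt: f32) {
    let snapshot = boids.clone();
    for b in boids.iter_mut() {
        let (mut sep, mut avg_vel, mut center, mut n) =
            ([0.0; 2], [0.0; 2], [0.0; 2], 0.0);
        for o in &snapshot {
            let d = [o.pos[0] - b.pos[0], o.pos[1] - b.pos[1]];
            let dist2 = d[0] * d[0] + d[1] * d[1];
            if dist2 > 0.0 && dist2 < 25.0 {
                // neighbour within radius 5
                if dist2 < 4.0 {
                    // separation: steer away from very close neighbours
                    sep[0] -= d[0];
                    sep[1] -= d[1];
                }
                avg_vel[0] += o.vel[0];
                avg_vel[1] += o.vel[1];
                center[0] += o.pos[0];
                center[1] += o.pos[1];
                n += 1.0;
            }
        }
        if n > 0.0 {
            for k in 0..2 {
                let align = avg_vel[k] / n - b.vel[k]; // match neighbours' heading
                let cohere = center[k] / n - b.pos[k]; // drift toward local centre
                b.vel[k] += dt * (1.5 * sep[k] + 0.5 * align + 0.1 * cohere);
            }
        }
        b.pos[0] += dt * b.vel[0];
        b.pos[1] += dt * b.vel[1];
    }
}

fn main() {
    // A small flock strung out on a line, given slightly different headings.
    let mut flock: Vec<Boid> = (0..10)
        .map(|i| Boid { pos: [i as f32, 0.0], vel: [1.0, 0.1 * i as f32] })
        .collect();
    for _ in 0..100 {
        step(&mut flock, 0.1);
    }
    println!("boid 0 is now at {:?}", flock[0].pos);
}
```

Nothing global coordinates the flock; the swirling, cohesive motion emerges entirely from the local rules.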

2 Likes

Think of the differences between a toy and a game. You can use toys to make games by adding more rules and scoring, but they are distinct things. Sadly, there is a lot of mislabeling in the game industry, with definite toys (e.g., several of Will Wright’s Sim programs) being called games.

3 Likes