Let’s all create the Open Social Spatial Web using WebXR and WebGPU
This was first given as a talk at Silicon Valley Virtual Reality #71 - 10 year modern VR anniversary event - on August 31, 2022 at the Computer History Museum
The metaverse is already on fire and we have not even built it yet. Think of the metaverse as a Frankenstein made of different parts of current internet and digital technologies, now with 3D visuals and touch. Between our physical reality and virtual reality there are different layers of mixed and augmented reality, where physical and digital objects blend. The metaverse will be somewhere across this spectrum: a "reality gradient." But we shouldn't let the 3D smoke and mirrors distract us from the fact that the metaverse is the internet, and as such, it will inherit the worst traits of its current business models.
So how do we fix it before we even build it? By avoiding the internet's past mistakes: understanding why we're building it, who can seize control of it, and what makes it different.
With access to our bodies, VR headsets and sensors can extract huge amounts of involuntary data that we can't control, like heartbeats or blinks, creating a capitalism of cognitive surveillance.
- Video game lawyer Micaela Mantegna
Augmented reality glasses are coming for your face. You will be at a disadvantage if you don't have them. And yet, how will you ever be able to trust what you see? Who will be in control of your perceived reality?
Trust and control come from openness and choice: open dialog, open teams, open source frameworks, and open, equitable relationships. Trust is connection over time.
Connection means that the metaverse needs an architecture linking all the metaverses together in a way that we can travel across them.
The web and its browsers are magic. Federated yet connected experiences linking pages, people, technologies, companies, standards, and nations. There’s not much else like it. The metaverse needs to reproduce the best features of the web – it should just be the web.
This metaverse as the web is not a new idea.
In 1994, the Virtual Reality Modeling Language (VRML, wikipedia.org/wiki/VRML) was conceived to extend HTML/XML to describe 3D space. The approach was primitive and inefficient, but it led to a fascinating organic community of early 3D web creators.
In 2011, WebGL brought 3D to the web with a JavaScript API for rendering interactive 3D graphics within any web browser without the use of plug-ins. WebGL is fully integrated with other web standards, allowing GPU-accelerated physics, image processing, and effects as part of the web page canvas.
Janus VR came on the scene in 2013 from the interdisciplinary research laboratory at the University of Toronto (wikipedia.org/wiki/JanusVR). The Janus platform was much more modern than VRML, and the team went on to make the first major WebGL tools to create, share, and experience spatially rich internet content. Building on early WebGL, Janus created its own web browser that bridged VR headsets to these virtual worlds. Ahead of its time, the Janus VR browser was niche and esoteric, but it let early VR headsets catch a glimpse of what was to come.
From 2010 to 2015, the WebRTC browser API was developed and released as a standard in web browsers, adding powerful voice and video transports, along with UDP network transports and P2P communication between browsers.
Inspired by the work of VRML and Janus, the WebVR browser API standard came online next. Its development, led by Mozilla from 2014 to 2016, brought VR support for Windows desktops and headsets to any browser.
Hand in hand with WebVR development, A-Frame was released in 2015. This UX project, originally led by Mozilla, sparked a wave of WebVR content creation that continues today, and the work led to Mozilla Hubs in 2018, creating the first open social web VR spaces. Even so, VR creation for the web remains slow, inefficient, complicated, and limited.
Essentially, the tools and technology are still not fully built out. In 2017, work began on the WebXR browser API, making strides to modernize the browser's renderer and add augmented reality support to the web. This new browser standard effort has been led by many companies, and web browsers now come standard on XR devices.
With the possibilities of the spatial web starting to bloom, new ways of thinking about web user experience took shape. Responsive immersive design became a sought-after concept: extend the familiar mobile-to-desktop responsive design into 3D, augmented reality, and virtual reality. If done right, could immersive web development be easier and more flexible than traditional game engines when building UX for 3D, AR, and VR?
The first major shared work on responsive immersive web experiences was outlined by Mozilla in April of 2018.
Check it out here: blog.mozvr.com/progressive-webxr-ar-store
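To make the idea concrete, here is a minimal sketch of the progressive approach, assuming WebXR type declarations (e.g. @types/webxr) are available; startImmersiveExperience and startFlatExperience are hypothetical hooks into an app's own render loop.

```ts
// Hypothetical hooks into the app's renderer.
function startImmersiveExperience(session: XRSession): void {
  /* drive the render loop from session.requestAnimationFrame */
}
function startFlatExperience(): void {
  /* fall back to the regular 2D page */
}

// Call this from a user gesture (e.g. an "Enter" button click):
// WebXR requires user activation before a session can be requested.
async function enterBestAvailableMode(): Promise<void> {
  const xr = navigator.xr;
  if (xr && (await xr.isSessionSupported("immersive-vr"))) {
    // Headset available: request a full VR session.
    startImmersiveExperience(
      await xr.requestSession("immersive-vr", {
        optionalFeatures: ["local-floor", "hand-tracking"],
      })
    );
  } else if (xr && (await xr.isSessionSupported("immersive-ar"))) {
    // Phone or AR glasses: request an AR session instead.
    startImmersiveExperience(await xr.requestSession("immersive-ar"));
  } else {
    // No XR at all: the same page still works as an ordinary responsive site.
    startFlatExperience();
  }
}
```

One page, three tiers: the experience degrades gracefully from VR to AR to flat 2D instead of shipping a separate build per platform.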
This approach is revolutionary compared to the native XR approaches in game engines.
Traditionally, a developer who wants to create magical immersive experiences must find a path between their assets and code and the target XR device; essentially, it's a path through a stack of APIs you must navigate. Code must be written and tuned at every layer: the platform store and identity, the XR API libraries and frameworks, the API frameworks of the operating system, down to the hardware and sensors themselves. Let's imagine we are an immersive creator trying to bring our vision to the owners of the next generation of immersive augmented and virtual reality devices.
To penetrate this massive, deep stack, most developers build on heaps of existing work. Practically, this means picking and putting up with an app store ecosystem and then choosing a game engine to do the low-level work for you. For AR and VR, that engine is almost always Unity or Unreal.
These proprietary tech stacks should not be taken lightly; they are an existential danger to a healthy ecosystem. The hardware and platform market is already dominated by a few players, and game engine consolidation will lead to further monopolies, loss of consumer choice, and the death of the greater possibility of the metaverse. And no matter how good Unreal Engine 5 is, royalties and app store fees will be a forever tax on the metaverse. This trend is even accelerating: Nvidia is now building a hardware-exclusive metaverse with Omniverse and RTX hardware.
The future doesn’t have to be this way. We can buck this trend towards consolidation. There’s an amazing alternative waiting in the wings.
WebXR. Here. Today.
WebXR is supported on many existing and upcoming devices.
WebXR learns from the lessons of OpenXR. OpenXR was born from the pain of every platform shipping its own proprietary XR APIs, which became so obnoxious that even the large game engines could not wrangle the industry. The same will happen again in the metaverse, and it will be much worse.
THE WEB IS FAST NOW
Chromium-based browsers now have the killer stack. The latest Chromium browsers (Chrome, Edge, Brave, and more) ship a modern, rebuilt renderer and compute stack: WebXR on top of WebGL via ANGLE, plus WebGPU. All of this calls down into the Vulkan API, culminating in an extremely fast and efficient pathway from the web to high-performance CPU and GPU computation.
WebGL 2 runs on ANGLE (Almost Native Graphics Layer Engine), an open source, cross-platform graphics engine abstraction layer developed by Google.
WebGPU is an API that exposes the capabilities of GPU hardware for the Web at native hardware speeds.
Vulkan, by Khronos, is a new-generation graphics and compute API for high-efficiency, cross-platform access to GPUs.
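To make "fast" concrete, here is a minimal sketch of browser GPU compute through WebGPU, assuming a browser with WebGPU enabled and the @webgpu/types declarations: a compute shader doubles an array of floats entirely on the GPU.

```ts
// Minimal sketch: double an array of floats on the GPU from a web page.
async function doubleOnGpu(input: Float32Array): Promise<Float32Array> {
  const adapter = await navigator.gpu?.requestAdapter();
  if (!adapter) throw new Error("WebGPU is not available in this browser");
  const device = await adapter.requestDevice();

  // WGSL compute shader: each invocation doubles one element.
  const module = device.createShaderModule({
    code: `
      @group(0) @binding(0) var<storage, read_write> data: array<f32>;
      @compute @workgroup_size(64)
      fn main(@builtin(global_invocation_id) id: vec3<u32>) {
        if (id.x < arrayLength(&data)) { data[id.x] = data[id.x] * 2.0; }
      }`,
  });

  // Storage buffer holding the input, uploaded at creation time.
  const storage = device.createBuffer({
    size: input.byteLength,
    usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_SRC,
    mappedAtCreation: true,
  });
  new Float32Array(storage.getMappedRange()).set(input);
  storage.unmap();

  // Staging buffer for reading results back to the CPU.
  const staging = device.createBuffer({
    size: input.byteLength,
    usage: GPUBufferUsage.MAP_READ | GPUBufferUsage.COPY_DST,
  });

  const pipeline = device.createComputePipeline({
    layout: "auto",
    compute: { module, entryPoint: "main" },
  });
  const bindGroup = device.createBindGroup({
    layout: pipeline.getBindGroupLayout(0),
    entries: [{ binding: 0, resource: { buffer: storage } }],
  });

  // Record and submit: one compute pass, then a copy for readback.
  const encoder = device.createCommandEncoder();
  const pass = encoder.beginComputePass();
  pass.setPipeline(pipeline);
  pass.setBindGroup(0, bindGroup);
  pass.dispatchWorkgroups(Math.ceil(input.length / 64));
  pass.end();
  encoder.copyBufferToBuffer(storage, 0, staging, 0, input.byteLength);
  device.queue.submit([encoder.finish()]);

  await staging.mapAsync(GPUMapMode.READ);
  const result = new Float32Array(staging.getMappedRange().slice(0));
  staging.unmap();
  return result; // e.g. doubleOnGpu(Float32Array.of(1, 2, 3)) -> [2, 4, 6]
}
```

The same pattern scales from toy kernels like this one to physics, skinning, and ML inference, all dispatched from ordinary page JavaScript.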
The Web Metaverse Works NOW
It seems easy, because jumping between websites is already an internet feature. But that is only because we don't carry our digital belongings with us. You can't take your Instagram likes to TikTok or your Roblox avatar to Fortnite. To achieve that, we need content portability, which is a very difficult engineering task: one that requires an object to function and look consistent across different software environments.
The task is already underway by developers around the world. glTF, developed by the Khronos Group, is the 3D file format for the web metaverse, and open extensions to the format are actively being worked on.
glTF is on track to become an open, extensible web metaverse standard.
Many glTF-based metaverse interoperability extensions and standards are in development by several organizations, covering avatars, portals, audio, video, and NFTs.
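As a small illustration of how approachable the format is on the web, here is a minimal sketch of loading a glTF asset with the three.js GLTFLoader; the asset URL is a hypothetical placeholder.

```ts
// Minimal sketch: load a glTF/GLB asset into a three.js scene.
import * as THREE from "three";
import { GLTFLoader } from "three/examples/jsm/loaders/GLTFLoader.js";

const scene = new THREE.Scene();
const loader = new GLTFLoader();

loader.load(
  "https://example.com/avatar.glb", // hypothetical asset URL
  (gltf) => {
    // gltf.scene is the parsed node hierarchy; extension data (avatars,
    // portals, audio, ...) rides along in gltf.parser.json.extensions.
    scene.add(gltf.scene);
  },
  undefined, // progress callback omitted
  (error) => console.error("glTF load failed:", error)
);
```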
The explosion of Computer Vision and Machine Learning on the web.
Starting around 2017, several frameworks for running deep learning models in the browser as GPU shaders were released, among them TensorFlow.js, ONNX.js, and TensorFire. They allowed machine learning models to be trained on data, tuned for inference, and deployed straight from the browser. This magic took advantage of the compute right in the browser, and powering web apps with intelligent inference became increasingly possible.
In the half decade since, these web ML frameworks have evolved to outpace the original native frameworks. TensorFlow.js is a clear example.
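As a minimal sketch of what in-browser inference looks like with TensorFlow.js (the model URL and input shape here are hypothetical placeholders):

```ts
// Minimal sketch: run a pretrained model in the browser with TensorFlow.js.
import * as tf from "@tensorflow/tfjs";

await tf.setBackend("webgl"); // inference executes as GPU shaders
const model = await tf.loadGraphModel("https://example.com/model.json"); // hypothetical
const input = tf.zeros([1, 224, 224, 3]); // stand-in for a camera frame
const output = model.predict(input) as tf.Tensor;
console.log(await output.data()); // raw prediction values
input.dispose();
output.dispose();
```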
Google's work on MediaPipe builds on TensorFlow.js and brings computer vision in a box to the browser. The web machine learning and computer vision tools of TODAY already match native features: 6DoF tracking, body tracking, occlusion, scene segmentation, and much more. It's nothing short of a revolution.
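For instance, in-browser hand tracking takes only a few lines with the MediaPipe Hands JS solution; a minimal sketch, assuming a playing video element carrying a camera feed:

```ts
// Minimal sketch: hand tracking in the browser with MediaPipe Hands.
import { Hands, Results } from "@mediapipe/hands";

const video = document.querySelector("video")!; // assumed camera feed

const hands = new Hands({
  // Fetch the WASM and model files from the official CDN.
  locateFile: (file) => `https://cdn.jsdelivr.net/npm/@mediapipe/hands/${file}`,
});
hands.setOptions({ maxNumHands: 2, minDetectionConfidence: 0.5 });
hands.onResults((results: Results) => {
  // 21 3D landmarks per detected hand: enough for gestures or avatars.
  console.log(results.multiHandLandmarks);
});

// Feed frames in a loop (requestAnimationFrame in a real app).
await hands.send({ image: video });
```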
The buzz of open source computer vision on GitHub is adding features faster than standards bodies, browsers, and tech giants can keep up. Many XR computer vision tools already exist: 8th Wall, Wikitude, Blippar, Zappar, Vossle, and several others. One of note is Niantic's Visual Positioning System (VPS) for Web, which determines position and orientation with centimeter-level accuracy in seconds. The catch is that it currently works only in small mapped areas in a few cities; it will take several years to roll out to any meaningful share of the world.
Looking at all this possibility led us to build an open toolbox for everyone, one that follows the positive trends of the open web. We are an open code collective working to make the web metaverse greater every day. Let's spatialize the web together with the best intentions in mind. Ethereal Engine is our sandbox for finding what's next in immersive computing. We built this platform with open code and creation to make a virtual place you control and trust. You own your data; you define your privacy and sovereignty. Let's all host and control our own regional metaverses with decentralization in mind. We invite you to build your metaverse dreams on the web with us.
Connection means that the metaverse is the people. The communities that will breathe life and populate these worlds. Without humanity, the metaverse will be just a 3D ghost town.
But not everyone will be there because we are not born equal into the metaverse. We are promised boundless worlds, but in fact, we will be as free as we can afford to pay.
The metaverse may still be a work in progress, but that means we still have room and a chance to influence what it will become… open, accessible and inclusive.
It's not every day that humanity has the chance to create a new reality. So my invitation to you: let's make it a good one.
- Video game lawyer Micaela Mantegna
Excerpts From "The metaverse is already on fire, and we haven't even built it yet"
youtube.com/watch?v=OYv1dIle47U
Visit Web Metaverse Worlds
JOIN OUR COMMUNITY
We are off to an amazing start making the web more equitable, safe, and decentralized, but we can’t do it alone. Help make this dream a reality by joining the effort!