In this blog post, I share my thoughts on how 5G, Software-Defined Networking and Network Functions Virtualization (SDN/NFV), and Mobile Edge Computing (MEC) are integral to Augmented Reality and Virtual Reality (AR-VR) use cases, by taking a closer look at one interesting AR-VR use case. I also highlight key factors that make it imperative for ecosystem players to engineer these technology areas with due consideration of some critical parameters.
Before we look at the role of SDN/NFV, 5G and Mobile Edge Computing, let us begin by understanding some interesting facts about AR-VR use cases and industry adoption. Then we will demystify the technology behind a VR use case, which will help us understand the role of digital technologies in the network arena.
A sneak peek into some interesting AR-VR use-cases
Let us begin by understanding the market size of AR-VR. As per a market research report from Statista (2022), the global augmented reality (AR), virtual reality (VR), and mixed reality (MR) market reached 28 billion U.S. dollars in 2021 and is projected to grow to over 250 billion U.S. dollars by 2028.
With the Metaverse being the next evolution of the Internet, AR-VR plays a crucial role in it as one of its most tightly coupled technologies. To that extent, enterprises are innovating with AR-VR based use cases to enable their business processes, which can significantly improve their business through better operational experience, customer experience, new revenue streams and so on.
It is obvious that the gaming and entertainment industry will be among the early adopters of VR, with plenty of offerings. Travel and tourism, social media, education and sports are a few of the other industries that will see increased adoption of AR-VR in the coming years. I have listed some use cases here which I believe are poised to become game changers in their respective industries.
- AR based remote assistance for field repair and maintenance services
- 360-degree immersive VR video experience for live sports streaming
- VR experience for tourists planning their accommodation for vacations and tours
- AR based 3D fitting rooms in the retail industry, which allow consumers to virtually try on clothing
- AR enabled 3D experience for Home Décor and Furniture
What makes an AR-VR use case so complex and computationally intensive?
If you are an avid reader or follower of AR-VR use cases and solutions, I am sure you would have come across qualifications like 'AR-VR video processing is complex and computationally intensive'. And yes, that is a fact. To understand this better, I thought of taking a closer look at one of the VR use cases by examining the critical technical functions involved in it. I have chosen the '360-degree VR video with immersive experience for live streaming' use case to illustrate this.
Let us understand some basics about 360-degree video, VR video and 6-DoF, which are relevant here.
360-degree video vs VR video
VR video, at its full implementation, allows the viewer to move around and interact with virtual objects generated in the simulated environment. 360-degree video, on the other hand, allows viewers to look around but not traverse depth, as the position is controlled and determined by the person who recorded the video.
Six Degrees of Freedom (6-DoF)
6-DoF allows the VR user to traverse in six different directions within the video: three translational movements (forward/back, up/down, left/right) and three rotational movements (yaw, pitch, roll). The below figure depicts the head movements in the six directions. This technique allows one to traverse depth in the video, thereby improving the immersive experience.
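To make the idea concrete, a 6-DoF head pose can be modeled as three translational plus three rotational components. The sketch below is illustrative only; the class and field names are my own and do not come from any specific VR SDK:

```python
from dataclasses import dataclass

@dataclass
class HeadPose6DoF:
    """A 6-DoF pose: 3 translational + 3 rotational degrees of freedom."""
    # Translation (metres): forward/back, left/right, up/down
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    # Rotation (degrees): yaw (turn left/right), pitch (look up/down), roll (tilt)
    yaw: float = 0.0
    pitch: float = 0.0
    roll: float = 0.0

    def is_3dof(self) -> bool:
        """A plain 360-degree video only honours the rotational components."""
        return (self.x, self.y, self.z) == (0.0, 0.0, 0.0)

# A viewer leaning forward while looking left: requires full 6-DoF tracking,
# which a fixed-position 360-degree video cannot honour.
pose = HeadPose6DoF(x=0.2, yaw=35.0)
```

The `is_3dof` check captures the distinction made above: 360-degree video responds only to rotation, while true VR video also responds to translation.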
Now that we understand what turns a 360-degree video into more immersive VR video content, let us look at some of the critical technical functions involved in the overall workflow towards the final product that VR users consume.
- Capture/Record is the process of capturing the event using a 360-degree camera (e.g., GoPro Omni). This produces multiple raw video streams from the different cameras mounted on the rig.
- Video stitching is the process where the multiple raw video streams are stitched together to form a 360-degree VR video in real time. Video stitching plays a crucial role in the pipeline, and new techniques like depth-based stitching and region-of-interest stitching improve the quality of the video. Video calibration is a sub-function where the stitching software analyzes the video streams and identifies how each stream relates to the others before stitching begins.
- Video transcoding is the process of converting one video format (codec) to another. This can be software based or hardware (GPU accelerated) encoding, depending on the codecs used. The most widely supported codecs are H.264, H.265, VP8 and VP9.
- Video container format support – A container is like a box that holds the video, audio, metadata and other information embedded along with the content. Based on the end-user devices or soft clients, several container formats need to be supported and made available. In the case of video streaming, formats like HTTP Live Streaming (HLS), MPEG-DASH (adaptive streaming) and Adobe HTTP Dynamic Streaming (HDS) need to be available to stream the content.
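The transcoding and packaging steps above are commonly driven by a tool such as FFmpeg. The sketch below builds (but does not run) an FFmpeg command that transcodes a stitched 360-degree video to H.265 and packages it for HLS; the file names, bitrate and segment length are illustrative assumptions, not recommendations:

```python
def build_hls_transcode_cmd(src: str, out_playlist: str,
                            video_bitrate: str = "15M",
                            segment_seconds: int = 4) -> list:
    """Build an FFmpeg command line: H.265 encode + HLS packaging."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libx265",            # H.265/HEVC software encoder
        "-b:v", video_bitrate,        # target video bitrate (assumed value)
        "-c:a", "aac",                # re-encode audio to AAC
        "-f", "hls",                  # HLS packager
        "-hls_time", str(segment_seconds),   # segment duration in seconds
        "-hls_playlist_type", "vod",
        out_playlist,
    ]

cmd = build_hls_transcode_cmd("stitched_360.mp4", "stream/master.m3u8")
# The command could then be executed with subprocess.run(cmd, check=True),
# assuming FFmpeg with libx265 support is installed on the host.
```

In a production pipeline this step would typically produce several bitrate renditions (an adaptive-bitrate ladder) rather than a single stream, so that clients on slower links can still play the content.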
So, what is required to deliver a high-quality user experience in AR-VR use cases?
To achieve the right user experience for an AR-VR use case, the solution needs to cater to three critical requirements on the network that connects it all together.
- High network bandwidth: While solution providers, industry forums and other players in the AR-VR ecosystem are continuously inventing next-generation encoding techniques to compress video content, the demand for high network bandwidth is here to stay. For extreme VR use cases, this runs into gigabits per second.
- End-to-end low latency: The latency requirement varies by AR-VR use case. For extreme VR use cases like 360-degree live streaming, motion-to-photon (MTP) latency determines the underlying latency budget of the solution, which spans sensor, display, computing and communication delays. Studies reveal that an MTP latency of 10 ms to 15 ms is ideal for such use cases.
- Computation at the edge: Given the complex functions in AR-VR and their computing requirements, it is not always cost-effective to carry large amounts of content to a centralized datacenter. This implies that some of the real-time functions in the pipeline (like the ones we saw in the 360-degree VR streaming pipeline) need to be offloaded to the edge, closer to the source or the customer.
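To see why the bandwidth requirement lands in the Gbps range for extreme VR, a back-of-envelope calculation helps. The numbers below (8K resolution, 60 fps, 8-bit 4:2:0 chroma subsampling, an assumed 100:1 compression ratio) are illustrative assumptions, not measurements:

```python
def raw_bitrate_bps(width: int, height: int, fps: int,
                    bits_per_pixel: int = 12) -> int:
    """Uncompressed bitrate; 12 bits/pixel corresponds to 8-bit 4:2:0."""
    return width * height * bits_per_pixel * fps

def compressed_bitrate_bps(raw_bps: int, compression_ratio: float) -> float:
    """Delivered bitrate under an assumed end-to-end compression ratio."""
    return raw_bps / compression_ratio

# 8K (7680x4320) 360-degree video at 60 fps
raw = raw_bitrate_bps(7680, 4320, 60)
print(f"Raw: {raw / 1e9:.1f} Gbps")        # -> Raw: 23.9 Gbps
print(f"At 100:1: {compressed_bitrate_bps(raw, 100) / 1e6:.0f} Mbps")
# -> At 100:1: 239 Mbps
```

Even with aggressive compression, a single high-quality 360-degree stream sits in the hundreds of Mbps; serve multiple tenants or per-viewer viewport streams and the aggregate demand quickly reaches Gbps, which is the point made above.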
Digital network technologies that help address these requirements
Now that we have unpacked some critical aspects of an AR-VR solution and its critical requirements on the network, let us look at the three key network technologies – 5G, SDN/NFV and MEC – that meet these requirements seamlessly.
MEC – MEC stands for Mobile Edge Computing (now evolving into Multi-access Edge Computing). MEC provides edge solutions to host, control and manage Mobile edge applications, referred to as ME apps. In the context of AR-VR, functions such as video stitching, video calibration and video transcoding can be hosted as ME apps and services. MEC also provides other capabilities, including management, control, virtualization and integration with an orchestrator. (Refer to the ETSI MEC standards to learn more.)
SDN – When multiple ME apps and sessions deployed on ME hosts compete for shared bandwidth, bandwidth management becomes an important aspect. MEC provides a bandwidth management function and APIs that apps can consume to optimize the available bandwidth. Integrating the bandwidth management function with an SDN solution provides added flexibility through dynamic scaling, steering and traffic-engineering capabilities. Beyond this, SDN switch-based deployments in MEC locations deliver datacenter-economy benefits.
NFV – Both MEC and NFV are based on virtualization technology, and they complement each other. In terms of infrastructure, one can choose to deploy them co-located or as separate deployments. Some integral components of AR-VR and MEC platforms, such as DNS, can be offered as VNFs by NFV. Apart from this, 5G RAN network functions like the RAN-DU will be deployed as VNFs in the same location as MEC.
5G – 5G helps to extend the boundary of benefits from MEC and SDN/NFV. In the context of AR-VR, the extreme bandwidth and low-latency requirements are fulfilled by 5G. In particular, the network slicing function provided by 5G will be important for addressing the multi-tenant requirements of AR-VR use cases.
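The edge-placement argument can be sketched as a latency budget against the 10-15 ms MTP target mentioned earlier. All delay figures below are illustrative assumptions (a ~2 ms round trip to a nearby MEC host versus ~40 ms to a distant centralized datacenter), not measured values:

```python
MTP_BUDGET_MS = 15  # upper end of the ideal motion-to-photon budget

def total_latency_ms(sensor_ms: float, network_rtt_ms: float,
                     compute_ms: float, display_ms: float) -> float:
    """Sum the main contributors to motion-to-photon latency."""
    return sensor_ms + network_rtt_ms + compute_ms + display_ms

# Assumed component delays; only the network RTT differs between the cases.
edge    = total_latency_ms(sensor_ms=1, network_rtt_ms=2,  compute_ms=5, display_ms=4)
central = total_latency_ms(sensor_ms=1, network_rtt_ms=40, compute_ms=5, display_ms=4)

print(f"Edge:       {edge} ms (within budget: {edge <= MTP_BUDGET_MS})")
# -> Edge:       12 ms (within budget: True)
print(f"Central DC: {central} ms (within budget: {central <= MTP_BUDGET_MS})")
# -> Central DC: 50 ms (within budget: False)
```

Under these assumptions, the network round trip alone decides whether the MTP budget is met, which is why offloading real-time functions to a MEC host close to the user matters so much for immersive use cases.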
While SDN/NFV and MEC started their deployment journeys much earlier, with 5G deployments now happening across the globe we will start witnessing the adoption of even more AR-VR use cases. To summarize, the success and experience of any AR-VR solution lies in building and/or re-engineering the network for the attributes below:
- 5G-RAN, 5G-Core with Slicing capabilities
- Standards aligned NFV deployment to meet scalability
- Standards aligned MEC deployment to meet integration
- Integration of SDN to meet resource programmability
- Onboarding AR-VR solution components in MEC
- Cloud and Edge Engineering for Telco and Enterprise components
- Engineering Media and Content Delivery Network (CDN)
- Deployment Enablers and Automation
Infosys Engineering Services is actively building this ecosystem and investing in solutions in its CoE and Living Labs for 5G, SDN/NFV and MEC. The lab hosts a handful of 5G-enabled use cases, including 'VR enabled 360-degree live and immersive sports streaming experience on 5G'.
You can reach out to Infosys Engineering Services to learn more about it.