
How metaverse could play a role in reshaping healthcare

The health industry has started using components like augmented reality (AR), virtual reality (VR), and artificial intelligence (AI) in software and hardware to enhance the proficiency of medical devices and expand the reach of medical care, according to a report.

By CNBCTV18.com | December 7, 2021, 7:33:08 PM IST (Updated)
Big tech companies like Meta and Microsoft have been emphasising how the metaverse could revolutionise the world. While it remains to be seen how the metaverse will evolve, a few businesses are already using some of the essential components expected to ultimately make up the metaverse. Healthcare and medicine is one such industry.



The health industry has now started using components like augmented reality (AR), virtual reality (VR), and artificial intelligence (AI) in software and hardware to enhance the proficiency of medical devices and expand the reach of medical care, according to a CNBC report. Simply put, the metaverse will be an extension of the VR, AR and mixed reality (MR) technologies that we use today.

The metaverse or virtual universe is a set of interconnected spaces online where one can indulge in activities like gaming, shopping and even attend events through virtual avatars. It is an amalgamation of different technologies like VR, AR, MR, AI and digital currencies.

According to the report, the World Health Organization is using AR to train COVID-19 first responders through smartphones, psychiatrists are using VR technology to treat post-traumatic stress in veterans, and medical schools are using the technology for training.


The global market for AR in healthcare is expected to grow from $1.06 billion in 2020 to $1.42 billion in 2021, and to $4.15 billion by 2025, Globenewswire reported, quoting research by The Business Research Company. Here’s a quick look at various other components of the metaverse that could gradually change healthcare as we know it.

Meta's Oculus

Meta (then Facebook) acquired Oculus, the virtual reality (VR) headset maker, in 2014. Since then, Oculus has collaborated repeatedly with the healthcare industry. One of the most prominent recent collaborations was with Nexus Studios and the WHO Academy: Meta's R&D incubator has designed a mobile app that lets healthcare workers learn more about battling COVID-19. One training course uses AR to simulate the proper technique and sequence for putting on and removing personal protective equipment. The app is available in seven languages.

Oculus technology is also used at the University of Connecticut's medical centre to train orthopaedic surgery residents. The university has teamed up with PrecisionOS, a Canadian medical software company that offers VR training and educational modules in orthopaedics. The Oculus headsets let residents visualise a range of surgical procedures in 3-D, in a risk-free environment where they can practise their skills.

Microsoft's HoloLens

Microsoft introduced the HoloLens smart glasses in 2016. One of the early adopters was Stryker, a medical technology company based in Michigan. In 2017, it started using the AR device to improve the process of designing operating rooms (ORs) for hospitals and surgery centres. With the improvements in HoloLens 2, Stryker engineers can now create shared ORs with the use of holograms.

Zimmer Biomet, an Indiana-based medical device company, recently unveiled its OptiVu Mixed Reality Solutions platform. The platform uses the HoloLens device and three specific applications: the first uses MR in manufacturing surgical tools; the second collects and stores data to track patient progress before and after surgery; and the third allows medical professionals to share an MR experience with patients ahead of a procedure.

Microsoft's Mesh

In March, Microsoft unveiled Mesh, a mixed reality (MR) platform powered by Azure cloud services. Mesh lets people join 3-D holographic experiences on various devices and from different geographical locations. In its blog post about Mesh, Microsoft imagined medical students learning human anatomy with a holographic model of a cadaver.

Use of other VR and AR tech in medicine

AR in healthcare spans both hardware and software, used for tasks such as surgery and for efficient diagnosis via tech such as smart glasses. The devices include head-mounted displays, handheld devices, wearables, and vision-based and mobile-device-based systems, among others.

The use of AR in surgeries allows surgeons to visualise a patient’s anatomy side by side with their MRI and CT scan data.

At the University of Miami's Miller School of Medicine, for example, instructors have used AR, VR and MR to train first responders to treat trauma patients who have had a stroke, heart attack or gunshot wound. Wearing VR headsets, students practise life-saving cardiac procedures on a life-like mannequin that realistically simulates nearly any cardiac disease.


In fact, in June, Johns Hopkins neurosurgeons performed the institution's first-ever AR surgeries on living patients, wearing headsets made by Israeli firm Augmedics. “It’s like having a GPS navigator in front of your eyes,” Timothy Witham, MD, Director of the Johns Hopkins Neurosurgery Spinal Fusion Laboratory, told CNBC.