Category Archives for "News"

February 1, 2018

Oxford team wins government award to develop VR to treat mental health patients


Following on from our recent report about augmented reality being used to help surgeons at St Mary’s Hospital perform reconstructive surgery on lower limbs, bestvr.tech can now report that virtual reality is being trialed by the NHS on mental health patients.

The trial aims to address the problems that psychiatric patients have in dealing with day-to-day situations. Patients will wear a virtual reality headset and, in sessions lasting between 20 and 30 minutes, will be guided through a series of tasks and situations that most people would find easy enough to handle, but that can pose real difficulties for those with mental health conditions.

But the patients won’t be struggling blindly through the sessions. They will have a VR coach who will guide them through the simulation.

Prof Daniel Freeman


The project was developed by clinical psychologist Professor Daniel Freeman of Oxford University’s Department of Psychiatry and the Oxford Health NHS Foundation Trust.

“Patients often find it easier to do this work in the virtual world, and enjoy using our VR applications,” Professor Freeman explains.

“But the beauty is that the benefits transfer to the real world.”

Human interaction

The visual design of the VR therapy is being led by Jonathan West of the Helen Hamlyn Centre for Design at the Royal College of Art. He explains enthusiastically:

This is such a fantastic opportunity to involve patients in the design of new and exciting VR therapies. It brings together a great team of designers, patients, psychologists, and computer scientists to work towards something with huge potential for impact.

Of the treatment technology itself, Professor Freeman explained:

Our new treatment is automated – the virtual coach leads the therapy – and it uses inexpensive VR kit, so it has the potential for widespread use in the NHS. We’re inspired by the opportunity VR provides to increase dramatically the number of people who can access the most effective psychological therapies.

Realizing this ambition will require much work, but our amazing team of patients, NHS staff, researchers, and designers has all the capabilities to achieve it. Over the next three years this major investment should lead to real and positive change in services for patients.

The project consists of a design phase, a multi-center clinical trial, and then the creation of a road map for the treatment. The design phase is to test the treatment for ease of use and to make sure that it can hold the patient’s attention and is fine-tuned to suit their needs. The multi-center clinical trial is designed to verify that the use of this virtual reality approach is beneficial to patients. The road map is to ensure a smooth rollout of the treatment across the country.

The project is being funded by a £4 million grant from the National Institute for Health Research (NIHR), announced by Lord O’Shaughnessy, Parliamentary Under-Secretary for the Department of Health and Social Care, at the MQ Mental Health Science Meeting 2018 on 1 February. In his announcement, Lord O’Shaughnessy stated:

Lord O'Shaughnessy

I’d like to offer my congratulations to the winners of this award. We know that tackling the increasingly complex health challenges we face means harnessing the potential of new technology. Through the NIHR, we spend £1bn per year bringing great British innovations into the NHS for the benefit of patients.

Martin Hunt


To win this award, the Oxford team faced stiff competition from other research teams from all over the UK in a winner-takes-all competition set up last year by the NIHR.

A similar competition has been launched this year.

In the words of Martin Hunt, director of the NIHR i4i program:

I am delighted we have been able to attract and support such an ambitious, potentially transformational project, from a world-class team.... I hope that the 2018 competition attracts a similar caliber of applications to enable us to support the translation of more ground-breaking technologies, for the benefit of people living with mental health conditions.

The Oxford project was also praised by Dr Jennifer Martin of the NIHR MindTech MedTech Co-operative:

Jen Martin

"We believe that this collaborative approach will help us to develop a VR treatment that is enjoyable and easy to use, and that will be taken up across the NHS so that as many people as possible can benefit."

Now it is up to the Oxford team to deliver the results that may transform mental health treatment in Britain.

February 1, 2018

Microsoft HoloLens to assist surgeons

Doctors using AR to help surgery

Imagine surgeons doing an operation in which they see through opaque areas of the body as if they had X-ray eyes. Now imagine that these X-ray eyes can see not only bone, but also soft tissue.

To some extent this happens already when antenatal doctors perform amniocentesis, watching the position of the needle in the womb via an ultrasound scan. But they are normally looking at a monitor, which means looking up and down - or possibly sideways - between the screen and the patient.

But what if, instead, they could keep their eyes on the patient while seeing a superimposed image of what is going on inside the patient’s body, using augmented reality glasses or visors?

This is precisely what surgeons at Imperial College London have been experimenting with. Using the Microsoft HoloLens, they have been conducting trials in which surgeons performing reconstructive surgery on lower limbs are able to see the positions of bones and blood vessels, taken from CT scans, overlaid on the area they are operating on.

Leg surgery

The team of surgeons were trying out the new technology at St Mary’s Hospital, which is attached to Imperial College. The most difficult part of reconstructive surgery on limbs is reconnecting blood vessels and sometimes bones. In many cases, the surgeons are effectively working blind, or at least with a highly restricted view. CT scans can give them the extra information they need, but the problem of looking in two directions makes the task difficult. The augmented overlay approach solves this problem neatly and effectively.

The method was tested on five patients, including a middle-aged man with leg injuries from a road accident and an 85-year-old woman with compound fractures of the ankle, including a protruding bone that had pierced the skin.

The CT scans did not take place in real-time while the surgery was being performed. Instead, the patients went through CT scans beforehand to map out the positions of bones and blood vessels, as well as connective tissue, muscle and fat. The CT scans were then segmented, with the bones, blood vessels, muscle and fat converted into polygonal models and rendered as digital images in a format compatible with the HoloLens. These images were then projected onto the HoloLens visor, so that the surgeons could see them as an overlay against the actual injured limb.
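
As a rough illustration of that conversion step, here is a minimal sketch - not the St Mary’s team’s actual pipeline. It assumes a CT volume saved as a NumPy array, a guessed voxel spacing and a rough bone threshold, and turns the result into a simple mesh file that a 3D/AR toolchain could import.

```python
# Illustrative sketch only - file names, spacing and threshold are assumptions.
import numpy as np
from skimage import measure

ct_volume = np.load("ct_scan.npy")        # assumed: 3D array of Hounsfield units
voxel_spacing = (1.0, 0.7, 0.7)           # assumed voxel size in mm along the array axes

# Extract an isosurface around a rough bone threshold (~300 HU).
verts, faces, normals, _ = measure.marching_cubes(ct_volume, level=300,
                                                  spacing=voxel_spacing)

# Write a simple Wavefront OBJ file that tools such as Unity can import
# and render as a holographic overlay.
with open("bone_surface.obj", "w") as f:
    for vx, vy, vz in verts:
        f.write(f"v {vx} {vy} {vz}\n")
    for a, b, c in faces:
        f.write(f"f {a + 1} {b + 1} {c + 1}\n")   # OBJ indices are 1-based
```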

CT scans

The results of the study were published in the journal European Radiology Experimental. In the abstract, the authors stated:

Intraoperatively, the models were registered manually to their respective subjects by the operating surgeon using a combination of tracked hand gestures and voice commands; AR was used to aid navigation and accurate dissection. Identification of the subsurface location of vascular perforators through AR overlay was compared to the positions obtained by audible Doppler ultrasound. Through a preliminary HoloLens-assisted case series, the operating surgeon was able to demonstrate precise and efficient localisation of perforating vessels.
[https://eurradiolexp.springeropen.com/articles/10.1186/s41747-017-0033-2]

Mapping the anatomy

The study acknowledges that: “One limitation is that presently a technical assistant is required initially to help with preoperative data preparation and later in the operating theatre to assist with application launch and approximate spatial model positioning before involvement of the operating surgeon.”

The study concluded that: “The experience gained hitherto suggests that the techniques developed through this work are appropriate for reconstructive surgery applied to other areas of the body.”

SOURCE: Through the HoloLens™ looking glass: augmented reality for extremity reconstruction surgery using 3D vascular models with perforating vessels,
Philip Pratt, Matthew Ives, Graham Lawton, Jonathan Simmons, Nasko Radev, Liana Spyropoulou and Dimitri Amiras,
European Radiology Experimental 2018, 2:2, https://doi.org/10.1186/s41747-017-0033-2,
© The Author(s) 2018

January 29, 2018

Airbnb planning to use virtual (and augmented) reality


Airbnb has just announced that they are going to start integrating virtual and augmented reality into their business model.

The company, having started off using the internet as an aggregator linking property owners with short-term tenants and tourists, is now adding AR and VR as an extra layer to enhance the user experience. The company has divided the use of these technologies into two areas of application: Before the trip and During the trip.

Before the trip, Airbnb explained on their website, VR could enable potential customers to view properties they are considering renting and to explore them in more detail than pictures alone allow. The idea is that hosts would be able to scan their apartments and houses, creating 360-degree images that could then be viewed either flat (on computer monitors and phone screens) or in VR head-mounted displays. They explain:

Virtual reality gives us an opportunity to reshape where inspiration is drawn from, and take travel planning to the next level. It can also allow people to connect with their destination, host, and co-travelers. Capabilities like 360 photos and 3D scans allow a person to step inside a home or city and understand what to expect and how to orient themselves before they leave the comfort of their own home.

But the company doesn’t stop there. They are also considering ways in which augmented reality could be used to help travellers make use of the facilities in the house or apartment. For example, guests wearing AR glasses could see instruction notes overlaid on specific areas of the property, or even instructions for specific appliances.

...it can also be stressful when someone doesn’t know how to unlock the door or turn on the hot water for a shower, or when they’re hopelessly lost and everything is in a foreign language.

Augmented reality and related technologies let us recognize surroundings and provide contextual, timely information to navigate these pain points. Just think how welcome pulling up directions to the coffee mugs on a mobile device will be first thing in the morning. Or, instant translations on how to work that German thermostat.

However, Airbnb is going further in its ideas of how to enrich the customer experience:

Augmented reality can also breathe life into a space and tell the story behind the personal items to connect a traveler with their host and a new city in richer, more immersive ways.

This last idea might seem less practical and more a way of stretching the concept. But it shows that Airbnb are really using their creative imagination to extract maximum value from these emerging technologies.

January 29, 2018

Is Virtual Reality safe for children?

Kid with a VR headset

The question’s been around for some time, but parents tend to be all too blasé about it. At least that’s the impression you get when you see how little online debate there is about it.


There’s massive hype around the technology itself. But the effect of VR on children doesn’t even seem to register as an issue. We’re always hearing about some new breakthrough in the technology - much of it pure vaporware - but when was the last time you heard any serious adult discussion of the child safety aspect of VR?

The Oculus Rift manual actually warns users of all ages that “prolonged use should be avoided, as this could negatively impact hand-eye coordination, balance and multi-tasking ability.” More specifically, regarding children, the Oculus Rift (now owned by Facebook) has a 13+ age rating - which is consistent with Facebook’s 13+ membership requirement.

Who says VR games are a solitary pastime?

Sony’s PlayStation VR is rated 12+, while Samsung’s Gear VR matches the Oculus 13+ rating. The HTC Vive doesn’t specify an age limit, which might explain why the Vive was singled out for some harsh comments in an article by Sandee LaMotte on the CNN website, which we covered on 15 December last year. In fairness, HTC gives a general warning against allowing young children to use the headset.

Who's going to police it?

But who’s going to enforce these restrictions? The hardware companies can’t. Nor would they even want to. Children may not have autonomous spending power (the X factor of the business world), but they have nag power (the Y factor - as in “why won’t you buy it for me?”). That makes them great customers!

The legislators? Forget it. Money buys votes. And politicians these days are always thinking about life after politics. That usually means lucrative consultancies with the private sector. But you can’t piss people off and then expect them to offer you a job. So politicians are in no hurry to pass the kind of laws that’ll alienate big business!

Of course, there are laws about selling certain video games to children - at least in some jurisdictions. Even politicians can’t stand up to the might of outraged, self-styled “moral crusaders!” But these rules and regulations don’t apply to the hardware. Ultimately it is up to parents to set their own standards and police them. What the makers offer is no more than advice.

But is the advice backed up by solid research? Or did they just lick their index finger and hold it up to the wind? And what is the motive behind the advice? To help parents make an informed decision? Or to cover their corporate asses against legal action? Let’s face it, the west is becoming increasingly litigious (following guess-who’s lead). And while rich, powerful corporations can defend themselves against lawsuits and large payouts, it helps to create a built-in defense, to nip any cause of action in the bud!

You're never too young to be a superhero

What the academics say...

According to Martin Banks, Professor of Optometry, Vision Science, Psychology, and Neuroscience at the University of California, Berkeley: “So far I’ve seen no so-called smoking gun, no concrete evidence that a child of a certain age was somehow adversely affected by wearing a VR headset. My guess is that all they’re doing is saying that kids are developing and development slows down when they reach adolescence, and so let’s just play it safe and say that while these kids are undergoing significant development, we’ll advise people not to let them use it.”

That’s what Professor Banks says, wearing his psychology hat. However, wearing his optometry hat, he sings a different tune: “There is pretty good evidence, particularly among children, that if you do so-called near work, where you’re looking at something up close, like reading a book up very close or looking at a cellphone, that it causes the eye to lengthen and that causes the eye to become near-sighted.”

Child eye test

Indeed a study published in JAMA Ophthalmology (Increased Prevalence of Myopia in the United States Between 1971-1972 and 1999-2004, Susan Vitale, PhD, MHS; Robert D. Sperduto, MD; Frederick L. Ferris III, MD) found a statistically significant increase in myopia (near-sightedness), stating in their conclusion:

When using similar methods for each period, the prevalence of myopia in the United States appears to be substantially higher in 1999-2004 than 30 years earlier. Identifying modifiable risk factors for myopia could lead to the development of cost-effective interventional strategies.

They went on to state:

In the earliest report from a nationally representative sample of the US population, the prevalence of myopia was estimated to be 25% in persons aged 12 to 54 years. Recently, several studies have documented an increased prevalence of myopia in younger birth cohorts, suggesting that environmental risk factors for myopia may have become more prevalent. In particular, studies in Asian populations have reported epidemics of myopia in younger generations, possibly attributed to the near-work demands imposed by more intensive education.

In other words, activities like working with computers, as distinct from, say, looking at a blackboard or whiteboard as in the old days of education. And with a VR headset, one is looking at something even nearer - which might suggest that near-eye headsets are even more of a problem. However, it isn’t quite as simple as that, as Professor Banks explains:

Let’s contrast a kid using a VR headset compared to a kid using a smartphone. When they use the smartphone they typically hold it very close to them and so they have to focus their eye close. You might think that with the VR headset they’d have to do the same thing because the image is close to the eye, but they have optics in the setup that make the stimulus effectively far away. So, in terms of where the eye has to focus, you have to actually focus fairly far away to sharpen the image in the headset.
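
To put rough numbers on what Professor Banks is describing, here is a quick back-of-the-envelope calculation. The focal length and screen distance below are illustrative assumptions, not measurements from any particular headset; the point is simply that placing the display just inside the lens’s focal length pushes the virtual image metres away, so the eye focuses far even though the screen is centimetres from the face.

```python
# Rough illustration only - assumed numbers, not specs from any real headset.
focal_length_mm = 40.0   # assumed focal length of the headset lens
screen_dist_mm = 39.0    # assumed screen-to-lens distance, just inside the focal length

# Thin-lens equation: 1/f = 1/d_object + 1/d_image  =>  1/d_image = 1/f - 1/d_object
inv_image = 1.0 / focal_length_mm - 1.0 / screen_dist_mm
virtual_image_m = abs(1.0 / inv_image) / 1000.0   # a negative result means a virtual image

print(f"The screen sits {screen_dist_mm / 10:.1f} cm from the lens, "
      f"but the virtual image appears about {virtual_image_m:.1f} m away")
```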

Two kids with VR headsets

Peter Howarth, a senior lecturer in optometry, doesn’t believe that VR adversely affects a child’s eyesight. After all, the principle of VR is to present each eye with a slightly different view in order to produce stereopsis (3D vision and the perception of depth). Howarth even argues that the makers of near-eye stereoscopic headsets could provide software to test for vision problems.

But what about other problems, like dizziness and motion sickness?

Problems arise when what you see with your eyes doesn’t match what you feel with your body. If the visual image tells you that you’re moving forward fast, but your body tells you that you’re stationary, it has a disconcerting effect. If your eyes tell you that you are spinning, but your semi-circular canals tell you that you aren’t, you might feel dizzy. But this applies to adults as much as to children, if not more so. In fact, children are very often more resilient.

Action!

Then of course, there’s the danger of playing VR games and moving around a room in which there are solid objects. And the risk is even greater if the headset is tethered to a computer. Of course, this risk also applies to adults. And the major players (Vive and Rift) offer built-in warning systems to map out the area and warn the player if they’re in danger of stepping out of the safety zone.  But what if a child innocently strays into the safety zone while big brother is battling with zombies? Tommy Toddler may not be aware of the danger, while Terry Teenager is too wrapped up in a world of his own to notice, and remains totally oblivious to Tommy’s presence until impact!

You get the picture.

Brain development

But what about the long-term effect of VR on the development of the child’s brain? This is the Great Zone of Ignorance. We have centuries of experience regarding the impact of the printed word and stage drama. We have decades of information about the effects of cinema, radio and television. Heck, we’ve had 30-40 years to learn about the impact of personal computers!

But VR is different. And even mobile phones and social media have been identified as weapons of mass distraction! There is already evidence that instant on-the-go access to information and remote contact with friends is affecting the developing brains of teenagers in terms of their expectations. And it also affects their mood when they find themselves deprived of those expectations.

VR is even more of a game changer. And children are at a stage when they are even more susceptible to the environmental factors that will shape them for life.

But it’s a hard area to study for two reasons. Firstly, VR is newer and has yet to achieve anything like the market penetration of the smartphone. (It is questionable whether it ever will.) Secondly, it would take a formal study many decades to accumulate and evaluate the information. And how would one control for other factors? Most academic studies require a control group that is not subject to the stimuli or causal factors being tested. But where is such a control group to be found? The Third World? The Amish? Children with strict parents?

Clearly then, the only thing we know about the effects of VR on children is that we know very little. So maybe it is better to err on the side of caution and keep the VR headsets away from children. Let them explore the real world first - something that millennia of evolution have primed them for - and then, when they understand reality, let them play with its alternatives inside a little electronic box!

January 26, 2018

VR + sex dolls + teledildonics = the Final Frontier

Sex doll VR

At first we at bestvr.tech thought that April the 1st had come early. But apparently an adult webcam service - CamSoda - is pairing up sex robots with virtual reality headsets to give users the ultimate telesex experience.

The idea, in essence, is that the headset shows the user their remote partner while the sex robot provides the… er… haptic feedback.

They are working in partnership with a company called RealDoll, using a technology that has been humorously named “teledildonics.” In effect, the users are engaging in mutual masturbation at a distance - hence the “tele” part of the name!

This teledildonic “service” has in fact been available for some time. That is, CamSoda has been offering a service whereby “performers” on the CamSoda site use WiFi-linked vibrators (a type called the LoveSense Nora) to capture their - how shall I put it - erotic movements. Whatever the Nora “experiences” is transmitted to its electronic partner - a male masturbator device, also made by LoveSense, called the LoveSense Max.

But CamSoda’s latest innovation is that the Max can be placed into sex dolls and the users can wear VR headsets while engaged in the... er... action. Thus the users can both see their partner and feel their body - or at least a soft plastic surrogate - up close and personal.

However, the sex dolls don’t come cheap - at $1,500 (£1,000) a pop, so to speak. And of course one can get some of the experience from the masturbator without the sex doll. On the other hand, if one wants to mimic the full experience, the doll kind of fleshes out the picture.

As Daryn Parker, vice president of CamSoda, said:

People have long speculated as to how the adult industry would seamlessly harness its cutting-edge technology to deliver the ultimate sensory experience, one that mimics real-life interaction and, of course, intercourse… Our partnership with RealDoll to allow our fans to VIRP is an absolute game changer.

Fans will now be able to interact with their favourite cam models in real time via live virtual reality while simultaneously feeling the sensations of actual intercourse via their RealDoll and teledildonic integration.

Essentially, users will be able to live out their ultimate sex fantasies, and quench their immediate desires, in an immersive sensory environment that allows them to have real sex with virtual partners.

Sometimes it's the inflatable doll that's the sex maniac!

He added that “we know there is an audience because we hear it from our users and models. They are seeking ways to get closer and have more physical interaction.”

But perhaps most interestingly he said, “we’ve had a number of employees, beta users, and models try out the experience. All of them were blown away by the interactive capabilities.”

Nothing like enthusiastic employees to make a company successful.

But does the product fulfill a market need?

“Fifteen years ago people thought cell phones were weird and unnecessary. Look at them today. While there may be some initial hesitation, I anticipate people acquiescing and seeing this for what it is - an awesome product that fulfills people’s deepest desires.”

January 25, 2018

Will 2018 be the Magic Leap year?

Florida-based Magic Leap is a very mysterious company, by any standard. Founded in 2010 by Rony Abovitz, it has raised between $1.4 billion and $2 billion (depending on who you believe) in several rounds of financing. And all this without releasing a product. But in 2017 they did finally announce the forthcoming release of a developer model and SDK, along with documentation and “learning resources”, in 2018. Could this be one more to add to our best VR list?

What they have developed is a display that projects light into the user’s eyes. This display is something between a full Head Mounted Display, of the kind that one sees with Virtual Reality, and a pair of glasses with attachments of the kind one sees on Augmented Reality hardware.

A glimpse into the future?

The company has raised $1.9 billion in several funding rounds based on its R&D, the track record of its personnel and whatever technology it has demonstrated in private to its investors. And while we’re on the subject of investors, they include Google parent Alphabet, Alibaba and Qualcomm. And although not yet out there in the market with a product, they have been busy on several other fronts.

For example, on February 11, 2016, they joined the Entertainment Software Association and a week later they acquired the 3D division of Dacuda, a Swiss computer vision company. Then, in April of that year, they acquired Israeli cybersecurity company Northbit. Two months later, they announced a partnership with the R&D unit of Lucasfilm (a Disney subsidiary).

A numbers game? No, the Magic Leap patent application

Although the company is highly secretive, some of its known activities suggest that it is also a highly enterprising venture. For example, as far back as December 2014, it had appointed science fiction author Neal Stephenson as “Chief Futurist”. (How many companies have one of those?)

The company’s history is also quite unusual, if their Wikipedia entry is anything to go by:

According to past versions of its website, the startup evolved from a company named "Magic Leap Studios" which around 2010 was working on a graphic novel and a feature film series, and in 2011 became a corporation, releasing an augmented reality app at Comic-Con that year.

A beautiful view

However, by late 2014, their publicly available patent and trademark applications suggested that they were aiming to create not content but actual hardware - specifically augmented reality glasses. Moreover, the design of the product they have now unveiled suggests that they are aiming for a device that can superimpose a virtual image over a real-world view (i.e. augmented reality) whilst being able to block out the outside world when desired (i.e. virtual reality).

Magic Leap remains highly secretive about the technology, but analysts who have examined their patents have concluded that they use stacked silicon waveguides to project an image directly onto the retinas of the user’s eyes.

Where monsters come to life

Early videos, showing not the hardware itself but footage captured through the device, suggested that it required further development. Overlaid “reflections” were not always where they were supposed to be, and overlaid objects did not appear to be fully opaque and were therefore incapable of blocking out light from the real-world objects they were in front of. This would prevent the Magic Leap from being fully immersive, or even as versatile as augmented reality glasses ought to be.

But that was two years ago, and a lot of R&D has gone into this hardware since then.

Leap with your eyes not your feet!

Unfortunately, Magic Leap has still not given out any information on the price or release date. We know that it will need to be connected not to a computer, but to a dedicated device called a Lightpack. But we know very little else. The company says that the hardware will have sensors, but just what type and what they will “sense” remains a mystery. Visual sensors? Real-space location? Motion?

Magic Leap has hinted that the device will actually be able to “remember” an environment and recreate it later, or at least know how the environment is laid out. They also claim that the full caboodle will respond to voice and gestures and be able to track head and even eye positions. They also say it will have a handheld remote - although why it would need one if it can track gestures is not clear.

Robo person?

Evidently, then, this is a company that prefers to “get it right” behind closed doors rather than release a kludgy, unfinished product. They have spent a lot of time getting it right and managed to raise a lot of money from companies that understand technology. If I were placing a bet on the breakthrough consumer technology company of 2018, Magic Leap would be a good candidate for my money.

January 24, 2018

YAW VR puts you in the driver’s seat!

Yaw VR set up

The next step after virtual reality is where you don’t just see it and hear it, but also feel it. We’ve reported in the past about haptic feedback. But what about the feeling of acceleration?

The technology has been around in some theme parks - tilting seats or entire tilting units. One minute you feel yourself falling forward (strapped in, of course), then you’re pushed into the back of the seat as the spacecraft accelerates.

Yaw VR

Well now you can experience the same thing in your home with the Yaw VR motion simulator. Developed by Hungarian startup Intellisense, it has raised half of its $150,000 target on Kickstarter where it is billed as “The world's most compact and affordable virtual reality motion simulator.”

The unit consists of a padded seat inside a hemisphere - a sort of inverted dome - that you sit in while wearing a VR headset. The machine then takes you on the ride of your life. It can turn a full 360 degrees and tilt 50 degrees in either direction, so it can be used in conjunction with driving, aircraft and spacecraft simulations. And of course, it is personal.

It is designed for use with the Oculus Rift, Gear VR, PlayStation VR and PC-based games, so much of the software is already out there.

Powered by several small motors that operate almost silently, Yaw VR can support up to 330 pounds of weight. Even using only 40% of its full power, it can move 120 degrees per second.

But what about the price, I hear you ask?

Early backers can pledge $890 on their Kickstarter campaign to be first in line to receive their units in August. The normal price will be $1,190. While this isn’t cheap, it will come packaged with an adjustable pedal, steering wheel and joystick holders. And you can attach your own steering wheels and joysticks too!

Yaw VR and attachments

The price is no more than you would pay for one of those fancy massage chairs. But this chair doesn’t relax you - quite the opposite! Check out the video below where it was showcased at CES.

The good thing about the Yaw VR is that when you aren’t using it, you can store all the parts in the dome. The whole thing takes up little room and weighs only 33 pounds.

See how light it is

Yaw VR currently works with 80 different simulator apps and will be compatible with SimTools.

All pre-order customers will get four free apps: Flight Simulator, Racing Simulator, Space Battle and Roller Coaster. And when the product hits the market in August, there will probably be a whole lot of new games developed to take advantage of its features. And users can even develop their own applications for it.

But the best is yet to come! The developers are thinking ahead. They are also planning on releasing the Yaw VR Pro, also in August. This Big Brother to the Yaw VR is “designed for extreme intensive use.” Pledges of $1,340 will get you the Pro edition and also let you fit your motion simulator with a custom color pattern.

This is not vaporware. They have a product and a release date just seven months away!

January 23, 2018

New AI tool can bring John Travolta and Uma Thurman into your living room

John and Uma dancing

A new Artificial Intelligence and Augmented Reality application is being developed that makes it possible to take 2D images and not only convert them into 3D, but also to place them into a new (real) environment.

Called Volume, it is the brainchild of Israeli artists and technologists Or Fleisher and Shirin Anlen. It is very much in the early stages of development, but the creators have posted a video on YouTube showcasing the application’s potential by taking the dance contest scene from Quentin Tarantino’s Pulp Fiction and transporting John Travolta and Uma Thurman into a living room.

Fleisher explains it thus:

Our experiment with Pulp Fiction allows users to step inside one of the film’s scenes in Augmented Reality, using Apple’s ARKit framework on an iPad. This experiment is one of a few we are conducting at the moment, which illustrate the power of being able to reconstruct 3D scenes from 2D images. The possibilities of being able to reconstruct archival and static footage into 3D environments are one of the main motivations behind the development of the tool used to create these experiments, called Volume.

The system is based on a deep-learning tool called a convolutional neural network. The network analyzes a two-dimensional image, looking at contrast and at changes in the colors of pixels and groups of pixels. It then tries to re-imagine the image in 3D and place it into the designated new environment. The extracted 2D image thus becomes a 3D AR image in a new environment or context.
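
To make that idea a little more concrete, here is a simplified sketch of the general recipe: estimate a depth value for every pixel, then back-project the pixels into 3D points that can be dropped into an AR scene. The depth map below is a random stand-in for a network’s output and the camera numbers are assumptions; Volume’s actual model and code are not public.

```python
# Illustrative sketch only - the depth map and camera intrinsics are assumptions.
import numpy as np

def backproject(depth, fx, fy, cx, cy):
    """Turn a depth map (in metres) into an Nx3 point cloud using a pinhole camera model."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)

# Stand-in for the output of a monocular-depth CNN on a 640x480 frame.
depth_map = np.random.uniform(1.0, 4.0, size=(480, 640))
points = backproject(depth_map, fx=525.0, fy=525.0, cx=320.0, cy=240.0)
print(points.shape)   # (307200, 3) - one 3D point per pixel
```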

Or Fleisher

The agenda here is not to create another high-end tool for the video industry, but rather to create a web-based tool that will enable ordinary users to capture video images with their smartphone, tablet or camera and then upload them for automatic conversion.

Shirin Anlen

The makers are the first to admit that the project is in its early stages. But they see great potential for the technology. It’s not just about having actors dancing in your living room. It could be Winston Churchill making a speech in war-torn England, Muhammad Ali slugging it out with George Foreman, or your favorite band making a comeback just for you and your friends.

December 28, 2017

What is coming in 2018


VR predictions for 2018 - we reserve the right to be wrong

2017 was supposed to be the breakthrough year for VR and AR, according to our predictions - and in a way, it was, for VR at least.

A lot of headsets were sold: Vive, Rift, PSVR, Samsung Gear VR. The Oculus Go was at least announced, as was the Microsoft Mixed Reality headset. Others, like the 200-degree FoV Pimax (3840x2160) and the 210-degree Star VR (5120x1440), were also announced, although they were not actually demonstrated and could yet be vaporware.

In the AR world, things were a bit different. So far all we have seen is the ability to overlay a phone’s camera view with a supplementary virtual image, and a few high-priced headsets intended for developers, with no indication of when the price will come down to a level that will actually attract consumers.


More games and apps became available, and other uses were pioneered, like education, consumer visualization and - our pet project of the future - office applications.

Progress was also made in letting the “astronauts” walk untethered, with wireless relays closing the gap with wired connections, and inside-out tracking (relying on onboard gyroscopes and accelerometers) closing the gap with external tracking that relies upon lighthouses and cameras. Some of the diehards moaned about poor latency and dropped frames. But the problem of the pigtail, and the prospect of strangulation in one’s own living room, made it inevitable that cordless would elbow its way into the market.

In due course the latency and dropped frames problem will be solved. Some hardcore gamers will hang on to their ponytail headsets until that happens. Others will opt for the cheaper cordless models now. I had an argument about this a few months back, with a hardcore gamer insisting - with that characteristically adolescent sense of entitlement - that low latency and smooth motion were “basic requirements” for Virtual Reality. I pointed out to him that this was like a rich man saying: “one simply must travel by Rolls Royce or not travel at all.” Needless to say, the rich boy with his toys did not like that one bit!

The Royole Moon personal theatre

I have also pointed out that the aesthetics of VR headsets leave a lot to be desired. At the moment they are so kludgy that Apple will not touch them with a ten-foot pole. Until they can achieve the elegance of the Royole Moon personal theatre, I can’t see Apple changing its attitude towards them.

But where do we go from here?

According to a survey by the International Data Corporation, spending on AR and VR will almost double next year - from $9.1 billion in 2017 to $17.8 billion in 2018. And in the medium term, IDC projects that this growth rate will hold until at least 2021. But what is particularly interesting is that IDC sees the biggest share of the market being held not by the games sector, nor by hardware or retail showcasing, but rather by what they call “others” - a somewhat vague and amorphous concept, covering pretty much everything that we don’t know about the VR and AR markets.

One thing they are clear about is that the biggest growth area will be the public sector - infrastructure maintenance and government training.

On the subject of VR-based training, IDC estimates that market revenue in the sector will reach $2.2 billion by 2023. However, this is predicated on a fast rollout of 5G telecoms standards. These standards have not yet even been finalized, but IDC appears to believe - probably wrongly - that 5G will begin commercial deployment in 2018! The faster speeds that 5G promises will no doubt play a part in bringing VR to a wider audience, as it will then be possible to transmit and narrowcast VR to targeted users. But the standards won’t be finalized until 2018 - and rollout won’t begin until 2022. So the IDC prediction on VR training might itself be out by two years.

While I am reluctant to make more predictions after some of our prophecies for 2017 fell short, I will still stick my neck out and say that, with HTC releasing the Vive Focus, the Oculus Go arriving and others poised to enter the market, 2018 will be the year of the standalone VR headset.

And as for Augmented Reality, to quote Dostoyevsky: that is the subject of another story...

December 19, 2017

Wrist-worn gesture detector


BioInteractive Technologies (Vancouver) is shaking up the gesture-based controller sector with TENZR™, a gesture detector worn on the wrist. Unlike most other detectors, it does not require a camera, lighthouse, sonic triangulation or any other external tracking system to function.

The wristband controller works out of the box with no training and recognizes six hand gestures (left, right, up, down, open and closed). It connects to a VR headset or other device via Bluetooth, which enables it to be used as a simple controller.

TENZR hand gestures

However, it can be trained to recognize more fine-grained finger gestures, enabling it to be used to play games such as darts, with the fingers simply “holding” an imaginary dart and the hand and wrist throwing it.
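
Out of curiosity, here is a purely hypothetical sketch of how “training” custom gestures from wrist-worn sensor data typically works: slice the signal into windows, compute simple statistics per window, and fit a classifier. BIT’s actual sensing technology and software are proprietary, so every name, channel count and number below is an assumption.

```python
# Hypothetical sketch only - TENZR's real pipeline and data format are not public.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(signal, window=50):
    """Split a (samples, channels) signal into windows of per-channel mean/std/range."""
    feats = []
    for start in range(0, len(signal) - window + 1, window):
        w = signal[start:start + window]
        feats.append(np.concatenate([w.mean(0), w.std(0), w.max(0) - w.min(0)]))
    return np.array(feats)

# Stand-in recordings: 6 assumed sensor channels for two example gestures.
throw = np.random.randn(500, 6) + 1.0        # "dart throw"
rest = np.random.randn(500, 6)               # hand at rest

X = np.vstack([window_features(throw), window_features(rest)])
y = np.array([1] * 10 + [0] * 10)            # 1 = throw, 0 = rest

clf = RandomForestClassifier(n_estimators=50).fit(X, y)
print(clf.predict(window_features(np.random.randn(100, 6) + 1.0)))
```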

The company has been developing the product for the last three years. Their aim was to usher in the next generation of controllers, making them less bulky and more natural-feeling. What they have produced is a small, comfortable device - worn on the wrist - that can be used for even more discriminating applications such as Rock, Paper, Scissors.

The current unit is a developer model to be presented at CES 2018 at Eureka Park, Sands, Hall G - 50915, and at the Cypress booth (MP25365). It is Unity3D compatible, making it an ideal plug-and-play solution for various VR platforms.

TENZR’s™ features include:

  • Built-in capabilities to recognize gestures using BIT’s custom sensing technology
  • Infinite customization: Supports the detection of custom gestures for different applications
  • Built-in heading estimation using an inertial sensor
  • Integrates with connected devices over Bluetooth Low Energy
  • Weighs 45 grams.

Could this nifty little peripheral work alongside the Logitech-HTC Bridge to create the Virtual Office that we at bestvr.tech have long been lobbying for?
