Tang has said that despite the downsizing, the company remains committed to developing an AR headset. Avegant's AR division is in direct competition with Microsoft's HoloLens and Magic Leap. In some respects, the parallels to Magic Leap are stronger, because Magic Leap claims to have developed light field technology. But whilst Avegant has already demonstrated its technology in the Glyph – aiming light directly into the eye – Magic Leap continues to play its cards close to its chest.
Avegant demoed the technology to The Verge over a year ago and it was deemed superior to the HoloLens, despite the fact that HoloLens was a standalone wireless device, whereas the Avegant required the processing power of a high-spec PC. The main strengths of the Avegant headset were its superior field of view and the sharpness and clarity of its image display.
The demo – in a conference room at Avegant's corporate offices – included views of the solar system and the ocean floor. In the solar system view, one could see Jupiter's red spot and a satellite orbiting earth. What was so impressive about this demo was that objects of different focal lengths could be shown in a fixed environment. The Verge writer gave the example of squinting until the sun went out of focus and then seeing the virtual Saturn in sharp focus, including its rings.
There was also a sea view that showed a sea turtle and small, aquatic creatures in sharp definition. The reporter was able to stick his hand into the images, but because the prototype didn’t yet have an advanced tracking system, it was not possible to interact with it. In the HoloLens demo, in contrast, one can tap on one’s coffee table to trigger a display of virtual molten lava.
The demo in fact used the public asset library of the Unity game engine for the images and off-the-shelf cameras for the tracking to identify real objects in the room. Avegant’s long-term strategy calls for inside-out tracking to avoid the need for external cameras or trackers.
Avegant is also somewhat secretive about its technology. While refusing to reveal precisely how the company implements light field technology, founder and new CEO Tang hinted that the HoloLens is based on conventional 3D stereoscopy. Microsoft's own secrecy policy has prevented them from revealing details of their "light engine".
Similarly, Rony Abovitz, the CEO of Magic Leap, has criticized Microsoft's image creation technology and claimed that Magic Leap's is superior. However, given the absence of evidence that anyone outside Microsoft knows what technology they are using – and given Magic Leap's own secrecy – there is no way of knowing who has the best technology. All we know for sure is that Avegant has had practical market experience of light field technology, that Magic Leap has had $1.3 billion in funding and backing from Google parent Alphabet, and that Microsoft has huge size and a range of experience.
It might be the financial disadvantage that has forced Avegant to scale back its efforts in AR. It is worth noting that despite Microsoft’s size – or maybe because of it – they released what was essentially a far from finished product and then charged $3000 for it. In contrast, Avegant’s personal cinema retailed for $799 – the price point for many high-end VR and AR products.
The sudden downsizing and replacement of the CEO suggests that Avegant is having trouble matching its larger competitors when it comes to funding. Ed Tang himself has said that the company is in the process of closing a $10 million funding round that would bring their total capital raised to $60 million since the company’s inception in 2012.
However, there are several other omens that bode ill for the company. It has noticeably lowered its public profile on Facebook and Twitter in the last few months, despite a vigorous presence before that. And the last time it posted a new video on its YouTube channel was half a year ago.
This could suggest one of two things: either it is going through some kind of malaise that threatens its very existence, or it is legally obliged to go quiet because it is planning to announce an initial public offering and doesn't want to be seen as hyping a product that is still in development and nowhere near market ready. The downsizing suggests the former. However, it could also be a strategy to ensure that the company is optimized for efficiency when it launches on the stock market.
The play they selected for their VR Shakespeare experiment was Titus Andronicus, a play co-written by William Shakespeare and George Peele. Reputedly Shakespeare's most bloodthirsty work, Titus tells the story of violent feuding in ancient Rome and the kingdom of the Goths.
Becky Loftus, who heads the RSC's "Audience Insight" section, explained that the audience was initially divided into groups, seeing the play either in the cinema or via a VR headset, and that their heart rates were monitored by a wrist-worn device. Interviews were also conducted with the subjects afterwards. The choice of Titus Andronicus might have been because of its high levels of violence and gore, commonly attributed to Peele rather than Shakespeare.
The live audience saw the play at Stratford-upon-Avon, performed by the Royal Shakespeare Company. Assisted by Ipsos MORI and their formidable experience in statistical research, the two groups were matched demographically, taking into account such factors as age, gender and previous theater experience.
The research was subsequently expanded to include a further group who watched the play through a VR headset. For the VR Shakespeare aspect of the experiment, the performance was filmed in 360-degree VR by Gorilla In The Room and shown to the VR group of subjects via the HTC Vive headset.
The results showed that watching Titus Andronicus was the equivalent of a 5-minute cardio workout, across all viewing platforms: live theater, cinema and Virtual Reality headset. This took the form of elevated heart-rates at various times during the play’s performance. The effect was more pronounced in men than in women. However, at the start of the performance, the heart-rates were higher in the live theater than in either of the other groups. Although the reason has not been established, the researchers theorized that this was due to elevated levels of anticipation.
According to Pippa Bailey of Ipsos MORI, 91% of the VR Shakespeare group experienced "moments" when they felt as if they were in the theater. In contrast, only 64% of the cinema group had similar experiences. According to cognitive scientist Dr Alistair Goode of Gorilla in the Room, "It showed the potential virtual reality has for use within research – its uncanny ability to replicate real experiences, and respondents' tolerance for being in VR, opens up an entirely new world for us as researchers."
Becky Loftus added: “This unique study has allowed us to understand the parallels and differences that theater, cinema and a 360 filmed VR can bring. Specifically, this research will allow us to understand the potential that VR can bring to truly replicate reality and understand how people respond, what they attend to and how they react. The potential applications in the research industry to better understand responses to different experiences, environments and stimuli are significant.”
In the post-performance interviews with the subjects, the theater audience expressed the greatest sense of interaction with the actors. But those watching via VR indicated a higher level of engagement than those who viewed the performance in the cinema. On the other hand, the cinema audience found the performance more "moving". The researchers attribute this to the use of close-ups, directing the audience to specific details, such as facial expressions showing the emotional pain of some of the characters. In contrast, the theater and VR Shakespeare audiences could move their heads freely to see such details as they wanted, with no close-ups or external direction.
There were also some other technical differences in the way the different groups saw the performances. For example, the VR Shakespeare group saw the play in five acts - which is how Shakespeare actually wrote it - rather than the normal two-act structure of modern theater. However, individual subjects could choose whether to watch the play with breaks or not. The 360° method of filming also meant that viewers could turn their heads around and see other audience members.
Sarah Ellis, Director of Digital Development at the RSC, observed that: "The results have shown us that even after 400 years, Shakespeare's work packs an emotional punch to today's audiences wherever and however it is experienced."
First of all, the HTC Vive Pro Link Box has been redesigned to avoid having three connections. The old link box (see below) had a USB Type-A port, an HDMI port and an AC power port on the headset side. On the computer side, it had a USB Type-B port, a power port for an AC adapter, an HDMI port and a Mini DisplayPort as an alternative to HDMI.
The new Link Box, revealed by Cloudgate co-founder Steve Bowler, has a single connection on the headset side, providing power, data and video. On the computer side it has a power connection, a USB data connection and a DisplayPort for the video. This time, there is no HDMI alternative.
So what do we know about it? In design terms, it has a hinged headband to make it easy to put on or take off. This is not continuous or smooth: it snaps into two fixed positions, angled up 90 degrees or down in the normal worn position. And it can be worn over glasses – always a good thing. But what else? When you want to tighten or loosen it to fit right, you just turn a dial at the back. HTC, it would seem, have learned from others.
The result is a headset that is comfortable to wear and seems to have really good weight distribution. I say "seems to have" because you can only tell for sure if you wear it for a long time. I can say that once you put it on and tighten it, you can count on it to stay in place, even as you play games that involve vigorous movement.
The redesigned nose pad also does a better job of blocking out ambient light.
Resolution is 2880 x 1600 (AMOLED) – up from 2160 x 1200 on the original HTC Vive. This is a 78% increase in overall pixels. And at 615 PPI, it is also a 37% increase in pixels per inch. This means, in practice, that the Vive Pro (or Vive 2.0) matches the resolution of the Samsung Odyssey and beats the other Windows Mixed Reality headsets currently on the market. This is not enough to completely eliminate the screen door effect. Also, it still has the same old Fresnel lenses. This means that those occasional, annoying circular bands of light still appear every now and then.
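The percentage figures quoted above follow directly from the panel specs. A quick sanity check in Python (the original Vive's PPI is inferred here from the quoted 37% increase, not an official figure):

```python
# Sanity check on the Vive Pro resolution figures quoted above.
old_w, old_h = 2160, 1200   # original HTC Vive panel resolution
new_w, new_h = 2880, 1600   # Vive Pro panel resolution

pixel_increase = (new_w * new_h) / (old_w * old_h) - 1
print(f"Overall pixel increase: {pixel_increase:.0%}")  # ~78%

# Back out the original PPI implied by "615 PPI, a 37% increase"
new_ppi = 615
implied_old_ppi = new_ppi / 1.37
print(f"Implied original PPI: {implied_old_ppi:.0f}")   # ~449
```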
The resolution is also not enough to make text completely sharp. But it is a lot sharper than before, so this is a marked improvement on the first generation Vive. The visual quality of the graphics as a whole was also a big improvement. But remember that you'll need some pretty high-spec processing power at the computer end to exploit this resolution at 90 Hz without dropped frames or latency problems.
The headset also comes with its own headphones attached. Made of plastic, they have a volume control on the left earpiece. The control takes the form of two buttons. This is quite common for audio headsets in general, but not so common for Virtual Reality headsets. It is probably more useful in VR than in passive audio listening. After all, who wants to break off in the middle of a game? That said, it is quite difficult to hit the right button when wielding a wand, but you get the hang of it eventually.
The sound quality itself was good, with no noticeable distortion. HTC have said that the market version will have noise-cancelling properties in the headphones.
The headset will also have a pair of forward facing cameras. This is presumably for a VR Chaperone system, to prevent you from bumping into walls or other obstacles when you play highly active games.
As things stand, then, this headset matches the Samsung Odyssey on spec but it remains to be seen what the price point will be. But at least we’ll only have to wait two weeks and a bit to find out.
In an earlier newsfeed, we covered VR making an appearance in the classroom in America. Now an Irish startup is due to produce a similar disruption in the British educational environment. VR Education Holdings has just started trading on ESM – the Enterprise Securities Market of the Irish Stock Exchange – as well as the Alternative Investment Market (AIM) in London.
The Waterford based company, which specializes in delivering digital education in a virtual environment, is the first Irish technology company to launch on ESM since the market was created in 2005.
The formalities of the admission protocols required that they raise £6 million (€6.7million). In the event the initial public offering of 60 million 10p shares was oversubscribed more than three times, giving the company an implied market value of £19.3 million (€21.6 million).
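The arithmetic behind those figures is straightforward to check. The implied total share count below is our own inference from the numbers quoted, not a figure stated in the listing documents:

```python
# Check the IPO figures: 60 million shares at 10p each.
shares_sold = 60_000_000
price_gbp = 0.10                       # 10p per share

raised = shares_sold * price_gbp
print(f"Raised: £{raised:,.0f}")       # £6,000,000 - the admission requirement

# The implied market value of £19.3m at 10p/share suggests the
# total number of shares in issue (an inference, not a stated figure):
market_value = 19_300_000
total_shares = market_value / price_gbp
print(f"Implied shares in issue: {total_shares:,.0f}")
```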
The deal was brokered by Shard Capital Partners LLP and the Davy Group, with the Davy Group also acting as ESM advisor and Cairn Financial Advisers LLP as Nominated advisor. Shares in the VR Education Holdings are currently trading at 11.75p - 12% higher than their launch price.
Deirdre Somers, the CEO of the Irish Stock Exchange, said that it was "fantastic to see VR Education Holdings, at a relatively early stage of their development, utilise the ESM to access international pools of capital to deliver the finance they need to fuel their growth." She added that "VR Education's IPO success demonstrates that listing on an exchange is an option for SMEs with a great business, ambition and plans to scale."
Before the stock market launch, the company had already raised €1.3 million in venture capital from Enterprise Ireland, Kernel Capital Venture Funds and Suir Valley Ventures.
The company currently has 21 full-time employees, including graphic artists, animators, developers, researchers and marketers. Early products of the company, from its pre-IPO period, include the Apollo 11 Experience and its Engage platform.
The award-winning Apollo 11 VR was a recreation of the 1969 moon landing that sold over 100,000 copies and brought in €1 million in revenue. It was available for the Sony PlayStation and the Oculus Rift. Since then they have released Titanic VR, a virtual exploration of the wreck of the Titanic.
In a recent interview, Dave Whelan, the CEO of VR Education, said he wanted to turn "educators into rock stars." However, on a more serious note, one of the company's main objectives is to bring down the cost of education. Although by no means a new concept – Coursera got there first, and Britain's Open University even before that – bringing virtual reality into the process adds another dimension – literally!
The company is planning on releasing a series of 10 free lectures with Oxford University. They will also be creating content for paid access and some content exclusively for particular institutions. Online learning has something of a mixed reputation as only about 15% of students finish online courses that they start, according to VR Education. However, the company believes that its virtual reality approach will be more engaging: hence the name of their platform.
The education market is growing rapidly. It stood at $187 million in 2016, but is expected to grow to $2 billion by 2021, while the VR market is expected to quintuple to $35 billion by that year. Despite their recent cash infusion, VR Education may not be in a position to dominate that market, in the face of potential competition from bigger players. But they will certainly be able to carve out a large chunk of it with their head start.
The Engage platform has the flexibility to be used in both academic and corporate education. It can be used for lectures, seminars, conferences and presentations in both secure (closed) and open environments. Content can be live, prerecorded or a mixture of the two.
Sony first appeared above the horizon in the west during the nineteen sixties, as a maker of good quality, portable (transistor) radios. Since then, they have given us the Walkman, Betamax and many fine audio and video products. They have also ventured into computers, the music business and the movie business.
They are known for being innovative. So, it is only natural that they should be out there in the virtual reality and augmented reality industries. And it should come as no surprise that their "Wow Studio" at SXSW (South by Southwest) in Austin, Texas should have made such an impact this year, as indeed it did last year.
One of the products on display this time around was the Xperia Touch, a projector that turns any flat surface into not merely a screen, but a 23-inch touch screen. That is, it projects an image onto a flat surface, which can be horizontal (e.g. a table top) or vertical (a wall), and it uses infrared to track your finger as it touches areas of the screen, which can contain hotspots in the same way as a web page.
You can swipe, select, or even move pieces on a board game - anyone for backgammon? What’s more, it can detect your presence when you approach and switch itself on, along with a greeting message on the surface that it is aimed at.
The full specs are:
Weight: 932 g
Dimensions: 69 x 134 x 143 mm
Battery: 1300 mAh (1-hour continuous video playback) - 1000 cycles
Sensors: Microphone, Accelerometer ×2, e-Compass, GPS, Gyro, ambient light detector, barometer, thermometer, humidity detector, human proximity detector, infrared sensor.
Connectivity: WiFi 11 a/b/g/n/ac (SISO), Miracast sink, Bluetooth 4.2, NFC.
Connectors: USB Type-C, HDMI Type-D
Display: SXRD laser diode projector with 3 x primary colors and LCD shutter, 1366 x 768, autofocus, 23–80-inch projection area, 100 lumens, 4000:1 contrast.
Main Camera: 13 megapixels.
Controls: 10-point multi-touch using IR sensor.
Sound: Two-way stereo speakers.
Power: USB 15 volts.
It is pricey - $1699 at the time of writing - but that is in the nature of nearly all new tech products. As time goes by, the Xperia Touch will come down in price and better versions will come out. But for now, it is out there for those who have the money and want it.
But Sony hasn't stopped there. They have also showcased a whole swathe of products, pushing the limits of innovation and creativity. They are demonstrating this technology in an interactive exhibit combining images from multiple projectors, sensors and 3D-printed models and props. The whole thing is controlled by custom software. It is not intended as an actual product, more of a proof of concept.
The exhibit has attracted praise from The Verge, if only for proving that "something like this is both more accessible and can be experienced collectively, without requiring everybody wear a pair of smart glasses, a VR-style helmet, or even a compatible smartphone with the requisite software."
In our previous article, we speculated that it might already have been the case. Participation in the survey is optional, so there might have been an element of self-selection in favor of the Vive. A further complicating factor was that Rift's main content platform was Oculus Home. To Oculus/Facebook, Steam was merely a supplementary content supplier, there to expand their repertoire. In contrast, Steam was the unofficial content platform for the Vive, notwithstanding the fact that HTC ran the Viveport app store and the Viveport subscription service in tandem.
Thus it was never entirely clear who had the lead in the battle for hardware installed base. However, the strong suspicion was that Rift was already in the lead. Now the survey confirms this. After the catching up period between October and December 2017, the Vive and Rift were tied in January. A photo finish at that stage would have shown only 0.9% between them - a nose in equine racing terms.
However, that all changed in February, when the Rift (excuse the pun) opened up. The survey showed the Rift holding 47.31% of the hardware market and the Vive a close second with 45.38%. Although the gap is small, it's the first time that Rift has held the lead at all, and as such is significant.
Also, the Oculus DK2 has a further 1.95% of the market and Windows Mixed Reality has 5.36% – up from 5.17% last month. This last point is important, because WMR has only been around for a few months, compared to over two years for the Vive and Rift. When you consider that the survey looks at customer base rather than current sales, this could be ominous. Add to this the fact that WMR can also run Windows apps and Rift apps with Revive, and that self-selection element kicks in with a vengeance. The customer base for the WMR range is probably bigger than the survey implies.
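It is worth noting that the four headset families quoted here account for the whole of the February survey between them, which a quick check confirms:

```python
# February Steam hardware survey shares as quoted above (percent).
shares = {
    "Oculus Rift": 47.31,
    "HTC Vive": 45.38,
    "Oculus DK2": 1.95,
    "Windows Mixed Reality": 5.36,
}

total = round(sum(shares.values()), 2)
print(f"Total coverage: {total}%")           # 100.0 - the whole survey

gap = round(shares["Oculus Rift"] - shares["HTC Vive"], 2)
print(f"Rift lead over Vive: {gap} points")  # 1.93 percentage points
```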
As for the head to head between the Vive and the Rift, these figures are all very unofficial. Neither Facebook (owners of Oculus) nor HTC (owners of the Vive) have given out any official sales figures. But the trend is consistent with other data. Rift was also voted the “Most Popular Headset” for the second time.
However, none of the competitors are standing still. HTC are about to release the Vive Pro – a high-end headset that will probably have a very high resolution and FoV spec, but will almost certainly command a premium price.
Way back on the 4th of July 2017, we reported here that Apple had acquired a German company that made eye tracking technology. At the time, we speculated that Apple was planning on using the eye-tracking technology in conjunction with a process known as “foveated rendering.” We can now report that barely a month after that (in August), Apple applied for a Europe-wide patent for a foveated rendering chip.
This is a technique whereby the eye movement is tracked and only the area in the center of the vision is rendered in high resolution, whilst the peripheral area is rendered in lower resolution. This reduces processing time or power. It is based on the fact that the area at the center of a person’s vision is clearer, sharper and more focused than the area at the periphery of their vision.
In the Apple patent, the area at the center of the user’s vision is shown at 8K resolution and this area is identified by eye-tracking. The eye-tracking updates rapidly, in real-time, and informs the graphics processor. The graphics processor in turn rapidly updates the relevant area of the display to make sure that the area the user is focusing on is displayed at the highest resolution.
The patent discloses a system that updates 120 times per second. The display for each eye can update at 240 Hz, but because both eyes have to be updated, the effective system rate is 120 Hz.
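The principle is easy to sketch in code. The function below is a purely illustrative model of foveated rendering – the fall-off radii and scale factors are invented for the example and are not taken from the Apple patent:

```python
import math

def foveated_scale(px, py, gaze_x, gaze_y, fovea_radius=200):
    """Resolution scale factor for a pixel, given the tracked gaze point.

    Pixels inside the foveal radius render at full resolution (1.0);
    resolution falls off linearly to a quarter in the far periphery.
    (All constants here are illustrative, not from the patent.)
    """
    dist = math.hypot(px - gaze_x, py - gaze_y)
    if dist <= fovea_radius:
        return 1.0                       # center of vision: full resolution
    falloff = min((dist - fovea_radius) / (2 * fovea_radius), 1.0)
    return 1.0 - 0.75 * falloff          # periphery: down to 0.25

# Gaze at the center of a 1920x1080 frame:
print(foveated_scale(960, 540, 960, 540))   # 1.0  (foveal region)
print(foveated_scale(0, 0, 960, 540))       # 0.25 (far periphery)
```

In a real pipeline, the eye tracker would feed `gaze_x`/`gaze_y` to the renderer every frame, and the scale factor would drive per-region render resolution.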
The patent application is broadly worded to include different types of display, not just head mounted displays. These include computers, mobile phones, wall-mounted screens and even projectors. However, such broad wording is normal for patent applications, to maximize any potential future benefit.
In practice, it would be of limited value in a wall-mounted or projector-based display. In a smartphone it would have value if the smartphone is used inside a headset display. The main application of the patent is obviously head-mounted displays. In other words, an Apple Virtual Reality headset.
You've got to hand it to the Japanese: for a nation so thoroughly steeped in tradition, they're incredibly good at coming up with original ideas. Who else could have persuaded us to get up in public and sing to a pre-recorded background track? Who else could have developed little boxes that would enable us to listen to music on the go? Even if the idea was later appropriated by Steve Jobs.
So, what is the latest idea to emerge from the land of the rising sun? How about this – a restaurant on the ground where you can imagine that you are on an airline flight, complete with a virtual tour of your destination and a themed meal to match. Current "destinations" being "catered for" are Hawaii, New York, Paris and Rome.
Yes, for prices from 5000 to 6000 yen (£33-£40), customers in Tokyo can enjoy a simulated airline flight and related meal in a restaurant called First Airlines. There is even a choice of Business Class or First Class, with seats from the Airbus A310 and A340.
You start by “checking in” at a check-in desk where you are greeted by uniformed staff. Then you make your way to the restaurant, which is decked out to look like the first-class section of an airplane. Before any food is served, you have to watch a safety demonstration, just like on a real airline. This is presumably the jet age equivalent of “singing for one’s supper” in a nineteenth century Salvation Army soup kitchen.
But that’s not the end of the preamble. The “passengers” – I mean patrons or customers – are given virtual reality headsets which they strap on, to be given a virtual tour of the city they are visiting.
Only then are they served wine and food traditional to the place they are visiting. However, the food itself is not your usual airline fare. Or if it is, then it's the first-class variety. We're talking a four-course meal here. If you're on the New York flight, it's salad followed by clam chowder and then a juicy Angus steak. Dessert is good old cheesecake. On the Rome flight, you'll be served dishes like salmon carpaccio, followed by tiramisu.
There are also large screens mounted on the walls to show the plane taking off and landing. And the experience is enhanced by the fact that the waiters and waitresses are in fact former first-class crew members. And in a joint promotion, the customers can even get discounts at three local shops, by presenting their boarding passes.
The restaurant itself is quite small, with only 12 seats. Whether it is economically viable at that size remains to be seen, although they might of course expand. Most of the customers are actually elderly. The reason for this is that many elderly people cannot travel by air because of their physical condition but would love to experience at least part of what it is like to do so.
This is not the first time a restaurant has built on the theme of airline travel. A company called The Pan Am Experience has been offering Pan Am themed parties since October 2014. But First Airlines has added virtual reality to the mix, giving the experience an added dimension of realism.
The problem is two-fold. Firstly, long-term prisoners tend to become "institutionalized". That is, they learn to blindly follow the rules telling them when to eat, where to get their food, where to go and what work to do. They do not have the problem of finding a job, attending an interview, managing a budget or choosing where to buy what. Even the decision of what to wear is made for them.
And this extends to things like doing the laundry. Working in the prison laundry is very different from doing personal laundry in an American laundromat or a British laundrette.
And because some prisoners are sent to prison early in life, in many cases they never developed these skills in the first place. In America, juveniles accused of serious offences can be tried as adults.
The second problem is that the developed world is changing at a frenetic pace, so that the way we live and function changes rapidly in as little as a few years. A prisoner incarcerated in 1994 may just about know about mobile phones and the beginnings of the internet, but they will have no experience of smartphones, the World Wide Web, search engines or internet shopping. A prisoner incarcerated in 1980 may just about remember cash machines in the big cities, but will probably know nothing about contactless payment.
But the problem is further compounded by yet another factor: some prisoners were never expecting to be released.
All that changed, however, in 2012 when the U.S. Supreme Court ruled in the case of Miller v. Alabama that mandatory life without parole sentences for juveniles were a "cruel and unusual punishment" in contravention of the 8th Amendment, even in cases of murder. (The previous case of Graham v. Florida (2010) held the same, but for crimes falling short of murder.) But the clincher was the case of Montgomery v. Louisiana (US Supreme Court, 2016), which held that the ruling in Miller v. Alabama must be applied retroactively.
The rulings did not guarantee automatic release. They only required that the sentences be reviewed by a judge. Different states responded differently, but the state of Colorado responded proactively at the legislative level by enacting a law in 2016 providing that juveniles who have served 20-25 years of their sentence can be released if they successfully complete a three-year re-entry program.
Colorado is now building the curriculum for that program. The skills that prisoners need for living in the outside world have to be taught, whether it is by books, videos or supervised day-release. But in the Colorado program a new technique is being added to the mix: virtual reality.
This technology enables them to learn not only about the practical side of living life on the outside, but also how to handle the kind of confrontational situations that might make them angry.
"We need to learn how to respond when someone says, 'You're a murderer,'" says Eric Davis, a 49-year-old who was sentenced to 40 years to life for murder when he was 17. He can petition for re-sentencing in 2021.
The first facility to introduce the program with virtual reality is the Fremont Correctional Facility (FCF) in the East Canon prison complex in Fremont County, east of Canon City. FCF houses many of Colorado’s convicted sex offenders (85% of the facility’s population) and it is also one of the few prisons in the state to offer a sex offender treatment program.
Currently, nine prisoners at FCF are enrolled in the re-entry program. Their crimes include second degree and felony murder and were committed when they were juveniles. Some were as young as 15 at the time. They were tried and sentenced as adults and have been living in adult prisons ever since being convicted. Most of them are now in their late thirties or early forties.
The program is overseen by Melissa Smith of the Colorado Department of Corrections. She says that while they were conducting research on how to implement the re-entry program, they came across the idea of using virtual reality. She added that the State of Pennsylvania was also experimenting with virtual reality for prisoners, although in a somewhat different way.
The first prisoner to be released in Philadelphia, Pennsylvania, under this program is Daniel Peters, who had been in prison for 34 years, since the age of 17. His sentence had been life imprisonment, but that changed with the Miller and Graham rulings. He is one of some 295 Philadelphia inmates to be released under these rulings.
As part of his preparation for release, Peters was transferred to a halfway house in the Callowhill neighborhood on 24 June 2016. However, even this was a daunting experience. But Peters was prepared for the move by being given a virtual reality tour of the facility, so he would know what to expect.
Both these uses of virtual reality in preparing long-term prisoners for release are in their early stages.
People have been talking about using Virtual Reality as a teaching aid in the classroom for a long time, but now it is finally being trialed in a high-poverty educational district in Pennsylvania.
An article by Eleanor Chute in the Hechinger Report highlights the use of Virtual Reality to broaden the minds of junior high students in the Cornell School District of Coraopolis, Pennsylvania. The article gives the example of Jada Jenkins, an eighth grader who was transported into the very different world of a forest by the VR headset.
In a scenario that might have come straight out of James Cameron’s 2009 blockbuster, Jada and her classmates were given their own avatars and were able to wander around the virtual forest and find each other. Rather than actually walking around, they moved their avatars with hand-held controllers. This meant they could interact in the virtual forest without the risk of bumping into each other.
But this was only the beginning. After a period of acclimatization, Jada and the other students were assigned to teams and sent off on a scavenger hunt, solving clues to win. Chute gives the example of identifying which omnivore might forage for nearby plants or animals after awakening from hibernation. In this case, the answer was a black bear, and the winning team could plant their flag to claim victory.
The project, called Voyage, was developed by a team at Carnegie Mellon University. Julian Korzeniowsky, a graduate student who worked on the project, describes VR as:
“another tool teachers can use to hopefully increase the learning gains of their students through engagement.”
In the Voyage “game” - if one can call it that - the students also hear realistic sounds like flowing water or animals in the distance.
The immersive realism was a strong motivating factor. It turned learning into fun - always a good way to hold children’s attention.
This is one of the strengths of Virtual Reality. It can bring situations to life in a way that books or even TV and movies can’t. At the same time, it is cheaper than field trips. And when the place is a forest with black bears, it is also a lot safer! Other students have witnessed 3D scenes in Syrian refugee camps, according to the Hechinger Report article. This has broadened their horizons without putting them in danger. Other VR scenarios can give students a glimpse into history, re-enacted in three dimensions.
Kristopher Hupp, director of technology and instructional innovation in the Cornell School District, explained the reasoning behind the project: “Virtual reality allows students to explore places and structures in a way that is as close to real life as possible, without actually leaving our campus.”
Meanwhile, a company called Schell Games, founded and headed by Jesse Schell, professor of the practice of entertainment technology at Carnegie Mellon University, is developing a virtual chemistry lab. The concept has been around for decades. The classic example was the experiment that “blows up the lab” and teaches students about the dangers of explosives without actually killing them. But doing it on a computer screen was never quite the same as doing it in 3D, as if you were actually there in the lab, mixing the chemicals and heating them over the Bunsen burner.
Schell believes that the main value of VR is in helping students to visualize, rather than leaving it to the imagination alone. As for the fact that VR has not yet made major inroads into the classroom, Schell compares this to the initial resistance to computers in the eyes of educators “between 1978 and 1990.”
These days, of course, computers are no longer as expensive as they were up till the mid-eighties. And their value as educational tools is beyond dispute.
Cornell School District has been described as a “high poverty” area. But it was helped out by a $20,000 grant from the Allegheny Intermediate Unit, a regional public school service agency. This grant enabled the district to buy 15 Google Daydream headsets, a similar number of Mattel View-Master VR viewers, and 15 Google Pixel phones. The Daydream headsets were for the older students, while the younger ones got the Mattel View-Masters.
In order to keep an eye on the situation, the teachers don’t wear headsets, but use iPads to monitor what the students are seeing. But it doesn’t stop there. The district has bought two 360-degree still cameras to enable the teachers to develop their own content.
It was in fact only after the school district bought the VR hardware - without a specific project in mind - that Carnegie Mellon University’s Entertainment Technology Center got involved. As the Voyage blog explains:
The idea started with a spark from Sharan [Shodhan]. He heard that the Cornell School, a high school in the Pittsburgh area, acquired a bunch of Google Daydreams and Pixels, but didn’t really have anything to do with them. Sharan wanted to combine the power of VR (great immersion, but somewhat isolating) with the power of the classroom (collaborating with your friends and working with the teachers). This turned into the idea to create a multiplayer VR experience for the classroom that we will integrate into the Cornell School at the end of the semester.
Thereafter the project was developed by the Carnegie Mellon team, with feedback from the teachers and students at the school. The results were impressive. While the students found the app “cool”, the teachers noticed that it was also a very effective learning tool. As history teacher Andrew Erwin said: “But with virtual reality, even with one try, I could tell that there is some educational value. The kids do remember facts better when they use virtual reality.”
So far, the team has merely been testing the waters. But now that VR is finding its way into the classroom, it’s only a matter of time before we see a whole lot more of it.