I enjoyed visiting with many of you at our Oculus Rift and Vive demo last week. We were fortunate to be joined by colleagues from area universities Randolph-Macon and VCU. They are grappling with many of the same questions we are: how to balance the hype and potential of these technologies with practical, meaningful applications. Lots of fun, idea sharing, and stimulating conversation was had. A major thank you to Joedy Felts from Communications for letting us use his Oculus Rift! If you couldn’t make it last week, we’ll have more later this summer.
With immersive computing around us and woven into our environment, information will be richer, more relevant, and more helpful to us.
Ever since I started this community of practice / newsletter I’ve struggled with how to frame the conversation and how to name the group. I wanted it to be both focused and wide-ranging in scope. After reading a particularly insightful article from Google’s VR/AR lead, Clay Bavor, I’ve settled on a name: Immersive Technologies at UR. Clay makes a strong case that VR and AR are not competing technologies but rather occupy two distinct places on a single ‘immersive technology’ spectrum.
He also does a good job of summarizing where these immersive technologies are now and provides insight into the ways they could change computing in the future.
As value goes up and costs come down, immersive computing will make sense for more and more people. And this isn’t a question of if — it’s a question of when.
More cool things coming from Google I/O, Google’s annual developer conference.
Say your team wants to workshop an idea. Today you grab a dry erase marker, find a room with a whiteboard, and start working. Would it be possible to replicate this process in virtual reality? If so, would VR provide benefits or introduce new problems? This article talks about the challenges and insights gleaned from developing such an app. I thought the development process they describe was fascinating, and I can’t wait to see what final product they end up with.
Some ideas explored:
How would it be if it was effortless to move whiteboards, Post-Its, and work sessions from the physical space into VR?
Do the physical constraints of these objects affect the way we think while ideating?
There’s something that gets lost when trying to have meetings on digital platforms like Slack or Hangouts / Skype. Is it the physical presence, our facial expressions, or the limited tools that make these platforms so inefficient and awkward when it comes to ideation?
What if all of these physical nuances could be moved to a digital space and still remain tangible for the user?
Virtual and Augmented Reality: Stepping into the New Frontier of Learning Webinar – Presented by Emory Craig and Maya Georgieva: May 1 1:00-2:30 pm, Boatwright 322. We’d love for you to join us to hear from Emory and Maya then discuss how their vision for the future fits into the University’s new strategic plan. More information.
Oculus Rift Technology Demo – in collaboration with Joedy Felts of Communications. May 11th, 9:00 am – 4:00 pm, Boatwright 322. As many of you know, we’ve invested in the HTC Vive for our initial explorations into immersive VR technology. This will be an opportunity for the VR community to try the new Oculus Rift and its Touch controllers.
Mobile technology analyst Benedict Evans (of a16z) analyzes the current state of augmented reality technology and speculates on how recent developments compare to mobile phone technology in the 2000s. I always find Benedict’s analysis thoughtful and nuanced; this article is no different.
I believe the multitouch -> iPhone, AR -> ?!? analogy is spot on. Demos of AR technology are getting cooler and cooler (HoloLens, Magic Leap, etc.), but we are still waiting for a breakthrough product that truly changes the way people think and interact.
I also agree with his assertion that ‘real’ augmented reality will arrive when a device can see and interpret the world around us. This is something I am constantly reminded of when I hear people talk about Pokemon Go. A “dumb” heads-up display (HUD) wouldn’t be compelling enough to become a breakthrough consumer product – a cool fad perhaps, but not a lasting computing revolution.
Evans writes: an AR device with “an ambient, intangible, AI-led UI would change everything”. I agree and would add that education in particular will be revolutionized with these advances.
A VR/AR Sandbox
Stéphan Faudeux’s VR talk at the French Film Festival last month was a terrific survey of VR’s past, present, and future. Among many insights, he talked about how some French movie theaters were installing VR arcades. It turns out the concept of a VR arcade isn’t new: IMAX opened its first VR arcade in the United States this year, and similar projects are popping up all over the country. Norm Laviolette, founder of Asylum Gaming and eSports in New England, says:
“Ultimately, we are creating an experience for people, and really there are few things out there that can elicit such an amazing physical, emotional, psychological reaction like VR,” he said. “We plan to have a dedicated wing just for VR, and keep it flexible to evolve as VR evolves and becomes more and more sophisticated.”
The more I talk with people on campus about implementing VR technologies, the more I believe that, in addition to faculty-driven academic and research developments, we should also be student focused. What types of experiences will our students expect in 3-5 years when they arrive on campus? We should be giving students access to these new technologies, and I think a VR sandbox/arcade concept like Mk2 VR might work at an institution our size.
VR Lecture by Stéphan Faudeux on Campus
A very special event is happening on campus this month. The French Film Festival will be featuring a lecture by Stéphan Faudeux, titled: Virtual Reality and Cinema: Complementary or Competitors? The talk happens March 28th, 10 am to 11:30 am and will “cover the progress of virtual reality in various areas, with a pragmatic, practical and fun approach.” For more information check out the French Film Festival site.
Two Events Coming in April: VR Student Research Project Pizza & Pedagogy and Organon VR Anatomy Demo
Alyssa Ross and Dr. Kristin Bezio will discuss using the HTC Vive in the classroom and research lab for our April Pizza & Pedagogy lunch – free pizza! Register here
We will be hosting a demo of the Organon VR Anatomy app for the HTC Vive on April 11th from 1 to 4 pm. This app will change the way you think about the human body (I’m not exaggerating). http://www.3dorganon.com/site/
[Tim] Cook (Apple’s CEO) has likened AR’s game-changing potential to that of the smartphone. At some point, he said last year, we will all “have AR experiences every day, almost like eating three meals a day. It will become that much a part of you.”
Mark Gurman at Bloomberg gets some new, interesting details on Apple’s AR efforts. However, it’s still uncertain how exactly Apple is going to define “AR”. Some argue Pokemon on a large slab of glass (i.e. an iPhone) is AR; others believe that, in order to be a new platform, glasses or new hardware have to be involved. I think Google Glass was a lesson that not everyone is keen on wearing camera-equipped glasses, or on being recorded by someone who is. I tend to think Apple’s strategy in the near term will be focused on extending the AR functionality of the iPhone. From an ed tech standpoint, that would be great given the popularity of iPhones on campus.
Speaking of AR apps for the iPhone
Emory Craig argues that new AR apps from Shazam, Blippar, and others have the “potential to pull augmented reality out of gaming and into our everyday lives.”
The apps certainly look like a lot of fun. Emory argues the critical flaw is having to hold up a phone to enjoy the experience. He clearly believes glasses are the future.
A Shift from Looking to Interacting
This article articulates exactly where I think we are in terms of VR education technology. Looking around in 360 is great but not revolutionary; collectively interacting in virtual or augmented worlds is. Paul at W&L is doing some great things with VR and has a strong vision for how success will be defined in the future.
Paul Low, who taught undergraduate geology and environmental sciences and is now a research associate at Washington and Lee University, is among a small group of profs-turned-technologists who are experimenting with virtual reality’s applications in higher education. Early VR programs were about showing students places, say the Louvre or ancient Rome, in low-cost headsets like the Google Cardboard viewer. But these latest iterations go further, creating entire environments—from the subatomic level to the solar system—that students can manipulate. Low and his colleagues at other campuses are trying to shepherd VR educational content from being something “that students look at” to something they can interact with.
… Low is excited about creating virtual environments where students and faculty can inhabit the same space and interact, like in the earthquake study. That experience is easier for learning designers to create, he says, because they don’t have to program all of the sequences of events that could potentially happen during a solo activity. Instead, the instructor handles the interactive components on the spot.
VR will never become the new cinema. Instead, it will be a different thing. But what is that thing? And will audiences trained in passive linear narrative—where scene follows scene like beads on a string, and the string always pulls us forward—appreciate what the thing might be? Or will we only recognize it when the new medium has reached a certain maturity, the way audiences in 1903 sat up at The Great Train Robbery and recognized that, finally, here was a movie?
Visualization can reveal the knowledge hidden in data, but traditional 2-D and 3-D data visualizations are inadequate for large and complex data sets. Our solution is to visualize as many as 10 dimensions in VR/AR all via a Shared Virtual Office, which allows even untrained users to spot patterns in data that can give companies a competitive edge.
We are just at the early stages of understanding what kinds of tools will prove useful in VR, but this looks very promising. It feels super nerdy to say, but being able to walk around a 3D scatter plot sounds exhilarating.
The “strap a small screen on your face to get a big-screen TV” idea has gained some traction recently, but I really wonder if it is a fad like the 3D TVs of the last few years. Same with the 360 live video cameras. Both have some utility and “oh, this is cool” moments, but neither seems like virtual reality to me.
I’m more interested in the new recording technology that uses light fields (depth + photographs) to recreate a place or event for you in VR. Imagine walking on the sideline at the Super Bowl, as opposed to these pseudo-“VR” experiences for Super Bowl LI.
The new Lithodomos VR app will turn archaeological sites into completed visualizations of how they once appeared. Apps like it will not only impact tourism, but transform how we teach history and archaeology. The days of hand-drawn renderings are coming to an end; students will explore deeply immersive environments depicting the past.
I think stating that this app and others like it will transform how we teach history and archaeology is a bit of a stretch right now, but the technology is looking more and more promising. I hope this app comes out on the Vive soon!
This experience is segmented into 3 “rounds” based on 3 different themes: (1) VR Experiences in Art, Museums and Cultural Sites, (2) VR and AR Experiences with the Human Self, and (3) VR Experiences in Storytelling, Journalism and Social Science. I highly recommend trying out this experience – if you need a Google Cardboard viewer, come by the CTLT in Boatwright library.
Unity is, and has been, the ‘go-to’ tool for VR game and environment creation, but until now development occurred on a standard PC and monitor. It makes sense that building in VR would be a natural evolution for the tool, but this is exciting regardless. I wonder if it will reduce the learning curve for new developers? As an inexperienced developer, I hope so!
The CTLT will be hosting a Virtual Reality Open House in the Jepson C&P room on February 9th and 23rd from 1-4 pm. All on campus (faculty, staff, & students!) interested in virtual or augmented reality are invited!
During this open house you’ll have the chance to interact with new VR technology that uses positional tracking (which basically means it knows where you are in space) to mimic your movements in the virtual world. Our positional tracking device is called the HTC Vive and using it really gives you a sense of how powerful VR can be.
We in the CTLT believe in exploring emerging technologies that can create new educational opportunities. Positional tracking VR has the potential to pave many new avenues in learning and research here on campus. To help guide your journey, we’ve outlined a few different experiences to choose from.
A promise of VR has always been to take you places you’ve never been. Destinations and Realities.io are apps that push the envelope of imaging technology to take you to far off places and allow you to experience medieval churches or mountaintops from the comfort of Jepson Hall.
Technological advancements are so important to the educational landscape because they create opportunities that were never before possible, opening new avenues for educators to guide their students down. Tilt Brush is an app that creates a completely new opportunity: to create in virtual space. You can create paintings, sketches, buildings, even volcanoes, and then walk around and interact with them.
What’s the difference between a virtual experience and a real one? As virtual experiences become more prevalent, we must ask ourselves what separates virtual reality from actual reality. Come experience what virtual reality ping pong has to offer.
Have you ever been frustrated with the laws of nature when teaching in the classroom? Have you wanted to show a microscopic structure with your hands or demonstrate a dangerous interaction for your students? With virtual reality, there is potential to engage with your students in new ways. A fair warning: this app is early in development, but it will give you a glimpse into the direction the field is going.