<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Virtual Reality Archives - Situated Research</title>
	<atom:link href="https://www.situatedresearch.com/tag/virtual-reality/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.situatedresearch.com/tag/virtual-reality/</link>
	<description>Usability Research and User Experience Testing</description>
	<lastBuildDate>Mon, 22 Nov 2021 17:33:24 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	

<image>
	<url>https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2021/03/cropped-icon.png?fit=32%2C32&#038;ssl=1</url>
	<title>Virtual Reality Archives - Situated Research</title>
	<link>https://www.situatedresearch.com/tag/virtual-reality/</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">122538981</site>	<item>
		<title>Road to GDC: I’m Not A Doctor, but I Simulate One in VR</title>
		<link>https://www.situatedresearch.com/2018/03/road-gdc-im-not-doctor-simulate-one-vr/</link>
					<comments>https://www.situatedresearch.com/2018/03/road-gdc-im-not-doctor-simulate-one-vr/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Fri, 02 Mar 2018 17:20:01 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[Games for Learning]]></category>
		<category><![CDATA[Health Care]]></category>
		<category><![CDATA[Learning]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">https://www.situatedresearch.com/?p=9703</guid>

					<description><![CDATA[<p>We are moving into a future where games train our doctors, monitor our health, and treat our illnesses.&#160; The sky is falling! Social media is the new scapegoat of the month. Headlines claim it is ruining our relationships, dismantling our society, destroying our very lives! In particular, the most frequent victims are presumed to be&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2018/03/road-gdc-im-not-doctor-simulate-one-vr/">Road to GDC: I’m Not A Doctor, but I Simulate One in VR</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>We are moving into a future where games train our doctors, monitor our health, and treat our illnesses.&nbsp;<span id="more-9703"></span></p>
<p>The sky is falling! Social media is the new scapegoat of the month. Headlines claim it is ruining our relationships, dismantling our society, destroying our very lives! In particular, the most frequent victims are presumed to be teenagers. Sometimes the accused culprit is not social media, but the phones that make it so accessible. Is it true? Only time will tell &#8230; but in the &#8217;50s, the demon was comic books; in the &#8217;60s, rock and roll; and in the &#8217;80s, video games. My mother was convinced that my love of comic books and science fiction was going to rot my brain. Now, of course, these things are mainstream and no longer the sole domain of teens. But there’s always a new thing for people to worry about or blame for the decline and fall of civilization.</p>
<p>I’m particularly sensitized to that criticism of video games. I designed and programmed my first computer game in college in 1976 &#8211; in fact, inspired by that very love of science fiction I had as a child. When I graduated in 1980, my first job out of college was entering the then-infant video game industry. I’ve never left. So when pundits blamed games for destroying society, even causing teen violence and rebellion, I took it personally. I’ve always felt that video games can be magical, marvelous entertainment. I hoped that one day they’d be seen as not just safe, but actually good for us. That day is finally here.</p>
<h3>Virtual treatment, real results</h3>
<p>For many years now, researchers and doctors have gradually built up solid scientifically verified evidence that existing games can improve the lives of the people who play them. At the same time, increasing numbers of games have been created with the idea of ‘boosting health’ as a direct goal.</p>
<p>Fast action games like Call of Duty have been found to improve visual perception and the ability to make correct decisions quickly. Other research has shown promise in using a game to treat the underlying causes of&nbsp;<a href="https://www.polygon.com/2014/2/24/5439884/this-game-knows-how-scared-you-are-but-could-be-used-to-heal-trauma" target="_blank" rel="noopener">depression</a>. It’s possible that games may be able to diagnose the onset of degenerative diseases like Alzheimer’s and Parkinson’s, and perhaps even slow their progression.</p>
<p>Games have shown promise in the realm of physical fitness, too. Starting 20 years ago, the arcade game Dance Dance Revolution was credited with getting a lot of passive couch potatoes up, moving, and losing weight, and it’s still spawning sequels. Games on mobile phones like&nbsp;<i>Zombies, Run!</i> and&nbsp;<i>Pokémon Go</i>&nbsp;have encouraged players to get out and move in the world, and many track their exercise and calorie expenditure as they do so. VR holds promise here too, with the chance to get your exercise by racing the Tour de France on your exercise bike, or by flying like a bird. There are even current ventures bringing gameplay to gym class and possibly making dodgeball fun even for nerds!</p>
<h3>Doctors with joysticks</h3>
<p>It turns out that doctors in training, like most people these days, are often avid game players. That has presented a great opportunity for using games as part of their medical education. Although games have yet to replace classes, they’ve been shown to help laparoscopic surgeons reduce errors by 37 percent while increasing their speed by 27 percent when used as warm-up exercises. When you consider that athletes, musicians, dancers, and others who need to do precision work with their muscles all limber up before their tasks, it makes sense that the right kind of practice helps surgeons, too.</p>
<p>Other companies are rushing to use VR to train anesthesiologists or to give caregivers a first-hand sense of how their patients with macular degeneration see the world. The VR simulations aren’t all games, but the vast majority of VR engineers are coming from the games industry.</p>
<h3>Prescribing play</h3>
<p>Perhaps the most exciting applications of games in the modern world are the ways in which doctors are using games to treat their patients. Realistic war games have helped soldiers recover from PTSD by simulating the experiences that trigger their problem, a method to gradually desensitize them to reduce their symptoms long term. Other games have been used in similar ways in conjunction with therapy to treat&nbsp;<a href="https://www.polygon.com/features/2017/4/7/15205366/vr-danger-close" target="_blank" rel="noopener">phobias</a>&nbsp;like fear of heights, flying, and spiders. And currently, virtual reality games have shown great promise in relieving acute pain, reducing or even eliminating the need for narcotics when changing the dressings on burn victims. VR is also showing promise in helping stroke victims recover control over their movement, and in&nbsp;<a href="https://www.polygon.com/2014/3/3/5462508/phantom-pain-video-game-treatment" target="_blank" rel="noopener">relieving the perception of pain in “phantom limbs”</a> experienced by amputation patients.</p>
<p>Last September saw the FDA approval of a mobile phone app to be used (in conjunction with therapy) to treat addiction. The developers call their app a “Prescription Digital Therapeutic” and, although it’s not a game, it’s a big step to have software approved to treat something as serious as Substance Abuse Disorder.</p>
<p>But a real game designed to be an active treatment for ADHD (Attention Deficit Hyperactivity Disorder) was not far behind. By December, the FDA gave preliminary clearance to a video game made by a team consisting of both game developers and neuroscientists from UCSF. In a large controlled trial of children and teens diagnosed with ADHD, the group who used the game showed significant improvement compared to a control group. The team hopes that soon it will become the first game to win FDA approval on the same terms as a prescription drug. In style, the game is part racing game, part Pokémon Snap, but with many unique twists to improve attention and focus.</p>
<p>We are moving into a future where games train our doctors, monitor our health, and treat our illnesses. It may seem a bit outrageous now, but if comic books led me into a career making video games and often become the basis of mainstream movies, why can’t video games inspire the next generation of doctors and become the basis of medical treatment? Video games are intimately connected to learning, attention, and the brain. It isn’t an accident that they are also proving to be useful to our mental and physical health. Maybe they’ll even be able to reverse my dreaded comic book brain rot!</p>
<p><i>This is part of a&nbsp;<a href="https://www.rollingstone.com/gdc" target="_blank" rel="noopener">series of columns</a>&nbsp;written by developers speaking at the Game Developers Conference in March.</i></p>
<p><i>Noah Falstein is a freelance game designer and producer, and was one of the first 10 employees at LucasArts Entertainment and Dreamworks Interactive. Last year he left Google after serving four years as their Chief Game Designer.</i></p>
<p>Written by: Noah Falstein, via <a href="https://www.rollingstone.com/glixel/features/road-to-gdc-im-not-a-doctor-but-i-simulate-one-in-vr-w517154" target="_blank" rel="noopener">Rolling Stone</a><br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2018/03/road-gdc-im-not-doctor-simulate-one-vr/">Road to GDC: I’m Not A Doctor, but I Simulate One in VR</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2018/03/road-gdc-im-not-doctor-simulate-one-vr/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">9703</post-id>	</item>
		<item>
		<title>Next Big Thing for Virtual Reality: Lasers in Your Eyes</title>
		<link>https://www.situatedresearch.com/2016/05/next-big-thing-virtual-reality-lasers-eyes/</link>
					<comments>https://www.situatedresearch.com/2016/05/next-big-thing-virtual-reality-lasers-eyes/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Tue, 03 May 2016 21:29:30 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[heads-up-display]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Mobile Devices]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=9341</guid>

					<description><![CDATA[<p>San Francisco – The next big leap for virtual and augmented reality headsets is likely to be eye-tracking, where headset-mounted laser beams aimed at eyeballs turn your peepers into a mouse.  A number of startups are working on this tech, with an aim to convince VR gear manufacturers such as Oculus Rift and HTC Vive&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2016/05/next-big-thing-virtual-reality-lasers-eyes/">Next Big Thing for Virtual Reality: Lasers in Your Eyes</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>San Francisco – The next big leap for virtual and augmented reality headsets is likely to be eye-tracking, where headset-mounted laser beams aimed at eyeballs turn your peepers into a mouse. <span id="more-9341"></span></p>
<p>A number of startups are working on this tech, with an aim to convince VR gear manufacturers such as Oculus Rift and HTC Vive to incorporate the feature in a next generation device. They include SMI, Percept, Eyematic, Fove and Eyefluence, which recently allowed USA Today to demo its eye-tracking tech.</p>
<p>“Eye-tracking is almost guaranteed to be in second-generation VR headsets,” says Will Mason, cofounder of virtual reality media company UploadVR. “It’s an incredibly important piece of the VR puzzle.”</p>
<p><iframe title="USATODAY-Embed Player" width="850" height="480" frameborder="0" scrolling="no" allowfullscreen="true" marginheight="0" marginwidth="0" src="https://uw-media.usatoday.com/embed/video/82420346?placement=snow-embed"></iframe></p>
<p>At present, making selections in VR or AR environments typically involves moving the head so that your gaze lands on a clickable icon, and then either pressing a handheld remote or, in the case of Microsoft’s HoloLens or Meta 2, reaching out with your hand to make a selection by interacting with a hologram.</p>
<p>As shown in Eyefluence’s demonstration, all of that is accomplished by simply casting your eyes on a given icon and then activating it with another glance.</p>
<p>“The idea here is that anything you do with your finger on a smartphone you can do with your eyes in VR or AR,” says Eyefluence CEO Jim Marggraff, who cofounded the Milpitas, Calif.-based company in 2013 with another entrepreneur, David Stiehr.</p>
<p>“Computers made a big leap when they went from punchcards to a keyboard, and then another from a keyboard to a mouse,” says Marggraff, who invented the kid-focused LeapFrog LeapPad device. “We want to again change the way we interface with data.”</p>
<h2>Eye Tech Not Due for Years</h2>
<p>As exciting as this may sound, the mainstreaming of eye-tracking technology is still a ways off. Eyefluence execs say that although they are in discussions with a variety of headset makers, their tech isn’t likely to debut until 2017. Other companies remain largely in R&amp;D mode, and Fove has a waitlist for its headset’s Kickstarter campaign.</p>
<p>The challenges for eye-tracking are both technological and financial. Creating hardware that consistently locks onto an infinite variety of eyeballs presents one hurdle, while doing so with gear that is light and consumes little power is another.</p>
<p>And while a number of companies in the space have managed to land funding – Eyefluence has raised $21.6 million in two rounds led by Intel Capital and Motorola Solutions – some tech-centric VCs are sitting on the sidelines while they wait for the technology to mature and for headset makers to make their moves.</p>
<p>“What eye-tracking will do will be powerful, but I’m not sure how valuable it will be from an investment standpoint,” says Kobie Fuller of Accel Partners. “Is there a multi-billion-dollar eye-tracking company out there? I don’t know.”</p>
<p>Among the unknowns: whether the tech will be disseminated through a licensed model or if existing headset companies will develop it on their own.</p>
<p>Still, once deployed, eye-tracking has the potential to revolutionize the VR and AR experience, Fuller expects.</p>
<p>Specifically, eye-tracking will “greatly enhance interpersonal connections” in VR, he says, by applying realistic eye movements to avatars.</p>
<p>Facebook founder Mark Zuckerberg, who presciently bought Oculus for $2 billion, is banking on VR taking social interactions to a new level.</p>
<p>“The most exciting thing about eye-tracking is getting rid of that ‘uncanny valley’ (where disbelief sets in) when it comes to interacting through avatars,” says Fuller.</p>
<h2>Less Computing Power</h2>
<p>There are a few other ways in which successful eye-tracking tech could revolutionize AR and VR beyond just making such worlds easy to navigate without joysticks, remotes or hand gestures.</p>
<p>First, by tracking the eyes, such tech can telegraph to the VR device’s graphics processing unit, or GPU, that it needs to render in full detail only the region where the eyes are looking at that moment, a technique known as foveated rendering.</p>
<p>That means less computing power would be needed. Currently, a $700 Oculus headset requires a powerful computer to render its images. Oculus’s developer kit with a suitable computer costs $2,000. “If you can save on rendering power, that could significantly lower the barrier to entry into this market for consumers,” says UploadVR’s Mason.</p>
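<p>As a rough illustration of that gaze-dependent rendering idea (often called foveated rendering), here is a minimal Python sketch. The 5° and 30° rings and the quarter-resolution floor are illustrative assumptions, not values from any actual headset SDK:</p>

```python
import math

def foveation_scale(tile_center, gaze, inner_deg=5.0, outer_deg=30.0):
    """Return a render-resolution scale for a screen tile: 1.0 (full
    resolution) inside the foveal ring, 0.25 (quarter resolution) in
    the far periphery, with a linear falloff in between.
    Positions are angular coordinates in degrees."""
    dist = math.hypot(tile_center[0] - gaze[0], tile_center[1] - gaze[1])
    if dist <= inner_deg:
        return 1.0                # fovea: render at full detail
    if dist >= outer_deg:
        return 0.25               # periphery: cheap, coarse render
    t = (dist - inner_deg) / (outer_deg - inner_deg)
    return 1.0 - 0.75 * t         # blend between the two rings
```

<p>Production headsets implement this inside the GPU pipeline (for example with variable-rate shading), but the savings come from the same falloff logic: most of the frame can be rendered coarsely because the eye resolves fine detail only where it is pointed.</p>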
<p>And second, by not just tracking the eyeball but also potentially analyzing a person’s mood and logging details about their gaze, AR/VR headsets are in a position to deliver targeted content as well as give third-party observers insights into the wearer’s state of mind and situational awareness.</p>
<h2>Police Use</h2>
<p>The former use case would appeal to in-VR advertisers, while the latter would come in handy for first responders.</p>
<p>“Police and paramedics are looking for an eyes-up, hands-free paradigm, and eye-tracking can bring that,” says Paul Steinberg, chief technology officer at Motorola Solutions, an investor in Eyefluence.</p>
<p>Steinberg sketches out a scene from what could be the near future.</p>
<p>A police officer on patrol has suddenly unholstered his gun. Via his augmented reality glasses with eye-tracking, colleagues at headquarters are instantly fed information about his stress level through pupil dilation information.</p>
<p>They can then both advise the officer through a radio as well as activate body cameras and other tech that he might have neglected to turn on in his stressed state. What’s more, another officer on the scene can instantly scan through a variety of command center video and data feeds through an AR headset, flipping through the options by simply looking at each one.</p>
<p>“We would have to work with our (first responder) customers to train them how to use this sort of tech of course, but the potential is there,” says Steinberg. “But we’re not months away, we’re more than that.”</p>
<h2>Demo Shows Off Ease of Use</h2>
<p>An Eyefluence demo indicates that eye-tracking technology isn’t a half-baked dream.</p>
<p>Navigating between a dozen tiles inside a first-generation Oculus headset proves as easy as shifting your gaze between them. Making selections – the equivalent of clicking on a mouse – is equally intuitive. At no time does the head need to move, and hands remain at your side.</p>
<p><iframe loading="lazy" src="https://www.youtube.com/embed/iQsY3uLvYQ4" width="720" height="384" frameborder="0" allowfullscreen="allowfullscreen"></iframe></p>
<p>After about 10 minutes in the demo, it feels antiquated to pop on a VR headset and grab a remote to click through choices selected with head movements.</p>
<p>Marggraff says Eyefluence’s technical challenges included making technology that could respond in low and bright light, accounting for different size pupils and ensuring that power consumption is minimal.</p>
<p>But, he adds, his team remains convinced of the inevitability of its product: “Just like when we started tapping and swiping on our phones, we’re going to eventually need a better interface for AR and VR.”</p>
<p>Written by: <a href="http://www.usatoday.com/staff/1005/marco-della-cava/" target="_blank" rel="noopener">Marco della Cava</a>, <a href="http://www.usatoday.com/story/tech/news/2016/05/02/new-mouse-vr-could-your-eyes/83716986/" target="_blank" rel="noopener">USA Today</a> (via <a href="http://ispr.info/2016/05/03/next-big-thing-for-virtual-reality-eye-tracking-lasers-in-your-eyes/" target="_blank" rel="noopener">Presence</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2016/05/next-big-thing-virtual-reality-lasers-eyes/">Next Big Thing for Virtual Reality: Lasers in Your Eyes</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2016/05/next-big-thing-virtual-reality-lasers-eyes/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">9341</post-id>	</item>
		<item>
		<title>Games User Research: What’s Different?</title>
		<link>https://www.situatedresearch.com/2016/03/games-user-research-whats-different/</link>
					<comments>https://www.situatedresearch.com/2016/03/games-user-research-whats-different/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Mon, 21 Mar 2016 16:36:13 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Usability Research]]></category>
		<category><![CDATA[Usability Testing]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User-Centered Design]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=9288</guid>

					<description><![CDATA[<p>Summary: Game testing researches the notion of fun. Compared with mainstream UX studies, it involves many more users and relies more on biometrics and custom software. The most striking findings from the Games User Research Summit were the drastic age and gender differences in motivation research. Last week, I attended the Games User Research Summit&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2016/03/games-user-research-whats-different/">Games User Research: What’s Different?</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>Summary:</strong> Game testing researches the notion of fun. Compared with mainstream UX studies, it involves many more users and relies more on biometrics and custom software. The most striking findings from the Games User Research Summit were the drastic age and gender differences in motivation research. <span id="more-9288"></span></p>
<p>Last week, I attended the <a href="http://gamesurconf.com/">Games User Research Summit</a> (GamesUR or GUR), which happened in connection with the Game Developer Conference (GDC), but was hosted separately at Electronic Arts (EA) in Silicon Valley — as you can tell, these guys love their acronyms.</p>
<p>With EA being the gracious hosts, the conference happened under the watchful eyes of an enormous dragon and the break room was festooned with large posters of the classic <em>Star Wars</em> characters. It was clear just from the surroundings that we were not in Kansas anymore. (Or, rather, not in the realm of mainstream UX. Here <em>really </em>be dragons.)</p>
<p>It was clear from the terminology bandied about in the talks that games are different from other design projects. Take, for example, the game Rainbow Six Siege (inevitably abbreviated as R6S, because they do like acronyms). One of the UX metrics tracked during the testing of this game was the <strong>kill/death ratio</strong>, which, admittedly, is not one of the things we teach in our otherwise comprehensive Measuring User Experience seminar. (This ratio is the number of opponents you kill divided by the number of your team members who die during a death match. Another term we don’t use much in mainstream design projects.)</p>
<h2>Much Remains the Same</h2>
<p>Despite the dragon and the death matches, I actually saw many similarities between the games user-experience (GUX) world and the mainstream UX world.</p>
<p>In a brilliant opening talk, Brandon Hsuing (Director of Insights at Riot Games) explained how he has organized his department of 70 people. A main takeaway was the benefit of embedding UX researchers within product teams, at the feature-design level, but ideally also at the higher level of the complete game. Twenty years ago, at Sun Microsystems, we made the same key point of using a matrix organization where researchers report to a central, specialized group but sit with a product team in a dotted-line relationship.</p>
<p>Since Riot Games has close to 2,000 employees, a department of 70 insight professionals might seem too low for the recommended share of 10% of project teams being allocated to user research. However, because Riot is both a studio (designing and implementing games) and a publisher (distributing and selling games), much of the total staff must be allocated to the publishing side. So an insights team of 70 may actually be close to the recommended 10% of the people actually building the new products.</p>
<p>The main innovation I got from Hsuing’s talk lies in the very name of his department: the Insights Team. The wording may seem like superficial propaganda, but in reality it makes a profound point: the goal of research is to increase company profitability by improving products and raising conversion rates. We can achieve these profitability goals only if the UX teams deliver <strong>actionable insights and drive the company’s development activities</strong> at both the tactical level (better design) and the strategic level (discovering customer needs and building products to meet and exceed these needs). Most companies’ UX maturity is not even at the tactical level yet, but to reach the strategic level, we do have to don that “insight team” hat.</p>
<p>I was pleased to hear that Riot’s Insights Team encompasses the company’s user research, as well as their market research and analytics. Analytics and UX should be joined at the hip, but are too often separated in different departments. And market research is usually kept even further from UX. This despite the many benefits of integrated customer insights that triangulate findings from multiple methods.</p>
<p>Another presentation that elicited some déjà-vu moments came from Laura Hammond from UEgroup, who talked about testing gesture-based games. She recommended avoiding swivel chairs when testing young children, because kids get too easily distracted by moving around on the chair. True, but it’s also an observation we made in 2001 in the first edition of our report on usability testing of children using websites. The kids who were 6 years old in 2001 are now 21 and thus qualified to participate in our current user research with young adults/Millennials. It’s nice to know that the next generation of children is the same, at least when it comes to swivel chairs in the usability lab.</p>
<p>To record the test sessions and get the gestures on video, the researchers recommended using 3 video cameras: from above, from the side, and facing the user. Exactly what we did 20 years ago in the hardware human-factors lab to record system administrators installing hard drives in servers. Testing 3D user interfaces requires more equipment than studying 2D websites.</p>
<p>Of course, Hammond’s talk also had new observations, specific to games for the Intel RealSense camera (which requires users to control the game by moving their hands in front of the camera). For example, the researchers needed to include the users’ hand size as one of the screening criteria when recruiting test participants. We certainly don’t ask about hand size in our screeners, and apparently, it’s a challenge to get it right.</p>
<p>Another insight from Hammond’s talk was that testing 3D gestures introduces yet another opportunity for the study facilitator to bias the user: the very way you sit or move may prime the user to copy aspects of your body language in their gestures.</p>
<h2>Multiuser Testing</h2>
<p>Unlike mainstream user testing, game research often involves testing with many users in parallel — either because the game takes a long time to play or because it involves multiple players.</p>
<p>(On occasion we do test with multiple users at the same time even in traditional UX projects — for example when running usability studies with young children, but mostly we run one user at a time, because we want to pay close attention to every detail of the user’s behavior. Also, a website visit usually only lasts 2–3 minutes, with a typical page view lasting maybe 30 seconds, so we <em>should</em> aim to study everything in detail.)</p>
<p>Hardcore gamers will often play for hours at a stretch, with much of their time spent repeatedly shooting at something. As a result, playtesting labs around the world seem to be uniformly designed to accommodate 12–20 game testers (or more, for big companies) who play the same game, each at their own console.</p>
<p>Sebastian Long from Player Research in the UK described his company’s playtesting lab: The observation room included a big projection display with reduced versions of 12 users’ screens, as well as a pushbutton switch for observers to select one of the 12 screens to be magnified on a separate monitor for high-resolution observation when one of the testers did something interesting in the game. This need to alternate between surveying many peoples’ broad behavior and detailed attention to a single person’s specific interactions is rare outside games research.</p>
<p>The multiplayer component of many modern games is the second reason for multiuser sessions in game research. Whether several players play together in the same room or across the network in real time, researchers must understand their processes of communication and collaboration. In contrast, in mainstream UX, even when taking into account social media and omnichannel experiences, people rarely work together at the same time with the same interface to solve the same task.</p>
<p>Games researchers often have access to <strong>data at true scale</strong>: in the case of the R6S kill/death ratio I mentioned above, Olivier Guedon from Ubisoft measured the ratio across 440,000 games during alpha testing and 182M games in two beta-testing rounds. In the alpha, the defending team won 61% of the time, resulting in tweaks that made it easier to attack. As a result, the attackers won 58% of the time in the first beta test. Further redesigns finally made the game balanced in the second beta. A great example of iterative design and the common observation that fixing one UX problem (too easy to defend) sometimes introduces a new problem (too easy to attack), which is why I recommend as many rounds of iteration as possible.</p>
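<p>The balancing loop just described boils down to a win-rate check over many rounds. Below is a hypothetical Python sketch of that check; the function, field names, and the 55% imbalance threshold are our own illustration, not Ubisoft’s actual telemetry pipeline:</p>

```python
def win_rate_report(rounds, threshold=0.55):
    """Summarize side balance from a list of round outcomes, where each
    entry is the winning side: 'attack' or 'defend'."""
    total = len(rounds)
    attack_rate = sum(1 for r in rounds if r == "attack") / total
    defend_rate = 1.0 - attack_rate
    if defend_rate > threshold:
        verdict = "favors defenders"   # e.g. the 61% defender alpha
    elif attack_rate > threshold:
        verdict = "favors attackers"   # e.g. the 58% attacker first beta
    else:
        verdict = "balanced"           # the goal reached in the second beta
    return {"attack": attack_rate, "defend": defend_rate, "verdict": verdict}
```

<p>With hundreds of thousands of rounds in the sample, even a few percentage points of skew are statistically meaningful, which is what justified each redesign iteration.</p>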
<h2>Professional Users</h2>
<p>In the gaming domain, some companies have to accommodate two classes of users: normal users (who buy the game and play for fun) and professional users who are paid to play the game as an “eSport.” eSports are a big business with huge audiences watching the championship games. (In 2014, Amazon.com paid almost a billion dollars for one eSports site.)</p>
<p>Of course, one of the oldest lessons in traditional user experience is that we need to design for both novice and expert users. The two groups have different skill levels and need different features. (More on this in our UX Basic Training course — it’s that fundamental a concept.) But professional users take this distinction to an entirely different level and require separate research of what happens when operating a user interface becomes its own goal and the focus of somebody&#8217;s career.</p>
<h2>Designing Fun</h2>
<p>User satisfaction has always been one of the 5 main usability criteria: people will most definitely leave a website that’s too unpleasant. Even in enterprise software, you want users to like your design to reduce employee turnover. That said, mainstream UX research spends much time on other criteria, such as learnability and efficiency, because users are so goal oriented: they go to a website to get something done (say, buy something or read the news), not to have fun with the user interface.</p>
<p>In strong contrast, a game has no purpose other than fun. The stated goal may be to kill the boss. (No, not your manager, but a nasty gremlin or alien invader — these game enemies are referred to as bosses.) But the real goal is to have fun while doing so. That’s why it’s important to study the kill/death ratio: if designers made the game interface too good at killing bosses, that would be <em>efficient</em>, but not <em>fun</em>. (Good traditional UX; bad GUX.) Gamers need just the right level of challenge, because it’s also no fun if you die immediately and don’t get to off some bosses.</p>
<p>In an attempt to pinpoint exactly when users are excited or bored, some GUR researchers employ esoteric biometrics sensors. For example, they measure skin-conductance levels (sweat activity), which is related to physiological arousal. Pierre Chalfoun gave a good overview of biometrics at Ubisoft, and he emphasized that these physiological sensors are not always directly connected to user emotions, which is what we really want to design for. (The goal is engaged users, not sweaty users, even if there is a correlation.)</p>
<p>Chalfoun presented an interesting study of game tutorials, which showed that users’ levels of frustration, as indirectly measured by biometrics, mounted every time they failed to understand a game tutorial. First failure: somewhat frustrated. Third failure in a row: very frustrated. While this finding makes intuitive sense and may not be worth the cost of a biometrics lab, Chalfoun stressed that good visualizations of such data convince management and developers to take research seriously and invest in fixing the bad designs that caused such growing user frustration. (Without quantifiable data, it’s easier to dismiss user frustration as a minor matter that can’t hold up the release schedule.)</p>
<h2>More Tech</h2>
<p>Across the conference presentations, it was striking how many GUX teams make use of custom-written software. Anything from running the playtest lab to game telemetry (“calling home” with data about live play in a beta test) requires the company to allocate software developers to build special features just for the researchers.</p>
<p>I think there are two reasons why GUX teams seem to be more tech-heavy than mainstream UX teams:</p>
<ul>
<li>The game researchers are embedded in highly geeky companies with legions of programmers, and it’s their company culture that if you need something, you go build it.</li>
<li>The many game genres diverge widely in their needs and thus require custom software to study seriously. In contrast, all websites are built on top of the browser and involve the same types of interactions. This means that it’s actually possible for third-party solutions to offer, say, cloud-based analytics tools that collect most of the data needed to study a website, eliminating the need for custom software.</li>
</ul>
<h2>Age and Gender Differences</h2>
<p>The best (but <em>very</em> data-dense) presentation at GamesUR was by Nick Yee from Quantic Foundry. Yee has collected data from 220,000 gamers who completed a <a href="https://apps.quanticfoundry.com/lab/10">survey about what motivates them</a> to play computer games. <a href="http://quanticfoundry.com/2015/12/21/map-of-gaming-motivations/">Motivations clustered into 6 groups</a>: action, social, mastery, achievement, immersion, and creativity. Obviously, different games speak to different motivations: a deathmatch game will attract gamers motivated by action and social play, whereas a simulation game would be preferred by people interested in immersion and creativity.</p>
<p>One of the main components of the social cluster is <strong>competition</strong>. In this cluster gamers care about beating other players and being acknowledged as a high-ranking player (even if they don’t take it to the eSports extreme). The following chart shows the average score on the competition metric for men and women at different ages:</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter wp-image-9289 size-full" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2016/03/games-competitiveness-by-age-and-gender.png?resize=980%2C495&#038;ssl=1" alt="Strength of competition as motivation for gamers across age and gender" width="980" height="495" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2016/03/games-competitiveness-by-age-and-gender.png?w=1200&amp;ssl=1 1200w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2016/03/games-competitiveness-by-age-and-gender.png?resize=300%2C152&amp;ssl=1 300w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2016/03/games-competitiveness-by-age-and-gender.png?resize=768%2C388&amp;ssl=1 768w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2016/03/games-competitiveness-by-age-and-gender.png?resize=1024%2C517&amp;ssl=1 1024w" sizes="auto, (max-width: 980px) 100vw, 980px" /><br />
<em>Average gamer scores, expressed as standard deviations from the overall mean across all ages and genders. High scores indicate people who are more motivated by competition. Source: <a href="http://quanticfoundry.com/2016/02/10/gamer-generation/">Quantic Foundry</a>, reprinted by permission.</em></p>
<p>Two observations from the chart:</p>
<ul>
<li>Men are more competitive than women. (Or, more precisely, men like competitive games more than women do.) Maybe not a big surprise.</li>
<li>Competitiveness decreases drastically with age. In fact, the difference between young and old gamers is more than twice the difference between men and women, and by age 50 there’s no real difference between men and women anymore. (Older women might even be more competitive than older men, but there’s too little data in this research to say for sure.)</li>
</ul>
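<p>Expressing each group's average as standard deviations from the overall mean, as the chart does, is plain z-scoring. A minimal sketch with made-up numbers (the function and data here are hypothetical, not Quantic Foundry's):</p>

```python
from statistics import mean, pstdev

def z_scores_by_group(scores, groups):
    """Each group's mean score, expressed in standard deviations
    from the overall mean across all respondents."""
    mu, sigma = mean(scores), pstdev(scores)
    return {
        g: (mean(s for s, grp in zip(scores, groups) if grp == g) - mu) / sigma
        for g in sorted(set(groups))
    }

# Hypothetical competition-motivation scores for two age groups
scores = [7, 8, 6, 4, 3, 5]
groups = ["young", "young", "young", "older", "older", "older"]
print(z_scores_by_group(scores, groups))  # young ≈ +0.88 SD, older ≈ -0.88 SD
```

<p>Normalizing this way makes groups comparable even if the raw survey scale is arbitrary, which is why effect sizes on such charts are read in standard-deviation units rather than raw points.</p>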
<p>We sometimes find differences between young and old users in mainstream UX research, but our effect sizes are usually much more modest than those in the Gamer Motivation Study: as users age, task performance using websites declines by 0.8% per year. And it’s almost unheard of to see any reportable differences between male and female users. Say you want to study menu design: the difference between how men and women use any given menu is so negligible that it has zero practical meaning compared to the difference between a design that complies with menu UX guidelines and a poorly designed menu.</p>
<p>In conclusion, the Games User Research Summit was a great conference with many insightful talks by top professionals. Both they and we probably think that mainstream UX and GUR are more different than they really are, but all of us should periodically reflect on the notable similarities between the two fields to make sure that we don’t unduly limit our methods to those traditionally employed in our UX niche. For sure, as persuasive web design becomes increasingly important, mainstream user researchers will need to adopt (and adapt) methods from games user research.</p>
<p>Written by: <a href="https://www.nngroup.com/articles/game-user-research/" target="_blank">Jakob Nielsen, Nielsen-Norman Group</a><br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2016/03/games-user-research-whats-different/">Games User Research: What’s Different?</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2016/03/games-user-research-whats-different/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">9288</post-id>	</item>
		<item>
		<title>Dragon Front Is a Hearthstone-Like Card Game Built for Virtual Reality</title>
		<link>https://www.situatedresearch.com/2016/03/dragon-front-is-a-hearthstone-like-card-game-built-for-virtual-reality/</link>
					<comments>https://www.situatedresearch.com/2016/03/dragon-front-is-a-hearthstone-like-card-game-built-for-virtual-reality/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Tue, 01 Mar 2016 17:16:36 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Aesthetics]]></category>
		<category><![CDATA[Communication]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=9283</guid>

					<description><![CDATA[<p>A game that rethinks the first-person VR approach Virtual reality has traditionally been about transporting you to new worlds and making you believe you’re really there. It’s the immersion element, known as “presence” in industry lingo, that makes VR feel like magic. So it was refreshing to see Dragon Front, a new VR game in&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2016/03/dragon-front-is-a-hearthstone-like-card-game-built-for-virtual-reality/">Dragon Front Is a Hearthstone-Like Card Game Built for Virtual Reality</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>A game that rethinks the first-person VR approach</strong></p>
<p>Virtual reality has traditionally been about transporting you to new worlds and making you believe you’re really there. It’s the immersion element, known as “presence” in industry lingo, that makes VR feel like magic. So it was refreshing to see <em>Dragon Front</em>, a new VR game in development for the Oculus Rift, take the approach in an entirely different direction. <span id="more-9283"></span></p>
<p>The game is a riff on popular collectible card games like Blizzard’s Hearthstone where you battle another opponent in strategic combat using playing cards you draw from a virtual deck. What sets <em>Dragon Front</em> apart is its battlefield. Because it was designed from the ground up for VR, the game wraps you in a 360-degree environment where every card you play turns into a real battle animation or spawns a virtual fighter to defend your fortress. The more dedicated CCG players may be reminded of the holographic Duel Disk tech from the <em>Yu-Gi-Oh</em> series that brought monsters to life in a virtual arena.</p>
<p><em>Dragon Front</em>‘s approach is unique for VR because developer High Voltage is not trying to trick your brain into thinking what it’s seeing is real. Instead of making you feel like you’re right there on the battlefield, <em>Dragon Front</em> is about trying to replicate the experience of being in a room with a friend over a real-life table-top game. VR in this case is being used to replicate something familiar instead of conjuring up an entirely new sensation. So think of wearing the Oculus Rift as a way to bring to life the kind of technology you’ve seen in, say, the holographic chess game aboard the Millennium Falcon in <em>Star Wars</em>. We may not be able to bring imaginary creatures and fantasy battlefields to life on our kitchen table quite yet, but we can certainly come close to that feeling by using VR today.</p>
<p><em>Dragon Front</em> doesn’t have a release date set, but the game is already fully fleshed out from its beginnings as a physical card game developed on paper. Using the <a href="http://www.theverge.com/2016/1/6/10723568/oculus-rift-remote-announce-ces-2016" target="_blank">new Oculus Remote</a>, a streamlined handheld input device <a href="http://www.theverge.com/2016/1/6/10722212/oculus-rift-price-shipping-date-ces-2016" target="_blank">shipping with the Rift headset in March</a>, you’re able to look around the battlefield and select objects with the press of a button. Because there’s really just a small handful of inputs, you don’t need a full Xbox One controller to make moves.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-9286" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2016/03/dragon-front-screenshot-1.jpg?resize=800%2C441&#038;ssl=1" alt="dragon-front-screenshot-1" width="800" height="441" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2016/03/dragon-front-screenshot-1.jpg?w=800&amp;ssl=1 800w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2016/03/dragon-front-screenshot-1.jpg?resize=300%2C165&amp;ssl=1 300w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2016/03/dragon-front-screenshot-1.jpg?resize=768%2C423&amp;ssl=1 768w" sizes="auto, (max-width: 800px) 100vw, 800px" /></p>
<p>After just 10 minutes or so of tutorial playing, I was able to grasp the game’s lengthy turn-based combat and try my hand at a real one-on-one fight with another human being in VR. For a card game, <em>Dragon Front</em> was an exhilarating experience that mixes the tense moments of high-level strategy play with the full-body escapism of VR. Yet after a few turns going back and forth, you start to completely forget that you’re even playing a game with a headset on. The competition starts to feel as natural as a physical table-top experience, while the Rift just becomes an interface for your virtual showdown.</p>
<p><em>Dragon Front</em> has a couple of fun quirks to amplify that sensation. For one, your opponent’s face shows up as an omnipresent floating mask above their fortress, and it will mirror the direction of their gaze and facial expressions in real time, so you can feel as if you’re sitting across the table from the person. <em>Dragon Front</em> also relies on in-game voice chat so you can talk to your opponent as the game progresses.</p>
<p>So <em>Dragon Front</em> may not be the most immersive VR title out there or one you could show your parents to convince them of the technology’s potential. But it’s certainly a unique rethinking of the VR approach, one that will most certainly catch on as headsets like the Rift start becoming a more common way to play a wide variety of games and not just first-person experiences.</p>
<p>Written by: <a href="http://www.theverge.com/users/nickstatt" target="_blank">Nick Statt</a>, <a href="http://www.theverge.com/2016/2/26/11119676/dragon-front-card-game-oculus-rift-virtual-reality" target="_blank">the Verge</a> (via <a href="http://ispr.info/2016/03/01/dragon-front-game-rethinks-the-first-person-vr-approach/" target="_blank">Presence</a>; images from <a href="http://www.high-voltage.com/" target="_blank">High Voltage</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2016/03/dragon-front-is-a-hearthstone-like-card-game-built-for-virtual-reality/">Dragon Front Is a Hearthstone-Like Card Game Built for Virtual Reality</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2016/03/dragon-front-is-a-hearthstone-like-card-game-built-for-virtual-reality/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">9283</post-id>	</item>
		<item>
		<title>The Climb: The Most Head-Spinning Virtual Reality Experience Yet</title>
		<link>https://www.situatedresearch.com/2015/12/the-climb-the-most-head-spinning-virtual-reality-experience-yet/</link>
					<comments>https://www.situatedresearch.com/2015/12/the-climb-the-most-head-spinning-virtual-reality-experience-yet/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Wed, 30 Dec 2015 17:34:42 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=9230</guid>

					<description><![CDATA[<p>Crytek’s new project for the Oculus Rift shows us exactly where VR gaming is going – towards heady and experiential gameplay Above you, the craggy face of the cliff seems to stretch up endlessly toward the sky, offering perilously few footholds. In the far distance there’s a small village by a beach, bathed in orange&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2015/12/the-climb-the-most-head-spinning-virtual-reality-experience-yet/">The Climb: The Most Head-Spinning Virtual Reality Experience Yet</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>Crytek’s new project for the Oculus Rift shows us exactly where VR gaming is going – towards heady and experiential gameplay</strong></p>
<p>Above you, the craggy face of the cliff seems to stretch up endlessly toward the sky, offering perilously few footholds. In the far distance there’s a small village by a beach, bathed in orange sunshine – an exotic idyll. But below you there is &#8230; nothing. Nothing but a long deadly drop into the crashing sea far below. Your only option is to keep climbing. <span id="more-9230"></span></p>
<p>Crytek has always been interested in pushing graphics technology. In the mid-2000s, the Frankfurt-based developer and publisher achieved wide acclaim for its visually spectacular first-person shooters Far Cry and Crysis; although several years old, both are still widely used as a benchmark for near photo-realism in games, especially in terms of environmental detail. With its steamy tropical rain forests, Far Cry presented a lush counterpoint to the genre’s obsession with steel grey interiors.</p>
<p>But the company’s latest project is perhaps its most ambitious attempt to bring immersive naturalism to game worlds. The Climb is a virtual reality climbing simulator, which gives the player the chance to attempt a series of tricky ascents on rock faces from around the world. “We started out by working on the mechanics of virtual reality,” says executive producer Elijah Freeman, who started as an artist at Crytek 15 years ago. “When we were prototyping, climbing just stood out for us – it was almost instantaneously fun.”</p>
<p>Exclusive to the forthcoming Oculus Rift headset, The Climb uses the technology’s motion controllers – or an Xbox One pad – to give the player control over their hands, which are the only body part displayed on screen. While the shoulder buttons can be used to grip with either your left or right hand, you need to use the motion sensors in the headset to physically look at the next finger hold that you want to grab for – this then forces the onscreen limbs to move in that direction. It takes a few minutes to get used to. At first your arms flail uselessly short of the required hold, and when you do get the direction right, you may mistime the gripping mechanism, sending your climber into the abyss.</p>
<p>But as I found during my demo session, based on Vietnam’s Halong Bay, the interface gradually becomes intuitive. You learn to scan the rockface for available holds, you learn to plan ahead, using the correct arm to lurch up with, based on its position relative to your body and the location of the next hold. Eventually you build up a rhythm where you start to scamper up the cliff like Spider-Man on his summer hols.</p>
<p>Although there are arrows on each climbing surface to point out a general direction, Freeman says there will be multiple routes available on tougher climbs, allowing players to learn and finesse their favourites – an asynchronous multiplayer mode lets you compare route times with friends. Adding an extra layer of authenticity, you also have to watch a grip meter in the corner of the screen: if it gets too low, you need to chalk up your hands, or your fingers will start slipping from the ledges. It’s a small addition, but it just keeps that sense of physicality – that sense that you’re actually out there dangling from a 200ft cliff.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter wp-image-9232 size-full" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/12/1805.jpg?resize=980%2C586&#038;ssl=1" alt="1805" width="980" height="586" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/12/1805.jpg?w=1225&amp;ssl=1 1225w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/12/1805.jpg?resize=300%2C180&amp;ssl=1 300w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/12/1805.jpg?resize=768%2C460&amp;ssl=1 768w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/12/1805.jpg?resize=1024%2C613&amp;ssl=1 1024w" sizes="auto, (max-width: 980px) 100vw, 980px" /><em>Fancy the sensation of dangling from a 200ft cliff? Then The Climb is for you. Photograph: Crytek</em></p>
<p>And the game does like to mess with your natural fear of heights. There are sections where you’re required to climb downwards, edging over the lip of a rocky outcrop so that all you can see below you is that beckoning expanse of water. The action is so measured and precise it’s unlikely anyone is going to suffer motion sickness, but vertigo is quite another thing. Basically, if you have to look away during all of Tom Cruise’s free climbing sequences in the Mission Impossible films, this is going to be an interesting challenge.</p>
<p>Then there’s the jumping. Some holds are just too far away to just reach for, so you need to hit a button and launch yourself across. I found this the most challenging and frustrating part of the demo – simply because the timing required to hit the grip button and hang on to the target ledge requires a level of precision which is tough to achieve when you’re combining inexact motion controls with a joypad input. When I did make it, it seemed more by luck than judgement. Fortunately, there are various save points on each climb – disguised as pitons obviously – so unlike in the real sport, you get to have another go if you plunge to a watery grave.</p>
<p>The experience is a fascinating glimpse at the strengths of VR as a medium. While beautiful environmental details look impressive on a 2D display, they feel truly fascinating and immersive in VR – each time you reach a new cliff top and get to survey the scenery, it feels like a genuine reward, rather than a mere pretty backdrop. “Visual fidelity is an important part of extrapolating on presence,” says Freeman. “You need to feel like you’re there. Crytek has spent a lot of time on getting our CryEngine to render at these fidelities so it was a natural fit for us.”</p>
<p>Because of this extended sense of “being there” the actual physical input doesn’t have to be so frenzied and demanding. If this were a traditional simulation on a 2D display, it’s likely every button on the controller would be employed; there would be balance, wind and directional gauges. Here, because the sensory input is more complex, the developers don’t need to add such complexity to the interface.</p>
<p>Indeed, the minimal controls seem to work perfectly well – the sense of actually being on the rock face augments the sense of achievement and challenge. We’ve heard a lot from VR developers over the past two years that early titles using the Oculus, HTC Vive or PlayStation VR headsets are likely to be experiential rather than narrative in focus – they’ll be about inhabiting and exploring a defined space, rather than following some sort of epic story. The Climb confirms this. It may not function as a “realistic” climbing simulation (you can’t see or use your feet for example), but it is certainly about being somewhere and experiencing the traversal of a precarious space.</p>
<p>“The amount of fun is balanced very finely with the sense of risk and excitement,” says Freeman. “I like the idea of presenting experiences like this to the player. This is just the initial pass to get people introduced to VR, but you can definitely see where this medium is going.”</p>
<p>Written by: <a href="http://www.theguardian.com/technology/2015/dec/24/the-climb-virtual-reality-oculus-rift-crytex" target="_blank">Keith Stuart, the Guardian</a> (via <a href="http://ispr.info/2015/12/28/the-climb-head-spinning-experiential-gameplay-provides-insights-about-presence/" target="_blank">Presence</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2015/12/the-climb-the-most-head-spinning-virtual-reality-experience-yet/">The Climb: The Most Head-Spinning Virtual Reality Experience Yet</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2015/12/the-climb-the-most-head-spinning-virtual-reality-experience-yet/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">9230</post-id>	</item>
		<item>
		<title>From Privacy to Productivity: A Look at How Virtual Reality Could Change the Way We Work</title>
		<link>https://www.situatedresearch.com/2015/07/from-privacy-to-productivity-a-look-at-how-virtual-reality-could-change-the-way-we-work/</link>
					<comments>https://www.situatedresearch.com/2015/07/from-privacy-to-productivity-a-look-at-how-virtual-reality-could-change-the-way-we-work/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Sat, 25 Jul 2015 18:06:37 +0000</pubDate>
				<category><![CDATA[Collaboration]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Communication]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Learning]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=8942</guid>

					<description><![CDATA[<p>Businesses someday getting on board with virtual reality will need to do some self-examination. Various VR tools are aimed at reclaiming productivity and improving interactions.  The fabled “promise” of virtual reality is expansive. At its loftiest, we’ve been promised not only changes to how we live and how we consume entertainment, but also to how&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2015/07/from-privacy-to-productivity-a-look-at-how-virtual-reality-could-change-the-way-we-work/">From Privacy to Productivity: A Look at How Virtual Reality Could Change the Way We Work</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>Businesses someday getting on board with virtual reality will need to do some self-examination. Various VR tools are aimed at reclaiming productivity and improving interactions. </strong></p>
<p>The fabled “promise” of virtual reality is expansive. At its loftiest, we’ve been promised not only changes to how we live and how we consume entertainment, but also to how we work. <span id="more-8942"></span></p>
<p>After all, tech loves a good workplace trend.</p>
<p>In a general sense, incorporating virtual reality into business could mean things like escape from the physical confines of a desk, or the limit of how many monitors you could stick on that desk, or the general lack of aesthetics associated with cubicles, let’s say.</p>
<p>At the moment, there seem to be two ends of the spectrum developing — VR to help you get work done with other people, and VR to help you get away from, perhaps, those same people later on in the day.</p>
<p>One instance of the latter example is Icelandic company <a href="http://www.murevr.com/#the-team-1-section" target="_blank">Breakroom</a>. They’re still in early days, but the idea behind Breakroom stems from the proliferation of open-concept offices — the kind popularized by tech companies as markers of innovation and avant-garde thinking, and the same that the Harvard Business Review, among others, have said are now negatively impacting <a href="https://hbr.org/2014/10/the-transparency-trap&amp;cm_sp=Article-_-Links-_-Top%20of%20Page%20Recirculation" target="_blank">privacy</a>,<a href="http://www.newyorker.com/business/currency/the-open-office-trap" target="_blank">productivity</a>, and <a href="http://www.fastcompany.com/3019758/dialed/offices-for-all-why-open-office-layouts-are-bad-for-employees-bosses-and-productivity" target="_blank">workplace satisfaction</a>.</p>
<p>One of Breakroom’s founders, Diðrik Steinsson, drew inspiration from having to work in an open office space himself. The idea behind Breakroom is that a worker in such an office might have a head-mounted display like the Oculus Rift at their desk, and when it’s time to really focus on something for a few hours, they can put it on and go into a virtual environment with multiple manipulable browser windows and integration with Google Apps and Office 365, and get some work done, all while sitting somewhere scenic like a grassy field, or the moon. (Some co-workers will push you there.)</p>
<p>“I see it as a fortress of solitude for people,” Steinsson said. And he’s betting workers will be wearing some type of HMD eventually, even if it’s not within the next 10 years.</p>
<p>The flip side of this, to a degree, is a virtual reality application like AltspaceVR. The social VR app lets users enter its virtual world as a robot avatar to socialize. It’s not necessarily aimed at businesses or the enterprise, but CEO Eric Romo said they’ve been using it for functions like business meetings and even job candidate interviews.</p>
<p>Romo emphasizes the value of nonverbal communication. A conference call, for example, can be awkward. People talk over each other, and it’s difficult to get a read on the other people present when all nonverbal cues like facial expressions and body language are absent. Romo said the experience of meeting and interacting with others is more effective when things like head movements are getting translated into VR.</p>
<p>Altspace has features like private and multi-user web browsers — so, multiple people could, for example, look at code together. The use cases from consumer to enterprise slide back and forth a little like this: Romo said that if you can show off vacation pictures, there’s no reason they couldn’t just as easily be slide decks.</p>
<p>Somewhere in between those two examples, there’s something like the <a href="http://www.fastcodesign.com/3028433/virtual-reality-goes-to-work" target="_blank">demo</a> that Oliver Kreylos of UC Davis’ Institute for Data Analysis and Visualisation put together in 2014. It’s 3D-captured data of an office that includes 2D desktop apps.</p>
<p>But to eventually get these or other virtual reality tools into the business world, there are still some hurdles to jump, like nailing down inputs, or even just supplying every worker with not only an HMD but also a Kinect sensor and a Leap Motion sensor to translate more movement into VR. It also raises a bigger question: what does all this really solve?</p>
<p>“When you want to introduce a technology like VR into some sort of business process, it’s really got to have some sort of overall benefit,” said Gartner analyst Brian Blau. “Some of these behavior replacement cycles — one of the things that you’ll find is that often times they’re more incremental than they are revolutionary.”</p>
<p>Introducing something like VR into a business environment would be revolutionary in the sense that it would be a change of device, software, and user interface, all at once.</p>
<p>What he asks is: what are the steps? What actions are being changed? Being able to answer those questions could be a determining factor in whether virtual reality ever takes hold in the enterprise.</p>
<p>He said more general uses are harder to make an argument for. Take a meeting, for example — over the years, tech surrounding the ways in which people meet has ranged from phone calls, to conference calls, to video calls, to video calls on mobile devices — so what’s the big value add of virtual reality?</p>
<p>Romo submits the nonverbal cues, along with the basic malleability of a virtual reality environment: the ability to turn a space into whatever it is a user might need.</p>
<p>Still, Blau sees more potential in purpose-built VR tools. Think data visualization, training, prototyping and design.</p>
<p>Another consideration is what happens after introducing something like an HMD into an office worker’s everyday use.</p>
<p>Computer Vision Syndrome is already rampant. Though, Dr. Dominick Maino, a professor at <a href="http://ico.edu/" target="_blank">Illinois College of Optometry/Illinois Eye Institute</a> who specializes in pediatrics and binocular vision, and has done research on vision and 3D graphics, said that if anything, introducing VR into workplaces would probably end up surfacing a lot of vision problems relating to faulty binocular vision. Those will be the kinds of problems that need to get fixed before actually being able to use a VR tool.</p>
<p>Still, this is all probably a ways off. Breakroom is about to start testing its product. Altspace is focusing mostly on consumer use, but crafting a product that could be used otherwise in business.</p>
<p>Now, if only VR could offer a fix for the big business problems — like the “reply all” email thread.</p>
<p><em>[This article from <a href="http://www.techrepublic.com/article/from-privacy-to-productivity-a-look-at-how-virtual-reality-could-change-the-way-we-work/" target="_blank">TechRepublic</a> focuses on the uses of presence technology to both separate and connect people in the workplace; I think the Breakroom VR application by <a href="http://www.murevr.com/" target="_blank">MureVR</a> is particularly interesting; you can watch a 6:13 video about it on <a href="https://www.youtube.com/watch?v=KvJgJAppbxQ" target="_blank">YouTube</a>.]</em></p>
<p>Written by: <a href="http://www.techrepublic.com/search/?a=erin+carson" target="_blank">Erin Carson</a>, <a href="http://www.techrepublic.com/article/from-privacy-to-productivity-a-look-at-how-virtual-reality-could-change-the-way-we-work/" target="_blank">TechRepublic</a> (via <a href="http://ispr.info/2015/07/15/tools-to-separate-and-connect-us-how-vr-could-change-the-way-we-work/">Presence</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2015/07/from-privacy-to-productivity-a-look-at-how-virtual-reality-could-change-the-way-we-work/">From Privacy to Productivity: A Look at How Virtual Reality Could Change the Way We Work</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2015/07/from-privacy-to-productivity-a-look-at-how-virtual-reality-could-change-the-way-we-work/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">8942</post-id>	</item>
		<item>
		<title>Hands-on with Mattel’s new AR, VR View-Master</title>
		<link>https://www.situatedresearch.com/2015/02/hands-on-with-mattels-new-ar-vr-view-master/</link>
					<comments>https://www.situatedresearch.com/2015/02/hands-on-with-mattels-new-ar-vr-view-master/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Fri, 20 Feb 2015 15:54:37 +0000</pubDate>
				<category><![CDATA[Education]]></category>
		<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Aesthetics]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Games for Learning]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Mobile Devices]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=8819</guid>

					<description><![CDATA[<p>A View-Master for virtual reality: Hands-on with Mattel&#8217;s new AR, VR phone toy Mattel is relaunching View-Master, but as a virtual reality and augmented-reality phone toy. And I got to play around with it for a bit…or at least, some of the tech behind it.  Announced at an event in New York City, the new&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2015/02/hands-on-with-mattels-new-ar-vr-view-master/">Hands-on with Mattel’s new AR, VR View-Master</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>A View-Master for virtual reality: Hands-on with Mattel&#8217;s new AR, VR phone toy</strong></p>
<p><span style="line-height: 1.5;">Mattel is relaunching View-Master, but as a virtual reality and augmented-reality phone toy. And I got to play around with it for a bit…or at least, some of the tech behind it. </span><span id="more-8819"></span></p>
<p>Announced at an event in New York City, <a href="http://www.cnet.com/news/google-mattel-announce-a-virtual-reality-view-master/" target="_blank">the new View-Master</a> is a collaboration between Mattel and Google, whose virtual reality Cardboard app has enabled cheap do-it-yourself accessories to turn any Android phone into a mini-VR viewer. Mattel’s plastic toy, which will debut in October, is like a more durable, plastic version of <a href="http://www.cnet.com/news/googles-cardboard-vr-headset-is-no-joke-its-great-for-the-oculus-rift/" target="_blank">Google Cardboard</a>, designed entirely for kids…or, maybe, also for grown-up kids like me. And the most brilliant part is it’ll only cost $30.</p>
<p><iframe loading="lazy" src="http://www.cnet.com/videos/share/id/tUlXVC5TlPLbcmd7Lo7cfkU6k0P1Edow/" width="960" height="540" frameborder="0" seamless="seamless" scrolling="no" allowfullscreen="allowfullscreen"></iframe></p>
<p>I used View-Master back when I was little — who didn’t? It’s a classic 3D stereoscopic picture viewer. Many people had even said Google Cardboard looked a bit like a View-Master. So it isn’t a huge surprise that Mattel has suddenly announced a new View-Master with Google Cardboard VR capabilities added. I’ve always felt that virtual reality reminded me of early stereoscopic toys. And Mattel has keyed onto the same idea.</p>
<figure id="attachment_8821" aria-describedby="caption-attachment-8821" style="width: 770px" class="wp-caption aligncenter"><img data-recalc-dims="1" loading="lazy" decoding="async" class="size-full wp-image-8821" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster1.jpg?resize=770%2C577&#038;ssl=1" alt="The View-Master will fit most phones, according to Mattel: iPhone and Android alike." width="770" height="577" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster1.jpg?w=770&amp;ssl=1 770w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster1.jpg?resize=300%2C225&amp;ssl=1 300w" sizes="auto, (max-width: 770px) 100vw, 770px" /><figcaption id="caption-attachment-8821" class="wp-caption-text">The View-Master will fit most phones, according to Mattel: iPhone and Android alike.</figcaption></figure>
<p>The toy was only viewable in a mock-up prototype form at Mattel’s event, but the design’s pretty cool: it looks half old-school View-Master, half Oculus Rift. The inner plastic housing extends to hold many types of phones: Mattel says it’s designed to fit the largest existing phones, and will even work with the <a href="http://www.cnet.com/products/apple-iphone-6-plus/" target="_blank">iPhone 6 Plus</a> and <a href="http://www.cnet.com/products/google-nexus-6/" target="_blank">Nexus 6</a>. A capacitive-touch side lever is used to “click” through scenes or into virtual environments, like the magnetized side switch on Google’s Cardboard viewers.</p>
<p>Mattel’s headset is designed with Google and Android in mind, but at launch is intended to work on “nearly all platforms,” which includes iOS. That would mean a dedicated Mattel app which interfaces with the View-Master, but Google’s Cardboard and Cardboard-ready apps — many of which already exist on iOS, like VRSE — will work too.</p>
<p>Mattel is planning to use View-Master not just for VR, but also for AR; little plastic reels that look like the old cardboard ones are really just flat coasters this time around, now with images on top which the View-Master reads and turns into pop-up augmented-reality models on your table, desktop or wherever else you place it. Multiple View-Masters could use one reel to access content if put down on a table, unlike the old pop-in reels. This type of augmented-reality tech has already existed for years in many apps and on some children’s toys like the Nintendo 3DS (with its AR cards) and PlayStation Vita, but mixing it into a VR headset is a novel idea.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-8822" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster3.jpg?resize=770%2C577&#038;ssl=1" alt="viewmaster3" width="770" height="577" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster3.jpg?w=770&amp;ssl=1 770w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster3.jpg?resize=300%2C225&amp;ssl=1 300w" sizes="auto, (max-width: 770px) 100vw, 770px" /></p>
<p>I didn’t get to use the actual Mattel prototype, but we tried View-Master’s augmented-reality tech on phones and Google Cardboard viewers. There were three reels to try: a dinosaur one made a little dinosaur pop up on the disc on the table in front of me. When I aimed a dot and clicked on it, I was suddenly surrounded by a prehistoric 360-degree panorama with 3D dinosaurs. Clicking on them brought up facts, too.</p>
<p>Looking at the space disc with Cardboard on brought up a pop-up moon and Earth; clicking on it took me to a panorama of the moon, with pop-up clickable photos of NASA missions. A third, San Francisco-themed reel had little mini-models of Alcatraz and the Golden Gate Bridge that turned into VR photo panoramas. To exit any of the virtual panoramas, you look down and click on the side…or remove the View-Master from your face. The View-Master comes with one reel in its $30 package, and extra reels will cost around $15 each. No, older View-Master reels don’t work in here, but it sounds like Mattel is exploring re-releasing content from the back catalog of 10,000 older View-Master reels.</p>
<figure id="attachment_8823" aria-describedby="caption-attachment-8823" style="width: 770px" class="wp-caption aligncenter"><img data-recalc-dims="1" loading="lazy" decoding="async" class="size-full wp-image-8823" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster4.jpg?resize=770%2C577&#038;ssl=1" alt="The &quot;reels&quot; don't actually go in the View-Master, they simply sit on your table." width="770" height="577" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster4.jpg?w=770&amp;ssl=1 770w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster4.jpg?resize=300%2C225&amp;ssl=1 300w" sizes="auto, (max-width: 770px) 100vw, 770px" /><figcaption id="caption-attachment-8823" class="wp-caption-text">The &#8220;reels&#8221; don&#8217;t actually go in the View-Master, they simply sit on your table.</figcaption></figure>
<p>There’s no strap to keep the View-Master on: this is a hold-to-your-face toy, much like older View-Masters and Google Cardboard. Mattel has promised that the tech has already been vetted by pediatric ophthalmologists, and is meant for children ages 7 and up, in short bite-sized sessions.</p>
<p>The View-Master may work with other toys, too, like other app-ified toys in the past, but for now it’s really a fancier plastic Google Cardboard viewer, with additional Mattel support. That’s not a bad thing at all: at $30, this is a pretty awesome little stocking-stuffer idea, and a fun phone toy. Just keep in mind that if you give this to your kid, it won’t work without a phone popped into it.</p>
<p>By the time fall rolls around, Mattel may have other toys ready to work with it. Or, there might be many other companies ready to make cheap phone-enabled VR headsets, too.</p>
<p>Written by: <a href="http://www.cnet.com/profiles/scottstein8/" target="_blank">Scott Stein</a>, <a href="http://www.cnet.com/products/new-view-master/" target="_blank">CNET</a> (via <a href="http://ispr.info/2015/02/20/hands-on-with-mattels-new-ar-vr-view-master/" target="_blank">Presence</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2015/02/hands-on-with-mattels-new-ar-vr-view-master/">Hands-on with Mattel’s new AR, VR View-Master</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2015/02/hands-on-with-mattels-new-ar-vr-view-master/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">8819</post-id>	</item>
		<item>
		<title>Welcome to the Age of Holographs</title>
		<link>https://www.situatedresearch.com/2015/01/welcome-age-holographs/</link>
					<comments>https://www.situatedresearch.com/2015/01/welcome-age-holographs/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Thu, 22 Jan 2015 22:18:54 +0000</pubDate>
				<category><![CDATA[Education]]></category>
		<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Aesthetics]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Games for Learning]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Mental Models]]></category>
		<category><![CDATA[Personalization]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[User-Centered Design]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=8792</guid>

					<description><![CDATA[<p>Up close with the HoloLens, Microsoft’s most intriguing product in years We just finished a heavily scripted, carefully managed, and completely amazing demonstration of Microsoft’s HoloLens technology. Four demos, actually, each designed to show off a different use case for a headset that projects holograms into real space. We played Minecraft on a coffee table.&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2015/01/welcome-age-holographs/">Welcome to the Age of Holographs</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>Up close with the HoloLens, Microsoft’s most intriguing product in years</strong></p>
<p>We just finished a heavily scripted, carefully managed, and completely amazing demonstration of Microsoft’s HoloLens technology. Four demos, actually, each designed to show off a different use case for a headset that projects holograms into real space. We played <em>Minecraft</em> on a coffee table. We had somebody chart out how to fix a light switch right on top of the very thing we were fixing. <span id="more-8792"></span></p>
<p>We walked on Mars.</p>
<p>You’ll notice there aren’t photos here, and that’s because before we were even allowed into the labs where the HoloLens team tests out its user experiences, we had to deposit our cameras and phones into a locker. No recording equipment of any kind was allowed, not even audio. We entered the basement below Microsoft’s visitor center laughing at the absurdity of it all — many reporters needed to get notepads from the company and weren’t carrying pens, either.</p>
<p>But it was all worth it, because HoloLens is probably the most intriguing (and, in many ways, most infuriating) technology we’ve experienced since the Oculus Rift. And there are many parallels with the Rift to be had: both are immersive, but in different ways; both require you to strap a weird thing on your head; both leave you grinning like an absolute idiot at a scene only you can see. And, crucially, both need more work when it comes to thinking through exactly how to control and interact with virtual things.</p>
<p><script height="575px" width="1023px" src="https://player.ooyala.com/iframe.js#ec=lsOGp3cjqUFwNW0FqImWpiKsqIdSTEX-&#038;pbid=dcc84e41db014454b08662a766057e2b"></script></p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-8793" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/d76d1d6c-a4e6-41d2-bc43-c9b49041a219.0.png?resize=864%2C392&#038;ssl=1" alt="d76d1d6c-a4e6-41d2-bc43-c9b49041a219.0" width="864" height="392" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/d76d1d6c-a4e6-41d2-bc43-c9b49041a219.0.png?w=864&amp;ssl=1 864w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/d76d1d6c-a4e6-41d2-bc43-c9b49041a219.0.png?resize=300%2C136&amp;ssl=1 300w" sizes="auto, (max-width: 864px) 100vw, 864px" /></p>
<p><strong><em>Minecraft</em> IRL<br />
</strong>by Dieter Bohn</p>
<p>By far, <a href="https://www.theverge.com/2015/1/21/7868363/minecraft-hololens-microsoft-freecell" target="_blank" rel="noopener">the most impressive demo for my money was the <em>Minecraft</em> demo</a> — though Microsoft called it something like “Building Blocks” or some such, presumably so as not to fully commit to releasing a full holograph version of <em>Minecraft</em>. But before we could enter this virtual world — actually, the virtual entered <em>our</em> world — we had to strap on the development unit for the HoloLens.</p>
<p>It’s a contraption, to be sure. There’s a small, heavy block you hang around your neck which contains all the computing power. The headset itself is made up of lenses and tiny projectors and motion sensors and speakers (or <em>something</em> that makes sound, anyway), and god knows what else. And then there’s a screen right there in your field of view.</p>
<p>A “screen in your field of view” is the right way to think about HoloLens, too. It’s immersive, but not nearly as immersive as proper virtual reality is. You still see the real world in between the virtual objects; you can see where the magic holograph world ends and your peripheral vision begins.</p>
<p>But before you can apply your jaded “I’ve done VR before” attitude to this situation, you look down at the coffee table and there’s a <strong>castle sitting right on the damn thing.</strong> It’s not shimmery, but it’s not quite real, either. It’s just sitting there, perfectly flat on the table, reacting in space to your head movements. It’s nearly as lifelike as the actual table, and there’s no lag at all. The castle is there. It’s simply magic.</p>
<p>You definitely have a big stupid grin on your face even though the contraption that’s strapped to it is pressing your eyeglasses into the bridge of your nose in a painful way.</p>
<p>Then it’s demo time. You can’t touch anything, but you can look and point a little circle at objects on it by moving your head around. You learn how a “glance” is just you looking at things and pointing your reticle at them, and an “AirTap” is the equivalent of clicking your mouse. The demo involves digging <em>Minecraft</em> holes and blowing up <em>Minecraft</em> zombies with <em>Minecraft</em> TNT. It’s basically incredible to see these digital things in real space.</p>
<p>You blow up a hole in the table and then you look <em>through</em> it to more digital objects on the floor. You blow up a hole in the wall and tiny bats fly out and you see that behind your very normal wall is a virtual hellscape of lava and rock. You peer into the hole, around the corner, and see that dark realm extend far into space.</p>
<p>And then the demo’s over.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-large wp-image-8794" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/a67d3d33-e1e5-4cf7-bf3d-dbe1befc8d8c.0.jpg?resize=980%2C655&#038;ssl=1" alt="a67d3d33-e1e5-4cf7-bf3d-dbe1befc8d8c.0" width="980" height="655" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/a67d3d33-e1e5-4cf7-bf3d-dbe1befc8d8c.0.jpg?resize=1024%2C684&amp;ssl=1 1024w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/a67d3d33-e1e5-4cf7-bf3d-dbe1befc8d8c.0.jpg?resize=300%2C200&amp;ssl=1 300w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/a67d3d33-e1e5-4cf7-bf3d-dbe1befc8d8c.0.jpg?w=1200&amp;ssl=1 1200w" sizes="auto, (max-width: 980px) 100vw, 980px" /></p>
<p><strong>Skype<br />
</strong>by Tom Warren</p>
<p>Microsoft’s Skype demo was just as impressive to me as playing around with <em>Minecraft</em> blocks in a living room. After a two-hour keynote, Microsoft wanted me to fix a light switch. It all started by sitting down and facing some tools and a socket with exposed wiring. A little dazed and confused, I looked up and scanned across the Skype interface that had suddenly appeared in front of me, and picked a face to call. The video call popped into a little window, and my journey to fix a light switch began.</p>
<p>On the other end of the call was a Microsoft engineer. I could see and hear her, but she could only hear me and see exactly what I was seeing in front of me. My eyes, or rather the headset on my head, were relaying everything over Skype. It was a support call of sorts — here she was to help me fix a light switch. We started by pinning her little window on top of a lamp. I could then look around the room and return to the lamp to see her face. She guided me where to go. It felt strangely natural, and I didn’t need to configure anything or learn gestures other than the same “Air Tap” you use to simulate a mouse click.</p>
<p>While I was being talked through which real-world tools we needed for the job, the Microsoft engineer called my attention to the wall with wiring and then started drawing where to position the light switch right on the wall. Thinking about it now, it sounds totally surreal, but during the demo I didn’t even think about it — it just felt like I was being guided around with annotations and a helpful friend. We connected the wiring, tested it for an electrical current, and then turned the power back on and switched the light on. It was all fixed, and all by using a crazy combination of a headset, augmented reality, and Skype. It might sound gimmicky, but the applications here are truly impressive. I use YouTube guides to figure out home improvements or to service my car, but this is on another level. Imagine a surgeon performing complex surgery and writing notes in real time and guiding a colleague through it all. Imagine support calls to resolve a problem with your PC. If this works as well as Microsoft’s controlled demo, then this really has the ability to change how we communicate and learn.</p>
<p>Microsoft’s next demo didn’t have us using the HoloLens prototypes directly. Instead, we watched as “Alex” (nobody in Microsoft’s blue-tinted demonstration basement has last names. I asked.) manipulated objects in digital space so he could build a koala bear or a pickup truck. It was actually quite impressive, as cameras filmed him and screens showed both Alex and the virtual objects he was manipulating in the same space in real time.</p>
<p>The idea was to convince us that HoloLens would unleash a wave of creators who would be able to dream up 3D objects with little to no training. It’s much easier to understand what a thing is in your living room than it is in AutoCAD.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-8795" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/hololens.0.gif?resize=663%2C373&#038;ssl=1" alt="hololens.0" width="663" height="373" /></p>
<p>But sitting there after our whirlwind of actually <em>experiencing</em> HoloLens, my mind was elsewhere. For example, there are only a few ways to interact with this system so far:</p>
<ul>
<li>Glance: you point your head at something.</li>
<li>AirTap: you make a “Number 1” sign with your hand, then move your finger down like you’re depressing a lever.</li>
<li>Voice: you can issue commands, usually to switch what “tool” you’re using.</li>
<li>Mouse: perhaps the neatest thing is that the objects you already use to interact with computers can also be used to interact with holograms.</li>
</ul>
<p>That seems like enough, but it’s not nearly enough. It’s wildly impressive that these objects really do feel like they’re out there in your living room, but it’s equally depressing to know that you can’t treat them like real objects.</p>
<p>At one point in the demo, Alex needed to put a tire on his pickup. He had to twist his body and head around to get his pointer in just the right spot and get the tire arranged just right to fit on the axle. Then, AirTap! The tire is connected. But how much easier would it be if you could grab the tire in your actual hands?</p>
<p>Our hands are simply more dexterous than our necks. You have finer control over small motions; you can move your hands in so many different ways and vectors, with pressure and nuance and delicacy. Your neck and head, well, not so much.</p>
<p>But then Microsoft gave us 3D printed Koalas with a USB drive inside them, which was nice. And if this HoloLens thing takes off, you will be able to design your own and it will be way easier than learning current 3D design software. But not as easy as it would be if you just imagined building with holograms.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-8796" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/microsoft-windows-10-live-verge-_1662.0.jpg?resize=980%2C654&#038;ssl=1" alt="microsoft-windows-10-live-verge-_1662.0" width="980" height="654" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/microsoft-windows-10-live-verge-_1662.0.jpg?w=1000&amp;ssl=1 1000w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/microsoft-windows-10-live-verge-_1662.0.jpg?resize=300%2C200&amp;ssl=1 300w" sizes="auto, (max-width: 980px) 100vw, 980px" /></p>
<p><strong>Walking on Mars<br />
</strong>By Tom Warren</p>
<p>Microsoft has teamed up with NASA to let scientists explore what Curiosity sees on Mars. Instead of panoramic imagery on a computer screen, Microsoft’s demo lit up a room and turned it into Mars. I walked around the rocky terrain, bumped into the Curiosity rover, and generally just checked out a planet I will never visit in my lifetime. It’s a totally new perspective; it felt like I was immersed in a tour of Mars, though not necessarily there. The field of view felt a little too limited to truly immerse myself and trick my brain into thinking I was really on another planet, but what impressed me most is what Microsoft has built into this experience.</p>
<p>I held a call with a NASA engineer and he talked me through the terrain. I squatted to look more closely at rocks, took snapshots of various rock formations, and even planted flags for points of interest. My jaw dropped when I ventured over to a PC in the room and started to experiment with the mouse. I pulled the mouse pointer off the screen and suddenly it was on the floor next to me, allowing me to set markers in the virtual environment. It’s everything I’ve seen in demonstrations from Microsoft Research before, but here it was on my head and working.</p>
<p>The collaboration part was the key here, allowing me to interact with this data in a unique way, but also alongside the NASA engineer who could drop flags on the Mars terrain and guide me to look at certain sections. While this isn’t traditional productivity with a mouse and keyboard, it’s certainly something new and intriguing. I could see this type of scenario working for big teams that need to communicate across time zones and on big sets of complex data.</p>
<p>Overall, HoloLens is Microsoft at its most ambitious. It’s a big bet on the future of computing, the future of Windows, and ultimately the future of Microsoft itself. While the company is struggling in mobile, it wants to catch the next wave of computing and lead. Is HoloLens the next wave? Developers and consumers will be the ultimate test of that, but if anything HoloLens is an incredibly brave and impressive project from Microsoft. It’s true innovation, which is something Microsoft has lacked during its obsession with protecting Windows. It’s also another example of <a href="https://www.theverge.com/2014/11/6/7164623/microsoft-3d-sound-headset-guide-dogs" target="_blank" rel="noopener">an experience that takes the complex technology out of the way</a>, leaving you to experience what really matters.</p>
<p>Written by: <a href="https://www.theverge.com/users/Dieter%20Bohn" target="_blank" rel="noopener">Dieter Bohn</a> and <a href="https://www.theverge.com/users/tomwarren" target="_blank" rel="noopener">Tom Warren</a>, <a href="https://www.theverge.com/2015/1/21/7868251/microsoft-hololens-hologram-hands-on-experience" target="_blank" rel="noopener">The Verge</a> (via <a href="https://ispr.info/2015/01/22/up-close-with-the-hololens-microsofts-intriguing-mixed-reality-product/" target="_blank" rel="noopener">Presence</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2015/01/welcome-age-holographs/">Welcome to the Age of Holographs</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2015/01/welcome-age-holographs/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">8792</post-id>	</item>
		<item>
		<title>Control VR Gloves Warp Your Fingers into Virtual Worlds</title>
		<link>https://www.situatedresearch.com/2014/06/control-vr-gloves-warp-fingers-virtual-worlds/</link>
					<comments>https://www.situatedresearch.com/2014/06/control-vr-gloves-warp-fingers-virtual-worlds/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Wed, 11 Jun 2014 19:04:51 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Communication]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=8544</guid>

					<description><![CDATA[<p>$350 device tracks your arms and hands with military-designed sensors New technologies such as Google Glass and Oculus’ Rift headset are making it easier than ever for us to get our heads into augmented and virtual realities. But while we get our heads into these alternate worlds and use our eyes to check our emails, surf&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2014/06/control-vr-gloves-warp-fingers-virtual-worlds/">Control VR Gloves Warp Your Fingers into Virtual Worlds</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>$350 device tracks your arms and hands with military-designed sensors</strong></p>
<p>New technologies such as Google Glass and Oculus’ Rift headset are making it easier than ever for us to get our heads into augmented and virtual realities. But while we get our heads into these alternate worlds and <a href="http://www.theverge.com/2013/2/22/4013406/i-used-google-glass-its-the-future-with-monthly-updates" target="_blank">use our eyes to check our emails</a>, surf the internet, even <a href="http://www.theverge.com/2014/2/5/5382524/eve-valkyrie-will-be-an-oculus-rift-exclusive" target="_blank">destroy enemy starfighters with a spiral of missiles</a>, our hands are left behind in the real world. <span id="more-8544"></span></p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="alignright size-full wp-image-8546" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/06/jpeg.jpg?resize=640%2C426&#038;ssl=1" alt="jpeg" width="640" height="426" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/06/jpeg.jpg?w=640&amp;ssl=1 640w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/06/jpeg.jpg?resize=300%2C199&amp;ssl=1 300w" sizes="auto, (max-width: 640px) 100vw, 640px" />California-based Control VR wants to change that. The company today launched a <a href="https://www.kickstarter.com/projects/controlvr/control-vr-motion-capture-for-vr-animation-and-mor" target="_blank">Kickstarter for its Control VR wearable device</a>, a glove-like system that fits over the user’s arms and shoulders and can accurately sense the precise movements of fingers before translating that motion into virtual or augmented realities. Unlike motion sensing controllers such as <a href="http://www.theverge.com/2014/6/5/5782286/xbox-one-without-kinect-performance-boost" target="_blank">Microsoft’s Kinect</a>, the Control VR can map precise arm and finger motions without the use of an external camera.</p>
<p>Alex Sarnoff, Control VR’s co-founder and CEO, says “existing motion-sensing technology is crude, insufficient and limited by confined spaces and camera systems.” His company’s solution takes up little space and doesn’t require an external device pointed at the user. Instead, fine control is made possible by a set of tiny sensors that are placed on the user’s fingers and arms. Each of these sensors — which Sarnoff says were designed for military purposes — has three accelerometers, three gyroscopes, and three magnetometers. The motion data from these sensors is fed back to a processor, which lets the Control VR system calculate how the wearer’s fingers are moving in relation to their body.</p>
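<p>The article doesn’t detail Control VR’s actual fusion algorithm, but a standard way to combine a gyroscope’s fast-but-drifting rate readings with an accelerometer’s noisy-but-stable gravity reference is a complementary filter. The sketch below is a hypothetical illustration of that general technique; the <code>alpha</code> blend factor and the simulated drift values are assumptions, not Control VR specifics.</p>

```python
import math

def complementary_filter(pitch, gyro_rate, accel_y, accel_z, dt, alpha=0.98):
    """Blend gyro integration (accurate short-term, drifts long-term) with
    the tilt implied by the accelerometer's gravity vector (noisy short-term,
    stable long-term) into a single pitch estimate, in radians."""
    gyro_pitch = pitch + gyro_rate * dt          # integrate angular velocity
    accel_pitch = math.atan2(accel_y, accel_z)   # tilt implied by gravity
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

# Simulate a sensor held steady at 0.1 rad of pitch while the gyro
# reports a constant 0.02 rad/s of pure drift. Integrated alone, the
# gyro would wander 0.2 rad off over the 10-second run; the accelerometer
# term keeps the estimate anchored near the true angle.
true_pitch = 0.1
pitch = 0.0
for _ in range(1000):                      # 1000 steps at dt = 0.01 s
    gyro_rate = 0.02                       # drift only; sensor is not moving
    ay = math.sin(true_pitch)              # gravity components at true tilt
    az = math.cos(true_pitch)
    pitch = complementary_filter(pitch, gyro_rate, ay, az, dt=0.01)
# The estimate settles near the true 0.1 rad despite the drifting gyro.
```

<p>Real 9-DOF fusion (adding the magnetometers for heading, and doing it per finger segment) typically uses a full orientation filter such as a Kalman or Madgwick filter, but the blending idea is the same.</p>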
<p><iframe loading="lazy" src="https://www.kickstarter.com/projects/controlvr/control-vr-motion-capture-for-vr-animation-and-mor/widget/video.html" width="1024" height="600" frameborder="0" scrolling="no"> </iframe></p>
<p>Sarnoff sees his company’s device first being used with video games. Control VR has already demonstrated its device being used with the Oculus Rift headset, <a href="http://youtu.be/LPszKhewSec" target="_blank">using the Rift’s Tuscany demo</a> to show how hands, arms, and fingers can be manipulated by the player. The sensors on the wearer’s elbows and fingers mean that the motions look natural on screen, appearing as one-to-one representations of their actions in the real world. In a newer demonstration, also using the Rift, a player places his hands behind his back to send an Iron Man avatar flying across treetops. He throws his hands forward, using Tony Stark’s palm-mounted thrusters to come to a hovering halt, before pointing his fingers at flying opponents and blowing them from the sky with his suit’s weapons.</p>
<p>The most recent renders of the device show it sporting a small joystick, but Sarnoff says the system will also have more humanitarian uses than aerial video game battles. Control VR will ship with an SDK that Sarnoff says will allow developers to “make the world a better place” by building software and adding functionality for the technology. “Ultimately, functional applications like remote physical therapy and virtual sign-language will be developed,” he says. Sarnoff thinks his company’s device will have a major impact in the animation, design, medical, and robotics communities — and with the party game crowd. “Imagine playing a game of beer pong in real-time,” he suggests.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-large wp-image-8547" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/06/View3-PhysicalRender.jpg?resize=980%2C419&#038;ssl=1" alt="View3-PhysicalRender" width="980" height="419" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/06/View3-PhysicalRender.jpg?resize=1024%2C438&amp;ssl=1 1024w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/06/View3-PhysicalRender.jpg?resize=300%2C128&amp;ssl=1 300w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/06/View3-PhysicalRender.jpg?w=1900&amp;ssl=1 1900w" sizes="auto, (max-width: 980px) 100vw, 980px" /></p>
<p>The company’s funding goal is set at $250,000. Those who pledge $350 — the <a href="https://www.oculusvr.com/order/" target="_blank">same price as an Oculus Rift development kit</a> — or more get their own Control VR system, in addition to its SDK and a set of tutorials that the company says makes “integration with any 3D game or application as easy as possible.” Sarnoff promises that those who purchase a Control VR system won’t have to buy a newer version six months down the line. The $350 device is modular, meaning new features and functions can be slotted or patched in later. He mentions haptic feedback as one example that will “absolutely” be a part of future versions of Control VR, “so gamers can play with real feedback while lying on a sofa.” The company plans to get all Control VR systems out to people who pledge $350 or more by December 25th. A retail version is further out, but is expected to be ready for the mass market in 18 months.</p>
<p>Some of the world’s biggest companies have placed <a href="http://www.theverge.com/2014/3/25/5547456/facebook-buying-oculus-for-2-billion" target="_blank">big bets on virtual and augmented reality</a>, but while the visual experience is already impressive, controllers for the Oculus Rift and its contemporaries have lagged behind. Devices such as the Razer Hydra are frustrating and imprecise to use, while others <a href="http://www.theverge.com/2013/6/11/4419832/virtuix-omni-vr-hands-on-demo" target="_blank">such as the Virtuix Omni </a>require vast amounts of living room space, leading Oculus’ Palmer Luckey to <a href="http://www.theverge.com/2013/12/23/5238118/virtual-reality-check-oculus-rift-hardware-ecosystem" target="_blank">lament the lack of a top-quality input system for his company’s machine</a>. Control VR’s system certainly appears smaller and more precise than its peers, but it’s yet to be seen how quickly the virtual reality community will warm to it. In the meantime, the company plans to show off the system at next week’s E3 expo, offering developers the chance to get their hands, as well as their heads, into their video games.</p>
<p>Written by: <a href="http://www.theverge.com/users/richmcc" target="_blank">Rich McCormick</a>,  <a href="http://www.theverge.com/2014/6/5/5781932/control-vr-gloves-warp-your-fingers-into-virtual-worlds" target="_blank">The Verge</a> (via <a href="http://ispr.info/2014/06/10/control-vr-gloves-warp-your-fingers-into-virtual-worlds/" target="_blank">Presence</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2014/06/control-vr-gloves-warp-fingers-virtual-worlds/">Control VR Gloves Warp Your Fingers into Virtual Worlds</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2014/06/control-vr-gloves-warp-fingers-virtual-worlds/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">8544</post-id>	</item>
		<item>
		<title>Oculus Rift Will Finally Go On Sale To Consumers Next Year</title>
		<link>https://www.situatedresearch.com/2014/05/oculus-rift-will-finally-go-sale-consumers-next-year/</link>
					<comments>https://www.situatedresearch.com/2014/05/oculus-rift-will-finally-go-sale-consumers-next-year/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Thu, 01 May 2014 16:05:39 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Aesthetics]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[User-Centered Design]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=8441</guid>

					<description><![CDATA[<p>Image: A man tries the Oculus Rift headset at Facebook&#8217;s F8 conference. An Oculus Rift virtual reality headset for consumers could go on sale next year, a company representative told Business Insider at Facebook’s F8 developer conference today. Management at Oculus VR, the Irvine, Calif.-company that Facebook bought for $2 billion earlier this year, will&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2014/05/oculus-rift-will-finally-go-sale-consumers-next-year/">Oculus Rift Will Finally Go On Sale To Consumers Next Year</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p style="text-align: center;"><em>Image: A man tries the Oculus Rift headset at Facebook&#8217;s F8 conference.</em></p>
<p>An Oculus Rift virtual reality headset for consumers could go on sale next year, a company representative told Business Insider at Facebook’s F8 developer conference today.</p>
<p>Management at Oculus VR, the Irvine, Calif.-based company that Facebook bought for $2 billion earlier this year, will be “disappointed” if it doesn’t have a headset available at retail for ordinary people by 2016, according to an Oculus spokesperson. <span id="more-8441"></span></p>
<p>A consumer Oculus product in 2015 will be exciting for a couple of reasons:</p>
<ul>
<li>Almost everyone who tries the device is completely blown away by the experience. It’s completely different from any other audio-visual gadget you’ve ever tried — the worlds inside the headsets feel real and deep, because the company has gotten rid of the on-screen “lag” that occurs when users move their heads. On top of that, the environment moves naturally as you move. In the game I tried today, I peered out into a lava-filled hellscape full of demons guarding battlements. If I leaned forward, I could see into the rivers of molten rock that flowed between them. Attendees at the conference lined up 20 deep to get 5 minutes with the device.</li>
<li>Oculus will completely turn the console game economy on its head. Once you’ve played a game inside Oculus, going back to playing on a TV just feels lame.</li>
</ul>
<p>Currently, Oculus is only selling development kits to game creators. The Oculus Rift Development Kit 2 is on sale for $350, and units will begin shipping, to developers only, in July of this year. After that, Oculus VR must wait while those developers create an ecosystem of cool games — there is no point in selling the headsets to consumers if there are no games or other content for them to see. That process could take months.</p>
<p>There is no word on a price tag for consumers. The company is in the process of building a team to work on marketing and branding the product.</p>
<p>Game creation takes time, but Redner, the Oculus spokesperson, says the current thinking is that there should be enough titles to justify consumer usage by 2016.</p>
<h3>There is a new Oculus inside a secret room in Irvine</h3>
<figure id="attachment_8443" aria-describedby="caption-attachment-8443" style="width: 300px" class="wp-caption alignright"><a href="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/05/facebook-f8-oculus-1.jpg?ssl=1"><img data-recalc-dims="1" loading="lazy" decoding="async" class="size-medium wp-image-8443" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/05/facebook-f8-oculus-1.jpg?resize=300%2C225&#038;ssl=1" alt="Oculus being demonstrated at F8." width="300" height="225" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/05/facebook-f8-oculus-1.jpg?resize=300%2C225&amp;ssl=1 300w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/05/facebook-f8-oculus-1.jpg?resize=1024%2C768&amp;ssl=1 1024w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/05/facebook-f8-oculus-1.jpg?w=1200&amp;ssl=1 1200w" sizes="auto, (max-width: 300px) 100vw, 300px" /></a><figcaption id="caption-attachment-8443" class="wp-caption-text">Oculus being demonstrated at F8.</figcaption></figure>
<p>More tantalizing still is what Oculus is hiding inside the secret “Valve Room” within its Irvine headquarters near Los Angeles. (Valve is the company that originally used the room for developing games; Oculus has taken it over.) We first heard about this from Andreessen Horowitz partner Chris Dixon, an investor in Oculus VR, who says that the version of Oculus Rift inside the “special room” is more powerful and impressive than even the existing Crystal Cove and DK2 versions that outsiders have been allowed to play with.</p>
<p>“Crystal Cove is 50% of what they are running in LA,” he says. Oculus Rift Crystal Cove is impressive, but it’s still obvious that you’re inside an animated game environment. It doesn’t yet closely approximate reality. However, “what they have in LA <em>does</em>,” Dixon tells us. “You go into a room. It’s a special room. Fancier headset. … In user testing it gets to a level of realism where almost all people feel that it’s realistic.” He gestured to the San Francisco street where we were drinking coffee. “Imagine everything you can see now, but it’s a little bit pixelated. Eventually that [pixelation] will go away.”</p>
<p>He believes Facebook CEO Mark Zuckerberg bought the company after being ushered into the Valve Room. (He obviously tried the other versions as well.)</p>
<p>The Oculus rep wasn’t quite as hyperbolic when asked about the “mythic” room. But he did tell us that the demo version inside the Valve room does feature a photorealistic experience that is so real even people who are very sensitive to motion sickness don’t “feel” it.</p>
<p>The test unit has an entire room to itself because it requires a massive amount of processing power to run. It’s a headset tethered to a giant server, basically. Oculus expects, eventually, to be able to crunch that down into units that can be sold commercially.</p>
<p>Games will only be the start of it. Once it is commercially available, “There will be a million in the U.S. military, police, and fire services,” Dixon says. “Anything to do with training” that is dangerous will utilize an Oculus experience instead, he believes.</p>
<p>We can’t wait.</p>
<p>Written by: <a href="http://www.businessinsider.com/author/jim-edwards" target="_blank">Jim Edwards</a>, <a href="http://www.businessinsider.com/oculus-riftdate-for-sale-to-consumers-2014-4" target="_blank">Business Insider</a> (via <a href="http://ispr.info/2014/05/01/oculus-rift-coming-next-gen-version-reaches-new-level-of-realism/">Presence</a>)<br />
Images: Kyle Russell, Business Insider<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2014/05/oculus-rift-will-finally-go-sale-consumers-next-year/">Oculus Rift Will Finally Go On Sale To Consumers Next Year</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2014/05/oculus-rift-will-finally-go-sale-consumers-next-year/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">8441</post-id>	</item>
	</channel>
</rss>
