<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Serious Games Archives - Situated Research</title>
	<atom:link href="https://www.situatedresearch.com/category/gaming/serious-games/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.situatedresearch.com/category/gaming/serious-games/</link>
	<description>Usability Research and User Experience Testing</description>
	<lastBuildDate>Mon, 22 Nov 2021 17:33:24 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	

<image>
	<url>https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2021/03/cropped-icon.png?fit=32%2C32&#038;ssl=1</url>
	<title>Serious Games Archives - Situated Research</title>
	<link>https://www.situatedresearch.com/category/gaming/serious-games/</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">122538981</site>	<item>
		<title>Road to GDC: I’m Not A Doctor, but I Simulate One in VR</title>
		<link>https://www.situatedresearch.com/2018/03/road-gdc-im-not-doctor-simulate-one-vr/</link>
					<comments>https://www.situatedresearch.com/2018/03/road-gdc-im-not-doctor-simulate-one-vr/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Fri, 02 Mar 2018 17:20:01 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[Games for Learning]]></category>
		<category><![CDATA[Health Care]]></category>
		<category><![CDATA[Learning]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">https://www.situatedresearch.com/?p=9703</guid>

					<description><![CDATA[<p>We are moving into a future where games train our doctors, monitor our health, and treat our illnesses.&#160; The sky is falling! Social media is the new scapegoat of the month. Headlines claim it is ruining our relationships, dismantling our society, destroying our very lives! In particular, the most frequent victims are presumed to be&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2018/03/road-gdc-im-not-doctor-simulate-one-vr/">Road to GDC: I’m Not A Doctor, but I Simulate One in VR</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>We are moving into a future where games train our doctors, monitor our health, and treat our illnesses.&nbsp;<span id="more-9703"></span></p>
<p>The sky is falling! Social media is the new scapegoat of the month. Headlines claim it is ruining our relationships, dismantling our society, destroying our very lives! In particular, the most frequent victims are presumed to be teenagers. Sometimes the accused culprit is not social media, but the phones that make it so accessible. Is it true? Only time will tell &#8230; but in the &#8217;50s, the demon was comic books; in the &#8217;60s, rock and roll; and in the &#8217;80s, video games. My mother was convinced that my love of comic books and science fiction was going to rot my brain. Now, of course, these things are mainstream and no longer the sole domain of teens. But there’s always a new thing for people to worry about or blame for the decline and fall of civilization.</p>
<p>I’m particularly sensitized to that criticism of video games. I designed and programmed my first computer game in college in 1976 &#8211; in fact, inspired by that very love of science fiction I had as a child. When I graduated in 1980, my first job out of college was entering the then-infant video game industry. I’ve never left. So when pundits blamed games for destroying society, even causing teen violence and rebellion, I took it personally. I’ve always felt that video games can be magical, marvelous entertainment. I hoped that one day they’d be seen as not just safe, but actually good for us. That day is finally here.</p>
<h3>Virtual treatment, real results</h3>
<p>For many years now, researchers and doctors have gradually built up solid, scientifically verified evidence that existing games can improve the lives of the people who play them. At the same time, increasing numbers of games have been created with the idea of ‘boosting health’ as a direct goal.</p>
<p>Fast action games like Call of Duty have been found to improve visual perception and the ability to make correct decisions quickly. Other research has shown promise in using a game to treat the underlying causes of&nbsp;<a href="https://www.polygon.com/2014/2/24/5439884/this-game-knows-how-scared-you-are-but-could-be-used-to-heal-trauma" target="_blank" rel="noopener">depression</a>. It’s possible that games may be able to diagnose the onset of degenerative diseases like Alzheimer’s and Parkinson’s, and perhaps even slow their progression.</p>
<p>Games have shown promise in the realm of physical fitness, too. Starting 20 years ago, the arcade game Dance Dance Revolution was credited with getting a lot of passive couch potatoes up, moving, and losing weight, and it’s still spawning sequels. Games on mobile phones like&nbsp;<i>Zombies, Run!</i> and&nbsp;<i>Pokémon Go</i>&nbsp;have encouraged players to get out and move in the world, and many track their exercise and calorie expenditure as they do so. VR holds promise here too, with the chance to get your exercise by racing the Tour de France on your exercise bike, or by flying like a bird. There are even current ventures bringing gameplay to gym class and possibly making dodgeball fun even for nerds!</p>
<h3>Doctors with joysticks</h3>
<p>It turns out that doctors in training, like most people these days, are often avid game players. That has presented a great opportunity to use games as part of their medical education. Although games have yet to replace classes, they’ve been shown to help laparoscopic surgeons reduce errors by 37 percent while increasing their speed by 27 percent when used as warm-up exercises. When you consider that athletes, musicians, dancers, and others who need to do precision work with their muscles all limber up before their tasks, it makes sense that the right kind of practice helps surgeons, too.</p>
<p>Other companies are rushing to use VR to train anesthesiologists or to give caregivers a first-hand sense of how their patients with macular degeneration see the world. The VR simulations aren’t all games, but the vast majority of VR engineers are coming from the games industry.</p>
<h3>Prescribing play</h3>
<p>Perhaps the most exciting application of games in the modern world is the ways in which doctors are using games to treat their patients. Realistic war games have helped soldiers recover from PTSD by simulating the experiences that trigger their problem, a method that gradually desensitizes them and reduces their symptoms long term. Other games have been used in similar ways in conjunction with therapy to treat&nbsp;<a href="https://www.polygon.com/features/2017/4/7/15205366/vr-danger-close" target="_blank" rel="noopener">phobias</a>&nbsp;like fear of heights, flying, and spiders. And currently, virtual reality games have shown great promise in relieving acute pain, reducing or even eliminating the need for narcotics when changing the dressings on burn victims. VR is also showing promise in helping stroke victims recover control over their movement, and in&nbsp;<a href="https://www.polygon.com/2014/3/3/5462508/phantom-pain-video-game-treatment" target="_blank" rel="noopener">relieving the perception of pain in “phantom limbs”</a> experienced by amputation patients.</p>
<p>Last September saw the FDA approval of a mobile phone app to be used (in conjunction with therapy) to treat addiction. The developers call their app a “Prescription Digital Therapeutic” and, although it’s not a game, it’s a big step to have software approved to treat something as serious as Substance Use Disorder.</p>
<p>But a real game designed to be an active treatment for ADHD (Attention Deficit Hyperactivity Disorder) was not far behind. By December, the FDA gave preliminary clearance to a video game made by a team consisting of both game developers and neuroscientists from UCSF. In a large controlled trial of children and teens diagnosed with ADHD, the group who used the game showed significant improvement compared to a control group. The team hopes that soon it will become the first game to win FDA approval on the same terms as a prescription drug. In style, the game is part racing game, part Pokémon Snap, but with many unique twists to improve attention and focus.</p>
<p>We are moving into a future where games train our doctors, monitor our health, and treat our illnesses. It may seem a bit outrageous now, but if comic books led me into a career making video games and often become the basis of mainstream movies, why can’t video games inspire the next generation of doctors and become the basis of medical treatment? Video games are intimately connected to learning, attention, and the brain. It isn’t an accident that they are also proving to be useful to our mental and physical health. Maybe they’ll even be able to reverse my dreaded comic book brain rot!</p>
<p><i>This is part of a&nbsp;<a href="https://www.rollingstone.com/gdc" target="_blank" rel="noopener">series of columns</a>&nbsp;written by developers speaking at the Game Developers Conference in March.</i></p>
<p><i>Noah Falstein is a freelance game designer and producer, and was one of the first 10 employees at LucasArts Entertainment and Dreamworks Interactive. Last year he left Google after serving four years as their Chief Game Designer.</i></p>
<p>Written by: Noah Falstein, via <a href="https://www.rollingstone.com/glixel/features/road-to-gdc-im-not-a-doctor-but-i-simulate-one-in-vr-w517154" target="_blank" rel="noopener">Rolling Stone</a><br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2018/03/road-gdc-im-not-doctor-simulate-one-vr/">Road to GDC: I’m Not A Doctor, but I Simulate One in VR</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2018/03/road-gdc-im-not-doctor-simulate-one-vr/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">9703</post-id>	</item>
		<item>
		<title>From Privacy to Productivity: A Look at How Virtual Reality Could Change the Way We Work</title>
		<link>https://www.situatedresearch.com/2015/07/from-privacy-to-productivity-a-look-at-how-virtual-reality-could-change-the-way-we-work/</link>
					<comments>https://www.situatedresearch.com/2015/07/from-privacy-to-productivity-a-look-at-how-virtual-reality-could-change-the-way-we-work/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Sat, 25 Jul 2015 18:06:37 +0000</pubDate>
				<category><![CDATA[Collaboration]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Communication]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Learning]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=8942</guid>

					<description><![CDATA[<p>Businesses someday getting on board with virtual reality will need to do some self-examination. Various VR tools are aimed at reclaiming productivity and improving interactions.  The fabled “promise” of virtual reality is expansive. At its loftiest, we’ve been promised not only changes to how we live and how we consume entertainment, but also to how&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2015/07/from-privacy-to-productivity-a-look-at-how-virtual-reality-could-change-the-way-we-work/">From Privacy to Productivity: A Look at How Virtual Reality Could Change the Way We Work</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>Businesses someday getting on board with virtual reality will need to do some self-examination. Various VR tools are aimed at reclaiming productivity and improving interactions. </strong></p>
<p>The fabled “promise” of virtual reality is expansive. At its loftiest, we’ve been promised not only changes to how we live and how we consume entertainment, but also to how we work. <span id="more-8942"></span></p>
<p>After all, tech loves a good workplace trend.</p>
<p>In a general sense, incorporating virtual reality into business could mean things like escape from the physical confines of a desk, from the limit of how many monitors you could stick on that desk, or from the general lack of aesthetics associated with cubicles.</p>
<p>At the moment, there seem to be two ends of the spectrum developing — VR to help you get work done with other people, and VR to help you get away from, perhaps, those same people later on in the day.</p>
<p>One instance of the latter is Icelandic company <a href="http://www.murevr.com/#the-team-1-section" target="_blank">Breakroom</a>. They’re still in early days, but the idea behind Breakroom stems from the proliferation of open-concept offices — the kind popularized by tech companies as markers of innovation and avant-garde thinking, and the same kind that the Harvard Business Review, among others, has said are now negatively impacting <a href="https://hbr.org/2014/10/the-transparency-trap&amp;cm_sp=Article-_-Links-_-Top%20of%20Page%20Recirculation" target="_blank">privacy</a>, <a href="http://www.newyorker.com/business/currency/the-open-office-trap" target="_blank">productivity</a>, and <a href="http://www.fastcompany.com/3019758/dialed/offices-for-all-why-open-office-layouts-are-bad-for-employees-bosses-and-productivity" target="_blank">workplace satisfaction</a>.</p>
<p>One of Breakroom’s founders, Diðrik Steinsson, drew inspiration from having to work in an open office space himself. The idea behind Breakroom is that a worker in such an office might have a head-mounted display like the Oculus Rift at his or her desk, and when it’s time to really focus on something for a few hours, they can put it on and go into a virtual environment with multiple manipulable browser windows and integration with Google Apps and Office 365, and get some work done — all while sitting somewhere scenic like a grassy field, or the moon. (Some co-workers will push you there.)</p>
<p>“I see it as a fortress of solitude for people,” Steinsson said. And he’s betting workers will be wearing some type of HMD eventually, even if it’s not within the next 10 years.</p>
<p>The flip side of this, to a degree, is a virtual reality application like AltspaceVR. The social VR app lets users enter its virtual world as a robot avatar to socialize. It’s not necessarily aimed at businesses or the enterprise, but CEO Eric Romo said they’ve been using it for functions like business meetings and even job candidate interviews.</p>
<p>Romo emphasizes the value of nonverbal communication. A conference call, for example, can be awkward. People talk over each other, and it’s difficult to get a read on the other people present when all nonverbal cues like facial expressions and body language are absent. Romo said the experience of meeting and interacting with others is more effective when things like head movements are getting translated into VR.</p>
<p>Altspace has features like private and multi-user web browsers — so, multiple people could, for example, look at code together. The use cases slide back and forth from consumer to enterprise: Romo said that if you can show off vacation pictures, there’s no reason why you couldn’t just as easily show slide decks.</p>
<p>Somewhere in between those two examples, there’s something like the <a href="http://www.fastcodesign.com/3028433/virtual-reality-goes-to-work" target="_blank">demo</a> that Oliver Kreylos of UC Davis’ Institute for Data Analysis and Visualization put together in 2014. It’s 3D-captured data of an office that includes 2D desktop apps.</p>
<p>But to eventually get these or other virtual reality tools into the business world, there are still some hurdles to clear, like nailing down inputs, or even just supplying every worker with not only an HMD but also a Kinect sensor and a Leap Motion sensor to translate more movement into VR. It also raises a bigger question: what does all this really solve?</p>
<p>“When you want to introduce a technology like VR into some sort of business process, it’s really got to have some sort of overall benefit,” said Gartner analyst Brian Blau. “Some of these behavior replacement cycles — one of the things that you’ll find is that often times they’re more incremental than they are revolutionary.”</p>
<p>Introducing something like VR into a business environment would be revolutionary in the sense that it would be a change of device, software, and user interface, all at once.</p>
<p>What he asks is: what are the steps? What are the actions being changed? Being able to answer those questions could be a determining factor in whether virtual reality ever takes hold in the enterprise.</p>
<p>He said more general uses are harder to make an argument for. Take a meeting, for example — over the years, tech surrounding the ways in which people meet has ranged from phone calls, to conference calls, to video calls, to video calls on mobile devices — so what’s the big value add of virtual reality?</p>
<p>Romo submits the nonverbal cues and the basic malleability of a virtual reality environment: the ability to turn a space into whatever it is a user might need.</p>
<p>Still, Blau sees more potential in purpose-built VR tools. Think data visualisation, training, prototyping and design.</p>
<p>Another consideration is what happens after introducing something like an HMD into an office worker’s everyday use.</p>
<p>Computer Vision Syndrome is already rampant. But Dr. Dominick Maino, a professor at the <a href="http://ico.edu/" target="_blank">Illinois College of Optometry/Illinois Eye Institute</a> who specializes in pediatrics and binocular vision and has done research on vision and 3D graphics, said that if anything, introducing VR into workplaces would probably end up surfacing a lot of vision problems relating to faulty binocular vision. Those are the kinds of problems that would need to get fixed before workers could actually use a VR tool.</p>
<p>Still, this is all probably a ways off. Breakroom is about to start testing its product. Altspace is focusing mostly on consumer use, but is crafting a product that could also be used in business.</p>
<p>Now, if only VR could offer a fix for the big business problems — like the “reply all” email thread.</p>
<p><em>[This article from <a href="http://www.techrepublic.com/article/from-privacy-to-productivity-a-look-at-how-virtual-reality-could-change-the-way-we-work/" target="_blank">TechRepublic</a> focuses on the uses of presence technology to both separate and connect people in the workplace; I think the Breakroom VR application by <a href="http://www.murevr.com/" target="_blank">MureVR</a> is particularly interesting; you can watch a 6:13 minute video about it on <a href="https://www.youtube.com/watch?v=KvJgJAppbxQ" target="_blank">YouTube</a>.]</em></p>
<p>Written by: <a href="http://www.techrepublic.com/search/?a=erin+carson" target="_blank">Erin Carson</a>, <a href="http://www.techrepublic.com/article/from-privacy-to-productivity-a-look-at-how-virtual-reality-could-change-the-way-we-work/" target="_blank">TechRepublic</a> (via <a href="http://ispr.info/2015/07/15/tools-to-separate-and-connect-us-how-vr-could-change-the-way-we-work/">Presence</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>&nbsp;</p>
<p>The post <a href="https://www.situatedresearch.com/2015/07/from-privacy-to-productivity-a-look-at-how-virtual-reality-could-change-the-way-we-work/">From Privacy to Productivity: A Look at How Virtual Reality Could Change the Way We Work</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2015/07/from-privacy-to-productivity-a-look-at-how-virtual-reality-could-change-the-way-we-work/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">8942</post-id>	</item>
		<item>
		<title>Welcome to the Age of Holographs</title>
		<link>https://www.situatedresearch.com/2015/01/welcome-age-holographs/</link>
					<comments>https://www.situatedresearch.com/2015/01/welcome-age-holographs/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Thu, 22 Jan 2015 22:18:54 +0000</pubDate>
				<category><![CDATA[Education]]></category>
		<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Aesthetics]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Games for Learning]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Mental Models]]></category>
		<category><![CDATA[Personalization]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[User-Centered Design]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=8792</guid>

					<description><![CDATA[<p>Up close with the HoloLens, Microsoft’s most intriguing product in years We just finished a heavily scripted, carefully managed, and completely amazing demonstration of Microsoft’s HoloLens technology. Four demos, actually, each designed to show off a different use case for a headset that projects holograms into real space. We played Minecraft on a coffee table.&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2015/01/welcome-age-holographs/">Welcome to the Age of Holographs</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>Up close with the HoloLens, Microsoft’s most intriguing product in years</strong></p>
<p>We just finished a heavily scripted, carefully managed, and completely amazing demonstration of Microsoft’s HoloLens technology. Four demos, actually, each designed to show off a different use case for a headset that projects holograms into real space. We played <em>Minecraft</em> on a coffee table. We had somebody chart out how to fix a light switch right on top of the very thing we were fixing. <span id="more-8792"></span></p>
<p>We walked on Mars.</p>
<p>You’ll notice there aren’t photos here, and that’s because before we were even allowed into the labs where the HoloLens team tests out its user experiences, we had to deposit our cameras and phones into a locker. No recording equipment of any kind was allowed, not even audio. We entered the basement below Microsoft’s visitor center laughing at the absurdity of it all — many reporters needed to get notepads from the company and weren’t carrying pens, either.</p>
<p>But it was all worth it, because HoloLens is probably the most intriguing (and, in many ways, most infuriating) technology we’ve experienced since the Oculus Rift. And there are many parallels with the Rift to be had: both are immersive, but in different ways; both require you to strap a weird thing on your head; both leave you grinning like an absolute idiot at a scene only you can see. And, crucially, both need more work when it comes to thinking through exactly how to control and interact with virtual things.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-8793" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/d76d1d6c-a4e6-41d2-bc43-c9b49041a219.0.png?resize=864%2C392&#038;ssl=1" alt="d76d1d6c-a4e6-41d2-bc43-c9b49041a219.0" width="864" height="392" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/d76d1d6c-a4e6-41d2-bc43-c9b49041a219.0.png?w=864&amp;ssl=1 864w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/d76d1d6c-a4e6-41d2-bc43-c9b49041a219.0.png?resize=300%2C136&amp;ssl=1 300w" sizes="auto, (max-width: 864px) 100vw, 864px" /></p>
<p><strong><em>Minecraft</em> IRL<br />
</strong>by Dieter Bohn</p>
<p>By far, <a href="https://www.theverge.com/2015/1/21/7868363/minecraft-hololens-microsoft-freecell" target="_blank" rel="noopener">the most impressive demo for my money was the <em>Minecraft</em> demo</a> — though Microsoft called it something like “Building Blocks” or some such, presumably so as not to fully commit to releasing a full holograph version of <em>Minecraft</em>. But before we could enter this virtual world — actually, the virtual entered <em>our</em> world — we had to strap on the development unit for the HoloLens.</p>
<p>It’s a contraption, to be sure. There’s a small, heavy block you hang around your neck which contains all the computing power. The headset is made up of lenses and tiny projectors and motion sensors and speakers (or <em>something</em> that makes sound, anyway), and god knows what else. And then there’s a screen right there in your field of view.</p>
<p>A “screen in your field of view” is the right way to think about HoloLens, too. It’s immersive, but not nearly as immersive as proper virtual reality is. You still see the real world in between the virtual objects; you can see where the magic holograph world ends and your peripheral vision begins.</p>
<p>But before you can apply your jaded “I’ve done VR before” attitude to this situation, you look down at the coffee table and there’s a <strong>castle sitting right on the damn thing.</strong> It’s not shimmery, but it’s not quite real, either. It’s just sitting there, perfectly flat on the table, reacting in space to your head movements. It’s nearly as lifelike as the actual table, and there’s no lag at all. The castle is there. It’s simply magic.</p>
<p>You definitely have a big stupid grin on your face even though the contraption that’s strapped to it is pressing your eyeglasses into the bridge of your nose in a painful way.</p>
<p>Then it’s demo time. You can’t touch anything, but you can look and point a little circle at objects on it by moving your head around. You learn how a “glance” is just you looking at things and pointing your reticle at them, and an “AirTap” is the equivalent of clicking your mouse. The demo involves digging <em>Minecraft</em> holes and blowing up <em>Minecraft</em> zombies with <em>Minecraft</em> TNT. It’s basically incredible to see these digital things in real space.</p>
<p>You blow up a hole in the table and then you look <em>through</em> it to more digital objects on the floor. You blow up a hole in the wall and tiny bats fly out and you see that behind your very normal wall is a virtual hellscape of lava and rock. You peer into the hole, around the corner, and see that dark realm extend far into space.</p>
<p>And then the demo’s over.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-large wp-image-8794" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/a67d3d33-e1e5-4cf7-bf3d-dbe1befc8d8c.0.jpg?resize=980%2C655&#038;ssl=1" alt="a67d3d33-e1e5-4cf7-bf3d-dbe1befc8d8c.0" width="980" height="655" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/a67d3d33-e1e5-4cf7-bf3d-dbe1befc8d8c.0.jpg?resize=1024%2C684&amp;ssl=1 1024w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/a67d3d33-e1e5-4cf7-bf3d-dbe1befc8d8c.0.jpg?resize=300%2C200&amp;ssl=1 300w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/a67d3d33-e1e5-4cf7-bf3d-dbe1befc8d8c.0.jpg?w=1200&amp;ssl=1 1200w" sizes="auto, (max-width: 980px) 100vw, 980px" /></p>
<p><strong>Skype<br />
</strong>by Tom Warren</p>
<p>Microsoft’s Skype demo was as impressive to me as playing around with <em>Minecraft</em> blocks in a living room. After a two-hour keynote, Microsoft wanted me to fix a light switch. It all started by sitting down and facing some tools and a socket with exposed wiring. A little dazed and confused, I looked up and scanned across the Skype interface which was suddenly appearing in front of me, and picked a face to call. The video call popped into a little window, and my journey to fix a light switch began.</p>
<p>On the other end of the call was a Microsoft engineer. I could see and hear her, but she could only hear me and see exactly what I was seeing in front of me. My eyes, or the headset on my head, was relaying everything over Skype. It was a support call of sorts — here she was to help me fix a light switch. We started by pinning her little window on top of a lamp. I could then look around the room and return to the lamp to see her face. She guided me where to go. It felt strangely natural, and I didn’t need to configure anything or learn gestures other than the same “Air Tap” you use to simulate a mouse click.</p>
<p>While I was being talked through which real world tools we needed for the job, the Microsoft engineer called my attention to the wall with wiring and then started drawing where to position the light switch right on the wall. Thinking about it now it sounds totally surreal, but during the demo I didn’t even think about it — it just felt like I was being guided around with annotations and a helpful friend. We connected the wiring, tested it for an electrical current, and then turned the power back on and switched the light on. It was all fixed, and all by using a crazy combination of a headset, augmented reality, and Skype. It might sound gimmicky, but the applications here are truly impressive. I use YouTube guides to figure out home improvements or to service my car, but this is on another level. Imagine a surgeon performing complex surgery and writing notes in real time and guiding a colleague through it all. Imagine support calls to resolve a problem with your PC. If this works as well as Microsoft’s controlled demo, then this really has the ability to change how we communicate and learn.</p>
<p>Microsoft’s next demo didn’t have us using the HoloLens prototypes directly. Instead, we watched as “Nick” (nobody in Microsoft’s blue-tinted demonstration basement has last names. I asked.) manipulated objects in digital space so he could build a koala bear or a pickup truck. It was actually quite impressive, as cameras filmed him and screens showed both him and the virtual objects he was manipulating in the same space in real time.</p>
<p>The idea was to convince us that HoloLens would unleash a wave of creators who would be able to dream up 3D objects with little to no training. It’s much easier to understand what a thing is in your living room than it is in AutoCAD.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-8795" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/hololens.0.gif?resize=663%2C373&#038;ssl=1" alt="hololens.0" width="663" height="373" /></p>
<p>But sitting there after our whirlwind of actually <em>experiencing</em> HoloLens, my mind was elsewhere. For example, there are only a few ways to interact with this system so far:</p>
<ul>
<li>Glance: you point your head at something.</li>
<li>AirTap: you make a “Number 1” sign with your hand, then move your finger down like you’re depressing a lever.</li>
<li>Voice: you can issue commands, usually to switch what “tool” you’re using.</li>
<li>Mouse: neatest of all, objects you already use to interact with computers can also be used to interact with holograms.</li>
</ul>
<p>That seems like enough, but it’s not nearly enough. It’s wildly impressive that these objects really do feel like they’re out there in your living room, but it’s equally depressing to know that you can’t treat them like real objects.</p>
<p>At one point in the demo, Alex needed to put a tire on his pickup. He had to twist his body and head around to get his pointer in just the right spot and get the tire arranged just right to fit on the axle. Then, an AirTap, and the tire is connected. But how much easier would it be if you could grab the tire in your actual hands?</p>
<p>Our hands are simply more dexterous than our necks. You have finer control over small motions, and you can move your hands in so many more directions, with pressure and nuance and delicacy. Your neck and head, well, not so much.</p>
<p>But then Microsoft gave us 3D-printed koalas with a USB drive inside them, which was nice. And if this HoloLens thing takes off, you will be able to design your own, and it will be way easier than learning current 3D design software. But not as easy as it would be if you could simply build with holograms.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-8796" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/microsoft-windows-10-live-verge-_1662.0.jpg?resize=980%2C654&#038;ssl=1" alt="microsoft-windows-10-live-verge-_1662.0" width="980" height="654" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/microsoft-windows-10-live-verge-_1662.0.jpg?w=1000&amp;ssl=1 1000w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/microsoft-windows-10-live-verge-_1662.0.jpg?resize=300%2C200&amp;ssl=1 300w" sizes="auto, (max-width: 980px) 100vw, 980px" /></p>
<p><strong>Walking on Mars<br />
</strong>By Tom Warren</p>
<p>Microsoft has teamed up with NASA to let scientists explore what Curiosity sees on Mars. Instead of panoramic imagery on a computer screen, Microsoft’s demo lit up a room and turned it into Mars. I walked around the rocky terrain, bumped into the Curiosity rover, and generally just checked out a planet I will never visit in my lifetime. It’s a totally new perspective: I felt immersed in a tour of Mars without quite believing I was there. The field of view felt a little too limited to fully trick my brain into thinking I was on another planet, but what impressed me most is what Microsoft has built into this experience.</p>
<p>I held a call with a NASA engineer and he talked me through the terrain. I squatted to look more closely at rocks, took snapshots of various rock formations, and even planted flags for points of interest. My jaw dropped when I ventured over to a PC in the room and started to experiment with the mouse. I pulled the mouse pointer off the screen and suddenly it was on the floor next to me, allowing me to set markers in the virtual environment. It’s everything I’ve seen in demonstrations from Microsoft Research before, but here it was on my head and working.</p>
<p>The collaboration part was the key here, allowing me to interact with this data in a unique way, but also alongside the NASA engineer who could drop flags on the Mars terrain and guide me to look at certain sections. While this isn’t traditional productivity with a mouse and keyboard, it’s certainly something new and intriguing. I could see this type of scenario working for big teams that need to communicate across time zones and on big sets of complex data.</p>
<p>Overall, HoloLens is Microsoft at its most ambitious. It’s a big bet on the future of computing, the future of Windows, and ultimately the future of Microsoft itself. While the company is struggling in mobile, it wants to catch the next wave of computing and lead. Is HoloLens the next wave? Developers and consumers will be the ultimate test of that, but if anything HoloLens is an incredibly brave and impressive project from Microsoft. It’s true innovation, which is something Microsoft has lacked during its obsession with protecting Windows. It’s also another example of <a href="https://www.theverge.com/2014/11/6/7164623/microsoft-3d-sound-headset-guide-dogs" target="_blank" rel="noopener">an experience that takes the complex technology out of the way</a>, leaving you to experience what really matters.</p>
<p>Written by: <a href="https://www.theverge.com/users/Dieter%20Bohn" target="_blank" rel="noopener">Dieter Bohn</a> and <a href="https://www.theverge.com/users/tomwarren" target="_blank" rel="noopener">Tom Warren</a>, <a href="https://www.theverge.com/2015/1/21/7868251/microsoft-hololens-hologram-hands-on-experience" target="_blank" rel="noopener">The Verge</a> (via <a href="https://ispr.info/2015/01/22/up-close-with-the-hololens-microsofts-intriguing-mixed-reality-product/" target="_blank" rel="noopener">Presence</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2015/01/welcome-age-holographs/">Welcome to the Age of Holographs</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2015/01/welcome-age-holographs/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">8792</post-id>	</item>
		<item>
		<title>London Firm Creates Mind-Controlled Commands for Google Glass</title>
		<link>https://www.situatedresearch.com/2014/07/london-firm-creates-mind-controlled-commands-google-glass/</link>
					<comments>https://www.situatedresearch.com/2014/07/london-firm-creates-mind-controlled-commands-google-glass/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Tue, 15 Jul 2014 20:28:01 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Aesthetics]]></category>
		<category><![CDATA[Affect / Emotion]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[Communication]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Mental Models]]></category>
		<category><![CDATA[Personalization]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[User-Centered Design]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=8608</guid>

					<description><![CDATA[<p>Forget voice commands and touch gestures: A London firm has developed a way for Google Glass users to control their devices just by thinking. This Place, an agency that specializes in creating user interfaces and experiences for programs used in the medical industry, developed a software called MindRDR that allows Google Glass to connect with&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2014/07/london-firm-creates-mind-controlled-commands-google-glass/">London Firm Creates Mind-Controlled Commands for Google Glass</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Forget voice commands and touch gestures: A London firm has developed a way for Google Glass users to control their devices just by thinking.</p>
<p>This Place, an agency that specializes in creating user interfaces and experiences for programs used in the medical industry, developed software called MindRDR that allows Google Glass to connect with the NeuroSky MindWave Mobile EEG biosensor, a head-mounted device that can detect a person’s brain waves. <span id="more-8608"></span></p>
<p>EEG stands for electroencephalography, which is the measurement and recording of electrical activity in the brain. EEG biosensors have been around for decades, but until recently they were very expensive. NeuroSky is a Silicon Valley company that sells EEG biosensors, some for as little as $79.99 from Amazon.com.</p>
<p>The system works by pairing the EEG biosensor with Google’s $1,500 Glass device using Bluetooth. Once the connection has been made, the user fires up MindRDR, which takes what the EEG biosensor detects and converts it into commands that Glass can process.</p>
<p>After turning on the app, users will see a camera interface on the screen of their Google Glass. They can then pick a subject, aim their head in its direction, and concentrate on it while Glass displays a meter showing the level of their brain waves. The more intently a user focuses, the higher the meter climbs until it reaches the top, triggering Glass’ camera. By repeating the process, users can direct MindRDR to upload the photo to one of their social networks.</p>
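The trigger described here is essentially a threshold on a continuous attention signal: the meter climbs while focus is sustained and fires the camera once it reaches the top. A minimal sketch of that loop might look like the following; the function name, threshold, and hold count are invented for illustration, and the actual implementation is the one This Place published on GitHub.

```python
# Hypothetical sketch of a MindRDR-style trigger loop. The biosensor is
# assumed to report an attention level per reading; the camera fires only
# after the meter has stayed at the top for a few consecutive readings.

ATTENTION_MAX = 100  # meter is full at this level (assumed scale 0-100)
TRIGGER_HOLD = 3     # consecutive full readings needed to fire

def run_meter(readings, threshold=ATTENTION_MAX, hold=TRIGGER_HOLD):
    """Return the index of the reading at which the camera would fire, or None."""
    consecutive = 0
    for i, level in enumerate(readings):
        if level >= threshold:
            consecutive += 1
            if consecutive >= hold:
                return i  # trigger: take the photo here
        else:
            consecutive = 0  # focus lapsed; the meter resets
    return None
```

Requiring the level to be held for several readings, rather than firing on a single spike, is a common way to keep a noisy biosignal from triggering accidentally.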
<p>For now MindRDR can only be used to snap pictures, but This Place Chief Executive Dusan Hamlin said he hoped the agency would continue developing the software so that it could eventually help users overcome mobility limitations. Specifically, Hamlin said he would like MindRDR to help people who suffer from locked-in syndrome, in which a patient has lost motor control but remains aware and alert, as well as quadriplegia.</p>
<p>“The ability to be able to use their mind to make outputs to a device could be a huge thing for them,” Hamlin told the Los Angeles Times in a Skype interview.</p>
<p>But the possibilities for MindRDR extend beyond the medical field. Hamlin said he sees MindRDR as the launching point for a world where people can interact with their digital devices by simply thinking about what they want. To that end, This Place has uploaded the code for its software onto <a href="https://github.com/ThisPlace/MindRDR" target="_blank">GitHub</a>, a popular website used by developers to share code they create with others for free.</p>
<p>“What we’ve done is just scratch the surface, and we hope that we’ve inspired people to build on what we’ve started,” Hamlin said.</p>
<p>Written by: <a href="http://www.latimes.com/la-bio-salvador-rodriguez-staff.html" target="_blank">Salvador Rodriguez</a>, the <a href="http://www.latimes.com/business/technology/la-fi-tn-google-glass-mindrdr-20140711-story.html" target="_blank">Los Angeles Times</a> (via <a href="http://ispr.info/2014/07/14/mindrdr-lets-users-control-google-glass-with-their-thoughts/" target="_blank">Presence</a>); more information is available from <a href="http://mindrdr.thisplace.com/" target="_blank">This Place</a> and an article in <a href="http://www.bbc.com/news/technology-28237582" target="_blank">BBC News</a><br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2014/07/london-firm-creates-mind-controlled-commands-google-glass/">London Firm Creates Mind-Controlled Commands for Google Glass</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2014/07/london-firm-creates-mind-controlled-commands-google-glass/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">8608</post-id>	</item>
		<item>
		<title>Control VR Gloves Warp Your Fingers into Virtual Worlds</title>
		<link>https://www.situatedresearch.com/2014/06/control-vr-gloves-warp-fingers-virtual-worlds/</link>
					<comments>https://www.situatedresearch.com/2014/06/control-vr-gloves-warp-fingers-virtual-worlds/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Wed, 11 Jun 2014 19:04:51 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Communication]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=8544</guid>

					<description><![CDATA[<p>$350 device tracks your arms and hands with military-designed sensors New technologies such as Google Glass and Oculus’ Rift headset are making it easier than ever for us to get our heads into augmented and virtual realities. But while we get our heads into these alternate worlds and use our eyes to check our emails, surf&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2014/06/control-vr-gloves-warp-fingers-virtual-worlds/">Control VR Gloves Warp Your Fingers into Virtual Worlds</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>$350 device tracks your arms and hands with military-designed sensors</strong></p>
<p>New technologies such as Google Glass and Oculus’ Rift headset are making it easier than ever for us to get our heads into augmented and virtual realities. But while we get our heads into these alternate worlds and <a href="http://www.theverge.com/2013/2/22/4013406/i-used-google-glass-its-the-future-with-monthly-updates" target="_blank">use our eyes to check our emails</a>, surf the internet, even <a href="http://www.theverge.com/2014/2/5/5382524/eve-valkyrie-will-be-an-oculus-rift-exclusive" target="_blank">destroy enemy starfighters with a spiral of missiles</a>, our hands are left behind in the real world. <span id="more-8544"></span></p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="alignright size-full wp-image-8546" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/06/jpeg.jpg?resize=640%2C426&#038;ssl=1" alt="jpeg" width="640" height="426" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/06/jpeg.jpg?w=640&amp;ssl=1 640w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/06/jpeg.jpg?resize=300%2C199&amp;ssl=1 300w" sizes="auto, (max-width: 640px) 100vw, 640px" />California-based Control VR wants to change that. The company today launched a <a href="https://www.kickstarter.com/projects/controlvr/control-vr-motion-capture-for-vr-animation-and-mor" target="_blank">Kickstarter for its Control VR wearable device</a>, a glove-like system that fits over the user’s arms and shoulders and can accurately sense the precise movements of fingers before translating that motion into virtual or augmented realities. Unlike motion-sensing controllers such as <a href="http://www.theverge.com/2014/6/5/5782286/xbox-one-without-kinect-performance-boost" target="_blank">Microsoft’s Kinect</a>, the Control VR can map precise arm and finger motions without the use of an external camera.</p>
<p>Alex Sarnoff, Control VR’s co-founder and CEO, says “existing motion-sensing technology is crude, insufficient and limited by confined spaces and camera systems.” His company’s solution takes up little space and doesn’t require an external device pointed at the user. Instead, fine control is made possible by a set of tiny sensors that are placed on the user’s fingers and arms. Each of these sensors — which Sarnoff says were designed for military purposes — has three accelerometers, three gyroscopes, and three magnetometers. Data from these sensors is fed to a processor, which lets the Control VR system calculate how the wearer’s fingers are moving in relation to their body.</p>
<p><iframe loading="lazy" src="https://www.kickstarter.com/projects/controlvr/control-vr-motion-capture-for-vr-animation-and-mor/widget/video.html" width="1024" height="600" frameborder="0" scrolling="no"> </iframe></p>
<p>Sarnoff sees his company’s device first being used with video games. Control VR has already demonstrated its device being used with the Oculus Rift headset, <a href="http://youtu.be/LPszKhewSec" target="_blank">using the Rift’s Tuscany demo</a> to show how hands, arms, and fingers can be manipulated by the player. The sensors on the wearer’s elbows and fingers mean that the motions look natural on screen, appearing as one-to-one representations of their actions in the real world. In a newer demonstration, also using the Rift, a player places his hands behind his back to send an Iron Man avatar flying across treetops. He throws his hands forward, using Tony Stark’s palm-mounted thrusters to come to a hovering halt, before pointing his fingers at flying opponents and blowing them from the sky with his suit’s weapons.</p>
<p>The most recent renders of the device show it sporting a small joystick, but Sarnoff says the system will also have more humanitarian uses than aerial video game battles. Control VR will ship with an SDK that Sarnoff says will allow developers to “make the world a better place” by building software and adding functionality for the technology. “Ultimately, functional applications like remote physical therapy and virtual sign-language will be developed,” he says. Sarnoff thinks his company’s device will have a major impact in the animation, design, medical, and robotics communities — and with the party game crowd. “Imagine playing a game of beer pong in real-time,” he suggests.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-large wp-image-8547" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/06/View3-PhysicalRender.jpg?resize=980%2C419&#038;ssl=1" alt="View3-PhysicalRender" width="980" height="419" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/06/View3-PhysicalRender.jpg?resize=1024%2C438&amp;ssl=1 1024w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/06/View3-PhysicalRender.jpg?resize=300%2C128&amp;ssl=1 300w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/06/View3-PhysicalRender.jpg?w=1900&amp;ssl=1 1900w" sizes="auto, (max-width: 980px) 100vw, 980px" /></p>
<p>The company’s funding goal is set at $250,000. Those who pledge $350 — the <a href="https://www.oculusvr.com/order/" target="_blank">same price as an Oculus Rift development kit</a> — or more get their own Control VR system, in addition to its SDK and a set of tutorials that the company says makes “integration with any 3D game or application as easy as possible.” Sarnoff promises that those who do purchase a Control VR system won’t have to buy a newer version six months down the line. The $350 device is modular, meaning new features and functions can be slotted or patched in later. He mentions haptic feedback as one example that will “absolutely” be a part of future versions of Control VR, “so gamers can play with real feedback while lying on a sofa.” The company plans to get all Control VR systems out to people who pledge $350 or more by December 25th. A retail version is further out, but is expected to be ready for the mass market in 18 months.</p>
<p>Some of the world’s biggest companies have placed <a href="http://www.theverge.com/2014/3/25/5547456/facebook-buying-oculus-for-2-billion" target="_blank">big bets on virtual and augmented reality</a>, but while the visual experience is already impressive, controllers for the Oculus Rift and its contemporaries have lagged behind. Devices such as the Razer Hydra are frustrating and imprecise to use, while others <a href="http://www.theverge.com/2013/6/11/4419832/virtuix-omni-vr-hands-on-demo" target="_blank">such as the Virtuix Omni </a>require vast amounts of living room space, leading Oculus’ Palmer Luckey to <a href="http://www.theverge.com/2013/12/23/5238118/virtual-reality-check-oculus-rift-hardware-ecosystem" target="_blank">lament the lack of a top-quality input system for his company’s machine</a>. Control VR’s system certainly appears smaller and more precise than its peers, but it remains to be seen how quickly the virtual reality community will warm to it. In the meantime, the company plans to show off the system at next week’s E3 expo, offering developers the chance to get their hands, as well as their heads, into their video games.</p>
<p>Written by: <a href="http://www.theverge.com/users/richmcc" target="_blank">Rich McCormick</a>,  <a href="http://www.theverge.com/2014/6/5/5781932/control-vr-gloves-warp-your-fingers-into-virtual-worlds" target="_blank">The Verge</a> (via <a href="http://ispr.info/2014/06/10/control-vr-gloves-warp-your-fingers-into-virtual-worlds/" target="_blank">Presence</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2014/06/control-vr-gloves-warp-fingers-virtual-worlds/">Control VR Gloves Warp Your Fingers into Virtual Worlds</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2014/06/control-vr-gloves-warp-fingers-virtual-worlds/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">8544</post-id>	</item>
		<item>
		<title>Oculus Rift Will Finally Go On Sale To Consumers Next Year</title>
		<link>https://www.situatedresearch.com/2014/05/oculus-rift-will-finally-go-sale-consumers-next-year/</link>
					<comments>https://www.situatedresearch.com/2014/05/oculus-rift-will-finally-go-sale-consumers-next-year/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Thu, 01 May 2014 16:05:39 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Aesthetics]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[User-Centered Design]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=8441</guid>

					<description><![CDATA[<p>Image: A man tries the Oculus Rift headset at Facebook&#8217;s F8 conference. An Oculus Rift virtual reality headset for consumers could go on sale next year, a company representative told Business Insider at Facebook’s F8 developer conference today. Management at Oculus VR, the Irvine, Calif.-company that Facebook bought for $2 billion earlier this year, will&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2014/05/oculus-rift-will-finally-go-sale-consumers-next-year/">Oculus Rift Will Finally Go On Sale To Consumers Next Year</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p style="text-align: center;"><em>Image: A man tries the Oculus Rift headset at Facebook&#8217;s F8 conference.</em></p>
<p>An Oculus Rift virtual reality headset for consumers could go on sale next year, a company representative told Business Insider at Facebook’s F8 developer conference today.</p>
<p>Management at Oculus VR, the Irvine, Calif.-based company that Facebook bought for $2 billion earlier this year, will be “disappointed” if it doesn’t have a headset available at retail for ordinary people by 2016, according to an Oculus spokesperson. <span id="more-8441"></span></p>
<p>A consumer Oculus product in 2015 will be exciting for a couple of reasons:</p>
<ul>
<li>Almost everyone who tries the device is completely blown away by the experience. It’s completely different from any other audio-visual gadget you’ve ever tried: the worlds inside the headsets feel real and deep, because the company has gotten rid of the display “lag” that occurs when users move their heads. On top of that, the environment moves naturally as you move. In the game I tried today, I peered out into a lava-filled hellscape full of demons guarding battlements. If I leaned forward, I could see into the rivers of molten rock that flowed between them. Attendees at the conference lined up 20 deep to get 5 minutes with the device.</li>
<li>Oculus will completely turn the console game economy on its head. Once you’ve played a game inside Oculus, going back to playing on a TV just feels lame.</li>
</ul>
<p>Currently, Oculus is only selling development kits to game creators. The Oculus Rift Development Kit 2 is currently on sale for $350, and units will start shipping to developers only in July of this year. After that, Oculus VR must wait while those developers create an ecosystem of cool games — there is no point in selling the headsets to consumers if there are no games or other content for them to see. That process could take months.</p>
<p>There is no word on a price tag for consumers. The company is in the process of building a team to work on marketing and branding the product.</p>
<p>Game creation takes time, but Oculus spokesperson Redner says the current thinking is that there should be enough titles to justify consumer usage by 2016.</p>
<h3>There is a new Oculus inside a secret room in Irvine</h3>
<figure id="attachment_8443" aria-describedby="caption-attachment-8443" style="width: 300px" class="wp-caption alignright"><a href="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/05/facebook-f8-oculus-1.jpg?ssl=1"><img data-recalc-dims="1" loading="lazy" decoding="async" class="size-medium wp-image-8443" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/05/facebook-f8-oculus-1.jpg?resize=300%2C225&#038;ssl=1" alt="Oculus being demonstrated at F8." width="300" height="225" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/05/facebook-f8-oculus-1.jpg?resize=300%2C225&amp;ssl=1 300w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/05/facebook-f8-oculus-1.jpg?resize=1024%2C768&amp;ssl=1 1024w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/05/facebook-f8-oculus-1.jpg?w=1200&amp;ssl=1 1200w" sizes="auto, (max-width: 300px) 100vw, 300px" /></a><figcaption id="caption-attachment-8443" class="wp-caption-text">Oculus being demonstrated at F8.</figcaption></figure>
<p>More tantalizing still is what Oculus is hiding inside the secret “Valve Room” within its Irvine headquarters near Los Angeles. (Valve is the company that originally used the room for developing games; Oculus has taken it over.) We first heard about this from Andreessen Horowitz partner Chris Dixon, an investor in Oculus VR, who says that the version of Oculus Rift inside the “special room” is more powerful and impressive than even the existing Crystal Cove and DK2 versions that outsiders have been allowed to play with.</p>
<p>“Crystal Cove is 50% of what they are running in LA,” he says. Oculus Rift Crystal Cove is impressive, but it’s still obvious that you’re inside an animated game environment. It doesn’t yet closely approximate reality. However, “what they have in LA <em>does</em>,” Dixon tells us. “You go into a room. It’s a special room. Fancier headset. … In user testing it gets to a level of realism where almost all people feel that it’s realistic.” He gestured to the San Francisco street where we were drinking coffee. “Imagine everything you can see now, but it’s a little bit pixelated. Eventually that [pixelation] will go away.”</p>
<p>He believes Facebook CEO Mark Zuckerberg bought the company after being ushered into the Valve Room. (He obviously tried the other versions as well.)</p>
<p>The Oculus rep wasn’t quite as hyperbolic when asked about the “mythic” room. But he did tell us that the demo version inside the Valve Room does feature a photorealistic experience that is so real even people who are very sensitive to motion sickness don’t “feel” it.</p>
<p>The test unit has an entire room to itself because it requires a massive amount of processing power to run. It’s a headset tethered to a giant server, basically. Oculus expects, eventually, to be able to crunch that down into units that can be sold commercially.</p>
<p>Games will only be the start of it. Once it is commercially available, “There will be a million in the U.S. military, police, and fire services,” Dixon says. “Anything to do with training” that is dangerous will utilize an Oculus experience instead, he believes.</p>
<p>We can’t wait.</p>
<p>Written by: <a href="http://www.businessinsider.com/author/jim-edwards" target="_blank">Jim Edwards</a>, <a href="http://www.businessinsider.com/oculus-riftdate-for-sale-to-consumers-2014-4" target="_blank">Business Insider</a> (via <a href="http://ispr.info/2014/05/01/oculus-rift-coming-next-gen-version-reaches-new-level-of-realism/">Presence</a>)<br />
Images: Kyle Russell, Business Insider<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2014/05/oculus-rift-will-finally-go-sale-consumers-next-year/">Oculus Rift Will Finally Go On Sale To Consumers Next Year</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2014/05/oculus-rift-will-finally-go-sale-consumers-next-year/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">8441</post-id>	</item>
		<item>
		<title>Study Reveals Real Reason Behind Gaming Aggression</title>
		<link>https://www.situatedresearch.com/2014/04/study-reveals-real-reason-behind-gaming-aggression/</link>
					<comments>https://www.situatedresearch.com/2014/04/study-reveals-real-reason-behind-gaming-aggression/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Wed, 23 Apr 2014 15:08:33 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Affect / Emotion]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Games for Learning]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Learning]]></category>
		<category><![CDATA[Mental Models]]></category>
		<category><![CDATA[Usability Research]]></category>
		<category><![CDATA[Usability Testing]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[User-Centered Design]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=8406</guid>

					<description><![CDATA[<p>A new study has revealed that gamers are more likely to experience feelings of aggression from playing a game when it is too difficult or when the controls are too complicated to master. In comparison, the research found there was &#8220;little difference&#8221; in levels of aggression when the games themselves depicted violence. Overwhelmingly, the deciding&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2014/04/study-reveals-real-reason-behind-gaming-aggression/">Study Reveals Real Reason Behind Gaming Aggression</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p style="color: #000000;">A new study has revealed that gamers are more likely to experience feelings of aggression from playing a game when it is too difficult or when the controls are too complicated to master.</p>
<p style="color: #000000;">In comparison, the research found there was &#8220;little difference&#8221; in levels of aggression when the games themselves depicted violence. Overwhelmingly, the deciding factor was &#8220;how the volunteers were able to master the electronic game after 20 minutes of play&#8221;. <span id="more-8406"></span></p>
<div class="quoteBox">
<blockquote><p>This need to master the game was far more significant than whether the game contained violent material.&#8221;</p></blockquote>
</div>
<p style="color: #000000;">The <a style="font-weight: inherit; font-style: inherit; color: #003399;" href="http://www.ox.ac.uk/media/news_stories/2014/140408.html" target="_blank" rel="nofollow" data-ls-seen="1">study</a> was conducted by research teams from University of Oxford in the UK and the University of Rochester in the US, with the findings published in the <a style="font-weight: inherit; font-style: inherit; color: #003399;" href="http://www.apa.org/pubs/journals/psp/index.aspx" target="_blank" rel="nofollow" data-ls-seen="1"><em>Journal of Personality and Social Psychology</em></a>.</p>
<p style="color: #000000;">The experiment is believed to be the first study of its kind and consisted of six controlled lab tests involving university students. The participants played a simple puzzle game the researchers were able to manipulate, increasing its difficulty or making the control scheme less intuitive or responsive.</p>
<p style="color: #000000;">&#8220;To date, researchers have tended to explore passive aspects of gaming, such as whether looking at violent material in electronic games desensitises or aggravates players,&#8221; says Dr Andrew Przybylski, co-author of the study, from the Oxford Internet Institute. &#8220;We focused on the motives of people who play electronic games and found players have a psychological need to come out on top when playing. If players feel thwarted by the controls or the design of the game, they can wind up feeling aggressive. This need to master the game was far more significant than whether the game contained violent material. Players on games without any violent content were still feeling pretty aggressive if they hadn’t been able to master the controls or progress through the levels at the end of the session.&#8221;</p>
<div class="quoteBox">
<blockquote><p>If the structure of a game or the design of the controls thwarts enjoyment, it is this, not the violent content, that seems to drive feelings of aggression.&#8221;</p></blockquote>
</div>
<p style="color: #000000;">In addition to the lab tests, researchers conducted a survey of over 300 players, focusing on the three games they had played most in the last month. Players were asked which they had enjoyed the most, and why. Again, the research demonstrated that some players experienced aggression when they didn&#8217;t feel good at the game. Furthermore, these feelings of aggression had even spoiled their level of enjoyment.</p>
<p style="color: #000000;">&#8220;The study is not saying that violent content doesn&#8217;t affect gamers,&#8221; says co-author Richard M Ryan, from the University of Rochester. &#8220;But our research suggests that people are not drawn to playing violent games in order to feel aggressive. Rather, the aggression stems from feeling not in control or incompetent while playing. If the structure of a game or the design of the controls thwarts enjoyment, it is this, not the violent content, that seems to drive feelings of aggression.&#8221;</p>
<p style="color: #000000;">Written by: <a href="http://www.ign.com/articles/2014/04/08/study-reveals-real-reason-behind-gaming-aggression">Daniel Krupa, IGN UK</a><br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2014/04/study-reveals-real-reason-behind-gaming-aggression/">Study Reveals Real Reason Behind Gaming Aggression</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2014/04/study-reveals-real-reason-behind-gaming-aggression/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">8406</post-id>	</item>
		<item>
		<title>This Video Game Knows When You’re Scared–And Gets Scarier</title>
		<link>https://www.situatedresearch.com/2014/02/video-game-knows-youre-scared-gets-scarier/</link>
					<comments>https://www.situatedresearch.com/2014/02/video-game-knows-youre-scared-gets-scarier/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Tue, 18 Feb 2014 16:45:47 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Affect / Emotion]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Learning]]></category>
		<category><![CDATA[Personalization]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User-Centered Design]]></category>
		<guid isPermaLink="false">http://www.www.situatedresearch.com/?p=5650</guid>

					<description><![CDATA[<p>The director behind the innovative video game Nevermind tells us why biofeedback is the new frontier in gaming. In the future, horror games will know when you’re scared. And then they’ll get scarier. Proof: the currently-in-development horror-adventure game Nevermind, which just launched a Kickstarter campaign last week. The game pairs classic first-person exploration with biofeedback&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2014/02/video-game-knows-youre-scared-gets-scarier/">This Video Game Knows When You’re Scared–And Gets Scarier</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>The director behind the innovative video game <i>Nevermind</i> tells us why biofeedback is the new frontier in gaming.</p>
<p>In the future, horror games will know when you’re scared. And then they’ll get scarier.</p>
<p>Proof: the currently-in-development horror-adventure game <i>Nevermind</i>, which just launched a <a href="https://www.kickstarter.com/projects/reynoldsphobia/nevermind-a-biofeedback-horror-adventure-game">Kickstarter campaign</a> last week. The game pairs classic first-person exploration with biofeedback data from a heart rate monitor in order to tell when you’re scared and <a href="http://www.fastcocreate.com/3022308/this-horrifying-video-game-knows-when-youre-afraid">turn up the horror</a>.<span id="more-5650"></span></p>
<p>“In <i>Nevermind</i>, you get scared, you get stressed, and the world will punish you for giving in to those feelings,” says creative director <a href="http://www.fastcompany.com/person/erin-reynolds" target="_blank">Erin Reynolds</a>, “But it rewards you for calming down by becoming easier.”</p>
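<p>The biofeedback loop Reynolds describes can be imagined as a simple mapping from heart-rate readings to game intensity. The sketch below is purely illustrative; the function names, thresholds, and linear scaling are assumptions for explanation, not the game&#8217;s actual code:</p>

```python
# Hypothetical sketch of a biofeedback difficulty loop like the one
# Nevermind describes: stress (here, an elevated heart rate) raises the
# horror intensity, while calming down eases the game. All names and
# thresholds are illustrative, not taken from the actual game.

def stress_level(heart_rate_bpm, resting_bpm=70, max_bpm=120):
    """Map a heart-rate reading to a 0..1 stress score, clamped at both ends."""
    span = max_bpm - resting_bpm
    return min(max((heart_rate_bpm - resting_bpm) / span, 0.0), 1.0)

def horror_intensity(stress, base=0.3, gain=0.7):
    """Scale game intensity with stress: calm players see an easier world."""
    return base + gain * stress

# A calm player (72 bpm) gets a gentler game than a frightened one (110 bpm).
calm = horror_intensity(stress_level(72))
scared = horror_intensity(stress_level(110))
assert calm < scared
```

<p>In a real engine this mapping would run every frame against a smoothed sensor stream, but the core idea (a monotonic map from physiological signal to difficulty) is the same.</p>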
<p><iframe loading="lazy" src="//player.vimeo.com/video/85923375?title=0&amp;byline=0&amp;portrait=0" width="640" height="360" frameborder="0" allowfullscreen="allowfullscreen"></iframe></p>
<p>While biofeedback seems like a perfect fit for the horror genre, Reynolds believes that the technology is key to moving the video game medium forward as a whole, allowing for an entirely new level of immersion.</p>
<p>“I think it really speaks to the potential of games being able to know more about you than you know about yourself, and having this intimate response to your internal reactions,” Reynolds says.</p>
<p>That internal response surprised her during playtesting, as it illuminated “just how personal one’s sense of horror is. It made for some design challenges, because it means you need to have something for everything so that everyone’s buttons get pushed.”</p>
<p>But those challenges also served as the ultimate affirmation for Reynolds: She was scaring people.</p>
<p>That’s a good indication that <i>Nevermind</i> may be a successful game and not just a neat tech demo. Reynolds has ambitious goals for the game and hopes that it will move the medium forward as a proof of concept in both <a href="http://www.gamasutra.com/blogs/ErinReynolds/20131029/203265/Quit_Playing_Games_with_My_Heart_Biofeedback_in_Gaming.php">biofeedback integration</a> and as an example of a positive game that reinforces stress management skills that have real-world applications.</p>
<p>Because achieving those goals with a video game is all for naught if the game is not fun, states game developer Lat Ware in a feature on <a href="http://www.gamasutra.com/view/news/203252/Biofeedback_and_video_games_What_does_the_future_have_in_store.php">Gamasutra</a>:</p>
<blockquote><p>“The best practice in making biofeedback games is also the best practice for game development in general: Make it fun,” he adds. “Fun is the only thing that matters in a game. Fun is what makes people love your game. Fun is what makes people come back to play again. Fun is what makes people buy your next game without asking questions.”</p></blockquote>
<p>“That’s why I’m really excited about <i>Nevermind</i>,” says Reynolds. “It creates this experience that is fun but can also empower the player.”</p>
<p>Written by: <a href="http://www.fastcolabs.com/user/joshua-rivera">Joshua Rivera</a>, Fast Company’s <a href="http://www.fastcolabs.com/3026458/this-video-game-knows-when-youre-scared-and-gets-scarier">Co.LABS</a> (via <a href="http://ispr.info/2014/02/17/new-level-of-immersion-video-game-knows-when-youre-scared-and-gets-scarier/">Presence</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2014/02/video-game-knows-youre-scared-gets-scarier/">This Video Game Knows When You’re Scared–And Gets Scarier</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2014/02/video-game-knows-youre-scared-gets-scarier/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">5650</post-id>	</item>
		<item>
		<title>IBM Forecasts Major Advances in Cognitive Computing</title>
		<link>https://www.situatedresearch.com/2013/12/ibm-forecasts-major-advances-cognitive-computing/</link>
					<comments>https://www.situatedresearch.com/2013/12/ibm-forecasts-major-advances-cognitive-computing/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Fri, 27 Dec 2013 16:59:58 +0000</pubDate>
				<category><![CDATA[Collaboration]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[Games for Learning]]></category>
		<category><![CDATA[Usability Research]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=5532</guid>

					<description><![CDATA[<p>IBM on Tuesday released its annual &#8220;5 in 5&#8221; list of predictions about technological innovations that will change the way we live in the next five years, with the theme this year being cognitive advances in computing that help machines &#8220;learn&#8221; how to better serve us.  Last year&#8217;s 5 in 5 list also focused on&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2013/12/ibm-forecasts-major-advances-cognitive-computing/">IBM Forecasts Major Advances in Cognitive Computing</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>IBM on Tuesday released its annual &#8220;5 in 5&#8221; list of predictions about technological innovations that will change the way we live in the next five years, with the theme this year being cognitive advances in computing that help machines &#8220;learn&#8221; how to better serve us. <span id="more-5532"></span></p>
<p>Last year&#8217;s 5 in 5 list also focused on the <a href="http://www.pcmag.com/article2/0,2817,2413300,00.asp" data-ls-seen="1">rise of cognition in computing</a> and how the five senses humans use to gain information about and manipulate the physical world are being emulated by computing systems like IBM&#8217;s own Watson artificial intelligence framework.</p>
<p>For this year&#8217;s edition, IBM got a little more specific about the ways that such advances in machine learning will affect us, touching more on data analytics and offering up the following predictions:</p>
<p><b>The classroom will learn you:</b> Kerrie Holley of IBM described this as a concept &#8220;built on a lot of the technologies you see with how the Khan Academy works, cloud-based computing, and the like.&#8221; In the years to come, new learning technologies will use advanced analytics of &#8220;longitudinal student records&#8221; to help teachers better assess what individual students need, which ones are at risk, and how to help them in their education, he said.</p>
<p><iframe loading="lazy" frameborder="0" allowfullscreen src="//www.youtube.com/embed/hTA5GyWamR0" width="650" height="390"></iframe></p>
<p><b>Buying local will beat online.</b> Less about a specific tech advance, this prediction is based on the idea that the &#8220;tables will turn&#8221; in terms of access to the kind of technology, cloud services, and analytics that can help &#8220;mom and pop&#8221; businesses compete more readily with big national and global retailers, Holley said. &#8220;Technology costs are dropping and as they do, proximity will allow local retailers to create experiences the big retailers are not able to do online.&#8221;</p>
<p><iframe loading="lazy" frameborder="0" allowfullscreen src="//www.youtube.com/embed/yKNSOwLcrkE" width="650" height="390"></iframe></p>
<p><b>Doctors will use your DNA to keep you well.</b> IBM presented this prediction as one involving more advanced computational work than some of the others in its 5-in-5 list. &#8220;Cognitive-based systems like Watson, along with breakthroughs in genomic research, will enable doctors to be better able to diagnose cancer and offer better treatments,&#8221; Holley said.</p>
<p><iframe loading="lazy" frameborder="0" allowfullscreen src="//www.youtube.com/embed/0M1DMdc1mQ0" width="650" height="390"></iframe></p>
<p><b>The city will help you live in it.</b> In just a few decades, as many as seven out of 10 people around the world will live in cities, according to some projections. We&#8217;re already seeing more computational resources being dedicated to helping those city dwellers manage their urban lives and that will only accelerate, according to IBM.</p>
<p><iframe loading="lazy" frameborder="0" allowfullscreen src="//www.youtube.com/embed/tVGviMIMjN0" width="650" height="390"></iframe></p>
<p><b>A digital guardian will protect you online.</b> Holley explained this prediction as an expansion on financial fraud protection services offered by banks and credit card companies, only much more personally tailored to individuals to safeguard their entire digital lives.</p>
<p><iframe loading="lazy" frameborder="0" allowfullscreen src="//www.youtube.com/embed/al8ng82nRss" width="650" height="390"></iframe></p>
<p>&#8220;This year&#8217;s IBM 5 in 5 explores the idea that everything will learn—driven by a new era of cognitive systems where machines will learn, reason and engage with us in a more natural and personalized way. These innovations are beginning to emerge, enabled by cloud computing, big data analytics, and learning technologies all coming together,&#8221; the research team behind the company&#8217;s annual list of predictions said in a statement.</p>
<p>&#8220;Over time these computers will get smarter and more customized through interactions with data, devices, and people, helping us take on what may have been seen as unsolvable problems by using all the information that surrounds us and bringing the right insight or suggestion to our fingertips right when it&#8217;s most needed. A new era in computing will lead to breakthroughs that will amplify human abilities, assist us in making good choices, look out for us, and help us navigate our world in powerful new ways.&#8221;</p>
<p>Written by: <a href="http://www.pcmag.com/author-bio/damon-poeter">Damon Poeter</a>, <a href="http://www.pcmag.com/article2/0,2817,2428432,00.asp">PC Mag</a><br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2013/12/ibm-forecasts-major-advances-cognitive-computing/">IBM Forecasts Major Advances in Cognitive Computing</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2013/12/ibm-forecasts-major-advances-cognitive-computing/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">5532</post-id>	</item>
		<item>
		<title>Crossing the UI Rift with Oculus</title>
		<link>https://www.situatedresearch.com/2013/09/crossing-ui-rift-oculus/</link>
					<comments>https://www.situatedresearch.com/2013/09/crossing-ui-rift-oculus/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Tue, 24 Sep 2013 13:50:00 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Communication]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Mental Models]]></category>
		<category><![CDATA[Personalization]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=5381</guid>

					<description><![CDATA[<p>Virtual reality opens the doors to a new era for user interface design. Oculus VR speaks to Develop about its opportunities Virtual reality doesn’t present user interface design with its first opportunity for transformation. The dawn of 3D long ago afforded games makers the prospect of moving beyond flat heads-up-displays and conventional menus. And when&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2013/09/crossing-ui-rift-oculus/">Crossing the UI Rift with Oculus</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><b>Virtual reality opens the doors to a new era for user interface design. Oculus VR speaks to Develop about its opportunities</b></p>
<p>Virtual reality doesn’t present user interface design with its first opportunity for transformation.</p>
<p>The dawn of 3D long ago afforded games makers the prospect of moving beyond flat heads-up-displays and conventional menus. And when mobile gaming finally realised its potential with the arrival of smartphones, those charged with implementing UI had a chance to establish the new standards of the virtual gamepad. <span id="more-5381"></span></p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="alignright size-full wp-image-5383" style="margin: 6px 10px;" alt="184_2253_Hawken_01_240" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/184_2253_Hawken_01_240.jpg?resize=184%2C184&#038;ssl=1" width="184" height="184" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/184_2253_Hawken_01_240.jpg?w=184&amp;ssl=1 184w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/184_2253_Hawken_01_240.jpg?resize=150%2C150&amp;ssl=1 150w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/184_2253_Hawken_01_240.jpg?resize=90%2C90&amp;ssl=1 90w" sizes="auto, (max-width: 184px) 100vw, 184px" />Neither presented an easy transition, and for many releases orthodox menu design remained the best option. Mobile games makers wrestled with how to balance UI button placement with the problem of players’ fingers obstructing the screen, while those crafting 3D console games faced the difficult task of rewriting the broader grammar of user interfaces.</p>
<p>Those are challenges yet to be fully answered, and now – predominantly thanks to the arrival of Oculus Rift – a vast new opportunity is here with the advent of a new kind of VR. Of course, virtual reality has lurked in the badlands of the technology industries for decades, but with Oculus Rift, for the first time it looks set to make its way to consumers in a meaningful way. The technology has arrived and it’s already won the support of big names, such as John Carmack. So it is VR games that will walk hand-in-hand with Oculus Rift as it saunters into the public’s collective consciousness.</p>
<h3>CHANGE IS ON THE MENU</h3>
<p>The fact is that VR games need a new type of UI, as they are delivered in a new way. The challenge is similar to that faced by film and game makers as the recent wave of new stereoscopic 3D technology arrived in homes and cinemas. And equally, it is also a substantial opportunity.</p>
<p>Two games industry professionals that know that opportunity are Oculus VR’s CEO Brendan Iribe and vice president of product Nate Mitchell. Both previously served at Scaleform, the provider of the industry’s most prolific UI middleware, which was eventually picked up by tools giant Autodesk.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="alignleft size-full wp-image-5384" style="margin: 6px 10px;" alt="7049_Nate_Mitchell_Oculus_VR_240" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/7049_Nate_Mitchell_Oculus_VR_240.jpg?resize=240%2C240&#038;ssl=1" width="240" height="240" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/7049_Nate_Mitchell_Oculus_VR_240.jpg?w=240&amp;ssl=1 240w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/7049_Nate_Mitchell_Oculus_VR_240.jpg?resize=150%2C150&amp;ssl=1 150w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/7049_Nate_Mitchell_Oculus_VR_240.jpg?resize=90%2C90&amp;ssl=1 90w" sizes="auto, (max-width: 240px) 100vw, 240px" />“I think one of the most impressive things about VR is just the ability to have a sense of presence in a gameworld; that feeling that you are really there,” says Mitchell as he begins to explain how UI may evolve with Oculus Rift.</p>
<p>“Until now players have seen HUDs and flat menus, and I think that the opportunity with virtual reality here is evident already in the best early VR UI today, where they are really taking cues from the real world, or perhaps the dreams of the future.”</p>
<p>According to Mitchell, games makers only need to look to Google and Microsoft’s work with augmented reality, and the UI already being conceived for unusual hardware like the spectacle-shaped Glass. Those developments, says Mitchell, are serving as an inspiration to the first wave of games developers building titles for Oculus Rift.</p>
<p>“Developers are starting to look at UI in VR spaces in terms of how they would want to interact with that world in real life, and take information from objects in that world,” Mitchell adds.</p>
<h3>VIRTUALLY REAL</h3>
<p>Virtual reality is, of course, defined by its ambition of simulating real life, which is exactly where developers should look for inspiration when designing UI for Oculus games, suggests Iribe. There is arguably no UI in real life, but still, we garner information from the world around us through signage, digital displays and the devices that let us interface with our environment, and games developers now have the opportunity to reflect that in games.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="alignright size-full wp-image-5385" style="margin: 6px 10px;" alt="7048_Brendan_Iribe_Oculus_VR_219" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/7048_Brendan_Iribe_Oculus_VR_219.jpg?resize=219%2C219&#038;ssl=1" width="219" height="219" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/7048_Brendan_Iribe_Oculus_VR_219.jpg?w=219&amp;ssl=1 219w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/7048_Brendan_Iribe_Oculus_VR_219.jpg?resize=150%2C150&amp;ssl=1 150w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/7048_Brendan_Iribe_Oculus_VR_219.jpg?resize=90%2C90&amp;ssl=1 90w" sizes="auto, (max-width: 219px) 100vw, 219px" />“When you suddenly have VR available to you, you quickly have so much more area to play with,” explains Iribe. “You’re no longer confined to a small 2D screen anymore, and you also have a number of new mechanics to work with.</p>
<p>“It’s no longer about just a keyboard and mouse or gamepad; there’s this new headtracking mechanic available. You really do have to think about all of that when you design a user interface, and suddenly you have to think about how you would interact with the user interface if it existed in the real world.”</p>
<p>Suddenly, viewed through an Oculus Rift headset, 2D HUD elements may seem intrusive. Surely it is better to glance down at your gun to check its ammo than to look to an arbitrary sign floating in your field of view?</p>
<p>It’s something many console developers have already tried implementing in traditional 3D console games, and titles like Dead Space were among the first to use the technique with particular grace. And with VR, as players can really move their head to look down, those techniques can be used with even greater effect.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-5386" alt="7050_Eve_Valkyrie_01_430" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/7050_Eve_Valkyrie_01_430.jpg?resize=430%2C206&#038;ssl=1" width="430" height="206" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/7050_Eve_Valkyrie_01_430.jpg?w=430&amp;ssl=1 430w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/7050_Eve_Valkyrie_01_430.jpg?resize=300%2C143&amp;ssl=1 300w" sizes="auto, (max-width: 430px) 100vw, 430px" /></p>
<p>But not all UI can be placed on in-game objects; targeting reticles and destination markers must exist in the game volume, while certain elements such as player instruction text and subtitles may best serve their purpose as 2D assets.</p>
<p>But should they lock to the players’ head movements? Or should they instead affix themselves to a point in the gameworld? Does the screen edge even have meaning in a VR environment? What will feel natural and comfortable to glance at when the head and neck are serving as an input? The questions are endless, but sources of inspiration are plentiful, and one in particular is likely to have much appeal for games developers interested in the tech.</p>
<p>“Think of how people have imagined UI in science fiction, which is somewhere Hollywood has done a great job,” offers Iribe. “That is now where many people are starting to look for inspiration, and those kind of ideas are how a lot of people are going to realise UI in VR games.”</p>
<h3>A NEW GRAMMAR</h3>
<p>The challenge for conceiving how UI should work in a VR space is a more academic matter. Developers are going to do more than just learn new methods, for they must liberate themselves from the previous rules of UI design. It’s not an easy task, but an essential one, argues Mitchell.</p>
<p>“I think the biggest challenge at the onset will just be for developers to break down their preconceptions and get away from the conventions of user interface up until now,” he says.</p>
<p>“They need to begin again with what works for VR. I’ve already talked to a number of developers at game studios that are already doing that. A lot of UI and UX designers are now sitting down with a Rift for the first time, and they are integrating it with their work very quickly and jumping in, and realising the challenge and opportunity of user interfaces in virtual reality.”</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-5387" alt="7051_Eve_Valkyrie_07_430" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/7051_Eve_Valkyrie_07_430.jpg?resize=430%2C206&#038;ssl=1" width="430" height="206" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/7051_Eve_Valkyrie_07_430.jpg?w=430&amp;ssl=1 430w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/7051_Eve_Valkyrie_07_430.jpg?resize=300%2C143&amp;ssl=1 300w" sizes="auto, (max-width: 430px) 100vw, 430px" /></p>
<p>Of course, if those designers are successful in integrating a new kind of UI into a virtual reality, as they strive to ape the ‘user interface of real life’, some speculate UI as a distinct discipline may disappear, at least in part. After all, if in-game objects alone can communicate needed information to the player, then surely UI is just another task for those conceiving the likes of weapons and gameplay items.</p>
<p>“I don’t think we’ll see UI disappear, but it will become far more integrated into a gameworld,” offers Mitchell on the matter.</p>
<p>“What we’re already seeing is that a lot of developers are pursuing that type of experience, because it is more immersive, and more effective in not seeing the player be taken out of a game experience to use interfaces. Rather than have those ammo-counts floating in space, and being reminded you’re in a game, with VR UI a player can keep moving through the world, just glancing down in-game to see their remaining ammunition.”</p>
<h3>DEEPER THOUGHT</h3>
<p>Player immersion is one of game design’s ultimate goals. If a consumer strapped into a Rift forgets they are playing a game, the developer behind that game has perhaps triumphed. And that is what VR offers UI designers: a chance to take player engagement to a new level.</p>
<p>In fact, according to Iribe, VR provides what UI as a discipline has been longing for: a place where user interfaces can escape the arbitrary forms that have for decades constrained the potential for communicating with the player.</p>
<p>“VR is absolutely an opportunity for UI design, and really what user interfaces have been waiting for,” claims Iribe.</p>
<p>“VR takes UI to a whole new level, which is just super-exciting for games developers, and especially UI developers. They’re now able, to me at least, to realise UI in more compelling and innovative ways. UI is no longer a 2D element; it’s not going to just be about pieces of text and icons anymore.</p>
<p>“UI, and menu screens in particular, was never something especially exciting for the user. Now it can be a very exciting part of the game. In VR, UI can be the CG graphics we’ve [seen] in Minority Report and District 9; that’s now something we’ll see in games.”</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-5388" alt="7052_Minority_Report_VR_screen_430" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/7052_Minority_Report_VR_screen_430.jpg?resize=430%2C180&#038;ssl=1" width="430" height="180" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/7052_Minority_Report_VR_screen_430.jpg?w=430&amp;ssl=1 430w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/7052_Minority_Report_VR_screen_430.jpg?resize=300%2C125&amp;ssl=1 300w" sizes="auto, (max-width: 430px) 100vw, 430px" /></p>
<p>Such optimism is hard not to warm to, but as team sizes, budgets and time available across the industry continue to feel the force of a changing sector, it’s important to consider how realistic it might be for teams to embrace UI’s new VR era.</p>
<p>Fortunately, most VR games present 3D worlds technically similar to those that have long defined traditional console games, and as such, most existing UI middleware is already fit for purpose.</p>
<p>“There’s a great middleware technology called Scaleform,” says Brendan with a smile.</p>
<p>“With Scaleform, and other technologies like it, you can map UI onto 3D surfaces and project it out into the 3D world, which will be very useful for realising UI in VR games.</p>
<p>“Those kinds of abilities have only become more popular as people continue to use them, and now you’ll see that go to a whole new level. The functionality is already there in a number of UI systems &#8211; although not all of them, as many still focus on 2D overlays. Certainly with Scaleform, developers are already able to do a lot of fun stuff, such as texture mapping onto 3D objects, which will be perfect for UI in VR.”</p>
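<p>The texture-mapping approach described above, drawing the interface onto a surface placed in the 3D world rather than as a flat screen overlay, essentially boils down to orienting a textured quad toward the viewer each frame. As a minimal, engine-agnostic sketch (the function name <code>billboard_basis</code> and the right-handed math conventions here are illustrative assumptions, not Scaleform&#8217;s actual API):</p>

```python
import math

def normalize(v):
    """Scale a 3-vector to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cross(a, b):
    """Right-handed cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def billboard_basis(panel_pos, camera_pos, world_up=(0.0, 1.0, 0.0)):
    """Orthonormal (right, up, forward) basis orienting a world-space
    UI panel at panel_pos so its face points at the camera.
    Degenerate if the view direction is parallel to world_up."""
    forward = normalize(tuple(c - p for p, c in zip(panel_pos, camera_pos)))
    right = normalize(cross(forward, world_up))
    up = cross(right, forward)
    return right, up, forward

# A UI panel floating 5 units in front of a camera at the origin:
right, up, forward = billboard_basis((0.0, 0.0, 5.0), (0.0, 0.0, 0.0))
```

<p>An engine would build the panel&#8217;s model matrix from these three axes and then render the UI texture onto the resulting quad, so the interface lives in the scene rather than on the lens.</p>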
<p>Games development is never easy as such, but relative to the task of building interactive realms, and with the necessary tools already available and familiar, implementing innovative UI in VR looks set to be well within most studios’ technical abilities.</p>
<p>With the development tools in place, and VR finally ready for consumers and developers, user interface design looks set for another new transformation. There are indeed challenges, but those are vastly outweighed by the new opportunities, and who knows?</p>
<p>We may even see the traditional game menu disappear from games altogether. Exactly what direction UI design will take in VR is now down to the developers, and that is what makes it such a fascinating field.</p>
<p>Written by: Will Freeman, <a href="http://www.develop-online.net/features/2031/Crossing-the-UI-Rift" target="_blank">Develop</a> (via <a href="http://ispr.info/2013/09/20/rift-and-other-vr-opens-the-doors-to-a-new-era-for-user-interface-design/">Presence</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2013/09/crossing-ui-rift-oculus/">Crossing the UI Rift with Oculus</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2013/09/crossing-ui-rift-oculus/feed/</wfw:commentRss>
			<slash:comments>1</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">5381</post-id>	</item>
	</channel>
</rss>
