<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Simulations Archives - Situated Research</title>
	<atom:link href="https://www.situatedresearch.com/category/gaming/simulations/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.situatedresearch.com/category/gaming/simulations/</link>
	<description>Usability Research and User Experience Testing</description>
	<lastBuildDate>Mon, 22 Nov 2021 17:33:24 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	

<image>
	<url>https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2021/03/cropped-icon.png?fit=32%2C32&#038;ssl=1</url>
	<title>Simulations Archives - Situated Research</title>
	<link>https://www.situatedresearch.com/category/gaming/simulations/</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">122538981</site>	<item>
		<title>Nintendo’s newest Mario Kart is the best video game you never knew you wanted to play</title>
		<link>https://www.situatedresearch.com/2020/09/nintendos-newest-mario-kart-is-the-best-video-game-you-never-knew-you-wanted-to-play/</link>
					<comments>https://www.situatedresearch.com/2020/09/nintendos-newest-mario-kart-is-the-best-video-game-you-never-knew-you-wanted-to-play/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Wed, 09 Sep 2020 14:22:15 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Mobile Devices]]></category>
		<category><![CDATA[User Experience]]></category>
		<guid isPermaLink="false">https://www.situatedresearch.com/?p=10126</guid>

					<description><![CDATA[<p>By now, Nintendo has made exactly 87,493,029 versions of Mario Kart since the game was first introduced in 1992 for the Super Nintendo. (Okay, the company has really made 13—which is still a lot!) But a new sequel coming this fall to the Nintendo Switch changes the formula in an enticing way, thanks to super&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2020/09/nintendos-newest-mario-kart-is-the-best-video-game-you-never-knew-you-wanted-to-play/">Nintendo’s newest Mario Kart is the best video game you never knew you wanted to play</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<div style="width: 980px;" class="wp-video"><video class="wp-video-shortcode" id="video-10126-1" width="980" height="550" loop autoplay preload="metadata" controls="controls"><source type="video/mp4" src="https://cdn.situatedresearch.com/wp-content/uploads/2020/09/p-2-90546982-the-new-mario-kart-proves-nintendoand8217s-low-key-design-genius.mp4?_=1" /><source type="video/webm" src="https://cdn.situatedresearch.com/wp-content/uploads/2020/09/p-2-90546982-the-new-mario-kart-proves-nintendoand8217s-low-key-design-genius.webm?_=1" /><a href="https://cdn.situatedresearch.com/wp-content/uploads/2020/09/p-2-90546982-the-new-mario-kart-proves-nintendoand8217s-low-key-design-genius.mp4">https://www.situatedresearch.com/wp-content/uploads/2020/09/p-2-90546982-the-new-mario-kart-proves-nintendoand8217s-low-key-design-genius.mp4</a></video></div>
<p>By now, Nintendo has made exactly 87,493,029 versions of Mario Kart since the game was first introduced in 1992 for the Super Nintendo. (Okay, the company has really made 13—which is still a lot!) But a new sequel coming this fall to the Nintendo Switch changes the formula in an enticing way, thanks to super experimental UX. <span id="more-10126"></span></p>
<p><em>Mario Kart Live: Home Circuit</em> transforms the Nintendo Switch into a controller for an actual toy race kart. The kart is fitted with a camera, giving the player a first-person view as it whizzes around your living room, bedroom, or wherever you have some open floor space to play.</p>
<figure class="video-wrapper"><iframe title="Mario Kart Live: Home Circuit - Announcement Trailer - Nintendo Switch" src="https://www.youtube.com/embed/f2mCqUSDCJE?feature=oembed" width="720" height="480" frameborder="0" allowfullscreen="allowfullscreen"><span data-mce-type="bookmark" style="display: inline-block; width: 0px; overflow: hidden; line-height: 0;" class="mce_SELRES_start">﻿</span></iframe></figure>
<p>How does the game build your course? You place a few gates that are bundled with the game on the floor. From there, how the exact setup and customization works is unclear (perhaps vision AI is involved?), but Nintendo—alongside its partner developer <a href="https://www.velanstudios.com/" target="_blank" rel="noopener noreferrer">Velan Studios</a>—demonstrates that one of several tracks, from a simple oval to complicated curves, can be set up to avoid existing couches, coffee tables, and perhaps even sleeping cats.</p>
<figure class="wp-caption alignnone image-wrapper" aria-describedby="caption-attachment-90547236"><figcaption id="caption-attachment-90547236" class="wp-caption-text"><div style="width: 596px;" class="wp-video"><video class="wp-video-shortcode" id="video-10126-2" width="596" height="334" loop autoplay preload="metadata" controls="controls"><source type="video/mp4" src="https://cdn.situatedresearch.com/wp-content/uploads/2020/09/p-1-90546982-the-new-mario-kart-proves-nintendoand8217s-low-key-design-genius.mp4?_=2" /><source type="video/webm" src="https://cdn.situatedresearch.com/wp-content/uploads/2020/09/p-1-90546982-the-new-mario-kart-proves-nintendoand8217s-low-key-design-genius.webm?_=2" /><a href="https://cdn.situatedresearch.com/wp-content/uploads/2020/09/p-1-90546982-the-new-mario-kart-proves-nintendoand8217s-low-key-design-genius.mp4">https://www.situatedresearch.com/wp-content/uploads/2020/09/p-1-90546982-the-new-mario-kart-proves-nintendoand8217s-low-key-design-genius.mp4</a></video></div>
[Image: Nintendo]
</figcaption></figure>
<p>As you race your kart around the course, all sorts of augmented reality (AR) effects, ranging from glowing boundaries, to power ups, to your racing competitors, will appear on the screen, as if they exist in your actual home. If you run over a virtual item, like a nitro-boosting mushroom, the kart will actually accelerate. If you hit a troublesome banana peel, the car will actually lose some control. Oh, and assuming you have friends with their own games, up to four players can race their karts together in the same space.</p>
<figure class="wp-caption image-wrapper alignnone" aria-describedby="caption-attachment-90547239"><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-10130" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2020/09/i-1-90546982-the-new-mario-kart-proves-nintendoand8217s-low-key-design-genius.jpg?resize=596%2C335&#038;ssl=1" alt="" width="596" height="335" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2020/09/i-1-90546982-the-new-mario-kart-proves-nintendoand8217s-low-key-design-genius.jpg?w=596&amp;ssl=1 596w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2020/09/i-1-90546982-the-new-mario-kart-proves-nintendoand8217s-low-key-design-genius.jpg?resize=300%2C169&amp;ssl=1 300w" sizes="auto, (max-width: 596px) 100vw, 596px" /></figure>
<figure class="wp-caption image-wrapper alignnone" aria-describedby="caption-attachment-90547239"><figcaption id="caption-attachment-90547239" class="wp-caption-text">[Image: Nintendo]</figcaption></figure>
<p>With few exceptions, augmented reality has been little more than a gimmick. Snapchat’s zany face filters are still the most successful commercialization of this technology that, not so long ago, the tech world heralded as the next big thing.</p>
<p>Microsoft’s Hololens AR headset is technically impressive, but it’s being marketed as an enterprise tool to businesses (which demonstrates pretty clearly that it’s not ready for the mainstream just yet). The hyped company Magic Leap, with billions in venture capital from investors like Google, has done little more than release a developer version of its headset to mediocre reviews while it hangs on for life. The hardware is simply too expensive, too bulky, but, most of all, too useless to really be worth buying for a vast majority of people. Plus, it’s antisocial by nature to be experiencing a different version of reality than the people around you.</p>
<figure class="wp-caption alignnone image-wrapper" aria-describedby="caption-attachment-90547241"><figcaption id="caption-attachment-90547241" class="wp-caption-text"><div style="width: 596px;" class="wp-video"><video class="wp-video-shortcode" id="video-10126-3" width="596" height="334" loop autoplay preload="metadata" controls="controls"><source type="video/mp4" src="https://cdn.situatedresearch.com/wp-content/uploads/2020/09/i-4-90546982-the-new-mario-kart-proves-nintendoand8217s-low-key-design-genius.mp4?_=3" /><source type="video/webm" src="https://cdn.situatedresearch.com/wp-content/uploads/2020/09/i-4-90546982-the-new-mario-kart-proves-nintendoand8217s-low-key-design-genius.webm?_=3" /><a href="https://cdn.situatedresearch.com/wp-content/uploads/2020/09/i-4-90546982-the-new-mario-kart-proves-nintendoand8217s-low-key-design-genius.mp4">https://www.situatedresearch.com/wp-content/uploads/2020/09/i-4-90546982-the-new-mario-kart-proves-nintendoand8217s-low-key-design-genius.mp4</a></video></div>
[Image: Nintendo]
</figcaption></figure>
<p>But Nintendo is doing what it does best. It’s figuring out how to transform a gimmick into shared fun—and make it halfway affordable, too. A lot of that comes down to Nintendo just understanding the ergonomics around technology and play. For years, AR demos had you hold up your phone like a little window to peek through, to do something like transform <a href="https://www.youtube.com/watch?v=r5ziOSjXdo4" target="_blank" rel="noopener noreferrer">a magazine cover into an animation</a>. These novelties wore thin quickly because they’re more physically awkward than visually amazing.</p>
<p>Nintendo is taking a similar approach here to its predecessors. But instead of utilizing the camera in your phone, it’s built it into the kart. That allows you to play a game like you always do (sitting on your couch), but experience all of these enticing and additive effects of AR. No, Nintendo isn’t aiming as high as Magic Leap, teasing an entire world of digital objects that you can reach out and touch. But Nintendo is competent enough at game design that it’s figured out how to work with what it has to create an AR experience that’s both new and destined to be massively successful.</p>
<p><em>Mario Kart Live: Home Circuit</em> will be out for $100 on October 16. The last version of Mario Kart has sold <a href="https://www.gamereactor.eu/25-million-mario-kart-8-deluxe-copies-sold/" target="_blank" rel="noopener noreferrer">more than 25 million copies</a> to date. And if <em>Home Circuit</em> is only a fraction as successful, it will still be one of the most profitable demonstrations of AR ever built.</p>
<p>Written by: <a href="https://www.fastcompany.com/user/mark-wilson" target="_blank" rel="noopener noreferrer">Mark Wilson</a>, <a href="https://www.fastcompany.com/90546982/nintendos-newest-mario-kart-is-the-best-video-game-you-never-knew-you-wanted-to-play" target="_blank" rel="noopener noreferrer">Fast Company</a><br />
Posted by: <a href="https://www.situatedresearch.com/">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2020/09/nintendos-newest-mario-kart-is-the-best-video-game-you-never-knew-you-wanted-to-play/">Nintendo’s newest Mario Kart is the best video game you never knew you wanted to play</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2020/09/nintendos-newest-mario-kart-is-the-best-video-game-you-never-knew-you-wanted-to-play/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		<enclosure url="https://cdn.situatedresearch.com/wp-content/uploads/2020/09/p-2-90546982-the-new-mario-kart-proves-nintendoand8217s-low-key-design-genius.webm" length="897423" type="video/webm" />
<enclosure url="https://cdn.situatedresearch.com/wp-content/uploads/2020/09/p-1-90546982-the-new-mario-kart-proves-nintendoand8217s-low-key-design-genius.webm" length="884575" type="video/webm" />
<enclosure url="https://cdn.situatedresearch.com/wp-content/uploads/2020/09/i-4-90546982-the-new-mario-kart-proves-nintendoand8217s-low-key-design-genius.webm" length="374989" type="video/webm" />
<enclosure url="https://cdn.situatedresearch.com/wp-content/uploads/2020/09/p-2-90546982-the-new-mario-kart-proves-nintendoand8217s-low-key-design-genius.mp4" length="1616122" type="video/mp4" />
<enclosure url="https://cdn.situatedresearch.com/wp-content/uploads/2020/09/p-1-90546982-the-new-mario-kart-proves-nintendoand8217s-low-key-design-genius.mp4" length="1116213" type="video/mp4" />
<enclosure url="https://cdn.situatedresearch.com/wp-content/uploads/2020/09/i-4-90546982-the-new-mario-kart-proves-nintendoand8217s-low-key-design-genius.mp4" length="474880" type="video/mp4" />

		<post-id xmlns="com-wordpress:feed-additions:1">10126</post-id>	</item>
		<item>
		<title>Games User Research: Driving Development with Actionable Insights</title>
		<link>https://www.situatedresearch.com/2018/11/games-user-research-driving-development-with-actionable-insights/</link>
					<comments>https://www.situatedresearch.com/2018/11/games-user-research-driving-development-with-actionable-insights/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Wed, 28 Nov 2018 17:00:23 +0000</pubDate>
				<category><![CDATA[Collaboration]]></category>
		<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[Communication]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Games for Learning]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Usability Research]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User Interface]]></category>
		<guid isPermaLink="false">https://www.situatedresearch.com/?p=9777</guid>

					<description><![CDATA[<p>Developers both large and small can benefit from the outside perspective provided by games user research, or usability research geared towards games. Indie developers can benefit from adding UX expertise to the development team, while large developers can obtain an outside perspective to complement and verify findings from internal members of the development team.&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2018/11/games-user-research-driving-development-with-actionable-insights/">Games User Research: Driving Development with Actionable Insights</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Developers both large and small can benefit from the outside perspective provided by games user research, or usability research geared towards games. Indie developers can benefit from adding UX expertise to the development team, while large developers can obtain an outside perspective to complement and verify findings from internal members of the development team. In this article, we will present three key ways in which games user research can maximize a game’s success. <span id="more-9777"></span></p>
<h2>Measuring Engagement</h2>
<p>Prior research has shown the importance of engagement in game play. Creating a sense of flow, a state in which players are so immersed in game play that they lose track of their surroundings, has a huge effect on players’ perceptions of a game.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-9779" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/sean-do-782269-unsplash.jpg?resize=980%2C653&#038;ssl=1" alt="" width="980" height="653" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/sean-do-782269-unsplash.jpg?w=1280&amp;ssl=1 1280w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/sean-do-782269-unsplash.jpg?resize=300%2C200&amp;ssl=1 300w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/sean-do-782269-unsplash.jpg?resize=768%2C512&amp;ssl=1 768w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/sean-do-782269-unsplash.jpg?resize=1024%2C682&amp;ssl=1 1024w" sizes="auto, (max-width: 980px) 100vw, 980px" /></p>
<p>Games user research, when properly done, incorporates behavioral psychology to observe players’ actions during gameplay. This yields insight into engagement, which is sustained by a steady increase in difficulty over time (to challenge players’ abilities) and by a great story line that immerses players.</p>
<h2>Measuring Player Communication</h2>
<p>Besides the obvious task of watching players interact with the game interface, the observation of player-to-player communication can yield great insight into game play. Team-based activities, or even collaborative game play, can help researchers observe players’ strategies. In MMOGs, players might communicate through text or voice inside the game environment, and classic games might have players communicate via their proximity to one another.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-9780" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/kamil-s-738521-unsplash.jpg?resize=980%2C653&#038;ssl=1" alt="" width="980" height="653" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/kamil-s-738521-unsplash.jpg?w=1280&amp;ssl=1 1280w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/kamil-s-738521-unsplash.jpg?resize=300%2C200&amp;ssl=1 300w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/kamil-s-738521-unsplash.jpg?resize=768%2C512&amp;ssl=1 768w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/kamil-s-738521-unsplash.jpg?resize=1024%2C682&amp;ssl=1 1024w" sizes="auto, (max-width: 980px) 100vw, 980px" /></p>
<p>Player communication yields great insight into how players learn a game and how they develop strategies to win it. Good user research should use methods in which players are neither coaxed nor guided by researchers and feel as if they are in a natural environment, so that their activity while playing is not biased. Rigorous game research methods can use these factors to their advantage, producing findings that are more accurate than those of traditional deductive, hypothesis-driven studies.</p>
<h2>Affordances of the User Interface</h2>
<p>While the broader experience of game play needs to be measured to gauge the overall player experience, examining the affordances of the user interface is a useful task to see what players perceive as possible actions in the game. These perceptions provide game players a foundation for creating strategies within the game. All aspects of the interface that can be interacted with, as well as those that gamers perceive as actionable, should be observed to inform game design. These perceived actions within a game suggest to gamers their possibilities for both playing and winning the game.</p>
<p>Often, critical actions are overlooked by gamers. In line with theories of learning, a scaffolded difficulty structure should be used to create a feeling of flow. Game research can provide useful insight into the ways players make use of a game interface, leading to modifications (via a nudge, animation, tutorial, etc.) that give salience to particular actions and allow players to learn, progress, and create engaging game play.</p>
<h2>Conclusion</h2>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="half alignright wp-image-9781" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/nikita-kachanovsky-428386-unsplash.jpg?resize=306%2C512&#038;ssl=1" alt="" width="306" height="512" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/nikita-kachanovsky-428386-unsplash.jpg?w=611&amp;ssl=1 611w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/nikita-kachanovsky-428386-unsplash.jpg?resize=179%2C300&amp;ssl=1 179w" sizes="auto, (max-width: 306px) 100vw, 306px" /></p>
<p>Many current trends in game design are leading to amazing new games: VR / AR (virtual / augmented reality), graphics approaching lifelike detail, and engaging online multiplayer experiences. However, many classic games offer players an engaging experience without advanced graphics, relying on a basic story, simple gameplay, and a scaffolded difficulty structure. Game developers of all sizes can maximize engagement by using games user research to find the right mix of these features.</p>
<p>Good usability, afforded by the game’s user interface, helps players develop strategies for playing and winning games. Flow, where players lose track of their surroundings while immersed in game play, can be achieved with the right mix of engaging gameplay, player communication, and a scaffolded difficulty structure in which players learn and accomplish tasks in the game.</p>
<h3>About the Author</h3>
<p><em>Matthew Sharritt, Ph.D., President and Co-founder of Situated Research, specializes in user-experience (UX) research and usability testing within software and video games. Dr. Sharritt’s research focuses on collaborative learning during playtesting and exploration, yielding insights in how to construct games that flow with engaging gameplay and collaborative interaction. The Situated Research team has provided independent expertise to the game industry across a variety of research projects. Learn more at </em><a href="https://www.situgames.com"><em>https://www.situgames.com</em></a><em>.</em></p>
<p>The post <a href="https://www.situatedresearch.com/2018/11/games-user-research-driving-development-with-actionable-insights/">Games User Research: Driving Development with Actionable Insights</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2018/11/games-user-research-driving-development-with-actionable-insights/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">9777</post-id>	</item>
		<item>
		<title>Road to GDC: I’m Not A Doctor, but I Simulate One in VR</title>
		<link>https://www.situatedresearch.com/2018/03/road-gdc-im-not-doctor-simulate-one-vr/</link>
					<comments>https://www.situatedresearch.com/2018/03/road-gdc-im-not-doctor-simulate-one-vr/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Fri, 02 Mar 2018 17:20:01 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[Games for Learning]]></category>
		<category><![CDATA[Health Care]]></category>
		<category><![CDATA[Learning]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">https://www.situatedresearch.com/?p=9703</guid>

					<description><![CDATA[<p>We are moving into a future where games train our doctors, monitor our health, and treat our illnesses.&#160; The sky is falling! Social media is the new scapegoat of the month. Headlines claim it is ruining our relationships, dismantling our society, destroying our very lives! In particular, the most frequent victims are presumed to be&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2018/03/road-gdc-im-not-doctor-simulate-one-vr/">Road to GDC: I’m Not A Doctor, but I Simulate One in VR</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>We are moving into a future where games train our doctors, monitor our health, and treat our illnesses.&nbsp;<span id="more-9703"></span></p>
<p>The sky is falling! Social media is the new scapegoat of the month. Headlines claim it is ruining our relationships, dismantling our society, destroying our very lives! In particular, the most frequent victims are presumed to be teenagers. Sometimes the accused culprit is not social media, but the phones that make it so accessible. Is it true? Only time will tell &#8230; but in the &#8217;50s, the demon was comic books; in the &#8217;60s, rock and roll; and in the &#8217;80s, video games. My mother was convinced that my love of comic books and science fiction was going to rot my brain. Now, of course, these things are mainstream and no longer the sole domain of teens. But there’s always a new thing for people to worry about or blame for the decline and fall of civilization.</p>
<p>I’m particularly sensitized to that criticism of video games. I designed and programmed my first computer game in college in 1976 &#8211; in fact, inspired by that very love of science fiction I had as a child. When I graduated in 1980, my first job out of college was entering the then-infant video game industry. I’ve never left. So when pundits blamed games for destroying society, even causing teen violence and rebellion, I took it personally. I’ve always felt that video games can be magical, marvelous entertainment. I hoped that one day they’d be seen as not just safe, but actually good for us. That day is finally here.</p>
<h3>Virtual treatment, real results</h3>
<p>For many years now, researchers and doctors have gradually built up solid scientifically verified evidence that existing games can improve the lives of the people who play them. At the same time, increasing numbers of games have been created with the idea of ‘boosting health’ as a direct goal.</p>
<p>Fast action games like Call of Duty have been found to improve visual perception and the ability to make correct decisions quickly. Other research has shown promise in using a game to treat the underlying causes of&nbsp;<a href="https://www.polygon.com/2014/2/24/5439884/this-game-knows-how-scared-you-are-but-could-be-used-to-heal-trauma" target="_blank" rel="noopener">depression</a>. It’s possible that games may be able to diagnose the onset of degenerative diseases like Alzheimer’s and Parkinson’s, and perhaps even slow their progression.</p>
<p>Games have shown promise in the realm of physical fitness, too. Starting 20 years ago, the arcade game Dance Dance Revolution was credited with getting a lot of passive couch potatoes up, moving, and losing weight, and it’s still spawning sequels. Games on mobile phones like&nbsp;<i>Zombies, Run!</i> and&nbsp;<i>Pokémon Go</i>&nbsp;have encouraged players to get out and move in the world, and many track their exercise and calorie expenditure as they do so. VR holds promise here too, with the chance to get your exercise by racing the Tour de France on your exercise bike, or by flying like a bird. There are even current ventures bringing gameplay to gym class and possibly making dodgeball fun even for nerds!</p>
<h3>Doctors with joysticks</h3>
<p>It turns out that doctors in training, like most people these days, are often avid game players. That has presented a great opportunity to use games as part of their medical education. Although games have yet to replace classes, they’ve been shown to help laparoscopic surgeons reduce errors by 37 percent while increasing their speed by 27 percent when used as warm-up exercises. When you consider that athletes, musicians, dancers, and others who need to do precision work with their muscles all limber up before their tasks, it makes sense that the right kind of practice helps surgeons, too.</p>
<p>Other companies are rushing to use VR to train anesthesiologists or to give caregivers a first-hand sense of how their patients with macular degeneration see the world. The VR simulations aren’t all games, but the vast majority of VR engineers are coming from the games industry.</p>
<h3>Prescribing play</h3>
<p>Perhaps the most exciting application of games in the modern world is the way doctors are using them to treat their patients. Realistic war games have helped soldiers recover from PTSD by simulating the experiences that trigger their problem, gradually desensitizing them to reduce their symptoms long term. Other games have been used in similar ways in conjunction with therapy to treat&nbsp;<a href="https://www.polygon.com/features/2017/4/7/15205366/vr-danger-close" target="_blank" rel="noopener">phobias</a>&nbsp;like fear of heights, flying, and spiders. And currently, virtual reality games have shown great promise in relieving acute pain, reducing or even eliminating the need for narcotics when changing the dressings on burn victims. VR is also showing promise in helping stroke victims recover control over their movement, and in&nbsp;<a href="https://www.polygon.com/2014/3/3/5462508/phantom-pain-video-game-treatment" target="_blank" rel="noopener">relieving the perception of pain in “phantom limbs”</a> experienced by amputation patients.</p>
<p>Last September saw the FDA approval of a mobile phone app to be used (in conjunction with therapy) to treat addiction. The developers call their app a “Prescription Digital Therapeutic” and, although it’s not a game, it’s a big step to have software approved to treat something as serious as substance use disorder.</p>
<p>But a real game designed to be an active treatment for ADHD (Attention Deficit Hyperactivity Disorder) was not far behind. By December, the FDA gave preliminary clearance to a video game made by a team consisting of both game developers and neuroscientists from UCSF. In a large controlled trial of children and teens diagnosed with ADHD, the group who used the game showed significant improvement compared to a control group. The team hopes that soon it will become the first game to win FDA approval on the same terms as a prescription drug. In style, the game is part racing game, part Pokémon Snap, but with many unique twists to improve attention and focus.</p>
<p>We are moving into a future where games train our doctors, monitor our health, and treat our illnesses. It may seem a bit outrageous now, but if comic books led me into a career making video games and have become the basis of mainstream movies, why can’t video games inspire the next generation of doctors and become the basis of medical treatment? Video games are intimately connected to learning, attention, and the brain. It isn’t an accident that they are also proving to be useful to our mental and physical health. Maybe they’ll even be able to reverse my dreaded comic book brain rot!</p>
<p><i>This is part of a&nbsp;<a href="https://www.rollingstone.com/gdc" target="_blank" rel="noopener">series of columns</a>&nbsp;written by developers speaking at the Game Developers Conference in March.</i></p>
<p><i>Noah Falstein is a freelance game designer and producer, and was one of the first 10 employees at LucasArts Entertainment and Dreamworks Interactive. Last year he left Google after serving four years as their Chief Game Designer.</i></p>
<p>Written by: Noah Falstein, via <a href="https://www.rollingstone.com/glixel/features/road-to-gdc-im-not-a-doctor-but-i-simulate-one-in-vr-w517154" target="_blank" rel="noopener">Rolling Stone</a><br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2018/03/road-gdc-im-not-doctor-simulate-one-vr/">Road to GDC: I’m Not A Doctor, but I Simulate One in VR</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2018/03/road-gdc-im-not-doctor-simulate-one-vr/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">9703</post-id>	</item>
		<item>
		<title>Next Big Thing for Virtual Reality: Lasers in Your Eyes</title>
		<link>https://www.situatedresearch.com/2016/05/next-big-thing-virtual-reality-lasers-eyes/</link>
					<comments>https://www.situatedresearch.com/2016/05/next-big-thing-virtual-reality-lasers-eyes/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Tue, 03 May 2016 21:29:30 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[heads-up-display]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Mobile Devices]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=9341</guid>

					<description><![CDATA[<p>San Francisco – The next big leap for virtual and augmented reality headsets is likely to be eye-tracking, where headset-mounted laser beams aimed at eyeballs turn your peepers into a mouse.  A number of startups are working on this tech, with an aim to convince VR gear manufacturers such as Oculus Rift and HTC Vive&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2016/05/next-big-thing-virtual-reality-lasers-eyes/">Next Big Thing for Virtual Reality: Lasers in Your Eyes</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>San Francisco – The next big leap for virtual and augmented reality headsets is likely to be eye-tracking, where headset-mounted laser beams aimed at eyeballs turn your peepers into a mouse. <span id="more-9341"></span></p>
<p>A number of startups are working on this tech, with an aim to convince VR gear manufacturers such as Oculus Rift and HTC Vive to incorporate the feature in a next generation device. They include SMI, Percept, Eyematic, Fove and Eyefluence, which recently allowed USA Today to demo its eye-tracking tech.</p>
<p>“Eye-tracking is almost guaranteed to be in second-generation VR headsets,” says Will Mason, cofounder of virtual reality media company UploadVR. “It’s an incredibly important piece of the VR puzzle.”</p>
<p><iframe loading="lazy" title="USATODAY-Embed Player" width="850" height="480" frameborder="0" scrolling="no" allowfullscreen="true" marginheight="0" marginwidth="0" src="https://uw-media.usatoday.com/embed/video/82420346?placement=snow-embed"></iframe></p>
<p>At present, making selections in VR or AR environments typically involves moving the head so that your gaze lands on a clickable icon, and then either pressing a handheld remote or, in the case of Microsoft’s HoloLens or Meta 2, reaching out with your hand to make a selection by interacting with a hologram.</p>
<p>As shown in Eyefluence’s demonstration, all of that is accomplished by simply casting your eyes on a given icon and then activating it with another glance.</p>
<p>“The idea here is that anything you do with your finger on a smartphone you can do with your eyes in VR or AR,” says Eyefluence CEO Jim Marggraff, who cofounded the Milpitas, Calif-based company in 2013 with another entrepreneur, David Stiehr.</p>
<p>“Computers made a big leap when they went from punchcards to a keyboard, and then another from a keyboard to a mouse,” says Marggraff, who invented the kid-focused LeapFrog LeapPad device. “We want to again change the way we interface with data.”</p>
<h2>Eye Tech Not Due for Years</h2>
<p>As exciting as this may sound, the mainstreaming of eye-tracking technology is still a ways off. Eyefluence execs say that although they are in discussions with a variety of headset makers, their tech isn’t likely to debut until 2017. Other companies remain largely in R&amp;D mode, and Fove has a waitlist for its headset’s Kickstarter campaign.</p>
<p>The challenges for eye-tracking are both technological and financial. Creating hardware that consistently locks onto an infinite variety of eyeballs presents one hurdle, while doing so with gear that is light and consumes little power is another.</p>
<p>And while a number of companies in the space have managed to land funding – Eyefluence has raised $21.6 million in two rounds led by Intel Capital and Motorola Solutions – some tech-centric VCs are sitting on the sidelines while they wait for the technology to mature and for headset makers to make their moves.</p>
<p>“What eye-tracking will do will be powerful, but I’m not sure how valuable it will be from an investment standpoint,” says Kobie Fuller of Accel Partners. “Is there a multi-billion-dollar eye-tracking company out there? I don’t know.”</p>
<p>Among the unknowns: whether the tech will be disseminated through a licensed model or if existing headset companies will develop it on their own.</p>
<p>Still, once deployed eye-tracking has the potential to revolutionize the VR and AR experience, Fuller expects.</p>
<p>Specifically, eye-tracking will “greatly enhance interpersonal connections” in VR, he says, by applying realistic eye movements to avatars.</p>
<p>Facebook founder Mark Zuckerberg, who presciently bought Oculus for $2 billion, is banking on VR taking social interactions to a new level.</p>
<p>“The most exciting thing about eye-tracking is getting rid of that ‘uncanny valley’ (where disbelief sets in) when it comes to interacting through avatars,” says Fuller.</p>
<h2>Less Computing Power</h2>
<p>There are a few other ways in which successful eye-tracking tech could revolutionize AR and VR beyond just making such worlds easy to navigate without joysticks, remotes or hand gestures.</p>
<p>First, by tracking the eyes, such tech can telegraph to the VR device’s graphics processing unit, or GPU, that it needs to render only the images where the eyes are looking at that moment.</p>
<p>That means less computing power would be needed. Currently, a $700 Oculus headset requires a powerful computer to render its images. Oculus’s developer kit with a suitable computer costs $2,000. “If you can save on rendering power, that could significantly lower the barrier to entry into this market for consumers,” says UploadVR’s Mason.</p>
<p>And second, by not just tracking the eyeball but also potentially analyzing a person’s mood and logging in details about their gaze, AR/VR headsets are in a position to deliver targeted content as well as give third-party observers insights into the wearer’s state of mind and situational awareness.</p>
<h2>Police Use</h2>
<p>The former use case would appeal to in-VR advertisers, while the latter would come in handy for first responders.</p>
<p>“Police and paramedics are looking for an eyes-up, hands-free paradigm, and eye-tracking can bring that,” says Paul Steinberg, chief technology officer at Motorola Solutions, an investor in Eyefluence.</p>
<p>Steinberg sketches out a scene from what could be the near future.</p>
<p>A police officer on patrol has suddenly unholstered his gun. Via his augmented reality glasses with eye-tracking, colleagues at headquarters are instantly fed information about his stress level through pupil dilation information.</p>
<p>They can then both advise the officer through a radio as well as activate body cameras and other tech that he might have neglected to turn on in his stressed state. What’s more, another officer on the scene can instantly scan through a variety of command center video and data feeds through an AR headset, flipping through the options by simply looking at each one.</p>
<p>“We would have to work with our (first responder) customers to train them how to use this sort of tech of course, but the potential is there,” says Steinberg. “But we’re not months away, we’re more than that.”</p>
<h2>Demo Shows Off Ease of Use</h2>
<p>An Eyefluence demo indicates that eye-tracking technology isn’t a half-baked dream.</p>
<p>Navigating between a dozen tiles inside a first-generation Oculus headset proves as easy as shifting your gaze between them. Making selections – the equivalent of clicking on a mouse – is equally intuitive. At no time does the head need to move, and hands remain at your side.</p>
<p><iframe loading="lazy" src="https://www.youtube.com/embed/iQsY3uLvYQ4" width="720" height="384" frameborder="0" allowfullscreen="allowfullscreen"></iframe></p>
<p>After about 10 minutes in the demo, it feels antiquated to pop on a VR headset and grab a remote to click through choices selected with head movements.</p>
<p>Marggraff says Eyefluence’s technical challenges included making technology that could respond in low and bright light, accounting for different size pupils and ensuring that power consumption is minimal.</p>
<p>But, he adds, his team remains convinced of the inevitability of its product: “Just like when we started tapping and swiping on our phones, we’re going to eventually need a better interface for AR and VR.”</p>
<p>Written by: <a href="http://www.usatoday.com/staff/1005/marco-della-cava/" target="_blank" rel="noopener">Marco della Cava</a>, <a href="http://www.usatoday.com/story/tech/news/2016/05/02/new-mouse-vr-could-your-eyes/83716986/" target="_blank" rel="noopener">USA Today</a> (via <a href="http://ispr.info/2016/05/03/next-big-thing-for-virtual-reality-eye-tracking-lasers-in-your-eyes/" target="_blank" rel="noopener">Presence</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2016/05/next-big-thing-virtual-reality-lasers-eyes/">Next Big Thing for Virtual Reality: Lasers in Your Eyes</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2016/05/next-big-thing-virtual-reality-lasers-eyes/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">9341</post-id>	</item>
		<item>
		<title>The Climb: The Most Head-Spinning Virtual Reality Experience Yet</title>
		<link>https://www.situatedresearch.com/2015/12/the-climb-the-most-head-spinning-virtual-reality-experience-yet/</link>
					<comments>https://www.situatedresearch.com/2015/12/the-climb-the-most-head-spinning-virtual-reality-experience-yet/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Wed, 30 Dec 2015 17:34:42 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=9230</guid>

					<description><![CDATA[<p>Crytek’s new project for the Oculus Rift shows us exactly where VR gaming is going – towards heady and experiential gameplay Above you, the craggy face of the cliff seems to stretch up endlessly toward the sky, offering perilously few footholds. In the far distance there’s a small village by a beach, bathed in orange&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2015/12/the-climb-the-most-head-spinning-virtual-reality-experience-yet/">The Climb: The Most Head-Spinning Virtual Reality Experience Yet</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>Crytek’s new project for the Oculus Rift shows us exactly where VR gaming is going – towards heady and experiential gameplay</strong></p>
<p>Above you, the craggy face of the cliff seems to stretch up endlessly toward the sky, offering perilously few footholds. In the far distance there’s a small village by a beach, bathed in orange sunshine – an exotic idyll. But below you there is &#8230; nothing. Nothing but a long deadly drop into the crashing sea far below. Your only option is to keep climbing. <span id="more-9230"></span></p>
<p>Crytek has always been interested in pushing graphics technology. In the mid-2000s, the Frankfurt-based developer and publisher achieved wide acclaim for its visually spectacular first-person shooters Far Cry and Crysis; although several years old, both are still widely used as a benchmark for near photo-realism in games, especially in terms of environmental detail. With its steamy tropical rain forests, Far Cry presented a lush counterpoint to the genre’s obsession with steel grey interiors.</p>
<p>But the company’s latest project is perhaps its most ambitious attempt to bring immersive naturalism to game worlds. The Climb is a virtual reality climbing simulator, which gives the player the chance to attempt a series of tricky ascents on rock faces based around the world. “We started out by working on the mechanics of virtual reality,” says executive producer Elijah Freeman, who started as an artist at Crytek 15 years ago. “When we were prototyping, climbing just stood out for us – it was almost instantaneously fun.”</p>
<p>Exclusive to the forthcoming Oculus Rift headset, The Climb uses the technology’s motion controllers – or an Xbox One pad – to give the player control over their hands, which are the only body part displayed on screen. While the shoulder buttons can be used to grip with either your left or right hand, you need to use the motion sensors in the headset to physically look at the next finger hold that you want to grab for – this then forces the onscreen limbs to move in that direction. It takes a few minutes to get used to. At first your arms flail uselessly short of the required hold, and when you do get the direction right, you may mistime the gripping mechanism, sending your climber into the abyss.</p>
<p>But as I found during my demo session, based on Vietnam’s Halong Bay, the interface gradually becomes intuitive. You learn to scan the rockface for available holds, and you learn to plan ahead, choosing the correct arm to lurch up with based on its position relative to your body and the location of the next hold. Eventually you build up a rhythm where you start to scamper up the cliff like Spider-Man on his summer hols.</p>
<p>Although there are arrows on each climbing surface to point out a general direction, Freeman says there will be multiple routes available on tougher climbs, allowing players to learn and finesse their favourites – an asynchronous multiplayer mode lets you compare route times with friends. Adding an extra layer of authenticity, you also have to watch a grip meter in the corner of the screen: if it gets too low, you need to chalk up your hands, or your fingers will start slipping from the ledges. It’s a small addition, but it just keeps that sense of physicality – that sense that you’re actually out there dangling from a 200ft cliff.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter wp-image-9232 size-full" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/12/1805.jpg?resize=980%2C586&#038;ssl=1" alt="1805" width="980" height="586" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/12/1805.jpg?w=1225&amp;ssl=1 1225w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/12/1805.jpg?resize=300%2C180&amp;ssl=1 300w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/12/1805.jpg?resize=768%2C460&amp;ssl=1 768w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/12/1805.jpg?resize=1024%2C613&amp;ssl=1 1024w" sizes="auto, (max-width: 980px) 100vw, 980px" /><em>Fancy the sensation of dangling from a 200ft cliff? Then The Climb is for you. Photograph: Crytek</em></p>
<p>And the game does like to mess with your natural fear of heights. There are sections where you’re required to climb downwards, edging over the lip of a rocky outcrop so that all you can see below you is that beckoning expanse of water. The action is so measured and precise it’s unlikely anyone is going to suffer motion sickness, but vertigo is quite another thing. Basically, if you have to look away during all of Tom Cruise’s free climbing sequences in the Mission Impossible films, this is going to be an interesting challenge.</p>
<p>Then there’s the jumping. Some holds are just too far away to just reach for, so you need to hit a button and launch yourself across. I found this the most challenging and frustrating part of the demo – simply because the timing required to hit the grip button and hang on to the target ledge requires a level of precision which is tough to achieve when you’re combining inexact motion controls with a joypad input. When I did make it, it seemed more by luck than judgement. Fortunately, there are various save points on each climb – disguised as pitons obviously – so unlike in the real sport, you get to have another go if you plunge to a watery grave.</p>
<p>The experience is a fascinating glimpse at the strengths of VR as a medium. While beautiful environmental details look impressive on a 2D display, they feel truly fascinating and immersive in VR – each time you reach a new cliff top and get to survey the scenery, it feels like a genuine reward, rather than a mere pretty backdrop. “Visual fidelity is an important part of extrapolating on presence,” says Freeman. “You need to feel like you’re there. Crytek has spent a lot of time on getting our CryEngine to render at these fidelities so it was a natural fit for us.”</p>
<p>Because of this extended sense of “being there” the actual physical input doesn’t have to be so frenzied and demanding. If this were a traditional simulation on a 2D display, it’s likely every button on the controller would be employed; there would be balance, wind and directional gauges. Here, because the sensory input is more complex, the developers don’t need to add such complexity to the interface.</p>
<p>Indeed, the minimal controls seem to work perfectly well – the sense of actually being on the rock face augments the sense of achievement and challenge. We’ve heard a lot from VR developers over the past two years that early titles using the Oculus, HTC Vive or PlayStation VR headsets are likely to be experiential rather than narrative in focus – they’ll be about inhabiting and exploring a defined space, rather than following some sort of epic story. The Climb confirms this. It may not function as a “realistic” climbing simulation (you can’t see or use your feet for example), but it is certainly about being somewhere and experiencing the traversal of a precarious space.</p>
<p>“The amount of fun is balanced very finely with the sense of risk and excitement,” says Freeman. “I like the idea of presenting experiences like this to the player. This is just the initial pass to get people introduced to VR, but you can definitely see where this medium is going.”</p>
<p>Written by: <a href="http://www.theguardian.com/technology/2015/dec/24/the-climb-virtual-reality-oculus-rift-crytex" target="_blank">Keith Stuart, the Guardian</a> (via <a href="http://ispr.info/2015/12/28/the-climb-head-spinning-experiential-gameplay-provides-insights-about-presence/" target="_blank">Presence</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2015/12/the-climb-the-most-head-spinning-virtual-reality-experience-yet/">The Climb: The Most Head-Spinning Virtual Reality Experience Yet</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2015/12/the-climb-the-most-head-spinning-virtual-reality-experience-yet/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">9230</post-id>	</item>
		<item>
		<title>Hands-on with Mattel’s new AR, VR View-Master</title>
		<link>https://www.situatedresearch.com/2015/02/hands-on-with-mattels-new-ar-vr-view-master/</link>
					<comments>https://www.situatedresearch.com/2015/02/hands-on-with-mattels-new-ar-vr-view-master/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Fri, 20 Feb 2015 15:54:37 +0000</pubDate>
				<category><![CDATA[Education]]></category>
		<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Aesthetics]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Games for Learning]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Mobile Devices]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=8819</guid>

					<description><![CDATA[<p>A View-Master for virtual reality: Hands-on with Mattel&#8217;s new AR, VR phone toy Mattel is relaunching View-Master, but as a virtual reality and augmented-reality phone toy. And I got to play around with it for a bit…or at least, some of the tech behind it.  Announced at an event in New York City, the new&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2015/02/hands-on-with-mattels-new-ar-vr-view-master/">Hands-on with Mattel’s new AR, VR View-Master</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>A View-Master for virtual reality: Hands-on with Mattel&#8217;s new AR, VR phone toy</strong></p>
<p><span style="line-height: 1.5;">Mattel is relaunching View-Master, but as a virtual reality and augmented-reality phone toy. And I got to play around with it for a bit…or at least, some of the tech behind it. </span><span id="more-8819"></span></p>
<p>Announced at an event in New York City, <a href="http://www.cnet.com/news/google-mattel-announce-a-virtual-reality-view-master/" target="_blank">the new View-Master</a> is a collaboration between Mattel and Google, whose virtual reality Cardboard app has enabled cheap do-it-yourself accessories to turn any Android phone into a mini-VR viewer. Mattel’s plastic toy, which will debut in October, is like a more durable, plastic version of <a href="http://www.cnet.com/news/googles-cardboard-vr-headset-is-no-joke-its-great-for-the-oculus-rift/" target="_blank">Google Cardboard</a>, designed entirely for kids…or, maybe, also for grown-up kids like me. And the most brilliant part is it’ll only cost $30.</p>
<p><iframe loading="lazy" src="http://www.cnet.com/videos/share/id/tUlXVC5TlPLbcmd7Lo7cfkU6k0P1Edow/" width="960" height="540" frameborder="0" seamless="seamless" scrolling="no" allowfullscreen="allowfullscreen"></iframe></p>
<p>I used View-Master back when I was little — who didn’t? It’s a classic 3D stereoscopic picture viewer. Many people had even said Google Cardboard looked a bit like a View-Master. So it isn’t a huge surprise that Mattel has suddenly announced a new View-Master with Google Cardboard VR capabilities added. I’ve always felt that virtual reality reminded me of early stereoscopic toys. And Mattel has keyed onto the same idea.</p>
<figure id="attachment_8821" aria-describedby="caption-attachment-8821" style="width: 770px" class="wp-caption aligncenter"><img data-recalc-dims="1" loading="lazy" decoding="async" class="size-full wp-image-8821" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster1.jpg?resize=770%2C577&#038;ssl=1" alt="The View-Master will fit most phones, according to Mattel: iPhone and Android alike." width="770" height="577" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster1.jpg?w=770&amp;ssl=1 770w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster1.jpg?resize=300%2C225&amp;ssl=1 300w" sizes="auto, (max-width: 770px) 100vw, 770px" /><figcaption id="caption-attachment-8821" class="wp-caption-text">The View-Master will fit most phones, according to Mattel: iPhone and Android alike.</figcaption></figure>
<p>The toy was only viewable in a mock-up prototype form at Mattel’s event, but the design’s pretty cool: it looks half old-school View-Master, half Oculus Rift. The inner plastic housing extends to hold many types of phones: Mattel says it’s designed to fit the largest existing phones, and will even work with the <a href="http://www.cnet.com/products/apple-iphone-6-plus/" target="_blank">iPhone 6 Plus</a> and <a href="http://www.cnet.com/products/google-nexus-6/" target="_blank">Nexus 6</a>. A capacitive-touch side lever is used to “click” through scenes or into virtual environments, like the magnetized side switch on Google’s Cardboard viewers.</p>
<p>Mattel’s headset is designed with Google and Android in mind, but at launch is intended to work on “nearly all platforms,” which includes iOS. That would mean a dedicated Mattel app which interfaces with the View-Master, but Google’s Cardboard and Cardboard-ready apps — many of which already exist on iOS, like VRSE — will work too.</p>
<p>Mattel is planning to use View-Master not just for VR, but also for AR; little plastic reels that look like the old cardboard ones are really just flat coasters this time around, now with images on top which the View-Master reads and turns into pop-up augmented-reality models on your table, desktop or wherever else you place it. Multiple View-Masters could use one reel to access content if put down on a table, unlike the old pop-in reels. This type of augmented-reality tech has already existed for years in many apps and on some children’s toys like the Nintendo 3DS (with its AR cards) and PlayStation Vita, but mixing it into a VR headset is a novel idea.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-8822" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster3.jpg?resize=770%2C577&#038;ssl=1" alt="viewmaster3" width="770" height="577" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster3.jpg?w=770&amp;ssl=1 770w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster3.jpg?resize=300%2C225&amp;ssl=1 300w" sizes="auto, (max-width: 770px) 100vw, 770px" /></p>
<p>I didn’t get to use the actual Mattel prototype, but we tried View-Master’s augmented-reality tech on phones and Google Cardboard viewers. There were three reels to try: a dinosaur one made a little dinosaur pop up on the disc on the table in front of me. When I aimed a dot and clicked on it, I was suddenly surrounded by a prehistoric 360-degree panorama with 3D dinosaurs. Clicking on them brought up facts, too.</p>
<p>Looking at the space disc with Cardboard on brought up a pop-up moon and Earth; clicking on it took me to a panorama of the moon, with pop-up clickable photos of NASA missions. A third reel, San Francisco-themed, had little mini-models of Alcatraz and the Golden Gate Bridge that turned into VR photo panoramas. To exit any of the virtual panoramas, you look down and click on the side…or, remove the View-Master from your face. The View-Master comes with one reel in its $30 package, and extra reels will cost around $15 each. No, older View-Master reels don’t work in here, but it sounds like Mattel is exploring re-releasing content from its back catalog of 10,000 older View-Master reels.</p>
<figure id="attachment_8823" aria-describedby="caption-attachment-8823" style="width: 770px" class="wp-caption aligncenter"><img data-recalc-dims="1" loading="lazy" decoding="async" class="size-full wp-image-8823" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster4.jpg?resize=770%2C577&#038;ssl=1" alt="The &quot;reels&quot; don't actually go in the View-Master, they simply sit on your table." width="770" height="577" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster4.jpg?w=770&amp;ssl=1 770w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster4.jpg?resize=300%2C225&amp;ssl=1 300w" sizes="auto, (max-width: 770px) 100vw, 770px" /><figcaption id="caption-attachment-8823" class="wp-caption-text">The &#8220;reels&#8221; don&#8217;t actually go in the View-Master, they simply sit on your table.</figcaption></figure>
<p>There’s no strap to keep the View-Master on: this is a hold-to-your-face toy, much like older View-Masters and Google Cardboard. Mattel has promised that the tech has already been vetted by pediatric ophthalmologists, and is meant for children ages 7 and up — in short, bite-sized sessions.</p>
<p>The View-Master may work with other toys, too, like other app-ified toys in the past, but for now it’s really a fancier plastic Google Cardboard viewer, with additional Mattel support. That’s not a bad thing at all: at $30, this is a pretty awesome little stocking-stuffer idea, and a fun phone toy. Just keep in mind that if you give this to your kid, it won’t work without a phone popped into it.</p>
<p>By the time fall rolls around, Mattel may have other toys ready to work with it. Or, there might be many other companies ready to make cheap phone-enabled VR headsets, too.</p>
<p>Written by: <a href="http://www.cnet.com/profiles/scottstein8/" target="_blank">Scott Stein</a>, <a href="http://www.cnet.com/products/new-view-master/" target="_blank">CNET</a> (via <a href="http://ispr.info/2015/02/20/hands-on-with-mattels-new-ar-vr-view-master/" target="_blank">Presence</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2015/02/hands-on-with-mattels-new-ar-vr-view-master/">Hands-on with Mattel’s new AR, VR View-Master</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2015/02/hands-on-with-mattels-new-ar-vr-view-master/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">8819</post-id>	</item>
		<item>
		<title>Welcome to the Age of Holographs</title>
		<link>https://www.situatedresearch.com/2015/01/welcome-age-holographs/</link>
					<comments>https://www.situatedresearch.com/2015/01/welcome-age-holographs/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Thu, 22 Jan 2015 22:18:54 +0000</pubDate>
				<category><![CDATA[Education]]></category>
		<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Aesthetics]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Games for Learning]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Mental Models]]></category>
		<category><![CDATA[Personalization]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[User-Centered Design]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=8792</guid>

					<description><![CDATA[<p>Up close with the HoloLens, Microsoft’s most intriguing product in years We just finished a heavily scripted, carefully managed, and completely amazing demonstration of Microsoft’s HoloLens technology. Four demos, actually, each designed to show off a different use case for a headset that projects holograms into real space. We played Minecraft on a coffee table.&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2015/01/welcome-age-holographs/">Welcome to the Age of Holographs</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>Up close with the HoloLens, Microsoft’s most intriguing product in years</strong></p>
<p>We just finished a heavily scripted, carefully managed, and completely amazing demonstration of Microsoft’s HoloLens technology. Four demos, actually, each designed to show off a different use case for a headset that projects holograms into real space. We played <em>Minecraft</em> on a coffee table. We had somebody chart out how to fix a light switch right on top of the very thing we were fixing. <span id="more-8792"></span></p>
<p>We walked on Mars.</p>
<p>You’ll notice there aren’t photos here, and that’s because before we were even allowed into the labs where the HoloLens team tests out its user experiences, we had to deposit our cameras and phones into a locker. No recording equipment of any kind was allowed, not even audio. We entered the basement below Microsoft’s visitor center laughing at the absurdity of it all — many reporters needed to get notepads from the company and weren’t carrying pens, either.</p>
<p>But it was all worth it, because HoloLens is probably the most intriguing (and, in many ways, most infuriating) technology we’ve experienced since the Oculus Rift. And there are many parallels to be drawn with the Rift: both are immersive, but in different ways; both require you to strap a weird thing on your head; both leave you grinning like an absolute idiot at a scene only you can see. And, crucially, both need more work when it comes to thinking through exactly how to control and interact with virtual things.</p>
<p><script height="575px" width="1023px" src="https://player.ooyala.com/iframe.js#ec=lsOGp3cjqUFwNW0FqImWpiKsqIdSTEX-&#038;pbid=dcc84e41db014454b08662a766057e2b"></script></p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-8793" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/d76d1d6c-a4e6-41d2-bc43-c9b49041a219.0.png?resize=864%2C392&#038;ssl=1" alt="d76d1d6c-a4e6-41d2-bc43-c9b49041a219.0" width="864" height="392" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/d76d1d6c-a4e6-41d2-bc43-c9b49041a219.0.png?w=864&amp;ssl=1 864w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/d76d1d6c-a4e6-41d2-bc43-c9b49041a219.0.png?resize=300%2C136&amp;ssl=1 300w" sizes="auto, (max-width: 864px) 100vw, 864px" /></p>
<p><strong><em>Minecraft</em> IRL<br />
</strong>by Dieter Bohn</p>
<p>By far, <a href="https://www.theverge.com/2015/1/21/7868363/minecraft-hololens-microsoft-freecell" target="_blank" rel="noopener">the most impressive demo for my money was the <em>Minecraft</em> demo</a> — though Microsoft called it something like “Building Blocks” or some such, presumably so as not to fully commit to releasing a full holograph version of <em>Minecraft</em>. But before we could enter this virtual world — actually, the virtual entered <em>our</em> world — we had to strap on the development unit for the HoloLens.</p>
<p>It’s a contraption, to be sure. There’s a small, heavy block you hang around your neck that contains all the computing power. The headset itself is composed of lenses and tiny projectors and motion sensors and speakers (or <em>something</em> that makes sound, anyway), and god knows what else. And then there’s a screen right there in your field of view.</p>
<p>A “screen in your field of view” is the right way to think about HoloLens, too. It’s immersive, but not nearly as immersive as proper virtual reality is. You still see the real world in between the virtual objects; you can see where the magic holograph world ends and your peripheral vision begins.</p>
<p>But before you can apply your jaded “I’ve done VR before” attitude to this situation, you look down at the coffee table and there’s a <strong>castle sitting right on the damn thing.</strong> It’s not shimmery, but it’s not quite real, either. It’s just sitting there, perfectly flat on the table, reacting in space to your head movements. It’s nearly as lifelike as the actual table, and there’s no lag at all. The castle is there. It’s simply magic.</p>
<p>You definitely have a big stupid grin on your face even though the contraption that’s strapped to it is pressing your eyeglasses into the bridge of your nose in a painful way.</p>
<p>Then it’s demo time. You can’t touch anything, but you can look and point a little circle at objects on it by moving your head around. You learn how a “glance” is just you looking at things and pointing your reticle at them, and an “AirTap” is the equivalent of clicking your mouse. The demo involves digging <em>Minecraft</em> holes and blowing up <em>Minecraft</em> zombies with <em>Minecraft</em> TNT. It’s basically incredible to see these digital things in real space.</p>
<p>You blow up a hole in the table and then you look <em>through</em> it to more digital objects on the floor. You blow up a hole in the wall and tiny bats fly out and you see that behind your very normal wall is a virtual hellscape of lava and rock. You peer into the hole, around the corner, and see that dark realm extend far into space.</p>
<p>And then the demo’s over.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-large wp-image-8794" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/a67d3d33-e1e5-4cf7-bf3d-dbe1befc8d8c.0.jpg?resize=980%2C655&#038;ssl=1" alt="a67d3d33-e1e5-4cf7-bf3d-dbe1befc8d8c.0" width="980" height="655" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/a67d3d33-e1e5-4cf7-bf3d-dbe1befc8d8c.0.jpg?resize=1024%2C684&amp;ssl=1 1024w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/a67d3d33-e1e5-4cf7-bf3d-dbe1befc8d8c.0.jpg?resize=300%2C200&amp;ssl=1 300w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/a67d3d33-e1e5-4cf7-bf3d-dbe1befc8d8c.0.jpg?w=1200&amp;ssl=1 1200w" sizes="auto, (max-width: 980px) 100vw, 980px" /></p>
<p><strong>Skype<br />
</strong>by Tom Warren</p>
<p>Microsoft’s Skype demo was just as impressive to me as playing around with <em>Minecraft</em> blocks in a living room. After a two-hour keynote, Microsoft wanted me to fix a light switch. It all started by sitting down and facing some tools and a socket with exposed wiring. A little dazed and confused, I looked up and scanned across the Skype interface that suddenly appeared in front of me, and picked a face to call. The video call popped into a little window, and my journey to fix a light switch began.</p>
<p>On the other end of the call was a Microsoft engineer. I could see and hear her, but she could only hear me and see exactly what I was seeing in front of me. My eyes, or rather the headset on my head, were relaying everything over Skype. It was a support call of sorts — here she was to help me fix a light switch. We started by pinning her little window on top of a lamp. I could then look around the room and return to the lamp to see her face. She guided me where to go. It felt strangely natural, and I didn’t need to configure anything or learn gestures other than the same “AirTap” you use to simulate a mouse click.</p>
<p>While I was being talked through which real world tools we needed for the job, the Microsoft engineer called my attention to the wall with wiring and then started drawing where to position the light switch right on the wall. Thinking about it now it sounds totally surreal, but during the demo I didn’t even think about it — it just felt like I was being guided around with annotations and a helpful friend. We connected the wiring, tested it for an electrical current, and then turned the power back on and switched the light on. It was all fixed, and all by using a crazy combination of a headset, augmented reality, and Skype. It might sound gimmicky, but the applications here are truly impressive. I use YouTube guides to figure out home improvements or to service my car, but this is on another level. Imagine a surgeon performing complex surgery and writing notes in real time and guiding a colleague through it all. Imagine support calls to resolve a problem with your PC. If this works as well as Microsoft’s controlled demo, then this really has the ability to change how we communicate and learn.</p>
<p>Microsoft’s next demo didn’t have us using the HoloLens prototypes directly. Instead, we watched as “Alex” (nobody in Microsoft’s blue-tinted demonstration basement has last names. I asked.) manipulated objects in digital space so he could build a koala bear or a pickup truck. It was actually quite impressive, as cameras filmed him and screens showed both Alex and the virtual objects he was manipulating in the same space in real time.</p>
<p>The idea was to convince us that HoloLens would unleash a wave of creators who would be able to dream up 3D objects with little to no training. It’s much easier to understand what a thing is in your living room than it is in AutoCAD.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-8795" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/hololens.0.gif?resize=663%2C373&#038;ssl=1" alt="hololens.0" width="663" height="373" /></p>
<p>But sitting there after our whirlwind of actually <em>experiencing</em> HoloLens, my mind was elsewhere. For example, there are only a few ways to interact with this system so far:</p>
<ul>
<li>Glance: you point your head at something.</li>
<li>AirTap: you make a “Number 1” sign with your hand, then move your finger down like you’re depressing a lever.</li>
<li>Voice: you can issue commands, usually to switch what “tool” you’re using.</li>
<li>Mouse: perhaps the neatest thing of all is that the ordinary objects you already use to interact with computers can also be used to interact with holograms.</li>
</ul>
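<p>To make the narrowness of this vocabulary concrete, the entire interaction model can be caricatured as a tiny dispatcher. The sketch below is purely illustrative; the event names and dispatch function are invented for this post, not Microsoft's actual API:</p>

```python
def handle_input(event_type, target):
    """Map one of the demo's input primitives to an action on the gazed-at hologram.

    event_type: "glance", "air_tap", or "voice" (the only primitives shown in the demos)
    target: whatever object the gaze reticle is currently pointing at
    """
    if event_type == "glance":
        return f"highlight {target}"     # gaze cursor rests on an object
    elif event_type == "air_tap":
        return f"select {target}"        # the system's equivalent of a mouse click
    elif event_type == "voice":
        return f"switch tool: {target}"  # voice mostly switches the active tool
    else:
        raise ValueError(f"unknown input: {event_type}")
```

Three verbs is the whole keyboard, which is exactly the limitation the next paragraph complains about.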
<p>That seems like enough, but it’s not nearly enough. It’s wildly impressive that these objects really do feel like they’re out there in your living room, but it’s equally depressing to know that you can’t treat them like real objects.</p>
<p>At one point in the demo, Alex needed to put a tire on his pickup. He had to twist his body and head around to get his pointer in just the right spot and get the tire arranged just right to fit on the axle. Then, AirTap! The tire is connected. But how much easier would it be if you could grab the tire in your actual hands?</p>
<p>Our hands are simply more dexterous than our necks. You have finer control over small motions, and you can move your hands along so many different vectors, with pressure and nuance and delicacy. Your neck and head, well, not so much.</p>
<p>But then Microsoft gave us 3D-printed koalas with a USB drive inside them, which was nice. And if this HoloLens thing takes off, you will be able to design your own, and it will be way easier than learning current 3D design software. But not as easy as it would be if you just imagined building with holograms.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-8796" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/microsoft-windows-10-live-verge-_1662.0.jpg?resize=980%2C654&#038;ssl=1" alt="microsoft-windows-10-live-verge-_1662.0" width="980" height="654" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/microsoft-windows-10-live-verge-_1662.0.jpg?w=1000&amp;ssl=1 1000w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/microsoft-windows-10-live-verge-_1662.0.jpg?resize=300%2C200&amp;ssl=1 300w" sizes="auto, (max-width: 980px) 100vw, 980px" /></p>
<p><strong>Walking on Mars<br />
</strong>By Tom Warren</p>
<p>Microsoft has teamed up with NASA to let scientists explore what Curiosity sees on Mars. Instead of panoramic imagery on a computer screen, Microsoft’s demo lit up a room and turned it into Mars. I walked around the rocky terrain, bumped into the Curiosity rover, and generally just checked out a planet I will never visit in my lifetime. It’s a totally new perspective that felt like I was immersed in touring Mars, but not necessarily there. The field of view felt a little too limited to truly immerse myself and trick my brain into thinking I was really on another planet, but what impressed me most is what Microsoft has built into this experience.</p>
<p>I held a call with a NASA engineer and he talked me through the terrain. I squatted to look more closely at rocks, took snapshots of various rock formations, and even planted flags for points of interest. My jaw dropped when I ventured over to a PC in the room and started to experiment with the mouse. I pulled the mouse pointer off the screen and suddenly it was on the floor next to me, allowing me to set markers in the virtual environment. It’s everything I’ve seen in demonstrations from Microsoft Research before, but here it was on my head and working.</p>
<p>The collaboration part was the key here, allowing me to interact with this data in a unique way, but also alongside the NASA engineer who could drop flags on the Mars terrain and guide me to look at certain sections. While this isn’t traditional productivity with a mouse and keyboard, it’s certainly something new and intriguing. I could see this type of scenario working for big teams that need to communicate across time zones and on big sets of complex data.</p>
<p>Overall, HoloLens is Microsoft at its most ambitious. It’s a big bet on the future of computing, the future of Windows, and ultimately the future of Microsoft itself. While the company is struggling in mobile, it wants to catch the next wave of computing and lead. Is HoloLens the next wave? Developers and consumers will be the ultimate test of that, but if anything HoloLens is an incredibly brave and impressive project from Microsoft. It’s true innovation, which is something Microsoft has lacked during its obsession with protecting Windows. It’s also another example of <a href="https://www.theverge.com/2014/11/6/7164623/microsoft-3d-sound-headset-guide-dogs" target="_blank" rel="noopener">an experience that takes the complex technology out of the way</a>, leaving you to experience what really matters.</p>
<p>Written by: <a href="https://www.theverge.com/users/Dieter%20Bohn" target="_blank" rel="noopener">Dieter Bohn</a> and <a href="https://www.theverge.com/users/tomwarren" target="_blank" rel="noopener">Tom Warren</a>, <a href="https://www.theverge.com/2015/1/21/7868251/microsoft-hololens-hologram-hands-on-experience" target="_blank" rel="noopener">The Verge</a> (via <a href="https://ispr.info/2015/01/22/up-close-with-the-hololens-microsofts-intriguing-mixed-reality-product/" target="_blank" rel="noopener">Presence</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2015/01/welcome-age-holographs/">Welcome to the Age of Holographs</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2015/01/welcome-age-holographs/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">8792</post-id>	</item>
		<item>
		<title>London Firm Creates Mind-Controlled Commands for Google Glass</title>
		<link>https://www.situatedresearch.com/2014/07/london-firm-creates-mind-controlled-commands-google-glass/</link>
					<comments>https://www.situatedresearch.com/2014/07/london-firm-creates-mind-controlled-commands-google-glass/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Tue, 15 Jul 2014 20:28:01 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Aesthetics]]></category>
		<category><![CDATA[Affect / Emotion]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[Communication]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Mental Models]]></category>
		<category><![CDATA[Personalization]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[User-Centered Design]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=8608</guid>

					<description><![CDATA[<p>Forget voice commands and touch gestures: A London firm has developed a way for Google Glass users to control their devices just by thinking. This Place, an agency that specializes in creating user interfaces and experiences for programs used in the medical industry, developed a software called MindRDR that allows Google Glass to connect with&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2014/07/london-firm-creates-mind-controlled-commands-google-glass/">London Firm Creates Mind-Controlled Commands for Google Glass</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Forget voice commands and touch gestures: A London firm has developed a way for Google Glass users to control their devices just by thinking.</p>
<p>This Place, an agency that specializes in creating user interfaces and experiences for programs used in the medical industry, developed software called MindRDR that allows Google Glass to connect with the NeuroSky MindWave Mobile EEG biosensor, a head-mounted device that can detect a person’s brain waves. <span id="more-8608"></span></p>
<p>EEG stands for electroencephalography, the measurement and recording of electrical activity in the brain. EEG biosensors have been around for decades, but until recently they were very expensive. NeuroSky is a Silicon Valley company that sells EEG biosensors, some for as little as $79.99 from Amazon.com.</p>
<p>The system works by pairing the EEG biosensor with Google’s $1,500 Glass device using Bluetooth. Once the connection has been made, the user fires up MindRDR, which takes what the EEG biosensor detects and converts it into commands that Glass can process.</p>
<p>After turning on the app, users will see a camera interface on the screen of their Google Glass. They can then pick a subject, aim their head in its direction, and concentrate on it while Glass displays a meter showing the level of their brain waves. The more intently a user focuses, the higher the meter climbs until it reaches the top, triggering Glass’ camera. By repeating the process, users can direct MindRDR to upload the photo to one of their social networks.</p>
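<p>The capture loop described above (the attention meter climbs as the user concentrates, and the camera fires when it tops out) can be sketched roughly as follows. The function names and the 0–100 scale cap are assumptions for illustration, not MindRDR's actual code, which This Place has published on GitHub:</p>

```python
CAPTURE_THRESHOLD = 100  # assumed: the meter is full when the attention value hits 100

def run_capture_loop(attention_readings, take_photo):
    """Fire take_photo() each time the wearer's attention level fills the meter.

    attention_readings: iterable of EEG-derived attention values (e.g. streamed
                        over Bluetooth from the MindWave sensor)
    take_photo: callback that triggers the Glass camera
    Returns the number of photos taken.
    """
    photos = 0
    for level in attention_readings:
        if level >= CAPTURE_THRESHOLD:
            take_photo()
            photos += 1
    return photos
```

The same pattern repeats for the second concentration step that uploads the photo to a social network.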
<p>For now MindRDR can only be used to snap pictures, but This Place Chief Executive Dusan Hamlin said he hoped the agency would continue developing the software so that it could eventually help users overcome mobility limitations. Specifically, Hamlin said he would like MindRDR to help people who suffer from locked-in syndrome, in which a patient has lost motor control but remains aware and alert, as well as quadriplegia.</p>
<p>“The ability to be able to use their mind to make outputs to a device could be a huge thing for them,” Hamlin told the Los Angeles Times in a Skype interview.</p>
<p>But the possibilities for MindRDR extend beyond the medical field. Hamlin said he sees MindRDR as the launching point for a world where people can interact with their digital devices by simply thinking about what they want. To that end, This Place has uploaded the code for its software onto <a href="https://github.com/ThisPlace/MindRDR" target="_blank">GitHub</a>, a popular website used by developers to share code they create with others for free.</p>
<p>“What we’ve done is just scratch the surface, and we hope that we’ve inspired people to build on what we’ve started,” Hamlin said.</p>
<p>Written by: <a href="http://www.latimes.com/la-bio-salvador-rodriguez-staff.html" target="_blank">Salvador Rodriguez</a>, the <a href="http://www.latimes.com/business/technology/la-fi-tn-google-glass-mindrdr-20140711-story.html" target="_blank">Los Angeles Times</a> (via <a href="http://ispr.info/2014/07/14/mindrdr-lets-users-control-google-glass-with-their-thoughts/" target="_blank">Presence</a>); more information is available from <a href="http://mindrdr.thisplace.com/" target="_blank">This Place</a> and an article in <a href="http://www.bbc.com/news/technology-28237582" target="_blank">BBC News</a><br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2014/07/london-firm-creates-mind-controlled-commands-google-glass/">London Firm Creates Mind-Controlled Commands for Google Glass</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2014/07/london-firm-creates-mind-controlled-commands-google-glass/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">8608</post-id>	</item>
		<item>
		<title>Control VR Gloves Warp Your Fingers into Virtual Worlds</title>
		<link>https://www.situatedresearch.com/2014/06/control-vr-gloves-warp-fingers-virtual-worlds/</link>
					<comments>https://www.situatedresearch.com/2014/06/control-vr-gloves-warp-fingers-virtual-worlds/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Wed, 11 Jun 2014 19:04:51 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Communication]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=8544</guid>

					<description><![CDATA[<p>$350 device tracks your arms and hands with military-designed sensors New technologies such as Google Glass and Oculus’ Rift headset are making it easier than ever for us to get our heads into augmented and virtual realities. But while we get our heads into these alternate worlds and use our eyes to check our emails, surf&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2014/06/control-vr-gloves-warp-fingers-virtual-worlds/">Control VR Gloves Warp Your Fingers into Virtual Worlds</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>$350 device tracks your arms and hands with military-designed sensors</strong></p>
<p>New technologies such as Google Glass and Oculus’ Rift headset are making it easier than ever for us to get our heads into augmented and virtual realities. But while we get our heads into these alternate worlds and <a href="http://www.theverge.com/2013/2/22/4013406/i-used-google-glass-its-the-future-with-monthly-updates" target="_blank">use our eyes to check our emails</a>, surf the internet, even <a href="http://www.theverge.com/2014/2/5/5382524/eve-valkyrie-will-be-an-oculus-rift-exclusive" target="_blank">destroy enemy starfighters with a spiral of missiles</a>, our hands are left behind in the real world. <span id="more-8544"></span></p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="alignright size-full wp-image-8546" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/06/jpeg.jpg?resize=640%2C426&#038;ssl=1" alt="jpeg" width="640" height="426" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/06/jpeg.jpg?w=640&amp;ssl=1 640w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/06/jpeg.jpg?resize=300%2C199&amp;ssl=1 300w" sizes="auto, (max-width: 640px) 100vw, 640px" />California-based Control VR wants to change that. The company today launched a <a href="https://www.kickstarter.com/projects/controlvr/control-vr-motion-capture-for-vr-animation-and-mor" target="_blank">Kickstarter for its Control VR wearable device</a>, a glove-like system that fits over the user’s arms and shoulders and can accurately sense the precise movements of fingers before translating that motion into virtual or augmented realities. Unlike motion sensing controllers such as <a href="http://www.theverge.com/2014/6/5/5782286/xbox-one-without-kinect-performance-boost" target="_blank">Microsoft’s Kinect</a>, the Control VR can map precise arm and finger motions without the use of an external camera.<span id="more-7850"></span></p>
<p>Alex Sarnoff, Control VR’s co-founder and CEO, says “existing motion-sensing technology is crude, insufficient and limited by confined spaces and camera systems.” His company’s solution takes up little space and doesn’t require an external device pointed at the user. Instead, fine control is made possible by a set of tiny sensors that are placed on the user’s fingers and arms. Each of these sensors — which Sarnoff says were designed for military purposes — has three accelerometers, three gyroscopes, and three magnetometers. The data produced by the position of these sensors is fed back to a processor that allows the Control VR system to calculate how the wearer’s fingers are moving in relation to their body.</p>
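<p>Each of those nine-axis sensors feeds a fusion step that turns raw readings into an orientation estimate. As a heavily simplified illustration of the idea (a single tilt axis rather than the full nine-axis, many-sensor fusion Control VR describes), a classic complementary filter blends the gyroscope's smooth-but-drifting integral with the accelerometer's noisy-but-drift-free gravity reference:</p>

```python
import math

def complementary_filter(angle, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """One-axis tilt estimate fusing a gyroscope rate with an accelerometer angle.

    angle: previous angle estimate (radians)
    gyro_rate: angular rate from the gyroscope (rad/s)
    accel_x, accel_z: accelerometer components used to recover tilt from gravity
    dt: time step in seconds; alpha: blend weight favoring the gyro
    """
    gyro_angle = angle + gyro_rate * dt         # integrate angular rate (drifts over time)
    accel_angle = math.atan2(accel_x, accel_z)  # gravity-referenced tilt (noisy, no drift)
    return alpha * gyro_angle + (1 - alpha) * accel_angle
```

A production system would run something closer to a full attitude filter per sensor and chain the results down the arm, but the blend of fast and slow references is the core trick.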
<p><iframe loading="lazy" src="https://www.kickstarter.com/projects/controlvr/control-vr-motion-capture-for-vr-animation-and-mor/widget/video.html" width="1024" height="600" frameborder="0" scrolling="no"> </iframe></p>
<p>Sarnoff sees his company’s device first being used with video games. Control VR has already demonstrated its device being used with the Oculus Rift headset, <a href="http://youtu.be/LPszKhewSec" target="_blank">using the Rift’s Tuscany demo</a> to show how hands, arms, and fingers can be manipulated by the player. The sensors on the wearer’s elbows and fingers mean that the motions look natural on screen, appearing as one-to-one representations of their actions in the real world. In a newer demonstration, also using the Rift, a player places his hands behind his back to send an Iron Man avatar flying across treetops. He throws his hands forward, using Tony Stark’s palm-mounted thrusters to come to a hovering halt, before pointing his fingers at flying opponents and blowing them from the sky with his suit’s weapons.</p>
<p>The most recent renders of the device show it sporting a small joystick, but Sarnoff says the system will also have more humanitarian uses than aerial video game battles. Control VR will ship with an SDK that Sarnoff says will allow developers to “make the world a better place” by building software and adding functionality for the technology. “Ultimately, functional applications like remote physical therapy and virtual sign-language will be developed,” he says. Sarnoff thinks his company’s device will have a major impact in the animation, design, medical, and robotics communities — and with the party game crowd. “Imagine playing a game of beer pong in real-time,” he suggests.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-large wp-image-8547" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/06/View3-PhysicalRender.jpg?resize=980%2C419&#038;ssl=1" alt="View3-PhysicalRender" width="980" height="419" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/06/View3-PhysicalRender.jpg?resize=1024%2C438&amp;ssl=1 1024w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/06/View3-PhysicalRender.jpg?resize=300%2C128&amp;ssl=1 300w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2014/06/View3-PhysicalRender.jpg?w=1900&amp;ssl=1 1900w" sizes="auto, (max-width: 980px) 100vw, 980px" /></p>
<p>The company’s funding goal is set at $250,000. Those who pledge $350 — the <a href="https://www.oculusvr.com/order/" target="_blank">same price as an Oculus Rift development kit</a> — or more get their own Control VR system, in addition to its SDK and a set of tutorials that the company says makes “integration with any 3D game or application as easy as possible.” Sarnoff promises that those who do purchase a Control VR system won’t have to buy a newer version six months down the line. The $350 device is modular, meaning new features and functions can be slotted or patched in later. He mentions haptic feedback as one example that will “absolutely” be a part of future versions of Control VR, “so gamers can play with real feedback while laying on a sofa.” The company plans to get all Control VR systems out to people who pledge $350 or more by December 25th. A retail version is further out, but is expected to be ready for the mass market in 18 months.</p>
<p>Some of the world’s biggest companies have placed <a href="http://www.theverge.com/2014/3/25/5547456/facebook-buying-oculus-for-2-billion" target="_blank">big bets on virtual and augmented reality</a>, but while the visual experience is already impressive, controllers for the Oculus Rift and its contemporaries have lagged behind. Devices such as the Razer Hydra are frustrating and imprecise to use, while others <a href="http://www.theverge.com/2013/6/11/4419832/virtuix-omni-vr-hands-on-demo" target="_blank">such as the Virtuix Omni </a>require vast amounts of living room space, leading Oculus’ Palmer Luckey to <a href="http://www.theverge.com/2013/12/23/5238118/virtual-reality-check-oculus-rift-hardware-ecosystem" target="_blank">lament the lack of a top-quality input system for his company’s machine</a>. Control VR’s system certainly appears smaller and more precise than its peers, but it’s yet to be seen how quickly the virtual reality community will warm to it. In the meantime, the company plans to show off the system at next week’s E3 expo, offering developers the chance to get their hands, as well as their heads, into their video games.</p>
<p>Written by: <a href="http://www.theverge.com/users/richmcc" target="_blank">Rich McCormick</a>,  <a href="http://www.theverge.com/2014/6/5/5781932/control-vr-gloves-warp-your-fingers-into-virtual-worlds" target="_blank">The Verge</a> (via <a href="http://ispr.info/2014/06/10/control-vr-gloves-warp-your-fingers-into-virtual-worlds/" target="_blank">Presence</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2014/06/control-vr-gloves-warp-fingers-virtual-worlds/">Control VR Gloves Warp Your Fingers into Virtual Worlds</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2014/06/control-vr-gloves-warp-fingers-virtual-worlds/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">8544</post-id>	</item>
		<item>
		<title>Study Reveals Real Reason Behind Gaming Aggression</title>
		<link>https://www.situatedresearch.com/2014/04/study-reveals-real-reason-behind-gaming-aggression/</link>
					<comments>https://www.situatedresearch.com/2014/04/study-reveals-real-reason-behind-gaming-aggression/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Wed, 23 Apr 2014 15:08:33 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Affect / Emotion]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Games for Learning]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Learning]]></category>
		<category><![CDATA[Mental Models]]></category>
		<category><![CDATA[Usability Research]]></category>
		<category><![CDATA[Usability Testing]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[User-Centered Design]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=8406</guid>

					<description><![CDATA[<p>A new study has revealed that gamers are more likely to experience feelings of aggression from playing a game when it is too difficult or when the controls are too complicated to master. In comparison, the research found there was &#8220;little difference&#8221; in levels of aggression when the games themselves depicted violence. Overwhelmingly, the deciding&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2014/04/study-reveals-real-reason-behind-gaming-aggression/">Study Reveals Real Reason Behind Gaming Aggression</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p style="color: #000000;">A new study has revealed that gamers are more likely to experience feelings of aggression from playing a game when it is too difficult or when the controls are too complicated to master.</p>
<p style="color: #000000;">In comparison, the research found there was &#8220;little difference&#8221; in levels of aggression when the games themselves depicted violence. Overwhelmingly, the deciding factor was &#8220;how the volunteers were able to master the electronic game after 20 minutes of play&#8221;. <span id="more-8406"></span></p>
<div class="quoteBox">
<blockquote><p>&#8220;This need to master the game was far more significant than whether the game contained violent material.&#8221;</p></blockquote>
</div>
<p style="color: #000000;">The <a style="font-weight: inherit; font-style: inherit; color: #003399;" href="http://www.ox.ac.uk/media/news_stories/2014/140408.html" target="_blank" rel="nofollow" data-ls-seen="1">study</a> was conducted by research teams from University of Oxford in the UK and the University of Rochester in the US, with the findings published in the <a style="font-weight: inherit; font-style: inherit; color: #003399;" href="http://www.apa.org/pubs/journals/psp/index.aspx" target="_blank" rel="nofollow" data-ls-seen="1"><em>Journal of Personality and Social Psychology</em></a>.</p>
<p style="color: #000000;">The experiment is believed to be the first study of its kind and consisted of six controlled lab tests involving university students. The participants played a simple puzzle game the researchers were able to manipulate, increasing its difficulty or making the control scheme less intuitive or responsive.</p>
<p style="color: #000000;">&#8220;To date, researchers have tended to explore passive aspects of gaming, such as whether looking at violent material in electronic games desensitises or aggravates players,&#8221; says Dr Andrew Przybylski, co-author of the study, from the Oxford Internet Institute. &#8220;We focused on the motives of people who play electronic games and found players have a psychological need to come out on top when playing. If players feel thwarted by the controls or the design of the game, they can wind up feeling aggressive. This need to master the game was far more significant than whether the game contained violent material. Players on games without any violent content were still feeling pretty aggressive if they hadn’t been able to master the controls or progress through the levels at the end of the session.&#8221;</p>
<div class="quoteBox">
<blockquote><p>&#8220;If the structure of a game or the design of the controls thwarts enjoyment, it is this, not the violent content, that seems to drive feelings of aggression.&#8221;</p></blockquote>
</div>
<p style="color: #000000;">In addition to the lab tests, researchers conducted a survey of over 300 players, focusing on the three games each had played most in the last month. Players were asked which they had enjoyed the most, and why. Again, the research demonstrated that some players experienced aggression when they didn&#8217;t feel good at the game. Furthermore, these feelings of aggression even spoiled their enjoyment of the game.</p>
<p style="color: #000000;">&#8220;The study is not saying that violent content doesn&#8217;t affect gamers,&#8221; says co-author Richard M Ryan, from the University of Rochester. &#8220;But our research suggests that people are not drawn to playing violent games in order to feel aggressive. Rather, the aggression stems from feeling not in control or incompetent while playing. If the structure of a game or the design of the controls thwarts enjoyment, it is this, not the violent content, that seems to drive feelings of aggression.&#8221;</p>
<p style="color: #000000;">Written by: <a href="http://www.ign.com/articles/2014/04/08/study-reveals-real-reason-behind-gaming-aggression">Daniel Krupa, IGN UK</a><br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2014/04/study-reveals-real-reason-behind-gaming-aggression/">Study Reveals Real Reason Behind Gaming Aggression</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2014/04/study-reveals-real-reason-behind-gaming-aggression/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">8406</post-id>	</item>
	</channel>
</rss>
