<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Education Archives - Situated Research</title>
	<atom:link href="https://www.situatedresearch.com/category/education/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.situatedresearch.com/category/education/</link>
	<description>Usability Research and User Experience Testing</description>
	<lastBuildDate>Mon, 22 Nov 2021 17:33:24 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	

<image>
	<url>https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2021/03/cropped-icon.png?fit=32%2C32&#038;ssl=1</url>
	<title>Education Archives - Situated Research</title>
	<link>https://www.situatedresearch.com/category/education/</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">122538981</site>	<item>
		<title>Games User Research: Driving Development with Actionable Insights</title>
		<link>https://www.situatedresearch.com/2018/11/games-user-research-driving-development-with-actionable-insights/</link>
					<comments>https://www.situatedresearch.com/2018/11/games-user-research-driving-development-with-actionable-insights/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Wed, 28 Nov 2018 17:00:23 +0000</pubDate>
				<category><![CDATA[Collaboration]]></category>
		<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[Communication]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Games for Learning]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Usability Research]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User Interface]]></category>
		<guid isPermaLink="false">https://www.situatedresearch.com/?p=9777</guid>

					<description><![CDATA[<p>Developers both large and small can benefit from the outside perspective offered by games user research, or usability research geared toward games. Indie developers can benefit from adding UX expertise to the development team, while large developers can obtain an outside perspective to complement and verify findings from internal members of the development team.&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2018/11/games-user-research-driving-development-with-actionable-insights/">Games User Research: Driving Development with Actionable Insights</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Developers both large and small can benefit from the outside perspective offered by games user research, or usability research geared toward games. Indie developers can benefit from adding UX expertise to the development team, while large developers can obtain an outside perspective to complement and verify findings from internal members of the development team. In this article, we will present three key ways in which game research can maximize a game’s success. <span id="more-9777"></span></p>
<h2>Measuring Engagement</h2>
<p>Prior research has shown the importance of engagement in game play. Creating a sense of flow, a state in which players are so immersed in game play that they lose track of their surroundings, has a huge effect on players’ perceptions of a game.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-9779" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/sean-do-782269-unsplash.jpg?resize=980%2C653&#038;ssl=1" alt="" width="980" height="653" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/sean-do-782269-unsplash.jpg?w=1280&amp;ssl=1 1280w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/sean-do-782269-unsplash.jpg?resize=300%2C200&amp;ssl=1 300w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/sean-do-782269-unsplash.jpg?resize=768%2C512&amp;ssl=1 768w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/sean-do-782269-unsplash.jpg?resize=1024%2C682&amp;ssl=1 1024w" sizes="auto, (max-width: 980px) 100vw, 980px" /></p>
<p>Games user research, when properly done, incorporates behavioral psychology to observe players’ actions during gameplay. This yields insight into engagement levels, which are sustained by a steady increase in difficulty over time (to challenge players’ abilities) and by a great story line that immerses players.</p>
<h2>Measuring Player Communication</h2>
<p>Besides the obvious task of watching players interact with the game interface, the observation of player-to-player communication can yield great insight into game play. Team-based activities, or even collaborative game play, can help researchers observe players’ strategies. In MMOGs, players might communicate through text or voice inside the game environment, and classic games might have players communicate via their proximity to one another.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-9780" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/kamil-s-738521-unsplash.jpg?resize=980%2C653&#038;ssl=1" alt="" width="980" height="653" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/kamil-s-738521-unsplash.jpg?w=1280&amp;ssl=1 1280w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/kamil-s-738521-unsplash.jpg?resize=300%2C200&amp;ssl=1 300w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/kamil-s-738521-unsplash.jpg?resize=768%2C512&amp;ssl=1 768w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/kamil-s-738521-unsplash.jpg?resize=1024%2C682&amp;ssl=1 1024w" sizes="auto, (max-width: 980px) 100vw, 980px" /></p>
<p>Player communication yields great insight into how players learn to play a game and how they develop strategies to win it. Great user research uses methods in which players are not coaxed or guided by researchers and feel as if they are in a natural environment, so as not to bias their activity while playing. Rigorous game research methods can use these factors to their advantage to achieve findings that are more accurate than those of traditional deductive, hypothesis-driven studies.</p>
<h2>Affordances of the User Interface</h2>
<p>While the broader experience of game play needs to be measured to gauge the overall player experience, examining the affordances of the user interface reveals what players perceive as possible actions in the game. These perceptions give players a foundation for creating strategies within the game. All aspects of the interface that can be interacted with, as well as those that gamers merely perceive as actionable, should be observed to inform game design. These perceived actions suggest to gamers their possibilities for both playing and winning the game.</p>
<p>Often, critical actions are overlooked by gamers. In line with theories of learning, difficulty should be scaffolded to create a feeling of flow. Game research can provide useful insight into how players make use of a game interface, and can motivate changes to its design (a nudge, animation, tutorial, etc.) that make particular actions salient, helping players discover them and, in turn, learn, progress, and create engaging game play.</p>
<h2>Conclusion</h2>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="half alignright wp-image-9781" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/nikita-kachanovsky-428386-unsplash.jpg?resize=306%2C512&#038;ssl=1" alt="" width="306" height="512" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/nikita-kachanovsky-428386-unsplash.jpg?w=611&amp;ssl=1 611w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/11/nikita-kachanovsky-428386-unsplash.jpg?resize=179%2C300&amp;ssl=1 179w" sizes="auto, (max-width: 306px) 100vw, 306px" /></p>
<p>Many current trends in game design are leading to amazing new games, including VR / AR (virtual / augmented reality), graphics approaching lifelike detail, and engaging online multiplayer experiences. However, many classic games offer players an engaging experience without advanced graphics, relying on a basic story, simple gameplay, and a scaffolding difficulty structure to engage players. Game developers of all sizes can maximize engagement by using game research to find the right mix of these features.</p>
<p>Good usability, afforded by the game user interface, helps players develop strategies for playing and winning games. Creating flow, where players lose track of their surroundings while immersed in game play, can be achieved by creating the right mix of engaging gameplay, player communication, and a scaffolding difficulty structure where players learn and accomplish tasks in the game.</p>
<h3>About the Author</h3>
<p><em>Matthew Sharritt, Ph.D., President and Co-founder of Situated Research, specializes in user-experience (UX) research and usability testing within software and video games. Dr. Sharritt’s research focuses on collaborative learning during playtesting and exploration, yielding insights in how to construct games that flow with engaging gameplay and collaborative interaction. The Situated Research team has provided independent expertise to the game industry across a variety of research projects. Learn more at </em><a href="https://www.situgames.com"><em>https://www.situgames.com</em></a><em>.</em></p>
<p>The post <a href="https://www.situatedresearch.com/2018/11/games-user-research-driving-development-with-actionable-insights/">Games User Research: Driving Development with Actionable Insights</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2018/11/games-user-research-driving-development-with-actionable-insights/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">9777</post-id>	</item>
		<item>
		<title>Rapid UX Research at Google</title>
		<link>https://www.situatedresearch.com/2018/05/rapid-ux-research-at-google/</link>
					<comments>https://www.situatedresearch.com/2018/05/rapid-ux-research-at-google/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Fri, 04 May 2018 14:53:39 +0000</pubDate>
				<category><![CDATA[Collaboration]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Communication]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Usability Research]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User-Centered Design]]></category>
		<guid isPermaLink="false">https://www.situatedresearch.com/?p=9717</guid>

					<description><![CDATA[<p>How do you conduct impactful user research in a short space of time? As the manager of a Rapid Research team at Google, I’ve built a team around just that — delivering meaningful insights, fast. My job is to ensure our product teams get the insights they need quickly and effectively.  For my team, that means getting&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2018/05/rapid-ux-research-at-google/">Rapid UX Research at Google</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>How do you conduct impactful user research in a short space of time?</p>
<p>As the manager of a Rapid Research team at Google, I’ve built a team around just that — delivering meaningful insights, fast. My job is to ensure our product teams get the insights they need quickly and effectively. <span id="more-9717"></span></p>
<p>For my team, that means getting everything done within the space of a week.</p>
<p>In this article I’d like to share my experiences setting up this practice, along with ideas on how to do it yourself!</p>
<h3>The need for rapid research</h3>
<p>My adventure at Google started in 2014. When I joined the Communications team, I started out on a product without a dedicated researcher. I quickly had to take inventory of the team’s projects and the questions that needed answering. There was a range of tactical and formative research that had to get done. I wondered how I might split my time between the two, juggling multiple needs and priorities at once.</p>
<blockquote><p><em>I realized that with the right templates and processes in place, I could quickly pick off the low-hanging fruit through efficient usability studies and testing cycles. That’s when the idea of rapid research was born.</em></p></blockquote>
<p>I developed a standard screener to use across my projects and a slide template to help me report back findings, and before I knew it, I had streamlined my processes to the point where I could turn research around in just one day.</p>
<p>My team is fortunate enough to have dedicated participant recruiters. Recruiters get a head start on filling studies by using a standardized screener. For the most part, I use the same screener for every study.</p>
<p>Over time, I was able to answer bigger questions in the space of several days, and increase the scope of my projects. In doing so, I realized that I could focus on not just the one app I was working on at the time, but use this process for apps across the entire organization.</p>
<p>In late 2016, I had the opportunity to formalize my role and build my own dedicated Rapid Research team at Google. Researchers embedded in product teams or teams without dedicated researchers come to us with specific questions that need answering. My team then acts as an internal consultancy, supporting product teams in answering questions we feel are suitable for a <em>rapid</em> approach.</p>
<p>Let’s talk about the nitty gritty of how we make this happen!</p>
<h3>The rapid research timeline</h3>
<p>Our rapid research team works in weekly cycles, kicking off a new project every Friday, and going through an entire research and analysis process in the space of one week.</p>
<figure><figcaption class="imageCaption"><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-9719" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/05/1_rSbZ_TTddaIGNxDtjLjiBA.png?resize=980%2C315&#038;ssl=1" alt="" width="980" height="315" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/05/1_rSbZ_TTddaIGNxDtjLjiBA.png?w=1919&amp;ssl=1 1919w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/05/1_rSbZ_TTddaIGNxDtjLjiBA.png?resize=300%2C96&amp;ssl=1 300w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/05/1_rSbZ_TTddaIGNxDtjLjiBA.png?resize=768%2C247&amp;ssl=1 768w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/05/1_rSbZ_TTddaIGNxDtjLjiBA.png?resize=1024%2C329&amp;ssl=1 1024w" sizes="auto, (max-width: 980px) 100vw, 980px" />The rapid research timeline</figcaption></figure>
<h4>Friday — What do we want to learn?</h4>
<p>Friday is kick-off day. Partners and researchers are expected to come with a clearly defined question in mind. At the very least, we need a solid research question and the start of the required assets (designs, mockups, prototypes, sketches).</p>
<h4>Monday — Pilot</h4>
<p>On Mondays we’ll pilot our proposed study, often with Google employees, to make sure we’re setting ourselves up for success. This is an opportunity to tweak the discussion guide, prototype, and other elements of the test before we jump into the next 1–2 days of research sessions.</p>
<h4>Tuesday/Wednesday — Conducting the research</h4>
<figure><figcaption class="imageCaption"><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-9718" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/05/1_8sSz1TA6zyqCECpU_VpRrw.png?resize=980%2C653&#038;ssl=1" alt="" width="980" height="653" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/05/1_8sSz1TA6zyqCECpU_VpRrw.png?w=1080&amp;ssl=1 1080w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/05/1_8sSz1TA6zyqCECpU_VpRrw.png?resize=300%2C200&amp;ssl=1 300w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/05/1_8sSz1TA6zyqCECpU_VpRrw.png?resize=768%2C512&amp;ssl=1 768w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2018/05/1_8sSz1TA6zyqCECpU_VpRrw.png?resize=1024%2C683&amp;ssl=1 1024w" sizes="auto, (max-width: 980px) 100vw, 980px" />Our research can take place in the field or the lab — it all depends on the question we’re trying to answer.</figcaption></figure>
<p>We’ll spend 1–2 days in the lab or the field actually conducting research sessions. In between sessions, we’ll start to pull out themes from our data and identify relevant supporting material from recordings to feed into the final presentation.</p>
<p>We encourage our stakeholders to observe the studies as much as they can, whether in the office or over Google Hangouts. <span class="markup--quote markup--p-quote is-other" data-creator-ids="anon">We’ll do debriefs after each session to ensure we’re all on the same page, which is a big part of how we can move so quickly.</span></p>
<h4>Thursday — Prepping the findings</h4>
<p>This is the day when the final presentation comes together. As a team we’ll conduct a slide review prior to the final report, getting everyone together to review each other’s slides. It’s an opportunity to get feedback and to surface any previous work that may be relevant to the project.</p>
<h4>Friday — Read-out day</h4>
<p>It’s time to present our findings!</p>
<p>For lab studies, we’ll prepare slide summaries with supporting quotes, short video clips or gifs. Each presentation includes background on the method, participant profiles and a refresher on the research question. We’ll often include a tl;dr in there to make sure the team can quickly pull out the key findings.</p>
<p>People who had the chance to observe some of the sessions will often have a good idea of what the results might be and usually come prepared with some questions to ask too.</p>
<p>Intercept studies are often shorter and more lightweight. We often feed the results from these studies back to the team earlier in the week with a quick one-pager so they can be off and running with the results before Friday.</p>
<h3>Practical tips for conducting rapid research</h3>
<p><strong>Note-taking</strong><br />
One of the ways that my team is able to move so fast is by setting ourselves up for synthesis throughout the project. We do this by automatically time-stamping our notes:</p>
<ul>
<li>The team uses a dedicated Google sheet with a custom time-stamping script built in, which automatically adds a corresponding timestamp to each note, making it easy to pull quotes out of video files when we need them. Here’s <a href="https://productforums.google.com/forum/#!topic/docs/Ng4f6Mr0xW4;context-place=topicsearchin/docs/timestamp$20google$20sheet" target="_blank" rel="noopener">some information</a> on setting one of these up yourself!</li>
<li><a href="http://www.usefulfruit.com/" target="_blank" rel="noopener">Pear Note</a> is another option for adding timestamps to your notes, with built-in recording capabilities. For time-stamping notes in recordings, you can also use <a href="https://www.usertesting.com/" target="_blank" rel="noopener">Usertesting.com</a> or <a href="https://validately.com/" target="_blank" rel="noopener">Validately</a>.</li>
<li>If you don’t have the luxury of taking notes as you go, another option is transcription after the fact. <a class="markup--anchor markup--li-anchor" href="https://www.descript.com/" target="_blank" rel="nofollow noopener" data-href="https://www.descript.com/">Descript</a> makes it fast and affordable to transcribe audio files and identify relevant quotes.</li>
</ul>
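<p>The time-stamping idea above can be sketched in plain Python (a hypothetical illustration of the approach, not the team’s actual Google Sheets script): each note is stamped with the time elapsed since the recording started, so any quote can be located in the video later.</p>

```python
from datetime import datetime

class SessionNotes:
    """Time-stamped note log for a research session.

    Each note records the time elapsed since the session (and the
    recording) started, so a note maps directly to a video position.
    """

    def __init__(self, start=None):
        self.start = start or datetime.now()
        self.notes = []  # list of (elapsed: timedelta, text: str)

    def add(self, text, now=None):
        """Log a note, stamping it with elapsed time since session start."""
        now = now or datetime.now()
        self.notes.append((now - self.start, text))

    def formatted(self):
        """Render notes as 'HH:MM:SS  note' lines for pasting into a report."""
        lines = []
        for elapsed, text in self.notes:
            total = int(elapsed.total_seconds())
            hours, rem = divmod(total, 3600)
            minutes, seconds = divmod(rem, 60)
            lines.append(f"{hours:02d}:{minutes:02d}:{seconds:02d}  {text}")
        return lines
```

<p>In a spreadsheet the same effect comes from an edit-triggered script writing a timestamp into an adjacent column as each note is typed; the elapsed-time arithmetic is the same.</p>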
<p><strong>Synthesis on the go</strong><br />
Sometimes we start coding themes as early as day 1. The team is constantly working on the read-out throughout the project. We make use of down-time between sessions to start pulling out patterns, quotes, and editing video! We use <a class="markup--anchor markup--p-anchor" href="https://www.techsmith.com/video-editor.html" target="_blank" rel="noopener">Camtasia</a> and <a class="markup--anchor markup--p-anchor" href="https://www.telestream.net/screenflow/overview.htm" target="_blank" rel="noopener">Screenflow</a> for fast video editing.</p>
<p><strong class="markup--strong markup--p-strong">Double up<br />
</strong>We always have two researchers assigned to every project. It’s no secret that running a rapid study every week can be intense, so it’s nice to have someone to switch off with and take notes.</p>
<p><strong class="markup--strong markup--p-strong">Pace yourself<br />
</strong>Limit your study to about 5 participants per day, alternating between moderating and note-taking with the other person on the project each day.</p>
<p><strong>Get scrappy</strong><br />
We use a variety of methods to help us get the insights we need:</p>
<ul>
<li>We’ll often do quick intercepts in the field, where rather than taking detailed notes, we’ll draw relevant insights onto paper printouts of concepts</li>
<li>From time to time we’ll use laptop hugging for remotely observing participants as they perform tasks on their mobile device. The participant faces their laptop away from themselves and holds their phone in front of it, making it really quick for us to do rapid mobile testing.</li>
<li>For testing concepts, we’ll sometimes run ‘speed dating’ sessions, where we present different low-fidelity sketches or design concepts to participants to get quick feedback and to validate and prioritize user needs. Participants are shown drawings that illustrate a perceived need and individually rank the severity and frequency of that need. Discussion allows diverse perspectives to emerge and provides context around any tensions, allowing the more promising concepts and needs to bubble up.</li>
</ul>
<h3>What types of projects benefit from rapid research?</h3>
<p>Rapid research isn’t suited to every method or project. Overall, this approach works best for tactical research intended to test designs and assumptions, such as:</p>
<ul>
<li>intercept interviews</li>
<li>remote or in-person usability studies</li>
<li>concept testing</li>
<li>light survey work</li>
<li>user journey evaluations</li>
<li>literature reviews</li>
<li>competitive analysis</li>
</ul>
<h3>Efficiencies are everywhere you look</h3>
<p>At its core, rapid research is about developing and iterating on the templates and processes you use to arrive at a streamlined and efficient research approach. It’s about identifying what might be slowing you down and finding methods and tools to mitigate that. Once you’re satisfied with the tools and methods you have in place, you can start to increase the scope of your research and look at answering bigger questions in less time.</p>
<p>Got any rapid research tips of your own? We’d love to hear them!</p>
<p>Written by: <a class="ds-link ds-link--styleSubtle ui-captionStrong u-inlineBlock link link--darken link--darker" dir="auto" href="https://medium.com/@heidi.sales?source=post_header_lockup" data-action="show-user-card" data-action-source="post_header_lockup" data-action-value="f697c0d7e596" data-action-type="hover" data-user-id="f697c0d7e596">Heidi Sales</a>, via <a href="https://medium.com/mixed-methods/rapid-ux-research-at-google-3b92dd038e30" target="_blank" rel="noopener">Medium</a><br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2018/05/rapid-ux-research-at-google/">Rapid UX Research at Google</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2018/05/rapid-ux-research-at-google/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">9717</post-id>	</item>
		<item>
		<title>How User-Centered Design Can Turn Your Concepts into Kick-Ass Prototypes</title>
		<link>https://www.situatedresearch.com/2017/09/user-centered-design-can-turn-concepts-kick-ass-prototypes/</link>
					<comments>https://www.situatedresearch.com/2017/09/user-centered-design-can-turn-concepts-kick-ass-prototypes/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Tue, 19 Sep 2017 15:38:54 +0000</pubDate>
				<category><![CDATA[Collaboration]]></category>
		<category><![CDATA[Development]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[Communication]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Mental Models]]></category>
		<category><![CDATA[Usability Research]]></category>
		<category><![CDATA[User-Centered Design]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=9642</guid>

					<description><![CDATA[<p>Brainstorming is one of the oldest known methods for generating group creativity. A group of people come together and focus on a problem or proposal. The activity has two phases: the first generates ideas, and the second evaluates them. Although some studies have shown that individuals working alone can generate more and&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2017/09/user-centered-design-can-turn-concepts-kick-ass-prototypes/">How User-Centered Design Can Turn Your Concepts into Kick-Ass Prototypes</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Brainstorming is one of the oldest known methods for generating group creativity. A group of people come together and focus on a problem or proposal. The activity has two phases: the first generates ideas, and the second evaluates them. <span id="more-9642"></span></p>
<p>Although some studies have shown that individuals working alone can generate more and better ideas than when working as a group, the brainstorming activity enables everyone in the group to gain a better understanding of the problem space, and has the added benefit of creating a feeling of common ownership of results.</p>
<p>Good brainstorming focuses on the quantity and creativity of ideas: the quality of ideas is much less important than the sheer quantity. After ideas are generated, they are often grouped into categories and prioritized for subsequent research or application.</p>
<p>The outcomes of brainstorming are:</p>
<ul>
<li>A list of ideas or solutions related to a particular problem</li>
<li>The ideas or solutions organized into groups</li>
<li>Some form of prioritization based on attributes like cost and feasibility</li>
</ul>
<h2>Idea Mapping</h2>
<p>Idea mapping is a visual thinking tool that structures information, helping you analyze, comprehend, synthesize, recall, and generate new ideas. We can help you from the most nascent idea up through prototyping and user testing, bringing our expertise in usability and business development along the way.</p>
<h2>UI Sketches</h2>
<p>Low-fidelity prototypes are a great place to begin, and our team can facilitate UI brainstorming sessions where sketches and basic functionality can give your new product a voice of its own.</p>
<h2>Market Research</h2>
<p>In addition to prototyping and UI design, we can conduct market research to see where your idea fits into the marketplace. Client confidentiality is paramount and we’ll gladly sign a non-disclosure agreement.</p>
<p>Helping clients get ideas flowing in the beginning stages of a project is our forte. Our team specializes in translating high-level objectives into exciting new products and services, down to the finest detail.</p>
<p>From market research to product development, we’ve got you covered. We can work with any budget, so reach out and let us know what you have been thinking about doing.</p>
<p>We thrive on helping businesses launch new products, and would love to facilitate a brainstorming session for your new product. <a href="https://www.situatedresearch.com/contact/">Contact us</a> today to get started.</p>
<p>Written / Posted by: <a href="https://www.situatedresearch.com/staff-item/michel-sharritt/">Michel Ann Sharritt</a>, VP, Situated Research</p>
<p>The post <a href="https://www.situatedresearch.com/2017/09/user-centered-design-can-turn-concepts-kick-ass-prototypes/">How User-Centered Design Can Turn Your Concepts into Kick-Ass Prototypes</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2017/09/user-centered-design-can-turn-concepts-kick-ass-prototypes/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">9642</post-id>	</item>
		<item>
		<title>How to Conduct User Research and Build Features</title>
		<link>https://www.situatedresearch.com/2015/10/how-to-conduct-user-research-and-build-features/</link>
					<comments>https://www.situatedresearch.com/2015/10/how-to-conduct-user-research-and-build-features/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Mon, 05 Oct 2015 16:46:29 +0000</pubDate>
				<category><![CDATA[Collaboration]]></category>
		<category><![CDATA[Development]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Communication]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Mental Models]]></category>
		<category><![CDATA[Usability Research]]></category>
		<category><![CDATA[Usability Testing]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[User-Centered Design]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=9069</guid>

					<description><![CDATA[<p>“So, Megan, what do you do?” What a loaded question, geeze. I do lots of things. I run. I eat. I hang out with my 5 rabbits (yeah, they’re awesome). Everyone asks me this question at every networking event, and I still don’t have a succinct, articulate answer. I usually reply with something along the&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2015/10/how-to-conduct-user-research-and-build-features/">How to Conduct User Research and Build Features</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>“So, Megan, what do you do?”</p>
<p>What a loaded question, geeze. I do lots of things. I run. I eat. I hang out with my 5 rabbits (<a href="https://www.flickr.com/photos/96683023@N06/" target="_blank">yeah, they’re awesome</a>). Everyone asks me this question at every networking event, and I still don’t have a succinct, articulate answer. I usually reply with something along the lines of,</p>
<p>“I do user research and product strategy consulting for early stage startups.” <span id="more-9069"></span></p>
<p>About 50% of the time, the inquiring mind will respond, “oh wow, that sounds really amazing,” clearly indicating that I’ve convinced them that I am either smart or important, which is all that matters in Silicon Valley.</p>
<p>The other half of people will dig deeper: “oh, so what does your work look like then?” I like these people.</p>
<p>These conversations are usually fantastic. I get to expound upon some of my recent projects, go into details about what good product and UX look like at early stage startups, and hopefully do a little teaching about best practices. People particularly seem to love hearing my war stories about field research and synthesis.</p>
<p>What I find deeply concerning, however, is how few people understand the basics of user research prior to these discussions. There seems to be a huge gap in knowledge: how to recruit people to talk to, how to have the conversations, what you should be talking to them about, how to synthesize the data, and how to use that knowledge to make product decisions.<strong> Let me state this clearly and unequivocally: talking to potential users and understanding your markets at an early stage startup should be your top priority. </strong>This means understanding your users to figure out what to build, what to iterate on, and how to make money. This research isn’t rocket science, but a basic education in social science research methods certainly helps.</p>
<p>I’ve been doing this a while now, and what is sorely lacking are concrete stories from the trenches that demonstrate the value of user research, especially in early stage startups. I’m guessing that’s why people are so excited when I tell them <em>how</em> I do my job. So that’s what I’m going to give you: a story, with exactly what I did, from beginning to…well, not end, because this story is definitely still being written.</p>
<p>To get all meta now, my story is about a writing startup: <a href="http://www.therightmargin.com/" target="_blank">TheRightMargin</a>.</p>
<p>When I first started working with <a href="https://www.linkedin.com/pub/shivani-bhargava/14/319/968" target="_blank">Shivani</a>, the founder of TheRightMargin, she pitched it to me as a platform to help writers with writer’s block (read the origin story <a href="http://blog.therightmargin.com/2015/07/07/good-bye-word-processor-say-hello-to-therightmargin/" target="_blank">here</a>). She had created it as something to solve her own problems, since she struggled to finish her own novels she worked on in her spare time. The key feature was dynamic, integrated content that helped you keep track of ideas, characters, outlines, maps, etc. while you write. The vision was to destroy Microsoft Word and make your life easier by stepping out of the way of your creativity — big, grandiose ideas.</p>
<p>But big, grandiose ideas need to start somewhere, which is what we’re doing right now by getting TheRightMargin off the ground. A writing platform can’t serve everyone who writes. It can’t be a platform with a billion features. <strong>A startup needs to start out targeted and narrow.</strong> I’m going to walk you through the research process I’ve helped TheRightMargin with over the last few months, which has led to a feature we’re testing.</p>
<p><strong>Big Important Caveat:</strong> for the sake of simplicity, I am outlining the process below as relatively straightforward because, unlike a television show, I won’t win critical acclaim for confusing you with flashbacks and non-linear storytelling. The reality is, almost all of this happened in parallel and messily, with constant revisions. For example, before and after every single interview, I typically revise my interview guide, because my understanding of our users and the world is always changing. This is in contrast with hardcore user interviews at larger companies or in academia, where methodological purism will often dictate that you stick to the original research design to maintain experimental integrity. I digress. Onto the actual “doing”!</p>
<h2>Step 1. Brainstorm User Types</h2>
<p>One of the big challenges we had at TheRightMargin is that the world of writing, even scoped to “long”, is quite wide and varied, so we could not design a product to meet specific needs without seriously narrowing our users down. Problem is, we hadn’t released a product yet, so we didn’t have any real users.</p>
<p>This is a common challenge for early stage startups and one that is perfectly surmountable. You brainstorm possible user types, do some pros and cons, narrow them down a bit, and start having lots of awesome conversations with people who fit your “possible user molds.”</p>
<p>Based on our existing understanding of the market, we guessed our biggest challenges would be getting people to shift off their existing established writing workflows and the diversity of processes. Initially, rather than brainstorm user archetypes, we actually started off discussing some important characteristics and demographics that might affect user behavior, since this could also help me develop my user interviews later on:</p>
<ul>
<li>Organized/non-organized</li>
<li>Long vs. short form</li>
<li>Motivation for writing</li>
<li>Professional/on-the-side/students</li>
<li>Content: fiction, academic, technical, etc.</li>
<li>Geography</li>
<li>Age</li>
<li>Gender</li>
<li>Technical ability</li>
<li>Language</li>
<li>Tools available and/or required to use</li>
<li>Autonomy and collaboration</li>
</ul>
<p>From here, we were able to think about ideas for the types of writers we’d want to talk to — it was clear that some of the important variables we’d want to account for were:</p>
<ul>
<li>Level of success — was someone a professional writer? Was writing just a hobby for them or did they hope to make writing their full time job?</li>
<li>Type of writing: what is the content of what they are producing? Fan fiction? Technical writing? Fiction? Short stories vs novels? How long were they spending on projects?</li>
<li>What tools did they currently use? What level of technical ability did they have?</li>
</ul>
<p>To be successful in our user research, we’d want to talk to people who cut across these different areas, not just people coming from one of them. Since TheRightMargin had the potential to be a useful tool for a number of these groups, we’d eventually want to narrow in on a particular group and develop the tool for them.</p>
<h2>Step 2. Create Data Collection Stuff</h2>
<p>Before we did any actual research, I made sure we had the right resources. Templates, guides, and some basic organization always, always make your life easier. If you don’t deal with them in the beginning, you will regret not having them later.</p>
<p>For TheRightMargin, I created a “Customer Development” folder in our shared Google Drive to store all of our interview materials. There is a master interview notes template that we copy for each interview and retitle to the interviewee’s name (this should probably be something better). Here’s what gets recorded:</p>
<p style="padding-left: 30px;"><strong><em>Name: </em></strong><em>(customer)</em></p>
<p style="padding-left: 30px;"><strong><em>Age (estimate):</em></strong></p>
<p style="padding-left: 30px;"><strong><em>Researcher:</em></strong><em> (name)</em></p>
<p style="padding-left: 30px;"><strong><em>Date:</em></strong></p>
<p style="padding-left: 30px;"><strong><em>Is the user signed up for the beta?</em></strong><em> Y/N</em></p>
<p style="padding-left: 30px;"><strong><em>How did we acquire the user?</em></strong></p>
<p style="padding-left: 30px;"><strong><em>What communication have we had with the user before the interview? </em></strong><em>[Do we know them already? What do we know about their opinions of the product? etc]</em></p>
<p style="padding-left: 30px;"><strong><em>Interview location/setting description:</em></strong><em> [Was this over Skype, Hangouts, etc.? Especially describe in detail if the user typically writes there; this is particularly important for field interviews in customers’ workspaces, since you can understand their workflows and organizational processes. Pay attention to little things like their use of sticky notes and notebooks — anything that might look like a workaround OR a conscious rejection of technology.]</em></p>
<p style="padding-left: 30px;"><strong><em>Technology description, if applicable to in-person interview:</em></strong><em> [Mac or PC? iPhone or Android? Pencil/Paper? Sticky notes? Tablet, smartwatch, whiteboard…the list goes on. What technological (broadly defined) solutions does the customer use in their writing process?]</em></p>
<p style="padding-left: 30px;"><strong><em>Interview notes: </em></strong><em>[Insert notes here]</em></p>
<p>The other key resource is my master interview guide, which also lives in the Customer Development folder. This is a living document and mostly consists of questions and topics that we’d like to cover in our interviews. I’ll have this open during interviews, but I almost never ask anything from it verbatim. It’s mostly something I use to remind myself what to cover.</p>
<p><strong>The best user interviews are freeform, loose conversations driven by the user.</strong> Ideally, I’d never have to look at that interview guide and would just be asking follow-ups based on what the person is saying.</p>
<p>These user interviews are also entirely agnostic of TheRightMargin and are about understanding the person as a writer, their habits, their needs, and their struggles. Here are some example questions:</p>
<ul>
<li>When was the last time you wrote? What did you write about?</li>
<li>Is there any information that you keep track of outside your actual writing documents?</li>
<li>What do you feel is your biggest challenge once you sit down to write?</li>
</ul>
<h2>Step 3. Recruit Users to Interview</h2>
<p><strong>Talk to any user researcher, and they will likely tell you that recruiting people to talk to is one of the most annoying parts of their job. </strong>There’s a reason that firms exist just to find people to participate in studies.</p>
<p>We initially struggled with how to find people to talk to, especially since we weren’t in touch with any writer’s groups. Some of our initial interviews were with acquaintances of Shivani’s who are writers, which is a fine first step, but has the issue of bias, since they are friends of the founder.</p>
<p>So, I did what any desperate researcher does when they’re starting out: I posted on <a href="http://sfbay.craigslist.org/" target="_blank">Craigslist</a>.</p>
<p>But I knew Craigslist has a serious problem with people who sign up for research sessions regardless of qualifications, so I set up a filtering question. I mentioned we were looking to talk to writers, but I asked only, “What do you write?” I didn’t mention that we were looking for specific types of writers, namely people who write longer works. This meant I could filter the responses myself. We offered a $40 Amazon gift card for people’s time.</p>
<p>I got some GREAT people from this posting.</p>
<p>Additionally, we thought that fan fiction writers might be a great group for us, so we posted on <a href="https://www.fanfiction.net/" target="_blank">Fanfiction.net</a> asking to talk to writers as well. Since it’s a tight knit community, there was some initial resistance, but one brave user was willing to give us a shot. Once she talked to us, she verified us to the rest of the community, and we got 2 more interviews.</p>
<p><strong>Finally, at the end of every interview, you should always ask, “Do you know anyone else who you think would like to talk to me about their experiences?” and then do the same in the thank-you email you send.</strong> We got a number of additional interviews through these referrals, since writers tend to know other writers.</p>
<h2>Step 4. Synthesize Information</h2>
<p>My biggest weakness is dealing with notes. I’d highly recommend setting up a weekly appointment to synthesize the information from your user research — catch up on notes, reread old research, and look for patterns — because this, at the end of the day, is probably the most important part of the process.</p>
<p>Initially, I had to both take notes and conduct interviews simultaneously, which sucks. If at all possible, I’d recommend having 2 people at every interview: 1 person to talk, 1 person to take notes. <a href="https://www.linkedin.com/in/cjlee37" target="_blank">Christine</a> joined TheRightMargin as a UX Designer recently and has been awesome to have along at the last batch of our research sessions. Thank you, Christine, for being AMAZING!</p>
<p>After each session, I like to do what I call a brain vomit for 5–10 minutes — essentially free writing all my impressions of what happened. Then, the next day, I’ll go back, clean up and incorporate everything so that other people can actually read the whole document. In reality, this doesn’t always happen, so the notes end up staying raw longer than they should. Like I said, a personal flaw of mine.</p>
<p>The way I synthesized the interviews for TheRightMargin was by pulling out pain points. I have a master pain point document in the Customer Development folder with a heading for each person. I’ll have the interview notes open in one tab, and I’ll just go back and forth, adding to the pain points document. These are pain points both explicit and implicit: it could be something they said, like “I hate find and replace in Microsoft Word,” or something that I’m extracting as a researcher, like “Jane really seems to struggle with her self-esteem and identity as a writer.” I include everything that’s a pain point, however mundane, even something like “I want Oreos when I write.”</p>
<p>I did this “pain extraction” process on the interview notes multiple times. I’ve probably read some of the notes upwards of twenty times. You really want to do this, so you become intimately familiar with your research and your market. It makes you more empathetic and thoughtful. You start processing these patterns in the shower, when you take walks, and in your sleep.</p>
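<p>There’s no magic to spotting these patterns. If your pain points live in plain text, even a tiny script can show which themes recur across interviewees. Here’s a minimal sketch (the theme tags and data are hypothetical, not TheRightMargin’s actual setup):</p>

```python
from collections import Counter

# Hypothetical pain-point themes, recorded per interviewee during synthesis.
pain_points = {
    "Jane": ["habits", "self-esteem", "tools"],
    "Arun": ["habits", "isolation"],
    "Mia":  ["isolation", "self-esteem", "habits"],
}

# Count how many *different* interviewees mention each theme --
# breadth across people matters more than repetition by one person,
# so we deduplicate each person's themes with set() first.
theme_counts = Counter(
    theme for themes in pain_points.values() for theme in set(themes)
)

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} of {len(pain_points)} interviewees")
```

<p>A count like this is only a starting point, not a substitute for rereading the notes; it just helps confirm that a theme you think you’re seeing really does cut across people.</p>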
<p>As it turns out, two really clear patterns began to emerge from this pain point exercise. One: writers, both successful and aspiring, realize that good habits are crucial to being a writer. In particular, writers really struggle to establish these habits. Two: writing is a very solitary, lonely profession and as a result, writers have deep morale issues. There’s a clear need to improve self-esteem and create connection.</p>
<p>So, TheRightMargin has decided to help writers with habits and morale. In particular, we’re going to help aspiring writers, the ones who really need our help.</p>
<h2>Step 5: Brainstorm Features to Address Needs</h2>
<p>TheRightMargin has a policy: they should only develop features that address one or both of the themes that we identified in the research. We really want to help writers, so it makes sense that the platform should hit on those two key pain points.</p>
<p>We quickly realized that while some of the organizational features in the platform would help writers feel better about making progress in their work and improve their morale, there was nothing in the platform to help them establish good habits. Enter, feature brainstorming!</p>
<p>What would be an MVP feature to test to see if we could help writers establish good habits? What does it even mean to establish good habits?</p>
<p>My research indicated that habits varied: almost all writers wanted to write regularly (with varied success), but some had word count goals, which others loathed. Some considered success to be based on finishing a scene, whereas others felt successful when they had just sat down to write.</p>
<p>Some ideas:</p>
<ul>
<li>Email with personal stats</li>
<li>Email with inspirational quotes/resources</li>
<li>Word count -&gt; total words -&gt; deleted words -&gt; graphs</li>
<li>Visual mapping of your story</li>
<li>“Jargon tree”</li>
</ul>
<p>What did we actually decide to test? Freeform goals. When you log in to write, we ask you to set a goal. You check it off if you accomplish it, and the system keeps track of your accomplishments. There’s also an email reminder asking you to check off your goal and set a new one.</p>
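<p>The mechanics behind freeform goals are simple enough to sketch in a few lines of code. This is an illustrative model only (the class and field names are mine, not TheRightMargin’s actual implementation):</p>

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Goal:
    text: str             # freeform: "finish the fight scene", "just sit down and write"
    set_on: date
    done: bool = False

@dataclass
class Writer:
    name: str
    goals: list = field(default_factory=list)

    def set_goal(self, text: str) -> Goal:
        # Asked for at login; the writer defines success in their own terms.
        goal = Goal(text=text, set_on=date.today())
        self.goals.append(goal)
        return goal

    def check_off(self, goal: Goal) -> None:
        goal.done = True

    def accomplishments(self) -> list:
        # What a reminder email could report back to the writer.
        return [g.text for g in self.goals if g.done]
```

<p>The point of keeping the goal text freeform is exactly what the research showed: some writers measure success in word counts, others in finished scenes, others in simply showing up, and the system shouldn’t impose one definition on all of them.</p>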
<p>Here’s what the initial whiteboard sketch looked like:</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-9070" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/10/1-g77nA-byaert9IsFJILElQ.jpg?resize=800%2C600&#038;ssl=1" alt="1-g77nA-byaert9IsFJILElQ" width="800" height="600" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/10/1-g77nA-byaert9IsFJILElQ.jpg?w=800&amp;ssl=1 800w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/10/1-g77nA-byaert9IsFJILElQ.jpg?resize=300%2C225&amp;ssl=1 300w" sizes="auto, (max-width: 800px) 100vw, 800px" /></p>
<p>Here’s what it looks like now:</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-9071" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/10/1-DD3Q6BjVN6iJNU8B27L36Q.png?resize=682%2C746&#038;ssl=1" alt="1-DD3Q6BjVN6iJNU8B27L36Q" width="682" height="746" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/10/1-DD3Q6BjVN6iJNU8B27L36Q.png?w=682&amp;ssl=1 682w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/10/1-DD3Q6BjVN6iJNU8B27L36Q.png?resize=274%2C300&amp;ssl=1 274w" sizes="auto, (max-width: 682px) 100vw, 682px" /></p>
<h2>Step 6: Get Feedback</h2>
<p>TheRightMargin has only recently opened up to users, so it’s still in the early stages, but the hope is that, with feedback, the platform will be able to improve and be awesome for writers.</p>
<p>User interviews now include some usability testing and prototype reviews, so we can get specific feedback on the platform. But we are still actively collecting behavioral data and will do so ad infinitum — it’s incredibly important to continue learning and narrowing our scope.</p>
<p><strong>Speaking of which, are you a writer? We’d love to talk to you.</strong> We compensate with Amazon gift cards, and I’ve been told it’s actually pretty fun. If so, <a href="https://docs.google.com/a/socialergonomics.com/forms/d/1tGu2mYt16fRcAOZY3lq4ZsCrKODxmDmkdZuKdhjakaw/viewform" target="_blank">sign up here</a>.</p>
<p>And this is why user research is critical. It directly identified an entire thrust of TheRightMargin’s product development that would not have existed without a deep understanding of and empathy for writers. <strong>Talking to people doesn’t hold you back; it empowers you.</strong> Don’t ignore user research, kids — it will help you innovate and do cool shit, I promise.</p>
<p>Written by: <a href="https://medium.com/@megkierstead" target="_blank">Megan Kierstead</a>, <a href="https://medium.com/@megkierstead/how-to-conduct-user-research-and-build-features-b37908dd4e53?UX_Design_Weekly_Issue_49" target="_blank">Medium</a><br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2015/10/how-to-conduct-user-research-and-build-features/">How to Conduct User Research and Build Features</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2015/10/how-to-conduct-user-research-and-build-features/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">9069</post-id>	</item>
		<item>
		<title>From Privacy to Productivity: A Look at How Virtual Reality Could Change the Way We Work</title>
		<link>https://www.situatedresearch.com/2015/07/from-privacy-to-productivity-a-look-at-how-virtual-reality-could-change-the-way-we-work/</link>
					<comments>https://www.situatedresearch.com/2015/07/from-privacy-to-productivity-a-look-at-how-virtual-reality-could-change-the-way-we-work/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Sat, 25 Jul 2015 18:06:37 +0000</pubDate>
				<category><![CDATA[Collaboration]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Communication]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Learning]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=8942</guid>

					<description><![CDATA[<p>Businesses someday getting on board with virtual reality will need to do some self-examination. Various VR tools are aimed at reclaiming productivity and improving interactions.  The fabled “promise” of virtual reality is expansive. At its loftiest, we’ve been promised not only changes to how we live and how we consume entertainment, but also to how&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2015/07/from-privacy-to-productivity-a-look-at-how-virtual-reality-could-change-the-way-we-work/">From Privacy to Productivity: A Look at How Virtual Reality Could Change the Way We Work</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>Businesses someday getting on board with virtual reality will need to do some self-examination. Various VR tools are aimed at reclaiming productivity and improving interactions. </strong></p>
<p>The fabled “promise” of virtual reality is expansive. At its loftiest, we’ve been promised not only changes to how we live and how we consume entertainment, but also to how we work. <span id="more-8942"></span></p>
<p>After all, tech loves a good workplace trend.</p>
<p>In a general sense, incorporating virtual reality into business could mean things like escape from the physical confines of a desk, or the limit of how many monitors you could stick on that desk, or the general lack of aesthetics associated with cubicles, let’s say.</p>
<p>At the moment, there seem to be two ends of the spectrum developing — VR to help you get work done with other people, and VR to help you get away from, perhaps, those same people later in the day.</p>
<p>One instance of the latter is Icelandic company <a href="http://www.murevr.com/#the-team-1-section" target="_blank">Breakroom</a>. They’re still in the early days, but the idea behind Breakroom stems from the proliferation of open-concept offices — the kind popularized by tech companies as markers of innovation and avant-garde thinking, and the same kind that the Harvard Business Review, among others, has said are now negatively impacting <a href="https://hbr.org/2014/10/the-transparency-trap&amp;cm_sp=Article-_-Links-_-Top%20of%20Page%20Recirculation" target="_blank">privacy</a>, <a href="http://www.newyorker.com/business/currency/the-open-office-trap" target="_blank">productivity</a>, and <a href="http://www.fastcompany.com/3019758/dialed/offices-for-all-why-open-office-layouts-are-bad-for-employees-bosses-and-productivity" target="_blank">workplace satisfaction</a>.</p>
<p>One of Breakroom’s founders, Diðrik Steinsson, drew inspiration from having to work in an open office space himself. The idea behind Breakroom is that a worker in such an office might have a head-mounted display like the Oculus Rift at his or her desk, and when it’s time to really focus on something for a few hours, they can put it on and go into a virtual environment with multiple manipulable browser windows and integration with Google Apps and Office 365, and get some work done, all while sitting somewhere scenic like a grassy field, or the moon. (Some co-workers will push you there.)</p>
<p>“I see it as a fortress of solitude for people,” Steinsson said. And he’s betting workers will be wearing some type of HMD eventually, even if it’s not within the next 10 years.</p>
<p>The flip side of this, to a degree, is a virtual reality application like AltspaceVR. The social VR app lets users enter its virtual world as robot avatars to socialize. It’s not necessarily aimed at businesses or the enterprise, but CEO Eric Romo said they’ve been using it for functions like business meetings and even job candidate interviews.</p>
<p>Romo emphasizes the value of nonverbal communication. A conference call, for example, can be awkward. People talk over each other, and it’s difficult to get a read on the other people present when all nonverbal cues like facial expressions and body language are absent. Romo said the experience of meeting and interacting with others is more effective when things like head movements are getting translated into VR.</p>
<p>Altspace has features like private and multi-user web browsers — so multiple people could, for example, look at code together. The use cases slide back and forth between consumer and enterprise a little like this: Romo said that if you want to show off vacation pictures, there’s no reason why those couldn’t just as easily be slide decks.</p>
<p>Somewhere in between those two examples, there’s something like the <a href="http://www.fastcodesign.com/3028433/virtual-reality-goes-to-work" target="_blank">demo</a> that Oliver Kreylos of UC Davis’ Institute for Data Analysis and Visualization put together in 2014: 3D-captured data of an office that includes 2D desktop apps.</p>
<p>But to eventually get these or other virtual reality tools into the business world, there are still some hurdles to jump, like nailing down inputs, or even just supplying every worker with not only an HMD but also a Kinect sensor and a Leap Motion sensor in order to translate more movement into VR. It also raises a bigger question: what does all this really solve?</p>
<p>“When you want to introduce a technology like VR into some sort of business process, it’s really got to have some sort of overall benefit,” said Gartner analyst Brian Blau. “Some of these behavior replacement cycles — one of the things that you’ll find is that often times they’re more incremental than they are revolutionary.”</p>
<p>Introducing something like VR into a business environment would be revolutionary in the sense that it would be a change of device, software, and user interface, all at once.</p>
<p>What he asks is what are the steps? What are the actions being changed? Being able to answer those questions could be a determining factor in whether virtual reality ever takes hold in the enterprise.</p>
<p>He said more general uses are harder to make an argument for. Take a meeting, for example — over the years, the tech surrounding the ways in which people meet has ranged from phone calls, to conference calls, to video calls, to video calls on mobile devices — so what’s the big value add of virtual reality?</p>
<p>Romo submits the nonverbal cues and the basic malleability of a virtual reality environment: the ability to turn a space into whatever it is a user might need.</p>
<p>Still, Blau sees more potential in purpose-built VR tools. Think data visualization, training, prototyping, and design.</p>
<p>Another consideration is what happens after introducing something like an HMD into an office worker’s everyday use.</p>
<p>Computer Vision Syndrome is already rampant. Dr. Dominick Maino, a professor at the <a href="http://ico.edu/" target="_blank">Illinois College of Optometry/Illinois Eye Institute</a> who specializes in pediatrics and binocular vision and has done research on vision and 3D graphics, said that if anything, introducing VR into workplaces would probably end up surfacing a lot of vision problems relating to faulty binocular vision. Those are the kinds of problems that will need to get fixed before people can actually use a VR tool.</p>
<p>Still, this is all probably a ways off. Breakroom is about to start testing its product. Altspace is focusing mostly on consumer use, but is crafting a product that could also be used in business.</p>
<p>Now, if only VR could offer a fix for the big business problems — like the “reply all” email thread.</p>
<p><em>[This article from <a href="http://www.techrepublic.com/article/from-privacy-to-productivity-a-look-at-how-virtual-reality-could-change-the-way-we-work/" target="_blank">TechRepublic</a> focuses on the uses of presence technology to both separate and connect people in the workplace; I think the Breakroom VR application by <a href="http://www.murevr.com/" target="_blank">MureVR</a> is particularly interesting; you can watch a 6:13 minute video about it on <a href="https://www.youtube.com/watch?v=KvJgJAppbxQ" target="_blank">YouTube</a>.]</em></p>
<p>Written by: <a href="http://www.techrepublic.com/search/?a=erin+carson" target="_blank">Erin Carson</a>, <a href="http://www.techrepublic.com/article/from-privacy-to-productivity-a-look-at-how-virtual-reality-could-change-the-way-we-work/" target="_blank">TechRepublic</a> (via <a href="http://ispr.info/2015/07/15/tools-to-separate-and-connect-us-how-vr-could-change-the-way-we-work/">Presence</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2015/07/from-privacy-to-productivity-a-look-at-how-virtual-reality-could-change-the-way-we-work/">From Privacy to Productivity: A Look at How Virtual Reality Could Change the Way We Work</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2015/07/from-privacy-to-productivity-a-look-at-how-virtual-reality-could-change-the-way-we-work/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">8942</post-id>	</item>
		<item>
		<title>Hands-on with Mattel’s new AR, VR View-Master</title>
		<link>https://www.situatedresearch.com/2015/02/hands-on-with-mattels-new-ar-vr-view-master/</link>
					<comments>https://www.situatedresearch.com/2015/02/hands-on-with-mattels-new-ar-vr-view-master/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Fri, 20 Feb 2015 15:54:37 +0000</pubDate>
				<category><![CDATA[Education]]></category>
		<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Aesthetics]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Games for Learning]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Mobile Devices]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=8819</guid>

					<description><![CDATA[<p>A View-Master for virtual reality: Hands-on with Mattel&#8217;s new AR, VR phone toy Mattel is relaunching View-Master, but as a virtual reality and augmented-reality phone toy. And I got to play around with it for a bit…or at least, some of the tech behind it.  Announced at an event in New York City, the new&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2015/02/hands-on-with-mattels-new-ar-vr-view-master/">Hands-on with Mattel’s new AR, VR View-Master</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>A View-Master for virtual reality: Hands-on with Mattel&#8217;s new AR, VR phone toy</strong></p>
<p><span style="line-height: 1.5;">Mattel is relaunching View-Master, but as a virtual reality and augmented-reality phone toy. And I got to play around with it for a bit…or at least, some of the tech behind it. </span><span id="more-8819"></span></p>
<p>Announced at an event in New York City, <a href="http://www.cnet.com/news/google-mattel-announce-a-virtual-reality-view-master/" target="_blank">the new View-Master</a> is a collaboration between Mattel and Google, whose virtual reality Cardboard app has enabled cheap do-it-yourself accessories to turn any Android phone into a mini-VR viewer. Mattel’s toy, which will debut in October, is like a more durable, plastic version of <a href="http://www.cnet.com/news/googles-cardboard-vr-headset-is-no-joke-its-great-for-the-oculus-rift/" target="_blank">Google Cardboard</a>, designed entirely for kids…or, maybe, also for grown-up kids like me. And the most brilliant part is it’ll only cost $30.</p>
<p><iframe loading="lazy" src="http://www.cnet.com/videos/share/id/tUlXVC5TlPLbcmd7Lo7cfkU6k0P1Edow/" width="960" height="540" frameborder="0" seamless="seamless" scrolling="no" allowfullscreen="allowfullscreen"></iframe></p>
<p>I used View-Master back when I was little — who didn’t? It’s a classic 3D stereoscopic picture viewer. Many people had even said Google Cardboard looked a bit like a View-Master. So it isn’t a huge surprise that Mattel has suddenly announced a new View-Master with Google Cardboard VR capabilities added. I’ve always felt that virtual reality reminded me of early stereoscopic toys, and Mattel has keyed into the same idea.</p>
<figure id="attachment_8821" aria-describedby="caption-attachment-8821" style="width: 770px" class="wp-caption aligncenter"><img data-recalc-dims="1" loading="lazy" decoding="async" class="size-full wp-image-8821" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster1.jpg?resize=770%2C577&#038;ssl=1" alt="The View-Master will fit most phones, according to Mattel: iPhone and Android alike." width="770" height="577" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster1.jpg?w=770&amp;ssl=1 770w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster1.jpg?resize=300%2C225&amp;ssl=1 300w" sizes="auto, (max-width: 770px) 100vw, 770px" /><figcaption id="caption-attachment-8821" class="wp-caption-text">The View-Master will fit most phones, according to Mattel: iPhone and Android alike.</figcaption></figure>
<p>The toy was only viewable in a mock-up prototype form at Mattel’s event, but the design’s pretty cool: it looks half old-school View-Master, half Oculus Rift. The inner plastic housing extends to hold many types of phones: Mattel says it’s designed to fit the largest existing phones, and will even work with the <a href="http://www.cnet.com/products/apple-iphone-6-plus/" target="_blank">iPhone 6 Plus</a> and <a href="http://www.cnet.com/products/google-nexus-6/" target="_blank">Nexus 6</a>. A capacitive-touch side lever is used to “click” through scenes or into virtual environments, like the magnetized side switch on Google’s Cardboard viewers.</p>
<p>Mattel’s headset is designed with Google and Android in mind, but at launch is intended to work on “nearly all platforms,” which includes iOS. That would mean a dedicated Mattel app which interfaces with the View-Master, but Google’s Cardboard and Cardboard-ready apps — many of which already exist on iOS, like VRSE — will work too.</p>
<p>Mattel is planning to use View-Master not just for VR, but also for AR; the little plastic reels that look like the old cardboard ones are really just flat coasters this time around, with images on top that the View-Master reads and turns into pop-up augmented-reality models on your table, desk, or wherever else you place them. Unlike the old pop-in reels, a single reel set down on a table could serve multiple View-Masters at once. This type of augmented-reality tech has already existed for years in many apps and on some children’s toys like the Nintendo 3DS (with its AR cards) and PlayStation Vita, but mixing it into a VR headset is a novel idea.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-8822" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster3.jpg?resize=770%2C577&#038;ssl=1" alt="viewmaster3" width="770" height="577" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster3.jpg?w=770&amp;ssl=1 770w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster3.jpg?resize=300%2C225&amp;ssl=1 300w" sizes="auto, (max-width: 770px) 100vw, 770px" /></p>
<p>I didn’t get to use the actual Mattel prototype, but we tried View-Master’s augmented-reality tech on phones and Google Cardboard viewers. There were three reels to try. A dinosaur one made a little dinosaur pop up on the disc on the table in front of me; when I aimed a dot at it and clicked, I was suddenly surrounded by a prehistoric 360-degree panorama with 3D dinosaurs. Clicking on them brought up facts, too.</p>
<p>Looking at the space disc with Cardboard on brought up a pop-up moon and Earth; clicking on it took me to a panorama of the moon, with pop-up clickable photos of NASA missions. A third, San Francisco-themed reel had little mini-models of Alcatraz and the Golden Gate Bridge that turned into VR photo panoramas. To exit any of the virtual panoramas, you look down and click on the side…or remove the View-Master from your face. The View-Master comes with one reel in its $30 package, and extra reels will cost around $15 each. No, older View-Master reels don’t work in here, but it sounds like Mattel is exploring re-releasing content from its back catalog of some 10,000 older View-Master reels.</p>
<figure id="attachment_8823" aria-describedby="caption-attachment-8823" style="width: 770px" class="wp-caption aligncenter"><img data-recalc-dims="1" loading="lazy" decoding="async" class="size-full wp-image-8823" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster4.jpg?resize=770%2C577&#038;ssl=1" alt="The &quot;reels&quot; don't actually go in the View-Master, they simply sit on your table." width="770" height="577" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster4.jpg?w=770&amp;ssl=1 770w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/02/viewmaster4.jpg?resize=300%2C225&amp;ssl=1 300w" sizes="auto, (max-width: 770px) 100vw, 770px" /><figcaption id="caption-attachment-8823" class="wp-caption-text">The &#8220;reels&#8221; don&#8217;t actually go in the View-Master, they simply sit on your table.</figcaption></figure>
<p>There’s no strap to keep the View-Master on: this is a hold-to-your-face toy, much like older View-Masters and Google Cardboard. Mattel has promised that the tech has already been vetted by pediatric ophthalmologists, and is meant for children ages 7 and up — in short, bite-sized sessions.</p>
<p>The View-Master may work with other toys, too, like other app-ified toys in the past, but for now it’s really a fancier plastic Google Cardboard viewer, with additional Mattel support. That’s not a bad thing at all: at $30, this is a pretty awesome little stocking-stuffer idea, and a fun phone toy. Just keep in mind that if you give this to your kid, it won’t work without a phone popped into it.</p>
<p>By the time fall rolls around, Mattel may have other toys ready to work with it. Or, there might be many other companies ready to make cheap phone-enabled VR headsets, too.</p>
<p>Written by: <a href="http://www.cnet.com/profiles/scottstein8/" target="_blank">Scott Stein</a>, <a href="http://www.cnet.com/products/new-view-master/" target="_blank">CNET</a> (via <a href="http://ispr.info/2015/02/20/hands-on-with-mattels-new-ar-vr-view-master/" target="_blank">Presence</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2015/02/hands-on-with-mattels-new-ar-vr-view-master/">Hands-on with Mattel’s new AR, VR View-Master</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2015/02/hands-on-with-mattels-new-ar-vr-view-master/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">8819</post-id>	</item>
		<item>
		<title>Welcome to the Age of Holographs</title>
		<link>https://www.situatedresearch.com/2015/01/welcome-age-holographs/</link>
					<comments>https://www.situatedresearch.com/2015/01/welcome-age-holographs/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Thu, 22 Jan 2015 22:18:54 +0000</pubDate>
				<category><![CDATA[Education]]></category>
		<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Aesthetics]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Games for Learning]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Mental Models]]></category>
		<category><![CDATA[Personalization]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[User-Centered Design]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=8792</guid>

					<description><![CDATA[<p>Up close with the HoloLens, Microsoft’s most intriguing product in years We just finished a heavily scripted, carefully managed, and completely amazing demonstration of Microsoft’s HoloLens technology. Four demos, actually, each designed to show off a different use case for a headset that projects holograms into real space. We played Minecraft on a coffee table.&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2015/01/welcome-age-holographs/">Welcome to the Age of Holographs</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>Up close with the HoloLens, Microsoft’s most intriguing product in years</strong></p>
<p>We just finished a heavily scripted, carefully managed, and completely amazing demonstration of Microsoft’s HoloLens technology. Four demos, actually, each designed to show off a different use case for a headset that projects holograms into real space. We played <em>Minecraft</em> on a coffee table. We had somebody chart out how to fix a light switch right on top of the very thing we were fixing. <span id="more-8792"></span></p>
<p>We walked on Mars.</p>
<p>You’ll notice there aren’t photos here, and that’s because before we were even allowed into the labs where the HoloLens team tests out its user experiences, we had to deposit our cameras and phones into a locker. No recording equipment of any kind was allowed, not even audio. We entered the basement below Microsoft’s visitor center laughing at the absurdity of it all — many reporters needed to get notepads from the company and weren’t carrying pens, either.</p>
<p>But it was all worth it, because HoloLens is probably the most intriguing (and, in many ways, most infuriating) technology we’ve experienced since the Oculus Rift. And there are many parallels with the Rift to be had: both are immersive, but in different ways; both require you to strap a weird thing on your head; both leave you grinning like an absolute idiot at a scene only you can see. And, crucially, both need more work when it comes to thinking through exactly how to control and interact with virtual things.</p>
<p><script height="575px" width="1023px" src="https://player.ooyala.com/iframe.js#ec=lsOGp3cjqUFwNW0FqImWpiKsqIdSTEX-&#038;pbid=dcc84e41db014454b08662a766057e2b"></script></p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-8793" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/d76d1d6c-a4e6-41d2-bc43-c9b49041a219.0.png?resize=864%2C392&#038;ssl=1" alt="d76d1d6c-a4e6-41d2-bc43-c9b49041a219.0" width="864" height="392" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/d76d1d6c-a4e6-41d2-bc43-c9b49041a219.0.png?w=864&amp;ssl=1 864w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/d76d1d6c-a4e6-41d2-bc43-c9b49041a219.0.png?resize=300%2C136&amp;ssl=1 300w" sizes="auto, (max-width: 864px) 100vw, 864px" /></p>
<p><strong><em>Minecraft</em> IRL<br />
</strong>by Dieter Bohn</p>
<p>By far, <a href="https://www.theverge.com/2015/1/21/7868363/minecraft-hololens-microsoft-freecell" target="_blank" rel="noopener">the most impressive demo for my money was the <em>Minecraft</em> demo</a> — though Microsoft called it something like “Building Blocks” or some such, presumably so as not to fully commit to releasing a full holograph version of <em>Minecraft</em>. But before we could enter this virtual world — actually, the virtual entered <em>our</em> world — we had to strap on the development unit for the HoloLens.</p>
<p>It’s a contraption, to be sure. There’s a small, heavy block you hang around your neck which contains all the computing power. The headset itself is made up of lenses and tiny projectors and motion sensors and speakers (or <em>something</em> that makes sound, anyway), and god knows what else. And then there’s a screen right there in your field of view.</p>
<p>A “screen in your field of view” is the right way to think about HoloLens, too. It’s immersive, but not nearly as immersive as proper virtual reality is. You still see the real world in between the virtual objects; you can see where the magic holograph world ends and your peripheral vision begins.</p>
<p>But before you can apply your jaded “I’ve done VR before” attitude to this situation, you look down at the coffee table and there’s a <strong>castle sitting right on the damn thing.</strong> It’s not shimmery, but it’s not quite real, either. It’s just sitting there, perfectly flat on the table, reacting in space to your head movements. It’s nearly as lifelike as the actual table, and there’s no lag at all. The castle is there. It’s simply magic.</p>
<p>You definitely have a big stupid grin on your face even though the contraption that’s strapped to it is pressing your eyeglasses into the bridge of your nose in a painful way.</p>
<p>Then it’s demo time. You can’t touch anything, but you can look and point a little circle at objects on it by moving your head around. You learn how a “glance” is just you looking at things and pointing your reticle at them, and an “AirTap” is the equivalent of clicking your mouse. The demo involves digging <em>Minecraft</em> holes and blowing up <em>Minecraft</em> zombies with <em>Minecraft</em> TNT. It’s basically incredible to see these digital things in real space.</p>
<p>You blow up a hole in the table and then you look <em>through</em> it to more digital objects on the floor. You blow up a hole in the wall and tiny bats fly out and you see that behind your very normal wall is a virtual hellscape of lava and rock. You peer into the hole, around the corner, and see that dark realm extend far into space.</p>
<p>And then the demo’s over.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-large wp-image-8794" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/a67d3d33-e1e5-4cf7-bf3d-dbe1befc8d8c.0.jpg?resize=980%2C655&#038;ssl=1" alt="a67d3d33-e1e5-4cf7-bf3d-dbe1befc8d8c.0" width="980" height="655" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/a67d3d33-e1e5-4cf7-bf3d-dbe1befc8d8c.0.jpg?resize=1024%2C684&amp;ssl=1 1024w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/a67d3d33-e1e5-4cf7-bf3d-dbe1befc8d8c.0.jpg?resize=300%2C200&amp;ssl=1 300w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/a67d3d33-e1e5-4cf7-bf3d-dbe1befc8d8c.0.jpg?w=1200&amp;ssl=1 1200w" sizes="auto, (max-width: 980px) 100vw, 980px" /></p>
<p><strong>Skype<br />
</strong>by Tom Warren</p>
<p>Microsoft’s Skype demo was every bit as impressive to me as playing around with <em>Minecraft</em> blocks in a living room. After a two-hour keynote, Microsoft wanted me to fix a light switch. It all started by sitting down and facing some tools and a socket with exposed wiring. A little dazed and confused, I looked up and scanned across the Skype interface that suddenly appeared in front of me, and picked a face to call. The video call popped into a little window, and my journey to fix a light switch began.</p>
<p>On the other end of the call was a Microsoft engineer. I could see and hear her, but she could only hear me and see exactly what I was seeing in front of me. My eyes, or rather the headset on my head, were relaying everything over Skype. It was a support call of sorts: she was there to help me fix a light switch. We started by pinning her little window on top of a lamp. I could then look around the room and return to the lamp to see her face. She guided me where to go. It felt strangely natural, and I didn’t need to configure anything or learn gestures other than the same “Air Tap” you use to simulate a mouse click.</p>
<p>While I was being talked through which real-world tools we needed for the job, the Microsoft engineer called my attention to the wall with wiring and then started drawing where to position the light switch right on the wall. Thinking about it now, it sounds totally surreal, but during the demo I didn’t even think about it — it just felt like I was being guided around with annotations and a helpful friend. We connected the wiring, tested it for an electrical current, and then turned the power back on and switched the light on. It was all fixed, and all by using a crazy combination of a headset, augmented reality, and Skype. It might sound gimmicky, but the applications here are truly impressive. I use YouTube guides to figure out home improvements or to service my car, but this is on another level. Imagine a surgeon performing complex surgery, writing notes in real time and guiding a colleague through it all. Imagine support calls to resolve a problem with your PC. If this works as well as Microsoft’s controlled demo, then this really has the ability to change how we communicate and learn.</p>
<p>Microsoft’s next demo didn’t have us using the HoloLens prototypes directly. Instead, we watched as “Alex” (nobody in Microsoft’s blue-tinted demonstration basement has last names. I asked.) manipulated objects in digital space so he could build a koala bear or a pickup truck. It was actually quite impressive, as cameras filmed him and screens showed both Alex and the virtual objects he was manipulating in the same space in real time.</p>
<p>The idea was to convince us that HoloLens would unleash a wave of creators who would be able to dream up 3D objects with little to no training. It’s much easier to understand what a thing is in your living room than it is in AutoCAD.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-8795" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/hololens.0.gif?resize=663%2C373&#038;ssl=1" alt="hololens.0" width="663" height="373" /></p>
<p>But sitting there after our whirlwind of actually <em>experiencing</em> HoloLens, my mind was elsewhere. For example, there are only a few ways to interact with this system so far:</p>
<ul>
<li>Glance: you point your head at something.</li>
<li>AirTap: you make a “Number 1” sign with your hand, then move your finger down like you’re depressing a lever.</li>
<li>Voice: you can issue commands, usually to switch what “tool” you’re using.</li>
<li>Mouse: actually the neatest thing is that the objects you already use to interact with computers can also be used to interact with holograms.</li>
</ul>
<p>That seems like enough, but it’s not nearly enough. It’s wildly impressive that these objects really do feel like they’re out there in your living room, but it’s equally depressing to know that you can’t treat them like real objects.</p>
<p>At one point in the demo, Alex needed to put a tire on his pickup. He had to twist his body and head around to get his pointer in just the right spot and get the tire arranged just right to fit on the axle. Then, AirTap! The tire is connected. But how much easier would it be if you could grab the tire in your actual hands?</p>
<p>Our hands are simply more dextrous than our necks. You have finer control over small motions, you can move your hands in so many different ways and vectors, with pressure and nuance and delicacy. Your neck and head, well, not so much.</p>
<p>But then Microsoft gave us 3D-printed koalas with a USB drive inside them, which was nice. And if this HoloLens thing takes off, you will be able to design your own, and it will be way easier than learning current 3D design software. But not as easy as it would be if you just imagined building with holograms.</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-8796" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/microsoft-windows-10-live-verge-_1662.0.jpg?resize=980%2C654&#038;ssl=1" alt="microsoft-windows-10-live-verge-_1662.0" width="980" height="654" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/microsoft-windows-10-live-verge-_1662.0.jpg?w=1000&amp;ssl=1 1000w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2015/01/microsoft-windows-10-live-verge-_1662.0.jpg?resize=300%2C200&amp;ssl=1 300w" sizes="auto, (max-width: 980px) 100vw, 980px" /></p>
<p><strong>Walking on Mars<br />
</strong>By Tom Warren</p>
<p>Microsoft has teamed up with NASA to let scientists explore what Curiosity sees on Mars. Instead of panoramic imagery on a computer screen, Microsoft’s demo lit up a room and turned it into Mars. I walked around the rocky terrain, bumped into the Curiosity rover, and generally just checked out a planet I will never visit in my lifetime. It’s a totally new perspective that felt like I was immersed in touring Mars, but not necessarily there. The field of view felt a little too limited to truly immerse myself and trick my brain into thinking I was really on another planet, but what impressed me most is what Microsoft has built into this experience.</p>
<p>I held a call with a NASA engineer and he talked me through the terrain. I squatted to look more closely at rocks, took snapshots of various rock formations, and even planted flags for points of interest. My jaw dropped when I ventured over to a PC in the room and started to experiment with the mouse. I pulled the mouse pointer off the screen and suddenly it was on the floor next to me, allowing me to set markers in the virtual environment. It’s everything I’ve seen in demonstrations from Microsoft Research before, but here it was on my head and working.</p>
<p>The collaboration part was the key here, allowing me to interact with this data in a unique way, but also alongside the NASA engineer who could drop flags on the Mars terrain and guide me to look at certain sections. While this isn’t traditional productivity with a mouse and keyboard, it’s certainly something new and intriguing. I could see this type of scenario working for big teams that need to communicate across time zones and on big sets of complex data.</p>
<p>Overall, HoloLens is Microsoft at its most ambitious. It’s a big bet on the future of computing, the future of Windows, and ultimately the future of Microsoft itself. While the company is struggling in mobile, it wants to catch the next wave of computing and lead. Is HoloLens the next wave? Developers and consumers will be the ultimate test of that, but if anything, HoloLens is an incredibly brave and impressive project from Microsoft. It’s true innovation, which is something Microsoft has lacked during its obsession with protecting Windows. It’s also another example of <a href="https://www.theverge.com/2014/11/6/7164623/microsoft-3d-sound-headset-guide-dogs" target="_blank" rel="noopener">an experience that takes the complex technology out of the way</a>, leaving you to experience what really matters.</p>
<p>Written by: <a href="https://www.theverge.com/users/Dieter%20Bohn" target="_blank" rel="noopener">Dieter Bohn</a> and <a href="https://www.theverge.com/users/tomwarren" target="_blank" rel="noopener">Tom Warren</a>, <a href="https://www.theverge.com/2015/1/21/7868251/microsoft-hololens-hologram-hands-on-experience" target="_blank" rel="noopener">The Verge</a> (via <a href="https://ispr.info/2015/01/22/up-close-with-the-hololens-microsofts-intriguing-mixed-reality-product/" target="_blank" rel="noopener">Presence</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2015/01/welcome-age-holographs/">Welcome to the Age of Holographs</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2015/01/welcome-age-holographs/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">8792</post-id>	</item>
		<item>
		<title>IBM Forecasts Major Advances in Cognitive Computing</title>
		<link>https://www.situatedresearch.com/2013/12/ibm-forecasts-major-advances-cognitive-computing/</link>
					<comments>https://www.situatedresearch.com/2013/12/ibm-forecasts-major-advances-cognitive-computing/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Fri, 27 Dec 2013 16:59:58 +0000</pubDate>
				<category><![CDATA[Collaboration]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[Games for Learning]]></category>
		<category><![CDATA[Usability Research]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=5532</guid>

					<description><![CDATA[<p>IBM on Tuesday released its annual &#8220;5 in 5&#8221; list of predictions about technological innovations that will change the way we live in the next five years, with the theme this year being cognitive advances in computing that help machines &#8220;learn&#8221; how to better serve us.  Last year&#8217;s 5 in 5 list also focused on&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2013/12/ibm-forecasts-major-advances-cognitive-computing/">IBM Forecasts Major Advances in Cognitive Computing</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>IBM on Tuesday released its annual &#8220;5 in 5&#8221; list of predictions about technological innovations that will change the way we live in the next five years, with the theme this year being cognitive advances in computing that help machines &#8220;learn&#8221; how to better serve us. <span id="more-5532"></span></p>
<p>Last year&#8217;s 5 in 5 list also focused on the <a href="http://www.pcmag.com/article2/0,2817,2413300,00.asp" data-ls-seen="1">rise of cognition in computing</a> and how the five senses humans use to gain information about and manipulate the physical world are being emulated by computing systems like IBM&#8217;s own Watson artificial intelligence framework.</p>
<p>For this year&#8217;s edition, IBM got a little more specific about the ways that such advances in machine learning will affect us, touching more on data analytics and offering up the following predictions:</p>
<p><b>The classroom will learn you.</b> Kerrie Holley of IBM described this as a concept &#8220;built on a lot of the technologies you see with how the Khan Academy works, cloud-based computing, and the like.&#8221; In the years to come, new learning technologies will use advanced analytics of &#8220;longitudinal student records&#8221; to help teachers better assess what individual students need, which ones are at risk, and how to help them in their education, he said.</p>
<p><iframe loading="lazy" frameborder="0" allowfullscreen src="//www.youtube.com/embed/hTA5GyWamR0" width="650" height="390"></iframe></p>
<p><b>Buying local will beat online.</b> Less about a specific tech advance, this prediction is based on the idea that the &#8220;tables will turn&#8221; in terms of access to the kind of technology, cloud services, and analytics that can help &#8220;mom and pop&#8221; businesses compete more readily with big national and global retailers, Holley said. &#8220;Technology costs are dropping and as they do, proximity will allow local retailers to create experiences the big retailers are not able to do online.&#8221;</p>
<p><iframe loading="lazy" frameborder="0" allowfullscreen src="//www.youtube.com/embed/yKNSOwLcrkE" width="650" height="390"></iframe></p>
<p><b>Doctors will use your DNA to keep you well.</b> IBM presented this prediction as one involving more advanced computational work than some of the others in its 5-in-5 list. &#8220;Cognitive-based systems like Watson, along with breakthroughs in genomic research, will enable doctors to be better able to diagnose cancer and offer better treatments,&#8221; Holley said.</p>
<p><iframe loading="lazy" frameborder="0" allowfullscreen src="//www.youtube.com/embed/0M1DMdc1mQ0" width="650" height="390"></iframe></p>
<p><b>The city will help you live in it.</b> In just a few decades, as many as seven out of 10 people around the world will live in cities, according to some projections. We&#8217;re already seeing more computational resources being dedicated to helping those city dwellers manage their urban lives and that will only accelerate, according to IBM.</p>
<p><iframe loading="lazy" frameborder="0" allowfullscreen src="//www.youtube.com/embed/tVGviMIMjN0" width="650" height="390"></iframe></p>
<p><b>A digital guardian will protect you online.</b> Holley explained this prediction as an expansion on financial fraud protection services offered by banks and credit card companies, only much more personally tailored to individuals to safeguard their entire digital lives.</p>
<p><iframe loading="lazy" frameborder="0" allowfullscreen src="//www.youtube.com/embed/al8ng82nRss" width="650" height="390"></iframe></p>
<p>&#8220;This year&#8217;s IBM 5 in 5 explores the idea that everything will learn—driven by a new era of cognitive systems where machines will learn, reason and engage with us in a more natural and personalized way. These innovations are beginning to emerge enabled by cloud computing, big data analytics, and learning technologies all coming together,&#8221; the research team behind the company&#8217;s annual list of predictions said in a statement.</p>
<p>&#8220;Over time these computers will get smarter and more customized through interactions with data, devices, and people, helping us take on what may have been seen as unsolvable problems by using all the information that surrounds us and bringing the right insight or suggestion to our fingertips right when it&#8217;s most needed. A new era in computing will lead to breakthroughs that will amplify human abilities, assist us in making good choices, look out for us, and help us navigate our world in powerful new ways.&#8221;</p>
<p>Written by: <a href="http://www.pcmag.com/author-bio/damon-poeter">Damon Poeter</a>, <a href="http://www.pcmag.com/article2/0,2817,2428432,00.asp">PC Mag</a><br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2013/12/ibm-forecasts-major-advances-cognitive-computing/">IBM Forecasts Major Advances in Cognitive Computing</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2013/12/ibm-forecasts-major-advances-cognitive-computing/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">5532</post-id>	</item>
		<item>
		<title>Serious Games and the Future of Education</title>
		<link>https://www.situatedresearch.com/2013/09/serious-games-future-education/</link>
					<comments>https://www.situatedresearch.com/2013/09/serious-games-future-education/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Wed, 18 Sep 2013 15:50:08 +0000</pubDate>
				<category><![CDATA[Collaboration]]></category>
		<category><![CDATA[Education]]></category>
		<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[Communication]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Games for Learning]]></category>
		<category><![CDATA[Mental Models]]></category>
		<category><![CDATA[Personalization]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User Interface]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=5325</guid>

					<description><![CDATA[<p>Are serious games the classroom tool of the future? Is the future already here?  The tablet classroom may have once been the stuff of science fiction, but modern developments in technology and brain science may have come together to create a massive change in the way we think about education.  “The essence of what’s going&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2013/09/serious-games-future-education/">Serious Games and the Future of Education</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Are serious games the classroom tool of the future? Is the future already here?  The tablet classroom may have once been the stuff of science fiction, but modern developments in technology and brain science may have come together to create a massive change in the way we think about education. <span id="more-5325"></span></p>
<figure id="attachment_5326" aria-describedby="caption-attachment-5326" style="width: 239px" class="wp-caption alignleft"><img data-recalc-dims="1" loading="lazy" decoding="async" class="wp-image-5326" style="margin-right: 10px;" title="Nolan Bushnell" alt="Nolan_Bushnell" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/Nolan_Bushnell.jpg?resize=239%2C360&#038;ssl=1" width="239" height="360" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/Nolan_Bushnell.jpg?w=399&amp;ssl=1 399w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/Nolan_Bushnell.jpg?resize=199%2C300&amp;ssl=1 199w" sizes="auto, (max-width: 239px) 100vw, 239px" /><figcaption id="caption-attachment-5326" class="wp-caption-text"><em>Nolan Bushnell</em></figcaption></figure>
<p>“The essence of what’s going on now is the adoption of brain science… It turns out that if you teach in a different way, you can get outcomes that are 10-20 times more efficient and stickier,” says <strong><a href="http://www.brainrush.com/">Brainrush</a> </strong>founder Nolan Bushnell.</p>
<p>Bushnell, founder of Atari, Inc. and Chuck E. Cheese Pizza Time Theaters, believes that an integration of video games and educational software will spur one of the most significant changes in education history.  “<em><strong>In some ways</strong> <strong>the world of education is going to go through a more massive change in the next five years than it has seen in the last three thousand years</strong>. </em>It’s a perfect storm.”</p>
<p>Bushnell believes the change will come from four key areas.</p>
<ol>
<li><em>The rise of cheap, ubiquitous hardware.</em></li>
<li><em>Robust networks that allow for connectivity without the administrative constraints of the past.</em></li>
<li><em>Extreme pressure on schools to produce outcomes – too many kids are getting through high school with no meaningful job skills.</em></li>
<li><em>Adoption of brain science software.</em></li>
</ol>
<p>“One of the key factors here is the adoption of brain science.  Getting it involved in the curriculum is massively effective.  Not by 20%, not by 50%, but by many multiples of educational efficacy,” says Bushnell. “This is on a trajectory right now that is unstoppable by bureaucracy, by unions, by anything.  It’s just going to happen.”</p>
<p>Bushnell states that these factors have created a situation where the adoption of new technology isn’t just smart, but inevitable.</p>
<p>“The real issue comes down to effectiveness.  The school systems have adopted a factory system of education, which says pretty much one speed, one complexity.  As a result, there’s one person being taught at the right speed and the rest of the kids are bored or lost,” says Bushnell.  “The computer allows you to adapt to each student’s particular skills and speed.  Instead of ABCDF, all kids end up totally mastering the subject.  It’s a big change.  What it really does is it levels the understanding gap in the factory model with really impressive outcomes.”</p>
<figure id="attachment_5327" aria-describedby="caption-attachment-5327" style="width: 250px" class="wp-caption alignright"><img data-recalc-dims="1" loading="lazy" decoding="async" class="size-full wp-image-5327 " style="margin-left: 10px;" alt="Jesse Schell" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/Jesse_Schell.jpg?resize=250%2C250&#038;ssl=1" width="250" height="250" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/Jesse_Schell.jpg?w=250&amp;ssl=1 250w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/Jesse_Schell.jpg?resize=150%2C150&amp;ssl=1 150w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/09/Jesse_Schell.jpg?resize=90%2C90&amp;ssl=1 90w" sizes="auto, (max-width: 250px) 100vw, 250px" /><figcaption id="caption-attachment-5327" class="wp-caption-text"><em>Jesse Schell</em></figcaption></figure>
<p>Jesse Schell, founder and CEO of <a href="http://www.schellgames.com/"><strong>Schell Games</strong></a>, sees the shift not as something that will happen in the near future but as something already happening.</p>
<p>“Coming from an entertainment games background, I used to be creative director at the Disney Imagineering Virtual Reality Studio.  For the last 12 years I’ve been teaching at Carnegie Mellon’s Entertainment Technology Center, and about 10 years ago I started my own game studio in Pittsburgh,” says Schell.  “We’ve grown from 5 people to about 100 people right now, and what we’ve found in the last few years is that the fastest growing part of the games industry is in educational games.  What we see is going to happen is an avalanche of tablets into the school systems; they’re well poised to replace textbooks, and then a number of other changes start to happen.”</p>
<p>Schell’s take on the situation finds common ground with Bushnell’s analysis.  Like Bushnell, Schell sees the transformation as a matter not of if, but when.</p>
<p>“There’s debate on when this transformation is going to happen.  I believe that schools only make changes when they absolutely have to, or if they see there is a way to save money,” says Schell.  “I think it’s possible that they will see tablets as a way to save money.  Textbook costs are significant.  Tablets at the moment are not terribly cheap, but look at phones – things get quite affordable as time goes on.”</p>
<p>It can be difficult to visualize this takeover, but consider the video game industry and its shift to mobile titles.  Kids are now having their first interactions with games on mobile devices, something that gamers of earlier generations simply can’t identify with.  When kids have their first classroom interactions on tablet touch screens, a similar shift could occur.</p>
<p>“My suspicion is that we’ll see it happen in pockets first, but at the same time we’ll start to see tablet integration take over,” says Schell.  “We’ve got a generation of kids now with tablets and touch being their first modes of interaction, expecting to come into the classroom and touch screens.”</p>
<p>And what about concerns that games may simply not be seen as an educational tool? Schell shrugs off the possibility.</p>
<p>“People see the power that games hold. They see the focus. They see the engagement. You hear parents say ‘I wish they were as excited about Algebra as they are about Call of Duty.’  The key is going to come down to data.<strong><em>  It’s going to be very difficult to argue with data and results.</em></strong>”</p>
<p>The classroom of the future is a connected one, with the teacher able to zero in and command the flow of information and learning.  With all of the talk about big data and analytics, these tools could be utilized in the new classroom with significant impact.</p>
<p>“It gives the teacher so much data.  It’s incredible for both students that are behind and ahead.  This change has already started to happen.  Teachers see the power of games to engage students.  It’s about what happens when the students and the teacher are all using the same technology and it’s all connected,” says Schell.</p>
<p>“This is a fundamental change in the experience.  The teacher is almost in the role of a Dungeon Master, giving out a scenario that everyone is working on, monitoring status, changing the challenge depending on situations, and moving things front and center to the board when something key happens.”</p>
<p>Written by: <a href="http://www.forbes.com/sites/danieltack/">Daniel Tack</a>, Contributor, <a href="http://www.forbes.com/sites/danieltack/2013/09/12/serious-games-and-the-future-of-education/">Forbes</a><br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2013/09/serious-games-future-education/">Serious Games and the Future of Education</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2013/09/serious-games-future-education/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">5325</post-id>	</item>
		<item>
		<title>New Media Capture and Delivery System Gives Users Immersive “Experiences”</title>
		<link>https://www.situatedresearch.com/2013/04/new-media-capture-and-delivery-system-gives-users-immersive-experiences/</link>
					<comments>https://www.situatedresearch.com/2013/04/new-media-capture-and-delivery-system-gives-users-immersive-experiences/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Fri, 12 Apr 2013 16:02:17 +0000</pubDate>
				<category><![CDATA[Education]]></category>
		<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Aesthetics]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[heads-up-display]]></category>
		<category><![CDATA[Learning]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=5082</guid>

					<description><![CDATA[<p>Experience Media Studios today announced the worldwide launch of its patent-pending 3DPOV® system, a pioneering new solution for capturing, delivering, and experiencing immersive media. Experience Media Studios’ 3DPOV® system enables the capture of a three-dimensional visual and auditory experience from the first-person perspective. 3DPOV® media delivers a higher level of sensory engagement than virtual reality that replicates a true-to-life binocular&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2013/04/new-media-capture-and-delivery-system-gives-users-immersive-experiences/">New Media Capture and Delivery System Gives Users Immersive “Experiences”</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><a href="http://www.experiencemediastudios.com/">Experience Media Studios</a> today announced the worldwide launch of its patent-pending <a href="http://www.3dpov.com/">3DPOV</a><sup>®</sup> system, a pioneering new solution for capturing, delivering, and experiencing immersive media.</p>
<p>Experience Media Studios’ 3DPOV<sup>®</sup> system enables the capture of a three-dimensional visual and auditory experience from the first-person perspective. 3DPOV<sup>®</sup> media delivers a higher level of sensory engagement than virtual reality, replicating a true-to-life binocular and peripheral visual field and a stereophonic auditory experience. <span id="more-5082"></span>The system also captures GPS coordinates and altitude information to further augment reality.</p>
<p><iframe loading="lazy" src="http://www.youtube.com/embed/VkWFjDOkU4M" height="360" width="640" allowfullscreen="" frameborder="0"></iframe></p>
<p>“Modern audiences demand more of their media experiences,” said <a href="http://en.wikipedia.org/wiki/Michael-Ryan_Fletchall">Michael-Ryan Fletchall</a>, CEO of Experience Media Studios. “With more control over how and when they consume media, audiences want new and individualized experiences offering deeper levels of engagement. 3DPOV delivers an experience that goes far beyond just watching.”</p>
<p>Immersive media quickly absorbs the viewer into the experience, with implications for critical skills training, simulations, and experiential learning environments. Experience Media Studios formally launched 3DPOV<sup>®</sup> in conjunction with the Military and Government Summit at the <a href="http://www.nabshow.com/">National Association of Broadcasters (NAB) Show</a>. Today’s announcement underscores the value of 3DPOV<sup>®</sup> in these key segments, where details not available in virtual reality are integrated to assess and teach armed forces critical life-saving, decision-under-pressure skills: rapid processing and reaction according to policies and protocols.</p>
<p>“In developing this media for military and government blended learning simulations, we immediately recognized the opportunity to apply the technology to our wheelhouse of entertainment and advertising,” said Fletchall.</p>
<p>Experience Media Studios is currently in pre-production with <a href="http://www.3dpov.com/possessedsoul"><i>Possessed Soul</i></a>, its upcoming feature length horror “experience” shot entirely using 3DPOV<sup>®</sup>technology. The project is partially financed through pre-sales to the horror-genre fan community using the <a href="http://www.igg.me/at/possessedsoul">Indiegogo</a> crowdfunding platform. In 2012, Experience Media Studios released the Josh Hutcherson drama, <a href="http://www.facebook.com/TheForgerMovie"><i>The Forger</i></a>.</p>
<p>The 3DPOV<sup>®</sup> system also features a cloud-based digital delivery platform, connecting affiliated media production companies with 3DPOV<sup>®</sup> technology to build a high quality digital asset inventory for worldwide distribution to private and public end users via 2D and 3D televisions, personal computers, and mobile devices.</p>
<p>“Our goal was to build a complete front-to-backend solution for creating and directly distributing unique 3DPOV<sup>®</sup> content,” said Fletchall. “We have an exclusive content platform for creating, cataloging, managing and distributing experience-driven 3DPOV<sup>®</sup> assets through an industry-leading pipeline with a user-friendly interface.”</p>
<p>Experience Media Studios will roll out the consumer subscription service component of <a href="http://www.3dpov.com/">3DPOV.com</a> with limited content later in 2013.</p>
<p>Written by: <a href="http://www.prnewswire.com/news-releases-test/new-media-capture-and-delivery-system-gives-users-immersive-experiences-202152391.html">PR Newswire</a> (via <a href="http://ispr.info/2013/04/10/new-media-capturedelivery-system-3dpov-gives-users-immersive-experiences/">Presence</a>); more images available at the <a href="http://experiencemediastudios.com/3dpov/">Experience Media Studios website</a><br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2013/04/new-media-capture-and-delivery-system-gives-users-immersive-experiences/">New Media Capture and Delivery System Gives Users Immersive “Experiences”</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2013/04/new-media-capture-and-delivery-system-gives-users-immersive-experiences/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">5082</post-id>	</item>
	</channel>
</rss>
