<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Artificial Intelligence Archives - Situated Research</title>
	<atom:link href="https://www.situatedresearch.com/tag/artificial-intelligence/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.situatedresearch.com/tag/artificial-intelligence/</link>
	<description>Usability Research and User Experience Testing</description>
	<lastBuildDate>Mon, 22 Aug 2022 14:13:47 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	

<image>
	<url>https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2021/03/cropped-icon.png?fit=32%2C32&#038;ssl=1</url>
	<title>Artificial Intelligence Archives - Situated Research</title>
	<link>https://www.situatedresearch.com/tag/artificial-intelligence/</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">122538981</site>	<item>
		<title>Business Strategies UX Designers Should Know</title>
		<link>https://www.situatedresearch.com/2022/08/business-strategies-ux-designers-should-know/</link>
					<comments>https://www.situatedresearch.com/2022/08/business-strategies-ux-designers-should-know/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Wed, 17 Aug 2022 15:25:08 +0000</pubDate>
				<category><![CDATA[Development]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Web Design]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Learning]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[Web Development]]></category>
		<guid isPermaLink="false">https://www.situatedresearch.com/?p=10462</guid>

					<description><![CDATA[<p>In the field of design, you need not only technical skills but also business prowess to succeed. As the number of people using the Internet grows, so does the value of UX designers. Many companies are seeking UX designers to help them gain a competitive edge in their markets. Being business savvy as a UX designer&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2022/08/business-strategies-ux-designers-should-know/">Business Strategies UX Designers Should Know</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>In the field of design, you need not only technical skills but also business prowess to succeed. As the number of people using the Internet grows, so does the value of UX designers. Many companies are seeking UX designers to help them gain a competitive edge in their markets. <span id="more-10462"></span></p>
<p>Being business savvy as a UX designer will help you land job opportunities more easily. Below is a curated list of <a href="https://www.situatedresearch.com/2019/10/user-experience-is-now-your-business-strategy/">business strategies UX designers</a> should know to help them navigate a competitive market.</p>
<h2>Leverage Artificial Intelligence</h2>
<p>Artificial intelligence cuts across multiple industries. One of the best ways to leverage its power is to use it to increase sales, improve productivity, and gain a competitive edge. You can attend a <a href="https://careerkarma.com/rankings/best-web-design-bootcamps/">coding bootcamp</a> like <a href="https://careerkarma.com/schools/general-assembly/">General Assembly</a> or <a href="https://careerkarma.com/schools/thinkful/">Thinkful</a> to build these skills, or you can hire AI experts to help you achieve your goals.</p>
<h2>Conduct Competitive Analysis</h2>
<p>It’s paramount to conduct in-depth research on your competitors before entering the market. This will help you establish your niche, as you will more easily identify gaps in the industry. Through research, you may also identify ways to improve on your competitors’ services or products, and you can evaluate their successes and failures to find new ways to operate.</p>
<p>The world of design is ever-changing. There is always a need <a href="https://www.situatedresearch.com/2017/01/tips-improving-websites-user-experience-part-1/">to improve user experience</a> in the market. To be a successful UX designer, you must invest ample time in finding out about competitors before delving into the market.</p>
<h2>Curate Templates</h2>
<p>The best way to get ahead in the market is to make your work effective and efficient. To produce quality work quickly, rely on custom templates. Instead of starting from scratch each time a new client approaches, you can build on these set structures. Creating your own templates or design systems will help you manage time, labor, and quality.</p>
<h2>Pricing Strategy</h2>
<p>As a professional, you may keep your prices relatively low to attract customers, or price above the going rate and serve a smaller, premium clientele. It&#8217;s challenging to put a price on creativity, but you can draw a lot of information from other businesses and competitors.</p>
<p>Some businesses and UX design experts started by under-quoting their services and slowly worked their way up. This does not have to be your strategy. Put a proper business plan in place so you can meet your long-term goals.</p>
<h2>Review Your Performance</h2>
<p>You need a structure in place to ensure you are improving your craft, and it&#8217;s important that your design ideations positively impact a company’s profit. You can gauge this with data on the company’s website customer experience: check the number of customers visiting and the amount of time they spend on the site.</p>
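As an illustrative sketch (not from the article), the two engagement metrics just mentioned, visit counts and time on site, could be summarized from hypothetical session records like this:

```python
# Illustrative sketch: summarizing two simple site-engagement metrics
# (visit count and average time on site) from hypothetical session
# records. The data shape and function name are invented for the example.
from statistics import mean

def engagement_summary(sessions):
    """Each session is a (visitor_id, seconds_on_site) pair."""
    visitors = {visitor for visitor, _ in sessions}
    avg_seconds = mean(seconds for _, seconds in sessions) if sessions else 0
    return {
        "visits": len(sessions),
        "unique_visitors": len(visitors),
        "avg_time_on_site_s": round(avg_seconds, 1),
    }

sessions = [("a", 120), ("b", 45), ("a", 300), ("c", 90)]
summary = engagement_summary(sessions)
print(summary)
```

In practice these numbers would come from an analytics platform rather than raw tuples; the point is simply that both metrics fall out of the same session data.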
<h2>Expand Your Services</h2>
<p>In addition to offering UX design services, you can add web development or SEO services to your portfolio. Most companies prefer a single umbrella company delivering all of their website solutions, so adding other services will help you stand out in the market.</p>
<h2>Improve Customer Service</h2>
<p>It&#8217;s always essential to review your clients’ level of satisfaction. This will help you improve the services you deliver. Many professionals and businesses have built their reputations on customer service. If you are an established company, outsourcing customer service to a dedicated team might be the best option.</p>
<h2>Build Customer Loyalty</h2>
<p>One of the cornerstones of success is retaining existing clients. It is much easier to maintain a relationship with an existing client than to work on acquiring a new one. Ensure the loyalty of your clients by rewarding them. You can create a customer loyalty program, improve customer service, or prioritize feedback. Most companies offer their clients discounts on certain services or during specific months.</p>
<h2>Conclusion</h2>
<p>The importance of keeping your skills up-to-date as a UX designer cannot be overstated. The world of design is always evolving, with new technology, methods, and approaches to improve user experience. Failing to develop a business strategy that keeps you informed can easily result in losses.</p>
<p>According to the US Bureau of Labor Statistics, the field of web design and development has a <a href="https://www.bls.gov/ooh/computer-and-information-technology/web-developers.htm#tab-6">job growth rate of 13 percent</a>, which shows promising growth. Fostering the best practices to succeed can help you thrive, whether you are working as a freelancer, in a startup, or an established company.</p>
<p><strong>About the author:</strong> <em><a href="https://streaklinks.com/BKzrrPn19SRR7_0GpwXgx-xA/https%3A%2F%2Fcareerkarma.com%2Fblog%2Fauthor%2Fdaisy-wambua%2F">Daisy Waithereo Wambua</a> is a seasoned writer with a decade of experience in writing, proofreading, and editing. She has spoken at Maseno University to help young women explore new careers and learn more about technology. She has a Bachelor&#8217;s Degree in Communications and Public Relations, a Certificate in Web Development, and a Master&#8217;s Degree in International Studies.</em></p>
<p><strong>Posted by:</strong> <a href="https://www.situatedresearch.com/"><em>Situated Research</em></a></p>
<p>The post <a href="https://www.situatedresearch.com/2022/08/business-strategies-ux-designers-should-know/">Business Strategies UX Designers Should Know</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2022/08/business-strategies-ux-designers-should-know/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">10462</post-id>	</item>
		<item>
		<title>This Video Game Knows When You’re Scared–And Gets Scarier</title>
		<link>https://www.situatedresearch.com/2014/02/video-game-knows-youre-scared-gets-scarier/</link>
					<comments>https://www.situatedresearch.com/2014/02/video-game-knows-youre-scared-gets-scarier/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Tue, 18 Feb 2014 16:45:47 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Affect / Emotion]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Learning]]></category>
		<category><![CDATA[Personalization]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User-Centered Design]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=5650</guid>

					<description><![CDATA[<p>The director behind the innovative video game Nevermind tells us why biofeedback is the new frontier in gaming. In the future, horror games will know when you’re scared. And then they’ll get scarier. Proof: the currently-in-development horror-adventure game Nevermind, which just launched a Kickstarter campaign last week. The game pairs classic first-person exploration with biofeedback&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2014/02/video-game-knows-youre-scared-gets-scarier/">This Video Game Knows When You’re Scared–And Gets Scarier</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>The director behind the innovative video game <i>Nevermind</i> tells us why biofeedback is the new frontier in gaming.</p>
<p>In the future, horror games will know when you’re scared. And then they’ll get scarier.</p>
<p>Proof: the currently-in-development horror-adventure game <i>Nevermind</i>, which just launched a <a href="https://www.kickstarter.com/projects/reynoldsphobia/nevermind-a-biofeedback-horror-adventure-game">Kickstarter campaign</a> last week. The game pairs classic first-person exploration with biofeedback data from a heart rate monitor in order to tell when you’re scared and <a href="http://www.fastcocreate.com/3022308/this-horrifying-video-game-knows-when-youre-afraid">turn up the horror</a>.<span id="more-5650"></span></p>
<p>“In <i>Nevermind</i>, you get scared, you get stressed, and the world will punish you for giving in to those feelings,” says creative director <a href="http://www.fastcompany.com/person/erin-reynolds" target="_blank">Erin Reynolds</a>, “But it rewards you for calming down by becoming easier.”</p>
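As a hypothetical illustration (none of these names, thresholds, or numbers come from <i>Nevermind</i> itself), the punish-stress, reward-calm loop Reynolds describes might look like:

```python
# Illustrative sketch of a biofeedback difficulty loop like the one
# Reynolds describes: stress (elevated heart rate) raises the horror
# level, staying calm lowers it. All names and thresholds are invented.

def adjust_horror(level, heart_rate_bpm, resting_bpm=70,
                  step=1, min_level=0, max_level=10):
    """Nudge the horror level up when the player is stressed,
    down when the player stays near their resting heart rate."""
    if heart_rate_bpm > resting_bpm * 1.2:       # clearly stressed
        level += step                            # punish: get scarier
    elif heart_rate_bpm < resting_bpm * 1.05:    # calm
        level -= step                            # reward: ease off
    return max(min_level, min(level, max_level))

level = 5
for bpm in [95, 98, 72, 68]:                     # simulated monitor readings
    level = adjust_horror(level, bpm)
print(level)
```

A real implementation would smooth the heart-rate signal and calibrate the resting baseline per player, but the core mechanic is just this feedback rule.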
<p><iframe src="//player.vimeo.com/video/85923375?title=0&amp;byline=0&amp;portrait=0" width="640" height="360" frameborder="0" allowfullscreen="allowfullscreen"></iframe></p>
<p>While biofeedback seems like a perfect fit for the horror genre, Reynolds believes that the technology is key to moving the video game medium forward as a whole, allowing for an entirely new level of immersion.</p>
<p>“I think it really speaks to the potential of games being able to know more about you than you know about yourself, and having this intimate response to your internal reactions,” Reynolds says.</p>
<p>That internal response surprised her during playtesting, as it illuminated “just how personal one’s sense of horror is. It made for some design challenges, because it means you need to have something for everything so that everyone’s buttons get pushed.”</p>
<p>But those challenges also served as the ultimate affirmation for Reynolds: She was scaring people.</p>
<p>That’s a good indication that <i>Nevermind</i> may be a successful game and not just a neat tech demo. Reynolds has ambitious goals for the game and hopes that it will move the medium forward as a proof of concept in both <a href="http://www.gamasutra.com/blogs/ErinReynolds/20131029/203265/Quit_Playing_Games_with_My_Heart_Biofeedback_in_Gaming.php">biofeedback integration</a> and as an example of a positive game that reinforces stress management skills that have real-world applications.</p>
<p>Because achieving those goals with a video game is all for naught if the game is not fun, states game developer Lat Ware in a feature on <a href="http://www.gamasutra.com/view/news/203252/Biofeedback_and_video_games_What_does_the_future_have_in_store.php">Gamasutra</a>:</p>
<blockquote><p>“The best practice in making biofeedback games is also the best practice for game development in general: Make it fun,” he adds. “Fun is the only thing that matters in a game. Fun is what makes people love your game. Fun is what makes people come back to play again. Fun is what makes people buy your next game without asking questions.”</p></blockquote>
<p>“That’s why I’m really excited about <i>Nevermind</i>,” says Reynolds. “It creates this experience that is fun but can also empower the player.”</p>
<p>Written by: <a href="http://www.fastcolabs.com/user/joshua-rivera">Joshua Rivera</a>, Fast Company’s <a href="http://www.fastcolabs.com/3026458/this-video-game-knows-when-youre-scared-and-gets-scarier">Co.LABS</a> (via <a href="http://ispr.info/2014/02/17/new-level-of-immersion-video-game-knows-when-youre-scared-and-gets-scarier/">Presence</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2014/02/video-game-knows-youre-scared-gets-scarier/">This Video Game Knows When You’re Scared–And Gets Scarier</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2014/02/video-game-knows-youre-scared-gets-scarier/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">5650</post-id>	</item>
		<item>
		<title>IBM Forecasts Major Advances in Cognitive Computing</title>
		<link>https://www.situatedresearch.com/2013/12/ibm-forecasts-major-advances-cognitive-computing/</link>
					<comments>https://www.situatedresearch.com/2013/12/ibm-forecasts-major-advances-cognitive-computing/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Fri, 27 Dec 2013 16:59:58 +0000</pubDate>
				<category><![CDATA[Collaboration]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[Games for Learning]]></category>
		<category><![CDATA[Usability Research]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=5532</guid>

					<description><![CDATA[<p>IBM on Tuesday released its annual &#8220;5 in 5&#8221; list of predictions about technological innovations that will change the way we live in the next five years, with the theme this year being cognitive advances in computing that help machines &#8220;learn&#8221; how to better serve us.  Last year&#8217;s 5 in 5 list also focused on&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2013/12/ibm-forecasts-major-advances-cognitive-computing/">IBM Forecasts Major Advances in Cognitive Computing</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>IBM on Tuesday released its annual &#8220;5 in 5&#8221; list of predictions about technological innovations that will change the way we live in the next five years, with the theme this year being cognitive advances in computing that help machines &#8220;learn&#8221; how to better serve us. <span id="more-5532"></span></p>
<p>Last year&#8217;s 5 in 5 list also focused on the <a href="http://www.pcmag.com/article2/0,2817,2413300,00.asp" data-ls-seen="1">rise of cognition in computing</a> and how the five senses humans use to gain information about and manipulate the physical world are being emulated by computing systems like IBM&#8217;s own Watson artificial intelligence framework.</p>
<p>For this year&#8217;s edition, IBM got a little more specific about the ways that such advances in machine learning will affect us, touching more on data analytics and offering up the following predictions:</p>
<p><b>The classroom will learn you.</b> Kerrie Holley of IBM described this as a concept &#8220;built on a lot of the technologies you see with how the Khan Academy works, cloud-based computing, and the like.&#8221; In the years to come, new learning technologies will use advanced analytics of &#8220;longitudinal student records&#8221; to help teachers better assess what individual students need, which ones are at risk, and how to help them in their education, he said.</p>
<p><iframe loading="lazy" frameborder="0" allowfullscreen src="//www.youtube.com/embed/hTA5GyWamR0" width="650" height="390"></iframe></p>
<p><b>Buying local will beat online.</b> Less about a specific tech advance, this prediction is based on the idea that the &#8220;tables will turn&#8221; in terms of access to the kind of technology, cloud services, and analytics that can help &#8220;mom and pop&#8221; businesses compete more readily with big national and global retailers, Holley said. &#8220;Technology costs are dropping and as they do, proximity will allow local retailers to create experiences the big retailers are not able to do online.&#8221;</p>
<p><iframe loading="lazy" frameborder="0" allowfullscreen src="//www.youtube.com/embed/yKNSOwLcrkE" width="650" height="390"></iframe></p>
<p><b>Doctors will use your DNA to keep you well.</b> IBM presented this prediction as one involving more advanced computational work than some of the others in its 5-in-5 list. &#8220;Cognitive-based systems like Watson, along with breakthroughs in genomic research, will enable doctors to be better able to diagnose cancer and offer better treatments,&#8221; Holley said.</p>
<p><iframe loading="lazy" frameborder="0" allowfullscreen src="//www.youtube.com/embed/0M1DMdc1mQ0" width="650" height="390"></iframe></p>
<p><b>The city will help you live in it.</b> In just a few decades, as many as seven out of 10 people around the world will live in cities, according to some projections. We&#8217;re already seeing more computational resources being dedicated to helping those city dwellers manage their urban lives and that will only accelerate, according to IBM.</p>
<p><iframe loading="lazy" frameborder="0" allowfullscreen src="//www.youtube.com/embed/tVGviMIMjN0" width="650" height="390"></iframe></p>
<p><b>A digital guardian will protect you online.</b> Holley explained this prediction as an expansion on financial fraud protection services offered by banks and credit card companies, only much more personally tailored to individuals to safeguard their entire digital lives.</p>
<p><iframe loading="lazy" frameborder="0" allowfullscreen src="//www.youtube.com/embed/al8ng82nRss" width="650" height="390"></iframe></p>
<p>&#8220;This year&#8217;s IBM 5 in 5 explores the idea that everything will learn—driven by a new era of cognitive systems where machines will learn, reason and engage with us in a more natural and personalized way. These innovations are beginning to emerge enabled by cloud computing, big data analytics, and learning technologies all coming together,&#8221; the research team behind the company&#8217;s annual list of predictions said in a statement.</p>
<p>&#8220;Over time these computers will get smarter and more customized through interactions with data, devices, and people, helping us take on what may have been seen as unsolvable problems by using all the information that surrounds us and bringing the right insight or suggestion to our fingertips right when it&#8217;s most needed. A new era in computing will lead to breakthroughs that will amplify human abilities, assist us in making good choices, look out for us, and help us navigate our world in powerful new ways.&#8221;</p>
<p>Written by: <a href="http://www.pcmag.com/author-bio/damon-poeter">Damon Poeter</a>, <a href="http://www.pcmag.com/article2/0,2817,2428432,00.asp">PC Mag</a><br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2013/12/ibm-forecasts-major-advances-cognitive-computing/">IBM Forecasts Major Advances in Cognitive Computing</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2013/12/ibm-forecasts-major-advances-cognitive-computing/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">5532</post-id>	</item>
		<item>
		<title>How Google is Melding Our Real and Virtual Worlds with Games, Apps … and Glass</title>
		<link>https://www.situatedresearch.com/2013/05/how-google-is-melding-our-real-and-virtual-worlds-with-games-apps-and-glass/</link>
					<comments>https://www.situatedresearch.com/2013/05/how-google-is-melding-our-real-and-virtual-worlds-with-games-apps-and-glass/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Mon, 20 May 2013 19:25:11 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Augmented Reality]]></category>
		<category><![CDATA[Communication]]></category>
		<category><![CDATA[Communities]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Games for Learning]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Mobile Devices]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[User-Centered Design]]></category>
		<category><![CDATA[Virtual Reality]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/?p=5126</guid>

					<description><![CDATA[<p>“The world around you is not what it seems,” says Ingress, the virtual game that uses the real world as its gamespace. And, perhaps, when Google’s semi-independent division Niantic Labs is finished with its mission, we humans won’t be, either. Google’s mission is to organize the world’s information and make it universally accessible and usable. Note&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2013/05/how-google-is-melding-our-real-and-virtual-worlds-with-games-apps-and-glass/">How Google is Melding Our Real and Virtual Worlds with Games, Apps … and Glass</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>“The world around you is not what it seems,” says <a href="http://www.ingress.com/">Ingress</a>, the virtual game that uses the real world as its gamespace. And, perhaps, when Google’s semi-independent division Niantic Labs is finished with its mission, we humans won’t be, either.</p>
<p>Google’s mission is to organize the world’s information and make it universally accessible and usable. Note carefully that Google says nothing about the Internet in that statement. <span id="more-5126"></span></p>
<p style="text-align: center;"><img data-recalc-dims="1" loading="lazy" decoding="async" class="size-full wp-image-5128 aligncenter" alt="Ingress" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/05/Ingress1.jpg?resize=558%2C353&#038;ssl=1" width="558" height="353" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/05/Ingress1.jpg?w=558&amp;ssl=1 558w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/05/Ingress1.jpg?resize=300%2C189&amp;ssl=1 300w" sizes="auto, (max-width: 558px) 100vw, 558px" /></p>
<p>In the last few eye-blinks of human history, we’ve created virtual worlds: cyberspace, virtual reality, the World Wide Web … places that exist in our devices, on our computers, in our servers, on the internet, and in our heads. But there’s also a space in which we live and walk and eat and breathe. Realspace. Meatspace. IRL. The real world, so we say, that we can touch and taste and smell.</p>
<p>Google’s trying to bring those worlds together, partly through the work of Niantic Labs.</p>
<p>Augmented reality is nothing new, of course, with marketing-focused companies like Layar building connections between physical and virtual reality and Ikea’s <a href="http://venturebeat.com/2012/12/23/augmented-reality/">most-downloaded branded app of 2012</a> doing similar things. Other startups have explored AR capabilities as well, such as Caterina Fake’s <a href="https://findery.com/">Findery</a>, which invites people to leave geo-tied notes that others can discover and read.</p>
<p style="text-align: center;"><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter  wp-image-5129" alt="screen-shot-2013-04-29-at-6-49-41-am" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/05/screen-shot-2013-04-29-at-6-49-41-am.png?resize=590%2C346&#038;ssl=1" width="590" height="346" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/05/screen-shot-2013-04-29-at-6-49-41-am.png?w=737&amp;ssl=1 737w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/05/screen-shot-2013-04-29-at-6-49-41-am.png?resize=300%2C176&amp;ssl=1 300w" sizes="auto, (max-width: 590px) 100vw, 590px" /></p>
<p>But it’s different when a company with Google’s resources tackles the problem. In Google Glass, it has a tool that seems destined for significant developer (and probably user) penetration, one that can interconnect the real and the virtual more efficiently than perhaps any previous product. Put those together and you’ve got something interesting. And potentially huge.</p>
<p>So a couple of weeks ago, I chatted with the man who’s leading that effort.</p>
<h3>John Hanke: the missionary of mapping</h3>
<p>John Hanke is vice president of product for Niantic Labs, the year-old Google-but-not-Google division of just a few dozen engineers that brought us <a href="http://venturebeat.com/2012/09/27/googles-new-field-trip-virtually-augmenting-the-awesomeness-of-reality/">Field Trip, the app to explore the world around us with a virtual docent</a>. And, of course, the virtual/real game Ingress.</p>
<figure id="attachment_5130" aria-describedby="caption-attachment-5130" style="width: 359px" class="wp-caption alignright"><img data-recalc-dims="1" loading="lazy" decoding="async" class="size-full wp-image-5130" alt="John Hanke" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/05/12926c4.jpg?resize=359%2C359&#038;ssl=1" width="359" height="359" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/05/12926c4.jpg?w=359&amp;ssl=1 359w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/05/12926c4.jpg?resize=150%2C150&amp;ssl=1 150w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/05/12926c4.jpg?resize=300%2C300&amp;ssl=1 300w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/05/12926c4.jpg?resize=90%2C90&amp;ssl=1 90w" sizes="auto, (max-width: 359px) 100vw, 359px" /><figcaption id="caption-attachment-5130" class="wp-caption-text"><em>John Hanke</em></figcaption></figure>
<p>Before Niantic, Hanke ran Google Maps, Google Earth, and other geo products, and before Google, he was the cofounder and CEO of Keyhole, the innovative geo-mapping and visualization company. Google bought Keyhole in 2004, bringing Hanke into the search engine’s fold to lead its maps, earth, street view, and local divisions.</p>
<p>Now, he told me, rather than let him leave to scratch his entrepreneurial itch yet again and do another startup, Google gave him a semi-autonomous group to, as his LinkedIn profile suggests, experiment at the “intersection of mobility, real world, and the Internet.”</p>
<p>“We set up Niantic as a group that could explore new types of mobile apps with ubiquitous always-on features,” Hanke said. “And we’re set up to act like a start-up.”</p>
<h3>Virtual + physical = field trip</h3>
<p>Field Trip was one of Niantic’s first creations, and while on the surface it’s an app that helps you find cool stuff, ultimately it’s a tool to merge metadata and data and then present them together. While you’re in the physical world, Field Trip pulls data about that experience from digital sources, feeding you that information, and changing — deepening, enriching — your experience of place. Layering with history, perhaps, or science, or culture.</p>
<p>Because, after all, one rock is very much like another rock, but if this is the precise rock where Geronimo attacked Mexican soldiers armed with only a knife and his courage, that changes our experience of this particular place. And the merging/melding/layering of virtual and physical makes it more real, in a sense — hyperreal.</p>
<figure id="attachment_5131" aria-describedby="caption-attachment-5131" style="width: 245px" class="wp-caption alignright"><img data-recalc-dims="1" loading="lazy" decoding="async" class="size-full wp-image-5131" alt="Google’s Field Trip app helps you explore “reality.”" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/05/ft-screenshot-5.png?resize=245%2C435&#038;ssl=1" width="245" height="435" /><figcaption id="caption-attachment-5131" class="wp-caption-text"><em>Google’s Field Trip app helps you explore “reality.”</em></figcaption></figure>
<p>Enabling that, of course, requires extensive virtual enhancement of the what-you-see-is-what-you-get world.</p>
<p>“One of the things that we’re trying to evangelize is the concept of geo-tagging everything,” Hanke told me. “I would have expected eight years ago that it would be ubiquitous now, but it’s still not. But I think we’ll get there.”</p>
<p>Geotagging everything digital is a key intersection point between virtual and real. If this blog post is written <i>here</i>, and not <i>there</i>, that adds flavor and nuance to the information. And if a particular historical fact is geotagged to a specific mapped location, that adds depth and dimension to our experience of that place.</p>
<p>“We’re applying some of the same techniques we currently use in standard web search, and the same kind of discipline, to pull really interesting, really good places up from everything else,” Hanke says. “The model is that you’re walking through an unfamiliar neighborhood, but with a friend who is telling you the best things around you. You enjoy it just like before, but you’re a little more informed.”</p>
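The kind of location-based retrieval Hanke describes, surfacing the best geotagged items around you as you walk, can be sketched with a simple distance filter. This is an illustrative sketch only: the point-of-interest data, the `nearby` helper, and the radius are all hypothetical, and Google's real ranking blends many search-quality signals beyond distance.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius ~6371 km

def nearby(items, user_lat, user_lon, radius_km=1.0):
    """Return geotagged items within radius_km of the user, closest first."""
    hits = []
    for item in items:
        d = haversine_km(user_lat, user_lon, item["lat"], item["lon"])
        if d <= radius_km:
            hits.append((d, item))
    return [item for d, item in sorted(hits, key=lambda pair: pair[0])]

# Hypothetical geotagged points of interest:
pois = [
    {"title": "Geronimo historical marker", "lat": 31.430, "lon": -109.880},
    {"title": "Old mining camp", "lat": 31.450, "lon": -109.900},
]

# As the user walks, re-query with the current location; closest first.
for poi in nearby(pois, 31.432, -109.885, radius_km=5.0):
    print(poi["title"])
```

A production system would index items spatially (e.g. a geohash or quadtree) rather than scanning every item, but the filter-and-rank shape is the same.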
<h3>AR + MMO + IRL</h3>
<p>Depth and dimension are definitely core components of Ingress, another Niantic Labs app/experiment/game. Ingress is a — take a deep breath — augmented reality massively multiplayer online video game.</p>
<p>The real world is real, but it’s fought over virtually by two shadowy groups: the Enlightened and the Resistance. Niantic has filled the Earth with virtual portals, usually coincident with actual physical landmarks or monuments, that players need to capture in order to gain territory. Capture territory with large numbers of people (aka “mind units”) and your faction gets more powerful.</p>
<p>Clearly, the massive integration of Google mapping technology with a sophisticated gaming engine is required. And the result is another intersection between the real and the virtual.</p>
<p>“Ingress is a massively multiplayer online game designed for mobile, with real location-based connections,” Hanke told me.</p>
<p>You play with everyone in your faction, and you might meet up with other players in real life, or you may just know them virtually as team members in another area. Along the way, Google learns an awful lot about how you use your mobile devices, about mapping physical locations, and about overlaying cyberspace on meatspace.</p>
<figure id="attachment_5132" aria-describedby="caption-attachment-5132" style="width: 633px" class="wp-caption aligncenter"><img data-recalc-dims="1" loading="lazy" decoding="async" class=" wp-image-5132 " alt="Ingress’ field of play is the world, layered with virtual data." src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/05/screen-shot-2013-04-29-at-7-05-27-am.png?resize=633%2C383&#038;ssl=1" width="633" height="383" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/05/screen-shot-2013-04-29-at-7-05-27-am.png?w=703&amp;ssl=1 703w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/05/screen-shot-2013-04-29-at-7-05-27-am.png?resize=300%2C181&amp;ssl=1 300w" sizes="auto, (max-width: 633px) 100vw, 633px" /><figcaption id="caption-attachment-5132" class="wp-caption-text"><em>Ingress’ field of play is the world, layered with virtual data.</em></figcaption></figure>
<p>All of that knowledge is going to come in very handy with Google Glass.</p>
<h3>Endgame: Google Glass?</h3>
<p>Hanke is cautious when speaking about Google Glass, as is the PR handler who is copiloting our conversation. Even already-public information is treated as a question mark as we chat: Google is definitely being Apple-like in the control and distribution of Glass and its future.</p>
<p>But some tantalizing tidbits do come out.</p>
<p>“We definitely kinda had Google Glass in mind when we started work on apps at Niantic,” Hanke says. “We need mobile devices that are less intrusive than the phone is.”</p>
<figure id="attachment_5133" aria-describedby="caption-attachment-5133" style="width: 300px" class="wp-caption alignright"><img data-recalc-dims="1" loading="lazy" decoding="async" class="size-full wp-image-5133" alt="A model demonstrates Google’s new Project Glass technology." src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/05/glass.jpg?resize=300%2C238&#038;ssl=1" width="300" height="238" /><figcaption id="caption-attachment-5133" class="wp-caption-text"><em>A model demonstrates Google’s new Project Glass technology.</em></figcaption></figure>
<p>And we need devices with different input/output modalities, he says. After all, it’s not easy to play Ingress running around holding an expensive and fragile device in front of you like a window ripped from its frame. And yet you need that portal from the physical to the virtual. For instance, while Field Trip is great to open the doors on human context for the world around us, it threatens to detract from our experience of the world by redirecting our eyes from the ultimate big screen of reality to the small screen of our mobile device.</p>
<p>Google Glass, on the other hand, sits unobtrusively on our foreheads, leaving our hands free and providing data as an overlay on top of the physical world rather than an alternative to the physical world. That model of layering, mixing, and intersecting is top-of-mind for Hanke.</p>
<p>“It just can’t be the case that people are walking around heads down tapping on a screen,” he says. “That just can’t be the future of the human race.”</p>
<h3>Cyborg me now</h3>
<p>Which, of course, is exactly what’s at issue: the future of the human race. Or, at least how we ingest, consume, and reconstitute digital data. And analog data. And meld the two into one harmonious whole of knowing.</p>
<p>That’s perhaps a little metaphysical for a small division of Google that focuses on maps and games and apps.</p>
<p>But the web has <a href="http://www.theatlantic.com/magazine/archive/2008/07/is-google-making-us-stupid/306868/">rewired our brains</a> in a decade or so of virtually ubiquitous Internet access, and the smartphone has rewired our behavior in five years, taking us from creatures who look up to see others to beings that look down at any opportunity to see small bits of plastic and glass and metal in our hands.</p>
<p>So is it really too much to expect a transformation that brings us from clear divisions between what is real and what is virtual to an elegant blend of the two?</p>
<p><img data-recalc-dims="1" loading="lazy" decoding="async" class="aligncenter size-full wp-image-5134" alt="screen-shot-2013-04-29-at-7-08-50-am" src="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/05/screen-shot-2013-04-29-at-7-08-50-am.png?resize=558%2C293&#038;ssl=1" width="558" height="293" srcset="https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/05/screen-shot-2013-04-29-at-7-08-50-am.png?w=558&amp;ssl=1 558w, https://i0.wp.com/www.situatedresearch.com/wp-content/uploads/2013/05/screen-shot-2013-04-29-at-7-08-50-am.png?resize=300%2C157&amp;ssl=1 300w" sizes="auto, (max-width: 558px) 100vw, 558px" /></p>
<p>“This is not psychosis or some cognitive break, but an actual takeover of the mind,” Google’s introductory video for the Ingress game says ominously.</p>
<p>Art imitates life, I suppose, and life, in turn, imitates art.</p>
<p>Written by: <a href="http://venturebeat.com/author/johnkoetsier/">John Koetsier</a>, <a href="http://venturebeat.com/2013/05/01/how-google-is-melding-our-real-and-virtual-worlds-with-games-apps-and-glass/">VentureBeat</a> (via <a href="http://ispr.info/2013/05/14/how-google-is-melding-our-real-and-virtual-worlds-with-games-apps-and-glass/">Presence</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2013/05/how-google-is-melding-our-real-and-virtual-worlds-with-games-apps-and-glass/">How Google is Melding Our Real and Virtual Worlds with Games, Apps … and Glass</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2013/05/how-google-is-melding-our-real-and-virtual-worlds-with-games-apps-and-glass/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">5126</post-id>	</item>
		<item>
		<title>Teaching Video Game Characters Natural Body Language</title>
		<link>https://www.situatedresearch.com/2012/08/teaching-video-game-characters-natural-body-language/</link>
					<comments>https://www.situatedresearch.com/2012/08/teaching-video-game-characters-natural-body-language/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Wed, 22 Aug 2012 17:15:48 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Affect / Emotion]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[Communication]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Games for Learning]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Mental Models]]></category>
		<category><![CDATA[Usability Research]]></category>
		<category><![CDATA[User Experience]]></category>
		<category><![CDATA[User Interface]]></category>
		<category><![CDATA[User-Centered Design]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/blog/?p=2674</guid>

					<description><![CDATA[<p>Video game characters with natural responses to human body language Researchers at Goldsmiths, University of London have been using theater performers to design computer software capable of reading and replicating the way in which humans communicate with their bodies. Dr Marco Gillies from the Department of Computing has made virtual characters more believable by enlisting&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2012/08/teaching-video-game-characters-natural-body-language/">Teaching Video Game Characters Natural Body Language</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>Video game characters with natural responses to human body language</strong></p>
<p>Researchers at Goldsmiths, University of London have been using theater performers to design computer software capable of reading and replicating the way in which humans communicate with their bodies.</p>
<p>Dr Marco Gillies from the Department of Computing has made virtual characters more believable by enlisting actors to teach them body movement. The actors interact with members of the public through a screen, and their responses to specific body language are memorized as algorithms by the software. <span id="more-2674"></span></p>
<p>“Two people can take on the roles of the video game character and the player, showing how the character should respond by acting out the movements themselves,” explained Dr Gillies. “The software enables video games characters to move in a more natural way, responding to the player’s own body language rather than mathematical rules.”</p>
<p>Traditionally, the creators of interactive characters are computer programmers, but Dr Gillies and his team put this task in the hands of people with artistic rather than technical knowledge.</p>
<p>“Our hypothesis is that the actors’ artistic understanding of human behavior will bring an individuality, subtlety and nuance to the character that would be difficult to create in hand-authored models,” said Dr Gillies. “These are the kinds of everyday movements that we do unconsciously, which makes them hard to program in the conventional way.”</p>
<p>Dr Gillies and his team set up a case study in which physical theater performer Emanuele Nargi taught the software natural responses to a player’s movement.</p>
<p>One of the players, Goldsmiths student Max Bye, noticed that the virtual character reacted in a human manner: “When I laughed at it, it would walk away disappointed, so that worked very well.”</p>
<p>The research intends to help interactive media represent more nuanced social interaction, broadening its range of application. The new technique optimizes the use of the latest generation of motion detectors, and it is hoped that in future this will lead to games that are more emotionally complex and able to respond to more subtle social nuances of human behavior.</p>
<p><iframe loading="lazy" src="http://www.youtube.com/embed/2nqhSwhsWOs" frameborder="0" width="600" height="338"></iframe></p>
<p>Written by: <a href="http://www.goldsmiths.ac.uk/press-releases/pressrelease.php?releaseID=952" target="_blank">Goldsmiths, University of London</a> (via <a href="http://ispr.info/2012/08/09/teaching-video-game-characters-natural-body-language/">Presence</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2012/08/teaching-video-game-characters-natural-body-language/">Teaching Video Game Characters Natural Body Language</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2012/08/teaching-video-game-characters-natural-body-language/feed/</wfw:commentRss>
			<slash:comments>7</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2674</post-id>	</item>
		<item>
		<title>Creating the Illusion of Emotion or Why You Care About Ones and Zeroes</title>
		<link>https://www.situatedresearch.com/2012/03/creating-the-illusion-of-emotion-or-why-you-care-about-ones-and-zeroes/</link>
					<comments>https://www.situatedresearch.com/2012/03/creating-the-illusion-of-emotion-or-why-you-care-about-ones-and-zeroes/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Mon, 19 Mar 2012 15:30:35 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Aesthetics]]></category>
		<category><![CDATA[Affect / Emotion]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Communication]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Mental Models]]></category>
		<category><![CDATA[User Experience]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/blog/?p=2568</guid>

					<description><![CDATA[<p>As much as you may love video games and the stories they help you tell, it’s impossible to escape the fact that much of your experience is a trick of the mind. The thing that separates video games from other forms of media, the ability to interact with and perhaps shape a virtual world, is&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2012/03/creating-the-illusion-of-emotion-or-why-you-care-about-ones-and-zeroes/">Creating the Illusion of Emotion or Why You Care About Ones and Zeroes</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>As much as you may love video games and the stories they help you tell, it’s impossible to escape the fact that much of your experience is a trick of the mind.</p>
<p>The thing that separates video games from other forms of media, the ability to interact with and perhaps shape a virtual world, is mostly powered by the artificial intelligence of the characters that populate that experience.</p>
<p>But even the best gaming artificial intelligence systems, AI expert David Mark says, are, like 2-year-olds, basically sociopaths. What he means is that they are intrinsically anti-social. Getting past that problem doesn’t mean imbuing a character with personality, it means tricking gamers. <span id="more-2568"></span></p>
<p>“We can’t create psychology in our characters,” Mark told a gathering of game developers at last week’s Game Developers Conference in San Francisco. “They don’t have psychology because they are zeroes and ones.”</p>
<p>Mark, president of AI design consultancy Intrinsic Algorithm, spent about half an hour last week walking game developers through what he called the psychology of artificial intelligence. He used [the] time to give the game-makers tips on how to make gamers feel like they’re in a world populated by real people instead of digital automatons.</p>
<p>The key, he said, is to find a way to get gamers to project their own emotions and psychology onto the game’s characters.</p>
<p>“In the absence of defining information people project what they believe should be there,” he said.</p>
<p>To prove his point, Mark showed the Heider-Simmel demonstration, an animated [1:40 minute] <a href="http://www.youtube.com/watch?feature=player_embedded&amp;v=sZBKer6PMtM">video</a> created by psychologists Fritz Heider and Mary-Ann Simmel in the 1940s to explore the “attribution of causality.”</p>
<p><iframe loading="lazy" src="http://www.youtube-nocookie.com/embed/sZBKer6PMtM?rel=0" width="980" height="600" frameborder="0"></iframe></p>
<p>The short video shows two animated triangles, an animated circle and a box. There was no audio, just the crude line drawings moving around. After showing the video he said that most viewers saw the video as a couple and a bully; or a mother, a child and a bad guy; or a father and a couple. Each viewer created their own, sometimes elaborate back story for the simple drawings.</p>
<p>“It’s really just two triangles, a circle and some lines,” Mark pointed out.</p>
<p>In the absence of information, viewers created their own fiction, their own emotional attachments. But movement and positioning, he added, do help shape context.</p>
<p>“People are emotional,” he said. “They want to engage with emotional characters. They will often engage their own psychology to do that. They will assume causality and infer narrative.”</p>
<p>Mark’s advice to game makers is to work on the subtlety of movement and design to quietly shift and shape engagement among gamers. The average game character is on the screen for seven seconds before they die, he said, but that doesn’t mean they can’t have an impact on the experience.</p>
<p>As with movies, video games and the technology used to create them are growing more sophisticated. It’s no longer necessary to court a gamer’s emotions with over-blown animation.</p>
<p>The ability for a game to deliver powerful stories with new technology was proven earlier in the week when game developer David Cage showed off his company’s latest technology. <a href="http://www.theverge.com/gaming/2012/3/7/2852778/the-makers-of-heavy-rain-show-off-their-stunning-new-technology">The Kara video</a> stunned audience members with both its graphic fidelity and charged message. While the technology was used to demonstrate the ability to capture an actor’s entire performance, from movement to voice, and project it into a game, the animation also shows a level of sophistication that could certainly help with AI-driven performances as well.</p>
<p>Mark says the solution to connecting gamers to characters will lean on that sort of artifice.</p>
<p>“Early game characters were like silent movie actors,” he said. Those actors had to exaggerate their acting to get their point across, but eventually they learned to convey the same feelings with less.</p>
<p>“Maybe it’s time for us to learn how to be more subtle just like the movie actors did,” he said. “Players will feel those emotions before they realize they are there.”</p>
<p>Written by: By <a href="http://www.theverge.com/users/crecente">Brian Crecente</a>, The Verge’s <a href="http://www.theverge.com/gaming/2012/3/12/2864505/creating-the-illusion-of-emotion-or-why-you-care-about-ones-and-zeroes">Vox Games</a> (via <a href="http://ispr.info/2012/03/16/creating-the-illusion-of-emotion-or-why-you-care-about-ones-and-zeroes/">Presence</a>)<br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2012/03/creating-the-illusion-of-emotion-or-why-you-care-about-ones-and-zeroes/">Creating the Illusion of Emotion or Why You Care About Ones and Zeroes</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2012/03/creating-the-illusion-of-emotion-or-why-you-care-about-ones-and-zeroes/feed/</wfw:commentRss>
			<slash:comments>1</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2568</post-id>	</item>
		<item>
		<title>Sony Says Games Will Read Emotions in 10 Years</title>
		<link>https://www.situatedresearch.com/2011/08/sony-says-games-will-read-emotions-in-10-years/</link>
					<comments>https://www.situatedresearch.com/2011/08/sony-says-games-will-read-emotions-in-10-years/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Sun, 28 Aug 2011 21:42:45 +0000</pubDate>
				<category><![CDATA[Gaming]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Serious Games]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Usability]]></category>
		<category><![CDATA[Affect / Emotion]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[Communication]]></category>
		<category><![CDATA[Design]]></category>
		<category><![CDATA[Game Development]]></category>
		<category><![CDATA[Human Factors]]></category>
		<category><![CDATA[Learning]]></category>
		<category><![CDATA[Mental Models]]></category>
		<category><![CDATA[Usability Research]]></category>
		<category><![CDATA[User Experience]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/blog/?p=2281</guid>

					<description><![CDATA[<p>Sony is talking crazy, indicating that games may be able to tell if you’re lying or depressed just ten years down the road. We’ll stick with growing crops, thanks. Seriously, when do games stop being games and cross over into virtual reality? This was the question I asked Nvidia months ago at ECGC 2011, and&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2011/08/sony-says-games-will-read-emotions-in-10-years/">Sony Says Games Will Read Emotions in 10 Years</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>Sony is talking crazy, indicating that games may be able to tell if you’re lying or depressed just ten years down the road. We’ll stick with growing crops, thanks.</strong></p>
<p>Seriously, when do games stop being games and cross over into virtual reality? This was the question I asked Nvidia months ago at ECGC 2011, and I was told there will always be a market for the high-end PC gamer with a rig nearly the size of a bookcase. But putting visual realism aside, what will happen when games suddenly stop acting like games, and become more like a self-aware super AI that could possibly one day sing you happy birthday or annihilate the human race? <span id="more-2281"></span></p>
<p>According to Sony Worldwide Studios chief Shuhei Yoshida, platform holders will be able to offer “almost dangerous kinds of interactivity” with the player within the next ten years. Games will know more about the player as a whole, and know how they could be feeling by reading more than just player movements. Titles will be so “immersive” that players will serve as actors, as true participants within the virtual realm.</p>
<p>“As far as I’m concerned, the motion control of today is like the 8-bit phase of video games,” Yoshida said last week <a href="http://www.develop-online.net/features/1400/PlayStation-The-next-ten-years">at a behind-closed-doors Gamescom panel debate</a>. “There are so many limitations. Talking about sensors, the game will eventually know more about the player. Not just movement, but where you are looking and how you could be feeling. It’s really difficult to judge this, but I’d like to think that in ten years game developers will have access to player information in real-time. We can create some really… almost dangerous kinds of interactivity.”</p>
<p>Mick Hocking, a senior director at Sony Worldwide Studios, chimed in when asked if Sony was currently testing technologies relying on biometric data. Naturally he dodged answering the question directly by stating that Sony does lots of R&amp;D in these areas.</p>
<p>“Having a camera being able to study a player’s biometrics and movements [is possible],” Hocking said. “So perhaps you can play a detective game that decides whether you’re lying due to what it reads from your face.”</p>
<p>“In ten years’ time I’d like to think we’ll be able to form a map of the player, combining other sorts of sensory data together, from facial expressions to heart rate,” he continued. “You can see how, over a period of time, you can form a map of the player and their emotional state, whether they’re sad or happy. Maybe people in their social network can comment on it. The more accurate that map can become, the more we can tailor it to the experience.”</p>
<p>Hocking seems to hope that AI in ten years’ time won’t still feel like “acting,” but will react more naturally, independent of scripts and pre-determined movements. “In Uncharted you can see games are getting closer to lifelike actor performances, but [however] accurate they are becoming as an acting performance, it’s still acting. Will we have AI that allows us to talk to and truly interact with a character? Will we be able to show the character objects it can recognize?”</p>
<p>Do gamers really need that kind of interaction? Again, when do games stop serving as games, and become more like virtual reality experiences? As long as the AI doesn’t start popping off family members in fear of being disconnected from the (home or space station) network, we should be good to go.</p>
<p>Written by: <a href="http://www.tomsguide.com/us/Biometrics-Gamescom-AI-Shuhei-Yoshida-Mick-Hocking,news-12288.html">Kevin Parrish, Tom&#8217;s Guide</a><br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2011/08/sony-says-games-will-read-emotions-in-10-years/">Sony Says Games Will Read Emotions in 10 Years</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2011/08/sony-says-games-will-read-emotions-in-10-years/feed/</wfw:commentRss>
			<slash:comments>12</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2281</post-id>	</item>
		<item>
		<title>For IBM&#8217;s Watson technology, What Happens After &#8220;Jeopardy!&#8221;?</title>
		<link>https://www.situatedresearch.com/2011/02/for-ibms-watson-technology-what-happens-after-jeopardy/</link>
					<comments>https://www.situatedresearch.com/2011/02/for-ibms-watson-technology-what-happens-after-jeopardy/#_comments</comments>
		
		<dc:creator><![CDATA[Matthew Sharritt, Ph.D.]]></dc:creator>
		<pubDate>Mon, 14 Feb 2011 17:52:16 +0000</pubDate>
				<category><![CDATA[Collaboration]]></category>
		<category><![CDATA[HCI]]></category>
		<category><![CDATA[Psychology]]></category>
		<category><![CDATA[Simulations]]></category>
		<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Cognition]]></category>
		<category><![CDATA[Communication]]></category>
		<category><![CDATA[Games for Learning]]></category>
		<guid isPermaLink="false">http://www.situatedresearch.com/blog/?p=1989</guid>

					<description><![CDATA[<p>&#160; IBM&#8217;s Supercomputer Has Implications for Healthcare, Information Tech and More Wouldn&#8217;t it be nice to have your very own supercomputer in your pocket? If your laptop crashed while you were working on a major presentation, you could ask your portable expert to help diagnose the problem. If you wanted to bone up on Middle&#8230;</p>
<p>The post <a href="https://www.situatedresearch.com/2011/02/for-ibms-watson-technology-what-happens-after-jeopardy/">For IBM&#8217;s Watson technology, What Happens After &#8220;Jeopardy!&#8221;?</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>&nbsp;</p>
<p style="text-align: center;"><em>IBM&#8217;s Supercomputer Has Implications for Healthcare, Information Tech and More</em></p>
<p>Wouldn&#8217;t it be nice to have your very own <a href="http://abcnews.go.com/Technology/wireStory?id=11991793">supercomputer</a> in your pocket?</p>
<p>If your <a href="http://abcnews.go.com/Technology/video/jeopardy-pits-man-versus-watson-computer-12841538">laptop</a> crashed while you were working on a major presentation, you could ask your portable expert to help diagnose the problem. If you wanted to bone up on Middle Eastern history, you could ask it to comb every document available and then wrap it all up in a simple summary (annotated, of course). <span id="more-1989"></span></p>
<p>Best of all, instead of typing out basic questions on a cramped keyboard, you could speak to it in natural human language and it would understand.</p>
<p>Thanks to the researchers behind Watson &#8211; <a href="http://abcnews.go.com/Technology/wireStory?id=12615258">IBM&#8217;s whiz-bang computer designed to compete on &#8216;Jeopardy!&#8217;</a> &#8211; that sci-fi-like scenario is a little bit closer to reality.</p>
<p>In less than a week, the world will watch as <a href="http://abcnews.go.com/Entertainment/wireStory?id=12388873">Watson takes on the top human contestants on the popular trivia gameshow</a>. But artificial intelligence experts say it may not be too long before the technology powering Watson spills over into doctors&#8217; offices, businesses and, eventually, maybe even your phone.</p>
<p>For the past four years, researchers at IBM have been grooming their computer program (named after IBM founder Thomas J. Watson) to compete on &#8216;Jeopardy!&#8217; against the game&#8217;s most successful champions, Ken Jennings and Brad Rutter.</p>
<h3>Watson Understands Questions in Natural Language</h3>
<p>Not only does the program need to be able to recall facts and figures across a wide range of topics, it needs to understand the puns and metaphors commonly found in &#8216;Jeopardy!&#8217; clues. And, to win, it needs to do it all faster than the fastest human.</p>
<p>&#8220;It&#8217;s an information-seeking tool that&#8217;s capable of understanding your question to make sure you get what you want, and then delivers that content through a natural flowing dialogue,&#8221; David Ferrucci, IBM&#8217;s Watson team leader, said on the <a href="http://www.pbs.org/wgbh/nova/tech/smartest-machine-on-earth.html">NOVA special &#8216;Smartest Machine on Earth,&#8217;</a> which premieres on PBS tonight. &#8220;I don&#8217;t think the world has seen a machine quite like Watson and, frankly, I&#8217;m thinking where can we go from here?&#8221;</p>
<p>The magic of Watson is that beyond being able to search formal databases and tables for information based on keywords, it uses many different algorithms to understand and process natural human language.</p>
<h3>In Medicine, Watson-Like System Could Be &#8216;Doctor&#8217;s Assistant&#8217;</h3>
<p>&#8220;Classic expert systems tend to be brittle, they tend to be very narrow, they tend to be able to solve problems only in the way it was expressed in that formal math,&#8221; Ferrucci told <a href="http://abcnews.go.com/">ABCNews.com</a>. &#8220;What you see underneath the hood in Watson is a way to do that reasoning but right over the natural language content itself.&#8221;</p>
<p>That might not sound so tricky at first, but think about all the nuances and wordplay woven into human language. Homonyms, inflection, double entendres &#8211; they might be familiar to you, but they&#8217;re foreign to even the most advanced computers.</p>
<p>&#8220;Dealing with natural language is a very, very hard task for a computer,&#8221; Ferrucci said. &#8220;But then, moreover, there&#8217;s tremendous potential if we can continue to chip away at this task.&#8221;</p>
<p>In medicine, for example, a <a href="http://abcnews.go.com/Technology/ai-expert-ray-kurzweil-picks-computer-jeopardy-match/story?id=12871295">Watson</a>-like system could serve as a kind of doctor&#8217;s assistant, he said.</p>
<p>Let&#8217;s say a patient suffers from a rare disease. The medical Watson could listen in on patient interviews and combine those conversations with the patient&#8217;s medical record, family history and test results. Not only that, but it could cross-reference that material against relevant journal articles, research and other published information.</p>
<h3>Applications of Watson Technology Could Extend to Government, Engineering</h3>
<p>Finally, it could generate a list of the top conditions the patient might be suffering from, along with a list of all the relevant sources.</p>
<p>&#8220;What the doctor can do is just consult this and say, am I missing anything? There&#8217;s a huge amount of information out there,&#8221; Ferrucci said. &#8220;I imagine that it will help the doctor to adjust and refine and rationalize and document both the diagnostic process as well as the treatment process with a lot more confidence.&#8221;</p>
<p>Watson-like systems may not be as precise as a rule-based system working over a specific database, he said, but they can cover a wider collection of information and then make it more digestible for humans.</p>
<p>Ferrucci said the same process could be used by information technology professionals to unravel complicated computer problems.</p>
<p>IBM&#8217;s immediate challenge might be to best world-champions on &#8216;Jeopardy!,&#8217; but, ultimately, the company is looking to apply Watson&#8217;s technology to areas as varied as government, engineering and business.</p>
<h3>&#8216;Jeopardy!&#8217; Challenge Could Spark Public Conversation, Drive Research</h3>
<p>Eric Nyberg, a professor in the Language Technologies Institute at Carnegie Mellon University, said he hopes the &#8216;<a href="http://abcnews.go.com/Technology/jeopardy-champs-ibms-watson/story?id=12837898">Jeopardy!&#8217;</a> competition will not only open up a conversation with the public about artificial intelligence, but also drive more research in the field.</p>
<p>As far as consumer applications, he said, &#8220;I think the logical next stop beyond Watson is going to be systems that can advise you on selecting certain kinds of products that meet your personal needs.&#8221;</p>
<p>For example, we may not be too far away from a system that could read through all the camera reviews available and then, based on its knowledge of a user&#8217;s preferences, recommend the best choices, Nyberg said.</p>
<p>The system could be accessed in a retail shop where you would buy the camera, but it could also be accessible through a cell phone, he said.</p>
<p>&#8220;We could build applications like that today. For example, if there was a manufacturer that wanted to create a version of Watson that could answer questions about its entire product line that would be a very easy thing to do,&#8221; he said, adding that the range of trivia and language used in &#8216;Jeopardy!&#8217; actually poses a more difficult problem.</p>
<p>But though Watson may represent a ground-breaking step in so-called question-answering systems, researchers say it&#8217;s still not the ultimate goal in artificial intelligence.</p>
<h3>Goal of AI: Build Machines With Human Intelligence</h3>
<p>&#8220;From the science point of view, the goal of artificial intelligence, when it started 50 years ago, was to build machines that exhibit human intelligence,&#8221; said Boris Katz, the principal research scientist at MIT&#8217;s Computer Science and Artificial Intelligence Laboratory. &#8220;One could argue that answering certain questions is part of that but I think it would be especially interesting to build something that not only performs certain tasks but maybe even does it in a way that a human would do it.&#8221;</p>
<p>Those machines would not just provide answers to questions, but be able to explain how they arrived at the correct answer. And, given the trend toward mobile devices, he said, eventually, those machines will likely find their way into your hands.</p>
<p>Some smartphone applications can already understand limited voice commands and execute basic tasks like dialing contacts in your phone book but, Katz said, advanced systems could potentially turn your computer into a hand-held buddy.</p>
<p>&#8220;Your pocket friend &#8211; more than Watson,&#8221; he said. &#8220;Not only [to] answer simple questions, but actually do things.&#8221; You could instruct it, for instance: &#8220;Please tell my friend to do this,&#8221; or &#8220;please find this information and summarize it.&#8221;</p>
<p>That scenario is probably still decades away, he said, but it starts with the natural language research that put Watson on &#8216;Jeopardy!&#8217;.</p>
<p>So while you might want to be loyal and root for the human race next week, the future may not be such a bad consolation prize if man ultimately does get defeated by machine.</p>
<p>Written by: <a href="http://abcnews.go.com/Technology/watson-technology-jeopardy/story?id=12869629">ABC News</a>, via <a href="http://sct.temple.edu/blogs/ispr/2011/02/10/for-ibms-watson-technology-what-happens-after-jeopardy/">Presence</a><br />
Posted by: <a href="https://www.situatedresearch.com">Situated Research</a></p>
<p>The post <a href="https://www.situatedresearch.com/2011/02/for-ibms-watson-technology-what-happens-after-jeopardy/">For IBM&#8217;s Watson technology, What Happens After &#8220;Jeopardy!&#8221;?</a> appeared first on <a href="https://www.situatedresearch.com">Situated Research</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.situatedresearch.com/2011/02/for-ibms-watson-technology-what-happens-after-jeopardy/feed/</wfw:commentRss>
			<slash:comments>3</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">1989</post-id>	</item>
	</channel>
</rss>
