{"id":8599,"date":"2022-05-26T17:16:40","date_gmt":"2022-05-27T00:16:40","guid":{"rendered":"https:\/\/blogs.ubc.ca\/etec523\/?p=8599"},"modified":"2022-05-30T22:17:06","modified_gmt":"2022-05-31T05:17:06","slug":"google-soli-a-mobile-radar-to-detect-body-language-and-gestures","status":"publish","type":"post","link":"https:\/\/blogs.ubc.ca\/etec523\/2022\/05\/26\/google-soli-a-mobile-radar-to-detect-body-language-and-gestures\/","title":{"rendered":"Google Soli: A mobile radar to detect body language and gestures"},"content":{"rendered":"\n<p class=\"wp-block-paragraph\">While technology has made communicating much more convenient, some lament the reduced level of human interaction: we&#8217;re so focused on what&#8217;s happening on our screens that we stop paying attention to how the people around us behave or react. Natural interactions, body movements, and gestures \u2014 the cues that help us understand and engage with the people and objects around us \u2014 are usually absent from mobile technology. Studies have suggested that nonverbal expression carries roughly 65% to 93% of the meaning in communication, helping people form a fuller understanding of a situation than words alone. In essence, \u201cbody language\u201d is one of the best windows into the behavioural psychology of individuals and groups. (I&#8217;m sure we&#8217;ve all used our intuition and observation skills to get a feel for the vibe of a room and adjust to it.)<\/p>\n\n\n\n<p class=\"wp-block-paragraph\">According to Google, Soli is a miniature radar that understands human motion at various scales, picking up on movements and cues by reading both implicit and explicit interactions. 
Implicit interactions are the nonverbal cues a person gives off, such as proximity, body language, and biosignals like heartbeat, whereas explicit interactions are hand and finger movements directed at the Soli device. <\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/blogs.ubc.ca\/etec523\/files\/2022\/05\/Screen-Shot-2022-05-26-at-5.06.14-PM-1024x1004.png\" alt=\"\" class=\"wp-image-8600\" width=\"485\" height=\"476\" srcset=\"https:\/\/blogs.ubc.ca\/etec523\/files\/2022\/05\/Screen-Shot-2022-05-26-at-5.06.14-PM-1024x1004.png 1024w, https:\/\/blogs.ubc.ca\/etec523\/files\/2022\/05\/Screen-Shot-2022-05-26-at-5.06.14-PM-300x294.png 300w, https:\/\/blogs.ubc.ca\/etec523\/files\/2022\/05\/Screen-Shot-2022-05-26-at-5.06.14-PM-768x753.png 768w, https:\/\/blogs.ubc.ca\/etec523\/files\/2022\/05\/Screen-Shot-2022-05-26-at-5.06.14-PM.png 1444w\" sizes=\"auto, (max-width: 485px) 100vw, 485px\" \/><\/figure>\n<\/div>\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/blogs.ubc.ca\/etec523\/files\/2022\/05\/Screen-Shot-2022-05-26-at-5.06.24-PM-1024x962.png\" alt=\"\" class=\"wp-image-8601\" width=\"485\" height=\"456\" srcset=\"https:\/\/blogs.ubc.ca\/etec523\/files\/2022\/05\/Screen-Shot-2022-05-26-at-5.06.24-PM-1024x962.png 1024w, https:\/\/blogs.ubc.ca\/etec523\/files\/2022\/05\/Screen-Shot-2022-05-26-at-5.06.24-PM-300x282.png 300w, https:\/\/blogs.ubc.ca\/etec523\/files\/2022\/05\/Screen-Shot-2022-05-26-at-5.06.24-PM-768x722.png 768w, https:\/\/blogs.ubc.ca\/etec523\/files\/2022\/05\/Screen-Shot-2022-05-26-at-5.06.24-PM.png 1458w\" sizes=\"auto, (max-width: 485px) 100vw, 485px\" \/><\/figure>\n<\/div>\n\n\n<p class=\"wp-block-paragraph\">Google notes that the technology can recognize when you lean, turn, and reach, as well as hand movements that 
indicate you want to dial, swipe, or slide. Soli can also recognize when multiple people are near the device, and whether the user is standing or sitting. On one hand, it&#8217;s intriguing that a sensor can capture such rich information about an object&#8217;s characteristics and behaviour, including its size, shape, orientation, material, distance, and velocity. On the other hand, this form of mobile technology will certainly raise concerns about privacy and ethical use, as the barrier between our internal and external selves is reduced. <\/p>\n\n\n\n<p class=\"wp-block-paragraph\">From an educational perspective, I could see this technology supporting inclusiveness: devices could be operated through air gestures, and educators could communicate with learners who have speech difficulties, or reduce the anxiety some students feel about speaking up in the classroom. While its actual usefulness is debatable, it&#8217;s still a neat preview of the possibilities of future mobile tech. 
<\/p>\n\n\n\n<figure class=\"wp-block-embed aligncenter is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"In the Lab with Google ATAP: Nonverbal Interactions with Soli Radar\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/r-eh2K4HCzI?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n","protected":false},"excerpt":{"rendered":"<p>While technology has made communicating much more convenient for us, some lament the decreased level of human interaction as we&#8217;re all too focused on what&#8217;s&#8230;<\/p>\n<div class=\"more-link-wrapper\"><a class=\"more-link\" href=\"https:\/\/blogs.ubc.ca\/etec523\/2022\/05\/26\/google-soli-a-mobile-radar-to-detect-body-language-and-gestures\/\">Read more<span class=\"screen-reader-text\">Google Soli: A mobile radar to detect body language and 
gestures<\/span><\/a><\/div>\n","protected":false},"author":85677,"featured_media":8602,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_crdt_document":"","footnotes":""},"categories":[10],"tags":[],"class_list":["post-8599","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-mobiletechnologies","entry"],"_links":{"self":[{"href":"https:\/\/blogs.ubc.ca\/etec523\/wp-json\/wp\/v2\/posts\/8599","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/blogs.ubc.ca\/etec523\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blogs.ubc.ca\/etec523\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blogs.ubc.ca\/etec523\/wp-json\/wp\/v2\/users\/85677"}],"replies":[{"embeddable":true,"href":"https:\/\/blogs.ubc.ca\/etec523\/wp-json\/wp\/v2\/comments?post=8599"}],"version-history":[{"count":2,"href":"https:\/\/blogs.ubc.ca\/etec523\/wp-json\/wp\/v2\/posts\/8599\/revisions"}],"predecessor-version":[{"id":8604,"href":"https:\/\/blogs.ubc.ca\/etec523\/wp-json\/wp\/v2\/posts\/8599\/revisions\/8604"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/blogs.ubc.ca\/etec523\/wp-json\/wp\/v2\/media\/8602"}],"wp:attachment":[{"href":"https:\/\/blogs.ubc.ca\/etec523\/wp-json\/wp\/v2\/media?parent=8599"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blogs.ubc.ca\/etec523\/wp-json\/wp\/v2\/categories?post=8599"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blogs.ubc.ca\/etec523\/wp-json\/wp\/v2\/tags?post=8599"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}