{"id":775,"date":"2013-10-06T22:13:00","date_gmt":"2013-10-06T22:13:00","guid":{"rendered":"http:\/\/triangleappshow.com\/?p=775"},"modified":"2022-12-15T06:46:42","modified_gmt":"2022-12-15T11:46:42","slug":"the-google-glass-interface","status":"publish","type":"post","link":"https:\/\/michaelrowe01.com\/index.php\/blog\/the-google-glass-interface\/","title":{"rendered":"The Google Glass Interface"},"content":{"rendered":"<p>In my last post I asked, rhetorically, whether Google Glass was a preview of the interface to come. I didn&#8217;t answer the question, nor even really think it through, as I wanted to sleep on it overnight. \u00a0So is it?<\/p>\n<p>I think Google Glass does a great job of changing how you interact with your mobile device. \u00a0It does so in the same way that Siri and other voice tools change how you look up information or dial the phone. \u00a0To that end, it is nothing new.<\/p>\n<p>What is different, however, is the way you consume the data. \u00a0Google has a page for developers on how to work with the interface: <a href=\"https:\/\/developers.google.com\/glass\/ui-guidelines\" target=\"_blank\" rel=\"noopener\">here<\/a>. \u00a0The idea is simple: show what needs to be shown, and get the rest of the interface out of the way. \u00a0If you&#8217;ve been using Google Now and its new Card interface, then you&#8217;ve already experienced how Google Glass will present data to you. \u00a0If you are a developer who has started working on mobile apps or Chrome apps and are already using Cards, then you understand the basic framework for defining your interface in Google Glass.<\/p>\n<p>On my other podcast, <a href=\"http:\/\/gamesatwork.biz\/2013\/10\/06\/episode-63-the-real-neuromancer\/\" target=\"_blank\" rel=\"noopener\">Games At Work dot Biz<\/a>, we talked about how Augmented Reality apps and the world of Neuromancer are coming together to change the way we interact with devices. 
\u00a0An example that came to mind was the ability for two people to work together to perform a task with the use of AR and Google Glass. \u00a0Imagine a doctor or a mechanic performing an unfamiliar task. \u00a0They can learn by doing, and by wearing Google Glass, see a projection of an expert performing the same task with their hands out in front of them. \u00a0This POV (or Point of View) experience would allow them to mimic the expert&#8217;s movements, and allow the expert to see the action in a virtual 3D world using something like the Oculus Rift technology.<\/p>\n<p>Bringing these two technologies together, Google Glass and <a href=\"http:\/\/www.oculusvr.com\" target=\"_blank\" rel=\"noopener\">Oculus Rift<\/a>, could change the way we learn and practice new hands-on activities. \u00a0It also could, dare I say it, &#8220;<a href=\"http:\/\/www.geeksaresexy.net\/2013\/08\/31\/bill-nye-the-science-guy-aims-to-change-the-world-video\/\" target=\"_blank\" rel=\"noopener\">Change the World<\/a>&#8221;.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>In my last post I asked, rhetorically, whether Google Glass was a preview of the interface to come. I didn&#8217;t answer the question, nor even really think it through, as I wanted to sleep on it overnight. \u00a0So is it? 
I think Google Glass does \u00a0a great job of changing how you interact with [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"jetpack_post_was_ever_published":false,"hide_page_title":"","_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":true,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[2],"tags":[42,65,178,179,283,292],"class_list":["post-775","post","type-post","status-publish","format-standard","hentry","category-blog","tag-augmented-reality","tag-cards","tag-google-glass","tag-google-now","tag-neuromancer","tag-oculus-rift"],"aioseo_notices":[],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_shortlink":"https:\/\/wp.me\/p2aMa8-cv","jetpack-related-posts":[],"jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/michaelrowe01.com\/index.php\/wp-json\/wp\/v2\/posts\/775","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/michaelrowe01.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/michaelrowe01.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/michaelrowe01.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/michaelrowe01.com\/index.php\/wp-json\/wp\/v2\/comments?post=775"}],"version-history":[{"count":1,"href":"https:\/\/michaelrowe01.com\/index.php\/wp-json\/wp\/v2\/posts\/775\/revisions"}],"predecessor-version":[{"id":2856,"href":"https:\/\/michaelrowe01.com\/index.php\/wp-json\/wp\/v2\/posts\
/775\/revisions\/2856"}],"wp:attachment":[{"href":"https:\/\/michaelrowe01.com\/index.php\/wp-json\/wp\/v2\/media?parent=775"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/michaelrowe01.com\/index.php\/wp-json\/wp\/v2\/categories?post=775"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/michaelrowe01.com\/index.php\/wp-json\/wp\/v2\/tags?post=775"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}