I was just on the phone with Rich Quinnell, editor of the EETimes Industrial Control Designline. Rich is currently attending ARM TechCon, and he was bringing me up to date on all the interesting sessions he's attended and the amazing things he's seen.

Of particular interest, in light of the distributed denial of service (DDoS) attack that disrupted Internet service in Europe and the US last Friday, was ARM's announcement of a comprehensive portfolio of products and services that promises to greatly simplify the creation of secure IoT systems (see Rich's ARM Does IoT Security Chip to Cloud column on EE Times).

I was also interested to hear that Rich will be attending a dinner and demonstration hosted by Ultrahaptics this evening. The term "haptics" refers to any form of interaction involving touch. Ultrahaptics uses a phased array of ultrasonic generators to create tactile feedback — think virtual buttons and other controls floating around in mid-air (this video makes things a little clearer).

Generally speaking, these ultrasonically generated controls are invisible to the human eye. The reason for my interest, however, is their application in conjunction with virtual and/or augmented reality systems. In this case, the "invisible" controls could be given visible representations in the virtual and/or augmented worlds. All of this is one more step along the path to a Star Trek-like Holodeck.

(Source: Ultrahaptics)

I just heard from Cyan that the virtual reality version of Obduction will be made available to their backers (of which I am one) this coming Friday. A large part of Obduction involves manipulating objects in the virtual world.
Currently, you have to use a mouse or an Xbox controller to achieve this. Hopefully, we will be able to use the Oculus Touch controllers when they become available in December, but having an ultrasonic-based haptic interface would make the entire experience much more immersive (with the added advantage of annoying the dog).

All I can say is that I'm very much looking forward to hearing what Rich has to say about all of this following the demo when I chat with him again tomorrow.
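As an aside, the core idea behind a phased ultrasonic array — firing each emitter with a slightly different delay so that all the wavefronts arrive in phase at a chosen point, creating a pressure peak you can feel — can be sketched in a few lines. This is a purely illustrative geometry exercise, not Ultrahaptics' actual algorithm; the array layout and focal point below are made up for the example:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def focal_delays(emitter_xs, focus):
    """Per-emitter firing delays (in seconds) so that all wavefronts
    arrive at `focus` simultaneously, producing a pressure peak there.

    emitter_xs: x-coordinates (metres) of emitters in a linear array at y = 0
    focus:      (x, y) focal point in metres
    """
    fx, fy = focus
    # Distance from each emitter to the focal point
    dists = [math.hypot(fx - x, fy) for x in emitter_xs]
    # The farthest emitter fires first (delay 0); nearer ones wait,
    # so every wavefront covers its distance and arrives together.
    t_max = max(dists) / SPEED_OF_SOUND
    return [t_max - d / SPEED_OF_SOUND for d in dists]

# Hypothetical 8-element array with 10 mm pitch,
# focusing 20 cm above the centre of the array
xs = [i * 0.010 for i in range(8)]
delays = focal_delays(xs, (0.035, 0.20))
```

In a real device this runs in two dimensions over hundreds of emitters, and the focal point is modulated over time so the skin (which is poor at sensing constant pressure) perceives the spot as a vibrating "button".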