Foldable devices are the future!

I had the idea for HoverSense, a way to improve touchscreen technology, back when I purchased the first iPad. In fact, I started developing the initial concepts in early 2014, right after I left Cisco and before I joined Microsoft.
The original goal was to help visually impaired and blind people use touch devices more effectively (the WHO estimates that hundreds of millions of people worldwide live with some form of visual impairment).

But then, after receiving feedback from a few of my friends, peers, and colleagues, I realized I had something even more distinctive here: something that could come in handy once we eventually start seeing foldable devices, something that could change the way we all use touchscreens while also helping the visually impaired (just as I had wanted to do in the beginning). So I decided to put the idea on the shelf and file a patent, in order to develop it when the time is right (we’ll get to that later).

Fast forward to 2018: It bothered me that so many articles in recent years (including my own) talked about the stagnation in the mobile world, and lately even about everything wrong with Foldables and Dual-Screen devices…

So I decided to do something about it! After all, why complain in my articles when I have an idea that can solve the problem? That’s what UXers should do. During my UX career, I’ve developed a passion for problem-solving methodologies, especially when I see usability issues, and particularly when it comes to HCI & UX for operating systems, mobile, and cutting-edge technologies such as VR/AR/MR.

If you’ve read my articles and posts (on Medium, Twitter, LinkedIn, and Facebook), then you have probably noticed that I’m very passionate about HCI. Even if we one day replace our current work environments with AR/MR (Spatial Computing) and no longer need peripherals, or even control everything through brain-machine interfaces (such as Elon Musk’s Neuralink), I believe we will still use touchscreens, holograms, and even voice commands, because these modalities complement one another rather than replace each other, at least for the near future.

A NextGen touchscreen with Hover-sensitivity, Tactile-feedback, and Pressure-sensitivity.

Why are touch-screens so bad for productivity?

Today, our mobile devices are so powerful that they could replace our computers, but ironically we can’t use them productively because, much like 50 years ago, we still need a mouse and keyboard to do professional and creative work. And 50 years in technology is an eternity: our smartphones are millions of times more powerful than the Apollo Guidance Computer that flew astronauts to the Moon in the late 1960s and early 1970s!

So why can’t we use our devices to get some serious work done?

1. The tactile-feedback issue: There’s no real tactile feedback with our current touchscreens, including Foldable and Dual-screen devices. Throughout human history, we have used our hands to control things and create things. But unfortunately, nothing has changed in touchscreens for over a decade. They remain flat, with no third dimension, and there’s no real tactile feedback when you type on a virtual keyboard, so you can’t feel the keys, buttons, textures, and UI elements you touch with your fingers.

2. The hover-detection issue: Even when you connect an HDMI cable, or cast your device’s screen onto an external monitor, you can’t use it productively. There are no hover indicators for your fingers on the external display, so you still have to look at your device’s screen to interact with it. And there’s really no point in casting to an external display (for gaming, presentations, document editing, video and picture mirroring, etc.) if you aren’t looking at it.

3. The touch-and-pressure issue: With rare exceptions (such as Apple’s since-discontinued 3D Touch), touchscreens can’t detect how hard you press with your fingers, which is a pity, because just like hover detection and tactile feedback, pressure sensing can open up new ways of interacting for creative people and even the gaming industry.
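To make these missing capabilities more concrete, here is a minimal sketch of how an app might distinguish hover, touch, and deep-press interactions. It assumes fields modeled on the W3C Pointer Events API (where `pressure` ranges from 0 to 1 and is 0 while a hover-capable pointer floats above the surface); the threshold value is an illustrative assumption, not part of any spec or of HoverSense itself.

```typescript
// The three interaction states a hover- and pressure-aware screen would expose.
type PointerState = "hover" | "touch" | "force-press";

// A simplified pointer sample, with fields modeled on the W3C PointerEvent:
// inContact corresponds to PointerEvent.buttons !== 0, and pressure to
// PointerEvent.pressure (0..1, reported as 0 while hovering).
interface PointerSample {
  inContact: boolean;
  pressure: number;
}

// Illustrative cutoff for treating a press as a "deep" press (an assumption).
const FORCE_THRESHOLD = 0.75;

function classify(sample: PointerSample): PointerState {
  if (!sample.inContact) {
    return "hover"; // finger or pen detected above the glass, not touching it
  }
  return sample.pressure >= FORCE_THRESHOLD ? "force-press" : "touch";
}
```

With this kind of classification, a hover sample could drive a cursor indicator on an external display, while pressure could map to brush weight in a drawing app or analog input in a game.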

Even the more ambitious devices and concepts feel like something is missing when you look at them. For example, many people complain about dual-screen computers because they know how hard it is to type on touchscreens. And yet, companies keep developing and selling these types of products; here are a few examples:

HoverSense: A NextGen touchscreen for better productivity without peripherals.

So, how can we solve all these issues?

In the meantime, our tablets and phones can already replace our computers; it’s all about tactility, just as I said in my article a few years ago. Personally, I use my iPad Pro to get serious work done, and even my Galaxy Note 9. And I’m not alone: more and more professionals are switching from PCs & Macs to mobile devices such as the iPad Pro, just like Henny Tha Bizness & Ken Lewis:

What’s a computer?

Now, imagine you could be as productive and creative with your mobile devices as you are with your computers. What if there were a better way to achieve this without the need for peripherals (mouse & keyboard, keyboard cases, remote controls, styluses, etc.)? Well, actually, there is!

Meet HoverSense:

What’s HoverSense (and the story behind it)?

When I started my research in early 2014, I consulted a few of my friends, peers, and colleagues (under NDA), because they are known figures in the Israeli “Startup Nation” and their feedback was important to me.
I found a talented CTO from a well-known local company. After we had finished the business plan and a few concepts, and just before we met with a number of angel investors, VCs, and accelerator managers, we decided that the timing wasn’t right, mainly because it was obvious to me that the future is foldable/flexible (if you remember Samsung’s and Nokia’s concepts). Then, in 2015, I received a few tempting job opportunities, so I took a break from HoverSense and decided to shelve the concept.

A few months later, I found myself working as a UX Lead for one of Microsoft’s cyber divisions at the Israeli R&D center. Even though I was responsible for more than five cross-platform products, I decided to participate in other incubation projects, to mentor and help other teams with UX, and to explore opportunities and network with the relevant executives within the company, in order to gauge how feasible it would be to pursue this kind of project (HoverSense) while working for such an amazing and resourceful company.

Microsoft encourages its employees to take part in hackathons and innovation and incubation projects, and so I did, because innovation is what I’m really passionate about. Participating in and helping with a few internal incubation and hackathon projects was fun and inspiring, and it led to the opportunity to lead a personal (award-winning) project named OneAlbum with one of my friends at Microsoft.

And as for HoverSense, I even consulted the Israeli R&D center’s CEO, Yoram Yaacovi, who was kind enough to offer his help, though he asked me not to disclose the idea in detail (unless I decided to develop it within Microsoft). I eventually decided to pursue it outside the company and to take HoverSense from vision to reality without making it exclusive to one organization, because I believe this kind of technology should be an industry standard, available to all tech companies (and their users).

What’s next?

To bring better, immersive experiences to the VR and AR/MR (and even XR) glasses and lenses of tomorrow, these fields will need gesture/hover detection and tactile feedback. It’s too soon to say much about it, and it won’t be easy, but yes, HoverSense (V2) is ready for that future too!
For now, I can’t really get into all the details here, so you might want to check out the website.

Now, whether you agree with HoverSense’s approach or not, I believe it’s important to think about and discuss the future of touch interfaces. You are welcome to share your thoughts and feedback with me, and please share HoverSense with your friends: https://startux.net/hoversense
