Teleoperation and haptic feedback in robotics have moved beyond robotic surgery into underwater systems and other hazardous environments. One of the leaders in this space is Dr. Howard Chizeck, professor of electrical and computer engineering and adjunct professor of bioengineering at the University of Washington.

Howard Chizeck, University of Washington and Olis Robotics
His telerobotics research includes haptic navigation and control for teleoperated devices, including robotic surgery and underwater systems. His neural engineering work involves the design and security of brain-machine interfaces and the development of devices to control symptoms of essential tremor and Parkinson’s disease. He is also a founder and chair of the board of directors of Olis Robotics, established under the name BluHaptics and renamed in 2019, which aims to commercialize haptic rendering, haptic navigation, and other UW telerobotic technologies.
Joanne Pransky, associate editor for Industrial Robot Journal, recently spoke with Chizeck about haptic feedback challenges and the lessons he’s learned through the founding of several robotics companies.
The full interview is available free to Robotics Business Review readers until Oct. 31, 2019. Here is an excerpt:
Commercializing haptics
Q: You have founded several companies. Could you tell us about your journey with your recent startup, Olis Robotics?
Chizeck: While at the University of Washington, I was working on trying to provide a sense of touch for surgeons performing robotic surgery. With National Science Foundation (NSF) support, we developed technology using the Microsoft Kinect to generate point clouds and render haptic forces.
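To make the point-cloud haptic rendering idea concrete, here is a minimal sketch of the basic technique: find the point in a depth-sensor cloud nearest the haptic tool tip and push back with a spring-like force once the tip gets too close. This is an illustrative example only, not the UW or Olis Robotics implementation; the function name, stiffness, and standoff distance are all hypothetical.

```python
import numpy as np

def render_haptic_force(point_cloud, tool_tip, stiffness=300.0, standoff=0.01):
    """Render a simple spring-like force that pushes the haptic tool tip
    away from the nearest point in a depth-sensor point cloud.

    point_cloud : (N, 3) array of XYZ points (e.g., from a Kinect depth frame)
    tool_tip    : (3,) position of the haptic tool tip in the same frame
    stiffness   : spring constant in N/m (hypothetical value)
    standoff    : distance in meters at which the force begins to act
    """
    diffs = point_cloud - tool_tip             # vectors from tip to each point
    dists = np.linalg.norm(diffs, axis=1)      # Euclidean distances
    i = np.argmin(dists)                       # closest surface point
    d = dists[i]
    if d >= standoff:                          # outside the contact zone: no force
        return np.zeros(3)
    direction = -diffs[i] / (d + 1e-9)         # unit vector pointing away from the surface
    penetration = standoff - d                 # how far the tip has entered the zone
    return stiffness * penetration * direction # Hooke's-law-style repulsive force
```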
An opportunity was presented by the Strategic Environmental Research and Development Program (SERDP) – a consortium program of the Department of Defense, Environmental Protection Agency, and Department of Energy – to remove unexploded munitions from lake bottoms. We figured that since we were already remotely manipulating a teleoperated robot for surgery, underwater munitions couldn’t be that different. So I started a company, BluHaptics (later renamed Olis Robotics), and we wrote a seed grant proposal that got funded. We were then committed to trying to make that work. Then, in developing that technology for underwater munitions, it became apparent that there were a lot of other underwater applications that could use telerobotics.
The company is really software as a service. If someone makes a good robot arm, for instance, we’ll team with them. Our technology either gets packaged with an existing robot manufacturer’s kit, or we sell or license to the folks who rent equipment and they include our software as an aftermarket product. We are trying to avoid manufacturing hardware, since that is high risk and expensive. Our products at this point are really 100% software, and our algorithms are in there. That was the beginning of the company. For dynamic environments where the robot and/or target may be moving or drifting, and objects may be swimming, floating, or flying by, we needed to bring in artificial intelligence to recognize and avoid these obstacles. To help do this, we acquired a small two-person company that had the technology patents, and now they’re employees and stockholders of the company.
Q: What makes your haptic feedback technology so different from anything else that exists?
Chizeck: We’re fast. Much of our intellectual property actually depends on some very fast algorithms. It’s not only the haptic technologies. With a haptic device that lets the human control a remote robot and feel forces, the human operator and our algorithms together achieve precision control. We’re basically taking advantage of human judgment and coupling it with capabilities that let the system be faster or do things that the human operator alone can’t do.
Our intellectual property is good for non-contact situations. Avoiding collisions is really hard if your only sensors are on the robot itself, because you actually have to make contact to know you’ve had a collision. But if you’re using non-contact sensors like radar, sonar, or LiDAR, you can see the object coming and prevent the collision before it happens.
There’s also a time delay in the traditional robot control loop: you have to wait for the robot, and thus its sensor, to make contact and for that information to get back to the operator before a decision can be made. We can get rid of that time delay because we’re predicting what’s going to happen next based on velocity, trajectory and distance. We’re also building what we call virtual fixtures – force fields – that keep you away from what you don’t want to touch or guide you to where you want to grasp. Thus we can actually accomplish tasks while preventing bad things from happening.
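As an illustration of the prediction and virtual-fixture ideas Chizeck describes – not Olis Robotics’ code; the look-ahead horizon, keep-out radius, and stiffness are invented for the example – a forbidden-region fixture can be sketched as a force field whose strength grows as the predicted clearance to a moving obstacle shrinks:

```python
import numpy as np

def predicted_clearance(rel_pos, rel_vel, horizon=0.5):
    """Estimate the minimum tool-to-obstacle distance over a short look-ahead
    horizon, assuming constant relative velocity.

    rel_pos : (3,) obstacle position minus tool position (meters)
    rel_vel : (3,) obstacle velocity minus tool velocity (m/s)
    horizon : look-ahead time in seconds (hypothetical value)
    """
    speed_sq = rel_vel @ rel_vel
    # Time of closest approach along the straight-line prediction, clamped to [0, horizon].
    t_closest = 0.0 if speed_sq < 1e-9 else np.clip(-(rel_pos @ rel_vel) / speed_sq, 0.0, horizon)
    return np.linalg.norm(rel_pos + rel_vel * t_closest)

def virtual_fixture_force(rel_pos, rel_vel, keep_out=0.10, stiffness=500.0):
    """Forbidden-region virtual fixture: push the operator's hand away when the
    predicted clearance drops below a keep-out radius (values are illustrative)."""
    clearance = predicted_clearance(rel_pos, rel_vel)
    if clearance >= keep_out:
        return np.zeros(3)
    away = -rel_pos / (np.linalg.norm(rel_pos) + 1e-9)  # direction from obstacle toward tool
    return stiffness * (keep_out - clearance) * away
```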
Most robot companies are busy working on fully autonomous systems. We’re using what we call progressive autonomy. At one extreme is total human control; at the other extreme is full autonomy. We combine human control with different levels of robot autonomy, including local autonomy augmentation for the robot’s end effector (gripper or other tool).
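One textbook way to realize such a sliding scale (a generic shared-control blend, not necessarily how Olis implements progressive autonomy) is to mix the operator’s commanded motion with an autonomous controller’s suggestion according to an autonomy level between 0 and 1:

```python
import numpy as np

def blend_commands(human_cmd, auto_cmd, autonomy_level):
    """Blend a human teleoperation command with an autonomous controller's command.
    autonomy_level = 0 gives pure human control; 1 gives full autonomy."""
    a = float(np.clip(autonomy_level, 0.0, 1.0))
    return (1.0 - a) * np.asarray(human_cmd) + a * np.asarray(auto_cmd)

# Example (hypothetical numbers): the operator drives the end effector while an
# alignment controller nudges it toward a grasp pose; 0.5 splits the difference.
v = blend_commands(human_cmd=[0.05, 0.0, 0.0], auto_cmd=[0.02, 0.01, -0.01], autonomy_level=0.5)
```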
And we’re not just working in dangerous or remote environments, like underwater or in orbit – places where you can’t put humans – but also across scales. We’re manipulating things at a really small scale, where the human hand is too big, or at a very large scale, say in construction, where the objects you’re moving are really huge compared to the human hand. Distance, danger and scale are the three driving influences.
Lessons, proud moments
Q: What are some of the lessons you’ve learned?
Chizeck: Lessons learned […] Technology is actually a small part of a company. I’ve learned that you have to have a Chief Executive Officer (CEO) who knows what he or she is doing. A robot company is, at most, only half robot engineers and software developers. You have to have specialized salespeople who talk to customers to learn what the customer really wants in order to deliver a product that people will buy or lease. A tremendous number of robot companies have gone under because, although they had wonderful ideas, they didn’t actually meet a need that was economically viable or that people would pay for.
Also, if you can, choose your board and your investors as people who can help you with both connections and experience, not just money. You really need to see where your product will be in the future, how to make it cost-effective and useful, and what the competition will be. Having other people with experience and knowledge is a very effective way to do that.
Q: In terms of all the amazing things you’ve accomplished in your career, what has been your proudest moment?
Chizeck: The best thing is the students, the things they’ve accomplished, and launching them off. In particular, I’m talking about Ph.D. students because you’re sort of developing the next generation and sending them off.
Q: What do you think is the single most important thing we can be doing for our Ph.D.s to prepare them for the commercial side of robotics?
Chizeck: First, they have to decide if they want to be academic or commercial. Most new graduate students really don’t know what they want to do or haven’t found the passionate thing that they’re in love with and want to spend the next four or five years on. I tend to let my students wander in the wilderness, trying things in the lab and teaming up with other people until they focus. After about six months to a year, they find something that excites them and know what they really want to do. I don’t believe in the “Do this, do this, do this” micromanagement instruction.