WORCESTER, Mass. — As robots, smart devices, and other automated technologies become more widespread, many industry analysts and researchers are predicting a coming surge in autonomous machines in industrial settings, office buildings, retailers, and even homes. These devices are expected not only to be aware of their environment, incoming data, and their own state, but also to learn, act, and make decisions on their own.
Students at Worcester Polytechnic Institute (WPI), working in Major Qualifying Project (MQP) teams, are preparing themselves to be part of this burgeoning wave of technological and societal change. In many cases, they are advancing the technology itself, with projects ranging from an autonomous vehicle platform to a robot that guides prospective students around a campus building to one that could charge electric cars.
Charging electric cars
“I think autonomous vehicles are certainly going to be part of the future,” says MQP advisor Craig Putnam, senior instructor in computer science and associate director of robotics engineering. However, he notes, it’s probably not financially practical to have a car-charging robot at individual homes—at least not in the near future.
Thus the students’ innovation could be a smart option for companies operating fleets of electric, autonomous cars. “This project fits into that as an infrastructure support project,” he adds, “so that as vehicles drive themselves around fully autonomously on the road, they can also be recharged without human intervention.”
Using a rear fender fitted with an electric car charging port, the students designed a robotic arm that reaches out to the car, uses computer vision to locate the charging-port flap, and attaches a suction cup to the flap so it can pull it open. The arm then releases the flap, swaps the suction cup for the proper charging plug, reaches back out, and inserts the plug so charging can begin.
The students wrote algorithms and code to control the robotic arm’s operations, which range from computer vision to autonomy and robot motion. They also used 3D printing to create various parts, such as end-of-arm tooling and parts of the frame.
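As an illustration (not the team's actual code), the hand-off the article describes can be sketched as a fixed step sequence gated by a vision check. The `Step` names and the `find_flap` callback here are hypothetical stand-ins for the students' vision and motion software:

```python
from enum import Enum, auto

class Step(Enum):
    LOCATE_FLAP = auto()         # computer vision finds the charging-port flap
    ATTACH_SUCTION_CUP = auto()  # press the suction cup onto the flap
    OPEN_FLAP = auto()           # pull the flap open, then release it
    SWAP_TO_PLUG = auto()        # exchange the cup for the charging plug
    INSERT_PLUG = auto()         # reach back out and seat the plug
    CHARGING = auto()

def charging_sequence(find_flap):
    """Run the hand-off as a fixed step sequence.

    `find_flap` stands in for the vision pipeline: it returns the flap's
    (x, y) image coordinates, or None if no flap is detected.
    """
    if find_flap() is None:
        return []  # abort before moving the arm at all
    return list(Step)  # Enum iteration preserves definition order

# Usage: simulate a successful flap detection at pixel (312, 188).
steps = charging_sequence(lambda: (312, 188))
```

A real controller would verify each step succeeded before advancing, but the fixed ordering above is the essential structure of the task.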
Team members included Class of 2019 students Ryan O’Brien (RBE), Matthew Fortmeyer (CS), Nikolas Gamarra (RBE), and Jacob Remz (RBE). ECE professor Jie Fu co-advised the project.
Autonomous car platform
With self-driving cars promising to become part of our everyday lives, one MQP team took on the challenge of retrofitting a traditional vehicle to become a self-driving automobile.
Under the supervision of ECE professor Alex Wyglinski, team members built a modular platform that can make any ground vehicle drive autonomously, combining LIDAR (which measures distances using lasers and sensors), ultrasonic sensors, motors, and a high-performance computing module. They turned a 4×4, off-road-style vehicle into a fully autonomous vehicle that could accurately and safely chart and follow a path from a starting point to a target destination, avoiding obstacles along the way.
“It’s impressive that the students built something that is not hardcoded into the vehicle itself,” says Wyglinski. “Integrating these sensing and computing components into a car is a huge undertaking since altering one part of a vehicle can potentially affect many other systems. The students’ solution can be transferred to other vehicular platforms. The key is that it’s modular.”
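The core behavior, charting a path to a target while steering around obstacles, can be sketched as a planner over an occupancy grid. This is a minimal illustration, not the team's implementation; the grid here is a hypothetical map built from the kind of obstacle data LIDAR and ultrasonic sensors provide:

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over an occupancy grid (0 = free, 1 = obstacle).

    Returns the shortest list of (row, col) cells from start to goal,
    or None if the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}  # also serves as the visited set
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:  # walk parents back to the start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None

# Usage: a wall of obstacles with one gap the planner must route through.
grid = [
    [0, 0, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
]
path = plan_path(grid, (0, 0), (2, 0))
```

A production planner would work in continuous space and replan as sensors report new obstacles, but the grid search captures the start-to-goal, obstacle-avoiding logic.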
The team won the Provost’s MQP Award in the electrical and computer engineering department.
Students on the team included Class of 2019 members Cassandra Pepicelli (ECE), Gabriel Entov (RBE/ECE), Cooper Wolanin (RBE/ECE), Sam White (RBE), Lan Hao Mao (RBE), and Jonathan Tai (RBE). Hugh Lauer, CS teaching professor, was a co-advisor on the project.
Tour guide robot
What’s an efficient way to show prospective students and their parents, or visiting researchers, around one of the many buildings on campus? One student team suggests a robot that can communicate with people and knows the building layout.
Such a robot was built this spring by an MQP team, co-advised by Jing Xiao, Dean’s Excellence Professor and director of the Robotics Engineering program, and Greg Fischer, William Smith Dean’s Professor of mechanical engineering. Capable of leading visitors to different rooms that might even be on different floors of a building, the tour guide robot can be used to show people around the Robotics Engineering offices at 85 Prescott St. this summer.
The student researchers worked with speech recognition, computer vision, and navigation technologies, along with cameras and sensors. They also added a “head” to the mobile platform to create an approximately four-foot-tall robot that can express itself verbally and through on-screen animated faces that convey several emotions, like happiness or confusion. The robot also can recognize obstacles in its way and, if the obstacle is a person, the robot can politely ask that person to move aside. If a user wants to visit multiple locations, the robot can calculate the best route to guide the person in an efficient tour.
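Choosing the best order for a multi-stop tour is a small routing problem. As a hedged sketch (not the team's code), with only a handful of requested rooms the robot could simply try every visiting order against a table of walking distances; the distance values below are made up for illustration:

```python
from itertools import permutations

def best_tour(dist, stops):
    """Pick the visiting order that minimizes total walking distance.

    `dist[a][b]` is the distance between locations a and b; the tour
    starts at location 0 (say, the lobby) and visits every requested
    stop. Brute force is fine for the few rooms a visitor asks about.
    """
    best_order, best_cost = None, float("inf")
    for order in permutations(stops):
        cost, here = 0, 0
        for nxt in order:       # accumulate leg-by-leg distance
            cost += dist[here][nxt]
            here = nxt
        if cost < best_cost:
            best_order, best_cost = order, cost
    return list(best_order), best_cost

# Hypothetical distances (meters) between the lobby (0) and three rooms.
dist = [
    [0, 10, 15, 20],
    [10, 0, 35, 25],
    [15, 35, 0, 30],
    [20, 25, 30, 0],
]
order, cost = best_tour(dist, [1, 2, 3])
```

Trying all orders is exponential in the number of stops, which is acceptable here only because a tour visits a few rooms at most.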
“It is important for students to learn about service robots because they will be the most widespread application of robotics in our daily lives in the future,” said Xiao. “It’s also important for our students to learn about human-robotic interactions because we’re looking at a future where people will have robotic assistants and we’ll need those autonomous machines to be easy for people to communicate with and work with. This project is teaching students how to do that.”
Student researchers on the MQP team included Class of 2019 members Henry Dunphy (CS), Zoraver Kang (RBE, CS), and William Mosby (CS).