Smaller, smarter, faster: blink and you miss it. 

Nitin Sanket, an assistant professor in Worcester Polytechnic Institute’s Robotics Engineering Department, hopes that one day you’ll barely notice a drone flitting by.

Sanket and his team are advancing the development of small, autonomous robots. He envisions a future in which we’re surrounded by miniature drones performing tasks such as pollinating crops, searching for survivors in disaster zones, providing entertainment, delivering payloads and even serving as pets.

Taking his cues from creatures such as insects and hummingbirds, Sanket hopes to program drones that can navigate cluttered environments, dynamically evade moving obstacles and fly safely through small gaps. To keep the robots secure and able to operate in environments where running a control loop remotely might be difficult, all of this must be achieved using only onboard perception and computing. That means no assistance from GPS, cloud-based computation or, eventually, motion capture.

“At some point, we want to be able to do search and rescue in the wild, flying fast enough through a forest fire to save somebody, or perhaps to look for poachers, and every second matters in these scenarios. That’s one of the major reasons why we are pushing the boundary of speed, and we want these robots to be small enough that it’s safe for them to move that fast. You have narrow spaces in a forest, and one option is to make your perception better, make your controls better. But the bigger the robot, the higher the probability of it running into something. So we’re asking, can we make the robot small, as well as making it smarter?”

To get to a point where his robots won’t need a system such as motion capture to navigate, Sanket first needs to use motion capture as his source of ground truth.

“We want to eventually fly at 30 or 40 meters per second,” he explains. “Similar work is being done in drone racing, but we want to do it in a completely unstructured environment. Our focus is on the vision side—how do you actually process the data that fast?

“So how can we test that? You can’t take it into a forest and actually fly at freeway speeds. You’d immediately run into a tree. So we want to do it virtually first. So, you have an empty space where all the trees are computer-generated and you have a perfect state estimator, which is motion capture. We want to start the journey there and that’s how our lab got set up.”
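
The virtual-forest idea described above is straightforward to prototype: the motion-capture system supplies a near-perfect pose for the real drone, while the trees exist only in software, so collisions are checked against virtual geometry rather than risked against real trunks. The Python sketch below illustrates that loop under stated assumptions; the get_mocap_pose function, the tree coordinates and the drone radius are hypothetical placeholders, not Vicon’s API or the lab’s actual configuration.

```python
import math
import random

def get_mocap_pose():
    """Hypothetical placeholder for a motion-capture pose query.

    In the real lab this would come from the tracking system; here we
    simply return a random (x, y) position in metres for illustration.
    """
    return (random.uniform(0.0, 11.0), random.uniform(0.0, 4.5))

# Computer-generated "trees": (x, y, trunk_radius) tuples scattered in an
# otherwise empty flight space. Positions are illustrative only.
VIRTUAL_TREES = [(2.0, 1.5, 0.15), (5.0, 3.0, 0.20), (8.5, 1.0, 0.18)]

def check_virtual_collision(pose, trees, drone_radius=0.10):
    """Return the first virtual tree the drone would have hit, if any."""
    x, y = pose
    for tx, ty, r in trees:
        if math.hypot(x - tx, y - ty) < r + drone_radius:
            return (tx, ty)
    return None

if __name__ == "__main__":
    for step in range(100):  # stand-in for the real-time control loop
        pose = get_mocap_pose()  # ground-truth position from motion capture
        hit = check_virtual_collision(pose, VIRTUAL_TREES)
        if hit:
            print(f"step {step}: virtual collision near {hit}, pose {pose}")
```

The appeal of this setup is that the drone can be flown at full speed with a perfect state estimate while every "crash" is purely virtual, so perception and control algorithms can be stress-tested long before a real forest is involved.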

The lab in question is 36 ft long, 15 ft wide and about 12.5 ft high, and is equipped with 14 Vicon Vero cameras running on Tracker 4. “We were the first people in the country to get trained on Tracker 4,” says Sanket. “It’s awesome. Such an improvement.”

The story doesn’t stop there. To read the full case study, download it below.

DOWNLOAD HERE
