Human-Robot Teaming
Mentor: Dr. Julie A. Adams

This project seeks to develop new methods for adapting interaction between a human and a robot in a one-to-one partnership based on the human's cognitive workload. The human dons various wearable sensors that provide a data stream that is analyzed to detect high and low workload conditions. Once a high or low workload condition is detected, the robot can adapt its method of interacting with the human. REU students will develop new algorithms for detecting changes in workload and for adapting the robot's interactive capabilities based on the detected workload. The project will require algorithm design, implementation, and software testing. Further, REU students will evaluate the algorithms with actual human-robot teams.


Human-Guided Robotic Grasping
Mentors: Dr. Cindy Grimm and Dr. Ravi Balasubramanian

Humans have no trouble picking up and manipulating objects, yet we've struggled to impart that ability to robotic manipulators. This is partly because robotic manipulators lack much of the sensory feedback human hands have, but also because we, as humans, are not very good at reasoning about what we do instinctively. The goal of this project is to develop tools and user studies that will let us capture that knowledge and apply it to robotic manipulation tasks.

Example projects:

  • Design and evaluate novel interfaces, suitable for non-technical users, for specifying robotic grasps
  • Determine how perceptual and shape-based cues influence how humans specify grasps
  • Apply statistical and machine-learning based approaches to gathered data in order to specify novel grasps


Design of Implants for Attaching Muscle and Tendons to Improve Human Hand Function
Mentor: Dr. Ravi Balasubramanian

Current reconstructive orthopedic surgeries use sutures to attach muscles to tendons. However, this leads to poor surgical outcomes because of the suture's limited ability to transmit the muscle's forces and movement to the tendons. It is expected that using passive implants, such as pulleys and rods, to surgically construct mechanisms in situ from the existing biological tendons will significantly improve post-surgery function (compared to using sutures) and lead to the development of new surgical procedures.

Spider Web Vibration Sensing
Mentor: Dr. Ross Hatton

Spiders use vibrations in their webs to sense the presence of other insects, much as whales, bats, and submarines use sonar to locate their targets. In collaboration with a biology team, we're building and instrumenting a giant spider web to examine how this works.
Example project:

  • Designing new test scenarios (e.g., how does web construction affect signal transmission?) and building the equipment to collect data from the web. We are also designing equipment to shake and pluck real spider webs so that we can compare their responses to those of our artificial web.


Geometry of Locomotion
Mentor: Dr. Ross Hatton

The effectiveness and efficiency with which robots move through the world is strongly affected by their geometry, including both the physical geometry of their bodies and the abstract geometry of the patterns in which they move their joints. This project is about exploring these geometric effects, and generating fundamental understanding about how robots (especially snakes and other crawling robots) exploit them. Depending on your interests, this project can range from hands-on experiments that generate data for geometric modeling to delving deeper into the mathematics behind locomotion.

Example project: Snake scales have more friction for sideways motion than forward motion, and we have good geometric models for how they exploit this difference in friction. They also have more friction for backwards motion than sideways or forward, but we don't have a geometric model for when this difference matters or how they exploit it. This project would include building a simple snake robot with different forward-backward-sideways friction coefficients, testing its motion with different shape-change cycles, and identifying patterns in the resulting motion.
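The direction-dependent friction described above can be sketched as a simple Coulomb friction model with different coefficients for forward, backward, and sideways sliding. The coefficient values and the blending rule below are purely illustrative assumptions, not the lab's measured model:

```python
import math

# Hypothetical anisotropic friction coefficients for snake scales.
# The values are illustrative, not measured.
MU_FORWARD = 0.1   # sliding head-first along the body axis
MU_BACKWARD = 0.3  # sliding tail-first along the body axis
MU_SIDEWAYS = 0.2  # sliding perpendicular to the body axis

def friction_magnitude(vx, vy, hx, hy, normal_force=1.0):
    """Magnitude of Coulomb friction opposing sliding velocity (vx, vy),
    with direction-dependent coefficients in the body frame whose
    forward axis is the unit vector (hx, hy)."""
    speed = math.hypot(vx, vy)
    if speed == 0.0:
        return 0.0
    # Components of the unit sliding direction in the body frame.
    along = (vx * hx + vy * hy) / speed   # + is forward, - is backward
    side = (hx * vy - hy * vx) / speed    # lateral component
    mu_axial = MU_FORWARD if along >= 0 else MU_BACKWARD
    # Blend the axial and lateral coefficients by sliding direction.
    mu = abs(along) * mu_axial + abs(side) * MU_SIDEWAYS
    return mu * normal_force

# With these coefficients, sliding straight backward resists three
# times as much as sliding straight forward, so shape-change cycles
# that push segments backward gain more traction than ones that
# push them forward.
```

A model like this, evaluated over a full shape-change cycle, is the starting point for asking when the forward-backward asymmetry actually changes the resulting motion.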


Learning from Humans for Robotic Deburring

Mentors: Dr. Ravi Balasubramanian and Dr. Burak Sencer

Burrs are undesirable projections of material at the edges of a finished part’s surface. They pose a fundamental problem for manufacturing operations since they affect part handling and assembly operations and lead to part failure. Deburring, the process of removing burrs, is currently performed by human operators. But deburring is labor-intensive and causes injury to operators due to the forces and movements involved. This project seeks to learn the deburring process from a human (including forces and movements involved) and develop a robot to automate the deburring process.
Example student projects:

  • Collect and analyze data (motion-capture kinematic data, force data) of a human operator performing the deburring task
  • Develop algorithms that enable a robotic manipulator to produce similar movements and forces for deburring


Autonomy for Underwater and Surface Vehicles Exploring Ocean Environments

Mentors: Dr. Geoff Hollinger and Dr. Julie A. Adams

There are many ocean environments that are unsuitable for manned research vessels, either because they are too dangerous (e.g., near a calving glacier or in the deep ocean) or because they require too many resources to study effectively. We seek to design a new generation of autonomous underwater and surface vehicles that can be programmed to autonomously measure ocean dynamics in a wide range of environments. This research project involves the design and programming of robust surface and underwater systems that will be used to explore ocean dynamics along the Pacific coast, in Greenland and Alaska, and in remote ocean basins. The REU student will join an interdisciplinary team of researchers from mechanical engineering, computer science, and oceanography to assist in designing autonomy algorithms and programming marine vehicles at Oregon State University.

Example projects:

  • Programming autonomous surface and underwater craft to operate without operator control
  • Simulator design for underwater and surface vehicles
  • Interface design for human operators to coordinate autonomous marine vehicles


Passive Dynamics and Applied Control for Legged Locomotion

Mentor: Dr. Jonathan Hurst

Students on this project will participate in research with our human-scale walking and running robot, Cassie. At this time, we are primarily focused on controls and perception for Cassie; interns will work closely with graduate students to test new control ideas with the robot both in the lab and outside, read data from our LIDAR, cameras, and IR camera, implement a VR environment for telepresence control, or pursue other interesting and fun projects. This project is not a gentle introduction to these topics; it is a better fit for an intern seeking a challenge.

Example Student Projects:

  • Create several prototype robot feet, and advise on control methods to incorporate them.
  • Help with testing and experimentation of Cassie outdoors and indoors.
  • Begin initial design effort for new arms for Cassie 3, in cooperation with graduate students. 
  • Incorporate decision-making algorithms with real-time robot dynamics, to enable footstep planning and balancing at the same time.


Biologically-inspired Soft Robots
Mentors: Dr. Yigit Menguc (external) and Dr. Cindy Grimm (internal)

Biological tissues have rich structural and material compositions that give them elegant mechanical properties, such as the smooth variation in elasticity of a tendon that allows it to connect muscle to bone. We will explore patterning hard and soft materials with a multimaterial 3D printer to create structures that can bring biologically-inspired mechanical improvements to soft robots. For instance, one of the challenges in building a soft, squishy robot is constructing durable interfaces between extremely compliant materials such as silicone and rigid parts like batteries.
Example Projects:
  • Designing, fabricating, and testing a bioinspired multimaterial soft robot
  • Developing mechanical parts of multimaterial 3D printer for reliable operation
  • 3D printing and testing soft actuators with embedded liquid metal


Control of a self-driving wheelchair
Mentor: Dr. Bill Smart

We are working on a self-driving powered wheelchair, for use by people with severe motor disabilities, such as Amyotrophic Lateral Sclerosis (ALS, or Lou Gehrig's disease) and quadriplegia. The goal is to develop a low-cost package that can be added to a traditional powered wheelchair, and turn it into a self-driving system. Self-driving capabilities would afford the wheelchair user more independence, and allow them to move about their world more easily and efficiently.

The project leverages the Robot Operating System (ROS) software for navigation and localization of the system, with special-purpose code for wheelchair-specific features. Our goal is to release both the hardware design and software for the system under an open-source license.
Example Projects:

  • Design, implementation and testing of wheelchair specific navigation and localization algorithms
  • User interface design, implementation, and testing for the wheelchair system.
  • Long-term autonomy issues: map building and maintenance, data logging and management, and learning from experience.
  • Design, implementation, and testing of wheelchair-specific autonomous behaviors.


Robots Against Ebola
Mentor: Dr. Bill Smart

Can we use robotics and automation in the fight against highly-infectious diseases, such as Ebola Virus Disease?  We are working with Doctors Without Borders, a non-governmental organization that organized much of the response to the recent outbreak of Ebola in West Africa, to see how we can improve the quality of care that they can deliver by using robotics and automation.  Ebola thrives in environments that are hot, humid, and remote; often there is little in the way of reliable infrastructure.  This project is focused on practical solutions that will let us improve the quality of care that can be delivered to people with the disease and, ultimately, to reduce the mortality rate in future outbreaks.  We are building mathematical models of the tasks that health care workers do, using optimization techniques to reorder these tasks and subtasks to make them more efficient and effective, and looking at how we can use robotics and automation in support of the human workers.

Example Projects:

  • Model a particular health care task mathematically, and apply optimization techniques to make it more efficient (taking less time, making fewer mistakes, etc.).
  • Program a robot to work alongside a human performing a task, to make the overall human-robot team more effective than either the robot or the human alone.
  • Perform user studies, to figure out the best human-robot interface for use in an Ebola treatment center setting.


Multi-Robot Coordination

Mentor: Dr. Kagan Tumer

Many interesting real world problems require multiple robots to work together. For example, search and rescue missions require coordinating dozens of autonomous robots, as well as ensuring that the robots and humans work together. But providing hard-coded coordination instructions is too limiting. This project explores the science of coordination, and focuses on how to provide incentives to individual robots so that they work collectively.
Example Projects:

  • Programming intelligent decision making for robots
  • Implementing incentives for robots
  • Testing coordination algorithms in hardware (wheeled and flying robots)


Bat Sonar for Robots

Mentors: Dr. Cindy Grimm and Dr. Kagan Tumer

Bat sonar is an incredibly rich sense: bats can spot and catch insects in the air and fly through dense forest canopy, all by emitting chirps and listening for echoes. They accomplish this feat through deformable ear and nose geometries that let them "shape" the soundscape. In this project we aim to build a soft, deformable bat ear and nose so that such a sensor can be mounted on a robot.

Example projects:

  • Design a soft actuator to bend the ear and nose
  • Experiment with different geometries to see how they shape sound
  • Apply machine learning to map the returned sonar signal to "object detected" in order to let a robot navigate through a space