Nearly $50 million in research funding awarded by NSF

September 17, 2012 - via National Science Foundation

The National Science Foundation (NSF), in partnership with NASA, the National Institutes of Health (NIH) and the U.S. Department of Agriculture (USDA), today awarded just under $50 million to grantees around the country for the development and use of robots that cooperatively work with people and enhance individual human capabilities, performance and safety.

These mark the first round of awards of the Obama Administration's National Robotics Initiative (NRI) launched with NSF as the lead federal agency just over a year ago as part of the president's Advanced Manufacturing Partnership Initiative. NSF itself is announcing 31 of the awards totaling nearly $30 million.

Funded projects target the creation of next-generation collaborative robots, or co-robots, with applications in areas such as advanced manufacturing; civil and environmental infrastructure; health care and rehabilitation; military and homeland security; space and undersea exploration; food production, processing and distribution; assistive devices for improving independence and quality of life; and safer driving. Examples of co-robots envisioned include the co-worker in manufacturing, the co-protector in civilian and military venues, and the co-inhabitant assistant in the home of an elder living independently.

Last year, NSF issued a solicitation and managed the merit review process for more than 700 individual proposals requesting over $1 billion in funding. NSF's Directorates for Computer and Information Science and Engineering; Engineering; Education and Human Resources; and Social, Behavioral, and Economic Sciences worked collaboratively with the other agencies. Together, NSF, NASA, NIH and USDA participated in the review process and in all funding decisions.

Each agency applied the goals of NRI against its mission criteria: encouraging robotics research and technology development to enhance aeronautics and space missions for NASA; developing robotic applications for surgery, health intervention, prostheses, rehabilitation, behavioral therapy, personalized care and wellness/health promotion for NIH; promoting robotics research, applications, and education to enhance food production, processing and distribution that benefits consumers and rural communities for USDA; and advancing fundamental robotics research across all areas of national priority for NSF, including advanced manufacturing.

"Harnessing the expertise of the private and public sectors across many disciplines will advance smart technology and ultimately revolutionize key drivers of America's productivity, economic competitiveness and quality of life," said Farnam Jahanian, assistant director of NSF's CISE Directorate.

Each federal agency announced its awards this morning. Later today, representatives from each participating agency will brief the Congressional Robotics Caucus. For details please see the notice on the Congressional Robotics Caucus website.

What follows is a list of the NSF-funded projects and the principal investigators leading the research at each participating university. For projects that involve multiple institutions, the lead institution (and PI) is noted with an asterisk.

Collaborative Research: Purposeful Prediction: Co-robot Interaction via Understanding Intent and Goals 
*Carnegie Mellon University (James Bagnell), Massachusetts Institute of Technology (Joshua Tenenbaum), University of Washington (Dieter Fox)
This project focuses on recognizing human intention--that is, teaching a robot to forecast what a human is going to do, so that robots may more effectively collaborate with humans. The inability of robots to anticipate human needs and goals today represents a fundamental barrier to the large-scale deployment of robots in the home and workplace. This project seeks to develop a new science of purposeful prediction using algorithms that may be applied to human-robot interaction across a wide variety of domains.
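As an illustration of the kind of inference the project description points to, a toy Bayesian goal-inference update is sketched below. The goals, actions, and likelihood values are hypothetical examples, not the project's actual model or algorithms:

```python
# Illustrative sketch of goal inference: maintain a posterior over a
# human's possible goals and update it as each action is observed.
# All goals, actions, and probabilities here are made-up toy values.
def update_goal_beliefs(beliefs, likelihoods, action):
    """Bayesian update: P(goal | action) is proportional to
    P(action | goal) * P(goal)."""
    posterior = {g: beliefs[g] * likelihoods[g][action] for g in beliefs}
    total = sum(posterior.values()) or 1.0
    return {g: p / total for g, p in posterior.items()}

# Two candidate goals; walking toward the kitchen favors "make_coffee".
beliefs = {"make_coffee": 0.5, "watch_tv": 0.5}
likelihoods = {
    "make_coffee": {"walk_to_kitchen": 0.8, "sit_on_couch": 0.1},
    "watch_tv":    {"walk_to_kitchen": 0.2, "sit_on_couch": 0.9},
}
beliefs = update_goal_beliefs(beliefs, likelihoods, "walk_to_kitchen")
```

A robot maintaining such a posterior can act on the most probable goal before the human finishes the task, which is the essence of anticipating needs rather than merely reacting.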

A Design Methodology for Multi-fingered Robotic Hands with Second-order Kinematic Constraints 
*Idaho State University (Alba Perez Gracia), University of California, Irvine (J. Michael McCarthy)
This research focuses on the adoption and integration of specific characteristics of human hands in robots in order to accomplish a desired task, whether that entails lifting a small, unusually-shaped part for assembly or moving a bulky object. This tool will increase the ability of industry to design high performance, cost-effective multi-fingered robotic hands and other end effectors.

Collaborative Research: A Dynamic Bayesian Approach to Real-Time Estimation and Filtering in Grasp Acquisition and Other Contact Tasks 
*Rensselaer Polytechnic Institute (Jeffrey Trinkle), State University of New York (SUNY) at Albany (Siwei Lyu)
This project is developing techniques to enable robots to grasp objects or perform other contact tasks quickly and reliably in unstructured, uncertain environments. Using the proposed method, sensor data tracks the continuous motions of manipulated objects while models of the objects are simultaneously updated. Applications include search and rescue, planetary exploration, manufacturing, and even home use under everyday but important uncertainties, such as moving a bowl effectively whether it is full or empty.

A Biologically Plausible Architecture for Robotic Vision 
University of California-San Diego (Nuno Vasconcelos)
The project seeks to develop a vision architecture for robots, based on biologically inspired examples of high-level vision systems (for example, gaze, or ascertaining what a human is looking at), that is both biologically plausible and jointly optimal. This system would be useful for attention, object tracking, and object and action recognition in both static and dynamic environments.

Context-Driven Haptic Inquiry of Objects Based on Task Requirements for Artificial Grasp and Manipulation 
Arizona State University (Veronica Santos)
This work focuses on the sense of touch. The project aims to advance artificial manipulators by integrating a new class of multimodal tactile sensors with artificial, humanlike hands and developing inquiry routines based on contextual touch. The weight given to each mode of tactile sensing (force, vibration, temperature) will also be tuned according to the context of the task. The research explores how to make use of these stimuli to enable assistive robots to better grasp, hold and carry objects.

Contextually Grounded Collaborative Discourse for Mediating Shared Basis in Situated Human Robot Dialogue 
Michigan State University (Joyce Chai)
This project focuses on human-robot dialogue, bridging the chasm of understanding between human partners and robots that have completely mismatched capabilities in perceiving and reasoning about the environment. This project centers on developing techniques that will support mediating the shared perceptual basis for effective conversation and task completion. With an ability to use what is known to shed light on what is not yet known (that is, using the power of inference in situations that give clues to meaning), this research could benefit many applications in manufacturing, public safety and healthcare.

Cooperative Underwater Robotic Networks for Discovery & Rescue 
University of Connecticut (Chengyu Cao)
This project aims to develop a cooperative underwater robotic network for exploration, discovery and rescue, tasks currently undermined by murky underwater conditions in which traditional acoustic and radio communication networks do not work. So-called autonomous underwater vehicles offer inherent advantages over manned vehicles in cost and efficiency: specifically, they eliminate the need for life support systems and the risk to human life, while enabling assessment and damage mitigation after an incident under the water's surface, such as an oil spill.

Improved safety and reliability of robotic systems by faults/anomalies detection from uninterpreted signals of computation graphs 
California Institute of Technology (Richard Murray)
This research centers on detecting error conditions--that is, figuring out when things are going wrong, or when conditions may have been tampered with or altered by a human. The project addresses the main challenges of designing robots that can operate around humans: creating systems that can guarantee safety and effectiveness while remaining robust to the nuisances of unstructured environments, from hardware faults to software issues, erroneous calibration and less predictable anomalies such as tampering and sabotage.

Multifunctional Electroactive Polymers for Muscle-Like Actuation 
University of California-Los Angeles (Qibing Pei)
This project aims to develop a new, softer polymer material that is electronically stimulated to behave like an artificial muscle. This offers a combination of attributes for future robotic systems, including power output that outperforms human skeletal muscle, flexibility, quietness, and biocompatibility. Actuators based on this more muscle-like material enable the design of robotic systems that interact more comfortably with people, such as assistive prostheses and devices for people with disabilities, humanoid robots for elderly in-home care, and life-saving surgical robots.

Perceptually Inspired Dynamics for Robot Arm Motion 
University of Wisconsin-Madison (Michael Gleicher)
This project seeks to enable a computer to learn from its own trials, errors and successes how to move and how to plan future appropriate motions. Researchers are working to develop an understanding of human perception of movement that can be applied to the development of robot trajectory planning and control algorithms, using human subjects experiments to understand and evaluate the interpretation of movements and apply these findings in robotics and motion synthesis.

Virtualized Welding: A New Paradigm for Intelligent Welding Robots in Unstructured Environment 
University of Kentucky Research Foundation (Ruigang Yang)
Zeroing in on welding, a widely used manufacturing process performed by highly skilled workers, this project will develop a new robotic platform with novel 3D modeling and visualization algorithms, designed to complement the skills and expertise of a human welder with the advanced sensing tools of a robotic one. The primary use for this new technology is in manufacturing. Successful completion of the proposed project lays the foundation for intelligent welding robots with closed-loop intelligent control. Such a robotic system can perform high-speed, high-precision welding while allowing more variation in the work pieces and environments. In addition, virtualized welding can be integrated with a mobile platform to allow welding in places that are hazardous or unsuitable for human welders.

Author: Lisa-Joy Zgorski
