Welcome to the research division of the University of South Florida's Center for Assistive, Rehabilitation and Robotics Technologies (CARRT). Our research groups and laboratories incorporate innovative theory and state-of-the-art facilities to develop, analyze, and test cutting-edge assistive and rehabilitation robotics technologies.
Our faculty, staff, and graduate and undergraduate students pursue a wide range of projects, all focused on maintaining and enhancing the lives of people with disabilities:
The Wheelchair-Mounted Robotic Arm (WMRA) project combines a wheelchair's mobility control and a 7-joint robotic arm's manipulation control in a single control mechanism, allowing people with disabilities to perform many activities of daily living (ADL) with minimal or no assistance. Many of these activities and tasks are otherwise difficult or impossible for people with disabilities to accomplish.
This project presents a novel method of using laser data to generate trajectories and virtual constraints in real time, assisting the user in teleoperating a remote arm to execute tasks in a remote, unstructured environment.
The laser also helps the user make high-level decisions, such as selecting target objects by pointing the laser at them. The trajectories generated by the laser enable autonomous control of the remote arm, while the virtual constraints enable scaled teleoperation and virtual-fixture-based teleoperation. The assistance to the user in the scaled and virtual-fixture-based teleoperation modes is based on either position feedback or force feedback to the master. The user also has the option of a velocity control mode in teleoperation, in which the speed of the remote arm is proportional to the displacement of the master from its initial position. At any point, the user can choose a suitable control mode after locating targets with the laser. The various control modes have been compared with one another, and time- and accuracy-based results have been presented for a 'pick and place' task carried out by three healthy subjects. The system is intended to assist users with disabilities in carrying out their activities of daily living (ADL), but it can also be used for other applications involving teleoperation of a manipulator. The system is PC-based, using multithreaded programming strategies for real-time arm control, and the controller is implemented on QNX.
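The proportional velocity mode described above can be sketched in a few lines: the remote arm's commanded speed scales with the master's displacement from its initial position. This is a minimal illustration, not the project's actual QNX controller; the gain and deadband values are hypothetical.

```python
def velocity_command(master_pos, initial_pos, gain=0.5, deadband=0.01):
    """Map master displacement to a remote-arm velocity command, per axis.

    master_pos/initial_pos are same-length sequences of axis positions.
    gain and deadband are illustrative values, not the project's tuning.
    """
    cmd = []
    for p, p0 in zip(master_pos, initial_pos):
        d = p - p0
        # Displacements inside the deadband produce no motion, which
        # prevents drift from sensor noise or hand tremor at rest.
        cmd.append(0.0 if abs(d) < deadband else gain * d)
    return cmd
```

A deadband of this kind is common in master-slave velocity mappings, since the master rarely returns to exactly its initial position.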
Through collaboration with the School of Theatre & Dance and the School of Physical Therapy and Rehabilitation Sciences, adaptive recreational devices have been designed and developed to assist people with disabilities and amputees in various recreational activities, including dance and exercise.
A completely hands-free wheelchair that responds to the rider's body motion was developed primarily for use in the performing arts; however, its unique user interface offers endless possibilities in the fields of assistive devices for daily activities and rehabilitation. This powered wheelchair modification provides greater opportunities for social interaction, increases the rider's independence, and advances the state of the art in sports and recreation, as well as in assistive and rehabilitative technologies overall. Several prototypes have been developed, including a mechanical design and a sensor-based design. A new design is underway that uses the gyroscope in an iPod or other handheld device to control the wheelchair.
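A tilt-driven interface like the one described could map device orientation to differential wheel speeds roughly as follows. This is a hedged sketch under assumed conventions (pitch drives forward speed, roll drives turning); the prototype's actual mapping and parameters are not stated in the text above.

```python
def wheel_speeds(pitch, roll, max_speed=1.0):
    """Map normalized device tilt (pitch/roll in [-1, 1]) to left/right
    wheel speeds for a differential-drive powered wheelchair (sketch)."""
    # Clamp inputs so extreme tilts cannot exceed the speed limit.
    forward = max(-1.0, min(1.0, pitch)) * max_speed
    turn = max(-1.0, min(1.0, roll)) * max_speed
    left, right = forward + turn, forward - turn
    # Rescale so neither wheel command exceeds max_speed.
    peak = max(abs(left), abs(right), max_speed)
    scale = max_speed / peak
    return left * scale, right * scale
```

Rescaling (rather than clipping each wheel independently) preserves the intended turning ratio when the rider leans forward and sideways at once.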
This project involves the design, development, and testing of a stand-alone omnidirectional mobile dance platform with an independently rotating top. The platform is robust, remote controlled, compact, transportable, and inexpensive. It adds a new choreographic element, creating a unique style of dance that incorporates a variety of mobility devices and performers, including dancers with disabilities. The platform is designed to hold up to five hundred pounds while its top rotates independently and its base moves forward, backward, sideways, or diagonally on omnidirectional wheels. The current design has a removable top surface, folding wing sections that collapse the unit to fit through an average-sized doorway, and detachable ramp ends for wheelchair access. The top of the platform is driven by a compact gear train designed to deliver maximum torque within the limited space.
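Omnidirectional bases are typically driven by mapping a desired body velocity (forward, sideways, rotation) to individual wheel speeds. The sketch below assumes a standard four-mecanum-wheel layout, which may differ from this platform's actual wheel arrangement; the wheel radius and chassis dimensions are placeholders.

```python
def mecanum_wheel_speeds(vx, vy, omega, r=0.1, half_length=0.5, half_width=0.5):
    """Inverse kinematics for a four-mecanum-wheel base (illustrative).

    vx: forward velocity (m/s), vy: leftward velocity (m/s),
    omega: rotation rate (rad/s). Returns wheel angular speeds (rad/s)
    in order (front-left, front-right, rear-left, rear-right).
    """
    k = half_length + half_width  # lever arm combining both offsets
    fl = (vx - vy - k * omega) / r
    fr = (vx + vy + k * omega) / r
    rl = (vx + vy - k * omega) / r
    rr = (vx - vy + k * omega) / r
    return fl, fr, rl, rr
```

Pure forward motion spins all wheels equally, while pure sideways motion spins diagonal pairs in opposite directions, which is what lets such a base move diagonally without turning.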
Various terminal devices have been developed to assist prosthesis users in their recreational activities. These terminal devices are designed to improve the user's ability to participate in golf, kayaking, rock climbing, and other activities.
A driver training system that combines a hand-controlled modified van with a driving simulator has been developed. This system enables individuals to overcome transportation barriers that interfere with employment opportunities or access to daily activities. Combining AEVIT (Advanced Electronic Vehicle Interface Technology) with a virtual reality driving simulator from SSI (Simulator Systems International) creates an environment in which a user can try different interfaces while learning to operate a motor vehicle in real time. Various adaptive controls are integrated into the system. Analysis of these controls across users with different abilities can be used to recommend specific devices and to train users in the virtual environment before they train on their own modified vehicle.
Passive dynamic walkers (PDW) are devices that can walk down a slope without any active feedback, using gravity as their only energy source. In this research, we are examining asymmetric walking in an approach that is similar to, but distinct from, the Gait Enhancing Mobile Shoe project. Typically, PDW studies have used symmetric walkers (i.e., the same masses and lengths on each side), which generally results in symmetric gaits. However, individuals who have had a stroke and individuals who wear a prosthesis do not have physical symmetry between the two sides of their body. By changing one physical parameter on one of the two legs of the PDW, we can show a number of stable asymmetric gait patterns in which one leg has a consistently different step length than the other, as shown on the right. In the figure on the right, the right knee has been moved up the leg. This asymmetric model of walking will enable us to test how different physical changes alter an individual's gait.
Many daily tasks require a person to use both hands simultaneously, such as opening the lid of a jar or moving a large book. Such bimanual tasks are difficult for people who have had a stroke, but the tight neural coupling across the body can potentially allow individuals to self-rehabilitate by physically coupling their hands. To examine potential methods for robot-assisted bimanual rehabilitation, we are performing haptic tracking experiments in which individuals experience a trajectory on one hand and attempt to recreate it with the other. Despite the physical symmetries, the results show that motions in joint space are more difficult to achieve than motions in visually centered space.
Certain types of central nervous system damage, such as stroke, can cause an asymmetric walking gait. One rehabilitation method uses a split-belt treadmill, which moves each leg at a different speed while it is in contact with the ground. The split-belt treadmill has been shown to help rehabilitate walking-impaired individuals on the treadmill, but it has one distinct drawback: the corrected gait does not transfer well to walking over ground. To increase gait transference to over-ground walking, we designed and built a passive shoe that produces a motion similar to that felt when walking on a split-belt treadmill. The gait enhancing mobile shoe (GEMS) alters the wearer's gait by moving one foot backward during the stance phase while walking over ground. No external power is required, since the shoe mechanically converts the wearer's downward and horizontal forces into a backward motion. The shoe allows a patient to walk over ground while experiencing the same gait-altering effects felt on a split-belt treadmill, which should aid in transferring the corrected gait to natural environments. This work is funded by the Eunice Kennedy Shriver National Institute of Child Health & Human Development (NIH NICHD), award number R21HD066200, and is in collaboration with Amy Bastian at the Kennedy Krieger Institute and Erin Vasudevan at the Moss Rehabilitation Research Institute.
The goal of this project is to improve the effectiveness of vocational rehabilitation services by assessing and training individuals with severe disabilities and underserved groups in a safe, adaptable, and motivating environment. Using virtual reality, simulators, robotics, and feedback interfaces, this project will allow the vocational rehabilitation population to try various jobs, tasks, virtual environments, and assistive technologies prior to entering the actual employment setting. This will aid job evaluators and job coaches in assessing, training, and placing persons with various impairments.
The proposed project will simulate job environments such as a commercial kitchen, an industrial warehouse, a retail store, or other locations where an individual is likely to work. Features of the simulator could include layering of colors, ambient noise, physical reach parameters, and various user interfaces. The complexity of the simulated job tasks could be varied depending on the limitations of the user, allowing a gradual progression to more complex tasks in order to enhance job placement and training.
In collaboration with Draper Laboratories and the Veterans Administration Hospital, wearable sensor research has been conducted in two projects: a balance belt and a portable motion analysis system.
The purpose of this study is to develop a wearable balance belt that alerts patients with abnormal vestibular function in order to prevent injuries and falls. The user is alerted by four vibrotactile actuators situated around the belt whenever the inertial measurement unit (IMU) senses a high likelihood of losing balance.
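One plausible alert scheme for a four-actuator belt is to quantize the sensed tilt direction to the nearest actuator. The threshold, axis conventions, and actuator placement below are illustrative assumptions, not the belt's actual design.

```python
import math

def select_vibrotactor(tilt_x, tilt_y, threshold=0.2):
    """Pick which of four belt vibrotactors to fire for a sensed tilt,
    or return None when the tilt is below the alert threshold.

    Assumed convention: +x = forward lean, +y = leftward lean,
    with tilt expressed as a dimensionless normalized magnitude.
    """
    magnitude = math.hypot(tilt_x, tilt_y)
    if magnitude < threshold:
        return None
    # Tilt direction in degrees, counterclockwise from the forward axis.
    angle = math.degrees(math.atan2(tilt_y, tilt_x)) % 360.0
    # Quantize the direction into four 90-degree sectors.
    sectors = ["front", "left", "back", "right"]
    return sectors[int((angle + 45.0) // 90.0) % 4]
```

A real implementation would also need debouncing so that sensor noise near the threshold does not cause rapid on/off buzzing.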
The purpose of this study is to develop a wearable motion analysis system (WMAS) using commercially available inertial measurement units (IMUs) working in unison to record and output gait parameters in a clinically relevant way. The WMAS must accurately and reliably output common gait parameters, such as gait speed, stride length, torso motion, and head rotation velocities, which are often indicators of traumatic brain injury (TBI). The system's capabilities have been validated against the Vicon optical motion analysis system with healthy subjects during various gait trials, including increasing and decreasing cadence and speed, as well as turning. A clinically relevant graphical user interface (GUI) will be developed to make this system usable outside of clinical settings.
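As a simple illustration of the temporal-spatial parameters involved, basic gait measures can be derived from heel-strike timestamps over a known walked distance. This is a hypothetical sketch, not the WMAS algorithm, which works from raw IMU signals rather than pre-detected events.

```python
def gait_parameters(heel_strike_times, distance_walked):
    """Compute mean stride time, gait speed, stride length, and cadence
    from one foot's heel-strike timestamps (seconds) over a known
    walked distance (meters). Purely illustrative event-based math.
    """
    strides = len(heel_strike_times) - 1          # intervals between strikes
    duration = heel_strike_times[-1] - heel_strike_times[0]
    stride_time = duration / strides              # s per stride
    return {
        "stride_time": stride_time,               # s
        "gait_speed": distance_walked / duration, # m/s
        "stride_length": distance_walked / strides,  # m
        "cadence": 120.0 / stride_time,           # steps/min (2 steps/stride)
    }
```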
Through collaboration with the School of Theatre & Dance and the School of Physical Therapy and Rehabilitation Sciences, the biomechanics of human body motion is analyzed for various activities using the Vicon motion analysis system, leading to fewer injuries and better training practices. These activities include upper- and lower-body motions used by athletes and dancers, as well as by prosthesis users performing recreational or daily activities.
Current upper-limb prosthetic devices offer powered wrist rotation only, making it difficult to grasp and manipulate objects; the intact joints of the upper limb compensate for the prosthesis's limitations with awkward motions. The compensatory wrist and shoulder motions of people with transradial prostheses have been investigated using an eight-camera infrared Vicon system that collects and analyzes three-dimensional movement data. This information helps clinicians, researchers, and designers develop more effective and practical prosthetic devices. By analyzing the compensatory motions required for activities of daily living, we hope to improve the design and selection of prostheses.
This project is dedicated to the development of a simulation tool consisting of a robotics-based human body model (RHBM) that predicts functional motions, with integrated modules to aid in the prescription, training, comparative study, and determination of design parameters of upper-extremity prostheses. The simulation of activities of daily living performed with various prosthetic devices is optimized using data collected in the motion analysis lab.
The current generation of the RHBM, developed in MATLAB, is a 25-degree-of-freedom robotics-based kinematic model with subject-specific parameters. The model has been trained and validated using motion analysis data from ten control subjects, and data from amputee subjects is being integrated as it is collected.
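The core idea of a robotics-based kinematic model is to chain joint transforms so that joint angles predict end-point position. As a drastically simplified stand-in for the 25-DOF RHBM (which is implemented in MATLAB, not Python), here is a two-link planar example; the link lengths are placeholders, not subject-specific parameters.

```python
import math

def planar_arm_fk(q1, q2, l1=0.3, l2=0.25):
    """Forward kinematics of a planar two-link chain:
    shoulder/elbow angles (radians) -> hand position (x, y) in meters.
    Each link adds its length along the accumulated joint angle."""
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return x, y
```

A full-body model extends this same chaining to three dimensions and many more joints, with link lengths fitted to each subject from motion capture data.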
This project concentrates on measuring and predicting motion at the socket-residual limb interface. The current model will be a 4-degree-of-freedom robotics-based kinetic model. Movement between the residual limb and the prosthetic socket will be captured by a motion capture system (socket rotations and translations) and a new optics-based device (relative slip between the internal socket face and the residual limb's skin surface).
The goal of this project is to develop a robotics-based human upper body model (RHBM), with associated constraints, for predicting and simulating human motion in confined spaces and under microgravity conditions to aid astronaut training. A force-based component with an adjustable gravity term will be added to the current kinematic RHBM to allow the simulation of external forces at varying levels of gravity, including lunar gravity and microgravity. Statistically based probability constraints derived from motion capture data will also be incorporated to determine whether a mixed modeling method is more accurate and efficient for studying upper-limb movements such as using tools and moving objects. A motion analysis system will be used to collect kinematic data from subjects performing astronaut activities of daily living in a confined space similar to the International Space Station; analysis of this data will then be used to derive the model parameters. Functional joint center estimations will be used to find the geometric parameters of the model, and a variety of control methods, including force fields and statistical processes to emulate microgravity, will be used to determine the control parameters.