Projects Archive

Fall 2018

Using Simulation to Evaluate and Improve Team Cognition in Handoffs

Collaborators: Abigail Wooldridge, Illinois/Industrial and Enterprise Systems Engineering and Paul Jeziorczak, OSF
This project continues earlier research. It aims to better measure the impact of improvements to the handoff process, which provides important opportunities to detect and correct errors. Recent work has conceptualized handoffs as a form of team cognition, measured using human factors techniques developed outside of health care. Researchers believe team cognition theory can be applied to improve handoffs through education- and technology-based interventions.

Lung Cancer Radiomics and Radiogenomics

Collaborators: Minh N. Do, Illinois / Coordinated Science Laboratory and Joseph R. Evans, OSF/UICOMP
In an attempt to reduce deaths from lung cancer, the leading cause of cancer death in the United States, this project would combine imaging and genomic features to develop a radiogenomic risk signature, offering valuable information about the aggressiveness of a newly diagnosed lung cancer. Furthermore, this project takes advantage of and extends the OSF lung cancer screening program by establishing IRB-approved imaging and pathology repositories.

Mixed-Reality Based Visualization and Simulation of Nerve Conduction Study

Collaborators: Vahid Tohidi, OSF and Pramod Chembrammel, Illinois/Health Care Engineering Systems Center 
This proposal uses a mixed-reality technology platform to train medical students, technicians, neurology residents and fellows to better recognize pathological patterns in the results of nerve conduction studies (NCS). Researchers believe this type of education will shorten the learning curve for accurate and effective application of NCS data in diagnosing peripheral nerve disorders, which can be debilitating for those affected.

Surgical Planning Via Preoperative Surgical Repair of Next-Generation 3D, Patient-Specific, Cardiac Mimic

Collaborators: Hyunjoon Kong, Illinois/Bioengineering and Mark D. Plunkett, OSF
This project aims to 3D print realistic physical organs and tissues to help surgeons better plan specific operations and train new surgeons. The team has developed a 3D printing approach using materials that mimic the softness and toughness of real anatomy. This work is expected to advance clinical simulation to the next level.

i-AREA-p: An Intelligent Mobility-Based Augmented Reality Simulation Application for Pediatric Resuscitation Training

Collaborators: Trina Croland, OSF/UICOMP and Abigail Wooldridge, Illinois/Industrial & Enterprise Systems Engineering
Jump Simulation created an augmented reality-based Pediatric Code Cart app that allows medical students and professionals to easily learn about the contents of the cart, how it works, and how to use it in the event of a pediatric emergency. This team will work to expand this platform to include additional adult resuscitation modules as well as procedural skills elements related to pediatric resuscitation.

Robotic Arm Neurological Exam Training Simulator for Abnormal Muscle Tone

Collaborators: Elizabeth Hsiao-Wecksler, Illinois/Mechanical Science and Engineering and Christopher Zallek, OSF/UICOMP
This group is expanding earlier work to create multiple robotic arm simulators that mimic abnormal muscle behaviors. These training devices are expected to help medical students, interns, residents, nurses and physical/occupational therapists distinguish between spasticity and rigidity in patients in order to correctly diagnose neurological conditions.

Pediatric Sepsis Guidance System

Collaborators: Lui Raymond Sha, Illinois/Computer Science and Richard Pearl, OSF/UICOMP
In an effort to help clinicians diagnose sepsis in pediatric patients sooner, this team is creating a computerized pediatric sepsis best practice guidance system. This software will allow for early detection, diagnosis and treatment of sepsis in children. The goal is to improve patient care and reduce medical errors. It will first be tested in a simulation setting.  

Multi-modal Skin Lesion Identification & Education Simulator: Augmented Reality Interactive Skin Lesion App

Collaborators: Scott Barrows, OSF/Jump and Steve Boppart, Illinois/Bioengineering
This project expands on an augmented reality-based mobile app developed last year to train medical students in the identification, diagnosis and treatment of skin lesions, masses and other abnormalities. The second phase aims to give learners the ability to see beneath the skin to view lesions and pathologies that are not visible on the surface.

Integrating Soft Actuators in a Heart Simulator to Mimic Force Feedback in Cardiac Trans-Septal Puncture

Collaborators: Girish Krishnan, Illinois/Industrial and Enterprise Systems Engineering and Abraham Kocheril, OSF
This team is creating a realistic soft heart simulator that allows learners to feel what it’s like to poke and prod cardiac tissues to make crucial operating decisions. While this simulation device targets a specific surgical process for the heart, the idea is to create more soft structures for other surgical procedures.

Virtual Heart Patch for Determining Complex Shapes for Surgical Patching

Collaborators: Arif Masud, Illinois/Civil and Environmental Engineering and Matthew Bramlet, OSF/UICOMP
This group is developing a software module that allows surgeons to simulate the creation of complexly shaped 2D heart patches in a virtual reality environment. Surgeons would use this simulation to determine the size and shape of a patch that needs to be cut from a 2D sheet of flexible, cloth-like material for use in a real heart patch surgery.

Automated and adaptive whole-body segmentation for visualization of anatomy, lesions, and intervention pathways for medical training

Collaborators: Brad Sutton, Illinois/Bioengineering and Matthew Bramlet, OSF/UICOMP
This project expands on a previous effort to develop an automated segmentation program that creates 3D models of congenital heart defects, viewable in a variety of digital formats. The current proposal seeks to develop another automated segmentation platform to create 3D content of the whole body for medical training in virtual reality.

Fall 2017

KneeVIEW: A Virtual Education Window for musculoskeletal training

Collaborators: Mariana Kersh, PhD, Scott Barrows, MA, FAMI, Dr. Thomas Santoro, David Dominguese, PhD, Anthony Dwyer, Joel Baber, Grace I-Hsuan Hsu, B.Sc., ALM, MS, Meenakshy Aiyer, MD, FACP
Despite the increasing prevalence of orthopedic injuries, clinicians are poorly equipped to treat musculoskeletal problems. Musculoskeletal training is ineffective due to limited exposure to clinical patients and a lack of organized clinical instruction. This project aims to develop a realistic knee simulator model, supported by virtual reality and augmented reality educational modules, to enhance clinician training and improve patient outcomes. The biomechanically accurate model will replicate the stiffness of each individual component of the human knee to simulate both normal and pathological cases.

Multi-modal Skin Lesion Identification & Education Simulator

Collaborators: Scott Barrows, MA, FAMI, Stephen A. Boppart, M.D., Ph.D., Thomas Golemon, MD, Brent Cross, BS, MS
Current simulated skin and models of skin lesions used in education are unrealistic in both visual and tactile characteristics. This project aims to create a skin simulation model with realistic appearance and texture. In the project’s first phase, the model will consist of 2D surface images of skin lesions displayed on a tablet computer with a translucent elastomer overlay replicating the surface topography of the lesion. Future efforts will seek to extend the model to 3D and incorporate additional features.

Interactive Technology Support for Patient Medication Self-Management (continued funding)

Collaborators: Dan Morrow, PhD, Suma Pallathadka Bhat, PhD, Mark Hasegawa-Johnson, PhD, Thomas Huang, BS, MS, ScD, James Graumlich, MD, Ann Willemsen-Dunlap, PhD, Don Halpin, EMBA, MS
Electronic health record (EHR) systems are underutilized by chronically ill adult patients. A barrier to patient/provider collaboration and self-care via EHR systems is that information in EHRs is technical, not patient-specific. This project aims to develop a natural language processing tool to translate technical information in the EHR into patient-centered language. A prototype translation algorithm has been created, with preliminary results showing the translation is both accurate and easier to understand. Development of a conversational agent (CA) system using an animated avatar to deliver the patient-centered language is also underway. Goals for further development are refinement and expansion of the translation tool and CA capabilities, including making the CA interactive and able to ask and respond to questions.
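
The translation idea above can be illustrated with a toy glossary substitution. This is a hedged sketch only: the terms, mappings, and the `simplify` function are hypothetical stand-ins, far simpler than the project's actual NLP-based translation algorithm.

```python
import re

# Toy glossary mapping technical EMR terms to patient-centered language.
# These pairings are illustrative only, not the project's actual lexicon.
GLOSSARY = {
    "hypertension": "high blood pressure",
    "b.i.d.": "twice a day",
    "p.o.": "by mouth",
    "hyperlipidemia": "high cholesterol",
}

def simplify(text):
    """Replace each glossary term with its plain-language equivalent."""
    for term, plain in GLOSSARY.items():
        text = re.sub(re.escape(term), plain, text, flags=re.IGNORECASE)
    return text

print(simplify("Take lisinopril p.o. b.i.d. for hypertension"))
# → Take lisinopril by mouth twice a day for high blood pressure
```

A real system must handle context, negation, and dosage semantics, which simple term substitution cannot; that is what motivates the NLP approach described here.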

AirwayVR Virtual Reality based trainer for Endotracheal Intubation

Collaborators: Pavithra Rajeswaran, Praveen Kumar, MBBS, DCH, MD, Eric Bugaieski, MD, Priti Jani, MD, MPH
Endotracheal intubation is a procedure with risks of severe complications, and this risk has been found to decrease with operator experience. This project seeks to develop a stable, immersive, high-quality, low-cost VR simulation trainer for learning and practicing intubation. It will create a curriculum for intubation training that uses a VR trainer featuring 3D models of the head and neck and other interactive learning tools. VR input will be provided by a 3D-printed laryngoscope serving as the VR controller. Validation studies will assess the impact of the VR trainer on intubation training.

Simulation Training for Mechanical Circulatory Support using Extra-Corporeal Membrane Oxygenation (ECMO) in Adult Patients (continued funding)

Collaborators: Pramod Chembrammel, PhD, Matt Bramlet, MD, Pavithra Rajeswaran
Widespread adoption of extra-corporeal membrane oxygenation (ECMO) in adults is limited by the difficulty of deploying cannulae. To address this deficiency, this project aims to build a physical simulator for ECMO training. The trainer will use customized mannequins with flexible vasculature and a programmable pump to simulate the circulatory system. The artificial vasculature will be integrated with the BioGears physiology engine to control simulated physiological parameters. The physical components will be manufactured by 3D printing. Simulation experts and ECMO-experienced surgeons will evaluate the simulator’s performance.

A Natural Language Powered Platform for Post-Operative Care for Long Distance Caregiving

Collaborators: Ramavarapu Sreenivas, MS, PhD, Sarah De Ramirez, MD, MSc, Kesh T. Kesavadas, PhD
A 2011 study found that a severe (grade IV) postoperative complication costs an average of $159,345 per patient. This project aims to diminish these costs by using a natural language powered platform that patients can interact with verbally. The project consists of three phases: coding voice commands to fulfill postoperative protocols and testing them in a VR environment; connecting the platform to sensors to see if it can process motion assessments, and testing these in the VR environment; and conducting studies with test patients at OSF.

Heart Failure & Behavior Change: Patient/Provider Interactive Clinical Education App for Mobile Devices

Collaborators: Scott Barrows, MA, FAMI, Wawrzyniec Dobrucki, MS, PhD, Barry Clemson, MD, Kyle Formella, Don Halpin, EMBA, MS, Ann Willemsen-Dunlap, PhD
Heart failure (HF) is a complex physiological ailment that requires high-cost interventions to manage. However, it has been shown that clear communication during the process improves patient outcomes and decreases human and financial burdens. This study aims to use a mobile app to support patients with Stages A, B and C of HF. The aims of this project are to use a literature search and needs analysis to determine gaps and barriers, revise and add interactive 3D visual assets for the application, develop a repository of information on HF to be housed in the app, and begin integration of conversational agents developed through previously-funded ARCHES projects. Desired outcomes for the project are improved communication and understanding of HF for patients and improved adherence to treatment by patients.

Flexible, low-cost, Single Port Minimally Invasive robotic Surgical Platform

Collaborators: Placid Ferreira, PhD, Kesh T. Kesavadas, PhD, Nicholas Toombs, Fanxin Wang, Xiao Li, Jorge Correa
Minimally invasive robotic single port laparoscopic surgery (SPLS) has allowed surgeons to perform various complex procedures with less burden on patients. The downside to these robotic systems is an increased economic, maintenance and operational burden, resulting in limited hospital access. This project aims to improve upon the team's existing SPLS prototype to develop a cheaper, more portable and more flexible device that addresses those issues. The prototype has already demonstrated three advancements in the field; three further improvements will increase the device's adaptability and lower its price to a point affordable for mid-sized hospitals.

Interactive Mixed Reality (IMR) based Medical Curriculum for Medical Education

Collaborators: Kesh T. Kesavadas, PhD, David Crawford, MD, Meenakshy Aiyer, MD, FACP, Jessica Hanks, MD, John Vozenilek, MD
Clinical education and training is a highly complex area, and strides have been taken to improve upon pre-existing methods of teaching. This project aims to combine the strengths of Jump and HCESC to develop a highly interactive learning platform that uses Interactive Mixed Reality, a combination of virtual reality and 360-degree video. The hope is to remove the technical-skills barrier so that instructors can easily develop simulation-based educational content. Future goals for the platform are to provide an easy, immersive and portable way for adult professional learners to acquire, maintain and improve knowledge while staying connected to health care education centers.

Summer 2017

Simulation of postural dysfunction in Parkinson’s disease

Led by: Manuel Hernandez from U of I, Dronacharya Lamichhane, MD from OSF HealthCare and UICOMP and Richard Sowers from U of I.
Falls are a prevalent and significant problem for people with Parkinson’s disease and are associated with gait and balance impairment. Balance impairment in Parkinson’s disease, and the unique contribution of anxiety to it, are poorly understood and difficult to treat.
This team is using a unique test of balance to gain a greater understanding of the coordinated activity of the body and brain, the disruption of this coupling that results from Parkinson’s disease and the influence of dopaminergic therapy.
Using virtual reality, this work will provide health care practitioners with a new tool for use in long-term monitoring of disease progression and drug treatment efficacy relevant to a wide range of motor disorders. In addition, it will serve as a platform for simulating the effects of altered sensory and motor integration function to the health care practitioners of tomorrow.

Movement impairment characterization and rehabilitation for dystonic cerebral palsy using robotic haptic feedback in virtual reality

Led by: Citlali Lopez from U of I and Julian Lin, MD from OSF HealthCare and UICOMP.

Cerebral palsy (CP) is the most common movement problem in children. About 10% of children with CP have dystonia and seek medical assistance at higher rates than those with other forms of CP. Dystonia is a movement disorder with involuntary muscle contractions that cause twisting and repetitive movements, abnormal postures, or both. There is no cure for dystonia, and effective rehabilitation exercises are unknown.

The team working on this project is developing a non-invasive, game-like intervention for patients with dystonic-CP using virtual reality and haptic feedback. The goal is to improve clinical motor scores.

This game-like tool will also double as a training aid for medical practitioners in identifying complex presentations of motor disorders, not limited to CP.

Fall 2016

Multi-modal medical image segmentation, registration and abnormality detection for clinical applications

Led by: Thomas Huang from U of I and Matthew Bramlet, MD from OSF HealthCare and UICOMP

This team is developing an automatic 3D segmentation method, making it easier to separate out images of particular organs from an entire 3D rendering. As a result, physicians will be able to better detect abnormalities in medical images.

Developing MRI acquisitions and protocols to enable automated segmentation of cardiac and brain images

Led by: Brad Sutton from U of I and Matthew Bramlet, MD from OSF HealthCare and UICOMP

In this project, researchers will develop an imaging protocol that will help physicians get a better picture of the heart and brain. Work will focus on providing maximal differentiation of different tissue types in the brain and heart of patients undergoing MRI diagnostics. This will result in several acquisitions that, when combined, provide maximal tissue separation in a multidimensional histogram. Using open-source algorithms, they will develop processing scripts that automatically create segmented and labeled models of the tissue types and states in a 3D structure of the heart.
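
The core idea — that combining several acquisitions makes tissue types separable as distinct clusters in a joint intensity histogram — can be sketched as follows. This is a minimal, self-contained illustration on synthetic data, not the project's actual protocol or algorithms; the farthest-point initialization and tiny k-means are stand-ins for the open-source tools the researchers would use.

```python
import numpy as np

def farthest_point_init(points, k):
    """Pick k well-spread starting centers (deterministic)."""
    centers = [points[0]]
    for _ in range(k - 1):
        dists = np.min(
            np.linalg.norm(points[:, None, :] - np.array(centers)[None, :, :], axis=2),
            axis=1)
        centers.append(points[dists.argmax()])
    return np.array(centers)

def kmeans(points, k, iters=25):
    """Tiny Lloyd's-algorithm k-means on an (N, D) array of points."""
    centers = farthest_point_init(points, k)
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels

# Synthetic example: two "acquisitions" of the same slice, in which each of
# three tissue types has a distinct pair of intensities (plus noise).
rng = np.random.default_rng(1)
truth = rng.integers(0, 3, size=(64, 64))          # ground-truth tissue map
means_a = np.array([0.2, 0.5, 0.8])                # contrast in acquisition A
means_b = np.array([0.7, 0.2, 0.9])                # contrast in acquisition B
img_a = means_a[truth] + 0.02 * rng.standard_normal(truth.shape)
img_b = means_b[truth] + 0.02 * rng.standard_normal(truth.shape)

# Stack the acquisitions so every voxel is a point in 2D intensity space,
# then cluster: well-separated clusters mean well-separated tissue types.
points = np.stack([img_a.ravel(), img_b.ravel()], axis=1)
seg = kmeans(points, k=3).reshape(truth.shape)     # automated segmentation
```

Note the design point the blurb makes: neither acquisition alone separates all three tissues (types 0 and 1 overlap less in A than B, and vice versa), but in the joint 2D histogram the clusters are far apart relative to the noise, so labeling is nearly trivial.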

Interactive technology support for patient medication self-management

Led by: Dan Morrow from U of I and James Graumlich, MD from OSF HealthCare and UICOMP

Researchers are developing a natural language processing tool that translates technical medication information in electronic medical records (EMR) into patient-centered language. The group is integrating patient-centered language into a conversational agent (CA)-based "medication adviser" system that supports collaboration and emulates best practices gleaned from face-to-face communication techniques. The researchers will also engage patients by developing interactive capabilities, such as using “teachback” when communicating with patients.

Surgical planning via preoperative surgical repair of next generation 3d, patient specific, cardiac mimic

Led by: Rashid Bashir from U of I and Matthew Bramlet, MD from OSF HealthCare and UICOMP

This team is working to improve care for pediatric cardiac patients. Researchers will leverage CT imaging and segmentation approaches to create new models for printing 3D infant hearts that mimic the structure, material properties and physical defects of tiny patients. Physicians will use the 3D models to practice surgical techniques and then use imaging methods to evaluate the effectiveness of the procedure.

Multi-Robot minimally invasive single port laparoscopic surgery

Led by: Placid Ferreira from U of I and Charles Aprahamian, MD from OSF HealthCare and UICOMP

This team is working to develop a new robotic platform that enables high-fidelity digital simulation, which will facilitate surgical training for clinicians. The robot will allow surgeons to translate the dexterity, torque and triangulation capabilities of the human hand into the in-vivo environment and will offer highly configurable and customizable methods for different surgical procedures. In addition, the robot will be portable and easy to use in field and emergency operations, as well as potentially low cost.

Abnormal Muscle Tone Behavior Diagnostic Device - Year 2 (continued funding)

Led by: Elizabeth Hsiao-Wecksler from U of I, Steven Tippett from Bradley University and UICOMP, Randy Ewoldt from U of I and Dyveke Pratt, MD from OSF HealthCare

This project will create a novel robotic training simulator that helps learners differentiate between abnormal muscle tone behaviors, aiding the diagnosis of conditions such as stroke, Parkinson’s disease, cerebral palsy and multiple sclerosis.

Summer 2016

Virtual reality system of Patient Specific Heart Model medical education and surgical planning

Led by: Lavelle Kesavadas from U of I and Matthew Bramlet, MD from OSF HealthCare and UICOMP

Currently, doctors use 2D tools and images to visualize a child’s 3D heart and make important surgical decisions. Because of the complex intra- and extra-cardiac relationships and connections, this imperfect method makes it difficult for doctors to accurately diagnose a patient. Researchers at the Health Care Engineering Systems Center at U of I and Jump Simulation, a part of OSF Innovation, are using 3D immersive virtual reality technology to help solve this problem. They have created an intuitive model, generated from patient-specific MRIs, that is viewed through stereoscopic 3D head-mounted displays.

Spring 2016

Safety and Reliability of Surgical Robots via Simulation

Led by: Ravishankar Iyer from U of I and David Crawford, MD from OSF HealthCare and UICOMP

In 2015, researchers at Illinois, MIT, and Rush University Medical Center reported that surgical robots had caused 144 deaths in 14 years. Now, computer engineers at Illinois and surgeons at OSF Saint Francis Medical Center in Peoria are collaborating on new research to improve the reliability and safety of minimally invasive robotic surgery.

This research will create platforms for simulation of realistic safety-hazard scenarios in robotic surgery and develop tools and techniques for the design and evaluation of the next generation of resilient surgical robots. The work will help improve not only the safety of robotic surgical systems, but also simulation-based training of future surgeons.

Patient Discharge Process and Communications Simulation Training

Led by: Deborah Thurston from U of I and Richard Pearl, MD from OSF HealthCare and UICOMP

About 20-25% of patients discharged from hospitals are readmitted within 30 days, costing insurance providers roughly $42 billion per year, according to the Agency for Healthcare Research and Quality. These costs are now the responsibility of Accountable Care Organizations (ACOs) like OSF HealthCare.

In some cases, patients are discharged too soon or with inappropriate treatment. Patients may also fail to understand or comply with discharge instructions, such as how to take their medications and what levels of activity are appropriate. A variety of tools and techniques have been proposed to reduce readmissions, but there is no holistic system addressing the issue.

As part of this ARCHES-funded research, a framework is being developed to help define the complexity of the total patient discharge system and allow hospitals to evaluate new technology, policies and communication systems within the construct of simulation-based training strategies.

Simulation training for mechanical circulatory support using extra-corporeal membrane oxygenation (ECMO) in adult patients

Led by: Pramod Chembrammel from U of I and Matthew Bramlet, MD from OSF HealthCare and UICOMP

This team is developing a simulator to train surgeons in using extra-corporeal membrane oxygenation (ECMO) to provide artificial oxygenation to blood cells. This skill, which is difficult to perfect without practicing on real patients, helps save failing hearts and lungs during surgery. The researchers are modifying the DR Doppler™ blood flow simulator, which simulates blood flow in the vasculature, to develop a working prototype in which the blood changes color based on oxygenation.
Simulation Training to Identify Fall Risk in the Home Environment

Led by: Rama Ratnam from U of I and Julia Biernot, MD from OSF HealthCare

Falls are a leading cause of serious injury and death in the elderly. There is a need to find a cost-efficient and easy means of evaluating fall risks, identifying muscle weaknesses, and establishing the potential for loss of balance in the home. Further, there is an equal need to train clinicians to evaluate elderly patients at risk for falling, and to better identify fall risk from postural and movement analysis.

Engineers at the U of I have developed a home-based tele-rehabilitation system that is inexpensive and capable of accurately recording and analyzing posture and balance during movement transitions. Researchers will test the validity of this system against a standardized method of determining fall risk.

The goal is for the system to allow for targeted intervention in an individual’s home and to better train clinicians in fall risk assessment, offering unparalleled opportunities to examine body dynamics in great detail and better understand postural control.

Fall 2015

Abnormal Muscle Tone Behavior Diagnostic Device - Year 2

Led by: Elizabeth Hsiao-Wecksler from U of I, Steven Tippett from Bradley University and UICOMP, Randy Ewoldt from U of I and Dyveke Pratt, MD from OSF HealthCare

This project will create a novel robotic training simulator that helps learners differentiate between abnormal muscle tone behaviors, aiding the diagnosis of conditions such as stroke, Parkinson’s disease, cerebral palsy and multiple sclerosis.

Personalized Avatars In Patient Portals

Led by: Thomas Huang from U of I and Ann Willemsen-Dunlap from OSF HealthCare

This team is developing a 3D audio-visual avatar, capable of showing appropriate emotions as controlled by health care providers, that will be used in online patient portals to help patients understand their specific medical information, such as test results and medical guidance.