A collaboration between CERC and the Stanford Computer Science Department AI Lab
As the complexity of healthcare delivery methods expands, limitations in human cognitive processing will increasingly constrain their reliable translation into better health. Harvesting the full benefit from cost-effective methods of care delivery requires flawless execution, by multiple people, of increasingly complex implementation steps. A study of American ICU care in the 1990s found that good care required skillful completion of more than 175 tasks per patient per day. While biomedical innovations such as vaccines or disease-eradicating treatments can reduce this complexity, on balance, complexity continues to grow across most of healthcare. The result is high rates of unreliability: national studies demonstrate failure rates of intended clinical processes as high as 45%.
Partnership with the Artificial Intelligence Lab
CERC is collaborating with Stanford’s Artificial Intelligence Lab via our Partnership in AI-Assisted Care.
- To sidestep current weaknesses in connectivity among electronic health records, our initial pilot tests primarily focus on computer vision technology.
- As a forum for cross-pollination, CERC periodically hosts Stanford’s dialogue on Electronically Guided Health Care, composed of faculty from the Schools of Engineering, Computer Science, and Medicine, as well as Silicon Valley corporate research units.
Learn More on the Partnership in AI-Assisted Care Website
Visit our website devoted to our partnership with the Stanford Computer Science Department AI Lab.
We are investigating the use of multiple sensors for the detection, measurement, and evaluation of hand hygiene in controlled laboratory environments, hospital corridors, and patient rooms. Our goal is to automatically detect missed hand hygiene events and intervene in real time to prevent potentially contaminating contact.
Activity detection in Intensive Care Units (ICUs) is currently performed manually by trained personnel, primarily nurses, who log activities as they occur. This process is both expensive and time-consuming. Our goal is to design a system that automatically produces an annotated list of all activities that occurred in the ICU over the course of a day.
We are investigating the use of multiple sensors for the detection and recording of daily activities, lifestyle patterns, emotions, and vital signs, as well as the development of intelligent mechanisms for translating multi-sensor inputs into accurate situational assessment and rapid response. Our goal is to extend seniors' capacity to live at home, improve their quality of life, and avoid unnecessary and costly relocation into institutional care.
Early excision of burns improves patient outcomes and reduces healthcare costs. Unfortunately, the accuracy of burn depth assessment by experts is only 40-70%, with non-specialists being far less accurate. In this work, we aim to create an automated visual system that predicts both the severity and the spatial outline of a burn.
Our research routinely appears in the popular press, at international conferences, and in academic journals. We have a strong presence in the clinical, medical informatics, machine learning, and computer vision communities.