Clinical Focus

  • Pediatrics

Academic Appointments

Professional Education

  • Residency: Stanford University Pediatric Residency (2016) CA
  • Board Certification: American Board of Pediatrics, Pediatrics (2016)
  • Medical Education: Stanford University School of Medicine/Medical Center (2013) CA
  • BS, MIT, Mechanical Engineering (2007)


All Publications

  • Integration of Single-Center Data-Driven Vital Sign Parameters into a Modified Pediatric Early Warning System. PEDIATRIC CRITICAL CARE MEDICINE Ross, C. E., Harrysson, I. J., Goel, V. V., Strandberg, E. J., Kan, P., Franzon, D. E., Pageler, N. M. 2017; 18 (5): 469-476


    Pediatric early warning systems using expert-derived vital sign parameters demonstrate limited sensitivity and specificity in identifying deterioration. We hypothesized that modified tools using data-driven vital sign parameters would improve the performance of a validated tool. The design was a retrospective case-control study at a quaternary-care children's hospital of hospitalized, noncritically ill patients less than 18 years old. Cases were defined as patients who experienced an emergent transfer to an ICU or an out-of-ICU cardiac arrest. Controls were patients who never required intensive care. Cases and controls were split into training and testing groups. The Bedside Pediatric Early Warning System was modified by integrating data-driven heart rate and respiratory rate parameters (modified Bedside Pediatric Early Warning System 1 and 2). Modified Bedside Pediatric Early Warning System 1 used the 10th and 90th percentiles as normal parameters, whereas modified Bedside Pediatric Early Warning System 2 used the 5th and 95th percentiles. The training set consisted of 358 case events and 1,830 controls; the testing set had 331 case events and 1,215 controls. In the sensitivity analysis, 207 of the 331 testing-set cases (62.5%) were predicted by the original tool versus 206 (62.2%; p = 0.54) with modified Bedside Pediatric Early Warning System 1 and 191 (57.7%; p < 0.001) with modified Bedside Pediatric Early Warning System 2. For specificity, 1,005 of the 1,215 testing-set control patients (82.7%) were identified by the original Bedside Pediatric Early Warning System versus 1,013 (83.1%; p = 0.54) with modified Bedside Pediatric Early Warning System 1 and 1,055 (86.8%; p < 0.001) with modified Bedside Pediatric Early Warning System 2. There was no net gain in sensitivity and specificity using either of the modified Bedside Pediatric Early Warning System tools. Integration of data-driven vital sign parameters into a validated pediatric early warning system did not significantly impact sensitivity or specificity, and all the tools showed lower than desired sensitivity and specificity at a single cutoff point. Future work is needed to develop an objective tool that can more accurately predict pediatric decompensation.

    View details for DOI 10.1097/PCC.0000000000001150

    View details for PubMedID 28338520
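
    The sensitivity and specificity percentages reported in the abstract follow directly from the stated case and control counts. A minimal sketch in Python (function and variable names are illustrative, not taken from the study):

    ```python
    def sensitivity(true_positives, total_cases):
        """Percentage of deterioration cases flagged by the tool."""
        return 100 * true_positives / total_cases

    def specificity(true_negatives, total_controls):
        """Percentage of never-critical controls correctly identified."""
        return 100 * true_negatives / total_controls

    # Testing-set counts reported for the original Bedside PEWS tool:
    print(round(sensitivity(207, 331), 1))    # 62.5 (207 of 331 cases)
    print(round(specificity(1005, 1215), 1))  # 82.7 (1,005 of 1,215 controls)
    ```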

  • Systematic Review of Learning Curves for Minimally Invasive Abdominal Surgery: A Review of the Methodology of Data Collection, Depiction of Outcomes, and Statistical Analysis ANNALS OF SURGERY Harrysson, I. J., Cook, J., Sirimanna, P., Feldman, L. S., Darzi, A., Aggarwal, R. 2014; 260 (1): 37-45


    To determine how minimally invasive surgical learning curves are assessed and to define an ideal framework for this assessment. Learning curves have implications for training and for the adoption of new procedures and devices. In 2000, Ramsay et al reviewed the learning curve literature and called for improved reporting and statistical evaluation of learning curves. Since then, a body of literature on learning curves has emerged, but the presentation and analysis vary. A systematic search was performed of MEDLINE, EMBASE, ISI Web of Science, ERIC, and the Cochrane Library from 1985 to August 2012. The inclusion criteria were minimally invasive abdominal surgery, formal analysis of the learning curve, and English language; 592 (11.1%) of the identified studies met the selection criteria. Time is the most commonly used proxy for the learning curve (508 studies, 86%). Intraoperative outcomes were used in 316 (53%) of the articles, postoperative outcomes in 306 (52%), technical skills in 102 (17%), and patient-oriented outcomes in 38 (6%). Over time, there was evidence of an increase in the relative amount of laparoscopic and robotic studies (P < 0.001) without statistical evidence of a change in the complexity of analysis (P = 0.121). Assessment of learning curves is needed to inform surgical training and to evaluate new clinical procedures. An ideal analysis would account for the degree of complexity of individual cases and the inherent differences between surgeons. There is no single proxy that best represents the success of surgery, and hence multiple outcomes should be collected.

    View details for DOI 10.1097/SLA.0000000000000596

    View details for Web of Science ID 000337297900010

    View details for PubMedID 24670849
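
    The proxy-usage proportions quoted in the abstract can be reproduced from the reported counts of the 592 included studies (a quick check; the dictionary keys are shorthand labels, not terms from the paper):

    ```python
    # Counts of included studies using each learning-curve proxy,
    # taken from the abstract above.
    total_studies = 592
    proxy_counts = {
        "time": 508,
        "intraoperative outcomes": 316,
        "postoperative outcomes": 306,
        "technical skills": 102,
        "patient-oriented outcomes": 38,
    }

    for proxy, count in proxy_counts.items():
        print(f"{proxy}: {round(100 * count / total_studies)}%")  # time: 86%, etc.
    ```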

  • Development of a knowledge, skills, and attitudes framework for training in laparoscopic cholecystectomy AMERICAN JOURNAL OF SURGERY Harrysson, I., Hull, L., Sevdalis, N., Darzi, A., Aggarwal, R. 2014; 207 (5): 790-796


    The implementation of duty-hour restrictions and a heightened awareness of patient safety have changed resident education and training. A new focus has been placed on high-yield training programs, and simulation training has naturally grown to fill this need. This article discusses the development of a knowledge, skills, and attitudes training framework and the design of a surgical simulation curriculum. Five residents were recruited for a pilot study of the curriculum. A successful framework for curriculum development was implemented using laparoscopic cholecystectomy as the example. The curriculum consisted of classroom and virtual reality simulation training and was completed in 3.1 to 4.8 hours. The current curricula that have been developed for surgical education cover the breadth of a surgical residency well. This curriculum went beyond them by developing a structured framework for surgical training, a method that can be applied to any procedure.

    View details for DOI 10.1016/j.amjsurg.2013.08.049

    View details for Web of Science ID 000335842300051

    View details for PubMedID 24524859
