Steven is a Computer Science PhD student at the University of California, San Diego. His research focuses on computational ethnography: the augmentation of human observational research with insights from unobtrusive and ubiquitous sensing.

As a member of the UCSD Design Lab, he works within the Human Centered Healthcare team, applying a mix of methodologies to build a better understanding of health and healthcare coordination.

Steven is originally from Morgan Hill, CA, in the Bay Area. He received his B.S. in Cognitive Science with a specialization in Human-Computer Interaction from the University of California, San Diego, after which he joined the VA Healthcare System to work on a number of clinical and biomedical informatics research studies.

In his free time he can be found running, backpacking, scuba diving, and generally enjoying time outdoors.

Publications

  • D. Gasques Rodrigues, A. Jain, S. Rick, P. Suresh, S. Liu, and N. Weibel, “Exploring mixed reality in specialized surgical environments,” in Proceedings of CHI 2017 (Late Breaking), ACM Conference on Human Factors in Computing Systems, Denver (CO), USA, 2017.
    [Abstract] [Bibtex]

    Recent technology advances in both Virtual Reality and Augmented Reality are creating an opportunity for a paradigm shift in the design of human-computer interaction systems. Delving into the Reality-Virtuality Continuum, we find Mixed Reality – systems designed to augment the physical world with virtual entities that embody characteristics of real world objects. In the medical field, Mixed Reality systems can overlay real-time and spatially accurate results onto a patient’s body without the need for external screens. The complexity of these systems previously required specialized prototypes, but newly available commercial products like the Microsoft HoloLens make the technology more available. Through a combination of literature review, expert analysis, and prototyping we explore the use of Mixed Reality in this setting. From the experience of prototyping HoloSim and Patiently, two applications for augmenting medical training and education, we outline considerations for the future design and development of virtual interfaces grounded in reality.

    @inproceedings{gasquesrodrigues2017exploring,
    abstract = {Recent technology advances in both Virtual Reality and Augmented Reality are creating an opportunity for a paradigm shift in the design of human-computer interaction systems. Delving into the Reality-Virtuality Continuum, we find Mixed Reality – systems designed to augment the physical world with virtual entities that embody characteristics of real world objects. In the medical field, Mixed Reality systems can overlay real-time and spatially accurate results onto a patient’s body without the need for external screens. The complexity of these systems previously required specialized prototypes, but newly available commercial products like the Microsoft HoloLens make the technology more available. Through a combination of literature review, expert analysis, and prototyping we explore the use of Mixed Reality in this setting. From the experience of prototyping HoloSim and Patiently, two applications for augmenting medical training and education, we outline considerations for the future design and development of virtual interfaces grounded in reality.},
    address = {Denver (CO), USA},
    area = {ubicomp_health},
    author = {Gasques Rodrigues, Danilo and Jain, Ankur and Rick, Steven and Suresh, Preetham and Liu, Shangley and Weibel, Nadir},
    booktitle = {Proceedings of {CHI} 2017 {(Late Breaking)}, {ACM} {Conference on Human Factors in Computing Systems}},
    month = may,
    projects = {hololens, AR},
    title = {Exploring Mixed Reality in Specialized Surgical Environments},
    year = 2017
    }
  • S. Hwang, S. Rick, E. Sayyari, D. Lenzen, and N. Weibel, “The signals of communicative efficiency and linguistic organization: findings from depth sensor skeleton tracking,” in Proceedings (Posters) of TISLR 12, 12th Theoretical Issues in Sign Language Research Conference, Melbourne, Australia, 2016.
    [Bibtex]
    @inproceedings{hwang2016signals,
    address = {Melbourne, Australia},
    area = {pervasive_sensing, ubicomp_health},
    author = {Hwang, So-One and Rick, Steven and Sayyari, Erfan and Lenzen, Dan and Weibel, Nadir},
    booktitle = {Proceedings ({Posters}) of {TISLR} 12, 12th {Theoretical} {Issues} in {Sign} {Language} {Research} {Conference}},
    month = jan,
    note = {In Press},
    projects = {gestures, sign-language},
    title = {The signals of communicative efficiency and linguistic organization: findings from depth sensor skeleton tracking},
    year = 2016
    }
  • [PDF] N. Weibel, S. Hwang, S. Rick, E. Sayyari, D. Lenzen, and J. Hollan, “Hands that Speak: An Integrated Approach to Studying Complex Human Communicative Body Movements,” in Proceedings of HICSS-49, Hawaii International Conference on System Sciences, Kauai, HI, USA, 2016.
    [Bibtex]
    @inproceedings{weibel2016hands,
    address = {Kauai, HI, USA},
    area = {pervasive_sensing},
    author = {Weibel, Nadir and Hwang, So-One and Rick, Steven and Sayyari, Erfan and Lenzen, Dan and Hollan, Jim},
    booktitle = {Proceedings of {HICSS}-49, {Hawaii} {International} {Conference} on {System} {Sciences}},
    month = jan,
    note = {In Press},
    projects = {gestures, sign-language,computational_ethnography},
    title = {Hands that {Speak}: {An} {Integrated} {Approach} to {Studying} {Complex} {Human} {Communicative} {Body} {Movements}},
    year = 2016
    }
  • V. Ramesh, S. Rick, B. Meyer, G. Cauwenberghs, and N. Weibel, “A Neurobehavioral Evaluation System Using 3D Depth Tracking & Computer Vision: The Case of Stroke-Kinect,” in Proceedings of Neuroscience 2016, Annual Meeting of the Society for Neuroscience (poster presentation), San Diego, CA, USA, 2016.
    [Abstract] [Bibtex]

    Due to the subtlety of their symptoms – slight tremors, blurred vision, and loss of mobility, for example – many neurological diseases are challenging to diagnose. As such, a computational tool that can identify and analyze these symptoms accurately will be of immense use to neurologists. We aim to characterize human motor and cognitive abilities through a multimodal approach that will lead to signatures for neurological disorders, based on patterns in relevant identifiers. We focus here on stroke. Stroke is the 4th leading cause of death and the leading cause of disability in the United States. But Recombinant Tissue Plasminogen Activator (rt-PA), the only FDA-approved treatment currently available, is administered in less than 5% of acute stroke cases. The decision to prescribe rt-PA is based on the National Institute of Health Stroke Scale (NIHSS), a combination of multiple tests conducted by a neurologist to assess visual fields and motor and sensory impairments. Stroke evaluation with the NIHSS is inherently subjective. An inexperienced evaluator may miss key or almost imperceptible tells, misdiagnose the severity of a stroke, forego rt-PA prescriptions, and crudely predict long term outcomes. If this gap in objective and reliable stroke diagnosis is not addressed, stroke survivors will endure an arduous rehabilitation process. We are therefore developing Stroke-Kinect, a new system for automatic eye motion and body motion analysis to assist in the diagnosis of stroke. We obtain high-definition images and the spatial and temporal positions of 25 body joints in stroke and healthy control patients with the Microsoft Kinect v2. We employ machine learning classification algorithms and computer vision techniques to replicate the subjective NIHSS test computationally. Furthermore, we develop new tests for identifiers not captured by the NIHSS that are difficult to detect by the human eye: joint angles and thus body posture, velocity of gestures, twitches and jerks, and center of mass. Our analysis of depth data collected from stroke patients indicates accurate testing for the synchronicity of movements and reliable eye gaze tracking. The data also identifies posture as a key indicator of left side versus right side weakness. These results suggest that larger data sets will permit identification of only the vital indicators in stroke diagnosis, to simplify the NIHSS and mitigate the risk of false negatives and erroneous prescriptions of rt-PA. Stroke-Kinect also paves the way for the computational diagnosis of other neurological disorders, furthering the health sciences and ultimately aiding patients in their recovery.

    @inproceedings{sfn2016_ramesh,
    abstract = {Due to the subtlety of their symptoms - slight tremors, blurred vision, and loss of mobility, for example - many neurological diseases are challenging to diagnose. As such, a computational tool that can identify and analyze these symptoms accurately will be of immense use to neurologists. We aim to characterize human motor and cognitive abilities through a multimodal approach that will lead to signatures for neurological disorders, based on patterns in relevant identifiers. We focus here on stroke. Stroke is the 4th leading cause of death and the leading cause of disability in the United States. But Recombinant Tissue Plasminogen Activator (rt-PA), the only FDA-approved treatment currently available, is administered in less than 5% of acute stroke cases. The decision to prescribe rt-PA is based on the National Institute of Health Stroke Scale (NIHSS), a combination of multiple tests conducted by a neurologist to assess visual fields and motor and sensory impairments. Stroke evaluation with the NIHSS is inherently subjective. An inexperienced evaluator may miss key or almost imperceptible tells, misdiagnose the severity of a stroke, forego rt-PA prescriptions, and crudely predict long term outcomes. If this gap in objective and reliable stroke diagnosis is not addressed, stroke survivors will endure an arduous rehabilitation process. We are therefore developing Stroke-Kinect, a new system for automatic eye motion and body motion analysis to assist in the diagnosis of stroke. We obtain high-definition images and the spatial and temporal positions of 25 body joints in stroke and healthy control patients with the Microsoft Kinect v2. We employ machine learning classification algorithms and computer vision techniques to replicate the subjective NIHSS test computationally. Furthermore, we develop new tests for identifiers not captured by the NIHSS that are difficult to detect by the human eye: joint angles and thus body posture, velocity of gestures, twitches and jerks, and center of mass. Our analysis of depth data collected from stroke patients indicates accurate testing for the synchronicity of movements and reliable eye gaze tracking. The data also identifies posture as a key indicator of left side versus right side weakness. These results suggest that larger data sets will permit identification of only the vital indicators in stroke diagnosis, to simplify the NIHSS and mitigate the risk of false negatives and erroneous prescriptions of rt-PA. Stroke-Kinect also paves the way for the computational diagnosis of other neurological disorders, furthering the health sciences and ultimately aiding patients in their recovery.},
    address = {San Diego, CA, USA},
    area = {pervasive_sensing},
    author = {Ramesh, Vish and Rick, Steven and Meyer, Brett and Cauwenberghs, Gert and Weibel, Nadir},
    booktitle = {Proceedings of Neuroscience 2016, Annual Meeting of the Society for Neuroscience (Poster presentation)},
    month = nov,
    projects = {ubiscope},
    title = {{A Neurobehavioral Evaluation System Using 3D Depth Tracking \& Computer Vision: The Case of Stroke-Kinect}},
    year = 2016
    }
  • [PDF] A. Rule, S. Rick, M. Chiu, P. Rios, S. Ashfaq, A. Calvitti, W. Chan, N. Weibel, and Z. Agha, “Validating free-text order entry for a note-centric EHR,” in Proceedings of AMIA 2015, American Medical Informatics Association, Annual Symposium, San Francisco, USA, 2015.
    [Bibtex]
    @inproceedings{rule2015validating,
    address = {San Francisco, USA},
    area = {pervasive_sensing},
    author = {Rule, Adam and Rick, Steven and Chiu, Michael and Rios, Phillip and Ashfaq, Shazia and Calvitti, Alan and Chan, Wesley and Weibel, Nadir and Agha, Zia},
    booktitle = {Proceedings of {AMIA} 2015, {American} {Medical} {Informatics} {Association}, {Annual} {Symposium}},
    month = nov,
    note = {In Press},
    projects = {anotes, medical_informatics},
    title = {Validating free-text order entry for a note-centric {EHR}},
    year = 2015
    }
  • [URL] S. Ashfaq, K. M. Bell, M. Difley, S. Mortensen, K. Avery, S. Rick, N. Weibel, B. Pandey, C. Weir, H. S. Hochheiser, Y. Chen, J. Zhang, K. Zhang, R. Street, M. T. Gabuzda, N. Farber, L. Liu, A. Calvitti, and Z. Agha, “Analysis of Computerized Clinical Reminder Activity and Usability Issues,” in Proceedings (Posters) of AMIA 2015, American Medical Informatics Association, Annual Symposium, San Francisco, USA, 2015.
    [Bibtex]
    @inproceedings{ashfaq2015analysis,
    address = {San Francisco, USA},
    area = {pervasive_sensing},
    author = {Ashfaq, Shazia and Bell, Kristin M. and Difley, Megan and Mortensen, Sara and Avery, Kellie and Rick, Steven and Weibel, Nadir and Pandey, Braj and Weir, Charlene and Hochheiser, Harry S. and Chen, Yunan and Zhang, Jing and Zhang, Kai and Street, Richard and Gabuzda, Mark T. and Farber, Neil and Liu, Lin and Calvitti, Alan and Agha, Zia},
    booktitle = {Proceedings ({Posters}) of {AMIA} 2015, {American} {Medical} {Informatics} {Association}, {Annual} {Symposium}},
    month = nov,
    note = {In Press},
    projects = {quick, medical_informatics},
    title = {Analysis of {Computerized} {Clinical} {Reminder} {Activity} and {Usability} {Issues}},
    url = {http://knowledge.amia.org/59310-amia-1.2741865/t005-1.2744350/f005-1.2744351/2248936-1.2745427/2248936-1.2745428?qr=1},
    year = 2015
    }
  • [URL] J. Zhang, K. Avery, Y. Chen, S. Ashfaq, S. Rick, K. Zhang, N. Weibel, H. S. Hochheiser, C. Weir, K. M. Bell, M. T. Gabuzda, N. Farber, B. Pandey, A. Calvitti, L. Liu, R. Street, and Z. Agha, “A Preliminary Study on EHR-Associated Extra Workload Among Physicians,” in Proceedings (Posters) of AMIA 2015, American Medical Informatics Association, Annual Symposium, San Francisco, USA, 2015.
    [Bibtex]
    @inproceedings{zhang2015preliminary,
    address = {San Francisco, USA},
    area = {pervasive_sensing},
    author = {Zhang, Jing and Avery, Kellie and Chen, Yunan and Ashfaq, Shazia and Rick, Steven and Zhang, Kai and Weibel, Nadir and Hochheiser, Harry S. and Weir, Charlene and Bell, Kristin M. and Gabuzda, Mark T. and Farber, Neil and Pandey, Braj and Calvitti, Alan and Liu, Lin and Street, Richard and Agha, Zia},
    booktitle = {Proceedings ({Posters}) of {AMIA} 2015, {American} {Medical} {Informatics} {Association}, {Annual} {Symposium}},
    month = nov,
    note = {In Press},
    projects = {quick, medical_informatics},
    title = {A {Preliminary} {Study} on {EHR}-{Associated} {Extra} {Workload} {Among} {Physicians}},
    url = {http://knowledge.amia.org/59310-amia-1.2741865/t005-1.2744350/f005-1.2744351/2248934-1.2744373/2248934-1.2744374},
    year = 2015
    }
  • [PDF] N. Weibel, S. Rick, C. Emmenegger, S. Ashfaq, A. Calvitti, and Z. Agha, “LAB-IN-A-BOX: Semi-Automatic Tracking of Activity in the Medical Office,” Pers Ubiquit Comput – Health, 2014.
    [Bibtex]
    @article{weibel2014labinabox,
    area = {pervasive_sensing},
    author = {Weibel, Nadir and Rick, Steven and Emmenegger, Colleen and Ashfaq, Shazia and Calvitti, Alan and Agha, Zia},
    journal = {Pers Ubiquit Comput - Health},
    month = sep,
    projects = {quick, medical_informatics, stroke-kinect, ergokinect, gestures, kinect, computational_ethnography},
    title = {{LAB}-{IN}-{A}-{BOX}: {Semi}-{Automatic} {Tracking} of {Activity} in the {Medical} {Office}},
    year = 2014
    }
  • [URL] A. Calvitti, N. Weibel, H. Hochheiser, L. Liu, K. Zheng, C. Weir, S. Ashfaq, S. Rick, Z. Agha, and B. Gray, “Can eye tracking and EHR mouse activity tell us when clinicians are overloaded?,” Human Factors Quarterly, Veterans Health Administration, 2014.
    [Bibtex]
    @article{calvitti2014tracking,
    area = {pervasive_sensing},
    author = {Calvitti, Alan and Weibel, Nadir and Hochheiser, Harry and Liu, Lin and Zheng, Kai and Weir, Charlene and Ashfaq, Shazia and Rick, Steven and Agha, Zia and Gray, Barbara},
    journal = {Human Factors Quarterly, Veterans Health Administration},
    month = sep,
    projects = {quick, medical_informatics, computational_ethnography},
    title = {Can eye tracking and {EHR} mouse activity tell us when clinicians are overloaded?},
    url = {https://content.govdelivery.com/accounts/USVHA/bulletins/cfd5d2#article4},
    year = 2014
    }
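
To make concrete the kind of skeleton-derived measures described in the Stroke-Kinect abstract above (joint angles, gesture velocity), here is a minimal Python sketch of two such features computed from depth-sensor joint positions. The joint names, frame rate, and numbers are hypothetical illustrations, not the published pipeline.

    import numpy as np

    def joint_angle(a, b, c):
        # Angle (degrees) at joint b between segments b->a and b->c,
        # e.g. elbow flexion from shoulder, elbow, and wrist positions.
        u = np.asarray(a, float) - np.asarray(b, float)
        v = np.asarray(c, float) - np.asarray(b, float)
        cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

    def joint_speed(track, fps=30.0):
        # Frame-to-frame speed (meters/second) for one joint's (T, 3)
        # trajectory, sampled at the sensor's frame rate.
        steps = np.diff(np.asarray(track, float), axis=0)
        return np.linalg.norm(steps, axis=1) * fps

    # Hypothetical example: an elbow held at 90 degrees, and a wrist
    # moving 5 cm per frame at 30 fps (i.e., 1.5 m/s).
    shoulder, elbow, wrist = [0.0, 1.4, 2.0], [0.0, 1.1, 2.0], [0.25, 1.1, 2.0]
    print(joint_angle(shoulder, elbow, wrist))   # -> 90.0
    wrist_track = [[0.25 + 0.05 * t, 1.1, 2.0] for t in range(10)]
    print(joint_speed(wrist_track).mean())       # -> 1.5

Comparing features like these between left- and right-side joints is one way to quantify the lateral weakness that the abstract identifies as a key indicator.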
