Steven is a Computer Science PhD student at the University of California, San Diego. His research focuses on computational ethnography: the augmentation of human observational research with insights from unobtrusive and ubiquitous sensing.

As a member of the UCSD Design Lab, he works within the Human Centered Healthcare team, applying a mix of methodologies to build a better understanding of health and healthcare coordination.

Steven is originally from Morgan Hill, CA, in the Bay Area. He received his B.S. in Cognitive Science with a specialization in Human-Computer Interaction from the University of California, San Diego, after which he joined the VA Healthcare System to work on a number of clinical and biomedical informatics research studies.

In his free time he can be found running, backpacking, scuba diving, and generally enjoying time outdoors.

Publications

  • D. Gasques Rodrigues, A. Jain, S. Rick, P. Suresh, S. Liu, and N. Weibel, “Exploring Mixed Reality in Specialized Surgical Environments,” in Proceedings of CHI 2017 (Late Breaking), ACM Conference on Human Factors in Computing Systems, Denver (CO), USA, 2017.
    [Abstract] [Bibtex]

    Recent technology advances in both Virtual Reality and Augmented Reality are creating an opportunity for a paradigm shift in the design of human-computer interaction systems. Delving into the Reality-Virtuality Continuum, we find Mixed Reality – systems designed to augment the physical world with virtual entities that embody characteristics of real world objects. In the medical field, Mixed Reality systems can overlay real-time and spatially accurate results onto a patient’s body without the need for external screens. The complexity of these systems previously required specialized prototypes, but newly available commercial products like the Microsoft HoloLens make the technology more available. Through a combination of literature review, expert analysis, and prototyping we explore the use of Mixed Reality in this setting. From the experience of prototyping HoloSim and Patiently, two applications for augmenting medical training and education, we outline considerations for the future design and development of virtual interfaces grounded in reality.

    @inproceedings{gasquesrodrigues2017exploring,
     abstract = {Recent technology advances in both Virtual Reality and Augmented Reality are creating an opportunity for a paradigm shift in the design of human-computer interaction systems. Delving into the Reality-Virtuality Continuum, we find Mixed Reality – systems designed to augment the physical world with virtual entities that embody characteristics of real world objects. In the medical field, Mixed Reality systems can overlay real-time and spatially accurate results onto a patient’s body without the need for external screens. The complexity of these systems previously required specialized prototypes, but newly available commercial products like the Microsoft HoloLens make the technology more available. Through a combination of literature review, expert analysis, and prototyping we explore the use of Mixed Reality in this setting. From the experience of prototyping HoloSim and Patiently, two applications for augmenting medical training and education, we outline considerations for the future design and development of virtual interfaces grounded in reality.},
     address = {Denver (CO), USA},
     area = {ubicomp_health},
     author = {Gasques Rodrigues, Danilo and Jain, Ankur and Rick, Steven and Suresh, Preetham and Liu, Shangley and Weibel, Nadir},
     booktitle = {Proceedings of {CHI} 2017 {(Late Breaking)}, {ACM} {Conference on Human Factors in Computing Systems}},
     interhash = {d1b6296994e76496ed4c1253c5fd401b},
     intrahash = {078e221281064c5f05bf58cc3172f025},
     month = may, projects = {hololens, AR},
     title = {Exploring Mixed Reality in Specialized Surgical Environments},
     year = 2017 
    }
  • G. Merchant, N. Weibel, L. Pina, W. G. Griswold, J. H. Fowler, G. X. Ayala, L. C. Gallo, J. Hollan, and K. Patrick, “Face-to-Face and Online Networks: College Students’ Experiences in a Weight-Loss Trial,” Journal of Health Communication, pp. 1-9, 2017.
    [Abstract] [Bibtex]

    This study aimed to understand how college students participating in a 2-year randomized controlled trial (Project SMART: Social and Mobile Approach to Reduce Weight; N = 404) engaged their social networks and used social and mobile technologies to try and lose weight. Participants in the present study (n = 20 treatment, n = 18 control) were approached after a measurement visit and administered semi-structured interviews. Interviews were analyzed using principles from grounded theory. Treatment group participants appreciated the timely support provided by the study and the integration of content across multiple technologies. Participants in both groups reported using non-study-designed apps to help them lose weight, and many participants knew one another outside of the study. Individuals talked about weight-loss goals with their friends face to face and felt accountable to follow through with their intentions. Although seeing others’ success online motivated many, there was a range of perceived acceptability in talking about personal health-related information on social media. The findings from this qualitative study can inform intervention trials using social and mobile technologies to promote weight loss. For example, weight-loss trials should measure participants’ use of direct-to-consumer technologies and interconnectivity so that treatment effects can be isolated and cross-contamination accounted for.

    @article{merchant2017face,
     abstract = {This study aimed to understand how college students participating in a 2-year randomized controlled trial (Project SMART: Social and Mobile Approach to Reduce Weight; N = 404) engaged their social networks and used social and mobile technologies to try and lose weight. Participants in the present study (n = 20 treatment, n = 18 control) were approached after a measurement visit and administered semi-structured interviews. Interviews were analyzed using principles from grounded theory. Treatment group participants appreciated the timely support provided by the study and the integration of content across multiple technologies. Participants in both groups reported using non-study-designed apps to help them lose weight, and many participants knew one another outside of the study. Individuals talked about weight-loss goals with their friends face to face and felt accountable to follow through with their intentions. Although seeing others’ success online motivated many, there was a range of perceived acceptability in talking about personal health-related information on social media. The findings from this qualitative study can inform intervention trials using social and mobile technologies to promote weight loss. For example, weight-loss trials should measure participants’ use of direct-to-consumer technologies and interconnectivity so that treatment effects can be isolated and cross-contamination accounted for.},
     area = {pervasive_sensing},
     author = {Merchant, Gina and Weibel, Nadir and Pina, Laura and Griswold, William G and Fowler, James H and Ayala, Guadalupe X and Gallo, Linda C and Hollan, James and Patrick, Kevin},
     interhash = {42f0a4bbcf0cd3633b57bd3a2f05b6b1},
     intrahash = {595bbb1b7ff92c334fc872116999b62a},
     journal = {Journal of Health Communication},
     month = {January},
     pages = {1--9},
     projects = {smart},
     publisher = {Taylor & Francis},
     title = {Face-to-Face and Online Networks: College Students’ Experiences in a Weight-Loss Trial},
     year = 2017 
    }
  • A. Calvitti, H. Hochheiser, S. Ashfaq, K. Bell, Y. Chen, R. El Kareh, M. T. Gabuzda, L. Liu, S. Mortensen, B. Pandey, S. Rick, R. Street, N. Weibel, C. Weir, and Z. Agha, “Physician Activity During Outpatient Visits and Subjective Workload,” Journal of Biomedical Informatics, 2017.
    [Abstract] [Bibtex]

    We describe methods for capturing and analyzing EHR use and clinical workflow of physicians during outpatient encounters and relating activity to physicians’ self-reported workload. We collected temporally-resolved activity data including audio, video, EHR activity, and eye-gaze along with post-visit assessments of workload. These data are then analyzed through a combination of manual content analysis and computational techniques to temporally align streams, providing a range of process measures of EHR usage, clinical workflow, and physician-patient communication. Data was collected from primary care and specialty clinics at the Veterans Administration San Diego Healthcare System and UCSD Health, who use Electronic Health Record (EHR) platforms, CPRS and Epic, respectively. Grouping visit activity by physician, site, specialty, and patient status enables rank-ordering activity factors by their correlation to physicians’ subjective work-load as captured by NASA Task Load Index survey. We developed a coding scheme that enabled us to compare timing studies between CPRS and Epic and extract patient and visit complexity profile. We identified similar patterns of EHR use and navigation at the 2 sites despite differences in functions, user interface and consequent coded representation. Both sites displayed similar proportions of EHR function use and navigation, and distribution of visit length, proportion of time physicians attended to EHRs (gaze), and subjective work-load as measured by task load survey. We found that visit activity was highly variable across individual physicians, and the observed activity metrics ranged widely as correlates to subjective workload. We discuss implications of our study for methodology, clinical workflow and EHR redesign.

    @article{calvitti2017jbi,
     abstract = {We describe methods for capturing and analyzing EHR use and clinical workflow of physicians during outpatient encounters and relating activity to physicians' self-reported workload. We collected temporally-resolved activity data including audio, video, EHR activity, and eye-gaze along with post-visit assessments of workload. These data are then analyzed through a combination of manual content analysis and computational techniques to temporally align streams, providing a range of process measures of EHR usage, clinical workflow, and physician-patient communication. Data was collected from primary care and specialty clinics at the Veterans Administration San Diego Healthcare System and UCSD Health, who use Electronic Health Record (EHR) platforms, CPRS and Epic, respectively. Grouping visit activity by physician, site, specialty, and patient status enables rank-ordering activity factors by their correlation to physicians' subjective work-load as captured by NASA Task Load Index survey. We developed a coding scheme that enabled us to compare timing studies between CPRS and Epic and extract patient and visit complexity profile. We identified similar patterns of EHR use and navigation at the 2 sites despite differences in functions, user interface and consequent coded representation. Both sites displayed similar proportions of EHR function use and navigation, and distribution of visit length, proportion of time physicians attended to EHRs (gaze), and subjective work-load as measured by task load survey. We found that visit activity was highly variable across individual physicians, and the observed activity metrics ranged widely as correlates to subjective workload. We discuss implications of our study for methodology, clinical workflow and EHR redesign.},
     area = {pervasive_sensing},
     author = {Calvitti, Alan and Hochheiser, Harry and Ashfaq, Shazia and Bell, Kristin and Chen, Yunan and El Kareh, Robert and Gabuzda, Mark T and Liu, Lin and Mortensen, Sara and Pandey, Braj and Rick, Steven and Street, Rick and Weibel, Nadir and Weir, Charlene and Agha, Zia},
     interhash = {a94cc5b603926daf203862d93306e55a},
     intrahash = {7ac99ea30089a236039ef26bcf526cb5},
     journal = {Journal of Biomedical Informatics},
     projects = {quick},
     title = {Physician Activity During Outpatient Visits and Subjective Workload},
     year = 2017 
    }
  • C. Bloss, C. Nebeker, M. Bietz, D. Bae, B. Bigby, M. Devereaux, J. Fowler, A. Waldo, N. Weibel, K. Patrick, and others, “Reimagining Human Research Protections for 21st Century Science,” Journal of Medical Internet Research (JMIR), vol. 18, iss. 12, p. e329, 2017.
    [Bibtex]
    @article{bloss2016reimagining,
     area = {pervasive_sensing},
     author = {Bloss, Cinnamon and Nebeker, Camille and Bietz, Matthew and Bae, Deborah and Bigby, Barbara and Devereaux, Mary and Fowler, James and Waldo, Ann and Weibel, Nadir and Patrick, Kevin and others},
     interhash = {4aa43c331dfae18e3b724c783e0ca60c},
     intrahash = {f763583b48011e729cc5b7de4fb25091},
     journal = {Journal of Medical Internet Research (JMIR)},
     number = 12, pages = {e329},
     projects = {core},
     publisher = {JMIR Publications Inc., Toronto, Canada},
     title = {Reimagining Human Research Protections for 21st Century Science},
     volume = 18, year = 2017 
    }
  • S. Rick, V. Ramesh, D. Gasques Rodrigues, and N. Weibel, “Pervasive Sensing in Healthcare: From Observing and Collecting to Seeing and Understanding,” in Proc. of WISH, Workshop on Interactive System for Healthcare, CHI 2017, 2017.
    [Abstract] [Bibtex]

    From analyzing complex socio-technical systems, to evaluating novel interactions, increasingly pervasive sensing technologies provide researchers with new ways to observe the world. This paradigm shift is enabling capture of richer and more diverse data, combining elements from in-depth study of activity and behavior with modern sensors, and providing the means to accelerate sense-making of complex behavioral data. At the same time novel multimodal signal processing and machine learning techniques are equipping us with ‘super powers’ that enable understanding of these data in real-time, opening up new opportunities for embracing the concept of ‘Data Science in the Wild’. In this paper we present what this transition means in the context of Health and Healthcare, focusing on how it leads to the ‘UbiScope’, a ubiquitous computing microscope for detecting particular health conditions in real-time, promoting reflection on care, and guiding medical practices. Just as the microscope supported key scientific advances, the UbiScope will act as a proxy for understanding and supporting human activity and inform specific interventions in the years ahead.

    @inproceedings{wish2017,
     abstract = {From analyzing complex socio-technical systems, to evaluating novel interactions, increasingly pervasive sensing technologies provide researchers with new ways to observe the world. This paradigm shift is enabling capture of richer and more diverse data, combining elements from in-depth study of activity and behavior with modern sensors, and providing the means to accelerate sense-making of complex behavioral data. At the same time novel multimodal signal processing and machine learning techniques are equipping us with 'super powers' that enable understanding of these data in real-time, opening up new opportunities for embracing the concept of 'Data Science in the Wild'. In this paper we present what this transition means in the context of Health and Healthcare, focusing on how it leads to the 'UbiScope', a ubiquitous computing microscope for detecting particular health conditions in real-time, promoting reflection on care, and guiding medical practices. Just as the microscope supported key scientific advances, the UbiScope will act as a proxy for understanding and supporting human activity and inform specific interventions in the years ahead.},
     area = {pervasive_sensing},
     author = {Rick, Steven and Ramesh, Vish and Gasques Rodrigues, Danilo and Weibel, Nadir},
     booktitle = {Proc. of WISH, Workshop on Interactive System for Healthcare, CHI 2017},
     interhash = {14a61e317efb0273cf9b9387910c3c2a},
     intrahash = {ac36ea34e7460fb52fac2f8004823299},
     projects = {ubiscope},
     title = {Pervasive Sensing in Healthcare: From Observing and Collecting to Seeing and Understanding},
     year = 2017 
    }
  • V. Ramesh, S. Rick, B. Meyer, G. Cauwenberghs, and N. Weibel, “A Neurobehavioral Evaluation System Using 3D Depth Tracking & Computer Vision: The Case of Stroke-Kinect,” in Proceedings of Neuroscience 2016, Annual Meeting of the Society for Neuroscience (Poster presentation), San Diego, CA, USA, 2016.
    [Abstract] [Bibtex]

     Due to the subtlety of their symptoms – slight tremors, blurred vision, and loss of mobility, for example – many neurological diseases are challenging to diagnose. As such, a computational tool that can identify and analyze these symptoms accurately will be of immense use to neurologists. We aim to characterize human motor and cognitive abilities through a multimodal approach that will lead to signatures for neurological disorders, based on patterns in relevant identifiers. We focus here on stroke. Stroke is the 4th leading cause of death and the leading cause of disability in the United States. But Recombinant Tissue Plasminogen Activator (rt-PA), the only FDA-approved treatment currently available, is administered in less than 5% of acute stroke cases. The decision to prescribe rt-PA is based on the National Institute of Health Stroke Scale (NIHSS), a combination of multiple tests conducted by a neurologist to assess visual fields and motor and sensory impairments. Stroke evaluation with the NIHSS is inherently subjective. An inexperienced evaluator may miss key or almost imperceptible tells, misdiagnose the severity of a stroke, forego rt-PA prescriptions, and crudely predict long term outcomes. If this gap in objective and reliable stroke diagnosis is not addressed, stroke survivors will endure an arduous rehabilitation process. We are therefore developing Stroke-Kinect, a new system for automatic eye motion and body motion analysis to assist in the diagnosis of stroke. We obtain high-definition images and the spatial and temporal positions of 25 body joints in stroke and healthy control patients with the Microsoft Kinect v2. We employ machine learning classification algorithms and computer vision techniques to replicate the subjective NIHSS test computationally. Furthermore, we develop new tests for identifiers not captured by the NIHSS that are difficult to detect by the human eye: joint angles and thus body posture, velocity of gestures, twitches and jerks, and center of mass. Our analysis of depth data collected from stroke patients indicates accurate testing for the synchronicity of movements and reliable eye gaze tracking. The data also identifies posture as a key indicator of left side versus right side weakness. These results suggest that larger data sets will permit identification of only the vital indicators in stroke diagnosis, to simplify the NIHSS and mitigate the risk of false negatives and erroneous prescriptions of rt-PA. Stroke-Kinect also paves the way for the computational diagnosis of other neurological disorders, furthering the health sciences and ultimately aiding patients in their recovery.

    @inproceedings{sfn2016_ramesh,
     abstract = {Due to the subtlety of their symptoms - slight tremors, blurred vision, and loss of mobility, for example - many neurological diseases are challenging to diagnose. As such, a computational tool that can identify and analyze these symptoms accurately will be of immense use to neurologists. We aim to characterize human motor and cognitive abilities through a multimodal approach that will lead to signatures for neurological disorders, based on patterns in relevant identifiers. We focus here on stroke. Stroke is the 4th leading cause of death and the leading cause of disability in the United States. But Recombinant Tissue Plasminogen Activator (rt-PA), the only FDA-approved treatment currently available, is administered in less than 5% of acute stroke cases. The decision to prescribe rt-PA is based on the National Institute of Health Stroke Scale (NIHSS), a combination of multiple tests conducted by a neurologist to assess visual fields and motor and sensory impairments. Stroke evaluation with the NIHSS is inherently subjective. An inexperienced evaluator may miss key or almost imperceptible tells, misdiagnose the severity of a stroke, forego rt-PA prescriptions, and crudely predict long term outcomes. If this gap in objective and reliable stroke diagnosis is not addressed, stroke survivors will endure an arduous rehabilitation process. We are therefore developing Stroke-Kinect, a new system for automatic eye motion and body motion analysis to assist in the diagnosis of stroke. We obtain high-definition images and the spatial and temporal positions of 25 body joints in stroke and healthy control patients with the Microsoft Kinect v2. We employ machine learning classification algorithms and computer vision techniques to replicate the subjective NIHSS test computationally. Furthermore, we develop new tests for identifiers not captured by the NIHSS that are difficult to detect by the human eye: joint angles and thus body posture, velocity of gestures, twitches and jerks, and center of mass. Our analysis of depth data collected from stroke patients indicates accurate testing for the synchronicity of movements and reliable eye gaze tracking. The data also identifies posture as a key indicator of left side versus right side weakness. These results suggest that larger data sets will permit identification of only the vital indicators in stroke diagnosis, to simplify the NIHSS and mitigate the risk of false negatives and erroneous prescriptions of rt-PA. Stroke-Kinect also paves the way for the computational diagnosis of other neurological disorders, furthering the health sciences and ultimately aiding patients in their recovery.},
     address = {San Diego, CA, USA},
     area = {pervasive_sensing},
     author = {Ramesh, Vish and Rick, Steven and Meyer, Brett and Cauwenberghs, Gert and Weibel, Nadir},
     booktitle = {Proceedings of Neuroscience 2016, Annual Meeting of the Society for Neuroscience (Poster presentation)},
     interhash = {ae008fd247b1d2f695a9ced3b4c3bc47},
     intrahash = {d363538113a55afe4c39d0a77ef605e5},
     month = nov, projects = {ubiscope},
     title = {{A Neurobehavioral Evaluation System Using 3D Depth Tracking & Computer Vision: The Case of Stroke-Kinect}},
     year = 2016 
    }
  • S. Hwang, S. Rick, E. Sayyari, D. Lenzen, and N. Weibel, “The signals of communicative efficiency and linguistic organization: findings from depth sensor skeleton tracking,” in Proceedings (Posters) of TISLR 12, 12th Theoretical Issues in Sign Language Research Conference, Melbourne, Australia, 2016.
    [Bibtex]
    @inproceedings{hwang2016signals,
     address = {Melbourne, Australia},
     area = {pervasive_sensing, ubicomp_health},
     author = {Hwang, So-One and Rick, Steven and Sayyari, Erfan and Lenzen, Dan and Weibel, Nadir},
     booktitle = {Proceedings ({Posters}) of {TISLR} 12, 12th {Theoretical} {Issues} in {Sign} {Language} {Research} {Conference}},
     interhash = {261d9d95076255248db58d346ef092a3},
     intrahash = {d03284b5ae707c41db108a93fded2d4b},
     month = jan, note = {In Press},
     projects = {gestures, sign-language},
     title = {The signals of communicative efficiency and linguistic organization: findings from depth sensor skeleton tracking},
     year = 2016 
    }
  • [PDF] N. Weibel, S. Hwang, S. Rick, E. Sayyari, D. Lenzen, and J. Hollan, “Hands that Speak: An Integrated Approach to Studying Complex Human Communicative Body Movements,” in Proceedings of HICSS-49, Hawaii International Conference on System Sciences, Kauai, HI, USA, 2016.
    [Bibtex]
    @inproceedings{weibel2016hands,
     address = {Kauai, HI, USA},
     area = {pervasive_sensing},
     author = {Weibel, Nadir and Hwang, So-One and Rick, Steven and Sayyari, Erfan and Lenzen, Dan and Hollan, Jim},
     booktitle = {Proceedings of {HICSS}-49, {Hawaii} {International} {Conference} on {System} {Sciences}},
     interhash = {60967491ae65a5a15830ddd630b9bf15},
     intrahash = {57ffb6523faacfb33f10d70862f517fe},
     month = jan, note = {In Press},
     projects = {gestures, sign-language,computational_ethnography},
     title = {Hands that {Speak}: {An} {Integrated} {Approach} to {Studying} {Complex} {Human} {Communicative} {Body} {Movements}},
     year = 2016 
    }
  • [URL] J. Zhang, K. Avery, Y. Chen, S. Ashfaq, S. Rick, K. Zhang, N. Weibel, H. S. Hochheiser, C. Weir, K. M. Bell, M. T. Gabuzda, N. Farber, B. Pandey, A. Calvitti, L. Liu, R. Street, and Z. Agha, “A Preliminary Study on EHR-Associated Extra Workload Among Physicians,” in Proceedings (Posters) of AMIA 2015, American Medical Informatics Association, Annual Symposium, San Francisco, USA, 2015.
    [Bibtex]
    @inproceedings{zhang2015preliminary,
     address = {San Francisco, USA},
     area = {pervasive_sensing},
     author = {Zhang, Jing and Avery, Kellie and Chen, Yunan and Ashfaq, Shazia and Rick, Steven and Zhang, Kai and Weibel, Nadir and Hochheiser, Harry S. and Weir, Charlene and Bell, Kristin M. and Gabuzda, Mark T. and Farber, Neil and Pandey, Braj and Calvitti, Alan and Liu, Lin and Street, Richard and Agha, Zia},
     booktitle = {Proceedings ({Posters}) of {AMIA} 2015, {American} {Medical} {Informatics} {Association}, {Annual} {Symposium}},
     interhash = {a5c106803b5f30ad0e873a0c0bcd6eca},
     intrahash = {002b1fa76ca4130834725e35656cd29b},
     month = nov, note = {In Press},
     projects = {quick, medical_informatics},
     title = {A {Preliminary} {Study} on {EHR}-{Associated} {Extra} {Workload} {Among} {Physicians}},
     url = {http://knowledge.amia.org/59310-amia-1.2741865/t005-1.2744350/f005-1.2744351/2248934-1.2744373/2248934-1.2744374},
     year = 2015 
    }
  • [URL] S. Ashfaq, K. M. Bell, M. Difley, S. Mortensen, K. Avery, S. Rick, N. Weibel, B. Pandey, C. Weir, H. S. Hochheiser, Y. Chen, J. Zhang, K. Zhang, R. Street, M. T. Gabuzda, N. Farber, L. Liu, A. Calvitti, and Z. Agha, “Analysis of Computerized Clinical Reminder Activity and Usability Issues,” in Proceedings (Posters) of AMIA 2015, American Medical Informatics Association, Annual Symposium, San Francisco, USA, 2015.
    [Bibtex]
    @inproceedings{ashfaq2015analysis,
     address = {San Francisco, USA},
     area = {pervasive_sensing},
     author = {Ashfaq, Shazia and Bell, Kristin M. and Difley, Megan and Mortensen, Sara and Avery, Kellie and Rick, Steven and Weibel, Nadir and Pandey, Braj and Weir, Charlene and Hochheiser, Harry S. and Chen, Yunan and Zhang, Jing and Zhang, Kai and Street, Richard and Gabuzda, Mark T. and Farber, Neil and Liu, Lin and Calvitti, Alan and Agha, Zia},
     booktitle = {Proceedings ({Posters}) of {AMIA} 2015, {American} {Medical} {Informatics} {Association}, {Annual} {Symposium}},
     interhash = {aae4c27bbc39c79ee5270e2976ec286d},
     intrahash = {f0bfc4c235a5801a11e9a4d1ca06a4e1},
     month = nov, note = {In Press},
     projects = {quick, medical_informatics},
     title = {Analysis of {Computerized} {Clinical} {Reminder} {Activity} and {Usability} {Issues}},
     url = {http://knowledge.amia.org/59310-amia-1.2741865/t005-1.2744350/f005-1.2744351/2248936-1.2745427/2248936-1.2745428?qr=1},
     year = 2015 
    }
  • [PDF] A. Rule, S. Rick, M. Chiu, P. Rios, S. Ashfaq, A. Calvitti, W. Chan, N. Weibel, and Z. Agha, “Validating free-text order entry for a note-centric EHR,” in Proceedings of AMIA 2015, American Medical Informatics Association, Annual Symposium, San Francisco, USA, 2015.
    [Bibtex]
    @inproceedings{rule2015validating,
     address = {San Francisco, USA},
     area = {pervasive_sensing},
     author = {Rule, Adam and Rick, Steven and Chiu, Michael and Rios, Phillip and Ashfaq, Shazia and Calvitti, Alan and Chan, Wesley and Weibel, Nadir and Agha, Zia},
     booktitle = {Proceedings of {AMIA} 2015, {American} {Medical} {Informatics} {Association}, {Annual} {Symposium}},
     interhash = {c91f26d1853fe7e9bf187ec08009ae02},
     intrahash = {36adcf2baa5c21ad528c4c68231957a2},
     month = nov, note = {In Press},
     projects = {anotes, medical_informatics},
     title = {Validating free-text order entry for a note-centric {EHR}},
     year = 2015 
    }
  • [PDF] S. Rick, R. Street, A. Calvitti, S. Ashfaq, Z. Agha, and N. Weibel, “Understanding Patient-Physician Communication and Turn-taking Patterns with Directional Microphone Arrays,” in Abstracts (Oral Presentation) of ICCH 2015, International Conference on Communication in Healthcare, New Orleans, USA, 2015.
    [Bibtex]
    @inproceedings{rick2015understanding,
     address = {New Orleans, USA},
     area = {pervasive_sensing},
     author = {Rick, Steven and Street, Richard and Calvitti, Alan and Ashfaq, Shazia and Agha, Zia and Weibel, Nadir},
     booktitle = {Abstracts ({Oral} {Presentation}) of {ICCH} 2015, {International} {Conference} on {Communication} in {Healthcare}},
     interhash = {a2b31d868775ee8ee5272386a757f381},
     intrahash = {eccdb5dc404abc814cd779729157a660},
     month = oct, note = {In Press},
     projects = {patient-physician communication},
     title = {Understanding {Patient}-{Physician} {Communication} and {Turn}-taking {Patterns} with {Directional} {Microphone} {Arrays}},
     year = 2015 
    }
  • [PDF] S. Rick, A. Calvitti, Z. Agha, and N. Weibel, “Eyes on the Clinic: Accelerating Meaningful Interface Analysis through Unobtrusive Eye Tracking,” in Proceedings of PervasiveHealth 2015, International Conference on Pervasive Computing Technologies for Healthcare, Istanbul, Turkey, 2015.
    [Bibtex]
    @inproceedings{rick2015clinic,
     address = {Istanbul, Turkey},
     area = {pervasive_sensing},
     author = {Rick, Steven and Calvitti, Alan and Agha, Zia and Weibel, Nadir},
     booktitle = {Proceedings of {PervasiveHealth} 2015, {International} {Conference} on {Pervasive} {Computing} {Technologies} for {Healthcare}},
     interhash = {62adea43383b70c3b9ae5dc2bb995bab},
     intrahash = {8f3a3a660d7a9e74258c7178faac8d62},
     month = may, note = {In Press},
     projects = {quick, medical_informatics, computational_ethnography},
     title = {Eyes on the {Clinic}: {Accelerating} {Meaningful} {Interface} {Analysis} through {Unobtrusive} {Eye} {Tracking}},
     year = 2015 
    }
  • [PDF] N. Weibel, S. Rick, C. Emmenegger, S. Ashfaq, A. Calvitti, and Z. Agha, “LAB-IN-A-BOX: Semi-Automatic Tracking of Activity in the Medical Office,” Pers Ubiquit Comput – Health, 2014.
    [Bibtex]
    @article{weibel2014labinabox,
     area = {pervasive_sensing},
     author = {Weibel, Nadir and Rick, Steven and Emmenegger, Colleen and Ashfaq, Shazia and Calvitti, Alan and Agha, Zia},
     interhash = {dcecbcd424ffc9f5e032165284e4f13a},
     intrahash = {5da8ceba93930c415bdf7775db44bbff},
     journal = {Pers Ubiquit Comput - Health},
     month = sep, projects = {quick, medical_informatics, stroke-kinect, ergokinect, gestures, kinect, computational_ethnography},
     title = {{LAB}-{IN}-{A}-{BOX}: {Semi}-{Automatic} {Tracking} of {Activity} in the {Medical} {Office}},
     year = 2014 
    }
  • [URL] A. Calvitti, N. Weibel, H. Hochheiser, L. Liu, K. Zheng, C. Weir, S. Ashfaq, S. Rick, Z. Agha, and B. Gray, “Can eye tracking and EHR mouse activity tell us when clinicians are overloaded?,” Human Factors Quarterly, Veteran Health Administration, 2014.
    [Bibtex]
    @article{calvitti2014tracking,
     area = {pervasive_sensing},
     author = {Calvitti, Alan and Weibel, Nadir and Hochheiser, Harry and Liu, Lin and Zheng, Kai and Weir, Charlene and Ashfaq, Shazia and Rick, Steven and Agha, Zia and Gray, Barbara},
     interhash = {63723b1b64d631168cf02d4c337bb8c0},
     intrahash = {4395dd0ad3ea36df93a1bcfc83ea5413},
     journal = {Human Factors Quarterly, Veteran Health Administration},
     month = sep, projects = {quick, medical_informatics,computational_ethnography},
     title = {Can eye tracking and {EHR} mouse activity tell us when clinicians are overloaded?},
     url = {https://content.govdelivery.com/accounts/USVHA/bulletins/cfd5d2#article4},
     year = 2014 
    }
  • [PDF] G. Merchant, N. Weibel, K. Patrick, J. H. Fowler, G. J. Norman, A. Gupta, C. Servetas, K. Calfas, K. Raste, L. Pina, M. Donohue, and S. Marshall, “Click ‘Like’ to change your behavior: A mixed methods study of college students’ exposure to and engagement with Facebook content designed for weight-loss,” Journal of Medical Internet Research, 2014.
    [Bibtex]
    @article{merchant2014click,
     area = {ubicomp_health, pervasive_sensing},
     author = {Merchant, Gina and Weibel, Nadir and Patrick, Kevin and Fowler, James H. and Norman, Greg J. and Gupta, Anjali and Servetas, Christina and Calfas, Karen and Raste, Ketaki and Pina, Laura and Donohue, Mike and Marshall, Simon},
     interhash = {996dc861e949864dae09f7dafa3fcea9},
     intrahash = {8de14708c12962d8fa44e43605ae3058},
     journal = {Journal of Medical Internet Research},
     month = may, projects = {smart, three-two-me},
     title = {Click '{Like}' to change your behavior: {A} mixed methods study of college students' exposure to and engagement with {Facebook} content designed for weight-loss},
     year = 2014 
    }
  • [PDF] G. Merchant, L. Pina, M. Black, E. Bales, N. Weibel, W. Griswold, J. Fowler, and K. Patrick, “Online and face-to-face: How do ad-hoc and existing networks support weight-related behavior change in young adults?,” in Abstracts (Rapid Communication) of SBM 2014, Annual Meeting of the Society of Behavioral Medicine, Philadelphia, USA, 2014.
    [Bibtex]
    @inproceedings{merchant2014online,
     address = {Philadelphia, USA},
     area = {pervasive_sensing, ubicomp_health},
     author = {Merchant, Gina and Pina, Laura and Black, Michelle and Bales, Elizabeth and Weibel, Nadir and Griswold, William and Fowler, James and Patrick, Kevin},
     booktitle = {Abstracts ({Rapid} {Communication}) of {SBM} 2014, {Annual} {Meeting} of the {Society} of {Behavioral} {Medicine}},
     interhash = {4c6a874f02168989fc6e2fb915b0f5f8},
     intrahash = {d62b2dc557f1402e7614907cca1bfc30},
     month = apr, projects = {smart},
     title = {Online and face-to-face: {How} do ad-hoc and existing networks support weight-related behavior change in young adults?},
     year = 2014 
    }