Vish Ramesh is a Ph.D. student in the Department of Bioengineering at the University of California, San Diego. His research focuses on characterizing the motor and cognitive abilities of patients with neurological disorders such as stroke, using ubiquitous computing technologies together with machine learning and signal processing techniques.

Vish’s goal is not only to aid in the diagnosis and rehabilitation of patients but also to develop biologically grounded models of the afflicted brain.

Vish is originally from Cupertino, in Northern California. He graduated cum laude with a B.S. in Bioengineering from UCLA in 2015. In his spare time in San Diego, he plays recreational soccer with friends and enjoys running and cycling. He is also an avid reader and likes traveling, whether around California or across the world.

Publications

  • S. Rick, V. Ramesh, D. Gasques Rodrigues, and N. Weibel, “Pervasive Sensing in Healthcare: From Observing and Collecting to Seeing and Understanding,” in Proc. of WISH, Workshop on Interactive System for Healthcare, CHI 2017, 2017.

    From analyzing complex socio-technical systems, to evaluating novel interactions, increasingly pervasive sensing technologies provide researchers with new ways to observe the world. This paradigm shift is enabling capture of richer and more diverse data, combining elements from in-depth study of activity and behavior with modern sensors, and providing the means to accelerate sense-making of complex behavioral data. At the same time novel multimodal signal processing and machine learning techniques are equipping us with ‘super powers’ that enable understanding of these data in real-time, opening up new opportunities for embracing the concept of ‘Data Science in the Wild’. In this paper we present what this transition means in the context of Health and Healthcare, focusing on how it leads to the ‘UbiScope’, a ubiquitous computing microscope for detecting particular health conditions in real-time, promoting reflection on care, and guiding medical practices. Just as the microscope supported key scientific advances, the UbiScope will act as a proxy for understanding and supporting human activity and inform specific interventions in the years ahead.

    @inproceedings{wish2017,
     abstract = {From analyzing complex socio-technical systems, to evaluating novel interactions, increasingly pervasive sensing technologies provide researchers with new ways to observe the world. This paradigm shift is enabling capture of richer and more diverse data, combining elements from in-depth study of activity and behavior with modern sensors, and providing the means to accelerate sense-making of complex behavioral data. At the same time novel multimodal signal processing and machine learning techniques are equipping us with 'super powers' that enable understanding of these data in real-time, opening up new opportunities for embracing the concept of 'Data Science in the Wild'. In this paper we present what this transition means in the context of Health and Healthcare, focusing on how it leads to the 'UbiScope', a ubiquitous computing microscope for detecting particular health conditions in real-time, promoting reflection on care, and guiding medical practices. Just as the microscope supported key scientific advances, the UbiScope will act as a proxy for understanding and supporting human activity and inform specific interventions in the years ahead.},
     area = {pervasive_sensing},
     author = {Rick, Steven and Ramesh, Vish and Gasques Rodrigues, Danilo and Weibel, Nadir},
     booktitle = {In Proc. of WISH, Workshop on Interactive System for Healthcare, CHI 2017},
     interhash = {14a61e317efb0273cf9b9387910c3c2a},
     intrahash = {ac36ea34e7460fb52fac2f8004823299},
     projects = {ubiscope},
     title = {Pervasive Sensing in Healthcare: From Observing and Collecting to Seeing and Understanding},
     year = 2017 
    }
  • V. Ramesh, S. Rick, B. Meyer, G. Cauwenberghs, and N. Weibel, “A Neurobehavioral Evaluation System Using 3D Depth Tracking & Computer Vision: The Case of Stroke-Kinect,” in Proceedings of Neuroscience 2016, Annual Meeting of the Society for Neuroscience (Poster presentation), San Diego, CA, USA, 2016.

    Due to the subtlety of their symptoms – slight tremors, blurred vision, and loss of mobility, for example – many neurological diseases are challenging to diagnose. As such, a computational tool that can identify and analyze these symptoms accurately will be of immense use to neurologists. We aim to characterize human motor and cognitive abilities through a multimodal approach that will lead to signatures for neurological disorders, based on patterns in relevant identifiers. We focus here on stroke. Stroke is the 4th leading cause of death and the leading cause of disability in the United States. But Recombinant Tissue Plasminogen Activator (rt-PA), the only FDA-approved treatment currently available, is administered in less than 5% of acute stroke cases. The decision to prescribe rt-PA is based on the National Institute of Health Stroke Scale (NIHSS), a combination of multiple tests conducted by a neurologist to assess visual fields and motor and sensory impairments. Stroke evaluation with the NIHSS is inherently subjective. An inexperienced evaluator may miss key or almost imperceptible tells, misdiagnose the severity of a stroke, forego rt-PA prescriptions, and crudely predict long term outcomes. If this gap in objective and reliable stroke diagnosis is not addressed, stroke survivors will endure an arduous rehabilitation process. We are therefore developing Stroke-Kinect, a new system for automatic eye motion and body motion analysis to assist in the diagnosis of stroke. We obtain high-definition images and the spatial and temporal positions of 25 body joints in stroke and healthy control patients with the Microsoft Kinect v2. We employ machine learning classification algorithms and computer vision techniques to replicate the subjective NIHSS test computationally. Furthermore, we develop new tests for identifiers not captured by the NIHSS that are difficult to detect by the human eye: joint angles and thus body posture, velocity of gestures, twitches and jerks, and center of mass. Our analysis of depth data collected from stroke patients indicates accurate testing for the synchronicity of movements and reliable eye gaze tracking. The data also identifies posture as a key indicator of left side versus right side weakness. These results suggest that larger data sets will permit identification of only the vital indicators in stroke diagnosis, to simplify the NIHSS and mitigate the risk of false negatives and erroneous prescriptions of rt-PA. Stroke-Kinect also paves the way for the computational diagnosis of other neurological disorders, furthering the health sciences and ultimately aiding patients in their recovery.

    @inproceedings{sfn2016_ramesh,
     abstract = {Due to the subtlety of their symptoms - slight tremors, blurred vision, and loss of mobility, for example - many neurological diseases are challenging to diagnose. As such, a computational tool that can identify and analyze these symptoms accurately will be of immense use to neurologists. We aim to characterize human motor and cognitive abilities through a multimodal approach that will lead to signatures for neurological disorders, based on patterns in relevant identifiers. We focus here on stroke. Stroke is the 4th leading cause of death and the leading cause of disability in the United States. But Recombinant Tissue Plasminogen Activator (rt-PA), the only FDA-approved treatment currently available, is administered in less than 5% of acute stroke cases. The decision to prescribe rt-PA is based on the National Institute of Health Stroke Scale (NIHSS), a combination of multiple tests conducted by a neurologist to assess visual fields and motor and sensory impairments. Stroke evaluation with the NIHSS is inherently subjective. An inexperienced evaluator may miss key or almost imperceptible tells, misdiagnose the severity of a stroke, forego rt-PA prescriptions, and crudely predict long term outcomes. If this gap in objective and reliable stroke diagnosis is not addressed, stroke survivors will endure an arduous rehabilitation process. We are therefore developing Stroke-Kinect, a new system for automatic eye motion and body motion analysis to assist in the diagnosis of stroke. We obtain high-definition images and the spatial and temporal positions of 25 body joints in stroke and healthy control patients with the Microsoft Kinect v2. We employ machine learning classification algorithms and computer vision techniques to replicate the subjective NIHSS test computationally. Furthermore, we develop new tests for identifiers not captured by the NIHSS that are difficult to detect by the human eye: joint angles and thus body posture, velocity of gestures, twitches and jerks, and center of mass. Our analysis of depth data collected from stroke patients indicates accurate testing for the synchronicity of movements and reliable eye gaze tracking. The data also identifies posture as a key indicator of left side versus right side weakness. These results suggest that larger data sets will permit identification of only the vital indicators in stroke diagnosis, to simplify the NIHSS and mitigate the risk of false negatives and erroneous prescriptions of rt-PA. Stroke-Kinect also paves the way for the computational diagnosis of other neurological disorders, furthering the health sciences and ultimately aiding patients in their recovery.},
     address = {San Diego, CA, USA},
     area = {pervasive_sensing},
     author = {Ramesh, Vish and Rick, Steven and Meyer, Brett and Cauwenberghs, Gert and Weibel, Nadir},
     booktitle = {Proceedings of Neuroscience 2016, Annual Meeting of the Society for Neuroscience (Poster presentation)},
     interhash = {ae008fd247b1d2f695a9ced3b4c3bc47},
     intrahash = {d363538113a55afe4c39d0a77ef605e5},
     month = nov,
     projects = {ubiscope},
     title = {{A Neurobehavioral Evaluation System Using 3D Depth Tracking & Computer Vision: The Case of Stroke-Kinect.}},
     year = 2016 
    }
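
The Stroke-Kinect abstract above mentions deriving joint angles, movement symmetry, and posture features from the 25 skeletal joints tracked by the Microsoft Kinect v2. As a rough illustration of that kind of feature, the Python sketch below computes a left-versus-right elbow flexion asymmetry for a single skeleton frame. The joint names follow the Kinect v2 SDK, but the feature definition and the joint_angle / elbow_asymmetry helpers are illustrative assumptions, not the authors' published pipeline.

    # Illustrative sketch of a joint-angle asymmetry feature of the kind
    # described in the Stroke-Kinect abstract. Joint names follow the
    # Kinect v2 skeleton; the feature itself is a hypothetical example,
    # not the published Stroke-Kinect implementation.
    import numpy as np

    def joint_angle(a, b, c):
        """Angle at joint b (degrees) formed by 3D points a-b-c."""
        v1, v2 = a - b, c - b
        cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
        return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

    def elbow_asymmetry(frame):
        """Left-vs-right elbow flexion difference for one skeleton frame.

        `frame` maps Kinect v2 joint names to 3D positions in meters.
        """
        left = joint_angle(frame["ShoulderLeft"], frame["ElbowLeft"], frame["WristLeft"])
        right = joint_angle(frame["ShoulderRight"], frame["ElbowRight"], frame["WristRight"])
        return abs(left - right)

    # Example with a single synthetic frame of joint positions.
    frame = {
        "ShoulderLeft":  np.array([-0.20, 0.45, 2.0]),
        "ElbowLeft":     np.array([-0.25, 0.20, 2.0]),
        "WristLeft":     np.array([-0.22, -0.05, 2.0]),
        "ShoulderRight": np.array([0.20, 0.45, 2.0]),
        "ElbowRight":    np.array([0.25, 0.20, 2.0]),
        "WristRight":    np.array([0.40, 0.10, 2.0]),
    }
    print(f"Elbow flexion asymmetry: {elbow_asymmetry(frame):.1f} degrees")

In a full pipeline, per-frame features like this would be aggregated over time and combined with gaze, posture, and velocity measures before being passed to the classification stage described in the abstract.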
