CS Seminar Fall Series Kick-Off

The Department of Computer Science hosted the first event of its Fall Seminar Series, featuring Dr. Yanjun Gao (Department of Medicine, University of Wisconsin-Madison) and Dr. Neil Klingensmith (Department of Computer Science, Loyola University Chicago).


This past Friday, the Computer Science Department hosted the first event of its Fall CS Seminar Series. The event featured talks from Dr. Yanjun Gao of the University of Wisconsin-Madison and Loyola CS faculty member Dr. Neil Klingensmith. Below are the details of their talks.

Augmented Intelligence in Healthcare: How Can NLP Help Physicians at the Bedside? ---Talk by Dr. Yanjun Gao

Abstract:

The electronic health record (EHR) contains patients' medical history, lab tests, diagnoses, and treatment plans collected by the multidisciplinary team of physicians, nurses, and support staff who attend to their care. While EHRs are intended to support efficient care, they are still riddled with information overload and poorly organized notes that overwhelm physicians and lead to burnout and, ultimately, inefficient care. Applying methods in natural language processing (NLP) to EHR data is a growing field with many potential applications in clinical decision support and augmented care. In the first part of the talk, I will examine the progress of clinical NLP over the years and describe both the barriers we have overcome and the challenges that remain in advancing the field. The second part of the talk introduces a new suite of clinical NLP tasks addressing clinical reasoning, a critical cognitive process in medical education. I will also discuss how this new suite of tasks could drive a paradigm shift in clinical NLP from information extraction and outcome prediction to diagnostic reasoning, and ultimately to effective clinical decision support systems for physicians at the bedside.
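
To make the "extraction to reasoning" shift a bit more concrete for readers, here is a purely illustrative Python sketch, not Dr. Gao's task suite or models: it ranks a few invented candidate diagnoses against an invented free-text note by simple keyword overlap. Real clinical NLP systems would use trained clinical language models rather than keyword matching, and every name, note, and cue list below is hypothetical.

```python
# Toy illustration only: rank hypothetical candidate diagnoses against a
# free-text note by simple keyword overlap. Real clinical NLP systems use
# trained language models, not keyword matching; all data here is invented.
import re

# Invented example note text (not real patient data).
NOTE = ("72 y/o with progressive dyspnea on exertion, bilateral lower "
        "extremity edema, orthopnea, and elevated BNP.")

# Hypothetical candidate diagnoses, each with a few associated cue terms.
CANDIDATES = {
    "heart failure": {"dyspnea", "edema", "orthopnea", "bnp"},
    "pneumonia": {"fever", "cough", "infiltrate", "sputum"},
    "copd exacerbation": {"wheezing", "smoking", "sputum", "dyspnea"},
}

def tokenize(text: str) -> set[str]:
    """Lowercase the note and split it into a set of word tokens."""
    return set(re.findall(r"[a-z]+", text.lower()))

def rank_diagnoses(note: str) -> list[tuple[str, int]]:
    """Score each candidate by how many of its cue terms appear in the note."""
    tokens = tokenize(note)
    scores = {dx: len(cues & tokens) for dx, cues in CANDIDATES.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    for dx, score in rank_diagnoses(NOTE):
        print(f"{dx}: {score} matching cue terms")
```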

Are You Really Muted?: A Privacy Analysis of Mute Buttons in Video Conferencing Apps ---Talk by Dr. Neil Klingensmith

Abstract:

In the post-pandemic era, video conferencing apps (VCAs) have converted previously private spaces — bedrooms, living rooms, and kitchens — into semi-public extensions of the office. And for the most part, users have accepted these apps in their personal space, without much thought about the permission models that govern the use of their personal data during meetings. While access to a device's video camera is carefully controlled, little has been done to ensure the same level of privacy for accessing the microphone. In this work, we ask the question: what happens to the microphone data when a user clicks the mute button in a VCA? We first conduct a user study to analyze users' understanding of the permission model of the mute button. Then, using runtime binary analysis tools, we trace raw audio in many popular VCAs as it traverses the app from the audio driver to the network. We find fragmented policies for dealing with microphone data among VCAs — some continuously monitor the microphone input during mute, and others do so periodically. One app transmits statistics of the audio to its telemetry servers while the app is muted. Using network traffic that we intercept en route to the telemetry server, we implement a proof-of-concept background activity classifier and demonstrate the feasibility of inferring the ongoing background activity during a meeting — cooking, cleaning, typing, etc. We achieved 81.9% macro accuracy on identifying six common background activities using intercepted outgoing telemetry packets when a user is muted.
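
For readers curious about the classification step described above, the sketch below is a minimal, hypothetical stand-in rather than the authors' pipeline: it trains an off-the-shelf random forest on synthetic feature vectors that play the role of per-packet audio statistics, and reports macro-averaged scores over six activity labels. The feature layout, class labels, and data are all invented for illustration; the actual study's features come from intercepted telemetry traffic.

```python
# Minimal, hypothetical sketch of a background-activity classifier.
# Synthetic feature vectors stand in for the audio statistics carried in
# intercepted telemetry packets; none of this reproduces the paper's pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

ACTIVITIES = ["cooking", "cleaning", "typing", "music", "talking", "silence"]
N_PER_CLASS, N_FEATURES = 200, 16  # invented sizes for the toy dataset

rng = np.random.default_rng(0)
X_parts, y_parts = [], []
for label, activity in enumerate(ACTIVITIES):
    # Each activity gets its own mean vector so the toy classes are separable.
    center = rng.normal(loc=label, scale=0.5, size=N_FEATURES)
    X_parts.append(center + rng.normal(scale=1.0, size=(N_PER_CLASS, N_FEATURES)))
    y_parts.append(np.full(N_PER_CLASS, label))
X, y = np.vstack(X_parts), np.concatenate(y_parts)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

# Macro-averaged precision/recall/F1 across the six activity classes.
print(classification_report(y_test, clf.predict(X_test), target_names=ACTIVITIES))
```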

Stay tuned for the next event in this series, as we will be featuring more exciting talks and speakers! To stay updated on our upcoming events, please check out our website at https://www.luc.edu/cs/.