hi! I'm
Carolina Brum

Sensing
Algorithms
Interaction
User Experience
Experiments
Natural Movement Processing
tldr

I take human-driven data, usually from wearable or device-embedded sensors, and design algorithms that extract meaningful information from it: endpoint prediction, gesture recognition, activity recognition, interaction prediction, and HCI modeling. For this I draw on signal processing, machine learning, sensor fusion, and human movement modeling. I work mainly in Python, C, Julia, and Matlab.
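
To make that concrete, here is a minimal Python sketch of the kind of pipeline this involves: low-pass filtering a synthetic wearable accelerometer stream, extracting simple per-window features, and training a small gesture classifier. The sample rate, filter cutoff, features and classifier are illustrative choices, not the ones used in any particular project.

    import numpy as np
    from scipy.signal import butter, filtfilt
    from sklearn.ensemble import RandomForestClassifier

    fs = 100.0                                     # assumed sample rate (Hz)
    b, a = butter(4, 5.0 / (fs / 2), btype="low")  # 4th-order low-pass at 5 Hz

    def window_features(window):
        # per-axis mean, standard deviation and peak-to-peak range
        return np.concatenate([window.mean(axis=0),
                               window.std(axis=0),
                               np.ptp(window, axis=0)])

    rng = np.random.default_rng(0)
    X, y = [], []
    for label in (0, 1):                           # two synthetic gesture classes
        for _ in range(50):
            t = np.arange(int(fs)) / fs            # one 1-second, 3-axis window
            raw = np.stack([np.sin(2 * np.pi * (1 + label) * t + phase)
                            for phase in (0.0, 1.0, 2.0)], axis=1)
            raw += 0.3 * rng.standard_normal(raw.shape)
            X.append(window_features(filtfilt(b, a, raw, axis=0)))
            y.append(label)

    clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
    print("training accuracy:", clf.score(X, y))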

sensing

In recent years, new sensing technologies have emerged for assessing human motion, interaction and behaviour. Human-driven signals are challenging: they are unpredictable, complex and diverse, and therefore hard to model.

How do you validate a sensing technology and determine its accuracy and value? Each sensor has its limitations and artefacts, and characterizing them is key to mitigating their effects. I join forces with hardware engineers and silicon manufacturers to track these effects and guide improvement.

algorithms

Prediction filters, navigation, localization, detection, tracking, classification and mapping algorithms tailored for human movement and interaction are the core of my work.
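
As an example of what one of these building blocks looks like, below is a small sketch of a constant-velocity Kalman filter that tracks a noisy 1-D position stream and extrapolates it a short horizon ahead. The noise levels and the 50 ms look-ahead are illustrative values, not tuned parameters from real work.

    import numpy as np

    dt, horizon = 0.01, 0.05                  # 100 Hz samples, 50 ms look-ahead
    F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-velocity state transition
    H = np.array([[1.0, 0.0]])                # only position is measured
    Q = 1e-3 * np.eye(2)                      # process noise (assumed)
    R = np.array([[1e-2]])                    # measurement noise (assumed)

    x = np.zeros((2, 1))                      # state estimate: position, velocity
    P = np.eye(2)                             # estimate covariance

    def step(z):
        # one predict/update cycle; returns the position expected `horizon` ahead
        global x, P
        x = F @ x                             # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                   # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
        x = x + K @ (np.array([[z]]) - H @ x) # update with measurement z
        P = (np.eye(2) - K @ H) @ P
        return float(x[0, 0] + horizon * x[1, 0])

    t = np.arange(0.0, 2.0, dt)
    truth = np.sin(2 * np.pi * 0.5 * t)       # a slow, smooth synthetic movement
    noisy = truth + 0.05 * np.random.default_rng(1).standard_normal(t.size)
    predictions = [step(z) for z in noisy]
    print("last look-ahead prediction:", round(predictions[-1], 3))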

interaction

Human-robot interaction, self-driving cars and gesture-based controllers require capable technology to capture human intent and to handle the unpredictable nature of human behaviour.

user experience

How do you reconcile human comfort with measurement accuracy? Users are increasingly critical and selective about what they use and wear, and about which devices they are willing to have at home and at work. I pair with product designers and UX researchers to find out where, when, how and what to measure.

experiments

I run experiments to onboard users to new experiences, evaluating their effort and satisfaction while they use new controllers. I also design and execute experiments to gather data for machine learning. I deliver end-to-end human experiment solutions, from protocol design to statistical analysis and data visualization.
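
As a toy illustration of the analysis end of this work, the sketch below compares task completion times for two controller conditions with a paired t-test and a simple effect size. The numbers are fabricated; a real study would also check assumptions and report confidence intervals and plots.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    baseline = rng.normal(4.0, 0.6, size=20)          # seconds per task, controller A
    prototype = baseline - rng.normal(0.3, 0.4, 20)   # controller B, slightly faster

    t_stat, p_value = stats.ttest_rel(baseline, prototype)
    diff = baseline - prototype
    cohen_d = diff.mean() / diff.std(ddof=1)          # paired-samples effect size

    print(f"t({len(diff) - 1}) = {t_stat:.2f}, p = {p_value:.3f}, d = {cohen_d:.2f}")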

natural movement processing

What vocabulary of gestures or interactions works? How learnable is an interaction? What is the feasible region once interaction accuracy, computation cost, power, user effort (mental and physical) and satisfaction are all taken into account? What dynamic model best describes a given human motion or interaction, and how robust is it?
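
One classic candidate for the dynamic-model question is the minimum-jerk profile for point-to-point reaching (Flash & Hogan, 1985); the sketch below generates that profile so it can be compared against recorded movements. The movement amplitude and duration are placeholder values.

    import numpy as np

    def minimum_jerk(x0, x1, duration, fs=100.0):
        # position trace of a minimum-jerk movement from x0 to x1 over `duration` seconds
        tau = np.arange(0.0, duration, 1.0 / fs) / duration
        return x0 + (x1 - x0) * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)

    trajectory = minimum_jerk(0.0, 0.30, duration=0.8)        # a 30 cm, 0.8 s reach
    peak_speed = np.max(np.gradient(trajectory, 1.0 / 100.0))
    print(f"{trajectory.size} samples, peak speed = {peak_speed:.2f} m/s")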

This website is an ongoing experiment. It was built with Barebones and uses D3.
The demos are shown in an incomplete form, and only artificial data was used.
Reach me at carolina AT carolinabrum.com

bio
[Photo: Carolina Brum, by Vanessa Yaremchuk]
Dr. Carolina Brum earned her PhD in Music Technology and Integrated Sensor Systems at McGill University, in collaboration with the Responsive Environments Group at the MIT Media Lab, where she developed novel sensor fusion and filtering algorithms for wearable sensors. Her work has been published in top journals such as IEEE Sensors and Open Sensors. Carolina was instrumental in developing robust real-time tracking and prediction algorithms for the Soli radar sensor, which has been deployed on the Google Pixel 4. At Chatham Labs, recently acquired by Facebook, Carolina developed a learning-based regression algorithm for six-degree-of-freedom endpoint prediction for ray pointing in VR. Her research interests include sensing, algorithms, gesture analysis and human interaction. [selected publications]
Website content and D3 code are © Carolina Brum Medeiros 2021. All rights reserved. Please request permission to copy or redistribute via e-mail.