
BCI Projects

BCI Claw

This project uses artificial neural networks and machine learning to recognize patterns in brain signals collected by a BCI headset in order to detect motor imagery. These signals are then processed into a numerical code that controls a robotic claw. The motivation behind this project is to detect motor imagery reliably, which remains an open challenge, and to create a brain-controlled prosthetic for those in need.

Alizee, Steven, Ross, Howard, Marvin, Ethan, Jayson
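As a rough illustration of the pipeline described above (our sketch, not the team's actual code): classify a window of EEG into an open/close command from motor-imagery band power, then send a one-byte code to the claw over a serial link. The channel indices, decision rule, command bytes, and serial port are all hypothetical placeholders.

import numpy as np
from scipy.signal import welch
import serial  # pyserial

FS = 250                 # sampling rate (Hz), typical for OpenBCI boards
C3, C4 = 0, 1            # hypothetical channel indices over left/right motor cortex

def mu_power(channel, fs=FS):
    # Mean power in the 8-12 Hz mu band, which desynchronizes during motor imagery
    freqs, psd = welch(channel, fs=fs, nperseg=fs)
    return psd[(freqs >= 8) & (freqs <= 12)].mean()

def claw_command(window):
    # Placeholder rule: stronger mu suppression over C3 vs. C4 picks the command
    return b'O' if mu_power(window[C3]) < mu_power(window[C4]) else b'C'

claw = serial.Serial('/dev/ttyUSB0', 9600)   # hypothetical port to the claw's microcontroller
window = np.random.randn(2, 2 * FS)          # stand-in for 2 s of 2-channel EEG
claw.write(claw_command(window))             # 'O' = open, 'C' = close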

Bionic Hand Signals

Amber Kasahy, Irina Kryukov, Richard He, Isabella Wu, Sadhana Jeyakumar

Using BCI and motor imagery to open and close a bionic hand

Quantifying Motor Learning

Chrysa Prentza, Tess Flinchbaugh, Larry Lu, Yasmin Kaman, Lillian Gabrelian, Erin Taylor, Jaclyn Zucco

We created a novel machine learning algorithm to quantify proficiency in a variety of fine motor skills with 65% accuracy.

Project Beats

David Bakalov, Aryak Rekhi, Daniel Hong, Will Stonehouse, Ashley Lamba, Elliot Berdy, Ronan MacRunnels, Tiffany Chen

By creating a pipeline that connects theta-stimulating binaural beats with a theta-tracking EEG headset, we plan to improve the working memory not only of students but potentially of patients suffering from dementia and other memory impairments as well. We also hope to create an app with a built-in feedback system that uses binaural beats to improve memory retention, thereby increasing the accessibility of our product.
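For illustration, binaural beats are made by playing two pure tones, one per ear, whose frequencies differ by the target beat rate. The sketch below (a hedged example, not the team's pipeline; the 220 Hz carrier and 10 s duration are arbitrary choices) writes a stereo WAV with a 6 Hz difference, which falls in the theta band of roughly 4-8 Hz.

import numpy as np
from scipy.io import wavfile

FS = 44100                     # audio sampling rate (Hz)
CARRIER, BEAT = 220.0, 6.0     # a 6 Hz difference sits in the theta band
t = np.arange(0, 10, 1 / FS)   # 10 seconds of audio
left = np.sin(2 * np.pi * CARRIER * t)            # one tone per ear...
right = np.sin(2 * np.pi * (CARRIER + BEAT) * t)  # ...the brain perceives the 6 Hz difference
stereo = np.stack([left, right], axis=1)
wavfile.write('theta_beats.wav', FS, (stereo * 32767).astype(np.int16))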

Brain Brush

Andrew Bennecke, Alexandra Kwon, Emma Friedenberg, Jonathan Salman, Nicole Lee Yang, Hanniel Uwadia, Manya Bali

Visit our website to view our art! https://cruxucla.wixsite.com/brainbrush

Some individuals find it difficult to express their emotions effectively and to interpret the emotions of others. That is why we decided to develop BrainBrush, a machine-learning-based system that translates EEG data into an easily understandable visual output. Our model classifies emotions on three axes: valence (pleasantness), arousal (intensity), and dominance (control). These values are then sent to TouchDesigner, where they determine the characteristics (color, speed, shape, etc.) of the dynamic visual output.
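The page does not say how the values reach TouchDesigner; one common route is OSC, which TouchDesigner can receive through an OSC In CHOP. A minimal sketch using the python-osc package, with the port and address names as hypothetical placeholders:

from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient('127.0.0.1', 7000)  # hypothetical port of a TouchDesigner OSC In CHOP
vad = {'valence': 0.72, 'arousal': 0.41, 'dominance': 0.55}  # example classifier output, scaled 0-1
for axis, value in vad.items():
    client.send_message(f'/emotion/{axis}', value)  # TouchDesigner maps these to color, speed, shape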

Brain Betrayal: EEG Lie Detector

Soulaimane Bentaleb, Raul Davila, Jasmine Zhang, Lena Johnson, Colby Lamond, Samika Karthik, Naomi Moskowitz

We wanted to test whether a P300 wave can be detected when a subject sees a face they recognize, compared with when they see unfamiliar faces. The P300 wave is a positive deflection in the human event-related potential, peaking roughly 300 ms after a meaningful stimulus. If successful, our project could be used as a lie detector: by showing subjects visual scenes, faces, or objects, we could tell whether the subject is lying about having been in that situation, seen that person, or used that object.
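A hedged sketch of the standard ERP analysis this relies on: average stimulus-locked epochs per condition, then compare amplitude in the P300 window. The sampling rate, channel, and stimulus times below are stand-ins, and the random data is a placeholder for real recordings.

import numpy as np

FS = 250  # hypothetical EEG sampling rate (Hz)

def average_erp(eeg, onsets, pre=0.2, post=0.8, fs=FS):
    # Average stimulus-locked epochs, baseline-corrected to the pre-stimulus mean
    epochs = []
    for s in onsets:
        epoch = eeg[s - int(pre * fs): s + int(post * fs)].copy()
        epoch -= epoch[:int(pre * fs)].mean()
        epochs.append(epoch)
    return np.mean(epochs, axis=0)

# Random stand-ins for one channel (e.g. Pz) and the stimulus onsets of each condition
eeg = np.random.randn(60 * FS)
familiar = average_erp(eeg, np.arange(1000, 12000, 500))
unfamiliar = average_erp(eeg, np.arange(1250, 12000, 500))

# Compare peak amplitude roughly 250-500 ms after stimulus (the P300 window);
# a recognized face should show the larger deflection
win = slice(int((0.2 + 0.25) * FS), int((0.2 + 0.5) * FS))
print('familiar:', familiar[win].max(), 'unfamiliar:', unfamiliar[win].max())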

Egghead

Erica Li, Suraj Doshi, Bhrugu Bharathi, Martin Bourdev, Orrin Zhong, David Chung, Roja Ponnapan, Archi Bhattacharyaa

Our project, Egghead, aims to bring video games to those with motor deficits who may not be able to enjoy games in the traditional sense. Built as a puzzle game, it asks you to help your character (the egg) avoid obstacles by pulling levers so it can escape its confines. Instead of physically pushing a button to pull a lever, Egghead cuts out the middleman by reading SSVEP signals from the occipital lobe in real time, essentially using brain waves to pull the lever directly. Each differently colored lever flickers at a distinct frequency; attending to a lever evokes occipital activity at that same frequency, which a computer program registers as the player's selection.
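To make the frequency-tagging idea concrete, here is a minimal sketch (our illustration, not the team's code) that picks whichever lever frequency has the most power in the occipital signal's spectrum; the three frequencies and lever colors are hypothetical.

import numpy as np

FS = 250  # hypothetical sampling rate (Hz)
LEVERS = {7.5: 'red', 10.0: 'green', 12.0: 'blue'}  # hypothetical flicker frequencies

def detect_lever(occipital, fs=FS):
    # Score each lever by spectral power at its flicker frequency, pick the max
    spectrum = np.abs(np.fft.rfft(occipital))
    freqs = np.fft.rfftfreq(len(occipital), 1 / fs)
    scores = {f: spectrum[np.argmin(np.abs(freqs - f))] for f in LEVERS}
    return LEVERS[max(scores, key=scores.get)]

# Stand-in for 2 s of occipital EEG while the player stares at the 10 Hz lever
t = np.arange(0, 2, 1 / FS)
signal = np.sin(2 * np.pi * 10.0 * t) + 0.5 * np.random.randn(len(t))
print(detect_lever(signal))  # -> 'green'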

EEG Emotion Classification

Davis Sandberg, Onuralp Ardic, Alex Georgescu, Anushka Narasani, Ashley Wang

We are developing an emotion classification system that detects a user's emotional state and returns state-specific interactive outputs to a GUI, delivering a closed-loop therapeutic system that helps users better understand and differentiate emotional states and their typical presentations.
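As a structural sketch only, the closed loop amounts to sense, classify, respond, repeat; everything below (the state labels, the feedback, the random stand-in classifier) is hypothetical.

import random
import time

FEEDBACK = {'calm': 'soft blue visuals', 'stressed': 'guided breathing prompt', 'excited': 'grounding exercise'}

def classify_window():
    return random.choice(list(FEEDBACK))  # stand-in for the real EEG classifier

for _ in range(3):  # closed loop: sense -> classify -> respond -> repeat
    state = classify_window()
    print(f'detected {state}: showing {FEEDBACK[state]}')
    time.sleep(1)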

Prosthetic Arm Control via EEG

Rachel K., Nathan C., Shane C., Nishad E., Ben F., Danielle F., Katharine J., Ayesha M., Darren N.

Some individuals with physical disabilities lose the ability to control their muscles freely and could benefit from a robotic arm. Although robotic arms have already been developed, further study is crucial to make prosthetic arms affordable for the people who need them. In this research, we will demonstrate a prosthetic arm controlled by the human brain through a brain-computer interface (BCI). We aim to acquire data accurately through an OpenBCI headset, process the signals correctly, build a cost-effective two-pronged robotic arm, and end up with a functioning arm that can move up/down and left/right and grasp (open/close).
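One way to organize those five movements is a small command table that maps each decoded intent to a servo and an angle step, clamped to the servo's range. This is a hedged sketch; the servo names, step sizes, and intent labels are illustrative, not the team's design.

COMMANDS = {
    'up':    ('shoulder', +5), 'down':  ('shoulder', -5),
    'left':  ('base',     -5), 'right': ('base',     +5),
    'open':  ('gripper', +20), 'close': ('gripper', -20),
}

def apply_intent(intent, angles):
    servo, step = COMMANDS[intent]
    angles[servo] = max(0, min(180, angles[servo] + step))  # clamp to the servo's 0-180 range
    return angles

angles = {'base': 90, 'shoulder': 90, 'gripper': 90}
for intent in ['up', 'up', 'right', 'close']:  # stand-ins for intents decoded from EEG
    angles = apply_intent(intent, angles)
print(angles)  # -> {'base': 95, 'shoulder': 100, 'gripper': 70}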

Context Dependent Control of Robotic Systems

Ethan Nguyen, Brandon Carris, Andy Au, Harper Tzou

Subjects control a virtual cursor using motor imagery, enabling control of either a robotic arm or an electric wheelchair.
