Presenter: Roman Bukreev
Faculty Sponsor: Komalpreet Kaur
School: Salem State University
Research Area: Computer Science
ABSTRACT
This project develops a P300 event-related potential brain-computer interface (BCI) that lets a player control a 2D maze game using EEG. In the game, the player navigates through a maze and collects items; movement is selected through a four-choice visual oddball paradigm. Directional arrow stimuli (Up, Down, Left, Right) flash in randomized sequences while the user focuses attention on the intended direction. EEG is recorded with a 16-channel g.tec g.Nautilus system across multiple data-collection sessions, producing labeled target and non-target flashes for supervised learning.
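As an illustration of the four-choice oddball paradigm described above, the following is a minimal sketch of how a randomized, labeled flash sequence might be generated. The function and parameter names are hypothetical and not taken from the project's actual code.

```python
import random

# The four directional stimuli used in the maze paradigm.
DIRECTIONS = ["Up", "Down", "Left", "Right"]

def oddball_sequence(target, n_rounds, seed=None):
    """Generate a randomized flash sequence for a four-choice oddball paradigm.

    Each round flashes all four directions once in random order. Flashes of
    the attended direction are labeled target (1); all others are labeled
    non-target (0), yielding the labeled data needed for supervised learning.
    """
    rng = random.Random(seed)
    events = []
    for _ in range(n_rounds):
        order = DIRECTIONS[:]      # copy so each round gets a fresh shuffle
        rng.shuffle(order)
        for direction in order:
            events.append((direction, 1 if direction == target else 0))
    return events

# Example: five rounds while the user attends to "Left" (20 flashes, 5 targets).
seq = oddball_sequence("Left", n_rounds=5, seed=42)
```

In a real experiment these events would be time-stamped against the EEG stream; the sketch only shows the labeling logic.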
A Python pipeline segments the EEG into stimulus-locked event-related potentials and extracts features for classification. Classification models such as regularized linear discriminant analysis (rLDA) and logistic regression are trained and evaluated with cross-validation designed to test generalization across recording sessions. Performance is summarized by classification accuracy and command-selection speed.
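The epoching step of such a pipeline can be sketched as follows. This is a simplified illustration with assumed names and window parameters (a 0.1 s pre-stimulus baseline and a 0.8 s post-stimulus window), not the project's actual implementation.

```python
import numpy as np

def extract_epochs(eeg, stim_samples, fs, tmin=-0.1, tmax=0.8):
    """Slice continuous EEG (channels x samples) into stimulus-locked epochs.

    eeg          : 2-D array, shape (n_channels, n_samples)
    stim_samples : sample indices at which stimuli flashed
    fs           : sampling rate in Hz
    Returns an array of shape (n_events, n_channels, n_epoch_samples), with
    each epoch baseline-corrected by its pre-stimulus mean.
    """
    pre = int(-tmin * fs)   # samples before stimulus onset
    post = int(tmax * fs)   # samples after stimulus onset
    epochs = []
    for s in stim_samples:
        if s - pre < 0 or s + post > eeg.shape[1]:
            continue  # skip events too close to the recording edge
        seg = eeg[:, s - pre: s + post].astype(float)
        # Subtract the per-channel pre-stimulus mean (baseline correction).
        seg = seg - seg[:, :pre].mean(axis=1, keepdims=True)
        epochs.append(seg)
    return np.stack(epochs)

# Example: 16 channels at 250 Hz; the event at sample 950 is dropped because
# its 0.8 s window would run past the end of the recording.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((16, 1000))
epochs = extract_epochs(eeg, [100, 500, 950], fs=250)
```

The resulting epochs would then be downsampled or vectorized into feature vectors for the rLDA or logistic-regression classifier.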
The trained classifier is integrated into a real-time game loop to convert predicted targets into directional inputs with low latency, enabling hands-free maze navigation. The outcome is an end-to-end BCI game prototype and a reproducible analysis framework for comparing stimulus design choices and model performance. This work supports future improvements in accuracy, selection speed, and user comfort, and illustrates how ERP-based BCIs can be applied to interactive games that require reliable discrete command selection.
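The final decision step, turning classifier outputs into a maze move, might look like the sketch below. All names here are hypothetical: it assumes per-direction classifier scores are collected across flash repetitions, averaged, and the highest-scoring direction becomes the command.

```python
# Grid steps as (column, row) offsets for each selectable direction.
MOVES = {"Up": (0, -1), "Down": (0, 1), "Left": (-1, 0), "Right": (1, 0)}

def select_command(flash_scores):
    """Average per-direction classifier scores across flash repetitions
    and return the direction with the highest mean score."""
    means = {d: sum(scores) / len(scores) for d, scores in flash_scores.items()}
    return max(means, key=means.get)

def apply_move(pos, direction):
    """Translate the selected direction into a one-cell grid step."""
    dx, dy = MOVES[direction]
    return (pos[0] + dx, pos[1] + dy)

# Example: "Left" dominates the averaged scores, so the player moves left.
scores = {"Up": [0.1, 0.2], "Down": [0.0, 0.1],
          "Left": [0.9, 0.8], "Right": [0.2, 0.1]}
command = select_command(scores)
new_pos = apply_move((3, 3), command)
```

Averaging over repetitions trades selection speed for accuracy, which is one of the stimulus-design choices the framework is built to compare.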