TastEmotion
A research project conducted as part of Applied Affective Computing, exploring the relationship between taste perception and emotional responses using EEG data preprocessed by the Emotiv headset.
Technologies Used
- Hardware: Emotiv EEG headset, which provides preprocessed EEG data for analysis.
- Data Processing: Python for further processing and feature extraction of the preprocessed data (see the sketch after this list).
- Analysis: Machine learning techniques to map the extracted EEG features to emotional states.
- Experiment Design: Controlled environment to evaluate taste-based emotional responses.
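As a rough illustration of the Python processing step, the sketch below computes simple band-power features from an exported EEG recording. The file name, sampling rate, and band definitions are assumptions for illustration, not the project's actual export format or feature set.

```python
import numpy as np
from scipy.signal import welch

# Assumed export format: one row per sample, one column per channel.
# The file name and 128 Hz sampling rate are hypothetical; adjust
# to the actual Emotiv export being used.
FS = 128  # sampling rate in Hz
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(signal, fs=FS):
    """Mean spectral power in each frequency band for one channel."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    features = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        features[name] = psd[mask].mean()
    return features

# Load a recording and extract features per channel.
eeg = np.loadtxt("session_01.csv", delimiter=",")  # hypothetical file
features = [band_powers(eeg[:, ch]) for ch in range(eeg.shape[1])]
```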
Key Aspects
- Data collection with the Emotiv EEG headset, recording preprocessed signals while subjects tasted different stimuli.
- Analysis of the preprocessed EEG data to identify emotional responses related to taste perception (a classification sketch follows this list).
- Comparison of responses across different flavors to assess consistency and variance.
- Potential applications in food science, marketing, and human-computer interaction.
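To make the analysis step concrete, here is a minimal sketch of mapping extracted features to emotional-state labels with a standard classifier. The feature files, label source, and classifier choice are illustrative assumptions; the project's actual model may differ.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# X: band-power features per trial (shape: trials x features);
# y: emotional-state labels per trial (e.g., from self-reports).
# Both file names are hypothetical placeholders.
X = np.load("features.npy")
y = np.load("labels.npy")

# A linear SVM with standardization is a common baseline for small
# EEG datasets; cross-validation guards against overfitting.
model = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(model, X, y, cv=5)
print(f"Mean CV accuracy: {scores.mean():.2f}")
```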
Challenges
One challenge was the variability of emotional responses across participants and the difficulty of ensuring consistent conditions between sessions. Because the Emotiv headset provides already-preprocessed data, careful feature selection was required for accurate emotional inference. Equipment limitations, such as slow data streaming, hindered real-time analysis and made it difficult to capture continuous emotional responses at the desired rate. Calibration and external factors such as electrode placement also required fine-tuning to achieve reliable predictions.
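One way to approach the feature-selection problem is a simple univariate filter, as in the sketch below. The choice of scoring function and the number of features kept are illustrative assumptions, not the project's tuned settings.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif

# Hypothetical feature matrix and labels (same shapes as above).
X = np.load("features.npy")
y = np.load("labels.npy")

# Keep the k features whose ANOVA F-score best separates the
# emotion labels; k=10 is an illustrative choice, not a tuned value.
selector = SelectKBest(score_func=f_classif, k=10)
X_selected = selector.fit_transform(X, y)
print("Kept feature indices:", selector.get_support(indices=True))
```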
Impact
This research advances the understanding of the connection between taste and emotional responses using neurophysiological data. By utilizing preprocessed data from the Emotiv headset, we demonstrated the potential for real-time emotion recognition tied to sensory experiences. The insights gained could contribute to affective computing, personalized nutrition, and sensory-driven AI applications, offering new ways to tailor experiences in food science and marketing.
Further Research
While this project provided valuable insights, future research could focus on refining emotion prediction algorithms, incorporating multimodal data, and scaling experiments with a more diverse participant group. Expanding this research could lead to novel applications in personalized sensory experiences and advanced human-computer interaction systems.