This paper validates the use of a low-cost EEG headset, the Emotiv Insight 2.0, for detecting emotional responses to visual stimuli. Based on brainwave activity, we detected viewers' emotional states in response to a series of visuals and mapped them onto valence and arousal axes. Valence in this research is defined as the viewer's positive or negative state, and arousal as the intensity of the emotion, i.e., how calm or excited the viewer is. A set of thirty images, divided into two categories (Objects and Scenes), was drawn from the Open Affective Standardized Image Set (OASIS) and used as a reference for validation. We collected a total of 720 data points for six emotional states: Engagement, Excitement, Focus, Interest, Relaxation, and Stress. To validate the emotional-state scores generated by the EEG headset, we built a regression model using these six parameters to estimate valence and arousal levels and compared the estimates with the values reported by OASIS. The results show that the Engagement parameter is significant in predicting valence in the Objects category, and that the Excitement parameter is significant in the Scenes category. With the emergence of personal EEG headsets, understanding emotional reactions in different contexts will benefit fields such as urban design, digital art, and neuromarketing. In architecture, the findings can enable designers to generate more dynamic and responsive design solutions informed by users' emotions.
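To make the validation approach concrete, the following is a minimal sketch of the kind of regression described above: a linear model mapping the six EEG-derived emotional-state scores to a valence rating. The data, coefficients, and score ranges here are entirely synthetic assumptions for illustration; the study's actual inputs are the Emotiv headset scores and the OASIS normative ratings.

```python
import numpy as np

# Hypothetical sketch: ordinary least squares relating six emotional-state
# scores (Engagement, Excitement, Focus, Interest, Relaxation, Stress) to
# valence. All values below are synthetic, not the paper's data.
rng = np.random.default_rng(0)
params = ["Engagement", "Excitement", "Focus", "Interest", "Relaxation", "Stress"]

n = 720                                    # number of data points, as in the study
X = rng.uniform(0.0, 1.0, size=(n, 6))     # six state scores per observation
true_beta = np.array([0.8, 0.1, 0.05, 0.05, 0.1, -0.3])   # made-up weights
valence = X @ true_beta + 0.05 * rng.standard_normal(n)   # synthetic target

# Fit with an intercept column prepended to the design matrix
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, valence, rcond=None)

# Inspect the fitted weight for each emotional-state parameter
for name, b in zip(params, coef[1:]):
    print(f"{name:>10s}: {b:+.3f}")
```

In the study itself, the analogue of inspecting these coefficients is testing which parameters are statistically significant predictors of valence and arousal in each image category.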