Redefining human-centered design: AI and EEG for emotion-sensitive architecture
Artificial intelligence (AI) is revolutionizing various fields, from healthcare and neuroscience to smart cities and human-centered design. In the realm of architecture, AI is now paving the way for intelligent spaces that dynamically respond to human emotions.
A recent study titled "Emotion Analysis AI Model for Sensing Architecture Using EEG," conducted by Seung-Yeul Ji, Mi-Kyoung Kim, and Han-Jong Jun from Hanyang University, explores how AI-driven emotion recognition can enhance architectural environments. Published in Applied Sciences (2025), the study presents an approach that uses electroencephalography (EEG) data to classify human emotional states and applies those insights to real-time spatial adaptation. By integrating AI-based biometric analysis with architectural design, the research aims to foster responsive environments that support human psychological well-being.
Understanding emotion analysis in architecture
Traditionally, architectural design has relied on subjective methodologies such as surveys, interviews, and behavioral observations to assess emotional responses to built environments. While these methods offer valuable insights, they suffer from inherent biases and an inability to capture real-time emotional fluctuations. The study proposes an AI-based model that leverages EEG signals to provide objective, real-time emotional feedback, allowing for dynamic spatial adjustments.
Using the SEED dataset - a widely recognized EEG-based emotion recognition resource - the researchers trained their AI model to classify three core emotional states: positive, neutral, and negative. By analyzing neural activity, the model identifies correlations between brainwave patterns and emotional states. The study integrates this model into a 360-degree virtual reality (VR) environment, simulating real-world architectural settings to assess how spaces can adapt to users' emotions dynamically. This research sets the foundation for "sensing architecture," where buildings and urban environments become responsive entities that adjust in real-time based on biometric data.
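For readers unfamiliar with SEED, each film-clip recording in the dataset is annotated with one of the three target states. A minimal Python sketch of the label convention follows, assuming SEED's usual -1/0/1 encoding for negative/neutral/positive (worth confirming against the dataset's own documentation):

```python
# Assumed SEED-style label convention: -1 = negative, 0 = neutral,
# 1 = positive. Shifting to 0..2 gives standard classifier targets.
SEED_LABELS = {-1: "negative", 0: "neutral", 1: "positive"}

def to_class_index(seed_label: int) -> int:
    """Map a SEED emotion label (-1/0/1) to a 0-based class index."""
    return seed_label + 1

assert [to_class_index(k) for k in sorted(SEED_LABELS)] == [0, 1, 2]
```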
AI model development and implementation
The AI model employs a hybrid deep learning approach combining convolutional neural networks (CNNs) and long short-term memory (LSTM) architectures. CNNs extract spatial features from EEG data, while LSTMs capture temporal dependencies, enabling precise emotion classification. Additionally, the researchers experimented with fine-tuning large language models (LLMs) to process EEG data in a structured format, exploring whether AI could infer affective states from raw brainwave signals.
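As a concrete illustration of this CNN-LSTM pattern, here is a minimal PyTorch sketch; the channel count, window length, and layer sizes are illustrative assumptions, not the authors' published configuration:

```python
import torch
import torch.nn as nn

class CnnLstmEmotionNet(nn.Module):
    """Hedged sketch of a hybrid CNN-LSTM EEG emotion classifier."""

    def __init__(self, n_channels=62, n_classes=3):
        super().__init__()
        # 1-D convolutions slide over time, treating EEG channels as input
        # feature maps, to extract spatial features across electrodes.
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, 64, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(64, 128, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # The LSTM consumes the CNN feature sequence to model temporal
        # dependencies across the window.
        self.lstm = nn.LSTM(input_size=128, hidden_size=64, batch_first=True)
        self.head = nn.Linear(64, n_classes)  # positive / neutral / negative

    def forward(self, x):                # x: (batch, channels, time)
        feats = self.cnn(x)              # (batch, 128, time // 4)
        feats = feats.permute(0, 2, 1)   # (batch, time // 4, 128)
        _, (h_n, _) = self.lstm(feats)   # final hidden state per sequence
        return self.head(h_n[-1])        # (batch, n_classes) logits

model = CnnLstmEmotionNet()
logits = model(torch.randn(8, 62, 200))  # dummy batch of EEG windows
print(logits.shape)                      # torch.Size([8, 3])
```

The convolutions act as the spatial feature extractor across electrodes, while the LSTM's final hidden state summarizes how a window evolves over time before classification.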
To enhance accuracy, the study implemented a channel-agnostic training method, which removes dependencies on specific EEG channel configurations so the model can generalize across data sources. The system was trained on an 80:20 train-test split, and hyperparameters such as learning rate, dropout rate, and batch size were tuned to maximize classification accuracy and keep the model robust in real-time applications. The final model was then integrated into a VR-based simulation, where EEG data influenced spatial elements such as lighting, sound, and spatial configuration, demonstrating the potential for AI-driven adaptive architecture.
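A hedged sketch of that training setup follows: an 80:20 split plus one plausible reading of channel-agnostic training, implemented here as a random channel permutation so the network cannot rely on a fixed electrode montage. The augmentation strategy, the stand-in model, and the hyperparameter values are illustrative assumptions, not the authors' exact configuration:

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset, random_split

# Synthetic stand-ins for preprocessed EEG windows and 3-class labels.
X = torch.randn(1000, 62, 200)
y = torch.randint(0, 3, (1000,))
dataset = TensorDataset(X, y)

# 80:20 train-test split, as described in the study.
n_train = int(0.8 * len(dataset))
train_set, test_set = random_split(dataset, [n_train, len(dataset) - n_train])

def channel_shuffle(batch: torch.Tensor) -> torch.Tensor:
    """Randomly permute channel order each batch so the model cannot
    memorize a fixed montage (one plausible reading of channel-agnostic
    training; the paper's exact mechanism may differ)."""
    perm = torch.randperm(batch.size(1))
    return batch[:, perm, :]

# Stand-in classifier; in practice this would be the CNN-LSTM hybrid
# sketched above. Learning rate, dropout rate, and batch size mirror the
# kinds of hyperparameters the study reports tuning; values are placeholders.
model = nn.Sequential(nn.Flatten(), nn.Dropout(0.3), nn.Linear(62 * 200, 3))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for xb, yb in DataLoader(train_set, batch_size=64, shuffle=True):
    optimizer.zero_grad()
    loss = loss_fn(model(channel_shuffle(xb)), yb)
    loss.backward()
    optimizer.step()
```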
Real-world applications and implications
The study underscores the vast potential of EEG-based emotion recognition in revolutionizing architecture. By embedding AI-driven affective computing into built environments, architects and urban planners can create intelligent spaces that promote psychological well-being. For instance, workspaces could dynamically adjust lighting and acoustics based on employee stress levels, enhancing productivity and comfort. Similarly, hospitals and therapeutic environments could adapt their surroundings to reduce patient anxiety and improve mental health outcomes.
Moreover, this research introduces a new paradigm in smart city planning, where urban environments respond to collective emotional states. Public spaces, transportation hubs, and cultural institutions could utilize real-time biometric feedback to optimize user experience, improving spatial efficiency and human satisfaction. While this study primarily focuses on EEG-based affective computing, it opens the door for integrating other biometric signals, such as heart rate variability and skin conductance, to create even more sophisticated adaptive environments.
Challenges and future directions
Despite its promising findings, the study acknowledges several challenges. One key limitation is data variability, as EEG signals can be influenced by external factors such as movement artifacts and individual physiological differences. Additionally, while the SEED dataset provides a robust foundation, larger and more diverse datasets are needed to improve model generalization for real-world applications.
Another challenge lies in ethical considerations and privacy concerns. The collection and processing of biometric data require stringent security measures to protect user privacy. Implementing transparent data governance frameworks and user consent mechanisms will be critical in ensuring ethical AI deployment in architectural applications.
Future research should focus on integrating multimodal biometric sensing to enhance the reliability of emotion classification. Combining EEG with facial expression recognition, voice analysis, and physiological signals could provide a more holistic understanding of emotional responses to built environments. Additionally, exploring real-time deployment in physical architectural spaces - beyond VR simulations - will be essential in validating the model’s practical feasibility.
Conclusion: Toward emotion-aware architecture
The intersection of AI, neuroscience, and architecture presents exciting opportunities for the future of intelligent built environments. The study by Ji, Kim, and Jun highlights the feasibility of EEG-based emotion recognition in adaptive architecture, paving the way for human-centric design solutions that prioritize emotional well-being. As AI technology advances, the integration of affective computing into architectural design has the potential to reshape how we interact with spaces, transforming them into responsive, intuitive, and emotionally intelligent environments.
By bridging cognitive science with architectural innovation, this research contributes to the ongoing evolution of smart cities and personalized spatial experiences. While challenges remain, the foundational insights provided by this study set the stage for future explorations in emotion-aware architecture, ultimately enhancing the way we experience and interact with the built environment.
First published in: Devdiscourse

