Ewha Womans University (2021.03 ~ 2022.02)

BrainBot: Intelligent robotic goal generation using brain-computer interfaces

(2021.09 ~ 2022.02), as a Principal Investigator

  • Brain-Computer Interface (BCI) and machine learning-based high-level robot goal generation

  • This project consists of three stages
    Stage 1: Classifying the user's thoughts from EEG data using a machine learning approach (imagined word/speech classification model)
    Stage 2: Based on the classified thoughts, the decoded speech is regrouped into "Action", "Source", and "Target" groups
    Stage 3: Based on the grouping result, robot task (action) planning and execution
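The three stages above can be sketched as a small pipeline. This is a minimal illustration, not the project's actual models: the vocabulary, the stub classifier, and the plan format are all hypothetical stand-ins.

```python
# Hypothetical vocabulary grouped by semantic role (Stage 2 grouping).
WORD_ROLES = {
    "bring": "Action", "move": "Action",
    "kitchen": "Source", "table": "Source",
    "cup": "Target", "book": "Target",
}

def classify_imagined_speech(eeg_window):
    """Stage 1 stand-in: a real system would run a trained
    imagined-speech classifier on the EEG window."""
    # Here we simply pretend the decoder output is attached to the window.
    return eeg_window["decoded_word"]

def group_words(words):
    """Stage 2: regroup decoded words into Action / Source / Target."""
    groups = {"Action": [], "Source": [], "Target": []}
    for w in words:
        role = WORD_ROLES.get(w)
        if role:
            groups[role].append(w)
    return groups

def plan_task(groups):
    """Stage 3: turn the grouped words into a high-level robot goal."""
    if groups["Action"] and groups["Target"]:
        return {
            "action": groups["Action"][0],
            "source": groups["Source"][0] if groups["Source"] else None,
            "target": groups["Target"][0],
        }
    return None

# Simulated EEG windows whose decoded words form a command.
windows = [{"decoded_word": w} for w in ["bring", "kitchen", "cup"]]
words = [classify_imagined_speech(w) for w in windows]
plan = plan_task(group_words(words))
print(plan)  # {'action': 'bring', 'source': 'kitchen', 'target': 'cup'}
```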

Ajou University (2014.03 ~ 2021.02)

Noise-robust multimodal data unification algorithm based on deep learning

(2019.03 ~ 2021.02), as a Research Assistant

  • Machine learning-based user emotion classification model using multimodal physiological data

    1. EEG

    2. Galvanic Skin Response (GSR)
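One common way to unify such modalities is feature-level fusion: per-trial feature vectors from each signal are concatenated before classification. The sketch below uses synthetic features and a logistic-regression classifier purely for illustration; the actual project's features and models may differ.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic per-trial features: e.g. EEG band powers (4 dims) and
# GSR statistics (2 dims), with two emotion classes shifted apart.
n = 200
eeg = rng.normal(size=(n, 4))
gsr = rng.normal(size=(n, 2))
y = rng.integers(0, 2, size=n)
eeg[y == 1] += 1.5  # make the classes separable for the demo
gsr[y == 1] += 1.5

# Feature-level fusion: concatenate the modalities per trial.
X = np.hstack([eeg, gsr])

clf = LogisticRegression().fit(X, y)
acc = clf.score(X, y)
print(f"training accuracy: {acc:.2f}")
```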

MR-IoT Convergence-based AI for disaster

(2018.06 ~ 2020.12), as a Research Assistant

  • Brain-Computer Interface and machine learning-based user context classification model

    1. User's emotions

    2. Psychological disorder (alcoholism)

    3. Object the user is currently viewing (digit or non-digit, using the MindBigData dataset)

VR system for Alzheimer's disease patients with dementia

(2018.03 ~ 2018.12), as a Research Assistant

  • Developing a VR headset for Alzheimer's disease patients with dementia

  • An EEG sensor mounted on the VR headset collects the patient's EEG data

  • Training an emotion classification model for the patients based on conventional machine learning algorithms and recurrent neural networks
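The recurrent part of such a model can be sketched with a minimal Elman-RNN forward pass: an EEG sequence (time steps × channels) is folded into a hidden state, whose final value is classified. The layer sizes and random weights below are stand-ins for a trained model, not the project's actual network.

```python
import numpy as np

rng = np.random.default_rng(42)
n_channels, n_hidden, n_classes = 8, 16, 3  # assumed sizes

W_xh = rng.normal(scale=0.1, size=(n_channels, n_hidden))
W_hh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))
W_hy = rng.normal(scale=0.1, size=(n_hidden, n_classes))

def classify_sequence(eeg_seq):
    """Run the recurrence over time, then classify the final state."""
    h = np.zeros(n_hidden)
    for x_t in eeg_seq:                       # one EEG frame per step
        h = np.tanh(x_t @ W_xh + h @ W_hh)    # recurrent update
    logits = h @ W_hy
    return int(np.argmax(logits))

eeg_seq = rng.normal(size=(50, n_channels))   # 50 time steps
label = classify_sequence(eeg_seq)
print(label)  # an integer class index in {0, 1, 2}
```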

Support for abnormal financial transaction detection system using machine learning

(2017.06 ~ 2018.01), as a Research Assistant

  • Analyzing transaction histories to identify abnormal financial transactions

  • Analyzing workers' system log data for internal audits
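A common machine-learning approach to this kind of detection is unsupervised anomaly scoring; the sketch below uses an IsolationForest on two illustrative transaction features (amount, hour of day). The features and data are assumptions for the demo, not the project's real records or method.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)

# Typical transactions: moderate amounts during business hours.
normal = np.column_stack([
    rng.normal(100, 20, size=200),   # amount
    rng.normal(14, 2, size=200),     # hour of day
])
# One suspicious transaction: very large, at 3 a.m.
suspicious = np.array([[5000.0, 3.0]])

X = np.vstack([normal, suspicious])
model = IsolationForest(contamination=0.01, random_state=0).fit(X)
flags = model.predict(X)             # -1 = anomaly, 1 = normal
print(flags[-1])  # -1: the suspicious transaction is flagged
```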

Learner adaption framework for programming education

(2015.07 ~ 2017.06), as a Research Assistant

  • Classifying the learner's context for adaptation

    1. Emotion (based on physiological data, e.g., EEG, GSR, heart rate, eye tracking)

    2. Learning progress (achievement)

    3. Physical action (using smartphone inertial data)
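Once those three context dimensions are classified, an adaptation framework maps them to interventions. The rule set below is a minimal hypothetical sketch: the context fields, thresholds, and adaptation actions are illustrative assumptions, not the framework's actual policy.

```python
def adapt(context):
    """Map a classified learner context to an adaptation decision.

    context: {"emotion": str, "progress": float in [0, 1], "action": str}
    """
    if context["emotion"] == "frustrated":
        return "offer_hint"          # emotional state takes priority
    if context["progress"] < 0.3:
        return "easier_exercise"     # low achievement -> reduce difficulty
    if context["action"] == "idle":
        return "prompt_learner"      # physically inactive -> re-engage
    return "continue"

ctx = {"emotion": "frustrated", "progress": 0.8, "action": "typing"}
print(adapt(ctx))  # offer_hint
```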