
Toward a Key-frame Automatic Extraction Method for Video Storyboard Surrogates Based on Users' EEG Signals and Discriminant Analysis

정보관리학회지 / Journal of the Korean Society for Information Management, (P)1013-0799; (E)2586-2073
2015, v.32 no.3, pp.377-396
https://doi.org/10.3743/KOSIM.2015.32.3.377
Kim, Hyun Hee (Myongji University)


Abstract

This study proposed a key-frame automatic extraction method for video storyboard surrogates based on users' cognitive responses, measured as EEG signals and combined with discriminant analysis. In an experiment with twenty participants, we assumed five image recognition and processing steps (stimulus attention, stimulus perception, memory retrieval, stimulus/memory comparison, and relevance judgment) and examined which ERP component characterizes each step. Each step showed a distinct ERP component: N100, P200, N400, P3b, and P600, respectively. We also found that, within the P3b component, the positive peak amplitude at the left parietal site P7 and the latency of the negative trough at the right prefrontal site FP2 are important variables for distinguishing among relevant, partially relevant, and non-relevant frames. Using these variables, we conducted a discriminant analysis to classify frames as relevant or non-relevant.

Keywords
EEG, event-related potentials (ERP), key-frame, discriminant analysis, relevance
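To make the classification step concrete, the sketch below illustrates the kind of pipeline the abstract describes: extract the two ERP predictors (the positive peak amplitude at P7 and the negative-trough latency at FP2 within an assumed P3b window) from epoched single-trial data, then fit a linear discriminant classifier over three frame groups. This is a minimal illustration, not the study's actual code: the sampling rate, window bounds, simulated ERP traces, and all function names are assumptions, and scikit-learn's LinearDiscriminantAnalysis stands in for whatever discriminant-analysis tool was used in the study.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 250                                  # sampling rate in Hz (assumed)
WIN = (int(0.30 * FS), int(0.60 * FS))    # assumed P3b search window, 300-600 ms

def peak_amplitude(trace):
    # Predictor 1: largest positive deflection inside the P3b window (e.g., at P7).
    return trace[WIN[0]:WIN[1]].max()

def trough_latency(trace):
    # Predictor 2: latency (ms) of the most negative deflection inside the
    # P3b window (e.g., at FP2).
    seg = trace[WIN[0]:WIN[1]]
    return (WIN[0] + seg.argmin()) / FS * 1000.0

# Simulated single-trial ERP traces (hypothetical data, for illustration only):
# relevant frames get a larger P7 positivity and an earlier FP2 trough.
rng = np.random.default_rng(0)
n_frames = 90
labels = rng.integers(0, 3, n_frames)     # 0 non-relevant, 1 partial, 2 relevant
t = np.arange(int(0.8 * FS)) / FS         # 0-800 ms epoch
p7 = np.array([(2 + 3 * y) * np.exp(-((t - 0.40) ** 2) / 0.005)
               + rng.normal(0, 0.5, t.size) for y in labels])
fp2 = np.array([-(1 + y) * np.exp(-((t - (0.45 - 0.03 * y)) ** 2) / 0.004)
                + rng.normal(0, 0.5, t.size) for y in labels])

# Two-predictor feature matrix, matching the variables named in the abstract.
X = np.column_stack([
    [peak_amplitude(tr) for tr in p7],
    [trough_latency(tr) for tr in fp2],
])

lda = LinearDiscriminantAnalysis()
acc = cross_val_score(lda, X, labels, cv=5).mean()
print(f"cross-validated accuracy: {acc:.2f}")

Linear discriminant analysis suits the abstract's three-group setting because it finds the linear combinations of the two predictors that best separate the group means, and the fitted model can then score new frames as relevant or non-relevant.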
