Graduate Student: Lin, Kuan-Yu (林冠宇)
Thesis Title: A Multidimensionally Codified Dataset of Speech-Accompanying Gesture in Cross-Lingual Communication (跨語言溝通中手勢使用行為之多維資料庫)
Advisor: Wang, Hao-Chuan (王浩全)
Committee Members: Chu, Hung-Kuo (朱宏國); Lin, Wen-Chieh (林文杰)
Degree: Master (碩士)
Department:
Publication Year: 2017
Graduation Academic Year: 105
Language: English
Pages: 63
Chinese Keywords: computer-mediated communication (電腦輔助溝通), gesture (手勢), cross-lingual communication (跨語言溝通), dataset (資料集), Kinect, visualization (視覺化)
Keywords: Computer-mediated communication, Gesture, Cross-lingual communication, Dataset, Kinect, Visualization
The association between gesture and language, and how gestures support cross-lingual communication, has long been a research topic worth investigating. However, traditional gesture research requires substantial human effort to annotate and classify gestures, and measuring participants' movements during experiments is also very time-consuming. How to build datasets through an effective workflow and set of tools to support the development of gesture research is therefore another important issue. This thesis addresses these issues by collecting gesture data from cross-lingual communication experiments and constructing a multidimensional gesture dataset.
The dataset contains data of multiple types and dimensions, such as gesture duration, gesture amount, gesture data in four semantically distinct categories (iconic, metaphoric, pointing, and non-iconic gestures), and the speech transcripts and audio/video recordings of the communication. Compared with the audio and video data collected in traditional gesture research experiments, this dataset offers a broader range of data dimensions and can give gesture researchers a new direction for development. Beyond these diverse data, we distilled from prior research a workflow for quantifying and recording gesture data, and designed a series of supporting tools to improve research efficiency. We used the Kinect-taping toolkit to collect participants' gestural movements and data in the communication experiments. From this body-position information, gesture data can be quickly filtered and constructed through the unit-motion method. In addition, for manual gesture annotation, we also designed a gesture-labeling tool to improve the efficiency of manual classification.
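The unit-motion filtering idea described above can be sketched as follows. This is an illustrative simplification under our own assumptions (2-D hand positions, an arbitrary displacement threshold and minimum length), not the toolkit's actual implementation:

```python
# Illustrative sketch of unit-motion filtering: segment candidate gesture
# units from per-frame hand positions by thresholding frame-to-frame
# displacement. Threshold and minimum length are assumed values.

def segment_units(positions, threshold=0.05, min_len=3):
    """Return (start, end) frame-index pairs where per-frame hand
    displacement stays above `threshold` for at least `min_len` frames."""
    units, start = [], None
    for i in range(1, len(positions)):
        (x0, y0), (x1, y1) = positions[i - 1], positions[i]
        moving = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 > threshold
        if moving and start is None:
            start = i                      # motion burst begins
        elif not moving and start is not None:
            if i - start >= min_len:       # keep bursts long enough to be gestures
                units.append((start, i))
            start = None
    if start is not None and len(positions) - start >= min_len:
        units.append((start, len(positions)))
    return units

# Synthetic trace: still, then a burst of motion, then still again.
trace = [(0.0, 0.0)] * 5 + [(0.1 * k, 0.0) for k in range(1, 6)] + [(0.5, 0.0)] * 5
print(segment_units(trace))  # → [(5, 10)]
```

Segments detected this way would then be candidates for the manual labeling step, rather than final gesture codes.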
In the latter part of this thesis, we demonstrate how to work with the dataset by conducting some basic analyses of its different data, and use these communication data to verify findings from prior research, such as the association between the richness of the communication medium and gesture use, and the supportive use of gesture visualization. These analyses not only confirm, as prior research has proposed, that visual richness affects the degree of gesture use; they also demonstrate how the dataset can be applied, offering researchers a broad range of perspectives on gesture research while reducing the human effort and time previously spent on building quantified gesture data.
Motivated by the need to investigate the association between gesture use and language use in cross-lingual remote communication, and for the purpose of making gesture research more efficient and accessible, in this thesis we present a multidimensionally codified gesture dataset. As part of this project, we also present tools and describe the complete methodology for coding and organizing gestures captured as video data.
Our dataset consists of communication-behavior data in multiple dimensions (time duration, amount, speech content) for four categories of gestures (iconic, metaphoric, pointing, non-iconic) generated by 36 participants in two language groups (group EL1: a native and a non-native speaker; group EL2: both non-native speakers) from a lab study. The dataset was collected with a Kinect-taping tool and processed by an automatic data-processing workflow that facilitates gesture categorization. We complement this with human coding methods that precisely align language data and non-verbal gesture data along the timeline of interpersonal communication.
Finally, we conduct a series of sample data analyses with statistical visualizations and exploratory analyses that aim to provide insights into the interaction processes of computer-mediated cross-lingual communication. We explore how gesture use associates with individuals' comprehension and ideation. The dataset provides researchers with an open, archived resource for exploring gesture usage in cross-lingual communication and across different communication media. The goal is to streamline the procedure of gesture coding and lower its human cost, reducing barriers in research on multimodal interpersonal communication.
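To make the multidimensional structure concrete, here is a hypothetical sketch of what one coded gesture record might look like. All field names and values are illustrative assumptions for exposition, not the dataset's actual schema:

```python
# Hypothetical example of a single coded gesture record; every field name
# and value here is an assumption made for illustration.

GESTURE_CATEGORIES = {"iconic", "metaphoric", "pointing", "non-iconic"}

record = {
    "participant_id": "EL1-07",   # assumed ID format
    "group": "EL1",               # EL1: native vs. non-native; EL2: both non-native
    "category": "iconic",         # one of the four coded categories
    "start_s": 12.4,              # gesture onset within the session, seconds
    "end_s": 14.1,                # gesture offset, seconds
    "speech": "a round table",    # transcribed speech aligned with the gesture
}

def duration(rec):
    """Gesture duration in seconds, one of the dataset's coded dimensions."""
    return round(rec["end_s"] - rec["start_s"], 3)

assert record["category"] in GESTURE_CATEGORIES
print(duration(record))  # → 1.7
```

Records of this shape make the coded dimensions (category, duration, aligned speech) directly aggregatable per participant or per group.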