Overview

Advances in affective computing are being adopted by a growing number of domains (e-commerce, news reading, Web 2.0 services, and human-computer interfaces). The driving force behind this adoption is the ability of affect-aware applications, and especially games, to deliver enhanced user immersion and engagement. Such environments are unique elicitors of emotion, and studying user experience within them is of paramount importance for understanding the internal mechanics of gameplay. In this framework, games are increasingly used in learning, both in formal education and in teaching social and/or vocational skills, putting into action higher-level psychological concepts, such as attention, engagement and flow, and introducing modern reward systems to make gameplay more appealing.

Capturing, analyzing and synthesizing player experience in both traditional screen-based games and augmented- and mixed-reality platforms has been a challenging area at the crossroads of cognitive science, psychology, artificial intelligence and human-computer interaction. Additional gameplay input modalities, such as gestures and movement (e.g. with Nintendo Wii, Microsoft Kinect, or smartphones), image, and speech, increase both the importance of this study and the complexity of player experience. Sophisticated techniques from artificial and computational intelligence can be used to recognize the affective state of the player/learner, based on multiple modalities of player-game interaction, and to model emotion in non-playing characters. Multiple modalities of input can also provide a novel means for game platforms to measure player satisfaction and engagement during play, without necessarily having to resort to post-experience, off-line questionnaires. For instance, players immersed in gameplay will rarely gaze away from the screen, while disappointed or indifferent players will typically show very little response or emotion. Game adaptation techniques can also be used to maximise player experience, thereby closing the affective game loop: e.g. changing the game soundtrack to a vivid or dimmer tune to match the player's powerful stance or prospect of defeat. In addition, procedural content generation techniques may be employed, based on the level of user engagement and interest, to dynamically produce new, adaptable and personalised content (e.g. a new level in a physics-skills game that challenges players without disappointing them, or a set of advanced questions or tasks for players who appear to be bored).
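As a purely illustrative sketch of the affective game loop described above, the Python snippet below fuses a few hypothetical multimodal signals (gaze, facial expression, arousal) into an engagement estimate and maps it to soundtrack and difficulty adaptations. All signal names, weights and thresholds are invented placeholders, not part of any particular game engine or of the track itself; a real system would typically learn such a mapping from labelled data.

```python
# Minimal, illustrative affective game loop (all names and thresholds are hypothetical).
from dataclasses import dataclass


@dataclass
class PlayerSignals:
    gaze_on_screen: float   # fraction of the last time window spent looking at the screen (0..1)
    facial_valence: float   # -1 (negative expression) .. +1 (positive expression)
    arousal: float          # normalised physiological arousal (0..1)


def estimate_engagement(s: PlayerSignals) -> float:
    """Fuse multimodal cues into a single engagement score in [0, 1].
    A fixed weighted average is used here purely for illustration."""
    score = 0.5 * s.gaze_on_screen + 0.25 * (s.facial_valence + 1) / 2 + 0.25 * s.arousal
    return max(0.0, min(1.0, score))


def adapt_game(engagement: float) -> dict:
    """Close the loop: choose soundtrack mood and next-level difficulty
    from the estimated engagement (thresholds are arbitrary examples)."""
    if engagement < 0.3:
        # Player appears bored or indifferent: liven the music, raise the challenge.
        return {"soundtrack": "vivid", "next_level_difficulty": "harder"}
    if engagement > 0.8:
        # Player is highly engaged: keep the current balance.
        return {"soundtrack": "current", "next_level_difficulty": "same"}
    return {"soundtrack": "calm", "next_level_difficulty": "slightly_harder"}


if __name__ == "__main__":
    signals = PlayerSignals(gaze_on_screen=0.95, facial_valence=0.4, arousal=0.6)
    engagement = estimate_engagement(signals)
    print(engagement, adapt_game(engagement))
```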

Call for Papers

Scope

Track Topics

  • Natural interaction in learning games
      • controlling games with hand and body gestures, body stance, facial expressions, gaze, speech, and physiology
      • mapping non-verbal cues to affect, emotion, and player satisfaction
  • Emotion in learning experience
      • affective player/learner modelling
      • artificial and computational intelligence for modelling player/learner experience
      • adapting to player/learner affect and experience
      • adaptive learning and player experience
      • affect-driven procedural learning content generation
  • Emotion modelling in non-player characters
  • Higher-level concepts
      • learner engagement, attention and satisfaction
      • maximising user engagement and flow
      • social context awareness and adaptation
  • Games for learning
  • Emotion and affect in user studies and user-centred evaluation
  • Designing for special needs
  • Reward systems and transfer in games
  • User modelling (vocational vs. children games, formal education vs. social skills, etc.)

Important Dates
  • Conference dates: July 3–7, 2017
  • Registration deadline: July 7, 2017

Organizer
IEEE Computer Society