ETRA 2016

Tutorials

General Information

ETRA 2016 will include a number of full-day tutorials (9 a.m. – 4 p.m.), in which experts in the field cover a topic related to eye movement or eye tracking research within the general theme of the conference. Sign up through the conference registration system.

  • When: March 14, 2016
  • Where: Francis Marion, Charleston (the conference hotel)
  • Cost: $125 (full-day); Students: $75 (full-day)
  • Lunch/coffee: included in the Tutorial registration fee.

Topics

Eye-tracking and Visualization

Tanja Blascheck, Kuno Kurzhals, Michael Raschke, Patrick Renner.

Eye-tracking has become a widely used method to analyze user behavior in marketing, neuroscience, human-computer interaction, and visualization research. Beyond measuring completion times and recording the accuracy of answers given during visual tasks in classical controlled user experiments, eye-tracking-based evaluations provide additional information on how visual attention is distributed over a presented stimulus and how it changes. Because eye-tracking serves a wide field of applications and many kinds of research questions, different approaches have been developed to analyze eye-tracking data, such as statistical algorithms (descriptive or inferential), string editing algorithms, visualization techniques, and visual analytics techniques. Regardless of whether statistical or visual methods are used for eye-tracking data analysis, the large amount of data generated during eye-tracking experiments has to be handled.
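As a small illustration of the string editing approach mentioned above, fixations can be mapped to area-of-interest (AOI) labels so that each scanpath becomes a string, and two scanpaths can then be compared with an edit (Levenshtein) distance. This is a minimal sketch, not material from the tutorial; the AOI labels and scanpaths below are invented.

```python
def levenshtein(a: str, b: str) -> int:
    """Minimum number of insertions, deletions, and substitutions
    needed to turn scanpath string a into scanpath string b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

# Each letter is a fixation on a hypothetical AOI
# (A = title, B = image, C = caption).
print(levenshtein("AABC", "ABCC"))  # 2
```

A distance of 0 means identical AOI sequences; larger values indicate increasingly dissimilar scanning behavior, which makes the measure useful for comparing groups of participants.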

Whereas statistical analysis mainly provides quantitative results, visualization techniques allow researchers to analyze different levels and aspects of the recorded eye-tracking data in an explorative and qualitative way. Visualization techniques help to analyze the spatio-temporal aspects of eye-tracking data and the complex relationships within the data. This qualitative exploration aims at generating hypotheses that can later be investigated with statistical methods. Due to the increasing complexity of tasks and stimuli in eye-tracking experiments, we believe that visualization will play an increasingly important role in future eye-tracking analysis.

Evaluation has become an important step in the development of new visualization techniques, and eye-tracking is one means of evaluating such newly developed approaches. Analyzing eye-tracking data with visualization techniques is therefore a logical next step. In most cases, however, only well-established visualization techniques are used, such as scan path or attention map visualizations. In this tutorial we will present an overview of further existing visualization techniques for eye-tracking data and demonstrate their application in different user experiments and use cases.
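To make the attention map idea concrete, here is a minimal sketch (not part of the tutorial) that aggregates fixation coordinates into a grid of dwell counts. Real attention maps typically smooth such counts with a Gaussian kernel before rendering them as a heatmap; the stimulus size, cell size, and fixation data below are invented.

```python
from collections import Counter

def attention_map(fixations, width, height, cell=100):
    """Aggregate (x, y) fixation coordinates on a width x height stimulus
    into a grid of counts; each grid cell is `cell` pixels on a side."""
    grid = Counter()
    for x, y in fixations:
        if 0 <= x < width and 0 <= y < height:
            grid[(int(x // cell), int(y // cell))] += 1
    return grid

# Hypothetical fixations on a 400 x 200 px stimulus.
fixations = [(50, 50), (60, 40), (350, 150)]
grid = attention_map(fixations, 400, 200)
```

The resulting counts per cell can be fed to any plotting library to render the familiar red-to-green attention heatmap overlay.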
Click here to see more details (pdf)
Download course material

Mining Scanpath Sequences with R and TraMineR Packages: A Hands-On Introduction

Joe Goldberg.

As sequences of visual attention locations, scanpaths can help uncover observers’ task and problem-solving strategies. Researchers are often faced with questions such as “How different are these groups of scanpaths?”, “Did a manipulation result in more optimal scanning?”, or “Which scanning subsequences were exhibited by all participants?” Answering these and other questions across many conditions and participants can be a significant computational problem that is made easier with tools such as R and the TraMineR package. Although these tools normally present a steep learning curve to researchers, this hands-on, full-day tutorial will provide an easy, applied introduction to sequential scanpath analysis. With this knowledge, attendees should be able to write scripts to process, compare, and visualize groups of eye tracking scanpaths.
Click here to see more details (pdf)
Download course material (password protected zip-file)

Eye data quality: measuring, calculating, and reporting

Dixon Cleveland, Fiona Mulvey, Jeff Pelz, Dong Wang, Marcus Nyström.

The quality of the data produced by eye trackers has a profound effect on what can be reliably measured and inferred in research results, as well as on the limitations of any gaze-enabled or gaze-controlled interface. This tutorial is intended for anyone who wants to know more about how eye tracking systems compare in terms of data quality, how to calculate and report data quality for replicability of research results, and how to make well-informed decisions about data exclusion. The tutorial will include both theoretical and practical instruction based on the methods and measures developed by the EMRA/COGAIN Eye Data Quality Standardisation Committee. By the end of the tutorial, participants should be familiar with various measures of data quality, know how to interpret them, and have gained practical experience with open-source software designed for this purpose. We welcome researchers at all stages of their careers and will include time to consult with the instructors about your own research and data.
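Two of the standard data-quality measures in this area, accuracy (systematic offset of gaze samples from a known target) and precision (sample-to-sample dispersion, often reported as RMS), can be sketched as follows. This is a minimal illustration, not the committee's reference implementation: the samples are assumed to be gaze positions in degrees of visual angle, and the function names are invented.

```python
import math

def accuracy(samples, target):
    """Mean Euclidean offset of gaze samples from the true target position."""
    return sum(math.dist(s, target) for s in samples) / len(samples)

def rms_precision(samples):
    """Root mean square of the distances between successive gaze samples."""
    d2 = [math.dist(a, b) ** 2 for a, b in zip(samples, samples[1:])]
    return math.sqrt(sum(d2) / len(d2))

# Hypothetical samples while a participant fixates a target at (10, 0) deg.
samples = [(10.0, 0.0), (10.5, 0.0), (10.0, 0.5)]
offset = accuracy(samples, (10.0, 0.0))
noise = rms_precision(samples)
```

Low accuracy (a large constant offset) can often be reduced by recalibration, whereas poor precision reflects noise in the signal and limits how small a target a gaze-controlled interface can reliably use.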
Click here to see more details (pdf)
Download course material (password protected zip-file)
Download slides (password protected zip-file)

Tutorial Chair

For more information, please contact:

  • Marcus Nyström (marcus.nystrom [at] humlab.lu.se), Lund University, Sweden

Important Dates

  • 28 Sep 15: Abstracts for full & short papers due
  • 9 Oct 15: Full & short papers due
  • 6 Nov 15: Reviews to authors
  • 18 Nov 15: Author rebuttals due
  • 14 Dec 15: Preliminary acceptance
  • 8 Jan 16: Doctoral Symposium submissions due (extended from 18 Dec 15)
  • 8 Jan 16: Video & Demo submissions due (extended from 18 Dec 15)
  • 13 Jan 16: Camera-ready papers due (full and short)
  • 15 Jan 16: Notifications for Video & Demo submissions
  • 15 Jan 16: Notifications for Doctoral Symposium
  • 22 Jan 16: Final acceptance of full and short papers
  • 22 Jan 16: Camera-ready extended abstracts due (Video/Demo and Doctoral Symposium)
