I’ve put together several tutorials for creating experiments in OpenSesame and for analyzing eye-tracking and EEG data. At the moment, the OpenSesame tutorials are the most complete (thank you, COVID-19!). My hope is to someday write the other tutorials out as fully self-contained guides, but for now, there are a few less comprehensive files.

OpenSesame

The OpenSesame tutorials were designed to quickly get thesis students ready to create their own experiments in OpenSesame. The first tutorial, Getting Started with OpenSesame, is a quick, basic introduction based on the OpenSesame Beginner Tutorial. If you haven’t worked with OpenSesame yet, I recommend starting there. The Lexical Decision Task Tutorial introduces the learner to inline scripting and to pseudorandomization in the experimental design. The Word Spotting Tutorial builds on these elements and also shows the learner how to create audio recordings from participants. The Lexical Decision Experiment in OSWeb (OpenSesame online) Tutorial introduces OSWeb, an online runtime for OpenSesame experiments, using the Lexical Decision Task template.
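The pseudorandomization step mentioned above can be sketched in Python, the same language OpenSesame inline_script items use. This is an illustrative sketch only, not code from the tutorial: it reshuffles the trial list until no more than a fixed number of consecutive trials share a condition (the function name, trial structure, and constraint are all invented for the example).

```python
import random

def pseudorandomize(trials, key, max_run=2, seed=None):
    """Shuffle trials until no run of identical `key` values exceeds max_run."""
    rng = random.Random(seed)
    trials = list(trials)
    for _ in range(10_000):  # retry until the constraint is satisfied
        rng.shuffle(trials)
        run, ok = 1, True
        for prev, cur in zip(trials, trials[1:]):
            run = run + 1 if prev[key] == cur[key] else 1
            if run > max_run:
                ok = False
                break
        if ok:
            return trials
    raise RuntimeError("no valid order found within retry limit")

# Invented example: 8 word and 8 nonword trials, at most 2 in a row per condition.
trials = [{"cond": c, "item": i} for c in ("word", "nonword") for i in range(8)]
order = pseudorandomize(trials, "cond", max_run=2, seed=1)
```

A rejection-sampling loop like this is the simplest approach; for stricter constraints or long lists, a constructive algorithm is more reliable, but for typical thesis-sized designs the retry loop is fine.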

If you are completing a thesis with Dr. Von Holzen, she will tell you which tutorials you need to complete to prepare yourself for your thesis work. For example, if your experiment uses a Word Spotting task, she may ask you to complete the following:

  1. Getting Started with OpenSesame
  2. Lexical Decision Task Tutorial
  3. Word Spotting Tutorial
  4. Lexical Decision Experiment in OSWeb Tutorial

Coming soon: Visual World with Eye-tracking Tutorial

These tutorials were written and tested on a Windows 10 system running OpenSesame 3.3.3. Issues may arise with other operating systems or OpenSesame versions.

Eye-tracking Analyses

Eye-tracking Statistical Analysis Tutorial

A tutorial guide for loading data recorded with an EyeLink eye-tracker into R, cleaning the data, and then analyzing it. You can access the tutorial here.

Cluster-based Permutation Analysis

A short tutorial I gave for the R-Ladies Paris Meetup.

A short presentation I gave at the CLaS Eye-tracking Workshop held at Macquarie University on September 8, 2022, titled “Using cluster-based permutation tests to analyze eye-tracking data”. The presentation has a companion tutorial.
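As a rough illustration of the technique named in the presentation title, here is a minimal sketch of a cluster-based permutation test over time bins, assuming a paired design (a participants × time-bins array of condition differences). The threshold, permutation count, and data are invented for the example; the companion tutorial covers the real analysis.

```python
import numpy as np

def contiguous_clusters(mask):
    """Return (start, end) index pairs of contiguous True runs in a 1-D mask."""
    out, start = [], None
    for i, m in enumerate(mask):
        if m and start is None:
            start = i
        elif not m and start is not None:
            out.append((start, i)); start = None
    if start is not None:
        out.append((start, len(mask)))
    return out

def cluster_perm_test(diff, thresh=2.0, n_perm=1000, seed=0):
    """diff: participants x timebins condition differences (paired design)."""
    rng = np.random.default_rng(seed)
    n = diff.shape[0]

    def max_cluster_stat(d):
        # One-sample t per time bin, threshold, sum |t| within each cluster.
        t = d.mean(0) / (d.std(0, ddof=1) / np.sqrt(n))
        sums = [np.abs(t[a:b]).sum() for a, b in contiguous_clusters(np.abs(t) > thresh)]
        return max(sums) if sums else 0.0

    observed = max_cluster_stat(diff)
    # Null distribution: randomly flip each participant's condition labels.
    null = np.array([max_cluster_stat(diff * rng.choice([-1, 1], size=(n, 1)))
                     for _ in range(n_perm)])
    pval = (null >= observed).mean()
    return observed, pval

# Fake data: 20 participants, 50 time bins, a real effect in bins 20-29.
rng = np.random.default_rng(42)
d = rng.normal(0.0, 1.0, (20, 50))
d[:, 20:30] += 1.0
stat, p = cluster_perm_test(d)
```

Comparing the observed maximum cluster statistic against a null distribution of maxima is what controls the family-wise error rate across time bins without a bin-by-bin correction.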

Audio Stimuli Preparation

If you are using audio stimuli in your experiment, you’ll need to record them and prepare them to be presented by your experimental software. The tutorial Recording Audio (coming soon) will walk you through the steps you need to take to record the auditory stimuli for your experiment. The tutorial Segmenting and extracting audio stimuli in Praat will walk you through opening your recording in Praat and selecting and saving the individual sound files.
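The tutorial itself does this step interactively in Praat, but the underlying operation, cutting a segment out of a longer recording by start and end time, can be sketched in Python with the standard-library wave module. File names and times below are made up for the demonstration:

```python
import wave

def extract_segment(src_path, dst_path, start_s, end_s):
    """Copy the frames between start_s and end_s (seconds) into a new WAV file."""
    with wave.open(src_path, "rb") as src:
        params = src.getparams()
        rate = src.getframerate()
        src.setpos(int(start_s * rate))
        frames = src.readframes(int((end_s - start_s) * rate))
    with wave.open(dst_path, "wb") as dst:
        dst.setparams(params)  # nframes is corrected automatically on close
        dst.writeframes(frames)

# Build a 1-second mono 16-bit silent file just to have something to cut.
with wave.open("demo.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)
    w.setframerate(44100)
    w.writeframes(b"\x00\x00" * 44100)

extract_segment("demo.wav", "segment.wav", 0.25, 0.75)
```

In practice you would mark the start and end times in Praat (or read them from a TextGrid) rather than hard-coding them; the point here is only that each stimulus file is a time-windowed slice of the session recording.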


EEG Analyses

A guide for loading and analyzing EEG data using EEGLAB. This document is adapted from the original, created by Simmy Poonian.