Our lab typically has the resources to support 1-2 senior projects/theses per year, and we give priority to students in special programs (e.g., Honors, ASPIRE, ADNR, McNair). We have a very high bar for the size and scope of these projects: We expect them to be scientifically important and potentially publishable, much like projects conducted by first- and second-year graduate students. In addition, we expect the students to take responsibility for every step of the project (with one exception, described below) so that they become fully immersed in the research process. This means that students who are doing senior projects must have mastered many skills before they start. This also means that these students will have learned an enormous amount by the time they complete their projects and will be well prepared for post-baccalaureate research positions and PhD programs.

Here are the main skills that must be in place prior to the beginning of the student's senior year:

  • Data collection

  • Programming the stimuli and task (usually using Matlab+Psychtoolbox or Python+PsychoPy; a minimal PsychoPy sketch appears after this list)

  • Data processing with Matlab scripts (or possibly Python or R scripts)

    • For ERP experiments, this includes preprocessing (e.g., filtering, artifact correction and rejection), averaging, and any operations that follow averaging (e.g., measuring amplitudes/latencies, decoding, RSA); see the pipeline sketch after this list

    • For eye tracking experiments, this includes writing custom scripts that pull out and analyze all the relevant eye movement parameters

  • Previous experience with inferential statistics is helpful but not necessary prior to beginning the project
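
As an illustration of the task-programming skill, here is a minimal PsychoPy sketch of a simple trial loop. It is only a sketch under assumed defaults: the window settings, stimulus letters, timing values, and response keys are placeholders, not parameters from any actual lab experiment.

    # Minimal PsychoPy trial loop; all stimuli, timings, and keys are placeholders
    from psychopy import visual, core, event

    win = visual.Window(size=(1024, 768), color='gray', units='deg')
    fixation = visual.TextStim(win, text='+', height=1)
    stimulus = visual.TextStim(win, text='', height=2)

    trial_letters = ['A', 'B', 'A', 'B']   # placeholder stimulus sequence
    clock = core.Clock()
    results = []

    for letter in trial_letters:
        fixation.draw()
        win.flip()
        core.wait(0.5)                     # 500 ms fixation period

        stimulus.text = letter
        stimulus.draw()
        win.flip()
        clock.reset()                      # time responses from stimulus onset
        keys = event.waitKeys(maxWait=2.0, keyList=['f', 'j'], timeStamped=clock)
        results.append((letter, keys[0] if keys else None))

    win.close()
    core.quit()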

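And here is a rough sketch of the ERP pipeline steps listed above (filtering, artifact rejection, averaging, and amplitude/latency measurement), written with MNE-Python for compactness; the lab's own pipelines are typically Matlab scripts, and the file name, event codes, and parameter values below are placeholders rather than lab settings.

    # Sketch of an ERP pipeline: filter -> epoch (with rejection) -> average -> measure
    # The file name, event codes, and parameter values are placeholders.
    import mne

    raw = mne.io.read_raw_fif('sub-01_task-demo_raw.fif', preload=True)
    raw.filter(l_freq=0.1, h_freq=30.0)               # band-pass filter

    events = mne.find_events(raw)                      # read event markers
    epochs = mne.Epochs(raw, events,
                        event_id={'target': 1, 'standard': 2},
                        tmin=-0.2, tmax=0.8, baseline=(None, 0),
                        reject=dict(eeg=100e-6),       # simple artifact rejection
                        preload=True)

    evoked = epochs['target'].average()                # averaged ERP for one condition

    # Measure peak amplitude and latency in a post-stimulus window
    ch, lat, amp = evoked.get_peak(ch_type='eeg', tmin=0.3, tmax=0.6,
                                   return_amplitude=True)
    print(f'Peak at {ch}: {amp * 1e6:.2f} microvolts at {lat * 1000:.0f} ms')
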
To learn these skills, students typically spend 2-3 years (prior to their senior year) working on other projects in the lab and continually moving up to greater levels of responsibility. This means that individual projects/theses are usually limited to students who start in the lab prior to their junior year. However, there are exceptions (e.g., students who've already mastered many of the skills or transfer students who have the time and inclination to catch up quickly).

In a typical project, the student will work with their direct supervisor and the PI to design the experiment. The student will then program the stimuli and task, collect all the data, perform all the data processing, conduct the statistical analyses, and write the paper. We provide assistance, especially with the statistical analyses and writing, but the goal is for the student to do as much as possible.

Here's the typical timeline:

  • Junior year, winter quarter: Start talking with your direct supervisor about possibilities. Your supervisor will likely recommend journal articles to read.

  • Junior year, spring quarter: Meet with Steve and your direct supervisor to develop the hypothesis to be tested and a plan for the experimental design and analyses. Come up with a plan for learning any additional skills that are necessary prior to the beginning of the project.

  • Summer before senior year: Gain the additional skills and read relevant journal articles. Program the stimuli/task.

  • Senior year, fall quarter: Collect all the data.

  • Senior year, winter quarter: Process and analyze the data.

  • Senior year, spring quarter: Write the thesis.

In some cases, the project can involve an analysis of existing data. These cases almost always involve large datasets and/or advanced analysis techniques. (We expect that the student has already gained substantial experience with data collection, because this experience is essential for understanding which analyses are appropriate.) In these cases, the student typically learns the advanced methods over the summer or early fall and begins data analysis no later than the middle of the fall quarter.

It would also be possible for a student to conduct an EEG/ERP methodology project, such as testing the validity of an advanced analysis tool or developing a new analysis tool. This would typically start by working with the PI and your direct supervisor to define the problem and approach. It might involve applying a tool to a large dataset (e.g., the ERP CORE) and/or to simulated data. It would likely involve extensive Matlab programming.
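
As a very rough illustration of the simulated-data piece of such a methodology project, the short Python sketch below generates single-trial data with a known "ERP" component buried in noise, the kind of ground truth one might use to evaluate an analysis tool. All of the parameter values (latency, amplitude, noise level, trial count) are arbitrary choices for illustration.

    # Toy simulation: a known component embedded in noise, for evaluating an analysis tool
    # All parameter values are arbitrary illustrations, not lab settings.
    import numpy as np

    srate = 250                                   # sampling rate in Hz
    times = np.arange(-0.2, 0.8, 1 / srate)       # epoch from -200 to +800 ms

    # Ground-truth component: Gaussian peaking at 400 ms with 5 microvolt amplitude
    true_erp = 5e-6 * np.exp(-((times - 0.4) ** 2) / (2 * 0.05 ** 2))

    rng = np.random.default_rng(0)
    n_trials = 100
    noise = rng.normal(scale=20e-6, size=(n_trials, times.size))
    trials = true_erp + noise                     # simulated single-trial data

    average = trials.mean(axis=0)                 # averaging should recover the component
    peak_latency = times[np.argmax(average)]
    print(f'Recovered peak latency: {peak_latency * 1000:.0f} ms (true peak = 400 ms)')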