Releases: iandol/opticka
V2.17.0
V2.17.0 Integration of ØMQ, CageLab and Alyx, and Titta Toolbox changes from Niehorster et al., 2024
- Add support for ØMQ for communication messages (command + serialised MATLAB data packet) across networked PTB instances using the `jzmqConnection` class. This is much more robust than the raw TCP/UDP used by `pnet` & `dataConnection`, and we are using it for communication across CageLab devices. This adds a dependency on https://github.com/cogplatform/matlab-jzmq, a MATLAB wrapper for JeroMQ. The class explicitly supports a new neuroscience-targeted middleware called cogmoteGO, with an API designed to manage multiple remote PTB instances and broadcast behavioural data and results back to clients. A minimal sketch follows below.
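A hypothetical sketch of sending a command plus serialised MATLAB data; the constructor arguments and method names (`open`, `write`, `read`, `close`) are assumptions, so check the `jzmqConnection` class for the real API:
```matlab
% Hypothetical sketch -- constructor arguments and method names are
% assumed here; check the jzmqConnection class for the real API.
z = jzmqConnection('address', 'tcp://cagelab.local:5555'); % assumed option
open(z);
msg = struct('command', 'runTask', 'data', rand(10)); % command + MATLAB data
write(z, getByteStreamFromArray(msg)); % one way to serialise to a byte packet
reply = read(z); % e.g. behavioural results broadcast back
close(z);
```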
- Major update to the opticka UI for Alyx integration. There is an Alyx panel where you can connect to your Alyx instance to retrieve data from the server. Opticka can create a new Alyx session, and will upload a copy of the task data to the Alyx server. The data is sent to an AWS-compatible data store linked to the Alyx session, stored in a folder structure that matches the International Brain Lab ONE Protocol (see "A modular architecture for organizing, processing and sharing neurophysiology data," The International Brain Laboratory et al., 2023, Nat. Methods, DOI).
- Add `awsManager` to support AWS S3 storage. This is used to upload the task data to the AWS-compatible data store linked to the Alyx session. It relies on the `awscli` command-line tool, so you will need to install the AWS CLI and configure it with your AWS credentials (see the AWS CLI documentation for more information). We use the cross-platform pixi package manager, installing the CLI with `pixi global install awscli`. A minimal upload sketch follows below.
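A minimal sketch of what such an upload looks like when shelled out from MATLAB; `aws s3 cp` is a real AWS CLI command, but the bucket name and paths are placeholders, and exactly how `awsManager` invokes the CLI is an assumption:
```matlab
% Sketch only -- bucket and paths are placeholders; awsManager wraps
% something along these lines via the awscli tool.
src = '~/OptickaFiles/savedData/lab/subjects/subj1/2024-01-01/';
dst = 's3://my-alyx-bucket/lab/subjects/subj1/2024-01-01/'; % placeholder bucket
[status, out] = system(['aws s3 cp --recursive "' src '" "' dst '"']);
if status ~= 0; warning('Upload failed: %s', out); end
```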
- HED tagging -- we now support HED tagging of the session data. This is used to tag parameters with metadata that can be used for search / analysis. The tags are stored in a TSV file in the same folder as the raw session data; see `tools/HEDTags.m` and `tools/HEDTagger.m`. The HED tags are generated from the task sequence and the task parameters. We want to support better data sharing, and as Alyx / the ONE protocol carry no task metadata we chose HED from the EEGLab / BIDS projects. See https://www.hedtags.org for details. A sketch of the TSV output is shown below.
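A sketch of writing such a tags file; the column names and tag strings are illustrative HED-style annotations, not the exact output of `tools/HEDTagger.m`:
```matlab
% Illustrative only -- column names and tags are examples, not the
% exact output format of tools/HEDTagger.m.
T = table([0.00; 2.05], {'fixate'; 'stimulus'}, ...
	{'Sensory-event, Visual-presentation'; ...
	 'Sensory-event, Visual-presentation, (Grating, Temporal-rate/2 Hz)'}, ...
	'VariableNames', {'onset', 'event', 'HED'});
writetable(T, 'sub-01_task-fix_events.tsv', ...
	'FileType', 'text', 'Delimiter', '\t'); % TSV next to the raw session data
```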
- `runExperiment` -- big improvements to the logging system. Previously task events were stored in several places, but for Alyx / HED we need to centralise the event data, which is then used to generate the HED tags and the Alyx session data.
- Add `joystickManager` — we have built our own HID-compatible joystick hardware and this manager interfaces with it.
- `labJackT` — we now send an 11-bit strobed word rather than an 8-bit word. In theory this is backwards compatible, but you need to update the Lua server code running on the LabJack to use the 11-bit word (`t = labJackT; t.initialiseServer`). Values 0-2047 control EIO0-8 & CIO0-3; see the sketch below for how a word splits across the lines.
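For reference, this is how an 11-bit value (0-2047) decomposes into two parts, assuming the low 8 bits drive the EIO lines and the top 3 bits the CIO lines:
```matlab
% How an 11-bit strobed word splits across the two ports; the exact
% EIO/CIO line mapping is an assumption, check the Lua server code.
word = 1234;                  % any value 0-2047
eioByte = bitand(word, 255);  % low 8 bits -> EIO lines
cioBits = bitshift(word, -8); % top 3 bits (0-7) -> CIO lines
fprintf('word %d -> EIO %d, CIO %d\n', word, eioByte, cioBits);
```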
- Tobii eyetrackers — update the Titta interface to support the new adaptive monkey calibration. For details see Niehorster, D. C., Whitham, W., Lake, B. R., Schapiro, S. J., Andolina, I. M., & Yorzinski, J. L. (2024). Enhancing eye tracking for nonhuman primates and other subjects unable to follow instructions: Adaptive calibration and validation of Tobii eye trackers with the Titta toolbox. Behavior Research Methods, 57(1). https://doi.org/10.3758/s13428-024-02540-y
- `makeReport` — a new method in `optickaCore`, thus available to all opticka objects. It uses the MATLAB Report Generator to make a PDF report of the data and property values contained in the core opticka classes (`runExperiment`, `taskSequence`, `stateMachine`, `behaviouralRecord`, tobii/eyelink/irec). Useful when analysing an experiment, to get an overview of all experiment parameters for that session; minimal usage is sketched below.
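A minimal usage sketch; the file path and the variable name saved inside the MAT file are placeholders here:
```matlab
% Sketch only -- the file path and loaded variable name are placeholders.
load('subj1/2024-01-01/1-fix.mat'); % a saved opticka session, loads e.g. rE
makeReport(rE); % builds a PDF overview of that session's parameters
```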
- `circularMask` shader — add a simple texture shader that provides a circular mask for any texture stimulus (like an image). This is better than using a separate disc shader as before (which blends with a background colour, not alpha). Used in `imageStimulus` and `movieStimulus`, so you can alpha-blend a masked image/movie against a complex background.
Full Changelog: V2.16.1...V2.17.0
V2.16.1 -- BIG Update
V2.16.1 -- 106 files changed
Tip
Please double-check `DefaultStateInfo.m` to see the changes to the state machine files; these may suggest changes you could add to your own state machine files...
- BREAKING CHANGE: we want to support the International Brain Lab ONE Protocol (see "A modular architecture for organizing, processing and sharing neurophysiology data," The International Brain Laboratory et al., 2023, Nat. Methods, DOI), and we now follow ALF file naming for saved files. The root folder is still `OptickaFiles/savedData/`, but now we use a folder hierarchy: if the `labName` field is empty we use the shorter `/ subjectName / YYYY-MM-DD / SessionID-namedetails.mat`, otherwise we use `/ labName / subjects / subjectName / YYYY-MM-DD / SessionID-namedetails.mat`. The opticka `MAT` file will not change structure or content (it will remain backwards compatible), but we will add extra metadata files to help data sharing in future releases. We will add an Alyx API call to start a session in a future release.
- BREAKING CHANGE: LabJack T4 -- we increased the strobe word from 8 to 11 bits, now on EIO1:8 CIO1:3. This should in theory be backwards compatible, as the first 8 bits are still on the same lines. Upgrade the LabJack T4 (connected over USB) like this:
```matlab
t = labJackT();
open(t);
initialiseServer(t);
close(t);
```
- Add an improved rigid-body physics engine. We now use dyn4j, an open-source Java 2D physics engine.
`animationManager` is upgraded (previously it used my own simple physics engine, which couldn't scale to many collisions). Opticka uses degrees, and we do a simple mapping of degrees to metres, so a 1 deg stimulus is a 1 m object. A minimal use case to bounce a ball around the screen:
```matlab
sM = screenManager();
b = imageStimulus('size',4,'filePath','moon.png',...
'name','moon');
b.speed = 25; % will define velocity
b.angle = -45; % will define velocity
aM = animationManager(); % our new animation manager
sv = open(sM); % open PTB screen, sv is screen info
setup(b, sM); % initialise stimulus with PTB screen
addScreenBoundaries(aM, sv); % add floor, ceiling and
% walls to rigidbody world based on the screen dimensions sv
addBody(aM, b); % add stimulus as a rigidbody
setup(aM); % initialise the simulation.
for i = 1:60
draw(b); % draw the stimulus
flip(sM); % flip the screen
step(aM); % step the simulation
end
```
- Improve `touchManager` to better use the rigid-body animations with touch events. You can now finger-drag and "fling" physical objects around the screen.
- Add procedurally generated polar checkerboards: `polarBoardStimulus`; and improve the polar gratings to mask with arc segments: `polarGratingStimulus`.
- Add new stimuli: `dotlineStimulus` -- a line made of dots; `pupilCoreStimulus` -- a calibration stimulus for Pupil Core eyetrackers.
- All stimuli: the `updateXY()` method quickly updates the X and Y position without a full stimulus `update()`, used by `animationManager` when updating positions (a sketch follows below).
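A quick sketch of the fast path, reusing the stimulus `b` and screen `sM` from the animation example above; the exact `updateXY()` argument list is an assumption:
```matlab
% Sketch only -- the updateXY() argument list is assumed; check the
% stimulus classes for the real signature.
for i = 1:60
	updateXY(b, 5 * sin(i/10), 0); % move X/Y without a full update()
	draw(b);
	flip(sM);
end
```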
- All stimuli: added `szPx`, `szD`, `xFinalD` and `yFinalD` properties so we have both pixel and degree values available.
- All stimuli: the `szIsPx` property tells us whether the dynamically generated size at each trial is in pixels or degrees.
- Add `nirSmartManager` to support the nirSmart fNIRS recording system.
- Improve the mouse dummy mode for the touchscreen `touchManager`.
- `arduinoManager` can now use a Raspberry Pi GPIO if no Arduino is present.
- Update the image and movie stimuli to better handle multiple images.
- Add a `Test Hardware` menu to the opticka GUI. You can use this to test that the reward system / eyetracker / recording markers are working before you do any data collection each day.
- Updates to support the latest Titta toolbox for Tobii eyetrackers.
- `optickaCore.getKeys()` -- support the shift key.
- `runExperiment` -- better handling when no eyetracker is selected for a task that may have eyetracker functions.
- `screenManager` -- update the movie-recording settings. You pass `screenManager.movieSettings.record = true` to enable screen recording. Note that the movie is handled automatically, so:
```matlab
s = screenManager();
s.movieSettings.record = true;
s.open(); % this also initialises the video file
for i = 1:3
	s.drawText('Hello World');
	s.flip(); % this also adds the frame to the movie
end
s.close(); % this also closes the video file
```
- Lots of improvements for analysing Tobii and iRec data (see `tobiiAnalysis` and `iRecAnalysis`); in particular we integrate the Nyström & Holmqvist (2010) toolbox to improve data cleaning.
- Switch to using string arrays for comment property fields.
State Machine Changes:
- `@()needFlip(me, false, 0);` -- add a 3rd parameter to control the flip of the eyetracker window. NOTE: 0 = no flip; 1 = dontclear + dontforce; 2 = clear + dontforce; 3 = clear + force; 4 = clear + force on the first frame, then switch to 1. Here dontclear leaves the previous frame onscreen (useful to show the eye track), and dontforce doesn't force the flip, which is faster as the flip for the tracker is throttled.
- `@()trackerTrialStart(eT, getTaskIndex(me));` & `@()trackerTrialEnd(eT, tS.CORRECT)` -- these new functions wrap the several commands that were previously used to send the trial start/end info to the eyetracker. As we increase the number of supported eyetrackers, it is better to wrap this in a single function. NOTE: we mostly use the Eyelink message structure to define trials, even for other trackers, which simplifies analysis later on. An example of how these calls might sit in a state file is sketched below.
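The function calls below mirror the snippets above, but the surrounding entry-list structure is illustrative only; check `DefaultStateInfo.m` for the real layout:
```matlab
% Illustrative layout only -- see DefaultStateInfo.m for the real structure.
stimEntry = {
	@()needFlip(me, true, 1); % flip tracker window, keep previous frame onscreen
	@()trackerTrialStart(eT, getTaskIndex(me)); % one call sends all trial-start info
};
correctEntry = {
	@()needFlip(me, false, 0); % stop flipping the tracker window
	@()trackerTrialEnd(eT, tS.CORRECT); % one call sends all trial-end info
};
```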
Tweaks and twiddles…
This is mostly a bunch of small tweaks and fixes: some fixes to the procedural checkerboard stimulus, and more logging fixes to improve performance for long tasks...
What's Changed
- Dev branch by @iandol in #4
Full Changelog: V2.15.12...V2.15.14
Tweaking State Machine Timing
I did a refactor of the state machine to ensure timing is accurate. This was prompted by doing lots of photodiode vs. strobe trigger tests. Opticka is super reliable, but I noticed that if we asked for 2 s of maintained fixation we actually got 2.036 s of stimulus. This is because the timer itself is managed by the eyetracker, but the state machine must then make the logic decision to transition to another state, and this incurs a fixed two-flip offset. If you absolutely need 2 s you must request 1.964 s on a 60 Hz system. The state machine timers themselves are really precise (I tested to around 0.001 ms).
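The compensation, as a worked example (the 36 ms overshoot is the measured value above, roughly two flips on a 60 Hz display):
```matlab
% Worked example of compensating for the two-flip transition offset.
wanted    = 2.000;             % desired fixation time (s)
overshoot = 0.036;             % measured overshoot (~2 flips at 60 Hz)
request   = wanted - overshoot % 1.964 s -- the value to put in the state file
```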
- Added a UseVulkan switch to the GUI, as we currently have a bug with the Display++ that is only fixed when using Vulkan.
- Add a polar grating (radial, circular and spiral) stimulus, with an arc segment mask option for fMRI etc.
- Add stereomode option to screenManager (only anaglyph stereo used so far).
- Improvements to the tobii / iRec / Pupil core operator display.
- Fix `RFLocaliser`, which broke when we added `visibleRate`.
- Improve `visibleRate` frequency accuracy.
- More Pupil Core bring-up.
Full Changelog: V2.15.11...V2.15.12
Refactor of Reward System
We currently support 3 different reward delivery systems: (1) standard variable-length TTL, (2) custom-controlled peristaltic pump, (3) food pellet dispenser. We added a reward structure to `arduinoManager` to set the parameters and type, then we can use the single command `giveReward()` to use the appropriate interface. The opticka GUI can now select among the options; a sketch follows below.
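A hypothetical sketch of the new flow; the field names of the reward structure and the type labels are assumptions, so check `arduinoManager` for the real ones:
```matlab
% Hypothetical sketch -- reward structure field names and type labels
% are assumed; check arduinoManager for the real ones.
aM = arduinoManager();
open(aM);
aM.reward.type = 'ttl'; % e.g. 'ttl' | 'pump' | 'pellet' (assumed labels)
aM.reward.time = 150;   % e.g. TTL length in ms (assumed field)
giveReward(aM);         % one command drives whichever system is selected
close(aM);
```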
We also moved the eyetracker managers from the communication to the eyetracker folder; run `addOptickaToPath` to update the paths. We also add a `pupilCoreManager` to use the Pupil Core headset; we should be able to use the same consistent API as for the eyelink / tobii / irec, and some fixup is still needed for full functionality, but so far this eyetracker is very promising! You need to clone https://github.com/iandol/matlab-zmq and add the lib folder to the path (see below); zmq support is precompiled for Linux and macOS.
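For example (the clone location here is just a placeholder):
```matlab
% Add the matlab-zmq lib folder to the path; the clone location is
% only an example.
addpath('~/Code/matlab-zmq/lib');
savepath; % optional: persist across MATLAB sessions
```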
For the iRec eyetracker, we can now screenshot the validation results by pressing F1.
You can pass -1 to the audio interface in the opticka GUI to disable the audio system, as we still face random crashes on Ubuntu 22.04.
Full Changelog: V2.15.10...V2.15.11
Supporting the iRecHS2 Eyetracker
We have added a new eyetracker, the iRecHS2 (https://staff.aist.go.jp/k.matsuda/iRecHS2/index_e.html), which can use machine-vision cameras from FLIR and works really well. For this we did a big refactor of the eyetracker code so that we use the base classes `eyetrackerCore` and `eyetrackerSmooth` and inherit from them. As we first supported the Eyelink, the Tobii and iRec functions broadly mirror `eyelinkManager`. It means you can use the same experiment code and any eyetracker should work unchanged. The iRec gains the operator screen we use for the Tobii, so you can see fixation and eye data in real time as the experiment progresses. The inheritance pattern is sketched below.
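A sketch of the inheritance pattern; the concrete class name here is made up, and exactly which base classes each real manager inherits is best checked in the eyetracker folder:
```matlab
% Illustrative only -- save as myTrackerManager.m; the concrete class
% name is made up, the real managers live in the eyetracker folder.
classdef myTrackerManager < eyetrackerCore & eyetrackerSmooth
	% fixation logic, operator screen etc. are shared via the base
	% classes, so the same experiment code drives any tracker unchanged
end
```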
Full Changelog: V2.15.3...V2.15.5
V2.13.2 -- bug fixes
This release fixes a few small GUI bugs, like not saving the subject name properly, and we've added a menu to change the Arduino board type (we support the classic Uno, but also the Seeeduino Xiao and Raspberry Pi Pico).
Intan recording control
V2.13 adds the ability to send start / stop commands to the Intan. It also tweaks the GUI to split the state machine and user functions out into separate tabs...
We also added some examples of using `imageStimulus` to make "shapes" using alpha-masked images (see `optickatest.m` for an example).
NOTE: this version reworks some of the `runExperiment` properties, so you may need to edit your stateInfo files to adjust to the new property names. Read through `DefaultStateInfo.m` to see what has changed...
Full Changelog: V2.11...V2.13.1
V2.11
The major changes are:
- Tobii gets a full-time monitor window to show the stimulus positions and eye-plot summaries during the trial. The `trackerDraw...` functions used on the eyelink all work on the tobii, so the same task code plots realtime eye data on either eyetracker.
- We've added a preliminary reverse-correlation module. This is dense binary or trinary noise, with parameters modified via a separate dialog.
- The Help > Keyboard Mapping file now shows all keys for the RFMapper module and the Tobii calibration key commands.
- Many small tweaks.
Full Changelog: V2.09...V2.11
V2.09
Full Changelog: V2.06...V2.09