CARPE allows one to begin visualizing eye-movement data in a number of ways. It currently supports low-level feature visualizations, clustering of eye-movements, model selection, heat-map visualizations, blending, contour visualizations, peek-through visualizations, movie output, binocular data input, and more.
Computational and Algorithmic Representation and Processing of Eye-movements - OSX Version
Matlab scripts specific to the DIEM database
Testing EEG encoding/decoding. Produces Gabor wavelets and co-registers the visualization with the EEG recording.
For streaming/logging Emotiv EEG headset data
Testing some encoding/decoding using EEG. Produces various animations that are co-registered with the EEG recording.
Uses cURL to download from Freesound.org
Matlab scripts for handling the IRCAM LISTEN database
Does finger tracking, contour analysis, and different types of shape description for the hand contour, using a 3D ROI around the hand tracked by NITE middleware, OpenCV 2.2, an OpenGL scene, and OpenNI. Developed in collaboration with Bruno Zamborlin.
Memory Mosaic iOS App
ofxOpenCV linking against OpenCV 2.2, including libraries for OSX
Various classes for performing concatenative sound synthesis. Probably not useful to anyone else without a serious amount of time to understand the code.
For performing GPS-based concatenative sound synthesis: ANN retrieval based on GPS locations, and HRTF-based binauralization (mono->stereo using FFT-based overlap-add convolution) using the IRCAM HRTF database
Background modeling for foreground subtraction; tracks multiple blobs (people) and their orientations (using the leading motion vector), with a nice visual display for seeing the results. Video demonstration here: http://vimeo.com/22054133 - more info here: http://pkmital.com
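The background-modeling idea can be sketched in a few lines: keep a running average of the incoming frames as the "background" and flag any pixel that deviates from it by more than a threshold as foreground. This is a simplified stand-in, not the repository's code; the real version operates on camera images via OpenCV, while plain float "pixels" are used here to keep the sketch self-contained. The names (`BackgroundModel`, `apply`) are illustrative.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Adaptive background model: the background is an exponential moving
// average of past frames; a pixel is foreground when it deviates from
// that average by more than a fixed threshold.
struct BackgroundModel {
    std::vector<float> background;
    float learningRate;  // how quickly the model absorbs scene changes
    float threshold;     // deviation that counts as foreground

    BackgroundModel(std::size_t nPixels, float rate, float thresh)
        : background(nPixels, 0.0f), learningRate(rate), threshold(thresh) {}

    // Returns a foreground mask for the frame, then updates the model.
    std::vector<bool> apply(const std::vector<float>& frame) {
        std::vector<bool> mask(frame.size());
        for (std::size_t i = 0; i < frame.size(); ++i) {
            mask[i] = std::fabs(frame[i] - background[i]) > threshold;
            background[i] = (1.0f - learningRate) * background[i]
                          + learningRate * frame[i];
        }
        return mask;
    }
};
```

The learning rate trades off adaptation speed against sensitivity: a high rate quickly absorbs lighting changes but will also absorb a person who stands still.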
Tracks overhead using color and maps tracked points to a new geometry using a homography transformation and calibration routine. Some example test videos of an overhead capture are provided in the bin/data directory. The tracking transformation is useful when you need a defined metric space for your tracking parameters, or need to account for different user heights when tracking paths through a space.
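The core of the homography mapping above is a single operation: lift the tracked 2-D point to homogeneous coordinates, multiply by the 3x3 homography matrix, and divide by the resulting w. A minimal sketch, assuming a row-major matrix layout; in practice the matrix itself comes from a calibration routine such as OpenCV's findHomography.

```cpp
#include <array>

// Apply a 3x3 homography H (row-major) to the 2-D point (x, y).
// The point is treated as (x, y, 1), multiplied by H, and the result
// is divided by its w component to return to 2-D coordinates.
std::array<float, 2> applyHomography(const std::array<float, 9>& H,
                                     float x, float y)
{
    float xh = H[0] * x + H[1] * y + H[2];
    float yh = H[3] * x + H[4] * y + H[5];
    float w  = H[6] * x + H[7] * y + H[8];
    return { xh / w, yh / w };  // perspective divide
}
```

Because of the perspective divide, a homography can express not just rotation, scale, and translation but also the keystone-like distortion between an angled camera view and a flat metric floor plan.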
3D Object Tracking and Pose Estimation for the iPhone
Interfaces libcluster for Variational Dirichlet Process Gaussian Mixture Models
pkmEXTAudioFileReader and pkmEXTAudioFileWriter provide simple interfaces for reading and writing audio files.
Facial shape modeling, appearance modeling, and head-pose recognition. Uses Jason Mora Saragih's FaceTracker code to track facial landmarks and GreatYao's aam-library for building/reprojecting the model (which may in fact be an uncited port of Jason's DeMoLib).
pkmFFT and pkmSTFT provide simple interfaces to the Accelerate.framework for performing vectorized FFT/STFT
pkmMatrix provides a lightweight Matrix class using the Accelerate.framework for vectorized operations
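To illustrate the kind of class pkmMatrix provides: row-major float storage with dense operations that, on OS X, would dispatch to Accelerate (e.g. cblas_sgemm for multiplication). The sketch below uses a plain loop so it compiles anywhere; the struct and method names are illustrative and not the library's actual API.

```cpp
#include <cstddef>
#include <vector>

// Minimal row-major float matrix. A vectorized implementation would
// replace the multiply loop with a single cblas_sgemm call from the
// Accelerate.framework.
struct Mat {
    std::size_t rows, cols;
    std::vector<float> data;  // row-major storage

    Mat(std::size_t r, std::size_t c) : rows(r), cols(c), data(r * c, 0.0f) {}

    float& at(std::size_t r, std::size_t c) { return data[r * cols + c]; }

    // Returns this * B. Loop order (i, k, j) keeps the inner loop
    // walking both C and B contiguously in memory.
    Mat multiply(Mat& B) {
        Mat C(rows, B.cols);
        for (std::size_t i = 0; i < rows; ++i)
            for (std::size_t k = 0; k < cols; ++k)
                for (std::size_t j = 0; j < B.cols; ++j)
                    C.at(i, j) += at(i, k) * B.at(k, j);
        return C;
    }
};
```

Keeping storage as one contiguous row-major buffer is what makes handing the data straight to BLAS routines (or to an FFT) cheap: no copies, just a pointer and a stride.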
Phase vocoder made simple
Calibrate head pose with respect to a screen (television/monitor) for an attention-based measure. Uses Jason Mora Saragih's FaceTracker, please contact him for the code.
openFrameworks-based projection mapping for distorting or mapping a collection of drawing commands/videos/images, etc. Needed a lightweight and robust projection mapping utility, though lpmt was too heavy; homography code stolen from lpmt.
For streaming/recording audio files, circular buffers (see pkmMatrix and pkmEXTAudioFile as well)
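The circular buffers mentioned above are the standard structure for streaming audio: writes wrap around a fixed-size array, and once the buffer is full the oldest samples are overwritten. A minimal sketch, not pkm's actual interface; the names are illustrative.

```cpp
#include <cstddef>
#include <vector>

// Fixed-capacity circular (ring) buffer of audio samples. push() wraps
// around and overwrites the oldest sample once the buffer is full.
struct CircularBuffer {
    std::vector<float> buf;
    std::size_t head = 0;   // next write position
    std::size_t count = 0;  // number of valid samples stored

    explicit CircularBuffer(std::size_t capacity) : buf(capacity, 0.0f) {}

    void push(float sample) {
        buf[head] = sample;
        head = (head + 1) % buf.size();
        if (count < buf.size()) ++count;
    }

    // Read out the valid samples, oldest first. When the buffer is
    // full, the oldest sample sits at the current write position.
    std::vector<float> contents() const {
        std::vector<float> out;
        std::size_t start = (count == buf.size()) ? head : 0;
        for (std::size_t i = 0; i < count; ++i)
            out.push_back(buf[(start + i) % buf.size()]);
        return out;
    }
};
```

The appeal for real-time audio is that both push and read are constant-time with no allocation, so an audio callback can safely write into the buffer while another thread drains it (with appropriate synchronization, omitted here).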
Creates a dense SIFT image description and displays the image based on a PCA reprojection. Based on the SIFT Flow code by Ce Liu, Jenny Yuen, and Antonio Torralba.
Real-time object detection using OpenCV, with a simple UI to select the object to track
Code from the Responsive Ecologies exhibition. Using overhead motion capture and 4 channels of video projection, users are invited to interact in an immersive cinematic environment. This code was developed for the installation in Waterman's Art Centre, London, UK, where artists captincaptin and Parag K Mital collaborated to install their piece, "Responsive Ecologies", during their residency in December 2010 and January 2011. The work is part of a larger ongoing collaboration with ZSL London Zoo and the Musion Academy.
Streams PrimeSense NITE's Skeleton Data via OSC (XCode Project)
[in progress] Synthesis of audiovisual material from YouTube!
More projects are listed on my GitHub