Parag Kumar Mital


Bregman Media Labs
6242 Hinman Box
Dartmouth College
Hanover, NH 03755
I am currently on the job market. Feel free to contact me.

Parag K. Mital is a computational artist researching the computational synthesis of audiovisual perception. As a researcher, he investigates how film, together with eye-movement, EEG, and fMRI data, can help explore how people attend to and represent audiovisual scenes. He uses this understanding to build computational models of audiovisual perception that serve as real-time experiences meant to reflect on one’s own perceptual processes. As an artist, he uses these computational models to create real-time audiovisual smashups, augmented-reality-based hallucinations, and expressive control of audiovisual content. His scientific research and artwork inform each other, with the science driving the theories and the artwork redefining the questions asked within the research. His work has been exhibited at ACM Multimedia, the Victoria & Albert Museum, Media Art Histories: ART+COMMUNICATION, London’s Science Museum, the British Film Institute, the Kinetica Art Fair, the Athens Video Art Festival, and the Bengaluru Artist Residency.

Current Post

Post-Doctoral Research Associate (beg. 2014), working on audiovisual decoding from fMRI in a project led by Michael Casey at Dartmouth College, Hanover, NH


Publications

Parag K. Mital, Jessica Thompson, Thalia Wheatley, Michael Casey. How Humans Hear and Imagine Musical Scales: Decoding Absolute and Relative Pitch with fMRI. CCN 2014, Dartmouth College, Hanover, NH, USA, August 25-26, 2014.

Parag K. Mital. Audiovisual Resynthesis in an Augmented Reality. In Proceedings of the ACM International Conference on Multimedia (MM ’14). ACM, New York, NY, USA, 695-698. 2014. DOI=10.1145/2647868.2655617


Tim J. Smith, Sam Wass, Tessa Dekker, Parag K. Mital, Irati Rodriguez, Annette Karmiloff-Smith. Optimising signal-to-noise ratios in Tots TV can create adult-like viewing behaviour in infants. 2014 International Conference on Infant Studies, Berlin, Germany, July 3-5 2014.

Parag K. Mital, Mick Grierson, and Tim J. Smith. 2013. Corpus-Based Visual Synthesis: An Approach for Artistic Stylization. In Proceedings of the 2013 ACM Symposium on Applied Perception (SAP ’13). ACM, New York, NY, USA, 51-58. DOI=10.1145/2492494.2492505

Parag K. Mital, Mick Grierson. Mining Unlabeled Electronic Music Databases through 3D Interactive Visualization of Latent Component Relationships. In Proceedings of the 2013 New Interfaces for Musical Expression Conference, p. 77. South Korea, May 27-30, 2013.

Parag K. Mital, Tim J. Smith, Steven Luke, John M. Henderson. Do low-level visual features have a causal influence on gaze during dynamic scene viewing? Journal of Vision, vol. 13 no. 9 article 144, July 24, 2013.

Tim J. Smith, Parag K. Mital. Attentional synchrony and the influence of viewing task on gaze behaviour in static and dynamic scenes. Journal of Vision, vol. 13 no. 8 article 16, July 17, 2013.

Melissa L. Vo, Tim J. Smith, Parag K. Mital, John M. Henderson. Do the Eyes Really Have it? Dynamic Allocation of Attention when Viewing Moving Faces. Journal of Vision, vol. 12 no. 13 article 3, December 3, 2012.

Tim J. Smith, Parag K. Mital. Watching the world go by: Attentional prioritization of social motion during dynamic scene viewing. Journal of Vision, vol. 11 no. 11 article 478, September 23, 2011.

Parag K. Mital, Tim J. Smith, Robin Hill, John M. Henderson. Clustering of Gaze during Dynamic Scene Viewing is Predicted by Motion. Cognitive Computation, Volume 3, Issue 1, pp. 5-24, March 2011.

Previous Academic Work

Research Assistant (2011) London Knowledge Lab, Institute of Education, London, U.K.
ECHOES is a technology-enhanced learning environment where 5-to-7-year-old children on the Autism Spectrum and their typically developing peers can explore and improve social and communicative skills through interacting and collaborating with virtual characters (agents) and digital objects. ECHOES provides developmentally appropriate goals and methods of intervention that are meaningful to the individual child, and prioritises communicative skills such as joint attention. Funded by the EPSRC. Principal Investigators: Oliver Lemon and Kaska Porayska-Pomsta

Research Assistant (2008-2010) John M. Henderson’s Visual Cognition Lab, University of Edinburgh
Investigating dynamic scene perception through computational models of eye-movements, low-level static and temporal visual features, film composition, and object and scene semantics. The DIEM Project. Funded by the Leverhulme Trust and ESRC. Principal Investigator: John M. Henderson


Ph.D. (2014) Arts and Computational Technologies, Goldsmiths, University of London. Thesis: Computational Audiovisual Scene Synthesis
This thesis attempts to open a dialogue around fundamental questions of perception such as: how do we represent our ongoing auditory or visual perception of the world using our brain; what could these representations explain and not explain; and how can these representations eventually be modeled by computers? Rather than answer these questions scientifically, we attempt to develop a computational arts practice presenting these questions to participants. The approach this thesis takes is computational scene synthesis: a computationally generative collage process where the units of the collage are built using perceptually-inspired representations. We explain in detail how scene synthesis is built and relate it to an existing lineage of collage-based practitioners. Then, working in the auditory and visual domains separately, in order to bring questions of perception to the experience of the artwork, this thesis makes significant interdisciplinary strides: from reviewing fundamental issues in perception in terms of experimental psychology and cognitive neuroscience, to formulating and developing perceptually-inspired computational models of large databases of audiovisual material, to finally deploying these models within a computationally generative collage-based arts practice. Two final practical outputs using audiovisual scene synthesis are explored: (1) a short film series which attempts to recreate the number-one video of the week on YouTube using only the audiovisual content from the remaining top-10 videos; and (2) a real-time augmented reality experience, presented through a virtual reality headset and headphones, synthesizing a participant’s surroundings using only previously learned audiovisual fragments. Results from both outputs demonstrate the ability of scene synthesis to provoke meaningful engagements with one’s own process of perception. The results further demonstrate that scene synthesis is capable of highlighting both theoretical and practical gaps in our current understanding of human perception and its computational implementations.

M.Sc. (2008) Artificial Intelligence: Intelligent Robotics, University of Edinburgh
B.Sc. (2007) Computer and Information Sciences, University of Delaware

Teaching Experience

Workshops in Creative Coding – “Mobile and Computer Vision”

Lecturer, Department of Computing @ Goldsmiths, University of London. London, U.K. – Spring 2013

A 10-week Master’s course covering mobile and computer vision development using the openFrameworks creative coding toolkit. Taught to MSc Computer Science, MSc Cognitive Computing, MSc Games and Entertainment, MA Computational Arts, and MFA Computational Studio Arts students.

“Introduction to openFrameworks”

Lecturer, Department of Computing @ Goldsmiths, University of London. London, U.K. – Fall 2012

A 4-week course covering the basics of openFrameworks, taught to MA Computational Arts and MFA Computational Studio Arts students.

Workshops in Creative Coding – “Computer Vision”

Lecturer, Department of Computing @ Goldsmiths, University of London. London, U.K. – Spring 2012

A 5-week course covering gesture and interaction design as well as computer vision basics using the openFrameworks creative coding toolkit. Taught to MSc Computer Science, MSc Cognitive Computing, MSc Games and Entertainment, MA Computational Arts, and MFA Computational Studio Arts students.

“Audiovisual Processing for iOS Devices”

Lecturer, Digital Studio, Sackler Centre @ Victoria & Albert Museum. London, U.K. – Spring 2012

A 10-week course, open to the public, covering the basics of iOS development.

Center for Experimental Media Art: Interim Semester

Lecturer, Center for Experimental Media Arts @ Srishti School of Art, Design, and Technology. Bangalore, India – Fall 2011

Taught during the interim semester, the course, entitled “Stories are Flowing Trees”, introduced a group of 9 students to the creative coding platform openFrameworks through practical sessions, critical discourse, and the development of 3 installation artworks exhibited in central Bangalore. During the first week, students were taught basic creative coding routines, including blob tracking, projection mapping, and building interaction with generative sonic systems. Students then worked together to develop, fabricate, install, publicize, and exhibit 3 pieces of artwork at the BAR1 artist-residency space in central Bangalore, in an exhibition entitled “SURFACE: textures in interactive new media”.

Digital Media Studio Project – Various

Supervisor, School of Arts, Culture, and Environment, University of Edinburgh
(2010) Supervised 3 MSc students on Augmented Sculpture
(2009) Supervised 6 MSc students on Incorporating Computer Vision in Interactive Installation


Engineering and Sciences Research Mentor. Seminar. McNair Scholars, University of Delaware, 2007
Instructor. Web Design. McNair Scholars, University of Delaware, 2007
Teaching Assistant. Introduction to Computer Science. University of Delaware, 2006

Freelance Work / Collaborations

Strangeloop Ltd.
XL Recordings
London Science Museum
Center for Brain and Cognitive Development, Birkbeck, University of London
Nexus Interactive Arts
Beau Lotto
Agelos Papadakis
Christos Michalakos


Talks and Presentations

Parag K. Mital, “Computational Audiovisual Synthesis and Smashups”. International Festival of Digital Art, Waterman’s Art Centre, 25 August 2012.
Parag K. Mital and Tim J. Smith, “Investigating Auditory Influences on Eye-movements during Figgis’s Timecode”. 2012 Society for the Cognitive Studies of the Moving Image (SCSMI), New York, NY. 13-16 June 2012.
Parag K. Mital and Tim J. Smith, “Computational Auditory Scene Analysis of Dynamic Audiovisual Scenes”. Invited Talk, Birkbeck University of London, Department of Film. London, UK. 25 January 2012.
Parag K. Mital, “Resynthesizing Perception”. Invited Talk, Queen Mary University of London, London, UK. 11 January 2012.
Parag K. Mital, “Resynthesizing Perception”. Invited Talk, Dartmouth College, Department of Music. Hanover, NH, USA. 7 January 2012.
Parag K. Mital, “Resynthesizing Perception”. 2011 Bitfilm Festival, Goethe Institut, Bengaluru (Bangalore), India. 3 December 2011.
Parag K. Mital, “Resynthesizing Perception”. Thursday Club, Goldsmiths, University of London. 13 October 2011.
Parag K. Mital, “Resynthesizing audiovisual perception with augmented reality”. Invited Talk for Newcastle CULTURE Lab, Lunch Bites. 30 June 2011.
Robin Hill, John M. Henderson, Parag K. Mital, Tim J. Smith. “Dynamic Images and Eye Movements”. Poster at ASCUS Art Science Collaborative, Edinburgh College of Art, 29 March 2010.
Robin Hill, John M. Henderson, Parag K. Mital, Tim J. Smith. “Through the eyes of the viewer: Capturing viewer experience of dynamic media.” Invited Poster for SICSA DEMOFest. Edinburgh, U.K. 24 November 2009
Parag K Mital, Tim J. Smith, Robin Hill, and John M. Henderson. “Dynamic Images and Eye-Movements.” Invited Talk for Centre for Film, Performance and Media Arts, Close-Up 2. Edinburgh, U.K. 2009
Parag K. Mital, Stephan Bohacek, Maria Palacas. “Realistic Mobility Models for Urban Evacuations.” 2007 National Ronald E. McNair Conference. 2007
Parag K. Mital, Stephan Bohacek, Maria Palacas. “Developing Realistic Models for Urban Evacuations.” 2006 National Ronald E. McNair Conference. 2006


Exhibitions

(2013) Media Art Histories/ART+COMMUNICATION 2013 (SAVE AS), RIXC, Riga, Latvia
(2013) Breaking Convention, University of Greenwich, London, U.K.
(2012) Digital Design Weekend, Victoria and Albert Museum, London, U.K.
(2012) SHO-ZYG, Goldsmiths, University of London, U.K.
(2011) SURFACES, Bengaluru Artist Residency 1 (BAR1), Bengaluru (Bangalore), India (Co-Curator and Artist)
(2011) Bitfilm Festival, Goethe Institut, Bengaluru (Bangalore), India
(2011) Oramics to Electronica, Science Museum. London, U.K.
(2011) Edinburgh International Film Festival. Edinburgh, U.K.
(2011) Kinetica Art Fair 2011, Ambika P3. London, U.K.
(2010-2011) Solo Exhibition, Waterman’s Art Centre, London, UK.
(2010) onedotzero Adventures in Motion Festival, British Film Institute (BFI) Southbank, London, UK.
(2010) LATES, Science Museum, London, UK.
(2010) Athens Video Art Festival, Technopolis. Athens, Greece
(2010) Is this a test?, Roxy Arthouse, Edinburgh, UK.
(2010) Neverzone, Roxy Arthouse, Edinburgh, UK.
(2010) Dialogues Festival, Voodoo Rooms, Edinburgh, U.K.
(2010) Kinetica Art Fair 2010, Ambika P3. London, U.K.
(2010) Soundings Festival, Reid Concert Hall, Edinburgh, U.K.
(2010) Media Art: A 3-Dimensional Perspective, Online Exhibition (Add-Art)
(2009) Passing Through, James Taylor Gallery. London, U.K.
(2009) Interact, Lauriston Castle Glasshouse. Edinburgh, U.K.
(2008) Leith Short Film Festival, Edinburgh, U.K. June
(2008) Solo exhibition, Teviot, Edinburgh, U.K. April

Research/Technical Reports

Parag K. Mital, Tim J. Smith, John M. Henderson. A Framework for Interactive Labeling of Regions of Interest in Dynamic Scenes. MSc Dissertation. Aug 2008
Parag K. Mital. Interactive Video Segmentation for Dynamic Eye-Tracking Analysis. 2008
Parag K. Mital. Augmented Reality and Interactive Environments. 2007
Stephan Bohacek, Parag K. Mital. Mobility Models for Urban Evacuations. 2007
Parag K. Mital, Jingyi Yu. Light Field Interpolation via Max-Contrast Graph Cuts. 2006
Parag K. Mital, Jingyi Yu. Gradient Based Domain Video Enhancement of Night Time Video. 2006
Parag K. Mital, Jingyi Yu. Interactive Light Field Viewer. 2006
Stephan Bohacek, Parag K. Mital. OpenGL Modeling of Urban Cities and GIS Data Integration. 2005

Associated Labs

Bregman Media Labs, Dartmouth College
EAVI: Embodied Audio-Visual Interaction group, initiated by Mick Grierson and Marco Gillies at Goldsmiths, University of London
The DIEM Project: Dynamic Images and Eye-Movements, initiated by John M. Henderson at the University of Edinburgh
CIRCLE: Creative Interdisciplinary Research in CoLlaborative Environments, a collaboration initiated between the Edinburgh College of Art, the University of Edinburgh, and other institutions.

Summer Schools/Workshops Attended

Michael Zbyszynski, Max/MSP Day School. UC Berkeley CNMAT 2007
Ali Momeni, Max/MSP Night School. UC Berkeley CNMAT 2007
Adrian Freed, Sensor Workshop for Performers and Artists. UC Berkeley CNMAT 2007
Andrew Benson, Jitter Night School. UC Berkeley CNMAT 2007
Perry R. Cook and Xavier Serra, Digital Signal Processing: Spectral and Physical Models. Stanford CCRMA 2007
Ivan Laptev, Cordelia Schmid, Josef Sivic, Francis Bach, Alexei Efros, David Forsyth, Zaid Harchaoui, Martial Hebert, Christoph Lampert, Aude Oliva, Jean Ponce, Deva Ramanan, Antonio Torralba, Andrew Zisserman, INRIA Computer Vision and Machine Learning. INRIA Grenoble 2012
Bob Cox and the NIH AFNI team, AFNI Bootcamp. Haskins Lab, Yale University. May 27-30, 2014.

In the News

The Space (BBC/Arts Council England)
Fast Company: Co.Design
The Creators Project (Vice/Intel)
BBC News
BBC News
NY Times
David Bordwell
Makematics/Kyle McDonald


ISMAR 2010
CVPR 2009
ICMC 2007
ICMC 2006
