Parag Kumar Mital
Hinman Box (6242)
Hanover NH 03755
Citizenship: United States
Parag K. Mital is a computational artist researching models of audiovisual perception. As a researcher, he investigates how film, together with eye-movement and EEG recordings, can help explore how people attend to and represent audiovisual scenes. He uses this understanding to build computational models of audiovisual perception that serve as real-time experiences meant to reflect on one’s own perceptual processes. As an artist, he uses these computational models to create real-time audiovisual smashups, augmented-reality hallucinations, and expressive control of audiovisual content. His scientific research and artwork reflect on each other, with the science driving the theories and the artwork redefining the questions asked within the research. His work has been exhibited at the Victoria & Albert Museum, London’s Science Museum, the British Film Institute, the Kinetica Art Fair, and the Athens Video Art Festival.
Post-Doctoral Research Associate (beg. 2014) working on audiovisual decoding from fMRI, led by Michael Casey at Dartmouth College, Hanover, NH
Parag K. Mital, Mick Grierson, and Tim J. Smith. 2013. Corpus-Based Visual Synthesis: An Approach for Artistic Stylization. In Proceedings of the 2013 ACM Symposium on Applied Perception (SAP ’13). ACM, New York, NY, USA, 51-58. DOI=10.1145/2492494.2492505
[Project Website] [online] [pdf] [presentation]
Parag K. Mital and Mick Grierson. “Mining Unlabeled Electronic Music Databases through 3D Interactive Visualization of Latent Component Relationships”. In Proceedings of the 2013 New Interfaces for Musical Expression Conference, p. 77. South Korea, May 27-30, 2013.
Tim J. Smith, Parag K. Mital. “Attentional synchrony and the influence of viewing task on gaze behaviour in static and dynamic scenes”. Journal of Vision, vol. 13 no. 8 article 16, July 17, 2013.
Melissa L. Vo, Tim J. Smith, Parag K. Mital, John M. Henderson. “Do the Eyes Really Have it? Dynamic Allocation of Attention when Viewing Moving Faces”. Journal of Vision, vol. 12 no. 13 article 3, December 3, 2012.
Tim J. Smith, Parag K. Mital. “Watching the world go by: Attentional prioritization of social motion during dynamic scene viewing”. Journal of Vision, vol. 11 no. 11 article 478, September 23, 2011.
Previous Academic Work
Research Assistant (2011) London Knowledge Lab, Institute of Education, London, U.K.
ECHOES is a technology-enhanced learning environment where 5-to-7-year-old children on the Autism Spectrum and their typically developing peers can explore and improve social and communicative skills through interacting and collaborating with virtual characters (agents) and digital objects. ECHOES provides developmentally appropriate goals and methods of intervention that are meaningful to the individual child, and prioritises communicative skills such as joint attention. Funded by the EPSRC. Principal Investigators: Oliver Lemon and Kaska Porayska-Pomsta
Research Assistant (2008-2010) John M. Henderson’s Visual Cognition Lab, University of Edinburgh
Investigating dynamic scene perception through computational models of eye-movements, low-level static and temporal visual features, film composition, and object and scene semantics. The DIEM Project. Funded by the Leverhulme Trust and ESRC. Principal Investigator: John M. Henderson
Ph.D. (2014) Arts and Computational Technologies, Goldsmiths, University of London. Thesis: Computational Audiovisual Scene Synthesis
M.Sc. (2008) Artificial Intelligence: Intelligent Robotics, University of Edinburgh
B.Sc. (2007) Computer and Information Sciences, University of Delaware
Department of Computing @ Goldsmiths, University of London. London, U.K. – Spring 2013
This is a 10-week Master’s course covering Mobile and Computer Vision development using the openFrameworks creative coding toolkit at Goldsmiths, Department of Computing. Taught to MSc Computer Science, MSc Cognitive Computing, MSc Games and Entertainment, MA Computational Arts, and MFA Computational Studio Arts students.
Department of Computing @ Goldsmiths, University of London. London, U.K. – Fall 2012
This is a 4-week course covering the basics of openFrameworks taught to MA Computational Arts and MFA Computational Studio Arts students.
Department of Computing @ Goldsmiths, University of London. London, U.K. – Spring 2012
This is a 5-week course covering Gesture and Interaction design as well as Computer Vision basics using the openFrameworks creative coding toolkit. Taught to MSc Computer Science, MSc Cognitive Computing, MSc Games and Entertainment, MA Computational Arts, and MFA Computational Studio Arts students.
Digital Studio, Sackler Centre @ Victoria & Albert Museum. London, U.K. – Spring 2012
A 10-week course open to anyone, covering the basics of iOS development.
Center for Experimental Media Arts @ Srishti School of Art, Design, and Technology. Bangalore, India – Fall 2011
Taught during the interim semester, the course, entitled “Stories are Flowing Trees”, introduced a group of 9 students to the creative coding platform openFrameworks through practical sessions, critical discourse, and the development of 3 installation artworks that were exhibited in central Bangalore. During the first week, students were taught basic creative coding routines including blob tracking, projection mapping, and building interaction with generative sonic systems. Following the first week, students worked together to develop, fabricate, install, publicize, and exhibit 3 pieces of artwork in central Bangalore at the BAR1 artist-residency space in an exhibition entitled SURFACE, textures in interactive new media.
School of Arts, Culture, and Environment, University of Edinburgh
(2010) Supervisor for 3 MSc Students on Augmented Sculpture
(2009) Supervised 6 MSc Students on Incorporating Computer Vision in Interactive Installation
Engineering and Sciences Research Mentor. Seminar. McNair Scholars, University of Delaware, 2007
Instructor. Web Design. McNair Scholars, University of Delaware, 2007
Teaching Assistant. Introduction to Computer Science. University of Delaware, 2006
Freelance Work / Collaborations
London Science Museum
Center for Brain and Cognitive Development, Birkbeck, University of London
Nexus Interactive Arts
Parag K. Mital, “Computational Audiovisual Synthesis and Smashups”. International Festival of Digital Art, Waterman’s Art Centre, 25 August 2012.
Parag K. Mital and Tim J. Smith, “Investigating Auditory Influences on Eye-movements during Figgis’s Timecode”. 2012 Society for the Cognitive Studies of the Moving Image (SCSMI), New York, NY. 13-16 June 2012.
Parag K. Mital and Tim J. Smith, “Computational Auditory Scene Analysis of Dynamic Audiovisual Scenes”. Invited Talk, Birkbeck University of London, Department of Film. London, UK. 25 January 2012.
Parag K. Mital, “Resynthesizing Perception”. Invited Talk, Queen Mary University of London, London, UK. 11 January 2012.
Parag K. Mital, “Resynthesizing Perception”. Invited Talk, Dartmouth, Department of Music. Hanover, NH, USA. 7 January 2012.
Parag K. Mital, “Resynthesizing Perception”. 2011 Bitfilm Festival, Goethe Institut, Bengaluru (Bangalore), India. 3 December 2011.
Parag K. Mital, “Resynthesizing Perception”. Thursday Club, Goldsmiths, University of London. 13 October 2011.
Parag K. Mital, “Resynthesizing audiovisual perception with augmented reality”. Invited Talk for Newcastle CULTURE Lab, Lunch Bites. 30 June 2011 [slides][online]
Robin L. Hill, John M. Henderson, Parag K. Mital, and Tim J. Smith. “Dynamic Images and Eye Movements”. Poster at ASCUS Art Science Collaborative, Edinburgh College of Art, 29 March 2010.
Robin Hill, John M. Henderson, Parag K. Mital, Tim J. Smith. “Through the eyes of the viewer: Capturing viewer experience of dynamic media.” Invited Poster for SICSA DEMOFest. Edinburgh, U.K. 24 November 2009
Parag K Mital, Tim J. Smith, Robin Hill, and John M. Henderson. “Dynamic Images and Eye-Movements.” Invited Talk for Centre for Film, Performance and Media Arts, Close-Up 2. Edinburgh, U.K. 2009
Parag K. Mital, Stephan Bohacek, Maria Palacas. “Realistic Mobility Models for Urban Evacuations.” 2007 National Ronald E. McNair Conference.
Parag K. Mital, Stephan Bohacek, Maria Palacas. “Developing Realistic Models for Urban Evacuations.” 2006 National Ronald E. McNair Conference.
(2013) Media Art Histories/ART+COMMUNICATION 2013 (SAVE AS), RIXC, Riga, Latvia
(2013) Breaking Convention, University of Greenwich, London, U.K.
(2012) Digital Design Weekend, Victoria and Albert Museum, London, U.K.
(2012) SHO-ZYG, Goldsmiths, University of London, U.K.
(2011) SURFACES, Bengaluru Artist Residency 1 (BAR1), Bengaluru (Bangalore), India (Co-Curator and Artist)
(2011) Bitfilm Festival, Goethe Institut, Bengaluru (Bangalore), India
(2011) Oramics to Electronica, Science Museum. London, U.K.
(2011) Edinburgh International Film Festival. Edinburgh, U.K.
(2011) Kinetica Art Fair 2011, Ambika P3. London, U.K.
(2010-2011) Solo Exhibition, Waterman’s Art Centre, London, UK.
(2010) onedotzero Adventures in Motion Festival, British Film Institute (BFI) Southbank, London, UK.
(2010) LATES, Science Museum, London, UK.
(2010) Athens Video Art Festival, Technopolis. Athens, Greece
(2010) Is this a test?, Roxy Arthouse, Edinburgh, UK.
(2010) Neverzone, Roxy Arthouse, Edinburgh, UK.
(2010) Dialogues Festival, Voodoo Rooms, Edinburgh, U.K.
(2010) Kinetica Art Fair 2010, Ambika P3. London, U.K.
(2010) Soundings Festival, Reid Concert Hall, Edinburgh, U.K.
(2010) Media Art: A 3-Dimensional Perspective, Online Exhibition (Add-Art)
(2009) Passing Through, James Taylor Gallery. London, U.K.
(2009) Interact, Lauriston Castle Glasshouse. Edinburgh, U.K.
(2008) Leith Short Film Festival, Edinburgh, U.K. (June)
(2008) Solo exhibition, Teviot, Edinburgh, U.K. (April)
Parag K. Mital, Tim J. Smith, John M. Henderson. A Framework for Interactive Labeling of Regions of Interest in Dynamic Scenes. MSc Dissertation. Aug 2008
Parag K. Mital. Interactive Video Segmentation for Dynamic Eye-Tracking Analysis. 2008
Parag K. Mital. Augmented Reality and Interactive Environments. 2007
Stephan Bohacek, Parag K. Mital. Mobility Models for Urban Evacuations. 2007
Parag K. Mital, Jingyi Yu. Light Field Interpolation via Max-Contrast Graph Cuts. 2006
Parag K. Mital, Jingyi Yu. Gradient Based Domain Video Enhancement of Night Time Video. 2006
Parag K. Mital, Jingyi Yu. Interactive Light Field Viewer. 2006
Stephan Bohacek, Parag K. Mital. OpenGL Modeling of Urban Cities and GIS Data Integration. 2005
EAVI: Embodied Audio-Visual Interaction group initiated by Mick Grierson and Marco Gilles at Goldsmiths, University of London
The DIEM Project: Dynamic Images and Eye-Movements, initiated by John M. Henderson at the University of Edinburgh
CIRCLE: Creative Interdisciplinary Research in CoLlaborative Environments, initiated between the Edinburgh College of Art, the University of Edinburgh, and other institutions.
Michael Zbyszynski, Max/MSP Day School. UC Berkeley CNMAT 2007
Ali Momeni, Max/MSP Night School. UC Berkeley CNMAT 2007
Adrian Freed, Sensor Workshop for Performers and Artists. UC Berkeley CNMAT 2007
Andrew Benson, Jitter Night School. UC Berkeley CNMAT 2007
Perry R. Cook and Xavier Serra, Digital Signal Processing: Spectral and Physical Models. Stanford CCRMA 2007
Ivan Laptev, Cordelia Schmid, Josef Sivic, Francis Bach, Alexei Efros, David Forsyth, Zaid Harchaoui, Martial Hebert, Christoph Lampert, Aude Oliva, Jean Ponce, Deva Ramanan, Antonio Torralba, Andrew Zisserman, INRIA Computer Vision and Machine Learning. INRIA Grenoble 2012
In the News