TCTS Lab Staff
 
 



Mickaël TITS

[Bio] [Research] [Side Projects] [Publications]

Researcher, PhD Student  

University of Mons (UMONS) 
Engineering Faculty of Mons (FPMs) 

Numediart Research Institute  
Infortech Research Institute  
TCTS Lab  
31, Boulevard Dolez  
B-7000 Mons (Belgium)  

phone: +32 65 37 47 46  
fax: +32 65 37 47 29  
mickael.tits


ResearchGate profile | Google Scholar profile | GitHub profile | DBLP profile | LinkedIn profile

Work in progress.

Bio:
Mickaël Tits holds a degree in Electrical Engineering, specialized in Multimedia and Telecommunications, from the Engineering Faculty of the University of Mons (2014). His master's thesis was carried out at the Center for Interdisciplinary Research in Music Media and Technology (CIRMMT) at McGill University (Montreal, Canada), and focused on the capture and analysis of pianists' expert gestures. He is currently pursuing a PhD thesis on the analysis of expert gestures through motion capture, using statistical modeling and machine learning.
Research:

PhD thesis (2014 - 2018)

Overview


Expert Gesture Database

Sport - Taijiquan Database


Taijiquan is a martial art, though it can be distinguished from other martial disciplines, which are generally fight-oriented: Taijiquan can be defined as the art of body awareness. The practice of this discipline aims at developing physical abilities such as balance and coordination, as well as mental skills such as concentration. This makes it an ideal discipline for studying gesture expertise.

Thanks to a collaboration with the Taijiquan school of Eric Caulier, we recorded a database of 12 participants of four different expertise levels, from novice to expert. For each participant, we have about 20 minutes of motion data, including accurate trajectories of all body joints.

Music - Piano Hand Gestures Database


This database was built as part of my Master Thesis at CIRMMT (McGill University, Montreal). Four pianists were recorded while wearing 27 reflective markers on each hand. The recording protocol consisted of six different piano pieces and two technical exercises. This database was used to analyse the complexity of hand gestures on the piano, and to investigate relationships between expertise and gesture complexity (see Master Thesis).

Dance - Contemporary Dance Database (i-Treasures Project)


This database was recorded in the context of the European project i-Treasures, for intangible cultural heritage preservation. It consists of six professional dancers from the school P.A.R.T.S., each recorded performing one choreography and five improvisations on emotionally labelled music pieces.

Motion Features Library

Description coming...

Motion Factor Analysis

Description coming...

Applications

VoracyFish

MotionMachine Framework (2014 - forever and ever)


MotionMachine is a cross-platform, open-source C++ framework developed at the numediart institute for Motion Capture (MoCap) data manipulation, interactive visualization, and analysis. It is built on top of two libraries, whose full functionality remains directly accessible: Armadillo, a C++ library for linear algebra, and openFrameworks, a framework for developing graphical and interactive applications.

"MotionMachine is a C++ software toolkit for rapid prototyping of motion feature extraction and motion-based interaction design. It encapsulates the complexity of motion capture data processing into an intuitive and easy-to-use set of APIs, associated with the openFrameworks environment for visualisation. MotionMachine is a new framework designed for “sense-making”, i.e. enabling the exploration of motion-related data so as to develop new kinds of analysis pipelines and/or interactive applications." (www.motionmachine.org)

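As an illustration of the kind of pipeline MotionMachine is meant for, here is a minimal sketch of a motion feature extraction step written directly against Armadillo, the linear-algebra library MotionMachine builds on. The data layout (stacked x, y, z rows per marker, one column per frame), the frame rate and the file names are assumptions for illustration; MotionMachine's own loaders and API are not shown.

    // Minimal sketch: per-marker speed from raw trajectories, using Armadillo only.
    // Layout assumption: rows are stacked (x, y, z) per marker, columns are frames.
    #include <armadillo>

    // Per-frame speed of one marker from its 3 x nFrames trajectory.
    arma::rowvec markerSpeed(const arma::mat& traj, double frameRate)
    {
        arma::mat vel = arma::diff(traj, 1, 1) * frameRate;   // finite differences between frames
        return arma::sqrt(arma::sum(arma::square(vel), 0));   // Euclidean norm per frame
    }

    int main()
    {
        const double frameRate = 179.0;                       // capture rate in Hz (assumed)
        const arma::uword nMarkers = 27, nFrames = 1000;

        // Placeholder data; in practice trajectories would come from a C3D/BVH file
        // loaded through MotionMachine's own readers.
        arma::mat mocap(3 * nMarkers, nFrames, arma::fill::randu);

        arma::mat speeds(nMarkers, nFrames - 1);
        for (arma::uword m = 0; m < nMarkers; ++m)
            speeds.row(m) = markerSpeed(mocap.rows(3 * m, 3 * m + 2), frameRate);

        speeds.save("speeds.csv", arma::csv_ascii);
        return 0;
    }
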
Master Thesis (2014)


I carried out my master's thesis at the Center for Interdisciplinary Research in Music Media and Technology (CIRMMT) at McGill University (Montreal, Canada). The goal of this project was to capture and analyze pianists' expert gestures. As a first step of this research, I designed a motion capture setup using the Qualisys optical motion capture technology to capture a pianist's hand and finger trajectories with high accuracy. As an example of the results, you can enjoy the beautiful Schumann excerpt played by the most brilliant participant of the study. The first video is a rendering of the 3D motion data collected with the setup.

These highly accurate 3D motion data were then analyzed with different algorithms, such as Principal Component Analysis, which allows the decomposition of the movement into orthogonal subparts called eigenmovements. See the second video for an example of a pianist's hand gesture decomposition.
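
For illustration, here is a minimal sketch of such a decomposition using Armadillo's PCA routine. The data layout (one frame per row, one marker coordinate per column) and the file names are assumptions; the preprocessing of the actual study (filtering, centring, per-hand analysis) is not reproduced.

    // Sketch of an eigenmovement decomposition: each principal component is an
    // "eigenmovement", and each frame is a weighted mixture of them.
    #include <armadillo>

    int main()
    {
        arma::mat X;                                   // nFrames x nVariables (e.g. 27 markers x 3 coords)
        X.load("hand_markers.csv", arma::csv_ascii);   // hypothetical input file

        arma::mat coeff;                               // eigenmovements, one per column
        arma::mat score;                               // activation of each eigenmovement over time
        arma::vec latent;                              // variance explained by each eigenmovement
        arma::princomp(coeff, score, latent, X);

        coeff.save("eigenmovements.csv", arma::csv_ascii);
        return 0;
    }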

This decomposition was computed for four pianists on six piano pieces, and the results showed that, at least within this small database, more experienced pianists tend to use more complex movements, resulting in more eigenmovements. On the other hand, "more complex" piano pieces seem to require more eigenmovements to be played. For instance, Schumann's Fantasy in C Major is a piece that requires particularly good left-hand skills, and results showed that more eigenmovements were used for the left hand than for the right hand in this piece. On the contrary, Bach's Prelude in C Major needed only a few eigenmovements, especially for the left hand.
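
One simple way to turn this decomposition into a complexity measure is to count how many eigenmovements are needed to reach a given share of the explained variance; the sketch below uses a 95% threshold as an assumption, not necessarily the criterion used in the thesis.

    // How many eigenmovements are needed to explain a given share of the variance?
    // `latent` is the per-component variance vector returned by arma::princomp above.
    #include <armadillo>

    arma::uword countEigenmovements(const arma::vec& latent, double threshold = 0.95)
    {
        arma::vec explained = arma::cumsum(latent) / arma::accu(latent);   // cumulative explained variance
        arma::uvec hit = arma::find(explained >= threshold, 1);            // first index reaching the threshold
        return hit(0) + 1;
    }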

Side Projects:

Solid Traces by Thierry De Mey (2015)

Solid Traces by Thierry De Mey (2015), in collaboration with the NUMEDIART Institute.

Musician gesture analysis with ORCW (Orchestre Royal de Chambre de Wallonie) (2016)

Video article from Tele-MB (in French).

Bloom by François Zajéga @eNTERFACE'15 (2015)


"BLOOM is a plug-in designed by digital artist François Zajéga. The plug-in is installed in the Motion Machine software (MOMA) developed within the NUMEDIART Institute. MOMA can navigate in real time in any kind of information from a motion capture. For a subtle algorithmic game BLOOM highlights the features of the motion of a performer as a dynamic virtual sculpture." (Transculture).

This project was partly carried out during the eNTERFACE'15 workshop. BLOOM is a virtual garden surrounded by floating seeds, and the development of these seeds is fed by movement. In the following video, the input movement is a Taijiquan performance by Eric Caulier, and the seeds react to the postural load feature.
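
As a rough illustration only: one plausible postural-load-like descriptor is the deviation of the current posture from a neutral reference posture, computed frame by frame. The sketch below makes that assumption; the feature actually implemented in MotionMachine and used by BLOOM may be defined differently.

    // Hypothetical postural-load-like descriptor: summed deviation of all joints
    // from a reference (neutral) posture, giving one value per frame.
    #include <armadillo>

    // frames: 3*nJoints x nFrames, neutral: 3*nJoints x 1
    arma::rowvec posturalLoad(const arma::mat& frames, const arma::vec& neutral)
    {
        arma::mat dev = frames;
        dev.each_col() -= neutral;                            // deviation from the neutral posture
        return arma::sqrt(arma::sum(arma::square(dev), 0));   // one value per frame
    }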




Publications:

Papers in Conference Proceedings

2016
M. TITS, J. TILMANNE, N. D'ALESSANDRO, 2016, "A Novel Tool for Motion Capture Database Factor Statistical Exploration", Proceedings of 3rd International Symposium On Movement & Computing (MOCO 2016), Thessaloniki, Greece, 5-6 July, doi:10.1145/2948910.2948923.
N. GRAMMALIDIS, K. DIMITROPOULOS, F. TSALAKANIDOU, A. KITSIKIDIS, P. ROUSSEL, B. DENBY, P. CHAWAH, L. BUCHMAN, S. DUPONT, S. LARABA, B. PICART, M. TITS, J. TILMANNE, S. HADJIDIMITRIOU, L. HADJILEONTIADIS, V. CHARISIS, C. VOLIOTI, A. STERGIAKI, A. MANITSARIS, O. BOUZOS, S. MANITSARIS, 2016, "The i-Treasures Intangible Cultural Heritage dataset", Proceedings of 3rd International Symposium On Movement & Computing (MOCO 2016), Thessaloniki, Greece, 5-6 July, doi:10.1145/2948910.2948944.
2015
M. TITS, J. TILMANNE, N. D'ALESSANDRO, M. WANDERLEY, 2015, "Feature Extraction and Expertise Analysis of Pianists' Motion-Captured Finger Gestures", Proceedings of the International Computer Music Conference (ICMC 2015), pp. 102-105, Denton, Texas, USA, September 25 - October 1.
