Due to the current health situation, the defense will also be streamed live on YouTube. Here is the link: https://www.youtube.com/watch?v=99qoW6vWUqE
Jury:
Maud Marchal, professor, Université de Rennes, reviewer
Emmanuel Dubois, professor, Université de Toulouse, reviewer
Joëlle Thollot, professor, Université Grenoble-Alpes, examiner
Patrick Reuter, associate professor, Université de Bordeaux, examiner
Jocelyne Troccaz, research director, CNRS, thesis supervisor
François Bérard, associate professor, Université Grenoble-Alpes, thesis co-advisor (invited)
Amélie Rochet-Capellan, research scientist, CNRS, thesis co-advisor (invited)
Interacting with 3D virtual scenes is essential for numerous applications, including 3D data visualization, computer-assisted design, training simulators, and video games. Performing this task through 2D systems such as desktop computers or multi-touch tablets can be tedious. To interact more efficiently with 3D content, high-fidelity interactive systems such as virtual reality head-mounted displays try to reproduce the interactive modalities available in real life. Such systems offer stereoscopic head-coupled rendering and isomorphic control of 3D objects. However, rigorous studies demonstrating their benefits are scarce in the literature. This thesis has two purposes. We want to enrich the literature with controlled user studies that provide robust results on the benefits of high-fidelity systems. We also seek to provide the means to implement the most efficient high-fidelity experiences.
In this manuscript, we start by presenting a state of the art of existing high-fidelity devices and their potential benefits. In particular, we introduce a promising approach called handheld perspective-corrected displays (HPCD), which we studied in depth throughout this thesis.
We then present two contributions that allowed us to quantify the benefits of high-fidelity systems. We studied two tasks involving very different cognitive processes in order to attest to the variety of applications that could benefit from these systems. The first study concerns a 6D docking task. The two high-fidelity systems that we tested, an HPCD and a virtual reality head-mounted display, performed respectively 43% and 29% more efficiently than the status quo (an articulated arm used alongside a flat screen). The second study focuses on the task of learning the shape of an unknown 3D object. For this task, the two previously studied high-fidelity systems improved object recognition performance by 27% compared with the use of a multi-touch tablet.
We then present two other contributions that provide solutions to ease both the hardware and software implementation of high-fidelity systems. We propose a method to evaluate the impact of several technical parameters on the sense of presence felt during an interactive experience, a feeling that testifies to the experience's fidelity with regard to the simulated reality. Using this method in a user study allowed us to identify that, with the tested HPCD, tracking stability and rendering frame rate were the most critical parameters for presence. Finally, we propose a suite of interaction techniques that enable the implementation of applications well suited to spherical HPCDs, and to any other device offering a manipulable screen held with both hands. The proposed interactions take advantage of the efficient control of the device's rotations and proved to be both intuitive and efficient during a qualitative test in an anatomy learning application.