Multimodal Perception
While it is important to study each sense by itself, the interaction between senses is also a major element of perception. Senses can influence each other, which means that the perception of a virtual environment through several senses (multimodal perception) might differ from the perception of the same environment through each sense taken separately. We first present multimodal enhancement, i.e., the fact that perceiving an environment through several modalities provides better perception than through each modality separately. We then present the interaction of haptic perception with the visual modality, marked by a strong predominance of vision over haptics, and finally the interactions between the haptic and auditory modalities.
Multimodal Enhancement
Involving several senses in the perception of an environment usually provides a better perception than each sense taken separately. This result was shown, for example, by [Heller, 1982].
The studied feature in this example was the perceived smoothness of a surface. Subjects were asked to classify objects by smoothness based solely on vision, solely on haptic perception, and then using both senses. Given 3 objects of different smoothness, they had to say which was the smoothest. Results indicate that classification using vision or haptics separately gave similar performance, but that combining them gave significantly better results. A possible explanation proposed by the authors is that multimodal perception allows an optimal use of both senses during exploration.
In order to determine whether bimodal interaction would provide a richer perception, [Ballesteros et al., 2005] compared the perceptual space (hard-soft, dry-wet, etc.) associated with a haptic exploration of objects with the perceptual space associated with combined haptic and visual interaction. Subjects were asked to classify objects based on their texture, without being given classification criteria, first through haptic exploration alone, and then through bimodal interaction. The goal of the study was to investigate whether the classification criteria would be the same under haptic exploration and bimodal interaction. The results indicate that with both interactions, subjects tend to use the same classification criteria, and thus the same perceptual space for object representation. This suggests that, even if multimodal interaction allows a better accuracy, it does not necessarily introduce new perceptual dimensions.
Other research, by [Symmons and Richardson, 2009], studied the case where haptics and vision provide similar feedback, focusing on pattern recognition. Subjects were asked to recognize a letter by moving a finger along the letter's shape. The visual information provided was a point of approximately the size of the moving fingertip. In this case, no increase in performance was observed when vision and haptics were matched, which indicates that when the information is identical, multimodal perception brings no clear advantage.
These results were obtained with real objects, where the information coming from the different senses matched. Simulation and haptic devices have also made it possible to study the influence of one sense on another when the sensory information provided to the different senses does not match.
Visual dominance over haptic
The influence of vision on haptic perception has been studied for various problems. One of the challenges of haptic interaction is to estimate the stiffness of the manipulated elements. [Srinivasan et al., 1996] studied the influence of vision on the estimation of the stiffness of a virtual spring. Subjects were asked to manipulate two virtual springs and state which one was stiffer. A visual bias was introduced so that the displacement produced by a force applied to a spring was not directly linked to its physical stiffness. Instead, each virtual spring was given two stiffness values: a constant one used for force feedback, and another one, varying between the two physical stiffness values, used to compute the displayed displacement for a given force. For extreme values, the visual elongation of the two springs was inverted for the same force. Results indicate that the visual elongation of the spring strongly influenced the subjects' classification, suggesting that in this case vision prevails over the haptic sense.
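The decoupling of haptic and visual stiffness described above can be sketched as follows. This is a minimal illustration assuming a linear (Hooke's law) spring model; the function names and numeric values are illustrative, not the authors' implementation.

```python
def haptic_force(k_haptic: float, displacement: float) -> float:
    """Force actually rendered by the haptic device (linear spring)."""
    return k_haptic * displacement

def visual_elongation(k_visual: float, force: float) -> float:
    """Elongation shown on screen, computed from a *different* stiffness."""
    return force / k_visual

# Two springs with identical haptic stiffness but biased visual stiffness:
# the softer-looking spring (smaller k_visual) stretches more on screen
# for the same applied force, even though the felt force is identical.
f = haptic_force(100.0, 0.02)                  # 2.0 N felt by the user
x_soft_looking = visual_elongation(50.0, f)    # 0.04 m shown
x_stiff_looking = visual_elongation(200.0, f)  # 0.01 m shown
```

Pushing `k_visual` past the other spring's haptic stiffness reproduces the "inverted elongation" condition of the study, where the visually softer spring is in fact the haptically stiffer one.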
Similar results were obtained by [Kumazaki et al., 2007] for haptic length perception. Subjects were asked to estimate the distance traveled by their hand, which they could only see through a video screen displaying an edited image of the hand. When the hand was rendered normally but with an altered traveled distance, the estimated distance tended to match the visual input. However, when the image of the hand was heavily degraded by noise (more than 60%), the haptic distance information prevailed in the estimation. This indicates that when subjects can rely on vision, visual information carries more weight in perception.
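This reliability-dependent weighting is commonly formalized as a variance-weighted combination of the two estimates. The sketch below is a generic illustration of that idea, not the protocol or model of the cited study; the variance values are arbitrary.

```python
def combined_estimate(d_visual: float, var_visual: float,
                      d_haptic: float, var_haptic: float) -> float:
    """Reliability-weighted combination: the noisier cue gets less weight."""
    w_visual = (1 / var_visual) / (1 / var_visual + 1 / var_haptic)
    return w_visual * d_visual + (1 - w_visual) * d_haptic

# Clean visual input (low variance): the combined estimate follows vision.
clean = combined_estimate(10.0, 0.1, 14.0, 1.0)     # close to 10
# Heavily degraded visual input (high variance): haptics dominates.
degraded = combined_estimate(10.0, 5.0, 14.0, 1.0)  # close to 14
```

The weight of each cue is the inverse of its variance, so adding noise to the visual channel shifts the combined percept toward the haptic estimate, matching the qualitative pattern reported above.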
[Yamamoto and Okamura, 2007] used a drawing task to study the effect of haptic error on subjects. The task was to draw a shape, possibly with incorrect haptic feedback. The study shows that performance was not affected as long as the error remained below a certain threshold (in this case, a 5-degree orientation error), while larger errors degraded performance.

In order to determine how stiffness perception is influenced by vision, [Sulzer et al., 2007] conducted a study in which subjects were asked to move their hand from one position to another on the surface of a ball. The balls had different stiffnesses, and visual cues about the ball stiffness were also given (the indicated stiffness possibly differing from the actual one). The main focus was on the adaptation of behavior to the given visual information. Results indicate that subjects adapt to visual information, but not uniformly with respect to the difference between haptic and visual stiffness: there was, for instance, more adaptation to low-stiffness visual information than to high-stiffness information.
Several other works have focused on how the perceived haptic properties of a real object can be changed by visually superimposing information on this object. Hirano et al. [Hirano et al., 2011] notably superimposed textures associated with different levels of hardness on a real object, successfully influencing the perceived hardness of this object. Similar methods have been proposed to influence the perception of "softness" [Punpongsanon et al., 2015] or the perceived weight of an object [Hashiguchi et al., 2014]. In a purely AR context, Jeon and Choi [Jeon and Choi, 2008] have also shown how adding force feedback during the interaction with a real object can modulate the stiffness perceived by the user.
Interaction between the haptic and auditory modalities
[DiFranco et al., 1997] studied the effect of sounds usually associated with a specific stiffness on the estimated stiffness of surfaces. Subjects were asked to rank surfaces by perceived stiffness. At first, as a baseline, all surfaces had the same haptic stiffness, with sounds associated with various stiffnesses. As expected, subjects ranked the stiffness based solely on the auditory cues. Further experiments used random associations between haptic stiffness and sound stiffness. Subjects who had not performed the first experiment also based their judgment on auditory cues rather than on haptic cues, whereas expert subjects were more prone to rank based on the actual stiffness. This result indicates that sound can also influence haptics, through the usual association of sounds with known stiffnesses.
Other studies have investigated the influence of sound on roughness perception. [McGee et al., 2001] and [McGee et al., 2002] performed an experiment where subjects moved a haptic device over a textured surface. A sound was played whenever the haptic device was close enough to a peak of the texture. The task was to determine whether two presented surfaces had the same roughness or not. During the experiment, the frequency of the sound was changed in order to suggest a potentially different roughness. Results indicate that the subjects were influenced by the sound.
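The audio-haptic texture protocol above can be sketched as follows. The peak layout, proximity threshold, and frequency mapping are illustrative assumptions, not the parameters of the cited experiments.

```python
def maybe_play_tone(probe_x: float, peaks: list[float], threshold: float,
                    base_freq: float, roughness_factor: float):
    """Return the tone frequency to play when the haptic probe is near a
    texture peak, or None otherwise. A higher roughness_factor raises the
    pitch, suggesting a rougher surface than the one actually felt."""
    for peak_x in peaks:
        if abs(probe_x - peak_x) <= threshold:
            return base_freq * roughness_factor
    return None

peaks = [0.0, 0.5, 1.0]  # assumed peak positions along the texture
near = maybe_play_tone(0.48, peaks, 0.05, 440.0, 1.5)   # near a peak -> 660.0
between = maybe_play_tone(0.25, peaks, 0.05, 440.0, 1.5)  # between peaks -> None
```

Varying `roughness_factor` while keeping the haptic texture fixed reproduces the mismatch condition: the felt peaks stay the same, but the auditory cue suggests a different roughness.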
These results indicate that, though not as strongly as vision, sound cues can influence the perception of the haptic properties of an object.
Table of Contents
1 Introduction
1 Challenges
1.1 Physically-based simulation of complex behavior from deformable objects
1.2 Haptic interaction with complex behavior from the objects
1.3 Haptic perception over different display technologies
2 Contributions
2.1 Haptic interaction with heterogeneous multi-resolution objects
2.2 Bimanual haptic tearing of deformable surfaces
2.3 Haptic stiffness perception in AR compared to VR
3 Outline
4 Notations
2 Related Work
1 Perception
1.1 Introduction
1.2 The anatomy of haptic interaction
1.3 Unimodal perception
1.4 Multimodal Perception
1.5 Crossmodal perception
1.6 Conclusion
2 Physically-based Simulation
2.1 Introduction
2.2 Physically-based Simulation of Deformable Objects
2.3 Adaptive Approaches
2.4 Conclusion
3 Haptics
3.1 Introduction
3.2 Adaptive Acceleration Methods for Haptics
3.3 Conclusion
4 Conclusion
3 Elasticity-based Clustering for Haptic Interaction with Heterogeneous Objects
1 Introduction
2 Description of our approach
2.1 Method overview
2.2 Our Algorithm: Elasticity-based Clustering
3 Evaluation
3.1 Methodology
3.2 Results
4 Illustrative use case: cooking scenario
5 Conclusion
4 Haptic Tearing of Deformable Surfaces
1 Introduction
2 General description of our haptic tearing approach
3 Collision detection for surface meshes using a novel clustering formulation
3.1 Related work on collision detection
3.2 Method overview
3.3 Decomposition of objects in clusters
3.4 Relative displacement measurement of clustered objects
4 Physically-based simulation of tearing
4.1 Related work on tearing simulation
4.2 Our method for efficient physically-based simulation of surface tearing
5 Haptic rendering
6 Use-cases and performance
6.1 Implementation setup
6.2 Illustrative use-cases
6.3 Performance
7 Conclusion
5 Study on haptic perception of stiffness in VR versus AR
1 Introduction
2 User study
2.1 Participants
2.2 Experimental apparatus
2.3 Conditions and Plan
2.4 Procedure
2.5 Collected data
3 Results
3.1 Recognition Accuracy
3.2 Remaining objective measures
3.3 Subjective answers
4 Discussion
5 Conclusion
6 Conclusion
1 Elasticity aggregation for geometric multi-resolution on heterogeneous objects
2 Haptic tearing of deformable surfaces
3 Comparison of haptic stiffness perception in AR vs VR
4 Future work
4.1 Geometric multi-resolution on heterogeneous objects
4.2 Tearing simulation
4.3 Stiffness perception in AR
5 Long-term perspectives
5.1 Towards full perception-based adaptivity in a virtual environment
5.2 Towards a better understanding of haptic perception
A Publications from the author
B Appendix: Extended summary in French (Résumé long en français)
1 Introduction
2 Elasticity partitioning and aggregation model for multi-scale simulation of heterogeneous objects
3 Physically-based simulation of tearing of deformable surfaces
4 Comparative study on stiffness perception between augmented and virtual environments
5 Conclusion
C Appendix: Subjective questionnaire
List of Figures
List of Algorithms
List of Tables
Bibliography