Sebastien Grange
PhD, Human-Computer Interaction

Current address
Force Dimension
1015 Lausanne


Haptics for surgical navigation

We are developing a needle insertion simulator and CT/US navigation system called the BiopsyNavigator. In this system, the biopsy needle is directly connected to a haptic feedback device. During the intervention, the system provides the surgeon with navigational information as well as force guidance to improve needle insertion.
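
As a hedged illustration of what force guidance along a planned needle path can look like, the sketch below implements a simple "virtual fixture": a spring force that pulls the needle tip back toward the planned entry-to-target line. The geometry and stiffness value are illustrative assumptions, not the BiopsyNavigator's actual controller.

```python
import numpy as np

def guidance_force(tip, entry, target, k=200.0):
    """Virtual-fixture guidance: a spring force (N) pulling the needle
    tip back toward the planned entry->target line.

    tip, entry, target: 3D positions in meters (NumPy arrays)
    k: illustrative stiffness in N/m
    """
    axis = target - entry
    axis = axis / np.linalg.norm(axis)
    # closest point to the tip on the planned trajectory
    closest = entry + np.dot(tip - entry, axis) * axis
    # spring force perpendicular to the path; no drag along it
    return k * (closest - tip)
```

A tip that drifts 2 mm off the planned axis would feel a restoring force of 0.4 N with these numbers; the force vanishes on the path itself.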

GestureDriver: Visual Gesturing for Vehicle Teleoperation

Using visual gestures to pilot a vehicle offers several advantages. For a start, the interface is passive (i.e. it does not require the user to hold any hardware or to wear special tags or clothing). The interface is therefore easy to deploy and can be used virtually anywhere in the field of view of the camera that performs the tracking. This flexibility is hard to achieve with hand controllers such as rate-control joysticks. Using vision also allows different gesture interpretations to be used, depending on the user's preferences and the tasks to be performed. Since the interpretation is software-based, the human-machine interaction can be customized to accommodate any user operating a vehicle in any remote environment. Furthermore, the interaction can adapt to the user over time, which is not possible with hardware devices. As a result, we have the potential to minimize sensorimotor workload on a per-user basis.
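
To make the idea of software-based gesture interpretation concrete, here is a minimal sketch of one possible mapping from a tracked hand position to vehicle rate commands, with a deadband so that small hand jitter produces no motion. The gains, deadband, and coordinates are illustrative assumptions, not GestureDriver's actual calibration.

```python
import numpy as np

def hand_to_rate(hand_px, neutral_px, deadband_px=20, gain=0.005, v_max=1.0):
    """Map a tracked hand position (in image pixels) to vehicle rate
    commands. Offsets from a neutral point become forward/turn rates;
    a deadband keeps small hand jitter from moving the vehicle."""
    dx = hand_px[0] - neutral_px[0]   # lateral offset drives turn rate
    dy = neutral_px[1] - hand_px[1]   # upward offset drives forward rate

    def rate(d):
        if abs(d) < deadband_px:
            return 0.0
        # scale the offset beyond the deadband, saturate at v_max
        return float(np.clip(gain * (abs(d) - deadband_px) * np.sign(d),
                             -v_max, v_max))

    return rate(dy), rate(dx)  # (forward rate, turn rate)
```

Because this mapping lives entirely in software, the gains or even the whole interpretation function can be swapped per user or per task, which is exactly the flexibility described above.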

M/ORIS – Medical / Operating Room Interaction System

During Computer-Assisted Surgery (CAS), the surgeon must interact with the computerized equipment in the Operating Room (OR). Currently, Surgeon-Computer Interaction (SCI) is limited by environmental and human factors. First, the sterility requirements of the surgical environment prevent the use of classic Human-Computer Interaction tools such as the mouse and keyboard. More importantly, interaction with the computer adds to the surgeon's cognitive load, requiring frequent interruptions of the procedure and leading to frustration, loss of focus, and reduced situational awareness. To overcome both of these issues, M/ORIS lets surgeons interact directly with Graphical User Interfaces (GUIs) while reducing their workload by automating computer configuration and the display of relevant information. To achieve these goals, M/ORIS combines vision-based tracking of the surgeon's head and hands with other sensors readily available in ORs (tool trackers, pedals, etc.) to determine the progress of the procedure, and allows the surgeon to point and click on GUIs with simple gestures.
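
One hedged way to picture how heterogeneous OR inputs can drive procedure-progress estimation is a small event-driven state machine. The phases and events below are illustrative assumptions, not M/ORIS's actual surgical model.

```python
# Illustrative sketch: fusing OR inputs (tool tracker, pedal, gesture
# recognizer) into a procedure-phase estimate with a small state
# machine. Phases and events are assumptions, not M/ORIS's real model.

TRANSITIONS = {
    ("setup", "tool_detected"): "navigation",
    ("navigation", "pedal_pressed"): "resection",
    ("resection", "tool_removed"): "closing",
}

def advance(phase, event):
    """Return the next phase; irrelevant events leave the phase alone."""
    return TRANSITIONS.get((phase, event), phase)

phase = "setup"
for event in ["tool_detected", "point_gesture", "pedal_pressed"]:
    phase = advance(phase, event)
# once a phase is known, the GUI can preselect the relevant views
```

Knowing the current phase is what lets the system reconfigure displays automatically instead of asking the surgeon to do it by hand.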

TLIB: a Real-time Computer Vision Library for HCI

A computer vision software library is a key component of vision-based applications. While several libraries exist, most are large and complex, or limited to a particular hardware/platform combination. These factors tend to impede the development of research applications, especially for non-experts in computer vision. To address this issue, we have developed TLIB, an easy-to-learn, easy-to-use software library that provides a complete set of real-time computer vision functions, including image acquisition, 2D/3D image processing, and visualization. We present the motivation for TLIB and its design, summarize some of the applications that have been developed with it, and discuss directions for future work.
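
To give a feel for the kind of 2D processing primitive such a library packages behind a single call, here is a self-contained sketch of Sobel gradient-magnitude filtering in plain NumPy. This illustrates the technique only; it is not TLIB's actual API.

```python
import numpy as np

def sobel_magnitude(img):
    """Gradient magnitude with 3x3 Sobel kernels, the kind of 2D
    processing primitive a vision library wraps behind a single call.
    img: 2D float array; returns an array of the same shape
    (borders are left at zero)."""
    kx = np.array([[-1.0, 0.0, 1.0],
                   [-2.0, 0.0, 2.0],
                   [-1.0, 0.0, 1.0]])
    ky = kx.T
    gx = np.zeros_like(img, dtype=float)
    gy = np.zeros_like(img, dtype=float)
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            patch = img[i - 1:i + 2, j - 1:j + 2]
            gx[i, j] = np.sum(kx * patch)
            gy[i, j] = np.sum(ky * patch)
    return np.hypot(gx, gy)
```

A real-time library would of course run such filters in optimized native code; the point here is the interface, one call from image in to result out.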

Advanced Teleoperation Interfaces

Vehicle teleoperation has traditionally been a domain for experts. Figuring out where the vehicle is, determining where it should go, and remotely driving it are complex problems. These problems can be difficult to solve, especially if the vehicle must operate in a hazardous environment, over a poor communications link, or with limited operator resources. As a result, expert operators are needed more often than not. Our goal is to make vehicle teleoperation accessible to all users, novices and experts alike. We are therefore creating easy-to-use interfaces and effective human-robot interaction methods to enable robust vehicle teleoperation (mobile robot remote driving) in unknown, unstructured environments, both indoors and outdoors.

Nanomanipulation of Carbon Nanotubes

The aim of this project is to develop a force-feedback interface that enables a user to manipulate nanometer-size objects with an Atomic Force Microscope (AFM). Our current interface integrates a high-performance force-feedback system (the Delta Haptic Device, DHD), real-time 3D graphics, and physics-based simulation of nanoscale AFM interaction. We have recently begun integrating our system with a commercial AFM and are now evaluating the suitability of different operation modes for nanomanipulation. By allowing bilateral scaling (geometric, kinematic, and force), the DHD can make such operations easier and faster than traditional tools.
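
Bilateral scaling can be sketched as two constant gains: handle motion is scaled down to nanometer-scale tip motion, and tip forces are scaled up to human-perceptible levels. The gain values below are illustrative assumptions, not the actual DHD/AFM coupling.

```python
# Bilateral scaling as two constant gains between the haptic handle
# and the AFM tip. Gain values are illustrative assumptions, not the
# actual DHD/AFM coupling.

POS_SCALE = 1e-6     # geometric: 1 cm of handle travel -> 10 nm of tip travel
FORCE_SCALE = 1e9    # force: 1 nN at the tip -> 1 N at the handle

def handle_to_tip(handle_disp_m):
    """Scale a handle displacement (m) down to a tip displacement (m)."""
    return handle_disp_m * POS_SCALE

def tip_to_handle(tip_force_n):
    """Scale a tip force (N) up to a human-perceptible handle force (N)."""
    return tip_force_n * FORCE_SCALE
```

Keeping the two gains independent is what makes nanonewton interaction forces both feelable and controllable through centimeter-scale hand motion.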


Journal Articles

J. Lekki; S. Kumar; S. S. Parihar; S. Grange; C. Baur et al. : Data coding tools for color-coded vector nanolithography; Review of Scientific Instruments. 2004. DOI : 10.1063/1.1805014.

Conference Papers

S. Grange; C. Baur : Robust Method for Real-time, Continuous, 3D Detection of Obstructed Faces in Indoor Environments. 2006. IEEE International Conference on Automatic Face and Gesture Recognition, Southampton, UK, April 2006. p. 169-176.
M. Lemay; A. Forclaz; S. Granges; J.-M. Vesin; L. Kappenberger : The hidden organization of atrial fibrillation. 2005. Société Suisse de Cardiologie, Lausanne, June 15, 2008. p. S53.

Theses


S. Grange : M/ORIS - Medical / Operating Room Interaction System. Lausanne, EPFL, 2007. DOI : 10.5075/epfl-thesis-3798.

Talks


A. Forclaz; M. Lemay; S. Granges; J.-M. Vesin; L. Kappenberger : Hidden Organization of Atrial Fibrillation ; Cardiostim 2006, Nice, June 14-17, 2006.