Improving the Human-Computer Interface in ImageJ using the MIDI protocol

Abstract

As software grows more and more complex, the Human-Computer Interface (HCI) becomes an interesting domain to focus on. The video game market, for example, with the appearance of the Wii (Nintendo, 2006) and the Kinect (Microsoft, 2010), and more recently the explosion of touch tablets for the mainstream, shows how fruitful a reflection on the mode of communication between human and computer can be.

In the field of science and engineering, HCI is largely dominated by the traditional mouse-and-keyboard interface. The user's workflow is slowed down by the micro-interruptions inherent to this interaction mode:

  • The user has to follow the mouse pointer with their eyes to reach the desired function;
  • Normally it is only possible to act on one parameter at a time;
  • There is little coherence between the graphical user interface (showing rotary knobs, for example) and the physical movement to be executed (a translation of the mouse for a rotation);
  • Keyboard shortcuts differ from one software package to another.

Image analysis is a typical process in many kinds of computer-assisted scientific work and implies a trial-and-error approach for the user: selecting the right filter among many possibilities, and suitable parameter values, usually requires several attempts. This calls for concentration on the object of study and for easy execution (meaning the right tool for the desired operation must be easy to find).

In this talk, we present a plugin for ImageJ that enables standard MIDI devices to be used to control ImageJ. These devices, which come from the computer-assisted music industry, are cheap and available on the market in many configurations. Thanks to this plugin, the user can interact with ImageJ in real time using a conventional MIDI tablet application or any other existing MIDI hardware. It is now possible, for example, to change zoom, brightness, rotation, etc. using rotary knobs, to set software values using real (3D, tangible) or virtual (touch-screen) sliders, and to interact with ImageJ through pedals and hardware joysticks. The interaction is entirely configurable, i.e. a user can decide to use a given “interactor” for a given operation. As a result, a device can be configured in the way that provides the most value to that user. This can automate individual workflows and help in performing repetitive tasks; complete macros can be launched with a single press of a hardware button. Thanks to this development, ImageJ functions can be accessed more easily and handling time can be reduced.
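To make the idea concrete, the sketch below shows how such a mapping can be built on Java's standard javax.sound.midi API, which an ImageJ plugin can use directly. This is a minimal illustration under assumptions of our own, not the actual plugin code: the class name and the controller assignments (CC 1 for brightness, CC 2 for a command) are hypothetical, and a real plugin would make the mapping configurable rather than hard-coded.

    import ij.IJ;
    import ij.ImagePlus;
    import ij.plugin.PlugIn;
    import javax.sound.midi.*;

    // Hypothetical sketch, not the authors' plugin: listens for MIDI
    // Control Change messages on every input port and maps two assumed
    // controller numbers to ImageJ actions.
    public class MIDI_Control_Sketch implements PlugIn {
        @Override
        public void run(String arg) {
            try {
                for (MidiDevice.Info info : MidiSystem.getMidiDeviceInfo()) {
                    MidiDevice device = MidiSystem.getMidiDevice(info);
                    if (device.getMaxTransmitters() == 0) continue; // skip output-only ports
                    device.open();
                    device.getTransmitter().setReceiver(new Receiver() {
                        @Override public void send(MidiMessage msg, long time) {
                            if (!(msg instanceof ShortMessage)) return;
                            ShortMessage sm = (ShortMessage) msg;
                            if (sm.getCommand() != ShortMessage.CONTROL_CHANGE) return;
                            int cc = sm.getData1();    // controller (knob/slider) number
                            int value = sm.getData2(); // position, 0..127
                            if (cc == 1) {             // assumed mapping: knob 1 -> brightness
                                ImagePlus imp = IJ.getImage();
                                imp.setDisplayRange(0, 255 - value); // narrower range = brighter
                                imp.updateAndDraw();
                            } else if (cc == 2 && value > 0) { // assumed: button 2 -> a command
                                IJ.run("Smooth");
                            }
                        }
                        @Override public void close() {}
                    });
                }
            } catch (MidiUnavailableException e) {
                IJ.handleException(e);
            }
        }
    }

What makes this mapping so direct is the protocol itself: turning a knob or moving a slider on a MIDI controller emits Control Change messages whose second data byte encodes the control's position on a 0-127 scale, ready to be rescaled onto any numeric software parameter.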

The presentation will show the features of the plugin and its configuration, and it includes a use-case demonstration that can be adapted to one's own usage scenario.

Keywords

Human-Computer Interaction, Ergonomics, Plugin, Tactile Tablet

Short CV

Olivier Buchheit graduated as a mechanical engineer in 2000 (Ecole Nationale d'Ingénieurs de Metz, France), then joined both the Institut National Polytechnique de Lorraine and the Public Research Centre Henri Tudor, Advanced Materials & Structure (AMS) department, as a PhD student in tribology (the science of interacting surfaces in relative motion). From 2005 to 2008, he managed the Surface Characterisation activity of AMS (business & research), i.e. topography and nanoindentation/scratch tests. From 2007 to 2010, he worked on the instrumentation of macro/bulk mechanical tests, in parallel with the topography, nanoindentation/scratch and tribology activities. Since 2010, he has mainly been focusing on human-machine interaction and new interfacing technologies dedicated to real-time scientific data analysis.

Administrative data

Presenting author: Olivier Buchheit
Organisation: CRP Henri Tudor

Co-authors: Thorsten Roth, Andreas Jahnen
