- Written by Charlotte Magnusson
- Category: Information
- Hits: 6889
Using the HaptiMap toolkit
Date and time: 16th September 2011, 10-17 CET
Location: Getxo/Bilbao, Spain
Registration/fee: Registration is required. The workshop is free, but the number of participants is limited.
The HaptiMap toolkit is a set of software tools developed to make it easier for developers to add multimodal, accessible interaction to their services. It simplifies the development of mobile location- and mapping-based applications that make use of haptic, audio and visual input and output, by presenting a simple cross-platform API that abstracts the complexities of
- dealing with haptic/audio/visual input and output on these devices, and
- retrieving, storing and manipulating geographical data,
leaving user interface developers free to concentrate on maximising the usability and accessibility of their applications. The aim of this workshop is to give everyone interested in the HaptiMap toolkit valuable inside information as well as hands-on experience getting started with the toolkit software.
10.00-10.40 Design examples - what can you do with the HaptiMap toolkit?
10.40-11.20 Presentation of the overall structure of the toolkit
11.20-12.00 Presentation of UI modules and example designs
13.00-17.00 Hands-on activity - writing an application using the toolkit
Registration deadline: 5th of September
What is the HaptiMap Toolkit?
The HaptiMap toolkit is a cross-platform suite of software designed to facilitate easy, multimodal access to map data. It has four significant features:
1) It is adaptable, in three senses. First, when built from source, it adapts to the target platform's capabilities, e.g. recognising which sensors are available (accelerometers etc.) and what the display and audio characteristics of the device are. Second, when used, it adapts to the presence of external devices that it finds attached; for example, a Bluetooth GPS location device may come into range, or may go out of range. Third, it can obtain map data from several sources and deliver that data to applications in a consistent and readily usable form.
2) It is multimodal (supporting a developer interface for haptic, audio and visual functionality).
3) It is cross-platform: it builds for mobile platforms (Android, iPhone OS, Symbian, and mobile Linux such as Maemo) and for desktop platforms (Windows, Linux and OS X).
4) It is extensible through a plug-in concept that allows new sensors, haptic devices, and map data sources to be easily incorporated.
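The plug-in concept and the run-time adaptation described above can be sketched as a simple registry that exposes only the plug-ins whose hardware is actually present on the device. All class and method names below (`PluginRegistry`, `SensorPlugin`, and so on) are hypothetical illustrations of the pattern, not the actual toolkit API.

```python
# Minimal sketch of a plug-in registry in the spirit of the toolkit's
# extensibility model. All names here are hypothetical illustrations,
# not the real HaptiMap API.

class SensorPlugin:
    """Base class a sensor plug-in would implement."""
    name = "base"

    def available(self):
        """Report whether the underlying hardware is present."""
        return False

    def read(self):
        """Return the current sensor reading."""
        raise NotImplementedError


class PluginRegistry:
    """Tracks registered plug-ins and exposes only those whose
    hardware is actually available on the current device."""

    def __init__(self):
        self._plugins = {}

    def register(self, plugin):
        self._plugins[plugin.name] = plugin

    def active_plugins(self):
        return [p for p in self._plugins.values() if p.available()]


class FakeAccelerometer(SensorPlugin):
    name = "accelerometer"

    def available(self):
        return True               # pretend the hardware is present

    def read(self):
        return (0.0, 0.0, 9.81)   # resting device: gravity on the z axis


registry = PluginRegistry()
registry.register(FakeAccelerometer())
active = registry.active_plugins()
print([p.name for p in active])   # ['accelerometer']
```

In this scheme, supporting a new sensor, haptic device or map data source means adding one new subclass and registering it; application code only ever queries the registry for what is currently active.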
- Written by Charlotte Magnusson
- Category: Information
- Hits: 13252
Proceedings at: http://www.english.certec.lth.se/haptics/HaptiMap/MobileHCI2010workshop_proceedings.pdf
Workshop at MobileHCI 2010, Tuesday, September 7, 2010, Lisbon, Portugal
Submission deadline: May 10, 2010
Orientation and navigation are very important skills for getting along in daily life. The acquisition and use of these skills is based on the processing of visual, auditory and sensorimotor/kinesthetic information, denoting the relations between objects, places, and people. This multisensory information gives us cues about directions, distances, speed, headway, traffic, landmarks, obstacles, etc. With the recent availability of global positioning methods, comprehensive GIS systems, powerful mobile computers and advanced interaction techniques, multisensory spatial information can now be presented in a personalized, context-aware and intuitive manner.
However, it is still not completely clear how to design, and how and when to present, multisensory spatial information on mobile devices. This workshop will initiate a multidisciplinary discussion on these topics. We would like to invite researchers working in the fields of human-computer interaction, computer science, cognitive sciences, psychology, psychophysics, mechanics and electronics to submit a position paper and/or a demo presentation dealing with topics similar to the ones listed below:
Methodologies for representing spatial information (e.g. distances, directions, speed, landmarks, obstacles) using audio and haptics on mobile devices
Compatibility with the device: display and rendering capabilities, task-related and environmental constraints…
New interaction techniques, both hardware and software: wearables, haptic accessories, etc.
Design methods and guidelines for such information on mobile devices (e.g. design of useable and useful audio and haptic icons)
Specific evaluation methods and criteria
Workload and user performance
Spatial information presentation and the moving body
User experience and satisfaction
This will be a full-day workshop. Up to ten position papers will be presented by their authors. Each presentation will be limited to 15 minutes and followed by a discussion. Presentations and discussions will be followed by a demo session (demo submissions should be accompanied by a one-page description), hands-on activities with the demonstrators and a wrap-up discussion at the end of the workshop.
Submissions and participants selection criteria
The papers should focus on current research activities and/or interesting aspects of future work. All position papers will be peer-reviewed by at least two reviewers and evaluated on their originality and relevance to the workshop topics. Authors of accepted position papers will be notified on 21st May 2010. Camera-ready manuscripts are due on 28th May 2010.
The authors of the best workshop papers will be invited to publish an extended version of their work in a special issue of a relevant journal (e.g. International Journal of Mobile Human Computer Interaction, International Journal of Handheld Computing Research).
The workshop will be held on the 7th of September 2010. In order to participate in the workshop, at least one author of a position paper must register for both the conference and the workshop.
Please note that the early registration deadline for MobileHCI has been moved to the 16th of June.
Margarita Anastassova is a researcher in the Sensory Interfaces Laboratory at CEA, LIST in France. She holds a Ph.D. in Psychology and Cognitive Ergonomics from Paris Descartes University. Her PhD work was centered on the analysis of users’ needs for an Augmented Reality system to be used as a job aid by automotive service technicians. Margarita has over 8 years of experience in the user-centered design of interactive systems, which she gained in a number of EU projects such as Ambient Agoras, VIRTHUALIS and SAMBA. She is currently working on the EU HaptiMap project. Her main research interests lie in the field of human factors (usability, utility, accessibility) of emerging technologies such as haptic and tactile interfaces, mobile systems, virtual and augmented reality. She is also interested in cognitive ergonomics aspects of navigation and wayfinding. Margarita is a member of the program committee or a reviewer for a number of international conferences (CHI, Interact, MobileHCI, Pervasive Health).
Charlotte Magnusson, PhD, associate professor (docent), Certec, Division of Rehabilitation Engineering Research, Department of Design Sciences, Lund University. Charlotte leads the research at Certec on the design of useworthy haptic and audio interfaces for people who are blind or have low vision, and has over 10 years of experience in the field. She currently has two particular areas of interest. The first concerns the use of haptic devices, and how haptics and audio can be used to make different types of complex information and virtual environments more accessible. The second is design and design methodology for persons with and without disabilities. Charlotte is also an experienced programmer, with particular experience in interactive multimodal applications. She leads the haptics group at Certec, has been responsible for the department's participation in the EU projects MICOLE, ENABLED and ENACTIVE, and is currently the coordinator of the EU project HaptiMap.
Martin Pielot is an associate researcher in the Intelligent User Interfaces Group at the OFFIS – Institute for Information Technology, Germany. At the same time he is a doctoral student in the Media Informatics and Multimedia Systems Group at the University of Oldenburg, Germany. Coming from a background of building mobile applications, his research focuses on ubiquitous computing and mobile human-computer interaction in general. In particular, he is interested in non-visual information presentation in the domain of mobile applications and the methods needed to evaluate it.
Dr. Gary Randall has a PhD in Cognitive Psychology, an MSc (Distinction) in Cognitive Science (both University of Birmingham UK), a BSc (Hons) in Artificial Intelligence and is an expert in the computational modelling of visual processing. He has a unique blend of IT and theoretical skills relating to Cognitive Science, built up over 20 years, and is an experienced programmer in many languages. Gary is a Senior Research Scientist at BMT and Project Manager of EC projects. His post-doctoral experience as Research Fellow at Harvard Medical School was related to implementing and extending the influential machine vision model Guided Search. Prior to this, Gary's research concentrated on building dynamic neural systems to predict common visual behaviours. His areas of interest include the roles of object recognition, attention, and memory as they relate to image retrieval. Gary was the co-ordinator of the EC-funded MAPPED project which provided customised route data for use by different categories of disabled user in multiple European locations.
Ginger B. Claassen studied computer science at the University of Paderborn (Germany) and the School of Computer Science at Carleton University (Ottawa, Canada). For more than 10 years Mr. Claassen worked as a research assistant at C-LAB, a joint research and development laboratory of Siemens AG and the University of Paderborn, with a special focus on accessibility and "Design for All". Mr. Claassen is blind, and therefore knows from his own life and work the problems and barriers persons with disabilities face in our modern information and communication society.
Since 2008 he has worked for the Siemens "Accessibility Competence Center" and for the "Siemens Access Initiative", the company's corporate joint effort to improve the accessibility of the various Siemens products and services, such as software, Internet portals, workplaces, information and communication technology, household appliances, and public transport systems. He has been involved in various commercial and research projects, provides "Design for All" training to colleagues and customers, and presents accessible solutions at international exhibitions and congresses.
The workshop organizers are grateful to the European Commission which co-funds the IP HaptiMap (FP7-ICT-224675).
- Written by Super User
- Category: Information
- Hits: 285344
HaptiMap, Haptic, Audio and Visual Interfaces for Maps and Location Based Services
HaptiMap has performed novel and original research into multimodal perceptualizations. We have investigated how, and in what ways, multimodal feedback can both augment and replace visual feedback for diverse users in diverse situations. The results of this research have been published as research reports and as guideline documents, and have been encapsulated in the HaptiMap toolkit and the HaptiMap demonstrators. Research in HaptiMap has been performed in a user-centred way: during the initial studies 221 users participated in different research activities, formative evaluations during development involved 113 individual users, and our final evaluations involved 392 users (including 27 developers evaluating the toolkit). Thus our results and recommendations are based on a substantial body of evidence. Check out the HaptiMap video for a quick introduction.
The HaptiMap guideline documents are:
User Study Guidelines, a guideline document which provides an easy-to-read overview of the why and how of user-centred design for mobile applications, and includes descriptions of a wide variety of user study techniques.
User requirements and design guidelines for map applications is a guideline document intended to support and inspire the design and development of good map and location-based applications. We provide both general observations and more concrete, detailed suggestions, many of which are applicable to any design intended for mobile contexts. Our recommendations apply to a wide range of use situations, but we have a particular focus on the pedestrian situation.
Accessible map and LBS content guidelines. These guidelines aim to support data providers so that they are able to collect and store appropriate information. The aim of this deliverable is to raise the awareness of accessibility issues and help the map data providers decide what information should be stored and how.
The HaptiMap toolkit allows software engineers easy access to the HaptiMap multimodal components. It was designed for all mobile and desktop platforms but has found most favour on Android and iOS. On these two mobile platforms there are a number of directly 'pluggable' components, such as the tactile compass, the Geiger compass, the touchover map and the activity recogniser. There is a rich support system available for the toolkit, through a Wiki, a forum and a help ticketing system. Examples of toolkit use can be found in the demonstrators and in the entries to the HaptiMap developer competition. The toolkit has also proved useful as a tool for teaching students of computer science, interaction design and engineering, and it is finding its way into research projects in the area of connected health and assisted living. The toolkit has been successfully evaluated by external professional developers, who found it easy to use and useful, and stated that it was well ahead of other systems currently available.
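To give a feel for the kind of logic a component like the tactile compass encapsulates, the sketch below maps the angular error between the device heading and the bearing to a target onto a vibration intensity: full strength when the user points straight at the target, fading to zero at the edge of a 'beam'. This is a simplified, hypothetical reconstruction of the idea, not the toolkit's actual implementation.

```python
# Simplified sketch of tactile-compass style feedback: the smaller the
# angular error between where the user points and where the target lies,
# the stronger the vibration. Hypothetical logic, not the toolkit's code.

def angular_error(heading_deg, bearing_deg):
    """Smallest signed angle (degrees) from heading to bearing, in (-180, 180]."""
    return (bearing_deg - heading_deg + 180.0) % 360.0 - 180.0

def vibration_intensity(heading_deg, bearing_deg, beam_width_deg=30.0):
    """Return a 0..1 intensity: full strength when pointing at the
    target, fading linearly to zero at the edge of the 'beam'."""
    err = abs(angular_error(heading_deg, bearing_deg))
    if err >= beam_width_deg:
        return 0.0
    return 1.0 - err / beam_width_deg

print(vibration_intensity(90.0, 90.0))   # 1.0 (pointing straight at the target)
print(vibration_intensity(90.0, 105.0))  # 0.5 (15 degrees off)
print(vibration_intensity(90.0, 180.0))  # 0.0 (outside the beam)
```

The modular arithmetic in `angular_error` handles the wrap-around at north correctly (e.g. heading 350°, bearing 10° gives an error of 20°, not 340°), which is the subtle part of any compass-style component.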
The toolkit includes two specialist APIs:
The LocPP API (branded as the JOINED API for iOS and Android) is an extract of the JOINED functionality that allows developers to embed 'friend finder' technology into their applications, with tactile and audible multimodality optimized through our research findings. LocPP has been developed as part of the toolkit, but for marketing purposes it has been branded as the JOINED API. The only difference is that the JOINED API comes together with a working server (the backend that manages users' positions) provided by GeoMobile. Developers can implement their own server and use LocPP for free, or use the JOINED API together with the JOINED server backend and pay a fee for using the server.
The SCPN API (Safe Corridor Pedestrian Navigation) is specifically designed for pedestrian navigation applications where non-visual interaction and the avoidance of obstacles are of paramount importance. More information can be found at the iOS part of the HaptiMap wiki.
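The 'safe corridor' idea behind SCPN can be illustrated with a small geometry sketch: a position counts as safe if it lies within a given half-width of some segment of the planned route. The helper functions below are hypothetical illustrations of this check (using a planar approximation for short distances), not part of the SCPN API.

```python
import math

# Hypothetical sketch of a 'safe corridor' check: is the pedestrian's
# position within half_width metres of any leg of the planned route?
# Planar approximation; not the SCPN API itself.

def point_segment_distance(px, py, ax, ay, bx, by):
    """Distance from point P to the line segment A-B."""
    abx, aby = bx - ax, by - ay
    apx, apy = px - ax, py - ay
    ab2 = abx * abx + aby * aby
    # Parameter t of the closest point on AB, clamped to the segment.
    t = 0.0 if ab2 == 0 else max(0.0, min(1.0, (apx * abx + apy * aby) / ab2))
    cx, cy = ax + t * abx, ay + t * aby
    return math.hypot(px - cx, py - cy)

def inside_corridor(px, py, route, half_width):
    """True if P lies within half_width of any segment of the route."""
    return any(point_segment_distance(px, py, *a, *b) <= half_width
               for a, b in zip(route, route[1:]))

# Route in local metres: 100 m east, then 50 m north.
route = [(0.0, 0.0), (100.0, 0.0), (100.0, 50.0)]
print(inside_corridor(50.0, 3.0, route, 5.0))    # True: 3 m off the first leg
print(inside_corridor(50.0, 20.0, route, 5.0))   # False: 20 m off the route
```

A navigation application would run a check like this on every position fix and trigger non-visual feedback (vibration, audio) as the user drifts toward the corridor edge.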
As part of the HaptiMap project, demonstrator applications have been developed for Android and iOS using the toolkit. These demonstrators illustrate one or more of the HCI concepts designed in the project. The demonstrators released to the app stores have achieved several thousand downloads, spreading the ideas of the project to more users than could ever be reached with traditional methods.
In HaptiMap we have also done original work aimed at supporting industrial design and development by providing tools that support the inclusion of a wide range of users and contexts in the design process. The HaptiMap context card bundle is a versatile tool that can be used during all phases of a design process. The context cards are complemented by a workbook containing background information, practical advice and additional design methods, as well as a workshop template developed to fit well in an industrial setting. Under design tools you will find guidelines and checklists, information on simulation and prototyping tools, standards and regulations, as well as evaluation materials and the evaluation tool available through the HaptiMap toolkit: the Virtual Observer.
Want to learn more? Welcome to the HaptiMap training site: http://www.haptimap-training.org/ (opens in a new window).
A four page text describing the work done in HaptiMap can be found in the HaptiMap final overview.
The HaptiMap project has now ended, but please follow us on the HaptiMap facebook page: https://www.facebook.com/Haptimap
- Written by Charlotte Magnusson
- Category: Information
- Hits: 7316
OFFIS was co-organizer of the "Second International Workshop on Location and the Web" (LocWeb 2009), held in conjunction with CHI 2009.
Workshop Theme and Goals
Location has become an important concept for many Web-based applications, in particular because of the increasing popularity of mobile Internet access. So far, however, location concepts are reinvented in many different places and these diverging concepts make it hard to use location-based services and data in a truly open way.
The Second International Workshop on Location and the Web (LocWeb 2009) targets the capabilities and constraints of Web-based location-oriented services, looking at browser-based applications, as well as at native applications using Web services.
The focus of this workshop is on exploring approaches for handling the complexity of location-based services and, more specifically, looking at location abstractions, location sharing, privacy issues, and interface design issues. The goal is to create a starting point for attaining better understanding of how the Web has to change to embrace location as a first-class concept and to bring together key scientists in all participating disciplines in order to stimulate open discussion. It is our hope that we will be able to achieve increased synergy of approaches between the disciplines engaged in Web technologies, HCI, UbiComp, and other related fields. More directly, we intend to encourage immediate interdisciplinary collaboration on future research topics. Young scholars and Ph.D. students are especially encouraged to submit papers and participate in the workshop.
For more details see: http://ifgi.uni-muenster.de/0/locweb2009/
- Written by Charlotte Magnusson
- Category: Information
- Hits: 8289
Workshop: Non-visual interaction in mobile/location based games
This hands-on workshop aims to explore how non-visual interaction should be designed to enhance the mobile gaming experience. The workshop will challenge the participants' imagination to create mock-up/lo-fi versions of non-visual or low-visual mobile game ideas, and it is led by Charlotte Magnusson and Kirsten Rassmus-Gröhn from the Department of Design Sciences at Lund University.
Charlotte Magnusson is an Associate Professor at the Department of Design Sciences at Lund University (Sweden). She has long worked with non-visual interaction design, and is currently leading a European research project (HaptiMap) aimed at more and better use of the non-visual senses in maps and navigational services.
Kirsten Rassmus-Gröhn, PhD, is a researcher at the Department of Design Sciences at Lund University, Sweden. She has been working with non-visual interaction for 10 years, and is currently leading a national project on a collaborative non-visual learning application for school use. She is also working in the HaptiMap project on user requirements capture and the design of evaluation prototypes for interaction with mobile navigation systems.