Finding myself at the crossroads of several disciplines, I've been fortunate to work on a range of topics under the umbrella of human-technology interaction research. In general,
I'm interested in developing practical, exciting and useful ways for people to interact with information in different contexts, whether on the desktop, through a
speech user interface in front of a large display in one's living room, or on a mobile phone while on the go.
Most of my early research, including my PhD work, took place in the Information Visualization and Visual Interaction research groups at the Tampere Unit for Computer-Human Interaction (TAUCHI). Between 2009 and 2015,
I worked in the Speech-based and Pervasive Interaction research group on a variety of projects -- from pervasive displays to UX in the metal industry and interfaces for people with disabilities.
Raspberry Pi cluster for teaching and learning computing
I'm working with students in the department's capstone course to set up, configure and deploy a 40-node Raspberry Pi 3 cluster. The goals of the project are to give students practice in developing cluster environments and to provide the department with a resource for our students and faculty to use for learning and research activities.
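As one small illustration of the kind of configuration work involved, the sketch below generates a hostname-to-IP inventory for a 40-node cluster. The naming scheme (pi01..pi40) and subnet are assumptions for illustration, not the course's actual setup.

```python
def cluster_inventory(count=40, prefix="pi", subnet="10.0.0"):
    """Map each node's hostname to a static IP on the cluster subnet.

    The prefix and subnet are illustrative defaults, not the real
    configuration used in the capstone course.
    """
    return {
        f"{prefix}{i:02d}": f"{subnet}.{i + 10}"  # reserve .1-.10 for the head node etc.
        for i in range(1, count + 1)
    }

inventory = cluster_inventory()
print(len(inventory))     # 40
print(inventory["pi01"])  # 10.0.0.11
```

An inventory like this can then be fed to a provisioning tool to configure all nodes in one pass.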
Interactive and Intelligent Collaboration Spaces
My current research at the University of Wisconsin-Stevens Point focuses on studying how new interaction technologies can be used to facilitate information access and collaboration in higher education settings.
I'm particularly interested in designing, building and studying pervasive collaborative environments that combine sensors, interactive surfaces and collaboration technologies into new kinds of learning and groupwork activities. My work is inspired by Saul Greenberg and colleagues' work on proxemic interactions and by David Benyon and colleagues' notion of blended spaces.
As of May 2020, we have completed the renovation of a blended learning environment in the Science building. The lab space offers 5 collaboration pods, a large touchscreen display and computing resources for deploying and testing prototypes. A variety of additional hardware (Raspberry Pi boards, microphones, and Microsoft Kinect sensors) is available for prototyping.
My PhD research focused on developing new user interface solutions for
mobile information access. I designed and evaluated new interfaces that help users judge the success of a query, as well as more easily filter,
compare and evaluate individual search results.
The key research questions related to mobile Web search interfaces and mobile information needs were the following:
- how can we provide effective overviews of the whole search result set?
- how can these overviews act as interaction mechanisms?
- how do users make use of the overviews?
- how can we use the metadata associated with search results to provide more insight into the relevance of individual search results?
- how can we study mobile search user experience?
- how can we effectively support mobile information needs in mobile Web search interfaces?
The results showed that the proposed solutions have benefits in certain situations (e.g., when the search engine fails to provide clearly relevant results). In particular, search result clustering could be beneficial for many purposes; however, the labeling and result organization need to be intuitive and ideally also context-sensitive.
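To make the labeling problem concrete, here is a deliberately naive sketch of keyword-based cluster labeling; it is not the approach from the dissertation, just an illustration of why simple labels often fail to be intuitive.

```python
from collections import Counter, defaultdict

# Illustrative stopword list; a real system would use a proper one.
STOPWORDS = {"the", "a", "an", "of", "for", "and", "in", "on", "to"}

def cluster_by_keyword(titles):
    """Group search result titles under their most frequent non-stopword.

    A toy labeling scheme: labels derived this way are often ambiguous,
    which is exactly the intuitiveness problem noted above.
    """
    clusters = defaultdict(list)
    for title in titles:
        words = [w.lower() for w in title.split() if w.lower() not in STOPWORDS]
        label = Counter(words).most_common(1)[0][0] if words else "other"
        clusters[label].append(title)
    return dict(clusters)
```

Even on a handful of results, the labels this produces depend heavily on word order and frequency, showing why context-sensitive labeling matters.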
The research was funded through the
UCIT Doctoral Program in User-Centered Information Technology between
2006 and 2009. I defended my dissertation titled Design and Evaluation of User Interfaces for Mobile Web Search in November 2012.
Over the years I've worked on projects that provide a cross-section of the research themes found within TAUCHI, all grounded in the constructive
development of user interfaces and interaction techniques.
Most recently at TAUCHI, I worked on several externally funded projects within
the SPI research group, ranging from designing novel interaction methods
for heavy industry to innovative learning environments and multimodal,
mobile interaction for underprivileged users in developing countries.
- Digital Services (2012-2015)
- In 2012-2013, our group developed multimodal, pervasive learning applications for tablet computers to facilitate a holistic learning experience in situ. With our first pilot system, Tammerkoski, local students were able to learn about the history of the iconic Tampere city region with the tablet app and create their own presentations about content captured in the wild. The pilot was extended into a mobile location-based learning platform called Seek'N'Share, which also includes a Web-based editor for creating the learning routes.
- UXUS (2010-2015)
- Our work in UXUS targeted the creation of novel interaction concepts
and prototypes for the metals and engineering industry to create a radically new user experience.
For example, we developed a gesture-based interaction concept for the automation industry
with Fastems, which won the FIMECC Prize in 2011. More information about
the benefits of focusing on user experience in B2B industry can be found in the UX Booklet produced by UXUS.
- RYM Indoor Environment (2011-2014)
- The project aimed at the user-centered design of indoor environments, especially
for learning and innovation. Our work contributed to the design and
implementation of multimodal interaction technologies and interactive environments as collaboration enablers. See the RYM website for more information about the project.
- In a joint project with There Corporation, Aalto University and VTT, we studied ways to make measurement data of everyday living (e.g., electricity consumption) interesting to tenants and thereby help drive consumption towards eco-friendly habits. The research took place within the DIGILE Internet of Things SRA. Our work focused on implementing measurement and home automation solutions based on affordable technologies such as the Raspberry Pi and Arduino.
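A minimal sketch of the kind of data summarization such a system needs: aggregating raw power readings into hourly averages before presenting them to tenants. The field names and units are illustrative assumptions, not the project's actual data model.

```python
from collections import defaultdict

def hourly_averages(readings):
    """Summarize raw power readings into per-hour averages.

    readings: iterable of (timestamp_seconds, watts) tuples, e.g. as
    sampled from a Raspberry Pi-based meter (hypothetical format).
    Returns {hour_start_seconds: mean watts} for each hour with data.
    """
    buckets = defaultdict(list)
    for ts, watts in readings:
        buckets[ts - ts % 3600].append(watts)  # bucket by hour start
    return {hour: sum(vals) / len(vals) for hour, vals in buckets.items()}
```

Aggregates like these are far easier for tenants to compare across days than raw samples, which is one way to make consumption data engaging.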
In the Space, Theatre & Experience - Novel Forms of Evental Space (DREX) project, one of the prototypes we developed was an experiential program guide for public display applications, which we piloted at a library to gather users' experiences. The application was developed in collaboration with the T7 Centre for Practise as Research in Theatre and the Turku European Capital of Culture 2011 foundation. The project was part of the Tila research program of the Finnish Funding Agency for Technology and Innovation. More details about the program guide are available on the DREX project web pages.
In the Do-it-Yourself Smart Experiences project (DiYSE) we developed applications and user interface solutions that enabled end users to
easily control and create new interactive, personalized and social
communication experiences. The project was funded through the ITEA 2 framework.
The focus of our work was on the development and
evaluation of a symbol based messaging client for users with cognitive
disabilities, in collaboration with Laurea University of Applied Sciences and the Rinnekoti Foundation.
The project studied new directions for symbiotic
technical and regulatory mechanisms for supporting ICT-related privacy.
I contributed to the development of a prototype system for sharing multisensory meeting information to foster awareness within a research organization in a privacy-conscious fashion. PRIMA (Privacy in the Making) was funded by Nordunet3.
Teknologiat ääneen, Puheeseen ja moniaistisuuteen perustuvaan LÄsnä-älyyn
(Ambient intelligence based on sound, speech and multisensor interaction) was
a Tekes funded research project that developed interaction methods for pervasive applications based on sound, speech, gestures, machine vision and their rich multimodal use. I collaborated with the
Speech-based and Pervasive Interaction group to produce new multimodal
solutions for controlling home entertainment for different user groups.
See the project pages at TUT for more information and a demo video of the application.
Search-In-a-Box was a Tekes funded project where we collaborated with HIIT to develop next
generation search engines. We developed and evaluated novel user interface solutions to support the proposed concept-based search approach.
iEye was an EU-funded project that studied the possibilities of including eye gaze as an input
channel in software applications. I participated in the development of a
prototype application user interface that used gaze
as an input mechanism to trigger contextual aid information (e.g., translations when reading
a document in a foreign language).
Skaalautuva tuki työryhmille ja ryhmätyölle (Scalable Support for Work Groups and Groupwork)
was a Tekes funded project in which we developed solutions for distributed groupwork
across devices (mobile devices, desktop and public spaces). We implemented a system
that had both asynchronous (forums and mailing list) and synchronous (public calendar,
chat) communication modes and piloted it in our research lab. More details can be found in our MobileHCI 2001 paper.