ADVISING
I very much enjoy working with talented and ambitious students, both in the classes I teach and in the projects I supervise.
Most of the projects I supervise are closely connected to my research activities within the field of Human-Computer Interaction and consist of designing, implementing (prototyping), and evaluating new ways of interacting with smartphones, smartwatches, or desktop computers.
If you are studying computer science at the Alpen-Adria-Universität Klagenfurt and are interested in doing your Bachelor's software project or writing your Master's thesis on a topic related to Human-Computer Interaction, please get inspired by the descriptions of open, ongoing, and already finished projects listed below.
Do not hesitate to contact me if you are interested in any of the listed projects, have questions about them, or have your own idea for an interesting project that you would like to start!
I am always looking for motivated and talented students to cooperate with on various PhD projects!
If you are interested in doing a PhD in HCI at the Alpen-Adria-Universität Klagenfurt, send me an email with a short description of your background so that we can get in contact and discuss the possibilities for a cooperation!
OPEN PROJECTS
Open • Bachelor Project
SENSOR-BASED INTERACTIONS FOR SMARTWATCHES
TECHNOLOGIES & TOOLS: Java • Android • SVM
Today's smartwatches are equipped with high-quality sensors, such as a gyroscope and an accelerometer. These sensors could open up new possibilities to extend the user's input vocabulary beyond the smartwatch's small touchscreen. This project aims at interpreting the sensor signals that are generated when the user gestures in the air or swipes a finger (on the hand where the watch is worn) across an uneven surface, and at mapping signal patterns to input commands.
So far, a software package has been implemented that enables long-term logging of the smartwatch's sensor data while the user goes about his or her daily activities (see the sketch below). The next two steps in this project can either be completed as two separate Bachelor projects by two groups of students (or as two single-person projects) or in combination as one Master project for a single student.
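To give a feel for what such long-term logging involves, below is a minimal, hypothetical sketch of how accelerometer and gyroscope samples can be captured on Android. The class name and the plain-text output are illustrative only and not part of the project's actual code:

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

/** Illustrative sketch: log raw accelerometer and gyroscope samples with timestamps. */
public class SensorLogger implements SensorEventListener {

    private final SensorManager sensorManager;

    public SensorLogger(SensorManager sensorManager) {
        this.sensorManager = sensorManager;
    }

    /** Start receiving sensor events (in the real project, from a long-running background service). */
    public void start() {
        Sensor accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        Sensor gyroscope = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE);
        sensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_GAME);
        sensorManager.registerListener(this, gyroscope, SensorManager.SENSOR_DELAY_GAME);
    }

    public void stop() {
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // event.values holds the x/y/z readings; a real logger would append
        // these lines to a file or database instead of printing them.
        System.out.printf("%d,%d,%.4f,%.4f,%.4f%n",
                event.timestamp, event.sensor.getType(),
                event.values[0], event.values[1], event.values[2]);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for plain logging.
    }
}
```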
PROJECT A – Analyzer Module
The analyzer module is a software package that can be used to analyse the logged raw sensor data and to identify regularities and patterns in the data. The core of the module will be a Support Vector Machine that can be trained to automatically classify sequences of sensor data into pattern categories.
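As a rough, hedged illustration of what that classification core could look like – not the project's actual code – the widely used libsvm Java library can be wired up along the following lines, assuming that fixed-length feature vectors have already been extracted from the sensor data:

```java
import libsvm.*;

/** Sketch: train an SVM on feature vectors extracted from sensor-data windows. */
public class PatternClassifier {

    private svm_model model;

    /** features[i] is one feature vector, labels[i] its pattern category. */
    public void train(double[][] features, double[] labels) {
        svm_problem problem = new svm_problem();
        problem.l = features.length;
        problem.y = labels;
        problem.x = new svm_node[features.length][];
        for (int i = 0; i < features.length; i++) {
            problem.x[i] = toNodes(features[i]);
        }

        svm_parameter param = new svm_parameter();
        param.svm_type = svm_parameter.C_SVC;
        param.kernel_type = svm_parameter.RBF;
        param.C = 1.0;                           // illustrative values; in practice
        param.gamma = 1.0 / features[0].length;  // these would be tuned via cross-validation
        param.cache_size = 100;
        param.eps = 1e-3;

        model = svm.svm_train(problem, param);
    }

    /** Returns the predicted pattern category for one feature vector. */
    public double classify(double[] featureVector) {
        return svm.svm_predict(model, toNodes(featureVector));
    }

    private static svm_node[] toNodes(double[] values) {
        svm_node[] nodes = new svm_node[values.length];
        for (int i = 0; i < values.length; i++) {
            nodes[i] = new svm_node();
            nodes[i].index = i + 1;   // libsvm feature indices are 1-based
            nodes[i].value = values[i];
        }
        return nodes;
    }
}
```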
PROJECT B – Test Module
The test module is a software package that provides the necessary functionality to run a user study in which participants wear a smartwatch while swiping a finger across a series of different everyday objects with non-flat (bumpy) surfaces, such as a comb, a bottle, or a computer keyboard. The intention of such a user study is to collect the vibration signals picked up by the sensors in the smartwatch and then compare these signals with the signals that are generated when a user goes about his or her daily activities (using the analyzer module from Project A).
Open • Bachelor Project
IN-AIR INTERACTION FOR DESKTOP COMPUTERS
TECHNOLOGIES & TOOLS: Java • Python • C++ (your pick!)
Recent advances in software and hardware technology for object tracking now make it possible to detect and accurately track the position of virtually any object that is moved around within the range of the tracking hardware. In this project we will explore how a Leap Motion Controller can be used to track a user's fingers and hands as they move on and above the keyboard, and back and forth between the keyboard and the mouse, while the user is surfing the Web.
The intention is to design suitable finger and hand gestures that can serve as alternative input mechanisms. For example, wiggling a finger up and down could be mapped to window scrolling, and a quick pointing gesture to the left could be used to reload the previous webpage.
In this project you will get familiar with the Leap Motion Controller and its API and then implement a »gesture module« that maps finger and hand tracking information to various interface actions, such as switching between applications, copy & paste, scrolling, or zooming.
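At its core, such a gesture module maps tracked motion to discrete interface commands. The sketch below uses a made-up HandFrame type as a stand-in for the per-frame data the Leap Motion API delivers (the real SDK has its own classes and richer data); the thresholds are arbitrary tuning values chosen for illustration:

```java
/** Illustrative stand-in for one frame of hand-tracking data (NOT the actual Leap Motion API). */
class HandFrame {
    double palmVelocityX;    // mm/s, positive = rightwards
    double indexFingerTipY;  // vertical position of the extended index finger, mm
}

/** Interface actions the gesture module can trigger. */
enum UiAction { SCROLL_UP, SCROLL_DOWN, RELOAD_PREVIOUS_PAGE, NONE }

/** Sketch of the core mapping logic: motion thresholds -> interface actions. */
class GestureModule {

    private static final double SWIPE_SPEED = 400.0;  // mm/s, tuning parameter
    private static final double WIGGLE_DELTA = 5.0;   // mm, tuning parameter
    private double lastFingerY = Double.NaN;

    UiAction map(HandFrame frame) {
        // A fast leftward palm movement is interpreted as »reload the previous webpage«.
        if (frame.palmVelocityX < -SWIPE_SPEED) {
            return UiAction.RELOAD_PREVIOUS_PAGE;
        }
        // Wiggling the index finger up and down is mapped to window scrolling.
        if (!Double.isNaN(lastFingerY)) {
            double delta = frame.indexFingerTipY - lastFingerY;
            lastFingerY = frame.indexFingerTipY;
            if (delta > WIGGLE_DELTA) return UiAction.SCROLL_UP;
            if (delta < -WIGGLE_DELTA) return UiAction.SCROLL_DOWN;
        } else {
            lastFingerY = frame.indexFingerTipY;
        }
        return UiAction.NONE;
    }
}
```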
Open • Master Project
SENSOR-BASED INTERACTIONS FOR SMARTWATCHES
TECHNOLOGIES & TOOLS: Java • Android • SVM
Today's smartwatches are equipped with several high-quality sensors, such as a gyroscope and an accelerometer. These sensors could open up new possibilities to extend the user's input vocabulary beyond the smartwatch's small touchscreen. This project aims at interpreting the sensor signals that are generated when the user gestures in the air or swipes a finger (on the hand where the watch is worn) across an uneven surface, and at mapping signal patterns to input commands.
So far, a software package has been implemented that enables long-term logging of a smartwatch's sensor data while the user goes about his or her daily activities. The next two steps consist of implementing two additional software modules: an »analyzer module« and a »test module«.
The analyzer module is a software package that can be used to analyse the logged raw sensor data and to identify regularities and patterns in the data. The core of the module will be a Support Vector Machine that can be trained to automatically classify sequences of sensor data into pattern categories. The test module is a software package that provides the necessary functionality to run a user study in which participants wear a smartwatch while swiping a finger across a series of different everyday objects with non-flat (bumpy) surfaces, such as a comb, a plastic bottle, or a computer keyboard. The intention of such a user study is to collect the vibration signals picked up by the sensors in the smartwatch and then compare these signals with the signals that are generated when a user goes about his or her daily activities (using the analyzer module).
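One plausible way to bridge the raw sensor log and a classifier such as the Support Vector Machine mentioned above is a sliding window with simple per-window statistics. The sketch below is an illustration only; the window length and the chosen statistics are assumptions, not the project's actual design:

```java
import java.util.ArrayList;
import java.util.List;

/** Sketch: cut a raw 1-D sensor stream into windows and compute simple per-window features. */
public class FeatureExtractor {

    /**
     * @param samples    raw sensor readings (e.g., accelerometer magnitude over time)
     * @param windowSize number of samples per window (e.g., 128)
     * @param step       how far the window slides between feature vectors (e.g., 64)
     */
    public static List<double[]> extract(double[] samples, int windowSize, int step) {
        List<double[]> features = new ArrayList<>();
        for (int start = 0; start + windowSize <= samples.length; start += step) {
            double mean = 0, min = Double.MAX_VALUE, max = -Double.MAX_VALUE;
            for (int i = start; i < start + windowSize; i++) {
                mean += samples[i];
                min = Math.min(min, samples[i]);
                max = Math.max(max, samples[i]);
            }
            mean /= windowSize;
            double variance = 0;
            for (int i = start; i < start + windowSize; i++) {
                variance += (samples[i] - mean) * (samples[i] - mean);
            }
            variance /= windowSize;
            // Each window becomes one feature vector: mean, variance, and range.
            features.add(new double[] { mean, variance, max - min });
        }
        return features;
    }
}
```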
This project can be split into two more or less separate Bachelor projects, one project for the »analyzer module« and one project for the »test module«.
Open • Master Project
LINE-BASED INPUT ON MOBILE DEVICES
TECHNOLOGIES & TOOLS: Java • Android • SPSS (for statistical analysis)
Small buttons and large fingers make input on mobile touchscreens slow and error-prone, particularly in situations where the user is moving, is in a shaky environment (such as on a bus or in a train), or needs to operate the device with only one hand (for example when carrying a shopping bag or holding a child's hand).
Line-based input techniques, where the user provides input by »drawing« short lines instead of hitting small rectangular buttons, show promise as alternative input methods on smartphones and smartwatches in constrained usage situations (e.g., when using one hand, in a shaky environment, or completely eyes-free).
A first version of a Java- and Android-based test application, which provides the necessary functionality to conduct user performance experiments with line-selection interfaces on tablets, smartphones, smartwatches, and finger rings, has already been implemented. Your first task in this project will be to finalize this test application and prepare it for user testing. When that is finished, we will design a user experiment together. The objective of the experiment is to find out how fast and accurately users can draw lines of different lengths in different directions on various devices and screen sizes, both with and without visual access to the screen and the input finger itself.
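For the experiment, each drawn line essentially reduces to a start point, an end point, and a duration, from which length and direction can be derived. A minimal sketch of that measurement step on Android (the class name and the log format are illustrative assumptions, not the actual test application):

```java
import android.view.MotionEvent;
import android.view.View;

/** Sketch: measure length, direction, and duration of a drawn line from touch events. */
public class LineInputRecorder implements View.OnTouchListener {

    private float startX, startY;
    private long startTime;

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                startX = event.getX();
                startY = event.getY();
                startTime = event.getEventTime();
                return true;
            case MotionEvent.ACTION_UP:
                float dx = event.getX() - startX;
                float dy = event.getY() - startY;
                double lengthPx = Math.hypot(dx, dy);
                // Angle in degrees, 0 = rightwards, measured counter-clockwise
                // (screen y grows downwards, hence the minus sign).
                double angleDeg = Math.toDegrees(Math.atan2(-dy, dx));
                long timeMs = event.getEventTime() - startTime;
                // In the test application these values would be written to the trial log.
                System.out.printf("length=%.1fpx angle=%.1f time=%dms%n",
                        lengthPx, angleDeg, timeMs);
                return true;
            default:
                return false;
        }
    }
}
```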
Accordingly, with this project you have the opportunity to deepen your programming skills (GUI programming, Android, and Java), and you will gain experience in designing, executing, and evaluating user experiments.
Open • Master Project
TYPEFACE READABILITY & EYETRACKING
TECHNOLOGIES & TOOLS: Java • Android • eyetracking • SPSS (for statistical analysis)
In the graphic design community and in the usability community there is an ongoing discussion about how fast and easily people can read text on computer monitors and smartphone screens. The readability of a piece of text is influenced by several factors, such as the letter size, the text and background colors, the line length, and the shape of the individual characters making up the text.
In this project you will first develop a software package that provides the functionality necessary to conduct a user study on readability. This includes functionality to configure different test cases (by defining combinations of various text attributes such as letter size, color, and typeface), functionality to present the different test cases to study participants, and functionality to log the time participants need to read the text in the different test cases.
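To make the test-case idea concrete, here is a small sketch of how combinations of text attributes could be enumerated into a trial list; the attribute names and types are assumptions for illustration, not the project's actual data model:

```java
import java.util.ArrayList;
import java.util.List;

/** One readability test case: a specific combination of text attributes. */
class TestCase {
    final int letterSizePt;
    final String typeface;
    final String textColor;
    long readingTimeMs = -1;   // filled in after the participant has read the text

    TestCase(int letterSizePt, String typeface, String textColor) {
        this.letterSizePt = letterSizePt;
        this.typeface = typeface;
        this.textColor = textColor;
    }
}

/** Sketch: build the full factorial combination of the configured attribute levels. */
class TestCaseFactory {
    static List<TestCase> fullFactorial(int[] sizes, String[] typefaces, String[] colors) {
        List<TestCase> cases = new ArrayList<>();
        for (int size : sizes)
            for (String face : typefaces)
                for (String color : colors)
                    cases.add(new TestCase(size, face, color));
        return cases;   // typically shuffled per participant before presentation
    }
}
```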
In the second part of this project you will gain insights and experience in conducting user experiments, in statistical data analysis, and in using eyetracking technology. Together, we will use your experimental software to conduct a user experiment with our lab's new eyetracking system from Ergoneers. After that, we will analyse the results of the experiment.
By leaving out the final experiment, this project can be adapted into a Bachelor project.
Open • Master Project
EVALUATING NON-STANDARD MENU DESIGNS
TECHNOLOGIES & TOOLS: Java • SPSS (for statistical analysis)
In Human-Computer Interaction (HCI) research – as well as in many other disciplines – new scientific knowledge and technological advances are often based on empirical research, where new ideas and theories are explored through hypothesis testing and controlled experiments. However, critical voices within the HCI research community question the value and use of controlled experiments in HCI.
In this project we will contribute to this discussion by redoing – replicating – a series of »famous« user experiments from the HCI literature. We will focus on experiments that have studied the usability of non-standard drop-down menus and how easily and quickly users can navigate menu structures and select the menu items they contain.
For this purpose, a first version of a »menu test suite« application has been developed. After further development and adaptation we can start replicating previous menu experiments. This includes carefully studying the descriptions of the previous experiments, then running the experiments with a group of computer users, and finally analyzing our results and comparing them with the previously reported results.
Accordingly, in this project you will acquire skills and experience in designing, conducting, and evaluating user experiments.
Open • Master Project
SELECTION OF MOVING SCREEN TARGETS
TECHNOLOGIES & TOOLS: Java • SPSS (for statistical analysis)
In many applications – such as air-traffic control, video surveillance, and computer games – the user needs to quickly and accurately select objects that are moving across the screen. Several previous research projects have proposed techniques that can assist the user when clicking on moving screen objects. The aim of this project is to compare such techniques and to build a theoretical model that mathematically describes and predicts how fast users can select targets moving across the screen (depending on the target's size and speed).
A first version of a Java application that provides the necessary functionality to conduct user experiments on the selection of moving screen objects has already been developed. In this project you will first extend this application with additional functionality and then design and conduct a user experiment that allows you to 1) verify previously reported research results on the effectiveness of various techniques that support the selection of moving screen objects, and 2) empirically build and verify a predictive performance model that explains how fast users can select moving screen objects.
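A natural starting point for the predictive model is the Shannon formulation of Fitts' law, MT = a + b · log2(D/W + 1), where D is the distance to the target and W its width; how the prediction has to be adjusted for a target's movement speed is exactly what the experiment is meant to establish empirically. A minimal sketch of this static baseline:

```java
/** Sketch: Shannon formulation of Fitts' law as a baseline predictive model. */
public class FittsModel {

    private final double a;   // intercept (s), estimated from experimental data
    private final double b;   // slope (s/bit), estimated from experimental data

    public FittsModel(double a, double b) {
        this.a = a;
        this.b = b;
    }

    /** Index of difficulty in bits for a target of width w at distance d. */
    public static double indexOfDifficulty(double d, double w) {
        return Math.log(d / w + 1.0) / Math.log(2.0);
    }

    /** Predicted movement time in seconds for a *static* target; how the prediction
     *  changes for a target moving at a given speed is what the planned experiment
     *  is intended to establish. */
    public double predictMovementTime(double d, double w) {
        return a + b * indexOfDifficulty(d, w);
    }
}
```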
Accordingly, with this project you have the opportunity to deepen your programming skills (using Swing, Java's GUI toolkit), and you will gain experience in designing, conducting, and evaluating user experiments as well as in theoretical modelling of user performance.
ONGOING PROJECTS
Ongoing • Bachelor Project • Jonas Schaltegger
BEND-INPUT FOR SMARTPHONE INTERACTIONS
TECHNOLOGIES & TOOLS: Arduino • Android
Arduino is a popular open-source hardware and software platform that can be used to »digitalize« physical objects or to enhance computing devices, such as smartphones, with additional sensors. In this project we use an Arduino microcontroller and a bend sensor to explore ways to create new input possibilities (beyond the standard touchscreen interactions) for smartphones. In the first project phase we focus on getting familiar with the Arduino Software (IDE) and its basic hardware (sensors and controller board). The second project phase will aim at reading data from the bend sensor and preparing this data for further use on an Android smartphone.
Ongoing • Bachelor Project • Marcel Mandl
TEXT EDITING & COPY-AND-PASTE ON TOUCHSCREENS
TECHNOLOGIES & TOOLS: different Web technologies
Quickly editing text or copying and pasting parts of text on smartphones and tablets can be challenging, and Android and iOS use slightly different interfaces and mechanisms for these frequent tasks. The goal of this project is a web application that allows us to test and measure how fast and accurately people can copy, paste, and edit text on touchscreens.
This includes functionality to:
  • configure different test cases for three task types: copy-and-paste of text, inserting text, and deleting text,
  • present the different test cases to study participants, and
  • record how fast and accurately study participants complete the different test cases.
Ongoing • Master Project • Bernhard Nitsch
TYPEFACE READABILITY
TECHNOLOGIES & TOOLS: Java • Android • Web
In the graphic design community and in the usability community there is an ongoing discussion about how fast and easily people can read text on computer monitors and smartphone screens. The readability of a piece of text is influenced by several factors, such as the letter size, the text and background colors, the line length, and the shape of the individual characters making up the text.
The intended outcome of this project is a software package that provides the functionality necessary to conduct user studies on readability. This includes functionality to configure different test cases (i.e., combinations of various text attributes such as letter size, text color, and typeface), functionality to present the different test cases to study participants, and functionality to log the time participants need to read the text in the different test cases.
FINISHED PROJECTS
Finished • Bachelor Project • September 2019
Demi Dauerer
macOS AND iOS UTILITY APP – iCAL EXPORT
TECHNOLOGIES & TOOLS: Swift • Cocoa • Xcode
Busy people with tight schedules and full calendars often have problems finding empty »slots« in their schedules for new appointments. Often, people negotiate the date and time of an upcoming meeting over email, sending lists of possible dates and times back and forth.
Remarkably, with electronic calendars such as the iCal app, it is difficult to quickly get an overview of the empty time slots between day X and day Y. Moreover, the user cannot generate a list that contains only empty time slots.
Accordingly, the outcome of this project was a utility app for macOS and iOS that reads the stored iCal calendar events and provides powerful filtering functionality to let the user easily view and export information about both upcoming events and empty time slots that are available for new meetings.
Finished • Bachelor Project • March 2018
Bernhard Nitsch
WEBSITE WITH GRAPHIC DESIGN EXERCISES
TECHNOLOGIES & TOOLS: HTML 5.0 • CSS • jQuery
Professional graphic designers and web designers use several powerful design techniques and »tricks« to create expressive and visually appealing designs. For most design novices, many of these techniques and »tricks« are rather easy to understand, but unfortunately many novices find it quite difficult to apply them successfully to their own designs. Accordingly, in this project we developed an interactive website where design novices can learn and practice many such useful design techniques and »tricks« through illustrative examples and exercises. The website includes explanatory materials and various exercises about typography, color theory, image cropping, composition, and layout. The website will be online soon!
Finished • Bachelor Project • March 2016
Sheel M. Schmölzer
MOUSE POINTING TECHNIQUES FOR MOVING TARGETS
TECHNOLOGIES & TOOLS: Java SWING
This project was a revival and extension of an earlier Bachelor project on selection techniques for moving targets (see the project »Acquisition of moving targets« below).
In this project we extended an already existing test application for a desktop setting in which several different pointing facilitation techniques from the literature had been implemented. The extension added two new assistive selection techniques, the so-called »Force Fields« and the »Fan Cursor«.
The objective was to test and find out how well each technique helps a user select targets that move around across the computer screen. We plan to compare and evaluate the efficiency of the techniques in a series of user studies.
The software was also extended with the functionality needed to conduct Fitts' Law pointing experiments. We were also interested in building a predictive performance model (based on Fitts' Law) that accurately predicts how fast a user can click on a moving target, with and without the assistance of the different pointing facilitation techniques.
Finished • Bachelor Project • June 2013
Joachim Frießer
OFF-SCREEN INTERACTIONS FOR SMARTPHONES
TECHNOLOGIES & TOOLS: Microsoft Kinect • OpenNI • C#
Motivated by the object-tracking software and hardware that was new at the time, in this project we explored how accurately a Microsoft Kinect sensor can detect and distinguish between different, very small finger movements and gestures. We used a pico projector mounted above a table to project virtual objects »in the air« above the table, and we designed and tested various in-air finger gestures that could be used to select objects »floating around« within a user's workspace.
Video
Finished • Bachelor Project • June 2013
Karl-Heinz Moll • Michael Orieschnig
ACQUISITION OF MOVING TARGETS
TECHNOLOGIES & TOOLS: Java SWING
In this project we created a test application for a desktop setting in which several different pointing facilitation techniques from the literature were implemented, such as the »Comet«, »Ghost«, and »Bubble Cursor« techniques. The objective was to test and find out how well each technique helps a user select targets that move across the computer screen. We plan to compare and evaluate the efficiency of the different techniques in user studies.
Finished • Bachelor Project • March 2012
Maria Joszt • Marco Wurzer
ICON POSITION RECALL
TECHNOLOGIES & TOOLS: Java SWING
In this project we implemented test software for studies on how well persons can recall the exact position of icons displayed in a graphical user interface.
The software allows us to test users' icon-location recall performance when icons are arranged in different layouts, such as row-based, column-based, or grid-based layouts. The software also allows us to test such arrangements with varying numbers of icons.
In subsequent work, we used the icon-testing software in such a user study. The results were published in the Proceedings of the 10th Conference of the Austrian Psychology Association, 2012 (see my publication list).
Finished • Master Project • March 2010
Vera Koren
DAS KIND UND DIE MAUS – DAS COMPUTERMAUSVERHALTEN VON KINDERGARTENKINDERN
[In German, English title: The Child and the Computer Mouse – Computer Mouse Behavior of Pre-School Children]
TECHNOLOGIES & TOOLS: Java • SPSS
In her work, Vera explored software enhancements aimed at helping small children operate a computer mouse accurately. In two point-and-click experiments with four-year-olds, Vera tested how well participants performed with Hourcade et al.'s PointAssist technique and how well they performed with various mouse cursor speeds. The results were later published at the International Conference on Interaction Design and Children in 2013 (see my publication list).
Thesis Abstract
This thesis examines how children between four and five years of age handle the computer mouse. The work centred on two essential questions: 1) »Is there THE optimal mouse speed for children, i.e., is there a mouse speed that suits children better than others?« and 2) »Does mouse acceleration, combined with Hourcade et al.'s PointAssist, help children in point-and-click tasks?«
To answer these questions, the state of research first had to be clearly mapped out, which also revealed how sparsely the topic has been treated. Furthermore, two experiments had to be conducted, and these form the practical core of the thesis.
The first experiment addressed the question »Is there THE optimal mouse speed for children, i.e., is there a mouse speed that suits children better than others?«. The children performed selection tasks using four different mouse speeds. The analysis of the results showed that there is indeed a mouse speed that suits children better than others: children completed tasks with speed »8« more accurately and faster than with the three other tested speeds.
The second experiment, which among other things aimed to verify a study conducted by Hourcade et al. in 2008, answered the second question. It led to a partial verification of Hourcade et al.'s study and examined what effect the combination of PointAssist and mouse acceleration has on children's clicking behaviour.
Finished • Bachelor Project • March 2010
Thorsten Dalmatiner • Marco A. Hudelist • Alexander Taurer
MULTI-MONITOR INTERACTION
TECHNOLOGIES & TOOLS: Java SWING
The outcome of this project was test software with which we explored different techniques that could be used to switch between open application windows in a multi-monitor setting. In a small user study we used the software to compare persons' window-switching performance when using the keyboard shortcut »Alt+Tab«, the Windows taskbar, and the mouse to click inside the windows.
This project was inspired by an earlier project on window-switching techniques that I worked on and published with colleagues from New Zealand in 2009 (see my publication list).
Finished • Bachelor Project • December 2008
Thomas Pairitsch • Florian Winkler
A TEST SUITE FOR EVALUATIONS OF MENU SYSTEMS
TECHNOLOGIES & TOOLS: Java SWING
Building on my earlier work on improving menu selection in graphical user interfaces, this project aimed at integrating various novel menu systems into a test application that could be used to test and compare different menu systems.
The resulting »menu-test-suite« includes a baseline (standard) menu and a set of non-standard menus:
  • Bubbling Menu (Tsandilas and schraefel)
  • Adaptive Activation Area Menu (Tanvir et al.)
  • Force Field Menu (from me)
  • Jumping Menu (from me)
  • Square Menu (from me and colleagues)
In later work with colleagues in New Zealand, I also extended the »menu-test-suite« to include so-called Pie Menus and used the suite to explore users' selection performance in radial menus. The results were published in the International Journal of Human-Computer Studies in 2012 (see my publication list).
Finished • Bachelor Project • November 2008
Nicole Traschitzger • Stefan Anderwald
TOUCH-TYPING
TECHNOLOGIES & TOOLS: Java SWING • ATouch resistive touch panel
When the first iPhone appeared in 2007, the HCI research community was excited about the new interaction possibilities its touch display offered, and a lot of research effort was invested in exploring touch-based interactions, such as scrolling, pinching, and tapping, on the small touch displays designed for mobile devices. But comparably little work existed that had explored how well users can use the ten-finger touch-typing technique on a touch display (which does not provide the haptic feedback that a physical keyboard does).
Accordingly, in this project we used one of the earliest available touch panels (designed to be mounted in front of a computer monitor; I believe we used a panel from ATouch Technologies) and developed a »touchable keyboard« to test and measure people's touch-typing performance.
Unfortunately, the results never got published. We were a bit too late, others published their results before us, and so the »novelty« was lost…
Finished • Bachelor Project • September 2008
Bernhard Meixner
APPOINTMENT BROKER
TECHNOLOGIES & TOOLS: PHP • MySQL
The outcome of this project was the web application »Termina«, with functionality for supporting the coordination and management of appointments among several users. Its core functionality included support for negotiating (voting) among participants about what day and time to meet, inviting new participants to already scheduled meetings, cancelling or rescheduling meetings, sending out notifications about changes, and sending out reminders of upcoming meetings. In short, pretty much like today's Doodle.
Finished • Master Project • March 2007
Manuela Graf
DIE DESKTOP-METAPHER – EINE EMPIRISCHE STUDIE UND IMPLIKATIONEN
[In German, English title: The Desktop Metaphor – an Empirical Study and Implications]
TECHNOLOGIES & TOOLS: Java • Java 3D • SPSS
Inspired by early research work on 3D visualizations (e.g., the »Data Mountain«) and by emerging frameworks for creating three-dimensional visualizations in desktop applications, such as Java 3D and Project Looking Glass, in this thesis Manuela investigated whether and how such technology could be used to develop a 3D desktop environment that supports the organization and effective management of computer files and application icons.
Thesis Abstract
This thesis deals with the desktop metaphor. The development of the Xerox Alto started a new era in personal computing – that of the graphical user interface. The desktop became one of the central elements of many operating systems. At the time this thesis was written, the graphical user interfaces of several operating systems were undergoing further changes – the third dimension was making its entrance. So far, the desktop metaphor had been spared from these changes. This raises the following questions: How can the third dimension influence the presentation of the desktop? Is 3D suited to developing a completely different metaphor?
Answering these open questions first requires clarifying a considerably more fundamental question, namely how the two-dimensional desktop is actually used. Therefore, an empirical study was conducted as part of this thesis, dealing with the following two topics: 1) personal use of the desktop, and 2) an evaluation of opinions on the new three-dimensional technologies in the various graphical user interfaces.
54 persons, mainly users of the Windows operating system, answered questions about their desktop use in personal interviews. These persons also submitted screenshots of their desktops. From all the questions in the questionnaire, the topic of »icon arrangement strategies« was chosen for further discussion.
In view of the increasing introduction of three-dimensional presentation forms in graphical user interfaces, strategies for arranging icons on a future three-dimensional desktop were discussed and a proof-of-concept prototype was implemented. This prototype demonstrates how icons can be placed in three-dimensional space using the arrangement strategies »time« and »frequency of use«.
Finished • Bachelor Project • March 2007
Hans-George Beyer • Mario Guggenberger • Alexander Müller
Pub2Web – A PUBLICATION MANAGEMENT APPLICATION
TECHNOLOGIES & TOOLS: .NET • MySQL • XML
The outcome of this project was the desktop application »Pub2Web«, intended for researchers who need to manage a large number of scientific publications. The core functionality included:
  • publication management (storing, changing, and deleting publications and their attributes, e.g., title, authors, year, publisher, keywords),
  • search functionality to search among the stored publications,
  • creating and storing a subset of the stored publications as a reference list in various formats, such as bib, txt, and pdf, and
  • generating interactive websites from created reference lists that allow the web user to search among the listed publications.
With this functionality, the Pub2Web application worked pretty much like today's Mendeley or similar applications.
Finished • Master Project • November 2006
Jürgen Großmann
OPTIMIERUNG VON FENSTERMANIPULATIONSMECHANISMEN
[In German, English title: Optimized Window Manipulation Techniques]
TECHNOLOGIES & TOOLS: Java • SPSS
In his work, Jürgen invented and implemented several alternatives to the standard window manipulation techniques (e.g., dragging window borders and clicking the small buttons in the window corner) used for resizing, minimizing, maximizing, and moving application windows on desktop computers.
Jürgen's techniques included the use of various mouse gestures inside windows, moving the mouse cursor across window borders, and swiping across the keyboard keys with one hand. The new techniques were compared and evaluated against the standard techniques in two user experiments.
The results were later published in the proceedings of the OzCHI conference in 2009 (see my publication list).
Thesis Abstract
This thesis examines interaction with windows in graphical user interfaces. The goal is to find ways to make window manipulation more efficient. To this end, the possibilities available today are analysed, ranging from moving windows to switching between windows with the taskbar or Apple Exposé. Categorizing these actions and examining their evolution showed that little has changed in moving and resizing windows since the early days of window-based systems.
Because of this lack of development, the thesis focuses on these operations. To make them easier, new interaction methods are investigated, new controls are designed, and both are implemented as prototypes. The core of the thesis consists of two experiments that compare the implemented prototypes – which enable window manipulation with the new interaction methods and controls – with the standard method, using both mouse and touchpad as pointing devices.
The first experiment uses window manipulation tasks that can be solved with a single operation (in one step); here, the analysis is carried out separately for moving operations and resizing operations. The second experiment, in contrast, uses tasks in which moving and resizing operations have to be combined.
The experiments clearly showed that there is room for improvement in the examined window manipulation operations. In both experiments the standard method was the slowest interaction method and also very error-prone. The five implemented prototypes performed differently well depending on the type of operation (moving or resizing the window) and the pointing device, but in both experiments they achieved efficiency gains of, in some cases, more than 50%. This was also confirmed by questionnaires administered directly after the user tests, which asked for participants' subjective opinions on the efficiency of the individual prototypes.
Finished • Bachelor Project • March 2005
Christian Harrer • Andreas Hold • Martina Schelander
INTERACTIONS FOR WHITEBOARDS
TECHNOLOGIES & TOOLS: Java • Mimio Interactive Whiteboard
The Mimio Interactive Whiteboard hardware can easily be mounted on any dry-erase board (as commonly found in classrooms or meeting rooms) and thus enables digital capturing and storage of whatever is drawn or written with the accompanying pens. In this project, we explored ways to extend the out-of-the-box Mimio hardware and software with further functionality and interactive features.
Finished • Bachelor Project • February 2004
Sibylle Kattnig • Andrea Obiltschnig • David Tschische
INTERACTIVE LEARNING COMPONENTS FOR MEDIA INFORMATICS
TECHNOLOGIES & TOOLS: Java Applets
The outcome of this project was a set of interactive learning components designed for first-semester informatics students to help them understand and learn fundamental concepts and techniques used in computer science (e.g., Huffman coding, color histograms, and fractals).
Reflecting on this project today, over ten years later, the really interesting thing about it was that it was part of a larger project in cooperation with four other universities in Austria, called »MobiLearn«, which was the first Austrian initiative aimed at introducing a wireless infrastructure for learning and knowledge transfer. The main and new idea was to design digital learning materials that students could access »anywhere and anytime«.
Accordingly, the interactive learning content we created back then was designed for three different classes of end-user devices:
  1. mobile phones (today we would call the mobile phones used back then »feature phones«),
  2. PDAs (personal digital assistants, the precursors to our smartphones), and
  3. desktop or laptop computers.
Today, we would call the design approach we used in the early 2000s »responsive web design«!
Finished • Bachelor Project • June 2003
Jürgen Großmann • Matthias Missoni • Bernhard Reiterer
MOUSE CURSOR ADAPTATION USING FORCE FIELDS
TECHNOLOGIES & TOOLS: Java Swing
In this project we implemented Java Swing classes that can be used to enhance standard GUI components – such as buttons, scrollbars, and text fields – with what we called »invisible force fields«, which attract the screen cursor toward the component's center with the aim of making manipulation and selection of small GUI components faster and less error-prone.
In several subsequent projects, we measured the effect of such »invisible force fields« in various situations, such as making mouse operation easier for small children (see my publication list), helping adults select very small screen targets (see my publication list), or navigating hierarchical pull-down menus (see my publication list).
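The basic idea behind such a force field can be sketched in a few lines: each enhanced component pulls the cursor slightly towards its centre, with a pull that grows as the cursor gets closer. The linear falloff below is an illustrative choice, not necessarily the function used in the project:

```java
import java.awt.Point;
import java.awt.Rectangle;

/** Sketch: compute a small cursor displacement towards a component's centre. */
public class ForceField {

    private final Rectangle componentBounds;
    private final double radius;     // px, how far the field reaches
    private final double strength;   // max displacement per update, px

    public ForceField(Rectangle componentBounds, double radius, double strength) {
        this.componentBounds = componentBounds;
        this.radius = radius;
        this.strength = strength;
    }

    /** Returns the adjusted cursor position; unchanged outside the field. */
    public Point attract(Point cursor) {
        double cx = componentBounds.getCenterX();
        double cy = componentBounds.getCenterY();
        double dx = cx - cursor.x;
        double dy = cy - cursor.y;
        double dist = Math.hypot(dx, dy);
        if (dist == 0 || dist > radius) {
            return cursor;
        }
        // Linear falloff: full pull at the centre, no pull at the field border.
        double pull = strength * (1.0 - dist / radius);
        return new Point((int) Math.round(cursor.x + pull * dx / dist),
                         (int) Math.round(cursor.y + pull * dy / dist));
    }
}
```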
Finished • Bachelor Project • November 2001
Stefan Ellersdorfer • Johannes Jabornig
VEVAL – SPECIFICATION FOR AN ONLINE SELF-ASSESSMENT TOOL
TECHNOLOGIES & TOOLS: HTML • JavaScript
As part of a larger project in which we developed a web-based virtual electronic laboratory for electronic engineering students, this Bachelor project focused on the specification and initial implementation of the accompanying self-assessment module, with which students could test what they had learned in class and prepare themselves for upcoming exams.
Finished • Bachelor Project • July 2001
Britta Bierbaumer • Thomas Lessiak
3D WEB BOOKMARKS USING VRML
TECHNOLOGIES & TOOLS: VRML • HTML
In this project the participating students and I set out to learn VRML – the Virtual Reality Modeling Language (used back then to create three-dimensional-looking web content). Our goal was to use VRML to implement browser functionality that displays stored bookmarks as three-dimensional shapes. We targeted our exploration at our university's website and used the different departments' web pages. Looking at the stored screenshots of these old web pages today is quite fun: what great progress has been made in both technology and web aesthetics during the last 15 years!