Tuesday, October 19, 2010

Mindmap: Eye and Head Enhanced Web-Browsing

I have finished my mindmap. The program I used (and which I can highly recommend) is XMind, an open-source Java program.
This mindmap mainly contains the information drawn from my literature. The thoughts on my own project are not included. This mindmap is basically only the theoretical part of my thesis.

Here is the Dropbox link to my mindmap: ../E-N-H-anced_Browsing.xmind
It is released under a Creative Commons Attribution-No Derivative Works 3.0 Germany License (CC BY-NC-ND).

It contains references to the literature I used (the numbers in the bubbles). The links from the numbers to the actual texts only exist hand-written on the printed articles.
Maybe I will release my Citavi bibliography file later and add the reference numbers there. But until then, you will only be able to guess the literature behind my mindmap...

Duchowski: Neurological Substrate of the HVS

  • Field of view is inspected through brief fixations over small regions of interest
  • Central foveal vision lies between a visual angle of 1°-5° (this is the answer to the question in the previous post) => on a 21" monitor (distance ~60 cm), the user's attention covers only about 3% of the picture
  • Approx. 90% of viewing time is spent on fixations
This chapter of the book examines the neural substrate of the human visual system from the intuitive attentional perspective (pretty complex stuff...)
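The percentage can be checked with a quick back-of-the-envelope calculation. A Python sketch; the 5° foveal spot, the 60 cm distance and a 4:3 aspect ratio are my assumptions, so the exact figure varies with them, but it lands in the same low single-digit percent range:

```python
import math

# Foveal spot on screen: a disc subtending ~5 degrees at 60 cm distance
distance_cm = 60.0
spot_radius_cm = distance_cm * math.tan(math.radians(5.0 / 2))

# 21" 4:3 monitor: width and height follow from the diagonal
diag_cm = 21 * 2.54
width_cm = diag_cm * 4 / 5    # 4:3 => width = 0.8 * diagonal
height_cm = diag_cm * 3 / 5

spot_area = math.pi * spot_radius_cm ** 2
screen_area = width_cm * height_cm
fraction = spot_area / screen_area   # a few percent of the screen
```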

Thursday, September 30, 2010

Infrared Overkill

I decided to try an alternative to the IR LEDs mounted on the eye tracker. Inspired by a YouTube video by W. Piechulla (I think... I cannot find it again), I came up with the idea of using light bulbs as an IR source.
Conventional light bulbs emit about 95% of their energy as IR light and only 5% as visible light.
I made a test with two 40-watt bed lamps, but they proved to be too weak. So I bought two 100-watt IR lamps for 9 € each and set them up. Those are usually used for curing backaches... However, they now function as stationary IR sources on my desk, so that I can use GazeTracker in "Remote Tracking" mode with two glints. This way, I no longer have to keep my head still for accurate tracking... but now I have the full infrared overkill :)




Sunday, August 22, 2010

Experimenting with FaceAPI

How do I look??

Works well without the eye tracker
Works slightly worse with my eye tracker goggles on ;)

Howto: Hide the cursor in Windows

A problem I had during the development of my gaze-enhanced tab switching was that the mouse cursor diverted the user's gaze. That's why I decided to hide the cursor while gaze control is active.

A major problem here was that the .NET function Cursor.Hide() and the WinAPI call ShowCursor(false) only apply to the user control they are called from.
I searched for a way to hide the cursor globally in Windows.

One possibility that came to my mind was to temporarily change the Windows cursor (as you can also do via Control Panel -> Mouse Settings) to an invisible dummy cursor, which is basically just a .cur file without any content. The information about the currently active cursors is stored in the Windows registry, and reading from and writing to the registry is not that big of a problem in C#.

So here is what I had to do:
  1. Store the current cursor path
  2. Change the cursors "Arrow" and "Hand" to the invisible.cur file
  3. Force a reload of the registry
  4. Later: restore the original cursors again.
After the jump is the code that does what I wanted to accomplish. It is called on pressing the hotkey combination for "enable gaze control", which I set to the same hotkey combo as "show tab previews" in Firefox :)
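For reference, the registry entries behind step 2 live under HKEY_CURRENT_USER\Control Panel\Cursors; sketched as a .reg fragment (the path to invisible.cur is of course a placeholder):

```reg
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Control Panel\Cursors]
"Arrow"="C:\\Path\\To\\invisible.cur"
"Hand"="C:\\Path\\To\\invisible.cur"
```

The reload in step 3 can be forced with the WinAPI call SystemParametersInfo(SPI_SETCURSORS, 0, IntPtr.Zero, SPIF_UPDATEINIFILE | SPIF_SENDCHANGE).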

Saturday, August 14, 2010

C#: HowTo directly change a setting in an "*.exe.config" file

I had been searching for more than half a day for a way to directly change the values in the appSettings section of my application's *.exe.config. The problem is that the settings are read-only at runtime. I found no way to write to the *.exe.config file, and no class or method online that does this either.
So I created my own method that directly accesses the *.exe.config file by making use of its XML structure.
This way, I am able to add and change nodes and ultimately save the XML file.
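The same XML manipulation can be sketched briefly; this is only an illustration in Python's ElementTree (the original is C#, and the key names here are made up):

```python
import xml.etree.ElementTree as ET

SAMPLE = """<?xml version="1.0"?>
<configuration>
  <appSettings>
    <add key="SmoothingFactor" value="0.90" />
  </appSettings>
</configuration>
"""

def set_app_setting(root, key, value):
    """Update an <add key=... value=...> node, creating it if missing."""
    settings = root.find("appSettings")
    for node in settings.findall("add"):
        if node.get("key") == key:
            node.set("value", value)
            return
    ET.SubElement(settings, "add", key=key, value=value)

root = ET.fromstring(SAMPLE)
set_app_setting(root, "SmoothingFactor", "0.97")  # change an existing node
set_app_setting(root, "GazeControl", "on")        # add a new node
# the tree would then be written back with ET.ElementTree(root).write(path)
```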

Here is the code:

Wednesday, August 11, 2010

Processing Global Hotkey Events

I want to trigger an event every time the ESC key is pressed.
As shown below, this is not that hard when the program is in focus. If it is out of focus, however, the event is not triggered.
Thus, I need global hotkey processing.

I found a page where this approach is explained using a custom class that utilizes hooks.

More here:
http://www.tutorial-board.de/index.php?page=Thread&threadID=382
http://www.codeproject.com/KB/cs/globalhook.aspx
http://dotnet-snippets.de/dns/globale-hotkeys-tastenkombinationen-SID356.aspx
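The hook-based approach boils down to one global callback that dispatches on the key code. The dispatch logic alone, without the Win32 hook plumbing that the links above cover, might look like this Python toy (all names are mine):

```python
class HotkeyDispatcher:
    """Maps key names to callbacks; on_key plays the role of the global
    hook procedure that the linked articles install via the WinAPI."""

    def __init__(self):
        self._bindings = {}

    def register(self, key, callback):
        self._bindings[key] = callback

    def on_key(self, key):
        callback = self._bindings.get(key)
        if callback is None:
            return False        # not ours: let Windows process the key
        callback()
        return True             # handled: swallow the keystroke

dispatcher = HotkeyDispatcher()
pressed = []
dispatcher.register("ESC", lambda: pressed.append("ESC"))
```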

Handling the ESC Key in C#

Found this here: http://channel9.msdn.com/forums/TechOff/224745-Escape-Key-Event-Bug-C/?CommentID=224767
Escape Key is a command key, so you should achieve this with another trick, for example:

using System;
using System.Windows.Forms;

namespace HandleEscapeKeyDemo
{
    public partial class MainForm : Form
    {
        public MainForm()
        {
            InitializeComponent();
        }

        protected override Boolean ProcessCmdKey(ref Message msg, Keys keyData)
        {
            if (keyData == Keys.Escape)
            {
                OnPressEscapeKey();
                return true; // the key was handled here
            }
            // pass all other command keys on for normal processing
            return base.ProcessCmdKey(ref msg, keyData);
        }

        private void OnPressEscapeKey()
        {
            MessageBox.Show("Escape Key Is Pressed");
        }
    }
}


Sheva

WORKS!

Thursday, August 5, 2010

Trying to understand the smoothing algorithm from EyeWriter

The algorithm looks very simple:
There is one parameter, which controls the intensity of the smoothing effect.

The formula is structured like this:

NewCursorPosition =
       SmoothingFactor * OldCursorPosition
      + (1-SmoothingFactor) * ActualGazePosition

Code from testApp.cpp Line 62
   eyeSmoothed.x = CM.smoothing * eyeSmoothed.x + (1-CM.smoothing) * screenPoint.x;

One example for this:

NewPos = 0.97 * 487 + (1-0.97) * 705
              = 472.39 + 21.15
              ≈ 493.5


The difference between OldPos and NewPos is 493.5 - 487 ≈ 6.5 px. This means that the cursor slowly adapts to the actual gaze position without large jumps.
Increasing the smoothing factor leads to a much slower movement of the cursor; decreasing it results in abrupt jumping of the cursor, which gives it less stability.
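The update rule as a runnable sketch (Python here instead of the original C++; the factor 0.97 is the one from the example above):

```python
def smooth(old_pos, gaze_pos, smoothing=0.97):
    """EyeWriter-style exponential smoothing:
    new = smoothing * old + (1 - smoothing) * raw gaze sample."""
    return smoothing * old_pos + (1 - smoothing) * gaze_pos

pos = 487.0
pos = smooth(pos, 705.0)   # moves only a few pixels towards 705
```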

From a first impression, the GT algorithm does not work as well as this simple and lean EyeWriter algorithm, AND it uses much more complicated logic. Therefore, I could probably implement this very simple algorithm instead.

However, I need to understand the GT algorithm first in order to decide.

Friday, July 16, 2010

Some captures from my first trials with my eye tracker

ITU Gaze Tracker

Eye Writer Tracking Software

Calibration with ITU Software
 

Calibration in EyeWriter Software

Eye Tracking Game


Eye tracking game: Summary from theo tveteras on Vimeo.

I don't believe that this works as shown in the video!!
These people are wildly moving their heads and bodies... how is exact eye tracking possible under these conditions?
I found neither a link on how to assemble this nor a link to the software they are using.

Sunday, July 11, 2010

Built my first Eye Tracker

This is it!! I've built my own eye tracker, and I think I can be satisfied with the first try. I was able to remove the IR filter from the PS Eye and add an IR bandpass. I was also able to install the two IR LEDs and power them with two 1.5 V batteries. With a twisted aluminum wire, I could mount the tracker to a pair of 1 € sunglasses.


I did test runs with this construction using ITU GazeTracker and the EyeWriter tracking software.
Both showed favourable results.
I could already control a pointer with my eyes in both programs. However, it was not that accurate, perhaps due to my "temporary" eye tracker construction.
Accuracy was better with the ITU tracker, in my opinion.
TODO: Improve accuracy, either with a better eye tracker or by optimizing the software.

I found out that the EyeWriter software does not do any glint tracking, only pupil tracking.
The glint tracking in the ITU software seems to require brighter IR LEDs, as the tracking was sometimes poor.

Another problem I encountered is that the glasses are black, which may sometimes distract the tracker from my pupils.
TODO: Use differently colored glasses or put colored tape on my current ones

One problem occurred because of human anatomy: there is a large vein running next to my eyes. Wearing the sunglasses tightly on my skin causes them to bounce, because the vein gives an impulse every time my heart beats.
TODO: Find something I can use for cushioning

All in all, I think these are good results for a first try.
Looking forward to working on this interesting topic on Tuesday!

Saturday, July 10, 2010

Building my own Eye Tracker

TODO: Include pix and a step-by-step description

Trying to get the PS Eye to work:
Having successfully installed the driver on Windows 7 (x86), I started the test application to see if the PS cam is working.
Works...

However, I still don't get 60fps @ 640... what's the problem?
    Solution: http://codelaboratories.com/forums/viewthread/117/P10/

Problem after removing the IR Filter: the focal distance is a bit shorter => the cam cannot focus on near objects
    Solution: Shorten the mount with sandpaper!!
    Here: http://detsu.de/blog/index.php?post/2010/01/19/PS3eye-IR-filter-remove-focus


Hacking the PS Eye after the jump...

Tuesday, June 29, 2010

PS3 Eye

This Webcam is commonly used to create a DIY multi-touch display.
Thus, I believe it to be a good camera for my project.


Before the cam can be used for tracking, the IR blocking filter needs to be removed.
AFAIK, a filter to block visible light can be attached afterwards so that the IR light stands out.
This might result in better tracking results... here is why:
When you illuminate the eye with IR light and observe it through an IR sensitive camera with a visible light filter, the iris of the eye turns completely white and the pupil stands out as a high-contrast black dot. This makes tracking the eye much easier.
from http://www.instructables.com/id/The-EyeWriter/step8/Lite-it-up/

Here are some random links I found during my search:
Before the cam can be used for tracking, a lens with a shorter focal length (8mm) needs to be used (why exactly??). ITU uses a standard off-the-shelf webcam without any modifications. Probably this modification is just a bonus... However, what do I do now?? Buy a cam such as the one the ITU guys use, or use the hacked PS3 Eye??

I think this depends on my final choice... probably.
EyeWriter = PS3 cam and ITU = standard cam

      Saturday, June 19, 2010

      Manu Kumar: GUIDE: Gaze-Enhanced User Interface Design

      Stanford University (uploaded May 9, 2008) - April 13, 2007 lecture by Manu Kumar for the Stanford University Human-Computer Interaction Seminar (CS 547).

      A series of novel prototypes that explore the use of gaze as an augmented input to perform everyday computing tasks are presented. In particular, the use of gaze-based input for pointing and selection, application switching, password entry, scrolling, zooming, and document navigation is explored.

      http://www.youtube.com/watch?v=7-OmM31MvBw

      Interesting video, which provides a good overview of Kumar's PhD thesis.

      Thursday, June 17, 2010

      Duchowski: Visual Attention

      Motivation for eye tracking:
      • Gain insight into what the observer found interesting
      • provide a clue as to how that person perceived whatever scene she or he was viewing
      • Humans cannot attend to all things at once => what we look at is what we concentrate on
      1.1 Visual Attention: A Historical Review
      Early studies (over a century ago) focused on ocular observations

      Von Helmholtz (1925): "Where"
      visual attention tends to wander to new things
      it can be consciously directed to peripheral objects without making eye movement to that object
      eye movements reflect the will to inspect objects
      eye movements provide evidence of overt attention
      parafoveal: sth. is perceived (peripheral) and needs further inspection
      "where" to look next

      Duchowski: Preface

      Current Eye-Tracking devices fall within the fourth generation
      1. First generation: eye-in-head movement measurement of the eye consisting of techniques such as scleral contact lens/search coil, electro-oculography
      2. Second generation: photo- and video-oculography
      3. Third generation: analog video-based combined pupil/corneal reflection
      4. Fourth generation: digital video-based combined pupil/corneal reflection, augmented by computer vision techniques and Digital Signal Processors (DSPs)
      • Point of Regard (POR) is the most desired eye-tracking output (x, y coordinates)
      • Not provided by 1st and 2nd generation
      • Increasing usability and decreasing costs

      Thursday, June 3, 2010

      Gaze Tracker by ITU Gaze Group

      http://www.gazegroup.org/downloads/23-gazetracker

      This seems to be the most promising project!!

      They have an open-source lib for eye tracking. In combination with a head-mounted low-cost webcam, this might work pretty well.

      Works with the Microsoft Webcam from the other post.

      TODO:
      Find out, if there is an API for their Lib
      Try out IR webcam with IR LEDs
      Find a good way to create a head-mounted device: maybe basecap or glasses

      TrackEye

      TrackEye@Codeproject

      Could not get this to work... I should probably try again.

      Seems to focus mainly on face and eye detection. The page says that gaze detection should also be possible.

      ===== Added on 28.06.2010 =====


      Settings to be Done to Perform a Good Tracking

      Settings for Face & Eye Detection

      Under TrackEye Menu --> Tracker Settings
      • Input Source: video
      • Click on Select file and select ..\Avis\Sample.avi
      • Face Detection Algorithm: Haar Face Detection Algorithm
      • Check “Track also Eyes” checkBox
      • Eye Detection Algorithm: Adaptive PCA
      • Uncheck “Variance Check”
      • Number of Database Images: 8
      • Number of EigenEyes: 5
      • Maximum allowable distance from eyespace: 1200
      • Face width/eye template width ratio: 0.3
      • ColorSpace type to use during PCA: CV_RGB2GRAY

      Settings for Pupil Detection

      Check “Track eyes in details” and then check “Detect also eye pupils”. Click “Adjust Parameters” button:
      • Enter “120” as the “Threshold Value”
      • Click “Save Settings” and then click “Close”
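The threshold of 120 exploits the fact that under IR illumination the pupil is the darkest blob in the frame: threshold the image and take the centroid of the dark pixels. A toy Python sketch of that core idea (this is my illustration, not TrackEye's actual code):

```python
def pupil_centroid(frame, threshold=120):
    """Return the (x, y) centroid of all pixels darker than threshold,
    or None if nothing falls below it. frame is a list of pixel rows."""
    points = [(x, y)
              for y, row in enumerate(frame)
              for x, value in enumerate(row)
              if value < threshold]
    if not points:
        return None
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)

# toy frame: bright 100x100 background with a dark 10x10 "pupil"
frame = [[200] * 100 for _ in range(100)]
for y in range(40, 50):
    for x in range(60, 70):
        frame[y][x] = 30
```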

      Settings for Snake

      Check “Indicate eye boundary using active snakes”. Click “Settings for snake” button:
      • Select ColorSpace to use: CV_RGB2GRAY
      • Select Simple thresholding and enter 100 as the “Threshold value”
      • Click “Save Settings” and then click “Close”

      eyewriter.org

      http://www.eyewriter.org/

      DIY EyeTracking software and hardware.
      Enables a paralyzed graffiti artist to draw his pieces again.
      They use a PlayStation 3 Eye as the eye tracker and mount it on sunglasses. Two IR LEDs enable exact tracking.
      Seems to work pretty well.

      Open source!

      TODO: See if I can use this for my project

      Microsoft LifeCam Webcam VX-1000

      Microsoft LifeCam Webcam VX-1000

      This webcam can be easily modified to make use of IR light.
      This might be a good webcam to use with some head-mounting device, such as cheap sunglasses or a baseball cap.

      FaceTracking by Walter Piechulla

      http://www.walterpiechulla.de/ftr_by_wlp_08/doc/index.html

      This program uses the OpenCV Lib.
      I tested it with my webcam, and my eyes and face were recognized by the software. However, I have no IR webcam, so the eye tracking did not work correctly.
      It is astonishing how accurately the eyes and even the pupils are detected, even though the webcam was only mounted on my screen and not head-mounted. With an IR webcam, eye tracking might be possible in a remote setting.

      Unfortunately the software is not open source. The program is released under the GPL.

      Tuesday, May 11, 2010

      Research Seminar 11.05.2010: Multimodal Interaction (Presenter Prof. Wolff)

      Slides (02) on DropBox

      Notes:
      • Interaction becomes more human by using multimodal input devices (detecting mimic, voice or eye state and analyzing it according to human emotions)
      • Multimodal input gained popularity in the early 1990s
      • Various definitions of multimodality exist
      • "... process combined input modes in a coordinated manner..." (interaction manager)
      • How can multimodality and multimediality be separated? Cf. Nigay & Coutaz 1993: Multimodality implies information processing on a higher abstraction layer: „... multimodality is the capacity of the system to communicate with a user along different types of communication channels and to extract and convey meaning automatically. We observe that both multimedia and multimodal systems use multiple communication channels. But in addition, a multimodal system is able to automatically model the content of the information at a high level of abstraction. A multimodal system strives for meaning.“
      • W3c standard for multimodal applications/browser (in progress) (Candell & Raggett) (EMMA: Extensible MultiModal Annotation Language => I should read about that a bit)
      • Larson 2006: Common Sense Recommendations for designing multimodal user interfaces
      1. Satisfy Real-world Constraints
      2. Communicate Clearly, Concisely, and Consistently with Users
      3. Help Users Recover Quickly and Efficiently from Errors
      4. Make Users Comfortable
      • Do not overload the user with too many modalities, e.g. displaying and reading out text simultaneously; cf. ~Little's Law: the more WIP, the longer the time to process something, i.e. the user receives more information (WIP) and therefore needs longer to make a decision.
      • Question: To what extent can eye gaze be used intentionally? Looking is unintentional (a natural function of the eyes)... but can I use gaze gestures intentionally?

      Friday, May 7, 2010

      Presentation from TAUCHI - Tampere Unit for Computer-Human Interaction

      http://www.usability-onair.com/wp-content/uploads/2008/06/init-2008-eyetracking-02.pdf

      This provides some interesting ideas

      Opengazer

      I should try this out!!

      Opengazer
      open-source gaze tracker for ordinary webcams

      http://www.inference.phy.cam.ac.uk/opengazer/

      ===== Update =====
       Linux only :-(

      Friday, April 16, 2010

      16.04: Résumé

      Tried to understand Kumar's saccade detection algorithm, but I cannot quite grasp it.
      Perhaps reading "Identifying Fixations and Saccades in Eye-Tracking Protocols" might help.

      Found a fancy blog theme, tho' ;-)

      Wednesday, April 14, 2010

      Fixation Smoothing and Saccade Detection Algorithm

      Kumar, Klingner et al. – Improving the Accuracy of Gaze
      http://portal.acm.org/citation.cfm?id=1344488

      This article describes an algorithm that determines from the gaze data whether the user is currently starting a saccade or just making a microsaccade. If it is a microsaccade, the current gaze sample is not considered. Thus, better stability during fixations can be achieved.

      The algorithm addresses the problem of eye noise:
      fixations are not stable and the eye jitters during fixations due to drift, tremor and involuntary micro-saccades [Yarbus 1967]. This gaze jitter, together with the limited accuracy of eye trackers, results in a noisy gaze signal

      As the analysis of the data is done in real time, a minimal lag occurs: one data record is processed, and only afterwards is the mouse pointer set (or not). This results in a one-data-sample lag.

      Error rates with gaze pointing and selection are higher than with the mouse:
      In the paper describing EyePoint [Kumar et al. 2007b], it was reported that while the speed of a gaze-based pointing technique was comparable to the mouse, error rates were significantly higher.

      Relevance of this article
      In this paper we present three methods for improving the accuracy and user experience of gaze-based pointing: an algorithm for realtime saccade detection and fixation smoothing, an algorithm for improving eye-hand coordination, and the use of focus points. These methods boost the basic performance for using gaze information in interactive applications and in our applications made the difference between prohibitively high error rates and practical usefulness of gaze-based interaction.

      Method of the algorithm (TODO: I NEED TO UNDERSTAND THIS):
      To smooth the data from the eye tracker in real-time, it is necessary to determine whether the most recent data point is the beginning of a saccade, a continuation of the current fixation or an outlier relative to the current fixation. We use a gaze movement threshold, in which two gaze points separated by a Euclidean distance of more than a given saccade threshold are labeled as a saccade. This is similar to the velocity threshold technique described in [Salvucci and Goldberg 2000], with two modifications to make it more robust to noise. First, we measure the displacement of each eye movement relative to the current estimate of the fixation location rather than to the previous measurement. Second, we look ahead one measurement and reject movements over the saccade threshold which immediately return to the current fixation.
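As I read it, the method in the quote reduces to a distance-to-fixation test with a one-sample look-ahead. My attempt to restate it as a Python sketch; the threshold value and the running-mean fixation estimate are my guesses, not the paper's parameters:

```python
import math

def classify(points, threshold=40.0):
    """Label each gaze sample as the start of a saccade ('S'), part of
    the current fixation ('F'), or a rejected outlier ('O')."""
    labels = ['F']
    fix_x, fix_y, n = points[0][0], points[0][1], 1
    for i in range(1, len(points)):
        x, y = points[i]
        d = math.hypot(x - fix_x, y - fix_y)   # distance to fixation estimate
        if d <= threshold:
            # continuation of the current fixation: update the running mean
            n += 1
            fix_x += (x - fix_x) / n
            fix_y += (y - fix_y) / n
            labels.append('F')
            continue
        # look ahead one sample: if it returns to the fixation,
        # reject this point as an outlier instead of starting a saccade
        if i + 1 < len(points):
            nx, ny = points[i + 1]
            if math.hypot(nx - fix_x, ny - fix_y) <= threshold:
                labels.append('O')
                continue
        # genuine saccade: restart the fixation estimate here
        fix_x, fix_y, n = x, y, 1
        labels.append('S')
    return labels
```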

      Article with Gaze Hotspot Navigation

      Eye-S: a Full-Screen Input Modality for Pure Eye-based Communication

      ACM Link

      Abstract
      To date, several eye input methods have been developed, which, however, are usually designed for specific purposes (e.g. typing) and require dedicated graphical interfaces. In this paper we present Eye-S, a system that allows general input to be provided to the computer through a pure eye-based approach. Thanks to the “eye graffiti” communication style adopted, the technique can be used both for writing and for generating other kinds of commands. In Eye-S, letters and general eye gestures are created through sequences of fixations on nine areas of the screen, which we call hotspots. Being usually not visible, such sensitive regions do not interfere with other applications, that can therefore exploit all the available display space.


      The Single Gaze Gestures article refers to this one and suggests an improvement:
      In general the research has shown this approach to gaze gestures - where the complexity and range of gestures required for all letters and text editing functions, causes a heavy physiological and cognitive load – to be problematic. SSGs are an attempt at simplifying gestures to make them robust and reliable as well as keeping the cognitive load low.

      Article: Single gaze gestures

      This article contains ideas which might be relevant for my Gaze-Gestures Web-Browsing feature:
      ACM Link

      Emilie Mollenbach, Martin Lillholm et al. 2010 – Single gaze gestures

      Abstract:
      This paper examines gaze gestures and their applicability as a generic selection method for gaze-only controlled interfaces. The method explored here is the Single Gaze Gesture (SGG), i.e. gestures consisting of a single point-to-point eye movement. Horizontal and vertical, long and short SGGs were evaluated on two eye tracking devices (Tobii/QuickGlance (QG)). The main findings show that there is a significant difference in selection times between long and short SGGs, between vertical and horizontal selections, as well as between the different tracking systems.

      The article discusses various projects, which also use gaze gestures or something similar. Among them is text input through gestures, i.e. letters are created or selected by gestures.

      Problem with gestures:
      Cognitively it may be difficult to remember a large number of gestures and physiologically it may be difficult to create and complete them [Porta et al. 2008].

      Goal of the evaluation:
      This experiment was designed to explore the following three hypotheses. Firstly, does frame-rate and automatic smoothing on eye trackers have an effect on either the selection completion time or the selection error rate? - Secondly, is there a difference in completing selection saccades in different directions, i.e. horizontal and vertical? – And thirdly, is there a difference in the completion times of gestures depending on various lengths of the eye movements across a screen?

      Having read the article once, it is not quite clear to me what the authors mean by "gaze gestures".
      Do they have the same thing in mind as I do??

      Let's go!

      Blog created! Here we go!