Published on Apr 02, 2024
WELCOME TO THE SEMINAR ON EyePhone
Introduction
- EyePhone is a novel "hands-free" interfacing system capable of driving mobile applications/functions using only the user's eye movements and actions.
- EyePhone tracks the user's eye movement across the phone's display using the camera mounted on the front of the phone. More specifically, machine learning algorithms are used to:
  i) track the eye and infer its position on the mobile phone display as a user views a particular application; and
  ii) detect eye blinks that emulate mouse clicks to activate the target application under view.
- We present a prototype implementation of EyePhone on a Nokia N810, which is capable of tracking the position of the eye on the display and mapping this position to an application that is activated by a wink (one possible position-to-application mapping is sketched below).
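The seminar does not detail how a tracked eye position is mapped to an application. As an illustrative sketch only, assuming the display is divided into a coarse grid of application icons (the 3x3 grid size is a made-up parameter, not from the EyePhone design), the tracked position can simply be quantized into a grid cell:

```python
def position_to_cell(x, y, frame_w, frame_h, rows=3, cols=3):
    """Quantize a tracked eye position into a grid cell on the display.

    The 3x3 grid is an illustrative assumption: each cell stands for one
    application icon, and the returned (row, col) selects that icon.
    """
    col = min(int(x * cols / frame_w), cols - 1)
    row = min(int(y * rows / frame_h), rows - 1)
    return row, col
```

For example, position_to_cell(400, 120, 640, 480) returns (0, 1), i.e., the top-middle icon; a detected wink would then launch the application in that cell.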
- The front camera is the only hardware requirement of EyePhone. Most smartphones today are equipped with a front camera, and we expect many more to be introduced in the future in support of video conferencing on the phone.
- The EyePhone system uses machine learning techniques that, after detecting the eye, create a template of the open eye and use template matching for eye tracking (a sketch of this step follows).
- We implement EyePhone on the Nokia N810 tablet and present experimental results in different settings. These initial results demonstrate that EyePhone is capable of driving the mobile phone.
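The seminar does not publish the tracking code. A minimal sketch of the template-matching step, assuming OpenCV and a previously captured open-eye template, could look like this (the 0.6 confidence threshold is an assumed value):

```python
import cv2

def track_eye(frame_gray, open_eye_template, threshold=0.6):
    """Locate the open-eye template in a grayscale camera frame.

    Returns the top-left corner of the best match, or None when the
    correlation score drops below the assumed confidence threshold
    (e.g., because the eye is closed or the phone has moved).
    """
    # Normalized cross-correlation tolerates uniform lighting changes.
    scores = cv2.matchTemplate(frame_gray, open_eye_template,
                               cv2.TM_CCOEFF_NORMED)
    _, best, _, best_loc = cv2.minMaxLoc(scores)
    return best_loc if best >= threshold else None
```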
- Human-Phone Interaction (HPI) represents an extension of the field of HCI, since HPI presents new challenges that need to be addressed, driven specifically by issues of mobility, the form factor of the phone, and its resource limitations.
- More specifically, the distinguishing factors of the mobile phone environment are mobility and the lack of sophisticated hardware support, i.e., the specialized headsets, overhead cameras, and dedicated sensors that are often required to realize HCI applications.
- One of the immediate products of mobility is that a mobile phone is moved around through unpredictable contexts, i.e., situations and scenarios that are hard to foresee during the design phase of an HPI application.
- A mobile phone is subject to uncontrolled movement, i.e., people interact with their mobile phones while stationary, on the move, etc. It is almost impossible to predict how and where people are going to use their mobile phones. An HPI application should therefore operate reliably under any encountered condition.
- As opposed to HCI applications, an HPI implementation should not rely on any external hardware. Asking people to carry or wear additional hardware in order to use their phone might reduce the penetration of the technology.
- Moreover, state-of-the-art HCI hardware, such as glasses-mounted cameras or dedicated helmets, is not yet small enough to be comfortably worn for long periods of time.
- Any HPI application should rely as much as possible on the phone's on-board sensors alone.
The EyePhone algorithmic design breaks down into the following pipeline phases:
1) an eye detection phase;
2) an open eye template creation phase;
3) an eye tracking phase;
4) a blink detection phase.
In what follows, we discuss each of the phases in turn; a sketch of the full loop follows this list.
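The seminar lists only the phase names. The loop below is a hypothetical end-to-end sketch of how the four phases could be wired together; the stock Haar-cascade detector standing in for phase 1 and the 0.6 blink threshold are assumptions for illustration, not the EyePhone implementation:

```python
import cv2

def eyephone_loop(camera_index=0, blink_threshold=0.6):
    # Phase 1 stand-in: a stock Haar cascade detects the eye once.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")
    cap = cv2.VideoCapture(camera_index)
    template = None
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if template is None:
            eyes = cascade.detectMultiScale(gray)
            if len(eyes):
                x, y, w, h = eyes[0]
                template = gray[y:y + h, x:x + w]  # phase 2: open-eye template
            continue
        # Phase 3: track the eye by matching the template in each frame.
        scores = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
        _, best, _, loc = cv2.minMaxLoc(scores)
        if best < blink_threshold:
            print("blink detected")   # phase 4: low score ~ closed eye
        else:
            print("eye at", loc)      # feed into the position mapping
    cap.release()
```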
Eye Detection
- This phase consists of finding the contours of the eyes by applying a motion analysis technique that operates on consecutive frames (sketched below). The eye pair is identified by the left and right eye contours.
- While the original algorithm identifies the eye pair with almost no error when running on a desktop computer with a fixed camera, we obtain errors when the algorithm is implemented on the phone, due to the lower quality of the N810 camera compared to the desktop one and the unavoidable movement of the phone while it is in a person's hand.
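The motion-analysis technique itself is not spelled out here. As a minimal sketch, under the assumption that blinking produces localized motion between consecutive frames, the frame difference can be thresholded and its contours taken as candidate eye regions (the threshold and minimum-area values are illustrative):

```python
import cv2

def candidate_eye_contours(prev_gray, curr_gray, min_area=50):
    """Extract candidate eye contours from motion between two frames.

    A blink changes pixels mainly around the eyes, so contours in the
    thresholded frame difference are candidate eye regions; the eye
    pair would then be picked as two similar contours side by side.
    """
    diff = cv2.absdiff(prev_gray, curr_gray)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    # Dilation merges fragmented motion pixels into solid blobs.
    mask = cv2.dilate(mask, None, iterations=2)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) > min_area]
```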