|Date:||Wednesday, Aug. 28|
|Location:||Room 302, Neubrückstrasse 10|
The First-Person (Egocentric) Vision (FPV) paradigm enables the acquisition of images of the world from the user's perspective. Compared to standard Third-Person Vision, FPV is advantageous for building intelligent wearable systems able to assist the user and augment their abilities. Given their intrinsic mobility and their access to user-related information, FPV systems must cope with a continuously evolving environment. This poses many challenges, such as localization, context understanding, and intent understanding, which need to be addressed in order to deliver effective solutions.
In this talk, I will present recent research on First-Person Vision carried out in the Image Processing Laboratory (IPLAB) at the University of Catania. I will first focus on applications that rely on localization and context understanding to infer the user's behavior. I will then present EPIC-KITCHENS, a large-scale dataset for object-interaction understanding from First-Person Vision data, as well as recent work on the task of egocentric action anticipation.