Please use this identifier to cite or link to this item:
https://rda.sliit.lk/handle/123456789/3096
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Dhanawansa, V | - |
dc.contributor.author | Samarasinghe, P | - |
dc.contributor.author | Gardiner, B | - |
dc.contributor.author | Yogarajah, P | - |
dc.contributor.author | Karunasena, A | - |
dc.date.accessioned | 2022-11-30T05:47:44Z | - |
dc.date.available | 2022-11-30T05:47:44Z | - |
dc.date.issued | 2022-05-15 | - |
dc.identifier.citation | Dhanawansa, V., Samarasinghe, P., Gardiner, B., Yogarajah, P., Karunasena, A. (2022). The Automated Temporal Analysis of Gaze Following in a Visual Tracking Task. In: Sclaroff, S., Distante, C., Leo, M., Farinella, G.M., Tombari, F. (eds) Image Analysis and Processing – ICIAP 2022. ICIAP 2022. Lecture Notes in Computer Science, vol 13233. Springer, Cham. https://doi.org/10.1007/978-3-031-06433-3_28 | en_US |
dc.identifier.isbn | 978-3-031-06432-6 | - |
dc.identifier.uri | https://rda.sliit.lk/handle/123456789/3096 | - |
dc.description.abstract | The attention assessment of an individual in following the motion of a target object provides valuable insights into understanding one’s behavioural patterns in cognitive disorders including Autism Spectrum Disorder (ASD). Existing frameworks often require dedicated devices for gaze capture, focus on stationary target objects, or fail to conduct a temporal analysis of the participant’s response. Thus, to address the persisting research gap in the analysis of video capture of a visual tracking task, this paper proposes a novel framework to analyse the temporal relationship between the 3D head pose angles and object displacement, and demonstrates its validity via application to the EYEDIAP video dataset. The multivariate time-series analysis is two-fold: statistical correlation computes the similarity between the time series as an overall measure of attention, and the Dynamic Time Warping (DTW) algorithm aligns the two sequences and computes relevant temporal metrics. The temporal features of latency and maximum time of focus retention enabled an intragroup comparison of the participants’ performance. Further analysis disclosed valuable insights into the behavioural response of participants, including the superior response to horizontal motion of the target and the improvement in retention of focus on vertical motion over time, implying that following a vertical target initially proved a challenging task. | en_US |
dc.language.iso | en | en_US |
dc.publisher | Springer, Cham | en_US |
dc.relation.ispartofseries | ICIAP 2022: Image Analysis and Processing – ICIAP 2022;pp 324–336 | - |
dc.subject | Automated | en_US |
dc.subject | Temporal Analysis | en_US |
dc.subject | Gaze Following | en_US |
dc.subject | Visual Tracking Task | en_US |
dc.title | The Automated Temporal Analysis of Gaze Following in a Visual Tracking Task | en_US |
dc.type | Article | en_US |
dc.identifier.doi | https://doi.org/10.1007/978-3-031-06433-3_28 | en_US |
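The abstract above outlines a two-fold analysis: a statistical correlation between the head-pose and target-displacement series as an overall attention measure, and Dynamic Time Warping (DTW) to align the two sequences and derive temporal metrics such as latency. The following is a minimal illustrative sketch of that kind of pipeline, not the authors' implementation; the variable names (`yaw_series`, `target_x`), the synthetic sinusoidal inputs, and the latency estimate taken from the warping path are assumptions for demonstration only.

```python
# Minimal sketch (not the paper's code): compare a head-pose angle series
# against a target-displacement series using Pearson correlation and a
# basic O(n*m) Dynamic Time Warping alignment.
import numpy as np


def dtw_alignment(a: np.ndarray, b: np.ndarray):
    """Return the total DTW alignment cost and the warping path between a and b."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    # Backtrack from the end of both series to recover the warping path.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = int(np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]]))
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return cost[n, m], path[::-1]


# Synthetic stand-ins: the head yaw trails the horizontal target displacement.
t = np.linspace(0, 2 * np.pi, 100)
target_x = np.sin(t)              # target's horizontal displacement (assumed)
yaw_series = np.sin(t - 0.3)      # participant's head yaw, lagging the target

# Overall similarity: Pearson correlation between the two series.
correlation = np.corrcoef(yaw_series, target_x)[0, 1]

# Temporal alignment: DTW cost and warping path; the mean signed index offset
# along the path gives a rough latency estimate in frames (positive means the
# head response trails the target).
cost, path = dtw_alignment(yaw_series, target_x)
latency_frames = np.mean([i - j for i, j in path])
print(f"correlation={correlation:.2f}, dtw_cost={cost:.2f}, latency≈{latency_frames:.1f} frames")
```

In the paper the two series would come from 3D head-pose estimation on the EYEDIAP videos and the target's on-screen displacement; here synthetic sinusoids stand in for both.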
Appears in Collections: | Department of Information Technology |
Files in This Item:
File | Description | Size | Format
---|---|---|---
the Automated temporal.pdf (embargoed until 2050-12-31) | | 1.2 MB | Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.