VIP-Mobility360: From RNIB Voices to Real-World Data – A Comprehensive Video Dataset of Outdoor Mobility Obstacles for People with Visual Impairments

Khan, Wasiq ORCID: https://orcid.org/0000-0002-7511-3873, Topham, Luke ORCID: https://orcid.org/0000-0002-6689-7944 and Kinch, Cameron (2025) VIP-Mobility360: From RNIB Voices to Real-World Data – A Comprehensive Video Dataset of Outdoor Mobility Obstacles for People with Visual Impairments. [Data Collection]


Abstract

While a variety of object detection datasets exist, few address the specific, practical challenges faced by blind and visually impaired persons (VIPs) in real-world outdoor mobility contexts. Existing datasets typically focus on general-purpose objects or indoor scenes and overlook the dynamic and environmental factors that critically affect safe mobility in daily life. To address this gap, we introduce VIP-Mobility360, a novel, primary, annotated dataset uniquely designed to support AI models that enhance outdoor mobility for VIPs. VIP-Mobility360 is the first dataset curated in direct response to real-world obstacle reports drawn from Royal National Institute of Blind People (RNIB) publications and consultations, together with global case studies involving hazards such as e-scooters, bollards, wheelie bins, construction barriers, and pavement cracks. The dataset contains high-resolution video recordings (100 sequences per object class) captured entirely in outdoor environments, incorporating diverse weather conditions, background variations, scales, orientations, and 360° camera perspectives. This design ensures data realism, robustness, and applicability to real-time tracking, geometric understanding, and time-series forecasting. All instances are manually annotated to support supervised learning, object detection, depth estimation, and semantic segmentation. Beyond assistive technology, VIP-Mobility360 has broader relevance in areas such as autonomous vehicles (e.g., puddle and hazard detection), generative AI for geometric reconstruction, and urban planning for inclusive design. Moreover, by advancing the development of AI-driven smart canes, wearable mobility aids, and context-aware guidance systems, this dataset can reduce reliance on costly personal carers or guide dogs, offering a socially equitable and economically viable route to independent living for the blind community.
This dataset therefore represents a transformative step toward building trustworthy, inclusive, and ethically aligned AI applications in real-world healthcare and urban mobility contexts.
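For users preparing the dataset for supervised learning, a minimal indexing sketch is shown below. The directory layout, file naming, and the `index_dataset` helper are all assumptions for illustration (the record does not specify the on-disk structure); the class names are taken from the hazards listed in the abstract, and the sketch simply pairs each video sequence with a per-video annotation file if one is present.

```python
from pathlib import Path

# Hypothetical obstacle classes, taken from the hazards named in the abstract.
OBSTACLE_CLASSES = ["e_scooter", "bollard", "wheelie_bin",
                    "construction_barrier", "pavement_crack"]


def index_dataset(root):
    """Map each obstacle class to a list of (video, annotation) path pairs.

    Assumes a hypothetical layout of one folder per class, each holding
    .mp4 video sequences and (optionally) a same-named .json annotation
    file per video. Adapt the suffixes to the actual deposit contents.
    """
    root = Path(root)
    index = {}
    for cls in OBSTACLE_CLASSES:
        pairs = []
        for video in sorted((root / cls).glob("*.mp4")):
            ann = video.with_suffix(".json")
            pairs.append((video, ann if ann.exists() else None))
        index[cls] = pairs
    return index
```

Keeping the pairing logic separate from any decoding step lets the same index feed detection, depth-estimation, or segmentation pipelines without re-walking the directory tree.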

Creators: Khan, Wasiq ORCID: https://orcid.org/0000-0002-7511-3873, Topham, Luke ORCID: https://orcid.org/0000-0002-6689-7944 and Kinch, Cameron
Uncontrolled Keywords: Object detection dataset; Footpath dataset; Blind people navigation; Healthcare technology; Computer vision data; Street obstacles; 461103 Deep learning; 4611 Machine learning; 400701 Assistive robots and technology; 460304 Computer vision; 460805 Fairness, accountability, transparency, trust and ethics of computer systems
DOI: https://doi.org/10.24377/LJMU.d.00000226
Division: Computer Science & Mathematics
Field of Research: Engineering > Control engineering, mechatronics and robotics > Assistive robots and technology
Information and computing sciences
Information and computing sciences > Computer vision and multimedia computation > Computer vision
Information and computing sciences > Human-centred computing > Fairness, accountability, transparency, trust and ethics of computer systems
Information and computing sciences > Machine learning
Information and computing sciences > Machine learning > Deep learning
Date Deposited: 13 May 2025 10:57
Last Modified: 20 May 2025 14:38
URI: https://opendata.ljmu.ac.uk/id/eprint/226
Data collection method: Handheld mobile camera
Geographic coverage: United Kingdom
Resource language: English
Metadata language: English
Collection period:
From: 15 February 2022
To: 13 September 2022

Download

