Enhancing Gadgets for Blinds Through Scale Invariant Feature Transform

Raman Kumar, Uffe Kock Wiil

Research output: Chapter in Book/Report/Conference proceeding › Book chapter › Research › peer-review


Information and communication technology (ICT) can help blind people with movement and direction-finding tasks. This paper proposes a new methodology for safe mobility based on the scale invariant feature transform (SIFT) that is expected to lead to higher precision and accuracy. Various existing gadgets for the visually impaired are examined, and the conclusion is that the proposed methodology can enhance these gadgets.
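The chapter's own implementation is not reproduced in this record. As an illustration only, the core of SIFT's keypoint detection is finding local extrema in a difference-of-Gaussians (DoG) scale space; the sketch below is a minimal, unoptimized NumPy version of that step (function names, sigma values, and the threshold are illustrative assumptions, not the authors' code).

```python
import numpy as np

def gaussian_kernel1d(sigma):
    """1-D Gaussian kernel truncated at 3 sigma (illustrative helper)."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    return k / k.sum()

def blur(img, sigma):
    """Separable Gaussian blur: convolve rows, then columns."""
    k = gaussian_kernel1d(sigma)
    rows = np.apply_along_axis(np.convolve, 1, img, k, mode="same")
    return np.apply_along_axis(np.convolve, 0, rows, k, mode="same")

def dog_keypoints(img, sigmas=(1.0, 1.6, 2.56, 4.1), thresh=0.01):
    """Detect local extrema of the difference-of-Gaussians scale space.

    Returns (x, y, sigma) triples where a DoG value exceeds `thresh`
    and is a maximum or minimum over its 3x3x3 scale-space neighbourhood.
    """
    blurred = [blur(img, s) for s in sigmas]
    dogs = [b - a for a, b in zip(blurred, blurred[1:])]
    kps = []
    for i in range(1, len(dogs) - 1):
        stack = np.stack(dogs[i - 1:i + 2])  # (3, H, W) scale neighbourhood
        h, w = dogs[i].shape
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                v = dogs[i][y, x]
                if abs(v) < thresh:
                    continue
                cube = stack[:, y - 1:y + 2, x - 1:x + 2]
                if v == cube.max() or v == cube.min():
                    kps.append((x, y, sigmas[i]))
    return kps

# Usage: a synthetic bright blob of scale ~2 px yields a keypoint near its centre.
yy, xx = np.mgrid[0:64, 0:64]
img = np.exp(-((xx - 32) ** 2 + (yy - 32) ** 2) / (2 * 2.0 ** 2))
keypoints = dog_keypoints(img)
```

Because the detection works on ratios of Gaussian scales rather than absolute pixel sizes, the same feature is found when the image is rescaled; this is the scale invariance the paper relies on for robust recognition across viewing distances.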

Original language: English
Title of host publication: Recent Advances in Computational Intelligence
Editors: Raman Kumar, Uffe Kock Wiil
Place of publication: Heidelberg
Publication date: 2019
ISBN (Print): 9783030124991
ISBN (Electronic): 9783030125004
Publication status: Published - 2019
Series: Studies in Computational Intelligence


Keywords:
  • Partially-sighted and blind people
  • Scale invariant feature transform
  • Visually impaired

