Enhancing Gadgets for Blinds Through Scale Invariant Feature Transform

Raman Kumar, Uffe Kock Wiil

Research output: Chapter in Book/Report/Conference proceeding › Book chapter › Research › peer-review

Abstract

ICT can help blind people with movement and direction-finding tasks. This paper proposes a new methodology for safe mobility based on the scale invariant feature transform (SIFT) that is expected to provide higher precision and accuracy. Various existing gadgets for the visually impaired are examined, and the conclusion is that the proposed methodology can enhance these gadgets.
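The paper's methodology does not appear in this record, but as a hedged illustration of the technique the abstract names, the detection stage of SIFT — locating extrema in a difference-of-Gaussians (DoG) scale space — can be sketched in NumPy. This is a minimal sketch only: the scale levels and threshold below are illustrative choices, and a full SIFT pipeline would add keypoint refinement, orientation assignment, and descriptor extraction.

```python
import numpy as np

def gaussian_blur(img, sigma):
    # Separable Gaussian filter: convolve each column, then each row.
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x**2 / (2 * sigma**2))
    kernel /= kernel.sum()
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 0, img)
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, out)
    return out

def dog_keypoints(img, sigmas=(1.0, 1.6, 2.56, 4.1), thresh=0.03):
    # Build a difference-of-Gaussians stack from successive blur levels.
    blurred = [gaussian_blur(img.astype(float), s) for s in sigmas]
    dogs = [b2 - b1 for b1, b2 in zip(blurred, blurred[1:])]
    keypoints = []
    # A keypoint is a pixel that is an extremum among its 26 neighbours
    # (3x3 window in its own DoG level and the two adjacent levels).
    for i in range(1, len(dogs) - 1):
        d = dogs[i]
        for y in range(1, d.shape[0] - 1):
            for x in range(1, d.shape[1] - 1):
                v = d[y, x]
                if abs(v) < thresh:  # discard weak responses
                    continue
                neigh = np.concatenate([
                    dogs[i - 1][y - 1:y + 2, x - 1:x + 2].ravel(),
                    dogs[i][y - 1:y + 2, x - 1:x + 2].ravel(),
                    dogs[i + 1][y - 1:y + 2, x - 1:x + 2].ravel(),
                ])
                if v == neigh.max() or v == neigh.min():
                    keypoints.append((y, x, sigmas[i]))
    return keypoints
```

Run on an image, this returns `(row, col, scale)` triples for candidate keypoints; a blob-like structure produces an extremum near its centre at a scale comparable to its size, which is what makes the detections scale invariant.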

Original language: English
Title of host publication: Recent Advances in Computational Intelligence
Editors: Raman Kumar, Uffe Kock Wiil
Place of Publication: Heidelberg
Publisher: Springer
Publication date: 2019
Pages: 149-159
ISBN (Print): 9783030124991
ISBN (Electronic): 9783030125004
DOIs
Publication status: Published - 2019
Series: Studies in Computational Intelligence
Volume: 823
ISSN: 1860-949X

Keywords

  • Partially-sighted and blind people
  • Scale invariant feature transform
  • Visually impaired
