PRE2018 3 Group12

From Control Systems Technology Group



Revision as of 11:38, 11 February 2019

Group Members

Name Study Student ID
Harm van den Dungen Electrical Engineering 1018118
Nol Moonen Software Science 1003159
Johan van Poppel Software Science 0997566
Maarten Flippo Software Science 1006482


Problem Statement

People with a visual impairment will never be able to sense the world the way people without one do. Thanks to guide dogs and white canes, however, they can enjoy a degree of independence when navigating outdoor areas. Yet these aids cannot give a representation of the world beyond the range of the cane or the movement of the dog. With the use of technology, that might change. Using sensors, these people could be given the ability to sense more than their immediate surroundings: objects their white cane did not contact, or that the dog ignored because they were not in the way. Furthermore, physical phenomena such as the Doppler effect can be used to detect motion relative to the user, further enhancing the image a visually impaired person can obtain of the world.
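As a minimal sketch of the Doppler idea mentioned above: for a monostatic radar, a target moving with radial speed v shifts the reflected carrier by f_d = 2·v·f_c/c, so the measured shift can be inverted to a relative speed. The 24 GHz carrier below is only an illustrative value, not a component choice.

```python
# Sketch: relating a measured Doppler frequency shift to relative radial
# speed for a monostatic (two-way path) radar: v = f_d * c / (2 * f_c).
# The 24 GHz carrier is an assumed, illustrative value.

C = 299_792_458.0  # speed of light in m/s

def radial_speed(doppler_shift_hz: float, carrier_hz: float = 24e9) -> float:
    """Relative radial speed (m/s) of a target from its Doppler shift."""
    return doppler_shift_hz * C / (2.0 * carrier_hz)

# A pedestrian approaching at ~1.4 m/s shifts a 24 GHz carrier by
# roughly 2 * 1.4 * 24e9 / c ≈ 224 Hz.
```

A stationary sensor sees no shift from stationary objects, which is what lets a Doppler measurement separate moving obstacles from the static background.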

Users

The users we are designing the technology for are the visually impaired. People with a visual impairment often need aids to get through their daily life. For a blind or partially blind person, the simplest tasks can be hard to complete. While there are existing tools, such as guide dogs and white canes, these are not always sufficient.

The most important requirement of the technology is that it offers a valid alternative to existing aids. This does not necessarily mean that the technology supports the user's disability better than alternatives; it could also mean that it is simply cheaper. If the product is cheaper, it can still be an option for people unable to afford more costly alternatives. There are many factors determining the value of a product. Two important ones are the production and selling costs, and the support given and the usability of the technology.


State of the Art

The problem can be subdivided into two sub-problems: how the environment can be perceived to create data, and how this data can be communicated back to the user. A short summary of existing technologies follows:

Mapping the environment

There are many different technologies to observe an environment and map it to electrical signals. The most notable technologies include:

  • Geographic information system [1]
  • Bluetooth [2]
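Besides the systems listed above, many of the aids in the reference list (e.g. [4], [5], [9]) sense the environment with ultrasonic time-of-flight ranging. A minimal sketch of that principle, assuming sound travels at 343 m/s (air at roughly 20 °C):

```python
# Sketch: converting an ultrasonic echo round-trip time to a distance,
# the time-of-flight principle used by several referenced aids.
# 343 m/s assumes air at ~20 °C; real sensors also need temperature
# compensation and outlier filtering.

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C

def echo_distance_m(round_trip_s: float) -> float:
    """Distance to an obstacle; the round trip is halved (out and back)."""
    return round_trip_s * SPEED_OF_SOUND / 2.0

# An echo returning after about 11.66 ms corresponds to an obstacle
# roughly 2 m away.
```

This is the same calculation an HC-SR04-style sensor driver performs on the measured echo pulse width.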

Communicating to the user

Given that we are dealing with the visually impaired, we cannot convey the gathered information through a display. The most common alternatives are haptic feedback or audio cues, either spoken or generic tones.
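One simple way to turn a measured distance into haptic or audio feedback is a linear mapping: full intensity at contact, silence at maximum range. The 4 m range and the linear scale below are assumptions for illustration, not a description of any referenced system.

```python
# Sketch: mapping obstacle distance to a feedback level and an audio tone.
# The 4 m maximum range and the linear mapping are assumed values.

def feedback_level(distance_m: float, max_range_m: float = 4.0) -> float:
    """0.0 (no feedback) at max range and beyond, 1.0 at contact."""
    clamped = min(max(distance_m, 0.0), max_range_m)
    return 1.0 - clamped / max_range_m

def tone_hz(distance_m: float, low: float = 200.0, high: float = 1200.0) -> float:
    """Closer obstacles produce a higher-pitched tone."""
    return low + (high - low) * feedback_level(distance_m)
```

The same `feedback_level` value could drive the PWM duty cycle of a vibration motor instead of a tone, keeping the two output modalities consistent.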

Different designs

  • Gloves
  • Shoes
  • Glasses?
  • Adding sound, as bats do? -> Use the Doppler effect for motion detection -> This can be done in combination with radar -> long range

Deliverables

A prototype that aids blind people in navigating areas that are unknown to them. This prototype is based on last year's design[3]. From this design, a new design was made that tries to improve on the issues the previous design faced. Additionally, a wiki will be made that gives additional information about the prototype, such as costs and components, and provides some background on the subject. Finally, a presentation is made regarding the final design and prototype.

Requirements

Approach

To build this prototype for the users, in this case the visually impaired, it first has to be known what the actual problem is. To acquire this knowledge, information about the problem is gathered. Afterwards, using this problem description, information is gathered about the state of the art, most notably technologies for the visually impaired, radar sensors and radar Doppler sensors. All this information is then combined into a preliminary design that fills the needs of the users. After the preliminary design is finished, building the prototype can be started. During the making of the design and the building of the prototype, it is probable that some things will not go as planned, and it will be necessary to go back a few steps in order to improve the final design. When the prototype is finished, it is tweaked to perform as well as possible using several tests. Finally, everything will be documented in the wiki.

Milestones

  • Completing the design of the prototype
  • Finish building the prototype
  • Prototype is fully debugged and all components work as intended
  • Prototype follows requirements
    • 50%
    • 75%
    • 100%

References Harm

[4] [5] [6] [7] [8] [9] [10]

References Johan

[11] Glasses make a 3D interpretation of the environment being looked at, from which a soundscape is generated. Can be combined with a vibrating belt for more environmental awareness.

[12] Research used in the previous reference; it tries to make the best possible interpretation of the world in the form of sound. It is explicitly mentioned that this research is not aimed at obstacle avoidance. However, it can still be interesting for how they use stimuli to describe the environment. (How do we make something vibrate to convey something clearly?)

[13] Research into exactly what we want to do: object identification with sound and vibration. Also takes moving obstacles into account.

[14] Proposal of a detection method for smart gloves. Most important conclusion: the limitation of this project was that the ultrasonic sensor used can only detect obstacles, but cannot illustrate their shape.

[15] Final proposal: an Arduino-powered bracelet that uses vibrations to guide movements.

[16] Also a means of converting visual information into sound, this time also aimed at obstacle avoidance.

[17] Research into 'smart' improvements of the typical white cane, including recommendations for its design. This also covers using sensors to "see" obstacles beyond 2 m (the length of the cane).

References Maarten

[18] [19] [20] [21] [22] [23] [24]

References Nol

[25] Guiding blind users through unfamiliar environments through a user-friendly interface that is fed by a geographic information system.

[26] A Bluetooth based system for cell phones, focused on guiding visually impaired users through an urban intersection.

[27] Giving blind people the ability to move around in unfamiliar environments through a computer vision module and a user-friendly interface, named SmartVision.

[28] A triangulating laser telemeter adapted to space perception for the blind, consisting of a laser telemeter detecting distances, and an interface presenting these distances to the blind user.

[29] A system consisting of two stereo cameras and a portable computer for processing environmental information and transforming them into acoustical signals to aid blind people.

[30] A locomotion assistance device that can deliver semantic information about its surrounding environment at any time.

[31] A localizing system for the visually impaired, by using a single-body-mounted camera and computer vision techniques, instead of an inaccurate GPS system.

[32]


References

  1. Faria, J., Lopes, S., Fernandes, H., Martins, P., & Barroso, J. (2010). Electronic white cane for blind people navigation assistance. World Automation Congress (WAC), 2010, 1–7. Retrieved from https://ieeexplore.ieee.org/abstract/document/5665289/citations#citations
  2. Bohonos, S., Lee, A., Malik, A., Thai, C., & Manduchi, R. (2007). Universal real-time navigational assistance (URNA). In Proceedings of the 1st ACM SIGMOBILE international workshop on Systems and networking support for healthcare and assisted living environments - HealthNet '07
  3. Boekhorst, B, te. Kruithof, E. Cloudt, Stefan. Cloudt, Eline. Kamperman, T. (2017). Robots Everywhere PRE2017 3 Groep13. http://cstwiki.wtb.tue.nl/index.php?title=PRE2017_3_Groep13
  4. Pereira, A., Nunes, N., Vieira, D., Costa, N., Fernandes, H. & Barroso, J. (2015). Blind Guide: An ultrasound sensor-based body area network for guiding blind people. Procedia Computer Science, 67, 403–408. https://doi.org/10.1016/j.procs.2015.09.285
  5. Al-Mosawi, Ali. (2012). Using ultrasonic sensor for blind and deaf persons combines voice alert and vibration properties. Research Journal of Recent Sciences. 1. https://www.researchgate.net/publication/235769070_Using_ultrasonic_sensor_for_blind_and_deaf_persons_combines_voice_alert_and_vibration_properties
  6. T. Ifukube, T. Sasaki and C. Peng, "A blind mobility aid modeled after echolocation of bats," in IEEE Transactions on Biomedical Engineering, vol. 38, no. 5, pp. 461-465, May 1991. http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=81565&isnumber=2674
  7. Bousbia-Salah, M., Bettayeb, M. & Larbi, A. J Intell Robot Syst (2011) 64: 387. https://doi.org/10.1007/s10846-011-9555-7
  8. Bousbia-Salah M., Fezari M. (2007) A Navigation Tool for Blind People. In: Sobh T. (eds) Innovations and Advanced Techniques in Computer and Information Sciences and Engineering. Springer, Dordrecht. https://link.springer.com/chapter/10.1007%2F978-1-4020-6268-1_59
  9. P. Mihajlik, M. Guttermuth, K. Seres and P. Tatai, "DSP-based ultrasonic navigation aid for the blind," IMTC 2001. Proceedings of the 18th IEEE Instrumentation and Measurement Technology Conference. Rediscovering Measurement in the Age of Informatics (Cat. No.01CH 37188), Budapest, 2001, pp. 1535-1540 vol.3. http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=929462&isnumber=20096
  10. L. Dunai, G. P. Fajarnes, V. S. Praderas, B. D. Garcia and I. L. Lengua, "Real-time assistance prototype — A new navigation aid for blind people," IECON 2010 - 36th Annual Conference on IEEE Industrial Electronics Society, Glendale, AZ, 2010, pp. 1173-1178. http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5675535&isnumber=5674827
  11. Merrifield, R. (2017, 18 mei). How soundscapes and vibrations are helping blind people see the world. Geraadpleegd op 11 februari 2019, van https://horizon-magazine.eu/article/how-soundscapes-and-vibrations-are-helping-blind-people-see-world.html
  12. Johannesson, Omar & Balan, Oana & Unnthorsson, Runar & Moldoveanu, Alin & Kristjánsson, Árni. (2016). The Sound of Vision Project: On the Feasibility of an Audio-Haptic Representation of the Environment, for the Visually Impaired. Brain Sciences. 6. 20. 10.3390/brainsci6030020.
  13. Cardin, S., Thalmann, D., & Vexo, F. (2005). Wearable Obstacle Detection System for visually impaired People.
  14. Ghate, A. A., & Chavan, V. G. (2017). SMART GLOVES FOR BLIND. IRJET, 12(04), 1025–1028. Retrieved from https://www.irjet.net/volume4-issue12
  15. Brock, A., Kammoun, S., Macé, M., & Jouffrais, C. (2014). Using wrist vibrations to guide hand movement and whole body navigation. I-Com, 13(3). https://doi.org/10.1515/icom.2014.0026
  16. Bujacz, M., & Strumiłło, P. (2016). Sonification: Review of Auditory Display Solutions in Electronic Travel Aids for the Blind. Archives of Acoustics, 41(3), 401–414. https://doi.org/10.1515/aoa-2016-0040
  17. Kim, S. Y., & Cho, K. (2013). Usability and Design Guidelines of Smart Canes for Users with Visual Impairments. International Journal of Design, 7(1), 99–110. Retrieved from http://www.ijdesign.org/index.php/IJDesign/article/view/1209/559
  18. Cassinelli, Alvaro. Reynolds, C. Ishikawa, M. (2006). Augmenting spatial awareness with Haptic Radar. https://ieeexplore.ieee.org/abstract/document/4067727
  19. Mehta, U. Alim, M. Kumar, S. (2017). Smart path guidance mobile aid for visually disabled persons. https://www.sciencedirect.com/science/article/pii/S1877050917302089
  20. Lacey, G. Dawson-Howe K. (1998). The application of robotics to a mobility aid for the elderly blind. https://www.sciencedirect.com/science/article/pii/S0921889098000116
  21. Ram, S. Sharf, J. (2002). The people sensor: a mobility aid for the visually impaired. https://ieeexplore.ieee.org/abstract/document/729548
  22. Schwarze, T. Lauer, M, Schwaab, M. Romanovas, M. Böhm, S. Jürgensohn, T. (2015). A camera-based mobility aid for visually impaired people. https://link.springer.com/article/10.1007/s13218-015-0407-7
  23. Van Erp, J. Kroon, L. Mioch, T. Paul, K. (2017), Obstacle Detection Display for Visually Impaired: Coding of Direction, Distance, and Height on a Vibrotactile Waist Band. https://www.frontiersin.org/articles/10.3389/fict.2017.00023/full
  24. Wang, H. Katzschmann, R. Teng, S. Araki, B. Giarré, L. Rus, D. (2017). Enabling independent navigation for visually impaired people through a wearable vision-based feedback system. https://ieeexplore.ieee.org/abstract/document/7989772
  25. Faria, J., Lopes, S., Fernandes, H., Martins, P., & Barroso, J. (2010). Electronic white cane for blind people navigation assistance. World Automation Congress (WAC), 2010, 1–7. Retrieved from https://ieeexplore.ieee.org/abstract/document/5665289/citations#citations
  26. Bohonos, S., Lee, A., Malik, A., Thai, C., & Manduchi, R. (2007). Universal real-time navigational assistance (URNA). In Proceedings of the 1st ACM SIGMOBILE international workshop on Systems and networking support for healthcare and assisted living environments - HealthNet '07
  27. Fernandes, H., Costa, P., Filipe, V., & Hadjileontiadis, L. (2010). Stereo vision in blind navigation assistance. 2010 World Automation Congress. Retrieved from https://ieeexplore.ieee.org/abstract/document/5665579
  28. Farcy, R., & Bellik, Y. (2002). Locomotion assistance for the blind. In Universal Access and Assistive Technology (pp. 277–284). London: Springer London. https://doi.org/10.1093/humrep/deq021
  29. Dunai, L., Fajarnes, G. P., Praderas, V. S., Garcia, B. D., & Lengua, I. L. (2010). Real-time assistance prototype- A new navigation aid for blind people. In IECON Proceedings (Industrial Electronics Conference) (pp. 1173–1178). IEEE. https://doi.org/10.1109/IECON.2010.5675535
  30. Jacquet, C., Bourda, Y., & Bellik, Y. (2005). A Context-Aware Locomotion Assistance Device for the Blind. In People and Computers XVIII - Design for Life, Proceedings of HCI 2004 (pp. 315–328). London: Springer London. https://doi.org/10.1007/978-3-540-27817-7_64
  31. Truelliet, S., & Royer, E. (2010). Outdoor/indoor vision-based localization for blind pedestrian navigation assistance. International Journal of Image and Graphics, 10(04), 481–496. https://doi.org/10.1142/S0219467810003937
  32. A wearable assistive device for the visually impaired. (n.d.). Retrieved February 11, 2019, from http://www.guidesense.com/en/