
<italic>BrailleReader:</italic> Braille Character Recognition Using Wearable Motion Sensor

Published: 20 March 2024

Abstract

With the growing demand for improved communication and independence among visually impaired people, automatic Braille recognition has attracted increasing attention as a way to facilitate Braille learning and reading. However, current approaches typically require high-cost hardware, involve inconvenient operation, or interfere with normal touch reading. In this paper, we propose <italic>BrailleReader</italic>, a low-cost and effortless Braille character recognition system that does not disturb normal Braille touching. It exploits the wrist motion of Braille reading, captured by the motion sensor in ubiquitous wrist-worn devices, to infer the encoded character information. To address noise caused by other body and hand movements, we propose a novel noise cancellation method based on wavelet packet decomposition and reconstruction that isolates the clean wrist movement induced by Braille dots. Moreover, we explore the unique wrist movement pattern from three aspects to extract a novel and effective feature set. On this basis, <italic>BrailleReader</italic> leverages a spiking neural network-based model to recognize Braille characters robustly across different people and different surface materials. Extensive experiments with 48 participants demonstrate that <italic>BrailleReader</italic> achieves accurate and robust recognition of 26 Braille characters.
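The abstract describes denoising by wavelet packet decomposition and reconstruction: the sensor signal is split into subbands, the subbands dominated by interfering body motion are discarded, and the remainder is reconstructed. The paper's wavelet basis, decomposition depth, and subband-selection rule are not stated in this abstract, so the sketch below is purely illustrative: it uses a Haar filter and keeps only the lowest-frequency subbands, both of which are assumptions, not the authors' method.

```python
import numpy as np

def haar_step(x):
    """One level of orthonormal Haar analysis: low-pass and high-pass halves."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)  # approximation (low-pass)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)  # detail (high-pass)
    return a, d

def inv_haar_step(a, d):
    """Inverse of haar_step; interleaves the reconstructed even/odd samples."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def wp_decompose(x, level):
    """Full wavelet packet tree: split every node (not just the low-pass branch).
    Returns 2**level subbands in natural (Paley) order; note this is not a
    strict low-to-high frequency ordering (that would need Gray-code reindexing)."""
    nodes = [np.asarray(x, dtype=float)]
    for _ in range(level):
        nxt = []
        for n in nodes:
            a, d = haar_step(n)
            nxt.extend([a, d])
        nodes = nxt
    return nodes

def wp_reconstruct(nodes):
    """Merge sibling pairs level by level until the full-length signal returns."""
    while len(nodes) > 1:
        nodes = [inv_haar_step(nodes[i], nodes[i + 1])
                 for i in range(0, len(nodes), 2)]
    return nodes[0]

def wp_denoise(x, level, keep_lowest):
    """Zero all but the first `keep_lowest` subbands, then reconstruct.
    A real system would select subbands from the motion signal's spectrum."""
    nodes = wp_decompose(x, level)
    nodes = [n if i < keep_lowest else np.zeros_like(n)
             for i, n in enumerate(nodes)]
    return wp_reconstruct(nodes)
```

Because the Haar transform is orthonormal, `wp_reconstruct(wp_decompose(x, L))` recovers the input exactly (for power-of-two lengths), which is the property that lets a system drop noisy subbands and still faithfully rebuild the retained wrist-movement component.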


Published In: IEEE Transactions on Mobile Computing, Volume 23, Issue 11, Nov. 2024 (504 pages)
Publisher: IEEE Educational Activities Department, United States
Article type: Research article
