
Abstract

Assistive technologies (AT) enable people with disabilities to perform activities of daily living more independently, gain greater access to community and healthcare services, and be more productive in educational and employment tasks. The integration of artificial intelligence (AI) with electronic, robotic, and software platforms has revolutionized AT, yielding groundbreaking technologies such as mind-controlled exoskeletons, bionic limbs, intelligent wheelchairs, and smart home assistants. This article reviews the AI techniques that have helped people with physical disabilities, including brain–computer interfaces, computer vision, natural language processing, and human–computer interaction. Current challenges and future directions for AI-powered assistive technologies are also addressed.

/content/journals/10.1146/annurev-bioeng-082222-012531
2024-07-03
2024-07-04
Literature Cited

  1. 1.
    Abualghaib O, Groce N, Simeu N, Carew MT, Mont D. 2019.. Making visible the invisible: why disability-disaggregated data is vital to “leave no-one behind. Sustainability 11:(11):3091
    [Crossref] [Google Scholar]
  2. 2.
    Martin Ginis KA, Ma JK, Latimer-Cheung AE, Rimmer JH. 2016.. A systematic review of review articles addressing factors related to physical activity participation among children and adults with physical disabilities. . Health Psychol. Rev. 10:(4):47894
    [Crossref] [Google Scholar]
  3. 3.
    Quinby E, McKernan G, Eckstein S, Joseph J, Dicianno BE, Cooper RA. 2021.. The voice of the consumer: a survey of consumer priorities to inform knowledge translation among Veterans who use mobility assistive technology. . J. Mil. Veteran Fam. Health 7:(2):2639
    [Crossref] [Google Scholar]
  4. 4.
    Wald M. 2021.. AI data-driven personalisation and disability inclusion. . Front. Artif. Intell. 5::571955. https://doi.org/10.3389/frai.2020.571955
    [Crossref] [Google Scholar]
  5. 5.
    Starr PA. 2018.. Totally implantable bidirectional neural prostheses: a flexible platform for innovation in neuromodulation. . Front. Neurosci. 12::619
    [Crossref] [Google Scholar]
  6. 6.
    Cimolato A, Driessen JJM, Mattos LS, de Momi E, Laffranchi M, de Michieli L. 2022.. EMG-driven control in lower limb prostheses: a topic-based systematic review. . J. Neuroeng. Rehabil. 19::43
    [Crossref] [Google Scholar]
  7. 7.
    Pancholi S, Joshi AM. 2020.. Advanced energy kernel-based feature extraction scheme for improved EMG-PR-based prosthesis control against force variation. . IEEE Trans. Cybernet. 52:(5):381928
    [Crossref] [Google Scholar]
  8. 8.
    Nann M, Cordella F, Trigili E, Lauretti C, Bravi M, et al. 2021.. Restoring activities of daily living using an EEG/EOG-controlled semiautonomous and mobile whole-arm exoskeleton in chronic stroke. . IEEE Syst. J. 15:(2):231421
    [Crossref] [Google Scholar]
  9. 9.
    Spanias JA, Simon AM, Finucane SB, Perreault EJ, Hargrove LJ. 2018.. Online adaptive neural control of a robotic lower limb prosthesis. . J. Neural Eng. 15:(1):016015
    [Crossref] [Google Scholar]
  10. 10.
    Caulcrick C, Huo W, Hoult W, Vaidyanathan R. 2021.. Human joint torque modelling with MMG and EMG during lower limb human-exoskeleton interaction. . IEEE Robot. Autom. Lett. 6:(4):718592
    [Crossref] [Google Scholar]
  11. 11.
    Wu X, Liu DX, Liu M, Chen C, Guo H. 2018.. Individualized gait pattern generation for sharing lower limb exoskeleton robot. . IEEE Trans. Autom. Sci. Eng. 15:(4):145970
    [Crossref] [Google Scholar]
  12. 12.
    Mubin O, Alnajjar F, Jishtu N, Alsinglawi B, al Mahmud A. 2019.. Exoskeletons with virtual reality, augmented reality, and gamification for stroke patients’ rehabilitation: systematic review. . JMIR Rehabil. Assist. Technol. 6:(2):e12010
    [Crossref] [Google Scholar]
  13. 13.
    Giuffrida G, Panicacci S, Donati M, Fanucci L. 2022.. Assisted driving for power wheelchair: a segmentation network for obstacle detection on Nvidia Jetson Nano. . In Lecture Notes in Electrical Engineering, pp. 1006 Cham, Switzerland:: Springer Science and Business Media
    [Google Scholar]
  14. 14.
    Terzopoulos G, Satratzemi M. 2020.. Voice assistants and smart speakers in everyday life and in education. . Inform. Educ. 19:(3):47390
    [Crossref] [Google Scholar]
  15. 15.
    Masina F, Orso V, Pluchino P, Dainese G, Volpato S, et al. 2020.. Investigating the accessibility of voice assistants with impaired users: mixed methods study. . J. Med. Internet Res. 22:(9):e18431
    [Crossref] [Google Scholar]
  16. 16.
    Pancholi S, Joshi AM. 2019.. Time derivative moments based feature extraction approach for recognition of upper limb motions using EMG. . IEEE Sens. Lett. 3:(4):6000804
    [Crossref] [Google Scholar]
  17. 17.
    Pancholi S, Joshi AM. 2019.. Electromyography-based hand gesture recognition system for upper limb amputees. . IEEE Sens. Lett. 3:(3):5500304
    [Crossref] [Google Scholar]
  18. 18.
    Liu L, Liu P, Clancy EA, Scheme E, Englehart KB. 2013.. Electromyogram whitening for improved classification accuracy in upper limb prosthesis control. . IEEE Trans. Neural Syst. Rehabil. Eng. 21:(5):76774
    [Crossref] [Google Scholar]
  19. 19.
    Phinyomark A, Limsakul C, Phukpattaranont P. 2009.. A novel feature extraction for robust EMG pattern recognition. . arXiv:0912.3973 [cs.CV]
  20. 20.
    Al-Quraishi MS, Elamvazuthi I, Daud SA, Parasuraman S, Borboni A. 2018.. EEG-based control for upper and lower limb exoskeletons and prostheses: a systematic review. . Sensors 18:(10):3342
    [Crossref] [Google Scholar]
  21. 21.
    Das N, Nagpal N, Bankura SS. 2018.. A review on the advancements in the field of upper limb prosthesis. . J. Med. Eng. Technol. 42::53245
    [Crossref] [Google Scholar]
  22. 22.
    Pancholi S, Joshi AM. 2018.. Portable EMG data acquisition module for upper limb prosthesis application. . IEEE Sens. J. 18:(8):343643
    [Crossref] [Google Scholar]
  23. 23.
    Palumbo A, Gramigna V, Calabrese B, Ielpo N. 2021.. Motor-imagery EEG-based BCIs in wheelchair movement and control: a systematic literature review. . Sensors 21:(18):6285
    [Crossref] [Google Scholar]
  24. 24.
    Samuel OW, Geng Y, Li X, Li G. 2017.. Towards efficient decoding of multiple classes of motor imagery limb movements based on EEG spectral and time domain descriptors. . J. Med. Syst. 41:(12):194
    [Crossref] [Google Scholar]
  25. 25.
    Ansari MF, Edla DR, Dodia S, Kuppili V. 2019.. Brain-computer interface for wheelchair control operations: an approach based on fast Fourier transform and on-line sequential extreme learning machine. . Clin. Epidemiol. Glob. Health 7:(3):27478
    [Crossref] [Google Scholar]
  26. 26.
    Pancholi S, Joshi AM. 2020.. Improved classification scheme using fused wavelet packet transform based features for intelligent myoelectric prostheses. . IEEE Trans. Ind. Electron. 67:(10):851725
    [Crossref] [Google Scholar]
  27. 27.
    Pancholi S, Joshi AM. 2022.. Advanced energy kernel-based feature extraction scheme for improved EMG-PR-based prosthesis control against force variation. . IEEE Trans. Cybernet. 52:(5):381928
    [Crossref] [Google Scholar]
  28. 28.
    Tiwari M, Singhai R. 2017.. A review of detection and tracking of object from image and video sequences. . Int. J. Comput. Intell. Res. 13::74565
    [Google Scholar]
  29. 29.
    Wu D, King JT, Chuang CH, Lin CT, Jung TP. 2018.. Spatial filtering for EEG-based regression problems in brain-computer interface (BCI). . IEEE Trans. Fuzzy Syst. 26:(2):77181
    [Crossref] [Google Scholar]
  30. 30.
    Pancholi S, Giri A, Jain A, Kumar L, Roy S. 2023.. Source aware deep learning framework for hand kinematic reconstruction using EEG signal. . IEEE Trans. Cybernet. 53:(7):4094106 https://doi.org/10.1109/TCYB.2022.3166604
    [Crossref] [Google Scholar]
  31. 31.
    Borhani S, Kilmarx J, Saffo D, Ng L, Abiri R, Zhao X. 2019.. Optimizing prediction model for a noninvasive brain-computer interface platform using channel selection, classification, and regression. . IEEE J. Biomed. Health Inform. 23:(6):247582
    [Crossref] [Google Scholar]
  32. 32.
    Orekhov G, Luque J, Lerner ZF. 2020.. Closing the loop on exoskeleton motor controllers: benefits of regression-based open-loop control. . IEEE Robot. Autom. Lett. 5:(4):602532
    [Crossref] [Google Scholar]
  33. 33.
    Bao T, Zaidi SAR, Xie S, Yang P, Zhang ZQ. 2021.. Inter-subject domain adaptation for CNN-based wrist kinematics estimation using sEMG. . IEEE Trans. Neural Syst. Rehabil. Eng. 29::106878
    [Crossref] [Google Scholar]
  34. 34.
    Li Y, Yang H, Li J, Chen D, Du M. 2020.. EEG-based intention recognition with deep recurrent-convolution neural network: performance and channel selection by Grad-CAM. . Neurocomputing 415::22533
    [Crossref] [Google Scholar]
  35. 35.
    Elessawy RH, Eldawlatly S, Abbas HM. 2020.. A long short-term memory autoencoder approach for EEG motor imagery classification. . In 2020 International Conference on Computation, Automation and Knowledge Management (ICCAKM), pp. 7984 New York:: IEEE
    [Google Scholar]
  36. 36.
    Bao T, Zaidi SAR, Xie S, Yang P, Zhang ZQ. 2021.. A CNN-LSTM hybrid model for wrist kinematics estimation using surface electromyography. . IEEE Trans. Instrum. Meas. 70::2503809
    [Google Scholar]
  37. 37.
    Thatte N, Shah T, Geyer H. 2019.. Robust and adaptive lower limb prosthesis stance control via extended Kalman filter-based gait phase estimation. . IEEE Robot. Autom. Lett. 4:(4):312936
    [Crossref] [Google Scholar]
  38. 38.
    Deng X, Yu ZL, Lin C, Gu Z, Li Y. 2020.. Self-adaptive shared control with brain state evaluation network for human-wheelchair cooperation. . J. Neural Eng. 17:(4):045005
    [Crossref] [Google Scholar]
  39. 39.
    Nakagome S, Luu TP, Brantley JA, Contreras-Vidal JL. 2017.. Prediction of EMG envelopes of multiple terrains over-ground walking from EEG signals using an unscented Kalman filter. . In 2017 IEEE International Conference on Systems, Man, and Cybernetics (SMC), pp. 317578 New York:: IEEE
    [Google Scholar]
  40. 40.
    Wang Y, Wang F, Xu K, Zhang Q, Zhang S, Zheng X. 2015.. Neural control of a tracking task via attention-gated reinforcement learning for brain-machine interfaces. . IEEE Trans. Neural Syst. Rehabil. Eng. 23:(3):45867
    [Crossref] [Google Scholar]
  41. 41.
    Wang F, Wang Y, Xu K, Li H, Liao Y, et al. 2017.. Quantized attention-gated kernel reinforcement learning for brain-machine interface decoding. . IEEE Trans. Neural Netw. Learn. Syst. 28:(4):87386
    [Crossref] [Google Scholar]
  42. 42.
    Pasquina PF, Carvalho AJ, Sheehan TP. 2015.. Ethics in rehabilitation: access to prosthetics and quality care following amputation. . AMA J. Ethics 17::53546
    [Crossref] [Google Scholar]
  43. 43.
    Farina D, Jiang N, Rehbaum H, Holobar A, Graimann B, et al. 2014.. The extraction of neural information from the surface EMG for the control of upper-limb prostheses: emerging avenues and challenges. . IEEE Trans. Neural Syst. Rehabil. Eng. 22:(4):797809
    [Crossref] [Google Scholar]
  44. 44.
    Yang J, Su X, Bai D, Jiang Y, Yokoi H. 2016.. Hybrid EEG-EOG system for intelligent prosthesis control based on common spatial pattern algorithm. . In 2016 IEEE International Conference on Information and Automation (ICIA), pp. 126166 New York:: IEEE
    [Google Scholar]
  45. 45.
    Silva J, Heim W, Chau T. 2004.. MMG-based classification of muscle activity for prosthesis control. . In The 26th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 96871 New York:: IEEE
    [Google Scholar]
  46. 46.
    Cui JW, Li ZG, Du H, Yan BY, Lu PD. 2022.. Recognition of upper limb action intention based on IMU. . Sensors 22:(5):1954
    [Crossref] [Google Scholar]
  47. 47.
    Godiyal AK, Mondal M, Joshi SD, Joshi D. 2018.. Force myography based novel strategy for locomotion classification. . IEEE Trans. Hum. Mach. Syst. 48:(6):64857
    [Crossref] [Google Scholar]
  48. 48.
    Hu B, Rouse E, Hargrove L. 2018.. Fusion of bilateral lower-limb neuromechanical signals improves prediction of locomotor activities. . Front. Robot. AI 5::78
    [Crossref] [Google Scholar]
  49. 49.
    Calado A, Soares F, Matos D. 2019.. A review on commercially available anthropomorphic myoelectric prosthetic hands, pattern-recognition-based microcontrollers and sEMG sensors used for prosthetic control. . In 2019 IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC), pp. 16 New York:: IEEE
    [Google Scholar]
  50. 50.
    Simon AM, Turner KL, Miller LA, Hargrove LJ, Kuiken TA. 2019.. Pattern recognition and direct control home use of a multi-articulating hand prosthesis. . In 2019 IEEE 16th International Conference on Rehabilitation Robotics (ICORR), pp. 38691 New York:: IEEE
    [Google Scholar]
  51. 51.
    Vilela M, Hochberg LR. 2020.. Applications of brain-computer interfaces to the control of robotic and prosthetic arms. Handbook of Clinical Neurology, pp. 8799 Amsterdam:: Elsevier B.V.
    [Google Scholar]
  52. 52.
    Vansteensel MJ, Pels EGM, Bleichner MG, Branco MP, Denison T, et al. 2016.. Fully implanted brain-computer interface in a locked-in patient with ALS. . N. Engl. J. Med. 375:(21):206066
    [Crossref] [Google Scholar]
  53. 53.
    Kaya M, Binli MK, Ozbay E, Yanar H, Mishchenko Y. 2018.. Data descriptor: a large electroencephalographic motor imagery dataset for electroencephalographic brain computer interfaces. . Sci. Data 5::180211
    [Crossref] [Google Scholar]
  54. 54.
    Mouli S, Palaniappan R. 2019.. DIY hybrid SSVEP-P300 LED stimuli for BCI platform using EMOTIV EEG headset. . Open Science Framework. https://doi.org/10.17605/OSF.IO/8BC5S
    [Google Scholar]
  55. 55.
    Liu YH, Wang SH, Hu MR. 2016.. A self-paced P300 healthcare brain-computer interface system with SSVEP-based switching control and kernel FDA + SVM-based detector. . Appl. Sci. 6:(5):142
    [Crossref] [Google Scholar]
  56. 56.
    Ruhunage I, Perera CJ, Nisal K, Subodha J, Lalitharatne TD. 2017.. EMG signal controlled transhumerai prosthetic with EEG-SSVEP based approch for hand open/close. . In 2017 IEEE International Conference on Systems, Man, and Cybernetics (SMC), pp. 316974 New York:: IEEE
    [Google Scholar]
  57. 57.
    Gao H, Luo L, Pi M, Li Z, Li Q, et al. 2021.. EEG-based volitional control of prosthetic legs for walking in different terrains. . IEEE Trans. Autom. Sci. Eng. 18:(2):53040
    [Crossref] [Google Scholar]
  58. 58.
    Cheesborough J, Smith L, Kuiken T, Dumanian G. 2015.. Targeted muscle reinnervation and advanced prosthetic arms. . Semin. Plast. Surg. 29:(1):6272
    [Crossref] [Google Scholar]
  59. 59.
    Sartori M, Durandau G, Došen S, Farina D. 2018.. Robust simultaneous myoelectric control of multiple degrees of freedom in wrist-hand prostheses by real-time neuromusculoskeletal modeling. . J. Neural Eng. 15:(6):066026
    [Crossref] [Google Scholar]
  60. 60.
    Resnik L, Huang HH, Winslow A, Crouch DL, Zhang F, Wolk N. 2018.. Evaluation of EMG pattern recognition for upper limb prosthesis control: a case study in comparison with direct myoelectric control. . J. Neuroeng. Rehabil. 15:(1):23
    [Crossref] [Google Scholar]
  61. 61.
    Prahm C, Schulz A, Paaben B, Schoisswohl J, Kaniusas E, et al. 2019.. Counteracting electrode shifts in upper-limb prosthesis control via transfer learning. . IEEE Trans. Neural Syst. Rehabil. Eng. 27:(5):95662
    [Crossref] [Google Scholar]
  62. 62.
    Asogbon MG, Samuel OW, Geng Y, Oluwagbemi O, Ning J, et al. 2020.. Towards resolving the co-existing impacts of multiple dynamic factors on the performance of EMG-pattern recognition based prostheses. . Comput. Methods Programs Biomed. 184::105278
    [Crossref] [Google Scholar]
  63. 63.
    Idowu OP, Ilesanmi AE, Li X, Samuel OW, Fang P, Li G. 2021.. An integrated deep learning model for motor intention recognition of multi-class EEG signals in upper limb amputees. . Comput. Methods Programs Biomed. 206::106121
    [Crossref] [Google Scholar]
  64. 64.
    Sattar NY, Kausar Z, Usama SA, Farooq U, Shah MF, et al. 2022.. fNIRS-based upper limb motion intention recognition using an artificial neural network for transhumeral amputees. . Sensors 22:(3):726
    [Crossref] [Google Scholar]
  65. 65.
    Montanini L, del Campo A, Perla D, Spinsante S, Gambi E. 2018.. A footwear-based methodology for fall detection. . IEEE Sens. J. 18:(3):123342
    [Crossref] [Google Scholar]
  66. 66.
    Su BY, Wang J, Liu SQ, Sheng M, Jiang J, Xiang K. 2019.. A CNN-based method for intent recognition using inertial measurement units and intelligent lower limb prosthesis. . IEEE Trans. Neural Syst. Rehabil. Eng. 27:(5):103242
    [Crossref] [Google Scholar]
  67. 67.
    Lu Y, Wang H, Qi Y, Xi H. 2021.. Evaluation of classification performance in human lower limb jump phases of signal correlation information and LSTM models. . Biomed. Signal Process. Control 64::102279
    [Crossref] [Google Scholar]
  68. 68.
    Stolyarov R, Carney M, Herr H. 2021.. Accurate heuristic terrain prediction in powered lower-limb prostheses using onboard sensors. . IEEE Trans. Biomed. Eng. 68:(2):38492
    [Crossref] [Google Scholar]
  69. 69.
    Camargo J, Flanagan W, Csomay-Shanklin N, Kanwar B, Young A. 2021.. A machine learning strategy for locomotion classification and parameter estimation using fusion of wearable sensors. . IEEE Trans. Biomed. Eng. 68:(5):156978
    [Crossref] [Google Scholar]
  70. 70.
    Coker J, Chen H, Schall MC, Gallagher S, Zabala M. 2021.. EMG and joint angle-based machine learning to predict future joint angles at the knee. . Sensors 21:(11):3622
    [Crossref] [Google Scholar]
  71. 71.
    Zhong B, da Silva RL, Li M, Huang H, Lobaton E. 2021.. Environmental context prediction for lower limb prostheses with uncertainty quantification. . IEEE Trans. Autom. Sci. Eng. 18:(2):45870
    [Crossref] [Google Scholar]
  72. 72.
    Wei C, Wang H, Hu F, Zhou B, Feng N, et al. 2022.. Single-channel surface electromyography signal classification with variational mode decomposition and entropy feature for lower limb movements recognition. . Biomed. Signal Process. Control 74::103487
    [Crossref] [Google Scholar]
  73. 73.
    Sharma A, Rombokas E. 2022.. Improving IMU-based prediction of lower limb kinematics in natural environments using egocentric optical flow. . IEEE Trans. Neural Syst. Rehabil. Eng. 30::699708
    [Crossref] [Google Scholar]
  74. 74.
    Wang Y, Cheng X, Jabban L, Sui X, Zhang D. 2022.. Motion intention prediction and joint trajectories generation towards lower limb prostheses using EMG and IMU signals. . IEEE Sens. J. 22::1071929
    [Crossref] [Google Scholar]
  75. 75.
    Hamid H, Naseer N, Nazeer H, Khan MJ, Khan RA, Shahbaz Khan U. 2022.. Analyzing classification performance of fNIRS-BCI for gait rehabilitation using deep neural networks. . Sensors 22:(5):1932
    [Crossref] [Google Scholar]
  76. 76.
    Intisar M, Khan MM, Masud M, Shorfuzzaman M. 2022.. Development of a low-cost exoskeleton for rehabilitation and mobility. . Intell. Autom. Soft Comput. 31:(1):10115
    [Crossref] [Google Scholar]
  77. 77.
    Murray SA, Ha KH, Hartigan C, Goldfarb M. 2015.. An assistive control approach for a lower-limb exoskeleton to facilitate recovery of walking following stroke. . IEEE Trans. Neural Syst. Rehabil. Eng. 23:(3):44149
    [Crossref] [Google Scholar]
  78. 78.
    Molteni F, Gasperini G, Cannaviello G, Guanziroli E. 2018.. Exoskeleton and end-effector robots for upper and lower limbs rehabilitation: narrative review. . PM R 10::S17488
    [Crossref] [Google Scholar]
  79. 79.
    Bhagat NA, Venkatakrishnan A, Abibullaev B, Artz EJ, Yozbatiran N, et al. 2016.. Design and optimization of an EEG-based brain machine interface (BMI) to an upper-limb exoskeleton for stroke survivors. . Front. Neurosci. 10::122
    [Crossref] [Google Scholar]
  80. 80.
    Contreras-Vidal JL, Bhagat NA, Brantley J, Cruz-Garza JG, He Y, et al. 2016.. Powered exoskeletons for bipedal locomotion after spinal cord injury. . J. Neural Eng. 13::031001
    [Crossref] [Google Scholar]
  81. 81.
    Sun J, Shen Y, Rosen J. 2021.. Sensor reduction, estimation, and control of an upper-limb exoskeleton. . IEEE Robot. Autom. Lett. 6:(2):101219
    [Crossref] [Google Scholar]
  82. 82.
    Gambon TM, Schmiedeler JP, Wensing PM. 2020.. Effects of user intent changes on onboard sensor measurements during exoskeleton-assisted walking. . IEEE Access 8::22407182
    [Crossref] [Google Scholar]
  83. 83.
    Zhu A, Tu Y, Zheng W, Shen H, Zhang X. 2018.. Adaptive control of man-machine interaction force for lower limb exoskeleton rehabilitation robot. . In 2018 IEEE International Conference on Information and Automation (ICIA), pp. 74043 New York:: IEEE
    [Google Scholar]
  84. 84.
    Sandison M, Phan K, Casas R, Nguyen L, Lum M, et al. 2020.. HandMATE: wearable robotic hand exoskeleton and integrated Android app for at home stroke rehabilitation. . In 2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), pp. 486772 New York:: IEEE
    [Google Scholar]
  85. 85.
    Matsuda M, Iwasaki N, Mataki Y, Mutsuzaki H, Yoshikawa K, et al. 2018.. Robot-assisted training using Hybrid Assistive Limb® for cerebral palsy. . Brain Dev. 40:(8):64248
    [Crossref] [Google Scholar]
  86. 86.
    Vourvopoulos A, Jorge C, Abreu R, Figueiredo P, Fernandes JC, Bermúdez i Badia S. 2019.. Efficacy and brain imaging correlates of an immersive motor imagery BCI-driven VR system for upper limb motor rehabilitation: a clinical case report. . Front. Hum. Neurosci. 13::244
    [Crossref] [Google Scholar]
  87. 87.
    Ferrero L, Ortiz M, Quiles V, Iáñez E, Azorín JM. 2021.. Improving motor imagery of gait on a brain-computer interface by means of virtual reality: a case of study. . IEEE Access 9::4912130
    [Crossref] [Google Scholar]
  88. 88.
    Kumar D, González A, Das A, Dutta A, Fraisse P, et al. 2018.. Virtual reality-based center of mass-assisted personalized balance training system. . Front. Bioeng. Biotechnol. 5::85
    [Crossref] [Google Scholar]
  89. 89.
    Zhu M, Sun Z, Chen T, Lee C. 2021.. Low cost exoskeleton manipulator using bidirectional triboelectric sensors enhanced multiple degree of freedom sensory system. . Nat. Commun. 12:(1):2692
    [Crossref] [Google Scholar]
  90. 90.
    Liu DX, Xu J, Chen C, Long X, Tao D, Wu X. 2021.. Vision-assisted autonomous lower-limb exoskeleton robot. . IEEE Trans. Syst. Man Cybernet. Syst. 51:(6):375970
    [Crossref] [Google Scholar]
  91. 91.
    Benabid AL, Costecalde T, Eliseyev A, Charvet G, Verney A, et al. 2019.. An exoskeleton controlled by an epidural wireless brain-machine interface in a tetraplegic patient: a proof-of-concept demonstration. . Lancet Neurol. 18:(12):111222
    [Crossref] [Google Scholar]
  92. 92.
    del-Ama AJ, Gil-Agudo Á, Pons JL, Moreno JC. 2014.. Hybrid FES-robot cooperative control of ambulatory gait rehabilitation exoskeleton. . J. Neuroeng. Rehabil. 11::27
    [Crossref] [Google Scholar]
  93. 93.
    Long Y, Du Z-J, Wang W-D, Dong W. 2018.. Human motion intent learning based motion assistance control for a wearable exoskeleton. . Robot. Comput. Integr. Manuf. 49::31727
    [Crossref] [Google Scholar]
  94. 94.
    Ren JL, Chien YH, Chia EY, Fu LC, Lai JS. 2019.. Deep learning based motion prediction for exoskeleton robot control in upper limb rehabilitation. . In 2019 International Conference on Robotics and Automation (ICRA), pp. 507682 New York:: IEEE
    [Google Scholar]
  95. 95.
    Burns MK, Pei D, Vinjamuri R. 2019.. Myoelectric control of a soft hand exoskeleton using kinematic synergies. . IEEE Trans. Biomed. Circuits Syst. 13:(6):135161
    [Crossref] [Google Scholar]
  96. 96.
    Kang I, Kunapuli P, Young AJ. 2020.. Real-time neural network-based gait phase estimation using a robotic hip exoskeleton. . IEEE Trans. Med. Robot. Bionics 2:(1):2837
    [Crossref] [Google Scholar]
  97. 97.
    Secciani N, Topini A, Ridolfi A, Meli E, Allotta B. 2020.. A novel point-in-polygon-based sEMG classifier for hand exoskeleton systems. . IEEE Trans. Neural Syst. Rehabil. Eng. 28:(12):315866
    [Crossref] [Google Scholar]
  98. 98.
    Luo S, Androwis G, Adamovich S, Su H, Nunez E, Zhou X. 2021.. Reinforcement learning and control of a lower extremity exoskeleton for squat assistance. . Front. Robot. AI 8::702845
    [Crossref] [Google Scholar]
  99. 99.
    Ravindran AS, Malaya CA, John I, Francisco GE, Layne C, Contreras-Vidal JL. 2022.. Decoding neural activity preceding balance loss during standing with a lower-limb exoskeleton using an interpretable deep learning model. . J. Neural Eng. 19:(3):036015
    [Crossref] [Google Scholar]
  100. 100.
    Foroutannia A, Akbarzadeh-T MR, Akbarzadeh A. 2022.. A deep learning strategy for EMG-based joint position prediction in hip exoskeleton assistive robots. . Biomed. Signal Process. Control 75::103557
    [Crossref] [Google Scholar]
  101. 101.
    Furukawa JI, Okajima S, An Q, Nakamura Y, Morimoto J. 2022.. Selective assist strategy by using lightweight carbon frame exoskeleton robot. . IEEE Robot. Autom. Lett. 7:(2):389097
    [Crossref] [Google Scholar]
  102. 102.
    Fortune E, Cloud-Biebl BA, Madansingh SI, Ngufor CG, van Straaten MG, et al. 2022.. Estimation of manual wheelchair-based activities in the free-living environment using a neural network model with inertial body-worn sensors. . J. Electromyogr. Kinesiol. 62::102337
    [Crossref] [Google Scholar]
  103. 103.
    Leaman J, La HM. 2017.. A comprehensive review of smart wheelchairs: past, present, and future. . IEEE Trans. Hum. Mach. Syst. 47::48699
    [Crossref] [Google Scholar]
  104. 104.
    Wei L, Hu H, Lu T, Yuan K. 2010.. Evaluating the performance of a face movement based wheelchair control interface in an indoor environment. . In 2010 IEEE International Conference on Robotics and Biomimetics, pp. 38792 New York:: IEEE
    [Google Scholar]
  105. 105.
    Alam MM, Raihan MMS, Chowdhury MR, Shams AB. 2021.. High precision eye tracking based on electrooculography (EOG) signal using artificial neural network (ANN) for smart technology application. . In 24th International Conference on Computer and Information Technology, ICCIT 2021. New York:: IEEE
    [Google Scholar]
  106. 106.
    Dey P, Hasan MM, Mostofa S, Rana AI. 2019.. Smart wheelchair integrating head gesture navigation. . In 2019 International Conference on Robotics, Electrical and Signal Processing Techniques (ICREST), pp. 32934 New York:: IEEE
    [Google Scholar]
  107. 107.
    Rofer T, Mandel C, Laue T. 2009.. Controlling an automated wheelchair via joystick/head-joystick supported by smart driving assistance. . In 2009 IEEE International Conference on Rehabilitation Robotics, pp. 74348 New York:: IEEE
    [Google Scholar]
  108. 108.
    Kim J, Park H, Bruce J, Rowles D, Holbrook J, et al. 2016.. Assessment of the tongue-drive system using a computer, a smartphone, and a powered-wheelchair by people with tetraplegia. . IEEE Trans. Neural Syst. Rehabil. Eng. 24:(1):6878
    [Crossref] [Google Scholar]
  109. 109.
    Chowdhury SS, Hyder R, Shahanaz C, Fattah SA. 2017.. Robust single finger movement detection scheme for real time wheelchair control by physically challenged people. . In 2017 IEEE Region 10 Humanitarian Technology Conference (R10-HTC), pp. 77377 New York:: IEEE
    [Google Scholar]
  110. 110.
    Meena YK, Cecotti H, Wong-Lin K, Prasad G. 2017.. A multimodal interface to resolve the Midas-Touch problem in gaze controlled wheelchair. . In 2017 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), pp. 9058 New York:: IEEE
    [Google Scholar]
  111. 111.
    Bakouri M, Alsehaimi M, Ismail HF, Alshareef K, Ganoun A, et al. 2022.. Steering a robotic wheelchair based on voice recognition system using convolutional neural networks. . Electronics 11:(1):168
    [Crossref] [Google Scholar]
  112. 112.
    Abdulghani MM, Al-Aubidy KM, Ali MM, Hamarsheh QJ. 2020.. Wheelchair neuro fuzzy control and tracking system based on voice recognition. . Sensors 20:(10):2872
    [Crossref] [Google Scholar]
  113. 113.
    Huang Q, Zhang Z, Yu T, He S, Li Y. 2019.. An EEG-/EOG-based hybrid brain-computer interface: application on controlling an integrated wheelchair robotic arm system. . Front. Neurosci. 13::1243
    [Crossref] [Google Scholar]
  114. 114.
    Wang X, Xiao Y, Deng F, Chen Y, Zhang H. 2021.. Eye-movement-controlled wheelchair based on flexible hydrogel biosensor and WT-SVM. . Biosensors 11:(6):198
    [Crossref] [Google Scholar]
  115. 115.
    Giménez CV, Krug S, Qureshi FZ, O'Nils M. 2021.. Evaluation of 2D-/3D-feet-detection methods for semi-autonomous powered wheelchair navigation. . J. Imaging 7:(12):255
    [Crossref] [Google Scholar]
  116. 116.
    Juneja A, Bhandari L, Mohammadbagherpoor H, Singh A, Grant E. 2019.. A comparative study of SLAM algorithms for indoor navigation of autonomous wheelchairs. . In 2019 IEEE International Conference on Cyborg and Bionic Systems (CBS), pp. 26166 New York:: IEEE
    [Google Scholar]
  117. 117.
    Haddad MJ, Sanders DA. 2020.. Deep learning architecture to assist with steering a powered wheelchair. . IEEE Trans. Neural Syst. Rehabil. Eng. 28::298794
    [Crossref] [Google Scholar]
  118. 118.
    Taha T, Miro JV, Dissanayake G. 2011.. A POMDP framework for modelling human interaction with assistive robots. . In 2011 IEEE International Conference on Robotics and Automation, pp. 54449 New York:: IEEE
    [Google Scholar]
  119. 119.
    Narayanan VK, Pasteau F, Marchal M, Krupa A, Babel M. 2016.. Vision-based adaptive assistance and haptic guidance for safe wheelchair corridor following. . Comput. Vis. Image Understand. 149::17185
    [Crossref] [Google Scholar]
  120. 120.
    Chen X, Yu Y, Tang J, Zhou L, Liu K, et al. 2022.. Clinical validation of BCI-controlled wheelchairs in subjects with severe spinal cord injury. . IEEE Trans. Neural Syst. Rehabil. Eng. 30::57989
    [Crossref] [Google Scholar]
  121. 121.
    Devasia D, Roshini TV, Jacob NS, Jose SM, Joseph S. 2020.. Assistance for quadriplegic with BCI enabled wheelchair and IoT. . In 2020 3rd International Conference on Intelligent Sustainable Systems (ICISS), pp. 122026 New York:: IEEE
    [Google Scholar]
  122. 122.
    Mistry KS, Pelayo P, Anil DG, George K. 2018.. An SSVEP based brain computer interface system to control electric wheelchairs. . In 2018 IEEE International Instrumentation and Measurement Technology Conference (I2MTC), pp. 16 New York:: IEEE
    [Google Scholar]
  123. 123.
    Rebsamen B, Guan C, Zhang H, Wang C, Teo C, et al. 2010.. A brain controlled wheelchair to navigate in familiar environments. . IEEE Trans. Neural Syst. Rehabil. Eng. 18:(6):59098
    [Crossref] [Google Scholar]
  124. 124.
    Zhang R, Li Y, Yan Y, Zhang H, Wu S, et al. 2016.. Control of a wheelchair in an indoor environment based on a brain-computer interface and automated navigation. . IEEE Trans. Neural Syst. Rehabil. Eng. 24:(1):12839
    [Crossref] [Google Scholar]
  125. 125.
    Tanaka K, Matsunaga K, Wang HO. 2005.. Electroencephalogram-based control of an electric wheelchair. . IEEE Trans. Robot. 21:(4):76266
    [Crossref] [Google Scholar]
  126. 126.
    Garciá-Massó X, Serra-Anõ P, Gonzalez LM, Ye-Lin Y, Prats-Boluda G, Garcia-Casado J. 2015.. Identifying physical activity type in manual wheelchair users with spinal cord injury by means of accelerometers. . Spinal Cord 53:(10):77277
    [Crossref] [Google Scholar]
  127. 127.
    Kim KT, Suk H-I, Lee SW. 2018.. Commanding a brain-controlled wheelchair using steady-state somatosensory evoked potentials. . IEEE Trans. Neural Syst. Rehabil. Eng. 26:(3):65465
    [Crossref] [Google Scholar]
  128. 128.
    Abiyev RH, Akkaya N, Aytac E, Günsel I, Çaǧman A. 2016.. Brain-computer interface for control of wheelchair using fuzzy neural networks. . BioMed Res. Int. 2016::9359868
    [Crossref] [Google Scholar]
  129. 129.
    Kucukyildiz G, Ocak H, Karakaya S, Sayli O. 2017.. Design and implementation of a multi sensor based brain computer interface for a robotic wheelchair. . J. Intell. Robot. Syst. Theory Appl. 87:(2):24763
    [Crossref] [Google Scholar]
  130. 130.
    Ma C, Li W, Gravina R, Fortino G. 2017.. Posture detection based on smart cushion for wheelchair users. . Sensors 17:(4):618
    [Google Scholar]
  131. 131.
    Kucukyilmaz A, Demiris Y. 2018.. Learning shared control by demonstration for personalized wheelchair assistance. . IEEE Trans. Hapt. 11:(3):43142
    [Crossref] [Google Scholar]
  132. 132.
    Sharifuddin MSI, Nordin S, Ali AM. 2020.. Comparison of CNNs and SVM for voice control wheelchair. . IAES Int. J. Artif. Intell. 9:(3):38793
    [Google Scholar]
  133. 133.
    Alhammad N, Al-Dossari H. 2021.. Dynamic segmentation for physical activity recognition using a single wearable sensor. . Appl. Sci. 11:(6):2633
    [Crossref] [Google Scholar]
  134. 134.
    Chen PW, Klaesner J, Zwir I, Morgan KA. 2022.. Detecting clinical practice guideline-recommended wheelchair propulsion patterns with wearable devices following a wheelchair propulsion intervention. . Assist. Technol. 35:(2):193201 https://doi.org/10.1080/10400435.2021.2010146
    [Crossref] [Google Scholar]
  135. 135.
    Ahmed S, Kallu KD, Ahmed S, Cho SH. 2021.. Hand gestures recognition using radar sensors for human-computer-interaction: a review. . Remote Sens. 13::527
    [Crossref] [Google Scholar]
  136. 136.
    Majumder S, Aghayi E, Noferesti M, Memarzadeh-Tehran H, Mondal T, et al. 2017.. Smart homes for elderly healthcare—recent advances and research challenges. . Sensors 17::2496
    [Crossref] [Google Scholar]
  137. 137.
    Song KT, Chen CC. 1996.. Application of heuristic asymmetric mapping for mobile robot navigation using ultrasonic sensors. . J. Intell. Robot. Syst. 17::24364
    [Crossref] [Google Scholar]
  138. 138.
    Sanders D, Langner M, Bausch N, Huang Y, Khaustov S, Simandjunta S. 2020.. Improving human-machine interaction for a powered wheelchair driver by using variable-switches and sensors that reduce wheelchair-veer. . In Advances in Intelligent Systems and Computing, pp. 117391 Cham, Switzerland:: Springer Verlag
    [Google Scholar]
  139. 139.
    Sebkhi N, Bhavsar A, Sahadat MN, Baldwin J, Walling E, et al. 2022.. Evaluation of a head-tongue controller for power wheelchair driving by people with quadriplegia. . IEEE Trans. Biomed. Eng. 69:(4):13029
    [Crossref] [Google Scholar]
  140. 140.
    Jiang H, Zhang T, Wachs JP, Duerstock BS. 2016.. Enhanced control of a wheelchair-mounted robotic manipulator using 3-D vision and multimodal interaction. . Comput. Vis. Image Underst. 149::2131
    [Crossref] [Google Scholar]
  141. 141.
    Parmar K, Mehta B, Sawant R. 2012.. Facial-feature based human-computer interface for disabled people. . In 2012 International Conference on Communication, Information & Computing Technology (ICCICT), pp. 15 New York:: IEEE
    [Google Scholar]
  142. 142.
    Zhang R, He S, Yang X, Wang X, Li K, et al. 2019.. An EOG-based human-machine interface to control a smart home environment for patients with severe spinal cord injuries. . IEEE Trans. Biomed. Eng. 66:(1):89100
    [Crossref] [Google Scholar]
  143. 143.
    Liu Y, Yiu C, Song Z, Huang Y, Yao K, et al. 2022.. Electronic skin as wireless human-machine interfaces for robotic VR. . Sci. Adv. 8:(2):eabl6700
    [Crossref] [Google Scholar]
  144. 144.
    Khan F, Leem SK, Cho SH. 2018.. Human-computer interaction using radio sensor for people with severe disability. . Sens. Actuators A Phys. 282::3954
    [Crossref] [Google Scholar]
  145. 145.
    Jiang H, Duerstock BS, Wachs JP. 2014.. Machine vision-based gestural interface for people with upper extremity physical impairments. . IEEE Trans. Syst. Man Cybern. Syst. 44::63041
    [Crossref] [Google Scholar]
  146. 146.
    Liu X, Sacks J, Zhang M, Richardson AG, Lucas TH, van der Spiegel J. 2017.. The virtual trackpad: an electromyography-based, wireless, real-time, low-power, embedded hand-gesture-recognition system using an event-driven artificial neural network. . IEEE Trans. Circuits Syst. II Express Br. 64:(11):125761
    [Google Scholar]
  147. 147.
    Musk E. 2019.. An integrated brain-machine interface platform with thousands of channels. . J. Med. Internet Res. 21:(10):e16194
    [Crossref] [Google Scholar]
  148. 148.
    Crane L. 2020.. Elon Musk demonstrated a Neuralink brain implant in a live pig. . New Scientist, Aug. 29. https://www.newscientist.com/article/2253274-elon-musk-demonstrated-a-neuralink-brain-implant-in-a-live-pig/
    [Google Scholar]
  • Article Type: Review Article