Hybrid Deep Learning Approaches for Human Activity Recognition and Postural Transitions Using Mobile Device Sensors

  • Original Research
  • Published in SN Computer Science

Abstract

Human activity recognition (HAR) has grown in popularity over the last decade, driven by technologies such as mobile phones, smartphones, and video cameras. This growth has been propelled by artificial intelligence-based techniques that improve recognition rates. The aim of this paper is to develop a robust model that recognizes twelve human activities from data collected across individuals aged 19–48 years. We utilize several machine and deep learning techniques, including Naive Bayes (NB), stochastic gradient descent (SGD), XGBoost, convolutional neural network (CNN), CatBoost, the LightGBM (LGB) classifier, K-nearest neighbor (KNN), long short-term memory (LSTM), CNN-GRU, gated recurrent unit (GRU), and CNN-LSTM. Three static postures and three dynamic actions are investigated, along with the various transitions between them. The methodology begins with preprocessing the dataset, covering cleaning, user counts, and frequency analysis. t-Distributed stochastic neighbor embedding (t-SNE) is applied for further exploration and visualization of the high-dimensional data, and principal component analysis (PCA) is used to project the different human activities into groupings in three dimensions. Among the machine learning models, SGD exhibits the highest accuracy at 88.6% with a loss of 0.79. On the deep learning front, the GRU model achieves the highest accuracy of 84.50% and the lowest loss of 0.49. These findings demonstrate the efficacy of both conventional machine learning and deep learning techniques for HAR, offering promising directions for future study and practical implementation.
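To make the workflow concrete, the following is a minimal sketch in Python (an illustration, not the authors' exact code) of the exploration and baseline-classification steps described above: standardizing a feature matrix, projecting it with 3-D PCA and a 2-D t-SNE embedding, and training a linear classifier by stochastic gradient descent with scikit-learn. The feature matrix X, the labels y, and all hyperparameters shown here are placeholder assumptions.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import accuracy_score

# Placeholder data standing in for windowed sensor features and 12 activity labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 561))
y = rng.integers(0, 12, size=1000)

# Standardize features before projection and classification.
X = StandardScaler().fit_transform(X)

# 3-D PCA projection, analogous to grouping the activities in three dimensions.
X_pca = PCA(n_components=3).fit_transform(X)

# 2-D t-SNE embedding for visualizing the high-dimensional feature space.
X_tsne = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X)

# Linear classifier trained with stochastic gradient descent (hinge loss by default).
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)
clf = SGDClassifier(max_iter=1000, random_state=0).fit(X_tr, y_tr)
print("SGD test accuracy:", accuracy_score(y_te, clf.predict(X_te)))

On the deep learning side, a small GRU classifier over fixed-length windows of raw accelerometer and gyroscope signals could be sketched with Keras as below; the window length, channel count, and layer sizes are assumptions rather than the architecture reported in the paper.

import numpy as np
import tensorflow as tf

# Assumed window shape: 128 time steps of 6 inertial channels, 12 activity classes.
timesteps, channels, n_classes = 128, 6, 12
X_seq = np.random.randn(1000, timesteps, channels).astype("float32")  # placeholder windows
y_seq = np.random.randint(0, n_classes, size=1000)                    # placeholder labels

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(timesteps, channels)),
    tf.keras.layers.GRU(64),                                 # recurrent feature extractor
    tf.keras.layers.Dropout(0.3),                            # regularization
    tf.keras.layers.Dense(n_classes, activation="softmax"),  # per-class probabilities
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X_seq, y_seq, epochs=5, batch_size=64, validation_split=0.2, verbose=0)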



Availability of Data

The data supporting the findings of the study can be provided upon request.


Funding

Not Applicable.

Author information

Corresponding author

Correspondence to Nandini Modi.

Ethics declarations

Conflict of interest

No conflict of interest has been declared by the authors.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Chadha, J., Jain, A., Kumar, Y. et al. Hybrid Deep Learning Approaches for Human Activity Recognition and Postural Transitions Using Mobile Device Sensors. SN COMPUT. SCI. 5, 925 (2024). https://doi.org/10.1007/s42979-024-03300-7

  • DOI: https://doi.org/10.1007/s42979-024-03300-7
