Capturing Human Movements in Real Time in Collaborative Robots Workspace within Industry 5.0

Keywords

Real-time motion capture, collaborative robots, human-robot interaction, Industry 5.0, pose estimation, workplace safety

How to Cite

Capturing Human Movements in Real Time in Collaborative Robots Workspace within Industry 5.0. (2024). Journal of Universal Science Research, 2(10), 232-247. https://universalpublishings.com/index.php/jusr/article/view/7594

Abstract

The article examines the application of technologies for capturing human movements in real time in the workspace of collaborative robots in the context of Industry 5.0. A mathematical apparatus and software for analyzing movements with high accuracy have been developed, which makes it possible to increase the safety and efficiency of human-robot interaction. The conducted experiments showed the influence of lighting conditions and movement speed on the accuracy of motion capture and visualization, which is critically important for the optimal operation of collaborative robots in production environments.
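
The abstract does not reproduce the authors' implementation; as an illustration only, the following minimal Python sketch shows one common way to capture human pose landmarks from a camera stream in real time, using OpenCV and MediaPipe Pose. The library choice, camera index, and confidence thresholds are assumptions, not the published method; the mean landmark visibility shown on screen is merely a crude proxy for the capture-quality effects (lighting, movement speed) that the experiments measure.

```python
# Minimal real-time human pose capture sketch (assumption: OpenCV + MediaPipe,
# not the authors' published implementation).
import time

import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose
mp_draw = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)  # default camera observing the cobot workspace
with mp_pose.Pose(model_complexity=1,
                  min_detection_confidence=0.5,
                  min_tracking_confidence=0.5) as pose:
    prev = time.time()
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV delivers BGR frames.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            mp_draw.draw_landmarks(frame, results.pose_landmarks,
                                   mp_pose.POSE_CONNECTIONS)
            # Mean landmark visibility: a rough indicator of capture quality,
            # which tends to drop under poor lighting or fast motion.
            vis = sum(lm.visibility for lm in results.pose_landmarks.landmark)
            vis /= len(results.pose_landmarks.landmark)
        else:
            vis = 0.0
        now = time.time()
        fps = 1.0 / max(now - prev, 1e-6)
        prev = now
        cv2.putText(frame, f"FPS: {fps:.1f}  visibility: {vis:.2f}",
                    (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 255, 0), 2)
        cv2.imshow("Operator pose in cobot workspace", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break
cap.release()
cv2.destroyAllWindows()
```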

Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 International License.