Tracking Hand Movement Using Kinect Skeleton Tracking

Fachri Yanuar Rudi F

Abstract (Indonesian)


Hand gestures are one of the more interactive ways for humans to interact with computers. Using hand gestures reduces the barriers to interacting with a computer and makes the interaction more engaging. One obstacle in tracking hand movement is the difficulty of separating the hand from the background. Common ways to address this are to control the background or to use a deep learning algorithm to segment the hand from it. The Kinect camera provides depth-image capture, and the resulting depth image can be used to perform skeleton tracking on humans. That skeleton tracking, in turn, simplifies tracking hand movement.

 

Abstract

Hand gestures are an interactive way to communicate with a computer. Using hand gestures reduces the limitations of human-computer interaction and makes it more engaging. One of the problems in tracking hand gestures is the difficulty of separating the hands from the background. A common way to deal with this is to control the background or to use a deep learning algorithm to separate the hand from the background. The Kinect camera provides depth-image capture, and the resulting depth image can be used to perform skeleton tracking on humans. Skeleton tracking can then be used to make hand-movement tracking easier.
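The pipeline the abstract describes (depth image → skeleton → hand joint) can be sketched in a few lines. The sketch below is illustrative only: it assumes a hypothetical frame format in which a skeleton tracker (such as the Kinect SDK, which exposes per-body joints like the right hand) has already produced per-frame joint positions, and it shows how hand tracking then reduces to reading the hand joint and smoothing its trajectory over time.

```python
def track_hand(frames, joint="hand_right", alpha=0.5):
    """Extract and exponentially smooth a hand-joint trajectory.

    frames: list of skeleton frames, each a dict mapping a joint name
            to an (x, y, z) position (hypothetical layout, in metres).
    alpha:  smoothing factor; higher values follow the raw signal more closely.
    """
    smoothed = []
    prev = None
    for frame in frames:
        current = frame[joint]
        if prev is None:
            prev = current  # first frame: no history to smooth against
        else:
            # Exponential moving average of the 3D position.
            prev = tuple(alpha * c + (1 - alpha) * p
                         for c, p in zip(current, prev))
        smoothed.append(prev)
    return smoothed


# Simulated skeleton frames: the right hand moves along the x axis.
frames = [{"hand_right": (0.0, 1.0, 2.0)},
          {"hand_right": (0.2, 1.0, 2.0)},
          {"hand_right": (0.4, 1.0, 2.0)}]
path = track_hand(frames)
```

Because the skeleton tracker has already separated the body from the background in the depth image, no explicit hand/background segmentation step is needed here; the smoothing merely damps per-frame jitter in the joint estimate.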







DOI: http://dx.doi.org/10.30811/jaise.v1i2.2449




Journal of Artificial Intelligence and Software Engineering (JAISE) is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.