Human Gesture Recognition using Keyframes on Local Joint Motion Trajectories

dc.contributor.authorDurgut, Rafet
dc.contributor.authorFindik, Oguz
dc.date.accessioned2024-09-29T16:11:32Z
dc.date.available2024-09-29T16:11:32Z
dc.date.issued2017
dc.departmentKarabük Üniversitesien_US
dc.description.abstractHuman Action Recognition (HAR) systems recognize and classify the actions that users perform in front of a sensor or camera. In most HAR systems, input test data are compared with reference data in a database, and classification is performed according to the result. The size of the test or reference data directly affects the system's operating speed, so reducing data size yields a significant speed-up. In this study, an action recognition method is proposed that uses skeletal joint information obtained with the Microsoft Kinect sensor. Splitting keyframes are extracted from the skeletal joint information. Because these keyframes act as distinguishing features, they are used for the classification process. Storing keyframes in the reference database, instead of the full position or angle information of an action, saves both memory and processing time. A weight value is calculated for each keyframe. The temporal differences that occur when comparing test and reference actions are resolved with Dynamic Time Warping (DTW), and the k-nearest neighbors algorithm classifies actions based on the DTW results. The method was evaluated on a data set, achieving 100% correct classification, and it is suitable for real-time systems. The breakpoints can also be used to provide feedback to the user after classification: the magnitude and direction of the keyframes, the changes in joint trajectories, and their positions and times of occurrence provide information about timing errors.en_US
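The abstract describes a pipeline in which test and reference sequences are aligned with Dynamic Time Warping and then classified by k-nearest neighbors. A minimal sketch of that combination, assuming simple 1-D feature sequences and a majority-vote k-NN (the function names and data layout here are illustrative, not the authors' implementation):

```python
def dtw_distance(a, b):
    """Classic dynamic-programming DTW cost between two 1-D sequences."""
    n, m = len(a), len(b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])            # local match cost
            cost[i][j] = d + min(cost[i - 1][j],     # insertion
                                 cost[i][j - 1],     # deletion
                                 cost[i - 1][j - 1]) # match
    return cost[n][m]

def knn_classify(test_seq, references, k=3):
    """references: list of (sequence, label) pairs.
    Majority vote over the k reference actions nearest by DTW cost."""
    dists = sorted((dtw_distance(test_seq, ref), label)
                   for ref, label in references)
    top = [label for _, label in dists[:k]]
    return max(set(top), key=top.count)
```

In the paper's setting the sequences would be the weighted keyframe descriptors extracted from the Kinect joint trajectories rather than raw scalars, but the alignment-then-vote structure is the same.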
dc.description.sponsorshipScientific Research Coordination Unit of Karabuk University [KBU-BAP-17-DR-170]en_US
dc.description.sponsorshipThis study was supported by the Scientific Research Coordination Unit of Karabuk University under Grant KBU-BAP-17-DR-170.en_US
dc.identifier.endpage136en_US
dc.identifier.issn2158-107X
dc.identifier.issn2156-5570
dc.identifier.issue4en_US
dc.identifier.startpage131en_US
dc.identifier.urihttps://hdl.handle.net/20.500.14619/8516
dc.identifier.volume8en_US
dc.identifier.wosWOS:000403339400019en_US
dc.identifier.wosqualityN/Aen_US
dc.indekslendigikaynakWeb of Scienceen_US
dc.language.isoenen_US
dc.publisherScience & Information Sai Organization Ltden_US
dc.relation.ispartofInternational Journal of Advanced Computer Science and Applicationsen_US
dc.relation.publicationcategoryArticle - International Peer-Reviewed Journal - Institutional Academic Staffen_US
dc.rightsinfo:eu-repo/semantics/closedAccessen_US
dc.subjectHuman gesture recognitionen_US
dc.subjectdynamic time warpingen_US
dc.subjectlocal joint motion trajectoryen_US
dc.subjectHuman action recognitionen_US
dc.subjectMicrosoft Kinecten_US
dc.titleHuman Gesture Recognition using Keyframes on Local Joint Motion Trajectoriesen_US
dc.typeArticleen_US