Human action recognition based on the angle data of limbs

Maimaitimin Maierdan, Keigo Watanabe, Shoichi Maeyama

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

An approach to human action recognition is presented in this paper. The work is part of a human behavior estimation system which is divided into two parts: human action recognition and object recognition. In this part, we use a Microsoft Kinect to capture human joint data and calculate the limb angles from it. Using these angles, we train an artificial neural network (ANN) to recognize the target actions, which in this case are 'walking' and 'running'. In this paper, the ANN is discussed as the main part of the current research. We designed a two-stage ANN, which can minimize the impact of noisy data. The whole process is simulated in Scilab.
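The abstract only sketches the pipeline (Kinect joints, then limb angles, then a two-stage ANN). As a minimal, hypothetical illustration of the kind of feature involved, the angle at a joint can be computed from three captured joint positions; the Python sketch below is not the authors' Scilab implementation, and the function name and sample coordinates are assumptions made purely for illustration.

import numpy as np

def limb_angle(joint_a, joint_b, joint_c):
    # Angle (in radians) at joint_b formed by the segments joint_b->joint_a and
    # joint_b->joint_c, e.g. an elbow angle from shoulder, elbow and wrist positions.
    u = np.asarray(joint_a, dtype=float) - np.asarray(joint_b, dtype=float)
    v = np.asarray(joint_c, dtype=float) - np.asarray(joint_b, dtype=float)
    cos_t = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos_t, -1.0, 1.0))  # clip guards against rounding error

# Example with made-up (x, y, z) positions loosely resembling Kinect skeleton data.
shoulder, elbow, wrist = (0.0, 1.4, 2.0), (0.0, 1.1, 2.0), (0.2, 1.0, 1.8)
print(np.degrees(limb_angle(shoulder, elbow, wrist)))

A sequence of such per-frame angles would then form the input vector presented to the classifier.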

Original language: English
Title of host publication: 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 140-144
Number of pages: 5
ISBN (Electronic): 9781479959556
DOI: https://doi.org/10.1109/SCIS-ISIS.2014.7044854
Publication status: Published - Feb 18 2015
Event: 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014 - Kitakyushu, Japan
Duration: Dec 3 2014 - Dec 6 2014

Other

Other: 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014
Country: Japan
City: Kitakyushu
Period: 12/3/14 - 12/6/14

Fingerprint

  • Neural networks
  • Object recognition
  • Processing

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence

Cite this

Maierdan, M., Watanabe, K., & Maeyama, S. (2014). Human action recognition based on the angle data of limbs. In 2014 Joint 7th International Conference on Soft Computing and Intelligent Systems, SCIS 2014 and 15th International Symposium on Advanced Intelligent Systems, ISIS 2014 (pp. 140-144). [7044854] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/SCIS-ISIS.2014.7044854
