Voice Command Recognition for Movement Control of a 4-DoF Robot Arm

Rendyansyah Rendyansyah, Aditya P. P. Prasetyo, Sarmayanta Sembiring


Robots are widely used in industry and generally embed a control system or intelligence in their processor. Robots can be classified as mobile robots, manipulators, or combinations of the two, each with its own advantages. Mobile robots usually use wheels and are widely applied in environments with flat floor surfaces, while manipulator robots, which have a limited number of degrees of freedom, are applied in static environments to produce, print, or cut material. In this study, a 4 Degree of Freedom (DoF) robot arm is integrated with a computer. The computer controls the whole system, and the operator controls the robot through voice commands. The voice samples come from a single operator speaking with different intonations. Voice commands are recognized using the Mel-Frequency Cepstral Coefficients (MFCC) and Artificial Neural Network (ANN) methods. The MFCC and ANN programs run on the computer, and the program output is sent to the robot via serial communication. There are nine types of voice commands, each with a distinct MFCC pattern. The ANN training set contains 10 samples per command, 90 samples in total. In the experiments, the robot moved according to the voice commands given by the operator. Each voice command was tested ten times, for a total of 90 trials, with a success rate of 94%. Only one operator was involved, and experiments with the voices of several operators have not yet been carried out. Errors occurred because several commands produced similar patterns during system testing.
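The abstract describes a pipeline in which MFCC features are extracted from the operator's voice and then classified by an ANN. The paper itself does not include code, so the following is only a minimal sketch of the standard MFCC computation (pre-emphasis and delta features omitted): frame the signal with a Hamming window, take the power spectrum, apply a triangular mel filterbank, take the log, and decorrelate with a DCT-II. All parameter values (16 kHz sampling, 512-point FFT, 26 mel filters, 13 coefficients) are common defaults, not values taken from the paper; the resulting feature matrix is what would be fed to the ANN classifier.

```python
import numpy as np

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mfcc(signal, sr=16000, n_fft=512, hop=256, n_mels=26, n_ceps=13):
    """Return an (n_frames, n_ceps) MFCC matrix for a 1-D audio signal."""
    # frame the signal and apply a Hamming window
    window = np.hamming(n_fft)
    frames = np.array([signal[s:s + n_fft] * window
                       for s in range(0, len(signal) - n_fft + 1, hop)])
    # power spectrum of each frame
    power = np.abs(np.fft.rfft(frames, n_fft)) ** 2 / n_fft
    # triangular filterbank with centers equally spaced on the mel scale
    mel_pts = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fbank = np.zeros((n_mels, n_fft // 2 + 1))
    for m in range(1, n_mels + 1):
        left, center, right = bins[m - 1], bins[m], bins[m + 1]
        for k in range(left, center):
            fbank[m - 1, k] = (k - left) / max(center - left, 1)
        for k in range(center, right):
            fbank[m - 1, k] = (right - k) / max(right - center, 1)
    # log mel energies, then DCT-II to get cepstral coefficients
    logmel = np.log(power @ fbank.T + 1e-10)
    n = logmel.shape[1]
    basis = np.cos(np.pi / n * (np.arange(n) + 0.5)[None, :]
                   * np.arange(n_ceps)[:, None])
    return logmel @ basis.T

# demo: a synthetic 440 Hz tone, 1 s at 16 kHz, standing in for a voice sample
sr = 16000
tone = np.sin(2 * np.pi * 440 * np.arange(sr) / sr)
feats = mfcc(tone, sr=sr)
print(feats.shape)  # (n_frames, 13)
```

In the paper's setup, one such feature matrix would be computed per recorded command, reduced to a fixed-length pattern, and used to train the ANN on the 90 samples (10 per command) before the predicted class is sent to the 4-DoF arm over serial.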


Keywords: Artificial Intelligence; Robotic Arm; Voice Command





DOI: http://dx.doi.org/10.26418/elkha.v14i2.57556



Copyright (c) 2022 ELKHA : Jurnal Teknik Elektro

Editorial Office/Publisher Address:
Editor Jurnal Elkha, Department of Electrical Engineering, Faculty of Engineering, Universitas Tanjungpura,
Jl. Prof. Dr. Hadari Nawawi, Pontianak 78124, Indonesia

Website: http://jurnal.untan.ac.id/index.php/Elkha
Email: jurnal.elkha@untan.ac.id

ORCID iD : https://orcid.org/0000-0002-0779-1277



This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.