
Journal of International Council on Electrical Engineering Vol. 1, No. 2, pp. 151~155, 2011

A Study on Robot Motion Authoring System using Finger-Robot Interaction


Kwang-Ho Seok *, Junho Ko ** and Yoon Sang Kim
Abstract - This paper proposes a robot motion authoring system using finger-robot interaction. The proposed robot motion authoring tool is a user-friendly method that easily authors (creates and controls) robot motion according to the number of fingers. Furthermore, the system can be used to simulate user-created robot contents in a 3D virtual environment. This allows the user not only to view the authoring process in real time but also to transmit the final authored contents. The effectiveness of the proposed motion authoring system was verified based on motion authoring simulation of an industrial robot.

Keywords: Finger-robot interaction, Industrial robot, Robot motion authoring system

1. Introduction
With rapid developments in today's robot technology, robots have come to be used in a wide range of fields. As an increasing number of applications adopt intelligence, a new type of robot software environment is required that allows users to author various commands for robots. Moreover, robots must become easier to use for general users in order to expand the scope of applications of intelligent robots in everyday life. In response to this need, a number of studies are being conducted on new types of user-friendly and intuitive robot motion authoring that can render various motions [1]-[4]. Research projects are also actively taking place in human-robot interaction, thanks to the progress that has been made in vision technology. One study is developing a finger pad that can replace the computer mouse [5], and other studies have proposed algorithms to improve hand gesture recognition [6]-[8]. However, user-intuitive or user-friendly robot command authoring tools that focus on facilitating general users are still rare. The lack of such tools is likely to create a barrier to providing robot services (commands/motions) that correspond with the preferences and demands of general users, including children, the elderly and housewives, who are expected to be the major clients in the service robot industry.
Corresponding Author: School of Computer Science and Engineering, Korea University of Technology and Education, Korea (yoonsang@kut.ar.kr)
* Dept. of Computer Science and Engineering, Korea University of Technology and Education, Korea (mono@kut.ac.kr)
** Dept. of Computer Science and Engineering, Korea University of Technology and Education, Korea (lich126@kut.ac.kr)
Received: January 17, 2011; Accepted: March 23, 2011

This paper proposes a robot motion authoring system using finger-robot interaction. The proposed system is capable of authoring (creating and controlling) robot motion according to the number of fingers. It is an intuitive robot motion authoring method that allows the user to easily author robot motions. The effectiveness of the proposed method is verified based on motion authoring simulation of an actual industrial robot.

2. Proposed Robot Motion Authoring System using Finger-Robot Interaction


With the exception of language, the hand is the body part most frequently used for human communication, among parts such as the hands, eyes, mouth, arms and legs. This section explains how industrial robot motions can be authored according to the number of fingers and verifies the effectiveness of the proposed approach based on simulation. Fig. 1 illustrates the overall authoring process of the robot motion authoring system implemented for this study.

Fig. 1. Robot motion authoring system.


2.1 Finger Recognition

There are a number of techniques for finger recognition, and they are used in various fields. In this section, the finger recognition unit is implemented using a color-value scheme [8], [9]. The finger recognition unit first converts RGB colors into gray scale and YCrCb for binary representation. Then the region inside the hand is filled by masking and noise is removed. The binary image is examined in 25-pixel units, as shown in Fig. 2: if the sum of the 1s examined is greater than ten, every digit in the unit is set to 1. With the image obtained by the masking performed by the finger recognition unit, the center of the hand can be calculated to identify the hand's location, as well as the hand region furthest from the center. The center of the hand is marked with a red dot, the palm region with a red circle and the hand region with a white circle. If the number of fingers is 0 or 5, a circle image is displayed. For 1, 2 and 3 fingers, the image is shown in blue, green and red, respectively. If there are 4 fingers, the hand is shown in white on a black background (Fig. 2).
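A minimal sketch of this color-based recognition step is given below, in Python with OpenCV. It is an illustration under stated assumptions, not the authors' implementation: the Cr-channel skin thresholds and the block-majority parameters are assumed values, and only the YCrCb branch of the conversion is shown.

    # Hedged sketch of the color-based finger recognition front end.
    # Thresholds (133-173 on Cr) are assumed, not the paper's values.
    import cv2
    import numpy as np

    def binarize_hand(frame_bgr):
        """Convert to YCrCb and threshold the Cr channel to isolate skin."""
        ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
        cr = ycrcb[:, :, 1]
        return (cv2.inRange(cr, 133, 173) // 255).astype(np.uint8)  # 0/1 image

    def majority_filter_25(mask):
        """Examine the binary image in 25-pixel (5x5) units: if more than
        ten of the 25 values are 1, set every digit in the unit to 1."""
        out = np.zeros_like(mask)
        h, w = mask.shape
        for y in range(0, h - 4, 5):
            for x in range(0, w - 4, 5):
                out[y:y+5, x:x+5] = 1 if mask[y:y+5, x:x+5].sum() > 10 else 0
        return out

    def hand_center(mask):
        """Centroid of the filled hand region (the red dot in Fig. 2)."""
        m = cv2.moments(mask, binaryImage=True)
        if m["m00"] == 0:
            return None
        return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))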

Fig. 2. Finger recognition process.

2.2 Robot Motion Authoring Tool

In this section, a motion authoring unit capable of creating and controlling robot motions (of an industrial robot in this paper) is implemented using finger-robot interaction based on the finger recognition unit explained in the previous section. The motion authoring unit (editor) consists of four panels: the finger interface panel, mode panel, simulation panel and data panel (Fig. 3). The finger interface panel is the finger recognition unit that recognizes the number of fingers (from 0 to 5) in the finger images taken by the camera. The number of fingers recognized by the finger interface panel allows the user to control the industrial robots (such as a SCARA robot, an articulated robot, etc.) displayed on the simulation panel. The mode panel provides three modes (WIRE, SOLID and ZOOM/VIEW) as well as the TRANSMISSION and STOP buttons. The WIRE mode allows viewing of the robot's internal structure, while the SOLID mode provides only the robot's external view. In the ZOOM/VIEW mode, the view can be zoomed in/out and rotated, allowing the user to view the robot's entire structure in detail. The simulation panel provides the user with real-time 3D viewing of the robot motion being authored with finger-robot interaction. The authored contents (motions) can be transmitted to control the robot by clicking the TRANSMISSION button, and the actual robot motion can be stopped using the STOP button. The data panel provides the virtual robot motion degrees and PWMs sent to the actual industrial robot in real time, along with the RESET, PAUSE and RESTART buttons. If the RESET button is clicked, the robot is initialized to its default values; if the RESTART button is clicked, the robot is restarted.

Fig. 3. Robot motion authoring tool.

2.3 Virtual Robot Motion Authoring Simulation

This section evaluates the effectiveness of the proposed authoring method based on robot motion authoring simulation. The virtual industrial robot used for the robot motion authoring simulation is an articulated robot with 6 DOF, and its motion ranges and definition are given in Table 1 and Fig. 4.

Table 1. Motion Range

    AXIS      ANGLE (degree)
    1-Axis    0 ~ -360
    2-Axis    -60 ~ 60
    3-Axis    -90 ~ 90
    4-Axis    0 ~ -60
    5-Axis    0 ~ -360
    6-Axis    0 ~ 90
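Since several of these ranges run from 0 to a negative limit, an authoring tool has to clamp authored values per axis. The following short Python sketch (a hypothetical helper, not from the paper) encodes Table 1 and clamps an authored angle to its axis range.

    # Hypothetical clamp of authored angles to the Table 1 motion ranges.
    AXIS_RANGE = {  # axis number: (min_deg, max_deg)
        1: (-360, 0), 2: (-60, 60), 3: (-90, 90),
        4: (-60, 0), 5: (-360, 0), 6: (0, 90),
    }

    def clamp_angle(axis, angle_deg):
        lo, hi = AXIS_RANGE[axis]
        return max(lo, min(hi, angle_deg))

    assert clamp_angle(2, 75) == 60  # 2-Axis is limited to -60 ~ 60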

Fig. 4 depicts the virtual and actual articulated robots used for the motion authoring simulation. The motion ranges and directions of the virtual articulated robot closely match those of the actual robot.


Fig. 4. Definition of the articulated robot used for the motion authoring simulation.

In the WIRE and SOLID modes, the internal and external structures of the robot can be viewed, respectively. In either mode, if the number of fingers recognized by the finger interface is 0 or 4, the 1-Axis or 5-Axis rotates around the Y axis, respectively. For 1, 2 and 3 fingers, the 2-Axis, 3-Axis and 4-Axis rotate around the Z axis, respectively (Fig. 5). The ZOOM/VIEW mode allows the user to zoom in and out of the robot structure and vary the camera angle for a detailed view. Similar to the WIRE mode, the camera is rotated clockwise and counter-clockwise for 3 and 4 fingers, respectively, according to the definitions in Table 3, so that the user can view the robot from a desired angle. Accordingly, the user is able not only to create but also to control motions for a part of a robot based on the number of fingers recognized. Tables 2 and 3 list the definitions of the finger-value events for the WIRE, SOLID and ZOOM/VIEW modes.

Table 2. Definitions of Finger-value Events for WIRE and SOLID Modes
    MODE: A. WIRE / B. SOLID

    FINGER    EVENT
    0         1-Axis rotation (base Y)
    1         2-Axis rotation (base Z)
    2         3-Axis rotation (base Z)
    3         4-Axis rotation (base Z)
    4         5-Axis rotation (base Y)
    5         6-Axis rotation (Gripper ON/OFF)
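One plausible way to encode these finger-value events is a per-mode lookup table, as in the Python sketch below; the names and the dispatch function are illustrative assumptions, not the authors' code.

    # Illustrative encoding of the finger-value events of Tables 2 and 3.
    WIRE_SOLID_EVENTS = {
        0: "1-Axis rotation (base Y)",
        1: "2-Axis rotation (base Z)",
        2: "3-Axis rotation (base Z)",
        3: "4-Axis rotation (base Z)",
        4: "5-Axis rotation (base Y)",
        5: "6-Axis rotation (gripper ON/OFF)",
    }
    ZOOM_VIEW_EVENTS = {
        0: "default", 1: "ZOOM OUT", 2: "ZOOM IN",
        3: "VIEW rotation (CW)", 4: "VIEW rotation (CCW)",
    }

    def finger_event(mode, fingers):
        """Map a recognized finger count (0-5) to the event for the mode."""
        table = ZOOM_VIEW_EVENTS if mode == "ZOOM/VIEW" else WIRE_SOLID_EVENTS
        return table.get(fingers, "ignored")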

Fig. 5. The motion authoring simulation in WIRE and SOLID modes.

Fig. 6(a) depicts how a robot rotation part is enlarged in the ZOOM/VIEW mode when the number of fingers recognized is 2 or 3. Fig. 6(b) depicts how the robot is downsized in the ZOOM/VIEW mode when the number of fingers recognized is 1.

Fig. 6. (a) Enlarged robot rotation part in ZOOM/VIEW mode with 2 or 3 fingers recognized; (b) downsized robot in ZOOM/VIEW mode with 1 finger recognized.

3. Experimental Results and Discussions


3.1 Virtual Robot Experimental Result

The robot motion authoring system proposed in this paper was implemented on a personal computer with a 1.80 GHz AMD Athlon 64-bit processor, 1 GB of memory and an ATI Radeon X1600 graphics card supporting DirectX; a Logitech Webcam Pro 9000 was used for image capture. The robot motion authoring tool was implemented with OpenGL, the open graphics library. Authoring was effective because different colors were displayed according to the number of fingers recognized, allowing the user to immediately discern the part of the robot being authored. Fig. 7 is a captured image of the robot motion authoring process using the proposed method. The experimental results confirmed that the robot motion authoring system proposed in this study provides general users with an intuitive, fun and easy way to create motions.
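Since the tool renders with OpenGL, the recognized finger events ultimately drive joint rotations in the 3D view. The sketch below (PyOpenGL assumed; draw_link is a hypothetical placeholder) shows one way the authored angles could be applied down the kinematic chain, with axes 1 and 5 rotating about Y and axes 2-4 about Z as defined in Table 2; the 6-axis gripper event is omitted.

    # Hedged PyOpenGL sketch: apply authored joint angles along the chain.
    from OpenGL.GL import glPushMatrix, glPopMatrix, glRotatef

    AXIS_VECTOR = {1: (0, 1, 0), 2: (0, 0, 1), 3: (0, 0, 1),
                   4: (0, 0, 1), 5: (0, 1, 0)}  # per Table 2

    def draw_link(i):
        pass  # placeholder: real code would emit link i's geometry here

    def render_arm(joint_angles):
        """joint_angles: angles (degrees) for axes 1-5, authored by fingers."""
        glPushMatrix()
        for i, angle in enumerate(joint_angles, start=1):
            x, y, z = AXIS_VECTOR[i]
            glRotatef(angle, x, y, z)  # rotations accumulate down the chain
            draw_link(i)
        glPopMatrix()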

Table 3. Definitions of Finger-value Events for ZOOM/VIEW Mode

    MODE: C. ZOOM/VIEW

    FINGER    EVENT
    0         default
    1         ZOOM OUT
    2         ZOOM IN
    3         VIEW rotation (CW)
    4         VIEW rotation (CCW)


Fig. 7. Robot motion authoring process using the proposed method.

3.2 Actual Robot Experimental Result

The actual articulated robot is a miniature robot (NT-Robot Arm, a product of NTREX CO., LTD). Seven RC servo motors were used, and five ball casters were used on the BASE part for smooth turning. The gripper can seize an object of up to 80 mm. The robot arm can easily be controlled by a computer or an embedded board using the NT-SERVO-16CH-1 and NT-USB2UART (Fig. 8). Table 4 gives the NT-Robot Arm specification.

Fig. 8. Actual articulated robot structure.

Table 4. NT-Robot Arm Specification [10]

    Speed            6V servomotor
    Voltage          DC 6V
    Size             100550
    Power            16.1 Kg-cm (max)
    Weight           1 Kg
    Structure        6 axis + gripper
    Center distance  130 m/m
    Current          12.5 A

The purpose of this experiment was to confirm whether or not robot motions authored by the proposed tool work well with an actual robot. The motions authored by the tool were applied to an actual robot (Fig. 9). The motion data was transmitted to the robot using serial communication, and it was examined how the contents produced by the robot motion authoring tool controlled the robot according to the user's intention.

Fig. 9. Experiment results using the actual robot.
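The serial transmission step described above could look like the following Python sketch using pyserial; the port name, baud rate and packet format are assumptions, since the paper does not specify the protocol.

    # Hedged pyserial sketch of the motion-data transmission; the port,
    # baud rate and packet layout are assumed, not the paper's protocol.
    import serial

    def send_motion(angles, port="COM3", baud=9600):
        """Send six authored joint angles to the robot controller."""
        packet = ("M," + ",".join(f"{a:.1f}" for a in angles) + "\n").encode()
        with serial.Serial(port, baud, timeout=1) as link:
            link.write(packet)

    send_motion([-90.0, 30.0, 45.0, -20.0, -180.0, 60.0])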

4. Conclusion
This paper proposed an authoring system capable of creating and controlling motions of industrial robots based on finger-robot interaction. The proposed system is user-friendly and intuitive, and facilitates motion authoring of industrial robots using the fingers, which are second only to language as a means of communication. The proposed authoring system also uses graphic motion data to simulate user-created robot contents in a 3D virtual environment, so that the user can view the authoring process in real time and transmit the authored robot contents to control the robot. The validity of the proposed robot motion authoring system was examined based on simulations using the authored robot motion contents as well as experiments with actual robot motions. The proposed robot motion authoring method is expected to provide user-friendly and intuitive solutions not only for various industrial robots, but also for other types of robots, including humanoids.


References

[1] http://www.cyberbotics.com/ (Cyberbotics Webots™)
[2] http://www.microsoft.com/korea/robotics/studio.mspx
[3] Sangyep Nam, Hyoyoung Lee, Sukjoong Kim, Yichul Kang and Keuneun Kim, "A study on design and implementation of the intelligent robot simulator which is connected to an URC system," IE, vol. 44, no. 4, pp. 157-164, 2007.
[4] Kwangho Seok and Yoonsang Kim, "A new robot motion authoring method using HTM," International Conference on Control, Automation and Systems (ICCAS), pp. 2058-2061, Oct. 2008.
[5] Hyun Kim, Myeongyun Ock, Taewook Kang and Sangwan Kim, "Development of finger pad by webcam," The Korea Society of Marine Engineering, pp. 397-398, June 2008.
[6] Kangho Lee and Jongho Choi, "Hand gesture sequence recognition using morphological chain code edge vector," Korea Society of Computer Information, vol. 9, pp. 85-91.
[7] Attila Licsar and Tamas Sziranyi, "User-adaptive hand gesture recognition system with interactive training," Image and Vision Computing, vol. 23, pp. 1102-1114, 2005.
[8] Kunwoo Kim, Wonjoo Lee and Changho Jeon, "A hand gesture recognition scheme using WebCAM," The Institute of Electronics Engineers of Korea, pp. 619-620, June 2008.
[9] Mark W. Spong, Seth Hutchinson and Mathukumalli Vidyasagar, Robot Modeling and Control, Wiley, 2006.
[10] http://www.ntrex.co.kr/

Kwang-Ho Seok received the B.S. degree in Internet-media engineering from Korea University of Technology and Education (KUT), Korea, in 2007, and the M.S. degree in Information media engineering from Korea University of Technology and Education (KUT), Korea, in 2009. Since 2009, he has been a Ph.D. student in the Humane Interaction Lab (HILab), Korea University of Technology and Education (KUT), Cheonan, Korea. His current research interests include immersive virtual reality, human-robot interaction, and power systems.

Junho Ko received the B.S. degree in Internet-software engineering from Korea University of Technology and Education (KUT), Korea, in 2009, and the M.S. degree in Information media engineering from Korea University of Technology and Education (KUT), Korea, in 2011. Since 2011, he has been a Ph.D. student in the Humane Interaction Lab (HILab), Korea University of Technology and Education (KUT), Cheonan, Korea. His research fields are artificial intelligence and vehicle interaction.

Yoon Sang Kim received the B.S., M.S., and Ph.D. degrees in electrical engineering from Sungkyunkwan University, Seoul, Korea, in 1993, 1995, and 1999, respectively. He was a Member of the Postdoctoral Research Staff of the Korea Institute of Science and Technology (KIST), Seoul, Korea. He was also a Faculty Research Associate in the Department of Electrical Engineering, University of Washington, Seattle. He was a Member of the Senior Research Staff at the Samsung Advanced Institute of Technology (SAIT), Suwon, Korea. Since March 2005, he has been an Associate Professor at the School of Computer Science and Engineering, Korea University of Technology and Education (KUT), Cheonan, Korea. His current research interests include virtual simulation, Power-IT technology, and device-based interactive applications. Dr. Kim was awarded a Korea Science and Engineering Foundation (KOSEF) Overseas Postdoctoral Fellowship in 2000. He is a member of IEEE, IEICE, ICASE, KIPS and KIEE.
