Since the long-term goal of the Botnia Nao robot team is to let the robots join the RoboCup competition, the way the robot tracks the ball needs to be improved. Here are some suggestions:

First, there is no need to measure the distance between the ball and the robot: in a real match there is no time for calculating the distance, so the robot should simply walk towards the ball directly. Therefore, the robot should track the ball dynamically. Since the resolution of the image used is 640*480, the center of the robot's view has the coordinates [320, 240]. The angle of the HeadPitch joint should be adjusted continuously, and the robot should turn its body at the same time, so that the ball stays at the coordinate [320, 240]. Finally, based on a threshold value of the HeadPitch angle, the robot will stop in front of the ball and kick it.
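
The sketch below illustrates one possible shape of this tracking loop. It is only a sketch: the helpers get_image() and detect_ball() are hypothetical (the detector is assumed to return the pixel coordinates of the ball center, or None when no ball is visible), and the gain and the HeadPitch stop threshold are tuning assumptions rather than values measured in this thesis.

from naoqi import ALProxy

IMG_W, IMG_H = 640, 480   # camera resolution used in this work
GAIN = 0.2                # proportional gain (tuning assumption)
PITCH_STOP = 0.45         # HeadPitch stop threshold in radians (assumption)

def track_ball(IP, PORT, get_image, detect_ball):
    motion = ALProxy("ALMotion", IP, PORT)
    while True:
        ball = detect_ball(get_image())           # hypothetical helpers
        if ball is None:
            motion.moveToward(0.0, 0.0, 0.0)      # no ball: stop walking
            continue
        cx, cy = ball
        ex = (cx - IMG_W / 2.0) / (IMG_W / 2.0)   # horizontal error, -1..1
        ey = (cy - IMG_H / 2.0) / (IMG_H / 2.0)   # vertical error, -1..1
        # Tilt the head so the ball stays at the image center [320, 240].
        motion.changeAngles("HeadPitch", GAIN * ey, 0.1)
        # Turn the body towards the ball while walking forward.
        motion.moveToward(0.5, 0.0, -GAIN * ex)
        # A steep HeadPitch means the ball is at the feet: stop and kick.
        if motion.getAngles("HeadPitch", True)[0] > PITCH_STOP:
            motion.moveToward(0.0, 0.0, 0.0)
            break   # hand over to the kicking behaviour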

The vision module can be created with OpenCV and C++, and all the movement code should also be written in C++; in this case, using C++ is more efficient than using Python.
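
As a rough illustration of such a vision module, the noise-filtering and Hough-circle-transform pipeline summarized in the next chapter could be sketched as follows. The sketch is in Python for brevity (a C++ port using cv::GaussianBlur and cv::HoughCircles would be analogous), and all parameter values are assumptions that would need tuning for the real ball.

import cv2
import numpy as np

def find_ball(bgr):
    # Filter noise first so the circle detector works on clean edges.
    blurred = cv2.GaussianBlur(bgr, (9, 9), 2)
    gray = cv2.cvtColor(blurred, cv2.COLOR_BGR2GRAY)
    # Hough circle transform (cv2.HOUGH_GRADIENT in OpenCV 3+;
    # the 2.x releases current in 2015 call it cv2.cv.CV_HOUGH_GRADIENT).
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1, minDist=50,
                               param1=100, param2=30,
                               minRadius=5, maxRadius=120)
    if circles is None:
        return None
    x, y, r = np.round(circles[0, 0]).astype(int)
    return (x, y), r   # ball center and radius in pixels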

7 SUMMARY

This thesis introduces the behavior design of a humanoid Nao robot: a case of picking up a ball and throwing the ball into a box.

In the vision system stage, the main algorithm is based on noise filtering and the Hough circle transform, and by utilizing that algorithm, the accuracy of detecting the location of the center of the ball is improved. In the strategy stage, probability theory is used to design the strategy of picking up the ball, which improves the rate at which Nao picks up the ball successfully to almost 85%. Besides, appropriate mathematical models also contribute to calculating the distance to the ball and to designing the strategy for tracking the ball dynamically. In addition, polynomial functions are utilized to calculate the distance to the box, and key frames in the timeline are used to implement the animation design. Therefore, the key points of this thesis are the algorithm in the vision system, the appropriate mathematical models and the animation design.
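
As a concrete example of the polynomial distance models, the cubic fitted in Appendix 2 can be evaluated directly; the snippet below simply reuses the coefficients produced there by the polyfit command.

def distance_from_size(x):
    # Cubic from Appendix 2: maps the measured size x to a distance in meters.
    return -485.5931 * x**3 + 266.2636 * x**2 - 50.8341 * x + 3.7969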

In the beginning, this project was quite challenging, but by separating this huge project into several small modules, the project was completed successfully. In the process of completing a project, starting with some simple parts simplifies the work and builds confidence. There is no denying that much knowledge was learnt in the process of completing this project. Now, the methods for tracking a round object of any color and for building mathematical models are clear. Besides, the skills needed to program the Nao robot have also improved.

When it comes to suggestions for students who are interested in the Nao robot, choosing an interesting topic is very important. Before starting to program with Python or C++, Choregraphe can be used to get familiar with the Nao programming environment. Each box in Choregraphe is like demo code, which may inspire you in the process of doing the project. By doing a project on Nao, a student can improve a lot.

Finally, we wish all the students working on this project the best in their future studies and careers.

8 REFERENCES

/1/ Aldebaran official website. Accessed 15.April.2015. https://www.aldebaran.com/sites/aldebaran/files/images/S%C3%A9lection%2012.jpg

/2/ Nao Robot Introduction. Accessed 17.April.2015. https://www.aldebaran.com/en/humanoid-robot/nao-robot

/3/ Seo KiSung, 2011, Using Nao: Introduction to Interactive Humanoid Robots, Aldebaran Robotics & NT Research, Inc., 1, 10-24

/4/ Aldebaran Documentation - H25 - Construction. Accessed 20.April.2015. http://doc.aldebaran.com/2-1/family/nao_h25/dimensions_h25.html

/5/ Aldebaran Documentation - NAO Battery. Accessed 23.April.2015. http://doc.aldebaran.com/2-1/family/robots/battery_robot.html

/6/ Aldebaran Documentation - NAO H25. Accessed 25.April.2015. http://doc.aldebaran.com/2-1/family/nao_h25/index_h25.html#nao-h25

/7/ Sonar of the Nao robot. Accessed 26.April.2015. http://doc.aldebaran.com/2-1/family/robots/sonar_robot.html#robot-sonar

/11/ Python IDE - IDLE. Accessed 2.May.2015. https://docs.python.org/2/library/idle.html

/12/ PyDev Python IDE for Eclipse. Accessed 3.May.2015. http://marketplace.eclipse.org/content/pydev-python-ide-eclipse

/13/ Aldebaran Documentation - NAO Video Camera. Accessed 5.May.2015. http://doc.aldebaran.com/2-1/family/robots/video_robot.html#robot-video

/14/ Aldebaran Documentation: Joints. Accessed 5.May.2015. http://doc.aldebaran.com/2-1/family/robots/joints_robot.html

/15/ OpenCV Introduction. Accessed 6.May.2015. http://docs.opencv.org/modules/core/doc/intro.html#

/16/ OpenCV. Accessed 7.May.2015. http://opencv.org/

/17/ The Blog of Dr Moron. November 9, 2013. Accessed 8.May.2015. http://drmoron.org/is-black-a-color/

/18/ Imgarcade - The HSV Color. Accessed 8.May.2015. http://imgarcade.com/1/hsi-color-model/

/19/ Canny Edge Detection. Accessed 9.May.2015. http://opencv-python-tutroals.readthedocs.org/en/latest/py_tutorials/py_imgproc/py_canny/py_canny.html#canny

/20/ NAO Software Documentation: NAOqi. Accessed 9.May.2015. http://doc.aldebaran.com/1-14/naoqi/trackers/index.html#naoqi-trackers

/21/ NAO Software Documentation. Accessed 9.May.2015. http://doc.aldebaran.com/1-14/naoqi/vision/allandmarkdetection.html

/22/ Aldebaran - Nao Mark. Accessed 10.May.2015. http://doc.aldebaran.com/2-1/_downloads/NAOmark.pdf

/23/ NAO Software Documentation: Timeline. Accessed 10.May.2015.

http://doc.aldebaran.com/1-14/software/choregraphe/panels/timeline_panel.html

APPENDIX 1.

FIND THE RED COLOR THRESHOLD

import cv2
import numpy as np

# One pixel of the reference red (the array is read in BGR order by OpenCV).
red = np.uint8([[[227, 62, 56]]])

# Convert that single pixel to HSV; the printed H value is the basis
# for choosing the red threshold range (e.g. for cv2.inRange).
hsv_red = cv2.cvtColor(red, cv2.COLOR_BGR2HSV)
print hsv_red


APPENDIX 2.

MATLAB COMMAND FOR FINDING THE RELATIONSHIP OF REAL DISTANCE AND SIZE X

% Measured size x and the corresponding real distance y in meters.
x = [0.228,0.188,0.158,0.138,0.121,0.108,0.098,0.091,0.085,0.078,0.073,0.070,0.065]

y = [0.3,0.4,0.5,0.6,0.7,0.8,0.9,1.0,1.1,1.2,1.3,1.4,1.5]

% Fit a cubic polynomial to the measurements.
P = polyfit(x,y,3)

% Evaluate the fitted cubic and plot it against the data points.
y2 = -485.5931*x.^3 + 266.2636*x.^2 - 50.8341*x + 3.7969
plot(x,y,'*',x,y2,'-')

APPENDIX 3.

MOVETO METHOD

import almath
from naoqi import ALProxy

def moveTo(x, y, degree, IP, PORT):
    motion = ALProxy("ALMotion", IP, PORT)
    positionErrorThresholdPos = 0.01
    positionErrorThresholdAng = 0.03

    # The command position estimation will be set to the sensor position
    # when the robot starts moving, so we use sensors first and commands later.
    initPosition = almath.Pose2D(motion.getRobotPosition(True))
    targetDistance = almath.Pose2D(x, y, degree * almath.PI / 180)
    expectedEndPosition = initPosition * targetDistance

    enableArms = True
    motion.setMoveArmsEnabled(enableArms, enableArms)

    # ALMotion.moveTo expects the rotation in radians, so convert the degrees.
    motion.moveTo(x, y, degree * almath.PI / 180)

    # The move is finished, so compare the sensed pose with the expected one.
    realEndPosition = almath.Pose2D(motion.getRobotPosition(False))
    positionError = realEndPosition.diff(expectedEndPosition)
    positionError.theta = almath.modulo2PI(positionError.theta)

    index = 0
    while (abs(positionError.x) > positionErrorThresholdPos or
           abs(positionError.y) > positionErrorThresholdPos or
           abs(positionError.theta) > positionErrorThresholdAng):
        print "go again"
        # Re-issue the remaining offset as a corrective move so the loop can
        # terminate (positionError is expressed in the robot frame).
        motion.moveTo(positionError.x, positionError.y, positionError.theta)
        realEndPosition = almath.Pose2D(motion.getRobotPosition(False))
        positionError = realEndPosition.diff(expectedEndPosition)
        positionError.theta = almath.modulo2PI(positionError.theta)
        index += 1    # count the correction attempts

    if (abs(positionError.x) < positionErrorThresholdPos and
            abs(positionError.y) < positionErrorThresholdPos and
            abs(positionError.theta) < positionErrorThresholdAng):
        print "move success!"
    else:
        print positionError.toVector()

    motion.moveToward(0.0, 0.0, 0.0)