Real-time mapping between the Kinect XBOX 360 and a six-axis robot arm
- Communication between ROS and the UR5 arm
- Human skeleton tracking with the Kinect 1
- Skeleton tracking
- Driving the UR5 arm with data from the Kinect
- Summary
Date: 2019-06
Location: Shenzhen
Email: 583259763@qq.com
=================================================================================
Overview:
1. How to control a UR5 robot arm from ROS;
2. Human skeleton tracking with a Kinect 1;
3. Passing the extracted skeleton joint-angle parameters to the UR5 arm.
Environment and main tools:
Ubuntu 16.04, ROS (Kinetic), CLion, OpenNI, etc.
=================================================================================
Communication between ROS and the UR5 arm
The UR5 series of robot arms can be controlled from ROS, and the relevant packages are available in the ROS open-source community, so installing them takes a single command:
sudo apt-get install ros-kinetic-universal-robot
When driving a real UR arm, however, this package has problems; a workaround is described in the linked post on commissioning a real UR arm (ur機(jī)械臂實(shí)體聯(lián)調(diào)解決方法). With that, the first step is complete.
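For reference, a typical bring-up sequence, following the universal_robot README for Kinetic (a sketch; IP_OF_THE_ROBOT is a placeholder you must replace with your controller's address):

roslaunch ur_bringup ur5_bringup.launch robot_ip:=IP_OF_THE_ROBOT
roslaunch ur5_moveit_config ur5_moveit_planning_execution.launch
roslaunch ur5_moveit_config moveit_rviz.launch config:=true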
Human skeleton tracking with the Kinect 1
First, a word about the skeleton space. The Kinect represents a skeleton with 20 joint points. When you walk into the Kinect's field of view (standing), it locates all 20 of your joints, each expressed as an (x, y, z) coordinate. The complex movements you make in front of the Kinect are closely tied to changes in these joint positions, so once the computer has this data it goes a long way toward understanding what movement you are performing.
Skeleton tracking
The IDE I use on Ubuntu is CLion, which is very convenient. The OpenNI code for extracting the skeleton follows:
#include <stdlib.h>
#include <iostream>
#include <vector>
#include <XnCppWrapper.h>
#include <XnModuleCppInterface.h>
#include "cv.h"
#include "highgui.h"

using namespace std;
using namespace cv;

//#pragma comment (lib,"cv210")
//#pragma comment (lib,"cxcore210")
//#pragma comment (lib,"highgui210")
//#pragma comment (lib,"OpenNI")

//【1】generators for user, depth and image data
xn::UserGenerator  userGenerator;
xn::DepthGenerator depthGenerator;
xn::ImageGenerator imageGenerator;

/* XN_SKEL_HEAD          = 1,  XN_SKEL_NECK           = 2,
   XN_SKEL_TORSO         = 3,  XN_SKEL_WAIST          = 4,
   XN_SKEL_LEFT_COLLAR   = 5,  XN_SKEL_LEFT_SHOULDER  = 6,
   XN_SKEL_LEFT_ELBOW    = 7,  XN_SKEL_LEFT_WRIST     = 8,
   XN_SKEL_LEFT_HAND     = 9,  XN_SKEL_LEFT_FINGERTIP = 10,
   XN_SKEL_RIGHT_COLLAR  = 11, XN_SKEL_RIGHT_SHOULDER = 12,
   XN_SKEL_RIGHT_ELBOW   = 13, XN_SKEL_RIGHT_WRIST    = 14,
   XN_SKEL_RIGHT_HAND    = 15, XN_SKEL_RIGHT_FINGERTIP= 16,
   XN_SKEL_LEFT_HIP      = 17, XN_SKEL_LEFT_KNEE      = 18,
   XN_SKEL_LEFT_ANKLE    = 19, XN_SKEL_LEFT_FOOT      = 20,
   XN_SKEL_RIGHT_HIP     = 21, XN_SKEL_RIGHT_KNEE     = 22,
   XN_SKEL_RIGHT_ANKLE   = 23, XN_SKEL_RIGHT_FOOT     = 24 */

// a line will be drawn between each start point and the corresponding end point
int startSkelPoints[14] = {1,2,6,6,12,17,6,7,12,13,17,18,21,22};
int endSkelPoints[14]   = {2,3,12,21,17,21,7,9,13,15,18,20,22,24};

// callback function of user generator: new user
void XN_CALLBACK_TYPE NewUser( xn::UserGenerator& generator, XnUserID user, void* pCookie )
{
    cout << "New user identified: " << user << endl;
    //userGenerator.GetSkeletonCap().LoadCalibrationDataFromFile( user, "UserCalibration.txt" );
    generator.GetPoseDetectionCap().StartPoseDetection( "Psi", user );
}

// callback function of user generator: lost user
void XN_CALLBACK_TYPE LostUser( xn::UserGenerator& generator, XnUserID user, void* pCookie )
{
    cout << "User " << user << " lost" << endl;
}

// callback function of skeleton: calibration start
void XN_CALLBACK_TYPE CalibrationStart( xn::SkeletonCapability& skeleton, XnUserID user, void* pCookie )
{
    cout << "Calibration start for user " << user << endl;
}

// callback function of skeleton: calibration end
void XN_CALLBACK_TYPE CalibrationEnd( xn::SkeletonCapability& skeleton, XnUserID user, XnCalibrationStatus calibrationError, void* pCookie )
{
    cout << "Calibration complete for user " << user << ", ";
    if( calibrationError == XN_CALIBRATION_STATUS_OK )
    {
        cout << "Success" << endl;
        skeleton.StartTracking( user );
        //userGenerator.GetSkeletonCap().SaveCalibrationDataToFile( user, "UserCalibration.txt" );
    }
    else
    {
        cout << "Failure" << endl;
        // for the current version of OpenNI, only the Psi pose is available
        ((xn::UserGenerator*)pCookie)->GetPoseDetectionCap().StartPoseDetection( "Psi", user );
    }
}

// callback function of pose detection: pose detected
void XN_CALLBACK_TYPE PoseDetected( xn::PoseDetectionCapability& poseDetection, const XnChar* strPose, XnUserID user, void* pCookie )
{
    cout << "Pose " << strPose << " detected for user " << user << endl;
    ((xn::UserGenerator*)pCookie)->GetSkeletonCap().RequestCalibration( user, FALSE );
    poseDetection.StopPoseDetection( user );
}

void clearImg( IplImage* inputimg )
{
    CvFont font;
    cvInitFont( &font, CV_FONT_VECTOR0, 1, 1, 0, 3, 5 );
    memset( inputimg->imageData, 255, 640*480*3 );
}

int main( int argc, char** argv )
{
    char key = 0;
    int imgPosX = 0;
    int imgPosY = 0;

    // initialize the context
    xn::Context context;
    context.Init();
    xn::ImageMetaData imageMD;
    IplImage* cameraImg = cvCreateImage( cvSize(640,480), IPL_DEPTH_8U, 3 );
    cvNamedWindow( "Camera", 1 );

    // map output mode
    XnMapOutputMode mapMode;
    mapMode.nXRes = 640;
    mapMode.nYRes = 480;
    mapMode.nFPS  = 30;

    // create the generators
    depthGenerator.Create( context );
    depthGenerator.SetMapOutputMode( mapMode );
    imageGenerator.Create( context );
    userGenerator.Create( context );

    //【2】register callback functions of the user generator
    XnCallbackHandle userCBHandle;
    userGenerator.RegisterUserCallbacks( NewUser, LostUser, NULL, userCBHandle );

    //【3】register callback functions of the skeleton capability
    xn::SkeletonCapability skeletonCap = userGenerator.GetSkeletonCap();
    skeletonCap.SetSkeletonProfile( XN_SKEL_PROFILE_ALL );
    XnCallbackHandle calibCBHandle;
    skeletonCap.RegisterToCalibrationStart( CalibrationStart, &userGenerator, calibCBHandle );
    skeletonCap.RegisterToCalibrationComplete( CalibrationEnd, &userGenerator, calibCBHandle );

    //【4】register callback functions of the pose detection capability
    XnCallbackHandle poseCBHandle;
    userGenerator.GetPoseDetectionCap().RegisterToPoseDetected( PoseDetected, &userGenerator, poseCBHandle );

    // start generating data
    context.StartGeneratingAll();
    while( key != 27 )
    {
        context.WaitAndUpdateAll();
        imageGenerator.GetMetaData( imageMD );
        memcpy( cameraImg->imageData, imageMD.Data(), 640*480*3 );
        cvCvtColor( cameraImg, cameraImg, CV_RGB2BGR );

        // get the users
        XnUInt16 userCounts = userGenerator.GetNumberOfUsers();
        if( userCounts > 0 )
        {
            XnUserID* userID = new XnUserID[userCounts];
            userGenerator.GetUsers( userID, userCounts );
            for( int i = 0; i < userCounts; ++i )
            {
                //【5】if this user's skeleton is being tracked
                if( skeletonCap.IsTracking( userID[i] ) )
                {
                    XnPoint3D skelPointsIn[24], skelPointsOut[24];
                    XnSkeletonJointTransformation mJointTran;
                    for( int iter = 0; iter < 24; iter++ )
                    {
                        // the XnSkeletonJoint enum runs from 1 to 24
                        skeletonCap.GetSkeletonJoint( userID[i], XnSkeletonJoint(iter+1), mJointTran );
                        skelPointsIn[iter] = mJointTran.position.position;
                    }
                    depthGenerator.ConvertRealWorldToProjective( 24, skelPointsIn, skelPointsOut );

                    //【6】draw the joints and limbs
                    for( int d = 0; d < 14; d++ )
                    {
                        CvPoint startpoint = cvPoint( skelPointsOut[startSkelPoints[d]-1].X, skelPointsOut[startSkelPoints[d]-1].Y );
                        CvPoint endpoint   = cvPoint( skelPointsOut[endSkelPoints[d]-1].X,   skelPointsOut[endSkelPoints[d]-1].Y );
                        cvCircle( cameraImg, startpoint, 1.5, CV_RGB(255,0,0), 6 );
                        cvCircle( cameraImg, endpoint,   1.5, CV_RGB(255,0,0), 6 );
                        cvLine( cameraImg, startpoint, endpoint, CV_RGB(0,255,0), 2 );
                    }
                }
            }
            delete [] userID;
        }
        cvShowImage( "Camera", cameraImg );
        key = cvWaitKey(20);
    }

    // stop and shut down
    cvDestroyWindow( "Camera" );
    cvReleaseImage( &cameraImg );
    context.StopGeneratingAll();
    context.Shutdown();
    return 0;
}

The CMakeLists file for the skeleton-extraction project:
cmake_minimum_required(VERSION 3.12)
project(OpenNi)

set(CMAKE_CXX_STANDARD 11)

#list( APPEND CMAKE_MODULE_PATH ${PROJECT_SOURCE_DIR}/cmake_modules )
#set(CMAKE_MODULE_PATH ${CMAKE_CURRENT_SOURCE_DIR}/cmake_modules)
#list(APPEND CMAKE_MODULE_PATH)
#set(CMAKE_PREFIX_PATH ${CMAKE_CURRENT_SOURCE_DIR}/cmake_modules)
#find_package(OpenNI REQUIRED)
#include_directories(${OPENNI_INCLUDE_DIR})
include_directories("/usr/include/ni")

find_package(OpenCV REQUIRED)
include_directories(${OpenCV_INCLUDE_DIRS})

add_executable(OpenNi main.cpp)
target_link_libraries(OpenNi ${OpenCV_LIBS} libOpenNI.so)
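Assuming the OpenNI 1.x headers (under /usr/include/ni) and the OpenCV development packages are installed, this builds like any ordinary CMake project:

mkdir build && cd build
cmake ..
make
./OpenNi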
Driving the UR5 arm with data from the Kinect
After extracting the skeleton with the Kinect camera, we focus on the 3-D coordinates of three points on the right arm: the shoulder joint, the elbow joint, and the hand. These are used to compute the mapping; the correspondences are listed below, with a small worked sketch after the list:
⑴ The left-right waving angle maps to the base joint of the arm;
⑵ The angle between the upper arm and the line across the shoulders maps to the second joint from the bottom;
⑶ The angle between the upper and lower arm maps to the third joint from the bottom.
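To make mapping ⑶ concrete, here is a minimal sketch of the elbow-angle computation that the control script below performs; the joint coordinates are hypothetical values standing in for the real tf lookups:

import numpy as np

# hypothetical joint positions in metres, standing in for tf lookups
shoulder = np.array([0.10, 0.25, 1.40])
elbow    = np.array([0.35, 0.25, 1.40])
hand     = np.array([0.35, 0.25, 1.15])

# vectors from the elbow to the shoulder and from the elbow to the hand
v_se = shoulder - elbow
v_he = hand - elbow

# angle between the upper and lower arm from the dot product
cos_a = v_se.dot(v_he) / (np.linalg.norm(v_se) * np.linalg.norm(v_he))
elbow_angle = np.arccos(np.clip(cos_a, -1.0, 1.0))
print("elbow angle: %.3f rad" % elbow_angle)  # pi/2 for this right-angled pose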
Below is the full program for joint-space control of the UR5 arm:
#!/usr/bin/env python
# -*- coding: utf-8 -*-

import rospy, sys
import tf
import math
import moveit_commander
from moveit_commander import MoveGroupCommander, PlanningSceneInterface
from moveit_msgs.msg import PlanningScene, ObjectColor
from geometry_msgs.msg import PoseStamped, Pose
from control_msgs.msg import GripperCommand
import numpy as np


class MoveItFKDemo:
    def __init__(self):
        moveit_commander.roscpp_initialize(sys.argv)
        rospy.init_node("moveit_fk_demo", anonymous=True)

        # begin: add a ground plane to the planning scene
        scene = PlanningSceneInterface()
        self.scene_pub = rospy.Publisher("planning_scene", PlanningScene, queue_size=5)
        self.colors = dict()
        rospy.sleep(1)
        reference_frame = "base"
        ground_id = "ground"
        scene.remove_world_object(ground_id)
        rospy.sleep(1)
        ground_size = [3, 3, 0.02]
        ground_pose = PoseStamped()
        ground_pose.header.frame_id = reference_frame
        ground_pose.pose.position.x = 0
        ground_pose.pose.position.y = 0
        ground_pose.pose.position.z = -ground_size[2] / 2.0
        ground_pose.pose.orientation.w = 1.0
        scene.add_box(ground_id, ground_pose, ground_size)
        self.setColor(ground_id, 0.8, 0, 0, 1.0)
        self.sendColors()
        # end: ground plane added

        arm = moveit_commander.MoveGroupCommander("manipulator")
        arm.set_goal_joint_tolerance(0.001)
        arm.set_named_target("up")
        arm.go()
        rospy.sleep(2)

        listener = tf.TransformListener()
        rate = rospy.Rate(10.0)

        # unit vectors along the three axes
        x_unit = np.array([1, 0, 0])
        y_unit = np.array([0, 1, 0])
        z_unit = np.array([0, 0, 1])

        while not rospy.is_shutdown():
            # angle1: left-right waving angle, mapped to the base joint
            try:
                (trans_s_n, rot_s_n) = listener.lookupTransform("neck_1", "left_shoulder_1", rospy.Time(0))
                (trans_e_n, rot_h_n) = listener.lookupTransform("neck_1", "left_elbow_1", rospy.Time(0))
            except (tf.LookupException, tf.ConnectivityException, tf.ExtrapolationException):
                continue

            # projection of trans_e_n onto the y axis (green)
            projy_trans_e_n = np.array(trans_e_n).dot(y_unit) * y_unit
            # projection of trans_e_n onto the xz plane
            projxz_trans_e_n = trans_e_n - projy_trans_e_n
            # projection of the upper arm onto the xz plane
            proj_upper_hand = projxz_trans_e_n - trans_s_n
            # length of proj_upper_hand
            L_proj_upper_hand = np.sqrt(proj_upper_hand.dot(proj_upper_hand))
            # compute angle1
            angle1 = -1.57 + np.arccos(proj_upper_hand.dot(z_unit) / L_proj_upper_hand)
            print("angle1 is %f" % angle1)

            # angle2: angle between the upper arm and the shoulder line,
            # mapped to the second joint from the bottom
            # projection of trans_e_n onto the z axis (blue)
            projz_trans_e_n = np.array(trans_e_n).dot(z_unit) * z_unit
            # projection of trans_e_n onto the xy plane
            projxy_trans_e_n = trans_e_n - projz_trans_e_n
            # projection of the upper arm onto the xy plane
            projxy_upper_hand = projxy_trans_e_n - trans_s_n
            # length of projxy_upper_hand
            L_projxy_upper_hand = np.sqrt(projxy_upper_hand.dot(projxy_upper_hand))
            # compute angle2
            angle2 = -1.57 + np.arccos(projxy_upper_hand.dot(y_unit) / L_projxy_upper_hand)
            # print("angle2 is %f" % angle2)

            # angle3: angle between the upper and lower arm,
            # mapped to the third joint from the bottom
            try:
                (trans_s_e, rot_s) = listener.lookupTransform("left_elbow_1", "left_shoulder_1", rospy.Time(0))
                (trans_h_e, rot_h) = listener.lookupTransform("left_elbow_1", "left_hand_1", rospy.Time(0))
            except (tf.LookupException, tf.ConnectivityException, tf.ExtrapolationException):
                continue

            # dot product of the two vectors
            vector_multi_1 = np.array(trans_s_e).dot(np.array(trans_h_e))
            # lengths of the two vectors
            L_trans_s_e = np.sqrt(np.array(trans_s_e).dot(np.array(trans_s_e)))
            L_trans_h_e = np.sqrt(np.array(trans_h_e).dot(np.array(trans_h_e)))
            angle3 = math.acos(vector_multi_1 / (L_trans_h_e * L_trans_s_e))
            angle3 = -3.14 + angle3
            print("angle3 is %f" % angle3)

            # clamp angle2 and send the joint-space target to MoveIt
            if angle2 > -0.08:
                angle2 = -0.08
            joint_positions = [angle1, angle2, angle3, -1.57, -1.57, 0]
            arm.set_joint_value_target(joint_positions)
            arm.go()
            rospy.sleep(0.5)

        moveit_commander.roscpp_shutdown()
        moveit_commander.os._exit(0)

    def setColor(self, name, r, g, b, a=0.9):
        color = ObjectColor()
        color.id = name
        color.color.r = r
        color.color.g = g
        color.color.b = b
        color.color.a = a
        self.colors[name] = color

    def sendColors(self):
        p = PlanningScene()
        p.is_diff = True
        for color in self.colors.values():
            p.object_colors.append(color)
        self.scene_pub.publish(p)


if __name__ == "__main__":
    try:
        MoveItFKDemo()
    except rospy.ROSInterruptException:
        pass

[Demo image of the final effect]
Summary
Using the Kinect to capture a person's skeleton parameters and mapping them in real time onto several joints of the robot arm makes the arm imitate the motion of a human arm. The main work was: using the Kinect camera with OpenNI, a cross-platform skeleton-tracking package, to track the human skeleton; and processing the data collected by the Kinect and sending it to the UR arm so that it imitates the arm's motion.
Several problems remain, and they are the next stage of work. First, even when my arm is held still, the Kinect still reports the hand as moving, and on the robot arm this shows up as jitter. Second, the robot's actual range of joint motion differs from the human arm's range, which is larger. Third, there is considerable latency: the robot arm moves only some time after the human arm does. A sketch of one possible fix for the jitter follows.
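A common mitigation for the jitter, not implemented in this post, is to low-pass filter the computed angles before sending them to MoveIt. A minimal sketch, assuming an exponential moving average with a dead band (the class and its alpha/deadband values are illustrative, not part of the script above):

import numpy as np

class AngleSmoother:
    # exponential moving average with a dead band (illustrative sketch)
    def __init__(self, alpha=0.3, deadband=0.03):
        self.alpha = alpha        # smoothing factor in (0, 1]; smaller = smoother
        self.deadband = deadband  # changes below this threshold (rad) are ignored
        self.state = None

    def update(self, angles):
        angles = np.asarray(angles, dtype=float)
        if self.state is None:    # first sample initializes the filter
            self.state = angles
            return self.state
        delta = angles - self.state
        delta[np.abs(delta) < self.deadband] = 0.0  # suppress small sensor noise
        self.state = self.state + self.alpha * delta
        return self.state

In the control loop one would then pass smoother.update([angle1, angle2, angle3, -1.57, -1.57, 0]) to set_joint_value_target instead of the raw angles.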
I am still new to ROS and would be glad to exchange ideas with everyone.