Add ROS Moveit support #115
Work is in a private repo for now, but I will make a public one once I have gone through the motions at work to allow it. Progress
In what binary format are the positions that are returned after sending an 'r' oplet? In the home position I get [524286, 524287, 524286, 524287, 0]. I assume the large numbers are negative positions, but how exactly are negative positions represented?
The 'r' oplet doesn't return a standard status reply; it reads a file, so the response depends on the file you choose to read. The 'g' oplet returns the standard status, which includes the measured angles of the joints. It is well documented here: If you use the 'r' oplet to read the "file" called #measured_angles, then the reply is in arcseconds. I can't think of anything you could read with 'r' that would return those values...
I'm using 'r' with #measured_angles. I think that's supposed to give me the joint angles, which appears to be the case after trying it. I just don't know how to interpret the huge numbers it returns for arcseconds.
For example: when an axis moves backwards through 0 arcseconds, it wraps around to 524287. That corresponds to 145.635278 degrees, but I would expect it to wrap around to 360 degrees. It could be that the issue in #104 has not yet been fixed in the firmware version I'm running. |
Using the 'r' oplet to read the "file" called #measured_angles when the Dexter is in the home position, I read joint angles that look like this: [524286, 524287, 524286, 524287, 0]. These values are supposedly in arcseconds. If a joint angle decreases past 0, it wraps around to 524287. This doesn't make much sense, because 524287 arcseconds is about 145 degrees, but I would have expected a number just short of 360 degrees. It makes more sense if I convert 524287 to binary as an unsigned integer (0b1111111111111111111), then to a signed integer in two's complement: -1. This way I get the values I would expect when I move an axis backwards past 0. This also means, however, that only about 72 degrees on each side of 0 can be represented using the range of a 19-bit two's complement. This isn't addressed by the documentation for 'r' or anything else as far as I'm aware.
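For anyone else who hits this, here is a minimal sketch of the conversion described above (plain Python, no Dexter-specific API assumed):

```python
def decode_joint_angle(raw):
    """Interpret a raw 19-bit reading as a signed two's-complement value (arcseconds)."""
    return raw - (1 << 19) if raw >= (1 << 18) else raw

# The home-position readings above: 524286/524287 decode to -2/-1 arcseconds.
for raw in [524286, 524287, 524286, 524287, 0]:
    arcsec = decode_joint_angle(raw)
    print(raw, "->", arcsec, "arcsec =", round(arcsec / 3600.0, 6), "deg")
```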
I have this functioning now and hope to eventually move it to a public repo once I get permission and have tidied things up a bit. I'm able to plan paths using MoveIt's Python move_group API and execute them on the machine, or in RViz with fake controllers. Is there anywhere I can find a URDF for the gripper? I'd like to add it to the robot (in a modular fashion) so I can start playing with MoveIt's pick-and-place pipeline.
I've hit a roadblock in getting reasonable trajectory-tracking performance via ros_control. The position controller I've made is a bit jerky and imprecise as there's no velocity control. It turns out that most robot arms are typically integrated via velocity or effort hardware interfaces, but I don't see a way to implement one for the Dexter.
Converting between velocity and position doesn't seem difficult.
That's what I have at the moment, but it results in jerky motion. It slows down to stop at each of the waypoints and speeds up again when the waypoint changes. To make that work I would need to be able to set the desired velocity with which to travel through the waypoints.
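For illustration, a minimal sketch of that kind of loop, assuming a fixed update rate and a hypothetical send_goal(angles) helper standing in for whatever actually issues a pid_move_all_joints-style position goal:

```python
import time

def stream_position_setpoints(current, velocity, duration, send_goal, rate_hz=20):
    """Turn a desired joint velocity into a stream of position goals.

    current and velocity are per-joint lists (same units the robot expects).
    Because each goal is position-only, the robot's PID decelerates into every
    setpoint, which is where the stop/start jerkiness comes from.
    """
    dt = 1.0 / rate_hz
    for _ in range(int(duration * rate_hz)):
        current = [p + v * dt for p, v in zip(current, velocity)]
        send_goal(current)
        time.sleep(dt)
    return current
```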
I'm a little confused. The waypoint should change before the arm actually reaches it, e.g. the arm should lag a bit, naturally, exactly because of that slowdown as it nears the waypoint. A bit of jerking as it accelerates makes sense, but during the motion and as it slows down at the end, it should be smoother. Can you share a video? One thing you can try is "scheduling" (increasing or decreasing) the PID_P drive on each joint according to the change in goal position. When the change is large, increase it, and when it's small (at the start or end) decrease it to smooth out the start and stop.
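A rough sketch of that gain-scheduling idea (the gain range and reference step size are made-up tuning constants, and how the gain actually gets written to the robot is left out):

```python
def scheduled_pid_p(goal_delta_arcsec, p_min=500.0, p_max=5000.0, step_ref=2000.0):
    """Scale PID_P with the size of the jump to the next goal position:
    large steps get the full gain, small steps (start/end of a move) get less."""
    scale = min(abs(goal_delta_arcsec) / step_ref, 1.0)
    return p_min + scale * (p_max - p_min)
```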
TL;DR: I need either direct control of joint velocity over TCP/serial, or an onboard trajectory-point controller (simultaneous velocity + position) that I can queue points for (ideal). After talking to some smart people and reading lots of things, I've realized that what is needed for proper ROS integration is an onboard (Dexter-side) closed-loop 'trajectory point' controller (try to hit position X at velocity V after time dt). The time dt would be nice, but a velocity + position controller would suffice. That is essentially how waypoints are described in ROS trajectory messages, e.g.
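(For reference, one such waypoint built with the trajectory_msgs Python API; the numbers are just illustrative:)

```python
import rospy
from trajectory_msgs.msg import JointTrajectoryPoint

point = JointTrajectoryPoint()
point.positions = [0.0, 0.52, -1.1, 0.3, 0.0]    # target joint angles (rad)
point.velocities = [0.1, 0.05, 0.0, 0.2, 0.0]    # velocity to carry through the point (rad/s)
point.time_from_start = rospy.Duration(2.0)      # "after time dt"
```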
etc... I can implement a "joint trajectory actionlib server" http://library.isr.ist.utl.pt/docs/roswiki/joint_trajectory_action.html on a computer running ROS and send whole trajectories to Dexter to execute, or add trajectory points to a queue individually. Dexter doesn't need to run a ROS node onboard, as long as goal states of the controller can be queued externally via TCP or serial etc. The whole setup would look something like this: This solution is potentially very performant and will make full use of the Dexter's capabilities if the onboard velocity + position controller is implemented on the Dexter's FPGA. It would still be good enough if it were implemented in C on the Dexter. This solution would plug straight into ROS using this, which I linked before. Another option that won't work quite as well, but is probably much easier to implement on the Dexter side, is to expose dumb motor velocity control via oplets. This way I can use [ros_control](http://wiki.ros.org/ros_control), which provides PID controllers, to make a trajectory controller that runs on the computer running ROS. The drawback of that is the latency from communicating with Dexter vs running PID loops onboard. This could be used in conjunction with the position controller I have already made to get an accurate final position as well as reasonable trajectory execution. I'm familiar with ros_control now and would be able to get the ROS side of things working quickly once I have a way to directly control joint velocities.
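As a rough sketch of that server side — dexter_queue_point() below is a placeholder for whatever TCP/serial call would actually queue a position + velocity goal on the Dexter, not an existing API:

```python
#!/usr/bin/env python
import actionlib
import rospy
from control_msgs.msg import FollowJointTrajectoryAction, FollowJointTrajectoryResult


def dexter_queue_point(point):
    # Placeholder: push one position/velocity/time waypoint to the robot over TCP or serial.
    pass


def execute_cb(goal):
    # Forward every waypoint in the trajectory to the robot's onboard controller.
    for point in goal.trajectory.points:
        dexter_queue_point(point)
    server.set_succeeded(FollowJointTrajectoryResult())


if __name__ == "__main__":
    rospy.init_node("dexter_trajectory_server")
    server = actionlib.SimpleActionServer("dexter/follow_joint_trajectory",
                                          FollowJointTrajectoryAction,
                                          execute_cb, auto_start=False)
    server.start()
    rospy.spin()
```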
Would setting [AngularSpeedStartAndEnd](https://github.com/abdullin/Dexter/wiki/set-parameter-oplet#AngularSpeedStartAndEnd) help with this? The similar parameter for angular acceleration could also be useful. Other than those, I don't see any oplets or parameters that can let me do this sort of trajectory control.
The jerking isn't noticeable at low speeds but becomes worse as speed increases, even during the motion. I think it's because pid_move_all_joints is position-only, so the arm slows down in preparation for stopping at the goal rather than moving through it. What I'm after is something like pid_move_all_joints that also lets me specify the joint angular velocities it should aim to have when it moves through the goal positions. That would be the same as the ideal solution in my last long comment. I see how changing PID_P would help prevent the roll-off in speed and will give it a try if there doesn't end up being a nicer solution. I see some issues though. The increments to the goal position are small, so it's always close to the goal position, which means I would need to set P pretty high. I don't think I would be able to explicitly set the desired velocity at the goal point like I need to. The bandwidth of the connection to the Dexter could become an issue if I need to constantly stream PID_P changes to the Dexter alongside frequent goal updates. It's effectively another P controller that controls velocity by setting the P gains of the PID controller xD. Maybe a new oplet could be implemented in DexRun.c to do this, to avoid the bandwidth and latency limitations. I can get you a video of the jerky motion next week if you'd still like to see it.
It's beyond me. Perhaps you can work with @JamesWigglesworth. The best I can advise is to use PID_P combined with moving the goal point before the arm gets to it. I would also recommend NOT using the 'r' oplet to read back data, as it's slower; instead just use the binary status returned from the robot after every command. In fact, you should be able to literally watch the joint and detect the point when the goal position should be moved. Like a mechanical rabbit at a dog race track, or a kitten with a laser pointer... you control velocity through the point by making the point a different point.
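A sketch of that "mechanical rabbit" idea, assuming hypothetical read_measured_angles() and send_goal() helpers rather than any specific oplet:

```python
def chase_goal(waypoints, lookahead_arcsec, read_measured_angles, send_goal):
    """Hand the arm the next waypoint as soon as every joint is within
    lookahead_arcsec of the current one, so it never fully decelerates
    at the intermediate points."""
    for target in waypoints:
        send_goal(target)
        while not all(abs(m - t) < lookahead_arcsec
                      for m, t in zip(read_measured_angles(), target)):
            pass  # poll the binary status; in practice, sleep briefly here
```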
@HerringTin I have been working on setting up MoveIt support for Dexter. Do you have any more notes or a link to the repo you were working on?
Create Dexter robot_description
Using URDF provided in #97.
Create ROS controllers (action servers) for MoveIt
Using ros_control framework like this
Implement hardware interface
For basic FollowJointTrajectory controller:
Integrate with MoveIt