Multiaxis verifying endpoint position and orientation
-
Hello, @YuriConfessor asked me to implement his robot arm, a 4-axis palletizing robot type. I am currently developing it, but what is also needed is a way to test the result exactly. This is more than just saying "position xyz is reached, looks ok" or "the firmware result data look ok". This thread is meant to gather ideas on how an as-exact-as-possible approach can be implemented instead, using the physical result. This will also help to find possible discrepancies between the firmware result and the physical result. Let's say the firmware reports 100,100,100 and the physical result is 110,90,100 (plus additional information about orientation); then there is a problem in the setup, the code or somewhere else.
Let's say you have a robot/3D printer/CNC, a firmware implementation, and some G-Code to make movements. The G-Code is executed, the axes move, and the endpoint ends up with some position and orientation as a result.
How can it be verified that this movement result matches what the G-Code requested? How can it be implemented? Which measurement methods are possible, and can they be automated?
E.g. optical (laser based, camera sensors, ...), mechanical, or an additional MCU that measures and compares with the Duet's data. If the measurement is fast enough, it could also be used for collision detection later.
This thread shall gather ideas; I will add my own ideas here as well.
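As a first rough idea from my side, the comparison itself can be scripted once an external measurement exists. This is only a minimal sketch (Python with numpy; the function name, the direction-vector model for orientation and the example numbers are my own assumptions, and how the measurement is obtained is left completely open):

```python
import numpy as np

def pose_error(commanded_xyz, measured_xyz,
               commanded_dir=None, measured_dir=None):
    """Compare a commanded endpoint pose with an externally measured one.

    Positions are XYZ in mm; the optional direction vectors describe the
    tool/endpoint orientation (e.g. the nozzle axis) so an angular error
    can be reported as well.
    """
    commanded_xyz = np.asarray(commanded_xyz, dtype=float)
    measured_xyz = np.asarray(measured_xyz, dtype=float)

    position_error = measured_xyz - commanded_xyz      # per-axis error, mm
    distance_error = np.linalg.norm(position_error)    # total error, mm

    angle_error_deg = None
    if commanded_dir is not None and measured_dir is not None:
        a = np.asarray(commanded_dir, dtype=float)
        b = np.asarray(measured_dir, dtype=float)
        a /= np.linalg.norm(a)
        b /= np.linalg.norm(b)
        angle_error_deg = np.degrees(
            np.arccos(np.clip(np.dot(a, b), -1.0, 1.0)))

    return position_error, distance_error, angle_error_deg

# Example from above: firmware reports (100,100,100),
# the physical measurement gives (110, 90, 100).
per_axis, total, _ = pose_error((100, 100, 100), (110, 90, 100))
print(per_axis, total)   # [ 10. -10.   0.]  ~14.14 mm -> setup/code problem
```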
-
@JoergS5 a low-tech test for XY precision would be a printed template of a square and a circle, with a pen on the arm. Try drawing over the template; any deviation from the preprinted template can be measured with a micrometer.
-
@oliof you can do the same with camera sensors and a laser/LED through the nozzle. The advantage is that it can be automated. But the thread's main goal is measuring orientation and automating the measurement, especially for multiaxis support. Still, you're right, it's a good first test.
-
I wish there was an autocalibration routine, like the Delta kinematics has.
It would evaluate the arm lengths and starting (homed) angles.
Maybe an acceleration sensor on the toolhead or a more advanced IMU could help.
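As a rough sketch of the accelerometer idea (these are the standard tilt-from-gravity formulas, only valid while the head is standing still; the axis assignment depends on how the sensor is mounted, and yaw cannot be obtained from gravity alone):

```python
import math

def tilt_from_accelerometer(ax, ay, az):
    """Estimate static toolhead tilt from a 3-axis accelerometer reading.

    ax, ay, az are the accelerations (any consistent unit) measured while
    the head is stationary, so that only gravity is sensed.
    Returns (roll_deg, pitch_deg).
    """
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    return math.degrees(roll), math.degrees(pitch)

# Head perfectly level: gravity only on Z -> 0 deg roll and pitch.
print(tilt_from_accelerometer(0.0, 0.0, 9.81))
```
-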
@o_lampe said in Multiaxis verifying endpoint position and orientation:
I wish there was an autocalibration routine, like the Delta kinematics has.
The autocalibration for linear and rotary deltas works because changes in the parameters that are calibrated result in height changes; so it is possible to calibrate just by Z probing.
When I implemented SCARA kinematics I looked at whether it was possible to autocalibrate the arm lengths and homing positions. This could be done if a number of points were probed but in X and Y rather than Z. This is likely true for robot arm kinematics too. So if someone can come up with a good way of probing points in X and Y, we should be able to implement autocalibration. One possibility might be to replace the nozzle with a camera and use dots on the bed as the probe points. The camera software would need to report the X and Y distance of the dot from the centre of the image.
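A minimal sketch of the camera part of that idea, assuming the frame is already available as a grayscale array and the dot is dark on a light bed (the threshold and the mm-per-pixel scale are placeholder values that would come from a one-off camera calibration):

```python
import numpy as np

def dot_offset_from_centre(gray_image, threshold=80, mm_per_pixel=0.05):
    """Report the X/Y distance of a dark dot from the image centre.

    gray_image   : 2-D numpy array of grayscale pixel values (0..255)
    threshold    : pixels darker than this count as part of the dot
    mm_per_pixel : scale from a one-off camera calibration (assumed value)

    Returns (dx_mm, dy_mm), the dot centroid relative to the image centre.
    """
    ys, xs = np.nonzero(gray_image < threshold)   # pixels belonging to the dot
    if xs.size == 0:
        raise ValueError("no dot found below threshold")

    cx, cy = xs.mean(), ys.mean()                 # dot centroid, pixels
    centre_x = (gray_image.shape[1] - 1) / 2.0
    centre_y = (gray_image.shape[0] - 1) / 2.0

    return ((cx - centre_x) * mm_per_pixel,
            (cy - centre_y) * mm_per_pixel)

# Synthetic test frame: a dark dot 20 px right of centre -> about +1.0 mm in X.
frame = np.full((200, 200), 255, dtype=np.uint8)
frame[95:105, 115:125] = 0
print(dot_offset_from_centre(frame))
```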
-
@dc42 I'm prototyping a "3D bed" for calibrating the arm. The problem is the "donut shaped" movement, but I think I can make something that can be used for calibration. In the next week or so I'll post it so everyone can help improve it.
-
@dc42 It might be possible to use a reverse (*) CNC touch probe on the tool head to probe XYZ at every corner of the bed.
*) Instead of placing a touch plate in every corner, just put a plate on the toolhead and a probing pin at the corners.
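One way such corner probes could be evaluated, as a sketch (the bed size and probed values below are made up; a single rigid transform obviously cannot model all arm errors, but the residuals left over after the fit hint at non-rigid problems such as wrong arm lengths):

```python
import numpy as np

def fit_rigid_transform(nominal, probed):
    """Best-fit rotation R and translation t with probed ~ R @ nominal + t.

    nominal, probed : (N, 3) arrays of matching XYZ points (N >= 3).
    Returns (R, t, residuals); residuals are the per-point errors in mm
    remaining after the rigid part has been removed.
    """
    nominal = np.asarray(nominal, dtype=float)
    probed = np.asarray(probed, dtype=float)

    nc, pc = nominal.mean(axis=0), probed.mean(axis=0)
    H = (nominal - nc).T @ (probed - pc)           # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # avoid a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = pc - R @ nc

    residuals = np.linalg.norm(probed - (nominal @ R.T + t), axis=1)
    return R, t, residuals

# Hypothetical 200 x 200 mm bed corners and the XYZ values probed at each pin.
nominal = [(0, 0, 0), (200, 0, 0), (200, 200, 0), (0, 200, 0)]
probed = [(0.3, -0.1, 0.0), (200.2, 0.4, 0.1),
          (199.8, 200.5, 0.0), (-0.2, 200.1, 0.1)]
R, t, res = fit_rigid_transform(nominal, probed)
print(t, res)   # overall offset/rotation plus per-corner residual errors
```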
-
@MaxGyver for XYZ offsets it looks like a valuable tool, but does it measure orientation as well? (angles of the endpoint with respect to the workpiece)
-
Yes, you are right. With the ball test, the backlash and squareness of each axis in relation to the other linear axes can be determined.
This automatic calibration method can also be used to calibrate machines with linear and rotational axes.
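As a hedged sketch of one common variant of such a routine for a rotary axis (assuming the probing yields the centre of a reference ball at several angles of that axis; this is only an illustration, not necessarily what the video shows): fit a plane and a circle to the probed centres, so the circle centre gives a point on the rotary axis and the plane normal its direction.

```python
import numpy as np

def rotary_axis_from_ball_centres(centres):
    """Estimate a rotary axis from probed reference-ball centres.

    centres : (N, 3) ball-centre positions, one per rotary-axis angle (N >= 3).
    Returns (axis_point, axis_dir, radius): a point on the axis, the unit
    axis direction (normal of the fitted plane) and the circle radius.
    """
    P = np.asarray(centres, dtype=float)
    mean = P.mean(axis=0)

    # Plane through the points: normal = direction of smallest variance.
    _, _, Vt = np.linalg.svd(P - mean)
    normal = Vt[2]
    u, v = Vt[0], Vt[1]                       # in-plane basis vectors

    # 2-D coordinates in the plane, then an algebraic (Kasa) circle fit.
    Q = np.column_stack(((P - mean) @ u, (P - mean) @ v))
    A = np.column_stack((2 * Q, np.ones(len(Q))))
    b = (Q ** 2).sum(axis=1)
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    radius = np.sqrt(c + cx * cx + cy * cy)

    axis_point = mean + cx * u + cy * v       # circle centre lies on the axis
    return axis_point, normal, radius

# Hypothetical probing of a ball 50 mm from a rotary axis at four angles.
angles = np.radians([0, 90, 180, 270])
pts = np.column_stack((50 * np.cos(angles), 50 * np.sin(angles),
                       np.full(4, 20.0)))
point, direction, r = rotary_axis_from_ball_centres(pts)
print(point, direction, r)    # axis near (0, 0, 20), direction ~ Z, r ~ 50
```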
-
@MaxGyver that's interesting, thank you for the video link.