Direction/Velocity Vector of Printhead movement
-
@deme not from the OM as it currently stands. The best option would be to use a DSF plugin to pre-process the gcode stream and keep a list of the next moves that are about to happen. You can then correlate that with the OM.
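A minimal sketch of the bookkeeping such a plugin would do. In a real DSF plugin the lines would arrive through dsf-python's code interception; here `parse_move()` and `MoveLog` are plain helpers (names are my own invention) so the look-ahead logic can be shown standalone.

```python
import re
from collections import deque

_MOVE_RE = re.compile(r"^G[01]\b")

def parse_move(line):
    """Return a dict of axis letters -> values for a G0/G1 line, else None."""
    line = line.split(";", 1)[0].strip()          # drop gcode comments
    if not _MOVE_RE.match(line):
        return None
    return {m.group(1): float(m.group(2))
            for m in re.finditer(r"([XYZEF])(-?\d+\.?\d*)", line)}

class MoveLog:
    """Keep the next few upcoming XY targets to correlate with the OM."""
    def __init__(self, depth=8):
        self.upcoming = deque(maxlen=depth)

    def feed(self, line):
        move = parse_move(line)
        if move and ("X" in move or "Y" in move):
            self.upcoming.append((move.get("X"), move.get("Y")))
```

Feeding `["G28", "G1 X10 Y5 F3000", "G1 X20", "M106 S255"]` through `feed()` would leave `(10.0, 5.0)` and `(20.0, None)` in the buffer; the non-move lines are ignored.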
-
@t3p3tony In the short term this could work, but in the long term it would be better to be able to read the queue directly to enable directional in-situ control. Is there an alternative to the OM? Does it make sense to reprogram the firmware? The topic is part of my master thesis and I have to consider all alternatives.
-
@deme said in Direction/Velocity Vector of Printhead movement:
looking for a way to be able to read the feed direction from the printer controller.
Could you use an accelerometer on the tool?
That would give instantaneous direction & acceleration (so, indirectly, velocity). The Duet can also take an accelerometer directly, though I'm not sure if the data it collects can be accessed.
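The "indirectly, velocity" step is just integration over time. A sketch, assuming the planar samples are already gravity-compensated and the head starts at rest (both assumptions, not something the sensor gives you for free):

```python
import math

def heading_from_accel(samples, dt):
    """Integrate planar acceleration samples (ax, ay) into a velocity
    vector and return (feed direction in degrees from the X axis, speed).
    Assumes gravity-compensated samples and a start at rest."""
    vx = vy = 0.0
    for ax, ay in samples:
        vx += ax * dt
        vy += ay * dt
    return math.degrees(math.atan2(vy, vx)), math.hypot(vx, vy)

# Constant acceleration of 2 units along +Y for 0.1 s at 100 Hz
# gives a heading of 90 degrees and a speed of about 0.2.
```

In practice the integration drifts, which is why sensor fusion comes up later in this thread.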
-
@rjenkinsgb as it stands the accelerometer data is saved to a file during capture, not directly accessible through the object model.
-
@deme It would be possible for you to modify the firmware to report this information, but I don't know how much work it would be; possibly quite a lot, and it may require changes that break or reduce the effectiveness of other elements.
The reason for writing DSF the way it is written, with the ability to have plugins that intercept and process the gcode stream, is to allow for use cases like this. For most people it is easier to write a Python script than to work directly with the C++ in the firmware.
-
I forget where I saw it, but I recently saw a video of someone who had a concrete printer or some other kind of special printer. The toolhead oriented itself towards the print direction. I believe, but I might be incorrect, that the machine used a modified version of RRF.
-
Concrete printer and knife follower have definitely been a topic here. Usually they get controlled with an extra axis which gets positional movement instructions from the slicer / gcode directly (instead of trying to calculate it on the fly). See https://forum.duet3d.com/topic/6993/requests-for-help-gcode-cam-follower-tangential-knife-control
-
@deme said in Direction/Velocity Vector of Printhead movement:
it would be better to be able to read the queue directly to enable directional in-situ control.
IMHO to get ahead of the queue, you need to simulate gcode in real time on the Pi and send data to Duet simultaneously.
IIRC, the simulator plugin currently runs on the Duet, so there is a pile of work to do.
Using only the accelerometer will fail in the long run; you'd need 6-DOF or even 9-DOF sensor fusion (accelerometer, position, Earth's magnetic field), as used for multicopter heading and mission (path) control.
-
Hi guys,
thanks again for your ideas and tips. It's been a while now, but I implemented and tested many ideas. In the meantime, I finished my toolchanger conversion and can finally perform proper tests.
I have tested the following:
1) An accelerometer mounted on the Tool-Carrier
@rjenkinsgb said in Direction/Velocity Vector of Printhead movement:
Could you use an accelerometer on the tool?
That would give instantaneous direction & acceleration (so, indirectly, velocity).
The idea is great, but after a few tests I unfortunately noticed that any vibration (e.g. a door slam) affects my LED control. I know I could further process the sensor data on the Raspberry Pi and solve the problem that way. But I don't like the fact that unplanned environmental influences can affect my control.
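For reference, the "further processing" mentioned above could be as simple as a first-order low-pass filter on the Pi; a sketch (filter constant chosen arbitrarily):

```python
class LowPass:
    """First-order low-pass (exponential moving average) to damp short
    vibration spikes, e.g. a door slam, before the accelerometer data
    drives the LED control. alpha in (0, 1]: smaller = heavier smoothing."""
    def __init__(self, alpha=0.1):
        self.alpha = alpha
        self.value = None

    def update(self, sample):
        if self.value is None:
            self.value = sample       # seed with the first sample
        else:
            self.value += self.alpha * (sample - self.value)
        return self.value
```

With `alpha=0.5`, a sudden 8.0 spike on a 0.0 baseline only moves the output to 4.0 and then decays, instead of passing straight through to the LED control.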
@t3p3tony said in Direction/Velocity Vector of Printhead movement:
@rjenkinsgb as it stands the accelerometer data is saved to a file during capture, not directly accessible through the object model.
I solved this problem by connecting the accelerometer directly to the Raspberry Pi.
2) Stay ahead of the movement with a code logger
@o_lampe said in Direction/Velocity Vector of Printhead movement:
IMHO to get ahead of the queue, you need to simulate gcode in real time on the Pi and send data to Duet simultaneously
@o_lampe gave me the idea to analyze the queue with a code logger, do my calculations, and keep them in a kind of buffer until I reach the current position. For this I wrote a script using the dsf-python API which stores the commands that are sent to RRF (state: Executed) and compares them with the current position.
The problem I have with this concept is that I lose my reference when switching between absolute and relative positioning. Here I would have to program a parser that reads G90/G91 commands in addition to G0/G1 and stores the future absolute position, as is probably implemented in RRF. This makes the script sluggish and not real-time capable. Or maybe my programming skills are just not good enough to implement something like that.
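The G90/G91 bookkeeping itself is small; a minimal sketch of the parser described above, tracking only XY and ignoring arcs (G2/G3), G92 offsets and workplace coordinates:

```python
import re

class PositionTracker:
    """Track the absolute XY target across G90/G91 mode switches,
    roughly as RRF's own gcode parser would. Minimal sketch only:
    arcs, G92 offsets and workplace coordinates are ignored."""
    def __init__(self):
        self.absolute = True                  # RRF defaults to absolute (G90)
        self.pos = {"X": 0.0, "Y": 0.0}

    def feed(self, line):
        line = line.split(";", 1)[0].strip().upper()
        if line.startswith("G90"):
            self.absolute = True
        elif line.startswith("G91"):
            self.absolute = False
        elif re.match(r"^G[01]\b", line):
            for m in re.finditer(r"([XY])(-?\d+\.?\d*)", line):
                axis, value = m.group(1), float(m.group(2))
                self.pos[axis] = value if self.absolute else self.pos[axis] + value
        return dict(self.pos)
```

For example, `G1 X10 Y10`, then `G91`, `G1 X5`, `G90`, `G1 Y3` ends at X15 Y3. Whether this is fast enough in real time is exactly the open question above.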
3) My current concept
@resam said in Direction/Velocity Vector of Printhead movement:
Concrete printer and knife follower have definitely been a topic here. Usually they get controlled with an extra axis which gets positional movement instructions from the slicer / gcode directly (instead of trying to calculate it on the fly). See https://forum.duet3d.com/topic/6993/requests-for-help-gcode-cam-follower-tangential-knife-control
Thanks to @resam for the forum post by @Russ-Gries. Following the discussions in this forum, I would like to proceed as follows:
- instead of trying to calculate the position of the LED ring on the fly, I want the LED ring to be treated as an additional axis
- a post-processing script should extend the positional arguments with another coordinate (e.g. U) => already done
- position U shall be recognized by a plugin (DSF command subscription around the OM) and used to control the LED ring
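The post-processing step can be sketched like this: compute the feed direction between consecutive absolute XY targets and append it as a U word, then map U to an LED index. This is my reading of the plan above, not the author's actual script; G90 mode, plain XY moves, and a 24-LED ring are all assumptions.

```python
import math

def add_u_words(lines):
    """Append a U word (feed direction in degrees from the X axis) to
    each absolute G0/G1 XY move. Sketch: assumes G90 mode throughout."""
    out, last = [], None
    for line in lines:
        words = {}
        if line.startswith(("G0 ", "G1 ")):
            words = {w[0]: float(w[1:]) for w in line.split()[1:]
                     if w and w[0] in "XYZEF"}
        if "X" in words and "Y" in words:
            if last is not None:
                u = math.degrees(math.atan2(words["Y"] - last[1],
                                            words["X"] - last[0])) % 360
                line = f"{line} U{u:.1f}"
            last = (words["X"], words["Y"])
        out.append(line)
    return out

def led_index(u_deg, num_leds=24):
    """Map the U angle to the nearest LED on a ring (24 LEDs assumed)."""
    return round(u_deg / 360 * num_leds) % num_leds
```

Running `add_u_words(["G1 X0 Y0", "G1 X10 Y0", "G1 X10 Y10"])` appends `U0.0` to the second move and `U90.0` to the third, and `led_index(90.0)` lands on LED 6 of 24.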
My current problems are:
- if I don't configure U as an additional axis, G1 U10 commands are not parsed and not passed to the OM
- if I configure U as a kind of "virtual axis", the LED ring acts like a motor. This gives me delays caused by the path calculation, especially when I have multiple circular movements and the position wraps from 360 to 0.
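The 360-to-0 delay comes from the planner taking the long way round. If the LEDs were driven directly from the U value rather than as a motorized axis, the wraparound reduces to picking the signed shortest rotation; a sketch:

```python
def shortest_step(current, target):
    """Signed shortest rotation (degrees) from current to target,
    so a 350 -> 10 change moves +20 instead of -340."""
    return (target - current + 180) % 360 - 180
```

E.g. `shortest_step(350, 10)` is `+20` and `shortest_step(10, 350)` is `-20`, whereas a real axis configured for U would physically travel the full difference.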
I am surprised that such a simple task (turning on an LED in the feed direction) has become such a complicated project. Does anyone have an idea how I can configure a virtual axis but not "move" it like a real one? I have read about a second DDA ring but haven't dived into it yet. Would that also be a possible solution?
-
@deme if you are prepared to compile the firmware yourself, you could add the current direction vector to the object model. You would need to modify the code in file Move.cpp and perhaps also in DDA.h or DDA.cpp to add the direction vector from the DDA to the object model. The code that adds requestedSpeed and topSpeed from the DDA to the OM could serve as a guide.
-
@deme I've been following this thread as I've been thinking about using a laser diode to preheat the previous layer just before the next layer is applied. This would be achieved by having the laser focused just ahead of the nozzle in the direction of travel, on a rotating axis wrapped around the nozzle assembly. I think it has parallels, and I'd be using a real motor-driven axis so I wouldn't be worried about your current issue. I could also just have many laser diodes covering segments of the nozzle perimeter, which would be directly equivalent to your concept.
If you have any pointers on how your proposed post processing script would work I'd be interested to hear about them.
All the best.
Barry M
-
@dc42 said in Direction/Velocity Vector of Printhead movement:
@deme if you are prepared to compile the firmware yourself, you could add the current direction vector to the object model. You would need to modify the code in file Move.cpp and perhaps also in DDA.h or DDA.cpp to add the direction vector from the DDA to the object model. The code that adds requestedSpeed and topSpeed from the DDA to the OM could serve as a guide.
This is also interesting, thanks for the insights @dc42
-
@dc42 Thanks for the suggestion. A few other questions:
- Is there a reference for how often the OM can be polled (per second) without losing performance?
- Is there a way to force RRF to parse an undefined axis and stream the values to the OM? Or does that make absolutely no sense? For example, to do something like this: G1 X10 Y10 U90 for LED ring position 90 deg relative to the X axis.
I have already read these posts here and am still not sure which approach to take:
- https://forum.duet3d.com/topic/23659/stream-true-current-position/7?loggedin=true
- https://forum.duet3d.com/topic/26378/polling-the-position-of-the-printer?_=1643369305548
@CNCModeller: I think you mean something like this: https://www.mdpi.com/2504-4494/5/3/82?type=check_update&version=2
-
@deme said in Direction/Velocity Vector of Printhead movement:
@CNCModeller: I think you mean something like this: https://www.mdpi.com/2504-4494/5/3/82?type=check_update&version=2
Exactly, just goes to show there is very little new under the sun...
-
@deme said in Direction/Velocity Vector of Printhead movement:
@dc42 Thanks for the suggestion. A few other questions:
- Is there a reference for how often the OM can be polled (per second) without losing performance?
I wouldn't advise polling the whole of the OM much faster than DWC does it, which is normally 4 times per second. Maybe 10 times per second would be OK. However, you could poll just a part of the OM (e.g. the current move direction vector, if we added that to the OM) more often, because the amount of data returned would be much smaller.
- Is there a way to force RRF to parse an undefined axis and stream the values to the OM? Or does that make absolutely no sense? For example, to do something like this: G1 X10 Y10 U90 for LED ring position 90 deg relative to the X axis.
I can't think of a way to do that using the current firmware.