Posts made by srs5694
-
RE: Strange lines in heightmap
I don't know why, but FWIW, I've seen the same sort of accordion pattern on mine, too. I saw the same thing when I was using an inductive probe, before I got fed up with it and bought a Smart Effector, so I don't think it's specific to the Smart Effector. The actual first-layer height seems to come out OK, so investigating this has been a low priority for me.
-
RE: Oversized objects after upgrade to smart effector
Chances are your delta arm length setting is wrong – it's just a little bit too long, if I recall the relationship correctly. It's my understanding that auto-configuration systems can't detect this error, so it's important that you get it right manually. This is set via the "L" option to the "M665" command. If you're using a Duet controller, you'd set this in config.g (although it can be overridden in config-override.g, if your config.g is set up to load it, so you might need to set it in both files to be sure). In some other firmware systems, you'd edit the firmware configuration file, rebuild the firmware binary, and re-upload it.
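For the Duet case, concretely, the line in config.g looks something like this – every number below is a made-up placeholder, so use your own measured rod length and calibrated values:
M665 L360.0 R178.0 H420.0 B150.0 ; L = diagonal rod length in mm; R/H/B = delta radius, homed height, printable radius
And since M501 (the command that loads config-override.g) normally runs at the end of config.g, an M665 there would win, which is why you may need to correct the L value in both files.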
-
RE: New circular print fan duct available
Nice designs! I would like to use a 50mm radial fan with a circular duct, and may create a remix of srs5694's design for that.
I toyed with doing that. In fact, my design is based on an earlier effector-and-fan design that I did, which used a 50mm radial fan, before I got fed up with the inductive probe I was using and decided the Smart Effector looked like the best way to do bed probing on a delta printer. Although I've abandoned that effector design for myself, I'm planning to publish it, probably this evening (US Eastern time). You might be able to merge bits from the two designs once I publish the effector.
That said, one reason I didn't pursue the radial fan idea is that placement of the fan seemed awkward. It would either need to lie on its side or be placed upright but extend out quite a way from the effector, with a rectangular duct to link it and the circular duct. It didn't seem worth the effort to me personally to design it – but of course if that's the design you prefer, and if my design helps you get there, then that's great!
Any idea how far away the fan duct should be from the heat block, depending on the heat resistance of the material? Is the silicone sock enough to shield a PLA/PETG/nylon duct when it gets very close to the sock?
I don't know for sure, but I can say that I printed my duct in PETG, and it seems to be holding up fine to printing PETG. (I've got an E3D V6 sock on my heat block.) My other printer (a Robo 3D R1+) has a PLA fan duct that's about as close to its nozzle as my fan duct design gets to its nozzle, and even the Robo's fan duct seems OK. (It looks like it's softened a little, but not much.) I print PETG on that printer, too, so the heat block gets up to 250C or thereabouts.
-
RE: New circular print fan duct available
Mine is about 5mm above the tip of the nozzle. (This is using a genuine E3D V6 but with a third-party stainless steel nozzle.) This height is dictated largely by the bracket and fan, and should be similar to the height of dc42's fan duct. Note that the airflow is aimed to converge at the center of the torus at 8mm below the duct – in other words, 3mm below the print surface, in a theoretical sense. In practice, of course, the air stream is not a mathematical line; it has a diameter, there's turbulence, etc. I'm not an expert on airflow by any means, so I was just guesstimating at something that would produce good airflow in the area immediately surrounding the tip of the nozzle, without directing much airflow on the nozzle or heater block – I didn't want to cause problems keeping the hot end up to temperature.
If you care to experiment, of course, you can adjust the targetZ variable in the EffectorCircularFanDuct.scad file, or adjust EffectorFanBracket.scad to lower the fan and duct. Either will require you to use OpenSCAD and build a new .stl file. You could also look at https://www.thingiverse.com/thing:2573606, which is a modified bracket for the E3D Volcano. Somebody's already tried that combination and said it works, but I don't have an E3D Volcano for testing, myself. My guess is it would place the fan duct too low on an E3D V6 or most clones.
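If it helps, here's a minimal OpenSCAD sketch of the targetZ idea – everything below is invented for illustration (the real default value, sign convention, and duct geometry live in EffectorCircularFanDuct.scad, so check its comments before editing):
targetZ = -8;        // aim point of the airflow, mm relative to the duct (assumed sign)
ductRadius = 20;     // made-up duct ring radius, mm
// visualize the nominal air streams converging from the duct ring to the aim point
for (a = [0 : 30 : 330])
    hull() {
        rotate([0, 0, a]) translate([ductRadius, 0, 0]) sphere(0.5, $fn = 16);
        translate([0, 0, targetZ]) sphere(0.5, $fn = 16);
    }
Making targetZ more negative aims the converging air further below the nozzle; after editing the real file, render (F6) and export a new .stl.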
-
RE: Problem sensitivite smart effector.
OK, I've now tried the 1.21RC3 firmware and "M558… B1". This works, but it has a big drawback: I get heating faults during probing. Specifically, when I tried probing after both the bed and the hot end had reached their target temperatures (60C and 210C, respectively), the hot end produced a heater fault with the message "temperature excursion exceeded 15.0C." (I could see the hot end temperature plunging throughout the probing process. The bed temperature dropped, too, but not as dramatically.) I also tried probing while both the bed and hot end were still being heated. In this case, both of them produced heating faults, with the message "temperature rising much more slowly than the expected 0.1C/sec" (or 0.0C/sec for the hot end).
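For anyone else who wants to try it, the change amounts to adding B1 to the M558 line in config.g – something like the following, where everything other than B1 is illustrative (P5 assumes the Smart Effector's usual probe-type setting):
M558 P5 R0.4 H5 F1200 T6000 B1 ; B1 = switch heaters off while probing -- which is exactly why the temperatures droop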
-
RE: Problem sensitivite smart effector.
I looked for 1.21RC3 firmware and found this:
https://github.com/dc42/RepRapFirmware/tree/dev/EdgeRelease/1.21RC3
There's no DuetWiFiFirmware.bin file in that directory, though, which I believe is the file I'd need to update the firmware.
FWIW, I did try going to bang-bang mode on the bed heater (by commenting out the "M307 H0…" lines in config.g and config-override.g). This greatly improves matters -- the Z probe doesn't produce any false alarms except for a brief blip, apparently when the heater turns off, just after the bed reaches its target temperature. (I can also hear the power supply's fan drop in speed at that moment.)
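If you'd rather be explicit than comment lines out, my understanding is that M307's B parameter selects the control mode, so a line like this in config.g should force bang-bang on the bed:
M307 H0 B1 ; H0 = bed heater; B1 = bang-bang control (B0 = PID)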
-
RE: Problem sensitivite smart effector.
Mine is 12V, with an aluminum plate above the heat bed. I've just done some more experiments. First, I tried adjusting sensitivity with the M672 command, as noted in the documentation. This didn't help; I got false alarms with sensitivities ranging from 1 to 255.
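For reference, the command pattern I used follows the Smart Effector documentation as I understand it: the second number is the desired sensitivity, and the third must be 255 minus that value. For example:
M672 S105:50:205 ; set Smart Effector sensitivity to 50 (205 = 255 - 50)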
Next, I brought the hotend down to 5mm above the bed, with the bed cool, then turned on the bed heater (leaving the hotend heater off). The Z sensor triggered when the heat bed reached 57C (the target was 60C). I then took the temperature down in 5-degree increments. The Z sensor would turn off whenever I set a lower temperature (presumably because the heated bed was no longer receiving power), but would trigger again when the heat bed more or less reached the target temperature. This was true down to 40C, at which point I stopped lowering the temperature and began raising it in 5C increments. Again, the Z probe would trigger whenever the bed's temperature approached the target. I wonder if it might have something to do with the way the Duet controls the approach to the target temperature…? (I have done PID tuning on the bed heater.) Does it modulate the bed heater current in some way that might induce a magnetic field with some heating elements? If so, is there some way to adjust that -- say, by tweaking PID values?
I then removed the aluminum plate, and that had no effect; the Z probe would trigger with or without it.
I took the bed heater out of the frame, but left the heater plugged in. (It's got a fairly long cable.) The Z probe stopped triggering. I then moved the bed heater around and discovered that the Z probe would trigger whenever the bed heater got too close to the effector. What counts as "too close" varied with orientation: edge-on, I needed to get to within 3 or 4 cm of the effector. When I tilted the heater on its edge so that it faced the effector head-on (a camera at the effector's position would "see" it as a circular object), the effector would trigger when the bed was 10cm or more away -- about the same distance that triggers false alarms when everything's properly in place. Thus, I'm pretty sure it's the bed heater itself, not the way the cables are laid out.
I also tried rotating the bed heater in the frame (mounted normally), to no good effect; false alarms would occur no matter the orientation.
Based on this thread, I tried unplugging both my hotend fan and my print-cooling fan, reasoning that the combination of their magnetic fields and whatever the heated bed was producing might be causing problems, but unplugging the fans didn't help, so it seems to be the bed heater alone that's the culprit in my case.
-
RE: New circular print fan duct available
I'm using this on both of my smart effectors now and I really like it. Well done!
Thanks! I'm glad you like it!
-
RE: Problem sensitivite smart effector.
I've not heard of anyone having that problem before. What type of bed heater is it?
It's a Chinese knockoff of a SeeMeCNC Onyx (rev. 6) heater.
-
RE: Problem sensitivite smart effector.
I haven't yet looked into it in detail, but mine has the very odd problem that it signals contact (green LED) when the heated bed is warm (over about 45 or 50C) and the head is closer than about 100mm to the bed. Turning off the bed heater makes it work as expected. The wires for the heated bed and the effector don't cross, although my Duet and all the wires are located directly under the heated bed, so it's conceivable there's some interference that might be hard to eliminate, short of relocating the Duet.
I intend to try adjusting the sensitivity to work around this problem, but I haven't yet gotten around to that. For the moment, I can get a decent height map file by doing my G32 probing cold, and running G29 after heating the bed but with the bed heater turned off.
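Concretely, the workaround amounts to a short sequence like this, run once the bed is up to temperature (the dwell is optional):
M140 S0 ; switch the bed heater off
G4 S10  ; give things a few seconds to settle
G29     ; probe the height map while the bed is still warm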
-
RE: Alternative Carriage Adapters
Before I got my Delta Smart Effector, I was using a modified version of this Thing:
https://www.thingiverse.com/thing:932959
As delivered, it has 20x20 screw spacing, but I had to modify it for 20x16 spacing to attach to the sliders I'm using, which I accomplished by editing the "for (y = [-10, 10])" line to "for (y = [-3, 13])". I made a few other tweaks, too.
When I got my Smart Effector, I needed to change the ball stud spacing to 55mm, which is done via the rodSeparation and ballStudDistance variables (they're semi-redundant). I'm now using my modified version of this Thing with my Smart Effector, without the carriage pieces provided with the Smart Effector. This seems to work fine – or at least, as well as can be expected. (There's a bit too much play in my carriage, and printing parts like these, rather than using manufactured parts, might be a contributing factor.)
In any event, you should be able to make similar changes, just with a tiny change for 20x15 spacing (see the sketch below). If you're not comfortable with OpenSCAD, I could do this for you and send you the files; PM me if you want this. Note that I can't promise how well whatever I produce would mate with your slider; it might be awkwardly high or low. If you do it yourself, you can adjust the y range in that "for" statement to get it to fit the way you want.
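Here's the sketch I mentioned – a minimal, runnable OpenSCAD illustration of just the spacing change; the offsets are my guess (keeping the same 5mm offset from center as my 20x16 edit) and the hole geometry is made up, so adjust both to fit your slider:
// my 20x16 edit changed "for (y = [-10, 10])" to "for (y = [-3, 13])";
// a plausible 20x15 analogue:
for (y = [-2.5, 12.5])
    translate([0, y, 0])
        cylinder(d = 3.2, h = 10, $fn = 32); // stand-in M3 through-hole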
-
New circular print fan duct available
I thought some people here might be interested in the new circular fan duct for the Delta Smart Effector I've just published on Thingiverse:
https://www.thingiverse.com/thing:2808152
This fan duct mounts in the same way as dc42's fan duct (in fact, it uses a minor variant of his EffectorFanBracket design), but it provides a circular (360-degree) or partial (180-degree, 270-degree, or any other value over 180 degrees, if you edit the .scad file) output vent. This improves cooling noticeably, but not dramatically, compared to dc42's design.
-
RE: My fans have stopped working
I've found a workaround. Discussion in a couple of other threads here led me to move the fan's + line from the fan connector and instead connect it directly to the power supply's 12V + line, while leaving the fan's ground connected to the ground pin for the fan on the Duet. Connected in this way, the fan now works again, including software control of its speed. It's not exactly "good as new," but it's an acceptable workaround – unless, of course, somebody tells me that this configuration is a disaster waiting to happen.
-
RE: Why don't you use Cura slicer?
I would prefer if Canonical staff could aggressively dogfood their own product, and that means updating as a priority; to hear otherwise is worrying.
You didn't hear otherwise. I said that I was running the latest LTS release on one of my personal systems. I said nothing about other systems I use – and there are many of them. Furthermore, since LTS releases are maintained in the long term, and receive bug fixes and security updates, those LTS releases must also be aggressively tested.
If I was doing tech support your case would be firmly closed with a 'please come back to us once you can reproduce on vanilla Ubuntu' note.
To which I'd reply that I was running "vanilla Ubuntu!" I said so in my post. Perhaps you misinterpreted when I wrote that I was using "the latest kernel, or at least close to it, as provided by Canonical" – I meant that I'm using a stock kernel as delivered with Ubuntu.
AppImages are not running as root, nor as low-UID services, nor can they elevate their own privileges. They can see what the invoking user sees and nothing else; they cannot write over system files or configs, and must store their own config in a user-writable location, usually the invoking user's homedir. If one ever required or asked to run as root, I'd be deleting it in an instant. And since the alternatives need to be installed by running installers as root, which then make config changes and plonk binary packages deep into the filesystem while blindly executing whatever scripts they're asked to, I think AppImages win very convincingly in the security and convenience stakes.
I never said anything about running Cura (or any AppImage) as root, although I did refer to using sudo or root to install it in a sensible place in the filesystem. (Putting binaries in a user's home directory is so cringe-worthy from a Unix/Linux perspective that it doesn't merit serious conversation!)
Now think about that from Ultimaker's perspective, and understand why they are going down this road. It means that no admin access is required to install and run their app, so long as it stays within the user's sandbox. And you only need to make and distribute one package to achieve this across dozens of distributions.
Yes, I understand this; it's an effort to reduce developer effort, at the cost of deviating significantly from the software distribution model used by the host OSes. From my perspective, that's a sub-optimal solution at best. Particularly if you're advocating putting the binary in users' home directories, it looks like taking a huge step backward to the days of DOS, when people intermingled program files, user data, and so on, with a need to manually update everything. There's a reason things have been moving away from that model for years, and that it was never used in the Unix world.
I suggest you stop replying now; if anything, you're making me think worse of AppImage as an application-delivery format.
Oh, and I've deleted Cura from my system.
-
RE: Why don't you use Cura slicer?
I would like to use Cura, but it crashes under linux.
That's been my experience, too. Like you, I'm using Ubuntu 16.04.
Upgrade; you are 18 months out of date in a fast-moving world.
Ubuntu 16.04 is the latest long-term support (LTS) version of Ubuntu. It's designed for stability, as opposed to the non-LTS releases, three of which have been released since 16.04 came out (16.10, 17.04, and 17.10). The non-LTS releases do have more up-to-date software, but they are not supported for as long and they are more likely to contain bugs. Except in rare circumstances, advising someone to "upgrade" from the most recent LTS to a non-LTS release of Ubuntu is ill-advised. If running the latest version of a program is critical for a computer, that may be an exception to the rule (although even then, if it's just one program, there may be other options).
Note: I'm employed by Canonical, so I'm very familiar with the Ubuntu release model, and I use an LTS release on my main desktop for good reason.
Cura is the worst of these from a Linux package management point of view (although it is open source, and so could easily qualify for packaging); AFAIK, it comes only as a platform-independent .AppImage file.
AppImage IS packaging. Distribution-independent code containerization.
Yes, AppImage is packaging in the generic sense, but it is not the Debian packaging used by Ubuntu (or other package systems, like RPM, used by some other distributions), and that's clearly what I meant.
This distribution format has its advantages for a software developer, but when an OS provides high-level package management tools (as almost all Linux distributions do), not using them is a drawback for the user.
I'd argue it is so simple to use (download, make executable, run) that making the user install a package (open software centre, find, click, supply password, install, close software centre, find app and run; or sudo dnf/yum/apt install <must know the exact package name here>) is not really that much simpler; just more familiar.
First, you're overstating the difficulties of installing a package via a package system – or perhaps understating the difficulties of installing an AppImage file. Try this for the latter: Find the URL to download, download, copy the downloaded file to a location on your path (which requires entering your password or becoming root, unless of course you install somewhere that's not a system path, which has some severe problems), create a symlink so the name is less awkward, repeat every time a new release becomes available. Phrased in this way, the AppImage approach doesn't seem so great, does it?
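Spelled out as commands, that procedure looks something like this – the URL and filename are invented for illustration:
wget https://example.com/Cura-3.2.0.AppImage    # find and download the release
chmod a+x Cura-3.2.0.AppImage                   # make it executable
sudo mv Cura-3.2.0.AppImage /usr/local/bin/     # put it on the PATH (password required)
sudo ln -s /usr/local/bin/Cura-3.2.0.AppImage /usr/local/bin/cura   # less awkward name
# ...and repeat all of the above, by hand, for every new release.
Compare that with a single package-manager command, plus automatic updates thereafter.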
Second, one of the major points of a package system in the Linux software ecosystem is to standardize package installation, which you dismiss with the phrase "just more familiar." Imagine having to independently install every one of the hundreds or thousands of programs that make up an OS in the way that Cura must (AFAIK) be installed on Ubuntu. Even managing just one or two dozen programs in this way would be a pain. Package systems make this very easy by comparison. You install it once and forget it; you needn't bother with pulling down updates for bug fixes and the like, because the system will do it automatically or semi-automatically as part of general OS updates. (A caveat is that an Ubuntu LTS release isn't likely to upgrade the software version just for the heck of it, but it will update to install bug fixes and security updates. If you want the latest and greatest feature, you'll need to upgrade manually in one way or another.)
Third, you're looking at it from the perspective of a single program. That's valid, but it's a very narrow outlook. Most computers will have dozens or hundreds of programs installed, even outside of core OS programs. Having one package system to maintain them all is a huge advantage over a mish-mash of different package formats, URLs to visit for updates, installation procedures, etc. As a user, every program that deviates from a well-established package system's standards increases the difficulty of maintaining my computer.
Finally, even if the software is not in the distribution's package system, but is available as a package file (RPM, Debian package, etc.), there are advantages to the user over an AppImage file. Package systems ensure that there are no filename conflicts and that the package being installed is compatible with other system components – so if the application requires LibraryA version x.y.z or later, the package system will catch this and block the installation (with a suitable warning) if that's not the case. Packages can include documentation, system-wide configuration files, etc. Packages enable other packages to rely on them (admittedly maybe not important for Cura, but who knows….).
Add the fact that each of these distribution-specific packages needs to be properly maintained, tracking every OS /and/ package update. It's a lot of effort for the buildmasters and devops guys to keep up with; AppImages give a single solution across a huge number of target platforms and distros; they are the future.
You're correct that it's easier for the developer to release a program using a single file format than to create separate RPMs, Debian packages, etc., not to mention binaries for MacOS and Windows; however…
- By not creating separate OS-specific packages, you're offloading work onto the user. This may not be a big deal in the case of one program, but if you rely on that fact, and the next software developer does, and so on, pretty soon you're into tragedy-of-the-commons territory and users will move on to a platform where these things are handled properly. Also, every one of your users will be inconvenienced by this offloading. Thus, it's not a question of balancing Time X by the developer against Time Y by the user; it's Time Y for each user, multiplied by Z number of users.
- As a corollary to the preceding, speaking solely in my role as a user, I don't care whether it's easier or harder for a developer to release in one format or another; I want to use what's best for me and my computer. For most current Linux distributions, that's whatever package system that distribution uses. Period.
- In my experience, slicers distributed in AppImage form are the slowest and least reliable (although Slic3r isn't exactly a speed demon, either). I haven't looked very closely, but I suspect this is because they're largely interpreted, whereas faster slicers, like ideaMaker, are compiled. I may be off on this, though; I know very little about AppImage internals or how Cura (or the other AppImage-using slicers I've tried) runs. I just know that Cura is not just slow, but extraordinarily slow.
- Distribution maintainers have teams of people who work to help get applications packaged and into distributions' repositories. As Cura is open source, it qualifies for that. This would not be zero effort, but Cura's developers could certainly reach out to Debian, Red Hat, and others to get Cura into the relevant OS repositories. (I'm not involved enough in Debian packaging to be of all that much help in this, but I can provide some pointers to help connect Cura developers to the Debian packaging chain, if necessary – send me a PM if interested.)
As for AppImage being "the future" of package distribution: I sincerely hope not. They're a huge step backward from what I've seen, at least from the user's point of view.
- I say this as someone who currently makes his money doing cloud application packaging, building RPMs for 'traditional' software deployment, and who looks at AppImages as the best replacement tech for application distribution (but emphatically not OS or system components) to come along so far.
My perspective is as both a user and a developer – I've written the GPT fdisk (aka gdisk) partitioning tool and I maintain the rEFInd boot manager, both of which are in the Debian repository (and therefore Ubuntu). Others have done most of the packaging work for both of these, although I was involved in getting rEFInd into the Debian package system. GPT fdisk is available in Red Hat and most other distributions, too, but rEFInd's uptake is a bit spottier outside of Debian and its derivatives. The point is that I do understand the difficulties of creating an RPM or Debian package, and the even greater hurdles involved in getting a package into a distribution's repositories. These tasks are not trivial; but in terms of creating a package, most of the pain is up-front, in learning the packaging system. Once it's prepped, subsequent updates are relatively easy to do, at least to build the package itself. (Debian, at least, has procedural hurdles involved in pushing through updates that appear in its repositories.)
They also have a HUGE disadvantage, and I think this will kill them over time, or force some sort of update/patch system to be added. Since all the libraries they use are pre-packaged, there is no possibility of retroactively upgrading individual items.
For instance, if you bundle LibreSSL v1.2.3, you are stuck with it. If it is found vulnerable at some future time, you will need to make a whole new AppImage to bundle the newer libs. Fine if the project is very agile and quickly updates, but a nightmare for other scenarios, or for those who are slow to upgrade.
I'd not considered this, but with this point in mind, I'm likely to delete most or all of the AppImage-based slicers I have installed. The security risk is simply too great, especially for a program I don't bother to explicitly upgrade on a regular basis.
In sum, your arguments come across as being very developer-centric – AppImage is easier for you as a developer. I acknowledged as much in my original post. From a Linux user's point of view, though, a Linux package system (which does not include the AppImage format, as I define "package system") is far superior. The title of this thread is "Why don't you use Cura slicer," and implicit in that title is a user-centric point of view. I've presented mine. You may not like to hear it, and Cura's developers may have other priorities – I'm not trying to be judgmental, if that's so – but for me, other slicers do what Cura does not do, and, most importantly, they don't do what Cura does do (namely, take down my entire computer, which is otherwise rock solid).
-
RE: Why don't you use Cura slicer?
I would like to use Cura, but it crashes under linux.
That's been my experience, too. Like you, I'm using Ubuntu 16.04. (With the latest kernel, or at least close to it, as provided by Canonical.) Worse, Cura not only crashes, it hangs the X GUI, and sometimes even hangs the whole computer! Those are impressive feats for any Linux program, and not in a good way. Because the computer I generally use for slicing is my main workstation, with dozens of programs open at any time and, typically, an uptime measured in weeks if not months (it's currently 25 days), I'm very reluctant to experiment with new versions of Cura in the hopes that this problem has been fixed, or to explore Cura more fully to determine if it might have features I'd find useful. (The latest version I've tried is 3.0.4; I see that 3.2.0 is now available.)
Beyond that, Cura is slow as a sloth. It takes longer to start up (I just timed it – 55 seconds; Slic3r and ideaMaker both take 2 seconds) than any other slicer I've tried. I don't recall the details, but it's pretty sluggish to actually slice models once it's running, too. I have other, more minor, qualms with it, but they're nothing compared to crashing my computer, which is inexcusable.
I use either Slic3r or ideaMaker for most of my slicing. Between the two, they get the job done. Slic3r's strengths include a UI that, although a little hard to understand at first, is very intuitive once you get going; and good options for infill and top/bottom layer patterns. Its weaknesses include sluggish performance and a tendency to crash (but without taking anything else with it). IdeaMaker is very fast and reliable, in my experience. It also provides manual support structures, which is a must-have feature for some prints. On the down side, ideaMaker doesn't explicitly support delta configurations (there are workarounds, but the slicer thinks the delta's bed is square), and I had to tweak my startup g-code a bit more than usual to get it to work with my Duet-based delta printer. Until recently, infill options were limited, but the latest version (3.1.0) adds hexagonal and triangular infill.
Slic3r is fully open source and is available in Ubuntu's repositories, which is a plus for an Ubuntu user. IdeaMaker is not open source, but it is available as a native Linux application in a Debian package, so it installs pretty easily. Cura is the worst of these from a Linux package management point of view (although it is open source, and so could easily qualify for packaging); AFAIK, it comes only as a platform-independent .AppImage file. This distribution format has its advantages for a software developer, but when an OS provides high-level package management tools (as almost all Linux distributions do), not using them is a drawback for the user.
-
RE: My fans have stopped working
They were powered by VIN, AFAIK. (I wasn't aware there was any other way to do it, and they're all 12V fans.)
-
RE: My fans have stopped working
The page you reference makes it sound like each fan has its own MOSFET? Would all the MOSFETs burn out at once? Because none of them work. (I even tried the two fan circuits I'm not using.)
-
My fans have stopped working
Long story short, in doing some maintenance on my delta printer that uses a Duet WiFi, I accidentally shorted the wires going to my hotend's always-on fan. I caught a whiff of solder smell, and when I put everything back together correctly, I discovered that all of my fans had stopped working. Everything else is working fine – the hotend, the heated bed, the motors (although I've not tested the extruder motor), the Panel Due, but none of the fans. I measure approximately 0.1v when these fans are activated, rather than the 12v they should be receiving. Although I can wire the Duet board's cooling fan and the always-on hotend fan directly to my power supply's 12v output, the part-cooling fan poses more of a conundrum. So: Is there something I can do to get this working again, short of buying a new Duet? Please keep in mind that I'm far from an expert at electronics hardware -- I can use a multimeter and solder wires together, but extensive troubleshooting or wiring together a circuit with more than a handful of components will be beyond me. Thanks for any advice!