<teepee>
guso78: so I'd say we get this into the branch now and see if there's anything remaining to do
<teepee>
do you mind if I squash all the commits into a single one? so reducing the temporary code states
teepee_ has joined #openscad
teepee has quit [Ping timeout: 255 seconds]
teepee_ is now known as teepee
use-value has quit [Remote host closed the connection]
use-value has joined #openscad
aiyion has quit [Remote host closed the connection]
aiyion has joined #openscad
teepee has quit [Quit: bye...]
teepee has joined #openscad
J23252730 has joined #openscad
J232527 has quit [Ping timeout: 260 seconds]
kintel has joined #openscad
epony has joined #openscad
Guest57 has joined #openscad
Guest57 has quit [Client Quit]
use-value has quit [Remote host closed the connection]
use-value has joined #openscad
qeed has quit [Quit: qeed]
Lagopus has joined #openscad
teepee_ has joined #openscad
teepee has quit [Ping timeout: 255 seconds]
teepee_ is now known as teepee
<kintel>
peepsalot One thing I've been thinking, especially when reading cleanup code like yours: As the internal API is getting cleaner it could be quite nice to write some more conventional unit tests for those, as that could help writing better code without having to rely on regression tests happening to hit all code paths by firing a shotgun at OpenSCAD in general.
<kintel>
Obviously quite a bit of work, but sometimes writing tests is a good, meditative experience, as long as you don't have to fight the framework :)
<peepsalot>
kintel: i don't have much experience making unit tests. should we add a testing framework such as google-test or catch2? or does ctest provide enough?
<kintel>
I think ctest is really just a test runner. googletest or similar would probably be needed, but key is to choose something and stick with it, and write good utilities to allow tests to stay nice and small.
<kintel>
When I first wrote the OpenSCAD test framework I chose black-box regression tests because the internal API was messy and unstable. But things are a lot better now
<peepsalot>
i was also thinking of making a new src/core/value/ path for the sources
<kintel>
I was thinking the same :)
<kintel>
..but src/code/nodes may be a better start
<kintel>
*core
<kintel>
..and ast/
<peepsalot>
hmm, yeah. i did a big file restructuring not too long ago and ended up with a lot of git issues, so i'm a little more trigger shy now
<InPhase>
If all goes well, that will go green on all platforms this time.
<InPhase>
kintel, peepsalot: I'm generally an advocate of hierarchical testing rather than comprehensive unit test coverage. The basic principle is you identify your lower level components in the abstraction hierarchy, and use those. But then rather than unit test on up the levels of abstraction, you do integration tests progressively targeting the higher levels of abstraction, and rely on the validated
<InPhase>
behavior of the lower level components. I think this is less work AND actually captures correctness a lot better than trying to make a million little unit tests.
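The hierarchical idea above can be sketched like this (a hypothetical Python illustration of the principle only; the components under discussion are OpenSCAD's C++ internals, and `clamp`/`normalize_channel` are invented names):

```python
# Hypothetical example of hierarchical testing, not OpenSCAD code.

def clamp(x, lo, hi):
    """Low-level component: restrict x to [lo, hi]."""
    return max(lo, min(hi, x))

def normalize_channel(value):
    """Higher-level component: map an 8-bit channel to [0.0, 1.0],
    relying on clamp() for out-of-range input."""
    return clamp(value, 0, 255) / 255.0

# Unit tests hammer the lowest level, including edge cases...
assert clamp(-1, 0, 255) == 0
assert clamp(300, 0, 255) == 255
assert clamp(42, 0, 255) == 42

# ...while the higher level gets integration tests that rely on the
# already-validated clamp() rather than re-testing its edge cases.
assert normalize_channel(510) == 1.0
assert normalize_channel(51) == 0.2
```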
<kintel>
InPhase yeah, that makes a lot of sense; I've seen too many attempts at "unit testing" some intermediate-level components by actually re-implementing lower-level tests
<InPhase>
s/and use those/and unit test those/
<kintel>
..but testing at the lowest level is nice
<InPhase>
Maybe the value stuff is actually a place where some unit tests could be worked in to lock some core behavior down, and introduce a framework.
<InPhase>
Or maybe we're already hitting all of that really well with the echo tests.
<kintel>
I think we've got pretty good coverage through regression tests, but my thought was that this may become increasingly tricky to maintain, as you need to understand how values make their way all the way up there
<kintel>
This is in no way urgent, just a thought which may make future refactoring or focused core work possible with less mental load
Dave_ has joined #openscad
ActionDave has quit [Ping timeout: 260 seconds]
<InPhase>
Is there a way to get the actual images that the CI tests produced?
<InPhase>
kintel: Or, maybe as you are mac based you could try it... tests/output/opencsgtest/issue3158-actual.png is showing up in the image comparison report as quite different from tests/regression/opencsgtest/issue3158-expected.png but only on macos.
<InPhase>
0.04652057% of 3x3 blocks differ with median block diff: 29.00
<InPhase>
That's about 121 pixels worth of color shift.
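The quoted percentage can be checked by back-of-envelope arithmetic (the 512x512 test image size and the non-overlapping 3x3 tiling are assumptions, not stated in the log):

```python
# Back-of-envelope check of the "about 121 pixels" figure.
# Assumptions: 512x512 images, tiled into non-overlapping 3x3 blocks
# over the largest area divisible by 3.
percent_blocks_differ = 0.04652057

side = 512 // 3 * 3          # 510: largest multiple of 3 that fits
blocks = (side // 3) ** 2    # 170 * 170 = 28900 blocks of 3x3 pixels
pixels = blocks * 9 * percent_blocks_differ / 100

print(round(pixels))         # -> 121, i.e. "about 121 pixels worth"
```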
<InPhase>
But I don't see why it would be platform dependent.
<peepsalot>
no, but if i ever finish up one of my oldest PRs, it would
<peepsalot>
the testing was part of the difficulty, where i needed to make sure that it could tell the difference between two images where only the alpha channel differed
<kintel>
InPhase Gotta go to bed, could you open a separate issue on that macOS issue? Once we have an issue, we can disable the test and move on for now
la1yv_a has joined #openscad
EkpyroticFrood0 has joined #openscad
RichardP_ has joined #openscad
RichardPotthoff has quit [Ping timeout: 256 seconds]
JoelJoel has joined #openscad
castaway_ has joined #openscad
leptonix_ has joined #openscad
InPhase_ has joined #openscad
noonien3 has joined #openscad
<InPhase_>
peepsalot: Well I didn't plan for it to do so, but yes, it should work. I believe if you feed in an alpha channel png file, PIL will simply make that size 3 inner dimension of the numpy array into a size 4. And the rest of the code will process the vectorized data exactly the same, as if alpha were yet another color.
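The shape-agnostic point above can be sketched with plain numpy (a hypothetical helper, not the actual image_compare.py code): vectorized per-pixel logic written for an RGB array of shape (h, w, 3) runs unchanged on an RGBA array of shape (h, w, 4), so an alpha-only difference is detected like any other channel difference.

```python
import numpy as np

def mean_abs_diff(a, b):
    """Mean absolute per-channel difference; shape-agnostic, so alpha
    is processed as if it were a fourth color channel.
    (Hypothetical helper, not the actual image_compare.py code.)"""
    return np.abs(a.astype(int) - b.astype(int)).mean()

rgb_a = np.zeros((4, 4, 3), dtype=np.uint8)
rgb_b = np.full((4, 4, 3), 2, dtype=np.uint8)
print(mean_abs_diff(rgb_a, rgb_b))    # 2.0

# Same color data, but with alpha channels that differ by 6:
rgba_a = np.dstack([rgb_a, np.zeros((4, 4), dtype=np.uint8)])
rgba_b = np.dstack([rgb_b, np.full((4, 4), 6, dtype=np.uint8)])
print(mean_abs_diff(rgba_a, rgba_b))  # 3.0 -- the alpha difference shows up
```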
<InPhase_>
kintel: That is certainly a messed up result in that imgpile link. So clearly something is significantly amiss there.
castaway has quit [*.net *.split]
leptonix has quit [*.net *.split]
la1yv has quit [*.net *.split]
EkpyroticFrood has quit [*.net *.split]
splud has quit [*.net *.split]
Joel has quit [*.net *.split]
e2k has quit [*.net *.split]
noonien has quit [*.net *.split]
buZz has quit [*.net *.split]
InPhase has quit [*.net *.split]
EkpyroticFrood0 is now known as EkpyroticFrood
noonien3 is now known as noonien
buZz has joined #openscad
buZz is now known as Guest1961
<InPhase_>
kintel: I suppose the right thing to blame is probably some sort of opengl lib percolating on certain mac platforms?
<InPhase_>
kintel: Maybe a comparison of some version numbers for that.
e2k has joined #openscad
InPhase_ is now known as InPhase
splud has joined #openscad
hyperair has joined #openscad
epony has quit [Remote host closed the connection]
guso7821 has joined #openscad
<guso7821>
teepee, i am fine with squashing all commits into one. i believe it can serve as an initial "arrival". i am sure that there will be many more tweaks ...
J23252730 has quit [Quit: Ping timeout (120 seconds)]
guso7821 has quit [Quit: Ping timeout (120 seconds)]
guso78 has quit [Quit: Ping timeout (120 seconds)]
J23252730 has joined #openscad
hyperair has quit [Remote host closed the connection]
kintel has quit [Quit: My MacBook has gone to sleep. ZZZzzz…]
ps1231 has joined #openscad
hypera1r has joined #openscad
hypera1r has quit [Remote host closed the connection]
ps1231 has quit [Quit: Client closed]
epony has joined #openscad
leptonix_ is now known as leptonix
Pablo67 has joined #openscad
peeps[zen] has joined #openscad
peepsalot has quit [Ping timeout: 256 seconds]
qeed has joined #openscad
Non-ICE has joined #openscad
Pablo67 has quit [Quit: Client closed]
teepee_ has joined #openscad
teepee_ has quit [Remote host closed the connection]
teepee_ has joined #openscad
teepee has quit [Ping timeout: 255 seconds]
teepee_ is now known as teepee
milza has quit [Quit: milza]
Guest14 has joined #openscad
Guest14 has quit [Quit: Client closed]
guso78 has joined #openscad
Guest77 has joined #openscad
use-value1 has joined #openscad
use-value has quit [Ping timeout: 256 seconds]
use-value1 is now known as use-value
<gbruno>
[github] t-paul pushed 5 additions 40 modifications (OpenSCAD with a Python Engine (#4498) Initial support for integrated python scripting. Status: * Most modules supported * Build time switch * Command line switch to enable python engine * No cross language support yet) https://github.com/openscad/openscad/commit/b585d4ca3327f4919a2715c94eab91bfa93cd91b
<gbruno>
[github] t-paul pushed 5 additions 30 modifications (OpenSCAD with a Python Engine (#4498) Initial support for integrated python scripting. Status: * Most modules supported * Build time switch * Command line switch to enable python engine * No cross language support yet) https://github.com/openscad/openscad/commit/784d7b5bad64eac021327329de1e038caaf799e9
<teepee>
so your default origin is still gsohler/openscad but you can still easily import stuff from the openscad/openscad remote
<guso78>
origin points to my gsohler space, upstream points to openscad space
<guso78>
i do git fetch -a, or a git pull -a, and it tells:
<guso78>
everything is up to date, maybe i am on sync already
<teepee>
does "git branch -a | grep python" show the branch?
<guso78>
[gsohler@fedora build]$ git branch -a
<guso78>
master
<guso78>
* python
<guso78>
remotes/origin/HEAD -> origin/master
<guso78>
remotes/origin/master
<guso78>
remotes/origin/python
<guso78>
[gsohler@fedora build]$ git branch -a
<guso78>
suppose the answer is no
<teepee>
oh, let me check, maybe fetch -a does not get all the remotes
<teepee>
oh, it's --all
<InPhase>
guso78: What does your branch do for finding which Python version to run?
<teepee>
yep, after "git fetch --all" the upstream branches should show up
kintel has joined #openscad
<guso78>
ahh, makes a big diff
<InPhase>
guso78: I recently addressed the issue of cross-platform python for the testing system for all of our targets, and setting up a venv for that. So the thought is fresh on my mind that if we ARE going to have Python file processing, we'd probably want to support having a venv for it.
<InPhase>
guso78: That would I think require a command line flag for passing the target python executable, plus a gui field for setting the target python executable.
<guso78>
venv is virtual env? yes, i agree to it
<guso78>
hmm, so the actual python will be run from an externally installed python executable?
<InPhase>
guso78: Is it currently just running whatever is in the path?
<guso78>
InPhase i think you are not aware of the truth...
<guso78>
if i used a python which is downloaded to windows, it would not work. i am embedding and extending python with openscad functions.
<guso78>
so openscad included a whole python interpreter inside its executable
<InPhase>
Oh, you went that route.
<guso78>
for me this is the only viable one ...
<InPhase>
I suppose that actually makes a venv almost impossible to manage though.
<InPhase>
It could not be managed outside of openscad, because it has to match up with the executable version.
teepee has quit [Ping timeout: 255 seconds]
<guso78>
Is the GeometryEvaluator.cc in a different environment compared to MainWindow.cc?
<guso78>
if i try python there , it just crashes ...
guso78 has quit [Ping timeout: 260 seconds]
castaway_ has quit [Ping timeout: 256 seconds]
<InPhase>
kintel, peeps[zen]: I tagged you both (more realistically probably "either") for review on the image_compare PR. I think it's all set for a merge, but as a substantive change to the testing infrastructure and dependencies, it warrants a second set of eyes.
<kintel>
guso78: I believe there is a separate thread for processing geometry
<InPhase>
It's about 50 lines of cmake, 50 lines of Python, a doc update, and expected test updates.
<kintel>
InPhase Looking at it now :)
<kintel>
InPhase In the meantime, could you disable the broken macOS test and open a github issue with a screenshot from the test result html?
<InPhase>
Sure.
<peeps[zen]>
i'm giving it a test build and run locally right now
<peeps[zen]>
22% tests passed :P
<InPhase>
Check at the top of cmake's run for the result of the venv setup.
<InPhase>
It prints out diagnostics of its success or failure, but does not abort if it failed to setup the venv.
<peeps[zen]>
InPhase: right, i got this error from cmake configure. https://bpa.st/ONNY6
<peeps[zen]>
should it actually error out cmake instead?
<InPhase>
peeps[zen]: See updated doc/testing.txt
<peeps[zen]>
so I was curious if it would fallback or what in that case
<InPhase>
peeps[zen]: Install the prerequisite helper programs on your system: cmake, python3, python3-venv, python3-pip
<InPhase>
ImageMagick was removed from those lines, and the venv and pip were added.
<InPhase>
peeps[zen]: So it should not error out, because this happens at the cmake for building. And you can build just fine without these environments.
<InPhase>
It is a requirement for testing that, because of our particular setup, is being configured before building.
<InPhase>
I did not invent this part of the setup though. :)
<InPhase>
It is basically equivalent to before, where if you didn't have ImageMagick, you'd just get a huge number of test fails.
<peeps[zen]>
InPhase: i think it should at least emit a cmake warning, and maybe point user to the testing doc
<InPhase>
Oh, I could certainly change that final message(STATUS to a message(WARNING
<InPhase>
Perhaps: message(WARNING "Failed to setup venv for ${IMAGE_COMPARE_EXE} See doc/testing.txt for dependency information.")
<peeps[zen]>
hmm, i know the filename has "testing" in it, but maybe we should explicitly say "This is only relevant if you intend to run the test suite."
<InPhase>
"Failed to setup the test suite venv for ..." ?
<peeps[zen]>
yeah that sounds better
<peeps[zen]>
passed all tests after installing python3-venv and python3-pip. Total Test time (real) = 46.08 sec
use-value has quit [Remote host closed the connection]
use-value has joined #openscad
<InPhase>
Excellent.
<peeps[zen]>
one other small quibble in terms of UX: i still see cmake spit out: ModuleNotFoundError: No module named 'PIL' can that be silenced if pip is found? (ie it will be installing it for us)
<InPhase>
Hmm.
<InPhase>
Right, so try to eat stderr on the first run of image_compare.py, but leave stdout.
<InPhase>
(Because stdout will announce success.)
kintel has quit [Quit: My MacBook has gone to sleep. ZZZzzz…]
<InPhase>
I have to manually break the environment to trigger that condition... One moment. :)
<InPhase>
You hit it only because you rebuilt after failing once.
<InPhase>
Yep, ERROR_QUIET handles that correctly.
<peeps[zen]>
nice
<InPhase>
It will still output precisely that error if it still fails at the final validation run.
<peeps[zen]>
great, that makes sense to me
<InPhase>
I pushed those.
<InPhase>
One additional question... What is the proper way to disable a test? Is there a way to disable it without commenting it out, so that it can still be run?
<kintel>
This is in debug mode though, it'll run twice as fast in release mode
<peeps[zen]>
debug is extra slow with mimalloc
<peeps[zen]>
but you can get very detailed memory stats
<InPhase>
Ah. I only do release builds.
<InPhase>
kintel: Particularly the Compare3x3 function requiring documentation?
<InPhase>
s/documentation/comments/
<InPhase>
Oh, maybe I see a few points inside the other one as well.
<InPhase>
peeps[zen]: -j24 (1m21s) is slightly faster for me than -j16 (1m28s), but with diminishing returns.
<kintel>
InPhase yeah, I kind of know what it does, but I couldn't guess from glancing at that function :)
<InPhase>
Yep. I typically write that sort of code for a target audience that would find Compare3x3 obvious, but most of whom aren't sure what a class member is. But you're right, it requires some comments. :)
<peeps[zen]>
i was seeing 20-22s for a release build, before python compare though. so it does take about 2x longer overall for me
<InPhase>
peeps[zen]: It's a little less than 2x for me. So it's definitely a slowdown.
<InPhase>
But, a slowdown on times that aren't too bad in the first place I think.
<peeps[zen]>
but still sufficiently fast that I can't complain too much. since the increased accuracy seems important
<peeps[zen]>
i am kinda curious how it would perform in C++, possibly replacing diffpng's algorithm
<InPhase>
That would approximately revert it to the old times.
<InPhase>
The algorithm itself is super fast. The choice of doing it with python added a script-loading time repeated 1000-some times, which is almost the entire difference.
<InPhase>
I didn't examine diffpng that closely, but I'm pretty sure it's faster than diffpng simply because that algorithm appears to be much more involved in what it is trying to do.
<InPhase>
I chose a much simpler machine-oriented definition of image equivalence.
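A guess at the kind of "simple, machine-oriented" 3x3 block metric described in this exchange: tile both images into non-overlapping 3x3 blocks, flag blocks whose maximum per-channel difference exceeds a threshold, then report the percentage of differing blocks and their median difference (matching the report format quoted earlier). This is a hypothetical sketch, not the actual image_compare.py algorithm; the threshold value and the aggregation choices are assumptions.

```python
import numpy as np

def block_compare(a, b, threshold=8):
    """Hypothetical 3x3 block comparison (NOT the real image_compare.py).
    Returns (% of 3x3 blocks that differ, median diff of those blocks)."""
    h, w = a.shape[0] // 3 * 3, a.shape[1] // 3 * 3
    diff = np.abs(a[:h, :w].astype(int) - b[:h, :w].astype(int))
    # Collapse each 3x3 block (and its channels) to its maximum difference.
    blocks = diff.reshape(h // 3, 3, w // 3, 3, -1).max(axis=(1, 3, 4))
    differing = blocks > threshold
    percent = 100.0 * differing.mean()
    median = float(np.median(blocks[differing])) if differing.any() else 0.0
    return percent, median

a = np.zeros((12, 12, 3), dtype=np.uint8)
b = a.copy()
b[:3, :3] = 29                  # shift one 3x3 block by 29
percent, median = block_compare(a, b)
print(percent, median)          # 6.25 29.0 -- 1 of 16 blocks differs
```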