@roivai The code you have written is quite advanced. It was much more difficult to write than doing it a simpler way. It is best owned by the person who wrote it. If you want, we can see what we can do with it together.
I see two mistakes which have made it difficult.
The first mistake is using the PathGeom.wireForPath() function, which creates the list of Part.Edge objects. These objects don't help you. They are missing the israpid property -- causing quite a difficult work-around for you -- while not adding anything very useful.
Sure, you use the edge.Length, edge.splitEdgeAt() and edge.valueAt() functions, but there are only two simple types of geometry (line and arc) and it is easy to implement these functions directly. Using the Part library is not worth it.
To keep it really simple, you could take the list of Path.Command objects as the input and instead use objects that are basically the pair [Vector, Path.Command], where the Vector is the previous point. This takes away the dependency on the gigantic solid modelling kernel library, and means we can streamline these calculations later on.
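To make that concrete, here is a minimal sketch of the line case (the class name and the namedtuple stand-in for Path.Command are mine, not existing code; an arc segment would be the same shape with centre-and-angle arithmetic):

```python
from math import hypot
from collections import namedtuple

# Stand-in for Path.Command so this sketch runs outside FreeCAD; the real
# object carries a Name ("G0", "G1", ...) and a Parameters dict the same way.
Cmd = namedtuple("Cmd", ["Name", "Parameters"])

class LineSegment:
    """The [previous point, command] pair for a straight G0/G1 move."""
    def __init__(self, prevpt, cmd):
        self.prevpt = prevpt
        self.cmd = cmd
        self.endpt = (cmd.Parameters["X"], cmd.Parameters["Y"])
        self.israpid = (cmd.Name == "G0")   # the property Part.Edge loses
        self.length = hypot(self.endpt[0] - prevpt[0], self.endpt[1] - prevpt[1])

    def valueAt(self, d):
        """Point at distance d along the segment, like edge.valueAt()."""
        lam = d/self.length
        return (self.prevpt[0]*(1-lam) + self.endpt[0]*lam,
                self.prevpt[1]*(1-lam) + self.endpt[1]*lam)

    def splitAt(self, d):
        """Two segments meeting at distance d, like edge.splitEdgeAt()."""
        mid = self.valueAt(d)
        midcmd = Cmd(self.cmd.Name, {"X": mid[0], "Y": mid[1]})
        return LineSegment(self.prevpt, midcmd), LineSegment(mid, self.cmd)
```

That's the whole of what the Part library was providing, in about twenty lines and with israpid kept.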
The second mistake is doing everything at the same time in a single outer loop.
For example, at the start of generateRamps() you have "for edge in edges", and then deeper inside it a while-loop that looks forwards for complete closed contours while creating ramps as it goes along.
Don't nest the code like this. There is plenty of time and memory space. It is much better to do the processing in multiple passes in separate functions.
In my code, the function ConvertToContourHelix() begins:
Code:
cmds = list(pp.Commands) # extract the Path.Command objects
zlevelis = ContourLayerIndexes(cmds) # list of indexes at start of each layer
zlevelzs = [ cmds[zlevelis[i]].Parameters["Z"] for i in range(len(zlevelis)-1) ]
layerstartpt = ContourLayerStartpoint(cmds[:zlevelis[1]]) # start point of 0th layer (should check it is the same for each other layer)
layer0 = cmds[zlevelis[0]:zlevelis[1]]
cumlenlayer0 = cmdcumlen(layer0) # cumulative length of each segment of 0th layer
Then for each layer i, the helix was applied like this:
Code:
zi, zn = zlevelzs[i], zlevelzs[i+1]
ll = zlevelis[i+1] - zlevelis[i]
assert zlevelis[i+1] - zlevelis[i] == len(cumlenlayer0)
for j in range(len(cumlenlayer0)):
    lam = cumlenlayer0[j]/cumlenlayer0[-1]
    cmds[zlevelis[i]+j].__setattr__("Z", zi*(1-lam) + zn*lam)
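For reference, here is a minimal guess at the two helpers those blocks rely on (the bodies are my reconstruction, not the actual code, and a namedtuple stands in for Path.Command):

```python
from math import hypot
from collections import namedtuple

Cmd = namedtuple("Cmd", ["Name", "Parameters"])  # stand-in for Path.Command

def ContourLayerIndexes(cmds):
    """Indexes into cmds where each Z level (layer) begins, with a final
    sentinel so cmds[zlevelis[i]:zlevelis[i+1]] selects layer i."""
    zlevelis, z = [ ], None
    for i, cmd in enumerate(cmds):
        cz = cmd.Parameters.get("Z")
        if cz is not None and cz != z:
            zlevelis.append(i)
            z = cz
    zlevelis.append(len(cmds))
    return zlevelis

def cmdcumlen(layer):
    """Cumulative XY length at each command of one layer; the first
    entry is 0.0 and the last is the whole contour's perimeter."""
    res = [0.0]
    for prev, cmd in zip(layer, layer[1:]):
        res.append(res[-1] + hypot(cmd.Parameters["X"] - prev.Parameters["X"],
                                   cmd.Parameters["Y"] - prev.Parameters["Y"]))
    return res
```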
Hopefully, the first block of code above would be common to every other 2D contour path dressup, because they all need to do the same topological identification before they apply the dressup function on each layer.
If I was doing ramps, rather than helixes, I would add another pass (function) to choose the split points where the ramp enters and exits.
It would then not be hard to add the further feature to allow the ramp to advance around the contour with each layer. If each of these features is completely separated out, not mixed in with a complicated nest of a nest of a loop, then they don't add to the complexity.
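As a sketch of what such a separate pass could look like (the function name, signature and modulo wrap are my invention, not existing code), it would be a pure function from perimeter and ramp length to split distances, and the advancing feature falls out of one extra parameter:

```python
def RampSplitLengths(totallen, ramplen, nlayers, advance=0.0):
    """One (entry, exit) distance pair along the contour per layer.
    advance shifts the ramp start around the contour on each layer
    (the ramp-advancing feature); distances wrap modulo the perimeter.
    A later pass would split the commands at these distances."""
    res = [ ]
    for i in range(nlayers):
        entry = (i*advance) % totallen
        res.append((entry, (entry + ramplen) % totallen))
    return res
```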
When you do everything in one loop, each feature doubles the complexity. If you do it in separate passes, then you can add features without adding complexity.
My code above would be even simpler if ContourLayerIndexes() returned complete copies of lists of Path.Command objects, rather than a list of indexes into the starting list, because then each pass or step would be more isolated from anything it doesn't need to know. We should do it that way if we rewrite it.
So, the things we might get from doing something along these lines are:
(1) fix a helixing bug that is hardly visible,
(2) potentially reimplement/combine some of the other dressups (eg tag dressup),
(3) add new features, like ramp advancing, global reordering,
(4) make the path helix object redundant.
How would we decide how to do this? It mostly comes down to hacking a particular file in the freecad-build directory. I haven't yet understood the workflow where code is edited in the freecad-code repository and the makefile has to be executed for every iteration.
I've done most of my experimentation in Jupyter Notebooks with code like this: https://github.com/goatchurchprime/tran ... nest.ipynb