This is a follow-on to #15 and also a brain dump of things I thought about while working on #21. I chose to make that a checkpoint with reasonably good results, rather than proceeding to a highly optimized solution.
First, it would be possible to solve this problem using numerical optimization techniques, much as in chapter 9 of my thesis. That would be suitable when generating fonts, but is not at all a workable approach for interactive use - it would take way too long to produce the optimized representation. Still, implementing that is probably a good stepping stone, partly because it is useful in batch contexts, and partly to provide a reference for more heuristic techniques.
One of the lessons of that chapter is that making the arclength of the rendered cubic segment match the arclength of the original curve is important. That leaves a single free parameter: the ratio of the two control arms. My feeling is that if you keep this ratio the same as it is now (i.e. the `dt` calculation derived from mapping `u` to `t`) but scale the control arms so the arclength matches better, it will be a big improvement.
Obviously this could be solved by actually measuring the arclength and iterating a solver, but that performance cost is probably unappealing for interactive use. I propose the following heuristic, though it would need to be validated. Assuming a flat parametrization (`t = u`), the length of a control arm, as a fraction of the chord, is 2/(3 + 3 cos θ), where θ is the deflection from the chord to the control point. Note that this formula gives a very good approximation to circular arcs when the deflection is symmetrical (it is equivalent to the standard formula, for example in the circles-and-cubics section of the Bézier primer). I suspect it will also work pretty well in cases other than the symmetrical one, but that would have to be validated. For other parametrizations, it should probably scale by dt/du, but that's also something that would need to be validated.
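As a concrete illustration, here is a minimal sketch of that heuristic, assuming the deflection angles at the two endpoints are measured from the chord (the names `Pt` and `cubic_from_deflections` are made up for this sketch, not existing code):

```rust
// Sketch of the control-arm heuristic: each arm is
// chord * 2 / (3 * (1 + cos theta)), with theta the tangent deflection
// from the chord at that endpoint.
type Pt = (f64, f64);

/// Build cubic control points from a chord and tangent deflection angles
/// (counterclockwise positive; th0 == th1 for a symmetric, circular-like arc).
fn cubic_from_deflections(p0: Pt, p3: Pt, th0: f64, th1: f64) -> [Pt; 4] {
    let (dx, dy) = (p3.0 - p0.0, p3.1 - p0.1);
    let chord = dx.hypot(dy);
    let chord_angle = dy.atan2(dx);
    // Heuristic arm lengths; for th0 == th1 this reduces to the standard
    // cubic approximation of a circular arc.
    let arm0 = chord * 2.0 / (3.0 * (1.0 + th0.cos()));
    let arm1 = chord * 2.0 / (3.0 * (1.0 + th1.cos()));
    // Tangent directions at the two endpoints.
    let a0 = chord_angle + th0;
    let a1 = chord_angle - th1;
    let p1 = (p0.0 + arm0 * a0.cos(), p0.1 + arm0 * a0.sin());
    let p2 = (p3.0 - arm1 * a1.cos(), p3.1 - arm1 * a1.sin());
    [p0, p1, p2, p3]
}
```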
The parametrization (`calc_t()`) also hasn't been carefully tuned. It's flat for low tensions (bias < 1), but it should probably have a higher derivative near the endpoints, especially for bias near 0. Note that this should improve the rendering of low-tension segments, as the arclength is longer and thus they need longer control arms.
Lastly, `render_subdivisions` should be carefully tuned to produce the minimum number of subdivisions that meets some error criterion (and the error criterion should probably be tunable as well). I think special care is needed for low bias values - a single subdivision is reasonable when bias = 1 (for low deflections), and possibly for higher bias, but cubic Béziers don't do a good job of modeling cases where the curvature is zero at one endpoint. The only reasonable way to tune this is to have a test framework that reports the end-to-end error, and then add terms to make it fit better. I suspect there may be a `k0 - k1` term, as well as obviously some effect of bias.
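For the error-reporting side, here is a rough sketch of the kind of measurement such a framework could report: densely sample both the reference curve and its cubic rendering, and take the worst-case distance from the reference samples to the rendering. The `Curve` trait and the sample counts are placeholders, not part of any existing code.

```rust
// Crude one-sided Hausdorff-style error between a reference curve and its
// cubic approximation, by dense sampling of both.
trait Curve {
    /// Evaluate the curve at parameter u in [0, 1], returning (x, y).
    fn eval(&self, u: f64) -> (f64, f64);
}

/// Max distance from samples of `reference` to the nearest sample of `approx`.
fn approx_error(reference: &dyn Curve, approx: &dyn Curve) -> f64 {
    const N_REF: usize = 200;
    const N_APPROX: usize = 400;
    let approx_pts: Vec<(f64, f64)> = (0..=N_APPROX)
        .map(|i| approx.eval(i as f64 / N_APPROX as f64))
        .collect();
    let mut max_err: f64 = 0.0;
    for i in 0..=N_REF {
        let (x, y) = reference.eval(i as f64 / N_REF as f64);
        // Distance to the closest sampled point of the approximation.
        let d = approx_pts
            .iter()
            .map(|&(ax, ay)| (ax - x).hypot(ay - y))
            .fold(f64::INFINITY, f64::min);
        max_err = max_err.max(d);
    }
    max_err
}
```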
There's another idea here which is worth exploring: go with the arclength matching approach, but use a pretty good arclength approximation. In particular, the Gravesen one (discussed and linked from How long is that Bézier?) is super easy to compute, and Newton solving is practical because the analytical derivative is easy (though secant may work just as well in practice).
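As a sketch of that idea (the names here are illustrative, not existing API): Gravesen's estimate for a cubic is (2 * chord + control polygon) / 3, and a few secant iterations on a common scale factor for the control arms can drive that estimate to a target arclength while preserving the arm ratio.

```rust
// Sketch of arm scaling against the Gravesen arclength estimate.
type Pt = (f64, f64);

fn dist(a: Pt, b: Pt) -> f64 {
    (a.0 - b.0).hypot(a.1 - b.1)
}

/// Gravesen's cheap arclength estimate: (2 * chord + control polygon) / 3.
fn gravesen_arclen(p0: Pt, p1: Pt, p2: Pt, p3: Pt) -> f64 {
    let chord = dist(p0, p3);
    let poly = dist(p0, p1) + dist(p1, p2) + dist(p2, p3);
    (2.0 * chord + poly) / 3.0
}

/// Scale both control arms by a common factor so the Gravesen estimate of the
/// cubic's arclength matches `target`. `t0` and `t1` are unit tangents at the
/// endpoints (in the direction of travel); `arm0` and `arm1` are the baseline
/// arm lengths whose ratio is preserved.
fn fit_arms_to_arclen(
    p0: Pt,
    p3: Pt,
    t0: Pt,
    t1: Pt,
    arm0: f64,
    arm1: f64,
    target: f64,
) -> (Pt, Pt) {
    let ctrl_pts = |s: f64| {
        let p1 = (p0.0 + s * arm0 * t0.0, p0.1 + s * arm0 * t0.1);
        let p2 = (p3.0 - s * arm1 * t1.0, p3.1 - s * arm1 * t1.1);
        (p1, p2)
    };
    let err = |s: f64| {
        let (p1, p2) = ctrl_pts(s);
        gravesen_arclen(p0, p1, p2, p3) - target
    };
    // Secant iteration on the scale factor.
    let (mut s0, mut s1) = (1.0, 1.1);
    let (mut e0, mut e1) = (err(s0), err(s1));
    for _ in 0..8 {
        if (e1 - e0).abs() < 1e-12 {
            break;
        }
        let s2 = s1 - e1 * (s1 - s0) / (e1 - e0);
        s0 = s1;
        e0 = e1;
        s1 = s2;
        e1 = err(s1);
    }
    ctrl_pts(s1)
}
```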
Experimenting with the cosine-ratio approach (by hacking the euler explorer) suggests it produces overly long arclengths. I also experimented with the Gravesen approximation, and that produces overly short arclengths (i.e. the approximation overshoots, so the solver compensates), especially in antisymmetric (S-shaped) cases. It may be that the winning strategy is just to use `gauss_arclen` to get a pretty good, fast approximation to the arclength.
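For reference, here is a self-contained sketch of that kind of Gaussian-quadrature arclength estimate (a 3-point Gauss-Legendre rule for brevity; this is not the actual `gauss_arclen`, which may differ in order and interface):

```rust
// Arclength of a cubic Bezier by Gauss-Legendre quadrature of |B'(t)|.
type Pt = (f64, f64);

/// Derivative of the cubic Bezier at parameter t.
fn cubic_deriv(p0: Pt, p1: Pt, p2: Pt, p3: Pt, t: f64) -> Pt {
    let mt = 1.0 - t;
    let c0 = 3.0 * mt * mt;
    let c1 = 6.0 * mt * t;
    let c2 = 3.0 * t * t;
    (
        c0 * (p1.0 - p0.0) + c1 * (p2.0 - p1.0) + c2 * (p3.0 - p2.0),
        c0 * (p1.1 - p0.1) + c1 * (p2.1 - p1.1) + c2 * (p3.1 - p2.1),
    )
}

/// Arclength estimate using a 3-point Gauss-Legendre rule on [0, 1].
fn gauss_arclen_3(p0: Pt, p1: Pt, p2: Pt, p3: Pt) -> f64 {
    // Nodes and weights for the interval [-1, 1].
    let nodes_weights = [
        (-(3.0f64 / 5.0).sqrt(), 5.0 / 9.0),
        (0.0, 8.0 / 9.0),
        ((3.0f64 / 5.0).sqrt(), 5.0 / 9.0),
    ];
    nodes_weights
        .iter()
        .map(|&(x, w)| {
            // Map from [-1, 1] to [0, 1]; the Jacobian contributes the 0.5.
            let t = 0.5 * (x + 1.0);
            let (dx, dy) = cubic_deriv(p0, p1, p2, p3, t);
            0.5 * w * dx.hypot(dy)
        })
        .sum()
}
```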