Bit of a side track, but with all this real-time processing, when does something actually qualify as being 'too slow'?
For example, what's an acceptable time for the 48x48 baseplate (4186.dat, 553,000 vertices)? My current version takes 45 seconds to do the unique point tests and angle-based smoothing.
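For reference, this is roughly what I mean by the unique point test. Done with pairwise compares it's O(n^2), which is what hurts on big parts; bucketing vertices in a hash map keyed on quantized coordinates makes it roughly linear. A minimal C++ sketch of the idea (illustrative names and a made-up tolerance, not my actual code):

```cpp
// Hash-based unique point test: quantize each coordinate to a grid cell
// and bucket vertices by cell, so duplicates are found in roughly O(n)
// instead of O(n^2) pairwise compares. Illustrative sketch only.
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <unordered_map>
#include <vector>

struct Vec3 { float x, y, z; };

// Positions within ~1e-4 of a unit land in the same cell (assumed tolerance).
static int64_t quant(float v) { return (int64_t)std::llround(v * 10000.0f); }

struct CellKey {
    int64_t x, y, z;
    bool operator==(const CellKey& o) const { return x == o.x && y == o.y && z == o.z; }
};

struct CellHash {
    size_t operator()(const CellKey& k) const {
        return ((size_t)k.x * 73856093u) ^ ((size_t)k.y * 19349663u) ^ ((size_t)k.z * 83492791u);
    }
};

// Map every vertex to the index of its first occurrence.
std::vector<uint32_t> uniquePointIndices(const std::vector<Vec3>& verts) {
    std::unordered_map<CellKey, uint32_t, CellHash> firstSeen;
    firstSeen.reserve(verts.size());
    std::vector<uint32_t> remap(verts.size());
    for (uint32_t i = 0; i < verts.size(); ++i) {
        CellKey key{quant(verts[i].x), quant(verts[i].y), quant(verts[i].z)};
        // try_emplace keeps the first index seen for this cell.
        auto it = firstSeen.try_emplace(key, i).first;
        remap[i] = it->second;  // itself, or the earlier duplicate
    }
    return remap;
}

int main() {
    std::vector<Vec3> verts = {{0,0,0}, {1,0,0}, {0.00001f,0,0}};  // last duplicates the first
    auto remap = uniquePointIndices(verts);
    for (size_t i = 0; i < remap.size(); ++i)
        std::printf("vertex %zu -> unique %u\n", i, remap[i]);
}
```

One caveat with plain quantization: two points straddling a cell boundary can land in different buckets, so a production version would probably also probe the neighbouring cells.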
This is just about acceptable (to me), but I still need to include type 2 lines during the smoothing tests and optionally do the whole T-junction removal beforehand (although I'm probably putting a limit on part size for that, e.g. <25,000 vertices).
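For the angle-based part, the idea is to only average a vertex normal across faces whose normals are within some threshold angle, so hard edges stay hard. A rough sketch (again illustrative, assuming the vertices were already made unique; the 45-degree threshold is just an example):

```cpp
// Angle-threshold normal smoothing: a corner only blends in neighbour
// face normals within maxAngleDeg of its own face normal, so sharp
// creases keep distinct normals. Illustrative sketch only.
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3 sub(const Vec3& a, const Vec3& b) { return {a.x-b.x, a.y-b.y, a.z-b.z}; }
static Vec3 add(const Vec3& a, const Vec3& b) { return {a.x+b.x, a.y+b.y, a.z+b.z}; }
static Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x};
}
static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3 normalize(const Vec3& v) {
    float len = std::sqrt(dot(v, v));
    return len > 0 ? Vec3{v.x/len, v.y/len, v.z/len} : v;
}

struct Tri { uint32_t v[3]; };

// Returns one normal per triangle corner (tris.size() * 3 entries).
std::vector<Vec3> smoothNormals(const std::vector<Vec3>& pos,
                                const std::vector<Tri>& tris,
                                float maxAngleDeg)
{
    const float cosLimit = std::cos(maxAngleDeg * 3.14159265f / 180.0f);

    // Per-face normals.
    std::vector<Vec3> faceN(tris.size());
    for (size_t f = 0; f < tris.size(); ++f)
        faceN[f] = normalize(cross(sub(pos[tris[f].v[1]], pos[tris[f].v[0]]),
                                   sub(pos[tris[f].v[2]], pos[tris[f].v[0]])));

    // Faces incident to each (already deduplicated) vertex index.
    std::vector<std::vector<uint32_t>> incident(pos.size());
    for (uint32_t f = 0; f < tris.size(); ++f)
        for (int k = 0; k < 3; ++k)
            incident[tris[f].v[k]].push_back(f);

    // Blend each corner with neighbour faces inside the angle limit.
    std::vector<Vec3> cornerN(tris.size() * 3);
    for (uint32_t f = 0; f < tris.size(); ++f)
        for (int k = 0; k < 3; ++k) {
            Vec3 sum{0, 0, 0};
            for (uint32_t g : incident[tris[f].v[k]])
                if (dot(faceN[f], faceN[g]) >= cosLimit)
                    sum = add(sum, faceN[g]);
            cornerN[f * 3 + k] = normalize(sum);
        }
    return cornerN;
}

int main() {
    // Two triangles meeting at a 90-degree fold: with a 45-degree limit
    // the shared edge stays sharp (corners keep their own face normal).
    std::vector<Vec3> pos = {{0,0,0}, {1,0,0}, {1,1,0}, {0,0,1}};
    std::vector<Tri> tris = {{{0,1,2}}, {{0,3,1}}};
    auto n = smoothNormals(pos, tris, 45.0f);
    std::printf("corner 0 normal: %.2f %.2f %.2f\n", n[0].x, n[0].y, n[0].z);
}
```

The per-vertex face lists are also where the type 2 lines would plug in: a shared edge that carries one could simply be forced sharp regardless of the angle.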
In short, when will things be fast enough for general use, taking into account that these waiting times only occur once per program run (or even once per installation, if you consider disk caching the results)?