RE: Need help for faster processing
2023-11-07, 8:34 (This post was last modified: 2023-11-07, 8:35 by Basil Bader.)
Code:
float maxX = subMeshes[model].MaxBy(x => x.X).X;
float maxY = subMeshes[model].MaxBy(x => x.Y).Y;
float maxZ = subMeshes[model].MaxBy(x => x.Z).Z;
float minX = subMeshes[model].MinBy(x => x.X).X;
float minY = subMeshes[model].MinBy(x => x.Y).Y;
float minZ = subMeshes[model].MinBy(x => x.Z).Z;
Each MinBy/MaxBy call iterates over the whole list, so this makes six passes over the vertices. A single loop computes all six values in one pass:
Code:
float minX = float.PositiveInfinity, minY = float.PositiveInfinity, minZ = float.PositiveInfinity;
float maxX = float.NegativeInfinity, maxY = float.NegativeInfinity, maxZ = float.NegativeInfinity;
foreach (var vec in subMeshes[model])
{
    minX = Math.Min(minX, vec.X); maxX = Math.Max(maxX, vec.X);
    minY = Math.Min(minY, vec.Y); maxY = Math.Max(maxY, vec.Y);
    minZ = Math.Min(minZ, vec.Z); maxZ = Math.Max(maxZ, vec.Z);
}
The distance check in the merge test can also be made cheaper. Instead of:
Code:
Vector3.Distance(sharpCell[i], cell[j]) < MERGE_RADIUS
compare squared distances, which skips the square root:
Code:
const float MERGE_RADIUS_SQUARED = MERGE_RADIUS * MERGE_RADIUS;
Vector3.DistanceSquared(sharpCell[i], cell[j]) < MERGE_RADIUS_SQUARED
If an approximate solution is enough, you can certainly try a map like Travis Cobbs said. His implementation basically rounds the coordinates to 0.01 (I don't know the unit, but it's probably LDU) and then checks whether the rounded values are equal. This will group 0.009 and 0.011 together, but not 0.014 and 0.016, even though the distance between the values is the same.
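A minimal sketch of that idea (untested; Key and buckets are my own names, the 0.01 grid size is the one described above, and subMeshes[model] is your vertex list from the snippets before):
Code:
using System;
using System.Collections.Generic;
using System.Numerics;

// Round each coordinate to a 0.01 grid and use the rounded triple as a
// dictionary key; points that land in the same bucket count as "equal".
static (int, int, int) Key(Vector3 v)
{
    const float cell = 0.01f; // grid size from the description above
    return ((int)MathF.Round(v.X / cell),
            (int)MathF.Round(v.Y / cell),
            (int)MathF.Round(v.Z / cell));
}

var buckets = new Dictionary<(int, int, int), List<Vector3>>();
foreach (var v in subMeshes[model])
{
    if (!buckets.TryGetValue(Key(v), out var bucket))
        buckets[Key(v)] = bucket = new List<Vector3>();
    bucket.Add(v);
}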
To get a precise (and still reasonably fast) algorithm, you will probably need to look into space-partitioning structures like a k-d tree.
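For illustration, a minimal k-d tree sketch (untested; KdTree, Node etc. are my own names, not a library, and it assumes System.Numerics.Vector3 like the snippets above). Build it once over the candidate points, then ask for everything within MERGE_RADIUS of each sharp vertex:
Code:
using System;
using System.Collections.Generic;
using System.Numerics;

class KdTree
{
    private class Node { public Vector3 P; public Node Left, Right; }
    private readonly Node root;

    public KdTree(List<Vector3> points) { root = Build(points, 0, points.Count, 0); }

    private static Node Build(List<Vector3> pts, int lo, int hi, int axis)
    {
        if (lo >= hi) return null;
        // Sort the subrange by the current axis and split at the median.
        // Note: this reorders the input list.
        pts.Sort(lo, hi - lo, Comparer<Vector3>.Create(
            (a, b) => Axis(a, axis).CompareTo(Axis(b, axis))));
        int mid = (lo + hi) / 2;
        return new Node {
            P = pts[mid],
            Left = Build(pts, lo, mid, (axis + 1) % 3),
            Right = Build(pts, mid + 1, hi, (axis + 1) % 3)
        };
    }

    private static float Axis(Vector3 v, int axis) => axis == 0 ? v.X : axis == 1 ? v.Y : v.Z;

    public void WithinRadius(Vector3 q, float r, List<Vector3> result) { Search(root, q, r, 0, result); }

    private static void Search(Node n, Vector3 q, float r, int axis, List<Vector3> result)
    {
        if (n == null) return;
        if (Vector3.DistanceSquared(n.P, q) <= r * r) result.Add(n.P);
        float d = Axis(q, axis) - Axis(n.P, axis); // signed offset from the splitting plane
        // Always descend into the near side; visit the far side only when the
        // search sphere crosses the splitting plane.
        Search(d <= 0 ? n.Left : n.Right, q, r, (axis + 1) % 3, result);
        if (Math.Abs(d) <= r) Search(d <= 0 ? n.Right : n.Left, q, r, (axis + 1) % 3, result);
    }
}

// Usage (cell and sharpCell as in the snippets above):
//   var tree = new KdTree(cell);
//   var near = new List<Vector3>();
//   tree.WithinRadius(sharpCell[i], MERGE_RADIUS, near);
The naive median-by-sort makes building O(n log² n), which is usually fine for a one-time build; each radius query then only touches nodes near the query point instead of the whole list.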
It might also be faster to use a simple algorithm with bad time complexity for small coordinate lists and only switch to the complex algorithm for large lists, where time complexity actually makes a difference; a hybrid approach might be the solution.
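The switch itself is trivial; something like this (the threshold of 64 is made up, profile to find the real crossover for your data; KdTree is the sketch above, same usings):
Code:
static List<Vector3> FindNearby(List<Vector3> cell, List<Vector3> sharpCell, float radius)
{
    const int BruteForceThreshold = 64; // invented number, measure it
    var hits = new List<Vector3>();
    float r2 = radius * radius;
    if (cell.Count <= BruteForceThreshold)
    {
        // Small list: the O(n*m) scan has no setup cost and is cache friendly.
        foreach (var s in sharpCell)
            foreach (var c in cell)
                if (Vector3.DistanceSquared(s, c) <= r2)
                    hits.Add(c);
    }
    else
    {
        // Large list: the one-time tree build pays for itself across all queries.
        var tree = new KdTree(cell);
        foreach (var s in sharpCell)
            tree.WithinRadius(s, radius, hits);
    }
    return hits;
}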
Also: use a profiler, if you don't already.