Patchslim reduces the size of the vertex matrix in patch data, typically after loading a mesh with stlread.
Surface meshes often list duplicate vertices. This function parses the list of points, finds all duplicate entries in the 'v' (vertices) matrix, removes them, and re-indexes the 'f' (faces) matrix accordingly.
This reduces the size of patch matrices, often substantially, although it can take quite a while to run on very large meshes. I use this to shrink surface mesh sizes after loading them in with 'stlread.m', at http://www.mathworks.com/matlabcentral/fileexchange/29906-binary-stl-file-reader .
USAGE: [v, f]=patchslim(v, f)
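A typical call, assuming a mesh loaded with the stlread linked above (the filename and the three-output form of stlread are illustrative):

```matlab
[v, f, n] = stlread('part.stl');   % load a binary STL mesh
[v, f] = patchslim(v, f);          % merge duplicate vertices, re-index faces
patch('Vertices', v, 'Faces', f, 'FaceColor', [0.8 0.8 1.0]);
```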
Francis Esmonde-White (2021). Patch Slim (patchslim.m) (https://www.mathworks.com/matlabcentral/fileexchange/29986-patch-slim-patchslim-m), MATLAB Central File Exchange. Retrieved .
I am trying to figure out how to retrieve the face normals from this function as well, and I couldn't work out what fnew = indexn(f) is actually doing. I know it is trying to remove the faces that reference the removed vertices, but I am not sure how.
f is a nx3 matrix while indexn is a 3nx1 column array. How does indexn(f) work?
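For anyone else puzzled by this: when MATLAB indexes an array with a matrix of subscripts, the result has the same shape as the subscript matrix. So indexn(f) looks up every entry of f in the lookup table indexn and returns an nx3 matrix of remapped face indices. A tiny sketch with made-up data:

```matlab
% indexn maps old vertex indices to new (deduplicated) ones
indexn = [1; 2; 1; 3];    % e.g. old vertices 1 and 3 were duplicates
f      = [1 2 3; 3 4 2];  % faces referencing the old indices
fnew   = indexn(f)        % same shape as f: [1 2 1; 1 3 2]
```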
It works. Very useful.
%remove duplicate vertices
[vnew, indexm, indexn] = unique(v, 'rows');
fnew = indexn(f);
%remove nonsense faces
numfaces = (1:size(fnew,1))';
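For reference, the snippet above can be filled out into a self-contained sketch of the whole reduction (this is my own reconstruction, not the shipped code; the "nonsense" test here assumes a face is degenerate when two of its three vertex indices coincide after merging):

```matlab
function [vnew, fnew] = patchslim_sketch(v, f)
% Merge duplicate vertices and re-index the faces.
[vnew, ~, indexn] = unique(v, 'rows');   % indexn: old index -> new index
fnew = indexn(f);                        % remap every face entry at once
% Drop degenerate faces whose vertices collapsed onto each other.
keep = fnew(:,1) ~= fnew(:,2) & fnew(:,2) ~= fnew(:,3) & fnew(:,1) ~= fnew(:,3);
fnew = fnew(keep, :);
end
```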
I completely agree, it's a huge improvement over the original. Again, thanks for the help!
Well done. I would point out that now the code is far more efficient, far cleaner than the loopy code before. Thanks for cleaning it up.
Thanks again John, it's 30x faster now. I didn't know that unique() could operate on entire rows.
It would be helpful if you could update the consolidator help (perhaps include the same indexing tips as given in unique). I look forward to using consolidator in the future.
Consolidator DOES give sufficient information, but you need to use that information properly. However, if you want to resolve only the exact replicates, then use unique, with the 'rows' option. It too gives you sufficient information. Look at the third output from unique, which, by the way, is the same as that returned from consolidator. (It is a simple direct lookup table, so a simple index operation. Try it.)
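A quick check of the lookup-table behaviour described above, with toy data and my own variable names:

```matlab
v = [0 0 0; 1 0 0; 0 0 0; 0 1 0];      % rows 1 and 3 are duplicates
[vnew, ~, indexn] = unique(v, 'rows'); % vnew keeps 3 unique rows
isequal(v, vnew(indexn, :))            % true: indexn reconstructs v
```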
I looked into using consolidator. From what I can tell, the syntax should be something like:
[vnew,ycon,ind] = consolidator(v,[],[],0);
Using this, vnew(ind,:) == v
However, Consolidator doesn't return the indices required to re-index f (the faces matrix) without losing the face-vertex pairing. Am I missing something about Consolidator?
Hence, it only does half of what we need here. I need to find the unique vertices (which consolidator does *really* quickly), but also maintain the list mapping the initial vertex indices to the new vertex indices (which consolidator does not appear to do).
Besides that, it seems to unnecessarily introduce a dependency.
I have altered the H1 line so that it conforms to your preferences.
Thanks for the helpful feedback John.
The arrays I work with are generated by a variety of 3D modelling tools, mostly Mimics and other Materialise software. The STL files written by these tools list every vertex of every triangle in the mesh. Because every vertex is shared by several triangles (assuming a closed mesh), each unique node appears several times; in my test data, by a factor of about 6. Your mileage may vary.
My example using a small mesh: v matrix reduced from size [17592 3] to size [2934 3] for a constant f matrix of size [5864 3].
I will update the header code and look at Consolidator. One very important consideration in this code is that only identical nodes should be merged. Otherwise it puts an arbitrary constraint on the mesh resolution and will damage otherwise valid surface meshes.
Rather than the slow approach of testing every patch against the rest to see if there exist exact replicates, use a tool like consolidator. Consolidator will find the exact replicates in multiple dimensions, but also allows a tolerance to try to find near replicates if you so desire. The third output argument of consolidator gives you all the information needed to do the necessary reduction, with no need for massive loops.
Will this truly reduce the size of your arrays by a factor of 6 as claimed by the author? I seriously doubt it, unless you have been incredibly inefficient in the way you build these arrays.
As far as the code itself goes, I like that there are error checks here. I like the help, with the exception of an H1 line. An H1 line is the very FIRST comment line. That line enables lookfor to work, helping you to find this code next year when you have no idea what you named it (or downloaded it) a year ago.
An H1 line should be a descriptive single comment line, containing good keywords that a person searching for the code will use.
If the code were repaired, making it efficient for large problems with no need for a big loop, and the H1 line fixed, I would raise my rating.