Code covered by the BSD License


# Patch Slim (patchslim.m)

### Francis Esmonde-White

09 Jan 2011 (Updated)

Remove duplicate vertices in surface meshes.

### Description

Surface meshes often have duplicate vertices listed. This function parses the list of points and finds all vertices with duplicate entries in the 'v' (vertices) matrix. These points are removed and the 'f' (faces) matrix indices are set accordingly.

This reduces the size of patch matrices, often by quite a bit. This program can take quite a while to run on very large meshes. I use this to shrink surface mesh sizes after loading them in with 'stlread.m', at http://www.mathworks.com/matlabcentral/fileexchange/29906-binary-stl-file-reader .

USAGE: [v, f]=patchslim(v, f)
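In sketch form, the core operation is a duplicate-row collapse followed by an index remap. The snippet below is a minimal illustration of the idea, assuming the vectorized `unique`-based approach discussed in the comments; the variable names are illustrative, not taken from the shipped file:

```matlab
% Collapse duplicate vertex rows, then remap the face indices.
[vu, ~, ix] = unique(v, 'rows');  % keep one copy of each coordinate row
fu = ix(f);                       % map old vertex indices to new ones
```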

### Acknowledgements

Binary STL File Reader and Consolidator inspired this file.

MATLAB release: MATLAB 7.8 (R2009a)
10 Feb 2015 Manu


```matlab
% remove duplicate vertices
[vnew, indexm, indexn] = unique(V, 'rows');
fnew = indexn(F);

% remove degenerate faces (faces that reference the same vertex twice)
numfaces = (1:size(fnew,1))';

% per-face differences between pairs of vertex indices; a zero means
% two indices are identical, i.e. the face has collapsed
e1 = fnew(:,1) - fnew(:,2);
e2 = fnew(:,1) - fnew(:,3);
e3 = fnew(:,2) - fnew(:,3);

% tag each difference with its face number
e1 = [e1 numfaces];
e2 = [e2 numfaces];
e3 = [e3 numfaces];

% collect the face numbers where a difference is zero
e1 = e1(e1(:,1)==0, 2);
e2 = e2(e2(:,1)==0, 2);
e3 = e3(e3(:,1)==0, 2);

% delete those faces
fnew(vertcat(e1,e2,e3),:) = [];
```
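The edge-difference bookkeeping above can also be written as a single logical test. This compact variant (my sketch, not part of the original comment) removes the same degenerate faces:

```matlab
% A face is degenerate when any two of its three vertex indices match.
bad = fnew(:,1)==fnew(:,2) | fnew(:,1)==fnew(:,3) | fnew(:,2)==fnew(:,3);
fnew(bad,:) = [];
```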

15 Sep 2012 Kamil Stanek

12 Jan 2011 Francis Esmonde-White


Hi John,

I completely agree, it's a huge improvement over the original. Again, thanks for the help!

Francis

12 Jan 2011 John D'Errico

Well done. I would point out that now the code is far more efficient, far cleaner than the loopy code before. Thanks for cleaning it up.

11 Jan 2011 Francis Esmonde-White


Thanks again John, it's 30x faster now. I didn't know that unique() could operate on entire rows.

It would be helpful if you could update the consolidator help (perhaps include the same indexing tips as given in unique). I look forward to using consolidator in the future.

10 Jan 2011 John D'Errico

Consolidator DOES give sufficient information, but you need to use that information properly. However, if you want to resolve only the exact replicates, then use unique, with the 'rows' option. It too gives you sufficient information. Look at the third output from unique, which, by the way, is the same as that returned from consolidator. (It is a simple direct lookup table, so a simple index operation. Try it.)
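For example, the third output of `unique` with the 'rows' option is exactly the lookup table John describes (a small illustration, not part of the submission):

```matlab
V = [0 0 0; 1 0 0; 0 0 0; 0 1 0];   % row 3 duplicates row 1
[Vu, ia, ic] = unique(V, 'rows');
% Vu = V(ia,:) holds the unique rows; ic maps each original row of V
% to its row in Vu, so V is recovered as Vu(ic,:). Remapping a face
% list is then a direct index operation: fnew = ic(f).
```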

10 Jan 2011 Francis Esmonde-White

I looked into using consolidator. From what I can tell, the syntax should be something like:

[vnew,ycon,ind] = consolidator(v,[],[],0);

Using this, vnew(ind,:) == v

However, Consolidator doesn't return the indices required to re-index f (the faces matrix) without losing the face-vertex pairing. Am I missing something about Consolidator?

Hence, it only does half of what we need here. I need to find the unique vertices (which consolidator does *really* quickly), but also maintain the list mapping the initial vertex indices to the new vertex indices (which consolidator does not appear to do).

Besides that, it seems to unnecessarily introduce a dependency.

I have altered the H1 line so that it conforms to your preferences.

10 Jan 2011 Francis Esmonde-White

Thanks for the helpful feedback John.

The arrays I work with are generated by a variety of 3D modelling tools, mostly Mimics and other Materialise software. The STL files written by their utilities list every vertex of every triangle. Because every vertex is shared by several triangles (assuming a closed mesh), each unique node appears several times in the file; in my test data, by a factor of about 6. Your mileage may vary.

My example using a small mesh: v matrix reduced from size [17592 3] to size [2934 3] for a constant f matrix of size [5864 3].

I will update the header code and look at Consolidator. One very important consideration in this code is that only identical nodes should be merged. Otherwise it puts an arbitrary constraint on the mesh resolution and will damage otherwise valid surface meshes.

10 Jan 2011 John D'Errico

Rather than the slow approach of testing every patch against the rest to see if there exist exact replicates, use a tool like consolidator. Consolidator will find the exact replicates in multiple dimensions, but also allows a tolerance to try to find near replicates if you so desire. The third output argument of consolidator gives you all the information needed to do the necessary reduction, with no need for massive loops.

Will this truly reduce the size of your arrays by a factor of 6 as claimed by the author? I seriously doubt it, unless you have been incredibly inefficient in the way you build these arrays.

As far as the code itself goes, I like that there are error checks here. I like the help, with the exception of an H1 line. An H1 line is the very FIRST comment line. That line enables lookfor to work, helping you to find this code next year when you have no idea what you named it (or downloaded it) a year ago.

An H1 line should be a descriptive single comment line, containing good keywords that a person searching for the code will use.

If the code were repaired, making it efficient for large problems with no need for a big loop, and the H1 line fixed, I would raise my rating.