Do you have any sense of how high the dimension can go? I am testing something like x1+x2+...+xn=1, xi>=0, i=1,2,...,n. I randomly generate some points within this region and also supply the vertices (0,0,...,0), (1,0,...,0), (0,1,...,0), ... to help with checking.
It looks like n=10 works, but n=20 gets stuck in the qhullmx function called by convhulln.
Is it possible to have a function like qhullmx that is more powerful in terms of dimension and speed?
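The simplex test above can be sketched in Python with scipy's ConvexHull, which wraps the same Qhull library that MATLAB's convhulln calls (the Python translation is my assumption; the original discussion is about MATLAB). Note that points satisfying x1+...+xn=1 all lie in a hyperplane, so Qhull would see degenerate input; the sketch therefore works in the projected (n-1)-dimensional region x1+...+xd <= 1, xi >= 0, whose vertices are the origin and the unit basis vectors.

```python
import numpy as np
from scipy.spatial import ConvexHull

d = 5  # dimension of the projected simplex x1+...+xd <= 1, xi >= 0

# Vertices of the projected simplex: the origin plus the unit basis vectors.
verts = np.vstack([np.zeros(d), np.eye(d)])

# Random interior points: Dirichlet samples on the full simplex, with the
# last coordinate dropped, land strictly inside with probability 1.
rng = np.random.default_rng(0)
interior = rng.dirichlet(np.ones(d + 1), size=50)[:, :d]

pts = np.vstack([verts, interior])
hull = ConvexHull(pts)

# Only the d+1 simplex corners survive as hull vertices; the interior
# samples are discarded. A d-simplex also has exactly d+1 facets.
print(len(hull.vertices))   # → 6
print(len(hull.simplices))  # → 6
```

Raising `d` here gives a rough feel for where Qhull's runtime blows up, analogous to the n=10 vs n=20 behavior seen with convhulln.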
My reason for suggesting consolidator is that the unique function has a very strict requirement for two rows to be considered the same. When running vert2lcon on a large data set, I have seen a large A in which many rows agree only up to 6-8 digits. This enlarges the size of A, which may not be necessary for the user.
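A small sketch of the difference, using numpy as a stand-in (an assumption; the discussion is about MATLAB's unique and the File Exchange consolidator function). Exact row deduplication keeps near-duplicate constraint rows, while a tolerance-based merge collapses them; rounding to a tolerance grid is only a crude imitation of what consolidator does:

```python
import numpy as np

# Two rows of A that agree only to about 7 digits.
A = np.array([[0.33333333, 0.66666667],
              [0.33333334, 0.66666666]])

# Exact deduplication (like MATLAB's unique): both rows survive.
exact = np.unique(A, axis=0)
print(len(exact))  # → 2

# Tolerance-based merge: snap entries to a 1e-6 grid first, so rows that
# agree to within the tolerance become identical and collapse to one.
tol = 1e-6
merged = np.unique(np.round(A / tol) * tol, axis=0)
print(len(merged))  # → 1
```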
"A and b are produced by Michael Kleder's functions which seem to normalize it this way already."
Sorry Matt, I am not sure how to remove the first rating. Two comments: the "unique" function might be better replaced by "consolidator"; and it may be good to normalize A, b, etc. before the final output (though I am not quite sure it is necessary).
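One plausible normalization for the output of A*x <= b (my assumption of what is meant here, sketched in Python rather than MATLAB) is to scale each row of [A b] so the corresponding row of A has unit Euclidean norm; that makes near-duplicate constraints directly comparable row by row:

```python
import numpy as np

# Example system A*x <= b with rows at different scales.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
b = np.array([4.0, 6.0])

# Divide each row of A, and the matching entry of b, by the row norm of A.
norms = np.linalg.norm(A, axis=1)
A_n = A / norms[:, None]
b_n = b / norms

print(A_n.tolist())  # → [[1.0, 0.0], [0.0, 1.0]]
print(b_n.tolist())  # → [2.0, 2.0]
```

After this scaling, rows describing (almost) the same half-space have (almost) identical entries, which is exactly the situation where a tolerance-based deduplication pays off.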