Some testing issues I had:
I couldn't run the posted code because npolinomi was not defined. Looking at the code, I made a guess and set npolinomi = 500. The code then ran, but ran out of memory on my machine, so I dropped all of the 500's to 200's just to get things to run.
It appears to be somewhat of an apples-to-oranges comparison here. You have sparse and full matrices in your testing mix, as well as logical variables. The fact that sparse matrices are involved makes a HUGE difference, and this was not mentioned in your original post. For matrix multiplies, MTIMESX only has code for full double and full single variables. So you appear to be comparing a full*full matrix multiply (via MTIMESX), plus the W1 building, against a sparse*sparse multiply. The sparser the matrices are, the faster the sparse*sparse formulation will be, so if the matrices are sparse enough it would not surprise me if the sparse code were faster. In addition, in my 32-bit R2011a PCWIN version the MTIMESX code runs faster, so it appears the newer MATLAB versions have improved sparse matrix multiply speed.
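To see the effect for yourself, here is a rough sketch of the kind of timing comparison I mean. The size and density values are made up for illustration (your real matrices may differ); it times a sparse*sparse multiply against the equivalent full*full multiply, which is the operation MTIMESX accelerates:

```matlab
% Illustrative only -- n and density are assumed values, not from the post.
n = 2000;
density = 0.01;                 % 1% nonzeros: quite sparse

As = sprand(n, n, density);     % sparse random matrices
Bs = sprand(n, n, density);
Af = full(As);                  % same data as full matrices
Bf = full(Bs);

tic; Cs = As * Bs; tsparse = toc;   % sparse*sparse multiply
tic; Cf = Af * Bf; tfull   = toc;   % full*full multiply

fprintf('sparse*sparse: %g s   full*full: %g s\n', tsparse, tfull);
```

At low densities the sparse multiply typically wins by a wide margin; as density grows, the full multiply (and hence MTIMESX) catches up and eventually overtakes it.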
So, back to the original question of how to speed up your code given that there are sparse matrices involved. This is hard to say, because the code you show is boiled down to the point that I cannot really tell whether anything could be sped up with a mex routine or with other methods. Working with sparse matrices in mex routines is not trivial, particularly if you are trying to combine them or incrementally build them, so I would have to see more detail about your real code before I could make any suggestions. E.g., how is COEFF_S really built? Reshaping a sparse matrix (even transposing a sparse vector) invokes a deep data copy, so maybe something could be changed there to avoid that copy. Are there any calculations that could be moved outside the loop? Are there any full/sparse conversions that could be avoided? Etc., etc. We can't really advise on any of this based on the boiled-down code posted above.
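As one example of what I mean about incrementally building a sparse matrix: since I haven't seen how COEFF_S is really constructed, this is a hypothetical sketch (the names nloop, I, J, V and the placeholder computation are mine, not from your code). Instead of inserting into a sparse matrix inside the loop (each insertion can trigger a data copy), collect triplets and call sparse() once at the end:

```matlab
% Hypothetical sketch: accumulate (row, col, value) triplets in the loop,
% then build the sparse matrix in one shot at the end.
nloop = 100;                          % assumed loop count
I = cell(nloop,1); J = cell(nloop,1); V = cell(nloop,1);
for k = 1:nloop
    % ... replace with this iteration's real row/col/value triplets ...
    I{k} = k; J{k} = k; V{k} = k^2;   % placeholder values
end
% One call to sparse() instead of nloop incremental insertions:
COEFF_S = sparse(vertcat(I{:}), vertcat(J{:}), vertcat(V{:}));
```

Whether this applies to your case depends entirely on how COEFF_S is actually built, which is why I'd need to see the real code.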