When discussing Matlab performance, the Matlab version, the OS, and whether the Parallel Computing Toolbox is used are all important.
According to a previous answer by Titus Edelhofer (The MathWorks),
functions like eig, qr, matrix multiplication, etc. are multithreaded.
Thus, your code is dominated by built-in functions, which are multithreaded. That's why you see 100% CPU usage.
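One way to see this multithreading in action (a sketch; exact timings will vary by machine, and timeit needs R2013b or later, so use tic/toc on older releases) is to time a built-in while varying the thread count with maxNumCompThreads:

```matlab
% Compare a multithreaded builtin (matrix multiply) with one vs. all threads.
A = rand(2000);
B = rand(2000);

nOld = maxNumCompThreads(1);   % restrict Matlab to a single computational thread
t1 = timeit(@() A*B);

maxNumCompThreads(nOld);       % restore the default thread count
tN = timeit(@() A*B);

fprintf('1 thread: %.3f s, %d threads: %.3f s\n', t1, nOld, tN);
```

On a multicore machine the second timing should be clearly lower, and Task Manager will show all cores busy during the multithreaded run.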
Could it be memory related? I have 4 GB, and I always have
more than 2 GB available, so I don't know.
I have fooled myself lately by focusing on Available memory. I run R2012a 64-bit on Windows 7 with 8 GB.
Does Free memory ever decrease to low values (or zero)? If it does, that is part of the problem. I find the behavior of Windows' System Cache difficult to understand, and the Task Manager's way of showing memory usage a bit misleading. You don't read large files(?)
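Rather than reading Task Manager, on Windows you can ask Matlab directly what it can allocate with the built-in memory function (Windows-only; a sketch):

```matlab
% The MEMORY function reports what Matlab itself can allocate, which can
% differ from Task Manager's "Available" figure.
[userview, systemview] = memory;
fprintf('Largest array Matlab can allocate: %.2f GB\n', ...
        userview.MaxPossibleArrayBytes / 2^30);
fprintf('Memory available for all arrays:  %.2f GB\n', ...
        userview.MemAvailableAllArrays / 2^30);
fprintf('System physical memory available: %.2f GB\n', ...
        systemview.PhysicalMemory.Available / 2^30);
```

If MaxPossibleArrayBytes is small while Task Manager still shows gigabytes "available", fragmentation or the System Cache is the likely culprit.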
Your "unpack" function creates a bunch of new variables. That doesn't require much memory until the values of the new variables are changed (lazy copy). Does that happen to large arrays?
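To illustrate the lazy-copy (copy-on-write) behavior, here is a sketch; the effect is observable with the Windows-only memory command or an external process monitor:

```matlab
% Lazy copy: B = A does not duplicate A's data. The real allocation
% happens only when one of the copies is first written to.
A = rand(5000);   % ~200 MB of doubles
B = A;            % cheap: B shares A's data, no new allocation yet
B(1) = 0;         % first write to B: now the full ~200 MB copy is made
```

So an "unpack" that merely creates aliases is cheap, but the moment your code modifies one of those large unpacked arrays, memory usage jumps.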
Matlab has a piece of magic code that they call the "Accelerator", part of which is a just-in-time compiler, the JIT. (Or is the Accelerator the JIT?) The MathWorks develops the Accelerator actively, and they argue that we, the ordinary users, should not craft our code to fit its current state. The Accelerator is not documented.
My guess: the major reason for the performance issues you see is related to the Accelerator. As Jan describes, variables "popping up in the workspace" (my words) certainly cause problems for the JIT. assignin, eval, load('...mat'), etc. do that.
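A sketch of the patterns in question, side by side with alternatives that keep every variable visible to the compiler ('results.mat' is a hypothetical file name):

```matlab
% JIT-unfriendly: variables appear in the workspace without the compiler
% being able to see them in the source code.
eval('x = 42;');               % variable 'x' created from a string
load('results.mat');           % dumps unknown variables into the workspace
assignin('base', 'y', 7);      % injects 'y' into another workspace

% JIT-friendly equivalents: every variable is explicit in the code.
x = 42;
S = load('results.mat');       % one struct; variables become fields S.<name>
y = 7;
```

The struct form of load is also safer in general, since it cannot silently overwrite existing variables.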
The Julia language suggests that the Accelerator could be improved further. However, backward compatibility makes things difficult, I guess. There was an informative video on Julia, but I cannot find it now.