First of all, mathematical theory tells us that there exist functions for which knowing the value of the function at one location gives you no information about its value at any other location. It follows that there exist nonlinear equations which cannot be solved by gradient descent, or a genetic algorithm (unless it exhaustively tries every possibility), or Newton's method, or the simplex method, or anything else implemented by the functions you list -- not short of flailing around and happening to try an exact solution by chance.
Second: when you have "black box" functions (a function handle for code you are not permitted to examine analytically), it can be quite difficult to find solutions of equations even when clear analytic solutions exist. Therefore none of the functions you list can guarantee finding global minima or solving nonlinear equations -- not outside of certain narrow classes of functions.
Third: every equation f(x) == b has an equivalent minimization problem, minimizing norm(f(x)-b). The equation is solved exactly at the locations where that norm is 0, and hopefully solved to within accuracy and round-off error where the norm is minimized if no exact zero can be found. For real-valued functions, the squared residual (f(x)-b).^2 can be substituted for the norm.
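As a concrete sketch of that conversion (the particular function and right-hand side here are just illustrative):

```matlab
% Solve exp(x) + x == 3 by minimizing the squared residual.
f = @(x) exp(x) + x;
b = 3;
objective = @(x) (f(x) - b).^2;    % zero exactly at a solution
x0 = 0;                            % initial guess
xsol = fminsearch(objective, x0);
residual = f(xsol) - b;            % near 0, up to round-off, if a root was found
```

Checking the residual afterwards, as in the last line, is important: a small objective value does not by itself prove you found a root rather than a nonzero local minimum.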
Fourth: for multiple real-valued equations, you can minimize (f1(x)-b1).^2 + (f2(x)-b2).^2 and so on. If that sum reaches zero, then all of the equations are solved simultaneously at that point. (For practical reasons, you might need to scale the function values relative to each other to avoid having some residuals effectively ignored.)
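For example, with two equations where one has much larger magnitude than the other (the functions, right-hand sides, and scale factor below are made up for illustration):

```matlab
% Two equations in two unknowns, solved as one minimization.
% The second residual is scaled down so it does not dominate the sum.
f1 = @(p) p(1)^2 + p(2)^2;    b1 = 5;     % modest magnitude
f2 = @(p) 1e4*(p(1) - p(2));  b2 = 1e4;   % large magnitude
w2 = 1e-4;                                % scale factor for the second residual
objective = @(p) (f1(p) - b1).^2 + (w2*(f2(p) - b2)).^2;
psol = fminsearch(objective, [1 1]);      % both equations hold where objective is 0
```

Without the w2 scaling, the optimizer would spend nearly all of its effort driving the second residual down and could effectively ignore the first equation.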
Fifth: for nonlinear functions it is really common for the solution space to be so bumpy that all of the fmincon / fminunc algorithms get stuck in local minima, sometimes "near" a solution but sometimes quite far from one. fminsearch() tends to get stuck less easily, so given enough time it tends to get closer to solutions, but there are nonlinear functions on which it will wander indefinitely. In particular, if there is an asymptotic minimum towards infinity but a deeper minimum somewhere else, it is common for fminsearch to get stuck chasing the asymptote.
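A common partial mitigation for bumpy objectives is to restart the search from many initial points and keep the best result. This is only a sketch (the objective here is a made-up bumpy function), and it reduces the risk of a poor local minimum without eliminating it:

```matlab
% Crude multistart: run fminsearch from many random starting points
% and keep the best result found.
objective = @(x) sin(5*x).*exp(-0.1*x.^2) + 0.05*x.^2;  % bumpy example
best_f = inf;
best_x = NaN;
for k = 1:50
    x0 = 20*rand - 10;                   % random start in [-10, 10]
    [x, fx] = fminsearch(objective, x0);
    if fx < best_f
        best_f = fx;
        best_x = x;
    end
end
```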
There are no algorithms that can guarantee finding solutions (or the global minimum) outside of some narrow cases.
You can do a bit better if you have the Symbolic Math Toolbox and your equations can be differentiated symbolically.
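With that toolbox, solve() can sometimes return exact solutions, and vpasolve() can search numerically with the benefit of symbolic derivatives (the equations below are just examples):

```matlab
% Requires the Symbolic Math Toolbox.
syms x
sol  = solve(x^2 - 3*x + 2 == 0, x);     % exact solutions: x = 1 and x = 2
nsol = vpasolve(exp(x) + x == 3, x, 0);  % numeric root, searching near x = 0
```

Even here, vpasolve() generally returns only one root per call for non-polynomial equations, so the starting point still matters.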