Tuesday, July 30, 2013

How to link Intel MKL LAPACK

This article summarizes how to link existing C++ code against Intel MKL LAPACK without modifying a single line of the original source, which was written against standard LAPACK and compiled with g++.

Intel MKL comprises many components, far more than I need for my purpose. In my case, I only want to know whether MKL provides better performance than GotoBLAS in my application, so I don't want to change any part of my code; I just want to swap GotoBLAS for MKL. This article is written for pure newbies to Intel MKL and is not intended as a proper introduction to MKL.

The task of replacing LAPACK with MKL turns out to be very simple. I used to link my code with GotoBLAS via the flag -lgoto2, or with standard LAPACK via -llapack. For MKL, this website helps a lot:

http://software.intel.com/en-us/articles/intel-mkl-link-line-advisor

After filling in my platform information (GNU C/C++ and libgomp), the following flags are suggested:

 -L$MKLROOT/lib/intel64 -lmkl_rt -ldl -lpthread -lm

I replaced -lgoto2 with the flags above and recompiled, and my code ran smoothly with MKL (and much faster than GotoBLAS in multithreaded cases, though slower when single-threaded).
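
For a concrete picture, a full compile-and-link line might look like the following; main.cpp stands in for a hypothetical source file calling LAPACK routines, and MKLROOT is assumed to have been set by MKL's environment script:

 g++ main.cpp -o main -L$MKLROOT/lib/intel64 -lmkl_rt -ldl -lpthread -lm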

This page summarizes how to control the number of threads that MKL uses for the LAPACK part:

http://software.intel.com/sites/products/documentation/hpc/mkl/lin/MKL_UG_managing_performance/Using_the_Intel_MKL_Parallelism.htm

In short, MKL uses the number of physical cores as the default number of threads. Hyperthreading is not considered, as very little juice can be squeezed out of it when highly optimized and efficient code such as MKL is used. If you want to control the number of threads, several environment variables can be used; they are described clearly here:

http://software.intel.com/sites/products/documentation/hpc/mkl/lin/MKL_UG_managing_performance/Using_Additional_Threading_Control.htm

These environment variables are independent of their OpenMP counterparts. In my case, for example, when I want to use only two threads to solve my linear system Ax=b, I enter the following command in bash:

export MKL_NUM_THREADS=2

before I run my executable. Observing different performance before and after setting the environment variable confirms that the setting took effect.

Tuesday, June 4, 2013

Yalmip with Mosek in SOS optimization

This post describes how to make Mosek work smoothly with Yalmip for SOS programming. The following guide is for academic use on Linux Mint 15 (Ubuntu 13.04) with Matlab R2012a, Mosek 7.0.0.65, and Yalmip R20130405.

Installation of Mosek

Simply download it from Mosek.com, untar the downloaded file directly under your home folder, apply for the academic license, and place the license file directly under the mosek folder. The mosek folder and license location are fixed; changing them may keep mosek from functioning properly.

Mosek for Linux is precompiled and shipped only as .a and .so files, so you won't encounter any make hiccups, but you also can't easily decipher what is going on inside. It either works like a breeze or does not work at all, and in the latter case there is little you can do, since the source is not shipped.

If you want to use mosek from the command line, add this line to your .bashrc:

export PATH="$PATH:$HOME/mosek/7/tools/platform/linux64x86/bin"

Then "source .bashrc" or restart your terminal. You can try type "mosek" in your new open-up terminal to see if the path has been set up correctly.

If you want to use mosek in Matlab only, add the mosek toolbox to your Matlab search path:

/home/youraccount/mosek/7/toolbox/r2012a

Mosek implements Matlab bindings for each Matlab version; only include the path for your version. Type "mosekopt" in Matlab to check whether the path setup is correct.
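
If you prefer to script this, a minimal sketch ("youraccount" is a placeholder, as above):

addpath('/home/youraccount/mosek/7/toolbox/r2012a');  % Mosek's Matlab bindings
savepath;  % optional: persist the path across Matlab sessions
mosekopt   % if the path is correct, Matlab finds this function instead of erroring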

Installation of Yalmip

Much simpler. Download yalmip, decompress it, and put the folder wherever you prefer. Add the path of your yalmip folder "with subfolders" within Matlab, then run "yalmiptest" in Matlab.
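
The "with subfolders" step can also be scripted; a minimal sketch, assuming yalmip was unpacked under your home folder (the location is hypothetical):

addpath(genpath(fullfile(getenv('HOME'), 'yalmip')));  % add yalmip and all subfolders
yalmiptest   % run Yalmip's self-test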

The first screen of yalmiptest shows which solvers it finds on your system; Mosek should be among them if it is installed properly. The second screen summarizes whether each test problem is solved properly. If everything looks reasonable (feasible problems solved, infeasible problems reported as such), then your yalmip installation is done.

SOS by Yalmip with Mosek

After setting up your SOS problem, pass an options structure with the solver set to "mosek". This can be done by

option = sdpsettings('solver','mosek');

and use this "option" with the command "solvesos":

solution_info = solvesos(cons, obj, option, var);
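
To make the pieces concrete, here is a minimal end-to-end sketch that computes a certified lower bound on a polynomial; the polynomial p and the names x and t are made up for illustration:

x = sdpvar(1,1);                          % free variable of the polynomial
t = sdpvar(1,1);                          % lower bound to be maximized
p = x^4 - 3*x^2 + 2;                      % hypothetical polynomial
cons = sos(p - t);                        % require p - t to be a sum of squares
option = sdpsettings('solver','mosek');
solution_info = solvesos(cons, -t, option, t);  % solvesos minimizes, so -t maximizes t
value(t)                                  % the certified lower bound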

The argument "cons" is for constraints, "obj" for the objective function, "var" for decision variables. You will likely notice yalmip prints two errors when you call "sdpsettings"

ERROR - (linprog): Invalid input argument; problem must be a structure.
ERROR - (bintprog): Invalid input argument; problem must be a structure.

This is because mosek overloads Matlab's built-in linprog, bintprog, and quadprog. I don't know whether Mosek's solvers are faster than Matlab's counterparts for linear programming, binary integer programming, and quadratic programming, but Mosek's solvers do take different options from Matlab's, so the overloads are not quite successful in terms of compatibility. Matlab still uses its own "optimset" to set up options, and the expected field names and values may differ from what Mosek expects. When "sdpsettings" attempts to collect the options of every solver it supports by calling "optimset(solvername)", for instance "optimset('linprog')", "optimset" in turn calls "solvername('defaults')" to obtain the default option structure. However, Matlab does not document this 'defaults' argument anywhere other than in the code itself, and apparently Mosek is not aware of this hidden 'defaults' argument either.

Therefore, Mosek's outputs for "linprog('defaults')", "quadprog('defaults')", and "bintprog('defaults')" are not compatible with "optimset". To be more precise, Mosek's linprog and bintprog do not accept such a string argument at all. This is why we get two errors when calling "sdpsettings" (quadprog does handle "defaults", so no error is printed, but its output format is still not compatible).

To avoid such ugly error messages, we can suppress them. Because Mosek internally works differently from the simplex method used in "linprog", it makes little sense to force Mosek to output an option structure that is compatible with Matlab's "optimset" but not with itself. The best bet is to bypass these uninformative errors, since "solvesos" does not really use the options for linprog, quadprog, and bintprog (one would need to dig into the yalmip code to double-check). We can add a few lines at the very front of Mosek's linprog, bintprog, and quadprog:

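% Bail out on the undocumented 'defaults' query; returning with no output
% assigned makes optimset error out, and sdpsettings catches that error.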
if ( nargin == 1 && nargout <= 1 && isequal(f,'defaults') )
    return
end

(For quadprog, whose first input argument is H rather than f, change the check to isequal(H,'defaults').)
This makes "optimset" throw an error that is caught within "sdpsettings", instead of invoking Mosek's internal error-handling mechanism.

Quality compared to Sedumi

SOS is a kind of problem that usually suffers from slow convergence in optimization. Sedumi in this case reports "run into numerical errors", while Mosek reports "Mosek error: MSK_RES_TRM_STALL ()". Both refer to the same slow-convergence issue. The following statement is quoted from the yalmip website: "However, although sum-of-squares converts the original, theoretically intractable, polynomial problem to an SDP which can be solved using convex optimization, the problems that are generated are often huge, and numerically ill-conditioned."

As a result, an older version of the yalmip website suggests: "The quality of the SOS approximation is typically improved substantially if the tolerance and precision options of the semidefinite solver is decreased. As an example, having sedumi.eps less than 10^-10 when solving sum of squares problems is typically recommended for anything but trivial problems. There is a higher likelihood that the semidefinite solver will complain about numerical problems in the end-phase, but the resulting solutions are typically much better. This seem to be even more important in parameterized problems."

The purpose of "sedumi.eps" is explained in the Sedumi user guide: "Sedumi terminates successfully if it finds a solution that violates feasibility and optimality requirements by no more than eps." This is the most detailed statement of the purpose of eps; other documentation only refers to eps as "desired accuracy", which is not a clear enough explanation. Another parameter, "numtol", is best left alone, according to the Addendum to the Sedumi User Guide.
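
Following that advice, tightening sedumi.eps through Yalmip is a one-liner; a minimal sketch (the value 1e-12 is illustrative, and cons, obj, var stand for the objects from the earlier example):

ops = sdpsettings('solver','sedumi');
ops.sedumi.eps = 1e-12;                 % stricter than Sedumi's default tolerance
solution_info = solvesos(cons, obj, ops, var);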

Here I list a couple of Sedumi documents, since they are not easy to find in one place:

Other info
A poster of Sedumi (intro to other tools seems out-of-date)

The solution quality of Mosek for SOS problems can be controlled by using the following three options (Section 7.6.5 in Mosek Matlab toolbox manual):

mosek.MSK_DPAR_INTPNT_CO_TOL_PFEAS
mosek.MSK_DPAR_INTPNT_CO_TOL_DFEAS
mosek.MSK_DPAR_INTPNT_CO_TOL_REL_GAP

These three options control the stopping criteria of the iterations for conic programming. SOS optimization is essentially SDP, which is described as "a further generalization of conic quadratic optimization" (Section 7.7 in the Mosek Matlab toolbox manual), so the quality of SOS solutions can also be controlled by these three options.
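
Through Yalmip, these can be set as fields on the options structure; a minimal sketch (the value 1e-10 is illustrative, not a recommendation):

ops = sdpsettings('solver','mosek');
ops.mosek.MSK_DPAR_INTPNT_CO_TOL_PFEAS   = 1e-10;  % primal feasibility tolerance
ops.mosek.MSK_DPAR_INTPNT_CO_TOL_DFEAS   = 1e-10;  % dual feasibility tolerance
ops.mosek.MSK_DPAR_INTPNT_CO_TOL_REL_GAP = 1e-10;  % relative gap tolerance
solution_info = solvesos(cons, obj, ops, var);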

Besides, the solver iteration log is explained in Section 10.2.2.3 in Mosek Matlab toolbox manual.

Monday, June 3, 2013

Optimization Solver Benchmarks

These days I use a lot of optimization in my research. The optimization method called sum-of-squares (SOS) programming for polynomials is the essential backbone of my research topic. Two years ago I started to play with Prof. Megretski's spot (Systems Polynomial Optimization Tools), which is written in Matlab and based on Sturm's Sedumi (and I sadly just noticed that Dr. Sturm passed away in 2003). This spot/Sedumi combination worked fine most of the time, but for more printed information and options, I followed Prof. Parrilo's suggestion to use Yalmip, which has better up-to-date support for SOS. (Interestingly, SOS and its first tool, SOSTOOLS, were created by Parrilo, and yet I was steered away from his own software.) Since then, I have been using Yalmip/Sedumi for most of my work.

However, there are still hiccups with this combination. Sedumi sometimes reports possibly bad-quality solutions or stops iterating, and Yalmip may kindly warn me to check, for example, whether the SOS decomposition is fine. That is not a permanent solution, though, so I have started to look around for other solvers. Most optimization problem formulators support at least Sedumi and SDPT3 by default. Luckily, Yalmip supports various types of solvers as backends, which makes porting between solvers much simpler.

So here comes the problem: how to choose between solvers? People argue over several criteria, such as solution quality, solution feasibility, and speed on certain classes of problems. This note gives some ideas.

And for speed, here is a well-compiled benchmark list by Prof. Mittelmann at Arizona State University.