Thank you for sharing. I tried the same method, but my program does not recognize auxdata. Below is the error:
Undefined variable auxdata.
Error in run_admm>jacobianstructure (line 261)
M_ec = auxdata{2};
Error in run_admm (line 109)
[x info] = ipopt(x0,funcs{k},options{k});
Error in main2_ieee (line 94)
net.info_admm = run_admm(net,net.V0,exit_crit,param_vec);
"Mark Mikofski" wrote in message <l1i1ha$89g$1@newscl01ah.mathworks.com>...
> You have a non-linear problem to solve but not the optimization toolbox?
>
> 1. First download the IPOPT mex and m-files, and extract to your MATLAB search path.
> https://projects.coin-or.org/Ipopt#Download
>
> 2. Make an m-file that defines your objective and constraints, gradient and Jacobian.
>
> function [x,info] = myNonLinearProblem(x0, auxdata)
> %MYNONLINEARPROBLEM Solves my non-linear problem
> % [X,INFO] = MYNONLINEARPROBLEM(X0,AUXDATA)
> % returns solution, X, and INFO to my non-linear problem
> % X0 is the initial guess
> % AUXDATA are any additional arguments my non-linear problem
> % requires
>
> x0 = x0(:); % initial guess as a column vector
>
> %% constraints
> % set all of your equations as constraints
> funcs.constraints = @constraints;
> % constraints require a Jacobian
> funcs.jacobian = @jacobian;
> % Jacobians require a sparsity structure
> funcs.jacobianstructure = @jacobianstructure;
> % set upper and lower bounds of constraints to zero
> options.cl = zeros(size(x0)); % lower bounds
> options.cu = zeros(size(x0)); % upper bounds
>
> %% objective
> % set the objective to the sum of squares of the residuals
> funcs.objective = @objective;
> % objective requires a gradient
> funcs.gradient = @gradient;
>
> %% options
> % set Quasi-Newton option
> options.ipopt.hessian_approximation = 'limited-memory';
>
> %% solve
> [x,info] = ipopt_auxdata(x0,funcs,options);
> end
>
> % x & auxdata must be passed as only args to
> % objective, gradient, constraints and jacobian
>
> function f = objective(x,auxdata)
> % objective is the sum of the squares of the residuals
> f = residual(x,auxdata);
> f = f(:);
> f = f'*f;
> end
>
> function g = gradient(x,auxdata)
> % gradient of the sum of squares f'*f is 2*f'*J
> [f,j] = residual(x,auxdata);
> f = f(:);
> g = 2*f'*j;
> end
>
> function f = constraints(x,auxdata)
> f = residual(x,auxdata);
> f = f(:);
> end
>
> % Jacobian and Jacobian structure must be sparse
>
> function j = jacobian(x,auxdata)
> % rows correspond to each constraint
> % columns correspond to each x
> [~,j] = residual(x,auxdata);
> j = sparse(j);
> end
>
> function j = jacobianstructure(x)
> % x is the only arg passed to Jacobian structure
> % assuming a closed system of equations
> % (# of constraints = # of unknowns),
> % the Jacobian matrix is square
> j = sparse(ones(numel(x)));
> end
>
> % put the equations and their derivatives here
> % unfortunately IPOPT is not a Jacobianless solver :(
>
> function [f,j] = residual(x,auxdata) % name must match the calls above
> % calculate your residuals here
> % f1(x,auxdata) = lhs1(x,auxdata) - rhs1(x,auxdata) = 0
> % f2 = ...
> % j1(x,auxdata) = [df1/dx1, df1/dx2, ...]
> % j2 = ...
> end