Hi Rakesh,
I'm having a similar problem:
I have data on rats performing a task in which they must learn associations between a stimulus and a response and make a correct choice. For each session I compute correct responses / total responses and store these in a vector, which I call 'perf' (so the name doesn't clash with the model parameter a below). Say, for four sessions:
perf = [0.10 0.20 0.30 0.40];
I want to calculate the learning rate:
I found a paper that fits the learning curve with the following equation:
P(N | a, b, c) = a - b*N^(-c)
where:
P = performance, in my case 'correct responses' / 'total responses'
N = session number
a = asymptote, i.e. the stabilized performance at the end of training (in my case normally 0.90)
b = amplitude of the initial block (difference between initial performance and asymptote)
c = curvature
I then construct a model that summarises my data, i.e. one that comes close to the observed values:
a = 0.90;    % asymptote
b = 0.7581;  % amplitude
c = 1.09;    % curvature
model = zeros(size(perf));   % preallocate
for N = 1:length(perf)
    model(N) = a - b*N^(-c);
end
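To sanity-check these values I also compute the curve in vectorised form and plot it over my data (just a quick sketch of my own, not from the paper):
N = 1:length(perf);            % session numbers
modelVec = a - b*N.^(-c);      % same curve, vectorised
plot(N, perf, 'o', N, modelVec, '-')
xlabel('Session'), ylabel('P(correct)'), legend('data', 'model')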
I then want to see how good my model is, and specifically to find the values of a, b and c that best predict my data, using the sum of squared errors. This is the part I don't know how to do (finding the best-fitting parameters), so I applied your code:
fun = @(k,model) k(1) + k(2)*model + k(3)*model.^2
% Fit the model by minimizing the sum of squared residuals
[o1,o2,o3] = lsqcurvefit(fun, [1 2 3], model, perf)
but now I don't understand what o1, o2, and o3 are.
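From the lsqcurvefit documentation I gather that the three outputs are the fitted parameter vector, the squared norm of the residual (i.e. the sum of squared errors) and the residual vector itself, but I am not sure I have set the problem up correctly. My guess (untested) is that I should fit my actual curve rather than a quadratic, with the session numbers as 'xdata' and my performance vector as 'ydata', something like:
% k = [a b c], model P(N) = a - b*N^(-c)
fun2 = @(k,N) k(1) - k(2)*N.^(-k(3));
N = (1:length(perf))';             % session numbers ('xdata')
k0 = [0.9 0.75 1];                 % initial guess for [a b c]
[k_fit, resnorm, residual] = lsqcurvefit(fun2, k0, N, perf(:))
% k_fit    = best-fitting [a b c]
% resnorm  = sum of squared errors at the solution
% residual = perf(:) - fun2(k_fit, N)
Is that right?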
Any help on my dilemma would be greatly appreciated.
Thank you!!
Chiara
"Rakesh Kumar" wrote in message <k7b8ag$4jn$1@newscl01ah.mathworks.com>...
> "Prakhar " <prakhar_cool@yahoo.com> wrote in message <k79nrs$oss$1@newscl01ah.mathworks.com>...
> > Hi everyone
> >
> > Here is the problem which I am working on:
> > ***********************************************
> > Let's say we have an array x = [1; 2; 3; 4]. Now, let's define a function f = (a + b*y + c*y.^2), where y is an array of the same size as x (e.g. y = [-2; -1; 0; 2]).
> >
> > Define Residual:
> >
> > res = x - f
> > (e.g. res(1) = x(1) - (a + b*y(1) + c*y(1)^2)).
> >
> > Define sum of square of residuals as:
> >
> > s = sum(res.^2).
> >
> > Now, I need to find the variables a, b, c such that the sum of squared residuals 's' is minimized.
> > *****************************************************
> > My approach:
> >
> > I defined the residual as an inline function:
> >
> > x = [1; 2; 3; 4]
> > y = [-2; -1; 0; 2]
> > res = inline('x - (a + b*y + c*y.^2)', 'x', 'y', 'a', 'b', 'c');
> > resfun = @(v) res(x, y, v(1), v(2), v(3));  % residual as a function of v = [a b c]
> > *****************************
> >
> > What I am not able to code is how to compute the sum of the squared residuals and then how to find the values of a, b, c for which this sum is minimal. I know there is a function called fminsearch... but I am not able to apply it here. Any help would be appreciated.
> >
> > Thanks a lot!
> >
> > Regards
> > Prakhar (PhD student, Caltech)
>
> Easiest way is to use the lsqcurvefit function from the Optimization Toolbox. I am pretty sure Caltech has a license for it.
>
> Here is what I did on your test example. Assume k = [a b c]:
> y = [-2; -1; 0; 2]; % typically called 'xdata' in lsqcurvefit help pages
> x = [1; 2; 3; 4]; % typically called 'ydata'
> % define model
> fun = @(k,y) k(1) + k(2)*y + k(3)*y.^2
> % Fit the model by minimizing the sum of squared residuals
> [o1,o2,o3] = lsqcurvefit(fun,[1 2 3], y, x)
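> % o1 = fitted parameter vector k, o2 = resnorm (the sum of squared residuals), o3 = residual vector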
>
> Look at the example at the bottom of this page:
> http://www.mathworks.com/help/optim/ug/lsqcurvefit.html