Hello. Suppose we simulate an ARMA(1,2) in SAS:
Algorithm:
data temp.arma1_2 (keep= u date z);
   /* state variables: um1 = u[t-1], z1 = z[t-1], z2 = z[t-2] */
   um1 = 0;
   z2 = 0;
   z1 = 0;
   do t = -50 to 200;                    /* t <= 0 is burn-in */
      z = 2*rannor(1436);               /* N(0, 4) innovations */
      u = .9*um1 + z - .1*z1 - .9*z2;   /* ARMA(1,2) recursion */
      if t > 0 then do;
         date = intnx('month', '31dec1960'd, t);
         format date monyy.;
         output;                        /* keep only the 200 post-burn-in points */
      end;
      z2 = z1;
      z1 = z;
      um1 = u;
   end;
run;
proc arima data=temp.arma1_2;
   i var=u minic perror=(1:11);   /* IDENTIFY: MINIC table, error-series AR order chosen from 1..11 */
run;
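For readers who want to poke at this outside SAS, here is a rough Python/numpy sketch of the same experiment, together with a hand-rolled MINIC-style table (the Hannan-Rissanen two-step idea: residuals from a long autoregression, then OLS for each (p, q) with a BIC penalty). The random seed and the exact BIC convention are my assumptions, so the numbers will not match SAS's table exactly.

```python
import numpy as np

rng = np.random.default_rng(1436)   # seed chosen for illustration, not SAS's rannor stream
n, burn = 200, 50

# same recursion as the DATA step: u_t = .9 u_{t-1} + z_t - .1 z_{t-1} - .9 z_{t-2}
z = 2 * rng.standard_normal(n + burn)
u = np.zeros(n + burn)
for t in range(2, n + burn):
    u[t] = 0.9 * u[t - 1] + z[t] - 0.1 * z[t - 1] - 0.9 * z[t - 2]
u = u[burn:]                        # drop the burn-in, keep 200 points

def lagmat(x, k):
    """Columns x[t-1], ..., x[t-k], zero-padded at the start."""
    return np.column_stack([np.r_[np.zeros(j), x[:-j]] for j in range(1, k + 1)])

# step 1: long autoregression (AR(9), as in the SAS output) to proxy the innovations
p_long = 9
Xl, yl = lagmat(u, p_long)[p_long:], u[p_long:]
phi, *_ = np.linalg.lstsq(Xl, yl, rcond=None)
resid = np.zeros_like(u)
resid[p_long:] = yl - Xl @ phi

# step 2: BIC(p,q) = ln(SSE/m) + 2(p+q)ln(m)/m from OLS on own lags + lagged residuals
start = p_long + 5                  # skip rows contaminated by startup zeros
def bic(p, q):
    y = u[start:]
    cols = ([lagmat(u, p)[start:]] if p else []) + \
           ([lagmat(resid, q)[start:]] if q else [])
    if cols:
        X = np.column_stack(cols)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = np.sum((y - X @ beta) ** 2)
    else:
        sse = np.sum(y ** 2)
    m = len(y)
    return np.log(sse / m) + 2 * (p + q) * np.log(m) / m

table = {(p, q): bic(p, q) for p in range(6) for q in range(6)}
print(min(table, key=table.get))    # (p, q) pair with the smallest criterion
```

This is only a sketch of what MINIC computes, not a reimplementation of PROC ARIMA; with a different seed or BIC convention the selected pair can differ.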
SAS responds as expected:
The ARIMA Procedure

Minimum Information Criterion

Lags    MA 0      MA 1      MA 2      MA 3      MA 4      MA 5
AR 0  1.918625  1.567837  1.577507  1.587355  1.574758  1.591561
AR 1  1.706186  1.594322  1.54492   1.562619  1.575613  1.599728
AR 2  1.565707  1.575859  1.55939   1.577493  1.597215  1.620847
AR 3  1.579385  1.600155  1.578792  1.601501  1.619983  1.643739
AR 4  1.570294  1.595204  1.604108  1.625797  1.646315  1.669871
AR 5  1.593425  1.619477  1.627451  1.643971  1.653991  1.678334

Error series model: AR(9)
Minimum Table Value: BIC(1,2) = 1.54492
But if we flip one sign:

u = .9*um1 + z - .1*z1 + .9*z2;

then the identification is no longer anywhere near as clean. Why?
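Not an answer, but a diagnostic worth running before suspecting PROC ARIMA: compare the roots of the two MA polynomials. With the original signs, theta(B) = 1 - .1B - .9B^2 factors as (1 - B)(1 + .9B), so one MA root sits exactly on the unit circle; after the sign flip, theta(B) = 1 - .1B + .9B^2 has a complex pair of modulus sqrt(1/0.9) ~ 1.054, just outside it. A quick numpy check (interpretation left open):

```python
import numpy as np

# np.roots takes coefficients from highest degree down, as polynomials in B
orig    = np.roots([-0.9, -0.1, 1.0])   # 1 - 0.1 B - 0.9 B^2 (original signs)
flipped = np.roots([ 0.9, -0.1, 1.0])   # 1 - 0.1 B + 0.9 B^2 (sign flipped)

print(np.abs(orig))     # moduli 1 and 10/9: one root exactly on the unit circle
print(np.abs(flipped))  # complex pair, both moduli sqrt(1/0.9), barely outside
```

Both simulated processes therefore have MA roots on or very near the unit circle, which is exactly the regime where order-identification tools tend to behave erratically; whether that fully explains the difference between the two runs is the question.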