============================================================================================

Time/Room: 11am at APM 7321

Instructor: Dimitris Politis

Office hours: Fri 4:30-6:30pm at SDSC 217E or by appointment (email: dpolitis@ucsd.edu) --- NO OFFICE HOURS ON FRIDAY JAN. 31

TA: Yiren Wang (email: yiw518@ucsd.edu);

Office hours: Tuesday 2-4pm at APM 1131

Weak and strict stationarity of time series, i.e. stochastic processes in discrete time.
Breakdowns of stationarity and remedies (differencing, trend estimation, etc.).
Optimal (from the point of view of Mean Squared Error) one-step-ahead prediction
and interpolation; connection with Hilbert space methods, e.g. orthogonal projection.
Autoregressive (AR), Moving Average (MA), and Autoregressive-Moving Average (ARMA)
models for stationary time series; causality, invertibility, and spectral density.
Maximum Entropy models for time series, and Kolmogorov's formula for prediction
error.
Estimation of the ARMA parameters given data; determination of the ARMA model order
using criteria such as the AIC and BIC.
Nonparametric estimation of the mean, the autocovariance function, and
the spectral density in the absence of model assumptions.
Confidence intervals for the estimated quantities via asymptotic normality and/or
bootstrap methods.
Prerequisite: a basic statistics and probability course or instructor consent.
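As a small taste of the AR-model and autocovariance material, here is an illustrative sketch (not course code; the coefficient, seed, and sample size are arbitrary choices) simulating a causal AR(1) series and comparing its sample autocovariance to the theoretical one:

```python
import numpy as np

# Illustrative sketch (not course code): simulate a causal AR(1) process
#   X_t = phi * X_{t-1} + Z_t,  Z_t ~ iid N(0,1),
# and compare the sample autocovariance to the theoretical
#   gamma(k) = phi^k / (1 - phi^2).
rng = np.random.default_rng(0)
phi, n = 0.6, 100_000

z = rng.standard_normal(n)
x = np.empty(n)
x[0] = z[0] / np.sqrt(1 - phi**2)   # start in the stationary distribution
for t in range(1, n):
    x[t] = phi * x[t - 1] + z[t]

def sample_acvf(x, k):
    """Biased sample autocovariance at lag k (divides by n)."""
    xc = x - x.mean()
    return float(xc[: len(xc) - k] @ xc[k:]) / len(xc)

for k in range(4):
    print(k, round(sample_acvf(x, k), 3), round(phi**k / (1 - phi**2), 3))
```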

Recommended textbook:
*Time Series: A First Course with Bootstrap Starter*,
by T.S. McElroy and D.N. Politis,
Chapman and Hall/CRC Press, 2020.

GRADES:

HW = 40%, Midterm (in-class) = 20%, Final (take-home) = 40%.

Download the 287A FINAL EXAM. New rule 3/12/20: DO NOT SUBMIT HARD COPY!!! Please submit your final exam by email to dpolitis@ucsd.edu with cc to yiw518@ucsd.edu. DUE DATE: Wed Mar 18 by 2pm.

------------------------------------------------------------------------------------------------------------------------------

Time/Room: 11am at APM 7321

Instructor: Dimitris Politis

Office hours: Fri 4:30-6:30pm at SDSC 217E or by appointment (email: dpolitis@ucsd.edu)

TA: Yiren Wang (email: yiw518@ucsd.edu); office hours: Thu 3-5pm at APM 1131

The theory and practice of linear regression
and linear models will be discussed. Least
squares will be defined by means of projections
in n-dimensional Euclidean space. Choosing the
regression model (its dimension, etc.) will also
be addressed.
Prerequisite: a basic statistics course and linear algebra, or instructor consent.
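A minimal numerical sketch of the projection viewpoint (illustrative, not course code; the design matrix and coefficients below are arbitrary): the least-squares fitted values are the orthogonal projection of y onto the column space of the design matrix.

```python
import numpy as np

# Illustrative sketch (not course code): least squares as orthogonal
# projection in n-dimensional Euclidean space.  The fitted values are
# P @ y, where P = X (X'X)^{-1} X' projects onto the column space of X.
rng = np.random.default_rng(1)
n = 50
X = np.column_stack([np.ones(n), rng.standard_normal((n, 2))])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.standard_normal(n)

P = X @ np.linalg.solve(X.T @ X, X.T)         # the "hat" (projection) matrix
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)  # least-squares coefficients

assert np.allclose(P @ P, P)                  # projections are idempotent
assert np.allclose(X.T @ (y - P @ y), 0)      # residuals orthogonal to col(X)
print(beta_hat)
```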

Draper and Smith, Applied regression analysis, 5th ed., Wiley (recommended)

CHAPTER 2. Set 2b: ex. 6. Set 2c: ex. 2. Set 2d: ex. 4. Misc. Set: ex. 1, 13, 15, 17

PLUS: Work out (with proof) the conditional distribution of Y1 given Y2=y2, where Y1 and Y2 are two random vectors such that the concatenated vector Y=[Y1' Y2']' is multivariate normal with some mean \theta and invertible covariance matrix \Sigma.
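As a sanity check (not a substitute for the requested proof), the standard conditional-normal formulas can be verified numerically by conditioning a large Gaussian sample on Y2 falling near y2; the mean vector, covariance matrix, and conditioning value below are arbitrary illustrative choices.

```python
import numpy as np

# Sanity-check sketch: verify numerically that
#   E[Y1 | Y2 = y2]   = theta1 + S12 S22^{-1} (y2 - theta2)
#   Cov[Y1 | Y2 = y2] = S11 - S12 S22^{-1} S21
# by conditioning a large Gaussian sample on Y2 being near y2.
rng = np.random.default_rng(2)
theta = np.array([1.0, -1.0, 0.5])      # Y1 = first two coords, Y2 = last
Sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.0, 0.4],
                  [0.3, 0.4, 1.5]])
Y = rng.multivariate_normal(theta, Sigma, size=1_000_000)

y2 = 0.8
near = np.abs(Y[:, 2] - y2) < 0.02      # crude conditioning on Y2 ~ y2
emp_mean = Y[near, :2].mean(axis=0)
emp_cov = np.cov(Y[near, :2].T)

S12 = Sigma[:2, 2]
cond_mean = theta[:2] + S12 * (y2 - theta[2]) / Sigma[2, 2]
cond_cov = Sigma[:2, :2] - np.outer(S12, S12) / Sigma[2, 2]
print(emp_mean, cond_mean)
print(emp_cov, cond_cov, sep="\n")
```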

During the week of Nov 18, Dr. Politis's office hours will take place MW 12-12:50 and F 12-12:20 at APM 5701. The Friday office hour is cancelled.

No class on Wednesday 27 November

DO NOT COLLABORATE WITH ANYBODY ON THE FINAL!

============================================================================================

Time: 11am-12pm

Room: APM 5829

Office hours: TBA or by appointment (email: dpolitis@ucsd.edu)

Departures from underlying assumptions in regression. Transformations
and Generalized Linear Models. Model selection.
Nonlinear regression. Introduction to nonparametric regression.
Prerequisite: 282A
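To give a flavor of the nonlinear-regression portion, here is a minimal Gauss-Newton sketch; the model, data, and starting values are invented for this example and are not taken from the course.

```python
import numpy as np

# Illustrative Gauss-Newton iteration for the nonlinear regression model
#   y = b1 * (1 - exp(-b2 * x)) + error.
rng = np.random.default_rng(3)
x = np.linspace(0.1, 5, 60)
b_true = np.array([2.0, 1.5])
y = b_true[0] * (1 - np.exp(-b_true[1] * x)) + 0.05 * rng.standard_normal(60)

def model(b):
    return b[0] * (1 - np.exp(-b[1] * x))

def jacobian(b):
    e = np.exp(-b[1] * x)
    return np.column_stack([1 - e, b[0] * x * e])   # d/db1, d/db2

b = np.array([1.0, 1.0])                 # starting values
for _ in range(20):
    J, r = jacobian(b), y - model(b)
    b = b + np.linalg.lstsq(J, r, rcond=None)[0]   # Gauss-Newton step
print(b)                                 # should be close to b_true
```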

Check out the 2010 R handout, and download R to your computer.

Nonlinear Regression Analysis and Its Applications, by Bates and Watts, Wiley (1988)

Nonlinear Regression, by Seber and Wild, Wiley (2003)

Nonparametric Smoothing and Lack-of-Fit Tests, by J. Hart, Springer (1997)

Consider the model: Y=f(X)+error where f is a polynomial of finite (but unknown) degree.

The data (with n=21) are below:

Y=(-5.07 -3.63 -1.92 -0.72 -0.79 -2.00 0.69 0.63 -0.92 1.05 -1.33 0.04 -0.14 -2.10 -0.43 -0.71 1.10 0.75 3.36 4.17 4.57)

and X=(-2.0 -1.8 -1.6 -1.4 -1.2 -1.0 -0.8 -0.6 -0.4 -0.2 0.0 0.2 0.4 0.6 0.8 1.0 1.2 1.4 1.6 1.8 2.0)

a) Use the four methods (residual diagnostic plots searching for patterns, forward F-tests, backward F-tests, and Mallows' Cp) to pick the optimal order of the polynomial to be fitted. Do all four give the same answer? Explain.

b) With the optimal order obtained from Mallows' Cp, fit the model and get parameter estimates (including the error variance). If in your final model you see some parameter estimates that are close to zero, perform a test to see if they are significantly different from zero or not (in which case they should not be included in the model).

c) In your final model, do diagnostic plots to confirm your assumptions (what are they?) on the errors.

d) Apply ridge regression to the problem at hand; produce figures to show how the ridge regression parameter estimates change as the constraint on the L2 norm becomes bigger.

e) Repeat part (d) with the lasso instead of ridge regression, i.e., an L1-norm constraint instead of L2, and compare with the figures from part (d). Do the lasso figures point to the same model that you obtained in part (a)?
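A possible starting point for part (a) is the Mallows' Cp computation sketched below, using the data given above; the maximum degree considered and the variance estimate from the largest model are choices of this example, not prescribed by the problem.

```python
import numpy as np

# Sketch: Mallows' Cp for polynomial fits of increasing degree, with
#   Cp = RSS_p / s^2 - n + 2p,
# where p is the number of parameters and s^2 is estimated from the
# largest model considered (here degree 8, an arbitrary cap).
y = np.array([-5.07, -3.63, -1.92, -0.72, -0.79, -2.00, 0.69, 0.63, -0.92,
              1.05, -1.33, 0.04, -0.14, -2.10, -0.43, -0.71, 1.10, 0.75,
              3.36, 4.17, 4.57])
x = np.arange(-2.0, 2.01, 0.2)
n = len(y)

def rss(deg):
    coef = np.polyfit(x, y, deg)
    return float(np.sum((y - np.polyval(coef, x)) ** 2))

max_deg = 8
s2 = rss(max_deg) / (n - (max_deg + 1))   # variance estimate from big model
for deg in range(max_deg + 1):
    p = deg + 1
    print(deg, round(rss(deg) / s2 - n + 2 * p, 2))
```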

============================================================================================

Instructor: Dimitris Politis, dpolitis@ucsd.edu ; Office hours: MW 1:30-3pm at APM 5747

TA: Michael Scullard, mscullar@math.ucsd.edu ; Office hour: F 1-2pm at APM 6333

=================================================================================

Time/Room: 2pm at APM 2402

Office hours: MWF 10-10:50am or by appointment (email: dpolitis@ucsd.edu). Office location: APM 5701

TA: Ashley Chen (email: jic102@ucsd.edu); office hour: Thu 4-5pm at SDSC 294E

Weak and strict stationarity of time series, i.e. stochastic processes in discrete time.
Breakdowns of stationarity and remedies (differencing, trend estimation, etc.).
Optimal (from the point of view of Mean Squared Error) one-step-ahead prediction
and interpolation; connection with Hilbert space methods, e.g. orthogonal projection.
Autoregressive (AR), Moving Average (MA), and Autoregressive-Moving Average (ARMA)
models for stationary time series; causality, invertibility, and spectral density.
Maximum Entropy models for time series, and Kolmogorov's formula for prediction error.
Estimation of the ARMA parameters given data; determination of the ARMA model order
using criteria such as the AIC and BIC.
Nonparametric estimation of the mean, the autocovariance function, and
the spectral density in the absence of model assumptions.
Confidence intervals for the estimated quantities via asymptotic normality and/or
bootstrap methods.
Prerequisite: a basic statistics and probability course or instructor consent.

Hamilton: Time Series Analysis
and Priestley: Spectral Analysis and Time Series (recommended)

See also the Handout on Inverse Covariance and eigenvalues of Toeplitz matrices.
For the handout, the convention f(\omega) = \sum_k \gamma(k) e^{ik\omega}
is used (without the 2\pi factor).
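A small numerical illustration of the handout's theme (the process and matrix dimension are arbitrary choices for this sketch): for an MA(1) process, the eigenvalues of the n x n Toeplitz autocovariance matrix lie between the minimum and maximum of f, in the convention above.

```python
import numpy as np

# Illustrative sketch: for an MA(1) process with gamma(0) = 1 + theta^2,
# gamma(+-1) = theta, and gamma(k) = 0 otherwise, the handout's convention
# gives  f(w) = sum_k gamma(k) e^{ikw} = 1 + theta^2 + 2 theta cos(w),
# and the eigenvalues of the n x n Toeplitz autocovariance matrix fall
# between min f and max f.
theta, n = 0.5, 200
gamma = np.zeros(n)
gamma[0], gamma[1] = 1 + theta**2, theta

idx = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
Sigma = gamma[idx]                      # Toeplitz matrix from gamma(|i-j|)
eig = np.linalg.eigvalsh(Sigma)

w = np.linspace(0, np.pi, 1000)
f = 1 + theta**2 + 2 * theta * np.cos(w)
print(eig.min(), f.min(), eig.max(), f.max())
```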

GRADES:

HW = 40%, Midterm (in-class) = 20%, Final (take-home) = 40%.

MIDTERM WILL BE IN-CLASS, WED FEB 21. Closed book; one sheet of notes (2-sided) allowed.

NOTE: partial solutions for 287A HW1 and partial solutions for 287A HW2 --- DO NOT CIRCULATE!!

Let X_t=Y_t+W_t where the Y series is independent of the W series.
Assume Y_t satisfies an AR(1) model (with respect to some white noise),
and W_t satisfies a different AR(1) model (with respect to some other white noise).
Show that X_t is not AR(1) but is ARMA(p,q), and identify p and q.
[Hint: show that the spectral density of X_t is of the form of an
ARMA(p,q) spectral density.]
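A numerical check of the hint (not a proof; the AR coefficients and noise variances below are arbitrary illustrative choices): since Y and W are independent, the spectral density of X is f_Y + f_W, and multiplying through by both AR(1) polynomials leaves a trigonometric polynomial of degree 1 in cos(w), i.e. an MA(1)-type numerator.

```python
import numpy as np

# Check that f_X(w) * |1 - a e^{iw}|^2 * |1 - b e^{iw}|^2 is exactly of the
# form c0 + 2 c1 cos(w): an MA(1) numerator over two AR(1) factors, which
# is an ARMA(2,1) spectral density.
a, b = 0.7, -0.4            # AR(1) coefficients of Y and W (illustrative)
s1, s2 = 1.0, 2.0           # white-noise variances (illustrative)
w = np.linspace(0, np.pi, 500)

A = np.abs(1 - a * np.exp(1j * w)) ** 2     # |AR polynomial of Y|^2
B = np.abs(1 - b * np.exp(1j * w)) ** 2
f_X = s1 / (2 * np.pi * A) + s2 / (2 * np.pi * B)   # f_Y + f_W

num = f_X * A * B * 2 * np.pi               # should be c0 + 2 c1 cos(w)
design = np.column_stack([np.ones_like(w), 2 * np.cos(w)])
coef, *_ = np.linalg.lstsq(design, num, rcond=None)
resid = num - design @ coef
print(coef, np.abs(resid).max())            # residual ~ 0 => MA(1) numerator
```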

Download the FINAL EXAM.

GARCH Models: Structure, Statistical Inference and Financial Applications by Christian Francq and Jean-Michel Zakoian (2010), John Wiley.

THIS IS THE NEW HOMEWORK of 2018!

-------------------------------------------------------------------------