Research et al.

This page lists my research and publications (papers, journal articles, books, etc.) in reverse chronological order. For questions or comments, please contact me.


results.

Draft. Sparse Integrated ARCH + Slides

Keeping Up with the Martingales

Adjunction: Tutoring Programs and Academic Performance (Yale Journal of Economics)


ideas.

New Data Compression Techniques for Linear Regression

Matrix Completion and Low-Rank Models


research interests.

  1. Non-Parametric & Robust Statistics. There’s so much more to life than linearity and normality. I’m interested in statistical estimators and loss functions that can handle outliers, model misspecification (or the absence of assumptions altogether), and non-linearities less tame than the polynomial, exponential, or logarithmic (see the robust-loss sketch after this list).
  2. Convexity. In my mind, convexity is a natural generalization of linearity, rich with geometric intuition, computational guarantees, and practical relevance. While still early on this path, I’m eager to re-examine the foundations of machine learning and modern computational engineering through the work of Fenchel, Rockafellar, and other theoreticians (the conjugate duality recalled after this list is the central object).
  3. Sparsity, Compression, & Representation. I am a strong believer that most data is low-rank or sparse in some basis. While most data scientists reach for the eigen-basis (a.k.a. PCA) or the Fourier basis, I’m interested in exploring more exotic dictionaries for sparse representation, especially for time-series data. This also involves finding more effective means of feature selection than the ℓ₁-penalty heuristic (a small compression sketch follows this list).
  4. Deep Learning & Expressivity. As an interpolation between the first and third items above: I find the expressivity of deep networks exciting and useful, but our lack of understanding of it deeply dissatisfying. I wonder whether other parametrizations allow similar expressivity and representational power with better computational guarantees. Especially under non-convexity, it is often hard to tell which component of the error is statistical and which is computational (the decomposition after this list spells out the distinction). The difference matters.
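To make the first item concrete, here is a minimal sketch of a robust loss: the Huber loss, quadratic near zero and linear in the tails, so a single gross outlier cannot dominate the fit. The threshold delta and the toy residuals are illustrative choices of mine, not taken from any of the papers above.

```python
import numpy as np

def huber_loss(residuals, delta=1.0):
    """Huber loss: quadratic for |r| <= delta, linear beyond.

    Squared error lets one outlier dominate the objective; absolute
    error is non-smooth at zero. Huber interpolates between the two.
    """
    r = np.abs(residuals)
    quadratic = 0.5 * r**2
    linear = delta * (r - 0.5 * delta)
    return np.where(r <= delta, quadratic, linear)

# Toy comparison: one gross outlier among small residuals.
residuals = np.array([0.1, -0.2, 0.05, 10.0])
print(huber_loss(residuals).sum())   # outlier enters roughly linearly
print((0.5 * residuals**2).sum())    # squared loss is dominated by it
```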
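On the second item, the object I keep returning to is the convex (Fenchel) conjugate; the statement below is the standard textbook definition, not anything of mine:

```latex
f^{*}(y) = \sup_{x}\,\bigl(\langle x, y \rangle - f(x)\bigr),
\qquad
f(x) + f^{*}(y) \ge \langle x, y \rangle \quad \text{(Fenchel--Young inequality)}.
```

For instance, f(x) = ½‖x‖² is its own conjugate, and the conjugate of the ℓ₁ norm is the indicator of the ℓ∞ unit ball, which is why ℓ₁ penalties have such clean dual descriptions.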
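For the third item, a minimal sketch of compression in a fixed dictionary: transform a smooth signal into the DCT basis, keep only the k largest-magnitude coefficients, and reconstruct. The signal and the choice k = 10 are illustrative assumptions.

```python
import numpy as np
from scipy.fft import dct, idct

# A smooth signal is approximately sparse in the DCT basis.
t = np.linspace(0.0, 1.0, 256)
signal = np.sin(2 * np.pi * 3 * t) + 0.5 * np.cos(2 * np.pi * 7 * t)

coeffs = dct(signal, norm="ortho")

# Keep the k largest-magnitude coefficients; zero out the rest.
k = 10
drop = np.argsort(np.abs(coeffs))[:-k]   # indices of all but the top k
sparse_coeffs = coeffs.copy()
sparse_coeffs[drop] = 0.0

reconstruction = idct(sparse_coeffs, norm="ortho")
rel_error = np.linalg.norm(signal - reconstruction) / np.linalg.norm(signal)
print(f"kept {k}/{signal.size} coefficients, relative error {rel_error:.2e}")
```

Hard-thresholding in an orthonormal basis, as above, is the simplest instance; the exotic dictionaries I have in mind would replace the DCT with a learned or redundant frame.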
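Finally, on the fourth item: the split between statistics and computation that I mean is the standard excess-risk decomposition (in the spirit of Bottou and Bousquet), quoted here rather than derived:

```latex
R(\hat{f}) - R(f^{\ast})
= \underbrace{R(f_{\mathcal{F}}) - R(f^{\ast})}_{\text{approximation}}
+ \underbrace{R(\hat{f}_{n}) - R(f_{\mathcal{F}})}_{\text{estimation}}
+ \underbrace{R(\hat{f}) - R(\hat{f}_{n})}_{\text{optimization}}
```

where f* is the Bayes-optimal predictor, f_F the best predictor in the model class, f̂_n the empirical risk minimizer, and f̂ the model the optimizer actually returns. Expressivity shrinks the first term, data the second, and non-convex training muddles the third; conflating them is exactly the difficulty.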