Math 372, Practical Machine Learning, Spring 2026

Homework

HW 1

  1. Let T: V → W be a linear map between two finite-dimensional vector spaces. Show that dim(ker(T)) + dim(Im(T)) = dim(V).

  2. Consider the set of all polynomials in one variable with real coefficients of degree less than or equal to three.

    (a) Show that this set forms a vector space of dimension four.
    (b) Find a basis for this vector space.
    (c) Show that differentiating a polynomial is a linear transformation.
    (d) Given the basis chosen in part (b), write down the matrix representing the derivative.
  3. Denote by C^∞(ℝ) the vector space of all functions f: ℝ → ℝ which are infinitely differentiable. This space is called the space of smooth functions.

    (a) Show that C^∞(ℝ) is infinite dimensional.
    (b) Show that differentiation is a linear transformation d/dx: C^∞(ℝ) → C^∞(ℝ).
    (c) For a real number λ, find an eigenvector of d/dx with eigenvalue λ.
  4. Show that the set M₂ of 2 × 2 real matrices is a vector space.

    (a) Write down an explicit basis.
    (b) Show that the set of symmetric matrices is a subspace and find its dimension.
    (c) Show that the map A ↦ A − Aᵀ is linear, and write down the matrix for this map using your basis from (a).
    (d) What is the kernel of this map?
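For the derivative-matrix part of problem 2, one way to sanity-check a proposed answer is to apply the matrix to coefficient vectors. The sketch below assumes the monomial basis {1, x, x², x³} and coefficient vectors (a₀, a₁, a₂, a₃); a different choice of basis gives a different matrix.

```python
import numpy as np

# Candidate matrix for d/dx in the monomial basis {1, x, x^2, x^3}:
# it sends coefficients (a0, a1, a2, a3) to (a1, 2*a2, 3*a3, 0).
D = np.array([[0, 1, 0, 0],
              [0, 0, 2, 0],
              [0, 0, 0, 3],
              [0, 0, 0, 0]], dtype=float)

# p(x) = 2 + 3x + x^3, so p'(x) = 3 + 3x^2.
p = np.array([2, 3, 0, 1], dtype=float)
print(D @ p)  # coefficients of 3 + 3x^2: [3. 0. 3. 0.]
```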

HW 2

  1. Send me a link to your notebook for the real estate data competition; include some comments on your choice of model and data pipeline…

HW 3

  1. Make a synthetic data set consisting of a cluster of points with one label, surrounded by a ring of points with a different label.

  2. Use an SVM with a linear kernel to show that it cannot separate these points.

  3. Try the other kernels; which ones work?

  4. Can you make a data set which is clearly clustered, but on which none of the standard kernels gives good results?
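A minimal starting point for problems 1–3, assuming scikit-learn is available; `make_circles` is one quick way to generate an inner cluster surrounded by a ring, and the loop compares training accuracy across kernels.

```python
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Inner cluster (one label) surrounded by a ring (the other label).
X, y = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)

# Fit an SVM with each kernel and record training accuracy.
scores = {}
for kernel in ["linear", "poly", "rbf"]:
    clf = SVC(kernel=kernel).fit(X, y)
    scores[kernel] = clf.score(X, y)
    print(kernel, scores[kernel])
```

The linear kernel should hover near chance on this data, while the RBF kernel separates the ring easily.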

HW 4

  1. Analyse the "Petals to the Metal - Flower Classification" dataset from Kaggle using one of the neural net models, for example the Swin Transformer.

HW 5

  1. Use llama.cpp to find the embeddings of some common words.

    1. Do words with similar meanings end up close together?

    2. Are relations between words similar? E.g. is the vector from Germany to Berlin close to the vector from France to Paris?
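Once you have embedding vectors (e.g. dumped from llama.cpp's embedding example), the analogy check reduces to comparing offset vectors with cosine similarity. The sketch below uses made-up 3-dimensional vectors purely for illustration; real embeddings are hundreds or thousands of dimensions.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy vectors standing in for real embeddings (illustration only).
emb = {
    "germany": np.array([1.0, 0.2, 0.0]),
    "berlin":  np.array([1.0, 0.9, 0.1]),
    "france":  np.array([0.1, 0.2, 1.0]),
    "paris":   np.array([0.1, 0.9, 1.1]),
}

# Compare the offset vectors Germany→Berlin and France→Paris.
d1 = emb["berlin"] - emb["germany"]
d2 = emb["paris"] - emb["france"]
c = cosine(d1, d2)
print(c)
```

A cosine near 1 would suggest the two "capital of" offsets point in similar directions; with real embeddings the result is rarely this clean.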