What is Singular Value Decomposition (SVD) and how does it work?

Singular value decomposition (SVD) is one of the jewels of linear algebra. In modern times it has found applications in machine learning and artificial intelligence. Its most important feature is dimension reduction: it lets us make predictions on very large data sets using only a small subspace of variables. Let’s try to understand how it works through a real-life example.


Netflix has many subscribers and a large collection of movies. When we watch a movie, we often see similar movies recommended the next time we go to Netflix, and we wonder: how did they figure out the kind of movies I like? The answer is SVD.

As an example, suppose Netflix has 5 movies and multiple subscribers. For every user, Netflix records a rating from 0 to 5, with 0 meaning the user did not watch a particular movie and 5 meaning the user has watched it multiple times. A sample data set might look like the matrix sketched below.
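The original sample table appears as an image in the post; as a stand-in, here is a small hypothetical ratings matrix of the kind described (7 users × 5 movies, the first three columns being action movies and the last two romantic movies), written in Python/NumPy. The numbers are purely illustrative and are not the article's original data.

```python
import numpy as np

# Hypothetical ratings matrix A: rows = users, columns = movies.
# Columns 0-2 are action movies, columns 3-4 are romantic movies.
# 0 means "did not watch", 5 means "watched many times".
A = np.array([
    [1, 1, 1, 0, 0],
    [3, 3, 3, 0, 0],
    [4, 4, 4, 0, 0],
    [5, 5, 5, 0, 0],
    [0, 2, 0, 4, 4],
    [0, 0, 0, 5, 5],
    [0, 1, 0, 2, 2],
])
print(A.shape)  # (7, 5): 7 users, 5 movies
```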



Each row in this matrix A corresponds to a user and each column corresponds to a movie. As it turns out, any real matrix (one with no complex entries) can always be expressed as the product of 3 matrices:
A = U Σ Vᵀ, where
  • U, Σ, V are essentially unique (here Vᵀ is the transpose of V – switching rows and columns)
  • U and V have orthonormal columns (the columns are orthogonal unit vectors, so their pairwise dot products are zero): UᵀU = I and VᵀV = I, where I is the identity matrix
  • Σ is a diagonal matrix (all off-diagonal entries are 0) whose entries, the singular values, are non-negative and sorted in decreasing order: σ1 ≥ σ2 ≥ … ≥ 0
A quick numerical check of these properties is sketched below.
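As a minimal sketch, assuming NumPy and a ratings matrix like the hypothetical one above, the factorization and its properties can be verified directly:

```python
import numpy as np

# Hypothetical ratings matrix from the earlier snippet
A = np.array([[1,1,1,0,0],[3,3,3,0,0],[4,4,4,0,0],[5,5,5,0,0],
              [0,2,0,4,4],[0,0,0,5,5],[0,1,0,2,2]])

# Reduced ("economy") SVD: A = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Orthonormal columns: U^T U = I and V^T V = I
print(np.allclose(U.T @ U, np.eye(U.shape[1])))    # True
print(np.allclose(Vt @ Vt.T, np.eye(Vt.shape[0]))) # True

# Singular values are non-negative and sorted in decreasing order
print(np.all(s >= 0), np.all(np.diff(s) <= 0))     # True True

# The product reconstructs A (up to floating-point error)
print(np.allclose(U @ np.diag(s) @ Vt, A))         # True
```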
Therefore, our input matrix of subscribers and movies can be written as the product A = U Σ Vᵀ.

Also, the movies can be grouped into “concepts”: action movies or romantic movies. The first 3 movies are action movies and the last 2 are romantic movies.

  • A: input data matrix, m × n (e.g., m users, n movies)
  • U: left singular vectors, m × r (m users, r concepts)
  • Σ: singular values, r × r diagonal matrix (strength of each ‘concept’); r is the rank of A, i.e. how many linearly independent columns or rows the matrix has
  • V: right singular vectors, n × r (n movies, r concepts)
These shapes are illustrated in the short sketch below.
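As a small illustration, again assuming NumPy and the hypothetical matrix A from earlier, the reduced SVD produces factors with exactly these shapes:

```python
import numpy as np

# Hypothetical ratings matrix from the earlier snippet
A = np.array([[1,1,1,0,0],[3,3,3,0,0],[4,4,4,0,0],[5,5,5,0,0],
              [0,2,0,4,4],[0,0,0,5,5],[0,1,0,2,2]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = np.linalg.matrix_rank(A)   # rank of A = number of concepts

print(A.shape)          # (m, n) -> (7, 5): 7 users, 5 movies
print(U[:, :r].shape)   # (m, r): users x concepts
print(s[:r].shape)      # (r,):   one strength per concept
print(Vt[:r, :].shape)  # (r, n): concepts x movies (this is V^T, so V is n x r)
```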
Equivalently, it is always possible to decompose a real matrix A into a sum of rank-one pieces:

A = σ1 u1 v1ᵀ + σ2 u2 v2ᵀ + … + σr ur vrᵀ

where

 σi … scalar (the i-th singular value)
ui … vector (the i-th column of U)
vi … vector (the i-th column of V)
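A minimal sketch of this rank-one expansion, under the same NumPy assumptions as before:

```python
import numpy as np

# Hypothetical ratings matrix from the earlier snippet
A = np.array([[1,1,1,0,0],[3,3,3,0,0],[4,4,4,0,0],[5,5,5,0,0],
              [0,2,0,4,4],[0,0,0,5,5],[0,1,0,2,2]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Rebuild A one rank-one term at a time: sigma_i * u_i * v_i^T
A_rebuilt = np.zeros_like(A, dtype=float)
for i in range(len(s)):
    A_rebuilt += s[i] * np.outer(U[:, i], Vt[i, :])

print(np.allclose(A_rebuilt, A))  # True: the sum reproduces A
```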

We can now look at the SVD of our input matrix, computed with the software MATLAB.


The first column of U corresponds to the action-movie concept and the second column to the romantic-movie concept. As can be seen, the first 4 users (large entries in the first column) like action movies, while the last 3 users (large entries in the second column) like romantic movies.

The second matrix Σ tells us the strength of each concept – action movies and romantic movies. Clearly, users are fonder of action movies, since their strength (12.4) is greater than that of romantic movies (9.5).

The third matrix Vᵀ maps movies to concepts: the first 3 movies belong to the action-movie concept and the last two belong to the romantic-movie concept.
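For readers without MATLAB, a sketch of the same computation in NumPy, using the hypothetical ratings matrix from earlier (the specific strengths quoted in the article, 12.4 and 9.5, come from its original data set, so the values here may differ):

```python
import numpy as np

# Hypothetical ratings matrix from the earlier snippet
A = np.array([[1,1,1,0,0],[3,3,3,0,0],[4,4,4,0,0],[5,5,5,0,0],
              [0,2,0,4,4],[0,0,0,5,5],[0,1,0,2,2]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)

print(np.round(U[:, :3], 2))   # users -> concepts (columns: action, romance, noise)
print(np.round(s[:3], 1))      # concept strengths, largest first
print(np.round(Vt[:3, :], 2))  # concepts -> movies (rows: action, romance, noise)
```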

Typically, both the number of subscribers and the number of movies are far larger than in our example, and through a prudent choice of concepts we can quickly determine each subscriber’s taste.

It is to be noted that the data contains some “noise”: the relatively weak strength of 1.3 in the Σ matrix tells us the third concept is unimportant, so we ignore the last column of U and the last row of Vᵀ.
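A minimal sketch of this truncation step, under the same NumPy assumptions as before (keep the two strongest concepts and drop the weak one):

```python
import numpy as np

# Hypothetical ratings matrix from the earlier snippet
A = np.array([[1,1,1,0,0],[3,3,3,0,0],[4,4,4,0,0],[5,5,5,0,0],
              [0,2,0,4,4],[0,0,0,5,5],[0,1,0,2,2]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 2  # keep the two strongest concepts, drop the weak "noise" one
A_approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# The rank-2 approximation stays close to the original ratings
print(np.round(A_approx, 1))
print(np.linalg.norm(A - A_approx))  # small reconstruction error
```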

It is quite amazing to observe all the useful information that can be derived from the input matrix about subscribers and movies.

There you have it! That is, broadly, how Netflix determines which recommended movies to show you. This is just one potential application of SVD, and there are many more.

Hope this article helped you understand singular value decomposition, and how it can be used, a little better!
