Matrix Factorization for Missing Values
I stumbled across an interesting Reddit post about using matrix factorization (MF) for imputing missing values. The original poster was trying to model a complex time series that had missing values, and the suggested solution was to use matrix factorization to impute those missing values.
Since I had never heard of that application before, I got curious and searched the web for information. I came across this post on using matrix factorization and Python to impute missing values.
In a nutshell:
Recommendations can be generated by a wide range of algorithms. While user-based or item-based collaborative filtering methods are simple and intuitive, matrix factorization techniques are usually more effective because they allow us to discover the latent features underlying the interactions between users and items. Of course, matrix factorization is simply a mathematical tool for playing around with matrices, and is therefore applicable in many scenarios where one would like to find out something hidden under the data.
The author uses a movie rating example, where you have a set of users and their ratings for different movies. Of course, a table like this will have many missing ratings. When you look at the table, it looks just like a matrix that's waiting to be solved!
In a recommendation system such as Netflix or MovieLens, there is a group of users and a set of items (movies for the above two systems). Given that each user has rated some items in the system, we would like to predict how the users would rate the items that they have not yet rated, so that we can make recommendations to the users. In this case, all the information we have about the existing ratings can be represented in a matrix. Assuming we have 5 users and 10 items, and that ratings are integers ranging from 1 to 5, the matrix may look something like this (a hyphen means that the user has not yet rated the movie).
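To make the idea concrete, here is a rough NumPy sketch of the trick. This is my own toy illustration, not the code from the linked post: the factorize helper, the toy ratings matrix, and the hyperparameters (k, learning rate, regularization) are all made up for the example. We factor the ratings matrix R into two low-rank factors P and Q using only the observed entries, then read the imputed ratings off the product.

import numpy as np

# Toy ratings matrix: 5 users x 4 movies; np.nan marks a rating we don't have.
R = np.array([
    [5.0, 3.0, np.nan, 1.0],
    [4.0, np.nan, np.nan, 1.0],
    [1.0, 1.0, np.nan, 5.0],
    [1.0, np.nan, np.nan, 4.0],
    [np.nan, 1.0, 5.0, 4.0],
])

def factorize(R, k=2, steps=5000, lr=0.002, reg=0.02, seed=0):
    """Approximate R ~ P @ Q.T by gradient descent over the observed entries only."""
    rng = np.random.default_rng(seed)
    n_users, n_items = R.shape
    P = rng.random((n_users, k))
    Q = rng.random((n_items, k))
    observed = list(zip(*np.nonzero(~np.isnan(R))))
    for _ in range(steps):
        for u, i in observed:
            err = R[u, i] - P[u] @ Q[i]
            p_u = P[u].copy()
            # gradient step with a small L2 (regularization) penalty on the factors
            P[u] += lr * (err * Q[i] - reg * P[u])
            Q[i] += lr * (err * p_u - reg * Q[i])
    return P, Q

P, Q = factorize(R)
R_hat = P @ Q.T                              # full low-rank reconstruction
R_imputed = np.where(np.isnan(R), R_hat, R)  # keep known ratings, fill the gaps
print(np.round(R_imputed, 2))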
After applying MF, the missing entries get filled in with predicted ratings; the full worked example, including the resulting table, is in the linked post.
Of course I skipped over the discussion of regularization and the Python code, but you can read about that here. Going back to the original Reddit post, I was intrigued that this imputation method is available in H2O.ai's open source offering. It's called 'Generalized Low Rank Models' (GLRM), and it not only helps with dimensionality reduction BUT it also imputes missing values. I had to check this out, because I know there's a better way than just replacing missing values with the column average.
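For contrast, the naive baseline I mean by "replacing the average value", filling each gap with its column's mean, is a one-liner in pandas. This is just a quick sketch of my own, not from the post or from H2O, and it assumes a local copy of the USArrests_nan.csv file used in the example below:

import pandas as pd

# Mean imputation: every missing cell gets its column's average value.
df = pd.read_csv("USArrests_nan.csv")
df_mean_imputed = df.fillna(df.mean(numeric_only=True))
print(df_mean_imputed.describe())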
H2O Example: Imputing Missing Values
I posted a Python notebook on how GLRM imputes missing values in my Tutorials GitHub repo. You can check it out here, or you can just copy the code below.
This is a very simple example where we impute just 4 missing values from a very small dataset. The nice thing is that you can do this FAST on an H2O cluster if you have very large data.
#!/usr/bin/env python
# coding: utf-8
# In[1]:
import h2o
from h2o.estimators import H2OGeneralizedLowRankEstimator
# In[2]:
h2o.init()
# In[3]:
# Import the USArrests dataset into H2O, plus a copy of it with 4 values removed:
arrestsH2O = h2o.import_file("USArrests.csv")
arrestsH2O_nan = h2o.import_file("USArrests_nan.csv")
# In[4]:
arrestsH2O.describe()
# In[5]:
#Yes, we have 4 missing values in this dataset from USArrests_nan.csv
arrestsH2O_nan.describe()
# In[6]:
# You could split into train/valid sets, but here we train on the full frame:
#train, test = arrestsH2O_nan.split_frame(ratios=[.8], seed=1234)
train = arrestsH2O_nan
# In[7]:
# Build and train the model:
glrm_model = H2OGeneralizedLowRankEstimator(k=4,                    # rank of the low-rank decomposition
                                            loss="quadratic",
                                            gamma_x=0.5,            # regularization weight on the X factor
                                            gamma_y=0.5,            # regularization weight on the Y factor
                                            max_iterations=700,
                                            recover_svd=True,
                                            init="SVD",
                                            transform="standardize")
# In[8]:
# Train the GLRM on the frame that contains the missing values:
glrm_model.train(training_frame=train)
# In[9]:
# Check the model metrics:
glrm_model.model_performance()
# In[10]:
# Reconstruct the frame from the low-rank factors; the reconstructed
# values fill in the cells that were missing in the original data:
out = glrm_model.predict(train)
out
# In[11]:
#Missing Values in the Train Set
train.describe()
# In[12]:
#Missing Values Imputed
out.describe()
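As a follow-up, here is one way you might write the reconstruction back into the original frame, keeping every observed value and taking the GLRM prediction only where a cell was missing. This part is my own sketch, not from the original notebook, and it assumes the columns of out come back in the same order as the columns of train:

# Hypothetical follow-up, not part of the original notebook.
# Keep observed values; use the GLRM reconstruction only where a cell is NA.
imputed = h2o.deep_copy(train, "arrests_imputed")
for i, col in enumerate(train.columns):
    # assumption: column i of `out` is the reconstruction of column `col`
    imputed[col] = train[col].isna().ifelse(out[:, i], train[col])
imputed.describe()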
Learning Python Programming the Easy Way
I picked up Python programming when I needed to get something done but couldn't figure out how to connect the dots. Luckily, there were some great books out there that helped accelerate my learning process.
Here is a list of books and cheat sheets I like: