Introduction

Get started with CourseRec and its popular recommender models: ItemPop, ItemKNN, MF, MLP, and NeuMF.

Quick start

Looking to quickly try the online demo built from the trained models? Head to the project page.

TODO

Overview

Base Models

ItemPop: the most basic, non-personalized recommendation method, where courses are ranked by their popularity (number of interactions).
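
As a rough illustration (not the project's actual implementation), popularity ranking can be sketched as counting interactions per course; the `(user, course)` pair format below is a hypothetical example.

    from collections import Counter

    def itempop_rank(interactions, top_k=10):
        """Rank courses by raw interaction count (non-personalized).

        interactions: iterable of (user_id, course_id) pairs -- hypothetical format.
        Returns the top_k most popular course ids.
        """
        popularity = Counter(course for _, course in interactions)
        return [course for course, _ in popularity.most_common(top_k)]

    # Example: every user receives the same popularity-based ranking.
    pairs = [(1, "c1"), (2, "c1"), (3, "c2"), (1, "c3"), (2, "c3"), (3, "c3")]
    print(itempop_rank(pairs, top_k=2))  # ['c3', 'c1']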

MF: the standard collaborative filtering method, where the user–item interaction matrix is factorized into two low-rank matrices of user and item factors. Predictions are computed as the inner product of the corresponding user and item factor vectors.
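
A minimal sketch of that idea, assuming factor matrices P (users x k) and Q (courses x k); this is illustrative only, not CourseRec's MF code.

    import numpy as np

    num_users, num_items, factor_num = 100, 50, 8
    rng = np.random.default_rng(0)
    P = rng.normal(scale=0.01, size=(num_users, factor_num))  # user factors
    Q = rng.normal(scale=0.01, size=(num_items, factor_num))  # course factors

    def predict(u, i):
        # The score is the inner product of the user and item factor vectors.
        return P[u] @ Q[i]

    # Rank all courses for user 0 and keep the top 10.
    scores = P[0] @ Q.T
    top_10 = np.argsort(-scores)[:10]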

MLP: another collaborative filtering method, where a multi-layer perceptron takes the concatenated user and item embeddings and passes them through neural layers to produce the prediction.
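
A hedged PyTorch sketch of such an architecture (class and layer sizes are assumptions; the layer sizes follow the layers setting described under Settings, where layers[0] is twice the embedding size):

    import torch
    import torch.nn as nn

    class MLPRec(nn.Module):
        # Illustrative MLP recommender, not CourseRec's exact model.
        def __init__(self, num_users, num_items, layers=(64, 32, 16, 8)):
            super().__init__()
            emb_size = layers[0] // 2          # layers[0] is the concatenated size
            self.user_emb = nn.Embedding(num_users, emb_size)
            self.item_emb = nn.Embedding(num_items, emb_size)
            blocks = []
            for in_dim, out_dim in zip(layers[:-1], layers[1:]):
                blocks += [nn.Linear(in_dim, out_dim), nn.ReLU()]
            self.mlp = nn.Sequential(*blocks)
            self.out = nn.Linear(layers[-1], 1)

        def forward(self, user, item):
            x = torch.cat([self.user_emb(user), self.item_emb(item)], dim=-1)
            return torch.sigmoid(self.out(self.mlp(x)))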

NeuMF: combines both approaches by concatenating the MF and MLP representations into a final hidden layer, from which the prediction is made.
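
A rough sketch of that fusion, assuming a GMF-style element-wise-product branch alongside an MLP branch; their outputs are concatenated and projected to a single score (illustrative, not the repository's code).

    import torch
    import torch.nn as nn

    class NeuMFSketch(nn.Module):
        # Illustrative NeuMF-style fusion of an MF branch and an MLP branch.
        def __init__(self, num_users, num_items, factor_num=8, layers=(64, 32, 16, 8)):
            super().__init__()
            # MF (GMF) branch embeddings
            self.mf_user = nn.Embedding(num_users, factor_num)
            self.mf_item = nn.Embedding(num_items, factor_num)
            # MLP branch embeddings and layers
            emb_size = layers[0] // 2
            self.mlp_user = nn.Embedding(num_users, emb_size)
            self.mlp_item = nn.Embedding(num_items, emb_size)
            blocks = []
            for in_dim, out_dim in zip(layers[:-1], layers[1:]):
                blocks += [nn.Linear(in_dim, out_dim), nn.ReLU()]
            self.mlp = nn.Sequential(*blocks)
            # Final layer over the concatenated MF and MLP representations
            self.out = nn.Linear(factor_num + layers[-1], 1)

        def forward(self, user, item):
            mf_vec = self.mf_user(user) * self.mf_item(item)   # element-wise product
            mlp_vec = self.mlp(torch.cat([self.mlp_user(user),
                                          self.mlp_item(item)], dim=-1))
            return torch.sigmoid(self.out(torch.cat([mf_vec, mlp_vec], dim=-1)))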

The manage.py script is the entry point for running the project. For example:


    python manage.py --model itempop --dataset jobrec.nocoldstart --num_neg_test 99 --top_k 10
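
A hypothetical sketch of how such an entry point might parse these flags (the actual manage.py may differ; the option names below simply mirror the command above and the settings listed in the next section):

    import argparse

    def parse_args():
        # Hypothetical argument parser mirroring the flags shown above.
        parser = argparse.ArgumentParser(description="CourseRec entry point (sketch)")
        parser.add_argument("--model", default="itempop",
                            choices=["itempop", "itemknn", "mf", "mlp", "neumf"])
        parser.add_argument("--dataset", default="jobrec.nocoldstart")
        parser.add_argument("--num_neg_test", type=int, default=99)
        parser.add_argument("--top_k", type=int, default=10)
        return parser.parse_args()

    if __name__ == "__main__":
        args = parse_args()
        print(f"Running {args.model} on {args.dataset} (top_k={args.top_k})")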

Settings

Basic parameters
  • use_metadata: Whether to include user metadata information
  • use_aot: Whether to include course area-of-training (AOT) information
  • automatic_aot: Whether to use AOT in a balanced manner
  • aot_vector: Whether to use the automatic AOT vector
  • aot_cluster_num: Number of AOT clusters
  • num_neg_train: Number of negative instances to pair with each positive instance during training (see the sampling sketch after this list)
  • num_neg_test: Number of negative instances to pair with each positive instance during testing
  • verbose: Show performance every X iterations
  • top_k: Compute metrics@top_k
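
As a hedged illustration of negative sampling with num_neg_train / num_neg_test (the function and the user_pos_items format are assumptions, not the project's code):

    import random

    def sample_negatives(user_pos_items, all_items, num_neg):
        """Pair each positive (user, course) with num_neg sampled negatives.

        user_pos_items: dict mapping user id -> set of interacted course ids
                        (hypothetical format, for illustration only).
        """
        samples = []
        for user, pos_items in user_pos_items.items():
            candidates = [i for i in all_items if i not in pos_items]
            for pos in pos_items:
                negs = random.sample(candidates, num_neg)
                samples.append((user, pos, negs))
        return samples

    # With num_neg_test = 99, each held-out positive is ranked against 99 sampled
    # negatives, and metrics@top_k are computed over those 100 candidates.
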
Parameters for neural training
  • factor_num: Number of predictive factors in the model
  • epochs: Number of training epochs.
  • batch_size: Batch size.
  • layers: Size of each layer. Note that the first layer takes the concatenation of the user and item embeddings, so layers[0]/2 is the embedding size.
  • weight_decay: Regularization (weight decay) for each layer
  • lr: Learning rate.
  • dropout: Add a dropout layer after each dense layer, with p = dropout_prob
  • learner: Optimizer to use: adagrad, adam, rmsprop, or sgd (see the sketch after this list)
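
As a hedged illustration of how the learner, lr, and weight_decay settings typically map onto a PyTorch optimizer (assumed here; the build_optimizer helper is hypothetical, not the project's code):

    import torch

    def build_optimizer(model, learner="adam", lr=0.001, weight_decay=0.0):
        # Map the `learner` setting onto a PyTorch optimizer; weight_decay acts
        # as L2 regularization on the model parameters.
        optimizers = {
            "adagrad": torch.optim.Adagrad,
            "adam": torch.optim.Adam,
            "rmsprop": torch.optim.RMSprop,
            "sgd": torch.optim.SGD,
        }
        return optimizers[learner](model.parameters(), lr=lr, weight_decay=weight_decay)

    # Typical usage: run `epochs` passes over batches of size `batch_size`,
    # with the dropout layers active in training mode.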