Deep Determinantal Point Processes

(1811.07245)
Published Nov 17, 2018 in stat.ML and cs.LG

Abstract

Determinantal point processes (DPPs) have attracted significant attention as an elegant model that captures the balance between quality and diversity within sets. DPPs are parameterized by a positive semi-definite kernel matrix. While DPPs have substantial expressive power, they are fundamentally limited by the parameterization of the kernel matrix and their inability to capture nonlinear interactions between items within sets. We present the deep DPP model as a way to address these limitations, by using a deep feed-forward neural network to learn the kernel matrix. In addition to allowing us to capture nonlinear item interactions, the deep DPP also allows easy incorporation of item metadata into DPP learning. Since the learning target is the DPP kernel matrix, the deep DPP allows us to use existing DPP algorithms for efficient learning, sampling, and prediction. Through an evaluation on several real-world datasets, we show experimentally that the deep DPP can provide a considerable improvement in the predictive performance of DPPs, while also outperforming strong baseline models in many cases.
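The construction described above can be sketched in a few lines: a feed-forward network maps item (metadata) features to embeddings, the kernel is formed as a Gram matrix (guaranteeing positive semi-definiteness), and the standard DPP likelihood applies unchanged. This is a minimal illustrative sketch, not the authors' implementation; the network architecture, the function names, and the tiny ridge term added for numerical stability are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_embed(X, W1, b1, W2, b2):
    # Hypothetical two-layer feed-forward network mapping item
    # features to low-dimensional embeddings (tanh hidden layer
    # is an assumption, not taken from the paper).
    H = np.tanh(X @ W1 + b1)
    return H @ W2 + b2

def deep_dpp_log_likelihood(X, subset, params):
    W1, b1, W2, b2 = params
    V = mlp_embed(X, W1, b1, W2, b2)        # (n, k) embeddings
    n = X.shape[0]
    L = V @ V.T + 1e-6 * np.eye(n)          # PSD kernel as a Gram matrix
    # Standard L-ensemble DPP log-likelihood of observing `subset`:
    #   log P(A) = log det(L_A) - log det(L + I)
    L_A = L[np.ix_(subset, subset)]
    _, logdet_A = np.linalg.slogdet(L_A)
    _, logdet_Z = np.linalg.slogdet(L + np.eye(n))
    return logdet_A - logdet_Z

# Toy example: 5 items with 3-dim metadata features, embedding dim 2.
n, d, h, k = 5, 3, 8, 2
X = rng.normal(size=(n, d))
params = (rng.normal(size=(d, h)) * 0.1, np.zeros(h),
          rng.normal(size=(h, k)) * 0.1, np.zeros(k))
ll = deep_dpp_log_likelihood(X, [0, 2], params)
print(ll)
```

Because only the kernel construction changes, training would maximize this log-likelihood over observed subsets by backpropagating through the network (here one would use an autodiff framework rather than raw NumPy), and existing DPP sampling and prediction routines operate on the learned `L` directly.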
