
Released

Talk

Sampling for non-conjugate infinite latent feature models

MPS-Authors

Görür, D
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society

Rasmussen, CE
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society

Citation

Görür, D., & Rasmussen, C. (2006). Sampling for non-conjugate infinite latent feature models. Talk presented at 8th Valencia International Meeting on Bayesian Statistics (ISBA 2006). Benidorm, Spain. 2006-06-02 - 2006-06-06.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-D1AB-F
Abstract
Latent variable models are powerful tools for modelling the underlying structure in data. Infinite latent variable models can be defined using Bayesian nonparametrics. Dirichlet process (DP) models are an example of infinite latent class models, in which each object is assumed to belong to one of infinitely many mutually exclusive classes. Recently, the Indian buffet process (IBP) has been defined as an extension of the DP. The IBP is a distribution over sparse binary matrices with infinitely many columns, which can be used as a prior over non-exclusive binary features. Inference using Markov chain Monte Carlo (MCMC) in conjugate IBP models has been described previously; however, requiring conjugacy restricts the applicability of the IBP. We describe an MCMC algorithm for non-conjugate IBP models.
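For context, the IBP prior over sparse binary feature matrices can be sampled directly via its "buffet" construction: each object inherits existing features in proportion to their popularity and opens a Poisson-distributed number of new ones. A minimal NumPy sketch follows; the function name and parameterisation are illustrative, not the authors' code:

```python
import numpy as np

def sample_ibp(n_objects, alpha, rng=None):
    """Draw a binary feature matrix Z from the Indian buffet process prior.

    Rows are objects ("customers"), columns are latent features ("dishes");
    alpha controls the expected number of active features per object.
    """
    rng = np.random.default_rng() if rng is None else rng
    Z = np.zeros((n_objects, 0), dtype=int)
    for i in range(n_objects):
        # Existing features: object i takes feature k with prob m_k / (i + 1),
        # where m_k is the number of earlier objects that have feature k.
        if Z.shape[1] > 0:
            probs = Z[:i].sum(axis=0) / (i + 1)
            Z[i] = rng.random(Z.shape[1]) < probs
        # New features: Poisson(alpha / (i + 1)) fresh columns opened by object i.
        k_new = rng.poisson(alpha / (i + 1))
        if k_new > 0:
            new_cols = np.zeros((n_objects, k_new), dtype=int)
            new_cols[i] = 1
            Z = np.hstack([Z, new_cols])
    return Z
```

The matrix has finitely many active columns for any finite number of objects, even though the underlying process has infinitely many; this sparsity is what makes the IBP usable as a feature prior.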
Modelling choice behaviour is an important topic in psychology, economics, and related fields. Elimination by Aspects (EBA) is a choice model that assumes each alternative has latent features (aspects) with associated weights that lead to the observed choice outcomes. We formulate a nonparametric version of EBA by using the IBP as the prior over the latent binary features, and we infer the features of the objects that give rise to the choice data using our sampling scheme.
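In the binary-choice case of EBA, aspects shared by both alternatives cancel, and the choice probability depends only on the weights of the distinguishing aspects. A hedged sketch of that likelihood term, assuming binary aspect vectors and positive weights (the function and representation are illustrative, not the authors' implementation):

```python
import numpy as np

def eba_binary_choice_prob(z_x, z_y, w):
    """P(choose x over y) under Elimination by Aspects for a pair.

    z_x, z_y: binary aspect vectors for the two alternatives;
    w: positive weight per aspect. Shared aspects cancel; only
    distinguishing aspects drive the choice.
    """
    z_x, z_y, w = map(np.asarray, (z_x, z_y, w))
    unique_x = (z_x == 1) & (z_y == 0)  # aspects only x has
    unique_y = (z_y == 1) & (z_x == 0)  # aspects only y has
    sx, sy = w[unique_x].sum(), w[unique_y].sum()
    if sx + sy == 0:
        return 0.5  # no distinguishing aspects: indifference
    return sx / (sx + sy)
```

With an IBP prior over the aspect matrix, terms like this enter a non-conjugate likelihood, which is the setting the sampling scheme above is designed for.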