- Title
- Decoupling Sparsity and Smoothness in Dirichlet Belief Networks
- Creator
- Li, Yaqiong; Fan, Xuhui; Chen, Ling; Li, Bin; Sisson, Scott A.
- Relation
- Lecture Notes in Computer Science 12976
- Publisher Link
- http://dx.doi.org/10.1007/978-3-030-86520-7_10
- Publisher
- Springer
- Resource Type
- book chapter
- Date
- 2021
- Description
- The Dirichlet Belief Network (DirBN) has been proposed as a promising deep generative model that uses Dirichlet distributions to form layer-wise connections and thereby construct a multi-stochastic layered deep architecture. However, the DirBN cannot simultaneously achieve both sparsity, whereby the generated latent distributions place weights on a subset of components, and smoothness, which requires that the posterior distribution should not be dominated by the data. To address this limitation, we introduce the sparse and smooth Dirichlet Belief Network (ssDirBN), which achieves both sparsity and smoothness simultaneously, thereby increasing modelling flexibility over the DirBN. This gain is achieved by introducing binary variables that indicate whether each entity’s latent distribution at each layer uses a particular component. As a result, each latent distribution may use only a subset of components in each layer, and smoothness is enforced on this subset. Further modifications to the model are made to address the issues caused by introducing these binary variables. Extensive experimental results on real-world data show significant performance improvements of ssDirBN over state-of-the-art models, in terms of both enhanced model predictions and reduced model complexity.
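- The mechanism described above can be illustrated with a minimal sketch: binary indicator variables select which components an entity's latent distribution may use, and a Dirichlet draw over that active subset keeps the retained weights smooth. This is an illustrative assumption-laden toy, not the authors' ssDirBN construction; the function name, the Bernoulli prior `pi`, and the concentration `alpha` are all hypothetical choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_sparse_smooth_latent(K, pi=0.5, alpha=1.0):
    """Toy draw of one entity's latent distribution over K components.
    Binary indicators induce sparsity; a Dirichlet over the active
    subset induces smoothness on the retained components.
    (Hypothetical sketch, not the paper's exact generative process.)"""
    # Binary variables: which of the K components this entity uses.
    b = rng.binomial(1, pi, size=K)
    if b.sum() == 0:
        # Keep at least one component active so the draw is well defined.
        b[rng.integers(K)] = 1
    theta = np.zeros(K)
    active = np.flatnonzero(b)
    # Dirichlet restricted to the active subset: weights are smooth
    # on the subset and exactly zero elsewhere.
    theta[active] = rng.dirichlet(np.full(active.size, alpha))
    return theta

theta = sample_sparse_smooth_latent(8)
print(theta)        # zeros off the selected subset, smooth weights on it
```

The draw is a valid probability vector (non-negative, summing to one) while typically placing exact zeros on inactive components, which is the sparsity–smoothness combination the abstract describes.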
- Subject
- Dirichlet belief networks; Markov chain Monte Carlo; sparsity
- Identifier
- http://hdl.handle.net/1959.13/1442034
- Identifier
- uon:41607
- Identifier
- ISBN:9783030865191
- Language
- eng