---
title : 'Homework on Bayesian Information Criteria (BIC) - M5L09HW1'
subtitle : 'Bayesian Statistics: Mixture Models'
categories:
- Bayesian Statistics
keywords:
- Mixture Models
- Homework
- BIC
---
::::: {.content-visible unless-profile="HC"}
::: {.callout-caution}
Section omitted to comply with the Honor Code
:::
:::::
::::: {.content-hidden unless-profile="HC"}
::: {#exr-1}
Consider a K-component mixture of D-dimensional Multinomial distributions,
$$
f(x) = \sum_{k=1}^K w_k \binom{x_1 + x_2 + \cdots + x_D}{x_1, x_2, \ldots, x_D} \prod_{d=1}^D \theta_{d,k}^{x_d}
$$
where $x = (x_1, \ldots, x_D)$ and $\sum_{d=1}^D \theta_{d,k} = 1$ for all $k = 1, \ldots, K$. For the purpose of computing the BIC, what is the effective number of parameters in the model?
- [ ] (K−1)+K×D
- [ ] K+K×(D−1)
- [ ] (K−1)+K×(D−1)
- [ ] (K−1)×(D−1)
:::
::: {.callout-note .solution collapse="true"}
## Solution
The effective number of parameters can be computed as follows:
1. The mixture weights $w_k$ contribute $K - 1$ parameters (since they must sum to 1).
2. Each component $k$ contributes $D - 1$ independent parameters $\theta_{d,k}$, since the constraint $\sum_{d=1}^D \theta_{d,k} = 1$ determines the last one.
Thus, the total number of parameters is:
$$
\text{Total parameters} = (K - 1) + K \cdot (D - 1)
$$
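This matches the option $(K-1)+K\times(D-1)$. As a minimal sketch (not part of the original assignment), the snippet below computes the effective number of parameters and plugs it into the standard BIC formula, $\mathrm{BIC} = p \log n - 2 \log \hat{L}$; the sample size `n` and log-likelihood `loglik` used in the example are made-up placeholders.

```python
# Sketch: effective parameter count and BIC for a K-component,
# D-dimensional multinomial mixture. Placeholder inputs only.
import numpy as np


def effective_params(K: int, D: int) -> int:
    """(K - 1) mixture weights plus (D - 1) free probabilities per component."""
    return (K - 1) + K * (D - 1)


def bic(loglik: float, K: int, D: int, n: int) -> float:
    """BIC = p * log(n) - 2 * log-likelihood (lower is better)."""
    p = effective_params(K, D)
    return p * np.log(n) - 2 * loglik


# Hypothetical example: K = 3 components, D = 4 categories,
# n = 500 observations, and an assumed maximized log-likelihood.
print(effective_params(K=3, D=4))            # -> 11 = (3 - 1) + 3 * (4 - 1)
print(bic(loglik=-1234.5, K=3, D=4, n=500))  # -> 2537.4 (approximately)
```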
:::
:::::