# Restricted Boltzmann machines (RBMs)

## Introduction

A restricted Boltzmann machine (RBM), originally invented under the name harmonium, is a popular building block for deep probabilistic models. For example, RBMs are the constituents of deep belief networks, which started the surge in deep learning advances in 2006.

RBMs specify joint probability distributions over random variables, both visible and latent, using an energy function, similar to Boltzmann machines, but with certain restrictions on the connectivity. Hence the name. In this article, we will introduce Boltzmann machines and their extension to RBMs.

## Prerequisites

To understand RBMs, we recommend familiarity with the concepts in

• Probability: A sound understanding of conditional and marginal probabilities and Bayes' theorem is desirable.
• Introduction to machine learning: An introduction to basic concepts in machine learning such as classification, training instances, features, and feature types.

Follow the above links to first get acquainted with the corresponding concepts.

## Boltzmann machines

A Boltzmann machine is a parametric model for the joint probability of binary random variables.

Consider an $\ndim$-dimensional binary random variable $\vx \in \set{0,1}^\ndim$ with an unknown distribution. The joint probability of such a random variable using the Boltzmann machine model is calculated as

$$\prob{\vx} = \frac{\expe{-E(\vx)}}{Z} \label{eqn:bm}$$

Here, $Z$ is a normalization term, also known as the partition function that ensures $\sum_{\vx} \prob{\vx} = 1$.

The function $E: \set{0,1}^\ndim \to \mathbb{R}$ is a parametric function known as the energy function. It is defined as

$$E(\vx) = -\vx^T \mW \vx - \vb^T \vx \label{eqn:energy}$$

with the parameters $\mW$ and $\vb$.
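As a minimal numerical sketch, the energy and the unnormalized probability $\expe{-E(\vx)}$ can be computed directly from the definitions above. The parameter values below are illustrative assumptions, not learned weights.

```python
import numpy as np

def energy(x, W, b):
    """Boltzmann machine energy: E(x) = -x^T W x - b^T x."""
    return -x @ W @ x - b @ x

def unnormalized_prob(x, W, b):
    """exp(-E(x)); dividing by the partition function Z yields p(x)."""
    return np.exp(-energy(x, W, b))

# Illustrative (not learned) parameters for n = 3 binary units.
rng = np.random.default_rng(0)
n = 3
W = rng.normal(scale=0.1, size=(n, n))
W = (W + W.T) / 2          # weights are symmetric
np.fill_diagonal(W, 0.0)   # no self-interaction terms
b = rng.normal(scale=0.1, size=n)

x = np.array([1.0, 0.0, 1.0])
print(energy(x, W, b))
```

Note that the all-zeros configuration always has energy $0$ and therefore unnormalized probability $1$, which is a handy sanity check.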

## Boltzmann machine with hidden variables

The Boltzmann machine model for binary variables readily extends to scenarios where the variables are only partially observable. Say the random variable $\vx$ consists of elements that are observable (or visible), $\vv$, and elements that are latent (or hidden), $\vh$.

Retaining the same formulation for the joint probability of $\vx$, we can now define the energy function of $\vx$ with specialized parameters for the two kinds of variables, indicated below with corresponding subscripts.

\begin{aligned} E(\vx) &= E(\vv, \vh) \\\\ &= -\vv^T \mW_v \vv - \vb_v^T \vv - \vh^T \mW_h \vh - \vb_h^T \vh - \vv^T \mW_{vh} \vh \label{eqn:energy-hidden} \end{aligned}

## Restricted Boltzmann machine (RBM)

RBMs are undirected probabilistic graphical models for jointly modeling visible and hidden variables. They are a specialized version of the Boltzmann machine with a restriction: there are no links among visible variables or among hidden variables.

As a result, the energy function of an RBM has two fewer terms than that in Equation \ref{eqn:energy-hidden}

\begin{aligned} E(\vv, \vh) &= - \vb_v^T \vv - \vb_h^T \vh - \vv^T \mW_{vh} \vh \label{eqn:energy-rbm} \end{aligned}

Note that the quadratic terms for the self-interaction among the visible variables ($-\vv^T \mW_v \vv$) and those among the hidden variables ($-\vh^T \mW_h \vh$) are not included in the RBM energy function. Hence the name restricted Boltzmann machines.
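The restricted energy function is a simple bilinear form. A minimal sketch, with illustrative (not learned) parameter values for a small model:

```python
import numpy as np

def rbm_energy(v, h, W, b_v, b_h):
    """RBM energy: E(v, h) = -b_v^T v - b_h^T h - v^T W h."""
    return -b_v @ v - b_h @ h - v @ W @ h

# Illustrative parameters: 3 visible units, 2 hidden units.
rng = np.random.default_rng(1)
W = rng.normal(scale=0.1, size=(3, 2))   # visible-hidden weights W_vh
b_v = rng.normal(scale=0.1, size=3)      # visible biases
b_h = rng.normal(scale=0.1, size=2)      # hidden biases

v = np.array([1.0, 0.0, 1.0])
h = np.array([0.0, 1.0])
print(rbm_energy(v, h, W, b_v, b_h))
```

Because the visible-visible and hidden-hidden weight matrices are gone, the only interaction is between the two layers, through $\mW_{vh}$.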

Using this modified energy function, the joint probability of the variables is

$$\prob{v=\vv, h=\vh} = \frac{\expe{-E(\vv, \vh)}}{Z} \label{eqn:rbm}$$

The partition function is a summation of the unnormalized probabilities over all possible instantiations of the variables

$$Z = \sum_{\vv} \sum_{\vh} \expe{-E(\vv, \vh)}$$
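For a toy model, the partition function can be computed by brute force; this enumerates all $2^{n_v + n_h}$ joint states and is shown only to make the definition of $Z$ concrete, since the enumeration is intractable at realistic sizes.

```python
import itertools
import numpy as np

def rbm_energy(v, h, W, b_v, b_h):
    """RBM energy: E(v, h) = -b_v^T v - b_h^T h - v^T W h."""
    return -b_v @ v - b_h @ h - v @ W @ h

def partition_function(W, b_v, b_h):
    """Sum exp(-E(v, h)) over every binary configuration (v, h)."""
    n_v, n_h = W.shape
    Z = 0.0
    for v in itertools.product([0.0, 1.0], repeat=n_v):
        for h in itertools.product([0.0, 1.0], repeat=n_h):
            Z += np.exp(-rbm_energy(np.array(v), np.array(h), W, b_v, b_h))
    return Z

# Sanity check: with all parameters zero, every state has energy 0,
# so Z equals the number of states, 2^(n_v + n_h) = 16 here.
Z = partition_function(np.zeros((2, 2)), np.zeros(2), np.zeros(2))
print(Z)   # 16.0
```

Dividing each unnormalized probability by this $Z$ makes the joint probabilities sum to one, as required.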

## Training

Training an RBM involves the discovery of optimal parameters $\vb_v$, $\vb_h$, and $\mW_{vh}$ of the model. Notice that the partition function is intractable because it requires enumerating all possible values of the visible and hidden states. Therefore, RBMs are typically trained using approximation methods designed for models with intractable partition functions, with the necessary terms calculated using sampling methods such as Gibbs sampling.
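One such approximation method is contrastive divergence with a single Gibbs step (CD-1), which exploits the bipartite structure of the RBM: given the visible layer, the hidden units are conditionally independent, and vice versa, so block Gibbs sampling simply alternates between the two layers. The sketch below shows one CD-1 parameter update; the learning rate, layer sizes, and initial values are assumptions for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, W, b_v, b_h, lr=0.1, rng=None):
    """One CD-1 update for a binary RBM; returns updated parameters."""
    if rng is None:
        rng = np.random.default_rng(0)
    # Positive phase: hidden probabilities given the data vector v0.
    p_h0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
    # One Gibbs step: reconstruct the visibles, then the hiddens again.
    p_v1 = sigmoid(h0 @ W.T + b_v)
    v1 = (rng.random(p_v1.shape) < p_v1).astype(float)
    p_h1 = sigmoid(v1 @ W + b_h)
    # Approximate gradient: data statistics minus model statistics.
    W = W + lr * (np.outer(v0, p_h0) - np.outer(v1, p_h1))
    b_v = b_v + lr * (v0 - v1)
    b_h = b_h + lr * (p_h0 - p_h1)
    return W, b_v, b_h

# Illustrative run: 3 visible units, 2 hidden units, one data vector.
rng = np.random.default_rng(3)
W = rng.normal(scale=0.1, size=(3, 2))
b_v = np.zeros(3)
b_h = np.zeros(2)
v0 = np.array([1.0, 0.0, 1.0])
W, b_v, b_h = cd1_step(v0, W, b_v, b_h, rng=rng)
```

In practice the update is applied over mini-batches of training vectors and repeated for many epochs; this single-vector step is only meant to show the shape of the computation.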