
Bayesian Linear Regression with SGD, Adam and NUTS in PyTorch

Jake Jing
4 min read · Jul 18, 2020


PyTorch has gained great popularity in both industrial and scientific projects, and it provides a backend for many other packages and modules. It also comes with very good documentation, tutorials, and conferences. This blog uses PyTorch to fit a simple linear regression via three optimisation algorithms (a brief setup sketch follows the list):

  • Stochastic Gradient Descent (SGD)
  • Adam
  • No-U-Turn Sampler (NUTS)
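
As a quick preview, here is a minimal sketch of how each approach is set up. The model, learning rates and sample counts below are illustrative placeholders, not the values used later in this post:

import torch
import torch.nn as nn
from numpyro.infer import MCMC, NUTS

toy_model = nn.Linear(3, 1)  # stand-in model, for illustration only

# SGD and Adam are both optimisers from torch.optim:
opt_sgd = torch.optim.SGD(toy_model.parameters(), lr=0.01)
opt_adam = torch.optim.Adam(toy_model.parameters(), lr=0.01)

# NUTS is an MCMC kernel from numpyro; it wraps a model function
# (one that declares sample sites via numpyro.sample), not an nn.Module:
# kernel = NUTS(numpyro_model)  # numpyro_model: a model function, defined where NUTS is used
# mcmc = MCMC(kernel, num_warmup=500, num_samples=1000)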

We will start with some simulated data, generated from fixed parameters: the weights, the bias and the noise scale sigma. The linear regression can be expressed by the following equation:

Y = XW + b + ε,   ε ~ Normal(0, σ²)

Matrix notation of a linear regression

where the observed dependent variable Y is a linear combination of the data X weighted by W, plus the bias b (and Gaussian noise ε). This is essentially what the nn.Linear class in PyTorch computes.
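
To make the correspondence concrete, here is a small, self-contained check (shapes chosen arbitrarily) that nn.Linear computes exactly this affine map. Note that nn.Linear stores its weight as (out_features, in_features), so the manual version multiplies by the transpose:

import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.randn(5, 3)    # 5 observations, 3 predictors
layer = nn.Linear(3, 1)  # computes X @ W.T + b

manual = X @ layer.weight.T + layer.bias
assert torch.allclose(layer(X), manual)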

1. Simulate data

We first need to load the required modules: torch for the models and optimisers, and numpyro (built on jax) for the NUTS sampler.

from __future__ import print_function
import torch
import torch.nn as nn
from torch.autograd import Variable  # legacy import; plain tensors suffice in modern PyTorch
from torch.distributions.normal import Normal
from torch.distributions.uniform import Uniform
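
With the modules in place, a simulation step along the following lines generates the data. This is a minimal sketch, and the sample size, true weights, bias and sigma are all illustrative choices:

torch.manual_seed(123)  # uses torch, Normal and Uniform imported above

n, p = 500, 3  # sample size and number of predictors
true_W = torch.tensor([[1.5], [-2.0], [0.8]])  # illustrative true weights
true_b = 0.5                                   # illustrative true bias
true_sigma = 1.0                               # illustrative noise scale

X = Uniform(-2.0, 2.0).sample((n, p))           # predictors
noise = Normal(0.0, true_sigma).sample((n, 1))  # Gaussian noise
Y = X @ true_W + true_b + noise                 # Y = XW + b + eps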
