# asym-GP

Gaussian Process regression with interchangeable non-Gaussian likelihoods for Bayesian Optimization. Built on JAX and NumPyro.

Standard GP models assume Gaussian observation noise, which breaks down when experimental data contains outliers or systematic bias. asym-GP provides four likelihood variants that share a common interface and can be swapped without changing any other code:

| Model | Likelihood  | Best for                                       |
|-------|-------------|------------------------------------------------|
| `gGP` | Gaussian    | Clean or mildly noisy data (baseline)          |
| `tGP` | Student-t   | Symmetric outliers; learns degrees of freedom  |
| `lGP` | Laplace     | Symmetric outliers with exponential tails      |
| `sGP` | Skew-Normal | Asymmetric noise or one-sided contamination    |

The sGP uses a mode-centered parametrization, so the latent function tracks the most likely outcome rather than the mean. This matters when observations are systematically biased in one direction (e.g., contamination always degrades a measurement).
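For context, under the standard Azzalini parametrization of the skew-normal (an assumption for illustration; the package's internal parametrization may differ), the density and mean are:

$$
p(y \mid \xi, \omega, \alpha) = \frac{2}{\omega}\,\varphi\!\left(\frac{y-\xi}{\omega}\right)\Phi\!\left(\alpha\,\frac{y-\xi}{\omega}\right),
\qquad
\mathbb{E}[y] = \xi + \omega\,\delta\sqrt{2/\pi},
\quad
\delta = \frac{\alpha}{\sqrt{1+\alpha^2}}
$$

where $\varphi$ and $\Phi$ are the standard normal pdf and cdf. For $\alpha \neq 0$ the mode sits between the location $\xi$ and the mean, so a mean- or location-centered latent function would be systematically offset from the most likely observation; centering on the mode removes that offset.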

All models are trained via Stochastic Variational Inference and support quasi-Monte Carlo posterior sampling for low-variance acquisition function evaluation.
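The package's QMC internals are not shown here, but the core idea can be sketched generically (using scipy, not the package's own code): map a scrambled Sobol sequence through the inverse Gaussian CDF to get low-discrepancy normal draws, whose sample averages converge faster than i.i.d. Monte Carlo.

```python
import numpy as np
from scipy.stats import norm, qmc

# Scrambled Sobol points cover (0, 1) far more evenly than
# pseudo-random uniforms of the same size.
sobol = qmc.Sobol(d=1, scramble=True, seed=0)
u = sobol.random(256)        # 256 low-discrepancy points in (0, 1)

# Inverse-CDF transform turns them into quasi-random N(0, 1) draws.
z = norm.ppf(u)

# The QMC sample mean is very close to the true mean of 0, which is
# what keeps acquisition-function estimates low-variance.
print(abs(z.mean()))
```

In a GP setting the same trick applies per test point: posterior samples are generated from Sobol points instead of i.i.d. Gaussians, reducing the variance of Monte Carlo acquisition estimates such as expected improvement.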


## Installation

This project uses uv. If you don't use uv, you should.

```shell
git clone https://github.com/andrewfalkowski/asym-GP.git
cd asym-GP
uv sync
```

To install with pip instead:

```shell
pip install -e .
```

## Usage

```python
import jax
from asym_gp import gGP, tGP, lGP, sGP
from asym_gp import DataScaler, run_svi_training

key = jax.random.PRNGKey(0)

# Scale inputs to the unit hypercube and targets to zero mean / unit variance
# (x_train, y_train, x_test are your own data arrays)
scaler = DataScaler().fit(x_train, y_train)
xs, ys = scaler.transform_x(x_train), scaler.transform_y(y_train)

# Fit a model — any of gGP/tGP/lGP/sGP can be swapped in here
model = sGP(xs, skew_direction="negative")
samples, losses = run_svi_training(model, ys, num_steps=25000)

# Predict
x_test_scaled = scaler.transform_x(x_test)
f_pred = model.predict_f(samples, x_test_scaled)          # noise-free latent function
y_pred = model.predict_y_qmc(samples, x_test_scaled, key) # noisy predictive samples
```
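The scaling step is standard preprocessing. A minimal numpy sketch of what a `DataScaler`-style transform computes (an illustration, not the package's actual implementation):

```python
import numpy as np

x = np.array([[0.0], [5.0], [10.0]])
y = np.array([1.0, 3.0, 5.0])

# Inputs: min-max scale each column to [0, 1] (the unit hypercube)
x_min, x_max = x.min(axis=0), x.max(axis=0)
xs = (x - x_min) / (x_max - x_min)

# Targets: standardize to zero mean and unit variance
y_mu, y_sigma = y.mean(), y.std()
ys = (y - y_mu) / y_sigma

print(xs.ravel(), ys.mean(), ys.std())
```

Predictions come back on this scaled space, so remember to invert the transform before reporting results in the original units.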

## Demo Scripts

Two scripts are included to sanity-check the models on synthetic data. Figures are saved to figures/ in whichever directory you run from.

**Fitting benchmark:** fits all four models on a noisy 1D signal and compares posterior fits and SVI convergence:

```shell
uv run scripts/demo_fitting.py
```

**Bayesian Optimization benchmark:** runs a short BO loop on a 1D test function and plots convergence across all models:

```shell
uv run scripts/demo_bo.py
```

## Status

This repo is under active development. Interfaces may change. If something is broken, open an issue.


## Citation

If you use this code in your work, please cite:

```bibtex
@software{falkowski2026asymgp,
  author  = {Falkowski, Andrew R.},
  title   = {asym-{GP}: Non-{G}aussian Likelihoods for {B}ayesian Optimization},
  year    = {2026},
  url     = {https://github.com/andrewfalkowski/asym-GP}
}
```
