Description

Miniature Logistic-Normal Multinomial Models.

Logistic-normal Multinomial (LNM) models are common in problems with multivariate count data. This package gives a simple implementation built around a 30-line 'Stan' script. Its lightweight design makes it an easy starting point for other projects, in particular for downstream tasks that require analysis of "compositional" data. It can be applied whenever a multinomial probability parameter is thought to depend linearly on inputs in a transformed, log-ratio space. Additional utilities make it easy to inspect fitted models, create predictions, and draw posterior samples. More about the LNM can be found in Xia et al. (2013) "A Logistic Normal Multinomial Regression Model for Microbiome Compositional Data Analysis" <doi:10.1111/biom.12079> and Sankaran and Holmes (2023) "Generative Models: An Interdisciplinary Perspective" <doi:10.1146/annurev-statistics-033121-110134>.
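
As a rough illustration of the generative model the description refers to, here is a hedged sketch in base R; the dimensions, noise scale, and variable names are illustrative only and are not the package's internal 'Stan' parameterization.

N <- 50; P <- 3; D <- 5                               # samples, covariates, output categories
X <- matrix(rnorm(N * P), N, P)                       # covariates
B <- matrix(rnorm(P * (D - 1)), P, D - 1)             # regression coefficients in log-ratio space
eta <- X %*% B + matrix(rnorm(N * (D - 1), sd = 0.5), N, D - 1)  # latent Gaussian log-ratios
p <- exp(cbind(eta, 0))                               # append a reference category
p <- p / rowSums(p)                                   # inverse additive log-ratio (softmax)
y <- t(apply(p, 1, function(pr) rmultinom(1, size = 200, prob = pr)))  # multinomial counts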

miniLNM

library(miniLNM)
library(dplyr)
knitr::knit_hooks$set(output = miniLNM::ansi_aware_handler)
options(crayon.enabled = TRUE)

miniLNM is a lightweight package for fitting and using logistic-normal multinomial models. It wraps a simple ‘Stan’ script (see inst/stan/ folder in the source code) and defines an S4 class that makes it easy to specify, estimate, and draw samples from the fit. For example, you can use tidyselect syntax to relate multiple compositional outputs to a set of influential biological factors.

example_data <- lnm_data()                           # example dataset with covariates X and responses y
xy <- bind_cols(example_data[c("X", "y")])           # combine predictors and responses into one data frame
fit <- lnm(starts_with("y") ~ starts_with("x"), xy)  # fit the LNM using a tidyselect formula
## Chain 1: ------------------------------------------------------------
## Chain 1: EXPERIMENTAL ALGORITHM:
## Chain 1:   This procedure has not been thoroughly tested and may be unstable
## Chain 1:   or buggy. The interface is subject to change.
## Chain 1: ------------------------------------------------------------
## Chain 1: 
## Chain 1: 
## Chain 1: 
## Chain 1: Gradient evaluation took 0.003206 seconds
## Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 32.06 seconds.
## Chain 1: Adjust your expectations accordingly!
## Chain 1: 
## Chain 1: 
## Chain 1: Begin eta adaptation.
## Chain 1: Iteration:   1 / 250 [  0%]  (Adaptation)
## Chain 1: Iteration:  50 / 250 [ 20%]  (Adaptation)
## Chain 1: Iteration: 100 / 250 [ 40%]  (Adaptation)
## Chain 1: Iteration: 150 / 250 [ 60%]  (Adaptation)
## Chain 1: Iteration: 200 / 250 [ 80%]  (Adaptation)
## Chain 1: Success! Found best value [eta = 1] earlier than expected.
## Chain 1: 
## Chain 1: Begin stochastic gradient ascent.
## Chain 1:   iter             ELBO   delta_ELBO_mean   delta_ELBO_med   notes 
## Chain 1:    100       -11758.981             1.000            1.000
## Chain 1:    200        -7210.902             0.815            1.000
## Chain 1:    300        -6537.465             0.578            0.631
## Chain 1:    400        -6378.312             0.440            0.631
## Chain 1:    500        -6340.350             0.353            0.103
## Chain 1:    600        -6251.900             0.296            0.103
## Chain 1:    700        -6199.291             0.255            0.025
## Chain 1:    800        -6177.029             0.224            0.025
## Chain 1:    900        -6143.302             0.200            0.014
## Chain 1:   1000        -6117.542             0.180            0.014
## Chain 1:   1100        -6157.149             0.081            0.008   MEDIAN ELBO CONVERGED
## Chain 1: 
## Chain 1: Drawing a sample of size 1000 from the approximate posterior... 
## Chain 1: COMPLETED.
## Warning: Pareto k diagnostic value is 12.45. Resampling is disabled. Decreasing tol_rel_obj may help if variational algorithm has terminated prematurely. Otherwise consider using sampling instead.

The print method gives a concise summary of the fitted model, which is easier to read than the full ‘Stan’ output.

fit
## [LNM Model]
## Regression formula: y1 + y2 + y3 + y4 ... ~ x1 + x2 + x3 + x4 ... 
## 5-dimensional input and 10-dimensional output 
## First few entries of estimated regression coefficients:
## # A tibble: 5 × 9
##      y1    y2    y3    y4    y5    y6    y7    y8    y9
##   <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
## 1  0.24  0.23  0.21  0.12  0.46  0.56  0.81  0.86  0.32
## 2  0.89  0.6   0.66  0.87  0.01  0.9   0.14  0.74  0.06
## 3  0.82  0.89  0.89  0.87 -0.02  0.34  0.61  0.59  0.63
## 4  0.31  0.95  0.53  0.39  0.12  0.27  0.8   0     0.11
## 5  0.23  0.14  0.11  0.2   0.95  0.73  0.33  0.45  0.63

You can also use predict, as with ordinary linear models, and draw posterior predictive samples using sample.

p_hat <- predict(fit)
y_star <- sample(fit, depth = 200)
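
As a quick sanity check, you can inspect the shapes of both objects. This assumes predict returns one row of estimated composition per sample and sample returns a matrix of posterior predictive counts; the actual return classes may differ.

dim(p_hat)             # expected: samples x output categories
dim(y_star)            # posterior predictive draws at the requested depth
head(rowSums(y_star))  # if y_star holds counts, row sums should equal (or be near) depth = 200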

Help

We welcome questions and comments about the package, either through GitHub or by email.

Metadata

Version

0.1.0

License

Unknown

Platforms (77)

    Darwin
    FreeBSD
    Genode
    GHCJS
    Linux
    MMIXware
    NetBSD
    none
    OpenBSD
    Redox
    Solaris
    WASI
    Windows
  • aarch64-darwin
  • aarch64-freebsd
  • aarch64-genode
  • aarch64-linux
  • aarch64-netbsd
  • aarch64-none
  • aarch64-windows
  • aarch64_be-none
  • arm-none
  • armv5tel-linux
  • armv6l-linux
  • armv6l-netbsd
  • armv6l-none
  • armv7a-darwin
  • armv7a-linux
  • armv7a-netbsd
  • armv7l-linux
  • armv7l-netbsd
  • avr-none
  • i686-cygwin
  • i686-darwin
  • i686-freebsd
  • i686-genode
  • i686-linux
  • i686-netbsd
  • i686-none
  • i686-openbsd
  • i686-windows
  • javascript-ghcjs
  • loongarch64-linux
  • m68k-linux
  • m68k-netbsd
  • m68k-none
  • microblaze-linux
  • microblaze-none
  • microblazeel-linux
  • microblazeel-none
  • mips-linux
  • mips-none
  • mips64-linux
  • mips64-none
  • mips64el-linux
  • mipsel-linux
  • mipsel-netbsd
  • mmix-mmixware
  • msp430-none
  • or1k-none
  • powerpc-netbsd
  • powerpc-none
  • powerpc64-linux
  • powerpc64le-linux
  • powerpcle-none
  • riscv32-linux
  • riscv32-netbsd
  • riscv32-none
  • riscv64-linux
  • riscv64-netbsd
  • riscv64-none
  • rx-none
  • s390-linux
  • s390-none
  • s390x-linux
  • s390x-none
  • vc4-none
  • wasm32-wasi
  • wasm64-wasi
  • x86_64-cygwin
  • x86_64-darwin
  • x86_64-freebsd
  • x86_64-genode
  • x86_64-linux
  • x86_64-netbsd
  • x86_64-none
  • x86_64-openbsd
  • x86_64-redox
  • x86_64-solaris
  • x86_64-windows