Bayesic Force Fields


Bayesic Force Fields (BFF) is a workflow-oriented Python package for learning fixed-charge molecular force-field parameters from molecular dynamics observables. It combines system preparation, sampled MD campaigns, quantity-of-interest (QoI) analysis, surrogate training, and posterior inference in one toolchain.

The public CLI is centered on seven workflows:

  • bff prepare
  • bff reference
  • bff trainset
  • bff qoi
  • bff train
  • bff learn
  • bff validate

Examples can be fetched on demand with:

bff examples

Documentation lives under docs/ and is intended to be published with MkDocs on GitHub Pages.

Published documentation: vojtechkostal.github.io/BayesicForceFields

How to Cite

If you use BFF, please cite:

Kostal, V.; Shanks, B. L.; Jungwirth, P.; Martinez-Seara, H.
Bayesian Learning for Accurate and Robust Biomolecular Force Fields.
J. Chem. Theory Comput. 2026, 22 (5), 2652-2663.
https://doi.org/10.1021/acs.jctc.5c02051

Paper: Bayesian Learning for Accurate and Robust Biomolecular Force Fields

Preprint: arXiv:2511.05398

Installation

Recommended user installation:

mamba create -n bfflearn python=3.10 pip
mamba activate bfflearn

Install a matching PyTorch build for your machine before training or learning. The recommended way is to use the selector on the official PyTorch install page: https://pytorch.org/get-started/locally/

Example for Linux with CUDA 12.6:

pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu126

Then install BFF from PyPI:

pip install bfflearn

If you want the exact code used for the paper, do not install v0.0.1 directly through a pip Git URL. That archived tag predates the packaging cleanup. Instead, clone the repository, check out the archived tag, and follow the README.md and environment.yaml shipped with that snapshot:

git clone https://github.com/vojtechkostal/BayesicForceFields.git
cd BayesicForceFields
git checkout v0.0.1

Use v0.0.1 for exact reproduction of the published paper data. The current bfflearn release line is the post-paper refactored workflow.

External tools are still required for full workflows:

  • GROMACS for prepare, trainset, and validate
  • CP2K for the reference workflow and other staged reference calculations
  • PLUMED only for PLUMED-biased systems
  • PyTorch installed separately for train, learn, and posterior notebooks

PyTorch is not installed by default because the appropriate CPU or CUDA build depends on the target machine. Install the matching PyTorch build first, then install BFF.
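Before launching train or learn, it can be worth confirming which PyTorch build the active environment actually sees. A minimal stdlib check (the function name is ours, not part of BFF) that also works when torch is absent:

```python
import importlib.util

def torch_status() -> str:
    """Report which PyTorch build, if any, the current environment sees.

    Returns "missing" when torch is not importable, "cuda" when a CUDA
    build with a visible GPU is present, and "cpu" otherwise.
    """
    if importlib.util.find_spec("torch") is None:
        return "missing"
    import torch  # deferred so the check also runs without torch installed
    return "cuda" if torch.cuda.is_available() else "cpu"

print(torch_status())
```

If this prints cpu on a machine where you expected GPU training, reinstall torch using the CUDA index URL appropriate for your system, as shown above.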

For development work on the repository itself, use:

mamba env create -f environment.yaml
mamba activate bfflearn

If you prefer to start from an existing environment instead:

pip install -e ".[dev,docs,notebook]"

Quick Start

The acetate example in examples/acetate/ shows the intended stage order:

cd examples/acetate/01-prepare/colvars
bff prepare config.yaml

cd ../../02-reference-data
bff reference config-local.yaml

cd ../02-training-data
bff trainset config-local.yaml

cd ../03-qoi
bff qoi config.yaml

cd ../04-train-lgp
bff train config.yaml

cd ../05-learning
bff learn config.yaml

Validation is configured separately in 07-validate:

cd ../07-validate
bff validate config.yaml

Two notebooks are also included in the example.

If you installed BFF from PyPI and want the example tree locally, run:

bff examples
cd examples/acetate

The reference workflow itself writes the final train.extxyz and valid.extxyz files, so the normal path is simply:

bff reference CONFIG.yaml
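The train.extxyz and valid.extxyz files follow the extended-XYZ convention: an atom-count line, a key=value comment line, then one atom per line. As a sketch of that layout only (for real work, ASE's ase.io.read parses these files, including the Properties spec this reader skips):

```python
def read_extxyz_frames(text):
    """Parse frames from extended-XYZ text.

    Minimal sketch: reads the atom-count line, keeps the key=value comment
    line verbatim, and takes the first four columns of each atom line
    (species, x, y, z).
    """
    lines = text.splitlines()
    frames, i = [], 0
    while i < len(lines):
        if not lines[i].strip():  # tolerate blank separator lines
            i += 1
            continue
        natoms = int(lines[i])
        comment = lines[i + 1]
        atoms = [
            (cols[0], tuple(float(x) for x in cols[1:4]))
            for cols in (line.split() for line in lines[i + 2 : i + 2 + natoms])
        ]
        frames.append({"comment": comment, "atoms": atoms})
        i += 2 + natoms
    return frames
```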

Repository Layout

  • bff/ contains the package code.
  • examples/acetate/ contains the worked example.
  • data/ contains repository example inputs.
  • docs/ contains the documentation source.

Documentation Locally

Preview the docs locally with MkDocs:

mkdocs serve

Build the static site with:

mkdocs build --strict

Shortcuts are also available:

make docs
make docs-build

Shell Completion

When bff runs inside an activated conda environment, it writes a small completion hook for bash and zsh into that environment. After the first bff run, reactivate the environment once:

conda deactivate
conda activate bfflearn

After that, bff <TAB> should offer the public workflow commands.
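The actual hook is generated by bff itself; purely as an illustration of the mechanism (not the real file), a bash completion function for the public commands could look like:

```shell
# Illustrative only: the real hook is written by bff into the environment.
_bff_complete() {
  local cur=${COMP_WORDS[COMP_CWORD]}
  # Offer the seven public workflows plus the examples fetcher.
  COMPREPLY=( $(compgen -W "prepare reference trainset qoi train learn validate examples" -- "$cur") )
}
complete -F _bff_complete bff
```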

Development and Release

Packaging, docs, and deployment configuration live in the repository; the release and publication strategy is documented in docs/development.md.

License

BFF is distributed under the GNU GPL v3. See LICENSE.
