Bayesic Force Fields (BFF) is a workflow-oriented Python package for learning fixed-charge molecular force-field parameters from molecular dynamics observables. It combines system preparation, staged MD campaigns, quantity-of-interest (QoI) analysis, surrogate training, and posterior inference in one toolchain.
The public CLI is centered around seven workflows:

```
bff prepare
bff reference
bff trainset
bff qoi
bff train
bff learn
bff validate
```
Examples can be fetched on demand with:

```
bff examples
```

Documentation lives under docs/ and is intended to be published with MkDocs on GitHub Pages.
Published documentation: vojtechkostal.github.io/BayesicForceFields
If you use BFF, please cite:
Kostal, V.; Shanks, B. L.; Jungwirth, P.; Martinez-Seara, H.
Bayesian Learning for Accurate and Robust Biomolecular Force Fields.
J. Chem. Theory Comput. 2026, 22 (5), 2652-2663.
https://doi.org/10.1021/acs.jctc.5c02051
Paper: Bayesian Learning for Accurate and Robust Biomolecular Force Fields
Preprint: arXiv:2511.05398
Recommended user installation:
```
mamba create -n bfflearn python=3.10 pip
mamba activate bfflearn
```

Install a matching PyTorch build for your machine before training or learning. The recommended way is to use the selector on the official PyTorch install page: https://pytorch.org/get-started/locally/

Example for Linux with CUDA 12.6:

```
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu126
```

Then install BFF from PyPI:

```
pip install bfflearn
```

If you want the exact code used for the paper, do not install v0.0.1
directly through a pip Git URL. That archived tag predates the packaging
cleanup. Instead, clone the repository, check out the archived tag, and follow
the README.md and environment.yaml shipped with that snapshot:
```
git clone https://github.com/vojtechkostal/BayesicForceFields.git
cd BayesicForceFields
git checkout v0.0.1
```

Use v0.0.1 for exact reproduction of the published paper data. The current bfflearn release line is the post-paper refactored workflow.
External tools are still required for full workflows:
- GROMACS for `prepare`, `trainset`, and `validate`
- CP2K for `reference` and other staged reference calculations
- PLUMED, only for PLUMED-biased systems
- PyTorch, installed separately, for `train`, `learn`, and the posterior notebooks
PyTorch is not installed by default because the appropriate CPU or CUDA build depends on the target machine. Install the matching PyTorch build first, then install BFF.
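As a rough illustration of that choice (this helper is not part of BFF), the sketch below suggests a pip command depending on whether an NVIDIA driver is visible on the machine. The cu126 index URL matches the CUDA 12.6 example above; the `cpu` index URL is the standard PyTorch CPU-only wheel index.

```python
import shutil

# PyTorch wheel index URLs: cu126 matches the CUDA 12.6 example above,
# "cpu" is the CPU-only wheel index.
INDEX_URLS = {
    "cu126": "https://download.pytorch.org/whl/cu126",
    "cpu": "https://download.pytorch.org/whl/cpu",
}


def suggest_torch_install(has_nvidia_gpu: bool) -> str:
    """Return a pip command for a PyTorch build matching the machine."""
    build = "cu126" if has_nvidia_gpu else "cpu"
    return f"pip install torch --index-url {INDEX_URLS[build]}"


if __name__ == "__main__":
    # Treat a visible nvidia-smi binary as a crude signal that a CUDA build fits.
    print(suggest_torch_install(shutil.which("nvidia-smi") is not None))
```

For finer control (CUDA minor versions, ROCm, macOS builds), the selector on the official install page remains the authoritative source.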
For development work on the repository itself, use:
```
mamba env create -f environment.yaml
mamba activate bfflearn
```

If you prefer to start from an existing environment instead:

```
pip install -e ".[dev,docs,notebook]"
```

The acetate example in examples/acetate/ shows the intended stage order:
```
cd examples/acetate/01-prepare/colvars
bff prepare config.yaml
cd ../../02-reference-data
bff reference config-local.yaml
cd ../02-training-data
bff trainset config-local.yaml
cd ../03-qoi
bff qoi config.yaml
cd ../04-train-lgp
bff train config.yaml
cd ../05-learning
bff learn config.yaml
```

Validation is configured separately in 07-validate:
```
cd ../07-validate
bff validate config.yaml
```

Two notebooks are included in the example:
- 05-learning/interactive.ipynb shows interactive surrogate training, posterior sampling, and posterior sample export.
- 06-visualize/visualize.ipynb focuses on plotting and inspection only.
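The staged commands above can be collected into a small driver script. This is a sketch, not part of the BFF CLI: by default it only builds and prints the (directory, command) plan so you can review it; pass `run=True` to actually execute each stage with subprocess.

```python
import subprocess
from pathlib import Path

# Stage order from the acetate example: (subdirectory, bff command).
STAGES = [
    ("01-prepare/colvars", ["bff", "prepare", "config.yaml"]),
    ("02-reference-data", ["bff", "reference", "config-local.yaml"]),
    ("02-training-data", ["bff", "trainset", "config-local.yaml"]),
    ("03-qoi", ["bff", "qoi", "config.yaml"]),
    ("04-train-lgp", ["bff", "train", "config.yaml"]),
    ("05-learning", ["bff", "learn", "config.yaml"]),
    ("07-validate", ["bff", "validate", "config.yaml"]),
]


def run_stages(root: str, run: bool = False) -> list[str]:
    """Build the per-stage command plan; optionally execute each stage."""
    plan = []
    for subdir, cmd in STAGES:
        workdir = Path(root) / subdir
        plan.append(f"{workdir}$ {' '.join(cmd)}")
        if run:
            # Fail fast if any stage exits non-zero.
            subprocess.run(cmd, cwd=workdir, check=True)
    return plan


if __name__ == "__main__":
    for line in run_stages("examples/acetate"):
        print(line)
```

Each stage reads its config from its own directory, so running stages out of order (or skipping `reference`) will fail when inputs from an earlier stage are missing.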
If you installed BFF from PyPI and want the example tree locally, run:
```
bff examples
cd examples/acetate
```

The reference workflow itself writes the final train.extxyz and valid.extxyz files, so the normal path is simply:

```
bff reference CONFIG.yaml
```

- bff/ contains the package code.
- examples/acetate/ contains the worked example.
- data/ contains repository example inputs.
- docs/ contains the documentation source.
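The train.extxyz and valid.extxyz files mentioned above use the extended XYZ format: each frame is an atom-count line, a comment/metadata line, then one line per atom. The stdlib-only frame counter below is an illustration of that layout, not BFF functionality; for real analysis, use a proper reader such as ASE.

```python
from pathlib import Path


def count_extxyz_frames(path: str) -> int:
    """Count frames in an (ext)xyz file: natoms line, comment line, atom lines."""
    frames = 0
    lines = Path(path).read_text().splitlines()
    i = 0
    while i < len(lines):
        if not lines[i].strip():  # tolerate trailing blank lines
            i += 1
            continue
        natoms = int(lines[i].strip())
        i += 2 + natoms  # skip the count line, the comment line, and the atoms
        frames += 1
    return frames
```

A quick sanity check like this can confirm that the train/validation split produced the expected number of configurations.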
Preview the docs locally with MkDocs:
```
mkdocs serve
```

Build the static site with:

```
mkdocs build --strict
```

Shortcuts are also available:

```
make docs
make docs-build
```

When bff runs inside an activated conda environment, it writes a small
completion hook for bash and zsh into that environment. After the first bff
run, reactivate the environment once:
```
conda deactivate
conda activate bfflearn
```

After that, `bff <TAB>` should offer the public workflow commands.
Packaging, docs, and deployment configuration live at the top level of the repository.
The release and publication strategy is documented in docs/development.md.
BFF is distributed under the GNU GPL v3. See LICENSE.