Quickstart with litgen#
litgen_template is a template repository for building python bindings using litgen, pybind11 or nanobind, and scikit-build.
This template is based on scikit_build_example.
Usage for end users#
Below are the instructions you would give to the end users of your bindings. They are extremely short:
First, install the package from source
git clone https://github.com/pthom/litgen_template.git && cd litgen_template
pip install -v .
Then, use it from python
import daft_lib
daft_lib.add(1, 2)
(this template builds bindings for a C++ library called DaftLib, and publishes it as a python module called daft_lib)
Of course, you could also publish your bindings to PyPI, and tell your users to install them with
pip install daft-lib
This template provides tooling to make the publishing process easier, via cibuildwheel.
Autogenerate the binding code#
Install requirements#
Create a virtual environment
python3 -m venv venv
source venv/bin/activate
Install the requirements
pip install -r requirements-dev.txt
This will install litgen (the bindings generator), pybind11 and nanobind (libraries to create C++ to Python bindings), pytest (for the tests), black (a code formatter), and mypy (static type checker for python).
See requirements-dev.txt.
Generate bindings#
Optionally, change the C++ code:
Add or modify functions, classes, etc. in src/cpp_libraries/DaftLib
Adapt the generation options inside tools/autogenerate_bindings.py
Optionally, switch to nanobind
By default, this template uses pybind11. If you want to switch to nanobind, you can do so with
export LITGEN_USE_NANOBIND=ON
Run the code generation via litgen
python tools/autogenerate_bindings.py
This will:
Write the cpp binding code into _pydef_pybind11/pybind_DaftLib.cpp or _pydef_nanobind/nanobind_DaftLib.cpp
Write the python stubs (i.e. typed declarations) inside _stubs/daft_lib/__init__.pyi.
Tip: compare the python stubs with the C++ header file to see how close they are!
Note: the options inside autogenerate_bindings.py showcase a subset of litgen's customization capabilities. They are heavily documented, and correspond to the documentation you can find in DaftLib.h. See the litgen documentation for more details.
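To give an idea of what a generation script does, here is a minimal sketch (not the template's actual script). It assumes litgen exposes LitgenOptions and generate_code(options, code), with the results available as pydef_code and stub_code, as shown in the litgen documentation; refer to that documentation for the exact API and the full list of options.
import litgen

cpp_code = "int add(int a, int b);"

options = litgen.LitgenOptions()
# Many options are available (exclusion regexes, parameter adaptations, etc.);
# see tools/autogenerate_bindings.py and the litgen documentation.

generated_code = litgen.generate_code(options, cpp_code)
print(generated_code.pydef_code)  # C++ binding code (pybind11 or nanobind)
print(generated_code.stub_code)   # python stub (.pyi) declarations
The template's own script writes these two outputs to the files listed above instead of printing them.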
Adapt for your own library#
Names, names, names
In this template repository:
the C++ library is called DaftLib
the native python module generated by pybind11 is called _daft_lib
the python module which is imported by users is called daft_lib (it imports and optionally adapts _daft_lib)
the pip package that can optionally be published to PyPI is called daft-lib (by convention, PyPI package names use dashes rather than underscores)
You can change these names by running change_lib_name.py in the tools/change_lib_name folder.
Structure of this template#
Bound C++ library#
The C++ library DaftLib is stored inside src/cpp_libraries/DaftLib/
src/
├── cpp_libraries/
│   └── DaftLib/
│       ├── CMakeLists.txt
│       ├── DaftLib.h
│       └── cpp/
│           └── DaftLib.cpp
C++ binding code#
The C++ binding code is stored inside _pydef_pybind11/ (or _pydef_nanobind/ if you use nanobind).
_pydef_pybind11/
├── module.cpp # Main entry point of the python module
└── pybind_DaftLib.cpp # File with bindings *generated by litgen*
Python stubs#
The python stubs are stored inside _stubs/
_stubs/
└── daft_lib/
    ├── __init__.pyi   # Stubs *generated by litgen*
    ├── __init__.py    # The python module (daft_lib) main entry point
    │                  # (it imports and optionally adapts _daft_lib)
    └── py.typed       # An empty file that indicates that the python module is typed
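To illustrate the wrapping pattern mentioned above, here is a hypothetical sketch of what such an __init__.py could contain (the template's actual file may differ; it assumes the native module _daft_lib is built next to this file, so a relative import works):
# daft_lib/__init__.py (sketch)
from ._daft_lib import *  # re-export the native symbols (e.g. add)

# Optionally adapt or extend the native API in pure python here,
# for example by wrapping a native function with extra checks or documentation.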
Tooling for the bindings generation#
tools/
├── autogenerate_bindings.py
└── change_lib_name/
    └── change_lib_name.py
tools/autogenerate_bindings.py is the script that will generate the bindings using litgen.
tools/change_lib_name/change_lib_name.py is an optional utility that you can use once after cloning this template, in order to rename the libraries (e.g. from DaftLib to MyLib, daft_lib to my_lib, etc.)
Compilation#
├── CMakeLists.txt # CMakeLists (used also by pip, via skbuild)
├── litgen_cmake/                  # litgen_setup_module() is a cmake function that simplifies
│   └── litgen_setup_module.cmake  # the deployment of a python module with litgen
│
├── requirements-dev.txt # Requirements for development (litgen, pybind11, pytest, black, mypy)
Deployment#
pyproject.toml is used by pip and skbuild to build and deploy the package. It defines the name of the package, the version, the dependencies, etc.
Continuous integration#
Several github workflows are defined in .github/workflows:
.github/
├── dependabot.yml     # Configuration for dependabot (automatically update CI dependencies)
└── workflows/
    ├── conda.yml      # Build the package with conda
    ├── pip.yml        # Build the package with pip
    └── wheels.yml     # Build the wheels with cibuildwheel, and publish them to PyPI
                       # (when a new release is created on github)
Note:
cibuildwheel is configurable via options defined in the pyproject.toml file: see the [tool.cibuildwheel] section.
It is also configurable via environment variables: see the cibuildwheel documentation.
Tests#
├── tests/daft_lib_test.py   # Python tests that check the daft_lib bindings
└── pytest.ini               # pytest configuration
Those tests are run by cibuildwheel and by the pip CI workflow.
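For example, a minimal test in this style could look like the sketch below (the actual tests in daft_lib_test.py may differ; it assumes daft_lib.add performs integer addition, as in the quickstart above):
import daft_lib

def test_add() -> None:
    # add() is the example function bound from the DaftLib C++ library
    assert daft_lib.add(1, 2) == 3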
Editable development mode#
If you want to quickly iterate on the C++ code and see the changes reflected in python without having to reinstall the package, you should use pip's editable development mode.
Setup editable mode#
Step 1: Install the package in editable mode
pip install -v -e . # -e stands for --editable, and -v stands for --verbose
Step 2: Create a standard C++ build directory
mkdir build && cd build
cmake ..
make # rebuild when you change the C++ code, and the changes will be reflected in python!
Debug C++ bindings in editable mode#
The pybind_native_debug executable provided in this template is a simple C++ program that can be used to debug the bindings in editable mode.
src/pybind_native_debug/
├── CMakeLists.txt
├── pybind_native_debug.cpp
└── pybind_native_debug.py
Simply edit the python file pybind_native_debug.py by adding calls to the C++ functions you want to debug. Then, place breakpoints in the C++ code, and debug the C++ program.
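For instance, here is a hypothetical snippet you could add to pybind_native_debug.py (assuming the script imports daft_lib and that you want to step into DaftLib's add() implementation):
import daft_lib

# Call the bound C++ function you want to inspect; set a breakpoint inside
# DaftLib's C++ implementation, then run the pybind_native_debug C++ program
# under your debugger.
print(daft_lib.add(1, 2))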
Development tooling#
This template is ready to be used with additional tools:
pre-commit
ruff
mypy
pyright
black
pytest
cibuildwheel
pre-commit#
pre-commit is a tool that allows you to run checks on your code before committing it. This template provides a default pre-commit configuration, but it is not active by default.
You can install pre-commit with:
pip install pre-commit
Then, you can activate the pre-commit hooks for your repository with:
pre-commit install
The pre-commit configuration file, .pre-commit-config.yaml, is configured with the following hooks:
basic sanity checks: trailing-whitespace, end-of-file-fixer, check-yaml, check-added-large-files
black: uncompromising Python code formatter
ruff: fast Python linter and code formatter (only used for linting)
mypy: static type checker for python
You can find more hooks on the pre-commit hooks repository and add them to the configuration if needed.
You may want to disable some checks in the .pre-commit-config.yaml file if you find them too strict for your project.
ruff: python linter and code formatter#
ruff is a very fast python linter and code formatter. You can install it and run it with:
pip install ruff # install ruff (once)
ruff check . # each time you want to lint your python code
mypy and pyright: static type checkers for python#
mypy and pyright are static type checkers for python.
You can use either one of them, or both.
mypy#
pip install mypy # install mypy (once)
mypy # each time you want to check your python code
mypy is configured via the mypy.ini file.
pyright#
pip install pyright # install pyright (once)
pyright # each time you want to check your python code
pyright is configured via the pyrightconfig.json file.
black: python code formatter#
black is a python code formatter.
pip install black # install black (once)
black . # each time you want to format your python code
pytest: python tests#
pytest is an easy-to-use python test framework.
pip install pytest # install pytest (once)
pytest # each time you want to run your python tests
It is configured via the pytest.ini file, and tests are stored in the tests folder.
cibuildwheel: build wheels for all platforms#
cibuildwheel is a tool that allows you to build wheels for all platforms.
It is configured via the pyproject.toml file (see the [tool.cibuildwheel] section), and via the wheels.yml github workflow file.
run_all_checks#
tools/run_all_checks.sh is a script you can run before committing or pushing. It will run a collection of checks (mypy, black, ruff, pytest).