toydl: toy implementations of deep learning algorithms, with a self-implemented toy torch backend
Transparent calculations with uncertainties on the quantities involved (aka "error propagation"); calculation of derivatives. (The underlying first-order formula is sketched after this list.)
Automatic differentiation made easier for C++
Drop-in autodiff for NumPy.
Dualitic is a Python package for forward-mode automatic differentiation using dual numbers (the dual-number idea is sketched after this list).
Lightweight automatic differentiation and error propagation library
Forward mode automatic differentiation for Fortran
A minimalist neural networks library built on a tiny autograd engine
A tiny autograd library made for educational purposes.
Deep learning in Rust, with shape-checked tensors and neural networks
[wip] Lightweight automatic differentiation & deep learning framework implemented in pure Julia.
Simple neural network and automatic differentiation implementation
🚢 Portable development environment for Enzyme
A brief (and inaccurate) history of derivatives, with a brief (and incomplete) Python implementation
micrograd (smol autodiff lib by @karpathy) ported into various languages
A simple and highly extensible computational-graph library written in C++ with support for autodiff.
Yet another tensor automatic differentiation framework
My implementation of Andrej Karpathy's micrograd library for backpropagation and simple neural-net training (a minimal engine in this style is sketched after this list).
FastAD is a C++ implementation of automatic differentiation supporting both forward and reverse mode.
Algorithmic differentiation with hyper-dual numbers in C++ and Python
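
Several entries above (Dualitic, the Fortran forward-mode library, the hyper-dual projects) build on forward-mode AD with dual numbers: a number a + b·ε with ε² = 0 carries a value and a derivative through ordinary arithmetic. The following is a minimal sketch of that idea, not any listed project's API:

```python
import math

class Dual:
    """Dual number a + b*eps with eps^2 = 0: real is the value, dual the derivative."""
    def __init__(self, real, dual=0.0):
        self.real = real
        self.dual = dual

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.real + other.real, self.dual + other.dual)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (a + a'eps)(b + b'eps) = ab + (a'b + ab')eps
        return Dual(self.real * other.real,
                    self.dual * other.real + self.real * other.dual)

    __rmul__ = __mul__

def sin(x):
    # chain rule for an elementary function
    return Dual(math.sin(x.real), math.cos(x.real) * x.dual)

# Derivative of f(x) = x * sin(x) at x = 2: seed the dual part with 1.0.
x = Dual(2.0, 1.0)
y = x * sin(x)
print(y.real, y.dual)  # f(2) and f'(2) = sin(2) + 2*cos(2)
```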
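The micrograd-style entries are tiny reverse-mode engines: each operation records its inputs and a local backward rule, and calling backward() applies the chain rule in reverse topological order. A minimal sketch in that spirit (the Value class here is illustrative, not micrograd's exact code):

```python
class Value:
    """Scalar autograd value that records the graph as it is built."""
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad   # d(out)/d(self) = 1
            other.grad += out.grad  # d(out)/d(other) = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad  # product rule
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topological sort, then apply the chain rule from the output back
        order, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                order.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

a, b = Value(2.0), Value(3.0)
c = a * b + a          # c = 2*3 + 2 = 8
c.backward()
print(a.grad, b.grad)  # dc/da = b + 1 = 4, dc/db = a = 2
```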
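The error-propagation entries automate the first-order formula for independent inputs, σ_f² = Σᵢ (∂f/∂xᵢ)² σᵢ². A minimal sketch, assuming independent one-sigma uncertainties and estimating the partials with central finite differences (the propagate helper is hypothetical, not any listed library's API):

```python
import math

def propagate(f, values, sigmas, h=1e-6):
    """Return (f(values), sigma_f) via first-order error propagation,
    using central finite differences to estimate each partial derivative."""
    value = f(*values)
    var = 0.0
    for i, s in enumerate(sigmas):
        hi, lo = list(values), list(values)
        hi[i] += h
        lo[i] -= h
        dfdx = (f(*hi) - f(*lo)) / (2 * h)  # df/dx_i
        var += (dfdx * s) ** 2
    return value, math.sqrt(var)

# Resistance from voltage and current, R = V / I, with measured uncertainties.
V, I = 12.0, 2.0    # measured values
sV, sI = 0.1, 0.05  # one-sigma uncertainties
R, sR = propagate(lambda v, i: v / i, [V, I], [sV, sI])
print(f"R = {R:.3f} +/- {sR:.3f}")
```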