multi-armed-bandits
Here are 98 public repositories matching this topic...
Simple implementations of bandit algorithms in Python (a minimal epsilon-greedy sketch, one assumed example of such an algorithm, appears after this list).
Updated Dec 2, 2021 - Jupyter Notebook
A LoRa simulator with applied multi-armed bandit algorithms.
Updated Nov 8, 2021 - C
An awesome list of anything related to bandit problems.
Updated Nov 27, 2019
Bayesian active learning with Thompson sampling on a multi-armed bandit, implemented with NumPy (see the Thompson sampling sketch after this list).
Updated Feb 6, 2022 - Python
🐯 Replication of "Combinatorial Multi-Armed Bandit Based Unknown Worker Recruitment in Heterogeneous Crowdsensing".
Updated Dec 24, 2023 - Jupyter Notebook
Repository for the Reinforcement Learning (CSE564) Fall'19 course at IIIT Delhi
Updated Dec 6, 2019 - Jupyter Notebook
Assignments for CS747 - Foundations of Intelligent and Learning Agents
Updated Nov 8, 2019 - Python
Implementation of common bandit algorithms for the Bernoulli setting.
Updated Aug 1, 2019 - Jupyter Notebook
Implementation of multi-armed bandits from scratch
Updated Mar 16, 2021 - Python
Implementation of the prophet inequalities (a single-threshold simulation sketch follows this list).
Updated Dec 11, 2021 - Python
This repository contains the answers to the "Coursera RL Specialization" course exercises.
Updated Jun 16, 2022 - Jupyter Notebook
A multi-armed bandit implementation in Python.
Updated Mar 7, 2017 - Python
Implementations of several multi-armed bandit problems.
Updated Feb 5, 2024 - Python
Contextual Bandit Engine
Updated Aug 16, 2023 - Python
A multi-armed bandit method for accurately estimating the largest parameter among a set of candidates.
Updated Apr 6, 2024 - Python
Development of reinforcement learning algorithms: software implementations of the algorithms and policies described in the paper "Batched Multi-armed Bandits Problems" by Zijun Gao, Yanjun Han, Zhimei Ren, and Zhengqing Zhou.
Updated Jul 22, 2020 - Python
A policy gradient approach to a multi-armed bandit problem (see the gradient-bandit sketch after this list).
Updated Nov 29, 2017 - Jupyter Notebook
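Several of the listings above describe simple bandit implementations in Python. As a concrete illustration, here is a minimal epsilon-greedy sketch for a Bernoulli bandit; it is not taken from any repository on this page, and the arm probabilities, horizon, and function name are illustrative assumptions:

import numpy as np

def epsilon_greedy(true_means, horizon=10_000, epsilon=0.1, seed=0):
    """Epsilon-greedy with incremental sample-average value estimates."""
    rng = np.random.default_rng(seed)
    k = len(true_means)
    counts = np.zeros(k)   # pulls per arm
    values = np.zeros(k)   # running mean reward per arm
    total = 0.0
    for _ in range(horizon):
        if rng.random() < epsilon:
            arm = int(rng.integers(k))     # explore uniformly at random
        else:
            arm = int(np.argmax(values))   # exploit the current best estimate
        reward = rng.binomial(1, true_means[arm])  # illustrative Bernoulli reward
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean
        total += reward
    return total, values

print(epsilon_greedy([0.2, 0.5, 0.75]))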
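The Bayesian active learning entry above pairs Thompson sampling with NumPy. A minimal sketch of Thompson sampling for a Bernoulli bandit, under the same illustrative assumptions (made-up arm means, Beta(1, 1) priors), might look like this:

import numpy as np

def thompson_sampling(true_means, horizon=10_000, seed=0):
    """Thompson sampling for a Bernoulli bandit with Beta(1, 1) priors."""
    rng = np.random.default_rng(seed)
    k = len(true_means)
    alpha = np.ones(k)   # posterior successes + 1 per arm
    beta = np.ones(k)    # posterior failures + 1 per arm
    total = 0.0
    for _ in range(horizon):
        # Sample one plausible mean per arm from its Beta posterior,
        # then pull the arm whose sample is largest.
        samples = rng.beta(alpha, beta)
        arm = int(np.argmax(samples))
        reward = rng.binomial(1, true_means[arm])
        alpha[arm] += reward
        beta[arm] += 1 - reward
        total += reward
    return total

print(thompson_sampling([0.2, 0.5, 0.75]))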
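For the prophet inequalities entry: a classic single-threshold stopping rule accepts the first value exceeding tau = E[max] / 2 and guarantees at least half of the prophet's expected reward for independent nonnegative values. A simulation sketch, assuming i.i.d. uniform rewards purely for illustration:

import numpy as np

def prophet_threshold(n=5, trials=100_000, seed=0):
    """Single-threshold rule for the prophet inequality: accept the first
    value above tau = E[max] / 2, else take nothing."""
    rng = np.random.default_rng(seed)
    values = rng.uniform(0, 1, size=(trials, n))   # i.i.d. uniform rewards (assumption)
    prophet = values.max(axis=1).mean()            # E[max], the prophet benchmark
    tau = prophet / 2
    above = values > tau
    first = above.argmax(axis=1)                   # index of first value above tau
    hit = above.any(axis=1)
    gambler = np.where(hit, values[np.arange(trials), first], 0.0).mean()
    return prophet, gambler                        # gambler should be >= prophet / 2

print(prophet_threshold())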
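The last entry applies a policy gradient method to a bandit. A minimal gradient-bandit sketch in the style of Sutton and Barto (softmax over per-arm preferences, with the running average reward as a baseline); the arm means and step size alpha are illustrative assumptions:

import numpy as np

def gradient_bandit(true_means, horizon=10_000, alpha=0.1, seed=0):
    """Gradient bandit: sample from a softmax policy over preferences,
    then move preferences along the policy gradient."""
    rng = np.random.default_rng(seed)
    k = len(true_means)
    prefs = np.zeros(k)   # per-arm preferences H_a
    baseline = 0.0        # running mean of all rewards
    for t in range(1, horizon + 1):
        # Softmax turns preferences into a probability distribution.
        pi = np.exp(prefs - prefs.max())
        pi /= pi.sum()
        arm = rng.choice(k, p=pi)
        reward = rng.binomial(1, true_means[arm])
        baseline += (reward - baseline) / t
        # H_a += alpha * (R - baseline) * (1{a == arm} - pi(a))
        advantage = reward - baseline
        prefs -= alpha * advantage * pi
        prefs[arm] += alpha * advantage
    return pi   # learned policy should concentrate on the best arm

print(gradient_bandit([0.2, 0.5, 0.75]))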