bandits
Here are 39 public repositories matching this topic...
Coursework, Stochastic Models and Optimization, BSE, Term 3, Class of 2022 (Jupyter Notebook, updated Jul 6, 2022).
An assignment implementing Online Learning, Bandits, and Reinforcement Learning (Jupyter Notebook, updated Dec 18, 2018).
Study the interplay between communication and feedback in a cooperative online learning setting (Python, updated May 31, 2024).
Implementation of the Multi-Armed Bandit (MAB) algorithms UCB and Epsilon-Greedy. MAB is a class of reinforcement-learning problems in which an agent repeatedly chooses among a set of arms, each with an unknown reward distribution; UCB and Epsilon-Greedy are two popular algorithms for solving it (Python, updated Mar 26, 2023).
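As a rough illustration of the two algorithms this entry names, here is a minimal sketch of Epsilon-Greedy and UCB1 on a Bernoulli bandit; the arm probabilities, horizon, and epsilon value are invented for the example and are not taken from the repository.

    # Minimal Epsilon-Greedy and UCB1 on a Bernoulli bandit.
    # Arm probabilities and hyperparameters are illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)
    true_p = np.array([0.2, 0.5, 0.7])   # hypothetical arm reward probabilities
    K, T = len(true_p), 5000

    def pull(arm):
        return float(rng.random() < true_p[arm])

    def epsilon_greedy(eps=0.1):
        counts, values, total = np.zeros(K), np.zeros(K), 0.0
        for t in range(T):
            if rng.random() < eps:
                arm = int(rng.integers(K))        # explore uniformly at random
            else:
                arm = int(np.argmax(values))      # exploit current estimates
            r = pull(arm)
            counts[arm] += 1
            values[arm] += (r - values[arm]) / counts[arm]  # incremental mean
            total += r
        return total

    def ucb1():
        counts, values, total = np.zeros(K), np.zeros(K), 0.0
        for t in range(T):
            if t < K:
                arm = t                           # play each arm once first
            else:
                bonus = np.sqrt(2 * np.log(t) / counts)
                arm = int(np.argmax(values + bonus))  # optimism bonus
            r = pull(arm)
            counts[arm] += 1
            values[arm] += (r - values[arm]) / counts[arm]
            total += r
        return total

    print("epsilon-greedy total reward:", epsilon_greedy())
    print("UCB1 total reward:", ucb1())

Both learners keep an incremental mean reward per arm; they differ only in how the next arm is chosen, a coin flip between exploring and exploiting versus a confidence bonus that shrinks as an arm is sampled.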
This repo contains everything I encountered while playing OverTheWire games (updated Dec 25, 2020).
Foundations of Intelligent and Learning Agents (Python, updated Dec 13, 2019).
Implementation of the prophet inequalities (Python, updated Dec 11, 2021).
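The classic result in this area (Samuel-Cahn) says that stopping at the first value above a fixed threshold, chosen as the median of the maximum, earns at least half of a prophet's expected maximum. A minimal Monte Carlo sketch of that rule, using illustrative i.i.d. uniform distributions rather than anything from the repository:

    # Monte Carlo check of the median-threshold prophet inequality.
    # Distributions are illustrative (i.i.d. uniforms), not from the repo.
    import numpy as np

    rng = np.random.default_rng(1)
    n, trials = 10, 100_000
    X = rng.random((trials, n))            # each row: one sequence of values

    tau = np.median(X.max(axis=1))         # threshold: median of the maximum

    # Gambler: accept the first value >= tau, else settle for the last one.
    hit = X >= tau
    first = hit.argmax(axis=1)             # index of first hit (0 if no hit)
    reward = np.where(hit.any(axis=1), X[np.arange(trials), first], X[:, -1])

    print("prophet E[max]:", X.max(axis=1).mean())
    print("gambler E[reward]:", reward.mean())
    print("ratio (should be >= 0.5):", reward.mean() / X.max(axis=1).mean())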
An exploration of multi-armed Bernoulli bandits in reinforcement learning, complete with experiments and observations (Jupyter Notebook, updated Sep 29, 2023).
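One common way to run such Bernoulli-bandit experiments is Thompson sampling with a Beta posterior per arm; the sketch below assumes that setup and invents the arm means, so it should be read as an illustration, not as the repository's code.

    # Thompson sampling for Bernoulli bandits with Beta(1, 1) priors.
    # Arm means are illustrative only.
    import numpy as np

    rng = np.random.default_rng(2)
    true_p = np.array([0.3, 0.55, 0.6])    # hypothetical Bernoulli arm means
    K, T = len(true_p), 5000

    alpha, beta = np.ones(K), np.ones(K)   # Beta posterior parameters per arm
    regret = 0.0
    for t in range(T):
        theta = rng.beta(alpha, beta)      # sample a plausible mean per arm
        arm = int(np.argmax(theta))        # play the arm that looks best
        r = float(rng.random() < true_p[arm])
        alpha[arm] += r                    # conjugate posterior update
        beta[arm] += 1 - r
        regret += true_p.max() - true_p[arm]

    print("cumulative pseudo-regret:", regret)  # grows roughly logarithmically in T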
Repository for the course project done as part of the CS-747 (Foundations of Intelligent & Learning Agents) course at IIT Bombay in Autumn 2022 (Python, updated Oct 14, 2022).
A two-armed bandit simulation and comparison with theoretical convergence (Jupyter Notebook, updated Apr 16, 2022).
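For a two-armed Bernoulli bandit, the comparison this description hints at usually pits the running sample mean of each arm against its true mean, with error shrinking at the usual O(1/sqrt(n)) rate; a minimal sketch under those assumptions, with invented arm means:

    # Running sample means of two Bernoulli arms vs. their true means.
    # Arm means are illustrative only.
    import numpy as np

    rng = np.random.default_rng(3)
    p, n = np.array([0.4, 0.8]), 10_000   # hypothetical true means, pulls per arm

    for arm, pa in enumerate(p):
        rewards = (rng.random(n) < pa).astype(float)
        running = rewards.cumsum() / np.arange(1, n + 1)  # running sample mean
        se = np.sqrt(pa * (1 - pa) / n)                   # theoretical std. error at n
        print(f"arm {arm}: estimate {running[-1]:.4f}, true {pa}, "
              f"theoretical std. error {se:.4f}")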
Play Rock, Paper, Scissors (Kaggle competition) with reinforcement learning: bandits, tabular Q-learning, and PPO with LSTM (Python, updated Mar 2, 2021).
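In the bandit framing of this game, the three moves are arms and the payoff depends on the opponent's unknown move distribution; the sketch below plays epsilon-greedy against a hypothetical biased opponent, which is an assumption of the example rather than the competition setup.

    # Epsilon-greedy over the three Rock-Paper-Scissors moves against a
    # biased opponent. The opponent's bias is an invented example.
    import numpy as np

    rng = np.random.default_rng(4)
    MOVES = 3                               # 0 = rock, 1 = paper, 2 = scissors
    opp_bias = np.array([0.5, 0.3, 0.2])    # hypothetical opponent move mix

    def payoff(me, opp):
        return [0, 1, -1][(me - opp) % 3]   # +1 win, 0 draw, -1 loss

    q, counts, score = np.zeros(MOVES), np.zeros(MOVES), 0
    for t in range(5000):
        me = int(rng.integers(MOVES)) if rng.random() < 0.1 else int(np.argmax(q))
        opp = int(rng.choice(MOVES, p=opp_bias))
        r = payoff(me, opp)
        counts[me] += 1
        q[me] += (r - q[me]) / counts[me]   # incremental mean payoff per move
        score += r

    print("total score:", score, "learned move values:", np.round(q, 3))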
Code and data for the paper "A Combinatorial Multi-Armed Bandit Approach to Correlation Clustering", DAMI 2023 (Python, updated Aug 4, 2023).