Simple reinforcement learning tutorials, by 莫烦Python (Chinese AI tutorials)
TensorFlow tutorials from basic to advanced, by 莫烦Python (Chinese AI tutorials)
Implementations from the free course Deep Reinforcement Learning with Tensorflow and PyTorch
Minimal and Clean Reinforcement Learning Examples
Minimal Deep Q Learning (DQN & DDQN) implementations in Keras
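The DQN/DDQN distinction mentioned above comes down to how the bootstrap target is computed. As an illustrative sketch (the value arrays below are made-up numbers, not output from any of the listed repositories), vanilla DQN takes the max over the target network's own estimates, while Double DQN lets the online network pick the action and the target network evaluate it:

```python
import numpy as np

gamma = 0.99
# Hypothetical per-action Q-value estimates for a batch of two next states.
q_online = np.array([[1.0, 2.0], [0.5, 0.3]])  # online network, shape (batch, actions)
q_target = np.array([[1.5, 0.8], [0.4, 0.6]])  # target network, same shape
rewards  = np.array([0.0, 1.0])
done     = np.array([False, True])             # no bootstrapping on terminal steps

# DQN: max over the target network's own estimates (prone to overestimation).
dqn_target = rewards + gamma * q_target.max(axis=1) * (~done)

# DDQN: the online network selects the action, the target network evaluates it.
best_actions = q_online.argmax(axis=1)
ddqn_target = rewards + gamma * q_target[np.arange(2), best_actions] * (~done)
```

With these numbers the first transition gets a DQN target of 1.485 but a DDQN target of 0.792, showing how DDQN can temper the max-operator's overestimation.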
StarCraft II - pysc2 Deep Reinforcement Learning Examples
High-quality implementations of deep reinforcement learning algorithms, written in PyTorch
Master Reinforcement and Deep Reinforcement Learning using OpenAI Gym and TensorFlow
Deep Reinforcement Learning based Trading Agent for Bitcoin
Implementations of Reinforcement Learning Models in Tensorflow
Deep Q-learning for playing the Flappy Bird game
Deep Q-Learning Network in pytorch (not actively maintained)
PyTorch implementations of various deep reinforcement learning (DRL) algorithms for both single-agent and multi-agent settings
Deep Q-learning for playing the Tetris game
A reinforcement learning package for Julia
This repository provides prototypes, written up on my website, as case studies for proof-of-concept (PoC) and research-and-development (R&D) work. The main research topics are auto-encoders in relation to representation learning, statistical machine learning for energy-based models, adversarial generation net…
CURL: Contrastive Unsupervised Representation Learning for Sample-Efficient Reinforcement Learning
Implementations of algorithms from the Q-learning family, including DQN, DDQN, Dueling DQN, PER+DQN, Noisy DQN, and C51
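All of the Q-learning-family repositories above build on the same tabular update rule. A minimal sketch on a hypothetical 5-state chain MDP (this toy environment is an assumption for illustration, not code from any listed repository):

```python
import numpy as np

n_states, n_actions = 5, 2   # actions: 0 = left, 1 = right
alpha, gamma = 0.5, 0.9      # learning rate and discount factor

def step(s, a):
    """Deterministic transitions; reward 1.0 only when reaching the last state."""
    s_next = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
    reward = 1.0 if s_next == n_states - 1 else 0.0
    return s_next, reward

Q = np.zeros((n_states, n_actions))
rng = np.random.default_rng(0)
for _ in range(500):
    s = int(rng.integers(n_states))      # random exploration over states
    a = int(rng.integers(n_actions))     # and actions
    s_next, r = step(s, a)
    # Core Q-learning update: bootstrap from the greedy value of s_next.
    Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])

greedy = Q.argmax(axis=1)  # learned policy: move right toward the reward
```

DQN replaces the table `Q` with a neural network and samples `(s, a, r, s_next)` from a replay buffer, but the target term `r + gamma * max_a' Q(s', a')` is the same.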
RAD: Reinforcement Learning with Augmented Data
Code for paper "Computation Offloading Optimization for UAV-assisted Mobile Edge Computing: A Deep Deterministic Policy Gradient Approach"