Implementation of CORL for Fetch and Unitree A1 tasks
Updated Jun 5, 2024 - Python
Offline to Online RL: AWAC & IQL PyTorch Implementation
Code for Undergrad Final Year Project “Offline Risk-Averse Actor-Critic with Curriculum Learning”
Need 4 Speed, FYP 2023-24 @ Monash.
A framework for offline reinforcement learning, with implementations of SCQL and SCQL+D
🧠 Learning World Value Functions without Exploration
Codes accompanying the paper "On the Role of Discount Factor in Offline Reinforcement Learning" (ICML 2022)
Package for recording Transitions in OpenAI Gym Environments.
Author's repository for GSM8K-AI-SubQ reasoning dataset
Code to reproduce experiments from "User-Interactive Offline Reinforcement Learning" (ICLR 2023)
Direct port of TD3_BC to JAX using Haiku and optax.
Code for NeurIPS 2023 paper Accountability in Offline Reinforcement Learning: Explaining Decisions with a Corpus of Examples
Summarising research on offline RL in the federated setting.
PyTorch Implementation of Offline Reinforcement Learning algorithms
[MLHC 2021] Model Selection for Offline RL: Practical Considerations for Healthcare Settings. https://arxiv.org/abs/2107.11003
Code for Continuous Doubly Constrained Batch Reinforcement Learning, NeurIPS 2021.
Neural Laplace Control for Continuous-time Delayed Systems - an offline RL method combining Neural Laplace dynamics model and MPC planner to achieve near-expert policy performance in environments with irregular time intervals and an unknown constant delay.
Codes accompanying the paper "Offline Reinforcement Learning with Value-Based Episodic Memory" (ICLR 2022, https://arxiv.org/abs/2110.09796)
Official code for paper: Conservative objective models are a special kind of contrastive divergence-based energy model
D2C(Data-driven Control Library) is a library for data-driven control based on reinforcement learning.