n-armed bandit: Gittins index
The complexity of solving the MAB (multi-armed bandit) problem with general Markov decision theory grows exponentially in the number of bandit processes. Instead of solving the n-dimensional MDP whose state space is the product of the individual bandits' state spaces, the Gittins index approach reduces the problem to computing an index for each bandit process separately, as sketched below.
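A brief sketch of the idea, assuming the standard infinite-horizon discounted formulation (the excerpt itself does not fix the setting): the Gittins index of a single bandit process in state $x$, with discount factor $\beta \in (0,1)$, reward function $r$, and stopping time $\tau$, is

$$\nu(x) \;=\; \sup_{\tau > 0} \frac{\mathbb{E}\!\left[\sum_{t=0}^{\tau-1} \beta^{t}\, r(x_t) \,\middle|\, x_0 = x\right]}{\mathbb{E}\!\left[\sum_{t=0}^{\tau-1} \beta^{t} \,\middle|\, x_0 = x\right]}.$$

The Gittins index theorem states that playing, at every step, the bandit whose current state has the largest index $\nu(x)$ is optimal, so each index can be computed on that bandit's own state space alone rather than on the joint n-dimensional state space.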