
Policy Gradient

Definition

Policy gradient: a family of reinforcement-learning methods that directly optimize the policy (the probability distribution the agent uses to select actions) by taking the gradient of an objective with respect to the policy parameters, usually with the goal of maximizing the expected long-term return. Such methods are commonly used for continuous-action control and for tasks that require a stochastic policy. (The term also frequently refers to the policy gradient theorem or to the related family of algorithms.)
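
One common statement of the objective behind this definition is the score-function (REINFORCE) form of the policy gradient; this is a standard textbook result rather than anything specific to this entry. Here \theta denotes the policy parameters, \pi_\theta the policy, \tau a sampled trajectory, and R(\tau) its return:

    J(\theta) = \mathbb{E}_{\tau \sim \pi_\theta}[R(\tau)]
    \nabla_\theta J(\theta) = \mathbb{E}_{\tau \sim \pi_\theta}\left[\sum_t \nabla_\theta \log \pi_\theta(a_t \mid s_t)\, R(\tau)\right]

Gradient ascent on J(\theta) is what "taking the gradient with respect to the policy parameters" means in practice.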

Pronunciation (IPA)

/ˈpɒləsi ˈɡreɪdiənt/

Examples

The agent learns with a policy gradient method.

By estimating the policy gradient from sampled trajectories, the algorithm updates its neural-network policy to maximize expected return, using a baseline to reduce variance.
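
As a concrete illustration of that sentence, here is a minimal REINFORCE-with-baseline sketch in Python. The toy two-armed-bandit "environment" and every name in it (step, baseline, alpha, beta) are assumptions made for illustration, not part of any specific library:

    # Minimal REINFORCE with a moving-average baseline (illustrative sketch).
    import numpy as np

    rng = np.random.default_rng(0)

    def softmax(logits):
        z = logits - logits.max()
        p = np.exp(z)
        return p / p.sum()

    # Policy parameters: one logit per discrete action.
    theta = np.zeros(2)

    # Hypothetical one-step environment: action 1 pays more on average.
    def step(action):
        means = [0.0, 1.0]
        return rng.normal(means[action], 1.0)  # stochastic reward

    baseline = 0.0        # moving-average baseline to reduce gradient variance
    alpha, beta = 0.1, 0.05

    for episode in range(2000):
        probs = softmax(theta)
        a = rng.choice(2, p=probs)
        r = step(a)

        # Score function for a softmax policy: d/dtheta log pi(a) = one_hot(a) - probs.
        grad_log_pi = -probs
        grad_log_pi[a] += 1.0

        # Policy-gradient ascent step with baseline: (r - b) * grad log pi.
        theta += alpha * (r - baseline) * grad_log_pi
        baseline += beta * (r - baseline)

    print("action probabilities:", softmax(theta))  # should come to favor action 1

Subtracting the baseline does not bias the gradient estimate (its expectation against the score function is zero); it only shrinks the variance, which is why the sentence above pairs sampled-trajectory estimation with a baseline.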

Etymology

The term combines two parts: policy derives from Greek polis (city-state, public affairs) and entered English via Latin and French, coming to mean a course of action or decision-making guideline; gradient comes from Latin gradiens (walking, advancing step by step) and in mathematics denotes the direction of steepest change. Together they read as "optimizing a policy by following its gradient."

Related Words

  • reinforcement learning
  • REINFORCE
  • actor-critic
  • trust region policy optimization (TRPO)
  • proximal policy optimization (PPO)
  • deterministic policy gradient (DPG)

Literary Works

  • Richard S. Sutton & Andrew G. Barto, Reinforcement Learning: An Introduction (extensive coverage of policy gradients and the actor-critic framework)
  • Ronald J. Williams (1992), “Simple Statistical Gradient-Following Algorithms for Connectionist Reinforcement Learning” (introduces REINFORCE, the classic source for policy gradients)
  • John Schulman et al. (2015), “Trust Region Policy Optimization” (TRPO; a constrained optimization method built on policy gradients)
  • John Schulman et al. (2017), “Proximal Policy Optimization Algorithms” (PPO; a widely used policy-gradient variant)
  • David Silver et al. (2014), “Deterministic Policy Gradient Algorithms” (deterministic policy gradients for continuous-action control)