In the Train MBPO Agent to Balance Cart-Pole System example, what is the nextObs argument of cartPoleRewardFunction?
function reward = cartPoleRewardFunction(obs,action,nextObs)
% Compute reward value based on the next observation.
if iscell(nextObs)
    nextObs = nextObs{1};
end

% Distance at which to fail the episode
xThreshold = 2.4;

% Reward each time step the cart-pole is balanced
rewardForNotFalling = 1;

% Penalty when the cart-pole fails to balance
penaltyForFalling = -50;

x = nextObs(1,:);
distReward = 1 - abs(x)/xThreshold;

isDone = cartPoleIsDoneFunction(obs,action,nextObs);

reward = zeros(size(isDone));
reward(logical(isDone)) = penaltyForFalling;
reward(~logical(isDone)) = ...
    0.5 * rewardForNotFalling + 0.5 * distReward(~logical(isDone));
end
I really want to know where nextObs is passed into this function from. Why can't I find this variable in the main script?
If my environment is built from Simulink, how do I get the nextObs variable?
Accepted Answer
Ayush Aniket
on 28 Oct 2024 at 4:14
Hi Lin,
The nextObs variable holds the next state after the environment transitions from the current state under the action chosen by the Reinforcement Learning (RL) agent. During training with the train function, the step function is called implicitly; it takes the environment and the agent's action as input and returns three outputs: nextObs, reward, and isDone. These values are then passed to the reward function to compute the reward for the action taken.
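The implicit loop inside train can be sketched roughly as follows. This is illustrative MATLAB only, not the toolbox's actual internals; env and agent are assumed to already exist, and maxSteps is an illustrative episode length:

```matlab
% Sketch of the data flow inside training (simplified, not actual internals)
maxSteps = 500;
obs = reset(env);
for t = 1:maxSteps
    % Agent picks an action for the current observation
    action = cell2mat(getAction(agent,{obs}));
    % The environment step produces nextObs, reward, and isDone
    [nextObs,reward,isDone] = step(env,action);
    % With a custom reward function, the toolbox effectively computes:
    % reward = cartPoleRewardFunction(obs,action,nextObs);
    obs = nextObs;
    if isDone
        break
    end
end
```

So nextObs never appears in the main script: it is produced by the environment at each step and handed to the reward function internally.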
The Train MBPO Agent to Balance Continuous Cart-Pole System example uses an rlNeuralNetworkEnvironment object to create the environment. When constructing this object, you can provide a custom reward function as a function handle. Refer to the following documentation link for this input parameter:
Once a custom reward function handle is provided, it is implicitly fed the input arguments (obs,action,nextObs) during training.
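For instance, the environment construction looks roughly like this (a sketch; obsInfo, actInfo, and transitionFcn stand in for the observation specification, action specification, and transition function built earlier in the example, and the exact argument order should be checked against the rlNeuralNetworkEnvironment documentation):

```matlab
% Pass the custom reward and is-done functions as handles when
% constructing the model-based environment (illustrative sketch)
generativeEnv = rlNeuralNetworkEnvironment(obsInfo,actInfo, ...
    transitionFcn,@cartPoleRewardFunction,@cartPoleIsDoneFunction);
```

During training, the toolbox then calls cartPoleRewardFunction itself, supplying (obs,action,nextObs) at every transition.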
However, you can also evaluate this function yourself by calling the step function directly (which gives you the nextObs variable), as shown in the following documentation section:
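A minimal sketch of calling step manually to inspect nextObs, assuming env is an environment object (for example, one created with rlPredefinedEnv) and action is a valid action for it:

```matlab
% Take one step manually to see the outputs that train normally
% consumes internally
obs = reset(env);
[nextObs,reward,isDone] = step(env,action);
disp(nextObs)   % this is the observation the reward function sees as nextObs
```

This is also the way to get nextObs from a Simulink-based environment object: reset it, then step it with an action of your choice.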