Can the standard Soft Actor-Critic (SAC) algorithm in MATLAB be adapted for hybrid action spaces, or is there another reinforcement learning algorithm in MATLAB that is better suited?
The current implementation of the Soft Actor-Critic (SAC) algorithm in MATLAB is designed only for problems with purely continuous or purely discrete action spaces. I'm facing a problem that requires a hybrid action space, where some actions are continuous and others are discrete. I am aware that variants of SAC that accommodate hybrid action spaces have been proposed in the literature. Is it feasible to modify the standard MATLAB implementation to handle hybrid actions? If not, is there another reinforcement learning algorithm available in MATLAB that is suitable for addressing hybrid action spaces?
Answers (1)
Aastha
on 27 Mar 2025
I understand that you want to use the Soft Actor-Critic (SAC) algorithm with a hybrid action space.
The SAC algorithm in MATLAB supports hybrid action spaces. In this case, the actor takes the current observation as input and outputs both a categorical distribution (over the discrete actions) and a Gaussian distribution (over the continuous actions). A discrete action is then sampled from the categorical distribution, and a continuous action is sampled from the Gaussian distribution.
For more information about hybrid action spaces in a SAC setting, kindly refer to the “Actor and Critic Function Approximators” section of the SAC agent page in the MathWorks documentation.
To use the SAC algorithm with a hybrid action space, you can create your own actor and critic objects and use them to create your agent. You may do so using the “rlHybridStochasticActor” MATLAB object for the actor and the “rlVectorQValueFunction” object for the critic; a short sketch showing how these pieces fit together follows the documentation links below.
Kindly refer to the MathWorks documentation linked below for more information on creating the actor and critic:
- Actor: https://www.mathworks.com/help/reinforcement-learning/ref/rlhybridstochasticactor.html
- Critic: https://www.mathworks.com/help/reinforcement-learning/ref/rl.function.rlvectorqvaluefunction.html
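For concreteness, here is a minimal sketch of how these pieces might fit together for a hypothetical problem with a 4-dimensional observation, one discrete action with three choices, and one bounded continuous action. The layer sizes, layer names, and several constructor details are illustrative assumptions only; in particular, how the hybrid action specification is formed and whether the actor and critic constructors require additional name-value arguments to identify network inputs and outputs may differ by release, so verify everything against the reference pages linked above.

% Minimal sketch of a hybrid-action SAC agent (assumptions flagged in comments).

% Hypothetical problem: 4-D observation, one discrete action with 3 choices,
% and one continuous action bounded in [-1, 1].
obsInfo = rlNumericSpec([4 1]);
actInfo = [rlFiniteSetSpec([1 2 3]), ...
           rlNumericSpec([1 1],LowerLimit=-1,UpperLimit=1)];
% Assumption: the hybrid action space is described by an array holding both an
% rlFiniteSetSpec (discrete part) and an rlNumericSpec (continuous part).

% Critic network: observation and continuous action in, one Q-value per
% discrete action out.
obsPath = [featureInputLayer(4,Name="obsIn")
           fullyConnectedLayer(64)
           reluLayer(Name="obsFeat")];
actPath = [featureInputLayer(1,Name="contActIn")
           fullyConnectedLayer(64)
           reluLayer(Name="actFeat")];
qPath   = [concatenationLayer(1,2,Name="concat")
           fullyConnectedLayer(64)
           reluLayer
           fullyConnectedLayer(3,Name="qValues")];   % one Q-value per discrete choice
criticLG = layerGraph(obsPath);
criticLG = addLayers(criticLG,actPath);
criticLG = addLayers(criticLG,qPath);
criticLG = connectLayers(criticLG,"obsFeat","concat/in1");
criticLG = connectLayers(criticLG,"actFeat","concat/in2");
% Depending on your release, rlVectorQValueFunction may also need name-value
% arguments mapping network inputs to the observation and action channels.
critic = rlVectorQValueFunction(dlnetwork(criticLG),obsInfo,actInfo);

% Actor network: observation in; categorical probabilities plus Gaussian mean
% and standard deviation out.
commonPath = [featureInputLayer(4,Name="obs")
              fullyConnectedLayer(64)
              reluLayer(Name="feat")];
discPath = [fullyConnectedLayer(3,Name="discFC")
            softmaxLayer(Name="discProb")];          % categorical distribution
meanPath = fullyConnectedLayer(1,Name="contMean");   % Gaussian mean
stdPath  = [fullyConnectedLayer(1,Name="stdFC")
            softplusLayer(Name="contStd")];          % Gaussian std, kept positive
actorLG = layerGraph(commonPath);
actorLG = addLayers(actorLG,discPath);
actorLG = addLayers(actorLG,meanPath);
actorLG = addLayers(actorLG,stdPath);
actorLG = connectLayers(actorLG,"feat","discFC");
actorLG = connectLayers(actorLG,"feat","contMean");
actorLG = connectLayers(actorLG,"feat","stdFC");
% Assumption: the constructor may require name-value arguments identifying the
% discrete, mean, and standard-deviation output layers -- see its reference page.
actor = rlHybridStochasticActor(dlnetwork(actorLG),obsInfo,actInfo);

% Assemble the SAC agent from the custom actor and critic.
agent = rlSACAgent(actor,critic);

Once the agent is created, it can be trained as usual with the train function against an environment whose observation and action specifications match obsInfo and actInfo.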
Hope this is helpful!