Reinforcement Learning multiple agent validation: Can I have a Simulink model host TWO agents and test them?

Hi,
I am conducting research to see how PPO performs versus DDPG for non-linear plants, and I have trained two agents.
Can I have a Simulink model host TWO agents and test them? Basically, I am trying to create a unified validation bench. Please see the image below.
I went through the documentation and tried the implementation below, but I am getting errors.
Code:
% Set Simulink model pointers
VALVE_SIMULATION_MODEL = 'sm_Experimental_Setup'; % Simulink experimentation model
DDPG_AGENT = '/DDPG Sub-System/DDPG_Agent';
PPO_AGENT = '/PPO Sub-System/PPO_Agent';
% Load the pre-trained agents
DDPG_agent = load(DDPG_MODEL_FILE,'agent');
PPO_agent = load(PPO_MODEL_FILE,'agent');
% Code here for setting (1) obsInfo, (2) actionInfo_DDPG and (3) actionInfo_PPO
% .... ...
% Initialise the environment with the pre-trained agents and run the test
env = rlSimulinkEnv(VALVE_SIMULATION_MODEL, [DDPG_AGENT PPO_AGENT], [obsInfo obsInfo], [actionInfo_DDPG actionInfo_PPO]);
simOpts = rlSimulationOptions('MaxSteps', 2000);
xpr = sim(env, [DDPG_agent.agent, PPO_agent.agent], simOpts);
ERROR message:
Error using rlSimulinkEnv (line 108)
No block diagram name specified.
Error in code_DDPG_PPO_Experimental_Setup (line 97)
env = rlSimulinkEnv(VALVE_SIMULATION_MODEL, [DDPG_AGENT PPO_AGENT], [obsInfo obsInfo], [actionInfo_DDPG actionInfo_PPO]);
Screen capture of Simulink model:
[Image: Simulink model]

Accepted Answer

Emmanouil Tzorakoleftherakis
That should be possible. Did you follow the multi-agent examples in the documentation? Since the agents are already trained, you may want to check the last part of those examples, where the agents are simulated.
By the way, the error you are getting makes me think that the paths you provide below are not complete:
DDPG_AGENT = '/DDPG Sub-System/DDPG_Agent';
PPO_AGENT = '/PPO Sub-System/PPO_Agent';
Try adding the model name too.
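For example, something along these lines (an untested sketch based on the names in your post; obsInfo, actionInfo_DDPG and actionInfo_PPO are the specification objects you already created):
VALVE_SIMULATION_MODEL = 'sm_Experimental_Setup';
% Block paths must start with the block diagram (model) name
DDPG_AGENT = [VALVE_SIMULATION_MODEL '/DDPG Sub-System/DDPG_Agent'];
PPO_AGENT = [VALVE_SIMULATION_MODEL '/PPO Sub-System/PPO_Agent'];
% For multiple agents, pass the block paths together with matching
% observation and action specifications, one entry per agent
env = rlSimulinkEnv(VALVE_SIMULATION_MODEL, ...
    {DDPG_AGENT, PPO_AGENT}, ...
    {obsInfo, obsInfo}, ...
    {actionInfo_DDPG, actionInfo_PPO});
% Simulate both pre-trained agents against the same plant
simOpts = rlSimulationOptions('MaxSteps', 2000);
xpr = sim(env, [DDPG_agent.agent, PPO_agent.agent], simOpts);
rlSimulinkEnv needs the full path so it can resolve each RL Agent block inside the model; a path starting with '/' has no block diagram name in front of it, which is exactly what the error message says.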
Hope this helps
