Community Profile

Rajesh Siraskar


Coventry University

Last seen: 2 years ago | Active since 2019

- M.Tech. (Automotive Electronics), Coventry University
- Works as a Data Scientist at KPIT/Birlasoft
- Machine Learning using Python and MATLAB

Statistics

  • 5-Star Galaxy Level 1
  • Personal Best Downloads Level 2
  • Thankful Level 3
  • First Submission
  • GitHub Submissions Level 1

Content Feed

Answered
Error: Undefined function 'getActionInfo' for input arguments of type 'struct'.
Hi Emmanouil, below is my full "simulation" code. Basically I have trained two models using PPO and DDPG and am trying to...

2 years ago | 0
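
The comparison run described above might look roughly like the sketch below, assuming the two trained agents were saved to .mat files (file and variable names are illustrative), that env is the same environment object used for training, and that the experience output exposes the reward as a timeseries, as in recent Reinforcement Learning Toolbox releases:

    % Hedged sketch: run two previously trained agents on the same environment
    % and compare their episode rewards. File and variable names are assumed.
    S = load('agentDDPG.mat');  ddpgAgent = S.agent;   % hypothetical file
    S = load('agentPPO.mat');   ppoAgent  = S.agent;   % hypothetical file

    simOpts = rlSimulationOptions('MaxSteps', 500);

    expDDPG = sim(env, ddpgAgent, simOpts);   % env: the training environment
    expPPO  = sim(env, ppoAgent,  simOpts);

    % Total reward collected by each agent over one episode
    fprintf('DDPG episode reward: %.2f\n', sum(expDDPG.Reward.Data));
    fprintf('PPO  episode reward: %.2f\n', sum(expPPO.Reward.Data));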

Answered
Error: Undefined function 'getActionInfo' for input arguments of type 'struct'.
Hello Emmanouil, thank you for your help. The error is not in the code, as that runs fine. It is the simulation run that genera...

2 years ago | 0

Question


Error: Undefined function 'getActionInfo' for input arguments of type 'struct'.
Hi, this would work previously. I now get an error when I try to test an RL agent. Is this an issue of the expected data type? I hav...

2 years ago | 2 answers | 0

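For reference, this particular message usually means a plain struct (for example, the output of load) reached getActionInfo instead of an agent or environment object; a small illustration with a hypothetical file name:

    % 'Undefined function getActionInfo for input arguments of type struct'
    % typically appears when the struct returned by load is used directly.
    S = load('trainedAgent.mat');         % load returns a struct, not the agent
    % getActionInfo(S)                    % --> reproduces the error above

    agent   = S.agent;                    % extract the saved agent object first
    actInfo = getActionInfo(agent);       % action specification of the agent
    obsInfo = getObservationInfo(agent);  % same pattern for observations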

Question


Reinforcement Learning multiple agent validation: Can I have a Simulink model host TWO agents and test them?
Hi, I am conducting research to see how PPO performs versus DDPG for non-linear plants. I have trained two agents. Can I hav...

3 years ago | 1 answer | 1

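If the installed release supports multi-agent Simulink environments (roughly R2020b onward), one way to host two agents in a single model is to reference two RL Agent blocks when creating the environment; model, block and variable names below are hypothetical:

    % Hedged sketch: one Simulink model containing two RL Agent blocks,
    % one wired to the PPO loop and one to the DDPG loop.
    mdl = "valveControl";                                   % hypothetical model
    agentBlocks = [mdl + "/PPO Agent", mdl + "/DDPG Agent"];

    % One observation/action specification per agent block, in the same order
    env = rlSimulinkEnv(mdl, agentBlocks, ...
                        {obsInfoPPO, obsInfoDDPG}, {actInfoPPO, actInfoDDPG});

    % Simulate both trained agents in the same model (assuming the release
    % accepts the two agent objects grouped into one array)
    simOpts = rlSimulationOptions('MaxSteps', 1000);
    experiences = sim(env, [ppoAgent, ddpgAgent], simOpts);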

Submitted


Reinforcement Learning for Control of Non-Linear Valves
Apply DDPG for "optimal" control of non-linear valves. Can be adapted for other simulated plants.

3 years ago | 5 downloads

Question


Toolbox .zip / .tar files: Where can I find Toolbox installables?
Hi, I need to install the RL toolbox directly from a .zip/.tar file. Where can I find Toolbox installables, please? Thanks for ...

3 years ago | 1 answer | 0

Question


CodeOcean: how do I install Toolboxes from .tar files?
Hello, I am submitting MATLAB code that came out of my research on Reinforcement Learning to CodeOcean (https://codeocean.co...

3 years ago | 1 answer | 0

Answered
How do I save Episode Manager training data for *plotting* later
Thank you Asvin and Emmanouil. I didn't realize that I could store trainingStats and do anything with the data later. Awes...

3 years ago | 1 | accepted
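
A rough illustration of the idea above; the file name is made up, and the field names (EpisodeIndex, EpisodeReward, AverageReward) assume a recent Reinforcement Learning Toolbox release:

    % Keep the Episode Manager statistics returned by train for later plotting
    trainingStatsDDPG = train(ddpgAgent, env, trainOpts);
    save('trainingStatsDDPG.mat', 'trainingStatsDDPG');   % hypothetical file name

    % ... later, possibly in a fresh MATLAB session ...
    S = load('trainingStatsDDPG.mat');
    stats = S.trainingStatsDDPG;

    plot(stats.EpisodeIndex, stats.EpisodeReward); hold on
    plot(stats.EpisodeIndex, stats.AverageReward, 'LineWidth', 2); hold off
    xlabel('Episode'); ylabel('Reward');
    legend('Episode reward', 'Average reward');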

Question


How do I save Episode Manager training data for *plotting* later
Hi, let's say I am using two algorithms (DDPG and PPO) to train for a task. How do I save the data to plot a comparison later? T...

3 years ago | 2 answers | 0

Question


Linear Analyzer: PID + Valve with delay
Hello, I am using the Linear Analyzer to analyze a simple PID + valve system and am facing the following issue. The plant is a ...

4 years ago | 1 answer | 0

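Without the full question the exact issue is unclear, but one common wrinkle when linearizing a loop that contains a transport delay is that the delay is approximated by default; a hedged sketch using Simulink Control Design, with hypothetical model and block names:

    % Hedged sketch: linearize a PID + valve loop containing a Transport Delay
    mdl = 'pidValveLoop';                                  % hypothetical model
    io(1) = linio([mdl '/PID Controller'], 1, 'input');    % loop input
    io(2) = linio([mdl '/Valve'], 1, 'openoutput');        % plant output

    opts = linearizeOptions('UseExactDelayModel', 'on');   % keep the delay exact
    sys  = linearize(mdl, io, opts);

    bode(sys); grid on     % frequency response including the delay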

Question


How do I count the number of times zero is being crossed by a signal?
Hi, I am trying to build a control system and I want to count the oscillations in a time span of, say, 100 seconds. Can I count ...

4 years ago | 2 answers | 0

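For a signal already logged to the MATLAB workspace, the count can be obtained from sign changes; a small sketch with a made-up example signal:

    % Count zero crossings of a signal over a 100-second window
    t = 0:0.01:100;                              % time vector, 100 s span
    x = sin(2*pi*0.2*t) + 0.1*randn(size(t));    % example oscillating signal

    s = sign(x);
    s(s == 0) = 1;                               % treat exact zeros as positive
    numCrossings = nnz(diff(s));                 % a sign change = one crossing

    fprintf('Zero crossings in %g s: %d\n', t(end), numCrossings);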

Answered
DDPG Agent: Not stabilizing, creating an unstable model
Based on several rounds of training, my personal observation is that RL will converge initially to an optimal expected value. A...

4 years ago | 0

Question


DDPG Agent: Not stabilizing, creating an unstable model
Dear MATLAB, I am training a DDPG agent on randomly set straight lines (levels) and later testing on a benchmark waveform. Should...

4 years ago | 1 answer | 0

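For context, training on randomly set levels in a Simulink environment is usually done with a reset function that randomizes the reference each episode; a hedged sketch with hypothetical block and variable names:

    % Hedged sketch: randomize the level setpoint at the start of each episode
    env = rlSimulinkEnv(mdl, agentBlk, obsInfo, actInfo);

    % The reset function receives a Simulink.SimulationInput object; here it
    % writes a random setpoint into the model workspace before each episode.
    env.ResetFcn = @(in) setVariable(in, 'levelSetpoint', 20 + 60*rand, ...
                                     'Workspace', mdl);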

Answered
How to TRAIN further a previously trained agent?
Hi Sourav, I figured it out after reading the documentation more carefully! I also need to set the ResetExperienceBufferBeforeT...

4 years ago | 5 | accepted
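
For anyone landing here later, the options mentioned above look roughly like this (property names as in the releases where these flags lived on the DDPG agent options; the file name and new reward target are hypothetical):

    % Continue training a previously trained DDPG agent without wiping its buffer
    S = load('trainedAgent.mat');                    % hypothetical file name
    agent = S.agent;

    agent.AgentOptions.ResetExperienceBufferBeforeTraining = false;
    agent.AgentOptions.SaveExperienceBufferWithAgent = true;

    % Raise the stopping target so that training actually continues
    trainOpts = rlTrainingOptions( ...
        'MaxEpisodes', 2000, ...
        'StopTrainingCriteria', 'AverageReward', ...
        'StopTrainingValue', newTargetReward);       % hypothetical new target

    trainingStats = train(agent, env, trainOpts);    % resumes from current weights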

Question


How to TRAIN further a previously trained agent?
Hi, my agent was programmed to stop after reaching an average reward of X. How do I load it and extend the training further? I di...

4 years ago | 4 answers | 2

Question


DDPG Control for non-linear plants - Q0 does not converge even after 5,000 episodes
Dear MATLAB, Firstly, I must say bringing RL into the MATLAB platform and having the capability to integrate with Simulink is just so ...

4 years ago | 1 answer | 1

Question


Are Simscape models idealistic or realistic?
Hello, do valve components in Simscape Fluids exhibit non-linearity just like real valves? I want to model a realistic system...

4 years ago | 1 answer | 0

Question


Simscape Fluids Valve models: Do they exhibit non-linearity too?
Hello, Simscape Fluids valve models: non-linearity. I want to model a realistic system and therefore want to view the effects of a...

4 years ago | 1 answer | 0
