MATLAB Answers


Why do I receive an error "java.lang.OutOfMemoryError: unable to create new native thread" while running my parallel jobs on a cluster?

I receive the following errors when I try to run my jobs on my cluster:
Error using parallel.Job/submit (line 304)
Java exception occurred:
java.lang.OutOfMemoryError: unable to create new native thread
Why do I receive this error and how can I resolve this?

1 Answer

Answer by MathWorks Support Team on 4 Jun 2019
 Accepted Answer

On most Linux-based systems, the default ulimit for the shell is set to a low number, which can cause this error.
The ulimit command provides control over the resources available to the shell and to the processes it starts, on systems that allow such control.
To check this, run "ulimit -a" in a terminal and look at the "max user processes" (i.e., "nproc") field.
This field often defaults to a value such as 1024. Because Linux counts threads and processes against the same limit, a value of 1024 means you cannot have more than 1024 processes and threads in total, so increasing this value may resolve the error. You can also query just this limit directly, as shown below.
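For example, assuming a bash-compatible shell, the following prints only the "max user processes" value:
ulimit -u
If this prints a low number such as 1024, the limit is likely the cause of the error.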
To change the value of "max user processes", you can use the following command in the Linux shell:
ulimit -u 63536
When you run "ulimit -a" again, you should see the new "max user processes" value.
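Note that "ulimit -u" only affects the current shell session and the processes launched from it. To make the change persist across logins, one common approach (a sketch, assuming a typical Linux system that applies PAM limits; the user name and value below are examples only) is to add nproc entries to /etc/security/limits.conf as root:
# /etc/security/limits.conf (example user and value)
myuser    soft    nproc    63536
myuser    hard    nproc    63536
After editing the file, log out and back in so the new limit takes effect before resubmitting the parallel jobs.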
