> Memory error while training

As we know, our dataset is huge. I was trying to develop a first model, but I ran into this error:

`MemoryError: Unable to allocate 366. TiB for an array with shape (7095591, 7095591) and data type float64`
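(For reference, the size in the message follows directly from the shape and dtype it reports: the step that failed tried to build a square matrix with one row and one column per sample.)

```python
# Why the allocation fails: an N x N float64 matrix for N = 7,095,591
# needs N * N * 8 bytes, far beyond any machine's RAM.
n = 7_095_591
bytes_needed = n * n * 8      # float64 = 8 bytes per element
tib = bytes_needed / 2**40    # 1 TiB = 2**40 bytes
print(f"{tib:.0f} TiB")       # ~366 TiB, matching the error message
```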

Any suggestions on how I can fix it?

Posted by: joseeduardou98 @ Sept. 1, 2021, 5:19 a.m.

This may have to do with an issue known as memory overcommitment; there are some recommendations for solving it depending on the OS you are using.
Take a look at the link below, and at the Stack Overflow answer it references too. Hope it helps.

https://www.researchgate.net/post/How_i_can_fix_this_problem_for_python_jupyter_Unable_to_allocate_104_GiB_for_an_array_with_shape_50000_223369_and_data_type_int8
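On Linux, the overcommit behavior mentioned above is controlled by a kernel setting. A minimal sketch (requires root; note that overcommit only lets the allocation succeed, so the pages you actually write to must still fit in RAM plus swap):

```shell
# Show the current overcommit mode: 0 = heuristic (the default)
cat /proc/sys/vm/overcommit_memory

# Set mode 1 = always overcommit, so huge allocations are not refused up front
echo 1 | sudo tee /proc/sys/vm/overcommit_memory
```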

Posted by: bluemirrors @ Sept. 1, 2021, 5:59 a.m.