
[Solved] huggingface/tokenizers: The current process just got forked. after parallelism has already been used. Disabling parallelism to avoid deadlocks

Last Updated on 2022-08-03 by Clay

Problem

Today, while training a model with the simpletransformers package, I got a warning message I had never seen before:

huggingface/tokenizers: The current process just got forked. after parallelism has already been used. Disabling parallelism to avoid deadlocks.


The warning comes from the Hugging Face tokenizers library. It says that the current process was forked after tokenizer parallelism had already been used, and that parallelism is being disabled to avoid deadlocks.
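For reference, here is a minimal sketch of how this situation typically arises: a fast tokenizer runs in the parent process, and then a PyTorch DataLoader with num_workers > 0 forks worker processes. The transformers and torch imports and the "bert-base-uncased" model name are illustrative assumptions, not taken from the original post.

from torch.utils.data import DataLoader
from transformers import AutoTokenizer

# Fast (Rust-backed) tokenizer; batch encoding can use internal parallelism
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encodings = tokenizer(["first sentence", "second sentence"], padding=True)

# Forking DataLoader workers after the tokenizer has run triggers the warning
loader = DataLoader(list(range(8)), batch_size=2, num_workers=2)
for batch in loader:
    pass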

I used to take warnings lightly ("if it's only a warning, it's probably fine~"), but this one talks about deadlocks, and the worst case I can imagine is training getting stuck.

Nobody wants to kick off model training before leaving work, only to come in the next day and find that, oops, the model is stuck.


Solution

There are roughly two solutions:

  1. Ignore it
  2. Disable parallelization

There is not much to say about ignoring it (although the warning keeps popping up and makes it hard to read the training progress), so let's look at how to disable parallelism.

Hide the warning message

You can add the following setting at the top of your Python program:

import os

# Disable tokenizer parallelism to silence the warning and avoid fork deadlocks
os.environ["TOKENIZERS_PARALLELISM"] = "false"
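For context, this is roughly where the setting fits in a training script; it is set before the tokenizer or model is created. The simpletransformers ClassificationModel call and the tiny DataFrame below are only an illustrative sketch, not taken from the original post.

import os
os.environ["TOKENIZERS_PARALLELISM"] = "false"  # set before any tokenizer is created

import pandas as pd
from simpletransformers.classification import ClassificationModel

# Illustrative data and model; the texts, labels, and model name are placeholders
train_df = pd.DataFrame({"text": ["good", "bad"], "labels": [1, 0]})
model = ClassificationModel("bert", "bert-base-uncased", use_cuda=False)
model.train_model(train_df)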


