Possible Causes and Solutions

Large Embeddings Don't Fit Into GPU RAM

  • Typical GPU RAM tops out at around 12 GB or 16 GB, so a large embedding matrix may not fit.

Solutions

  • Use the CPU instead of the GPU, since the CPU has a much larger addressable memory space (2 TB, 4 TB, etc.)
  • Decrease the overall vocabulary and corpus size
  • Use generators instead of an Embedding layer
  • Set trainable=False to avoid retraining your embedding, and apply a trainable linear transformation on top instead (see the sketch after this list)
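
A minimal sketch of the last two points, assuming a Keras workflow. The vocabulary size, embedding dimension, and layer names below are placeholders for illustration, not values from this article:

```python
import numpy as np
import tensorflow as tf

# Placeholder pre-trained embedding matrix (assumed shape, not from the article).
vocab_size, embedding_dim = 50_000, 300
embedding_matrix = np.random.rand(vocab_size, embedding_dim).astype("float32")

# Keep the large embedding table in host (CPU) RAM instead of GPU memory.
with tf.device("/CPU:0"):
    embedding_layer = tf.keras.layers.Embedding(
        vocab_size,
        embedding_dim,
        embeddings_initializer=tf.keras.initializers.Constant(embedding_matrix),
        trainable=False,  # avoid retraining (and storing gradients for) the embedding
    )

model = tf.keras.Sequential([
    embedding_layer,
    tf.keras.layers.GlobalAveragePooling1D(),
    # Trainable linear transformation on top of the frozen embedding.
    tf.keras.layers.Dense(128, activation=None),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
```

Freezing the embedding means its weights never need gradient buffers, and pinning it to the CPU keeps only the looked-up rows on the GPU per batch.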

Single GPU Can't Fit TensorFlow Operations

  • TensorFlow's out-of-the-box configuration optimizes performance by scheduling many operations in parallel, which can exceed what a single GPU can hold in memory at once

Solutions
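
The article does not list specific steps here; the following sketch shows two common workarounds (an illustrative assumption, not the article's prescribed fix): let TensorFlow allocate GPU memory incrementally, and place memory-heavy operations on the CPU explicitly.

```python
import tensorflow as tf

# Allocate GPU memory as needed instead of reserving it all up front.
for gpu in tf.config.list_physical_devices("GPU"):
    tf.config.experimental.set_memory_growth(gpu, True)

# Alternatively, pin memory-heavy operations to the CPU.
with tf.device("/CPU:0"):
    big = tf.random.uniform((20_000, 20_000))  # placeholder workload
    result = tf.matmul(big, big, transpose_b=True)
```

Note that memory growth must be configured before the GPUs are initialized, i.e. before any operation runs on them.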
