CRITICAL WORKER TIMEOUT error in Gunicorn with Django

I am trying to train a word2vec model, save it, and then create some clusters based on that model. It works fine locally, but when I build a Docker image and run it with Gunicorn, it always gives me a timeout error. I have tried the solutions described here, but they did not work for me.
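To give a bit more context, the training-and-saving step that produces the gensim log output below looks roughly like this. This is only a simplified sketch: the view name and the sentence-loading helper are placeholders, but the Word2Vec parameters and the save path match what appears in the log.

    # Simplified sketch of the training step (view name and corpus loader are placeholders)
    from django.http import JsonResponse
    from gensim.models import Word2Vec

    def load_sentences():
        # Placeholder: in the real project this reads and tokenizes the review corpus
        return [["sample", "tokens"], ["more", "sample", "tokens"]]

    def train_view(request):
        sentences = load_sentences()
        # Parameter values correspond to those visible in the gensim log below
        model = Word2Vec(sentences, size=200, window=4, min_count=1,
                         sample=0.001, sg=0, hs=0, negative=5, workers=4)
        model.save('trained_models/mobile')  # save path seen in the log
        return JsonResponse({'status': 'model trained and saved'})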

I am using:

Python 3.5, Gunicorn 19.7.1, gevent 1.2.2, eventlet 0.21.0

Here is my gunicorn.conf file:

    #!/bin/bash
    # Start Gunicorn processes
    echo Starting Gunicorn.
    exec gunicorn ReviewsAI.wsgi:application \
        --bind 0.0.0.0:8000 \
        --worker-class eventlet \
        --workers 1 \
        --timeout 300000 \
        --graceful-timeout 300000 \
        --keep-alive 300000
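For reference, since the file above is really a shell start script rather than a config file that Gunicorn parses itself, the same settings can also be written as a Python config file loaded with gunicorn -c. This is just an equivalent sketch of the flags above, not a different setup (note that the config-file key for --keep-alive is keepalive):

    # gunicorn.conf.py -- equivalent of the command-line flags in the start script above
    bind = "0.0.0.0:8000"
    worker_class = "eventlet"
    workers = 1
    timeout = 300000
    graceful_timeout = 300000
    keepalive = 300000  # config-file name for --keep-alive

It would be started with: gunicorn -c gunicorn.conf.py ReviewsAI.wsgi:application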

I have also tried the gevent and sync worker classes, but that did not work either. Can anyone tell me why this timeout error keeps happening? Thanks.

Here are my logs:

    Starting Gunicorn.
    [2017-11-10 06:03:45 +0000] [1] [INFO] Starting gunicorn 19.7.1
    [2017-11-10 06:03:45 +0000] [1] [INFO] Listening at: http://0.0.0.0:8000 (1)
    [2017-11-10 06:03:45 +0000] [1] [INFO] Using worker: eventlet
    [2017-11-10 06:03:45 +0000] [8] [INFO] Booting worker with pid: 8
    2017-11-10 06:05:00,307 : INFO : collecting all words and their counts
    2017-11-10 06:05:00,309 : INFO : PROGRESS: at sentence #0, processed 0 words, keeping 0 word types
    2017-11-10 06:05:00,737 : INFO : collected 11927 word types from a corpus of 1254665 raw words and 126 sentences
    2017-11-10 06:05:00,738 : INFO : Loading a fresh vocabulary
    2017-11-10 06:05:00,916 : INFO : min_count=1 retains 11927 unique words (100% of original 11927, drops 0)
    2017-11-10 06:05:00,917 : INFO : min_count=1 leaves 1254665 word corpus (100% of original 1254665, drops 0)
    2017-11-10 06:05:00,955 : INFO : deleting the raw counts dictionary of 11927 items
    2017-11-10 06:05:00,957 : INFO : sample=0.001 downsamples 59 most-common words
    2017-11-10 06:05:00,957 : INFO : downsampling leaves estimated 849684 word corpus (67.7% of prior 1254665)
    2017-11-10 06:05:00,957 : INFO : estimated required memory for 11927 words and 200 dimensions: 25046700 bytes
    2017-11-10 06:05:01,002 : INFO : resetting layer weights
    2017-11-10 06:05:01,242 : INFO : training model with 4 workers on 11927 vocabulary and 200 features, using sg=0 hs=0 sample=0.001 negative=5 window=4
    2017-11-10 06:05:02,294 : INFO : PROGRESS: at 6.03% examples, 247941 words/s, in_qsize 0, out_qsize 7
    2017-11-10 06:05:03,423 : INFO : PROGRESS: at 13.65% examples, 269423 words/s, in_qsize 0, out_qsize 7
    2017-11-10 06:05:04,670 : INFO : PROGRESS: at 23.02% examples, 286330 words/s, in_qsize 8, out_qsize 11
    2017-11-10 06:05:05,745 : INFO : PROGRESS: at 32.70% examples, 310218 words/s, in_qsize 0, out_qsize 7
    2017-11-10 06:05:07,054 : INFO : PROGRESS: at 42.06% examples, 308128 words/s, in_qsize 8, out_qsize 11
    2017-11-10 06:05:08,123 : INFO : PROGRESS: at 51.75% examples, 320675 words/s, in_qsize 0, out_qsize 7
    2017-11-10 06:05:09,355 : INFO : PROGRESS: at 61.11% examples, 320556 words/s, in_qsize 8, out_qsize 11
    2017-11-10 06:05:10,436 : INFO : PROGRESS: at 70.79% examples, 328012 words/s, in_qsize 0, out_qsize 7
    2017-11-10 06:05:11,663 : INFO : PROGRESS: at 80.16% examples, 327237 words/s, in_qsize 8, out_qsize 11
    2017-11-10 06:05:12,752 : INFO : PROGRESS: at 89.84% examples, 332298 words/s, in_qsize 0, out_qsize 7
    2017-11-10 06:05:13,784 : INFO : PROGRESS: at 99.21% examples, 336724 words/s, in_qsize 0, out_qsize 9
    2017-11-10 06:05:13,784 : INFO : worker thread finished; awaiting finish of 3 more threads
    2017-11-10 06:05:13,784 : INFO : worker thread finished; awaiting finish of 2 more threads
    2017-11-10 06:05:13,784 : INFO : worker thread finished; awaiting finish of 1 more threads
    2017-11-10 06:05:13,784 : INFO : worker thread finished; awaiting finish of 0 more threads
    2017-11-10 06:05:13,784 : INFO : training on 6273325 raw words (4248672 effective words) took 12.5s, 339100 effective words/s
    2017-11-10 06:05:13,785 : INFO : saving Word2Vec object under trained_models/mobile, separately None
    2017-11-10 06:05:13,785 : INFO : not storing attribute syn0norm
    2017-11-10 06:05:13,785 : INFO : not storing attribute cum_table
    2017-11-10 06:05:14,026 : INFO : saved trained_models/mobile
    [2017-11-10 06:05:43 +0000] [1] [CRITICAL] WORKER TIMEOUT (pid:8)
    2017-11-10 06:05:43,712 : INFO : precomputing L2-norms of word weight vectors
    [2017-11-10 06:05:44 +0000] [14] [INFO] Booting worker with pid: 14