Pitfalls encountered in Celery development
Connecting Celery to Redis
When using Redis as the broker and the Redis connection requires a password: BROKER_URL = 'redis://:xxxxx@127.0.0.1:6379/0', where xxxxx is the password. The colon before the password is mandatory (the empty field in front of it is the username).
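As a quick sanity check of that URL format, the standard library shows where the password sits. This is a minimal sketch; `xxxxx` is just the placeholder from the text, not a real password:

```python
from urllib.parse import urlparse

# redis://:password@host:port/db -- the empty field before the colon is
# the username, so the colon must be kept even when there is no username.
BROKER_URL = 'redis://:xxxxx@127.0.0.1:6379/0'

parts = urlparse(BROKER_URL)
print(parts.password)  # xxxxx
print(parts.hostname)  # 127.0.0.1
print(parts.port)      # 6379
```

If the colon is dropped, `xxxxx` is parsed as the username instead, and Redis rejects the connection.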
Error: Celery ValueError: not enough values to unpack (expected 3, got 0)
test.py
import time
from celery import Celery

broker = 'redis://localhost:6379'
backend = 'redis://localhost:6379/0'

celery = Celery('my_task', broker=broker, backend=backend)

@celery.task
def add(x, y):
    time.sleep(2.0)
    return x + y
test1.py
from test import add

result = add.delay(2, 8)
while 1:
    if result.ready():
        print(result.get())
        break
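The busy `while 1` loop in test1.py spins at full speed while it waits. A gentler variant, sketched below with a hypothetical helper `wait_for_result` that works with any object exposing the AsyncResult-style ready()/get() interface:

```python
import time

def wait_for_result(result, timeout=10.0, interval=0.1):
    """Poll an AsyncResult-like object until it is ready or time runs out."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if result.ready():
            return result.get()
        time.sleep(interval)  # yield the CPU between polls
    raise TimeoutError('task did not finish within %.1fs' % timeout)
```

With the task above it would be called as `print(wait_for_result(add.delay(2, 8)))`. In practice Celery's own `result.get(timeout=...)` does the same blocking wait and is usually enough.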
Reproducing the error
1. Run the worker:

celery -A test worker --loglevel=info

Output:
(anaconda) C:\Pycham\redis>celery -A test worker --loglevel=info

 -------------- celery@BOS3UA7Y740V4W9 v4.3.0 (rhubarb)
---- **** -----
--- * ***  * -- Windows-10-10.0.17763-SP0 2019-06-01 17:02:01
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         my_task:0x2200a35b128
- ** ---------- .> transport:   redis://:**@localhost:6379//
- ** ---------- .> results:     redis://:**@localhost:6379/0
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]
  . test.add

[2019-06-01 17:02:01,524: INFO/MainProcess] Connected to redis://:**@localhost:6379//
[2019-06-01 17:02:01,556: INFO/MainProcess] mingle: searching for neighbors
[2019-06-01 17:02:02,620: INFO/MainProcess] mingle: all alone
[2019-06-01 17:02:02,759: INFO/MainProcess] celery@BOS3UA7Y740V4W9 ready.
[2019-06-01 17:02:03,309: INFO/SpawnPoolWorker-2] child process 16140 calling self.run()
[2019-06-01 17:02:03,333: INFO/SpawnPoolWorker-4] child process 10908 calling self.run()
[2019-06-01 17:02:03,372: INFO/SpawnPoolWorker-3] child process 2400 calling self.run()
[2019-06-01 17:02:03,434: INFO/SpawnPoolWorker-1] child process 13848 calling self.run()
2. Run test1.py
test1.py output:
Traceback (most recent call last):
  File "C:/Pycham/redis/test1.py", line 7, in <module>
    print(result.get())
  File "C:\Pycham\anaconda\lib\site-packages\celery\result.py", line 215, in get
    self.maybe_throw(callback=callback)
  File "C:\Pycham\anaconda\lib\site-packages\celery\result.py", line 331, in maybe_throw
    self.throw(value, self._to_remote_traceback(tb))
  File "C:\Pycham\anaconda\lib\site-packages\celery\result.py", line 324, in throw
    self.on_ready.throw(*args, **kwargs)
  File "C:\Pycham\anaconda\lib\site-packages\vine\promises.py", line 244, in throw
    reraise(type(exc), exc, tb)
  File "C:\Pycham\anaconda\lib\site-packages\vine\five.py", line 195, in reraise
    raise value
ValueError: not enough values to unpack (expected 3, got 0)
Worker output:
[2019-06-01 17:03:59,484: INFO/MainProcess] Received task: test.add[33ee3342-064e-47ef-8f8b-95d65955fd89]
[2019-06-01 17:03:59,491: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)')
Traceback (most recent call last):
  File "c:\pycham\anaconda\lib\site-packages\billiard\pool.py", line 358, in workloop
    result = (True, prepare_result(fun(*args, **kwargs)))
  File "c:\pycham\anaconda\lib\site-packages\celery\app\trace.py", line 544, in _fast_trace_task
    tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
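The ValueError itself is just Python's ordinary tuple-unpacking error. On Windows the worker's child processes are spawned rather than forked, so the module-level state that `_fast_trace_task` unpacks (`_loc` in the traceback above) is never populated, and unpacking an empty tuple into three names fails. A minimal reproduction of the mechanism (a sketch, not Celery code):

```python
_loc = ()  # stands in for the worker state the spawned child never filled in

try:
    tasks, accept, hostname = _loc
except ValueError as e:
    print(e)  # not enough values to unpack (expected 3, got 0)
```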
Solution: install eventlet and run the worker with the eventlet pool. Celery 4.x no longer officially supports Windows, and its default prefork pool fails there with exactly this ValueError; a non-prefork pool such as eventlet (or the built-in solo pool, via -P solo) works around it.

pip install eventlet
Now let's try again.
1. Run the worker:

celery -A test worker -l info -P eventlet
2. Run test1.py. Output:

10
This time the worker output is:
(anaconda) C:\Pycham\redis>celery -A test worker -l info -P eventlet

 -------------- celery@BOS3UA7Y740V4W9 v4.3.0 (rhubarb)
---- **** -----
--- * ***  * -- Windows-10-10.0.17763-SP0 2019-06-01 17:08:45
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         my_task:0x16e16d0c0f0
- ** ---------- .> transport:   redis://:**@localhost:6379//
- ** ---------- .> results:     redis://:**@localhost:6379/0
- *** --- * --- .> concurrency: 4 (eventlet)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]
  . test.add

[2019-06-01 17:08:45,387: INFO/MainProcess] Connected to redis://:**@localhost:6379//
[2019-06-01 17:08:45,401: INFO/MainProcess] mingle: searching for neighbors
[2019-06-01 17:08:46,434: INFO/MainProcess] mingle: all alone
[2019-06-01 17:08:46,452: INFO/MainProcess] pidbox: Connected to redis://:**@localhost:6379//.
[2019-06-01 17:08:46,458: INFO/MainProcess] celery@BOS3UA7Y740V4W9 ready.
[2019-06-01 17:09:31,021: INFO/MainProcess] Received task: test.add[82a08465-b8d5-4371-8edd-1f5b3c922102]
[2019-06-01 17:09:33,034: INFO/MainProcess] Task test.add[82a08465-b8d5-4371-8edd-1f5b3c922102] succeeded in 2.0s: 10
Reposted from: https://www.cnblogs.com/-wenli/p/10960241.html