Pitfalls I hit while developing with Celery

Connecting Celery to Redis

When using Redis as the broker and the Redis connection requires a password:

BROKER_URL = 'redis://:xxxxx@127.0.0.1:6379/0'

where xxxxx is the password. The colon before the password is mandatory; without it the URL will not parse correctly.
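
For instance, here is a minimal sketch of an app wired to a password-protected Redis instance (the password xxxxx and the database numbers are placeholders, not values from this post):

from celery import Celery

# hypothetical config; note the colon before the password in both URLs
app = Celery(
    'my_task',
    broker='redis://:xxxxx@127.0.0.1:6379/0',   # broker on db 0
    backend='redis://:xxxxx@127.0.0.1:6379/1',  # result backend on db 1
)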

Error: Celery raises ValueError: not enough values to unpack (expected 3, got 0)

test.py

import time

from celery import Celery

broker = 'redis://localhost:6379'     # message broker (no password on this local instance)
backend = 'redis://localhost:6379/0'  # result backend

celery = Celery('my_task', broker=broker, backend=backend)

@celery.task
def add(x, y):
    time.sleep(2.0)  # simulate a slow task
    return x + y

test1.py

from test import add

result = add.delay(2, 8)  # enqueue the task; returns an AsyncResult immediately
while True:
    if result.ready():
        print(result.get())
        break
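
Incidentally, the polling loop is only for demonstration; result.get() blocks until the task finishes on its own, so a simpler variant (a sketch, with an assumed 10-second timeout) would be:

from test import add

result = add.delay(2, 8)
# get() blocks until the result arrives; the timeout guards against hanging forever
print(result.get(timeout=10))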

Reproducing the error

1. Run the worker:

celery -A test worker --loglevel=info

Output:

(anaconda) C:\Pycham\redis>celery -A test worker --loglevel=info

 -------------- celery@BOS3UA7Y740V4W9 v4.3.0 (rhubarb)
---- **** -----
--- * ***  * -- Windows-10-10.0.17763-SP0 2019-06-01 17:02:01
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         my_task:0x2200a35b128
- ** ---------- .> transport:   redis://:**@localhost:6379//
- ** ---------- .> results:     redis://:**@localhost:6379/0
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery


[tasks]
  . test.add

[2019-06-01 17:02:01,524: INFO/MainProcess] Connected to redis://:**@localhost:6379//
[2019-06-01 17:02:01,556: INFO/MainProcess] mingle: searching for neighbors
[2019-06-01 17:02:02,620: INFO/MainProcess] mingle: all alone
[2019-06-01 17:02:02,759: INFO/MainProcess] celery@BOS3UA7Y740V4W9 ready.
[2019-06-01 17:02:03,309: INFO/SpawnPoolWorker-2] child process 16140 calling self.run()
[2019-06-01 17:02:03,333: INFO/SpawnPoolWorker-4] child process 10908 calling self.run()
[2019-06-01 17:02:03,372: INFO/SpawnPoolWorker-3] child process 2400 calling self.run()
[2019-06-01 17:02:03,434: INFO/SpawnPoolWorker-1] child process 13848 calling self.run()

2. Run test1.py

test1.py output:

Traceback (most recent call last):
  File "C:/Pycham/redis/test1.py", line 7, in <module>
    print(result.get())
  File "C:Pychamanacondalibsite-packagescelery
esult.py", line 215, in get
    self.maybe_throw(callback=callback)
  File "C:Pychamanacondalibsite-packagescelery
esult.py", line 331, in maybe_throw
    self.throw(value, self._to_remote_traceback(tb))
  File "C:Pychamanacondalibsite-packagescelery
esult.py", line 324, in throw
    self.on_ready.throw(*args, **kwargs)
  File "C:Pychamanacondalibsite-packagesvinepromises.py", line 244, in throw
    reraise(type(exc), exc, tb)
  File "C:Pychamanacondalibsite-packagesvinefive.py", line 195, in reraise
    raise value
ValueError: not enough values to unpack (expected 3, got 0)

Worker output:

[2019-06-01 17:03:59,484: INFO/MainProcess] Received task: test.add[33ee3342-064e-47ef-8f8b-95d65955fd89]
[2019-06-01 17:03:59,491: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)')
Traceback (most recent call last):
  File "c:pychamanacondalibsite-packagesilliardpool.py", line 358, in workloop
    result = (True, prepare_result(fun(*args, **kwargs)))
  File "c:pychamanacondalibsite-packagesceleryapp	race.py", line 544, in _fast_trace_task
    tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)

Solution:

Celery 4.x dropped official Windows support, and its default prefork pool does not initialize worker subprocesses correctly under Windows' spawn-based multiprocessing (note the SpawnPoolWorker entries in the log above), which is why _loc arrives empty in _fast_trace_task. The fix is to switch to a pool implementation that works on Windows. Install eventlet:

pip install eventlet
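
eventlet is not the only way out. Two other workarounds are commonly used for this Windows-specific failure (both are sketches to verify in your own environment):

# option 1: the built-in single-threaded pool, no extra dependency
celery -A test worker -l info -P solo

# option 2 (Windows cmd): make billiard initialize workers as if forked,
# so the default prefork pool can run
set FORKED_BY_MULTIPROCESSING=1
celery -A test worker --loglevel=info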

Now let's run through it again.

1. Run the worker, this time with the eventlet pool:

celery -A test worker -l info -P eventlet
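
(-P selects the pool implementation. If you want more concurrent green threads than the default, -c sets the concurrency, e.g. celery -A test worker -l info -P eventlet -c 100; the right number depends on your workload.)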

2. Run test1.py. This time it completes and prints:

10

And the worker output this time:

(anaconda) C:\Pycham\redis>celery -A test worker -l info -P eventlet

 -------------- celery@BOS3UA7Y740V4W9 v4.3.0 (rhubarb)
---- **** -----
--- * ***  * -- Windows-10-10.0.17763-SP0 2019-06-01 17:08:45
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         my_task:0x16e16d0c0f0
- ** ---------- .> transport:   redis://:**@localhost:6379//
- ** ---------- .> results:     redis://:**@localhost:6379/0
- *** --- * --- .> concurrency: 4 (eventlet)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery


[tasks]
  . test.add

[2019-06-01 17:08:45,387: INFO/MainProcess] Connected to redis://:**@localhost:6379//
[2019-06-01 17:08:45,401: INFO/MainProcess] mingle: searching for neighbors
[2019-06-01 17:08:46,434: INFO/MainProcess] mingle: all alone
[2019-06-01 17:08:46,452: INFO/MainProcess] pidbox: Connected to redis://:**@localhost:6379//.
[2019-06-01 17:08:46,458: INFO/MainProcess] celery@BOS3UA7Y740V4W9 ready.
[2019-06-01 17:09:31,021: INFO/MainProcess] Received task: test.add[82a08465-b8d5-4371-8edd-1f5b3c922102]
[2019-06-01 17:09:33,034: INFO/MainProcess] Task test.add[82a08465-b8d5-4371-8edd-1f5b3c922102] succeeded in 2.0s: 10
Original post: https://www.cnblogs.com/-wenli/p/10960241.html