Writing a Simple Periodic Task with Python Celery

1. Write the Celery configuration file

from datetime import timedelta

CELERYBEAT_SCHEDULE = {
    # 'add' is the name of this schedule entry
    'add': {
        'task': 'timer_task.add',  # the task to run, looked up by its registered name
        'schedule': timedelta(seconds=3),  # run every 3 seconds
        'args': (16, 16)  # positional arguments passed to add()
    }
}
Save this as celeryconfig.py.
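
Besides a fixed timedelta, the beat scheduler also accepts crontab expressions from celery.schedules. A minimal sketch of an alternative entry (the entry name 'add-every-morning' is hypothetical, firing once a day instead of every 3 seconds):

from celery.schedules import crontab

CELERYBEAT_SCHEDULE = {
    # Hypothetical entry: run timer_task.add every day at 07:30
    'add-every-morning': {
        'task': 'timer_task.add',
        'schedule': crontab(hour=7, minute=30),
        'args': (16, 16)
    }
}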

2. Write the task program

from celery import Celery

broker_url = 'pyamqp://development:root@192.168.2.129:5672//development_host'

backend_url = 'rpc://development:root@192.168.2.129:5672//development_host'

# The first argument, 'tasks', is the name given to this Celery app
app = Celery('tasks', broker=broker_url, backend=backend_url)

# Load the beat schedule from celeryconfig.py
app.config_from_object('celeryconfig')

@app.task
def add(x, y):
    print("result=", x + y)
    return x + y
Save this as timer_task.py.
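
Because a result backend is configured, the same task can also be triggered ad hoc from a Python shell and its return value fetched; a minimal sketch, assuming the worker from step 3 is already running:

from timer_task import add

# Queue one ad-hoc run, independent of the beat schedule
result = add.delay(4, 4)

# Block until the worker writes the result to the rpc:// backend
print(result.get(timeout=10))  # prints 8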

3. The worker does not run on Windows (Celery 4.x dropped official Windows support), so run this periodic task on a Linux server.

[root@mq1 timer_task]# celery -A timer_task worker -l info --beat
/root/python366/lib/python3.6/site-packages/celery/platforms.py:801: RuntimeWarning: You're running the worker with superuser privileges: this is
absolutely not recommended!

Please specify a different user using the --uid option.

User information: uid=0 euid=0 gid=0 egid=0

  uid=uid, euid=euid, gid=gid, egid=egid,
 
 -------------- celery@mq1 v4.4.7 (cliffs)
--- ***** ----- 
-- ******* ---- Linux-3.10.0-1127.el7.x86_64-x86_64-with-centos-7.8.2003-Core 2020-08-26 17:58:35
- *** --- * --- 
- ** ---------- [config]
- ** ---------- .> app:         tasks:0x7fb80d6d99b0
- ** ---------- .> transport:   amqp://development:**@192.168.2.129:5672//development_host
- ** ---------- .> results:     rpc://
- *** --- * --- .> concurrency: 1 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** ----- 
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery
                

[tasks]
  . timer_task.add

[2020-08-26 17:58:35,980: INFO/Beat] beat: Starting...
[2020-08-26 17:58:36,004: INFO/MainProcess] Connected to amqp://development:**@192.168.2.129:5672//development_host
[2020-08-26 17:58:36,022: INFO/MainProcess] mingle: searching for neighbors
[2020-08-26 17:58:36,038: INFO/Beat] Scheduler: Sending due task add (timer_task.add)
[2020-08-26 17:58:37,144: INFO/MainProcess] mingle: all alone
[2020-08-26 17:58:37,160: INFO/MainProcess] celery@mq1 ready.
[2020-08-26 17:58:37,161: INFO/MainProcess] Received task: timer_task.add[cfb931e9-2c6d-43fe-9a3d-85650a676407]  
[2020-08-26 17:58:37,270: WARNING/ForkPoolWorker-2] result=
[2020-08-26 17:58:37,272: WARNING/ForkPoolWorker-2] 32
[2020-08-26 17:58:37,303: INFO/ForkPoolWorker-2] Task timer_task.add[cfb931e9-2c6d-43fe-9a3d-85650a676407] succeeded in 0.03400736500043422s: 32
[2020-08-26 17:58:38,995: INFO/Beat] Scheduler: Sending due task add (timer_task.add)
[2020-08-26 17:58:38,998: INFO/MainProcess] Received task: timer_task.add[71fa8b9e-8296-4f4a-a740-69af2a4cdcd9]  
[2020-08-26 17:58:39,000: WARNING/ForkPoolWorker-2] result=
[2020-08-26 17:58:39,000: WARNING/ForkPoolWorker-2] 32
[2020-08-26 17:58:39,002: INFO/ForkPoolWorker-2] Task timer_task.add[71fa8b9e-8296-4f4a-a740-69af2a4cdcd9] succeeded in 0.0020219220023136586s: 32
[2020-08-26 17:58:41,996: INFO/Beat] Scheduler: Sending due task add (timer_task.add)
[2020-08-26 17:58:42,000: INFO/MainProcess] Received task: timer_task.add[882bb4dd-7adc-4752-981b-4ed675d1bbb0]  
[2020-08-26 17:58:42,001: WARNING/ForkPoolWorker-2] result=
[2020-08-26 17:58:42,002: WARNING/ForkPoolWorker-2] 32
[2020-08-26 17:58:42,004: INFO/ForkPoolWorker-2] Task timer_task.add[882bb4dd-7adc-4752-981b-4ed675d1bbb0] succeeded in 0.002443025994580239s: 32
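
The --beat flag embeds the beat scheduler inside the worker process, which is convenient on a single node. For production, the Celery docs recommend running the scheduler and the worker as separate processes, roughly:

# start the scheduler that sends due tasks to the broker
celery -A timer_task beat -l info

# start one or more workers that consume and execute them
celery -A timer_task worker -l info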
Original article: https://www.cnblogs.com/ygbh/p/13566527.html