I have a scenario where two processes (Log_writer1.py and Log_writer2.py) are running as cron jobs, and both eventually write to the same log file (test_log_file.txt) as part of the log_event function. Because each process creates its own lock, the output is inconsistent and not all of the data ends up in the log file. Is there any way to share a single lock between multiple processes to avoid these inconsistencies? The code snippets are below. Please advise.
Script : test_cifs_log_writer.py
=================================================================================================
import time

def log_event(level, msg, job_log_file, lck):
    lck.acquire()
    for i in range(50):
        with open(job_log_file, 'a') as wr_log:
            print('Now printing message : ' + str(msg))
            wr_log.write(str(time.ctime()) + ' - ' + level.upper() + ' - ' + str(msg) + '\n')
    lck.release()
Script : Log_writer1.py
=================================================================================================
from threading import Thread, Lock
from test_cifs_log_writer import *
lck=Lock()
t1=Thread(target=log_event, args=('info','Thread 1 : msg','test_log_file.txt',lck))
t2=Thread(target=log_event, args=('info','Thread 2 : msg','test_log_file.txt',lck))
lst=[t1,t2]
for thr in lst:
    thr.start()
for thr in lst:
    thr.join()
Script : Log_writer2.py
=================================================================================================
from threading import Thread, Lock
from test_cifs_log_writer import *
lck=Lock()
t1=Thread(target=log_event, args=('info','Thread 3 : msg','test_log_file.txt',lck))
t2=Thread(target=log_event, args=('info','Thread 4 : msg','test_log_file.txt',lck))
lst=[t1,t2]
for thr in lst:
    thr.start()
for thr in lst:
    thr.join()
Posted on 2022-11-29 12:58:16
No, not easily. Even if you could share a single lock across the processes, you would still run into lock contention.

Instead, use either of the following (rough sketches of both approaches follow below):

- a SyslogHandler, so that both processes hand their log records to a single daemon and that daemon is the only writer of the file, or
- locking and unlocking the log file around each entry (which is what … does).
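For the first option, a minimal sketch using Python's logging module, assuming a local syslog daemon is listening on /dev/log (typical on Linux) and is configured to route these messages to the desired file; the get_logger helper name is only for illustration:

import logging
import logging.handlers

def get_logger(name):
    # Each process sends its records to the syslog daemon, which is then
    # the only writer of the log file, so no shared lock is needed.
    logger = logging.getLogger(name)
    logger.setLevel(logging.INFO)
    handler = logging.handlers.SysLogHandler(address='/dev/log')
    handler.setFormatter(logging.Formatter('%(name)s: %(levelname)s - %(message)s'))
    logger.addHandler(handler)
    return logger

# In Log_writer1.py (and similarly in Log_writer2.py), instead of calling log_event():
logger = get_logger('log_writer1')
for i in range(50):
    logger.info('Thread 1 : msg')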
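For the second option, a minimal sketch that takes an OS-level lock on the file around each entry, so the lock is shared through the file itself rather than through a per-process threading.Lock. fcntl.flock is an assumed choice of cross-process lock here (fcntl is POSIX-only; a third-party package such as portalocker offers the same pattern on Windows):

import fcntl
import time

def log_event(level, msg, job_log_file):
    for i in range(50):
        with open(job_log_file, 'a') as wr_log:
            # Blocks until no other process (or thread) holds the lock on this file.
            fcntl.flock(wr_log, fcntl.LOCK_EX)
            try:
                wr_log.write(time.ctime() + ' - ' + level.upper() + ' - ' + str(msg) + '\n')
                wr_log.flush()
            finally:
                fcntl.flock(wr_log, fcntl.LOCK_UN)

Note that the script name suggests the file lives on a CIFS share; advisory locks can behave differently on network filesystems than on local disks, so this is worth testing in the actual environment.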
https://stackoverflow.com/questions/74614387