python - Caching a downloaded file based on modified time using dogpile


I am writing a program that downloads a large file (~150 MB) and parses the data into a more useful text-format file. The process of downloading, and especially of parsing, is slow (~20 minutes total), so I want to cache the results.

The result of downloading is a bunch of files, and the result of parsing is a single file, so I could manually check that these files exist and, if so, check their modified times. However, since I am already using dogpile.cache with a Redis backend for web service calls elsewhere in the code, I was wondering whether dogpile.cache could be used for this as well?
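The manual check described above can be sketched with the standard library alone. This is a minimal sketch, not the asker's code: the function name `results_are_fresh` and its parameters are hypothetical.

```python
import os

def results_are_fresh(parsed_path, source_paths):
    """Return True if the parsed file exists and is at least as new
    as every downloaded source file (hypothetical helper)."""
    if not os.path.exists(parsed_path):
        return False
    parsed_mtime = os.path.getmtime(parsed_path)
    return all(
        os.path.exists(p) and os.path.getmtime(p) <= parsed_mtime
        for p in source_paths
    )
```

If this returns False, the program would re-download and re-parse; otherwise it can load the cached parse result directly.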

So my question is: can dogpile.cache be used to cache files based on their modified time?

If you do not want to split the program into several parts:

  • a downloader,

  • a parser, and

  • an updater that runs on file update,

you can do something like the following.

    import json
    import os
    import threading
    import time

    _lock_services = threading.Lock()
    tmp_file = "/tmp/txt.json"
    update_time_sec = 3300

    with _lock_services:
        # If the file was created more than 55 minutes ago,
        # reset it before reloading the cached variable.
        if os.path.getctime(tmp_file) < (time.time() - update_time_sec):
            os.system("echo '{}' > %s" % tmp_file)
        with open(tmp_file, "r") as json_data:
            cache_variable = json.load(json_data)
