Listening for new files in Python


I'm trying to find a solution to a problem I have. I have a big share that contains hundreds of thousands (if not millions) of files, and new files arrive every second. I'm trying to write an application that makes it faster to find files in the share. The idea is to insert the file names into a Redis DB in the format {'file_name': 'file_path'}, and when a file is needed, pull its path from the DB. The problem starts when I try to index the old files (I assume this will take at least a few hours) while simultaneously listening for new files that arrive during the process. An example of what I'm trying to do:

import redis
import os

r = redis.StrictRedis(host='localhost', port=6379, db=0)

for root, dirs, files in os.walk(r'd:\\'):
    for file in files:
        r.set(os.path.basename(file), os.path.join(root, file))
        print('the file %s was successfully added' % os.path.basename(file))
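Looking a file up later would then just be a matter of reading the key back. A minimal sketch, assuming the same Redis connection as above ('some_file.txt' is only a placeholder name):

import redis

r = redis.StrictRedis(host='localhost', port=6379, db=0)

# Pull the stored path for a given file name; returns None if the
# file was never indexed. ('some_file.txt' is a placeholder.)
path = r.get('some_file.txt')
if path:
    print('found at %s' % path)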

How am I supposed to modify the code so that it keeps listening for new files?

Thanks for the help! =)

You should take a look at the watchdog library. It's exactly what you're looking for. Here's an example of using it:

import sys
import time
import logging

from watchdog.observers import Observer
from watchdog.events import LoggingEventHandler

if __name__ == "__main__":
    logging.basicConfig(level=logging.INFO,
                        format='%(asctime)s - %(message)s',
                        datefmt='%Y-%m-%d %H:%M:%S')
    path = sys.argv[1] if len(sys.argv) > 1 else '.'
    event_handler = LoggingEventHandler()
    observer = Observer()
    observer.schedule(event_handler, path, recursive=True)
    observer.start()
    try:
        while True:
            time.sleep(1)
    except KeyboardInterrupt:
        observer.stop()
    observer.join()
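To tie this back to the Redis index from the question, one option is to subclass FileSystemEventHandler and write every newly created file into Redis from on_created, while the initial os.walk() pass indexes the existing files. A minimal sketch, assuming the same Redis connection settings as in the question:

import os
import sys
import time

import redis
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

r = redis.StrictRedis(host='localhost', port=6379, db=0)

class RedisIndexHandler(FileSystemEventHandler):
    # Index each newly created file under its base name, the same way
    # the os.walk() loop in the question does for existing files.
    def on_created(self, event):
        if event.is_directory:
            return
        r.set(os.path.basename(event.src_path), event.src_path)

if __name__ == '__main__':
    path = sys.argv[1] if len(sys.argv) > 1 else '.'
    observer = Observer()
    observer.schedule(RedisIndexHandler(), path, recursive=True)
    observer.start()
    try:
        while True:
            time.sleep(1)
    except KeyboardInterrupt:
        observer.stop()
    observer.join()

Start the observer first (or run it in a separate thread/process), then kick off the os.walk() indexing pass; that way files that arrive while the old ones are still being indexed are picked up too.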
