postgres-web/tools/search/crawler/lib/threadwrapper.py
Magnus Hagander b8a2015be2 New set of web search crawlers and infrastructure
Replaces the old search code with something that's not quite as much
spaghetti (e.g. not evolved over too much time), and more stable (actual
error handling instead of random crashes)

Crawlers are now also multithreaded to deal with higher latency to some
sites.
2012-01-21 15:27:06 +01:00

23 lines
770 B
Python

from multiprocessing import Process
# Wrap a method call in a different process, so that we can process
# keyboard interrupts and actually terminate it if we have to.
# python threading makes it often impossible to Ctrl-C it otherwise.
#
# NOTE! Database connections and similar objects must be instantiated
# in the subprocess, and not in the master, to be fully safe!
def threadwrapper(func, *args):
    p = Process(target=func, args=args)
    p.start()
    # Wait for the child to exit, or if an interrupt signal is delivered,
    # forcibly terminate the child.
    try:
        p.join()
    except KeyboardInterrupt, e:
        print "Keyboard interrupt, terminating child process!"
        p.terminate()
    except Exception, e:
        print "Exception %s, terminating child process!" % e
        p.terminate()
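
For context, a minimal usage sketch, not part of the original file: the caller hands threadwrapper() a function plus its arguments, and the function does all of its own setup, including any database connection, inside the child process, as the NOTE above requires. The import path and the crawl_one_site job below are assumptions for illustration, written in the same Python 2 style as the file itself.

import time
from threadwrapper import threadwrapper   # assumed import path

# Hypothetical long-running job standing in for a crawler entry point.
# Any database connection would be opened here, in the child process,
# not in the parent that calls threadwrapper().
def crawl_one_site(url, seconds):
    print "pretending to crawl %s" % url
    time.sleep(seconds)
    print "finished %s" % url

if __name__ == '__main__':
    # Runs crawl_one_site in a child process; pressing Ctrl-C in the
    # parent terminates the child instead of leaving it hanging.
    threadwrapper(crawl_one_site, 'https://www.postgresql.org/', 30)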