Submitted by : simon at: 2004-10-25T14:52:52+00:00 (13 years ago)
Name :
Category : Severity : Status :
Optional subject :  
Optional comment :


comments:

a loophole --simon, Mon, 25 Oct 2004 15:08:03 -0700 reply
One of the (expensive) darcs cgi scripts (darcs.cgi) was open to anonymous bots, and google was crawling it. Unfortunately the crawl stopped before I could see how much slowdown it was causing. Anyway, I have closed that door, which should help some.
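For reference, the usual way to close a door like this is a robots.txt rule; the path below is an assumption, not the actual location of the script on this server:

```
# Hypothetical robots.txt entry — the real path to darcs.cgi is assumed here
User-agent: *
Disallow: /cgi-bin/darcs.cgi
```

Note that robots.txt only stops well-behaved crawlers like googlebot; misbehaving bots would need blocking at the web server or firewall level.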

present server config seems pretty robust --simon, Sun, 30 Jan 2005 16:23:12 -0800 reply
Status: open => closed

still an issue --simon, Mon, 31 Jan 2005 08:54:23 -0800 reply
Status: closed => open

still an issue --simon, Mon, 31 Jan 2005 08:56:38 -0800 reply
Things still perceptibly slow down when google does a serious crawl, as it is doing right now. Stability isn't affected. One solution to this is to spend more money on hosting. We are running within 256M; many production zope sites use far more.

(property change) more speed would be nice, but we're fine most of the time --simon, Tue, 07 Feb 2006 18:29:35 -0800 reply
Severity: serious => minor