Date: Tue, 26 Sep 2017 20:11:12 +0000
From: Eric Wong
To: Marc Herbert
Cc: Junio C Hamano, Andy Lowry, Jeff King, git, Christian Kujau,
    josh@joshtriplett.org, michael.w.mason@intel.com,
    linux-kernel@vger.kernel.org
Subject: Re: BUG in git diff-index
Message-ID: <20170926201112.GA26968@whir>
References: <20160331140515.GA31116@sigill.intra.peff.net>
 <20160331142704.GC31116@sigill.intra.peff.net>
 <56FD7AE8.4090905@nglowry.com>
In-Reply-To: (unspecified)
List-ID: linux-kernel@vger.kernel.org

Marc Herbert wrote:
> PS: I used NNTP and http://dir.gmane.org/gmane.comp.version-control.git
> to quickly find this old thread (what could we do without NNTP?). Then
> I googled for a web archive of this thread and Google could only find
> this one: http://git.661346.n2.nabble.com/BUG-in-git-diff-index-tt7652105.html#none
> Is there a robots.txt to block indexing on
> https://public-inbox.org/git/1459432667.2124.2.camel@dwim.me ?

There are no blocks on public-inbox.org, and I'm completely against
any sort of blocking/throttling.  Maybe there are too many pages to
index?  Or the Message-IDs in URLs are too ugly/scary?  Not sure what
to do about that...
Anyway, I just put up a robots.txt with Crawl-Delay: 1, since I seem
to recall crawlers use a more conservative delay by default:

==> https://public-inbox.org/robots.txt <==
User-Agent: *
Crawl-Delay: 1

I don't know much about SEO beyond keeping a site up and responsive,
so perhaps there's more to be done to get things indexed...
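As a side note (not part of the original mail): the effect of that
two-line policy can be sanity-checked with Python's stdlib
urllib.robotparser, feeding it the quoted lines directly instead of
fetching the live file.  The bot name "ExampleBot" below is just a
placeholder:

```python
from urllib.robotparser import RobotFileParser

rfp = RobotFileParser()
# Parse the same two-line policy quoted above, without any network access.
rfp.parse([
    "User-Agent: *",
    "Crawl-Delay: 1",
])

# The wildcard entry applies to any crawler name.
print(rfp.crawl_delay("ExampleBot"))        # -> 1
# Nothing is disallowed, so all paths remain fetchable.
print(rfp.can_fetch("ExampleBot", "/git/")) # -> True
```

Note that Crawl-Delay is advisory only: well-behaved crawlers may
honor it, but the robots.txt mechanism itself blocks nothing unless a
Disallow rule is present.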