<borneoa___>
PaulFertser: what about updating robots.txt with 'User-agent: *' ?
<PaulFertser>
borneoa___: hm, you mean for gitweb?
electricworry has quit [Ping timeout: 264 seconds]
<borneoa___>
PaulFertser: was gitweb on a separate name, or still on review.openocd.org?
<PaulFertser>
borneoa___: I disabled gitweb for now; it was accessible from Projects.
<borneoa___>
PaulFertser: I see it's disabled. But was it also under review.openocd.org before, or on a different server name?
<PaulFertser>
borneoa___: same name
<borneoa___>
PaulFertser: the robots.txt on the Gerrit server keeps out some crawlers, but not all. Instead of disabling gitweb, we could block the crawler that is eating all the resources.
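For reference, a robots.txt along those lines might block only the one heavy crawler while still letting everyone else in; the bot name and the gitweb path below are placeholders, not the actual values from the server (and if I recall correctly, Gerrit can serve a custom file via the httpd.robotsFile option in gerrit.config, but that should be checked against the docs):

```text
# robots.txt — minimal sketch; "HeavyBot" stands in for the
# actual crawler identified in the access logs
User-agent: HeavyBot
Disallow: /

# everyone else: allow indexing, but keep bots out of the
# CPU-expensive gitweb views (exact path depends on the setup)
User-agent: *
Disallow: /gitweb
```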
<PaulFertser>
borneoa___: I follow; I just think it makes sense to be sure it's gitweb that causes the annoyances for Gerrit users.
<PaulFertser>
So my idea is to run it for a while without gitweb to see whether it remains occasionally slow or not.
<PaulFertser>
And anyway the official repository is on SF.net, and they have a web frontend there to browse it.
<borneoa___>
PaulFertser: agree.
<PaulFertser>
And if it is, it's probably better to install and use cgit instead, as it's less CPU-hungry and we should be able to limit it more and assign it a different priority, etc.
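As a rough illustration of why cgit tends to be cheaper: its built-in page cache avoids re-rendering pages on every hit. A minimal cgitrc sketch might look like this (all paths and numbers here are assumptions, not a tested config):

```text
# /etc/cgitrc — minimal sketch (paths are hypothetical)
cache-root=/var/cache/cgit   # where rendered pages are stored
cache-size=1000              # a value > 0 enables the page cache
scan-path=/srv/git           # auto-discover bare repositories here
```

Priority limits could then be applied from outside cgit itself, for instance with systemd's Nice= or CPUWeight= settings on whatever service runs the CGI.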