Stephen Smoogen 11e2ff87a1 [proxies/robots.txt] Make it so that we force the proxy to use a local robots.txt
The various OpenShift tools get hit by crawlers but do not serve
a robots.txt of their own; this appears to be due to the balancer
code that forwards requests back to the nodes. This change forces
the proxy's robots.txt to always be honored.
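A minimal sketch of what such a change could look like in an Apache httpd proxy vhost, assuming mod_alias and mod_proxy are loaded; the file path /srv/web/robots.txt is a hypothetical example, not the actual path used in the Fedora Infrastructure configuration:

```apache
# Serve the proxy's local robots.txt for this vhost, even when the
# backend (e.g. an OpenShift balancer) would otherwise answer the
# request itself. Path below is illustrative only.
Alias /robots.txt /srv/web/robots.txt

<Location "/robots.txt">
    # "ProxyPass !" exempts this location from any broader
    # ProxyPass rules, so the request never reaches the backend.
    ProxyPass !
    Require all granted
</Location>
```

With this in place, crawlers requesting /robots.txt always get the proxy's copy, regardless of how the balancer routes other traffic.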

Fedora Infrastructure

Welcome! This is the Fedora Infrastructure Pagure project.

Issues filed against this project are for issues in Fedora Infrastructure.

The git repo of this project holds miscellaneous scripts and tools for Fedora Infrastructure.

If you are looking for the Fedora Infrastructure ansible repo, it is not here; see:

https://infrastructure.fedoraproject.org/cgit/ansible.git/

If you would like to help out with Fedora Infrastructure, see:

https://fedoraproject.org/wiki/Infrastructure/GettingStarted and https://fedoraproject.org/wiki/Infrastructure_Apprentice
