We're hitting errors on older hosts because the precompiled module was
built against a policy version that is too new for them. This moves
compilation of the module to the target host, via handlers.
Right now this is hardcoded to the specific module in base/postfix, but
we can generalise it later on to compile all the various SELinux modules.
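A handler chain that compiles the module on the target might look roughly like this; the module name `postfix_custom`, the paths, and the listen topic are all illustrative assumptions, not the actual role contents:

```yaml
# handlers/main.yml -- hypothetical sketch, names and paths are assumptions
- name: Compile the postfix SELinux module on the target
  ansible.builtin.command:
    cmd: checkmodule -M -m -o /tmp/postfix_custom.mod /tmp/postfix_custom.te
  listen: rebuild postfix selinux module

- name: Package the compiled module
  ansible.builtin.command:
    cmd: semodule_package -o /tmp/postfix_custom.pp -m /tmp/postfix_custom.mod
  listen: rebuild postfix selinux module

- name: Install the packaged module
  become: true
  ansible.builtin.command:
    cmd: semodule -i /tmp/postfix_custom.pp
  listen: rebuild postfix selinux module
```

Because the .te source is compiled by the target's own checkmodule, the resulting policy always matches the policy version installed on that host.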
Signed-off-by: Greg Sutcliffe <fedora@emeraldreverie.org>
This site is still pointing to iad2, and I can't find anyone who can
point it to rdu3, so I think it's just going to have to go away.
Disable it for now; if no one appears, we should delete it entirely,
along with the OpenShift app that serves this website.
Signed-off-by: Kevin Fenzi <kevin@scrye.com>
This works around a weird problem in rdu3: proxies' connections to
kojipkgs time out if the local port is over 32k. We aren't sure why this
happens yet, but this seems to work around the problem for now.
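The commit message doesn't spell out the mechanism, but one way to keep local ports under 32k is to narrow the kernel's ephemeral port range via sysctl. A hypothetical fragment (the exact range chosen here is an assumption):

```ini
# Hypothetical sysctl fragment -- the specific range is an assumption,
# kept below 32k so proxy-to-kojipkgs connections avoid the failing ports
net.ipv4.ip_local_port_range = 16384 32000
```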
Signed-off-by: Kevin Fenzi <kevin@scrye.com>
This should not have caused any issues, but I want to rule it out as
being related to the 503 errors we have been seeing.
It also does no good to have this enabled here, as these proxies are
internal-only and would never have browsers or crawlers hitting them.
Signed-off-by: Kevin Fenzi <kevin@scrye.com>
This adds an example implementation of how to add Zabbix agent
monitoring to the Postfix role.
There are five parts:
- The agent drop-in file
- The (optional) script the agent will call
- A custom SELinux module to allow the agent to run its tools
- An API call to ensure the target template exists
- An API call to add the host to the right template
See the PR for details on how this works...
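As a rough illustration of the first two parts, the agent drop-in could define a UserParameter that invokes the script; the key name, file path, and script location below are hypothetical, not the actual PR contents:

```ini
# /etc/zabbix_agentd.d/postfix.conf -- hypothetical drop-in
# Key name and script path are assumptions; see the PR for the real ones
UserParameter=postfix.queue_length,/usr/local/bin/zabbix-postfix-queue.sh
```

The agent then exposes `postfix.queue_length` as an item key that the template on the server side can poll.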
Signed-off-by: Greg Sutcliffe <fedora@emeraldreverie.org>
My hypothesis is that web crawlers are especially attracted to the /cgit
string in the URL, assuming it leads to useful source code for AI
training.
In reality, our cgit instance isn't a valuable source for AI learning.
It primarily contains unstructured changes to spec files that often fail
to comply with guidelines. It seems unlikely that a human is
intentionally directing AI crawlers to our instance.
I may be wrong, but the experiment is as simple as the change in this
commit.
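If the change is a robots.txt tweak, it would presumably amount to something like the following; the exact rule is an assumption, see the commit diff for the real change:

```
User-agent: *
Disallow: /cgit
```

Well-behaved crawlers honour this and stop requesting /cgit URLs; it does nothing against crawlers that ignore robots.txt, which is part of what the experiment will reveal.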
Closes: https://github.com/fedora-copr/copr/issues/3873
P.S. On the off chance you actually want to use Copr's Git repos for AI
learning, you're welcome to! But please reach out to us first—we can
find a better way for you to access all that data than using Cgit.
Closes: #2858
Fixes https://github.com/fedora-copr/copr/issues/3448
The only downside is that there IMHO isn't any way to distinguish whether a
given project stores its results in Pulp or not, and therefore the link will
be shown for all projects. If you think this will be confusing for users, we
can merge once the majority of Copr projects use Pulp.