This lets downstream systems securely search for users that exist in
the system, so they can populate their local database with users
before those users have logged in, if necessary. For example, the
commitfest management system can use this to flag users as authors
and reviewers even before they have logged in.
This involves some changes to how the default Django UserAdmin handles
saving the form, but we override only what is needed, keeping all of
the default Django functionality intact minus allowing the username to be
modified on edit.
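Roughly, the override amounts to something like this (a minimal sketch
against the stock contrib.auth UserAdmin; the class name and exact field
handling are illustrative, not the actual pgweb code):

    from django.contrib import admin
    from django.contrib.auth.admin import UserAdmin
    from django.contrib.auth.models import User

    class PGUserAdmin(UserAdmin):
        def get_readonly_fields(self, request, obj=None):
            # obj is None on the add form and set on the edit form, so the
            # username only becomes read-only when editing an existing user.
            if obj:
                return self.readonly_fields + ('username',)
            return self.readonly_fields

    admin.site.unregister(User)
    admin.site.register(User, PGUserAdmin)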
We pipe all SSL through the frontends, and explicitly reject direct
access to the main host. However, this call has no payload, so we can
safely allow it without SSL through the frontends. Do that for now;
we should look at fixing the SSL issue sometime in the future.
Prior to this, the static repo would only update if there were
*some* changes in the main repo, which clearly wasn't intended.
Also, shorten the forced delay to 10 seconds.
The idea is that a git repository hook will send a POST to this URL, which
will drop a trigger file somewhere. A cronjob (or inotify listener, if
we want to be really fancy sometime in the future) will pick up that
trigger file and run the update script. The goal is to shorten the time
required to process an update.
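Something like the following is the shape of it (the view name and trigger
file path are illustrative, not the actual code):

    from django.http import HttpResponse
    from django.views.decorators.csrf import csrf_exempt
    from django.views.decorators.http import require_POST

    TRIGGER_FILE = "/tmp/pgweb_update.trigger"  # hypothetical path

    @csrf_exempt
    @require_POST
    def repo_updated(request):
        # Just drop the trigger file; the cronjob notices it, runs the
        # update script and removes the file again.
        with open(TRIGGER_FILE, "w"):
            pass
        return HttpResponse("OK", content_type="text/plain")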
As the feature matrix is useful for seeing how far PostgreSQL has come, we
still want to keep older versions on display. However, this is causing
problems displaying the newer versions on smaller screens.
This change adds a filter which only shows supported versions by default,
and allows folks to choose which versions they wish to compare. This will
have no effect in browsers with JavaScript disabled.
It worked perfectly fine to have unicode in text fields, but if there
were unicode characters in one of the dropdown fields, then sometimes
it would not be possible to save the entries, since the moderation email
generation would crash even though both run through the same codepath.
Hopefully this fix will take care of some of the random errors that have
shown up with submissions - there might be more issues like this elsewhere
in the code, however.
The previous code, which was submitting locally, apparently didn't
need to provide a Message-Id header. However, now that we're
directly submitting to a remote system, we need to make sure that
a Message-Id header exists or the emails will get bounced.
In addition, the Python docs for this module state that Message-Id
is really one of the required fields anyway. It's unclear how many
real bugs we lost because of this, but I got involved when someone
complained on IRC that a submitted bug didn't show up on the
-bugs list.
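The fix is basically along these lines (a sketch only, with illustrative
names and host; the point is setting the Message-Id header before the mail
leaves):

    import smtplib
    from email.mime.text import MIMEText
    from email.utils import formatdate, make_msgid

    def send_bug_report(sender, recipient, subject, body):
        msg = MIMEText(body, _charset='utf-8')
        msg['From'] = sender
        msg['To'] = recipient
        msg['Subject'] = subject
        msg['Date'] = formatdate(localtime=True)
        # Without this, the remote system bounces the mail.
        msg['Message-Id'] = make_msgid()

        smtp = smtplib.SMTP('mail.example.org')  # hypothetical remote host
        smtp.sendmail(sender, [recipient], msg.as_string())
        smtp.quit()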
In case folks are wondering why I'm committing/pushing this (or how),
I've already fixed this on wrigleys (thanks to Andrew Gierth for
helping me debug and test the changes) and subsequently gave myself
access to this repo, to get this commit in, before anyone else
commits and overwrites my local hacks and breaks the bugs form again.
Basically, user-generated email (from the bug report form) will be sent to
the mail frontends for antispam. Any errors generated there will be ignored
and the mails "dropped on the floor". Other emails keep entering the system
through localhost and are delivered there.
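In rough terms (host name and helper are illustrative, not the actual code):

    import smtplib

    FRONTEND_RELAY = 'mail-frontend.example.org'  # hypothetical antispam frontend

    def deliver(sender, recipient, fullmsg, user_generated=False):
        host = FRONTEND_RELAY if user_generated else 'localhost'
        try:
            smtp = smtplib.SMTP(host)
            smtp.sendmail(sender, [recipient], fullmsg)
            smtp.quit()
        except Exception:
            if not user_generated:
                raise
            # User-generated mail rejected by the frontends is silently dropped.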
Import the code from the PostgreSQL Europe website to handle this, since it's
well proven by now.
Any code paths that send email now just write it to the database using the
functions in queuedmail.util. This means we can now submit notification
emails and similar things within transactions and have them properly roll back
if something goes wrong (so no more incorrect notifications when there is
a database error).
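As a rough sketch (model fields and module path are illustrative; the real
functions live in queuedmail.util):

    from django.db import models

    class QueuedMail(models.Model):
        sender = models.EmailField()
        receiver = models.EmailField()
        fullmsg = models.TextField()   # the complete generated message

    def queue_mail(sender, receiver, fullmsg):
        # Runs inside the caller's transaction - nothing is actually sent
        # here, so the row rolls back along with everything else on error.
        QueuedMail(sender=sender, receiver=receiver, fullmsg=fullmsg).save()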
These emails are picked up by a cronjob that runs frequently (typically
once per minute or once every 2 minutes) that submits them to the local
mailserver. Doing this out of line gives us a much better way of
dealing with cases where mail delivery is really slow.
The submission from the cronjob is now done with smtp to localhost instead
of opening a pipe to the sendmail command - though this should have no
major effects on anything.
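The cronjob side then amounts to something like this (assuming the QueuedMail
model from the sketch above; the import path is illustrative):

    import smtplib

    from queuedmail.models import QueuedMail  # illustrative path

    def send_queued_mail():
        for mail in QueuedMail.objects.all():
            # Hand the stored message to the local mailserver over smtp,
            # and only delete the row once delivery has been accepted.
            smtp = smtplib.SMTP('localhost')
            smtp.sendmail(mail.sender, [mail.receiver], mail.fullmsg.encode('utf-8'))
            smtp.quit()
            mail.delete()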
This also removes the setting SUPPRESS_NOTIFICATIONS, as no notifications
are actually ever sent unless the cronjob is run. On development systems
they will just go into the queuedmail table, and can be deleted from there.
This can break things (d'uh).
Do this by introducing a new decorator, @ssl_optional. When this is
present, no SSL redirection will happen, regardless of whether the
access comes in over http or https.
This decorator overrides @ssl_required, but for readability's sake,
never use both at the same time.
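A minimal sketch of the idea (illustrative only; the actual redirect handling
lives elsewhere in the code):

    def ssl_optional(view_func):
        # Only marks the view; the SSL redirect logic checks the flag and
        # leaves marked views alone, whether the request came in over
        # http or https.
        view_func.ssl_optional = True
        return view_func

    def view_allows_plain_http(view_func):
        # Hypothetical helper used where the redirect would otherwise happen.
        return getattr(view_func, 'ssl_optional', False)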