I am not a big fan of Facebook’s Pavlovification of life, and I try to spend as little time on it as possible. This, unfortunately, has led to me missing invites to things.

I was shooting the shit with a mate of mine recently, who is even more anti-Facebook, and here’s what I came up with.

iCal Feeds

One of Facebook’s hidden features is that its feed of events you’re going to, or have been invited to, is made available via an iCal feed you can subscribe to. I had already added this to my Google Calendar (add another calendar by URL), so I get a list of upcoming events appearing in my calendar, and you can even tell Google Calendar to send you a notification email when new events are added.

However, Google has its own problems, so I thought it’d be nice to get a weekly email digest of upcoming things.

Python lists

So, using the python-icalendar module, I wrote a very quick Python program that you can point at an iCal feed to get a summary list of upcoming events printed to the console. It uses the summary text of each event, together with the start and end times, and, if available, the location and URL link (handy for Facebook events).

It automatically removes events that have already ended, and by default only lists events that start within the next 7 days.
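For illustration, here’s a stdlib-only sketch of that core logic — a deliberately naive parser plus the date-window filter. The function names here are my own, not those in parse_cal.py, and the real script uses python-icalendar, which properly handles timezones, line folding, recurrence and the rest:

```python
from datetime import datetime, timedelta

def parse_events(ics_text):
    """Naive VEVENT parser, for illustration only; python-icalendar
    does this properly (timezones, folded lines, etc.)."""
    events, current = [], {}
    for line in ics_text.splitlines():
        line = line.strip()
        if line == 'BEGIN:VEVENT':
            current = {}
        elif line == 'END:VEVENT':
            events.append(current)
        elif ':' in line:
            key, value = line.split(':', 1)
            if key in ('DTSTART', 'DTEND'):
                current[key] = datetime.strptime(value, '%Y%m%dT%H%M%S')
            elif key in ('SUMMARY', 'LOCATION', 'URL'):
                current[key] = value
    return events

def upcoming(events, now, days=7):
    """Drop events that have already ended, keep those starting within
    the next `days` days, sorted by start time."""
    horizon = now + timedelta(days=days)
    keep = [e for e in events
            if e['DTEND'] >= now and e['DTSTART'] <= horizon]
    return sorted(keep, key=lambda e: e['DTSTART'])
```

The filter is the interesting bit: an event is kept only if its end time is still in the future *and* its start time falls inside the window.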

Use it as follows:

python parse_cal.py -u 'https://url.of.feed/ical' -d *numberofdays*

Which outputs something like:

     2016-02-24 19:30 - 2016-02-24 22:30
     Wheatsheaf Oxford
     2016-02-25 20:00 - 2016-02-26 01:00
     The Bullingdon  (Oxford)

Which are two awesome events I’m heading to in the next few weeks (two very different styles of music!).

Now the email

Getting a weekly digest is then just a matter of a bit of cron and mailer goodness; edit your crontab thus:

@weekly /usr/bin/python /path/to/parse_cal.py -u 'https://feed.example.com/calendar.ics' -d 7 | mail -E -s "This week's upcoming events" you@example.com

The program only produces output if it finds events, so the -E tells mailx not to send anything if the message body is empty.

And you’re done!

I’ve found this pretty handy for a few other calendar feeds (my work joblist for example). Enjoy!

» Visit the project on GitHub...

Just a quickie, but it caught me out.

I make use of Firefox’s sync server to synchronise bookmarks, passwords etc between computers, but because I do not like the idea of having this stored on a computer that I don’t control, I run my own version of the server on my own hardware.

This was working fine, however after a recent server upgrade syncing stopped working.

On investigation, I found that exceptions were being thrown by the WSGI process, the important part being:

File "/path/to/syncserver/html/local/lib/python2.7/site-packages/requests/packages/urllib3/contrib/pyopenssl.py", line 62, in 
AttributeError: 'module' object has no attribute 'PROTOCOL_SSLv3'

I did a little digging, and it seems that SSLv3 support has been disabled because of the protocol’s vulnerability to the POODLE attack. However, some of the Python libraries simply assume that support is going to be there.

The fix was to edit /path/to/syncserver/html/local/lib/python2.7/site-packages/requests/packages/urllib3/contrib/pyopenssl.py itself. Open the file, and go to line 62.

Change it from this:

# Map from urllib3 to PyOpenSSL compatible parameter-values.
_openssl_versions = {
    ssl.PROTOCOL_SSLv3: OpenSSL.SSL.SSLv3_METHOD,
    ssl.PROTOCOL_SSLv23: OpenSSL.SSL.SSLv23_METHOD,
    ssl.PROTOCOL_TLSv1: OpenSSL.SSL.TLSv1_METHOD,
}

To this:

# Map from urllib3 to PyOpenSSL compatible parameter-values.
_openssl_versions = {
    ssl.PROTOCOL_SSLv23: OpenSSL.SSL.SSLv23_METHOD,
    ssl.PROTOCOL_TLSv1: OpenSSL.SSL.TLSv1_METHOD,
}

Which removes the mapping (and support) for SSLv3. (The exact entries may differ slightly between urllib3 versions; the key change is deleting the PROTOCOL_SSLv3 line.)

Hope this helps!

I recently upgraded this (and several client servers) to the latest release of Debian (Jessie). The process went relatively smoothly, apart from a couple of gotchas when Apache was upgraded.

One of the problems I had is that mod_python and WSGI no longer sit happily together (unless you go through some complicated rebuilding of Python, which I was unwilling to do). I needed WSGI for various things on the server, and since mod_python is viewed as deprecated these days, and I only used it for trac, it made sense to migrate it.

Thankfully, this is relatively straightforward to accomplish.

Create your WSGI script

The first step is to create a Python script called trac.wsgi in your trac home directory, owned by the web server user and made executable:

touch trac.wsgi; chown www-data:www-data trac.wsgi; chmod 700 trac.wsgi

The script will look something like:

import os

os.environ['TRAC_ENV_PARENT_DIR'] = '/path/to/trac/parent/html/'
os.environ['PYTHON_EGG_CACHE'] = '/path/to/trac/parent/cache/'

import trac.web.main
application = trac.web.main.dispatch_request

I use one domain to host all the various trac installs, therefore this one wsgi script needs to power them all. This is what the TRAC_ENV_PARENT_DIR does. Both TRAC_ENV_PARENT_DIR and PYTHON_EGG_CACHE can take their values from the existing ones you’ve presumably already set in the apache conf (assuming you’ve already got this working with mod_python).

Updating your Apache configuration

Edit your Apache configuration and comment out or remove all the mod_python entries, e.g.

#               SetHandler mod_python
#               PythonInterpreter main_interpreter
#               PythonHandler trac.web.modpython_frontend
#               PythonOption TracEnvParentDir /path/to/trac/parent/html/
#               PythonOption TracUriRoot /
#               PythonOption PYTHON_EGG_CACHE /path/to/trac/parent/cache/

You now need to add a WSGIScriptAlias directive for whatever your TracUriRoot currently is, and modify your Directory statement to add a WSGIApplicationGroup directive, as follows:

WSGIScriptAlias / /path/to/trac/parent/html/trac.wsgi

<Directory /path/to/trac/parent/html>
    WSGIApplicationGroup %{GLOBAL}
</Directory>

Finally, install and activate the module, then restart Apache:

apt-get install libapache2-mod-wsgi; a2enmod wsgi; service apache2 restart