On the Indiewebcamp wiki there’s a page discussing HTTPS, support for which is strongly recommended. As I’ve mentioned previously, at this stage all unencrypted communication (including traditional port 80 HTTP) should be considered deprecated and dangerous.

Indieweb compatible sites are encouraged to reach as high a level as possible, and thanks to some prodding, I’ve finally moved both this blog and my feed over to HTTPS only, with HSTS and forward secrecy.
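As an aside, if you happen to be running Apache (and assuming mod_headers is enabled), HSTS can be switched on for your HTTPS virtual host with a single directive; a minimal sketch:

# Inside your HTTPS <VirtualHost> (requires mod_headers)
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"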

All this got me thinking: perhaps it would be worth adding a “Level 7” (or perhaps a Level 6.5) to this list, suggesting that Indieweb sites should also be made available as .onion hidden services on Tor?

Pros

  • Anonymity. Would go a long way towards protecting communication metadata (who knows whom), which is a goal we should move towards in a world of endemic selector-based surveillance.
  • Encryption. Traffic within the Tor network is end-to-end encrypted, and there is some discussion of whether this renders HTTPS unnecessary.

Cons

  • Tor has nothing to do with HTTPS, although its traffic is encrypted. However, the HTTPS levels page seemed like a good place to put the suggestion.
  • Could be seen as endorsing one service. Tor is Free software and is pretty much the only game in town when it comes to anonymity networks, but does that constitute a silo? Probably not, but it’s a point for discussion.
  • No certificates for .onion. There are currently no certificate providers issuing certificates for .onion domains but, given the encryption Tor already provides, this may not be a problem.
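For anyone who wants to experiment, exposing an existing site as a hidden service is pretty straightforward; a minimal torrc sketch (the directory and ports here are illustrative):

# /etc/tor/torrc — publish a local web server as a hidden service
HiddenServiceDir /var/lib/tor/mysite/
HiddenServicePort 80 127.0.0.1:80

After restarting Tor, the hostname file inside HiddenServiceDir will contain your new .onion address.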

Anyway, just mooting this as a point for discussion.

I’ve previously documented how I use Known to track system events generated by various pieces of hardware and processes within my (and my clients’) infrastructure.

Like many, I use a UPS to keep my servers from uncontrolled shutdowns, and potential data loss, during the thankfully rare (but still more common than I’d like) power outages.

Both of my UPSes are made by APC, and are monitored by a small daemon called apcupsd. This daemon watches the UPS and reports on its status, covering obvious things like “lost power” and “power returned”, but also potentially more important things like “battery needs replacing”.

While some events do trigger an email, other messages are written to the console using the wall command, so they are less useful on headless systems. Thankfully, modifying the apccontrol script is fairly straightforward.

Setting up your script

First, set up your script as you did for Nagios: create an account from within your Known install, grab its API key, and put it in a wrapper script.

#!/bin/bash

# Make BashKnown's status.sh available on the PATH
PATH=/path/to/BashKnown:"${PATH}"

# Relay the message apccontrol pipes to us on stdin to the
# "apc" account on your Known install
status.sh https://my.status.server apc *YOURAPICODE* >/dev/null

exit 0
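Don’t forget to make the wrapper executable, otherwise apccontrol won’t be able to run it:

chmod +x /path/to/wrapperscript.sh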

I need to Pee

The next step is to modify /etc/apccontrol to call your wrapper script.

I wanted to maintain the existing ‘post to wall’ functionality as well as posting to my status page. To do this, you need to replace the definition for WALL at the top of the script, and split the pipe between two executables.

To do this you need a command called pee, which is the pipe version of tee, and is available in the moreutils package on Debian based systems. So, apt-get install moreutils.
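If you’ve not met pee before, it simply hands a copy of its standard input to each command it’s given; for example, this sends the same message to the console and to a file (the file path is invented for illustration):

echo "UPS on battery" | pee 'wall' 'cat >> /tmp/ups-events.log'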

Change:

WALL=wall

To:

WALL="pee 'wall' '/path/to/wrapperscript.sh'"

Testing

To test, you can run apccontrol directly, although obviously you should be careful which command you run, as some commands fire other scripts.

I tested by firing:

./apccontrol commfailure

Happy hacking!

Spam comes in many forms.

I had been noticing some strange traffic appearing in my referrer logs from “buttons-for-website.com” and a few other places. Odd, I thought, but I wasn’t too concerned.

A client recently asked me about it, since similar traffic was starting to appear in their analytics for a brand new site. I did a little bit of research, and it turns out that this is actually a spam attack.

Basically, the spammer hits your site with a Referer header containing a URL and their spam message (keywords plus another URL, usually). Since a small percentage of sites make their referrer logs public (either deliberately or through misconfiguration), when these logs are indexed they can be used to game search engine rankings for the site the spammer is trying to boost.
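For illustration, a hit like this in an Apache access log (combined format; the IP, timestamp and sizes are invented) might look something like:

203.0.113.7 - - [15/Feb/2015:09:21:44 +0000] "GET / HTTP/1.1" 200 5120 "http://buttons-for-website.com/" "Mozilla/5.0"

The quoted Referer field near the end is the bit the spammer controls, and the bit we can filter on.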

Stopping the spam with mod_security

I don’t like spammers, and the traffic was starting to make my logs (and my client’s) a little noisy, so I decided to do something about it. Using mod_security, I added a couple of simple rules to drop requests where the referrer matched the offending domains.

Simple, but effective:

SecRule REQUEST_HEADERS:Referer "^https?://(www\.)?buttons\-for\-website\.com/?" \
        "phase:1,log,deny,status:503,msg:'Referer spam'"

SecRule REQUEST_HEADERS:Referer "^https?://(www\.)?simple\-share\-buttons\.com/?" \
        "phase:1,log,deny,status:503,msg:'Referer spam'"


... etc... 

This seemed to put an end to the worst of it.
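As the list grows, you may prefer to keep the offending domains in a file rather than adding a rule per domain; ModSecurity’s @pmFromFile operator supports this (the file path here is illustrative, and this assumes ModSecurity 2.x):

SecRule REQUEST_HEADERS:Referer "@pmFromFile /etc/apache2/referer-spam-domains.txt" \
        "phase:1,log,deny,status:503,msg:'Referer spam'"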

I also noticed that a few spammers were posting with obvious spam keywords in the referrer header, so I added a few similar rules to block those for good measure:

SecRule REQUEST_HEADERS:Referer "(viagra|phentermine|cialis)" \
        "phase:1,log,deny,status:503,msg:'Referer spam'"

SecRule REQUEST_HEADERS:Referer "(poker|casino|holdem)" \
        "phase:1,log,deny,status:503,msg:'Referer spam'"

Testing

To test your rules, you can use curl to hit your site and send a triggering referrer, e.g.

curl --referer https://buttons-for-website.com/ https://your.site/

or

curl --referer https://example.com/poker https://your.site/

If a rule fires, you’ll get a 503 back rather than the page.

Hope that helps!