
Using IRC to set Reminders with the “at” Command

Remind me at 5pm to go to the bank

You can use IRC to remind you to do anything. Now that you have a directory that your IRC bot reads, you can use one of Linux’s best-kept secrets: the at command. The at command lets you schedule one-time jobs to run in the future, in effect a non-recurring cronjob. The real win with at is its time syntax. Here are a few examples from the man page:

  • to run a job at 4pm three days from now, you would do at 4pm + 3 days
  • to run a job at 10:00am on July 31, you would do at 10am Jul 31
  • to run a job at 1am tomorrow, you would do at 1am tomorrow

You can use this to run any command or script at a given time, which makes the at command perfect for use over IRC: we just need to drop a file containing the reminder into the ~/say directory at the given time. It’s easy to wrap some of this syntax to give the reminder command a human-like feel.

Syntax

I went with the following syntax, best expressed with a summarized version of the regular expression used to parse it:

 ^(remind|pm) (me|us|everyone) (at|in|on) (.*?) to (.*)$

You might say the following things in the IRC channel to the bot:

  • remind me in 5 minutes to check the dryer
  • pm me at 10pm to remember to charge my cell phone
  • remind everyone on august 1st to enjoy the weather

These commands all get sent to at. Behind the scenes, the bot does the following:

  1. Creates a file under /var/tmp with the contents of the reminder. The filename follows the format mentioned in the Mixing the Command-Line and IRC post, with the unique identifier set to a hash of the submitter’s nick and the current timestamp.
  2. Instructs the at command to move the file from /var/tmp to the ~/say directory at the requested time.
  3. At the requested time, at moves the file; the bot picks it up and prints it to the channel.

There’s only a small amount of additional wrapping needed to make the interaction fluid. The biggest abstraction is the difference between the “remind me at/on” and “remind me in” commands: “remind me at/on” passes the date straight through to the at command, while “remind me in” prepends “now +” to the request. That’s it!
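The whole flow fits in a few lines of shell. This is only a sketch: the paths and the #dcs channel come from the posts, but the variable names and the md5 hashing choice are my assumptions.

```shell
#!/bin/sh
# Sketch of the bot's reminder handler. $nick/$prep/$when/$what would come
# from the regex captures; they are hardcoded here for illustration.
nick="stan"; prep="in"; when="5 minutes"; what="check the dryer"

# "in" becomes "now + ..."; "at"/"on" pass straight through to at
[ "$prep" = "in" ] && when="now + $when"

# unique identifier: a hash of the submitter's nick and the current timestamp
id=$(printf '%s%s' "$nick" "$(date +%s)" | md5sum | cut -c1-8)
tmp="/var/tmp/#dcs@$id"

printf '%s\n' "$what" > "$tmp"
echo "mv '$tmp' ~/say/" | at $when 2>/dev/null
```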

The channel has been using the reminder feature more and more. It’s nice to set a reminder with a long article you’d like to read, or set a reminder in 2 weeks to write a blog post for the DeadCoderSociety. Socializing reminders is a great way to increase community interaction and raise awareness for interesting information that might otherwise end up on a sticky note in a pocket somewhere.

Bonus:

The other Friday I was talking about payday, and one guy had added a reminder to “GET PAID” every other week for the coming year. We ended up adding this on the command-line:

for i in `seq 12 2 50`; do
    echo 'echo "GET PAID" > /srv/git/neilforobot/say/#dcs@paid' | at now + $i weeks
done

at makes setting up these types of reminders easy!

Mixing the Command-Line and IRC bots

Making cookies with IRC and the command-line

My IRC bot checks a directory every minute for files and reads the contents into a channel. This simple feature is also one of my favorites.

The loop:

  1. Checks for the presence of files under /say.
  2. Reads the contents of the file into a string.
  3. Prints the contents into the channel (pulled from the filename).
  4. Removes the file.
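One pass of that loop might look like this in shell; the directory layout follows the post, but the function name is made up and the printf is a stand-in for the bot's actual send routine:

```shell
#!/bin/sh
# One pass over the say directory: print each file's contents to its channel
# (the channel name is everything before the @ in the filename), then delete it.
drain_say_dir() {
    dir=$1
    for f in "$dir"/*; do
        [ -e "$f" ] || continue
        name=${f##*/}
        chan=${name%%@*}
        printf 'to %s: %s\n' "$chan" "$(cat "$f")"   # stand-in for the send routine
        rm -f "$f"
    done
}

drain_say_dir ~/say   # the bot would run this once a minute
```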

Mix crontab into this and you can use any of your favorite command-line tools. Here are a few examples from my own usage:

Finance tip at 8AM every day:

00 08 * * * curl -sLk http://bit.ly/OowzSt -o ~/say/#finance@tip

This just hits a PHP script that returns a random money tip. Note: the @ sign separates the name of the channel from a unique identifier for the file, so multiple files can land in the directory without one overwriting another.

Top HackerNews link at 1PM:

00 13 * * * ~/cronjobs/hackernews.pl > ~/say/#dcs@hn

Note: The original job was a one liner using Mojolicious:

perl -Mojo -E 'say g("http://hackerne.ws")->dom->at("tr > td.title > a")->tree->[2]->{href}'

It’s been expanded since to include the title and keep the crontab clean.

One-off website update checker:

* * * * * curl -s http://stantheman.biz | diff file1 - || echo 'The page changed!' > ~/say/msg@stan_theman@update

This is a quick cheap way of being automatically alerted to changes on a website.

Note: You could easily toss this into a shell script and do some more work to update the file being diffed, which would let you know about continued changes. As it stands, the job will ping you every minute until you remove the cronjob. You also need to curl the page into file1 before installing the job. I said it was cheap!
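Here's one way that improved script could look. The function name and paths are made up, and cmp stands in for diff since we only need a yes/no answer:

```shell
#!/bin/sh
# check_page CACHE NEW OUT: alert once per change, then remember the new copy.
check_page() {
    cache=$1 new=$2 out=$3
    if [ -f "$cache" ] && ! cmp -s "$cache" "$new"; then
        echo 'The page changed!' > "$out"
    fi
    mv "$new" "$cache"   # cache the latest copy so we only alert on new changes
}

# cron would do something like:
#   curl -s http://stantheman.biz -o /tmp/page.new &&
#       check_page ~/page.cache /tmp/page.new ~/say/msg@stan_theman@update
```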

You can use this for any short piece of information (RSS feed updates, system mail, CPU/disk usage). I’ll continue with my other favorite use for this in the next blog post.

Forwarding a Range of Ports in VirtualBox


I recently had to forward a range of ports to a VirtualBox instance, but the VirtualBox GUI doesn’t provide a way to forward a range of ports at once. The VirtualBox Manual describes a way to add a rule via the command-line:

VBoxManage modifyvm "VM name" --natpf1 "guestssh,tcp,,2222,,22"

This forwards any traffic arriving on host port 2222 to port 22 on the virtual instance (note that modifyvm only works while the VM is powered off). We can use this to create a short bash loop. In the example below, we’re forwarding ports 2300-2400 to ports 2300-2400 on the virtual instance (both TCP and UDP):

for i in {2300..2400}; do
    VBoxManage modifyvm "windows" --natpf1 "tcp-port$i,tcp,,$i,,$i"
    VBoxManage modifyvm "windows" --natpf1 "udp-port$i,udp,,$i,,$i"
done

You can verify this by going back into the VirtualBox port forwarding page and seeing the newly configured ports. It’s just as easy to delete them:

for i in {2300..2400}; do
    VBoxManage modifyvm "windows" --natpf1 delete "tcp-port$i"
    VBoxManage modifyvm "windows" --natpf1 delete "udp-port$i"
done

This should suffice until the capability is added to the VirtualBox UI.

Tweeting Tweetable Bible Verses

Tweet tweet

I follow several Twitter feeds that tweet verses from the Bible. I whipped up the KJVTweeter Twitter account when I realized that the accounts I followed had some common afflictions:

  • tweeting popular verses only
  • selecting verses over 140 characters
  • tweeting one verse at a time from the beginning

The popular verses are nice, but they don’t help me learn. Longer verses require a click-through, which isn’t always desirable. Lesser-known verses are fine, but for a medium like Twitter, you really want to focus on the tweet-friendly ones. Another account is starting from Genesis and is estimated to finish in 83 years. I just wanted a simple account that tweets short verses in random order. KJVTweeter (github) checks all of those boxes.

Behind the Scenes:

I found a copy of the King James Version in text format. I wanted a version with shortened book names (tweet-friendly). The copy available at av1611.com has the book, chapter, and verse number, a newline, and then the verse. Some quick perl collapses each verse onto a single line, printing it only if the verse is under 141 characters.

The transformation process after obtaining the file is short and sweet:

unzip -p KJV.zip | ./parser.pl | sort -R -o random_bible.txt

One interesting point is that the unzip program doesn’t read from STDIN, so prepending this pipeline with wget or curl won’t work.

Still pretty new!

It’s also been a while since I’ve come across a file with carriage returns, so I had a hard time figuring out why I couldn’t do something as simple as joining two strings. In the original version I just used dos2unix, but it was just as easy to substitute out the return in the parser.

The tweeting shell script runs every hour. It takes the very first line from the verse file, tweets it, then removes that line from the file. I originally had trouble figuring out how to select a random line (shuf -n1) and later remove it: I pulled a random line, used grep to find its line number, then used sed to delete that line. It’s much more efficient to shuffle the file upfront and pull from the top. The perl that tweets the verse itself is a modified copy of the code available here: lukesthoughtdump.blogspot.com.
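The take-and-remove step itself is tiny. A sketch, with the tweet call stubbed out (the function name is made up, and GNU sed’s -i is assumed):

```shell
#!/bin/sh
# Pop the first verse off the shuffled file: print it, then delete that line.
pop_verse() {
    head -n1 "$1"
    sed -i '1d' "$1"
}

# The hourly cronjob would then be roughly:
#   verse=$(pop_verse random_bible.txt)
#   ./tweet.pl "$verse"
```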

For this file, 16758 of the 31102 verses are tweetable — 53.88% of the Bible. The cronjob is set up to tweet once an hour, which means that it will finish after 699 days (1.91 years or 1 year and 334 days). It’ll be very easy to kick it off again at that time!

Building a Streaming Radio Station, Part 3: Streaming Analytics

stk.fm in action

This is the final segment of my blog post about building a streaming radio station. Part 1 can be found here, and part 2 can be found here.

Streaming Analytics

The station was just about ready to launch. Just one part was missing – analytics. It’s difficult to keep track of analytics for streaming audio simply due to the nature of the content. I couldn’t find an existing solution that fit my needs, so I built my own.

I was mainly concerned with a few metrics: how many people were listening, what they listened to, and how long they listened. To track these, I added a new table to the MySQL database that the rest of the site had been running on. In the player’s check-in process, I added an include for a script that registers analytics. This script adds or updates a row for that particular session (identified by PHP session ID) which stores:

  • Timestamps for the first and most recent check-ins
  • The first and most recently listened to songs
  • The number of seconds of music that were streamed

It wasn’t possible to derive the length of time the user was actually listening from the start and end times alone, as the player wouldn’t check in while the stream was paused. However, the player checked in every five seconds, so multiplying the number of check-ins by five gives a fairly close approximation of how much audio was actually streamed.
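As a quick worked example (the numbers here are made up):

```shell
# A session with 120 recorded check-ins, one check-in per five seconds:
checkins=120
echo "$(( checkins * 5 )) seconds streamed"   # prints "600 seconds streamed"
```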

 

Between the time I started writing this series of blog posts and now, I unfortunately had to take stkbuzz down entirely. However, I do intend to make the previously mentioned Python script open source soon; I’d hate to see it go to waste!