My GitHub @otobrglez

Snow conditions in Slovenia

19 Jan 2014 / hacking

Since winter is here and the snow season is about to open in Slovenia, I wanted to create a little project that would notify me when the Slovenian Environment Agency (ARSO) publishes a new so-called "snow report".

Just to make the challenge more interesting, I wanted a push notification on my phone as soon as a new report is published.

This is how I solved the challenge.

Analyze "the source"

ARSO publishes reports on their web page with some mystery frequency. Sometimes they publish reports on Monday, sometimes on Thursday, and on some occasions even on Saturday. The time of day is even stranger: sometimes early in the morning and sometimes in the evening. That basically means there is no real schedule I can rely on to check for new content. Great.

Analyze "content" and write a parser

The content on the web page is just regular HTML that can easily be parsed with a simple XPath expression. No big problems there. Here is a hypothetical parser; I've used HTTParty and Nokogiri - the complete parser can be found in lib/snow.rb.

require "bundler/setup"
require "httparty"
require "nokogiri"

# URL of source
url = ""
url += "napovedi%20in%20podatki/snegraz.html"

# Request with HTTParty
response = HTTParty.get(url)
# New Nokogiri::HTML
document = Nokogiri::HTML(response.body)

# Content parsed with XPATH
content = document.xpath("//td[@class='vsebina']/p")
puts content

Periodical scraping

Since the site has no schedule for when reports are released, I had to write a simple mechanism - I call it bot.rb - that fetches the content every 30 minutes. For this task I used the Clockwork gem.

# Require parser
require_relative "snow"
require "clockwork"

module Clockwork

  handler do |job, time|
    if job == "snow_parse.job"
      # Call parser here. Parser outputs key (for logs)
      puts Snow.process.key

  # Schedule it to run every 30 minutes
  every(30.minutes, 'snow_parse.job')

Since the code runs on Heroku, I've written a simple Procfile with the following definition.

bot: bundle exec clockwork lib/bot.rb

Handling duplicated reports

The parser works by generating a SHA1 hash from the parameters of a report. It then checks Redis to see whether content with that hash already exists. If no content with the hash is found, it dispatches a push notification and stores the hash for next time.

require "date"
require "digest"

# SnowInformation is an abstraction over the content.
SnowInformation =, :level, :details) do

  # Parse date
  def date
    @date ||= Date.parse(self[:date])

  # Generate SHA1 hash
  def sha
    @sha ||= Digest::SHA1.hexdigest(values * "-")

  # Key under which the report is tracked, derived from the hash
  def key
    @key ||= "snow-#{sha}"

Processing is done in snow.rb, and it looks like this:

class Snow
    # ...

    def process
      # Get SnowInformation from the "state" variable
      info = state

      # Check if "key" already exists in the database
      unless redis.exists info.key

        # Store key in Redis
        redis.set info.key, info.value #, {ex: 604800} # 7 days

        # Notify with push
        notify info

      # Return SnowInformation

Push notifications

For dispatching notifications to my phone, I used Pushover. Pushover is a simple push service that makes it easy to send real-time notifications to your Android and iOS devices.

This is how I wrapped their service with HTTParty.

class Pushover
  include HTTParty
  base_uri ''

  def initialize params={}
    @params = params.merge({
      token: ENV["PUSHOVER_KEY"],
      user: ENV["PUSHOVER_USER"]

  def push message, options={} "/messages.json", body: @params.merge(options).merge({
      message: message

And then in the code I can just do:

def notify info
  pushover.push(info.level, {
    title: "Snezne razmere",
    url_title: "Snezne razmere #{}",
    url: ...
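What actually goes over the wire is just a form-encoded body built from the merged credentials and per-message options. A stdlib-only sketch of that merge (the token and user values here are placeholders, not real keys):

```ruby
require "uri"

# Placeholder credentials; in the real app these come from ENV
params = { token: "app-token", user: "user-key" }

# Merge per-message options into the credentials, as the wrapper does
body = params.merge(title: "Snezne razmere", message: "Zapadlo je 30 cm snega")

# This is the application/x-www-form-urlencoded body of the POST request
puts URI.encode_www_form(body)
```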

The result

When everything was put together and properly tested, I started getting snow reports as soon as the parser detected changes on the site. This is how my Pushover client looks in a busy month.

snow-conditions screenshot

The whole source code of the snow-conditions project, along with the setup process for Heroku, can be found on my GitHub profile.
