Tuesday, July 14, 2009

Hopeless Exetel customer service

I got the following message regarding a line fault issue we have with our telephone service:

Please note that the supplier technician who has attended to your service issue has confirmed to us that there is no issue with in the infrastructure/network boundary point and or main distribution frame (MDF). Please re-check your equipment.

That was a surprise to me, as on Saturday the technician had found a fault in the line but had been unable to pull the cable and had said that Optus would have to dig it up and we'd find out more later.

Calling Exetel help was a struggle: after waiting on hold and explaining my issue, the call dropped out when they seemed to put me back on hold. Twice.

Thoroughly pissed off, I spent another 20 minutes on hold, listening to complete silence while the help desk person was "just one second" (turns out he was talking to Optus; it would have been nice to be told why I was waiting).

So, at this point, 50 wasted minutes later, I find out that Optus is going to fix the line after all, and that the message I received was due to Exetel "closing" the initial fault ticket and was completely wrong.

Idiots.

Saturday, October 11, 2008

Merging GPS logs and mapping them all

Inspired by cabspotting and Open Street Map, I wanted to merge all my GPS logs and create a map showing all the routes I've logged lately.

This is pretty easy using gpsbabel, but I needed to use a little Python to get the list of input log files. (I'm sure there's a way to do it in bash but that's beyond me for now.) My GPS stores files in NMEA format, and the directory structure and purpose of my Python script should hopefully be apparent.

>>> import os
>>> from path import path
>>> logs = " ".join([" ".join(["-i nmea -f %s"%log
                               for log in sorted((raw/"raw").files("GPS_*.log"))]) 
                     for raw in path("/home/tom/docs/gpslogs").dirs() 
                     if raw.namebase.isdigit()])
>>> logs
'-i nmea -f /home/tom/docs/gpslogs/200810/raw/GPS_20080930_221152.log -i nmea -f /home/tom/docs/gpslogs/200810/raw/GPS_20081001_071234.log ...'
>>> os.system("gpsbabel %s -o kml,points=0,labels=0,trackdata=0 -F /home/tom/docs/gpslogs/all200810.kml" % logs)

The result of that is a 36.5 MB kml file I could load into Google Earth:

There was one spurious point somewhere in the log file at 0° E, 0° N, and the log has a lot of jitter when I'm walking near home.
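That 0°, 0° point could be dropped before conversion by filtering out fixes that sit at exactly zero latitude and longitude. A sketch (not part of the original workflow) that works on raw NMEA lines; gpsbabel's radius filter with `exclude` would be another way:

```python
def drop_null_island(nmea_lines):
    """Remove $GPRMC fixes at exactly 0 lat, 0 long (the classic spurious
    "null island" point), passing every other sentence through unchanged."""
    kept = []
    for line in nmea_lines:
        f = line.split(",")
        if f[0] == "$GPRMC" and len(f) > 5:
            try:
                # fields 3 and 5 are latitude and longitude
                at_origin = float(f[3] or 0) == 0.0 and float(f[5] or 0) == 0.0
            except ValueError:
                at_origin = False
            if at_origin:
                continue
        kept.append(line)
    return kept
```

Running the raw logs through a filter like this before handing them to gpsbabel would keep the bogus point out of the KML entirely.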

Thursday, September 18, 2008

A simple game using JavaScript and the CANVAS element

This should work in Opera and Firefox (though really slowly; why, I don't know) and probably Safari. It's not going to work in IE because I can't be bothered getting the iecanvas thingy into the blog code.

Update: There is now an iPhone version.

The only thing to note is fewer clicks == better score.

[The interactive game was embedded here: a canvas element with a "Clicks" counter and score display.]

Monday, August 25, 2008

Analysing GPS Logs with Awk

This post describes the first two "chop" functions that fit into the partitioning framework outlined last post.

def chopToSpeedHistogram(dest, p):
    # create histogram of speeds from nmea written to stdout
    os.system("cat "+sh_escape(dest)+".log"
              + " | awk -F , '{if($1==\"$GPVTG\" && int($8)!=0){count[int($8+0.5)]++}}"
              + " END {for(w in count) printf(\"[%d,%d],\\n\", w, count[w]);}'"
              # sort it
              + " | sort -g -k 1.2"
              # output json of histogram
              + " > "+sh_escape(dest)+".hist")

def chopToHeadingHistogram(dest, p):
    # create histogram of headings from nmea written to stdout (ignore heading when stopped)
    os.system("cat "+sh_escape(dest)+".log"
              + " | awk -F , '{if($1==\"$GPVTG\" && int($8)!=0){count[5.0*int($2/5.0+0.5)]++;}}"
              + " END {for(w in count) printf(\"[%d,%d],\\n\", w, count[w]);}'"
              # sort it
              + " | sort -g -k 1.2"
              # output json of histogram
              + " > "+sh_escape(dest)+".head")

Both functions use awk to create a histogram from the speed (in km/h) and heading (or bearing, in degrees) in the NMEA VTG sentences. The speed is rounded to an integer, and the bearing to the nearest 5 degrees. The data logger records one reading per second, so this gives a measure of how much time was spent at each speed/bearing.

The histogram is output as a JSON array that can be inserted straight into a web page, where the flot library is used to generate some graphs.
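For anyone who finds the awk terse, the same tally can be sketched in pure Python (field 8 of a $GPVTG sentence is the speed in km/h, matching awk's `$8`):

```python
def speed_histogram(nmea_lines):
    """Equivalent of the awk one-liner: count one reading (one second)
    per rounded km/h speed from $GPVTG sentences, ignoring zero speeds."""
    count = {}
    for line in nmea_lines:
        f = line.split(",")
        # awk's int($8) truncates, so speeds under 1 km/h are skipped,
        # then the counted bucket is the speed rounded to the nearest km/h
        if f[0] == "$GPVTG" and f[7] and int(float(f[7])) != 0:
            kmh = int(float(f[7]) + 0.5)
            count[kmh] = count.get(kmh, 0) + 1
    return count
```

The heading version is the same shape, bucketing field 2 to the nearest 5 degrees instead.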

Speed Histogram

The average and standard deviation (shaded at ±0.5σ) are indicated on the graph for two bike rides along the same route, and match pretty closely with the figures recorded by my bike computer:

Ride 1 (brown):
  GPS log:       2 hrs 59 min (minus 41 min stopped), 63.4 km, 27.7 km/h
  Bike computer: 2 hrs 16 min, 64.01 km, 28.00 km/h
Ride 2 (dark green):
  GPS log:       2 hrs 25 min (minus 13 min stopped), 63.4 km, 29.0 km/h
  Bike computer: 2 hrs 10 min, 63.85 km, 29.30 km/h

The two rides went in different directions, the first in the "uphill" direction and the second with a bit of a tail wind. I got a flat tire on the first ride too, hence the extra time spent stopped.

Heading Histogram

Up is north, and the radius represents the time spent heading in that direction (normalised during the plotting process and "expanded" by taking the square root to show a little more detail).

Thursday, August 21, 2008

Automatically Partitioning GPS Logs with gpsbabel

My GPS logger is capturing lots of useful information but it's difficult to efficiently capture data for regular activities. Geotagging photos is easy, and manually working with the logs for a special event is possible, but it's not feasible to put in that much work to analyze commutes for example.

The logger creates a separate log file each time it's switched on and off, and while these logs could be sorted into categories for analysis, it's easy to forget to turn it on and off at the start and end of a section of interest, and activities then end up merged in the logs. In addition, there is often "junk" data at the start and end of logs while leaving or arriving at a destination.

I wanted to be able to automatically capture information about my daily activities by simply switching on the logger and carrying it around with me. I then want to plug the logger into the computer and have the logs automatically chopped into segments of interest that can be compared to each other over time.

The rest of this post roughly outlines the Python script I created to perform this task, minus some of the hopefully irrelevant details.

First, I collect the lat/long coordinates of the places where I want to capture data, both while I'm there and while travelling between them. These include my home, work, the climbing gym and so on. Each point has a radius within which any readings will be considered to be in that place.

#         id:  name lat         long        radius
places = { 1: ("A", -37.123456, 145.123456, 0.050),
           2: ("B", -37.234567, 145.234567, 0.050),
           3: ("C", -37.345678, 145.345678, 0.050) }
otherid = 4
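For illustration, the "within radius" test that gpsbabel's radius filter performs can be sketched with the haversine formula (the script itself delegates this to gpsbabel; this is just to show what the `radius` column means, with the radius in km to match gpsbabel's `distance=...K`):

```python
import math

def within(lat1, lon1, lat2, lon2, radius_km):
    """True if the two points are within radius_km of each other,
    using the haversine great-circle distance on a spherical Earth."""
    R = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a)) <= radius_km
```

So a reading counts as "at place B" when `within(reading_lat, reading_lon, -37.234567, 145.234567, 0.050)` holds, i.e. it falls inside that 50 m circle.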

For each of these places of interest, I then use gpsbabel's radius filter to find all the times where I was within that zone:

# create a list of all raw log files to be processed
from path import path
month = path("/gpslogs/200808")
logs = " ".join(["-i nmea -f %s"%log 
                 for log in sorted((month/"raw").files("GPS_*.log"))])

for (id,(place,lat,lon,radius)) in places.items():
   os.system("gpsbabel "
             # input files
             + logs
             # convert to waypoints
             + " -x transform,wpt=trk,del"
             # remove anything outside place of interest
             + (" -x radius,distance=%.3fK,lat=%.6f,lon=%.6f,nosort"%(radius,lat,lon))
             # convert back to tracks
             + " -x transform,trk=wpt,del"
             # output nmea to stdout
             + " -o nmea -F -"
             # filter to just GPRMC sentences
             + " | grep GPRMC"
             # output to log file
             + (" > %s/processed/place%d.log"%(month,id)))

And all points outside any of the specific places of interest are sent into an "other" file:

os.system("gpsbabel "
          # input files
          + logs
          # convert to waypoints
          + " -x transform,wpt=trk,del"
          # remove anything in a place of interest
          + "".join([" -x radius,distance=%.3fK,lat=%.6f,lon=%.6f,nosort,exclude"%(radius,lat,lon)
                     for (id,(place,lat,lon,radius)) in places.items()])
          # convert back to tracks
          + " -x transform,trk=wpt,del"
          # output nmea to stdout
          + " -o nmea -F -"
          # filter to just GPRMC sentences
          + " | grep GPRMC"
          # output to log file
          + (" > %s/processed/place%d.log" % (month, otherid)))

These files are filtered with grep to contain only minimal data, as we only require the timestamps for this part of the process; specifically, only the NMEA GPRMC sentences are kept.

To provide a brief illustration, the following picture shows two log files of data, a blue and a green, between three points of interest:

The above process would create four files, one for each point A, B and C and one for "Other" points that would contain something like the following information, where the horizontal axis represents time:

I then read all those log files back in to create a "time line" that for each timestamp stores my "location" in the sense that it knows whether I was "home", at "work" or somewhere between the two.

import calendar

# dict of timestamp (seconds since epoch, UTC) to placeid
where = {}
for placeid in places.keys()+[otherid,]:
   for line in (month/"processed"/("place%d.log"%placeid)).lines():
      fields = line.split(",")
      # convert date/time to seconds since epoch (UTC)
      t, d = fields[1], fields[-3]
      ts = calendar.timegm( (2000+int(d[4:6]), int(d[2:4]), int(d[0:2]),
                             int(t[0:2]), int(t[2:4]), int(t[4:6])) )
      where[ts] = placeid
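To illustrate the date/time conversion in that loop, here is how a single GPRMC sentence (sample values, not from my logs) parses out. The time field is hhmmss UTC and the third field from the end is the date as ddmmyy:

```python
import calendar

# sample GPRMC sentence for illustration: 12:35:19 UTC on 19/08/2008
line = "$GPRMC,123519,A,4807.038,N,01131.000,E,022.4,084.4,190808,003.1,W*6A"
fields = line.split(",")
t, d = fields[1], fields[-3]
ts = calendar.timegm((2000 + int(d[4:6]), int(d[2:4]), int(d[0:2]),
                      int(t[0:2]), int(t[2:4]), int(t[4:6])))
# ts is now the seconds-since-epoch value for 2008-08-19 12:35:19 UTC
```

Using `calendar.timegm` rather than `time.mktime` matters here: GPS timestamps are UTC, and `timegm` interprets the tuple as UTC regardless of the local timezone.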

This is then summarised from one value per second to a list of "segments" with a start and end time and a location. Unlogged time segments are also inserted at this point whenever there are no logged readings for 5 minutes or more.

# array of tuples (placeid, start, end, logged)
# placeid = 0 indicates "unknown location", i.e. unlogged
summary = []
current, start, stop, last_ts = 0, 0, 0, None
for ts in sorted(where.keys()):
   # detect and insert "gaps" if space between logged timestamps is greater than 5 minutes
   if last_ts and ts-last_ts > 5*60:
      if current:
         summary.append( [current, start, stop, True] )
      current, start, stop = where[ts], ts, ts
      summary.append( [0, last_ts, ts, False] )
 
   last_ts = ts

   if where[ts] != current:
      if current:
         summary.append( [current, start, stop, True] )
      current, start, stop = where[ts], ts, ts
   else:
      stop = ts
summary.append( [current, start, stop, True] )

(If there's a more "Pythonic" way of writing that kind of code, I'd be interested in knowing it.)
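For what it's worth, one arguably more Pythonic shape for the same logic uses itertools.groupby: first split the sorted timestamps into runs separated by gaps, then let groupby collapse consecutive timestamps at the same place. A sketch (same output convention, not a drop-in for the script above):

```python
from itertools import groupby

def summarize(where, gap=5 * 60):
    """Collapse a {timestamp: placeid} dict into (placeid, start, stop, logged)
    tuples, inserting unlogged (placeid 0) segments at gaps over `gap` seconds."""
    if not where:
        return []
    # split the sorted timestamps into runs with no internal gap over `gap`
    ts_sorted = sorted(where)
    runs, run = [], [ts_sorted[0]]
    for prev, ts in zip(ts_sorted, ts_sorted[1:]):
        if ts - prev > gap:
            runs.append(run)
            run = []
        run.append(ts)
    runs.append(run)

    summary, prev_end = [], None
    for run in runs:
        if prev_end is not None:
            summary.append((0, prev_end, run[0], False))  # unlogged gap
        # groupby merges consecutive timestamps that share a placeid
        for placeid, group in groupby(run, key=where.get):
            group = list(group)
            summary.append((placeid, group[0], group[-1], True))
        prev_end = run[-1]
    return summary
```

The explicit state machine above does the same job; this version just trades the `current`/`start`/`stop` bookkeeping for groupby's run-length grouping.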

"Spurious" segments are then removed. These show up because when the logger is inside buildings the location jumps around and often out of the 50m radius meaning that, for example, there will be a sequence of Home-Other-Home-Other-Home logs. The "Other" segments that are between two known points of interest and less than 5 minutes long are deleted, as are "Other" segments that sit between a known place of interest and an unlogged segment.

Based on the above graphic, the summary might look something like the following:

start      end        location
10.00am    10.05am    A
10.05am    10.30am    Other
10.30am    10.35am    B
10.35am    11.00am    Other
...

The "Other" segments are then labelled if possible to indicate they were "commutes" between known locations:

start      end        location
10.00am    10.05am    A
10.05am    10.30am    A-B
10.30am    10.35am    B
10.35am    11.00am    B-C
...

Some segments cannot be labelled automatically and are left as "Other". Sometimes this is a trip out to a one-off location and back again, which is fine to leave as "Other". Sometimes, though, it's because the logger didn't lock onto the satellites within the 50 m radius on the way out of a place of interest; these can be fixed up manually later.

Once a list of "activities" has been obtained, with start and end times, it is easy to use gpsbabel again to split logs based on start and end of time segments:

for (place, start, stop, place_from, place_to, logged) in summary:
    dest = month / "processed" / ("%s-%s" % (time.strftime("%Y%m%d%H%M%S", time.localtime(start)),
                                             time.strftime("%Y%m%d%H%M%S", time.localtime(stop))))

    for (ext, chopFn) in [(".log", chopToLog),
                          (".kml", chopToKml),
                          (".speed", chopToSpeedVsDistance),
                          (".alt", chopToAltitudeVsDistance),
                          (".hist", chopToSpeedHistogram),
                          (".head", chopToHeadingHistogram),
                          (".stops", chopToStopsVsDistance)]:
        if not (dest+ext).exists():
            chopFn(dest, locals())
            # make the file in case it was empty and not created
            (dest+ext).touch()

This generates a bunch of files for each segment, named with the start and end timestamps of the segment and an extension depending on the content. The first "chop" function generates an NMEA format log file that is then processed further by the remaining "chop" functions. The other chop functions will probably be explained in a later post; the first two are:

def chopToLog(dest, p):
    # filter input file entries within times of interest to temp file
    os.system("gpsbabel " + p["logs"]
              + (" -x track,merge,start=%s,stop=%s"
                 % (time.strftime("%Y%m%d%H%M%S", time.gmtime(p["start"])),
                    time.strftime("%Y%m%d%H%M%S", time.gmtime(p["stop"]))))
              + " -o nmea -F "+sh_escape(dest)+".log")

def chopToKml(dest, p):
    # create kml file with reduced resolution
    os.system("gpsbabel -i nmea -f "+sh_escape(dest)+".log"
              + " -x simplify,error=0.01k"
              + " -o kml -F "+sh_escape(dest)+".kml")

def sh_escape(p):
    return p.replace("(","\\(").replace(")","\\)").replace(" ","\\ ")

(Again, if there's a better way to handle escaping special characters in shell commands, I would like to know it.)
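For the record, the standard library can do this escaping: `pipes.quote` (Python 2; it became `shlex.quote` in Python 3.3) safely quotes any string for the shell. A sketch of a drop-in replacement:

```python
try:
    from shlex import quote  # Python 3.3+
except ImportError:
    from pipes import quote  # Python 2

def sh_escape(p):
    # quote() wraps the string in single quotes whenever it contains any
    # character the shell would interpret, so spaces, parentheses, quotes
    # and so on are all handled uniformly (unlike the hand-rolled version)
    return quote(p)
```

Alternatively, building the command as an argument list for `subprocess.call` (e.g. `["gpsbabel", "-i", "nmea", "-f", dest + ".log", ...]`) sidesteps shell quoting entirely, though it makes pipelines like the `| grep GPRMC` step above more work to express.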

Using this, I can simply plug in the logger, which launches an autorun script, and the end result is nicely segmented log files that I can map and graph. More about that in another post.

Monday, August 18, 2008

GMail tip: "To be filtered" label

I use GMail's filters a lot, and in particular have a "Newsletters" label that keeps "bacn" from getting into my Inbox (and instead just lights up my ambient email notifier green).

I've added a "to be filtered" label that I can apply to anything that slips into the Inbox and then later on, when I have time, I can look at all these messages, select them in groups and use the "Filter messages like these" command to make sure they don't bother me again.

This solves two problems: being taken out of the "flow" to create a filter, and perpetually handling the items manually and never getting around to creating a filter.

AMOD AGL3080 GPS data logger

I bought the AMOD AGL3080 GPS data logger about a month ago now and am very happy with it. Thanks Richard for a very helpful review.

I've left it on the default settings, logging everything in NMEA format at 1 second intervals. It seems quite easy to change the logging interval and detail, but at this stage I'm hard pressed to imagine a situation where I'll need to, as I've logged about 62 hours so far and only used about 50% of the 128 MB capacity.

It's a little larger than I'd like because it takes three AAA batteries. I'm using eneloops and getting between 10 and 12 hours per charge (changing them as soon as I notice the battery warning light flashing; I don't know how long it would keep running if left to go completely flat).

The logging seems quite accurate while driving and cycling, but has a lot of "jitter" while walking for some reason. It generally locks onto the signal quickly enough once I get outside, but again while walking it seems to take an unusually long time to lock on which is a little frustrating.

Its best feature is that it doesn't require any special software to access the logs. It simply mounts as a USB drive, allowing the NMEA format log files to be copied off. This is great and makes it particularly attractive for Linux and Mac users.

Thursday, June 19, 2008

Using the Gosget GPS Data Logger in Linux via Wine (Ubuntu 8.04)

I managed to get the Gosget GPS Data Logger S1 to work under wine on Ubuntu 8.04 using (roughly) the following steps:

  1. Plug it into a usb port, Ubuntu will hot-plug it as a serial port automatically (probably /dev/ttyUSB0).
  2. Set the baud rate on the serial port:
    stty 115200 < /dev/ttyUSB0
    (This setting won't persist across a reboot, so it will need to be re-run.)
  3. Create a "com" port in wine:
    ln -s /dev/ttyUSB0 ~/.wine/dosdevices/com1
  4. Put in the CD and run the installer in Wine:
    wine d:\PC\DataLogUtility\DataLogUtility.exe
  5. Run the Data Log Data Downloader via wine, put in the com port, connect, set a download folder and you'll be able to get .nmea files from the data logger.
  6. To convert the .nmea file to a .kml file I used gpsbabel -i nmea -o kml source.nmea dest.kml.

Figuring that out gave me a headache, hopefully this post will save someone else one. Please note however, that I'm writing the commands from memory so there may be something (hopefully harmless) wrong.

Also, if you switch the data logger into "G_Mouse" mode, it's possible to use gpsd to get GPS data in real-time. I switched the data-logger into "G_Mouse" mode via a Windows machine, but I presume the Data Log Data Downloader can do that via wine too.

And, gpsd and gpsbabel are both in the Ubuntu 8.04 repositories.

Sunday, May 18, 2008

Ambient Email Notifier in Linux

I finally got around to hooking up my Ambient Email Notifier under Ubuntu. No idea why I waited so long, given it pretty much worked as soon as I plugged it in!

The Linux kernel has drivers for the CP2102 USB-to-RS232 chip built in, so as soon as it was plugged in it showed up as /dev/ttyUSB0.

The Python-Serial module is in the Ubuntu repositories, so after installing that, I just changed my checkinbox.py script to use /dev/ttyUSB0 instead of COM3 and added a cron job to check every 10 minutes.

Ubuntu 8.04 + Logitech iTouch Keyboard = Dead Battery?

How is this possible? I've got a Logitech iTouch wireless keyboard that's been working fine for years now and with Ubuntu 7.10 for 5 months, but after upgrading to Ubuntu 8.04, my keyboard batteries have gone flat 4 or 5 times now. One or more of the batteries seems to "short out" to 0.1v or even go -ve. I've tried 2 sets of NiMH batteries, one brand new and one that had been working fine for years. Could it be anything other than a coincidence?

Update: After about two weeks of the batteries dying every day or so, I've now been on the same charge for about two weeks. I've no idea which update fixed it, but it's better now!