Wölfischhttps://woelfisch.de/blog/2008-11-26T14:29:00+00:00Jörg's BlogRendering GPS tracks2008-11-26T14:29:00+00:00o'wolfhttps://woelfisch.de/blog/author/owolfhttps://woelfisch.de/blog/rendering-gps-tracks<p>Sure, you can choose among numerous implementations if you want to render GPS tracks. But for scalable online maps, nothing I found suited my purposes. The available on-demand renderers are too slow and use too much memory when processing a data set with approximately 100,000 fixes from all over California, Nevada, Arizona and Utah. Thus, I had to render the tiles offline. Basically, two free solutions for that exist: Mapnik and Osmarender.</p><p>Mapnik is terribly overblown for this purpose. It needs PostgreSQL, the PostGIS extension for it, Boost, Python bindings to Boost, and lots of other libraries and tools I never used before. Worse, it is a real <a href="http://kylflow.livejournal.com/60126.html">dependency nightmare</a>.</p><p>Osmarender sounds more promising: basically, it is just an XSLT stylesheet that produces an SVG file from an OpenStreetMap file. To generate tiles, you need a filter to limit the OSM data to the region you want to render, an SVG renderer, and something to cut the resulting large image into tiles. The Tiles@Home project provides the necessary scripts. However, it is written specifically to render and upload OSM tiles, and it would need some tweaks and additional scripts to render custom data. I gave it another try a couple of days ago, and all it rendered were broken PNG graphics. Apparently some issue with Inkscape, but I didn't bother to follow up on that.</p><p>With neither Mapnik nor Osmarender suitable for this specific job, I decided to brush up on my Java knowledge and write my own renderer. Java Advanced Imaging (JAI) sounded like a good idea, as it can operate on tiled images. It is even possible to create your own tile handling. 
However, apart from the API and some far too simple tutorials, I didn't find any useful documentation on how to actually do that. In the end, after wasting two days searching for any kind of documentation, I wrote my own tiled canvas implementation. It allocates tiles only when they are actually used, to save memory, and can draw pixmaps and lines even across tile boundaries. The tiles are organized in a HashMap, which is fast and has an acceptable memory overhead. Rendering the track at zoom level 18 requires approximately 3.7 GB of RAM, which is 78 kB per tile, or 22% overhead. Could be better, but I expected worse.</p><p>Rendering 49,895 tiles took almost half an hour, which indeed is slower than anticipated. Profiling shows that most of the time is spent internally in the Java AWT, mainly while converting the RGBA byte field to the internal raster and during PNG encoding. The latter is ridiculously slow. Granted, unlike JPEG and TIFF processing, which is implemented via C bindings, the PNG support was implemented purely in Java. Still, that is no excuse for not fixing it for more than ten years. What a shame nobody tries to improve AWT anymore; its slowness is the main reason for all the prejudice against Java, after all.</p><p>Another time-consuming task is the XML processing of the GPX file. I'm not satisfied with the speed of the SAX2 parser, but in this case it is at least acceptable. On the other hand, I'm positively surprised by the performance of the HashMap. Profiling shows that it even caches the last fetched item. I kept my own check though, as creating the string that serves as a hash key is notably more expensive than comparing two integer values...</p><p>If you want to know more about the program, or would even like to extend it or fix bugs: it is GPLed and <a href="http://tracks.yaina.de/source/trackrenderer.tar.bz2">downloadable</a> from <a href="http://tracks.yaina.de/">tracks.yaina.de</a>. It is a command line tool; the source code is the documentation. 
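</p><p>The idea behind the lazily allocated, HashMap-organized canvas described above can be sketched in a few lines. The following is an illustrative Python sketch of the approach, not code from the actual Java renderer; the class and method names and the 256×256 tile size are my own assumptions:</p>

```python
# Sketch of a sparse tile canvas: tiles are only allocated when a pixel
# actually lands on them, and are kept in a dict keyed by (tile_x, tile_y).
# Illustrative only -- names and tile size are assumptions, not from the
# actual Java source.

TILE_SIZE = 256

class SparseCanvas:
    def __init__(self):
        self.tiles = {}  # (tx, ty) -> bytearray of RGBA pixels

    def _tile(self, x, y):
        """Fetch the tile containing (x, y), allocating it on first use."""
        key = (x // TILE_SIZE, y // TILE_SIZE)
        tile = self.tiles.get(key)
        if tile is None:
            tile = bytearray(TILE_SIZE * TILE_SIZE * 4)  # transparent RGBA
            self.tiles[key] = tile
        return tile

    def set_pixel(self, x, y, rgba=(0, 0, 255, 255)):
        tile = self._tile(x, y)
        off = ((y % TILE_SIZE) * TILE_SIZE + (x % TILE_SIZE)) * 4
        tile[off:off + 4] = bytes(rgba)

    def draw_line(self, x0, y0, x1, y1):
        """Naive line drawing; crossing a tile boundary transparently
        allocates the neighbouring tile."""
        steps = max(abs(x1 - x0), abs(y1 - y0), 1)
        for i in range(steps + 1):
            x = x0 + (x1 - x0) * i // steps
            y = y0 + (y1 - y0) * i // steps
            self.set_pixel(x, y)

canvas = SparseCanvas()
canvas.draw_line(250, 10, 260, 10)  # crosses from tile (0,0) into (1,0)
```

<p>Empty regions of the map never cost any memory, which is what makes a zoom-18 rendering of a sparse track feasible at all.</p><p>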
Beware of the abominable coding style, though.</p>How to process three weeks of GPS data2008-11-14T20:26:00+00:00o'wolfhttps://woelfisch.de/blog/author/owolfhttps://woelfisch.de/blog/how-to-process-three-weeks-of-gps-data<p>Handling three weeks of GPS data collected on an RV trip through the American West is quite a challenge. Roughly 930,000 GPS positions are far too many to process directly. There are lots of wrong fixes that need to be removed, lots of identical or nearly identical positions, and a huge number of fixes located on a straight line. Feeding the data unfiltered to Google Earth results in GE drawing nonsense, locking up, or crashing. The gpsbabel filters don't help, either. Thus, I wrote my own filter. It performs several steps:</p><p><strong>Simple filtering</strong></p><p>First, invalid records and those with insufficient accuracy need to be removed. This is the easy part: the record has to be marked as "valid" and at least three satellites should be in view. Also, a record that is almost identical to the one before can be dropped right away.</p><p><strong>Validating fixes</strong></p><p>It is probably impossible to validate a set of fixes without taking into account how they were taken. A distance of 20 m covered within one second is not exactly spectacular when driving a car; I seriously doubt it would be possible on foot, though. Discarding records that would require an impossibly high speed eliminates most erroneous fixes quite reliably. But sometimes this is not good enough, especially at lower speeds. For example, it cannot catch that I certainly was not driving 75 mph on the Las Vegas Boulevard. Actually, you cannot tell without having a map of speed limits. Luckily, however, a vehicle has a maximum acceleration. Checking that records do not exceed this value removes the remaining spectacularly incorrect measurements. 
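</p><p>Such a plausibility check might look like the following Python sketch. It is only an illustration of the idea, not code from gpsfilter.pl; the names are mine, and the thresholds (assuming 140 mph and 6 m/s²) are my own choices for the example, since the real script makes them configurable:</p>

```python
import math

# Illustrative sketch of the plausibility check: drop a fix if reaching
# it would require an impossible speed, or an impossible acceleration
# relative to the previous leg. Thresholds are assumptions for this
# example, not taken from gpsfilter.pl.

MAX_SPEED = 140 * 0.44704   # 140 mph in m/s
MAX_ACCEL = 6.0             # m/s^2

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres (Haversine formula)."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def filter_fixes(fixes):
    """fixes: list of (t_seconds, lat, lon). Returns the plausible fixes."""
    out = []
    prev_speed = 0.0
    for t, lat, lon in fixes:
        if not out:
            out.append((t, lat, lon))
            continue
        pt, plat, plon = out[-1]
        dt = t - pt
        if dt <= 0:
            continue
        speed = haversine_m(plat, plon, lat, lon) / dt
        # note: only positive acceleration is tested, as discussed above
        if speed > MAX_SPEED or (speed - prev_speed) / dt > MAX_ACCEL:
            continue  # impossibly fast jump: discard the fix
        prev_speed = speed
        out.append((t, lat, lon))
    return out

track = [(0, 36.0, -115.0), (1, 36.00005, -115.0),
         (2, 37.0, -115.0),  # bogus 1-degree jump within one second
         (3, 36.0001, -115.0)]
clean = filter_fixes(track)  # the jump at t=2 gets discarded
```

<p>A discarded fix does not update the previous position, so the following fixes are checked against the last fix that was actually accepted.</p><p>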
A test for a maximum negative acceleration probably does not make that much sense, though.</p><p>Surprisingly, with the Navilock USB GPS receiver the number of entirely wrong locations is extremely low. It gives me one record with an impossibly high speed, and 28 with the maximum acceleration exceeded. The logs from the Garmin Geko had 1,552 records above maximum speed and 1,766 above maximum acceleration out of 379,445 records, with still a lot of errors undiscovered due to the strongly varying fix rate. But at least this eliminates the worst errors.</p><p>What I wasn't able to recognize automatically were the wrong fixes at very low speed, as illustrated by my last article on the subject. Does anyone have a good idea?</p><p><strong>Smoothing Curves</strong></p><p>One lesson I learned: you need to know what to google for, otherwise you'll reinvent the wheel. There are still far too many valid fixes remaining. How about finding straight lines as long as possible and removing any records located on or sufficiently near the line? Unfortunately, c't 19/2008 came too late, otherwise I wouldn't have tried to implement (effectively) a non-recursive version of the <a href="http://en.wikipedia.org/wiki/Ramer-Douglas-Peucker_algorithm">Ramer-Douglas-Peucker algorithm</a>. Well, recursion in Perl sucks anyway, and my version always gives the longest possible line, as I don't have to choose an end point. It is faster for short segments, but a lot slower for long ones. The results should be quite similar, though.</p><p>What gave me quite a headache was that I couldn't remember how to calculate the distance from a point to a line. Embarrassing, isn't it? But I was never good at geometry. With a sheet of graph paper I finally worked it out, though. We don't even need trigonometric functions. Well, strictly speaking we do, as we are operating on polar coordinates. Luckily, sin(x) ≃ x for small values of x, and we are talking about small differences between angles. 
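</p><p>Put together, the straight-line elimination might look like this Python sketch. It is my own illustration of the idea, not the actual Perl code. The flat-earth projection simply scales longitude by cos(latitude), which is where the small-angle approximation comes in; after that, plain planar geometry suffices for the point-to-line distance:</p>

```python
import math

# Sketch of the "longest possible line" simplification: project lat/lon
# onto a local flat plane, then greedily extend each segment from a fixed
# anchor for as long as every intermediate point stays within tolerance.
# Illustrative only -- names and details are assumptions, not gpsfilter.pl.

def point_segment_dist(p, a, b):
    """Distance from p to segment a-b in the local planar approximation."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def simplify(points, tolerance):
    """points: list of (lat, lon); tolerance in degrees of latitude."""
    if len(points) < 3:
        return list(points)
    # flat-earth projection: stretch longitude by cos(latitude); for the
    # small angular differences of a track segment, sin(x) ~ x holds
    lat0 = math.radians(points[0][0])
    flat = [(lat, lon * math.cos(lat0)) for lat, lon in points]
    out = [points[0]]
    anchor = 0
    i = 2
    while i < len(points):
        # can the segment anchor..i still cover all points in between?
        if all(point_segment_dist(flat[j], flat[anchor], flat[i]) <= tolerance
               for j in range(anchor + 1, i)):
            i += 1
        else:
            out.append(points[i - 1])
            anchor = i - 1
            i += 1
    out.append(points[-1])
    return out
```

<p>Unlike classic Ramer-Douglas-Peucker, this never recurses: it extends each segment from a fixed anchor until some intermediate point falls outside the tolerance, which matches the "longest possible line" behaviour described above.</p><p>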
I don't have to tell you that I'm not very good at trigonometry, either. <kbd>Geo::Distance</kbd> (used to calculate the distance between two points) uses the <em>Haversine</em> formula by default. The <em>Polar Coordinate Flat-Earth</em> formula would probably be sufficient, but I only read the perldoc after I had already processed all the data. The array operations are much more expensive than the actual calculations anyway.</p><p><strong>Putting it together</strong></p><p>The script is written in Perl, though I usually prefer Python. However, CPAN has the Geo::* packages, and I haven't found any similar package for Python. Using a compiled program would greatly improve the execution time, but for rapid prototyping, scripting languages are best. Extending the script to read not only NMEA records, but also GPX and gpsdrive log files was very easy; likewise, making everything configurable on the command line. The script writes a simple CSV file with the Unix time stamp, a type flag (0=track point, 1=first point of a track segment), latitude, longitude and altitude. It cannot be processed by gpsbabel without losing the time stamp; also, gpsbabel insists on creating track points for every segment. 
But <a href="http://tracks.yaina.de/source/csv2gpx.py">csv2gpx.py</a> converts it to a GPX file that can be read by gpsbabel and other programs.</p><p><strong>Results</strong></p><p><kbd>$ time bzcat usa-2008.nmea.bz2 | ../bin/gpsfilter.pl -s -i nmea -v 140 -a 6 -F /dev/null<br/>/dev/stdin (nmea) => /dev/null (csv)<br/>930067 records read<br/>99975 records written<br/>95 invalid records<br/>646152 identical records<br/>1 records with max speed exceeded<br/>28 records with max acceleration exceeded<br/>183675 records on straight line eliminated (approx.)</kbd></p><p><kbd>real 2m12.753s<br/>user 2m10.044s<br/>sys 0m2.440s<br/>$ <br/>$ time bzcat usa-2008.nmea.bz2 >/dev/null <br/>real 0m4.635s<br/>user 0m4.576s<br/>sys 0m0.032s<br/>$</kbd></p><p>The resulting set of positions is smaller than the one mentioned in an earlier article, as I changed the algorithm for finding the distance from a point to a line (see above). It does not make any visible difference when rendering the track.</p><p><strong>Show me the source!</strong></p><p>Sure, if you can stand my terrible Perl coding style: <a href="http://tracks.yaina.de/source/gpsfilter.pl">gpsfilter.pl</a>.</p><p>For documentation, read the source or run <kbd>gpsfilter.pl --help</kbd>. If you want to do console I/O on a certain non-Unix OS, you probably need to specify <kbd>CON:</kbd> as the input or output file, or modify the script accordingly.</p>I'm addicted2008-10-22T20:56:00+00:00o'wolfhttps://woelfisch.de/blog/author/owolfhttps://woelfisch.de/blog/im-addicted<p>I'm addicted to geo-location and geo-tagging now. I cannot stop playing with it. 
I've now geo-tagged my vacation photos from January and built a small JavaScript application based on the incredible <a href="http://www.openlayers.org/">OpenLayers</a> framework to show where I took photos, either on an <a href="http://www.openstreetmap.org">OpenStreetMap</a> map or on top of <a href="http://onearth.jpl.nasa.gov/">Landsat7 satellite images</a>:</p><p><a href="http://tracks.yaina.de/usa-2008.html" target="_blank"><img height="421" src="http://yaina.de/~jreuter/lj/usa-2008-photos.jpg" width="381"/><br/>RV trip 2008 with my parents</a></p><p>And while I was at it, the same for the 2006 RV trip with Kayjay and <a href="http://users.livejournal.com/lynard_/">lynard_</a>:</p><p><div style="width: 490px;"><div style="float: left"><a href="http://tracks.yaina.de/usa-2006-tracks.html" target="_blank"><img height="161" src="http://yaina.de/~jreuter/lj/usa-2006-track.jpg" width="241"/><br/>Just the track</a></div><div style="float: right"><a href="http://tracks.yaina.de/usa-2006.html" target="_blank"><img height="160" src="http://yaina.de/~jreuter/lj/usa-2006-photos.jpg" width="241"/><br/>With photos</a></div></div></p><p><div style="clear: both"><br/>Works at least with Firefox 3, Konqueror 3, Opera 9, Safari 3.1 and IE 7. Does not work with IE 6 and some builds of Konqueror 4. Google Chrome has issues.</div></p>Roadtrip 2008: The GPS Track2008-10-16T12:11:00+00:00o'wolfhttps://woelfisch.de/blog/author/owolfhttps://woelfisch.de/blog/roadtrip-2008-the-gps-track<p>I've finally visualized the GPS recording of the RV trip with my parents last winter. 
Unfortunately, LJ does not allow me to embed the interactive map, so you have to click on this static one to play with it:</p><p><a href="http://tracks.yaina.de/usa-2008-tracks.html"><img height="320" src="http://tracks.yaina.de/usa-2008-lj.png" width="480"/></a></p><p>Some facts about the map:</p><p><ul><li>4,273,354 GPS records logged in three weeks</li><li>930,283 GPS positions logged</li><li>177,717 GPS positions left after eliminating errors, (nearly) identical positions and positions on straight lines</li><li>That's far too much for real-time rendering, be it client- or server-side.</li><li>Rendering the map at zoom level 18 took 28 minutes and 44 seconds and required 4 GB of memory</li><li>Zoom level 18 has 49,895 tiles and 201 MB of PNG data</li><li>All zoom levels have 98,682 tiles in total.</li><li>Selectable layers: OpenStreetMap Mapnik, Osmarender and NASA Global Mosaic</li></ul></p><p><strong>Edit:</strong> doesn't work with IE6, though.</p>
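<p>For reference, those tile counts follow from the usual OpenStreetMap "slippy map" numbering, where zoom level z splits the world into 2^z × 2^z tiles of 256×256 pixels. Here is a Python sketch of the standard published formula (not code from my renderer; the Las Vegas coordinates are just an example):</p>

```python
import math

# Standard OpenStreetMap "slippy map" tile numbering: the Web Mercator
# world is divided into 2^zoom x 2^zoom tiles. This is the commonly
# published formula, not code from the renderer itself.

def deg2tile(lat_deg, lon_deg, zoom):
    n = 2 ** zoom
    xtile = int((lon_deg + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat_deg)
    ytile = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return xtile, ytile

# Example: the tile containing a point in Las Vegas at zoom level 18
vegas = deg2tile(36.1147, -115.1728, 18)
```

<p>The bounding box of a track in tile coordinates at each zoom level is what determines how many tiles actually have to be rendered.</p>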