Viking on macOS/OSX

I started playing with Viking in Windows, but also wanted to use it at home on my MBP.

Following the instructions in INSTALL, I began with:


I ran into some missing dependencies, so had to run beforehand:

brew install gnome-doc-utils

brew install gexiv2

brew install libmagic

I don’t need real-time GPS or Mapnik support, so I ran:

./configure --disable-realtime-gps-tracking --disable-mapnik

Follow the rest of INSTALL (make, make install).

See if it works:



My only remaining issue is that it doesn’t seem to show downloaded satellite tiles from Bing. I’m not sure what is happening there; this has been reported before.


uMap is the best web-GIS alternative to Google Maps

Really.  You should try it.


Pros:

  • Can split lines (Google Maps can’t).
  • Can move features to different layers (Google Maps can’t).
  • Can have advanced attributes, line styles.
  • Can see distances plotted with features.
  • Can turn layers on and off.
  • Can import data (GPX, KML).
  • Can limit who can see and who can edit.
  • Uses OpenStreetMap, so if you know your GPS is right and the basemap is wrong, you can change the basemap.


Cons:

  • No satellite imagery basemap.
  • Can’t switch layer display order (but can duplicate layers, might be possible to change order that way).

I haven’t played with everything, but I’ve built a map for a race I’m directing and I’m sharing this with the participants.

Creating a live-tracking SPOT map with IFTTT and Google Drive

My wife is doing an endurance cycling event in the near future, and so I was inspired to create a public live-tracking map to relay her progress to others.  It wasn’t too hard to set up; if you have a SPOT tracker, follow the instructions below.  These instructions can be modified to take an SMS text as well, as long as the formatting remains constant.

0.  Make sure your SPOT device is sending emails to the Gmail address you have set up for IFTTT.

1.  Set up an IFTTT action like this:

– Trigger: new email from ________ (in my case, the source of the SPOT email)

– Action: add row to spreadsheet.

– Formatted row: {{ReceivedAt}} ||| {{BodyPlain}} ||| =(split((TO_TEXT(INDIRECT(ADDRESS(ROW(); COLUMN()-1)))), " :"))

This format takes the received time and the body, then splits the body on two delimiter characters, ‘ ’ (space) and ‘:’ (colon), to match the way the email is formatted.  The neat “take the column before this one” trick I borrowed from another IFTTT action.
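Outside of Sheets, the same two-delimiter split looks like this in Python (the body string here is a made-up stand-in; your actual SPOT email format may differ):

```python
import re

# A made-up body in the style of a SPOT check-in email (real format may differ)
body = "SPOT Tracker Latitude:-33.86000 Longitude:151.20000 Date/Time:07/01/2013 10:15:00"

# Split on runs of spaces and colons, mirroring the Sheets SPLIT(..., " :") call,
# which treats each character of the delimiter string as its own delimiter
fields = re.split(r"[ :]+", body)
print(fields)
```

Note that colons inside the time stamp get split too, exactly as in the spreadsheet, which is why the header row ends up with so many columns to label.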

2.  Run the action once and open up your new Google Drive spreadsheet.

– Add a header row and fill it in.  Call the first column “title” and then find which columns contain the latitude and longitude and label them respectively.  You will have a bunch of columns because of the length of the email body.  I would tell you which column numbers to name, but it depends on the number of words in your SPOT device name.

– Fill in the rest of the header row with something for each column that has a value in it (this is so Google Maps can process the spreadsheet).  I chose to call all the non-vital columns “ignore”.  As long as you leave the header row, you can clear this spreadsheet as often as you like, and IFTTT will just add new data to the first empty row.

3.  Head to the Spreadsheets Map Wizard and follow the instructions.  This is where you actually build the map.  (I am not responsible for that site.)

4.  Copy the output from the Spreadsheets Map Wizard into a new .html file and upload it to a server somewhere.  I suppose you could theoretically even share it online via Dropbox.

5.  Success (I hope)!

Obviously, there are a lot of places this workflow can go wrong, so take your time and double-check each step before moving on.  Note that the spreadsheet wizard may stop working soon because Google is dropping v2 of the Google Maps API sometime “in early 2013.”  I can’t guarantee I’ll be able to update my map to v3 of the API because I’m not the greatest with JavaScript.

My finished product (with some extra KML layers) is here:


ArcGIS Tricks

Since I now use ArcGIS for my job (instead of QGIS, which I used for everything else, except for Drupal, where I use OpenLayers), I have to re-learn everything I once knew.  Part of this involves the nitpicky issues that span both ESRI’s programming choices and their documentation.  I will continue to update this post as I learn more.

ArcGIS 10.1:

  • When using the Contour tool, there cannot be any spaces in the output path.  Not just the filename, but the whole path to that file.  The full list of characters you can’t use (just to be safe) is here (even though the output is a vector).
    • Errors you may get: 010328 : Syntax error, 010267 : Syntax error in parsing grid expression.
  • When exporting a shapefile that has a joined table (i.e., to create a new shapefile so you can query the joined data), the filename of the table being joined needs to be pretty short.  Otherwise, the shapefile is not exported completely and there are no errors; Arc actually “finishes” and lets you add the new shapefile.  Taking spaces out of the filename probably can’t hurt either.

Using R to Create a Winner-Takes-All Raster from Point Data


I am to the point in my dissertation where I need to summarize a fair amount of data.  The full data structure will be in the final manuscript, but for now suffice it to say that I am storing data as latitude/longitude pairs in a flat-file database (and using Panorama Sheets to query and modify it as needed).  Point data can be exported from this database as a CSV (actually tab-separated) file and converted to a shapefile in QGIS.

For this project, the globe is divided into 1×1 degree cells.  Each cell in my study area contains at least one vector point in my locality shapefile (exported from the database, above).  Many cells contain multiple points, each of which may represent a locality interpreted as representing a different paleoenvironment than the others.  I am using R to summarize these data points into a raster for each time slice I wish to study.

The method below employs a winner-takes-all approach: during conversion from point data to raster, the points within each cell are evaluated for their values in the field ‘env_id’ (an environment ID number, since we can’t store text in a raster), and the most frequent value of ‘env_id’ “wins” the cell and becomes the value burned into the output raster.
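Before the R code, the winner-takes-all reduction can be illustrated in miniature.  This Python sketch uses hypothetical cell indices and env_id values, not my actual data:

```python
from collections import Counter

# Hypothetical (cell_index, env_id) pairs for one time slice
points = [(0, 3), (0, 3), (0, 5), (1, 2), (1, 2), (1, 2), (2, 7)]

# Group env_id values by cell
cells = {}
for cell, env_id in points:
    cells.setdefault(cell, []).append(env_id)

# The most frequent env_id "wins" each cell (the first mode wins a tie)
winners = {cell: Counter(vals).most_common(1)[0][0] for cell, vals in cells.items()}
print(winners)  # {0: 3, 1: 2, 2: 7}
```

The rasterize() call below does the same thing per grid cell, with mfv() playing the role of Counter.most_common().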

### Converting a point shapefile (with environment in numeric 
### field env_id) into a raster
### Winner-takes-all.

# Load libraries for raster, maptools, and the mode function (mfv)
library(raster);
library(maptools);
library(modeest);

# Read in the locality point shapefile ("localities.shp" is a placeholder name)
locs<-readShapePoints("localities.shp");

# Make the base raster: global extent, 1x1 degree cells.
basegrid<-raster(ncols=360, nrows=180);

# Create a subset for the age you are looking at
# (assumes a numeric 'age' field; adjust to your attribute names)
locs_subset<-subset(locs, locs$age==10);

# Convert the locality shapefile to raster.  Use the most frequent
# value of numeric field 'env_id'; mfv(x)[1] takes the first mode if tied.
r<-rasterize(locs_subset, basegrid, field='env_id', fun=function(x, ...){mfv(x)[1]});

# View the plot (centered on SA)
# plot(r, xlim=c(-90, -30), ylim=c(-60, 15));

# View the plot (all)
# plot(r);

# Write the raster to file as a GeoTIFF (no spaces in the filename)
writeRaster(r, "10Ma.tif", format="GTiff");


Output from R looks something like the top image, and can be manipulated as any other raster in QGIS etc.  Scale bar shows the numeric values used to represent different paleoenvironments in the raster; it’s not actually a continuous scale but I didn’t have a need to change the default R output.

Additional notes:

  • In rasterize(), to use a function other than the ones enumerated in the docs, it needs to be defined within the call to rasterize(), as is done above and here.
  • One example of reading in shapefile data to R is here.
  • I learned about package modeest here.

List of GPlates rotation files

My dissertation is on BIOgeography, I swear!  I’ve had a hard time finding plate tectonic reconstructions (rotation files, .rot or .grot) to use in GPlates (hopefully a post on that sometime soon), so I will try to add to this list as I learn more.

Global rotation files:

  • First, the main GPlates Download page.  Scroll down to the “Download GPlates-compatible data” section, which contains some of what is listed here.
  • GPlates official sample data was commented on here.
  • EarthByte rotations go back to 140 Ma.
  • Caltech rotations (which may or may not be different from EarthByte) go back to 140 Ma.
  • Seton et al. (2012) rotations go back to 200 Ma.
  • Golonka (2007) has global rotation data in the supporting online material for the Late Triassic-Early Jurassic.
  • Supporting data for Williams et al. (2012), available from an FTP link in the text, has a rotation file from 1100 Ma to 530 Ma based on “the model of all Rodinia described by Li et al. (2008)” (SupplementaryTutorial2.pdf, p. 5).

Regional rotation files:

  • Australia
    • A rotation file for Australia (~1100 Ma – 530 Ma) by Giles et al. (2003) is included in the Williams et al. (2012) supporting data, above. 
    • A rotation file for Australia (~1100 Ma – 530 Ma) by Henson et al. (2011) is included in the Williams et al. (2012) supporting data, above.
    • A rotation file for Australia (~1100 Ma – 530 Ma) by Li and Evans (2011) is included in the Williams et al. (2012) supporting data, above.


GIS: Raster cell centroids as vector points with GDAL and R

Short intro: I’m using raster data for part of my dissertation, but I’m storing the paleoenvironment data in a database as the latitude and longitude of the center of each grid cell. This lets me query my environment localities, add new ones, and make specific datasets.

Sometimes I take existing maps of other people’s environmental interpretations and digitize them, leaving me with vector polygon data. I need to convert the polygons into the same kind of data I’ve been storing. This is surprisingly hard in both QGIS and GDAL alone, so I finally figured out how to do it in R today courtesy of Simbamangu over at StackExchange.
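The centroid arithmetic itself is simple: each centroid sits half a cell in from the grid origin.  A quick Python sketch, with grid parameters assumed to match the 360×180, 1-degree global rasters used here:

```python
# Centroid of cell (col, row) in a 360x180 global grid: origin at (-180, 90),
# 1-degree cells, row 0 at the top (standard north-up raster layout)
def cell_centroid(col, row, xmin=-180.0, ymax=90.0, xres=1.0, yres=1.0):
    lon = xmin + (col + 0.5) * xres
    lat = ymax - (row + 0.5) * yres
    return lon, lat

print(cell_centroid(0, 0))      # (-179.5, 89.5): top-left cell
print(cell_centroid(359, 179))  # (179.5, -89.5): bottom-right cell
```

The GDAL and R steps below do the heavy lifting of attaching values to those points and filtering them.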

See the attached images for how things look at each stage.

Below are the general terminal commands, first in GDAL and then R. I will try to add more rationale and explanation later. My operating system is Mac OS X, but these commands should be easy to adapt.

GDAL in your terminal

gdal_rasterize -at -where 'freshwater = 1' -a freshwater -ts 360 180 -te -180 -90 180 90 -l "source_layer_name_in_shapefile" "shapefile.shp" "output_raster.tif"

This produces a GeoTIFF of your input polygons where the attribute ‘freshwater’ is equal to ‘1’. For more detail, I invite you to read the GDAL documentation; it’s quite good.


To run through a single file:
# load necessary libraries
library(raster);
library(maptools);

# Load the raster from a file
r <- raster("rastername.tif");

# Convert to spatial points
p <- as(r, "SpatialPointsDataFrame");

# Set name of column of raster values for later use
names(p)[1]<-"value";

# Get only the values we want (values above zero) in that column
psubset<-subset(p, p$value>0);

# Save as a shapefile (extension appended automatically)
writeSpatialShape(psubset, "outputfile");

You can also loop through a directory:
# load necessary libraries
library(raster);
library(maptools);

# Set working directory (placeholder path)
setwd("/path/to/rasters");

# Loop over all .tif files in the working directory
for (file in dir(pattern="*.tif")) {
  r <- raster(file);
  p <- as(r, "SpatialPointsDataFrame");

  # Set name of column of raster values
  names(p)[1]<-"value";

  # Get only the values we want (values above zero)
  psubset<-subset(p, p$value>0);

  # Make a new filename: drop the ".tif" extension and append "pts"
  oldname<-substr(file, start = 1, stop = nchar(file) - 4);
  newname<-paste(oldname, "pts", sep=" ");

  # Make a shapefile
  writeSpatialShape(psubset, newname);

  # Make a CSV file of the coordinates
  # Convert to data frame so you can drop extra columns
  psubsetdf<-as.data.frame(psubset);
  write.table(psubsetdf[,c(2,3)], file=paste(newname, ".csv", sep=""), sep=",", row.names=FALSE);
}

[notes] Converting MrSID to GeoTIFF using GDAL in Mac OS 10.6 (Snow Leopard)

Just found out how to convert MrSID files (from NZ geology) to other formats using GDAL.

  • Followed instructions here:
  • Needed most recent version of GDAL with frameworks (1.8 Complete):
  • Had to change command to “/Library/Frameworks/GDAL.framework/Versions/Current/Programs/gdal_translate -of PNG /Users/Matt/Desktop/whangarei/inputmap.sid /Users/Matt/Desktop/whangarei/outputmap.png”
  • That makes an image file, but I think I’ll convert to GeoTIFF for GIS:
    • /Library/Frameworks/GDAL.framework/Versions/Current/Programs/gdal_translate /Users/Matt/Desktop/whangarei/inputmap.sid /Users/Matt/Desktop/whangarei/outputmap.tif
  • Map source:
  • Map coordinate system: New Zealand Map Grid (Geodetic Datum 1949)
  • gdal_translate man:

Wanted (and some found): Geologic maps (formation contacts) of South America countries, for use in GIS

Here’s a good ol’ plea for help from the scientific community. As my questions to “the scientific community” have gone unnoticed, I’m posting this out here to see if anyone else searching for the same thing has had any luck. I’m building a GIS (geographic information system) model to determine the possible biogeographic distribution of a genus through time. What I need for this, since the fossils are from South America, is a good geologic map, either of most of the continent or of the countries of Brazil, Argentina, Ecuador, and Peru. Hunting around online hasn’t shown me anything that I want to shell out a bunch of money for, sight unseen, and I’m honestly trying to avoid having to scan paper maps and register them (additionally, I haven’t been able to access our worldwide collection of paper geologic maps recently). 1:500,000 or 1:100,000 would be great, but I’d even take 1:1,000,000 at this point. I need something with formational contacts so I can plot possible distributions. So that’s the challenge of the day: have any geologists in South America discovered a good source for this type of material? Would you be willing to share or trade? Drop me an email at one of the addresses on the right sidebar if you can help.


UPDATE 2011-08-15: I found (finally) an online version of the Geological Map of Brazil at 1:1,000,000 scale. Unfortunately it is only available in a viewer, broken up into small sections. Anyone know where I can download the full dataset, or at least that data for those sections? UPDATE 2011-08-15 1431: Thanks to Sidney Goveia over at Geosaber, I tracked down the Brazil data to Geobank, so here goes nothing! If you are using a Mac, make sure you extract the contents of the ZIP files using StuffIt Expander (if you have it) rather than Archive Utility, otherwise you don’t end up with a folder.


UPDATE 2011-08-16: You can download a formation-level geological map of Peru at the 1:1,000,000 scale from INGEMMET – the Instituto Geologico Minero y Metalurgico. There are a few steps through the online viewer (note: this is the new viewer, so the following steps may not work the same), which may appear in Spanish at first but for some reason decided to reload partway through in English. If you want, you can try to figure out how to turn on the layer under the Map–>Geodatabase menu, but really when that menu comes up you want to click the Download/Descargas folder icon, then select the Geologia layer (SHP icon next to it). It also looks like you can download the data as a KML file for Google Earth. A structural layer (Dominios Estructurales) also looks available, and I checked out the radiometric date layer (Dataciones Radiometricas) as well, which could be useful. You might need to check the projection on the geologic map once you download it. I imported it to QGIS at first and it came up as WGS 84 (probably because of the existing project), but I dug around and figured out that it works as PSAD_1956_UTM_Zone_18S. Need a map key for formation symbols? You can download scans of the paper version of this map from here, one of which has the legend. If the correct map doesn’t come up at that link, click on “Ministerio de Energia y Minas. Instituto de Geologia y Mineria” to see the others. The symbology is different. An alternative large-scale (1:100,000) but low-quality set of maps is available here but will not easily go into GIS.

kamelseesv has been updated

After a little bit of troubleshooting yesterday with some input files, I’ve updated my kamelseesv script to add a few more lines about the output format (latitude and longitude will appear reversed in the KML file, but they are really in X,Y format instead of Y,X format) and to allow you to view/download the result rather than copying and pasting into a text file. This should let people preview converted datasets without having to save each and every one of them.
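Since the X,Y ordering surprises people: KML writes coordinates longitude first, so a point that reads "latitude first" everywhere else looks reversed inside the file. A minimal Python illustration with a hypothetical point:

```python
# KML <coordinates> are written lon,lat[,alt] (X,Y order), so a point that is
# quoted "latitude, longitude" in prose appears reversed inside the KML file
lat, lon = -36.85, 174.76  # a hypothetical point near Auckland

kml_coord = f"{lon},{lat}"
print(kml_coord)  # 174.76,-36.85
```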