Further Innovations in Campus Mapping

A year or so ago I posted some quick round-ups of recent innovations (at the time) in UK HE’s use of online campus maps (Open Data Powered Location Based Services in UK Higher Education, Innovations in Campus Mapping). Retweeting a post from Mike Nolan (Mapping the campus) about Edge Hill’s use of OpenStreetMap as the basis of an interactive campus map (a nice example of how local benefits can support the common good?), I got a couple of tweeted responses about the use of OpenStreetMap elsewhere:

– @danmcquillan mentioned how “my social computing 2nd years added detail & edits to @openstreetmap for @GoldsmithsUoL” [GoldsmithsUoL on OpenStreetMap]

– @julieallinson sent a link to the blog post that announced the University of York’s latest interactive map release, as well as to a behind the scenes post, which describes their use of custom tiles from CloudMade with the Google Maps API providing the interactive map infrastructure. Data for location markers is stored in a Google Spreadsheet and then transformed into an appropriate JSON format via YQL, which is glued into the map using a dash of jQuery (a minimal sketch of that sort of glue is shown below). Magic:-)
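
To make the pattern concrete, here’s a minimal sketch (not the York team’s actual code) of that sort of glue: pull rows from a published spreadsheet as JSON via a YQL csv query, then drop markers onto a Google Map with a bit of jQuery. The spreadsheet URL, the column names (name, lat, lng) and the loadMarkers() function name are all invented for illustration.

function loadMarkers(map) {
  // Published CSV view of the spreadsheet (placeholder key)
  var csvUrl = 'https://spreadsheets.google.com/pub?key=YOUR_KEY&output=csv';
  // YQL turns the CSV into JSON rows with named columns
  var yql = 'select * from csv where url="' + csvUrl + '" and columns="name,lat,lng"';
  var endpoint = 'http://query.yahooapis.com/v1/public/yql?format=json&callback=?&q=' + encodeURIComponent(yql);

  $.getJSON(endpoint, function (data) {
    $.each(data.query.results.row, function (i, row) {
      // One marker per spreadsheet row
      new google.maps.Marker({
        position: new google.maps.LatLng(parseFloat(row.lat), parseFloat(row.lng)),
        title: row.name,
        map: map
      });
    });
  });
}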

I was also interested to see how the York U Estates department was releasing information about their building codes.

A quick trawl turned up a couple of other approaches to campus mapping that I don’t think I’ve mentioned before:

– the University of Warwick interactive map appears to use Bing maps, and as part of the offering provides a crude room level search:

– From the GoGeo team at Edina, a how-to post on creating simple campus maps using Digimap ROAM. Digimap ROAM is a service that allows you to annotate Ordnance Survey maps that can then be “printed off” as PDF documents.

The GoGeo blog post ends with a tease – “In the next post, we will look at some more advanced uses of Digimap data in campus maps” – but I can’t find any evidence of such a follow on post appearing?

So – any other innovations in campus based interactive maps out there (and in particular, web team blog posts about some of the technical details, including links to github repos containing the relevant code, perhaps?!;-)

PS Further OSM based campus maps: University of Cambridge

Playing With R/ggplot2 Online (err, I think..?!)

Trying to get my head round what to talk about in another couple of presentations – an online viz tools presentation for the JISC activity data synthesis project tomorrow, and an OU workshop around the iChart eSTeEM project – I rediscovered an app that I’d completely forgotten about: an online R server that supports the plotting of charts using the ggplot library (err, I think?!): http://www.yeroon.net/ggplot2/


Example of how to use http://www.yeroon.net/ggplot2/

By the by, I have started trying to get my head round R using RStudio, but the online ggplot2 environment masks the stats commands and just focusses on helping you create quick charts. I randomly uploaded one of my F1 timing data files from the British Grand Prix, had a random click around, and in 8(?) clicks – from uploading the file, to rendering the chart – I’d managed to create this:

ggplot - British Grand Prix

What it shows is a scatterplot for each car of how far behind the leader that car is (in seconds) on each leader lap. When the plotted points drop from 100 or so seconds behind to just a few seconds behind, that car has been lapped.

What this chart shows (which I stumbled across just by playing with the environment) is a bird’s-eye view over the whole of the race, from each driver’s point of view. One thing I don’t make much use of is the colour dimension – or the size of each plotted point – but if I tweaked the input file to include the number of laps a car is behind the leader, their race position, the number of pitstops they’ve had, or their current tyre selection, I could easily view a couple more of these dimensions.

Where there’s a jump in the plotted points for a lap or two, if the step/break goes above the trend line (the gap to the leader increases by 20s or so), the car has pitted before the leader. If the jump goes below the trend line (the gap to the leader has decreased), the leader has pitted before the car in question.

But that’s not really the point; what is the point is that here is a solution (and I think mirroring options are a possibility) for hosting within an institution an interactive chart generator. I also wonder to what extent it would be possible to extend the environment to detect single sign on credentials and allow a student to access a set of files related to a particular course, for example? Alternatively, it looks as if there is support for loading files in from Google Docs, so would it be possible to use this environment as a way of providing a graphing environment for data files stored (and maybe shared via a course) within a student’s Google Apps account?

Fragments: Accessing YouTube Account Data in Google Spreadsheets via OAuth

If you’re running a Youtube account, how might you collect Insights data for all your videos as spreadsheet entries that can be used in the preparation of reports about your social media effectiveness?

One way might be to go to each video in turn and download the separate CSV data files created for each video. Alternatively, you can grab the data via the YouTube/GData API (http://code.google.com/apis/youtube/2.0/developers_guide_protocol_insight.html).

I haven’t actually got round to getting any data out of my YouTube account and into a Google spreadsheet yet, but I have done the first step, which is to set up the authentication using OAuth. Here’s the Google Apps script I used…

function youtube(){
  // Setup OAuthServiceConfig
  var oAuthConfig = UrlFetchApp.addOAuthService("youtube");
  oAuthConfig.setAccessTokenUrl("https://www.google.com/accounts/OAuthGetAccessToken");
  oAuthConfig.setRequestTokenUrl("https://www.google.com/accounts/OAuthGetRequestToken?scope=http%3A%2F%2Fgdata.youtube.com%2F");
  oAuthConfig.setAuthorizationUrl("https://www.google.com/accounts/OAuthAuthorizeToken");
  oAuthConfig.setConsumerKey("anonymous");
  oAuthConfig.setConsumerSecret("anonymous");

  // Setup optional parameters to point the request at the OAuthConfigService.  The "youtube"
  // value matches the argument to "addOAuthService" above.
  var options =
    {
      "oAuthServiceName" : "youtube",
      "oAuthUseToken" : "always"
    };

  var result = UrlFetchApp.fetch("http://gdata.youtube.com/feeds/api/users/default/favorites?v=2&alt=json", options);
  var o  = Utilities.jsonParse(result.getContentText());
  Logger.log(o)
}

[Gist here: https://gist.github.com/1067283]

The first time you run the script, it should request access from your YouTube account…

The next step is to work out what to pull from YouTube, and how to actually store it in the spreadsheet… (a rough sketch of the storing part is below).
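
By way of illustration, here’s a rough sketch (an assumption about what that next step might look like, not code from a working script) that takes the parsed favorites feed object from the function above and writes the video titles into the active sheet; the writeFavoritesToSheet() name and the one-column layout are made up, though the feed.entry[].title.$t path is the standard GData v2 JSON layout.

function writeFavoritesToSheet(o) {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getActiveSheet();
  var entries = (o.feed && o.feed.entry) ? o.feed.entry : [];
  var rows = [];
  for (var i = 0; i < entries.length; i++) {
    // One row per favourited video, just the title for now
    rows.push([entries[i].title.$t]);
  }
  if (rows.length > 0) sheet.getRange(1, 1, rows.length, 1).setValues(rows);
}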

PS a couple more Youtube snippets of interest:
YouTube documentation wizard: customise your YouTube API documentation view
interactive YouTube API explorer

Google Visualisation API Controls Support Interactive Data Queries Within a Web Page

The only way I can keep up with updates to Google warez at the moment is to feed off tips, tricks and noticings shared by @mhawksey. Yesterday, Martin pointed out to me a couple of new controls offered by the Google visualization API – interactive dashboard controls (documentation), and an in-page chart editor.

What the interactive components let you do is download a dataset from a Google spreadsheet and then dynamically filter the data within the page.

So for example, over on the F1Datajunkie blog I’ve been posting links to spreadsheets containing timing data from recent Formula One races. What I can now do is run a query on one of the spreadsheets to pull down particular data elements into the web page, and then filter the results within the page using a dynamic control. An example should make that clear (unfortunately, I can’t embed a live demo in this hosted WordPress blog page:-(
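
For reference, here’s a sketch of the query step that feeds the snippet below (the spreadsheet key and the column letters are placeholders, not those of the actual F1 sheets): the visualization API Query object pulls selected columns from the published spreadsheet and hands a DataTable to a callback.

google.load('visualization', '1', {'packages': ['controls']});
google.setOnLoadCallback(function () {
  var query = new google.visualization.Query(
    'https://spreadsheets.google.com/tq?key=YOUR_SPREADSHEET_KEY');
  // Visualization API query language: pick the columns we want in the page
  query.setQuery('select A, B, C');
  query.send(function (response) {
    if (response.isError()) return;
    // response.getDataTable() is what the dashboard code below works with
  });
});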

I’ve posted a copy of the code used to generate that example as gist here: Google Dynamic Chart control, feeding off Google Spreadsheet/visualisation API query

Here’s the key code snippet – the ControlWrapper populates the control using the unique data elements found in a specified column (by label) within the downloaded dataset, and is then bound to a chart type which updates when the control is changed:

  var data = response.getDataTable();
  var namePicker = new google.visualization.ControlWrapper({
    'controlType': 'CategoryFilter',
    'containerId': 'filter_div',
    'options': {
      'filterColumnLabel': 'driver',
      'ui': {
        'labelStacking': 'vertical',
        'allowTyping': false,
        'allowMultiple': false    
      }
    }
  });

  var laptimeChart = new google.visualization.ChartWrapper({
    'chartType': 'LineChart',
    'containerId': 'chart_div',
    'options': {
      'width': 800,
      'height': 800
    }
  });
  
  var dashboard = new google.visualization.Dashboard(document.getElementById('dashboard_div')).
    bind(namePicker, laptimeChart).
    draw(data)

As well as drop down lists, there is a number range slider control which can be used to set the minimum and maximum values of a numerical filter, and a string filter that lets you filter data within a column using a particular term (it doesn’t seem to support Boolean search operators though…); a quick sketch of how those controls are declared is given below. Read more about the controls here: Google visualisation API chart controls
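
For example (a sketch based on the documented control types rather than on the F1 demo itself – the 'laptime' column label and the container div ids are invented), a slider over a numeric column and a free-text filter over the driver column might be declared like this:

  var laptimeSlider = new google.visualization.ControlWrapper({
    'controlType': 'NumberRangeFilter',
    'containerId': 'slider_div',
    'options': {
      'filterColumnLabel': 'laptime'
    }
  });

  var driverSearch = new google.visualization.ControlWrapper({
    'controlType': 'StringFilter',
    'containerId': 'search_div',
    'options': {
      'filterColumnLabel': 'driver'
    }
  });

These can be bound into the same Dashboard object as the CategoryFilter above.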

Something else I hadn’t noticed before: sort events applied to tables can also be used to trigger the sorting of data within a chart, which means you can offer interactions akin to some of those found on Many Eyes.
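
As a rough sketch of the wiring (the table, data, chart and chartOptions variables here are assumed to be a google.visualization.Table drawn with the {sort: 'event'} option, its DataTable, and a chart sharing that data – none of this is lifted from a worked example):

  google.visualization.events.addListener(table, 'sort', function (e) {
    // Re-sort the shared DataTable to match the column the user sorted on...
    data.sort([{'column': e.column, 'desc': !e.ascending}]);
    // ...and redraw the chart against the newly ordered data
    chart.draw(data, chartOptions);
  });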

Whilst looking through the Google APIs interactive playground, I also noticed a couple of other in-page data shaping tools that I hadn’t noticed before: group and join

Group, which lets you group rows in a table and present an aggregated view of them:

That is, if you have data loaded into a datatable in a web page, you can locally produce summary reports based on that data using the supported group operation?
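
Something along these lines, for example (a minimal sketch of google.visualization.data.group() – the column numbers are purely illustrative): group the rows of a DataTable by the values in column 0 and sum the numbers in column 3 within each group.

  var grouped = google.visualization.data.group(
    data,
    [0],
    [{'column': 3, 'aggregation': google.visualization.data.sum, 'type': 'number'}]
  );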

There’s also a join operation that allows you to merge data from two datatables where there is a common column (or at least, common entries in a given column) between the two tables:

What the join command means is that you can merge, within the page, data from separate queries run over one or more Google spreadsheets.
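
Here’s a minimal sketch of the call (dt1 and dt2 stand for two DataTables obtained from separate queries; the column numbers are illustrative): a full join on the first column of each table, keeping column 1 from each side.

  var joined = google.visualization.data.join(dt1, dt2, 'full', [[0, 0]], [1], [1]);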

With all these programming components in place, the Google visualisation API now provides comprehensive support for doing all sorts of interactive visualisations within the page. (I’m not sure of any other libraries that offer quite so many tools for wrangling data in the page? The YUI datatable supports sorting and filtering, but I think that’s about it for data manipulation?)

I guess it also means that you can start to treat a web page as a database containing one or more datatables within it, along with tool support/function calls that allow you to work that database and display the results in a variety of visual ways?! And more than that, you can use interactive graphical components to construct dynamic queries onto the data in a visual way?!

PS here are a couple of other ways of using a Google spreadsheet as a database:
Using Google Spreadsheets as a Database with the Google Visualisation API Query Language
Using Google Spreadsheets Like a Database – The QUERY Formula

Visual UI Editor For Google Apps Script

One of the things that’s still on my to-do list is to get round to playing more with Google Apps script, particularly the ability to create web-hookable services around spreadsheets and generate custom user interfaces to Apps Script powered applications.

Here’s something I think I’ll need to add to the list, as brilliantly spotted by @mhawksey: an experimental visual editor for creating forms in Google Apps Script:

Visual UI builder in Google Apps script

Here’s a preview of it in action:

Google Apps UI builder

And again from Martin, here’s a link to the developer forum group discussion about how to use it in its current experimental state…

I have to pop out now, so no time to play, but I thought it worth a mention… (Martin will probably have a post up about it before long;-)

PS also of note, for another sign of how the route to open peer appraisal and peer-supported CPD might work out, check out Martin’s draft application for the ALT Learning Technologist of the Year Award 2011.

PPS in passing, via @schmerg, an HTML5 visual editor for browser based UIs: Maqetta

Google Books Library Shelves

It’s been some time since I last had a look at the “My Library” service in Google Books, but with the announcement of Google eBooks store (currently US only, except for out-of-copyright free downloads) I popped over to my Google Books account to see whether anything else had changed…

One of the little known (I think?) features of Google Books is the “My Library” personalisation, which allows you to create a collection of books and search over them. Searching your library finds all the books in your library collection that contain the search phrase; if a preview of the book is available, the results include deep links into the book at the point(s) where the search terms appear:

Search within a book on google books

I’ve previously commented on the My Library aspect of Google Books in the context of its possible use by libraries for providing a full-text search option over books in their collection (e.g. Complementing the OPAC With a Full Text Search Book Catalogue where I describe the use of the service by Wiltshire Heritage Library (example) and the Penn State University Press booksearch (example)).

(At the moment I don’t think you can get statistics back on the searches carried out on a My Library profile, though Google books can do stats for publishers e.g. Google Books for Publishers).

Anyway – one of the problems I originally had with My Library was that you could only maintain a single collection. But it seems that it’s now possible to create separate collections by tagging books in your Library onto “shelves”:

Google Books - My Library

(Shelves appeared at the start of 2010, it seems: Updated Books Home Page and My Library.)

So what immediately comes to mind is that if you’re running several courses, you could add the books used in the course to a My Library shelf, and then publish a link to a search context for that shelf to give a full text searchable version of the books on the list (assuming they’ve been scanned by the Goog, of course). Where previews are available, deep links into books will be available as part of the search results.

I haven’t really populated any shelves yet, but here’s the idea:

Google books - My library search

I haven’t explored the Book Search Data API yet, but it does seem to offer the ability to search over a particular user’s public library, as well as retrieve lists of books from the library. API options also exist for adding books to a library, though the API seems to only support adding labels, rather than updating shelves (or maybe legacy handlers map labelled books onto shelves?). With a bit of digging, it might be possible to find a route to automate the creation of a library shelf from a list of books. (Hmmm, maybe I should try this with the OU Set books list?!;-)

Google Books shelves thus seem to provide a way of creating different lists of books within a single user library, although I’m not sure if there is a limit on the number of books contained within a shelf, or in the library as a whole. Another nice feature is that it’s possible to select a shelf based filter to just display books from a particular shelf (click on the label in the left-hand sidebar to filter by shelf); this search facet also seems to be passed through to a bookmarkable URL for the filtered search via the as_coll argument (I think?). (Which is to say: you can share a link for a search within a particular shelf in a particular user’s library.)

I’m not sure if Google Books is available through Google Apps for Education, but it could be a useful component of a full text book search context around books on a reading list?

PS As Google Scholar appears to be improving its coverage, it strikes me that the Goog still doesn’t offer a Google service for building searchable reference lists, although it does let you customise the addition of links that will bookmark a reference to a service for you:

Google scholar citation linker

Here’s how the links are displayed:

Google scholar results

Given you can build weblink search contexts using Google custom search engines, full text book search contexts using the Books My Library service, search over content from bundled feeds in Google Reader and even run things like video search by user on Youtube*, the Goog must surely be looking to offer a collection building and searching over service for Google Scholar? So I wonder… could Google end up taking over a service like CiteULike or Mendeley to complement and bootstrap personalisation of their Google Scholar offering? Or would they just build their own (cut down) version of these services?

* Hmm… I wonder if there’s a Youtube API switch that lets you search playlists? It’s definitely possible to get a playlist feed out…

PPS the Goog is also lacking a way of exposing all these personal search contexts to a logged in user through the same interface. If it were down to me, I’d start to expose them in the left hand sidebar of Google websearch, so I’m guessing this will be a labs/experimental service in the new year, if it isn’t already so…

Google search tools

…maybe…?;-)

onFormSubmit – Raising Web Scale Events in Google Spreadsheets

What happens if you want to actually do something with a particular response from a web survey form at the time it is submitted, other than just collect it?

One of the handy things about Google Spreadsheets is the ability to create interactive web survey forms that can collect data that is then posted into a corresponding spreadsheet. Around the time of the Google I/O event, several event related features were released as part of Google Apps script, the javascript scripting framework that supports an increasing number of Google apps. And by “event” I don’t mean something like the upcoming Isle of Wight Festival – I mean computational events, that can be used to trigger other computational actions…

One of the new events is onFormSubmit, which I finally got round to playing with last night. Here’s my “Hello World” example:

So here’s the code:

//Test function for Google Apps Script onFormSubmit
//Two sheets in a single spreadsheet doc
//First sheet corresponds to form
//Second sheet just displays one of the elements from the most recent form submission
// the function testOnSub() has a trigger associated with it: 'From spreadsheet' 'On form submit'
function testOnSub() {
  var ss = SpreadsheetApp.openById(SPREADSHEET_KEY);
  var sheet=ss.getSheets()[1];
  var form=ss.getSheets()[0];
  var lr=form.getLastRow();
  var el=form.getRange(lr,2,1,1).getValue();
  var t=el;
  sheet.getRange(1,1,1,1).setValue(t);
}

Here’s how to set it…

Google apps script - spreadsheet events
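
If you’d rather set the trigger from code than via the dialogue shown above, the Apps Script ScriptApp service can create the same trigger programmatically. Here’s a sketch (assuming the testOnSub() function and SPREADSHEET_KEY placeholder from the snippet above); I haven’t tested this route, so treat it as a sketch rather than a recipe:

function installOnFormSubmitTrigger() {
  var ss = SpreadsheetApp.openById(SPREADSHEET_KEY);
  // Equivalent of choosing 'From spreadsheet' / 'On form submit' in the triggers dialogue
  ScriptApp.newTrigger('testOnSub').forSpreadsheet(ss).onFormSubmit().create();
}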

What next? Earlier this week, I watched a compelling presentation from @progrium, based around the following slide deck:

Among the handy tools demonstrated (I loved the idea of clickhooks (src), clickable links with webhook callback actions) was a webhook debugging tool, postbin. What this tool does is just capture and redisplay stuff that is posted to it… which makes it ideal for a quick demo…

So for example, suppose I have a Google form set up, and I want to perform a particular action using a third party webservice on some element contained in the form submission, or maybe only on certain items according to what information was submitted via the form, as soon as the form is submitted. Here’s one way of doing that (code on gisthub):

// Simple spreadsheet, with first sheet containing form submission responses
// when the form is submitted:
// 1) grab the latest response,
// 2) post it to a third party service via an HTTP POST
function testWebhook() {
  var ss = SpreadsheetApp.openById(SPREADSHEET_ID);
  var form=ss.getSheets()[0];
  var lr=form.getLastRow();
  var el=form.getRange(lr,2,1,1).getValue();
  var t=el;
  //The following escape palaver is gleaned from a Google help forum...
  var p="val1="+encodeURIComponent(t).replace(/%20/g, "+")+"&val2="+encodeURIComponent(form.getRange(lr,3,1,1).getValue()).replace(/%20/g, "+");

  // Here's where we do the callback...
  var x=UrlFetchApp.fetch('http://www.postbin.org/YOURPASTEBINID',{method: 'post', payload: p});
}

Attach the on form submit trigger event to the function, and here’s the response when we submit a form:

Pastebin response from Google spreadsheet onFormSubmit callback

Clever, eh?

So what does this mean? It means that I can set up a Google Survey form and as soon as anyone posts a submission, I can process it, either within the Google Apps environment using Google Apps script, or using third party services that accept an HTTP post input.

As Jeff Lindsay suggests, the evented web is increasingly a reality…

Steps Towards a Volcanic Ash Advisory Google Maps Mashup Using Met Office Data

Sigh…;-)

Hi Tony,
In looking for authoritative news on the cloud about to tr-ash my holiday, I found this advisory:

http://www.metoffice.gov.uk/aviation/vaac/vaacuk.html

There’s a string of coordinates which appear to mark the extent of the cloud at “FL200” (20000 feet?). E.g. N6343 W01935 is 63.43N 19.35W.

It looks ripe for your skills, and if you could get a map and a how to …

Don’t you realise it’s a race weekend – and if I tinker on such a weekend, I tinker F1data?!;-)

Okay – so here’s a half hour hack (I timed it)…

The advisory data from the Met Office looks like this:

Rather than mess around with any scraping, I just copied and pasted some of the data into a text editor (I use TextWrangler on a Mac) and ran a few regular expressions over it. Starting at the bottom of the list:

– remove end of line characters and replace them with a space;
– remove the “-” and any white space around it and replace it with a return (new line) character;
– replace the space with a tab character.

The result of the regular expression processing is a two column tab separated list of co-ordinates.
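
For what it’s worth, the same clean-up could be scripted rather than done by hand. Here’s a quick sketch in Javascript (the advisoryToTSV() name is made up, and the input is assumed to be the raw co-ordinate block from the advisory):

function advisoryToTSV(raw) {
  return raw
    .replace(/[\r\n]+/g, ' ')      // end of line characters -> spaces
    .split(/\s*-\s*/)              // each '-' separated co-ordinate pair onto its own line
    .map(function (pair) { return pair.trim().replace(/\s+/g, '\t'); })   // spaces -> tabs
    .join('\n');
}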

These can then be copied and pasted into a spreadsheet. For rapid prototyping purposes, I pasted the data into a Google spreadsheet because I know I can get access to a CSV output from there, and I also know that I can get access to a map widget that will plot markers given lat/long data in that environment.

BIG OOPS… The next step was to decode the position data. How to read a Volcanic Ash Advisory gave a couple of clues, and a quick test suggested the data is direction, degrees, minutes, seconds concatenated. I’m guessing that in the general case the degrees are always two digits, the minutes one or two digits and the seconds zero or two digits but in the data I looked at the length was always 5 characters for latitude (dDDMM), 6 for longitude (dDDMSS), so I just worked with those fixed lengths.

Most of the mapping tools I use require lat/long co-ordinates in a decimal format, so a quick check of Stack Overflow turned up a function to convert from degrees, minutes, seconds to the decimal version (Converting latitude and longitude to decimal values).

UPDATE: I think the data format is actually just a decimal format to 2 dp. So the above and code below is all, in this case, nonsense… but I’m going to leave the content here anyway…

I then used a variant of this code to hack a formula using Google Apps script to run the conversion:

Using the apps script function I had defined as a spreadsheet formula, I could convert the lat/long data in the format provided by the advisory note into the more typical digital representation. Highlighting the converted lat/long data and Inserting a Google Maps Gadget gave me a mapped preview over the data.

Here’s the code (it needs generalising, bearing in mind the 1/2 digits for minutes; I had a problem with substring (not sure what – I only ever got one character out) which is why I used substr for the parsing):

//http://stackoverflow.com/questions/1140189/converting-latitude-and-longitude-to-decimal-values

/***************************************
*** BROKEN CODE - left in for reference purposes only ***

****************************************/
function ConvertDMSToDD_BROKEN_STUPID(degrees, minutes, seconds, direction) {
    var dd = degrees + minutes/60 + seconds/(60*60);

    if (direction == "S" || direction == "W") {
        dd = dd * -1;
    } // Don't do anything for N or E
    return dd;
}

function ParseDMS(input) {
  var parts = [];
  if ((input.substring(0)=="N")||(input.substring(0)=="S")){
    parts[0]=parseInt(input.substr(1,2));
    parts[1]=parseInt(input.substr(3,2));
    parts[2]=0;
    parts[3]=input.substring(0);
  } else {
    parts[0]=parseInt(input.substr(1,2));
    parts[1]=parseInt(input.substring(3));
    parts[2]=parseInt(input.substr(4,2));
    parts[3]=input.substring(0);
  }
  var coord = ConvertDMSToDD(parts[0], parts[1], parts[2], parts[3]);
  return coord;
}

You can see the [UPDATE: corrected] spreadsheet here: Volcanic Ash Advisory mapping demo.

I replaced the lat/long conversion formula with one that:
– strips the leading direction character and adds a decimal point before the final two digits (so N6343 becomes 63.43 and W01935 becomes 19.35);
– adds a minus sign for W/S.

Here’s the corrected formula:

function ParseDMS(input) {
  // e.g. "N6343" -> 63.43, "W01935" -> -19.35
  var parts = [];
  parts[0]=input.substring(0,1);   // direction character (N/S/E/W)
  var digits=input.substring(1);   // remaining digits
  // assume the final two digits are the decimal part
  parts[1]=digits.substr(0,digits.length-2)+'.'+digits.substr(digits.length-2);
  var coord = parseFloat(parts[1]);
  if ((parts[0]=='S')||(parts[0]=='W')) coord=coord*-1;
  return coord;
}

So what would be next?

– tidy up the lat/long conversion code so that it works in the general case;
– find a way of pulling data into the spreadsheet live (maybe something like the sketch below).
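
As a speculative sketch of that last step (nothing more than an assumption about how it might work: the advisory page URL is the one from the email above, and the ‘Raw’ sheet name is invented), a time-driven trigger could fetch the advisory page and drop the raw text into a sheet for the parsing steps above to chew on:

function fetchAdvisory() {
  var raw = UrlFetchApp.fetch('http://www.metoffice.gov.uk/aviation/vaac/vaacuk.html').getContentText();
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Raw');
  sheet.getRange(1, 1).setValue(raw);
}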

It would also be interesting to try to take into account altitude data, and then view the data in 3D in something like Google Earth. A heat map view rather than separate markers might also be fun to do (e.g. using HeatMapAPI).

But for now, back to the weekend…

Screenscraping With Google Spreadsheets App Script and the =importHTML Formula

Exciting news: Google Apps script is now available for all Google spreadsheet users…, so it seems I was timely in starting to get to grips with playing with this stuff…;-)

So what can we do with Google Apps Script? I’ve been posting a few ideas already, but here’s something I was working on last night – a script that automates table screenscraping using the Google spreadsheet =importHTML() formula.

Here’s the setting: Lichfield council lists details of the last round of council elections on a summary results page:

and a set of individual pages detailing comprehensive results from each ward (example).

What I wanted was a big table listing all the results by ward.

It’s easy enough to pull these results into a Google Spreadsheet using the =importHTML(“URL”,”table”,N) formula (where N is the number of the table in the page), but pulling results in for all 26 wards and then processing them separately would be laborious, so here’s what I did…

First of all, pull in the summary table to get a list of all the wards:

The results by ward page all live on a commonly structured URL (e.g. http://www.lichfielddc.gov.uk/site/custom_scripts/electiondetail.php?ward=2&category=1) with the ward parameter identifying the ward. This URL isn’t picked up by the table scraper, so I had to do a bit of hand finishing, adding an ID column to which I added the appropriate ward number of each ward as used in the ward results URLs.

So, having got a sheet that listed the wards, and a (hand added) identifier for each ward, I could write a script that would create a separate sheet for each ward and add an appropriately customised =importHTML() formula to it:

function addSheets() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  
  var sheet = ss.getSheetByName("Wards");
  var wardNameID=[];
  var wardIDName=[];
  if (sheet != null) {
    var col_ward=getColNumber(sheet,"Ward");
    var col_ID=getColNumber(sheet,"ID");
    var startRow = 2;  // First row of data to process
    var numRows = getMaxRows(sheet);
    var maxcols= getMaxColumns(sheet);
    
    var dataRange = sheet.getRange(startRow, 1, numRows, maxcols);
    var data = dataRange.getValues();
    for (i in data) {
     var row = data[i];
     var wardName=row[col_ward];
     var wardID=row[col_ID];
     //Browser.msgBox(wardName+" "+wardID);
     wardNameID[wardName]=wardID;
     wardIDName[wardID]=wardName;
    }
  }
  for (var i=2;i<numRows;i++){//28
    var sname=wardIDName[i];
    ss.insertSheet(sname, 1);
    sheet = ss.getSheetByName(sname);
    addTableImporter(sheet,i)
  }
}

This function uses a couple of utility functions – getMaxRows() and getMaxColumns() – which I superstitiously added to get the number of populated rows/columns from a sheet:

function getMaxRows(sheet){
  var maxrows = sheet.getMaxRows();var max =maxrows;
  for (var j=1;j<maxrows+1;j++){
    var header= sheet.getRange(j, 1, 1, 1).getValue();
    if (header==""){
       max=j; j=maxrows+1;
    }
  }
  return max;
}
      
function getMaxColumns(sheet){
  var maxcols=sheet.getMaxColumns(); var max=maxcols;
  for (var j=1;j<maxcols+1;j++){
    var header= sheet.getRange(1, j, 1, 1).getValue();
    if (header==""){
       max=j; j=maxcols+1;
    }
  }
  return max;
}

and the more generally useful getColNumber(), which gets the number of a column in the sheet given its header:

function getColNumber(sheet,colName){
  var colNum=-1;
  var maxcols=sheet.getMaxColumns();
  for (var j=1;j<maxcols+1;j++){
    var header= sheet.getRange(1, j, 1, 1).getValue();
    if (header==colName) colNum=j-1;
  }
  return colNum;
}

The addTableImporter() function is the one that adds the =importHTML() formula to each sheet as required:

function addTableImporter(ss,n) {
 var sc = ss.getActiveCell();

 var arr = [];
 var c = [];
 c[0]='=importHTML("http://www.lichfielddc.gov.uk/site/custom_scripts/electiondetail.php?ward='+n+'&category=1","table",1)';
 arr[0]=c;

 var destinationRange = ss.getRange(1 , 1, 1,1);

 destinationRange.setValues(arr);
}

Running the addSheets() function creates one sheet per ward, imports the appropriate table, and names the sheet as the name of the Ward.

We’re now in a position to pull together a monolithic table that aggregates data from all the wards:

function aggregator(){
  // for each ward spreadsheet, load in data by column
  // dataRow=[wardName, candidate, votes]
  // if votes=="n/a", votes =100;
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var table=[];
  table.push(["Ward","Candidate","Party","Vote"]);
  var writeRow=[];
  var bigsheet = ss.getSheetByName("Aggregate");
  var wards=getWardNames();
  for (var i=0;i<wards.length;i++){
    //get sheet
    var sheet = ss.getSheetByName(wards[i]);
    if (sheet!=null){
      var maxrows=getMaxRows(sheet);
      for (var j=2;j<maxrows+1;j++){
        var candidate=sheet.getRange(j, 1, 1, 1).getValue();
        var party=sheet.getRange(j, 2, 1, 1).getValue();
        var vote = sheet.getRange(j, 3, 1, 1).getValue(); if (typeof vote != 'number') vote=100;
        if (candidate!=""){
          writeRow=[wards[i],candidate,party,vote];
          table.push(writeRow);
        }
      }
    }
  }
  var destinationRange = bigsheet.getRange(1, 1, table.length, 4);
  destinationRange.setValues(table);
}

Again, I make use of a utility function, this time to grab the names of the wards from the ward spreadsheet:

function getWardNames(){
  var ss = SpreadsheetApp.getActiveSpreadsheet();
     
   var sheet = ss.getSheetByName("Wards");
   var wardNames=[];
   if (sheet != null) {
       var col_ward=getColNumber(sheet,"Ward");
       var startRow = 2;  // First row of data to process
       var numRows = getMaxRows(sheet);
       var dataRange = sheet.getRange(startRow, 1, numRows, 1);
       var data = dataRange.getValues();
       for (i in data) {
        var row = data[i];
        wardNames[wardNames.length]=row[col_ward];
       }
   }
  return wardNames;
}

Running the aggregator function pulls all the data into one sheet:

Finally, for completeness, I added a routine that will delete the separate ward sheets:

function deleteWardSheets(){
     var ss= SpreadsheetApp.getActiveSpreadsheet();
     var wardNames=getWardNames();
     for (var i=0;i<wardNames.length;i++){
       ss.setActiveSheet(ss.getSheetByName(wardNames[i]));
       ss.deleteActiveSheet();
     }
  }

Supposedly, it’s possible to share scripts by submitting them to an Apps script gallery:

On submitting a script, it appears to get passed into an approvals process, so I’m not sure what happens next…?

Anyway – the code is all provided above if you want to try it out. Remember, what it’s doing is looking at a list of Ward names and identifiers, creating a sheet for each Lichfield local council ward, importing the election results table for each ward from the Lichfield site into the appropriate sheet, then providing another function that generates an aggregated datatable containing all the results, annotated with the name of the appropriate ward.

I was wondering whether this could all be done from a single function call, but I got twitchy that if I called a sheet before the imported data was loaded everything would break.

The take home from me about this is that with a little bit of creativity you can mix and match spreadsheet formula and Javascript functions. (I don’t know if spreadsheet formula can be accessed directly from the apps script?) That is, you can run calculations either in Javascript/Apps script, or if it’s more convenient to use a spreadsheet formula, you can…

Writing 2D Data Arrays to a Google Spreadsheet from Google Apps Script Making an HTTP POST Request for CSV Data

Just a quick post to log another Google Apps script how to – this time how to grab a 2D array of CSV data from a URL and then paste it into a spreadsheet. For the trivial case, we could just as easily do this with an =importData() formula, but I’m still working out what bits of glue might be useful on the Apps script front…;-)

So – the demo function will take a blank spreadsheet and do this to it:

That is, shove some data from a query run over a data.gov.uk SPARQL endpoint into it.

Here’s the script – it shows how to HTTP POST a query from the Apps script to the SPARQLproxy endpoint, grab the results back as CSV, then convert them from the CSV array to a range that can be posted into the spreadsheet.

function dgtest(){

  var s = SpreadsheetApp.getActiveSpreadsheet();
  var ss = s.getActiveSheet();
  var sc = ss.getActiveCell();

  //I really need to tweak this so we can enter a SPARQL query and then generate the args string from it...
  var args="query=prefix+sch-ont%3A+%3Chttp%3A%2F%2Feducation.data.gov.uk%2Fdef%2Fschool%2F%3E%0D%0ASELECT+%3Fschool+%3Fname+%3Fdate+%3Fdistrict+WHERE+{%0D%0A%3Fschool+a+sch-ont%3ASchool%3B%0D%0Asch-ont%3AestablishmentName+%3Fname%3B%0D%0Asch-ont%3AopenDate+%3Fdate%3B%0D%0Asch-ont%3AdistrictAdministrative+%3Fdistrict.%0D%0A}+ORDER+BY+DESC%28%3Fdate%29+LIMIT+15&output=csv&callback=&tqx=&service-uri=http%3A%2F%2Fservices.data.gov.uk%2Feducation%2Fsparql";

  var x=UrlFetchApp.fetch('http://data-gov.tw.rpi.edu/ws/sparqlproxy.php',{method: 'post', payload: args});
  var ret=x.getContentText();
  ret = CSVToArray( ret, "," );

  var arr = [];
  var c = [];
  for (var i=0;i < ret.length-1;i++) {
    c=[];
    for (var j=0; j< ret[0].length;j++){
      c.push(ret[i][j]);
    }
    arr.push(c);
  }

  var destinationRange = ss.getRange(1, 1, i, j);
  destinationRange.setValues(arr);
}

The CSVToArray() function is one I found on Stack Overflow.