<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
 
 <title>swish-climate-impact-assessment</title>
 <link href="http://swish-climate-impact-assessment.github.io/feed/" rel="self"/>
 <link href="http://swish-climate-impact-assessment.github.io/"/>
 <updated>2018-11-02T06:11:25+00:00</updated>
 <id>http://swish-climate-impact-assessment.github.com/</id>
 <author>
   <name>ivanhanigan</name>
   <email>ivan.hanigan@gmail.com</email>
 </author>

 
 <entry>
   <title>Reboot again</title>
   <link href="http://swish-climate-impact-assessment.github.io/2018/11/reboot-again/"/>
   <updated>2018-11-02T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2018/11/reboot-again</id>
   <content type="html">&lt;p&gt;Several years ago this was all created to assist the Climate Change and Health projects led by &lt;a href=&quot;https://en.wikipedia.org/wiki/Tony_McMichael&quot;&gt;Tony McMichael out of NCEPH&lt;/a&gt;.  But since then funding has been difficult to get for Climate Change research in Australia and much of the work on this project fell by the wayside.&lt;/p&gt;

&lt;p&gt;Today we are pleased to announce that there is renewed interest in funding for climate change and health research, and especially in relation to health impact assessments (HIA).&lt;/p&gt;

&lt;p&gt;So the SWISH website, tools and servers are all getting rebooted (again).&lt;/p&gt;

&lt;p&gt;Stay tuned for more exciting news about our development of data and tools to help researchers conduct health impact assessments more quickly and effectively for a range of environmental risk factors related to climate change, air pollution and the energy transition.&lt;/p&gt;
</content>
 </entry>
 
 <entry>
   <title>Reboot</title>
   <link href="http://swish-climate-impact-assessment.github.io/2015/12/reboot/"/>
   <updated>2015-12-02T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2015/12/reboot</id>
   <content type="html">&lt;ul&gt;
  &lt;li&gt;The SWISH project is going to move forward&lt;/li&gt;
  &lt;li&gt;The H used to stand for Health, but the scope of the project has grown and so now the H stands for ‘Holistic Climate Impact Assessments’.&lt;/li&gt;
&lt;/ul&gt;
</content>
 </entry>
 
 <entry>
   <title>Finishing Up</title>
   <link href="http://swish-climate-impact-assessment.github.io/2013/10/finishing-up/"/>
   <updated>2013-10-26T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2013/10/finishing-up</id>
   <content type="html">&lt;ul&gt;
  &lt;li&gt;The SWISH project is finishing up.&lt;/li&gt;
  &lt;li&gt;Ivan will add any future swishdbtools or EWEDB content to his &lt;a href=&quot;http://ivanhanigan.github.io/&quot;&gt;Open Notebook Science (ONS) blog&lt;/a&gt;&lt;/li&gt;
  &lt;li&gt;&lt;a href=&quot;https://globalhealth.duke.edu/people/faculty/dear-keith&quot;&gt;Keith got a new job&lt;/a&gt;&lt;/li&gt;
  &lt;li&gt;ANDS showed our video at eResearch 2013.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Cheers!&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/Keith_at_eResearch_2013.jpeg&quot; alt=&quot;Keith_at_eResearch2013.jpeg&quot; /&gt;&lt;/p&gt;
</content>
 </entry>
 
 <entry>
   <title>User testing report</title>
   <link href="http://swish-climate-impact-assessment.github.io/2013/06/user-testing-report/"/>
   <updated>2013-06-24T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2013/06/user-testing-report</id>
   <content type="html">&lt;h1 id=&quot;swish-kepler-actors-user-testing&quot;&gt;SWISH Kepler actors user testing.&lt;/h1&gt;
&lt;p&gt;We invited Peter Manger from the Fenner School at the ANU to be one of our test users. Peter has a background in simulation, modelling and software. He is a good example of someone proficient with data processing and analysis but with no previous exposure to SWISH tools.&lt;/p&gt;

&lt;h1 id=&quot;how-well-did-the-user-use-swish&quot;&gt;How well did the user use SWISH?&lt;/h1&gt;
&lt;p&gt;Peter made his way through the tutorial more quickly than I expected for someone who had never used the software before.&lt;/p&gt;

&lt;h2 id=&quot;during-the-testing-two-mistakes-occurred&quot;&gt;During the testing two mistakes occurred&lt;/h2&gt;
&lt;ol&gt;
  &lt;li&gt;An input link to an actor was missing. When run, the workflow reported an error; Peter quickly identified the omission himself and corrected it.&lt;/li&gt;
  &lt;li&gt;The identifier “Date” was typed with a capital letter instead of “date”. When run, Kepler reported an error message; however, the message was too subtle. Even after I pointed it out, Peter was unclear why the error was occurring.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The images in the tutorial provided enough detail to continue through it without encouraging Peter to notice and understand the accompanying text. The tutorial explains that all links need to be connected for the workflow to run, and that the identifiers are the names of columns in the input data. Both of these points, which directly explained the errors that occurred, were overlooked.&lt;/p&gt;

&lt;h1 id=&quot;did-the-user-complete-the-analysis-they-intended-to-do&quot;&gt;Did the user complete the analysis they intended to do?&lt;/h1&gt;
&lt;p&gt;Yes! Peter successfully assembled an operational workflow that ran and produced the correct result by following the tutorial.&lt;/p&gt;

&lt;p&gt;Peter was then curious and started to ‘play’ with values in the heat index calculation. He lowered the maximum temperature limit and increased the min - max threshold to generate a greater number of ‘heat waves’.&lt;/p&gt;

&lt;h1 id=&quot;what-features-did-they-find-useful-and-what-features-did-they-have-difficulty-with-or-wish-to-see-and-are-subject-to-future-improvement-if-there-are-available-resources&quot;&gt;What features did they find useful and what features did they have difficulty with or wish to see (and are subject to future improvement if there are available resources)?&lt;/h1&gt;
&lt;p&gt;Peter was able to quickly use the installer and update the software to the latest version of the SWISH actors. Peter found the drag, drop and link nature of Kepler intuitive and usable. He was able to easily find all the actors needed by the tutorial. The actors used were clearly named and labelled, and Peter found them simple to use.&lt;/p&gt;

&lt;p&gt;The biggest problem is the handling of errors. The SWISH actors that use Stata report their own error messages. The SWISH actors that use R do not report any error messages other than the fact an error has occurred. General Kepler actors report the Java stack trace from where the error occurred in Kepler source code.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;None of these are useful to the user, and all could be improved.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The error reported by the Stata-based actors indicates the error that Stata encountered, but that is always a consequence of a problem located somewhere in the lead-up to running Stata. The error messages require an intimate understanding of how SWISH operates to decode.&lt;/p&gt;

&lt;p&gt;The R errors are completely opaque and provide no useful information.&lt;/p&gt;

&lt;p&gt;General Kepler errors are also cryptic and not helpful.&lt;/p&gt;

&lt;p&gt;Error handling for general Kepler actors is the responsibility of the developers of those actors and of Kepler itself. The SWISH Stata actors, although they perform their own error checking and reporting, would improve the user experience dramatically by reporting more meaningful messages and solutions to the user. The SWISH R actors lack any form of error handling and need to implement it within the R code itself.&lt;/p&gt;
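
&lt;p&gt;A minimal sketch of the kind of error handling the R actors could adopt (illustrative only; runAnalysis is a placeholder, not a SWISH function):&lt;/p&gt;

&lt;div class=&quot;highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;result &amp;lt;- tryCatch({
  runAnalysis(inputPortValue)  # placeholder for the actor's real work
}, error = function(e) {
  # surface the actual R error message to the Kepler user
  stop(paste(&quot;SWISH R actor failed:&quot;, conditionMessage(e)))
})
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;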

&lt;h1 id=&quot;observations&quot;&gt;Observations&lt;/h1&gt;

&lt;p&gt;Peter had trouble linking some actors together, as the links ‘snap’ to other nearby links or ports. Peter worked around this when necessary by dragging links the long way round to avoid connecting to the wrong place.&lt;/p&gt;

&lt;p&gt;Peter did not run the workflow until the very end. This made finding errors more difficult.&lt;/p&gt;

&lt;p&gt;Peter did not realise that by creating and using the workflow in the tutorial he was using the statistical software Stata.&lt;/p&gt;

&lt;p&gt;The understanding Peter gained from the tutorial was mainly operational details of how to use Kepler.&lt;/p&gt;

&lt;h1 id=&quot;peters-feedback&quot;&gt;Peter’s feedback&lt;/h1&gt;

&lt;p&gt;Peter knew that Kepler was existing software, but was unclear which part of the exercise corresponded to the SWISH project. His suggestion was to package all the available actors into a SWISH subgroup of some kind.&lt;/p&gt;

&lt;p&gt;Peter liked the images, and found that the progression from individual small steps at the start of the tutorial to broader instructions at the end made sense.&lt;/p&gt;

</content>
 </entry>
 
 <entry>
   <title>Kepler common tasks</title>
   <link href="http://swish-climate-impact-assessment.github.io/2013/06/kepler-common-tasks/"/>
   <updated>2013-06-07T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2013/06/kepler-common-tasks</id>
   <content type="html">&lt;h2 id=&quot;kepler-common-tasks&quot;&gt;Kepler common tasks&lt;/h2&gt;
&lt;p&gt;This tutorial documents some of the tasks that frequently occur when using Kepler. It is not a sequential series of steps, but a collection of independent operations.&lt;/p&gt;

&lt;h2 id=&quot;start-kepler&quot;&gt;Start Kepler&lt;/h2&gt;
&lt;p&gt;Kepler can be started using the start menu. You can open the application directly or open it via the command line. Using the command line can sometimes display extra information useful in debugging workflows, and is the way we recommend starting Kepler. To make this easier, a shortcut is installed with the SWISH tools. 
&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0010StartMenu1.png&quot; alt=&quot;Start menu&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;p&gt;The command line window while Kepler is running&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0030ConsoleWindow.png&quot; alt=&quot;Command line window&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;h2 id=&quot;main-window&quot;&gt;Main window&lt;/h2&gt;
&lt;p&gt;Search and components panel on the left, canvas on the right, play buttons and menus at the top.&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0040MainKeplerWindow.png&quot; alt=&quot;Main window&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;h2 id=&quot;search-for-actor-or-director&quot;&gt;Search for actor or director&lt;/h2&gt;
&lt;p&gt;Searching is the easiest way to quickly find the part you are looking for. Type in the name and press enter.&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0045SearchForActor.png&quot; alt=&quot;Search&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;h2 id=&quot;add-actor-to-workflow&quot;&gt;Add actor to workflow&lt;/h2&gt;
&lt;p&gt;Drag and drop using the cursor&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0050DragActor.png&quot; alt=&quot;Add actor&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;h2 id=&quot;link-actors-together&quot;&gt;Link actors together&lt;/h2&gt;
&lt;p&gt;Links pass values around the workflow and sequence execution. To link actors, click on a port arrow on one actor and drag it to the port arrow on another actor.&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0060LinkActorA.png&quot; alt=&quot;Link actor&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0070LinkActorB.png&quot; alt=&quot;Link actor&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0080LinkActorC.png&quot; alt=&quot;Link actor&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;h2 id=&quot;save&quot;&gt;Save&lt;/h2&gt;
&lt;p&gt;Save using the file menu&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0090SaveWorkflow.png&quot; alt=&quot;Save&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;h2 id=&quot;rename-workflow&quot;&gt;Rename workflow&lt;/h2&gt;
&lt;p&gt;You will be prompted for a name when saving, but you can also change it afterwards.&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0100RenameWorkflowA.png&quot; alt=&quot;Rename workflow&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0110RenameWorkflowB.png&quot; alt=&quot;Rename workflow&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;h2 id=&quot;change-actor-or-director-values&quot;&gt;Change actor or director values&lt;/h2&gt;
&lt;p&gt;Double-click on the actor and the values that can be changed appear. To keep changes, click the commit button, e.g. in the constant value actor.&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0120EditActorValuesA.png&quot; alt=&quot;Edit values&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0130EditActorValuesB.png&quot; alt=&quot;Edit values&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;h2 id=&quot;animate-at-runtime&quot;&gt;Animate at runtime&lt;/h2&gt;
&lt;p&gt;Turning this on highlights the actor currently executing, giving an indication of how far a running workflow has progressed.&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0140RunTimeHighlightA.png&quot; alt=&quot;Animate runtime&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0150RunTimeHighlightB.png&quot; alt=&quot;Animate runtime&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;h2 id=&quot;execute-a-workflow&quot;&gt;Execute a workflow&lt;/h2&gt;
&lt;p&gt;Like a VCR, press the play button.&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0160Play.png&quot; alt=&quot;Play&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;h2 id=&quot;display-value&quot;&gt;Display value&lt;/h2&gt;
&lt;p&gt;Use the Display actor to view values on screen. The actor can be connected to any other actor. It is useful for debugging and for inspecting intermediate results throughout the workflow.&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0170DisplayValuesA.png&quot; alt=&quot;Display&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0180DisplayValuesB.png&quot; alt=&quot;Display&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;h2 id=&quot;move-the-canvas-view&quot;&gt;Move the canvas view&lt;/h2&gt;
&lt;p&gt;Use the mouse to move the part of the canvas displayed&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0190NavigateWorkflow.png&quot; alt=&quot;Display&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;h2 id=&quot;actors-not-linked-error&quot;&gt;Actors not linked error&lt;/h2&gt;
&lt;p&gt;Usually, all actors need to be connected in a complete sequence. This includes actors in sub-modules, which can be misleading and tricky to find.&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0210MissingLinkErrorB.png&quot; alt=&quot;Link error&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0200MissingLinkErrorA.png&quot; alt=&quot;Link error&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;h2 id=&quot;rename-actor&quot;&gt;Rename actor&lt;/h2&gt;
&lt;p&gt;Actors can be given titles. This can help document the workflow or set the name of an actor to export.&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0220RenameActora.png&quot; alt=&quot;Rename actor&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0230RenameActorb.png&quot; alt=&quot;Rename actor&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0240Renameactorc.png&quot; alt=&quot;Rename actor&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;h2 id=&quot;director-missing&quot;&gt;Director missing&lt;/h2&gt;
&lt;p&gt;This happens to everyone!&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0250MissingDirectorError.png&quot; alt=&quot;Director error&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;h2 id=&quot;export-actor&quot;&gt;Export actor&lt;/h2&gt;
&lt;p&gt;Actors can be saved and then reused in other workflows.&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0260ExportActorA.png&quot; alt=&quot;Export actor&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0270ExportActorB.png&quot; alt=&quot;Export actor&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;img src=&quot;/images/KeplerCommonTasks0280ExportActorC.png&quot; alt=&quot;Export actor&quot; /&gt; 
&lt;br /&gt;&lt;/p&gt;

</content>
 </entry>
 
 <entry>
   <title>Developing new tools with Kepler workflows</title>
   <link href="http://swish-climate-impact-assessment.github.io/2013/06/developing-with-workflows/"/>
   <updated>2013-06-05T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2013/06/developing-with-workflows</id>
   <content type="html">&lt;h2 id=&quot;developing-new-tools-with-kepler-workflows&quot;&gt;Developing new tools with Kepler workflows&lt;/h2&gt;
&lt;p&gt;If you can’t find the actor you need among the core Kepler actors or our SWISH actor contributions, then you have two choices:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;assemble existing actors into a new ‘composite’ actor, or&lt;/li&gt;
  &lt;li&gt;develop an R/Python/MATLAB/Stata function to be a new actor.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you go for option 2, you have to develop your function to work with other Kepler actors. There are a few tricks to doing this. This post shows several different approaches to developing a new Kepler R actor.&lt;/p&gt;

&lt;h2 id=&quot;1-identify-r-function&quot;&gt;1 identify R function&lt;/h2&gt;
&lt;p&gt;There is probably an R function that does what you want; if not, start writing one.  If it is a really simple case of just using an existing R function with simple input/output requirements, you can write it straight into the Rexpression actor and add some ports… however, anything more than a couple of lines can get buggy quickly, and this is not a good place to be debugging code.&lt;/p&gt;

&lt;h2 id=&quot;2-write-function-in-a-script&quot;&gt;2 write function in a script&lt;/h2&gt;
&lt;p&gt;Write the function in a script, then test and debug it in an IDE like Emacs, RStudio or Eclipse, and deploy it to the Rexpression actor in the workflow.&lt;/p&gt;
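
&lt;p&gt;For example, a function might be developed and sanity-checked in the IDE before being pasted into the actor (the function and values here are illustrative, not SWISH code):&lt;/p&gt;

&lt;div class=&quot;highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;# illustrative example: flag days exceeding a temperature threshold
exceeds_threshold &amp;lt;- function(tmax, threshold = 35) {
  tmax &amp;gt; threshold
}
# quick checks in the IDE before deploying to the Rexpression actor
stopifnot(exceeds_threshold(40))
stopifnot(!exceeds_threshold(30))
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;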

&lt;h2 id=&quot;3-source-your-script-from-kepler&quot;&gt;3 source() your script from Kepler&lt;/h2&gt;
&lt;p&gt;Similar to 2, but rather than copying the code into the actor, just add&lt;/p&gt;

&lt;div class=&quot;highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;source('path/to/script.R')
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;


&lt;p&gt;to the actor.&lt;/p&gt;

&lt;h2 id=&quot;4-write-a-package&quot;&gt;4 write a package&lt;/h2&gt;
&lt;p&gt;Similar to 2 and 3, but the function is written into a package, which is then loaded with&lt;/p&gt;

&lt;div class=&quot;highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;require(MyPackage)
outputPortValue &amp;lt;- myFunction(inputPortValue, otherArgument)
outputPortValue
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;You can then publish this on GitHub or CRAN, or even just send as a zip or tar to your collaborators.&lt;/p&gt;

&lt;h2 id=&quot;5-add-this-to-myworkflows&quot;&gt;5 add this to MyWorkflows&lt;/h2&gt;
&lt;p&gt;If you have the R code or package details in an actor, save it by right-clicking on the actor and saving it to your Kepler directory under MyWorkflows. It will then be available whenever you open Kepler.&lt;/p&gt;

&lt;h2 id=&quot;6-if-it-is-really-awesome-contribute-it-to-swish&quot;&gt;6 If it is really awesome contribute it to SWISH&lt;/h2&gt;
&lt;p&gt;Fork the swish-kepler-actors GitHub repo and add your actor and tests to the simpleInstaller/Actors folder. Then send a pull request to the SWISH maintainers, and your actor will be incorporated into our one-click installer.&lt;/p&gt;
</content>
 </entry>
 
 <entry>
   <title>test gislibrary with SLA concordance</title>
   <link href="http://swish-climate-impact-assessment.github.io/2013/06/test-gislibrary/"/>
   <updated>2013-06-03T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2013/06/test-gislibrary</id>
   <content type="html">&lt;h1 id=&quot;testing-the-gis-library-from-r-calculate-a-sla-concordance&quot;&gt;Testing the GIS library from R, Calculate a SLA concordance&lt;/h1&gt;
&lt;p&gt;In this post we will use the swish R/PostGIS tools to manipulate spatial data on a remote GIS server (to calculate a SLA concordance) and extract the result to our local client machine.  &lt;a href=&quot;/2013-06-03-test-gislibrary.R&quot;&gt;Clink here&lt;/a&gt; for the R script.&lt;/p&gt;

&lt;p&gt;The great thing about PostGIS is that it is a standard relational database that also understands spatial data.  We have developed &lt;a href=&quot;http://swish-climate-impact-assessment.github.io/tools/swishdbtools/swishdbtools-downloads.html&quot;&gt;an R package called swishdbtools&lt;/a&gt; to assist with connecting to the database from Kepler.&lt;/p&gt;

&lt;div class=&quot;highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;require(swishdbtools)
ch &amp;lt;- connect2postgres2(&quot;gislibrary&quot;)
# fill in the details, only required once as will save to your profile
dbGetQuery(ch, 
       &quot;select t1.sla_id, t2.sla_code as s2, st_area(t1.geom)
       from abs_sla.nswsla91 t1 join abs_sla.nswsla01 t2 
       on t1.sla_id = 1 || substr(cast(t2.sla_code as text), 6,9);
       &quot;)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;Pretty cool huh?  Spatial functions start with st_ and the generic name for the spatial data column is geom or the_geom.&lt;/p&gt;

&lt;h2 id=&quot;say-we-want-to-create-a-concordance-file-to-map-changes-between-sla-boundaries&quot;&gt;say we want to create a concordance file to map changes between SLA boundaries&lt;/h2&gt;
&lt;p&gt;I figured out a complicated SQL query to compute the intersecting geometries, then wrapped it up into an R function:&lt;/p&gt;

&lt;div class=&quot;highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;# make a temporary tablename, to avoid clobbering
temp_table &amp;lt;- swish_temptable(&quot;gislibrary&quot;)
temp_table &amp;lt;- paste(&quot;public&quot;, temp_table$table, sep = &quot;.&quot;)


sql &amp;lt;- postgis_concordance(conn = ch, source_table = &quot;abs_sla.nswsla91&quot;,
   source_zones_code = 'sla_id', target_table = &quot;abs_sla.nswsla01&quot;,
   target_zones_code = &quot;sla_code&quot;,
   into = temp_table, tolerance = 0.01,
   subset_target_table = &quot;cast(sla_code as text) like '105%'&quot;, 
   eval = F) 
cat(sql)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;h2 id=&quot;which-gives-the-map&quot;&gt;Which gives the map&lt;/h2&gt;
&lt;p&gt;This shows the SLAs that existed in 2001 that were a smaller segment of their original SLA zone in 1991 (the red areas changed the most).&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/nswconc.png&quot; alt=&quot;nswconc.png&quot; /&gt;&lt;/p&gt;

&lt;h2 id=&quot;from-the-less-pretty-sql&quot;&gt;From the less pretty SQL&lt;/h2&gt;
&lt;p&gt;I just run the single-line version&lt;/p&gt;

&lt;div class=&quot;highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;dbSendQuery(ch, sql)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;if I don’t want to look at the ugly code.&lt;/p&gt;

&lt;h2 id=&quot;so-now-i-can-use-qgis-to-visualise-this-or-if-on-linux-rgdal-can-access-it-direct&quot;&gt;so now I can use QGIS to visualise this, or if on linux rgdal can access it direct&lt;/h2&gt;
&lt;div class=&quot;highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;require(devtools) # Windows users need to install Rtools
install_github(&quot;gisviz&quot;, &quot;ivanhanigan&quot;)
# otherwise download and install from http://ivanhanigan.github.io/gisviz/
require(gisviz)

# get pwd from store, or use pwd &amp;lt;- getPassword()
pwd &amp;lt;- get_passwordTable()
pwd &amp;lt;- pwd[which(pwd$V3 == &quot;gislibrary&quot;), &quot;V5&quot;]

shp &amp;lt;- readOGR2(hostip=&quot;130.56.60.77&quot;,user=&quot;gislibrary&quot;,
   db=&quot;gislibrary&quot;, layer=gsub(&quot;public.&quot;,&quot;&quot;,temp_table), p = pwd)
head(arrange(shp@data, prop_olap_src_segment_of_src_orig))
subset(shp@data, source_zone_code == 10750)[,c(1,4,6)]
# source_zone_code target_zone_code prop_olap_src_segment_of_src_orig
# 36            10750        105530751                         0.4791171
# 37            10750        105530752                         0.2455572
# 38            10750        105530753                         0.2726988
# save local copy
writeOGR(shp, &quot;sydneyconc&quot;, &quot;sydneyconc&quot;, driver=&quot;ESRI Shapefile&quot;)
# make default map
choropleth(stat=&quot;prop_olap_src_segment_of_src_orig&quot;, region.map=shp, scalebar = T, 
  probs = seq(0, 1, .2), invert.brew.pal= F, maptitle='Sydney SLA91-01 intersection')
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;

&lt;h2 id=&quot;and-finally-just-clean-up-the-temp-file-from-the-db&quot;&gt;and finally just clean up the temp file from the db&lt;/h2&gt;

&lt;div class=&quot;highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;dbSendQuery(ch, paste(&quot;drop table &quot;,temp_table,sep=&quot;&quot;))
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;
</content>
 </entry>
 
 <entry>
   <title>Ecologists studying farm dogs with radio tracking collars</title>
   <link href="http://swish-climate-impact-assessment.github.io/2013/05/farm-dogs-radio-tracking-collars/"/>
   <updated>2013-05-17T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2013/05/farm-dogs-radio-tracking-collars</id>
   <content type="html">&lt;h2 id=&quot;extracting-the-weather-for-each-location-that-a-farm-dog-was-observed&quot;&gt;Extracting the weather for each location that a farm dog was observed&lt;/h2&gt;
&lt;p&gt;These data were collected by Linda Van Bommel and supplied
to the SWISH project by Luciana Porfirio to use as a demonstration of how our tools might be used by researchers in other disciplines.&lt;/p&gt;

&lt;p&gt;The locations are
taken from radio tracking collars on farm dogs taken over 8 months,
all in the same format (date, time, lat, lon) for a number of
dogs. The coordinates are from a farm in Victoria.&lt;/p&gt;

&lt;h2 id=&quot;step-one-modify-the-swish-example&quot;&gt;Step one: modify the SWISH example&lt;/h2&gt;
&lt;p&gt;The &lt;a href=&quot;/2013/05/extract-awap-data-4-locations/&quot;&gt;previously written workflow used as a test case&lt;/a&gt; for the SWISH project was downloaded from &lt;a href=&quot;/tools/ExtractAWAPdata4locations/extract-awap.html&quot;&gt;this web page&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;That workflow takes location names with unknown lat, longs and geocodes them using the Google Maps API.  For this dataset we already know the lat, longs, so the first step was to delete the superfluous actor and replace it with one that gets the data from the source location on Google Docs/Spreadsheets.&lt;/p&gt;

&lt;p&gt;This URL is highlighted in the image below.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/farm-dogs-sws.png&quot; alt=&quot;farm-dogs-sws.png&quot; /&gt;&lt;/p&gt;

&lt;h2 id=&quot;step-two-add-some-specific-code-to-this-analysis&quot;&gt;Step two: Add some specific code to this analysis&lt;/h2&gt;
&lt;p&gt;For example, we will want to look at the spatial distribution of these data, so a simple map is generated and displayed in the ImageJ actor.  When the workflow is run, the image below appears in a new window.  This can then be refined into a publication-quality map at a later time.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/farm-dogs-map.png&quot; alt=&quot;farm-dogs-map.png&quot; /&gt;&lt;/p&gt;

&lt;h2 id=&quot;step-three-specify-the-dates-required&quot;&gt;Step three: specify the dates required&lt;/h2&gt;
&lt;p&gt;The information required by the “R raster extract by day” actor needs to be changed to reflect the specific duration of this dataset.&lt;/p&gt;

&lt;h2 id=&quot;step-four-include-a-new-step-to-merge-the-output-time-series-data-with-the-spatial-data&quot;&gt;Step four: include a new step to merge the output time-series data with the spatial data&lt;/h2&gt;
&lt;p&gt;The time series data is then extracted for each location and every time point.  That output does not include the lat,long data, so this is merged in by adding the “R merge” actor and specifying the files to merge.  The result is shown below:&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/farm-dogs-data.png&quot; alt=&quot;farm-dogs-data.png&quot; /&gt;&lt;/p&gt;

&lt;h2 id=&quot;step-5-assess-data&quot;&gt;Step 5: assess data&lt;/h2&gt;
&lt;p&gt;In the above image we see that date.x is the date of the observation and date.y is the time the weather value was observed at that lat,long.  For each location we now have the complete time series of weather data extracted from the EWEDB AWAP grids datastore.&lt;/p&gt;

&lt;h2 id=&quot;future-work-required-reformating-data&quot;&gt;Future work required: reformating data&lt;/h2&gt;
&lt;p&gt;As you can see, this is almost what we want, but not quite.  We now know the weather on each day for every location the dog visited.  However, what we really want is the weather at the point where the dog was, at the time the dog was there.  To do this, additional actors can be added to take the data generated so far and perform subsequent re-formatting steps so that the observations match up with the weather at their exact time points.&lt;/p&gt;
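
&lt;p&gt;A first pass at that matching step could be sketched in R like this (the column and data frame names are hypothetical, not the final workflow code):&lt;/p&gt;

&lt;div class=&quot;highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;# hypothetical: join each dog observation to the weather record
# for the same location and date
matched &amp;lt;- merge(dog_obs, weather,
                 by = c(&quot;lat&quot;, &quot;lon&quot;, &quot;date&quot;))
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;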

&lt;h2 id=&quot;automated-processing-of-multiple-datasets&quot;&gt;Automated processing of multiple datasets&lt;/h2&gt;
&lt;p&gt;Once the workflow is finalised, the URL pointing at the Google Spreadsheets dataset can be changed to other farms, or indeed to any dataset with lat, long values in Australia.  If there are a large number of these, a list of URLs could be fed into the workflow in a loop so that each dataset is processed in exactly the same way.  In this manner many datasets can be accessed in a rigorous and transparent manner, and revisions can be easily incorporated as new datasets are acquired or new analysis plans are formulated.&lt;/p&gt;
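
&lt;p&gt;A loop of that kind might look something like this (a sketch only; the URLs and the process_farm_dogs() helper are hypothetical):&lt;/p&gt;

&lt;div class=&quot;highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;# hypothetical: replace with the real spreadsheet URLs
dataset_urls &amp;lt;- c(&quot;https://example.com/farm1.csv&quot;,
                  &quot;https://example.com/farm2.csv&quot;)
for (url in dataset_urls) {
  dat &amp;lt;- read.csv(url)
  # process_farm_dogs() stands in for the workflow's extract/merge steps
  process_farm_dogs(dat)
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;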

&lt;h2 id=&quot;replication-data-and-software&quot;&gt;Replication data and software&lt;/h2&gt;
&lt;p&gt;The data and software used in this tutorial are available from these links:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;&lt;a href=&quot;/farmdogs/radio-tracking-farm-dogs.kar&quot;&gt;The Kepler workflow&lt;/a&gt;&lt;/li&gt;
  &lt;li&gt;&lt;a href=&quot;/farmdogs/radio-tracking-farm-dogs.r&quot;&gt;The R script (for debugging)&lt;/a&gt;&lt;/li&gt;
  &lt;li&gt;&lt;a href=&quot;/farmdogs/weather-merged-latlong.csv&quot;&gt;The resulting output dataset&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
</content>
 </entry>
 
 <entry>
   <title>geocoder</title>
   <link href="http://swish-climate-impact-assessment.github.io/2013/05/geocoder/"/>
   <updated>2013-05-15T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2013/05/geocoder</id>
   <content type="html">&lt;h2 id=&quot;a-exemplar-workflow-for-geocoding-locations&quot;&gt;An exemplar workflow for geocoding locations&lt;/h2&gt;
&lt;p&gt;The geocoder workflow at &lt;a href=&quot;/tools/geocoder/geocoder.html&quot;&gt;this link&lt;/a&gt; is an example that takes a list of locations and returns a shapefile with the latitudes and longitudes, as well as a map.&lt;/p&gt;

&lt;p&gt;As you can see when you open the KAR file, this workflow expects an XLSX file to be linked in the first actor.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/geocoder-kar.png&quot; alt=&quot;geocoder-kar.png&quot; /&gt;&lt;/p&gt;

&lt;h2 id=&quot;list-your-locations&quot;&gt;list your locations&lt;/h2&gt;

&lt;p&gt;&lt;img src=&quot;/images/geocoder-xls.png&quot; alt=&quot;geocoder-xls&quot; /&gt;&lt;/p&gt;

&lt;h2 id=&quot;run-the-workflow-to-create-an-image&quot;&gt;run the workflow to create an image&lt;/h2&gt;

&lt;p&gt;&lt;img src=&quot;/images/geocoder-img.png&quot; alt=&quot;geocoder-img&quot; /&gt;&lt;/p&gt;

&lt;h2 id=&quot;and-a-shapefile-stored-in-your-temporary-directory&quot;&gt;and a shapefile, stored in your temporary directory&lt;/h2&gt;

&lt;p&gt;&lt;img src=&quot;/images/geocoder-shp.png&quot; alt=&quot;geocoder-shp&quot; /&gt;&lt;/p&gt;
</content>
 </entry>
 
 <entry>
   <title>AWAP grids vs station observations</title>
   <link href="http://swish-climate-impact-assessment.github.io/2013/05/awapgrids-vs-stations/"/>
   <updated>2013-05-13T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2013/05/awapgrids-vs-stations</id>
   <content type="html">&lt;h2 id=&quot;comparing-the-gridded-estimates-to-the-observations&quot;&gt;Comparing the gridded estimates to the observations&lt;/h2&gt;

&lt;p&gt;The EWEDB holds &lt;a href=&quot;/metadata/AWAP_GRIDS.html&quot;&gt;daily gridded data we downloaded from BoM&lt;/a&gt;.  The size of this data collection is formidable (currently &amp;gt; 71,000 raster grids covering 1980 to the present, and set to grow significantly as we incorporate earlier decades).&lt;/p&gt;

&lt;p&gt;We were faced with a choice: store data for more time points (days) at lower spatial resolution (fewer megabytes), or for fewer time points at higher spatial resolution (more megabytes).  In the interest of deriving Extreme Weather Indices from the longest timeframe possible (to identify truly extreme observations from the full historical range) we decided to aggregate the original data from 5 km pixels to 15 km pixels.  This loss of spatial precision is compensated to some extent by the high spatial autocorrelation, as displayed in the map of the recent heatwave in New South Wales, Australia, January 2013.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/grid-nsw.png&quot; alt=&quot;grid-nsw.png&quot; /&gt;&lt;/p&gt;
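The aggregation from 5 km to 15 km pixels can be sketched as a simple block average over 3 x 3 groups of cells; this standalone example uses made-up temperature values rather than the real AWAP grids:

```python
def aggregate_blocks(grid, factor):
    """Aggregate a 2D grid by averaging factor x factor blocks of pixels."""
    rows = len(grid) // factor
    cols = len(grid[0]) // factor
    out = []
    for r in range(rows):
        row = []
        for c in range(cols):
            block = [grid[r * factor + i][c * factor + j]
                     for i in range(factor) for j in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A 3 x 3 block of 5 km pixels collapses to one 15 km pixel
grid = [[30.0, 31.0, 32.0],
        [30.0, 31.0, 32.0],
        [30.0, 31.0, 32.0]]
print(aggregate_blocks(grid, 3))  # [[31.0]]
```

In practice this kind of aggregation would be done with dedicated raster tools, but the averaging logic is the same.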

&lt;p&gt;When we aggregate pixels of the grid there is more chance that the observed data will differ from the estimate at that location, due to spatial smoothing.  In this post we compare the observations from BoM weather stations with the daily values for the grid cell they intersect.&lt;/p&gt;

&lt;p&gt;There are 939 weather stations that have valid observations for all three of temperature, vapour pressure (humidity) and rainfall in the 1990-2010 period we also hold data for.  To save a bit of time we’ll only use a 10 percent random sample (93) of these, shown below.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/selected-stations.png&quot; alt=&quot;selected-stations.png&quot; /&gt;&lt;/p&gt;
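The sampling step can be sketched as drawing 93 stations without replacement; the station identifiers here are hypothetical placeholders:

```python
import random

random.seed(42)  # fixed seed so the sample is reproducible

# 939 hypothetical station identifiers
stations = ["station_%03d" % i for i in range(939)]

# a 10 percent sample, rounded down, drawn without replacement
sample = random.sample(stations, 93)
print(len(sample))
```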

&lt;p&gt;Getting the values for each station from the grid pixel it lies on, we can construct an artificial timeseries as shown.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/sampled-timeseries-from-grid.png&quot; alt=&quot;sampled-timeseries-from-grid.png&quot; /&gt;&lt;/p&gt;

&lt;p&gt;Merging these estimates with the observed data, we can compare them and derive summary statistics such as the R-squared.&lt;/p&gt;

&lt;h2 id=&quot;max-temp&quot;&gt;Max Temp&lt;/h2&gt;

&lt;p&gt;&lt;img src=&quot;/images/maxave.png&quot; alt=&quot;maxave.png&quot; /&gt;&lt;/p&gt;

&lt;h2 id=&quot;min-temp&quot;&gt;Min Temp&lt;/h2&gt;

&lt;p&gt;&lt;img src=&quot;/images/minave.png&quot; alt=&quot;minave.png&quot; /&gt;&lt;/p&gt;

&lt;h2 id=&quot;rainfall&quot;&gt;Rainfall&lt;/h2&gt;

&lt;p&gt;&lt;img src=&quot;/images/totals.png&quot; alt=&quot;totals.png&quot; /&gt;&lt;/p&gt;

&lt;h2 id=&quot;vapour-pressure-humidity-9am&quot;&gt;Vapour Pressure (humidity) 9am&lt;/h2&gt;

&lt;p&gt;&lt;img src=&quot;/images/vprph09.png&quot; alt=&quot;vprph09.png&quot; /&gt;&lt;/p&gt;

&lt;h2 id=&quot;vapour-pressure-humidity-3pm&quot;&gt;Vapour Pressure (humidity) 3pm&lt;/h2&gt;

&lt;p&gt;&lt;img src=&quot;/images/vprph15.png&quot; alt=&quot;vprph15.png&quot; /&gt;&lt;/p&gt;

&lt;h2 id=&quot;conclusions&quot;&gt;Conclusions&lt;/h2&gt;
&lt;p&gt;The comparison presented here shows that the observations and the AWAP gridded datasets that we have processed for storage in the EWEDB differ, due to the spatial smoothing that has occurred in the processing undertaken for the EWEDB project.&lt;/p&gt;

&lt;p&gt;Users are asked to bear this in mind when considering the appropriateness of these data for their specific application.&lt;/p&gt;
</content>
 </entry>
 
 <entry>
   <title>Set Up R for Kepler on MS Windows</title>
   <link href="http://swish-climate-impact-assessment.github.io/2013/05/set-up-r-on-ms-windows/"/>
   <updated>2013-05-09T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2013/05/set-up-r-on-ms-windows</id>
   <content type="html">&lt;h2 id=&quot;install-r-30&quot;&gt;Install R 3.0&lt;/h2&gt;
&lt;p&gt;Even if you have &lt;a href=&quot;http://www.r-project.org/&quot;&gt;The R Environment for Statistical Computing and Graphics&lt;/a&gt; installed, we recommend you upgrade to version 3.0, because new packages built for R 3.0 will not work with R 2.15 and earlier.&lt;/p&gt;

&lt;h2 id=&quot;register-r-in-the-path-so-that-kepler-can-find-it&quot;&gt;Register R in the PATH so that Kepler can find it&lt;/h2&gt;
&lt;p&gt;This tutorial assumes Windows 7 and a user without administrator privileges.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/setup-r-Slide1.PNG&quot; alt=&quot;setup-r-Slide1.PNG&quot; /&gt;&lt;/p&gt;

&lt;h2 id=&quot;first-download-and-install-r-to-a-location-you-can-write-to&quot;&gt;First download and install R to a location you can write to&lt;/h2&gt;

&lt;p&gt;&lt;img src=&quot;/images/setup-r-Slide2.PNG&quot; alt=&quot;setup-r-Slide2.PNG&quot; /&gt;&lt;/p&gt;

&lt;h2 id=&quot;it-wont-be-recognised-on-your-path&quot;&gt;It won’t be recognised on your PATH&lt;/h2&gt;
&lt;p&gt;Because you are not an administrator, R will not be added to your PATH automatically.  Check this by opening the terminal (Run &amp;gt; cmd) and typing R.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/setup-r-Slide3.PNG&quot; alt=&quot;setup-r-Slide3.PNG&quot; /&gt;&lt;/p&gt;

&lt;h2 id=&quot;go-to-the-control-panel-and-navigate-to-the-set-environment-variables&quot;&gt;Go to the Control Panel and navigate to the environment variables settings&lt;/h2&gt;

&lt;p&gt;&lt;img src=&quot;/images/setup-r-Slide4.PNG&quot; alt=&quot;setup-r-Slide4.PNG&quot; /&gt;&lt;/p&gt;

&lt;h2 id=&quot;make-a-new-user-variable&quot;&gt;make a new USER variable&lt;/h2&gt;

&lt;p&gt;&lt;img src=&quot;/images/setup-r-Slide5.PNG&quot; alt=&quot;setup-r-Slide5.PNG&quot; /&gt;&lt;/p&gt;

&lt;h2 id=&quot;locate-the-appropriate-r-binaries&quot;&gt;Locate the appropriate R binaries&lt;/h2&gt;

&lt;p&gt;&lt;img src=&quot;/images/setup-r-Slide6.PNG&quot; alt=&quot;setup-r-Slide6.PNG&quot; /&gt;&lt;/p&gt;

&lt;h2 id=&quot;make-the-new-variable-called-path&quot;&gt;make the new variable called Path&lt;/h2&gt;

&lt;p&gt;&lt;img src=&quot;/images/setup-r-Slide7.PNG&quot; alt=&quot;setup-r-Slide7.PNG&quot; /&gt;&lt;/p&gt;

&lt;h2 id=&quot;exit-and-restart-the-terminal-and-check-that-r-is-recognised&quot;&gt;Exit and restart the terminal and check that R is recognised&lt;/h2&gt;

&lt;p&gt;&lt;img src=&quot;/images/setup-r-Slide8.PNG&quot; alt=&quot;setup-r-Slide8.PNG&quot; /&gt;&lt;/p&gt;

&lt;h2 id=&quot;the-end&quot;&gt;The End&lt;/h2&gt;
</content>
 </entry>
 
 <entry>
   <title>Extract AWAP data for locations</title>
   <link href="http://swish-climate-impact-assessment.github.io/2013/05/extract-awap-data-4-locations/"/>
   <updated>2013-05-03T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2013/05/extract-awap-data-4-locations</id>
   <content type="html">&lt;h1 id=&quot;awap-data&quot;&gt;AWAP data&lt;/h1&gt;
&lt;p&gt;The AWAP data were found and extracted for a specific date in a previous post.
This tutorial will demonstrate extracting data for a range of dates and locations.&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;/tools/ExtractAWAPdata4locations/extract-awap.html&quot;&gt;See this page&lt;/a&gt;&lt;/p&gt;
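Extracting data for a range of dates and locations amounts to building one extraction task per (location, date) pair; the locations and dates below are illustrative only:

```python
from datetime import date, timedelta
from itertools import product

# Hypothetical study locations (latitude, longitude) and a short date range
locations = {"Kaleen": (-35.23, 149.10), "Wagga": (-35.11, 147.37)}
start = date(2013, 1, 1)
dates = [start + timedelta(days=d) for d in range(3)]

# One extraction task per (location, date) pair
tasks = [(name, day) for name, day in product(locations, dates)]
print(len(tasks))  # 2 locations x 3 dates = 6 tasks
```

Each task would then be handed to the workflow's grid-extraction step.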

&lt;h2 id=&quot;kaleen-act-is-a-test-case&quot;&gt;Kaleen, ACT is a test case&lt;/h2&gt;
&lt;p&gt;In the attached example the study location is Kaleen, a suburb of Canberra.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/extract-kaleen.png&quot; alt=&quot;extract-kaleen.png&quot; /&gt;&lt;/p&gt;
</content>
 </entry>
 
 <entry>
   <title>Hello EWEDB (Simple Set Up the Swish Computer to connect to EWEDB)</title>
   <link href="http://swish-climate-impact-assessment.github.io/2013/05/hello-ewedb/"/>
   <updated>2013-05-02T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2013/05/hello-ewedb</id>
   <content type="html">&lt;h2 id=&quot;hello-ewedb&quot;&gt;Hello EWEDB&lt;/h2&gt;
&lt;p&gt;This is a very simple workflow designed to install the swishdbtools R package and test the setup of your computer and the connection to the EWEDB.&lt;/p&gt;

&lt;h3 id=&quot;step-1&quot;&gt;Step 1&lt;/h3&gt;
&lt;p&gt;Download &lt;a href=&quot;/tools/hello-ewedb/hello-ewedb.html&quot;&gt;the “hello-ewedb.kar” workflow&lt;/a&gt;&lt;/p&gt;

&lt;h3 id=&quot;step-2&quot;&gt;Step 2&lt;/h3&gt;
&lt;p&gt;Run the workflow.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/hello-ewedb/hello-ewedb-Slide1.PNG&quot; alt=&quot;hello-ewedb-Slide1.PNG&quot; /&gt;&lt;/p&gt;

&lt;h3 id=&quot;step-3&quot;&gt;Step 3&lt;/h3&gt;
&lt;p&gt;Enter the details.  When you run this it will download the R package and install it, along with its dependencies.  It will then look for your PostGIS username and password; if it can’t find a valid username, password, database and server combination, it will ask you to enter them.&lt;/p&gt;

&lt;h3 id=&quot;on-ms-windows-the-popup-box-is-often-behind-other-windows&quot;&gt;ON MS WINDOWS THE POPUP BOX IS OFTEN BEHIND OTHER WINDOWS.&lt;/h3&gt;

&lt;p&gt;You will have received a username and password when the Data Manager set up your account.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/setup-swish-Slide10.PNG&quot; alt=&quot;setup-swish-Slide10.PNG&quot; /&gt;&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/setup-swish-Slide11.PNG&quot; alt=&quot;setup-swish-Slide11.PNG&quot; /&gt;&lt;/p&gt;

&lt;p&gt;On MS Windows your details are now stored in the file shown below; on Linux they are stored in ~/.pgpass.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/setup-swish-Slide12.PNG&quot; alt=&quot;setup-swish-Slide12.PNG&quot; /&gt;&lt;/p&gt;

&lt;h3 id=&quot;step-3-1&quot;&gt;Step 4&lt;/h3&gt;
&lt;p&gt;Verify that the test data “hello_ewedb” was written to the database, read back out again, and removed afterward.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/hello-ewedb/hello-ewedb-Slide2.PNG&quot; alt=&quot;hello-ewedb-Slide2.PNG&quot; /&gt;&lt;/p&gt;

</content>
 </entry>
 
 <entry>
   <title>Extracting Weather Data from Grids</title>
   <link href="http://swish-climate-impact-assessment.github.io/2013/04/extract-weather-from-grids/"/>
   <updated>2013-04-26T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2013/04/extract-weather-from-grids</id>
   <content type="html">&lt;h1 id=&quot;gridded-weather-data&quot;&gt;Gridded weather Data&lt;/h1&gt;
&lt;p&gt;One of the cornerstone datasets in the EWEDB is the gridded weather data from the &lt;a href=&quot;http://www.bom.gov.au&quot;&gt;Australian Bureau of Meteorology&lt;/a&gt;.  This post describes how a user can extract weather data for their study locations by overlaying the coordinates on a grid and returning the value of the pixel at each location for a specified date.&lt;/p&gt;

&lt;h2 id=&quot;step-one-find-the-data&quot;&gt;Step one: find the data&lt;/h2&gt;
&lt;h3 id=&quot;first-log-in-to-the-web-catalogue&quot;&gt;First log in to the Web Catalogue&lt;/h3&gt;

&lt;p&gt;&lt;img src=&quot;/images/extract-data-login-ddiindex.png&quot; alt=&quot;extract-data-login-ddiindex.png&quot; /&gt;&lt;/p&gt;

&lt;h3 id=&quot;then-browse&quot;&gt;Then Browse&lt;/h3&gt;

&lt;p&gt;&lt;img src=&quot;/images/extract-data-browse.png&quot; alt=&quot;extract-data-browse.png&quot; /&gt;&lt;/p&gt;

&lt;h3 id=&quot;or-search&quot;&gt;Or Search&lt;/h3&gt;

&lt;p&gt;&lt;img src=&quot;/images/extract-data-search.png&quot; alt=&quot;extract-data-search.png&quot; /&gt;&lt;/p&gt;

&lt;h3 id=&quot;these-data-are-discovered--further-information-is-available&quot;&gt;These data are discovered.  Further information is available.&lt;/h3&gt;

&lt;p&gt;&lt;img src=&quot;/images/extract-data-search-result.png&quot; alt=&quot;extract-data-search-result.png&quot; /&gt;&lt;/p&gt;

&lt;h2 id=&quot;step-two-create-a-kepler-workflow&quot;&gt;Step two: Create a Kepler Workflow&lt;/h2&gt;

&lt;p&gt;The Workflow in the image below:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;gets a list of study locations in the towns.xlsx file (Notice that Wolongong is MISSPELT?)&lt;/li&gt;
  &lt;li&gt;subsets them to the places of interest&lt;/li&gt;
  &lt;li&gt;geocodes them using the google geocoder (which will return a fuzzy logic best match for the misspelt name - thanks Google!)&lt;/li&gt;
  &lt;li&gt;uploads the coordinate data (in latitude and longitude) to the EWEDB PostGIS server (after checking our saved password in the postgres.conf file)&lt;/li&gt;
  &lt;li&gt;tells PostGIS that the data are a point vector datatype, and that the coordinates are in the GDA 1994 coordinate system&lt;/li&gt;
  &lt;li&gt;extracts the pixel values for the raster named in the string constant (that we found from the catalogue)&lt;/li&gt;
&lt;/ul&gt;
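The final extraction step can be sketched as assembling a query that samples the raster under each point; the table and column names here are hypothetical, while ST_Value and ST_Intersects are standard PostGIS functions:

```python
def extract_sql(raster_table, points_table, band=1):
    """Build a PostGIS query returning the raster value under each point.

    Table names are hypothetical placeholders for tables in the EWEDB.
    """
    return (
        "SELECT p.id, ST_Value(r.rast, %d, p.geom) AS pixel_value "
        "FROM %s AS r, %s AS p "
        "WHERE ST_Intersects(r.rast, p.geom);" % (band, raster_table, points_table)
    )

# e.g. maximum temperature grid for a hypothetical date, against study towns
print(extract_sql("awap_maxave_20130101", "study_towns"))
```

In the Kepler workflow the equivalent query is issued by an actor using the saved connection details.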

&lt;p&gt;&lt;img src=&quot;/images/setup-swish-Slide8.PNG&quot; alt=&quot;setup-swish-Slide8.PNG&quot; /&gt;&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/extract-data-kepler.png&quot; alt=&quot;extract-data-kepler.png&quot; /&gt;&lt;/p&gt;

&lt;h2 id=&quot;the-result&quot;&gt;The result&lt;/h2&gt;
&lt;p&gt;The result is a file extracted from the database to the local TEMP directory and the name is shown.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/setup-swish-Slide13.PNG&quot; alt=&quot;setup-swish-Slide13.PNG&quot; /&gt;&lt;/p&gt;

&lt;p&gt;The user can then take these data for further work.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/setup-swish-Slide14.PNG&quot; alt=&quot;setup-swish-Slide14.PNG&quot; /&gt;&lt;/p&gt;

&lt;h2 id=&quot;quality-control&quot;&gt;Quality Control&lt;/h2&gt;
&lt;p&gt;An important point to note is that the coordinates retrieved from the GoogleMaps geocoder might not be correct.  It is easy to check that the locations we just stored in the database are correct by viewing them in Quantum GIS (see &lt;a href=&quot;/2013/04/quantumgis-and-postgis&quot;&gt;this previous post&lt;/a&gt; for instructions on setting up Quantum GIS).&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/setup-swish-Slide15.PNG&quot; alt=&quot;setup-swish-Slide15.PNG&quot; /&gt;&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/setup-swish-Slide16.PNG&quot; alt=&quot;setup-swish-Slide16.PNG&quot; /&gt;&lt;/p&gt;

&lt;p&gt;Thankfully these locations appear good (even the mis-spelt “Wolongong”).&lt;/p&gt;
</content>
 </entry>
 
 <entry>
   <title>A SWISH user test report</title>
   <link href="http://swish-climate-impact-assessment.github.io/2013/04/a-swish-user-test-report/"/>
   <updated>2013-04-19T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2013/04/a-swish-user-test-report</id>
   <content type="html">&lt;h2 id=&quot;a-swish-testimonial&quot;&gt;A SWISH testimonial&lt;/h2&gt;
&lt;p&gt;Here is what a test user had to say about the EWEDB.&lt;/p&gt;

&lt;p&gt;Steve McEachern is Deputy Director of the Australian Data Archive and
senior research fellow with the Australian Demographic and Social
Research Institute at the Australian National University.&lt;/p&gt;

&lt;p&gt;The Scientific Workflow and Integration Software for Health (SWISH)
provides an enviable collection of research tools for the conduct of
health and social science research. The starting point for SWISH is the
integrated data catalogue, which provides an ideal access point for
finding and exploring spatial data available through SWISH.&lt;/p&gt;

&lt;p&gt;Once data are discovered, the researcher then has the capacity to readily
access the relevant spatial data through the SWISH Extreme Weather
Events database (EWEDB). The integration of the Postgres/PostGIS
database and Geoserver web service for visualisation, along with the
streamlined access to the spatial data through the Rstudio server
environment, enable the integration of geospatial data with other survey
and administrative data sources.&lt;/p&gt;

&lt;p&gt;This integration capability allows us to easily bring together data
sources that have not previously been considered in common, due to the
level of knowledge required, covering multiple disciplines and research
methods. In the example presented here, we provide a simple analysis of
the distribution of drought across NSW in 2006, derived from Bureau of
Meteorology data, and the vote for the Liberal Party in the same
electorates in 2010, drawn from the Australian Electoral Commission
election results website. The correlation between the level of drought
in 2006 and voting behaviour in 2010 is then shown in the concluding figure.&lt;/p&gt;

&lt;p&gt;The integration of the system with GitHub, the DDIIndex data catalogue
and the SWISH data registry system also enable the research to be fully
documented, published and then available for reanalysis, further
demonstrating the potential of the system for supporting reproducible
research. All of the analysis presented here is available through the
project GitHub repository.&lt;/p&gt;

&lt;p&gt;While the analysis is exploratory only, the use of the SWISH system
shows the ease with which multiple data sources can be brought together,
and hence to be able to answer more complex research questions, and at
increasingly specific levels of geography. Using this system,
researchers might then consider the effects of weather patterns on
social phenomena, such as the relationship between seasonal weather
patterns and depression within local government areas, or extreme
weather events and social media use.
I look forward to using this system further in future.&lt;/p&gt;

&lt;div class=&quot;highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;Dr. Steven McEachern
Deputy Director
Australian Data Archive
Australian National University
Ph. +61 2 6125 2200
http://www.ada.edu.au
28 September 2012
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;
</content>
 </entry>
 
 <entry>
   <title>PostGIS utils on windows</title>
   <link href="http://swish-climate-impact-assessment.github.io/2013/04/postgis-utils-on-windows/"/>
   <updated>2013-04-15T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2013/04/postgis-utils-on-windows</id>
   <content type="html">&lt;p&gt;The SWISH EWEDB server is a postgres database with the PostGIS add-on. 
Some of our tools require that the local client computer has some postgres software, but we don’t need you to actually install anything.
An easy way to get these tools to work (especially for windows users) is to:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;download the two zip archives from the links below:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href=&quot;http://www.enterprisedb.com/products-services-training/pgbindownload&quot;&gt;http://www.enterprisedb.com/products-services-training/pgbindownload&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;http://download.osgeo.org/postgis/windows/pg92/postgis-pg92-binaries-2.0.2w64.zip&quot;&gt;http://download.osgeo.org/postgis/windows/pg92/postgis-pg92-binaries-2.0.2w64.zip&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;
    &lt;p&gt;then unzip them, putting the files into:&lt;/p&gt;

    &lt;p&gt;C:\pgutils&lt;/p&gt;
  &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;A tutorial with screenshots to make use of the GIS features of the EWEDB will follow in the future.&lt;/p&gt;
</content>
 </entry>
 
 <entry>
   <title>Kepler Actors with R on Linux</title>
   <link href="http://swish-climate-impact-assessment.github.io/2013/04/kepler-actors-R-linux/"/>
   <updated>2013-04-12T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2013/04/kepler-actors-R-linux</id>
   <content type="html">&lt;p&gt;The design of the SWISH Kepler actors is to take files and create files.&lt;/p&gt;

&lt;p&gt;If data is passed between actors then temporary files are created in the temporary directory after each actor completes, and are fed to the subsequent actors.&lt;/p&gt;

&lt;p&gt;For the R actors on Windows this is achieved using the TEMP environment variable.  On Linux this variable might not exist.&lt;/p&gt;

&lt;p&gt;This can be created by adding the following line to /usr/lib/R/etc/Rprofile.site&lt;/p&gt;

&lt;div class=&quot;highlighter-rouge&quot;&gt;&lt;div class=&quot;highlight&quot;&gt;&lt;pre class=&quot;highlight&quot;&gt;&lt;code&gt;Sys.setenv(TEMP = &quot;/tmp&quot;)
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;/div&gt;
</content>
 </entry>
 
 <entry>
   <title>QuantumGIS and PostGIS</title>
   <link href="http://swish-climate-impact-assessment.github.io/2013/04/quantumgis-and-postgis/"/>
   <updated>2013-04-09T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2013/04/quantumgis-and-postgis</id>
   <content type="html">&lt;h1 id=&quot;introduction&quot;&gt;Introduction&lt;/h1&gt;
&lt;p&gt;This post explains how to get spatial data from a PostGIS server using QuantumGIS (QGIS).&lt;/p&gt;

&lt;h1 id=&quot;method&quot;&gt;Method&lt;/h1&gt;
&lt;p&gt;Open up QGIS “Add PostGIS Layers”.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/qgis-postgis.png&quot; alt=&quot;qgis-postgis&quot; /&gt;&lt;/p&gt;

&lt;p&gt;Click “New” and enter the details as shown, if “Test” is OK then hit “OK”.&lt;/p&gt;

&lt;p&gt;Note that the options to “only look in geometry_columns” and “use estimated metadata” may speed up communications from the server, but in some cases restrict access to the full list of available tables.  Experiment with these settings by hitting “Edit” and modifying as suits.&lt;/p&gt;

&lt;p&gt;PS The first time you set up the connection there will be a warning about saving your password.  Take a moment to consider the risk this poses to data.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/qgis-postgis-new.png&quot; alt=&quot;qgis-postgis-new&quot; /&gt;&lt;/p&gt;

&lt;p&gt;Select the layer(s) from this list.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/qgis-postgis-getdata.png&quot; alt=&quot;qgis-postgis-getdata&quot; /&gt;&lt;/p&gt;

&lt;p&gt;Right click on the layer and select “Save as”.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/qgis-postgis-saveas1.png&quot; alt=&quot;qgis-postgis-saveas1&quot; /&gt;&lt;/p&gt;

&lt;p&gt;Browse to the local folder to store the data in, and give the shapefile a name.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/qgis-postgis-saveas2.png&quot; alt=&quot;qgis-postgis-saveas2&quot; /&gt;&lt;/p&gt;

&lt;h1 id=&quot;conclusion&quot;&gt;Conclusion&lt;/h1&gt;
&lt;p&gt;This is an appropriate method to download a single vector layer at a time.  If a raster layer is needed or if a bulk download of many vector layers is desired, then other tools called “pgsql2shp” and “readGDAL” are more efficient.  A future post will describe these methods, and SWISH Kepler Actors that we build to perform these tasks.&lt;/p&gt;
</content>
 </entry>
 
 <entry>
   <title>Australian Water Availability Grids tools</title>
   <link href="http://swish-climate-impact-assessment.github.io/2013/03/awaptools-package-released/"/>
   <updated>2013-03-21T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2013/03/awaptools-package-released</id>
   <content type="html">&lt;p&gt;A new SWISH R package to download and format gridded weather data from the Bureau of meteorology can be downloaded from &lt;a href=&quot;/tools/awaptools/awaptools-downloads.html&quot;&gt;downloaded from this page&lt;/a&gt;&lt;/p&gt;
</content>
 </entry>
 
 <entry>
   <title>SWISH Kepler Actors installer</title>
   <link href="http://swish-climate-impact-assessment.github.io/2013/02/swish-kepler-actors-installer/"/>
   <updated>2013-02-20T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2013/02/swish-kepler-actors-installer</id>
   <content type="html">&lt;p&gt;The SWISH Kepler Actors we have built enable users to include Stata programs (&lt;a href=&quot;http://www.stata.com/&quot;&gt;http://www.stata.com/&lt;/a&gt;) into Kepler workflow.&lt;/p&gt;

&lt;p&gt;These actors currently work with Stata 12 in Windows environments.&lt;/p&gt;

&lt;p&gt;We provide an installer that makes the SWISH Kepler Actors available for use in Kepler.  It is fully described and can be &lt;a href=&quot;/tools/swishkepleractorsinstaller/swishkepleractorsinstaller-details.html&quot;&gt;downloaded from this page&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To run the installer, double click the “Install” shortcut; to remove the actors, use the “Uninstall” shortcut.&lt;/p&gt;

&lt;p&gt;The installed actors appear in the Kepler components panel, located on the left of the main Kepler window.  In Kepler the actors can be dragged and dropped onto the canvas, and linked to other actors to create workflows.&lt;/p&gt;

&lt;p&gt;The source code for SWISH Kepler Actors is available at&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://github.com/swish-climate-impact-assessment/swish-kepler-actors&quot;&gt;https://github.com/swish-climate-impact-assessment/swish-kepler-actors&lt;/a&gt;&lt;/p&gt;

</content>
 </entry>
 
 <entry>
   <title>SWISH tools added</title>
   <link href="http://swish-climate-impact-assessment.github.io/2012/12/swish-tools-added/"/>
   <updated>2012-12-18T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2012/12/swish-tools-added</id>
   <content type="html">&lt;p&gt;Two tools have been added to the SWISH-Climate-Impact-Assessments project.&lt;/p&gt;

&lt;p&gt;They are described and downloadable &lt;a href=&quot;/tools.html&quot;&gt;from this page&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;More documentation on the use of these tools in Climate Impact Assessment Workflows will follow soon.&lt;/p&gt;

</content>
 </entry>
 
 <entry>
   <title>Hutchinson Drought Index Fixed</title>
   <link href="http://swish-climate-impact-assessment.github.io/2012/10/hutchinson-drought-index-fixed/"/>
   <updated>2012-10-02T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2012/10/hutchinson-drought-index-fixed</id>
   <content type="html">&lt;p&gt;The &lt;a href=&quot;https://github.com/ivanhanigan/HutchinsonDroughtIndex&quot;&gt;Hutchinson Drought Index algorithm repository&lt;/a&gt; is set up to download data from an online dataset for use as an example.&lt;/p&gt;

&lt;p&gt;Unfortunately the Bureau of Meteorology has recently changed the URLs for these datasets.&lt;/p&gt;

&lt;p&gt;The code available from author Ivan Hanigan’s github account is now updated to account for these changes.&lt;/p&gt;

</content>
 </entry>
 
 <entry>
   <title>New analysis added - Drought and election results</title>
   <link href="http://swish-climate-impact-assessment.github.io/2012/09/new-analysis-added-drought-and-election-results/"/>
   <updated>2012-09-28T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2012/09/new-analysis-added-drought-and-election-results</id>
   <content type="html">&lt;p&gt;A new data analysis project has been added to SWISH.&lt;/p&gt;

&lt;p&gt;In this analysis authors Steven McEachern and Ivan Hanigan explore the hypothesis that Australians are more likely to vote for conservative political parties during droughts.&lt;/p&gt;

&lt;p&gt;To check out the repository, go to &lt;a href=&quot;https://github.com/ivanhanigan/aec_analysis&quot;&gt;https://github.com/ivanhanigan/aec_analysis&lt;/a&gt; and clone it.&lt;/p&gt;
</content>
 </entry>
 
 <entry>
   <title>High level system description now available</title>
   <link href="http://swish-climate-impact-assessment.github.io/2012/09/high-level-system-description/"/>
   <updated>2012-09-27T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2012/09/high-level-system-description</id>
   <content type="html">&lt;p&gt;The &lt;a href=&quot;/HighLevelDescription.html&quot;&gt;High Level System Description Document&lt;/a&gt; for the Scientific Workflow and Integration Software for Health (SWISH) project is now available.&lt;/p&gt;

&lt;p&gt;The project is designed to support climate impact assessment on Human Health. SWISH tools will include methods to chain together tasks that perform operations in the domains of:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;data acquisition,&lt;/li&gt;
  &lt;li&gt;data transformation,&lt;/li&gt;
  &lt;li&gt;mathematical operations,&lt;/li&gt;
  &lt;li&gt;graphing,&lt;/li&gt;
  &lt;li&gt;statistical analysis, and&lt;/li&gt;
  &lt;li&gt;output.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It will include both an operational web-based research platform as well as enhance traditional desktop client-side workflows, so that it boosts capacity without compromising expertise and trusted workflows. The software ecosystem is summarised in the image below, and fully described &lt;a href=&quot;/HighLevelDescription.html&quot;&gt;on this page&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/images/3Components.png&quot; alt=&quot;System Structure&quot; /&gt;&lt;/p&gt;

&lt;p&gt;The first demonstration of the system will be the creation of an online validated Extreme Weather Events (EWE) database from historical data that can be queried repeatedly, easily and effectively.&lt;/p&gt;

&lt;p&gt;This will be merged with Health, Population and Climate Change scenario data to project future health impacts; and the impact assessment will be able to be easily updated with future additional health, population and weather data; or new Climate Change model versions.&lt;/p&gt;

&lt;p&gt;SWISH is funded by the &lt;a href=&quot;http://ands.org.au/&quot;&gt;Australian National Data Service&lt;/a&gt;&lt;/p&gt;

</content>
 </entry>
 
 <entry>
   <title>What Does Access Get You?</title>
   <link href="http://swish-climate-impact-assessment.github.io/2012/09/what-does-access-get-you/"/>
   <updated>2012-09-26T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2012/09/what-does-access-get-you</id>
   <content type="html">&lt;p&gt;Once you have applied for and been approved access you will be asked to fill out a
Data Management Plan Checklist and then the relationship with the SWISH project is enshrined in a provision
agreement.&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;&lt;a href=&quot;https://115.146.93.108&quot;&gt;Rstudio for analysis&lt;/a&gt;&lt;/li&gt;
  &lt;li&gt;&lt;a href=&quot;http://115.146.93.108:8181/ddiindex&quot;&gt;The ddiindex catalogue for finding data&lt;/a&gt;&lt;/li&gt;
  &lt;li&gt;The PostGIS database: psql -h 115.146.94.209 -d ewedb -U username&lt;/li&gt;
  &lt;li&gt;The data registry is available for power users &lt;a href=&quot;http://115.146.93.108:8080/f?p=102&quot;&gt;when contributing data&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The geoserver is not restricted for analysts, so you can just point the QGIS WFS tool at &lt;a href=&quot;http://115.146.94.209:8181/geoserver/wfs&quot;&gt;this URL&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Access is limited to a finite period and set to expire on a
predetermined date. We can offer near-offline storage on our servers
for a period, and then archive to a data repository such as the Australian Data Archives.&lt;/p&gt;

</content>
 </entry>
 
 <entry>
   <title>Scheduled Downtime Sat 29 Sept</title>
   <link href="http://swish-climate-impact-assessment.github.io/2012/09/scheduled-downtime/"/>
   <updated>2012-09-18T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2012/09/scheduled-downtime</id>
   <content type="html">&lt;p&gt;The EWE-DB GIS server will be offline on Saturday 29th September for upgrades.&lt;/p&gt;

</content>
 </entry>
 
 <entry>
   <title>Data Management Plan and Software Development Paradigm</title>
   <link href="http://swish-climate-impact-assessment.github.io/2012/09/data-management-plan/"/>
   <updated>2012-09-15T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2012/09/data-management-plan</id>
   <content type="html">&lt;p&gt;The Extreme Weather Events Database consists of two linked servers and a Github repository. The servers will be dedicated to two symbiotic systems for firstly storing and accessing the data, and second for analyses.&lt;/p&gt;

&lt;p&gt;The first server is a GIS database.&lt;br /&gt;
The software stack will include:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;A PostGIS server&lt;/li&gt;
  &lt;li&gt;geoserver&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The second server will combine analytical software with a data registry and searchable catalogue.&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;Rstudio server&lt;/li&gt;
  &lt;li&gt;an Oracle Express APEX HTML Database&lt;/li&gt;
  &lt;li&gt;a DDIindex searchable data catalogue&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;All data and software available from our site are free or open source.
This means any health scientist (academic, public servant, emergency
aid worker or student), anywhere in the world, can freely access these
data and tools.&lt;br /&gt;
Because the software is developed in the open and can be freely
modified, it is hoped that, with little modification, it could be used
by others to enhance our understanding of the impact of Extreme
Weather Events on health, speed up responses to impending threats, and
increase our ability to adapt to or avoid the harmful consequences of
Climate Change.&lt;/p&gt;

</content>
 </entry>
 
 <entry>
   <title>Hello World</title>
   <link href="http://swish-climate-impact-assessment.github.io/2012/09/helloworld/"/>
   <updated>2012-09-14T00:00:00+00:00</updated>
   <id>http://swish-climate-impact-assessment.github.io/2012/09/helloworld</id>
   <content type="html">&lt;p&gt;This is the Extreme Weather Events (EWE) database that accompanies the project ‘Scientific Workflow and Integration Software for Health - Climate Impacts Assessment’, or SWISH for short &lt;a href=&quot;http://swish-climate-impact-assessment.blogspot.com.au/&quot;&gt;http://swish-climate-impact-assessment.blogspot.com.au/&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;We are funded by the Australian National Data Service (ANDS).&lt;/p&gt;

&lt;p&gt;The project is to build a Scientific Workflow System for Assessing and
Projecting the Health Impacts of Extreme Weather Events.&lt;/p&gt;

&lt;h2 id=&quot;about-this-site&quot;&gt;About this site&lt;/h2&gt;
&lt;p&gt;This server hosts a PostGIS database with the results of the algorithms we develop to describe EWE.&lt;br /&gt;
The data are available for spatial and temporal queries, and for download and integration with health data.&lt;/p&gt;
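&lt;p&gt;As an illustration of the kind of spatial and temporal query the database supports, the sketch below builds a PostGIS query string; the table and column names (&lt;code&gt;ewe_events&lt;/code&gt;, &lt;code&gt;geom&lt;/code&gt;, &lt;code&gt;event_date&lt;/code&gt;) are hypothetical, not the actual schema:&lt;/p&gt;

```python
def build_ewe_query(min_lon, min_lat, max_lon, max_lat, start, end):
    """Return SQL selecting events inside a lon/lat bounding box
    (PostGIS ST_MakeEnvelope with SRID 4326) and a date range."""
    return (
        "SELECT event_id, event_type, event_date, geom "
        "FROM ewe_events "
        f"WHERE ST_Intersects(geom, ST_MakeEnvelope("
        f"{min_lon}, {min_lat}, {max_lon}, {max_lat}, 4326)) "
        f"AND event_date BETWEEN '{start}' AND '{end}';"
    )

# Example: events over south-eastern Australia during 1939.
sql = build_ewe_query(140.9, -39.2, 150.0, -33.9, "1939-01-01", "1939-12-31")
```

&lt;p&gt;In practice such a query would be run through &lt;code&gt;psql&lt;/code&gt; or a database driver with proper parameter binding rather than string interpolation.&lt;/p&gt;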

&lt;p&gt;The project will probably result in a database with around 50GB of preprocessed extreme weather event data and some spatial data related to zones of the population census. Access will be restricted to bona fide researchers investigating links between extreme weather and human health. We envisage only a small number of researchers will be accessing these data.&lt;/p&gt;

&lt;h2 id=&quot;other-uses-of-this-site&quot;&gt;Other uses of this site&lt;/h2&gt;
&lt;p&gt;This website can also serve to host detailed explanations of the algorithms and tutorials on how to access the data.&lt;/p&gt;
</content>
 </entry>
 
 
</feed>
