Roger Hyam

I'm the Digital Information Development Officer at The Royal Botanic Garden Edinburgh

Jan 27, 2017
 

I’ve spent the last few days working on the data pipeline I first mentioned in “The Cloud Lottery that is Scottish Satellite Imagery”. This pipeline consists of a series of R scripts that place an order for Landsat 8 surface reflectance data and vegetation indices with the United States Geological Survey, download the products as they become available, then strip out the layers I’m interested in analysing, removing any areas of cloud or high atmospheric aerosols as they go. The hope is that I’ll be able to leave this running and build a continuously updated dataset covering areas of interest that I can then query for analysis.

The OpenStreetMap of Edinburgh with an Enhanced Vegetation Index (EVI) overlay, from a particularly clear day on 25th October 2016.

When working on a job like this it is easy to get buried in the detail and lose sight of the end goal. As it is Friday afternoon, and next week I need to turn my attention to other projects for a while, I thought I’d visualise one of the clearer images and check I’m still on course. The images are fascinating so I wanted to share them.

You may have seen a recent article in the Guardian “How green is your city? UK’s top 10 mapped and ranked”. Edinburgh came out top with 49% “green”. These maps were produced by ESRI based on Normalised Difference Vegetation Index and are very simplified in comparison with what I hope to do. I’d recommend taking a look at that article as I don’t think I can reproduce the images here for copyright reasons. (I also think Edinburgh cheats a bit in the ranking by including a chunk of the Pentland Hills within its boundary which one wouldn’t do in a serious analysis!).

Close up of EVI for the botanics and surrounding area. Note how bright green the playing fields are compared to the botanics. Also note the mosaic of gardens and buildings.

Landsat 8 data has a resolution of 30m; each pixel is about 100 feet across. I’m hoping that at this resolution we’ll be able to produce a proxy metric for how green an area “feels” to people living and working there, especially when combined with Google Street View analysis to paint a fuller picture. It is not going to be simple. As you can see, large blocks of green represent the biodiversity deserts that are playing fields. I’d also consider these aesthetic deserts. The mosaics of different greens seen in the botanics, the allotments, Warriston Cemetery and St Marks Park tell a different story, as do the built up areas of the new town. I’m excited to see where we can take this analysis.

Jan 11, 2017
 

We instinctively know that a walk in the garden or somewhere else filled with natural beauty is good for us, but it is difficult to justify expensive or restrictive planning decisions on the basis of instinct alone. This is why scientists have been trying for years to quantify just how exposure to green space improves our mental and physical health. They have managed to show that what we see out of the window or walk past on the street really does matter for our stress levels, and it is particularly good if what we see is plants. But how do we extend these results beyond small studies based on a few volunteers?

As I’ve posted before there is an emerging field (Biophilomatics) which is using big data resources to look at these problems. This latest study from RBGE takes a new approach to the subject by plugging together some of the latest Google technologies in a novel way.

Computers are getting better at assessing the contents of our photos. If you use photo sharing sites you may have noticed that they use this technology to sort your photos by subject. Google has now given developers access to the algorithms behind their image classification systems through the Google Vision API. By combining this image recognition technology with Google Street View images from around Edinburgh we have demonstrated the possibility of assessing our cities’ streets for their perceived naturalness and potential restorative value using automatically sampled images.
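To make the idea concrete, here is a minimal sketch (not the code behind the study) of how a “naturalness” score might be derived once an image-labelling service such as the Google Vision API has returned labels for each Street View image. The label lists and the nature vocabulary below are illustrative assumptions:

```python
# Hedged sketch: score an image's "naturalness" as the proportion of
# its labels that fall in a hand-picked nature vocabulary. The
# vocabulary and example labels are invented for illustration.

NATURE_LABELS = {"tree", "plant", "grass", "flower", "shrub",
                 "woodland", "garden", "vegetation", "park"}

def naturalness(labels):
    """Fraction of detected labels that are nature-related (0.0 to 1.0)."""
    if not labels:
        return 0.0
    hits = sum(1 for label in labels if label.lower() in NATURE_LABELS)
    return hits / len(labels)

# Example label lists, as an image-labelling API might return them.
street_scene = ["building", "road", "tree", "car"]
park_scene = ["tree", "grass", "park", "bench"]
print(naturalness(street_scene))  # 0.25
print(naturalness(park_scene))    # 0.75
```

Averaging such scores over randomly sampled panoramas is one way a street-level greenness signal could emerge from automatically gathered imagery.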

Map of Edinburgh showing sample area and points. Greener dots are more natural. Redder halos are more deprived according to the Scottish Index of Multiple Deprivation. Where Street View panoramas are offset they are joined to sample points by lines.

It is early days yet, and we need to develop the method further before we can draw strong conclusions, but this could be an addition to the toolbox for building more liveable cities.

The work was done as part of the Edinburgh Living Landscape (ELL) project which involves a number of partner organisations with interest and skill in our shared environment. ELL promotes improving, expanding and connecting up the green space in the city and encourages innovation in greening up built structures recognising that the local natural environment is a health asset. There are key questions about how best to achieve this for all sectors of the population and the methodology described in this paper contributes to building the evidence base for decision making.

Below is a slide show of the randomly sampled Street View images. Is this the Edinburgh you know?



Note: There are privacy and other issues around Street View imagery, and Google therefore removes any images it is requested to. In the slide show above images are loaded directly from Google, so any that have been removed or replaced since the experiment was carried out in early 2016 will appear as unavailable.

Nov 29, 2016
 
Three Landsat 8 Scenes covering central Scotland

I’ve been looking at producing a good quality Normalized Difference Vegetation Index (NDVI) dataset for central Scotland so that I can investigate correlation between green space, biodiversity and well-being indicators. To do this I need access to satellite imagery.

Fortunately every sixteen days the Landsat 8 satellite passes over every part of the earth photographing at a resolution of 30 metres in eleven spectral bands. The U.S. Geological Survey generously make this data available for free and Amazon kindly distribute it as one of their public datasets. NDVI can be calculated by simply comparing the red and near infra-red bands of these images. It sounds very simple until we get into the detail.
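The red/near-infrared comparison is just the normalised difference of the two bands. A minimal per-pixel sketch (the reflectance values are invented for illustration):

```python
# NDVI per pixel from red and near-infrared surface reflectance:
# NDVI = (NIR - Red) / (NIR + Red), ranging from -1 to +1.
# Healthy vegetation absorbs red light and reflects strongly in the
# near infra-red, so it scores high; water and bare surfaces score
# near or below zero.

def ndvi(red, nir):
    """NDVI for one pixel; returns None when both bands are zero."""
    if red + nir == 0:
        return None
    return (nir - red) / (nir + red)

print(round(ndvi(red=0.05, nir=0.45), 2))  # 0.8 (dense vegetation)
print(round(ndvi(red=0.30, nir=0.30), 2))  # 0.0 (roughly bare ground)
```

The detail, of course, is everything that has to happen before this one-liner: atmospheric correction, cloud masking and stitching scenes together.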

Landsat 8 Panchromatic sample captured on 3rd June 2016 (note incomplete new Forth crossing)

NDVI is a measure of light absorbed by chlorophyll, but much of the vegetation in Edinburgh and Glasgow that I am interested in is only actively growing for half the year. It is therefore important to compare images from different times of year. For consistency I’d like to restrict the data to the Landsat 8 satellite, which entered service in 2013, which further restricts the number of potential images. The Google Earth snapshot above shows the extent of the three Landsat scenes that cover the area of interest. Because we are so far north they overlap, so most of our area of interest is covered by at least two of the scenes and is therefore imaged on average every eight days. From mid-2013 until today this adds up to 187 images, with more being added all the time. This all seems good, but then we run into the next problem.

If you are in central Scotland and look straight up the chances are you will see clouds. This isn’t just a stereotype of Scottish weather (or character?) but is supported by the data. The graph below shows the percentage of cloud cover in each of the 187 available images. The average cloud cover is 55%. For a crude analysis I may be able to select individual images from different years to represent winter and summer but if I want to have a more granular understanding of seasonal change, some indication of change through the years and some redundancy to correct for errors I’ll need to do something more sophisticated.
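The crude approach can be sketched as picking, for each season, the least-cloudy scene from the catalogue metadata. The scene IDs, dates and cloud percentages below are invented for illustration:

```python
# Hedged sketch: select the least-cloudy scene per season from
# scene metadata. All values here are made up, not real catalogue data.
from datetime import date

scenes = [
    {"id": "SCENE_A", "date": date(2016, 1, 14), "cloud": 81},
    {"id": "SCENE_B", "date": date(2016, 6, 3),  "cloud": 12},
    {"id": "SCENE_C", "date": date(2016, 7, 21), "cloud": 48},
    {"id": "SCENE_D", "date": date(2016, 12, 9), "cloud": 66},
]

def season(d):
    """Crude two-season split: April-September is 'summer'."""
    return "summer" if 4 <= d.month <= 9 else "winter"

def best_per_season(scenes):
    """Least-cloudy scene for each season."""
    best = {}
    for s in scenes:
        key = season(s["date"])
        if key not in best or s["cloud"] < best[key]["cloud"]:
            best[key] = s
    return best

picked = best_per_season(scenes)
print(picked["summer"]["cloud"])  # 12
print(picked["winter"]["cloud"])  # 66
```

With average cloud cover at 55%, even the “best” winter scene may be half cloud, which is why something more sophisticated is needed.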

cloud_cover_2013-16

To build my dataset I will therefore have to write scripts that examine all images created for these scenes, remove clouded areas and calculate NDVI for the remaining parts. These parts will then form a jigsaw puzzle that can be assembled in different ways to give seasonal or temporal change for different sub-areas. The system should update automatically or semi-automatically as new scenes are released onto the Amazon hosting platform. Well, that is the plan anyway.
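The jigsaw assembly amounts to a per-pixel composite across overlapping scenes, ignoring values lost to cloud masking. A toy sketch of that idea (not the actual scripts), with None standing in for masked pixels:

```python
# Hedged sketch of compositing: each scene is a flat list of pixel
# values over the same grid, with None marking pixels removed as cloud.
# The composite keeps, per pixel, the mean of the surviving values.

def composite(scenes):
    """Per-pixel mean across scenes, ignoring None (clouded) pixels."""
    result = []
    for values in zip(*scenes):
        clear = [v for v in values if v is not None]
        result.append(sum(clear) / len(clear) if clear else None)
    return result

scene_a = [0.6, None, 0.2, None]   # cloud over pixels 1 and 3
scene_b = [0.4, 0.8, None, None]   # cloud over pixels 2 and 3
print(composite([scene_a, scene_b]))  # [0.5, 0.8, 0.2, None]
```

Only pixels clouded in every available scene stay empty, so the more passes accumulate, the more complete the picture becomes.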

This all makes me very appreciative of the Google Earth data which, apparently effortlessly, does this process at higher resolution for visual satellite imagery, merging it with aerial photography as you zoom in.

Update: I just discovered the USGS service for surface reflectance corrected NDVI and EVI products – this may save some work and cut Amazon out of the loop.

Nov 01, 2016
 

resilience_-midlothian-science-festival-2016

At the end of last month I spent a Thursday evening at IKEA Edinburgh, not for the usual reason of eating chips in the café with my daughter but to contribute to an event called Untangling Resilience to Depression as part of Midlothian Science Festival.

Recently we have been working with Stella Chan, a clinical psychologist from the University of Edinburgh, as part of the Edinburgh Living Landscape project. Our discussions are around the effect connection with nature has on people’s well-being and the role botanic gardens have in mediating those relationships.

The evening of talks at IKEA was part of the public outreach for the STRADL project, led by Prof Andrew McIntosh, in which Stella is involved. STRADL is a multimillion pound, five year, Wellcome Trust funded project that tries to understand the different ways in which people are resilient to depressive illnesses. There are many factors affecting depression, ranging from genes through support networks out to the wider environment. The talks were arranged along this spectrum, with me speaking last on green space and its effect on well-being. Speakers included Prof McIntosh, Prof Andrew Gumley from the University of Glasgow, Prof Matthew Smith from the University of Strathclyde and myself. Stella and Prof Stephen Lawrie acted as compères.

Stella asked me to speak because I had been describing some of my work in this area. I have been developing a mobile phone app called the Ten Breaths Map, which aims to measure people’s engagement with natural spaces, and have been working on a paper that uses automated image categorisation to predict how restorative an image of a place is. This somewhat overlaps with Stella’s Project Soothe. (We’ve also been talking with Sarah Payne at Heriot-Watt University about running experiments on the restorative value of the gardens.)

Stating that a walk in beautiful, natural surroundings might be good for mental and physical well-being seems so obvious as to not be worth investigating, but in the context of rapid urbanisation, where over half the world’s population and more than 80% of those in developed countries live in towns and cities, the question of why a walk in the garden is better than a walk on a treadmill becomes more urgent. Coming from the other direction, the expansion of urbanisation and associated agricultural intensification means there is less room for biodiversity, and those same urban green spaces become important as nature reserves. There is also evidence that exposure to green space makes people more pro-environmental and therefore more likely to support the lifestyle changes necessary to protect the planet in the face of threats like global warming. Research that leads to policy that helps us get our urban green spaces right is likely to have a big impact on the future well-being of both humans and nature.

urban_population_graph

In fact there is good empirical evidence dating back to the 1960s that exposure to nature is beneficial for dealing with psycho-physiological effects of stress.  Just viewing images of nature has been shown to have a restorative effect. The two main theoretical models for why this happens are Stress Recovery Theory (SRT) and Attention Restoration Theory (ART). These theories are complementary. SRT is more concerned with physiological and negative affect whilst ART is concerned with attentional fatigue. Major questions that remain to be answered are: What is it about natural environments that produce these benefits and is this compatible with those same spaces acting as biodiversity refugia that nudge people to be more pro-environmental? Is it enough to just expose people to green space or is education, background and cultural meaning important? Is it equally beneficial for all or are there genetic, personality, gender or age factors?

I like to walk in the garden and intrinsically feel that my training in mindfulness techniques helps me connect with nature but what excites me professionally is that, in this age of big data, some of these questions are going to become much more computable. This is best illustrated with an example. A recent paper by James et al (2016) looked at how exposure to greenness around where you live may be linked to your chances of dying early. They combined two sources. The first dataset came from the Nurses’ Health Study, a project that has been tracking the health of nurses in the United States for the last forty years. These women fill in regular questionnaires about their health and lifestyle. The second dataset was Normalized Difference Vegetation Index (NDVI) calculated from satellite imagery. The study tracked 108,630 women over eight years during which 8,604 of them died. It looked at how much greenness there was within 250m and 1,250m of their homes and, because the study design was longitudinal and the health data is so comprehensive, it could confidently say: “Higher levels of green vegetation were associated with decreased mortality.”

The James et al study was only possible because two large, disparate datasets could be combined in ways their original authors probably never envisaged, yet the results are good, highly relevant and potentially impactful. In a far more modest way, my recent studies have been using automated sampling and mapping techniques to try and infer human–nature connections. I’m reading similar studies and there is scope to do much more. My problem is how to describe what this is in fewer than the eight hundred words I’ve taken here. In conversation with a long-suffering colleague we half jokingly came up with Biophilomatics. But many a good neologism is created in jest and this one is worth documenting.

There is a field called Bioinformatics, from bio- (Greek), information (Latin) and -matic (Greek), which has come to be associated with the computationally complex tasks around DNA and protein structure. The smaller scale bits of biology.

The field we generally work in at the Botanics is more Biodiversity Informatics than bioinformatics. This is information at the taxonomic and systematic levels: the names of organisms, how they are related, where they occur and in what combinations. Whole organism stuff.

Biophilia is a term coined by E.O. Wilson in his 1984 book of the same name to mean “love of life or living systems” or “the connections that human beings subconsciously seek with the rest of life”. It is the notion that we need connections with nature to thrive. It has spawned several movements, notably Biophilic Design, bringing a love of nature into architecture, and Biophilic Cities, integrating nature into urban planning.

The new field of Biophilomatics is a specialisation of Biodiversity Informatics in that it includes much of the same data used in describing the natural world but then combines it with data about human well-being. It is cross domain as it requires collaboration between the biodiversity and health worlds.

  • Formally: Measurement of the effects on human well-being and pro-environmental behaviour of the quality and quantity of connection with nature.
  • Informally: Describing the human love affair with nature.
  • Literally:  The willingness to perform (-matics) life (bio-) loving (philo).

Whether or not biophilomatics takes off as a new term the process of thinking it up has helped clarify, at least in my mind, what I’m working on. Thanks are due to Midlothian Science Festival and IKEA for hosting an evening that helped me through this process.

Aug 12, 2016
 
William_McNab_by_Hill_&_Adamson,_1843-47

William McNab 1780-1848

This is the first professional photographic portrait of a professional horticulturist, the last horticulturist to live in the Botanic Cottage and the man who left the cottage behind.

In 1843 the painter David Octavius Hill and the engineer/chemist Robert Adamson formed a partnership to professionally exploit the newly invented positive/negative photographic process introduced by Fox Talbot in 1841. Their base was Rock House on Calton Hill. Edinburgh residents are most likely to know it as the white building opposite John Lewis.

The Hill & Adamson portraits are amongst the most famous and significant photographs ever taken – partly because they are so early in the development of the positive/negative process, but also because they were the first to combine the eye of the artist with the skill of the technician. The pair were prolific, photographing many of the great and the good of mid-19th Century Scotland, but also the less celebrated Newhaven fishers – perhaps producing the first social documentary photographs.

What does all this have to do with the Botanics you may ask?

20160607-DSCF5351

This year has seen the opening of the Botanic Cottage. This is the original entrance to the Leith Walk site of the Botanics, which has been moved stone by stone to a new location within the garden. It was left behind when the garden moved to its current Inverleith site in the 1820s and has only just caught up. The man who physically moved the garden and left the cottage behind was William McNab. He was the last head gardener to live in the cottage on Leith Walk. He was still alive and running the Inverleith garden in the 1840s when Hill & Adamson were practising, and so appears as one of their subjects. The actual date of the photograph isn’t known, but the partnership ended in 1848 with the untimely death of Adamson, who had been ailing for several years. The majority of their three thousand photographs were therefore made at the beginning of their collaboration. McNab died in 1848 as well and was buried next to his wife – coincidentally on Calton Hill.

It isn’t a flattering portrait or one of Hill & Adamson’s best. McNab looks stern with closed eyes (probably because he had to sit still for so long for the image to register) when by all accounts he was very nice and liked by all. There is another exposure with eyes open that has been rather unfairly captioned “the mad horticulturist” on the internet. Only the above image appears to be available at high resolution in the public domain.

IMG_20160622_191305

I have a fascination with photographing people, and my major project for last year was to photograph those involved in moving the cottage. You can read about the project and see some of the images on my personal blog. Thumbnails were used in the new guidebook and we may produce a self-print book of the complete set to leave in the cottage. For the time being there is a small exhibition of portraits in the Botanic Cottage.

Of course I was familiar with Hill & Adamson but hadn’t known they had photographed McNab until I found a National Portrait Gallery catalogue of their work in a second-hand book shop. I was reminded of my photograph of David Rae, the Director of Horticulture and McNab’s equivalent during the instigation of the cottage project. I’ve managed to make a jovial member of staff look stern and authoritarian – just like Hill & Adamson did to McNab.

DSCF2352

David Rae

And so we come to the real reason for this blog post. A tenuous excuse for me to get my amateur snaps associated with Hill & Adamson!

Mar 03, 2016
 

Botanics_Pano

We know that spending time in beautiful green spaces is good for our mental and physical well-being but there are things we can do to make the most of this precious time and spread the effects into the rest of our lives.

This year I’m running a series of two hour workshops to introduce techniques for mindfully engaging with the environment. Each is held ‘on the hoof’ in the botanics after closing time and is suitable for absolute beginners or people with some experience of meditation. The workshops are designed to be free-standing but with the intention that some people may like to do them multiple times.

Roger Hyam_2013-05-21-1

Although I trained as a botanist and currently focus on digital information at the Botanics, I have been a mindfulness meditation practitioner for the last twenty years. I have a postgraduate certificate in secular mindfulness based techniques from Bangor University and have been ordained as a lay member of the Order of Interbeing – the order founded by Vietnamese Zen master Thich Nhat Hanh.

The workshops will run on Tuesdays between 6pm and 8pm on 12th & 26th April, 10th & 24th May 2016 and cost £15.

If you are interested you can book through the Education office on 0131 248 2937. If you would like more information please drop me an email. You can download a Two Feet, One Mind flier if you would like to share this with friends.

Dec 22, 2015
 

map_750_1334

This week the Water of Leith Walkway Audio Tour app went live in the Apple iOS App Store and the Google Play store. We have produced this in partnership with our friends at the Water of Leith Conservation Trust.

The walkway was established in 2002 and runs from Balerno all the way to Leith passing near to the foot of our garden in Inverleith. The most popular section of the walkway is between the botanics and the Scottish National Gallery of Modern Art.

The trust created an audio tour of the route several years ago but it was only available on-line. You needed a data connection on your phone to listen to the tour. You only discovered the points as you got to them.

The new app uses the same technology as our Dawyck Scottish Trees app. There are twenty audio points to listen to, a sketch map and GPS positioning to show where you are along the route. It was relatively little work to accommodate the larger scale of the walkway map. Importantly, from the point of view of the botanics, a series of bug fixes and improvements were incorporated into the core code and a new version of the Dawyck Scottish Trees app released. These improvements will also roll forward into the planned Inverleith audio tour next year. Each time we produce one of these audio tours the quality goes up and the work involved goes down.

This is a soft launch at a quiet time of year to check everything is working and get some feedback. We will advertise it more widely in the spring. If you would like to try it out just search for “Water of Leith” in the app store on your device. Please let us know how you get on and if you like the app rate it in the store.

 

Dec 22, 2015
 

herbal_screenshot

Back in July 2013 we held a workshop here at RBGE on the use of HTTP URIs (also known as URLs or just plain web addresses) for specimens. This is an important technology that will allow scientists to refer to the specimens they use in their research. The desire is to allow “clicking through” to the raw research materials in all possible contexts where specimens from major natural history collections are mentioned.

Building research infrastructure like this takes time not because it is necessarily technically complex but because consensus needs to grow across major institutions and be followed up by the will and resources to implement corresponding solutions.

Another small step was made along the way this week with the launch of a tester application here at the Botanics. The CETAF Specimen URI Tester has been developed as our contribution to the ongoing standardisation efforts. It allows implementers of HTTP URIs to test whether their implementation meets the proposed technical standards.
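As an illustration of the kind of check such a tester might perform (a simplification, not the CETAF tester’s actual rule set): when a specimen URI is dereferenced asking for machine-readable data, the response should succeed and carry a suitable media type. The list of accepted types here is an assumption for the sketch:

```python
# Hedged sketch: decide whether an HTTP response to a specimen URI
# looks conformant, given its status code and Content-Type header.
# The accepted media types are illustrative, not the standard's list.

MACHINE_READABLE = {"application/rdf+xml", "text/turtle"}

def looks_conformant(status, content_type):
    """True if the response succeeded with a machine-readable media type."""
    media_type = content_type.split(";")[0].strip().lower()
    return status == 200 and media_type in MACHINE_READABLE

print(looks_conformant(200, "application/rdf+xml; charset=utf-8"))  # True
print(looks_conformant(200, "text/html"))                           # False
print(looks_conformant(404, "application/rdf+xml"))                 # False
```

A real tester would of course make the requests itself and check redirects and the returned metadata, not just the headers.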

Perhaps the most significant part of this simple website is the page listing the known implementers of URIs for specimens. It contains twelve major institutions housing many millions of specimens. Hopefully we are reaching a tipping point where it will become standard practice to take this approach.

Sep 25, 2015
 

Apps Apps Apps

Index page of Dawyck app

For several years now we have been looking for a way to make appropriate use of mobile phones to deliver interpretation material.

Smart phones really kicked off with the launch of the first iPhone in 2007, and the market has now matured, with very capable handsets available for little over £100. They are becoming ubiquitous, especially amongst younger people, with an estimated 80% of 20 to 30 year olds having one, and even 50% of 50 to 60 year olds. The phones are used for texting, social media, photography, shopping, general browsing, gaming … and even making calls!

Although the penetration of smart phones presents an opportunity for enhancing visitor experiences and attracting new visitors there are major challenges.

The technical landscape is quite heterogeneous. It is dominated by Android and iOS (Apple) devices, but there are a plethora of different software versions and hardware form factors. This is especially so as the market is only just reaching maturity. A three year old phone might well lack the ability to run the latest version of an operating system, even if the user wanted to, or knew how to, upgrade.

Hype is still present. Although successful applications do a single task, especially at launch, there is still a temptation to try and create an app that does it all: builds a community, integrates with social media, sells things, and so on. Brainstorming meetings can lead to the invention of an unimplementable mess.

The social landscape is also heterogeneous. A teenager will use a phone very differently from her parents.

Classic Audio Tours

Whilst searching for a way to use a new technology we were having issues with a slightly older one. The glass houses in Edinburgh have an audio interpretation tour consisting of some three hundred recordings delivered on dedicated audio wands. There are similar facilities at Logan and Benmore gardens. The upkeep of the hardware is becoming a burden and the units themselves are looking dated. The content is an amazing asset but the sheer amount makes management an issue. Some recordings will become out of date and may need updating but others are absolute gems we want to highlight.

Could we deliver these classic audio tours as an app? The answer is no. The sheer amount of content in the tours is too great to download and the lack of network coverage makes an online approach impractical in the first instance*. Even if we did solve the hardware issue like this it wouldn’t solve the information management issue.

Thinking about this problem led to the concept of single purpose mobile apps that contain bite size chunks of audio interpretation; what we are calling Audio Leaflets. Although not an instant solution to our audio tour challenge thinking about the problem had spawned something new and perhaps solved another problem.

Audio Leaflets

Audio Leaflets are simple, inexpensively produced mobile phone apps for delivering compelling audio interpretation of places and exhibitions. They typically contain between ten and twenty Points of Interest (POI) each with two to three minutes of commentary. They can contain a map or plan to help locate the points but have no other significant features. They can be produced entirely in house or with the help of writing and voice talent.

Audio Leaflets are not direct replacements for traditional tours but they may have a significant role to play. The main challenge we face is in thinking in new ways about how we can exploit this approach. The most important aspect of Audio Leaflets is that they encourage a modular approach. Each Audio Leaflet is a freestanding app in the app store. We could charge for one audio leaflet but not another. We could translate one leaflet and not another. Some leaflets can be ephemeral and others can be a long term commitment. But we use the same method to produce them all so the unit cost is low and we can concentrate on the content our visitors want.

Dawyck Scottish Trees

Screen Shot 2015-09-16 at 15.43.54

Map page of Dawyck app

As the Audio Leaflet idea was maturing we were developing a conventional Scottish Tree Trail at Dawyck Botanic Garden. It would consist of eighteen trees with panels in the ground and a leaflet to guide you around. We decided to make an accompanying Audio Leaflet with more information about the trail. This would give visitors the opportunity to listen to additional stories, facts and folklore about the trees as they walked the trail. It would also give us a straightforward example to try out the whole Audio Leaflet concept.

Max Coleman, our science communicator, who was writing the panels and leaflet, wrote a script for the audio. We were given a small grant by The Sibbald Trust to pay Hazel Darwin-Edwards to do a professional voice-over. I developed the software to run on the phone and deployed it to both Android and iOS (Apple) platforms. The crucial requirement of this software was that it separated logic from content. After Dawyck Scottish Trees was completed we should be able to put different data in, turn the handle again, and easily produce another app for another audio tour.
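The “data in, turn the handle” idea implies each Audio Leaflet is driven by a content file the app logic never needs to know about in advance. A hypothetical sketch of what such a file might look like (the field names, coordinates and file names here are invented for illustration, not the app’s real format):

```json
{
  "title": "Dawyck Scottish Trees",
  "map": "dawyck_map.png",
  "points": [
    {
      "id": 1,
      "name": "Scots Pine",
      "lat": 55.6057,
      "lon": -3.2931,
      "audio": "01_scots_pine.mp3"
    }
  ]
}
```

Swapping in a different file of this shape – new map, new points, new recordings – would yield a new app without touching the code, which is what keeps the unit cost low.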

Most important of all, the guys on the ground at Dawyck, led by Graham Stewart, made sure the trees were in place and corrected the dots on our maps. No trees means no tree trail!

Today the Dawyck Scottish Trees app is live in both the Google Play store for Android and the Apple App Store for your iOS devices. You can install it and have a play by just searching for “Dawyck” in the store on your device or clicking one of the links below.

Please rate the app if you like it and if you don’t get in touch and help us make it better.

What we have proved is that the process is possible. We have learnt a lot of lessons about how the process could be done more efficiently not least of which is doing things in the right order. Our next challenge is to do it again faster and more smoothly.

Next Steps

We are coming into autumn now so there is an opportunity to test out the Dawyck app on a small audience and get any fixes in place before making a decision about publicising it more in the spring.

We are partnering with one of our neighbours, the Water of Leith Conservation Trust, to take an existing on-line audio tour of the Water of Leith Walkway and turn it into an audio leaflet. This is likely to be the next one out of the door because all the content is ready to use.

If all goes well with the first two then a general tour of the Edinburgh Inverleith site would be our next goal. This might be the first one we would charge for and perhaps translate.

If you have any comments or ideas then I’d love to hear them.


* It is likely that the glasshouses in Edinburgh will have complete public WiFi coverage in the near future and so we are considering deploying a mobile friendly web site that mimics the existing hardware and allows people to access the wealth of audio interpretation on their phones. We are also looking to take subsets of the existing audio tour and repurposing them as Audio Leaflets. I particularly like the idea of doing something with the stories for children.


 

Jun 01, 2015
 

splash-port-hdpi

This week saw the approval of the Birds of Peramagroon identification app in the Apple iTunes App Store. It had been approved for the Google Play Android app store several weeks ago. Being available in both app stores means it is now accessible to most smart phone users and we can finally consider it live!

Two questions probably spring to your mind. Where is Peramagroon? Why has a botanic garden produced an identification guide for birds?

Peramagroon is a mountain in Northern Iraq and forms part of the Zagros Mountain range. The area has a high level of plant biodiversity with mountain riverine forests, oak woodlands and thorn-cushion vegetation. This supports a high diversity of bird life. The Centre for Middle Eastern Plants (part of RBGE) has been working on a multi-agency project in the area and, as part of that project, needed to empower people to identify the birds as per the standard textbook for the area – Birds of the Middle East. We already had an experimental mobile identification system that had been used on several plant groups as adjuncts to MSc theses. This birds project offered an opportunity to take that software to the next level and release a guide to the public.

The advantage of going with the birds for our first guide is that the data is already complete and tested. We were generously given permission by the publishers Bloomsbury and the illustrators of Birds of the Middle East to use their illustrations. Sophie Neale (CMEP) worked with the project’s bird team including Richard Porter (one of the authors) to develop and compile a character matrix and populate the app with images. I could concentrate on getting the software working properly.

Please try the app and let me know what you think. Either click the appropriate link below or search for “peramagroon” in the app store on your device.

KISS

[Screenshot: Species Screen]

The notion behind the app is to take the simplest possible approach to identification – the Keep It Simple Stupid approach. In fact the app doesn’t allow identification in the strict sense; it simply orders species by how well they match a user-specified filter. There is no ruling in or out of species as might happen with a traditional dichotomous key or many on-line tools. It is up to the user to make a decision based on the ordering presented. This is how many people “work” a dichotomous key – moving backwards and forwards along the different leads until they are happy. It was felt this approach best suited the small form factor and user affordances offered by mobile phones.

The app uses five basic concepts:

  • Species are the taxonomic units to be identified. Each species has a title, subtitle, text description (not used for this app) and a number of images.
  • Character Sets are organisational units to simplify the interface. In plants they may be something like “Fruit” or “Leaves”. Two are used in the birds app; “Context” and “Characteristics”. The user perceives them simply as subheadings.
  • Characters are descriptors that may be present in one or more states. An example from the birds app is Characteristics > Shape. Each character belongs to a single character set and may have multiple character states.
  • Character States are the possible values for the characters. An example from the birds app is Characteristics > Shape > Dove. Each character state belongs to a single character. Character states may have associated images.
  • Scores are the link between species and character states. Each species is scored to numerous character states. In the birds app Little Ringed Plover is scored to the character state Characteristics > Shape > Dove. Scores are not apparent to the user. They govern which states are displayed on the species profile screen and how the species are ordered when a filter is set. Species can be scored to multiple character states for a character. Following the terminology used in the DELTA format, all characters are unordered multi-state.
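To make the data model concrete, the five concepts above can be sketched as plain JavaScript objects. The field names and ids here are illustrative assumptions, not the app’s actual schema:

```javascript
// Illustrative data shapes for the five concepts; ids and field
// names are assumptions, not the app's real schema.
const characterSets = [{ id: "cs1", name: "Characteristics" }];

const characters = [
  // Each character belongs to one set; all are unordered multi-state.
  { id: "ch1", setId: "cs1", name: "Shape" }
];

const characterStates = [
  // Each state belongs to one character and may carry images.
  { id: "st1", characterId: "ch1", name: "Dove", images: [] },
  { id: "st2", characterId: "ch1", name: "Wader", images: [] }
];

const species = [
  { id: "sp1", title: "Little Ringed Plover", subtitle: "Charadrius dubius", images: [] }
];

// Scores link species to states; a species may be scored to
// several states of the same character.
const scores = [{ speciesId: "sp1", stateId: "st1" }];

// The states shown on a species profile screen.
function statesForSpecies(speciesId) {
  return scores
    .filter(s => s.speciesId === speciesId)
    .map(s => characterStates.find(st => st.id === s.stateId));
}
```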

Interface

[Screenshot: Filter Screen]

There are six screens in the app.

  • Home – Basic information about the application.
  • About – Credits and contact details.
  • Species – A scrollable list of all the species showing the first image, title and subtitle. Initially this is ordered naturally. It is suggested that the more common species should be presented first. If a filter is active then the order changes so that those species with states matching the filter come first. The number of matching states is shown as the score. Tapping on a species in the list displays the profile for that species. A button on the top right goes to the filter screen.
  • Species Profile – All the data about this species in the app: Title, subtitle, text description, images (tap to enlarge) and character states organised by character set and character.
  • Filter – A list of the characters organised by character set. Tapping a character displays the score screen. A button on the top left allows the filter to be cleared.
  • Score – Select which states are present in the observed specimen. Multiple states can be selected.
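The re-ordering behaviour described for the Species screen can be sketched as a small function: species are never excluded, only sorted by how many of their scored states match the filter, with ties keeping the natural (commonest-first) order. The data shapes and names here are assumptions for illustration:

```javascript
// Order species by match count against the active filter.
// "scores" is an array of { speciesId, stateId } links and
// "selectedStateIds" are the states ticked on the Score screen.
function orderSpecies(speciesList, scores, selectedStateIds) {
  const selected = new Set(selectedStateIds);
  const matchCount = sp =>
    scores.filter(s => s.speciesId === sp.id && selected.has(s.stateId)).length;

  return speciesList
    .map((sp, i) => ({ sp, score: matchCount(sp), natural: i }))
    // Higher scores first; ties fall back to the natural order.
    .sort((a, b) => b.score - a.score || a.natural - b.natural);
}
```

With an empty filter every species scores zero, so the natural ordering is preserved.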

Technical Implementation

The app was written in JavaScript using the jQuery Mobile library. It was then packaged as native iOS and Android applications using the Apache Cordova system. Cordova allows the targeting of multiple platforms, so it would also be possible to target Windows Mobile or BlackBerry if that was required. A separate web-based authoring application was used to compile the character matrix and images. The separation of code and data is such that it should be reasonably straightforward to produce identification guides for different taxa, although the publication pipeline will need to be further refined to make this totally painless.
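For reference, a minimal Cordova config.xml for this kind of packaging might look like the following; the widget id, version and description are placeholders for illustration, not the app’s real values:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Placeholder id, name and version for illustration only. -->
<widget id="org.example.peramagroonbirds" version="1.0.0"
        xmlns="http://www.w3.org/ns/widgets">
  <name>Birds of Peramagroon</name>
  <description>A simple matrix-based identification guide.</description>
  <content src="index.html" />
  <access origin="*" />
</widget>
```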

Next Steps

Martin Gardner of the Conifer Conservation Project here at RBGE is working on the top fifty or so conifers found in Scotland, and that should make a compelling app. We are looking to have this released by October 2015. Do you have data on a group of plants that would benefit from this approach? If so maybe we could work together.

[Map: Location of Peramagroon]

[Darwin Initiative Iraq logo]

Funding

The project is funded by the UK Government’s Darwin Initiative programme. To find out more about the project see www.iraqdarwin.org.