A footnote to the Automatic Selfie

 

[Image: Slackers-Friday-1]

This is the last post on the algorithm topic for a while, maybe ever, but the theme has intrigued me and maybe I've learned something. After writing the last post and subsequently finding this article, I thought I'd test Google Stories further to see if I could work out what criteria might be applied to data and images. So I photographed a trip into Bristol and back on a bus, with some walking around to various locations in town. I hoped to get Google to create a Story from my images that gave a sense of a journey through a variety of places, but without including any obvious landmarks or 'scenic shots' that might get snapped up as 'highlights'. So, using a variation on the selfie, I kept myself just in the picture, by a toe-hold…

[Image: Slackers-Friday-2]

The two strips (above and at the top) are screenshots of the 'Friday in Bristol and Pill' Story as made by Google, unedited by me. As I've now edited this 'original' (an interesting concept in this territory) you can now only see my new version, so I stitched together screenshots to show what Google did automatically. 'It' selected 19 'moments' out of 54 images taken over 8 hours or so (see below), covering around 13 miles, including the 11-mile return bus journey. But in another experiment over a much shorter route of about a mile, taking less than an hour, a far higher percentage was chosen: 25 out of 29 photos. Oddly, the only shot that got the 'AutoAwesome' treatment from Google was the title shot of a bus stop sign on the tarmac, but 'they' didn't include that in the Story.

I'm baffled by the inclusion of the last shot – it's fermenting grapes in a brewing bucket! It was taken several hours after the final photograph of the Bristol trip, on a different phone that didn't have location switched on, and it clearly has no relation to the rest of the photos. Below is another screenshot showing thumbnails of all the photos I took with location turned on, as displayed on Google+ images, earliest at bottom right, last at top left.

One of the features of Story is to insert, from the vast reservoir of images available to Google, a round shot on the timeline of where it thinks you are (see screenshots above) – a fairly invasive addition, though they can be removed. Only two of these appeared, and both were very generic where I had expected more specific locations. Clifton was broadly the right location but was represented by penguins; I was nowhere near the zoo, so it was an odd distraction. Google has 13 photos of the Royal West of England Academy building on Google Maps, and I was there for an hour, so I was surprised that didn't get shown. Harbourside was right but very general; I was right next to the SS Great Britain for a while as I waited for the ferry on two occasions, and nearby over lunch – with 100 photos shown on Google Maps, that would have seemed a strong contender to include as a location.

[Image: bed-to-bristol-and-back]

All the photos taken on the phone with the location setting 'on' during the trip from home to Bristol and back.

So here are the edited versions of the Story described above – now re-titled "Slackers Friday in Bristol (while testing Google Story)" – and, for the hell of it, the other experiment, much reduced in 'moments' and called 'Tea Walk'. No conclusions reached in this insubstantial research, other than that algorithms work in mysterious ways and that it's an entertaining addition to a routine trip somewhere if you're on your own! Maybe I should learn how to code, or interview some of the people who devise the parameters; maybe I'll just carry on playing.
