I’m fascinated by metadata – it can be found in abundance, almost everywhere. A single tweet, as served up by the Twitter API, contains over 80 pieces of metadata in addition to the tweet itself. These range from the more obvious, like time, date and location, through to flags indicating whether special characters have been used or whether the tweet contains offensive material, as well as numerous details about the user who sent it.

Metadata has also made its way into places and, in particular, devices which have never before necessitated its creation or use. The influx of internet-connected devices over the past decade has led to metadata being automatically generated and used by otherwise inanimate objects like light bulbs and plug sockets, purportedly to both support and enrich the user experience of a product. On a functional level metadata can also be used to generate useful diagnostic information, but the aspect I am more interested in is how metadata can be used to provide context for the device or service it is coupled with.

One of the things I find most interesting about metadata is that it could be used to take a snapshot of the state of something at a specific point in time. Tweets on Twitter are, in effect, the metadata of an enormous number of moments occurring around the world. A day of tweets could be used to gauge the state of the world by way of a large array of qualitative sentences reflecting how people felt at a point in time (assuming every tweet is written by a human, of course).

A problem with this, however, is that when a tweet is removed from its surrounding context, there is a possibility it will make no sense at all. The same can be said for most types of metadata: what it most often provides is context for the thing it accompanies.

With this in mind, I thought I would have a go at representing some of the social media posts I have made in a more natural, context-based example of a moment, or series of moments, that I might have experienced, using the various pieces of metadata I could gather.

I’ve been experimenting with this idea, on and off, over the past six years or so and, though I’ve never felt like I’d come up with something that felt finished, I thought it was about time I put it online to try it out.

Inspired by the likes of Brendan Dawes’ great journaling app, ‘Kennedy’, I wanted to look at an alternative way of processing and interpreting metadata so that it could be reformed into something that might provide a more human, as opposed to a machine or data-heavy, interpretation of a moment, event or activity. For example, I like the way in which ‘Kennedy’ presents data, particularly about the time of day, in a more analogue way:


My approach was to take the ‘Kennedy’ concept one step further, to see how easy it would be to generate a short paragraph of narrative which incorporates elements of metadata as the subject of each sentence – for example, framing a tweet as part of a conversation taking place, or referencing the last Spotify track listened to as a song playing in the background of a scene. With this approach, I also thought it would be interesting to present the outcome in the form of a page from a book, as if it were an actual page in the story of someone’s life.

In developing a solution, my first step was to gather all of the sources of social media and metadata that I’m actively using or generating. There were two big challenges here: the first was finding a reliable and easy way of accessing the data provided by APIs without having to do a ton of work with OAuth, rate limiting, security and all the other hassles that can come with API usage. The second was finding a way to normalise and store the data so that each of the sources could work cohesively with one another.

The quickest and easiest way I could find to deal with most of the issues described above was to make use of IFTTT. IFTTT provides the ability to connect a wide range of online services and have them interact with each other. Services can trigger one another to perform automated tasks, and the possibilities are wide-ranging.

Using IFTTT, I configured a number of applets to collect all of my data sources into a single text file, taking the incoming data from each source and arranging it into a consistent JSON-based format. The resulting file is stored in Dropbox and updated daily for timed data events, such as weather updates, and on an ad-hoc basis for the other sources, as and when new data is available.
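To give a feel for what a normalised record might look like, here is a minimal sketch in Python. The field names (`source`, `timestamp`, `body`, `meta`) are my illustrative assumptions, not the actual format used by the project:

```python
import json
from datetime import datetime, timezone

def make_record(source, body, extra=None):
    """Build one normalised record, as an IFTTT applet might append it.
    Field names here are hypothetical, chosen so every source shares a shape."""
    return {
        "source": source,                                   # e.g. "twitter", "spotify", "weather"
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "body": body,                                       # the human-readable payload
        "meta": extra or {},                                # source-specific metadata
    }

record = make_record("spotify", "Track finished playing",
                     {"track": "Example Song", "artist": "Example Artist"})
line = json.dumps(record)  # one JSON object per event, appended to the Dropbox file
print(line)
```

Keeping every source to the same envelope, with the quirks pushed into a per-source `meta` object, is what lets the later composition step treat a tweet, a song and a weather report interchangeably.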

From here, when someone visits the site, the JSON file is loaded, the latest item for each available data source is retrieved, and three items are picked at random and constructed into sentences. I intended the sentences to be heavily context-based; as such, their composition in code almost takes the form of a ‘Choose Your Own Adventure’, directed by metadata rather than the reader. Factors such as the weather and time of day change the nature and tone of the sentences, as does the metadata relating to each data source itself. A sentence referring to the subject having completed a long run may imply tiredness; equally, not having run for a while may emphasise laziness instead.
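The composition step described above could be sketched roughly as follows. This is not the project’s actual code – the function names, sources and phrasing rules are all assumptions for illustration – but it shows the shape of the idea: latest item per source, a random sample of three, and sentence templates steered by the metadata itself:

```python
import random

def latest_per_source(items):
    """Keep only the most recent item for each data source.
    ISO-8601 timestamps sort correctly as strings."""
    latest = {}
    for item in sorted(items, key=lambda i: i["timestamp"]):
        latest[item["source"]] = item  # later items overwrite earlier ones
    return latest

def sentence_for(item):
    """Turn one item into a sentence; the metadata steers the tone."""
    if item["source"] == "spotify":
        return "In the background, '{track}' was playing.".format(**item["meta"])
    if item["source"] == "running":
        km = item["meta"]["distance_km"]
        return ("Legs still heavy after a long run." if km > 10
                else "A short run, barely enough to notice.")
    return item["body"]  # fall back to the raw payload

def compose(items, n=3):
    """Pick up to n of the latest items at random and join their sentences."""
    pool = list(latest_per_source(items).values())
    picks = random.sample(pool, min(n, len(pool)))
    return " ".join(sentence_for(p) for p in picks)
```

In the same spirit, a weather or time-of-day item could be threaded through `sentence_for` to change the register of every sentence, which is where the ‘Choose Your Own Adventure’ branching really accumulates.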

As mentioned earlier, this is a project I’ve been playing with for a long time and I’m still not sure whether it feels complete, or indeed whether it works that well, but it has been a fun one to play with and adjust in pursuit of the right feel. I’m sure there’s a lot more I could do in terms of composing far more detailed and eloquent sentences with accurate tense and so on. Interestingly, in the time between starting this project and now, machine learning has become much more prevalent, and I now wonder whether it could be used to produce more interesting results by being able to parse far larger amounts of data.

To see the finished result, follow this link:

In finishing this post, I’m acutely aware of the irony that, at the moment I click publish, it will, by design, become part of the next narrative…
