
Baby Steps

Photo by strollerdos, via Flickr, Creative Commons

This is the second post in a series covering my exploration, experimentation and musings in the area of fictional modelling. In short: can we use recent developments in semantic web technologies to represent elements of fictional content, and what would that allow us to do? For my introduction to the topic, see my previous post here. In this entry, I’ll talk about my first practical steps and their implications. Thanks also go to Tom Scott, Dan Brickley and Anthony Green, amongst others, who responded to the first post with helpful comments.

Before I go any further, as pointed out by Chris Sizemore, it’s worth noting that work has been done in a similar area before. Previous IAs at the BBC, including Celia Romaniuk, worked on an ontology known as SUDS to describe the content of soap operas. From what I have seen, it was an extension to FOAF designed to describe further relationships between people, the nature of people ‘playing’ characters, and various events that could take place between the characters in a show. This was done to tie in with an EastEnders website relaunch. I won’t go into much more detail here, but if you’re interested in seeing the original work, there’s a short article here and a great presentation here. Unfortunately, apart from a few example XML fragments, I have so far been unable to find a document that defines the SUDS ontology. This is a shame, because it would have been an extremely useful starting point for my experiments. One option might be to gather the examples together and try to reverse-engineer a schema, but for the moment, and partly as a way for me to learn as much as possible, I’ve decided to start from scratch. Hopefully at some point we can find the SUDS ontology and see how it compares to what I come up with.

So, where to start? Well, as the title suggests, I’m going to start small. Sort of. Readers of the blog, and others who know me, will probably have guessed that I’m a bit of a, shall we say, ‘fan’ of the BBC’s Doctor Who (currently in the news for apparently appointing a 12-year-old as the Eleventh Doctor). So much so that, in my sad little way, most things that I’m presented with in the course of my BBC IA work make me think “How could/would this apply to Doctor Who?”. The programme originally ran for 26 years and has been enjoying an overdue renaissance; its rich history, and its sheer refusal to ever completely conform to most IA domain models, make it both a source of frustration and a source of inspiration. So when I read Tristan Ferne’s blog post over at BBC Radio Labs, shortly before joining the Beeb, I began to wonder. Have a read, it’s a good example of a similar idea.

Tristan’s article concerns fictional modelling for another hugely successful BBC show, The Archers. He talks about being able to break an episode down into scenes, characters, plots and so on, and, for instance, potentially being able to build pages that allow the user to follow a story through multiple episodes, rather than being tied to the traditional episode format. Of course, to paraphrase Jack Bauer, events within The Archers occur in linear time. If we were able to build dynamic and interesting websites from a show like that, centred around a small English village, how about a show that goes forward, back and sideways in time and space? Harking back to my ‘toy box’ analogy from last time, given the imagination of the writers of a show like Doctor Who, and the imagination of our audiences, the potential to create some fantastic websites would be huge.
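To make the idea a little more concrete, here’s a rough sketch, in Turtle, of how an episode might be broken down into scenes and plot threads. To be clear, the vocabulary and URIs below are entirely made up for illustration – they don’t come from Tristan’s post or from any existing schema.

```turtle
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix arch: <http://example.org/fiction/ns#> .   # hypothetical vocabulary

# A hypothetical episode, broken down into scenes...
<http://example.org/archers/episode/example>
    a arch:Episode ;
    rdfs:label "An episode of The Archers" ;
    arch:hasScene <http://example.org/archers/scene/1> ,
                  <http://example.org/archers/scene/2> .

# ...each scene featuring characters and forming part of a plot thread...
<http://example.org/archers/scene/1>
    a arch:Scene ;
    arch:features <http://example.org/archers/character/example-character> ;
    arch:partOfPlot <http://example.org/archers/plot/example-plot> .

# ...so that a 'plot' page could be built by following arch:partOfPlot
# across many episodes, rather than being tied to a single broadcast.
<http://example.org/archers/plot/example-plot>
    a arch:Plot ;
    rdfs:label "A storyline that runs across several episodes" .
```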

Sorry, where was I? Oh yes, starting small. So, yes, obviously I couldn’t hope to cover the whole scope of the show in one go. However, to show the potential of the semantic web and linked data approach, I’d want to start off by experimenting not only with characters who are linked together, but with a plot that is threaded through several episodes. I still haven’t quite decided what I’m going to choose for this, but I’m thinking that the story arc from either the first or fourth series of the current show would be good to try. But before all that, I had to learn how to create some linked data.

So I went even smaller, even simpler. I chose the first ever episode of the show, from 1963. This featured four main characters, and thanks to the workshop from Yves and the others, I had an inkling of how to create FOAF profiles. The results can be seen here (best viewed if you use a Firefox plugin like Tabulator). So far so good. I then linked each character to the others, using the simple ‘knows’ relationship. Finally, to earn my linked open data brownie points, I linked each character to its DBpedia equivalent, using the OWL ‘same as’ relationship. And that’s basically it. Except…
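For anyone who’d rather not fire up Tabulator, the gist of the experiment boils down to triples along these lines. This is a reconstruction in Turtle rather than the actual file I published, and the character URIs are placeholders, but the pattern – foaf:Person, foaf:knows, owl:sameAs – is exactly as described above.

```turtle
@prefix foaf: <http://xmlns.com/foaf/0.1/> .
@prefix owl:  <http://www.w3.org/2002/07/owl#> .

# One of the four characters from the 1963 episode, modelled as a foaf:Person
<http://example.org/doctorwho/character/susan-foreman>
    a foaf:Person ;
    foaf:name "Susan Foreman" ;
    # the simple 'knows' relationship, linking her to the other three
    foaf:knows <http://example.org/doctorwho/character/the-doctor> ,
               <http://example.org/doctorwho/character/ian-chesterton> ,
               <http://example.org/doctorwho/character/barbara-wright> ;
    # and the linked open data brownie points: a link out to DBpedia
    owl:sameAs <http://dbpedia.org/resource/Susan_Foreman> .
```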

Except even this small experiment (which I eventually got working after help from Yves!) raises some interesting points. Firstly, the pernickety part of my brain is saying that we’re mixing two distinct things here. We’re using FOAF, which (I guess, and am happy to be corrected) is primarily intended to represent real people, to model fictional things. Crucially, nowhere, at the moment, are we explicitly stating that these resources are fictional characters. So I’m wondering whether FOAF is the correct ontology to use. Of course, like SUDS, the ontology that results from these experiments will probably be an extension of FOAF, as it is true to say that we’re still modelling the same sort of ‘thing’: the relationships between ‘people’. But the point still stands – we need some way of indicating the ‘fictional’ nature of the FOAF person, where applicable.
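One way of tackling that might be a small extension to FOAF that makes the fictional nature explicit. To be absolutely clear, the fic: namespace and class below are pure speculation on my part – a sketch of what such an extension could look like, not something that exists anywhere yet.

```turtle
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix owl:  <http://www.w3.org/2002/07/owl#> .
@prefix foaf: <http://xmlns.com/foaf/0.1/> .
@prefix fic:  <http://example.org/fiction/ns#> .   # hypothetical namespace

# A hypothetical class: a fictional character is still a foaf:Person,
# so existing FOAF tooling keeps working, but the fictional nature
# is now stated explicitly rather than left implicit.
fic:FictionalCharacter
    a owl:Class ;
    rdfs:subClassOf foaf:Person ;
    rdfs:label "Fictional Character" ;
    rdfs:comment "A person who exists only within a work of fiction." .

# The character descriptions would then simply gain an extra type:
<http://example.org/doctorwho/character/susan-foreman>
    a fic:FictionalCharacter ;
    foaf:name "Susan Foreman" .
```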

Secondly, and perhaps more importantly, as Anthony Green pointed out, and as I discovered when I linked the characters to their DBpedia equivalents, there’s a lot of detailed information out there already. When I linked each character to DBpedia, I got back information that was extremely detailed and fairly well structured. Which, to be honest, depressed me a little bit. Was it worth me continuing? It was clear that others had already done a lot of similar work, and I knew that ultimately it would be silly to reinvent the wheel.

However, then I remembered what data I was actually trying to link. Of course I should still link to the DBpedia equivalents, but the linked data I am thinking of is more to do with linking between characters, plots and so on within my own domain. I’m still slightly uneasy about this, because I know that the main thrust of the whole linked data movement is to link external sources together, and that creating silos of data is not good. However, I’m still definitely in favour of linking to DBpedia – if we were to make our ‘internal’ linked data semantically rich, and then link it to external sources, everyone would benefit. In a way, we would be regarded as the ‘master’ source for this material, in the same way that, in my small experiment, I used DBpedia as my ‘master’ source.
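Put another way: the internal links carry the domain knowledge that DBpedia doesn’t capture – which characters are caught up in which plots, and where those plots surface across episodes – while owl:sameAs provides the bridge outwards. Here’s a rough sketch of that division of labour, again using the made-up fic: vocabulary and placeholder URIs from earlier.

```turtle
@prefix owl: <http://www.w3.org/2002/07/owl#> .
@prefix fic: <http://example.org/fiction/ns#> .   # hypothetical vocabulary

# 'Internal' linked data: relationships within my own domain that
# DBpedia doesn't describe, such as which plot a character is part of.
<http://example.org/doctorwho/character/susan-foreman>
    fic:involvedIn <http://example.org/doctorwho/plot/an-unearthly-child> ;
    # 'External' linked data: the bridge out to the wider web of data.
    owl:sameAs <http://dbpedia.org/resource/Susan_Foreman> .

# The plot itself, threaded through the episodes it appears in.
<http://example.org/doctorwho/plot/an-unearthly-child>
    a fic:Plot ;
    fic:appearsIn <http://example.org/doctorwho/episode/1963-episode-1> .
```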

So that’s it. A long, rambling blog post, and a small, simple experiment. Baby steps. Apologies for the rambling, and I’m not sure that I *quite* explained myself properly in that last part – but there are definitely some interesting issues coming up already, and I’m hoping that the advantages of my position will be borne out in future experiments. Finally, I’ve adapted the RDF file that I used to create the FOAF profiles to temporarily remove the OWL ‘same as’ relationship – just to ease the page loading time and, for the moment, to give me a cleaner space to work in. The adapted version is here, the original version here. Linking back *in* to DBpedia will be a task for later…

Again, comments, queries and advice are more than welcome – comment, twitter or email me.

Coldcut @ Electric Proms ’08

As I write this entry, the first fireworks of the year are going off outside. “Bang after bang after bang after bang”, as an accomplished broadcaster once said. On Saturday night I went to the Roundhouse in Camden for my first Electric Proms concert – Coldcut via the Radiophonic Workshop. My brother and I got there early to pick up the tickets, expecting to only be allowed in for the DJ set, rather than the discussion beforehand. Luckily for us, the discussion had been delayed by half an hour, and we managed to get in for that as well.

I won’t bother describing the event in great detail; you can find all the info on the Electric Proms website. Saying that, though, one great thing I’ve just uncovered via that link is a minute-by-minute record of the gig from Twitter. As an aside, that’s one of the cool things about Twitter, in my experience – conversations between work colleagues that might otherwise go unrecorded, including examples of collaboration and idea-building, are preserved. Equally, live experiences which, unless ‘taped’, would eventually be forgotten can be preserved in some fashion here – including, crucially, the emotions and feelings of the people experiencing the event. (I wonder – what about supplementing the football ‘minute-by-minute’ feeds on matchdays with Twitter feeds as well as 606 comments?)

The DJ set itself was quite good – it’s sometimes hard to ‘get into’ a gig when it’s material you don’t recognise, so for me the best part of the performance was when they re-mixed the Doctor Who theme (it got the best reception from the crowd, and it’s a shame it wasn’t longer, in this fan-boy’s opinion). It would have been interesting to include more voice sampling in a similar fashion to ‘Doctorin’ The House’ or ‘More Beats and Pieces’, but I can understand why they wanted to concentrate on Radiophonic Workshop material.

Other than that, the night consisted of a house party in Willesden Green, where the floors were covered in a large bed of autumnal leaves, everyone wore increasingly bizarre hats, and Bjork was on the stereo. Not much more to add, I suppose. Oh, and today I started work as an ‘Information Architect’ at BBC Audio & Music, the experiences from which, I hope, will inspire me to write future blog posts.