I am a research fellow, conducting research into automatic analysis of bird sounds using machine learning.
Last year I took part in the Dagstuhl seminar on Vocal Interactivity in-and-between Humans, Animals and Robots (VIHAR), where I had many fascinating discussions with phoneticians, roboticists, and animal behaviourists (ethologists).
One surprisingly difficult topic was to come up with a basic data model for describing multi-party interactions. It was so easy to pick a hole in any given model: for example, if we describe actors taking "turns" which have start-times and end-times, then are we really saying that the actor is not actively interacting when it's not their turn? Do conversation participants really flip discretely between an "on" mode and an "off" mode, or does that model ride roughshod over the phenomena we want to understand?
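To make the critique concrete, the "turns" model amounts to something like the following - a hypothetical sketch of the data structure, not a model anyone formally proposed at the seminar:

```python
from dataclasses import dataclass

@dataclass
class Turn:
    """One 'turn' in the interval-based model: each actor's participation
    in the interaction is a set of (start, end) intervals."""
    actor: str
    start: float  # seconds
    end: float    # seconds

def is_active(turns, actor, t):
    """Under this model, an actor is either fully 'on' (inside one of its
    turns) or fully 'off' at any instant t - the all-or-nothing assumption
    questioned above."""
    return any(tu.actor == actor and tu.start <= t < tu.end for tu in turns)
```

The model happily represents overlapping turns by different actors, but each actor is still forced into a binary on/off state at every moment - which is exactly the assumption in question.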
I was reminded of this modelling question when I read this very interesting new journal article by a Japanese research group: "HARKBird: Exploring Acoustic Interactions in Bird Communities Using a Microphone Array". They have developed a really neat setup with a portable microphone array attached to a laptop, which does direction-estimation and decodes which birds are heard from which direction. In the paper they use this to help annotate the time-regions in which birds are active, a bit like the on/off model I mentioned above. Here's a quick sketch:
From this type of data, Suzuki et al. calculate a measure called the transfer entropy, which quantifies the extent to which one individual's vocalisation patterns contain information that predicts the patterns of another. It gives them a hypothesis test for whether one particular individual affects another in a network: who is listening to whom?
That's very similar to the question we were asking in our journal article last year, "Detailed temporal structure of communication networks in groups of songbirds". I talked about our model at the Dagstuhl event. Here I'll merely emphasise that our model doesn't use regions of time, but point-like events:
So our model works well for short calls, but is not appropriate for data that can't be well-described via single moments in time (e.g. extended sounds that aren't easily subdivided). The advantage of our model is that it's a generative probabilistic model: we're directly estimating the characteristics of a detailed temporal model of the communication. The transfer-entropy method, by contrast, doesn't model how the birds influence each other, just detects whether the influence has happened.
I'd love to get the best of both worlds: a generative and general model for extended sound events influencing one another. It's a tall order because, for point-like events, we have point process theory; for extended events I don't think the theory is quite so well-developed. Markov models work OK but don't deal very neatly with multiple parallel streams. The search continues.
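To give a flavour of what I mean by a generative model of influence, here's a toy discrete-time sketch - my own illustration, not the model from either paper - in which each call by bird A transiently raises bird B's calling probability:

```python
import random

def simulate_two_birds(steps=20000, base=0.01, boost=0.3, decay=0.7, seed=1):
    """Two event streams with one-way coupling: A calls at a constant base
    rate; each A call adds `boost` to B's calling probability, and that
    excitation decays geometrically. Returns the two lists of call times."""
    rng = random.Random(seed)
    excitation = 0.0
    events_a, events_b = [], []
    for t in range(steps):
        if rng.random() < base:                        # A: spontaneous calls
            events_a.append(t)
            excitation += boost                        # A's call excites B
        if rng.random() < min(1.0, base + excitation):
            events_b.append(t)                         # B: base rate + influence
        excitation *= decay                            # influence fades over time
    return events_a, events_b
```

Because the coupling parameters (`boost`, `decay`) are explicit, fitting such a model to data would directly estimate how the influence works - whereas transfer entropy would only tell you that it exists.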
A colleague pointed out this new review paper in the journal "Animal Behaviour": Applications of machine learning in animal behaviour studies.
It's a useful introduction to machine learning for animal behaviour people. In particular, the distinction between machine learning (ML) and classical statistical modelling is nicely described (sometimes tricky to convey without insulting one or the other paradigm).
The use of illustrative case studies is good. Most introductions to machine learning base themselves around standard examples predicting "unstructured" outcomes such as house prices (i.e. predict a number) or image categories (i.e. predict a discrete label). Two of the three case studies (all of which are by the authors themselves) are similarly about predicting categorical labels, but couched in useful biological context. It was good to see the case study relating to social networks and jackdaws - not only because it relates to my own recent work with colleagues (specifically: this on communication networks in songbirds and this on monitoring the daily activities of jackdaws - although in our case we're using audio as the data source), but also because it shows an example of using machine learning to help elucidate structured information about animal behaviour rather than just labels.
The paper is sometimes mathematically imprecise: it's incorrect that Gaussian mixture models "lack a global optimum solution", for example (it's just that the global optimum can be hard to find). But the biggest omission, given that the paper was written so recently, is any real mention of deep learning. Deep learning has been showing its strengths for years now, and is not yet widely used in animal behaviour but certainly will be in years to come; researchers reading a review of "machine learning" should really come away with at least a sense of what deep learning is, and how it sits alongside other methods such as random forests. I encourage animal behaviour researchers to look at the very readable overview by LeCun et al in Nature.
Last year, when I took part in the Dagstuhl workshop on Vocal Interactivity in-and-between Humans, Animals and Robots, we had a brainstorming session, fantasising about how advanced robots might help us with animal behaviour research. "Spy" animals, if you will. Imagine a robot bird or a robot chimp, living as part of an ecosystem, but giving us the ability to modify its behaviour and study what happens. If you could send a spy to live among a group of animals, sharing food, communicating, collaborating, imagine how much you could learn about those animals!
So it particularly makes me smile to see the BBC nature doc Spy in the Wild, in which they've... gone there and done it already.
--- Well, not quite. It's a great documentary, with some really astounding footage that makes you think again about what animals' inner lives are like. They use animatronic "spy" animals with film cameras inside, which let the filmmakers get up very close and film from the middle of an animal's social group. These aren't autonomous robots, though - they're remotely operated, and they're not capable of the full range of an animal's behaviours. They're pretty capable though: in order both to blend in and to interact, the spies can do things such as adopt submissive body language - crouching, ear movements, mouth movements, etc. And...
...some of them vocalise too. Yes, there's some vocal interaction between animals and (human-piloted) robots. It's at a pretty simple level - it seems some of the robots have one or two pre-recorded calls built in, triggered by the operator - but it's interesting to see some occasional vocal back-and-forth between the animals and their electrical counterparts.
There are obviously some limitations. The spies generally can't move fast or dramatically. The spy birds can't fly. But - maybe soon?
In the meantime, watch the programme - it has loads of great moments caught on film.
If you're looking for a New Year's resolution how about this one: make more eye contact with strangers.
I was reading this powerful little list of Twenty Lessons from the 20th Century by some Professor of History. One idea that struck me is a very simple one:
11: Make eye contact and small talk. This is not just polite. It is a way to stay in touch with your surroundings, break down unnecessary social barriers, and come to understand whom you should and should not trust.
In a large city like the one I live in, eye contact and small talk are rare. They're even rarer thanks to smartphones, of course - although twenty years ago Londoners were already avoiding each other, just with newspapers, novels and Gameboys instead. Anyway, I do think smartphones create a mode of interaction which reduces incidental eye contact and the like.
So I decided to take the advice. Over the past month or so I took those little opportunities - at the bus stop, at the pedestrian crossing, at the supermarket. A bit of eye contact, a few words about the traffic or whatever. I was surprised how many opportunities there were for effortless (and not awkward!) tiny bits of small talk, and how worthwhile it was to take them. After the year we've had, this is a little tweak you can try, and who knows, it might help.
I've been cooking vegetarian in 2016. It's about climate change: meat-eating is a big part of our carbon footprint, and it's something we can change. So here I'm sharing some of the best veggie recipes I found this year. Most of them are not too complex - the point is everyday meals, not dinner parties.
Note: you don't have to go full-vegan - phew. You can do meat-free Mondays, you can try Veganuary, you can give up beef, or whatever, it all makes a difference. It's true that vegans have the smallest carbon footprint but it's pretty unlikely we're all going to go that far, and a more vegetarian diet makes a big improvement. (Here's an article with some data about that...)
So here we go, the best vegetarian recipes of 2016 - as judged by a meat-eater! ;)
These are all new discoveries. Of course there's plenty of standard stuff too. Anyway - pick a recipe and give it a go.
The Twelvetrees Ramp is open! It's the "missing link" in the walk down the River Lea from the Olympic Park all the way down to Cody Dock. Previously, to complete the walk you had to come off the river at Three Mills and go on an ugly detour round the Tesco's and the Blackwall Tunnel Approach. This ramp links up two bits so you can go more-or-less continuously down the river paths.
It was supposed to be open in September but... well... you know. And finally today it's open! Here are my exciting first pictures of it, looking robust against the wintry fog:
A fun bit of ironwork on top there. In the evening, the old streetlamp on the bridge lights up, and the new ironwork and the old streetlamp work well together.
It would have been nice if it had been open for all those autumnal walks in the evening sun and the lengthening shadows. Instead, now you can walk all the way down to Cody Dock, except you won't find much going on down there in winter time. But hey ho, it's ready for 2017!
Oh, and by the way, here's Twelvetrees Ramp on the map.
The House of Commons Science and Technology Committee has published its report into the implications of leaving the EU for UK science and research. The report is accompanied by a set of conclusions and recommendations.
By the way: the implications of Brexit (if indeed the UK ends up going through with it! So much is uncertain, even now) are massive and widespread. Science and engineering are only one of the many big issues that need to be considered. But as a UK sci/eng researcher I have good reasons to pay attention to this side of things! It's not about how much money I get. It's about whether the UK will be maintaining its attractive leading edge in research, as I said before the vote.
There are some really sound recommendations in there. Recommendation #4 is good: the Government should articulate a "genuinely comprehensive strategy for communicating its messages of ongoing support for science and research in the context of its plans for leaving the EU and the negotiations to follow." Why is this important? Because the Brexit vote itself sent a message round the world about what kind of place Britain was, to existing and potential researchers. On top of that, really unfortunate messages were sent when certain government ministers talked casually about whether or not EU nationals would be allowed to stay in the country. So the Government has some work to do, to make sure the researchers of the future - currently planning to apply for PhDs, choosing courses/locations, and looking at global politics with eyebrows raised - understand that we want to work with them and we plan to treat them honourably.
This goes hand-in-hand with recommendations #6 and #7: mobility is crucial for research, and it'd be shooting ourselves in the foot to forget that. The Government's choice of negotiating position is going to make a massive difference here: how will they balance freedom of movement (not that I wish to reduce it myself, though a Brexit would be rather hollow if it didn't) against the access to market/finance which they seem to be expending the most energy worrying about? But in order for UK research to flourish, researchers from other countries - both present and future - need to know that they're welcome here and not threatened by uncertainty.
Frankly, though, I'm still left with the feeling "Why the hell are we still going through with this stupid idea?" I respect the outcome of the referendum, but it expressed the nation's preferences, not any actual plan - and the elephant in the room is that any specific choice of Brexit is going to be one that the majority of people think is stupid and unjust, both those who voted for it and those who voted against it.
Read the recommendations in full - they are sensible.
This is a good hearty Sunday lunch for a vegetarian. One thing I'm missing as I increase my vegetarian-ness is a proper centrepiece for a Sunday roast - those "nut roast" things which are fairly common are OK, but I don't think I've had one that could outshine the roast potatoes on a plate. Anyway, toad to the rescue. Of course you can do toad-in-the-hole with veggy sausages, but this here is great and not pretending to be anything it isn't!
Serves 2. Takes about 90 minutes in total, including a lot of oven-time where you can do other things.
I recommend you serve this with onion and red wine gravy (takes about 30 mins in a gentle pan), and have some raspberry vinegar available to sprinkle on the pud.
With a whisk or a fork, mix the milk, water and egg. Whisk the flour in, beating out any lumps. Now let this batter stand for a little while, e.g. 15 minutes, though it can easily rest for an hour.
Preheat the oven to 210 °C.
Peel the squash and cut it into big thick fingers, like oversized chunky chips. (This is easiest if you're using the top of the squash and not the lower half with the seeds.)
Brush a roasting tray with oil (olive or vegetable) and then spread the squash pieces out on it. Drizzle over some more oil then roast the squash in the oven for about 40 minutes. They're going to get a bit more cooking after this, so they don't need to be "done" - they need to be at the point where they're just starting to soften and to get some darkening caramelisation at the edges.
While the squash is roasting, prepare the roasting tin in which you'll cook the toad. It needs to be at least 1 inch deep. Put a good glug of vegetable oil in, then put the tin in the oven alongside the squash, so the tin and the oil can get properly hot.
Take the squash out of the oven. If you leave the pieces out for a couple of minutes, they'll cool a bit and be easier to handle in the next step.
Next, assemble the toad. It has to be done quickly, so that everything stays hot in the hot tin: get the hot tin from the oven, pour the batter into it, place the squash pieces one by one into the batter with a bit of space between them - and immediately return it to the hot oven and shut the door. It then cooks for 20-25 minutes, until the batter is risen and crusty and the squash is nicely cooked with a good roast colour.
If you have more pieces of squash than you can accommodate in the tin, simply put them back on the roasting tray and continue to roast them. You can serve them alongside.