We're pleased to announce a new data challenge: "Few-shot Bioacoustic Event Detection", a task within the "DCASE 2021" data challenge event.
We challenge YOU to create a system to detect the calls of birds, hyenas, meerkats and more.
This is a "few-shot" task, meaning we only ever have a small number of examples of the sound to be detected. It's a great challenge for machine-learning students and researchers: it is not yet solved, and it is of great practical utility for scientists and conservationists monitoring animals in the wild.
We are able to launch this task thanks to a great collaboration of people who contributed data from their own projects. These newly-curated datasets come from projects recorded in Germany, the USA, Kenya and Poland.
The training and validation datasets are available now to download. You can use them to develop new recognition systems. In June, the test sets will be made available, and participants will submit the results from their systems for official scoring.
There's much more information on the Few-shot Bioacoustic Event Detection DCASE 2021 page.
Within TDWG Audubon Core, we are considering what is a good standard to label information in sub-regions of sound recordings, images, etc. For example, I can draw a rectangular box in an image or a spectrogram, and give it a species label. This happens a lot! How can we exchange these "boxes" between software and databases reliably?
The question is: should we use the W3C's "Media Fragments" syntax? In particular, I'm looking at section 4.2 about selecting temporal and spatial sub-regions.
Temporal region examples:
t=10,20 # => results in the time interval [10,20)
t=,20 # => results in the time interval [0,20)
t=10 # => results in the time interval [10,end)
Spatial region examples:
xywh=160,120,320,240 # => results in a 320x240 box at x=160 and y=120
xywh=pixel:160,120,320,240 # => results in a 320x240 box at x=160 and y=120
xywh=percent:25,25,50,50 # => results in a 50%x50% box at x=25% and y=25%
The definitions for the content of the values are good, and we should directly follow their example. (For time, the values are Normal Play Time (npt) RFC 2326 which can be purely in seconds or in hh:mm:ss.*, and other formats such as ISO 8601 datetime can be used as "advanced" use. For space, values are in pixels or percentages, with pixels as the default, and x=y=0 the top-left of the image.)
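To make the selector structure concrete, here's a minimal Python sketch of how these selectors decompose (the function names are my own, for illustration; this is not a library API, and it handles only the plain-seconds NPT form, not hh:mm:ss or ISO 8601):

```python
# Illustrative parsing of Media Fragments temporal and spatial selectors.
# Only the plain-seconds form of NPT is handled here.

def parse_t(selector):
    """Parse 't=START,END' into (start, end); end=None means end-of-media."""
    value = selector.split("=", 1)[1]          # e.g. "10,20", ",20", "10"
    parts = value.split(",")
    start = float(parts[0]) if parts[0] else 0.0
    end = float(parts[1]) if len(parts) > 1 and parts[1] else None
    return (start, end)

def parse_xywh(selector):
    """Parse 'xywh=[unit:]X,Y,W,H'; the default unit is pixel."""
    value = selector.split("=", 1)[1]
    unit = "pixel"
    if ":" in value:
        unit, value = value.split(":", 1)
    x, y, w, h = (int(v) for v in value.split(","))
    return (unit, x, y, w, h)
```

So `parse_t("t=10")` gives `(10.0, None)`, i.e. the interval [10,end) — which, as I discuss below, is exactly the form that risks being misread as a point.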
The structure of the selectors, however, I think could lead to problems for annotating biodiversity multimedia:
- Comma-separated formats for fields are likely to lead to errors when used in CSV data.
- There are existing use-cases which refer to single points in time/space rather than regions. (This could however be handled as regions of zero extent: t=10,10 or xywh=160,120,0,0.)
- The format "t=10" for a time interval [10,end) risks user error since it could be interpreted as, or used as, a representation of temporal points. (In retrospect it would have been better to define the format as "t=10,")
- We wish to provide for a frequency axis, with similar region-selection characteristics as the temporal and spatial. (See freqLow and freqHigh recently added to Audubon Core.)
- We would like to allow for 3D spatial extents (xyzwhd?).
So, as one possibility: we could use the W3C's approach to defining the values, by explicitly referring across to their use of RFC 2326 etc.; but instead of simply recommending to use Media Fragments, we do NOT recommend the xywh selectors but instead recommend separate fields for freqLow, freqHigh, and so forth.
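To illustrate the CSV concern, here's a small sketch contrasting the two representations (the column names other than freqLow/freqHigh are hypothetical, just for illustration):

```python
import csv, io

# A comma-packed Media Fragments selector inside a CSV cell needs quoting;
# mangled quoting silently shifts every later column:
packed = 'id,region\nclip1,"t=10,20"\n'

# Separate fields keep one value per column (startTime/endTime are
# hypothetical field names, not part of any standard):
separate = 'id,startTime,endTime,freqLow,freqHigh\nclip1,10,20,2000,8000\n'

row = next(csv.DictReader(io.StringIO(separate)))
# Each value is its own column, with no quoting needed:
start, end = row["startTime"], row["endTime"]
```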
I should say that my background is with audio data, and so for selecting image regions there may be existing good practice/recommendations that I haven't spotted.
My blog doesn't have a "comments" function, but I'd like to read your comments! You can reach me via Twitter or email dstowell (attt) tilburguniversity.edu
A storecupboard dhal with hints of southern India, inspired loosely by more authentic sources such as this one.
Serves 2, takes about 70 minutes but with a big gap in the middle where you can get on with other things.
- 100g mung dhal
- 1 small cinnamon stick
- 4 tsp turmeric
- 1 tsp chilli seeds
- 1/2 tsp asafoetida
- 1 tsp salt
- 1 handful methi (fenugreek leaves), or a handful of spinach, kale or other green leaf
- 1.5 handfuls desiccated coconut
For the tarka:
- 1 tbsp coconut oil (or some veg oil)
- 1/4 an onion
- 1 tsp mustard seeds (optional)
- 2--4 curry leaves (optional)
- 1 red chilli (optional)
Take a large frying pan, warmed to medium hot, and toast (dry-fry) the mung dhal in it for about 5 minutes until they smell toasty and turn slightly pink/orange in colour. Keep shuffling them so they don't burn. Then pour them into a sieve (make sure you don't melt it if it's plastic), and rinse and soak them in cold water briefly.
Take a deeper pan with a lid, and warm it up medium hot, with the cinnamon stick in the dry pan. When that's had a minute or so, add the mung beans as well as about 400 ml of water. It needs plenty of water. Also add the turmeric, chilli seeds, asafoetida and salt. Bring this to the boil and then simmer it for about 45 minutes, part-covered with the lid. Make sure it doesn't boil over, but that aside you don't need to worry about it too much.
After 45 minutes the mung dhal should be soft and swollen and the chalky texture should be just about gone. Turn off the heat, and stir in the methi and 1 handful of the desiccated coconut. You can leave this to sit for a while, to absorb -- you can just do the rest whenever you're ready to eat.
When you're almost ready to eat:
If you have a hand blender, use that to blend about a quarter of the mixture in the pan. This gives some thickness without mushing everything. You can also use a potato masher or suchlike. Then, put the dhal back on a very low heat -- do not allow it to boil.
Make the tarka: in a frying pan (perhaps the one you started with!), get the oil nice and hot. Finely slice the onion and the chilli, and put them in to fry until caramelised and a bit crispy. Also add the other tarka ingredients after a couple of minutes.
Serve the dhal in bowls, with the fried tarka sprinkled over the top. Eat with bread (e.g. roti/chapati) or as part of a larger meal.
We had gorgeous jackfruit fritters in a London pub. Somehow, they got them extremely chickeny tasting. Impressive! I had to try and replicate the effect.
So what we're doing here is lovely juicy jackfruit fritters, making sure there's not too much stodgy dough getting in the way. It's flavoured with herbs, but specifically with those flavours that remind you of chicken and stuffing: sage, thyme, onion. I'm using a mixture of fresh and dried herbs according to availability - you could change it around. You really need at least some of the herbs to be fresh, because they're not just there for flavouring, they provide leafy green body to the fritters too.
I use chickpea flour (gram flour) to hold the fritter together and to help give it a moist chew. You could try other types of flour but I don't think they'll give the same effect.
You need to get the ingredients as dry as possible - the less excess water, the better the fritter will hold together. So, try washing and draining your jackfruit and herbs early, and leaving them to drain for a good while. I also pat the jackfruit dry with kitchen paper.
Serves 1-2, takes about 30 minutes of active time, plus at least an hour of marinating.
- 1 tin green jackfruit in water
- 2 tsp onion granules, possibly more
- 1 tsp dried thyme
- 1 small handful fresh parsley
- 1 small handful fresh sage
- 1 tbsp fresh mint (i.e. less than the other herbs)
- 1/2 tsp salt
- a twist of black pepper
- 1/2 tsp nutritional yeast (optional)
- 2 tbsp chickpea flour
Drain the jackfruit pieces as well as you can, cut off any very hard bits and discard, and then chop the rest roughly - it should end up as pieces a bit like chicken kebab meat, smaller than bitesize but still chunky. You can squish the pieces a little with your fingers, so that they break up a little and expose more surface area, and also look less like triangles.
Put the fresh herbs in a blender and pulse to chop them finely. (Or use a big knife and chopping board!) If you're using the blender, you don't need to discard the parsley stalks, but you should for the other herbs, which have harder stalks.
Mix everything except the flour together well in a medium bowl, ensuring the herbs and other flavours are well-distributed over the jackfruit pieces. Leave to marinate for at least 1 hour.
When there's about 15 minutes to go before you eat, sprinkle the chickpea flour evenly over the mixture, and mix it all through well. You're aiming to give the mixture enough flour that it's going to hold together well, but you do not want the flour to take over from the jackfruit. You're not making a dumpling! The flour should absorb pretty quickly into the mixture.
On a flat surface, divide the mixture into two balls, then squish and compress them with your hands to make two compressed, burger-y shapes. Let these sit for a few minutes to absorb and to start holding their shape, while you prepare other things.
In a large flat frying pan, warm up some veg oil ready for frying. You'll be shallow frying, but don't be stingy with the oil - you need enough oil (maybe about 1mm depth?) such that the surface of the fritters will form well. Very very gently, and without breaking or reshaping them, manoeuvre the fritters into the pan. Don't disturb the frying fritters too much, especially at first - let them get a surface from frying. They'll take about 5 minutes one side, and then you delicately turn them and give them 5 minutes the other side.
Serve as a starter, or as a midweek meal with chips and salad.
Ever since the immersive experience of the fantastic Biodiversity_Next conference 2019, I've been getting to grips with biodiversity data frameworks such as GBIF and TDWG. So I'm very pleased to tell you that I've been contributing to the Audubon Core standard, which is an open standard of vocabularies to be used when describing biodiversity multimedia. These standards help the world's collections and databases to talk to each other. It can enable some amazing stuff to happen, when the entire planet's evidence about animal and plant species can be treated almost as if it was one big dataset.
After community consultation we've just released an update to the Audubon Core Terms List which adds some terms that are very important for describing audio data. The terms added are:
- sampleRate for the sample-rate of the digitised recording.
- freqLow and freqHigh to specify the frequency range of the sound events captured in the recording.
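For illustration, a recording's metadata using these terms might look something like the sketch below. Only sampleRate, freqLow and freqHigh are the actual Audubon Core terms (shown with the conventional ac: prefix); the values and the frequency-band query are invented by me to show the kind of thing the terms enable:

```python
# Hypothetical audio-recording metadata using the new Audubon Core terms.
# All values are invented for illustration.
record = {
    "ac:sampleRate": 44100,  # sample-rate of the digitised recording (Hz)
    "ac:freqLow": 2000,      # lowest frequency of the captured sound events (Hz)
    "ac:freqHigh": 8000,     # highest frequency of the captured sound events (Hz)
}

# A sketch of the kind of query this enables: does the recording's
# frequency range overlap a band of interest?
band = (3000, 6000)
overlaps = record["ac:freqLow"] <= band[1] and record["ac:freqHigh"] >= band[0]
```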
Any audio experts reading this might find these rather unimpressive and basic metadata. But that's precisely the point - to add these basic and uncontroversial terms to the standard. I'm very happy with what seems to be the TDWG approach, which is to move forward by gradual consensus rather than over-engineering a standard in advance.
What can we do with this? Well, in the not-too-distant future I can imagine querying GBIF for all animal calls within a particular frequency band, or analysing frequency ranges globally to explore acoustic environmental correlations. We can't do this yet, but as these new terms get taken up it should happen.
These term additions came about primarily through the joint efforts of myself, Ed Baker, and Steve Baskauf - a TDWG expert who guided us through the process very attentively. Plus, many people in the TDWG community who checked our work and gave input. Thank you!
I'd also like to acknowledge the Heidelberg Bioacoustics Symposium (Dec 2019) at which we had discussions with many different taxon experts on animal sound and how we can share it.
There are some presentation slides from the TDWG annual meeting, introducing the changes, and also looking forward to more detailed metadata that might be added in future (e.g. for sound-event annotations). We also proposed to import a term "dwc:individualCount", but we ran into some definitional issues so that will take a bit more time to resolve.
You can get involved in what happens next. Use TDWG standards such as Audubon Core to share your data. Get involved in the discussion about what else might be needed to share and aggregate bioacoustic datasets.
I'm extremely pleased to announce this publication, edited by Jérôme Sueur and myself: Ecoacoustics and Biodiversity Monitoring - a special issue in the journal "Remote Sensing in Ecology & Conservation".
It features 2 reviews and 6 original research articles, from research studies around the globe.
You can also read a brief introduction to the issue here on the RSEC blog.
I just want to put some of this down for posterity - i.e. to remind myself in future, of what was obvious at the time.
"Shut the damn pubs," I've been thinking to myself for weeks. Saying it to friends too. Back in August, I think, the scientific advisors Chris …
I'm having problems focusing on my work. It's very difficult to avoid distractions.
One big distraction is something called the World Wide Web... well in my job the most difficult distraction is actually work email - there are lots of different things that distract attention by email. Email is great but …
This flavour combination was fabulous - the hot deep flavour of muhammara (from Turkey/Syria, so I'm told) and the herby zesty za'atar (ours is from Palestine) make a great complement to the classic taste of grilled aubergine. We're not from the Levant so don't take this as authentic, but this …
Snapchat lets people share little photos and videos with each other, mostly used to tell the story of their day. Snapchat also created a map where you can click around the world and drop in on anonymous little slices of life. Try it - it's an odd but absorbing thing.