> Hi all,
>
> Listening to the 'Radio Magic' seminar that Adam organised at
> Transmediale yesterday, two themes stood out...
>
> 1. Recording live shows and tagging them with rich metadata is vital for
> re-use of the material.
>
> 2. No-one wants to do the archiving work required; they would rather
> make more live shows (which is much more exciting).
>
> Seeing that we need solutions for proof of broadcast, automatic
> recording and more, I had an idea which might cover all of these bases.
> It goes something like this:
>
> 1. Run a server with Darklog to capture the proof of broadcast from the
> live studio:
>
> http://www.broadcastsoftware.org/logging.htm
>
> No point re-inventing the wheel there; we can contribute to Darklog
> development too. We don't need to capture playlisted shows, since they
> are already in the Airtime database.
>
> 2. Put the live shows into the Airtime schedule in advance of broadcast,
> with metadata. This will also help avoid clashes in the schedule between
> live and pre-recorded or playlisted shows.
>
> 3. After each show has ended, have Airtime pull the recording from the
> Darklog server using curl or wget, and import it into the Airtime
> database, using an improved airtime-import script. Tag the import with
> the show metadata from the schedule, using the timestamp on the audio
> file to match the two up.
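A minimal sketch of what this pull-and-import step could look like. Everything here is an assumption rather than an existing interface: the Darklog base URL, the filename convention (the show's scheduled start time as YYYYMMDDHHMM, which acts as the join key between schedule entry and recording), and the proposed airtime-import flags.

```shell
#!/bin/sh
# Sketch of step 3: fetch a finished show from the Darklog server.
# The URL, filename convention and importer flags are hypothetical.

recording_url() {
    # $1: Darklog base URL, $2: show start timestamp from the schedule.
    # The timestamp names the file, so it matches schedule and audio up.
    printf '%s/%s.ogg\n' "$1" "$2"
}

fetch_show() {
    # $1: base URL, $2: timestamp, $3: destination directory.
    # -f makes curl fail loudly if the recording is missing.
    curl -f -o "$3/$2.ogg" "$(recording_url "$1" "$2")"
}

# After a successful fetch, the improved importer would be called with
# the metadata from the schedule entry, e.g.:
# airtime-import --s "Dave's Elvis Show" --g "Rock 'n Roll" \
#     --p "Dave Smith and Colin Jones" "/var/tmp/201102031026.ogg"
```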
>
> 4. Have Airtime export the schedule to a listener website (e.g. to a
> Newscoop plugin) including links to the show recordings, which could be
> cached on the public-facing server. This enables a 'listen again'
> service for the last seven days of live content, or as far back as
> server storage allows.
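The seven-day 'listen again' window on the public-facing server could be kept in check with a simple cache-pruning job, for instance from cron. The cache directory and the seven-day figure are taken from the proposal above; the function itself is only a sketch.

```shell
#!/bin/sh
# Sketch of the 'listen again' cache window: keep recent recordings on
# the public-facing server, drop anything older than the window.

prune_cache() {
    # $1: cache directory, $2: days to keep.
    # -mtime "+$2" matches files modified more than $2 days ago.
    find "$1" -name '*.ogg' -type f -mtime "+$2" -delete
}

# Example cron usage (hypothetical path):
# prune_cache /var/www/listen-again 7
```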
>
> To implement this idea, we would need to improve the airtime-import
> script so that custom Airtime ID3 tags could be specified on the command
> line. Something like:
>
> airtime-import --s "Dave's Elvis Show" --g "Rock 'n Roll" --p "Dave
> Smith and Colin Jones" http://my.darklogserver.org/201102031026.ogg
>
> where:
>
> --s is show name
> --g is genre
> --p is presenter names
>
> and so on. This way, the metadata will be redundantly stored in both the
> Airtime database and in the tags of the individual recordings (so that
> if anything goes wrong with the database, the recording files can be
> re-imported into a fresh database without losing vital metadata).
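One way the redundant tagging could be sketched. Note the example recordings are Ogg Vorbis, which carries Vorbis comments rather than ID3 tags, so this assumes the vorbiscomment tool from vorbis-tools; the TITLE/GENRE/ARTIST field mapping is my assumption, not an Airtime convention.

```shell
#!/bin/sh
# Sketch of writing the schedule metadata into the recording itself,
# so the same values live in both the database and the audio file.

tag_fields() {
    # $1: show name, $2: genre, $3: presenters.
    # Emits one NAME=value line per field; each line can be written
    # into an Ogg file with 'vorbiscomment -a -t', and read back later
    # with 'vorbiscomment -l' when re-importing into a fresh database.
    printf 'TITLE=%s\nGENRE=%s\nARTIST=%s\n' "$1" "$2" "$3"
}

# Example of tagging a recording (commented out; needs a real file):
# tag_fields "Dave's Elvis Show" "Rock 'n Roll" \
#     "Dave Smith and Colin Jones" |
#   while IFS= read -r field; do
#       vorbiscomment -a -t "$field" 201102031026.ogg
#   done
```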
>
> The Airtime server can be remote from the Darklog server in the live
> studio, and so one machine can go down without affecting the other.
>
> If you like the sound of this idea, please let me know and I'll put it
> in the wiki.
>
> Cheers!
>
> Daniel
>
>
I've done that, and also updated the page title, because it's not
strictly necessary to record in the studio itself for proof of
broadcast. You can record from anywhere you can hear the radio :-)