Free to attend, no registration required. Please click this link to join the Zoom meeting.
Borrowing, appropriation and intertextuality are particularly interesting concepts to explore given the recent acceleration in art and music produced using Machine Learning. Every network used to create art or music functions only because it has been trained on a dataset of pre-existing work. Some of these datasets may be relatively modest – Dadabots’ Relentless Doppelgänger was trained on the music of the technical death metal band Archspire. Others are huge – Stability.ai’s Stable Diffusion was trained on the LAION-2B-en dataset, which contains over 2 billion text-image pairs. What does ‘style’ mean in this context? How is it expressed? And who owns it?
Jennifer Walshe is Professor of Composition at Oxford and Fellow of Worcester College. She has created numerous projects using AI, including ULTRACHUNK, in collaboration with Memo Akten, and A Late Anthology of Early Music Vol. 1: Ancient to Renaissance, in collaboration with Dadabots.
CJ Carr is Head of Audio Research at StabilityAI. He is also one of the Dadabots.
Christine McLeavey is a Member of the Technical Staff at OpenAI, where she recently published Jukebox and MuseNet. She studied physics at Princeton University, and neuroscience and medicine at Stanford University. She is also a trained classical pianist.
OSiMTA Season 5 (2022–23)
We are delighted to announce that the theme for Season 5 is:
Borrowing—Appropriation—Intertextuality
Our speakers will be examining this theme from a broad range of analytical and repertorial perspectives. Full details will be posted here soon.
We are also delighted to announce that the majority of seminars will once again take place in person, in the Committee Room of the Oxford Faculty of Music, on selected Wednesdays beginning at 16.30 UK time. We shall also be streaming all sessions via YouTube so that you can watch and join in the discussion online from wherever you are. Details of how to log in will be available shortly. For the most up-to-date information, please visit our dedicated OSiMTA pages.