Research & Development

Abstract

Object-based audio is an emerging paradigm for representing audio content. However, the limited availability of high-quality object-based content and the need for usable production and reproduction tools impede the exploration and evaluation of object-based audio. This engineering brief introduces the S3A object-based production dataset. It comprises a set of object-based scenes as projects for the Reaper digital audio workstation (DAW). They are accompanied by a set of open-source DAW plugins, the VISR Production Suite, for creating and reproducing object-based audio. In combination, these resources provide a practical way to experiment with object-based audio and facilitate loudspeaker and headphone reproduction. The dataset is provided to enable a larger audience to experience object-based audio, for use in perceptual experiments, and for audio system evaluation.

This paper was presented at the 147th Convention of the Audio Engineering Society. It is available from the AES electronic library at http://www.aes.org/e-lib/browse.cfm?elib=20565. Access to the paper requires payment or an AES e-library subscription.

Authors: Giacomo Costantini (University of Southampton), Andreas Franck (University of Southampton), Chris Pike (BBC R&D), Jon Francombe (BBC R&D), James Woodcock (University of Salford), Richard J. Hughes (University of Salford), Philip Coleman (University of Surrey), Eloise Whitmore (Naked Productions), and Filippo Maria Fazi (University of Southampton).

This work was part of the S3A Future Spatial Audio project. The production datasets can be obtained from the project website or via the DOIs for each scene:

The VISR Production Suite can be downloaded from the S3A project website, and the source code is available at https://github.com/s3a-spatialaudio/VISR-Production-Suite.

This publication is part of the Immersive and Interactive Content section.