Optic uses the JT-NM data model (go2sm.com/jtnm) as its core, and each individual component within it uses NMOS standards (go2sm.com/nmos) to allow for the development of tools within an open and interoperable framework.
Inspired by the WebAudio API (go2sm.com/webaudio), the BBC has built an experimental HTML5/WebGL media processing and sequencing library for creating interactive and responsive videos on the web. VideoContext (go2sm.com/videocontext) uses a graph-based rendering pipeline, with video sources, effects, and
processing represented as software objects that
can be connected, disconnected, created, and
removed in real time during playback.
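The graph model above can be sketched in a few lines of plain JavaScript. This is an illustration of the connect/disconnect idea only; the class and method names here are ours, not the actual VideoContext API.

```javascript
// A minimal sketch of a graph-based rendering pipeline. Nodes hold a
// list of downstream connections that can be rewired during playback.
class Node {
  constructor(name) {
    this.name = name;
    this.outputs = [];
  }
  connect(target) {
    this.outputs.push(target);
  }
  disconnect(target) {
    this.outputs = this.outputs.filter((n) => n !== target);
  }
}

// Build a tiny graph: two sources feed a cross-fade, which feeds the output.
const clipA = new Node("clipA");
const clipB = new Node("clipB");
const crossfade = new Node("crossfade");
const destination = new Node("destination");

clipA.connect(crossfade);
clipB.connect(crossfade);
crossfade.connect(destination);

// Nodes can be rewired at any time, e.g. dropping clipB mid-playback.
clipB.disconnect(crossfade);
```

In VideoContext itself the same topology carries decoded video frames through WebGL shaders each frame, but the wiring semantics are this simple.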
The core of the video processing in VideoContext is implemented as WebGL (go2sm.com/webgl) shaders written in GLSL (go2sm.com/glsl). A range of common effects such as cross-fade, chroma keying, scale, flip, and crop is built into the library. “There’s a straightforward representation for effects that can be used to add your own custom ones,” explains Shotton. “It also provides a simple mechanism for mapping […].”
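The cross-fade effect illustrates how compact many of these shaders are: in GLSL it is essentially the built-in mix() function, a per-channel linear interpolation between two source pixels. A sketch of the same arithmetic in plain JavaScript (the function names are ours, for illustration):

```javascript
// Equivalent of GLSL's mix(a, b, t): linear interpolation per channel.
// At t = 0 the output is entirely a; at t = 1 it is entirely b.
function mix(a, b, t) {
  return a * (1 - t) + b * t;
}

// Cross-fade one RGBA pixel between two sources at blend position t.
function crossfadePixel(pixelA, pixelB, t) {
  return pixelA.map((channel, i) => mix(channel, pixelB[i], t));
}

// Halfway through a fade from red to blue:
// crossfadePixel([255, 0, 0, 255], [0, 0, 255, 255], 0.5)
```

A real shader runs this on the GPU for every pixel of every frame, with t driven by the playback timeline.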
The library, available as open source, works on newer builds of Chrome and Firefox on the desktop and, with some issues, on Safari.
“Due to several factors, the library isn’t fully functional on any mobile platform,” says Shotton. “This is in part due to the requirement for a human interaction to happen with a video element before it can be controlled programmatically.” The BBC is using the library internally
to develop a streamable description for media composition with the working title of UMCP (Universal Media Composition Protocol). It has taken a cue from Operational Transformation (go2sm.com/transform), a solution to support multiuser, single-task working, which powers Google Docs and Etherpad. “With a bit of
domain-specific adaptation, this can
be put to work in the arena of media
production,” explains Leonard.
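The core move in Operational Transformation can be shown with a toy example: two users edit the same string concurrently, and each incoming operation is transformed against the one already applied so both replicas converge. This is a minimal insert-only sketch of the general idea, not the BBC's or Google's implementation.

```javascript
// Apply an insert operation { pos, str } to a string.
function apply(text, op) {
  return text.slice(0, op.pos) + op.str + text.slice(op.pos);
}

// Transform op against a concurrent insert: if the other insert landed
// at or before op's position, shift op right by the inserted length.
// (A real system also needs a tiebreak for inserts at equal positions.)
function transform(op, against) {
  if (against.pos <= op.pos) {
    return { pos: op.pos + against.str.length, str: op.str };
  }
  return op;
}

const base = "media";
const opA = { pos: 0, str: "BBC " };    // user A prepends "BBC "
const opB = { pos: 5, str: " object" }; // user B appends " object"

// Each replica applies its own op first, then the other op transformed.
const replicaA = apply(apply(base, opA), transform(opB, opA));
const replicaB = apply(apply(base, opB), transform(opA, opB));
// Both replicas converge on the same string.
```

The domain-specific adaptation Leonard describes would apply the same convergence logic to edit operations on a media composition rather than characters in a document.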
The kernel of the idea is that the
exact same session description metadata is sent to every device, regardless
of its capabilities, which can, in turn, render
the experience in a way that suits it: either live,
as the director makes the cuts, or at an arbitrary time later on.
“It is the NMOS content model which allows
us to easily refer to media by a single identifier, irrelevant of its actual resolution, bitrate,
or encoding scheme,” explains Leonard.
“One of the substantial benefits of working
this way would be to allow us to author experiences once, for all devices, and deliver the
composition session data to all platforms, allowing the devices themselves to choose which
raw assets they need to create the experience
for themselves,” he says. Examples include a
low bitrate version for mobile, a high-resolution
version for desktop, and 360° for VR headsets.
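The single-identifier idea can be sketched as a lookup: one content ID maps to several representations, every device receives the same composition cue, and each device resolves the representation that suits it. The data layout and identifier below are illustrative, not the actual NMOS content model.

```javascript
// One identifier, many representations of the same underlying content.
const catalogue = {
  "urn:demo:match-highlights": [
    { kind: "video", profile: "mobile", bitrateKbps: 800 },
    { kind: "video", profile: "desktop", bitrateKbps: 8000 },
    { kind: "video-360", profile: "vr", bitrateKbps: 20000 },
  ],
};

// Every device gets the same cue (just the ID); each picks its own asset.
function resolve(contentId, deviceProfile) {
  const reps = catalogue[contentId] || [];
  return reps.find((r) => r.profile === deviceProfile) || null;
}

const mobile = resolve("urn:demo:match-highlights", "mobile");
const vr = resolve("urn:demo:match-highlights", "vr");
```

The composition session data never changes per device; only the resolution step does, which is what lets one authored experience serve phones, desktops, and VR headsets alike.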
In theory, this would allow the production team to serve potentially hundreds of different types of devices, regardless of connection or hardware capability, without having to do the laborious work of rendering a separate version for each.
The BBC is also adapting the hardware for object-based production to run over IP, in a project called IP Studio. From a production point of view, equipment from a camera to a vision mixer or archive can be treated as an object. “IP Studio orchestrates the network so that real-time collections of objects work as a media production environment,” says Page. So, in the BBC’s schema, Optic will output UMCP, and that sits on top of IP Studio.
OBB Goes Commercial
As a publicly funded body, the BBC is driven to unearth new ways of making media accessible to its license fee-paying viewers. Larger