w an external audio input with nadir logo patching, which allows you to insert a logo for branding. There is even a method for outputting a stitched feed via SDI out to support a separate encoder if needed.
Overall, the software is simple and
easy to use, and is a great package for
live stitching and streaming. Imagine-Software has recently paired the Z Cam
with Assimilate Scratch to extend its
capabilities. This development looks
promising but is not yet fully proven.
Hardware system requirements and
key supported features (such as 4K
output and Facebook Live integration)
for WonderLive can be found at z-cam
If you decide to use cameras from manufacturers that don't supply their own stitching app, then VideoStitch Vahana VR is the go-to piece of software. Vahana VR (see Figure 4 on page 95) is the only standalone PC application capable of stitching VR video from separate cameras in real time for live streaming.
I recommend incorporating four-input video cards from either Magewell or AJA, which enable you to stitch video feeds from multiple cameras, whether HDMI or HD-SDI, into a single 360° video for streamed output via RTMP.
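Vahana VR handles its RTMP output from within its own UI, but as a rough illustration of what that last hop looks like, here is a Python sketch that builds a generic ffmpeg push to an RTMP ingest point. The URL, stream key, and filename are placeholders of my own, not part of any vendor's workflow; a live rig would take the stitcher's output rather than a file.

```python
# Sketch: building a generic ffmpeg command that pushes a stitched
# 360-degree file to an RTMP ingest point. All names below are
# placeholder assumptions for illustration only.
def rtmp_push_command(src, rtmp_url, bitrate="20M"):
    return [
        "ffmpeg",
        "-re",              # read input at native frame rate (live pacing)
        "-i", src,
        "-c:v", "libx264",  # H.264, the codec most RTMP ingests expect
        "-b:v", bitrate,
        "-c:a", "aac",
        "-f", "flv",        # RTMP carries an FLV container
        rtmp_url,
    ]

cmd = rtmp_push_command("stitched_360.mp4",
                        "rtmp://live.example.com/app/streamkey")
print(" ".join(cmd))
```

The same command shape works for any encoder stage that speaks RTMP, which is why it makes a useful mental model even when the stitching software hides it behind a settings panel.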
Vahana VR system requirements and key
supported features (such as 8K output and Facebook 360 Live integration) are available here:
A viable stitching/streaming alternative is
the Teradek Sphere (Figure 5), which is a unique
hardware and software solution that allows for
live preview and live stitching of panoramic video. The Sphere, which retails for $3,000, enables you to combine four HDMI feeds from
various cameras (up to eight when networked),
such as the GoPro Hero or Blackmagic Design
Micro Studio Cameras, and stitch and stream
them in real time from an iPad app.
Teradek charges a one-time license for live-streaming with the Sphere. It’s also available in
an SDI version, which costs an additional $400.
The next stage after stitching is encoding, either via software or dedicated hardware encoders from reputable manufacturers such as Harmonic, Haivision, or Elemental. Streams should be delivered at multiple bitrates to cater to viewers with different internet speeds.
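To make the multiple-bitrate idea concrete, here is a small Python sketch of a rendition ladder and a selection rule. The resolutions, bitrates, and 80% headroom factor are illustrative assumptions of mine, not recommendations from any particular encoder vendor.

```python
# Illustrative adaptive-bitrate ladder for 360-degree video, ordered
# highest rung first. Values are example figures only; real ladders
# depend on codec, frame rate, and the target player/CDN.
LADDER = [
    {"name": "4k",    "resolution": (3840, 2160), "bitrate_kbps": 20000},
    {"name": "1080p", "resolution": (1920, 1080), "bitrate_kbps": 8000},
    {"name": "720p",  "resolution": (1280, 720),  "bitrate_kbps": 4000},
]

def pick_rendition(downstream_kbps, ladder=LADDER, headroom=0.8):
    """Pick the highest rendition that fits in ~80% of the viewer's bandwidth."""
    budget = downstream_kbps * headroom
    for rung in ladder:
        if rung["bitrate_kbps"] <= budget:
            return rung
    return ladder[-1]  # nothing fits: fall back to the lowest rung

print(pick_rendition(12000)["name"])  # a 12Mbps viewer lands on the 1080p rung
```

In practice the player's adaptive logic makes this choice continuously, but the ladder itself is something you define when you configure the encoder.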
Encoding VR content requires more processing power than typical HD or UHD content, so a high-end workstation or a higher-performing hardware encoder is critical. I recommend investing in as much CPU and GPU processing power as you can afford, both to prevent bottlenecks and to future-proof your system.
Although streaming VR from your encoder to your CDN over RTMP, MPEG-DASH, or HLS works the same as standard live-streaming workflows, VR video, especially at 4K, requires large amounts of bandwidth: streaming at higher frame rates can demand 20Mbps or more with the H.264 codec. Once HEVC/H.265 is widely adopted, with increased hardware support on mobile devices and desktops, it will bring much more efficiency with lower overhead, thereby allowing for better-quality streams.
You can optimize your upstream by using HEVC for the mezzanine encode. HEVC is roughly twice as efficient as H.264: a 4K stream that needs 12–15Mbps in H.264 can be delivered at comparable quality at around 6–8Mbps with HEVC.
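The savings are easy to quantify. A quick back-of-the-envelope calculation in Python, using the ballpark figures above (HEVC at roughly half the bitrate of H.264 for comparable quality):

```python
# Back-of-the-envelope bandwidth comparison for a 4K 360-degree stream,
# using the rough figures from the text.
H264_KBPS = 15000          # upper end of the 12-15Mbps H.264 range
HEVC_EFFICIENCY = 0.5      # HEVC needs roughly half the bits

hevc_kbps = H264_KBPS * HEVC_EFFICIENCY
print(f"HEVC bitrate: {hevc_kbps / 1000:.1f} Mbps")  # 7.5 Mbps

def gigabytes_per_hour(kbps):
    """Data transferred over a one-hour stream, per viewer."""
    return kbps * 1000 * 3600 / 8 / 1e9  # bits/s * seconds -> bytes -> GB

print(f"H.264: {gigabytes_per_hour(H264_KBPS):.1f} GB/hour")  # 6.8 GB
print(f"HEVC:  {gigabytes_per_hour(hevc_kbps):.1f} GB/hour")  # 3.4 GB
```

Halving the per-viewer bitrate also halves your upstream and CDN egress, which is where the real cost savings show up at scale.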
Something else to consider is using Open Broadcaster Software. OBS is a powerful, versatile, open-source switching, encoding, and recording solution for live streaming on Windows, Mac, or Linux. I recommend using OBS in conjunction with a high-end PC as an alternative to “Big Iron” encoders, since the software is free and feature-rich. It has served us well on our productions, and can be a critical component for VR streaming events. If OBS isn’t your