BCS Stories
Preparing production for 2021 and beyond
The rate at which media production workflows and technologies are evolving is unmatched. Driven in part by enforced COVID-19 lockdowns and social distancing measures, the emergence of innovative new tools has intensified. In fact, 2020 has accelerated several trends within the industry. IP infrastructures and workflows continue to grow in prominence, while remote productions have taken on a newfound importance. Faced with ongoing disruption, broadcasters and content producers have had to adapt quickly, embrace new solutions, and rethink the way they produce content. Workflows that are flexible, scalable, reconfigurable, and future-ready have never been more essential.
Broadcasters must therefore look forward and ensure that they are adequately prepared for what the future will bring. To satisfy the demands of content-hungry consumers with simultaneous production and delivery of content across online, social media, and linear TV, it is vital that they are equipped with the skills and tools to embrace the new workflows that future live productions will demand.
IP-powered workflows
The media industry’s evolution from baseband to IP turned into a revolution in 2020. With the global pandemic shuttering studios, emptying broadcast operations centers, and sidelining OB trucks, streaming and broadcast productions were forced to double down on remote production workflows and technologies. Consequently, they embraced IP like never before, together with its game-changing efficiency, cost savings, and flexibility for meeting rapidly changing requirements.
As IP continues to go mainstream for video production, producers have identified key requirements. They are looking to move away from bulky, expensive hardware to more flexible software and cloud-based solutions as well as more lightweight field gear like bonded cellular backpacks.
They need IP streaming solutions that can be incorporated easily into production workflows, with the ability to ingest non-baseband sources. Broadcast operations need to be able to use the entire gamut of IP-based protocols, and sometimes several of them during the same production. Likewise, they need fast, frictionless ways to transcode feeds and files into multiple video house formats for asset management and distribution.
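As a simple illustration of that last point, the sketch below uses Python to drive ffmpeg, pulling a single SRT contribution feed and transcoding it into two hypothetical house formats in one pass. It assumes ffmpeg is installed, and the URL, formats, and bitrates are examples rather than a prescription.

```python
# A hypothetical sketch: ingest one SRT contribution feed and transcode it
# into two house formats in a single ffmpeg pass, driven from Python.
import subprocess

SOURCE = "srt://ingest.example.com:9000"  # hypothetical contribution feed

cmd = [
    "ffmpeg", "-i", SOURCE,
    # Output 1: a high-bitrate mezzanine for the asset management system
    "-map", "0", "-c:v", "mpeg2video", "-b:v", "50M", "-c:a", "pcm_s24le",
    "house_mezzanine.mxf",
    # Output 2: a lighter H.264 proxy for browse, review, and distribution
    "-map", "0", "-c:v", "libx264", "-b:v", "3M", "-c:a", "aac",
    "browse_proxy.mp4",
]
subprocess.run(cmd, check=True)
```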
The potential advantages of an IP transport backbone based on standard internet technology have encouraged a tremendous investment of R&D directed toward harmonizing the IP and the live video world. New robust protocols guided by global standards such as SMPTE 2022 are being proven commercially and are achieving global interoperability.
It is also important to acknowledge the inexorable cycle of investment that underpins improvement in both public and private global infrastructure supporting IP connectivity and IP-based services in general. This continuing high level of investment is paralleled by a plummeting price tag for wholesale IP bandwidth: by industry estimates, more than a 1,000-fold decrease over the last two decades, and roughly 30 percent compounded annually from 2014 to 2017. These trends seem likely to continue for the foreseeable future.
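A quick back-of-envelope check shows those two figures are consistent with each other:

```python
# Illustrative arithmetic: a 1,000-fold price drop over twenty years implies
# an annual decline of roughly 29 percent, close to the ~30 percent cited.
years = 20
annual_factor = (1 / 1000) ** (1 / years)
print(f"Implied annual price decline: {1 - annual_factor:.0%}")  # -> 29%
```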
Building on these emerging standards and innovations is an increasingly capable set of B2B tools and services that are becoming practical options for today's generation of broadcasters. While IP-based live event transport might have been seen in past years as a high-risk novelty, it is now more than just a credible alternative: it is an element of an increasingly compelling strategic shift. This is not only expanding the options for the rights owner's live event contribution feeds but enabling a set of crucial advances in distribution to broadcast and streaming service providers.
This link in the live event value chain can now expect IP cloud-native content feeds that are richer in options, faster to set up and tear down, and customized to specific profiled requirements. Making live video transport more cost-effective, flexible, and scalable is a trend that will support and enable evolving commercial models and opportunities.
SMPTE 2110: Is it fit for the future of broadcast?
Broadcast production is at a crossroads and CTOs have a decision to make: should their new studio or mobile facility be built using SMPTE standard 2110 or something that might be more suitable for the cloud computing age? It is a problem that has surfaced in recent months as large-scale live production, the area of premium broadcast programming for which 2110 was principally designed, has shut down or reverted to using less conventional technologies to keep on air.
For many, ST 2110 still represents the bedrock of professional production and a relatively risk-free way to segue the industry’s legacy SDI base into IP. Others see an existential crisis in which broadcast engineering-based standards are a cul-de-sac and that if traditional players are ever to innovate on par with internet-first streamers they need to change the narrative. For many the issue boils down to the engineering mindset. If the starting point is to build a perfect pipeline where all the important performance indicators like frame sync are under full control and can be guaranteed, then this will inevitably fail when working in the cloud.
When it came to devising a means to migrate the industry into IP, these fundamentals of guaranteed, fully controlled performance were sensibly maintained. Instead of having to worry about running the correct type of cable and signal to various locations, broadcasters gained far greater versatility to run a responsive studio business. Since standardization, ST 2110 interfaces have been added to core equipment from cameras to multiviewers, enabling broadcasters to build new facilities entirely in IP. However, the rocketing rise of OTT streaming and the advance of cloud computing, both accelerated by COVID-19, have put the future of 2110 under scrutiny, even at SMPTE itself.
It is not really that 2110 is the wrong standard; it is that the means of content consumption has started to change rapidly. The pandemic accelerated this when live sports and stage events, all the content 2110 is dedicated to, almost vanished overnight. Cloud-based workstations using PCoIP and low-cost, low-bandwidth video transmission have become the norm. Business teleconferencing tools, smartphone cameras, and webcams are in routine use in at-home scenarios for both production crew and on-air talent. ST 2110 was not designed for this.
What is more, the audience has begun to accept what the IABM calls COVID quality. The use of off-the-shelf collaboration tools may not be ideal, but it keeps the media factories running. Audiences have come to accept glitches, streaming issues, and, more often than not, lower video and audio quality.
It is not as if things will go back to normal when the pandemic passes. Remote production links contributed over the internet were advancing anyway; now they are entrenched. Cloud computing and cloud services are becoming ubiquitous. Broadcasters are having to find ways to use the 2110 ecosystem to connect nanosecond-accurate studio environments with remote operations over the internet or in the cloud, where timing is far looser.
The Joint Task Force on Networked Media (JT-NM), which coordinates SMPTE 2110 and the wider development of a packet-based network infrastructure, is investigating ways to connect a production plant's network with tools, applications, and facilities outside the studio. However, current cloud connections are not up to the quality standards required for low-latency live streaming media. SMPTE therefore says research into quality-enhancing technologies, such as retransmission or automatic repeat request (ARQ), is crucial to improving the network infrastructure required to deliver broadcast-quality transmissions.
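To make the retransmission idea concrete, here is a deliberately toy, receiver-driven ARQ loop in Python. It is not SRT, RIST, or any real protocol: the ports, the four-byte sequence header, and the NACK message are all invented for illustration.

```python
# Toy receiver-driven ARQ: detect gaps in sequence numbers and ask the sender
# to retransmit the missing packets rather than letting the picture glitch.
import socket

RECV_ADDR = ("0.0.0.0", 5004)                 # hypothetical media port
NACK_ADDR = ("sender.example.com", 5005)      # hypothetical feedback port

media = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
media.bind(RECV_ADDR)
feedback = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

expected_seq = 0
while True:
    packet, _ = media.recvfrom(2048)
    seq = int.from_bytes(packet[:4], "big")   # assumed 4-byte sequence header
    # A jump in sequence numbers means packets were lost in transit:
    for missing in range(expected_seq, seq):
        feedback.sendto(b"NACK" + missing.to_bytes(4, "big"), NACK_ADDR)
    expected_seq = seq + 1
    # ...hand packet[4:] to the depacketizer/decoder here...
```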
Broadcasters still need SMPTE ST 2110 accuracy within a facility, but they do not necessarily need 2110 perfection between two facilities, or between an OB truck and a facility. The task is finding a way to take the gold-plated excellence of 2110, together with the parts of the ecosystem that are less gold-plated, and use both to produce better content in a COVID world.
Production switchers ready for streaming reality
Even in this COVID-crazy year, there is still lots of video creation that involves the use of production switchers. It is easy to think of production switchers as being in the domain of studios, production trucks and other enclosed facilities that have shut down for the duration of the pandemic, but the ability to stream the output of switchers makes them key for delivering quality programming in a range of situations. So not only are production switchers an active product category in this seemingly written-off year, they have thrived and continue to see innovation.
Switcher manufacturers have embraced IP connectivity in addition to more traditional television standards such as SDI. There once was a time when video production switchers selected on-air sources, keyed in titles and performed modest transition effects. Not anymore. Switchers today do all those things, plus act as storage for still images, graphics and video clips, as well as provide control for video playbacks from servers and dedicated players. Broadcast equipment manufacturers started building IP capabilities into switchers more than a decade ago, and that has given this product category a leg up in the bid to stay relevant during the COVID-19 lockdown.
Virtual production rises
LED stage installations have expanded around the world during the pandemic, leaving virtual production poised to expand significantly in 2021 and beyond. This supercharged growth is due to the control it offers over environment, time, and location, as well as the ability to provide a safe working area by bringing the set to the talent where they are located, rather than flying crew around the world for a production shoot. Virtual set production is now within reach of lower budgets.
Markerless tracking is making the virtual set much more adaptable and far less finicky to use. Setup times are quicker than ever, virtual set photorealism is set to take a further jump this year, and technologies such as photogrammetric capture are enabling the recreation of real-world spaces inside the virtual environment faster than ever.
In 2021 there will be further adoption of virtual production at all budget levels. Lower-budget productions have realized that virtual production levels the playing field, allowing for million-dollar sets and locations without spending the million dollars. Big-budget productions like to save money too, so they are also enthusiastic. Broadcasters are realizing there is no need to build real sets when virtual can offer total realism with greater speed, lower costs, and a smaller environmental footprint.
The ability to do much more work remotely has streamlined the way film and TV drama production teams work in prep, as well as in post-production. Virtual location scouting, monitoring camera feeds on iPads on set or in another city, cameras operated from small remote heads, and remote lighting controls will be an essential part of the cinematographer's toolkit going forward.
VVC: The key to next-generation media services
Versatile Video Coding (VVC), the new video compression standard, is an important step forward. The new codec cuts bitrate roughly in half, greatly reducing the load of video traffic on networks without sacrificing quality and delivering a more seamless experience for users across devices.
VVC will free up networks to not only handle traffic from cutting edge technologies like mixed reality – AR and VR applications that have the potential to transform everything from city planning, to education, to gaming – but also to deal with the increased amount of video conferencing and remote collaborative tools in the new COVID-19 reality. Alternatively, it could enhance the user experience by delivering significantly higher quality video at the same bitrate.
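Some assumed numbers make the halving tangible. The 8 Mbps HEVC figure below is an illustrative ladder rung, not a specification:

```python
# Illustrative arithmetic only: what "roughly half the bitrate" buys a service.
hevc_mbps = 8.0                    # assumed 4K HEVC encoding rate
vvc_mbps = hevc_mbps / 2           # VVC at comparable visual quality

viewers = 100_000
saved_gbps = viewers * (hevc_mbps - vvc_mbps) / 1000
print(f"Traffic saved: {saved_gbps:.0f} Gbps across {viewers:,} viewers")
# -> 400 Gbps less network load, or the same load at double the quality budget
```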
Video coding standards work in a similar fashion to mobile wireless standards, where stakeholders from different companies and other groups come together to create a common language ensuring smooth interoperability between devices. VVC, or H.266, is a joint effort by the Video Coding Experts Group (VCEG) of the ITU-T and the Moving Picture Experts Group (MPEG) of the ISO/IEC, and builds on previous standards such as MPEG-2, AVC/H.264, and HEVC/H.265. In contrast to proprietary alternatives, all four of these video coding standards were developed in an open and collaborative fashion.
Immersive video will take a major leap forward with VVC. For example, in a VR video game, high resolution is needed where the player is directly looking, while images at the periphery are lower resolution. That transition and mixing are easier to accomplish with VVC, along with 5G and edge computing. This same technology could be used to, say, freeze the action during a soccer game and examine the scene in 3D, from all angles. Such a feature requires massive amounts of data and low latency, which is where more efficient compression becomes critical.
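A minimal sketch of that viewport-dependent idea follows; the tile layout, angular thresholds, and quality tiers are invented for illustration, not drawn from the VVC specification.

```python
# Pick a quality tier for each tile from its angular distance to the gaze:
# full resolution where the viewer looks, heavy compression at the periphery.
def tile_quality(tile_center_deg: float, gaze_deg: float) -> str:
    distance = abs(tile_center_deg - gaze_deg) % 360
    distance = min(distance, 360 - distance)   # shortest way around the sphere
    if distance < 30:
        return "high"       # foveal region
    if distance < 90:
        return "medium"
    return "low"            # periphery

# Example: eight tiles around the horizon, viewer gazing at 45 degrees
for center in range(0, 360, 45):
    print(f"tile at {center:3d} deg -> {tile_quality(center, gaze_deg=45)}")
```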
The first software implementations of VVC were already available in 2020 and hardware will follow – with some chipset makers saying the first commercial VVC shipments could start before the end of 2021. Given the importance of video and compression to next-generation mobile broadband use cases and the burden such traffic puts on networks, VVC will be critically important in the coming years, enabling some of the most exciting technologies, from fully-immersive VR games to self-driving cars.
Immersive audio
Immersive sound has shown exciting progression in recent years, and the pandemic has only intensified that growth. It is not just production studios that are coming under pressure to provide immersive sound, but also music, both for film and for audio-only content, through major platforms like Spotify, YouTube, and Audible. Although still in its infancy, VR has also been adding to the relevance of immersive studio mixing.
Before the outbreak, immersive sound had already made significant gains, with all major streaming platforms distributing in Dolby as standard on major titles. The resulting pressure to produce all major titles and series in at least 5.1 has had a serious impact on the number of immersive facilities needed in studios, and as a result on the number of studio monitors sold. Consumers have added significant pressure to this trend, and during the pandemic products such as AV receivers and immersive soundbars have sold strongly, with buyers keen to upgrade their more frequently used viewing set-ups.
It is not new, per se, but immersive audio got simpler and came into its own over the past year. The technology got a practical boost in three major ways: increased computing power at lower costs; the widespread surge in wireless headphone and earbud use with mobile devices; and more streaming services delivering feature films with immersive audio to devices.
One of the key technologies aiding broadcast audio during the pandemic has been audio over IP (AoIP). AoIP has enabled remote production during the pandemic, a possibility which only 10 years ago would have been a logistical nightmare. Although AoIP was already a reality for many before the pandemic, the increased usage and experience of remote production over the past 12 months is expected to have a positive impact on its implementation moving forward. This comes along with the already increased usage of AoIP in broadcast and production, being driven by the inherent connectivity and management benefits it brings.
Following the development of the interoperability standard, AES67, the advantages of adopting AoIP have become even clearer, with the standard providing interoperability across key audio and video networks, such as Dante, SMPTE ST 2110, LiveWire, QLAN, and Ravenna.
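A back-of-envelope calculation shows why AoIP scales so comfortably on ordinary network hardware. The parameters below are typical AES67 defaults (48 kHz, 24-bit, 1 ms packet time), and the per-packet header overhead is approximate:

```python
# Rough AES67-style stream bandwidth: audio payload plus per-packet headers.
def aoip_stream_mbps(channels: int,
                     sample_rate: int = 48_000,
                     bit_depth: int = 24,
                     packet_time_s: float = 0.001) -> float:
    audio_bps = channels * sample_rate * bit_depth
    packets_per_s = 1 / packet_time_s
    header_bytes = 12 + 8 + 20 + 14      # RTP + UDP + IP + Ethernet (approx.)
    return (audio_bps + packets_per_s * header_bytes * 8) / 1e6

print(f"8-channel stream: ~{aoip_stream_mbps(8):.1f} Mbit/s")
# -> about 9.6 Mbit/s, so one 1 Gbit/s port carries close to a hundred streams
```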
AI is having a transformative effect on a huge range of industries, and the world of media and entertainment is no exception. When talking about audio specifically, it is no secret that AI is quickly becoming a vital cog in the machine, and the truth is broadcasters are only just scratching the surface. When it comes to audio workflows, there are three main areas where AI is starting to have an impact: assisted mastering, assisted mixing, and assisted composition. All three are at slightly different points on the adoption scale.
There are very few skilled mastering engineers around, but AI is proving to be a viable and democratizing alternative for many musicians. AI tools can help engineers and audio teams make basic decisions and complete the more routine tasks, saving valuable pre-mixing time and enabling humans to focus on the more complex and creative elements. On the composition side, more and more tools are using deep learning algorithms to identify patterns in huge amounts of source material, then using the insights generated to compose basic tunes and melodies.
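As a deliberately simple stand-in for the routine work such tools automate, the sketch below normalizes a track toward a target loudness. Real AI mastering learns far richer decisions; the target level and the plain RMS measure here are assumed simplifications.

```python
# Simplistic "assisted mastering" step: one gain to hit a target RMS loudness.
import numpy as np

def normalize_to_target(samples: np.ndarray, target_db: float = -14.0) -> np.ndarray:
    rms = np.sqrt(np.mean(samples ** 2))
    current_db = 20 * np.log10(rms + 1e-12)        # avoid log(0) on silence
    gain = 10 ** ((target_db - current_db) / 20)
    return np.clip(samples * gain, -1.0, 1.0)      # guard against clipping

# Example: a quiet 440 Hz test "track" pulled up toward streaming loudness
track = 0.05 * np.sin(np.linspace(0, 2 * np.pi * 440, 48_000))
mastered = normalize_to_target(track)
```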
The prevalence of AI in audio workflows is only going to gather momentum in the months and years to come. But the real opportunity is in post-production, due to the time-to-market pressures involved. Sound engineers can use AI to speed up and simplify baseline tasks, enabling them to focus on the high-value aspects that require more creativity. In the long term, AI could be used to manage complex installations and systems. With AoIP, teams manage routing from central software so they can pool resources to support projects; AI could be used to manage these complex networks of computers and software.
AI will never replace humans entirely, but it is clear that the technology is set to play a key role in the years to come as it continues to get more advanced. Audio professionals have to be prepared to embrace the AI revolution.
The wave of the future
In a highly fragmented media landscape filled with numerous incompatible formats and time-sensitive content demands, broadcasters and video professionals find themselves having to support a multitude of signals in an automated way.
The traditional broadcast operations center will always have its place, but the new IP normal offers unlimited possibilities for broadcast and streaming operations. This migration was well underway before 2020, but the unprecedented events of last year have pushed many production teams to make the leap now.
The storage and compute resources of many facilities will remain on-premises, perhaps shared into a private cloud, for several years. Ultimately, everything will move to the cloud. The reason the switch will not happen sooner is largely cost. The media industry, let alone the post-production sector, is not sufficiently large for AWS or Azure to start striking meaningful discounts, and moving data in and out of a public cloud is probably cost-prohibitive for all but the largest and constantly running projects.
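Some assumed, illustrative numbers show how quickly those costs mount; the egress price is a ballpark, not a quote from any provider:

```python
# Illustrative only: egress charges for shuttling one project's media around.
egress_usd_per_gb = 0.08     # assumed ballpark internet egress price
project_tb = 50              # assumed camera originals for one mid-size job
round_trips = 3              # conform, review, and delivery passes

cost = project_tb * 1000 * egress_usd_per_gb * round_trips
print(f"Estimated egress alone: ${cost:,.0f}")   # -> $12,000 per project
```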
Even as more post-production workflows are maintained remotely, the need for face-to-face collaborative editing and craft finishing may never go away. That is one reason why city-center facilities in hubs like Mumbai or Delhi will remain important. The human interaction of peers is part of the act of creation, and if the industry stays remote permanently there is a risk of losing that.
Hence, the future is not going to be a return to the pre-2020 normal, but neither will it be the fully remote situation enforced by health lockdowns. The reality is that broadcasters need a hybrid solution that allows them to run some edits locally and some remotely. The legacy of COVID-19 will therefore be a mixed economy of remote and on-premises working, ideally led by the individuals working on each project rather than corporate diktat.