Optimizing Cloud-Based Workflows

This Optimizing Cloud-Based Workflows blog is based on the SVTA webinar of the same name, recorded on Tuesday, February 15, 2022. The webinar featured Beatriz Pineda (SSIMWAVE), Chrys Le Gall (Ateme), and Rob Roskin (Lumen). It was moderated by Streaming Video Technology Alliance Executive Director, Jason Thibeault.

Many streaming operators have embraced the cloud for encoding and transcoding, delivery, analytics, and more. But oftentimes, the functions of the streaming workflow that have been transitioned to the cloud are spread across multiple clouds to ensure redundancy and reliability of the overall service. That distribution of workflow components creates a number of challenges in both managing the streaming service and scaling the architecture. So what can operators do to mitigate those challenges while continuing to take advantage of the cloud? From utilizing containers to deploying serverless functions, there are solutions that address the challenges while affording operators the benefits of distributed, cloud-based workflows.

Challenge #1: Cloud-Based Workflows and Serverless Functions

The most obvious benefit of cloud-based workflows is moving away from maintaining hardware. The serverless option can be attractive because it makes testing new technology easy without significant upfront cost, and deployment is faster when there is no infrastructure to maintain. While this can speed up integration and help with capacity issues, the transition carries costs that do not come in the form of servers and hardware. Paying only for the computing resources you actually use can be great for certain use cases, such as live events, where additional CPU is only needed for short bursts of time. Serverless can still be limiting for heavier workloads, though: jobs that require sustained CPU run up against platform limits, and cloud providers still have room to improve their support for serverless functions. There is also a development cost in learning to operate within these environments. Teams may lack the skills to work with serverless functions, which can slow down an all-in-one service effort as many teams learn as they go.
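The pay-per-invocation model described above can be sketched as a minimal function-as-a-service handler. This is an illustrative assumption, not a real provider API: the event shape, the `transcode_segment` helper, and the default profile are all hypothetical, standing in for whatever encoder a platform would actually invoke per uploaded live segment.

```python
import json

def transcode_segment(segment_uri: str, profile: str) -> dict:
    """Placeholder for the real transcode step (e.g. handing the
    segment to an encoder). Hypothetical for illustration."""
    return {"source": segment_uri, "profile": profile, "status": "queued"}

def handler(event: dict, context: object = None) -> dict:
    """Entry point a FaaS platform would invoke once per event.

    Compute is billed only while this runs, which is why the model
    suits bursty live-event workloads: no idle encoder fleet.
    """
    # Accept either a raw dict or an API-gateway-style JSON body.
    body = json.loads(event["body"]) if isinstance(event.get("body"), str) else event
    result = transcode_segment(body["segment_uri"], body.get("profile", "720p"))
    return {"statusCode": 200, "body": json.dumps(result)}
```

The trade-off the panel raised shows up here too: a handler like this is capped by the platform's per-invocation CPU and runtime limits, so a long, heavy transcode may not fit the model at all.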

Challenge #2: Balancing Development Needs and Cloud Services Usage

There are additional concerns around development needs when using cloud-based services, especially for companies working in a multi-cloud architecture. One of those concerns is ensuring streaming synchronization across cloud providers. Despite such concerns, cloud services bring flexibility to development, making it easier to share, test, reproduce, and integrate, since deploying in the cloud doesn't require server space within the company's physical infrastructure. This allows near-instantaneous testing and reduces CAPEX. It is also important to consider that cloud-based workflows will be new to companies transitioning to serverless functions, so they may not be taking advantage of all the options available, which can also make problems harder to find. Furthermore, containerization adds overhead for logging and monitoring. For a streaming platform to truly operate a cloud-based workflow, developers will need to know how to write tools that collect data specifically for containers, and teams unprepared for this transition will feel the impact on operations later down the road. Each use case is different, of course, and there may also be cases where the workflow is broken into smaller pieces, each placed in its own container, to provide building blocks for different products and solutions.
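The container logging overhead mentioned above often comes down to emitting machine-readable logs that a platform collector can ship and query. A minimal sketch, assuming a twelve-factor-style setup where each containerized service writes structured JSON to stdout; the service names and field layout are illustrative, not any particular vendor's schema:

```python
import json
import logging
import sys

class JsonFormatter(logging.Formatter):
    """Render each log record as one JSON line, so a container
    log collector can parse fields instead of scraping free text."""
    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "level": record.levelname,
            "service": getattr(record, "service", "unknown"),
            "message": record.getMessage(),
        })

def make_logger(service: str) -> logging.LoggerAdapter:
    """Build a logger that tags every line with its service name
    and writes to stdout, where the container runtime captures it."""
    handler = logging.StreamHandler(sys.stdout)
    handler.setFormatter(JsonFormatter())
    logger = logging.getLogger(service)
    logger.handlers = [handler]
    logger.setLevel(logging.INFO)
    return logging.LoggerAdapter(logger, {"service": service})
```

With each workflow piece (packager, encoder, origin) logging this way, the per-container data collection the panel describes becomes a matter of aggregating one uniform stream rather than writing bespoke scrapers per component.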

Challenge #3: Maintaining QoS and QoE in Cloud-Based Optimization

Transitioning to an architecture based on serverless functions can be difficult because many engineers come from a world where everything was under their control: content delivery ran on hardware that could be maintained physically. The cloud brings more abstraction of the networking, since the servers and cables are no longer housed by the teams working with them. The transition therefore depends on accepting that there is no control over the end-to-end hardware; instead, engineers must rely on the monitoring tools that cloud providers make available. Maintaining QoE and QoS also depends on the information that can be collected, and having everything in the cloud may mean more CPU is needed so that everything can be tracked. If an issue arises while using containers in the workflow, it should be traceable to its origin and resolvable as a result. Additionally, maintaining these QoE and QoS standards rests on the ability to act quickly to resolve issues, which can look like monitoring analytics provided by the cloud providers, AI, and additional data points collected from the workflow and from users. Understanding the chain and timing of operations lets teams see where each job starts and stops, ensuring the end-user experience consistently delivers at its target.
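Understanding "where each job starts and stops" can be sketched as a tiny stage tracer. This is an illustrative sketch under stated assumptions, not a specific vendor or APM tool: `StageTracer` and the stage names are hypothetical, standing in for the trace data a real observability stack would collect across the workflow.

```python
import time
from contextlib import contextmanager

class StageTracer:
    """Record start/stop times for each workflow stage, so operators
    can see which step (ingest, encode, package...) is slow."""
    def __init__(self):
        self.spans = []  # list of (stage_name, start, stop) tuples

    @contextmanager
    def stage(self, name: str):
        start = time.monotonic()
        try:
            yield
        finally:
            # Record the span even if the stage raised, so failures
            # still show up in the timing report.
            self.spans.append((name, start, time.monotonic()))

    def report(self) -> list:
        """Summarize each recorded stage and how long it took."""
        return [
            {"stage": name, "duration_s": round(stop - start, 3)}
            for name, start, stop in self.spans
        ]
```

Wrapping each job in `with tracer.stage("encode"): ...` yields a per-stage timing report, which is the kind of chain-and-timing view the panel describes for keeping the end-user experience at its target.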

Future Uses of Cloud-Based Optimization in Streaming

What does the future look like? There is conversation around how to phase into a more sustainable architecture. It comes down to resource consumption and ensuring there is no over-consumption, which benefits cloud providers and streaming operators alike. It also raises a question: as serverless functions become more prevalent, will it be easier to run one workflow in the cloud than another? Connecting this to the workflow and the end-user experience is also important for understanding the delivery of an automated video experience. Automating encoding for video experiences is not easy to achieve, since efficiency must be balanced against quality of service. There is also talk of generative AI making it possible to create content from a script rather than a large content library. This would allow AI to focus on targeting users and audiences in content creation, and producers to focus on other projects. For now, we look to cloud providers to continue improving their technology, which can open up opportunities for more and more serverless functionality.

Freelance Writer

Sydney is a freelance writer working with the Streaming Video Alliance to develop blog posts and other content.
