Takeaways from CES 2024 for video service providers

The annual Consumer Electronics Show (CES 2024) always grabs global headlines with eye-catching new gadgets and ever-expanding TV screen sizes. This year was no different, with acres of coverage for LG and Samsung’s new transparent TV screens, the latest virtual reality concepts, and robots that will make you anything from a cocktail to a stir-fry. 

But, like the bright lights and razzmatazz of the host city, Las Vegas, the show-floor glitz of CES 2024 is largely a temporary distraction for the video entertainment industry. Behind the scenes, there are serious conversations about the future of streaming. So what were video service providers really talking about at CES 2024?

Magine Pro’s Sales Director, Neil Fender, at CES 2024.

Interest in AI is becoming more targeted

Just as we saw at IBC and NAB in 2023, Artificial Intelligence (AI) continued to be a major buzzword. Our CEO, Matthew Wilkinson, touched on this important topic in his recent blog on Large Language Models and Generative AI. But the partners and customers (both current and future) that I spoke to in Las Vegas are increasingly focused on the practicalities of deploying this constantly evolving technology. They’ve gone beyond wondering how AI might revolutionise the industry in the future and are now looking at very specific use cases that will advance their short-term goals, not just their longer-term planning.

With consumers around the world becoming increasingly cost-conscious, churn is a massive area of focus for every streaming service, no matter what combination of business models they’re using. The video services I spoke to are looking for a strong and immediate return on investment for any foray into AI. In particular, they want it to keep consumers engaged so they’re less likely to look elsewhere for content. Offering personalized content recommendations is the most often-cited strategy for growing engagement, so it was a pleasure to be able to talk about the AI-powered recommendation capabilities in Magine Pro’s end-to-end OTT streaming platform.
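For more technical readers, here is a minimal, illustrative sketch of what one such recommendation step can look like: ranking catalogue titles by cosine similarity between a viewer’s taste profile and per-title embeddings. All names and numbers are hypothetical, and this is not Magine Pro’s actual implementation.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def recommend(user_profile: np.ndarray,
              catalogue: dict[str, np.ndarray],
              top_n: int = 5) -> list[str]:
    """Rank catalogue titles by similarity to the viewer's taste profile.

    The taste profile could be, for example, the mean embedding of titles
    the viewer recently watched; the catalogue maps title -> embedding.
    """
    scored = [(title, cosine_similarity(user_profile, emb))
              for title, emb in catalogue.items()]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return [title for title, _ in scored[:top_n]]

# Hypothetical usage: in practice, embeddings come from a trained model.
catalogue = {
    "Nordic Noir S1": np.array([0.9, 0.1, 0.0]),
    "Cooking Show":   np.array([0.1, 0.8, 0.1]),
    "Crime Docs":     np.array([0.8, 0.2, 0.1]),
}
profile = np.array([0.85, 0.15, 0.05])  # derived from recent viewing
print(recommend(profile, catalogue, top_n=2))
```

In a production system the ranking would also blend in signals such as recency and popularity; the point here is only the shape of the computation.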

I was also happy to talk through some of the tactics that have paid dividends for our existing customers when tackling churn. For example, the “dunning” functionality in our advanced billing engine enables them to communicate systematically with consumers whose payment methods have expired or failed. This proactive strategy prevents involuntary (passive) churn, minimising customer loss and ensuring a positive user experience. Another proactive approach is to automate “win-back” strategies: offering targeted discounts to users who have recently churned, or who show signs of being a flight risk, to help keep them on board.
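To make the dunning and win-back ideas concrete, here is a simplified sketch, illustrative only and not the Magine Pro billing engine: a failed payment triggers a bounded series of retries and reminders, and recently churned users are flagged for a targeted discount. Every class name, threshold, and message below is a hypothetical example.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Subscription:
    user_id: str
    payment_failed_on: date | None = None   # set when a charge fails
    churned_on: date | None = None          # set when the account lapses
    reminders_sent: int = 0

MAX_REMINDERS = 3                   # hypothetical dunning policy
RETRY_INTERVAL = timedelta(days=3)  # wait between retries
WINBACK_WINDOW = timedelta(days=30) # how long after churn to make an offer

def run_dunning(sub: Subscription, today: date) -> str | None:
    """Return the next dunning action for a subscription, if any."""
    if sub.payment_failed_on is None:
        return None
    elapsed = today - sub.payment_failed_on
    if (sub.reminders_sent < MAX_REMINDERS
            and elapsed >= RETRY_INTERVAL * (sub.reminders_sent + 1)):
        sub.reminders_sent += 1
        return f"retry charge and email a payment-update link to {sub.user_id}"
    return None

def winback_offer(sub: Subscription, today: date) -> str | None:
    """Offer a targeted discount to users who churned recently."""
    if sub.churned_on and today - sub.churned_on <= WINBACK_WINDOW:
        return f"send win-back discount to {sub.user_id}"
    return None

# Hypothetical usage
sub = Subscription("viewer-42", payment_failed_on=date(2024, 1, 2))
print(run_dunning(sub, today=date(2024, 1, 8)))
```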

We’ll have more about these tactics in our upcoming e-guide on advanced OTT monetization strategies. Register now to be among the first to get the guide, which will also explain some of the reasons we champion hybrid monetization models. 

No moving out of the FAST-lane

I don’t think anyone was surprised to find that FAST (Free Ad-Supported Streaming TV), another of the biggest streaming trends of 2023, was still a hot topic at CES. The low barriers to entry mean that FAST channels continue to be the most popular route into streaming for content owners, particularly in the US market.

The FAST trend I heard most about at CES, however, came from content owners who’ve dipped a toe into streaming with FAST channels via aggregation platforms like Roku, Amazon, LG, and Samsung, and who now want their own suite of streaming apps to build a direct relationship with consumers. We had some excellent conversations at CES with technology partners and customers about how we can help them make that ambition a reality with the Magine Pro OTT platform, which is quick and easy to launch yet customizable enough to grow with their business.

Want to know more? 

If you weren’t able to make it to CES 2024, you can book a meeting with our team to talk about how AI recommendations could improve your engagement, or how Magine Pro can elevate your content distribution and monetization strategies.

The Rise of LLMs and AGI in a Dynamic Tech Landscape

In recent years, technology hype cycles, notably blockchain/crypto and the Metaverse, have captured attention, spurred by a macro environment that amplified their visibility. A persistent question remains: what real-world problems do they solve, and what is their potential market reach? Despite their ongoing relevance, significant behavioural barriers hinder mass adoption, such as trust issues in crypto and social concerns around the Metaverse.

In this landscape, Large Language Models (LLMs), most visibly ChatGPT, and the potential progress towards Artificial General Intelligence (AGI) have quietly emerged. They have swiftly gained user attention and recognition for their tangible value. Today, this technology is transitioning from novelty to utility, and two key perspectives stand out:

Acceleration: LLMs, akin to computing and the internet, serve as a transformative utility, laying the foundation for generalised applications. This empowers teams not only to access knowledge faster but also to generate new solutions from existing information. The World Wide Web gave users access to the world’s knowledge; LLMs take this a step further by acting as an “intelligent” gateway to that knowledge and, importantly, by creating new solutions from it.

Generation: LLMs facilitate the generation of diverse content, including text, images, and, increasingly, video. This will inevitably disrupt workflows and creative processes, prompting a pendulum swing between control (regulation, intellectual property safeguards) and creativity. Content generation will open the floodgates to even more User-Generated Content (UGC) by lowering the barrier to entry, but it should augment, not replace, the creative process (a brief code sketch of this kind of generation follows below).
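To make the generation point concrete, the sketch below asks an LLM to draft a short synopsis for a catalogue title using the openai Python client. The model name, prompt, and title are assumptions, and any provider’s API could be substituted.

```python
from openai import OpenAI  # any LLM provider's SDK would work similarly

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def draft_synopsis(title: str, genre: str) -> str:
    """Ask an LLM to generate a short, editable synopsis for a title.

    The model name below is an assumption; swap in whichever model your
    provider offers. The output is a starting point for editors, not a
    replacement for them.
    """
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": f"Write a two-sentence synopsis for a {genre} "
                       f"series called '{title}'.",
        }],
    )
    return response.choices[0].message.content

# Hypothetical usage
print(draft_synopsis("Harbour Lights", "Nordic noir"))
```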

Despite its remarkable potential, we are still in the early stages of this new utility, much like when Tim Berners-Lee published his paper on the WWW. Just as the web did, LLMs will give rise to an evolving ecosystem of services and platforms built on top of them. Challenges lie ahead, but opportunities will emerge. Learning to leverage this new utility will be crucial in navigating the dynamic tech landscape shaping the future of work, businesses, and our day-to-day services.

______________________________________________________________________________________

Interested in LLMs and AGI? Reach out to the tech-savvy Magine Pro team for more insights. You can also explore our blog for further thoughts from our CEO, Matthew Wilkinson.

Learn about our flexible OTT platform and download our free white papers and ebooks on our website.
