Official blog of Verizon Digital Media Services. Learn about the latest news and events in the digital media and content delivery industry.
Grady Player, Manager, Software Dev Engineering, and Pavel Koshevoy, Software Engineer
Bringing Live 4K Streaming Sporting Events to Life at Scale
Today, the vast majority of live and streaming content is either 720p or 1080p. Despite the widespread availability of affordable 4K screens, broadcasters are still struggling with the complex challenges of producing and distributing 4K on an infrastructure designed to handle much lower bitrate HD streams.
Many broadcast companies and production studios, who not long ago made significant investments in HD, are carefully weighing the ROI of delivering 4K against the benefits achieved. Those entering the 4K arena typically make only select higher-quality content, such as certain movies and shows, available at an extra cost. Part of the hesitation to go 4K may be found in a recent Brightcove study indicating that the move to 4K is only worth the investment once at least 80% of endpoints can consume 4K content. Current estimates put 4K TV ownership at about one-fifth of viewers.
For broadcasters, now is the perfect time to gain experience with new formats, compression technologies, and workflows, even as standards continue to evolve and technologies advance. As announced at the NAB Show this year, Verizon Media has begun offering 4K services to our broadcast customers. In this blog, we look at some of the technical challenges we faced around encoding and packaging 4K for live sporting events.
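To get a rough sense of why 4K strains infrastructure built for HD, it helps to put numbers on the bitrates involved. The sketch below uses the "Kush gauge," a common industry rule of thumb for estimating encoded bitrate; it is a generic heuristic for illustration, not the encoder configuration we actually use.

```python
# Rough per-stream bitrate estimate using the "Kush gauge" heuristic:
# bitrate (bps) ~= width * height * fps * motion_factor * 0.07.
# A generic industry rule of thumb, not our production encoder settings.

def kush_gauge_kbps(width, height, fps, motion_factor=2):
    """motion_factor: 1 = low motion, 2 = medium, 4 = high (sports)."""
    return width * height * fps * motion_factor * 0.07 / 1000

hd = kush_gauge_kbps(1920, 1080, 30, motion_factor=4)   # 1080p30 sports
uhd = kush_gauge_kbps(3840, 2160, 60, motion_factor=4)  # 4K60 sports

print(f"1080p30 sports: ~{hd / 1000:.1f} Mbps")
print(f"4K60 sports:   ~{uhd / 1000:.1f} Mbps ({uhd / hd:.0f}x the bandwidth)")
```

Four times the pixels and twice the frame rate multiply out to roughly 8x the bandwidth per viewer, which is why every stage of the pipeline, from contribution feed to CDN egress, has to be revisited for 4K.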
In 2018, cyber attacks against small and medium-sized businesses (SMBs) increased by 424%.[1] In fact, 43% of all security breaches last year involved SMBs.[2] And with the average cost of a data breach reaching $1.56M due to disruptions in normal operations, the financial hit to any SMB can be significant.[3]
Despite a clear need for protection, many SMBs find that adding a security solution can be a costly and complicated endeavor. Between the costs for infrastructure and security operations, which can easily hit $20,000 a month, implementing and maintaining a robust security solution is often relegated to the “maybe someday” pile.
To enable more SMBs to protect their content and digital assets, Verizon Media is excited to announce our new WAF Essential product, an affordable enterprise-grade web application firewall (WAF) designed for smaller organizations and use cases. Customers gain access to our cloud security suite for a specific application to help them protect against a variety of dangers ranging from OWASP Top 10 threats to sophisticated multi-pronged attacks.
WAF Essential delivers the same strength and speed as our enterprise WAF offering and includes the following benefits:
Cost-effective web security: Comprehensive web security leveraging the protection of our global network at an entry-level price point.
Enterprise-grade security: Global scale and highly performant security, which includes our proprietary Dual WAF capability, Rate Limiting, and Real-Time Analytics so you can react sooner to a security threat.
High-capacity network: 82+ Tbps of globally distributed network capacity to protect your business from even the largest DDoS attacks.
New, easy-to-use UI: Intuitive one-page user interface to simplify complex policy management.
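To illustrate the idea behind the rate limiting mentioned above, here is a minimal token-bucket sketch. It is a textbook algorithm shown for illustration only, not the product's actual implementation, and the `TokenBucket` class name and parameters are hypothetical.

```python
class TokenBucket:
    """Classic token-bucket limiter: `rate` tokens/sec, bursts up to
    `capacity`. Uses explicit timestamps so behavior is deterministic."""

    def __init__(self, rate, capacity, now=0.0):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = now

    def allow(self, now):
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=5, capacity=10)  # 5 req/s steady, burst of 10
allowed = sum(bucket.allow(now=0.0) for _ in range(20))
print(allowed, "of 20 simultaneous requests allowed")  # burst capacity caps it
print("request at t=2.0 allowed:", bucket.allow(now=2.0))  # bucket has refilled
```

The same shape of logic, applied per client or per rule at the edge, is what lets a WAF absorb a legitimate burst while throttling a sustained flood.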
While WAF Essential is purpose-built for SMBs, it is fully compatible with our enterprise solution. As your business grows and requires additional protection, Verizon Media can add DNS, bot mitigation, and Managed Cloud Security to further protect your users, infrastructure, and invaluable data without any interruption in service.
WAF Essential proves you don’t have to be big to get big protection. You just need to want application security. And who doesn’t want that?
For additional information on WAF Essential and to see how it can help your business, please contact us.
Trevor Hunsaker, Sr. Manager, Software Dev Engineering
Personalized viewing experiences at a 1:1 level are transforming the TV experience. Instead of one-size-fits-all, viewers get targeted, highly relevant advertising, tailored content, and recommendations for new programs. You can also implement precise DRM/blackout management based on your viewers’ device type, location, history, demographics, and other data.
Scaling personalized video streams to millions of viewers, especially for live programming such as sports, is nearly as challenging as hitting for the cycle in baseball. Viewer volumes can swing wildly, surging by hundreds of thousands at must-watch moments such as kickoff, overtime, and close matches. If your infrastructure for supporting personalization isn’t adaptable and scalable, it will be game over, and in the world of OTT that could put your entire business at risk.
Keep reading for details on the technology and processes we employ to create personalized viewing experiences.
Content delivery networks (CDNs) have been around since long before the era of cloud computing – they were the original cloud. While the demands of the Internet were different back then, CDNs are just as relevant today as when they were introduced, helping deliver high-quality media experiences on a variety of devices to global audiences. Verizon Media is continually investing in and advancing our delivery network, adapting to the cloud, and expanding our capacity to ensure the best performance and reliability for our customers and their users. Today, we are celebrating 75 Tbps of network capacity – a 10x increase in just the past 5 years!
Reaching the 75 Tbps milestone is a significant achievement for our delivery network, and now is the ideal time to reflect on how we got here, the challenges we faced, and the road ahead. As a leading content delivery network, it’s not enough to co-locate servers in major metropolitan areas around the world. From the outside looking in, it’s easy to assume that all it takes to hit 75 Tbps is infrastructure and deploying new servers. The reality is that it requires broad technological advances so that our network can efficiently and securely handle any workload within a diverse ecosystem of application interactions. Achieving this milestone demonstrates the sustained high performance required across automation, scaling, software development, and architecture. It has also taken countless hours of work from a passionate, dedicated team: from the founders of EdgeCast to the teams hard at work today, the selfless contributions of everyone made this milepost possible.
Rather than talk through all the various innovations developed to date, I will review the scale of the problems we’ve had to solve.
These are just a few of the problems we’ve had to solve to get to where we are today. Here’s a look at our network at the outset of our drive to 75 Tbps.
Our network growth over time has been significant.
Here’s what our network map looks like today:
What the future holds
Our network is well positioned to securely handle the massive scale required for the OTT revolution. From millions of viewers concurrently streaming video around the globe throughout the day to the explosive growth of IoT devices, data is increasing exponentially. In fact, 90% of the data in the world today was created in the last two years alone – that’s 2.5 quintillion bytes of data a day!* And there’s no end in sight, which makes it an absolute necessity to work with a network and media platform that can ingest, process, and deliver all of this data securely and at optimal performance for an unmatched user experience. Advancements in 5G connectivity will bring these trends together and enable innovations in edge computing, IoT, and security that grow with the future.
I’m so incredibly proud to be a part of the team that has worked so hard to help us advance our network and innovate in ways previously not thought possible. The best part of achieving the 75 Tbps milestone is that it acts as a checkpoint for us to take a pause, catch our breath, and reflect on what an amazing ride it’s been. Next stop, 100 Tbps!
Maintaining quality at scale involves optimizing performance across every part of the Verizon Media Platform tech stack: from its lower layers, at the CPU and NIC, all the way up to the OS and the applications. Ultimately, our goal is always the same: great performance. To get there…
The O’Reilly Velocity Conference in San Jose, June 10–13, is the destination to get expert insight on building and maintaining cloud-native systems.
Connect with us at the conference to learn how you can develop higher performing applications at the edge using our Functions-as-a-Service platform, and how our all-new WAF delivers significant benefits for SecOps teams. Schedule a meeting with us today or stop by and see us at booth 701.
In addition to meeting us at our booth, there are three additional ways you can connect with us at Velocity.
Wednesday, June 12th, 9:30 a.m.
Dave Andrews, chief architect, presents “Which edge do you need: Managing multiple edges to deliver the next industrial revolution.” Dave looks at the new class of low-latency/high-bandwidth application domains and how we’re helping deliver them to our customers.
Tuesday, June 11th, 9:00 a.m.-12:30 p.m.
William Pressly, senior director, emerging business, leads this 45-minute tutorial designed to help you develop more performant applications at the edge of the network for richer, more personalized user experiences at ultra-low latency.
Attend to learn about the latest developments of our Edge Compute product. Then put your new knowledge to the test during the hackathon immediately following the tutorial.
Take part in this exciting opportunity to get a head start on this technological innovation, and have an influence on its capabilities. Sign up now, space is limited.
Tuesday, June 11th at the conclusion of the tutorial; approximately 9:45 a.m.
Put what you learned in the FaaS tutorial into practice at the hackathon. The winners will walk away with some great prizes. Official Rules
According to a recent Verizon Data Breach Investigations Report, two out of five cyber incidents are distributed denial-of-service (DDoS) attacks. Easy and inexpensive to deploy, they remain an attractive weapon in the cyberattacker’s arsenal. If you have an online presence, your services are at risk of being targeted by a DDoS attack.
Strategy behind Stonefish, our DDoS mitigation system, designed to recognize and mitigate modern DDoS attacks
Architecture underpinning Stonefish
Process for a NOC and Stonefish to work together to stop attacks in real time
Stonefish is built into our massive delivery network, so it has the globally distributed infrastructure and capacity to withstand the largest DDoS attacks. But more importantly, Stonefish has the intelligence to automatically filter out bad data packets before they can disrupt the web applications we deliver. Stonefish works day and night for every customer on our network, regardless of service plan, making it a reliable, efficient, and easy way to shield your website from most DDoS attacks.
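One simple building block of rate-based mitigation is scoring each source by its packet rate over a sliding window and throttling sources that exceed a threshold. The toy sketch below illustrates only that idea; Stonefish’s real heuristics are far more sophisticated, and the class name, thresholds, and IPs here are all hypothetical.

```python
from collections import defaultdict, deque

class PacketRateFilter:
    """Toy sliding-window filter: accept a packet only while its source
    stays under max_pps within the last window_s seconds. Illustrative
    only; not how Stonefish actually works."""

    def __init__(self, window_s=1.0, max_pps=100):
        self.window_s = window_s
        self.max_pps = max_pps
        self.history = defaultdict(deque)  # src_ip -> recent arrival timestamps

    def accept(self, src_ip, ts):
        q = self.history[src_ip]
        q.append(ts)
        # Evict timestamps that have fallen out of the sliding window.
        while q and ts - q[0] > self.window_s:
            q.popleft()
        return len(q) <= self.max_pps * self.window_s

f = PacketRateFilter(window_s=1.0, max_pps=100)
# A flood source sending 500 packets in one second gets throttled...
flood = sum(f.accept("203.0.113.9", ts=i / 500) for i in range(500))
# ...while a normal client at 10 packets/sec passes untouched.
normal = sum(f.accept("198.51.100.7", ts=i / 10) for i in range(10))
print(f"flood accepted: {flood}/500, normal accepted: {normal}/10")
```

The hard part at CDN scale is doing this kind of accounting across millions of sources at line rate, and distinguishing a flash crowd of real viewers from a botnet, which is where automated systems and the NOC work together.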
The frequency and intensity of DDoS attacks will continue to rise. Be prepared by taking action now to protect your website and apps.
The much anticipated 2019 edition of the Verizon Data Breach Investigations Report (DBIR) is now available. As the blogosphere and podcasters rush to dissect the data and share their views, let me be one of the first to share my impressions of the current security landscape.
Here are three issues that stand out the most to me.
Espionage is on the rise
External-threat actors continued to be the number one cause of data breaches, something we’ve seen since at least 2011. State-affiliated actors are the number two source for breaches, just behind organized crime, with an uptick in breaches by these groups over the recent past.
While financial gain remains the number one reason for attacks, you don’t have to be a fan of spy novelist John le Carré to notice an uptick in espionage as a motive for threat actors.
Figure 1. Threat actor motives in breaches over time.
Figure 2. Select threat actors in breaches over time.
Web hacking is the number one attack vector in breaches
Hacking was the number one tactic utilized in data breaches, with 52% of breaches analyzed involving hacking. What were they hacking? Web applications. This trend has not changed since we started blogging about them in the DBIR two years ago.
Figure 3. Different threats to web applications require multiple layers of defenses to mitigate them.
Some ports are more popular than others in DDoS attacks
Not all ports are created equal, at least from the standpoint of a DDoS attacker. The 2019 DBIR ranks ports that are associated with DDoS attacks. And the winners are:
389 Connectionless LDAP
123 Network Time Protocol
Figure 4. Comparison of ports in DDoS and honeypot attacks.
What do the top attack ports have in common? They are all connectionless User Datagram Protocol (UDP) services that are used in, and susceptible to, amplification attacks. This correlates with the rise of reflection attacks we saw in 2018, some of which, such as the Memcached reflection attack against GitHub, grabbed global headlines. Memcached uses UDP port 11211, which the DBIR ranks number 10.
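The arithmetic behind reflection attacks is what makes these UDP services so dangerous: a small spoofed query elicits a much larger response aimed at the victim. The figures below are commonly cited bandwidth amplification factors from US-CERT advisories such as TA14-017A, not numbers from the DBIR itself, and are approximate.

```python
# Commonly cited bandwidth amplification factors for UDP reflection
# (approximate figures from US-CERT advisories, e.g. TA14-017A;
# not from the DBIR).
amplification = {
    "CLDAP (389)": 56,          # often cited in the ~46-70x range
    "NTP monlist (123)": 557,   # ~556.9x per US-CERT
    "Memcached (11211)": 51000, # cited range ~10,000-51,000x
}

attacker_uplink_mbps = 100  # a modest spoofing-capable uplink
for proto, factor in amplification.items():
    reflected_gbps = attacker_uplink_mbps * factor / 1000
    print(f"{proto}: 100 Mbps of spoofed queries -> ~{reflected_gbps:,.0f} Gbps at the victim")
```

At the Memcached end of that range, 100 Mbps of spoofed queries can theoretically translate into terabit-class traffic, which is consistent with the record-setting scale of the GitHub attack.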
A glimpse at industry verticals
The DBIR analyzes incidents and breaches for industry verticals, as defined by the official North American Industry Classification System. Keep reading for my perspective on security trends affecting three critical industries: Finance, Information, and Retail.
Finance: Phishing and credentials
About 40% of confirmed data breaches in the financial industry involved a mail server. Criminals used social engineering and phishing to trick users into providing their credentials. The compromised accounts were usually utilized to send phishing emails to colleagues. As such, phishing often precedes and follows email compromise.
The DBIR also highlighted that compared to the previous year, there was a significant uptick in the involvement of credentials in data breaches. At the same time, there was a significant decrease in payment information in these breaches.
While an overwhelming majority of breaches were financially motivated, it’s worth noting that 10% of breaches in the financial industry were related to espionage. If not money, what the thieves were after remains unknown.
It’s been said over and over again that two-factor authentication (2FA) should be standard for customer-facing applications, remote access, cloud-based resources, and other critical assets. Until this happens, data breaches will be a common occurrence for businesses of all sizes.
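To ground the 2FA recommendation, here is a minimal sketch of TOTP per RFC 6238, the algorithm behind most authenticator apps. It shows how the rotating second factor is derived; it is not a production 2FA implementation, and the sample secret is the RFC’s own test vector.

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6, step=30):
    """Minimal RFC 6238 TOTP: HMAC-SHA1 over the current 30-second
    time-step counter, dynamically truncated to a short numeric code."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if for_time is None else for_time) // step)
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, "sha1").digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: secret "12345678901234567890" (base32 below),
# time 59 -> 8-digit code 94287082.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", for_time=59, digits=8))
```

Because the code changes every 30 seconds and is derived from a shared secret, a phished password alone is no longer enough, which is exactly why 2FA blunts the credential-theft pattern described above.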
Information: DoS attacks, nation states, and cyberespionage
According to the U.S. government standard, the information industry consists of organizations that have to do with the creation, transmission, and storing of information, and it includes Hollywood. More than 60% of security incidents (not breaches) affecting this industry this year were denial-of-service attacks, similar to the previous year. When it comes to confirmed data breaches, the top cause (42%) was misconfiguration errors, followed by web app attacks (29%) and the newcomer, cyberespionage (13%). About 36% of external attackers were state-affiliated, the same percentage as organized crime.
When powerful, destructive attacks are distributed, the defenses should also be massive, distributed, and well-managed.
Retail: Payment cards and web applications
The DBIR reports a shift in fraudulent payment card activity away from physical environments, where a consumer presents a card to the merchant or a machine, to online environments, also known as card-not-present transactions. Thefts happen where the money is, which is why criminals are increasingly after web server assets. More than 75% of the 139 retail industry breaches analyzed involved web applications, and 64% of the data compromised included payment information. In related news, the percentage of breaches involving point-of-sale systems decreased 10-fold between 2014 and 2018, while the percentage involving web applications increased 12-fold over the same period.
Figure 5. Comparison of card-present vs. card-not-present fraud.
Compliance with the Payment Card Industry Data Security Standards will continue to be important in coming years, but what is more important is to go beyond standard compliance to truly secure data from financially motivated external (81%) and internal (19%) threat actors.
It remains to be seen whether the upward trend of state-sponsored espionage will continue or fizzle out, but it is likely that we’ll see more of these attacks as tensions between nations persist. Add in ongoing web hacking and an increase in DDoS attacks, and it’s clear that any business seeking to protect its assets (financial and intellectual) and those of its customers must take a broad approach to cybersecurity. We must continue to rely on time-honored practices such as 2FA and timely patching, harden security with multilayer defenses, and instill a security mindset, along with a healthy dose of paranoia, among employees to strengthen the human firewall.
Figure 6. Multiple layers of protection for web applications.
In 2018, live events drove double- and triple-digit audience growth during major political and sporting events.* It makes sense: live event streaming drives media consumption because it pairs in-the-moment, can’t-miss experiences with anytime/anywhere accessibility. It’s why many publishers are turning to live events to drive audience engagement and growth.
To take advantage of the amazing opportunity that live events provide, you’ve got to take a hard look at your infrastructure. Can it handle wild fluctuations in viewing sessions without skipping a beat? Time-bound spikes in session starts that commonly accompany live events can easily overwhelm your streaming infrastructure, sidelining your viewers. With expectations for quality streams continuing to increase, viewer tolerance for buffering and other glitches has plummeted to nearly zero.
As a live event streaming platform that has supported thousands of live sporting events, including some of the largest leagues, biggest venues, and most watched games, we’ve had to address these concerns. Working closely with our customers, we’ve continually evolved and optimized every aspect of our live streaming video workflow and architecture so we can keep up with rising viewer standards for streaming quality.
In this technology blog, Video encoding in the cloud to stream large-scale live events, you’ll learn how we architected our platform to optimize the many steps involved in moving a live stream from the event venue into the cloud. You’ll discover how we designed our cloud-based encoding services to process live video streams with minimal latency while enabling server-side ad insertion and personalization. And we share how to efficiently process video in the cloud while dealing with the unpredictable variables of live events so that you can deliver a TV-like quality experience to every viewer on any device around the world.
By the end of the blog, you’ll have a deeper understanding of the factors you need to consider when encoding live streams so that you can maximize the flexibility and functionality to create differentiated live events. You’ll also know how to efficiently move your live stream from the event venue into the cloud while balancing a number of competing interests such as bandwidth, video formats, and ABR.
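One of the competing interests mentioned above, ABR, comes down to the client picking a rendition from the encoding ladder that fits its measured throughput. The sketch below is a toy illustration of that selection logic under assumed conditions; the ladder values and the safety factor are illustrative, not our production configuration.

```python
# Toy illustration of client-side ABR rendition selection: choose the
# highest-bitrate rendition that fits within measured throughput scaled
# by a safety factor. Ladder values are illustrative only.
LADDER = [  # (label, bitrate in kbps), lowest to highest
    ("234p", 400),
    ("360p", 800),
    ("540p", 1600),
    ("720p", 3200),
    ("1080p", 6000),
]

def pick_rendition(measured_kbps, safety=0.8):
    """Leave headroom (safety factor) so throughput dips don't stall playback."""
    budget = measured_kbps * safety
    chosen = LADDER[0]  # fall back to the lowest rung if nothing fits
    for rung in LADDER:
        if rung[1] <= budget:
            chosen = rung
    return chosen

print(pick_rendition(5000))  # 6000 kbps exceeds the 4000 kbps budget, so 720p
print(pick_rendition(9000))  # budget 7200 kbps comfortably fits 1080p
```

The safety factor is the key design choice: aiming below measured throughput trades a little peak quality for far fewer rebuffering events when bandwidth fluctuates mid-stream.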
Whether you’re building your own video workflows in the cloud or looking to a service provider like Verizon Digital Media Services for help, this blog details what you need to stream, and profit from, ad-supported live video from the cloud.
Complete the form on the right to receive our technology blogs the minute they become available. The next blog goes live on Tuesday, May 21, and dives into the dangers of DDoS and our automated solution to mitigate attacks. Register now to ensure you receive it in your inbox.
To deliver the quality entertainment experience that today’s connected consumers expect for streaming live video regardless of the device they’re using, content providers must address a variety of challenges.
Content providers must ensure that live streams of breaking news, sports and entertainment are not compromised by startup failures, buffering or poor video quality. Viewers won’t tolerate it. Content providers must increase production value to the standard that is typical of live event broadcasts and offer greater interactivity, allowing viewers to shape their own viewing experiences by choosing from additional streaming content and varying camera angles.
All of these elements make for compelling live streaming events, but they do require some behind-the-scenes technical and operational work.
For a start, streaming content must be presented quickly and at a high quality, even in the face of rapid and immense fluctuations in audience. However, even when the technology supporting live streaming can scale to handle massive numbers of concurrent requests, as our platform does, challenges remain.
Sending out production crews, setting up signal acquisition, overseeing logistics and allocating resources to deliver a live program with high production value can be both expensive and difficult. The unpredictable nature of streaming live events can diminish the viewing experience, if not handled appropriately. The team on the ground has just one chance to get it right. They must be ready to capture a conventional primary feed and the interesting angles and behind-the-scenes views that add value and foster one-on-one engagement. At the same time, the team must have the tools to seamlessly and unobtrusively insert alternate feeds or promotional material into live action.
Given these challenges, some of the world’s largest sports networks rely on partners to manage both the on-site production environment and their live streams, particularly when producing numerous events simultaneously. Because it eliminates the need to invest in additional infrastructure and to cart equipment from site to site, this approach offers both convenience and — for content producers that are still new to live event streaming — a low-risk way to get into the game.
To build an audience and monetize contracts and live streaming content effectively, content producers must do more than stream live events. A compelling TV-like viewing experience can only be created by a team that enjoys robust and flexible technology, strong operational capabilities and powerful monetization tools.
No live stream has drawn as many viewers as a network broadcast – yet. Nevertheless, audiences for live streaming content continue to grow. And, when content producers use the latest technology and tools for smooth delivery of live streaming content that’s unavailable on broadcast television, they have the opportunity to meet the demand for niche content and create a valuable connection with a new community of viewers.