Quietly, CFO Matt Ellis told Morgan Stanley, “The company’s Mobile Edge Computing platform, which will enable real-time enterprise applications, is expected to launch in fourth-quarter 2019.”
As previously reported, Verizon has been showing off an Edge Network designed for 15-25 milliseconds of latency on 5G and 25-40 ms on LTE. Verizon is not just building 5G; it is rebuilding the entire network, largely for cost reasons. Latency will come down significantly.
Here’s the quote from a Seeking Alpha transcript:
I’d say it’s more capabilities as opposed to like big physical assets. We have, as you think about the COs we have, all the C-RAN hubs, and soon, you have a lot of assets, which are going to act as a edge compute. So it’s more the capabilities to take those assets and turn them into the services that would differentiate our offerings, certainly, to all our customers, but very much to our enterprise and business customers.
As a key technical leader with the Intelligent Edge Network (iEN) and the Evolved Network Service Edge (eNSE) Network Support Team, you will be responsible for all technical support functions for the Verizon Wireless Edge. This includes both distribution areas (SAP Sites) and core network equipment centers (NEC Sites). Responsibilities include supporting both the legacy and new edge networks and the migrations required to move to the new networks and decommission the legacy networks over time. Examples include the migration of a variety of wireless applications to the eNSE and the migration of existing wireless edge routing functions, such as the Choke Router and the Provider Edge (PE) Router, into the new iEN Multiservice Edge Router being placed in all SAP locations in 2019.
How CDNs can adapt to the cloud computing era
October 3, 2018
“Build like the cloud; deliver like the cloud,” is an internal mantra Verizon Digital Media Services’ Development and Product teams have been using for the last two years.
What is cloud computing?
Cloud computing is the process of making the deployment and scaling of applications easier by using remote servers, virtualization, Infrastructure as a Service (IaaS) and orchestration tools on the internet, as opposed to using local computers and servers. And it’s quickly becoming the standard way that software is built, deployed and delivered.
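The definition above can be made concrete with a toy sketch of what "cloud-style" deployment means in practice: declare a desired state and let an orchestration loop converge on it, rather than hand-configuring individual servers. All names here are illustrative, not any real provider's API.

```python
class Instance:
    """A stand-in for a remote virtualized server."""
    def __init__(self, app, region):
        self.app = app
        self.region = region

def reconcile(desired_count, running, app, region):
    """Scale the running instance list up or down to match desired_count."""
    while len(running) < desired_count:
        running.append(Instance(app, region))   # "provision" a remote server
    while len(running) > desired_count:
        running.pop()                           # "terminate" a surplus server
    return running

fleet = reconcile(3, [], "storefront", "us-east")
print(len(fleet))  # 3
fleet = reconcile(1, fleet, "storefront", "us-east")
print(len(fleet))  # 1
```

Real IaaS platforms and orchestrators do the same thing at scale: the developer states how many instances should exist, and the platform makes it so.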
Adapting CDNs to the cloud
The standardization of cloud computing services is putting CDNs (content delivery networks) in something of a predicament. While customers crave the speed and ease of a cloud platform, they’re unwilling to compromise on the performance, reliability, security and scale advantages that purpose-built CDNs still provide. That means CDNs have to continue to meet and exceed customer expectations, while at the same time finding new ways to integrate with cloud computing platforms and tools.
Here are some ways in which we have adapted our CDN to the cloud.
Build CDN services with cloud tools in mind
Customers have always used application program interfaces (APIs) to easily integrate their applications with third-party services. Our Edgecast CDN differentiated itself early on with a foundation of enabling customers and partners to easily self-service their CDN accounts and configurations via portal-based tools and APIs. Increasingly though, customers are adopting APIs and DevOps toolsets to integrate applications, streaming services and other workloads into the cloud. As the application development life cycle progresses from development to testing to staging to production deployments, popular configuration management and orchestration tools such as Puppet, Chef, Salt, Terraform and others are used to automate and regulate these activities and stages.
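The stage-promotion flow described above can be sketched in a few lines. This is a minimal illustration of the pattern (not the API of Puppet, Chef, Salt, or Terraform): the same configuration moves through development, testing, staging, and production, with automated checks gating each stage instead of manual edits.

```python
# Hypothetical stage names and checks, for illustration only.
STAGES = ["development", "testing", "staging", "production"]

def promote(config, current_stage, checks):
    """Advance a config to the next stage only if all its checks pass."""
    idx = STAGES.index(current_stage)
    if idx == len(STAGES) - 1:
        raise ValueError("already in production")
    for check in checks:
        if not check(config):
            return current_stage          # a failing check blocks promotion
    return STAGES[idx + 1]

config = {"origin": "app.example.com", "cache_ttl": 3600}
stage = promote(config, "development", [lambda c: c["cache_ttl"] > 0])
print(stage)  # testing
```

The value of tools like Terraform is that the "config" and the "checks" are declarative and versioned, so every promotion is repeatable.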
One of the ways we have adapted our CDN to support these tools is by using them ourselves. While CDNs have always pushed (and likely always will push) the envelope of performance by optimizing bare-metal servers and full-stack tuning, we have been using cloud environments for development, testing, staging, prototyping, data/analytics and other uses. Utilizing the cloud in these areas has made our processes more agile and robust while letting us learn the art and science of DevOps-centric, testable-by-design, CI/CD-friendly, edge computing development practices.
As we have adopted edge computing, we have started building a set of interfaces and tools we call EdgeControl, specifically to provide cloud functionality to the Edgecast CDN. We looked at which tools are popular and useful in the open-source and DevOps communities, examined how they’re leveraged against computing, storage and network infrastructure, and saw how we could fit our CDN into those tools and workflows more organically.
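The article doesn't show EdgeControl's actual interface, so the following is a hypothetical sketch of the shape such integration might take: CDN configuration expressed as plain data, so DevOps tools can template, diff, and apply it like any other infrastructure resource. Every name and field here is made up for illustration.

```python
import json

def render_cdn_resource(name, origin, rules):
    """Serialize a CDN config to JSON, the way IaC tools emit resources."""
    return json.dumps({
        "resource": "cdn_configuration",   # hypothetical resource type
        "name": name,
        "origin": origin,
        "rules": rules,
    }, indent=2, sort_keys=True)

doc = render_cdn_resource(
    "storefront-edge",
    "https://origin.example.com",
    [{"path": "/static/*", "cache_ttl": 86400}],
)
print(doc)
```

Once the CDN is representable as data like this, it fits "organically" into the same version control, review, and deployment workflows as the application itself.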
Operate CDNs in real time
It’s easy to see the cloud’s appeal to customers; it’s a faster way to deploy software and deliver high-quality streaming content. When an application is deployed into the cloud, it’s dynamic and quick, as well as adaptable; a developer can just as easily spin up one instance or multiple at the same time, often using automated orchestration and deployment tools.
CDNs need to work just as quickly and efficiently. If a customer is launching an application in the cloud, it’s the CDN’s job to ensure it’s incorporated into that custom automation and app deployment process at the same time. Testing the application and then testing the CDN shouldn’t be a two-step process for customers to think about, but rather two things that can happen in tandem. The cloud-based CDN can play an appropriate role in the same testing process, and when it happens concurrently, developers can get data back out of the CDN during testing for enhanced metrics and analytics monitoring. And having a responsive CDN during the initial testing step ensures that the testing environment replicates the eventual production environment as closely as possible: from how the software performs in different regions to how it mitigates potential security issues to how it handles load or offloads certain business logic to edge servers.
At Verizon Digital Media Services, our increasing internal use of cloud infrastructure, tools and automation has shown us that testing the CDN as an afterthought is not only tedious but also less efficient. EdgeControl was born out of working with our customers and partners to discover what they needed, which led us to create technology that works for them and complements our internal processes. It’s not enough to provide API hooks, configuration and propagation; these things also need to happen in real time, so we’re working on making more of our components real time as well, including configuration APIs, ingest, propagation and feedback.
Automate, automate, automate
Finally, it’s vital for a real-time, responsive cloud CDN to have extensive automation capabilities. The more the CDN remains a secondary, manual step in the application deployment process, the less likely it is to stay updated and in sync with application changes, the fewer automated-testing processes will incorporate the CDN component (including any edge-deployed application logic), and the less reliable and predictable the end-to-end deployment process becomes.
From code testing to software deployment, the end goal should be to have customers do as little manually as possible. Instead, they ought to be able to make long-term configurations in the CDN to reflect a new content profile, a new application or API, or even just changes in the application environment, such as new regions being deployed, elasticity changes in deployed instances – or other characteristics of the origin/environment that the CDN could or should be responsive to.
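One way to make the CDN responsive to origin/environment changes like new regions, sketched here with a made-up naming scheme, is to derive the CDN's origin pool from the currently deployed regions instead of editing it by hand:

```python
def origins_for(app, regions):
    """Derive the CDN origin pool from the currently deployed regions."""
    return [f"{app}.{region}.origin.example.com" for region in sorted(regions)]

# Deploying a new region regenerates the pool; nothing is edited manually.
before = origins_for("storefront", {"us-east", "eu-west"})
after = origins_for("storefront", {"us-east", "eu-west", "ap-south"})
added = set(after) - set(before)
print(added)
```

The point is the direction of the dependency: the CDN configuration is computed from the environment, so environment changes can never leave it stale.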
Even better, configuration could be tied to the customer’s existing software configuration tool, so the same automation processes and testing tools work together. The ability to automate in advance doesn’t just make software deployment easier; it also makes it more reliable, since everything can be configured well before anything goes live on the CDN. This has the added benefit of fitting into the more dynamic, continuous integration and deployment model that is becoming more common.
A faster, better CDN
For an application that lives in the cloud, the CDN is the infrastructure that sits in front of it. That means when a customer updates the application’s code, the CDN should automatically be aware of the change and trigger an update in its configuration if need be. Our EdgeControl toolset is being developed in the cloud with an eye toward how that process can be scripted in advance through an automated deployment process to avoid having our customers do a secondary configuration. The result will be a quicker, more efficient and more reliable deployment process.
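The deploy-hook pattern described above can be sketched as follows: after the application's code is updated, the same pipeline step decides whether the CDN configuration needs to change and applies it, so no secondary manual step is required. All function and field names here are illustrative stand-ins.

```python
def deploy(app_version, cdn_config, static_manifest):
    """Deploy the app; update the CDN only if its derived config changed."""
    new_config = dict(cdn_config, static_assets=sorted(static_manifest))
    cdn_updated = new_config != cdn_config
    return {"app": app_version, "cdn": new_config, "cdn_updated": cdn_updated}

# First deploy introduces a static asset, so the CDN config changes;
# a redeploy with the same manifest leaves the CDN untouched.
first = deploy("v2", {"origin": "app.example.com"}, ["/static/app.js"])
second = deploy("v3", first["cdn"], ["/static/app.js"])
print(first["cdn_updated"], second["cdn_updated"])  # True False
```

Because the CDN update is a pure function of the deploy's inputs, it happens exactly when the application changes warrant it and never otherwise.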
Many of these improvements are as natural as asking ourselves, “How can we build this to be more like the cloud?” The answer will emerge in new edge computing capabilities and tools we are rolling out, including improving our configuration propagation times to less than 5 minutes and exposing more native configuration syntax to allow the full power of our edge performance engine to be exposed to developers. We’re also expanding our APIs and command-line interface (CLI) tools to increase scripting and automation integration opportunities with the most common DevOps toolsets and frameworks, and taking the covers off extensive metrics, analytics and other data that can provide real-time feedback on performance, utilization and other factors.
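A sub-5-minute propagation target like the one mentioned above can also be verified from the client side: a deployment script polls the edge until every node reports the new configuration version. This is a generic polling sketch, not a real EdgeControl command.

```python
def propagated(edges, version):
    """True once every edge node reports the expected config version."""
    return all(edge["config_version"] == version for edge in edges)

# Two hypothetical points of presence, one of which has not yet converged.
edges = [{"pop": "lax", "config_version": 7}, {"pop": "nyc", "config_version": 6}]
print(propagated(edges, 7))  # False
edges[1]["config_version"] = 7
print(propagated(edges, 7))  # True
```

In practice a script like this would sit in the deploy pipeline as a gate: the deployment isn't marked complete until the edge has converged.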
We’re excited about our CDN transformation and the promise of enabling cloud applications and developers to interact with our platform more efficiently and natively.