Building on the success of its next-generation firewall business, Palo Alto Networks now leads across multiple competencies in the cybersecurity space, including network security, cloud security and security operations. Palo Alto Networks has a unique ability to integrate new technology quickly to compete in new verticals, so for more insight into how the company is accomplishing this we talked with Tim Junio, SVP of Products, Cortex at Palo Alto Networks and former CEO of recent Palo Alto Networks acquisition Expanse.
Tim was conducting cyber operations for the CIA before he was old enough to drink, performed consulting work for DARPA and helped build out cyber operational capabilities for the U.S. military. He explained how a onetime DARPA prototype became what is now known as attack surface management technology, and how it has since evolved into Palo Alto Networks’ Cortex Xpanse.
“Going back to 2013-14, when we were first thinking about what ultimately became the core technology for Cortex Xpanse today, we observed that the Internet was kind of a mess,” Tim stated. “As soon as we started looking at a large scale for exploitable systems, we found a huge number. The premise for defenders back then was to try and do penetration testing and always be looking for weak links. But the idea that you can in an automated fashion find exposures as soon as they come up, like within minutes, was not a reality that defenders were prepared for.”
Tim compared this situation to what Palo Alto Networks’ Cortex Xpanse is accomplishing today. “Now it’s really taking an attacker’s view of the organization,” he shared. “It asks key questions such as: ‘What applications are exploitable? What systems are available? Are there any misconfigurations?’ It is the bane of security if you don’t really know what you’re protecting against; if you don’t really know what it looks like. So it made sense for Palo Alto Networks to make this acquisition.”
The right formula for security acquisitions
“After folding in multiple acquisitions over the years, Palo Alto Networks has gotten even better at this process,” Tim explained. “Before the acquisition of Expanse even closed, we were talking about where our technology could plug in, including the obvious fit with Palo Alto Networks’ next-generation firewalls to ensure that we’re actually protecting the entirety of an internet protocol space. But there are also some not-so-obvious areas, including opportunities for co-development such as Prisma Cloud. We were able to work with this cloud security product to provide a combined view with Xpanse that can show the customer unmanaged cloud assets and find vulnerable systems within cloud environments so that they can be brought under proper management through the Prisma Cloud product.”
Tim went on to explain how Xpanse is helping Palo Alto Networks to build out data lakes that contain a wealth of security information. He described how they have been prototyping attack surface data to gain a better understanding of how mergers and acquisitions can alter the security situation that a business is facing. “When you add in the complexity of mergers and acquisitions, and business units operating globally, an organization may not really appreciate the vulnerability that it’s facing,” Tim shared. “We show up to customers all the time and remind them, ‘Hey, you’re doing this joint venture, did you know you’re using Alibaba’s cloud? You’re not just an Azure shop anymore.’ And they realize they weren’t centrally tracking that as an organization.”
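At its simplest, the inventory gap Tim describes, assets observed from the outside that nobody is tracking centrally, reduces to a set difference between what scanning observes and what the organization records. The sketch below is illustrative only; the "provider:hostname" identifiers are hypothetical and not the actual Cortex Xpanse data model:

```python
def find_unmanaged(observed_assets, inventory):
    """Return externally observed assets missing from the central inventory.

    Both inputs are iterables of asset identifiers; the
    "provider:hostname" format used here is invented for the example.
    """
    return sorted(set(observed_assets) - set(inventory))

# A joint venture spun up a VM on Alibaba Cloud that the central
# inventory (an Azure-only list) never recorded.
observed = ["azure:web-01", "azure:db-01", "alibaba:vm-jv-1"]
tracked = ["azure:web-01", "azure:db-01"]

print(find_unmanaged(observed, tracked))  # → ['alibaba:vm-jv-1']
```

The hard part in practice is building the `observed` side at internet scale and within minutes of an exposure appearing; the comparison itself is trivial once both views exist.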
How Palo Alto Networks defines XDR
Evolving from the early days of attack surface management technology, XDR brings next-level thinking to the entire concept of vulnerability and threat detection and mitigation. Who better to define what XDR means today than Palo Alto Networks, the company that first coined the term and the thinking behind it? When Tim was asked about the essential criteria that should shape expectations for XDR, he replied that the most important idea is the evolution of endpoint detection and response. “Protection, prevention and detection require joining endpoint data with other data,” he shared. “Basically, if you’re dependent on only one source of information at a time for security, you’re going to miss sophisticated attacks.”
Tim believes that combining endpoint data with network security data is an essential place to start. “If you look at the prior era of endpoint protection, that is where you started to have behavioral analysis, looking at things happening locally on a machine,” he said. “And that obviously was a huge leap in technology that was efficacious for a while, but then adversaries adapted and started doing a better job of obfuscation. We needed a new approach, and joining endpoint data with network data gave us new kinds of visibility. If you’re looking across different data sets, your odds dramatically improve that the attacker is unable to obfuscate across everything.”
Not simply more data
But Tim also clarified the importance of not just consuming data for its own sake. “I think that is the difference between XDR and SIEM,” he stated. “Security Information and Event Management was supposed to be the answer to this problem of the modern SOC. But there are too many alerts and people are overwhelmed, so it doesn’t stop enough attacks. When we look at a data ingestion model, we need to ask if we’re providing anything useful or just aggregating. How much of that data is used in true correlation? While SIEM lets you do that in a highly manual, human-driven way where you need to do much of the data normalization yourself, XDR starts with the highest quality, most important security data, where the intent is not to take 200 different data sources and run queries over them.”
“The difference in how an XDR product would work versus SIEM would be that XDR would normalize between datasets so that you actually know the relationships between them for the highest quality data, and then you run advanced analytics on top of them,” Tim continued. “For XDR the data integration component is fundamental. For our own XDR we do the data integrations natively for the product. We create what we call a story. A story is basically the joined relationships between different data sets, starting with our endpoint agent from Cortex XDR and our next-generation firewalls. But we also bring in third-party data, and we’re perfectly happy to work with competitors’ data, making that available within XDR, joined with either the next-generation firewall or our endpoint.”
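Tim’s “story” idea, normalized records from different datasets joined so their relationships are explicit, can be sketched as a time-windowed join. Everything below is an assumption made for illustration: the field names, the host-IP join key and the 60-second window are hypothetical and not the Cortex XDR schema:

```python
from dataclasses import dataclass

@dataclass
class EndpointEvent:
    host_ip: str   # normalized join key (hypothetical field)
    process: str
    ts: int        # epoch seconds

@dataclass
class FirewallEvent:
    src_ip: str    # normalized join key (hypothetical field)
    dest_ip: str
    ts: int

def build_stories(endpoint_events, firewall_events, window=60):
    """Join records from two normalized datasets when they share a
    host IP and occur within `window` seconds of each other."""
    stories = []
    for ep in endpoint_events:
        for fw in firewall_events:
            if ep.host_ip == fw.src_ip and abs(ep.ts - fw.ts) <= window:
                stories.append({
                    "process": ep.process,
                    "host": ep.host_ip,
                    "destination": fw.dest_ip,
                })
    return stories
```

The join is the point: a process launch alone, or an outbound connection alone, may each look benign, while the combined record is much harder for an attacker to obfuscate.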
Tim concluded by providing a glimpse of what this will all look like as part of the next evolution within the Palo Alto Networks Cortex XDR 3.0 platform. “If you’re a customer of XDR 3.0 and you’re connecting XDR endpoint data, plus let’s say data from Amazon Web Services, plus data from our next-generation firewalls, we’re going to be running our analytics over all of those datasets and we’ll provide you with scored alerts across the datasets within the XDR unified console,” he said. “I would add to that we’re also pushing results into workflows. Our XDR product is a playbook automation product where we can automate a wide range of responses, and we have hundreds of integrations built into that product. If we can’t automate it, we can at least augment the workflow automatically for human analysts and provide as much context as possible to speed up the time to respond. If you look at this holistically, I think that there are four pieces here: There’s the gathering of data, the integration of data, the analysis of data, and then the workflow. And I think what’s really hard is to get all of those things to work well together. And so where we’re really starting to excel is within that overall integration component.”
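The four pieces Tim lists, gathering, integration, analysis and workflow, read naturally as stages of a pipeline. The sketch below only mirrors that outline for the last two stages; the scoring weights, the alert `type` field and the playbook lookup are invented for the example and are not drawn from the Cortex XDR product:

```python
def score_alert(story):
    """Analysis stage: corroboration across datasets raises the score
    (the weights are arbitrary, chosen only for this example)."""
    score = 0
    if story.get("endpoint_hit"):
        score += 40
    if story.get("network_hit"):
        score += 40
    if story.get("cloud_hit"):
        score += 20
    return score

def route(story, playbooks):
    """Workflow stage: run an automated playbook when one matches,
    otherwise hand the scored, contextualized alert to an analyst."""
    score = score_alert(story)
    playbook = playbooks.get(story.get("type"))
    if playbook is not None:
        return {"action": "automated", "playbook": playbook, "score": score}
    return {"action": "analyst_review", "context": story, "score": score}
```

The fallback branch matches Tim’s point that when a response can’t be automated, the system can still augment the human workflow with as much context as possible.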