Part Four: Data Visibility
Information Overload
Fort Knox is located in a remote section of Kentucky and receives very few visitors. Unfortunately, the reality for modern networks is not quite the same. For most networks, the volume of data that needs to be processed is overwhelming. The problem for network and security monitoring is therefore not a lack of information, but rather the capability to process and share a vast amount of data.
Network intelligence can be collected from a variety of data sources, ranging from the log files on thousands of network computers to the raw data traversing the wires. In many networks, the raw data of communications occurring on the network can exceed several terabytes (10^12 bytes) each day. When faced with the sobering risks we've outlined in earlier discussions and pushed up against prohibitive costs for storing such data, it is easy to understand why many organizations throw their hands up and stop trying to make sense of it all. In a sense, securing an enterprise network these days is harder than securing Fort Knox!
Process As You Go
The conventional approach to network monitoring is to store the data as long as fiscally possible so that it is available for analysis when needed. However, innovative researchers in physical security as well as in network security soon realized that processing data as it arrived was more practical than storing it all for later review – much like the guards at Fort Knox actively watch for intruders rather than merely recording video surveillance and reviewing the tapes when they get around to it.
In recent years, international security and intelligence agencies have therefore made deliberate efforts to enable all relevant threat information to be analyzed in near real time. This approach allows physical security experts to make more informed decisions because they are processing more data. The value of intelligence is established not by how much data is stored, but by how much data is intelligently processed.
Visibility and Intelligence
In network security we are now left with a simple hypothesis for success. The quality of actionable network intelligence depends upon the breadth and depth of processed data and the ability of the processor to accurately classify that data. The best kind of data to feed into the processor is data in its least processed (most raw) form. With regard to breadth, complete visibility would require every communication going in, out of or around the network to be fed into the processor. With regard to depth, it would require every bit of those communications to be fed into the processor. As discussed below, there are various methods for collecting and analyzing this network data.
Network Probes
Traditional network monitoring solutions require network data to flow through them (inline), or for a copy of the traffic to be fed into them (spanned). This requires many probing devices (sometimes hundreds) to be distributed across the network. Signature-based processors require this type of communication visibility so that patterns can be matched in the data stream. This data collection method provides the highest level of data depth, but achieving complete coverage (breadth) may be technically impossible, if not fiscally implausible. Compared with Fort Knox, it would be akin to analyzing every word spoken in every conversation held inside the fort and within a several-mile radius of it (which, as explained above, is actually far more complicated when applied to a large, corporate network).
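To illustrate why signature-based processors need the full data stream, here is a minimal, hypothetical sketch of signature matching: the patterns and the scanning function are illustrative only, not any vendor's actual detection engine, and a real system would match thousands of signatures against reassembled streams, not single payloads.

```python
# Hypothetical signatures: byte patterns mapped to threat descriptions.
# Real signature sets are far larger and more sophisticated.
SIGNATURES = {
    b"\x90\x90\x90\x90": "NOP sled (possible shellcode)",
    b"cmd.exe /c": "suspicious command execution string",
}

def scan_payload(payload: bytes) -> list:
    """Return the descriptions of any known signatures found in a raw payload.

    Note that this only works if the probe can see the payload bytes at
    all -- which is why signature detection demands inline or spanned
    visibility into the data stream.
    """
    return [name for pattern, name in SIGNATURES.items() if pattern in payload]

hits = scan_payload(b"GET /run?x=cmd.exe /c whoami HTTP/1.1")
# hits -> ["suspicious command execution string"]
```

The key limitation follows directly: if a probe is not positioned where a given conversation flows, that conversation is simply invisible to the matcher.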
Detection Methods Recap
As we discussed in Part 2 of this series, behavioral-based detection does not require the full data streams that signature detection needs. While signature-based detection has value, it is limited in its ability to catch emerging and advanced threats. Signature-based mechanisms have a place as an ancillary security source, but to handle modern threats, behavioral-based detection is critical to network protection.
NetFlow
To obtain the broadest visibility in network security, information can be collected from the networking devices using NetFlow. NetFlow is a technology created by Cisco that logs every communication a networking device observes; these records can then be forwarded to a NetFlow collector for analysis. If we were to draw a comparison to physical security, it would be a technology that logs the participants of every conversation, how long they spoke, the purpose of their conversation and more – without logging the actual words (the data). NetFlow does not contain full packet capture information, but it does provide critical components of network communications that can be used for behavioral analysis, including the source, destination, ports/protocols used, and the length and size of the communication.
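A simple sketch makes the distinction concrete. The record below models the kind of metadata a flow record carries; the field names are illustrative (real NetFlow v5/v9 exports use a binary wire format), but the point holds: everything about the conversation is captured except the payload itself.

```python
from dataclasses import dataclass

# Illustrative model of a flow record's metadata -- field names are our
# own, not the actual NetFlow export format.
@dataclass
class FlowRecord:
    src_ip: str        # who initiated the conversation
    dst_ip: str        # who they spoke to
    src_port: int
    dst_port: int
    protocol: str      # e.g. "TCP", "UDP"
    packets: int       # length of the conversation in packets
    num_bytes: int     # size of the conversation
    duration_s: float  # how long they spoke

flow = FlowRecord("10.0.0.5", "203.0.113.9", 49152, 443, "TCP",
                  packets=42, num_bytes=31337, duration_s=2.5)
# Note: there is no payload field -- only who spoke, to whom,
# for how long, and how much was exchanged.
```

Because each record is tiny compared to the traffic it summarizes, a collector can retain visibility across the entire network at a fraction of the storage cost of full packet capture.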
StealthWatch and NetFlow
The challenge in handling NetFlow records is that the amount of information can be overwhelming. In order to avoid information overload, incident responders need actionable intelligence that points them in the right direction. This is where Lancope's StealthWatch System shines. By applying more than 130 behavioral algorithms to the immense amount of NetFlow data, StealthWatch is able to shine a light on network security and performance issues. Built from the ground up to detect unknown, emerging threats, StealthWatch empowers responders to see threats from both common criminals and elite attackers without relying on signature updates. In other words, it provides the type of comprehensive visibility that the surveillance personnel at Fort Knox need in the physical realm.
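To show the flavor of behavioral analysis (this is our own toy example, not one of StealthWatch's actual algorithms), consider flagging a host whose outbound traffic volume deviates sharply from its historical baseline. No signature is involved; the alarm comes purely from the behavior departing from the norm.

```python
import statistics

def is_anomalous(history: list, today: int, threshold: float = 3.0) -> bool:
    """Flag `today` if it lies more than `threshold` standard deviations
    above the mean of a host's historical daily byte counts.

    A hypothetical behavioral check: it needs only flow metadata
    (byte counts), never the packet payloads themselves.
    """
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return stdev > 0 and (today - mean) / stdev > threshold

# Illustrative baseline: a host's outbound bytes per day.
baseline = [1200, 1350, 1100, 1280, 1320]

is_anomalous(baseline, 1300)    # within normal variation -> False
is_anomalous(baseline, 250000)  # exfiltration-sized spike -> True
```

Production systems apply many such checks at once and baseline each host individually, but the principle is the same: deviation from established behavior, not a known pattern, triggers the alert.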
Wrap Up
We can only thwart our attackers if we are able to intelligently process all relevant data. Secure networks are built to process all relevant network data as it is made available so that actionable intelligence is produced. In the next installment of this series, we will examine the different types of criminals that are hoping to become more acquainted with your network.