The single most important marker of network truth is the packet. Packets are the containers of data transmitted through a network and the basic unit of analysis for any network monitoring effort.
They tell network operations and security teams what is happening in real time, allowing them to address network issues and security problems as they arise. However, packets also have a longer-term value: they help an organisation understand the true scope of activity on its network over time. As a result, packets need to be retained for strategic learning and for forensic evaluations of past events.
Yet, for various reasons, many organisations cannot hold on to these valuable assets, and that inability ultimately hobbles their efforts to solve the deeper problems in their networks.
Why network packets are important
As the lynchpin of network monitoring, packets allow an organisation to see the details of the traffic flowing through its network. With that visibility, teams can identify and analyse network problems and security issues. For instance, when applications experience latency or connectivity issues, detailed packet analysis can reveal the root cause: network congestion, faulty hardware, or misconfigured settings.
Crucially, packets help optimise network performance by revealing granular problems in traffic and allowing analysts to reconfigure network devices, update policies, or otherwise upgrade infrastructure to improve performance. On top of that, they help detect and mitigate security threats, offering insights into unauthorised access, malware, and other malicious activities.
That visibility is necessary both for gaining a broad, top-down understanding of network performance and for drilling down to troubleshoot complex problems and pinpoint bottlenecks. From the broad perspective, packets provide the information that is fundamental to benchmarking network performance and understanding the nature of a network and its traffic patterns over the long term.
Drilling down, packets furnish an organisation with a detailed view of application performance, allowing it to understand how different applications affect broader network performance. This is especially important for spotting application-level problems that might not be visible with traditional monitoring tools. The same goes for Quality of Service (QoS) and Quality of Experience (QoE), for which packet data illustrates the actual end-user experience. Analysts can then check whether important applications receive the resources they need and how network policies prioritise traffic. For example, video conferencing tools and VoIP services, which are sensitive to latency and jitter, can be monitored to ensure they receive priority over less critical traffic.
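To make that concrete, here is a minimal, illustrative Python sketch of the kind of measurement such monitoring relies on: it estimates inter-arrival jitter for a VoIP-style stream from per-packet timestamps, using the smoothed estimator described in RFC 3550. The function name and the example timestamps are hypothetical stand-ins for values pulled from captured packets.

def interarrival_jitter(arrival_times, send_times):
    # Estimate jitter (in seconds) from per-packet send and arrival timestamps,
    # smoothing each new transit-time difference as described in RFC 3550.
    jitter = 0.0
    prev_transit = None
    for arrived, sent in zip(arrival_times, send_times):
        transit = arrived - sent
        if prev_transit is not None:
            d = abs(transit - prev_transit)
            jitter += (d - jitter) / 16       # RFC 3550 smoothing factor
        prev_transit = transit
    return jitter

# Hypothetical VoIP stream: packets sent every 20 ms, arriving with slight variation.
send_times = [i * 0.020 for i in range(5)]
arrival_times = [0.051, 0.072, 0.090, 0.113, 0.131]
print(f"Estimated jitter: {interarrival_jitter(arrival_times, send_times) * 1000:.2f} ms")

A monitoring pipeline would feed this kind of calculation with timestamps taken directly from captured packets, flagging a stream for attention when its jitter drifts past the threshold the application can tolerate.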
What they contain
Packets are the literal containers of the data sent through a network. They provide raw, unfiltered evidence, free of the interpretation and distortion that come with processed or summarised information. In this sense, they are a direct representation of a digital transaction. One of the reasons captured packets are so reliable is that the capture records traffic exactly as it crossed the wire: it is not sampled, aggregated or reinterpreted after the fact, so the record remains whole and accurate. As such, packets are the closest thing an organisation has to ground truth about network activity.
A packet generally contains three parts. The header comes first, carrying addressing and control information: source and destination IP addresses, ports, protocol identifiers and length fields, from which analysts can work out the hosts, services, applications and users involved (capture tools also record a timestamp for each packet they see). Next comes the payload, the actual data being carried. That is then followed by a trailer, otherwise known as a footer, which is typically used for error checking.
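To make the header/payload split concrete, here is a minimal, illustrative Python sketch, not tied to any particular capture tool and using a hand-built sample packet, that unpacks the fixed 20-byte portion of an IPv4 header from raw packet bytes and separates it from the payload.

import struct

def parse_ipv4(packet: bytes) -> dict:
    # Unpack the fixed 20-byte IPv4 header: version/IHL, DSCP/ECN, total length,
    # identification, flags/fragment offset, TTL, protocol, checksum, source, destination.
    (version_ihl, _tos, total_len, _ident, _flags_frag,
     ttl, proto, _checksum, src, dst) = struct.unpack("!BBHHHBBH4s4s", packet[:20])
    ihl = (version_ihl & 0x0F) * 4            # header length in bytes
    header = packet[:ihl]                     # addressing and control information
    payload = packet[ihl:total_len]           # the data actually being carried
    return {
        "src": ".".join(str(b) for b in src),
        "dst": ".".join(str(b) for b in dst),
        "protocol": proto,                    # e.g. 6 = TCP, 17 = UDP
        "ttl": ttl,
        "header_bytes": len(header),
        "payload_bytes": len(payload),
    }

# Hand-built example: a 20-byte IPv4/UDP header followed by a 5-byte payload.
sample = struct.pack("!BBHHHBBH4s4s", 0x45, 0, 25, 0, 0, 64, 17, 0,
                     bytes([192, 0, 2, 1]), bytes([198, 51, 100, 7])) + b"hello"
print(parse_ipv4(sample))

In a real deployment the bytes would come from a capture library or a stored capture file rather than being built by hand; the point is simply that the header alone already identifies who was talking to whom, over which protocol, and how much data moved.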
Why you need to hang on to packets
Packets should be retained for a number of reasons. The first is the granular detail and value held within them, which allows an organisation both to solve real-time problems and to think about the network strategically. From that point of view, packets are crucial data points for seeing the whole picture of the network, working out long-term issues, and making wholesale improvements and necessary optimisations.
This plays a very important part in security forensics. Once a security incident has happened, defenders need to backtrack to see how the attack unfolded. This is especially important given that the median dwell time for a breach is 10 days, and in many cases an attacker lurks on an organisation’s infrastructure for months, stealing data and corrupting systems.
"Take, for example, the SunBurst attacks of 2020. When a malicious actor infected SolarWinds and their downstream customers, it took them 14 months to actually discover the attack. The only way they could trace that compromise back to the first point of breach is to dig through the traffic data for 14 months to find the initial point of compromise."
Furthermore, packet data is increasingly necessary for compliance. Packets can be used, and often are, as evidence that an organisation is complying with a given regulation. This is especially true of data protection regulations, such as the EU’s General Data Protection Regulation, under which organisations must document their efforts to handle personal data securely and protect consumer privacy.
Organisations must be able to capture and store those packets, which requires significant amounts of storage space.
Why organisations struggle to retain packets
The traffic volumes that networks have to handle have exploded in recent years. The arrival of new technologies such as the IoT has dramatically increased the amount of traffic crossing any given network, and at the same time people are simply using the internet more. It has been estimated that in the past five years, the average organisation’s traffic volume has doubled, if not tripled. In fact, according to IBM, 90% of the world's data has been produced in the last few years. Simply put, we’re handling more data than ever, in more complex ways than ever.
In turn, that growth increases storage demands and undermines the ability to retain packet data. As storage needs increase, so do the costs: one study from McKinsey found that spending on data between 2019 and 2021 increased by 50% compared with the previous three years.
Vendor choice can also become a real problem if a packet capture solution handles storage inefficiently. For example, some solutions capture the whole packet, which is comparatively large, when they only need to capture the packet header, which is lightweight. Organisations must choose tools carefully to balance the need for comprehensive data with storage efficiency.
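As a rough, back-of-the-envelope sketch of why that choice matters, the short Python calculation below compares the raw storage needed per day for full-packet capture against header-only capture. Every figure in it (link rate, utilisation, average packet size, snap length) is an assumption chosen purely for illustration.

# All figures below are illustrative assumptions, not measurements.
AVG_PACKET_BYTES = 800        # assumed average packet size on the monitored link
SNAP_BYTES = 96               # assumed snap length that keeps headers only
LINK_RATE_GBPS = 10           # assumed capture rate of the monitored link
UTILISATION = 0.4             # assumed average link utilisation
SECONDS_PER_DAY = 86_400

bytes_per_day = LINK_RATE_GBPS * 1e9 / 8 * UTILISATION * SECONDS_PER_DAY
packets_per_day = bytes_per_day / AVG_PACKET_BYTES

full_capture_tb = bytes_per_day / 1e12
header_only_tb = packets_per_day * SNAP_BYTES / 1e12

print(f"Full-packet capture: ~{full_capture_tb:,.1f} TB per day")
print(f"Header-only capture: ~{header_only_tb:,.1f} TB per day")

Under these assumptions, full capture consumes roughly 43 TB per day while header-only capture needs about 5 TB, which is the kind of gap that decides whether months of retention is affordable. The trade-off, of course, is that header-only capture discards the payload that some forensic investigations rely on.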
The packet is an invaluable data point for managing and optimising the network. The ability to hold on to packets for extended periods allows organisations to address immediate issues and plan for future network needs. It enables a proactive approach to network management, ensuring that systems remain robust, secure, and capable of meeting the demands of an increasingly connected world.
Understanding how a network operates and evolves is fundamental in times of incredible technological change. It’s not just a way to manage in changing times but an important platform for further innovation, too. That can only happen with the historical data that packet retention provides. Organisations need to think hard about keeping this data for as long as possible, and as efficiently as possible.