NetFlow forensic investigations can produce report evidence that can be used in court, as flow data describes the movement of traffic across the network without necessarily describing its contents.
It’s therefore crucial that the NetFlow solution deployed can scale its archive to retain the full context of all flow data, not just the top talkers or the data relating to one tool’s idea of a security event.
The challenge with forensics and flow data is that achieving full compliance requires retaining a data warehouse that can grow to a huge number of flow records.
These records, retained in the data warehouse, may not seem important at the time of collection, but they become critical for uncovering behavior that has been occurring over a long period and for ascertaining the damage that traffic caused. I am speaking broadly here because there are so many different instances where the data suddenly becomes critically important that one or two case studies would not do it justice. Remember: you don’t know what you don’t know, but when you discover what you didn’t know, you need the ability to quantify the loss, or the risk of loss.
How much flow data is enough to retain to satisfy compliance?
In our experience it is usually between 3 and 24 months, depending on the size of the environment and the legal requirements relating to data protection or data retention. For most corporates we recommend 12 months as a best practice. Data retention rules for ISPs in some countries require the ability to analyze traffic for up to 2 years. Fortunately, disk today is cheap and flow is cost effective to deploy across the organization. There is more information about this in our Performance and Security eBook.
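To see why a 12-month archive is usually affordable, a back-of-envelope storage estimate can help. The figures below (sustained flow rate, stored bytes per record) are illustrative assumptions, not measurements from any particular product — substitute values observed in your own environment:

```python
# Back-of-envelope estimate of flow archive size.
# All constants below are assumptions for illustration only.
FLOWS_PER_SECOND = 1_000   # sustained average across all exporters
BYTES_PER_RECORD = 60      # assumed stored size of one flow record
RETENTION_DAYS = 365       # the 12-month best-practice window

records = FLOWS_PER_SECOND * 86_400 * RETENTION_DAYS
size_gb = records * BYTES_PER_RECORD / 1e9

print(f"{records:,} records ~= {size_gb:,.0f} GB")
```

Even at a thousand flows per second sustained, a year of records lands in the low terabytes — well within the reach of commodity disk.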
Once a security issue has been identified, the flow database can be used to quantify exactly which IPs accessed a system and when it was accessed, as well as the impact on dependent systems that the host conversed with, directly or indirectly, on the network before and after the issue.
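This kind of "who touched this host, and when" query is conceptually simple. A minimal sketch, assuming flow records are available as dictionaries with hypothetical `src`, `dst`, and `start` fields (a real deployment would query the flow database rather than an in-memory list):

```python
from datetime import datetime

# Illustrative flow records; field names are assumptions for this sketch.
flows = [
    {"src": "10.0.0.5",    "dst": "10.0.1.20", "start": datetime(2024, 3, 1, 9, 15)},
    {"src": "203.0.113.7", "dst": "10.0.1.20", "start": datetime(2024, 3, 1, 2, 40)},
    {"src": "10.0.0.8",    "dst": "10.0.2.33", "start": datetime(2024, 3, 2, 11, 5)},
]

def who_accessed(records, host, since, until):
    """Return sorted (source IP, start time) pairs for flows into `host`."""
    return sorted(
        (r["src"], r["start"])
        for r in records
        if r["dst"] == host and since <= r["start"] <= until
    )

hits = who_accessed(flows, "10.0.1.20",
                    datetime(2024, 3, 1), datetime(2024, 3, 2))
for src, ts in hits:
    print(src, ts)
```

The same filter, re-run with each suspect IP as the destination, is how the investigation walks outward to the dependent systems the host conversed with.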
Trawling through a huge collection of flow data can be a lengthy task, so it’s necessary to have the ability to run automated and parallel analytics to gauge the damage from a long-term insider threat that could have been dribbling out your intellectual property slowly over a few months.
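One such automated analytic might aggregate outbound bytes per internal source per day and flag sources that exceed a threshold on many days — the "low and slow" pattern that never trips a single-event alert. A minimal sketch; the record layout and thresholds are assumptions for illustration:

```python
from collections import defaultdict

# Illustrative (source IP, day, outbound bytes) tuples; a real analytic
# would stream these aggregates from the flow archive.
flows = [
    ("10.0.0.9", "2024-01-01", 2_000_000),
    ("10.0.0.9", "2024-01-02", 1_800_000),
    ("10.0.0.9", "2024-01-03", 2_100_000),
    ("10.0.0.4", "2024-01-02", 500),
]

# Sum outbound bytes per source per day.
daily = defaultdict(lambda: defaultdict(int))
for src, day, nbytes in flows:
    daily[src][day] += nbytes

def steady_senders(daily_totals, min_days=3, min_bytes_per_day=1_000_000):
    """Sources exceeding a daily byte threshold on at least `min_days` days."""
    return [
        src for src, days in daily_totals.items()
        if sum(1 for b in days.values() if b >= min_bytes_per_day) >= min_days
    ]

print(steady_senders(daily))
```

Here only 10.0.0.9 is flagged: a couple of megabytes a day is invisible on a bandwidth graph, but across months of retained flow records the pattern, and the total volume lost, becomes quantifiable.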