The first part of this CMMC blog series explained how Gigasheet could help organizations address several CMMC Level 3 practices within the Audit and Accountability and Situational Awareness domains. This blog walks through real-world examples of the previously discussed CMMC requirements using Gigasheet. Please keep in mind that this blog does not constitute CUI guidance, and all demonstrations below were performed using fake but relevant data.
AU.2.041 - Ensure the actions of individual system users can be uniquely traced to those users so they can be held accountable for their actions.
As previously mentioned in the first part of this blog series, Gigasheet can help you quickly identify missing information in system event logs by observing the columns in a sheet. Take, for instance, AWS VPC flow logs. The default format does not include TCP Flags, which are essential to identifying the source of a network connection and equally critical for conducting network traffic analysis and forensics. The default VPC flow log format includes all version 2 fields, but the TCP Flags field is a version 3 field which you must manually add to the VPC flow log configuration in order to log it.
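If you need to add the field, the snippet below is a minimal sketch of creating a flow log with a custom format that appends "tcp-flags" to the default version 2 fields, using boto3. The VPC ID, log group name, and IAM role ARN are placeholders you would replace with your own.

```python
import boto3

ec2 = boto3.client("ec2")

# Default version 2 fields plus the version 3 "tcp-flags" field.
log_format = (
    "${version} ${account-id} ${interface-id} ${srcaddr} ${dstaddr} "
    "${srcport} ${dstport} ${protocol} ${packets} ${bytes} "
    "${start} ${end} ${action} ${log-status} ${tcp-flags}"
)

ec2.create_flow_logs(
    ResourceIds=["vpc-0123456789abcdef0"],  # placeholder VPC ID
    ResourceType="VPC",
    TrafficType="ALL",
    LogDestinationType="cloud-watch-logs",
    LogGroupName="vpc-flow-logs",  # placeholder log group
    DeliverLogsPermissionArn="arn:aws:iam::123456789012:role/flow-logs-role",  # placeholder
    LogFormat=log_format,
)
```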
Figure 1 - Default AWS VPC Flow Log Version
Unless you regularly work with VPC flow logs, you will probably not realize that critical fields are missing until you need them to investigate an incident or troubleshoot an issue. Hence, the process you follow to enable a new system or network event log, or to modify an existing one, should include event log validation. One way to do that is by uploading a log sample to Gigasheet and observing the columns representing the fields available in an event log. As shown in Figure 2, the sample VPC flow log does not include the "tcp-flags" field.
Figure 2 – VPC Flow Log Sample
Another way to identify missing fields in this particular example is to group the "version" column, which quickly reveals that no version 3 fields are present.
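For readers who prefer to script the same sanity check, here is a rough pandas equivalent. The file name is illustrative, and the column names assume the sample was parsed using AWS's standard flow log field names.

```python
import pandas as pd

# Illustrative file name; assumes the sample was parsed into named columns.
logs = pd.read_csv("vpc_flow_log_sample.csv")

# Flag any expected fields missing from the sample.
expected = {"version", "srcaddr", "dstaddr", "srcport", "dstport", "tcp-flags"}
missing = expected - set(logs.columns)
print("Missing fields:", missing or "none")

# The pandas equivalent of grouping the "version" column in Gigasheet:
print(logs.groupby("version").size())  # no version 3 rows -> v3 fields never logged
```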
AU.2.042 - Create and retain system audit logs and records to the extent needed to enable the monitoring, analysis, investigation, and reporting of unlawful or unauthorized system activity.
There is not much to demonstrate here; simply move your log files to Gigasheet for long-term storage and take advantage of Gigasheet's rich data analysis capabilities without worrying about storage limitations or long processing times.
AU.2.043 - Provide a system capability that compares and synchronizes internal system clocks with an authoritative source to generate time stamps for audit records.
Imagine that you are investigating a potential security incident, and all you have at your disposal are log files from various systems with different timestamps. Event correlation would be very challenging in a situation like this, if not nearly impossible. For instance, Figure 3 shows two log sets from two different systems, including AWS VPC flow logs (left) and network packet capture logs (right).
Figure 3 – VPC Flow Log Sample and PCAP
Both logs' timestamps are in Unix time format, which represents the number of seconds that have elapsed since 00:00:00 UTC on 1 January 1970. According to Daniel J. Bernstein, the American mathematician and cryptologist, "the difference between two UNIX time values is a real-time difference measured in seconds, within the accuracy of the local clock". While Gigasheet cannot correct for clock drift between the source systems, it can convert Unix time to a standardized time format and time zone, making the events much easier to correlate. Furthermore, Gigasheet can create a timeline of events from multiple log files, making it easier to search, group, and filter values across various log sources.
In the example below, I uploaded the two log files illustrated in Figure 3 to Gigasheet and used the "Cleanup Unix time" function to convert the time column to CST. I then created a timeline of events using both files, which allowed me to look for a particular IP address, hostname, port, protocol, or any value of interest across the two files.
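Gigasheet's "Cleanup Unix time" is a point-and-click feature; for readers who want to reproduce the idea outside the UI, the pandas sketch below converts epoch seconds to a common time zone and interleaves two logs into one timeline. The file names and epoch column names are assumptions, and "America/Chicago" stands in for CST.

```python
import pandas as pd

flow = pd.read_csv("vpc_flow_logs.csv")  # assumes a "start" epoch column
pcap = pd.read_csv("pcap_export.csv")    # assumes a "frame_time_epoch" column

# Convert Unix time (seconds since 1970-01-01 UTC) to a common zone.
flow["time"] = pd.to_datetime(flow["start"], unit="s", utc=True).dt.tz_convert("America/Chicago")
pcap["time"] = pd.to_datetime(pcap["frame_time_epoch"], unit="s", utc=True).dt.tz_convert("America/Chicago")

# Tag each record with its source, then merge into one sorted timeline.
flow["log_source"] = "vpc-flow"
pcap["log_source"] = "pcap"
timeline = pd.concat([flow, pcap], ignore_index=True).sort_values("time")
```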
AU.2.044 - Review audit logs.
Reviewing audit logs will vary in complexity depending on what you are looking for and the number of log sources you are searching across. For instance, in the example below, I analyzed IIS logs for unusual web requests resulting in potential data exfiltration, unauthorized access, or denial of service. I started the analysis by focusing on four columns: HTTP method, URL, URI query string, and HTTP status code. By grouping the HTTP status column and expanding the HTTP 200 group, I could see all successful requests, which quickly revealed what appeared to be multiple successful web application attacks against the web server. However, without further analysis, it is impossible to tell whether the attacks actually succeeded or whether the server simply returned a standard error page with an HTTP 200 status code in response to the malicious requests.
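A rough pandas version of that triage follows. The column names use the standard IIS W3C field names (cs-method, cs-uri-stem, cs-uri-query, sc-status), and the suspicious-pattern regex is only a starting point, not a complete attack signature list.

```python
import pandas as pd

iis = pd.read_csv("iis_logs.csv")  # parsed W3C log with standard field names

# Group by status code, like expanding the HTTP 200 group in Gigasheet.
print(iis.groupby("sc-status").size())
ok = iis[iis["sc-status"] == 200]

# Surface query strings with common injection/traversal markers for review.
suspicious = ok[ok["cs-uri-query"].str.contains(
    r"\.\./|union\s+select|<script", case=False, na=False)]
print(suspicious[["cs-method", "cs-uri-stem", "cs-uri-query"]])
```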
AU.3.045 - Review and update logged events.
Practice AU.3.045 requires organizations to regularly re-evaluate systems' logged security events and determine which events need to be added, modified, or deleted. Consider the simple network architecture in Figure 4 and the following logging requirement:
Figure 4 – Sample AWS Architecture
To validate the network architecture's compliance with the requirement outlined above, I sent test traffic from an external machine to the target system on the diagram (a web server) and uploaded the following logs to Gigasheet:
I created a timeline of events to make it easier to correlate the three files and began analyzing the logs by searching for the originating machine's IP address (96.8.123.9). I would have expected to find the originating IP address captured in all three log sources, but as illustrated below, it only appeared in the VPC flow logs and PCAP files. By grouping the "source" column in the "Timeline" sheet, Gigasheet quickly revealed that one or more log sources were missing.
If we take a closer look at the network architecture diagram in Figure 4, we can see an application load balancer in front of the web server. The application load balancer accepts web requests from external sources and forwards them to the backend web server, replacing the originating IP address with its own. Consequently, the web access logs capture the load balancer's IP address rather than the true source of the request. Therefore, the application load balancer logs need to be enabled to comply with the specified logging requirement.
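To reproduce that check outside Gigasheet, a sketch like the following would count, per log source, how many records mention the originating address. The timeline file and "log_source" column follow the convention from the earlier sketch.

```python
import pandas as pd

timeline = pd.read_csv("timeline.csv")  # merged logs, tagged with a "log_source" column

# Which log sources captured the originating IP anywhere in their fields?
mask = (
    timeline.astype(str)
    .apply(lambda col: col.str.contains("96.8.123.9", regex=False))
    .any(axis=1)
)
print(timeline[mask].groupby("log_source").size())
# Only the flow log and PCAP sources appearing here shows the web access
# logs never recorded 96.8.123.9 -- the load balancer rewrote the source.
```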
AU.3.046 - Alert in the event of an audit logging process failure.
While Gigasheet cannot alert you when a system's audit logging process fails, it can make it easier to detect when systems stop generating logs. In the example below, I uploaded a log file containing web access logs from various web servers. I extracted the logs from the web servers on September 26, 2021, so I would have expected to see logs from all servers up to the extraction date. By grouping the "date" column followed by the "source" column, it is possible to identify which log sources stopped generating logs before the extraction date.
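Scripted, the same check might look like this; the file and column names are assumptions, and the extraction date comes from the example above.

```python
import pandas as pd

web = pd.read_csv("web_access_logs.csv", parse_dates=["date"])  # assumed columns

# Latest entry per server -- grouping "date" then "source" in Gigasheet terms.
last_seen = web.groupby("source")["date"].max()

extraction_date = pd.Timestamp("2021-09-26")
stale = last_seen[last_seen < extraction_date]
print("Sources that stopped logging before the extraction date:")
print(stale)
```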
AU.3.048 - Collect audit information into one or more central repositories.
Gigasheet's Timeline function enables you to aggregate multiple log files into a single sheet to aid incident investigations and data analysis. In the example below, I uploaded several rotated alert files from Wazuh, an open-source security information and event management (SIEM) system. I then combined them into a single sheet to create a holistic picture of all alerts triggered within a specified period.
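If you wanted to pre-merge rotated Wazuh alert files before uploading, a sketch like this would do it. It assumes the files are in Wazuh's JSON Lines alert format and that the glob pattern matches your rotation naming.

```python
import glob
import pandas as pd

# Rotated Wazuh alert files in JSON Lines format; the pattern is illustrative.
paths = sorted(glob.glob("alerts-*.json"))
frames = [pd.read_json(path, lines=True) for path in paths]
alerts = pd.concat(frames, ignore_index=True)

print(f"{len(alerts)} alerts combined from {len(paths)} rotated files")
```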
AU.3.051 - Correlate audit record review, analysis, and reporting processes for investigation and response to indications of unlawful, unauthorized, suspicious, or unusual activity.
In this example, I uploaded the missing log source from AU.3.045's example to complete the event correlation. With VPC flow, load balancer, and web server logs in hand, I recreated the timeline of events. I then searched for the originating IP address (96.8.123.9) in the Summary column, revealing the previously missing load balancer events. Within the load balancer log, the "srcip" column shows the originating IP address, while the "targetip:port" column displays the target IP address (10.50.5.145), completing the connection trace from start to finish.
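A scripted equivalent of that trace might filter the timeline's Summary column for the originating address and then inspect the load balancer's source and target columns; the file name is illustrative.

```python
import pandas as pd

timeline = pd.read_csv("timeline_complete.csv")  # now includes the ALB logs

# Filter the Summary column for the originating address, as in Gigasheet.
trace = timeline[timeline["Summary"].str.contains("96.8.123.9", regex=False, na=False)]

# The load balancer rows carry both ends of the hop: client in, target out.
print(trace[["log_source", "srcip", "targetip:port"]])
```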
AU.3.052 - Provide audit record reduction and report generation to support on-demand analysis and reporting.
To demonstrate how Gigasheet can address this requirement, I uploaded a Squid web proxy log and used the pivot table and charting functions to provide a graphical representation of the average number of bytes per website. Gigasheet calculated the average bytes per unique URL and generated a bar graph displaying the results in descending order.
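The same reduction can be sketched with pandas and matplotlib; the file and column names are assumptions about how the Squid log was parsed.

```python
import pandas as pd
import matplotlib.pyplot as plt

squid = pd.read_csv("squid_access.csv")  # assumes "url" and "bytes" columns

# Average bytes per unique URL, descending -- the pivot Gigasheet built.
avg_bytes = squid.groupby("url")["bytes"].mean().sort_values(ascending=False)

avg_bytes.head(20).plot.bar()  # top 20 keeps the chart readable
plt.ylabel("Average bytes")
plt.tight_layout()
plt.show()
```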
SA.3.169 - Receive and respond to cyber threat intelligence from information sharing forums and sources and communicate to stakeholders.
Gigasheet has built-in data enrichment capabilities that enable you to rapidly identify malicious IP addresses or file hashes in log files. It also allows you to integrate with third-party threat feeds, such as Recorded Future and VirusTotal. In the first example below, I uploaded an auth.log file from a public-facing Ubuntu server and enriched the remote host column using Gigasheet's built-in threat feeds. The results showed multiple malicious hosts attempting to gain unauthorized access to the server via SSH.
In the second example below, I created a list of malicious IP addresses from multiple US-CERT bulletins. I then compared the list against the remote host column in the auth.log sheet, resulting in thirteen (13) positive matches.
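Programmatically, that comparison is a simple membership test. The file names and the "remote_host" column are illustrative.

```python
import pandas as pd

auth = pd.read_csv("auth_log.csv")  # parsed auth.log with a "remote_host" column

# One IP per line, compiled from the US-CERT bulletins (illustrative file).
with open("uscert_ips.txt") as f:
    bad_ips = {line.strip() for line in f if line.strip()}

matches = auth[auth["remote_host"].isin(bad_ips)]
print(matches["remote_host"].nunique(), "distinct malicious hosts matched")
```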
Don't have a Gigasheet account? Sign up now!