
Report for Jamar and Executive Leadership of StackFull Software,

prepared by Elisabeth Branam

Executive Summary
During the standard onboarding procedure for new Level 1 SOC
Analysts, my mentor, Alice, and I discovered that a configuration file controlling
which users had access to the logs for the Splunk SIEM system had been mistakenly
altered, removing some users who needed that access in order to do their jobs. The
error has been remedied, but the incident revealed some vulnerabilities in the
system that need to be addressed. I propose a few new policies to mitigate those
vulnerabilities: reviewing the permission levels on the files in the system, running
checks to verify that important files have not been tampered with, and performing
regular maintenance checks on both our hardware and software to ensure they
are all in working order. Together, these would shore up the defenses of the Splunk
SIEM system and would require only a few extra steps to implement.

Introduction
On March 7, 2024, an issue was discovered in the files of the Splunk SIEM.
Access to the logs generated by the system had been removed for some members
of the IT team. The logs are a vital component in monitoring and securing the
system and information contained within it. This report is intended to explain
what happened and what we can do to reduce the likelihood of this situation
happening again.

The Issue
As a newly hired member of the IT SOC analyst team, I was familiarizing
myself with the systems of StackFull Software and with the kinds of information I
would be handling every day in my new position. During this orientation, Alice
and I discovered that I was unable to access the logs for the Splunk SIEM (an
integral part of my daily duties here at StackFull). After some digging, we found
that the configuration file had been incorrectly edited by a fellow team
member.

The Solution
To remedy this error, Alice and I logged in over SSH to the system
containing the relevant configuration file. We located the config.conf file using
the command find /opt/splunk -name 'config*', then edited it with vim to add
both myself and Alice as administrators of the system at the end of the file. We
saved the change (pressing Esc, then typing :wq) and made a backup of the
corrected file.
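To make the remediation reproducible, the steps above can be sketched as shell commands. The hostname and directory layout below are placeholders rather than StackFull's actual values, and the edit is shown non-interactively instead of in vim; the sketch works in a temporary directory so it can be run safely anywhere.

```shell
set -eu

# Step 1: connect to the host holding the Splunk configuration.
# (Run manually; hostname is a placeholder, shown as a comment.)
# ssh analyst@splunk-host.example.com

# For this sketch, use a temporary stand-in for /opt/splunk.
SPLUNK_HOME=$(mktemp -d)
mkdir -p "$SPLUNK_HOME/etc"
printf '[admins]\nusers = alice\n' > "$SPLUNK_HOME/etc/config.conf"

# Step 2: locate the configuration file (find needs the -name test
# to match by filename).
CONF=$(find "$SPLUNK_HOME" -name 'config*')
echo "Found: $CONF"

# Step 3: append the corrected administrator list. Interactively
# this was done in vim (Esc, then :wq to save).
printf 'users = alice, elisabeth\n' >> "$CONF"

# Step 4: back up the corrected file.
cp "$CONF" "$CONF.bak"
```

The backup taken in the final step is what makes the integrity checks proposed later in this report possible.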

Preventing Recurrences of the Same Issue
Misconfigured access is a fairly common issue in companies and other
multi-user networks; so common, in fact, that guarding against it is captured by
the CIA Triad, a core component of any cybersecurity policy. C stands for
Confidentiality, I for Integrity, and A for Availability. Together, these three define
which aspects of a system need to be addressed to ensure the security of the
network's files.

Confidentiality (Are our secrets still secrets?)
Integrity (Did anything change since we last looked?)
Availability (Did the lights stay on?)

With respect to this core component of cybersecurity, we can take a few
different steps to ensure this situation does not happen again. Our current setup
appears to be handling the Availability part of the triad well, but I still advise
keeping up a regular maintenance schedule and monitoring logs, such as the
webserver and access logs, to make sure any failures are not due to an issue on
our side.
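A routine log check of the kind described above can be partly automated. As a minimal sketch, the following counts server-side (5xx) errors in a webserver access log; the log path is a placeholder, and the format assumed is the common Apache/Nginx combined log format, which may differ from our actual webserver configuration.

```shell
set -eu

# Stand-in for a real access log (e.g. /var/log/nginx/access.log);
# two sample entries in combined log format.
LOG=$(mktemp)
cat > "$LOG" <<'EOF'
10.0.0.5 - - [07/Mar/2024:10:01:22 +0000] "GET /index.html HTTP/1.1" 200 512
10.0.0.9 - - [07/Mar/2024:10:02:10 +0000] "GET /api/data HTTP/1.1" 500 87
EOF

# Field 9 of the combined log format is the HTTP status code;
# count entries where it starts with 5 (server-side failures).
ERRORS=$(awk '$9 ~ /^5/ {count++} END {print count+0}' "$LOG")
echo "Server errors found: $ERRORS"
```

Run on a schedule (for example via cron), a check like this surfaces availability problems on our side before they are reported by users.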
The other two components of this triad, Confidentiality and Integrity, are
the ones that were threatened by this issue. To secure these more effectively, I
advise auditing the permissions on the files in our system and removing any
permissions for groups that do not absolutely need them. The Principle of Least
Privilege clearly applies here: the file compromised in this case granted full read,
write, and execute access to every user on the system.
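Tightening a file from that fully open state takes only two commands. The sketch below uses a temporary file as a placeholder for the real configuration file; mode 600 (owner read/write, nothing for group or others) is one reasonable least-privilege target, though the right mode depends on which service accounts must read the file.

```shell
set -eu

# Placeholder for the real configuration file.
CONF=$(mktemp)

# The problematic state found during this incident: read, write,
# and execute for every user on the system.
chmod 777 "$CONF"
ls -l "$CONF"

# Restrict to owner read/write only.
chmod 600 "$CONF"
PERMS=$(ls -l "$CONF" | cut -c1-10)
echo "New mode: $PERMS"    # -rw-------
```

An audit pass is just the reverse direction: list permissions first, then flag any file whose mode string shows write access for group or others.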
To monitor the Integrity of this and other important files, I advise
implementing a policy of hashing them, for example with the md5sum command,
so we know whether a file has changed from its last stable, approved version. The
procedure: run md5sum on a pristine backup copy (with write access removed for
everyone, so it remains pristine) and compare its output with the hash of the file
being monitored. If the hashes match, the file has not changed; if they differ, the
file has been altered in some way and the previous version may need to be
restored from the backup. Even a tiny change, such as deleting or adding a single
space, alters the md5sum output.
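The procedure above can be sketched as a small script. File paths here are temporary placeholders; note also that while md5 is adequate for catching accidental edits, a stronger hash such as SHA-256 (sha256sum, same usage) would be preferable if deliberate tampering is a concern, since md5 collisions can be manufactured.

```shell
set -eu

# Placeholders for the monitored file and its pristine backup.
LIVE=$(mktemp)
BACKUP=$(mktemp)
printf 'users = alice, elisabeth\n' | tee "$LIVE" > "$BACKUP"
chmod 444 "$BACKUP"    # remove write access so the copy stays pristine

# Reference hash, computed once from the approved backup.
REF=$(md5sum "$BACKUP" | awk '{print $1}')

# Compare the live file's hash against the reference.
check_integrity() {
    CUR=$(md5sum "$LIVE" | awk '{print $1}')
    if [ "$CUR" = "$REF" ]; then
        echo "OK: file unchanged"
    else
        echo "ALERT: file differs from approved backup"
    fi
}

check_integrity              # prints "OK: file unchanged"
printf ' \n' >> "$LIVE"      # even a single added space...
check_integrity              # prints "ALERT: file differs from approved backup"
```

Scheduled regularly, a check like this turns the one-time fix from this incident into an ongoing control.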

Conclusion
Thankfully, nothing beyond this one access issue was affected, but the
consequences could have been far worse had the change been the work of a
malicious actor rather than a small mistake. By implementing a policy of regularly
running important files through hashing algorithms, performing routine
maintenance and log checks, and applying the Principle of Least Privilege so that
only the people who need access to a file have it, and only with the permissions
they absolutely need, we can help keep the Splunk SIEM's systems (as well as
those of all of our other clients) more secure.