Cross-site request forgery (CSRF) attacks have an often misunderstood yet immediate impact on the victim (not to mention the organization they work for), whose browser has just performed actions they did not intend, on behalf of the attacker.
Consider the critical infrastructure operator performing administrative actions via poorly coded web applications, who unknowingly falls victim to a spear phishing attack. The result is a CSRF-borne attack used to create an administrative account on the vulnerable platform, granting the attacker complete control over a resource that might manage the likes of a nuclear power plant or a dam (pick your poison).
Enough of an impact statement for you?
There's another impact of CSRF attacks, generally less considered but no less important: the malicious requests are attributed to the known good user, and occur in the context of an accepted browser session.
Thus, how is an investigator to fulfill her analytical duties if and when CSRF is deemed the likely attack vector?
I maintain two views relevant to this question.
The first is obvious: vendors and developers should produce web applications that are not susceptible to CSRF attacks. Further, organizations, particularly those managing critical infrastructure, high business impact data, or personally identifiable information (PII), must conduct due diligence to ensure that the products used to provide their services are securely developed.
The second view places the responsibility squarely on the same organization to:
1) capture verbose and detailed web logs (especially the referrer field)
2) store and retain browser histories and/or internet proxy logs for administrators who use hardened, monitored workstations, ideally with little or no internet access
Strong, clear policies and procedures are recommended to ensure that both 1 and 2 are successful efforts.
DETAILED DISCUSSION
Web logs
What follows is an attempt to clarify the benefits of verbose logging on web servers as it pertains to CSRF attack analysis, particularly where potentially vulnerable web applications (all?) are served. The example is supported by the correlative browser history. I've anonymized all examples to protect the interests of applications that are still pending repair.
A known good request for a web application administrative function, as seen in Apache logs, might appear as in Figure 1.
Figure 1: known good administrative request as logged by Apache
As expected, the referrer is http://192.168.248.102/victimApp/?page=admin: a local host making the request via the appropriate functionality provided by the application.
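A reconstructed entry in Apache's combined log format illustrates the idea; the timestamp, request path, status, byte count, and user agent below are invented for illustration, while the source IP and referrer are taken from the example above:

192.168.248.102 - - [15/Nov/2008:10:22:41 -0700] "POST /victimApp/addUser HTTP/1.1" 200 1847 "http://192.168.248.102/victimApp/?page=admin" "Mozilla/5.0"

Note the referrer, the second-to-last quoted field: the request came from the application's own admin page.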
However, if an administrator has fallen victim to a spear phishing attempt intended to perform the same function via a CSRF attack, the log entry might appear as seen in Figure 2.
Figure 2: the same request driven by CSRF, with the attacker's page as the referrer
In Figure 2, although the source IP is the same as the known good request seen in Figure 1, it's clear that the request originated from an unexpected location, specifically http://badguy.com/poc/postCSRFvictimApp.html as seen in the referrer field.
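A reconstructed version of that entry, using the same invented details as before with only the referrer changed, makes the contrast plain:

192.168.248.102 - - [15/Nov/2008:10:31:09 -0700] "POST /victimApp/addUser HTTP/1.1" 200 1847 "http://badguy.com/poc/postCSRFvictimApp.html" "Mozilla/5.0"

Same source IP, same request; only the referrer betrays the attack.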
Most attackers won't be so accommodating as to name their attack script something like postCSRFvictimApp.html, but the offending GET or POST should still stand out via the referrer field.
Browser history or proxy logs
Assuming timestamp matching and enforced browser history retention or proxy logging (major assumptions, I know), the log entries above can also be correlated. Consider the Firefox history summary seen in Figure 3.
Figure 3: Firefox history summary showing the visit to badguy.com followed by the add user request
The sequence of events shows the browser making a request to badguy.com, followed by the addition of a new user via the vulnerable web application's add user administrative function.
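Where the raw history is available rather than a summary view, the same sequence can be pulled directly from Firefox's places.sqlite (Firefox 3 and later). A minimal sketch, assuming sqlite3 is available and run against a copy of the profile's places.sqlite; visit_date is stored as microseconds since the epoch:

sqlite3 places.sqlite "SELECT datetime(v.visit_date/1000000,'unixepoch') AS visited, p.url FROM moz_historyvisits v JOIN moz_places p ON p.id = v.place_id ORDER BY v.visit_date;"

Ordered by visit time, the badguy.com request should appear immediately before the application's add user request.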
RECOMMENDATIONS
1) Enable the appropriate logging levels and format, and ensure that the referrer field is always captured.
For Apache servers consider the following log format:
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-agent}i\"" combined
CustomLog log/access_log combined
For IIS servers, be sure to enable cs(Referer) logging via IIS Manager.
Please note that it is not enabled by default in IIS and that the W3C Extended Log File Format is required; a representative field selection follows.
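A #Fields line for the resulting W3C Extended logs might then read as below; the exact field selection here is an assumption and can be tailored to your environment, so long as cs(Referer) is included:

#Fields: date time c-ip cs-method cs-uri-stem cs-uri-query sc-status cs(User-Agent) cs(Referer)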
2) Retain and monitor browser histories and/or internet proxy logs for administrators who conduct high impact administrative duties via web applications. Ideally, such administrators should use hardened, monitored workstations with little or no internet access.
3) Provide enforced policies and procedures to ensure that 1 & 2 are undertaken successfully.
Feedback welcome, as always, via comments or email.
Cheers.
1 comment:
Cool article and good thoughts about the forensic side of XSRF! Nevertheless, admin workstations with little or no internet access are just unrealistic in "normal security" environments. Security is important, but it all gets dubious if the ability to get your daily work done is reduced that much. Sure, you can use two workstations, one with internet access, one for admin work. That's fine in theory, but people sometimes underestimate how much additional work (and therefore cost) that gain in security really brings along. Some of your tools, programs, and data you simply need on both machines, not to forget doing updates, keeping data synchronized, backups, etc. Sure, it all has its pros and cons. I think that such a restricted or split installation only makes sense in high security areas like power plants, as you said. The most important point in my view is the one you mentioned about capturing verbose and detailed web logs.