
Web Security Testing: Automating Specific Tasks with cURL - Following Redirects Automatically, Checking for Cross-Site Scripting with cURL


1. Following Redirects Automatically

Problem

Many web applications use redirection as part of their regular processing. They send a response such as “302 Moved Temporarily” with a Location: header that indicates the URL your web browser should visit next. If you are scripting something complex, like a login process, you will frequently have to follow these redirect responses. cURL can do this automatically.

Solution

curl -L -e ';auto' -o 'output.html' 'http://www.example.com/login.jsp'

Discussion

You typically need to use -L and -e ';auto' together to achieve the effect you want. The -L option tells cURL to follow redirect responses; the -e ';auto' option tells it to send the Referer header when it follows them. Together they more closely match the behavior of real web browsers.
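If you want to see what the redirect chain actually looks like, one quick approach (not part of the recipe above, but standard cURL options) is to dump each response's headers to the terminal while discarding the bodies:

# -D - writes the headers of every response in the chain to stdout
curl -s -L -e ';auto' -D - -o /dev/null 'http://www.example.com/login.jsp'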

Note that the output file (output.html in this example) will probably contain more than one HTML document, because it holds the bodies of every response in the redirect chain. cURL itself offers no way to save the output of the various requests to different output files.
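If you really need each response in its own file, a workaround is to follow the redirects yourself. Here is a minimal sketch, assuming a cURL version new enough to support the %{redirect_url} write-out variable; the starting URL is just the example from above, and when following manually you would have to set the Referer header yourself with -e if you need it:

#!/bin/bash
# follow redirects manually so each hop's body lands in its own file
URL='http://www.example.com/login.jsp'
HOP=0
# cap the chain at 10 hops so a redirect loop cannot run forever
while [ -n "$URL" ] && [ "$HOP" -lt 10 ]
do
    # -w '%{redirect_url}' prints the Location target of a redirect
    # response, or nothing at all, which ends the loop
    URL=$(curl -s -o "output-${HOP}.html" -w '%{redirect_url}' "$URL")
    HOP=$((HOP+1))
done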

2. Checking for Cross-Site Scripting with cURL

Problem

The most basic kind of cross-site scripting (XSS) is called reflected cross-site scripting. The vulnerable web software reflects user input back to the web browser without encoding it, modifying it, or filtering it. This makes very basic XSS problems easy to spot. We simply send a variety of XSS attack strings to various web pages and then check to see if our attack string came back to us unmodified.

Solution

You will need to create three files like those shown in Examples 1, 2, and 3. The shell script uses the two text files as input.

Example 1. Cross-site scripting test script using cURL
#!/bin/bash
CURL=/usr/local/bin/curl
# where do we put temporary output?
TEMPDIR=/tmp
# a file with URLs to attack, one per line
URLFILE=urls.txt
# a file containing XSS attack strings, one per line
ATTACKS=xss-strings.txt
# file descriptor 3 is our URLs
exec 3<"${URLFILE}"
typeset -i FAILED
# for each URL in the URLFILE
while read -u 3 URL
do
    # (re)open file descriptor 4 on our XSS attack strings, so that
    # every URL is tested against the full list
    exec 4<"${ATTACKS}"
    TEMPFILE="${TEMPDIR}/curl${RANDOM}.html"
    FAILED=0
    # attack with each attack in the ATTACKS file
    while read -u 4 XSS
    do
        # call curl to fetch the page. Save to a temp file because we
        # need to check the error code, too. We'll grep if we got
        # anything.
        ${CURL} -f -s -o "${TEMPFILE}" "${URL}${XSS}"
        RETCODE=$?
        echo "ret: $RETCODE"
        # check to see if curl failed or the server failed
        if [ $RETCODE -ne 0 ]
        then
            echo "FAIL:  (curl ${RETCODE}) ${URL}${XSS}"
        else
            # curl succeeded. Check the output for our attack string.
            result=$(grep -c "${XSS}" "${TEMPFILE}")
            # if we got 1 or more matches, that's a failure
            if [ "$result" != 0 ]
            then
                echo "FAIL:  ${URL}${XSS}"
                FAILED=${FAILED}+1
            else
                echo "PASS:  ${URL}${XSS}"
            fi
        fi
        rm -f "${TEMPFILE}"
    done
    if [ $FAILED -gt 0 ]
    then
        echo "$FAILED failures for ${URL}"
    else
        echo "PASS: ${URL}"
    fi
done
Example 2. Example urls.txt file
http://www.example.com/cgi-bin/test-cgi?test=
http://www.example.com/servlet/login.do?user=
http://www.example.com/getFile.asp?fileID=
Example 3. Example xss-strings.txt file
<script>alert('xss');</script>
"><BODY%20ONLOAD=alert('XSS')><a%20name="
"><BODY ONLOAD=alert('XSS')><a name="
abc>xyz
abc<xyz
abc'xyz
abc"xyz
abc(xyz
abc)xyz
abc<hr>xyz
abc<script>xyz

Realize that there are infinitely many possible test strings for cross-site scripting. Your goal is neither to limit yourself to the ones shown in Example 3 nor to exhaust every string your time and budget allow. Choose representative samples that vary in interesting ways, and use a different sample set in each test run. That way you are always testing for some XSS, but never with so many cases that they bog down your efforts.

Discussion

This script uses a couple of loops to iterate across your website, trying lots of test strings on every URL you specify. You might get the list of URLs by spidering your website. The set of attack strings can come from lots of places: books, websites, vulnerability announcements, security consultants, etc.
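As one illustration, here is a hedged sketch of bootstrapping a urls.txt file from an Apache-style access log instead of a spider; the log path is hypothetical, and the sed expression only strips the value of the final parameter:

# pull request paths that carry query parameters out of the access log,
# strip the last parameter value, and prefix the site's base URL
awk '{print $7}' /var/log/apache2/access.log \
    | grep '?' \
    | sed -e 's/=[^=&]*$/=/' \
    | sort -u \
    | sed -e 's|^|http://www.example.com|' > urls.txt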


The particular strings we chose in Example 3 are intended to help you zero in on what defenses, if any, the application has. You’ll note that we have used “abc” and “xyz” around each test string. That’s because we’re going to do a very simple grep of the output. If we want to find out whether a single < in the input is reflected in the output, we have to be sure that it’s our < that is reflected. Grepping for < alone would return lots of spurious matches unless we make the string unique in this way. The example strings also get progressively worse: reflecting a few dangerous characters, like <, >, and ", is bad, but reflecting the whole string <script> is an unmitigated failure. We have also seen applications that use blacklisting as a defense: they allow some characters through, but if they see <script> in the input they replace it with something harmless or remove it altogether. ColdFusion does this in some situations, for example.
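To see why the markers matter, compare two greps against a saved response (response.html is a hypothetical file name):

grep -c '<' response.html        # matches nearly every line of any HTML page
grep -c 'abc<xyz' response.html  # matches only a true reflection of our input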

There are a few things to note about this particular script. It is a primitive script that does not do anything graceful in the case of bad input. Blank lines, comments, or anything stray in the urls.txt file will cause failures when the script tries to fetch them as URLs. Likewise, stray data in the xss-strings.txt file will be attempted during testing. It is possible to put bad parameters in the xss-strings.txt file that actually cause cURL to fail. In such cases, cURL will fail and the script will say so, but you will have to dig into the test case to figure out why it failed and how you want to fix it.
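One small hardening step, if your input files accumulate comments and blank lines over time, is to filter them out before the run; a minimal sketch:

# drop blank lines and #-comments before feeding the file to the script
grep -v -e '^[[:space:]]*$' -e '^[[:space:]]*#' urls.txt > urls-clean.txt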

There are a few other interesting situations where the software being tested could fail without this simple script detecting it (a “false negative”). Encoding is one: input may be encoded in a way that bypasses input filtering, with the result reflected as an unencoded string that allows XSS. Imagine a test where you send the < character encoded as %3C in the attack string, but the actual unencoded < character is returned in the page body. That could well be part of a failure, yet this simple script won’t detect it, because the string that was sent was not found verbatim in the output. Another possible false negative arises when the input is broken across several lines in the response although it was sent as one line in the attack. The grep will not notice that half the string was found on one line and the other half on the next.
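A partial defense against the encoding false negative is to grep for a URL-decoded copy of the attack string as well as the literal one. This sketch is crude (it only handles %XX escapes and relies on bash’s printf understanding \xHH), but it illustrates the idea:

# turn each %XX escape into \xXX and let printf decode it
DECODED=$(printf '%b' "${XSS//%/\\x}")
# flag a failure if either the literal or the decoded string comes back
if grep -q -e "${XSS}" -e "${DECODED}" "${TEMPFILE}"
then
    echo "FAIL:  ${URL}${XSS}"
fi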

An improvement to this script would be to mimic Nikto and provide both an attack string and a corresponding failure string to look for in the xss-strings.txt file. You’d want to separate the two strings by a character that is easy to work with, but unlikely to be significant (or present) in your attack strings—like Tab. You could manage the strings in Excel and save as tab-delimited, if that suits your test environment.
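A minimal sketch of that improved inner loop, assuming a hypothetical xss-pairs.txt where each line holds an attack string, a Tab, and the string whose presence in the response marks a failure:

# read attack/failure-string pairs, one tab-separated pair per line
while IFS=$'\t' read -r XSS MATCH
do
    curl -f -s -o "${TEMPFILE}" "${URL}${XSS}"
    # the test fails if the response contains the failure string
    if grep -q "${MATCH}" "${TEMPFILE}"
    then
        echo "FAIL:  ${URL}${XSS}"
    else
        echo "PASS:  ${URL}${XSS}"
    fi
    rm -f "${TEMPFILE}"
done < xss-pairs.txt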

Warning

To be sure, passing this test is no guarantee that XSS is impossible in your web software. Equally sure, however, is that failing this test guarantees that XSS is possible. Furthermore, if your software has either been attacked successfully or a security audit turns up the possibility of cross-site scripting, you can add the successful attack strings to this script as a form of regression test. You can help ensure that known failures don’t recur.
