---
title: Spidering websites with WebScarab
course: intro_pentest
section: "Web-Based Exploitation"
layout: lesson
---

A great tool to use when initially interacting with a target is WebScarab. WebScarab was written by Rogan Dawes and is available through the OWASP website. If you are running BlackArch, a version of WebScarab is already installed. This powerful framework is modular in nature and allows you to load numerous plug-ins to customize it to your needs. Even in its default configuration, WebScarab provides an excellent resource for interacting with and interrogating web targets.

After having run the vulnerability scanners, the next logical step is to run a spidering program on the target website. Spiders are extremely useful for reviewing and reading (or crawling) your target's website, looking for all links and associated files. Each of the links, web pages and files discovered on your target is recorded and catalogued. This catalogued data can be useful for accessing restricted pages and locating unintentionally disclosed documents or information.
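To make the crawling idea concrete, here is a minimal spider sketch using only Python's standard library. It is nowhere near as capable as WebScarab's spider; it simply walks links breadth-first from a starting page, stays on the starting host, and prints every URL it catalogues. The target address reuses the 172.16.45.132 host from the example later in this lesson, and the page cap is an arbitrary safety limit:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collects the href value of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def spider(start_url, max_pages=25):
    """Breadth-first crawl that stays on the starting host."""
    host = urlparse(start_url).netloc
    queue, seen = [start_url], set()
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        parts = urlparse(url)
        if url in seen or parts.netloc != host or parts.scheme not in ("http", "https"):
            continue
        seen.add(url)
        try:
            page = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except OSError:
            continue  # skip pages that fail to load
        parser = LinkParser()
        parser.feed(page)
        queue.extend(urljoin(url, link) for link in parser.links)
    return sorted(seen)

# Target host taken from this lesson's example; max_pages is an arbitrary cap.
for page in spider("http://172.16.45.132/"):
    print(page)
```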

You can access the spider function in WebScarab by first starting the program. To do so, issue the following command in a terminal:

```
webscarab
```

This will load the WebScarab program. Once you start the tool, you'll be given access to a number of new panels along the top of the window, including the "Spider" tab.

Now that you have set up WebScarab, you need to configure your browser to use a proxy. Setting up WebScarab as your proxy will cause all the web traffic going into and coming out of your browser to pass through the WebScarab program. In this respect, the proxy program acts as a middleman and has the ability to view, stop and even manipulate network traffic.
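To make the middleman idea concrete, here is a toy logging proxy sketched in Python's standard library. It is emphatically not how WebScarab works internally; it handles only plain-HTTP GET requests, prints each URL the browser requests, and relays the response unchanged:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.error import HTTPError
from urllib.request import urlopen

class LoggingProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        # A browser configured to use a proxy sends the absolute URL here.
        print(f"[proxy] GET {self.path}")
        try:
            upstream = urlopen(self.path, timeout=10)
        except HTTPError as err:
            upstream = err  # 4xx/5xx responses still carry a readable body
        body = upstream.read()
        self.send_response(upstream.getcode())
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# Only one listener can bind a port: stop WebScarab first, or pick a port
# other than its default 8008 before running this sketch.
HTTPServer(("127.0.0.1", 8008), LoggingProxy).serve_forever()
```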

Setting up your browser to use a proxy is usually done through the preferences or network options. In Firefox, click ☰ → Preferences, scroll down to the Network Settings section, and click "Settings".

Clicking on the settings button will allow you to configure your browser to use WebScarab as a proxy. Select the radio button for "Manual proxy configuration". Next, enter 127.0.0.1 in the "HTTP Proxy" input box. Finally, enter 8008 into the "Port" field. It is usually a good idea to check the box just below the "HTTP Proxy" box and select "Use this proxy for FTP and HTTPS". Once you have entered all of this information, click "OK" to save the changes.
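Before browsing, it can be worth a quick check that the listener is actually reachable on that port. The short Python sketch below sends a single request through 127.0.0.1:8008; example.com is only a placeholder URL, and the snippet assumes WebScarab is already running:

```python
from urllib.request import ProxyHandler, build_opener

# Route one request through the local WebScarab listener.
opener = build_opener(ProxyHandler({"http": "http://127.0.0.1:8008"}))
response = opener.open("http://example.com/", timeout=10)
print(response.getcode())  # 200 means the proxy relayed the request
```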

At this point, any web traffic coming into or passing out of your browser will route through the WebScarab proxy. Two words of warning: First, you need to leave WebScarab running while it's serving as a proxy. If you close the program, you won't be able to browse the Internet; if this happens, Firefox will helpfully display an error message that it can't find a proxy, and you'll need to restart WebScarab or change your network configuration in Firefox. The second warning is that while surfing the Internet using a local proxy, all HTTPS traffic will show up as having an invalid certificate! This is expected behaviour because your proxy is sitting in the middle of your connection.

As a side note, it’s important that you always pay attention to invalid security certificates when browsing. At this point, certificates are your best defense and often your only warning against a man-in-the-middle attack.

Now that you have set up a proxy and configured your browser, you are ready to begin spidering your target. Begin by entering the target URL into the browser. In our earlier example, we discovered a website running on 172.16.45.132. Entering http://172.16.45.132 into your Firefox browser will load the website through WebScarab. Once the website has loaded in your browser, you can switch over to the WebScarab program. You should see the URL you entered (along with any others you have visited since starting your proxy). To spider the site, right-click the URL and choose "Spider tree".

You can now view each of the files and folders associated with your target website. Individual folders can be further spidered by right-clicking and choosing "Spider tree" again. You should spend time carefully examining every nook and cranny within your authorized scope. Spidering a website is a great way to find inadvertently disclosed or leaked confidential data on a target website.
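As a rough illustration of that review step, the sketch below filters a catalogued URL list for file types that often turn out to be unintentionally disclosed. Both the URL list and the extension set are hypothetical; in practice the URLs would come from WebScarab's summary pane:

```python
# Hypothetical URLs catalogued during spidering; in practice you would
# collect these from WebScarab's summary view.
catalogued = [
    "http://172.16.45.132/index.php",
    "http://172.16.45.132/backup/site.tar.gz",
    "http://172.16.45.132/docs/internal-memo.pdf",
]

# Extensions that often indicate unintentionally disclosed material.
INTERESTING = (".bak", ".old", ".tar.gz", ".zip", ".sql", ".pdf", ".doc", ".xls")

for url in catalogued:
    if url.lower().endswith(INTERESTING):
        print("worth a closer look:", url)
```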