Saturday, May 5, 2012

Gathering Information Before Hacking a Website

Before attacking (pentesting) a website we need to gather some important information and then map the attack surface. If we don't understand how the site works, what is available on it, what type of input it takes, and so on, then we will not be able to mount a good attack (attacks rarely succeed without an information-gathering phase). Many skids around us just start looking for SQL injection or brute forcing web forms, and in the end they fail.
Gathering information and mapping the site is very, very important, so I will explain (not in great detail) how to do it, what to look for, etc.

Spidering the web:
Basically I look for links, web forms, source code, directories, etc.
There are many tools to spider a target website, but I prefer a proxy tool such as Burp Suite or OWASP ZAP, plus a downloader like wget.



We may find a lot of important information by spidering the target.

Screenshot: Burp Suite spidering.

Burp Suite spidered some important links which we need for later attacks (directories, the login page, forgotten-password pages, robots.txt, etc.).
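For example, once the spider has revealed robots.txt, you can also fetch it directly from the command line (a quick check; assuming the site actually serves one):

curl http://www.target.com/robots.txt

The paths disallowed in robots.txt are often exactly the directories worth a closer look.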

Configuring Burp Suite to spider the web:

1. Open Burp Suite.
2. Configure your browser to use Burp Suite as its proxy. In Firefox: Preferences >> Advanced >> Network >> Settings >> Manual proxy configuration, then enter host: localhost and port: 8080.

3. Now browse your target website, and you will see your target's address in Burp Suite's Target tab.
4. Right-click on your target host in Burp, then click "Spider this host".

Now it will spider the website. Note: play around with Burp Suite some more.
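As a quick sanity check that the proxy chain works (a minimal sketch; Burp's proxy listener defaults to 127.0.0.1:8080), you can push one request through it from the command line and watch it appear in Burp's Proxy history:

curl -x http://127.0.0.1:8080 http://www.target.com/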

Now we know how to configure the browser for Burp Suite and spider the target host, so let's continue gathering information.

It is even better to download the entire website using wget or another downloader so that we can browse it offline and inspect the page source, comments, etc. Besides, we may need to brute force a web form or something similar, and we can create a word list from the target site itself. So I simply use wget:

wget -r www.target.com

And it will download the full website. Now browse all the pages, check the source code, comments, etc., and see if you get any good information.
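Since the whole site is now mirrored locally, you can also grep it instead of clicking through every page. A minimal sketch (assuming wget saved the mirror under www.target.com/ in the current directory, which is its default):

# list all HTML comments found in the mirror
grep -rn '<!--' www.target.com/

# build a crude word list from the site's own text for later brute forcing
grep -rhoE '[A-Za-z0-9]{4,}' www.target.com/ | sort -u > wordlist.txt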

Information Gathering with Google:
Google is a very powerful search engine and a friend of hackers and penetration testers. We can easily gather a lot of information via Google, such as public information, emails, site parameters, names, phone numbers, etc.
If we search on Google with the 'site' operator then we get many results:

https://www.google.com/#sclient=psy-ab&hl=en&source=hp&q=site:microsoft.com&fp=2f63d2df457619ac

Click on the link and you will see the results.

I searched for site:microsoft.com, which is why it discovered subdomains. But if we search for "site:www.microsoft.com" then we will see results only from www.microsoft.com, not from other subdomains such as login.microsoft.com.
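You can also combine operators. For example, to list subdomains while hiding the main www site:

site:microsoft.com -site:www.microsoft.com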

More examples:
site:www.targets.com filetype:asp
site:www.targets.com inurl:index.php
site:targets.com error
site:targets.com admin
link:targets.com
related:targets.com

You will find many more Google dorks here: http://www.exploit-db.com/google-dorks/
Don't be lazy if you are serious.

There are some tools for automated searching, but I always prefer doing it manually.

Suppose you found a URL like www.target.com/index.php?id=2 via a search engine. Isn't it then easy to do a quick check for invalid input on the "id" parameter (such as SQLi)?
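A classic quick check is to append a single quote to the parameter and compare the response with the normal one; a sudden database error message is a strong hint of SQLi. A minimal sketch (the URL is just the placeholder from above):

curl "http://www.target.com/index.php?id=2"
curl "http://www.target.com/index.php?id=2'"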



Finding hidden files and content, default files:
You should browse all pages manually and review the behavior of every page. Here are some points you can follow:
1. Brute force/dictionary attack for hidden directories. You can use Burp Suite or OWASP DirBuster (I will post tutorials for all these tools later); a simple sketch follows this list.

2. If you find a link like www.target.com/login.php, then there may also be a logout.php; or if there is a www.target.com/adduser.php, then www.target.com/deleteuser.php may exist too.... So try.

3. Check the comments in the page source for any interesting information.

4. Find the login pages (admin and users).

5. Find all URLs and save them in a file for later use.

6. Find default files and content (what about www.target.com/phpinfo.php?).

7. You had better run Nikto against the site. Nikto is a powerful tool for discovering default content; see the example after this list.
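Here is a rough sketch of such a directory check using curl and the word list built with wget earlier (wordlist.txt and www.target.com are placeholders; any status code other than 404 deserves a look), followed by a basic Nikto run:

# probe each candidate directory and print the status code for hits
while read dir; do
  code=$(curl -s -o /dev/null -w '%{http_code}' "http://www.target.com/$dir/")
  [ "$code" != "404" ] && echo "$code  $dir"
done < wordlist.txt

# scan for default files and known issues
nikto -h www.target.com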




Finding other information:
What other information is there?

1. Emails (for social engineering attacks).

2. Phone numbers (social engineering).

3. User and employee names (social engineering).

4. Find out the web server version. What version of Apache or IIS are they running? If it is old then you may be lucky enough to find a known vulnerability against that old software on exploit-db or SecurityFocus; a quick check is sketched after this list.

5. What web software are they using? Joomla, MyBB, phpBB, vBulletin, or something else? Do you know which version? If it is old then you may search for vulnerabilities that have already been discovered.
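To check points 4 and 5 quickly from the command line (a minimal sketch; many servers hide or fake these banners, so treat them only as hints):

# web server banner from the HTTP response headers
curl -sI http://www.target.com/ | grep -i '^server:'

# many CMSs and forums reveal themselves in a generator meta tag
curl -s http://www.target.com/ | grep -i 'generator'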

I think you now have a basic idea of how to gather information and why you need to. Without gathering information we can't map our target. For example, we need to know how our victim walks and whether he knows kung-fu or not (if he knows kung-fu then we also need to be more powerful than him, such as becoming an expert kung-fu fighter).
These are not the only techniques for gathering information. You need to research your target further, learn more information-gathering techniques, and use your powerful friend Google. I don't think it is possible to discover much valuable information in a short time. Personally I spend a lot of time familiarizing myself with my target, and a long time gathering information and mapping it. If you are a skid/script kiddie who wants to hack just for fun, or the target is not important to you, then surely you have no patience or time for mapping your targets. But a serious hacker will spend lots of time (most of his time) on his targets. At least I hope I have explained most of the important things you need.

WITHOUT THIS INFORMATION YOU SHOULD NOT GO AHEAD.


Good Luck

