
Thread: automated tool to get subdomains for a target domain


  1. #1
    Just burned his ISO
    Join Date
    Feb 2010
    Posts
    9

    Default automated tool to get subdomains for a target domain

    hi all;

    I would like to ask about an automated tool that can list all subdomains for a target domain, without duplicate results.
    Example: .edu.*
    I tried goorecon, but it only returned 60 subdomains, while I found 200 subdomains manually through Google.

  2. #2
    Moderator
    Join Date
    Jan 2010
    Posts
    167

    Default

    if the server allows it ... a DNS zone transfer

    hf
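For the record, a zone transfer attempt is just one dig invocation per authoritative nameserver. A minimal sketch, assuming `dig` is installed and using `edu.example` as a stand-in for the real domain:

```shell
# axfr_all: try a zone transfer (AXFR) against every authoritative NS of $1.
# This only succeeds when the admin has left zone transfers open.
axfr_all() {
  domain="$1"
  for ns in $(dig +short NS "$domain"); do
    dig axfr "$domain" @"$ns" +noall +answer
  done
}

# usage: axfr_all edu.example
```

If every NS answers "Transfer failed.", the zone is locked down and you are back to scraping or brute forcing.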

  3. #3
    Just burned his ISO
    Join Date
    Feb 2010
    Posts
    9

    Default subdomains

    thx m-1-k-3, I tried the DNS zone transfer tool, but it did not give me the required results. I want a tool that lists all subdomains for a target domain. For example, give it the domain:

    .edu.XXX

    and it replies with:

    aabbb.edu.XXX
    ccddd.edu.XXX
    eefff.edu.XXX

    to help me make an automated script.

  4. #4
    Moderator
    Join Date
    Jan 2010
    Posts
    167

    Default

    if zone transfer is disabled, there is no way to list them all ... you can try to brute force them.

    hf
    m-1-k-3
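m-1-k-3's brute-force suggestion can be sketched with nothing but the Python standard library; the domain and wordlist below are placeholders, and `brute_subdomains` is a made-up name, not an existing tool:

```python
import socket

def brute_subdomains(domain, wordlist, resolve=socket.gethostbyname):
    """Return every name from wordlist that resolves under domain, deduplicated."""
    found = set()
    for word in wordlist:
        fqdn = "%s.%s" % (word, domain)
        try:
            resolve(fqdn)          # socket.gaierror means the name does not exist
        except socket.gaierror:
            continue
        found.add(fqdn)
    return sorted(found)

# usage (real run): brute_subdomains("edu.XXX", open("wordlist.txt").read().split())
```

The quality of the results depends entirely on the wordlist; the `resolve` parameter is injectable so you can swap in a different resolver later.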

  5. #5
    Junior Member Sniffing4Prison's Avatar
    Join Date
    May 2009
    Posts
    26

    Default

    I was interested in this post, so I made a short script in Python.

    What it does is scrape the page for URLs and follow them, then re-scrape them and remove duplicates, as many times as you want.

    I'm kind of liking it, so I'll work on adding a scanner first to check for port 443/80/53, and if there isn't one it should query DNS for 1-255 in the 4th octet of the domain's IP for sub-domains and rescan / re-scrape etc etc. So it would brute force some domains if they aren't handed to you on a silver platter.


    I tested it on Google, because I knew I'd get some hits.
    I chose to re-scrape only twice for the sake of bandwidth.
    The results I got for Google were:
    Code:
    http://translate.google.com
    http://pack.google.com
    http://video.google.com
    http://sites.google.com
    http://desktop.google.com
    http://groups.google.com
    http://images.google.com
    http://sketchup.google.com
    http://picasa.google.com
    http://maps.google.com
    http://code.google.com
    http://docs.google.com
    http://checkout.google.com
    http://toolbar.google.com
    http://labs.google.com
    http://earth.google.com
    https://www.google.com
    http://www.google.com
    https://checkout.google.com
    http://mail.google.com
    http://news.google.com
    http://knol.google.com
    http://books.google.com
    http://blogsearch.google.com
    Is this what you were looking for? Noise on the network
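The scrape-follow-dedupe loop described above can be sketched like this; all names here are made up for illustration, not Sniffing4Prison's actual script, and `fetch` is injectable so the HTTP layer stays simple:

```python
import re
from urllib.request import urlopen

def extract_subdomains(html, domain):
    """Pull every host of the form something.domain out of a page source."""
    pattern = re.compile(r'https?://([\w.-]*\.' + re.escape(domain) + r')')
    return set(pattern.findall(html))

def scrape(start_url, domain, rounds=2,
           fetch=lambda url: urlopen(url).read().decode("utf-8", "replace")):
    """Follow newly discovered sub-domains for a few rounds, deduplicating as we go."""
    seen = set()
    frontier = {start_url}
    for _ in range(rounds):
        next_frontier = set()
        for page in frontier:
            try:
                hosts = extract_subdomains(fetch(page), domain)
            except OSError:
                continue                    # unreachable page, skip it
            for host in hosts - seen:
                seen.add(host)
                next_frontier.add("http://" + host)
        frontier = next_frontier
    return sorted(seen)

# usage (real run): scrape("http://www.google.com", "google.com", rounds=2)
```

Each round only revisits hosts discovered in the previous round, which is what keeps the bandwidth cost bounded.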

    Edit:
    I've seen sites with thousands of subdomains, and I'm sure that wasn't even all of them.
    Why would you want this tool, other than curiosity?

  6. #6
    Just burned his ISO
    Join Date
    Feb 2010
    Posts
    9

    Default

    Curiosity !!!! I have a task to make some statistics about some domains and their subdomains, so I want to list all subdomains instead of searching for them with the "site:" operator on Google and getting duplicated results, and hundreds of results at that.

  7. #7
    Junior Member Sniffing4Prison's Avatar
    Join Date
    May 2009
    Posts
    26

    Default

    Quote Originally Posted by HACK-IT View Post
    Curiosity !!!! I have a task to make some statistics about some domains and their subdomains, so I want to list all subdomains instead of searching for them with the "site:" operator on Google and getting duplicated results, and hundreds of results at that.
    DNSwalk is a Perl script that will double-check the target NS using zone transfers. I tried it on a few .edu's and they all refused.

    dnsdigger.com
    us.mirror.menandmice.com/knowledgehub/tools/dig

    Long story short, I'd use the Men and Mice web app, then brute force it with a Python script, going for a range based on the info I get from Men and Mice.

    Code:
    from scapy.all import DNS, DNSQR, IP, UDP, sr1

    def resolve(host):
        # build a recursive DNS query for the host
        dns = DNS(rd=1, qd=DNSQR(qname=host))
        # send it to the resolver and wait for one reply (needs root)
        response = sr1(IP(dst='192.168.1.1') / UDP(dport=53) / dns)
        if response is not None and response.haslayer(DNS):
            answer = response.getlayer(DNS).an
            if answer is not None:
                answer.show()

    #etc #etc
    Good luck.
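The 4th-octet sweep mentioned earlier in the thread is just a reverse lookup per address. A rough stdlib sketch; `reverse_sweep` is a hypothetical helper and the prefix is a placeholder:

```python
import socket

def reverse_sweep(prefix, lookup=socket.gethostbyaddr):
    """Reverse-resolve prefix.1 through prefix.254, e.g. prefix='10.0.0'."""
    names = set()
    for last in range(1, 255):
        ip = "%s.%d" % (prefix, last)
        try:
            hostname = lookup(ip)[0]   # gethostbyaddr -> (hostname, aliases, ips)
        except OSError:
            continue                   # no PTR record for this address
        names.add(hostname)
    return sorted(names)

# usage (real run): reverse_sweep("10.0.0")
```

Note this only finds hosts whose admins bothered to set PTR records, so it complements rather than replaces the forward brute force.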

  8. #8
    Just burned his ISO
    Join Date
    Feb 2010
    Posts
    9

    Default

    thx so much "Sniffing4Prison", I really learned many new things from your post
