
File http-robots.txt

Script type: portrule
Categories: default, discovery, safe
Download: http://nmap.org/svn/scripts/http-robots.txt.nse

User Summary

Checks for disallowed entries in /robots.txt on a web server.

The higher the verbosity or debug level, the more disallowed entries are shown.
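The check itself is simple: fetch /robots.txt from the web server and collect the paths named in `Disallow:` directives, truncating the displayed list unless verbosity is raised. The NSE script is written in Lua; as a rough illustration of the parsing step only, here is a minimal sketch in Python (the sample robots.txt body is made up for demonstration):

```python
def parse_disallowed(robots_txt: str) -> list[str]:
    """Collect paths from Disallow: directives in a robots.txt body."""
    entries = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path and path not in entries:  # an empty Disallow permits everything
                entries.append(path)
    return entries

sample = """User-agent: *
Disallow: /search
Disallow: /groups
Disallow:            # empty entry, not a restriction
Disallow: /images
"""
print(parse_disallowed(sample))
```

With higher verbosity the real script simply prints more of this list instead of cutting it off at the default limit.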

Script Arguments

smbdomain, smbhash, smbnoguest, smbpassword, smbtype, smbusername

See the documentation for the smbauth library.

http.max-cache-size, http.max-pipeline, http.pipeline, http.useragent

See the documentation for the http library.

Example Usage

Command format

nmap -sV -sC <target>

Script Output

80/tcp  open   http    syn-ack
|  http-robots.txt: 156 disallowed entries (40 shown)
|  /news?output=xhtml& /search /groups /images /catalogs
|  /catalogues /news /nwshp /news?btcid=*& /news?btaid=*&
|  /setnewsprefs? /index.html? /? /addurl/image? /pagead/ /relpage/
|  /relcontent /sorry/ /imgres /keyword/ /u/ /univ/ /cobrand /custom
|  /advanced_group_search /googlesite /preferences /setprefs /swr /url /default
|  /m? /m/? /m/lcb /m/news? /m/setnewsprefs? /m/search? /wml?
|_ /wml/? /wml/search?

Requires


Author: Eddie Bell

License: Same as Nmap--See http://nmap.org/book/man-legal.html