
Installing and configuring Capture-HPC on Windows

2009-04-11 15:08
First, a bit of introduction copied from the project site:

The software capture-hpc (https://projects.honeynet.org/capture-hpc) is a client-side honeypot, heavily based on the functionality of VMware.

Client software is installed on a virtual VMware guest system (Windows XP), and a server-side part of the software contacts the client using a VMware proprietary protocol and instructs it to open a URL that presumably contains malicious content.

File changes, registry changes, and network activity are recorded for analysis while a URL is being accessed. After a configurable timeout, the virtual machine is automatically reset to a snapshot of the system marked as clean, and the process starts from the beginning with another URL.

Capture-HPC can be installed on Linux or Windows (the guest VM runs Windows XP SP2); I used Windows XP.

First install VMware Server (not VMware Workstation, though it feels much the same in use). It can be downloaded from http://www.vmware.com/download/server/; the version I used was VMware-server-installer-1.0.6-91891.exe. You can request a free serial number online, or if that is too much trouble, use this one:

Name: www.okeydown.com
Code: X1XWE-TPW66-CC27A-4YEFG

After installation, create a new virtual machine and install Windows XP SP2 in it. This is what the Capture-HPC documentation recommends; whether other systems work, I don't know.

Once the guest OS is installed, copy the Capture-HPC client installer CaptureClient-Setup.exe into the VM and run it; the version I used was capture-client-2.5.1-389.zip. After installation you can set whitelists for file, process, and registry operations in FileMonitor.exl, ProcessMonitor.exl, and RegistryMonitor.exl. If you want to capture network traffic, also install WinPcap.

Configure the VM's network so it can reach the Internet normally; both NAT and bridged mode (which needs an IP configured) work. Then save a clean system snapshot via the menu Snapshot > Take Snapshot.

A side note: the machine I use has a restore card. I set the host IP to 192.168.2.215 and the VM IP to 192.168.2.216; after a reboot, the host NIC's IP had become 0.0.0.0 while the VM's had become 192.168.2.215. It took me quite a while to figure out that the restore card's IP configuration was the culprit. In the end I simply disabled the two virtual adapters, VMware Network Adapter VMnet1 and VMnet8, under Network Connections; after rebooting, the host NIC's IP was back to normal. And it turns out the VM can still get online with those two virtual adapters disabled. Hooray~

---------------------------------------- a rootless divider, drifting and swaying ----------------------------------------

Next, install capture-server on the host machine. Before that, install Sun's Java JRE 1.6.0 Update 7 (presumably any 1.6.x release works), then download capture-server.

Note that capture-server comes in several variants; on Windows, use capture-server-2.5.1-389.zip (not the ~200 KB capture-server-2.5.1-389-withLinuxRevert.zip). After unpacking, config.xml needs to be edited. First, here is the configuration section from the README (skip it if you don't care about the details):

- client_inactivity_timeout: the Capture client indicates that it is still alive by responding to a ping from the server. This happens every 10 seconds. If no pong is received from the client for the duration of client_inactivity_timeout, the client-inactivity error is thrown and the VM is reverted. One example of when this could happen is a malicious site causing a blue screen.
  - revert_timeout: the VIX code that the revert function uses hangs at times, but functions properly if restarted. If the revert has not completed within the revert_timeout duration, the revert-timeout error is thrown and the revert of the VM is attempted once again.
  - vm_stalled_after_revert_timeout: identical to revert_timeout, except the start criterion is communicated not by the VIX API but by the Capture client sending a visit command.
  - vm_stalled_during_operation_timeout: when the client (e.g. Internet Explorer) locks up, the Capture client is still able to respond to pings but makes no progress visiting URLs. vm_stalled_during_operation_timeout sets how often the Capture server should, at minimum, expect a visitation event (this is highly dependent on network speed and on how many URLs are being visited). If no visitation event is received during the timeout period, the VM-stalled error is thrown and the VM is reverted.
  - same_vm_revert_delay: the VIX library and VMware Server have a hard time reverting VMs at the same time. The code already prevents the same VM from reverting twice at once; the delay specified by this variable is automatically applied when reverting the same VM.
  - different_vm_revert_delay: as above, but the delay specified by this variable is automatically applied when reverting a different VM. This delay is larger because it would theoretically be possible to revert two VMs at the same time.
* The global option group_size determines how many instances of the client application are opened at the same time. A value of 1 causes only one instance to be opened (just like Capture-HPC v2.01 and earlier). Note that only certain client plug-ins support visiting groups of size larger than one:
    - internetexplorer (applies a divide-and-conquer algorithm): full support (max group size of 80)
    - internetexplorerbulk (applies a bulk algorithm): full support (max group size of 54)
    - firefox (applies a divide-and-conquer algorithm): full support; however, Firefox needs to be configured to open a blank page rather than restore previous sessions. In addition, because Firefox has no callback that notifies the server when a page has been successfully retrieved, client-default-visit-time needs to be increased to accommodate loading X Firefox instances and retrieving the web pages. Some testing may be required to determine the appropriate value.
    - other: only group sizes of 1 are supported at this point
* Add the local exclusion lists that will be pushed to the clients if that option is enabled
* Add VMware servers
    Specify the IP address, port, username, and password of the VMware server that hosts Capture clients.
* For each VMware server, add the virtual machines that run a Capture client.
    Specify the path to the virtual machine's .vmx file, the administrator account and password, and the path where the Capture .bat file exists (it needs to be a .bat file).

The config file has no comments but is fairly easy to read. Here is one of my working examples.

<config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:noNamespaceSchemaLocation="config.xsd">
<!-- version 2.5 -->
<global collect-modified-files="false"
client-default="iexplorebulk"
client-default-visit-time="20"
capture-network-packets-malicious="false"
capture-network-packets-benign="false"
send-exclusion-lists="false"
terminate="true"
group_size="20"
vm_stalled_after_revert_timeout="120"
revert_timeout="120"
client_inactivity_timeout="60"
vm_stalled_during_operation_timeout="300"
same_vm_revert_delay="6"
different_vm_revert_delay="24"
/>
<exclusion-list monitor="file" file="FileMonitor.exl" />
<exclusion-list monitor="process" file="ProcessMonitor.exl" />
<exclusion-list monitor="registry" file="RegistryMonitor.exl" />

<!--preprocessor classname="example">
<![CDATA[
<example-config attribute1="1.0" attribute2="40" attribute2="log/output.log"/>
]]>
</preprocessor-->

<virtual-machine-server type="vmware-server" address="127.0.0.1" port="902"
username="administrator" password="voodoo">
<virtual-machine vm-path="D:\Virtual Machines\Windows XP Professional\Windows XP Professional.vmx"
client-path="C:\Progra~1\capture\CaptureClient.bat"
username="Administrator"
password="voodoo"/>
</virtual-machine-server>
</config>

client-default is the browser type; it can be iexplore, iexplorebulk, firefox, etc. See the English README excerpt above for details.


<virtual-machine-server type="vmware-server" address="127.0.0.1" port="902"
username="administrator" password="voodoo">

Here address is the address of the host machine on which the VMs run, username and password are that host's credentials, and port is the port VMware Server uses (902 by default).

<virtual-machine vm-path="D:\Virtual Machines\Windows XP Professional\Windows XP Professional.vmx"
client-path="C:\Progra~1\capture\CaptureClient.bat"
username="Administrator"
password="voodoo"/>

Here vm-path is the path to the virtual machine file (*.vmx), and client-path is the installation path of CaptureClient inside the VM (don't change Progra~1 to Program Files, or it will fail~). username and password are the login username and password of the VM.


---------------------------------------- a rootless divider, drifting and swaying; almost done now ----------------------------------------


Once configuration is done, create a urls.txt in the Capture-Server directory and fill it with the hosts to probe, e.g.:

www.google.com
www.baidu.com
www.hao123.com

Then create a batch file RunHPC.bat for starting Capture-HPC, in this format:

java -Djava.net.preferIPv4Stack=true -jar CaptureServer.jar -s <IP listening address>:<IP listening port> -f urls.txt

Here <IP listening address> is the host machine's address, <IP listening port> is usually 7070, and urls.txt is the file holding the list of pages to visit. For example:

java -Djava.net.preferIPv4Stack=true -jar CaptureServer.jar -s 192.168.2.215:7070 -f urls.txt

Double-click the batch file to run it.

If you leave the VMware Server console open while it runs, you can also watch the VM's network activity quite vividly.


---------------------------------------- a rootless divider, drifting and swaying ----------------------------------------


I had originally assumed Capture-HPC had crawler functionality, i.e. that it could automatically extract the links on each page and crawl the net, but it turns out that after visiting the address list in urls.txt it simply stops, without crawling off to other sites. I don't know whether the program just lacks this feature or whether my configuration is wrong (if you have other insights, please email me at zhao.sh@live.cn).
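Since Capture-HPC itself does not follow links, a small helper script can approximate a one-level crawl: fetch each seed page, pull out the hostnames it links to, and append them to urls.txt for the next run. Below is a minimal sketch using only the Python standard library; the function names and the hostname-only output format are my own assumptions, not part of Capture-HPC.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkHosts(HTMLParser):
    """Collect the hostnames referenced by <a href=...> tags."""
    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        for name, value in attrs:
            if name == "href" and value:
                host = urlparse(value).netloc
                if host:                      # skip relative links
                    self.hosts.add(host)

def extract_hosts(page_html):
    """Return the sorted set of absolute-link hostnames found in an HTML page."""
    parser = LinkHosts()
    parser.feed(page_html)
    return sorted(parser.hosts)

# Example on a hypothetical fetched page:
sample = '<a href="http://www.example.com/x">a</a> <a href="/local">b</a>'
print(extract_hosts(sample))        # ['www.example.com']
```

Feeding each seed's response body (e.g. from urllib.request.urlopen) through extract_hosts and appending any new hosts to urls.txt gives a crude breadth-one crawl; Capture-HPC then visits those hosts on its next run.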


My workaround: when building a DNS crawler earlier, I had stored domain names in a host table in the database (currently about 100,000 records), so I exported that table into Capture-HPC's list of URLs to test in order to widen its probing range.

Exporting a query result to a text file can be done like this (in the sqlplus environment; headings, feedback, and trailing spaces are turned off so the spooled file contains only the hostnames):

set pagesize 50000;
set heading off;
set feedback off;
set trimspool on;
spool d:\urls.txt;
select * from dnshost;
spool off;
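Even so, a spooled file can still contain stray blank lines or duplicate hostnames, so it is worth normalizing before handing it to Capture-HPC. Here is a small sketch; the function name and the one-hostname-per-line assumption are mine, matching the urls.txt format shown earlier.

```python
def normalize_url_list(lines):
    """Strip whitespace, drop blanks and duplicates, keep first-seen order."""
    seen = set()
    out = []
    for line in lines:
        host = line.strip()
        if host and host not in seen:
            seen.add(host)
            out.append(host)
    return out

# Example on a messy exported list:
raw = ["www.google.com ", "", "www.baidu.com", "www.google.com"]
print(normalize_url_list(raw))   # ['www.google.com', 'www.baidu.com']
```

Applied to the spool output with readlines() and written back one host per line, this leaves a clean urls.txt.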
