
Commit

added support for page vectors in order to allow grep modules to scan response bodies
Zapotek committed Feb 15, 2012
1 parent 644c70f commit 4ea0e2e
Showing 3 changed files with 26 additions and 14 deletions.
6 changes: 3 additions & 3 deletions README.md
@@ -1,6 +1,6 @@
# Rack::ArachniVectorFeed middleware

Extracts input (link, form, cookie, header) vectors/params from HTTP requests
Extracts input (link, form, cookie, header) vectors/params and response bodies from HTTP requests (and responses)
and exports them in a suitable format for use with [Arachni](http://arachni-scanner.com)'s [VectorFeed](https://github.com/Zapotek/arachni/blob/experimental/plugins/vector_feed.rb) plug-in
in order to perform extremely focused audits or unit-tests.

@@ -63,10 +63,10 @@ you to skip the crawl by setting the <em>link-count</em> limit to <em>0</em>.
Like so:

```
arachni <url> --plugin=vector_feed:yaml_file='<vectors file>' -m audit/* --link-count=0
arachni <url> --plugin=vector_feed:yaml_file='<vectors file>' -m audit/*,grep/* --link-count=0
```

This will load all audit modules and attack the extracted vectors while skipping the crawl.
This will load all <em>audit</em> and <em>grep</em> modules and attack the extracted vectors while skipping the crawl.

If you want to automate the process you can:

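For reference, the vectors file consumed by the VectorFeed plug-in is plain YAML. Based on the `extract_page` serialization added in this commit, a page entry might look like the following (the URL, headers, and body are invented for illustration):

```yaml
# Hypothetical vectors-file entry; keys mirror this commit's extract_page hash.
- type: page                 # new in this commit: a full response snapshot
  url: http://test.com/
  code: 200
  headers:
    Content-Type: text/html
  body: "<html>...</html>"
```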
2 changes: 1 addition & 1 deletion examples/server.rb
@@ -142,7 +142,7 @@ def show( str )
<h3>Why?</h3>
<p>
<ol>
<li>We only use the XSS module because this is a demo.</li>
<li>We only use the XSS module because this is a demo. In real-world scenarios use: <pre>audit/*,grep/*</pre></li>
<li>We set the <em>link-count</em> limit to <em>0</em> to prevent Arachni
from crawling and only audit the stuff passed to it by the VectorFeed plug-in.</li>
<li>We set the <em>http-req-limit</em> to <em>1</em> to throttle Arachni down since you'll
32 changes: 22 additions & 10 deletions lib/rack/arachni-vectorfeed.rb
@@ -16,6 +16,7 @@

require 'rack/utils'
require 'yaml'
require 'digest/md5'
# require 'ap'

module Rack
@@ -32,33 +33,44 @@ def initialize( app, opts )
end

def call( env )
# ap env

extract_vectors( env ).each {
|vector|
if !@vectors.include? vector
@vectors << vector
append_to_outfile( vector )
end
}
extract_vectors( env ).each { |vector| append_to_outfile( vector ) }

# forward the request up to the app
@app.call( env )
code, headers, body = @app.call( env )

append_to_outfile( extract_page( env, code, headers, body ) )

[ code, headers, body ]
end

private

def append_to_outfile( vector )
digest = Digest::MD5.hexdigest( vector.to_s )
return if @vectors.include? digest

::File.open( @opts[:outfile], 'a' ) do |out|
YAML.dump( [vector], out )
end

@vectors << digest
end

def extract_vectors( env )
[extract_cookies( env ), extract_headers( env ),
extract_forms( env ), extract_links( env ) ].flatten.compact
end

def extract_page( env, code, headers, body )
{
'type' => 'page',
'url' => vector_tpl( env )['action'],
'code' => code,
'headers' => headers.to_hash,
'body' => body.join( "\n" )
}
end

def extract_links( env )
return if !env['QUERY_STRING']

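The new `append_to_outfile` fingerprints each vector with MD5 so identical vectors are written to the YAML outfile only once. A standalone sketch of that scheme, using only the stdlib (`VectorLog` is a hypothetical name for illustration; the middleware itself keeps the digests in `@vectors` and the path in `@opts[:outfile]`):

```ruby
require 'yaml'
require 'digest/md5'
require 'tempfile'

# Minimal sketch of the de-duplicated YAML append introduced in this commit.
class VectorLog
  def initialize( outfile )
    @outfile = outfile
    @digests = []
  end

  def append( vector )
    digest = Digest::MD5.hexdigest( vector.to_s )
    return if @digests.include? digest  # already written, skip

    ::File.open( @outfile, 'a' ) do |out|
      YAML.dump( [vector], out )        # each append is its own YAML document
    end

    @digests << digest
  end
end

outfile = Tempfile.new( ['vectors', '.yml'] )
log     = VectorLog.new( outfile.path )

vector = { 'type' => 'link', 'action' => 'http://test.com/?id=1' }
log.append( vector )
log.append( vector ) # identical vector: ignored

puts YAML.load_stream( File.read( outfile.path ) ).length # => 1
```

Each call to `YAML.dump` appends a separate `---` document, so the outfile can later be consumed as a YAML stream; the digest check keeps replayed requests from bloating it.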
