Move documentation from TheHive repository
To-om committed May 16, 2017
1 parent 342ca69 commit 6cc61fe
Showing 34 changed files with 1,855 additions and 0 deletions.
65 changes: 65 additions & 0 deletions FAQ.md
@@ -0,0 +1,65 @@
# Cases and Tasks

- [I Can't Add a Template](https://github.com/CERT-BDF/TheHive/wiki/FAQ#i-cant-add-a-template)
- [Why Doesn't My Freshly Added Template Show Up?](https://github.com/CERT-BDF/TheHive/wiki/FAQ#why-my-freshly-added-template-doesnt-show-up)
- [Can I Use a Specific Template for Imported MISP Events?](https://github.com/CERT-BDF/TheHive/wiki/FAQ#can-i-use-a-specific-template-for-imported-misp-events)

## Templates
### I Can't Add a Template
You need to log in as an administrator to add a template.

### Why Doesn't My Freshly Added Template Show Up?
When you add a new template and then hit the `+NEW` button, the template does not appear. Unlike other events, which you can see in the Flow, template creation is not broadcast to all user sessions. You therefore need to refresh the page before clicking the `+NEW` button.

You don't need to log out then log in again.

### Can I Use a Specific Template for Imported MISP Events?
Definitely! You just need to add a `caseTemplate` parameter in the section corresponding to the MISP connector in your `conf/application.conf` file. This is described in the [Administrator's Guide](https://github.com/CERT-BDF/TheHive/wiki/Administrator's-guide#48-misp).
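As a sketch, such a configuration could look like the block below. The server block name, URL and API key are illustrative placeholders; adapt them to your own setup, and check the Administrator's Guide for the exact set of supported keys:

```
misp {
  "MISP-SERVER" {                      # arbitrary name for this MISP instance
    url = "https://misp.example.com"   # hypothetical server URL
    key = "your-misp-api-key"          # API key of a MISP sync user
    caseTemplate = "misp-events"       # case template applied to imported events
  }
}
```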

# Analyzers
- [I Would Like to Contribute or Request a New Analyzer](https://github.com/CERT-BDF/TheHive/wiki/FAQ#i-would-like-to-contribute-or-request-a-new-analyzer)

## General
### I Would Like to Contribute or Request a New Analyzer
Analyzers are no longer bundled with TheHive. Since the release of Buckfast (TheHive 2.10), the analysis engine has been released as a separate product called [Cortex](https://github.com/CERT-BDF/Cortex). If you'd like to develop or ask for an analyzer that will help you get the most out of TheHive, please open a [feature request](https://github.com/CERT-BDF/Cortex-Analyzers/issues/new) first. This will give us a chance to validate the use cases and avoid having multiple persons working on the same analyzer.

Once validated, you can either develop the analyzer yourself or wait for TheHive Project or a contributor to undertake the task. If everything is in order, we will schedule its addition to a future Cortex release.

# Miscellaneous Questions

- [Can I Enable HTTPS to Connect to TheHive?](https://github.com/CERT-BDF/TheHive/wiki/FAQ#can-i-enable-https-to-connect-to-thehive)
- [Can I Import Events from Multiple MISP Servers?](https://github.com/CERT-BDF/TheHive/wiki/FAQ#can-i-import-events-from-multiple-misp-servers)
- [Can I Connect TheHive to an AWS ElasticSearch Service?](https://github.com/CERT-BDF/TheHive/wiki/FAQ#can-i-connect-thehive-to-an-aws-elasticsearch-service)
- [Do You Have Any Plans for ElasticSearch 5.x Support in the Future?](https://github.com/CERT-BDF/TheHive/wiki/FAQ#do-you-have-any-plans-for-elasticsearch-5x-support-in-the-future)

### Can I Enable HTTPS to Connect to TheHive?
#### TL;DR
Add the following lines to `/etc/thehive/application.conf`:

```
https.port: 9443
play.server.https.keyStore {
  path: "/path/to/keystore.jks"
  type: "JKS"
  password: "password_of_keystore"
}
```

HTTP can be disabled by adding the line `http.port=disabled`.
#### Details
Please read the [relevant section](https://github.com/CERT-BDF/TheHive/wiki/Configuration#9-https) in the Configuration guide.

### Can I Import Events from Multiple MISP Servers?
Yes, this is possible. For each MISP server, add a `misp` section in your `conf/application.conf` file as described in the [Administrator's Guide](https://github.com/CERT-BDF/TheHive/wiki/Configuration#7-misp).
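As a sketch, assuming two hypothetical MISP instances, each gets its own named block inside the `misp` section (names, URLs and keys below are placeholders):

```
misp {
  "MISP-A" {
    url = "https://misp-a.example.com"   # hypothetical first server
    key = "api-key-for-misp-a"
  }
  "MISP-B" {
    url = "https://misp-b.example.com"   # hypothetical second server
    key = "api-key-for-misp-b"
  }
}
```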

### Can I Connect TheHive to an AWS ElasticSearch Service?
The AWS ElasticSearch service only supports the HTTP protocol. It does not support the binary transport protocol which the Java client used by TheHive relies on to communicate with ElasticSearch. As a result, it is not possible to set up TheHive with the AWS ElasticSearch service. More information is available at the following URLs:
- [http://docs.aws.amazon.com/elasticsearch-service/latest/developerguide/aes-limits.html](http://docs.aws.amazon.com/elasticsearch-service/latest/developerguide/aes-limits.html)

> “TCP Transport : The service supports HTTP on port 80, but does not support TCP transport”
- [https://www.elastic.co/guide/en/elasticsearch/reference/5.1/modules-network.html#_transport_and_http_protocols](https://www.elastic.co/guide/en/elasticsearch/reference/5.1/modules-network.html#_transport_and_http_protocols)
> “TCP Transport : Used for communication between nodes in the cluster, by the Java Transport client and by the Tribe node.
> HTTP: Exposes the JSON-over-HTTP interface used by all clients other than the Java clients.”
### Do you have any plans for ElasticSearch 5.x support in the future?
We haven't planned it yet. Please note that moving from ES 2.x to ES 5.x is easier than moving from 1.x to 2.x.
We will give it a try as soon as we can and let you know.
37 changes: 37 additions & 0 deletions README.md
@@ -0,0 +1,37 @@
TheHive is a scalable 3-in-1 open source and free security incident response platform designed to make life easier for SOCs, CSIRTs, CERTs and any information security practitioner dealing with security incidents that need to be investigated and acted upon swiftly.

## Hardware Pre-requisites

TheHive uses ElasticSearch to store its data. Both products run on a Java VM. We recommend using a virtual machine with 8 vCPUs, 8 GB of RAM and 60 GB of disk space. You can also use a physical machine with similar specifications.

## What's New?

- [Changelog](/CHANGELOG.md)
- [Migration Guide](migration-guide.md)

## Installation Guides

TheHive can be installed using:
- An [RPM package](installation/rpm-guide.md)
- A [DEB package](installation/deb-guide.md)
- [Docker](installation/docker-guide.md)
- [Binary](installation/binary-guide.md)
- [Ansible script](https://github.com/drewstinnett/ansible-thehive) contributed by
[@drewstinnett](https://github.com/drewstinnett)

TheHive can also be [built from sources](installation/build-guide.md).

## Administration Guides

- [Administrator's guide](admin/admin-guide.md)
- [Configuration guide](admin/configuration.md)
- [Updating](admin/updating.md)
- [Backup & Restore](admin/backup-restore.md)

## Developer Guides

- [API documentation](api/README.md)

## Other
- [FAQ](FAQ.md)
65 changes: 65 additions & 0 deletions admin/admin-guide.md
@@ -0,0 +1,65 @@
# Administrator's guide

## 1. User management

Users can be managed through the `Administration` > `Users` page. Only administrators may access it. Each user is identified by their login, full name and role.

![users](../files/adminguide_users.png)

Please note that you still need to create user accounts if you use LDAP or Active Directory authentication. This is necessary for TheHive to retrieve their role and authenticate them against the local database, LDAP and/or AD directories.

There are currently 3 roles:
- `read`: all non-sensitive data can be read. With this role, a user can't make any change: they can't add a case, task, log or observable, and they can't run analyzers;
- `write`: create, remove and change data of any type. This role is for standard users. The `write` role inherits `read` rights;
- `admin`: this role is reserved for TheHive administrators. Users with this role can manage user accounts and metrics, and create case templates and observable data types. `admin` inherits `write` rights.

**Warning**: Please note that user accounts cannot be removed once they have been created; otherwise audit logs would refer to an unknown user. However, unwanted or unused accounts can be locked.

## 2. Case template management

Some cases may share the same structure (tags, tasks, description, metrics). Templates automatically add tasks, a description or metrics when a new case is created. A user can choose to create an empty case or one based on a registered template.

To create a template, log in as _admin_, go to the administration menu and open the "Case templates" item.

![template](../files/adminguide_template.png)

On this screen, you can add, remove or change templates.
A template contains:
* default severity
* default tags
* title prefix (can be changed by the user at case creation)
* default TLP
* default description
* task list (title and description)
* metrics

Except for the title prefix, the task list and the metrics, the user can change the values defined in the template.

## 3. Report template management

When TheHive is connected to a Cortex server, observables can be analyzed to get additional information about them. Cortex outputs reports in JSON format. To make these reports more readable, you can configure report templates, which convert the JSON into HTML using the AngularJS template engine.

For each analyzer available in Cortex, you can define two kinds of templates: short and long. A short report displays synthetic information at the top of the observable page; with short reports, you can see a summary of all the analyses that have been run. A long report shows detailed information, but only when the user selects the report. The raw data in JSON format is always available.

Report templates can be configured in the `Admin` > `Report templates` menu. We offer report templates for the default Cortex analyzers. A package containing all report templates can be downloaded from https://dl.bintray.com/cert-bdf/thehive/report-templates.zip and imported using the `Import templates` button.

## 4. Metrics management

Metrics have been integrated to provide relevant indicators about cases.

Metrics are numerical values associated with cases (for example, the number of impacted users). Each metric has a _name_, a _title_ and a _description_, defined by an administrator. Once a metric is added to a case, it can't be removed and must be filled in. Metrics are used to monitor business indicators through graphs.

Metrics are defined globally. To create metrics, log in as _admin_, go to the administration menu and open the "Case metrics" item.

![metrics](../files/adminguide_metrics.png)


Metrics are used to create statistics (the "Statistics" item in the user profile menu). They can be filtered by time interval and by cases with specific tags.

For example, you can show the metrics of cases with the "malspam" tag in January 2016:

![statistics](../files/adminguide_statistics.png)

For time-based graphs, the user can choose which metrics to show. They are aggregated over an interval of time (by day, week, month or year) using a function (sum, min or max).

Some metrics are predefined (in addition to those defined by the administrator), like the case handling duration (how long the case has been open) and the number of cases opened or closed.
46 changes: 46 additions & 0 deletions admin/backup-restore.md
@@ -0,0 +1,46 @@
# Backup and restore data
All persistent data is stored in the ElasticSearch database. The backup and restore procedures are the ones detailed in the
[ElasticSearch documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/modules-snapshots.html).

_Note_: you may have to adapt the index names in the examples below. To find the right index, use the following command:

```
curl 'localhost:9200/_cat/indices?v'
```
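If several `the_hive_*` indices exist, a small pipeline can pick out the most recent one. The sample listing below is hypothetical, since the exact output depends on your cluster; on a live system, replace the variable with the output of the curl command itself:

```shell
# Hypothetical output of `curl -s 'localhost:9200/_cat/indices'`.
indices='green open the_hive_8 5 1 1200 0 2.1mb 2.1mb
green open the_hive_9 5 1 1350 0 2.4mb 2.4mb'

# Column 3 holds the index name; version-sort and keep the last one.
latest=$(printf '%s\n' "$indices" | awk '{print $3}' | grep '^the_hive_' | sort -V | tail -n 1)
echo "$latest"   # the_hive_9
```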

## 1. Create a snapshot repository
First, you must define a location on the local filesystem (where the ElasticSearch instance runs) where the backup will be
written. Be careful: if you run ElasticSearch in Docker, the directory must be mapped into the host filesystem using the `--volume`
parameter (cf. [Docker documentation](https://docs.docker.com/engine/tutorials/dockervolumes/)).

Create an ElasticSearch snapshot repository with the following command:
```
$ curl -XPUT 'http://localhost:9200/_snapshot/the_hive_backup' -d '{
  "type": "fs",
  "settings": {
    "location": "/absolute/path/to/backup/directory",
    "compress": true
  }
}'
```
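Note that ElasticSearch refuses to register a filesystem repository unless its path is whitelisted in `elasticsearch.yml` via the `path.repo` setting. A sketch, with the same example path as above:

```
# /etc/elasticsearch/elasticsearch.yml
path.repo: ["/absolute/path/to/backup/directory"]
```

Restart the ElasticSearch node after changing this setting.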

## 2. Backup your data
Start the backup by executing the following command:
```
$ curl -XPUT 'http://localhost:9200/_snapshot/the_hive_backup/snapshot_1' -d '{
  "indices": "the_hive_9"
}'
```
You can back up the latest TheHive index (you can list the indices in your ElasticSearch cluster with
`curl -s http://localhost:9200/_cat/indices | cut -d ' ' -f3`) or all indices using the `_all` value.

## 3. Restore data
Restoring does the reverse: it reads the backup from your snapshot directory and loads the indices into the ElasticSearch
cluster. This operation is done with the following command:
```
$ curl -XPOST 'http://localhost:9200/_snapshot/the_hive_backup/snapshot_1/_restore' -d '{
  "indices": "the_hive_9"
}'
```
