11 January 2014, 15:12

Here we will look at SNORT®'s inner components and how they function, as well as SNORT performance sizing and tuning.



SNORT Inner Components


When you start using SNORT, you begin hearing terms like Packet Decoder, Pre-Processors, and

Detection Engine. Let's see what they are and what role they play.

Here is the SNORT inner layout:


Packet Decoder

The Packet Decoder takes the raw libpcap data and rebuilds the protocol data structures from layer 2 to

layer 4, much like Wireshark's protocol decoding. It identifies the network protocols in use.
The Packet Decoder can also trigger alerts when it encounters suspiciously malformed packets (protocol-wise).
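As a sketch, decoder-generated alerts can be tuned from snort.conf; the directives below use Snort 2.x syntax, and the choices shown are illustrative, not recommendations:

```
# snort.conf excerpt (Snort 2.x) -- illustrative settings
# Decoder alerts (generator ID 116) are enabled by default.
config checksum_mode: all          # verify layer 3/4 checksums while decoding
# config disable_decode_alerts     # uncomment to silence all decoder alerts
```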




Pre-Processors

The Pre-Processors are compiled plugins. They perform two tasks: detections that are beyond simple packet signature-matching, and further data structure preparation for the Detection Engine. Here are some examples:

. Detection of anomalies that go beyond single-packet analysis: the Portscan detector


. Further data structure preparation: de-fragmentation, stream reassembly, HTTP special

character decoding, ...

As an example, it is common for an attacker to use fragmentation to split an attack across multiple

packets in order to elude signature-based detection engines. The de-fragmentation pre-processor

reassembles fragmented packets to spot such attacks.
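For illustration, here is how the de-fragmentation (frag3), stream reassembly (stream5), and HTTP decoding pre-processors are typically enabled in a Snort 2.x snort.conf; the numeric limits and the "windows" target policy are example values, not recommendations:

```
# snort.conf excerpt (Snort 2.x) -- example values
preprocessor frag3_global: max_frags 65536
preprocessor frag3_engine: policy windows detect_anomalies

preprocessor stream5_global: track_tcp yes, max_tcp 262144
preprocessor stream5_tcp: policy windows, ports both all

preprocessor http_inspect: global iis_unicode_map unicode.map 1252
preprocessor http_inspect_server: server default profile all ports { 80 8080 }
```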



Detection Engine


The Detection Engine is the main detection component; it performs signature-based detection using the Ruleset.
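A minimal signature for the Detection Engine looks like the hypothetical local rule below (SIDs from 1,000,000 upward are reserved for local rules):

```
# local.rules -- hypothetical example rule
alert tcp $EXTERNAL_NET any -> $HOME_NET 80 \
    (msg:"LOCAL possible directory traversal"; \
     flow:to_server,established; content:"../"; \
     sid:1000001; rev:1;)
```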




Alert Generation

The Alert Generation component is responsible for handling the various alert modes: database, unified2 (the binary

format used by Barnyard2), syslog, console, as well as packet recording.
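These alert modes are selected with output directives in snort.conf; the file names and limits below are illustrative:

```
# snort.conf excerpt -- illustrative file names
output unified2: filename snort.u2, limit 128   # binary log, read by Barnyard2
output alert_syslog: LOG_AUTH LOG_ALERT         # send alerts to syslog
# output database: log, mysql, user=snort dbname=snort host=localhost  # slower
```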





SNORT Performance

Proper SNORT performance sizing and tuning is very important: the real goal is a NIDS that drops no packets.

We don't want our IDS to drop packets, and thus miss possibly suspicious activity, or to become a bottleneck.

Since version 2.0, SNORT has been capable of handling Gigabit traffic.



The basic SNORT performance is determined by:

    . The Traffic being monitored

    . The Hardware/Software platform

The Traffic being monitored

Obviously, high-bandwidth traffic will require more processing power. Besides this, certain types of traffic may be more resource-consuming (if they require extra decoding, more reassembly, ...).



The Hardware/Software platform

Hardware performance:

    . CPU power (more on this later)

    . Amount of memory (e.g., stream reassembly consumes memory)

    . HDD performance (to avoid a database logging bottleneck)

    . The type of NICs used



Software performance:

    . Using Barnyard2 instead of network or database logging improves software performance. Gigabit handling is only possible in unified2 / Barnyard2 mode.

    . Lighter ruleset: the lighter the ruleset, the easier the analysis task is on the CPU.

    . Fewer preprocessor rules: preprocessor rules are CPU-intensive.

Usually, the SNORT bottleneck is either at the detection stage or at the output stage.
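As a sketch of offloading the output stage: Snort writes only the fast unified2 binary log, and Barnyard2 reads that spool and performs the slow database or network logging asynchronously. The paths and file names below are assumptions:

```
# Barnyard2 drains Snort's unified2 spool (paths are illustrative)
barnyard2 -c /etc/snort/barnyard2.conf \
          -d /var/log/snort -f snort.u2 \
          -w /var/log/snort/barnyard2.waldo
```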

SNORT and multithreading / multi-CPU

SNORT is not multithreaded. While at first this may look like a weakness, it actually isn't. SNORT analysis is very linear, so it benefits little from multithreading while paying its synchronization penalties in full. Faster cores are better here than more cores.

While SNORT itself is not multithreaded, several approaches can be used to enhance and optimize its performance.

1. Pin auxiliary software to other cores (i.e., pin Barnyard2 / MySQL to another core)
2. Run several SNORT instances pinned to different cores, using different rulesets (same stream analyzed, different rulesets)
3. Run a load-balancing frontend, with several SNORT instances pinned to different cores (different streams analyzed, same ruleset used)
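On Linux, approaches 1 and 2 can be sketched with taskset for CPU pinning; the configuration file names are hypothetical:

```
# Pin two Snort instances (different rulesets) to separate cores
taskset -c 0 snort -c /etc/snort/snort-web.conf   -i eth0 -D
taskset -c 1 snort -c /etc/snort/snort-other.conf -i eth0 -D

# Keep Barnyard2 (and MySQL) off the capture cores
taskset -c 2 barnyard2 -c /etc/snort/barnyard2.conf \
        -d /var/log/snort -f snort.u2 -w /var/log/snort/by2.waldo
```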



- SNORT® is a registered trademark  of Sourcefire, Inc. -

Published by Computer Outlines - in NIDS