Service Virtualization

  • 1.  Parsing filters for DevTest logs ?

    Posted Nov 02, 2016 10:12 AM

    To simplify DevTest administration, we centralize the distributed logs (Registry, VSE, EDashboard) with the Filebeat / Logstash / Elasticsearch / Kibana suite.

    Do you have "grok" filters available to parse the log message structure?  



  • 2.  Re: Parsing filters for DevTest logs ?
    Best Answer

    Posted Jun 02, 2017 05:12 PM

    pbernard.1, did you ever get the grok patterns to work against the DevTest log files? If so, can you share your findings?

     

    I just started looking at the Elastic Stack as a way to parse the different types of output seen in the DevTest logs. I am curious whether the following configuration gets the output formatted close enough. I would like the flow to be Filebeat -> Logstash -> Elasticsearch, and then use Kibana to render the dashboards.
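
    In case it helps, here is a minimal sketch of the Logstash plumbing for that flow. The port, host, and index name are placeholders I chose (5044 is the conventional Beats port); adjust them for your environment:

    input {
      beats {
        # Filebeat ships events to Logstash on this port
        port => 5044
      }
    }
    output {
      elasticsearch {
        # Placeholder host and index name - adjust for your environment
        hosts => ["localhost:9200"]
        index => "devtest-%{+YYYY.MM.dd}"
      }
    }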

     

    Enable Filebeat to handle multi-line entries when reading the log files. A leading date/time stamp marks the start of a new log event; lines without one (such as stack traces) are appended to the previous event.

    filebeat.yml file:

    filebeat.prospectors:
    - input_type: log
      paths:
        - <path to log files>\vse.log
        - <path to logs>\vse_matches.log
      # Filebeat multiline patterns are plain regular expressions (not grok
      # patterns), so match the leading ISO8601-style timestamp with a regex.
      # Lines that do not start with a timestamp join the preceding event.
      multiline.pattern: '^\d{4}-\d{2}-\d{2}'
      multiline.negate: true
      multiline.match: after
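
    Filebeat also needs an output section pointing at Logstash. A minimal sketch, assuming Logstash listens on the default Beats port on the same machine (adjust the host for your setup):

    output.logstash:
      # Placeholder host - point this at the machine running Logstash
      hosts: ["localhost:5044"]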

     

    Set up a grok filter to parse the date/time, capture [Event Sink...] as "desc", the level (e.g., INFO) as "loglevel", the com.itko... class name as "class", and the remainder of the output up to the next entry as the "message".

     

    logstash.conf

    filter {
      grok {
        # Capture the timestamp and level into named fields, then replace the
        # original "message" field with just the trailing message text.
        match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}( \()%{HOUR}:%{MINUTE}(\) |\)\[)%{GREEDYDATA:desc}(\] )%{LOGLEVEL:loglevel}( )%{GREEDYDATA:class}( - )%{GREEDYDATA:message}" }
        overwrite => [ "message" ]
      }
    }
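
    A likely next step (a sketch I have not tested against real DevTest output): a date filter so each event's @timestamp comes from the log line itself rather than ingest time. The "timestamp" field is the one captured by the grok above, and the layouts are assumptions about the DevTest timestamp format:

    filter {
      date {
        # Assumed timestamp layouts - verify against actual vse.log entries
        match => [ "timestamp", "ISO8601", "yyyy-MM-dd HH:mm:ss,SSS" ]
      }
    }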

    Found this site to test grok patterns: Test grok patterns

    Found this site that shows some of the 'canned' grok patterns: logstash/grok-patterns at v1.4.2 · elastic/logstash · GitHub  

    I passed in several different combinations of log messages and the pattern worked.

    I have not tried multi-line output yet.