Ensure responses with large tables produce a single array with no duplicated elements #30

Open
jsvd opened this issue Jul 13, 2022 · 0 comments

Comments

@jsvd
Copy link
Member

jsvd commented Jul 13, 2022

There are reports of network devices producing responses where an ifTable has more than 32 elements, causing the table to be split into two arrays, or to produce a single array in which multiple entries share the same index. This may originate in the snmp4j library.

A current workaround, by @kares, is to use a ruby filter script (https://gist.github.com/kares/04b7bf88c1fb5ac9e59b5a48dc2c04e5#file-merge_array_hash-rb-L39=) like so:

filter { 
  ruby { 
    path => '/opt/logstash/scripts/merge_array_hash.rb' 
    script_params => { 
      fields => ['interfaces'] 
      merge_key => 'index' 
    }
  }
}
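The core of that workaround can be sketched in plain Ruby: collapse an array of hashes into a single array with one hash per merge key, combining duplicate entries. This is an illustrative sketch of the idea, not the gist's exact code; the method name `merge_array_hash` and the sample data are assumptions for demonstration.

```ruby
# Merge an array of hashes so that entries sharing the same value under
# merge_key are combined into one hash (later values win on key conflicts).
def merge_array_hash(rows, merge_key)
  merged = {}
  rows.each do |row|
    key = row[merge_key]
    merged[key] = merged.key?(key) ? merged[key].merge(row) : row.dup
  end
  merged.values
end

# Hypothetical example of the symptom: two entries carry the same index
# because a large ifTable response was split.
interfaces = [
  { "index" => 1, "ifDescr" => "eth0" },
  { "index" => 2, "ifDescr" => "eth1" },
  { "index" => 1, "ifSpeed" => 1_000_000 }
]

merged = merge_array_hash(interfaces, "index")
# => [{"index"=>1, "ifDescr"=>"eth0", "ifSpeed"=>1000000},
#     {"index"=>2, "ifDescr"=>"eth1"}]
```

In the actual filter, the gist's script does this per event for each field named in `fields`, using the configured `merge_key` to decide which entries belong together.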

Work is required to investigate why the data arrives at the Logstash Event with these arrays, and to determine whether this is a bug in the snmp4j library or a setting that needs to be tweaked on the source network device.

@jsvd added the bug and status:needs-triage labels Jul 13, 2022
@jsvd changed the title from "Ensure responses with large tables produce single element arrays" to "Ensure responses with large tables produce a single array with no duplicated elements" Jul 13, 2022
@edmocosta transferred this issue from logstash-plugins/logstash-input-snmp May 2, 2024