
Dedup logs | tail instead of head for new entries?

Aim:

  • Download logs via curl and ingest them into Splunk.
  • Ensure there are no duplicated events in the downloaded file (i.e. events that have already been pulled).

Problem:

  • I'm successfully pulling the logs via curl every 4 hours, thanks to the previous assistance. I'm then monitoring the relevant folder and automatically ingesting that file into Splunk.
  • The challenge I have is that the logs do not tail: new entries appear at the head of the file.
  • This means that Splunk can't track where it has read up to, so it re-ingests the whole file. If the logs tailed, this would not be a problem.
  • I can work around this, but it involves some complicated scripting (see the sketch below).
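
  A minimal sketch of the kind of workaround I mean, assuming the download lands in a single CSV file with a header row (the file names nextdns.csv and nextdns_tailed.csv are hypothetical):

    #!/bin/sh
    # Keep the header row, then reverse the remaining lines with tac so the
    # newest entries end up at the bottom, and write the result into the
    # folder that Splunk monitors. File names are placeholders.
    head -n 1 nextdns.csv > nextdns_tailed.csv
    tail -n +2 nextdns.csv | tac >> nextdns_tailed.csv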

Question:

  • I note the date filter in the API commands; can it be combined with the download command? (A hedged guess at the syntax follows this list.)
  • If so, can you provide an example?
  • Could you include an option in the download command to invert the order?
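
  Untested, and it assumes the download endpoint accepts the same from parameter as the logs endpoint, with relative times like -4h being valid:

    # Pull only the last 4 hours of logs as CSV.
    # :profile and $API_KEY are placeholders for the profile ID and API key.
    curl -s -H "X-Api-Key: $API_KEY" \
      "https://api.nextdns.io/profiles/:profile/logs/download?from=-4h" \
      -o nextdns.csv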

3 replies

null
    • NextDNS
    • 10 days ago

    Have you tried combining sort=asc with from or cursor when using the API?
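
    For example (an untested sketch; whether the download endpoint honors these query parameters is an assumption):

      # Request oldest-first ordering for the last 4 hours.
      curl -s -H "X-Api-Key: $API_KEY" \
        "https://api.nextdns.io/profiles/:profile/logs/download?sort=asc&from=-4h"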

      • heydillon
      • 10 days ago

      I tried, but was unable to configure it to work with Downloads. Can you assist with where it would go?

    • heydillon
    • 11 days ago

    I am doing the same thing as you and finding it challenging. Did you find a workable solution?
