  1. Aug 28, 2023
  2. Apr 04, 2023
  3. Mar 30, 2023
  4. Nov 09, 2022
  5. Mar 20, 2022
  6. Feb 25, 2022
    • filter-dnsq: skip 'special' queries for *.dotnxdomain.net · 9d6eb786
      Petr Špaček authored and Tomas Krizek committed
      By default, filter out queries for subdomains of dotnxdomain.net.
      This is a 'special' measurement domain. Queries directed to it have
      timestamps encoded in the qname, so replaying old queries results in
      timeouts rather than realistic traffic replay.
      
      A new option -s can be used to keep the queries in the output if
      desired.
      
      The other domain, dashnxdomain.net, did not appear in any of my PCAPs,
      so for simplicity I omitted it from the filtering code.
      
      Fixes: #25
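      
      For reference, a minimal Python sketch of the filtering rule (the actual
      filter-dnsq is a Lua script; the function and names below are purely
      illustrative and assume the qname is already available as a dotted string):
      
          SPECIAL_SUFFIX = '.dotnxdomain.net.'
      
          def keep_query(qname: str, keep_special: bool = False) -> bool:
              """Return True if the query should stay in the output."""
              if keep_special:          # corresponds to the new -s option
                  return True
              if not qname.endswith('.'):
                  qname += '.'
              # drop queries for subdomains of the 'special' measurement domain
              return not qname.lower().endswith(SPECIAL_SUFFIX)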
  7. Feb 24, 2022
    • filter-dnsq: always parse packets · d980c5db
      Petr Špaček authored
      Formerly, if malformed packets were requested in the output,
      the script skipped DNS parsing because there was no point in doing so:
      The malformed packets would have been included in the output anyway.
      
      As preparation for a new feature which requires access to the qname,
      the script now tries to parse the packet even if the user requested
      inclusion of malformed packets in the output.
      
      Even for tens of GB of data, the overhead of doing this was negligible
      compared to other processing, so I did not go to the trouble of
      optimizing it further.
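      
      A rough Python sketch of the new control flow, using dnspython's
      dns.message.from_wire as a stand-in parser (the real script does not use
      dnspython; names and structure here are illustrative only):
      
          import dns.message
      
          def process(wire: bytes, keep_malformed: bool):
              """Return the wire-format packet to write out, or None to drop it."""
              try:
                  # parsing is now attempted unconditionally, so the qname is available
                  msg = dns.message.from_wire(wire)
              except Exception:
                  # parsing failed: keep the packet only if malformed output was requested
                  return wire if keep_malformed else None
              qname = str(msg.question[0].name) if msg.question else ''
              # ... further qname-based filtering would go here ...
              return wire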
  8. Feb 16, 2022
  9. Feb 15, 2022
  10. Sep 09, 2021
  11. Jul 14, 2021
  12. Jun 04, 2021
    • cut-pcap.lua script to efficiently trim already sorted PCAPs · 1cbc93f0
      Petr Špaček authored
      Intended use is together with merge_chunks.py like this:
      merge_chunks.py ... | cut-pcap.lua - /tmp/short.pcap 60
      
      Stock editcap is designed to handle unsorted PCAPs and thus cannot stop
      on encountering the first packet with a timestamp over the limit.
      This is very inefficient for processing large PCAPs generated by
      extract-clients.lua, because the mergecap + editcap pipeline keeps
      processing all the "trailing" data, which can take a very long time for
      no benefit.
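      
      The early-stop idea can be sketched in Python with dpkt (the real
      cut-pcap.lua is a Lua script; this approximation only illustrates the
      behaviour, not its implementation):
      
          import sys
          import dpkt
      
          def cut(pcap_in, pcap_out, seconds):
              """Copy packets until the first timestamp past the limit, then stop."""
              reader = dpkt.pcap.Reader(pcap_in)
              writer = dpkt.pcap.Writer(pcap_out, linktype=reader.datalink())
              start = None
              for ts, buf in reader:
                  if start is None:
                      start = ts
                  if ts - start > seconds:
                      break   # input is sorted by time, so nothing later can qualify
                  writer.writepkt(buf, ts=ts)
      
          if __name__ == '__main__':
              # e.g.: merge_chunks.py ... | python3 cut_pcap_sketch.py /tmp/short.pcap 60
              with open(sys.argv[1], 'wb') as out:
                  cut(sys.stdin.buffer, out, float(sys.argv[2]))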
  13. May 12, 2021
  14. Feb 23, 2021
    • merge_chunks: utility to merge chunks on-the-fly · c378de50
      Petr Špaček authored
      The intended usage is to avoid generating PCAPs which are just simple
      combinations of "base chunks".
      
      Example:
      When the original PCAP is split into 100 chunks with 1 kQPS on average,
      testing the full range of 1k to 100k QPS formerly required generating
      100 distinct PCAPs, wasting time and storage.
      With this utility it is enough to generate the "base chunks" once and
      create arbitrary multiples on the fly.
      
      Why Python?
      - Lua cannot list the contents of a directory, an FFI solution would be
        unportable, and adding a dependency just for that seems like too much.
      - The BASH version of this script made me cry when I finished it.
      - Python is already a dependency of replay.py, and it is used only to
        process the mergecap arguments; the Python process terminates/replaces
        itself once its job is done.
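      
      A minimal Python sketch of that hand-off, assuming the "base chunks" are
      *.pcap files in a directory (this is not the actual merge_chunks.py
      interface; it only illustrates preparing the mergecap arguments and then
      replacing the process):
      
          import glob
          import os
          import sys
      
          def main():
              chunk_dir, count = sys.argv[1], int(sys.argv[2])
              chunks = sorted(glob.glob(os.path.join(chunk_dir, '*.pcap')))[:count]
              if not chunks:
                  sys.exit('no base chunks found in ' + chunk_dir)
              # hand over to mergecap, writing the merged PCAP to stdout;
              # the Python process is replaced once the arguments are prepared
              os.execvp('mergecap', ['mergecap', '-w', '-'] + chunks)
      
          if __name__ == '__main__':
              main()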
  15. Feb 15, 2021
  16. Feb 10, 2021
  17. Feb 09, 2021
  18. Feb 08, 2021
  19. Feb 01, 2021
  20. Jan 18, 2021
  21. Dec 01, 2020