
Dev v1.23.3 (#426)

* make certificate search more robust to different line endings

* use license_files instead of license_file which is deprecated

* version bump

* parsing extra options -e, -o, -p

* fix for extra opts and different field length at option -[aeop]

* test integration for extra opts -e -o -p

* formatting and use ast.literal_eval instead of eval

* doc update

* doc update

* Add a parser to parse mounted encrypted veracrypt volumes (fixes #403)

* update compatibility warning message

* netstat windows parser

* tests

* Windows route parser

* tests

* id should be a string

* add veracrypt parser and docs

* formatting

* doc update

* lsattr parser

* Update test_lsattr.py

* changed keys to lowercase

* changed info

* support missing data for stat

* doc update

* doc update

* doc update

* ensure compatibility warning prints even with no data

* improve compatibility message

* add support for dig +nsid option

* New parser: srt (#415)

* srt parser

* changed the parser to support more complex cases

* doc updates

* Adding certificate request parser (#416)

* Adding certificate request parser

* Adding the CSR type for Windows-style CSR

---------

Co-authored-by: Stg22 <stephane.for.test@gmail.com>

* doc update

* add csr tests

* Last -x (#422)

* Refactored the parser

* last -x support

* doc update

* fix for ping on linux with missing hostname

* allow less strict email decoding with a warning.

* doc update

* use explicit ascii decode with backslashreplace

* doc update

* use jc warning function instead of print for warning message

* last -x shutdown fix (#423)

* inject quiet setting into asn1crypto library

* Parse appearance and modalias lines for mouse devices (fixes #419) (#425)

The bluetoothctl device parser is implemented so that it aborts the parsing
process immediately, returning what it has collected so far. This is because
the parser should work in a hybrid way to support output coming from both
bluetoothctl devices and bluetoothctl info calls.
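A minimal illustrative sketch of such a hybrid approach (the function and regex names here are hypothetical, not the parser's actual internals):

```python
import re
from typing import Dict, List

# Hypothetical sketch: handle both one-line `bluetoothctl devices` records and
# multi-line `bluetoothctl info` detail blocks, returning whatever was
# collected so far when an unrecognized line is reached.
DEVICE_HEAD = re.compile(r'^Device (?P<address>([0-9A-F]{2}:){5}[0-9A-F]{2}) (?P<name>.+)$')

def parse_hybrid(lines: List[str]) -> List[Dict]:
    results: List[Dict] = []
    current: Dict = {}
    for line in lines:
        head = DEVICE_HEAD.match(line.strip())
        if head:
            # a new device header starts a new record (devices-style output)
            if current:
                results.append(current)
            current = {'address': head['address'], 'name': head['name']}
        elif line.startswith((' ', '\t')) and ':' in line and current:
            # indented "Key: value" detail lines (info-style output)
            key, _, value = line.strip().partition(':')
            current[key.strip().lower()] = value.strip()
        else:
            break  # abort and keep what was collected so far
    if current:
        results.append(current)
    return results
```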

* doc update

* doc update

---------

Co-authored-by: gerd <gerd.augstein@gmail.com>
Co-authored-by: Jake Ob <iakopap@gmail.com>
Co-authored-by: Mevaser <mevaser.rotner@gmail.com>
Co-authored-by: M.R <69431152+YeahItsMeAgain@users.noreply.github.com>
Co-authored-by: Stg22 <46686290+Stg22@users.noreply.github.com>
Co-authored-by: Stg22 <stephane.for.test@gmail.com>
This commit is contained in:
Kelly Brazil
2023-06-22 01:48:23 +03:00
committed by GitHub
parent 5527d22459
commit 5023e5be4c
102 changed files with 6220 additions and 263 deletions

View File

@ -1,5 +1,23 @@
jc changelog
20230621 v1.23.3
- Add `lsattr` command parser
- Add `srt` file parser
- Add `veracrypt` command parser
- Add X509 Certificate Request file parser
- Enhance X509 Certificate parser to allow non-compliant email addresses with a warning
- Enhance `dig` command parser to support the `+nsid` option
- Enhance `last` and `lastb` command parser to support the `-x` option
- Enhance `route` command parser to add Windows support
- Enhance `netstat` command parser to add Windows support
- Enhance `ss` command parser to support extended options
- Enhance the compatibility warning message
- Fix `bluetoothctl` command parser for some mouse devices
- Fix `certbot` command parser to be more robust with different line endings
- Fix `ping` command parsers for output with missing hostname
- Fix `stat` command parser for older versions that may not contain all fields
- Fix deprecated option in `setup.cfg`
20230429 v1.23.2
- Add `bluetoothctl` command parser
- Add `certbot` command parser for `certificates` and `show_account` options

View File

@ -4551,6 +4551,57 @@ cat entrust.pem | jc --x509-cert -p
}
]
```
### X.509 PEM and DER certificate request files
```bash
cat myserver.csr | jc --x509-csr -p
```
```json
[
{
"certification_request_info": {
"version": "v1",
"subject": {
"common_name": "myserver.for.example"
},
"subject_pk_info": {
"algorithm": {
"algorithm": "ec",
"parameters": "secp256r1"
},
"public_key": "04:40:33:c0:91:8f:e9:46:ea:d0:dc:d0:f9:63:2c:a4:35:1f:0f:54:c8:a9:9b:e3:9e:d4:f3:64:b8:60:cc:7f:39:75:dd:a7:61:31:02:7c:9e:89:c6:db:45:15:f2:5f:b0:65:29:0b:42:d2:6e:c2:ea:a6:23:bd:fc:65:e5:7d:4e"
},
"attributes": [
{
"type": "extension_request",
"values": [
[
{
"extn_id": "extended_key_usage",
"critical": false,
"extn_value": [
"server_auth"
]
},
{
"extn_id": "subject_alt_name",
"critical": false,
"extn_value": [
"myserver.for.example"
]
}
]
]
}
]
},
"signature_algorithm": {
"algorithm": "sha384_ecdsa",
"parameters": null
},
"signature": "30:45:02:20:77:ac:5b:51:bf:c5:f5:43:02:52:ae:66:8a:fe:95:98:98:98:a9:45:34:31:08:ff:2c:cc:92:d9:1c:70:28:74:02:21:00:97:79:7b:e7:45:18:76:cf:d7:3b:79:34:56:d2:69:b5:73:41:9b:8a:b7:ad:ec:80:23:c1:2f:64:da:e5:28:19"
}
]
```
### XML files
```bash
cat cd_catalog.xml

View File

@ -5,11 +5,13 @@
> Try the `jc` [web demo](https://jc-web.onrender.com/) and [REST API](https://github.com/kellyjonbrazil/jc-restapi)
> JC is [now available](https://galaxy.ansible.com/community/general) as an
> `jc` is [now available](https://galaxy.ansible.com/community/general) as an
Ansible filter plugin in the `community.general` collection. See this
[blog post](https://blog.kellybrazil.com/2020/08/30/parsing-command-output-in-ansible-with-jc/)
for an example.
> Looking for something like `jc` but lower-level? Check out [regex2json](https://gitlab.com/tozd/regex2json).
# JC
JSON Convert
@ -217,6 +219,7 @@ option.
| `--last` | `last` and `lastb` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/last) |
| `--ls` | `ls` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/ls) |
| `--ls-s` | `ls` command streaming parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/ls_s) |
| `--lsattr` | `lsattr` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/lsattr) |
| `--lsblk` | `lsblk` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/lsblk) |
| `--lsmod` | `lsmod` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/lsmod) |
| `--lsof` | `lsof` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/lsof) |
@ -252,6 +255,7 @@ option.
| `--semver` | Semantic Version string parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/semver) |
| `--sfdisk` | `sfdisk` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/sfdisk) |
| `--shadow` | `/etc/shadow` file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/shadow) |
| `--srt` | SRT file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/srt) |
| `--ss` | `ss` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/ss) |
| `--ssh-conf` | `ssh` config file and `ssh -G` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/ssh_conf) |
| `--sshd-conf` | `sshd` config file and `sshd -T` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/sshd_conf) |
@ -285,12 +289,14 @@ option.
| `--uptime` | `uptime` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/uptime) |
| `--url` | URL string parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/url) |
| `--ver` | Version string parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/ver) |
| `--veracrypt` | `veracrypt` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/veracrypt) |
| `--vmstat` | `vmstat` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/vmstat) |
| `--vmstat-s` | `vmstat` command streaming parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/vmstat_s) |
| `--w` | `w` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/w) |
| `--wc` | `wc` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/wc) |
| `--who` | `who` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/who) |
| `--x509-cert` | X.509 PEM and DER certificate file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/x509_cert) |
| `--x509-csr` | X.509 PEM and DER certificate request file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/x509_csr) |
| `--xml` | XML file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/xml) |
| `--xrandr` | `xrandr` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/xrandr) |
| `--yaml` | YAML file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/yaml) |

View File

@ -3,8 +3,8 @@ _jc()
local cur prev words cword jc_commands jc_parsers jc_options \
jc_about_options jc_about_mod_options jc_help_options jc_special_options
jc_commands=(acpi airport arp blkid bluetoothctl cbt certbot chage cksum crontab date df dig dmidecode dpkg du env file findmnt finger free git gpg hciconfig id ifconfig iostat iptables iw iwconfig jobs last lastb ls lsblk lsmod lsof lspci lsusb md5 md5sum mdadm mount mpstat netstat nmcli ntpq os-prober pidstat ping ping6 pip pip3 postconf printenv ps route rpm rsync sfdisk sha1sum sha224sum sha256sum sha384sum sha512sum shasum ss ssh sshd stat sum sysctl systemctl systeminfo timedatectl top tracepath tracepath6 traceroute traceroute6 udevadm ufw uname update-alternatives upower uptime vdir vmstat w wc who xrandr zipinfo zpool)
jc_parsers=(--acpi --airport --airport-s --arp --asciitable --asciitable-m --blkid --bluetoothctl --cbt --cef --cef-s --certbot --chage --cksum --clf --clf-s --crontab --crontab-u --csv --csv-s --date --datetime-iso --df --dig --dir --dmidecode --dpkg-l --du --email-address --env --file --findmnt --finger --free --fstab --git-log --git-log-s --git-ls-remote --gpg --group --gshadow --hash --hashsum --hciconfig --history --hosts --id --ifconfig --ini --ini-dup --iostat --iostat-s --ip-address --iptables --iw-scan --iwconfig --jar-manifest --jobs --jwt --kv --last --ls --ls-s --lsblk --lsmod --lsof --lspci --lsusb --m3u --mdadm --mount --mpstat --mpstat-s --netstat --nmcli --ntpq --openvpn --os-prober --passwd --pci-ids --pgpass --pidstat --pidstat-s --ping --ping-s --pip-list --pip-show --plist --postconf --proc --proc-buddyinfo --proc-consoles --proc-cpuinfo --proc-crypto --proc-devices --proc-diskstats --proc-filesystems --proc-interrupts --proc-iomem --proc-ioports --proc-loadavg --proc-locks --proc-meminfo --proc-modules --proc-mtrr --proc-pagetypeinfo --proc-partitions --proc-slabinfo --proc-softirqs --proc-stat --proc-swaps --proc-uptime --proc-version --proc-vmallocinfo --proc-vmstat --proc-zoneinfo --proc-driver-rtc --proc-net-arp --proc-net-dev --proc-net-dev-mcast --proc-net-if-inet6 --proc-net-igmp --proc-net-igmp6 --proc-net-ipv6-route --proc-net-netlink --proc-net-netstat --proc-net-packet --proc-net-protocols --proc-net-route --proc-net-unix --proc-pid-fdinfo --proc-pid-io --proc-pid-maps --proc-pid-mountinfo --proc-pid-numa-maps --proc-pid-smaps --proc-pid-stat --proc-pid-statm --proc-pid-status --ps --route --rpm-qi --rsync --rsync-s --semver --sfdisk --shadow --ss --ssh-conf --sshd-conf --stat --stat-s --sysctl --syslog --syslog-s --syslog-bsd --syslog-bsd-s --systemctl --systemctl-lj --systemctl-ls --systemctl-luf --systeminfo --time --timedatectl --timestamp --toml --top --top-s --tracepath --traceroute --udevadm --ufw --ufw-appinfo --uname --update-alt-gs --update-alt-q --upower --uptime --url --ver --vmstat --vmstat-s --w --wc --who --x509-cert --xml --xrandr --yaml --zipinfo --zpool-iostat --zpool-status)
jc_commands=(acpi airport arp blkid bluetoothctl cbt certbot chage cksum crontab date df dig dmidecode dpkg du env file findmnt finger free git gpg hciconfig id ifconfig iostat iptables iw iwconfig jobs last lastb ls lsattr lsblk lsmod lsof lspci lsusb md5 md5sum mdadm mount mpstat netstat nmcli ntpq os-prober pidstat ping ping6 pip pip3 postconf printenv ps route rpm rsync sfdisk sha1sum sha224sum sha256sum sha384sum sha512sum shasum ss ssh sshd stat sum sysctl systemctl systeminfo timedatectl top tracepath tracepath6 traceroute traceroute6 udevadm ufw uname update-alternatives upower uptime vdir veracrypt vmstat w wc who xrandr zipinfo zpool)
jc_parsers=(--acpi --airport --airport-s --arp --asciitable --asciitable-m --blkid --bluetoothctl --cbt --cef --cef-s --certbot --chage --cksum --clf --clf-s --crontab --crontab-u --csv --csv-s --date --datetime-iso --df --dig --dir --dmidecode --dpkg-l --du --email-address --env --file --findmnt --finger --free --fstab --git-log --git-log-s --git-ls-remote --gpg --group --gshadow --hash --hashsum --hciconfig --history --hosts --id --ifconfig --ini --ini-dup --iostat --iostat-s --ip-address --iptables --iw-scan --iwconfig --jar-manifest --jobs --jwt --kv --last --ls --ls-s --lsattr --lsblk --lsmod --lsof --lspci --lsusb --m3u --mdadm --mount --mpstat --mpstat-s --netstat --nmcli --ntpq --openvpn --os-prober --passwd --pci-ids --pgpass --pidstat --pidstat-s --ping --ping-s --pip-list --pip-show --plist --postconf --proc --proc-buddyinfo --proc-consoles --proc-cpuinfo --proc-crypto --proc-devices --proc-diskstats --proc-filesystems --proc-interrupts --proc-iomem --proc-ioports --proc-loadavg --proc-locks --proc-meminfo --proc-modules --proc-mtrr --proc-pagetypeinfo --proc-partitions --proc-slabinfo --proc-softirqs --proc-stat --proc-swaps --proc-uptime --proc-version --proc-vmallocinfo --proc-vmstat --proc-zoneinfo --proc-driver-rtc --proc-net-arp --proc-net-dev --proc-net-dev-mcast --proc-net-if-inet6 --proc-net-igmp --proc-net-igmp6 --proc-net-ipv6-route --proc-net-netlink --proc-net-netstat --proc-net-packet --proc-net-protocols --proc-net-route --proc-net-unix --proc-pid-fdinfo --proc-pid-io --proc-pid-maps --proc-pid-mountinfo --proc-pid-numa-maps --proc-pid-smaps --proc-pid-stat --proc-pid-statm --proc-pid-status --ps --route --rpm-qi --rsync --rsync-s --semver --sfdisk --shadow --srt --ss --ssh-conf --sshd-conf --stat --stat-s --sysctl --syslog --syslog-s --syslog-bsd --syslog-bsd-s --systemctl --systemctl-lj --systemctl-ls --systemctl-luf --systeminfo --time --timedatectl --timestamp --toml --top --top-s --tracepath --traceroute --udevadm --ufw --ufw-appinfo --uname --update-alt-gs --update-alt-q --upower --uptime --url --ver --veracrypt --vmstat --vmstat-s --w --wc --who --x509-cert --x509-csr --xml --xrandr --yaml --zipinfo --zpool-iostat --zpool-status)
jc_options=(--force-color -C --debug -d --monochrome -m --meta-out -M --pretty -p --quiet -q --raw -r --unbuffer -u --yaml-out -y)
jc_about_options=(--about -a)
jc_about_mod_options=(--pretty -p --yaml-out -y --monochrome -m --force-color -C)

View File

@ -9,7 +9,7 @@ _jc() {
jc_help_options jc_help_options_describe \
jc_special_options jc_special_options_describe
jc_commands=(acpi airport arp blkid bluetoothctl cbt certbot chage cksum crontab date df dig dmidecode dpkg du env file findmnt finger free git gpg hciconfig id ifconfig iostat iptables iw iwconfig jobs last lastb ls lsblk lsmod lsof lspci lsusb md5 md5sum mdadm mount mpstat netstat nmcli ntpq os-prober pidstat ping ping6 pip pip3 postconf printenv ps route rpm rsync sfdisk sha1sum sha224sum sha256sum sha384sum sha512sum shasum ss ssh sshd stat sum sysctl systemctl systeminfo timedatectl top tracepath tracepath6 traceroute traceroute6 udevadm ufw uname update-alternatives upower uptime vdir vmstat w wc who xrandr zipinfo zpool)
jc_commands=(acpi airport arp blkid bluetoothctl cbt certbot chage cksum crontab date df dig dmidecode dpkg du env file findmnt finger free git gpg hciconfig id ifconfig iostat iptables iw iwconfig jobs last lastb ls lsattr lsblk lsmod lsof lspci lsusb md5 md5sum mdadm mount mpstat netstat nmcli ntpq os-prober pidstat ping ping6 pip pip3 postconf printenv ps route rpm rsync sfdisk sha1sum sha224sum sha256sum sha384sum sha512sum shasum ss ssh sshd stat sum sysctl systemctl systeminfo timedatectl top tracepath tracepath6 traceroute traceroute6 udevadm ufw uname update-alternatives upower uptime vdir veracrypt vmstat w wc who xrandr zipinfo zpool)
jc_commands_describe=(
'acpi:run "acpi" command with magic syntax.'
'airport:run "airport" command with magic syntax.'
@ -45,6 +45,7 @@ _jc() {
'last:run "last" command with magic syntax.'
'lastb:run "lastb" command with magic syntax.'
'ls:run "ls" command with magic syntax.'
'lsattr:run "lsattr" command with magic syntax.'
'lsblk:run "lsblk" command with magic syntax.'
'lsmod:run "lsmod" command with magic syntax.'
'lsof:run "lsof" command with magic syntax.'
@ -98,6 +99,7 @@ _jc() {
'upower:run "upower" command with magic syntax.'
'uptime:run "uptime" command with magic syntax.'
'vdir:run "vdir" command with magic syntax.'
'veracrypt:run "veracrypt" command with magic syntax.'
'vmstat:run "vmstat" command with magic syntax.'
'w:run "w" command with magic syntax.'
'wc:run "wc" command with magic syntax.'
@ -106,7 +108,7 @@ _jc() {
'zipinfo:run "zipinfo" command with magic syntax.'
'zpool:run "zpool" command with magic syntax.'
)
jc_parsers=(--acpi --airport --airport-s --arp --asciitable --asciitable-m --blkid --bluetoothctl --cbt --cef --cef-s --certbot --chage --cksum --clf --clf-s --crontab --crontab-u --csv --csv-s --date --datetime-iso --df --dig --dir --dmidecode --dpkg-l --du --email-address --env --file --findmnt --finger --free --fstab --git-log --git-log-s --git-ls-remote --gpg --group --gshadow --hash --hashsum --hciconfig --history --hosts --id --ifconfig --ini --ini-dup --iostat --iostat-s --ip-address --iptables --iw-scan --iwconfig --jar-manifest --jobs --jwt --kv --last --ls --ls-s --lsblk --lsmod --lsof --lspci --lsusb --m3u --mdadm --mount --mpstat --mpstat-s --netstat --nmcli --ntpq --openvpn --os-prober --passwd --pci-ids --pgpass --pidstat --pidstat-s --ping --ping-s --pip-list --pip-show --plist --postconf --proc --proc-buddyinfo --proc-consoles --proc-cpuinfo --proc-crypto --proc-devices --proc-diskstats --proc-filesystems --proc-interrupts --proc-iomem --proc-ioports --proc-loadavg --proc-locks --proc-meminfo --proc-modules --proc-mtrr --proc-pagetypeinfo --proc-partitions --proc-slabinfo --proc-softirqs --proc-stat --proc-swaps --proc-uptime --proc-version --proc-vmallocinfo --proc-vmstat --proc-zoneinfo --proc-driver-rtc --proc-net-arp --proc-net-dev --proc-net-dev-mcast --proc-net-if-inet6 --proc-net-igmp --proc-net-igmp6 --proc-net-ipv6-route --proc-net-netlink --proc-net-netstat --proc-net-packet --proc-net-protocols --proc-net-route --proc-net-unix --proc-pid-fdinfo --proc-pid-io --proc-pid-maps --proc-pid-mountinfo --proc-pid-numa-maps --proc-pid-smaps --proc-pid-stat --proc-pid-statm --proc-pid-status --ps --route --rpm-qi --rsync --rsync-s --semver --sfdisk --shadow --ss --ssh-conf --sshd-conf --stat --stat-s --sysctl --syslog --syslog-s --syslog-bsd --syslog-bsd-s --systemctl --systemctl-lj --systemctl-ls --systemctl-luf --systeminfo --time --timedatectl --timestamp --toml --top --top-s --tracepath --traceroute --udevadm --ufw --ufw-appinfo --uname --update-alt-gs --update-alt-q --upower --uptime --url --ver --vmstat --vmstat-s --w --wc --who --x509-cert --xml --xrandr --yaml --zipinfo --zpool-iostat --zpool-status)
jc_parsers=(--acpi --airport --airport-s --arp --asciitable --asciitable-m --blkid --bluetoothctl --cbt --cef --cef-s --certbot --chage --cksum --clf --clf-s --crontab --crontab-u --csv --csv-s --date --datetime-iso --df --dig --dir --dmidecode --dpkg-l --du --email-address --env --file --findmnt --finger --free --fstab --git-log --git-log-s --git-ls-remote --gpg --group --gshadow --hash --hashsum --hciconfig --history --hosts --id --ifconfig --ini --ini-dup --iostat --iostat-s --ip-address --iptables --iw-scan --iwconfig --jar-manifest --jobs --jwt --kv --last --ls --ls-s --lsattr --lsblk --lsmod --lsof --lspci --lsusb --m3u --mdadm --mount --mpstat --mpstat-s --netstat --nmcli --ntpq --openvpn --os-prober --passwd --pci-ids --pgpass --pidstat --pidstat-s --ping --ping-s --pip-list --pip-show --plist --postconf --proc --proc-buddyinfo --proc-consoles --proc-cpuinfo --proc-crypto --proc-devices --proc-diskstats --proc-filesystems --proc-interrupts --proc-iomem --proc-ioports --proc-loadavg --proc-locks --proc-meminfo --proc-modules --proc-mtrr --proc-pagetypeinfo --proc-partitions --proc-slabinfo --proc-softirqs --proc-stat --proc-swaps --proc-uptime --proc-version --proc-vmallocinfo --proc-vmstat --proc-zoneinfo --proc-driver-rtc --proc-net-arp --proc-net-dev --proc-net-dev-mcast --proc-net-if-inet6 --proc-net-igmp --proc-net-igmp6 --proc-net-ipv6-route --proc-net-netlink --proc-net-netstat --proc-net-packet --proc-net-protocols --proc-net-route --proc-net-unix --proc-pid-fdinfo --proc-pid-io --proc-pid-maps --proc-pid-mountinfo --proc-pid-numa-maps --proc-pid-smaps --proc-pid-stat --proc-pid-statm --proc-pid-status --ps --route --rpm-qi --rsync --rsync-s --semver --sfdisk --shadow --srt --ss --ssh-conf --sshd-conf --stat --stat-s --sysctl --syslog --syslog-s --syslog-bsd --syslog-bsd-s --systemctl --systemctl-lj --systemctl-ls --systemctl-luf --systeminfo --time --timedatectl --timestamp --toml --top --top-s --tracepath --traceroute --udevadm --ufw --ufw-appinfo --uname --update-alt-gs --update-alt-q --upower --uptime --url --ver --veracrypt --vmstat --vmstat-s --w --wc --who --x509-cert --x509-csr --xml --xrandr --yaml --zipinfo --zpool-iostat --zpool-status)
jc_parsers_describe=(
'--acpi:`acpi` command parser'
'--airport:`airport -I` command parser'
@ -171,6 +173,7 @@ _jc() {
'--last:`last` and `lastb` command parser'
'--ls:`ls` command parser'
'--ls-s:`ls` command streaming parser'
'--lsattr:`lsattr` command parser'
'--lsblk:`lsblk` command parser'
'--lsmod:`lsmod` command parser'
'--lsof:`lsof` command parser'
@ -255,6 +258,7 @@ _jc() {
'--semver:Semantic Version string parser'
'--sfdisk:`sfdisk` command parser'
'--shadow:`/etc/shadow` file parser'
'--srt:SRT file parser'
'--ss:`ss` command parser'
'--ssh-conf:`ssh` config file and `ssh -G` command parser'
'--sshd-conf:`sshd` config file and `sshd -T` command parser'
@ -288,12 +292,14 @@ _jc() {
'--uptime:`uptime` command parser'
'--url:URL string parser'
'--ver:Version string parser'
'--veracrypt:`veracrypt` command parser'
'--vmstat:`vmstat` command parser'
'--vmstat-s:`vmstat` command streaming parser'
'--w:`w` command parser'
'--wc:`wc` command parser'
'--who:`who` command parser'
'--x509-cert:X.509 PEM and DER certificate file parser'
'--x509-csr:X.509 PEM and DER certificate request file parser'
'--xml:XML file parser'
'--xrandr:`xrandr` command parser'
'--yaml:YAML file parser'

View File

@ -36,6 +36,7 @@ a controller and a device but there might be fields corresponding to one entity.
"name": string,
"is_default": boolean,
"is_public": boolean,
"is_random": boolean,
"address": string,
"alias": string,
"class": string,
@ -54,8 +55,10 @@ a controller and a device but there might be fields corresponding to one entity.
{
"name": string,
"is_public": boolean,
"is_random": boolean,
"address": string,
"alias": string,
"appearance": string,
"class": string,
"icon": string,
"paired": string,
@ -66,7 +69,8 @@ a controller and a device but there might be fields corresponding to one entity.
"legacy_pairing": string,
"rssi": int,
"txpower": int,
"uuids": array
"uuids": array,
"modalias": string
}
]
@ -126,4 +130,4 @@ Returns:
### Parser Information
Compatibility: linux
Version 1.0 by Jake Ob (iakopap at gmail.com)
Version 1.1 by Jake Ob (iakopap at gmail.com)

View File

@ -158,4 +158,4 @@ Returns:
### Parser Information
Compatibility: linux, darwin, cygwin, win32, aix, freebsd
Version 1.0 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.1 by Kelly Brazil (kellyjonbrazil@gmail.com)

View File

@ -9,6 +9,7 @@ Options supported:
- `+noall +answer` options are supported in cases where only the answer
information is desired.
- `+axfr` option is supported on its own
- `+nsid` option is supported
The `when_epoch` calculated timestamp field is naive. (i.e. based on the
local time of the system the parser is run on)
@ -345,4 +346,4 @@ Returns:
### Parser Information
Compatibility: linux, aix, freebsd, darwin, win32, cygwin
Version 2.4 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 2.5 by Kelly Brazil (kellyjonbrazil@gmail.com)

View File

@ -5,7 +5,7 @@
jc - JSON Convert `last` and `lastb` command output parser
Supports `-w` and `-F` options.
Supports `-w`, `-F`, and `-x` options.
Calculated epoch time fields are naive (i.e. based on the local time of the
system the parser is run on) since there is no timezone information in the
@ -127,4 +127,4 @@ Returns:
### Parser Information
Compatibility: linux, darwin, aix, freebsd
Version 1.8 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.9 by Kelly Brazil (kellyjonbrazil@gmail.com)

docs/parsers/lsattr.md Normal file (89 lines added)
View File

@ -0,0 +1,89 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.lsattr"></a>
# jc.parsers.lsattr
jc - JSON Convert `lsattr` command output parser
Usage (cli):
$ lsattr | jc --lsattr
or
$ jc lsattr
Usage (module):
import jc
result = jc.parse('lsattr', lsattr_command_output)
Schema:
Information from https://github.com/mirror/busybox/blob/2d4a3d9e6c1493a9520b907e07a41aca90cdfd94/e2fsprogs/e2fs_lib.c#L40
used to define field names
[
{
"file": string,
"compressed_file": Optional[boolean],
"compressed_dirty_file": Optional[boolean],
"compression_raw_access": Optional[boolean],
"secure_deletion": Optional[boolean],
"undelete": Optional[boolean],
"synchronous_updates": Optional[boolean],
"synchronous_directory_updates": Optional[boolean],
"immutable": Optional[boolean],
"append_only": Optional[boolean],
"no_dump": Optional[boolean],
"no_atime": Optional[boolean],
"compression_requested": Optional[boolean],
"encrypted": Optional[boolean],
"journaled_data": Optional[boolean],
"indexed_directory": Optional[boolean],
"no_tailmerging": Optional[boolean],
"top_of_directory_hierarchies": Optional[boolean],
"extents": Optional[boolean],
"no_cow": Optional[boolean],
"casefold": Optional[boolean],
"inline_data": Optional[boolean],
"project_hierarchy": Optional[boolean],
"verity": Optional[boolean],
}
]
Examples:
$ sudo lsattr /etc/passwd | jc --lsattr
[
{
"file": "/etc/passwd",
"extents": true
}
]
<a id="jc.parsers.lsattr.parse"></a>
### parse
```python
def parse(data: str,
raw: bool = False,
quiet: bool = False) -> List[JSONDictType]
```
Main text parsing function
Parameters:
data: (string) text data to parse
quiet: (boolean) suppress warning messages if True
Returns:
List of Dictionaries. Raw or processed structured data.
### Parser Information
Compatibility: linux
Version 1.0 by Mark Rotner (rotner.mr@gmail.com)

View File

@ -376,6 +376,6 @@ Returns:
List of Dictionaries. Raw or processed structured data.
### Parser Information
Compatibility: linux, darwin, freebsd
Compatibility: linux, darwin, freebsd, win32
Version 1.13 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.14 by Kelly Brazil (kellyjonbrazil@gmail.com)

View File

@ -185,4 +185,4 @@ Returns:
### Parser Information
Compatibility: linux, darwin, freebsd
Version 1.8 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.9 by Kelly Brazil (kellyjonbrazil@gmail.com)

View File

@ -106,4 +106,4 @@ Returns:
### Parser Information
Compatibility: linux, darwin, freebsd
Version 1.2 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.3 by Kelly Brazil (kellyjonbrazil@gmail.com)

View File

@ -22,6 +22,13 @@ Schema:
[
{
"interfaces": [
{
"id": string,
"mac": string,
"name": string,
}
]
"destination": string,
"gateway": string,
"genmask": string,
@ -129,6 +136,6 @@ Returns:
List of Dictionaries. Raw or processed structured data.
### Parser Information
Compatibility: linux
Compatibility: linux, win32
Version 1.8 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.9 by Kelly Brazil (kellyjonbrazil@gmail.com)

docs/parsers/srt.md Normal file (136 lines added)
View File

@ -0,0 +1,136 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.srt"></a>
# jc.parsers.srt
jc - JSON Convert `SRT` file parser
Usage (cli):
$ cat foo.srt | jc --srt
Usage (module):
import jc
result = jc.parse('srt', srt_file_output)
Schema:
[
{
"index": int,
"start": {
"hours": int,
"minutes": int,
"seconds": int,
"milliseconds": int,
"timestamp": string
},
"end": {
"hours": int,
"minutes": int,
"seconds": int,
"milliseconds": int,
"timestamp": string
},
"content": string
}
]
Examples:
$ cat attack_of_the_clones.srt
1
00:02:16,612 --> 00:02:19,376
Senator, we're making
our final approach into Coruscant.
2
00:02:19,482 --> 00:02:21,609
Very good, Lieutenant.
...
$ cat attack_of_the_clones.srt | jc --srt
[
{
"index": 1,
"start": {
"hours": 0,
"minutes": 2,
"seconds": 16,
"milliseconds": 612,
"timestamp": "00:02:16,612"
},
"end": {
"hours": 0,
"minutes": 2,
"seconds": 19,
"milliseconds": 376,
"timestamp": "00:02:19,376"
},
"content": "Senator, we're making\nour final approach into Coruscant."
},
{
"index": 2,
"start": {
"hours": 0,
"minutes": 2,
"seconds": 19,
"milliseconds": 482,
"timestamp": "00:02:19,482"
},
"end": {
"hours": 0,
"minutes": 2,
"seconds": 21,
"milliseconds": 609,
"timestamp": "00:02:21,609"
},
"content": "Very good, Lieutenant."
},
...
]
<a id="jc.parsers.srt.parse_timestamp"></a>
### parse\_timestamp
```python
def parse_timestamp(timestamp: str) -> Dict
```
timestamp: "hours:minutes:seconds,milliseconds" --->
{
"hours": "hours",
"minutes": "minutes",
"seconds": "seconds",
"milliseconds": "milliseconds",
"timestamp": "hours:minutes:seconds,milliseconds"
}
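As an illustration of that mapping, a minimal sketch assuming well-formed `HH:MM:SS,mmm` input (not the parser's actual implementation):

```python
import re
from typing import Dict

SRT_TIMESTAMP = re.compile(r'^(?P<hours>\d+):(?P<minutes>\d+):(?P<seconds>\d+),(?P<milliseconds>\d+)$')

def split_srt_timestamp(timestamp: str) -> Dict:
    # "00:02:16,612" -> {"hours": 0, "minutes": 2, "seconds": 16,
    #                    "milliseconds": 612, "timestamp": "00:02:16,612"}
    match = SRT_TIMESTAMP.match(timestamp)
    if not match:
        raise ValueError(f'not an SRT timestamp: {timestamp!r}')
    result = {key: int(value) for key, value in match.groupdict().items()}
    result['timestamp'] = timestamp
    return result
```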
<a id="jc.parsers.srt.parse"></a>
### parse
```python
def parse(data: str,
raw: bool = False,
quiet: bool = False) -> List[JSONDictType]
```
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
List of Dictionaries. Raw or processed structured data.
### Parser Information
Compatibility: linux, darwin, cygwin, win32, aix, freebsd
Version 1.0 by Mark Rotner (rotner.mr@gmail.com)

View File

@ -5,9 +5,6 @@
jc - JSON Convert `ss` command output parser
Extended information options like `-e` and `-p` are not supported and may
cause parsing irregularities.
Usage (cli):
$ ss | jc --ss
@ -28,21 +25,29 @@ field names
[
{
"netid": string,
"state": string,
"recv_q": integer,
"send_q": integer,
"local_address": string,
"local_port": string,
"local_port_num": integer,
"peer_address": string,
"peer_port": string,
"peer_port_num": integer,
"interface": string,
"link_layer" string,
"channel": string,
"path": string,
"pid": integer
"netid": string,
"state": string,
"recv_q": integer,
"send_q": integer,
"local_address": string,
"local_port": string,
"local_port_num": integer,
"peer_address": string,
"peer_port": string,
"peer_port_num": integer,
"interface": string,
"link_layer" string,
"channel": string,
"path": string,
"pid": integer,
"opts": {
"process_id": {
"<process_id>": {
"user": string,
"file_descriptor": string
}
}
}
}
]
@ -303,4 +308,4 @@ Returns:
### Parser Information
Compatibility: linux
Version 1.6 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.7 by Kelly Brazil (kellyjonbrazil@gmail.com)

View File

@ -193,4 +193,4 @@ Returns:
### Parser Information
Compatibility: linux, darwin, freebsd
Version 1.12 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.13 by Kelly Brazil (kellyjonbrazil@gmail.com)

docs/parsers/veracrypt.md Normal file (108 lines added)
View File

@ -0,0 +1,108 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.veracrypt"></a>
# jc.parsers.veracrypt
jc - JSON Convert `veracrypt` command output parser
Supports the following `veracrypt` subcommands:
- `veracrypt --text --list`
- `veracrypt --text --list --verbose`
- `veracrypt --text --volume-properties <volume>`
Usage (cli):
$ veracrypt --text --list | jc --veracrypt
or
$ jc veracrypt --text --list
Usage (module):
import jc
result = jc.parse('veracrypt', veracrypt_command_output)
Schema:
Volume:
[
{
"slot": integer,
"path": string,
"device": string,
"mountpoint": string,
"size": string,
"type": string,
"readonly": string,
"hidden_protected": string,
"encryption_algo": string,
"pk_size": string,
"sk_size": string,
"block_size": string,
"mode": string,
"prf": string,
"format_version": integer,
"backup_header": string
}
]
Examples:
$ veracrypt --text --list | jc --veracrypt -p
[
{
"slot": 1,
"path": "/dev/sdb1",
"device": "/dev/mapper/veracrypt1",
"mountpoint": "/home/bob/mount/encrypt/sdb1"
}
]
$ veracrypt --text --list --verbose | jc --veracrypt -p
[
{
"slot": 1,
"path": "/dev/sdb1",
"device": "/dev/mapper/veracrypt1",
"mountpoint": "/home/bob/mount/encrypt/sdb1",
"size": "522 MiB",
"type": "Normal",
"readonly": "No",
"hidden_protected": "No",
"encryption_algo": "AES",
"pk_size": "256 bits",
"sk_size": "256 bits",
"block_size": "128 bits",
"mode": "XTS",
"prf": "HMAC-SHA-512",
"format_version": 2,
"backup_header": "Yes"
}
]
<a id="jc.parsers.veracrypt.parse"></a>
### parse
```python
def parse(data: str,
raw: bool = False,
quiet: bool = False) -> List[JSONDictType]
```
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
List of Dictionaries. Raw or processed structured data.
### Parser Information
Compatibility: linux
Version 1.0 by Jake Ob (iakopap at gmail.com)

View File

@ -433,4 +433,4 @@ Returns:
### Parser Information
Compatibility: linux, darwin, cygwin, win32, aix, freebsd
Version 1.1 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.2 by Kelly Brazil (kellyjonbrazil@gmail.com)

docs/parsers/x509_csr.md Normal file (282 lines added)
View File

@ -0,0 +1,282 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.x509_csr"></a>
# jc.parsers.x509\_csr
jc - JSON Convert X.509 Certificate Request format file parser
This parser will convert DER and PEM encoded X.509 certificate request files.
Usage (cli):
$ cat certificateRequest.pem | jc --x509-csr
Usage (module):
import jc
result = jc.parse('x509_csr', x509_csr_file_output)
Schema:
[
{
"certification_request_info": {
"version": string,
"serial_number": string, # [0]
"serial_number_str": string,
"signature": {
"algorithm": string,
"parameters": string/null,
},
"issuer": {
"country_name": string,
"state_or_province_name" string,
"locality_name": string,
"organization_name": array/string,
"organizational_unit_name": array/string,
"common_name": string,
"email_address": string,
"serial_number": string, # [0]
"serial_number_str": string
},
"validity": {
"not_before": integer, # [1]
"not_after": integer, # [1]
"not_before_iso": string,
"not_after_iso": string
},
"subject": {
"country_name": string,
"state_or_province_name": string,
"locality_name": string,
"organization_name": array/string,
"organizational_unit_name": array/string,
"common_name": string,
"email_address": string,
"serial_number": string, # [0]
"serial_number_str": string
},
"subject_public_key_info": {
"algorithm": {
"algorithm": string,
"parameters": string/null,
},
"public_key": {
"modulus": string, # [0]
"public_exponent": integer
}
},
"issuer_unique_id": string/null,
"subject_unique_id": string/null,
"extensions": [
{
"extn_id": string,
"critical": boolean,
"extn_value": array/object/string/integer # [2]
}
]
},
"signature_algorithm": {
"algorithm": string,
"parameters": string/null
},
"signature_value": string # [0]
}
]
[0] in colon-delimited hex notation
[1] time-zone-aware (UTC) epoch timestamp
[2] See below for well-known Extension schemas:
Basic Constraints:
{
"extn_id": "basic_constraints",
"critical": boolean,
"extn_value": {
"ca": boolean,
"path_len_constraint": string/null
}
}
Key Usage:
{
"extn_id": "key_usage",
"critical": boolean,
"extn_value": [
string
]
}
Key Identifier:
{
"extn_id": "key_identifier",
"critical": boolean,
"extn_value": string # [0]
}
Authority Key Identifier:
{
"extn_id": "authority_key_identifier",
"critical": boolean,
"extn_value": {
"key_identifier": string, # [0]
"authority_cert_issuer": string/null,
"authority_cert_serial_number": string/null
}
}
Subject Alternative Name:
{
"extn_id": "subject_alt_name",
"critical": boolean,
"extn_value": [
string
]
}
Certificate Policies:
{
"extn_id": "certificate_policies",
"critical": boolean,
"extn_value": [
{
"policy_identifier": string,
"policy_qualifiers": [ array or null
{
"policy_qualifier_id": string,
"qualifier": string
}
]
}
]
}
Signed Certificate Timestamp List:
{
"extn_id": "signed_certificate_timestamp_list",
"critical": boolean,
"extn_value": string # [0]
}
Examples:
$ cat server.csr| jc --x509-csr -p
[
{
"certification_request_info": {
"version": "v1",
"subject": {
"common_name": "myserver.for.example"
},
"subject_pk_info": {
"algorithm": {
"algorithm": "ec",
"parameters": "secp256r1"
},
"public_key": "04:40:33:c0:91:8f:e9:46:ea:d0:dc:d0:f9:63:2..."
},
"attributes": [
{
"type": "extension_request",
"values": [
[
{
"extn_id": "extended_key_usage",
"critical": false,
"extn_value": [
"server_auth"
]
},
{
"extn_id": "subject_alt_name",
"critical": false,
"extn_value": [
"myserver.for.example"
]
}
]
]
}
]
},
"signature_algorithm": {
"algorithm": "sha384_ecdsa",
"parameters": null
},
"signature": "30:45:02:20:77:ac:5b:51:bf:c5:f5:43:02:52:ae:66:..."
}
]
$ openssl req -in server.csr | jc --x509-csr -p
[
{
"certification_request_info": {
"version": "v1",
"subject": {
"common_name": "myserver.for.example"
},
"subject_pk_info": {
"algorithm": {
"algorithm": "ec",
"parameters": "secp256r1"
},
"public_key": "04:40:33:c0:91:8f:e9:46:ea:d0:dc:d0:f9:63:2..."
},
"attributes": [
{
"type": "extension_request",
"values": [
[
{
"extn_id": "extended_key_usage",
"critical": false,
"extn_value": [
"server_auth"
]
},
{
"extn_id": "subject_alt_name",
"critical": false,
"extn_value": [
"myserver.for.example"
]
}
]
]
}
]
},
"signature_algorithm": {
"algorithm": "sha384_ecdsa",
"parameters": null
},
"signature": "30:45:02:20:77:ac:5b:51:bf:c5:f5:43:02:52:ae:66:..."
}
]
<a id="jc.parsers.x509_csr.parse"></a>
### parse
```python
def parse(data: Union[str, bytes],
raw: bool = False,
quiet: bool = False) -> List[Dict]
```
Main text parsing function
Parameters:
data: (string or bytes) text or binary data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
List of Dictionaries. Raw or processed structured data.
### Parser Information
Compatibility: linux, darwin, cygwin, win32, aix, freebsd
Version 1.0 by Kelly Brazil (kellyjonbrazil@gmail.com)

View File

@ -9,7 +9,7 @@ from .jc_types import ParserInfoType, JSONDictType
from jc import appdirs
__version__ = '1.23.2'
__version__ = '1.23.3'
parsers: List[str] = [
'acpi',
@ -76,6 +76,7 @@ parsers: List[str] = [
'last',
'ls',
'ls-s',
'lsattr',
'lsblk',
'lsmod',
'lsof',
@ -160,6 +161,7 @@ parsers: List[str] = [
'semver',
'sfdisk',
'shadow',
'srt',
'ss',
'ssh-conf',
'sshd-conf',
@ -193,12 +195,14 @@ parsers: List[str] = [
'uptime',
'url',
'ver',
'veracrypt',
'vmstat',
'vmstat-s',
'w',
'wc',
'who',
'x509-cert',
'x509-csr',
'xml',
'xrandr',
'yaml',

View File

@ -0,0 +1 @@
quiet = False

View File

@ -251,7 +251,18 @@ class EmailAddress(IA5String):
self._unicode = contents.decode('cp1252')
else:
mailbox, hostname = contents.rsplit(b'@', 1)
self._unicode = mailbox.decode('cp1252') + '@' + hostname.decode('idna')
# fix to allow incorrectly encoded email addresses to succeed with warning
try:
self._unicode = mailbox.decode('cp1252') + '@' + hostname.decode('idna')
except UnicodeDecodeError:
ascii_mailbox = mailbox.decode('ascii', errors='backslashreplace')
ascii_hostname = hostname.decode('ascii', errors='backslashreplace')
from jc.utils import warning_message
import jc.parsers.asn1crypto.jc_global as jc_global
if not jc_global.quiet:
warning_message([f'Invalid email address found: {ascii_mailbox}@{ascii_hostname}'])
self._unicode = ascii_mailbox + '@' + ascii_hostname
return self._unicode
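For context, `errors='backslashreplace'` on a decode keeps the offending bytes visible as escape sequences instead of raising, for example (illustrative values only):

```python
# illustrative values only; not taken from the commit or its tests
mailbox = b'st\xe9phane'                             # not valid ASCII
mailbox.decode('ascii', errors='backslashreplace')   # -> 'st\\xe9phane'
```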
def __ne__(self, other):

View File

@ -31,6 +31,7 @@ a controller and a device but there might be fields corresponding to one entity.
"name": string,
"is_default": boolean,
"is_public": boolean,
"is_random": boolean,
"address": string,
"alias": string,
"class": string,
@ -49,8 +50,10 @@ a controller and a device but there might be fields corresponding to one entity.
{
"name": string,
"is_public": boolean,
"is_random": boolean,
"address": string,
"alias": string,
"appearance": string,
"class": string,
"icon": string,
"paired": string,
@ -61,7 +64,8 @@ a controller and a device but there might be fields corresponding to one entity.
"legacy_pairing": string,
"rssi": int,
"txpower": int,
"uuids": array
"uuids": array,
"modalias": string
}
]
@ -104,12 +108,12 @@ import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.0'
version = '1.1'
description = '`bluetoothctl` command parser'
author = 'Jake Ob'
author_email = 'iakopap at gmail.com'
compatible = ["linux"]
magic_commands = ["bluetoothctl"]
compatible = ['linux']
magic_commands = ['bluetoothctl']
tags = ['command']
@ -124,6 +128,7 @@ try:
"name": str,
"is_default": bool,
"is_public": bool,
"is_random": bool,
"address": str,
"alias": str,
"class": str,
@ -141,8 +146,10 @@ try:
{
"name": str,
"is_public": bool,
"is_random": bool,
"address": str,
"alias": str,
"appearance": str,
"class": str,
"icon": str,
"paired": str,
@ -154,6 +161,7 @@ try:
"rssi": int,
"txpower": int,
"uuids": List[str],
"modalias": str
},
)
except ImportError:
@ -195,6 +203,7 @@ def _parse_controller(next_lines: List[str]) -> Optional[Controller]:
"name": '',
"is_default": False,
"is_public": False,
"is_random": False,
"address": matches["address"],
"alias": '',
"class": '',
@ -210,10 +219,12 @@ def _parse_controller(next_lines: List[str]) -> Optional[Controller]:
if name.endswith("[default]"):
controller["is_default"] = True
name = name.replace("[default]", "")
if name.endswith("(public)"):
elif name.endswith("(public)"):
controller["is_public"] = True
name = name.replace("(public)", "")
elif name.endswith("(random)"):
controller["is_random"] = True
name = name.replace("(random)", "")
controller["name"] = name.strip()
@ -257,6 +268,7 @@ _device_head_pattern = r"Device (?P<address>([0-9A-F]{2}:){5}[0-9A-F]{2}) (?P<na
_device_line_pattern = (
r"(\s*Name:\s*(?P<name>.+)"
+ r"|\s*Alias:\s*(?P<alias>.+)"
+ r"|\s*Appearance:\s*(?P<appearance>.+)"
+ r"|\s*Class:\s*(?P<class>.+)"
+ r"|\s*Icon:\s*(?P<icon>.+)"
+ r"|\s*Paired:\s*(?P<paired>.+)"
@ -290,8 +302,10 @@ def _parse_device(next_lines: List[str], quiet: bool) -> Optional[Device]:
device: Device = {
"name": '',
"is_public": False,
"is_random": False,
"address": matches["address"],
"alias": '',
"appearance": '',
"class": '',
"icon": '',
"paired": '',
@ -303,11 +317,15 @@ def _parse_device(next_lines: List[str], quiet: bool) -> Optional[Device]:
"rssi": 0,
"txpower": 0,
"uuids": [],
"modalias": ''
}
if name.endswith("(public)"):
device["is_public"] = True
name = name.replace("(public)", "")
elif name.endswith("(random)"):
device["is_random"] = True
name = name.replace("(random)", "")
device["name"] = name.strip()
@ -325,6 +343,8 @@ def _parse_device(next_lines: List[str], quiet: bool) -> Optional[Device]:
device["name"] = matches["name"]
elif matches["alias"]:
device["alias"] = matches["alias"]
elif matches["appearance"]:
device["appearance"] = matches["appearance"]
elif matches["class"]:
device["class"] = matches["class"]
elif matches["icon"]:
@ -359,6 +379,8 @@ def _parse_device(next_lines: List[str], quiet: bool) -> Optional[Device]:
if not "uuids" in device:
device["uuids"] = []
device["uuids"].append(matches["uuid"])
elif matches["modalias"]:
device["modalias"] = matches["modalias"]
return device
@ -376,12 +398,11 @@ def parse(data: str, raw: bool = False, quiet: bool = False) -> List[JSONDictTyp
List of Dictionaries. Raw or processed structured data.
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.input_type_check(data)
result: List = []
if jc.utils.has_data(data):
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.input_type_check(data)
linedata = data.splitlines()
linedata.reverse()

View File

@ -130,6 +130,7 @@ Examples:
}
}
"""
import re
from typing import List, Dict
from jc.jc_types import JSONDictType
import jc.utils
@ -137,7 +138,7 @@ import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.0'
version = '1.1'
description = '`certbot` command parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@ -200,7 +201,9 @@ def parse(
if jc.utils.has_data(data):
if 'Found the following certs:\n' in data:
cert_pattern = re.compile(r'^Found the following certs:$', re.MULTILINE)
if re.search(cert_pattern, data):
cmd_option = 'certificates'
else:
cmd_option = 'account'

View File

@ -4,6 +4,7 @@ Options supported:
- `+noall +answer` options are supported in cases where only the answer
information is desired.
- `+axfr` option is supported on its own
- `+nsid` option is supported
The `when_epoch` calculated timestamp field is naive. (i.e. based on the
local time of the system the parser is run on)
@ -322,7 +323,7 @@ import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '2.4'
version = '2.5'
description = '`dig` command parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@ -427,6 +428,7 @@ def _parse_flags_line(flagsline):
def _parse_opt_pseudosection(optline):
# ;; OPT PSEUDOSECTION:
# ; EDNS: version: 0, flags:; udp: 4096
# ; NSID: 67 70 64 6e 73 2d 73 66 6f ("gpdns-sfo")
# ; COOKIE: 1cbc06703eaef210
if optline.startswith('; EDNS:'):
optline_list = optline.replace(',', ' ').split(';')
@ -443,11 +445,18 @@ def _parse_opt_pseudosection(optline):
}
}
elif optline.startswith('; COOKIE:'):
if optline.startswith('; COOKIE:'):
return {
'cookie': optline.split()[2]
}
if optline.startswith('; NSID:'):
return {
'nsid': optline.split('("')[-1].rstrip('")')
}
return {}
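Worked out on the example OPT pseudosection line above, the NSID extraction reduces to:

```python
optline = '; NSID: 67 70 64 6e 73 2d 73 66 6f ("gpdns-sfo")'
optline.split('("')[-1].rstrip('")')   # -> 'gpdns-sfo'
```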
def _parse_question(question):
# ;www.cnn.com. IN A

View File

@ -1,6 +1,6 @@
"""jc - JSON Convert `last` and `lastb` command output parser
Supports `-w` and `-F` options.
Supports `-w`, `-F`, and `-x` options.
Calculated epoch time fields are naive (i.e. based on the local time of the
system the parser is run on) since there is no timezone information in the
@ -103,10 +103,15 @@ Examples:
import re
import jc.utils
DATE_RE = re.compile(r'[MTWFS][ouerha][nedritnu] [JFMASOND][aepuco][nbrynlgptvc]')
LAST_F_DATE_RE = re.compile(r'\d\d:\d\d:\d\d \d\d\d\d')
LOGIN_LOGOUT_EPOCH_RE = re.compile(r'.*\d\d:\d\d:\d\d \d\d\d\d.*')
LOGOUT_IGNORED_EVENTS = ['down', 'crash']
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.8'
version = '1.9'
description = '`last` and `lastb` command parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@ -138,9 +143,6 @@ def _process(proc_data):
if 'tty' in entry and entry['tty'] == '~':
entry['tty'] = None
if 'tty' in entry and entry['tty'] == 'system_boot':
entry['tty'] = 'system boot'
if 'hostname' in entry and entry['hostname'] == '-':
entry['hostname'] = None
@ -153,11 +155,11 @@ def _process(proc_data):
if 'logout' in entry and entry['logout'] == 'gone_-_no_logout':
entry['logout'] = 'gone - no logout'
if 'login' in entry and re.match(r'.*\d\d:\d\d:\d\d \d\d\d\d.*', entry['login']):
if 'login' in entry and LOGIN_LOGOUT_EPOCH_RE.match(entry['login']):
timestamp = jc.utils.timestamp(entry['login'])
entry['login_epoch'] = timestamp.naive
if 'logout' in entry and re.match(r'.*\d\d:\d\d:\d\d \d\d\d\d.*', entry['logout']):
if 'logout' in entry and LOGIN_LOGOUT_EPOCH_RE.match(entry['logout']):
timestamp = jc.utils.timestamp(entry['logout'])
entry['logout_epoch'] = timestamp.naive
@ -194,66 +196,71 @@ def parse(data, raw=False, quiet=False):
# Clear any blank lines
cleandata = list(filter(None, data.splitlines()))
if jc.utils.has_data(data):
if not jc.utils.has_data(data):
return []
for entry in cleandata:
output_line = {}
for entry in cleandata:
output_line = {}
if (entry.startswith('wtmp begins ') or
entry.startswith('btmp begins ') or
entry.startswith('utx.log begins ')):
if any(
entry.startswith(f'{prefix} begins ')
for prefix in ['wtmp', 'btmp', 'utx.log']
):
continue
continue
entry = entry.replace('boot time', 'boot_time')
entry = entry.replace(' still logged in', '- still_logged_in')
entry = entry.replace(' gone - no logout', '- gone_-_no_logout')
entry = entry.replace('system boot', 'system_boot')
entry = entry.replace('boot time', 'boot_time')
entry = entry.replace(' still logged in', '- still_logged_in')
entry = entry.replace(' gone - no logout', '- gone_-_no_logout')
linedata = entry.split()
linedata = entry.split()
if re.match(r'[MTWFS][ouerha][nedritnu] [JFMASOND][aepuco][nbrynlgptvc]', ' '.join(linedata[2:4])):
linedata.insert(2, '-')
# Adding "-" before the date part.
if DATE_RE.match(' '.join(linedata[2:4])):
linedata.insert(2, '-')
# freebsd fix
if linedata[0] == 'boot_time':
linedata.insert(1, '-')
linedata.insert(1, '~')
# freebsd fix
if linedata[0] == 'boot_time':
linedata.insert(1, '-')
linedata.insert(1, '~')
output_line['user'] = linedata[0]
output_line['tty'] = linedata[1]
output_line['hostname'] = linedata[2]
output_line['user'] = linedata[0]
# last -F support
if re.match(r'\d\d:\d\d:\d\d \d\d\d\d', ' '.join(linedata[6:8])):
output_line['login'] = ' '.join(linedata[3:8])
# Fix for last -x (runlevel).
if output_line['user'] == 'runlevel' and linedata[1] == '(to':
linedata[1] += f' {linedata.pop(2)} {linedata.pop(2)}'
elif output_line['user'] in ['reboot', 'shutdown'] and linedata[1] == 'system': # system down\system boot
linedata[1] += f' {linedata.pop(2)}'
if len(linedata) > 9 and linedata[9] != 'crash' and linedata[9] != 'down':
output_line['tty'] = linedata[1]
output_line['hostname'] = linedata[2]
# last -F support
if LAST_F_DATE_RE.match(' '.join(linedata[6:8])):
output_line['login'] = ' '.join(linedata[3:8])
if len(linedata) > 9:
if linedata[9] not in LOGOUT_IGNORED_EVENTS:
output_line['logout'] = ' '.join(linedata[9:14])
if len(linedata) > 9 and (linedata[9] == 'crash' or linedata[9] == 'down'):
else:
output_line['logout'] = linedata[9]
# add more items to the list to line up duration
linedata.insert(10, '-')
linedata.insert(10, '-')
linedata.insert(10, '-')
linedata.insert(10, '-')
for _ in range(4):
linedata.insert(10, '-')
if len(linedata) > 14:
output_line['duration'] = linedata[14].replace('(', '').replace(')', '')
if len(linedata) > 14:
output_line['duration'] = linedata[14].replace('(', '').replace(')', '')
else: # normal last support
output_line['login'] = ' '.join(linedata[3:7])
# normal last support
else:
output_line['login'] = ' '.join(linedata[3:7])
if len(linedata) > 8:
output_line['logout'] = linedata[8]
if len(linedata) > 8:
output_line['logout'] = linedata[8]
if len(linedata) > 9:
output_line['duration'] = linedata[9].replace('(', '').replace(')', '')
if len(linedata) > 9:
output_line['duration'] = linedata[9].replace('(', '').replace(')', '')
raw_output.append(output_line)
raw_output.append(output_line)
if raw:
return raw_output
else:
return _process(raw_output)
return _process(raw_output)
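To make the runlevel fixup above concrete, here is how an illustrative token list is reshaped (field values are made up for the example):

```python
# illustrative `last -x` runlevel entry after split(); values are made up
linedata = ['runlevel', '(to', 'lvl', '3)', '5.15.0-generic', 'Sat', 'Jun', '17', '10:00']
if linedata[0] == 'runlevel' and linedata[1] == '(to':
    linedata[1] += f' {linedata.pop(2)} {linedata.pop(2)}'
# linedata[1] is now '(to lvl 3)', so it occupies the "tty" slot
# like any other entry: ['runlevel', '(to lvl 3)', '5.15.0-generic', ...]
```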

jc/parsers/lsattr.py Normal file (162 lines added)
View File

@ -0,0 +1,162 @@
"""jc - JSON Convert `lsattr` command output parser
Usage (cli):
$ lsattr | jc --lsattr
or
$ jc lsattr
Usage (module):
import jc
result = jc.parse('lsattr', lsattr_command_output)
Schema:
Information from https://github.com/mirror/busybox/blob/2d4a3d9e6c1493a9520b907e07a41aca90cdfd94/e2fsprogs/e2fs_lib.c#L40
used to define field names
[
{
"file": string,
"compressed_file": Optional[boolean],
"compressed_dirty_file": Optional[boolean],
"compression_raw_access": Optional[boolean],
"secure_deletion": Optional[boolean],
"undelete": Optional[boolean],
"synchronous_updates": Optional[boolean],
"synchronous_directory_updates": Optional[boolean],
"immutable": Optional[boolean],
"append_only": Optional[boolean],
"no_dump": Optional[boolean],
"no_atime": Optional[boolean],
"compression_requested": Optional[boolean],
"encrypted": Optional[boolean],
"journaled_data": Optional[boolean],
"indexed_directory": Optional[boolean],
"no_tailmerging": Optional[boolean],
"top_of_directory_hierarchies": Optional[boolean],
"extents": Optional[boolean],
"no_cow": Optional[boolean],
"casefold": Optional[boolean],
"inline_data": Optional[boolean],
"project_hierarchy": Optional[boolean],
"verity": Optional[boolean],
}
]
Examples:
$ sudo lsattr /etc/passwd | jc --lsattr
[
{
"file": "/etc/passwd",
"extents": true
}
]
"""
from typing import List, Dict
from jc.jc_types import JSONDictType
import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.0'
description = '`lsattr` command parser'
author = 'Mark Rotner'
author_email = 'rotner.mr@gmail.com'
compatible = ['linux']
magic_commands = ['lsattr']
tags = ['command']
__version__ = info.version
ERROR_PREFIX = "lsattr:"
# https://github.com/mirror/busybox/blob/2d4a3d9e6c1493a9520b907e07a41aca90cdfd94/e2fsprogs/e2fs_lib.c#L40
# https://github.com/landley/toybox/blob/f1682dc79fd75f64042b5438918fe5a507977e1c/toys/other/lsattr.c#L97
ATTRIBUTES = {
"B": "compressed_file",
"Z": "compressed_dirty_file",
"X": "compression_raw_access",
"s": "secure_deletion",
"u": "undelete",
"S": "synchronous_updates",
"D": "synchronous_directory_updates",
"i": "immutable",
"a": "append_only",
"d": "no_dump",
"A": "no_atime",
"c": "compression_requested",
"E": "encrypted",
"j": "journaled_data",
"I": "indexed_directory",
"t": "no_tailmerging",
"T": "top_of_directory_hierarchies",
"e": "extents",
"C": "no_cow",
"F": "casefold",
"N": "inline_data",
"P": "project_hierarchy",
"V": "verity",
}
def parse(
data: str,
raw: bool = False,
quiet: bool = False
) -> List[JSONDictType]:
"""
Main text parsing function
Parameters:
data: (string) text data to parse
quiet: (boolean) suppress warning messages if True
Returns:
List of Dictionaries. Raw or processed structured data.
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.input_type_check(data)
output: List = []
cleandata = list(filter(None, data.splitlines()))
if not jc.utils.has_data(data):
return output
for line in cleandata:
# -R flag returns the output in the format:
# Folder:
# attributes file_in_folder
if line.endswith(':'):
continue
# lsattr: Operation not supported ....
if line.startswith(ERROR_PREFIX):
continue
line_output: Dict = {}
# attributes file
# --------------e----- /etc/passwd
attributes, file = line.split()
line_output['file'] = file
for attribute in list(attributes):
attribute_key = ATTRIBUTES.get(attribute)
if attribute_key:
line_output[attribute_key] = True
if line_output:
output.append(line_output)
return output
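As a small illustration of the flag mapping, one lsattr output line walked through by hand (using a subset of the ATTRIBUTES table above):

```python
# illustrative walk-through of one lsattr output line
ATTRIBUTES = {'e': 'extents', 'i': 'immutable', 'a': 'append_only'}  # subset of the table above
line = '--------------e----- /etc/passwd'
flags, filename = line.split()
result = {'file': filename}
result.update({ATTRIBUTES[f]: True for f in flags if f in ATTRIBUTES})
# result == {'file': '/etc/passwd', 'extents': True}
```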

View File

@ -355,17 +355,18 @@ import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.13'
version = '1.14'
description = '`netstat` command parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
compatible = ['linux', 'darwin', 'freebsd']
compatible = ['linux', 'darwin', 'freebsd', 'win32']
magic_commands = ['netstat']
tags = ['command']
__version__ = info.version
WINDOWS_NETSTAT_HEADER = "Active Connections"
def _process(proc_data):
"""
@ -450,9 +451,10 @@ def parse(data, raw=False, quiet=False):
import jc.parsers.netstat_freebsd_osx
raw_output = jc.parsers.netstat_freebsd_osx.parse(cleandata)
# use linux parser
else:
elif cleandata[0] == WINDOWS_NETSTAT_HEADER: # use windows parser.
import jc.parsers.netstat_windows
raw_output = jc.parsers.netstat_windows.parse(cleandata)
else: # use linux parser.
import jc.parsers.netstat_linux
raw_output = jc.parsers.netstat_linux.parse(cleandata)

View File

@ -0,0 +1,75 @@
"""
jc - JSON Convert Windows `netstat` command output parser
"""
from typing import Dict, List
POSSIBLE_PROTOCOLS = ("TCP", "UDP", "TCPv6", "UDPv6")
def normalize_headers(headers: str):
"""
Normalizes the headers to match the jc netstat parser style
(local_address -> local_address, local_port...).
"""
headers = headers.lower().strip()
headers = headers.replace("local address", "local_address")
headers = headers.replace("foreign address", "foreign_address")
return headers.split()
def parse(cleandata: List[str]):
"""
Main text parsing function for Windows netstat
Parameters:
cleandata: (string) text data to parse
Returns:
List of Dictionaries. Raw structured data.
"""
raw_output = []
cleandata.pop(0) # Removing the "Active Connections" header.
headers = normalize_headers(cleandata.pop(0))
for line in cleandata:
line = line.strip()
if not line.startswith(POSSIBLE_PROTOCOLS): # -b.
line_data = raw_output.pop(len(raw_output) - 1)
line_data['program_name'] = line
raw_output.append(line_data)
continue
line_data = line.split()
line_data: Dict[str, str] = dict(zip(headers, line_data))
for key in list(line_data.keys()):
if key == "local_address":
local_address, local_port = line_data[key].rsplit(
":", maxsplit=1)
line_data["local_address"] = local_address
line_data["local_port"] = local_port
continue
if key == "foreign_address":
foreign_address, foreign_port = line_data[key].rsplit(
":", maxsplit=1)
line_data["foreign_address"] = foreign_address
line_data["foreign_port"] = foreign_port
continue
# UDP has no state, so the value that lands under the "state" header actually belongs to the next column; shift it over and blank the state field.
if key == "proto" and "state" in headers and line_data["proto"] == "UDP":
next_header = headers.index("state") + 1
if len(headers) > next_header:
next_header = headers[next_header]
line_data[next_header] = line_data["state"]
line_data["state"] = ''
raw_output.append(line_data)
return raw_output
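For illustration, a small sketch of how normalize_headers() reshapes a typical Windows netstat header row (the header text here is an assumed example, not taken from a fixture):

normalize_headers('  Proto  Local Address          Foreign Address        State')
# ['proto', 'local_address', 'foreign_address', 'state']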


@ -164,7 +164,7 @@ import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.8'
version = '1.9'
description = '`ping` and `ping6` command parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@ -284,6 +284,8 @@ def _linux_parse(data):
if ipv4 and linedata[0][5] not in string.digits:
hostname = True
# fixup for missing hostname
linedata[0] = linedata[0][:5] + 'nohost' + linedata[0][5:]
elif ipv4 and linedata[0][5] in string.digits:
hostname = False
elif not ipv4 and ' (' in linedata[0]:
@ -314,7 +316,8 @@ def _linux_parse(data):
if line.startswith('---'):
footer = True
raw_output['destination'] = line.split()[1]
if line[4] != ' ': # fixup for missing hostname
raw_output['destination'] = line.split()[1]
continue
if footer:
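A minimal sketch of the missing-hostname fixup above, using the first line from the new ping fixture; slicing at index 5 inserts a placeholder hostname so the rest of the parser sees the usual layout:

line = 'PING (100.68.105.124) 56(84) bytes of data.'
line = line[:5] + 'nohost' + line[5:]
# 'PING nohost(100.68.105.124) 56(84) bytes of data.'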


@ -85,7 +85,7 @@ from jc.exceptions import ParseError
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.2'
version = '1.3'
description = '`ping` and `ping6` command streaming parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@ -340,6 +340,8 @@ def _linux_parse(line, s):
if s.ipv4 and line[5] not in string.digits:
s.hostname = True
# fixup for missing hostname
line = line[:5] + 'nohost' + line[5:]
elif s.ipv4 and line[5] in string.digits:
s.hostname = False
elif not s.ipv4 and ' (' in line:


@ -17,6 +17,13 @@ Schema:
[
{
"interfaces": [
{
"id": string,
"mac": string,
"name": string,
}
]
"destination": string,
"gateway": string,
"genmask": string,
@ -109,11 +116,11 @@ import jc.parsers.universal
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.8'
version = '1.9'
description = '`route` command parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
compatible = ['linux']
compatible = ['linux', 'win32']
magic_commands = ['route']
tags = ['command']
@ -152,6 +159,14 @@ def _process(proc_data):
if key in int_list:
entry[key] = jc.utils.convert_to_int(entry[key])
if 'interfaces' in entry:
interfaces = []
for interface in entry["interfaces"]:
# 00 ff 58 60 5f 61 -> 00:ff:58:60:5f:61
interface['mac'] = interface['mac'].replace(' ', ':').replace('.', '')
interfaces.append(interface)
entry["interfaces"] = interfaces
# add flags_pretty
# Flag mapping from https://www.man7.org/linux/man-pages/man8/route.8.html
if 'flags' in entry:
@ -165,6 +180,16 @@ def _process(proc_data):
return proc_data
def normalize_headers(headers: str):
# fixup header row for ipv6
if ' Next Hop ' in headers:
headers = headers.replace(' If', ' Iface')
headers = headers.replace(' Next Hop ', ' Next_Hop ')
headers = headers.replace(' Flag ', ' Flags ')
headers = headers.replace(' Met ', ' Metric ')
headers = headers.lower()
return headers
def parse(data, raw=False, quiet=False):
"""
@ -180,24 +205,22 @@ def parse(data, raw=False, quiet=False):
List of Dictionaries. Raw or processed structured data.
"""
import jc.utils
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.input_type_check(data)
cleandata = data.splitlines()[1:]
cleandata = data.splitlines()
raw_output = []
if jc.utils.has_data(data):
# fixup header row for ipv6
if ' Next Hop ' in cleandata[0]:
cleandata[0] = cleandata[0].replace(' If', ' Iface')
cleandata[0] = cleandata[0].replace(' Next Hop ', ' Next_Hop ')\
.replace(' Flag ', ' Flags ')\
.replace(' Met ', ' Metric ')
cleandata[0] = cleandata[0].lower()
raw_output = jc.parsers.universal.simple_table_parse(cleandata)
import jc.parsers.route_windows
if cleandata[0] in jc.parsers.route_windows.SEPERATORS:
raw_output = jc.parsers.route_windows.parse(cleandata)
else:
cleandata.pop(0) # Removing "Kernel IP routing table".
cleandata[0] = normalize_headers(cleandata[0])
import jc.parsers.universal
raw_output = jc.parsers.universal.simple_table_parse(cleandata)
if raw:
return raw_output
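A sketch of what normalize_headers() does to an IPv6 routing table header; the header text below is an assumed, whitespace-condensed example of `route -6` output:

normalize_headers('Destination Next Hop Flag Met Ref Use If')
# 'destination next_hop flags metric ref use iface'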

128
jc/parsers/route_windows.py Normal file

@ -0,0 +1,128 @@
"""
jc - JSON Convert Windows `route` command output parser
"""
import re
from typing import List
SEPERATORS = (
"===========================================================================",
" None"
)
# 22...00 50 56 c0 00 01 ......VMware Virtual Ethernet Adapter for VMnet1
# {"id": 22, "mac": "00 50 56 c0 00 01", "name": "VMware Virtual Ethernet Adapter for VMnet1"}
INTERFACE_REGEX = re.compile(
r"^(?P<id>\d+)\.{3}(?P<mac>.{17})[\s+\.]+(?P<name>[^\n\r]+)$"
)
ROUTE_TABLES = ("IPv4 Route Table", "IPv6 Route Table")
ROUTE_TYPES = ("Active Routes:", "Persistent Routes:")
def get_lines_until_seperator(iterator):
lines = []
for line in iterator:
if line in SEPERATORS:
break
lines.append(line)
return lines
def normalize_route_table(route_table: List[str]):
headers = route_table[0]
headers = headers.lower()
headers = headers.replace("network destination", "destination")
headers = headers.replace("if", "iface")
headers = headers.replace("interface", "iface")
headers = headers.replace("netmask", "genmask")
headers_count = len(headers.split())
previous_line_has_all_the_data = True
normalized_route_table = [headers]
for row in route_table[1:]:
row = row.strip()
has_all_the_data = len(row.split()) == headers_count
# If neither the current line nor the previous one has a full set of columns, the row wrapped across lines; concatenate them back into a single row.
if not has_all_the_data and not previous_line_has_all_the_data:
previous_line = normalized_route_table.pop(
len(normalized_route_table) - 1)
row = f'{previous_line} {row}'
has_all_the_data = True
normalized_route_table.append(row.strip())
previous_line_has_all_the_data = has_all_the_data
return normalized_route_table
def parse(cleandata: List[str]):
"""
Main text parsing function for Windows route
Parameters:
cleandata: (list of strings) text data to parse, already split into lines
Returns:
List of Dictionaries. Raw structured data.
"""
raw_output = []
data_iterator = iter(cleandata)
for line in data_iterator:
if not line:
continue
if line == "Interface List":
# Interface List
# 8...00 ff 58 60 5f 61 ......TAP-Windows Adapter V9
# 52...00 15 5d fd 0d 45 ......Hyper-V Virtual Ethernet Adapter
# ===========================================================================
interfaces = []
for interface_line in data_iterator:
interface_line = interface_line.strip()
if interface_line in SEPERATORS:
break
interface_match = INTERFACE_REGEX.search(interface_line)
if interface_match:
interfaces.append(interface_match.groupdict())
if interfaces:
raw_output.append({"interfaces": interfaces})
continue
full_route_table = []
if line in ROUTE_TABLES:
next(data_iterator) # Skipping the table title.
# Persistent Routes:
# Network Address Netmask Gateway Address Metric
# 157.0.0.0 255.0.0.0 157.55.80.1 3
# ===========================================================================
for route_line in data_iterator:
if route_line in ROUTE_TYPES:
import jc.parsers.universal
route_table = get_lines_until_seperator(
data_iterator
)
if not route_table:
continue
route_table = normalize_route_table(
route_table
)
full_route_table.extend(
jc.parsers.universal.simple_table_parse(
route_table
)
)
raw_output.extend(full_route_table)
return raw_output
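As a quick check of INTERFACE_REGEX, matching the sample line from the comment above yields the raw fields (the MAC is later rewritten to colon notation by _process in route.py):

m = INTERFACE_REGEX.search('22...00 50 56 c0 00 01 ......VMware Virtual Ethernet Adapter for VMnet1')
# m.groupdict() -> {'id': '22', 'mac': '00 50 56 c0 00 01',
#                   'name': 'VMware Virtual Ethernet Adapter for VMnet1'}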

281
jc/parsers/srt.py Normal file

@ -0,0 +1,281 @@
"""jc - JSON Convert `SRT` file parser
Usage (cli):
$ cat foo.srt | jc --srt
Usage (module):
import jc
result = jc.parse('srt', srt_file_output)
Schema:
[
{
"index": int,
"start": {
"hours": int,
"minutes": int,
"seconds": int,
"milliseconds": int,
"timestamp": string
},
"end": {
"hours": int,
"minutes": int,
"seconds": int,
"milliseconds": int,
"timestamp": string
},
"content": string
}
]
Examples:
$ cat attack_of_the_clones.srt
1
00:02:16,612 --> 00:02:19,376
Senator, we're making
our final approach into Coruscant.
2
00:02:19,482 --> 00:02:21,609
Very good, Lieutenant.
...
$ cat attack_of_the_clones.srt | jc --srt
[
{
"index": 1,
"start": {
"hours": 0,
"minutes": 2,
"seconds": 16,
"milliseconds": 612,
"timestamp": "00:02:16,612"
},
"end": {
"hours": 0,
"minutes": 2,
"seconds": 19,
"milliseconds": 376,
"timestamp": "00:02:19,376"
},
"content": "Senator, we're making\nour final approach into Coruscant."
},
{
"index": 2,
"start": {
"hours": 0,
"minutes": 2,
"seconds": 19,
"milliseconds": 482,
"timestamp": "00:02:19,482"
},
"end": {
"hours": 0,
"minutes": 2,
"seconds": 21,
"milliseconds": 609,
"timestamp": "00:02:21,609"
},
"content": "Very good, Lieutenant."
},
...
]
"""
import jc.utils
import re
from typing import List, Dict
from jc.jc_types import JSONDictType
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.0'
description = 'SRT file parser'
author = 'Mark Rotner'
author_email = 'rotner.mr@gmail.com'
compatible = ['linux', 'darwin', 'cygwin', 'win32', 'aix', 'freebsd']
tags = ['standard', 'file', 'string']
__version__ = info.version
# Regex from https://github.com/cdown/srt/blob/434d0c1c9d5c26d5c3fb1ce979fc05b478e9253c/srt.py#LL16C1.
# The MIT License
# Copyright (c) 2014-present Christopher Down
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
# The format: (int)index\n(timestamp)start --> (timestamp)end\n(str)content\n.
# Example:
# 1
# 00:02:16,612 --> 00:02:19,376
# Senator, we're making our final approach into Coruscant.
# Start & End timestamp format: hours:minutes:seconds,millisecond.
# "." is not technically valid as a delimiter, but many editors create SRT
# files with this delimiter for whatever reason. Many editors and players
# accept it, so we do too.
RGX_TIMESTAMP_MAGNITUDE_DELIM = r"[,.:,.。:]"
RGX_POSITIVE_INT = r"[0-9]+"
RGX_POSITIVE_INT_OPTIONAL = r"[0-9]*"
RGX_TIMESTAMP = '{field}{separator}{field}{separator}{field}{separator}?{optional_field}'.format(
separator=RGX_TIMESTAMP_MAGNITUDE_DELIM,
field=RGX_POSITIVE_INT,
optional_field=RGX_POSITIVE_INT_OPTIONAL
)
RGX_INDEX = r"-?[0-9]+\.?[0-9]*" # int\float\negative.
RGX_CONTENT = r".*?" # Anything(except newline) but lazy.
RGX_NEWLINE = r"\r?\n" # Newline(CRLF\LF).
SRT_REGEX = re.compile(
r"\s*(?:({index})\s*{newline})?({ts}) *-[ -] *> *({ts}) ?(?:{newline}|\Z)({content})"
# Many sub editors don't add a blank line to the end, and many editors and
# players accept that. We allow it to be missing in input.
#
# We also allow subs that are missing a double blank newline. This often
# happens on subs which were first created as a mixed language subtitle,
# for example chs/eng, and then were stripped using naive methods (such as
# ed/sed) that don't understand newline preservation rules in SRT files.
#
# This means that when you are, say, only keeping chs, and the line only
# contains english, you end up with not only no content, but also all of
# the content lines are stripped instead of retaining a newline.
r"(?:{newline}|\Z)(?:{newline}|\Z|(?=(?:{index}\s*{newline}{ts})))"
# Some SRT blocks, while this is technically invalid, have blank lines
# inside the subtitle content. We look ahead a little to check that the
# next lines look like an index and a timestamp as a best-effort
# solution to work around these.
r"(?=(?:(?:{index}\s*{newline})?{ts}|\Z))".format(
index=RGX_INDEX,
ts=RGX_TIMESTAMP,
content=RGX_CONTENT,
newline=RGX_NEWLINE,
),
re.DOTALL,
)
TIMESTAMP_REGEX = re.compile(
'^({field}){separator}({field}){separator}({field}){separator}?({optional_field})$'.format(
separator=RGX_TIMESTAMP_MAGNITUDE_DELIM,
field=RGX_POSITIVE_INT,
optional_field=RGX_POSITIVE_INT_OPTIONAL
)
)
def _process(proc_data: List[JSONDictType]) -> List[JSONDictType]:
"""
Final processing to conform to the schema.
Parameters:
proc_data: (Dictionary) raw structured data to process
Returns:
List of Dictionaries representing an SRT document.
"""
int_list = {'index'}
timestamp_list = {"start", "end"}
timestamp_int_list = {"hours", "minutes", "seconds", "milliseconds"}
for entry in proc_data:
# Converting {"index"} to int.
for key in entry:
if key in int_list:
entry[key] = jc.utils.convert_to_int(entry[key])
# Converting {"hours", "minutes", "seconds", "milliseconds"} to int.
if key in timestamp_list:
timestamp = entry[key]
for timestamp_key in timestamp:
if timestamp_key in timestamp_int_list:
timestamp[timestamp_key] = jc.utils.convert_to_int(
timestamp[timestamp_key])
return proc_data
def parse_timestamp(timestamp: str) -> Dict:
"""
timestamp: "hours:minutes:seconds,milliseconds" --->
{
"hours": "hours",
"minutes": "minutes",
"seconds": "seconds",
"milliseconds": "milliseconds",
"timestamp": "hours:minutes:seconds,milliseconds"
}
"""
ts_match = TIMESTAMP_REGEX.match(timestamp)
if ts_match:
hours, minutes, seconds, milliseconds = ts_match.groups()
return {
"hours": hours,
"minutes": minutes,
"seconds": seconds,
"milliseconds": milliseconds,
"timestamp": timestamp
}
return {}
def parse(
data: str,
raw: bool = False,
quiet: bool = False
) -> List[JSONDictType]:
"""
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
List of Dictionaries. Raw or processed structured data.
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.input_type_check(data)
raw_output: List[Dict] = []
if not jc.utils.has_data(data):
return raw_output
for subtitle in SRT_REGEX.finditer(data):
index, start, end, content = subtitle.groups()
raw_output.append(
{
"index": index,
"start": parse_timestamp(start),
"end": parse_timestamp(end),
"content": content.replace("\r\n", "\n")
}
)
return raw_output if raw else _process(raw_output)
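A brief sketch of parse_timestamp() on a compliant timestamp from the docstring example; values stay strings here and are converted to integers by _process unless raw output is requested:

parse_timestamp('00:02:16,612')
# {'hours': '00', 'minutes': '02', 'seconds': '16',
#  'milliseconds': '612', 'timestamp': '00:02:16,612'}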


@ -1,8 +1,5 @@
"""jc - JSON Convert `ss` command output parser
Extended information options like `-e` and `-p` are not supported and may
cause parsing irregularities.
Usage (cli):
$ ss | jc --ss
@ -23,21 +20,29 @@ field names
[
{
"netid": string,
"state": string,
"recv_q": integer,
"send_q": integer,
"local_address": string,
"local_port": string,
"local_port_num": integer,
"peer_address": string,
"peer_port": string,
"peer_port_num": integer,
"interface": string,
"link_layer" string,
"channel": string,
"path": string,
"pid": integer
"netid": string,
"state": string,
"recv_q": integer,
"send_q": integer,
"local_address": string,
"local_port": string,
"local_port_num": integer,
"peer_address": string,
"peer_port": string,
"peer_port_num": integer,
"interface": string,
"link_layer" string,
"channel": string,
"path": string,
"pid": integer,
"opts": {
"process_id": {
"<process_id>": {
"user": string,
"file_descriptor": string
}
}
}
}
]
@ -275,13 +280,15 @@ Examples:
}
]
"""
import re
import ast
import string
import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.6'
version = '1.7'
description = '`ss` command parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@ -324,6 +331,57 @@ def _process(proc_data):
return proc_data
def _parse_opts(proc_data):
""" Process extra options -e, -o, -p
Parameters:
proc_data: (string) the headerless extra-option field text to process
Returns:
Structured data dictionary for extra/optional headerless options.
"""
o_field = proc_data.split(' ')
opts = {}
for item in o_field:
# -e option:
item = re.sub(
'uid', 'uid_number',
re.sub('sk', 'cookie', re.sub('ino', 'inode_number', item)))
if ":" in item:
key, val = item.split(':')
# -o option
if key == "timer":
val = val.replace('(', '[').replace(')', ']')
val = ast.literal_eval(re.sub(r'([a-z0-9\.]+)', '"\\1"', val))
val = {
'timer_name': val[0],
'expire_time': val[1],
'retrans': val[2]
}
opts[key] = val
# -p option
if key == "users":
key = 'process_id'
val = val.replace('(', '[').replace(')', ']')
val = ast.literal_eval(re.sub(r'([a-z]+=[0-9]+)', '"\\1"', val))
data = {}
for rec in val:
params = {}
params['user'] = rec[0]
for i in [x for x in rec if '=' in x]:
k, v = i.split('=')
params[k] = v
data.update({
params['pid']: {
'user': params['user'],
'file_descriptor': params['fd']
}
})
val = data
opts[key] = val
return opts
def parse(data, raw=False, quiet=False):
"""
@ -357,15 +415,20 @@ def parse(data, raw=False, quiet=False):
header_text = header_text.replace('-', '_')
header_list = header_text.split()
extra_opts = False
for entry in cleandata[1:]:
output_line = {}
if entry[0] not in string.whitespace:
# fix weird ss bug where first two columns have no space between them sometimes
entry = entry[:5] + ' ' + entry[5:]
entry = entry[:5] + ' ' + entry[5:]
entry_list = entry.split()
entry_list = re.split(r'[ ]{1,}',entry.strip())
if len(entry_list) > len(header_list) or extra_opts == True:
entry_list = re.split(r'[ ]{2,}',entry.strip())
extra_opts = True
if entry_list[0] in contains_colon and ':' in entry_list[4]:
l_field = entry_list[4].rsplit(':', maxsplit=1)
@ -381,6 +444,10 @@ def parse(data, raw=False, quiet=False):
entry_list[6] = p_address
entry_list.insert(7, p_port)
if re.search(r'ino:|uid:|sk:|users:|timer:',entry_list[-1]):
header_list.append('opts')
entry_list[-1] = _parse_opts(entry_list[-1])
output_line = dict(zip(header_list, entry_list))
# some post processing to pull out fields: interface, link_layer, path, pid, channel
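For illustration, a sketch of what _parse_opts() returns for a typical `-p` field; the users: string below is an assumed example of ss output, not taken from a fixture:

_parse_opts('users:(("sshd",pid=1252,fd=3))')
# {'process_id': {'1252': {'user': 'sshd', 'file_descriptor': '3'}}}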


@ -171,7 +171,7 @@ import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.12'
version = '1.13'
description = '`stat` command parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@ -238,112 +238,115 @@ def parse(data, raw=False, quiet=False):
# Clear any blank lines
cleandata = list(filter(None, data.splitlines()))
if jc.utils.has_data(data):
# linux output
if cleandata[0].startswith(' File: '):
# stats output contains 8 lines
for line in cleandata:
# line #1
if line.find('File:') == 2:
output_line = {}
line_list = line.split(maxsplit=1)
output_line['file'] = line_list[1]
# populate link_to field if -> found
if ' -> ' in output_line['file']:
filename = output_line['file'].split(' -> ')[0].strip('\u2018').rstrip('\u2019')
link = output_line['file'].split(' -> ')[1].strip('\u2018').rstrip('\u2019')
output_line['file'] = filename
output_line['link_to'] = link
else:
filename = output_line['file'].split(' -> ')[0].strip('\u2018').rstrip('\u2019')
output_line['file'] = filename
continue
# line #2
if line.find('Size:') == 2:
line_list = line.split(maxsplit=7)
output_line['size'] = line_list[1]
output_line['blocks'] = line_list[3]
output_line['io_blocks'] = line_list[6]
output_line['type'] = line_list[7]
continue
# line #3
if line.startswith('Device:'):
line_list = line.split()
output_line['device'] = line_list[1]
output_line['inode'] = line_list[3]
output_line['links'] = line_list[5]
continue
# line #4
if line.startswith('Access: ('):
line = line.replace('(', ' ').replace(')', ' ').replace('/', ' ')
line_list = line.split()
output_line['access'] = line_list[1]
output_line['flags'] = line_list[2]
output_line['uid'] = line_list[4]
output_line['user'] = line_list[5]
output_line['gid'] = line_list[7]
output_line['group'] = line_list[8]
continue
# line #5
if line.startswith('Access: 2'):
line_list = line.split(maxsplit=1)
output_line['access_time'] = line_list[1]
continue
# line #6
if line.startswith('Modify:'):
line_list = line.split(maxsplit=1)
output_line['modify_time'] = line_list[1]
continue
# line #7
if line.startswith('Change:'):
line_list = line.split(maxsplit=1)
output_line['change_time'] = line_list[1]
continue
# line #8
if line.find('Birth:') == 1:
line_list = line.split(maxsplit=1)
output_line['birth_time'] = line_list[1]
raw_output.append(output_line)
continue
# FreeBSD/OSX output
else:
for line in cleandata:
value = shlex.split(line)
output_line = {
'file': ' '.join(value[15:]),
'unix_device': value[0],
'inode': value[1],
'flags': value[2],
'links': value[3],
'user': value[4],
'group': value[5],
'rdev': value[6],
'size': value[7],
'access_time': value[8],
'modify_time': value[9],
'change_time': value[10],
'birth_time': value[11],
'block_size': value[12],
'blocks': value[13],
'unix_flags': value[14]
}
raw_output.append(output_line)
if raw:
if not jc.utils.has_data(data):
return raw_output
# linux output
if cleandata[0].startswith(' File: '):
output_line = {}
# stats output contains 8 lines
for line in cleandata:
# line #1
if line.find('File:') == 2:
if output_line: # Reached a new file stat info.
raw_output.append(output_line)
output_line = {}
line_list = line.split(maxsplit=1)
output_line['file'] = line_list[1]
# populate link_to field if -> found
if ' -> ' in output_line['file']:
filename = output_line['file'].split(' -> ')[0].strip('\u2018').rstrip('\u2019')
link = output_line['file'].split(' -> ')[1].strip('\u2018').rstrip('\u2019')
output_line['file'] = filename
output_line['link_to'] = link
else:
filename = output_line['file'].split(' -> ')[0].strip('\u2018').rstrip('\u2019')
output_line['file'] = filename
continue
# line #2
if line.startswith(' Size:'):
line_list = line.split(maxsplit=7)
output_line['size'] = line_list[1]
output_line['blocks'] = line_list[3]
output_line['io_blocks'] = line_list[6]
output_line['type'] = line_list[7]
continue
# line #3
if line.startswith('Device:'):
line_list = line.split()
output_line['device'] = line_list[1]
output_line['inode'] = line_list[3]
output_line['links'] = line_list[5]
continue
# line #4
if line.startswith('Access: ('):
line = line.replace('(', ' ').replace(')', ' ').replace('/', ' ')
line_list = line.split()
output_line['access'] = line_list[1]
output_line['flags'] = line_list[2]
output_line['uid'] = line_list[4]
output_line['user'] = line_list[5]
output_line['gid'] = line_list[7]
output_line['group'] = line_list[8]
continue
# line #5
if line.startswith('Access: 2'):
line_list = line.split(maxsplit=1)
output_line['access_time'] = line_list[1]
continue
# line #6
if line.startswith('Modify:'):
line_list = line.split(maxsplit=1)
output_line['modify_time'] = line_list[1]
continue
# line #7
if line.startswith('Change:'):
line_list = line.split(maxsplit=1)
output_line['change_time'] = line_list[1]
continue
# line #8
if line.startswith(' Birth:'):
line_list = line.split(maxsplit=1)
output_line['birth_time'] = line_list[1]
continue
if output_line:
raw_output.append(output_line)
# FreeBSD/OSX output
else:
return _process(raw_output)
for line in cleandata:
value = shlex.split(line)
output_line = {
'file': ' '.join(value[15:]),
'unix_device': value[0],
'inode': value[1],
'flags': value[2],
'links': value[3],
'user': value[4],
'group': value[5],
'rdev': value[6],
'size': value[7],
'access_time': value[8],
'modify_time': value[9],
'change_time': value[10],
'birth_time': value[11],
'block_size': value[12],
'blocks': value[13],
'unix_flags': value[14]
}
raw_output.append(output_line)
return raw_output if raw else _process(raw_output)
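The reworked loop above accumulates one output_line per 'File:' block and flushes it when the next block starts, plus once more at the end. A stripped-down sketch of that pattern with made-up field handling:

records = []
current = {}
for line in ['  File: /etc/passwd', 'Modify: 2023-06-21',
             '  File: /etc/hosts', 'Modify: 2023-06-20']:
    if line.strip().startswith('File:'):
        if current:                 # reached a new file block: flush the previous one
            records.append(current)
        current = {'file': line.split(maxsplit=1)[1]}
        continue
    if line.startswith('Modify:'):
        current['modify_time'] = line.split(maxsplit=1)[1]
if current:                         # flush the final record
    records.append(current)
# records -> [{'file': '/etc/passwd', 'modify_time': '2023-06-21'},
#             {'file': '/etc/hosts', 'modify_time': '2023-06-20'}]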

256
jc/parsers/veracrypt.py Normal file

@ -0,0 +1,256 @@
"""jc - JSON Convert `veracrypt` command output parser
Supports the following `veracrypt` subcommands:
- `veracrypt --text --list`
- `veracrypt --text --list --verbose`
- `veracrypt --text --volume-properties <volume>`
Usage (cli):
$ veracrypt --text --list | jc --veracrypt
or
$ jc veracrypt --text --list
Usage (module):
import jc
result = jc.parse('veracrypt', veracrypt_command_output)
Schema:
Volume:
[
{
"slot": integer,
"path": string,
"device": string,
"mountpoint": string,
"size": string,
"type": string,
"readonly": string,
"hidden_protected": string,
"encryption_algo": string,
"pk_size": string,
"sk_size": string,
"block_size": string,
"mode": string,
"prf": string,
"format_version": integer,
"backup_header": string
}
]
Examples:
$ veracrypt --text --list | jc --veracrypt -p
[
{
"slot": 1,
"path": "/dev/sdb1",
"device": "/dev/mapper/veracrypt1",
"mountpoint": "/home/bob/mount/encrypt/sdb1"
}
]
$ veracrypt --text --list --verbose | jc --veracrypt -p
[
{
"slot": 1,
"path": "/dev/sdb1",
"device": "/dev/mapper/veracrypt1",
"mountpoint": "/home/bob/mount/encrypt/sdb1",
"size": "522 MiB",
"type": "Normal",
"readonly": "No",
"hidden_protected": "No",
"encryption_algo": "AES",
"pk_size": "256 bits",
"sk_size": "256 bits",
"block_size": "128 bits",
"mode": "XTS",
"prf": "HMAC-SHA-512",
"format_version": 2,
"backup_header": "Yes"
}
]
"""
import re
from typing import List, Dict, Optional, Any
from jc.jc_types import JSONDictType
import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.0'
description = '`veracrypt` command parser'
author = 'Jake Ob'
author_email = 'iakopap at gmail.com'
compatible = ["linux"]
magic_commands = ["veracrypt"]
tags = ['command']
__version__ = info.version
try:
from typing import TypedDict
Volume = TypedDict(
"Volume",
{
"slot": int,
"path": str,
"device": str,
"mountpoint": str,
"size": str,
"type": str,
"readonly": str,
"hidden_protected": str,
"encryption_algo": str,
"pk_size": str,
"sk_size": str,
"block_size": str,
"mode": str,
"prf": str,
"format_version": int,
"backup_header": str
},
)
except ImportError:
Volume = Dict[str, Any] # type: ignore
_volume_line_pattern = r"(?P<slot>[0-9]+): (?P<path>.+?) (?P<device>.+?) (?P<mountpoint>.*)"
_volume_verbose_pattern = (
r"(Slot:\s(?P<slot>.+)"
+ r"|Volume:\s(?P<path>.+)"
+ r"|Virtual\sDevice:\s(?P<device>.+)"
+ r"|Mount\sDirectory:\s(?P<mountpoint>.+)"
+ r"|Size:\s(?P<size>.+)"
+ r"|Type:\s(?P<type>.+)"
+ r"|Read-Only:\s(?P<readonly>.+)"
+ r"|Hidden\sVolume Protected:\s(?P<hidden_protected>.+)"
+ r"|Encryption\sAlgorithm:\s(?P<encryption_algo>.+)"
+ r"|Primary\sKey\sSize:\s(?P<pk_size>.+)"
+ r"|Secondary\sKey\sSize\s.*:\s(?P<sk_size>.+)"
+ r"|Block\sSize:\s(?P<block_size>.+)"
+ r"|Mode\sof\sOperation:\s(?P<mode>.+)"
+ r"|PKCS-5\sPRF:\s(?P<prf>.+)"
+ r"|Volume\sFormat\sVersion:\s(?P<format_version>.+)"
+ r"|Embedded\sBackup\sHeader:\s(?P<backup_header>.+))"
)
def _parse_volume(next_lines: List[str]) -> Optional[Volume]:
next_line = next_lines.pop()
result = re.match(_volume_line_pattern, next_line)
# Parse and return the volume given as a single line (veracrypt -t --list)
if result:
matches = result.groupdict()
volume: Volume = { # type: ignore
"slot": int(matches["slot"]),
"path": matches["path"],
"device": matches["device"],
"mountpoint": matches["mountpoint"],
}
return volume
else:
next_lines.append(next_line)
# Otherwise parse the volume given in multiple lines (veracrypt -t --list -v)
volume: Volume = {} # type: ignore
while next_lines:
next_line = next_lines.pop()
# Return when encounter an empty line
if not next_line:
return volume
result = re.match(_volume_verbose_pattern, next_line)
# Skip to the next line in case of an unknown field line
if not result:
continue
matches = result.groupdict()
if matches["slot"]:
volume["slot"] = int(matches["slot"])
elif matches["path"]:
volume["path"] = matches["path"]
elif matches["device"]:
volume["device"] = matches["device"]
elif matches["mountpoint"]:
volume["mountpoint"] = matches["mountpoint"]
elif matches["size"]:
volume["size"] = matches["size"]
elif matches["type"]:
volume["type"] = matches["type"]
elif matches["readonly"]:
volume["readonly"] = matches["readonly"]
elif matches["hidden_protected"]:
volume["hidden_protected"] = matches["hidden_protected"]
elif matches["encryption_algo"]:
volume["encryption_algo"] = matches["encryption_algo"]
elif matches["pk_size"]:
volume["pk_size"] = matches["pk_size"]
elif matches["sk_size"]:
volume["sk_size"] = matches["sk_size"]
elif matches["block_size"]:
volume["block_size"] = matches["block_size"]
elif matches["mode"]:
volume["mode"] = matches["mode"]
elif matches["prf"]:
volume["prf"] = matches["prf"]
elif matches["format_version"]:
volume["format_version"] = int(matches["format_version"])
elif matches["backup_header"]:
volume["backup_header"] = matches["backup_header"]
return volume
def parse(data: str, raw: bool = False, quiet: bool = False) -> List[JSONDictType]:
"""
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
List of Dictionaries. Raw or processed structured data.
"""
result: List = []
if jc.utils.has_data(data):
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.input_type_check(data)
linedata = data.splitlines()
first_line = linedata[0]
line_mode = re.search(_volume_line_pattern, first_line)
verbose_mode = re.search(_volume_verbose_pattern, first_line)
if not line_mode and not verbose_mode:
return []
linedata.reverse()
while linedata:
volume = _parse_volume(linedata)
if volume:
result.append(volume)
else:
break
return result
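As a sanity check of _volume_line_pattern, matching a non-verbose `veracrypt --text --list` line built from the docstring example values:

m = re.match(_volume_line_pattern, '1: /dev/sdb1 /dev/mapper/veracrypt1 /home/bob/mount/encrypt/sdb1')
# m.groupdict() -> {'slot': '1', 'path': '/dev/sdb1',
#                   'device': '/dev/mapper/veracrypt1',
#                   'mountpoint': '/home/bob/mount/encrypt/sdb1'}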


@ -408,12 +408,12 @@ from collections import OrderedDict
from datetime import datetime
from typing import List, Dict, Union
import jc.utils
from jc.parsers.asn1crypto import pem, x509
from jc.parsers.asn1crypto import pem, x509, jc_global
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.1'
version = '1.2'
description = 'X.509 PEM and DER certificate file parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@ -462,6 +462,9 @@ def _fix_objects(obj):
Recursively traverse the nested dictionary or list and convert objects
into JSON serializable types.
"""
if isinstance(obj, tuple):
obj = list(obj)
if isinstance(obj, set):
obj = sorted(list(obj))
@ -501,6 +504,10 @@ def _fix_objects(obj):
obj.update({k: v})
continue
if isinstance(v, tuple):
v = list(v)
obj.update({k: v})
if isinstance(v, set):
v = sorted(list(v))
obj.update({k: v})
@ -548,6 +555,7 @@ def parse(
List of Dictionaries. Raw or processed structured data.
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
jc_global.quiet = quiet # to inject quiet setting into asn1crypto library
raw_output: List = []

318
jc/parsers/x509_csr.py Normal file

@ -0,0 +1,318 @@
"""jc - JSON Convert X.509 Certificate Request format file parser
This parser will convert DER and PEM encoded X.509 certificate request files.
Usage (cli):
$ cat certificateRequest.pem | jc --x509-csr
Usage (module):
import jc
result = jc.parse('x509_csr', x509_csr_file_output)
Schema:
[
{
"certification_request_info": {
"version": string,
"serial_number": string, # [0]
"serial_number_str": string,
"signature": {
"algorithm": string,
"parameters": string/null,
},
"issuer": {
"country_name": string,
"state_or_province_name" string,
"locality_name": string,
"organization_name": array/string,
"organizational_unit_name": array/string,
"common_name": string,
"email_address": string,
"serial_number": string, # [0]
"serial_number_str": string
},
"validity": {
"not_before": integer, # [1]
"not_after": integer, # [1]
"not_before_iso": string,
"not_after_iso": string
},
"subject": {
"country_name": string,
"state_or_province_name": string,
"locality_name": string,
"organization_name": array/string,
"organizational_unit_name": array/string,
"common_name": string,
"email_address": string,
"serial_number": string, # [0]
"serial_number_str": string
},
"subject_public_key_info": {
"algorithm": {
"algorithm": string,
"parameters": string/null,
},
"public_key": {
"modulus": string, # [0]
"public_exponent": integer
}
},
"issuer_unique_id": string/null,
"subject_unique_id": string/null,
"extensions": [
{
"extn_id": string,
"critical": boolean,
"extn_value": array/object/string/integer # [2]
}
]
},
"signature_algorithm": {
"algorithm": string,
"parameters": string/null
},
"signature_value": string # [0]
}
]
[0] in colon-delimited hex notation
[1] time-zone-aware (UTC) epoch timestamp
[2] See below for well-known Extension schemas:
Basic Constraints:
{
"extn_id": "basic_constraints",
"critical": boolean,
"extn_value": {
"ca": boolean,
"path_len_constraint": string/null
}
}
Key Usage:
{
"extn_id": "key_usage",
"critical": boolean,
"extn_value": [
string
]
}
Key Identifier:
{
"extn_id": "key_identifier",
"critical": boolean,
"extn_value": string # [0]
}
Authority Key Identifier:
{
"extn_id": "authority_key_identifier",
"critical": boolean,
"extn_value": {
"key_identifier": string, # [0]
"authority_cert_issuer": string/null,
"authority_cert_serial_number": string/null
}
}
Subject Alternative Name:
{
"extn_id": "subject_alt_name",
"critical": boolean,
"extn_value": [
string
]
}
Certificate Policies:
{
"extn_id": "certificate_policies",
"critical": boolean,
"extn_value": [
{
"policy_identifier": string,
"policy_qualifiers": [ array or null
{
"policy_qualifier_id": string,
"qualifier": string
}
]
}
]
}
Signed Certificate Timestamp List:
{
"extn_id": "signed_certificate_timestamp_list",
"critical": boolean,
"extn_value": string # [0]
}
Examples:
$ cat server.csr | jc --x509-csr -p
[
{
"certification_request_info": {
"version": "v1",
"subject": {
"common_name": "myserver.for.example"
},
"subject_pk_info": {
"algorithm": {
"algorithm": "ec",
"parameters": "secp256r1"
},
"public_key": "04:40:33:c0:91:8f:e9:46:ea:d0:dc:d0:f9:63:2..."
},
"attributes": [
{
"type": "extension_request",
"values": [
[
{
"extn_id": "extended_key_usage",
"critical": false,
"extn_value": [
"server_auth"
]
},
{
"extn_id": "subject_alt_name",
"critical": false,
"extn_value": [
"myserver.for.example"
]
}
]
]
}
]
},
"signature_algorithm": {
"algorithm": "sha384_ecdsa",
"parameters": null
},
"signature": "30:45:02:20:77:ac:5b:51:bf:c5:f5:43:02:52:ae:66:..."
}
]
$ openssl req -in server.csr | jc --x509-csr -p
[
{
"certification_request_info": {
"version": "v1",
"subject": {
"common_name": "myserver.for.example"
},
"subject_pk_info": {
"algorithm": {
"algorithm": "ec",
"parameters": "secp256r1"
},
"public_key": "04:40:33:c0:91:8f:e9:46:ea:d0:dc:d0:f9:63:2..."
},
"attributes": [
{
"type": "extension_request",
"values": [
[
{
"extn_id": "extended_key_usage",
"critical": false,
"extn_value": [
"server_auth"
]
},
{
"extn_id": "subject_alt_name",
"critical": false,
"extn_value": [
"myserver.for.example"
]
}
]
]
}
]
},
"signature_algorithm": {
"algorithm": "sha384_ecdsa",
"parameters": null
},
"signature": "30:45:02:20:77:ac:5b:51:bf:c5:f5:43:02:52:ae:66:..."
}
]
"""
# import binascii
# from collections import OrderedDict
# from datetime import datetime
from typing import List, Dict, Union
import jc.utils
from jc.parsers.asn1crypto import pem, csr, jc_global
from jc.parsers.x509_cert import _fix_objects, _process
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.0'
description = 'X.509 PEM and DER certificate request file parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
details = 'Using the asn1crypto library at https://github.com/wbond/asn1crypto/releases/tag/1.5.1'
compatible = ['linux', 'darwin', 'cygwin', 'win32', 'aix', 'freebsd']
tags = ['standard', 'file', 'string', 'binary']
__version__ = info.version
def parse(
data: Union[str, bytes],
raw: bool = False,
quiet: bool = False
) -> List[Dict]:
"""
Main text parsing function
Parameters:
data: (string or bytes) text or binary data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
List of Dictionaries. Raw or processed structured data.
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
jc_global.quiet = quiet # to inject quiet setting into asn1crypto library
raw_output: List = []
if jc.utils.has_data(data):
# convert to bytes, if not already, for PEM detection since that's
# what pem.detect() needs. (cli.py will auto-convert to UTF-8 if it can)
try:
der_bytes = bytes(data, 'utf-8') # type: ignore
except TypeError:
der_bytes = data # type: ignore
certs = []
if pem.detect(der_bytes):
for type_name, headers, der_bytes in pem.unarmor(der_bytes, multiple=True):
if type_name == 'CERTIFICATE REQUEST' or type_name == 'NEW CERTIFICATE REQUEST':
certs.append(csr.CertificationRequest.load(der_bytes))
else:
certs.append(csr.CertificationRequest.load(der_bytes))
raw_output = [_fix_objects(cert.native) for cert in certs]
return raw_output if raw else _process(raw_output)
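A minimal module-level usage sketch for the CSR parser, assuming a PEM-encoded server.csr like the one in the docstring examples:

import jc

with open('server.csr') as f:
    csr = jc.parse('x509_csr', f.read())

print(csr[0]['certification_request_info']['subject']['common_name'])
# myserver.for.example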


@ -151,8 +151,9 @@ def compatibility(mod_name: str, compatible: List[str], quiet: bool = False) ->
mod = mod_name.split('.')[-1]
compat_list = ', '.join(compatible)
warning_message([
f'{mod} parser is not compatible with your OS ({sys.platform}).',
f'Compatible platforms: {compat_list}'
f'`{mod}` command output from this OS ({sys.platform}) is not supported.',
f'`{mod}` command output from the following platforms is supported: {compat_list}',
'Disregard this warning if you are processing output that came from a supported platform. (Use the -q option to suppress this warning)'
])


@ -1,4 +1,4 @@
.TH jc 1 2023-04-30 1.23.2 "JSON Convert"
.TH jc 1 2023-06-21 1.23.3 "JSON Convert"
.SH NAME
\fBjc\fP \- JSON Convert JSONifies the output of many CLI tools, file-types,
and strings
@ -357,6 +357,11 @@ Key/Value file and string parser
\fB--ls-s\fP
`ls` command streaming parser
.TP
.B
\fB--lsattr\fP
`lsattr` command parser
.TP
.B
\fB--lsblk\fP
@ -777,6 +782,11 @@ Semantic Version string parser
\fB--shadow\fP
`/etc/shadow` file parser
.TP
.B
\fB--srt\fP
SRT file parser
.TP
.B
\fB--ss\fP
@ -942,6 +952,11 @@ URL string parser
\fB--ver\fP
Version string parser
.TP
.B
\fB--veracrypt\fP
`veracrypt` command parser
.TP
.B
\fB--vmstat\fP
@ -972,6 +987,11 @@ Version string parser
\fB--x509-cert\fP
X.509 PEM and DER certificate file parser
.TP
.B
\fB--x509-csr\fP
X.509 PEM and DER certificate request file parser
.TP
.B
\fB--xml\fP


@ -1,2 +1,2 @@
[metadata]
license_file = LICENSE.md
license_files = LICENSE.md


@ -5,7 +5,7 @@ with open('README.md', 'r') as f:
setuptools.setup(
name='jc',
version='1.23.2',
version='1.23.3',
author='Kelly Brazil',
author_email='kellyjonbrazil@gmail.com',
description='Converts the output of popular command-line tools and file-types to JSON.',


@ -5,11 +5,13 @@
> Try the `jc` [web demo](https://jc-web.onrender.com/) and [REST API](https://github.com/kellyjonbrazil/jc-restapi)
> JC is [now available](https://galaxy.ansible.com/community/general) as an
> `jc` is [now available](https://galaxy.ansible.com/community/general) as an
Ansible filter plugin in the `community.general` collection. See this
[blog post](https://blog.kellybrazil.com/2020/08/30/parsing-command-output-in-ansible-with-jc/)
for an example.
> Looking for something like `jc` but lower-level? Check out [regex2json](https://gitlab.com/tozd/regex2json).
# JC
JSON Convert


@ -0,0 +1 @@
[{"user":"root","tty":"pts/0","hostname":"192.168.255.1","login":"Mon Jun 19 14:18:13 2023","logout":"still logged in","login_epoch":1687209493}, {"user":"mark","tty":"pts/0","hostname":"192.168.255.1","login":"Mon Jun 19 14:15:57 2023","logout":"Mon Jun 19 14:18:00 2023","duration":"00:02","login_epoch":1687209357,"logout_epoch":1687209480,"duration_seconds":123}, {"user":"mark","tty":"pts/0","hostname":"192.168.255.1","login":"Mon Jun 19 14:15:45 2023","logout":"Mon Jun 19 14:15:52 2023","duration":"00:00","login_epoch":1687209345,"logout_epoch":1687209352,"duration_seconds":7}, {"user":"mark","tty":"tty1","hostname":"0.0.0.0","login":"Mon Jun 19 16:59:57 2023","logout":"still logged in","login_epoch":1687219197}, {"user":"runlevel","tty":"(to lvl 3)","hostname":"0.0.0.0","login":"Mon Jun 19 16:59:39 2023","logout":"Mon Jun 19 14:35:00 2023","duration":"-2:-24","login_epoch":1687219179,"logout_epoch":1687210500,"duration_seconds":-8679}, {"user":"reboot","tty":"system boot","hostname":"0.0.0.0","login":"Mon Jun 19 16:59:20 2023","logout":"Mon Jun 19 14:35:00 2023","duration":"-2:-24","login_epoch":1687219160,"logout_epoch":1687210500,"duration_seconds":-8660},{"user": "shutdown","tty": "system down","hostname": "0.0.0.0","login": "Fri Apr 14 13:46:46 2023","logout": "Fri Apr 14 13:47:12 2023","duration": "00:00","login_epoch": 1681505206,"logout_epoch": 1681505232,"duration_seconds": 26}]


@ -0,0 +1,9 @@
root pts/0 192.168.255.1 Mon Jun 19 14:18:13 2023 still logged in
mark pts/0 192.168.255.1 Mon Jun 19 14:15:57 2023 - Mon Jun 19 14:18:00 2023 (00:02)
mark pts/0 192.168.255.1 Mon Jun 19 14:15:45 2023 - Mon Jun 19 14:15:52 2023 (00:00)
mark tty1 0.0.0.0 Mon Jun 19 16:59:57 2023 still logged in
runlevel (to lvl 3) 0.0.0.0 Mon Jun 19 16:59:39 2023 - Mon Jun 19 14:35:00 2023 (-2:-24)
reboot system boot 0.0.0.0 Mon Jun 19 16:59:20 2023 - Mon Jun 19 14:35:00 2023 (-2:-24)
shutdown system down 0.0.0.0 Fri Apr 14 13:46:46 2023 - Fri Apr 14 13:47:12 2023 (00:00)
wtmp begins Mon Jun 19 16:59:20 2023


@ -0,0 +1 @@
[{"type":"reply","destination_ip":"100.68.105.124","sent_bytes":56,"pattern":null,"timestamp":null,"response_bytes":64,"response_ip":"100.68.105.124","icmp_seq":1,"ttl":64,"time_ms":0.04,"duplicate":false},{"type":"summary","destination_ip":"100.68.105.124","sent_bytes":56,"pattern":null,"packets_transmitted":1,"packets_received":1,"packet_loss_percent":0.0,"duplicates":0,"time_ms":0.0,"round_trip_ms_min":0.04,"round_trip_ms_avg":0.04,"round_trip_ms_max":0.04,"round_trip_ms_stddev":0.0}]


@ -0,0 +1 @@
{"destination_ip":"100.68.105.124","data_bytes":56,"pattern":null,"packets_transmitted":1,"packets_received":1,"packet_loss_percent":0.0,"duplicates":0,"time_ms":0.0,"round_trip_ms_min":0.04,"round_trip_ms_avg":0.04,"round_trip_ms_max":0.04,"round_trip_ms_stddev":0.0,"responses":[{"type":"reply","timestamp":null,"bytes":64,"response_ip":"100.68.105.124","icmp_seq":1,"ttl":64,"time_ms":0.04,"duplicate":false}]}


@ -0,0 +1,6 @@
PING (100.68.105.124) 56(84) bytes of data.
64 bytes from 100.68.105.124 (100.68.105.124): icmp_seq=1 ttl=64 time=0.040 ms
--- ping statistics ---
1 packets transmitted, 1 received, 0% packet loss, time 0ms
rtt min/avg/max/mdev = 0.040/0.040/0.040/0.000 ms


@ -0,0 +1,18 @@
Device DF:1C:C3:B4:1A:1F (random)
Name: M585/M590
Alias: M585/M590
Appearance: 0x03c2
Icon: input-mouse
Paired: yes
Bonded: yes
Trusted: no
Blocked: no
Connected: no
LegacyPairing: no
UUID: Generic Access Profile (00001800-0000-1000-8000-00805f9b34fb)
UUID: Generic Attribute Profile (00001801-0000-1000-8000-00805f9b34fb)
UUID: Device Information (0000180a-0000-1000-8000-00805f9b34fb)
UUID: Battery Service (0000180f-0000-1000-8000-00805f9b34fb)
UUID: Human Interface Device (00001812-0000-1000-8000-00805f9b34fb)
UUID: Vendor specific (00010000-0000-1000-8000-011f2000046d)
Modalias: usb:v046DpB01Bd0011

1
tests/fixtures/generic/dig-nsid.json vendored Normal file

@ -0,0 +1 @@
[{"id":22691,"opcode":"QUERY","status":"NOERROR","flags":["qr","rd","ra"],"query_num":1,"answer_num":1,"authority_num":0,"additional_num":1,"opt_pseudosection":{"edns":{"version":0,"flags":[],"udp":512},"nsid":"gpdns-sfo"},"question":{"name":"mail.google.com.","class":"IN","type":"A"},"answer":[{"name":"mail.google.com.","class":"IN","type":"A","ttl":300,"data":"142.250.189.197"}],"query_time":189,"server":"2001:4860:4860::8888#53(2001:4860:4860::8888)","when":"Sat Jun 03 12:37:43 PDT 2023","rcvd":73,"when_epoch":1685821063,"when_epoch_utc":null}]

22
tests/fixtures/generic/dig-nsid.out vendored Normal file

@ -0,0 +1,22 @@
; <<>> DiG 9.10.6 <<>> +nsid @dns.google mail.google.com
; (4 servers found)
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 22691
;; flags: qr rd ra; QUERY: 1, ANSWER: 1, AUTHORITY: 0, ADDITIONAL: 1
;; OPT PSEUDOSECTION:
; EDNS: version: 0, flags:; udp: 512
; NSID: 67 70 64 6e 73 2d 73 66 6f ("gpdns-sfo")
;; QUESTION SECTION:
;mail.google.com. IN A
;; ANSWER SECTION:
mail.google.com. 300 IN A 142.250.189.197
;; Query time: 189 msec
;; SERVER: 2001:4860:4860::8888#53(2001:4860:4860::8888)
;; WHEN: Sat Jun 03 12:37:43 PDT 2023
;; MSG SIZE rcvd: 73


@ -0,0 +1 @@
[{"index": 1, "start": {"hours": 0, "minutes": 2, "seconds": 16, "milliseconds": 612, "timestamp": "00:02:16,612"}, "end": {"hours": 0, "minutes": 2, "seconds": 19, "milliseconds": 376, "timestamp": "00:02:19,376"}, "content": "Senator, we're making\nour final approach into Coruscant."}, {"index": 2, "start": {"hours": 0, "minutes": 2, "seconds": 19, "milliseconds": 482, "timestamp": "00:02:19,482"}, "end": {"hours": 0, "minutes": 2, "seconds": 21, "milliseconds": 609, "timestamp": "00:02:21,609"}, "content": "Very good, Lieutenant."}, {"index": 3, "start": {"hours": 0, "minutes": 3, "seconds": 13, "milliseconds": 336, "timestamp": "00:03:13,336"}, "end": {"hours": 0, "minutes": 3, "seconds": 15, "milliseconds": 167, "timestamp": "00:03:15,167"}, "content": "We made it."}, {"index": 4, "start": {"hours": 0, "minutes": 3, "seconds": 18, "milliseconds": 608, "timestamp": "00:03:18,608"}, "end": {"hours": 0, "minutes": 3, "seconds": 20, "milliseconds": 371, "timestamp": "00:03:20,371"}, "content": "I guess I was wrong."}, {"index": 5, "start": {"hours": 0, "minutes": 3, "seconds": 20, "milliseconds": 476, "timestamp": "00:03:20,476"}, "end": {"hours": 0, "minutes": 3, "seconds": 22, "milliseconds": 671, "timestamp": "00:03:22,671"}, "content": "There was no danger at all."}]


@ -0,0 +1,20 @@
1
00:02:16,612 --> 00:02:19,376
Senator, we're making
our final approach into Coruscant.
2
00:02:19,482 --> 00:02:21,609
Very good, Lieutenant.
3
00:03:13,336 --> 00:03:15,167
We made it.
4
00:03:18,608 --> 00:03:20,371
I guess I was wrong.
5
00:03:20,476 --> 00:03:22,671
There was no danger at all.


@ -0,0 +1 @@
[{"index": "1", "start": {"hours": "00", "minutes": "02", "seconds": "16", "milliseconds": "612", "timestamp": "00:02:16,612"}, "end": {"hours": "00", "minutes": "02", "seconds": "19", "milliseconds": "376", "timestamp": "00:02:19,376"}, "content": "Senator, we're making\nour final approach into Coruscant."}, {"index": "2", "start": {"hours": "00", "minutes": "02", "seconds": "19", "milliseconds": "482", "timestamp": "00:02:19,482"}, "end": {"hours": "00", "minutes": "02", "seconds": "21", "milliseconds": "609", "timestamp": "00:02:21,609"}, "content": "Very good, Lieutenant."}, {"index": "3", "start": {"hours": "00", "minutes": "03", "seconds": "13", "milliseconds": "336", "timestamp": "00:03:13,336"}, "end": {"hours": "00", "minutes": "03", "seconds": "15", "milliseconds": "167", "timestamp": "00:03:15,167"}, "content": "We made it."}, {"index": "4", "start": {"hours": "00", "minutes": "03", "seconds": "18", "milliseconds": "608", "timestamp": "00:03:18,608"}, "end": {"hours": "00", "minutes": "03", "seconds": "20", "milliseconds": "371", "timestamp": "00:03:20,371"}, "content": "I guess I was wrong."}, {"index": "5", "start": {"hours": "00", "minutes": "03", "seconds": "20", "milliseconds": "476", "timestamp": "00:03:20,476"}, "end": {"hours": "00", "minutes": "03", "seconds": "22", "milliseconds": "671", "timestamp": "00:03:22,671"}, "content": "There was no danger at all."}]


@ -0,0 +1 @@
[{"index": 1,"start": {"hours": 0,"minutes": 2,"seconds": 16,"milliseconds": 612,"timestamp": "00:02:16,612"},"end": {"hours": 0,"minutes": 2,"seconds": 19,"milliseconds": 376,"timestamp": "00:02:19,376"},"content": "--> --> --> -->"},{"index": 2,"start": {"hours": 0,"minutes": 2,"seconds": 19,"milliseconds": 482,"timestamp": "00:02:19.482"},"end": {"hours": 0,"minutes": 2,"seconds": 21,"milliseconds": 609,"timestamp": "00:02:21.609"},"content": "This subtitle has . as a delimiter in the timestamp."},{"index": 3,"start": {"hours": 0,"minutes": 3,"seconds": 13,"milliseconds": 336,"timestamp": "00:03:13,336"},"end": {"hours": 0,"minutes": 3,"seconds": 15,"milliseconds": 167,"timestamp": "00:03:15,167"},"content": "4"},{"index": 4,"start": {"hours": 0,"minutes": 3,"seconds": 18,"milliseconds": 608,"timestamp": "00:03:18,608"},"end": {"hours": 0,"minutes": 3,"seconds": 20,"milliseconds": 371,"timestamp": "00:03:20,371"},"content": "The previous subtitle has a number."},{"index": 5,"start": {"hours": 0,"minutes": 3,"seconds": 20,"milliseconds": 476,"timestamp": "00:03:20,476"},"end": {"hours": 0,"minutes": 3,"seconds": 22,"milliseconds": 671,"timestamp": "00:03:22,671"},"content": "This is a valid subtitle."}]

19
tests/fixtures/generic/srt-complex.srt vendored Normal file

@ -0,0 +1,19 @@
1
00:02:16,612 --> 00:02:19,376
--> --> --> -->
2
00:02:19.482 --> 00:02:21.609
This subtitle has . as a delimiter in the timestamp.
3
00:03:13,336 --> 00:03:15,167
4
4
00:03:18,608 --> 00:03:20,371
The previous subtitle has a number.
5
00:03:20,476 --> 00:03:22,671
This is a valid subtitle.


@ -0,0 +1,33 @@
Slot: 1
Volume: /dev/sdb1
Virtual Device: /dev/mapper/veracrypt1
Mount Directory: /home/bob/mount/encrypt/sdb1
Size: 498 MiB
Type: Normal
Read-Only: No
Hidden Volume Protected: No
Encryption Algorithm: AES
Primary Key Size: 256 bits
Secondary Key Size (XTS Mode): 256 bits
Block Size: 128 bits
Mode of Operation: XTS
PKCS-5 PRF: HMAC-SHA-512
Volume Format Version: 2
Embedded Backup Header: Yes
Slot: 2
Volume: /dev/sdb2
Virtual Device: /dev/mapper/veracrypt2
Mount Directory: /home/bob/mount/encrypt/sdb2
Size: 522 MiB
Type: Normal
Read-Only: No
Hidden Volume Protected: No
Encryption Algorithm: AES
Primary Key Size: 256 bits
Secondary Key Size (XTS Mode): 256 bits
Block Size: 128 bits
Mode of Operation: XTS
PKCS-5 PRF: HMAC-SHA-512
Volume Format Version: 2
Embedded Backup Header: Yes


@ -0,0 +1,35 @@
Slot: 1
Volume: /dev/sdb1
Virtual Device: /dev/mapper/veracrypt1
Mount Directory: /home/bob/mount/encrypt/sdb1
Size: 498 MiB
Type: Normal
Read-Only: No
Hidden Volume Protected: No
Encryption Algorithm: AES
Primary Key Size: 256 bits
Secondary Key Size (XTS Mode): 256 bits
Block Size: 128 bits
Mode of Operation: XTS
Label: bar
PKCS-5 PRF: HMAC-SHA-512
Volume Format Version: 2
Embedded Backup Header: Yes
Slot: 2
Volume: /dev/sdb2
Virtual Device: /dev/mapper/veracrypt2
Mount Directory: /home/bob/mount/encrypt/sdb2
Size: 522 MiB
Type: Normal
Read-Only: No
Hidden Volume Protected: No
Encryption Algorithm: AES
Primary Key Size: 256 bits
Secondary Key Size (XTS Mode): 256 bits
Block Size: 128 bits
Mode of Operation: XTS
Label: foo
PKCS-5 PRF: HMAC-SHA-512
Volume Format Version: 2
Embedded Backup Header: Yes


@ -0,0 +1 @@
[{"tbs_certificate":{"version":"v1","serial_number":"","signature":{"algorithm":"sha512_rsa","parameters":null},"issuer":{"country_name":"DE","state_or_province_name":"stateOrProvinceName","locality_name":"localityName","organization_name":"organizationName","organizational_unit_name":"organizationUnitName","common_name":"commonName","email_address":"emailAddress"},"validity":{"not_before":1686181858,"not_after":2001541858,"not_before_iso":"2023-06-07T23:50:58+00:00","not_after_iso":"2033-06-04T23:50:58+00:00"},"subject":{"country_name":"DE","state_or_province_name":"stateOrProvinceName","locality_name":"localityName","organization_name":"organizationName","organizational_unit_name":"organizationUnitName","common_name":"commonName","email_address":"emailAddress"},"subject_public_key_info":{"algorithm":{"algorithm":"rsa","parameters":null},"public_key":{"modulus":"aa:72:23:53:97:a6:e4:4e:7b:08:82:35:a5:3d:3a:83:f9:63:38:07:df:b8:38:61:7f:99:92:c8:31:6f:7f:ac:91:a4:47:64:7e:f9:2f:e0:9e:fd:d6:35:ee:50:78:55:47:fa:63:d4:b9:64:dc:d6:1d:f6:d6:67:4f:45:d1:96:81:3b:28:28:5f:c7:91:2f:a3:d5:a2:8d:3b:a0:21:91:25:6b:a9:40:5c:a4:8d:66:17:2a:3f:6e:61:74:fb:f4:35:25:e1:d1:64:aa:15:6c:6d:33:b6:f9:07:f2:a2:29:83:1c:b1:e5:97:3b:3e:14:ea:48:d6:c7:31:ea:3a:79:c1:28:a0:a7:ea:a6:7e:cf:c7:a3:00:d5:0d:70:00:f4:34:28:ab:f6:a3:80:7a:6f:01:9c:43:4a:a8:37:13:16:11:8f:e2:57:80:1d:df:50:4f:a3:2b:35:d9:d2:7d:1e:b6:b1:e4:b5:86:f2:a3:1c:63:c0:c2:e9:3e:f0:cf:23:e8:33:b4:da:ee:59:73:e9:94:16:1b:dd:33:8a:44:31:de:36:e2:58:1f:0e:75:fd:54:4b:6d:83:5f:a6:a1:dc:b6:1d:fc:45:1d:c9:1b:7a:01:d6:cc:0c:3d:1a:96:8b:0d:3b:20:a8:40:07:e0:c5:df:ad:1a:a2:86:47:f9:ca:f6:c5:a8:99:b8:60:e8:e2:09:ea:f5:0e:97:86:07:a6:ac:50:6b:19:06:f4:37:39:9a:0d:65:bb:89:e6:ae:eb:f3:a9:cd:72:c3:31:36:ef:ac:90:48:19:d0:84:df:b2:6d:9d:ef:6c:fd:9a:ff:3c:26:68:72:80:c2:c0:40:04:ba:84:39:69:5c:e9:b1:10:98:61:3d:1a:5c:a8:9e:79:48:2e:51:d0:c3:69:27:74:c1:ef:e2:98:2a:38:3c:6e:ea:7e:36:75:d3:3c:12:f5:cd:b2:a0:8a:0a:19:68:59:30:15:e3:cf:d3:4b:f4:99:a1:5a:3c:1f:c0:34:a3:e0:88:7a:44:6d:27:a9:87:2f:91:71:b4:c7:bb:c7:01:e2:fa:53:ef:09:1b:46:7b:df:52:f8:7a:cf:03:36:f9:b6:ce:a1:1c:3f:65:46:f8:13:cd:ac:9a:e2:19:43:26:b7:4a:2b:bd:da:94:d1:18:26:41:6e:19:2d:e1:6f:df:c4:c1:43:f6:8e:1e:99:d9:da:b2:8a:58:5e:5e:e8:a9:0c:4c:1d:a0:0f:50:b8:79:4b:3a:8a:4d:7a:7f:f4:10:b3:e8:d6:41:ec:57:e3:d1:c0:e1:fc:50:20:1c:f5:ad:84:a8:f6:af:2e:f4:cb:45:b7:4a:40:af:63:66:39:9b:73","public_exponent":65537}},"issuer_unique_id":null,"subject_unique_id":null,"extensions":[{"extn_id":"subject_alt_name","critical":false,"extn_value":["m\\xe4x@m\\xfcstermann.de"]}],"serial_number_str":"0"},"signature_algorithm":{"algorithm":"sha512_rsa","parameters":null},"signature_value":"78:ca:9f:d4:e7:e0:e9:95:6d:99:8f:ba:ca:69:ff:bd:2e:db:9f:4b:15:e5:ea:b8:c2:58:16:29:c2:2d:24:a3:62:36:91:61:ec:4b:99:e4:09:f9:a9:9b:fa:03:73:c1:ea:05:a9:ef:29:28:29:f6:00:aa:82:f8:53:1c:f0:6e:c0:87:ad:b2:93:24:ae:ba:56:f8:1c:62:54:23:d4:d5:66:a5:e1:36:cd:48:13:ad:fd:7b:4d:ff:c1:ee:de:fe:2f:d9:af:0e:82:7b:b0:58:2d:0c:e5:86:70:97:40:a5:ee:99:9a:96:59:14:8b:63:37:c5:04:07:17:58:04:56:d3:d9:71:a8:9c:c3:2f:21:77:19:ac:4d:95:83:f1:9f:91:0c:a3:8b:9c:1d:0e:0a:45:ed:e2:84:f9:57:6a:fa:5b:20:a8:15:26:d2:d8:34:2a:60:a7:d3:54:70:71:c3:17:aa:d7:3d:65:f5:5f:4e:a9:41:a2:e3:a7:c0:b4:5e:af:0b:48:64:f5:3a:08:0b:ec:c3:77:42:f8:13:19:45:19:7f:f8:09:79:1b:32:e2:9c:c2:91:b3:8d:e0:f4:e5:3f:9d:36:ae:22:a4:a8:d1:53:5b:c6:e3:ff:cb:a3:c0:47:ef:fd:b6:08:07:7a:97:1b:bf:cf:08:e0:5d:d1:4a:19:8a:14:c2:22:d0:79:b7:dc:76:d2:35:08:40:f8:33:80:8e:91:39:16:89:f5:51:18:d7:09:6
2:8d:47:ed:c6:e6:07:9d:d4:a8:3c:7a:df:e0:0d:bb:9a:a8:42:44:59:5d:f7:7b:f7:53:54:5f:0b:7f:1b:65:8d:df:bd:78:c9:e5:f8:57:e3:6b:e7:1f:d4:20:20:c3:0a:18:e2:6e:fa:10:e8:49:54:c7:25:6e:a1:5d:28:5f:45:f2:f1:c5:52:0e:28:c6:64:3a:4b:a6:d2:aa:66:e3:4d:fd:b2:3d:9c:30:b5:35:85:c8:44:93:53:f6:98:21:22:7c:36:8d:12:d9:d2:05:84:d0:22:b6:db:92:59:81:ea:26:3f:53:7b:a8:e8:34:c6:64:21:c0:e6:5b:3e:2b:23:6a:8b:dd:2d:63:25:46:ab:e7:a5:e4:1c:53:f0:e5:46:bb:80:17:da:ee:45:cf:da:34:34:3c:f4:61:a4:9e:00:92:a0:72:42:52:d9:9c:31:d0:90:6d:a7:90:53:9c:6a:49:83:55:f8:45:4a:1b:0c:da:65:1b:a3:d4:8c:b2:36:88:c3:c9:e2:ac:e2:93:e6:7c:fc:f6:e6:1b:35:21:26:d6:75:32:dc:98:dd:ba:7d:90:d8:48:25:36:7b:2e:f6:a1:72:bd:01"}]


@ -0,0 +1,34 @@
-----BEGIN CERTIFICATE-----
MIIF9DCCA9wCAQAwDQYJKoZIhvcNAQENBQAwga4xCzAJBgNVBAYTAkRFMRwwGgYD
VQQIDBNzdGF0ZU9yUHJvdmluY2VOYW1lMRUwEwYDVQQHDAxsb2NhbGl0eU5hbWUx
GTAXBgNVBAoMEG9yZ2FuaXphdGlvbk5hbWUxHTAbBgNVBAsMFG9yZ2FuaXphdGlv
blVuaXROYW1lMRMwEQYDVQQDDApjb21tb25OYW1lMRswGQYJKoZIhvcNAQkBFgxl
bWFpbEFkZHJlc3MwHhcNMjMwNjA3MjM1MDU4WhcNMzMwNjA0MjM1MDU4WjCBrjEL
MAkGA1UEBhMCREUxHDAaBgNVBAgME3N0YXRlT3JQcm92aW5jZU5hbWUxFTATBgNV
BAcMDGxvY2FsaXR5TmFtZTEZMBcGA1UECgwQb3JnYW5pemF0aW9uTmFtZTEdMBsG
A1UECwwUb3JnYW5pemF0aW9uVW5pdE5hbWUxEzARBgNVBAMMCmNvbW1vbk5hbWUx
GzAZBgkqhkiG9w0BCQEWDGVtYWlsQWRkcmVzczCCAiIwDQYJKoZIhvcNAQEBBQAD
ggIPADCCAgoCggIBAKpyI1OXpuROewiCNaU9OoP5YzgH37g4YX+Zksgxb3+skaRH
ZH75L+Ce/dY17lB4VUf6Y9S5ZNzWHfbWZ09F0ZaBOygoX8eRL6PVoo07oCGRJWup
QFykjWYXKj9uYXT79DUl4dFkqhVsbTO2+QfyoimDHLHllzs+FOpI1scx6jp5wSig
p+qmfs/HowDVDXAA9DQoq/ajgHpvAZxDSqg3ExYRj+JXgB3fUE+jKzXZ0n0etrHk
tYbyoxxjwMLpPvDPI+gztNruWXPplBYb3TOKRDHeNuJYHw51/VRLbYNfpqHcth38
RR3JG3oB1swMPRqWiw07IKhAB+DF360aooZH+cr2xaiZuGDo4gnq9Q6XhgemrFBr
GQb0NzmaDWW7ieau6/OpzXLDMTbvrJBIGdCE37Jtne9s/Zr/PCZocoDCwEAEuoQ5
aVzpsRCYYT0aXKieeUguUdDDaSd0we/imCo4PG7qfjZ10zwS9c2yoIoKGWhZMBXj
z9NL9JmhWjwfwDSj4Ih6RG0nqYcvkXG0x7vHAeL6U+8JG0Z731L4es8DNvm2zqEc
P2VG+BPNrJriGUMmt0orvdqU0RgmQW4ZLeFv38TBQ/aOHpnZ2rKKWF5e6KkMTB2g
D1C4eUs6ik16f/QQs+jWQexX49HA4fxQIBz1rYSo9q8u9MtFt0pAr2NmOZtzAgMB
AAGjIDAeMBwGA1UdEQQVMBOBEW3keEBt/HN0ZXJtYW5uLmRlMA0GCSqGSIb3DQEB
DQUAA4ICAQB4yp/U5+DplW2Zj7rKaf+9LtufSxXl6rjCWBYpwi0ko2I2kWHsS5nk
Cfmpm/oDc8HqBanvKSgp9gCqgvhTHPBuwIetspMkrrpW+BxiVCPU1Wal4TbNSBOt
/XtN/8Hu3v4v2a8OgnuwWC0M5YZwl0Cl7pmallkUi2M3xQQHF1gEVtPZcaicwy8h
dxmsTZWD8Z+RDKOLnB0OCkXt4oT5V2r6WyCoFSbS2DQqYKfTVHBxwxeq1z1l9V9O
qUGi46fAtF6vC0hk9ToIC+zDd0L4ExlFGX/4CXkbMuKcwpGzjeD05T+dNq4ipKjR
U1vG4//Lo8BH7/22CAd6lxu/zwjgXdFKGYoUwiLQebfcdtI1CED4M4COkTkWifVR
GNcJYo1H7cbmB53UqDx63+ANu5qoQkRZXfd791NUXwt/G2WN3714yeX4V+Nr5x/U
ICDDChjibvoQ6ElUxyVuoV0oX0Xy8cVSDijGZDpLptKqZuNN/bI9nDC1NYXIRJNT
9pghInw2jRLZ0gWE0CK225JZgeomP1N7qOg0xmQhwOZbPisjaovdLWMlRqvnpeQc
U/DlRruAF9ruRc/aNDQ89GGkngCSoHJCUtmcMdCQbaeQU5xqSYNV+EVKGwzaZRuj
1IyyNojDyeKs4pPmfPz25hs1ISbWdTLcmN26fZDYSCU2ey72oXK9AQ==
-----END CERTIFICATE-----
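
The JSON document above is the expected parse of this PEM fixture; note the backslash-escaped
email address ("m\xe4x@m\xfcstermann.de") in its subject_alt_name extension value. Below is a
minimal sketch of running such a fixture through jc, assuming the high-level jc.parse() API and
the x509_cert parser name; the file path is hypothetical:

    import jc

    # Hypothetical path; any file containing the PEM block above would work.
    with open("x509-cert-nonconforming-email.pem", "r") as f:
        pem_data = f.read()

    result = jc.parse("x509_cert", pem_data, quiet=True)

    # Key names taken from the expected JSON fixture above.
    cert = result[0]["tbs_certificate"]
    print(cert["subject"]["common_name"])        # commonName
    print(cert["extensions"][0]["extn_value"])   # decoded subject_alt_name entries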


@@ -0,0 +1 @@
[{"certification_request_info":{"version":"v1","subject":{"country_name":"US","state_or_province_name":"CA","locality_name":"San Francisco","organization_name":"jc","organizational_unit_name":"jc","common_name":"jc","email_address":"info@jc.com"},"subject_pk_info":{"algorithm":{"algorithm":"rsa","parameters":null},"public_key":{"modulus":"a3:6d:d5:69:43:b3:c2:00:87:a6:3e:57:6f:24:4c:d7:65:47:b5:87:bc:2f:3e:03:f5:bd:d8:96:36:d0:69:99:5e:3b:fd:cc:6a:3d:d9:ce:75:d8:36:ff:0e:08:55:ac:2b:ab:c5:ec:72:39:a4:12:ad:25:5f:1c:d6:c3:46:53:1b:fe:c9:53:fe:bf:18:33:64:2f:5c:36:f0:99:a1:46:4e:bc:84:19:88:0f:62:d5:14:65:c7:74:e3:a4:4c:50:6e:e0:5f:58:2a:e4:1a:10:fb:54:0b:7f:c6:d0:55:30:ea:8a:30:d2:ce:7e:93:c2:c9:0d:fd:75:20:89:70:51:49:3b:51:cb:6c:f7:5d:39:ee:d8:13:92:7d:31:04:61:e0:49:9f:34:ed:60:2a:72:a8:0b:a5:bf:2e:04:9f:61:13:7c:94:f1:e0:75:43:e0:5a:74:a4:78:36:50:f8:9b:c2:d4:e3:e2:f7:87:09:71:c7:b1:4c:53:60:3b:b3:1f:20:12:c4:cb:16:35:8b:a7:e1:00:01:a4:db:9e:c0:e7:e3:b3:a5:9b:ea:3d:38:1e:07:41:19:4c:48:34:92:71:c7:ee:ab:78:09:6f:f6:2b:78:c9:63:cc:46:1b:ad:e4:1d:96:dd:18:df:c9:66:79:73:ee:ee:d2:77:66:f5:3f","public_exponent":65537}},"attributes":[]},"signature_algorithm":{"algorithm":"sha256_rsa","parameters":null},"signature":"4a:8c:ea:3c:8a:4f:ae:f7:32:cf:46:f4:aa:a2:30:43:c8:21:26:6d:58:d4:89:12:3d:ed:69:7a:c8:3f:c7:62:d2:79:24:23:5d:53:b4:8d:6f:dd:60:82:bf:ab:46:14:bb:74:5d:92:7c:7f:f6:ad:0d:0c:74:fa:15:93:5e:ae:61:b2:dd:2c:89:a1:c9:c1:21:b9:92:57:39:be:05:98:97:e1:39:3c:5e:9a:18:56:b6:2a:db:62:51:23:d8:3c:f8:f9:dd:c1:5f:5a:85:a3:b6:e2:93:95:12:1a:6f:bb:93:14:50:e8:72:a0:b9:d9:4b:cb:8c:b6:08:35:60:4a:ba:9d:6d:e6:ba:7f:ea:a8:fe:d7:28:70:f4:c2:d1:29:92:94:c4:ff:ad:1e:b3:ad:c9:92:3e:b8:02:29:3c:e7:d6:84:a2:d4:77:46:bd:28:15:35:c6:f4:2c:a0:08:e8:9d:75:93:63:8d:e8:c8:be:60:12:da:09:a4:b7:bc:48:97:ca:fb:18:73:1f:0a:43:3d:9d:f4:8f:77:ad:7e:d2:21:b4:53:87:a1:89:d5:ee:cb:08:7c:5d:0f:a5:37:32:f2:74:53:67:97:59:f6:d4:fa:0b:9a:55:08:e3:5a:55:1b:e1:5c:f8:00:62:fb:83:e0:13:30:b1:71:5c:5c:8d"}]


@@ -0,0 +1 @@
[{"certification_request_info":{"version":"v1","subject":{"country_name":"US","state_or_province_name":"NY","locality_name":"NY","organization_name":"My Company","organizational_unit_name":"IT","common_name":"www.mywebsite.com"},"subject_pk_info":{"algorithm":{"algorithm":"rsa","parameters":null},"public_key":{"modulus":"b5:92:fc:6c:31:40:34:d3:9b:34:d7:3d:be:4e:ee:33:39:ad:5a:ba:a1:fe:a9:c8:2d:c7:b0:db:e6:d0:d1:7d:37:68:4b:47:5e:06:61:4c:9e:cc:b0:2f:85:c8:49:a4:2b:96:45:f6:62:5f:24:50:0f:95:72:47:e5:62:f2:10:95:97:90:7e:b9:0f:11:a1:9d:c4:90:85:4d:68:04:bd:a0:c0:43:58:4c:f8:13:ba:98:3d:97:e4:68:8f:c3:55:fb:e0:d2:61:95:23:01:f1:7b:d8:61:c8:df:e1:3f:c6:fb:9d:ba:c1:e3:e3:44:2a:c1:dc:5c:a1:92:7a:95:3c:f1:e5:3f:55:4e:fe:22:3c:c0:2f:78:de:cc:7b:8c:ab:00:6d:f1:db:c9:eb:91:aa:d7:f0:89:c4:0c:60:a8:8c:cb:4a:af:bc:1c:1e:61:2d:cf:89:84:0f:ff:f6:dd:7f:61:0d:49:67:52:90:64:b1:e0:6c:e6:d8:28:00:08:11:0b:3a:cd:92:27:57:4f:0b:95:0d:59:90:06:a0:38:18:06:87:97:5a:48:97:10:fa:9d:dd:ef:dc:de:c6:88:dd:24:6b:1a:5e:4f:ab:ea:67:21:41:37:9a:7f:ac:cd:1c:3a:7a:0e:1f:d7:6b:28:91:ca:bd:2b:a8:ab:b8:38:19","public_exponent":65537}},"attributes":[{"type":"microsoft_os_version","values":["6.1.7601.2"]},{"type":"microsoft_request_client_info","values":[{"clientid":5,"machinename":"dell-PC","username":"dell-PC\\Dev","processname":"InetMgr.exe"}]},{"type":"microsoft_enrollment_csp_provider","values":[{"keyspec":1,"cspname":"Microsoft RSA SChannel Cryptographic Provider","signature":[]}]},{"type":"extension_request","values":[[{"extn_id":"key_usage","critical":true,"extn_value":["data_encipherment","digital_signature","key_encipherment","non_repudiation"]},{"extn_id":"extended_key_usage","critical":false,"extn_value":["server_auth"]},{"extn_id":"1.2.840.113549.1.9.15","critical":false,"extn_value":"30:69:30:0e:06:08:2a:86:48:86:f7:0d:03:02:02:02:00:80:30:0e:06:08:2a:86:48:86:f7:0d:03:04:02:02:00:80:30:0b:06:09:60:86:48:01:65:03:04:01:2a:30:0b:06:09:60:86:48:01:65:03:04:01:2d:30:0b:06:09:60:86:48:01:65:03:04:01:02:30:0b:06:09:60:86:48:01:65:03:04:01:05:30:07:06:05:2b:0e:03:02:07:30:0a:06:08:2a:86:48:86:f7:0d:03:07"},{"extn_id":"key_identifier","critical":false,"extn_value":"02:3f:12:86:0b:e5:e7:b6:48:cc:b3:57:b7:8b:1e:27:81:5f:0b:08"}]]}]},"signature_algorithm":{"algorithm":"sha1_rsa","parameters":null},"signature":"56:74:46:d0:35:ab:e1:fe:c6:07:5f:d7:d7:1c:0f:3b:28:c6:a0:80:e8:d2:95:58:04:61:ac:d1:b9:af:20:bc:f9:fd:15:84:54:87:d4:d3:8f:c8:46:35:68:c1:21:21:92:c9:a7:60:43:6a:83:f0:d8:6f:a5:c5:e4:da:97:bd:15:cd:b3:9b:93:96:f0:29:37:b7:2c:01:96:54:43:43:56:a9:8e:df:1d:db:16:05:ce:e4:f7:dd:15:84:c7:32:d5:f2:c9:8e:a7:64:2b:ab:ba:71:6a:1a:ca:2f:94:2c:6e:36:aa:d0:12:00:83:e5:0f:5a:67:a9:5c:4f:c7:76:5b:67:a9:69:ee:5b:0b:36:78:5b:11:c2:50:41:ed:61:e8:da:de:16:10:7e:4f:5e:16:96:80:56:ae:0a:14:8a:01:3e:71:98:e4:bf:95:27:51:fb:f7:07:5a:d2:9e:3e:d2:46:35:68:1f:cf:c9:52:8d:31:99:ae:4a:98:5d:78:c0:f4:e3:98:6c:2a:6e:94:15:3f:bd:f5:ec:2a:fe:87:0a:0c:82:57:ed:4e:d5:54:e8:10:ff:9d:df:52:62:77:ce:a5:1a:bd:c8:3e:4b:1a:f0:14:5f:30:15:4e:7d:54:a3:d2:19:8a:0c:04:e5:84:3d:df:f9:a5:bd:2b:1b:1f"}]


@@ -0,0 +1,25 @@
-----BEGIN NEW CERTIFICATE REQUEST-----
MIIERTCCAy0CAQAwZTELMAkGA1UEBhMCVVMxCzAJBgNVBAgMAk5ZMQswCQYDVQQH
DAJOWTETMBEGA1UECgwKTXkgQ29tcGFueTELMAkGA1UECwwCSVQxGjAYBgNVBAMM
EXd3dy5teXdlYnNpdGUuY29tMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKC
AQEAtZL8bDFANNObNNc9vk7uMzmtWrqh/qnILcew2+bQ0X03aEtHXgZhTJ7MsC+F
yEmkK5ZF9mJfJFAPlXJH5WLyEJWXkH65DxGhncSQhU1oBL2gwENYTPgTupg9l+Ro
j8NV++DSYZUjAfF72GHI3+E/xvudusHj40QqwdxcoZJ6lTzx5T9VTv4iPMAveN7M
e4yrAG3x28nrkarX8InEDGCojMtKr7wcHmEtz4mED//23X9hDUlnUpBkseBs5tgo
AAgRCzrNkidXTwuVDVmQBqA4GAaHl1pIlxD6nd3v3N7GiN0kaxpeT6vqZyFBN5p/
rM0cOnoOH9drKJHKvSuoq7g4GQIDAQABoIIBmTAaBgorBgEEAYI3DQIDMQwWCjYu
MS43NjAxLjIwNQYJKwYBBAGCNxUUMSgwJgIBBQwHZGVsbC1QQwwLZGVsbC1QQ1xE
ZXYMC0luZXRNZ3IuZXhlMHIGCisGAQQBgjcNAgIxZDBiAgEBHloATQBpAGMAcgBv
AHMAbwBmAHQAIABSAFMAQQAgAFMAQwBoAGEAbgBuAGUAbAAgAEMAcgB5AHAAdABv
AGcAcgBhAHAAaABpAGMAIABQAHIAbwB2AGkAZABlAHIDAQAwgc8GCSqGSIb3DQEJ
DjGBwTCBvjAOBgNVHQ8BAf8EBAMCBPAwEwYDVR0lBAwwCgYIKwYBBQUHAwEweAYJ
KoZIhvcNAQkPBGswaTAOBggqhkiG9w0DAgICAIAwDgYIKoZIhvcNAwQCAgCAMAsG
CWCGSAFlAwQBKjALBglghkgBZQMEAS0wCwYJYIZIAWUDBAECMAsGCWCGSAFlAwQB
BTAHBgUrDgMCBzAKBggqhkiG9w0DBzAdBgNVHQ4EFgQUAj8Shgvl57ZIzLNXt4se
J4FfCwgwDQYJKoZIhvcNAQEFBQADggEBAFZ0RtA1q+H+xgdf19ccDzsoxqCA6NKV
WARhrNG5ryC8+f0VhFSH1NOPyEY1aMEhIZLJp2BDaoPw2G+lxeTal70VzbObk5bw
KTe3LAGWVENDVqmO3x3bFgXO5PfdFYTHMtXyyY6nZCurunFqGsovlCxuNqrQEgCD
5Q9aZ6lcT8d2W2epae5bCzZ4WxHCUEHtYeja3hYQfk9eFpaAVq4KFIoBPnGY5L+V
J1H79wda0p4+0kY1aB/PyVKNMZmuSphdeMD045hsKm6UFT+99ewq/ocKDIJX7U7V
VOgQ/53fUmJ3zqUavcg+SxrwFF8wFU59VKPSGYoMBOWEPd/5pb0rGx8=
-----END NEW CERTIFICATE REQUEST-----

tests/fixtures/generic/x509-csr.der vendored Normal file

Binary file not shown.

tests/fixtures/generic/x509-csr.json vendored Normal file

@@ -0,0 +1 @@
[{"certification_request_info":{"version":"v1","subject":{"country_name":"US","state_or_province_name":"Utah","locality_name":"Lindon","organization_name":"DigiCert Inc.","organizational_unit_name":"DigiCert","common_name":"example.digicert.com"},"subject_pk_info":{"algorithm":{"algorithm":"rsa","parameters":null},"public_key":{"modulus":"f3:e4:e8:ed:df:b6:90:f5:9e:06:ff:e8:ad:4d:cb:55:b2:70:0e:b4:90:6d:e2:9a:98:29:a8:c2:9e:5b:a8:3c:48:c1:5d:b4:ce:a4:5b:ec:03:d4:38:a6:28:54:41:45:38:44:2c:e9:3e:a0:22:69:c8:a2:58:5b:88:7e:a6:e3:38:19:fc:23:ef:58:13:a4:65:cf:9c:d4:fa:36:12:6b:c1:cf:e0:03:e6:c0:5d:4f:99:33:19:00:3a:35:b5:b2:64:69:5d:c5:1b:61:34:b3:ac:d5:e7:ce:85:d9:d6:16:e8:48:d7:ad:aa:99:c7:e5:82:98:88:58:3b:b0:ab:80:bd:7f:e6:24:78:98:4d:9f:d7:45:e7:ea:30:9b:c7:0e:42:60:eb:57:c3:4d:76:24:ea:8a:7f:2a:de:a6:00:1c:72:51:5b:6f:20:94:95:02:66:44:d9:c0:86:92:47:a7:2b:05:0f:13:6d:83:44:d1:d7:3e:09:a6:b7:0c:e2:24:cf:51:0e:b0:75:b3:4f:1f:a7:d3:32:9f:a9:c6:e0:5e:2e:03:27:1f:82:d5:b8:e9:b5:83:d1:04:f6:4b:f0:30:1e:5a:e0:3c:79:bb:9d:55:3e:38:c8:4a:7c:d8:6f:7a:fc:68:1c:7f:b1:77:df:13:31:7b:4c:9c:f9:76:ba:a3","public_exponent":65537}},"attributes":[]},"signature_algorithm":{"algorithm":"sha1_rsa","parameters":null},"signature":"1d:24:72:b1:5c:71:29:85:0e:6c:68:c7:43:5e:d3:55:08:a9:2b:03:a8:78:0b:f9:79:87:4d:72:70:ad:ee:83:84:94:99:c1:bb:c4:b4:e2:b4:1b:7f:9d:af:81:6c:d7:55:ae:50:db:79:a9:c2:ec:c7:96:bc:ba:4e:06:e8:02:87:33:3b:a1:2e:c2:7b:5d:98:e0:99:05:c6:10:2a:58:43:89:82:df:24:f7:66:80:86:a4:85:db:c3:e8:8f:de:59:84:11:78:1a:40:bd:13:c7:92:c5:97:fa:24:29:b2:98:c0:8a:8d:8b:22:96:38:c8:fb:65:1f:f0:c5:68:3f:64:31:91:b3:9e:71:ba:87:8b:0c:9f:d9:44:57:fd:6c:8f:88:68:25:1d:d5:8a:df:61:c1:c8:97:71:bc:ec:0b:fe:af:8f:58:57:0a:91:0d:3d:15:0d:5e:ee:2e:0a:a7:db:d5:c8:d4:fa:55:50:d0:8f:40:69:fd:a7:f7:97:e9:0a:3b:be:90:da:3f:26:d1:b4:0d:91:ed:72:ca:8d:06:85:f6:85:d6:78:25:2a:cb:58:6f:25:a7:3d:40:53:b6:f7:b3:9b:d5:a9:69:1c:fa:19:ee:65:a2:12:e2:70:8c:13:e2:8b:a6:bd:33:d1:b7:d2:75:28:df:d9:41:8b:5c"}]

tests/fixtures/generic/x509-csr.pem vendored Normal file

@@ -0,0 +1,17 @@
-----BEGIN CERTIFICATE REQUEST-----
MIICvDCCAaQCAQAwdzELMAkGA1UEBhMCVVMxDTALBgNVBAgMBFV0YWgxDzANBgNV
BAcMBkxpbmRvbjEWMBQGA1UECgwNRGlnaUNlcnQgSW5jLjERMA8GA1UECwwIRGln
aUNlcnQxHTAbBgNVBAMMFGV4YW1wbGUuZGlnaWNlcnQuY29tMIIBIjANBgkqhkiG
9w0BAQEFAAOCAQ8AMIIBCgKCAQEA8+To7d+2kPWeBv/orU3LVbJwDrSQbeKamCmo
wp5bqDxIwV20zqRb7APUOKYoVEFFOEQs6T6gImnIolhbiH6m4zgZ/CPvWBOkZc+c
1Po2EmvBz+AD5sBdT5kzGQA6NbWyZGldxRthNLOs1efOhdnWFuhI162qmcflgpiI
WDuwq4C9f+YkeJhNn9dF5+owm8cOQmDrV8NNdiTqin8q3qYAHHJRW28glJUCZkTZ
wIaSR6crBQ8TbYNE0dc+Caa3DOIkz1EOsHWzTx+n0zKfqcbgXi4DJx+C1bjptYPR
BPZL8DAeWuA8ebudVT44yEp82G96/Ggcf7F33xMxe0yc+Xa6owIDAQABoAAwDQYJ
KoZIhvcNAQEFBQADggEBAB0kcrFccSmFDmxox0Ne01UIqSsDqHgL+XmHTXJwre6D
hJSZwbvEtOK0G3+dr4Fs11WuUNt5qcLsx5a8uk4G6AKHMzuhLsJ7XZjgmQXGECpY
Q4mC3yT3ZoCGpIXbw+iP3lmEEXgaQL0Tx5LFl/okKbKYwIqNiyKWOMj7ZR/wxWg/
ZDGRs55xuoeLDJ/ZRFf9bI+IaCUd1YrfYcHIl3G87Av+r49YVwqRDT0VDV7uLgqn
29XI1PpVUNCPQGn9p/eX6Qo7vpDaPybRtA2R7XLKjQaF9oXWeCUqy1hvJac9QFO2
97Ob1alpHPoZ7mWiEuJwjBPii6a9M9G30nUo39lBi1w=
-----END CERTIFICATE REQUEST-----
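
The request above is the tests/fixtures/generic/x509-csr.pem fixture and pairs with the
x509-csr.json document shown earlier (subject common_name example.digicert.com, sha1_rsa
signature). A minimal sketch of parsing it, assuming the new certificate request parser is
registered as x509_csr; the parser name is an assumption, not confirmed by this diff:

    import jc

    # Fixture path taken from the diff header above.
    with open("tests/fixtures/generic/x509-csr.pem", "r") as f:
        csr_data = f.read()

    result = jc.parse("x509_csr", csr_data, quiet=True)

    # Key names taken from the expected JSON fixture above.
    info = result[0]["certification_request_info"]
    print(info["subject"]["common_name"])                 # example.digicert.com
    print(result[0]["signature_algorithm"]["algorithm"])  # sha1_rsa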

File diff suppressed because it is too large


@@ -0,0 +1,74 @@
Netid State Recv-Q Send-Q Local Address:Port Peer Address:Port
udp UNCONN 0 0 0.0.0.0:4789 0.0.0.0:* ino:58296 sk:1 <->
udp UNCONN 0 0 0.0.0.0:5353 0.0.0.0:* users:(("avahi-daemon",pid=1510,fd=12)) uid:112 ino:25147 sk:2 <->
udp UNCONN 0 0 0.0.0.0:39996 0.0.0.0:* users:(("snmpd",pid=1996,fd=12)) ino:42903 sk:3 <->
udp UNCONN 0 0 0.0.0.0:7946 0.0.0.0:* users:(("dockerd",pid=2840,fd=28)) ino:61138 sk:4 <->
udp UNCONN 0 0 0.0.0.0:8125 0.0.0.0:* users:(("dockerd",pid=2840,fd=104)) ino:78939 sk:5 <->
udp UNCONN 0 0 192.168.122.1:53 0.0.0.0:* users:(("dnsmasq",pid=2783,fd=5)) ino:43666 sk:6 <->
udp UNCONN 0 0 127.0.0.53%lo:53 0.0.0.0:* users:(("systemd-resolve",pid=1117,fd=12)) uid:101 ino:29400 sk:7 <->
udp UNCONN 0 0 0.0.0.0%virbr0:67 0.0.0.0:* users:(("dnsmasq",pid=2783,fd=3)) ino:43663 sk:8 <->
udp UNCONN 0 0 0.0.0.0:68 0.0.0.0:* users:(("dhclient",pid=4852,fd=6)) ino:55971 sk:9 <->
udp UNCONN 0 0 0.0.0.0:68 0.0.0.0:* users:(("dhclient",pid=4738,fd=6)) ino:46019 sk:a <->
udp UNCONN 0 0 0.0.0.0:111 0.0.0.0:* users:(("rpcbind",pid=1119,fd=6)) ino:24994 sk:b <->
udp UNCONN 0 0 0.0.0.0:161 0.0.0.0:* users:(("snmpd",pid=1996,fd=11)) ino:42905 sk:c <->
udp UNCONN 0 0 0.0.0.0:631 0.0.0.0:* users:(("cups-browsed",pid=1880,fd=7)) ino:46417 sk:d <->
udp UNCONN 0 0 0.0.0.0:871 0.0.0.0:* users:(("rpcbind",pid=1119,fd=7)) ino:34845 sk:e <->
udp UNCONN 0 0 0.0.0.0:35059 0.0.0.0:* users:(("avahi-daemon",pid=1510,fd=13)) uid:112 ino:37602 sk:f <->
tcp LISTEN 0 4096 0.0.0.0:9187 0.0.0.0:* users:(("prometheus-post",pid=1671,fd=3)) uid:128 ino:32721 sk:10 <->
tcp LISTEN 0 4096 0.0.0.0:389 0.0.0.0:* users:(("dockerd",pid=2840,fd=183)) ino:128462 sk:11 <->
tcp LISTEN 0 128 0.0.0.0:6566 0.0.0.0:* users:(("systemd",pid=1,fd=62)) ino:35013 sk:12 <->
tcp LISTEN 0 4096 0.0.0.0:2023 0.0.0.0:* users:(("dockerd",pid=2840,fd=97)) ino:78929 sk:13 <->
tcp LISTEN 0 4096 0.0.0.0:5000 0.0.0.0:* users:(("dockerd",pid=2840,fd=214)) ino:138297 sk:14 <->
tcp LISTEN 0 4096 0.0.0.0:2024 0.0.0.0:* users:(("dockerd",pid=2840,fd=99)) ino:75475 sk:15 <->
tcp LISTEN 0 4096 0.0.0.0:9000 0.0.0.0:* users:(("dockerd",pid=2840,fd=66)) ino:71580 sk:16 <->
tcp LISTEN 0 10 0.0.0.0:3689 0.0.0.0:* users:(("rhythmbox",pid=6716,fd=19)) uid:1000 ino:150684 sk:17 <->
tcp LISTEN 0 4096 0.0.0.0:27017 0.0.0.0:* users:(("dockerd",pid=2840,fd=196)) ino:135216 sk:18 <->
tcp LISTEN 0 4096 0.0.0.0:2377 0.0.0.0:* users:(("dockerd",pid=2840,fd=18)) ino:50423 sk:19 <->
tcp LISTEN 0 4096 0.0.0.0:7946 0.0.0.0:* users:(("dockerd",pid=2840,fd=27)) ino:61137 sk:1a <->
tcp LISTEN 0 80 0.0.0.0:3306 0.0.0.0:* users:(("mysqld",pid=2367,fd=24)) uid:125 ino:47161 sk:1b <->
tcp LISTEN 0 4096 0.0.0.0:6443 0.0.0.0:* users:(("dockerd",pid=2840,fd=149)) ino:83788 sk:1c <->
tcp LISTEN 0 4096 0.0.0.0:6379 0.0.0.0:* users:(("dockerd",pid=2840,fd=33)) ino:74330 sk:1d <->
tcp LISTEN 0 4096 0.0.0.0:9100 0.0.0.0:* users:(("prometheus-node",pid=1419,fd=3)) uid:128 ino:24436 sk:1e <->
tcp LISTEN 0 4096 0.0.0.0:9133 0.0.0.0:* users:(("fritzbox_export",pid=1951,fd=3)) uid:128 ino:42810 sk:1f <->
tcp LISTEN 0 4096 0.0.0.0:2222 0.0.0.0:* users:(("dockerd",pid=2840,fd=133)) ino:89671 sk:20 <->
tcp LISTEN 0 4096 0.0.0.0:2223 0.0.0.0:* users:(("dockerd",pid=2840,fd=146)) ino:98624 sk:21 <->
tcp LISTEN 0 4096 127.0.0.1:40783 0.0.0.0:* users:(("containerd",pid=2002,fd=14)) ino:42959 sk:22 <->
tcp LISTEN 0 128 0.0.0.0:111 0.0.0.0:* users:(("rpcbind",pid=1119,fd=8)) ino:34846 sk:23 <->
tcp LISTEN 0 511 0.0.0.0:80 0.0.0.0:* users:(("/usr/sbin/apach",pid=24795,fd=3),("/usr/sbin/apach",pid=2157,fd=3),("/usr/sbin/apach",pid=2156,fd=3),("/usr/sbin/apach",pid=2155,fd=3),("/usr/sbin/apach",pid=2154,fd=3),("/usr/sbin/apach",pid=2153,fd=3),("/usr/sbin/apach",pid=2079,fd=3)) ino:40480 sk:24 <->
tcp LISTEN 0 4096 0.0.0.0:9104 0.0.0.0:* users:(("prometheus-mysq",pid=1587,fd=3)) uid:128 ino:24481 sk:25 <->
tcp LISTEN 0 4096 0.0.0.0:8081 0.0.0.0:* users:(("dockerd",pid=2840,fd=205)) ino:131957 sk:26 <->
tcp LISTEN 0 4096 0.0.0.0:8082 0.0.0.0:* users:(("dockerd",pid=2840,fd=223)) ino:140597 sk:27 <->
tcp LISTEN 0 4096 0.0.0.0:2003 0.0.0.0:* users:(("dockerd",pid=2840,fd=85)) ino:75383 sk:28 <->
tcp LISTEN 0 128 127.0.0.1:5939 0.0.0.0:* users:(("teamviewerd",pid=4963,fd=12)) ino:59559 sk:29 <->
tcp LISTEN 0 4096 0.0.0.0:2004 0.0.0.0:* users:(("dockerd",pid=2840,fd=96)) ino:71664 sk:2a <->
tcp LISTEN 0 500 0.0.0.0:8084 0.0.0.0:* users:(("Main",pid=1799,fd=5)) uid:33 ino:44981 sk:2b <->
tcp LISTEN 0 4096 0.0.0.0:8085 0.0.0.0:* users:(("dockerd",pid=2840,fd=100)) ino:77932 sk:2c <->
tcp LISTEN 0 32 192.168.122.1:53 0.0.0.0:* users:(("dnsmasq",pid=2783,fd=6)) ino:43667 sk:2d <->
tcp LISTEN 0 128 127.0.0.53%lo:53 0.0.0.0:* users:(("systemd-resolve",pid=1117,fd=13)) uid:101 ino:29401 sk:2e <->
tcp LISTEN 0 4096 0.0.0.0:8086 0.0.0.0:* users:(("dockerd",pid=2840,fd=98)) ino:75513 sk:2f <->
tcp LISTEN 0 128 0.0.0.0:22 0.0.0.0:* users:(("sshd",pid=1998,fd=3)) ino:58511 sk:30 <->
tcp LISTEN 0 4096 0.0.0.0:8087 0.0.0.0:* users:(("dockerd",pid=2840,fd=157)) ino:98633 sk:31 <->
tcp LISTEN 0 5 127.0.0.1:631 0.0.0.0:* users:(("cupsd",pid=1410,fd=7)) ino:40509 sk:32 <->
tcp LISTEN 0 4096 0.0.0.0:9111 0.0.0.0:* users:(("syno_exporter-0",pid=1941,fd=3)) uid:128 ino:25196 sk:33 <->
tcp LISTEN 0 4096 0.0.0.0:3000 0.0.0.0:* users:(("dockerd",pid=2840,fd=42)) ino:89081 sk:34 <->
tcp LISTEN 0 4096 0.0.0.0:5432 0.0.0.0:* users:(("dockerd",pid=2840,fd=117)) ino:76497 sk:35 <->
tcp LISTEN 0 4096 0.0.0.0:3033 0.0.0.0:* users:(("dockerd",pid=2840,fd=139)) ino:89680 sk:36 <->
tcp LISTEN 0 4096 0.0.0.0:8089 0.0.0.0:* users:(("dockerd",pid=2840,fd=185)) ino:84656 sk:37 <->
tcp LISTEN 0 100 127.0.0.1:25 0.0.0.0:* users:(("master",pid=5745,fd=13)) ino:57959 sk:38 <->
tcp LISTEN 0 4096 0.0.0.0:3003 0.0.0.0:* users:(("dockerd",pid=2840,fd=23)) ino:90254 sk:39 <->
tcp LISTEN 0 4096 0.0.0.0:8091 0.0.0.0:* users:(("dockerd",pid=2840,fd=144)) ino:84694 sk:3a <->
tcp LISTEN 0 511 0.0.0.0:443 0.0.0.0:* users:(("/usr/sbin/apach",pid=24795,fd=4),("/usr/sbin/apach",pid=2157,fd=4),("/usr/sbin/apach",pid=2156,fd=4),("/usr/sbin/apach",pid=2155,fd=4),("/usr/sbin/apach",pid=2154,fd=4),("/usr/sbin/apach",pid=2153,fd=4),("/usr/sbin/apach",pid=2079,fd=4)) ino:40485 sk:3b <->
tcp LISTEN 0 100 0.0.0.0:1883 0.0.0.0:* users:(("mosquitto",pid=1798,fd=4)) ino:38377 sk:3c <->
tcp LISTEN 0 4096 0.0.0.0:9115 0.0.0.0:* users:(("prometheus-blac",pid=1651,fd=3)) uid:128 ino:25119 sk:3d <->
tcp LISTEN 0 4096 0.0.0.0:636 0.0.0.0:* users:(("dockerd",pid=2840,fd=187)) ino:125799 sk:3e <->
tcp LISTEN 0 4096 0.0.0.0:8092 0.0.0.0:* users:(("dockerd",pid=2840,fd=37)) ino:87972 sk:3f <->
tcp LISTEN 0 4096 0.0.0.0:9116 0.0.0.0:* users:(("snmp_exporter",pid=1928,fd=3)) uid:128 ino:45204 sk:40 <->
tcp LISTEN 0 1024 127.0.0.1:2812 0.0.0.0:* users:(("monit",pid=1806,fd=6)) ino:37711 sk:41 <->
tcp LISTEN 0 4096 0.0.0.0:446 0.0.0.0:* users:(("dockerd",pid=2840,fd=143)) ino:98615 sk:42 <->
tcp LISTEN 0 4096 0.0.0.0:8126 0.0.0.0:* users:(("dockerd",pid=2840,fd=105)) ino:77945 sk:43 <->
tcp LISTEN 0 100 127.0.0.1:49152 0.0.0.0:* users:(("TabNine-deep-lo",pid=9583,fd=22)) uid:1000 ino:486863 sk:6ae <->
tcp LISTEN 0 4096 127.0.0.1:46624 0.0.0.0:* users:(("kited",pid=27494,fd=3)) uid:1000 ino:110003 sk:44 <->
tcp LISTEN 0 4096 0.0.0.0:8000 0.0.0.0:* users:(("dockerd",pid=2840,fd=50)) ino:73647 sk:45 <->
tcp LISTEN 0 128 0.0.0.0:5665 0.0.0.0:* users:(("icinga2",pid=5337,fd=16)) uid:136 ino:57169 sk:46 <->
tcp LISTEN 0 4096 192.168.178.5:10050 0.0.0.0:* users:(("zabbix_agent2",pid=6359,fd=10)) uid:131 ino:64927 sk:47 <->
tcp LISTEN 0 4096 0.0.0.0:9090 0.0.0.0:* users:(("prometheus",pid=1543,fd=7)) uid:128 ino:32704 sk:48 <->
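
The ss listing above carries extended per-socket details (users:(...), uid:, ino:, sk:) in
addition to the plain columns. A minimal sketch of parsing such a capture with jc's high-level
API; the fixture path is hypothetical, and only long-standing field names are used here since
the keys for the extended details are not shown in this diff:

    import jc

    # Hypothetical path holding the ss capture shown above.
    with open("ss-extra-fields.out", "r") as f:
        ss_output = f.read()

    entries = jc.parse("ss", ss_output, quiet=True)

    # 'netid' and 'state' are existing ss parser fields.
    tcp_listeners = [e for e in entries
                     if e.get("netid") == "tcp" and e.get("state") == "LISTEN"]
    print(len(tcp_listeners))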


@@ -0,0 +1 @@
[{"file": "/tmp/folder/folder", "extents": true}, {"file": "/tmp/folder/folder/test_file", "compression_requested": true, "extents": true}]


@@ -0,0 +1,5 @@
--------------e----- /tmp/folder/folder
/tmp/folder/folder:
--------c-----e----- /tmp/folder/folder/test_file
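
The lsattr output above maps to the JSON document shown just before it, one boolean key per
attribute flag (e for extents, c for compression_requested). A minimal sketch, assuming the new
parser is registered under the name lsattr:

    import jc

    lsattr_output = (
        "--------------e----- /tmp/folder/folder\n"
        "/tmp/folder/folder:\n"
        "--------c-----e----- /tmp/folder/folder/test_file\n"
    )

    result = jc.parse("lsattr", lsattr_output, quiet=True)
    # Expected, per the JSON fixture above:
    # [{'file': '/tmp/folder/folder', 'extents': True},
    #  {'file': '/tmp/folder/folder/test_file',
    #   'compression_requested': True, 'extents': True}]
    print(result)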


@@ -0,0 +1 @@
[]


@@ -0,0 +1 @@
lsattr: Operation not supported While reading flags on /etc/apache2/mods-enabled/autoindex.load


@@ -0,0 +1 @@
[{"file": "/tmp/folder/folder/test_file", "compression_requested": true, "extents": true}]


@@ -0,0 +1 @@
--------c-----e----- /tmp/folder/folder/test_file


@@ -0,0 +1 @@
[{"file": "/etc/passwd", "size": 3399, "blocks": 8, "io_blocks": 4096, "type": "regular file", "device": "810h/2064d", "inode": 6392, "links": 1, "access": "0644", "flags": "-rw-r--r--", "uid": 0, "user": "root", "gid": 0, "group": "root", "access_time": "2023-06-01 13:12:52.776423000 -0700", "modify_time": "2023-01-07 03:10:13.752410800 -0800", "change_time": "2023-01-07 03:10:13.752410800 -0800", "access_time_epoch": 1685650372, "access_time_epoch_utc": null, "modify_time_epoch": 1673089813, "modify_time_epoch_utc": null, "change_time_epoch": 1673089813, "change_time_epoch_utc": null}]


@@ -0,0 +1,7 @@
File: /etc/passwd
Size: 3399 Blocks: 8 IO Block: 4096 regular file
Device: 810h/2064d Inode: 6392 Links: 1
Access: (0644/-rw-r--r--) Uid: ( 0/ root) Gid: ( 0/ root)
Access: 2023-06-01 13:12:52.776423000 -0700
Modify: 2023-01-07 03:10:13.752410800 -0800
Change: 2023-01-07 03:10:13.752410800 -0800
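
This stat output omits fields that newer stat builds print (there is no Birth: line, for
example), and the JSON document above it is the expected parse. A minimal sketch; the fixture
file name is hypothetical:

    import jc

    # Hypothetical fixture name; the content is the stat output shown above.
    with open("stat-missing-fields.out", "r") as f:
        stat_output = f.read()

    result = jc.parse("stat", stat_output, quiet=True)

    # Key names taken from the expected JSON shown above.
    print(result[0]["size"])                # 3399
    print(result[0]["access_time_epoch"])   # 1685650372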

File diff suppressed because one or more lines are too long


@@ -0,0 +1,264 @@
Active Connections
Proto Local Address Foreign Address State
TCP 0.0.0.0:135 0.0.0.0:0 LISTENING
TCP 0.0.0.0:443 0.0.0.0:0 LISTENING
TCP 0.0.0.0:445 0.0.0.0:0 LISTENING
TCP 0.0.0.0:902 0.0.0.0:0 LISTENING
TCP 0.0.0.0:912 0.0.0.0:0 LISTENING
TCP 0.0.0.0:1134 0.0.0.0:0 LISTENING
TCP 0.0.0.0:2179 0.0.0.0:0 LISTENING
TCP 0.0.0.0:2869 0.0.0.0:0 LISTENING
TCP 0.0.0.0:4325 0.0.0.0:0 LISTENING
TCP 0.0.0.0:5040 0.0.0.0:0 LISTENING
TCP 0.0.0.0:7070 0.0.0.0:0 LISTENING
TCP 0.0.0.0:7680 0.0.0.0:0 LISTENING
TCP 0.0.0.0:13656 0.0.0.0:0 LISTENING
TCP 0.0.0.0:49664 0.0.0.0:0 LISTENING
TCP 0.0.0.0:49665 0.0.0.0:0 LISTENING
TCP 0.0.0.0:49666 0.0.0.0:0 LISTENING
TCP 0.0.0.0:49667 0.0.0.0:0 LISTENING
TCP 0.0.0.0:49668 0.0.0.0:0 LISTENING
TCP 0.0.0.0:49681 0.0.0.0:0 LISTENING
TCP 0.0.0.0:57621 0.0.0.0:0 LISTENING
TCP 127.0.0.1:1031 127.0.0.1:1032 ESTABLISHED
TCP 127.0.0.1:1032 127.0.0.1:1031 ESTABLISHED
TCP 127.0.0.1:1310 0.0.0.0:0 LISTENING
TCP 127.0.0.1:1316 127.0.0.1:1317 ESTABLISHED
TCP 127.0.0.1:1317 127.0.0.1:1316 ESTABLISHED
TCP 127.0.0.1:1318 127.0.0.1:1319 ESTABLISHED
TCP 127.0.0.1:1319 127.0.0.1:1318 ESTABLISHED
TCP 127.0.0.1:1320 127.0.0.1:1321 ESTABLISHED
TCP 127.0.0.1:1321 127.0.0.1:1320 ESTABLISHED
TCP 127.0.0.1:1322 127.0.0.1:1323 ESTABLISHED
TCP 127.0.0.1:1323 127.0.0.1:1322 ESTABLISHED
TCP 127.0.0.1:2132 127.0.0.1:2133 ESTABLISHED
TCP 127.0.0.1:2133 127.0.0.1:2132 ESTABLISHED
TCP 127.0.0.1:3199 127.0.0.1:3200 ESTABLISHED
TCP 127.0.0.1:3200 127.0.0.1:3199 ESTABLISHED
TCP 127.0.0.1:5481 127.0.0.1:5482 ESTABLISHED
TCP 127.0.0.1:5482 127.0.0.1:5481 ESTABLISHED
TCP 127.0.0.1:5939 0.0.0.0:0 LISTENING
TCP 127.0.0.1:8307 0.0.0.0:0 LISTENING
TCP 127.0.0.1:8884 0.0.0.0:0 LISTENING
TCP 127.0.0.1:29802 0.0.0.0:0 LISTENING
TCP 127.0.0.1:30776 0.0.0.0:0 LISTENING
TCP 127.0.0.1:30785 127.0.0.1:30786 ESTABLISHED
TCP 127.0.0.1:30786 127.0.0.1:30785 ESTABLISHED
TCP 127.0.0.1:30787 127.0.0.1:30788 ESTABLISHED
TCP 127.0.0.1:30788 127.0.0.1:30787 ESTABLISHED
TCP 127.0.0.1:30797 127.0.0.1:30798 ESTABLISHED
TCP 127.0.0.1:30798 127.0.0.1:30797 ESTABLISHED
TCP 127.0.0.1:30799 127.0.0.1:30800 ESTABLISHED
TCP 127.0.0.1:30800 127.0.0.1:30799 ESTABLISHED
TCP 127.0.0.1:30825 127.0.0.1:30826 ESTABLISHED
TCP 127.0.0.1:30826 127.0.0.1:30825 ESTABLISHED
TCP 127.0.0.1:30842 127.0.0.1:30843 ESTABLISHED
TCP 127.0.0.1:30843 127.0.0.1:30842 ESTABLISHED
TCP 127.0.0.1:31337 127.0.0.1:31338 ESTABLISHED
TCP 127.0.0.1:31338 127.0.0.1:31337 ESTABLISHED
TCP 127.0.0.1:44440 0.0.0.0:0 LISTENING
TCP 127.0.0.1:49673 127.0.0.1:49674 ESTABLISHED
TCP 127.0.0.1:49674 127.0.0.1:49673 ESTABLISHED
TCP 127.0.0.1:65001 0.0.0.0:0 LISTENING
TCP 172.17.96.1:139 0.0.0.0:0 LISTENING
TCP 172.26.240.1:139 0.0.0.0:0 LISTENING
TCP 172.27.128.1:139 0.0.0.0:0 LISTENING
TCP 192.168.24.1:139 0.0.0.0:0 LISTENING
TCP 192.168.31.221:139 0.0.0.0:0 LISTENING
TCP [::]:135 [::]:0 LISTENING
TCP [::]:443 [::]:0 LISTENING
TCP [::]:445 [::]:0 LISTENING
TCP [::]:2179 [::]:0 LISTENING
TCP [::]:2869 [::]:0 LISTENING
TCP [::]:7680 [::]:0 LISTENING
TCP [::]:13656 [::]:0 LISTENING
TCP [::]:49664 [::]:0 LISTENING
TCP [::]:49665 [::]:0 LISTENING
TCP [::]:49666 [::]:0 LISTENING
TCP [::]:49667 [::]:0 LISTENING
TCP [::]:49668 [::]:0 LISTENING
TCP [::]:49681 [::]:0 LISTENING
TCP [::1]:1025 [::]:0 LISTENING
TCP [::1]:8307 [::]:0 LISTENING
UDP 0.0.0.0:53 *:*
UDP 0.0.0.0:53 *:*
UDP 0.0.0.0:67 *:*
UDP 0.0.0.0:500 *:*
UDP 0.0.0.0:1900 *:*
UDP 0.0.0.0:1900 *:*
UDP 0.0.0.0:1900 *:*
UDP 0.0.0.0:1900 *:*
UDP 0.0.0.0:1900 *:*
UDP 0.0.0.0:1900 *:*
UDP 0.0.0.0:1900 *:*
UDP 0.0.0.0:1900 *:*
UDP 0.0.0.0:3702 *:*
UDP 0.0.0.0:3702 *:*
UDP 0.0.0.0:4500 *:*
UDP 0.0.0.0:5050 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5353 *:*
UDP 0.0.0.0:5355 *:*
UDP 0.0.0.0:49376 *:*
UDP 0.0.0.0:50001 *:*
UDP 0.0.0.0:50531 *:*
UDP 0.0.0.0:52304 *:*
UDP 0.0.0.0:52915 *:*
UDP 0.0.0.0:53086 *:*
UDP 0.0.0.0:55581 *:*
UDP 0.0.0.0:57180 *:*
UDP 0.0.0.0:57621 *:*
UDP 0.0.0.0:57752 *:*
UDP 0.0.0.0:57753 *:*
UDP 0.0.0.0:57754 *:*
UDP 0.0.0.0:57755 *:*
UDP 0.0.0.0:57756 *:*
UDP 0.0.0.0:57757 *:*
UDP 0.0.0.0:57758 *:*
UDP 0.0.0.0:57759 *:*
UDP 0.0.0.0:60874 *:*
UDP 0.0.0.0:62243 *:*
UDP 0.0.0.0:62245 *:*
UDP 0.0.0.0:62777 *:*
UDP 0.0.0.0:62808 *:*
UDP 0.0.0.0:64544 *:*
UDP 127.0.0.1:1900 *:*
UDP 127.0.0.1:10250 *:*
UDP 127.0.0.1:49826 *:*
UDP 127.0.0.1:52299 *:*
UDP 127.0.0.1:61810 *:*
UDP 172.17.96.1:137 *:*
UDP 172.17.96.1:138 *:*
UDP 172.17.96.1:1900 *:*
UDP 172.17.96.1:2177 *:*
UDP 172.17.96.1:5353 *:*
UDP 172.17.96.1:52300 *:*
UDP 172.26.240.1:67 *:*
UDP 172.26.240.1:68 *:*
UDP 172.26.240.1:137 *:*
UDP 172.26.240.1:138 *:*
UDP 172.26.240.1:1900 *:*
UDP 172.26.240.1:2177 *:*
UDP 172.26.240.1:5353 *:*
UDP 172.26.240.1:52292 *:*
UDP 172.27.128.1:137 *:*
UDP 172.27.128.1:138 *:*
UDP 172.27.128.1:1900 *:*
UDP 172.27.128.1:2177 *:*
UDP 172.27.128.1:5353 *:*
UDP 172.27.128.1:52293 *:*
UDP 192.168.24.1:137 *:*
UDP 192.168.24.1:138 *:*
UDP 192.168.24.1:1900 *:*
UDP 192.168.24.1:2177 *:*
UDP 192.168.24.1:5353 *:*
UDP 192.168.24.1:5353 *:*
UDP 192.168.24.1:52298 *:*
UDP 192.168.31.221:137 *:*
UDP 192.168.31.221:138 *:*
UDP 192.168.31.221:1900 *:*
UDP 192.168.31.221:2177 *:*
UDP 192.168.31.221:5353 *:*
UDP 192.168.31.221:52294 *:*
UDP 192.168.216.1:137 *:*
UDP 192.168.216.1:138 *:*
UDP 192.168.216.1:1900 *:*
UDP 192.168.216.1:2177 *:*
UDP 192.168.216.1:5353 *:*
UDP 192.168.216.1:5353 *:*
UDP 192.168.216.1:52295 *:*
UDP 192.168.234.1:137 *:*
UDP 192.168.234.1:138 *:*
UDP 192.168.234.1:1900 *:*
UDP 192.168.234.1:2177 *:*
UDP 192.168.234.1:5353 *:*
UDP 192.168.234.1:5353 *:*
UDP 192.168.234.1:52297 *:*
UDP 192.168.255.1:137 *:*
UDP 192.168.255.1:138 *:*
UDP 192.168.255.1:1900 *:*
UDP 192.168.255.1:2177 *:*
UDP 192.168.255.1:5353 *:*
UDP 192.168.255.1:5353 *:*
UDP 192.168.255.1:52296 *:*
UDP [::]:500 *:*
UDP [::]:3702 *:*
UDP [::]:3702 *:*
UDP [::]:4500 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5353 *:*
UDP [::]:5355 *:*
UDP [::]:49377 *:*
UDP [::]:52305 *:*
UDP [::]:62244 *:*
UDP [::]:62246 *:*
UDP [::]:62809 *:*
UDP [::1]:1900 *:*
UDP [::1]:5353 *:*
UDP [::1]:5353 *:*
UDP [::1]:52290 *:*
UDP [fe80::asda:4124:2096:62b2%9]:1900 *:*
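
The capture above is Windows netstat output (Proto, Local Address, Foreign Address, State
columns with LISTENING and ESTABLISHED states). A minimal sketch of parsing it with jc; the
fixture path is hypothetical and the field names used below are assumed to follow the existing
netstat schema, which this diff does not show:

    import jc

    # Hypothetical path holding the Windows netstat capture shown above.
    with open("netstat-windows.out", "r") as f:
        netstat_output = f.read()

    entries = jc.parse("netstat", netstat_output, quiet=True)

    # 'state' is an assumed field name matching the Linux netstat schema.
    listening = [e for e in entries if e.get("state") == "LISTENING"]
    print(len(listening))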

File diff suppressed because one or more lines are too long


@@ -0,0 +1,212 @@
Active Connections
Proto Local Address Foreign Address State PID
TCP 0.0.0.0:135 0.0.0.0:0 LISTENING 1712
TCP 0.0.0.0:443 0.0.0.0:0 LISTENING 9396
TCP 0.0.0.0:445 0.0.0.0:0 LISTENING 4
TCP 0.0.0.0:902 0.0.0.0:0 LISTENING 7388
TCP 0.0.0.0:49667 0.0.0.0:0 LISTENING 2160
TCP 0.0.0.0:49668 0.0.0.0:0 LISTENING 3952
TCP 0.0.0.0:49681 0.0.0.0:0 LISTENING 6316
TCP 0.0.0.0:57621 0.0.0.0:0 LISTENING 27696
TCP 127.0.0.1:1031 127.0.0.1:1032 ESTABLISHED 10644
TCP 127.0.0.1:1032 127.0.0.1:1031 ESTABLISHED 10644
TCP 127.0.0.1:1310 0.0.0.0:0 LISTENING 9328
TCP 127.0.0.1:1316 127.0.0.1:1317 ESTABLISHED 1112
TCP 127.0.0.1:1317 127.0.0.1:1316 ESTABLISHED 1112
TCP 127.0.0.1:1318 127.0.0.1:1319 ESTABLISHED 1112
TCP 127.0.0.1:1319 127.0.0.1:1318 ESTABLISHED 1112
TCP 127.0.0.1:1320 127.0.0.1:1321 ESTABLISHED 28104
TCP 127.0.0.1:1321 127.0.0.1:1320 ESTABLISHED 28104
TCP 127.0.0.1:1322 127.0.0.1:1323 ESTABLISHED 28104
TCP 127.0.0.1:1323 127.0.0.1:1322 ESTABLISHED 28104
TCP 127.0.0.1:2132 127.0.0.1:2133 ESTABLISHED 21952
TCP 127.0.0.1:2133 127.0.0.1:2132 ESTABLISHED 21952
TCP 127.0.0.1:3949 127.0.0.1:3950 ESTABLISHED 21952
TCP 127.0.0.1:3950 127.0.0.1:3949 ESTABLISHED 21952
TCP 127.0.0.1:5481 127.0.0.1:5482 ESTABLISHED 20136
TCP 127.0.0.1:5482 127.0.0.1:5481 ESTABLISHED 20136
TCP 127.0.0.1:5939 0.0.0.0:0 LISTENING 7340
TCP 127.0.0.1:8307 0.0.0.0:0 LISTENING 9396
TCP 127.0.0.1:8884 0.0.0.0:0 LISTENING 4
TCP 127.0.0.1:29802 0.0.0.0:0 LISTENING 6372
TCP 127.0.0.1:30776 0.0.0.0:0 LISTENING 30224
TCP 127.0.0.1:30785 127.0.0.1:30786 ESTABLISHED 25596
TCP 127.0.0.1:30786 127.0.0.1:30785 ESTABLISHED 25596
TCP 127.0.0.1:30787 127.0.0.1:30788 ESTABLISHED 25596
TCP 127.0.0.1:30788 127.0.0.1:30787 ESTABLISHED 25596
TCP 127.0.0.1:30797 127.0.0.1:30798 ESTABLISHED 20364
TCP 127.0.0.1:30798 127.0.0.1:30797 ESTABLISHED 20364
TCP 127.0.0.1:30799 127.0.0.1:30800 ESTABLISHED 20364
TCP 127.0.0.1:30800 127.0.0.1:30799 ESTABLISHED 20364
TCP 127.0.0.1:30825 127.0.0.1:30826 ESTABLISHED 22140
TCP 127.0.0.1:30826 127.0.0.1:30825 ESTABLISHED 22140
TCP 127.0.0.1:30842 127.0.0.1:30843 ESTABLISHED 23200
TCP 127.0.0.1:30843 127.0.0.1:30842 ESTABLISHED 23200
TCP 127.0.0.1:31337 127.0.0.1:31338 ESTABLISHED 18580
TCP 127.0.0.1:31338 127.0.0.1:31337 ESTABLISHED 18580
TCP 127.0.0.1:44440 0.0.0.0:0 LISTENING 6804
TCP 127.0.0.1:49673 127.0.0.1:49674 ESTABLISHED 4260
TCP 127.0.0.1:49674 127.0.0.1:49673 ESTABLISHED 4260
TCP 127.0.0.1:65001 0.0.0.0:0 LISTENING 7140
TCP 172.17.96.1:139 0.0.0.0:0 LISTENING 4
TCP 172.26.240.1:139 0.0.0.0:0 LISTENING 4
TCP 172.27.128.1:139 0.0.0.0:0 LISTENING 4
TCP 192.168.24.1:139 0.0.0.0:0 LISTENING 4
TCP 192.168.31.221:139 0.0.0.0:0 LISTENING 4
TCP 192.168.216.1:139 0.0.0.0:0 LISTENING 4
TCP 192.168.234.1:139 0.0.0.0:0 LISTENING 4
TCP 192.168.255.1:139 0.0.0.0:0 LISTENING 4
TCP [::]:135 [::]:0 LISTENING 1712
TCP [::]:443 [::]:0 LISTENING 9396
TCP [::]:445 [::]:0 LISTENING 4
TCP [::]:2179 [::]:0 LISTENING 4032
TCP [::]:2869 [::]:0 LISTENING 4
TCP [::]:7680 [::]:0 LISTENING 10404
TCP [::]:13656 [::]:0 LISTENING 1400
TCP [::]:49664 [::]:0 LISTENING 1428
TCP [::]:49665 [::]:0 LISTENING 1036
TCP [::]:49666 [::]:0 LISTENING 2152
TCP [::]:49667 [::]:0 LISTENING 2160
TCP [::]:49668 [::]:0 LISTENING 3952
TCP [::]:49681 [::]:0 LISTENING 6316
TCP [::1]:1025 [::]:0 LISTENING 6088
TCP [::1]:8307 [::]:0 LISTENING 9396
UDP 0.0.0.0:53 *:* 4808
UDP 0.0.0.0:53 *:* 4808
UDP 0.0.0.0:67 *:* 34064
UDP 0.0.0.0:500 *:* 6856
UDP 0.0.0.0:1900 *:* 27696
UDP 0.0.0.0:1900 *:* 27696
UDP 0.0.0.0:1900 *:* 27696
UDP 0.0.0.0:1900 *:* 27696
UDP 0.0.0.0:1900 *:* 27696
UDP 0.0.0.0:1900 *:* 27696
UDP 0.0.0.0:1900 *:* 27696
UDP 0.0.0.0:1900 *:* 27696
UDP 0.0.0.0:3702 *:* 6952
UDP 0.0.0.0:3702 *:* 6952
UDP 0.0.0.0:4500 *:* 6856
UDP 0.0.0.0:5050 *:* 13528
UDP 0.0.0.0:5353 *:* 14156
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5353 *:* 3404
UDP 0.0.0.0:5353 *:* 14156
UDP 0.0.0.0:5353 *:* 14156
UDP 0.0.0.0:5353 *:* 14156
UDP 0.0.0.0:5353 *:* 14156
UDP 0.0.0.0:5353 *:* 14156
UDP 0.0.0.0:5353 *:* 14156
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5353 *:* 14156
UDP 0.0.0.0:5353 *:* 14156
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5353 *:* 14156
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5353 *:* 14156
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5353 *:* 14156
UDP 0.0.0.0:5353 *:* 14156
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5353 *:* 14156
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5353 *:* 14156
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5353 *:* 14156
UDP 0.0.0.0:5353 *:* 27696
UDP 0.0.0.0:5355 *:* 3404
UDP 0.0.0.0:49376 *:* 7140
UDP 0.0.0.0:50001 *:* 6608
UDP 0.0.0.0:50531 *:* 21952
UDP 0.0.0.0:52304 *:* 7340
UDP 0.0.0.0:52506 *:* 19692
UDP 0.0.0.0:55581 *:* 23200
UDP 0.0.0.0:56621 *:* 21952
UDP 0.0.0.0:57180 *:* 18580
UDP 0.0.0.0:57621 *:* 27696
UDP 0.0.0.0:57752 *:* 27696
UDP 0.0.0.0:57753 *:* 27696
UDP 0.0.0.0:57754 *:* 27696
UDP 0.0.0.0:57755 *:* 27696
UDP 0.0.0.0:57756 *:* 27696
UDP 0.0.0.0:57757 *:* 27696
UDP 0.0.0.0:57758 *:* 27696
UDP 0.0.0.0:57759 *:* 27696
UDP 0.0.0.0:62243 *:* 4808
UDP 0.0.0.0:62245 *:* 4808
UDP 0.0.0.0:62808 *:* 6952
UDP 0.0.0.0:64544 *:* 4808
UDP 127.0.0.1:1900 *:* 8844
UDP 127.0.0.1:10250 *:* 6372
UDP 127.0.0.1:49826 *:* 5876
UDP 127.0.0.1:52299 *:* 8844
UDP 127.0.0.1:61810 *:* 27560
UDP 172.17.96.1:137 *:* 4
UDP 172.17.96.1:138 *:* 4
UDP 172.17.96.1:1900 *:* 8844
UDP 172.17.96.1:2177 *:* 28108
UDP 172.17.96.1:5353 *:* 7140
UDP 172.17.96.1:52300 *:* 8844
UDP 172.26.240.1:67 *:* 4808
UDP 172.26.240.1:68 *:* 4808
UDP 172.26.240.1:137 *:* 4
UDP 172.26.240.1:138 *:* 4
UDP 172.26.240.1:1900 *:* 8844
UDP 172.26.240.1:2177 *:* 28108
UDP 172.26.240.1:5353 *:* 7140
UDP 172.26.240.1:52292 *:* 8844
UDP 172.27.128.1:137 *:* 4
UDP 172.27.128.1:138 *:* 4
UDP 172.27.128.1:1900 *:* 8844
UDP 172.27.128.1:2177 *:* 28108
UDP 172.27.128.1:5353 *:* 7140
UDP 172.27.128.1:52293 *:* 8844
UDP 192.168.216.1:138 *:* 4
UDP 192.168.216.1:1900 *:* 8844
UDP 192.168.216.1:2177 *:* 28108
UDP 192.168.216.1:5353 *:* 7140
UDP 192.168.216.1:5353 *:* 7340
UDP 192.168.216.1:52295 *:* 8844
UDP 192.168.234.1:137 *:* 4
UDP 192.168.234.1:138 *:* 4
UDP 192.168.234.1:1900 *:* 8844
UDP 192.168.234.1:2177 *:* 28108
UDP 192.168.234.1:5353 *:* 7340
UDP 192.168.234.1:5353 *:* 7140
UDP 192.168.234.1:52297 *:* 8844
UDP 192.168.255.1:137 *:* 4
UDP 192.168.255.1:138 *:* 4
UDP 192.168.255.1:1900 *:* 8844
UDP 192.168.255.1:2177 *:* 28108
UDP 192.168.255.1:5353 *:* 7340
UDP 192.168.255.1:5353 *:* 7140
UDP 192.168.255.1:52296 *:* 8844
UDP [::]:5353 *:* 3404
UDP [::]:5355 *:* 3404
UDP [::]:49377 *:* 7140
UDP [::]:52305 *:* 7340
UDP [::]:62244 *:* 4808
UDP [::]:62246 *:* 4808
UDP [::]:62809 *:* 6952
UDP [::1]:1900 *:* 8844
UDP [::1]:5353 *:* 7140
UDP [::1]:5353 *:* 7340
UDP [::1]:52290 *:* 8844
UDP [fe80::244e:2f1e:2096:62b2%9]:1900 *:* 8844
UDP [fe80::f5d1:8bf7:c410:e325%64]:52284 *:* 8844

File diff suppressed because one or more lines are too long


@@ -0,0 +1,720 @@
Active Connections
Proto Local Address Foreign Address State PID
TCP 0.0.0.0:135 0.0.0.0:0 LISTENING 1712
RpcSs
[svchost.exe]
TCP 0.0.0.0:443 0.0.0.0:0 LISTENING 9396
[vmware-hostd.exe]
TCP 0.0.0.0:445 0.0.0.0:0 LISTENING 4
Can not obtain ownership information
TCP 0.0.0.0:902 0.0.0.0:0 LISTENING 7388
[vmware-authd.exe]
TCP 0.0.0.0:912 0.0.0.0:0 LISTENING 7388
[vmware-authd.exe]
TCP 0.0.0.0:1134 0.0.0.0:0 LISTENING 27696
[Spotify.exe]
TCP 0.0.0.0:2179 0.0.0.0:0 LISTENING 4032
[vmms.exe]
TCP 0.0.0.0:2869 0.0.0.0:0 LISTENING 4
Can not obtain ownership information
TCP 0.0.0.0:4325 0.0.0.0:0 LISTENING 5876
iphlpsvc
[svchost.exe]
TCP 0.0.0.0:5040 0.0.0.0:0 LISTENING 13528
CDPSvc
[svchost.exe]
TCP 0.0.0.0:7070 0.0.0.0:0 LISTENING 6608
[AnyDesk.exe]
TCP 0.0.0.0:7680 0.0.0.0:0 LISTENING 10404
Can not obtain ownership information
TCP 0.0.0.0:13656 0.0.0.0:0 LISTENING 1400
Can not obtain ownership information
TCP 0.0.0.0:49664 0.0.0.0:0 LISTENING 1428
[System]
TCP 0.0.0.0:49665 0.0.0.0:0 LISTENING 1036
Can not obtain ownership information
TCP 0.0.0.0:49666 0.0.0.0:0 LISTENING 2152
EventLog
[svchost.exe]
TCP 0.0.0.0:49667 0.0.0.0:0 LISTENING 2160
Schedule
[svchost.exe]
TCP 0.0.0.0:49668 0.0.0.0:0 LISTENING 3952
SessionEnv
[svchost.exe]
TCP 0.0.0.0:49681 0.0.0.0:0 LISTENING 6316
[spoolsv.exe]
TCP 0.0.0.0:57621 0.0.0.0:0 LISTENING 27696
[Spotify.exe]
TCP 127.0.0.1:1031 127.0.0.1:1032 ESTABLISHED 10644
[ProductAgentService.exe]
TCP 127.0.0.1:1032 127.0.0.1:1031 ESTABLISHED 10644
[ProductAgentService.exe]
TCP 127.0.0.1:1310 0.0.0.0:0 LISTENING 9328
[Code.exe]
TCP 127.0.0.1:1316 127.0.0.1:1317 ESTABLISHED 1112
[python.exe]
TCP 127.0.0.1:1317 127.0.0.1:1316 ESTABLISHED 1112
[python.exe]
TCP 127.0.0.1:1318 127.0.0.1:1319 ESTABLISHED 1112
[python.exe]
TCP 127.0.0.1:1319 127.0.0.1:1318 ESTABLISHED 1112
[python.exe]
TCP 127.0.0.1:1320 127.0.0.1:1321 ESTABLISHED 28104
[python.exe]
TCP 127.0.0.1:1321 127.0.0.1:1320 ESTABLISHED 28104
[python.exe]
TCP 127.0.0.1:1322 127.0.0.1:1323 ESTABLISHED 28104
[python.exe]
TCP 127.0.0.1:1323 127.0.0.1:1322 ESTABLISHED 28104
[python.exe]
TCP 127.0.0.1:2132 127.0.0.1:2133 ESTABLISHED 21952
[bdservicehost.exe]
TCP 127.0.0.1:2133 127.0.0.1:2132 ESTABLISHED 21952
[bdservicehost.exe]
TCP 127.0.0.1:3949 127.0.0.1:3950 ESTABLISHED 21952
[bdservicehost.exe]
TCP 127.0.0.1:3950 127.0.0.1:3949 ESTABLISHED 21952
[bdservicehost.exe]
TCP 127.0.0.1:5481 127.0.0.1:5482 ESTABLISHED 20136
[bdservicehost.exe]
TCP 127.0.0.1:5482 127.0.0.1:5481 ESTABLISHED 20136
[bdservicehost.exe]
TCP 127.0.0.1:5939 0.0.0.0:0 LISTENING 7340
[TeamViewer_Service.exe]
TCP 127.0.0.1:8307 0.0.0.0:0 LISTENING 9396
[vmware-hostd.exe]
TCP 127.0.0.1:8884 0.0.0.0:0 LISTENING 4
Can not obtain ownership information
TCP 127.0.0.1:29802 0.0.0.0:0 LISTENING 6372
[NVIDIA Web Helper.exe]
TCP 127.0.0.1:30776 0.0.0.0:0 LISTENING 30224
[Code.exe]
TCP 127.0.0.1:30785 127.0.0.1:30786 ESTABLISHED 25596
[python.exe]
TCP 127.0.0.1:30786 127.0.0.1:30785 ESTABLISHED 25596
[python.exe]
TCP 127.0.0.1:30787 127.0.0.1:30788 ESTABLISHED 25596
[python.exe]
TCP 127.0.0.1:30788 127.0.0.1:30787 ESTABLISHED 25596
[python.exe]
TCP 127.0.0.1:30797 127.0.0.1:30798 ESTABLISHED 20364
[python.exe]
TCP 127.0.0.1:30798 127.0.0.1:30797 ESTABLISHED 20364
[python.exe]
TCP 127.0.0.1:30799 127.0.0.1:30800 ESTABLISHED 20364
[python.exe]
TCP 127.0.0.1:30800 127.0.0.1:30799 ESTABLISHED 20364
[python.exe]
TCP 127.0.0.1:30825 127.0.0.1:30826 ESTABLISHED 22140
[bdvpnapp.exe]
TCP 127.0.0.1:30826 127.0.0.1:30825 ESTABLISHED 22140
[bdvpnapp.exe]
TCP 127.0.0.1:30842 127.0.0.1:30843 ESTABLISHED 23200
[bdagent.exe]
TCP 127.0.0.1:30843 127.0.0.1:30842 ESTABLISHED 23200
[bdagent.exe]
TCP 127.0.0.1:31337 127.0.0.1:31338 ESTABLISHED 18580
[bdwtxag.exe]
TCP 127.0.0.1:31338 127.0.0.1:31337 ESTABLISHED 18580
[bdwtxag.exe]
TCP 127.0.0.1:44440 0.0.0.0:0 LISTENING 6804
[FoxitConnectedPDFService.exe]
TCP 127.0.0.1:49673 127.0.0.1:49674 ESTABLISHED 4260
[bdvpnservice.exe]
TCP 127.0.0.1:49674 127.0.0.1:49673 ESTABLISHED 4260
[bdvpnservice.exe]
TCP 127.0.0.1:65001 0.0.0.0:0 LISTENING 7140
[nvcontainer.exe]
TCP 172.17.96.1:139 0.0.0.0:0 LISTENING 4
Can not obtain ownership information
TCP 172.26.240.1:139 0.0.0.0:0 LISTENING 4
Can not obtain ownership information
TCP 172.27.128.1:139 0.0.0.0:0 LISTENING 4
Can not obtain ownership information
TCP 192.168.24.1:139 0.0.0.0:0 LISTENING 4
Can not obtain ownership information
TCP 192.168.31.221:139 0.0.0.0:0 LISTENING 4
Can not obtain ownership information
TCP 192.168.31.221:1028 169.150.202.133:443 ESTABLISHED 6608
[AnyDesk.exe]
TCP 192.168.31.221:1053 66.102.1.188:5228 ESTABLISHED 19692
[chrome.exe]
TCP 192.168.31.221:1114 23.98.104.215:8883 ESTABLISHED 29992
[SupportAssistAgent.exe]
TCP 192.168.31.221:1135 104.199.65.124:4070 ESTABLISHED 27696
[Spotify.exe]
TCP 192.168.31.221:1143 35.186.224.47:443 ESTABLISHED 27696
[Spotify.exe]
TCP 192.168.31.221:1146 35.186.224.40:443 ESTABLISHED 20740
[Spotify.exe]
TCP 192.168.31.221:3405 149.154.167.91:443 ESTABLISHED 15276
[Telegram.exe]
TCP 192.168.31.221:3411 149.154.175.59:443 ESTABLISHED 15276
[Telegram.exe]
TCP 192.168.31.221:3504 40.84.185.67:9354 ESTABLISHED 35392
[ServiceHub.SettingsHost.exe]
TCP 192.168.31.221:3962 35.186.224.25:443 ESTABLISHED 20740
[Spotify.exe]
TCP 192.168.31.221:4215 34.120.68.241:443 ESTABLISHED 21952
[bdservicehost.exe]
TCP 192.168.31.221:4290 104.17.107.108:443 ESTABLISHED 18580
[bdwtxag.exe]
TCP 192.168.31.221:4582 140.82.112.26:443 ESTABLISHED 19692
[chrome.exe]
TCP 192.168.31.221:4604 82.102.152.152:443 CLOSE_WAIT 27568
[SearchApp.exe]
TCP 192.168.31.221:4605 82.102.152.152:443 CLOSE_WAIT 27568
[SearchApp.exe]
TCP 192.168.31.221:4607 192.229.221.95:80 CLOSE_WAIT 27568
[SearchApp.exe]
TCP 192.168.31.221:4608 82.102.152.161:443 CLOSE_WAIT 27568
[SearchApp.exe]
TCP 192.168.31.221:4611 82.102.152.179:443 CLOSE_WAIT 27568
[SearchApp.exe]
TCP 192.168.31.221:4612 82.102.152.179:443 CLOSE_WAIT 27568
[SearchApp.exe]
TCP 192.168.31.221:4630 35.186.224.18:443 ESTABLISHED 20740
[Spotify.exe]
TCP 192.168.31.221:4634 34.120.68.241:443 ESTABLISHED 18580
[bdwtxag.exe]
TCP 192.168.31.221:4670 104.17.108.108:443 ESTABLISHED 21952
[bdservicehost.exe]
TCP 192.168.31.221:4699 20.54.232.160:443 ESTABLISHED 17684
CDPUserSvc_46a8de2e
[svchost.exe]
TCP 192.168.31.221:4713 157.240.252.61:5222 TIME_WAIT 0
TCP 192.168.31.221:4714 212.199.140.34:443 ESTABLISHED 33968
[WhatsApp.exe]
TCP 192.168.31.221:4715 212.199.140.162:443 ESTABLISHED 33968
[WhatsApp.exe]
TCP 192.168.31.221:4716 31.13.69.60:443 ESTABLISHED 33968
[WhatsApp.exe]
TCP 192.168.31.221:4717 157.240.252.60:443 ESTABLISHED 33968
[WhatsApp.exe]
TCP 192.168.31.221:4718 192.168.31.99:53 TIME_WAIT 0
TCP 192.168.31.221:4719 192.168.31.99:53 TIME_WAIT 0
TCP 192.168.31.221:49472 20.199.120.151:443 ESTABLISHED 7920
WpnService
[svchost.exe]
TCP 192.168.216.1:139 0.0.0.0:0 LISTENING 4
Can not obtain ownership information
TCP 192.168.234.1:139 0.0.0.0:0 LISTENING 4
Can not obtain ownership information
TCP 192.168.255.1:139 0.0.0.0:0 LISTENING 4
Can not obtain ownership information
TCP [::]:135 [::]:0 LISTENING 1712
RpcSs
[svchost.exe]
TCP [::]:443 [::]:0 LISTENING 9396
[vmware-hostd.exe]
TCP [::]:445 [::]:0 LISTENING 4
Can not obtain ownership information
TCP [::]:2179 [::]:0 LISTENING 4032
[vmms.exe]
TCP [::]:2869 [::]:0 LISTENING 4
Can not obtain ownership information
TCP [::]:7680 [::]:0 LISTENING 10404
Can not obtain ownership information
TCP [::]:13656 [::]:0 LISTENING 1400
Can not obtain ownership information
TCP [::]:49664 [::]:0 LISTENING 1428
[System]
TCP [::]:49665 [::]:0 LISTENING 1036
Can not obtain ownership information
TCP [::]:49666 [::]:0 LISTENING 2152
EventLog
[svchost.exe]
TCP [::]:49667 [::]:0 LISTENING 2160
Schedule
[svchost.exe]
TCP [::]:49668 [::]:0 LISTENING 3952
SessionEnv
[svchost.exe]
TCP [::]:49681 [::]:0 LISTENING 6316
[spoolsv.exe]
TCP [::1]:1025 [::]:0 LISTENING 6088
[jhi_service.exe]
TCP [::1]:8307 [::]:0 LISTENING 9396
[vmware-hostd.exe]
UDP 0.0.0.0:53 *:* 4808
SharedAccess
[svchost.exe]
UDP 0.0.0.0:53 *:* 4808
SharedAccess
[svchost.exe]
UDP 0.0.0.0:67 *:* 34064
[bdntwrk.exe]
UDP 0.0.0.0:500 *:* 6856
IKEEXT
[svchost.exe]
UDP 0.0.0.0:1900 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:1900 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:1900 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:1900 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:1900 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:1900 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:1900 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:1900 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:3702 *:* 6952
[dashost.exe]
UDP 0.0.0.0:3702 *:* 6952
[dashost.exe]
UDP 0.0.0.0:4500 *:* 6856
IKEEXT
[svchost.exe]
UDP 0.0.0.0:5050 *:* 13528
CDPSvc
[svchost.exe]
UDP 0.0.0.0:5353 *:* 14156
[chrome.exe]
UDP 0.0.0.0:5353 *:* 14156
[chrome.exe]
UDP 0.0.0.0:5353 *:* 3404
Dnscache
[svchost.exe]
UDP 0.0.0.0:5353 *:* 14156
[chrome.exe]
UDP 0.0.0.0:5353 *:* 14156
[chrome.exe]
UDP 0.0.0.0:5353 *:* 14156
[chrome.exe]
UDP 0.0.0.0:5353 *:* 14156
[chrome.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 14156
[chrome.exe]
UDP 0.0.0.0:5353 *:* 14156
[chrome.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 14156
[chrome.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 14156
[chrome.exe]
UDP 0.0.0.0:5353 *:* 14156
[chrome.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 14156
[chrome.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 14156
[chrome.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 14156
[chrome.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:5353 *:* 14156
[chrome.exe]
UDP 0.0.0.0:5353 *:* 14156
[chrome.exe]
UDP 0.0.0.0:5355 *:* 3404
Dnscache
[svchost.exe]
UDP 0.0.0.0:49376 *:* 7140
[nvcontainer.exe]
UDP 0.0.0.0:50001 *:* 6608
[AnyDesk.exe]
UDP 0.0.0.0:50157 *:* 20740
[Spotify.exe]
UDP 0.0.0.0:50531 *:* 21952
[bdservicehost.exe]
UDP 0.0.0.0:52304 *:* 7340
[TeamViewer_Service.exe]
UDP 0.0.0.0:55981 *:* 20740
[Spotify.exe]
UDP 0.0.0.0:56621 *:* 21952
[bdservicehost.exe]
UDP 0.0.0.0:57621 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:57752 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:57753 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:57754 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:57755 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:57756 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:57757 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:57758 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:57759 *:* 27696
[Spotify.exe]
UDP 0.0.0.0:59680 *:* 18580
[bdwtxag.exe]
UDP 0.0.0.0:59993 *:* 18580
[bdwtxag.exe]
UDP 0.0.0.0:62243 *:* 4808
SharedAccess
[svchost.exe]
UDP 0.0.0.0:62245 *:* 4808
SharedAccess
[svchost.exe]
UDP 0.0.0.0:62808 *:* 6952
[dashost.exe]
UDP 0.0.0.0:64105 *:* 20136
[bdservicehost.exe]
UDP 0.0.0.0:64544 *:* 4808
SharedAccess
[svchost.exe]
UDP 0.0.0.0:65042 *:* 21952
[bdservicehost.exe]
UDP 127.0.0.1:1900 *:* 8844
SSDPSRV
[svchost.exe]
UDP 127.0.0.1:10250 *:* 6372
[NVIDIA Web Helper.exe]
UDP 127.0.0.1:49826 *:* 5876
iphlpsvc
[svchost.exe]
UDP 127.0.0.1:52299 *:* 8844
SSDPSRV
[svchost.exe]
UDP 127.0.0.1:61810 *:* 27560
[nvcontainer.exe]
UDP 172.17.96.1:137 *:* 4
Can not obtain ownership information
UDP 172.17.96.1:138 *:* 4
Can not obtain ownership information
UDP 172.17.96.1:1900 *:* 8844
SSDPSRV
[svchost.exe]
UDP 172.17.96.1:2177 *:* 28108
QWAVE
[svchost.exe]
UDP 172.17.96.1:5353 *:* 7140
[nvcontainer.exe]
UDP 172.17.96.1:52300 *:* 8844
SSDPSRV
[svchost.exe]
UDP 172.26.240.1:67 *:* 4808
SharedAccess
[svchost.exe]
UDP 172.26.240.1:68 *:* 4808
SharedAccess
[svchost.exe]
UDP 172.26.240.1:137 *:* 4
Can not obtain ownership information
UDP 172.26.240.1:138 *:* 4
Can not obtain ownership information
UDP 172.26.240.1:1900 *:* 8844
SSDPSRV
[svchost.exe]
UDP 172.26.240.1:2177 *:* 28108
QWAVE
[svchost.exe]
UDP 172.26.240.1:5353 *:* 7140
[nvcontainer.exe]
UDP 172.26.240.1:52292 *:* 8844
SSDPSRV
[svchost.exe]
UDP 172.27.128.1:137 *:* 4
Can not obtain ownership information
UDP 172.27.128.1:138 *:* 4
Can not obtain ownership information
UDP 172.27.128.1:1900 *:* 8844
SSDPSRV
[svchost.exe]
UDP 172.27.128.1:2177 *:* 28108
QWAVE
[svchost.exe]
UDP 172.27.128.1:5353 *:* 7140
[nvcontainer.exe]
UDP 172.27.128.1:52293 *:* 8844
SSDPSRV
[svchost.exe]
UDP 192.168.24.1:137 *:* 4
Can not obtain ownership information
UDP 192.168.24.1:138 *:* 4
Can not obtain ownership information
UDP 192.168.24.1:1900 *:* 8844
SSDPSRV
[svchost.exe]
UDP 192.168.24.1:2177 *:* 28108
QWAVE
[svchost.exe]
UDP 192.168.24.1:5353 *:* 7140
[nvcontainer.exe]
UDP 192.168.24.1:5353 *:* 7340
[TeamViewer_Service.exe]
UDP 192.168.24.1:52298 *:* 8844
SSDPSRV
[svchost.exe]
UDP 192.168.31.221:137 *:* 4
Can not obtain ownership information
UDP 192.168.31.221:138 *:* 4
Can not obtain ownership information
UDP 192.168.31.221:1900 *:* 8844
SSDPSRV
[svchost.exe]
UDP 192.168.31.221:2177 *:* 28108
QWAVE
[svchost.exe]
UDP 192.168.31.221:5353 *:* 7140
[nvcontainer.exe]
UDP 192.168.31.221:52294 *:* 8844
SSDPSRV
[svchost.exe]
UDP 192.168.216.1:137 *:* 4
Can not obtain ownership information
UDP 192.168.216.1:138 *:* 4
Can not obtain ownership information
UDP 192.168.216.1:1900 *:* 8844
SSDPSRV
[svchost.exe]
UDP 192.168.216.1:2177 *:* 28108
QWAVE
[svchost.exe]
UDP 192.168.216.1:5353 *:* 7140
[nvcontainer.exe]
UDP 192.168.216.1:5353 *:* 7340
[TeamViewer_Service.exe]
UDP 192.168.216.1:52295 *:* 8844
SSDPSRV
[svchost.exe]
UDP 192.168.234.1:137 *:* 4
Can not obtain ownership information
UDP 192.168.234.1:138 *:* 4
Can not obtain ownership information
UDP 192.168.234.1:1900 *:* 8844
SSDPSRV
[svchost.exe]
UDP 192.168.234.1:2177 *:* 28108
QWAVE
[svchost.exe]
UDP 192.168.234.1:5353 *:* 7340
[TeamViewer_Service.exe]
UDP 192.168.234.1:5353 *:* 7140
[nvcontainer.exe]
UDP 192.168.234.1:52297 *:* 8844
SSDPSRV
[svchost.exe]
UDP 192.168.255.1:137 *:* 4
Can not obtain ownership information
UDP 192.168.255.1:138 *:* 4
Can not obtain ownership information
UDP 192.168.255.1:1900 *:* 8844
SSDPSRV
[svchost.exe]
UDP 192.168.255.1:2177 *:* 28108
QWAVE
[svchost.exe]
UDP 192.168.255.1:5353 *:* 7340
[TeamViewer_Service.exe]
UDP 192.168.255.1:5353 *:* 7140
[nvcontainer.exe]
UDP 192.168.255.1:52296 *:* 8844
SSDPSRV
[svchost.exe]
UDP [::]:500 *:* 6856
IKEEXT
[svchost.exe]
UDP [::]:3702 *:* 6952
[dashost.exe]
UDP [::]:3702 *:* 6952
[dashost.exe]
UDP [::]:4500 *:* 6856
IKEEXT
[svchost.exe]
UDP [::]:5353 *:* 27696
[Spotify.exe]
UDP [::]:5353 *:* 27696
[Spotify.exe]
UDP [::]:5353 *:* 27696
[Spotify.exe]
UDP [::]:5353 *:* 27696
[Spotify.exe]
UDP [::]:5353 *:* 27696
[Spotify.exe]
UDP [::]:5353 *:* 27696
[Spotify.exe]
UDP [::]:5353 *:* 27696
[Spotify.exe]
UDP [::]:5353 *:* 27696
[Spotify.exe]
UDP [::]:5353 *:* 27696
[Spotify.exe]
UDP [::]:5353 *:* 27696
[Spotify.exe]
UDP [::]:5353 *:* 27696
[Spotify.exe]
UDP [::]:5353 *:* 27696
[Spotify.exe]
UDP [::]:5353 *:* 27696
[Spotify.exe]
UDP [::]:5353 *:* 27696
[Spotify.exe]
UDP [::]:5353 *:* 14156
[chrome.exe]
UDP [::]:5353 *:* 27696
[Spotify.exe]
UDP [::]:5353 *:* 3404
Dnscache
[svchost.exe]
UDP [::]:5353 *:* 14156
[chrome.exe]
UDP [::]:5353 *:* 14156
[chrome.exe]
UDP [::]:5353 *:* 14156
[chrome.exe]
UDP [::]:5353 *:* 27696
[Spotify.exe]
UDP [::]:5353 *:* 14156
[chrome.exe]
UDP [::]:5353 *:* 14156
[chrome.exe]
UDP [::]:5353 *:* 14156
[chrome.exe]
UDP [::]:5353 *:* 14156
[chrome.exe]
UDP [::]:5355 *:* 3404
Dnscache
[svchost.exe]
UDP [::]:49377 *:* 7140
[nvcontainer.exe]
UDP [::]:52305 *:* 7340
[TeamViewer_Service.exe]
UDP [::]:62244 *:* 4808
SharedAccess
[svchost.exe]
UDP [::]:62246 *:* 4808
SharedAccess
[svchost.exe]
UDP [::]:62809 *:* 6952
[dashost.exe]
UDP [::1]:1900 *:* 8844
SSDPSRV
[svchost.exe]
UDP [::1]:5353 *:* 7340
[TeamViewer_Service.exe]
UDP [::1]:5353 *:* 7140
[nvcontainer.exe]
UDP [::1]:52290 *:* 8844
SSDPSRV
[svchost.exe]
UDP [fe80::244e:2f1e:2096:62b2%9]:1900 *:* 8844
SSDPSRV
[svchost.exe]
UDP [fe80::244e:2f1e:2096:62b2%9]:2177 *:* 28108
QWAVE
[svchost.exe]
UDP [fe80::244e:2f1e:2096:62b2%9]:52285 *:* 8844
SSDPSRV
[svchost.exe]
UDP [fe80::36de:3306:84b0:6717%21]:1900 *:* 8844
SSDPSRV
[svchost.exe]
UDP [fe80::36de:3306:84b0:6717%21]:2177 *:* 28108
QWAVE
[svchost.exe]
UDP [fe80::36de:3306:84b0:6717%21]:52289 *:* 8844
SSDPSRV
[svchost.exe]
UDP [fe80::8bd5:17a8:5e7d:9a50%52]:1900 *:* 8844
SSDPSRV
[svchost.exe]
UDP [fe80::8bd5:17a8:5e7d:9a50%52]:2177 *:* 28108
QWAVE
[svchost.exe]
UDP [fe80::8bd5:17a8:5e7d:9a50%52]:52183 *:* 8844
SSDPSRV
[svchost.exe]
UDP [fe80::a625:de92:858:a42a%57]:1900 *:* 8844
SSDPSRV
[svchost.exe]
UDP [fe80::a625:de92:858:a42a%57]:2177 *:* 28108
QWAVE
[svchost.exe]
UDP [fe80::a625:de92:858:a42a%57]:52291 *:* 8844
SSDPSRV
[svchost.exe]
UDP [fe80::af17:660e:b87c:eb22%22]:1900 *:* 8844
SSDPSRV
[svchost.exe]
UDP [fe80::af17:660e:b87c:eb22%22]:2177 *:* 28108
QWAVE
[svchost.exe]
UDP [fe80::af17:660e:b87c:eb22%22]:52286 *:* 8844
SSDPSRV
[svchost.exe]
UDP [fe80::d586:c72e:e178:1147%11]:1900 *:* 8844
SSDPSRV
[svchost.exe]
UDP [fe80::d586:c72e:e178:1147%11]:2177 *:* 28108
QWAVE
[svchost.exe]
UDP [fe80::d586:c72e:e178:1147%11]:52287 *:* 8844
SSDPSRV
[svchost.exe]
UDP [fe80::d8df:ceee:28ba:aad%14]:1900 *:* 8844
SSDPSRV
[svchost.exe]
UDP [fe80::d8df:ceee:28ba:aad%14]:2177 *:* 28108
QWAVE
[svchost.exe]
UDP [fe80::d8df:ceee:28ba:aad%14]:52288 *:* 8844
SSDPSRV
[svchost.exe]
UDP [fe80::f5d1:8bf7:c410:e325%64]:1900 *:* 8844
SSDPSRV
[svchost.exe]
UDP [fe80::f5d1:8bf7:c410:e325%64]:2177 *:* 28108
QWAVE
[svchost.exe]
UDP [fe80::f5d1:8bf7:c410:e325%64]:52284 *:* 8844
SSDPSRV
[svchost.exe]

File diff suppressed because one or more lines are too long


@@ -0,0 +1,36 @@
Active Connections
Proto Local Address Foreign Address State
TCP 127.0.0.1:1031 api:1032 ESTABLISHED
TCP 127.0.0.1:1032 api:1031 ESTABLISHED
TCP 127.0.0.1:1316 api:1317 ESTABLISHED
TCP 127.0.0.1:1317 api:1316 ESTABLISHED
TCP 127.0.0.1:1318 api:1319 ESTABLISHED
TCP 127.0.0.1:1319 api:1318 ESTABLISHED
TCP 127.0.0.1:1320 api:1321 ESTABLISHED
TCP 127.0.0.1:1321 api:1320 ESTABLISHED
TCP 127.0.0.1:1322 api:1323 ESTABLISHED
TCP 127.0.0.1:1323 api:1322 ESTABLISHED
TCP 127.0.0.1:2132 api:2133 ESTABLISHED
TCP 127.0.0.1:2133 api:2132 ESTABLISHED
TCP 127.0.0.1:3199 api:3200 ESTABLISHED
TCP 127.0.0.1:3200 api:3199 ESTABLISHED
TCP 127.0.0.1:5481 api:5482 ESTABLISHED
TCP 127.0.0.1:5482 api:5481 ESTABLISHED
TCP 127.0.0.1:30785 api:30786 ESTABLISHED
TCP 127.0.0.1:30786 api:30785 ESTABLISHED
TCP 127.0.0.1:30787 api:30788 ESTABLISHED
TCP 127.0.0.1:30788 api:30787 ESTABLISHED
TCP 127.0.0.1:30797 api:30798 ESTABLISHED
TCP 127.0.0.1:30798 api:30797 ESTABLISHED
TCP 127.0.0.1:30799 api:30800 ESTABLISHED
TCP 127.0.0.1:30800 api:30799 ESTABLISHED
TCP 127.0.0.1:30825 api:30826 ESTABLISHED
TCP 127.0.0.1:30826 api:30825 ESTABLISHED
TCP 127.0.0.1:30842 api:30843 ESTABLISHED
TCP 127.0.0.1:30843 api:30842 ESTABLISHED
TCP 127.0.0.1:31337 api:31338 ESTABLISHED
TCP 127.0.0.1:31338 api:31337 ESTABLISHED
TCP 127.0.0.1:49673 api:49674 ESTABLISHED
TCP 127.0.0.1:49674 api:49673 ESTABLISHED

File diff suppressed because one or more lines are too long


@@ -0,0 +1,111 @@
===========================================================================
Interface List
8...00 ff 58 60 5f 61 ......TAP-Windows Adapter V9
52...00 15 5d fd 0d 45 ......Hyper-V Virtual Ethernet Adapter
64...00 15 5d 14 4d b0 ......Hyper-V Virtual Ethernet Adapter #3
22...00 50 56 c0 00 01 ......VMware Virtual Ethernet Adapter for VMnet1
11...00 50 56 c0 00 08 ......VMware Virtual Ethernet Adapter for VMnet8
14...00 50 56 c0 00 02 ......VMware Virtual Ethernet Adapter for VMnet2
21...00 50 56 c0 00 0b ......VMware Virtual Ethernet Adapter for VMnet11
1...........................Software Loopback Interface 1
57...00 15 5d 0e f1 0f ......Hyper-V Virtual Ethernet Adapter #2
===========================================================================
IPv4 Route Table
===========================================================================
Active Routes:
Network Destination Netmask Gateway Interface Metric
0.0.0.0 0.0.0.0 192.168.31.1 192.168.31.221 30
127.0.0.0 255.0.0.0 On-link 127.0.0.1 331
127.0.0.1 255.255.255.255 On-link 127.0.0.1 331
127.255.255.255 255.255.255.255 On-link 127.0.0.1 331
157.0.0.0 255.0.0.0 157.55.80.1 127.0.0.1 78
172.17.96.0 255.255.240.0 On-link 172.17.96.1 5256
172.17.96.1 255.255.255.255 On-link 172.17.96.1 5256
172.17.111.255 255.255.255.255 On-link 172.17.96.1 5256
172.26.240.0 255.255.240.0 On-link 172.26.240.1 271
172.26.240.1 255.255.255.255 On-link 172.26.240.1 271
172.26.255.255 255.255.255.255 On-link 172.26.240.1 271
172.27.128.0 255.255.240.0 On-link 172.27.128.1 271
172.27.128.1 255.255.255.255 On-link 172.27.128.1 271
172.27.143.255 255.255.255.255 On-link 172.27.128.1 271
192.168.24.0 255.255.255.0 On-link 192.168.24.1 291
192.168.24.1 255.255.255.255 On-link 192.168.24.1 291
192.168.24.255 255.255.255.255 On-link 192.168.24.1 291
192.168.31.0 255.255.255.0 On-link 192.168.31.221 286
192.168.31.221 255.255.255.255 On-link 192.168.31.221 286
192.168.31.255 255.255.255.255 On-link 192.168.31.221 286
192.168.216.0 255.255.255.0 On-link 192.168.216.1 291
192.168.216.1 255.255.255.255 On-link 192.168.216.1 291
192.168.216.255 255.255.255.255 On-link 192.168.216.1 291
192.168.234.0 255.255.255.0 On-link 192.168.234.1 291
192.168.234.1 255.255.255.255 On-link 192.168.234.1 291
192.168.234.255 255.255.255.255 On-link 192.168.234.1 291
192.168.255.0 255.255.255.0 On-link 192.168.255.1 291
192.168.255.1 255.255.255.255 On-link 192.168.255.1 291
192.168.255.255 255.255.255.255 On-link 192.168.255.1 291
224.0.0.0 240.0.0.0 On-link 127.0.0.1 331
224.0.0.0 240.0.0.0 On-link 172.17.96.1 5256
224.0.0.0 240.0.0.0 On-link 192.168.234.1 291
224.0.0.0 240.0.0.0 On-link 192.168.255.1 291
224.0.0.0 240.0.0.0 On-link 192.168.24.1 291
224.0.0.0 240.0.0.0 On-link 192.168.216.1 291
224.0.0.0 240.0.0.0 On-link 192.168.31.221 286
224.0.0.0 240.0.0.0 On-link 172.26.240.1 271
224.0.0.0 240.0.0.0 On-link 172.27.128.1 271
255.255.255.255 255.255.255.255 On-link 127.0.0.1 331
255.255.255.255 255.255.255.255 On-link 172.17.96.1 5256
255.255.255.255 255.255.255.255 On-link 192.168.234.1 291
255.255.255.255 255.255.255.255 On-link 192.168.255.1 291
255.255.255.255 255.255.255.255 On-link 192.168.24.1 291
255.255.255.255 255.255.255.255 On-link 192.168.216.1 291
255.255.255.255 255.255.255.255 On-link 192.168.31.221 286
255.255.255.255 255.255.255.255 On-link 172.26.240.1 271
255.255.255.255 255.255.255.255 On-link 172.27.128.1 271
===========================================================================
Persistent Routes:
Network Address Netmask Gateway Address Metric
157.0.0.0 255.0.0.0 157.55.80.1 3
===========================================================================
IPv6 Route Table
===========================================================================
Active Routes:
If Metric Network Destination Gateway
1 331 ::1/128 On-link
57 5256 fe80::/64 On-link
14 291 fe80::/64 On-link
11 291 fe80::/64 On-link
21 291 fe80::/64 On-link
22 291 fe80::/64 On-link
9 286 fe80::/64 On-link
52 271 fe80::/64 On-link
64 271 fe80::/64 On-link
9 286 fe80::244e:2f1e:2096:62b2/128
On-link
21 291 fe80::36de:3306:84b0:6717/128
On-link
52 271 fe80::8bd5:17a8:5e7d:9a50/128
On-link
57 5256 fe80::a625:de92:858:a42a/128
On-link
22 291 fe80::af17:660e:b87c:eb22/128
On-link
11 291 fe80::d586:c72e:e178:1147/128
On-link
14 291 fe80::d8df:ceee:28ba:aad/128
On-link
64 271 fe80::f5d1:8bf7:c410:e325/128
On-link
1 331 ff00::/8 On-link
57 5256 ff00::/8 On-link
14 291 ff00::/8 On-link
11 291 ff00::/8 On-link
21 291 ff00::/8 On-link
22 291 ff00::/8 On-link
9 286 ff00::/8 On-link
52 271 ff00::/8 On-link
64 271 ff00::/8 On-link
===========================================================================
Persistent Routes:
None

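The 'route print' capture above is the fixture behind the new Windows route test further down. A minimal sketch of parsing live output, assuming only that 'route print' is available on the host and using the same jc.parsers.route.parse() call as the test:

import json
import subprocess

import jc.parsers.route

# capture 'route print' on a Windows host and parse it with the jc route parser
cmd_output = subprocess.run(['route', 'print'], capture_output=True, text=True).stdout
print(json.dumps(jc.parsers.route.parse(cmd_output, quiet=True), indent=2))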
View File

@ -181,6 +181,47 @@ class BluetoothctlTests(unittest.TestCase):
            for k, v in expected.items():
                self.assertEqual(v, actual[0][k], f"Device regex failed on {k}")

    def test_bluetoothctl_device_random(self):
        """
        Test 'bluetoothctl' with device random
        """
        with open("tests/fixtures/generic/bluetoothctl_device_random.out", "r") as f:
            output = f.read()

        actual = parse(output, quiet=True)

        self.assertIsNotNone(actual)
        self.assertIsNotNone(actual[0])

        expected = {
            "address": "DF:1C:C3:B4:1A:1F",
            "is_random": True,
            "name": "M585/M590",
            "alias": "M585/M590",
            "appearance": "0x03c2",
            "icon": "input-mouse",
            "paired": "yes",
            "bonded": "yes",
            "trusted": "no",
            "blocked": "no",
            "connected": "no",
            "legacy_pairing": "no",
            "uuids": [
                "Generic Access Profile (00001800-0000-1000-8000-00805f9b34fb)",
                "Generic Attribute Profile (00001801-0000-1000-8000-00805f9b34fb)",
                "Device Information (0000180a-0000-1000-8000-00805f9b34fb)",
                "Battery Service (0000180f-0000-1000-8000-00805f9b34fb)",
                "Human Interface Device (00001812-0000-1000-8000-00805f9b34fb)",
                "Vendor specific (00010000-0000-1000-8000-011f2000046d)"
            ],
            "modalias": "usb:v046DpB01Bd0011"
        }

        if actual:
            for k, v in expected.items():
                self.assertEqual(v, actual[0][k], f"Device regex failed on {k}")

    def test_bluetoothctl_devices(self):
        """
        Test 'bluetoothctl' with devices

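The new test above covers the Appearance and Modalias lines that bluetoothctl prints for mouse devices. A minimal sketch with a hand-written snippet (illustrative only; just the jc.parsers.bluetoothctl.parse() call used by the tests is assumed):

import jc.parsers.bluetoothctl

# hand-written 'bluetoothctl info' style snippet for a mouse device
snippet = (
    'Device DF:1C:C3:B4:1A:1F (random)\n'
    '\tName: M585/M590\n'
    '\tAlias: M585/M590\n'
    '\tAppearance: 0x03c2\n'
    '\tIcon: input-mouse\n'
    '\tModalias: usb:v046DpB01Bd0011\n'
)

devices = jc.parsers.bluetoothctl.parse(snippet, quiet=True)
if devices:
    print(devices[0].get('appearance'), devices[0].get('modalias'))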
View File

@ -85,6 +85,9 @@ class MyTests(unittest.TestCase):
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/generic/dig-edns3.out'), 'r', encoding='utf-8') as f:
        generic_dig_edns3 = f.read()
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/generic/dig-nsid.out'), 'r', encoding='utf-8') as f:
        generic_dig_nsid = f.read()

    # output
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/centos-7.7/dig.json'), 'r', encoding='utf-8') as f:
        centos_7_7_dig_json = json.loads(f.read())

@ -155,6 +158,9 @@ class MyTests(unittest.TestCase):
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/generic/dig-edns3.json'), 'r', encoding='utf-8') as f:
        generic_dig_edns3_json = json.loads(f.read())
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/generic/dig-nsid.json'), 'r', encoding='utf-8') as f:
        generic_dig_nsid_json = json.loads(f.read())

    def test_dig_nodata(self):
        """

@ -300,6 +306,12 @@ class MyTests(unittest.TestCase):
"""
        self.assertEqual(jc.parsers.dig.parse(self.generic_dig_edns3, quiet=True), self.generic_dig_edns3_json)

    def test_dig_nsid(self):
        """
        Test 'dig' with nsid info
        """
        self.assertEqual(jc.parsers.dig.parse(self.generic_dig_nsid, quiet=True), self.generic_dig_nsid_json)

if __name__ == '__main__':
    unittest.main()

View File

@ -52,6 +52,9 @@ class MyTests(unittest.TestCase):
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/centos-7.7/last-wF.out'), 'r', encoding='utf-8') as f:
        centos_7_7_last_wF = f.read()
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/centos-7.7/last-wixF.out'), 'r', encoding='utf-8') as f:
        centos_7_7_last_wixF = f.read()

    # output
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/centos-7.7/last.json'), 'r', encoding='utf-8') as f:
        centos_7_7_last_json = json.loads(f.read())

@ -89,6 +92,9 @@ class MyTests(unittest.TestCase):
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/centos-7.7/last-wF.json'), 'r', encoding='utf-8') as f:
        centos_7_7_last_wF_json = json.loads(f.read())
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/centos-7.7/last-wixF.json'), 'r', encoding='utf-8') as f:
        centos_7_7_last_wixF_json = json.loads(f.read())

    def test_last_nodata(self):
        """

@ -168,6 +174,12 @@ class MyTests(unittest.TestCase):
"""
        self.assertEqual(jc.parsers.last.parse(self.centos_7_7_last_wF, quiet=True), self.centos_7_7_last_wF_json)

    def test_last_wixF_centos_7_7(self):
        """
        Test 'last -wixF' on Centos 7.7
        """
        self.assertEqual(jc.parsers.last.parse(self.centos_7_7_last_wixF, quiet=True), self.centos_7_7_last_wixF_json)

if __name__ == '__main__':
    unittest.main()

tests/test_lsattr.py (new file, 56 lines)
View File

@ -0,0 +1,56 @@
import os
import json
import unittest
import jc.parsers.lsattr

THIS_DIR = os.path.dirname(os.path.abspath(__file__))


class MyTests(unittest.TestCase):
    # input
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/ubuntu-20.04/lsattr-error.out'), 'r', encoding='utf-8') as f:
        ubuntu_20_4_lsattr_error = f.read()
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/ubuntu-20.04/lsattr-R.out'), 'r', encoding='utf-8') as f:
        ubuntu_20_4_lsattr_R = f.read()
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/ubuntu-20.04/lsattr.out'), 'r', encoding='utf-8') as f:
        ubuntu_20_4_lsattr = f.read()

    # output
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/ubuntu-20.04/lsattr-error.json'), 'r', encoding='utf-8') as f:
        ubuntu_20_4_lsattr_error_json = json.loads(f.read())
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/ubuntu-20.04/lsattr-R.json'), 'r', encoding='utf-8') as f:
        ubuntu_20_4_lsattr_R_json = json.loads(f.read())
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/ubuntu-20.04/lsattr.json'), 'r', encoding='utf-8') as f:
        ubuntu_20_4_lsattr_json = json.loads(f.read())

    def test_lsattr_nodata(self):
        """
        Test 'lsattr' with no data
        """
        self.assertEqual(jc.parsers.lsattr.parse('', quiet=True), [])

    def test_lsattr_error(self):
        """
        Test 'lsattr' with permission error
        """
        self.assertEqual(jc.parsers.lsattr.parse(self.ubuntu_20_4_lsattr_error, quiet=True), self.ubuntu_20_4_lsattr_error_json)

    def test_lsattr_R_ubuntu_20_4(self):
        """
        Test 'sudo lsattr -R' on Ubuntu 20.4
        """
        self.assertEqual(jc.parsers.lsattr.parse(self.ubuntu_20_4_lsattr_R, quiet=True), self.ubuntu_20_4_lsattr_R_json)

    def test_lsattr_ubuntu_20_4(self):
        """
        Test 'sudo lsattr' on Ubuntu 20.4
        """
        self.assertEqual(jc.parsers.lsattr.parse(self.ubuntu_20_4_lsattr, quiet=True), self.ubuntu_20_4_lsattr_json)


if __name__ == '__main__':
    unittest.main()

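The tests above drive the new lsattr parser from saved fixtures; against a live system the same call works on captured command output. A minimal sketch, assuming only that lsattr is installed and the filesystem supports file attributes:

import json
import subprocess

import jc.parsers.lsattr

# capture 'lsattr' for the current directory and parse it with the new parser
output = subprocess.run(['lsattr'], capture_output=True, text=True).stdout
print(json.dumps(jc.parsers.lsattr.parse(output, quiet=True), indent=2))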
View File

@ -76,6 +76,18 @@ class MyTests(unittest.TestCase):
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/generic/netstat-old.out'), 'r', encoding='utf-8') as f:
        generic_netstat_old = f.read()
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/windows/windows-10/netstat.out'), 'r', encoding='utf-8') as f:
        windows_netstat = f.read()
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/windows/windows-10/netstat-an.out'), 'r', encoding='utf-8') as f:
        windows_netstat_an = f.read()
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/windows/windows-10/netstat-aon.out'), 'r', encoding='utf-8') as f:
        windows_netstat_aon = f.read()
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/windows/windows-10/netstat-aonb.out'), 'r', encoding='utf-8') as f:
        windows_netstat_aonb = f.read()

    # netstat -r
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/centos-7.7/netstat-r.out'), 'r', encoding='utf-8') as f:
        centos_7_7_netstat_r = f.read()

@ -188,6 +200,18 @@ class MyTests(unittest.TestCase):
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/generic/netstat-old.json'), 'r', encoding='utf-8') as f:
        generic_netstat_old_json = json.loads(f.read())
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/windows/windows-10/netstat.json'), 'r', encoding='utf-8') as f:
        windows_netstat_json = json.loads(f.read())
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/windows/windows-10/netstat-an.json'), 'r', encoding='utf-8') as f:
        windows_netstat_an_json = json.loads(f.read())
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/windows/windows-10/netstat-aon.json'), 'r', encoding='utf-8') as f:
        windows_netstat_aon_json = json.loads(f.read())
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/windows/windows-10/netstat-aonb.json'), 'r', encoding='utf-8') as f:
        windows_netstat_aonb_json = json.loads(f.read())

    # netstat -r
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/centos-7.7/netstat-r.json'), 'r', encoding='utf-8') as f:
        centos_7_7_netstat_r_json = json.loads(f.read())

@ -449,6 +473,29 @@ class MyTests(unittest.TestCase):
"""
        self.assertEqual(jc.parsers.netstat.parse(self.freebsd12_netstat_ib, quiet=True), self.freebsd12_netstat_ib_json)

    def test_netstat_windows(self):
        """
        Test 'netstat' on Windows
        """
        self.assertEqual(jc.parsers.netstat.parse(self.windows_netstat, quiet=True), self.windows_netstat_json)

    def test_netstat_an_windows(self):
        """
        Test 'netstat -an' on Windows
        """
        self.assertEqual(jc.parsers.netstat.parse(self.windows_netstat_an, quiet=True), self.windows_netstat_an_json)

    def test_netstat_aon_windows(self):
        """
        Test 'netstat -aon' on Windows
        """
        self.assertEqual(jc.parsers.netstat.parse(self.windows_netstat_aon, quiet=True), self.windows_netstat_aon_json)

    def test_netstat_aonb_windows(self):
        """
        Test 'netstat -aonb' on Windows
        """
        self.assertEqual(jc.parsers.netstat.parse(self.windows_netstat_aonb, quiet=True), self.windows_netstat_aonb_json)

if __name__ == '__main__':
    unittest.main()

View File

@ -50,6 +50,9 @@ class MyTests(unittest.TestCase):
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/centos-7.7/ping-ip-O-unparsedlines.out'), 'r', encoding='utf-8') as f:
        centos_7_7_ping_ip_O_unparsedlines = f.read()
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/centos-7.7/ping-missing-hostname.out'), 'r', encoding='utf-8') as f:
        centos_7_7_ping_missing_hostname = f.read()

    # ubuntu
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/ubuntu-18.04/ping-ip-O.out'), 'r', encoding='utf-8') as f:
        ubuntu_18_4_ping_ip_O = f.read()

@ -251,6 +254,9 @@ class MyTests(unittest.TestCase):
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/centos-7.7/ping-ip-O-unparsedlines.json'), 'r', encoding='utf-8') as f:
        centos_7_7_ping_ip_O_unparsedlines_json = json.loads(f.read())
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/centos-7.7/ping-missing-hostname.json'), 'r', encoding='utf-8') as f:
        centos_7_7_ping_missing_hostname_json = json.loads(f.read())

    # ubuntu
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/ubuntu-18.04/ping-ip-O.json'), 'r', encoding='utf-8') as f:
        ubuntu_18_4_ping_ip_O_json = json.loads(f.read())

@ -801,6 +807,12 @@ class MyTests(unittest.TestCase):
"""
        self.assertEqual(jc.parsers.ping.parse(self.alpine_linux_3_13_ping_hostname, quiet=True), self.alpine_linux_3_13_ping_hostname_json)

    def test_ping_missing_hostname(self):
        """
        Test 'ping' with missing hostname on linux
        """
        self.assertEqual(jc.parsers.ping.parse(self.centos_7_7_ping_missing_hostname, quiet=True), self.centos_7_7_ping_missing_hostname_json)

if __name__ == '__main__':
    unittest.main()

View File

@ -55,6 +55,9 @@ class MyTests(unittest.TestCase):
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/centos-7.7/ping-ip-O-unparsedlines.out'), 'r', encoding='utf-8') as f:
        centos_7_7_ping_ip_O_unparsedlines = f.read()
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/centos-7.7/ping-missing-hostname.out'), 'r', encoding='utf-8') as f:
        centos_7_7_ping_missing_hostname = f.read()

    # ubuntu
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/ubuntu-18.04/ping-ip-O.out'), 'r', encoding='utf-8') as f:
        ubuntu_18_4_ping_ip_O = f.read()

@ -246,6 +249,9 @@ class MyTests(unittest.TestCase):
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/centos-7.7/ping6-ip-dup-streaming.json'), 'r', encoding='utf-8') as f:
        centos_7_7_ping6_ip_dup_streaming_json = json.loads(f.read())
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/centos-7.7/ping-missing-hostname-streaming.json'), 'r', encoding='utf-8') as f:
        centos_7_7_ping_missing_hostname_json = json.loads(f.read())

    # ubuntu
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/ubuntu-18.04/ping-ip-O-streaming.json'), 'r', encoding='utf-8') as f:
        ubuntu_18_4_ping_ip_O_streaming_json = json.loads(f.read())

@ -799,6 +805,12 @@ class MyTests(unittest.TestCase):
"""
        self.assertEqual(list(jc.parsers.ping_s.parse(self.pi_ping_ip_O_D.splitlines(), quiet=True)), self.pi_ping_ip_O_D_streaming_json)

    def test_ping_s_missing_hostname(self):
        """
        Test 'ping' with missing hostname on linux
        """
        self.assertEqual(list(jc.parsers.ping_s.parse(self.centos_7_7_ping_missing_hostname.splitlines(), quiet=True)), self.centos_7_7_ping_missing_hostname_json)

if __name__ == '__main__':
    unittest.main()

View File

@ -30,6 +30,9 @@ class MyTests(unittest.TestCase):
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/centos-7.7/route-6-n.out'), 'r', encoding='utf-8') as f:
        centos_7_7_route_6_n = f.read()
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/windows/windows-10/route.out'), 'r', encoding='utf-8') as f:
        windows_route = f.read()

    # output
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/centos-7.7/route.json'), 'r', encoding='utf-8') as f:
        centos_7_7_route_json = json.loads(f.read())

@ -52,6 +55,8 @@ class MyTests(unittest.TestCase):
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/centos-7.7/route-6-n.json'), 'r', encoding='utf-8') as f:
        centos_7_7_route_6_n_json = json.loads(f.read())
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/windows/windows-10/route.json'), 'r', encoding='utf-8') as f:
        windows_route_route_json = json.loads(f.read())

    def test_route_nodata(self):
        """

@ -101,6 +106,12 @@ class MyTests(unittest.TestCase):
"""
        self.assertEqual(jc.parsers.route.parse(self.centos_7_7_route_6_n, quiet=True), self.centos_7_7_route_6_n_json)

    def test_route_windows(self):
        """
        Test 'route print' on Windows
        """
        self.assertEqual(jc.parsers.route.parse(self.windows_route, quiet=True), self.windows_route_route_json)

if __name__ == '__main__':
    unittest.main()

tests/test_srt.py (new file, 58 lines)
View File

@ -0,0 +1,58 @@
import os
import unittest
import json
import jc.parsers.srt

THIS_DIR = os.path.dirname(os.path.abspath(__file__))


class MyTests(unittest.TestCase):
    # input
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/generic/srt-attack_of_the_clones.srt'), 'r', encoding='utf-8') as f:
        generic_attack_of_the_clones = f.read()
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/generic/srt-complex.srt'), 'r', encoding='utf-8') as f:
        generic_complex = f.read()

    # output
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/generic/srt-attack_of_the_clones_raw.json'), 'r', encoding='utf-8') as f:
        generic_attack_of_the_clones_raw_json = json.loads(f.read())
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/generic/srt-attack_of_the_clones.json'), 'r', encoding='utf-8') as f:
        generic_attack_of_the_clones_json = json.loads(f.read())
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/generic/srt-complex.json'), 'r', encoding='utf-8') as f:
        generic_complex_json = json.loads(f.read())

    def test_srt_nodata(self):
        """
        Test srt parser with no data
        """
        self.assertEqual(jc.parsers.srt.parse('', quiet=True), [])

    def test_srt_nodata_r(self):
        """
        Test srt parser with no data and raw output
        """
        self.assertEqual(jc.parsers.srt.parse('', raw=True, quiet=True), [])

    def test_srt_attack_of_the_clones_raw(self):
        """
        Test the attack of the clones srt file without post processing
        """
        self.assertEqual(jc.parsers.srt.parse(self.generic_attack_of_the_clones, raw=True, quiet=True), self.generic_attack_of_the_clones_raw_json)

    def test_srt_attack_of_the_clones(self):
        """
        Test the attack of the clones srt file
        """
        self.assertEqual(jc.parsers.srt.parse(self.generic_attack_of_the_clones, quiet=True), self.generic_attack_of_the_clones_json)

    def test_srt_complex(self):
        """
        Test a complex srt file
        """
        self.assertEqual(jc.parsers.srt.parse(self.generic_complex, quiet=True), self.generic_complex_json)


if __name__ == '__main__':
    unittest.main()

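Beyond the fixture files used above, any SubRip text can be fed to the new parser. A minimal sketch with a hand-written snippet (illustrative only; the exact output schema is left to the parser documentation):

import json
import jc.parsers.srt

# two small hand-written subtitle blocks, not one of the fixtures
snippet = '''1
00:00:01,000 --> 00:00:03,500
Hello, world.

2
00:00:04,000 --> 00:00:06,000
A second subtitle block.
'''

print(json.dumps(jc.parsers.srt.parse(snippet, quiet=True), indent=2))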
View File

@ -15,6 +15,9 @@ class MyTests(unittest.TestCase):
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/ubuntu-18.04/ss-sudo-a.out'), 'r', encoding='utf-8') as f:
        ubuntu_18_4_ss_sudo_a = f.read()
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/ubuntu-18.04/ss-sudo-tulpen.out'), 'r', encoding='utf-8') as f:
        ubuntu_18_4_ss_sudo_tulpen = f.read()

    # output
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/centos-7.7/ss-sudo-a.json'), 'r', encoding='utf-8') as f:
        centos_7_7_ss_sudo_a_json = json.loads(f.read())

@ -22,6 +25,8 @@ class MyTests(unittest.TestCase):
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/ubuntu-18.04/ss-sudo-a.json'), 'r', encoding='utf-8') as f:
        ubuntu_18_4_ss_sudo_a_json = json.loads(f.read())
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/ubuntu-18.04/ss-sudo-tulpen.json'), 'r', encoding='utf-8') as f:
        ubuntu_18_4_ss_sudo_tulpen_json = json.loads(f.read())

    def test_ss_nodata(self):
        """

@ -41,6 +46,11 @@ class MyTests(unittest.TestCase):
"""
        self.assertEqual(jc.parsers.ss.parse(self.ubuntu_18_4_ss_sudo_a, quiet=True), self.ubuntu_18_4_ss_sudo_a_json)

    def test_ss_sudo_tulpen_ubuntu_18_4(self):
        """
        Test 'sudo ss -tulpen' on Ubuntu 18.4
        """
        self.assertEqual(jc.parsers.ss.parse(self.ubuntu_18_4_ss_sudo_tulpen, quiet=True), self.ubuntu_18_4_ss_sudo_tulpen_json)

if __name__ == '__main__':
    unittest.main()

View File

@ -31,6 +31,9 @@ class MyTests(unittest.TestCase):
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/freebsd12/stat.out'), 'r', encoding='utf-8') as f:
        freebsd12_stat = f.read()
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/ubuntu-20.04/stat-missing-data.out'), 'r', encoding='utf-8') as f:
        ubuntu_20_4_stat_missing_data = f.read()

    # output
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/centos-7.7/stat.json'), 'r', encoding='utf-8') as f:
        centos_7_7_stat_json = json.loads(f.read())

@ -47,6 +50,8 @@ class MyTests(unittest.TestCase):
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/freebsd12/stat.json'), 'r', encoding='utf-8') as f:
        freebsd12_stat_json = json.loads(f.read())
    with open(os.path.join(THIS_DIR, os.pardir, 'tests/fixtures/ubuntu-20.04/stat-missing-data.json'), 'r', encoding='utf-8') as f:
        ubuntu_20_4_stat_missing_data_json = json.loads(f.read())

    def test_stat_nodata(self):
        """

@ -84,6 +89,12 @@ class MyTests(unittest.TestCase):
"""
        self.assertEqual(jc.parsers.stat.parse(self.freebsd12_stat, quiet=True), self.freebsd12_stat_json)

    def test_stat_missing_data(self):
        """
        Test 'stat /etc/passwd' with missing data.
        """
        self.assertEqual(jc.parsers.stat.parse(self.ubuntu_20_4_stat_missing_data, quiet=True), self.ubuntu_20_4_stat_missing_data_json)

if __name__ == '__main__':
    unittest.main()

tests/test_veracrypt.py (new file, 196 lines)
View File

@ -0,0 +1,196 @@
import json
import re
import unittest
from jc.parsers.veracrypt import (
    _volume_line_pattern,
    _volume_verbose_pattern,
    _parse_volume,
    Volume,
    parse,
)


class VeracryptTests(unittest.TestCase):
    def test_veracrypt_nodata(self):
        """
        Test 'veracrypt' with no data
        """
        output = ''
        self.assertEqual(parse(output, quiet=True), [])

    def test_veracrypt_invalid_call(self):
        """
        Test 'veracrypt' with output from invalid call
        """
        output = 'Invalid command: foo'
        self.assertEqual(parse(output, quiet=True), [])

    def test_veracrypt_no_mounted_volumes(self):
        """
        Test 'veracrypt' with no mounted volumes
        """
        output = 'Error: No volumes mounted.'
        self.assertEqual(parse(output, quiet=True), [])

    def test_veracrypt_list_volumes(self):
        """
        Test 'veracrypt' list volumes
        """
        output = "1: /dev/sdb1 /dev/mapper/veracrypt1 /home/bob/mount/encrypt/sdb1\n"
        output += "2: /dev/sdb2 /dev/mapper/veracrypt2 /home/bob/mount/encrypt/sdb2"

        actual = parse(output, quiet=True)

        self.assertIsNotNone(actual)
        self.assertIsNotNone(actual[0])
        self.assertIsNotNone(actual[1])

        expected = [
            {
                "slot": 1,
                "path": "/dev/sdb1",
                "device": "/dev/mapper/veracrypt1",
                "mountpoint": "/home/bob/mount/encrypt/sdb1"
            },
            {
                "slot": 2,
                "path": "/dev/sdb2",
                "device": "/dev/mapper/veracrypt2",
                "mountpoint": "/home/bob/mount/encrypt/sdb2"
            }
        ]

        if actual:
            for k, v in expected[0].items():
                self.assertEqual(v, actual[0][k], f"Volume regex failed on {k}")

            for k, v in expected[1].items():
                self.assertEqual(v, actual[1][k], f"Volume regex failed on {k}")

    def test_veracrypt_verbose_list_volumes(self):
        """
        Test 'veracrypt' list volumes in verbose mode
        """
        with open("tests/fixtures/generic/veracrypt_verbose_list_volumes.out", "r") as f:
            output = f.read()

        actual = parse(output, quiet=True)

        self.assertIsNotNone(actual)
        self.assertIsNotNone(actual[0])
        self.assertIsNotNone(actual[1])

        expected = [
            {
                "slot": 1,
                "path": "/dev/sdb1",
                "device": "/dev/mapper/veracrypt1",
                "mountpoint": "/home/bob/mount/encrypt/sdb1",
                "size": "498 MiB",
                "type": "Normal",
                "readonly": "No",
                "hidden_protected": "No",
                "encryption_algo": "AES",
                "pk_size": "256 bits",
                "sk_size": "256 bits",
                "block_size": "128 bits",
                "mode": "XTS",
                "prf": "HMAC-SHA-512",
                "format_version": 2,
                "backup_header": "Yes"
            },
            {
                "slot": 2,
                "path": "/dev/sdb2",
                "device": "/dev/mapper/veracrypt2",
                "mountpoint": "/home/bob/mount/encrypt/sdb2",
                "size": "522 MiB",
                "type": "Normal",
                "readonly": "No",
                "hidden_protected": "No",
                "encryption_algo": "AES",
                "pk_size": "256 bits",
                "sk_size": "256 bits",
                "block_size": "128 bits",
                "mode": "XTS",
                "prf": "HMAC-SHA-512",
                "format_version": 2,
                "backup_header": "Yes"
            }
        ]

        if actual:
            for k, v in expected[0].items():
                self.assertEqual(v, actual[0][k], f"Volume regex failed on {k}")

            for k, v in expected[1].items():
                self.assertEqual(v, actual[1][k], f"Volume regex failed on {k}")

    def test_veracrypt_verbose_list_volumes_unknown_fields(self):
        """
        Test 'veracrypt' list volumes with unknown fields in verbose mode
        """
        with open("tests/fixtures/generic/veracrypt_verbose_list_volumes_unknown_fields.out", "r") as f:
            output = f.read()

        actual = parse(output, quiet=True)

        self.assertIsNotNone(actual)
        self.assertIsNotNone(actual[0])
        self.assertIsNotNone(actual[1])

        expected = [
            {
                "slot": 1,
                "path": "/dev/sdb1",
                "device": "/dev/mapper/veracrypt1",
                "mountpoint": "/home/bob/mount/encrypt/sdb1",
                "size": "498 MiB",
                "type": "Normal",
                "readonly": "No",
                "hidden_protected": "No",
                "encryption_algo": "AES",
                "pk_size": "256 bits",
                "sk_size": "256 bits",
                "block_size": "128 bits",
                "mode": "XTS",
                "prf": "HMAC-SHA-512",
                "format_version": 2,
                "backup_header": "Yes"
            },
            {
                "slot": 2,
                "path": "/dev/sdb2",
                "device": "/dev/mapper/veracrypt2",
                "mountpoint": "/home/bob/mount/encrypt/sdb2",
                "size": "522 MiB",
                "type": "Normal",
                "readonly": "No",
                "hidden_protected": "No",
                "encryption_algo": "AES",
                "pk_size": "256 bits",
                "sk_size": "256 bits",
                "block_size": "128 bits",
                "mode": "XTS",
                "prf": "HMAC-SHA-512",
                "format_version": 2,
                "backup_header": "Yes"
            }
        ]

        if actual:
            for k, v in expected[0].items():
                self.assertEqual(v, actual[0][k], f"Volume regex failed on {k}")

            for k, v in expected[1].items():
                self.assertEqual(v, actual[1][k], f"Volume regex failed on {k}")


if __name__ == '__main__':
    unittest.main()

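The list-volumes test above doubles as a usage example for the new parser. A minimal sketch using the same sample lines and the same jc.parsers.veracrypt.parse() call as the test (how such output is produced on a live system, e.g. a veracrypt list command, is left as an assumption):

import json
import jc.parsers.veracrypt

# sample lines copied from test_veracrypt_list_volumes above
output = (
    '1: /dev/sdb1 /dev/mapper/veracrypt1 /home/bob/mount/encrypt/sdb1\n'
    '2: /dev/sdb2 /dev/mapper/veracrypt2 /home/bob/mount/encrypt/sdb2'
)

# each entry carries slot, path, device and mountpoint keys
print(json.dumps(jc.parsers.veracrypt.parse(output, quiet=True), indent=2))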
Some files were not shown because too many files have changed in this diff.