mirror of https://github.com/kellyjonbrazil/jc.git
synced 2026-04-07 17:57:03 +02:00

Compare commits (5 commits; author and date columns not recovered):

- 4cd721be85
- d58ca402a7
- 5386879040
- f19a1f23a9
- 5023e5be4c
24
CHANGELOG
@@ -1,5 +1,29 @@
jc changelog

20230730 v1.23.4
- Add `/etc/resolve.conf` file parser
- Add `/proc/net/tcp` and `/proc/net/tcp6` file parser
- Add `find` command parser
- Add `ip route` command parser
- Fix `certbot` command parser to be more robust with different line endings

20230621 v1.23.3
- Add `lsattr` command parser
- Add `srt` file parser
- Add `veracrypt` command parser
- Add X509 Certificate Request file parser
- Enhance X509 Certificate parser to allow non-compliant email addresses with a warning
- Enhance `dig` command parser to support the `+nsid` option
- Enhance `last` and `lastb` command parser to support the `-x` option
- Enhance `route` command parser to add Windows support
- Enhance `netstat` command parser to add Windows support
- Enhance `ss` command parser to support extended options
- Enhance the compatibility warning message
- Fix `bluetoothctl` command parser for some mouse devices
- Fix `ping` command parsers for output with missing hostname
- Fix `stat` command parser for older versions that may not contain all fields
- Fix deprecated option in `setup.cfg`

20230429 v1.23.2
- Add `bluetoothctl` command parser
- Add `certbot` command parser for `certificates` and `show_account` options
51
EXAMPLES.md
@@ -4551,6 +4551,57 @@ cat entrust.pem | jc --x509-cert -p
  }
]
```

### X.509 PEM and DER certificate request files

```bash
cat myserver.csr | jc --x509-csr -p
```

```json
[
  {
    "certification_request_info": {
      "version": "v1",
      "subject": {
        "common_name": "myserver.for.example"
      },
      "subject_pk_info": {
        "algorithm": {
          "algorithm": "ec",
          "parameters": "secp256r1"
        },
        "public_key": "04:40:33:c0:91:8f:e9:46:ea:d0:dc:d0:f9:63:2c:a4:35:1f:0f:54:c8:a9:9b:e3:9e:d4:f3:64:b8:60:cc:7f:39:75:dd:a7:61:31:02:7c:9e:89:c6:db:45:15:f2:5f:b0:65:29:0b:42:d2:6e:c2:ea:a6:23:bd:fc:65:e5:7d:4e"
      },
      "attributes": [
        {
          "type": "extension_request",
          "values": [
            [
              {
                "extn_id": "extended_key_usage",
                "critical": false,
                "extn_value": [
                  "server_auth"
                ]
              },
              {
                "extn_id": "subject_alt_name",
                "critical": false,
                "extn_value": [
                  "myserver.for.example"
                ]
              }
            ]
          ]
        }
      ]
    },
    "signature_algorithm": {
      "algorithm": "sha384_ecdsa",
      "parameters": null
    },
    "signature": "30:45:02:20:77:ac:5b:51:bf:c5:f5:43:02:52:ae:66:8a:fe:95:98:98:98:a9:45:34:31:08:ff:2c:cc:92:d9:1c:70:28:74:02:21:00:97:79:7b:e7:45:18:76:cf:d7:3b:79:34:56:d2:69:b5:73:41:9b:8a:b7:ad:ec:80:23:c1:2f:64:da:e5:28:19"
  }
]
```
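To reproduce the example above, a CSR of the same shape (EC P-256 key, with `myserver.for.example` as both the CN and a SAN) can be generated with standard `openssl` commands (OpenSSL 1.1.1+ for `-addext`). The file names are illustrative:

```shell
# Generate an EC P-256 key and a CSR resembling the example above.
openssl ecparam -name prime256v1 -genkey -noout -out myserver.key
openssl req -new -key myserver.key \
    -subj "/CN=myserver.for.example" \
    -addext "subjectAltName=DNS:myserver.for.example" \
    -out myserver.csr
# Then parse it:
#   cat myserver.csr | jc --x509-csr -p
```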
### XML files

```bash
cat cd_catalog.xml
```
11
README.md
@@ -5,11 +5,13 @@

> Try the `jc` [web demo](https://jc-web.onrender.com/) and [REST API](https://github.com/kellyjonbrazil/jc-restapi)

> `jc` is [now available](https://galaxy.ansible.com/community/general) as an
> Ansible filter plugin in the `community.general` collection. See this
> [blog post](https://blog.kellybrazil.com/2020/08/30/parsing-command-output-in-ansible-with-jc/)
> for an example.

> Looking for something like `jc` but lower-level? Check out [regex2json](https://gitlab.com/tozd/regex2json).

# JC

JSON Convert

@@ -185,6 +187,7 @@ option.
| `--email-address` | Email Address string parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/email_address) |
| `--env` | `env` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/env) |
| `--file` | `file` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/file) |
| `--find` | `find` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/find) |
| `--findmnt` | `findmnt` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/findmnt) |
| `--finger` | `finger` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/finger) |
| `--free` | `free` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/free) |
@@ -208,6 +211,7 @@ option.
| `--iostat-s` | `iostat` command streaming parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/iostat_s) |
| `--ip-address` | IPv4 and IPv6 Address string parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/ip_address) |
| `--iptables` | `iptables` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/iptables) |
| `--ip-route` | `ip route` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/ip_route) |
| `--iw-scan` | `iw dev [device] scan` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/iw_scan) |
| `--iwconfig` | `iwconfig` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/iwconfig) |
| `--jar-manifest` | Java MANIFEST.MF file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/jar_manifest) |
@@ -217,6 +221,7 @@ option.
| `--last` | `last` and `lastb` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/last) |
| `--ls` | `ls` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/ls) |
| `--ls-s` | `ls` command streaming parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/ls_s) |
| `--lsattr` | `lsattr` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/lsattr) |
| `--lsblk` | `lsblk` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/lsblk) |
| `--lsmod` | `lsmod` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/lsmod) |
| `--lsof` | `lsof` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/lsof) |
@@ -245,6 +250,7 @@ option.
| `--postconf` | `postconf -M` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/postconf) |
| `--proc` | `/proc/` file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/proc) |
| `--ps` | `ps` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/ps) |
| `--resolve-conf` | `/etc/resolve.conf` file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/resolve_conf) |
| `--route` | `route` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/route) |
| `--rpm-qi` | `rpm -qi` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/rpm_qi) |
| `--rsync` | `rsync` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/rsync) |
@@ -252,6 +258,7 @@ option.
| `--semver` | Semantic Version string parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/semver) |
| `--sfdisk` | `sfdisk` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/sfdisk) |
| `--shadow` | `/etc/shadow` file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/shadow) |
| `--srt` | SRT file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/srt) |
| `--ss` | `ss` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/ss) |
| `--ssh-conf` | `ssh` config file and `ssh -G` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/ssh_conf) |
| `--sshd-conf` | `sshd` config file and `sshd -T` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/sshd_conf) |
@@ -285,12 +292,14 @@ option.
| `--uptime` | `uptime` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/uptime) |
| `--url` | URL string parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/url) |
| `--ver` | Version string parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/ver) |
| `--veracrypt` | `veracrypt` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/veracrypt) |
| `--vmstat` | `vmstat` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/vmstat) |
| `--vmstat-s` | `vmstat` command streaming parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/vmstat_s) |
| `--w` | `w` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/w) |
| `--wc` | `wc` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/wc) |
| `--who` | `who` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/who) |
| `--x509-cert` | X.509 PEM and DER certificate file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/x509_cert) |
| `--x509-csr` | X.509 PEM and DER certificate request file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/x509_csr) |
| `--xml` | XML file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/xml) |
| `--xrandr` | `xrandr` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/xrandr) |
| `--yaml` | YAML file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/yaml) |
@@ -3,8 +3,8 @@ _jc()
local cur prev words cword jc_commands jc_parsers jc_options \
jc_about_options jc_about_mod_options jc_help_options jc_special_options
jc_commands=(acpi airport arp blkid bluetoothctl cbt certbot chage cksum crontab date df dig dmidecode dpkg du env file findmnt finger free git gpg hciconfig id ifconfig iostat iptables iw iwconfig jobs last lastb ls lsblk lsmod lsof lspci lsusb md5 md5sum mdadm mount mpstat netstat nmcli ntpq os-prober pidstat ping ping6 pip pip3 postconf printenv ps route rpm rsync sfdisk sha1sum sha224sum sha256sum sha384sum sha512sum shasum ss ssh sshd stat sum sysctl systemctl systeminfo timedatectl top tracepath tracepath6 traceroute traceroute6 udevadm ufw uname update-alternatives upower uptime vdir vmstat w wc who xrandr zipinfo zpool)
jc_parsers=(--acpi --airport --airport-s --arp --asciitable --asciitable-m --blkid --bluetoothctl --cbt --cef --cef-s --certbot --chage --cksum --clf --clf-s --crontab --crontab-u --csv --csv-s --date --datetime-iso --df --dig --dir --dmidecode --dpkg-l --du --email-address --env --file --findmnt --finger --free --fstab --git-log --git-log-s --git-ls-remote --gpg --group --gshadow --hash --hashsum --hciconfig --history --hosts --id --ifconfig --ini --ini-dup --iostat --iostat-s --ip-address --iptables --iw-scan --iwconfig --jar-manifest --jobs --jwt --kv --last --ls --ls-s --lsblk --lsmod --lsof --lspci --lsusb --m3u --mdadm --mount --mpstat --mpstat-s --netstat --nmcli --ntpq --openvpn --os-prober --passwd --pci-ids --pgpass --pidstat --pidstat-s --ping --ping-s --pip-list --pip-show --plist --postconf --proc --proc-buddyinfo --proc-consoles --proc-cpuinfo --proc-crypto --proc-devices --proc-diskstats --proc-filesystems --proc-interrupts --proc-iomem --proc-ioports --proc-loadavg --proc-locks --proc-meminfo --proc-modules --proc-mtrr --proc-pagetypeinfo --proc-partitions --proc-slabinfo --proc-softirqs --proc-stat --proc-swaps --proc-uptime --proc-version --proc-vmallocinfo --proc-vmstat --proc-zoneinfo --proc-driver-rtc --proc-net-arp --proc-net-dev --proc-net-dev-mcast --proc-net-if-inet6 --proc-net-igmp --proc-net-igmp6 --proc-net-ipv6-route --proc-net-netlink --proc-net-netstat --proc-net-packet --proc-net-protocols --proc-net-route --proc-net-unix --proc-pid-fdinfo --proc-pid-io --proc-pid-maps --proc-pid-mountinfo --proc-pid-numa-maps --proc-pid-smaps --proc-pid-stat --proc-pid-statm --proc-pid-status --ps --route --rpm-qi --rsync --rsync-s --semver --sfdisk --shadow --ss --ssh-conf --sshd-conf --stat --stat-s --sysctl --syslog --syslog-s --syslog-bsd --syslog-bsd-s --systemctl --systemctl-lj --systemctl-ls --systemctl-luf --systeminfo --time --timedatectl --timestamp --toml --top --top-s --tracepath --traceroute --udevadm --ufw --ufw-appinfo --uname 
--update-alt-gs --update-alt-q --upower --uptime --url --ver --vmstat --vmstat-s --w --wc --who --x509-cert --xml --xrandr --yaml --zipinfo --zpool-iostat --zpool-status)
jc_commands=(acpi airport arp blkid bluetoothctl cbt certbot chage cksum crontab date df dig dmidecode dpkg du env file findmnt finger free git gpg hciconfig id ifconfig iostat ip iptables iw iwconfig jobs last lastb ls lsattr lsblk lsmod lsof lspci lsusb md5 md5sum mdadm mount mpstat netstat nmcli ntpq os-prober pidstat ping ping6 pip pip3 postconf printenv ps route rpm rsync sfdisk sha1sum sha224sum sha256sum sha384sum sha512sum shasum ss ssh sshd stat sum sysctl systemctl systeminfo timedatectl top tracepath tracepath6 traceroute traceroute6 udevadm ufw uname update-alternatives upower uptime vdir veracrypt vmstat w wc who xrandr zipinfo zpool)
jc_parsers=(--acpi --airport --airport-s --arp --asciitable --asciitable-m --blkid --bluetoothctl --cbt --cef --cef-s --certbot --chage --cksum --clf --clf-s --crontab --crontab-u --csv --csv-s --date --datetime-iso --df --dig --dir --dmidecode --dpkg-l --du --email-address --env --file --find --findmnt --finger --free --fstab --git-log --git-log-s --git-ls-remote --gpg --group --gshadow --hash --hashsum --hciconfig --history --hosts --id --ifconfig --ini --ini-dup --iostat --iostat-s --ip-address --iptables --ip-route --iw-scan --iwconfig --jar-manifest --jobs --jwt --kv --last --ls --ls-s --lsattr --lsblk --lsmod --lsof --lspci --lsusb --m3u --mdadm --mount --mpstat --mpstat-s --netstat --nmcli --ntpq --openvpn --os-prober --passwd --pci-ids --pgpass --pidstat --pidstat-s --ping --ping-s --pip-list --pip-show --plist --postconf --proc --proc-buddyinfo --proc-consoles --proc-cpuinfo --proc-crypto --proc-devices --proc-diskstats --proc-filesystems --proc-interrupts --proc-iomem --proc-ioports --proc-loadavg --proc-locks --proc-meminfo --proc-modules --proc-mtrr --proc-pagetypeinfo --proc-partitions --proc-slabinfo --proc-softirqs --proc-stat --proc-swaps --proc-uptime --proc-version --proc-vmallocinfo --proc-vmstat --proc-zoneinfo --proc-driver-rtc --proc-net-arp --proc-net-dev --proc-net-dev-mcast --proc-net-if-inet6 --proc-net-igmp --proc-net-igmp6 --proc-net-ipv6-route --proc-net-netlink --proc-net-netstat --proc-net-packet --proc-net-protocols --proc-net-route --proc-net-tcp --proc-net-unix --proc-pid-fdinfo --proc-pid-io --proc-pid-maps --proc-pid-mountinfo --proc-pid-numa-maps --proc-pid-smaps --proc-pid-stat --proc-pid-statm --proc-pid-status --ps --resolve-conf --route --rpm-qi --rsync --rsync-s --semver --sfdisk --shadow --srt --ss --ssh-conf --sshd-conf --stat --stat-s --sysctl --syslog --syslog-s --syslog-bsd --syslog-bsd-s --systemctl --systemctl-lj --systemctl-ls --systemctl-luf --systeminfo --time --timedatectl --timestamp --toml --top --top-s 
--tracepath --traceroute --udevadm --ufw --ufw-appinfo --uname --update-alt-gs --update-alt-q --upower --uptime --url --ver --veracrypt --vmstat --vmstat-s --w --wc --who --x509-cert --x509-csr --xml --xrandr --yaml --zipinfo --zpool-iostat --zpool-status)
jc_options=(--force-color -C --debug -d --monochrome -m --meta-out -M --pretty -p --quiet -q --raw -r --unbuffer -u --yaml-out -y)
jc_about_options=(--about -a)
jc_about_mod_options=(--pretty -p --yaml-out -y --monochrome -m --force-color -C)
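The `jc_commands` and `jc_parsers` arrays above feed bash's programmable completion. A minimal standalone sketch (not jc's actual completion function) of how `compgen` filters such a word list against the word being typed:

```shell
# Sketch: filter a completion word list against the current word, the
# way a bash completion function populates COMPREPLY. Illustration only.
jc_parsers=(--csv --xml --yaml)
cur="--x"
COMPREPLY=($(compgen -W "${jc_parsers[*]}" -- "$cur"))
printf '%s\n' "${COMPREPLY[@]}"
```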
@@ -9,7 +9,7 @@ _jc() {
jc_help_options jc_help_options_describe \
jc_special_options jc_special_options_describe
jc_commands=(acpi airport arp blkid bluetoothctl cbt certbot chage cksum crontab date df dig dmidecode dpkg du env file findmnt finger free git gpg hciconfig id ifconfig iostat iptables iw iwconfig jobs last lastb ls lsblk lsmod lsof lspci lsusb md5 md5sum mdadm mount mpstat netstat nmcli ntpq os-prober pidstat ping ping6 pip pip3 postconf printenv ps route rpm rsync sfdisk sha1sum sha224sum sha256sum sha384sum sha512sum shasum ss ssh sshd stat sum sysctl systemctl systeminfo timedatectl top tracepath tracepath6 traceroute traceroute6 udevadm ufw uname update-alternatives upower uptime vdir vmstat w wc who xrandr zipinfo zpool)
jc_commands=(acpi airport arp blkid bluetoothctl cbt certbot chage cksum crontab date df dig dmidecode dpkg du env file findmnt finger free git gpg hciconfig id ifconfig iostat ip iptables iw iwconfig jobs last lastb ls lsattr lsblk lsmod lsof lspci lsusb md5 md5sum mdadm mount mpstat netstat nmcli ntpq os-prober pidstat ping ping6 pip pip3 postconf printenv ps route rpm rsync sfdisk sha1sum sha224sum sha256sum sha384sum sha512sum shasum ss ssh sshd stat sum sysctl systemctl systeminfo timedatectl top tracepath tracepath6 traceroute traceroute6 udevadm ufw uname update-alternatives upower uptime vdir veracrypt vmstat w wc who xrandr zipinfo zpool)
jc_commands_describe=(
'acpi:run "acpi" command with magic syntax.'
'airport:run "airport" command with magic syntax.'
@@ -38,6 +38,7 @@ _jc() {
'id:run "id" command with magic syntax.'
'ifconfig:run "ifconfig" command with magic syntax.'
'iostat:run "iostat" command with magic syntax.'
'ip:run "ip" command with magic syntax.'
'iptables:run "iptables" command with magic syntax.'
'iw:run "iw" command with magic syntax.'
'iwconfig:run "iwconfig" command with magic syntax.'
@@ -45,6 +46,7 @@ _jc() {
'last:run "last" command with magic syntax.'
'lastb:run "lastb" command with magic syntax.'
'ls:run "ls" command with magic syntax.'
'lsattr:run "lsattr" command with magic syntax.'
'lsblk:run "lsblk" command with magic syntax.'
'lsmod:run "lsmod" command with magic syntax.'
'lsof:run "lsof" command with magic syntax.'
@@ -98,6 +100,7 @@ _jc() {
'upower:run "upower" command with magic syntax.'
'uptime:run "uptime" command with magic syntax.'
'vdir:run "vdir" command with magic syntax.'
'veracrypt:run "veracrypt" command with magic syntax.'
'vmstat:run "vmstat" command with magic syntax.'
'w:run "w" command with magic syntax.'
'wc:run "wc" command with magic syntax.'
@@ -106,7 +109,7 @@ _jc() {
'zipinfo:run "zipinfo" command with magic syntax.'
'zpool:run "zpool" command with magic syntax.'
)
jc_parsers=(--acpi --airport --airport-s --arp --asciitable --asciitable-m --blkid --bluetoothctl --cbt --cef --cef-s --certbot --chage --cksum --clf --clf-s --crontab --crontab-u --csv --csv-s --date --datetime-iso --df --dig --dir --dmidecode --dpkg-l --du --email-address --env --file --findmnt --finger --free --fstab --git-log --git-log-s --git-ls-remote --gpg --group --gshadow --hash --hashsum --hciconfig --history --hosts --id --ifconfig --ini --ini-dup --iostat --iostat-s --ip-address --iptables --iw-scan --iwconfig --jar-manifest --jobs --jwt --kv --last --ls --ls-s --lsblk --lsmod --lsof --lspci --lsusb --m3u --mdadm --mount --mpstat --mpstat-s --netstat --nmcli --ntpq --openvpn --os-prober --passwd --pci-ids --pgpass --pidstat --pidstat-s --ping --ping-s --pip-list --pip-show --plist --postconf --proc --proc-buddyinfo --proc-consoles --proc-cpuinfo --proc-crypto --proc-devices --proc-diskstats --proc-filesystems --proc-interrupts --proc-iomem --proc-ioports --proc-loadavg --proc-locks --proc-meminfo --proc-modules --proc-mtrr --proc-pagetypeinfo --proc-partitions --proc-slabinfo --proc-softirqs --proc-stat --proc-swaps --proc-uptime --proc-version --proc-vmallocinfo --proc-vmstat --proc-zoneinfo --proc-driver-rtc --proc-net-arp --proc-net-dev --proc-net-dev-mcast --proc-net-if-inet6 --proc-net-igmp --proc-net-igmp6 --proc-net-ipv6-route --proc-net-netlink --proc-net-netstat --proc-net-packet --proc-net-protocols --proc-net-route --proc-net-unix --proc-pid-fdinfo --proc-pid-io --proc-pid-maps --proc-pid-mountinfo --proc-pid-numa-maps --proc-pid-smaps --proc-pid-stat --proc-pid-statm --proc-pid-status --ps --route --rpm-qi --rsync --rsync-s --semver --sfdisk --shadow --ss --ssh-conf --sshd-conf --stat --stat-s --sysctl --syslog --syslog-s --syslog-bsd --syslog-bsd-s --systemctl --systemctl-lj --systemctl-ls --systemctl-luf --systeminfo --time --timedatectl --timestamp --toml --top --top-s --tracepath --traceroute --udevadm --ufw --ufw-appinfo --uname 
--update-alt-gs --update-alt-q --upower --uptime --url --ver --vmstat --vmstat-s --w --wc --who --x509-cert --xml --xrandr --yaml --zipinfo --zpool-iostat --zpool-status)
jc_parsers=(--acpi --airport --airport-s --arp --asciitable --asciitable-m --blkid --bluetoothctl --cbt --cef --cef-s --certbot --chage --cksum --clf --clf-s --crontab --crontab-u --csv --csv-s --date --datetime-iso --df --dig --dir --dmidecode --dpkg-l --du --email-address --env --file --find --findmnt --finger --free --fstab --git-log --git-log-s --git-ls-remote --gpg --group --gshadow --hash --hashsum --hciconfig --history --hosts --id --ifconfig --ini --ini-dup --iostat --iostat-s --ip-address --iptables --ip-route --iw-scan --iwconfig --jar-manifest --jobs --jwt --kv --last --ls --ls-s --lsattr --lsblk --lsmod --lsof --lspci --lsusb --m3u --mdadm --mount --mpstat --mpstat-s --netstat --nmcli --ntpq --openvpn --os-prober --passwd --pci-ids --pgpass --pidstat --pidstat-s --ping --ping-s --pip-list --pip-show --plist --postconf --proc --proc-buddyinfo --proc-consoles --proc-cpuinfo --proc-crypto --proc-devices --proc-diskstats --proc-filesystems --proc-interrupts --proc-iomem --proc-ioports --proc-loadavg --proc-locks --proc-meminfo --proc-modules --proc-mtrr --proc-pagetypeinfo --proc-partitions --proc-slabinfo --proc-softirqs --proc-stat --proc-swaps --proc-uptime --proc-version --proc-vmallocinfo --proc-vmstat --proc-zoneinfo --proc-driver-rtc --proc-net-arp --proc-net-dev --proc-net-dev-mcast --proc-net-if-inet6 --proc-net-igmp --proc-net-igmp6 --proc-net-ipv6-route --proc-net-netlink --proc-net-netstat --proc-net-packet --proc-net-protocols --proc-net-route --proc-net-tcp --proc-net-unix --proc-pid-fdinfo --proc-pid-io --proc-pid-maps --proc-pid-mountinfo --proc-pid-numa-maps --proc-pid-smaps --proc-pid-stat --proc-pid-statm --proc-pid-status --ps --resolve-conf --route --rpm-qi --rsync --rsync-s --semver --sfdisk --shadow --srt --ss --ssh-conf --sshd-conf --stat --stat-s --sysctl --syslog --syslog-s --syslog-bsd --syslog-bsd-s --systemctl --systemctl-lj --systemctl-ls --systemctl-luf --systeminfo --time --timedatectl --timestamp --toml --top --top-s 
--tracepath --traceroute --udevadm --ufw --ufw-appinfo --uname --update-alt-gs --update-alt-q --upower --uptime --url --ver --veracrypt --vmstat --vmstat-s --w --wc --who --x509-cert --x509-csr --xml --xrandr --yaml --zipinfo --zpool-iostat --zpool-status)
jc_parsers_describe=(
'--acpi:`acpi` command parser'
'--airport:`airport -I` command parser'
@@ -139,6 +142,7 @@ _jc() {
'--email-address:Email Address string parser'
'--env:`env` command parser'
'--file:`file` command parser'
'--find:`find` command parser'
'--findmnt:`findmnt` command parser'
'--finger:`finger` command parser'
'--free:`free` command parser'
@@ -162,6 +166,7 @@ _jc() {
'--iostat-s:`iostat` command streaming parser'
'--ip-address:IPv4 and IPv6 Address string parser'
'--iptables:`iptables` command parser'
'--ip-route:`ip route` command parser'
'--iw-scan:`iw dev [device] scan` command parser'
'--iwconfig:`iwconfig` command parser'
'--jar-manifest:Java MANIFEST.MF file parser'
@@ -171,6 +176,7 @@ _jc() {
'--last:`last` and `lastb` command parser'
'--ls:`ls` command parser'
'--ls-s:`ls` command streaming parser'
'--lsattr:`lsattr` command parser'
'--lsblk:`lsblk` command parser'
'--lsmod:`lsmod` command parser'
'--lsof:`lsof` command parser'
@@ -237,6 +243,7 @@ _jc() {
'--proc-net-packet:`/proc/net/packet` file parser'
'--proc-net-protocols:`/proc/net/protocols` file parser'
'--proc-net-route:`/proc/net/route` file parser'
'--proc-net-tcp:`/proc/net/tcp` and `/proc/net/tcp6` file parser'
'--proc-net-unix:`/proc/net/unix` file parser'
'--proc-pid-fdinfo:`/proc/<pid>/fdinfo/<fd>` file parser'
'--proc-pid-io:`/proc/<pid>/io` file parser'
@@ -248,6 +255,7 @@ _jc() {
'--proc-pid-statm:`/proc/<pid>/statm` file parser'
'--proc-pid-status:`/proc/<pid>/status` file parser'
'--ps:`ps` command parser'
'--resolve-conf:`/etc/resolve.conf` file parser'
'--route:`route` command parser'
'--rpm-qi:`rpm -qi` command parser'
'--rsync:`rsync` command parser'
@@ -255,6 +263,7 @@ _jc() {
'--semver:Semantic Version string parser'
'--sfdisk:`sfdisk` command parser'
'--shadow:`/etc/shadow` file parser'
'--srt:SRT file parser'
'--ss:`ss` command parser'
'--ssh-conf:`ssh` config file and `ssh -G` command parser'
'--sshd-conf:`sshd` config file and `sshd -T` command parser'
@@ -288,12 +297,14 @@ _jc() {
'--uptime:`uptime` command parser'
'--url:URL string parser'
'--ver:Version string parser'
'--veracrypt:`veracrypt` command parser'
'--vmstat:`vmstat` command parser'
'--vmstat-s:`vmstat` command streaming parser'
'--w:`w` command parser'
'--wc:`wc` command parser'
'--who:`who` command parser'
'--x509-cert:X.509 PEM and DER certificate file parser'
'--x509-csr:X.509 PEM and DER certificate request file parser'
'--xml:XML file parser'
'--xrandr:`xrandr` command parser'
'--yaml:YAML file parser'
@@ -36,6 +36,7 @@ a controller and a device but there might be fields corresponding to one entity.
    "name": string,
    "is_default": boolean,
    "is_public": boolean,
    "is_random": boolean,
    "address": string,
    "alias": string,
    "class": string,
@@ -54,8 +55,10 @@ a controller and a device but there might be fields corresponding to one entity.
    {
      "name": string,
      "is_public": boolean,
      "is_random": boolean,
      "address": string,
      "alias": string,
      "appearance": string,
      "class": string,
      "icon": string,
      "paired": string,
@@ -66,7 +69,8 @@ a controller and a device but there might be fields corresponding to one entity.
      "legacy_pairing": string,
      "rssi": int,
      "txpower": int,
      "uuids": array
      "uuids": array,
      "modalias": string
    }
  ]

@@ -126,4 +130,4 @@ Returns:

### Parser Information
Compatibility: linux

Version 1.0 by Jake Ob (iakopap at gmail.com)
Version 1.1 by Jake Ob (iakopap at gmail.com)
@@ -158,4 +158,4 @@ Returns:

### Parser Information
Compatibility: linux, darwin, cygwin, win32, aix, freebsd

Version 1.0 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.2 by Kelly Brazil (kellyjonbrazil@gmail.com)
@@ -9,6 +9,7 @@ Options supported:
- `+noall +answer` options are supported in cases where only the answer
  information is desired.
- `+axfr` option is supported on its own
- `+nsid` option is supported

The `when_epoch` calculated timestamp field is naive (i.e. based on the
local time of the system the parser is run on).

@@ -345,4 +346,4 @@ Returns:

### Parser Information
Compatibility: linux, aix, freebsd, darwin, win32, cygwin

Version 2.4 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 2.5 by Kelly Brazil (kellyjonbrazil@gmail.com)
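The note above about the naive `when_epoch` field can be illustrated: because no tzinfo is attached, the epoch value depends on the local timezone of the machine running the parser. A sketch (the timestamp string is hypothetical; this is not jc's actual code):

```python
from datetime import datetime

# Parse a dig-style timestamp naively (no tzinfo), so .timestamp()
# interprets it in the local timezone of the machine running the code.
when = datetime.strptime("Sun Jul 30 12:00:00 2023", "%a %b %d %H:%M:%S %Y")
when_epoch = int(when.timestamp())  # value varies with the local timezone
```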
82
docs/parsers/find.md
Normal file
@@ -0,0 +1,82 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.find"></a>

# jc.parsers.find

jc - JSON Convert `find` command output parser

This parser returns a list of objects by default and a list of strings if
the `--raw` option is used.

Usage (cli):

    $ find | jc --find

Usage (module):

    import jc
    result = jc.parse('find', find_command_output)

Schema:

    [
      {
        "path":   string,
        "node":   string,
        "error":  string
      }
    ]

Examples:

    $ find | jc --find -p
    [
      {
        "path": "./directory",
        "node": "filename"
      },
      {
        "path": "./anotherdirectory",
        "node": "anotherfile"
      },
      {
        "path": null,
        "node": null,
        "error": "find: './inaccessible': Permission denied"
      },
      ...
    ]

    $ find | jc --find -p -r
    [
      "./templates/readme_template",
      "./templates/manpage_template",
      "./.github/workflows/pythonapp.yml",
      ...
    ]

<a id="jc.parsers.find.parse"></a>

### parse

```python
def parse(data, raw=False, quiet=False)
```

Main text parsing function

Parameters:

    data:   (string) text data to parse
    raw:    (boolean) unprocessed output if True
    quiet:  (boolean) suppress warning messages if True

Returns:

    List of raw strings or
    List of Dictionaries of processed structured data

### Parser Information
Compatibility: linux

Version 1.0 by Solomon Leang (solomonleang@gmail.com)
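The `path`/`node`/`error` schema above can be illustrated with a short, self-contained sketch; it mirrors the schema but is an illustration, not jc's actual implementation:

```python
import os

def parse_find_line(line):
    # Error lines from `find` (e.g. permission errors) carry an "error"
    # field; normal lines split into directory path and node name.
    if line.startswith('find:'):
        return {'path': None, 'node': None, 'error': line}
    path, node = os.path.split(line)
    return {'path': path, 'node': node}

lines = [
    "./templates/readme_template",
    "find: './inaccessible': Permission denied",
]
results = [parse_find_line(line) for line in lines]
```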
74
docs/parsers/ip_route.md
Normal file
@@ -0,0 +1,74 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.ip_route"></a>

# jc.parsers.ip\_route

jc - JSON Convert `ip route` command output parser

Usage (cli):

    $ ip route | jc --ip-route

or

    $ jc ip-route

Usage (module):

    import jc
    result = jc.parse('ip_route', ip_route_command_output)

Schema:

    [
      {
        "ip": string,
        "via": string,
        "dev": string,
        "metric": integer,
        "proto": string,
        "scope": string,
        "src": string,
        "status": string
      }
    ]

Examples:

    $ ip route | jc --ip-route -p
    [
      {
        "ip": "10.0.2.0/24",
        "dev": "enp0s3",
        "proto": "kernel",
        "scope": "link",
        "src": "10.0.2.15",
        "metric": 100
      }
    ]

<a id="jc.parsers.ip_route.parse"></a>

### parse

```python
def parse(data, raw=False, quiet=False)
```

Main text parsing function

Parameters:

    data:        (string)  text data to parse
    raw:         (boolean) unprocessed output if True
    quiet:       (boolean) suppress warning messages if True

Returns:

    List of Dictionaries. Raw or processed structured data.

### Parser Information
Compatibility:  linux

Version 1.0 by Julian Jackson (jackson.julian55@yahoo.com)
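The `ip route` output in the example above is a destination followed by alternating keyword/value pairs (`dev enp0s3`, `metric 100`, and so on), which suggests how such a line can be tokenized. A minimal sketch under that assumption (not jc's actual implementation; bare flags such as `linkdown`, which break the alternation, are ignored here):

```python
def parse_ip_route_line(line: str) -> dict:
    """Tokenize one `ip route` line: the first word is the destination,
    the remaining words alternate between keyword and value."""
    words = line.split()
    route = {'ip': words[0]}
    int_fields = {'metric'}  # converted to integer, matching the schema above
    for key, value in zip(words[1::2], words[2::2]):
        route[key] = int(value) if key in int_fields else value
    return route
```

Running it on the example line yields the same fields as the processed output shown above.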
@@ -5,7 +5,7 @@

jc - JSON Convert `last` and `lastb` command output parser

Supports `-w` and `-F` options.
Supports `-w`, `-F`, and `-x` options.

Calculated epoch time fields are naive (i.e. based on the local time of the
system the parser is run on) since there is no timezone information in the
@@ -127,4 +127,4 @@ Returns:

### Parser Information
Compatibility:  linux, darwin, aix, freebsd

Version 1.8 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.9 by Kelly Brazil (kellyjonbrazil@gmail.com)
docs/parsers/lsattr.md (new file, 89 lines)
@@ -0,0 +1,89 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.lsattr"></a>

# jc.parsers.lsattr

jc - JSON Convert `lsattr` command output parser

Usage (cli):

    $ lsattr | jc --lsattr

or

    $ jc lsattr

Usage (module):

    import jc
    result = jc.parse('lsattr', lsattr_command_output)

Schema:

Information from
https://github.com/mirror/busybox/blob/2d4a3d9e6c1493a9520b907e07a41aca90cdfd94/e2fsprogs/e2fs_lib.c#L40
used to define field names

    [
      {
        "file": string,
        "compressed_file": Optional[boolean],
        "compressed_dirty_file": Optional[boolean],
        "compression_raw_access": Optional[boolean],
        "secure_deletion": Optional[boolean],
        "undelete": Optional[boolean],
        "synchronous_updates": Optional[boolean],
        "synchronous_directory_updates": Optional[boolean],
        "immutable": Optional[boolean],
        "append_only": Optional[boolean],
        "no_dump": Optional[boolean],
        "no_atime": Optional[boolean],
        "compression_requested": Optional[boolean],
        "encrypted": Optional[boolean],
        "journaled_data": Optional[boolean],
        "indexed_directory": Optional[boolean],
        "no_tailmerging": Optional[boolean],
        "top_of_directory_hierarchies": Optional[boolean],
        "extents": Optional[boolean],
        "no_cow": Optional[boolean],
        "casefold": Optional[boolean],
        "inline_data": Optional[boolean],
        "project_hierarchy": Optional[boolean],
        "verity": Optional[boolean]
      }
    ]

Examples:

    $ sudo lsattr /etc/passwd | jc --lsattr
    [
      {
        "file": "/etc/passwd",
        "extents": true
      }
    ]

<a id="jc.parsers.lsattr.parse"></a>

### parse

```python
def parse(data: str,
          raw: bool = False,
          quiet: bool = False) -> List[JSONDictType]
```

Main text parsing function

Parameters:

    data:        (string)  text data to parse
    raw:         (boolean) unprocessed output if True
    quiet:       (boolean) suppress warning messages if True

Returns:

    List of Dictionaries. Raw or processed structured data.

### Parser Information
Compatibility:  linux

Version 1.0 by Mark Rotner (rotner.mr@gmail.com)
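The busybox table linked above maps each letter in the `lsattr` attribute string to one of the schema's field names. A sketch of that lookup with a subset of the standard `chattr`/`lsattr` letters (the full letter-to-name table should be checked against the linked source; this is not jc's code):

```python
# Subset of flag-letter -> schema-field mappings (standard e2fsprogs letters)
FLAG_FIELDS = {
    's': 'secure_deletion',
    'u': 'undelete',
    'S': 'synchronous_updates',
    'i': 'immutable',
    'a': 'append_only',
    'd': 'no_dump',
    'A': 'no_atime',
    'c': 'compression_requested',
    'j': 'journaled_data',
    'I': 'indexed_directory',
    'e': 'extents',
    'C': 'no_cow',
}

def parse_lsattr_line(line: str) -> dict:
    """'--------------e------- /etc/passwd' -> {'file': '/etc/passwd', 'extents': True}"""
    flags, filename = line.split(maxsplit=1)
    entry = {'file': filename}
    for ch in flags:
        if ch != '-' and ch in FLAG_FIELDS:
            entry[FLAG_FIELDS[ch]] = True   # only set fields are emitted (Optional in schema)
    return entry
```

This reproduces the `/etc/passwd` example above, where only the `extents` flag is set.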
@@ -376,6 +376,6 @@ Returns:

List of Dictionaries. Raw or processed structured data.

### Parser Information
Compatibility:  linux, darwin, freebsd
Compatibility:  linux, darwin, freebsd, win32

Version 1.13 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.14 by Kelly Brazil (kellyjonbrazil@gmail.com)
@@ -185,4 +185,4 @@ Returns:

### Parser Information
Compatibility:  linux, darwin, freebsd

Version 1.8 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.9 by Kelly Brazil (kellyjonbrazil@gmail.com)
@@ -106,4 +106,4 @@ Returns:

### Parser Information
Compatibility:  linux, darwin, freebsd

Version 1.2 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.3 by Kelly Brazil (kellyjonbrazil@gmail.com)
docs/parsers/proc_net_tcp.md (new file, 186 lines)
@@ -0,0 +1,186 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.proc_net_tcp"></a>

# jc.parsers.proc\_net\_tcp

jc - JSON Convert `/proc/net/tcp` and `/proc/net/tcp6` file parser

IPv4 and IPv6 addresses are converted to standard notation unless the raw
(--raw) option is used.

Usage (cli):

    $ cat /proc/net/tcp | jc --proc

or

    $ jc /proc/net/tcp

or

    $ cat /proc/net/tcp | jc --proc-net-tcp

Usage (module):

    import jc
    result = jc.parse('proc', proc_net_tcp_file)

or

    import jc
    result = jc.parse('proc_net_tcp', proc_net_tcp_file)

Schema:

Field names and types gathered from the following:

    https://www.kernel.org/doc/Documentation/networking/proc_net_tcp.txt

    https://github.com/torvalds/linux/blob/master/net/ipv4/tcp_ipv4.c

    https://github.com/torvalds/linux/blob/master/net/ipv6/tcp_ipv6.c

    [
      {
        "entry": integer,
        "local_address": string,
        "local_port": integer,
        "remote_address": string,
        "remote_port": integer,
        "state": string,
        "tx_queue": string,
        "rx_queue": string,
        "timer_active": integer,
        "jiffies_until_timer_expires": string,
        "unrecovered_rto_timeouts": string,
        "uid": integer,
        "unanswered_0_window_probes": integer,
        "inode": integer,
        "sock_ref_count": integer,
        "sock_mem_loc": string,
        "retransmit_timeout": integer,
        "soft_clock_tick": integer,
        "ack_quick_pingpong": integer,
        "sending_congestion_window": integer,
        "slow_start_size_threshold": integer
      }
    ]

Examples:

    $ cat /proc/net/tcp | jc --proc -p
    [
      {
        "entry": "0",
        "local_address": "10.0.0.28",
        "local_port": 42082,
        "remote_address": "64.12.0.108",
        "remote_port": 80,
        "state": "04",
        "tx_queue": "00000001",
        "rx_queue": "00000000",
        "timer_active": 1,
        "jiffies_until_timer_expires": "00000015",
        "unrecovered_rto_timeouts": "00000000",
        "uid": 0,
        "unanswered_0_window_probes": 0,
        "inode": 0,
        "sock_ref_count": 3,
        "sock_mem_loc": "ffff8c7a0de930c0",
        "retransmit_timeout": 21,
        "soft_clock_tick": 4,
        "ack_quick_pingpong": 30,
        "sending_congestion_window": 10,
        "slow_start_size_threshold": -1
      },
      {
        "entry": "1",
        "local_address": "10.0.0.28",
        "local_port": 38864,
        "remote_address": "104.244.42.65",
        "remote_port": 80,
        "state": "06",
        "tx_queue": "00000000",
        "rx_queue": "00000000",
        "timer_active": 3,
        "jiffies_until_timer_expires": "000007C5",
        "unrecovered_rto_timeouts": "00000000",
        "uid": 0,
        "unanswered_0_window_probes": 0,
        "inode": 0,
        "sock_ref_count": 3,
        "sock_mem_loc": "ffff8c7a12d31aa0"
      },
      ...
    ]

    $ cat /proc/net/tcp | jc --proc -p -r
    [
      {
        "entry": "1",
        "local_address": "1C00000A",
        "local_port": "A462",
        "remote_address": "6C000C40",
        "remote_port": "0050",
        "state": "04",
        "tx_queue": "00000001",
        "rx_queue": "00000000",
        "timer_active": "01",
        "jiffies_until_timer_expires": "00000015",
        "unrecovered_rto_timeouts": "00000000",
        "uid": "0",
        "unanswered_0_window_probes": "0",
        "inode": "0",
        "sock_ref_count": "3",
        "sock_mem_loc": "ffff8c7a0de930c0",
        "retransmit_timeout": "21",
        "soft_clock_tick": "4",
        "ack_quick_pingpong": "30",
        "sending_congestion_window": "10",
        "slow_start_size_threshold": "-1"
      },
      {
        "entry": "2",
        "local_address": "1C00000A",
        "local_port": "97D0",
        "remote_address": "412AF468",
        "remote_port": "0050",
        "state": "06",
        "tx_queue": "00000000",
        "rx_queue": "00000000",
        "timer_active": "03",
        "jiffies_until_timer_expires": "000007C5",
        "unrecovered_rto_timeouts": "00000000",
        "uid": "0",
        "unanswered_0_window_probes": "0",
        "inode": "0",
        "sock_ref_count": "3",
        "sock_mem_loc": "ffff8c7a12d31aa0"
      },
      ...
    ]

<a id="jc.parsers.proc_net_tcp.parse"></a>

### parse

```python
def parse(data: str, raw: bool = False, quiet: bool = False) -> List[Dict]
```

Main text parsing function

Parameters:

    data:        (string)  text data to parse
    raw:         (boolean) unprocessed output if True
    quiet:       (boolean) suppress warning messages if True

Returns:

    List of Dictionaries. Raw or processed structured data.

### Parser Information
Compatibility:  linux

Version 1.0 by Alvin Solomon (alvinms01@gmail.com)
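Comparing the raw and processed examples shows the conversion this parser performs: the kernel prints each IPv4 address as a 32-bit hex value in host byte order (little-endian on common hardware), so `1C00000A:A462` becomes `10.0.0.28` port `42082`. A minimal sketch of that conversion, assuming a little-endian source (not jc's actual implementation):

```python
import ipaddress

def decode_proc_net_tcp_addr(hex_addr: str) -> tuple:
    """Convert a /proc/net/tcp address like '1C00000A:A462' to ('10.0.0.28', 42082)."""
    addr_hex, port_hex = hex_addr.split(':')
    packed = bytes.fromhex(addr_hex)[::-1]      # reverse little-endian byte order
    ip = str(ipaddress.IPv4Address(packed))
    port = int(port_hex, 16)                    # port is plain big-endian hex
    return ip, port
```

The same idea extends to `/proc/net/tcp6`, where the address is four little-endian 32-bit groups rather than one.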
docs/parsers/resolve_conf.md (new file, 83 lines)
@@ -0,0 +1,83 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.resolve_conf"></a>

# jc.parsers.resolve\_conf

jc - JSON Convert `/etc/resolve.conf` file parser

This parser may be more forgiving than the system parser. For example, if
multiple `search` lists are defined, this parser will append all entries to
the `search` field, while the system parser may only use the list from the
last defined instance.

Usage (cli):

    $ cat /etc/resolve.conf | jc --resolve-conf

Usage (module):

    import jc
    result = jc.parse('resolve_conf', resolve_conf_output)

Schema:

    {
      "domain": string,
      "search": [
        string
      ],
      "nameservers": [
        string
      ],
      "options": [
        string
      ],
      "sortlist": [
        string
      ]
    }

Examples:

    $ cat /etc/resolve.conf | jc --resolve-conf -p
    {
      "search": [
        "eng.myprime.com",
        "dev.eng.myprime.com",
        "labs.myprime.com",
        "qa.myprime.com"
      ],
      "nameservers": [
        "10.136.17.15"
      ],
      "options": [
        "rotate",
        "ndots:1"
      ]
    }

<a id="jc.parsers.resolve_conf.parse"></a>

### parse

```python
def parse(data: str, raw: bool = False, quiet: bool = False) -> JSONDictType
```

Main text parsing function

Parameters:

    data:        (string)  text data to parse
    raw:         (boolean) unprocessed output if True
    quiet:       (boolean) suppress warning messages if True

Returns:

    Dictionary. Raw or processed structured data.

### Parser Information
Compatibility:  linux, darwin, cygwin, win32, aix, freebsd

Version 1.0 by Kelly Brazil (kellyjonbrazil@gmail.com)
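The "more forgiving" behavior described above (appending repeated `search` lists instead of keeping only the last) is easy to see in a sketch. This is a simplified stdlib illustration of that design choice, not jc's code:

```python
def parse_resolve_conf(text: str) -> dict:
    """Minimal resolv.conf tokenizer: first token is the keyword, the rest
    are values. Repeated `search` lines are appended, not replaced."""
    result = {'search': [], 'nameservers': [], 'options': [], 'sortlist': []}
    for line in text.splitlines():
        line = line.split('#')[0].split(';')[0].strip()  # strip comments
        if not line:
            continue
        keyword, *values = line.split()
        if keyword == 'domain' and values:
            result['domain'] = values[0]
        elif keyword == 'nameserver' and values:
            result['nameservers'].append(values[0])
        elif keyword in ('search', 'options', 'sortlist'):
            result[keyword] += values                    # append across repeats
    return result
```

With two `search` lines in the input, all entries end up in one `search` list, matching the behavior the doc describes.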
@@ -22,6 +22,13 @@ Schema:

    [
      {
        "interfaces": [
          {
            "id": string,
            "mac": string,
            "name": string
          }
        ],
        "destination": string,
        "gateway": string,
        "genmask": string,
@@ -129,6 +136,6 @@ Returns:

List of Dictionaries. Raw or processed structured data.

### Parser Information
Compatibility:  linux
Compatibility:  linux, win32

Version 1.8 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.9 by Kelly Brazil (kellyjonbrazil@gmail.com)
docs/parsers/srt.md (new file, 136 lines)
@@ -0,0 +1,136 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.srt"></a>

# jc.parsers.srt

jc - JSON Convert `SRT` file parser

Usage (cli):

    $ cat foo.srt | jc --srt

Usage (module):

    import jc
    result = jc.parse('srt', srt_file_output)

Schema:

    [
      {
        "index": int,
        "start": {
          "hours": int,
          "minutes": int,
          "seconds": int,
          "milliseconds": int,
          "timestamp": string
        },
        "end": {
          "hours": int,
          "minutes": int,
          "seconds": int,
          "milliseconds": int,
          "timestamp": string
        },
        "content": string
      }
    ]

Examples:

    $ cat attack_of_the_clones.srt
    1
    00:02:16,612 --> 00:02:19,376
    Senator, we're making
    our final approach into Coruscant.

    2
    00:02:19,482 --> 00:02:21,609
    Very good, Lieutenant.
    ...

    $ cat attack_of_the_clones.srt | jc --srt
    [
      {
        "index": 1,
        "start": {
          "hours": 0,
          "minutes": 2,
          "seconds": 16,
          "milliseconds": 612,
          "timestamp": "00:02:16,612"
        },
        "end": {
          "hours": 0,
          "minutes": 2,
          "seconds": 19,
          "milliseconds": 376,
          "timestamp": "00:02:19,376"
        },
        "content": "Senator, we're making\nour final approach into Coruscant."
      },
      {
        "index": 2,
        "start": {
          "hours": 0,
          "minutes": 2,
          "seconds": 19,
          "milliseconds": 482,
          "timestamp": "00:02:19,482"
        },
        "end": {
          "hours": 0,
          "minutes": 2,
          "seconds": 21,
          "milliseconds": 609,
          "timestamp": "00:02:21,609"
        },
        "content": "Very good, Lieutenant."
      },
      ...
    ]

<a id="jc.parsers.srt.parse_timestamp"></a>

### parse\_timestamp

```python
def parse_timestamp(timestamp: str) -> Dict
```

timestamp: "hours:minutes:seconds,milliseconds" --->

    {
        "hours": "hours",
        "minutes": "minutes",
        "seconds": "seconds",
        "milliseconds": "milliseconds",
        "timestamp": "hours:minutes:seconds,milliseconds"
    }

<a id="jc.parsers.srt.parse"></a>

### parse

```python
def parse(data: str,
          raw: bool = False,
          quiet: bool = False) -> List[JSONDictType]
```

Main text parsing function

Parameters:

    data:        (string)  text data to parse
    raw:         (boolean) unprocessed output if True
    quiet:       (boolean) suppress warning messages if True

Returns:

    List of Dictionaries. Raw or processed structured data.

### Parser Information
Compatibility:  linux, darwin, cygwin, win32, aix, freebsd

Version 1.0 by Mark Rotner (rotner.mr@gmail.com)
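The `parse_timestamp` contract documented above maps an `HH:MM:SS,mmm` string to its integer components plus the original string. A regex-based sketch of that contract (a hypothetical reimplementation, not jc's code):

```python
import re

TIMESTAMP_RE = re.compile(
    r'(?P<hours>\d+):(?P<minutes>\d{2}):(?P<seconds>\d{2}),(?P<milliseconds>\d{3})'
)

def parse_timestamp(timestamp: str) -> dict:
    """'00:02:16,612' -> integer fields plus the original timestamp string."""
    match = TIMESTAMP_RE.fullmatch(timestamp)
    if not match:
        raise ValueError(f'not an SRT timestamp: {timestamp!r}')
    fields = {name: int(value) for name, value in match.groupdict().items()}
    fields['timestamp'] = timestamp
    return fields
```

Note that SRT uses a comma (not a period) before the milliseconds, which the regex enforces.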
@@ -5,9 +5,6 @@

jc - JSON Convert `ss` command output parser

Extended information options like `-e` and `-p` are not supported and may
cause parsing irregularities.

Usage (cli):

    $ ss | jc --ss
@@ -28,21 +25,29 @@ field names

    [
      {
        "netid": string,
        "state": string,
        "recv_q": integer,
        "send_q": integer,
        "local_address": string,
        "local_port": string,
        "local_port_num": integer,
        "peer_address": string,
        "peer_port": string,
        "peer_port_num": integer,
        "interface": string,
        "link_layer": string,
        "channel": string,
        "path": string,
        "pid": integer
        "netid": string,
        "state": string,
        "recv_q": integer,
        "send_q": integer,
        "local_address": string,
        "local_port": string,
        "local_port_num": integer,
        "peer_address": string,
        "peer_port": string,
        "peer_port_num": integer,
        "interface": string,
        "link_layer": string,
        "channel": string,
        "path": string,
        "pid": integer,
        "opts": {
          "process_id": {
            "<process_id>": {
              "user": string,
              "file_descriptor": string
            }
          }
        }
      }
    ]

@@ -303,4 +308,4 @@ Returns:

### Parser Information
Compatibility:  linux

Version 1.6 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.7 by Kelly Brazil (kellyjonbrazil@gmail.com)
@@ -193,4 +193,4 @@ Returns:

### Parser Information
Compatibility:  linux, darwin, freebsd

Version 1.12 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.13 by Kelly Brazil (kellyjonbrazil@gmail.com)
docs/parsers/veracrypt.md (new file, 108 lines)
@@ -0,0 +1,108 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.veracrypt"></a>

# jc.parsers.veracrypt

jc - JSON Convert `veracrypt` command output parser

Supports the following `veracrypt` subcommands:
- `veracrypt --text --list`
- `veracrypt --text --list --verbose`
- `veracrypt --text --volume-properties <volume>`

Usage (cli):

    $ veracrypt --text --list | jc --veracrypt

or

    $ jc veracrypt --text --list

Usage (module):

    import jc
    result = jc.parse('veracrypt', veracrypt_command_output)

Schema:

    Volume:
    [
      {
        "slot": integer,
        "path": string,
        "device": string,
        "mountpoint": string,
        "size": string,
        "type": string,
        "readonly": string,
        "hidden_protected": string,
        "encryption_algo": string,
        "pk_size": string,
        "sk_size": string,
        "block_size": string,
        "mode": string,
        "prf": string,
        "format_version": integer,
        "backup_header": string
      }
    ]

Examples:

    $ veracrypt --text --list | jc --veracrypt -p
    [
      {
        "slot": 1,
        "path": "/dev/sdb1",
        "device": "/dev/mapper/veracrypt1",
        "mountpoint": "/home/bob/mount/encrypt/sdb1"
      }
    ]

    $ veracrypt --text --list --verbose | jc --veracrypt -p
    [
      {
        "slot": 1,
        "path": "/dev/sdb1",
        "device": "/dev/mapper/veracrypt1",
        "mountpoint": "/home/bob/mount/encrypt/sdb1",
        "size": "522 MiB",
        "type": "Normal",
        "readonly": "No",
        "hidden_protected": "No",
        "encryption_algo": "AES",
        "pk_size": "256 bits",
        "sk_size": "256 bits",
        "block_size": "128 bits",
        "mode": "XTS",
        "prf": "HMAC-SHA-512",
        "format_version": 2,
        "backup_header": "Yes"
      }
    ]

<a id="jc.parsers.veracrypt.parse"></a>

### parse

```python
def parse(data: str,
          raw: bool = False,
          quiet: bool = False) -> List[JSONDictType]
```

Main text parsing function

Parameters:

    data:        (string)  text data to parse
    raw:         (boolean) unprocessed output if True
    quiet:       (boolean) suppress warning messages if True

Returns:

    List of Dictionaries. Raw or processed structured data.

### Parser Information
Compatibility:  linux

Version 1.0 by Jake Ob (iakopap at gmail.com)
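Judging from the non-verbose example above, each `veracrypt --text --list` line has the shape `slot: path device mountpoint`. A sketch of splitting one such line into the four schema fields (this format is an assumption based on the example; paths containing spaces would need quoting-aware handling, and this is not jc's implementation):

```python
def parse_veracrypt_list_line(line: str) -> dict:
    """'1: /dev/sdb1 /dev/mapper/veracrypt1 /mnt/x' -> slot/path/device/mountpoint."""
    slot, path, device, mountpoint = line.split()
    return {
        'slot': int(slot.rstrip(':')),   # slot is converted to integer per the schema
        'path': path,
        'device': device,
        'mountpoint': mountpoint,
    }
```

The verbose and `--volume-properties` outputs are `Key: Value` lines instead, so they call for a different, key/value-style pass.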
@@ -433,4 +433,4 @@ Returns:

### Parser Information
Compatibility:  linux, darwin, cygwin, win32, aix, freebsd

Version 1.1 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.2 by Kelly Brazil (kellyjonbrazil@gmail.com)
docs/parsers/x509_csr.md (new file, 282 lines)
@@ -0,0 +1,282 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.x509_csr"></a>

# jc.parsers.x509\_csr

jc - JSON Convert X.509 Certificate Request format file parser

This parser will convert DER and PEM encoded X.509 certificate request files.

Usage (cli):

    $ cat certificateRequest.pem | jc --x509-csr

Usage (module):

    import jc
    result = jc.parse('x509_csr', x509_csr_file_output)

Schema:

    [
      {
        "certification_request_info": {
          "version": string,
          "serial_number": string,              # [0]
          "serial_number_str": string,
          "signature": {
            "algorithm": string,
            "parameters": string/null,
          },
          "issuer": {
            "country_name": string,
            "state_or_province_name": string,
            "locality_name": string,
            "organization_name": array/string,
            "organizational_unit_name": array/string,
            "common_name": string,
            "email_address": string,
            "serial_number": string,            # [0]
            "serial_number_str": string
          },
          "validity": {
            "not_before": integer,              # [1]
            "not_after": integer,               # [1]
            "not_before_iso": string,
            "not_after_iso": string
          },
          "subject": {
            "country_name": string,
            "state_or_province_name": string,
            "locality_name": string,
            "organization_name": array/string,
            "organizational_unit_name": array/string,
            "common_name": string,
            "email_address": string,
            "serial_number": string,            # [0]
            "serial_number_str": string
          },
          "subject_public_key_info": {
            "algorithm": {
              "algorithm": string,
              "parameters": string/null,
            },
            "public_key": {
              "modulus": string,                # [0]
              "public_exponent": integer
            }
          },
          "issuer_unique_id": string/null,
          "subject_unique_id": string/null,
          "extensions": [
            {
              "extn_id": string,
              "critical": boolean,
              "extn_value": array/object/string/integer   # [2]
            }
          ]
        },
        "signature_algorithm": {
          "algorithm": string,
          "parameters": string/null
        },
        "signature_value": string               # [0]
      }
    ]

    [0] in colon-delimited hex notation
    [1] time-zone-aware (UTC) epoch timestamp
    [2] See below for well-known Extension schemas:

    Basic Constraints:
    {
      "extn_id": "basic_constraints",
      "critical": boolean,
      "extn_value": {
        "ca": boolean,
        "path_len_constraint": string/null
      }
    }

    Key Usage:
    {
      "extn_id": "key_usage",
      "critical": boolean,
      "extn_value": [
        string
      ]
    }

    Key Identifier:
    {
      "extn_id": "key_identifier",
      "critical": boolean,
      "extn_value": string                      # [0]
    }

    Authority Key Identifier:
    {
      "extn_id": "authority_key_identifier",
      "critical": boolean,
      "extn_value": {
        "key_identifier": string,               # [0]
        "authority_cert_issuer": string/null,
        "authority_cert_serial_number": string/null
      }
    }

    Subject Alternative Name:
    {
      "extn_id": "subject_alt_name",
      "critical": boolean,
      "extn_value": [
        string
      ]
    }

    Certificate Policies:
    {
      "extn_id": "certificate_policies",
      "critical": boolean,
      "extn_value": [
        {
          "policy_identifier": string,
          "policy_qualifiers": [                # array or null
            {
              "policy_qualifier_id": string,
              "qualifier": string
            }
          ]
        }
      ]
    }

    Signed Certificate Timestamp List:
    {
      "extn_id": "signed_certificate_timestamp_list",
      "critical": boolean,
      "extn_value": string                      # [0]
    }

Examples:

    $ cat server.csr | jc --x509-csr -p
    [
      {
        "certification_request_info": {
          "version": "v1",
          "subject": {
            "common_name": "myserver.for.example"
          },
          "subject_pk_info": {
            "algorithm": {
              "algorithm": "ec",
              "parameters": "secp256r1"
            },
            "public_key": "04:40:33:c0:91:8f:e9:46:ea:d0:dc:d0:f9:63:2..."
          },
          "attributes": [
            {
              "type": "extension_request",
              "values": [
                [
                  {
                    "extn_id": "extended_key_usage",
                    "critical": false,
                    "extn_value": [
                      "server_auth"
                    ]
                  },
                  {
                    "extn_id": "subject_alt_name",
                    "critical": false,
                    "extn_value": [
                      "myserver.for.example"
                    ]
                  }
                ]
              ]
            }
          ]
        },
        "signature_algorithm": {
          "algorithm": "sha384_ecdsa",
          "parameters": null
        },
        "signature": "30:45:02:20:77:ac:5b:51:bf:c5:f5:43:02:52:ae:66:..."
      }
    ]

    $ openssl req -in server.csr | jc --x509-csr -p
    [
      {
        "certification_request_info": {
          "version": "v1",
          "subject": {
            "common_name": "myserver.for.example"
          },
          "subject_pk_info": {
            "algorithm": {
              "algorithm": "ec",
              "parameters": "secp256r1"
            },
            "public_key": "04:40:33:c0:91:8f:e9:46:ea:d0:dc:d0:f9:63:2..."
          },
          "attributes": [
            {
              "type": "extension_request",
              "values": [
                [
                  {
                    "extn_id": "extended_key_usage",
                    "critical": false,
                    "extn_value": [
                      "server_auth"
                    ]
                  },
                  {
                    "extn_id": "subject_alt_name",
                    "critical": false,
                    "extn_value": [
                      "myserver.for.example"
                    ]
                  }
                ]
              ]
            }
          ]
        },
        "signature_algorithm": {
          "algorithm": "sha384_ecdsa",
          "parameters": null
        },
        "signature": "30:45:02:20:77:ac:5b:51:bf:c5:f5:43:02:52:ae:66:..."
      }
    ]

<a id="jc.parsers.x509_csr.parse"></a>

### parse

```python
def parse(data: Union[str, bytes],
          raw: bool = False,
          quiet: bool = False) -> List[Dict]
```

Main text parsing function

Parameters:

    data:        (string or bytes) text or binary data to parse
    raw:         (boolean)         unprocessed output if True
    quiet:       (boolean)         suppress warning messages if True

Returns:

    List of Dictionaries. Raw or processed structured data.

### Parser Information
Compatibility:  linux, darwin, cygwin, win32, aix, freebsd

Version 1.0 by Kelly Brazil (kellyjonbrazil@gmail.com)
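Since the parser accepts both PEM and DER input, its first step has to normalize the two: PEM is just base64-encoded DER wrapped in `-----BEGIN/END CERTIFICATE REQUEST-----` armor. A stdlib sketch of that normalization (an illustration, not jc's code; the legacy `NEW CERTIFICATE REQUEST` header variant is not handled here):

```python
import base64
import re

PEM_RE = re.compile(
    r'-----BEGIN CERTIFICATE REQUEST-----(.*?)-----END CERTIFICATE REQUEST-----',
    re.DOTALL,
)

def csr_to_der(data: bytes) -> bytes:
    """If `data` is a PEM CSR, strip the armor and base64-decode the body;
    otherwise assume it is already DER and return it unchanged."""
    text = data.decode('ascii', errors='ignore')
    match = PEM_RE.search(text)
    if match:
        return base64.b64decode(''.join(match.group(1).split()))
    return data
```

After this step, both input forms reduce to the same DER bytes, which an ASN.1 library (jc bundles `asn1crypto`) can decode into the structures shown in the schema.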
jc/lib.py (10 changed lines)
@@ -9,7 +9,7 @@ from .jc_types import ParserInfoType, JSONDictType
from jc import appdirs


__version__ = '1.23.2'
__version__ = '1.23.4'

parsers: List[str] = [
    'acpi',
@@ -43,6 +43,7 @@ parsers: List[str] = [
    'email-address',
    'env',
    'file',
    'find',
    'findmnt',
    'finger',
    'free',
@@ -66,6 +67,7 @@ parsers: List[str] = [
    'iostat-s',
    'ip-address',
    'iptables',
    'ip-route',
    'iso-datetime',
    'iw-scan',
    'iwconfig',
@@ -76,6 +78,7 @@ parsers: List[str] = [
    'last',
    'ls',
    'ls-s',
    'lsattr',
    'lsblk',
    'lsmod',
    'lsof',
@@ -142,6 +145,7 @@ parsers: List[str] = [
    'proc-net-packet',
    'proc-net-protocols',
    'proc-net-route',
    'proc-net-tcp',
    'proc-net-unix',
    'proc-pid-fdinfo',
    'proc-pid-io',
@@ -153,6 +157,7 @@ parsers: List[str] = [
    'proc-pid-statm',
    'proc-pid-status',
    'ps',
    'resolve-conf',
    'route',
    'rpm-qi',
    'rsync',
@@ -160,6 +165,7 @@ parsers: List[str] = [
    'semver',
    'sfdisk',
    'shadow',
    'srt',
    'ss',
    'ssh-conf',
    'sshd-conf',
@@ -193,12 +199,14 @@ parsers: List[str] = [
    'uptime',
    'url',
    'ver',
    'veracrypt',
    'vmstat',
    'vmstat-s',
    'w',
    'wc',
    'who',
    'x509-cert',
    'x509-csr',
    'xml',
    'xrandr',
    'yaml',
jc/parsers/asn1crypto/jc_global.py (new file, 1 line)
@@ -0,0 +1 @@
quiet = False
@@ -251,7 +251,18 @@ class EmailAddress(IA5String):
                self._unicode = contents.decode('cp1252')
            else:
                mailbox, hostname = contents.rsplit(b'@', 1)
                self._unicode = mailbox.decode('cp1252') + '@' + hostname.decode('idna')

                # fix to allow incorrectly encoded email addresses to succeed with warning
                try:
                    self._unicode = mailbox.decode('cp1252') + '@' + hostname.decode('idna')
                except UnicodeDecodeError:
                    ascii_mailbox = mailbox.decode('ascii', errors='backslashreplace')
                    ascii_hostname = hostname.decode('ascii', errors='backslashreplace')
                    from jc.utils import warning_message
                    import jc.parsers.asn1crypto.jc_global as jc_global
                    if not jc_global.quiet:
                        warning_message([f'Invalid email address found: {ascii_mailbox}@{ascii_hostname}'])
                    self._unicode = ascii_mailbox + '@' + ascii_hostname
        return self._unicode

    def __ne__(self, other):
@@ -31,6 +31,7 @@ a controller and a device but there might be fields corresponding to one entity.
        "name": string,
        "is_default": boolean,
        "is_public": boolean,
        "is_random": boolean,
        "address": string,
        "alias": string,
        "class": string,
@@ -49,8 +50,10 @@ a controller and a device but there might be fields corresponding to one entity.
    {
        "name": string,
        "is_public": boolean,
        "is_random": boolean,
        "address": string,
        "alias": string,
        "appearance": string,
        "class": string,
        "icon": string,
        "paired": string,
@@ -61,7 +64,8 @@ a controller and a device but there might be fields corresponding to one entity.
        "legacy_pairing": string,
        "rssi": int,
        "txpower": int,
        "uuids": array
        "uuids": array,
        "modalias": string
    }
]

@@ -104,12 +108,12 @@ import jc.utils

class info():
    """Provides parser metadata (version, author, etc.)"""
    version = '1.0'
    version = '1.1'
    description = '`bluetoothctl` command parser'
    author = 'Jake Ob'
    author_email = 'iakopap at gmail.com'
    compatible = ["linux"]
    magic_commands = ["bluetoothctl"]
    compatible = ['linux']
    magic_commands = ['bluetoothctl']
    tags = ['command']


@@ -124,6 +128,7 @@ try:
        "name": str,
        "is_default": bool,
        "is_public": bool,
        "is_random": bool,
        "address": str,
        "alias": str,
        "class": str,
@@ -141,8 +146,10 @@ try:
    {
        "name": str,
        "is_public": bool,
        "is_random": bool,
        "address": str,
        "alias": str,
        "appearance": str,
        "class": str,
        "icon": str,
        "paired": str,
@@ -154,6 +161,7 @@ try:
        "rssi": int,
        "txpower": int,
        "uuids": List[str],
        "modalias": str
    },
)
except ImportError:
@@ -195,6 +203,7 @@ def _parse_controller(next_lines: List[str]) -> Optional[Controller]:
        "name": '',
        "is_default": False,
        "is_public": False,
        "is_random": False,
        "address": matches["address"],
        "alias": '',
        "class": '',
@@ -210,10 +219,12 @@ def _parse_controller(next_lines: List[str]) -> Optional[Controller]:
    if name.endswith("[default]"):
        controller["is_default"] = True
        name = name.replace("[default]", "")

    if name.endswith("(public)"):
    elif name.endswith("(public)"):
        controller["is_public"] = True
        name = name.replace("(public)", "")
    elif name.endswith("(random)"):
        controller["is_random"] = True
        name = name.replace("(random)", "")

    controller["name"] = name.strip()

@@ -257,6 +268,7 @@ _device_head_pattern = r"Device (?P<address>([0-9A-F]{2}:){5}[0-9A-F]{2}) (?P<na
_device_line_pattern = (
    r"(\s*Name:\s*(?P<name>.+)"
    + r"|\s*Alias:\s*(?P<alias>.+)"
    + r"|\s*Appearance:\s*(?P<appearance>.+)"
    + r"|\s*Class:\s*(?P<class>.+)"
    + r"|\s*Icon:\s*(?P<icon>.+)"
    + r"|\s*Paired:\s*(?P<paired>.+)"
@@ -290,8 +302,10 @@ def _parse_device(next_lines: List[str], quiet: bool) -> Optional[Device]:
    device: Device = {
        "name": '',
        "is_public": False,
        "is_random": False,
        "address": matches["address"],
        "alias": '',
        "appearance": '',
        "class": '',
        "icon": '',
        "paired": '',
@@ -303,11 +317,15 @@ def _parse_device(next_lines: List[str], quiet: bool) -> Optional[Device]:
        "rssi": 0,
        "txpower": 0,
        "uuids": [],
        "modalias": ''
    }

    if name.endswith("(public)"):
        device["is_public"] = True
        name = name.replace("(public)", "")
    elif name.endswith("(random)"):
        device["is_random"] = True
        name = name.replace("(random)", "")

    device["name"] = name.strip()

@@ -325,6 +343,8 @@ def _parse_device(next_lines: List[str], quiet: bool) -> Optional[Device]:
|
||||
device["name"] = matches["name"]
|
||||
elif matches["alias"]:
|
||||
device["alias"] = matches["alias"]
|
||||
elif matches["appearance"]:
|
||||
device["appearance"] = matches["appearance"]
|
||||
elif matches["class"]:
|
||||
device["class"] = matches["class"]
|
||||
elif matches["icon"]:
|
||||
@@ -359,6 +379,8 @@ def _parse_device(next_lines: List[str], quiet: bool) -> Optional[Device]:
|
||||
if not "uuids" in device:
|
||||
device["uuids"] = []
|
||||
device["uuids"].append(matches["uuid"])
|
||||
elif matches["modalias"]:
|
||||
device["modalias"] = matches["modalias"]
|
||||
|
||||
return device
|
||||
|
||||
@@ -376,12 +398,11 @@ def parse(data: str, raw: bool = False, quiet: bool = False) -> List[JSONDictTyp
|
||||
|
||||
List of Dictionaries. Raw or processed structured data.
|
||||
"""
|
||||
jc.utils.compatibility(__name__, info.compatible, quiet)
|
||||
jc.utils.input_type_check(data)
|
||||
result: List = []
|
||||
|
||||
if jc.utils.has_data(data):
|
||||
jc.utils.compatibility(__name__, info.compatible, quiet)
|
||||
jc.utils.input_type_check(data)
|
||||
|
||||
linedata = data.splitlines()
|
||||
linedata.reverse()
|
||||
|
||||
|
||||
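The controller branch above strips the `[default]`, `(public)`, and `(random)` suffixes from the name and records them as flags. A minimal standalone sketch of that suffix handling (the helper name and flag keys are illustrative, mirroring the parser's schema):

```python
# Sketch of the name-suffix handling added to _parse_controller in this diff:
# [default], (public), and (random) suffixes become boolean flags.
def parse_controller_name(name: str) -> dict:
    result = {"is_default": False, "is_public": False, "is_random": False}
    if name.endswith("[default]"):
        result["is_default"] = True
        name = name.replace("[default]", "")
    elif name.endswith("(public)"):
        result["is_public"] = True
        name = name.replace("(public)", "")
    elif name.endswith("(random)"):
        result["is_random"] = True
        name = name.replace("(random)", "")
    result["name"] = name.strip()
    return result
```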
@@ -130,6 +130,7 @@ Examples:
    }
}
"""
import re
from typing import List, Dict
from jc.jc_types import JSONDictType
import jc.utils
@@ -137,7 +138,7 @@ import jc.utils

class info():
    """Provides parser metadata (version, author, etc.)"""
    version = '1.0'
    version = '1.2'
    description = '`certbot` command parser'
    author = 'Kelly Brazil'
    author_email = 'kellyjonbrazil@gmail.com'
@@ -200,7 +201,9 @@ def parse(

    if jc.utils.has_data(data):

        if 'Found the following certs:\n' in data:
        cert_pattern = re.compile(r'^Found the following certs:\r?$', re.MULTILINE)

        if re.search(cert_pattern, data):
            cmd_option = 'certificates'
        else:
            cmd_option = 'account'

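The certbot fix replaces a plain substring test with a multiline regex so that output using CRLF line endings is still detected. A small demonstration of why `\r?$` with `re.MULTILINE` covers both cases (the sample output strings are illustrative):

```python
import re

# The regex from the diff: '$' matches before each '\n' under re.MULTILINE,
# and the optional '\r' absorbs a DOS-style line ending.
cert_pattern = re.compile(r'^Found the following certs:\r?$', re.MULTILINE)

# Hypothetical certbot output with Unix and DOS line endings.
unix_output = 'Saving debug log\nFound the following certs:\n  Certificate Name: example.com\n'
dos_output = unix_output.replace('\n', '\r\n')
```
The old test `'Found the following certs:\n' in data` would miss `dos_output`, because the line there ends in `\r\n`.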
@@ -4,6 +4,7 @@ Options supported:
- `+noall +answer` options are supported in cases where only the answer
  information is desired.
- `+axfr` option is supported on its own
- `+nsid` option is supported

The `when_epoch` calculated timestamp field is naive. (i.e. based on the
local time of the system the parser is run on)
@@ -322,7 +323,7 @@ import jc.utils

class info():
    """Provides parser metadata (version, author, etc.)"""
    version = '2.4'
    version = '2.5'
    description = '`dig` command parser'
    author = 'Kelly Brazil'
    author_email = 'kellyjonbrazil@gmail.com'
@@ -427,6 +428,7 @@ def _parse_flags_line(flagsline):
def _parse_opt_pseudosection(optline):
    # ;; OPT PSEUDOSECTION:
    # ; EDNS: version: 0, flags:; udp: 4096
    # ; NSID: 67 70 64 6e 73 2d 73 66 6f ("gpdns-sfo")
    # ; COOKIE: 1cbc06703eaef210
    if optline.startswith('; EDNS:'):
        optline_list = optline.replace(',', ' ').split(';')
@@ -443,11 +445,18 @@ def _parse_opt_pseudosection(optline):
            }
        }

    elif optline.startswith('; COOKIE:'):
    if optline.startswith('; COOKIE:'):
        return {
            'cookie': optline.split()[2]
        }

    if optline.startswith('; NSID:'):
        return {
            'nsid': optline.split('("')[-1].rstrip('")')
        }

    return {}


def _parse_question(question):
    # ;www.cnn.com.			IN	A

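The new NSID branch pulls the human-readable server ID out of the quoted part of the OPT pseudosection line. A standalone sketch of that extraction, using the sample line from the diff's comment:

```python
# Sketch of the NSID extraction added in this diff: split on '("' to isolate
# the decoded ID, then strip the trailing quote and parenthesis.
def parse_nsid(optline: str) -> dict:
    if optline.startswith('; NSID:'):
        return {'nsid': optline.split('("')[-1].rstrip('")')}
    return {}

line = '; NSID: 67 70 64 6e 73 2d 73 66 6f ("gpdns-sfo")'
```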
jc/parsers/find.py (new file, 137 lines)
@@ -0,0 +1,137 @@
"""jc - JSON Convert `find` command output parser

This parser returns a list of objects by default and a list of strings if
the `--raw` option is used.

Usage (cli):

    $ find | jc --find

Usage (module):

    import jc
    result = jc.parse('find', find_command_output)

Schema:

    [
      {
        "path":     string,
        "node":     string,
        "error":    string
      }
    ]

Examples:

    $ find | jc --find -p
    [
      {
        "path": "./directory",
        "node": "filename"
      },
      {
        "path": "./anotherdirectory",
        "node": "anotherfile"
      },
      {
        "path": null,
        "node": null,
        "error": "find: './inaccessible': Permission denied"
      }
      ...
    ]

    $ find | jc --find -p -r
    [
      "./templates/readme_template",
      "./templates/manpage_template",
      "./.github/workflows/pythonapp.yml",
      ...
    ]
"""
import jc.utils


class info():
    """Provides parser metadata (version, author, etc.)"""
    version = '1.0'
    description = '`find` command parser'
    author = 'Solomon Leang'
    author_email = 'solomonleang@gmail.com'
    compatible = ['linux']
    tags = ['command']


__version__ = info.version


def _process(proc_data):
    """
    Final processing to conform to the schema.

    Parameters:

        proc_data: (List of Strings) raw structured data to process

    Returns:

        List of Dictionaries. Structured data to conform to the schema.
    """
    processed = []

    for index in proc_data:
        path, node, error = "", "", ""

        if index == ".":
            node = "."
        elif index.startswith('find: '):
            error = index
        else:
            try:
                path, node = index.rsplit('/', maxsplit=1)
            except ValueError:
                pass

        proc_line = {
            'path': path if path else None,
            'node': node if node else None
        }

        if error:
            proc_line.update(
                {'error': error}
            )

        processed.append(proc_line)

    return processed


def parse(data, raw=False, quiet=False):
    """
    Main text parsing function

    Parameters:

        data:    (string)  text data to parse
        raw:     (boolean) unprocessed output if True
        quiet:   (boolean) suppress warning messages if True

    Returns:

        List of raw strings or
        List of Dictionaries of processed structured data
    """
    jc.utils.compatibility(__name__, info.compatible, quiet)
    jc.utils.input_type_check(data)

    raw_output = []

    if jc.utils.has_data(data):
        raw_output = data.splitlines()

    if raw:
        return raw_output
    else:
        return _process(raw_output)

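The core of the find parser is the `rsplit('/', maxsplit=1)` split: everything before the last `/` is the path, the rest is the node, and lines with no `/` or a `find:` error prefix take separate branches. A compact sketch of that per-line logic (helper name is illustrative):

```python
# Sketch of find.py's per-line processing: split on the last '/', pass
# error lines through, and map empty fields to None.
def split_entry(entry: str) -> dict:
    path, node, error = '', '', ''
    if entry == '.':
        node = '.'
    elif entry.startswith('find: '):
        error = entry
    else:
        try:
            path, node = entry.rsplit('/', maxsplit=1)
        except ValueError:
            pass  # no '/' in the line: both fields stay empty -> None

    out = {'path': path or None, 'node': node or None}
    if error:
        out['error'] = error
    return out
```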
jc/parsers/ip_route.py (new file, 145 lines)
@@ -0,0 +1,145 @@
"""jc - JSON Convert `ip route` command output parser

Usage (cli):

    $ ip route | jc --ip-route

or

    $ jc ip-route

Usage (module):

    import jc
    result = jc.parse('ip_route', ip_route_command_output)

Schema:

    [
      {
        "ip":      string,
        "via":     string,
        "dev":     string,
        "metric":  integer,
        "proto":   string,
        "scope":   string,
        "src":     string,
        "status":  string
      }
    ]

Examples:

    $ ip route | jc --ip-route -p
    [
      {
        "ip": "10.0.2.0/24",
        "dev": "enp0s3",
        "proto": "kernel",
        "scope": "link",
        "src": "10.0.2.15",
        "metric": 100
      }
    ]
"""
from typing import Dict

import jc.utils


class info:
    """Provides parser metadata (version, author, etc.)"""
    version = '1.0'
    description = '`ip route` command parser'
    author = 'Julian Jackson'
    author_email = 'jackson.julian55@yahoo.com'
    compatible = ['linux']
    magic_commands = ['ip route']
    tags = ['command']


__version__ = info.version


def parse(data, raw=False, quiet=False):
    """
    Main text parsing function

    Parameters:

        data:    (string)  text data to parse
        raw:     (boolean) unprocessed output if True
        quiet:   (boolean) suppress warning messages if True

    Returns:

        List of JSON objects if data is processed and raw data if raw = True.
    """
    structure = {}
    items = []
    lines = data.splitlines()
    index = 0
    place = 0
    inc = 0

    for line in lines:
        temp = line.split()
        for word in temp:
            if word == 'via':
                y = {'via': temp[place + 1]}
                place += 1
                structure.update(y)
            elif word == 'dev':
                y = {'dev': temp[place + 1]}
                place += 1
                structure.update(y)
            elif word == 'metric':
                if raw:
                    y = {'metric': temp[place + 1]}
                else:
                    y = {'metric': jc.utils.convert_to_int(temp[place + 1])}
                place += 1
                structure.update(y)
            elif word == 'proto':
                y = {'proto': temp[place + 1]}
                place += 1
                structure.update(y)
            elif word == 'scope':
                y = {'scope': temp[place + 1]}
                place += 1
                structure.update(y)
            elif word == 'src':
                y = {'src': temp[place + 1]}
                place += 1
                structure.update(y)
            elif word == 'status':
                y = {'status': temp[place + 1]}
                place += 1
                structure.update(y)
            elif word == 'default':
                y = {'ip': 'default'}
                place += 1
                structure.update(y)
            elif word == 'linkdown':
                y = {'status': 'linkdown'}
                place += 1
                structure.update(y)
            else:
                y = {'ip': temp[0]}
                place += 1
                structure.update(y)
        if y.get("ip") != "":
            items.append(structure)
        structure = {}
        place = 0
        index += 1
        inc += 1

    jc.utils.compatibility(__name__, info.compatible, quiet)
    jc.utils.input_type_check(data)

    if not jc.utils.has_data(data):
        return []

    return items

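Conceptually, an `ip route` line is a destination followed by keyword/value pairs (`via`, `dev`, `proto`, `scope`, `src`, `metric`) plus bare flags like `linkdown`, which is what the parser above maps into its schema. A simplified sketch of that tokenization, not the parser's exact code:

```python
# Simplified sketch of tokenizing one `ip route` line into the schema's
# key/value pairs. Keyword set and int conversion mirror the schema above.
KEYWORDS = {'via', 'dev', 'proto', 'scope', 'src', 'metric'}

def parse_route_line(line: str) -> dict:
    tokens = line.split()
    route = {'ip': tokens[0]}  # destination: a prefix or 'default'
    i = 1
    while i < len(tokens):
        if tokens[i] in KEYWORDS:
            value = tokens[i + 1]
            route[tokens[i]] = int(value) if tokens[i] == 'metric' else value
            i += 2
        elif tokens[i] == 'linkdown':
            route['status'] = 'linkdown'
            i += 1
        else:
            i += 1  # skip tokens this sketch does not model
    return route
```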
@@ -1,6 +1,6 @@
"""jc - JSON Convert `last` and `lastb` command output parser

Supports `-w` and `-F` options.
Supports `-w`, `-F`, and `-x` options.

Calculated epoch time fields are naive (i.e. based on the local time of the
system the parser is run on) since there is no timezone information in the
@@ -103,10 +103,15 @@ Examples:
import re
import jc.utils

DATE_RE = re.compile(r'[MTWFS][ouerha][nedritnu] [JFMASOND][aepuco][nbrynlgptvc]')
LAST_F_DATE_RE = re.compile(r'\d\d:\d\d:\d\d \d\d\d\d')
LOGIN_LOGOUT_EPOCH_RE = re.compile(r'.*\d\d:\d\d:\d\d \d\d\d\d.*')
LOGOUT_IGNORED_EVENTS = ['down', 'crash']


class info():
    """Provides parser metadata (version, author, etc.)"""
    version = '1.8'
    version = '1.9'
    description = '`last` and `lastb` command parser'
    author = 'Kelly Brazil'
    author_email = 'kellyjonbrazil@gmail.com'
@@ -138,9 +143,6 @@ def _process(proc_data):
        if 'tty' in entry and entry['tty'] == '~':
            entry['tty'] = None

        if 'tty' in entry and entry['tty'] == 'system_boot':
            entry['tty'] = 'system boot'

        if 'hostname' in entry and entry['hostname'] == '-':
            entry['hostname'] = None

@@ -153,11 +155,11 @@ def _process(proc_data):
        if 'logout' in entry and entry['logout'] == 'gone_-_no_logout':
            entry['logout'] = 'gone - no logout'

        if 'login' in entry and re.match(r'.*\d\d:\d\d:\d\d \d\d\d\d.*', entry['login']):
        if 'login' in entry and LOGIN_LOGOUT_EPOCH_RE.match(entry['login']):
            timestamp = jc.utils.timestamp(entry['login'])
            entry['login_epoch'] = timestamp.naive

        if 'logout' in entry and re.match(r'.*\d\d:\d\d:\d\d \d\d\d\d.*', entry['logout']):
        if 'logout' in entry and LOGIN_LOGOUT_EPOCH_RE.match(entry['logout']):
            timestamp = jc.utils.timestamp(entry['logout'])
            entry['logout_epoch'] = timestamp.naive

@@ -194,66 +196,71 @@ def parse(data, raw=False, quiet=False):
    # Clear any blank lines
    cleandata = list(filter(None, data.splitlines()))

    if jc.utils.has_data(data):
    if not jc.utils.has_data(data):
        return []

        for entry in cleandata:
            output_line = {}
    for entry in cleandata:
        output_line = {}

            if (entry.startswith('wtmp begins ') or
                entry.startswith('btmp begins ') or
                entry.startswith('utx.log begins ')):
        if any(
            entry.startswith(f'{prefix} begins ')
            for prefix in ['wtmp', 'btmp', 'utx.log']
        ):
            continue

                continue
        entry = entry.replace('boot time', 'boot_time')
        entry = entry.replace(' still logged in', '- still_logged_in')
        entry = entry.replace(' gone - no logout', '- gone_-_no_logout')

            entry = entry.replace('system boot', 'system_boot')
            entry = entry.replace('boot time', 'boot_time')
            entry = entry.replace(' still logged in', '- still_logged_in')
            entry = entry.replace(' gone - no logout', '- gone_-_no_logout')
        linedata = entry.split()

            linedata = entry.split()
            if re.match(r'[MTWFS][ouerha][nedritnu] [JFMASOND][aepuco][nbrynlgptvc]', ' '.join(linedata[2:4])):
                linedata.insert(2, '-')
        # Adding "-" before the date part.
        if DATE_RE.match(' '.join(linedata[2:4])):
            linedata.insert(2, '-')

            # freebsd fix
            if linedata[0] == 'boot_time':
                linedata.insert(1, '-')
                linedata.insert(1, '~')
        # freebsd fix
        if linedata[0] == 'boot_time':
            linedata.insert(1, '-')
            linedata.insert(1, '~')

            output_line['user'] = linedata[0]
            output_line['tty'] = linedata[1]
            output_line['hostname'] = linedata[2]
        output_line['user'] = linedata[0]

            # last -F support
            if re.match(r'\d\d:\d\d:\d\d \d\d\d\d', ' '.join(linedata[6:8])):
                output_line['login'] = ' '.join(linedata[3:8])
        # Fix for last -x (runlevel).
        if output_line['user'] == 'runlevel' and linedata[1] == '(to':
            linedata[1] += f' {linedata.pop(2)} {linedata.pop(2)}'
        elif output_line['user'] in ['reboot', 'shutdown'] and linedata[1] == 'system':  # system down\system boot
            linedata[1] += f' {linedata.pop(2)}'

                if len(linedata) > 9 and linedata[9] != 'crash' and linedata[9] != 'down':
        output_line['tty'] = linedata[1]
        output_line['hostname'] = linedata[2]

        # last -F support
        if LAST_F_DATE_RE.match(' '.join(linedata[6:8])):
            output_line['login'] = ' '.join(linedata[3:8])

            if len(linedata) > 9:
                if linedata[9] not in LOGOUT_IGNORED_EVENTS:
                    output_line['logout'] = ' '.join(linedata[9:14])

                if len(linedata) > 9 and (linedata[9] == 'crash' or linedata[9] == 'down'):
                else:
                    output_line['logout'] = linedata[9]
                    # add more items to the list to line up duration
                    linedata.insert(10, '-')
                    linedata.insert(10, '-')
                    linedata.insert(10, '-')
                    linedata.insert(10, '-')
                    for _ in range(4):
                        linedata.insert(10, '-')

                if len(linedata) > 14:
                    output_line['duration'] = linedata[14].replace('(', '').replace(')', '')
            if len(linedata) > 14:
                output_line['duration'] = linedata[14].replace('(', '').replace(')', '')
        else:  # normal last support
            output_line['login'] = ' '.join(linedata[3:7])

            # normal last support
            else:
                output_line['login'] = ' '.join(linedata[3:7])
            if len(linedata) > 8:
                output_line['logout'] = linedata[8]

                if len(linedata) > 8:
                    output_line['logout'] = linedata[8]
            if len(linedata) > 9:
                output_line['duration'] = linedata[9].replace('(', '').replace(')', '')

                if len(linedata) > 9:
                    output_line['duration'] = linedata[9].replace('(', '').replace(')', '')

            raw_output.append(output_line)
        raw_output.append(output_line)

    if raw:
        return raw_output
    else:
        return _process(raw_output)

    return _process(raw_output)

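The hoisted `DATE_RE` uses character classes to match abbreviated English day and month names ("Mon Jan", "Sat Dec", ...) without spelling out every combination. A quick check of what it accepts and rejects:

```python
import re

# DATE_RE from the diff: first char is a day initial, next two come from the
# remaining letters of day abbreviations; same idea for the month.
DATE_RE = re.compile(r'[MTWFS][ouerha][nedritnu] [JFMASOND][aepuco][nbrynlgptvc]')
```
Note the classes are loose: they accept every real `Day Mon` pair but also some non-words, which is acceptable here because the field position in `last` output is already constrained.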
jc/parsers/lsattr.py (new file, 162 lines)
@@ -0,0 +1,162 @@
"""jc - JSON Convert `lsattr` command output parser

Usage (cli):

    $ lsattr | jc --lsattr

or

    $ jc lsattr

Usage (module):

    import jc
    result = jc.parse('lsattr', lsattr_command_output)

Schema:

Information from https://github.com/mirror/busybox/blob/2d4a3d9e6c1493a9520b907e07a41aca90cdfd94/e2fsprogs/e2fs_lib.c#L40
used to define field names

    [
      {
        "file":                           string,
        "compressed_file":                Optional[boolean],
        "compressed_dirty_file":          Optional[boolean],
        "compression_raw_access":         Optional[boolean],
        "secure_deletion":                Optional[boolean],
        "undelete":                       Optional[boolean],
        "synchronous_updates":            Optional[boolean],
        "synchronous_directory_updates":  Optional[boolean],
        "immutable":                      Optional[boolean],
        "append_only":                    Optional[boolean],
        "no_dump":                        Optional[boolean],
        "no_atime":                       Optional[boolean],
        "compression_requested":          Optional[boolean],
        "encrypted":                      Optional[boolean],
        "journaled_data":                 Optional[boolean],
        "indexed_directory":              Optional[boolean],
        "no_tailmerging":                 Optional[boolean],
        "top_of_directory_hierarchies":   Optional[boolean],
        "extents":                        Optional[boolean],
        "no_cow":                         Optional[boolean],
        "casefold":                       Optional[boolean],
        "inline_data":                    Optional[boolean],
        "project_hierarchy":              Optional[boolean],
        "verity":                         Optional[boolean]
      }
    ]

Examples:

    $ sudo lsattr /etc/passwd | jc --lsattr
    [
      {
        "file": "/etc/passwd",
        "extents": true
      }
    ]
"""
from typing import List, Dict
from jc.jc_types import JSONDictType
import jc.utils


class info():
    """Provides parser metadata (version, author, etc.)"""
    version = '1.0'
    description = '`lsattr` command parser'
    author = 'Mark Rotner'
    author_email = 'rotner.mr@gmail.com'
    compatible = ['linux']
    magic_commands = ['lsattr']
    tags = ['command']


__version__ = info.version


ERROR_PREFIX = "lsattr:"

# https://github.com/mirror/busybox/blob/2d4a3d9e6c1493a9520b907e07a41aca90cdfd94/e2fsprogs/e2fs_lib.c#L40
# https://github.com/landley/toybox/blob/f1682dc79fd75f64042b5438918fe5a507977e1c/toys/other/lsattr.c#L97
ATTRIBUTES = {
    "B": "compressed_file",
    "Z": "compressed_dirty_file",
    "X": "compression_raw_access",
    "s": "secure_deletion",
    "u": "undelete",
    "S": "synchronous_updates",
    "D": "synchronous_directory_updates",
    "i": "immutable",
    "a": "append_only",
    "d": "no_dump",
    "A": "no_atime",
    "c": "compression_requested",
    "E": "encrypted",
    "j": "journaled_data",
    "I": "indexed_directory",
    "t": "no_tailmerging",
    "T": "top_of_directory_hierarchies",
    "e": "extents",
    "C": "no_cow",
    "F": "casefold",
    "N": "inline_data",
    "P": "project_hierarchy",
    "V": "verity",
}


def parse(
    data: str,
    raw: bool = False,
    quiet: bool = False
) -> List[JSONDictType]:
    """
    Main text parsing function

    Parameters:

        data:    (string)  text data to parse
        quiet:   (boolean) suppress warning messages if True

    Returns:

        List of Dictionaries. Raw or processed structured data.
    """
    jc.utils.compatibility(__name__, info.compatible, quiet)
    jc.utils.input_type_check(data)

    output: List = []

    cleandata = list(filter(None, data.splitlines()))

    if not jc.utils.has_data(data):
        return output

    for line in cleandata:
        # -R flag returns the output in the format:
        # Folder:
        # attributes file_in_folder
        if line.endswith(':'):
            continue

        # lsattr: Operation not supported ....
        if line.startswith(ERROR_PREFIX):
            continue

        line_output: Dict = {}

        # attributes file
        # --------------e----- /etc/passwd
        attributes, file = line.split()
        line_output['file'] = file
        for attribute in list(attributes):
            attribute_key = ATTRIBUTES.get(attribute)
            if attribute_key:
                line_output[attribute_key] = True

        if line_output:
            output.append(line_output)

    return output

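The lsattr parser walks each character of the flag column and looks it up in the `ATTRIBUTES` table; `-` placeholders simply miss the lookup and are skipped. A minimal sketch of that per-line mapping, using a subset of the table:

```python
# Sketch of lsattr.py's flag mapping: each set flag character becomes a
# boolean field. Only a subset of the ATTRIBUTES table is shown here.
ATTRIBUTES = {'e': 'extents', 'i': 'immutable', 'a': 'append_only'}

def parse_lsattr_line(line: str) -> dict:
    attributes, filename = line.split()
    out = {'file': filename}
    for ch in attributes:
        key = ATTRIBUTES.get(ch)  # '-' and unknown flags return None
        if key:
            out[key] = True
    return out
```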
@@ -355,17 +355,18 @@ import jc.utils

class info():
    """Provides parser metadata (version, author, etc.)"""
    version = '1.13'
    version = '1.14'
    description = '`netstat` command parser'
    author = 'Kelly Brazil'
    author_email = 'kellyjonbrazil@gmail.com'
    compatible = ['linux', 'darwin', 'freebsd']
    compatible = ['linux', 'darwin', 'freebsd', 'win32']
    magic_commands = ['netstat']
    tags = ['command']


__version__ = info.version

WINDOWS_NETSTAT_HEADER = "Active Connections"

def _process(proc_data):
    """
@@ -450,9 +451,10 @@ def parse(data, raw=False, quiet=False):

        import jc.parsers.netstat_freebsd_osx
        raw_output = jc.parsers.netstat_freebsd_osx.parse(cleandata)

    # use linux parser
    else:
    elif cleandata[0] == WINDOWS_NETSTAT_HEADER:  # use windows parser.
        import jc.parsers.netstat_windows
        raw_output = jc.parsers.netstat_windows.parse(cleandata)
    else:  # use linux parser.
        import jc.parsers.netstat_linux
        raw_output = jc.parsers.netstat_linux.parse(cleandata)

jc/parsers/netstat_windows.py (new file, 75 lines)
@@ -0,0 +1,75 @@
"""
jc - JSON Convert Windows `netstat` command output parser
"""

from typing import Dict, List

POSSIBLE_PROTOCOLS = ("TCP", "UDP", "TCPv6", "UDPv6")


def normalize_headers(headers: str):
    """
    Normalizes the headers to match the jc netstat parser style
    (local_address -> local_address, local_port...).
    """
    headers = headers.lower().strip()
    headers = headers.replace("local address", "local_address")
    headers = headers.replace("foreign address", "foreign_address")
    return headers.split()


def parse(cleandata: List[str]):
    """
    Main text parsing function for Windows netstat

    Parameters:

        cleandata: (string) text data to parse

    Returns:

        List of Dictionaries. Raw structured data.
    """
    raw_output = []
    cleandata.pop(0)  # Removing the "Active Connections" header.
    headers = normalize_headers(cleandata.pop(0))

    for line in cleandata:
        line = line.strip()

        if not line.startswith(POSSIBLE_PROTOCOLS):  # -b.
            line_data = raw_output.pop(len(raw_output) - 1)
            line_data['program_name'] = line
            raw_output.append(line_data)
            continue

        line_data = line.split()
        line_data: Dict[str, str] = dict(zip(headers, line_data))
        for key in list(line_data.keys()):
            if key == "local_address":
                local_address, local_port = line_data[key].rsplit(
                    ":", maxsplit=1)
                line_data["local_address"] = local_address
                line_data["local_port"] = local_port
                continue

            if key == "foreign_address":
                foreign_address, foreign_port = line_data[key].rsplit(
                    ":", maxsplit=1)
                line_data["foreign_address"] = foreign_address
                line_data["foreign_port"] = foreign_port
                continue

            # There is no state in UDP, so the data after the "state" header will leak.
            if key == "proto" and "state" in headers and line_data["proto"] == "UDP":
                next_header = headers.index("state") + 1
                if len(headers) > next_header:
                    next_header = headers[next_header]
                    line_data[next_header] = line_data["state"]
                    line_data["state"] = ''

        raw_output.append(line_data)

    return raw_output

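The Windows parser's row handling boils down to normalizing the header line and zipping it against each data row, then splitting `address:port` on the last colon (which also keeps IPv6 addresses intact). A small demonstration with an illustrative sample row:

```python
# Header normalization plus zip() row mapping, as in netstat_windows.py.
def normalize_headers(headers: str):
    headers = headers.lower().strip()
    headers = headers.replace('local address', 'local_address')
    headers = headers.replace('foreign address', 'foreign_address')
    return headers.split()

# Hypothetical Windows netstat header and row.
headers = normalize_headers('  Proto  Local Address          Foreign Address        State')
row = dict(zip(headers, 'TCP 127.0.0.1:49709 127.0.0.1:49710 ESTABLISHED'.split()))

# rsplit on the last ':' separates the port without breaking IPv6 addresses.
local_address, local_port = row['local_address'].rsplit(':', maxsplit=1)
```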
@@ -164,7 +164,7 @@ import jc.utils

class info():
    """Provides parser metadata (version, author, etc.)"""
    version = '1.8'
    version = '1.9'
    description = '`ping` and `ping6` command parser'
    author = 'Kelly Brazil'
    author_email = 'kellyjonbrazil@gmail.com'
@@ -284,6 +284,8 @@ def _linux_parse(data):

    if ipv4 and linedata[0][5] not in string.digits:
        hostname = True
        # fixup for missing hostname
        linedata[0] = linedata[0][:5] + 'nohost' + linedata[0][5:]
    elif ipv4 and linedata[0][5] in string.digits:
        hostname = False
    elif not ipv4 and ' (' in linedata[0]:
@@ -314,7 +316,8 @@ def _linux_parse(data):

        if line.startswith('---'):
            footer = True
            raw_output['destination'] = line.split()[1]
            if line[4] != ' ':  # fixup for missing hostname
                raw_output['destination'] = line.split()[1]
            continue

        if footer:

@@ -85,7 +85,7 @@ from jc.exceptions import ParseError

class info():
    """Provides parser metadata (version, author, etc.)"""
    version = '1.2'
    version = '1.3'
    description = '`ping` and `ping6` command streaming parser'
    author = 'Kelly Brazil'
    author_email = 'kellyjonbrazil@gmail.com'
@@ -340,6 +340,8 @@ def _linux_parse(line, s):

    if s.ipv4 and line[5] not in string.digits:
        s.hostname = True
        # fixup for missing hostname
        line = line[:5] + 'nohost' + line[5:]
    elif s.ipv4 and line[5] in string.digits:
        s.hostname = False
    elif not s.ipv4 and ' (' in line:

@@ -194,6 +194,7 @@ def parse(
    net_packet_p = re.compile(r'^sk RefCnt Type Proto Iface R Rmem User Inode\n')
    net_protocols_p = re.compile(r'^protocol size sockets memory press maxhdr slab module cl co di ac io in de sh ss gs se re sp bi br ha uh gp em\n')
    net_route_p = re.compile(r'^Iface\tDestination\tGateway \tFlags\tRefCnt\tUse\tMetric\tMask\t\tMTU\tWindow\tIRTT\s+\n')
    net_tcp_p = re.compile(r'^\s+sl\s+local_address\s+(?:rem_address|remote_address)\s+st\s+tx_queue\s+rx_queue\s+tr\s+tm->when\s+retrnsmt\s+uid\s+timeout\s+inode')
    net_unix_p = re.compile(r'^Num RefCount Protocol Flags Type St Inode Path\n')

    pid_fdinfo_p = re.compile(r'^pos:\t\d+\nflags:\t\d+\nmnt_id:\t\d+\n')
@@ -249,6 +250,7 @@ def parse(
        net_packet_p: 'proc_net_packet',
        net_protocols_p: 'proc_net_protocols',
        net_route_p: 'proc_net_route',
        net_tcp_p: 'proc_net_tcp',
        net_unix_p: 'proc_net_unix',
        net_ipv6_route_p: 'proc_net_ipv6_route',  # before net_dev_mcast
        net_dev_mcast_p: 'proc_net_dev_mcast',  # after net_ipv6_route

jc/parsers/proc_net_tcp.py (new file, 293 lines)
@@ -0,0 +1,293 @@
"""jc - JSON Convert `/proc/net/tcp` and `/proc/net/tcp6` file parser

IPv4 and IPv6 addresses are converted to standard notation unless the raw
(--raw) option is used.

Usage (cli):

    $ cat /proc/net/tcp | jc --proc

or

    $ jc /proc/net/tcp

or

    $ cat /proc/net/tcp | jc --proc-net-tcp

Usage (module):

    import jc
    result = jc.parse('proc', proc_net_tcp_file)

or

    import jc
    result = jc.parse('proc_net_tcp', proc_net_tcp_file)

Schema:

Field names and types gathered from the following:

    https://www.kernel.org/doc/Documentation/networking/proc_net_tcp.txt

    https://github.com/torvalds/linux/blob/master/net/ipv4/tcp_ipv4.c

    https://github.com/torvalds/linux/blob/master/net/ipv6/tcp_ipv6.c

    [
      {
        "entry":                        integer,
        "local_address":                string,
        "local_port":                   integer,
        "remote_address":               string,
        "remote_port":                  integer,
        "state":                        string,
        "tx_queue":                     string,
        "rx_queue":                     string,
        "timer_active":                 integer,
        "jiffies_until_timer_expires":  string,
        "unrecovered_rto_timeouts":     string,
        "uid":                          integer,
        "unanswered_0_window_probes":   integer,
        "inode":                        integer,
        "sock_ref_count":               integer,
        "sock_mem_loc":                 string,
        "retransmit_timeout":           integer,
        "soft_clock_tick":              integer,
        "ack_quick_pingpong":           integer,
        "sending_congestion_window":    integer,
        "slow_start_size_threshold":    integer
      }
    ]

Examples:

    $ cat /proc/net/tcp | jc --proc -p
    [
      {
        "entry": "0",
        "local_address": "10.0.0.28",
        "local_port": 42082,
        "remote_address": "64.12.0.108",
        "remote_port": 80,
        "state": "04",
        "tx_queue": "00000001",
        "rx_queue": "00000000",
        "timer_active": 1,
        "jiffies_until_timer_expires": "00000015",
        "unrecovered_rto_timeouts": "00000000",
        "uid": 0,
        "unanswered_0_window_probes": 0,
        "inode": 0,
        "sock_ref_count": 3,
        "sock_mem_loc": "ffff8c7a0de930c0",
        "retransmit_timeout": 21,
        "soft_clock_tick": 4,
        "ack_quick_pingpong": 30,
        "sending_congestion_window": 10,
        "slow_start_size_threshold": -1
      },
      {
        "entry": "1",
        "local_address": "10.0.0.28",
        "local_port": 38864,
        "remote_address": "104.244.42.65",
        "remote_port": 80,
        "state": "06",
        "tx_queue": "00000000",
        "rx_queue": "00000000",
        "timer_active": 3,
        "jiffies_until_timer_expires": "000007C5",
        "unrecovered_rto_timeouts": "00000000",
        "uid": 0,
        "unanswered_0_window_probes": 0,
        "inode": 0,
        "sock_ref_count": 3,
        "sock_mem_loc": "ffff8c7a12d31aa0"
      },
      ...
    ]

    $ cat /proc/net/tcp | jc --proc -p -r
    [
      {
        "entry": "1",
        "local_address": "1C00000A",
        "local_port": "A462",
        "remote_address": "6C000C40",
        "remote_port": "0050",
        "state": "04",
        "tx_queue": "00000001",
        "rx_queue": "00000000",
        "timer_active": "01",
        "jiffies_until_timer_expires": "00000015",
        "unrecovered_rto_timeouts": "00000000",
        "uid": "0",
        "unanswered_0_window_probes": "0",
        "inode": "0",
        "sock_ref_count": "3",
        "sock_mem_loc": "ffff8c7a0de930c0",
        "retransmit_timeout": "21",
        "soft_clock_tick": "4",
        "ack_quick_pingpong": "30",
        "sending_congestion_window": "10",
        "slow_start_size_threshold": "-1"
      },
      {
        "entry": "2",
        "local_address": "1C00000A",
        "local_port": "97D0",
        "remote_address": "412AF468",
        "remote_port": "0050",
        "state": "06",
        "tx_queue": "00000000",
        "rx_queue": "00000000",
        "timer_active": "03",
        "jiffies_until_timer_expires": "000007C5",
        "unrecovered_rto_timeouts": "00000000",
|
||||
"uid": "0",
|
||||
"unanswered_0_window_probes": "0",
|
||||
"inode": "0",
|
||||
"sock_ref_count": "3",
|
||||
"sock_mem_loc": "ffff8c7a12d31aa0"
|
||||
},
|
||||
...
|
||||
]
|
||||
"""
|
||||
import binascii
|
||||
import socket
|
||||
import struct
|
||||
from typing import List, Dict
|
||||
import jc.utils
|
||||
|
||||
|
||||
class info():
|
||||
"""Provides parser metadata (version, author, etc.)"""
|
||||
version = '1.0'
|
||||
description = '`/proc/net/tcp` and `/proc/net/tcp6` file parser'
|
||||
author = 'Alvin Solomon'
|
||||
author_email = 'alvinms01@gmail.com'
|
||||
compatible = ['linux']
|
||||
tags = ['file']
|
||||
hidden = True
|
||||
|
||||
|
||||
__version__ = info.version
|
||||
|
||||
|
||||
def hex_to_ip(hexaddr: str) -> str:
|
||||
if len(hexaddr) == 8:
|
||||
addr_long = int(hexaddr, 16)
|
||||
return socket.inet_ntop(socket.AF_INET, struct.pack("<L", addr_long))
|
||||
elif len(hexaddr) == 32:
|
||||
addr = binascii.a2b_hex(hexaddr)
|
||||
addr_tup = struct.unpack('>IIII', addr)
|
||||
addr_bytes = struct.pack('@IIII', *addr_tup)
|
||||
return socket.inet_ntop(socket.AF_INET6, addr_bytes)
|
||||
|
||||
return ''
|
||||
|
||||
|
||||
def _process(proc_data: List[Dict]) -> List[Dict]:
|
||||
"""
|
||||
Final processing to conform to the schema.
|
||||
|
||||
Parameters:
|
||||
|
||||
proc_data: (List of Dictionaries) raw structured data to process
|
||||
|
||||
Returns:
|
||||
|
||||
List of Dictionaries. Structured to conform to the schema.
|
||||
"""
|
||||
int_list = {
|
||||
'timer_active', 'uid', 'unanswered_0_window_probes', 'inode',
|
||||
'sock_ref_count', 'retransmit_timeout', 'soft_clock_tick',
|
||||
'ack_quick_pingpong', 'sending_congestion_window',
|
||||
'slow_start_size_threshold'
|
||||
}
|
||||
|
||||
for entry in proc_data:
|
||||
if 'local_address' in entry:
|
||||
entry['local_address'] = hex_to_ip(entry['local_address'])
|
||||
entry['local_port'] = int(entry['local_port'], 16)
|
||||
entry['remote_address'] = hex_to_ip(entry['remote_address'])
|
||||
entry['remote_port'] = int(entry['remote_port'], 16)
|
||||
|
||||
for item in int_list:
|
||||
if item in entry:
|
||||
entry[item] = jc.utils.convert_to_int(entry[item])
|
||||
|
||||
return proc_data
|
||||
|
||||
|
||||
def parse(
|
||||
data: str,
|
||||
raw: bool = False,
|
||||
quiet: bool = False
|
||||
) -> List[Dict]:
|
||||
"""
|
||||
Main text parsing function
|
||||
|
||||
Parameters:
|
||||
|
||||
data: (string) text data to parse
|
||||
raw: (boolean) unprocessed output if True
|
||||
quiet: (boolean) suppress warning messages if True
|
||||
|
||||
Returns:
|
||||
|
||||
List of Dictionaries. Raw or processed structured data.
|
||||
"""
|
||||
jc.utils.compatibility(__name__, info.compatible, quiet)
|
||||
jc.utils.input_type_check(data)
|
||||
|
||||
raw_output: List = []
|
||||
|
||||
if jc.utils.has_data(data):
|
||||
|
||||
line_data = data.splitlines()[1:]
|
||||
|
||||
for entry in line_data:
|
||||
line = entry.split()
|
||||
output_line = {}
|
||||
output_line['entry'] = line[0][:-1]
|
||||
|
||||
local_ip_port = line[1]
|
||||
local_ip = local_ip_port.split(':')[0]
|
||||
local_port = local_ip_port.split(':')[1]
|
||||
|
||||
output_line['local_address'] = local_ip
|
||||
output_line['local_port'] = local_port
|
||||
|
||||
remote_ip_port = line[2]
|
||||
remote_ip = remote_ip_port.split(':')[0]
|
||||
remote_port = remote_ip_port.split(':')[1]
|
||||
|
||||
output_line['remote_address'] = remote_ip
|
||||
output_line['remote_port'] = remote_port
|
||||
|
||||
output_line['state'] = line[3]
|
||||
output_line['tx_queue'] = line[4][:8]
|
||||
output_line['rx_queue'] = line[4][9:]
|
||||
output_line['timer_active'] = line[5][:2]
|
||||
output_line['jiffies_until_timer_expires'] = line[5][3:]
|
||||
output_line['unrecovered_rto_timeouts'] = line[6]
|
||||
output_line['uid'] = line[7]
|
||||
output_line['unanswered_0_window_probes'] = line[8]
|
||||
output_line['inode'] = line[9]
|
||||
output_line['sock_ref_count'] = line[10]
|
||||
output_line['sock_mem_loc'] = line[11]
|
||||
|
||||
# fields not always included
|
||||
if len(line) > 12:
|
||||
output_line['retransmit_timeout'] = line[12]
|
||||
output_line['soft_clock_tick'] = line[13]
|
||||
output_line['ack_quick_pingpong'] = line[14]
|
||||
output_line['sending_congestion_window'] = line[15]
|
||||
output_line['slow_start_size_threshold'] = line[16]
|
||||
|
||||
raw_output.append(output_line)
|
||||
|
||||
return raw_output if raw else _process(raw_output)
|
||||
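The address conversion in `hex_to_ip` can be exercised in isolation. The sketch below covers only the IPv4 branch (the name `hex_to_ipv4` is ours, not part of the parser): `/proc/net/tcp` prints IPv4 addresses as eight hex digits in host byte order, which is little-endian on x86, so `struct.pack("<L", ...)` restores the byte order before `inet_ntop` renders the dotted-quad form.

```python
import socket
import struct

def hex_to_ipv4(hexaddr: str) -> str:
    # /proc/net/tcp stores IPv4 addresses as 8 hex digits in host byte
    # order (little-endian on x86); "<L" packs the integer back into
    # wire-order bytes on such machines
    addr_long = int(hexaddr, 16)
    return socket.inet_ntop(socket.AF_INET, struct.pack("<L", addr_long))

# "1C00000A" is the raw form of 10.0.0.28 from the example output above
print(hex_to_ipv4("1C00000A"))  # 10.0.0.28
```

The same raw/processed pairs appear in the parser's docstring examples: `6C000C40` converts to `64.12.0.108`.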
171
jc/parsers/resolve_conf.py
Normal file
@@ -0,0 +1,171 @@
"""jc - JSON Convert `/etc/resolve.conf` file parser

This parser may be more forgiving than the system parser. For example, if
multiple `search` lists are defined, this parser will append all entries to
the `search` field, while the system parser may only use the list from the
last defined instance.

Usage (cli):

    $ cat /etc/resolve.conf | jc --resolve-conf

Usage (module):

    import jc
    result = jc.parse('resolve_conf', resolve_conf_output)

Schema:

    {
      "domain": string,
      "search": [
        string
      ],
      "nameservers": [
        string
      ],
      "options": [
        string
      ],
      "sortlist": [
        string
      ]
    }

Examples:

    $ cat /etc/resolve.conf | jc --resolve-conf -p
    {
      "search": [
        "eng.myprime.com",
        "dev.eng.myprime.com",
        "labs.myprime.com",
        "qa.myprime.com"
      ],
      "nameservers": [
        "10.136.17.15"
      ],
      "options": [
        "rotate",
        "ndots:1"
      ]
    }
"""
import re
from typing import List, Dict
from jc.jc_types import JSONDictType
import jc.utils


class info():
    """Provides parser metadata (version, author, etc.)"""
    version = '1.0'
    description = '`/etc/resolve.conf` file parser'
    author = 'Kelly Brazil'
    author_email = 'kellyjonbrazil@gmail.com'
    compatible = ['linux', 'darwin', 'cygwin', 'win32', 'aix', 'freebsd']
    tags = ['file']


__version__ = info.version


def _process(proc_data: JSONDictType) -> JSONDictType:
    """
    Final processing to conform to the schema.

    Parameters:

        proc_data:    Dictionary raw structured data to process

    Returns:

        Dictionary. Structured to conform to the schema.
    """
    return proc_data


def parse(
    data: str,
    raw: bool = False,
    quiet: bool = False
) -> JSONDictType:
    """
    Main text parsing function

    Parameters:

        data:        (string)  text data to parse
        raw:         (boolean) unprocessed output if True
        quiet:       (boolean) suppress warning messages if True

    Returns:

        Dictionary. Raw or processed structured data.
    """
    jc.utils.compatibility(__name__, info.compatible, quiet)
    jc.utils.input_type_check(data)

    raw_output: Dict = {}
    search: List[str] = []
    nameservers: List[str] = []
    options: List[str] = []
    sortlist: List[str] = []

    if jc.utils.has_data(data):

        for line in filter(None, data.splitlines()):

            # comments start with # or ; and can be inline
            if '#' in line or ';' in line:
                userdata = list(filter(None, re.split("[#;]+", line, maxsplit=1)))
                userdata = [x for x in userdata if x.strip()]
                if len(userdata) <= 1:  # whole line is a comment
                    continue

                userdata_str = userdata[0].strip()

            else:
                userdata_str = line.strip()

            if userdata_str.startswith('domain'):
                raw_output['domain'] = userdata_str.split()[1].strip()
                continue

            if userdata_str.startswith('search'):
                search_items = userdata_str.split(maxsplit=1)[1]
                search_list = search_items.split()
                search.extend(search_list)
                continue

            if userdata_str.startswith('nameserver'):
                ns_str = userdata_str.split()[1]
                nameservers.append(ns_str)
                continue

            if userdata_str.startswith('options'):
                option_items = userdata_str.split(maxsplit=1)[1]
                option_list = option_items.split()
                options.extend(option_list)
                continue

            if userdata_str.startswith('sortlist'):
                sortlist_items = userdata_str.split(maxsplit=1)[1]
                sortlist_list = sortlist_items.split()
                sortlist.extend(sortlist_list)
                continue

        if search:
            raw_output['search'] = search

        if nameservers:
            raw_output['nameservers'] = nameservers

        if options:
            raw_output['options'] = options

        if sortlist:
            raw_output['sortlist'] = sortlist

    return raw_output if raw else _process(raw_output)
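The inline-comment handling in the parser above can be sketched on its own. This is a minimal, self-contained version of that logic (the helper name `extract_directive` is ours): split once on `#` or `;`, drop empty pieces, and treat a line as a full comment when nothing substantive remains before the delimiter.

```python
import re

def extract_directive(line: str):
    """Return the non-comment part of a resolv.conf-style line, or None
    if the whole line is a comment. Comments start with '#' or ';' and
    may appear inline after a directive."""
    if '#' in line or ';' in line:
        # split once on the first run of comment characters
        parts = [p for p in re.split(r"[#;]+", line, maxsplit=1) if p.strip()]
        if len(parts) <= 1:  # whole line is a comment
            return None
        return parts[0].strip()
    return line.strip() or None

print(extract_directive("nameserver 10.0.0.1  # primary"))
print(extract_directive("; a whole-line comment"))
```

Note that, like the parser above, this treats a directive followed only by a bare comment marker as a comment line.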
@@ -17,6 +17,13 @@ Schema:

     [
       {
+        "interfaces": [
+          {
+            "id": string,
+            "mac": string,
+            "name": string,
+          }
+        ]
         "destination": string,
         "gateway": string,
         "genmask": string,
@@ -109,11 +116,11 @@ import jc.parsers.universal

 class info():
     """Provides parser metadata (version, author, etc.)"""
-    version = '1.8'
+    version = '1.9'
     description = '`route` command parser'
     author = 'Kelly Brazil'
     author_email = 'kellyjonbrazil@gmail.com'
-    compatible = ['linux']
+    compatible = ['linux', 'win32']
     magic_commands = ['route']
     tags = ['command']
@@ -152,6 +159,14 @@ def _process(proc_data):
             if key in int_list:
                 entry[key] = jc.utils.convert_to_int(entry[key])

+        if 'interfaces' in entry:
+            interfaces = []
+            for interface in entry["interfaces"]:
+                # 00 ff 58 60 5f 61 -> 00:ff:58:60:5f:61
+                interface['mac'] = interface['mac'].replace(' ', ':').replace('.', '')
+                interfaces.append(interface)
+            entry["interfaces"] = interfaces
+
         # add flags_pretty
         # Flag mapping from https://www.man7.org/linux/man-pages/man8/route.8.html
         if 'flags' in entry:
@@ -165,6 +180,16 @@ def _process(proc_data):

     return proc_data

+
+def normalize_headers(headers: str):
+    # fixup header row for ipv6
+    if ' Next Hop ' in headers:
+        headers = headers.replace(' If', ' Iface')
+
+        headers = headers.replace(' Next Hop ', ' Next_Hop ')
+        headers = headers.replace(' Flag ', ' Flags ')
+        headers = headers.replace(' Met ', ' Metric ')
+    headers = headers.lower()
+    return headers
+
 def parse(data, raw=False, quiet=False):
     """
@@ -180,24 +205,22 @@ def parse(data, raw=False, quiet=False):

         List of Dictionaries. Raw or processed structured data.
     """
-    import jc.utils
     jc.utils.compatibility(__name__, info.compatible, quiet)
     jc.utils.input_type_check(data)

-    cleandata = data.splitlines()[1:]
+    cleandata = data.splitlines()

     raw_output = []

     if jc.utils.has_data(data):

-        # fixup header row for ipv6
-        if ' Next Hop ' in cleandata[0]:
-            cleandata[0] = cleandata[0].replace(' If', ' Iface')
-            cleandata[0] = cleandata[0].replace(' Next Hop ', ' Next_Hop ')\
-                                       .replace(' Flag ', ' Flags ')\
-                                       .replace(' Met ', ' Metric ')
-
-        cleandata[0] = cleandata[0].lower()
-        raw_output = jc.parsers.universal.simple_table_parse(cleandata)
+        import jc.parsers.route_windows
+        if cleandata[0] in jc.parsers.route_windows.SEPERATORS:
+            raw_output = jc.parsers.route_windows.parse(cleandata)
+        else:
+            cleandata.pop(0)  # Removing "Kernel IP routing table".
+            cleandata[0] = normalize_headers(cleandata[0])
+            import jc.parsers.universal
+            raw_output = jc.parsers.universal.simple_table_parse(cleandata)

     if raw:
         return raw_output
128
jc/parsers/route_windows.py
Normal file
@@ -0,0 +1,128 @@
"""
jc - JSON Convert Windows `route` command output parser
"""


import re
from typing import List

SEPERATORS = (
    "===========================================================================",
    " None"
)

# 22...00 50 56 c0 00 01 ......VMware Virtual Ethernet Adapter for VMnet1
# {"id": 22, "mac": "00 50 56 c0 00 01", "name": "VMware Virtual Ethernet Adapter for VMnet1"}
INTERFACE_REGEX = re.compile(
    r"^(?P<id>\d+)\.{3}(?P<mac>.{17})[\s+\.]+(?P<name>[^\n\r]+)$"
)

ROUTE_TABLES = ("IPv4 Route Table", "IPv6 Route Table")
ROUTE_TYPES = ("Active Routes:", "Persistent Routes:")


def get_lines_until_seperator(iterator):
    lines = []
    for line in iterator:
        if line in SEPERATORS:
            break
        lines.append(line)
    return lines


def normalize_route_table(route_table: List[str]):
    headers = route_table[0]
    headers = headers.lower()
    headers = headers.replace("network destination", "destination")
    headers = headers.replace("if", "iface")
    headers = headers.replace("interface", "iface")
    headers = headers.replace("netmask", "genmask")
    headers_count = len(headers.split())

    previous_line_has_all_the_data = True
    normalized_route_table = [headers]
    for row in route_table[1:]:
        row = row.strip()

        has_all_the_data = len(row.split()) == headers_count
        # If neither the current row nor the previous row has a full set of
        # columns, concatenate them into a single row.
        if not has_all_the_data and not previous_line_has_all_the_data:
            previous_line = normalized_route_table.pop(
                len(normalized_route_table) - 1)
            row = f'{previous_line} {row}'
            has_all_the_data = True

        normalized_route_table.append(row.strip())
        previous_line_has_all_the_data = has_all_the_data

    return normalized_route_table


def parse(cleandata: List[str]):
    """
    Main text parsing function for Windows route

    Parameters:

        cleandata:   (List of strings) text data to parse

    Returns:

        List of Dictionaries. Raw structured data.
    """

    raw_output = []
    data_iterator = iter(cleandata)
    for line in data_iterator:
        if not line:
            continue

        if line == "Interface List":
            # Interface List
            #  8...00 ff 58 60 5f 61 ......TAP-Windows Adapter V9
            # 52...00 15 5d fd 0d 45 ......Hyper-V Virtual Ethernet Adapter
            # ===========================================================================

            interfaces = []
            for interface_line in data_iterator:
                interface_line = interface_line.strip()
                if interface_line in SEPERATORS:
                    break

                interface_match = INTERFACE_REGEX.search(interface_line)
                if interface_match:
                    interfaces.append(interface_match.groupdict())

            if interfaces:
                raw_output.append({"interfaces": interfaces})
            continue

        full_route_table = []
        if line in ROUTE_TABLES:
            next(data_iterator)  # Skipping the table title.

            # Persistent Routes:
            # Network Address          Netmask  Gateway Address  Metric
            #       157.0.0.0        255.0.0.0      157.55.80.1       3
            # ===========================================================================
            for route_line in data_iterator:
                if route_line in ROUTE_TYPES:
                    import jc.parsers.universal
                    route_table = get_lines_until_seperator(data_iterator)
                    if not route_table:
                        continue

                    route_table = normalize_route_table(route_table)
                    full_route_table.extend(
                        jc.parsers.universal.simple_table_parse(route_table)
                    )

            raw_output.extend(full_route_table)

    return raw_output
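The `INTERFACE_REGEX` above can be demonstrated against the sample line from its own comment. The MAC field is captured as a fixed 17-character slice and the trailing dot/space padding is absorbed by the character class before the adapter name; the colon rewrite shown here mirrors the post-processing done in the `route` parser's `_process` function.

```python
import re

# Matches an interface row from Windows `route print` output, e.g.:
# 22...00 50 56 c0 00 01 ......VMware Virtual Ethernet Adapter for VMnet1
INTERFACE_REGEX = re.compile(
    r"^(?P<id>\d+)\.{3}(?P<mac>.{17})[\s+\.]+(?P<name>[^\n\r]+)$"
)

line = "22...00 50 56 c0 00 01 ......VMware Virtual Ethernet Adapter for VMnet1"
match = INTERFACE_REGEX.search(line)
fields = match.groupdict()

# the route parser later rewrites the spaced MAC into colon notation
fields["mac"] = fields["mac"].replace(" ", ":").replace(".", "")
print(fields)
```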
281
jc/parsers/srt.py
Normal file
@@ -0,0 +1,281 @@
"""jc - JSON Convert `SRT` file parser

Usage (cli):

    $ cat foo.srt | jc --srt

Usage (module):

    import jc
    result = jc.parse('srt', srt_file_output)

Schema:

    [
      {
        "index": int,
        "start": {
          "hours": int,
          "minutes": int,
          "seconds": int,
          "milliseconds": int,
          "timestamp": string
        },
        "end": {
          "hours": int,
          "minutes": int,
          "seconds": int,
          "milliseconds": int,
          "timestamp": string
        },
        "content": string
      }
    ]

Examples:

    $ cat attack_of_the_clones.srt
    1
    00:02:16,612 --> 00:02:19,376
    Senator, we're making
    our final approach into Coruscant.

    2
    00:02:19,482 --> 00:02:21,609
    Very good, Lieutenant.
    ...

    $ cat attack_of_the_clones.srt | jc --srt
    [
      {
        "index": 1,
        "start": {
          "hours": 0,
          "minutes": 2,
          "seconds": 16,
          "milliseconds": 612,
          "timestamp": "00:02:16,612"
        },
        "end": {
          "hours": 0,
          "minutes": 2,
          "seconds": 19,
          "milliseconds": 376,
          "timestamp": "00:02:19,376"
        },
        "content": "Senator, we're making\nour final approach into Coruscant."
      },
      {
        "index": 2,
        "start": {
          "hours": 0,
          "minutes": 2,
          "seconds": 19,
          "milliseconds": 482,
          "timestamp": "00:02:19,482"
        },
        "end": {
          "hours": 0,
          "minutes": 2,
          "seconds": 21,
          "milliseconds": 609,
          "timestamp": "00:02:21,609"
        },
        "content": "Very good, Lieutenant."
      },
      ...
    ]
"""
import jc.utils
import re
from typing import List, Dict
from jc.jc_types import JSONDictType


class info():
    """Provides parser metadata (version, author, etc.)"""
    version = '1.0'
    description = 'SRT file parser'
    author = 'Mark Rotner'
    author_email = 'rotner.mr@gmail.com'
    compatible = ['linux', 'darwin', 'cygwin', 'win32', 'aix', 'freebsd']
    tags = ['standard', 'file', 'string']


__version__ = info.version

# Regex from https://github.com/cdown/srt/blob/434d0c1c9d5c26d5c3fb1ce979fc05b478e9253c/srt.py#LL16C1.

# The MIT License

# Copyright (c) 2014-present Christopher Down

# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:

# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.

# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.

# The format: (int)index\n(timestamp)start --> (timestamp)end\n(str)content\n.
# Example:
# 1
# 00:02:16,612 --> 00:02:19,376
# Senator, we're making our final approach into Coruscant.

# Start & End timestamp format: hours:minutes:seconds,millisecond.
# "." is not technically valid as a delimiter, but many editors create SRT
# files with this delimiter for whatever reason. Many editors and players
# accept it, so we do too.
RGX_TIMESTAMP_MAGNITUDE_DELIM = r"[,.:,.。:]"
RGX_POSITIVE_INT = r"[0-9]+"
RGX_POSITIVE_INT_OPTIONAL = r"[0-9]*"
RGX_TIMESTAMP = '{field}{separator}{field}{separator}{field}{separator}?{optional_field}'.format(
    separator=RGX_TIMESTAMP_MAGNITUDE_DELIM,
    field=RGX_POSITIVE_INT,
    optional_field=RGX_POSITIVE_INT_OPTIONAL
)
RGX_INDEX = r"-?[0-9]+\.?[0-9]*"  # int\float\negative.
RGX_CONTENT = r".*?"  # Anything (except newline) but lazy.
RGX_NEWLINE = r"\r?\n"  # Newline (CRLF\LF).
SRT_REGEX = re.compile(
    r"\s*(?:({index})\s*{newline})?({ts}) *-[ -] *> *({ts}) ?(?:{newline}|\Z)({content})"
    # Many sub editors don't add a blank line to the end, and many editors and
    # players accept that. We allow it to be missing in input.
    #
    # We also allow subs that are missing a double blank newline. This often
    # happens on subs which were first created as a mixed language subtitle,
    # for example chs/eng, and then were stripped using naive methods (such as
    # ed/sed) that don't understand newline preservation rules in SRT files.
    #
    # This means that when you are, say, only keeping chs, and the line only
    # contains english, you end up with not only no content, but also all of
    # the content lines are stripped instead of retaining a newline.
    r"(?:{newline}|\Z)(?:{newline}|\Z|(?=(?:{index}\s*{newline}{ts})))"
    # Some SRT blocks, while this is technically invalid, have blank lines
    # inside the subtitle content. We look ahead a little to check that the
    # next lines look like an index and a timestamp as a best-effort
    # solution to work around these.
    r"(?=(?:(?:{index}\s*{newline})?{ts}|\Z))".format(
        index=RGX_INDEX,
        ts=RGX_TIMESTAMP,
        content=RGX_CONTENT,
        newline=RGX_NEWLINE,
    ),
    re.DOTALL,
)
TIMESTAMP_REGEX = re.compile(
    '^({field}){separator}({field}){separator}({field}){separator}?({optional_field})$'.format(
        separator=RGX_TIMESTAMP_MAGNITUDE_DELIM,
        field=RGX_POSITIVE_INT,
        optional_field=RGX_POSITIVE_INT_OPTIONAL
    )
)


def _process(proc_data: List[JSONDictType]) -> List[JSONDictType]:
    """
    Final processing to conform to the schema.

    Parameters:

        proc_data:    (List of Dictionaries) raw structured data to process

    Returns:

        List of Dictionaries representing an SRT document.
    """
    int_list = {'index'}
    timestamp_list = {"start", "end"}
    timestamp_int_list = {"hours", "minutes", "seconds", "milliseconds"}

    for entry in proc_data:
        # Converting {"index"} to int.
        for key in entry:
            if key in int_list:
                entry[key] = jc.utils.convert_to_int(entry[key])

            # Converting {"hours", "minutes", "seconds", "milliseconds"} to int.
            if key in timestamp_list:
                timestamp = entry[key]
                for timestamp_key in timestamp:
                    if timestamp_key in timestamp_int_list:
                        timestamp[timestamp_key] = jc.utils.convert_to_int(
                            timestamp[timestamp_key])

    return proc_data


def parse_timestamp(timestamp: str) -> Dict:
    """
    timestamp: "hours:minutes:seconds,milliseconds" --->
    {
        "hours": "hours",
        "minutes": "minutes",
        "seconds": "seconds",
        "milliseconds": "milliseconds",
        "timestamp": "hours:minutes:seconds,milliseconds"
    }
    """
    ts_match = TIMESTAMP_REGEX.match(timestamp)
    if ts_match:
        hours, minutes, seconds, milliseconds = ts_match.groups()
        return {
            "hours": hours,
            "minutes": minutes,
            "seconds": seconds,
            "milliseconds": milliseconds,
            "timestamp": timestamp
        }
    return {}


def parse(
    data: str,
    raw: bool = False,
    quiet: bool = False
) -> List[JSONDictType]:
    """
    Main text parsing function

    Parameters:

        data:        (string)  text data to parse
        raw:         (boolean) unprocessed output if True
        quiet:       (boolean) suppress warning messages if True

    Returns:

        List of Dictionaries. Raw or processed structured data.
    """
    jc.utils.compatibility(__name__, info.compatible, quiet)
    jc.utils.input_type_check(data)

    raw_output: List[Dict] = []
    if not jc.utils.has_data(data):
        return raw_output

    for subtitle in SRT_REGEX.finditer(data):
        index, start, end, content = subtitle.groups()
        raw_output.append(
            {
                "index": index,
                "start": parse_timestamp(start),
                "end": parse_timestamp(end),
                "content": content.replace("\r\n", "\n")
            }
        )

    return raw_output if raw else _process(raw_output)
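The timestamp grammar above can be sketched as a standalone helper. This version (the name `parse_timestamp` matching the parser's, the inlined regex ours) accepts comma, period, colon, and their fullwidth variants as delimiters, and tolerates a missing millisecond field, just like `TIMESTAMP_REGEX`; unlike the parser's raw output, it converts the fields to `int` immediately.

```python
import re

# hours:minutes:seconds,milliseconds - the delimiter class also accepts
# period, colon, and fullwidth variants; milliseconds may be absent
TIMESTAMP_REGEX = re.compile(
    r"^([0-9]+)[,.:,.。:]([0-9]+)[,.:,.。:]([0-9]+)[,.:,.。:]?([0-9]*)$"
)

def parse_timestamp(ts: str) -> dict:
    match = TIMESTAMP_REGEX.match(ts)
    if not match:
        return {}
    hours, minutes, seconds, milliseconds = match.groups()
    return {
        "hours": int(hours),
        "minutes": int(minutes),
        "seconds": int(seconds),
        "milliseconds": int(milliseconds or 0),
        "timestamp": ts,
    }

print(parse_timestamp("00:02:16,612"))
```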
109
jc/parsers/ss.py
@@ -1,8 +1,5 @@
|
||||
"""jc - JSON Convert `ss` command output parser
|
||||
|
||||
Extended information options like `-e` and `-p` are not supported and may
|
||||
cause parsing irregularities.
|
||||
|
||||
Usage (cli):
|
||||
|
||||
$ ss | jc --ss
|
||||
@@ -23,21 +20,29 @@ field names
|
||||
|
||||
[
|
||||
{
|
||||
"netid": string,
|
||||
"state": string,
|
||||
"recv_q": integer,
|
||||
"send_q": integer,
|
||||
"local_address": string,
|
||||
"local_port": string,
|
||||
"local_port_num": integer,
|
||||
"peer_address": string,
|
||||
"peer_port": string,
|
||||
"peer_port_num": integer,
|
||||
"interface": string,
|
||||
"link_layer" string,
|
||||
"channel": string,
|
||||
"path": string,
|
||||
"pid": integer
|
||||
"netid": string,
|
||||
"state": string,
|
||||
"recv_q": integer,
|
||||
"send_q": integer,
|
||||
"local_address": string,
|
||||
"local_port": string,
|
||||
"local_port_num": integer,
|
||||
"peer_address": string,
|
||||
"peer_port": string,
|
||||
"peer_port_num": integer,
|
||||
"interface": string,
|
||||
"link_layer" string,
|
||||
"channel": string,
|
||||
"path": string,
|
||||
"pid": integer,
|
||||
"opts": {
|
||||
"process_id": {
|
||||
"<process_id>": {
|
||||
"user": string,
|
||||
"file_descriptor": string
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
]
|
||||
|
||||
@@ -275,13 +280,15 @@ Examples:
|
||||
}
|
||||
]
|
||||
"""
|
||||
import re
|
||||
import ast
|
||||
import string
|
||||
import jc.utils
|
||||
|
||||
|
||||
class info():
|
||||
"""Provides parser metadata (version, author, etc.)"""
|
||||
version = '1.6'
|
||||
version = '1.7'
|
||||
description = '`ss` command parser'
|
||||
author = 'Kelly Brazil'
|
||||
author_email = 'kellyjonbrazil@gmail.com'
|
||||
@@ -324,6 +331,57 @@ def _process(proc_data):
|
||||
|
||||
return proc_data
|
||||
|
||||
def _parse_opts(proc_data):
|
||||
""" Process extra options -e, -o, -p
|
||||
|
||||
Parameters:
|
||||
|
||||
proc_data: (List of Dictionaries) raw structured data to process
|
||||
|
||||
Returns:
|
||||
|
||||
Structured data dictionary for extra/optional headerless options.
|
||||
"""
|
||||
o_field = proc_data.split(' ')
|
||||
opts = {}
|
||||
for item in o_field:
|
||||
# -e option:
|
||||
item = re.sub(
|
||||
'uid', 'uid_number',
                          re.sub('sk', 'cookie', re.sub('ino', 'inode_number', item)))

        if ":" in item:
            key, val = item.split(':')

            # -o option
            if key == "timer":
                val = val.replace('(', '[').replace(')', ']')
                val = ast.literal_eval(re.sub(r'([a-z0-9\.]+)', '"\\1"', val))
                val = {
                    'timer_name': val[0],
                    'expire_time': val[1],
                    'retrans': val[2]
                }
                opts[key] = val

            # -p option
            if key == "users":
                key = 'process_id'
                val = val.replace('(', '[').replace(')', ']')
                val = ast.literal_eval(re.sub(r'([a-z]+=[0-9]+)', '"\\1"', val))
                data = {}
                for rec in val:
                    params = {}
                    params['user'] = rec[0]
                    for i in [x for x in rec if '=' in x]:
                        k, v = i.split('=')
                        params[k] = v
                    data.update({
                        params['pid']: {
                            'user': params['user'],
                            'file_descriptor': params['fd']
                        }
                    })
                val = data
                opts[key] = val

    return opts


def parse(data, raw=False, quiet=False):
    """
@@ -357,15 +415,20 @@ def parse(data, raw=False, quiet=False):
        header_text = header_text.replace('-', '_')

        header_list = header_text.split()
        extra_opts = False

        for entry in cleandata[1:]:
            output_line = {}
            if entry[0] not in string.whitespace:

                # fix weird ss bug where first two columns have no space between them sometimes
                entry = entry[:5] + ' ' + entry[5:]
                entry = entry[:5] + ' ' + entry[5:]

                entry_list = entry.split()
                entry_list = re.split(r'[ ]{1,}',entry.strip())

                if len(entry_list) > len(header_list) or extra_opts == True:
                    entry_list = re.split(r'[ ]{2,}',entry.strip())
                    extra_opts = True

                if entry_list[0] in contains_colon and ':' in entry_list[4]:
                    l_field = entry_list[4].rsplit(':', maxsplit=1)
@@ -381,6 +444,10 @@ def parse(data, raw=False, quiet=False):
                entry_list[6] = p_address
                entry_list.insert(7, p_port)

                if re.search(r'ino:|uid:|sk:|users:|timer:',entry_list[-1]):
                    header_list.append('opts')
                    entry_list[-1] = _parse_opts(entry_list[-1])

                output_line = dict(zip(header_list, entry_list))

            # some post processing to pull out fields: interface, link_layer, path, pid, channel
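The `_parse_opts` hunk above turns `ss -o` timer strings into dictionaries by rewriting the parentheses into list syntax and quoting the tokens before `ast.literal_eval`. A minimal standalone sketch of that trick, using a made-up `timer:(on,240ms,0)` value for illustration:

```python
import ast
import re

# Hypothetical option value as it appears after splitting `timer:(on,240ms,0)`
# on the colon; not taken from real ss output.
timer_val = '(on,240ms,0)'
timer_val = timer_val.replace('(', '[').replace(')', ']')   # (…) -> […]
timer_val = ast.literal_eval(re.sub(r'([a-z0-9\.]+)', '"\\1"', timer_val))  # quote tokens
timer = {
    'timer_name': timer_val[0],
    'expire_time': timer_val[1],
    'retrans': timer_val[2]
}
print(timer)  # {'timer_name': 'on', 'expire_time': '240ms', 'retrans': '0'}
```

The same replace-then-`literal_eval` idiom handles the nested `users:(("proc",pid=1,fd=2))` form for the `-p` option.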
@@ -171,7 +171,7 @@ import jc.utils

class info():
    """Provides parser metadata (version, author, etc.)"""
    version = '1.12'
    version = '1.13'
    description = '`stat` command parser'
    author = 'Kelly Brazil'
    author_email = 'kellyjonbrazil@gmail.com'
@@ -238,112 +238,115 @@ def parse(data, raw=False, quiet=False):
    # Clear any blank lines
    cleandata = list(filter(None, data.splitlines()))

    if jc.utils.has_data(data):

        # linux output
        if cleandata[0].startswith(' File: '):
            # stats output contains 8 lines
            for line in cleandata:

                # line #1
                if line.find('File:') == 2:
                    output_line = {}
                    line_list = line.split(maxsplit=1)
                    output_line['file'] = line_list[1]

                    # populate link_to field if -> found
                    if ' -> ' in output_line['file']:
                        filename = output_line['file'].split(' -> ')[0].strip('\u2018').rstrip('\u2019')
                        link = output_line['file'].split(' -> ')[1].strip('\u2018').rstrip('\u2019')
                        output_line['file'] = filename
                        output_line['link_to'] = link
                    else:
                        filename = output_line['file'].split(' -> ')[0].strip('\u2018').rstrip('\u2019')
                        output_line['file'] = filename

                    continue

                # line #2
                if line.find('Size:') == 2:
                    line_list = line.split(maxsplit=7)
                    output_line['size'] = line_list[1]
                    output_line['blocks'] = line_list[3]
                    output_line['io_blocks'] = line_list[6]
                    output_line['type'] = line_list[7]
                    continue

                # line #3
                if line.startswith('Device:'):
                    line_list = line.split()
                    output_line['device'] = line_list[1]
                    output_line['inode'] = line_list[3]
                    output_line['links'] = line_list[5]
                    continue

                # line #4
                if line.startswith('Access: ('):
                    line = line.replace('(', ' ').replace(')', ' ').replace('/', ' ')
                    line_list = line.split()
                    output_line['access'] = line_list[1]
                    output_line['flags'] = line_list[2]
                    output_line['uid'] = line_list[4]
                    output_line['user'] = line_list[5]
                    output_line['gid'] = line_list[7]
                    output_line['group'] = line_list[8]
                    continue

                # line #5
                if line.startswith('Access: 2'):
                    line_list = line.split(maxsplit=1)
                    output_line['access_time'] = line_list[1]
                    continue

                # line #6
                if line.startswith('Modify:'):
                    line_list = line.split(maxsplit=1)
                    output_line['modify_time'] = line_list[1]
                    continue

                # line #7
                if line.startswith('Change:'):
                    line_list = line.split(maxsplit=1)
                    output_line['change_time'] = line_list[1]
                    continue

                # line #8
                if line.find('Birth:') == 1:
                    line_list = line.split(maxsplit=1)
                    output_line['birth_time'] = line_list[1]

                    raw_output.append(output_line)
                    continue

        # FreeBSD/OSX output
        else:
            for line in cleandata:
                value = shlex.split(line)
                output_line = {
                    'file': ' '.join(value[15:]),
                    'unix_device': value[0],
                    'inode': value[1],
                    'flags': value[2],
                    'links': value[3],
                    'user': value[4],
                    'group': value[5],
                    'rdev': value[6],
                    'size': value[7],
                    'access_time': value[8],
                    'modify_time': value[9],
                    'change_time': value[10],
                    'birth_time': value[11],
                    'block_size': value[12],
                    'blocks': value[13],
                    'unix_flags': value[14]
                }

                raw_output.append(output_line)

    if raw:
    if not jc.utils.has_data(data):
        return raw_output

    # linux output
    if cleandata[0].startswith(' File: '):
        output_line = {}

        # stats output contains 8 lines
        for line in cleandata:
            # line #1
            if line.find('File:') == 2:
                if output_line:  # Reached a new file stat info.
                    raw_output.append(output_line)
                    output_line = {}

                line_list = line.split(maxsplit=1)
                output_line['file'] = line_list[1]

                # populate link_to field if -> found
                if ' -> ' in output_line['file']:
                    filename = output_line['file'].split(' -> ')[0].strip('\u2018').rstrip('\u2019')
                    link = output_line['file'].split(' -> ')[1].strip('\u2018').rstrip('\u2019')
                    output_line['file'] = filename
                    output_line['link_to'] = link
                else:
                    filename = output_line['file'].split(' -> ')[0].strip('\u2018').rstrip('\u2019')
                    output_line['file'] = filename

                continue

            # line #2
            if line.startswith(' Size:'):
                line_list = line.split(maxsplit=7)
                output_line['size'] = line_list[1]
                output_line['blocks'] = line_list[3]
                output_line['io_blocks'] = line_list[6]
                output_line['type'] = line_list[7]
                continue

            # line #3
            if line.startswith('Device:'):
                line_list = line.split()
                output_line['device'] = line_list[1]
                output_line['inode'] = line_list[3]
                output_line['links'] = line_list[5]
                continue

            # line #4
            if line.startswith('Access: ('):
                line = line.replace('(', ' ').replace(')', ' ').replace('/', ' ')
                line_list = line.split()
                output_line['access'] = line_list[1]
                output_line['flags'] = line_list[2]
                output_line['uid'] = line_list[4]
                output_line['user'] = line_list[5]
                output_line['gid'] = line_list[7]
                output_line['group'] = line_list[8]
                continue

            # line #5
            if line.startswith('Access: 2'):
                line_list = line.split(maxsplit=1)
                output_line['access_time'] = line_list[1]
                continue

            # line #6
            if line.startswith('Modify:'):
                line_list = line.split(maxsplit=1)
                output_line['modify_time'] = line_list[1]
                continue

            # line #7
            if line.startswith('Change:'):
                line_list = line.split(maxsplit=1)
                output_line['change_time'] = line_list[1]
                continue

            # line #8
            if line.startswith(' Birth:'):
                line_list = line.split(maxsplit=1)
                output_line['birth_time'] = line_list[1]
                continue

        if output_line:
            raw_output.append(output_line)

    # FreeBSD/OSX output
    else:
        return _process(raw_output)
        for line in cleandata:
            value = shlex.split(line)
            output_line = {
                'file': ' '.join(value[15:]),
                'unix_device': value[0],
                'inode': value[1],
                'flags': value[2],
                'links': value[3],
                'user': value[4],
                'group': value[5],
                'rdev': value[6],
                'size': value[7],
                'access_time': value[8],
                'modify_time': value[9],
                'change_time': value[10],
                'birth_time': value[11],
                'block_size': value[12],
                'blocks': value[13],
                'unix_flags': value[14]
            }

            raw_output.append(output_line)

    return raw_output if raw else _process(raw_output)
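The core of the `stat` fix above is when records are flushed: the old code only appended `output_line` on the `Birth:` line, which older `stat` versions may not emit, so records could be dropped. The new code starts a fresh record whenever a `File:` line is seen and flushes any leftover record after the loop. A minimal sketch of that accumulation pattern, with made-up two-field input:

```python
# Hypothetical simplified stat output (only File/Size lines), to show the
# flush-on-new-record idiom; not a full reimplementation of the parser.
lines = [
    '  File: a.txt',
    '  Size: 5',
    '  File: b.txt',
    '  Size: 7',
]
raw_output = []
output_line = {}
for line in lines:
    if line.find('File:') == 2:
        if output_line:          # reached a new file's stat info: flush
            raw_output.append(output_line)
            output_line = {}
        output_line['file'] = line.split(maxsplit=1)[1]
        continue
    if line.startswith('  Size:'):
        output_line['size'] = line.split(maxsplit=1)[1]
if output_line:                  # flush the final record
    raw_output.append(output_line)
print(raw_output)
# [{'file': 'a.txt', 'size': '5'}, {'file': 'b.txt', 'size': '7'}]
```

Both records survive even though neither block ends with a `Birth:` line.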
256 jc/parsers/veracrypt.py Normal file
@@ -0,0 +1,256 @@
"""jc - JSON Convert `veracrypt` command output parser

Supports the following `veracrypt` subcommands:
- `veracrypt --text --list`
- `veracrypt --text --list --verbose`
- `veracrypt --text --volume-properties <volume>`

Usage (cli):

    $ veracrypt --text --list | jc --veracrypt

or

    $ jc veracrypt --text --list

Usage (module):

    import jc
    result = jc.parse('veracrypt', veracrypt_command_output)

Schema:

    Volume:
    [
        {
            "slot":                 integer,
            "path":                 string,
            "device":               string,
            "mountpoint":           string,
            "size":                 string,
            "type":                 string,
            "readonly":             string,
            "hidden_protected":     string,
            "encryption_algo":      string,
            "pk_size":              string,
            "sk_size":              string,
            "block_size":           string,
            "mode":                 string,
            "prf":                  string,
            "format_version":       integer,
            "backup_header":        string
        }
    ]

Examples:

    $ veracrypt --text --list | jc --veracrypt -p
    [
        {
            "slot": 1,
            "path": "/dev/sdb1",
            "device": "/dev/mapper/veracrypt1",
            "mountpoint": "/home/bob/mount/encrypt/sdb1"
        }
    ]

    $ veracrypt --text --list --verbose | jc --veracrypt -p
    [
        {
            "slot": 1,
            "path": "/dev/sdb1",
            "device": "/dev/mapper/veracrypt1",
            "mountpoint": "/home/bob/mount/encrypt/sdb1",
            "size": "522 MiB",
            "type": "Normal",
            "readonly": "No",
            "hidden_protected": "No",
            "encryption_algo": "AES",
            "pk_size": "256 bits",
            "sk_size": "256 bits",
            "block_size": "128 bits",
            "mode": "XTS",
            "prf": "HMAC-SHA-512",
            "format_version": 2,
            "backup_header": "Yes"
        }
    ]

"""
import re
from typing import List, Dict, Optional, Any
from jc.jc_types import JSONDictType
import jc.utils


class info():
    """Provides parser metadata (version, author, etc.)"""
    version = '1.0'
    description = '`veracrypt` command parser'
    author = 'Jake Ob'
    author_email = 'iakopap at gmail.com'
    compatible = ["linux"]
    magic_commands = ["veracrypt"]
    tags = ['command']


__version__ = info.version

try:
    from typing import TypedDict

    Volume = TypedDict(
        "Volume",
        {
            "slot": int,
            "path": str,
            "device": str,
            "mountpoint": str,
            "size": str,
            "type": str,
            "readonly": str,
            "hidden_protected": str,
            "encryption_algo": str,
            "pk_size": str,
            "sk_size": str,
            "block_size": str,
            "mode": str,
            "prf": str,
            "format_version": int,
            "backup_header": str
        },
    )
except ImportError:
    Volume = Dict[str, Any]  # type: ignore

_volume_line_pattern = r"(?P<slot>[0-9]+): (?P<path>.+?) (?P<device>.+?) (?P<mountpoint>.*)"

_volume_verbose_pattern = (
    r"(Slot:\s(?P<slot>.+)"
    + r"|Volume:\s(?P<path>.+)"
    + r"|Virtual\sDevice:\s(?P<device>.+)"
    + r"|Mount\sDirectory:\s(?P<mountpoint>.+)"
    + r"|Size:\s(?P<size>.+)"
    + r"|Type:\s(?P<type>.+)"
    + r"|Read-Only:\s(?P<readonly>.+)"
    + r"|Hidden\sVolume Protected:\s(?P<hidden_protected>.+)"
    + r"|Encryption\sAlgorithm:\s(?P<encryption_algo>.+)"
    + r"|Primary\sKey\sSize:\s(?P<pk_size>.+)"
    + r"|Secondary\sKey\sSize\s.*:\s(?P<sk_size>.+)"
    + r"|Block\sSize:\s(?P<block_size>.+)"
    + r"|Mode\sof\sOperation:\s(?P<mode>.+)"
    + r"|PKCS-5\sPRF:\s(?P<prf>.+)"
    + r"|Volume\sFormat\sVersion:\s(?P<format_version>.+)"
    + r"|Embedded\sBackup\sHeader:\s(?P<backup_header>.+))"
)

def _parse_volume(next_lines: List[str]) -> Optional[Volume]:
    next_line = next_lines.pop()
    result = re.match(_volume_line_pattern, next_line)

    # Parse and return the volume given as a single line (veracrypt -t --list)
    if result:
        matches = result.groupdict()
        volume: Volume = {  # type: ignore
            "slot": int(matches["slot"]),
            "path": matches["path"],
            "device": matches["device"],
            "mountpoint": matches["mountpoint"],
        }

        return volume
    else:
        next_lines.append(next_line)

    # Otherwise parse the volume given in multiple lines (veracrypt -t --list -v)
    volume: Volume = {}  # type: ignore

    while next_lines:
        next_line = next_lines.pop()

        # Return when encounter an empty line
        if not next_line:
            return volume

        result = re.match(_volume_verbose_pattern, next_line)

        # Skip to the next line in case of an unknown field line
        if not result:
            continue

        matches = result.groupdict()

        if matches["slot"]:
            volume["slot"] = int(matches["slot"])
        elif matches["path"]:
            volume["path"] = matches["path"]
        elif matches["device"]:
            volume["device"] = matches["device"]
        elif matches["mountpoint"]:
            volume["mountpoint"] = matches["mountpoint"]
        elif matches["size"]:
            volume["size"] = matches["size"]
        elif matches["type"]:
            volume["type"] = matches["type"]
        elif matches["readonly"]:
            volume["readonly"] = matches["readonly"]
        elif matches["hidden_protected"]:
            volume["hidden_protected"] = matches["hidden_protected"]
        elif matches["encryption_algo"]:
            volume["encryption_algo"] = matches["encryption_algo"]
        elif matches["pk_size"]:
            volume["pk_size"] = matches["pk_size"]
        elif matches["sk_size"]:
            volume["sk_size"] = matches["sk_size"]
        elif matches["block_size"]:
            volume["block_size"] = matches["block_size"]
        elif matches["mode"]:
            volume["mode"] = matches["mode"]
        elif matches["prf"]:
            volume["prf"] = matches["prf"]
        elif matches["format_version"]:
            volume["format_version"] = int(matches["format_version"])
        elif matches["backup_header"]:
            volume["backup_header"] = matches["backup_header"]

    return volume

def parse(data: str, raw: bool = False, quiet: bool = False) -> List[JSONDictType]:
    """
    Main text parsing function

    Parameters:

        data:        (string)  text data to parse
        raw:         (boolean) unprocessed output if True
        quiet:       (boolean) suppress warning messages if True

    Returns:

        List of Dictionaries. Raw or processed structured data.
    """
    result: List = []

    if jc.utils.has_data(data):
        jc.utils.compatibility(__name__, info.compatible, quiet)
        jc.utils.input_type_check(data)

        linedata = data.splitlines()

        first_line = linedata[0]
        line_mode = re.search(_volume_line_pattern, first_line)
        verbose_mode = re.search(_volume_verbose_pattern, first_line)

        if not line_mode and not verbose_mode:
            return []

        linedata.reverse()

        while linedata:
            volume = _parse_volume(linedata)

            if volume:
                result.append(volume)
            else:
                break

    return result
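The single-line `_volume_line_pattern` above relies on lazy `.+?` groups so that `path` and `device` each stop at the first space that still lets the rest of the line match. A quick check against a made-up `veracrypt --text --list` line (values taken from the docstring example):

```python
import re

# Same pattern as in the parser above.
_volume_line_pattern = r"(?P<slot>[0-9]+): (?P<path>.+?) (?P<device>.+?) (?P<mountpoint>.*)"

line = '1: /dev/sdb1 /dev/mapper/veracrypt1 /home/bob/mount/encrypt/sdb1'
m = re.match(_volume_line_pattern, line)
assert m is not None
print(int(m['slot']), m['path'], m['device'], m['mountpoint'])
# 1 /dev/sdb1 /dev/mapper/veracrypt1 /home/bob/mount/encrypt/sdb1
```

Note this assumes paths without embedded spaces; the verbose multi-line format sidesteps that ambiguity by labeling each field.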
@@ -408,12 +408,12 @@ from collections import OrderedDict
from datetime import datetime
from typing import List, Dict, Union
import jc.utils
from jc.parsers.asn1crypto import pem, x509
from jc.parsers.asn1crypto import pem, x509, jc_global


class info():
    """Provides parser metadata (version, author, etc.)"""
    version = '1.1'
    version = '1.2'
    description = 'X.509 PEM and DER certificate file parser'
    author = 'Kelly Brazil'
    author_email = 'kellyjonbrazil@gmail.com'
@@ -462,6 +462,9 @@ def _fix_objects(obj):
    Recursively traverse the nested dictionary or list and convert objects
    into JSON serializable types.
    """
    if isinstance(obj, tuple):
        obj = list(obj)

    if isinstance(obj, set):
        obj = sorted(list(obj))

@@ -501,6 +504,10 @@ def _fix_objects(obj):
                obj.update({k: v})
                continue

            if isinstance(v, tuple):
                v = list(v)
                obj.update({k: v})

            if isinstance(v, set):
                v = sorted(list(v))
                obj.update({k: v})
@@ -548,6 +555,7 @@ def parse(
        List of Dictionaries. Raw or processed structured data.
    """
    jc.utils.compatibility(__name__, info.compatible, quiet)
    jc_global.quiet = quiet  # to inject quiet setting into asn1crypto library

    raw_output: List = []
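The `_fix_objects` additions above exist because `tuple` and `set` are not JSON types, so they are converted to lists (sorted, for sets, to make output deterministic) before serialization. A top-level-only sketch of that conversion (`fix` is a simplified stand-in, not the recursive function from the diff):

```python
# Simplified stand-in for the tuple/set handling added to _fix_objects;
# handles only the top level, not nested containers.
def fix(obj):
    if isinstance(obj, tuple):
        obj = list(obj)          # JSON has no tuple type
    if isinstance(obj, set):
        obj = sorted(list(obj))  # sort for stable, comparable output
    return obj

print(fix(('a', 'b')), fix({'b', 'a'}))  # ['a', 'b'] ['a', 'b']
```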
318 jc/parsers/x509_csr.py Normal file
@@ -0,0 +1,318 @@
"""jc - JSON Convert X.509 Certificate Request format file parser

This parser will convert DER and PEM encoded X.509 certificate request files.

Usage (cli):

    $ cat certificateRequest.pem | jc --x509-csr

Usage (module):

    import jc
    result = jc.parse('x509_csr', x509_csr_file_output)

Schema:

    [
      {
        "certification_request_info": {
          "version":                      string,
          "serial_number":                string,  # [0]
          "serial_number_str":            string,
          "signature": {
            "algorithm":                  string,
            "parameters":                 string/null,
          },
          "issuer": {
            "country_name":               string,
            "state_or_province_name":     string,
            "locality_name":              string,
            "organization_name":          array/string,
            "organizational_unit_name":   array/string,
            "common_name":                string,
            "email_address":              string,
            "serial_number":              string,  # [0]
            "serial_number_str":          string
          },
          "validity": {
            "not_before":                 integer,  # [1]
            "not_after":                  integer,  # [1]
            "not_before_iso":             string,
            "not_after_iso":              string
          },
          "subject": {
            "country_name":               string,
            "state_or_province_name":     string,
            "locality_name":              string,
            "organization_name":          array/string,
            "organizational_unit_name":   array/string,
            "common_name":                string,
            "email_address":              string,
            "serial_number":              string,  # [0]
            "serial_number_str":          string
          },
          "subject_public_key_info": {
            "algorithm": {
              "algorithm":                string,
              "parameters":               string/null,
            },
            "public_key": {
              "modulus":                  string,  # [0]
              "public_exponent":          integer
            }
          },
          "issuer_unique_id":             string/null,
          "subject_unique_id":            string/null,
          "extensions": [
            {
              "extn_id":                  string,
              "critical":                 boolean,
              "extn_value":               array/object/string/integer  # [2]
            }
          ]
        },
        "signature_algorithm": {
          "algorithm":                    string,
          "parameters":                   string/null
        },
        "signature_value":                string  # [0]
      }
    ]

    [0] in colon-delimited hex notation
    [1] time-zone-aware (UTC) epoch timestamp
    [2] See below for well-known Extension schemas:

    Basic Constraints:
    {
      "extn_id":                          "basic_constraints",
      "critical":                         boolean,
      "extn_value": {
        "ca":                             boolean,
        "path_len_constraint":            string/null
      }
    }

    Key Usage:
    {
      "extn_id":                          "key_usage",
      "critical":                         boolean,
      "extn_value": [
                                          string
      ]
    }

    Key Identifier:
    {
      "extn_id":                          "key_identifier",
      "critical":                         boolean,
      "extn_value":                       string  # [0]
    }

    Authority Key Identifier:
    {
      "extn_id":                          "authority_key_identifier",
      "critical":                         boolean,
      "extn_value": {
        "key_identifier":                 string,  # [0]
        "authority_cert_issuer":          string/null,
        "authority_cert_serial_number":   string/null
      }
    }

    Subject Alternative Name:
    {
      "extn_id":                          "subject_alt_name",
      "critical":                         boolean,
      "extn_value": [
                                          string
      ]
    }

    Certificate Policies:
    {
      "extn_id":                          "certificate_policies",
      "critical":                         boolean,
      "extn_value": [
        {
          "policy_identifier":            string,
          "policy_qualifiers": [          array or null
            {
              "policy_qualifier_id":      string,
              "qualifier":                string
            }
          ]
        }
      ]
    }

    Signed Certificate Timestamp List:
    {
      "extn_id":                          "signed_certificate_timestamp_list",
      "critical":                         boolean,
      "extn_value":                       string  # [0]
    }

Examples:

    $ cat server.csr | jc --x509-csr -p
    [
      {
        "certification_request_info": {
          "version": "v1",
          "subject": {
            "common_name": "myserver.for.example"
          },
          "subject_pk_info": {
            "algorithm": {
              "algorithm": "ec",
              "parameters": "secp256r1"
            },
            "public_key": "04:40:33:c0:91:8f:e9:46:ea:d0:dc:d0:f9:63:2..."
          },
          "attributes": [
            {
              "type": "extension_request",
              "values": [
                [
                  {
                    "extn_id": "extended_key_usage",
                    "critical": false,
                    "extn_value": [
                      "server_auth"
                    ]
                  },
                  {
                    "extn_id": "subject_alt_name",
                    "critical": false,
                    "extn_value": [
                      "myserver.for.example"
                    ]
                  }
                ]
              ]
            }
          ]
        },
        "signature_algorithm": {
          "algorithm": "sha384_ecdsa",
          "parameters": null
        },
        "signature": "30:45:02:20:77:ac:5b:51:bf:c5:f5:43:02:52:ae:66:..."
      }
    ]

    $ openssl req -in server.csr | jc --x509-csr -p
    [
      {
        "certification_request_info": {
          "version": "v1",
          "subject": {
            "common_name": "myserver.for.example"
          },
          "subject_pk_info": {
            "algorithm": {
              "algorithm": "ec",
              "parameters": "secp256r1"
            },
            "public_key": "04:40:33:c0:91:8f:e9:46:ea:d0:dc:d0:f9:63:2..."
          },
          "attributes": [
            {
              "type": "extension_request",
              "values": [
                [
                  {
                    "extn_id": "extended_key_usage",
                    "critical": false,
                    "extn_value": [
                      "server_auth"
                    ]
                  },
                  {
                    "extn_id": "subject_alt_name",
                    "critical": false,
                    "extn_value": [
                      "myserver.for.example"
                    ]
                  }
                ]
              ]
            }
          ]
        },
        "signature_algorithm": {
          "algorithm": "sha384_ecdsa",
          "parameters": null
        },
        "signature": "30:45:02:20:77:ac:5b:51:bf:c5:f5:43:02:52:ae:66:..."
      }
    ]

"""
# import binascii
# from collections import OrderedDict
# from datetime import datetime
from typing import List, Dict, Union
import jc.utils
from jc.parsers.asn1crypto import pem, csr, jc_global
from jc.parsers.x509_cert import _fix_objects, _process


class info():
    """Provides parser metadata (version, author, etc.)"""
    version = '1.0'
    description = 'X.509 PEM and DER certificate request file parser'
    author = 'Kelly Brazil'
    author_email = 'kellyjonbrazil@gmail.com'
    details = 'Using the asn1crypto library at https://github.com/wbond/asn1crypto/releases/tag/1.5.1'
    compatible = ['linux', 'darwin', 'cygwin', 'win32', 'aix', 'freebsd']
    tags = ['standard', 'file', 'string', 'binary']


__version__ = info.version


def parse(
    data: Union[str, bytes],
    raw: bool = False,
    quiet: bool = False
) -> List[Dict]:
    """
    Main text parsing function

    Parameters:

        data:        (string or bytes) text or binary data to parse
        raw:         (boolean) unprocessed output if True
        quiet:       (boolean) suppress warning messages if True

    Returns:

        List of Dictionaries. Raw or processed structured data.
    """
    jc.utils.compatibility(__name__, info.compatible, quiet)
    jc_global.quiet = quiet  # to inject quiet setting into asn1crypto library

    raw_output: List = []

    if jc.utils.has_data(data):
        # convert to bytes, if not already, for PEM detection since that's
        # what pem.detect() needs. (cli.py will auto-convert to UTF-8 if it can)
        try:
            der_bytes = bytes(data, 'utf-8')  # type: ignore
        except TypeError:
            der_bytes = data  # type: ignore

        certs = []
        if pem.detect(der_bytes):
            for type_name, headers, der_bytes in pem.unarmor(der_bytes, multiple=True):
                if type_name == 'CERTIFICATE REQUEST' or type_name == 'NEW CERTIFICATE REQUEST':
                    certs.append(csr.CertificationRequest.load(der_bytes))

        else:
            certs.append(csr.CertificationRequest.load(der_bytes))

        raw_output = [_fix_objects(cert.native) for cert in certs]

    return raw_output if raw else _process(raw_output)
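The `try/except TypeError` above normalizes string input to bytes because `pem.detect()` needs bytes: `bytes(s, 'utf-8')` encodes a `str`, but raises `TypeError` when handed an object that is already `bytes`. A tiny standalone sketch of that idiom:

```python
# Same str-or-bytes normalization idiom as in the parse() function above.
def to_bytes(data):
    try:
        return bytes(data, 'utf-8')   # str -> bytes
    except TypeError:
        return data                   # already bytes: bytes(b, 'utf-8') raises TypeError

header = to_bytes('-----BEGIN CERTIFICATE REQUEST-----')
print(header == to_bytes(header))  # True: idempotent for bytes input
```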
@@ -151,8 +151,9 @@ def compatibility(mod_name: str, compatible: List[str], quiet: bool = False) ->
        mod = mod_name.split('.')[-1]
        compat_list = ', '.join(compatible)
        warning_message([
            f'{mod} parser is not compatible with your OS ({sys.platform}).',
            f'Compatible platforms: {compat_list}'
            f'`{mod}` command output from this OS ({sys.platform}) is not supported.',
            f'`{mod}` command output from the following platforms is supported: {compat_list}',
            'Disregard this warning if you are processing output that came from a supported platform. (Use the -q option to suppress this warning)'
        ])

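The reworded compatibility warning above clarifies that it is the *command output's* origin that matters, not the OS running `jc`. A sketch of how the new message lines are built (`'ss'` and the platform list are illustrative values, and plain `print` stands in for `warning_message`):

```python
import sys

# Illustrative values; in jc these come from the parser module's metadata.
mod = 'ss'
compat_list = ', '.join(['linux'])

msg = [
    f'`{mod}` command output from this OS ({sys.platform}) is not supported.',
    f'`{mod}` command output from the following platforms is supported: {compat_list}',
    'Disregard this warning if you are processing output that came from a supported platform. (Use the -q option to suppress this warning)'
]
for line in msg:
    print(line)
```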
42 man/jc.1
@@ -1,4 +1,4 @@
.TH jc 1 2023-04-30 1.23.2 "JSON Convert"
.TH jc 1 2023-07-30 1.23.4 "JSON Convert"
.SH NAME
\fBjc\fP \- JSON Convert JSONifies the output of many CLI tools, file-types,
and strings
@@ -192,6 +192,11 @@ Email Address string parser
\fB--file\fP
`file` command parser

.TP
.B
\fB--find\fP
`find` command parser

.TP
.B
\fB--findmnt\fP
@@ -307,6 +312,11 @@ IPv4 and IPv6 Address string parser
\fB--iptables\fP
`iptables` command parser

.TP
.B
\fB--ip-route\fP
`ip route` command parser

.TP
.B
\fB--iso-datetime\fP
@@ -357,6 +367,11 @@ Key/Value file and string parser
\fB--ls-s\fP
`ls` command streaming parser

.TP
.B
\fB--lsattr\fP
`lsattr` command parser

.TP
.B
\fB--lsblk\fP
@@ -687,6 +702,11 @@ PLIST file parser
\fB--proc-net-route\fP
`/proc/net/route` file parser

.TP
.B
\fB--proc-net-tcp\fP
`/proc/net/tcp` and `/proc/net/tcp6` file parser

.TP
.B
\fB--proc-net-unix\fP
@@ -742,6 +762,11 @@ PLIST file parser
\fB--ps\fP
`ps` command parser

.TP
.B
\fB--resolve-conf\fP
`/etc/resolve.conf` file parser

.TP
.B
\fB--route\fP
@@ -777,6 +802,11 @@ Semantic Version string parser
\fB--shadow\fP
`/etc/shadow` file parser

.TP
.B
\fB--srt\fP
SRT file parser

.TP
.B
\fB--ss\fP
@@ -942,6 +972,11 @@ URL string parser
\fB--ver\fP
Version string parser

.TP
.B
\fB--veracrypt\fP
`veracrypt` command parser

.TP
.B
\fB--vmstat\fP
@@ -972,6 +1007,11 @@ Version string parser
\fB--x509-cert\fP
X.509 PEM and DER certificate file parser

.TP
.B
\fB--x509-csr\fP
X.509 PEM and DER certificate request file parser

.TP
.B
\fB--xml\fP
@@ -1,2 +1,2 @@
[metadata]
license_file = LICENSE.md
license_files = LICENSE.md

2 setup.py
@@ -5,7 +5,7 @@ with open('README.md', 'r') as f:

setuptools.setup(
    name='jc',
    version='1.23.2',
    version='1.23.4',
    author='Kelly Brazil',
    author_email='kellyjonbrazil@gmail.com',
    description='Converts the output of popular command-line tools and file-types to JSON.',

@@ -5,11 +5,13 @@

> Try the `jc` [web demo](https://jc-web.onrender.com/) and [REST API](https://github.com/kellyjonbrazil/jc-restapi)

> JC is [now available](https://galaxy.ansible.com/community/general) as an
> `jc` is [now available](https://galaxy.ansible.com/community/general) as an
Ansible filter plugin in the `community.general` collection. See this
[blog post](https://blog.kellybrazil.com/2020/08/30/parsing-command-output-in-ansible-with-jc/)
for an example.

> Looking for something like `jc` but lower-level? Check out [regex2json](https://gitlab.com/tozd/regex2json).

# JC
JSON Convert
13
tests/fixtures/centos-7.7/find.json
vendored
Normal file
13
tests/fixtures/centos-7.7/find.json
vendored
Normal file
@@ -0,0 +1,13 @@
|
||||
[{"path": null, "node": "."},
|
||||
{"path":".","node":null},
|
||||
{"path": ".","node": "jc"},
|
||||
{"path": "./jc","node": "tests"},
|
||||
{"path": "./jc/tests","node": "test_find.py"},
|
||||
{"path": "./jc/tests","node": "test_history.py"},
|
||||
{"path": "./jc/tests","node": "test_hosts.py"},
|
||||
{"path": "./jc","node": "anotherdirectory"},
|
||||
{"path": null,"node": null,"error": "find: './inaccessible': Permission denied"},
|
||||
{"path": "./jc","node": "directory2"},
|
||||
{"path": "./jc/directory2","node": "file.txt"},
|
||||
{"path": "./jc/directory2","node": "file2.txt"},
|
||||
{"path": ".","node": "newfile.txt"}]
|
||||
13
tests/fixtures/centos-7.7/find.out
vendored
Normal file
13
tests/fixtures/centos-7.7/find.out
vendored
Normal file
@@ -0,0 +1,13 @@
|
||||
.
|
||||
./
|
||||
./jc
|
||||
./jc/tests
|
||||
./jc/tests/test_find.py
|
||||
./jc/tests/test_history.py
|
||||
./jc/tests/test_hosts.py
|
||||
./jc/anotherdirectory
|
||||
find: './inaccessible': Permission denied
|
||||
./jc/directory2
|
||||
./jc/directory2/file.txt
|
||||
./jc/directory2/file2.txt
|
||||
./newfile.txt
|
||||
31
tests/fixtures/centos-7.7/ip_route.json
vendored
Normal file
31
tests/fixtures/centos-7.7/ip_route.json
vendored
Normal file
@@ -0,0 +1,31 @@
|
||||
[
|
||||
{
|
||||
"ip": "default",
|
||||
"via": "10.0.2.2",
|
||||
"dev": "enp0s3",
|
||||
"proto": "dhcp",
|
||||
"metric": 100
|
||||
},
|
||||
{
|
||||
"ip": "10.0.2.0/24",
|
||||
"dev": "enp0s3",
|
||||
"proto": "kernel",
|
||||
"scope": "link",
|
||||
"src": "10.0.2.15",
|
||||
"metric": 100
|
||||
},
|
||||
{
|
||||
"ip": "169.254.0.0/16",
|
||||
"dev": "enp0s3",
|
||||
"scope": "link",
|
||||
"metric": 1000
|
||||
},
|
||||
{
|
||||
"ip": "172.17.0.0/16",
|
||||
"dev": "docker0",
|
||||
"proto": "kernel",
|
||||
"scope": "link",
|
||||
"src": "172.17.0.1",
|
||||
"status": "linkdown"
|
||||
}
|
||||
]
|
||||
4
tests/fixtures/centos-7.7/ip_route.out
vendored
Normal file
4
tests/fixtures/centos-7.7/ip_route.out
vendored
Normal file
@@ -0,0 +1,4 @@
|
||||
default via 10.0.2.2 dev enp0s3 proto dhcp metric 100
|
||||
10.0.2.0/24 dev enp0s3 proto kernel scope link src 10.0.2.15 metric 100
|
||||
169.254.0.0/16 dev enp0s3 scope link metric 1000
|
||||
172.17.0.0/16 dev docker0 proto kernel scope link src 172.17.0.1 linkdown
|
||||
1
tests/fixtures/centos-7.7/last-wixF.json
vendored
Normal file
1
tests/fixtures/centos-7.7/last-wixF.json
vendored
Normal file
@@ -0,0 +1 @@
|
||||
[{"user":"root","tty":"pts/0","hostname":"192.168.255.1","login":"Mon Jun 19 14:18:13 2023","logout":"still logged in","login_epoch":1687209493}, {"user":"mark","tty":"pts/0","hostname":"192.168.255.1","login":"Mon Jun 19 14:15:57 2023","logout":"Mon Jun 19 14:18:00 2023","duration":"00:02","login_epoch":1687209357,"logout_epoch":1687209480,"duration_seconds":123}, {"user":"mark","tty":"pts/0","hostname":"192.168.255.1","login":"Mon Jun 19 14:15:45 2023","logout":"Mon Jun 19 14:15:52 2023","duration":"00:00","login_epoch":1687209345,"logout_epoch":1687209352,"duration_seconds":7}, {"user":"mark","tty":"tty1","hostname":"0.0.0.0","login":"Mon Jun 19 16:59:57 2023","logout":"still logged in","login_epoch":1687219197}, {"user":"runlevel","tty":"(to lvl 3)","hostname":"0.0.0.0","login":"Mon Jun 19 16:59:39 2023","logout":"Mon Jun 19 14:35:00 2023","duration":"-2:-24","login_epoch":1687219179,"logout_epoch":1687210500,"duration_seconds":-8679}, {"user":"reboot","tty":"system boot","hostname":"0.0.0.0","login":"Mon Jun 19 16:59:20 2023","logout":"Mon Jun 19 14:35:00 2023","duration":"-2:-24","login_epoch":1687219160,"logout_epoch":1687210500,"duration_seconds":-8660},{"user": "shutdown","tty": "system down","hostname": "0.0.0.0","login": "Fri Apr 14 13:46:46 2023","logout": "Fri Apr 14 13:47:12 2023","duration": "00:00","login_epoch": 1681505206,"logout_epoch": 1681505232,"duration_seconds": 26}]
tests/fixtures/centos-7.7/last-wixF.out (vendored, new file, 9 lines)
@@ -0,0 +1,9 @@
root pts/0 192.168.255.1 Mon Jun 19 14:18:13 2023 still logged in
mark pts/0 192.168.255.1 Mon Jun 19 14:15:57 2023 - Mon Jun 19 14:18:00 2023 (00:02)
mark pts/0 192.168.255.1 Mon Jun 19 14:15:45 2023 - Mon Jun 19 14:15:52 2023 (00:00)
mark tty1 0.0.0.0 Mon Jun 19 16:59:57 2023 still logged in
runlevel (to lvl 3) 0.0.0.0 Mon Jun 19 16:59:39 2023 - Mon Jun 19 14:35:00 2023 (-2:-24)
reboot system boot 0.0.0.0 Mon Jun 19 16:59:20 2023 - Mon Jun 19 14:35:00 2023 (-2:-24)
shutdown system down 0.0.0.0 Fri Apr 14 13:46:46 2023 - Fri Apr 14 13:47:12 2023 (00:00)

wtmp begins Mon Jun 19 16:59:20 2023
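The `-2:-24` durations above are what make this fixture interesting: the expected JSON records negative `duration_seconds` when wtmp entries are out of order. A minimal stdlib sketch of that arithmetic (illustration only, not jc's implementation):

```python
from datetime import datetime

# last/lastb print timestamps in this fixed C-locale format.
FMT = '%a %b %d %H:%M:%S %Y'

def duration_seconds(login: str, logout: str) -> int:
    """Signed logout - login difference; negative for out-of-order records."""
    delta = datetime.strptime(logout, FMT) - datetime.strptime(login, FMT)
    return int(delta.total_seconds())

# The runlevel line above: logs out "before" it logs in.
print(duration_seconds('Mon Jun 19 16:59:39 2023', 'Mon Jun 19 14:35:00 2023'))  # -8679
```

The values match the `duration_seconds` fields in the `last-wixF.json` fixture.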
tests/fixtures/centos-7.7/ping-missing-hostname-streaming.json (vendored, new file, 1 line)
@@ -0,0 +1 @@
[{"type":"reply","destination_ip":"100.68.105.124","sent_bytes":56,"pattern":null,"timestamp":null,"response_bytes":64,"response_ip":"100.68.105.124","icmp_seq":1,"ttl":64,"time_ms":0.04,"duplicate":false},{"type":"summary","destination_ip":"100.68.105.124","sent_bytes":56,"pattern":null,"packets_transmitted":1,"packets_received":1,"packet_loss_percent":0.0,"duplicates":0,"time_ms":0.0,"round_trip_ms_min":0.04,"round_trip_ms_avg":0.04,"round_trip_ms_max":0.04,"round_trip_ms_stddev":0.0}]
tests/fixtures/centos-7.7/ping-missing-hostname.json (vendored, new file, 1 line)
@@ -0,0 +1 @@
{"destination_ip":"100.68.105.124","data_bytes":56,"pattern":null,"packets_transmitted":1,"packets_received":1,"packet_loss_percent":0.0,"duplicates":0,"time_ms":0.0,"round_trip_ms_min":0.04,"round_trip_ms_avg":0.04,"round_trip_ms_max":0.04,"round_trip_ms_stddev":0.0,"responses":[{"type":"reply","timestamp":null,"bytes":64,"response_ip":"100.68.105.124","icmp_seq":1,"ttl":64,"time_ms":0.04,"duplicate":false}]}
tests/fixtures/centos-7.7/ping-missing-hostname.out (vendored, new file, 6 lines)
@@ -0,0 +1,6 @@
PING (100.68.105.124) 56(84) bytes of data.
64 bytes from 100.68.105.124 (100.68.105.124): icmp_seq=1 ttl=64 time=0.040 ms

--- ping statistics ---
1 packets transmitted, 1 received, 0% packet loss, time 0ms
rtt min/avg/max/mdev = 0.040/0.040/0.040/0.000 ms
tests/fixtures/generic/bluetoothctl_device_random.out (vendored, new file, 18 lines)
@@ -0,0 +1,18 @@
Device DF:1C:C3:B4:1A:1F (random)
Name: M585/M590
Alias: M585/M590
Appearance: 0x03c2
Icon: input-mouse
Paired: yes
Bonded: yes
Trusted: no
Blocked: no
Connected: no
LegacyPairing: no
UUID: Generic Access Profile (00001800-0000-1000-8000-00805f9b34fb)
UUID: Generic Attribute Profile (00001801-0000-1000-8000-00805f9b34fb)
UUID: Device Information (0000180a-0000-1000-8000-00805f9b34fb)
UUID: Battery Service (0000180f-0000-1000-8000-00805f9b34fb)
UUID: Human Interface Device (00001812-0000-1000-8000-00805f9b34fb)
UUID: Vendor specific (00010000-0000-1000-8000-011f2000046d)
Modalias: usb:v046DpB01Bd0011
tests/fixtures/generic/dig-nsid.json (vendored, new file, 1 line)
@@ -0,0 +1 @@
[{"id":22691,"opcode":"QUERY","status":"NOERROR","flags":["qr","rd","ra"],"query_num":1,"answer_num":1,"authority_num":0,"additional_num":1,"opt_pseudosection":{"edns":{"version":0,"flags":[],"udp":512},"nsid":"gpdns-sfo"},"question":{"name":"mail.google.com.","class":"IN","type":"A"},"answer":[{"name":"mail.google.com.","class":"IN","type":"A","ttl":300,"data":"142.250.189.197"}],"query_time":189,"server":"2001:4860:4860::8888#53(2001:4860:4860::8888)","when":"Sat Jun 03 12:37:43 PDT 2023","rcvd":73,"when_epoch":1685821063,"when_epoch_utc":null}]
tests/fixtures/generic/dig-nsid.out (vendored, new file, 22 lines)
@@ -0,0 +1,22 @@

; <<>> DiG 9.10.6 <<>> +nsid @dns.google mail.google.com
; (4 servers found)
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 22691
;; flags: qr rd ra; QUERY: 1, ANSWER: 1, AUTHORITY: 0, ADDITIONAL: 1

;; OPT PSEUDOSECTION:
; EDNS: version: 0, flags:; udp: 512
; NSID: 67 70 64 6e 73 2d 73 66 6f ("gpdns-sfo")
;; QUESTION SECTION:
;mail.google.com. IN A

;; ANSWER SECTION:
mail.google.com. 300 IN A 142.250.189.197

;; Query time: 189 msec
;; SERVER: 2001:4860:4860::8888#53(2001:4860:4860::8888)
;; WHEN: Sat Jun 03 12:37:43 PDT 2023
;; MSG SIZE rcvd: 73

tests/fixtures/generic/resolve.conf-1 (vendored, new file, 12 lines)
@@ -0,0 +1,12 @@
# Generated by NetworkManager
search hsd1.ca.comcast.net
nameserver 75.75.75.75 ; test inline comment
nameserver 75.75.76.76 # test inline comment
nameserver 2001:558:feed::1
# NOTE: the libc resolver may not support more than 3 nameservers.
# The nameservers listed below may not be recognized.
nameserver 2001:558:feed::2


sortlist 130.155.160.0/255.255.240.0 130.155.0.0
search hello.com world.com # add another search list for fun
tests/fixtures/generic/resolve.conf-1.json (vendored, new file, 1 line)
@@ -0,0 +1 @@
{"search":["hsd1.ca.comcast.net","hello.com","world.com"],"nameservers":["75.75.75.75","75.75.76.76","2001:558:feed::1","2001:558:feed::2"],"sortlist":["130.155.160.0/255.255.240.0","130.155.0.0"]}
tests/fixtures/generic/resolve.conf-2 (vendored, new file, 4 lines)
@@ -0,0 +1,4 @@
search eng.myprime.com dev.eng.myprime.com labs.myprime.com qa.myprime.com
options rotate
options ndots:1
nameserver 10.136.17.15
tests/fixtures/generic/resolve.conf-2.json (vendored, new file, 1 line)
@@ -0,0 +1 @@
{"search":["eng.myprime.com","dev.eng.myprime.com","labs.myprime.com","qa.myprime.com"],"nameservers":["10.136.17.15"],"options":["rotate","ndots:1"]}
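As a cross-check of the fixture pair above, here is a minimal stdlib-only sketch that derives the expected JSON shape from the resolv.conf input. It is an illustration of the file's keyword/value structure, not jc's actual parser (which handles more directives and edge cases, including the `;`/`#` inline comments in resolve.conf-1):

```python
def parse_resolv_conf(text: str) -> dict:
    """Collect resolv.conf directives into lists keyed by directive name."""
    result: dict = {}
    for line in text.splitlines():
        # Strip inline comments (both styles seen in the fixtures), then whitespace.
        line = line.split('#')[0].split(';')[0].strip()
        if not line:
            continue
        keyword, _, rest = line.partition(' ')
        values = rest.split()
        if keyword == 'nameserver':
            result.setdefault('nameservers', []).extend(values)
        elif keyword in ('search', 'sortlist', 'options'):
            result.setdefault(keyword, []).extend(values)
        elif keyword == 'domain':
            result['domain'] = values[0]
    return result

fixture = """\
search eng.myprime.com dev.eng.myprime.com labs.myprime.com qa.myprime.com
options rotate
options ndots:1
nameserver 10.136.17.15
"""
print(parse_resolv_conf(fixture))
```

Run against the resolve.conf-2 fixture, this reproduces the `search`, `options`, and `nameservers` lists in resolve.conf-2.json (key order aside).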
tests/fixtures/generic/resolve.conf-3 (vendored, new file, 7 lines)
@@ -0,0 +1,7 @@

;
; /etc/resolv.conf file for dnsmaster (sirius)
;
domain doc.com
nameserver 0.0.0.0
nameserver 111.22.3.5
tests/fixtures/generic/resolve.conf-3.json (vendored, new file, 1 line)
@@ -0,0 +1 @@
{"domain":"doc.com","nameservers":["0.0.0.0","111.22.3.5"]}
tests/fixtures/generic/resolve.conf-4 (vendored, new file, 5 lines)
@@ -0,0 +1,5 @@
; generated by /sbin/dhclient-script
search example.com
options rotate timeout:1 retries:1
nameserver 10.1.1.2
nameserver 10.1.1.1
tests/fixtures/generic/resolve.conf-4.json (vendored, new file, 1 line)
@@ -0,0 +1 @@
{"search":["example.com"],"nameservers":["10.1.1.2","10.1.1.1"],"options":["rotate","timeout:1","retries:1"]}
tests/fixtures/generic/srt-attack_of_the_clones.json (vendored, new file, 1 line)
@@ -0,0 +1 @@
[{"index": 1, "start": {"hours": 0, "minutes": 2, "seconds": 16, "milliseconds": 612, "timestamp": "00:02:16,612"}, "end": {"hours": 0, "minutes": 2, "seconds": 19, "milliseconds": 376, "timestamp": "00:02:19,376"}, "content": "Senator, we're making\nour final approach into Coruscant."}, {"index": 2, "start": {"hours": 0, "minutes": 2, "seconds": 19, "milliseconds": 482, "timestamp": "00:02:19,482"}, "end": {"hours": 0, "minutes": 2, "seconds": 21, "milliseconds": 609, "timestamp": "00:02:21,609"}, "content": "Very good, Lieutenant."}, {"index": 3, "start": {"hours": 0, "minutes": 3, "seconds": 13, "milliseconds": 336, "timestamp": "00:03:13,336"}, "end": {"hours": 0, "minutes": 3, "seconds": 15, "milliseconds": 167, "timestamp": "00:03:15,167"}, "content": "We made it."}, {"index": 4, "start": {"hours": 0, "minutes": 3, "seconds": 18, "milliseconds": 608, "timestamp": "00:03:18,608"}, "end": {"hours": 0, "minutes": 3, "seconds": 20, "milliseconds": 371, "timestamp": "00:03:20,371"}, "content": "I guess I was wrong."}, {"index": 5, "start": {"hours": 0, "minutes": 3, "seconds": 20, "milliseconds": 476, "timestamp": "00:03:20,476"}, "end": {"hours": 0, "minutes": 3, "seconds": 22, "milliseconds": 671, "timestamp": "00:03:22,671"}, "content": "There was no danger at all."}]
tests/fixtures/generic/srt-attack_of_the_clones.srt (vendored, new file, 20 lines)
@@ -0,0 +1,20 @@
1
00:02:16,612 --> 00:02:19,376
Senator, we're making
our final approach into Coruscant.

2
00:02:19,482 --> 00:02:21,609
Very good, Lieutenant.

3
00:03:13,336 --> 00:03:15,167
We made it.

4
00:03:18,608 --> 00:03:20,371
I guess I was wrong.

5
00:03:20,476 --> 00:03:22,671
There was no danger at all.
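The corresponding JSON fixtures break each SRT timestamp into `hours`/`minutes`/`seconds`/`milliseconds` fields, and the srt-complex fixture below accepts `.` as well as `,` before the milliseconds. A small regex sketch of that decomposition (illustrative; not jc's implementation):

```python
import re

# SRT timestamps: HH:MM:SS,mmm — the complex fixture also uses a dot delimiter.
TS = re.compile(r'(\d{2}):(\d{2}):(\d{2})[,.](\d{3})')

def parse_timestamp(ts: str) -> dict:
    """Split an SRT timestamp into the fields used by the JSON fixtures."""
    h, m, s, ms = map(int, TS.match(ts).groups())
    return {'hours': h, 'minutes': m, 'seconds': s,
            'milliseconds': ms, 'timestamp': ts}

print(parse_timestamp('00:02:16,612'))
```

This yields the same numeric fields as the first entry of srt-attack_of_the_clones.json; the `_raw.json` variant keeps them as strings instead.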
tests/fixtures/generic/srt-attack_of_the_clones_raw.json (vendored, new file, 1 line)
@@ -0,0 +1 @@
[{"index": "1", "start": {"hours": "00", "minutes": "02", "seconds": "16", "milliseconds": "612", "timestamp": "00:02:16,612"}, "end": {"hours": "00", "minutes": "02", "seconds": "19", "milliseconds": "376", "timestamp": "00:02:19,376"}, "content": "Senator, we're making\nour final approach into Coruscant."}, {"index": "2", "start": {"hours": "00", "minutes": "02", "seconds": "19", "milliseconds": "482", "timestamp": "00:02:19,482"}, "end": {"hours": "00", "minutes": "02", "seconds": "21", "milliseconds": "609", "timestamp": "00:02:21,609"}, "content": "Very good, Lieutenant."}, {"index": "3", "start": {"hours": "00", "minutes": "03", "seconds": "13", "milliseconds": "336", "timestamp": "00:03:13,336"}, "end": {"hours": "00", "minutes": "03", "seconds": "15", "milliseconds": "167", "timestamp": "00:03:15,167"}, "content": "We made it."}, {"index": "4", "start": {"hours": "00", "minutes": "03", "seconds": "18", "milliseconds": "608", "timestamp": "00:03:18,608"}, "end": {"hours": "00", "minutes": "03", "seconds": "20", "milliseconds": "371", "timestamp": "00:03:20,371"}, "content": "I guess I was wrong."}, {"index": "5", "start": {"hours": "00", "minutes": "03", "seconds": "20", "milliseconds": "476", "timestamp": "00:03:20,476"}, "end": {"hours": "00", "minutes": "03", "seconds": "22", "milliseconds": "671", "timestamp": "00:03:22,671"}, "content": "There was no danger at all."}]
tests/fixtures/generic/srt-complex.json (vendored, new file, 1 line)
@@ -0,0 +1 @@
[{"index": 1,"start": {"hours": 0,"minutes": 2,"seconds": 16,"milliseconds": 612,"timestamp": "00:02:16,612"},"end": {"hours": 0,"minutes": 2,"seconds": 19,"milliseconds": 376,"timestamp": "00:02:19,376"},"content": "--> --> --> -->"},{"index": 2,"start": {"hours": 0,"minutes": 2,"seconds": 19,"milliseconds": 482,"timestamp": "00:02:19.482"},"end": {"hours": 0,"minutes": 2,"seconds": 21,"milliseconds": 609,"timestamp": "00:02:21.609"},"content": "This subtitle has . as a delimiter in the timestamp."},{"index": 3,"start": {"hours": 0,"minutes": 3,"seconds": 13,"milliseconds": 336,"timestamp": "00:03:13,336"},"end": {"hours": 0,"minutes": 3,"seconds": 15,"milliseconds": 167,"timestamp": "00:03:15,167"},"content": "4"},{"index": 4,"start": {"hours": 0,"minutes": 3,"seconds": 18,"milliseconds": 608,"timestamp": "00:03:18,608"},"end": {"hours": 0,"minutes": 3,"seconds": 20,"milliseconds": 371,"timestamp": "00:03:20,371"},"content": "The previous subtitle has a number."},{"index": 5,"start": {"hours": 0,"minutes": 3,"seconds": 20,"milliseconds": 476,"timestamp": "00:03:20,476"},"end": {"hours": 0,"minutes": 3,"seconds": 22,"milliseconds": 671,"timestamp": "00:03:22,671"},"content": "This is a valid subtitle."}]
tests/fixtures/generic/srt-complex.srt (vendored, new file, 19 lines)
@@ -0,0 +1,19 @@
1
00:02:16,612 --> 00:02:19,376
--> --> --> -->

2
00:02:19.482 --> 00:02:21.609
This subtitle has . as a delimiter in the timestamp.

3
00:03:13,336 --> 00:03:15,167
4

4
00:03:18,608 --> 00:03:20,371
The previous subtitle has a number.

5
00:03:20,476 --> 00:03:22,671
This is a valid subtitle.
tests/fixtures/generic/veracrypt_verbose_list_volumes.out (vendored, new file, 33 lines)
@@ -0,0 +1,33 @@
Slot: 1
Volume: /dev/sdb1
Virtual Device: /dev/mapper/veracrypt1
Mount Directory: /home/bob/mount/encrypt/sdb1
Size: 498 MiB
Type: Normal
Read-Only: No
Hidden Volume Protected: No
Encryption Algorithm: AES
Primary Key Size: 256 bits
Secondary Key Size (XTS Mode): 256 bits
Block Size: 128 bits
Mode of Operation: XTS
PKCS-5 PRF: HMAC-SHA-512
Volume Format Version: 2
Embedded Backup Header: Yes

Slot: 2
Volume: /dev/sdb2
Virtual Device: /dev/mapper/veracrypt2
Mount Directory: /home/bob/mount/encrypt/sdb2
Size: 522 MiB
Type: Normal
Read-Only: No
Hidden Volume Protected: No
Encryption Algorithm: AES
Primary Key Size: 256 bits
Secondary Key Size (XTS Mode): 256 bits
Block Size: 128 bits
Mode of Operation: XTS
PKCS-5 PRF: HMAC-SHA-512
Volume Format Version: 2
Embedded Backup Header: Yes
tests/fixtures/generic/veracrypt_verbose_list_volumes_unknown_fields.out (vendored, new file, 35 lines)
@@ -0,0 +1,35 @@
Slot: 1
Volume: /dev/sdb1
Virtual Device: /dev/mapper/veracrypt1
Mount Directory: /home/bob/mount/encrypt/sdb1
Size: 498 MiB
Type: Normal
Read-Only: No
Hidden Volume Protected: No
Encryption Algorithm: AES
Primary Key Size: 256 bits
Secondary Key Size (XTS Mode): 256 bits
Block Size: 128 bits
Mode of Operation: XTS
Label: bar
PKCS-5 PRF: HMAC-SHA-512
Volume Format Version: 2
Embedded Backup Header: Yes

Slot: 2
Volume: /dev/sdb2
Virtual Device: /dev/mapper/veracrypt2
Mount Directory: /home/bob/mount/encrypt/sdb2
Size: 522 MiB
Type: Normal
Read-Only: No
Hidden Volume Protected: No
Encryption Algorithm: AES
Primary Key Size: 256 bits
Secondary Key Size (XTS Mode): 256 bits
Block Size: 128 bits
Mode of Operation: XTS
Label: foo
PKCS-5 PRF: HMAC-SHA-512
Volume Format Version: 2
Embedded Backup Header: Yes
tests/fixtures/generic/x509-cert-bad-email.json (vendored, new file, 1 line)
@@ -0,0 +1 @@
[{"tbs_certificate":{"version":"v1","serial_number":"","signature":{"algorithm":"sha512_rsa","parameters":null},"issuer":{"country_name":"DE","state_or_province_name":"stateOrProvinceName","locality_name":"localityName","organization_name":"organizationName","organizational_unit_name":"organizationUnitName","common_name":"commonName","email_address":"emailAddress"},"validity":{"not_before":1686181858,"not_after":2001541858,"not_before_iso":"2023-06-07T23:50:58+00:00","not_after_iso":"2033-06-04T23:50:58+00:00"},"subject":{"country_name":"DE","state_or_province_name":"stateOrProvinceName","locality_name":"localityName","organization_name":"organizationName","organizational_unit_name":"organizationUnitName","common_name":"commonName","email_address":"emailAddress"},"subject_public_key_info":{"algorithm":{"algorithm":"rsa","parameters":null},"public_key":{"modulus":"aa:72:23:53:97:a6:e4:4e:7b:08:82:35:a5:3d:3a:83:f9:63:38:07:df:b8:38:61:7f:99:92:c8:31:6f:7f:ac:91:a4:47:64:7e:f9:2f:e0:9e:fd:d6:35:ee:50:78:55:47:fa:63:d4:b9:64:dc:d6:1d:f6:d6:67:4f:45:d1:96:81:3b:28:28:5f:c7:91:2f:a3:d5:a2:8d:3b:a0:21:91:25:6b:a9:40:5c:a4:8d:66:17:2a:3f:6e:61:74:fb:f4:35:25:e1:d1:64:aa:15:6c:6d:33:b6:f9:07:f2:a2:29:83:1c:b1:e5:97:3b:3e:14:ea:48:d6:c7:31:ea:3a:79:c1:28:a0:a7:ea:a6:7e:cf:c7:a3:00:d5:0d:70:00:f4:34:28:ab:f6:a3:80:7a:6f:01:9c:43:4a:a8:37:13:16:11:8f:e2:57:80:1d:df:50:4f:a3:2b:35:d9:d2:7d:1e:b6:b1:e4:b5:86:f2:a3:1c:63:c0:c2:e9:3e:f0:cf:23:e8:33:b4:da:ee:59:73:e9:94:16:1b:dd:33:8a:44:31:de:36:e2:58:1f:0e:75:fd:54:4b:6d:83:5f:a6:a1:dc:b6:1d:fc:45:1d:c9:1b:7a:01:d6:cc:0c:3d:1a:96:8b:0d:3b:20:a8:40:07:e0:c5:df:ad:1a:a2:86:47:f9:ca:f6:c5:a8:99:b8:60:e8:e2:09:ea:f5:0e:97:86:07:a6:ac:50:6b:19:06:f4:37:39:9a:0d:65:bb:89:e6:ae:eb:f3:a9:cd:72:c3:31:36:ef:ac:90:48:19:d0:84:df:b2:6d:9d:ef:6c:fd:9a:ff:3c:26:68:72:80:c2:c0:40:04:ba:84:39:69:5c:e9:b1:10:98:61:3d:1a:5c:a8:9e:79:48:2e:51:d0:c3:69:27:74:c1:ef:e2:98:2a:38:3c:6e:ea:7e:36:75:d3:3c:12:f5:cd:b2:a0:8a:0a:19:68:59:30:15:e3:cf:d3:4b:f4
:99:a1:5a:3c:1f:c0:34:a3:e0:88:7a:44:6d:27:a9:87:2f:91:71:b4:c7:bb:c7:01:e2:fa:53:ef:09:1b:46:7b:df:52:f8:7a:cf:03:36:f9:b6:ce:a1:1c:3f:65:46:f8:13:cd:ac:9a:e2:19:43:26:b7:4a:2b:bd:da:94:d1:18:26:41:6e:19:2d:e1:6f:df:c4:c1:43:f6:8e:1e:99:d9:da:b2:8a:58:5e:5e:e8:a9:0c:4c:1d:a0:0f:50:b8:79:4b:3a:8a:4d:7a:7f:f4:10:b3:e8:d6:41:ec:57:e3:d1:c0:e1:fc:50:20:1c:f5:ad:84:a8:f6:af:2e:f4:cb:45:b7:4a:40:af:63:66:39:9b:73","public_exponent":65537}},"issuer_unique_id":null,"subject_unique_id":null,"extensions":[{"extn_id":"subject_alt_name","critical":false,"extn_value":["m\\xe4x@m\\xfcstermann.de"]}],"serial_number_str":"0"},"signature_algorithm":{"algorithm":"sha512_rsa","parameters":null},"signature_value":"78:ca:9f:d4:e7:e0:e9:95:6d:99:8f:ba:ca:69:ff:bd:2e:db:9f:4b:15:e5:ea:b8:c2:58:16:29:c2:2d:24:a3:62:36:91:61:ec:4b:99:e4:09:f9:a9:9b:fa:03:73:c1:ea:05:a9:ef:29:28:29:f6:00:aa:82:f8:53:1c:f0:6e:c0:87:ad:b2:93:24:ae:ba:56:f8:1c:62:54:23:d4:d5:66:a5:e1:36:cd:48:13:ad:fd:7b:4d:ff:c1:ee:de:fe:2f:d9:af:0e:82:7b:b0:58:2d:0c:e5:86:70:97:40:a5:ee:99:9a:96:59:14:8b:63:37:c5:04:07:17:58:04:56:d3:d9:71:a8:9c:c3:2f:21:77:19:ac:4d:95:83:f1:9f:91:0c:a3:8b:9c:1d:0e:0a:45:ed:e2:84:f9:57:6a:fa:5b:20:a8:15:26:d2:d8:34:2a:60:a7:d3:54:70:71:c3:17:aa:d7:3d:65:f5:5f:4e:a9:41:a2:e3:a7:c0:b4:5e:af:0b:48:64:f5:3a:08:0b:ec:c3:77:42:f8:13:19:45:19:7f:f8:09:79:1b:32:e2:9c:c2:91:b3:8d:e0:f4:e5:3f:9d:36:ae:22:a4:a8:d1:53:5b:c6:e3:ff:cb:a3:c0:47:ef:fd:b6:08:07:7a:97:1b:bf:cf:08:e0:5d:d1:4a:19:8a:14:c2:22:d0:79:b7:dc:76:d2:35:08:40:f8:33:80:8e:91:39:16:89:f5:51:18:d7:09:62:8d:47:ed:c6:e6:07:9d:d4:a8:3c:7a:df:e0:0d:bb:9a:a8:42:44:59:5d:f7:7b:f7:53:54:5f:0b:7f:1b:65:8d:df:bd:78:c9:e5:f8:57:e3:6b:e7:1f:d4:20:20:c3:0a:18:e2:6e:fa:10:e8:49:54:c7:25:6e:a1:5d:28:5f:45:f2:f1:c5:52:0e:28:c6:64:3a:4b:a6:d2:aa:66:e3:4d:fd:b2:3d:9c:30:b5:35:85:c8:44:93:53:f6:98:21:22:7c:36:8d:12:d9:d2:05:84:d0:22:b6:db:92:59:81:ea:26:3f:53:7b:a8:e8:34:c6:64:21:c0:e6:5b:3e:2b:23:6a:8b:dd:2d:63:25:46:ab:e7:a5:e4:1c:53:f0:e5:46:bb:80:17:da
:ee:45:cf:da:34:34:3c:f4:61:a4:9e:00:92:a0:72:42:52:d9:9c:31:d0:90:6d:a7:90:53:9c:6a:49:83:55:f8:45:4a:1b:0c:da:65:1b:a3:d4:8c:b2:36:88:c3:c9:e2:ac:e2:93:e6:7c:fc:f6:e6:1b:35:21:26:d6:75:32:dc:98:dd:ba:7d:90:d8:48:25:36:7b:2e:f6:a1:72:bd:01"}]
tests/fixtures/generic/x509-cert-bad-email.pem (vendored, new file, 34 lines)
@@ -0,0 +1,34 @@
-----BEGIN CERTIFICATE-----
MIIF9DCCA9wCAQAwDQYJKoZIhvcNAQENBQAwga4xCzAJBgNVBAYTAkRFMRwwGgYD
VQQIDBNzdGF0ZU9yUHJvdmluY2VOYW1lMRUwEwYDVQQHDAxsb2NhbGl0eU5hbWUx
GTAXBgNVBAoMEG9yZ2FuaXphdGlvbk5hbWUxHTAbBgNVBAsMFG9yZ2FuaXphdGlv
blVuaXROYW1lMRMwEQYDVQQDDApjb21tb25OYW1lMRswGQYJKoZIhvcNAQkBFgxl
bWFpbEFkZHJlc3MwHhcNMjMwNjA3MjM1MDU4WhcNMzMwNjA0MjM1MDU4WjCBrjEL
MAkGA1UEBhMCREUxHDAaBgNVBAgME3N0YXRlT3JQcm92aW5jZU5hbWUxFTATBgNV
BAcMDGxvY2FsaXR5TmFtZTEZMBcGA1UECgwQb3JnYW5pemF0aW9uTmFtZTEdMBsG
A1UECwwUb3JnYW5pemF0aW9uVW5pdE5hbWUxEzARBgNVBAMMCmNvbW1vbk5hbWUx
GzAZBgkqhkiG9w0BCQEWDGVtYWlsQWRkcmVzczCCAiIwDQYJKoZIhvcNAQEBBQAD
ggIPADCCAgoCggIBAKpyI1OXpuROewiCNaU9OoP5YzgH37g4YX+Zksgxb3+skaRH
ZH75L+Ce/dY17lB4VUf6Y9S5ZNzWHfbWZ09F0ZaBOygoX8eRL6PVoo07oCGRJWup
QFykjWYXKj9uYXT79DUl4dFkqhVsbTO2+QfyoimDHLHllzs+FOpI1scx6jp5wSig
p+qmfs/HowDVDXAA9DQoq/ajgHpvAZxDSqg3ExYRj+JXgB3fUE+jKzXZ0n0etrHk
tYbyoxxjwMLpPvDPI+gztNruWXPplBYb3TOKRDHeNuJYHw51/VRLbYNfpqHcth38
RR3JG3oB1swMPRqWiw07IKhAB+DF360aooZH+cr2xaiZuGDo4gnq9Q6XhgemrFBr
GQb0NzmaDWW7ieau6/OpzXLDMTbvrJBIGdCE37Jtne9s/Zr/PCZocoDCwEAEuoQ5
aVzpsRCYYT0aXKieeUguUdDDaSd0we/imCo4PG7qfjZ10zwS9c2yoIoKGWhZMBXj
z9NL9JmhWjwfwDSj4Ih6RG0nqYcvkXG0x7vHAeL6U+8JG0Z731L4es8DNvm2zqEc
P2VG+BPNrJriGUMmt0orvdqU0RgmQW4ZLeFv38TBQ/aOHpnZ2rKKWF5e6KkMTB2g
D1C4eUs6ik16f/QQs+jWQexX49HA4fxQIBz1rYSo9q8u9MtFt0pAr2NmOZtzAgMB
AAGjIDAeMBwGA1UdEQQVMBOBEW3keEBt/HN0ZXJtYW5uLmRlMA0GCSqGSIb3DQEB
DQUAA4ICAQB4yp/U5+DplW2Zj7rKaf+9LtufSxXl6rjCWBYpwi0ko2I2kWHsS5nk
Cfmpm/oDc8HqBanvKSgp9gCqgvhTHPBuwIetspMkrrpW+BxiVCPU1Wal4TbNSBOt
/XtN/8Hu3v4v2a8OgnuwWC0M5YZwl0Cl7pmallkUi2M3xQQHF1gEVtPZcaicwy8h
dxmsTZWD8Z+RDKOLnB0OCkXt4oT5V2r6WyCoFSbS2DQqYKfTVHBxwxeq1z1l9V9O
qUGi46fAtF6vC0hk9ToIC+zDd0L4ExlFGX/4CXkbMuKcwpGzjeD05T+dNq4ipKjR
U1vG4//Lo8BH7/22CAd6lxu/zwjgXdFKGYoUwiLQebfcdtI1CED4M4COkTkWifVR
GNcJYo1H7cbmB53UqDx63+ANu5qoQkRZXfd791NUXwt/G2WN3714yeX4V+Nr5x/U
ICDDChjibvoQ6ElUxyVuoV0oX0Xy8cVSDijGZDpLptKqZuNN/bI9nDC1NYXIRJNT
9pghInw2jRLZ0gWE0CK225JZgeomP1N7qOg0xmQhwOZbPisjaovdLWMlRqvnpeQc
U/DlRruAF9ruRc/aNDQ89GGkngCSoHJCUtmcMdCQbaeQU5xqSYNV+EVKGwzaZRuj
1IyyNojDyeKs4pPmfPz25hs1ISbWdTLcmN26fZDYSCU2ey72oXK9AQ==
-----END CERTIFICATE-----
tests/fixtures/generic/x509-csr-der.json (vendored, new file, 1 line)
@@ -0,0 +1 @@
[{"certification_request_info":{"version":"v1","subject":{"country_name":"US","state_or_province_name":"CA","locality_name":"San Francisco","organization_name":"jc","organizational_unit_name":"jc","common_name":"jc","email_address":"info@jc.com"},"subject_pk_info":{"algorithm":{"algorithm":"rsa","parameters":null},"public_key":{"modulus":"a3:6d:d5:69:43:b3:c2:00:87:a6:3e:57:6f:24:4c:d7:65:47:b5:87:bc:2f:3e:03:f5:bd:d8:96:36:d0:69:99:5e:3b:fd:cc:6a:3d:d9:ce:75:d8:36:ff:0e:08:55:ac:2b:ab:c5:ec:72:39:a4:12:ad:25:5f:1c:d6:c3:46:53:1b:fe:c9:53:fe:bf:18:33:64:2f:5c:36:f0:99:a1:46:4e:bc:84:19:88:0f:62:d5:14:65:c7:74:e3:a4:4c:50:6e:e0:5f:58:2a:e4:1a:10:fb:54:0b:7f:c6:d0:55:30:ea:8a:30:d2:ce:7e:93:c2:c9:0d:fd:75:20:89:70:51:49:3b:51:cb:6c:f7:5d:39:ee:d8:13:92:7d:31:04:61:e0:49:9f:34:ed:60:2a:72:a8:0b:a5:bf:2e:04:9f:61:13:7c:94:f1:e0:75:43:e0:5a:74:a4:78:36:50:f8:9b:c2:d4:e3:e2:f7:87:09:71:c7:b1:4c:53:60:3b:b3:1f:20:12:c4:cb:16:35:8b:a7:e1:00:01:a4:db:9e:c0:e7:e3:b3:a5:9b:ea:3d:38:1e:07:41:19:4c:48:34:92:71:c7:ee:ab:78:09:6f:f6:2b:78:c9:63:cc:46:1b:ad:e4:1d:96:dd:18:df:c9:66:79:73:ee:ee:d2:77:66:f5:3f","public_exponent":65537}},"attributes":[]},"signature_algorithm":{"algorithm":"sha256_rsa","parameters":null},"signature":"4a:8c:ea:3c:8a:4f:ae:f7:32:cf:46:f4:aa:a2:30:43:c8:21:26:6d:58:d4:89:12:3d:ed:69:7a:c8:3f:c7:62:d2:79:24:23:5d:53:b4:8d:6f:dd:60:82:bf:ab:46:14:bb:74:5d:92:7c:7f:f6:ad:0d:0c:74:fa:15:93:5e:ae:61:b2:dd:2c:89:a1:c9:c1:21:b9:92:57:39:be:05:98:97:e1:39:3c:5e:9a:18:56:b6:2a:db:62:51:23:d8:3c:f8:f9:dd:c1:5f:5a:85:a3:b6:e2:93:95:12:1a:6f:bb:93:14:50:e8:72:a0:b9:d9:4b:cb:8c:b6:08:35:60:4a:ba:9d:6d:e6:ba:7f:ea:a8:fe:d7:28:70:f4:c2:d1:29:92:94:c4:ff:ad:1e:b3:ad:c9:92:3e:b8:02:29:3c:e7:d6:84:a2:d4:77:46:bd:28:15:35:c6:f4:2c:a0:08:e8:9d:75:93:63:8d:e8:c8:be:60:12:da:09:a4:b7:bc:48:97:ca:fb:18:73:1f:0a:43:3d:9d:f4:8f:77:ad:7e:d2:21:b4:53:87:a1:89:d5:ee:cb:08:7c:5d:0f:a5:37:32:f2:74:53:67:97:59:f6:d4:fa:0b:9a:55:08:e3:5a:55:1b:e1:5c:f8:00:62:fb:83:e0:13:30:b1:71:5c:5c:8d
"}]
tests/fixtures/generic/x509-csr-windows.json (vendored, new file, 1 line)
@@ -0,0 +1 @@
[{"certification_request_info":{"version":"v1","subject":{"country_name":"US","state_or_province_name":"NY","locality_name":"NY","organization_name":"My Company","organizational_unit_name":"IT","common_name":"www.mywebsite.com"},"subject_pk_info":{"algorithm":{"algorithm":"rsa","parameters":null},"public_key":{"modulus":"b5:92:fc:6c:31:40:34:d3:9b:34:d7:3d:be:4e:ee:33:39:ad:5a:ba:a1:fe:a9:c8:2d:c7:b0:db:e6:d0:d1:7d:37:68:4b:47:5e:06:61:4c:9e:cc:b0:2f:85:c8:49:a4:2b:96:45:f6:62:5f:24:50:0f:95:72:47:e5:62:f2:10:95:97:90:7e:b9:0f:11:a1:9d:c4:90:85:4d:68:04:bd:a0:c0:43:58:4c:f8:13:ba:98:3d:97:e4:68:8f:c3:55:fb:e0:d2:61:95:23:01:f1:7b:d8:61:c8:df:e1:3f:c6:fb:9d:ba:c1:e3:e3:44:2a:c1:dc:5c:a1:92:7a:95:3c:f1:e5:3f:55:4e:fe:22:3c:c0:2f:78:de:cc:7b:8c:ab:00:6d:f1:db:c9:eb:91:aa:d7:f0:89:c4:0c:60:a8:8c:cb:4a:af:bc:1c:1e:61:2d:cf:89:84:0f:ff:f6:dd:7f:61:0d:49:67:52:90:64:b1:e0:6c:e6:d8:28:00:08:11:0b:3a:cd:92:27:57:4f:0b:95:0d:59:90:06:a0:38:18:06:87:97:5a:48:97:10:fa:9d:dd:ef:dc:de:c6:88:dd:24:6b:1a:5e:4f:ab:ea:67:21:41:37:9a:7f:ac:cd:1c:3a:7a:0e:1f:d7:6b:28:91:ca:bd:2b:a8:ab:b8:38:19","public_exponent":65537}},"attributes":[{"type":"microsoft_os_version","values":["6.1.7601.2"]},{"type":"microsoft_request_client_info","values":[{"clientid":5,"machinename":"dell-PC","username":"dell-PC\\Dev","processname":"InetMgr.exe"}]},{"type":"microsoft_enrollment_csp_provider","values":[{"keyspec":1,"cspname":"Microsoft RSA SChannel Cryptographic 
Provider","signature":[]}]},{"type":"extension_request","values":[[{"extn_id":"key_usage","critical":true,"extn_value":["data_encipherment","digital_signature","key_encipherment","non_repudiation"]},{"extn_id":"extended_key_usage","critical":false,"extn_value":["server_auth"]},{"extn_id":"1.2.840.113549.1.9.15","critical":false,"extn_value":"30:69:30:0e:06:08:2a:86:48:86:f7:0d:03:02:02:02:00:80:30:0e:06:08:2a:86:48:86:f7:0d:03:04:02:02:00:80:30:0b:06:09:60:86:48:01:65:03:04:01:2a:30:0b:06:09:60:86:48:01:65:03:04:01:2d:30:0b:06:09:60:86:48:01:65:03:04:01:02:30:0b:06:09:60:86:48:01:65:03:04:01:05:30:07:06:05:2b:0e:03:02:07:30:0a:06:08:2a:86:48:86:f7:0d:03:07"},{"extn_id":"key_identifier","critical":false,"extn_value":"02:3f:12:86:0b:e5:e7:b6:48:cc:b3:57:b7:8b:1e:27:81:5f:0b:08"}]]}]},"signature_algorithm":{"algorithm":"sha1_rsa","parameters":null},"signature":"56:74:46:d0:35:ab:e1:fe:c6:07:5f:d7:d7:1c:0f:3b:28:c6:a0:80:e8:d2:95:58:04:61:ac:d1:b9:af:20:bc:f9:fd:15:84:54:87:d4:d3:8f:c8:46:35:68:c1:21:21:92:c9:a7:60:43:6a:83:f0:d8:6f:a5:c5:e4:da:97:bd:15:cd:b3:9b:93:96:f0:29:37:b7:2c:01:96:54:43:43:56:a9:8e:df:1d:db:16:05:ce:e4:f7:dd:15:84:c7:32:d5:f2:c9:8e:a7:64:2b:ab:ba:71:6a:1a:ca:2f:94:2c:6e:36:aa:d0:12:00:83:e5:0f:5a:67:a9:5c:4f:c7:76:5b:67:a9:69:ee:5b:0b:36:78:5b:11:c2:50:41:ed:61:e8:da:de:16:10:7e:4f:5e:16:96:80:56:ae:0a:14:8a:01:3e:71:98:e4:bf:95:27:51:fb:f7:07:5a:d2:9e:3e:d2:46:35:68:1f:cf:c9:52:8d:31:99:ae:4a:98:5d:78:c0:f4:e3:98:6c:2a:6e:94:15:3f:bd:f5:ec:2a:fe:87:0a:0c:82:57:ed:4e:d5:54:e8:10:ff:9d:df:52:62:77:ce:a5:1a:bd:c8:3e:4b:1a:f0:14:5f:30:15:4e:7d:54:a3:d2:19:8a:0c:04:e5:84:3d:df:f9:a5:bd:2b:1b:1f"}]
tests/fixtures/generic/x509-csr-windows.pem (vendored, new file, 25 lines)
@@ -0,0 +1,25 @@
-----BEGIN NEW CERTIFICATE REQUEST-----
MIIERTCCAy0CAQAwZTELMAkGA1UEBhMCVVMxCzAJBgNVBAgMAk5ZMQswCQYDVQQH
DAJOWTETMBEGA1UECgwKTXkgQ29tcGFueTELMAkGA1UECwwCSVQxGjAYBgNVBAMM
EXd3dy5teXdlYnNpdGUuY29tMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKC
AQEAtZL8bDFANNObNNc9vk7uMzmtWrqh/qnILcew2+bQ0X03aEtHXgZhTJ7MsC+F
yEmkK5ZF9mJfJFAPlXJH5WLyEJWXkH65DxGhncSQhU1oBL2gwENYTPgTupg9l+Ro
j8NV++DSYZUjAfF72GHI3+E/xvudusHj40QqwdxcoZJ6lTzx5T9VTv4iPMAveN7M
e4yrAG3x28nrkarX8InEDGCojMtKr7wcHmEtz4mED//23X9hDUlnUpBkseBs5tgo
AAgRCzrNkidXTwuVDVmQBqA4GAaHl1pIlxD6nd3v3N7GiN0kaxpeT6vqZyFBN5p/
rM0cOnoOH9drKJHKvSuoq7g4GQIDAQABoIIBmTAaBgorBgEEAYI3DQIDMQwWCjYu
MS43NjAxLjIwNQYJKwYBBAGCNxUUMSgwJgIBBQwHZGVsbC1QQwwLZGVsbC1QQ1xE
ZXYMC0luZXRNZ3IuZXhlMHIGCisGAQQBgjcNAgIxZDBiAgEBHloATQBpAGMAcgBv
AHMAbwBmAHQAIABSAFMAQQAgAFMAQwBoAGEAbgBuAGUAbAAgAEMAcgB5AHAAdABv
AGcAcgBhAHAAaABpAGMAIABQAHIAbwB2AGkAZABlAHIDAQAwgc8GCSqGSIb3DQEJ
DjGBwTCBvjAOBgNVHQ8BAf8EBAMCBPAwEwYDVR0lBAwwCgYIKwYBBQUHAwEweAYJ
KoZIhvcNAQkPBGswaTAOBggqhkiG9w0DAgICAIAwDgYIKoZIhvcNAwQCAgCAMAsG
CWCGSAFlAwQBKjALBglghkgBZQMEAS0wCwYJYIZIAWUDBAECMAsGCWCGSAFlAwQB
BTAHBgUrDgMCBzAKBggqhkiG9w0DBzAdBgNVHQ4EFgQUAj8Shgvl57ZIzLNXt4se
J4FfCwgwDQYJKoZIhvcNAQEFBQADggEBAFZ0RtA1q+H+xgdf19ccDzsoxqCA6NKV
WARhrNG5ryC8+f0VhFSH1NOPyEY1aMEhIZLJp2BDaoPw2G+lxeTal70VzbObk5bw
KTe3LAGWVENDVqmO3x3bFgXO5PfdFYTHMtXyyY6nZCurunFqGsovlCxuNqrQEgCD
5Q9aZ6lcT8d2W2epae5bCzZ4WxHCUEHtYeja3hYQfk9eFpaAVq4KFIoBPnGY5L+V
J1H79wda0p4+0kY1aB/PyVKNMZmuSphdeMD045hsKm6UFT+99ewq/ocKDIJX7U7V
VOgQ/53fUmJ3zqUavcg+SxrwFF8wFU59VKPSGYoMBOWEPd/5pb0rGx8=
-----END NEW CERTIFICATE REQUEST-----
tests/fixtures/generic/x509-csr.der (vendored, new file, binary)
Binary file not shown.
tests/fixtures/generic/x509-csr.json (vendored, new file, 1 line)
@@ -0,0 +1 @@
    "certification_request_info": {
      "version": "v1",
      "subject": {
        "country_name": "US",
        "state_or_province_name": "Utah",
        "locality_name": "Lindon",
        "organization_name": "DigiCert Inc.",
        "organizational_unit_name": "DigiCert",
        "common_name": "example.digicert.com"
      },
      "subject_pk_info": {
        "algorithm": {
          "algorithm": "rsa",
          "parameters": null
        },
        "public_key": {
          "modulus": "f3:e4:e8:ed:df:b6:90:f5:9e:06:ff:e8:ad:4d:cb:55:b2:70:0e:b4:90:6d:e2:9a:98:29:a8:c2:9e:5b:a8:3c:48:c1:5d:b4:ce:a4:5b:ec:03:d4:38:a6:28:54:41:45:38:44:2c:e9:3e:a0:22:69:c8:a2:58:5b:88:7e:a6:e3:38:19:fc:23:ef:58:13:a4:65:cf:9c:d4:fa:36:12:6b:c1:cf:e0:03:e6:c0:5d:4f:99:33:19:00:3a:35:b5:b2:64:69:5d:c5:1b:61:34:b3:ac:d5:e7:ce:85:d9:d6:16:e8:48:d7:ad:aa:99:c7:e5:82:98:88:58:3b:b0:ab:80:bd:7f:e6:24:78:98:4d:9f:d7:45:e7:ea:30:9b:c7:0e:42:60:eb:57:c3:4d:76:24:ea:8a:7f:2a:de:a6:00:1c:72:51:5b:6f:20:94:95:02:66:44:d9:c0:86:92:47:a7:2b:05:0f:13:6d:83:44:d1:d7:3e:09:a6:b7:0c:e2:24:cf:51:0e:b0:75:b3:4f:1f:a7:d3:32:9f:a9:c6:e0:5e:2e:03:27:1f:82:d5:b8:e9:b5:83:d1:04:f6:4b:f0:30:1e:5a:e0:3c:79:bb:9d:55:3e:38:c8:4a:7c:d8:6f:7a:fc:68:1c:7f:b1:77:df:13:31:7b:4c:9c:f9:76:ba:a3",
          "public_exponent": 65537
        }
      },
      "attributes": []
    },
    "signature_algorithm": {
      "algorithm": "sha1_rsa",
      "parameters": null
    },
    "signature": "1d:24:72:b1:5c:71:29:85:0e:6c:68:c7:43:5e:d3:55:08:a9:2b:03:a8:78:0b:f9:79:87:4d:72:70:ad:ee:83:84:94:99:c1:bb:c4:b4:e2:b4:1b:7f:9d:af:81:6c:d7:55:ae:50:db:79:a9:c2:ec:c7:96:bc:ba:4e:06:e8:02:87:33:3b:a1:2e:c2:7b:5d:98:e0:99:05:c6:10:2a:58:43:89:82:df:24:f7:66:80:86:a4:85:db:c3:e8:8f:de:59:84:11:78:1a:40:bd:13:c7:92:c5:97:fa:24:29:b2:98:c0:8a:8d:8b:22:96:38:c8:fb:65:1f:f0:c5:68:3f:64:31:91:b3:9e:71:ba:87:8b:0c:9f:d9:44:57:fd:6c:8f:88:68:25:1d:d5:8a:df:61:c1:c8:97:71:bc:ec:0b:fe:af:8f:58:57:0a:91:0d:3d:15:0d:5e:ee:2e:0a:a7:db:d5:c8:d4:fa:55:50:d0:8f:40:69:fd:a7:f7:97:e9:0a:3b:be:90:da:3f:26:d1:b4:0d:91:ed:72:ca:8d:06:85:f6:85:d6:78:25:2a:cb:58:6f:25:a7:3d:40:53:b6:f7:b3:9b:d5:a9:69:1c:fa:19:ee:65:a2:12:e2:70:8c:13:e2:8b:a6:bd:33:d1:b7:d2:75:28:df:d9:41:8b:5c"
  }
]
```
17 tests/fixtures/generic/x509-csr.pem vendored Normal file
@@ -0,0 +1,17 @@
-----BEGIN CERTIFICATE REQUEST-----
MIICvDCCAaQCAQAwdzELMAkGA1UEBhMCVVMxDTALBgNVBAgMBFV0YWgxDzANBgNV
BAcMBkxpbmRvbjEWMBQGA1UECgwNRGlnaUNlcnQgSW5jLjERMA8GA1UECwwIRGln
aUNlcnQxHTAbBgNVBAMMFGV4YW1wbGUuZGlnaWNlcnQuY29tMIIBIjANBgkqhkiG
9w0BAQEFAAOCAQ8AMIIBCgKCAQEA8+To7d+2kPWeBv/orU3LVbJwDrSQbeKamCmo
wp5bqDxIwV20zqRb7APUOKYoVEFFOEQs6T6gImnIolhbiH6m4zgZ/CPvWBOkZc+c
1Po2EmvBz+AD5sBdT5kzGQA6NbWyZGldxRthNLOs1efOhdnWFuhI162qmcflgpiI
WDuwq4C9f+YkeJhNn9dF5+owm8cOQmDrV8NNdiTqin8q3qYAHHJRW28glJUCZkTZ
wIaSR6crBQ8TbYNE0dc+Caa3DOIkz1EOsHWzTx+n0zKfqcbgXi4DJx+C1bjptYPR
BPZL8DAeWuA8ebudVT44yEp82G96/Ggcf7F33xMxe0yc+Xa6owIDAQABoAAwDQYJ
KoZIhvcNAQEFBQADggEBAB0kcrFccSmFDmxox0Ne01UIqSsDqHgL+XmHTXJwre6D
hJSZwbvEtOK0G3+dr4Fs11WuUNt5qcLsx5a8uk4G6AKHMzuhLsJ7XZjgmQXGECpY
Q4mC3yT3ZoCGpIXbw+iP3lmEEXgaQL0Tx5LFl/okKbKYwIqNiyKWOMj7ZR/wxWg/
ZDGRs55xuoeLDJ/ZRFf9bI+IaCUd1YrfYcHIl3G87Av+r49YVwqRDT0VDV7uLgqn
29XI1PpVUNCPQGn9p/eX6Qo7vpDaPybRtA2R7XLKjQaF9oXWeCUqy1hvJac9QFO2
97Ob1alpHPoZ7mWiEuJwjBPii6a9M9G30nUo39lBi1w=
-----END CERTIFICATE REQUEST-----
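A CSR fixture like the one above is just base64-wrapped DER; a minimal stdlib-only sketch (assuming a single well-formed PEM block; `pem_to_der` is a hypothetical helper, not jc's implementation) of recovering the DER bytes that a parser such as `jc --x509-csr` would then decode:

```python
import base64
import re

def pem_to_der(pem: str) -> bytes:
    """Extract and decode the base64 body of a PEM certificate request."""
    match = re.search(
        r"-----BEGIN CERTIFICATE REQUEST-----(.*?)-----END CERTIFICATE REQUEST-----",
        pem,
        re.DOTALL,
    )
    body = "".join(match.group(1).split())  # drop newlines and whitespace
    return base64.b64decode(body)

# A CSR's DER encoding always starts with a SEQUENCE tag (0x30):
# der = pem_to_der(open("myserver.csr").read())
# assert der[0] == 0x30
```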
28 tests/fixtures/linux-proc/net_tcp vendored Normal file
@@ -0,0 +1,28 @@
sl local_address rem_address st tx_queue rx_queue tr tm->when retrnsmt uid timeout inode
0: 3500007F:0035 00000000:0000 0A 00000000:00000000 00:00000000 00000000 101 0 34000 1 0000000000000000 100 0 0 10 5
1: 0100007F:89E7 00000000:0000 0A 00000000:00000000 00:00000000 00000000 1000 0 140054 1 0000000000000000 100 0 0 10 0
2: 0100007F:0277 00000000:0000 0A 00000000:00000000 00:00000000 00000000 0 0 34523 1 0000000000000000 100 0 0 10 0
3: 802AA8C0:D450 2104C798:01BB 08 00000000:00000052 02:00000A6F 00000000 1000 0 149098 2 0000000000000000 20 4 0 10 -1
0: 00000000:0016 00000000:0000 0A 00000000:00000000 00:00000000 00000000 0 0 23459 1 ffff8c7a0de93a80 100 0 0 10 0
1: 1C00000A:A462 6C000C40:0050 04 00000001:00000000 01:00000015 00000000 0 0 0 3 ffff8c7a0de930c0 21 4 30 10 -1
2: 1C00000A:97D0 412AF468:0050 06 00000000:00000000 03:000007C5 00000000 0 0 0 3 ffff8c7a12d31aa0
3: 1C00000A:B082 E4BDFA8E:0050 06 00000000:00000000 03:00000799 00000000 0 0 0 3 ffff8c7a12d31990
4: 1C00000A:A858 A6644468:0050 06 00000000:00000000 03:000007C4 00000000 0 0 0 3 ffff8c7a12d31000
5: 1C00000A:BD06 2316F09D:0050 06 00000000:00000000 03:000007C4 00000000 0 0 0 3 ffff8c7a12d31880
6: 1C00000A:9572 69769736:0050 04 00000001:00000000 01:00000015 00000000 0 0 0 3 ffff8c7a0de94e00 21 4 30 10 -1
7: 1C00000A:A530 DA436C68:0050 04 00000001:00000000 01:00000016 00000000 0 0 0 3 ffff8c7a0de91380 22 4 30 10 -1
8: 1C00000A:8DDC 9F426597:0050 06 00000000:00000000 03:000007B5 00000000 0 0 0 3 ffff8c7a12d31110
9: 1C00000A:C7DE A9664468:0050 04 00000001:00000000 01:00000015 00000000 0 0 0 3 ffff8c7a0de96b40 21 4 18 10 -1
10: 1C00000A:DF22 412AF468:0050 06 00000000:00000000 03:00001747 00000000 0 0 0 3 ffff8c7a12d31cc0
11: 1C00000A:0016 9200000A:DDCB 01 00000000:00000000 02:000AB71D 00000000 0 0 41558 2 ffff8c7a0de957c0 20 4 1 10 -1
12: 1C00000A:DFF8 A40B8962:0050 06 00000000:00000000 03:00001722 00000000 0 0 0 3 ffff8c7a12d31660
13: 1C00000A:A5BE A6644468:0050 06 00000000:00000000 03:00001768 00000000 0 0 0 3 ffff8c7a12d31770
14: 1C00000A:DA30 9F426597:0050 06 00000000:00000000 03:00001731 00000000 0 0 0 3 ffff8c7a12d31dd0
15: 1C00000A:9F62 192E3E17:0050 04 00000001:00000000 01:00000015 00000000 0 0 0 3 ffff8c7a0de909c0 21 4 30 10 -1
16: 1C00000A:DA50 E4BDFA8E:0050 06 00000000:00000000 03:00001716 00000000 0 0 0 3 ffff8c7a12d31ee0
17: 1C00000A:C362 2316F09D:0050 06 00000000:00000000 03:00001748 00000000 0 0 0 3 ffff8c7a12d31330
18: 1C00000A:A734 15E7064A:0050 06 00000000:00000000 03:000007A7 00000000 0 0 0 3 ffff8c7a12d31440
19: 1C00000A:EA10 A9664468:0050 06 00000000:00000000 03:000007C4 00000000 0 0 0 3 ffff8c7a12d31550
20: 1C00000A:D7C6 B0D4B136:0050 06 00000000:00000000 03:000007C4 00000000 0 0 0 3 ffff8c7a12d31bb0
21: 1C00000A:DD12 DA436C68:0050 06 00000000:00000000 03:000007C4 00000000 0 0 0 3 ffff8c7a12d31220
22: 1C00000A:D514 8000C798:0050 04 00000001:00000000 01:00000015 00000000 0 0 0 3 ffff8c7a0de94440 21 4 30 10 -1
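The addresses in this fixture are hex dumps of 32-bit values in kernel (host) byte order, which is little-endian on the x86 machines these samples came from. A sketch of the decoding (a hypothetical helper assuming a little-endian host, not jc's actual parser):

```python
import socket
import struct

def decode_ipv4(hex_addr: str) -> str:
    """Decode a /proc/net/tcp address:port pair like '3500007F:0035'."""
    ip_hex, port_hex = hex_addr.split(":")
    # The kernel prints the 32-bit address in host byte order (little-endian here),
    # so pack it back into bytes little-endian before formatting.
    ip = socket.inet_ntoa(struct.pack("<I", int(ip_hex, 16)))
    return f"{ip}:{int(port_hex, 16)}"

print(decode_ipv4("3500007F:0035"))  # -> 127.0.0.53:53
```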
1 tests/fixtures/linux-proc/net_tcp.json vendored Normal file
File diff suppressed because one or more lines are too long
9 tests/fixtures/linux-proc/net_tcp6 vendored Normal file
@@ -0,0 +1,9 @@
sl local_address remote_address st tx_queue rx_queue tr tm->when retrnsmt uid timeout inode
0: 00000000000000000000000000000000:0016 00000000000000000000000000000000:0000 0A 00000000:00000000 00:00000000 00000000 0 0 23556 1 ffff8c7a0dee0000 100 0 0 10 0
1: 4606012600F1019E000000004C3A0000:E700 8028032A830031F10CB0CEFADE250000:0050 06 00000000:00000000 03:000003D1 00000000 0 0 0 3 ffff8c7a1c76d000
2: 4606012600F1019E000000004C3A0000:E9E6 98490120073544000000000000800000:0050 06 00000000:00000000 03:000003AD 00000000 0 0 0 3 ffff8c7a1c76d440
3: 4606012600F1019E000000004C3A0000:95E2 061400268F01003A000000003A0C0000:0050 06 00000000:00000000 03:00000400 00000000 0 0 0 3 ffff8c7a1c76daa0
4: 4606012600F1019E000000004C3A0000:AFE4 59050120916019000000000020070000:0050 06 00000000:00000000 03:000003EC 00000000 0 0 0 3 ffff8c7a1c76d550
5: 4606012600F1019E000000004C3A0000:9D18 B0F80726010805400000000004200000:0050 06 00000000:00000000 03:00000396 00000000 0 0 0 3 ffff8c7a1c76dcc0
6: 4606012600F1019E000000004C3A0000:EE0A 590501208F60190000000000330B0000:0050 06 00000000:00000000 03:000003FC 00000000 0 0 0 3 ffff8c7a1c76d110
7: 4606012600F1019E000000004C3A0000:80A8 5905012000C01900000000001333C517:0050 06 00000000:00000000 03:000003FC 00000000 0 0 0 3 ffff8c7a1c76d330
1 tests/fixtures/linux-proc/net_tcp6.json vendored Normal file
@@ -0,0 +1 @@
[{"entry":"0","local_address":"::","local_port":22,"remote_address":"::","remote_port":0,"state":"0A","tx_queue":"00000000","rx_queue":"00000000","timer_active":0,"jiffies_until_timer_expires":"00000000","unrecovered_rto_timeouts":"00000000","uid":0,"unanswered_0_window_probes":0,"inode":23556,"sock_ref_count":1,"sock_mem_loc":"ffff8c7a0dee0000","retransmit_timeout":100,"soft_clock_tick":0,"ack_quick_pingpong":0,"sending_congestion_window":10,"slow_start_size_threshold":0},{"entry":"1","local_address":"2601:646:9e01:f100::3a4c","local_port":59136,"remote_address":"2a03:2880:f131:83:face:b00c:0:25de","remote_port":80,"state":"06","tx_queue":"00000000","rx_queue":"00000000","timer_active":3,"jiffies_until_timer_expires":"000003D1","unrecovered_rto_timeouts":"00000000","uid":0,"unanswered_0_window_probes":0,"inode":0,"sock_ref_count":3,"sock_mem_loc":"ffff8c7a1c76d000"},{"entry":"2","local_address":"2601:646:9e01:f100::3a4c","local_port":59878,"remote_address":"2001:4998:44:3507::8000","remote_port":80,"state":"06","tx_queue":"00000000","rx_queue":"00000000","timer_active":3,"jiffies_until_timer_expires":"000003AD","unrecovered_rto_timeouts":"00000000","uid":0,"unanswered_0_window_probes":0,"inode":0,"sock_ref_count":3,"sock_mem_loc":"ffff8c7a1c76d440"},{"entry":"3","local_address":"2601:646:9e01:f100::3a4c","local_port":38370,"remote_address":"2600:1406:3a00:18f::c3a","remote_port":80,"state":"06","tx_queue":"00000000","rx_queue":"00000000","timer_active":3,"jiffies_until_timer_expires":"00000400","unrecovered_rto_timeouts":"00000000","uid":0,"unanswered_0_window_probes":0,"inode":0,"sock_ref_count":3,"sock_mem_loc":"ffff8c7a1c76daa0"},{"entry":"4","local_address":"2601:646:9e01:f100::3a4c","local_port":45028,"remote_address":"2001:559:19:6091::720","remote_port":80,"state":"06","tx_queue":"00000000","rx_queue":"00000000","timer_active":3,"jiffies_until_timer_expires":"000003EC","unrecovered_rto_timeouts":"00000000","uid":0,"unanswered_0_window_probes":0,"inode":0,"sock_ref_count":3,"sock_mem_loc":"ffff8c7a1c76d550"},{"entry":"5","local_address":"2601:646:9e01:f100::3a4c","local_port":40216,"remote_address":"2607:f8b0:4005:801::2004","remote_port":80,"state":"06","tx_queue":"00000000","rx_queue":"00000000","timer_active":3,"jiffies_until_timer_expires":"00000396","unrecovered_rto_timeouts":"00000000","uid":0,"unanswered_0_window_probes":0,"inode":0,"sock_ref_count":3,"sock_mem_loc":"ffff8c7a1c76dcc0"},{"entry":"6","local_address":"2601:646:9e01:f100::3a4c","local_port":60938,"remote_address":"2001:559:19:608f::b33","remote_port":80,"state":"06","tx_queue":"00000000","rx_queue":"00000000","timer_active":3,"jiffies_until_timer_expires":"000003FC","unrecovered_rto_timeouts":"00000000","uid":0,"unanswered_0_window_probes":0,"inode":0,"sock_ref_count":3,"sock_mem_loc":"ffff8c7a1c76d110"},{"entry":"7","local_address":"2601:646:9e01:f100::3a4c","local_port":32936,"remote_address":"2001:559:19:c000::17c5:3313","remote_port":80,"state":"06","tx_queue":"00000000","rx_queue":"00000000","timer_active":3,"jiffies_until_timer_expires":"000003FC","unrecovered_rto_timeouts":"00000000","uid":0,"unanswered_0_window_probes":0,"inode":0,"sock_ref_count":3,"sock_mem_loc":"ffff8c7a1c76d330"}]
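The IPv6 addresses decoded in this fixture come from `/proc/net/tcp6`, where the kernel prints four 32-bit words, each in host byte order. A sketch of the decoding (a hypothetical helper assuming a little-endian host, not jc's actual parser):

```python
import socket
import struct

def decode_ipv6(hex_addr: str) -> tuple:
    """Decode a /proc/net/tcp6 address:port pair into (ip, port)."""
    ip_hex, port_hex = hex_addr.split(":")
    # Four 32-bit words, each printed in host byte order (little-endian here);
    # repack each word little-endian to recover the network-order bytes.
    words = [struct.pack("<I", int(ip_hex[i:i + 8], 16)) for i in range(0, 32, 8)]
    ip = socket.inet_ntop(socket.AF_INET6, b"".join(words))
    return ip, int(port_hex, 16)

print(decode_ipv6("4606012600F1019E000000004C3A0000:E700"))
# -> ('2601:646:9e01:f100::3a4c', 59136)
```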
13 tests/fixtures/ubuntu-18.04/find.json vendored Normal file
@@ -0,0 +1,13 @@
[{"path": null, "node": "."},
{"path":".","node":null},
{"path": ".","node": "jc"},
{"path": "./jc","node": "tests"},
{"path": "./jc/tests","node": "test_find.py"},
{"path": "./jc/tests","node": "test_history.py"},
{"path": "./jc/tests","node": "test_hosts.py"},
{"path": "./jc","node": "anotherdirectory"},
{"path": null,"node": null,"error": "find: './inaccessible': Permission denied"},
{"path": "./jc","node": "directory2"},
{"path": "./jc/directory2","node": "file.txt"},
{"path": "./jc/directory2","node": "file2.txt"},
{"path": ".","node": "newfile.txt"}]
13 tests/fixtures/ubuntu-18.04/find.out vendored Normal file
@@ -0,0 +1,13 @@
.
./
./jc
./jc/tests
./jc/tests/test_find.py
./jc/tests/test_history.py
./jc/tests/test_hosts.py
./jc/anotherdirectory
find: './inaccessible': Permission denied
./jc/directory2
./jc/directory2/file.txt
./jc/directory2/file2.txt
./newfile.txt
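The fixture pair above maps each line of `find` output to a path/node split, with stderr error lines carried through as `error` entries. A sketch of that mapping (`parse_find_line` is a hypothetical helper, not jc's implementation):

```python
import os

def parse_find_line(line: str) -> dict:
    """Split one line of `find` output into its parent path and node name."""
    if line.startswith("find:"):   # stderr (e.g. permission errors) mixed into stdout
        return {"path": None, "node": None, "error": line}
    if line.endswith("/"):         # a bare path with a trailing slash has no node
        return {"path": line.rstrip("/"), "node": None}
    path, node = os.path.split(line)
    return {"path": path or None, "node": node or None}

print(parse_find_line("./jc/tests/test_find.py"))
# -> {'path': './jc/tests', 'node': 'test_find.py'}
```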
31 tests/fixtures/ubuntu-18.04/ip_route.json vendored Normal file
@@ -0,0 +1,31 @@
[
{
"ip": "default",
"via": "10.0.2.2",
"dev": "enp0s3",
"proto": "dhcp",
"metric": 100
},
{
"ip": "10.0.2.0/24",
"dev": "enp0s3",
"proto": "kernel",
"scope": "link",
"src": "10.0.2.15",
"metric": 100
},
{
"ip": "169.254.0.0/16",
"dev": "enp0s3",
"scope": "link",
"metric": 1000
},
{
"ip": "172.17.0.0/16",
"dev": "docker0",
"proto": "kernel",
"scope": "link",
"src": "172.17.0.1",
"status": "linkdown"
}
]
4 tests/fixtures/ubuntu-18.04/ip_route.out vendored Normal file
@@ -0,0 +1,4 @@
default via 10.0.2.2 dev enp0s3 proto dhcp metric 100
10.0.2.0/24 dev enp0s3 proto kernel scope link src 10.0.2.15 metric 100
169.254.0.0/16 dev enp0s3 scope link metric 1000
172.17.0.0/16 dev docker0 proto kernel scope link src 172.17.0.1 linkdown
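`ip route` output is a destination followed by keyword/value pairs plus bare flags like `linkdown`, which is exactly the shape of the JSON fixture above. A sketch of the line-to-dict mapping (a hypothetical helper handling only the keywords seen in this fixture, not jc's parser):

```python
def parse_route_line(line: str) -> dict:
    """Parse one `ip route` line into key/value pairs."""
    fields = line.split()
    keywords = {"via", "dev", "proto", "scope", "src", "metric"}
    entry = {"ip": fields[0]}
    i = 1
    while i < len(fields):
        if fields[i] in keywords:
            value = fields[i + 1]
            entry[fields[i]] = int(value) if fields[i] == "metric" else value
            i += 2
        else:                          # bare flags such as 'linkdown'
            entry["status"] = fields[i]
            i += 1
    return entry

print(parse_route_line("default via 10.0.2.2 dev enp0s3 proto dhcp metric 100"))
# -> {'ip': 'default', 'via': '10.0.2.2', 'dev': 'enp0s3', 'proto': 'dhcp', 'metric': 100}
```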
1599 tests/fixtures/ubuntu-18.04/ss-sudo-tulpen.json vendored Normal file
File diff suppressed because it is too large
74 tests/fixtures/ubuntu-18.04/ss-sudo-tulpen.out vendored Normal file
@@ -0,0 +1,74 @@
Netid State Recv-Q Send-Q Local Address:Port Peer Address:Port
udp UNCONN 0 0 0.0.0.0:4789 0.0.0.0:* ino:58296 sk:1 <->
udp UNCONN 0 0 0.0.0.0:5353 0.0.0.0:* users:(("avahi-daemon",pid=1510,fd=12)) uid:112 ino:25147 sk:2 <->
udp UNCONN 0 0 0.0.0.0:39996 0.0.0.0:* users:(("snmpd",pid=1996,fd=12)) ino:42903 sk:3 <->
udp UNCONN 0 0 0.0.0.0:7946 0.0.0.0:* users:(("dockerd",pid=2840,fd=28)) ino:61138 sk:4 <->
udp UNCONN 0 0 0.0.0.0:8125 0.0.0.0:* users:(("dockerd",pid=2840,fd=104)) ino:78939 sk:5 <->
udp UNCONN 0 0 192.168.122.1:53 0.0.0.0:* users:(("dnsmasq",pid=2783,fd=5)) ino:43666 sk:6 <->
udp UNCONN 0 0 127.0.0.53%lo:53 0.0.0.0:* users:(("systemd-resolve",pid=1117,fd=12)) uid:101 ino:29400 sk:7 <->
udp UNCONN 0 0 0.0.0.0%virbr0:67 0.0.0.0:* users:(("dnsmasq",pid=2783,fd=3)) ino:43663 sk:8 <->
udp UNCONN 0 0 0.0.0.0:68 0.0.0.0:* users:(("dhclient",pid=4852,fd=6)) ino:55971 sk:9 <->
udp UNCONN 0 0 0.0.0.0:68 0.0.0.0:* users:(("dhclient",pid=4738,fd=6)) ino:46019 sk:a <->
udp UNCONN 0 0 0.0.0.0:111 0.0.0.0:* users:(("rpcbind",pid=1119,fd=6)) ino:24994 sk:b <->
udp UNCONN 0 0 0.0.0.0:161 0.0.0.0:* users:(("snmpd",pid=1996,fd=11)) ino:42905 sk:c <->
udp UNCONN 0 0 0.0.0.0:631 0.0.0.0:* users:(("cups-browsed",pid=1880,fd=7)) ino:46417 sk:d <->
udp UNCONN 0 0 0.0.0.0:871 0.0.0.0:* users:(("rpcbind",pid=1119,fd=7)) ino:34845 sk:e <->
udp UNCONN 0 0 0.0.0.0:35059 0.0.0.0:* users:(("avahi-daemon",pid=1510,fd=13)) uid:112 ino:37602 sk:f <->
tcp LISTEN 0 4096 0.0.0.0:9187 0.0.0.0:* users:(("prometheus-post",pid=1671,fd=3)) uid:128 ino:32721 sk:10 <->
tcp LISTEN 0 4096 0.0.0.0:389 0.0.0.0:* users:(("dockerd",pid=2840,fd=183)) ino:128462 sk:11 <->
tcp LISTEN 0 128 0.0.0.0:6566 0.0.0.0:* users:(("systemd",pid=1,fd=62)) ino:35013 sk:12 <->
tcp LISTEN 0 4096 0.0.0.0:2023 0.0.0.0:* users:(("dockerd",pid=2840,fd=97)) ino:78929 sk:13 <->
tcp LISTEN 0 4096 0.0.0.0:5000 0.0.0.0:* users:(("dockerd",pid=2840,fd=214)) ino:138297 sk:14 <->
tcp LISTEN 0 4096 0.0.0.0:2024 0.0.0.0:* users:(("dockerd",pid=2840,fd=99)) ino:75475 sk:15 <->
tcp LISTEN 0 4096 0.0.0.0:9000 0.0.0.0:* users:(("dockerd",pid=2840,fd=66)) ino:71580 sk:16 <->
tcp LISTEN 0 10 0.0.0.0:3689 0.0.0.0:* users:(("rhythmbox",pid=6716,fd=19)) uid:1000 ino:150684 sk:17 <->
tcp LISTEN 0 4096 0.0.0.0:27017 0.0.0.0:* users:(("dockerd",pid=2840,fd=196)) ino:135216 sk:18 <->
tcp LISTEN 0 4096 0.0.0.0:2377 0.0.0.0:* users:(("dockerd",pid=2840,fd=18)) ino:50423 sk:19 <->
tcp LISTEN 0 4096 0.0.0.0:7946 0.0.0.0:* users:(("dockerd",pid=2840,fd=27)) ino:61137 sk:1a <->
tcp LISTEN 0 80 0.0.0.0:3306 0.0.0.0:* users:(("mysqld",pid=2367,fd=24)) uid:125 ino:47161 sk:1b <->
tcp LISTEN 0 4096 0.0.0.0:6443 0.0.0.0:* users:(("dockerd",pid=2840,fd=149)) ino:83788 sk:1c <->
tcp LISTEN 0 4096 0.0.0.0:6379 0.0.0.0:* users:(("dockerd",pid=2840,fd=33)) ino:74330 sk:1d <->
tcp LISTEN 0 4096 0.0.0.0:9100 0.0.0.0:* users:(("prometheus-node",pid=1419,fd=3)) uid:128 ino:24436 sk:1e <->
tcp LISTEN 0 4096 0.0.0.0:9133 0.0.0.0:* users:(("fritzbox_export",pid=1951,fd=3)) uid:128 ino:42810 sk:1f <->
tcp LISTEN 0 4096 0.0.0.0:2222 0.0.0.0:* users:(("dockerd",pid=2840,fd=133)) ino:89671 sk:20 <->
tcp LISTEN 0 4096 0.0.0.0:2223 0.0.0.0:* users:(("dockerd",pid=2840,fd=146)) ino:98624 sk:21 <->
tcp LISTEN 0 4096 127.0.0.1:40783 0.0.0.0:* users:(("containerd",pid=2002,fd=14)) ino:42959 sk:22 <->
tcp LISTEN 0 128 0.0.0.0:111 0.0.0.0:* users:(("rpcbind",pid=1119,fd=8)) ino:34846 sk:23 <->
tcp LISTEN 0 511 0.0.0.0:80 0.0.0.0:* users:(("/usr/sbin/apach",pid=24795,fd=3),("/usr/sbin/apach",pid=2157,fd=3),("/usr/sbin/apach",pid=2156,fd=3),("/usr/sbin/apach",pid=2155,fd=3),("/usr/sbin/apach",pid=2154,fd=3),("/usr/sbin/apach",pid=2153,fd=3),("/usr/sbin/apach",pid=2079,fd=3)) ino:40480 sk:24 <->
tcp LISTEN 0 4096 0.0.0.0:9104 0.0.0.0:* users:(("prometheus-mysq",pid=1587,fd=3)) uid:128 ino:24481 sk:25 <->
tcp LISTEN 0 4096 0.0.0.0:8081 0.0.0.0:* users:(("dockerd",pid=2840,fd=205)) ino:131957 sk:26 <->
tcp LISTEN 0 4096 0.0.0.0:8082 0.0.0.0:* users:(("dockerd",pid=2840,fd=223)) ino:140597 sk:27 <->
tcp LISTEN 0 4096 0.0.0.0:2003 0.0.0.0:* users:(("dockerd",pid=2840,fd=85)) ino:75383 sk:28 <->
tcp LISTEN 0 128 127.0.0.1:5939 0.0.0.0:* users:(("teamviewerd",pid=4963,fd=12)) ino:59559 sk:29 <->
tcp LISTEN 0 4096 0.0.0.0:2004 0.0.0.0:* users:(("dockerd",pid=2840,fd=96)) ino:71664 sk:2a <->
tcp LISTEN 0 500 0.0.0.0:8084 0.0.0.0:* users:(("Main",pid=1799,fd=5)) uid:33 ino:44981 sk:2b <->
tcp LISTEN 0 4096 0.0.0.0:8085 0.0.0.0:* users:(("dockerd",pid=2840,fd=100)) ino:77932 sk:2c <->
tcp LISTEN 0 32 192.168.122.1:53 0.0.0.0:* users:(("dnsmasq",pid=2783,fd=6)) ino:43667 sk:2d <->
tcp LISTEN 0 128 127.0.0.53%lo:53 0.0.0.0:* users:(("systemd-resolve",pid=1117,fd=13)) uid:101 ino:29401 sk:2e <->
tcp LISTEN 0 4096 0.0.0.0:8086 0.0.0.0:* users:(("dockerd",pid=2840,fd=98)) ino:75513 sk:2f <->
tcp LISTEN 0 128 0.0.0.0:22 0.0.0.0:* users:(("sshd",pid=1998,fd=3)) ino:58511 sk:30 <->
tcp LISTEN 0 4096 0.0.0.0:8087 0.0.0.0:* users:(("dockerd",pid=2840,fd=157)) ino:98633 sk:31 <->
tcp LISTEN 0 5 127.0.0.1:631 0.0.0.0:* users:(("cupsd",pid=1410,fd=7)) ino:40509 sk:32 <->
tcp LISTEN 0 4096 0.0.0.0:9111 0.0.0.0:* users:(("syno_exporter-0",pid=1941,fd=3)) uid:128 ino:25196 sk:33 <->
tcp LISTEN 0 4096 0.0.0.0:3000 0.0.0.0:* users:(("dockerd",pid=2840,fd=42)) ino:89081 sk:34 <->
tcp LISTEN 0 4096 0.0.0.0:5432 0.0.0.0:* users:(("dockerd",pid=2840,fd=117)) ino:76497 sk:35 <->
tcp LISTEN 0 4096 0.0.0.0:3033 0.0.0.0:* users:(("dockerd",pid=2840,fd=139)) ino:89680 sk:36 <->
tcp LISTEN 0 4096 0.0.0.0:8089 0.0.0.0:* users:(("dockerd",pid=2840,fd=185)) ino:84656 sk:37 <->
tcp LISTEN 0 100 127.0.0.1:25 0.0.0.0:* users:(("master",pid=5745,fd=13)) ino:57959 sk:38 <->
tcp LISTEN 0 4096 0.0.0.0:3003 0.0.0.0:* users:(("dockerd",pid=2840,fd=23)) ino:90254 sk:39 <->
tcp LISTEN 0 4096 0.0.0.0:8091 0.0.0.0:* users:(("dockerd",pid=2840,fd=144)) ino:84694 sk:3a <->
tcp LISTEN 0 511 0.0.0.0:443 0.0.0.0:* users:(("/usr/sbin/apach",pid=24795,fd=4),("/usr/sbin/apach",pid=2157,fd=4),("/usr/sbin/apach",pid=2156,fd=4),("/usr/sbin/apach",pid=2155,fd=4),("/usr/sbin/apach",pid=2154,fd=4),("/usr/sbin/apach",pid=2153,fd=4),("/usr/sbin/apach",pid=2079,fd=4)) ino:40485 sk:3b <->
tcp LISTEN 0 100 0.0.0.0:1883 0.0.0.0:* users:(("mosquitto",pid=1798,fd=4)) ino:38377 sk:3c <->
tcp LISTEN 0 4096 0.0.0.0:9115 0.0.0.0:* users:(("prometheus-blac",pid=1651,fd=3)) uid:128 ino:25119 sk:3d <->
tcp LISTEN 0 4096 0.0.0.0:636 0.0.0.0:* users:(("dockerd",pid=2840,fd=187)) ino:125799 sk:3e <->
tcp LISTEN 0 4096 0.0.0.0:8092 0.0.0.0:* users:(("dockerd",pid=2840,fd=37)) ino:87972 sk:3f <->
tcp LISTEN 0 4096 0.0.0.0:9116 0.0.0.0:* users:(("snmp_exporter",pid=1928,fd=3)) uid:128 ino:45204 sk:40 <->
tcp LISTEN 0 1024 127.0.0.1:2812 0.0.0.0:* users:(("monit",pid=1806,fd=6)) ino:37711 sk:41 <->
tcp LISTEN 0 4096 0.0.0.0:446 0.0.0.0:* users:(("dockerd",pid=2840,fd=143)) ino:98615 sk:42 <->
tcp LISTEN 0 4096 0.0.0.0:8126 0.0.0.0:* users:(("dockerd",pid=2840,fd=105)) ino:77945 sk:43 <->
tcp LISTEN 0 100 127.0.0.1:49152 0.0.0.0:* users:(("TabNine-deep-lo",pid=9583,fd=22)) uid:1000 ino:486863 sk:6ae <->
tcp LISTEN 0 4096 127.0.0.1:46624 0.0.0.0:* users:(("kited",pid=27494,fd=3)) uid:1000 ino:110003 sk:44 <->
tcp LISTEN 0 4096 0.0.0.0:8000 0.0.0.0:* users:(("dockerd",pid=2840,fd=50)) ino:73647 sk:45 <->
tcp LISTEN 0 128 0.0.0.0:5665 0.0.0.0:* users:(("icinga2",pid=5337,fd=16)) uid:136 ino:57169 sk:46 <->
tcp LISTEN 0 4096 192.168.178.5:10050 0.0.0.0:* users:(("zabbix_agent2",pid=6359,fd=10)) uid:131 ino:64927 sk:47 <->
tcp LISTEN 0 4096 0.0.0.0:9090 0.0.0.0:* users:(("prometheus",pid=1543,fd=7)) uid:128 ino:32704 sk:48 <->
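The `users:((...))` column in this `ss -p` fixture packs one or more (name, pid, fd) triples into a single field. A regex sketch of pulling them out (an illustration, not jc's actual parser):

```python
import re

# One ("name",pid=N,fd=N) triple inside the users:(...) field
_PROC = re.compile(r'\("([^"]+)",pid=(\d+),fd=(\d+)\)')

def parse_ss_users(field: str) -> list:
    """Extract process triples from an `ss -p` users:(...) field."""
    return [
        {"name": name, "pid": int(pid), "fd": int(fd)}
        for name, pid, fd in _PROC.findall(field)
    ]

print(parse_ss_users('users:(("sshd",pid=1998,fd=3))'))
# -> [{'name': 'sshd', 'pid': 1998, 'fd': 3}]
```

Multiple triples, as in the apache lines above, simply yield a longer list.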
1 tests/fixtures/ubuntu-20.04/lsattr-R.json vendored Normal file
@@ -0,0 +1 @@
[{"file": "/tmp/folder/folder", "extents": true}, {"file": "/tmp/folder/folder/test_file", "compression_requested": true, "extents": true}]
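The fixture above maps `lsattr` flag letters to booleans. A sketch of that mapping with a deliberately partial letter table (only a few common attributes are listed, and the attribute names are assumptions, not necessarily jc's field names):

```python
# Partial map of lsattr flag letters to attribute names (assumed names)
ATTR_NAMES = {
    "a": "append_only",
    "c": "compression_requested",
    "e": "extents",
    "i": "immutable",
}

def parse_lsattr_line(line: str) -> dict:
    """Turn a line like '------------c-e---- /path' into a flag dict."""
    flags, path = line.split(maxsplit=1)
    entry = {"file": path}
    for letter in flags:
        if letter in ATTR_NAMES:
            entry[ATTR_NAMES[letter]] = True
    return entry

print(parse_lsattr_line("------------c-e---- /tmp/folder/folder/test_file"))
# -> {'file': '/tmp/folder/folder/test_file', 'compression_requested': True, 'extents': True}
```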
Some files were not shown because too many files have changed in this diff