mirror of https://github.com/kellyjonbrazil/jc.git synced 2026-04-05 17:50:11 +02:00

Compare commits


118 Commits

Author SHA1 Message Date
Kelly Brazil
bcff00799f Merge pull request #328 from kellyjonbrazil/dev
Dev v1.22.3
2022-12-16 15:31:16 -06:00
Kelly Brazil
0dc76621bc doc update 2022-12-16 12:54:07 -08:00
Kelly Brazil
1040c5706f split prefix and ports. remove utc timestamps 2022-12-14 11:55:46 -08:00
Kelly Brazil
265d08e3bd doc update 2022-12-13 17:32:12 -08:00
Kelly Brazil
491fce7052 add openvpn-status.log file parser 2022-12-13 17:31:05 -08:00
Kelly Brazil
3404bc4840 add pgpass tests 2022-12-13 14:57:22 -08:00
Kelly Brazil
064e3f6ac0 try changing test dates for windows compatibility 2022-12-13 14:11:12 -08:00
Kelly Brazil
71f0c4f9da doc update 2022-12-13 13:46:33 -08:00
Kelly Brazil
18143608dc doc update 2022-12-13 13:30:50 -08:00
Kelly Brazil
83a6d92449 fix tests for additional timestamps 2022-12-13 13:30:31 -08:00
Kelly Brazil
ac1f690c54 add epoch timestamps 2022-12-13 13:29:54 -08:00
Kelly Brazil
b063f1bfb4 add google big table time format 2022-12-13 13:10:34 -08:00
Kelly Brazil
4c5caa7b86 add iso attribute to timestamp 2022-12-13 12:47:31 -08:00
Kelly Brazil
85edda6e5f Merge branch 'dev' of https://github.com/kellyjonbrazil/jc into dev 2022-12-13 12:15:28 -08:00
Kelly Brazil
6e1a4a103c formatting and typing fixes 2022-12-13 12:13:27 -08:00
Kelly Brazil
2879c084e5 doc update 2022-12-13 09:01:13 -08:00
Kelly Brazil
4ecc94e531 formatting and typing fixes 2022-12-13 09:01:04 -08:00
Kelly Brazil
b8ef583b93 Merge pull request #327 from graipher/master
Add parser for cbt
2022-12-13 10:40:46 -06:00
Andreas Weiden
fd61e19135 Add raw schema 2022-12-12 19:48:33 +01:00
Andreas Weiden
79011465af Fix docstring 2022-12-12 19:46:45 +01:00
Andreas Weiden
27acedf8b7 Add newlines at end of files 2022-12-12 15:43:27 +01:00
Andreas Weiden
532c37140c Move test stuff to fixtures 2022-12-12 15:33:40 +01:00
Andreas Weiden
7f4a951065 Add parser to table in readme 2022-12-12 15:18:39 +01:00
Andreas Weiden
b9fb7fad9c Add parser for cbt 2022-12-12 15:10:59 +01:00
Kelly Brazil
be85d78f55 doc update 2022-12-05 14:54:23 -08:00
Kelly Brazil
23e02090e0 add pgpass parser 2022-12-05 14:43:48 -08:00
Kelly Brazil
50f2a811ad doc update 2022-12-05 12:43:45 -08:00
Kelly Brazil
28ee448c44 remove unneeded type ignore comments 2022-12-05 12:42:55 -08:00
Kelly Brazil
688a2099b5 relax JSONDictType - unions and Dicts cause a lot of friction 2022-12-05 12:15:35 -08:00
Kelly Brazil
6f0a53ed02 fix doc title 2022-12-02 16:10:00 -08:00
Kelly Brazil
8a14de663e doc update 2022-12-02 15:16:46 -08:00
Kelly Brazil
09344e938a add lane info lines 2022-12-02 15:16:07 -08:00
Kelly Brazil
bf41140322 fix tests for missing ipv6 addresses 2022-12-02 11:16:13 -08:00
Kelly Brazil
689a85db9b fix for missing ipv6 addresses and scope_id 2022-12-02 11:09:17 -08:00
Kelly Brazil
5e0d206e7a minor cleanup 2022-11-22 13:18:08 -08:00
Kelly Brazil
61addd7950 doc update 2022-11-22 13:12:40 -08:00
Kelly Brazil
60bb9e2aa9 doc update 2022-11-22 13:11:41 -08:00
Kelly Brazil
842fbbab64 formatting 2022-11-22 13:11:10 -08:00
Kelly Brazil
bbd227caf4 tighten up blank line checking 2022-11-22 13:10:58 -08:00
Kelly Brazil
975b4f5e4f add clf-s parser tests 2022-11-22 13:10:15 -08:00
Kelly Brazil
06840931ba doc update 2022-11-21 16:54:42 -08:00
Kelly Brazil
9f4327f517 add clf-s streaming parser 2022-11-21 16:54:13 -08:00
Kelly Brazil
3f13c70dfa formatting 2022-11-21 12:12:16 -08:00
Kelly Brazil
26f8803b23 add support for unparsable lines 2022-11-21 12:09:19 -08:00
Kelly Brazil
60f1e79b2f fix clf request string parsing and add tests 2022-11-21 11:00:58 -08:00
Kelly Brazil
5ab2ebe45a add CLF timestamp support 2022-11-21 09:27:37 -08:00
Kelly Brazil
9c8fe80d6d add more processing and timestamp 2022-11-21 09:27:21 -08:00
Kelly Brazil
1e7e22330f doc update 2022-11-20 20:45:22 -08:00
Kelly Brazil
7244868fbd initial common log format parser 2022-11-20 20:43:49 -08:00
Kelly Brazil
86ed39ecdd tighten up efi split 2022-11-18 16:19:55 -08:00
Kelly Brazil
94d87b726f add efi partition split 2022-11-18 14:20:18 -08:00
Kelly Brazil
0984a1ec26 fix git-log tests and docs 2022-11-18 13:54:59 -08:00
Kelly Brazil
de5da060ce Merge pull request #322 from adamwolf/git-log-blank-author
Fix git log parsing with empty name or email
2022-11-18 13:18:06 -06:00
Kelly Brazil
592435572c Merge branch 'dev' of https://github.com/kellyjonbrazil/jc into dev
add in PR 323
2022-11-18 11:11:57 -08:00
Kelly Brazil
3172a18a46 Merge pull request #323 from kianmeng/fix-typos
Fix typos
2022-11-18 13:05:36 -06:00
Kelly Brazil
7f1c57b89c version bump 2022-11-17 13:39:59 -06:00
Kian-Meng Ang
39555a48b5 Fix typos
Found via `codespell -S ./tests/fixtures -L
chage,ro,ist,ans,unx,respons,technik`
2022-11-16 10:01:58 +08:00
Adam Wolf
e4cdfa13ca Fix git log parsing with empty name or email
Sometimes, folks leave their name or email blank on their
git commits.  Previously, a blank name crashed the git log
parser.
2022-11-11 13:33:01 -06:00
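The failure mode described in the commit message above can be sketched as follows. This is a hypothetical helper for illustration, not jc's actual git-log parser: a regex with optional groups accepts `Author:` lines even when the name or email is blank, where naive splitting would raise.

```python
import re

# Hypothetical helper illustrating the fix described in the commit above;
# jc's real parser differs. Both the name and the email are optional, so a
# blank author line no longer raises.
AUTHOR_RE = re.compile(r'^Author:\s*(?P<name>[^<]*?)\s*(?:<(?P<email>[^>]*)>)?\s*$')

def parse_author(line):
    """Return (name, email) from a git log Author line, tolerating blanks."""
    match = AUTHOR_RE.match(line)
    if not match:
        return None
    return (match.group('name') or '', match.group('email') or '')

print(parse_author('Author: Adam Wolf <adamwolf@example.com>'))
print(parse_author('Author:  <adamwolf@example.com>'))  # blank name
print(parse_author('Author: Adam Wolf <>'))             # blank email
```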
Kelly Brazil
cb4011bc03 formatting 2022-11-08 14:23:32 -08:00
Kelly Brazil
299b0faf7c Merge pull request #319 from kellyjonbrazil/dev
Dev v1.22.2
2022-11-08 16:32:03 +00:00
Kelly Brazil
2ffd698c03 update du docs 2022-11-08 08:13:41 -08:00
Kelly Brazil
dde54690fc Merge pull request #318 from kellyjonbrazil/master
sync to dev
2022-11-08 16:07:56 +00:00
Kelly Brazil
8d03055b34 add tests 2022-11-07 16:41:13 -08:00
Kelly Brazil
8a850be857 doc update 2022-11-07 13:26:16 -08:00
Kelly Brazil
2a530712cf add os-prober parser 2022-11-07 13:23:55 -08:00
Kelly Brazil
fd22c7dc3a add git-ls-remote parser 2022-11-07 13:02:55 -08:00
Kelly Brazil
b884f6aacc doc update 2022-11-07 10:44:37 -08:00
Kelly Brazil
ce680d4082 add cidr-style freebsd ipv4 support and freebsd options 2022-11-07 10:40:07 -08:00
Kelly Brazil
644d3f350d add more bsd tests 2022-11-06 12:12:07 -08:00
Kelly Brazil
0144863d33 new ifconfig parser with additional fields. tests passing 2022-11-06 12:04:34 -08:00
Kelly Brazil
a6b56519a2 add more bsd fields to ifconfig 2022-11-05 18:47:44 -07:00
Kelly Brazil
0a71caf9cd formatting for regexes 2022-11-05 17:12:50 -07:00
Kelly Brazil
35e74328c4 tests passing for ifconfig 2022-11-05 16:36:31 -07:00
Kelly Brazil
e46ac0ff7e initial commit of new ifconfig parser 2022-11-05 12:13:01 -07:00
Kelly Brazil
17fe6c7691 semver tests and doc update 2022-11-04 16:03:39 -07:00
Kelly Brazil
01ca7a69c4 formatting 2022-11-04 15:24:34 -07:00
Kelly Brazil
6d04db6113 add semver parser 2022-11-04 15:17:47 -07:00
Kelly Brazil
7c899abb15 add support for multiple include paths 2022-11-04 14:42:04 -07:00
Kelly Brazil
89d4df2a05 document ifconfig limitations and recommend using ip 2022-11-04 09:25:17 -07:00
Kelly Brazil
71c8364f80 doc fix 2022-11-04 09:19:42 -07:00
Kelly Brazil
214cd6b9e0 Merge pull request #310 from villesinisalo/readme_arch
Readme: recommend plain Pacman on Arch
2022-11-03 15:34:19 +00:00
Ville Sinisalo
73c280de3a use plain Pacman on Arch 2022-11-03 11:29:06 +02:00
Kelly Brazil
23e1dd3e35 use dict constructor for xmltodict to suppress !!omap comments in YAML output 2022-11-02 12:00:45 -07:00
Kelly Brazil
2b060aae0d clean up raw/processed logic 2022-11-01 19:53:44 -07:00
Kelly Brazil
186ad73651 add raw option to xml parser for _ attribute prefix 2022-11-01 18:09:05 -07:00
Kelly Brazil
de7a010f62 update test templates 2022-11-01 17:01:21 -07:00
Kelly Brazil
ac1bcd2918 doc update 2022-11-01 15:28:50 -07:00
Kelly Brazil
a2e6243282 doc update and additional tests 2022-11-01 13:50:01 -07:00
Kelly Brazil
01f92ced81 add docs and tests for findmnt 2022-11-01 13:15:31 -07:00
Kelly Brazil
b493bcf4fa update type annotations 2022-10-31 17:37:01 -07:00
Kelly Brazil
f6ee30be20 formatting 2022-10-31 17:30:51 -07:00
Kelly Brazil
50da124ea7 initial findmnt parser 2022-10-31 17:22:56 -07:00
Kelly Brazil
5e22f9e2bd formatting 2022-10-31 10:29:43 -07:00
Kelly Brazil
a384eb4c15 add sshd_conf tests 2022-10-31 09:32:58 -07:00
Kelly Brazil
dc4620eeb2 doc update 2022-10-31 09:22:39 -07:00
Kelly Brazil
d7cfa38eee ignore Match blocks 2022-10-28 16:36:52 -07:00
Kelly Brazil
a27110ebe5 formatting 2022-10-28 15:25:55 -07:00
Kelly Brazil
f5988527fb add sshd_conf parser 2022-10-28 15:15:02 -07:00
Kelly Brazil
c405309742 allow debug of exceptions 2022-10-28 15:14:53 -07:00
Kelly Brazil
747d12224f allow parser_info and get_help to use module objects as input 2022-10-28 10:36:54 -07:00
Kelly Brazil
2b621ab68e use exceptions instead of signal handlers to catch piperror and sigint 2022-10-28 10:11:19 -07:00
Kelly Brazil
8689865d31 fix exit on interrupt exit code. also add message to stderr 2022-10-27 11:23:43 -07:00
Kelly Brazil
358324533d doc update 2022-10-26 15:39:32 -07:00
Kelly Brazil
bc5821e69f force ci tests 2022-10-26 15:36:22 -07:00
Kelly Brazil
d5b478c968 bump checkout and setup-python versions 2022-10-26 15:35:41 -07:00
Kelly Brazil
368eba1826 bump checkout and setup-python versions 2022-10-26 15:34:46 -07:00
Kelly Brazil
6cffb449f4 Merge branch 'dev' of https://github.com/kellyjonbrazil/jc into dev 2022-10-26 15:23:35 -07:00
Kelly Brazil
d79d9c7f13 simplify return 2022-10-26 15:21:37 -07:00
Kelly Brazil
179822b994 add python 3.11 2022-10-26 15:18:33 -07:00
Kelly Brazil
ba369a0b73 add python 3.11 2022-10-26 15:18:04 -07:00
Kelly Brazil
6a5251f0ef doc update 2022-10-25 15:50:45 -07:00
Kelly Brazil
004fd74748 add type annotations 2022-10-25 15:21:42 -07:00
Kelly Brazil
e8d6d4c080 relax input_type_check to allow object argument 2022-10-25 15:21:26 -07:00
Kelly Brazil
8644f70db4 fix typos 2022-10-25 13:46:22 -07:00
Kelly Brazil
72f233b186 version bump 2022-10-25 11:53:08 -07:00
Kelly Brazil
fc85950a73 doc update 2022-10-25 11:49:02 -07:00
Kelly Brazil
fd5cbbb4d5 add csv utf-8 bom tests 2022-10-25 11:46:31 -07:00
Kelly Brazil
888b6bd6d5 fix for UTF-8 csv files with leading BOM bytes 2022-10-25 11:18:22 -07:00
191 changed files with 6890 additions and 741 deletions

View File

@@ -14,10 +14,10 @@ jobs:
strategy:
matrix:
os: [macos-latest, ubuntu-latest, windows-latest]
-python-version: ["3.7", "3.8", "3.9", "3.10"]
+python-version: ["3.7", "3.8", "3.9", "3.10", "3.11"]
steps:
-- uses: actions/checkout@v2
+- uses: actions/checkout@v3
- name: "Set up timezone to America/Los_Angeles"
uses: szenius/set-timezone@v1.0
with:
@@ -25,7 +25,7 @@ jobs:
timezoneMacos: "America/Los_Angeles"
timezoneWindows: "Pacific Standard Time"
- name: Set up Python ${{ matrix.python-version }}
-uses: actions/setup-python@v1
+uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies

View File

@@ -1,5 +1,38 @@
jc changelog
+20221216 v1.22.3
+- Add Common Log Format and Combined Log Format file parser (standard and streaming)
+- Add PostgreSQL password file parser
+- Add openvpn-status.log file parser
+- Add `cbt` command parser (Google Big Table)
+- Enhance `ifconfig` parser with interface lane information on BSD
+- Enhance `ifconfig` parser with additional IPv6 `scope_id` info for BSD
+- Fix `ifconfig` parser to capture some IPv6 addresses missed on BSD
+- Fix `git-log` and `git-log-s` parsers for failure on empty author name
+- Update `os-prober` parser with split EFI partition fields
+- Add ISO string attribute (`.iso`) to `jc.utils.timestamp()`
+- Fix several documentation typos
+20221107 v1.22.2
+- add `sshd_conf` parser for `sshd` configuration files and `sshd -T` output
+- add `findmnt` command parser
+- add `git ls-remote` command parser
+- add `os-prober` command parser
+- add SemVer string parser
+- enhance the `ifconfig` parser so it can output multiple IPv4 and IPv6 addresses
+- enhance the `ifconfig` parser so it can output additional fields common on BSD
+- enhance `xml` parser with optional `_` prefix for attributes instead of
+  `@` by using the `--raw` option. This can make it easier to filter the
+  JSON output in some tools.
+- fix the `xml` parser to output a normal Dictionary instead of OrderedDict.
+  This cleans up YAML output. (No `!!omap` comments)
+- fix `csv` and `csv-s` parsers for UTF-8 encoded CSV files with leading BOM bytes
+- fix exit code to be non-zero on interrupt
+- allow parser module objects to be used as arguments to `jc.get_help()` and `jc.parser_info()`
+- catch unexpected exceptions in the CLI
+- add error message on interrupt to STDERR
+- add python 3.11 tests to github actions
20221024 v1.22.1
- add `udevadm` command parser
- add `lspci` command parser
@@ -561,7 +594,7 @@ jc changelog
20200211 v1.7.3
- Add alternative 'magic' syntax: e.g. `jc ls -al`
-- Options can now be condensed (e.g. -prq is equivalant to -p -r -q)
+- Options can now be condensed (e.g. -prq is equivalent to -p -r -q)
20200208 v1.7.2
- Include test fixtures in wheel and sdist
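One of the v1.22.3 entries above adds an `.iso` attribute to `jc.utils.timestamp()`. A minimal sketch of what such a helper can expose, built on the standard library (the class below is an illustration with assumed attribute names, not jc's actual implementation):

```python
from datetime import datetime, timezone

class Timestamp:
    """Simplified stand-in for a timestamp helper like jc.utils.timestamp().

    Attribute names here are assumptions for illustration only.
    """
    def __init__(self, datetime_string, fmt='%Y-%m-%d %H:%M:%S'):
        parsed = datetime.strptime(datetime_string, fmt)
        # Naive epoch: the string interpreted in the local timezone.
        self.naive = int(parsed.timestamp())
        # UTC epoch: the same wall-clock time pinned to UTC.
        utc = parsed.replace(tzinfo=timezone.utc)
        self.utc = int(utc.timestamp())
        # The new-style ISO 8601 string attribute.
        self.iso = utc.isoformat()

ts = Timestamp('2022-12-16 15:31:16')
print(ts.iso)
```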

View File

@@ -114,7 +114,7 @@ pip3 install jc
| Debian/Ubuntu linux | `apt-get install jc` |
| Fedora linux | `dnf install jc` |
| openSUSE linux | `zypper install jc` |
-| Archlinux Community Repository | `paru -S jc` or `aura -S jc` or `yay -S jc` |
+| Arch linux | `pacman -S jc` |
| NixOS linux | `nix-env -iA nixpkgs.jc` or `nix-env -iA nixos.jc` |
| Guix System linux | `guix install jc` |
| Gentoo Linux | `emerge dev-python/jc` |
@@ -161,10 +161,13 @@ option.
| ` --asciitable` | ASCII and Unicode table parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/asciitable) |
| ` --asciitable-m` | multi-line ASCII and Unicode table parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/asciitable_m) |
| ` --blkid` | `blkid` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/blkid) |
+| ` --cbt` | `cbt` (Google Bigtable) command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/cbt) |
| ` --cef` | CEF string parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/cef) |
| ` --cef-s` | CEF string streaming parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/cef_s) |
| ` --chage` | `chage --list` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/chage) |
| ` --cksum` | `cksum` and `sum` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/cksum) |
+| ` --clf` | Common and Combined Log Format file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/clf) |
+| ` --clf-s` | Common and Combined Log Format file streaming parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/clf_s) |
| ` --crontab` | `crontab` command and file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/crontab) |
| ` --crontab-u` | `crontab` file parser with user support | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/crontab_u) |
| ` --csv` | CSV file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/csv) |
@@ -180,11 +183,13 @@ option.
| `--email-address` | Email Address string parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/email_address) |
| ` --env` | `env` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/env) |
| ` --file` | `file` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/file) |
+| ` --findmnt` | `findmnt` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/findmnt) |
| ` --finger` | `finger` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/finger) |
| ` --free` | `free` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/free) |
| ` --fstab` | `/etc/fstab` file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/fstab) |
| ` --git-log` | `git log` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/git_log) |
| ` --git-log-s` | `git log` command streaming parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/git_log_s) |
+| `--git-ls-remote` | `git ls-remote` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/git_ls_remote) |
| ` --gpg` | `gpg --with-colons` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/gpg) |
| ` --group` | `/etc/group` file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/group) |
| ` --gshadow` | `/etc/gshadow` file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/gshadow) |
@@ -221,8 +226,11 @@ option.
| ` --netstat` | `netstat` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/netstat) |
| ` --nmcli` | `nmcli` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/nmcli) |
| ` --ntpq` | `ntpq -p` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/ntpq) |
+| ` --openvpn` | openvpn-status.log file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/openvpn) |
+| ` --os-prober` | `os-prober` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/os_prober) |
| ` --passwd` | `/etc/passwd` file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/passwd) |
| ` --pci-ids` | `pci.ids` file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/pci_ids) |
+| ` --pgpass` | PostgreSQL password file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/pgpass) |
| ` --pidstat` | `pidstat -H` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/pidstat) |
| ` --pidstat-s` | `pidstat -H` command streaming parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/pidstat_s) |
| ` --ping` | `ping` and `ping6` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/ping) |
@@ -237,9 +245,11 @@ option.
| ` --rpm-qi` | `rpm -qi` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/rpm_qi) |
| ` --rsync` | `rsync` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/rsync) |
| ` --rsync-s` | `rsync` command streaming parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/rsync_s) |
+| ` --semver` | Semantic Version string parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/semver) |
| ` --sfdisk` | `sfdisk` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/sfdisk) |
| ` --shadow` | `/etc/shadow` file parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/shadow) |
| ` --ss` | `ss` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/ss) |
+| ` --sshd-conf` | sshd config file and `sshd -T` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/sshd_conf) |
| ` --stat` | `stat` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/stat) |
| ` --stat-s` | `stat` command streaming parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/stat_s) |
| ` --sysctl` | `sysctl` command parser | [details](https://kellyjonbrazil.github.io/jc/docs/parsers/sysctl) |
@@ -385,7 +395,7 @@ option.
### Streaming Parsers
Most parsers load all of the data from `STDIN`, parse it, then output the entire
JSON document serially. There are some streaming parsers (e.g. `ls-s` and
-`ping-s`) that immediately start processing and outputing the data line-by-line
+`ping-s`) that immediately start processing and outputting the data line-by-line
as [JSON Lines](https://jsonlines.org/) (aka [NDJSON](http://ndjson.org/)) while
it is being received from `STDIN`. This can significantly reduce the amount of
memory required to parse large amounts of command output (e.g. `ls -lR /`) and
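The line-by-line model described above can be sketched as a generator that emits one JSON document per parsed input line (JSON Lines). This is a conceptual illustration, not jc's streaming-parser API:

```python
import json

def stream_parse(lines):
    """Yield one parsed dict per input line instead of building one big list.

    Memory stays bounded by the size of a single line, which is what makes
    streaming parsers practical for very large command output (e.g. `ls -lR /`).
    """
    for line in lines:
        line = line.rstrip('\n')
        if not line:
            continue
        # Trivial stand-in for real parsing logic: key=value pairs.
        yield dict(pair.split('=', 1) for pair in line.split())

# Each record is printed as soon as its line is parsed (NDJSON output).
for record in stream_parse(['pid=1 cmd=init', 'pid=2 cmd=kthreadd']):
    print(json.dumps(record))
```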

View File

@@ -3,8 +3,8 @@ _jc()
local cur prev words cword jc_commands jc_parsers jc_options \
jc_about_options jc_about_mod_options jc_help_options jc_special_options
-jc_commands=(acpi airport arp blkid chage cksum crontab date df dig dmidecode dpkg du env file finger free git gpg hciconfig id ifconfig iostat iptables iw jobs last lastb ls lsblk lsmod lsof lspci lsusb md5 md5sum mdadm mount mpstat netstat nmcli ntpq pidstat ping ping6 pip pip3 postconf printenv ps route rpm rsync sfdisk sha1sum sha224sum sha256sum sha384sum sha512sum shasum ss stat sum sysctl systemctl systeminfo timedatectl top tracepath tracepath6 traceroute traceroute6 udevadm ufw uname update-alternatives upower uptime vdir vmstat w wc who xrandr zipinfo)
-jc_parsers=(--acpi --airport --airport-s --arp --asciitable --asciitable-m --blkid --cef --cef-s --chage --cksum --crontab --crontab-u --csv --csv-s --date --datetime-iso --df --dig --dir --dmidecode --dpkg-l --du --email-address --env --file --finger --free --fstab --git-log --git-log-s --gpg --group --gshadow --hash --hashsum --hciconfig --history --hosts --id --ifconfig --ini --iostat --iostat-s --ip-address --iptables --iw-scan --jar-manifest --jobs --jwt --kv --last --ls --ls-s --lsblk --lsmod --lsof --lspci --lsusb --m3u --mdadm --mount --mpstat --mpstat-s --netstat --nmcli --ntpq --passwd --pci-ids --pidstat --pidstat-s --ping --ping-s --pip-list --pip-show --plist --postconf --proc --proc-buddyinfo --proc-consoles --proc-cpuinfo --proc-crypto --proc-devices --proc-diskstats --proc-filesystems --proc-interrupts --proc-iomem --proc-ioports --proc-loadavg --proc-locks --proc-meminfo --proc-modules --proc-mtrr --proc-pagetypeinfo --proc-partitions --proc-slabinfo --proc-softirqs --proc-stat --proc-swaps --proc-uptime --proc-version --proc-vmallocinfo --proc-vmstat --proc-zoneinfo --proc-driver-rtc --proc-net-arp --proc-net-dev --proc-net-dev-mcast --proc-net-if-inet6 --proc-net-igmp --proc-net-igmp6 --proc-net-ipv6-route --proc-net-netlink --proc-net-netstat --proc-net-packet --proc-net-protocols --proc-net-route --proc-net-unix --proc-pid-fdinfo --proc-pid-io --proc-pid-maps --proc-pid-mountinfo --proc-pid-numa-maps --proc-pid-smaps --proc-pid-stat --proc-pid-statm --proc-pid-status --ps --route --rpm-qi --rsync --rsync-s --sfdisk --shadow --ss --stat --stat-s --sysctl --syslog --syslog-s --syslog-bsd --syslog-bsd-s --systemctl --systemctl-lj --systemctl-ls --systemctl-luf --systeminfo --time --timedatectl --timestamp --top --top-s --tracepath --traceroute --udevadm --ufw --ufw-appinfo --uname --update-alt-gs --update-alt-q --upower --uptime --url --vmstat --vmstat-s --w --wc --who --x509-cert --xml --xrandr --yaml --zipinfo)
+jc_commands=(acpi airport arp blkid cbt chage cksum crontab date df dig dmidecode dpkg du env file findmnt finger free git gpg hciconfig id ifconfig iostat iptables iw jobs last lastb ls lsblk lsmod lsof lspci lsusb md5 md5sum mdadm mount mpstat netstat nmcli ntpq os-prober pidstat ping ping6 pip pip3 postconf printenv ps route rpm rsync sfdisk sha1sum sha224sum sha256sum sha384sum sha512sum shasum ss sshd stat sum sysctl systemctl systeminfo timedatectl top tracepath tracepath6 traceroute traceroute6 udevadm ufw uname update-alternatives upower uptime vdir vmstat w wc who xrandr zipinfo)
+jc_parsers=(--acpi --airport --airport-s --arp --asciitable --asciitable-m --blkid --cbt --cef --cef-s --chage --cksum --clf --clf-s --crontab --crontab-u --csv --csv-s --date --datetime-iso --df --dig --dir --dmidecode --dpkg-l --du --email-address --env --file --findmnt --finger --free --fstab --git-log --git-log-s --git-ls-remote --gpg --group --gshadow --hash --hashsum --hciconfig --history --hosts --id --ifconfig --ini --iostat --iostat-s --ip-address --iptables --iw-scan --jar-manifest --jobs --jwt --kv --last --ls --ls-s --lsblk --lsmod --lsof --lspci --lsusb --m3u --mdadm --mount --mpstat --mpstat-s --netstat --nmcli --ntpq --openvpn --os-prober --passwd --pci-ids --pgpass --pidstat --pidstat-s --ping --ping-s --pip-list --pip-show --plist --postconf --proc --proc-buddyinfo --proc-consoles --proc-cpuinfo --proc-crypto --proc-devices --proc-diskstats --proc-filesystems --proc-interrupts --proc-iomem --proc-ioports --proc-loadavg --proc-locks --proc-meminfo --proc-modules --proc-mtrr --proc-pagetypeinfo --proc-partitions --proc-slabinfo --proc-softirqs --proc-stat --proc-swaps --proc-uptime --proc-version --proc-vmallocinfo --proc-vmstat --proc-zoneinfo --proc-driver-rtc --proc-net-arp --proc-net-dev --proc-net-dev-mcast --proc-net-if-inet6 --proc-net-igmp --proc-net-igmp6 --proc-net-ipv6-route --proc-net-netlink --proc-net-netstat --proc-net-packet --proc-net-protocols --proc-net-route --proc-net-unix --proc-pid-fdinfo --proc-pid-io --proc-pid-maps --proc-pid-mountinfo --proc-pid-numa-maps --proc-pid-smaps --proc-pid-stat --proc-pid-statm --proc-pid-status --ps --route --rpm-qi --rsync --rsync-s --semver --sfdisk --shadow --ss --sshd-conf --stat --stat-s --sysctl --syslog --syslog-s --syslog-bsd --syslog-bsd-s --systemctl --systemctl-lj --systemctl-ls --systemctl-luf --systeminfo --time --timedatectl --timestamp --top --top-s --tracepath --traceroute --udevadm --ufw --ufw-appinfo --uname --update-alt-gs --update-alt-q --upower --uptime --url --vmstat --vmstat-s --w --wc --who --x509-cert --xml --xrandr --yaml --zipinfo)
jc_options=(--force-color -C --debug -d --monochrome -m --meta-out -M --pretty -p --quiet -q --raw -r --unbuffer -u --yaml-out -y)
jc_about_options=(--about -a)
jc_about_mod_options=(--pretty -p --yaml-out -y --monochrome -m --force-color -C)

View File

@@ -9,12 +9,13 @@ _jc() {
jc_help_options jc_help_options_describe \
jc_special_options jc_special_options_describe
-jc_commands=(acpi airport arp blkid chage cksum crontab date df dig dmidecode dpkg du env file finger free git gpg hciconfig id ifconfig iostat iptables iw jobs last lastb ls lsblk lsmod lsof lspci lsusb md5 md5sum mdadm mount mpstat netstat nmcli ntpq pidstat ping ping6 pip pip3 postconf printenv ps route rpm rsync sfdisk sha1sum sha224sum sha256sum sha384sum sha512sum shasum ss stat sum sysctl systemctl systeminfo timedatectl top tracepath tracepath6 traceroute traceroute6 udevadm ufw uname update-alternatives upower uptime vdir vmstat w wc who xrandr zipinfo)
+jc_commands=(acpi airport arp blkid cbt chage cksum crontab date df dig dmidecode dpkg du env file findmnt finger free git gpg hciconfig id ifconfig iostat iptables iw jobs last lastb ls lsblk lsmod lsof lspci lsusb md5 md5sum mdadm mount mpstat netstat nmcli ntpq os-prober pidstat ping ping6 pip pip3 postconf printenv ps route rpm rsync sfdisk sha1sum sha224sum sha256sum sha384sum sha512sum shasum ss sshd stat sum sysctl systemctl systeminfo timedatectl top tracepath tracepath6 traceroute traceroute6 udevadm ufw uname update-alternatives upower uptime vdir vmstat w wc who xrandr zipinfo)
jc_commands_describe=(
'acpi:run "acpi" command with magic syntax.'
'airport:run "airport" command with magic syntax.'
'arp:run "arp" command with magic syntax.'
'blkid:run "blkid" command with magic syntax.'
+'cbt:run "cbt" command with magic syntax.'
'chage:run "chage" command with magic syntax.'
'cksum:run "cksum" command with magic syntax.'
'crontab:run "crontab" command with magic syntax.'
@@ -26,6 +27,7 @@ _jc() {
'du:run "du" command with magic syntax.'
'env:run "env" command with magic syntax.'
'file:run "file" command with magic syntax.'
+'findmnt:run "findmnt" command with magic syntax.'
'finger:run "finger" command with magic syntax.'
'free:run "free" command with magic syntax.'
'git:run "git" command with magic syntax.'
@@ -53,6 +55,7 @@ _jc() {
'netstat:run "netstat" command with magic syntax.'
'nmcli:run "nmcli" command with magic syntax.'
'ntpq:run "ntpq" command with magic syntax.'
+'os-prober:run "os-prober" command with magic syntax.'
'pidstat:run "pidstat" command with magic syntax.'
'ping:run "ping" command with magic syntax.'
'ping6:run "ping6" command with magic syntax.'
@@ -72,6 +75,7 @@ _jc() {
'sha512sum:run "sha512sum" command with magic syntax.'
'shasum:run "shasum" command with magic syntax.'
'ss:run "ss" command with magic syntax.'
+'sshd:run "sshd" command with magic syntax.'
'stat:run "stat" command with magic syntax.'
'sum:run "sum" command with magic syntax.'
'sysctl:run "sysctl" command with magic syntax.'
@@ -97,7 +101,7 @@ _jc() {
'xrandr:run "xrandr" command with magic syntax.'
'zipinfo:run "zipinfo" command with magic syntax.'
)
-jc_parsers=(--acpi --airport --airport-s --arp --asciitable --asciitable-m --blkid --cef --cef-s --chage --cksum --crontab --crontab-u --csv --csv-s --date --datetime-iso --df --dig --dir --dmidecode --dpkg-l --du --email-address --env --file --finger --free --fstab --git-log --git-log-s --gpg --group --gshadow --hash --hashsum --hciconfig --history --hosts --id --ifconfig --ini --iostat --iostat-s --ip-address --iptables --iw-scan --jar-manifest --jobs --jwt --kv --last --ls --ls-s --lsblk --lsmod --lsof --lspci --lsusb --m3u --mdadm --mount --mpstat --mpstat-s --netstat --nmcli --ntpq --passwd --pci-ids --pidstat --pidstat-s --ping --ping-s --pip-list --pip-show --plist --postconf --proc --proc-buddyinfo --proc-consoles --proc-cpuinfo --proc-crypto --proc-devices --proc-diskstats --proc-filesystems --proc-interrupts --proc-iomem --proc-ioports --proc-loadavg --proc-locks --proc-meminfo --proc-modules --proc-mtrr --proc-pagetypeinfo --proc-partitions --proc-slabinfo --proc-softirqs --proc-stat --proc-swaps --proc-uptime --proc-version --proc-vmallocinfo --proc-vmstat --proc-zoneinfo --proc-driver-rtc --proc-net-arp --proc-net-dev --proc-net-dev-mcast --proc-net-if-inet6 --proc-net-igmp --proc-net-igmp6 --proc-net-ipv6-route --proc-net-netlink --proc-net-netstat --proc-net-packet --proc-net-protocols --proc-net-route --proc-net-unix --proc-pid-fdinfo --proc-pid-io --proc-pid-maps --proc-pid-mountinfo --proc-pid-numa-maps --proc-pid-smaps --proc-pid-stat --proc-pid-statm --proc-pid-status --ps --route --rpm-qi --rsync --rsync-s --sfdisk --shadow --ss --stat --stat-s --sysctl --syslog --syslog-s --syslog-bsd --syslog-bsd-s --systemctl --systemctl-lj --systemctl-ls --systemctl-luf --systeminfo --time --timedatectl --timestamp --top --top-s --tracepath --traceroute --udevadm --ufw --ufw-appinfo --uname --update-alt-gs --update-alt-q --upower --uptime --url --vmstat --vmstat-s --w --wc --who --x509-cert --xml --xrandr --yaml --zipinfo)
+jc_parsers=(--acpi --airport --airport-s --arp --asciitable --asciitable-m --blkid --cbt --cef --cef-s --chage --cksum --clf --clf-s --crontab --crontab-u --csv --csv-s --date --datetime-iso --df --dig --dir --dmidecode --dpkg-l --du --email-address --env --file --findmnt --finger --free --fstab --git-log --git-log-s --git-ls-remote --gpg --group --gshadow --hash --hashsum --hciconfig --history --hosts --id --ifconfig --ini --iostat --iostat-s --ip-address --iptables --iw-scan --jar-manifest --jobs --jwt --kv --last --ls --ls-s --lsblk --lsmod --lsof --lspci --lsusb --m3u --mdadm --mount --mpstat --mpstat-s --netstat --nmcli --ntpq --openvpn --os-prober --passwd --pci-ids --pgpass --pidstat --pidstat-s --ping --ping-s --pip-list --pip-show --plist --postconf --proc --proc-buddyinfo --proc-consoles --proc-cpuinfo --proc-crypto --proc-devices --proc-diskstats --proc-filesystems --proc-interrupts --proc-iomem --proc-ioports --proc-loadavg --proc-locks --proc-meminfo --proc-modules --proc-mtrr --proc-pagetypeinfo --proc-partitions --proc-slabinfo --proc-softirqs --proc-stat --proc-swaps --proc-uptime --proc-version --proc-vmallocinfo --proc-vmstat --proc-zoneinfo --proc-driver-rtc --proc-net-arp --proc-net-dev --proc-net-dev-mcast --proc-net-if-inet6 --proc-net-igmp --proc-net-igmp6 --proc-net-ipv6-route --proc-net-netlink --proc-net-netstat --proc-net-packet --proc-net-protocols --proc-net-route --proc-net-unix --proc-pid-fdinfo --proc-pid-io --proc-pid-maps --proc-pid-mountinfo --proc-pid-numa-maps --proc-pid-smaps --proc-pid-stat --proc-pid-statm --proc-pid-status --ps --route --rpm-qi --rsync --rsync-s --semver --sfdisk --shadow --ss --sshd-conf --stat --stat-s --sysctl --syslog --syslog-s --syslog-bsd --syslog-bsd-s --systemctl --systemctl-lj --systemctl-ls --systemctl-luf --systeminfo --time --timedatectl --timestamp --top --top-s --tracepath --traceroute --udevadm --ufw --ufw-appinfo --uname --update-alt-gs --update-alt-q --upower --uptime --url --vmstat --vmstat-s --w --wc --who --x509-cert --xml --xrandr --yaml --zipinfo)
jc_parsers_describe=(
'--acpi:`acpi` command parser'
'--airport:`airport -I` command parser'
@@ -106,10 +110,13 @@ _jc() {
'--asciitable:ASCII and Unicode table parser'
'--asciitable-m:multi-line ASCII and Unicode table parser'
'--blkid:`blkid` command parser'
'--cbt:`cbt` (Google Bigtable) command parser'
'--cef:CEF string parser'
'--cef-s:CEF string streaming parser'
'--chage:`chage --list` command parser'
'--cksum:`cksum` and `sum` command parser'
'--clf:Common and Combined Log Format file parser'
'--clf-s:Common and Combined Log Format file streaming parser'
'--crontab:`crontab` command and file parser'
'--crontab-u:`crontab` file parser with user support'
'--csv:CSV file parser'
@@ -125,11 +132,13 @@ _jc() {
'--email-address:Email Address string parser'
'--env:`env` command parser'
'--file:`file` command parser'
'--findmnt:`findmnt` command parser'
'--finger:`finger` command parser'
'--free:`free` command parser'
'--fstab:`/etc/fstab` file parser'
'--git-log:`git log` command parser'
'--git-log-s:`git log` command streaming parser'
'--git-ls-remote:`git ls-remote` command parser'
'--gpg:`gpg --with-colons` command parser'
'--group:`/etc/group` file parser'
'--gshadow:`/etc/gshadow` file parser'
@@ -166,8 +175,11 @@ _jc() {
'--netstat:`netstat` command parser'
'--nmcli:`nmcli` command parser'
'--ntpq:`ntpq -p` command parser'
'--openvpn:openvpn-status.log file parser'
'--os-prober:`os-prober` command parser'
'--passwd:`/etc/passwd` file parser'
'--pci-ids:`pci.ids` file parser'
'--pgpass:PostgreSQL password file parser'
'--pidstat:`pidstat -H` command parser'
'--pidstat-s:`pidstat -H` command streaming parser'
'--ping:`ping` and `ping6` command parser'
@@ -231,9 +243,11 @@ _jc() {
'--rpm-qi:`rpm -qi` command parser'
'--rsync:`rsync` command parser'
'--rsync-s:`rsync` command streaming parser'
'--semver:Semantic Version string parser'
'--sfdisk:`sfdisk` command parser'
'--shadow:`/etc/shadow` file parser'
'--ss:`ss` command parser'
'--sshd-conf:sshd config file and `sshd -T` command parser'
'--stat:`stat` command parser'
'--stat-s:`stat` command streaming parser'
'--sysctl:`sysctl` command parser'


@@ -161,7 +161,7 @@ subset of `parser_mod_list()`.
### parser\_info
```python
def parser_info(parser_mod_name: str,
def parser_info(parser_mod_name: Union[str, ModuleType],
documentation: bool = False) -> ParserInfoType
```
@@ -169,10 +169,11 @@ Returns a dictionary that includes the parser module metadata.
Parameters:
parser_mod_name: (string) name of the parser module. This
function will accept module_name,
parser_mod_name: (string or name of the parser module. This
Module) function will accept module_name,
cli-name, and --argument-name
variants of the module name.
variants of the module name as well
as a parser module object.
documentation: (boolean) include parser docstring if True
@@ -203,11 +204,12 @@ Parameters:
### get\_help
```python
def get_help(parser_mod_name: str) -> None
def get_help(parser_mod_name: Union[str, ModuleType]) -> None
```
Show help screen for the selected parser.
This function will accept **module_name**, **cli-name**, and
**--argument-name** variants of the module name string.
**--argument-name** variants of the module name string as well as a
parser module object.

docs/parsers/cbt.md

@@ -0,0 +1,125 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.cbt"></a>
# jc.parsers.cbt
jc - JSON Convert `cbt` command output parser (Google Bigtable)
Parses the human-, but not machine-, friendly output of the cbt command (for
Google's Bigtable).
No effort is made to convert the data types of the values in the cells.
The `timestamp_epoch` calculated timestamp field is naive. (i.e. based on
the local time of the system the parser is run on)
The `timestamp_epoch_utc` calculated timestamp field is timezone-aware and
is only available if the timestamp has a UTC timezone.
The `timestamp_iso` calculated timestamp field will only include UTC
timezone information if the timestamp has a UTC timezone.
Raw output contains all cells for each column (including timestamps), while
the normal output contains only the latest value for each column.
Usage (cli):
$ cbt | jc --cbt
or
$ jc cbt
Usage (module):
import jc
result = jc.parse('cbt', cbt_command_output)
Schema:
[
{
"key": string,
"cells": {
<string>: { # column family
<string>: string # column: value
}
}
}
]
Schema (raw):
[
{
"key": string,
"cells": [
{
"column_family": string,
"column": string,
"value": string,
"timestamp_iso": string,
"timestamp_epoch": integer,
"timestamp_epoch_utc": integer
}
]
}
]
Examples:
$ cbt -project=$PROJECT -instance=$INSTANCE lookup $TABLE foo | jc --cbt -p
[
{
"key": "foo",
"cells": {
"foo": {
"bar": "baz"
}
}
}
]
$ cbt -project=$PROJECT -instance=$INSTANCE lookup $TABLE foo | jc --cbt -p -r
[
{
"key": "foo",
"cells": [
{
"column_family": "foo",
"column": "bar",
"value": "baz1",
"timestamp_iso": "1970-01-01T01:00:00",
"timestamp_epoch": 32400,
"timestamp_epoch_utc": null
}
]
}
]
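The naive `timestamp_epoch` versus UTC-only `timestamp_epoch_utc` behavior described above can be sketched as follows. `to_epochs` is a hypothetical helper for illustration, not part of the parser:

```python
from datetime import datetime, timezone

def to_epochs(iso_str, is_utc=False):
    # Hypothetical helper illustrating the calculated timestamp fields:
    # the naive epoch is interpreted in the local timezone of the machine
    # running the parser, while the UTC-aware epoch is only produced when
    # the timestamp is known to be UTC.
    dt = datetime.strptime(iso_str, "%Y-%m-%dT%H:%M:%S")
    naive_epoch = int(dt.timestamp())  # depends on local timezone
    utc_epoch = int(dt.replace(tzinfo=timezone.utc).timestamp()) if is_utc else None
    return naive_epoch, utc_epoch
```

This mirrors why `timestamp_epoch_utc` is `null` in the raw example above: the cell timestamp carried no UTC timezone information.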
<a id="jc.parsers.cbt.parse"></a>
### parse
```python
def parse(data: str,
raw: bool = False,
quiet: bool = False) -> List[JSONDictType]
```
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
List of Dictionaries. Raw or processed structured data.
### Parser Information
Compatibility: linux, darwin, cygwin, win32, aix, freebsd
Version 1.0 by Andreas Weiden (andreas.weiden@gmail.com)

docs/parsers/clf.md

@@ -0,0 +1,199 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.clf"></a>
# jc.parsers.clf
jc - JSON Convert Common Log Format file parser
This parser will handle the Common Log Format standard as specified at
https://www.w3.org/Daemon/User/Config/Logging.html#common-logfile-format.
Combined Log Format is also supported. (Referer and User Agent fields added)
Extra fields may be present and will be enclosed in the `extra` field as
a single string.
If a log line cannot be parsed, an object with an `unparsable` field will
be present with a value of the original line.
The `epoch` calculated timestamp field is naive. (i.e. based on the
local time of the system the parser is run on)
The `epoch_utc` calculated timestamp field is timezone-aware and is
only available if the timezone field is UTC.
Usage (cli):
$ cat file.log | jc --clf
Usage (module):
import jc
result = jc.parse('clf', common_log_file_output)
Schema:
Empty strings and `-` values are converted to `null`/`None`.
[
{
"host": string,
"ident": string,
"authuser": string,
"date": string,
"day": integer,
"month": string,
"year": integer,
"hour": integer,
"minute": integer,
"second": integer,
"tz": string,
"request": string,
"request_method": string,
"request_url": string,
"request_version": string,
"status": integer,
"bytes": integer,
"referer": string,
"user_agent": string,
"extra": string,
"epoch": integer, # [0]
"epoch_utc": integer, # [1]
"unparsable": string # [2]
}
]
[0] naive timestamp
[1] timezone-aware timestamp. Only available if timezone field is UTC
[2] exists if the line was not able to be parsed
Examples:
$ cat file.log | jc --clf -p
[
{
"host": "127.0.0.1",
"ident": "user-identifier",
"authuser": "frank",
"date": "10/Oct/2000:13:55:36 -0700",
"day": 10,
"month": "Oct",
"year": 2000,
"hour": 13,
"minute": 55,
"second": 36,
"tz": "-0700",
"request": "GET /apache_pb.gif HTTPS/1.0",
"status": 200,
"bytes": 2326,
"referer": null,
"user_agent": null,
"extra": null,
"request_method": "GET",
"request_url": "/apache_pb.gif",
"request_version": "HTTPS/1.0",
"epoch": 971211336,
"epoch_utc": null
},
{
"host": "1.1.1.2",
"ident": null,
"authuser": null,
"date": "11/Nov/2016:03:04:55 +0100",
"day": 11,
"month": "Nov",
"year": 2016,
"hour": 3,
"minute": 4,
"second": 55,
"tz": "+0100",
"request": "GET /",
"status": 200,
"bytes": 83,
"referer": null,
"user_agent": null,
"extra": "- 9221 1.1.1.1",
"request_method": "GET",
"request_url": "/",
"request_version": null,
"epoch": 1478862295,
"epoch_utc": null
},
...
]
$ cat file.log | jc --clf -p -r
[
{
"host": "127.0.0.1",
"ident": "user-identifier",
"authuser": "frank",
"date": "10/Oct/2000:13:55:36 -0700",
"day": "10",
"month": "Oct",
"year": "2000",
"hour": "13",
"minute": "55",
"second": "36",
"tz": "-0700",
"request": "GET /apache_pb.gif HTTPS/1.0",
"status": "200",
"bytes": "2326",
"referer": null,
"user_agent": null,
"extra": "",
"request_method": "GET",
"request_url": "/apache_pb.gif",
"request_version": "HTTPS/1.0"
},
{
"host": "1.1.1.2",
"ident": "-",
"authuser": "-",
"date": "11/Nov/2016:03:04:55 +0100",
"day": "11",
"month": "Nov",
"year": "2016",
"hour": "03",
"minute": "04",
"second": "55",
"tz": "+0100",
"request": "GET /",
"status": "200",
"bytes": "83",
"referer": "-",
"user_agent": "-",
"extra": "- 9221 1.1.1.1",
"request_method": "GET",
"request_url": "/",
"request_version": null
},
...
]
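A minimal sketch of the matching logic the examples above imply (illustrative only, not jc's actual implementation): three whitespace-delimited fields, a bracketed date, a quoted request, status and bytes, optional Combined Log Format fields, and anything left over collected into `extra`. `-` and empty values become `None` in the processed output:

```python
import re

CLF_RE = re.compile(
    r'(?P<host>\S+) (?P<ident>\S+) (?P<authuser>\S+) '
    r'\[(?P<date>[^\]]+)\] "(?P<request>[^"]*)" '
    r'(?P<status>\d+) (?P<bytes>\S+)'
    r'(?: "(?P<referer>[^"]*)" "(?P<user_agent>[^"]*)")?'  # Combined LF
    r'(?: (?P<extra>.*))?$'
)

def parse_clf_line(line):
    m = CLF_RE.match(line)
    if m is None:
        return {"unparsable": line}  # mirrors the parser's unparsable field
    # '-' and empty strings are converted to None in processed output
    return {k: (None if v in ('-', '') else v)
            for k, v in m.groupdict().items()}
```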
<a id="jc.parsers.clf.parse"></a>
### parse
```python
def parse(data: str,
raw: bool = False,
quiet: bool = False) -> List[JSONDictType]
```
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
List of Dictionaries. Raw or processed structured data.
### Parser Information
Compatibility: linux, darwin, cygwin, win32, aix, freebsd
Version 1.0 by Kelly Brazil (kellyjonbrazil@gmail.com)

docs/parsers/clf_s.md

@@ -0,0 +1,117 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.clf_s"></a>
# jc.parsers.clf\_s
jc - JSON Convert Common Log Format file streaming parser
> This streaming parser outputs JSON Lines (cli) or returns an Iterable of
> Dictionaries (module)
This parser will handle the Common Log Format standard as specified at
https://www.w3.org/Daemon/User/Config/Logging.html#common-logfile-format.
Combined Log Format is also supported. (Referer and User Agent fields added)
Extra fields may be present and will be enclosed in the `extra` field as
a single string.
If a log line cannot be parsed, an object with an `unparsable` field will
be present with a value of the original line.
The `epoch` calculated timestamp field is naive. (i.e. based on the
local time of the system the parser is run on)
The `epoch_utc` calculated timestamp field is timezone-aware and is
only available if the timezone field is UTC.
Usage (cli):
$ cat file.log | jc --clf-s
Usage (module):
import jc
result = jc.parse('clf_s', common_log_file_output.splitlines())
for item in result:
# do something
Schema:
Empty strings and `-` values are converted to `null`/`None`.
{
"host": string,
"ident": string,
"authuser": string,
"date": string,
"day": integer,
"month": string,
"year": integer,
"hour": integer,
"minute": integer,
"second": integer,
"tz": string,
"request": string,
"request_method": string,
"request_url": string,
"request_version": string,
"status": integer,
"bytes": integer,
"referer": string,
"user_agent": string,
"extra": string,
"epoch": integer, # [0]
"epoch_utc": integer, # [1]
"unparsable": string # [2]
}
[0] naive timestamp
[1] timezone-aware timestamp. Only available if timezone field is UTC
[2] exists if the line was not able to be parsed
Examples:
$ cat file.log | jc --clf-s
{"host":"127.0.0.1","ident":"user-identifier","authuser":"frank","...}
{"host":"1.1.1.2","ident":null,"authuser":null,"date":"11/Nov/2016...}
...
$ cat file.log | jc --clf-s -r
{"host":"127.0.0.1","ident":"user-identifier","authuser":"frank","...}
{"host":"1.1.1.2","ident":"-","authuser":"-","date":"11/Nov/2016:0...}
...
<a id="jc.parsers.clf_s.parse"></a>
### parse
```python
@add_jc_meta
def parse(data: Iterable[str],
raw: bool = False,
quiet: bool = False,
ignore_exceptions: bool = False) -> StreamingOutputType
```
Main text parsing generator function. Returns an iterable object.
Parameters:
data: (iterable) line-based text data to parse
(e.g. sys.stdin or str.splitlines())
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
ignore_exceptions: (boolean) ignore parsing exceptions if True
Returns:
Iterable of Dictionaries
### Parser Information
Compatibility: linux, darwin, cygwin, win32, aix, freebsd
Version 1.0 by Kelly Brazil (kellyjonbrazil@gmail.com)


@@ -82,7 +82,9 @@ Examples:
### parse
```python
def parse(data, raw=False, quiet=False)
def parse(data: Union[str, bytes],
raw: bool = False,
quiet: bool = False) -> List[JSONDictType]
```
Main text parsing function
@@ -100,4 +102,4 @@ Returns:
### Parser Information
Compatibility: linux, darwin, cygwin, win32, aix, freebsd
Version 1.4 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.5 by Kelly Brazil (kellyjonbrazil@gmail.com)


@@ -86,4 +86,4 @@ Returns:
### Parser Information
Compatibility: linux, darwin, cygwin, win32, aix, freebsd
Version 1.3 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.4 by Kelly Brazil (kellyjonbrazil@gmail.com)


@@ -6,7 +6,7 @@
jc - JSON Convert ISO 8601 Datetime string parser
This parser supports standard ISO 8601 strings that include both date and
time. If no timezone or offset information is available in the sring, then
time. If no timezone or offset information is available in the string, then
UTC timezone is used.
Usage (cli):


@@ -106,7 +106,7 @@ Schema:
]
[0] naive timestamp if "when" field is parsable, else null
[1] timezone aware timestamp availabe for UTC, else null
[1] timezone aware timestamp available for UTC, else null
Examples:


@@ -5,6 +5,10 @@
jc - JSON Convert `du` command output parser
The `du -h` option is not supported with the default output. If you
would like to use `du -h` or other options that change the output, be sure
to use `jc --raw` (cli) or `raw=True` (module).
Usage (cli):
$ du | jc --du

docs/parsers/findmnt.md

@@ -0,0 +1,117 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.findmnt"></a>
# jc.parsers.findmnt
jc - JSON Convert `findmnt` command output parser
Supports `-a`, `-l`, or no `findmnt` options.
> Note: Newer versions of `findmnt` have a JSON output option.
Usage (cli):
$ findmnt | jc --findmnt
or
$ jc findmnt
Usage (module):
import jc
result = jc.parse('findmnt', findmnt_command_output)
Schema:
[
{
"target": string,
"source": string,
"fstype": string,
"options": [
string
],
"kv_options": {
"<key_name>": string
}
}
]
Examples:
$ findmnt | jc --findmnt -p
[
{
"target": "/",
"source": "/dev/mapper/centos-root",
"fstype": "xfs",
"options": [
"rw",
"relatime",
"seclabel",
"attr2",
"inode64",
"noquota"
]
},
{
"target": "/sys/fs/cgroup",
"source": "tmpfs",
"fstype": "tmpfs",
"options": [
"ro",
"nosuid",
"nodev",
"noexec",
"seclabel"
],
"kv_options": {
"mode": "755"
}
},
...
]
$ findmnt | jc --findmnt -p -r
[
{
"target": "/",
"source": "/dev/mapper/centos-root",
"fstype": "xfs",
"options": "rw,relatime,seclabel,attr2,inode64,noquota"
},
{
"target": "/sys/fs/cgroup",
"source": "tmpfs",
"fstype": "tmpfs",
"options": "ro,nosuid,nodev,noexec,seclabel,mode=755"
},
...
]
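The raw-to-processed transformation shown above can be sketched like this: the comma-separated option string becomes a list, with any `key=value` entries collected into a `kv_options` object. This is an illustrative helper, not jc's actual implementation:

```python
def split_options(option_str):
    # Split a raw findmnt options string into plain flags and key=value
    # pairs, mirroring the processed schema above.
    options, kv_options = [], {}
    for opt in option_str.split(','):
        if '=' in opt:
            key, _, value = opt.partition('=')
            kv_options[key] = value
        else:
            options.append(opt)
    result = {"options": options}
    if kv_options:  # kv_options only appears when key=value entries exist
        result["kv_options"] = kv_options
    return result
```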
<a id="jc.parsers.findmnt.parse"></a>
### parse
```python
def parse(data: str,
raw: bool = False,
quiet: bool = False) -> List[JSONDictType]
```
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
List of Dictionaries. Raw or processed structured data.
### Parser Information
Compatibility: linux
Version 1.0 by Kelly Brazil (kellyjonbrazil@gmail.com)


@@ -40,13 +40,13 @@ Schema:
[
{
"commit": string,
"author": string,
"author_email": string,
"author": string/null,
"author_email": string/null,
"date": string,
"epoch": integer, # [0]
"epoch_utc": integer, # [1]
"commit_by": string,
"commit_by_email": string,
"commit_by": string/null,
"commit_by_email": string/null,
"commit_by_date": string,
"message": string,
"stats" : {
@@ -61,7 +61,7 @@ Schema:
]
[0] naive timestamp if "date" field is parsable, else null
[1] timezone aware timestamp availabe for UTC, else null
[1] timezone aware timestamp available for UTC, else null
Examples:
@@ -172,4 +172,4 @@ Returns:
### Parser Information
Compatibility: linux, darwin, cygwin, win32, aix, freebsd
Version 1.2 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.3 by Kelly Brazil (kellyjonbrazil@gmail.com)


@@ -41,13 +41,13 @@ Schema:
{
"commit": string,
"author": string,
"author_email": string,
"author": string/null,
"author_email": string/null,
"date": string,
"epoch": integer, # [0]
"epoch_utc": integer, # [1]
"commit_by": string,
"commit_by_email": string,
"commit_by": string/null,
"commit_by_email": string/null,
"commit_by_date": string,
"message": string,
"stats" : {
@@ -68,7 +68,7 @@ Schema:
}
[0] naive timestamp if "date" field is parsable, else null
[1] timezone aware timestamp availabe for UTC, else null
[1] timezone aware timestamp available for UTC, else null
Examples:
@@ -108,4 +108,4 @@ Returns:
### Parser Information
Compatibility: linux, darwin, cygwin, win32, aix, freebsd
Version 1.2 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.3 by Kelly Brazil (kellyjonbrazil@gmail.com)


@@ -0,0 +1,92 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.git_ls_remote"></a>
# jc.parsers.git\_ls\_remote
jc - JSON Convert `git ls-remote` command output parser
This parser outputs two schemas:
- Default: A single object with key/value pairs
- Raw: An array of objects (`--raw` (cli) or `raw=True` (module))
See the Schema section for more details
Usage (cli):
$ git ls-remote | jc --git-ls-remote
or
$ jc git ls-remote
Usage (module):
import jc
result = jc.parse('git_ls_remote', git_ls_remote_command_output)
Schema:
Default:
{
<reference>: string
}
Raw:
[
{
"reference": string,
"commit": string
}
]
Examples:
$ git ls-remote | jc --git-ls-remote -p
{
"HEAD": "214cd6b9e09603b3c4fa02203b24fb2bc3d4e338",
"refs/heads/dev": "b884f6aacca39e05994596d8fdfa7e7c4f1e0389",
"refs/heads/master": "214cd6b9e09603b3c4fa02203b24fb2bc3d4e338",
"refs/pull/1/head": "e416c77bed1267254da972b0f95b7ff1d43fccef",
...
}
$ git ls-remote | jc --git-ls-remote -p -r
[
{
"reference": "HEAD",
"commit": "214cd6b9e09603b3c4fa02203b24fb2bc3d4e338"
},
{
"reference": "refs/heads/dev",
"commit": "b884f6aacca39e05994596d8fdfa7e7c4f1e0389"
},
...
]
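The relationship between the two schemas is a simple collapse of the raw list into a single object keyed by reference name, as a sketch (hypothetical helper, not the parser's code):

```python
def to_default_schema(raw_items):
    # Collapse the raw list of reference/commit objects into the default
    # single-object schema shown above.
    return {item["reference"]: item["commit"] for item in raw_items}
```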
<a id="jc.parsers.git_ls_remote.parse"></a>
### parse
```python
def parse(data: str,
raw: bool = False,
quiet: bool = False) -> Union[JSONDictType, List[JSONDictType]]
```
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
Dictionary (default) or List of Dictionaries (raw)
### Parser Information
Compatibility: linux, darwin, cygwin, win32, aix, freebsd
Version 1.0 by Kelly Brazil (kellyjonbrazil@gmail.com)


@@ -5,7 +5,10 @@
jc - JSON Convert `ifconfig` command output parser
> Note: No `ifconfig` options are supported.
No `ifconfig` options are supported.
Consider using the `ip` command instead of `ifconfig` as it supports native
JSON output.
Usage (cli):
@@ -24,40 +27,92 @@ Schema:
[
{
"name": string,
"flags": integer,
"name": string,
"type": string,
"metric": integer
"flags": integer,
"state": [
string
string
],
"mtu": integer,
"ipv4_addr": string,
"ipv4_mask": string,
"ipv4_bcast": string,
"ipv6_addr": string,
"ipv6_mask": integer,
"ipv6_scope": string,
"mac_addr": string,
"type": string,
"rx_packets": integer,
"rx_bytes": integer,
"rx_errors": integer,
"rx_dropped": integer,
"rx_overruns": integer,
"rx_frame": integer,
"tx_packets": integer,
"tx_bytes": integer,
"tx_errors": integer,
"tx_dropped": integer,
"tx_overruns": integer,
"tx_carrier": integer,
"tx_collisions": integer,
"metric": integer
"mtu": integer,
"mac_addr": string,
"ipv4_addr": string, # [0]
"ipv4_mask": string, # [0]
"ipv4_bcast": string, # [0]
"ipv6_addr": string, # [0]
"ipv6_mask": integer, # [0]
"ipv6_scope": string, # [0]
"ipv6_scope_id": string, # [0]
"ipv6_type": string, # [0]
"rx_packets": integer,
"rx_bytes": integer,
"rx_errors": integer,
"rx_dropped": integer,
"rx_overruns": integer,
"rx_frame": integer,
"tx_packets": integer,
"tx_bytes": integer,
"tx_errors": integer,
"tx_dropped": integer,
"tx_overruns": integer,
"tx_carrier": integer,
"tx_collisions": integer,
"options": string,
"options_flags": [
string
],
"status": string,
"hw_address": string,
"media": string,
"media_flags": [
string
],
"nd6_options": integer,
"nd6_flags": [
string
],
"plugged": string,
"vendor": string,
"vendor_pn": string,
"vendor_sn": string,
"vendor_date": string,
"module_temperature": string,
"module_voltage": string
"ipv4": [
{
"address": string,
"mask": string,
"broadcast": string
}
],
"ipv6: [
{
"address": string,
"scope_id": string,
"mask": integer,
"scope": string,
"type": string
}
],
"lanes": [
{
"lane": integer,
"rx_power_mw": float,
"rx_power_dbm": float,
"tx_bias_ma": float
}
]
}
]
[0] these fields only pick up the last IP address in the interface
output and are here for backwards compatibility. For information on
all IP addresses, use the `ipv4` and `ipv6` objects which contain an
array of IP address objects.
Examples:
$ ifconfig | jc --ifconfig -p
$ ifconfig ens33 | jc --ifconfig -p
[
{
"name": "ens33",
@@ -69,120 +124,94 @@ Examples:
"MULTICAST"
],
"mtu": 1500,
"type": "Ethernet",
"mac_addr": "00:0c:29:3b:58:0e",
"ipv4_addr": "192.168.71.137",
"ipv4_mask": "255.255.255.0",
"ipv4_bcast": "192.168.71.255",
"ipv6_addr": "fe80::c1cb:715d:bc3e:b8a0",
"ipv6_mask": 64,
"ipv6_scope": "0x20",
"mac_addr": "00:0c:29:3b:58:0e",
"type": "Ethernet",
"ipv6_type": "link",
"metric": null,
"rx_packets": 8061,
"rx_bytes": 1514413,
"rx_errors": 0,
"rx_dropped": 0,
"rx_overruns": 0,
"rx_frame": 0,
"tx_packets": 4502,
"tx_errors": 0,
"tx_dropped": 0,
"tx_overruns": 0,
"tx_carrier": 0,
"tx_collisions": 0,
"rx_bytes": 1514413,
"tx_bytes": 866622,
"tx_errors": 0,
"tx_dropped": 0,
"tx_overruns": 0,
"tx_carrier": 0,
"tx_collisions": 0,
"metric": null
},
{
"name": "lo",
"flags": 73,
"state": [
"UP",
"LOOPBACK",
"RUNNING"
"ipv4": [
{
"address": "192.168.71.137",
"mask": "255.255.255.0",
"broadcast": "192.168.71.255"
}
],
"mtu": 65536,
"ipv4_addr": "127.0.0.1",
"ipv4_mask": "255.0.0.0",
"ipv4_bcast": null,
"ipv6_addr": "::1",
"ipv6_mask": 128,
"ipv6_scope": "0x10",
"mac_addr": null,
"type": "Local Loopback",
"rx_packets": 73,
"rx_bytes": 6009,
"rx_errors": 0,
"rx_dropped": 0,
"rx_overruns": 0,
"rx_frame": 0,
"tx_packets": 73,
"tx_bytes": 6009,
"tx_errors": 0,
"tx_dropped": 0,
"tx_overruns": 0,
"tx_carrier": 0,
"tx_collisions": 0,
"metric": null
"ipv6": [
{
"address": "fe80::c1cb:715d:bc3e:b8a0",
"scope_id": null,
"mask": 64,
"scope": "0x20",
"type": "link"
}
]
}
]
$ ifconfig | jc --ifconfig -p -r
$ ifconfig ens33 | jc --ifconfig -p -r
[
{
"name": "ens33",
"flags": "4163",
"state": "UP,BROADCAST,RUNNING,MULTICAST",
"mtu": "1500",
"type": "Ethernet",
"mac_addr": "00:0c:29:3b:58:0e",
"ipv4_addr": "192.168.71.137",
"ipv4_mask": "255.255.255.0",
"ipv4_bcast": "192.168.71.255",
"ipv6_addr": "fe80::c1cb:715d:bc3e:b8a0",
"ipv6_mask": "64",
"ipv6_scope": "0x20",
"mac_addr": "00:0c:29:3b:58:0e",
"type": "Ethernet",
"ipv6_type": "link",
"metric": null,
"rx_packets": "8061",
"rx_bytes": "1514413",
"rx_errors": "0",
"rx_dropped": "0",
"rx_overruns": "0",
"rx_frame": "0",
"tx_packets": "4502",
"tx_errors": "0",
"tx_dropped": "0",
"tx_overruns": "0",
"tx_carrier": "0",
"tx_collisions": "0",
"rx_bytes": "1514413",
"tx_bytes": "866622",
"tx_errors": "0",
"tx_dropped": "0",
"tx_overruns": "0",
"tx_carrier": "0",
"tx_collisions": "0",
"metric": null
},
{
"name": "lo",
"flags": "73",
"state": "UP,LOOPBACK,RUNNING",
"mtu": "65536",
"ipv4_addr": "127.0.0.1",
"ipv4_mask": "255.0.0.0",
"ipv4_bcast": null,
"ipv6_addr": "::1",
"ipv6_mask": "128",
"ipv6_scope": "0x10",
"mac_addr": null,
"type": "Local Loopback",
"rx_packets": "73",
"rx_bytes": "6009",
"rx_errors": "0",
"rx_dropped": "0",
"rx_overruns": "0",
"rx_frame": "0",
"tx_packets": "73",
"tx_bytes": "6009",
"tx_errors": "0",
"tx_dropped": "0",
"tx_overruns": "0",
"tx_carrier": "0",
"tx_collisions": "0",
"metric": null
"ipv4": [
{
"address": "192.168.71.137",
"mask": "255.255.255.0",
"broadcast": "192.168.71.255"
}
],
"ipv6": [
{
"address": "fe80::c1cb:715d:bc3e:b8a0",
"scope_id": null,
"mask": "64",
"scope": "0x20",
"type": "link"
}
]
}
]
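The backwards-compatibility behavior noted in footnote [0] above can be sketched as follows: the flat `ipv4_*`/`ipv6_*` fields mirror only the last address object in the `ipv4`/`ipv6` arrays. This is an illustrative helper, not jc's actual implementation:

```python
def add_compat_fields(interface):
    # Populate the legacy flat fields from the last address object,
    # mirroring footnote [0] in the schema above.
    if interface.get("ipv4"):
        last = interface["ipv4"][-1]
        interface["ipv4_addr"] = last["address"]
        interface["ipv4_mask"] = last["mask"]
        interface["ipv4_bcast"] = last["broadcast"]
    if interface.get("ipv6"):
        last = interface["ipv6"][-1]
        interface["ipv6_addr"] = last["address"]
        interface["ipv6_mask"] = last["mask"]
        interface["ipv6_scope"] = last["scope"]
    return interface
```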
@@ -191,7 +220,9 @@ Examples:
### parse
```python
def parse(data, raw=False, quiet=False)
def parse(data: str,
raw: bool = False,
quiet: bool = False) -> List[JSONDictType]
```
Main text parsing function
@@ -209,4 +240,4 @@ Returns:
### Parser Information
Compatibility: linux, aix, freebsd, darwin
Version 1.12 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 2.1 by Kelly Brazil (kellyjonbrazil@gmail.com)


@@ -130,4 +130,4 @@ Returns:
### Parser Information
Compatibility: linux
Version 1.2 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.3 by Kelly Brazil (kellyjonbrazil@gmail.com)


@@ -99,4 +99,4 @@ Returns:
### Parser Information
Compatibility: linux, darwin, cygwin, aix, freebsd
Version 1.1 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.2 by Kelly Brazil (kellyjonbrazil@gmail.com)

docs/parsers/openvpn.md

@@ -0,0 +1,178 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.openvpn"></a>
# jc.parsers.openvpn
jc - JSON Convert openvpn-status.log file parser
The `*_epoch` calculated timestamp fields are naive. (i.e. based on
the local time of the system the parser is run on)
Usage (cli):
$ cat openvpn-status.log | jc --openvpn
Usage (module):
import jc
result = jc.parse('openvpn', openvpn_status_log_file_output)
Schema:
{
"clients": [
{
"common_name": string,
"real_address": string,
"real_address_prefix": integer, # [0]
"real_address_port": integer, # [0]
"bytes_received": integer,
"bytes_sent": integer,
"connected_since": string,
"connected_since_epoch": integer,
"updated": string,
"updated_epoch": integer,
}
],
"routing_table": [
{
"virtual_address": string,
"virtual_address_prefix": integer, # [0]
"virtual_address_port": integer, # [0]
"common_name": string,
"real_address": string,
"real_address_prefix": integer, # [0]
"real_address_port": integer, # [0]
"last_reference": string,
"last_reference_epoch": integer,
}
],
"global_stats": {
"max_bcast_mcast_queue_len": integer
}
}
[0] null/None if not found
Examples:
$ cat openvpn-status.log | jc --openvpn -p
{
"clients": [
{
"common_name": "foo@example.com",
"real_address": "10.10.10.10",
"bytes_received": 334948,
"bytes_sent": 1973012,
"connected_since": "Thu Jun 18 04:23:03 2015",
"updated": "Thu Jun 18 08:12:15 2015",
"real_address_prefix": null,
"real_address_port": 49502,
"connected_since_epoch": 1434626583,
"updated_epoch": 1434640335
},
{
"common_name": "foo@example.com",
"real_address": "10.10.10.10",
"bytes_received": 334948,
"bytes_sent": 1973012,
"connected_since": "Thu Jun 18 04:23:03 2015",
"updated": "Thu Jun 18 08:12:15 2015",
"real_address_prefix": null,
"real_address_port": 49503,
"connected_since_epoch": 1434626583,
"updated_epoch": 1434640335
}
],
"routing_table": [
{
"virtual_address": "192.168.255.118",
"common_name": "baz@example.com",
"real_address": "10.10.10.10",
"last_reference": "Thu Jun 18 08:12:09 2015",
"virtual_address_prefix": null,
"virtual_address_port": null,
"real_address_prefix": null,
"real_address_port": 63414,
"last_reference_epoch": 1434640329
},
{
"virtual_address": "10.200.0.0",
"common_name": "baz@example.com",
"real_address": "10.10.10.10",
"last_reference": "Thu Jun 18 08:12:09 2015",
"virtual_address_prefix": 16,
"virtual_address_port": null,
"real_address_prefix": null,
"real_address_port": 63414,
"last_reference_epoch": 1434640329
}
],
"global_stats": {
"max_bcast_mcast_queue_len": 0
}
}
$ cat openvpn-status.log | jc --openvpn -p -r
{
"clients": [
{
"common_name": "foo@example.com",
"real_address": "10.10.10.10:49502",
"bytes_received": "334948",
"bytes_sent": "1973012",
"connected_since": "Thu Jun 18 04:23:03 2015",
"updated": "Thu Jun 18 08:12:15 2015"
},
{
"common_name": "foo@example.com",
"real_address": "10.10.10.10:49503",
"bytes_received": "334948",
"bytes_sent": "1973012",
"connected_since": "Thu Jun 18 04:23:03 2015",
"updated": "Thu Jun 18 08:12:15 2015"
}
],
"routing_table": [
{
"virtual_address": "192.168.255.118",
"common_name": "baz@example.com",
"real_address": "10.10.10.10:63414",
"last_reference": "Thu Jun 18 08:12:09 2015"
},
{
"virtual_address": "10.200.0.0/16",
"common_name": "baz@example.com",
"real_address": "10.10.10.10:63414",
"last_reference": "Thu Jun 18 08:12:09 2015"
}
],
"global_stats": {
"max_bcast_mcast_queue_len": "0"
}
}
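The prefix/port splitting shown in the examples above (e.g. `10.10.10.10:63414` and `10.200.0.0/16`) can be sketched like this. It is an illustrative helper covering only the IPv4 forms shown; IPv6 addresses, which themselves contain colons, would need additional handling:

```python
def split_address(addr):
    # Separate a trailing '/<prefix>' or ':<port>' from the address,
    # mirroring the processed real_address/virtual_address fields above.
    prefix = port = None
    if '/' in addr:
        addr, _, p = addr.partition('/')
        prefix = int(p)
    elif ':' in addr:  # IPv4 'address:port' form only
        addr, _, p = addr.rpartition(':')
        port = int(p)
    return addr, prefix, port
```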
<a id="jc.parsers.openvpn.parse"></a>
### parse
```python
def parse(data: str, raw: bool = False, quiet: bool = False) -> JSONDictType
```
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
Dictionary. Raw or processed structured data.
### Parser Information
Compatibility: linux, darwin, cygwin, win32, aix, freebsd
Version 1.0 by Kelly Brazil (kellyjonbrazil@gmail.com)

docs/parsers/os_prober.md

@@ -0,0 +1,66 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.os_prober"></a>
# jc.parsers.os\_prober
jc - JSON Convert `os-prober` command output parser
Usage (cli):
$ os-prober | jc --os-prober
or
$ jc os-prober
Usage (module):
import jc
result = jc.parse('os_prober', os_prober_command_output)
Schema:
{
"partition": string,
"efi_bootmgr": string, # [0]
"name": string,
"short_name": string,
"type": string
}
[0] only exists if an EFI boot manager is detected
Examples:
$ os-prober | jc --os-prober -p
{
"partition": "/dev/sda1",
"name": "Windows 10",
"short_name": "Windows",
"type": "chain"
}
<a id="jc.parsers.os_prober.parse"></a>
### parse
```python
def parse(data: str, raw: bool = False, quiet: bool = False) -> JSONDictType
```
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
Dictionary. Raw or processed structured data.
### Parser Information
Compatibility: linux
Version 1.1 by Kelly Brazil (kellyjonbrazil@gmail.com)

docs/parsers/pgpass.md

@@ -0,0 +1,75 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.pgpass"></a>
# jc.parsers.pgpass
jc - JSON Convert PostgreSQL password file parser
Usage (cli):
$ cat /var/lib/postgresql/.pgpass | jc --pgpass
Usage (module):
import jc
result = jc.parse('pgpass', postgres_password_file)
Schema:
[
{
"hostname": string,
"port": string,
"database": string,
"username": string,
"password": string
}
]
Examples:
$ cat /var/lib/postgresql/.pgpass | jc --pgpass -p
[
{
"hostname": "dbserver",
"port": "*",
"database": "db1",
"username": "dbuser",
"password": "pwd123"
},
{
"hostname": "dbserver2",
"port": "8888",
"database": "inventory",
"username": "joe:user",
"password": "abc123"
},
...
]
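A `.pgpass` line holds five colon-separated fields, and PostgreSQL allows `:` and `\` inside a field when backslash-escaped, which is how the `joe:user` username in the example above is possible. A minimal sketch of that split (illustrative, not jc's actual implementation):

```python
import re

def parse_pgpass_line(line):
    # Split on colons that are not backslash-escaped, then unescape
    # '\:' and '\\' within each field.
    fields = re.split(r'(?<!\\):', line)
    fields = [f.replace('\\:', ':').replace('\\\\', '\\') for f in fields]
    keys = ('hostname', 'port', 'database', 'username', 'password')
    return dict(zip(keys, fields))
```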
<a id="jc.parsers.pgpass.parse"></a>
### parse
```python
def parse(data: str,
raw: bool = False,
quiet: bool = False) -> List[JSONDictType]
```
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
List of Dictionaries. Raw or processed structured data.
### Parser Information
Compatibility: linux, darwin, cygwin, win32, aix, freebsd
Version 1.0 by Kelly Brazil (kellyjonbrazil@gmail.com)


@@ -106,4 +106,4 @@ Returns:
### Parser Information
Compatibility: linux, darwin, freebsd
Version 1.1 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.2 by Kelly Brazil (kellyjonbrazil@gmail.com)


@@ -6,7 +6,7 @@
jc - JSON Convert Proc file output parser
This parser automatically identifies the Proc file and calls the
corresponding parser to peform the parsing.
corresponding parser to perform the parsing.
Magic syntax for converting `/proc` files is also supported by running
`jc /proc/<path to file>`. Any `jc` options must be specified before the
@@ -85,7 +85,7 @@ Examples:
...
]
$ proc_modules | jc --proc_modules -p -r
$ cat /proc/modules | jc --proc-modules -p -r
[
{
"module": "binfmt_misc",

View File

@@ -114,4 +114,4 @@ Returns:
### Parser Information
Compatibility: linux, darwin, freebsd
Version 1.1 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.2 by Kelly Brazil (kellyjonbrazil@gmail.com)

docs/parsers/semver.md Normal file
View File

@@ -0,0 +1,75 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.semver"></a>
# jc.parsers.semver
jc - JSON Convert Semantic Version string parser
This parser conforms to the specification at https://semver.org/
Usage (cli):
$ echo 1.2.3-rc.1+44837 | jc --semver
Usage (module):
import jc
result = jc.parse('semver', semver_string)
Schema:
Strings that do not strictly conform to the specification will return an
empty object.
{
"major": integer,
"minor": integer,
"patch": integer,
"prerelease": string/null,
"build": string/null
}
Examples:
$ echo 1.2.3-rc.1+44837 | jc --semver -p
{
"major": 1,
"minor": 2,
"patch": 3,
"prerelease": "rc.1",
"build": "44837"
}
$ echo 1.2.3-rc.1+44837 | jc --semver -p -r
{
"major": "1",
"minor": "2",
"patch": "3",
"prerelease": "rc.1",
"build": "44837"
}
<a id="jc.parsers.semver.parse"></a>
### parse
```python
def parse(data: str, raw: bool = False, quiet: bool = False) -> JSONDictType
```
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
Dictionary. Raw or processed structured data.
### Parser Information
Compatibility: linux, darwin, cygwin, win32, aix, freebsd
Version 1.0 by Kelly Brazil (kellyjonbrazil@gmail.com)
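Strict conformance checking like this is typically done with the official validation regular expression published at semver.org. A self-contained sketch using that regex (an illustration, not the actual jc parser code) that returns an empty object for non-conforming strings, matching the behavior documented above:

```python
import re

# official semver.org validation pattern, with named groups
SEMVER_RE = re.compile(
    r'^(?P<major>0|[1-9]\d*)\.(?P<minor>0|[1-9]\d*)\.(?P<patch>0|[1-9]\d*)'
    r'(?:-(?P<prerelease>(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)'
    r'(?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?'
    r'(?:\+(?P<build>[0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?$'
)

def parse_semver(version: str) -> dict:
    """Return the semver components, or {} if the string does not conform."""
    match = SEMVER_RE.match(version.strip())
    if not match:
        return {}
    out = match.groupdict()
    for key in ('major', 'minor', 'patch'):
        out[key] = int(out[key])
    return out
```

Note that the spec forbids leading zeros in the numeric fields, so a string like `1.02.3` is rejected rather than coerced.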

docs/parsers/sshd_conf.md Normal file
View File

@@ -0,0 +1,507 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.sshd_conf"></a>
# jc.parsers.sshd\_conf
jc - JSON Convert sshd configuration file and `sshd -T` command output parser
This parser will work with `sshd` configuration files or the output of
`sshd -T`. Any `Match` blocks in the `sshd` configuration file will be
ignored.
Usage (cli):
$ sshd -T | jc --sshd-conf
or
$ jc sshd -T
or
$ cat sshd_conf | jc --sshd-conf
Usage (module):
import jc
result = jc.parse('sshd_conf', sshd_conf_output)
Schema:
{
"acceptenv": [
string
],
"addressfamily": string,
"allowagentforwarding": string,
"allowstreamlocalforwarding": string,
"allowtcpforwarding": string,
"authenticationmethods": string,
"authorizedkeyscommand": string,
"authorizedkeyscommanduser": string,
"authorizedkeysfile": [
string
],
"authorizedprincipalscommand": string,
"authorizedprincipalscommanduser": string,
"authorizedprincipalsfile": string,
"banner": string,
"casignaturealgorithms": [
string
],
"chrootdirectory": string,
"ciphers": [
string
],
"ciphers_strategy": string,
"clientalivecountmax": integer,
"clientaliveinterval": integer,
"compression": string,
"disableforwarding": string,
"exposeauthinfo": string,
"fingerprinthash": string,
"forcecommand": string,
"gatewayports": string,
"gssapiauthentication": string,
"gssapicleanupcredentials": string,
"gssapikexalgorithms": [
string
],
"gssapikeyexchange": string,
"gssapistorecredentialsonrekey": string,
"gssapistrictacceptorcheck": string,
"hostbasedacceptedalgorithms": [
string
],
"hostbasedauthentication": string,
"hostbasedusesnamefrompacketonly": string,
"hostkeyagent": string,
"hostkeyalgorithms": [
string
],
"hostkey": [
string
],
"ignorerhosts": string,
"ignoreuserknownhosts": string,
"include": [
string
],
"ipqos": [
string
],
"kbdinteractiveauthentication": string,
"kerberosauthentication": string,
"kerberosorlocalpasswd": string,
"kerberosticketcleanup": sttring,
"kexalgorithms": [
string
],
"listenaddress": [
string
],
"logingracetime": integer,
"loglevel": string,
"macs": [
string
],
"macs_strategy": string,
"maxauthtries": integer,
"maxsessions": integer,
"maxstartups": integer,
"maxstartups_rate": integer,
"maxstartups_full": integer,
"modulifile": string,
"passwordauthentication": string,
"permitemptypasswords": string,
"permitlisten": [
string
],
"permitopen": [
string
],
"permitrootlogin": string,
"permittty": string,
"permittunnel": string,
"permituserenvironment": string,
"permituserrc": string,
"persourcemaxstartups": string,
"persourcenetblocksize": string,
"pidfile": string,
"port": [
integer
],
"printlastlog": string,
"printmotd": string,
"pubkeyacceptedalgorithms": [
string
],
"pubkeyauthentication": string,
"pubkeyauthoptions": string,
"rekeylimit": integer,
"rekeylimit_time": integer,
"revokedkeys": string,
"securitykeyprovider": string,
"streamlocalbindmask": string,
"streamlocalbindunlink": string,
"strictmodes": string,
"subsystem": string,
"subsystem_command": string
"syslogfacility": string,
"tcpkeepalive": string,
"trustedusercakeys": string,
"usedns": string,
"usepam": string,
"versionaddendum": string,
"x11displayoffset": integer,
"x11forwarding": string,
"x11uselocalhost": string,
"xauthlocation": string
}
Examples:
$ sshd -T | jc --sshd-conf -p
{
"acceptenv": [
"LANG",
"LC_*"
],
"addressfamily": "any",
"allowagentforwarding": "yes",
"allowstreamlocalforwarding": "yes",
"allowtcpforwarding": "yes",
"authenticationmethods": "any",
"authorizedkeyscommand": "none",
"authorizedkeyscommanduser": "none",
"authorizedkeysfile": [
".ssh/authorized_keys",
".ssh/authorized_keys2"
],
"authorizedprincipalscommand": "none",
"authorizedprincipalscommanduser": "none",
"authorizedprincipalsfile": "none",
"banner": "none",
"casignaturealgorithms": [
"ssh-ed25519",
"ecdsa-sha2-nistp256",
"ecdsa-sha2-nistp384",
"ecdsa-sha2-nistp521",
"sk-ssh-ed25519@openssh.com",
"sk-ecdsa-sha2-nistp256@openssh.com",
"rsa-sha2-512",
"rsa-sha2-256"
],
"chrootdirectory": "none",
"ciphers": [
"chacha20-poly1305@openssh.com",
"aes128-ctr",
"aes192-ctr",
"aes256-ctr",
"aes128-gcm@openssh.com",
"aes256-gcm@openssh.com"
],
"ciphers_strategy": "+",
"clientalivecountmax": 3,
"clientaliveinterval": 0,
"compression": "yes",
"disableforwarding": "no",
"exposeauthinfo": "no",
"fingerprinthash": "SHA256",
"forcecommand": "none",
"gatewayports": "no",
"gssapiauthentication": "no",
"gssapicleanupcredentials": "yes",
"gssapikexalgorithms": [
"gss-group14-sha256-",
"gss-group16-sha512-",
"gss-nistp256-sha256-",
"gss-curve25519-sha256-",
"gss-group14-sha1-",
"gss-gex-sha1-"
],
"gssapikeyexchange": "no",
"gssapistorecredentialsonrekey": "no",
"gssapistrictacceptorcheck": "yes",
"hostbasedacceptedalgorithms": [
"ssh-ed25519-cert-v01@openssh.com",
"ecdsa-sha2-nistp256-cert-v01@openssh.com",
"ecdsa-sha2-nistp384-cert-v01@openssh.com",
"ecdsa-sha2-nistp521-cert-v01@openssh.com",
"sk-ssh-ed25519-cert-v01@openssh.com",
"sk-ecdsa-sha2-nistp256-cert-v01@openssh.com",
"rsa-sha2-512-cert-v01@openssh.com",
"rsa-sha2-256-cert-v01@openssh.com",
"ssh-ed25519",
"ecdsa-sha2-nistp256",
"ecdsa-sha2-nistp384",
"ecdsa-sha2-nistp521",
"sk-ssh-ed25519@openssh.com",
"sk-ecdsa-sha2-nistp256@openssh.com",
"rsa-sha2-512",
"rsa-sha2-256"
],
"hostbasedauthentication": "no",
"hostbasedusesnamefrompacketonly": "no",
"hostkeyagent": "none",
"hostkeyalgorithms": [
"ssh-ed25519-cert-v01@openssh.com",
"ecdsa-sha2-nistp256-cert-v01@openssh.com",
"ecdsa-sha2-nistp384-cert-v01@openssh.com",
"ecdsa-sha2-nistp521-cert-v01@openssh.com",
"sk-ssh-ed25519-cert-v01@openssh.com",
"sk-ecdsa-sha2-nistp256-cert-v01@openssh.com",
"rsa-sha2-512-cert-v01@openssh.com",
"rsa-sha2-256-cert-v01@openssh.com",
"ssh-ed25519",
"ecdsa-sha2-nistp256",
"ecdsa-sha2-nistp384",
"ecdsa-sha2-nistp521",
"sk-ssh-ed25519@openssh.com",
"sk-ecdsa-sha2-nistp256@openssh.com",
"rsa-sha2-512",
"rsa-sha2-256"
],
"hostkey": [
"/etc/ssh/ssh_host_ecdsa_key",
"/etc/ssh/ssh_host_ed25519_key",
"/etc/ssh/ssh_host_rsa_key"
],
"ignorerhosts": "yes",
"ignoreuserknownhosts": "no",
"ipqos": [
"lowdelay",
"throughput"
],
"kbdinteractiveauthentication": "no",
"kerberosauthentication": "no",
"kerberosorlocalpasswd": "yes",
"kerberosticketcleanup": "yes",
"kexalgorithms": [
"sntrup761x25519-sha512@openssh.com",
"curve25519-sha256",
"curve25519-sha256@libssh.org",
"ecdh-sha2-nistp256",
"ecdh-sha2-nistp384",
"ecdh-sha2-nistp521",
"diffie-hellman-group-exchange-sha256",
"diffie-hellman-group16-sha512",
"diffie-hellman-group18-sha512",
"diffie-hellman-group14-sha256"
],
"listenaddress": [
"0.0.0.0:22",
"[::]:22"
],
"logingracetime": 120,
"loglevel": "INFO",
"macs": [
"umac-64-etm@openssh.com",
"umac-128-etm@openssh.com",
"hmac-sha2-256-etm@openssh.com",
"hmac-sha2-512-etm@openssh.com",
"hmac-sha1-etm@openssh.com",
"umac-64@openssh.com",
"umac-128@openssh.com",
"hmac-sha2-256",
"hmac-sha2-512",
"hmac-sha1"
],
"macs_strategy": "^",
"maxauthtries": 6,
"maxsessions": 10,
"maxstartups": 10,
"modulifile": "/etc/ssh/moduli",
"passwordauthentication": "yes",
"permitemptypasswords": "no",
"permitlisten": [
"any"
],
"permitopen": [
"any"
],
"permitrootlogin": "without-password",
"permittty": "yes",
"permittunnel": "no",
"permituserenvironment": "no",
"permituserrc": "yes",
"persourcemaxstartups": "none",
"persourcenetblocksize": "32:128",
"pidfile": "/run/sshd.pid",
"port": [
22
],
"printlastlog": "yes",
"printmotd": "no",
"pubkeyacceptedalgorithms": [
"ssh-ed25519-cert-v01@openssh.com",
"ecdsa-sha2-nistp256-cert-v01@openssh.com",
"ecdsa-sha2-nistp384-cert-v01@openssh.com",
"ecdsa-sha2-nistp521-cert-v01@openssh.com",
"sk-ssh-ed25519-cert-v01@openssh.com",
"sk-ecdsa-sha2-nistp256-cert-v01@openssh.com",
"rsa-sha2-512-cert-v01@openssh.com",
"rsa-sha2-256-cert-v01@openssh.com",
"ssh-ed25519",
"ecdsa-sha2-nistp256",
"ecdsa-sha2-nistp384",
"ecdsa-sha2-nistp521",
"sk-ssh-ed25519@openssh.com",
"sk-ecdsa-sha2-nistp256@openssh.com",
"rsa-sha2-512",
"rsa-sha2-256"
],
"pubkeyauthentication": "yes",
"pubkeyauthoptions": "none",
"rekeylimit": 0,
"revokedkeys": "none",
"securitykeyprovider": "internal",
"streamlocalbindmask": "0177",
"streamlocalbindunlink": "no",
"strictmodes": "yes",
"subsystem": "sftp",
"syslogfacility": "AUTH",
"tcpkeepalive": "yes",
"trustedusercakeys": "none",
"usedns": "no",
"usepam": "yes",
"versionaddendum": "none",
"x11displayoffset": 10,
"x11forwarding": "yes",
"x11uselocalhost": "yes",
"xauthlocation": "/usr/bin/xauth",
"maxstartups_rate": 30,
"maxstartups_full": 100,
"rekeylimit_time": 0,
"subsystem_command": "/usr/lib/openssh/sftp-server"
}
$ sshd -T | jc --sshd-conf -p -r
{
"acceptenv": [
"LANG",
"LC_*"
],
"addressfamily": "any",
"allowagentforwarding": "yes",
"allowstreamlocalforwarding": "yes",
"allowtcpforwarding": "yes",
"authenticationmethods": "any",
"authorizedkeyscommand": "none",
"authorizedkeyscommanduser": "none",
"authorizedkeysfile": ".ssh/authorized_keys .ssh/authorized_keys2",
"authorizedprincipalscommand": "none",
"authorizedprincipalscommanduser": "none",
"authorizedprincipalsfile": "none",
"banner": "none",
"casignaturealgorithms": "ssh-ed25519,ecdsa-sha2-nistp256,ecdsa-s...",
"chrootdirectory": "none",
"ciphers": "chacha20-poly1305@openssh.com,aes128-ctr,aes192-ctr,...",
"ciphers_strategy": "+",
"clientalivecountmax": "3",
"clientaliveinterval": "0",
"compression": "yes",
"disableforwarding": "no",
"exposeauthinfo": "no",
"fingerprinthash": "SHA256",
"forcecommand": "none",
"gatewayports": "no",
"gssapiauthentication": "no",
"gssapicleanupcredentials": "yes",
"gssapikexalgorithms": "gss-group14-sha256-,gss-group16-sha512-,...",
"gssapikeyexchange": "no",
"gssapistorecredentialsonrekey": "no",
"gssapistrictacceptorcheck": "yes",
"hostbasedacceptedalgorithms": "ssh-ed25519-cert-v01@openssh.co...",
"hostbasedauthentication": "no",
"hostbasedusesnamefrompacketonly": "no",
"hostkeyagent": "none",
"hostkeyalgorithms": "ssh-ed25519-cert-v01@openssh.com,ecdsa-sha2...",
"hostkey": [
"/etc/ssh/ssh_host_ecdsa_key",
"/etc/ssh/ssh_host_ed25519_key",
"/etc/ssh/ssh_host_rsa_key"
],
"ignorerhosts": "yes",
"ignoreuserknownhosts": "no",
"ipqos": "lowdelay throughput",
"kbdinteractiveauthentication": "no",
"kerberosauthentication": "no",
"kerberosorlocalpasswd": "yes",
"kerberosticketcleanup": "yes",
"kexalgorithms": "sntrup761x25519-sha512@openssh.com,curve25519...",
"listenaddress": [
"0.0.0.0:22",
"[::]:22"
],
"logingracetime": "120",
"loglevel": "INFO",
"macs": "umac-64-etm@openssh.com,umac-128-etm@openssh.com,hmac...",
"macs_strategy": "^",
"maxauthtries": "6",
"maxsessions": "10",
"maxstartups": "10:30:100",
"modulifile": "/etc/ssh/moduli",
"passwordauthentication": "yes",
"permitemptypasswords": "no",
"permitlisten": "any",
"permitopen": "any",
"permitrootlogin": "without-password",
"permittty": "yes",
"permittunnel": "no",
"permituserenvironment": "no",
"permituserrc": "yes",
"persourcemaxstartups": "none",
"persourcenetblocksize": "32:128",
"pidfile": "/run/sshd.pid",
"port": [
"22"
],
"printlastlog": "yes",
"printmotd": "no",
"pubkeyacceptedalgorithms": "ssh-ed25519-cert-v01@openssh.com,...",
"pubkeyauthentication": "yes",
"pubkeyauthoptions": "none",
"rekeylimit": "0 0",
"revokedkeys": "none",
"securitykeyprovider": "internal",
"streamlocalbindmask": "0177",
"streamlocalbindunlink": "no",
"strictmodes": "yes",
"subsystem": "sftp /usr/lib/openssh/sftp-server",
"syslogfacility": "AUTH",
"tcpkeepalive": "yes",
"trustedusercakeys": "none",
"usedns": "no",
"usepam": "yes",
"versionaddendum": "none",
"x11displayoffset": "10",
"x11forwarding": "yes",
"x11uselocalhost": "yes",
"xauthlocation": "/usr/bin/xauth"
}
<a id="jc.parsers.sshd_conf.parse"></a>
### parse
```python
def parse(data: str, raw: bool = False, quiet: bool = False) -> JSONDictType
```
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
Dictionary. Raw or processed structured data.
### Parser Information
Compatibility: linux, darwin, freebsd
Version 1.0 by Kelly Brazil (kellyjonbrazil@gmail.com)
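`sshd -T` emits one lowercase keyword per line followed by its value, and repeatable keywords (`port`, `hostkey`, `listenaddress`, `acceptenv`, ...) appear once per value. A rough sketch of folding that output into a dictionary — a deliberate simplification, since the real parser also splits comma/space-separated list values, extracts strategy prefixes like `+`/`^`, and converts integers:

```python
def fold_sshd_output(text: str) -> dict:
    """Fold `sshd -T`-style output into a dict, accumulating repeated keys."""
    out = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        key, _, value = line.partition(' ')
        key = key.lower()
        if key in out:
            # repeatable keyword: promote to a list and append
            if not isinstance(out[key], list):
                out[key] = [out[key]]
            out[key].append(value)
        else:
            out[key] = value
    return out
```

This mirrors why `port`, `hostkey`, and `listenaddress` appear as arrays in the schema above while single-occurrence keywords stay scalar.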

View File

@@ -107,4 +107,4 @@ Returns:
### Parser Information
Compatibility: linux, darwin, freebsd
Version 1.2 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.3 by Kelly Brazil (kellyjonbrazil@gmail.com)

View File

@@ -53,7 +53,7 @@ Blank values converted to `null`/`None`.
]
[0] naive timestamp if "timestamp" field is parsable, else null
[1] timezone aware timestamp availabe for UTC, else null
[1] timezone aware timestamp available for UTC, else null
[2] this field exists if the syslog line is not parsable. The value
is the original syslog line.

View File

@@ -64,7 +64,7 @@ Blank values converted to `null`/`None`.
}
[0] naive timestamp if "timestamp" field is parsable, else null
[1] timezone aware timestamp availabe for UTC, else null
[1] timezone aware timestamp available for UTC, else null
[2] this field exists if the syslog line is not parsable. The value
is the original syslog line.

View File

@@ -123,4 +123,4 @@ Returns:
### Parser Information
Compatibility: linux
Version 1.1 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.2 by Kelly Brazil (kellyjonbrazil@gmail.com)

View File

@@ -5,6 +5,11 @@
jc - JSON Convert `XML` file parser
This parser adds a `@` prefix to attributes by default. This can be changed
to a `_` prefix by using the `-r` (cli) or `raw=True` (module) option.
Text values for nodes will have the key-name of `#text`.
Usage (cli):
$ cat foo.xml | jc --xml
@@ -93,4 +98,4 @@ Returns:
### Parser Information
Compatibility: linux, darwin, cygwin, win32, aix, freebsd
Version 1.6 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.7 by Kelly Brazil (kellyjonbrazil@gmail.com)

View File

@@ -51,7 +51,7 @@ Add `_jc_meta` object to output line if `ignore_exceptions=True`
### stream\_error
```python
def stream_error(e: BaseException, line: str) -> Dict[str, MetadataType]
def stream_error(e: BaseException, line: str) -> JSONDictType
```
Return an error `_jc_meta` field.

View File

@@ -91,7 +91,7 @@ Parameters:
the parser. compatible options:
linux, darwin, cygwin, win32, aix, freebsd
quiet: (bool) supress compatibility message if True
quiet: (bool) suppress compatibility message if True
Returns:
@@ -183,7 +183,7 @@ Returns:
### input\_type\_check
```python
def input_type_check(data: str) -> None
def input_type_check(data: object) -> None
```
Ensure input data is a string. Raises `TypeError` if not.
@@ -233,3 +233,6 @@ Returns a timestamp object with the following attributes:
utc (int | None): aware timestamp only if UTC timezone
detected in datetime string. None if conversion fails.
iso (str | None): ISO string - timezone information is output
only if UTC timezone is detected in the datetime string.

View File

@@ -7,7 +7,6 @@ import sys
import os
from datetime import datetime, timezone
import textwrap
import signal
import shlex
import subprocess
from typing import List, Union, Optional, TextIO
@@ -16,7 +15,7 @@ from .lib import (
__version__, parser_info, all_parser_info, parsers, _get_parser, _parser_is_streaming,
parser_mod_list, standard_parser_mod_list, plugin_parser_mod_list, streaming_parser_mod_list
)
from .jc_types import JSONDictType, AboutJCType, MetadataType, CustomColorType, ParserInfoType
from .jc_types import JSONDictType, CustomColorType, ParserInfoType
from . import utils
from .cli_data import (
long_options_map, new_pygments_colors, old_pygments_colors, helptext_preamble_string,
@@ -75,7 +74,7 @@ class JcCli():
def __init__(self) -> None:
self.data_in: Optional[Union[str, bytes, TextIO]] = None
self.data_out: Optional[Union[List[JSONDictType], JSONDictType, AboutJCType]] = None
self.data_out: Optional[Union[List[JSONDictType], JSONDictType]] = None
self.options: List[str] = []
self.args: List[str] = []
self.parser_module: Optional[ModuleType] = None
@@ -215,7 +214,7 @@ class JcCli():
return otext
@staticmethod
def about_jc() -> AboutJCType:
def about_jc() -> JSONDictType:
"""Return jc info and the contents of each parser.info as a dictionary"""
return {
'name': 'jc',
@@ -599,7 +598,7 @@ class JcCli():
even if there are no results.
"""
if self.run_timestamp:
meta_obj: MetadataType = {
meta_obj: JSONDictType = {
'parser': self.parser_name,
'timestamp': self.run_timestamp.timestamp()
}
@@ -610,9 +609,9 @@ class JcCli():
if isinstance(self.data_out, dict):
if '_jc_meta' not in self.data_out:
self.data_out['_jc_meta'] = {} # type: ignore
self.data_out['_jc_meta'] = {}
self.data_out['_jc_meta'].update(meta_obj) # type: ignore
self.data_out['_jc_meta'].update(meta_obj)
elif isinstance(self.data_out, list):
if not self.data_out:
@@ -623,26 +622,13 @@ class JcCli():
if '_jc_meta' not in item:
item['_jc_meta'] = {}
item['_jc_meta'].update(meta_obj) # type: ignore
item['_jc_meta'].update(meta_obj)
else:
utils.error_message(['Parser returned an unsupported object type.'])
self.exit_error()
def ctrlc(self, signum, frame) -> None:
"""Exit on SIGINT"""
self.exit_clean()
def run(self) -> None:
# break on ctrl-c keyboard interrupt
signal.signal(signal.SIGINT, self.ctrlc)
# break on pipe error. need try/except for windows compatibility
try:
signal.signal(signal.SIGPIPE, signal.SIG_DFL)
except AttributeError:
pass
def _run(self) -> None:
# enable colors for Windows cmd.exe terminal
if sys.platform.startswith('win32'):
os.system('')
@@ -724,12 +710,16 @@ class JcCli():
self.standard_parse_and_print()
self.exit_clean()
except BrokenPipeError:
sys.stdout = None # type: ignore
except (ParseError, LibraryNotInstalled) as e:
if self.debug:
raise
utils.error_message([
f'Parser issue with {self.parser_name}:', f'{e.__class__.__name__}: {e}',
f'Parser issue with {self.parser_name}:',
f'{e.__class__.__name__}: {e}',
'If this is the correct parser, try setting the locale to C (LC_ALL=C).',
f'For details use the -d or -dd option. Use "jc -h --{self.parser_name}" for help.'
])
@@ -751,6 +741,24 @@ class JcCli():
])
self.exit_error()
def run(self) -> None:
try:
self._run()
except KeyboardInterrupt:
utils.error_message(['Exit due to SIGINT.'])
self.exit_error()
except Exception as e:
if self.debug:
raise
utils.error_message([
'Exit due to unexpected error:',
f'{e.__class__.__name__}: {e}'
])
self.exit_error()
def main():
JcCli().run()

View File

@@ -1,11 +1,9 @@
"""jc - JSON Convert lib module"""
import sys
from datetime import datetime
from typing import Dict, List, Tuple, Iterator, Optional, Union
from typing import Any, Dict, List, Tuple, Iterator, Optional, Union
JSONDictType = Dict[str, Union[str, int, float, bool, List, Dict, None]]
MetadataType = Dict[str, Optional[Union[str, int, float, List[str], datetime]]]
JSONDictType = Dict[str, Any]
StreamingOutputType = Iterator[Union[JSONDictType, Tuple[BaseException, str]]]
if sys.version_info >= (3, 8):
@@ -45,9 +43,6 @@ else:
TimeStampFormatType = Dict
AboutJCType = Dict[str, Union[str, int, List[ParserInfoType]]]
try:
from pygments.token import (Name, Number, String, Keyword)
CustomColorType = Dict[Union[Name.Tag, Number, String, Keyword], str]

View File

@@ -9,7 +9,7 @@ from .jc_types import ParserInfoType, JSONDictType
from jc import appdirs
__version__ = '1.22.1'
__version__ = '1.22.3'
parsers: List[str] = [
'acpi',
@@ -19,10 +19,13 @@ parsers: List[str] = [
'asciitable',
'asciitable-m',
'blkid',
'cbt',
'cef',
'cef-s',
'chage',
'cksum',
'clf',
'clf-s',
'crontab',
'crontab-u',
'csv',
@@ -38,11 +41,13 @@ parsers: List[str] = [
'email-address',
'env',
'file',
'findmnt',
'finger',
'free',
'fstab',
'git-log',
'git-log-s',
'git-ls-remote',
'gpg',
'group',
'gshadow',
@@ -80,8 +85,11 @@ parsers: List[str] = [
'netstat',
'nmcli',
'ntpq',
'openvpn',
'os-prober',
'passwd',
'pci-ids',
'pgpass',
'pidstat',
'pidstat-s',
'ping',
@@ -145,9 +153,11 @@ parsers: List[str] = [
'rpm-qi',
'rsync',
'rsync-s',
'semver',
'sfdisk',
'shadow',
'ss',
'sshd-conf',
'stat',
'stat-s',
'sysctl',
@@ -453,22 +463,31 @@ def streaming_parser_mod_list(
return plist
def parser_info(parser_mod_name: str, documentation: bool = False) -> ParserInfoType:
def parser_info(
parser_mod_name: Union[str, ModuleType],
documentation: bool = False
) -> ParserInfoType:
"""
Returns a dictionary that includes the parser module metadata.
Parameters:
parser_mod_name: (string) name of the parser module. This
function will accept module_name,
parser_mod_name: (string or name of the parser module. This
Module) function will accept module_name,
cli-name, and --argument-name
variants of the module name.
variants of the module name as well
as a parser module object.
documentation: (boolean) include parser docstring if True
"""
# ensure parser_mod_name is a true module name and not a cli name
parser_mod_name = _cliname_to_modname(parser_mod_name)
parser_mod = _get_parser(parser_mod_name)
if isinstance(parser_mod_name, ModuleType):
parser_mod = parser_mod_name
parser_mod_name = parser_mod.__name__.split('.')[-1]
else:
# ensure parser_mod_name is a true module name and not a cli name
parser_mod_name = _cliname_to_modname(parser_mod_name)
parser_mod = _get_parser(parser_mod_name)
info_dict: ParserInfoType = {}
if hasattr(parser_mod, 'info'):
@@ -525,11 +544,17 @@ def all_parser_info(
return p_info_list
def get_help(parser_mod_name: str) -> None:
def get_help(parser_mod_name: Union[str, ModuleType]) -> None:
"""
Show help screen for the selected parser.
This function will accept **module_name**, **cli-name**, and
**--argument-name** variants of the module name string.
**--argument-name** variants of the module name string as well as a
parser module object.
"""
help(_get_parser(parser_mod_name))
if isinstance(parser_mod_name, ModuleType):
jc_parser = parser_mod_name
else:
jc_parser = _get_parser(parser_mod_name)
help(jc_parser)
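The change above lets `parser_info()` and `get_help()` accept either a module name string or an already-imported parser module object. The dispatch pattern can be sketched in isolation — `resolve_parser` below is a hypothetical stand-in for jc's internal `_cliname_to_modname`/`_get_parser` pair, demonstrated here with a stdlib module:

```python
import importlib
from types import ModuleType

def resolve_parser(parser_mod_or_name):
    """Accept a module object or a name string; return (short_name, module)."""
    if isinstance(parser_mod_or_name, ModuleType):
        mod = parser_mod_or_name
        name = mod.__name__.split('.')[-1]
    else:
        # normalize cli-name / --argument-name variants to a module name
        name = parser_mod_or_name.lstrip('-').replace('-', '_')
        mod = importlib.import_module(name)
    return name, mod
```

Branching on `ModuleType` first means callers who already hold the module pay no lookup cost, which is the same shape as the `isinstance(parser_mod_name, ModuleType)` check in the diff.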

View File

@@ -567,7 +567,7 @@ class DSASignature(Sequence):
@classmethod
def from_p1363(cls, data):
"""
Reads a signature from a byte string encoding accordint to IEEE P1363,
Reads a signature from a byte string encoding according to IEEE P1363,
which is used by Microsoft's BCryptSignHash() function.
:param data:

View File

@@ -247,7 +247,7 @@ class Asn1Value(object):
:param no_explicit:
If explicit tagging info should be removed from this instance.
Used internally to allow contructing the underlying value that
Used internally to allow constructing the underlying value that
has been wrapped in an explicit tag.
:param tag_type:
@@ -697,7 +697,7 @@ class Castable(object):
if other_class.tag != self.__class__.tag:
raise TypeError(unwrap(
'''
Can not covert a value from %s object to %s object since they
Can not convert a value from %s object to %s object since they
use different tags: %d versus %d
''',
type_name(other_class),
@@ -1349,7 +1349,7 @@ class Choice(Asn1Value):
class Concat(object):
"""
A class that contains two or more encoded child values concatentated
A class that contains two or more encoded child values concatenated
together. THIS IS NOT PART OF THE ASN.1 SPECIFICATION! This exists to handle
the x509.TrustedCertificate() class for OpenSSL certificates containing
extra information.
@@ -3757,7 +3757,7 @@ class Sequence(Asn1Value):
def _make_value(self, field_name, field_spec, value_spec, field_params, value):
"""
Contructs an appropriate Asn1Value object for a field
Constructs an appropriate Asn1Value object for a field
:param field_name:
A unicode string of the field name
@@ -3766,7 +3766,7 @@ class Sequence(Asn1Value):
An Asn1Value class that is the field spec
:param value_spec:
An Asn1Value class that is the vaue spec
An Asn1Value class that is the value spec
:param field_params:
None or a dict of params for the field spec

jc/parsers/cbt.py Normal file
View File

@@ -0,0 +1,194 @@
"""jc - JSON Convert `cbt` command output parser (Google Bigtable)
Parses the human-readable (but not machine-friendly) output of the `cbt`
command (for Google's Bigtable).
No effort is made to convert the data types of the values in the cells.
The `timestamp_epoch` calculated timestamp field is naive. (i.e. based on
the local time of the system the parser is run on)
The `timestamp_epoch_utc` calculated timestamp field is timezone-aware and
is only available if the timestamp has a UTC timezone.
The `timestamp_iso` calculated timestamp field will only include UTC
timezone information if the timestamp has a UTC timezone.
Raw output contains all cells for each column (including timestamps), while
the normal output contains only the latest value for each column.
Usage (cli):
$ cbt | jc --cbt
or
$ jc cbt
Usage (module):
import jc
result = jc.parse('cbt', cbt_command_output)
Schema:
[
{
"key": string,
"cells": {
<string>: { # column family
<string>: string # column: value
}
}
}
]
Schema (raw):
[
{
"key": string,
"cells": [
{
"column_family": string,
"column": string,
"value": string,
"timestamp_iso": string,
"timestamp_epoch": integer,
"timestamp_epoch_utc": integer
}
]
}
]
Examples:
$ cbt -project=$PROJECT -instance=$INSTANCE lookup $TABLE foo | jc --cbt -p
[
{
"key": "foo",
"cells": {
"foo": {
"bar": "baz"
}
}
}
]
$ cbt -project=$PROJECT -instance=$INSTANCE lookup $TABLE foo | jc --cbt -p -r
[
{
"key": "foo",
"cells": [
{
"column_family": "foo",
"column": "bar",
"value": "baz1",
"timestamp_iso": "1970-01-01T01:00:00",
"timestamp_epoch": 32400,
"timestamp_epoch_utc": null
}
]
}
]
"""
from itertools import groupby
from typing import List, Dict
from jc.jc_types import JSONDictType
import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.0'
description = '`cbt` (Google Bigtable) command parser'
author = 'Andreas Weiden'
author_email = 'andreas.weiden@gmail.com'
compatible = ['linux', 'darwin', 'cygwin', 'win32', 'aix', 'freebsd']
magic_commands = ['cbt']
__version__ = info.version
def _process(proc_data: List[JSONDictType]) -> List[JSONDictType]:
"""
Final processing to conform to the schema.
Parameters:
proc_data: (List of Dictionaries) raw structured data to process
Returns:
List of Dictionaries. Structured to conform to the schema.
"""
out_data = []
for row in proc_data:
cells: Dict = {}
key_func = lambda cell: (cell["column_family"], cell["column"])
all_cells = sorted(row["cells"], key=key_func)
for (column_family, column), group in groupby(all_cells, key=key_func):
group_list = sorted(group, key=lambda cell: cell["timestamp_iso"], reverse=True)
if column_family not in cells:
cells[column_family] = {}
cells[column_family][column] = group_list[0]["value"]
row["cells"] = cells
out_data.append(row)
return out_data
def parse(
data: str,
raw: bool = False,
quiet: bool = False
) -> List[JSONDictType]:
"""
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
List of Dictionaries. Raw or processed structured data.
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.input_type_check(data)
raw_output: List[Dict] = []
if jc.utils.has_data(data):
for line in filter(None, data.split("-" * 40)):
key = None
cells = []
column_name = ""
timestamp = ""
value_next = False
for field in line.splitlines():
if not field.strip():
continue
if field.startswith(" " * 4):
value = field.strip(' "')
if value_next:
dt = jc.utils.timestamp(timestamp, format_hint=(1750, 1755))
cells.append({
"column_family": column_name.split(":", 1)[0],
"column": column_name.split(":", 1)[1],
"value": value,
"timestamp_iso": dt.iso,
"timestamp_epoch": dt.naive,
"timestamp_epoch_utc": dt.utc
})
elif field.startswith(" " * 2):
column_name, timestamp = map(str.strip, field.split("@"))
value_next = True
else:
key = field
if key is not None:
raw_output.append({"key": key, "cells": cells})
return raw_output if raw else _process(raw_output)
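The latest-value selection in `_process` can be exercised standalone: cells are grouped by `(column_family, column)` and only the value with the newest timestamp survives, relying on the fact that ISO-8601 strings sort chronologically. A minimal illustration with made-up cell data:

```python
from itertools import groupby

cells = [
    {"column_family": "cf", "column": "bar", "value": "old",
     "timestamp_iso": "1970-01-01T00:00:00"},
    {"column_family": "cf", "column": "bar", "value": "new",
     "timestamp_iso": "1970-01-02T00:00:00"},
]

key_func = lambda c: (c["column_family"], c["column"])
latest = {}
# groupby needs its input sorted by the same key
for (cf, col), group in groupby(sorted(cells, key=key_func), key=key_func):
    # ISO-8601 strings compare chronologically, so max() picks the newest cell
    newest = max(group, key=lambda c: c["timestamp_iso"])
    latest.setdefault(cf, {})[col] = newest["value"]
```

After this loop `latest` has the same nested `{column_family: {column: value}}` shape as the parser's normal (non-raw) output.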

View File

@@ -185,7 +185,7 @@ def _pycef_parse(str_input):
# If the input entry had any blanks in the required headers, that's wrong
# and we should return. Note we explicitly don't check the last item in the
# split list becuase the header ends in a '|' which means the last item
# split list because the header ends in a '|' which means the last item
# will always be an empty string (it doesn't exist, but the delimiter does).
if "" in spl[0:-1]:
raise ParseError('Blank field(s) in CEF header. Is it valid CEF format?')

jc/parsers/clf.py Normal file
View File

@@ -0,0 +1,298 @@
"""jc - JSON Convert Common Log Format file parser
This parser will handle the Common Log Format standard as specified at
https://www.w3.org/Daemon/User/Config/Logging.html#common-logfile-format.
Combined Log Format is also supported. (Referer and User Agent fields added)
Extra fields may be present and will be enclosed in the `extra` field as
a single string.
If a log line cannot be parsed, an object with an `unparsable` field will
be present with a value of the original line.
The `epoch` calculated timestamp field is naive. (i.e. based on the
local time of the system the parser is run on)
The `epoch_utc` calculated timestamp field is timezone-aware and is
only available if the timezone field is UTC.
Usage (cli):
$ cat file.log | jc --clf
Usage (module):
import jc
result = jc.parse('clf', common_log_file_output)
Schema:
Empty strings and `-` values are converted to `null`/`None`.
[
{
"host": string,
"ident": string,
"authuser": string,
"date": string,
"day": integer,
"month": string,
"year": integer,
"hour": integer,
"minute": integer,
"second": integer,
"tz": string,
"request": string,
"request_method": string,
"request_url": string,
"request_version": string,
"status": integer,
"bytes": integer,
"referer": string,
"user_agent": string,
"extra": string,
"epoch": integer, # [0]
"epoch_utc": integer, # [1]
"unparsable": string # [2]
}
]
[0] naive timestamp
[1] timezone-aware timestamp. Only available if timezone field is UTC
[2] exists if the line was not able to be parsed
Examples:
$ cat file.log | jc --clf -p
[
{
"host": "127.0.0.1",
"ident": "user-identifier",
"authuser": "frank",
"date": "10/Oct/2000:13:55:36 -0700",
"day": 10,
"month": "Oct",
"year": 2000,
"hour": 13,
"minute": 55,
"second": 36,
"tz": "-0700",
"request": "GET /apache_pb.gif HTTPS/1.0",
"status": 200,
"bytes": 2326,
"referer": null,
"user_agent": null,
"extra": null,
"request_method": "GET",
"request_url": "/apache_pb.gif",
"request_version": "HTTPS/1.0",
"epoch": 971211336,
"epoch_utc": null
},
{
"host": "1.1.1.2",
"ident": null,
"authuser": null,
"date": "11/Nov/2016:03:04:55 +0100",
"day": 11,
"month": "Nov",
"year": 2016,
"hour": 3,
"minute": 4,
"second": 55,
"tz": "+0100",
"request": "GET /",
"status": 200,
"bytes": 83,
"referer": null,
"user_agent": null,
"extra": "- 9221 1.1.1.1",
"request_method": "GET",
"request_url": "/",
"request_version": null,
"epoch": 1478862295,
"epoch_utc": null
},
...
]
$ cat file.log | jc --clf -p -r
[
{
"host": "127.0.0.1",
"ident": "user-identifier",
"authuser": "frank",
"date": "10/Oct/2000:13:55:36 -0700",
"day": "10",
"month": "Oct",
"year": "2000",
"hour": "13",
"minute": "55",
"second": "36",
"tz": "-0700",
"request": "GET /apache_pb.gif HTTPS/1.0",
"status": "200",
"bytes": "2326",
"referer": null,
"user_agent": null,
"extra": "",
"request_method": "GET",
"request_url": "/apache_pb.gif",
"request_version": "HTTPS/1.0"
},
{
"host": "1.1.1.2",
"ident": "-",
"authuser": "-",
"date": "11/Nov/2016:03:04:55 +0100",
"day": "11",
"month": "Nov",
"year": "2016",
"hour": "03",
"minute": "04",
"second": "55",
"tz": "+0100",
"request": "GET /",
"status": "200",
"bytes": "83",
"referer": "-",
"user_agent": "-",
"extra": "- 9221 1.1.1.1",
"request_method": "GET",
"request_url": "/",
"request_version": null
},
...
]
"""
import re
from typing import List, Dict
from jc.jc_types import JSONDictType
import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.0'
description = 'Common and Combined Log Format file parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
compatible = ['linux', 'darwin', 'cygwin', 'win32', 'aix', 'freebsd']
__version__ = info.version
def _process(proc_data: List[JSONDictType]) -> List[JSONDictType]:
"""
Final processing to conform to the schema.
Parameters:
proc_data: (List of Dictionaries) raw structured data to process
Returns:
List of Dictionaries. Structured to conform to the schema.
"""
int_list = {'day', 'year', 'hour', 'minute', 'second', 'status', 'bytes'}
for log in proc_data:
for key, val in log.items():
# integer conversions
if key in int_list:
log[key] = jc.utils.convert_to_int(val)
# convert `-` and blank values to None
if val == '-' or val == '':
log[key] = None
# add unix timestamps
if 'date' in log:
ts = jc.utils.timestamp(log['date'], format_hint=(1800,))
log['epoch'] = ts.naive
log['epoch_utc'] = ts.utc
return proc_data
def parse(
data: str,
raw: bool = False,
quiet: bool = False
) -> List[JSONDictType]:
"""
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
List of Dictionaries. Raw or processed structured data.
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.input_type_check(data)
raw_output: List[Dict] = []
output_line: Dict = {}
clf_pattern = re.compile(r'''
^(?P<host>-|\S+)\s
(?P<ident>-|\S+)\s
(?P<authuser>-|\S+)\s
\[
(?P<date>
(?P<day>\d+)/
(?P<month>\S\S\S)/
(?P<year>\d\d\d\d):
(?P<hour>\d\d):
(?P<minute>\d\d):
(?P<second>\d\d)\s
(?P<tz>\S+)
)
\]\s
\"(?P<request>.*?)\"\s
(?P<status>-|\d\d\d)\s
(?P<bytes>-|\d+)\s?
(?:\"(?P<referer>.*?)\"\s?)?
(?:\"(?P<user_agent>.*?)\"\s?)?
(?P<extra>.*)
''', re.VERBOSE
)
request_pattern = re.compile(r'''
(?P<request_method>\S+)\s
(?P<request_url>.*?(?=\sHTTPS?/|$))\s? # positive lookahead for HTTP(S)/ or end of string
(?P<request_version>HTTPS?/[\d\.]+)?
''', re.VERBOSE
)
if jc.utils.has_data(data):
for line in filter(None, data.splitlines()):
output_line = {}
clf_match = re.match(clf_pattern, line)
if clf_match:
output_line = clf_match.groupdict()
if output_line.get('request', None):
request_string = output_line['request']
request_match = re.match(request_pattern, request_string)
if request_match:
output_line.update(request_match.groupdict())
raw_output.append(output_line)
else:
raw_output.append(
{"unparsable": line}
)
return raw_output if raw else _process(raw_output)
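As a sanity check, the CLF pattern can be exercised outside of jc. This is a trimmed sketch of the regex above (the referer, user-agent, and extra groups are omitted and the date sub-groups are collapsed) run against an invented log line:

```python
import re

# trimmed copy of the Common Log Format pattern used by the parser above
clf_pattern = re.compile(r'''
    ^(?P<host>-|\S+)\s
    (?P<ident>-|\S+)\s
    (?P<authuser>-|\S+)\s
    \[(?P<date>[^\]]+)\]\s
    \"(?P<request>.*?)\"\s
    (?P<status>-|\d\d\d)\s
    (?P<bytes>-|\d+)
''', re.VERBOSE)

line = '127.0.0.1 user-identifier frank [10/Oct/2000:13:55:36 -0700] ' \
       '"GET /apache_pb.gif HTTP/1.0" 200 2326'

match = clf_pattern.match(line)

# mirror the parser's fallback: unparsable lines become {"unparsable": line}
result = match.groupdict() if match else {'unparsable': line}
```

The named groups come back as strings here; integer conversion of `status`, `bytes`, and the date fields happens later in `_process`.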

jc/parsers/clf_s.py Normal file

@@ -0,0 +1,223 @@
"""jc - JSON Convert Common Log Format file streaming parser
> This streaming parser outputs JSON Lines (cli) or returns an Iterable of
> Dictionaries (module)
This parser will handle the Common Log Format standard as specified at
https://www.w3.org/Daemon/User/Config/Logging.html#common-logfile-format.
Combined Log Format is also supported. (Referer and User Agent fields added)
Extra fields may be present and will be enclosed in the `extra` field as
a single string.
If a log line cannot be parsed, an object with an `unparsable` field will
be present with a value of the original line.
The `epoch` calculated timestamp field is naive. (i.e. based on the
local time of the system the parser is run on)
The `epoch_utc` calculated timestamp field is timezone-aware and is
only available if the timezone field is UTC.
Usage (cli):
$ cat file.log | jc --clf-s
Usage (module):
import jc
result = jc.parse('clf_s', common_log_file_output.splitlines())
for item in result:
# do something
Schema:
Empty strings and `-` values are converted to `null`/`None`.
{
"host": string,
"ident": string,
"authuser": string,
"date": string,
"day": integer,
"month": string,
"year": integer,
"hour": integer,
"minute": integer,
"second": integer,
"tz": string,
"request": string,
"request_method": string,
"request_url": string,
"request_version": string,
"status": integer,
"bytes": integer,
"referer": string,
"user_agent": string,
"extra": string,
"epoch": integer, # [0]
"epoch_utc": integer, # [1]
"unparsable": string # [2]
}
[0] naive timestamp
[1] timezone-aware timestamp. Only available if timezone field is UTC
[2] exists if the line was not able to be parsed
Examples:
$ cat file.log | jc --clf-s
{"host":"127.0.0.1","ident":"user-identifier","authuser":"frank","...}
{"host":"1.1.1.2","ident":null,"authuser":null,"date":"11/Nov/2016...}
...
$ cat file.log | jc --clf-s -r
{"host":"127.0.0.1","ident":"user-identifier","authuser":"frank","...}
{"host":"1.1.1.2","ident":"-","authuser":"-","date":"11/Nov/2016:0...}
...
"""
import re
from typing import Dict, Iterable
import jc.utils
from jc.streaming import (
add_jc_meta, streaming_input_type_check, streaming_line_input_type_check, raise_or_yield
)
from jc.jc_types import JSONDictType, StreamingOutputType
from jc.exceptions import ParseError
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.0'
description = 'Common and Combined Log Format file streaming parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
compatible = ['linux', 'darwin', 'cygwin', 'win32', 'aix', 'freebsd']
streaming = True
__version__ = info.version
def _process(proc_data: JSONDictType) -> JSONDictType:
"""
Final processing to conform to the schema.
Parameters:
proc_data: (Dictionary) raw structured data to process
Returns:
Dictionary. Structured data to conform to the schema.
"""
int_list = {'day', 'year', 'hour', 'minute', 'second', 'status', 'bytes'}
for key, val in proc_data.items():
# integer conversions
if key in int_list:
proc_data[key] = jc.utils.convert_to_int(val)
# convert `-` and blank values to None
if val == '-' or val == '':
proc_data[key] = None
# add unix timestamps
if 'date' in proc_data:
ts = jc.utils.timestamp(proc_data['date'], format_hint=(1800,))
proc_data['epoch'] = ts.naive
proc_data['epoch_utc'] = ts.utc
return proc_data
@add_jc_meta
def parse(
data: Iterable[str],
raw: bool = False,
quiet: bool = False,
ignore_exceptions: bool = False
) -> StreamingOutputType:
"""
Main text parsing generator function. Returns an iterable object.
Parameters:
data: (iterable) line-based text data to parse
(e.g. sys.stdin or str.splitlines())
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
ignore_exceptions: (boolean) ignore parsing exceptions if True
Returns:
Iterable of Dictionaries
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
streaming_input_type_check(data)
clf_pattern = re.compile(r'''
^(?P<host>-|\S+)\s
(?P<ident>-|\S+)\s
(?P<authuser>-|\S+)\s
\[
(?P<date>
(?P<day>\d+)/
(?P<month>\S\S\S)/
(?P<year>\d\d\d\d):
(?P<hour>\d\d):
(?P<minute>\d\d):
(?P<second>\d\d)\s
(?P<tz>\S+)
)
\]\s
\"(?P<request>.*?)\"\s
(?P<status>-|\d\d\d)\s
(?P<bytes>-|\d+)\s?
(?:\"(?P<referer>.*?)\"\s?)?
(?:\"(?P<user_agent>.*?)\"\s?)?
(?P<extra>.*)
''', re.VERBOSE
)
request_pattern = re.compile(r'''
(?P<request_method>\S+)\s
(?P<request_url>.*?(?=\sHTTPS?/|$))\s? # positive lookahead for HTTP(S)/ or end of string
(?P<request_version>HTTPS?/[\d\.]+)?
''', re.VERBOSE
)
for line in data:
try:
streaming_line_input_type_check(line)
output_line: Dict = {}
if not line.strip():
continue
clf_match = re.match(clf_pattern, line)
if clf_match:
output_line = clf_match.groupdict()
if output_line.get('request', None):
request_string = output_line['request']
request_match = re.match(request_pattern, request_string)
if request_match:
output_line.update(request_match.groupdict())
else:
output_line = {"unparsable": line.strip()}
if output_line:
yield output_line if raw else _process(output_line)
else:
raise ParseError('Not Common Log Format data')
except Exception as e:
yield raise_or_yield(ignore_exceptions, e, line)
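The per-line control flow above (skip blanks, yield a parsed object, or yield an `unparsable` object) is easy to see in isolation. A minimal sketch with a made-up `key=value` line format standing in for the CLF regex:

```python
from typing import Dict, Iterable, Iterator

def parse_stream(lines: Iterable[str]) -> Iterator[Dict]:
    """Yield one dictionary per non-blank input line."""
    for line in lines:
        if not line.strip():
            continue                      # skip blank lines
        parts = line.split('=', 1)        # hypothetical "key=value" format
        if len(parts) == 2:
            yield {parts[0]: parts[1]}
        else:
            yield {'unparsable': line.strip()}

results = list(parse_stream(['a=1', '', 'garbage']))
```

The real streaming parser wraps this loop in a try/except and routes failures through `raise_or_yield` so a single bad line can either abort or be skipped, depending on `ignore_exceptions`.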


@@ -72,13 +72,15 @@ Examples:
...
]
"""
from typing import List, Union, Type
from jc.jc_types import JSONDictType
import jc.utils
import csv
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.4'
version = '1.5'
description = 'CSV file parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@@ -89,7 +91,7 @@ class info():
__version__ = info.version
def _process(proc_data):
def _process(proc_data: List[JSONDictType]) -> List[JSONDictType]:
"""
Final processing to conform to the schema.
@@ -107,7 +109,11 @@ def _process(proc_data):
return proc_data
def parse(data, raw=False, quiet=False):
def parse(
data: Union[str, bytes],
raw: bool = False,
quiet: bool = False
) -> List[JSONDictType]:
"""
Main text parsing function
@@ -124,6 +130,12 @@ def parse(data, raw=False, quiet=False):
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.input_type_check(data)
# remove BOM bytes, if present
if isinstance(data, str):
data = data.encode('utf-8')
data = data.decode('utf-8-sig')
raw_output = []
cleandata = data.splitlines()
@@ -132,7 +144,7 @@ def parse(data, raw=False, quiet=False):
if jc.utils.has_data(data):
dialect = 'excel' # default in csv module
dialect: Union[str, Type[csv.Dialect]] = 'excel' # default in csv module
try:
dialect = csv.Sniffer().sniff(data[:1024])
if '""' in data:
@@ -145,7 +157,4 @@ def parse(data, raw=False, quiet=False):
for row in reader:
raw_output.append(row)
if raw:
return raw_output
else:
return _process(raw_output)
return raw_output if raw else _process(raw_output)
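The BOM handling added above leans on Python's `utf-8-sig` codec: decoding with it consumes a leading byte-order mark if one is present and is a no-op otherwise, so round-tripping str input through bytes is safe for both cases. A quick sketch:

```python
def strip_bom(data):
    """Remove a leading UTF-8 BOM from str input, if present."""
    if isinstance(data, str):
        data = data.encode('utf-8')   # round-trip through bytes
    return data.decode('utf-8-sig')   # 'utf-8-sig' eats the BOM, if any

with_bom = '\ufeffname,value\nfoo,1'
without_bom = 'name,value\nfoo,1'
```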


@@ -63,7 +63,7 @@ from jc.exceptions import ParseError
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.3'
version = '1.4'
description = 'CSV file streaming parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@@ -127,7 +127,14 @@ def parse(data, raw=False, quiet=False, ignore_exceptions=False):
if len(temp_list) == 1:
raise ParseError('Unable to detect line endings. Please try the non-streaming CSV parser instead.')
sniffdata = '\n'.join(temp_list)[:1024]
# remove BOM bytes from first row, if present
if temp_list:
if isinstance(temp_list[0], str):
temp_list[0] = temp_list[0].encode('utf-8')
temp_list[0] = temp_list[0].decode('utf-8-sig')
sniffdata = '\r\n'.join(temp_list)[:1024]
dialect = 'excel' # default in csv module
try:


@@ -1,7 +1,7 @@
"""jc - JSON Convert ISO 8601 Datetime string parser
This parser supports standard ISO 8601 strings that include both date and
time. If no timezone or offset information is available in the sring, then
time. If no timezone or offset information is available in the string, then
UTC timezone is used.
Usage (cli):


@@ -101,7 +101,7 @@ Schema:
]
[0] naive timestamp if "when" field is parsable, else null
[1] timezone aware timestamp availabe for UTC, else null
[1] timezone aware timestamp available for UTC, else null
Examples:


@@ -1,5 +1,9 @@
"""jc - JSON Convert `du` command output parser
The `du -h` option is not supported with the default output. If you
would like to use `du -h` or other options that change the output, be sure
to use `jc --raw` (cli) or `raw=True` (module).
Usage (cli):
$ du | jc --du

jc/parsers/findmnt.py Normal file

@@ -0,0 +1,184 @@
"""jc - JSON Convert `findmnt` command output parser
Supports `-a`, `-l`, or no `findmnt` options.
> Note: Newer versions of `findmnt` have a JSON output option.
Usage (cli):
$ findmnt | jc --findmnt
or
$ jc findmnt
Usage (module):
import jc
result = jc.parse('findmnt', findmnt_command_output)
Schema:
[
{
"target": string,
"source": string,
"fstype": string,
"options": [
string
],
"kv_options": {
"<key_name>": string
}
    }
]
Examples:
$ findmnt | jc --findmnt -p
[
{
"target": "/",
"source": "/dev/mapper/centos-root",
"fstype": "xfs",
"options": [
"rw",
"relatime",
"seclabel",
"attr2",
"inode64",
"noquota"
]
},
{
"target": "/sys/fs/cgroup",
"source": "tmpfs",
"fstype": "tmpfs",
"options": [
"ro",
"nosuid",
"nodev",
"noexec",
"seclabel"
],
"kv_options": {
"mode": "755"
}
},
...
]
$ findmnt | jc --findmnt -p -r
[
{
"target": "/",
"source": "/dev/mapper/centos-root",
"fstype": "xfs",
"options": "rw,relatime,seclabel,attr2,inode64,noquota"
},
{
"target": "/sys/fs/cgroup",
"source": "tmpfs",
"fstype": "tmpfs",
"options": "ro,nosuid,nodev,noexec,seclabel,mode=755"
},
...
]
"""
import re
from typing import List, Dict
from jc.jc_types import JSONDictType
from jc.parsers.universal import simple_table_parse
import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.0'
description = '`findmnt` command parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
compatible = ['linux']
magic_commands = ['findmnt']
__version__ = info.version
def _process(proc_data: List[JSONDictType]) -> List[JSONDictType]:
"""
Final processing to conform to the schema.
Parameters:
proc_data: (List of Dictionaries) raw structured data to process
Returns:
List of Dictionaries. Structured to conform to the schema.
"""
# split normal options and k/v options
for item in proc_data:
reg_options = []
kv_options = {}
if 'options' in item:
opt_list = item['options'].split(',')
for option in opt_list:
if '=' in option:
k, v = option.split('=', maxsplit=1)
kv_options[k] = v
else:
reg_options.append(option)
if reg_options:
item['options'] = reg_options
if kv_options:
item['kv_options'] = kv_options
return proc_data
def _replace(matchobj: re.Match) -> str:
if matchobj:
matchlen = len(matchobj.group(1))
return ' ' * matchlen + '/'
return ''
def parse(
data: str,
raw: bool = False,
quiet: bool = False
) -> List[JSONDictType]:
"""
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
List of Dictionaries. Raw or processed structured data.
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.input_type_check(data)
raw_output: List[Dict] = []
table: List[str] = []
if jc.utils.has_data(data):
for line in filter(None, data.splitlines()):
# remove initial drawing characters
line = re.sub(r'^([│ ├─└─|`-]+)/', _replace, line, count=1)
table.append(line)
table[0] = table[0].lower()
raw_output = simple_table_parse(table)
return raw_output if raw else _process(raw_output)
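The option-splitting logic in `_process` separates plain flags from `key=value` pairs. The same loop standalone, with an invented options string:

```python
options = 'ro,nosuid,nodev,noexec,seclabel,mode=755'   # invented sample

reg_options = []
kv_options = {}
for option in options.split(','):
    if '=' in option:                        # key=value style option
        k, v = option.split('=', maxsplit=1)
        kv_options[k] = v
    else:                                    # plain flag
        reg_options.append(option)
```

`maxsplit=1` matters: it keeps values that themselves contain `=` intact.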


@@ -117,6 +117,10 @@ def parse(
streaming_line_input_type_check(line)
output_line: Dict = {}
# skip blank lines
if not line.strip():
continue
# parse the content here
# check out helper functions in jc.utils
# and jc.parsers.universal


@@ -35,13 +35,13 @@ Schema:
[
{
"commit": string,
"author": string,
"author_email": string,
"author": string/null,
"author_email": string/null,
"date": string,
"epoch": integer, # [0]
"epoch_utc": integer, # [1]
"commit_by": string,
"commit_by_email": string,
"commit_by": string/null,
"commit_by_email": string/null,
"commit_by_date": string,
"message": string,
"stats" : {
@@ -56,7 +56,7 @@ Schema:
]
[0] naive timestamp if "date" field is parsable, else null
[1] timezone aware timestamp availabe for UTC, else null
[1] timezone aware timestamp available for UTC, else null
Examples:
@@ -153,7 +153,7 @@ changes_pattern = re.compile(r'\s(?P<files>\d+)\s+(files? changed),\s+(?P<insert
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.2'
version = '1.3'
description = '`git log` command parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@@ -202,6 +202,28 @@ def _is_commit_hash(hash_string: str) -> bool:
return False
def _parse_name_email(line):
values = line.rsplit(maxsplit=1)
name = None
email = None
if len(values) == 2:
name = values[0]
if values[1].startswith('<') and values[1].endswith('>'):
email = values[1][1:-1]
else:
if values[0].lstrip().startswith('<') and values[0].endswith('>'):
email = values[0].lstrip()[1:-1]
else:
name = values[0]
if not name:
name = None
if not email:
email = None # covers '<>' case turning into null, not ''
return name, email
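The helper above handles both `Name <email>` and bare `<email>` forms, and turns the empty `<>` case into `None` rather than `''`. A trimmed standalone copy for illustration:

```python
def parse_name_email(line):
    """Split 'Name <email>' into (name, email); either side may be None."""
    values = line.rsplit(maxsplit=1)
    name = email = None
    if len(values) == 2:
        name = values[0]
        if values[1].startswith('<') and values[1].endswith('>'):
            email = values[1][1:-1]
    elif values:
        if values[0].startswith('<') and values[0].endswith('>'):
            email = values[0][1:-1]       # bare '<email>' with no name
        else:
            name = values[0]
    # 'or None' converts empty strings (the '<>' case) to None
    return (name or None, email or None)
```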
def parse(
data: str,
@@ -271,9 +293,7 @@ def parse(
continue
if line.startswith('Author: '):
values = line_list[1].rsplit(maxsplit=1)
output_line['author'] = values[0]
output_line['author_email'] = values[1].strip('<').strip('>')
output_line['author'], output_line['author_email'] = _parse_name_email(line_list[1])
continue
if line.startswith('Date: '):
@@ -289,9 +309,7 @@ def parse(
continue
if line.startswith('Commit: '):
values = line_list[1].rsplit(maxsplit=1)
output_line['commit_by'] = values[0]
output_line['commit_by_email'] = values[1].strip('<').strip('>')
output_line['commit_by'], output_line['commit_by_email'] = _parse_name_email(line_list[1])
continue
if line.startswith(' '):


@@ -36,13 +36,13 @@ Schema:
{
"commit": string,
"author": string,
"author_email": string,
"author": string/null,
"author_email": string/null,
"date": string,
"epoch": integer, # [0]
"epoch_utc": integer, # [1]
"commit_by": string,
"commit_by_email": string,
"commit_by": string/null,
"commit_by_email": string/null,
"commit_by_date": string,
"message": string,
"stats" : {
@@ -63,7 +63,7 @@ Schema:
}
[0] naive timestamp if "date" field is parsable, else null
[1] timezone aware timestamp availabe for UTC, else null
[1] timezone aware timestamp available for UTC, else null
Examples:
@@ -75,6 +75,7 @@ Examples:
import re
from typing import List, Dict, Iterable, Union
import jc.utils
from jc.parsers.git_log import _parse_name_email
from jc.streaming import (
add_jc_meta, streaming_input_type_check, streaming_line_input_type_check, raise_or_yield
)
@@ -87,7 +88,7 @@ changes_pattern = re.compile(r'\s(?P<files>\d+)\s+(files? changed),\s+(?P<insert
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.2'
version = '1.3'
description = '`git log` command streaming parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@@ -215,9 +216,7 @@ def parse(
continue
if line.startswith('Author: '):
values = line_list[1].rsplit(maxsplit=1)
output_line['author'] = values[0]
output_line['author_email'] = values[1].strip('<').strip('>')
output_line['author'], output_line['author_email'] = _parse_name_email(line_list[1])
continue
if line.startswith('Date: '):
@@ -233,9 +232,7 @@ def parse(
continue
if line.startswith('Commit: '):
values = line_list[1].rsplit(maxsplit=1)
output_line['commit_by'] = values[0]
output_line['commit_by_email'] = values[1].strip('<').strip('>')
output_line['commit_by'], output_line['commit_by_email'] = _parse_name_email(line_list[1])
continue
if line.startswith(' '):

jc/parsers/git_ls_remote.py Normal file

@@ -0,0 +1,139 @@
"""jc - JSON Convert `git ls-remote` command output parser
This parser outputs two schemas:
- Default: A single object with key/value pairs
- Raw: An array of objects (`--raw` (cli) or `raw=True` (module))
See the Schema section for more details
Usage (cli):
$ git ls-remote | jc --git-ls-remote
or
$ jc git ls-remote
Usage (module):
import jc
result = jc.parse('git_ls_remote', git_ls_remote_command_output)
Schema:
Default:
{
<reference>: string
}
Raw:
[
{
"reference": string,
"commit": string
}
]
Examples:
$ git ls-remote | jc --git-ls-remote -p
{
"HEAD": "214cd6b9e09603b3c4fa02203b24fb2bc3d4e338",
"refs/heads/dev": "b884f6aacca39e05994596d8fdfa7e7c4f1e0389",
"refs/heads/master": "214cd6b9e09603b3c4fa02203b24fb2bc3d4e338",
"refs/pull/1/head": "e416c77bed1267254da972b0f95b7ff1d43fccef",
...
}
$ git ls-remote | jc --git-ls-remote -p -r
[
{
"reference": "HEAD",
"commit": "214cd6b9e09603b3c4fa02203b24fb2bc3d4e338"
},
{
"reference": "refs/heads/dev",
"commit": "b884f6aacca39e05994596d8fdfa7e7c4f1e0389"
},
...
]
"""
from typing import List, Union
from jc.jc_types import JSONDictType
import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.0'
description = '`git ls-remote` command parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
compatible = ['linux', 'darwin', 'cygwin', 'win32', 'aix', 'freebsd']
magic_commands = ['git ls-remote']
__version__ = info.version
def _process(proc_data: List[JSONDictType]) -> JSONDictType:
"""
Final processing to conform to the schema.
Parameters:
proc_data: (List of Dictionaries) raw structured data to process
Returns:
Dictionary. Structured to conform to the schema.
"""
new_dict: JSONDictType = {}
for item in proc_data:
new_dict.update(
{
item['reference']: item['commit']
}
)
return new_dict
def parse(
data: str,
raw: bool = False,
quiet: bool = False
) -> Union[JSONDictType, List[JSONDictType]]:
"""
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
Dictionary (default) or List of Dictionaries (raw)
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.input_type_check(data)
raw_output: List[JSONDictType] = []
output_line: JSONDictType = {}
if jc.utils.has_data(data):
for line in filter(None, data.splitlines()):
commit, reference = line.split()
output_line = {
'reference': reference,
'commit': commit
}
raw_output.append(output_line)
return raw_output if raw else _process(raw_output)
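The default-schema conversion in `_process` above is just a fold of `{reference: commit}` pairs; the same thing as a dict comprehension, with made-up hashes:

```python
raw = [                                    # shape of the parser's raw output
    {'reference': 'HEAD', 'commit': 'aaaa'},
    {'reference': 'refs/heads/master', 'commit': 'bbbb'},
]

# equivalent of _process: fold the list into {reference: commit} pairs
default_schema = {item['reference']: item['commit'] for item in raw}
```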

File diff suppressed because it is too large


@@ -108,7 +108,7 @@ import jc.parsers.universal
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.2'
version = '1.3'
description = '`iostat` command streaming parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@@ -196,7 +196,7 @@ def parse(data, raw=False, quiet=False, ignore_exceptions=False):
output_line = {}
# ignore blank lines and header line
if line == '\n' or line == '' or line.startswith('Linux'):
if not line.strip() or line.startswith('Linux'):
continue
if line.startswith('avg-cpu:'):


@@ -560,7 +560,7 @@ def _process(proc_data: Dict) -> Dict:
def _b2a(byte_string: bytes) -> str:
"""Convert a byte string to a colon-delimited hex ascii string"""
# need try/except since seperator was only introduced in python 3.8.
# need try/except since separator was only introduced in python 3.8.
# provides compatibility for python 3.6 and 3.7.
try:
return binascii.hexlify(byte_string, ':').decode('utf-8')


@@ -180,7 +180,7 @@ def _post_parse(data):
ssid = {k: v for k, v in ssid.items() if v}
cleandata.append(ssid)
# remove asterisks from begining of values
# remove asterisks from beginning of values
for ssid in cleandata:
for key in ssid:
if ssid[key].startswith('*'):


@@ -78,7 +78,7 @@ def _process(proc_data: Dict) -> Dict:
def _b2a(byte_string: bytes) -> str:
"""Convert a byte string to a colon-delimited hex ascii string"""
# need try/except since seperator was only introduced in python 3.8.
# need try/except since separator was only introduced in python 3.8.
# provides compatibility for python 3.6 and 3.7.
try:
return binascii.hexlify(byte_string, ':').decode('utf-8')


@@ -77,7 +77,7 @@ from jc.exceptions import ParseError
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.1'
version = '1.2'
description = '`ls` command streaming parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@@ -148,7 +148,7 @@ def parse(data, raw=False, quiet=False, ignore_exceptions=False):
continue
# skip blank lines
if line.strip() == '':
if not line.strip():
continue
# Look for parent line if glob or -R is used


@@ -158,7 +158,7 @@ def _process(proc_data: List[JSONDictType]) -> List[JSONDictType]:
for key, val in item.items():
output[key] = val
if key in int_list:
output[key + '_int'] = int(val, 16) # type: ignore
output[key + '_int'] = int(val, 16)
new_list.append(output)
return new_list


@@ -228,7 +228,7 @@ def _normalize_header(keyname: str) -> str:
def _add_text_kv(key: str, value: Optional[str]) -> Optional[Dict]:
"""
Add keys with _text suffix if there is a text description inside
paranthesis at the end of a value. The value of the _text field will
parenthesis at the end of a value. The value of the _text field will
only be the text inside the parenthesis. This allows cleanup of the
original field (convert to int/float/etc) without losing information.
"""

jc/parsers/openvpn.py Normal file

@@ -0,0 +1,351 @@
"""jc - JSON Convert openvpn-status.log file parser
The `*_epoch` calculated timestamp fields are naive. (i.e. based on
the local time of the system the parser is run on)
Usage (cli):
$ cat openvpn-status.log | jc --openvpn
Usage (module):
import jc
result = jc.parse('openvpn', openvpn_status_log_file_output)
Schema:
{
"clients": [
{
"common_name": string,
"real_address": string,
"real_address_prefix": integer, # [0]
"real_address_port": integer, # [0]
"bytes_received": integer,
"bytes_sent": integer,
"connected_since": string,
"connected_since_epoch": integer,
"updated": string,
"updated_epoch": integer,
}
],
"routing_table": [
{
"virtual_address": string,
"virtual_address_prefix": integer, # [0]
"virtual_address_port": integer, # [0]
"common_name": string,
"real_address": string,
"real_address_prefix": integer, # [0]
"real_address_port": integer, # [0]
"last_reference": string,
"last_reference_epoch": integer,
}
],
"global_stats": {
"max_bcast_mcast_queue_len": integer
}
}
[0] null/None if not found
Examples:
$ cat openvpn-status.log | jc --openvpn -p
{
"clients": [
{
"common_name": "foo@example.com",
"real_address": "10.10.10.10",
"bytes_received": 334948,
"bytes_sent": 1973012,
"connected_since": "Thu Jun 18 04:23:03 2015",
"updated": "Thu Jun 18 08:12:15 2015",
"real_address_prefix": null,
"real_address_port": 49502,
"connected_since_epoch": 1434626583,
"updated_epoch": 1434640335
},
{
"common_name": "foo@example.com",
"real_address": "10.10.10.10",
"bytes_received": 334948,
"bytes_sent": 1973012,
"connected_since": "Thu Jun 18 04:23:03 2015",
"updated": "Thu Jun 18 08:12:15 2015",
"real_address_prefix": null,
"real_address_port": 49503,
"connected_since_epoch": 1434626583,
"updated_epoch": 1434640335
}
],
"routing_table": [
{
"virtual_address": "192.168.255.118",
"common_name": "baz@example.com",
"real_address": "10.10.10.10",
"last_reference": "Thu Jun 18 08:12:09 2015",
"virtual_address_prefix": null,
"virtual_address_port": null,
"real_address_prefix": null,
"real_address_port": 63414,
"last_reference_epoch": 1434640329
},
{
"virtual_address": "10.200.0.0",
"common_name": "baz@example.com",
"real_address": "10.10.10.10",
"last_reference": "Thu Jun 18 08:12:09 2015",
"virtual_address_prefix": 16,
"virtual_address_port": null,
"real_address_prefix": null,
"real_address_port": 63414,
"last_reference_epoch": 1434640329
}
],
"global_stats": {
"max_bcast_mcast_queue_len": 0
}
}
$ cat openvpn-status.log | jc --openvpn -p -r
{
"clients": [
{
"common_name": "foo@example.com",
"real_address": "10.10.10.10:49502",
"bytes_received": "334948",
"bytes_sent": "1973012",
"connected_since": "Thu Jun 18 04:23:03 2015",
"updated": "Thu Jun 18 08:12:15 2015"
},
{
"common_name": "foo@example.com",
"real_address": "10.10.10.10:49503",
"bytes_received": "334948",
"bytes_sent": "1973012",
"connected_since": "Thu Jun 18 04:23:03 2015",
"updated": "Thu Jun 18 08:12:15 2015"
}
],
"routing_table": [
{
"virtual_address": "192.168.255.118",
"common_name": "baz@example.com",
"real_address": "10.10.10.10:63414",
"last_reference": "Thu Jun 18 08:12:09 2015"
},
{
"virtual_address": "10.200.0.0/16",
"common_name": "baz@example.com",
"real_address": "10.10.10.10:63414",
"last_reference": "Thu Jun 18 08:12:09 2015"
}
],
"global_stats": {
"max_bcast_mcast_queue_len": "0"
}
}
"""
import re
import ipaddress
from typing import List, Dict, Tuple
from jc.jc_types import JSONDictType
import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.0'
description = 'openvpn-status.log file parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
compatible = ['linux', 'darwin', 'cygwin', 'win32', 'aix', 'freebsd']
__version__ = info.version
def _split_addr(addr_str: str) -> Tuple:
"""Check the type of address (v4, v6, mac) and split out the address,
prefix, and port. Values are None if they don't exist."""
address = possible_addr = prefix = port = possible_port = None
try:
address, prefix = addr_str.rsplit('/', maxsplit=1)
except Exception:
address = addr_str
# is this a mac address? then stop
if re.match(r'(?:\S\S\:){5}\S\S', address):
return address, prefix, port
# is it an ipv4 with port or just ipv6?
if ':' in address:
try:
possible_addr, possible_port = address.rsplit(':', maxsplit=1)
_ = ipaddress.IPv4Address(possible_addr)
address = possible_addr
port = possible_port
# assume it was an IPv6 address
except Exception:
pass
return address, prefix, port
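The splitting behavior above can be exercised directly. This is a close standalone copy (catching `ValueError`, which also covers `ipaddress.AddressValueError`); it returns string values, since integer conversion happens later in `_process`:

```python
import ipaddress
import re

def split_addr(addr_str):
    """Split an address into (address, prefix, port); missing parts are None."""
    address = prefix = port = None
    try:
        address, prefix = addr_str.rsplit('/', maxsplit=1)
    except ValueError:
        address = addr_str                    # no '/prefix' present
    if re.match(r'(?:\S\S:){5}\S\S', address):
        return address, prefix, port          # MAC address: stop here
    if ':' in address:                        # IPv4:port vs bare IPv6
        try:
            possible_addr, possible_port = address.rsplit(':', maxsplit=1)
            ipaddress.IPv4Address(possible_addr)
            address, port = possible_addr, possible_port
        except ValueError:
            pass                              # assume it was an IPv6 address
    return address, prefix, port
```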
def _process(proc_data: JSONDictType) -> JSONDictType:
"""
Final processing to conform to the schema.
Parameters:
proc_data: (Dictionary) raw structured data to process
Returns:
Dictionary. Structured to conform to the schema.
"""
int_list = {'bytes_received', 'bytes_sent', 'max_bcast_mcast_queue_len'}
date_fields = {'connected_since', 'updated', 'last_reference'}
addr_fields = {'real_address', 'virtual_address'}
if 'clients' in proc_data:
for item in proc_data['clients']:
for k, v in item.copy().items():
if k in int_list:
item[k] = jc.utils.convert_to_int(v)
if k in date_fields:
dt = jc.utils.timestamp(item[k], format_hint=(1000,))
item[k + '_epoch'] = dt.naive
if k in addr_fields:
addr, prefix, port = _split_addr(v)
item[k] = addr
item[k + '_prefix'] = jc.utils.convert_to_int(prefix)
item[k + '_port'] = jc.utils.convert_to_int(port)
if 'routing_table' in proc_data:
for item in proc_data['routing_table']:
for k, v in item.copy().items():
if k in date_fields:
dt = jc.utils.timestamp(item[k], format_hint=(1000,))
item[k + '_epoch'] = dt.naive
if k in addr_fields:
addr, prefix, port = _split_addr(v)
item[k] = addr
item[k + '_prefix'] = jc.utils.convert_to_int(prefix)
item[k + '_port'] = jc.utils.convert_to_int(port)
if 'global_stats' in proc_data:
for k, v in proc_data['global_stats'].items():
if k in int_list:
    proc_data['global_stats'][k] = jc.utils.convert_to_int(v)
return proc_data
def parse(
data: str,
raw: bool = False,
quiet: bool = False
) -> JSONDictType:
"""
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
Dictionary. Raw or processed structured data.
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.input_type_check(data)
raw_output: Dict = {}
clients: List[Dict] = []
routing_table: List[Dict] = []
global_stats: Dict = {}
section: str = '' # clients, routing, stats
updated: str = ''
if jc.utils.has_data(data):
for line in filter(None, data.splitlines()):
if line.startswith('OpenVPN CLIENT LIST'):
section = 'clients'
continue
if line.startswith('ROUTING TABLE'):
section = 'routing'
continue
if line.startswith('GLOBAL STATS'):
section = 'stats'
continue
if line.startswith('END'):
break
if section == 'clients' and line.startswith('Updated,'):
_, updated = line.split(',', maxsplit=1)
continue
if section == 'clients' and line.startswith('Common Name,Real Address,'):
continue
if section == 'clients':
c_name, real_addr, r_bytes, s_bytes, connected = line.split(',', maxsplit=5)
clients.append(
{
'common_name': c_name,
'real_address': real_addr,
'bytes_received': r_bytes,
'bytes_sent': s_bytes,
'connected_since': connected,
'updated': updated
}
)
continue
if section == 'routing' and line.startswith('Virtual Address,Common Name,'):
continue
if section == 'routing':
# Virtual Address,Common Name,Real Address,Last Ref
# 192.168.255.118,baz@example.com,10.10.10.10:63414,Thu Jun 18 08:12:09 2015
virt_addr, c_name, real_addr, last_ref = line.split(',', maxsplit=4)
route = {
'virtual_address': virt_addr,
'common_name': c_name,
'real_address': real_addr,
'last_reference': last_ref
}
# fixup for virtual addresses ending in "C"
if 'virtual_address' in route:
if route['virtual_address'].endswith('C'):
route['virtual_address'] = route['virtual_address'][:-1]
routing_table.append(route)
continue
if section == 'stats':
if line.startswith('Max bcast/mcast queue length'):
global_stats['max_bcast_mcast_queue_len'] = line.split(',', maxsplit=1)[1]
continue
raw_output['clients'] = clients
raw_output['routing_table'] = routing_table
raw_output['global_stats'] = {}
raw_output['global_stats'].update(global_stats)
return raw_output if raw else _process(raw_output)

jc/parsers/os_prober.py Normal file

@@ -0,0 +1,118 @@
"""jc - JSON Convert `os-prober` command output parser
Usage (cli):
$ os-prober | jc --os-prober
or
$ jc os-prober
Usage (module):
import jc
result = jc.parse('os_prober', os_prober_command_output)
Schema:
{
"partition": string,
"efi_bootmgr": string, # [0]
"name": string,
"short_name": string,
"type": string
}
[0] only exists if an EFI boot manager is detected
Examples:
$ os-prober | jc --os-prober -p
{
"partition": "/dev/sda1",
"name": "Windows 10",
"short_name": "Windows",
"type": "chain"
}
"""
from typing import Dict
from jc.jc_types import JSONDictType
import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.1'
description = '`os-prober` command parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
compatible = ['linux']
magic_commands = ['os-prober']
__version__ = info.version
def _process(proc_data: JSONDictType) -> JSONDictType:
"""
Final processing to conform to the schema.
Parameters:
proc_data: (Dictionary) raw structured data to process
Returns:
Dictionary. Structured to conform to the schema.
"""
# check for EFI partition@boot-manager and split/add fields
if 'partition' in proc_data and '@' in proc_data['partition']:
new_part, efi_bootmgr = proc_data['partition'].split('@', maxsplit=1)
proc_data['partition'] = new_part
proc_data['efi_bootmgr'] = efi_bootmgr
return proc_data
def parse(
data: str,
raw: bool = False,
quiet: bool = False
) -> JSONDictType:
"""
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
Dictionary. Raw or processed structured data.
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.input_type_check(data)
raw_output: Dict = {}
if jc.utils.has_data(data):
# /dev/sda1:Windows NT/2000/XP:WinNT:chain
# ^-------^ ^----------------^ ^---^ ^---^
# part. OS name for boot short May change: type of boot loader
# loader's pretty name required. Usually there is only
# output a 'linux' style bootloader or
# a chain one for other partitions
# with their own boot sectors.
partition, name, short_name, type_ = data.split(':')
raw_output = {
'partition': partition.strip(),
'name': name.strip(),
'short_name': short_name.strip(),
'type': type_.strip()
}
return raw_output if raw else _process(raw_output)
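To illustrate the two splits above, here is the colon-delimited line split followed by the EFI `partition@boot-manager` fixup (the sample line is illustrative, not taken from the diff):

```python
# os-prober emits one colon-delimited line; an '@' in the partition field
# carries the EFI boot manager path (sample values are illustrative).
line = '/dev/sda1@/EFI/Microsoft/Boot/bootmgfw.efi:Windows 10:Windows:efi'
partition, name, short_name, type_ = line.split(':')
efi_bootmgr = None
if '@' in partition:
    partition, efi_bootmgr = partition.split('@', maxsplit=1)
```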

jc/parsers/pgpass.py Normal file

@@ -0,0 +1,125 @@
"""jc - JSON Convert PostgreSQL password file parser
Usage (cli):
$ cat /var/lib/postgresql/.pgpass | jc --pgpass
Usage (module):
import jc
result = jc.parse('pgpass', postgres_password_file)
Schema:
[
{
"hostname": string,
"port": string,
"database": string,
"username": string,
"password": string
}
]
Examples:
$ cat /var/lib/postgresql/.pgpass | jc --pgpass -p
[
{
"hostname": "dbserver",
"port": "*",
"database": "db1",
"username": "dbuser",
"password": "pwd123"
},
{
"hostname": "dbserver2",
"port": "8888",
"database": "inventory",
"username": "joe:user",
"password": "abc123"
},
...
]
"""
from typing import List, Dict
from jc.jc_types import JSONDictType
import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.0'
description = 'PostgreSQL password file parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
compatible = ['linux', 'darwin', 'cygwin', 'win32', 'aix', 'freebsd']
__version__ = info.version
def _process(proc_data: List[JSONDictType]) -> List[JSONDictType]:
"""
Final processing to conform to the schema.
Parameters:
proc_data: (List of Dictionaries) raw structured data to process
Returns:
List of Dictionaries. Structured to conform to the schema.
"""
return proc_data
def parse(
data: str,
raw: bool = False,
quiet: bool = False
) -> List[JSONDictType]:
"""
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
List of Dictionaries. Raw or processed structured data.
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.input_type_check(data)
raw_output: List[Dict] = []
if jc.utils.has_data(data):
for line in filter(None, data.splitlines()):
# ignore comment lines
if line.strip().startswith('#'):
continue
# convert escaped characters (\ and :)
line = line.replace(':', '\u2063')
line = line.replace('\\\\', '\\')
line = line.replace('\\\u2063', ':')
hostname, port, database, username, password = line.split('\u2063')
raw_output.append(
{
'hostname': hostname,
'port': port,
'database': database,
'username': username,
'password': password
}
)
return raw_output if raw else _process(raw_output)
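The three `replace()` calls above implement unescaping with a sentinel: every `:` becomes U+2063 (invisible separator), escaped backslashes collapse, and `\:` (now backslash + sentinel) is restored to a literal `:` before the field split. Worked through on a line with an escaped colon in the username:

```python
# A .pgpass line where the username contains an escaped colon.
line = r'dbserver2:8888:inventory:joe\:user:abc123'
line = line.replace(':', '\u2063')        # protect every colon with a sentinel
line = line.replace('\\\\', '\\')         # unescape doubled backslashes
line = line.replace('\\\u2063', ':')      # restore escaped colons
fields = line.split('\u2063')
# fields: ['dbserver2', '8888', 'inventory', 'joe:user', 'abc123']
```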


@@ -161,7 +161,7 @@ def parse(
continue
if not line.startswith('#') and not found_first_hash:
# skip preample lines before header row
# skip preamble lines before header row
continue
if line.startswith('#') and not found_first_hash:


@@ -85,7 +85,7 @@ from jc.exceptions import ParseError
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.1'
version = '1.2'
description = '`ping` and `ping6` command streaming parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@@ -492,7 +492,7 @@ def parse(data, raw=False, quiet=False, ignore_exceptions=False):
output_line = {}
# skip blank lines
if line.strip() == '':
if not line.strip():
continue
# skip warning lines


@@ -80,7 +80,7 @@ def _process(proc_data: Dict) -> Dict:
def _b2a(byte_string: bytes) -> str:
"""Convert a byte string to a colon-delimited hex ascii string"""
# need try/except since seperator was only introduced in python 3.8.
# need try/except since separator was only introduced in python 3.8.
# provides compatibility for python 3.6 and 3.7.
try:
return binascii.hexlify(byte_string, ':').decode('utf-8')


@@ -1,7 +1,7 @@
"""jc - JSON Convert Proc file output parser
This parser automatically identifies the Proc file and calls the
corresponding parser to peform the parsing.
corresponding parser to perform the parsing.
Magic syntax for converting `/proc` files is also supported by running
`jc /proc/<path to file>`. Any `jc` options must be specified before the
@@ -80,7 +80,7 @@ Examples:
...
]
$ proc_modules | jc --proc_modules -p -r
$ cat /proc/modules | jc --proc-modules -p -r
[
{
"module": "binfmt_misc",


@@ -88,7 +88,7 @@ from jc.streaming import (
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.1'
version = '1.2'
description = '`rsync` command streaming parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@@ -267,7 +267,7 @@ def parse(
output_line: Dict = {}
# ignore blank lines
if line == '':
if not line.strip():
continue
file_line = file_line_re.match(line)

jc/parsers/semver.py Normal file

@@ -0,0 +1,121 @@
"""jc - JSON Convert Semantic Version string parser
This parser conforms to the specification at https://semver.org/
Usage (cli):
$ echo 1.2.3-rc.1+44837 | jc --semver
Usage (module):
import jc
result = jc.parse('semver', semver_string)
Schema:
Strings that do not strictly conform to the specification will return an
empty object.
{
"major": integer,
"minor": integer,
"patch": integer,
"prerelease": string/null,
"build": string/null
}
Examples:
$ echo 1.2.3-rc.1+44837 | jc --semver -p
{
"major": 1,
"minor": 2,
"patch": 3,
"prerelease": "rc.1",
"build": "44837"
}
$ echo 1.2.3-rc.1+44837 | jc --semver -p -r
{
"major": "1",
"minor": "2",
"patch": "3",
"prerelease": "rc.1",
"build": "44837"
}
"""
import re
from typing import Set, Dict
from jc.jc_types import JSONDictType
import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.0'
description = 'Semantic Version string parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
compatible = ['linux', 'darwin', 'cygwin', 'win32', 'aix', 'freebsd']
__version__ = info.version
def _process(proc_data: JSONDictType) -> JSONDictType:
"""
Final processing to conform to the schema.
Parameters:
proc_data: (Dictionary) raw structured data to process
Returns:
Dictionary. Structured to conform to the schema.
"""
int_list: Set[str] = {'major', 'minor', 'patch'}
for item in int_list:
if item in proc_data:
proc_data[item] = int(proc_data[item])
return proc_data
def parse(
data: str,
raw: bool = False,
quiet: bool = False
) -> JSONDictType:
"""
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
Dictionary. Raw or processed structured data.
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.input_type_check(data)
raw_output: Dict = {}
semver_pattern = re.compile(r'''
^(?P<major>0|[1-9]\d*)\.
(?P<minor>0|[1-9]\d*)\.
(?P<patch>0|[1-9]\d*)
(?:-(?P<prerelease>(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?
(?:\+(?P<build>[0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?$
''', re.VERBOSE)
if jc.utils.has_data(data):
semver_match = re.match(semver_pattern, data)
if semver_match:
raw_output = semver_match.groupdict()
return raw_output if raw else _process(raw_output)
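The verbose pattern is easiest to see with a worked example. Using the same regex as above, a conforming string yields a `groupdict()` of raw string fields, while a non-conforming string simply fails to match and the parser returns an empty object:

```python
import re

# Same pattern as the parser above (the regex suggested at semver.org).
semver_pattern = re.compile(r'''
    ^(?P<major>0|[1-9]\d*)\.
    (?P<minor>0|[1-9]\d*)\.
    (?P<patch>0|[1-9]\d*)
    (?:-(?P<prerelease>(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?
    (?:\+(?P<build>[0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?$
''', re.VERBOSE)

match = semver_pattern.match('1.2.3-rc.1+44837')
groups = match.groupdict() if match else {}
# groups: {'major': '1', 'minor': '2', 'patch': '3',
#          'prerelease': 'rc.1', 'build': '44837'}
assert semver_pattern.match('1.2') is None  # two-part versions are not semver
```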

jc/parsers/sshd_conf.py Normal file

@@ -0,0 +1,660 @@
"""jc - JSON Convert sshd configuration file and `sshd -T` command output parser
This parser will work with `sshd` configuration files or the output of
`sshd -T`. Any `Match` blocks in the `sshd` configuration file will be
ignored.
Usage (cli):
$ sshd -T | jc --sshd-conf
or
$ jc sshd -T
or
$ cat sshd_conf | jc --sshd-conf
Usage (module):
import jc
result = jc.parse('sshd_conf', sshd_conf_output)
Schema:
{
"acceptenv": [
string
],
"addressfamily": string,
"allowagentforwarding": string,
"allowstreamlocalforwarding": string,
"allowtcpforwarding": string,
"authenticationmethods": string,
"authorizedkeyscommand": string,
"authorizedkeyscommanduser": string,
"authorizedkeysfile": [
string
],
"authorizedprincipalscommand": string,
"authorizedprincipalscommanduser": string,
"authorizedprincipalsfile": string,
"banner": string,
"casignaturealgorithms": [
string
],
"chrootdirectory": string,
"ciphers": [
string
],
"ciphers_strategy": string,
"clientalivecountmax": integer,
"clientaliveinterval": integer,
"compression": string,
"disableforwarding": string,
"exposeauthinfo": string,
"fingerprinthash": string,
"forcecommand": string,
"gatewayports": string,
"gssapiauthentication": string,
"gssapicleanupcredentials": string,
"gssapikexalgorithms": [
string
],
"gssapikeyexchange": string,
"gssapistorecredentialsonrekey": string,
"gssapistrictacceptorcheck": string,
"hostbasedacceptedalgorithms": [
string
],
"hostbasedauthentication": string,
"hostbasedusesnamefrompacketonly": string,
"hostkeyagent": string,
"hostkeyalgorithms": [
string
],
"hostkey": [
string
],
"ignorerhosts": string,
"ignoreuserknownhosts": string,
"include": [
string
],
"ipqos": [
string
],
"kbdinteractiveauthentication": string,
"kerberosauthentication": string,
"kerberosorlocalpasswd": string,
"kerberosticketcleanup": string,
"kexalgorithms": [
string
],
"listenaddress": [
string
],
"logingracetime": integer,
"loglevel": string,
"macs": [
string
],
"macs_strategy": string,
"maxauthtries": integer,
"maxsessions": integer,
"maxstartups": integer,
"maxstartups_rate": integer,
"maxstartups_full": integer,
"modulifile": string,
"passwordauthentication": string,
"permitemptypasswords": string,
"permitlisten": [
string
],
"permitopen": [
string
],
"permitrootlogin": string,
"permittty": string,
"permittunnel": string,
"permituserenvironment": string,
"permituserrc": string,
"persourcemaxstartups": string,
"persourcenetblocksize": string,
"pidfile": string,
"port": [
integer
],
"printlastlog": string,
"printmotd": string,
"pubkeyacceptedalgorithms": [
string
],
"pubkeyauthentication": string,
"pubkeyauthoptions": string,
"rekeylimit": integer,
"rekeylimit_time": integer,
"revokedkeys": string,
"securitykeyprovider": string,
"streamlocalbindmask": string,
"streamlocalbindunlink": string,
"strictmodes": string,
"subsystem": string,
"subsystem_command": string,
"syslogfacility": string,
"tcpkeepalive": string,
"trustedusercakeys": string,
"usedns": string,
"usepam": string,
"versionaddendum": string,
"x11displayoffset": integer,
"x11forwarding": string,
"x11uselocalhost": string,
"xauthlocation": string
}
Examples:
$ sshd -T | jc --sshd-conf -p
{
"acceptenv": [
"LANG",
"LC_*"
],
"addressfamily": "any",
"allowagentforwarding": "yes",
"allowstreamlocalforwarding": "yes",
"allowtcpforwarding": "yes",
"authenticationmethods": "any",
"authorizedkeyscommand": "none",
"authorizedkeyscommanduser": "none",
"authorizedkeysfile": [
".ssh/authorized_keys",
".ssh/authorized_keys2"
],
"authorizedprincipalscommand": "none",
"authorizedprincipalscommanduser": "none",
"authorizedprincipalsfile": "none",
"banner": "none",
"casignaturealgorithms": [
"ssh-ed25519",
"ecdsa-sha2-nistp256",
"ecdsa-sha2-nistp384",
"ecdsa-sha2-nistp521",
"sk-ssh-ed25519@openssh.com",
"sk-ecdsa-sha2-nistp256@openssh.com",
"rsa-sha2-512",
"rsa-sha2-256"
],
"chrootdirectory": "none",
"ciphers": [
"chacha20-poly1305@openssh.com",
"aes128-ctr",
"aes192-ctr",
"aes256-ctr",
"aes128-gcm@openssh.com",
"aes256-gcm@openssh.com"
],
"ciphers_strategy": "+",
"clientalivecountmax": 3,
"clientaliveinterval": 0,
"compression": "yes",
"disableforwarding": "no",
"exposeauthinfo": "no",
"fingerprinthash": "SHA256",
"forcecommand": "none",
"gatewayports": "no",
"gssapiauthentication": "no",
"gssapicleanupcredentials": "yes",
"gssapikexalgorithms": [
"gss-group14-sha256-",
"gss-group16-sha512-",
"gss-nistp256-sha256-",
"gss-curve25519-sha256-",
"gss-group14-sha1-",
"gss-gex-sha1-"
],
"gssapikeyexchange": "no",
"gssapistorecredentialsonrekey": "no",
"gssapistrictacceptorcheck": "yes",
"hostbasedacceptedalgorithms": [
"ssh-ed25519-cert-v01@openssh.com",
"ecdsa-sha2-nistp256-cert-v01@openssh.com",
"ecdsa-sha2-nistp384-cert-v01@openssh.com",
"ecdsa-sha2-nistp521-cert-v01@openssh.com",
"sk-ssh-ed25519-cert-v01@openssh.com",
"sk-ecdsa-sha2-nistp256-cert-v01@openssh.com",
"rsa-sha2-512-cert-v01@openssh.com",
"rsa-sha2-256-cert-v01@openssh.com",
"ssh-ed25519",
"ecdsa-sha2-nistp256",
"ecdsa-sha2-nistp384",
"ecdsa-sha2-nistp521",
"sk-ssh-ed25519@openssh.com",
"sk-ecdsa-sha2-nistp256@openssh.com",
"rsa-sha2-512",
"rsa-sha2-256"
],
"hostbasedauthentication": "no",
"hostbasedusesnamefrompacketonly": "no",
"hostkeyagent": "none",
"hostkeyalgorithms": [
"ssh-ed25519-cert-v01@openssh.com",
"ecdsa-sha2-nistp256-cert-v01@openssh.com",
"ecdsa-sha2-nistp384-cert-v01@openssh.com",
"ecdsa-sha2-nistp521-cert-v01@openssh.com",
"sk-ssh-ed25519-cert-v01@openssh.com",
"sk-ecdsa-sha2-nistp256-cert-v01@openssh.com",
"rsa-sha2-512-cert-v01@openssh.com",
"rsa-sha2-256-cert-v01@openssh.com",
"ssh-ed25519",
"ecdsa-sha2-nistp256",
"ecdsa-sha2-nistp384",
"ecdsa-sha2-nistp521",
"sk-ssh-ed25519@openssh.com",
"sk-ecdsa-sha2-nistp256@openssh.com",
"rsa-sha2-512",
"rsa-sha2-256"
],
"hostkey": [
"/etc/ssh/ssh_host_ecdsa_key",
"/etc/ssh/ssh_host_ed25519_key",
"/etc/ssh/ssh_host_rsa_key"
],
"ignorerhosts": "yes",
"ignoreuserknownhosts": "no",
"ipqos": [
"lowdelay",
"throughput"
],
"kbdinteractiveauthentication": "no",
"kerberosauthentication": "no",
"kerberosorlocalpasswd": "yes",
"kerberosticketcleanup": "yes",
"kexalgorithms": [
"sntrup761x25519-sha512@openssh.com",
"curve25519-sha256",
"curve25519-sha256@libssh.org",
"ecdh-sha2-nistp256",
"ecdh-sha2-nistp384",
"ecdh-sha2-nistp521",
"diffie-hellman-group-exchange-sha256",
"diffie-hellman-group16-sha512",
"diffie-hellman-group18-sha512",
"diffie-hellman-group14-sha256"
],
"listenaddress": [
"0.0.0.0:22",
"[::]:22"
],
"logingracetime": 120,
"loglevel": "INFO",
"macs": [
"umac-64-etm@openssh.com",
"umac-128-etm@openssh.com",
"hmac-sha2-256-etm@openssh.com",
"hmac-sha2-512-etm@openssh.com",
"hmac-sha1-etm@openssh.com",
"umac-64@openssh.com",
"umac-128@openssh.com",
"hmac-sha2-256",
"hmac-sha2-512",
"hmac-sha1"
],
"macs_strategy": "^",
"maxauthtries": 6,
"maxsessions": 10,
"maxstartups": 10,
"modulifile": "/etc/ssh/moduli",
"passwordauthentication": "yes",
"permitemptypasswords": "no",
"permitlisten": [
"any"
],
"permitopen": [
"any"
],
"permitrootlogin": "without-password",
"permittty": "yes",
"permittunnel": "no",
"permituserenvironment": "no",
"permituserrc": "yes",
"persourcemaxstartups": "none",
"persourcenetblocksize": "32:128",
"pidfile": "/run/sshd.pid",
"port": [
22
],
"printlastlog": "yes",
"printmotd": "no",
"pubkeyacceptedalgorithms": [
"ssh-ed25519-cert-v01@openssh.com",
"ecdsa-sha2-nistp256-cert-v01@openssh.com",
"ecdsa-sha2-nistp384-cert-v01@openssh.com",
"ecdsa-sha2-nistp521-cert-v01@openssh.com",
"sk-ssh-ed25519-cert-v01@openssh.com",
"sk-ecdsa-sha2-nistp256-cert-v01@openssh.com",
"rsa-sha2-512-cert-v01@openssh.com",
"rsa-sha2-256-cert-v01@openssh.com",
"ssh-ed25519",
"ecdsa-sha2-nistp256",
"ecdsa-sha2-nistp384",
"ecdsa-sha2-nistp521",
"sk-ssh-ed25519@openssh.com",
"sk-ecdsa-sha2-nistp256@openssh.com",
"rsa-sha2-512",
"rsa-sha2-256"
],
"pubkeyauthentication": "yes",
"pubkeyauthoptions": "none",
"rekeylimit": 0,
"revokedkeys": "none",
"securitykeyprovider": "internal",
"streamlocalbindmask": "0177",
"streamlocalbindunlink": "no",
"strictmodes": "yes",
"subsystem": "sftp",
"syslogfacility": "AUTH",
"tcpkeepalive": "yes",
"trustedusercakeys": "none",
"usedns": "no",
"usepam": "yes",
"versionaddendum": "none",
"x11displayoffset": 10,
"x11forwarding": "yes",
"x11uselocalhost": "yes",
"xauthlocation": "/usr/bin/xauth",
"maxstartups_rate": 30,
"maxstartups_full": 100,
"rekeylimit_time": 0,
"subsystem_command": "/usr/lib/openssh/sftp-server"
}
$ sshd -T | jc --sshd-conf -p -r
{
"acceptenv": [
"LANG",
"LC_*"
],
"addressfamily": "any",
"allowagentforwarding": "yes",
"allowstreamlocalforwarding": "yes",
"allowtcpforwarding": "yes",
"authenticationmethods": "any",
"authorizedkeyscommand": "none",
"authorizedkeyscommanduser": "none",
"authorizedkeysfile": ".ssh/authorized_keys .ssh/authorized_keys2",
"authorizedprincipalscommand": "none",
"authorizedprincipalscommanduser": "none",
"authorizedprincipalsfile": "none",
"banner": "none",
"casignaturealgorithms": "ssh-ed25519,ecdsa-sha2-nistp256,ecdsa-s...",
"chrootdirectory": "none",
"ciphers": "chacha20-poly1305@openssh.com,aes128-ctr,aes192-ctr,...",
"ciphers_strategy": "+",
"clientalivecountmax": "3",
"clientaliveinterval": "0",
"compression": "yes",
"disableforwarding": "no",
"exposeauthinfo": "no",
"fingerprinthash": "SHA256",
"forcecommand": "none",
"gatewayports": "no",
"gssapiauthentication": "no",
"gssapicleanupcredentials": "yes",
"gssapikexalgorithms": "gss-group14-sha256-,gss-group16-sha512-,...",
"gssapikeyexchange": "no",
"gssapistorecredentialsonrekey": "no",
"gssapistrictacceptorcheck": "yes",
"hostbasedacceptedalgorithms": "ssh-ed25519-cert-v01@openssh.co...",
"hostbasedauthentication": "no",
"hostbasedusesnamefrompacketonly": "no",
"hostkeyagent": "none",
"hostkeyalgorithms": "ssh-ed25519-cert-v01@openssh.com,ecdsa-sha2...",
"hostkey": [
"/etc/ssh/ssh_host_ecdsa_key",
"/etc/ssh/ssh_host_ed25519_key",
"/etc/ssh/ssh_host_rsa_key"
],
"ignorerhosts": "yes",
"ignoreuserknownhosts": "no",
"ipqos": "lowdelay throughput",
"kbdinteractiveauthentication": "no",
"kerberosauthentication": "no",
"kerberosorlocalpasswd": "yes",
"kerberosticketcleanup": "yes",
"kexalgorithms": "sntrup761x25519-sha512@openssh.com,curve25519...",
"listenaddress": [
"0.0.0.0:22",
"[::]:22"
],
"logingracetime": "120",
"loglevel": "INFO",
"macs": "umac-64-etm@openssh.com,umac-128-etm@openssh.com,hmac...",
"macs_strategy": "^",
"maxauthtries": "6",
"maxsessions": "10",
"maxstartups": "10:30:100",
"modulifile": "/etc/ssh/moduli",
"passwordauthentication": "yes",
"permitemptypasswords": "no",
"permitlisten": "any",
"permitopen": "any",
"permitrootlogin": "without-password",
"permittty": "yes",
"permittunnel": "no",
"permituserenvironment": "no",
"permituserrc": "yes",
"persourcemaxstartups": "none",
"persourcenetblocksize": "32:128",
"pidfile": "/run/sshd.pid",
"port": [
"22"
],
"printlastlog": "yes",
"printmotd": "no",
"pubkeyacceptedalgorithms": "ssh-ed25519-cert-v01@openssh.com,...",
"pubkeyauthentication": "yes",
"pubkeyauthoptions": "none",
"rekeylimit": "0 0",
"revokedkeys": "none",
"securitykeyprovider": "internal",
"streamlocalbindmask": "0177",
"streamlocalbindunlink": "no",
"strictmodes": "yes",
"subsystem": "sftp /usr/lib/openssh/sftp-server",
"syslogfacility": "AUTH",
"tcpkeepalive": "yes",
"trustedusercakeys": "none",
"usedns": "no",
"usepam": "yes",
"versionaddendum": "none",
"x11displayoffset": "10",
"x11forwarding": "yes",
"x11uselocalhost": "yes",
"xauthlocation": "/usr/bin/xauth"
}
"""
from typing import Set, List, Dict
from jc.jc_types import JSONDictType
import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.0'
description = 'sshd config file and `sshd -T` command parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
compatible = ['linux', 'darwin', 'freebsd']
magic_commands = ['sshd -T']
__version__ = info.version
def _process(proc_data: JSONDictType) -> JSONDictType:
"""
Final processing to conform to the schema.
Parameters:
proc_data: (Dictionary) raw structured data to process
Returns:
Dictionary. Structured to conform to the schema.
"""
split_fields_space: Set[str] = {
'authorizedkeysfile', 'include', 'ipqos', 'permitlisten', 'permitopen'
}
split_fields_comma: Set[str] = {
'casignaturealgorithms', 'ciphers', 'gssapikexalgorithms', 'hostbasedacceptedalgorithms',
'hostbasedacceptedkeytypes', 'hostkeyalgorithms', 'kexalgorithms', 'macs',
'pubkeyacceptedalgorithms', 'pubkeyacceptedkeytypes'
}
int_list: Set[str] = {'clientalivecountmax', 'clientaliveinterval', 'logingracetime',
'maxauthtries', 'maxsessions', 'maxstartups', 'maxstartups_rate', 'maxstartups_full',
'rekeylimit', 'rekeylimit_time', 'x11displayoffset', 'x11maxdisplays'
}
dict_copy = proc_data.copy()
for key, val in dict_copy.items():
# this is a list value
if key == 'acceptenv':
new_list: List[str] = []
for item in val:
new_list.extend(item.split())
proc_data[key] = new_list
continue
# this is a list value
if key == 'include':
new_list = []
for item in val:
new_list.extend(item.split())
proc_data[key] = new_list
continue
if key == 'maxstartups':
maxstart_split = val.split(':', maxsplit=2)
proc_data[key] = maxstart_split[0]
if len(maxstart_split) > 1:
proc_data[key + '_rate'] = maxstart_split[1]
if len(maxstart_split) > 2:
proc_data[key + '_full'] = maxstart_split[2]
continue
if key == 'port':
port_list: List[int] = []
for item in val:
port_list.append(int(item))
proc_data[key] = port_list
continue
if key == 'rekeylimit':
rekey_split = val.split(maxsplit=1)
proc_data[key] = rekey_split[0]
if len(rekey_split) > 1:
proc_data[key + '_time'] = rekey_split[1]
continue
if key == 'subsystem':
sub_split = val.split(maxsplit=1)
proc_data[key] = sub_split[0]
if len(sub_split) > 1:
proc_data[key + '_command'] = sub_split[1]
continue
if key in split_fields_space:
proc_data[key] = val.split()
continue
if key in split_fields_comma:
proc_data[key] = val.split(',')
continue
for key, val in proc_data.items():
if key in int_list:
proc_data[key] = jc.utils.convert_to_int(val)
return proc_data
def parse(
data: str,
raw: bool = False,
quiet: bool = False
) -> JSONDictType:
"""
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
Dictionary. Raw or processed structured data.
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.input_type_check(data)
raw_output: Dict = {}
multi_fields: Set[str] = {'acceptenv', 'hostkey', 'include', 'listenaddress', 'port'}
modified_fields: Set[str] = {
'casignaturealgorithms', 'ciphers', 'hostbasedacceptedalgorithms',
'kexalgorithms', 'macs', 'pubkeyacceptedalgorithms'
}
modifiers: Set[str] = {'+', '-', '^'}
match_block_found = False
if jc.utils.has_data(data):
for line in filter(None, data.splitlines()):
# support configuration file by skipping commented lines
if line.strip().startswith('#'):
continue
# support configuration file by ignoring all lines between
# Match xxx and Match any
if line.strip().startswith('Match all'):
match_block_found = False
continue
if line.strip().startswith('Match'):
match_block_found = True
continue
if match_block_found:
continue
key, val = line.split(maxsplit=1)
# support configuration file by converting to lower case
key = key.lower()
if key in multi_fields:
if key not in raw_output:
raw_output[key] = []
raw_output[key].append(val)
continue
if key in modified_fields and val[0] in modifiers:
raw_output[key] = val[1:]
raw_output[key + '_strategy'] = val[0]
continue
raw_output[key] = val
continue
return raw_output if raw else _process(raw_output)
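Two of the post-processing rules above are worth a quick illustration: the `MaxStartups` `start:rate:full` triplet that is split into three fields, and the leading `+`/`-`/`^` strategy modifier that is peeled off algorithm-list values (the values below are illustrative):

```python
# 'start:rate:full' form of MaxStartups -> three raw string fields
maxstart_split = '10:30:100'.split(':', maxsplit=2)
maxstartups, maxstartups_rate, maxstartups_full = maxstart_split

# leading modifier on an algorithm list, e.g. 'Ciphers +aes256-ctr,aes128-ctr'
modifiers = {'+', '-', '^'}
val = '+aes256-ctr,aes128-ctr'
if val[0] in modifiers:
    strategy, ciphers = val[0], val[1:].split(',')
else:
    strategy, ciphers = None, val.split(',')
```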


@@ -84,7 +84,7 @@ from jc.exceptions import ParseError
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.2'
version = '1.3'
description = '`stat` command streaming parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@@ -165,7 +165,7 @@ def parse(
line = line.rstrip()
# ignore blank lines
if line == '':
if not line.strip():
continue
# linux output

View File

@@ -48,7 +48,7 @@ Blank values converted to `null`/`None`.
]
[0] naive timestamp if "timestamp" field is parsable, else null
[1] timezone aware timestamp availabe for UTC, else null
[1] timezone aware timestamp available for UTC, else null
[2] this field exists if the syslog line is not parsable. The value
is the original syslog line.


@@ -59,7 +59,7 @@ Blank values converted to `null`/`None`.
}
[0] naive timestamp if "timestamp" field is parsable, else null
[1] timezone aware timestamp availabe for UTC, else null
[1] timezone aware timestamp available for UTC, else null
[2] this field exists if the syslog line is not parsable. The value
is the original syslog line.


@@ -143,7 +143,7 @@ def _process(proc_data: JSONDictType) -> JSONDictType:
List of Dictionaries. Structured to conform to the schema.
"""
if 'L' in proc_data:
proc_data['L'] = int(proc_data['L']) # type: ignore
proc_data['L'] = int(proc_data['L'])
return proc_data


@@ -100,7 +100,7 @@ from jc.exceptions import ParseError
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.1'
version = '1.2'
description = '`vmstat` command streaming parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@@ -177,7 +177,7 @@ def parse(data, raw=False, quiet=False, ignore_exceptions=False):
output_line = {}
# skip blank lines
if line.strip() == '':
if not line.strip():
continue
# detect output type

View File

@@ -441,7 +441,7 @@ def _i2b(integer: int) -> bytes:
def _b2a(byte_string: bytes) -> str:
"""Convert a byte string to a colon-delimited hex ascii string"""
# need try/except since seperator was only introduced in python 3.8.
# need try/except since separator was only introduced in python 3.8.
# provides compatibility for python 3.6 and 3.7.
try:
return binascii.hexlify(byte_string, ':').decode('utf-8')
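The compatibility shim in this hunk can be exercised standalone. A self-contained sketch (not the parser's exact code) with the manual colon-joining fallback for Python < 3.8, where `hexlify()` did not yet accept a separator argument:

```python
import binascii

def b2a(byte_string: bytes) -> str:
    """Colon-delimited hex; hexlify() only grew a separator argument in 3.8."""
    try:
        return binascii.hexlify(byte_string, ':').decode('utf-8')
    except TypeError:  # Python 3.6/3.7: join two-character chunks by hand
        hexstr = binascii.hexlify(byte_string).decode('utf-8')
        return ':'.join(hexstr[i:i + 2] for i in range(0, len(hexstr), 2))
```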


@@ -1,5 +1,10 @@
"""jc - JSON Convert `XML` file parser
This parser adds a `@` prefix to attributes by default. This can be changed
to a `_` prefix by using the `-r` (cli) or `raw=True` (module) option.
Text values for nodes will have the key-name of `#text`.
Usage (cli):
$ cat foo.xml | jc --xml
@@ -68,10 +73,15 @@ Examples:
import jc.utils
from jc.exceptions import LibraryNotInstalled
try:
import xmltodict
except Exception:
raise LibraryNotInstalled('The xmltodict library is not installed.')
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.6'
version = '1.7'
description = 'XML file parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@@ -82,7 +92,7 @@ class info():
__version__ = info.version
def _process(proc_data):
def _process(proc_data, has_data=False):
"""
Final processing to conform to the schema.
@@ -94,9 +104,13 @@ def _process(proc_data):
Dictionary representing an XML document.
"""
raw_output = []
# No further processing
return proc_data
if has_data:
# standard output with @ prefix for attributes
raw_output = xmltodict.parse(proc_data, dict_constructor=dict)
return raw_output
def parse(data, raw=False, quiet=False):
@@ -113,22 +127,20 @@ def parse(data, raw=False, quiet=False):
Dictionary. Raw or processed structured data.
"""
# check if xml library is installed and fail gracefully if it is not
try:
import xmltodict
except Exception:
raise LibraryNotInstalled('The xmltodict library is not installed.')
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.input_type_check(data)
raw_output = []
has_data = False
if jc.utils.has_data(data):
raw_output = xmltodict.parse(data)
has_data = True
if raw:
if has_data:
# modified output with _ prefix for attributes
raw_output = xmltodict.parse(data, dict_constructor=dict, attr_prefix='_')
return raw_output
else:
return _process(raw_output)
return _process(data, has_data)


@@ -2,7 +2,7 @@
from functools import wraps
from typing import Dict, Tuple, Union, Iterable, Callable, TypeVar, cast, Any
from .jc_types import JSONDictType, MetadataType
from .jc_types import JSONDictType
F = TypeVar('F', bound=Callable[..., Any])
@@ -31,7 +31,7 @@ def stream_success(output_line: JSONDictType, ignore_exceptions: bool) -> JSONDi
return output_line
def stream_error(e: BaseException, line: str) -> Dict[str, MetadataType]:
def stream_error(e: BaseException, line: str) -> JSONDictType:
"""
Return an error `_jc_meta` field.
"""


@@ -6,7 +6,7 @@ import shutil
from datetime import datetime, timezone
from textwrap import TextWrapper
from functools import lru_cache
from typing import List, Dict, Iterable, Union, Optional, TextIO
from typing import Any, List, Dict, Iterable, Union, Optional, TextIO
from .jc_types import TimeStampFormatType
@@ -141,7 +141,7 @@ def compatibility(mod_name: str, compatible: List[str], quiet: bool = False) ->
the parser. compatible options:
linux, darwin, cygwin, win32, aix, freebsd
quiet: (bool) supress compatibility message if True
quiet: (bool) suppress compatibility message if True
Returns:
@@ -273,14 +273,14 @@ def convert_to_bool(value: object) -> bool:
return False
def input_type_check(data: str) -> None:
def input_type_check(data: object) -> None:
"""Ensure input data is a string. Raises `TypeError` if not."""
if not isinstance(data, str):
raise TypeError("Input data must be a 'str' object.")
class timestamp:
__slots__ = ('string', 'format', 'naive', 'utc')
__slots__ = ('string', 'format', 'naive', 'utc', 'iso')
def __init__(self,
datetime_string: Optional[str],
@@ -314,6 +314,9 @@ class timestamp:
utc (int | None): aware timestamp only if UTC timezone
detected in datetime string. None if conversion fails.
iso (str | None): ISO string - timezone information is output
only if UTC timezone is detected in the datetime string.
"""
self.string = datetime_string
@@ -326,16 +329,17 @@ class timestamp:
self.format = dt['format']
self.naive = dt['timestamp_naive']
self.utc = dt['timestamp_utc']
self.iso = dt['iso']
def __repr__(self) -> str:
return f'timestamp(string={self.string!r}, format={self.format}, naive={self.naive}, utc={self.utc})'
return f'timestamp(string={self.string!r}, format={self.format}, naive={self.naive}, utc={self.utc}, iso={self.iso!r})'
@staticmethod
@lru_cache(maxsize=512)
@lru_cache(maxsize=2048)
def _parse_dt(
dt_string: Optional[str],
format_hint: Optional[Iterable[int]] = None
) -> Dict[str, Optional[int]]:
) -> Dict[str, Any]:
"""
Input a datetime text string of several formats and convert to
a naive or timezone-aware epoch timestamp in UTC.
@@ -366,6 +370,9 @@ class timestamp:
# aware timestamp only if UTC timezone detected.
# None if conversion fails.
"timestamp_utc": int
# ISO string. None if conversion fails.
"iso": str
}
The `format` integer denotes which date_time format
@@ -380,6 +387,9 @@ class timestamp:
timezone is not found in the date-time string), then this
field will be None.
The `iso` string will only have timezone information if the
UTC timezone is detected in `dt_string`.
If the conversion completely fails, all fields will be None.
"""
formats: tuple[TimeStampFormatType, ...] = (
@@ -396,6 +406,9 @@ class timestamp:
{'id': 1700, 'format': '%m/%d/%Y, %I:%M:%S %p', 'locale': None}, # Windows english format with non-UTC tz (found in systeminfo cli output): 3/22/2021, 1:15:51 PM (UTC-0600)
{'id': 1705, 'format': '%m/%d/%Y, %I:%M:%S %p %Z', 'locale': None}, # Windows english format with UTC tz (found in systeminfo cli output): 3/22/2021, 1:15:51 PM (UTC)
{'id': 1710, 'format': '%m/%d/%Y, %I:%M:%S %p UTC%z', 'locale': None}, # Windows english format with UTC tz (found in systeminfo cli output): 3/22/2021, 1:15:51 PM (UTC+0000)
{'id': 1750, 'format': '%Y/%m/%d-%H:%M:%S.%f', 'locale': None}, # Google Big Table format with no timezone: 1970/01/01-01:00:00.000000
{'id': 1755, 'format': '%Y/%m/%d-%H:%M:%S.%f%z', 'locale': None}, # Google Big Table format with timezone: 1970/01/01-01:00:00.000000+00:00
{'id': 1800, 'format': '%d/%b/%Y:%H:%M:%S %z', 'locale': None}, # Common Log Format: 10/Oct/2000:13:55:36 -0700
{'id': 2000, 'format': '%a %d %b %Y %I:%M:%S %p %Z', 'locale': None}, # en_US.UTF-8 local format (found in upower cli output): Tue 23 Mar 2021 04:12:11 PM UTC
{'id': 3000, 'format': '%a %d %b %Y %I:%M:%S %p', 'locale': None}, # en_US.UTF-8 local format with non-UTC tz (found in upower cli output): Tue 23 Mar 2021 04:12:11 PM IST
{'id': 4000, 'format': '%A %d %B %Y %I:%M:%S %p %Z', 'locale': None}, # European-style local format (found in upower cli output): Tuesday 01 October 2019 12:50:41 PM UTC
@@ -461,10 +474,12 @@ class timestamp:
dt_utc: Optional[datetime] = None
timestamp_naive: Optional[int] = None
timestamp_utc: Optional[int] = None
timestamp_obj: Dict[str, Optional[int]] = {
iso_string: Optional[str] = None
timestamp_obj: Dict[str, Any] = {
'format': None,
'timestamp_naive': None,
'timestamp_utc': None
'timestamp_utc': None,
'iso': None
}
# convert format_hint to a tuple so it is hashable (for lru_cache)
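`_parse_dt` is memoized with `functools.lru_cache` (its `maxsize` raised from 512 to 2048 in this change), and `lru_cache` requires every argument to be hashable, which is why `format_hint` is converted from an iterable to a tuple first. A minimal illustration with a stand-in function (not the real `_parse_dt`):

```python
from functools import lru_cache

@lru_cache(maxsize=2048)
def parse_stub(dt_string, format_hint=None):
    # stand-in for jc.utils.timestamp._parse_dt
    return (dt_string, format_hint)

parse_stub('2022-12-16', (1750, 1755))      # tuple hint: hashable, cached
try:
    parse_stub('2022-12-16', [1750, 1755])  # list hint: unhashable
except TypeError as e:
    print('unhashable:', e)
```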
@@ -484,7 +499,10 @@ class timestamp:
if 'UTC+' in data or 'UTC-' in data:
utc_tz = bool('UTC+0000' in data or 'UTC-0000' in data)
elif '+0000' in data or '-0000' in data:
elif '+0000' in data \
or '-0000' in data \
or '+00:00' in data \
or '-00:00' in data:
utc_tz = True
# normalize the timezone by taking out any timezone reference, except UTC
@@ -520,8 +538,9 @@ class timestamp:
try:
locale.setlocale(locale.LC_TIME, fmt['locale'])
dt = datetime.strptime(normalized_datetime, fmt['format'])
timestamp_naive = int(dt.replace(tzinfo=None).timestamp())
timestamp_obj['format'] = fmt['id']
timestamp_naive = int(dt.replace(tzinfo=None).timestamp())
iso_string = dt.replace(tzinfo=None).isoformat()
locale.setlocale(locale.LC_TIME, None)
break
except Exception:
@@ -531,9 +550,11 @@ class timestamp:
if dt and utc_tz:
dt_utc = dt.replace(tzinfo=timezone.utc)
timestamp_utc = int(dt_utc.timestamp())
iso_string = dt_utc.isoformat()
if timestamp_naive:
timestamp_obj['timestamp_naive'] = timestamp_naive
timestamp_obj['timestamp_utc'] = timestamp_utc
timestamp_obj['iso'] = iso_string
return timestamp_obj
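The new `iso` field mirrors the two epoch fields: a naive parse yields a bare ISO string, while a detected-UTC parse carries the `+00:00` offset. A condensed stdlib sketch of that logic (the helper name is illustrative; it is not the `jc.utils.timestamp` API):

```python
from datetime import datetime, timezone

def to_timestamps(dt_string, fmt, utc_tz):
    """Return (timestamp_naive, timestamp_utc, iso) in the style of _parse_dt."""
    dt = datetime.strptime(dt_string, fmt)
    timestamp_naive = int(dt.replace(tzinfo=None).timestamp())
    iso_string = dt.replace(tzinfo=None).isoformat()
    timestamp_utc = None
    if utc_tz:
        dt_utc = dt.replace(tzinfo=timezone.utc)
        timestamp_utc = int(dt_utc.timestamp())
        iso_string = dt_utc.isoformat()       # ISO string gains the UTC offset
    return timestamp_naive, timestamp_utc, iso_string

# Google Big Table format (id 1750) at the epoch, treated as UTC:
print(to_timestamps('1970/01/01-00:00:00.000000', '%Y/%m/%d-%H:%M:%S.%f', True))
```

Note that `timestamp_naive` depends on the local timezone of the machine doing the conversion, which is why it is reported separately from `timestamp_utc`.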


@@ -1,4 +1,4 @@
.TH jc 1 2022-10-24 1.22.1 "JSON Convert"
.TH jc 1 2022-12-16 1.22.3 "JSON Convert"
.SH NAME
\fBjc\fP \- JSON Convert JSONifies the output of many CLI tools, file-types, and strings
.SH SYNOPSIS
@@ -65,6 +65,11 @@ multi-line ASCII and Unicode table parser
\fB--blkid\fP
`blkid` command parser
.TP
.B
\fB--cbt\fP
`cbt` (Google Bigtable) command parser
.TP
.B
\fB--cef\fP
@@ -85,6 +90,16 @@ CEF string streaming parser
\fB--cksum\fP
`cksum` and `sum` command parser
.TP
.B
\fB--clf\fP
Common and Combined Log Format file parser
.TP
.B
\fB--clf-s\fP
Common and Combined Log Format file streaming parser
.TP
.B
\fB--crontab\fP
@@ -160,6 +175,11 @@ Email Address string parser
\fB--file\fP
`file` command parser
.TP
.B
\fB--findmnt\fP
`findmnt` command parser
.TP
.B
\fB--finger\fP
@@ -185,6 +205,11 @@ Email Address string parser
\fB--git-log-s\fP
`git log` command streaming parser
.TP
.B
\fB--git-ls-remote\fP
`git ls-remote` command parser
.TP
.B
\fB--gpg\fP
@@ -370,6 +395,16 @@ M3U and M3U8 file parser
\fB--ntpq\fP
`ntpq -p` command parser
.TP
.B
\fB--openvpn\fP
openvpn-status.log file parser
.TP
.B
\fB--os-prober\fP
`os-prober` command parser
.TP
.B
\fB--passwd\fP
@@ -380,6 +415,11 @@ M3U and M3U8 file parser
\fB--pci-ids\fP
`pci.ids` file parser
.TP
.B
\fB--pgpass\fP
PostgreSQL password file parser
.TP
.B
\fB--pidstat\fP
@@ -695,6 +735,11 @@ PLIST file parser
\fB--rsync-s\fP
`rsync` command streaming parser
.TP
.B
\fB--semver\fP
Semantic Version string parser
.TP
.B
\fB--sfdisk\fP
@@ -710,6 +755,11 @@ PLIST file parser
\fB--ss\fP
`ss` command parser
.TP
.B
\fB--sshd-conf\fP
sshd config file and `sshd -T` command parser
.TP
.B
\fB--stat\fP
@@ -1047,7 +1097,7 @@ JC_COLORS=default,default,default,default
You can set the \fBNO_COLOR\fP environment variable to any value to disable color output in \fBjc\fP. Note that using the \fB-C\fP option to force color output will override both the \fBNO_COLOR\fP environment variable and the \fB-m\fP option.
.SH STREAMING PARSERS
Most parsers load all of the data from \fBSTDIN\fP, parse it, then output the entire JSON document serially. There are some streaming parsers (e.g. \fBls-s\fP, \fBping-s\fP, etc.) that immediately start processing and outputing the data line-by-line as JSON Lines (aka NDJSON) while it is being received from \fBSTDIN\fP. This can significantly reduce the amount of memory required to parse large amounts of command output (e.g. \fBls -lR /\fP) and can sometimes process the data more quickly. Streaming parsers have slightly different behavior than standard parsers as outlined below.
Most parsers load all of the data from \fBSTDIN\fP, parse it, then output the entire JSON document serially. There are some streaming parsers (e.g. \fBls-s\fP, \fBping-s\fP, etc.) that immediately start processing and outputting the data line-by-line as JSON Lines (aka NDJSON) while it is being received from \fBSTDIN\fP. This can significantly reduce the amount of memory required to parse large amounts of command output (e.g. \fBls -lR /\fP) and can sometimes process the data more quickly. Streaming parsers have slightly different behavior than standard parsers as outlined below.
.RS
Note: Streaming parsers cannot be used with the "magic" syntax


@@ -1,5 +1,5 @@
#!/usr/bin/env python3
# Genereate man page from jc metadata using jinja2 templates
# Generate man page from jc metadata using jinja2 templates
from datetime import date
import jc.cli
from jinja2 import Environment, FileSystemLoader


@@ -1,5 +1,5 @@
#!/usr/bin/env python3
# Genereate README.md from jc metadata using jinja2 templates
# Generate README.md from jc metadata using jinja2 templates
import jc.cli
import jc.lib
from jinja2 import Environment, FileSystemLoader


@@ -5,7 +5,7 @@ with open('README.md', 'r') as f:
setuptools.setup(
name='jc',
version='1.22.1',
version='1.22.3',
author='Kelly Brazil',
author_email='kellyjonbrazil@gmail.com',
description='Converts the output of popular command-line tools and file-types to JSON.',


@@ -182,7 +182,7 @@ JC_COLORS=default,default,default,default
You can set the \fBNO_COLOR\fP environment variable to any value to disable color output in \fBjc\fP. Note that using the \fB-C\fP option to force color output will override both the \fBNO_COLOR\fP environment variable and the \fB-m\fP option.
.SH STREAMING PARSERS
Most parsers load all of the data from \fBSTDIN\fP, parse it, then output the entire JSON document serially. There are some streaming parsers (e.g. \fBls-s\fP, \fBping-s\fP, etc.) that immediately start processing and outputing the data line-by-line as JSON Lines (aka NDJSON) while it is being received from \fBSTDIN\fP. This can significantly reduce the amount of memory required to parse large amounts of command output (e.g. \fBls -lR /\fP) and can sometimes process the data more quickly. Streaming parsers have slightly different behavior than standard parsers as outlined below.
Most parsers load all of the data from \fBSTDIN\fP, parse it, then output the entire JSON document serially. There are some streaming parsers (e.g. \fBls-s\fP, \fBping-s\fP, etc.) that immediately start processing and outputting the data line-by-line as JSON Lines (aka NDJSON) while it is being received from \fBSTDIN\fP. This can significantly reduce the amount of memory required to parse large amounts of command output (e.g. \fBls -lR /\fP) and can sometimes process the data more quickly. Streaming parsers have slightly different behavior than standard parsers as outlined below.
.RS
Note: Streaming parsers cannot be used with the "magic" syntax


@@ -114,7 +114,7 @@ pip3 install jc
| Debian/Ubuntu linux | `apt-get install jc` |
| Fedora linux | `dnf install jc` |
| openSUSE linux | `zypper install jc` |
| Archlinux Community Repository | `paru -S jc` or `aura -S jc` or `yay -S jc` |
| Arch linux | `pacman -S jc` |
| NixOS linux | `nix-env -iA nixpkgs.jc` or `nix-env -iA nixos.jc` |
| Guix System linux | `guix install jc` |
| Gentoo Linux | `emerge dev-python/jc` |
@@ -262,7 +262,7 @@ option.
### Streaming Parsers
Most parsers load all of the data from `STDIN`, parse it, then output the entire
JSON document serially. There are some streaming parsers (e.g. `ls-s` and
`ping-s`) that immediately start processing and outputing the data line-by-line
`ping-s`) that immediately start processing and outputting the data line-by-line
as [JSON Lines](https://jsonlines.org/) (aka [NDJSON](http://ndjson.org/)) while
it is being received from `STDIN`. This can significantly reduce the amount of
memory required to parse large amounts of command output (e.g. `ls -lR /`) and


@@ -0,0 +1 @@
[{"target":"/","source":"/dev/mapper/centos-root","fstype":"xfs","options":["rw","relatime","seclabel","attr2","inode64","noquota"]},{"target":"/sys","source":"sysfs","fstype":"sysfs","options":["rw","nosuid","nodev","noexec","relatime","seclabel"]},{"target":"/sys/kernel/security","source":"securityfs","fstype":"securityfs","options":["rw","nosuid","nodev","noexec","relatime"]},{"target":"/sys/fs/cgroup","source":"tmpfs","fstype":"tmpfs","options":["ro","nosuid","nodev","noexec","seclabel"],"kv_options":{"mode":"755"}},{"target":"/sys/fs/cgroup/systemd","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","xattr"],"kv_options":{"release_agent":"/usr/lib/systemd/systemd-cgroups-agent","name":"systemd"}},{"target":"/sys/fs/cgroup/net_cls,net_prio","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","net_prio","net_cls"]},{"target":"/sys/fs/cgroup/freezer","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","freezer"]},{"target":"/sys/fs/cgroup/devices","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","devices"]},{"target":"/sys/fs/cgroup/cpuset","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","cpuset"]},{"target":"/sys/fs/cgroup/cpu,cpuacct","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","cpuacct","cpu"]},{"target":"/sys/fs/cgroup/pids","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","pids"]},{"target":"/sys/fs/cgroup/memory","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","memory"]},{"target":"/sys/fs/cgroup/perf_event","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","perf_event"]},{"target":"/sys/fs/cgroup/hugetlb","so
urce":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","hugetlb"]},{"target":"/sys/fs/cgroup/blkio","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","blkio"]},{"target":"/sys/fs/pstore","source":"pstore","fstype":"pstore","options":["rw","nosuid","nodev","noexec","relatime"]},{"target":"/sys/kernel/config","source":"configfs","fstype":"configfs","options":["rw","relatime"]},{"target":"/sys/fs/selinux","source":"selinuxfs","fstype":"selinuxfs","options":["rw","relatime"]},{"target":"/sys/kernel/debug","source":"debugfs","fstype":"debugfs","options":["rw","relatime"]},{"target":"/proc","source":"proc","fstype":"proc","options":["rw","nosuid","nodev","noexec","relatime"]},{"target":"/proc/sys/fs/binfmt_misc","source":"systemd-1","fstype":"autofs","options":["rw","relatime","direct"],"kv_options":{"fd":"22","pgrp":"1","timeout":"0","minproto":"5","maxproto":"5","pipe_ino":"13827"}},{"target":"/dev","source":"devtmpfs","fstype":"devtmpfs","options":["rw","nosuid","seclabel"],"kv_options":{"size":"1918816k","nr_inodes":"479704","mode":"755"}},{"target":"/dev/shm","source":"tmpfs","fstype":"tmpfs","options":["rw","nosuid","nodev","seclabel"]},{"target":"/dev/pts","source":"devpts","fstype":"devpts","options":["rw","nosuid","noexec","relatime","seclabel"],"kv_options":{"gid":"5","mode":"620","ptmxmode":"000"}},{"target":"/dev/hugepages","source":"hugetlbfs","fstype":"hugetlbfs","options":["rw","relatime","seclabel"]},{"target":"/dev/mqueue","source":"mqueue","fstype":"mqueue","options":["rw","relatime","seclabel"]},{"target":"/run","source":"tmpfs","fstype":"tmpfs","options":["rw","nosuid","nodev","seclabel"],"kv_options":{"mode":"755"}},{"target":"/run/user/1000","source":"tmpfs","fstype":"tmpfs","options":["rw","nosuid","nodev","relatime","seclabel"],"kv_options":{"size":"386136k","mode":"700","uid":"1000","gid":"1000"}},{"target":"/boot","source":"/dev/sda1","fstype":"xfs","optio
ns":["rw","relatime","seclabel","attr2","inode64","noquota"]},{"target":"/var/lib/docker/containers","source":"/dev/mapper/centos-root[/var/lib/docker/containers]","fstype":"xfs","options":["rw","relatime","seclabel","attr2","inode64","noquota"]},{"target":"/var/lib/docker/overlay2","source":"/dev/mapper/centos-root[/var/lib/docker/overlay2]","fstype":"xfs","options":["rw","relatime","seclabel","attr2","inode64","noquota"]}]
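The `findmnt` fixtures above show how the parser separates the OPTIONS column: bare flags land in an `options` list and `key=value` entries in a `kv_options` object. A hedged sketch of that split (the helper name is illustrative, not the parser's actual function):

```python
def split_options(options_str):
    """Split 'rw,relatime,mode=755' into (options list, kv_options dict)."""
    options, kv_options = [], {}
    for item in options_str.split(','):
        if '=' in item:
            key, _, value = item.partition('=')
            kv_options[key] = value
        else:
            options.append(item)
    return options, kv_options

opts, kv = split_options('rw,nosuid,nodev,seclabel,size=1918816k,nr_inodes=479704,mode=755')
# opts -> ['rw', 'nosuid', 'nodev', 'seclabel']
# kv   -> {'size': '1918816k', 'nr_inodes': '479704', 'mode': '755'}
```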

tests/fixtures/centos-7.7/findmnt-a.out vendored Normal file

@@ -0,0 +1,32 @@
TARGET SOURCE FSTYPE OPTIONS
/ /dev/mapper/centos-root xfs rw,relatime,seclabel,attr2,inode64,noquota
|-/sys sysfs sysfs rw,nosuid,nodev,noexec,relatime,seclabel
| |-/sys/kernel/security securityfs securityfs rw,nosuid,nodev,noexec,relatime
| |-/sys/fs/cgroup tmpfs tmpfs ro,nosuid,nodev,noexec,seclabel,mode=755
| | |-/sys/fs/cgroup/systemd cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,xattr,release_agent=/usr/lib/systemd/systemd-cgroups-agent,name=systemd
| | |-/sys/fs/cgroup/net_cls,net_prio cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,net_prio,net_cls
| | |-/sys/fs/cgroup/freezer cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,freezer
| | |-/sys/fs/cgroup/devices cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,devices
| | |-/sys/fs/cgroup/cpuset cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,cpuset
| | |-/sys/fs/cgroup/cpu,cpuacct cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,cpuacct,cpu
| | |-/sys/fs/cgroup/pids cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,pids
| | |-/sys/fs/cgroup/memory cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,memory
| | |-/sys/fs/cgroup/perf_event cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,perf_event
| | |-/sys/fs/cgroup/hugetlb cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,hugetlb
| | `-/sys/fs/cgroup/blkio cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,blkio
| |-/sys/fs/pstore pstore pstore rw,nosuid,nodev,noexec,relatime
| |-/sys/kernel/config configfs configfs rw,relatime
| |-/sys/fs/selinux selinuxfs selinuxfs rw,relatime
| `-/sys/kernel/debug debugfs debugfs rw,relatime
|-/proc proc proc rw,nosuid,nodev,noexec,relatime
| `-/proc/sys/fs/binfmt_misc systemd-1 autofs rw,relatime,fd=22,pgrp=1,timeout=0,minproto=5,maxproto=5,direct,pipe_ino=13827
|-/dev devtmpfs devtmpfs rw,nosuid,seclabel,size=1918816k,nr_inodes=479704,mode=755
| |-/dev/shm tmpfs tmpfs rw,nosuid,nodev,seclabel
| |-/dev/pts devpts devpts rw,nosuid,noexec,relatime,seclabel,gid=5,mode=620,ptmxmode=000
| |-/dev/hugepages hugetlbfs hugetlbfs rw,relatime,seclabel
| `-/dev/mqueue mqueue mqueue rw,relatime,seclabel
|-/run tmpfs tmpfs rw,nosuid,nodev,seclabel,mode=755
| `-/run/user/1000 tmpfs tmpfs rw,nosuid,nodev,relatime,seclabel,size=386136k,mode=700,uid=1000,gid=1000
|-/boot /dev/sda1 xfs rw,relatime,seclabel,attr2,inode64,noquota
|-/var/lib/docker/containers /dev/mapper/centos-root[/var/lib/docker/containers] xfs rw,relatime,seclabel,attr2,inode64,noquota
`-/var/lib/docker/overlay2 /dev/mapper/centos-root[/var/lib/docker/overlay2] xfs rw,relatime,seclabel,attr2,inode64,noquota


@@ -0,0 +1 @@
[{"target":"/sys","source":"sysfs","fstype":"sysfs","options":["rw","nosuid","nodev","noexec","relatime","seclabel"]},{"target":"/proc","source":"proc","fstype":"proc","options":["rw","nosuid","nodev","noexec","relatime"]},{"target":"/dev","source":"devtmpfs","fstype":"devtmpfs","options":["rw","nosuid","seclabel"],"kv_options":{"size":"1918816k","nr_inodes":"479704","mode":"755"}},{"target":"/sys/kernel/security","source":"securityfs","fstype":"securityfs","options":["rw","nosuid","nodev","noexec","relatime"]},{"target":"/dev/shm","source":"tmpfs","fstype":"tmpfs","options":["rw","nosuid","nodev","seclabel"]},{"target":"/dev/pts","source":"devpts","fstype":"devpts","options":["rw","nosuid","noexec","relatime","seclabel"],"kv_options":{"gid":"5","mode":"620","ptmxmode":"000"}},{"target":"/run","source":"tmpfs","fstype":"tmpfs","options":["rw","nosuid","nodev","seclabel"],"kv_options":{"mode":"755"}},{"target":"/sys/fs/cgroup","source":"tmpfs","fstype":"tmpfs","options":["ro","nosuid","nodev","noexec","seclabel"],"kv_options":{"mode":"755"}},{"target":"/sys/fs/cgroup/systemd","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","xattr"],"kv_options":{"release_agent":"/usr/lib/systemd/systemd-cgroups-agent","name":"systemd"}},{"target":"/sys/fs/pstore","source":"pstore","fstype":"pstore","options":["rw","nosuid","nodev","noexec","relatime"]},{"target":"/sys/fs/cgroup/net_cls,net_prio","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","net_prio","net_cls"]},{"target":"/sys/fs/cgroup/freezer","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","freezer"]},{"target":"/sys/fs/cgroup/devices","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","devices"]},{"target":"/sys/fs/cgroup/cpuset","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclab
el","cpuset"]},{"target":"/sys/fs/cgroup/cpu,cpuacct","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","cpuacct","cpu"]},{"target":"/sys/fs/cgroup/pids","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","pids"]},{"target":"/sys/fs/cgroup/memory","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","memory"]},{"target":"/sys/fs/cgroup/perf_event","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","perf_event"]},{"target":"/sys/fs/cgroup/hugetlb","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","hugetlb"]},{"target":"/sys/fs/cgroup/blkio","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","blkio"]},{"target":"/sys/kernel/config","source":"configfs","fstype":"configfs","options":["rw","relatime"]},{"target":"/","source":"/dev/mapper/centos-root","fstype":"xfs","options":["rw","relatime","seclabel","attr2","inode64","noquota"]},{"target":"/sys/fs/selinux","source":"selinuxfs","fstype":"selinuxfs","options":["rw","relatime"]},{"target":"/proc/sys/fs/binfmt_misc","source":"systemd-1","fstype":"autofs","options":["rw","relatime","direct"],"kv_options":{"fd":"22","pgrp":"1","timeout":"0","minproto":"5","maxproto":"5","pipe_ino":"13827"}},{"target":"/dev/hugepages","source":"hugetlbfs","fstype":"hugetlbfs","options":["rw","relatime","seclabel"]},{"target":"/dev/mqueue","source":"mqueue","fstype":"mqueue","options":["rw","relatime","seclabel"]},{"target":"/sys/kernel/debug","source":"debugfs","fstype":"debugfs","options":["rw","relatime"]},{"target":"/boot","source":"/dev/sda1","fstype":"xfs","options":["rw","relatime","seclabel","attr2","inode64","noquota"]},{"target":"/var/lib/docker/containers","source":"/dev/mapper/centos-root[/var/lib/docker/containers]","fstype":"xfs","options":
["rw","relatime","seclabel","attr2","inode64","noquota"]},{"target":"/var/lib/docker/overlay2","source":"/dev/mapper/centos-root[/var/lib/docker/overlay2]","fstype":"xfs","options":["rw","relatime","seclabel","attr2","inode64","noquota"]},{"target":"/run/user/1000","source":"tmpfs","fstype":"tmpfs","options":["rw","nosuid","nodev","relatime","seclabel"],"kv_options":{"size":"386136k","mode":"700","uid":"1000","gid":"1000"}}]

tests/fixtures/centos-7.7/findmnt-l.out vendored Normal file

@@ -0,0 +1,32 @@
TARGET SOURCE FSTYPE OPTIONS
/sys sysfs sysfs rw,nosuid,nodev,noexec,relatime,seclabel
/proc proc proc rw,nosuid,nodev,noexec,relatime
/dev devtmpfs devtmpfs rw,nosuid,seclabel,size=1918816k,nr_inodes=479704,mode=755
/sys/kernel/security securityfs securityfs rw,nosuid,nodev,noexec,relatime
/dev/shm tmpfs tmpfs rw,nosuid,nodev,seclabel
/dev/pts devpts devpts rw,nosuid,noexec,relatime,seclabel,gid=5,mode=620,ptmxmode=000
/run tmpfs tmpfs rw,nosuid,nodev,seclabel,mode=755
/sys/fs/cgroup tmpfs tmpfs ro,nosuid,nodev,noexec,seclabel,mode=755
/sys/fs/cgroup/systemd cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,xattr,release_agent=/usr/lib/systemd/systemd-cgroups-agent,name=systemd
/sys/fs/pstore pstore pstore rw,nosuid,nodev,noexec,relatime
/sys/fs/cgroup/net_cls,net_prio cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,net_prio,net_cls
/sys/fs/cgroup/freezer cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,freezer
/sys/fs/cgroup/devices cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,devices
/sys/fs/cgroup/cpuset cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,cpuset
/sys/fs/cgroup/cpu,cpuacct cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,cpuacct,cpu
/sys/fs/cgroup/pids cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,pids
/sys/fs/cgroup/memory cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,memory
/sys/fs/cgroup/perf_event cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,perf_event
/sys/fs/cgroup/hugetlb cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,hugetlb
/sys/fs/cgroup/blkio cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,blkio
/sys/kernel/config configfs configfs rw,relatime
/ /dev/mapper/centos-root xfs rw,relatime,seclabel,attr2,inode64,noquota
/sys/fs/selinux selinuxfs selinuxfs rw,relatime
/proc/sys/fs/binfmt_misc systemd-1 autofs rw,relatime,fd=22,pgrp=1,timeout=0,minproto=5,maxproto=5,direct,pipe_ino=13827
/dev/hugepages hugetlbfs hugetlbfs rw,relatime,seclabel
/dev/mqueue mqueue mqueue rw,relatime,seclabel
/sys/kernel/debug debugfs debugfs rw,relatime
/boot /dev/sda1 xfs rw,relatime,seclabel,attr2,inode64,noquota
/var/lib/docker/containers /dev/mapper/centos-root[/var/lib/docker/containers] xfs rw,relatime,seclabel,attr2,inode64,noquota
/var/lib/docker/overlay2 /dev/mapper/centos-root[/var/lib/docker/overlay2] xfs rw,relatime,seclabel,attr2,inode64,noquota
/run/user/1000 tmpfs tmpfs rw,nosuid,nodev,relatime,seclabel,size=386136k,mode=700,uid=1000,gid=1000


@@ -0,0 +1 @@
[{"target":"/","source":"/dev/mapper/centos-root","fstype":"xfs","options":["rw","relatime","seclabel","attr2","inode64","noquota"]},{"target":"/sys","source":"sysfs","fstype":"sysfs","options":["rw","nosuid","nodev","noexec","relatime","seclabel"]},{"target":"/sys/kernel/security","source":"securityfs","fstype":"securityfs","options":["rw","nosuid","nodev","noexec","relatime"]},{"target":"/sys/fs/cgroup","source":"tmpfs","fstype":"tmpfs","options":["ro","nosuid","nodev","noexec","seclabel"],"kv_options":{"mode":"755"}},{"target":"/sys/fs/cgroup/systemd","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","xattr"],"kv_options":{"release_agent":"/usr/lib/systemd/systemd-cgroups-agent","name":"systemd"}},{"target":"/sys/fs/cgroup/net_cls,net_prio","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","net_prio","net_cls"]},{"target":"/sys/fs/cgroup/blkio","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","blkio"]},{"target":"/sys/fs/cgroup/devices","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","devices"]},{"target":"/sys/fs/cgroup/hugetlb","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","hugetlb"]},{"target":"/sys/fs/cgroup/cpuset","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","cpuset"]},{"target":"/sys/fs/cgroup/cpu,cpuacct","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","cpuacct","cpu"]},{"target":"/sys/fs/cgroup/memory","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","memory"]},{"target":"/sys/fs/cgroup/perf_event","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","perf_event"]},{"target":"/sys/fs/cgroup/freezer","
source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","freezer"]},{"target":"/sys/fs/cgroup/pids","source":"cgroup","fstype":"cgroup","options":["rw","nosuid","nodev","noexec","relatime","seclabel","pids"]},{"target":"/sys/fs/pstore","source":"pstore","fstype":"pstore","options":["rw","nosuid","nodev","noexec","relatime"]},{"target":"/sys/kernel/config","source":"configfs","fstype":"configfs","options":["rw","relatime"]},{"target":"/sys/fs/selinux","source":"selinuxfs","fstype":"selinuxfs","options":["rw","relatime"]},{"target":"/sys/kernel/debug","source":"debugfs","fstype":"debugfs","options":["rw","relatime"]},{"target":"/proc","source":"proc","fstype":"proc","options":["rw","nosuid","nodev","noexec","relatime"]},{"target":"/proc/sys/fs/binfmt_misc","source":"systemd-1","fstype":"autofs","options":["rw","relatime","direct"],"kv_options":{"fd":"36","pgrp":"1","timeout":"0","minproto":"5","maxproto":"5","pipe_ino":"13995"}},{"target":"/dev","source":"devtmpfs","fstype":"devtmpfs","options":["rw","nosuid","seclabel"],"kv_options":{"size":"1918816k","nr_inodes":"479704","mode":"755"}},{"target":"/dev/shm","source":"tmpfs","fstype":"tmpfs","options":["rw","nosuid","nodev","seclabel"]},{"target":"/dev/pts","source":"devpts","fstype":"devpts","options":["rw","nosuid","noexec","relatime","seclabel"],"kv_options":{"gid":"5","mode":"620","ptmxmode":"000"}},{"target":"/dev/mqueue","source":"mqueue","fstype":"mqueue","options":["rw","relatime","seclabel"]},{"target":"/dev/hugepages","source":"hugetlbfs","fstype":"hugetlbfs","options":["rw","relatime","seclabel"]},{"target":"/run","source":"tmpfs","fstype":"tmpfs","options":["rw","nosuid","nodev","seclabel"],"kv_options":{"mode":"755"}},{"target":"/run/user/0","source":"tmpfs","fstype":"tmpfs","options":["rw","nosuid","nodev","relatime","seclabel"],"kv_options":{"size":"386136k","mode":"700"}},{"target":"/run/user/1000","source":"tmpfs","fstype":"tmpfs","options":["rw","nosuid","no
dev","relatime","seclabel"],"kv_options":{"size":"386136k","mode":"700","uid":"1000","gid":"1000"}},{"target":"/boot","source":"/dev/sda1","fstype":"xfs","options":["rw","relatime","seclabel","attr2","inode64","noquota"]},{"target":"/var/lib/docker/containers","source":"/dev/mapper/centos-root[/var/lib/docker/containers]","fstype":"xfs","options":["rw","relatime","seclabel","attr2","inode64","noquota"]},{"target":"/var/lib/docker/overlay2","source":"/dev/mapper/centos-root[/var/lib/docker/overlay2]","fstype":"xfs","options":["rw","relatime","seclabel","attr2","inode64","noquota"]}]

tests/fixtures/centos-7.7/findmnt.out vendored Normal file

@@ -0,0 +1,33 @@
TARGET SOURCE FSTYPE OPTIONS
/ /dev/mapper/centos-root xfs rw,relatime,seclabel,attr2,inode64,noquota
├─/sys sysfs sysfs rw,nosuid,nodev,noexec,relatime,seclabel
│ ├─/sys/kernel/security securityfs securityfs rw,nosuid,nodev,noexec,relatime
│ ├─/sys/fs/cgroup tmpfs tmpfs ro,nosuid,nodev,noexec,seclabel,mode=755
│ │ ├─/sys/fs/cgroup/systemd cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,xattr,release_agent=/usr/lib/systemd/systemd-cgroups-agent,name=systemd
│ │ ├─/sys/fs/cgroup/net_cls,net_prio cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,net_prio,net_cls
│ │ ├─/sys/fs/cgroup/blkio cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,blkio
│ │ ├─/sys/fs/cgroup/devices cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,devices
│ │ ├─/sys/fs/cgroup/hugetlb cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,hugetlb
│ │ ├─/sys/fs/cgroup/cpuset cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,cpuset
│ │ ├─/sys/fs/cgroup/cpu,cpuacct cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,cpuacct,cpu
│ │ ├─/sys/fs/cgroup/memory cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,memory
│ │ ├─/sys/fs/cgroup/perf_event cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,perf_event
│ │ ├─/sys/fs/cgroup/freezer cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,freezer
│ │ └─/sys/fs/cgroup/pids cgroup cgroup rw,nosuid,nodev,noexec,relatime,seclabel,pids
│ ├─/sys/fs/pstore pstore pstore rw,nosuid,nodev,noexec,relatime
│ ├─/sys/kernel/config configfs configfs rw,relatime
│ ├─/sys/fs/selinux selinuxfs selinuxfs rw,relatime
│ └─/sys/kernel/debug debugfs debugfs rw,relatime
├─/proc proc proc rw,nosuid,nodev,noexec,relatime
│ └─/proc/sys/fs/binfmt_misc systemd-1 autofs rw,relatime,fd=36,pgrp=1,timeout=0,minproto=5,maxproto=5,direct,pipe_ino=13995
├─/dev devtmpfs devtmpfs rw,nosuid,seclabel,size=1918816k,nr_inodes=479704,mode=755
│ ├─/dev/shm tmpfs tmpfs rw,nosuid,nodev,seclabel
│ ├─/dev/pts devpts devpts rw,nosuid,noexec,relatime,seclabel,gid=5,mode=620,ptmxmode=000
│ ├─/dev/mqueue mqueue mqueue rw,relatime,seclabel
│ └─/dev/hugepages hugetlbfs hugetlbfs rw,relatime,seclabel
├─/run tmpfs tmpfs rw,nosuid,nodev,seclabel,mode=755
│ ├─/run/user/0 tmpfs tmpfs rw,nosuid,nodev,relatime,seclabel,size=386136k,mode=700
│ └─/run/user/1000 tmpfs tmpfs rw,nosuid,nodev,relatime,seclabel,size=386136k,mode=700,uid=1000,gid=1000
├─/boot /dev/sda1 xfs rw,relatime,seclabel,attr2,inode64,noquota
├─/var/lib/docker/containers /dev/mapper/centos-root[/var/lib/docker/containers] xfs rw,relatime,seclabel,attr2,inode64,noquota
└─/var/lib/docker/overlay2 /dev/mapper/centos-root[/var/lib/docker/overlay2] xfs rw,relatime,seclabel,attr2,inode64,noquota
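The JSON fixture above shows how jc splits a findmnt OPTIONS string into an `options` list of flags and a `kv_options` map for `key=value` entries (e.g. `size=386136k`). That split can be sketched in a few lines of plain Python; `parse_options` below is a hypothetical helper for illustration, not jc's actual implementation:

```python
def parse_options(option_str):
    """Split a findmnt OPTIONS string into flag options and key=value
    options, mirroring the "options"/"kv_options" fields in the fixture."""
    options = []
    kv_options = {}
    for opt in option_str.split(','):
        if '=' in opt:
            key, _, value = opt.partition('=')
            kv_options[key] = value
        else:
            options.append(opt)
    # jc omits kv_options entirely when there are no key=value entries
    return options, (kv_options or None)

opts, kv = parse_options(
    'rw,nosuid,nodev,relatime,seclabel,size=386136k,mode=700,uid=1000,gid=1000')
print(opts)  # ['rw', 'nosuid', 'nodev', 'relatime', 'seclabel']
print(kv)    # {'size': '386136k', 'mode': '700', 'uid': '1000', 'gid': '1000'}
```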


@@ -1 +1 @@
[{"name": "docker0", "flags": 4099, "state": ["UP", "BROADCAST", "MULTICAST"], "mtu": 1500, "ipv4_addr": "172.17.0.1", "ipv4_mask": "255.255.0.0", "ipv4_bcast": "0.0.0.0", "mac_addr": "02:42:b1:9a:ea:02", "type": "Ethernet", "rx_packets": 0, "rx_bytes": 0, "rx_errors": 0, "rx_dropped": 0, "rx_overruns": 0, "rx_frame": 0, "tx_packets": 0, "tx_bytes": 0, "tx_errors": 0, "tx_dropped": 0, "tx_overruns": 0, "tx_carrier": 0, "tx_collisions": 0, "ipv6_addr": null, "ipv6_mask": null, "ipv6_scope": null, "metric": null}, {"name": "ens33", "flags": 4163, "state": ["UP", "BROADCAST", "RUNNING", "MULTICAST"], "mtu": 1500, "ipv4_addr": "192.168.71.137", "ipv4_mask": "255.255.255.0", "ipv4_bcast": "192.168.71.255", "ipv6_addr": "fe80::c1cb:715d:bc3e:b8a0", "ipv6_mask": 64, "ipv6_scope": "0x20", "mac_addr": "00:0c:29:3b:58:0e", "type": "Ethernet", "rx_packets": 8061, "rx_bytes": 1514413, "rx_errors": 0, "rx_dropped": 0, "rx_overruns": 0, "rx_frame": 0, "tx_packets": 4502, "tx_bytes": 866622, "tx_errors": 0, "tx_dropped": 0, "tx_overruns": 0, "tx_carrier": 0, "tx_collisions": 0, "metric": null}, {"name": "lo", "flags": 73, "state": ["UP", "LOOPBACK", "RUNNING"], "mtu": 65536, "ipv4_addr": "127.0.0.1", "ipv4_mask": "255.0.0.0", "ipv4_bcast": null, "ipv6_addr": "::1", "ipv6_mask": 128, "ipv6_scope": "0x10", "mac_addr": null, "type": "Local Loopback", "rx_packets": 73, "rx_bytes": 6009, "rx_errors": 0, "rx_dropped": 0, "rx_overruns": 0, "rx_frame": 0, "tx_packets": 73, "tx_bytes": 6009, "tx_errors": 0, "tx_dropped": 0, "tx_overruns": 0, "tx_carrier": 0, "tx_collisions": 0, "metric": null}]
[{"name":"docker0","flags":4099,"state":["UP","BROADCAST","MULTICAST"],"mtu":1500,"type":"Ethernet","mac_addr":"02:42:b1:9a:ea:02","ipv4_addr":"172.17.0.1","ipv4_mask":"255.255.0.0","ipv4_bcast":"0.0.0.0","ipv6_addr":null,"ipv6_mask":null,"ipv6_scope":null,"ipv6_type":null,"metric":null,"rx_packets":0,"rx_errors":0,"rx_dropped":0,"rx_overruns":0,"rx_frame":0,"tx_packets":0,"tx_errors":0,"tx_dropped":0,"tx_overruns":0,"tx_carrier":0,"tx_collisions":0,"rx_bytes":0,"tx_bytes":0,"ipv4":[{"address":"172.17.0.1","mask":"255.255.0.0","broadcast":"0.0.0.0"}]},{"name":"ens33","flags":4163,"state":["UP","BROADCAST","RUNNING","MULTICAST"],"mtu":1500,"type":"Ethernet","mac_addr":"00:0c:29:3b:58:0e","ipv4_addr":"192.168.71.137","ipv4_mask":"255.255.255.0","ipv4_bcast":"192.168.71.255","ipv6_addr":"fe80::c1cb:715d:bc3e:b8a0","ipv6_mask":64,"ipv6_scope":"0x20","ipv6_type":"link","metric":null,"rx_packets":8061,"rx_errors":0,"rx_dropped":0,"rx_overruns":0,"rx_frame":0,"tx_packets":4502,"tx_errors":0,"tx_dropped":0,"tx_overruns":0,"tx_carrier":0,"tx_collisions":0,"rx_bytes":1514413,"tx_bytes":866622,"ipv4":[{"address":"192.168.71.137","mask":"255.255.255.0","broadcast":"192.168.71.255"}],"ipv6":[{"address":"fe80::c1cb:715d:bc3e:b8a0","mask":64,"scope":"0x20","type":"link"}]},{"name":"lo","flags":73,"state":["UP","LOOPBACK","RUNNING"],"mtu":65536,"type":"Local Loopback","mac_addr":null,"ipv4_addr":"127.0.0.1","ipv4_mask":"255.0.0.0","ipv4_bcast":null,"ipv6_addr":"::1","ipv6_mask":128,"ipv6_scope":"0x10","ipv6_type":"host","metric":null,"rx_packets":73,"rx_errors":0,"rx_dropped":0,"rx_overruns":0,"rx_frame":0,"tx_packets":73,"tx_errors":0,"tx_dropped":0,"tx_overruns":0,"tx_carrier":0,"tx_collisions":0,"rx_bytes":6009,"tx_bytes":6009,"ipv4":[{"address":"127.0.0.1","mask":"255.0.0.0","broadcast":null}],"ipv6":[{"address":"::1","mask":128,"scope":"0x10","type":"host"}]}]
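The fixture change above reflects the ifconfig parser's updated schema: the flat `ipv4_addr`/`ipv6_addr` fields are kept for compatibility, and new `ipv4`/`ipv6` lists are added so interfaces with multiple addresses can be represented. A minimal consumer of the new shape (using an abridged copy of the `ens33` object from the fixture, not live jc output):

```python
import json

# Abridged interface object in the new schema, taken from the fixture above
doc = json.loads('''[{"name": "ens33",
  "ipv4_addr": "192.168.71.137",
  "ipv4": [{"address": "192.168.71.137", "mask": "255.255.255.0",
            "broadcast": "192.168.71.255"}],
  "ipv6": [{"address": "fe80::c1cb:715d:bc3e:b8a0", "mask": 64,
            "scope": "0x20", "type": "link"}]}]''')

# Collect every IPv4 address per interface from the new list field
addrs = [(iface['name'], a['address'], a['mask'])
         for iface in doc
         for a in iface.get('ipv4', [])]
print(addrs)  # [('ens33', '192.168.71.137', '255.255.255.0')]
```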


@@ -0,0 +1 @@
[{"name":"cxl3","flags":8843,"state":["UP","BROADCAST","RUNNING","SIMPLEX","MULTICAST"],"mtu":1500,"type":null,"mac_addr":"00:07:43:3d:b7:70","ipv4_addr":null,"ipv4_mask":null,"ipv4_bcast":null,"ipv6_addr":null,"ipv6_mask":null,"ipv6_scope":null,"ipv6_type":null,"metric":0,"rx_packets":null,"rx_errors":null,"rx_dropped":null,"rx_overruns":null,"rx_frame":null,"tx_packets":null,"tx_errors":null,"tx_dropped":null,"tx_overruns":null,"tx_carrier":null,"tx_collisions":null,"rx_bytes":null,"tx_bytes":null,"options":"6ec07bb","options_flags":["RXCSUM","TXCSUM","VLAN_MTU","VLAN_HWTAGGING","JUMBO_MTU","VLAN_HWCSUM","TSO4","TSO6","LRO","VLAN_HWTSO","LINKSTATE","RXCSUM_IPV6","TXCSUM_IPV6","HWRXTSTMP","NOMAP"],"hw_address":"00:07:43:3d:b7:88","media":"Ethernet 10Gbase-LR","media_flags":["full-duplex","rxpause","txpause"],"status":"active","nd6_options":29,"nd6_flags":["PERFORMNUD","IFDISABLED","AUTO_LINKLOCAL"],"plugged":"SFP/SFP+/SFP28 10G Base-LR (LC)","vendor":"INNOLIGHT","vendor_pn":"TR-PX13L-N00","vendor_sn":"INJBL0431986","vendor_date":"2020-01-04","module_temperature":"21.20 C","module_voltage":"3.16 Volts","lanes":[{"lane":1,"rx_power_mw":0.49,"rx_power_dbm":-3.1,"tx_bias_ma":23.85}]}]


@@ -0,0 +1,10 @@
cxl3: flags=8843<UP,BROADCAST,RUNNING,SIMPLEX,MULTICAST> metric 0 mtu 1500
options=6ec07bb<RXCSUM,TXCSUM,VLAN_MTU,VLAN_HWTAGGING,JUMBO_MTU,VLAN_HWCSUM,TSO4,TSO6,LRO,VLAN_HWTSO,LINKSTATE,RXCSUM_IPV6,TXCSUM_IPV6,HWRXTSTMP,NOMAP>
ether 00:07:43:3d:b7:70
hwaddr 00:07:43:3d:b7:88 media: Ethernet 10Gbase-LR <full-duplex,rxpause,txpause>
status: active
nd6 options=29<PERFORMNUD,IFDISABLED,AUTO_LINKLOCAL>
plugged: SFP/SFP+/SFP28 10G Base-LR (LC)
vendor: INNOLIGHT PN: TR-PX13L-N00 SN: INJBL0431986 DATE: 2020-01-04
module temperature: 21.20 C voltage: 3.16 Volts
lane 1: RX power: 0.49 mW (-3.10 dBm) TX bias: 23.85 mA
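The `lane 1:` line in this FreeBSD fixture corresponds to the `lanes` array in the matching JSON fixture (`rx_power_mw`, `rx_power_dbm`, `tx_bias_ma` as numbers). Extracting it can be sketched with a regex; `parse_lane` is a hypothetical helper for illustration, not the parser's actual code:

```python
import re

# Matches e.g. "lane 1: RX power: 0.49 mW (-3.10 dBm) TX bias: 23.85 mA"
LANE_RE = re.compile(
    r'lane (?P<lane>\d+): RX power: (?P<rx_power_mw>[\d.]+) mW '
    r'\((?P<rx_power_dbm>-?[\d.]+) dBm\) TX bias: (?P<tx_bias_ma>[\d.]+) mA')

def parse_lane(line):
    """Parse one SFP lane line into the shape used by the JSON fixture."""
    m = LANE_RE.search(line)
    if not m:
        return None
    return {
        'lane': int(m.group('lane')),
        'rx_power_mw': float(m.group('rx_power_mw')),
        'rx_power_dbm': float(m.group('rx_power_dbm')),
        'tx_bias_ma': float(m.group('tx_bias_ma')),
    }

print(parse_lane('lane 1: RX power: 0.49 mW (-3.10 dBm) TX bias: 23.85 mA'))
# {'lane': 1, 'rx_power_mw': 0.49, 'rx_power_dbm': -3.1, 'tx_bias_ma': 23.85}
```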


@@ -0,0 +1 @@
[{"name":"ix0","flags":8843,"state":["UP","BROADCAST","RUNNING","SIMPLEX","MULTICAST"],"mtu":9000,"type":null,"mac_addr":"00:1b:21:8b:f8:2c","ipv4_addr":"10.10.2.101","ipv4_mask":"255.255.255.0","ipv4_bcast":"10.10.2.255","ipv6_addr":null,"ipv6_mask":null,"ipv6_scope":null,"ipv6_type":null,"metric":0,"rx_packets":null,"rx_errors":null,"rx_dropped":null,"rx_overruns":null,"rx_frame":null,"tx_packets":null,"tx_errors":null,"tx_dropped":null,"tx_overruns":null,"tx_carrier":null,"tx_collisions":null,"rx_bytes":null,"tx_bytes":null,"options":"e53fbb","options_flags":["RXCSUM","TXCSUM","VLAN_MTU","VLAN_HWTAGGING","JUMBO_MTU","VLAN_HWCSUM","TSO4","TSO6","LRO","WOL_UCAST","WOL_MCAST","WOL_MAGIC","VLAN_HWFILTER","VLAN_HWTSO","RXCSUM_IPV6","TXCSUM_IPV6"],"media":"Ethernet autoselect (10Gbase-SR","media_flags":["full-duplex","rxpause","txpause"],"status":"active","nd6_options":29,"nd6_flags":["PERFORMNUD","IFDISABLED","AUTO_LINKLOCAL"],"plugged":"SFP/SFP+/SFP28 10G Base-SR (LC)","vendor":"Intel Corp","vendor_pn":"FTLX8571D3BCV-IT","vendor_sn":"ALH1AV9","vendor_date":"2011-10-27","module_temperature":"51.27 C","module_voltage":"3.31 Volts","rx_power":"0.49 mW (-3.02 dBm)","tx_pwer":"0.66 mW (-1.74 dBm)","ipv4":[{"address":"10.10.2.101","mask":"255.255.255.0","broadcast":"10.10.2.255"}]}]


@@ -0,0 +1,11 @@
ix0: flags=8843<UP,BROADCAST,RUNNING,SIMPLEX,MULTICAST> metric 0 mtu 9000
options=e53fbb<RXCSUM,TXCSUM,VLAN_MTU,VLAN_HWTAGGING,JUMBO_MTU,VLAN_HWCSUM,TSO4,TSO6,LRO,WOL_UCAST,WOL_MCAST,WOL_MAGIC,VLAN_HWFILTER,VLAN_HWTSO,RXCSUM_IPV6,TXCSUM_IPV6>
ether 00:1b:21:8b:f8:2c
inet 10.10.2.101/24 broadcast 10.10.2.255
media: Ethernet autoselect (10Gbase-SR <full-duplex,rxpause,txpause>)
status: active
nd6 options=29<PERFORMNUD,IFDISABLED,AUTO_LINKLOCAL>
plugged: SFP/SFP+/SFP28 10G Base-SR (LC)
vendor: Intel Corp PN: FTLX8571D3BCV-IT SN: ALH1AV9 DATE: 2011-10-27
module temperature: 51.27 C Voltage: 3.31 Volts
RX: 0.49 mW (-3.02 dBm) TX: 0.66 mW (-1.74 dBm)


@@ -0,0 +1 @@
[{"name":"cxl3","flags":8843,"state":["UP","BROADCAST","RUNNING","SIMPLEX","MULTICAST"],"mtu":1500,"type":null,"mac_addr":"00:07:43:3d:b7:70","ipv4_addr":null,"ipv4_mask":null,"ipv4_bcast":null,"ipv6_addr":null,"ipv6_mask":null,"ipv6_scope":null,"ipv6_type":null,"metric":0,"rx_packets":null,"rx_errors":null,"rx_dropped":null,"rx_overruns":null,"rx_frame":null,"tx_packets":null,"tx_errors":null,"tx_dropped":null,"tx_overruns":null,"tx_carrier":null,"tx_collisions":null,"rx_bytes":null,"tx_bytes":null,"options":"6ec07bb","options_flags":["RXCSUM","TXCSUM","VLAN_MTU","VLAN_HWTAGGING","JUMBO_MTU","VLAN_HWCSUM","TSO4","TSO6","LRO","VLAN_HWTSO","LINKSTATE","RXCSUM_IPV6","TXCSUM_IPV6","HWRXTSTMP","NOMAP"],"hw_address":"00:07:43:3d:b7:88","media":null,"media_flags":null,"status":"active","nd6_options":29,"nd6_flags":["PERFORMNUD","IFDISABLED","AUTO_LINKLOCAL"],"plugged":"SFP/SFP+/SFP28 10G Base-LR (LC)","vendor":"INNOLIGHT","vendor_pn":"TR-PX13L-N00","vendor_sn":"INJBL0431986","vendor_date":"2020-01-04","module_temperature":"21.20 C","module_voltage":"3.16 Volts","lanes":[{"lane":1,"rx_power_mw":0.49,"rx_power_dbm":-3.1,"tx_bias_ma":23.85}]}]

Some files were not shown because too many files have changed in this diff.