mirror of https://github.com/kellyjonbrazil/jc.git synced 2025-06-17 00:07:37 +02:00

Fix typos

Found via `codespell -S ./tests/fixtures -L
chage,ro,ist,ans,unx,respons,technik`
Kian-Meng Ang
2022-11-16 10:01:58 +08:00
parent 299b0faf7c
commit 39555a48b5
33 changed files with 37 additions and 37 deletions

View File

@@ -581,7 +581,7 @@ jc changelog
 20200211 v1.7.3
 - Add alternative 'magic' syntax: e.g. `jc ls -al`
-- Options can now be condensed (e.g. -prq is equivalant to -p -r -q)
+- Options can now be condensed (e.g. -prq is equivalent to -p -r -q)
 20200208 v1.7.2
 - Include test fixtures in wheel and sdist

View File

@@ -390,7 +390,7 @@ option.
 ### Streaming Parsers
 Most parsers load all of the data from `STDIN`, parse it, then output the entire
 JSON document serially. There are some streaming parsers (e.g. `ls-s` and
-`ping-s`) that immediately start processing and outputing the data line-by-line
+`ping-s`) that immediately start processing and outputting the data line-by-line
 as [JSON Lines](https://jsonlines.org/) (aka [NDJSON](http://ndjson.org/)) while
 it is being received from `STDIN`. This can significantly reduce the amount of
 memory required to parse large amounts of command output (e.g. `ls -lR /`) and
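The line-at-a-time JSON Lines behavior this hunk describes can be sketched as follows. This is a minimal illustration only, not jc's actual parser code; the `stream_parse` helper is hypothetical, and real streaming parsers (`ls-s`, `ping-s`) parse each line into real fields:

```python
import json

def stream_parse(lines):
    """Yield one JSON Lines record per input line.

    Each line becomes a complete JSON document on its own line, so
    memory use stays constant no matter how large the input is.
    """
    for line in lines:
        # Hypothetical minimal "parse": wrap the raw line in an object.
        yield json.dumps({"line": line.rstrip("\n")})

# Records are emitted as soon as each input line arrives:
for record in stream_parse(["total 0\n", "drwxr-xr-x ...\n"]):
    print(record)
```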

View File

@@ -6,7 +6,7 @@
 jc - JSON Convert ISO 8601 Datetime string parser
 This parser supports standard ISO 8601 strings that include both date and
-time. If no timezone or offset information is available in the sring, then
+time. If no timezone or offset information is available in the string, then
 UTC timezone is used.
 Usage (cli):

View File

@@ -106,7 +106,7 @@ Schema:
 ]
 [0] naive timestamp if "when" field is parsable, else null
-[1] timezone aware timestamp availabe for UTC, else null
+[1] timezone aware timestamp available for UTC, else null
 Examples:

View File

@@ -61,7 +61,7 @@ Schema:
 ]
 [0] naive timestamp if "date" field is parsable, else null
-[1] timezone aware timestamp availabe for UTC, else null
+[1] timezone aware timestamp available for UTC, else null
 Examples:

View File

@@ -68,7 +68,7 @@ Schema:
 }
 [0] naive timestamp if "date" field is parsable, else null
-[1] timezone aware timestamp availabe for UTC, else null
+[1] timezone aware timestamp available for UTC, else null
 Examples:

View File

@@ -6,7 +6,7 @@
 jc - JSON Convert Proc file output parser
 This parser automatically identifies the Proc file and calls the
-corresponding parser to peform the parsing.
+corresponding parser to perform the parsing.
 Magic syntax for converting `/proc` files is also supported by running
 `jc /proc/<path to file>`. Any `jc` options must be specified before the

View File

@@ -53,7 +53,7 @@ Blank values converted to `null`/`None`.
 ]
 [0] naive timestamp if "timestamp" field is parsable, else null
-[1] timezone aware timestamp availabe for UTC, else null
+[1] timezone aware timestamp available for UTC, else null
 [2] this field exists if the syslog line is not parsable. The value
 is the original syslog line.

View File

@@ -64,7 +64,7 @@ Blank values converted to `null`/`None`.
 }
 [0] naive timestamp if "timestamp" field is parsable, else null
-[1] timezone aware timestamp availabe for UTC, else null
+[1] timezone aware timestamp available for UTC, else null
 [2] this field exists if the syslog line is not parsable. The value
 is the original syslog line.

View File

@@ -91,7 +91,7 @@ Parameters:
 the parser. compatible options:
 linux, darwin, cygwin, win32, aix, freebsd
-quiet: (bool) supress compatibility message if True
+quiet: (bool) suppress compatibility message if True
 Returns:

View File

@@ -567,7 +567,7 @@ class DSASignature(Sequence):
 @classmethod
 def from_p1363(cls, data):
 """
-Reads a signature from a byte string encoding accordint to IEEE P1363,
+Reads a signature from a byte string encoding according to IEEE P1363,
 which is used by Microsoft's BCryptSignHash() function.
 :param data:

View File

@@ -247,7 +247,7 @@ class Asn1Value(object):
 :param no_explicit:
 If explicit tagging info should be removed from this instance.
-Used internally to allow contructing the underlying value that
+Used internally to allow constructing the underlying value that
 has been wrapped in an explicit tag.
 :param tag_type:
@@ -697,7 +697,7 @@ class Castable(object):
 if other_class.tag != self.__class__.tag:
 raise TypeError(unwrap(
 '''
-Can not covert a value from %s object to %s object since they
+Can not convert a value from %s object to %s object since they
 use different tags: %d versus %d
 ''',
 type_name(other_class),
@@ -1349,7 +1349,7 @@ class Choice(Asn1Value):
 class Concat(object):
 """
-A class that contains two or more encoded child values concatentated
+A class that contains two or more encoded child values concatenated
 together. THIS IS NOT PART OF THE ASN.1 SPECIFICATION! This exists to handle
 the x509.TrustedCertificate() class for OpenSSL certificates containing
 extra information.
@@ -3757,7 +3757,7 @@ class Sequence(Asn1Value):
 def _make_value(self, field_name, field_spec, value_spec, field_params, value):
 """
-Contructs an appropriate Asn1Value object for a field
+Constructs an appropriate Asn1Value object for a field
 :param field_name:
 A unicode string of the field name
@@ -3766,7 +3766,7 @@ class Sequence(Asn1Value):
 An Asn1Value class that is the field spec
 :param value_spec:
-An Asn1Value class that is the vaue spec
+An Asn1Value class that is the value spec
 :param field_params:
 None or a dict of params for the field spec

View File

@@ -185,7 +185,7 @@ def _pycef_parse(str_input):
 # If the input entry had any blanks in the required headers, that's wrong
 # and we should return. Note we explicitly don't check the last item in the
-# split list becuase the header ends in a '|' which means the last item
+# split list because the header ends in a '|' which means the last item
 # will always be an empty string (it doesn't exist, but the delimiter does).
 if "" in spl[0:-1]:
 raise ParseError('Blank field(s) in CEF header. Is it valid CEF format?')
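The blank-field check in this hunk relies on `str.split` semantics: a CEF header ends in `|`, so the final split element is always an empty string and must be excluded from the check. A self-contained sketch of that logic; `check_cef_header_fields` is a hypothetical helper, not jc's actual code (which raises its own `ParseError`):

```python
def check_cef_header_fields(header: str) -> list:
    """Split a CEF header on '|' and validate that no field is blank.

    The trailing '|' means spl[-1] is always '', which is why only
    spl[0:-1] is checked, exactly as the comment in the hunk explains.
    """
    spl = header.split('|')
    if "" in spl[0:-1]:
        # Real jc code raises ParseError here; ValueError stands in.
        raise ValueError('Blank field(s) in CEF header. Is it valid CEF format?')
    return spl[0:-1]
```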

View File

@@ -1,7 +1,7 @@
 """jc - JSON Convert ISO 8601 Datetime string parser
 This parser supports standard ISO 8601 strings that include both date and
-time. If no timezone or offset information is available in the sring, then
+time. If no timezone or offset information is available in the string, then
 UTC timezone is used.
 Usage (cli):

View File

@@ -101,7 +101,7 @@ Schema:
 ]
 [0] naive timestamp if "when" field is parsable, else null
-[1] timezone aware timestamp availabe for UTC, else null
+[1] timezone aware timestamp available for UTC, else null
 Examples:

View File

@@ -56,7 +56,7 @@ Schema:
 ]
 [0] naive timestamp if "date" field is parsable, else null
-[1] timezone aware timestamp availabe for UTC, else null
+[1] timezone aware timestamp available for UTC, else null
 Examples:

View File

@@ -63,7 +63,7 @@ Schema:
 }
 [0] naive timestamp if "date" field is parsable, else null
-[1] timezone aware timestamp availabe for UTC, else null
+[1] timezone aware timestamp available for UTC, else null
 Examples:

View File

@@ -560,7 +560,7 @@ def _process(proc_data: Dict) -> Dict:
 def _b2a(byte_string: bytes) -> str:
 """Convert a byte string to a colon-delimited hex ascii string"""
-# need try/except since seperator was only introduced in python 3.8.
+# need try/except since separator was only introduced in python 3.8.
 # provides compatibility for python 3.6 and 3.7.
 try:
 return binascii.hexlify(byte_string, ':').decode('utf-8')

View File

@@ -180,7 +180,7 @@ def _post_parse(data):
 ssid = {k: v for k, v in ssid.items() if v}
 cleandata.append(ssid)
-# remove asterisks from begining of values
+# remove asterisks from beginning of values
 for ssid in cleandata:
 for key in ssid:
 if ssid[key].startswith('*'):

View File

@@ -78,7 +78,7 @@ def _process(proc_data: Dict) -> Dict:
 def _b2a(byte_string: bytes) -> str:
 """Convert a byte string to a colon-delimited hex ascii string"""
-# need try/except since seperator was only introduced in python 3.8.
+# need try/except since separator was only introduced in python 3.8.
 # provides compatibility for python 3.6 and 3.7.
 try:
 return binascii.hexlify(byte_string, ':').decode('utf-8')

View File

@@ -228,7 +228,7 @@ def _normalize_header(keyname: str) -> str:
 def _add_text_kv(key: str, value: Optional[str]) -> Optional[Dict]:
 """
 Add keys with _text suffix if there is a text description inside
-paranthesis at the end of a value. The value of the _text field will
+parenthesis at the end of a value. The value of the _text field will
 only be the text inside the parenthesis. This allows cleanup of the
 original field (convert to int/float/etc) without losing information.
 """

View File

@@ -161,7 +161,7 @@ def parse(
 continue
 if not line.startswith('#') and not found_first_hash:
-# skip preample lines before header row
+# skip preamble lines before header row
 continue
 if line.startswith('#') and not found_first_hash:
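The skip-preamble logic in this hunk can be shown as a self-contained loop. This is a simplified sketch with a hypothetical `find_header_lines` wrapper; the surrounding `parse()` function does more than shown here:

```python
def find_header_lines(lines):
    """Yield lines starting from the first '#'-prefixed header row,
    discarding any preamble lines that come before it."""
    found_first_hash = False
    for line in lines:
        if not line.startswith('#') and not found_first_hash:
            # skip preamble lines before header row
            continue
        if line.startswith('#') and not found_first_hash:
            found_first_hash = True
        yield line
```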

View File

@@ -80,7 +80,7 @@ def _process(proc_data: Dict) -> Dict:
 def _b2a(byte_string: bytes) -> str:
 """Convert a byte string to a colon-delimited hex ascii string"""
-# need try/except since seperator was only introduced in python 3.8.
+# need try/except since separator was only introduced in python 3.8.
 # provides compatibility for python 3.6 and 3.7.
 try:
 return binascii.hexlify(byte_string, ':').decode('utf-8')

View File

@@ -1,7 +1,7 @@
 """jc - JSON Convert Proc file output parser
 This parser automatically identifies the Proc file and calls the
-corresponding parser to peform the parsing.
+corresponding parser to perform the parsing.
 Magic syntax for converting `/proc` files is also supported by running
 `jc /proc/<path to file>`. Any `jc` options must be specified before the

View File

@@ -48,7 +48,7 @@ Blank values converted to `null`/`None`.
 ]
 [0] naive timestamp if "timestamp" field is parsable, else null
-[1] timezone aware timestamp availabe for UTC, else null
+[1] timezone aware timestamp available for UTC, else null
 [2] this field exists if the syslog line is not parsable. The value
 is the original syslog line.

View File

@@ -59,7 +59,7 @@ Blank values converted to `null`/`None`.
 }
 [0] naive timestamp if "timestamp" field is parsable, else null
-[1] timezone aware timestamp availabe for UTC, else null
+[1] timezone aware timestamp available for UTC, else null
 [2] this field exists if the syslog line is not parsable. The value
 is the original syslog line.

View File

@@ -441,7 +441,7 @@ def _i2b(integer: int) -> bytes:
 def _b2a(byte_string: bytes) -> str:
 """Convert a byte string to a colon-delimited hex ascii string"""
-# need try/except since seperator was only introduced in python 3.8.
+# need try/except since separator was only introduced in python 3.8.
 # provides compatibility for python 3.6 and 3.7.
 try:
 return binascii.hexlify(byte_string, ':').decode('utf-8')

View File

@@ -141,7 +141,7 @@ def compatibility(mod_name: str, compatible: List[str], quiet: bool = False) ->
 the parser. compatible options:
 linux, darwin, cygwin, win32, aix, freebsd
-quiet: (bool) supress compatibility message if True
+quiet: (bool) suppress compatibility message if True
 Returns:

View File

@@ -1072,7 +1072,7 @@ JC_COLORS=default,default,default,default
 You can set the \fBNO_COLOR\fP environment variable to any value to disable color output in \fBjc\fP. Note that using the \fB-C\fP option to force color output will override both the \fBNO_COLOR\fP environment variable and the \fB-m\fP option.
 .SH STREAMING PARSERS
-Most parsers load all of the data from \fBSTDIN\fP, parse it, then output the entire JSON document serially. There are some streaming parsers (e.g. \fBls-s\fP, \fBping-s\fP, etc.) that immediately start processing and outputing the data line-by-line as JSON Lines (aka NDJSON) while it is being received from \fBSTDIN\fP. This can significantly reduce the amount of memory required to parse large amounts of command output (e.g. \fBls -lR /\fP) and can sometimes process the data more quickly. Streaming parsers have slightly different behavior than standard parsers as outlined below.
+Most parsers load all of the data from \fBSTDIN\fP, parse it, then output the entire JSON document serially. There are some streaming parsers (e.g. \fBls-s\fP, \fBping-s\fP, etc.) that immediately start processing and outputting the data line-by-line as JSON Lines (aka NDJSON) while it is being received from \fBSTDIN\fP. This can significantly reduce the amount of memory required to parse large amounts of command output (e.g. \fBls -lR /\fP) and can sometimes process the data more quickly. Streaming parsers have slightly different behavior than standard parsers as outlined below.
 .RS
 Note: Streaming parsers cannot be used with the "magic" syntax

View File

@@ -1,5 +1,5 @@
 #!/usr/bin/env python3
-# Genereate man page from jc metadata using jinja2 templates
+# Generate man page from jc metadata using jinja2 templates
 from datetime import date
 import jc.cli
 from jinja2 import Environment, FileSystemLoader

View File

@@ -1,5 +1,5 @@
 #!/usr/bin/env python3
-# Genereate README.md from jc metadata using jinja2 templates
+# Generate README.md from jc metadata using jinja2 templates
 import jc.cli
 import jc.lib
 from jinja2 import Environment, FileSystemLoader

View File

@@ -182,7 +182,7 @@ JC_COLORS=default,default,default,default
 You can set the \fBNO_COLOR\fP environment variable to any value to disable color output in \fBjc\fP. Note that using the \fB-C\fP option to force color output will override both the \fBNO_COLOR\fP environment variable and the \fB-m\fP option.
 .SH STREAMING PARSERS
-Most parsers load all of the data from \fBSTDIN\fP, parse it, then output the entire JSON document serially. There are some streaming parsers (e.g. \fBls-s\fP, \fBping-s\fP, etc.) that immediately start processing and outputing the data line-by-line as JSON Lines (aka NDJSON) while it is being received from \fBSTDIN\fP. This can significantly reduce the amount of memory required to parse large amounts of command output (e.g. \fBls -lR /\fP) and can sometimes process the data more quickly. Streaming parsers have slightly different behavior than standard parsers as outlined below.
+Most parsers load all of the data from \fBSTDIN\fP, parse it, then output the entire JSON document serially. There are some streaming parsers (e.g. \fBls-s\fP, \fBping-s\fP, etc.) that immediately start processing and outputting the data line-by-line as JSON Lines (aka NDJSON) while it is being received from \fBSTDIN\fP. This can significantly reduce the amount of memory required to parse large amounts of command output (e.g. \fBls -lR /\fP) and can sometimes process the data more quickly. Streaming parsers have slightly different behavior than standard parsers as outlined below.
 .RS
 Note: Streaming parsers cannot be used with the "magic" syntax

View File

@@ -262,7 +262,7 @@ option.
 ### Streaming Parsers
 Most parsers load all of the data from `STDIN`, parse it, then output the entire
 JSON document serially. There are some streaming parsers (e.g. `ls-s` and
-`ping-s`) that immediately start processing and outputing the data line-by-line
+`ping-s`) that immediately start processing and outputting the data line-by-line
 as [JSON Lines](https://jsonlines.org/) (aka [NDJSON](http://ndjson.org/)) while
 it is being received from `STDIN`. This can significantly reduce the amount of
 memory required to parse large amounts of command output (e.g. `ls -lR /`) and