mirror of https://github.com/kellyjonbrazil/jc.git synced 2025-07-11 01:10:37 +02:00

Merge pull request #207 from kellyjonbrazil/dev

Dev v1.18.3
This commit is contained in:
Kelly Brazil
2022-02-14 10:36:08 -08:00
committed by GitHub
109 changed files with 6497 additions and 302 deletions


@ -1,5 +1,13 @@
jc changelog
20220214 v1.18.3
- Add rsync command and log file parser tested on linux and macOS
- Add rsync command and log file streaming parser tested on linux and macOS
- Add xrandr command parser tested on linux
- Enhance timestamp performance with caching and format hints
- Refactor ignore_exceptions functionality in streaming parsers
- Fix man page in packages
20220127 v1.18.2
- Fix for plugin parsers with underscores in the name
- Add type hints to public API functions


@ -2743,6 +2743,39 @@ rpm -qia | jc --rpm-qi -p # or: jc -p rpm -qia
}
]
```
### rsync
```bash
rsync -i -a source/ dest | jc --rsync -p # or jc -p rsync -i -a source/ dest
```
```json
[
{
"summary": {
"sent": 1708,
"received": 8209,
"bytes_sec": 19834.0,
"total_size": 235,
"speedup": 0.02
},
"files": [
{
"filename": "./",
"metadata": ".d..t......",
"update_type": "not updated",
"file_type": "directory",
"checksum_or_value_different": false,
"size_different": false,
"modification_time_different": true,
"permissions_different": false,
"owner_different": false,
"group_different": false,
"acl_different": false,
"extended_attribute_different": false
}
]
}
]
```
### sfdisk
```bash
sfdisk -l | jc --sfdisk -p # or jc -p sfdisk -l


@ -128,7 +128,7 @@ pip3 install jc
> For more OS Packages, see https://repology.org/project/jc/versions.
### Binaries and Packages
### Binaries
For precompiled binaries, see [Releases](https://github.com/kellyjonbrazil/jc/releases)
on Github.
@ -208,6 +208,8 @@ option.
- `--ps` enables the `ps` command parser ([documentation](https://kellyjonbrazil.github.io/jc/docs/parsers/ps))
- `--route` enables the `route` command parser ([documentation](https://kellyjonbrazil.github.io/jc/docs/parsers/route))
- `--rpm-qi` enables the `rpm -qi` command parser ([documentation](https://kellyjonbrazil.github.io/jc/docs/parsers/rpm_qi))
- `--rsync` enables the `rsync` command parser ([documentation](https://kellyjonbrazil.github.io/jc/docs/parsers/rsync))
- `--rsync-s` enables the `rsync` command streaming parser ([documentation](https://kellyjonbrazil.github.io/jc/docs/parsers/rsync_s))
- `--sfdisk` enables the `sfdisk` command parser ([documentation](https://kellyjonbrazil.github.io/jc/docs/parsers/sfdisk))
- `--shadow` enables the `/etc/shadow` file parser ([documentation](https://kellyjonbrazil.github.io/jc/docs/parsers/shadow))
- `--ss` enables the `ss` command parser ([documentation](https://kellyjonbrazil.github.io/jc/docs/parsers/ss))
@ -234,6 +236,7 @@ option.
- `--wc` enables the `wc` command parser ([documentation](https://kellyjonbrazil.github.io/jc/docs/parsers/wc))
- `--who` enables the `who` command parser ([documentation](https://kellyjonbrazil.github.io/jc/docs/parsers/who))
- `--xml` enables the XML file parser ([documentation](https://kellyjonbrazil.github.io/jc/docs/parsers/xml))
- `--xrandr` enables the `xrandr` command parser ([documentation](https://kellyjonbrazil.github.io/jc/docs/parsers/xrandr))
- `--yaml` enables the YAML file parser ([documentation](https://kellyjonbrazil.github.io/jc/docs/parsers/yaml))
- `--zipinfo` enables the `zipinfo` command parser ([documentation](https://kellyjonbrazil.github.io/jc/docs/parsers/zipinfo))


@ -85,6 +85,9 @@ pydoc-markdown -m jc.lib "${toc_config}" > ../docs/lib.md
echo Building docs for: utils
pydoc-markdown -m jc.utils "${toc_config}" > ../docs/utils.md
echo Building docs for: streaming
pydoc-markdown -m jc.streaming "${toc_config}" > ../docs/streaming.md
echo Building docs for: universal parser
pydoc-markdown -m jc.parsers.universal "${toc_config}" > ../docs/parsers/universal.md


@ -105,7 +105,7 @@ subset of `parser_mod_list()`.
### parser\_info
```python
def parser_info(parser_mod_name: str) -> Union[Dict, None]
def parser_info(parser_mod_name: str) -> Dict
```
Returns a dictionary that includes the module metadata.
@ -118,10 +118,10 @@ This function will accept **module_name**, **cli-name**, and
### all\_parser\_info
```python
def all_parser_info() -> List[Optional[Dict]]
def all_parser_info() -> List[Dict]
```
Returns a list of dictionaris that includes metadata for all modules.
Returns a list of dictionaries that includes metadata for all modules.
<a id="jc.lib.get_help"></a>


@ -73,6 +73,7 @@ Examples:
### parse
```python
@add_jc_meta
def parse(data, raw=False, quiet=False, ignore_exceptions=False)
```
@ -93,9 +94,9 @@ Yields:
Returns:
Iterator object
Iterator object (generator)
### Parser Information
Compatibility: linux, darwin, cygwin, win32, aix, freebsd
Version 1.2 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.3 by Kelly Brazil (kellyjonbrazil@gmail.com)


@ -105,4 +105,4 @@ Returns:
### Parser Information
Compatibility: linux, darwin, freebsd
Version 2.2 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 2.3 by Kelly Brazil (kellyjonbrazil@gmail.com)


@ -350,4 +350,4 @@ Returns:
### Parser Information
Compatibility: linux, aix, freebsd, darwin, win32, cygwin
Version 2.2 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 2.3 by Kelly Brazil (kellyjonbrazil@gmail.com)


@ -148,4 +148,4 @@ Returns:
### Parser Information
Compatibility: win32
Version 1.4 by Rasheed Elsaleh (rasheed@rebelliondefense.com)
Version 1.5 by Rasheed Elsaleh (rasheed@rebelliondefense.com)


@ -110,6 +110,7 @@ Examples:
### parse
```python
@add_jc_meta
def parse(data, raw=False, quiet=False, ignore_exceptions=False)
```
@ -130,9 +131,9 @@ Yields:
Returns:
Iterator object
Iterator object (generator)
### Parser Information
Compatibility: linux
Version 1.0 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.1 by Kelly Brazil (kellyjonbrazil@gmail.com)


@ -144,4 +144,4 @@ Returns:
### Parser Information
Compatibility: linux, darwin, cygwin, aix, freebsd
Version 1.10 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.11 by Kelly Brazil (kellyjonbrazil@gmail.com)


@ -87,6 +87,7 @@ Examples:
### parse
```python
@add_jc_meta
def parse(data, raw=False, quiet=False, ignore_exceptions=False)
```
@ -107,9 +108,9 @@ Yields:
Returns:
Iterator object
Iterator object (generator)
### Parser Information
Compatibility: linux, darwin, cygwin, aix, freebsd
Version 0.6 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.0 by Kelly Brazil (kellyjonbrazil@gmail.com)


@ -93,6 +93,7 @@ Examples:
### parse
```python
@add_jc_meta
def parse(data, raw=False, quiet=False, ignore_exceptions=False)
```
@ -113,9 +114,9 @@ Yields:
Returns:
Iterator object
Iterator object (generator)
### Parser Information
Compatibility: linux, darwin, freebsd
Version 0.6 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.0 by Kelly Brazil (kellyjonbrazil@gmail.com)


@ -189,4 +189,4 @@ Returns:
### Parser Information
Compatibility: linux
Version 1.4 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.5 by Kelly Brazil (kellyjonbrazil@gmail.com)

docs/parsers/rsync.md Normal file

@ -0,0 +1,165 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.rsync"></a>
# jc.parsers.rsync
jc - JSON CLI output utility `rsync` command output parser
Supports the `-i` or `--itemize-changes` options with all levels of
verbosity. This parser will process the STDOUT output or a log file
generated with the `--log-file` option.
Usage (cli):
$ rsync -i -a source/ dest | jc --rsync
or
$ jc rsync -i -a source/ dest
or
$ cat rsync-backup.log | jc --rsync
Usage (module):
import jc
result = jc.parse('rsync', rsync_command_output)
or
import jc.parsers.rsync
result = jc.parsers.rsync.parse(rsync_command_output)
Schema:
[
{
"summary": {
"date": string,
"time": string,
"process": integer,
"sent": integer,
"received": integer,
"total_size": integer,
"matches": integer,
"hash_hits": integer,
"false_alarms": integer,
"data": integer,
"bytes_sec": float,
"speedup": float
},
"files": [
{
"filename": string,
"date": string,
"time": string,
"process": integer,
"metadata": string,
"update_type": string/null, [0]
"file_type": string/null, [1]
"checksum_or_value_different": bool/null,
"size_different": bool/null,
"modification_time_different": bool/null,
"permissions_different": bool/null,
"owner_different": bool/null,
"group_different": bool/null,
"acl_different": bool/null,
"extended_attribute_different": bool/null,
"epoch": integer, [2]
}
]
}
]
[0] 'file sent', 'file received', 'local change or creation',
'hard link', 'not updated', 'message'
[1] 'file', 'directory', 'symlink', 'device', 'special file'
[2] naive timestamp if time and date fields exist and can be converted.
Examples:
$ rsync -i -a source/ dest | jc --rsync -p
[
{
"summary": {
"sent": 1708,
"received": 8209,
"bytes_sec": 19834.0,
"total_size": 235,
"speedup": 0.02
},
"files": [
{
"filename": "./",
"metadata": ".d..t......",
"update_type": "not updated",
"file_type": "directory",
"checksum_or_value_different": false,
"size_different": false,
"modification_time_different": true,
"permissions_different": false,
"owner_different": false,
"group_different": false,
"acl_different": false,
"extended_attribute_different": false
},
...
]
}
]
$ rsync | jc --rsync -p -r
[
{
"summary": {
"sent": "1,708",
"received": "8,209",
"bytes_sec": "19,834.00",
"total_size": "235",
"speedup": "0.02"
},
"files": [
{
"filename": "./",
"metadata": ".d..t......",
"update_type": "not updated",
"file_type": "directory",
"checksum_or_value_different": false,
"size_different": false,
"modification_time_different": true,
"permissions_different": false,
"owner_different": false,
"group_different": false,
"acl_different": false,
"extended_attribute_different": false
},
...
]
}
]
<a id="jc.parsers.rsync.parse"></a>
### parse
```python
def parse(data: str, raw: bool = False, quiet: bool = False) -> List[Dict]
```
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
List of Dictionaries. Raw or processed structured data.
### Parser Information
Compatibility: linux, darwin, freebsd
Version 1.0 by Kelly Brazil (kellyjonbrazil@gmail.com)
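The boolean fields in the schema above are decoded from rsync's 11-character `--itemize-changes` string (e.g. `.d..t......`). The following is an illustrative sketch of that decoding, not jc's actual implementation; field names follow the schema above:

```python
def decode_itemize(s):
    """Decode an rsync --itemize-changes string like '.d..t......'."""
    update_types = {'<': 'file sent', '>': 'file received',
                    'c': 'local change or creation', 'h': 'hard link',
                    '.': 'not updated', '*': 'message'}
    file_types = {'f': 'file', 'd': 'directory', 'L': 'symlink',
                  'D': 'device', 'S': 'special file'}
    # Positions 2-7 are the c, s, t, p, o, g change flags; position 8 is
    # reserved; positions 9 and 10 are ACLs and extended attributes.
    flags = ['checksum_or_value_different', 'size_different',
             'modification_time_different', 'permissions_different',
             'owner_different', 'group_different']
    result = {
        'metadata': s,
        'update_type': update_types.get(s[0]),
        'file_type': file_types.get(s[1]),
    }
    for name, ch in zip(flags, s[2:8]):
        result[name] = ch != '.'
    result['acl_different'] = s[9] != '.'
    result['extended_attribute_different'] = s[10] != '.'
    return result
```

Running this on the `./` entry from the example output yields `modification_time_different: True` with the remaining flags `False`, matching the JSON shown above.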

docs/parsers/rsync_s.md Normal file

@ -0,0 +1,127 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.rsync_s"></a>
# jc.parsers.rsync\_s
jc - JSON CLI output utility `rsync` command output streaming parser
> This streaming parser outputs JSON Lines
Supports the `-i` or `--itemize-changes` options with all levels of
verbosity. This parser will process the STDOUT output or a log file
generated with the `--log-file` option.
Usage (cli):
$ rsync -i -a source/ dest | jc --rsync-s
or
$ cat rsync-backup.log | jc --rsync-s
Usage (module):
import jc
# result is an iterable object (generator)
result = jc.parse('rsync_s', rsync_command_output.splitlines())
for item in result:
# do something
or
import jc.parsers.rsync_s
# result is an iterable object (generator)
result = jc.parsers.rsync_s.parse(rsync_command_output.splitlines())
for item in result:
# do something
Schema:
{
"type": string, # 'file' or 'summary'
"date": string,
"time": string,
"process": integer,
"sent": integer,
"received": integer,
"total_size": integer,
"matches": integer,
"hash_hits": integer,
"false_alarms": integer,
"data": integer,
"bytes_sec": float,
"speedup": float,
"filename": string,
"date": string,
"time": string,
"process": integer,
"metadata": string,
"update_type": string/null, [0]
"file_type": string/null, [1]
"checksum_or_value_different": bool/null,
"size_different": bool/null,
"modification_time_different": bool/null,
"permissions_different": bool/null,
"owner_different": bool/null,
"group_different": bool/null,
"acl_different": bool/null,
"extended_attribute_different": bool/null,
"epoch": integer, [2]
# Below object only exists if using -qq or ignore_exceptions=True
"_jc_meta":
{
"success": boolean, # false if error parsing
"error": string, # exists if "success" is false
"line": string # exists if "success" is false
}
}
[0] 'file sent', 'file received', 'local change or creation',
'hard link', 'not updated', 'message'
[1] 'file', 'directory', 'symlink', 'device', 'special file'
[2] naive timestamp if time and date fields exist and can be converted.
Examples:
$ rsync -i -a source/ dest | jc --rsync-s
{"type":"file","filename":"./","metadata":".d..t......","update_...}
...
$ cat rsync_backup.log | jc --rsync-s
{"type":"file","filename":"./","date":"2022/01/28","time":"03:53...}
...
<a id="jc.parsers.rsync_s.parse"></a>
### parse
```python
@add_jc_meta
def parse(data: Iterable[str], raw: bool = False, quiet: bool = False, ignore_exceptions: bool = False) -> Union[Iterable[Dict], tuple]
```
Main text parsing generator function. Returns an iterator object.
Parameters:
data: (iterable) line-based text data to parse
(e.g. sys.stdin or str.splitlines())
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
ignore_exceptions: (boolean) ignore parsing exceptions if True
Yields:
Dictionary. Raw or processed structured data.
Returns:
Iterator object (generator)
### Parser Information
Compatibility: linux, darwin, freebsd
Version 1.0 by Kelly Brazil (kellyjonbrazil@gmail.com)
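Since the streaming parser emits JSON Lines (one object per line), downstream consumers can process output incrementally. A minimal, self-contained reader sketch (the sample records below are abbreviated, hypothetical output):

```python
import io
import json

def read_jsonl(stream):
    """Yield one parsed object per non-empty line of JSON Lines input."""
    for line in stream:
        line = line.strip()
        if line:
            yield json.loads(line)

# e.g. consuming `rsync -i -a source/ dest | jc --rsync-s` output:
sample = io.StringIO(
    '{"type":"file","filename":"./"}\n'
    '{"type":"summary","sent":1708}\n'
)
records = list(read_jsonl(sample))
```

In a real pipeline, `sys.stdin` would replace the `StringIO` sample, and each record can be handled as soon as its line arrives.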


@ -198,4 +198,4 @@ Returns:
### Parser Information
Compatibility: linux, darwin, freebsd
Version 1.10 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.11 by Kelly Brazil (kellyjonbrazil@gmail.com)


@ -91,6 +91,7 @@ Examples:
### parse
```python
@add_jc_meta
def parse(data, raw=False, quiet=False, ignore_exceptions=False)
```
@ -111,9 +112,9 @@ Yields:
Returns:
Iterator object
Iterator object (generator)
### Parser Information
Compatibility: linux, darwin, freebsd
Version 0.5 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.0 by Kelly Brazil (kellyjonbrazil@gmail.com)


@ -239,4 +239,4 @@ Returns:
### Parser Information
Compatibility: win32
Version 1.1 by Jon Smith (jon@rebelliondefense.com)
Version 1.2 by Jon Smith (jon@rebelliondefense.com)


@ -92,4 +92,4 @@ Returns:
### Parser Information
Compatibility: linux
Version 1.5 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.6 by Kelly Brazil (kellyjonbrazil@gmail.com)


@ -40,7 +40,7 @@ Returns:
### sparse\_table\_parse
```python
def sparse_table_parse(data: List[str], delim: Optional[str] = '\u2063') -> List[Dict]
def sparse_table_parse(data: List[str], delim: str = '\u2063') -> List[Dict]
```
Parse tables with missing column data or with spaces in column data.


@ -226,4 +226,4 @@ Returns:
### Parser Information
Compatibility: linux
Version 1.3 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.4 by Kelly Brazil (kellyjonbrazil@gmail.com)


@ -154,4 +154,4 @@ Returns:
### Parser Information
Compatibility: linux
Version 1.1 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.2 by Kelly Brazil (kellyjonbrazil@gmail.com)


@ -110,6 +110,7 @@ Examples:
### parse
```python
@add_jc_meta
def parse(data, raw=False, quiet=False, ignore_exceptions=False)
```
@ -130,9 +131,9 @@ Yields:
Returns:
Iterator object
Iterator object (generator)
### Parser Information
Compatibility: linux
Version 0.6 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.0 by Kelly Brazil (kellyjonbrazil@gmail.com)


@ -163,4 +163,4 @@ Returns:
### Parser Information
Compatibility: linux, darwin, cygwin, aix, freebsd
Version 1.5 by Kelly Brazil (kellyjonbrazil@gmail.com)
Version 1.6 by Kelly Brazil (kellyjonbrazil@gmail.com)

docs/parsers/xrandr.md Normal file

@ -0,0 +1,166 @@
[Home](https://kellyjonbrazil.github.io/jc/)
<a id="jc.parsers.xrandr"></a>
# jc.parsers.xrandr
jc - JSON CLI output utility `xrandr` command output parser
Usage (cli):
$ xrandr | jc --xrandr
or
$ jc xrandr
Usage (module):
import jc
result = jc.parse('xrandr', xrandr_command_output)
or
import jc.parsers.xrandr
result = jc.parsers.xrandr.parse(xrandr_command_output)
Schema:
{
"screens": [
{
"screen_number": integer,
"minimum_width": integer,
"minimum_height": integer,
"current_width": integer,
"current_height": integer,
"maximum_width": integer,
"maximum_height": integer,
"associated_device": {
"associated_modes": [
{
"resolution_width": integer,
"resolution_height": integer,
"is_high_resolution": boolean,
"frequencies": [
{
"frequency": float,
"is_current": boolean,
"is_preferred": boolean
}
],
"is_connected": boolean,
"is_primary": boolean,
"device_name": string,
"resolution_width": integer,
"resolution_height": integer,
"offset_width": integer,
"offset_height": integer,
"dimension_width": integer,
"dimension_height": integer
}
}
],
"unassociated_devices": [
{
"associated_modes": [
{
"resolution_width": integer,
"resolution_height": integer,
"is_high_resolution": boolean,
"frequencies": [
{
"frequency": float,
"is_current": boolean,
"is_preferred": boolean
}
]
}
]
}
]
}
Examples:
$ xrandr | jc --xrandr -p
{
"screens": [
{
"screen_number": 0,
"minimum_width": 8,
"minimum_height": 8,
"current_width": 1920,
"current_height": 1080,
"maximum_width": 32767,
"maximum_height": 32767,
"associated_device": {
"associated_modes": [
{
"resolution_width": 1920,
"resolution_height": 1080,
"is_high_resolution": false,
"frequencies": [
{
"frequency": 60.03,
"is_current": true,
"is_preferred": true
},
{
"frequency": 59.93,
"is_current": false,
"is_preferred": false
}
]
},
{
"resolution_width": 1680,
"resolution_height": 1050,
"is_high_resolution": false,
"frequencies": [
{
"frequency": 59.88,
"is_current": false,
"is_preferred": false
}
]
}
],
"is_connected": true,
"is_primary": true,
"device_name": "eDP1",
"resolution_width": 1920,
"resolution_height": 1080,
"offset_width": 0,
"offset_height": 0,
"dimension_width": 310,
"dimension_height": 170
}
}
],
"unassociated_devices": []
}
<a id="jc.parsers.xrandr.parse"></a>
### parse
```python
def parse(data: str, raw: bool = False, quiet: bool = False) -> Dict
```
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
Dictionary. Raw or processed structured data.
### Parser Information
Compatibility: linux, darwin, cygwin, aix, freebsd
Version 1.0 by Kevin Lyter (lyter_git at sent.com)
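The `screen_number` and width/height fields in the schema above come from xrandr's `Screen` header line. An illustrative regex sketch of that extraction (not the parser's actual implementation):

```python
import re

# Matches a header like:
# Screen 0: minimum 8 x 8, current 1920 x 1080, maximum 32767 x 32767
SCREEN_RE = re.compile(
    r'Screen (?P<screen_number>\d+): '
    r'minimum (?P<minimum_width>\d+) x (?P<minimum_height>\d+), '
    r'current (?P<current_width>\d+) x (?P<current_height>\d+), '
    r'maximum (?P<maximum_width>\d+) x (?P<maximum_height>\d+)'
)

def parse_screen(line):
    """Return the Screen header fields as integers, or None if no match."""
    m = SCREEN_RE.match(line)
    return {k: int(v) for k, v in m.groupdict().items()} if m else None
```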


@ -44,7 +44,7 @@ Schema:
{
"flags": string,
"zipversion": string,
"zipunder": string
"zipunder": string,
"filesize": integer,
"type": string,
"method": string,


@ -14,6 +14,8 @@ and file-types to dictionaries and lists of dictionaries.
>>> help('jc')
>>> help('jc.lib')
>>> help('jc.utils')
>>> help('jc.streaming')
>>> help('jc.parsers.universal')
>>> jc.get_help('parser_module_name')
## Online Documentation

docs/streaming.md Normal file

@ -0,0 +1,114 @@
# Table of Contents
* [jc.streaming](#jc.streaming)
* [streaming\_input\_type\_check](#jc.streaming.streaming_input_type_check)
* [streaming\_line\_input\_type\_check](#jc.streaming.streaming_line_input_type_check)
* [stream\_success](#jc.streaming.stream_success)
* [stream\_error](#jc.streaming.stream_error)
* [add\_jc\_meta](#jc.streaming.add_jc_meta)
* [raise\_or\_yield](#jc.streaming.raise_or_yield)
<a id="jc.streaming"></a>
# jc.streaming
jc - JSON CLI output utility streaming utils
<a id="jc.streaming.streaming_input_type_check"></a>
### streaming\_input\_type\_check
```python
def streaming_input_type_check(data: Iterable) -> None
```
Ensure input data is an iterable, but not a string or bytes. Raises
`TypeError` if not.
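A minimal sketch of the documented behavior (not jc's actual implementation): reject strings and bytes because iterating them would produce characters rather than lines.

```python
def streaming_input_type_check(data):
    # Accept any iterable except str/bytes: a plain string is iterable,
    # but would be consumed character by character instead of line by line.
    if isinstance(data, (str, bytes)) or not hasattr(data, '__iter__'):
        raise TypeError('Input data must be a non-string iterable object.')
```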
<a id="jc.streaming.streaming_line_input_type_check"></a>
### streaming\_line\_input\_type\_check
```python
def streaming_line_input_type_check(line: str) -> None
```
Ensure each line is a string. Raises `TypeError` if not.
<a id="jc.streaming.stream_success"></a>
### stream\_success
```python
def stream_success(output_line: Dict, ignore_exceptions: bool) -> Dict
```
Add `_jc_meta` object to output line if `ignore_exceptions=True`
<a id="jc.streaming.stream_error"></a>
### stream\_error
```python
def stream_error(e: BaseException, line: str) -> Dict
```
Return an error `_jc_meta` field.
<a id="jc.streaming.add_jc_meta"></a>
### add\_jc\_meta
```python
def add_jc_meta(func)
```
Decorator for streaming parsers to add stream_success and stream_error
objects. This simplifies the yield lines in the streaming parsers.
With the decorator on parse():
# successfully parsed line:
yield output_line if raw else _process(output_line)
# unsuccessfully parsed line:
except Exception as e:
yield raise_or_yield(ignore_exceptions, e, line)
Without the decorator on parse():
# successfully parsed line:
if raw:
yield stream_success(output_line, ignore_exceptions)
else:
yield stream_success(_process(output_line), ignore_exceptions)
# unsuccessfully parsed line:
except Exception as e:
yield stream_error(raise_or_yield(ignore_exceptions, e, line))
In all cases above:
output_line: (Dict) successfully parsed line yielded as a dict
e: (BaseException) exception object as the first value
of the tuple if the line was not successfully parsed.
line: (str) string of the original line that did not
successfully parse.
ignore_exceptions: (bool) continue processing lines and ignore
exceptions if True.
<a id="jc.streaming.raise_or_yield"></a>
### raise\_or\_yield
```python
def raise_or_yield(ignore_exceptions: bool, e: BaseException, line: str) -> tuple
```
Return the exception object and line string if ignore_exceptions is
True. Otherwise, re-raise the exception from the exception object with
an annotation.
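The decorator contract described above can be sketched in a simplified, self-contained form. This is an assumption-laden illustration, not jc's implementation: plain dicts pass through unchanged, while `(exception, line)` tuples yielded by the wrapped parser become error objects carrying a `_jc_meta` field (the real decorator also attaches a success `_jc_meta` object when `ignore_exceptions=True`).

```python
def add_jc_meta(func):
    """Simplified sketch of a jc.streaming.add_jc_meta-style decorator."""
    def wrapper(*args, **kwargs):
        for item in func(*args, **kwargs):
            if isinstance(item, tuple):
                # raise_or_yield() returned (exception, original line)
                e, line = item
                yield {'_jc_meta': {'success': False,
                                    'error': f'{type(e).__name__}: {e}',
                                    'line': line}}
            else:
                # successfully parsed line: pass the dict through
                yield item
    return wrapper
```

This keeps the `yield` sites inside a streaming parser to a single line each, which is the stated motivation for the decorator.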


@ -8,11 +8,7 @@
* [convert\_to\_int](#jc.utils.convert_to_int)
* [convert\_to\_float](#jc.utils.convert_to_float)
* [convert\_to\_bool](#jc.utils.convert_to_bool)
* [stream\_success](#jc.utils.stream_success)
* [stream\_error](#jc.utils.stream_error)
* [input\_type\_check](#jc.utils.input_type_check)
* [streaming\_input\_type\_check](#jc.utils.streaming_input_type_check)
* [streaming\_line\_input\_type\_check](#jc.utils.streaming_line_input_type_check)
* [timestamp](#jc.utils.timestamp)
* [\_\_init\_\_](#jc.utils.timestamp.__init__)
@ -67,7 +63,7 @@ Returns:
### compatibility
```python
def compatibility(mod_name: str, compatible: List, quiet: Optional[bool] = False) -> None
def compatibility(mod_name: str, compatible: List, quiet: bool = False) -> None
```
Checks for the parser's compatibility with the running OS
@ -112,7 +108,7 @@ Returns:
### convert\_to\_int
```python
def convert_to_int(value: Union[str, float]) -> Union[int, None]
def convert_to_int(value: Union[str, float]) -> Optional[int]
```
Converts string and float input to int. Strips all non-numeric
@ -131,7 +127,7 @@ Returns:
### convert\_to\_float
```python
def convert_to_float(value: Union[str, int]) -> Union[float, None]
def convert_to_float(value: Union[str, int]) -> Optional[float]
```
Converts string and int input to float. Strips all non-numeric
@ -165,27 +161,6 @@ Returns:
True/False False unless a 'truthy' number or string is found
('y', 'yes', 'true', '1', 1, -1, etc.)
<a id="jc.utils.stream_success"></a>
### stream\_success
```python
def stream_success(output_line: Dict, ignore_exceptions: bool) -> Dict
```
Add `_jc_meta` object to output line if `ignore_exceptions=True`
<a id="jc.utils.stream_error"></a>
### stream\_error
```python
def stream_error(e: BaseException, ignore_exceptions: bool, line: str) -> Dict
```
Reraise the stream exception with annotation or print an error
`_jc_meta` field if `ignore_exceptions=True`.
<a id="jc.utils.input_type_check"></a>
### input\_type\_check
@ -196,27 +171,6 @@ def input_type_check(data: str) -> None
Ensure input data is a string. Raises `TypeError` if not.
<a id="jc.utils.streaming_input_type_check"></a>
### streaming\_input\_type\_check
```python
def streaming_input_type_check(data: Iterable) -> None
```
Ensure input data is an iterable, but not a string or bytes. Raises
`TypeError` if not.
<a id="jc.utils.streaming_line_input_type_check"></a>
### streaming\_line\_input\_type\_check
```python
def streaming_line_input_type_check(line: str) -> None
```
Ensure each line is a string. Raises `TypeError` if not.
<a id="jc.utils.timestamp"></a>
### timestamp Objects
@ -230,29 +184,34 @@ class timestamp()
### \_\_init\_\_
```python
def __init__(datetime_string: str) -> None
def __init__(datetime_string: str, format_hint: Union[List, Tuple, None] = None) -> None
```
Input a date-time text string of several formats and convert to a
Input a datetime text string of several formats and convert to a
naive or timezone-aware epoch timestamp in UTC.
Parameters:
datetime_string: (str) a string representation of a
date-time in several supported formats
datetime_string (str): a string representation of a
datetime in several supported formats
Attributes:
format_hint (list | tuple): an optional list of format ID
integers to instruct the timestamp object to try those
formats first in the order given. Other formats will be
tried after the format hint list is exhausted. This can
speed up timestamp conversion so several different formats
don't have to be tried in brute-force fashion.
string (str) the input datetime string
Returns a timestamp object with the following attributes:
format (int) the format rule that was used to
decode the datetime string. None if
conversion fails
string (str): the input datetime string
naive (int) timestamp based on locally configured
timezone. None if conversion fails
format (int | None): the format rule that was used to decode
the datetime string. None if conversion fails.
utc (int) aware timestamp only if UTC timezone
detected in datetime string. None if
conversion fails
naive (int | None): timestamp based on locally configured
timezone. None if conversion fails.
utc (int | None): aware timestamp only if UTC timezone
detected in datetime string. None if conversion fails.
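The performance idea behind `format_hint` can be sketched with a toy format table. The format IDs and patterns below are hypothetical (jc's real tables differ); the point is the ordering: hinted formats are tried first, so brute-force trials are skipped when the caller knows the likely format.

```python
from datetime import datetime

# Hypothetical format-ID table for illustration only.
FORMATS = {
    1000: '%Y-%m-%d %H:%M:%S',
    2000: '%d/%b/%Y %H:%M:%S',
}

def naive_epoch(datetime_string, format_hint=None):
    """Return a naive epoch timestamp, trying hinted format IDs first."""
    hint = list(format_hint or [])
    order = hint + [k for k in FORMATS if k not in hint]
    for fmt_id in order:
        try:
            dt = datetime.strptime(datetime_string, FORMATS[fmt_id])
            return int(dt.timestamp())  # based on the local timezone
        except ValueError:
            continue
    return None  # conversion failed
```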


@ -10,6 +10,8 @@ and file-types to dictionaries and lists of dictionaries.
>>> help('jc')
>>> help('jc.lib')
>>> help('jc.utils')
>>> help('jc.streaming')
>>> help('jc.parsers.universal')
>>> jc.get_help('parser_module_name')
## Online Documentation


@ -89,13 +89,15 @@ def set_env_colors(env_colors=None):
"""
Return a dictionary to be used in Pygments custom style class.
Grab custom colors from JC_COLORS environment variable. JC_COLORS env variable takes 4 comma
separated string values and should be in the format of:
Grab custom colors from JC_COLORS environment variable. JC_COLORS env
variable takes 4 comma separated string values and should be in the
format of:
JC_COLORS=<keyname_color>,<keyword_color>,<number_color>,<string_color>
Where colors are: black, red, green, yellow, blue, magenta, cyan, gray, brightblack, brightred,
brightgreen, brightyellow, brightblue, brightmagenta, brightcyan, white, default
Where colors are: black, red, green, yellow, blue, magenta, cyan, gray,
brightblack, brightred, brightgreen, brightyellow,
brightblue, brightmagenta, brightcyan, white, default
Default colors:
@ -132,8 +134,10 @@ def set_env_colors(env_colors=None):
def piped_output(force_color):
"""Return False if stdout is a TTY. True if output is being piped to another program
and foce_color is True. This allows forcing of ANSI color codes even when using pipes.
"""
Return False if stdout is a TTY. True if output is being piped to
another program and force_color is True. This allows forcing of ANSI
color codes even when using pipes.
"""
return not sys.stdout.isatty() and not force_color
@ -224,8 +228,8 @@ def helptext():
def help_doc(options):
"""
Returns the parser documentation if a parser is found in the arguments, otherwise
the general help text is returned.
Returns the parser documentation if a parser is found in the arguments,
otherwise the general help text is returned.
"""
for arg in options:
parser_name = parser_shortname(arg)
@ -253,7 +257,10 @@ def versiontext():
def json_out(data, pretty=False, env_colors=None, mono=False, piped_out=False):
"""Return a JSON formatted string. String may include color codes or be pretty printed."""
"""
Return a JSON formatted string. String may include color codes or be
pretty printed.
"""
separators = (',', ':')
indent = None
@ -277,10 +284,10 @@ def magic_parser(args):
Parse command arguments for magic syntax: jc -p ls -al
Return a tuple:
valid_command (bool) is this a valid command? (exists in magic dict)
run_command (list) list of the user's command to run. None if no command.
jc_parser (str) parser to use for this user's command.
jc_options (list) list of jc options
valid_command (bool) is this a valid cmd? (exists in magic dict)
run_command (list) list of the user's cmd to run. None if no cmd.
jc_parser (str) parser to use for this user's cmd.
jc_options (list) list of jc options
"""
# bail immediately if there are no args or a parser is defined
if len(args) <= 1 or args[1].startswith('--'):
@ -335,12 +342,15 @@ def magic_parser(args):
def run_user_command(command):
"""Use subprocess to run the user's command. Returns the STDOUT, STDERR, and the Exit Code as a tuple."""
"""
Use subprocess to run the user's command. Returns the STDOUT, STDERR,
and the Exit Code as a tuple.
"""
proc = subprocess.Popen(command,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
close_fds=False, # Allows inheriting file descriptors. Useful for process substitution
universal_newlines=True)
close_fds=False, # Allows inheriting file descriptors;
universal_newlines=True) # useful for process substitution
stdout, stderr = proc.communicate()
return (
@ -441,14 +451,18 @@ def main():
raise
error_msg = os.strerror(e.errno)
utils.error_message([f'"{run_command_str}" command could not be run: {error_msg}. For details use the -d or -dd option.'])
utils.error_message([
f'"{run_command_str}" command could not be run: {error_msg}. For details use the -d or -dd option.'
])
sys.exit(combined_exit_code(magic_exit_code, JC_ERROR_EXIT))
except Exception:
if debug:
raise
utils.error_message([f'"{run_command_str}" command could not be run. For details use the -d or -dd option.'])
utils.error_message([
f'"{run_command_str}" command could not be run. For details use the -d or -dd option.'
])
sys.exit(combined_exit_code(magic_exit_code, JC_ERROR_EXIT))
elif run_command is not None:
@ -489,7 +503,10 @@ def main():
# streaming
if getattr(parser.info, 'streaming', None):
result = parser.parse(sys.stdin, raw=raw, quiet=quiet, ignore_exceptions=ignore_exceptions)
result = parser.parse(sys.stdin,
raw=raw,
quiet=quiet,
ignore_exceptions=ignore_exceptions)
for line in result:
print(json_out(line,
pretty=pretty,
@ -503,7 +520,9 @@ def main():
# regular
else:
data = magic_stdout or sys.stdin.read()
result = parser.parse(data, raw=raw, quiet=quiet)
result = parser.parse(data,
raw=raw,
quiet=quiet)
print(json_out(result,
pretty=pretty,
env_colors=jc_colors,
@ -517,10 +536,11 @@ def main():
if debug:
raise
utils.error_message([f'Parser issue with {parser_name}:',
f'{e.__class__.__name__}: {e}',
'If this is the correct parser, try setting the locale to C (LANG=C).',
'For details use the -d or -dd option. Use "jc -h" for help.'])
utils.error_message([
f'Parser issue with {parser_name}:', f'{e.__class__.__name__}: {e}',
'If this is the correct parser, try setting the locale to C (LANG=C).',
'For details use the -d or -dd option. Use "jc -h" for help.'
])
sys.exit(combined_exit_code(magic_exit_code, JC_ERROR_EXIT))
except json.JSONDecodeError:


@ -6,10 +6,10 @@ import sys
import os
import re
import importlib
from typing import Dict, List, Iterable, Union, Iterator, Optional
from typing import Dict, List, Iterable, Union, Iterator
from jc import appdirs
__version__ = '1.18.2'
__version__ = '1.18.3'
parsers = [
'acpi',
@ -69,6 +69,8 @@ parsers = [
'ps',
'route',
'rpm-qi',
'rsync',
'rsync-s',
'sfdisk',
'shadow',
'ss',
@ -95,6 +97,7 @@ parsers = [
'wc',
'who',
'xml',
'xrandr',
'yaml',
'zipinfo'
]
@ -224,7 +227,7 @@ def plugin_parser_mod_list() -> List[str]:
"""
return [_cliname_to_modname(p) for p in local_parsers]
def parser_info(parser_mod_name: str) -> Union[Dict, None]:
def parser_info(parser_mod_name: str) -> Dict:
"""
Returns a dictionary that includes the module metadata.
@ -235,9 +238,9 @@ def parser_info(parser_mod_name: str) -> Union[Dict, None]:
parser_mod_name = _cliname_to_modname(parser_mod_name)
parser_mod = _get_parser(parser_mod_name)
info_dict: Dict = {}
if hasattr(parser_mod, 'info'):
info_dict: Dict = {}
info_dict['name'] = parser_mod_name
info_dict['argument'] = _parser_argument(parser_mod_name)
parser_entry = vars(parser_mod.info)
@ -249,15 +252,13 @@ def parser_info(parser_mod_name: str) -> Union[Dict, None]:
if _modname_to_cliname(parser_mod_name) in local_parsers:
info_dict['plugin'] = True
return info_dict
return info_dict
return None
def all_parser_info() -> List[Optional[Dict]]:
def all_parser_info() -> List[Dict]:
"""
Returns a list of dictionaris that includes metadata for all modules.
Returns a list of dictionaries that includes metadata for all modules.
"""
return [parser_info(_cliname_to_modname(p)) for p in parsers]
return [parser_info(p) for p in parsers]
def get_help(parser_mod_name: str) -> None:
"""

View File

@ -66,13 +66,13 @@ Examples:
import itertools
import csv
import jc.utils
from jc.utils import stream_success, stream_error
from jc.streaming import streaming_input_type_check, add_jc_meta, raise_or_yield
from jc.exceptions import ParseError
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.2'
version = '1.3'
description = 'CSV file streaming parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@ -101,6 +101,7 @@ def _process(proc_data):
return proc_data
@add_jc_meta
def parse(data, raw=False, quiet=False, ignore_exceptions=False):
"""
Main text parsing generator function. Returns an iterator object.
@ -120,10 +121,10 @@ def parse(data, raw=False, quiet=False, ignore_exceptions=False):
Returns:
Iterator object
Iterator object (generator)
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.streaming_input_type_check(data)
streaming_input_type_check(data)
# convert data to an iterable in case a sequence like a list is used as input.
# this allows the exhaustion of the input so we don't double-process later.
@ -155,6 +156,6 @@ def parse(data, raw=False, quiet=False, ignore_exceptions=False):
for row in reader:
try:
yield stream_success(row, ignore_exceptions) if raw else stream_success(_process(row), ignore_exceptions)
yield row if raw else _process(row)
except Exception as e:
yield stream_error(e, ignore_exceptions, row)
yield raise_or_yield(ignore_exceptions, e, str(row))

View File

@ -83,7 +83,7 @@ import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '2.2'
version = '2.3'
description = '`date` command parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@ -168,7 +168,7 @@ def parse(data, raw=False, quiet=False):
dt = None
dt_utc = None
timestamp = jc.utils.timestamp(data)
timestamp = jc.utils.timestamp(data, format_hint=(1000, 6000, 7000))
if timestamp.naive:
dt = datetime.fromtimestamp(timestamp.naive)
if timestamp.utc:

View File

@ -327,7 +327,7 @@ import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '2.2'
version = '2.3'
description = '`dig` command parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@ -384,7 +384,7 @@ def _process(proc_data):
auth['ttl'] = jc.utils.convert_to_int(auth['ttl'])
if 'when' in entry:
ts = jc.utils.timestamp(entry['when'])
ts = jc.utils.timestamp(entry['when'], format_hint=(1000, 7000))
entry['when_epoch'] = ts.naive
entry['when_epoch_utc'] = ts.utc

View File

@ -126,7 +126,7 @@ import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.4'
version = '1.5'
description = '`dir` command parser'
author = 'Rasheed Elsaleh'
author_email = 'rasheed@rebelliondefense.com'
@ -152,7 +152,7 @@ def _process(proc_data):
# add timestamps
if 'date' in entry and 'time' in entry:
dt = entry['date'] + ' ' + entry['time']
timestamp = jc.utils.timestamp(dt)
timestamp = jc.utils.timestamp(dt, format_hint=(1600,))
entry['epoch'] = timestamp.naive
# add ints

View File

@ -38,8 +38,8 @@ Examples:
$ foo | jc --foo -p -r
[]
"""
import jc.utils
from typing import List, Dict
import jc.utils
class info():

View File

@ -49,9 +49,11 @@ Examples:
{example output}
...
"""
from typing import Dict, Iterable
from typing import Dict, Iterable, Union
import jc.utils
from jc.utils import stream_success, stream_error
from jc.streaming import (
add_jc_meta, streaming_input_type_check, streaming_line_input_type_check, raise_or_yield
)
from jc.exceptions import ParseError
@ -63,7 +65,7 @@ class info():
author_email = 'johndoe@gmail.com'
# compatible options: linux, darwin, cygwin, win32, aix, freebsd
compatible = ['linux', 'darwin', 'cygwin', 'aix', 'freebsd']
compatible = ['linux', 'darwin', 'cygwin', 'win32', 'aix', 'freebsd']
streaming = True
@ -91,12 +93,13 @@ def _process(proc_data: Dict) -> Dict:
return proc_data
@add_jc_meta
def parse(
data: Iterable[str],
raw: bool = False,
quiet: bool = False,
ignore_exceptions: bool = False
) -> Iterable[Dict]:
) -> Union[Iterable[Dict], tuple]:
"""
Main text parsing generator function. Returns an iterator object.
@ -115,24 +118,24 @@ def parse(
Returns:
Iterator object
Iterator object (generator)
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.streaming_input_type_check(data)
streaming_input_type_check(data)
for line in data:
output_line: Dict = {}
try:
jc.utils.streaming_line_input_type_check(line)
streaming_line_input_type_check(line)
output_line: Dict = {}
# parse the content here
# check out helper functions in jc.utils
# and jc.parsers.universal
if output_line:
yield stream_success(output_line, ignore_exceptions) if raw else stream_success(_process(output_line), ignore_exceptions)
yield output_line if raw else _process(output_line)
else:
raise ParseError('Not foo data')
except Exception as e:
yield stream_error(e, ignore_exceptions, line)
yield raise_or_yield(ignore_exceptions, e, line)
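The template above shows the refactored streaming-parser pattern: the generator now yields plain dictionaries on success and hands errors to `raise_or_yield`, while an `@add_jc_meta` decorator attaches the `_jc_meta` object when exceptions are being ignored. A simplified sketch of that pattern (this is illustrative, not jc's actual implementation; names mirror the template):

```python
import functools

def add_jc_meta(func):
    """Wrap a streaming parser generator: pass plain dicts through and
    convert (exception, line) tuples from raise_or_yield into error
    objects carrying a _jc_meta field, mimicking ignore_exceptions mode."""
    @functools.wraps(func)
    def wrapper(data, raw=False, quiet=False, ignore_exceptions=False):
        for item in func(data, raw=raw, quiet=quiet,
                         ignore_exceptions=ignore_exceptions):
            if isinstance(item, tuple):   # raise_or_yield handed back an error
                exc, line = item
                yield {'_jc_meta': {'success': False,
                                    'error': f'{exc.__class__.__name__}: {exc}',
                                    'line': line.strip()}}
            elif ignore_exceptions:
                item.setdefault('_jc_meta', {'success': True})
                yield item
            else:
                yield item
    return wrapper

def raise_or_yield(ignore_exceptions, exc, line):
    """Re-raise unless exceptions are being ignored; otherwise return the
    error to the decorator as a tuple."""
    if not ignore_exceptions:
        raise exc
    return (exc, line)

@add_jc_meta
def parse(data, raw=False, quiet=False, ignore_exceptions=False):
    for line in data:
        try:
            if not line.strip():
                raise ValueError('blank line')
            yield {'line': line.strip()}   # parse the content here
        except Exception as e:
            yield raise_or_yield(ignore_exceptions, e, line)
```

With `ignore_exceptions=True`, a bad input line becomes a `{"_jc_meta": {"success": false, ...}}` object instead of aborting the stream, which is what makes the JSON Lines output usable in long-running pipelines.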

View File

@ -101,14 +101,16 @@ Examples:
...
"""
import jc.utils
from jc.utils import stream_success, stream_error
from jc.streaming import (
add_jc_meta, streaming_input_type_check, streaming_line_input_type_check, raise_or_yield
)
from jc.exceptions import ParseError
import jc.parsers.universal
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.0'
version = '1.1'
description = '`iostat` command streaming parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@ -159,6 +161,8 @@ def _create_obj_list(section_list, section_name):
item['type'] = section_name
return output_list
@add_jc_meta
def parse(data, raw=False, quiet=False, ignore_exceptions=False):
"""
Main text parsing generator function. Returns an iterator object.
@ -178,10 +182,10 @@ def parse(data, raw=False, quiet=False, ignore_exceptions=False):
Returns:
Iterator object
Iterator object (generator)
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.streaming_input_type_check(data)
streaming_input_type_check(data)
section = '' # either 'cpu' or 'device'
headers = ''
@ -189,9 +193,9 @@ def parse(data, raw=False, quiet=False, ignore_exceptions=False):
device_list = []
for line in data:
output_line = {}
try:
jc.utils.streaming_line_input_type_check(line)
streaming_line_input_type_check(line)
output_line = {}
# ignore blank lines and header line
if line == '\n' or line == '' or line.startswith('Linux'):
@ -223,9 +227,9 @@ def parse(data, raw=False, quiet=False, ignore_exceptions=False):
device_list = []
if output_line:
yield stream_success(output_line, ignore_exceptions) if raw else stream_success(_process(output_line), ignore_exceptions)
yield output_line if raw else _process(output_line)
else:
raise ParseError('Not iostat data')
except Exception as e:
yield stream_error(e, ignore_exceptions, line)
yield raise_or_yield(ignore_exceptions, e, line)

View File

@ -122,7 +122,7 @@ import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.10'
version = '1.11'
description = '`ls` command parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@ -154,7 +154,7 @@ def _process(proc_data):
if 'date' in entry:
# to speed up processing only try to convert the date if it's not the default format
if not re.match(r'[a-zA-Z]{3}\s{1,2}\d{1,2}\s{1,2}[0-9:]{4,5}', entry['date']):
ts = jc.utils.timestamp(entry['date'])
ts = jc.utils.timestamp(entry['date'], format_hint=(7200,))
entry['epoch'] = ts.naive
entry['epoch_utc'] = ts.utc
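The `ls` parser change above keeps its optimization of only computing timestamps when the date is not in the default `ls -l` format. That guard regex can be checked in isolation (pattern copied from the diff; the sample date strings are illustrative):

```python
import re

# same pattern the parser uses to detect the default ls date format,
# e.g. 'Feb  8 17:45' or 'Apr 29  2018'
default_date = re.compile(r'[a-zA-Z]{3}\s{1,2}\d{1,2}\s{1,2}[0-9:]{4,5}')

print(bool(default_date.match('Feb  8 17:45')))         # True  (default format, skipped)
print(bool(default_date.match('Apr 29  2018')))         # True  (default format, skipped)
print(bool(default_date.match('2022-01-28 03:53:33')))  # False (full-iso; epoch gets calculated)
```

Only the non-default formats fall through to `jc.utils.timestamp()`, now with a `format_hint` so the matching format string is tried first.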

View File

@ -79,13 +79,15 @@ Examples:
"""
import re
import jc.utils
from jc.utils import stream_success, stream_error
from jc.streaming import (
add_jc_meta, streaming_input_type_check, streaming_line_input_type_check, raise_or_yield
)
from jc.exceptions import ParseError
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '0.6'
version = '1.0'
description = '`ls` command streaming parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@ -116,13 +118,14 @@ def _process(proc_data):
if 'date' in proc_data:
# to speed up processing only try to convert the date if it's not the default format
if not re.match(r'[a-zA-Z]{3}\s{1,2}\d{1,2}\s{1,2}[0-9:]{4,5}', proc_data['date']):
ts = jc.utils.timestamp(proc_data['date'])
ts = jc.utils.timestamp(proc_data['date'], format_hint=(7200,))
proc_data['epoch'] = ts.naive
proc_data['epoch_utc'] = ts.utc
return proc_data
@add_jc_meta
def parse(data, raw=False, quiet=False, ignore_exceptions=False):
"""
Main text parsing generator function. Returns an iterator object.
@ -142,16 +145,16 @@ def parse(data, raw=False, quiet=False, ignore_exceptions=False):
Returns:
Iterator object
Iterator object (generator)
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.streaming_input_type_check(data)
streaming_input_type_check(data)
parent = ''
for line in data:
try:
jc.utils.streaming_line_input_type_check(line)
streaming_line_input_type_check(line)
# skip line if it starts with 'total 1234'
if re.match(r'total [0-9]+', line):
@ -163,7 +166,7 @@ def parse(data, raw=False, quiet=False, ignore_exceptions=False):
# Look for parent line if glob or -R is used
if not re.match(r'[-dclpsbDCMnP?]([-r][-w][-xsS]){2}([-r][-w][-xtT])[+]?', line) \
and line.strip().endswith(':'):
and line.strip().endswith(':'):
parent = line.strip()[:-1]
continue
@ -196,7 +199,7 @@ def parse(data, raw=False, quiet=False, ignore_exceptions=False):
output_line['size'] = parsed_line[4]
output_line['date'] = ' '.join(parsed_line[5:8])
yield stream_success(output_line, ignore_exceptions) if raw else stream_success(_process(output_line), ignore_exceptions)
yield output_line if raw else _process(output_line)
except Exception as e:
yield stream_error(e, ignore_exceptions, line)
yield raise_or_yield(ignore_exceptions, e, line)

View File

@ -86,13 +86,15 @@ Examples:
import string
import ipaddress
import jc.utils
from jc.streaming import (
add_jc_meta, streaming_input_type_check, streaming_line_input_type_check, raise_or_yield
)
from jc.exceptions import ParseError
from jc.utils import stream_success, stream_error
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '0.6'
version = '1.0'
description = '`ping` and `ping6` command streaming parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@ -469,6 +471,7 @@ def _linux_parse(line, s):
return output_line
@add_jc_meta
def parse(data, raw=False, quiet=False, ignore_exceptions=False):
"""
Main text parsing generator function. Returns an iterator object.
@ -488,17 +491,17 @@ def parse(data, raw=False, quiet=False, ignore_exceptions=False):
Returns:
Iterator object
Iterator object (generator)
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
streaming_input_type_check(data)
s = _state()
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.streaming_input_type_check(data)
for line in data:
output_line = {}
try:
jc.utils.streaming_line_input_type_check(line)
streaming_line_input_type_check(line)
output_line = {}
# skip blank lines
if line.strip() == '':
@ -542,9 +545,9 @@ def parse(data, raw=False, quiet=False, ignore_exceptions=False):
# yield the output line if it has data
if output_line:
yield stream_success(output_line, ignore_exceptions) if raw else stream_success(_process(output_line), ignore_exceptions)
yield output_line if raw else _process(output_line)
else:
continue
except Exception as e:
yield stream_error(e, ignore_exceptions, line)
yield raise_or_yield(ignore_exceptions, e, line)

View File

@ -166,7 +166,7 @@ import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.4'
version = '1.5'
description = '`rpm -qi` command parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@ -197,12 +197,12 @@ def _process(proc_data):
entry[key] = jc.utils.convert_to_int(entry[key])
if 'build_date' in entry:
timestamp = jc.utils.timestamp(entry['build_date'])
timestamp = jc.utils.timestamp(entry['build_date'], format_hint=(3000,))
entry['build_epoch'] = timestamp.naive
entry['build_epoch_utc'] = timestamp.utc
if 'install_date' in entry:
timestamp = jc.utils.timestamp(entry['install_date'])
timestamp = jc.utils.timestamp(entry['install_date'], format_hint=(3000,))
entry['install_date_epoch'] = timestamp.naive
entry['install_date_epoch_utc'] = timestamp.utc

jc/parsers/rsync.py (new file, 493 lines added)
View File

@ -0,0 +1,493 @@
"""jc - JSON CLI output utility `rsync` command output parser
Supports the `-i` or `--itemize-changes` options with all levels of
verbosity. This parser will process the STDOUT output or a log file
generated with the `--log-file` option.
Usage (cli):
$ rsync -i -a source/ dest | jc --rsync
or
$ jc rsync -i -a source/ dest
or
$ cat rsync-backup.log | jc --rsync
Usage (module):
import jc
result = jc.parse('rsync', rsync_command_output)
or
import jc.parsers.rsync
result = jc.parsers.rsync.parse(rsync_command_output)
Schema:
[
{
"summary": {
"date": string,
"time": string,
"process": integer,
"sent": integer,
"received": integer,
"total_size": integer,
"matches": integer,
"hash_hits": integer,
"false_alarms": integer,
"data": integer,
"bytes_sec": float,
"speedup": float
},
"files": [
{
"filename": string,
"date": string,
"time": string,
"process": integer,
"metadata": string,
"update_type": string/null, [0]
"file_type": string/null, [1]
"checksum_or_value_different": bool/null,
"size_different": bool/null,
"modification_time_different": bool/null,
"permissions_different": bool/null,
"owner_different": bool/null,
"group_different": bool/null,
"acl_different": bool/null,
"extended_attribute_different": bool/null,
"epoch": integer, [2]
}
]
}
]
[0] 'file sent', 'file received', 'local change or creation',
'hard link', 'not updated', 'message'
[1] 'file', 'directory', 'symlink', 'device', 'special file'
[2] naive timestamp if time and date fields exist and can be converted.
Examples:
$ rsync -i -a source/ dest | jc --rsync -p
[
{
"summary": {
"sent": 1708,
"received": 8209,
"bytes_sec": 19834.0,
"total_size": 235,
"speedup": 0.02
},
"files": [
{
"filename": "./",
"metadata": ".d..t......",
"update_type": "not updated",
"file_type": "directory",
"checksum_or_value_different": false,
"size_different": false,
"modification_time_different": true,
"permissions_different": false,
"owner_different": false,
"group_different": false,
"acl_different": false,
"extended_attribute_different": false
},
...
]
}
]
$ rsync | jc --rsync -p -r
[
{
"summary": {
"sent": "1,708",
"received": "8,209",
"bytes_sec": "19,834.00",
"total_size": "235",
"speedup": "0.02"
},
"files": [
{
"filename": "./",
"metadata": ".d..t......",
"update_type": "not updated",
"file_type": "directory",
"checksum_or_value_different": false,
"size_different": false,
"modification_time_different": true,
"permissions_different": false,
"owner_different": false,
"group_different": false,
"acl_different": false,
"extended_attribute_different": false
},
...
]
}
]
"""
import re
from copy import deepcopy
from typing import List, Dict
import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.0'
description = '`rsync` command parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
compatible = ['linux', 'darwin', 'freebsd']
magic_commands = ['rsync -i', 'rsync --itemize-changes']
__version__ = info.version
def _process(proc_data: List[Dict]) -> List[Dict]:
"""
Final processing to conform to the schema.
Parameters:
proc_data: (List of Dictionaries) raw structured data to process
Returns:
List of Dictionaries. Structured to conform to the schema.
"""
int_list = [
'process', 'sent', 'received', 'total_size', 'matches', 'hash_hits',
'false_alarms', 'data'
]
float_list = ['bytes_sec', 'speedup']
for item in proc_data:
for key in item['summary']:
if key in int_list:
item['summary'][key] = jc.utils.convert_to_int(item['summary'][key])
if key in float_list:
item['summary'][key] = jc.utils.convert_to_float(item['summary'][key])
for entry in item['files']:
for key in entry:
if key in int_list:
entry[key] = jc.utils.convert_to_int(entry[key])
# add timestamp
if 'date' in entry and 'time' in entry:
date = entry['date'].replace('/', '-')
date_time = f'{date} {entry["time"]}'
ts = jc.utils.timestamp(date_time, format_hint=(7250,))
entry['epoch'] = ts.naive
return proc_data
def parse(
data: str,
raw: bool = False,
quiet: bool = False
) -> List[Dict]:
"""
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
List of Dictionaries. Raw or processed structured data.
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.input_type_check(data)
raw_output: List = []
rsync_run_new: Dict = {
'summary': {},
'files': []
}
rsync_run = deepcopy(rsync_run_new)
last_process = ''
update_type = {
'<': 'file sent',
'>': 'file received',
'c': 'local change or creation',
'h': 'hard link',
'.': 'not updated',
'*': 'message',
'+': None
}
file_type = {
'f': 'file',
'd': 'directory',
'L': 'symlink',
'D': 'device',
'S': 'special file',
'+': None
}
checksum_or_value_different = {
'c': True,
'.': False,
'+': None,
' ': None,
'?': None
}
size_different = {
's': True,
'.': False,
'+': None,
' ': None,
'?': None
}
modification_time_different = {
't': True,
'.': False,
'+': None,
' ': None,
'?': None
}
permissions_different = {
'p': True,
'.': False,
'+': None,
' ': None,
'?': None
}
owner_different = {
'o': True,
'.': False,
'+': None,
' ': None,
'?': None
}
group_different = {
'g': True,
'.': False,
'+': None,
' ': None,
'?': None
}
acl_different = {
'a': True,
'.': False,
'+': None,
' ': None,
'?': None
}
extended_attribute_different = {
'x': True,
'.': False,
'+': None,
' ': None,
'?': None
}
file_line_re = re.compile(r'(?P<meta>[<>ch.*][fdlDS][c.+ ?][s.+ ?][t.+ ?][p.+ ?][o.+ ?][g.+ ?][u.+ ?][a.+ ?][x.+ ?]) (?P<name>.+)')
file_line_mac_re = re.compile(r'(?P<meta>[<>ch.*][fdlDS][c.+ ?][s.+ ?][t.+ ?][p.+ ?][o.+ ?][g.+ ?][x.+ ?]) (?P<name>.+)')
stat1_line_re = re.compile(r'(sent)\s+(?P<sent>[0-9,]+)\s+(bytes)\s+(received)\s+(?P<received>[0-9,]+)\s+(bytes)\s+(?P<bytes_sec>[0-9,.]+)\s+(bytes/sec)')
stat2_line_re = re.compile(r'(total size is)\s+(?P<total_size>[0-9,]+)\s+(speedup is)\s+(?P<speedup>[0-9,.]+)')
file_line_log_re = re.compile(r'(?P<date>\d\d\d\d/\d\d/\d\d)\s+(?P<time>\d\d:\d\d:\d\d)\s+\[(?P<process>\d+)\]\s+(?P<meta>[<>ch.*][fdlDS][c.+ ?][s.+ ?][t.+ ?][p.+ ?][o.+ ?][g.+ ?][u.+ ?][a.+ ?][x.+ ?]) (?P<name>.+)')
file_line_log_mac_re = re.compile(r'(?P<date>\d\d\d\d/\d\d/\d\d)\s+(?P<time>\d\d:\d\d:\d\d)\s+\[(?P<process>\d+)\]\s+(?P<meta>[<>ch.*][fdlDS][c.+ ?][s.+ ?][t.+ ?][p.+ ?][o.+ ?][g.+ ?][x.+ ?]) (?P<name>.+)')
stat_line_log_re = re.compile(r'(?P<date>\d\d\d\d/\d\d/\d\d)\s+(?P<time>\d\d:\d\d:\d\d)\s+\[(?P<process>\d+)\]\s+sent\s+(?P<sent>[\d,]+)\s+bytes\s+received\s+(?P<received>[\d,]+)\s+bytes\s+total\s+size\s+(?P<total_size>[\d,]+)')
stat1_line_log_v_re = re.compile(r'(?P<date>\d\d\d\d/\d\d/\d\d)\s+(?P<time>\d\d:\d\d:\d\d)\s+\[(?P<process>\d+)]\s+total:\s+matches=(?P<matches>[\d,]+)\s+hash_hits=(?P<hash_hits>[\d,]+)\s+false_alarms=(?P<false_alarms>[\d,]+)\s+data=(?P<data>[\d,]+)')
stat2_line_log_v_re = re.compile(r'(?P<date>\d\d\d\d/\d\d/\d\d)\s+(?P<time>\d\d:\d\d:\d\d)\s+\[(?P<process>\d+)\]\s+sent\s+(?P<sent>[\d,]+)\s+bytes\s+received\s+(?P<received>[\d,]+)\s+bytes\s+(?P<bytes_sec>[\d,.]+)\s+bytes/sec')
stat3_line_log_v_re = re.compile(r'(?P<date>\d\d\d\d/\d\d/\d\d)\s+(?P<time>\d\d:\d\d:\d\d)\s+\[(?P<process>\d+)]\s+total\s+size\s+is\s+(?P<total_size>[\d,]+)\s+speedup\s+is\s+(?P<speedup>[\d,.]+)')
if jc.utils.has_data(data):
for line in filter(None, data.splitlines()):
file_line = file_line_re.match(line)
if file_line:
filename = file_line.group('name')
meta = file_line.group('meta')
output_line = {
'filename': filename,
'metadata': meta,
'update_type': update_type[meta[0]],
'file_type': file_type[meta[1]],
'checksum_or_value_different': checksum_or_value_different[meta[2]],
'size_different': size_different[meta[3]],
'modification_time_different': modification_time_different[meta[4]],
'permissions_different': permissions_different[meta[5]],
'owner_different': owner_different[meta[6]],
'group_different': group_different[meta[7]],
'acl_different': acl_different[meta[9]],
'extended_attribute_different': extended_attribute_different[meta[10]]
}
rsync_run['files'].append(output_line)
continue
file_line_mac = file_line_mac_re.match(line)
if file_line_mac:
filename = file_line_mac.group('name')
meta = file_line_mac.group('meta')
output_line = {
'filename': filename,
'metadata': meta,
'update_type': update_type[meta[0]],
'file_type': file_type[meta[1]],
'checksum_or_value_different': checksum_or_value_different[meta[2]],
'size_different': size_different[meta[3]],
'modification_time_different': modification_time_different[meta[4]],
'permissions_different': permissions_different[meta[5]],
'owner_different': owner_different[meta[6]],
'group_different': group_different[meta[7]]
}
rsync_run['files'].append(output_line)
continue
file_line_log = file_line_log_re.match(line)
if file_line_log:
filename = file_line_log.group('name')
date = file_line_log.group('date')
time = file_line_log.group('time')
process = file_line_log.group('process')
meta = file_line_log.group('meta')
if process != last_process:
raw_output.append(rsync_run)
rsync_run = deepcopy(rsync_run_new)
last_process = process
output_line = {
'filename': filename,
'date': date,
'time': time,
'process': process,
'metadata': meta,
'update_type': update_type[meta[0]],
'file_type': file_type[meta[1]],
'checksum_or_value_different': checksum_or_value_different[meta[2]],
'size_different': size_different[meta[3]],
'modification_time_different': modification_time_different[meta[4]],
'permissions_different': permissions_different[meta[5]],
'owner_different': owner_different[meta[6]],
'group_different': group_different[meta[7]],
'acl_different': acl_different[meta[9]],
'extended_attribute_different': extended_attribute_different[meta[10]]
}
rsync_run['files'].append(output_line)
continue
file_line_log_mac = file_line_log_mac_re.match(line)
if file_line_log_mac:
filename = file_line_log_mac.group('name')
date = file_line_log_mac.group('date')
time = file_line_log_mac.group('time')
process = file_line_log_mac.group('process')
meta = file_line_log_mac.group('meta')
if process != last_process:
raw_output.append(rsync_run)
rsync_run = deepcopy(rsync_run_new)
last_process = process
output_line = {
'filename': filename,
'date': date,
'time': time,
'process': process,
'metadata': meta,
'update_type': update_type[meta[0]],
'file_type': file_type[meta[1]],
'checksum_or_value_different': checksum_or_value_different[meta[2]],
'size_different': size_different[meta[3]],
'modification_time_different': modification_time_different[meta[4]],
'permissions_different': permissions_different[meta[5]],
'owner_different': owner_different[meta[6]],
'group_different': group_different[meta[7]]
}
rsync_run['files'].append(output_line)
continue
stat1_line = stat1_line_re.match(line)
if stat1_line:
rsync_run['summary'] = {
'sent': stat1_line.group('sent'),
'received': stat1_line.group('received'),
'bytes_sec': stat1_line.group('bytes_sec')
}
continue
stat2_line = stat2_line_re.match(line)
if stat2_line:
rsync_run['summary']['total_size'] = stat2_line.group('total_size')
rsync_run['summary']['speedup'] = stat2_line.group('speedup')
continue
stat_line_log = stat_line_log_re.match(line)
if stat_line_log:
rsync_run['summary'] = {
'date': stat_line_log.group('date'),
'time': stat_line_log.group('time'),
'process': stat_line_log.group('process'),
'sent': stat_line_log.group('sent'),
'received': stat_line_log.group('received'),
'total_size': stat_line_log.group('total_size')
}
continue
stat1_line_log_v = stat1_line_log_v_re.match(line)
if stat1_line_log_v:
rsync_run['summary'] = {
'date': stat1_line_log_v.group('date'),
'time': stat1_line_log_v.group('time'),
'process': stat1_line_log_v.group('process'),
'matches': stat1_line_log_v.group('matches'),
'hash_hits': stat1_line_log_v.group('hash_hits'),
'false_alarms': stat1_line_log_v.group('false_alarms'),
'data': stat1_line_log_v.group('data')
}
continue
stat2_line_log_v = stat2_line_log_v_re.match(line)
if stat2_line_log_v:
rsync_run['summary']['sent'] = stat2_line_log_v.group('sent')
rsync_run['summary']['received'] = stat2_line_log_v.group('received')
rsync_run['summary']['bytes_sec'] = stat2_line_log_v.group('bytes_sec')
continue
stat3_line_log_v = stat3_line_log_v_re.match(line)
if stat3_line_log_v:
rsync_run['summary']['total_size'] = stat3_line_log_v.group('total_size')
rsync_run['summary']['speedup'] = stat3_line_log_v.group('speedup')
continue
raw_output.append(rsync_run)
# cleanup blank entries
raw_output = [run for run in raw_output if run != rsync_run_new]
return raw_output if raw else _process(raw_output)
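The lookup tables above translate each position of rsync's `--itemize-changes` metadata string (e.g. `.d..t......`). A standalone sketch of that positional decoding for the first eight positions (the Linux parser additionally reads the ACL and xattr flags at positions 9 and 10; this helper is illustrative, not the parser itself):

```python
# positional decode of an rsync --itemize-changes string like '.d..t......'
UPDATE_TYPE = {'<': 'file sent', '>': 'file received',
               'c': 'local change or creation', 'h': 'hard link',
               '.': 'not updated', '*': 'message', '+': None}
FILE_TYPE = {'f': 'file', 'd': 'directory', 'L': 'symlink',
             'D': 'device', 'S': 'special file', '+': None}

def attr_flag(char, flag):
    # the attribute letter means changed; '.' unchanged; '+', ' ', '?' unknown
    return {flag: True, '.': False}.get(char)

def decode(meta):
    return {
        'update_type': UPDATE_TYPE[meta[0]],
        'file_type': FILE_TYPE[meta[1]],
        'checksum_or_value_different': attr_flag(meta[2], 'c'),
        'size_different': attr_flag(meta[3], 's'),
        'modification_time_different': attr_flag(meta[4], 't'),
        'permissions_different': attr_flag(meta[5], 'p'),
        'owner_different': attr_flag(meta[6], 'o'),
        'group_different': attr_flag(meta[7], 'g'),
    }

result = decode('.d..t......')
print(result['update_type'], result['file_type'],
      result['modification_time_different'])  # not updated directory True
```

Newly created items (`>f+++++++++`) decode to `None` for the attribute flags, matching the `bool/null` types in the schema above.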

jc/parsers/rsync_s.py (new file, 465 lines added)
View File

@ -0,0 +1,465 @@
"""jc - JSON CLI output utility `rsync` command output streaming parser
> This streaming parser outputs JSON Lines
Supports the `-i` or `--itemize-changes` options with all levels of
verbosity. This parser will process the STDOUT output or a log file
generated with the `--log-file` option.
Usage (cli):
$ rsync -i -a source/ dest | jc --rsync-s
or
$ cat rsync-backup.log | jc --rsync-s
Usage (module):
import jc
# result is an iterable object (generator)
result = jc.parse('rsync_s', rsync_command_output.splitlines())
for item in result:
# do something
or
import jc.parsers.rsync_s
# result is an iterable object (generator)
result = jc.parsers.rsync_s.parse(rsync_command_output.splitlines())
for item in result:
# do something
Schema:
{
"type": string, # 'file' or 'summary'
"date": string,
"time": string,
"process": integer,
"sent": integer,
"received": integer,
"total_size": integer,
"matches": integer,
"hash_hits": integer,
"false_alarms": integer,
"data": integer,
"bytes_sec": float,
"speedup": float,
"filename": string,
"date": string,
"time": string,
"process": integer,
"metadata": string,
"update_type": string/null, [0]
"file_type": string/null, [1]
"checksum_or_value_different": bool/null,
"size_different": bool/null,
"modification_time_different": bool/null,
"permissions_different": bool/null,
"owner_different": bool/null,
"group_different": bool/null,
"acl_different": bool/null,
"extended_attribute_different": bool/null,
"epoch": integer, [2]
# Below object only exists if using -qq or ignore_exceptions=True
"_jc_meta":
{
"success": boolean, # false if error parsing
"error": string, # exists if "success" is false
"line": string # exists if "success" is false
}
}
[0] 'file sent', 'file received', 'local change or creation',
'hard link', 'not updated', 'message'
[1] 'file', 'directory', 'symlink', 'device', 'special file'
[2] naive timestamp if time and date fields exist and can be converted.
Examples:
$ rsync -i -a source/ dest | jc --rsync-s
{"type":"file","filename":"./","metadata":".d..t......","update_...}
...
$ cat rsync_backup.log | jc --rsync-s
{"type":"file","filename":"./","date":"2022/01/28","time":"03:53...}
...
"""
import re
from typing import Dict, Iterable, Union
import jc.utils
from jc.streaming import (
add_jc_meta, streaming_input_type_check, streaming_line_input_type_check, raise_or_yield
)
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.0'
description = '`rsync` command streaming parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
compatible = ['linux', 'darwin', 'freebsd']
streaming = True
__version__ = info.version
def _process(proc_data: Dict) -> Dict:
"""
Final processing to conform to the schema.
Parameters:
proc_data: (Dictionary) raw structured data to process
Returns:
Dictionary. Structured data to conform to the schema.
"""
int_list = [
'process', 'sent', 'received', 'total_size', 'matches', 'hash_hits',
'false_alarms', 'data'
]
float_list = ['bytes_sec', 'speedup']
for key in proc_data.copy():
if key in int_list:
proc_data[key] = jc.utils.convert_to_int(proc_data[key])
if key in float_list:
proc_data[key] = jc.utils.convert_to_float(proc_data[key])
# add timestamp
if 'date' in proc_data and 'time' in proc_data:
date = proc_data['date'].replace('/', '-')
date_time = f'{date} {proc_data["time"]}'
ts = jc.utils.timestamp(date_time, format_hint=(7250,))
proc_data['epoch'] = ts.naive
return proc_data
@add_jc_meta
def parse(
data: Iterable[str],
raw: bool = False,
quiet: bool = False,
ignore_exceptions: bool = False
) -> Union[Iterable[Dict], tuple]:
"""
Main text parsing generator function. Returns an iterator object.
Parameters:
data: (iterable) line-based text data to parse
(e.g. sys.stdin or str.splitlines())
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
ignore_exceptions: (boolean) ignore parsing exceptions if True
Yields:
Dictionary. Raw or processed structured data.
Returns:
Iterator object (generator)
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
streaming_input_type_check(data)
summary: Dict = {}
process: str = ''
last_process: str = ''
update_type = {
'<': 'file sent',
'>': 'file received',
'c': 'local change or creation',
'h': 'hard link',
'.': 'not updated',
'*': 'message',
'+': None
}
file_type = {
'f': 'file',
'd': 'directory',
'L': 'symlink',
'D': 'device',
'S': 'special file',
'+': None
}
checksum_or_value_different = {
'c': True,
'.': False,
'+': None,
' ': None,
'?': None
}
size_different = {
's': True,
'.': False,
'+': None,
' ': None,
'?': None
}
modification_time_different = {
't': True,
'.': False,
'+': None,
' ': None,
'?': None
}
permissions_different = {
'p': True,
'.': False,
'+': None,
' ': None,
'?': None
}
owner_different = {
'o': True,
'.': False,
'+': None,
' ': None,
'?': None
}
group_different = {
'g': True,
'.': False,
'+': None,
' ': None,
'?': None
}
acl_different = {
'a': True,
'.': False,
'+': None,
' ': None,
'?': None
}
extended_attribute_different = {
'x': True,
'.': False,
'+': None,
' ': None,
'?': None
}
file_line_re = re.compile(r'(?P<meta>[<>ch.*][fdlDS][c.+ ?][s.+ ?][t.+ ?][p.+ ?][o.+ ?][g.+ ?][u.+ ?][a.+ ?][x.+ ?]) (?P<name>.+)')
file_line_mac_re = re.compile(r'(?P<meta>[<>ch.*][fdlDS][c.+ ?][s.+ ?][t.+ ?][p.+ ?][o.+ ?][g.+ ?][x.+ ?]) (?P<name>.+)')
stat1_line_re = re.compile(r'(sent)\s+(?P<sent>[0-9,]+)\s+(bytes)\s+(received)\s+(?P<received>[0-9,]+)\s+(bytes)\s+(?P<bytes_sec>[0-9,.]+)\s+(bytes/sec)')
stat2_line_re = re.compile(r'(total size is)\s+(?P<total_size>[0-9,]+)\s+(speedup is)\s+(?P<speedup>[0-9,.]+)')
file_line_log_re = re.compile(r'(?P<date>\d\d\d\d/\d\d/\d\d)\s+(?P<time>\d\d:\d\d:\d\d)\s+\[(?P<process>\d+)\]\s+(?P<meta>[<>ch.*][fdlDS][c.+ ?][s.+ ?][t.+ ?][p.+ ?][o.+ ?][g.+ ?][u.+ ?][a.+ ?][x.+ ?]) (?P<name>.+)')
file_line_log_mac_re = re.compile(r'(?P<date>\d\d\d\d/\d\d/\d\d)\s+(?P<time>\d\d:\d\d:\d\d)\s+\[(?P<process>\d+)\]\s+(?P<meta>[<>ch.*][fdlDS][c.+ ?][s.+ ?][t.+ ?][p.+ ?][o.+ ?][g.+ ?][x.+ ?]) (?P<name>.+)')
stat_line_log_re = re.compile(r'(?P<date>\d\d\d\d/\d\d/\d\d)\s+(?P<time>\d\d:\d\d:\d\d)\s+\[(?P<process>\d+)\]\s+sent\s+(?P<sent>[\d,]+)\s+bytes\s+received\s+(?P<received>[\d,]+)\s+bytes\s+total\s+size\s+(?P<total_size>[\d,]+)')
stat1_line_log_v_re = re.compile(r'(?P<date>\d\d\d\d/\d\d/\d\d)\s+(?P<time>\d\d:\d\d:\d\d)\s+\[(?P<process>\d+)]\s+total:\s+matches=(?P<matches>[\d,]+)\s+hash_hits=(?P<hash_hits>[\d,]+)\s+false_alarms=(?P<false_alarms>[\d,]+)\s+data=(?P<data>[\d,]+)')
stat2_line_log_v_re = re.compile(r'(?P<date>\d\d\d\d/\d\d/\d\d)\s+(?P<time>\d\d:\d\d:\d\d)\s+\[(?P<process>\d+)\]\s+sent\s+(?P<sent>[\d,]+)\s+bytes\s+received\s+(?P<received>[\d,]+)\s+bytes\s+(?P<bytes_sec>[\d,.]+)\s+bytes/sec')
stat3_line_log_v_re = re.compile(r'(?P<date>\d\d\d\d/\d\d/\d\d)\s+(?P<time>\d\d:\d\d:\d\d)\s+\[(?P<process>\d+)]\s+total\s+size\s+is\s+(?P<total_size>[\d,]+)\s+speedup\s+is\s+(?P<speedup>[\d,.]+)')
for line in data:
try:
streaming_line_input_type_check(line)
output_line: Dict = {}
# ignore blank lines
if line == '':
continue
file_line = file_line_re.match(line)
if file_line:
filename = file_line.group('name')
meta = file_line.group('meta')
output_line = {
'type': 'file',
'filename': filename,
'metadata': meta,
'update_type': update_type[meta[0]],
'file_type': file_type[meta[1]],
'checksum_or_value_different': checksum_or_value_different[meta[2]],
'size_different': size_different[meta[3]],
'modification_time_different': modification_time_different[meta[4]],
'permissions_different': permissions_different[meta[5]],
'owner_different': owner_different[meta[6]],
'group_different': group_different[meta[7]],
'acl_different': acl_different[meta[9]],
'extended_attribute_different': extended_attribute_different[meta[10]]
}
yield output_line if raw else _process(output_line)
continue
file_line_mac = file_line_mac_re.match(line)
if file_line_mac:
filename = file_line_mac.group('name')
meta = file_line_mac.group('meta')
output_line = {
'type': 'file',
'filename': filename,
'metadata': meta,
'update_type': update_type[meta[0]],
'file_type': file_type[meta[1]],
'checksum_or_value_different': checksum_or_value_different[meta[2]],
'size_different': size_different[meta[3]],
'modification_time_different': modification_time_different[meta[4]],
'permissions_different': permissions_different[meta[5]],
'owner_different': owner_different[meta[6]],
'group_different': group_different[meta[7]]
}
yield output_line if raw else _process(output_line)
continue
file_line_log = file_line_log_re.match(line)
if file_line_log:
if process != last_process:
if summary:
yield summary if raw else _process(summary)
last_process = process
summary = {}
filename = file_line_log.group('name')
date = file_line_log.group('date')
time = file_line_log.group('time')
process = file_line_log.group('process')
meta = file_line_log.group('meta')
output_line = {
'type': 'file',
'filename': filename,
'date': date,
'time': time,
'process': process,
'metadata': meta,
'update_type': update_type[meta[0]],
'file_type': file_type[meta[1]],
'checksum_or_value_different': checksum_or_value_different[meta[2]],
'size_different': size_different[meta[3]],
'modification_time_different': modification_time_different[meta[4]],
'permissions_different': permissions_different[meta[5]],
'owner_different': owner_different[meta[6]],
'group_different': group_different[meta[7]],
'acl_different': acl_different[meta[9]],
'extended_attribute_different': extended_attribute_different[meta[10]]
}
yield output_line if raw else _process(output_line)
continue
file_line_log_mac = file_line_log_mac_re.match(line)
if file_line_log_mac:
if process != last_process:
if summary:
yield summary if raw else _process(summary)
last_process = process
summary = {}
filename = file_line_log_mac.group('name')
date = file_line_log_mac.group('date')
time = file_line_log_mac.group('time')
process = file_line_log_mac.group('process')
meta = file_line_log_mac.group('meta')
output_line = {
'type': 'file',
'filename': filename,
'date': date,
'time': time,
'process': process,
'metadata': meta,
'update_type': update_type[meta[0]],
'file_type': file_type[meta[1]],
'checksum_or_value_different': checksum_or_value_different[meta[2]],
'size_different': size_different[meta[3]],
'modification_time_different': modification_time_different[meta[4]],
'permissions_different': permissions_different[meta[5]],
'owner_different': owner_different[meta[6]],
'group_different': group_different[meta[7]]
}
yield output_line if raw else _process(output_line)
continue
stat1_line = stat1_line_re.match(line)
if stat1_line:
summary = {
'type': 'summary',
'sent': stat1_line.group('sent'),
'received': stat1_line.group('received'),
'bytes_sec': stat1_line.group('bytes_sec')
}
continue
stat2_line = stat2_line_re.match(line)
if stat2_line:
summary['total_size'] = stat2_line.group('total_size')
summary['speedup'] = stat2_line.group('speedup')
continue
stat_line_log = stat_line_log_re.match(line)
if stat_line_log:
summary = {
'type': 'summary',
'date': stat_line_log.group('date'),
'time': stat_line_log.group('time'),
'process': stat_line_log.group('process'),
'sent': stat_line_log.group('sent'),
'received': stat_line_log.group('received'),
'total_size': stat_line_log.group('total_size')
}
continue
stat1_line_log_v = stat1_line_log_v_re.match(line)
if stat1_line_log_v:
summary = {
'type': 'summary',
'date': stat1_line_log_v.group('date'),
'time': stat1_line_log_v.group('time'),
'process': stat1_line_log_v.group('process'),
'matches': stat1_line_log_v.group('matches'),
'hash_hits': stat1_line_log_v.group('hash_hits'),
'false_alarms': stat1_line_log_v.group('false_alarms'),
'data': stat1_line_log_v.group('data')
}
continue
stat2_line_log_v = stat2_line_log_v_re.match(line)
if stat2_line_log_v:
summary['sent'] = stat2_line_log_v.group('sent')
summary['received'] = stat2_line_log_v.group('received')
summary['bytes_sec'] = stat2_line_log_v.group('bytes_sec')
continue
stat3_line_log_v = stat3_line_log_v_re.match(line)
if stat3_line_log_v:
summary['total_size'] = stat3_line_log_v.group('total_size')
summary['speedup'] = stat3_line_log_v.group('speedup')
continue
except Exception as e:
yield raise_or_yield(ignore_exceptions, e, line)
# gather final item
try:
if summary:
yield summary if raw else _process(summary)
except Exception as e:
yield raise_or_yield(ignore_exceptions, e, '')
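For reference, the single-character metadata columns decoded above can be exercised on their own. A minimal standalone sketch using the same column semantics as the lookup tables in the parser (the sample itemized-change string is hypothetical):

```python
# Decode an rsync -i itemized-change string (e.g. '>f.st......')
# using the same column meanings as the lookup tables above.
update_type = {'<': 'file sent', '>': 'file received',
               'c': 'local change or creation', 'h': 'hard link',
               '.': 'not updated', '*': 'message', '+': None}
file_type = {'f': 'file', 'd': 'directory', 'L': 'symlink',
             'D': 'device', 'S': 'special file', '+': None}

def decode_meta(meta: str) -> dict:
    # columns 3-8 are attribute flags: a letter means "different",
    # '.' means same, and '+', ' ', '?' carry no information
    flags = ['checksum', 'size', 'mtime', 'perms', 'owner', 'group']
    result = {'update_type': update_type[meta[0]],
              'file_type': file_type[meta[1]]}
    for name, ch in zip(flags, meta[2:8]):
        result[name + '_different'] = None if ch in '+ ?' else ch != '.'
    return result

print(decode_meta('>f.st......'))
```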


@ -176,7 +176,7 @@ import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.10'
version = '1.11'
description = '`stat` command parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@ -213,7 +213,7 @@ def _process(proc_data):
if key in entry:
if entry[key] == '-':
entry[key] = None
ts = jc.utils.timestamp(entry[key])
ts = jc.utils.timestamp(entry[key], format_hint=(7100, 7200))
entry[key + '_epoch'] = ts.naive
entry[key + '_epoch_utc'] = ts.utc


@ -83,13 +83,15 @@ Examples:
"""
import shlex
import jc.utils
from jc.utils import stream_success, stream_error
from jc.streaming import (
add_jc_meta, streaming_input_type_check, streaming_line_input_type_check, raise_or_yield
)
from jc.exceptions import ParseError
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '0.5'
version = '1.0'
description = '`stat` command streaming parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@ -124,12 +126,14 @@ def _process(proc_data):
if key in proc_data:
if proc_data[key] == '-':
proc_data[key] = None
ts = jc.utils.timestamp(proc_data[key])
ts = jc.utils.timestamp(proc_data[key], format_hint=(7100, 7200))
proc_data[key + '_epoch'] = ts.naive
proc_data[key + '_epoch_utc'] = ts.utc
return proc_data
@add_jc_meta
def parse(data, raw=False, quiet=False, ignore_exceptions=False):
"""
Main text parsing generator function. Returns an iterator object.
@ -149,17 +153,17 @@ def parse(data, raw=False, quiet=False, ignore_exceptions=False):
Returns:
Iterator object
Iterator object (generator)
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.streaming_input_type_check(data)
streaming_input_type_check(data)
output_line = {}
os_type = ''
for line in data:
try:
jc.utils.streaming_line_input_type_check(line)
streaming_line_input_type_check(line)
line = line.rstrip()
# ignore blank lines
@ -175,7 +179,7 @@ def parse(data, raw=False, quiet=False, ignore_exceptions=False):
# line #1
if line.startswith(' File: '):
if output_line:
yield stream_success(output_line, ignore_exceptions) if raw else stream_success(_process(output_line), ignore_exceptions)
yield output_line if raw else _process(output_line)
output_line = {}
line_list = line.split(maxsplit=1)
@ -281,16 +285,16 @@ def parse(data, raw=False, quiet=False, ignore_exceptions=False):
}
if output_line:
yield stream_success(output_line, ignore_exceptions) if raw else stream_success(_process(output_line), ignore_exceptions)
yield output_line if raw else _process(output_line)
output_line = {}
except Exception as e:
yield stream_error(e, ignore_exceptions, line)
output_line = {}
yield raise_or_yield(ignore_exceptions, e, line)
# gather final item
if output_line:
try:
yield stream_success(output_line, ignore_exceptions) if raw else stream_success(_process(output_line), ignore_exceptions)
except Exception as e:
yield stream_error(e, ignore_exceptions, line)
try:
if output_line:
yield output_line if raw else _process(output_line)
except Exception as e:
yield raise_or_yield(ignore_exceptions, e, '')


@ -217,7 +217,7 @@ import jc.utils
class info:
"""Provides parser metadata (version, author, etc.)"""
version = "1.1"
version = "1.2"
description = "`systeminfo` command parser"
author = "Jon Smith"
author_email = "jon@rebelliondefense.com"
@ -279,8 +279,10 @@ def _process(proc_data):
# to: (UTC-0800)
tz_fields = tz.split()
tz = " " + tz_fields[0].replace(":", "")
proc_data[key + '_epoch'] = jc.utils.timestamp(f"{proc_data.get(key)}{tz}").naive
proc_data[key + '_epoch_utc'] = jc.utils.timestamp(f"{proc_data.get(key)}{tz}").utc
ts_formats = (1700, 1705, 1710)
ts = jc.utils.timestamp(f"{proc_data.get(key)}{tz}", format_hint=ts_formats)
proc_data[key + '_epoch'] = ts.naive
proc_data[key + '_epoch_utc'] = ts.utc
hyperv_key = "hyperv_requirements"
hyperv_subkey_list = [


@ -69,7 +69,7 @@ import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.5'
version = '1.6'
description = '`timedatectl status` command parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@ -99,7 +99,7 @@ def _process(proc_data):
proc_data[key] = jc.utils.convert_to_bool(proc_data[key])
if 'universal_time' in proc_data:
ts = jc.utils.timestamp(proc_data['universal_time'])
ts = jc.utils.timestamp(proc_data['universal_time'], format_hint=(7300,))
proc_data['epoch_utc'] = ts.utc
return proc_data


@ -2,7 +2,7 @@
import string
from typing import List, Dict, Optional
from typing import List, Dict
def simple_table_parse(data: List[str]) -> List[Dict]:
@ -33,7 +33,7 @@ def simple_table_parse(data: List[str]) -> List[Dict]:
return raw_output
def sparse_table_parse(data: List[str], delim: Optional[str] ='\u2063') -> List[Dict]:
def sparse_table_parse(data: List[str], delim: str = '\u2063') -> List[Dict]:
"""
Parse tables with missing column data or with spaces in column data.
@ -60,10 +60,10 @@ def sparse_table_parse(data: List[str], delim: Optional[str] ='\u2063') -> List[
List of Dictionaries
"""
output = []
header_text = data.pop(0)
output: List = []
header_text: str = data.pop(0)
header_text = header_text + ' '
header_list = header_text.split()
header_list: List = header_text.split()
# find each column index and end position
header_search = [header_list[0]]


@ -203,7 +203,7 @@ import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.3'
version = '1.4'
description = '`upower` command parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@ -234,7 +234,8 @@ def _process(proc_data):
entry['updated_seconds_ago'] = jc.utils.convert_to_int(updated_list[-3])
if entry['updated']:
ts = jc.utils.timestamp(entry['updated'])
hints = (1000, 2000, 3000, 4000, 5000, 8000, 8100)
ts = jc.utils.timestamp(entry['updated'], format_hint=hints)
entry['updated_epoch'] = ts.naive
entry['updated_epoch_utc'] = ts.utc


@ -131,7 +131,7 @@ import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.1'
version = '1.2'
description = '`vmstat` command parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@ -169,7 +169,8 @@ def _process(proc_data):
entry[key] = jc.utils.convert_to_int(entry[key])
if entry['timestamp']:
ts = jc.utils.timestamp(f'{entry["timestamp"]} {entry["timezone"]}')
fmt_hint = (7250, 7255)
ts = jc.utils.timestamp(f'{entry["timestamp"]} {entry["timezone"]}', format_hint=fmt_hint)
entry['epoch'] = ts.naive
entry['epoch_utc'] = ts.utc


@ -101,13 +101,15 @@ Examples:
...
"""
import jc.utils
from jc.utils import stream_success, stream_error
from jc.streaming import (
add_jc_meta, streaming_input_type_check, streaming_line_input_type_check, raise_or_yield
)
from jc.exceptions import ParseError
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '0.6'
version = '1.0'
description = '`vmstat` command streaming parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@ -143,13 +145,15 @@ def _process(proc_data):
proc_data[key] = jc.utils.convert_to_int(proc_data[key])
if proc_data['timestamp']:
ts = jc.utils.timestamp(f'{proc_data["timestamp"]} {proc_data["timezone"]}')
fmt_hint = (7250, 7255)
ts = jc.utils.timestamp(f'{proc_data["timestamp"]} {proc_data["timezone"]}', format_hint=fmt_hint)
proc_data['epoch'] = ts.naive
proc_data['epoch_utc'] = ts.utc
return proc_data
@add_jc_meta
def parse(data, raw=False, quiet=False, ignore_exceptions=False):
"""
Main text parsing generator function. Returns an iterator object.
@ -169,10 +173,10 @@ def parse(data, raw=False, quiet=False, ignore_exceptions=False):
Returns:
Iterator object
Iterator object (generator)
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.streaming_input_type_check(data)
streaming_input_type_check(data)
procs = None
buff_cache = None
@ -181,9 +185,9 @@ def parse(data, raw=False, quiet=False, ignore_exceptions=False):
tz = None
for line in data:
output_line = {}
try:
jc.utils.streaming_line_input_type_check(line)
streaming_line_input_type_check(line)
output_line = {}
# skip blank lines
if line.strip() == '':
@ -266,9 +270,9 @@ def parse(data, raw=False, quiet=False, ignore_exceptions=False):
}
if output_line:
yield stream_success(output_line, ignore_exceptions) if raw else stream_success(_process(output_line), ignore_exceptions)
yield output_line if raw else _process(output_line)
else:
raise ParseError('Not vmstat data')
except Exception as e:
yield stream_error(e, ignore_exceptions, line)
yield raise_or_yield(ignore_exceptions, e, line)


@ -141,7 +141,7 @@ import jc.utils
class info():
"""Provides parser metadata (version, author, etc.)"""
version = '1.5'
version = '1.6'
description = '`who` command parser'
author = 'Kelly Brazil'
author_email = 'kellyjonbrazil@gmail.com'
@ -171,7 +171,7 @@ def _process(proc_data):
entry[key] = jc.utils.convert_to_int(entry[key])
if 'time' in entry:
ts = jc.utils.timestamp(entry['time'])
ts = jc.utils.timestamp(entry['time'], format_hint=(1500,))
entry['epoch'] = ts.naive
return proc_data

jc/parsers/xrandr.py Normal file

@ -0,0 +1,380 @@
"""jc - JSON CLI output utility `xrandr` command output parser
Usage (cli):
$ xrandr | jc --xrandr
or
$ jc xrandr
Usage (module):
import jc
result = jc.parse('xrandr', xrandr_command_output)
or
import jc.parsers.xrandr
result = jc.parsers.xrandr.parse(xrandr_command_output)
Schema:
{
"screens": [
{
"screen_number": integer,
"minimum_width": integer,
"minimum_height": integer,
"current_width": integer,
"current_height": integer,
"maximum_width": integer,
"maximum_height": integer,
"associated_device": {
"associated_modes": [
{
"resolution_width": integer,
"resolution_height": integer,
"is_high_resolution": boolean,
"frequencies": [
{
"frequency": float,
"is_current": boolean,
"is_preferred": boolean
}
],
"is_connected": boolean,
"is_primary": boolean,
"device_name": string,
"resolution_width": integer,
"resolution_height": integer,
"offset_width": integer,
"offset_height": integer,
"dimension_width": integer,
"dimension_height": integer
}
}
],
"unassociated_devices": [
{
"associated_modes": [
{
"resolution_width": integer,
"resolution_height": integer,
"is_high_resolution": boolean,
"frequencies": [
{
"frequency": float,
"is_current": boolean,
"is_preferred": boolean
}
]
}
]
}
]
}
Examples:
$ xrandr | jc --xrandr -p
{
"screens": [
{
"screen_number": 0,
"minimum_width": 8,
"minimum_height": 8,
"current_width": 1920,
"current_height": 1080,
"maximum_width": 32767,
"maximum_height": 32767,
"associated_device": {
"associated_modes": [
{
"resolution_width": 1920,
"resolution_height": 1080,
"is_high_resolution": false,
"frequencies": [
{
"frequency": 60.03,
"is_current": true,
"is_preferred": true
},
{
"frequency": 59.93,
"is_current": false,
"is_preferred": false
}
]
},
{
"resolution_width": 1680,
"resolution_height": 1050,
"is_high_resolution": false,
"frequencies": [
{
"frequency": 59.88,
"is_current": false,
"is_preferred": false
}
]
}
],
"is_connected": true,
"is_primary": true,
"device_name": "eDP1",
"resolution_width": 1920,
"resolution_height": 1080,
"offset_width": 0,
"offset_height": 0,
"dimension_width": 310,
"dimension_height": 170
}
}
],
"unassociated_devices": []
}
"""
import re
from typing import Dict, List, Optional, Union
import jc.utils
class info:
"""Provides parser metadata (version, author, etc.)"""
version = "1.0"
description = "`xrandr` command parser"
author = "Kevin Lyter"
author_email = "lyter_git at sent.com"
# compatible options: linux, darwin, cygwin, win32, aix, freebsd
compatible = ["linux", "darwin", "cygwin", "aix", "freebsd"]
magic_commands = ["xrandr"]
__version__ = info.version
try:
from typing import TypedDict
Frequency = TypedDict(
"Frequency",
{
"frequency": float,
"is_current": bool,
"is_preferred": bool,
},
)
Mode = TypedDict(
"Mode",
{
"resolution_width": int,
"resolution_height": int,
"is_high_resolution": bool,
"frequencies": List[Frequency],
},
)
Device = TypedDict(
"Device",
{
"device_name": str,
"is_connected": bool,
"is_primary": bool,
"resolution_width": int,
"resolution_height": int,
"offset_width": int,
"offset_height": int,
"dimension_width": int,
"dimension_height": int,
"associated_modes": List[Mode],
},
)
Screen = TypedDict(
"Screen",
{
"screen_number": int,
"minimum_width": int,
"minimum_height": int,
"current_width": int,
"current_height": int,
"maximum_width": int,
"maximum_height": int,
"associated_device": Device,
},
)
Response = TypedDict(
"Response",
{
"screens": List[Screen],
"unassociated_devices": List[Device],
},
)
except ImportError:
Screen = Dict[str, Union[int, str]]
Device = Dict[str, Union[str, int, bool]]
Frequency = Dict[str, Union[float, bool]]
Mode = Dict[str, Union[int, bool, List[Frequency]]]
Response = Dict[str, Union[Device, Mode, Screen]]
_screen_pattern = (
r"Screen (?P<screen_number>\d+): "
+ r"minimum (?P<minimum_width>\d+) x (?P<minimum_height>\d+), "
+ r"current (?P<current_width>\d+) x (?P<current_height>\d+), "
+ r"maximum (?P<maximum_width>\d+) x (?P<maximum_height>\d+)"
)
def _parse_screen(next_lines: List[str]) -> Optional[Screen]:
next_line = next_lines.pop()
result = re.match(_screen_pattern, next_line)
if not result:
next_lines.append(next_line)
return None
raw_matches = result.groupdict()
screen: Screen = {}
for k, v in raw_matches.items():
screen[k] = int(v)
if next_lines:
device: Optional[Device] = _parse_device(next_lines)
if device:
screen["associated_device"] = device
return screen
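The screen-header pattern above can be sanity-checked in isolation. A standalone sketch with a hypothetical xrandr header line:

```python
import re

# Same shape as _screen_pattern above, written as raw strings.
screen_re = re.compile(
    r"Screen (?P<screen_number>\d+): "
    r"minimum (?P<minimum_width>\d+) x (?P<minimum_height>\d+), "
    r"current (?P<current_width>\d+) x (?P<current_height>\d+), "
    r"maximum (?P<maximum_width>\d+) x (?P<maximum_height>\d+)"
)

line = "Screen 0: minimum 8 x 8, current 1920 x 1080, maximum 32767 x 32767"
# every captured group is numeric, so convert them all to int
screen = {k: int(v) for k, v in screen_re.match(line).groupdict().items()}
print(screen)
```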
# eDP1 connected primary 1920x1080+0+0 (normal left inverted right x axis y axis)
# 310mm x 170mm
# regex101 demo link
_device_pattern = (
r"(?P<device_name>.+) "
+ "(?P<is_connected>(connected|disconnected)) ?"
+ "(?P<is_primary> primary)? ?"
+ r"((?P<resolution_width>\d+)x(?P<resolution_height>\d+)"
+ r"\+(?P<offset_width>\d+)\+(?P<offset_height>\d+))? "
+ r"\(normal left inverted right x axis y axis\)"
+ r"( ((?P<dimension_width>\d+)mm x (?P<dimension_height>\d+)mm)?)?"
)
def _parse_device(next_lines: List[str], quiet: bool = False) -> Optional[Device]:
if not next_lines:
return None
next_line = next_lines.pop()
result = re.match(_device_pattern, next_line)
if not result:
next_lines.append(next_line)
return None
matches = result.groupdict()
device: Device = {
"associated_modes": [],
"is_connected": matches["is_connected"] == "connected",
"is_primary": matches["is_primary"] is not None
and len(matches["is_primary"]) > 0,
"device_name": matches["device_name"],
}
for k, v in matches.items():
if k not in {"is_connected", "is_primary", "device_name"}:
try:
if v:
device[k] = int(v)
except ValueError:
if not quiet:
jc.utils.warning_message(
[f"Error: {next_line} : {k} - {v} is not int-able"]
)
while next_lines:
next_line = next_lines.pop()
next_mode: Optional[Mode] = _parse_mode(next_line)
if next_mode:
device["associated_modes"].append(next_mode)
else:
next_lines.append(next_line)
break
return device
# 1920x1080i 60.03*+ 59.93
# 1920x1080 60.00 + 50.00 59.94
_mode_pattern = r"\s*(?P<resolution_width>\d+)x(?P<resolution_height>\d+)(?P<is_high_resolution>i)?\s+(?P<rest>.*)"
_frequencies_pattern = r"(((?P<frequency>\d+\.\d+)(?P<star>\*| |)(?P<plus>\+?)?)+)"
def _parse_mode(line: str) -> Optional[Mode]:
result = re.match(_mode_pattern, line)
frequencies: List[Frequency] = []
if not result:
return None
d = result.groupdict()
resolution_width = int(d["resolution_width"])
resolution_height = int(d["resolution_height"])
is_high_resolution = d["is_high_resolution"] is not None
mode: Mode = {
"resolution_width": resolution_width,
"resolution_height": resolution_height,
"is_high_resolution": is_high_resolution,
"frequencies": frequencies,
}
result = re.finditer(_frequencies_pattern, d["rest"])
if not result:
return mode
for match in result:
d = match.groupdict()
frequency = float(d["frequency"])
is_current = len(d["star"]) > 0
is_preferred = len(d["plus"]) > 0
f: Frequency = {
"frequency": frequency,
"is_current": is_current,
"is_preferred": is_preferred,
}
mode["frequencies"].append(f)
return mode
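Similarly, the mode and frequency patterns can be exercised on a single hypothetical mode line:

```python
import re

# Same shape as _mode_pattern / _frequencies_pattern above,
# with shortened group names.
mode_re = re.compile(r"\s*(?P<w>\d+)x(?P<h>\d+)(?P<hi>i)?\s+(?P<rest>.*)")
freq_re = re.compile(r"(?P<frequency>\d+\.\d+)(?P<star>\*| |)(?P<plus>\+?)")

line = "   1920x1080i     60.03*+  59.93"
m = mode_re.match(line)
frequencies = [
    {"frequency": float(f["frequency"]),
     "is_current": f["star"] == "*",       # '*' marks the current mode
     "is_preferred": f["plus"] == "+"}     # '+' marks the preferred mode
    for f in freq_re.finditer(m["rest"])
]
print(int(m["w"]), int(m["h"]), m["hi"] == "i", frequencies)
```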
def parse(data: str, raw: bool = False, quiet: bool = False) -> Dict:
"""
Main text parsing function
Parameters:
data: (string) text data to parse
raw: (boolean) unprocessed output if True
quiet: (boolean) suppress warning messages if True
Returns:
Dictionary. Raw or processed structured data.
"""
jc.utils.compatibility(__name__, info.compatible, quiet)
jc.utils.input_type_check(data)
linedata = data.splitlines()
linedata.reverse() # For popping
result: Response = {"screens": [], "unassociated_devices": []}
if jc.utils.has_data(data):
while linedata:
screen = _parse_screen(linedata)
if screen:
result["screens"].append(screen)
else:
device = _parse_device(linedata, quiet)
if device:
result["unassociated_devices"].append(device)
if not result["unassociated_devices"] and not result["screens"]:
return {}
return result


@ -39,7 +39,7 @@ Schema:
{
"flags": string,
"zipversion": string,
"zipunder": string
"zipunder": string,
"filesize": integer,
"type": string,
"method": string,

jc/streaming.py Normal file

@ -0,0 +1,118 @@
"""jc - JSON CLI output utility streaming utils"""
from functools import wraps
from typing import Dict, Iterable
def streaming_input_type_check(data: Iterable) -> None:
"""
Ensure input data is an iterable, but not a string or bytes. Raises
`TypeError` if not.
"""
if not hasattr(data, '__iter__') or isinstance(data, (str, bytes)):
raise TypeError("Input data must be a non-string iterable object.")
def streaming_line_input_type_check(line: str) -> None:
"""Ensure each line is a string. Raises `TypeError` if not."""
if not isinstance(line, str):
raise TypeError("Input line must be a 'str' object.")
def stream_success(output_line: Dict, ignore_exceptions: bool) -> Dict:
"""Add `_jc_meta` object to output line if `ignore_exceptions=True`"""
if ignore_exceptions:
output_line.update({'_jc_meta': {'success': True}})
return output_line
def stream_error(e: BaseException, line: str) -> Dict:
"""
Return an error `_jc_meta` field.
"""
return {
'_jc_meta':
{
'success': False,
'error': f'{e.__class__.__name__}: {e}',
'line': line.strip()
}
}
def add_jc_meta(func):
"""
Decorator for streaming parsers to add stream_success and stream_error
objects. This simplifies the yield lines in the streaming parsers.
With the decorator on parse():
# successfully parsed line:
yield output_line if raw else _process(output_line)
# unsuccessfully parsed line:
except Exception as e:
yield raise_or_yield(ignore_exceptions, e, line)
Without the decorator on parse():
# successfully parsed line:
if raw:
yield stream_success(output_line, ignore_exceptions)
else:
yield stream_success(_process(output_line), ignore_exceptions)
# unsuccessfully parsed line:
except Exception as e:
yield stream_error(*raise_or_yield(ignore_exceptions, e, line))
In all cases above:
output_line: (Dict) successfully parsed line yielded as a dict
e: (BaseException) exception object as the first value
of the tuple if the line was not successfully parsed.
line: (str) string of the original line that did not
successfully parse.
ignore_exceptions: (bool) continue processing lines and ignore
exceptions if True.
"""
@wraps(func)
def wrapper(*args, **kwargs):
ignore_exceptions = kwargs.get('ignore_exceptions', False)
gen = func(*args, **kwargs)
for value in gen:
# if the yielded value is a dict, then we know it was a
# successfully parsed line
if isinstance(value, dict):
yield stream_success(value, ignore_exceptions)
# otherwise it will be a tuple and we know it was an error
else:
exception_obj = value[0]
line = value[1]
yield stream_error(exception_obj, line)
return wrapper
def raise_or_yield(
ignore_exceptions: bool,
e: BaseException,
line: str
) -> tuple:
"""
Return the exception object and line string if ignore_exceptions is
True. Otherwise, re-raise the exception from the exception object with
an annotation.
"""
ignore_exceptions_msg = '... Use the ignore_exceptions option (-qq) to ignore streaming parser errors.'
if not ignore_exceptions:
e.args = (str(e) + ignore_exceptions_msg,)
raise e
return e, line
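Putting `add_jc_meta` and `raise_or_yield` together, a toy streaming parser shows the resulting `_jc_meta` envelopes. The parser and its input are hypothetical, and the helpers are inlined as minimal copies so the sketch runs standalone:

```python
from functools import wraps

# Minimal inlined copies of the helpers above.
def stream_success(output_line, ignore_exceptions):
    if ignore_exceptions:
        output_line.update({'_jc_meta': {'success': True}})
    return output_line

def stream_error(e, line):
    return {'_jc_meta': {'success': False,
                         'error': f'{e.__class__.__name__}: {e}',
                         'line': line.strip()}}

def add_jc_meta(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        ignore_exceptions = kwargs.get('ignore_exceptions', False)
        for value in func(*args, **kwargs):
            # dict -> successfully parsed line; tuple -> (exception, line)
            if isinstance(value, dict):
                yield stream_success(value, ignore_exceptions)
            else:
                yield stream_error(value[0], value[1])
    return wrapper

def raise_or_yield(ignore_exceptions, e, line):
    if not ignore_exceptions:
        raise e
    return e, line

@add_jc_meta
def parse(data, ignore_exceptions=False):
    for line in data:
        try:
            yield {'value': int(line)}
        except Exception as e:
            yield raise_or_yield(ignore_exceptions, e, line)

out = list(parse(['1', 'oops', '2'], ignore_exceptions=True))
print(out)
```

With `ignore_exceptions=True` the bad line is reported as an error object in the stream instead of aborting it; with `ignore_exceptions=False` the `ValueError` would propagate with the annotation added by `raise_or_yield`.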


@ -5,7 +5,8 @@ import locale
import shutil
from datetime import datetime, timezone
from textwrap import TextWrapper
from typing import Dict, Iterable, List, Union, Optional
from functools import lru_cache
from typing import List, Tuple, Union, Optional
def warning_message(message_lines: List[str]) -> None:
@ -76,7 +77,7 @@ def error_message(message_lines: List[str]) -> None:
print(message, file=sys.stderr)
def compatibility(mod_name: str, compatible: List, quiet: Optional[bool] = False) -> None:
def compatibility(mod_name: str, compatible: List, quiet: bool = False) -> None:
"""
Checks for the parser's compatibility with the running OS
platform.
@ -106,8 +107,10 @@ def compatibility(mod_name: str, compatible: List, quiet: Optional[bool] = False
if not platform_found:
mod = mod_name.split('.')[-1]
compat_list = ', '.join(compatible)
warning_message([f'{mod} parser not compatible with your OS ({sys.platform}).',
f'Compatible platforms: {compat_list}'])
warning_message([
f'{mod} parser not compatible with your OS ({sys.platform}).',
f'Compatible platforms: {compat_list}'
])
def has_data(data: str) -> bool:
@ -127,7 +130,7 @@ def has_data(data: str) -> bool:
return bool(data and not data.isspace())
def convert_to_int(value: Union[str, float]) -> Union[int, None]:
def convert_to_int(value: Union[str, float]) -> Optional[int]:
"""
Converts string and float input to int. Strips all non-numeric
characters from strings.
@ -157,7 +160,7 @@ def convert_to_int(value: Union[str, float]) -> Union[int, None]:
return None
def convert_to_float(value: Union[str, int]) -> Union[float, None]:
def convert_to_float(value: Union[str, int]) -> Optional[float]:
"""
Converts string and int input to float. Strips all non-numeric
characters from strings.
@ -221,114 +224,94 @@ def convert_to_bool(value: Union[str, int, float]) -> bool:
return False
def stream_success(output_line: Dict, ignore_exceptions: bool) -> Dict:
"""Add `_jc_meta` object to output line if `ignore_exceptions=True`"""
if ignore_exceptions:
output_line.update({'_jc_meta': {'success': True}})
return output_line
def stream_error(e: BaseException, ignore_exceptions: bool, line: str) -> Dict:
"""
Reraise the stream exception with annotation or print an error
`_jc_meta` field if `ignore_exceptions=True`.
"""
if not ignore_exceptions:
e.args = (str(e) + '... Use the ignore_exceptions option (-qq) to ignore streaming parser errors.',)
raise e
return {
'_jc_meta':
{
'success': False,
'error': f'{e.__class__.__name__}: {e}',
'line': line.strip()
}
}
def input_type_check(data: str) -> None:
"""Ensure input data is a string. Raises `TypeError` if not."""
if not isinstance(data, str):
raise TypeError("Input data must be a 'str' object.")
def streaming_input_type_check(data: Iterable) -> None:
"""
Ensure input data is an iterable, but not a string or bytes. Raises
`TypeError` if not.
"""
if not hasattr(data, '__iter__') or isinstance(data, (str, bytes)):
raise TypeError("Input data must be a non-string iterable object.")
def streaming_line_input_type_check(line: str) -> None:
"""Ensure each line is a string. Raises `TypeError` if not."""
if not isinstance(line, str):
raise TypeError("Input line must be a 'str' object.")
class timestamp:
def __init__(self, datetime_string: str) -> None:
def __init__(self,
datetime_string: str,
format_hint: Union[List, Tuple, None] = None
) -> None:
"""
Input a date-time text string of several formats and convert to a
Input a datetime text string of several formats and convert to a
naive or timezone-aware epoch timestamp in UTC.
Parameters:
datetime_string: (str) a string representation of a
date-time in several supported formats
datetime_string (str): a string representation of a
datetime in several supported formats
Attributes:
format_hint (list | tuple): an optional list of format ID
integers to instruct the timestamp object to try those
formats first in the order given. Other formats will be
tried after the format hint list is exhausted. This can
speed up timestamp conversion so several different formats
don't have to be tried in brute-force fashion.
string (str) the input datetime string
Returns a timestamp object with the following attributes:
format (int) the format rule that was used to
decode the datetime string. None if
conversion fails
string (str): the input datetime string
naive (int) timestamp based on locally configured
timezone. None if conversion fails
format (int | None): the format rule that was used to decode
the datetime string. None if conversion fails.
utc (int) aware timestamp only if UTC timezone
detected in datetime string. None if
conversion fails
naive (int | None): timestamp based on locally configured
timezone. None if conversion fails.
utc (int | None): aware timestamp only if UTC timezone
detected in datetime string. None if conversion fails.
"""
self.string = datetime_string
dt = self._parse()
if not format_hint:
format_hint = tuple()
else:
format_hint = tuple(format_hint)
dt = self._parse_dt(self.string, format_hint=format_hint)
self.format = dt['format']
self.naive = dt['timestamp_naive']
self.utc = dt['timestamp_utc']
def __repr__(self):
return f'timestamp(string="{self.string}", format={self.format}, naive={self.naive}, utc={self.utc})'
return f'timestamp(string={self.string!r}, format={self.format}, naive={self.naive}, utc={self.utc})'
def _parse(self):
@staticmethod
@lru_cache(maxsize=512)
def _parse_dt(dt_string, format_hint=None):
"""
Input a date-time text string of several formats and convert to
Input a datetime text string of several formats and convert to
a naive or timezone-aware epoch timestamp in UTC.
Parameters:
data: (string) a string representation of a date-time
in several supported formats
dt_string: (string) a string representation of a date-time
in several supported formats
format_hint: (list | tuple) a list of format ID int's that
should be tried first. This can increase
performance since the function will not need to
try many incorrect formats before finding the
correct one.
Returns:
Dictionary A Dictionary of the following format:
Dictionary of the following format:
{
# for debugging purposes. None if conversion fails
"format": integer,
"format": int,
# timestamp based on locally configured timezone.
# None if conversion fails.
"timestamp_naive": integer,
"timestamp_naive": int,
# aware timestamp only if UTC timezone detected.
# None if conversion fails.
"timestamp_utc": integer
"timestamp_utc": int
}
The `format` integer denotes which date_time format
@@ -345,7 +328,7 @@ class timestamp:
If the conversion completely fails, all fields will be None.
"""
data = self.string or ''
data = dt_string or ''
normalized_datetime = ''
utc_tz = False
dt = None
@@ -359,6 +342,12 @@ class timestamp:
}
utc_tz = False
# convert format_hint to a tuple so it is hashable (for lru_cache)
if not format_hint:
format_hint = tuple()
else:
format_hint = tuple(format_hint)
# sometimes UTC is referenced as 'Coordinated Universal Time'. Convert to 'UTC'
data = data.replace('Coordinated Universal Time', 'UTC')
@@ -439,7 +428,17 @@ class timestamp:
p = re.compile(r'(\W\d\d:\d\d:\d\d\.\d{6})\d+\W')
normalized_datetime = p.sub(r'\g<1> ', normalized_datetime)
for fmt in formats:
# try format hints first, then fall back to brute-force method
hint_obj_list = []
for fmt_id in format_hint:
for fmt in formats:
if fmt_id == fmt['id']:
hint_obj_list.append(fmt)
remaining_formats = [fmt for fmt in formats if not fmt['id'] in format_hint]
optimized_formats = hint_obj_list + remaining_formats
for fmt in optimized_formats:
try:
locale.setlocale(locale.LC_TIME, fmt['locale'])
dt = datetime.strptime(normalized_datetime, fmt['format'])
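The caching refactor above hinges on two details: `lru_cache` can only memoize hashable arguments, so the list-valued `format_hint` is normalized to a tuple before the cached static method is called, and hinted format IDs are simply moved to the front of the trial order while every other format is still tried afterward. A minimal, self-contained sketch of that pattern (the format IDs and table below are hypothetical stand-ins, not jc's real format list):

```python
from functools import lru_cache

# Hypothetical stand-in for the real format table in jc.utils
FORMATS = ({'id': 1500}, {'id': 1600}, {'id': 1700})

@lru_cache(maxsize=512)
def trial_order(format_hint=()):
    """Return format IDs with hinted ones tried first (hint must be a tuple)."""
    hinted = [f for fid in format_hint for f in FORMATS if f['id'] == fid]
    remaining = [f for f in FORMATS if f['id'] not in format_hint]
    return [f['id'] for f in hinted + remaining]

# Callers normalize list hints to tuples, since lists are unhashable:
hint = [1700, 1500]
print(trial_order(tuple(hint)))   # hinted IDs first, then the rest
```

Passing the list directly would raise `TypeError: unhashable type: 'list'` inside `lru_cache`, which is why both `__init__` and `_parse_dt` coerce the hint to a tuple before use.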

View File

@@ -1,4 +1,4 @@
.TH jc 1 2022-01-27 1.18.2 "JSON CLI output utility"
.TH jc 1 2022-02-14 1.18.3 "JSON CLI output utility"
.SH NAME
jc \- JSONifies the output of many CLI tools and file-types
.SH SYNOPSIS
@@ -302,6 +302,16 @@ Key/Value file parser
\fB--rpm-qi\fP
`rpm -qi` command parser
.TP
.B
\fB--rsync\fP
`rsync` command parser
.TP
.B
\fB--rsync-s\fP
`rsync` command streaming parser
.TP
.B
\fB--sfdisk\fP
@@ -432,6 +442,11 @@ Key/Value file parser
\fB--xml\fP
XML file parser
.TP
.B
\fB--xrandr\fP
`xrandr` command parser
.TP
.B
\fB--yaml\fP

View File

@@ -5,7 +5,7 @@ with open('README.md', 'r') as f:
setuptools.setup(
name='jc',
version='1.18.2',
version='1.18.3',
author='Kelly Brazil',
author_email='kellyjonbrazil@gmail.com',
description='Converts the output of popular command-line tools and file-types to JSON.',
@@ -17,7 +17,7 @@ setuptools.setup(
license='MIT',
long_description=long_description,
long_description_content_type='text/markdown',
python_requires='>=3.6',
python_requires='>=3.7',
url='https://github.com/kellyjonbrazil/jc',
packages=setuptools.find_packages(exclude=['*.tests', '*.tests.*', 'tests.*', 'tests']),
entry_points={

View File

@@ -128,7 +128,7 @@ pip3 install jc
> For more OS Packages, see https://repology.org/project/jc/versions.
### Binaries and Packages
### Binaries
For precompiled binaries, see [Releases](https://github.com/kellyjonbrazil/jc/releases)
on Github.

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,59 @@
2022/01/28 03:53:52 [9190] building file list
2022/01/28 03:53:52 [9190] .d..t...... ./
2022/01/28 03:53:52 [9190] >f+++++++++ a.txt
2022/01/28 03:53:52 [9190] >f+++++++++ b.txt
2022/01/28 03:53:52 [9190] >f+++++++++ c.txt
2022/01/28 03:53:52 [9190] >f+++++++++ d.txt
2022/01/28 03:53:52 [9190] >f+++++++++ file with spaces.txt
2022/01/28 03:53:52 [9190] cd+++++++++ folder/
2022/01/28 03:53:52 [9190] >f+++++++++ folder/file1
2022/01/28 03:53:52 [9190] >f+++++++++ folder/file10
2022/01/28 03:53:52 [9190] >f+++++++++ folder/file11
2022/01/28 03:53:52 [9190] >f+++++++++ folder/file12
2022/01/28 03:53:52 [9190] >f+++++++++ folder/file13
2022/01/28 03:53:52 [9190] >f+++++++++ folder/file14
2022/01/28 03:53:52 [9190] >f+++++++++ folder/file15
2022/01/28 03:53:52 [9190] >f+++++++++ folder/file16
2022/01/28 03:53:52 [9190] >f+++++++++ folder/file17
2022/01/28 03:53:52 [9190] >f+++++++++ folder/file18
2022/01/28 03:53:52 [9190] >f+++++++++ folder/file19
2022/01/28 03:53:52 [9190] >f+++++++++ folder/file2
2022/01/28 03:53:52 [9190] >f+++++++++ folder/file20
2022/01/28 03:53:52 [9190] >f+++++++++ folder/file3
2022/01/28 03:53:52 [9190] >f+++++++++ folder/file4
2022/01/28 03:53:52 [9190] >f+++++++++ folder/file5
2022/01/28 03:53:52 [9190] >f+++++++++ folder/file6
2022/01/28 03:53:52 [9190] >f+++++++++ folder/file7
2022/01/28 03:53:52 [9190] >f+++++++++ folder/file8
2022/01/28 03:53:52 [9190] >f+++++++++ folder/file9
2022/01/28 03:53:52 [9190] sent 1713 bytes received 507 bytes total size 235
2022/01/28 03:55:00 [9198] building file list
2022/01/28 03:55:00 [9198] .d..t...... ./
2022/01/28 03:55:00 [9198] >f+++++++++ a.txt
2022/01/28 03:55:00 [9198] >f+++++++++ b.txt
2022/01/28 03:55:00 [9198] >f+++++++++ c.txt
2022/01/28 03:55:00 [9198] >f+++++++++ d.txt
2022/01/28 03:55:00 [9198] >f+++++++++ file with spaces.txt
2022/01/28 03:55:00 [9198] cd+++++++++ folder/
2022/01/28 03:55:00 [9198] >f+++++++++ folder/file1
2022/01/28 03:55:00 [9198] >f+++++++++ folder/file10
2022/01/28 03:55:00 [9198] >f+++++++++ folder/file11
2022/01/28 03:55:00 [9198] >f+++++++++ folder/file12
2022/01/28 03:55:00 [9198] >f+++++++++ folder/file13
2022/01/28 03:55:00 [9198] >f+++++++++ folder/file14
2022/01/28 03:55:00 [9198] >f+++++++++ folder/file15
2022/01/28 03:55:00 [9198] >f+++++++++ folder/file16
2022/01/28 03:55:00 [9198] >f+++++++++ folder/file17
2022/01/28 03:55:00 [9198] >f+++++++++ folder/file18
2022/01/28 03:55:00 [9198] >f+++++++++ folder/file19
2022/01/28 03:55:00 [9198] >f+++++++++ folder/file2
2022/01/28 03:55:00 [9198] >f+++++++++ folder/file20
2022/01/28 03:55:00 [9198] >f+++++++++ folder/file3
2022/01/28 03:55:00 [9198] >f+++++++++ folder/file4
2022/01/28 03:55:00 [9198] >f+++++++++ folder/file5
2022/01/28 03:55:00 [9198] >f+++++++++ folder/file6
2022/01/28 03:55:00 [9198] >f+++++++++ folder/file7
2022/01/28 03:55:00 [9198] >f+++++++++ folder/file8
2022/01/28 03:55:00 [9198] >f+++++++++ folder/file9
2022/01/28 03:55:00 [9198] sent 1713 bytes received 507 bytes total size 235
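The 11-character string after each timestamp (e.g. `.d..t......` or `>f+++++++++`) is rsync's `--itemize-changes` output: one character for the update type, one for the file type, then nine attribute-change flags. A hedged sketch of how such a string maps onto the field names used by the new rsync parser (the field names follow the parser's documented output shown in the README; the decoder itself is illustrative, not jc's implementation):

```python
# Decode an rsync --itemize-changes string such as '.d..t......'
UPDATE_TYPES = {
    '<': 'file sent',
    '>': 'file received',
    'c': 'local change or creation',
    'h': 'hard link',
    '.': 'not updated',
    '*': 'message',
}
FILE_TYPES = {'f': 'file', 'd': 'directory', 'L': 'symlink',
              'D': 'device', 'S': 'special file'}
ATTR_FIELDS = [
    'checksum_or_value_different',   # c
    'size_different',                # s
    'modification_time_different',   # t
    'permissions_different',         # p
    'owner_different',               # o
    'group_different',               # g
    None,                            # u (reserved slot, no output field)
    'acl_different',                 # a
    'extended_attribute_different',  # x
]

def decode_itemize(metadata: str) -> dict:
    result = {
        'metadata': metadata,
        'update_type': UPDATE_TYPES.get(metadata[0], 'unknown'),
        'file_type': FILE_TYPES.get(metadata[1], 'unknown'),
    }
    for field, char in zip(ATTR_FIELDS, metadata[2:]):
        if field:
            result[field] = char not in '. '
    return result

print(decode_itemize('.d..t......'))
```

For the `.d..t......` lines above this yields a directory that was not updated except for its modification time, matching the parser output documented for this release.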

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,61 @@
2022/01/28 04:22:10 [9306] building file list
2022/01/28 04:22:10 [9306] .d..t...... ./
2022/01/28 04:22:10 [9306] >f+++++++++ a.txt
2022/01/28 04:22:10 [9306] >f+++++++++ b.txt
2022/01/28 04:22:10 [9306] >f+++++++++ c.txt
2022/01/28 04:22:10 [9306] >f+++++++++ d.txt
2022/01/28 04:22:10 [9306] >f+++++++++ file with spaces.txt
2022/01/28 04:22:10 [9306] cd+++++++++ folder/
2022/01/28 04:22:10 [9306] >f+++++++++ folder/file1
2022/01/28 04:22:10 [9306] >f+++++++++ folder/file10
2022/01/28 04:22:10 [9306] >f+++++++++ folder/file11
2022/01/28 04:22:10 [9306] >f+++++++++ folder/file12
2022/01/28 04:22:10 [9306] >f+++++++++ folder/file13
2022/01/28 04:22:10 [9306] >f+++++++++ folder/file14
2022/01/28 04:22:10 [9306] >f+++++++++ folder/file15
2022/01/28 04:22:10 [9306] >f+++++++++ folder/file16
2022/01/28 04:22:10 [9306] >f+++++++++ folder/file17
2022/01/28 04:22:10 [9306] >f+++++++++ folder/file18
2022/01/28 04:22:10 [9306] >f+++++++++ folder/file19
2022/01/28 04:22:10 [9306] >f+++++++++ folder/file2
2022/01/28 04:22:10 [9306] >f+++++++++ folder/file20
2022/01/28 04:22:10 [9306] >f+++++++++ folder/file3
2022/01/28 04:22:10 [9306] >f+++++++++ folder/file4
2022/01/28 04:22:10 [9306] >f+++++++++ folder/file5
2022/01/28 04:22:10 [9306] >f+++++++++ folder/file6
2022/01/28 04:22:10 [9306] >f+++++++++ folder/file7
2022/01/28 04:22:10 [9306] >f+++++++++ folder/file8
2022/01/28 04:22:10 [9306] >f+++++++++ folder/file9
2022/01/28 04:22:10 [9306] sent 1,708 bytes received 502 bytes 4,420.00 bytes/sec
2022/01/28 04:22:10 [9306] total size is 235 speedup is 0.11
2022/01/28 11:07:21 [10540] building file list
2022/01/28 11:07:21 [10540] .d..t...... ./
2022/01/28 11:07:21 [10540] >f+++++++++ a.txt
2022/01/28 11:07:21 [10540] >f+++++++++ b.txt
2022/01/28 11:07:21 [10540] >f+++++++++ c.txt
2022/01/28 11:07:21 [10540] >f+++++++++ d.txt
2022/01/28 11:07:21 [10540] >f+++++++++ file with spaces.txt
2022/01/28 11:07:21 [10540] cd+++++++++ folder/
2022/01/28 11:07:21 [10540] >f+++++++++ folder/file1
2022/01/28 11:07:21 [10540] >f+++++++++ folder/file10
2022/01/28 11:07:21 [10540] >f+++++++++ folder/file11
2022/01/28 11:07:21 [10540] >f+++++++++ folder/file12
2022/01/28 11:07:21 [10540] >f+++++++++ folder/file13
2022/01/28 11:07:21 [10540] >f+++++++++ folder/file14
2022/01/28 11:07:21 [10540] >f+++++++++ folder/file15
2022/01/28 11:07:21 [10540] >f+++++++++ folder/file16
2022/01/28 11:07:21 [10540] >f+++++++++ folder/file17
2022/01/28 11:07:21 [10540] >f+++++++++ folder/file18
2022/01/28 11:07:21 [10540] >f+++++++++ folder/file19
2022/01/28 11:07:21 [10540] >f+++++++++ folder/file2
2022/01/28 11:07:21 [10540] >f+++++++++ folder/file20
2022/01/28 11:07:21 [10540] >f+++++++++ folder/file3
2022/01/28 11:07:21 [10540] >f+++++++++ folder/file4
2022/01/28 11:07:21 [10540] >f+++++++++ folder/file5
2022/01/28 11:07:21 [10540] >f+++++++++ folder/file6
2022/01/28 11:07:21 [10540] >f+++++++++ folder/file7
2022/01/28 11:07:21 [10540] >f+++++++++ folder/file8
2022/01/28 11:07:21 [10540] >f+++++++++ folder/file9
2022/01/28 11:07:21 [10540] sent 1,708 bytes received 502 bytes 4,420.00 bytes/sec
2022/01/28 11:07:21 [10540] total size is 235 speedup is 0.11
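For reference, the `speedup` figure in the closing line is the total size of the files divided by the bytes actually moved on the wire (sent plus received), which is why copying 235 bytes of content at a cost of 2,210 protocol bytes reports a speedup below 1. A quick check against the log above (formula per rsync's documented transfer statistics; the two-decimal rounding is illustrative):

```python
def rsync_speedup(sent: int, received: int, total_size: int) -> float:
    # speedup = total size of files / bytes actually transferred
    return round(total_size / (sent + received), 2)

print(rsync_speedup(1708, 502, 235))  # the 0.11 reported in the log above
```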

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,65 @@
2022/01/28 04:37:34 [9349] building file list
2022/01/28 04:37:34 [9349] delta-transmission disabled for local transfer or --whole-file
2022/01/28 04:37:34 [9349] .d..t...... ./
2022/01/28 04:37:34 [9349] >f+++++++++ a.txt
2022/01/28 04:37:34 [9349] >f+++++++++ b.txt
2022/01/28 04:37:34 [9349] >f+++++++++ c.txt
2022/01/28 04:37:34 [9349] >f+++++++++ d.txt
2022/01/28 04:37:34 [9349] >f+++++++++ file with spaces.txt
2022/01/28 04:37:34 [9349] cd+++++++++ folder/
2022/01/28 04:37:34 [9349] >f+++++++++ folder/file1
2022/01/28 04:37:34 [9349] >f+++++++++ folder/file10
2022/01/28 04:37:34 [9349] >f+++++++++ folder/file11
2022/01/28 04:37:34 [9349] >f+++++++++ folder/file12
2022/01/28 04:37:34 [9349] >f+++++++++ folder/file13
2022/01/28 04:37:34 [9349] >f+++++++++ folder/file14
2022/01/28 04:37:34 [9349] >f+++++++++ folder/file15
2022/01/28 04:37:34 [9349] >f+++++++++ folder/file16
2022/01/28 04:37:34 [9349] >f+++++++++ folder/file17
2022/01/28 04:37:34 [9349] >f+++++++++ folder/file18
2022/01/28 04:37:34 [9349] >f+++++++++ folder/file19
2022/01/28 04:37:34 [9349] >f+++++++++ folder/file2
2022/01/28 04:37:34 [9349] >f+++++++++ folder/file20
2022/01/28 04:37:34 [9349] >f+++++++++ folder/file3
2022/01/28 04:37:34 [9349] >f+++++++++ folder/file4
2022/01/28 04:37:34 [9349] >f+++++++++ folder/file5
2022/01/28 04:37:34 [9349] >f+++++++++ folder/file6
2022/01/28 04:37:34 [9349] >f+++++++++ folder/file7
2022/01/28 04:37:34 [9349] >f+++++++++ folder/file8
2022/01/28 04:37:34 [9349] >f+++++++++ folder/file9
2022/01/28 04:37:34 [9349] total: matches=0 hash_hits=0 false_alarms=0 data=235
2022/01/28 04:37:34 [9349] sent 1,708 bytes received 569 bytes 4,554.00 bytes/sec
2022/01/28 04:37:34 [9349] total size is 235 speedup is 0.10
2022/01/28 11:20:55 [10585] building file list
2022/01/28 11:20:55 [10585] delta-transmission disabled for local transfer or --whole-file
2022/01/28 11:20:55 [10585] .d..t...... ./
2022/01/28 11:20:55 [10585] >f+++++++++ a.txt
2022/01/28 11:20:55 [10585] >f+++++++++ b.txt
2022/01/28 11:20:55 [10585] >f+++++++++ c.txt
2022/01/28 11:20:55 [10585] >f+++++++++ d.txt
2022/01/28 11:20:55 [10585] >f+++++++++ file with spaces.txt
2022/01/28 11:20:55 [10585] cd+++++++++ folder/
2022/01/28 11:20:55 [10585] >f+++++++++ folder/file1
2022/01/28 11:20:55 [10585] >f+++++++++ folder/file10
2022/01/28 11:20:55 [10585] >f+++++++++ folder/file11
2022/01/28 11:20:55 [10585] >f+++++++++ folder/file12
2022/01/28 11:20:55 [10585] >f+++++++++ folder/file13
2022/01/28 11:20:55 [10585] >f+++++++++ folder/file14
2022/01/28 11:20:55 [10585] >f+++++++++ folder/file15
2022/01/28 11:20:55 [10585] >f+++++++++ folder/file16
2022/01/28 11:20:55 [10585] >f+++++++++ folder/file17
2022/01/28 11:20:55 [10585] >f+++++++++ folder/file18
2022/01/28 11:20:55 [10585] >f+++++++++ folder/file19
2022/01/28 11:20:55 [10585] >f+++++++++ folder/file2
2022/01/28 11:20:55 [10585] >f+++++++++ folder/file20
2022/01/28 11:20:55 [10585] >f+++++++++ folder/file3
2022/01/28 11:20:55 [10585] >f+++++++++ folder/file4
2022/01/28 11:20:55 [10585] >f+++++++++ folder/file5
2022/01/28 11:20:55 [10585] >f+++++++++ folder/file6
2022/01/28 11:20:55 [10585] >f+++++++++ folder/file7
2022/01/28 11:20:55 [10585] >f+++++++++ folder/file8
2022/01/28 11:20:55 [10585] >f+++++++++ folder/file9
2022/01/28 11:20:55 [10585] total: matches=0 hash_hits=0 false_alarms=0 data=235
2022/01/28 11:20:55 [10585] sent 1,708 bytes received 569 bytes 4,554.00 bytes/sec
2022/01/28 11:20:55 [10585] total size is 235 speedup is 0.10

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,657 @@
2022/01/29 07:11:29 [49859] building file list
2022/01/29 07:11:29 [49859] [sender] make_file(.,*,0)
2022/01/29 07:11:29 [49859] [sender] pushing local filters for /home/kbrazil/rsynctest/source/
2022/01/29 07:11:29 [49859] [sender] make_file(a.txt,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(b.txt,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(c.txt,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(d.txt,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(file with spaces.txt,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(folder,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(pyspark-2.4.5-py2.py3-none-any.whl,*,2)
2022/01/29 07:11:29 [49859] send_file_list done
2022/01/29 07:11:29 [49859] send_files starting
2022/01/29 07:11:29 [49859] [sender] pushing local filters for /home/kbrazil/rsynctest/source/folder/
2022/01/29 07:11:29 [49859] [sender] make_file(folder/file1,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(folder/file2,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(folder/file3,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(folder/file4,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(folder/file5,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(folder/file6,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(folder/file7,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(folder/file8,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(folder/file9,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(folder/file10,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(folder/file11,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(folder/file12,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(folder/file13,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(folder/file14,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(folder/file15,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(folder/file16,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(folder/file17,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(folder/file18,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(folder/file19,*,2)
2022/01/29 07:11:29 [49859] [sender] make_file(folder/file20,*,2)
2022/01/29 07:11:29 [49859] server_recv(2) starting pid=49860
2022/01/29 07:11:29 [49859] recv_file_name(.)
2022/01/29 07:11:29 [49859] recv_file_name(a.txt)
2022/01/29 07:11:29 [49859] recv_file_name(b.txt)
2022/01/29 07:11:29 [49859] recv_file_name(c.txt)
2022/01/29 07:11:29 [49859] recv_file_name(d.txt)
2022/01/29 07:11:29 [49859] recv_file_name(file with spaces.txt)
2022/01/29 07:11:29 [49859] recv_file_name(folder)
2022/01/29 07:11:29 [49859] recv_file_name(pyspark-2.4.5-py2.py3-none-any.whl)
2022/01/29 07:11:29 [49859] received 8 names
2022/01/29 07:11:29 [49859] recv_file_list done
2022/01/29 07:11:29 [49859] get_local_name count=8 dest
2022/01/29 07:11:29 [49859] generator starting pid=49860
2022/01/29 07:11:29 [49859] delta-transmission disabled for local transfer or --whole-file
2022/01/29 07:11:29 [49859] recv_generator(.,0)
2022/01/29 07:11:29 [49859] set modtime of . to (1643424952) Fri Jan 28 18:55:52 2022
2022/01/29 07:11:29 [49859] recv_generator(.,1)
2022/01/29 07:11:29 [49859] recv_generator(a.txt,2)
2022/01/29 07:11:29 [49859] recv_generator(b.txt,3)
2022/01/29 07:11:29 [49859] recv_generator(c.txt,4)
2022/01/29 07:11:29 [49859] recv_generator(d.txt,5)
2022/01/29 07:11:29 [49859] recv_generator(file with spaces.txt,6)
2022/01/29 07:11:29 [49859] recv_generator(pyspark-2.4.5-py2.py3-none-any.whl,7)
2022/01/29 07:11:29 [49859] recv_generator(folder,8)
2022/01/29 07:11:29 [49859] send_files(0, source/.)
2022/01/29 07:11:29 [49859] .d..t...... ./
2022/01/29 07:11:29 [49859] send_files(2, source/a.txt)
2022/01/29 07:11:29 [49859] send_files mapped source/a.txt of size 47
2022/01/29 07:11:29 [49859] calling match_sums source/a.txt
2022/01/29 07:11:29 [49859] sending file_sum
2022/01/29 07:11:29 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:29 [49859] >f+++++++++ a.txt
2022/01/29 07:11:29 [49859] sender finished source/a.txt
2022/01/29 07:11:29 [49859] send_files(3, source/b.txt)
2022/01/29 07:11:29 [49859] send_files mapped source/b.txt of size 47
2022/01/29 07:11:29 [49859] calling match_sums source/b.txt
2022/01/29 07:11:29 [49859] sending file_sum
2022/01/29 07:11:29 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:29 [49859] >f+++++++++ b.txt
2022/01/29 07:11:29 [49859] sender finished source/b.txt
2022/01/29 07:11:29 [49859] send_files(4, source/c.txt)
2022/01/29 07:11:29 [49859] send_files mapped source/c.txt of size 47
2022/01/29 07:11:29 [49859] calling match_sums source/c.txt
2022/01/29 07:11:29 [49859] sending file_sum
2022/01/29 07:11:29 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:29 [49859] >f+++++++++ c.txt
2022/01/29 07:11:29 [49859] sender finished source/c.txt
2022/01/29 07:11:29 [49859] send_files(5, source/d.txt)
2022/01/29 07:11:29 [49859] send_files mapped source/d.txt of size 47
2022/01/29 07:11:29 [49859] calling match_sums source/d.txt
2022/01/29 07:11:29 [49859] sending file_sum
2022/01/29 07:11:29 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:29 [49859] >f+++++++++ d.txt
2022/01/29 07:11:29 [49859] sender finished source/d.txt
2022/01/29 07:11:29 [49859] send_files(6, source/file with spaces.txt)
2022/01/29 07:11:29 [49859] send_files mapped source/file with spaces.txt of size 47
2022/01/29 07:11:29 [49859] calling match_sums source/file with spaces.txt
2022/01/29 07:11:29 [49859] sending file_sum
2022/01/29 07:11:29 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:29 [49859] >f+++++++++ file with spaces.txt
2022/01/29 07:11:29 [49859] sender finished source/file with spaces.txt
2022/01/29 07:11:29 [49859] send_files(7, source/pyspark-2.4.5-py2.py3-none-any.whl)
2022/01/29 07:11:29 [49859] send_files mapped source/pyspark-2.4.5-py2.py3-none-any.whl of size 218257928
2022/01/29 07:11:29 [49859] calling match_sums source/pyspark-2.4.5-py2.py3-none-any.whl
2022/01/29 07:11:29 [49859] recv_files(8) starting
2022/01/29 07:11:29 [49859] [receiver] receiving flist for dir 1
2022/01/29 07:11:29 [49859] recv_file_name(folder/file1)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file2)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file3)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file4)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file5)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file6)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file7)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file8)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file9)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file10)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file11)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file12)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file13)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file14)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file15)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file16)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file17)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file18)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file19)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file20)
2022/01/29 07:11:29 [49859] received 20 names
2022/01/29 07:11:29 [49859] recv_file_list done
2022/01/29 07:11:29 [49859] recv_files(.)
2022/01/29 07:11:29 [49859] recv_files(a.txt)
2022/01/29 07:11:29 [49859] got file_sum
2022/01/29 07:11:29 [49859] set modtime of .a.txt.9ZAUJ6 to (1643342949) Thu Jan 27 20:09:09 2022
2022/01/29 07:11:29 [49859] renaming .a.txt.9ZAUJ6 to a.txt
2022/01/29 07:11:29 [49859] recv_files(b.txt)
2022/01/29 07:11:29 [49859] got file_sum
2022/01/29 07:11:29 [49859] set modtime of .b.txt.eIiFqa to (1643342953) Thu Jan 27 20:09:13 2022
2022/01/29 07:11:29 [49859] renaming .b.txt.eIiFqa to b.txt
2022/01/29 07:11:29 [49859] recv_files(c.txt)
2022/01/29 07:11:29 [49859] got file_sum
2022/01/29 07:11:29 [49859] set modtime of .c.txt.peRw7d to (1643342956) Thu Jan 27 20:09:16 2022
2022/01/29 07:11:29 [49859] renaming .c.txt.peRw7d to c.txt
2022/01/29 07:11:29 [49859] recv_files(d.txt)
2022/01/29 07:11:29 [49859] got file_sum
2022/01/29 07:11:29 [49859] set modtime of .d.txt.PAXpOh to (1643342959) Thu Jan 27 20:09:19 2022
2022/01/29 07:11:29 [49859] renaming .d.txt.PAXpOh to d.txt
2022/01/29 07:11:29 [49859] recv_files(file with spaces.txt)
2022/01/29 07:11:29 [49859] got file_sum
2022/01/29 07:11:29 [49859] set modtime of .file with spaces.txt.BUujvl to (1643342980) Thu Jan 27 20:09:40 2022
2022/01/29 07:11:29 [49859] renaming .file with spaces.txt.BUujvl to file with spaces.txt
2022/01/29 07:11:29 [49859] recv_files(pyspark-2.4.5-py2.py3-none-any.whl)
2022/01/29 07:11:29 [49859] [generator] receiving flist for dir 1
2022/01/29 07:11:29 [49859] recv_file_name(folder/file1)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file2)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file3)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file4)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file5)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file6)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file7)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file8)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file9)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file10)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file11)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file12)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file13)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file14)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file15)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file16)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file17)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file18)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file19)
2022/01/29 07:11:29 [49859] recv_file_name(folder/file20)
2022/01/29 07:11:29 [49859] received 20 names
2022/01/29 07:11:29 [49859] recv_file_list done
2022/01/29 07:11:29 [49859] recv_generator(folder,9)
2022/01/29 07:11:29 [49859] set modtime of folder to (1643343369) Thu Jan 27 20:16:09 2022
2022/01/29 07:11:29 [49859] recv_generator(folder/file1,10)
2022/01/29 07:11:29 [49859] recv_generator(folder/file10,11)
2022/01/29 07:11:29 [49859] recv_generator(folder/file11,12)
2022/01/29 07:11:29 [49859] recv_generator(folder/file12,13)
2022/01/29 07:11:29 [49859] recv_generator(folder/file13,14)
2022/01/29 07:11:29 [49859] recv_generator(folder/file14,15)
2022/01/29 07:11:29 [49859] recv_generator(folder/file15,16)
2022/01/29 07:11:29 [49859] recv_generator(folder/file16,17)
2022/01/29 07:11:29 [49859] recv_generator(folder/file17,18)
2022/01/29 07:11:29 [49859] recv_generator(folder/file18,19)
2022/01/29 07:11:29 [49859] recv_generator(folder/file19,20)
2022/01/29 07:11:29 [49859] recv_generator(folder/file2,21)
2022/01/29 07:11:29 [49859] recv_generator(folder/file20,22)
2022/01/29 07:11:29 [49859] recv_generator(folder/file3,23)
2022/01/29 07:11:29 [49859] recv_generator(folder/file4,24)
2022/01/29 07:11:29 [49859] recv_generator(folder/file5,25)
2022/01/29 07:11:29 [49859] recv_generator(folder/file6,26)
2022/01/29 07:11:29 [49859] recv_generator(folder/file7,27)
2022/01/29 07:11:31 [49859] sending file_sum
2022/01/29 07:11:31 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:31 [49859] >f+++++++++ pyspark-2.4.5-py2.py3-none-any.whl
2022/01/29 07:11:31 [49859] sender finished source/pyspark-2.4.5-py2.py3-none-any.whl
2022/01/29 07:11:31 [49859] recv_generator(folder/file8,28)
2022/01/29 07:11:31 [49859] recv_generator(folder/file9,29)
2022/01/29 07:11:31 [49859] generate_files phase=1
2022/01/29 07:11:31 [49859] send_files(9, source/folder)
2022/01/29 07:11:31 [49859] cd+++++++++ folder/
2022/01/29 07:11:31 [49859] send_files(10, source/folder/file1)
2022/01/29 07:11:31 [49859] send_files mapped source/folder/file1 of size 0
2022/01/29 07:11:31 [49859] calling match_sums source/folder/file1
2022/01/29 07:11:31 [49859] sending file_sum
2022/01/29 07:11:31 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:31 [49859] >f+++++++++ folder/file1
2022/01/29 07:11:31 [49859] sender finished source/folder/file1
2022/01/29 07:11:31 [49859] send_files(11, source/folder/file10)
2022/01/29 07:11:31 [49859] send_files mapped source/folder/file10 of size 0
2022/01/29 07:11:31 [49859] calling match_sums source/folder/file10
2022/01/29 07:11:31 [49859] sending file_sum
2022/01/29 07:11:31 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:31 [49859] >f+++++++++ folder/file10
2022/01/29 07:11:31 [49859] sender finished source/folder/file10
2022/01/29 07:11:31 [49859] send_files(12, source/folder/file11)
2022/01/29 07:11:31 [49859] send_files mapped source/folder/file11 of size 0
2022/01/29 07:11:31 [49859] calling match_sums source/folder/file11
2022/01/29 07:11:31 [49859] sending file_sum
2022/01/29 07:11:31 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:31 [49859] >f+++++++++ folder/file11
2022/01/29 07:11:31 [49859] sender finished source/folder/file11
2022/01/29 07:11:31 [49859] send_files(13, source/folder/file12)
2022/01/29 07:11:31 [49859] send_files mapped source/folder/file12 of size 0
2022/01/29 07:11:31 [49859] calling match_sums source/folder/file12
2022/01/29 07:11:31 [49859] sending file_sum
2022/01/29 07:11:31 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:31 [49859] >f+++++++++ folder/file12
2022/01/29 07:11:31 [49859] sender finished source/folder/file12
2022/01/29 07:11:31 [49859] send_files(14, source/folder/file13)
2022/01/29 07:11:31 [49859] send_files mapped source/folder/file13 of size 0
2022/01/29 07:11:31 [49859] calling match_sums source/folder/file13
2022/01/29 07:11:31 [49859] sending file_sum
2022/01/29 07:11:31 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:31 [49859] >f+++++++++ folder/file13
2022/01/29 07:11:31 [49859] sender finished source/folder/file13
2022/01/29 07:11:31 [49859] send_files(15, source/folder/file14)
2022/01/29 07:11:31 [49859] send_files mapped source/folder/file14 of size 0
2022/01/29 07:11:31 [49859] calling match_sums source/folder/file14
2022/01/29 07:11:31 [49859] sending file_sum
2022/01/29 07:11:31 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:31 [49859] >f+++++++++ folder/file14
2022/01/29 07:11:31 [49859] sender finished source/folder/file14
2022/01/29 07:11:31 [49859] send_files(16, source/folder/file15)
2022/01/29 07:11:31 [49859] send_files mapped source/folder/file15 of size 0
2022/01/29 07:11:31 [49859] calling match_sums source/folder/file15
2022/01/29 07:11:31 [49859] sending file_sum
2022/01/29 07:11:31 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:31 [49859] >f+++++++++ folder/file15
2022/01/29 07:11:31 [49859] sender finished source/folder/file15
2022/01/29 07:11:31 [49859] send_files(17, source/folder/file16)
2022/01/29 07:11:31 [49859] send_files mapped source/folder/file16 of size 0
2022/01/29 07:11:31 [49859] calling match_sums source/folder/file16
2022/01/29 07:11:31 [49859] sending file_sum
2022/01/29 07:11:31 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:31 [49859] >f+++++++++ folder/file16
2022/01/29 07:11:31 [49859] sender finished source/folder/file16
2022/01/29 07:11:31 [49859] send_files(18, source/folder/file17)
2022/01/29 07:11:31 [49859] send_files mapped source/folder/file17 of size 0
2022/01/29 07:11:31 [49859] calling match_sums source/folder/file17
2022/01/29 07:11:31 [49859] sending file_sum
2022/01/29 07:11:31 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:31 [49859] >f+++++++++ folder/file17
2022/01/29 07:11:31 [49859] sender finished source/folder/file17
2022/01/29 07:11:31 [49859] send_files(19, source/folder/file18)
2022/01/29 07:11:31 [49859] send_files mapped source/folder/file18 of size 0
2022/01/29 07:11:31 [49859] calling match_sums source/folder/file18
2022/01/29 07:11:31 [49859] sending file_sum
2022/01/29 07:11:31 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:31 [49859] >f+++++++++ folder/file18
2022/01/29 07:11:31 [49859] sender finished source/folder/file18
2022/01/29 07:11:31 [49859] send_files(20, source/folder/file19)
2022/01/29 07:11:31 [49859] send_files mapped source/folder/file19 of size 0
2022/01/29 07:11:31 [49859] calling match_sums source/folder/file19
2022/01/29 07:11:31 [49859] sending file_sum
2022/01/29 07:11:31 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:31 [49859] >f+++++++++ folder/file19
2022/01/29 07:11:31 [49859] sender finished source/folder/file19
2022/01/29 07:11:31 [49859] send_files(21, source/folder/file2)
2022/01/29 07:11:31 [49859] send_files mapped source/folder/file2 of size 0
2022/01/29 07:11:31 [49859] calling match_sums source/folder/file2
2022/01/29 07:11:31 [49859] sending file_sum
2022/01/29 07:11:31 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:31 [49859] >f+++++++++ folder/file2
2022/01/29 07:11:31 [49859] sender finished source/folder/file2
2022/01/29 07:11:31 [49859] send_files(22, source/folder/file20)
2022/01/29 07:11:31 [49859] send_files mapped source/folder/file20 of size 0
2022/01/29 07:11:31 [49859] calling match_sums source/folder/file20
2022/01/29 07:11:31 [49859] sending file_sum
2022/01/29 07:11:31 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:31 [49859] >f+++++++++ folder/file20
2022/01/29 07:11:31 [49859] sender finished source/folder/file20
2022/01/29 07:11:31 [49859] send_files(23, source/folder/file3)
2022/01/29 07:11:31 [49859] send_files mapped source/folder/file3 of size 0
2022/01/29 07:11:31 [49859] calling match_sums source/folder/file3
2022/01/29 07:11:31 [49859] sending file_sum
2022/01/29 07:11:31 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:31 [49859] >f+++++++++ folder/file3
2022/01/29 07:11:31 [49859] sender finished source/folder/file3
2022/01/29 07:11:31 [49859] send_files(24, source/folder/file4)
2022/01/29 07:11:31 [49859] send_files mapped source/folder/file4 of size 0
2022/01/29 07:11:31 [49859] calling match_sums source/folder/file4
2022/01/29 07:11:31 [49859] sending file_sum
2022/01/29 07:11:31 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:31 [49859] >f+++++++++ folder/file4
2022/01/29 07:11:31 [49859] sender finished source/folder/file4
2022/01/29 07:11:31 [49859] send_files(25, source/folder/file5)
2022/01/29 07:11:31 [49859] send_files mapped source/folder/file5 of size 0
2022/01/29 07:11:31 [49859] calling match_sums source/folder/file5
2022/01/29 07:11:31 [49859] sending file_sum
2022/01/29 07:11:31 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:31 [49859] >f+++++++++ folder/file5
2022/01/29 07:11:31 [49859] sender finished source/folder/file5
2022/01/29 07:11:31 [49859] send_files(26, source/folder/file6)
2022/01/29 07:11:31 [49859] send_files mapped source/folder/file6 of size 0
2022/01/29 07:11:31 [49859] calling match_sums source/folder/file6
2022/01/29 07:11:31 [49859] sending file_sum
2022/01/29 07:11:31 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:31 [49859] >f+++++++++ folder/file6
2022/01/29 07:11:31 [49859] sender finished source/folder/file6
2022/01/29 07:11:31 [49859] send_files(27, source/folder/file7)
2022/01/29 07:11:31 [49859] send_files mapped source/folder/file7 of size 0
2022/01/29 07:11:31 [49859] calling match_sums source/folder/file7
2022/01/29 07:11:31 [49859] sending file_sum
2022/01/29 07:11:31 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:31 [49859] >f+++++++++ folder/file7
2022/01/29 07:11:31 [49859] sender finished source/folder/file7
2022/01/29 07:11:31 [49859] send_files(28, source/folder/file8)
2022/01/29 07:11:31 [49859] send_files mapped source/folder/file8 of size 0
2022/01/29 07:11:31 [49859] calling match_sums source/folder/file8
2022/01/29 07:11:31 [49859] sending file_sum
2022/01/29 07:11:31 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:31 [49859] >f+++++++++ folder/file8
2022/01/29 07:11:31 [49859] sender finished source/folder/file8
2022/01/29 07:11:31 [49859] send_files(29, source/folder/file9)
2022/01/29 07:11:31 [49859] send_files mapped source/folder/file9 of size 0
2022/01/29 07:11:31 [49859] calling match_sums source/folder/file9
2022/01/29 07:11:31 [49859] sending file_sum
2022/01/29 07:11:31 [49859] false_alarms=0 hash_hits=0 matches=0
2022/01/29 07:11:31 [49859] >f+++++++++ folder/file9
2022/01/29 07:11:31 [49859] sender finished source/folder/file9
2022/01/29 07:11:31 [49859] got file_sum
2022/01/29 07:11:31 [49859] set modtime of .pyspark-2.4.5-py2.py3-none-any.whl.15ydcp to (1643424957) Fri Jan 28 18:55:57 2022
2022/01/29 07:11:31 [49859] renaming .pyspark-2.4.5-py2.py3-none-any.whl.15ydcp to pyspark-2.4.5-py2.py3-none-any.whl
2022/01/29 07:11:31 [49859] set modtime of . to (1643424952) Fri Jan 28 18:55:52 2022
2022/01/29 07:11:31 [49859] recv_files(folder)
2022/01/29 07:11:31 [49859] recv_files(folder/file1)
2022/01/29 07:11:31 [49859] got file_sum
2022/01/29 07:11:31 [49859] set modtime of folder/.file1.GY1N3v to (1643343369) Thu Jan 27 20:16:09 2022
2022/01/29 07:11:31 [49859] renaming folder/.file1.GY1N3v to folder/file1
2022/01/29 07:11:31 [49859] recv_files(folder/file10)
2022/01/29 07:11:31 [49859] got file_sum
2022/01/29 07:11:31 [49859] set modtime of folder/.file10.5JhpVC to (1643343369) Thu Jan 27 20:16:09 2022
2022/01/29 07:11:31 [49859] renaming folder/.file10.5JhpVC to folder/file10
2022/01/29 07:11:31 [49859] recv_files(folder/file11)
2022/01/29 07:11:31 [49859] got file_sum
2022/01/29 07:11:31 [49859] set modtime of folder/.file11.eL20MJ to (1643343369) Thu Jan 27 20:16:09 2022
2022/01/29 07:11:31 [49859] renaming folder/.file11.eL20MJ to folder/file11
2022/01/29 07:11:31 [49859] recv_files(folder/file12)
2022/01/29 07:11:31 [49859] got file_sum
2022/01/29 07:11:31 [49859] set modtime of folder/.file12.HMxDEQ to (1643343369) Thu Jan 27 20:16:09 2022
2022/01/29 07:11:31 [49859] renaming folder/.file12.HMxDEQ to folder/file12
2022/01/29 07:11:31 [49859] recv_files(folder/file13)
2022/01/29 07:11:31 [49859] got file_sum
2022/01/29 07:11:31 [49859] set modtime of folder/.file13.E0DgwX to (1643343369) Thu Jan 27 20:16:09 2022
2022/01/29 07:11:31 [49859] renaming folder/.file13.E0DgwX to folder/file13
2022/01/29 07:11:31 [49859] recv_files(folder/file14)
2022/01/29 07:11:31 [49859] got file_sum
2022/01/29 07:11:31 [49859] set modtime of folder/.file14.xcu0n4 to (1643343369) Thu Jan 27 20:16:09 2022
2022/01/29 07:11:31 [49859] renaming folder/.file14.xcu0n4 to folder/file14
2022/01/29 07:11:31 [49859] recv_files(folder/file15)
2022/01/29 07:11:31 [49859] got file_sum
2022/01/29 07:11:31 [49859] set modtime of folder/.file15.wwQKfb to (1643343369) Thu Jan 27 20:16:09 2022
2022/01/29 07:11:31 [49859] renaming folder/.file15.wwQKfb to folder/file15
2022/01/29 07:11:31 [49859] recv_files(folder/file16)
2022/01/29 07:11:31 [49859] got file_sum
2022/01/29 07:11:31 [49859] set modtime of folder/.file16.AIhB7h to (1643343369) Thu Jan 27 20:16:09 2022
2022/01/29 07:11:31 [49859] renaming folder/.file16.AIhB7h to folder/file16
2022/01/29 07:11:31 [49859] recv_files(folder/file17)
2022/01/29 07:11:31 [49859] got file_sum
2022/01/29 07:11:31 [49859] set modtime of folder/.file17.6l2AZo to (1643343369) Thu Jan 27 20:16:09 2022
2022/01/29 07:11:31 [49859] renaming folder/.file17.6l2AZo to folder/file17
2022/01/29 07:11:31 [49859] recv_files(folder/file18)
2022/01/29 07:11:31 [49859] got file_sum
2022/01/29 07:11:31 [49859] set modtime of folder/.file18.kA2BRv to (1643343369) Thu Jan 27 20:16:09 2022
2022/01/29 07:11:31 [49859] renaming folder/.file18.kA2BRv to folder/file18
2022/01/29 07:11:31 [49859] recv_files(folder/file19)
2022/01/29 07:11:31 [49859] got file_sum
2022/01/29 07:11:31 [49859] set modtime of folder/.file19.IBvDJC to (1643343369) Thu Jan 27 20:16:09 2022
2022/01/29 07:11:31 [49859] renaming folder/.file19.IBvDJC to folder/file19
2022/01/29 07:11:31 [49859] recv_files(folder/file2)
2022/01/29 07:11:31 [49859] got file_sum
2022/01/29 07:11:31 [49859] set modtime of folder/.file2.w7rFBJ to (1643343369) Thu Jan 27 20:16:09 2022
2022/01/29 07:11:31 [49859] renaming folder/.file2.w7rFBJ to folder/file2
2022/01/29 07:11:31 [49859] recv_files(folder/file20)
2022/01/29 07:11:31 [49859] got file_sum
2022/01/29 07:11:31 [49859] set modtime of folder/.file20.G3THtQ to (1643343369) Thu Jan 27 20:16:09 2022
2022/01/29 07:11:31 [49859] renaming folder/.file20.G3THtQ to folder/file20
2022/01/29 07:11:31 [49859] recv_files(folder/file3)
2022/01/29 07:11:31 [49859] got file_sum
2022/01/29 07:11:31 [49859] set modtime of folder/.file3.8gXKlX to (1643343369) Thu Jan 27 20:16:09 2022
2022/01/29 07:11:31 [49859] renaming folder/.file3.8gXKlX to folder/file3
2022/01/29 07:11:31 [49859] recv_files(folder/file4)
2022/01/29 07:11:31 [49859] got file_sum
2022/01/29 07:11:31 [49859] set modtime of folder/.file4.Xt8Ud4 to (1643343369) Thu Jan 27 20:16:09 2022
2022/01/29 07:11:31 [49859] renaming folder/.file4.Xt8Ud4 to folder/file4
2022/01/29 07:11:31 [49859] recv_files(folder/file5)
2022/01/29 07:11:31 [49859] got file_sum
2022/01/29 07:11:31 [49859] set modtime of folder/.file5.KjR55a to (1643343369) Thu Jan 27 20:16:09 2022
2022/01/29 07:11:31 [49859] renaming folder/.file5.KjR55a to folder/file5
2022/01/29 07:11:31 [49859] recv_files(folder/file6)
2022/01/29 07:11:31 [49859] got file_sum
2022/01/29 07:11:31 [49859] set modtime of folder/.file6.VwchYh to (1643343369) Thu Jan 27 20:16:09 2022
2022/01/29 07:11:31 [49859] renaming folder/.file6.VwchYh to folder/file6
2022/01/29 07:11:31 [49859] recv_files(folder/file7)
2022/01/29 07:11:31 [49859] got file_sum
2022/01/29 07:11:31 [49859] set modtime of folder/.file7.Uo9sQo to (1643343369) Thu Jan 27 20:16:09 2022
2022/01/29 07:11:31 [49859] renaming folder/.file7.Uo9sQo to folder/file7
2022/01/29 07:11:31 [49859] recv_files(folder/file8)
2022/01/29 07:11:31 [49859] got file_sum
2022/01/29 07:11:31 [49859] set modtime of folder/.file8.ht2FIv to (1643343369) Thu Jan 27 20:16:09 2022
2022/01/29 07:11:31 [49859] renaming folder/.file8.ht2FIv to folder/file8
2022/01/29 07:11:31 [49859] recv_files(folder/file9)
2022/01/29 07:11:31 [49859] got file_sum
2022/01/29 07:11:31 [49859] set modtime of folder/.file9.2HK6AC to (1643343369) Thu Jan 27 20:16:09 2022
2022/01/29 07:11:31 [49859] renaming folder/.file9.2HK6AC to folder/file9
2022/01/29 07:11:31 [49859] set modtime of folder to (1643343369) Thu Jan 27 20:16:09 2022
2022/01/29 07:11:31 [49859] send_files phase=1
2022/01/29 07:11:31 [49859] recv_files phase=1
2022/01/29 07:11:31 [49859] generate_files phase=2
2022/01/29 07:11:31 [49859] send_files phase=2
2022/01/29 07:11:31 [49859] send files finished
2022/01/29 07:11:31 [49859] total: matches=0 hash_hits=0 false_alarms=0 data=218258163
2022/01/29 07:11:31 [49859] recv_files phase=2
2022/01/29 07:11:31 [49859] recv_files finished
2022/01/29 07:11:31 [49859] generate_files phase=3
2022/01/29 07:11:31 [49859] generate_files finished
2022/01/29 07:11:31 [49859] sent 218,313,010 bytes received 8,608 bytes 87,328,647.20 bytes/sec
2022/01/29 07:11:31 [49859] total size is 218,258,163 speedup is 1.00
2022/01/29 07:11:31 [49859] [sender] _exit_cleanup(code=0, file=main.c, line=1178): about to call exit(0)
2022/01/29 07:11:33 [49865] building file list
2022/01/29 07:11:33 [49865] [sender] make_file(.,*,0)
2022/01/29 07:11:33 [49865] [sender] pushing local filters for /home/kbrazil/rsynctest/source/
2022/01/29 07:11:33 [49865] [sender] make_file(a.txt,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(b.txt,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(c.txt,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(d.txt,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(file with spaces.txt,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(folder,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(pyspark-2.4.5-py2.py3-none-any.whl,*,2)
2022/01/29 07:11:33 [49865] send_file_list done
2022/01/29 07:11:33 [49865] send_files starting
2022/01/29 07:11:33 [49865] [sender] pushing local filters for /home/kbrazil/rsynctest/source/folder/
2022/01/29 07:11:33 [49865] [sender] make_file(folder/file1,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(folder/file2,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(folder/file3,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(folder/file4,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(folder/file5,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(folder/file6,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(folder/file7,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(folder/file8,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(folder/file9,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(folder/file10,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(folder/file11,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(folder/file12,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(folder/file13,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(folder/file14,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(folder/file15,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(folder/file16,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(folder/file17,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(folder/file18,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(folder/file19,*,2)
2022/01/29 07:11:33 [49865] [sender] make_file(folder/file20,*,2)
2022/01/29 07:11:33 [49865] server_recv(2) starting pid=49866
2022/01/29 07:11:33 [49865] recv_file_name(.)
2022/01/29 07:11:33 [49865] recv_file_name(a.txt)
2022/01/29 07:11:33 [49865] recv_file_name(b.txt)
2022/01/29 07:11:33 [49865] recv_file_name(c.txt)
2022/01/29 07:11:33 [49865] recv_file_name(d.txt)
2022/01/29 07:11:33 [49865] recv_file_name(file with spaces.txt)
2022/01/29 07:11:33 [49865] recv_file_name(folder)
2022/01/29 07:11:33 [49865] recv_file_name(pyspark-2.4.5-py2.py3-none-any.whl)
2022/01/29 07:11:33 [49865] received 8 names
2022/01/29 07:11:33 [49865] recv_file_list done
2022/01/29 07:11:33 [49865] get_local_name count=8 dest
2022/01/29 07:11:33 [49865] generator starting pid=49866
2022/01/29 07:11:33 [49865] delta-transmission disabled for local transfer or --whole-file
2022/01/29 07:11:33 [49865] recv_generator(.,0)
2022/01/29 07:11:33 [49865] recv_generator(.,1)
2022/01/29 07:11:33 [49865] recv_generator(a.txt,2)
2022/01/29 07:11:33 [49865] recv_generator(b.txt,3)
2022/01/29 07:11:33 [49865] recv_generator(c.txt,4)
2022/01/29 07:11:33 [49865] recv_generator(d.txt,5)
2022/01/29 07:11:33 [49865] recv_generator(file with spaces.txt,6)
2022/01/29 07:11:33 [49865] recv_generator(pyspark-2.4.5-py2.py3-none-any.whl,7)
2022/01/29 07:11:33 [49865] recv_generator(folder,8)
2022/01/29 07:11:33 [49865] send_files(0, source/.)
2022/01/29 07:11:33 [49865] .d ./
2022/01/29 07:11:33 [49865] send_files(2, source/a.txt)
2022/01/29 07:11:33 [49865] .f a.txt
2022/01/29 07:11:33 [49865] send_files(3, source/b.txt)
2022/01/29 07:11:33 [49865] .f b.txt
2022/01/29 07:11:33 [49865] send_files(4, source/c.txt)
2022/01/29 07:11:33 [49865] .f c.txt
2022/01/29 07:11:33 [49865] send_files(5, source/d.txt)
2022/01/29 07:11:33 [49865] .f d.txt
2022/01/29 07:11:33 [49865] send_files(6, source/file with spaces.txt)
2022/01/29 07:11:33 [49865] .f file with spaces.txt
2022/01/29 07:11:33 [49865] send_files(7, source/pyspark-2.4.5-py2.py3-none-any.whl)
2022/01/29 07:11:33 [49865] .f pyspark-2.4.5-py2.py3-none-any.whl
2022/01/29 07:11:33 [49865] recv_files(8) starting
2022/01/29 07:11:33 [49865] [receiver] receiving flist for dir 1
2022/01/29 07:11:33 [49865] recv_file_name(folder/file1)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file2)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file3)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file4)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file5)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file6)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file7)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file8)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file9)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file10)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file11)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file12)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file13)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file14)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file15)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file16)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file17)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file18)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file19)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file20)
2022/01/29 07:11:33 [49865] received 20 names
2022/01/29 07:11:33 [49865] recv_file_list done
2022/01/29 07:11:33 [49865] recv_files(.)
2022/01/29 07:11:33 [49865] recv_files(a.txt)
2022/01/29 07:11:33 [49865] recv_files(b.txt)
2022/01/29 07:11:33 [49865] recv_files(c.txt)
2022/01/29 07:11:33 [49865] recv_files(d.txt)
2022/01/29 07:11:33 [49865] recv_files(file with spaces.txt)
2022/01/29 07:11:33 [49865] recv_files(pyspark-2.4.5-py2.py3-none-any.whl)
2022/01/29 07:11:33 [49865] [generator] receiving flist for dir 1
2022/01/29 07:11:33 [49865] recv_file_name(folder/file1)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file2)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file3)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file4)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file5)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file6)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file7)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file8)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file9)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file10)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file11)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file12)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file13)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file14)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file15)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file16)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file17)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file18)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file19)
2022/01/29 07:11:33 [49865] recv_file_name(folder/file20)
2022/01/29 07:11:33 [49865] received 20 names
2022/01/29 07:11:33 [49865] recv_file_list done
2022/01/29 07:11:33 [49865] recv_generator(folder,9)
2022/01/29 07:11:33 [49865] recv_generator(folder/file1,10)
2022/01/29 07:11:33 [49865] recv_generator(folder/file10,11)
2022/01/29 07:11:33 [49865] recv_generator(folder/file11,12)
2022/01/29 07:11:33 [49865] recv_generator(folder/file12,13)
2022/01/29 07:11:33 [49865] recv_generator(folder/file13,14)
2022/01/29 07:11:33 [49865] recv_generator(folder/file14,15)
2022/01/29 07:11:33 [49865] recv_generator(folder/file15,16)
2022/01/29 07:11:33 [49865] recv_generator(folder/file16,17)
2022/01/29 07:11:33 [49865] recv_generator(folder/file17,18)
2022/01/29 07:11:33 [49865] recv_generator(folder/file18,19)
2022/01/29 07:11:33 [49865] recv_generator(folder/file19,20)
2022/01/29 07:11:33 [49865] recv_generator(folder/file2,21)
2022/01/29 07:11:33 [49865] recv_generator(folder/file20,22)
2022/01/29 07:11:33 [49865] recv_generator(folder/file3,23)
2022/01/29 07:11:33 [49865] recv_generator(folder/file4,24)
2022/01/29 07:11:33 [49865] recv_generator(folder/file5,25)
2022/01/29 07:11:33 [49865] recv_generator(folder/file6,26)
2022/01/29 07:11:33 [49865] recv_generator(folder/file7,27)
2022/01/29 07:11:33 [49865] recv_generator(folder/file8,28)
2022/01/29 07:11:33 [49865] recv_generator(folder/file9,29)
2022/01/29 07:11:33 [49865] generate_files phase=1
2022/01/29 07:11:33 [49865] send_files(9, source/folder)
2022/01/29 07:11:33 [49865] .d folder/
2022/01/29 07:11:33 [49865] send_files(10, source/folder/file1)
2022/01/29 07:11:33 [49865] .f folder/file1
2022/01/29 07:11:33 [49865] send_files(11, source/folder/file10)
2022/01/29 07:11:33 [49865] .f folder/file10
2022/01/29 07:11:33 [49865] send_files(12, source/folder/file11)
2022/01/29 07:11:33 [49865] .f folder/file11
2022/01/29 07:11:33 [49865] send_files(13, source/folder/file12)
2022/01/29 07:11:33 [49865] .f folder/file12
2022/01/29 07:11:33 [49865] send_files(14, source/folder/file13)
2022/01/29 07:11:33 [49865] .f folder/file13
2022/01/29 07:11:33 [49865] send_files(15, source/folder/file14)
2022/01/29 07:11:33 [49865] .f folder/file14
2022/01/29 07:11:33 [49865] send_files(16, source/folder/file15)
2022/01/29 07:11:33 [49865] .f folder/file15
2022/01/29 07:11:33 [49865] send_files(17, source/folder/file16)
2022/01/29 07:11:33 [49865] .f folder/file16
2022/01/29 07:11:33 [49865] send_files(18, source/folder/file17)
2022/01/29 07:11:33 [49865] .f folder/file17
2022/01/29 07:11:33 [49865] send_files(19, source/folder/file18)
2022/01/29 07:11:33 [49865] .f folder/file18
2022/01/29 07:11:33 [49865] send_files(20, source/folder/file19)
2022/01/29 07:11:33 [49865] .f folder/file19
2022/01/29 07:11:33 [49865] send_files(21, source/folder/file2)
2022/01/29 07:11:33 [49865] .f folder/file2
2022/01/29 07:11:33 [49865] send_files(22, source/folder/file20)
2022/01/29 07:11:33 [49865] .f folder/file20
2022/01/29 07:11:33 [49865] send_files(23, source/folder/file3)
2022/01/29 07:11:33 [49865] .f folder/file3
2022/01/29 07:11:33 [49865] send_files(24, source/folder/file4)
2022/01/29 07:11:33 [49865] .f folder/file4
2022/01/29 07:11:33 [49865] send_files(25, source/folder/file5)
2022/01/29 07:11:33 [49865] .f folder/file5
2022/01/29 07:11:33 [49865] send_files(26, source/folder/file6)
2022/01/29 07:11:33 [49865] .f folder/file6
2022/01/29 07:11:33 [49865] send_files(27, source/folder/file7)
2022/01/29 07:11:33 [49865] .f folder/file7
2022/01/29 07:11:33 [49865] send_files(28, source/folder/file8)
2022/01/29 07:11:33 [49865] .f folder/file8
2022/01/29 07:11:33 [49865] send_files(29, source/folder/file9)
2022/01/29 07:11:33 [49865] .f folder/file9
2022/01/29 07:11:33 [49865] send_files phase=1
2022/01/29 07:11:33 [49865] recv_files(folder)
2022/01/29 07:11:33 [49865] recv_files(folder/file1)
2022/01/29 07:11:33 [49865] recv_files(folder/file10)
2022/01/29 07:11:33 [49865] recv_files(folder/file11)
2022/01/29 07:11:33 [49865] recv_files(folder/file12)
2022/01/29 07:11:33 [49865] recv_files(folder/file13)
2022/01/29 07:11:33 [49865] recv_files(folder/file14)
2022/01/29 07:11:33 [49865] recv_files(folder/file15)
2022/01/29 07:11:33 [49865] recv_files(folder/file16)
2022/01/29 07:11:33 [49865] recv_files(folder/file17)
2022/01/29 07:11:33 [49865] recv_files(folder/file18)
2022/01/29 07:11:33 [49865] recv_files(folder/file19)
2022/01/29 07:11:33 [49865] recv_files(folder/file2)
2022/01/29 07:11:33 [49865] recv_files(folder/file20)
2022/01/29 07:11:33 [49865] recv_files(folder/file3)
2022/01/29 07:11:33 [49865] recv_files(folder/file4)
2022/01/29 07:11:33 [49865] recv_files(folder/file5)
2022/01/29 07:11:33 [49865] recv_files(folder/file6)
2022/01/29 07:11:33 [49865] recv_files(folder/file7)
2022/01/29 07:11:33 [49865] recv_files(folder/file8)
2022/01/29 07:11:33 [49865] recv_files(folder/file9)
2022/01/29 07:11:33 [49865] recv_files phase=1
2022/01/29 07:11:33 [49865] generate_files phase=2
2022/01/29 07:11:33 [49865] send_files phase=2
2022/01/29 07:11:33 [49865] send files finished
2022/01/29 07:11:33 [49865] total: matches=0 hash_hits=0 false_alarms=0 data=0
2022/01/29 07:11:33 [49865] recv_files phase=2
2022/01/29 07:11:33 [49865] recv_files finished
2022/01/29 07:11:33 [49865] generate_files phase=3
2022/01/29 07:11:33 [49865] generate_files finished
2022/01/29 07:11:33 [49865] sent 611 bytes received 4,043 bytes 9,308.00 bytes/sec
2022/01/29 07:11:33 [49865] total size is 218,258,163 speedup is 46,896.90
2022/01/29 07:11:33 [49865] [sender] _exit_cleanup(code=0, file=main.c, line=1178): about to call exit(0)

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because it is too large Load Diff

File diff suppressed because one or more lines are too long

tests/fixtures/centos-7.7/rsync-i.out vendored Normal file

@@ -0,0 +1,28 @@
.d..t...... ./
>f+++++++++ a.txt
>f+++++++++ b.txt
>f+++++++++ c.txt
>f+++++++++ d.txt
>f+++++++++ file with spaces.txt
cd+++++++++ folder/
>f+++++++++ folder/file1
>f+++++++++ folder/file10
>f+++++++++ folder/file11
>f+++++++++ folder/file12
>f+++++++++ folder/file13
>f+++++++++ folder/file14
>f+++++++++ folder/file15
>f+++++++++ folder/file16
>f+++++++++ folder/file17
>f+++++++++ folder/file18
>f+++++++++ folder/file19
>f+++++++++ folder/file2
>f+++++++++ folder/file20
>f+++++++++ folder/file3
>f+++++++++ folder/file4
>f+++++++++ folder/file5
>f+++++++++ folder/file6
>f+++++++++ folder/file7
>f+++++++++ folder/file8
>f+++++++++ folder/file9
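The rsync-i.out fixture above is raw `--itemize-changes` output, which jc's new rsync parser consumes. As an aside, the 11-character `%i` field in each line can be decoded with stdlib-only Python; this is a minimal sketch based on the `%i` description in the rsync man page, not jc's actual parser code:

```python
import re

# Maps for the first two positions of the %i field, per rsync(1):
# position 0 is the update type, position 1 is the file type, and the
# remaining nine characters are per-attribute change flags.
UPDATE_TYPES = {
    '<': 'sent', '>': 'received', 'c': 'local change or creation',
    '.': 'not updated', 'h': 'hard link', '*': 'message',
}
FILE_TYPES = {'f': 'file', 'd': 'directory', 'L': 'symlink',
              'D': 'device', 'S': 'special file'}

def decode_itemize(line):
    """Split an rsync -i output line into its %i code and filename."""
    m = re.match(r'^(\S{11}) (.+)$', line)
    if not m:
        return None
    code, filename = m.groups()
    return {
        'filename': filename,
        'update_type': UPDATE_TYPES.get(code[0], 'unknown'),
        'file_type': FILE_TYPES.get(code[1], 'unknown'),
        # '+' in the attribute positions marks a newly created item
        'newly_created': code[2:].startswith('+'),
    }

print(decode_itemize('>f+++++++++ folder/file1'))
# → {'filename': 'folder/file1', 'update_type': 'received',
#    'file_type': 'file', 'newly_created': True}
```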

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long


@@ -0,0 +1,224 @@
sending incremental file list
[sender] make_file(.,*,0)
[sender] pushing local filters for /home/kbrazil/rsynctest/source/
[sender] make_file(a.txt,*,2)
[sender] make_file(b.txt,*,2)
[sender] make_file(c.txt,*,2)
[sender] make_file(d.txt,*,2)
[sender] make_file(file with spaces.txt,*,2)
[sender] make_file(folder,*,2)
[sender] make_file(pyspark-2.4.5-py2.py3-none-any.whl,*,2)
send_file_list done
send_files starting
[sender] pushing local filters for /home/kbrazil/rsynctest/source/folder/
[sender] make_file(folder/file1,*,2)
[sender] make_file(folder/file2,*,2)
[sender] make_file(folder/file3,*,2)
[sender] make_file(folder/file4,*,2)
[sender] make_file(folder/file5,*,2)
[sender] make_file(folder/file6,*,2)
[sender] make_file(folder/file7,*,2)
[sender] make_file(folder/file8,*,2)
[sender] make_file(folder/file9,*,2)
[sender] make_file(folder/file10,*,2)
[sender] make_file(folder/file11,*,2)
[sender] make_file(folder/file12,*,2)
[sender] make_file(folder/file13,*,2)
[sender] make_file(folder/file14,*,2)
[sender] make_file(folder/file15,*,2)
[sender] make_file(folder/file16,*,2)
[sender] make_file(folder/file17,*,2)
[sender] make_file(folder/file18,*,2)
[sender] make_file(folder/file19,*,2)
[sender] make_file(folder/file20,*,2)
server_recv(2) starting pid=58141
recv_file_name(.)
recv_file_name(a.txt)
recv_file_name(b.txt)
recv_file_name(c.txt)
recv_file_name(d.txt)
recv_file_name(file with spaces.txt)
recv_file_name(folder)
recv_file_name(pyspark-2.4.5-py2.py3-none-any.whl)
received 8 names
recv_file_list done
get_local_name count=8 dest
generator starting pid=58141
delta-transmission disabled for local transfer or --whole-file
recv_generator(.,0)
recv_generator(.,1)
recv_generator(a.txt,2)
recv_generator(b.txt,3)
recv_generator(c.txt,4)
recv_generator(d.txt,5)
recv_generator(file with spaces.txt,6)
recv_generator(pyspark-2.4.5-py2.py3-none-any.whl,7)
recv_generator(folder,8)
send_files(0, source/.)
.d ./
send_files(2, source/a.txt)
.f a.txt
send_files(3, source/b.txt)
.f b.txt
send_files(4, source/c.txt)
.f c.txt
send_files(5, source/d.txt)
.f d.txt
send_files(6, source/file with spaces.txt)
.f file with spaces.txt
send_files(7, source/pyspark-2.4.5-py2.py3-none-any.whl)
.f pyspark-2.4.5-py2.py3-none-any.whl
recv_files(8) starting
[receiver] receiving flist for dir 1
recv_file_name(folder/file1)
recv_file_name(folder/file2)
recv_file_name(folder/file3)
recv_file_name(folder/file4)
recv_file_name(folder/file5)
recv_file_name(folder/file6)
recv_file_name(folder/file7)
recv_file_name(folder/file8)
recv_file_name(folder/file9)
recv_file_name(folder/file10)
recv_file_name(folder/file11)
recv_file_name(folder/file12)
recv_file_name(folder/file13)
recv_file_name(folder/file14)
recv_file_name(folder/file15)
recv_file_name(folder/file16)
recv_file_name(folder/file17)
recv_file_name(folder/file18)
recv_file_name(folder/file19)
recv_file_name(folder/file20)
received 20 names
recv_file_list done
recv_files(.)
recv_files(a.txt)
recv_files(b.txt)
recv_files(c.txt)
recv_files(d.txt)
recv_files(file with spaces.txt)
recv_files(pyspark-2.4.5-py2.py3-none-any.whl)
[generator] receiving flist for dir 1
recv_file_name(folder/file1)
recv_file_name(folder/file2)
recv_file_name(folder/file3)
recv_file_name(folder/file4)
recv_file_name(folder/file5)
recv_file_name(folder/file6)
recv_file_name(folder/file7)
recv_file_name(folder/file8)
recv_file_name(folder/file9)
recv_file_name(folder/file10)
recv_file_name(folder/file11)
recv_file_name(folder/file12)
recv_file_name(folder/file13)
recv_file_name(folder/file14)
recv_file_name(folder/file15)
recv_file_name(folder/file16)
recv_file_name(folder/file17)
recv_file_name(folder/file18)
recv_file_name(folder/file19)
recv_file_name(folder/file20)
received 20 names
recv_file_list done
recv_generator(folder,9)
recv_generator(folder/file1,10)
recv_generator(folder/file10,11)
recv_generator(folder/file11,12)
recv_generator(folder/file12,13)
recv_generator(folder/file13,14)
recv_generator(folder/file14,15)
recv_generator(folder/file15,16)
recv_generator(folder/file16,17)
recv_generator(folder/file17,18)
recv_generator(folder/file18,19)
recv_generator(folder/file19,20)
recv_generator(folder/file2,21)
recv_generator(folder/file20,22)
recv_generator(folder/file3,23)
recv_generator(folder/file4,24)
recv_generator(folder/file5,25)
recv_generator(folder/file6,26)
recv_generator(folder/file7,27)
recv_generator(folder/file8,28)
recv_generator(folder/file9,29)
generate_files phase=1
send_files(9, source/folder)
.d folder/
send_files(10, source/folder/file1)
.f folder/file1
send_files(11, source/folder/file10)
.f folder/file10
send_files(12, source/folder/file11)
.f folder/file11
send_files(13, source/folder/file12)
.f folder/file12
send_files(14, source/folder/file13)
.f folder/file13
send_files(15, source/folder/file14)
.f folder/file14
send_files(16, source/folder/file15)
.f folder/file15
send_files(17, source/folder/file16)
.f folder/file16
send_files(18, source/folder/file17)
.f folder/file17
send_files(19, source/folder/file18)
.f folder/file18
send_files(20, source/folder/file19)
.f folder/file19
send_files(21, source/folder/file2)
.f folder/file2
send_files(22, source/folder/file20)
.f folder/file20
send_files(23, source/folder/file3)
.f folder/file3
send_files(24, source/folder/file4)
.f folder/file4
send_files(25, source/folder/file5)
.f folder/file5
send_files(26, source/folder/file6)
.f folder/file6
send_files(27, source/folder/file7)
.f folder/file7
send_files(28, source/folder/file8)
.f folder/file8
send_files(29, source/folder/file9)
.f folder/file9
send_files phase=1
recv_files(folder)
recv_files(folder/file1)
recv_files(folder/file10)
recv_files(folder/file11)
recv_files(folder/file12)
recv_files(folder/file13)
recv_files(folder/file14)
recv_files(folder/file15)
recv_files(folder/file16)
recv_files(folder/file17)
recv_files(folder/file18)
recv_files(folder/file19)
recv_files(folder/file2)
recv_files(folder/file20)
recv_files(folder/file3)
recv_files(folder/file4)
recv_files(folder/file5)
recv_files(folder/file6)
recv_files(folder/file7)
recv_files(folder/file8)
recv_files(folder/file9)
recv_files phase=1
generate_files phase=2
send_files phase=2
send files finished
total: matches=0 hash_hits=0 false_alarms=0 data=0
recv_files phase=2
recv_files finished
generate_files phase=3
generate_files finished
sent 611 bytes received 4,043 bytes 9,308.00 bytes/sec
total size is 218,258,163 speedup is 46,896.90
[sender] _exit_cleanup(code=0, file=main.c, line=1178): about to call exit(0)
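Each fixture ends with the two transfer-summary lines ("sent … bytes received … bytes …" and "total size is … speedup is …"). The numeric conversion these lines need (comma-grouped integers, float rates) can be sketched with stdlib-only Python; the field names follow the README's rsync example, but the exact internals of jc's parser are an assumption here:

```python
import re

def parse_summary(sent_line, total_line):
    """Parse rsync's two trailing summary lines into typed values."""
    m1 = re.match(
        r'sent ([\d,]+) bytes\s+received ([\d,]+) bytes\s+([\d,.]+) bytes/sec',
        sent_line)
    m2 = re.match(r'total size is ([\d,]+)\s+speedup is ([\d,.]+)', total_line)
    if not (m1 and m2):
        return None
    num = lambda s: float(s.replace(',', ''))  # strip thousands separators
    return {
        'sent': int(num(m1.group(1))),
        'received': int(num(m1.group(2))),
        'bytes_sec': num(m1.group(3)),
        'total_size': int(num(m2.group(1))),
        'speedup': num(m2.group(2)),
    }

print(parse_summary(
    'sent 611 bytes  received 4,043 bytes  9,308.00 bytes/sec',
    'total size is 218,258,163  speedup is 46,896.90'))
# → {'sent': 611, 'received': 4043, 'bytes_sec': 9308.0,
#    'total_size': 218258163, 'speedup': 46896.9}
```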

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

tests/fixtures/centos-7.7/rsync-ivvv.out vendored Normal file

@@ -0,0 +1,422 @@
sending incremental file list
[sender] make_file(.,*,0)
[sender] pushing local filters for /home/kbrazil/rsynctest/source/
[sender] make_file(a.txt,*,2)
[sender] make_file(b.txt,*,2)
[sender] make_file(c.txt,*,2)
[sender] make_file(d.txt,*,2)
[sender] make_file(file with spaces.txt,*,2)
[sender] make_file(folder,*,2)
send_file_list done
send_files starting
[sender] pushing local filters for /home/kbrazil/rsynctest/source/folder/
[sender] make_file(folder/file1,*,2)
[sender] make_file(folder/file2,*,2)
[sender] make_file(folder/file3,*,2)
[sender] make_file(folder/file4,*,2)
[sender] make_file(folder/file5,*,2)
[sender] make_file(folder/file6,*,2)
[sender] make_file(folder/file7,*,2)
[sender] make_file(folder/file8,*,2)
[sender] make_file(folder/file9,*,2)
[sender] make_file(folder/file10,*,2)
[sender] make_file(folder/file11,*,2)
[sender] make_file(folder/file12,*,2)
[sender] make_file(folder/file13,*,2)
[sender] make_file(folder/file14,*,2)
[sender] make_file(folder/file15,*,2)
[sender] make_file(folder/file16,*,2)
[sender] make_file(folder/file17,*,2)
[sender] make_file(folder/file18,*,2)
[sender] make_file(folder/file19,*,2)
[sender] make_file(folder/file20,*,2)
server_recv(2) starting pid=7804
recv_file_name(.)
recv_file_name(a.txt)
recv_file_name(b.txt)
recv_file_name(c.txt)
recv_file_name(d.txt)
recv_file_name(file with spaces.txt)
recv_file_name(folder)
received 7 names
recv_file_list done
get_local_name count=7 dest
generator starting pid=7804
delta-transmission disabled for local transfer or --whole-file
recv_generator(.,0)
set modtime of . to (1643343337) Thu Jan 27 20:15:37 2022
recv_generator(.,1)
recv_generator(a.txt,2)
recv_generator(b.txt,3)
recv_generator(c.txt,4)
recv_generator(d.txt,5)
recv_generator(file with spaces.txt,6)
recv_generator(folder,7)
send_files(0, source/.)
.d..t...... ./
send_files(2, source/a.txt)
send_files mapped source/a.txt of size 47
calling match_sums source/a.txt
>f+++++++++ a.txt
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/a.txt
send_files(3, source/b.txt)
send_files mapped source/b.txt of size 47
calling match_sums source/b.txt
>f+++++++++ b.txt
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/b.txt
send_files(4, source/c.txt)
send_files mapped source/c.txt of size 47
calling match_sums source/c.txt
>f+++++++++ c.txt
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/c.txt
send_files(5, source/d.txt)
send_files mapped source/d.txt of size 47
calling match_sums source/d.txt
>f+++++++++ d.txt
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/d.txt
send_files(6, source/file with spaces.txt)
send_files mapped source/file with spaces.txt of size 47
calling match_sums source/file with spaces.txt
>f+++++++++ file with spaces.txt
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/file with spaces.txt
recv_files(7) starting
[receiver] receiving flist for dir 1
recv_file_name(folder/file1)
recv_file_name(folder/file2)
recv_file_name(folder/file3)
recv_file_name(folder/file4)
recv_file_name(folder/file5)
recv_file_name(folder/file6)
recv_file_name(folder/file7)
recv_file_name(folder/file8)
recv_file_name(folder/file9)
recv_file_name(folder/file10)
recv_file_name(folder/file11)
recv_file_name(folder/file12)
recv_file_name(folder/file13)
recv_file_name(folder/file14)
recv_file_name(folder/file15)
recv_file_name(folder/file16)
recv_file_name(folder/file17)
recv_file_name(folder/file18)
recv_file_name(folder/file19)
recv_file_name(folder/file20)
received 20 names
recv_file_list done
recv_files(.)
recv_files(a.txt)
got file_sum
set modtime of .a.txt.WA5SfS to (1643342949) Thu Jan 27 20:09:09 2022
renaming .a.txt.WA5SfS to a.txt
recv_files(b.txt)
got file_sum
set modtime of .b.txt.b6U9z7 to (1643342953) Thu Jan 27 20:09:13 2022
renaming .b.txt.b6U9z7 to b.txt
recv_files(c.txt)
got file_sum
set modtime of .c.txt.0ZRrUm to (1643342956) Thu Jan 27 20:09:16 2022
renaming .c.txt.0ZRrUm to c.txt
recv_files(d.txt)
got file_sum
set modtime of .d.txt.N3zKeC to (1643342959) Thu Jan 27 20:09:19 2022
renaming .d.txt.N3zKeC to d.txt
recv_files(file with spaces.txt)
got file_sum
set modtime of .file with spaces.txt.KN33yR to (1643342980) Thu Jan 27 20:09:40 2022
renaming .file with spaces.txt.KN33yR to file with spaces.txt
[generator] receiving flist for dir 1
recv_file_name(folder/file1)
recv_file_name(folder/file2)
recv_file_name(folder/file3)
recv_file_name(folder/file4)
recv_file_name(folder/file5)
recv_file_name(folder/file6)
recv_file_name(folder/file7)
recv_file_name(folder/file8)
recv_file_name(folder/file9)
recv_file_name(folder/file10)
recv_file_name(folder/file11)
recv_file_name(folder/file12)
recv_file_name(folder/file13)
recv_file_name(folder/file14)
recv_file_name(folder/file15)
recv_file_name(folder/file16)
recv_file_name(folder/file17)
recv_file_name(folder/file18)
recv_file_name(folder/file19)
recv_file_name(folder/file20)
received 20 names
recv_file_list done
recv_generator(folder,8)
set modtime of folder to (1643343369) Thu Jan 27 20:16:09 2022
recv_generator(folder/file1,9)
set modtime of . to (1643343337) Thu Jan 27 20:15:37 2022
recv_generator(folder/file10,10)
recv_generator(folder/file11,11)
recv_generator(folder/file12,12)
recv_generator(folder/file13,13)
recv_generator(folder/file14,14)
recv_generator(folder/file15,15)
recv_generator(folder/file16,16)
recv_generator(folder/file17,17)
recv_generator(folder/file18,18)
recv_generator(folder/file19,19)
recv_generator(folder/file2,20)
recv_generator(folder/file20,21)
recv_generator(folder/file3,22)
recv_generator(folder/file4,23)
recv_generator(folder/file5,24)
recv_generator(folder/file6,25)
recv_generator(folder/file7,26)
recv_generator(folder/file8,27)
recv_generator(folder/file9,28)
generate_files phase=1
send_files(8, source/folder)
cd+++++++++ folder/
send_files(9, source/folder/file1)
send_files mapped source/folder/file1 of size 0
calling match_sums source/folder/file1
>f+++++++++ folder/file1
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/folder/file1
send_files(10, source/folder/file10)
send_files mapped source/folder/file10 of size 0
calling match_sums source/folder/file10
>f+++++++++ folder/file10
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/folder/file10
send_files(11, source/folder/file11)
send_files mapped source/folder/file11 of size 0
calling match_sums source/folder/file11
>f+++++++++ folder/file11
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/folder/file11
send_files(12, source/folder/file12)
send_files mapped source/folder/file12 of size 0
calling match_sums source/folder/file12
>f+++++++++ folder/file12
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/folder/file12
send_files(13, source/folder/file13)
send_files mapped source/folder/file13 of size 0
calling match_sums source/folder/file13
>f+++++++++ folder/file13
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/folder/file13
send_files(14, source/folder/file14)
send_files mapped source/folder/file14 of size 0
calling match_sums source/folder/file14
>f+++++++++ folder/file14
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/folder/file14
send_files(15, source/folder/file15)
send_files mapped source/folder/file15 of size 0
calling match_sums source/folder/file15
>f+++++++++ folder/file15
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/folder/file15
send_files(16, source/folder/file16)
send_files mapped source/folder/file16 of size 0
calling match_sums source/folder/file16
>f+++++++++ folder/file16
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/folder/file16
send_files(17, source/folder/file17)
send_files mapped source/folder/file17 of size 0
calling match_sums source/folder/file17
>f+++++++++ folder/file17
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/folder/file17
send_files(18, source/folder/file18)
send_files mapped source/folder/file18 of size 0
calling match_sums source/folder/file18
>f+++++++++ folder/file18
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/folder/file18
send_files(19, source/folder/file19)
send_files mapped source/folder/file19 of size 0
calling match_sums source/folder/file19
>f+++++++++ folder/file19
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/folder/file19
send_files(20, source/folder/file2)
send_files mapped source/folder/file2 of size 0
calling match_sums source/folder/file2
>f+++++++++ folder/file2
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/folder/file2
send_files(21, source/folder/file20)
send_files mapped source/folder/file20 of size 0
calling match_sums source/folder/file20
>f+++++++++ folder/file20
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/folder/file20
send_files(22, source/folder/file3)
send_files mapped source/folder/file3 of size 0
calling match_sums source/folder/file3
>f+++++++++ folder/file3
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/folder/file3
send_files(23, source/folder/file4)
send_files mapped source/folder/file4 of size 0
calling match_sums source/folder/file4
>f+++++++++ folder/file4
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/folder/file4
send_files(24, source/folder/file5)
send_files mapped source/folder/file5 of size 0
calling match_sums source/folder/file5
>f+++++++++ folder/file5
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/folder/file5
send_files(25, source/folder/file6)
send_files mapped source/folder/file6 of size 0
calling match_sums source/folder/file6
>f+++++++++ folder/file6
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/folder/file6
send_files(26, source/folder/file7)
send_files mapped source/folder/file7 of size 0
calling match_sums source/folder/file7
>f+++++++++ folder/file7
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/folder/file7
send_files(27, source/folder/file8)
send_files mapped source/folder/file8 of size 0
calling match_sums source/folder/file8
>f+++++++++ folder/file8
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/folder/file8
send_files(28, source/folder/file9)
send_files mapped source/folder/file9 of size 0
calling match_sums source/folder/file9
>f+++++++++ folder/file9
sending file_sum
false_alarms=0 hash_hits=0 matches=0
sender finished source/folder/file9
recv_files(folder)
recv_files(folder/file1)
got file_sum
set modtime of folder/.file1.iqSNT6 to (1643343369) Thu Jan 27 20:16:09 2022
renaming folder/.file1.iqSNT6 to folder/file1
recv_files(folder/file10)
got file_sum
set modtime of folder/.file10.AlTyem to (1643343369) Thu Jan 27 20:16:09 2022
renaming folder/.file10.AlTyem to folder/file10
recv_files(folder/file11)
got file_sum
set modtime of folder/.file11.McEkzB to (1643343369) Thu Jan 27 20:16:09 2022
renaming folder/.file11.McEkzB to folder/file11
recv_files(folder/file12)
got file_sum
set modtime of folder/.file12.6i46TQ to (1643343369) Thu Jan 27 20:16:09 2022
renaming folder/.file12.6i46TQ to folder/file12
recv_files(folder/file13)
got file_sum
set modtime of folder/.file13.GE3Te6 to (1643343369) Thu Jan 27 20:16:09 2022
renaming folder/.file13.GE3Te6 to folder/file13
recv_files(folder/file14)
got file_sum
set modtime of folder/.file14.iHFHzl to (1643343369) Thu Jan 27 20:16:09 2022
renaming folder/.file14.iHFHzl to folder/file14
recv_files(folder/file15)
got file_sum
set modtime of folder/.file15.QXUvUA to (1643343369) Thu Jan 27 20:16:09 2022
renaming folder/.file15.QXUvUA to folder/file15
recv_files(folder/file16)
got file_sum
set modtime of folder/.file16.avHkfQ to (1643343369) Thu Jan 27 20:16:09 2022
renaming folder/.file16.avHkfQ to folder/file16
recv_files(folder/file17)
got file_sum
set modtime of folder/.file17.wH89z5 to (1643343369) Thu Jan 27 20:16:09 2022
renaming folder/.file17.wH89z5 to folder/file17
recv_files(folder/file18)
got file_sum
set modtime of folder/.file18.Kpj0Uk to (1643343369) Thu Jan 27 20:16:09 2022
renaming folder/.file18.Kpj0Uk to folder/file18
recv_files(folder/file19)
got file_sum
set modtime of folder/.file19.885QfA to (1643343369) Thu Jan 27 20:16:09 2022
renaming folder/.file19.885QfA to folder/file19
recv_files(folder/file2)
got file_sum
set modtime of folder/.file2.SQrIAP to (1643343369) Thu Jan 27 20:16:09 2022
renaming folder/.file2.SQrIAP to folder/file2
recv_files(folder/file20)
got file_sum
set modtime of folder/.file20.kXpAV4 to (1643343369) Thu Jan 27 20:16:09 2022
renaming folder/.file20.kXpAV4 to folder/file20
recv_files(folder/file3)
got file_sum
set modtime of folder/.file3.GRdtgk to (1643343369) Thu Jan 27 20:16:09 2022
renaming folder/.file3.GRdtgk to folder/file3
recv_files(folder/file4)
got file_sum
set modtime of folder/.file4.uJGmBz to (1643343369) Thu Jan 27 20:16:09 2022
renaming folder/.file4.uJGmBz to folder/file4
recv_files(folder/file5)
got file_sum
set modtime of folder/.file5.OPPgWO to (1643343369) Thu Jan 27 20:16:09 2022
renaming folder/.file5.OPPgWO to folder/file5
recv_files(folder/file6)
got file_sum
set modtime of folder/.file6.abEbh4 to (1643343369) Thu Jan 27 20:16:09 2022
renaming folder/.file6.abEbh4 to folder/file6
recv_files(folder/file7)
got file_sum
set modtime of folder/.file7.4T46Bj to (1643343369) Thu Jan 27 20:16:09 2022
renaming folder/.file7.4T46Bj to folder/file7
recv_files(folder/file8)
got file_sum
set modtime of folder/.file8.CA62Wy to (1643343369) Thu Jan 27 20:16:09 2022
renaming folder/.file8.CA62Wy to folder/file8
recv_files(folder/file9)
got file_sum
set modtime of folder/.file9.Yu80hO to (1643343369) Thu Jan 27 20:16:09 2022
renaming folder/.file9.Yu80hO to folder/file9
set modtime of folder to (1643343369) Thu Jan 27 20:16:09 2022
send_files phase=1
recv_files phase=1
generate_files phase=2
send_files phase=2
send files finished
total: matches=0 hash_hits=0 false_alarms=0 data=235
recv_files phase=2
recv_files finished
generate_files phase=3
generate_files finished
sent 1,708 bytes received 8,209 bytes 19,834.00 bytes/sec
total size is 235 speedup is 0.02
[sender] _exit_cleanup(code=0, file=main.c, line=1178): about to call exit(0)


@@ -0,0 +1 @@
[{"type":"file","filename":"some/dir/new-file.txt","metadata":">f+++++++++","update_type":"file received","file_type":"file","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null,"acl_different":null,"extended_attribute_different":null},{"type":"file","filename":"some/dir/existing-file-with-changed-owner-and-group.txt","metadata":".f....og..x","update_type":"not updated","file_type":"file","checksum_or_value_different":false,"size_different":false,"modification_time_different":false,"permissions_different":false,"owner_different":true,"group_different":true,"acl_different":false,"extended_attribute_different":true},{"type":"file","filename":"some/dir/existing-file-with-changed-unnamed-attribute.txt","metadata":".f........x","update_type":"not updated","file_type":"file","checksum_or_value_different":false,"size_different":false,"modification_time_different":false,"permissions_different":false,"owner_different":false,"group_different":false,"acl_different":false,"extended_attribute_different":true},{"type":"file","filename":"some/dir/existing-file-with-changed-permissions.txt","metadata":">f...p....x","update_type":"file received","file_type":"file","checksum_or_value_different":false,"size_different":false,"modification_time_different":false,"permissions_different":true,"owner_different":false,"group_different":false,"acl_different":false,"extended_attribute_different":true},{"type":"file","filename":"some/dir/existing-file-with-changed-time-and-group.txt","metadata":">f..t..g..x","update_type":"file received","file_type":"file","checksum_or_value_different":false,"size_different":false,"modification_time_different":true,"permissions_different":false,"owner_different":false,"group_different":true,"acl_different":false,"extended_attribute_different":true},{"type":"file","filename":"some/dir/existing-file-with-changed-size.txt","metadata":">f.s......x","update_type":"file received","file_type":"file","checksum_or_value_different":false,"size_different":true,"modification_time_different":false,"permissions_different":false,"owner_different":false,"group_different":false,"acl_different":false,"extended_attribute_different":true},{"type":"file","filename":"some/dir/existing-file-with-changed-size-and-time-stamp.txt ","metadata":">f.st.....x","update_type":"file received","file_type":"file","checksum_or_value_different":false,"size_different":true,"modification_time_different":true,"permissions_different":false,"owner_different":false,"group_different":false,"acl_different":false,"extended_attribute_different":true},{"type":"file","filename":"some/dir/new-directory/","metadata":"cd+++++++++","update_type":"local change or creation","file_type":"directory","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null,"acl_different":null,"extended_attribute_different":null},{"type":"file","filename":"some/dir/existing-directory-with-changed-owner-and-group/","metadata":".d....og...","update_type":"not updated","file_type":"directory","checksum_or_value_different":false,"size_different":false,"modification_time_different":false,"permissions_different":false,"owner_different":true,"group_different":true,"acl_different":false,"extended_attribute_different":false},{"type":"file","filename":"some/dir/existing-directory-with-different-time-stamp/","metadata":".d..t......","update_type":"not updated","file_type":"directory","checksum_or_value_different":false,"size_different":false,"modification_time_different":true,"permissions_different":false,"owner_different":false,"group_different":false,"acl_different":false,"extended_attribute_different":false}]

tests/fixtures/generic/rsync-i.json

@@ -0,0 +1 @@
[{"summary":{},"files":[{"filename":"some/dir/new-file.txt","metadata":">f+++++++++","update_type":"file received","file_type":"file","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null,"acl_different":null,"extended_attribute_different":null},{"filename":"some/dir/existing-file-with-changed-owner-and-group.txt","metadata":".f....og..x","update_type":"not updated","file_type":"file","checksum_or_value_different":false,"size_different":false,"modification_time_different":false,"permissions_different":false,"owner_different":true,"group_different":true,"acl_different":false,"extended_attribute_different":true},{"filename":"some/dir/existing-file-with-changed-unnamed-attribute.txt","metadata":".f........x","update_type":"not updated","file_type":"file","checksum_or_value_different":false,"size_different":false,"modification_time_different":false,"permissions_different":false,"owner_different":false,"group_different":false,"acl_different":false,"extended_attribute_different":true},{"filename":"some/dir/existing-file-with-changed-permissions.txt","metadata":">f...p....x","update_type":"file received","file_type":"file","checksum_or_value_different":false,"size_different":false,"modification_time_different":false,"permissions_different":true,"owner_different":false,"group_different":false,"acl_different":false,"extended_attribute_different":true},{"filename":"some/dir/existing-file-with-changed-time-and-group.txt","metadata":">f..t..g..x","update_type":"file received","file_type":"file","checksum_or_value_different":false,"size_different":false,"modification_time_different":true,"permissions_different":false,"owner_different":false,"group_different":true,"acl_different":false,"extended_attribute_different":true},{"filename":"some/dir/existing-file-with-changed-size.txt","metadata":">f.s......x","update_type":"file received","file_type":"file","checksum_or_value_different":false,"size_different":true,"modification_time_different":false,"permissions_different":false,"owner_different":false,"group_different":false,"acl_different":false,"extended_attribute_different":true},{"filename":"some/dir/existing-file-with-changed-size-and-time-stamp.txt ","metadata":">f.st.....x","update_type":"file received","file_type":"file","checksum_or_value_different":false,"size_different":true,"modification_time_different":true,"permissions_different":false,"owner_different":false,"group_different":false,"acl_different":false,"extended_attribute_different":true},{"filename":"some/dir/new-directory/","metadata":"cd+++++++++","update_type":"local change or creation","file_type":"directory","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null,"acl_different":null,"extended_attribute_different":null},{"filename":"some/dir/existing-directory-with-changed-owner-and-group/","metadata":".d....og...","update_type":"not updated","file_type":"directory","checksum_or_value_different":false,"size_different":false,"modification_time_different":false,"permissions_different":false,"owner_different":true,"group_different":true,"acl_different":false,"extended_attribute_different":false},{"filename":"some/dir/existing-directory-with-different-time-stamp/","metadata":".d..t......","update_type":"not updated","file_type":"directory","checksum_or_value_different":false,"size_different":false,"modification_time_different":true,"permissions_different":false,"owner_different":false,"group_different":false,"acl_different":false,"extended_attribute_different":false}]}]

tests/fixtures/generic/rsync-i.out

@@ -0,0 +1,11 @@
>f+++++++++ some/dir/new-file.txt
.f....og..x some/dir/existing-file-with-changed-owner-and-group.txt
.f........x some/dir/existing-file-with-changed-unnamed-attribute.txt
>f...p....x some/dir/existing-file-with-changed-permissions.txt
>f..t..g..x some/dir/existing-file-with-changed-time-and-group.txt
>f.s......x some/dir/existing-file-with-changed-size.txt
>f.st.....x some/dir/existing-file-with-changed-size-and-time-stamp.txt
cd+++++++++ some/dir/new-directory/
.d....og... some/dir/existing-directory-with-changed-owner-and-group/
.d..t...... some/dir/existing-directory-with-different-time-stamp/
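The 11-character change strings in the fixture above correspond to the boolean fields in the rsync-i.json fixture: position 1 is the update type, position 2 the file type, and the remaining nine positions flag which attributes differ. A minimal sketch of that mapping, based on the `--itemize-changes` format documented in rsync(1) — field names are copied from the fixture JSON, and this is an illustration, not jc's actual parser:

```python
def decode_itemized(item):
    """Decode an rsync --itemize-changes string like '>f.st.....x'
    into the boolean fields used in the rsync-i.json fixture."""
    update_types = {
        '<': 'file sent', '>': 'file received',
        'c': 'local change or creation', 'h': 'hard link',
        '.': 'not updated', '*': 'message',
    }
    file_types = {'f': 'file', 'd': 'directory', 'L': 'symlink',
                  'D': 'device', 'S': 'special file'}
    # attribute positions: c s t p o g u a x (u is reserved by rsync)
    attr_names = ['checksum_or_value_different', 'size_different',
                  'modification_time_different', 'permissions_different',
                  'owner_different', 'group_different', None,
                  'acl_different', 'extended_attribute_different']
    result = {'update_type': update_types.get(item[0]),
              'file_type': file_types.get(item[1])}
    for name, flag in zip(attr_names, item[2:11]):
        if name is None:          # skip the reserved 'u' slot
            continue
        # '+' marks a newly created item: attribute comparison does not apply
        result[name] = None if flag == '+' else flag not in '. '
    return result
```

For example, `>f.st.....x` decodes to a received file whose size, modification time, and extended attributes differ, matching the corresponding entry in the JSON fixture.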

tests/fixtures/generic/xrandr.out

@@ -0,0 +1,38 @@
Screen 0: minimum 8 x 8, current 1920 x 1080, maximum 32767 x 32767
eDP1 connected primary 1920x1080+0+0 (normal left inverted right x axis y axis) 310mm x 170mm
1920x1080 60.03*+ 59.93
1680x1050 59.88
1400x1050 59.98
1600x900 60.00 59.95 59.82
1280x1024 60.02
1400x900 59.96 59.88
1280x960 60.00
1368x768 60.00 59.88 59.85
1280x800 59.81 59.91
1280x720 59.86 60.00 59.74
1024x768 60.00
1024x576 60.00 59.90 59.82
960x540 60.00 59.63 59.82
800x600 60.32 56.25
864x486 60.00 59.92 59.57
640x480 59.94
720x405 59.51 60.00 58.99
640x360 59.84 59.32 60.00
DP1 disconnected (normal left inverted right x axis y axis)
DP2 disconnected (normal left inverted right x axis y axis)
HDMI1 connected (normal left inverted right x axis y axis)
1920x1080 60.00 + 50.00 59.94
1920x1080i 60.00 50.00 59.94
1680x1050 59.88
1280x1024 75.02 60.02
1440x900 59.90
1280x960 60.00
1280x720 60.00 50.00 59.94
1024x768 75.03 70.07 60.00
832x624 74.55
800x600 72.19 75.00 60.32 56.25
720x576 50.00
720x480 60.00 59.94
640x480 75.00 72.81 66.67 60.00 59.94
720x400 70.08
VIRTUAL1 disconnected (normal left inverted right x axis y axis)

tests/fixtures/generic/xrandr_2.out

@@ -0,0 +1,43 @@
Screen 0: minimum 320 x 200, current 1920 x 1080, maximum 16384 x 16384
eDP-1 connected primary 1920x1080+0+0 (normal left inverted right x axis y axis) 309mm x 174mm
1920x1080 60.03*+ 60.01 59.97 59.96 59.93
1680x1050 59.95 59.88
1400x1050 59.98
1600x900 59.99 59.94 59.95 59.82
1280x1024 60.02
1400x900 59.96 59.88
1280x960 60.00
1440x810 60.00 59.97
1368x768 59.88 59.85
1280x800 59.99 59.97 59.81 59.91
1280x720 60.00 59.99 59.86 59.74
1024x768 60.04 60.00
960x720 60.00
928x696 60.05
896x672 60.01
1024x576 59.95 59.96 59.90 59.82
960x600 59.93 60.00
960x540 59.96 59.99 59.63 59.82
800x600 60.00 60.32 56.25
840x525 60.01 59.88
864x486 59.92 59.57
700x525 59.98
800x450 59.95 59.82
640x512 60.02
700x450 59.96 59.88
640x480 60.00 59.94
720x405 59.51 58.99
684x384 59.88 59.85
640x400 59.88 59.98
640x360 59.86 59.83 59.84 59.32
512x384 60.00
512x288 60.00 59.92
480x270 59.63 59.82
400x300 60.32 56.34
432x243 59.92 59.57
320x240 60.05
360x202 59.51 59.13
320x180 59.84 59.32
DP-1 disconnected (normal left inverted right x axis y axis)
HDMI-1 disconnected (normal left inverted right x axis y axis)
DP-2 disconnected (normal left inverted right x axis y axis)


@@ -0,0 +1,15 @@
HDMI1 connected (normal left inverted right x axis y axis)
1920x1080 60.00 + 50.00 59.94
1920x1080i 60.00 50.00 59.94
1680x1050 59.88
1280x1024 75.02 60.02
1440x900 59.90
1280x960 60.00
1280x720 60.00 50.00 59.94
1024x768 75.03 70.07 60.00
832x624 74.55
800x600 72.19 75.00 60.32 56.25
720x576 50.00
720x480 60.00 59.94
640x480 75.00 72.81 66.67 60.00 59.94
720x400 70.08


@@ -0,0 +1,56 @@
{
"screens": [
{
"screen_number": 0,
"minimum_width": 8,
"minimum_height": 8,
"current_width": 1920,
"current_height": 1080,
"maximum_width": 32767,
"maximum_height": 32767,
"associated_device": {
"associated_modes": [
{
"resolution_width": 1920,
"resolution_height": 1080,
"is_high_resolution": false,
"frequencies": [
{
"frequency": 60.03,
"is_current": true,
"is_preferred": true
},
{
"frequency": 59.93,
"is_current": false,
"is_preferred": false
}
]
},
{
"resolution_width": 1680,
"resolution_height": 1050,
"is_high_resolution": false,
"frequencies": [
{
"frequency": 59.88,
"is_current": false,
"is_preferred": false
}
]
}
],
"is_connected": true,
"is_primary": true,
"device_name": "eDP1",
"resolution_width": 1920,
"resolution_height": 1080,
"offset_width": 0,
"offset_height": 0,
"dimension_width": 310,
"dimension_height": 170
}
}
],
"unassociated_devices": []
}
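The mode lines in the xrandr fixtures (e.g. `1920x1080 60.03*+ 59.93`) carry a resolution followed by refresh rates, where `*` marks the current mode, `+` the preferred one, and a trailing `i` on the resolution indicates an interlaced mode. A rough per-line parse producing the shape of the `associated_modes` entries in the JSON fixture above — an illustrative sketch, not jc's actual xrandr parser:

```python
import re

def parse_mode_line(line):
    """Parse one xrandr mode line into the associated_modes shape above."""
    fields = line.split()
    match = re.match(r'^(\d+)x(\d+)(i?)$', fields[0])
    if not match:
        return None
    mode = {
        'resolution_width': int(match.group(1)),
        'resolution_height': int(match.group(2)),
        'is_high_resolution': match.group(3) == 'i',  # interlaced, e.g. 1920x1080i
        'frequencies': [],
    }
    for token in fields[1:]:
        if token in ('*', '+', '*+'):
            # a flag printed as its own token applies to the previous frequency
            mode['frequencies'][-1]['is_current'] |= '*' in token
            mode['frequencies'][-1]['is_preferred'] |= '+' in token
            continue
        mode['frequencies'].append({
            'frequency': float(token.rstrip('*+')),
            'is_current': '*' in token,
            'is_preferred': '+' in token,
        })
    return mode
```

Both styles seen in the fixtures are handled: flags fused to the rate (`60.03*+`) and a detached `+` following the rate (`60.00 +`).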


@@ -0,0 +1,4 @@
Screen 0: minimum 8 x 8, current 1920 x 1080, maximum 32767 x 32767
eDP1 connected primary 1920x1080+0+0 (normal left inverted right x axis y axis) 310mm x 170mm
1920x1080 60.03*+ 59.93
1680x1050 59.88



@@ -0,0 +1,233 @@
2022/02/01 08:09:26 [617] server_recv(2) starting pid=617
2022/02/01 08:09:26 [617] receiving file list
2022/02/01 08:09:26 [617] recv_file_name(.)
2022/02/01 08:09:26 [617] recv_file_name(folder)
2022/02/01 08:09:26 [617] recv_file_name(c.txt)
2022/02/01 08:09:26 [617] recv_file_name(b.txt)
2022/02/01 08:09:26 [617] recv_file_name(a.txt)
2022/02/01 08:09:26 [617] recv_file_name(d.txt)
2022/02/01 08:09:26 [617] recv_file_name(folder/c.txt)
2022/02/01 08:09:26 [617] recv_file_name(folder/b.txt)
2022/02/01 08:09:26 [617] recv_file_name(folder/a.txt)
2022/02/01 08:09:26 [617] recv_file_name(folder/d.txt)
2022/02/01 08:09:26 [617] received 10 names
2022/02/01 08:09:26 [617] recv_file_list done
2022/02/01 08:09:26 [617] get_local_name count=10 dest
2022/02/01 08:09:26 [617] generator starting pid=617 count=10
2022/02/01 08:09:26 [617] recv_files(10) starting
2022/02/01 08:09:26 [617] delta-transmission disabled for local transfer or --whole-file
2022/02/01 08:09:26 [617] recv_generator(.,0)
2022/02/01 08:09:26 [617] set modtime of . to (1643731675) Tue Feb 1 08:07:55 2022
2022/02/01 08:09:26 [617] recv_generator(a.txt,1)
2022/02/01 08:09:26 [617] recv_files(.)
2022/02/01 08:09:26 [617] recv_generator(b.txt,2)
2022/02/01 08:09:26 [617] .d..t.... ./
2022/02/01 08:09:26 [617] recv_generator(c.txt,3)
2022/02/01 08:09:26 [617] recv_files(a.txt)
2022/02/01 08:09:26 [617] got file_sum
2022/02/01 08:09:26 [617] recv_generator(d.txt,4)
2022/02/01 08:09:26 [617] >f+++++++ a.txt
2022/02/01 08:09:26 [617] set modtime of .a.txt.3zgq9u to (1643731665) Tue Feb 1 08:07:45 2022
2022/02/01 08:09:26 [617] recv_generator(folder,5)
2022/02/01 08:09:26 [617] renaming .a.txt.3zgq9u to a.txt
2022/02/01 08:09:26 [617] recv_files(b.txt)
2022/02/01 08:09:26 [617] set modtime of folder to (1643731693) Tue Feb 1 08:08:13 2022
2022/02/01 08:09:26 [617] got file_sum
2022/02/01 08:09:26 [617] >f+++++++ b.txt
2022/02/01 08:09:26 [617] recv_generator(folder/a.txt,6)
2022/02/01 08:09:26 [617] set modtime of .b.txt.ZzrbL4 to (1643731668) Tue Feb 1 08:07:48 2022
2022/02/01 08:09:26 [617] recv_generator(folder/b.txt,7)
2022/02/01 08:09:26 [617] renaming .b.txt.ZzrbL4 to b.txt
2022/02/01 08:09:26 [617] recv_files(c.txt)
2022/02/01 08:09:26 [617] recv_generator(folder/c.txt,8)
2022/02/01 08:09:26 [617] got file_sum
2022/02/01 08:09:26 [617] >f+++++++ c.txt
2022/02/01 08:09:26 [617] recv_generator(folder/d.txt,9)
2022/02/01 08:09:26 [617] set modtime of .c.txt.EPrPMr to (1643731671) Tue Feb 1 08:07:51 2022
2022/02/01 08:09:26 [617] renaming .c.txt.EPrPMr to c.txt
2022/02/01 08:09:26 [617] generate_files phase=1
2022/02/01 08:09:26 [617] recv_files(d.txt)
2022/02/01 08:09:26 [617] got file_sum
2022/02/01 08:09:26 [617] >f+++++++ d.txt
2022/02/01 08:09:26 [617] set modtime of .d.txt.6KXXf5 to (1643731675) Tue Feb 1 08:07:55 2022
2022/02/01 08:09:26 [617] renaming .d.txt.6KXXf5 to d.txt
2022/02/01 08:09:26 [617] recv_files(folder)
2022/02/01 08:09:26 [617] cd+++++++ folder/
2022/02/01 08:09:26 [617] recv_files(folder/a.txt)
2022/02/01 08:09:26 [617] got file_sum
2022/02/01 08:09:26 [617] >f+++++++ folder/a.txt
2022/02/01 08:09:26 [617] set modtime of folder/.a.txt.Se2V6X to (1643731685) Tue Feb 1 08:08:05 2022
2022/02/01 08:09:26 [617] renaming folder/.a.txt.Se2V6X to folder/a.txt
2022/02/01 08:09:26 [617] recv_files(folder/b.txt)
2022/02/01 08:09:26 [617] got file_sum
2022/02/01 08:09:26 [617] >f+++++++ folder/b.txt
2022/02/01 08:09:26 [617] set modtime of folder/.b.txt.U8duhi to (1643731687) Tue Feb 1 08:08:07 2022
2022/02/01 08:09:26 [617] renaming folder/.b.txt.U8duhi to folder/b.txt
2022/02/01 08:09:26 [617] recv_files(folder/c.txt)
2022/02/01 08:09:26 [617] got file_sum
2022/02/01 08:09:26 [617] >f+++++++ folder/c.txt
2022/02/01 08:09:26 [617] set modtime of folder/.c.txt.oEepsv to (1643731690) Tue Feb 1 08:08:10 2022
2022/02/01 08:09:26 [617] renaming folder/.c.txt.oEepsv to folder/c.txt
2022/02/01 08:09:26 [617] recv_files(folder/d.txt)
2022/02/01 08:09:26 [617] got file_sum
2022/02/01 08:09:26 [617] >f+++++++ folder/d.txt
2022/02/01 08:09:26 [617] set modtime of folder/.d.txt.AUmsdE to (1643731693) Tue Feb 1 08:08:13 2022
2022/02/01 08:09:26 [617] renaming folder/.d.txt.AUmsdE to folder/d.txt
2022/02/01 08:09:26 [617] recv_files phase=1
2022/02/01 08:09:26 [617] generate_files phase=2
2022/02/01 08:09:26 [617] recv_files phase=2
2022/02/01 08:09:26 [617] generate_files phase=3
2022/02/01 08:09:26 [617] recv_files finished
2022/02/01 08:09:26 [617] recv_generator(.,0)
2022/02/01 08:09:26 [617] set modtime of . to (1643731675) Tue Feb 1 08:07:55 2022
2022/02/01 08:09:26 [617] recv_generator(folder,5)
2022/02/01 08:09:26 [617] set modtime of folder to (1643731693) Tue Feb 1 08:08:13 2022
2022/02/01 08:09:26 [617] generate_files finished
2022/02/01 08:09:26 [617] sent 2887 bytes received 212 bytes total size 320
2022/02/01 08:09:26 [617] _exit_cleanup(code=0, file=/BuildRoot/Library/Caches/com.apple.xbs/Sources/rsync/rsync-52.200.2/rsync/main.c, line=891): about to call exit(0)
2022/02/01 13:09:25 [9503] server_recv(2) starting pid=9503
2022/02/01 13:09:25 [9503] receiving file list
2022/02/01 13:09:25 [9503] recv_file_name(.)
2022/02/01 13:09:25 [9503] recv_file_name(folder)
2022/02/01 13:09:25 [9503] recv_file_name(c.txt)
2022/02/01 13:09:25 [9503] recv_file_name(b.txt)
2022/02/01 13:09:25 [9503] recv_file_name(a.txt)
2022/02/01 13:09:25 [9503] recv_file_name(d.txt)
2022/02/01 13:09:25 [9503] recv_file_name(folder/c.txt)
2022/02/01 13:09:25 [9503] recv_file_name(folder/b.txt)
2022/02/01 13:09:25 [9503] recv_file_name(folder/a.txt)
2022/02/01 13:09:25 [9503] recv_file_name(folder/d.txt)
2022/02/01 13:09:25 [9503] received 10 names
2022/02/01 13:09:25 [9503] recv_file_list done
2022/02/01 13:09:25 [9503] get_local_name count=10 dest
2022/02/01 13:09:25 [9503] generator starting pid=9503 count=10
2022/02/01 13:09:25 [9503] recv_files(10) starting
2022/02/01 13:09:25 [9503] delta-transmission disabled for local transfer or --whole-file
2022/02/01 13:09:25 [9503] recv_generator(.,0)
2022/02/01 13:09:25 [9503] set modtime of . to (1643731675) Tue Feb 1 08:07:55 2022
2022/02/01 13:09:25 [9503] recv_generator(a.txt,1)
2022/02/01 13:09:25 [9503] recv_files(.)
2022/02/01 13:09:25 [9503] recv_generator(b.txt,2)
2022/02/01 13:09:25 [9503] .d..t.... ./
2022/02/01 13:09:25 [9503] recv_generator(c.txt,3)
2022/02/01 13:09:25 [9503] recv_generator(d.txt,4)
2022/02/01 13:09:25 [9503] recv_files(a.txt)
2022/02/01 13:09:25 [9503] got file_sum
2022/02/01 13:09:25 [9503] recv_generator(folder,5)
2022/02/01 13:09:25 [9503] >f+++++++ a.txt
2022/02/01 13:09:25 [9503] set modtime of .a.txt.jTX5bR to (1643731665) Tue Feb 1 08:07:45 2022
2022/02/01 13:09:25 [9503] set modtime of folder to (1643731693) Tue Feb 1 08:08:13 2022
2022/02/01 13:09:25 [9503] renaming .a.txt.jTX5bR to a.txt
2022/02/01 13:09:25 [9503] recv_files(b.txt)
2022/02/01 13:09:25 [9503] recv_generator(folder/a.txt,6)
2022/02/01 13:09:25 [9503] got file_sum
2022/02/01 13:09:25 [9503] recv_generator(folder/b.txt,7)
2022/02/01 13:09:25 [9503] >f+++++++ b.txt
2022/02/01 13:09:25 [9503] set modtime of .b.txt.5SSygw to (1643731668) Tue Feb 1 08:07:48 2022
2022/02/01 13:09:25 [9503] recv_generator(folder/c.txt,8)
2022/02/01 13:09:25 [9503] renaming .b.txt.5SSygw to b.txt
2022/02/01 13:09:25 [9503] recv_files(c.txt)
2022/02/01 13:09:25 [9503] recv_generator(folder/d.txt,9)
2022/02/01 13:09:25 [9503] got file_sum
2022/02/01 13:09:25 [9503] >f+++++++ c.txt
2022/02/01 13:09:25 [9503] generate_files phase=1
2022/02/01 13:09:25 [9503] set modtime of .c.txt.f18rCl to (1643731671) Tue Feb 1 08:07:51 2022
2022/02/01 13:09:25 [9503] renaming .c.txt.f18rCl to c.txt
2022/02/01 13:09:25 [9503] recv_files(d.txt)
2022/02/01 13:09:25 [9503] got file_sum
2022/02/01 13:09:25 [9503] >f+++++++ d.txt
2022/02/01 13:09:25 [9503] set modtime of .d.txt.87Atbf to (1643731675) Tue Feb 1 08:07:55 2022
2022/02/01 13:09:25 [9503] renaming .d.txt.87Atbf to d.txt
2022/02/01 13:09:25 [9503] recv_files(folder)
2022/02/01 13:09:25 [9503] cd+++++++ folder/
2022/02/01 13:09:25 [9503] recv_files(folder/a.txt)
2022/02/01 13:09:25 [9503] got file_sum
2022/02/01 13:09:25 [9503] >f+++++++ folder/a.txt
2022/02/01 13:09:25 [9503] set modtime of folder/.a.txt.p2jkNN to (1643731685) Tue Feb 1 08:08:05 2022
2022/02/01 13:09:25 [9503] renaming folder/.a.txt.p2jkNN to folder/a.txt
2022/02/01 13:09:25 [9503] recv_files(folder/b.txt)
2022/02/01 13:09:25 [9503] got file_sum
2022/02/01 13:09:25 [9503] >f+++++++ folder/b.txt
2022/02/01 13:09:25 [9503] set modtime of folder/.b.txt.sXugTy to (1643731687) Tue Feb 1 08:08:07 2022
2022/02/01 13:09:25 [9503] renaming folder/.b.txt.sXugTy to folder/b.txt
2022/02/01 13:09:25 [9503] recv_files(folder/c.txt)
2022/02/01 13:09:25 [9503] got file_sum
2022/02/01 13:09:25 [9503] >f+++++++ folder/c.txt
2022/02/01 13:09:25 [9503] set modtime of folder/.c.txt.nCgF2I to (1643731690) Tue Feb 1 08:08:10 2022
2022/02/01 13:09:25 [9503] renaming folder/.c.txt.nCgF2I to folder/c.txt
2022/02/01 13:09:25 [9503] recv_files(folder/d.txt)
2022/02/01 13:09:25 [9503] got file_sum
2022/02/01 13:09:25 [9503] >f+++++++ folder/d.txt
2022/02/01 13:09:25 [9503] set modtime of folder/.d.txt.HdWXqY to (1643731693) Tue Feb 1 08:08:13 2022
2022/02/01 13:09:25 [9503] renaming folder/.d.txt.HdWXqY to folder/d.txt
2022/02/01 13:09:25 [9503] recv_files phase=1
2022/02/01 13:09:25 [9503] generate_files phase=2
2022/02/01 13:09:25 [9503] recv_files phase=2
2022/02/01 13:09:25 [9503] generate_files phase=3
2022/02/01 13:09:25 [9503] recv_files finished
2022/02/01 13:09:25 [9503] recv_generator(.,0)
2022/02/01 13:09:25 [9503] set modtime of . to (1643731675) Tue Feb 1 08:07:55 2022
2022/02/01 13:09:25 [9503] recv_generator(folder,5)
2022/02/01 13:09:25 [9503] set modtime of folder to (1643731693) Tue Feb 1 08:08:13 2022
2022/02/01 13:09:25 [9503] generate_files finished
2022/02/01 13:09:25 [9503] sent 2889 bytes received 212 bytes total size 320
2022/02/01 13:09:25 [9503] _exit_cleanup(code=0, file=/BuildRoot/Library/Caches/com.apple.xbs/Sources/rsync/rsync-52.200.2/rsync/main.c, line=891): about to call exit(0)
2022/02/01 13:10:37 [9546] server_recv(2) starting pid=9546
2022/02/01 13:10:37 [9546] receiving file list
2022/02/01 13:10:37 [9546] recv_file_name(.)
2022/02/01 13:10:37 [9546] recv_file_name(folder)
2022/02/01 13:10:37 [9546] recv_file_name(c.txt)
2022/02/01 13:10:37 [9546] recv_file_name(b.txt)
2022/02/01 13:10:37 [9546] recv_file_name(a.txt)
2022/02/01 13:10:37 [9546] recv_file_name(d.txt)
2022/02/01 13:10:37 [9546] recv_file_name(folder/c.txt)
2022/02/01 13:10:37 [9546] recv_file_name(folder/b.txt)
2022/02/01 13:10:37 [9546] recv_file_name(folder/a.txt)
2022/02/01 13:10:37 [9546] recv_file_name(folder/d.txt)
2022/02/01 13:10:37 [9546] received 10 names
2022/02/01 13:10:37 [9546] recv_file_list done
2022/02/01 13:10:37 [9546] get_local_name count=10 dest
2022/02/01 13:10:37 [9546] generator starting pid=9546 count=10
2022/02/01 13:10:37 [9546] recv_files(10) starting
2022/02/01 13:10:37 [9546] delta-transmission disabled for local transfer or --whole-file
2022/02/01 13:10:37 [9546] recv_generator(.,0)
2022/02/01 13:10:37 [9546] recv_generator(a.txt,1)
2022/02/01 13:10:37 [9546] recv_generator(b.txt,2)
2022/02/01 13:10:37 [9546] recv_files(.)
2022/02/01 13:10:37 [9546] .d ./
2022/02/01 13:10:37 [9546] recv_generator(c.txt,3)
2022/02/01 13:10:37 [9546] recv_files(a.txt)
2022/02/01 13:10:37 [9546] .f a.txt
2022/02/01 13:10:37 [9546] recv_generator(d.txt,4)
2022/02/01 13:10:37 [9546] recv_files(b.txt)
2022/02/01 13:10:37 [9546] .f b.txt
2022/02/01 13:10:37 [9546] recv_generator(folder,5)
2022/02/01 13:10:37 [9546] recv_files(c.txt)
2022/02/01 13:10:37 [9546] .f c.txt
2022/02/01 13:10:37 [9546] recv_generator(folder/a.txt,6)
2022/02/01 13:10:37 [9546] recv_files(d.txt)
2022/02/01 13:10:37 [9546] .f d.txt
2022/02/01 13:10:37 [9546] recv_generator(folder/b.txt,7)
2022/02/01 13:10:37 [9546] recv_files(folder)
2022/02/01 13:10:37 [9546] .d folder/
2022/02/01 13:10:37 [9546] recv_generator(folder/c.txt,8)
2022/02/01 13:10:37 [9546] recv_files(folder/a.txt)
2022/02/01 13:10:37 [9546] .f folder/a.txt
2022/02/01 13:10:37 [9546] recv_generator(folder/d.txt,9)
2022/02/01 13:10:37 [9546] recv_files(folder/b.txt)
2022/02/01 13:10:37 [9546] .f folder/b.txt
2022/02/01 13:10:37 [9546] generate_files phase=1
2022/02/01 13:10:37 [9546] recv_files(folder/c.txt)
2022/02/01 13:10:37 [9546] .f folder/c.txt
2022/02/01 13:10:37 [9546] recv_files(folder/d.txt)
2022/02/01 13:10:37 [9546] .f folder/d.txt
2022/02/01 13:10:37 [9546] recv_files phase=1
2022/02/01 13:10:37 [9546] generate_files phase=2
2022/02/01 13:10:37 [9546] recv_files phase=2
2022/02/01 13:10:37 [9546] generate_files phase=3
2022/02/01 13:10:37 [9546] recv_files finished
2022/02/01 13:10:37 [9546] recv_generator(.,0)
2022/02/01 13:10:37 [9546] recv_generator(folder,5)
2022/02/01 13:10:37 [9546] generate_files finished
2022/02/01 13:10:37 [9546] sent 1403 bytes received 212 bytes total size 320
2022/02/01 13:10:37 [9546] _exit_cleanup(code=0, file=/BuildRoot/Library/Caches/com.apple.xbs/Sources/rsync/rsync-52.200.2/rsync/main.c, line=891): about to call exit(0)


@ -0,0 +1 @@
[{"type":"file","filename":"./","metadata":".d ","update_type":"not updated","file_type":"directory","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null},{"type":"file","filename":"a.txt","metadata":".f ","update_type":"not updated","file_type":"file","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null},{"type":"file","filename":"b.txt","metadata":".f ","update_type":"not updated","file_type":"file","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null},{"type":"file","filename":"c.txt","metadata":".f ","update_type":"not updated","file_type":"file","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null},{"type":"file","filename":"d.txt","metadata":".f ","update_type":"not updated","file_type":"file","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null},{"type":"file","filename":"folder/","metadata":".d ","update_type":"not updated","file_type":"directory","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null},{"type":"file","filename":"folder/a.txt","metadata":".f ","update_type":"not updated","file_type":"file","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null},{"type":"file","filename":"folder/b.txt","metadata":".f ","update_type":"not updated","file_type":"file","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null},{"type":"file","filename":"folder/c.txt","metadata":".f ","update_type":"not updated","file_type":"file","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null},{"type":"file","filename":"folder/d.txt","metadata":".f ","update_type":"not updated","file_type":"file","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null},{"type":"summary","sent":284,"received":80,"bytes_sec":728.0,"total_size":320,"speedup":0.88}]


@ -0,0 +1 @@
[{"summary":{"sent":284,"received":80,"bytes_sec":728.0,"total_size":320,"speedup":0.88},"files":[{"filename":"./","metadata":".d ","update_type":"not updated","file_type":"directory","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null},{"filename":"a.txt","metadata":".f ","update_type":"not updated","file_type":"file","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null},{"filename":"b.txt","metadata":".f ","update_type":"not updated","file_type":"file","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null},{"filename":"c.txt","metadata":".f ","update_type":"not updated","file_type":"file","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null},{"filename":"d.txt","metadata":".f ","update_type":"not updated","file_type":"file","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null},{"filename":"folder/","metadata":".d ","update_type":"not updated","file_type":"directory","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null},{"filename":"folder/a.txt","metadata":".f ","update_type":"not updated","file_type":"file","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null},{"filename":"folder/b.txt","metadata":".f ","update_type":"not updated","file_type":"file","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null},{"filename":"folder/c.txt","metadata":".f ","update_type":"not updated","file_type":"file","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null},{"filename":"folder/d.txt","metadata":".f ","update_type":"not updated","file_type":"file","checksum_or_value_different":null,"size_different":null,"modification_time_different":null,"permissions_different":null,"owner_different":null,"group_different":null}]}]


@ -0,0 +1,88 @@
building file list ...
[sender] make_file(.,*,2)
[sender] make_file(folder,*,2)
[sender] make_file(c.txt,*,2)
[sender] make_file(b.txt,*,2)
[sender] make_file(a.txt,*,2)
[sender] make_file(d.txt,*,2)
[sender] make_file(folder/c.txt,*,2)
[sender] make_file(folder/b.txt,*,2)
[sender] make_file(folder/a.txt,*,2)
[sender] make_file(folder/d.txt,*,2)
done
server_recv(2) starting pid=13455
send_file_list done
send_files starting
recv_file_name(.)
recv_file_name(folder)
recv_file_name(c.txt)
recv_file_name(b.txt)
recv_file_name(a.txt)
recv_file_name(d.txt)
recv_file_name(folder/c.txt)
recv_file_name(folder/b.txt)
recv_file_name(folder/a.txt)
recv_file_name(folder/d.txt)
received 10 names
recv_file_list done
get_local_name count=10 dest
generator starting pid=13455 count=10
delta-transmission disabled for local transfer or --whole-file
recv_generator(.,0)
send_files(0, source/.)
.d ./
recv_generator(a.txt,1)
send_files(1, source/a.txt)
.f a.txt
recv_generator(b.txt,2)
send_files(2, source/b.txt)
.f b.txt
recv_generator(c.txt,3)
send_files(3, source/c.txt)
.f c.txt
recv_generator(d.txt,4)
send_files(4, source/d.txt)
.f d.txt
recv_generator(folder,5)
send_files(5, source/folder)
.d folder/
recv_generator(folder/a.txt,6)
send_files(6, source/folder/a.txt)
.f folder/a.txt
recv_generator(folder/b.txt,7)
send_files(7, source/folder/b.txt)
.f folder/b.txt
recv_generator(folder/c.txt,8)
send_files(8, source/folder/c.txt)
.f folder/c.txt
recv_generator(folder/d.txt,9)
send_files(9, source/folder/d.txt)
.f folder/d.txt
generate_files phase=1
send_files phase=1
recv_files(10) starting
recv_files(.)
recv_files(a.txt)
recv_files(b.txt)
recv_files(c.txt)
recv_files(d.txt)
recv_files(folder)
recv_files(folder/a.txt)
recv_files(folder/b.txt)
recv_files(folder/c.txt)
recv_files(folder/d.txt)
recv_files phase=1
generate_files phase=2
send_files phase=2
send files finished
total: matches=0 hash_hits=0 false_alarms=0 data=0
recv_files phase=2
generate_files phase=3
recv_files finished
recv_generator(.,0)
recv_generator(folder,5)
generate_files finished
sent 284 bytes received 80 bytes 728.00 bytes/sec
total size is 320 speedup is 0.88
_exit_cleanup(code=0, file=/BuildRoot/Library/Caches/com.apple.xbs/Sources/rsync/rsync-52.200.2/rsync/main.c, line=996): about to call exit(0)
