It appears to be about 1.06x faster than the iterative method on my
machine, so I'm guessing compilers have improved over time (the
iterative version used to be slightly faster).
Currently, in case of equality on the first color channel, the order of
the ref colors is defined by the hashing function. This commit makes the
sorting deterministic and improves the hierarchical ordering.
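For illustration, a hypothetical comparator along those lines
(illustrative names, not the actual FFmpeg code): compare on the
primary channel first, then break ties with the remaining channels so
the result never depends on hash insertion order.

    #include <stdint.h>

    /* Hypothetical deterministic comparator: primary channel first,
     * remaining channels as tie-breaks, so equal primary values no
     * longer leave the order up to the hashing function. */
    static int cmp_color(const void *pa, const void *pb)
    {
        const uint32_t a = *(const uint32_t *)pa;
        const uint32_t b = *(const uint32_t *)pb;
        const int dr = (int)(a >> 16 & 0xff) - (int)(b >> 16 & 0xff);
        const int dg = (int)(a >>  8 & 0xff) - (int)(b >>  8 & 0xff);
        const int db = (int)(a       & 0xff) - (int)(b       & 0xff);

        return dr ? dr : dg ? dg : db;
    }

Sorting the ref colors with qsort() and such a comparator yields the
same hierarchical ordering on every run.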
This variable is used only for the running weight (needed to reach the
target median). The places that actually need the box weight are
changed to use box->weight.
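A minimal sketch of the distinction, with hypothetical names (the
struct fields and refs array are assumptions, not the actual code):
the accumulator only tracks the running weight up to the median, while
the total always comes from box->weight.

    #include <stdint.h>

    struct color_ref { int64_t count; };
    struct box       { int start, len; int64_t weight; };

    /* Return the index at which the running weight crosses the median
     * of the box's total weight (hypothetical sketch). */
    static int find_median_split(const struct box *box,
                                 const struct color_ref *refs)
    {
        const int64_t median = box->weight / 2;
        int64_t weight = 0; /* running weight only */
        int i;

        for (i = box->start; i < box->start + box->len - 1; i++) {
            weight += refs[i].count;
            if (weight > median)
                break;
        }
        return i;
    }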
The variance computation is simple enough now (since we can use the
per-axis squared errors) that it no longer needs complex lazy
computation logic.
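A hedged sketch of what this enables, with assumed field names: once
each axis keeps its own squared error, the box variance is just their
sum, so there is nothing left to compute lazily or cache.

    #include <stdint.h>

    struct box {
        int64_t sq_err[3]; /* squared error along the R, G and B axes */
    };

    static int64_t box_variance(const struct box *box)
    {
        /* assumed layout: variance == sum of per-axis squared errors */
        return box->sq_err[0] + box->sq_err[1] + box->sq_err[2];
    }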
The equalities in the w{r,g,b} range checks make sure longest is never
0. If the alpha component ended up being selected in get_next_color(),
it would cause underread memory accesses in its caller (colormap_insert).
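A minimal sketch of the axis selection, assuming component indices 1..3
for R/G/B with 0 reserved for alpha (names are illustrative): thanks to
the ">=" comparisons, one of the three color axes always wins, even
when all ranges are equal.

    /* With strict ">" comparisons, equal ranges would leave longest at
     * 0 (alpha); the equalities guarantee a color axis is picked. */
    static int pick_longest_axis(int wr, int wg, int wb)
    {
        int longest = 0;

        if (wr >= wg && wr >= wb) longest = 1; /* red   */
        if (wg >= wr && wg >= wb) longest = 2; /* green */
        if (wb >= wr && wb >= wg) longest = 3; /* blue  */
        return longest;
    }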
This reverts commit dea673d0d548c864ec85f9260d8900d944ef7a2a.
This change cannot work for several reasons; the most obvious ones are:
- the alpha is part of the scoring of the color difference, even
though alpha cannot be interpreted as part of the perception of the
color (we don't even know whether it is premultiplied or postmultiplied)
- the colors are averaged together with their alpha value, which simply
cannot work (see the sketch after this list)
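For illustration, a standalone example (not the filter code) of why
averaging straight RGBA values breaks down: a fully transparent red
averaged with an opaque green produces a half-transparent olive, even
though the red contributes nothing visually.

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        const uint8_t c1[4] = {255,   0, 0,   0}; /* transparent red */
        const uint8_t c2[4] = {  0, 255, 0, 255}; /* opaque green    */
        uint8_t avg[4];

        for (int i = 0; i < 4; i++)
            avg[i] = (c1[i] + c2[i] + 1) / 2;

        /* prints "128 128 0 128": neither input color, with an
         * arbitrary half-transparent alpha */
        printf("%d %d %d %d\n", avg[0], avg[1], avg[2], avg[3]);
        return 0;
    }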
The command proposed in the original thread of the patch actually
produces a completely broken file:
ffmpeg -y -loglevel verbose -i fate-suite/apng/o_sample.png -filter_complex "split[split1][split2];[split1]palettegen=max_colors=254:use_alpha=1[pal1];[split2][pal1]paletteuse=use_alpha=1" -frames:v 1 out.png
We can see that many color pixels are off, but more importantly some
colors have a random alpha value: https://imgur.com/eFQ2UK7
Unfortunately, I don't see any easy fix for this; the approach appears
to be flawed by design.
This filter, when used in the "pad" mode, currently distinguishes
between limited and full range solely by testing for the deprecated
YUVJ pixel formats at link setup time. The detection should instead be
based on the per-frame color range metadata.
Since the color range metadata is only known at the time of filtering
frames, we simply allocate two copies of the "black" frame: one tagged
as limited range, the other as full range. This could be done more
dynamically (e.g. as-needed, or by blitting the appropriate pixel value
directly), but this approach is relatively simple and preserves the
structure of the existing code.
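A hedged sketch of the idea (the struct and helper names are
assumptions, not the actual filter code): both black frames are
allocated up front, and the per-frame color_range picks between them.

    #include <libavutil/frame.h>

    typedef struct FilterContext {
        AVFrame *black_frame[2]; /* [0] = limited, [1] = full range */
    } FilterContext;

    static const AVFrame *pick_black_frame(const FilterContext *s,
                                           const AVFrame *in)
    {
        const int full = in->color_range == AVCOL_RANGE_JPEG;
        return s->black_frame[full];
    }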
This commit actually fixes a bug in FATE: the new output is correct for
the first time. The previous md5 ref was of a frame that incorrectly
combined full-range pixel data with limited-range black fields. The
corresponding result has been updated.
Signed-off-by: Niklas Haas <git@haasn.dev>
This filter currently distinguishes between limited and full range by
testing for the deprecated YUVJ pixel formats at link setup time. The
detection should instead be performed based on the per-frame color
range metadata.
Rewrite it to calculate the black pixel threshold at the time of
filtering a frame, when the frame's color range metadata is known.
Doing it this way has the small side benefit of handling streams with
variable metadata, and incurs no meaningful performance cost.
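A hedged sketch of such a per-frame threshold, assuming 8-bit luma for
brevity (the function and parameter names are illustrative): limited
range puts the black level at 16, full range at 0, and the user's
threshold is added on top.

    #include <libavutil/frame.h>

    static int black_threshold(const AVFrame *frame, int user_threshold)
    {
        const int black = frame->color_range == AVCOL_RANGE_JPEG ? 0 : 16;
        return black + user_threshold;
    }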
Signed-off-by: Niklas Haas <git@haasn.dev>