We need to set ms_stereo in encode_init() to avoid incorrectly encoding
the first frame as non-m/s while flagging it as m/s. This fixes an
uncomfortable pop in the left channel at the start of playback.
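A minimal sketch of the idea (the context struct name here is a
placeholder, not the actual encoder context):

    /* hypothetical private context; only the ms_stereo field matters */
    typedef struct HypotheticalEncContext { int ms_stereo; } HypotheticalEncContext;

    static av_cold int encode_init(AVCodecContext *avctx)
    {
        HypotheticalEncContext *s = avctx->priv_data;
        /* decide m/s once, up front, so the very first frame is coded
         * the same way its header flags it */
        s->ms_stereo = 1;
        return 0;
    }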
CC: libav-stable@libav.org
Fixes timestamp calculation.
The FATE reference is updated because timestamp calculations are now more
accurate. Previous timestamps were based on average bit rate.
This fixes clipping if the encoder input used the full 16-bit
input range (samples with a magnitude below 16383 worked fine).
The filtered subband samples should be at most 15 bits, but the
code previously produced them scaled to 16 bits.
This makes the decoder output have double the magnitude
compared to before.
The spec reference samples don't test the QMF at all, which
is why this part slipped past initially.
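A hedged illustration of the scaling in question (the function name and
the exact shift placement are illustrative, not the actual code); 15-bit
samples live in [-16384, 16383]:

    /* illustrative only: bring a 16-bit-scaled QMF output down to the
     * 15-bit range the rest of the codec expects;
     * av_clip() is from libavutil/common.h */
    static inline int qmf_to_15bit(int filtered)
    {
        int out = filtered >> 1;            /* halve the magnitude */
        return av_clip(out, -16384, 16383); /* clamp to 15 bits */
    }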
Signed-off-by: Martin Storsjö <martin@martin.st>
The container has no timestamps and the framerate isn't stored in the
data either.
The decoder sets the codec timebase to the experimentally found value
1/15. Do the same in the demuxer; it should at least be better than the
default 1/90000.
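In lavf terms this is a one-liner in the demuxer's read_header()
(sketch; the surrounding stream setup is omitted):

    /* use the experimentally found frame rate as the stream timebase
     * instead of the 1/90000 default */
    avpriv_set_pts_info(st, 64, 1, 15);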
In 444 mode, ProRes codes chroma blocks in a different order than luma
blocks, so make both the decoder and the encoder read/write chroma blocks
in the right order.
Reported by Phil Barrett
WavPack has a comprehensive test suite, and a number
of corner cases.
Signed-off-by: Derek Buitenhuis <derek.buitenhuis@gmail.com>
Signed-off-by: Ronald S. Bultje <rsbultje@gmail.com>
r_frame_rate should in theory have something to do with the input
framerate, but in practice it is often made up out of thin air by lavf.
So unless we are targeting a constant output framerate, it is better to
just use the input stream timebase.
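Schematically, in avconv terms (a simplified sketch; the real code
handles more cases, and ost/ist stand for the usual output/input stream
structs):

    if (ost->frame_rate.num)
        /* a constant output framerate was explicitly requested */
        ost->st->codec->time_base = av_inv_q(ost->frame_rate);
    else
        /* otherwise just inherit the input stream timebase */
        ost->st->codec->time_base = ist->st->time_base;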
Brings back the frames in the nuv and cscd tests that were dropped
since cd1ad18a65.
It is not supposed to be done outside lavc.
This is basically a revert of 818062f2f3.
It is unclear what issue that commit was supposed to fix; if it
reappears, it will have to be fixed in a more appropriate place.
The wtv-demux test change is because the sample starts with a B-frame.
According to unofficial documentation, the video rate is locked to the audio
sample rate. This results in proper synchronization of audio and video
timestamps from the demuxer. This only works if the first audio packet occurs
before the first video packet or the audio sample rate is the default rate of
11111 Hz, both of which are true for all samples in our archive.
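A hedged sketch of the demuxer logic this implies, assuming timestamps
are expressed in audio samples (field names are illustrative; the
fallback constant is the default rate mentioned above):

    /* derive the video timebase from the audio sample rate; fall back
     * to the default 11111 Hz until an audio packet has been seen */
    int rate = vid->sample_rate ? vid->sample_rate : 11111;
    avpriv_set_pts_info(st, 64, 1, rate);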
Update the FATE reference to account for the now-nonexistent palette
packet. This also fixes the FATE test when frame data is not initialized
in get_buffer(), so update the comment in avconv accordingly.
This changes a number of FATE results, since before this commit, the
timestamps in all tests using rawenc were made up by lavf.
In most cases, the previous timestamps were completely bogus.
In some other cases -- raw formats, mostly h264 -- the new timestamps
are bogus as well. The only difference is that timestamps invented by
the muxer are replaced by timestamps invented by the demuxer.
cscd -- avconv sets the output codec timebase from r_frame_rate,
and r_frame_rate is in this case a guessed value of 31.42 (377/12),
which is not accurate enough to represent all timestamps. This results
in some frames having duplicate pts (see the rescaling sketch after
this list). Therefore, vsync 0 needs to be changed to vsync 2 and
avconv drops two frames. A proper fix in the future would be to set
the output timebase to something saner in avconv.
nuv -- the previous video timestamps were wrong AND the cscd
comment applies; one frame is dropped.
vp8-signbias -- the file contains two frames with identical timestamps,
so -vsync 0 needs to be removed/changed to -vsync 2 and avconv drops one
frame.
vc1-ism -- apparently either the demuxer lies about timestamps or the
file is broken, since dts == pts on all packets, but reordering clearly
takes place.
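To make the duplicate-pts effect in the cscd case concrete, here is a
self-contained example with hypothetical frame times (the actual stream
values differ):

    #include <inttypes.h>
    #include <stdio.h>
    #include "libavutil/mathematics.h"

    int main(void)
    {
        AVRational ms = { 1, 1000 }; /* hypothetical input timebase */
        AVRational tb = { 12, 377 }; /* inverse of r_frame_rate; one
                                      * tick is about 31.83 ms */
        /* two frames roughly 30 ms apart both round to tick 15,
         * i.e. a duplicate pts */
        printf("%"PRId64" %"PRId64"\n",
               av_rescale_q(462, ms, tb),  /* 14.51 -> 15 */
               av_rescale_q(492, ms, tb)); /* 15.46 -> 15 */
        return 0;
    }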
The current code compares the desired recording time with
InputStream.pts, which has a very unclear meaning. Change the code to
use the actual timestamps of the frames passed to the encoder.
In several tests, one less frame is encoded, which is more correct.
In the idroq test one more frame is encoded, which is again more
correct.
Behavior with stream copy should be unchanged.
The output is obviously not supposed to contain video (since only
-acodec copy is specified), but that only happens because of the way -t
handling is currently implemented.
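Schematically, the new check compares the frame about to be encoded
against the -t limit (a simplified sketch with illustrative field
names; the real code also covers audio and stream copy):

    /* stop encoding once the frame's timestamp passes the requested
     * recording time (stored in AV_TIME_BASE units) */
    if (ost->recording_time != INT64_MAX &&
        av_compare_ts(frame->pts, enc->time_base,
                      ost->recording_time, AV_TIME_BASE_Q) >= 0)
        return; /* drop the frame instead of encoding it */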
Right now those muxers use the default timebase (1/90000) in all cases.
This patch avoids unnecessary rescaling and makes the printed timestamps
more readable.
Also, extend the printed information to include the timebases and packet
pts/duration, and align the columns.
Obviously changes the results of all FATE tests which use those two
muxers.
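For instance, a framecrc-style line can then be written with the
packet's native values (a sketch; the exact format string is
illustrative):

    #include <inttypes.h>
    #include "libavformat/avformat.h"

    /* timestamps are printed in the stream's own timebase, so no
     * av_rescale_q() to the 1/90000 default is needed */
    static void write_frame_line(AVFormatContext *s, AVPacket *pkt,
                                 unsigned crc)
    {
        avio_printf(s->pb, "%d, %10"PRId64", %10"PRId64", %8d, %8d, 0x%08x\n",
                    pkt->stream_index, pkt->dts, pkt->pts,
                    pkt->duration, pkt->size, crc);
    }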
Return the correct number of consumed bytes and set *data_size = 0.
The returned size was 1 byte too small, so that byte was read as the
next frame, which resulted in an extra blank frame at the beginning of
the stream.
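Schematically, for the old decode API (names are illustrative):

    static int decode_frame(AVCodecContext *avctx, void *data,
                            int *data_size, AVPacket *avpkt)
    {
        /* ... the packet is parsed here, no frame is output ... */
        *data_size = 0;      /* explicitly signal: no decoded frame */
        return avpkt->size;  /* consume every byte, not size - 1 */
    }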
get_ue_golomb_long() is only tested for values up to 2^15 - 2, since
we cannot write larger values.
Silence the test on success and return a non-zero value on error.
Use a heap scratch buffer instead of a large stack buffer.
Remove unneeded includes.
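A sketch of the resulting test shape (simplified from the actual test;
the buffer size and the value count follow the limits above):

    #include "libavutil/mem.h"
    #include "libavcodec/get_bits.h"
    #include "libavcodec/golomb.h"
    #include "libavcodec/put_bits.h"

    #define COUNT 0x7fff          /* values 0 .. 2^15 - 2 */
    #define SIZE  (COUNT * 4)

    int main(void)
    {
        GetBitContext gb;
        PutBitContext pb;
        uint8_t *temp = av_malloc(SIZE); /* heap, not a big stack array */
        int i, ret = 0;

        if (!temp)
            return 2;

        init_put_bits(&pb, temp, SIZE);
        for (i = 0; i < COUNT; i++)
            set_ue_golomb(&pb, i);       /* larger values cannot be written */
        flush_put_bits(&pb);

        init_get_bits(&gb, temp, 8 * SIZE);
        for (i = 0; i < COUNT; i++)
            if (get_ue_golomb_long(&gb) != i)
                ret = 1;                 /* fail via exit code, stay silent */

        av_free(temp);
        return ret;                      /* 0 and no output on success */
    }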