Previously, AVStream.codec.time_base was used for that purpose, which
was quite confusing for the callers. This change also opens the path for
removing AVStream.codec.
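A minimal usage sketch, assuming the muxer now honours
AVStream.time_base as the muxing timebase; the stream setup and the
names oc, pkt and enc_tb are illustrative, not from the patch:

    #include <libavformat/avformat.h>

    /* oc is an open output context, pkt a packet with timestamps in
     * the encoder timebase enc_tb */
    AVStream *st = avformat_new_stream(oc, NULL);
    st->time_base = (AVRational){ 1, 48000 }; /* requested muxing timebase */
    avformat_write_header(oc, NULL);          /* the muxer may override it */

    /* rescale from the encoder timebase to whatever the muxer chose */
    av_packet_rescale_ts(pkt, enc_tb, st->time_base);
    av_interleaved_write_frame(oc, pkt);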
The change in the lavf-mkv test is due to the native timebase (1/1000)
being used instead of the default one (1/90000), so the packets are now
sent to the crc muxer in the same order in which they are demuxed
(previously some of them got reordered because of inexact timestamp
conversion).
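For illustration (not from the patch), the inexactness is easy to see
with av_rescale_q(): two distinct timestamps can collide after
conversion, so their relative order is no longer guaranteed:

    #include <libavutil/mathematics.h>

    AVRational tb_90k = { 1, 90000 }, tb_ms = { 1, 1000 };
    int64_t a = av_rescale_q(64, tb_90k, tb_ms); /* ~0.71 ms, rounds to 1 */
    int64_t b = av_rescale_q(89, tb_90k, tb_ms); /* ~0.99 ms, rounds to 1 too */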
It has not been properly maintained for years and there is little hope
of that changing in the future.
It appears simpler to write a new replacement from scratch than to
fix the existing code.
This results in DefaultDuration not being written when the framerate is
not known, but as this field is purely informative, this should not
break any sane demuxers.
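A hypothetical sketch of the resulting logic; put_ebml_uint() and the
element ID are the muxer's own, but the exact condition used by the
patch may differ:

    if (st->avg_frame_rate.num > 0 && st->avg_frame_rate.den > 0)
        put_ebml_uint(pb, MATROSKA_ID_TRACKDEFAULTDURATION,
                      av_rescale(1000000000LL, st->avg_frame_rate.den,
                                 st->avg_frame_rate.num));
    /* else: framerate unknown, DefaultDuration is simply omitted */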
This corrects the bug that caused the checksums to change in
9767d7c092.
It caused the EOS flag to be set incorrectly; the Ogg spec does not
allow it to be set in the middle of a logical bitstream.
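For reference, the Ogg page flags as defined in libavformat/oggdec.h,
with a sketch of the corrected behaviour; the condition name is a
placeholder:

    /* OGG_FLAG_CONT 0x01, OGG_FLAG_BOS 0x02, OGG_FLAG_EOS 0x04 */
    if (last_page_of_logical_bitstream)
        page_flags |= OGG_FLAG_EOS;  /* never on intermediate pages */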
Signed-off-by: Andrew Kelley <superjoe30@gmail.com>
Signed-off-by: Martin Storsjö <martin@martin.st>
Before, header information for Ogg files was sent along with the
first encoded packet.
This patch makes it possible for API users to differentiate between
headers and encoded audio. This is useful, for
example, when creating an audio stream where you want to send one set
of headers for every client that connects and then the encoded stream
of audio.
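A minimal sketch of that use case, assuming the header pages are now
flushed by avformat_write_header(); serve_header_to_new_clients() is
a placeholder:

    uint8_t *hdr;
    int hdr_size;

    avio_open_dyn_buf(&oc->pb);
    avformat_write_header(oc, NULL);
    hdr_size = avio_close_dyn_buf(oc->pb, &hdr); /* headers only */
    serve_header_to_new_clients(hdr, hdr_size);
    av_free(hdr);
    avio_open_dyn_buf(&oc->pb); /* subsequent pages carry encoded audio */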
Signed-off-by: Martin Storsjö <martin@martin.st>
Use it instead of checking CODEC_FLAG_BITEXACT in the first stream's
codec context.
Using codec options inside lavf is fragile and can easily break when the
muxing codec context is not the encoding context.
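Usage sketch: callers now request bitexact output on the format
context itself:

    oc->flags |= AVFMT_FLAG_BITEXACT; /* replaces CODEC_FLAG_BITEXACT
                                         on the first stream */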
Initial implementation by Andrew D'Addesio <modchipv12@gmail.com> during
GSoC 2012.
Completion by Anton Khirnov <anton@khirnov.net>, sponsored by the
Mozilla Corporation.
Further contributions by:
Christophe Gisquet <christophe.gisquet@gmail.com>
Janne Grunau <janne-libav@jannau.net>
Luca Barbato <lu_zero@gentoo.org>
Also, stop using AVCodecContext.codec_name as a fallback, since it
will be deprecated.
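A sketch of the descriptor-based lookup that replaces the codec_name
fallback:

    const AVCodecDescriptor *desc = avcodec_descriptor_get(codec_id);
    const char *name = desc ? desc->name : NULL; /* e.g. "msmpeg4v3" */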
Changes the result of the lavf-asf test (and its associated seektest),
since 'msmpeg4v3' gets written instead of just 'msmpeg4'.
Also set the RGBA pixel format correctly as the native-endian format,
which is what it returns.
This fixes the tests on big endian.
Signed-off-by: Martin Storsjö <martin@martin.st>
Fixes conversion of pal8 to rgb formats with alpha.
Updated references for 2 FATE tests which previously encoded fully
transparent images.
Based on a patch by Baptiste Coudurier <baptiste.coudurier@gmail.com>
Extrapolate audio timestamps based on the number of samples demuxed.
Deal with some MXF nastiness involving a fractional number of
samples per EditUnit when seeking (the specs handwave this away).
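A hypothetical sketch of the extrapolation; track->samples_read is a
placeholder field, not necessarily what the patch uses:

    pkt->pts = av_rescale_q(track->samples_read,
                            (AVRational){ 1, sample_rate },
                            st->time_base);
    track->samples_read += nb_samples; /* samples in this packet */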
Further fixes from Tomas Härdin.
Signed-off-by: Luca Barbato <lu_zero@gentoo.org>
The official Ut Video decoder only supports threading via slices, so
until now, any files encoded by the libavcodec encoder could only be
decoded with a single thread. The default slice count is now set to
subsampled_height / 120.
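A sketch of the default computation, assuming the height is
chroma-subsampled first; the encoder's exact rounding may differ:

    const AVPixFmtDescriptor *desc = av_pix_fmt_desc_get(avctx->pix_fmt);
    int subsampled_height = avctx->height >> desc->log2_chroma_h;
    int slices = FFMAX(1, subsampled_height / 120);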
Also sets slices to 1 for the Ut Video encoder tests to keep them
green.
Signed-off-by: Derek Buitenhuis <derek.buitenhuis@gmail.com>
The old one didn't use segmentation. Of the two new ones, one uses
segmentation in all frame types (--aq-mode=1), and the other uses all
segmentation features, but only in inter frames (mbgraph).
Signed-off-by: Anton Khirnov <anton@khirnov.net>
This disables backward probability updates, which makes the codec more
friendly for frame-level multi-threading.
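Usage sketch, assuming the libvpx wrapper's "error-resilient" private
option is the switch for this mode:

    av_opt_set(avctx->priv_data, "error-resilient", "default", 0);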
Signed-off-by: Anton Khirnov <anton@khirnov.net>
The original test without a forced idct is still useful since it tests
the switching of the idct algorithm/permutation on x86 with MMX,
MMXext or SSE2. Make sure the test runs only if MMX inline asm is
available and
force -cpuflags to all.
Add the required bitexact flag for both tests.