The resolution is in the packets, so decoding must happen.
Since most other formats do not set the dimensions, make this
a special case for PGS. If other codecs were to have the
same requirement, using a CODEC_CAP would be preferable.
This also changes behavior, as the descriptor table is more complete
than the switch/case it replaces, and all non-video streams are now
considered intra-only.
Signed-off-by: Michael Niedermayer <michaelni@gmx.at>
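A minimal sketch of the special case described above, assuming a helper around the usual codec id; the function name is illustrative only, not the actual patch:

    /* Illustrative only: PGS carries its resolution inside the packets,
     * so probing has to decode a packet before width/height are known. */
    static int needs_decoding_for_dimensions(const AVCodecContext *avctx)
    {
        return avctx->codec_id == AV_CODEC_ID_HDMV_PGS_SUBTITLE &&
               (!avctx->width || !avctx->height);
    }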
* qatar/master:
lavf: Detect discontinuities in timestamps for framerate/analyzeduration calculation
lavf: Initialize the stream info timestamps in avformat_new_stream
id3v2: Match PIC mimetype/format case-insensitively
configure: Rename check_asm() to more fitting check_inline_asm()
fate: Only test enabled filters
avresample: De-doxygenize some comments where Doxygen is not appropriate
rtmp: split chunk_size var into in_chunk_size and out_chunk_size
rtmp: Factorize the code by adding find_tracked_method
Conflicts:
configure
Merged-by: Michael Niedermayer <michaelni@gmx.at>
These are normally initialized to AV_NOPTS_VALUE at the start
of avformat_find_stream_info, but if a new stream is found while
this function is running (e.g. like in mpegts), the newly added
AVStreams didn't have these values properly initialized, leading
to avformat_find_stream_info terminating too soon (when the
first timestamps are far from 0).
Signed-off-by: Martin Storsjö <martin@martin.st>
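A sketch of the initialization in question, assuming the stream-info fields used during probing in libavformat/utils.c of that era:

    /* In avformat_new_stream(): make sure streams created mid-probe
     * start out with unset timestamps rather than 0. */
    st->info->fps_first_dts = AV_NOPTS_VALUE;
    st->info->fps_last_dts  = AV_NOPTS_VALUE;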
This adds a function to retrieve the number of entries in a
dictionary and updates the places that were directly accessing what
should be an opaque struct to use the new function instead.
Signed-off-by: Mans Rullgard <mans@mansr.com>
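A usage sketch, assuming the accessor is the av_dict_count() added to libavutil:

    AVDictionary *d = NULL;
    av_dict_set(&d, "title", "example", 0);
    /* Query the entry count through the API instead of reading
     * the struct's internals directly. */
    int n = av_dict_count(d);
    av_dict_free(&d);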
This is limited to the chars that aren't already filtered by av_log();
we might filter more aggressively if there's some case where this
becomes necessary.
Fixes Ticket1181
Signed-off-by: Michael Niedermayer <michaelni@gmx.at>
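A hypothetical illustration of the kind of filtering described; the helper name and replacement character are made up for the example:

    /* Replace control characters that av_log() would pass through,
     * so a malformed string cannot corrupt the terminal output. */
    static void sanitize_string(char *s)
    {
        for (; *s; s++)
            if ((unsigned char)*s < 0x20 || *s == 0x7f)
                *s = '?';
    }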
At this place, the normal way of initializing a struct works
fine; there's no need for a struct literal.
Signed-off-by: Martin Storsjö <martin@martin.st>
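A before/after illustration of the point above, with an arbitrary type chosen for the example:

    /* Struct (compound) literal ... */
    AVRational tb = (AVRational){ 1, 90000 };
    /* ... versus the ordinary member initialization preferred here. */
    AVRational tb2;
    tb2.num = 1;
    tb2.den = 90000;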
* qatar/master:
flvdec: remove spurious use of stream id
lavf: deprecate r_frame_rate.
lavf: round estimated average fps to a "standard" fps.
Conflicts:
ffmpeg.c
ffprobe.c
libavformat/avformat.h
libavformat/electronicarts.c
libavformat/flvdec.c
libavformat/rawdec.c
libavformat/utils.c
tests/ref/fate/iv8-demux
Merged-by: Michael Niedermayer <michaelni@gmx.at>
* commit 'fe1c1198e670242f3cf9e3e1eef27cff77f3ee23':
lavf: use dts difference instead of AVPacket.duration in find_stream_info()
avf: introduce nobuffer option
fate: make yadif tests consistent across systems
vf_hqdn3d: support 9 and 10bit colordepth
vf_hqdn3d: reduce intermediate precision
vf_hqdn3d: simplify and optimize
factor identical ff_inplace_start_frame out of two filters
vf_hqdn3d: cosmetics
avprobe/avconv: fix tentative declaration compile errors on MSVS.
Conflicts:
doc/APIchanges
ffmpeg.c
ffprobe.c
libavformat/avformat.h
libavformat/options_table.h
libavformat/utils.c
libavformat/version.h
tests/fate/filter.mak
tests/ref/fate/filter-yadif-mode0
tests/ref/fate/filter-yadif-mode1
Merged-by: Michael Niedermayer <michaelni@gmx.at>
According to its description, it is supposed to be the LCM of all the
frame durations. The usability of such a thing is vanishingly small,
especially since we cannot determine it with any amount of reliability.
Therefore get rid of it after the next bump.
Replace it with the average framerate where it makes sense.
FATE results for the wtv and xmv demux tests change. In the wtv case
this is caused by the file being corrupted (or possibly badly cut) and
containing invalid timestamps. This results in lavf estimating the
framerate wrong and making up wrong frame durations.
In the xmv case the file contains pts jumps, so again the estimated
framerate is far from anything sane and lavf again makes up different
frame durations.
In some other tests lavf starts making up frame durations from a
different frame.
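A sketch of the substitution suggested above, assuming the standard AVStream fields:

    /* Prefer the average framerate over the deprecated r_frame_rate. */
    AVRational fps     = st->avg_frame_rate;
    double     fps_val = fps.den ? av_q2d(fps) : 0.0;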
AVPacket.duration is mostly made up and thus completely useless; this is
especially true for video streams.
Therefore use the dts difference for framerate estimation and for
the max_analyze_duration check.
The asyncts test now needs -analyzeduration, because the default is 5
seconds and the audio stream in the sample appears at ~10 seconds.
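An illustrative sketch of the dts-based estimation described above, with made-up variable names:

    /* Estimate the frame duration from consecutive dts values instead
     * of trusting the (often fabricated) AVPacket.duration. */
    if (last_dts != AV_NOPTS_VALUE && pkt->dts != AV_NOPTS_VALUE &&
        pkt->dts > last_dts)
        estimated_duration = pkt->dts - last_dts;
    last_dts = pkt->dts;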
Useful in cases where a significant analyzeduration is
still needed, while minimizing buffering before output.
An example is processing low-latency streams where all
media types won't necessarily come in if the
analyzeduration is small.
Additional changes by Josh Allmann <joshua.allmann@gmail.com>
Signed-off-by: Anton Khirnov <anton@khirnov.net>
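A usage sketch, assuming the option is exposed as the "nobuffer" fflag / AVFMT_FLAG_NOBUFFER:

    AVFormatContext *ic = avformat_alloc_context();
    /* Skip the initial packet buffering done during probing, trading
     * some stream-info accuracy for lower startup latency. */
    ic->flags |= AVFMT_FLAG_NOBUFFER;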
* qatar/master: (35 commits)
h264_idct_10bit: port x86 assembly to cpuflags.
x86inc: clip num_args to 7 on x86-32.
x86inc: sync to latest version from x264.
fft: rename "z" to "zc" to prevent name collision.
wv: return meaningful error codes.
wv: return AVERROR_EOF on EOF, not EIO.
mp3dec: forward errors for av_get_packet().
mp3dec: remove a pointless local variable.
mp3dec: remove commented out cruft.
lavfi: bump minor to mark stabilizing the ABI.
FATE: add tests for yadif.
FATE: add a test for delogo video filter.
FATE: add a test for amix audio filter.
audiogen: allow specifying random seed as a commandline parameter.
vc1dec: Override invalid macroblock quantizer
vc1: avoid reading beyond the last line in vc1_draw_sprites()
vc1dec: check that coded slice positions and interlacing match.
vc1dec: Do not ignore ff_vc1_parse_frame_header_adv return value
configure: Move parts that should not be user-selectable to CONFIG_EXTRA
lavf: remove commented out cruft in avformat_find_stream_info()
...
Conflicts:
Makefile
configure
libavcodec/vc1dec.c
libavcodec/x86/h264_deblock.asm
libavcodec/x86/h264_deblock_10bit.asm
libavcodec/x86/h264dsp_mmx.c
libavfilter/version.h
libavformat/mp3dec.c
libavformat/utils.c
libavformat/wv.c
libavutil/x86/x86inc.asm
Merged-by: Michael Niedermayer <michaelni@gmx.at>
By moving it to a later point, relative and unknown timestamps
are more likely to have been corrected.
similar patch reviewed-by: Reimar Döffinger <Reimar.Doeffinger@gmx.de>
Signed-off-by: Michael Niedermayer <michaelni@gmx.at>
Conflicts:
libavformat/utils.c
If skip_samples is set and timestamps are synthesized using durations,
make them start at -skip_samples (rescaled) instead of 0,
so that the timestamp of the first undiscarded sample is 0.
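A rough sketch of the offset described, with illustrative names for the skip count, sample rate and stream:

    /* Start synthesized timestamps at -skip_samples (rescaled to the
     * stream time base) so the first kept sample ends up at pts 0. */
    int64_t start_pts = av_rescale_q(-(int64_t)skip_samples,
                                     (AVRational){ 1, sample_rate },
                                     st->time_base);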
This adds the minimum delay needed with the current decoder to
recognize the reorder buffer size for the reference bitstreams.
Signed-off-by: Michael Niedermayer <michaelni@gmx.at>