This leads to the Matroska muxer setting DefaultDuration based on the frame rate instead of the timebase.
Fixes playback in Chrome.
Signed-off-by: Peter Große <pegro@friiks.de>
Signed-off-by: Michael Niedermayer <michael@niedermayer.cc>
Fix conformance regarding section "3.2.4. Presence of Attributes and
Elements" of the "Guidelines for Implementation: DASH-IF Interoperability
Points V4.1" (http://dashif.org/guidelines/).
Signed-off-by: Anton Schubert <ischluff@mailbox.org>
Signed-off-by: Peter Große <pegro@friiks.de>
Signed-off-by: Michael Niedermayer <michael@niedermayer.cc>
Fixes: 1b8ef01f04 ("dashenc: add webm support")
Signed-off-by: Peter Große <pegro@friiks.de>
Signed-off-by: Michael Niedermayer <michael@niedermayer.cc>
This patch is inspired by the ffmpeg webm_chunk muxer and fixes the problem
that all resulting tracks had the same track number.
Signed-off-by: Peter Große <pegro@friiks.de>
Signed-off-by: Michael Niedermayer <michael@niedermayer.cc>
Signed-off-by: Anton Schubert <ischluff@mailbox.org>
Signed-off-by: Peter Große <pegro@friiks.de>
Signed-off-by: Michael Niedermayer <michael@niedermayer.cc>
* commit 'dce2929efa8e82b0832a828f7e8cb81ff8c20a4e':
dashenc: copy language and role metadata from streams assigned to sets
Merged-by: Rodger Combs <rodger.combs@gmail.com>
* commit 'efd2fc41b3f0749f9715d50b581f22bbaa8c5b99':
dashenc: allow assigning all streams of a media type to an AdaptationSet
Merged-by: Rodger Combs <rodger.combs@gmail.com>
* commit '3d23a5f96ad72961c14ba3a0c2add8f2ab374b61':
dashenc: add support for assigning streams to AdaptationSets
Merged-by: Rodger Combs <rodger.combs@gmail.com>
* commit '9df9309d233f59d9706444a1e24ac24139f2640d':
dashenc: calculate stream bitrate from first segment if not available
Merged-by: Rodger Combs <rodger.combs@gmail.com>
Move DASHTmplId and dash_fill_tmpl_params from dashenc to dash.c; they will
be used by both the dash demuxer and the dash muxer.
v2 fixed:
1. rename common file from dashcomm.* to dash.*
Suggested-by: Hendrik Leppkes <h.leppkes@gmail.com>
v3 fixed:
1. rename the header file include guard
2. add the ff_ prefix to the internal API
Suggested-by: James Almer <jamrial@gmail.com>
Suggested-by: Timo Rothenpieler <timo@rothenpieler.org>
Reviewed-by: wm4 <nfxjfg@googlemail.com>
Signed-off-by: Steven Liu <lq@onvideo.cn>
Skip the codec_tag altogether here, to let the user (try to) set
whichever codec/tag is preferred; the individual chained muxer will
reject invalid codecs anyway.
(cherry picked from commit 61f589e31e)
Signed-off-by: Derek Buitenhuis <derek.buitenhuis@gmail.com>
* commit '1920382aa9f21d7ed1a3c2214990da8d2b067a92':
dashenc: add option to provide UTC timing source
Also use E instead of AV_OPT_FLAG_ENCODING_PARAM to be consistent with
the other AVOptions.
Merged-by: Clément Bœsch <u@pkh.me>
* commit '95f1004bdfdf2d26c330c1d4b7c4ac9352d60b18':
dashenc: add mandatory id to AdaptationSet and Period in manifest
Merged-by: Clément Bœsch <u@pkh.me>
Provides a way to change the bandwidth parameter inside the DASH manifest after a non-CBR H.264 encoding.
The caller is now able to compute the bitrate itself after all packets have been written, then set that value in AVFormatContext->streams[i]->codecpar->bit_rate before calling av_write_trailer(). As a result, that value will be written into the DASH manifest.
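A minimal sketch of how a caller might use this; the function and variable names below are illustrative, not part of the patch:

    #include <libavformat/avformat.h>

    /* Hedged sketch (illustrative names): once all packets have been written,
     * store a bitrate the caller computed itself (in bits per second) on each
     * stream, then write the trailer so dashenc picks it up for the manifest. */
    static int finish_with_computed_bitrates(AVFormatContext *ctx,
                                             const int64_t *computed_bps)
    {
        for (unsigned i = 0; i < ctx->nb_streams; i++)
            ctx->streams[i]->codecpar->bit_rate = computed_bps[i];
        return av_write_trailer(ctx);
    }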
Signed-off-by: Michael Niedermayer <michael@niedermayer.cc>
Skips using temporary files when outputting to a protocol other than
"file", which enables dash to output content over network
protocols. The logic has been copied from the HLS format.
Reviewed-by: Steven Liu <lingjiujianke@gmail.com>
Signed-off-by: Michael Niedermayer <michael@niedermayer.cc>
Use the webm muxer for the VP8, VP9 and Opus codecs, and the mp4 muxer otherwise.
Signed-off-by: Peter Große <pegro@friiks.de>
Signed-off-by: Martin Storsjö <martin@martin.st>
The dash_write function drops data if no IOContext is initialized.
Since the mp4 muxer is used in "frag_custom" mode, data is only
written when av_write_frame(NULL) is called explicitly, and thus
there is no data loss.
To add support for webm as a subordinate muxer, which doesn't have
such a mode, a dynamic buffer is required to provide an always-initialized
IOContext.
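For illustration, the dynamic-buffer pattern referred to here looks roughly like this (a simplified sketch, not the actual dashenc code):

    #include <libavformat/avio.h>
    #include <libavutil/mem.h>

    /* Hedged sketch: the subordinate muxer writes into an in-memory IOContext,
     * and the caller later retrieves and frees the buffered bytes. */
    static int write_via_dyn_buf(void)
    {
        AVIOContext *dyn_buf = NULL;
        uint8_t *data = NULL;
        int size, ret;

        ret = avio_open_dyn_buf(&dyn_buf);         /* always-initialized IOContext */
        if (ret < 0)
            return ret;
        avio_write(dyn_buf, (const unsigned char *)"example", 7); /* stand-in for muxer output */
        size = avio_close_dyn_buf(dyn_buf, &data); /* returns the buffered size */
        /* ... forward data/size to the real output here ... */
        av_free(data);
        return size;
    }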
Signed-off-by: Peter Große <pegro@friiks.de>
Signed-off-by: Martin Storsjö <martin@martin.st>
Previously, all mapped streams of a media type (video, audio) were assigned
to a single AdaptationSet. With the DASH live profile it is mandatory that
the segments of all representations are aligned, which is currently not
enforced. This leads to problems when using video streams with different
keyframe intervals. So, to play it safe, default to one AdaptationSet per
stream, unless overridden by an explicit assignment.
To get the old assignment scheme, use
-adaptation_sets "id=0,streams=v id=1,streams=a"
Signed-off-by: Peter Große <pegro@friiks.de>
Signed-off-by: Martin Storsjö <martin@martin.st>
When the characters "v" or "a" are used instead of stream index numbers for
assigning streams in the adaptation_sets option, all streams matching the
given type will be added to the AdaptationSet.
Signed-off-by: Peter Große <pegro@friiks.de>
Signed-off-by: Martin Storsjö <martin@martin.st>
Also makes sure all streams are assigned to exactly one AdaptationSet.
This patch is originally based partially on code by Vignesh Venkatasubramanian.
Signed-off-by: Peter Große <pegro@friiks.de>
Signed-off-by: Martin Storsjö <martin@martin.st>
Bandwidth information is required in the manifest, but it is not always
provided by the demuxer. In that case, calculate the bandwidth based
on the size and duration of the first segment.
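The calculation amounts to roughly the following (a sketch with illustrative names):

    #include <libavutil/avutil.h>
    #include <libavutil/mathematics.h>

    /* Hedged sketch: bits per second derived from the first segment's byte
     * size and its duration given in AV_TIME_BASE (microsecond) units. */
    static int64_t first_segment_bandwidth(int64_t size_bytes, int64_t duration_us)
    {
        return av_rescale(size_bytes * 8, AV_TIME_BASE, duration_us);
    }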
Signed-off-by: Peter Große <pegro@friiks.de>
Signed-off-by: Martin Storsjö <martin@martin.st>
The current implementation creates new segments by comparing
pkt->pts - first_pts > nb_segs * min_seg_duration
This works fine, but if the keyframe interval is smaller than "min_seg_duration",
segments shorter than the minimum segment duration are created.
Example: keyint=50, min_seg_duration=3000000
    segment 1 contains keyframe 1 (duration=2s  <  total_duration=3s)
                   and keyframe 2 (duration=4s  >= total_duration=3s)
    segment 2 contains keyframe 3 (duration=6s  >= total_duration=6s)
    segment 3 contains keyframe 4 (duration=8s  <  total_duration=9s)
                   and keyframe 5 (duration=10s >= total_duration=9s)
    ...
Segment 2 is only 2s long, shorter than min_seg_duration = 3s.
To fix this, new segments are created based on the actual written duration.
Otherwise the option name "min_seg_duration" is misleading.
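In rough pseudo-C, the new condition compares the duration actually written since the last cut against min_seg_duration (a sketch, not the exact dashenc code):

    #include <libavformat/avformat.h>
    #include <libavutil/mathematics.h>

    /* Hedged sketch: cut a new segment at a keyframe once the duration written
     * since the last segment start reaches min_seg_duration (min_seg_duration
     * in AV_TIME_BASE units, pts values in the stream timebase). */
    static int should_start_new_segment(const AVPacket *pkt, AVRational time_base,
                                        int64_t seg_start_pts, int64_t min_seg_duration)
    {
        return (pkt->flags & AV_PKT_FLAG_KEY) &&
               av_compare_ts(pkt->pts - seg_start_pts, time_base,
                             min_seg_duration, AV_TIME_BASE_Q) >= 0;
    }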
Signed-off-by: Peter Große <pegro@friiks.de>
Signed-off-by: Martin Storsjö <martin@martin.st>
If set, adds a UTCTiming tag in the manifest.
This is part of the recommendations listed in the "Guidelines for
Implementation: DASH-IF Interoperability Points" [1][2].
Section 4.7 describes means for the Availability Time Synchronization.
A usable default is "https://time.akamai.com/?iso"
[1] http://dashif.org/guidelines/
[2] http://dashif.org/wp-content/uploads/2016/12/DASH-IF-IOP-v4.0-clean.pdf
(current version as of writing)
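Assuming the muxer exposes this as a private option named utc_timing_url (the name is taken from this series and should be treated as an assumption), a caller could enable it roughly like this:

    #include <libavformat/avformat.h>
    #include <libavutil/opt.h>

    /* Hedged sketch: set the (assumed) "utc_timing_url" private option on the
     * dash muxer context before avformat_write_header(). */
    static int enable_utc_timing(AVFormatContext *ofmt_ctx)
    {
        return av_opt_set(ofmt_ctx->priv_data, "utc_timing_url",
                          "https://time.akamai.com/?iso", 0);
    }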
Signed-off-by: Peter Große <pegro@friiks.de>
Signed-off-by: Martin Storsjö <martin@martin.st>
to avoid rebuffering on the client side under difficult network conditions.
Signed-off-by: Anton Schubert <ischluff@mailbox.org>
Signed-off-by: Martin Storsjö <martin@martin.st>
Appends a Z to the timestamp to force ISO 8601 datetime parsing as UTC.
Without the Z, some browsers (Chrome) interpret the timestamp as
local time while others (Firefox) interpret it as UTC.
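For illustration only, producing such a timestamp in C could look like this (a sketch, not the dashenc code itself):

    #include <time.h>

    /* Hedged sketch: format the current time as ISO 8601 in UTC with a
     * trailing 'Z', e.g. "2017-10-27T12:34:56Z", so browsers parse it as UTC. */
    static void iso8601_utc_now(char *buf, size_t size)
    {
        time_t now = time(NULL);
        strftime(buf, size, "%Y-%m-%dT%H:%M:%SZ", gmtime(&now));
    }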
Signed-off-by: Anton Schubert <ischluff@mailbox.org>
Signed-off-by: Martin Storsjö <martin@martin.st>
Currently, AVStream contains an embedded AVCodecContext instance, which
is used by demuxers to export stream parameters to the caller and by
muxers to receive stream parameters from the caller. It is also used
internally as the codec context that is passed to parsers.
In addition, it is also widely used by the callers as the decoding (when
demuxing) or encoding (when muxing) context, though this has been
officially discouraged since Libav 11.
There are multiple important problems with this approach:
- the fields in AVCodecContext are in general one of
* stream parameters
* codec options
* codec state
However, it's not clear which ones are which. It is consequently
unclear which fields a demuxer is allowed to set or a muxer allowed to
read. This leads to erratic behaviour depending on whether decoding or
encoding is being performed (and whether it uses the AVStream
embedded codec context).
- various synchronization issues arising from the fact that the same
context is used by several different APIs (muxers/demuxers,
parsers, bitstream filters and encoders/decoders) simultaneously, with
there being no clear rules for who can modify what and the different
processes being typically delayed with respect to each other.
- avformat_find_stream_info() making it necessary to support opening
and closing a single codec context multiple times, thus
complicating the semantics of freeing various allocated objects in the
codec context.
Those problems are resolved by replacing the AVStream embedded codec
context with a newly added AVCodecParameters instance, which stores only
the stream parameters exported by the demuxers or read by the muxers.
This also deprecates our old duplicated callbacks.
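For callers, the resulting flow looks roughly like this (a sketch of how the new API is meant to be used, not code from this commit):

    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>

    /* Hedged sketch: read stream parameters from AVStream.codecpar and copy
     * them into a separately allocated decoder context, instead of decoding
     * with the context embedded in AVStream. */
    static AVCodecContext *open_decoder_for_stream(AVStream *st)
    {
        const AVCodec *dec = avcodec_find_decoder(st->codecpar->codec_id);
        AVCodecContext *ctx = avcodec_alloc_context3(dec);

        if (!ctx)
            return NULL;
        if (avcodec_parameters_to_context(ctx, st->codecpar) < 0 ||
            avcodec_open2(ctx, dec, NULL) < 0) {
            avcodec_free_context(&ctx);
            return NULL;
        }
        return ctx;
    }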
* commit '9f61abc8111c7c43f49ca012e957a108b9cc7610':
lavf: allow custom IO for all files
Merged-by: Derek Buitenhuis <derek.buitenhuis@gmail.com>
Some (de)muxers open additional files beyond the main IO context.
Currently, they call avio_open() directly, which prevents the caller
from using custom IO for such streams.
This commit adds callbacks to AVFormatContext that default to
avio_open2()/avio_close(), but can be overridden by the caller. All
muxers and demuxers using AVIO are switched to using those callbacks
instead of calling avio_open()/avio_close() directly.
(de)muxers that use the URLProtocol layer directly instead of AVIO
remain unconverted for now. This should be fixed in later commits.
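For example, a caller can now intercept every additional file access with callbacks along these lines (a sketch; the logging is illustrative):

    #include <libavformat/avformat.h>
    #include <libavutil/log.h>

    /* Hedged sketch: custom io_open/io_close callbacks that log the URL and
     * otherwise behave like the defaults. */
    static int my_io_open(AVFormatContext *s, AVIOContext **pb,
                          const char *url, int flags, AVDictionary **options)
    {
        av_log(s, AV_LOG_INFO, "opening %s\n", url);
        return avio_open2(pb, url, flags, &s->interrupt_callback, options);
    }

    static void my_io_close(AVFormatContext *s, AVIOContext *pb)
    {
        avio_closep(&pb);
    }

    /* Usage (before avformat_open_input() or avformat_write_header()):
     *     ctx->io_open  = my_io_open;
     *     ctx->io_close = my_io_close;
     */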