mirror of https://github.com/FFmpeg/FFmpeg.git synced 2024-11-21 10:55:51 +02:00

doc: use @command{} for commands.

Clément Bœsch 2012-01-02 15:32:55 +01:00 committed by Clément Bœsch
parent 837126568c
commit dc7ad85c40
7 changed files with 30 additions and 30 deletions
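
For context, the Texinfo convention this commit enforces: @command{} marks the name of a program the user runs, while @file{} marks a file or path. A minimal illustrative snippet of the distinction (hypothetical wording and file names, not taken from the FFmpeg docs):

@command{ffmpeg} reads @file{input.avi} and writes @file{out.mkv}:
@example
ffmpeg -i input.avi out.mkv
@end example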


@@ -577,7 +577,7 @@ Allow to set any x264 option, see x264 --fullhelp for a list.
 ":".
 @end table
-For example to specify libx264 encoding options with @file{ffmpeg}:
+For example to specify libx264 encoding options with @command{ffmpeg}:
 @example
 ffmpeg -i foo.mpg -vcodec libx264 -x264opts keyint=123:min-keyint=20 -an out.mkv
 @end example


@@ -135,7 +135,7 @@ Read @var{input_file}.
 @chapter Writers
 @c man begin WRITERS
-A writer defines the output format adopted by @file{ffprobe}, and will be
+A writer defines the output format adopted by @command{ffprobe}, and will be
 used for printing all the parts of the output.
 A writer may accept one or more arguments, which specify the options to


@@ -354,7 +354,7 @@ A customized down-mix to stereo that works automatically for 3-, 4-, 5- and
 pan=stereo: FL < FL + 0.5*FC + 0.6*BL + 0.6*SL : FR < FR + 0.5*FC + 0.6*BR + 0.6*SR
 @end example
-Note that @file{ffmpeg} integrates a default down-mix (and up-mix) system
+Note that @command{ffmpeg} integrates a default down-mix (and up-mix) system
 that should be preferred (see "-ac" option) unless you have very specific
 needs.


@@ -196,12 +196,12 @@ device.
 Once you have created one or more JACK readable clients, you need to
 connect them to one or more JACK writable clients.
-To connect or disconnect JACK clients you can use the
-@file{jack_connect} and @file{jack_disconnect} programs, or do it
-through a graphical interface, for example with @file{qjackctl}.
+To connect or disconnect JACK clients you can use the @command{jack_connect}
+and @command{jack_disconnect} programs, or do it through a graphical interface,
+for example with @command{qjackctl}.
 To list the JACK clients and their properties you can invoke the command
-@file{jack_lsp}.
+@command{jack_lsp}.
 Follows an example which shows how to capture a JACK readable client
 with @command{ffmpeg}.
@@ -260,7 +260,7 @@ device.
 @itemize
 @item
-Create a color video stream and play it back with @file{ffplay}:
+Create a color video stream and play it back with @command{ffplay}:
 @example
 ffplay -f lavfi -graph "color=pink [out0]" dummy
 @end example
@@ -280,14 +280,14 @@ ffplay -f lavfi -graph "testsrc [out0]; testsrc,hflip [out1]; testsrc,negate [ou
 @item
 Read an audio stream from a file using the amovie source and play it
-back with @file{ffplay}:
+back with @command{ffplay}:
 @example
 ffplay -f lavfi "amovie=test.wav"
 @end example
 @item
 Read an audio stream and a video stream and play it back with
-@file{ffplay}:
+@command{ffplay}:
 @example
 ffplay -f lavfi "movie=test.avi[out0];amovie=test.wav[out1]"
 @end example
@@ -380,7 +380,7 @@ $ ffmpeg -f openal -i '' out.ogg
 @end example
 Capture from two devices simultaneously, writing to two different files,
-within the same @file{ffmpeg} command:
+within the same @command{ffmpeg} command:
 @example
 $ ffmpeg -f openal -i 'DR-BT101 via PulseAudio' out1.ogg -f openal -i 'ALSA Default' out2.ogg
 @end example
@@ -415,7 +415,7 @@ The filename to provide to the input device is a source device or the
 string "default"
 To list the pulse source devices and their properties you can invoke
-the command @file{pactl list sources}.
+the command @command{pactl list sources}.
 @example
 ffmpeg -f pulse -i default /tmp/pulse.wav
@@ -516,8 +516,8 @@ the device.
 Video4Linux and Video4Linux2 devices only support a limited set of
 @var{width}x@var{height} sizes and frame rates. You can check which are
-supported for example with the command @file{dov4l} for Video4Linux
-devices and the command @file{v4l-info} for Video4Linux2 devices.
+supported for example with the command @command{dov4l} for Video4Linux
+devices and the command @command{v4l-info} for Video4Linux2 devices.
 If the size for the device is set to 0x0, the input device will
 try to auto-detect the size to use.
@@ -579,7 +579,7 @@ default to 0.
 Check the X11 documentation (e.g. man X) for more detailed information.
-Use the @file{dpyinfo} program for getting basic information about the
+Use the @command{dpyinfo} program for getting basic information about the
 properties of your X11 display (e.g. grep for "name" or "dimensions").
 For example to grab from @file{:0.0} using @command{ffmpeg}:


@@ -43,13 +43,13 @@ The result will be that in output the top half of the video is mirrored
 onto the bottom half.
 Video filters are loaded using the @var{-vf} option passed to
-ffmpeg or to ffplay. Filters in the same linear chain are separated by
-commas. In our example, @var{split, fifo, overlay} are in one linear
-chain, and @var{fifo, crop, vflip} are in another. The points where
-the linear chains join are labeled by names enclosed in square
-brackets. In our example, that is @var{[T1]} and @var{[T2]}. The magic
-labels @var{[in]} and @var{[out]} are the points where video is input
-and output.
+@command{ffmpeg} or to @command{ffplay}. Filters in the same linear
+chain are separated by commas. In our example, @var{split, fifo,
+overlay} are in one linear chain, and @var{fifo, crop, vflip} are in
+another. The points where the linear chains join are labeled by names
+enclosed in square brackets. In our example, that is @var{[T1]} and
+@var{[T2]}. The magic labels @var{[in]} and @var{[out]} are the points
+where video is input and output.
 Some filters take in input a list of parameters: they are specified
 after the filter name and an equal sign, and are separated each other


@@ -60,7 +60,7 @@ If not specified it defaults to the size of the input video.
 @subsection Examples
-The following command shows the @file{ffmpeg} output is an
+The following command shows the @command{ffmpeg} output is an
 SDL window, forcing its size to the qcif format:
 @example
 ffmpeg -i INPUT -vcodec rawvideo -pix_fmt yuv420p -window_size qcif -f sdl "SDL output"


@@ -52,7 +52,7 @@ resource to be concatenated, each one possibly specifying a distinct
 protocol.
 For example to read a sequence of files @file{split1.mpeg},
-@file{split2.mpeg}, @file{split3.mpeg} with @file{ffplay} use the
+@file{split2.mpeg}, @file{split3.mpeg} with @command{ffplay} use the
 command:
 @example
 ffplay concat:split1.mpeg\|split2.mpeg\|split3.mpeg
@@ -183,7 +183,7 @@ application specified in @var{app}, may be prefixed by "mp4:".
 @end table
-For example to read with @file{ffplay} a multimedia resource named
+For example to read with @command{ffplay} a multimedia resource named
 "sample" from the application "vod" from an RTMP server "myserver":
 @example
 ffplay rtmp://myserver/vod/sample
@@ -224,7 +224,7 @@ For example, to stream a file in real-time to an RTMP server using
 ffmpeg -re -i myfile -f flv rtmp://myserver/live/mystream
 @end example
-To play the same stream using @file{ffplay}:
+To play the same stream using @command{ffplay}:
 @example
 ffplay "rtmp://myserver/live/mystream live=1"
 @end example
@@ -249,7 +249,7 @@ The required syntax for a RTSP url is:
 rtsp://@var{hostname}[:@var{port}]/@var{path}
 @end example
-The following options (set on the @command{ffmpeg}/@file{ffplay} command
+The following options (set on the @command{ffmpeg}/@command{ffplay} command
 line, or set in code via @code{AVOption}s or in @code{avformat_open_input}),
 are supported:
@@ -288,7 +288,7 @@ When receiving data over UDP, the demuxer tries to reorder received packets
 order for this to be enabled, a maximum delay must be specified in the
 @code{max_delay} field of AVFormatContext.
-When watching multi-bitrate Real-RTSP streams with @file{ffplay}, the
+When watching multi-bitrate Real-RTSP streams with @command{ffplay}, the
 streams to display can be chosen with @code{-vst} @var{n} and
 @code{-ast} @var{n} for video and audio respectively, and can be switched
 on the fly by pressing @code{v} and @code{a}.
@@ -365,13 +365,13 @@ To broadcast a stream on the local subnet, for watching in VLC:
 ffmpeg -re -i @var{input} -f sap sap://224.0.0.255?same_port=1
 @end example
-Similarly, for watching in ffplay:
+Similarly, for watching in @command{ffplay}:
 @example
 ffmpeg -re -i @var{input} -f sap sap://224.0.0.255
 @end example
-And for watching in ffplay, over IPv6:
+And for watching in @command{ffplay}, over IPv6:
 @example
 ffmpeg -re -i @var{input} -f sap sap://[ff0e::1:2:3:4]