doc: update documentation to use avconv
Commit d5837d7fe9 (parent ca410b4eb0)

@@ -33,19 +33,19 @@ debug mode or a NoDaemon option is specified in the configuration
 file.

 This documentation covers only the streaming aspects of avserver /
-ffmpeg. All questions about parameters for ffmpeg, codec questions,
-etc. are not covered here. Read @file{ffmpeg-doc.html} for more
+avconv. All questions about parameters for avconv, codec questions,
+etc. are not covered here. Read @file{avconv.html} for more
 information.

 @section How does it work?

-avserver receives prerecorded files or FFM streams from some ffmpeg
+avserver receives prerecorded files or FFM streams from some avconv
 instance as input, then streams them over RTP/RTSP/HTTP.

 An avserver instance will listen on some port as specified in the
-configuration file. You can launch one or more instances of ffmpeg and
+configuration file. You can launch one or more instances of avconv and
 send one or more FFM streams to the port where avserver is expecting
-to receive them. Alternately, you can make avserver launch such ffmpeg
+to receive them. Alternately, you can make avserver launch such avconv
 instances at startup.

 Input streams are called feeds, and each one is specified by a <Feed>
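
For illustration, a minimal feed/stream pair of the kind referred to above might look like the sketch below. The directive names follow the historical avserver/ffserver configuration syntax and are assumptions here, not part of this change; see @file{doc/avserver.conf} for the authoritative example.

@example
Port 8090
BindAddress 0.0.0.0

<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 200K
</Feed>

<Stream test.mpg>
Feed feed1.ffm
Format mpeg
VideoFrameRate 25
VideoSize 352x288
</Stream>
@end example
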
@@ -107,11 +107,11 @@ LAME is important as it allows for streaming audio to Windows Media Player.
 Don't ask why the other audio types do not work.

 As a simple test, just run the following two command lines where INPUTFILE
-is some file which you can decode with ffmpeg:
+is some file which you can decode with avconv:

 @example
 ./avserver -f doc/avserver.conf &
-./ffmpeg -i INPUTFILE http://localhost:8090/feed1.ffm
+./avconv -i INPUTFILE http://localhost:8090/feed1.ffm
 @end example

 At this point you should be able to go to your Windows machine and fire up
@@ -130,7 +130,7 @@ The same is true of AVI files.
 @section What happens next?

 You should edit the avserver.conf file to suit your needs (in terms of
-frame rates etc). Then install avserver and ffmpeg, write a script to start
+frame rates etc). Then install avserver and avconv, write a script to start
 them up, and off you go.

 @section Troubleshooting
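
Such a startup script could be as simple as the sketch below; the configuration path, capture device and feed name are assumptions, adjust them to your setup.

@example
#!/bin/sh
# start avserver with its configuration file, then feed it from a V4L2 device
avserver -f /etc/avserver.conf &
sleep 2   # give avserver time to start listening on the feed port
avconv -f video4linux2 -i /dev/video0 http://localhost:8090/feed1.ffm
@end example
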
@@ -138,13 +138,13 @@ them up, and off you go.
 @subsection I don't hear any audio, but video is fine.

 Maybe you didn't install LAME, or got your ./configure statement wrong. Check
-the ffmpeg output to see if a line referring to MP3 is present. If not, then
+the avconv output to see if a line referring to MP3 is present. If not, then
 your configuration was incorrect. If it is, then maybe your wiring is not
 set up correctly. Maybe the sound card is not getting data from the right
 input source. Maybe you have a really awful audio interface (like I do)
 that only captures in stereo and also requires that one channel be flipped.
 If you are one of these people, then export 'AUDIO_FLIP_LEFT=1' before
-starting ffmpeg.
+starting avconv.

 @subsection The audio and video loose sync after a while.

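With a Bourne-compatible shell, the workaround mentioned above amounts to something like the following; the ALSA capture device and feed URL are assumptions.

@example
export AUDIO_FLIP_LEFT=1
avconv -f alsa -i hw:0 http://localhost:8090/feed1.ffm
@end example
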
@@ -250,7 +250,7 @@ Use @file{configfile} instead of @file{/etc/avserver.conf}.
 @item -n
 Enable no-launch mode. This option disables all the Launch directives
 within the various <Stream> sections. Since avserver will not launch
-any ffmpeg instances, you will have to launch them manually.
+any avconv instances, you will have to launch them manually.
 @item -d
 Enable debug mode. This option increases log verbosity, directs log
 messages to stdout and causes avserver to run in the foreground
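
A possible invocation combining these two flags, with the feed then supplied by hand, might look like the sketch below; the configuration path and input file are assumptions.

@example
# terminal 1: run avserver in the foreground with Launch directives disabled
avserver -d -n -f /etc/avserver.conf

# terminal 2: feed it manually, since no avconv instance is launched automatically
avconv -i INPUTFILE http://localhost:8090/feed1.ffm
@end example
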
@@ -265,7 +265,7 @@ rather than as a daemon.

 @c man begin SEEALSO

-avconv(1), avplay(1), avprobe(1), the @file{ffmpeg/doc/avserver.conf}
+avconv(1), avplay(1), avprobe(1), the @file{avserver.conf}
 example and the Libav HTML documentation
 @c man end

@@ -132,8 +132,8 @@ libavcodec libraries. To see the list of available AVOptions, use the
 @option{-help} option. They are separated into two categories:
 @table @option
 @item generic
-These options can be set for any container, codec or device. Generic options are
-listed under AVFormatContext options for containers/devices and under
+These options can be set for any container, codec or device. Generic options
+are listed under AVFormatContext options for containers/devices and under
 AVCodecContext options for codecs.
 @item private
 These options are specific to the given container, device or codec. Private
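
As an illustration of a generic option, the @option{fflags} AVFormatContext option can be set on an input file; the file names and the choice of option here are assumptions, not part of this change.

@example
avconv -fflags +genpts -i input.ts -c copy output.mkv
@end example
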
@@ -144,14 +144,14 @@ For example to write an ID3v2.3 header instead of a default ID3v2.4 to
 an MP3 file, use the @option{id3v2_version} private option of the MP3
 muxer:
 @example
-ffmpeg -i input.flac -id3v2_version 3 out.mp3
+avconv -i input.flac -id3v2_version 3 out.mp3
 @end example

 All codec AVOptions are obviously per-stream, so the chapter on stream
 specifiers applies to them

-Note -nooption syntax cannot be used for boolean AVOptions, use -option
-0/-option 1.
+Note @option{-nooption} syntax cannot be used for boolean AVOptions,
+use @option{-option 0}/@option{-option 1}.

 Note2 old undocumented way of specifying per-stream AVOptions by prepending
 v/a/s to the options name is now obsolete and will be removed soon.
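
For instance, applying a codec AVOption to a single stream through a stream specifier might look like the sketch below; the option, codec and stream index are illustrative assumptions.

@example
# set the audio codec and its bitrate AVOption only for the first audio stream
avconv -i INPUT -c:a:0 libmp3lame -b:a:0 192k OUTPUT
@end example
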
@@ -34,7 +34,7 @@ JPEG image. The individual frames can be extracted without loss,
 e.g. by

 @example
-ffmpeg -i ../some_mjpeg.avi -vcodec copy frames_%d.jpg
+avconv -i ../some_mjpeg.avi -c:v copy frames_%d.jpg
 @end example

 Unfortunately, these chunks are incomplete JPEG images, because
@@ -57,9 +57,9 @@ stream (carrying the AVI1 header ID and lacking a DHT segment) to
 produce fully qualified JPEG images.

 @example
-ffmpeg -i mjpeg-movie.avi -vcodec copy -vbsf mjpeg2jpeg frame_%d.jpg
+avconv -i mjpeg-movie.avi -c:v copy -vbsf mjpeg2jpeg frame_%d.jpg
 exiftran -i -9 frame*.jpg
-ffmpeg -i frame_%d.jpg -vcodec copy rotated.avi
+avconv -i frame_%d.jpg -c:v copy rotated.avi
 @end example

 @section mjpega_dump_header
@@ -42,10 +42,10 @@ specify card number or identifier, device number and subdevice number
 To see the list of cards currently recognized by your system check the
 files @file{/proc/asound/cards} and @file{/proc/asound/devices}.

-For example to capture with @file{ffmpeg} from an ALSA device with
+For example to capture with @command{avconv} from an ALSA device with
 card id 0, you may run the command:
 @example
-ffmpeg -f alsa -i hw:0 alsaout.wav
+avconv -f alsa -i hw:0 alsaout.wav
 @end example

 For more information see:
@@ -72,14 +72,14 @@ For more detailed information read the file
 Documentation/fb/framebuffer.txt included in the Linux source tree.

 To record from the framebuffer device @file{/dev/fb0} with
-@file{ffmpeg}:
+@command{avconv}:
 @example
-ffmpeg -f fbdev -r 10 -i /dev/fb0 out.avi
+avconv -f fbdev -r 10 -i /dev/fb0 out.avi
 @end example

 You can take a single screenshot image with the command:
 @example
-ffmpeg -f fbdev -vframes 1 -r 1 -i /dev/fb0 screenshot.jpeg
+avconv -f fbdev -frames:v 1 -r 1 -i /dev/fb0 screenshot.jpeg
 @end example

 See also @url{http://linux-fbdev.sourceforge.net/}, and fbset(1).
@@ -109,10 +109,10 @@ To list the JACK clients and their properties you can invoke the command
 @file{jack_lsp}.

 Follows an example which shows how to capture a JACK readable client
-with @file{ffmpeg}.
+with @command{avconv}.
 @example
-# Create a JACK writable client with name "ffmpeg".
-$ ffmpeg -f jack -i ffmpeg -y out.wav
+# Create a JACK writable client with name "libav".
+$ avconv -f jack -i libav -y out.wav

 # Start the sample jack_metro readable client.
 $ jack_metro -b 120 -d 0.2 -f 4000
@@ -123,11 +123,11 @@ system:capture_1
 system:capture_2
 system:playback_1
 system:playback_2
-ffmpeg:input_1
+libav:input_1
 metro:120_bpm

-# Connect metro to the ffmpeg writable client.
-$ jack_connect metro:120_bpm ffmpeg:input_1
+# Connect metro to the avconv writable client.
+$ jack_connect metro:120_bpm libav:input_1
 @end example

 For more information read:
@@ -145,10 +145,10 @@ The filename to provide to the input device is the device node
 representing the OSS input device, and is usually set to
 @file{/dev/dsp}.

-For example to grab from @file{/dev/dsp} using @file{ffmpeg} use the
+For example to grab from @file{/dev/dsp} using @command{avconv} use the
 command:
 @example
-ffmpeg -f oss -i /dev/dsp /tmp/oss.wav
+avconv -f oss -i /dev/dsp /tmp/oss.wav
 @end example

 For more information about OSS see:
@@ -248,10 +248,10 @@ The filename to provide to the input device is the device node
 representing the sndio input device, and is usually set to
 @file{/dev/audio0}.

-For example to grab from @file{/dev/audio0} using @file{ffmpeg} use the
+For example to grab from @file{/dev/audio0} using @command{avconv} use the
 command:
 @example
-ffmpeg -f sndio -i /dev/audio0 /tmp/oss.wav
+avconv -f sndio -i /dev/audio0 /tmp/oss.wav
 @end example

 @section video4linux and video4linux2
@@ -290,7 +290,7 @@ avplay -f video4linux2 /dev/video0
 # Grab and record the input of a video4linux2 device, autoadjust size,
 # frame rate value defaults to 0/0 so it is read from the video4linux2
 # driver.
-ffmpeg -f video4linux2 -i /dev/video0 out.mpeg
+avconv -f video4linux2 -i /dev/video0 out.mpeg
 @end example

 @section vfwcap
@@ -326,12 +326,12 @@ Check the X11 documentation (e.g. man X) for more detailed information.
 Use the @file{dpyinfo} program for getting basic information about the
 properties of your X11 display (e.g. grep for "name" or "dimensions").

-For example to grab from @file{:0.0} using @file{ffmpeg}:
+For example to grab from @file{:0.0} using @command{avconv}:
 @example
-ffmpeg -f x11grab -r 25 -s cif -i :0.0 out.mpg
+avconv -f x11grab -r 25 -s cif -i :0.0 out.mpg

 # Grab at position 10,20.
-ffmpeg -f x11grab -r 25 -s cif -i :0.0+10,20 out.mpg
+avconv -f x11grab -r 25 -s cif -i :0.0+10,20 out.mpg
 @end example

 @subsection @var{follow_mouse} AVOption
@@ -348,10 +348,10 @@ zero) to the edge of region.

 For example:
 @example
-ffmpeg -f x11grab -follow_mouse centered -r 25 -s cif -i :0.0 out.mpg
+avconv -f x11grab -follow_mouse centered -r 25 -s cif -i :0.0 out.mpg

 # Follows only when the mouse pointer reaches within 100 pixels to edge
-ffmpeg -f x11grab -follow_mouse 100 -r 25 -s cif -i :0.0 out.mpg
+avconv -f x11grab -follow_mouse 100 -r 25 -s cif -i :0.0 out.mpg
 @end example

 @subsection @var{show_region} AVOption
@@ -367,10 +367,10 @@ being grabbed if only a portion of the screen is grabbed.

 For example:
 @example
-ffmpeg -f x11grab -show_region 1 -r 25 -s cif -i :0.0+10,20 out.mpg
+avconv -f x11grab -show_region 1 -r 25 -s cif -i :0.0+10,20 out.mpg

 # With follow_mouse
-ffmpeg -f x11grab -follow_mouse centered -show_region 1 -r 25 -s cif -i :0.0 out.mpg
+avconv -f x11grab -follow_mouse centered -show_region 1 -r 25 -s cif -i :0.0 out.mpg
 @end example

 @c man end INPUT DEVICES

@@ -35,20 +35,20 @@ CRC=0x@var{CRC}, where @var{CRC} is a hexadecimal number 0-padded to
 For example to compute the CRC of the input, and store it in the file
 @file{out.crc}:
 @example
-ffmpeg -i INPUT -f crc out.crc
+avconv -i INPUT -f crc out.crc
 @end example

 You can print the CRC to stdout with the command:
 @example
-ffmpeg -i INPUT -f crc -
+avconv -i INPUT -f crc -
 @end example

-You can select the output format of each frame with @file{ffmpeg} by
+You can select the output format of each frame with @command{avconv} by
 specifying the audio and video codec and format. For example to
 compute the CRC of the input audio converted to PCM unsigned 8-bit
 and the input video converted to MPEG-2 video, use the command:
 @example
-ffmpeg -i INPUT -acodec pcm_u8 -vcodec mpeg2video -f crc -
+avconv -i INPUT -c:a pcm_u8 -c:v mpeg2video -f crc -
 @end example

 See also the @ref{framecrc} muxer.
@@ -71,21 +71,21 @@ number 0-padded to 8 digits containing the CRC of the decoded frame.
 For example to compute the CRC of each decoded frame in the input, and
 store it in the file @file{out.crc}:
 @example
-ffmpeg -i INPUT -f framecrc out.crc
+avconv -i INPUT -f framecrc out.crc
 @end example

 You can print the CRC of each decoded frame to stdout with the command:
 @example
-ffmpeg -i INPUT -f framecrc -
+avconv -i INPUT -f framecrc -
 @end example

-You can select the output format of each frame with @file{ffmpeg} by
+You can select the output format of each frame with @command{avconv} by
 specifying the audio and video codec and format. For example, to
 compute the CRC of each decoded input audio frame converted to PCM
 unsigned 8-bit and of each decoded input video frame converted to
 MPEG-2 video, use the command:
 @example
-ffmpeg -i INPUT -acodec pcm_u8 -vcodec mpeg2video -f framecrc -
+avconv -i INPUT -c:a pcm_u8 -c:v mpeg2video -f framecrc -
 @end example

 See also the @ref{crc} muxer.
@@ -119,26 +119,26 @@ The pattern "img%%-%d.jpg" will specify a sequence of filenames of the
 form @file{img%-1.jpg}, @file{img%-2.jpg}, ..., @file{img%-10.jpg},
 etc.

-The following example shows how to use @file{ffmpeg} for creating a
+The following example shows how to use @command{avconv} for creating a
 sequence of files @file{img-001.jpeg}, @file{img-002.jpeg}, ...,
 taking one image every second from the input video:
 @example
-ffmpeg -i in.avi -r 1 -f image2 'img-%03d.jpeg'
+avconv -i in.avi -vsync 1 -r 1 -f image2 'img-%03d.jpeg'
 @end example

-Note that with @file{ffmpeg}, if the format is not specified with the
+Note that with @command{avconv}, if the format is not specified with the
 @code{-f} option and the output filename specifies an image file
 format, the image2 muxer is automatically selected, so the previous
 command can be written as:
 @example
-ffmpeg -i in.avi -r 1 'img-%03d.jpeg'
+avconv -i in.avi -vsync 1 -r 1 'img-%03d.jpeg'
 @end example

 Note also that the pattern must not necessarily contain "%d" or
 "%0@var{N}d", for example to create a single image file
 @file{img.jpeg} from the input video you can employ the command:
 @example
-ffmpeg -i in.avi -f image2 -vframes 1 img.jpeg
+avconv -i in.avi -f image2 -frames:v 1 img.jpeg
 @end example

 @section mpegts
@@ -171,7 +171,7 @@ and @code{service_name}. If they are not set the default for
 @code{service_name} is "Service01".

 @example
-ffmpeg -i file.mpg -acodec copy -vcodec copy \
+avconv -i file.mpg -c copy \
 -mpegts_original_network_id 0x1122 \
 -mpegts_transport_stream_id 0x3344 \
 -mpegts_service_id 0x5566 \
@@ -189,19 +189,19 @@ Null muxer.
 This muxer does not generate any output file, it is mainly useful for
 testing or benchmarking purposes.

-For example to benchmark decoding with @file{ffmpeg} you can use the
+For example to benchmark decoding with @command{avconv} you can use the
 command:
 @example
-ffmpeg -benchmark -i INPUT -f null out.null
+avconv -benchmark -i INPUT -f null out.null
 @end example

 Note that the above command does not read or write the @file{out.null}
-file, but specifying the output file is required by the @file{ffmpeg}
+file, but specifying the output file is required by the @command{avconv}
 syntax.

 Alternatively you can write the command as:
 @example
-ffmpeg -benchmark -i INPUT -f null -
+avconv -benchmark -i INPUT -f null -
 @end example

 @section matroska
@@ -264,7 +264,7 @@ Both eyes laced in one Block, Right-eye view is first

 For example a 3D WebM clip can be created using the following command line:
 @example
-ffmpeg -i sample_left_right_clip.mpg -an -vcodec libvpx -metadata STEREO_MODE=left_right -y stereo_clip.webm
+avconv -i sample_left_right_clip.mpg -an -c:v libvpx -metadata STEREO_MODE=left_right -y stereo_clip.webm
 @end example

 @c man end MUXERS

@@ -67,10 +67,10 @@ File access protocol.

 Allow to read from or read to a file.

-For example to read from a file @file{input.mpeg} with @file{ffmpeg}
+For example to read from a file @file{input.mpeg} with @command{avconv}
 use the command:
 @example
-ffmpeg -i file:input.mpeg output.mpeg
+avconv -i file:input.mpeg output.mpeg
 @end example

 The ff* tools default to the file protocol, that is a resource
@@ -109,10 +109,10 @@ be used to test muxers without writing an actual file.
 Some examples follow.
 @example
 # Write the MD5 hash of the encoded AVI file to the file output.avi.md5.
-ffmpeg -i input.flv -f avi -y md5:output.avi.md5
+avconv -i input.flv -f avi -y md5:output.avi.md5

 # Write the MD5 hash of the encoded AVI file to stdout.
-ffmpeg -i input.flv -f avi -y md5:
+avconv -i input.flv -f avi -y md5:
 @end example

 Note that some formats (typically MOV) require the output protocol to
@@ -134,18 +134,18 @@ pipe (e.g. 0 for stdin, 1 for stdout, 2 for stderr). If @var{number}
 is not specified, by default the stdout file descriptor will be used
 for writing, stdin for reading.

-For example to read from stdin with @file{ffmpeg}:
+For example to read from stdin with @command{avconv}:
 @example
-cat test.wav | ffmpeg -i pipe:0
+cat test.wav | avconv -i pipe:0
 # ...this is the same as...
-cat test.wav | ffmpeg -i pipe:
+cat test.wav | avconv -i pipe:
 @end example

-For writing to stdout with @file{ffmpeg}:
+For writing to stdout with @command{avconv}:
 @example
-ffmpeg -i test.wav -f avi pipe:1 | cat > test.avi
+avconv -i test.wav -f avi pipe:1 | cat > test.avi
 # ...this is the same as...
-ffmpeg -i test.wav -f avi pipe: | cat > test.avi
+avconv -i test.wav -f avi pipe: | cat > test.avi
 @end example

 Note that some formats (typically MOV), require the output protocol to
@@ -219,9 +219,9 @@ meaning as specified for the RTMP native protocol.
 See the librtmp manual page (man 3 librtmp) for more information.

 For example, to stream a file in real-time to an RTMP server using
-@file{ffmpeg}:
+@command{avconv}:
 @example
-ffmpeg -re -i myfile -f flv rtmp://myserver/live/mystream
+avconv -re -i myfile -f flv rtmp://myserver/live/mystream
 @end example

 To play the same stream using @file{avplay}:
@@ -249,7 +249,7 @@ The required syntax for a RTSP url is:
 rtsp://@var{hostname}[:@var{port}]/@var{path}
 @end example

-The following options (set on the @file{avconv}/@file{avplay} command
+The following options (set on the @command{avconv}/@file{avplay} command
 line, or set in code via @code{AVOption}s or in @code{avformat_open_input}),
 are supported:

@@ -310,7 +310,7 @@ avplay -rtsp_transport http rtsp://server/video.mp4
 To send a stream in realtime to a RTSP server, for others to watch:

 @example
-ffmpeg -re -i @var{input} -f rtsp -muxdelay 0.1 rtsp://server/live.sdp
+avconv -re -i @var{input} -f rtsp -muxdelay 0.1 rtsp://server/live.sdp
 @end example

 @section sap
@@ -362,19 +362,19 @@ Example command lines follow.
 To broadcast a stream on the local subnet, for watching in VLC:

 @example
-ffmpeg -re -i @var{input} -f sap sap://224.0.0.255?same_port=1
+avconv -re -i @var{input} -f sap sap://224.0.0.255?same_port=1
 @end example

 Similarly, for watching in avplay:

 @example
-ffmpeg -re -i @var{input} -f sap sap://224.0.0.255
+avconv -re -i @var{input} -f sap sap://224.0.0.255
 @end example

 And for watching in avplay, over IPv6:

 @example
-ffmpeg -re -i @var{input} -f sap sap://[ff0e::1:2:3:4]
+avconv -re -i @var{input} -f sap sap://[ff0e::1:2:3:4]
 @end example

 @subsection Demuxer
@@ -420,7 +420,7 @@ tcp://@var{hostname}:@var{port}[?@var{options}]
 Listen for an incoming connection

 @example
-ffmpeg -i @var{input} -f @var{format} tcp://@var{hostname}:@var{port}?listen
+avconv -i @var{input} -f @var{format} tcp://@var{hostname}:@var{port}?listen
 avplay tcp://@var{hostname}:@var{port}
 @end example

@@ -472,21 +472,21 @@ For receiving, this gives the benefit of only receiving packets from
 the specified peer address/port.
 @end table

-Some usage examples of the udp protocol with @file{ffmpeg} follow.
+Some usage examples of the udp protocol with @command{avconv} follow.

 To stream over UDP to a remote endpoint:
 @example
-ffmpeg -i @var{input} -f @var{format} udp://@var{hostname}:@var{port}
+avconv -i @var{input} -f @var{format} udp://@var{hostname}:@var{port}
 @end example

 To stream in mpegts format over UDP using 188 sized UDP packets, using a large input buffer:
 @example
-ffmpeg -i @var{input} -f mpegts udp://@var{hostname}:@var{port}?pkt_size=188&buffer_size=65535
+avconv -i @var{input} -f mpegts udp://@var{hostname}:@var{port}?pkt_size=188&buffer_size=65535
 @end example

 To receive over UDP from a remote endpoint:
 @example
-ffmpeg -i udp://[@var{multicast-address}]:@var{port}
+avconv -i udp://[@var{multicast-address}]:@var{port}
 @end example

 @c man end PROTOCOLS