diff --git a/doc/Makefile b/doc/Makefile index 034d4921b4..ac344908e8 100644 --- a/doc/Makefile +++ b/doc/Makefile @@ -1,4 +1,7 @@ all: ffmpeg-doc.html faq.html ffserver-doc.html hooks.html -%.html: %.texi +%.html: %.texi Makefile texi2html -monolithic -number $< + +clean: + rm -f *.html diff --git a/doc/faq.html b/doc/faq.html deleted file mode 100644 index 2d5b95b89b..0000000000 --- a/doc/faq.html +++ /dev/null @@ -1,148 +0,0 @@ - -
- - --
-
- -
-FFmpeg FAQ - - - - -
-FFmpeg development is currently concentrated on codec and format
-handling. Recent development broke ffserver, so don't expect it to work
-correctly. It is planned to be fixed as soon as possible.
-
-
-Even if ffmpeg can read the file format, it may not support all its -codecs. Please consult the supported codec list in the ffmpeg -documentation. - - - - -
-Currently, the grabbing code does not handle synchronisation
-correctly. You are welcome to fix it; it is planned to be fixed as
-soon as possible.
-
-
-If the JPEGs are named img1.jpg, img2.jpg, img3.jpg, ..., use:
-
- ffmpeg -i img%d.jpg /tmp/a.mpg -- -
-`%d' is replaced by the image number. - - -
-`img%03d.jpg' generates `img001.jpg', `img002.jpg', etc... - - -
-The same system is used for the other image formats. - - - - -
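-For example, if the images are zero-padded and named img001.jpg,
-img002.jpg, etc., the corresponding command (with an illustrative
-output name) would be:
-
- ffmpeg -i img%03d.jpg /tmp/a.mpg
-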
-No. FFmpeg only supports open source codecs. Windows DLLs are not
-portable, and they are bloated and often slow.
-
-
-Use `-' as filename. - - - - -
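-As a rough sketch (the file names and the mp2 output format here are
-only illustrative), you could pipe audio through ffmpeg like this:
-
- cat /tmp/test.wav | ffmpeg -i - -f mp2 - > /tmp/out.mp2
-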
-No. Only GCC is supported. GCC is ported to most available systems and
-I don't see the need to pollute the source code with compiler-specific
-#ifdefs.
-
-
-
-
-
-No. Use mingw-gcc, available at http://www.mingw.org/, to
-compile the code. It generates object files fully compatible with
-other Windows compilers.
-
-
-
-
-
-No. These tools are too bloated and they complicate the build. Moreover,
-since only `gcc' is supported, they would add little advantage in
-terms of portability.
-
-This document was generated on 28 December 2002 using -texi2html 1.56k. - - diff --git a/doc/ffmpeg-doc.html b/doc/ffmpeg-doc.html deleted file mode 100644 index faffc41be2..0000000000 --- a/doc/ffmpeg-doc.html +++ /dev/null @@ -1,910 +0,0 @@ - -
- - --
-
- -
-FFmpeg Documentation - - - - -
-FFmpeg is a very fast video and audio converter. It can also grab from
-a live audio/video source.
-
-The command line interface is designed to be intuitive, in the sense
-that ffmpeg tries to figure out all the parameters when
-possible. You usually only have to specify the target bitrate you want.
-
-FFmpeg can also convert from any sample rate to any other, and resize -video on the fly with a high quality polyphase filter. - - - - -
- FFmpeg can use a video4linux compatible video source and any Open Sound - System audio source: - -
- ffmpeg /tmp/out.mpg -- -
- Note that you must activate the right video source and channel before
- launching ffmpeg. You can use any TV viewer, such as xawtv
- (http://bytesex.org/xawtv/) by Gerd Knorr, which I find very
- good. You must also set the audio recording levels correctly with a
- standard mixer.
-
-* ffmpeg can use any supported file format and protocol as input: - - -
-Examples: - - -
-* You can input from YUV files: - - - -
- ffmpeg -i /tmp/test%d.Y /tmp/out.mpg -- -
- It will use the files: - -
- /tmp/test0.Y, /tmp/test0.U, /tmp/test0.V, - /tmp/test1.Y, /tmp/test1.U, /tmp/test1.V, etc... -- -
- The Y files have twice the resolution of the U and V files. They are
- raw files, without a header. They can be generated by all decent video
- decoders. You must specify the size of the image with the '-s' option
- if ffmpeg cannot guess it, as in the example below.
-
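- For example, assuming CIF-sized input (the size and file names here
- are illustrative):
-
- ffmpeg -s 352x288 -i /tmp/test%d.Y /tmp/out.mpg
-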
-* You can input from a RAW YUV420P file: - - - -
- ffmpeg -i /tmp/test.yuv /tmp/out.avi -- -
- A raw YUV420P file contains raw planar YUV data: for each frame, the
- Y plane comes first, followed by the U and V planes, which have half
- the vertical and horizontal resolution.
-
-* You can output to a RAW YUV420P file: - - - -
- ffmpeg -i mydivx.avi hugefile.yuv
-
-* You can set several input files and output files: - - - -
- ffmpeg -i /tmp/a.wav -s 640x480 -i /tmp/a.yuv /tmp/a.mpg -- -
- Converts the audio file a.wav and the raw YUV video file a.yuv
- into the MPEG file a.mpg.
-
-* You can also do audio and video conversions at the same time: - - - -
- ffmpeg -i /tmp/a.wav -ar 22050 /tmp/a.mp2 -- -
- Convert the sample rate of a.wav to 22050 Hz and encode it to MPEG audio. - - -
-* You can encode to several formats at the same time and define a
-  mapping from input streams to output streams:
-
- ffmpeg -i /tmp/a.wav -ab 64 /tmp/a.mp2 -ab 128 /tmp/b.mp2 -map 0:0 -map 0:0 -- -
- Converts a.wav to a.mp2 at 64 kbits and to b.mp2 at 128 kbits. '-map
- file:index' specifies which input stream is used for each output
- stream, in the order of the definition of the output streams.
-
-* You can transcode decrypted VOBs - - - -
- ffmpeg -i snatch_1.vob -f avi -vcodec mpeg4 -b 800 -g 300 -bf 2 -acodec mp3 -ab 128 snatch.avi -- -
- This is a typical DVD ripping example: the input is a VOB file and the
- output an AVI file with MPEG-4 video and MP3 audio. Note that in this
- command we use B frames, so the MPEG-4 stream is DivX5 compatible. The
- GOP size is 300, which means one INTRA frame every 10 seconds for
- 29.97 fps input video. Furthermore, the audio stream is MP3 encoded,
- so you need LAME support, which is enabled using --enable-mp3lame when
- configuring. The mapping is particularly useful for DVD transcoding
- to get the desired audio language, as sketched below.
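-
- As a sketch, assuming the VOB carries several audio streams and that
- input stream 2 happens to be the desired language, the mapping could
- look like this:
-
- ffmpeg -i snatch_1.vob -map 0:0 -map 0:2 -f avi -vcodec mpeg4 -b 800 -acodec mp3 -ab 128 snatch.avi
-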
-
-
-
- NOTE: to see the supported input formats, use 'ffmpeg -formats'.
-
-
-
-
-
- The generic syntax is: - - - -
- ffmpeg [[options][-i input_file]]... {[options] output_file}... -- -
- If no input file is given, audio/video grabbing is done. - - -
- As a general rule, options are applied to the next specified
- file. For example, if you give the '-b 64' option, it sets the video
- bitrate of the next file (see the example below). The format option
- may be needed for raw input files.
-
- By default, ffmpeg tries to convert as losslessly as possible: it
- uses the same audio and video parameters for the outputs as the ones
- specified for the inputs.
-
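- For example, to set the video bitrate of the output file (a sketch
- with illustrative file names):
-
- ffmpeg -i /tmp/a.avi -b 1000 /tmp/a.mpg
-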
-The hh:mm:ss[.xxx] syntax is also supported.
-
-
--The filename can be `-' to read from the standard input or to write -to the standard output. - - -
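-For instance, a hypothetical pipeline reading an MPEG stream from
-standard input (the file names here are illustrative):
-
- ffmpeg -f mpeg -i - /tmp/out.avi < /tmp/test.mpg
-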
-ffmpeg also handles many protocols specified with URL syntax.
-
- Use 'ffmpeg -formats' to have a list of the supported protocols. - - -
- The http: protocol is currently used only to communicate with
- ffserver (see the ffserver documentation). When ffmpeg becomes a
- video player, it will also be used for streaming :-)
-
-
-
-
-
- ffmpeg -g 3 -r 3 -t 10 -b 50 -s qcif -f rv10 /tmp/b.rm -- -
-You can use the '-formats' option to have an exhaustive list.
-
-
-
-
-
-FFmpeg supports the following file formats through the libavformat
-library:
-
-
-
-Supported File Format        | Encoding | Decoding | Comments
-MPEG audio                   |    X     |    X     |
-MPEG1 systems                |    X     |    X     | muxed audio and video
-MPEG2 PS                     |    X     |    X     | also known as VOB file
-MPEG2 TS                     |          |    X     | also known as DVB Transport Stream
-ASF                          |    X     |    X     |
-AVI                          |    X     |    X     |
-WAV                          |    X     |    X     |
-Macromedia Flash             |    X     |    X     | Only embedded audio is decoded
-Real Audio and Video         |    X     |    X     |
-Raw AC3                      |    X     |    X     |
-Raw MJPEG                    |    X     |    X     |
-Raw MPEG video               |    X     |    X     |
-Raw PCM8/16 bits, mulaw/Alaw |    X     |    X     |
-SUN AU format                |    X     |    X     |
-Quicktime                    |          |    X     |
-MPEG4                        |          |    X     | MPEG4 is a variant of Quicktime
-Raw MPEG4 video              |    X     |    X     |
-DV                           |          |    X     | Only the video track is decoded.
-X means that the encoding (resp. decoding) is supported.
-
-
-
-
-
-FFmpeg can read and write images for each frame of a video sequence. The -following image formats are supported: - - -
-Supported Image Format | Encoding | Decoding | Comments
-PGM, PPM               |    X     |    X     |
-PGMYUV                 |    X     |    X     | PGM with U and V components in 420
-JPEG                   |    X     |    X     | Progressive JPEG is not supported
-.Y.U.V                 |    X     |    X     | One raw file per component
-Animated GIF           |    X     |          | Only uncompressed GIFs are generated
-X means that the encoding (resp. decoding) is supported.
-
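-For example, the frames of a movie could be dumped as PPM images and
-re-assembled later (a sketch; the file names are illustrative):
-
- ffmpeg -i /tmp/a.mpg /tmp/img%d.ppm
- ffmpeg -i /tmp/img%d.ppm /tmp/b.mpg
-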
-
-
-
-
-Supported Codec | Encoding | Decoding | Comments
-MPEG1 video     |    X     |    X     |
-MPEG2 video     |          |    X     |
-MPEG4           |    X     |    X     | Also known as DIVX4/5
-MSMPEG4 V1      |    X     |    X     |
-MSMPEG4 V2      |    X     |    X     |
-MSMPEG4 V3      |    X     |    X     | Also known as DIVX3
-WMV7            |    X     |    X     |
-H263(+)         |    X     |    X     | Also known as Real Video 1.0
-MJPEG           |    X     |    X     |
-DV              |          |    X     |
-Huff YUV        |    X     |    X     |
-X means that the encoding (resp. decoding) is supported.
-
-
-
-Check http://www.mplayerhq.hu/~michael/codec-features.html for a
-precise comparison of the FFmpeg MPEG4 codec with the other
-solutions.
-
-Supported Codec      | Encoding | Decoding | Comments
-MPEG audio layer 2   |    IX    |    IX    |
-MPEG audio layer 1/3 |    IX    |    IX    | MP3 encoding is supported through the external library LAME
-AC3                  |    IX    |    X     | liba52 is used internally for decoding
-Vorbis               |    X     |    X     | supported through the external library libvorbis
-WMA V1/V2            |          |    X     |
-X means that the encoding (resp. decoding) is supported.
-
-
-
-I means that an integer-only version is also available (it ensures the
-highest performance on systems without hardware floating point
-support).
-
-
-
-
-
-ffmpeg should be compiled with at least GCC 2.95.3. GCC 3.2 is now the
-preferred compiler for ffmpeg. All future optimizations will depend on
-features only found in GCC 3.2.
-
-The configure script should guess the configuration itself.
-Networking support is currently not finished.
-errno issues were fixed by Andrew Bachmann.
-
-Old stuff: - - -
-François Revol - revol at free dot fr - April 2002 - - -
-The configure script should guess the configuration itself;
-however, I still have not tested building on the net_server version
-of BeOS.
-
-ffserver is broken (needs poll() implementation). - - -
-There are still issues with errno codes, which are negative in BeOS
-and which ffmpeg negates when returning. This ends up turning errors
-into valid results, and then crashes.
-(To be fixed)
-
-You can integrate all the source code of the libraries to link them
-statically and avoid any version problems. All you need to do is
-provide a 'config.mak' and a 'config.h' in the parent directory. See
-the defines generated by ./configure to understand what is needed.
-
-You can use libavcodec or libavformat in your commercial program, but -any patch you make must be published. The best way to proceed is -to send your patches to the ffmpeg mailing list. - - - - -
-ffmpeg is programmed in ANSI C. GCC extensions are
-tolerated. The indent size is 4. The TAB character should not be used.
-
-The presentation is the one specified by 'indent -i4 -kr'. - - -
-The main priorities in ffmpeg are simplicity and small code size
-(= fewer bugs).
-
-Comments: for functions visible from other modules, use the JavaDoc
-format (see examples in `libav/utils.c') so that documentation
-can be generated automatically.
-
-When you submit your patch, try to send a unified diff (diff '-u' -option). I cannot read other diffs :-) - - -
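-A minimal sketch, assuming you work from a CVS checkout of ffmpeg:
-
- cvs diff -u > mypatch.diff
-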
-Run the regression tests before submitting a patch so that you can -verify that there are no big problems. - - -
-Patches should be posted as base64-encoded attachments (or any other
-encoding which ensures that the patch won't be mangled during
-transmission) to the ffmpeg-devel mailing list; see
-http://lists.sourceforge.net/lists/listinfo/ffmpeg-devel
-
-Before submitting a patch (or committing with CVS), you should at least -test that you did not break anything. - - -
-The regression tests build a synthetic video stream and a synthetic
-audio stream. These are then encoded and decoded with all codecs and
-formats. The CRC (or MD5) of each generated file is recorded in a
-result file, and a 'diff' is then run against the reference results
-and the result file.
-
-The regression test then goes on to test the ffserver code with a -limited set of streams. It is important that this step runs correctly -as well. - - -
-Run 'make test' to test all the codecs. - - -
-Run 'make libavtest' to test all the formats.
-
-[Of course, some patches may change the regression test results. In
-this case, the reference results of the regression tests shall be
-updated accordingly.]
-
-This document was generated on 17 April 2003 using -texi2html 1.56k. - - diff --git a/doc/ffserver-doc.html b/doc/ffserver-doc.html deleted file mode 100644 index 2356264b11..0000000000 --- a/doc/ffserver-doc.html +++ /dev/null @@ -1,312 +0,0 @@ - -
- - --
-
- -
-FFserver Documentation - - - - -
-FFserver is a streaming server for both audio and video. It supports -several live feeds, streaming from files and time shifting on live feeds -(you can seek to positions in the past on each live feed, provided you -specify a big enough feed storage in ffserver.conf). - - -
-This documentation covers only the streaming aspects of ffserver /
-ffmpeg. Questions about ffmpeg parameters, codecs,
-etc. are not covered here. Read `ffmpeg-doc.[texi|html]' for more
-information.
-
-[Contributed by Philip Gladstone, philip-ffserver at gladstonefamily dot net] - - - - -
-When properly configured and running, you can capture video and audio in real -time from a suitable capture card, and stream it out over the Internet to -either Windows Media Player or RealAudio player (with some restrictions). - - -
-It can also stream from files, though that is currently broken. Very often, a -web server can be used to serve up the files just as well. - - -
-It can stream prerecorded video from .ffm files, though it is somewhat tricky -to make it work correctly. - - - - -
-I use Linux on a 900 MHz Duron with a cheapo Bt848-based TV capture
-card. I'm using stock Linux 2.4.17 with the stock drivers. [Actually
-that isn't true; I needed some special drivers for my motherboard-based
-sound card.]
-
-I understand that FreeBSD systems work just fine as well. - - - - -
-First, build the kit. It *really* helps to have installed LAME first.
-Then when you run ./configure, make sure that you have the
-'--enable-mp3lame' flag turned on, as shown below.
-
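-The build could then look something like this (assuming LAME is
-already installed in a standard location):
-
- ./configure --enable-mp3lame
- make
-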
-LAME is important as it allows streaming of audio to Windows Media Player. Don't -ask why the other audio types do not work. - - -
-As a simple test, just run the following two command lines (assuming that you -have a V4L video capture card): - - - -
-./ffserver -f doc/ffserver.conf & -./ffmpeg http://localhost:8090/feed1.ffm -- -
-At this point you should be able to go to your Windows machine and
-fire up Windows Media Player (WMP). Go to Open URL and enter
-
- http://<linuxbox>:8090/test.asf -- -
-You should see (after a short delay) video and hear audio. - - -
-WARNING: trying to stream test1.mpg doesn't work with WMP, as it tries
-to transfer the entire file before starting to play. The same is true
-of AVI files.
-
-You should edit the ffserver.conf file to suit your needs (in terms of -frame rates etc). Then install ffserver and ffmpeg, write a script to start -them up, and off you go. - - - - -
-Maybe you didn't install LAME, or didn't get your ./configure invocation
-right. Check the ffmpeg output to see if a line referring to mp3 is
-present. If not, then your configuration was incorrect. If it is, then
-maybe your wiring is not set up correctly. Maybe the sound card is not
-getting data from the right input source. Maybe you have a really awful
-audio interface (like I do) that only captures in stereo and also
-requires that one channel be flipped. If you are one of these people,
-then export 'AUDIO_FLIP_LEFT=1' before starting ffmpeg.
-
-Yes, they do. - - - - -
-Yes, it does. Who knows why? - - - - -
-Yes, it does. Any thoughts on this would be gratefully received. These -differences extend to embedding WMP into a web page. [There are two -different object ids that you can use, one of them -- the old one -- cannot -play very well, and the new one works well (both on the same system). However, -I suspect that the new one is not available unless you have installed WMP 7]. - - - - -
-You can replay video from .ffm files that were recorded earlier.
-However, there are a number of caveats, including the fact that the
-ffserver parameters must match the original parameters used to record
-the file. If they do not, then ffserver deletes the file before
-recording into it. (Now that I write this, it seems broken.)
-
-You can fiddle with many of the codec choices and encoding parameters, and -there are a bunch more parameters that you cannot control. Post a message -to the mailing list if there are some 'must have' parameters. Look in the -ffserver.conf for a list of the currently available controls. - - -
-It will automatically generate the .ASX or .RAM files that are often used -in browsers. These files are actually redirections to the underlying .ASF -or .RM file. The reason for this is that the browser often fetches the -entire file before starting up the external viewer. The redirection files -are very small and can be transferred quickly. [The stream itself is -often 'infinite' and thus the browser tries to download it and never -finishes.] - - - - -
-* When you connect to a live stream, most players (WMP, RA etc) want to -buffer a certain number of seconds of material so that they can display the -signal continuously. However, ffserver (by default) starts sending data -in real time. This means that there is a pause of a few seconds while the -buffering is being done by the player. The good news is that this can be -cured by adding a '?buffer=5' to the end of the URL. This says that the -stream should start 5 seconds in the past -- and so the first 5 seconds -of the stream is sent as fast as the network will allow. It will then -slow down to real time. This noticeably improves the startup experience. - - -
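-For example, reusing the test stream from above:
-
- http://<linuxbox>:8090/test.asf?buffer=5
-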
-You can also add a 'Preroll 15' statement to ffserver.conf, which will
-add 15 seconds of prebuffering to all requests that do not otherwise
-specify a time. In addition, ffserver will skip frames until a key
-frame is found. This further reduces the startup delay by not
-transferring data that will be discarded.
-
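-A sketch of what this could look like in ffserver.conf (the stream
-name, feed and format here are illustrative):
-
- <Stream test.asf>
- Feed feed1.ffm
- Format asf
- Preroll 15
- </Stream>
-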
-* You may want to adjust the MaxBandwidth in the ffserver.conf to limit -the amount of bandwidth consumed by live streams. - - - - -
-It turns out that (on my machine at least) the number of frames
-successfully grabbed is marginally less than the number that ought to
-be grabbed. As a result, the timestamp in the encoded data stream falls
-behind real time, so if you say 'preroll 10', then when the stream gets
-10 or more seconds behind, there is no preroll left.
-
-Fixing this requires a change in the internals in how timestamps are -handled. - - - - -
-Does the '?date=' stuff work?
-Yes (subject to the caution above). Also note that whenever you start
-ffserver, it deletes the ffm file (if any parameters have changed),
-thus wiping out what you had recorded before.
-
-The format of the ?date=xxxxxx parameter is fairly flexible. You
-should use one of the following formats (the 'T' is literal):
-
-
-
-
-* YYYY-MM-DDTHH:MM:SS (localtime) -* YYYY-MM-DDTHH:MM:SSZ (UTC) -- -
-You can omit the YYYY-MM-DD, in which case it refers to the current
-day. However, note that `?date=16:00:00' refers to 4 PM on the current
-day -- this may be in the future and so is unlikely to be useful.
-
-You use this by adding the ?date= to the end of the URL for the stream. -For example: `http://localhost:8080/test.asf?date=2002-07-26T23:05:00'. - - -
-This document was generated on 28 December 2002 using -texi2html 1.56k. - - diff --git a/doc/hooks.html b/doc/hooks.html deleted file mode 100644 index d18376f283..0000000000 --- a/doc/hooks.html +++ /dev/null @@ -1,90 +0,0 @@ - -
- - --
-
- -
-Video Hook Documentation - - - - -
-The video hook functionality is designed (mostly) for live video. It allows -the video to be modified or examined between the decoder and the encoder. - - -
-Any number of hook modules can be placed inline, and they are run in the -order that they were specified on the ffmpeg command line. - - -
-Three modules are provided and are described below. They are all intended to -be used as a base for your own modules. - - -
-Modules are loaded using the -vhook option to ffmpeg. The value of this
-parameter is a space-separated list of arguments. The first is the
-module name, and the rest are passed as arguments to the Configure
-function of the module.
-
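-As a sketch, the null module described below could be loaded like this
-(the module path is an assumption and depends on where the hooks were
-built):
-
- ffmpeg -i /tmp/a.mpg -vhook /path/to/null.so /tmp/b.mpg
-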
-This does nothing. Actually it converts the input image to RGB24 and then converts -it back again. This is meant as a sample that you can use to test your setup. - - - - -
-This implements a 'fish detector'. Essentially it converts the image into HSV -space and tests whether more than a certain percentage of the pixels fall into -a specific HSV cuboid. If so, then the image is saved into a file for processing -by other bits of code. - - -
-Why use HSV? It turns out that HSV cuboids represent a more compact range of -colors than would an RGB cuboid. - - - - -
-This allows a caption to be placed onto each frame. It supports inserting the -time and date. By using the imlib functions, it would be easy to add your own -graphical logo, add a frame/border, etc. - - -
-This document was generated on 28 December 2002 using -texi2html 1.56k. - -