@chapter Protocols
@c man begin PROTOCOLS

Protocols are configured elements in Libav that allow access to
resources which require the use of a particular protocol.

When you configure your Libav build, all the supported protocols are
enabled by default. You can list all available ones using the
configure option "--list-protocols".

You can disable all the protocols using the configure option
"--disable-protocols", selectively enable a protocol using the
option "--enable-protocol=@var{PROTOCOL}", or disable a
particular protocol using the option
"--disable-protocol=@var{PROTOCOL}".

The option "-protocols" of the ff* tools will display the list of
supported protocols.

A description of the currently available protocols follows.

@section applehttp

Read Apple HTTP Live Streaming compliant segmented streams as
a uniform one. The M3U8 playlists describing the segments can be
remote HTTP resources or local files, accessed using the standard
file protocol.
HTTP is the default; a specific protocol can be selected by appending
"+@var{proto}" to the applehttp URI scheme name, where @var{proto}
is either "file" or "http".

@example
applehttp://host/path/to/remote/resource.m3u8
applehttp+http://host/path/to/remote/resource.m3u8
applehttp+file://path/to/local/resource.m3u8
@end example
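
For example, to play a remote HLS stream with @file{ffplay} (the host
and path here are placeholders):

@example
ffplay applehttp://host/path/to/remote/resource.m3u8
@end example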

@section concat

Physical concatenation protocol.

Allows reading and seeking through many resources in sequence as if
they were a single resource.

A URL accepted by this protocol has the syntax:
@example
concat:@var{URL1}|@var{URL2}|...|@var{URLN}
@end example

where @var{URL1}, @var{URL2}, ..., @var{URLN} are the URLs of the
resources to be concatenated, each one possibly specifying a distinct
protocol.

For example to read a sequence of files @file{split1.mpeg},
@file{split2.mpeg}, @file{split3.mpeg} with @file{ffplay} use the
command:
@example
ffplay concat:split1.mpeg\|split2.mpeg\|split3.mpeg
@end example

Note that you may need to escape the character "|" which is special for
many shells.
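
Alternatively, in Bourne-compatible shells the whole URL can be quoted
instead of escaping each "|":

@example
ffplay "concat:split1.mpeg|split2.mpeg|split3.mpeg"
@end example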

@section file

File access protocol.

Allows reading from or writing to a file.

For example to read from a file @file{input.mpeg} with @file{ffmpeg}
use the command:
@example
ffmpeg -i file:input.mpeg output.mpeg
@end example

The ff* tools default to the file protocol, that is a resource
specified with the name "FILE.mpeg" is interpreted as the URL
"file:FILE.mpeg".

@section gopher

Gopher protocol.

@section http

HTTP (Hyper Text Transfer Protocol).
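
For example, to read a remote resource over HTTP with @file{ffplay}
(the server and path here are placeholders):

@example
ffplay http://server/path/to/resource.mpeg
@end example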

@section mmst

MMS (Microsoft Media Server) protocol over TCP.

@section mmsh

MMS (Microsoft Media Server) protocol over HTTP.

The required syntax is:
@example
mmsh://@var{server}[:@var{port}][/@var{app}][/@var{playpath}]
@end example
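
For example, to play a stream with @file{ffplay} (the server, app and
playpath here are placeholders):

@example
ffplay mmsh://server/app/stream
@end example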

@section md5

MD5 output protocol.

Computes the MD5 hash of the data to be written, and on close writes
this to the designated output or stdout if none is specified. It can
be used to test muxers without writing an actual file.

Some examples follow.
@example
# Write the MD5 hash of the encoded AVI file to the file output.avi.md5.
ffmpeg -i input.flv -f avi -y md5:output.avi.md5

# Write the MD5 hash of the encoded AVI file to stdout.
ffmpeg -i input.flv -f avi -y md5:
@end example

Note that some formats (typically MOV) require the output protocol to
be seekable, so they will fail with the MD5 output protocol.

@section pipe

UNIX pipe access protocol.

Allows reading from and writing to UNIX pipes.

The accepted syntax is:
@example
pipe:[@var{number}]
@end example

@var{number} is the number corresponding to the file descriptor of the
pipe (e.g. 0 for stdin, 1 for stdout, 2 for stderr).  If @var{number}
is not specified, by default the stdout file descriptor will be used
for writing, stdin for reading.

For example to read from stdin with @file{ffmpeg}:
@example
cat test.wav | ffmpeg -i pipe:0
# ...this is the same as...
cat test.wav | ffmpeg -i pipe:
@end example

For writing to stdout with @file{ffmpeg}:
@example
ffmpeg -i test.wav -f avi pipe:1 | cat > test.avi
# ...this is the same as...
ffmpeg -i test.wav -f avi pipe: | cat > test.avi
@end example

Note that some formats (typically MOV) require the output protocol to
be seekable, so they will fail with the pipe output protocol.

@section rtmp

Real-Time Messaging Protocol.

The Real-Time Messaging Protocol (RTMP) is used for streaming
multimedia content across a TCP/IP network.

The required syntax is:
@example
rtmp://@var{server}[:@var{port}][/@var{app}][/@var{playpath}]
@end example

The accepted parameters are:
@table @option

@item server
The address of the RTMP server.

@item port
The number of the TCP port to use (1935 by default).

@item app
The name of the application to access. It usually corresponds to
the path where the application is installed on the RTMP server
(e.g. @file{/ondemand/}, @file{/flash/live/}, etc.).

@item playpath
The path or name of the resource to play, relative to the
application specified in @var{app}; it may be prefixed by "mp4:".

@end table

For example to read with @file{ffplay} a multimedia resource named
"sample" from the application "vod" from an RTMP server "myserver":
@example
ffplay rtmp://myserver/vod/sample
@end example

@section rtmp, rtmpe, rtmps, rtmpt, rtmpte

Real-Time Messaging Protocol and its variants supported through
librtmp.

Requires the presence of the librtmp headers and library during
configuration. You need to explicitly configure the build with
"--enable-librtmp". If enabled this will replace the native RTMP
protocol.

This protocol provides most client functions and a few server
functions needed to support RTMP, RTMP tunneled in HTTP (RTMPT),
encrypted RTMP (RTMPE), RTMP over SSL/TLS (RTMPS) and tunneled
variants of these encrypted types (RTMPTE, RTMPTS).

The required syntax is:
@example
@var{rtmp_proto}://@var{server}[:@var{port}][/@var{app}][/@var{playpath}] @var{options}
@end example

where @var{rtmp_proto} is one of the strings "rtmp", "rtmpt", "rtmpe",
"rtmps", "rtmpte", "rtmpts" corresponding to each RTMP variant, and
@var{server}, @var{port}, @var{app} and @var{playpath} have the same
meaning as specified for the RTMP native protocol.
@var{options} contains a list of space-separated options of the form
@var{key}=@var{val}.

See the librtmp manual page (man 3 librtmp) for more information.

For example, to stream a file in real-time to an RTMP server using
@file{ffmpeg}:
@example
ffmpeg -re -i myfile -f flv rtmp://myserver/live/mystream
@end example

To play the same stream using @file{ffplay}:
@example
ffplay "rtmp://myserver/live/mystream live=1"
@end example

@section rtp

Real-time Transport Protocol.

@section rtsp

RTSP is not technically a protocol handler in libavformat, it is a demuxer
and muxer. The demuxer supports both normal RTSP (with data transferred
over RTP; this is used by e.g. Apple and Microsoft) and Real-RTSP (with
data transferred over RDT).

The muxer can be used to send a stream using RTSP ANNOUNCE to a server
supporting it (currently Darwin Streaming Server and Mischa Spiegelmock's
RTSP server, @url{http://github.com/revmischa/rtsp-server}).

The required syntax for an RTSP URL is:
@example
rtsp://@var{hostname}[:@var{port}]/@var{path}[?@var{options}]
@end example

@var{options} is a @code{&}-separated list. The following options
are supported:

@table @option

@item udp
Use UDP as lower transport protocol.

@item tcp
Use TCP (interleaving within the RTSP control channel) as lower
transport protocol.

@item multicast
Use UDP multicast as lower transport protocol.

@item http
Use HTTP tunneling as lower transport protocol, which is useful for
passing through proxies.

@item filter_src
Accept packets only from the negotiated peer address and port.
@end table

Multiple lower transport protocols may be specified; in that case they are
tried one at a time (if the setup of one fails, the next one is tried).
For the muxer, only the @code{tcp} and @code{udp} options are supported.

When receiving data over UDP, the demuxer tries to reorder received packets
(since they may arrive out of order, or packets may get lost entirely). In
order for this to be enabled, a maximum delay must be specified in the
@code{max_delay} field of AVFormatContext.

When watching multi-bitrate Real-RTSP streams with @file{ffplay}, the
streams to display can be chosen with @code{-vst} @var{n} and
@code{-ast} @var{n} for video and audio respectively, and can be switched
on the fly by pressing @code{v} and @code{a}.

Example command lines:

To watch a stream over UDP, with a max reordering delay of 0.5 seconds:

@example
ffplay -max_delay 500000 rtsp://server/video.mp4?udp
@end example

To watch a stream tunneled over HTTP:

@example
ffplay rtsp://server/video.mp4?http
@end example

To send a stream in realtime to an RTSP server, for others to watch:

@example
ffmpeg -re -i @var{input} -f rtsp -muxdelay 0.1 rtsp://server/live.sdp
@end example

@section sap

Session Announcement Protocol (RFC 2974). This is not technically a
protocol handler in libavformat, it is a muxer and demuxer.
It is used for signalling of RTP streams, by announcing the SDP for the
streams regularly on a separate port.

@subsection Muxer

The syntax for a SAP URL given to the muxer is:
@example
sap://@var{destination}[:@var{port}][?@var{options}]
@end example

The RTP packets are sent to @var{destination} on port @var{port},
or to port 5004 if no port is specified.
@var{options} is a @code{&}-separated list. The following options
are supported:

@table @option

@item announce_addr=@var{address}
Specify the destination IP address for sending the announcements to.
If omitted, the announcements are sent to the commonly used SAP
announcement multicast address 224.2.127.254 (sap.mcast.net), or
ff0e::2:7ffe if @var{destination} is an IPv6 address.

@item announce_port=@var{port}
Specify the port to send the announcements on; defaults to
9875 if not specified.

@item ttl=@var{ttl}
Specify the time to live value for the announcements and RTP packets;
defaults to 255.

@item same_port=@var{0|1}
If set to 1, send all RTP streams on the same port pair. If zero (the
default), all streams are sent on unique ports, with each stream on a
port number two higher than the previous.
VLC/Live555 requires this to be set to 1, to be able to receive the stream.
The RTP stack in libavformat for receiving requires all streams to be sent
on unique ports.
@end table

Example command lines follow.

To broadcast a stream on the local subnet, for watching in VLC:

@example
ffmpeg -re -i @var{input} -f sap sap://224.0.0.255?same_port=1
@end example

Similarly, for watching in ffplay:

@example
ffmpeg -re -i @var{input} -f sap sap://224.0.0.255
@end example

And for watching in ffplay, over IPv6:

@example
ffmpeg -re -i @var{input} -f sap sap://[ff0e::1:2:3:4]
@end example

@subsection Demuxer

The syntax for a SAP URL given to the demuxer is:
@example
sap://[@var{address}][:@var{port}]
@end example

@var{address} is the multicast address to listen for announcements on;
if omitted, the default 224.2.127.254 (sap.mcast.net) is used. @var{port}
is the port that is listened on, 9875 if omitted.

The demuxer listens for announcements on the given address and port.
Once an announcement is received, it tries to receive that particular stream.

Example command lines follow.

To play back the first stream announced on the normal SAP multicast address:

@example
ffplay sap://
@end example

To play back the first stream announced on the default IPv6 SAP multicast address:

@example
ffplay sap://[ff0e::2:7ffe]
@end example

@section tcp

Transmission Control Protocol.

The required syntax for a TCP URL is:
@example
tcp://@var{hostname}:@var{port}[?@var{options}]
@end example

@table @option

@item listen
Listen for an incoming connection.

@example
ffmpeg -i @var{input} -f @var{format} tcp://@var{hostname}:@var{port}?listen
ffplay tcp://@var{hostname}:@var{port}
@end example

@end table

@section udp

User Datagram Protocol.

The required syntax for a UDP URL is:
@example
udp://@var{hostname}:@var{port}[?@var{options}]
@end example

@var{options} contains a list of @code{&}-separated options of the form @var{key}=@var{val}.
The following options are supported:

@table @option

@item buffer_size=@var{size}
set the UDP buffer size in bytes

@item localport=@var{port}
override the local UDP port to bind with

@item pkt_size=@var{size}
set the size in bytes of UDP packets

@item reuse=@var{1|0}
explicitly allow or disallow reusing UDP sockets

@item ttl=@var{ttl}
set the time to live value (for multicast only)

@item connect=@var{1|0}
Initialize the UDP socket with @code{connect()}. In this case, the
destination address can't be changed with ff_udp_set_remote_url later.
If the destination address isn't known at the start, this option can
be specified in ff_udp_set_remote_url, too.
This allows finding out the source address for the packets with getsockname,
and makes writes return with AVERROR(ECONNREFUSED) if "destination
unreachable" is received.
For receiving, this gives the benefit of only receiving packets from
the specified peer address/port.
@end table
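
As a sketch, the @code{connect} option can be combined with other
options in the query string; the hostname and port here are
placeholders:

@example
ffmpeg -i @var{input} -f mpegts "udp://@var{hostname}:@var{port}?connect=1"
@end example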

Some usage examples of the udp protocol with @file{ffmpeg} follow.

To stream over UDP to a remote endpoint:
@example
ffmpeg -i @var{input} -f @var{format} udp://@var{hostname}:@var{port}
@end example

To stream in mpegts format over UDP using 188-sized UDP packets, using a
large input buffer (note that the "&" must be quoted or escaped in most
shells):
@example
ffmpeg -i @var{input} -f mpegts "udp://@var{hostname}:@var{port}?pkt_size=188&buffer_size=65535"
@end example

To receive over UDP from a remote endpoint:
@example
ffmpeg -i udp://[@var{multicast-address}]:@var{port}
@end example

@c man end PROTOCOLS