public class Publisher
extends Object
Publisher: clientless stream publisher. This class can be used to publish raw video, audio, and metadata packets to the Wowza Pro server. Here is a quick snippet of code that illustrates how to use it.
The code below publishes data to the stream named "myStream". It is streamed to the default virtual host and available at the RTMP address rtmp://[server-ip-address]/streamtest.
IVHost vhost = VHostSingleton.getInstance(VHost.VHOST_DEFAULT);
Publisher publisher = Publisher.createInstance(vhost, "streamtest");
publisher.setFileExtension("flv");
publisher.setStreamType("live");
publisher.publish("myStream", "live");
// sit in a loop adding data
boolean done = false;
while (!done)
{
    // read the next packet from your audio, video, or data source
    // (readPacketFromSomewhere() is a placeholder for that code;
    // set done = true when the source is exhausted)
    AMFPacket amfPacket = readPacketFromSomewhere();
    switch (amfPacket.getType())
    {
        case IVHost.CONTENTTYPE_AUDIO:
            publisher.addAudioData(amfPacket.getData(), amfPacket.getSize(), amfPacket.getTimecode());
            break;
        case IVHost.CONTENTTYPE_VIDEO:
            publisher.addVideoData(amfPacket.getData(), amfPacket.getSize(), amfPacket.getTimecode());
            break;
        case IVHost.CONTENTTYPE_DATA:
            publisher.addDataData(amfPacket.getData(), amfPacket.getSize(), amfPacket.getTimecode());
            break;
    }
}
publisher.unpublish();
publisher.close();
Basic packet format:

Audio:
    AAC:
        [1-byte header]
        [1-byte codec config indicator (1 - audio data, 0 - codec config packet)]
        [n-bytes audio content or codec config data]
    All others:
        [1-byte header]
        [n-bytes audio content]
Below is the bit layout of the header byte (the table goes from least significant bit to most significant bit); a small bit-packing sketch follows the table:
1 bit - Number of channels:
    0 - mono
    1 - stereo
1 bit - Sample size:
    0 - 8 bits per sample
    1 - 16 bits per sample
2 bits - Sample rate:
    0 - special or 8KHz
    1 - 11KHz
    2 - 22KHz
    3 - 44KHz
4 bits - Audio type:
    0 - PCM (big endian)
    1 - PCM (swf - ADPCM)
    2 - MP3
    3 - PCM (little endian)
    4 - Nelly Moser ASAO 16KHz Mono
    5 - Nelly Moser ASAO 8KHz Mono
    6 - Nelly Moser ASAO
    7 - G.711 ALaw
    8 - G.711 MULaw
    9 - Reserved
    a - AAC
    b - Speex
    f - MP3 8KHz
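As a quick illustration of how these fields pack into the single header byte, here is a minimal sketch; the method name and parameters are illustrative and are not part of the Publisher API:

// Illustrative sketch only (not part of the Publisher API): compose the
// 1-byte audio header described in the table above.
public static byte makeAudioHeader(int audioType, int sampleRateIndex,
                                   boolean sixteenBit, boolean stereo)
{
    int header = (audioType & 0x0f) << 4;     // bits 4-7: audio type (0xa = AAC)
    header |= (sampleRateIndex & 0x03) << 2;  // bits 2-3: sample rate (3 = 44KHz)
    header |= (sixteenBit ? 1 : 0) << 1;      // bit 1: sample size (1 = 16 bits per sample)
    header |= (stereo ? 1 : 0);               // bit 0: channels (1 = stereo)
    return (byte)header;
}

With audio type 0xa, rate index 3, 16-bit samples, and stereo, this yields 0xaf, which matches the AAC header-byte note below.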
Note: For AAC, the codec config data is generally a two-byte packet that describes the stream. It must be published first. Here is the basic code to fill in the codec config data.
// describe the AAC stream (sample rate and channel count)
AACFrame frame = new AACFrame();
int sampleRate = 22050;
int channels = 2;
frame.setSampleRate(sampleRate);
frame.setRateIndex(AACUtils.sampleRateToIndex(sampleRate));
frame.setChannels(channels);
frame.setChannelIndex(AACUtils.channelCountToIndex(channels));
// encode the two-byte codec config packet described above
byte[] codecConfig = new byte[2];
AACUtils.encodeAACCodecConfig(frame, codecConfig, 0);
Note: For AAC the header byte is always 0xaf
Note: For Speex the audio data must be encoded as 16000 Hz wideband
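Putting the AAC notes together, here is a minimal sketch of publishing the codec config packet followed by a regular audio packet. It assumes the codecConfig array from the snippet above; aacFrameBytes and timecode are placeholders for your own encoder output.

// Illustrative sketch only: wrap the codec config and an AAC frame in the
// packet layout described above and publish the codec config packet first.
byte[] configPacket = new byte[2 + codecConfig.length];
configPacket[0] = (byte)0xaf;  // AAC header byte
configPacket[1] = 0;           // 0 = codec config packet
System.arraycopy(codecConfig, 0, configPacket, 2, codecConfig.length);
publisher.addAudioData(configPacket, configPacket.length, timecode);

// every subsequent AAC frame uses the same header with indicator 1 (audio data)
byte[] audioPacket = new byte[2 + aacFrameBytes.length];
audioPacket[0] = (byte)0xaf;
audioPacket[1] = 1;            // 1 = audio data
System.arraycopy(aacFrameBytes, 0, audioPacket, 2, aacFrameBytes.length);
publisher.addAudioData(audioPacket, audioPacket.length, timecode);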
Video:
    H.264:
        [1-byte header]
        [1-byte codec config indicator (1 - video data, 0 - codec config packet)]
        [3-byte time difference between dts and pts in milliseconds]
        [n-bytes video content or codec config data]
    All others:
        [1-byte header]
        [n-bytes video content]
Below is the bit layout of the header byte (the table goes from least significant bit to most significant bit):
4 bits - Video type:
    2 - Sorenson Spark (H.263)
    3 - Screen
    4 - On2 VP6
    5 - On2 VP6A
    6 - Screen2
    7 - H.264
2 bits - Frame type:
    1 - K frame (key frame)
    2 - P frame
    3 - B frame
Note: H.264 codec config data is the same as the AVCc packet in a QuickTime container.
Note: All timecode data is in milliseconds
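As a minimal sketch of the H.264 layout above, a video packet could be assembled as shown below. The variables naluData, isKeyFrame, compositionTimeMs, and timecode are placeholders, and the payload is assumed to already be in AVCC length-prefixed form.

// Illustrative sketch only: build the H.264 packet prefix described above.
int frameType = isKeyFrame ? 1 : 2;          // 1 = key frame, 2 = P frame
byte header = (byte)((frameType << 4) | 7);  // low 4 bits: video type 7 (H.264)

byte[] videoPacket = new byte[5 + naluData.length];
videoPacket[0] = header;
videoPacket[1] = 1;  // 1 = video data, 0 = codec config (AVCc) packet
videoPacket[2] = (byte)((compositionTimeMs >> 16) & 0xff);  // 3-byte pts/dts difference
videoPacket[3] = (byte)((compositionTimeMs >> 8) & 0xff);
videoPacket[4] = (byte)(compositionTimeMs & 0xff);
System.arraycopy(naluData, 0, videoPacket, 5, naluData.length);
publisher.addVideoData(videoPacket, videoPacket.length, timecode);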
| Modifier and Type | Method | Description |
|---|---|---|
| void | addAudioData(byte[] data, int offset, int len, long timecode) | |
| void | addAudioData(byte[] data, int offset, int len, long timecode, java.util.Map&lt;String,IAMFPacketExtraData&gt; extraData) | Add audio data |
| void | addAudioData(byte[] data, int len, long timecode) | Add audio data |
| void | addAudioData(byte[] data, long timecode) | Add audio data |
| void | addAudioDataInc(byte[] data, int offset, int len) | |
| void | addDataData(byte[] data, int offset, int len, long timecode) | |
| void | addDataData(byte[] data, int offset, int len, long timecode, java.util.Map&lt;String,IAMFPacketExtraData&gt; extraData) | Add metadata |
| void | addDataData(byte[] data, int len, long timecode) | Add metadata |
| void | addDataData(byte[] data, long timecode) | Add metadata |
| void | addDataDataInc(byte[] data, int offset, int len) | |
| void | addVideoData(byte[] data, int offset, int len, long timecode) | Add video data |
| void | addVideoData(byte[] data, int offset, int len, long timecode, java.util.Map&lt;String,IAMFPacketExtraData&gt; extraData) | Add video data |
| void | addVideoData(byte[] data, int len, long timecode) | Add video data |
| void | addVideoData(byte[] data, long timecode) | Add video data |
| void | addVideoDataInc(byte[] data, int offset, int len) | |
| void | close() | Close the publisher |
| static Publisher | createInstance(IApplicationInstance appInstance) | |
| static Publisher | createInstance(IVHost vhost, String applicationName) | |
| static Publisher | createInstance(IVHost vhost, String applicationName, String appInstanceName) | |
| void | createStream() | Create underlying IMediaStream object if not already created |
| void | flush() | Flush the packets from the input buffer to the output buffer |
| IApplicationInstance | getAppInstance() | |
| String | getFileExtension() | Get the file extension (default flv) |
| long | getLastAudioTimecode() | Get last audio timecode written through this publisher (milliseconds). |
| long | getLastDataTimecode() | Get last data timecode written through this publisher (milliseconds). |
| long | getLastGapAudioTimecode() | Get gap time from last audio timecode. |
| long | getLastGapVideoTimecode() | Get gap time from last video timecode. |
| long | getLastVideoTimecode() | Get last video timecode written through this publisher (milliseconds). |
| long | getMaxTimecode() | Highest timecode written through this publisher (milliseconds). |
| IMediaStream | getStream() | Get the media stream object |
| String | getStreamType() | |
| boolean | isPublishDataEvents() | |
| void | publish(String streamName) | Publish a stream (null to stop publishing) |
| void | publish(String streamName, String howToPublish) | Start publishing a stream (streamName = null to stop). |
| void | setFileExtension(String fileExtension) | Set the file extension |
| void | setPublishDataEvents(boolean publishDataEvents) | |
| void | setStreamType(String streamType) | Set the stream type (default live) |
| void | startAudioData(int len, long timecode) | |
| void | startAudioData(int len, long timecode, java.util.Map&lt;String,IAMFPacketExtraData&gt; extraData) | |
| void | startDataData(int len, long timecode) | |
| void | startDataData(int len, long timecode, java.util.Map&lt;String,IAMFPacketExtraData&gt; extraData) | |
| void | startVideoData(int len, long timecode) | |
| void | startVideoData(int len, long timecode, java.util.Map&lt;String,IAMFPacketExtraData&gt; extraData) | |
| void | unpublish() | |
public void addAudioData(byte[] data, int offset, int len, long timecode)
public void addAudioData(byte[] data, int offset, int len, long timecode, java.util.Map<String,IAMFPacketExtraData> extraData)
Parameters:
data - data
offset - offset
len - data length
timecode - absolute timecode (milliseconds)
public void addAudioData(byte[] data, int len, long timecode)
Parameters:
data - data
len - data length
timecode - absolute timecode (milliseconds)
public void addAudioData(byte[] data, long timecode)
Parameters:
data - data
timecode - absolute timecode (milliseconds)
public void addAudioDataInc(byte[] data, int offset, int len)
public void addDataData(byte[] data, int offset, int len, long timecode)
public void addDataData(byte[] data, int offset, int len, long timecode, java.util.Map<String,IAMFPacketExtraData> extraData)
Parameters:
data - data
offset - offset
len - data length
timecode - absolute timecode (milliseconds)
public void addDataData(byte[] data, int len, long timecode)
Parameters:
data - data
len - data length
timecode - absolute timecode (milliseconds)
public void addDataData(byte[] data, long timecode)
Parameters:
data - data
timecode - absolute timecode (milliseconds)
public void addDataDataInc(byte[] data, int offset, int len)
public void addVideoData(byte[] data, int offset, int len, long timecode)
Parameters:
data - data
offset - offset
len - data length
timecode - absolute timecode (milliseconds)
public void addVideoData(byte[] data, int offset, int len, long timecode, java.util.Map<String,IAMFPacketExtraData> extraData)
Parameters:
data - data
offset - offset
len - data length
timecode - absolute timecode (milliseconds)
extraData - extra data
public void addVideoData(byte[] data, int len, long timecode)
Parameters:
data - data
len - data length
timecode - absolute timecode (milliseconds)
public void addVideoData(byte[] data, long timecode)
Parameters:
data - data
timecode - absolute timecode (milliseconds)
public void addVideoDataInc(byte[] data, int offset, int len)
public void close()
public static Publisher createInstance(IApplicationInstance appInstance)
public static Publisher createInstance(IVHost vhost, String applicationName)
public static Publisher createInstance(IVHost vhost, String applicationName, String appInstanceName)
public void createStream()
public void flush()
public IApplicationInstance getAppInstance()
public String getFileExtension()
public long getLastAudioTimecode()
public long getLastDataTimecode()
public long getLastGapAudioTimecode()
public long getLastGapVideoTimecode()
public long getLastVideoTimecode()
public long getMaxTimecode()
public IMediaStream getStream()
public String getStreamType()
public boolean isPublishDataEvents()
public void publish(String streamName)
Parameters:
streamName - stream name
public void publish(String streamName, String howToPublish)
Parameters:
streamName - stream name
howToPublish - publish method (live, record, append)
public void setFileExtension(String fileExtension)
Parameters:
fileExtension - file extension
public void setPublishDataEvents(boolean publishDataEvents)
public void setStreamType(String streamType)
Parameters:
streamType - stream type
public void startAudioData(int len, long timecode)
public void startAudioData(int len, long timecode, java.util.Map<String,IAMFPacketExtraData> extraData)
public void startDataData(int len, long timecode)
public void startDataData(int len, long timecode, java.util.Map<String,IAMFPacketExtraData> extraData)
public void startVideoData(int len, long timecode)
public void startVideoData(int len, long timecode, java.util.Map<String,IAMFPacketExtraData> extraData)
public void unpublish()