STANAG 4609 Player .NET SDK, Version 3.10.0
Video / Data synchronization

The STANAG Player SDK synchronizes video, audio, KLV and private data frames/packets according to their time stamps (PTS). Once matching frames (one for each incoming PID) are found, the SDK raises a callback with their data and forwards the list of "synced video / data frames" to the application layer. You can specify the allowed waiting time with the MaxDelay parameter.
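As a rough illustration only, the sketch below shows how an application might subscribe to the synchronization callback and bound the waiting time. SyncFrameEvent, MaxDelay and StreamFrameInfoWr are the names used in this document; the StanagPlayer class, the handler signature and everything else are assumptions made for this sketch and may differ from the actual SDK API.

    // Hypothetical subscription sketch - names other than SyncFrameEvent,
    // MaxDelay and StreamFrameInfoWr are assumptions, not the SDK API.
    var player = new StanagPlayer();

    // Allowed waiting time (in msec) for matching frames/packets per PTS.
    player.MaxDelay = 200;

    // Raised when a set of matching (same PTS) frames/packets is found.
    player.SyncFrameEvent += (IList<StreamFrameInfoWr> frames) =>
    {
        foreach (var f in frames)
            Console.WriteLine($"pid={f.streamId} type={f.streamType} pts={f.timeStamp}");
    };

    player.Open("sample.ts");
    player.Play();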

Note
Timestamp information is available for KLV packets only in SYNC_KLV mode.

Sync scenarios

There are two different scenarios:

Frame Capture mode.

In this mode, which is mostly used when processing of uncompressed video frames is needed, the SyncFrameEvent provides a list of synchronized frames and data packets, so no additional synchronization is required.

Note
The list is provided as soon as it is ready, so it can be processed as early as possible. Displaying it as-is will not produce smooth playback. If you want to render (play) these frames, use the timeToRender information that is present for all packets carrying PTS info. You can use a simple FIFO: push the list and display it when its presentation time comes (a minimal sketch is shown after the struct definition below). If no timing info is available for a packet, the value will be -1; in that case, use the video timing info from the same list to schedule the presentation.
struct StreamFrameInfoWr
{
    StreamType streamType;   // specifies the packet PID type
    int streamId;            // PID number of the stream
    byte[] data;             // data buffer
    unsigned long dataSize;  // frame's data buffer size
    Int64 timeStamp;         // time of IMediaSample
    long timeToRender;       // time to render in msec
};
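For example, a minimal FIFO along the lines suggested in the note could look like the sketch below. It assumes the StreamFrameInfoWr layout shown above, treats timeToRender as a delay in milliseconds from the moment the list is delivered, and takes a presentation delegate from the application; check the SDK reference for the exact semantics before relying on it.

    using System;
    using System.Collections.Concurrent;
    using System.Threading;

    // Minimal FIFO scheduler sketch: push each synchronized list together
    // with the moment it becomes due, then present it when that moment
    // arrives. The presentation delegate and the interpretation of
    // timeToRender as a relative delay are assumptions of this sketch.
    class SyncFrameFifo
    {
        readonly BlockingCollection<(StreamFrameInfoWr[] Frames, long DueMs)> queue = new();
        readonly Action<StreamFrameInfoWr[]> present;

        public SyncFrameFifo(Action<StreamFrameInfoWr[]> present) => this.present = present;

        // Called from the SyncFrameEvent handler.
        public void Push(StreamFrameInfoWr[] frames)
        {
            // Use the first packet that carries timing info (-1 means none);
            // the note above recommends falling back to the video timing.
            long delayMs = 0;
            foreach (var f in frames)
                if (f.timeToRender >= 0) { delayMs = f.timeToRender; break; }

            queue.Add((frames, Environment.TickCount64 + delayMs));
        }

        // Run on a worker thread: wait until each list is due, then present it.
        public void Run(CancellationToken ct)
        {
            foreach (var (frames, dueMs) in queue.GetConsumingEnumerable(ct))
            {
                long wait = dueMs - Environment.TickCount64;
                if (wait > 0) Thread.Sleep((int)wait);
                present(frames);
            }
        }
    }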

Video Rendering mode.

This mode provides smooth video playback, as the interval between frame presentations is handled internally. The SyncFrameEvent still provides a list of synchronized frames and data packets, but they do not exactly match the video currently shown in the window. The internal SyncSampleGrabber filter has a video (YUV) output pin, which connects to the VMR9 through a scaler. The VMR9 schedules the display of each frame according to the sample time stamp (PTS). When the source is a file, the grabber filter's output FIFO is expected to contain several pending frames, so you have some extra time for data processing (as you receive it in advance). To use the metadata synchronized with the currently displayed video (for overlay purposes, for example), it must be delayed (with a FIFO), as explained above; a rough sketch of such a delay buffer is shown after the note below.

Note
This delay is not needed if you use Frame Accuracy mode. For more information, see Frame Accuracy mode.
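A rough sketch of such a metadata delay buffer is shown below. It keys pending packets by the timeStamp (PTS) field of StreamFrameInfoWr and releases them once the application reports that the video renderer has reached that time; how the currently displayed PTS is obtained is application-specific and is an assumption of this sketch.

    using System.Collections.Generic;

    // Holds KLV / private-data packets until the video frame with a matching
    // (or later) PTS is actually on screen, so overlays stay in sync.
    // Obtaining the currently displayed video PTS is left to the application.
    class MetadataDelayBuffer
    {
        readonly Queue<StreamFrameInfoWr> pending = new();

        // Called from the SyncFrameEvent handler for non-video packets.
        public void Push(StreamFrameInfoWr packet) => pending.Enqueue(packet);

        // Called periodically (e.g. on a render/UI timer) with the PTS of the
        // frame currently displayed; yields the packets that are now due.
        public IEnumerable<StreamFrameInfoWr> Release(long currentVideoPts)
        {
            while (pending.Count > 0 && pending.Peek().timeStamp <= currentVideoPts)
                yield return pending.Dequeue();
        }
    }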

Copyright 2023, IMPLEOTV SYSTEMS LTD