Flash DirectShow Source Filter: Integrating Flash Streams into DirectShow Pipelines

Overview

This article explains how a Flash DirectShow source filter works and shows a practical approach to integrating Flash (SWF/SWF-like) streams into DirectShow pipelines on Windows. It covers architecture, implementation steps, sample code patterns, threading and timing considerations, and guidance for handling codecs, synchronization, and error recovery.

When to use this filter

  • You need to render or process Flash content (SWF or streamed Flash video) inside a DirectShow graph.
  • You’re building a custom media player that must incorporate Flash-based streams alongside other media.
  • You want to decode or capture Flash video/audio for recording, streaming, or post-processing via DirectShow filters.

Architecture and components

  • Source Filter: Implements IBaseFilter, IMediaFilter, and related interfaces to provide media samples downstream. (IFilterGraph belongs to the graph manager, not the filter.)
  • Output Pin(s): Provide media types (video and possibly audio) via IPin/IEnumPins; handle connection and media type negotiation.
  • Sample Grabber / Renderer: Downstream components that accept samples (e.g., Sample Grabber, custom renderer).
  • Flash Engine: The component or library that parses/decodes Flash content (could be Adobe Flash runtime, a third-party decoder, or a custom SWF parser/renderer).
  • Threading & Scheduler: A worker thread to push samples downstream according to timestamps and DirectShow’s streaming model.
  • Sync Clock: Use the graph’s reference clock (IReferenceClock, obtained via IMediaFilter::GetSyncSource) to timestamp samples and maintain A/V sync; IMediaSeeking/IMediaPosition handle positioning.

Implementation steps (high-level)

  1. Choose Flash handling approach
    • Use an existing Flash playback library (if licensing permits) or implement a SWF parser/renderer. For RTMP/FLV streams, use an FLV/RTMP demuxer + codec.
  2. Create COM filter structure
    • Implement required COM interfaces: IBaseFilter, IUnknown, IPersist, IMediaSeeking (optional), and custom interfaces as needed.
  3. Define media types
    • For video: AM_MEDIA_TYPE with a VIDEOINFOHEADER or VIDEOINFOHEADER2 format block; set appropriate width, height, and bit rate, and a subtype such as MEDIASUBTYPE_RGB24, MEDIASUBTYPE_RGB32, or a compressed subtype (e.g., H.264) if you are passing compressed samples rather than decoding.
    • For audio: WAVEFORMATEX with PCM or AAC/MP3 details, if audio is present.
  4. Implement output pin(s)
    • Support Connect, Disconnect, QueryAccept, EnumMediaTypes, GetMediaType, and DecideBufferSize.
    • Provide negotiated media types based on Flash content capabilities.
  5. Worker thread to deliver samples
    • Implement a run loop that reads frames from the Flash engine, wraps them in IMediaSample, sets timestamps (rtStart/rtStop), and calls IPin::Receive on downstream input pins or uses the connection’s allocator to deliver samples.
  6. Timing and synchronization
    • Use the graph’s reference clock (IMediaFilter::GetSyncSource) for accurate timestamps.
    • Convert Flash frame timing (e.g., SWF frame rate) to REFERENCE_TIME units (100 ns).
  7. Seeking and timestamps
    • Implement IMediaSeeking if random access is supported; otherwise support basic seeking semantics with start/stop.
  8. Error handling and reconnection
    • Detect format changes and send EC_COMPLETE or EC_ERRORABORT events as needed via IMediaEventSink.
  9. Registration and graph building
    • Register the filter DLL with COM (regsvr32) and provide DirectShow filter setup information (CLSID, category, merit, pin data) so GraphEdit/GraphStudio can enumerate and use it.
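
The timing conversion in step 6 can be sketched in plain C++. The REFERENCE_TIME typedef and the 10,000,000 units-per-second constant match DirectShow’s definitions; the FrameTimes helper and the rational frame-rate parameters are illustrative, not a DirectShow API:

```cpp
#include <cstdint>

// REFERENCE_TIME is a 64-bit count of 100-nanosecond units, as in DirectShow.
typedef int64_t REFERENCE_TIME;
static const REFERENCE_TIME UNITS_PER_SECOND = 10000000LL; // 10^7 x 100 ns = 1 s

// Compute start/stop times for a frame index at a fixed frame rate given as a
// rational (numerator/denominator). Multiplying before dividing, from the
// absolute frame index, avoids cumulative rounding drift across long streams.
void FrameTimes(int frameIndex, int fpsNumerator, int fpsDenominator,
                REFERENCE_TIME* rtStart, REFERENCE_TIME* rtStop)
{
    *rtStart = (REFERENCE_TIME)frameIndex * UNITS_PER_SECOND
               * fpsDenominator / fpsNumerator;
    *rtStop  = (REFERENCE_TIME)(frameIndex + 1) * UNITS_PER_SECOND
               * fpsDenominator / fpsNumerator;
}
```

At 30 fps, for example, frame 30 starts exactly one second (10,000,000 units) into the stream.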

Key code patterns (conceptual snippets)

  • Creating IMediaSample and setting times:

```cpp
REFERENCE_TIME rtStart = frameIndex * frameDuration;
REFERENCE_TIME rtStop  = rtStart + frameDuration;
pSample->SetTime(&rtStart, &rtStop);
pSample->SetActualDataLength(frameSize);
```
  • Delivering a sample on the output pin:

```cpp
HRESULT hr = m_pOutputPin->Deliver(pSample);
if (FAILED(hr)) {
    // handle downstream refusal or stop streaming
}
```
  • Negotiating media types in GetMediaType:

```cpp
if (pos == 0) {
    VIDEOINFOHEADER* pVih =
        (VIDEOINFOHEADER*)CoTaskMemAlloc(sizeof(VIDEOINFOHEADER));
    ZeroMemory(pVih, sizeof(VIDEOINFOHEADER));
    pVih->bmiHeader.biWidth  = width;
    pVih->bmiHeader.biHeight = height;
    // set other fields…
    pmt->majortype  = MEDIATYPE_Video;
    pmt->subtype    = MEDIASUBTYPE_RGB24;
    pmt->formattype = FORMAT_VideoInfo;
    pmt->pbFormat   = (BYTE*)pVih;
    pmt->cbFormat   = sizeof(VIDEOINFOHEADER);
    return S_OK;
}
```
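
  • Sizing buffers for DecideBufferSize:

A related pattern is computing the buffer size your DecideBufferSize implementation should request. For RGB24 output, each DIB row is padded to a 4-byte (DWORD) boundary. This sketch is portable C++; Rgb24FrameBytes is an illustrative helper, not a DirectShow API:

```cpp
#include <cstddef>

// Bytes needed for one RGB24 frame stored as a DIB: 3 bytes per pixel,
// each row rounded up to a 4-byte boundary. This is the cbBuffer value a
// DecideBufferSize implementation would typically request per sample.
size_t Rgb24FrameBytes(int width, int height) {
    size_t stride = ((size_t)width * 3 + 3) & ~(size_t)3;
    return stride * (size_t)height;
}
```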

Threading and performance tips

  • Use a single dedicated worker thread per source filter for frame extraction and delivery.
  • Avoid long blocking operations on the delivery thread; prefetch frames when possible.
  • Reuse allocators and buffers; implement DecideBufferSize to request an appropriate buffer count.
  • Use hardware-accelerated decoders (DXVA, Media Foundation transforms) when dealing with compressed streams.

Handling different Flash content types

  • SWF vector/ActionScript content: requires a SWF rendering engine to rasterize each frame (vectors, text, graphics) into a bitmap suitable for DirectShow video output.
  • FLV/RTMP streams: demux FLV packets, decode codecs (H.264, AAC) then output raw frames or compressed samples depending on downstream capability.
  • H.264 in FLV: prefer passing compressed H.264 as MEDIASUBTYPE_H264 if the downstream supports it; otherwise decode to RGB/YUV frames.

Synchronization with audio

  • Align video sample timestamps with audio timestamps from the Flash engine.
  • If audio is raw PCM, deliver audio samples on a separate audio output pin with properly matched timestamps.
  • For compressed audio (MP3/AAC), either deliver compressed packets or decode to PCM for compatibility.
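
A simple way to keep PCM audio timestamps on the same clock as video is to derive them from the running count of samples already delivered. The PcmStartTime helper below is illustrative, not a DirectShow API; only the 100-ns REFERENCE_TIME convention matches DirectShow:

```cpp
#include <cstdint>

typedef int64_t REFERENCE_TIME; // 100-nanosecond units, as in DirectShow

// Start time for a PCM buffer, derived from the total number of audio
// frames (samples per channel) delivered so far. Computing from the running
// total, rather than accumulating per-buffer durations, avoids drift against
// the video timestamps produced from frame indices.
REFERENCE_TIME PcmStartTime(int64_t samplesDelivered, int sampleRate) {
    return samplesDelivered * 10000000LL / sampleRate;
}
```

For example, after 44,100 samples at 44.1 kHz the next buffer starts at exactly one second, matching the video timeline.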

Testing and debugging

  • Test with GraphEdit/GraphStudio to verify pin connections and media type negotiation.
  • Use logging for timestamp values, buffer sizes, and delivery return codes.
  • Validate behavior with network interruptions (for streamed sources) and format changes.

Deployment and licensing considerations

  • Verify licensing for using the Adobe Flash runtime; prefer open-source decoders where licensing fits your distribution.
  • Ensure your filter’s installer registers COM objects and adds registry entries for DirectShow filter categories.

Conclusion

Integrating Flash streams into DirectShow requires a source filter that bridges Flash decoding/rasterization with DirectShow’s streaming model. Focus on precise timestamping, media-type negotiation, efficient buffering, and robust error handling. Using existing decoders/demuxers where possible reduces development effort and licensing risk.
