
Sample Projects

Explore the iOS Video SDK sample project

The SDK includes a complete sample project demonstrating real-world usage patterns. This sample provides a working implementation you can reference when building your own application.

Project structure

The iOS sample demonstrates real-time camera processing with AVFoundation, including camera switching, tap-to-focus, and exposure adjustment.

Sample/
├── DynamicDemo/
│   ├── AppDelegate.swift             # App lifecycle
│   ├── SceneDelegate.swift           # Scene lifecycle
│   ├── ViewController.swift          # Main camera view controller
│   ├── PreviewView.swift             # AVCaptureVideoPreviewLayer wrapper
│   ├── RuntimeError.swift            # Error handling
│   ├── Info.plist                    # Camera usage description
│   ├── Base.lproj/
│   │   ├── Main.storyboard
│   │   └── LaunchScreen.storyboard
│   ├── Assets.xcassets/              # Images and icons
│   └── DynamicFramework.xcframework/ # Embedded SDK framework
└── DynamicDemo.xcodeproj

Key features demonstrated

  • Camera preview: AVCaptureSession with video data output
  • Real-time processing: Frame-by-frame SDK processing
  • Camera switching: Front/back camera with resetDeflicker()
  • Strength slider: Real-time correction intensity
  • Tap to focus: Touch-based focus and exposure
  • Exposure adjustment: Pan gesture for exposure bias
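
The tap-to-focus and exposure-bias features map onto standard AVFoundation device controls. The sketch below shows how such gesture handlers might drive the capture device; it is illustrative rather than the sample's exact code, and assumes the tap point has already been converted to device coordinates (e.g. via AVCaptureVideoPreviewLayer.captureDevicePointConverted(fromLayerPoint:)):

ViewController.swift (illustrative)
// Focus and expose at a point in device coordinates (0...1 on each axis).
func focus(at devicePoint: CGPoint, on device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()
        if device.isFocusPointOfInterestSupported {
            device.focusPointOfInterest = devicePoint
            device.focusMode = .autoFocus
        }
        if device.isExposurePointOfInterestSupported {
            device.exposurePointOfInterest = devicePoint
            device.exposureMode = .autoExpose
        }
        device.unlockForConfiguration()
    } catch {
        debugPrint("Could not lock device for configuration: \(error)")
    }
}

// Apply an exposure bias, e.g. from a pan gesture's translation.
func setExposureBias(_ bias: Float, on device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()
        // Clamp to the device-supported range before applying.
        let clamped = max(device.minExposureTargetBias,
                          min(bias, device.maxExposureTargetBias))
        device.setExposureTargetBias(clamped)
        device.unlockForConfiguration()
    } catch {
        debugPrint("Could not lock device for configuration: \(error)")
    }
}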

SDK initialization and configuration

The SDK is initialized inline with credentials stored in a constants struct. Deflickering is configured in viewDidLoad before camera capture begins:

ViewController.swift
struct Constants {
    static let certificate = #"YOUR_CERTIFICATE"#
    static let apiKey = "YOUR_API_KEY"
}

class ViewController: UIViewController {

    private let pfcDynamic = PFCDynamic(
        apiKey: Constants.apiKey,
        certificate: Constants.certificate
    )

    override func viewDidLoad() {
        super.viewDidLoad()

        // Configure deflickering for video
        pfcDynamic.setParams(
            deflickerCurve: 0.08,
            deflickerImage: 0.9,
            skip: 0
        )

        setupViews()
    }
}
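
The SDK also exposes checkStatus() for verifying the license after construction (see "Adapting for your project" below). Its exact signature is not shown in the sample, so this sketch assumes it throws on an invalid license:

ViewController.swift (illustrative)
// Hypothetical usage — assumes checkStatus() throws when the license is invalid.
do {
    try pfcDynamic.checkStatus()
} catch {
    debugPrint("SDK license check failed: \(error)")
}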

Real-time camera processing

The AVCaptureVideoDataOutputSampleBufferDelegate callback receives each camera frame. The sample converts the pixel buffer to a CGImage, processes it through the SDK, and displays the result:

ViewController.swift
extension ViewController: AVCaptureVideoDataOutputSampleBufferDelegate {

    func captureOutput(
        _ output: AVCaptureOutput,
        didOutput sampleBuffer: CMSampleBuffer,
        from connection: AVCaptureConnection
    ) {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer),
              let sourceCgImage = cgImage(from: imageBuffer) else { return }

        do {
            let filteredCgImage = try pfcDynamic.processImage(
                sourceCgImage,
                strength: dynamicStrength
            )

            // Handle front camera mirroring
            let isMirrored = videoDeviceInput?.device.position == .front
            let orientation: UIImage.Orientation = isMirrored ? .upMirrored : .up

            let filteredImage = UIImage(
                cgImage: filteredCgImage,
                scale: 1,
                orientation: orientation
            )

            DispatchQueue.main.async {
                self.filteredFrameView.image = filteredImage
            }
        } catch {
            debugPrint(error)
        }
    }
}
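
For the delegate above to receive frames, the session needs a video data output wired to a background queue. One way the setup might look inside ViewController (illustrative; the sample's actual configuration may differ, and the message-string initializer on the sample's RuntimeError type is assumed):

ViewController.swift (illustrative)
func configureSession() throws {
    session.beginConfiguration()
    defer { session.commitConfiguration() }

    guard let camera = AVCaptureDevice.default(
        .builtInWideAngleCamera, for: .video, position: .back
    ) else { throw RuntimeError("No back camera available") }

    let input = try AVCaptureDeviceInput(device: camera)
    guard session.canAddInput(input) else { throw RuntimeError("Cannot add camera input") }
    session.addInput(input)

    let videoOutput = AVCaptureVideoDataOutput()
    // BGRA frames convert cleanly to CGImage via Core Image.
    videoOutput.videoSettings =
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    // Drop late frames instead of queueing them behind slow processing.
    videoOutput.alwaysDiscardsLateVideoFrames = true
    videoOutput.setSampleBufferDelegate(self, queue: sessionQueue)
    guard session.canAddOutput(videoOutput) else { throw RuntimeError("Cannot add video output") }
    session.addOutput(videoOutput)
}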

Camera switching with deflicker reset

When switching between front and back cameras, the deflicker state must be reset. Without this, the temporal smoothing algorithm would blend frames from different cameras, causing visual artifacts:

ViewController.swift
@objc func rotateButtonTapped(_ sender: AnyObject) {
    sessionQueue.async {
        // ... camera switching logic ...

        // Reset deflicker when switching cameras
        self.pfcDynamic.resetDeflicker()

        self.session.commitConfiguration()
    }
}
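
The switching logic omitted above typically swaps the AVCaptureDeviceInput inside the same configuration block. A sketch of how it might look (the videoDeviceInput property name is illustrative):

ViewController.swift (illustrative)
self.session.beginConfiguration()

// Pick the opposite camera position from the current input.
let newPosition: AVCaptureDevice.Position =
    self.videoDeviceInput?.device.position == .back ? .front : .back

if let newDevice = AVCaptureDevice.default(.builtInWideAngleCamera,
                                           for: .video, position: newPosition),
   let newInput = try? AVCaptureDeviceInput(device: newDevice) {
    if let currentInput = self.videoDeviceInput {
        self.session.removeInput(currentInput)
    }
    if self.session.canAddInput(newInput) {
        self.session.addInput(newInput)
        self.videoDeviceInput = newInput
    }
}

// Reset deflicker so temporal smoothing does not blend
// frames from two different cameras.
self.pfcDynamic.resetDeflicker()

self.session.commitConfiguration()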

PreviewView helper class

The sample includes a reusable PreviewView class that wraps AVCaptureVideoPreviewLayer:

PreviewView.swift
class PreviewView: UIView {

    var videoPreviewLayer: AVCaptureVideoPreviewLayer {
        guard let layer = layer as? AVCaptureVideoPreviewLayer else {
            fatalError("Expected AVCaptureVideoPreviewLayer type for layer.")
        }
        return layer
    }

    var session: AVCaptureSession? {
        get { return videoPreviewLayer.session }
        set { videoPreviewLayer.session = newValue }
    }

    override class var layerClass: AnyClass {
        return AVCaptureVideoPreviewLayer.self
    }
}

Converting CVImageBuffer to CGImage

Camera frames arrive as CVImageBuffer, but the SDK requires CGImage. This helper performs the conversion using Core Image:

ViewController.swift
func cgImage(from imageBuffer: CVImageBuffer) -> CGImage? {
    let ciImage = CIImage(cvImageBuffer: imageBuffer)
    // Note: creating a CIContext is expensive; in production, create it once and reuse it.
    let ciContext = CIContext(options: [CIContextOption.useSoftwareRenderer: false])

    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let width = Int(ciImage.extent.size.width)
    let height = Int(ciImage.extent.size.height)

    let bytesPerRow = width * 4  // RGBA, 4 bytes per pixel
    let contextMemoryCapacity = height * bytesPerRow
    let dataPointer = UnsafeMutableRawPointer.allocate(
        byteCount: contextMemoryCapacity,
        alignment: MemoryLayout<UInt8>.alignment
    )

    ciContext.render(
        ciImage,
        toBitmap: dataPointer,
        rowBytes: bytesPerRow,
        bounds: ciImage.extent,
        format: .RGBA8,
        colorSpace: colorSpace
    )

    // Wrap the rendered bitmap in a data provider; the release callback
    // frees the buffer once the CGImage no longer needs it.
    guard let provider = CGDataProvider(
        dataInfo: nil,
        data: dataPointer,
        size: height * bytesPerRow,
        releaseData: { _, data, _ in data.deallocate() }
    ) else {
        dataPointer.deallocate()
        return nil
    }
    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)

    return CGImage(
        width: width,
        height: height,
        bitsPerComponent: 8,
        bitsPerPixel: 32,
        bytesPerRow: bytesPerRow,
        space: colorSpace,
        bitmapInfo: bitmapInfo,
        provider: provider,
        decode: nil,
        shouldInterpolate: false,
        intent: .defaultIntent
    )
}

Running the sample

  1. Open DynamicDemo.xcodeproj in Xcode
  2. Update the Constants struct with your API key and certificate
  3. Select a physical iOS device (camera requires real hardware)
  4. Ensure signing is configured for your development team
  5. Click Run or press Cmd+R

The sample includes an expired demo license. Replace the credentials in the Constants struct with your own valid license from EyeQ before running.

The iOS Simulator does not support camera access. You must use a physical device to run this sample.

Adapting for your project

Common modifications:

  • SwiftUI: Wrap UIImageView in UIViewRepresentable for displaying processed frames
  • Photo library: Use PHPickerViewController to select images, then process with processImage()
  • Video recording: Add AVCaptureMovieFileOutput and process frames during recording
  • Check initialization: Always call checkStatus() after creating PFCDynamic to verify license validity
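
For the SwiftUI route, the processed-frame display can be bridged with UIViewRepresentable. A minimal sketch (the type and property names are illustrative):

FilteredFrameView.swift (illustrative)
import SwiftUI
import UIKit

// Bridges a UIImageView into SwiftUI for displaying processed frames.
struct FilteredFrameView: UIViewRepresentable {
    // Latest SDK-processed frame; update this from your capture pipeline.
    var image: UIImage?

    func makeUIView(context: Context) -> UIImageView {
        let view = UIImageView()
        view.contentMode = .scaleAspectFit
        return view
    }

    func updateUIView(_ uiView: UIImageView, context: Context) {
        uiView.image = image
    }
}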

VIDEO-SDK Version 1.0.0.23 built from aa5eef97017e23db1d3051b079500606825ef474 on 5-6-2023.

Copyright © 2026 EyeQ Imaging Inc. All rights reserved.
