EyeQ Docs

Sample projects

Explore the Video Mobile SDK sample projects for Android and iOS.

The SDK includes complete sample projects demonstrating real-world usage patterns. These samples provide working implementations you can reference when building your own applications.

Android sample

The Android sample demonstrates a complete image gallery and preview application using modern Android architecture patterns including Hilt dependency injection, MVVM, and Navigation components.

Project structure

The sample follows standard Android project conventions with a clean architecture approach:

sample/
├── app/
│   ├── src/main/java/photos/eyeq/sample/
│   │   ├── App.kt                        # Application class with SDK init
│   │   ├── core/                         # Base classes
│   │   │   ├── BaseFragment.kt
│   │   │   ├── BaseRepository.kt
│   │   │   └── BaseViewModel.kt
│   │   ├── di/
│   │   │   └── AppModule.kt              # Hilt dependency injection
│   │   ├── media/
│   │   │   └── Media.kt                  # Media data models
│   │   ├── ui/
│   │   │   ├── home/
│   │   │   │   ├── HomeFragment.kt       # Gallery grid view
│   │   │   │   ├── HomeViewModel.kt
│   │   │   │   └── MediaAdapter.kt       # RecyclerView adapter
│   │   │   ├── main/
│   │   │   │   └── MainActivity.kt       # Navigation host
│   │   │   └── preview/
│   │   │       ├── PreviewFragment.kt    # Image preview with slider
│   │   │       └── PreviewViewModel.kt   # Processing logic
│   │   └── utils/
│   │       ├── StorageUtil.kt            # Save processed images
│   │       └── ktx/                      # Kotlin extensions
│   └── build.gradle
├── build.gradle
└── settings.gradle.kts

Key features demonstrated

The sample showcases several important SDK integration patterns.

Feature                  Description
Hilt DI                  Singleton DynamicProcessor injected throughout the app
Preview rendering        DynamicView with processImagePreview() for fast display
Full processing          processImage() for saving at full resolution
Strength slider          Real-time adjustment using setStrength()
License status           Display days remaining with getDaysLeft()
Background processing    Coroutines with the IO dispatcher
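The license-status feature in the table has no dedicated snippet elsewhere in this guide. The sketch below shows one way a ViewModel might surface an expiration warning; getDaysLeft() is named in the table above, but the Int return type, the warning threshold, and the class and property names here are assumptions for illustration.

```kotlin
// Hypothetical sketch: surfacing license status from a ViewModel.
// getDaysLeft() is named in the feature table; its Int return type
// and the 7-day threshold are assumptions, not confirmed API details.
@HiltViewModel
class LicenseStatusViewModel @Inject constructor(
    private val dynamicProcessor: DynamicProcessor
) : ViewModel() {

    private val _licenseWarning = MutableLiveData<String?>()
    val licenseWarning: LiveData<String?> = _licenseWarning

    fun refreshLicenseStatus() {
        val daysLeft = dynamicProcessor.getDaysLeft()
        _licenseWarning.value = if (daysLeft in 0..7) {
            "License expires in $daysLeft day(s)"
        } else {
            null // plenty of time left; show no warning
        }
    }
}
```

A fragment could observe licenseWarning and show a banner only when the value is non-null.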

Dependency injection setup

The sample uses Hilt to provide a singleton DynamicProcessor instance throughout the app. This ensures the SDK is initialized once and reused across all screens:

di/AppModule.kt
@Module
@InstallIn(SingletonComponent::class)
class AppModule {

    @Provides
    @Singleton
    fun provideDynamicProcessor(
        @ApplicationContext context: Context
    ): DynamicProcessor {
        val apiKey = "your_api_key"
        val cert = "your_certificate"
        
        val dynamicProcessor = DynamicProcessor(context)
        dynamicProcessor.setLicence(apiKey = apiKey, cert = cert)
        return dynamicProcessor
    }
}

Application initialization

The Application class triggers SDK initialization when the app starts. GPU acceleration is enabled for optimal performance:

App.kt
@HiltAndroidApp
class App : Application() {

    @Inject
    lateinit var dynamic: DynamicProcessor

    override fun onCreate() {
        super.onCreate()
        dynamic.init(useGpu = true)
    }
}

Preview with DynamicView

The PreviewFragment demonstrates efficient preview rendering using DynamicView. Users can adjust the correction strength in real time with a slider:

ui/preview/PreviewFragment.kt
override fun setupViews() = with(binding) {
    loadBitmap()
    
    // Slider for real-time strength adjustment
    dynamicStrength.value = DEFAULT_STRENGTH
    dynamicStrength.addOnChangeListener { _, value, _ ->
        preview.setStrength(value / 100f)
        preview.requestRender()
    }
}

private fun loadBitmap() = lifecycleScope.launch {
    binding.loading.isVisible = true
    
    val bitmap = withContext(ioDispatcher) {
        fetchBitmap(safeArgs.imgUri)
    }
    
    binding.preview.setBitmap(bitmap)
    viewModel.setImage(bitmap, DEFAULT_STRENGTH / 100f)
    
    // Hide the spinner once the preview bitmap is in place
    binding.loading.isVisible = false
}

Processing and saving images

The PreviewViewModel handles both fast preview generation and full-resolution export. Preview uses processImagePreview() for speed, while saving uses processImage() for quality:

ui/preview/PreviewViewModel.kt
// Fast preview processing - returns DynamicOutputs for DynamicView
fun setImage(bitmap: Bitmap, strength: Float) = viewModelScope.launch(ioDispatcher) {
    val result = dynamicProcessor.processImagePreview(bitmap, strength = strength)
    _dynamicResult.postValue(result)
}

// Full resolution processing - returns Bitmap for saving
fun saveImage(bitmap: Bitmap, strength: Float) = viewModelScope.launch(ioDispatcher) {
    val fullSizeImage = dynamicProcessor.processImage(bitmap = bitmap, strength = strength)
    storageUtil.saveImage(fullSizeImage, folderName = "Dynamic")
    _saveResult.postValue(Unit)
}
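The snippet above only shows the ViewModel side. The sketch below illustrates how a fragment might observe these results; it assumes the private backing fields are exposed as `dynamicResult` and `saveResult` LiveData, and that DynamicView accepts the preview result through a setter (`setDynamicOutputs` is an assumed name, not a confirmed API).

```kotlin
// Hypothetical sketch of the observing side in PreviewFragment.
// `dynamicResult`, `saveResult`, and `setDynamicOutputs` are inferred
// names for illustration only.
override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
    super.onViewCreated(view, savedInstanceState)

    // Push each preview result into the DynamicView for rendering
    viewModel.dynamicResult.observe(viewLifecycleOwner) { outputs ->
        binding.preview.setDynamicOutputs(outputs)
        binding.preview.requestRender()
    }

    // Confirm to the user once the full-resolution image is saved
    viewModel.saveResult.observe(viewLifecycleOwner) {
        Toast.makeText(requireContext(), "Image saved", Toast.LENGTH_SHORT).show()
    }
}
```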

Gradle configuration

The sample's build configuration includes TensorFlow Lite for ML inference and Hilt for dependency injection. Note the noCompress directive, which keeps the .pnn and .pnne model files uncompressed so the SDK can load them correctly:

app/build.gradle
android {
    // Prevent compression of model files
    androidResources {
        noCompress("pnn", "pnne")
    }
}

dependencies {
    // SDK AAR file
    implementation fileTree(dir: 'libs', include: ['*.jar', '*.aar'])
    
    // TensorFlow Lite for ML inference
    implementation "org.tensorflow:tensorflow-lite:2.8.0"
    implementation "org.tensorflow:tensorflow-lite-gpu:2.8.0"
    
    // Hilt for dependency injection
    implementation "com.google.dagger:hilt-android:2.40.5"
    kapt "com.google.dagger:hilt-compiler:2.40.5"
    
    // Architecture components
    implementation "androidx.lifecycle:lifecycle-runtime-ktx:2.4.1"
    implementation "androidx.lifecycle:lifecycle-viewmodel-ktx:2.4.1"
    implementation "androidx.navigation:navigation-fragment-ktx:2.4.1"
}

Running the Android sample

Follow these steps to build and run the sample application:

  1. Open the sample directory in Android Studio
  2. Wait for Gradle sync to complete
  3. Ensure the SDK AAR file is in app/libs/ directory
  4. Update credentials in di/AppModule.kt with your API key and certificate
  5. Connect a physical Android device (API 21+)
  6. Grant storage permissions when prompted
  7. Click Run or press Shift+F10

The sample includes an expired demo license. Replace the credentials in AppModule.kt with your own valid license from EyeQ before running.


iOS sample

The iOS sample demonstrates real-time camera processing with AVFoundation, including camera switching, tap-to-focus, and exposure adjustment.

Project structure

The iOS sample is a single-view application focused on demonstrating real-time camera processing:

Sample/
├── DynamicDemo/
│   ├── AppDelegate.swift             # App lifecycle
│   ├── SceneDelegate.swift           # Scene lifecycle
│   ├── ViewController.swift          # Main camera view controller
│   ├── PreviewView.swift             # AVCaptureVideoPreviewLayer wrapper
│   ├── RuntimeError.swift            # Error handling
│   ├── Info.plist                    # Camera usage description
│   ├── Base.lproj/
│   │   ├── Main.storyboard
│   │   └── LaunchScreen.storyboard
│   ├── Assets.xcassets/              # Images and icons
│   └── DynamicFramework.xcframework/ # Embedded SDK framework
└── DynamicDemo.xcodeproj

Key features demonstrated

The sample showcases real-time video processing patterns.

Feature                  Description
Camera preview           AVCaptureSession with video data output
Real-time processing     Frame-by-frame SDK processing
Camera switching         Front/back camera with resetDeflicker()
Strength slider          Real-time correction intensity
Tap to focus             Touch-based focus and exposure
Exposure adjustment      Pan gesture for exposure bias
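The tap-to-focus feature in the table has no dedicated snippet in this guide. The sketch below shows one way it could be wired up using standard AVFoundation APIs; the `previewView` and `videoDeviceInput` names follow the sample's conventions, but this handler itself is an illustrative assumption.

```swift
// Hypothetical sketch of tap-to-focus using standard AVFoundation APIs.
// `previewView` and `videoDeviceInput` follow the naming used elsewhere
// in the sample; this exact handler is not taken from the sample code.
@objc func previewTapped(_ recognizer: UITapGestureRecognizer) {
    // Convert the tap from layer coordinates to device coordinates
    let layerPoint = recognizer.location(in: previewView)
    let devicePoint = previewView.videoPreviewLayer
        .captureDevicePointConverted(fromLayerPoint: layerPoint)

    guard let device = videoDeviceInput?.device else { return }
    do {
        try device.lockForConfiguration()
        if device.isFocusPointOfInterestSupported {
            device.focusPointOfInterest = devicePoint
            device.focusMode = .autoFocus
        }
        if device.isExposurePointOfInterestSupported {
            device.exposurePointOfInterest = devicePoint
            device.exposureMode = .autoExpose
        }
        device.unlockForConfiguration()
    } catch {
        debugPrint(error)
    }
}
```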

SDK initialization and configuration

The SDK is initialized inline with credentials stored in a constants struct. Deflickering is configured in viewDidLoad before camera capture begins:

ViewController.swift
struct Constants {
    static let certificate = #"your_certificate"#
    static let apiKey = "your_api_key"
}

class ViewController: UIViewController {
    
    private let pfcDynamic = PFCDynamic(
        apiKey: Constants.apiKey, 
        certificate: Constants.certificate
    )
    
    override func viewDidLoad() {
        super.viewDidLoad()
        
        // Configure deflickering for video
        pfcDynamic.setParams(
            deflickerCurve: 0.08, 
            deflickerImage: 0.9, 
            skip: 0
        )
        
        setupViews()
    }
}

Real-time camera processing

The AVCaptureVideoDataOutputSampleBufferDelegate receives each camera frame. The sample converts the buffer to CGImage, processes it through the SDK, and displays the result:

ViewController.swift
extension ViewController: AVCaptureVideoDataOutputSampleBufferDelegate {
    
    func captureOutput(
        _ output: AVCaptureOutput,
        didOutput sampleBuffer: CMSampleBuffer,
        from connection: AVCaptureConnection
    ) {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer),
              let sourceCgImage = cgImage(from: imageBuffer) else { return }
        
        do {
            let filteredCgImage = try pfcDynamic.processImage(
                sourceCgImage,
                strength: dynamicStrength
            )
            
            // Handle front camera mirroring
            let isMirrored = videoDeviceInput?.device.position == .front
            let orientation: UIImage.Orientation = isMirrored ? .upMirrored : .up
            
            let filteredImage = UIImage(
                cgImage: filteredCgImage,
                scale: 1,
                orientation: orientation
            )
            
            DispatchQueue.main.async {
                self.filteredFrameView.image = filteredImage
            }
        } catch {
            debugPrint(error)
        }
    }
}

Camera switching with deflicker reset

When switching between front and back cameras, the deflicker state must be reset. Without this, the temporal smoothing algorithm would blend frames from different cameras, causing visual artifacts:

ViewController.swift
@objc func rotateButtonTapped(_ sender: AnyObject) {
    sessionQueue.async {
        // ... camera switching logic ...
        
        // Reset deflicker when switching cameras
        self.pfcDynamic.resetDeflicker()
        
        self.session.commitConfiguration()
    }
}

PreviewView helper class

The sample includes a reusable PreviewView class that wraps AVCaptureVideoPreviewLayer. This simplifies camera preview setup by exposing the session as a simple property:

PreviewView.swift
class PreviewView: UIView {
    
    var videoPreviewLayer: AVCaptureVideoPreviewLayer {
        guard let layer = layer as? AVCaptureVideoPreviewLayer else {
            fatalError("Expected AVCaptureVideoPreviewLayer type for layer.")
        }
        return layer
    }
    
    var session: AVCaptureSession? {
        get { return videoPreviewLayer.session }
        set { videoPreviewLayer.session = newValue }
    }
    
    override class var layerClass: AnyClass {
        return AVCaptureVideoPreviewLayer.self
    }
}

Converting CVImageBuffer to CGImage

Camera frames arrive as CVImageBuffer, but the SDK requires CGImage. This helper method performs the conversion using Core Image:

ViewController.swift
func cgImage(from imageBuffer: CVImageBuffer) -> CGImage? {
    let ciImage = CIImage(cvImageBuffer: imageBuffer)
    let ciContext = CIContext(options: [CIContextOption.useSoftwareRenderer: false])
    
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let width = Int(ciImage.extent.size.width)
    let height = Int(ciImage.extent.size.height)
    
    // Allocate memory and render the CIImage to an RGBA8 bitmap
    let bytesPerRow = width * 4
    let contextMemoryCapacity = bytesPerRow * height  // 4 bytes per pixel
    let dataPointer = UnsafeMutableRawPointer.allocate(
        byteCount: contextMemoryCapacity,
        alignment: MemoryLayout<UInt8>.alignment
    )
    
    ciContext.render(
        ciImage,
        toBitmap: dataPointer,
        rowBytes: bytesPerRow,
        bounds: ciImage.extent,
        format: .RGBA8,
        colorSpace: colorSpace
    )
    
    // Create CGImage from bitmap data
    // ... provider and bitmapInfo setup ...
    
    return CGImage(
        width: width,
        height: height,
        bitsPerComponent: 8,
        bitsPerPixel: 32,
        bytesPerRow: bytesPerRow,
        space: colorSpace,
        bitmapInfo: bitmapInfo,
        provider: provider,
        decode: nil,
        shouldInterpolate: false,
        intent: .defaultIntent
    )
}

Running the iOS sample

Follow these steps to build and run the sample application:

  1. Open DynamicDemo.xcodeproj in Xcode
  2. Update the Constants struct with your API key and certificate
  3. Select a physical iOS device (camera requires real hardware)
  4. Ensure signing is configured for your development team
  5. Click Run or press Cmd+R

The sample includes an expired demo license. Replace the credentials in the Constants struct with your own valid license from EyeQ before running.

The iOS Simulator does not support camera access. You must use a physical device to run this sample.

Configuration checklist

Before building either sample project, verify that all required items are in place. Missing any of these will cause build failures or runtime errors:

Item                       Android                 iOS
API key configured         AppModule.kt            ViewController.swift
Certificate configured     AppModule.kt            ViewController.swift
SDK file location          AAR in app/libs/        XCFramework in project
Minimum OS version         API 21 (Android 5.0)    iOS 11.0
Build tool                 Android Studio          Xcode 13+
Physical device required   Recommended             Required (camera)
TensorFlow dependency      Required                Bundled in framework

Adapting samples for your project

The samples demonstrate one approach to SDK integration, but your project may have different requirements. Here are common adaptations:

Android

Common modifications for Android projects:

  • Without Hilt: Create a singleton DynamicProcessor in your Application class (see quickstart)
  • Jetpack Compose: Replace XML layouts with Compose UI, wrap DynamicView in AndroidView
  • Camera processing: Add CameraX dependencies and use processCameraFrame() instead of processImagePreview()
  • License monitoring: Use getDaysLeft() to display license expiration warnings to users
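The "Without Hilt" adaptation above can be sketched as a lazily created singleton owned by the Application class. This is an illustrative sketch, not the quickstart's exact code; the credential values are placeholders.

```kotlin
// Hypothetical sketch of the "Without Hilt" adaptation: a lazily
// created singleton DynamicProcessor owned by the Application class.
class App : Application() {

    val dynamicProcessor: DynamicProcessor by lazy {
        DynamicProcessor(this).apply {
            setLicence(apiKey = "your_api_key", cert = "your_certificate")
            init(useGpu = true)
        }
    }
}

// Retrieve it from any Activity or Fragment:
// val processor = (requireContext().applicationContext as App).dynamicProcessor
```

The by-lazy approach defers SDK initialization until first use; if you prefer eager startup as in the Hilt sample, initialize in onCreate() instead.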

iOS

Common modifications for iOS projects:

  • SwiftUI: Wrap UIImageView in UIViewRepresentable for displaying processed frames
  • Photo library: Use PHPickerViewController to select images, then process with processImage()
  • Video recording: Add AVCaptureMovieFileOutput and process frames during recording
  • Check initialization: Always call checkStatus() after creating PFCDynamic to verify license validity
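The SwiftUI adaptation above can be sketched as a small UIViewRepresentable wrapper around UIImageView, using only standard UIKit and SwiftUI APIs. The type and property names here are illustrative assumptions.

```swift
// Hypothetical sketch of the SwiftUI adaptation: wrapping a UIImageView
// in UIViewRepresentable so processed frames can be shown in a SwiftUI
// view hierarchy. Only standard UIKit/SwiftUI APIs are used.
import SwiftUI

struct ProcessedFrameView: UIViewRepresentable {
    // The latest frame produced by the SDK
    var image: UIImage?

    func makeUIView(context: Context) -> UIImageView {
        let imageView = UIImageView()
        imageView.contentMode = .scaleAspectFit
        return imageView
    }

    func updateUIView(_ uiView: UIImageView, context: Context) {
        // SwiftUI calls this whenever `image` changes
        uiView.image = image
    }
}
```

A parent view would hold the latest processed UIImage in @State and pass it down, replacing the UIImageView updates done on the main queue in the UIKit sample.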

VIDEO-SDK Version 1.0.0.23 built from aa5eef97017e23db1d3051b079500606825ef474 on 5-6-2023.

Copyright © 2026 EyeQ Imaging Inc. All rights reserved.
