Sample projects
Explore the Video Mobile SDK sample projects for Android and iOS.
The SDK includes complete sample projects demonstrating real-world usage patterns. These samples provide working implementations you can reference when building your own applications.
Android sample
The Android sample demonstrates a complete image gallery and preview application using modern Android architecture patterns including Hilt dependency injection, MVVM, and Navigation components.
Project structure
The sample follows standard Android project conventions with a clean architecture approach:
sample/
├── app/
│   ├── src/main/java/photos/eyeq/sample/
│   │   ├── App.kt                  # Application class with SDK init
│   │   ├── core/                   # Base classes
│   │   │   ├── BaseFragment.kt
│   │   │   ├── BaseRepository.kt
│   │   │   └── BaseViewModel.kt
│   │   ├── di/
│   │   │   └── AppModule.kt        # Hilt dependency injection
│   │   ├── media/
│   │   │   └── Media.kt            # Media data models
│   │   ├── ui/
│   │   │   ├── home/
│   │   │   │   ├── HomeFragment.kt # Gallery grid view
│   │   │   │   ├── HomeViewModel.kt
│   │   │   │   └── MediaAdapter.kt # RecyclerView adapter
│   │   │   ├── main/
│   │   │   │   └── MainActivity.kt # Navigation host
│   │   │   └── preview/
│   │   │       ├── PreviewFragment.kt  # Image preview with slider
│   │   │       └── PreviewViewModel.kt # Processing logic
│   │   └── utils/
│   │       ├── StorageUtil.kt      # Save processed images
│   │       └── ktx/                # Kotlin extensions
│   └── build.gradle
├── build.gradle
└── settings.gradle.kts

Key features demonstrated
The sample showcases several important SDK integration patterns.
| Feature | Description |
|---|---|
| Hilt DI | Singleton DynamicProcessor injected throughout the app |
| Preview rendering | DynamicView with processImagePreview() for fast display |
| Full processing | processImage() for saving at full resolution |
| Strength slider | Real-time adjustment using setStrength() |
| License status | Display days remaining with getDaysLeft() |
| Background processing | Coroutines with IO dispatcher |
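The strength slider in the table maps the slider's 0–100 range onto the 0.0–1.0 value that setStrength(), processImage(), and processImagePreview() expect (the sample simply divides by 100). A minimal sketch of that mapping; the helper name and the clamping are our own additions, not SDK API:

```kotlin
// Illustrative helper (not part of the SDK): converts a 0–100 slider
// value to the 0.0–1.0 strength range used by setStrength() and the
// processing calls, clamping out-of-range input.
fun sliderToStrength(sliderValue: Float): Float =
    (sliderValue / 100f).coerceIn(0f, 1f)

fun main() {
    println(sliderToStrength(50f))   // 0.5
    println(sliderToStrength(120f))  // clamped to 1.0
}
```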
Dependency injection setup
The sample uses Hilt to provide a singleton DynamicProcessor instance throughout the app. This ensures the SDK is initialized once and reused across all screens:
@Module
@InstallIn(SingletonComponent::class)
class AppModule {

    @Provides
    @Singleton
    fun provideDynamicProcessor(
        @ApplicationContext context: Context
    ): DynamicProcessor {
        val apiKey = "your_api_key"
        val cert = "your_certificate"
        val dynamicProcessor = DynamicProcessor(context)
        dynamicProcessor.setLicence(apiKey = apiKey, cert = cert)
        return dynamicProcessor
    }
}

Application initialization
The Application class triggers SDK initialization when the app starts. GPU acceleration is enabled for optimal performance:
@HiltAndroidApp
class App : Application() {

    @Inject
    lateinit var dynamic: DynamicProcessor

    override fun onCreate() {
        super.onCreate()
        dynamic.init(useGpu = true)
    }
}

Preview with DynamicView
The PreviewFragment demonstrates efficient preview rendering using DynamicView. Users can adjust correction strength in real-time with a slider:
override fun setupViews() = with(binding) {
    loadBitmap()

    // Slider for real-time strength adjustment
    dynamicStrength.value = DEFAULT_STRENGTH
    dynamicStrength.addOnChangeListener { _, value, _ ->
        preview.setStrength(value / 100f)
        preview.requestRender()
    }
}

private fun loadBitmap() = lifecycleScope.launch {
    binding.loading.isVisible = true
    val bitmap = withContext(ioDispatcher) {
        fetchBitmap(safeArgs.imgUri)
    }
    binding.preview.setBitmap(bitmap)
    viewModel.setImage(bitmap, DEFAULT_STRENGTH / 100f)
}

Processing and saving images
The PreviewViewModel handles both fast preview generation and full-resolution export. Preview uses processImagePreview() for speed, while saving uses processImage() for quality:
// Fast preview processing - returns DynamicOutputs for DynamicView
fun setImage(bitmap: Bitmap, strength: Float) = viewModelScope.launch(ioDispatcher) {
    val result = dynamicProcessor.processImagePreview(bitmap, strength = strength)
    _dynamicResult.postValue(result)
}

// Full resolution processing - returns Bitmap for saving
fun saveImage(bitmap: Bitmap, strength: Float) = viewModelScope.launch(ioDispatcher) {
    val fullSizeImage = dynamicProcessor.processImage(bitmap = bitmap, strength = strength)
    storageUtil.saveImage(fullSizeImage, folderName = "Dynamic")
    _saveResult.postValue(Unit)
}

Gradle configuration
The sample's build configuration includes TensorFlow Lite for ML inference and Hilt for dependency injection. Note the noCompress directive, which stops the build from compressing the bundled model files so the SDK can load them directly:
android {
    // Prevent compression of model files
    androidResources {
        noCompress("pnn", "pnne")
    }
}

dependencies {
    // SDK AAR file
    implementation fileTree(dir: 'libs', include: ['*.jar', '*.aar'])

    // TensorFlow Lite for ML inference
    implementation "org.tensorflow:tensorflow-lite:2.8.0"
    implementation "org.tensorflow:tensorflow-lite-gpu:2.8.0"

    // Hilt for dependency injection
    implementation "com.google.dagger:hilt-android:2.40.5"
    kapt "com.google.dagger:hilt-compiler:2.40.5"

    // Architecture components
    implementation "androidx.lifecycle:lifecycle-runtime-ktx:2.4.1"
    implementation "androidx.lifecycle:lifecycle-viewmodel-ktx:2.4.1"
    implementation "androidx.navigation:navigation-fragment-ktx:2.4.1"
}

Running the Android sample
Follow these steps to build and run the sample application:
- Open the `sample` directory in Android Studio
- Wait for Gradle sync to complete
- Ensure the SDK AAR file is in the `app/libs/` directory
- Update the credentials in `di/AppModule.kt` with your API key and certificate
- Connect a physical Android device (API 21+)
- Grant storage permissions when prompted
- Click Run or press `Shift+F10`
The sample includes an expired demo license. Replace the credentials in AppModule.kt with your own valid license from EyeQ before running.
iOS sample
The iOS sample demonstrates real-time camera processing with AVFoundation, including camera switching, tap-to-focus, and exposure adjustment.
Project structure
The iOS sample is a single-view application focused on demonstrating real-time camera processing:
Sample/
├── DynamicDemo/
│   ├── AppDelegate.swift     # App lifecycle
│   ├── SceneDelegate.swift   # Scene lifecycle
│   ├── ViewController.swift  # Main camera view controller
│   ├── PreviewView.swift     # AVCaptureVideoPreviewLayer wrapper
│   ├── RuntimeError.swift    # Error handling
│   ├── Info.plist            # Camera usage description
│   ├── Base.lproj/
│   │   ├── Main.storyboard
│   │   └── LaunchScreen.storyboard
│   ├── Assets.xcassets/      # Images and icons
│   └── DynamicFramework.xcframework/  # Embedded SDK framework
└── DynamicDemo.xcodeproj

Key features demonstrated
The sample showcases real-time video processing patterns.
| Feature | Description |
|---|---|
| Camera preview | AVCaptureSession with video data output |
| Real-time processing | Frame-by-frame SDK processing |
| Camera switching | Front/back camera with resetDeflicker() |
| Strength slider | Real-time correction intensity |
| Tap to focus | Touch-based focus and exposure |
| Exposure adjustment | Pan gesture for exposure bias |
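The exposure-adjustment gesture in the table boils down to mapping a pan translation onto a clamped exposure-bias value. A hedged sketch of that mapping logic; the helper name, sensitivity constant, and bias range below are assumptions for illustration, not values taken from the sample:

```swift
// Illustrative helper (not the sample's actual gesture code): converts
// a vertical pan translation into an exposure bias, clamped to a
// device-style min/max range. Sensitivity and range are assumptions.
func exposureBias(forPanTranslation translationY: Double,
                  minBias: Double = -8.0,
                  maxBias: Double = 8.0,
                  sensitivity: Double = 0.01) -> Double {
    // Panning up (negative translation) increases exposure.
    let bias = -translationY * sensitivity
    return min(max(bias, minBias), maxBias)
}

print(exposureBias(forPanTranslation: -200))  // 2.0
```

In the real sample the clamped result would be applied via AVCaptureDevice's setExposureTargetBias, bounded by the device's minExposureTargetBias and maxExposureTargetBias.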
SDK initialization and configuration
The SDK is initialized inline with credentials stored in a constants struct. Deflickering is configured in viewDidLoad before camera capture begins:
struct Constants {
    static let certificate = #"your_certificate"#
    static let apiKey = "your_api_key"
}

class ViewController: UIViewController {

    private let pfcDynamic = PFCDynamic(
        apiKey: Constants.apiKey,
        certificate: Constants.certificate
    )

    override func viewDidLoad() {
        super.viewDidLoad()

        // Configure deflickering for video
        pfcDynamic.setParams(
            deflickerCurve: 0.08,
            deflickerImage: 0.9,
            skip: 0
        )
        setupViews()
    }
}

Real-time camera processing
The AVCaptureVideoDataOutputSampleBufferDelegate receives each camera frame. The sample converts the buffer to CGImage, processes it through the SDK, and displays the result:
extension ViewController: AVCaptureVideoDataOutputSampleBufferDelegate {

    func captureOutput(
        _ output: AVCaptureOutput,
        didOutput sampleBuffer: CMSampleBuffer,
        from connection: AVCaptureConnection
    ) {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer),
              let sourceCgImage = cgImage(from: imageBuffer) else { return }
        do {
            let filteredCgImage = try pfcDynamic.processImage(
                sourceCgImage,
                strength: dynamicStrength
            )

            // Handle front camera mirroring
            let isMirrored = videoDeviceInput?.device.position == .front
            let orientation: UIImage.Orientation = isMirrored ? .upMirrored : .up
            let filteredImage = UIImage(
                cgImage: filteredCgImage,
                scale: 1,
                orientation: orientation
            )
            DispatchQueue.main.async {
                self.filteredFrameView.image = filteredImage
            }
        } catch {
            debugPrint(error)
        }
    }
}

Camera switching with deflicker reset
When switching between front and back cameras, the deflicker state must be reset. Without this, the temporal smoothing algorithm would blend frames from different cameras, causing visual artifacts:
@objc func rotateButtonTapped(_ sender: AnyObject) {
    sessionQueue.async {
        // ... camera switching logic ...

        // Reset deflicker when switching cameras
        self.pfcDynamic.resetDeflicker()
        self.session.commitConfiguration()
    }
}

PreviewView helper class
The sample includes a reusable PreviewView class that wraps AVCaptureVideoPreviewLayer. This simplifies camera preview setup by exposing the session as a simple property:
class PreviewView: UIView {

    var videoPreviewLayer: AVCaptureVideoPreviewLayer {
        guard let layer = layer as? AVCaptureVideoPreviewLayer else {
            fatalError("Expected AVCaptureVideoPreviewLayer type for layer.")
        }
        return layer
    }

    var session: AVCaptureSession? {
        get { return videoPreviewLayer.session }
        set { videoPreviewLayer.session = newValue }
    }

    override class var layerClass: AnyClass {
        return AVCaptureVideoPreviewLayer.self
    }
}

Converting CVImageBuffer to CGImage
Camera frames arrive as CVImageBuffer but the SDK requires CGImage. This helper method performs the conversion using Core Image:
func cgImage(from imageBuffer: CVImageBuffer) -> CGImage? {
    let ciImage = CIImage(cvImageBuffer: imageBuffer)
    let ciContext = CIContext(options: [CIContextOption.useSoftwareRenderer: false])
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let width = Int(ciImage.extent.size.width)
    let height = Int(ciImage.extent.size.height)

    // Allocate memory and render the CIImage to a bitmap
    // (RGBA8 = 4 bytes per pixel)
    let bytesPerRow = width * 4
    let byteCount = bytesPerRow * height
    let dataPointer = UnsafeMutableRawPointer.allocate(
        byteCount: byteCount,
        alignment: MemoryLayout<UInt8>.alignment
    )
    defer { dataPointer.deallocate() }

    ciContext.render(
        ciImage,
        toBitmap: dataPointer,
        rowBytes: bytesPerRow,
        bounds: ciImage.extent,
        format: .RGBA8,
        colorSpace: colorSpace
    )

    // Create a CGImage from the rendered bitmap. The data provider copies
    // the bytes, so the scratch buffer can be freed when we return.
    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)
    guard let provider = CGDataProvider(
        data: Data(bytes: dataPointer, count: byteCount) as CFData
    ) else { return nil }

    return CGImage(
        width: width,
        height: height,
        bitsPerComponent: 8,
        bitsPerPixel: 32,
        bytesPerRow: bytesPerRow,
        space: colorSpace,
        bitmapInfo: bitmapInfo,
        provider: provider,
        decode: nil,
        shouldInterpolate: false,
        intent: .defaultIntent
    )
}

Running the iOS sample
Follow these steps to build and run the sample application:
- Open `DynamicDemo.xcodeproj` in Xcode
- Update the `Constants` struct with your API key and certificate
- Select a physical iOS device (camera requires real hardware)
- Ensure signing is configured for your development team
- Click Run or press `Cmd+R`
The sample includes an expired demo license. Replace the credentials in the Constants struct with your own valid license from EyeQ before running.
The iOS Simulator does not support camera access. You must use a physical device to run this sample.
Configuration checklist
Before building either sample project, verify that all required items are in place. Missing any of these will cause build failures or runtime errors:
| Item | Android | iOS |
|---|---|---|
| API key configured | AppModule.kt | ViewController.swift |
| Certificate configured | AppModule.kt | ViewController.swift |
| SDK file location | AAR in app/libs/ | XCFramework in project |
| Minimum OS version | API 21 (Android 5.0) | iOS 11.0 |
| Build tool | Android Studio | Xcode 13+ |
| Physical device required | Recommended | Required (camera) |
| TensorFlow dependency | Required | Bundled in framework |
Adapting samples for your project
The samples demonstrate one approach to SDK integration, but your project may have different requirements. Here are common adaptations:
Android
Common modifications for Android projects:
- Without Hilt: Create a singleton `DynamicProcessor` in your `Application` class (see quickstart)
- Jetpack Compose: Replace XML layouts with Compose UI, wrap `DynamicView` in `AndroidView`
- Camera processing: Add CameraX dependencies and use `processCameraFrame()` instead of `processImagePreview()`
- License monitoring: Use `getDaysLeft()` to display license expiration warnings to users
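The license-monitoring adaptation amounts to a simple threshold check on the value getDaysLeft() returns. A minimal sketch; licenseWarning and its threshold are our own illustrative additions, not SDK API:

```kotlin
// Illustrative license-expiration check (licenseWarning is not an SDK
// API; only getDaysLeft() is). Returns a user-facing message when the
// license is close to expiring, or null otherwise.
fun licenseWarning(daysLeft: Int, threshold: Int = 14): String? =
    when {
        daysLeft <= 0 -> "License expired. Contact EyeQ to renew."
        daysLeft <= threshold -> "License expires in $daysLeft day(s)."
        else -> null
    }

// Usage with the SDK (assuming an injected DynamicProcessor):
// licenseWarning(dynamicProcessor.getDaysLeft())?.let { showBanner(it) }
```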
iOS
Common modifications for iOS projects:
- SwiftUI: Wrap `UIImageView` in `UIViewRepresentable` for displaying processed frames
- Photo library: Use `PHPickerViewController` to select images, then process with `processImage()`
- Video recording: Add `AVCaptureMovieFileOutput` and process frames during recording
- Check initialization: Always call `checkStatus()` after creating `PFCDynamic` to verify license validity
VIDEO-SDK Version 1.0.0.23 built from aa5eef97017e23db1d3051b079500606825ef474 on 5-6-2023.
Copyright © 2026 EyeQ Imaging Inc. All rights reserved.