Introduction
ONNX (Open Neural Network Exchange) is a widely used open-source format for AI models, supported by various popular frameworks. Integrating ONNX Runtime in your Kotlin Multiplatform (KMP) project allows you to run machine learning models efficiently across multiple platforms. In this guide, we'll integrate ONNX Runtime into a Compose Multiplatform project targeting both Android and iOS. We'll leverage the swift-klib-plugin to bridge Swift and Kotlin code, making ONNX models usable across platforms.
Prerequisites
- Basic knowledge of Kotlin Multiplatform projects.
- Android Studio installed with the Kotlin Multiplatform Mobile plugin.
- Xcode for iOS development.
Check Out the Code on GitHub! 🚀 🚀
While reading through the blog, you can refer to the code in my GitHub repository for better understanding and comparison. Here is the repo link: https://github.com/wh173d3v11/OnnxRuntimeKotlinMultiplatform
Step 1: Clone the swift-klib-plugin Repository
First, clone the swift-klib-plugin repository:
git clone https://github.com/wh173d3v11/swift-klib-plugin.git
Note: This is not the official swift-klib-plugin repo; it's my fork with a fix for integrating the ONNX library in a KMP project.
Open the project in Android Studio and run the following Gradle task to publish the plugin to your local Maven repository:
gradle publishToMavenLocal
Step 2: Create a New Kotlin Multiplatform Project
Navigate to the JetBrains KMP Portal and create a new mobile project targeting both Android and iOS. Download the generated project and open it in Android Studio.
Step 3: Configure the Project
Modify settings.gradle.kts
Add the mavenLocal() repository to both the pluginManagement and dependencyResolutionManagement blocks in your settings.gradle.kts file (note that pluginManagement must be the first block in the script):
pluginManagement {
    repositories {
        mavenLocal()
        // other repositories
    }
}

dependencyResolutionManagement {
    repositories {
        mavenLocal()
        // other repositories
    }
}
Modify build.gradle.kts for composeApp
In the build.gradle.kts file of the composeApp module, add the swift-klib plugin at the top:
plugins {
    id("io.github.ttypic.swiftklib") version "0.7.0-SNAPSHOT"
    // other plugins
}
Sync the project to apply the changes.
Step 4: Integrate ONNX Runtime for Android
Add the ONNX Runtime dependency for Android to androidMain's dependencies block (inside kotlin { sourceSets { ... } }):
androidMain {
    dependencies {
        // other dependencies
        implementation("com.microsoft.onnxruntime:onnxruntime-android:1.19.2")
    }
}

Step 5: Integrate ONNX Runtime for iOS
Configure CInterops
In the build.gradle.kts file, configure a cinterop for each iOS target inside the kotlin block. Here, iosTarget refers to each target in the template's listOf(iosX64(), iosArm64(), iosSimulatorArm64()) loop, and the name iosOnnx must match the swiftklib block you'll add next:
kotlin {
    listOf(
        iosX64(),
        iosArm64(),
        iosSimulatorArm64()
    ).forEach { iosTarget ->
        iosTarget.compilations.getByName("main") {
            cinterops {
                create("iosOnnx")
            }
        }
    }
}
Create a folder for the ONNX Swift files in your iOS project path:
mkdir -p projectPath/iosApp/iosApp/onnx
Add a dummy Swift file or leave it empty for now.
Configure Swift-Klib
Add the following configuration to the bottom of your build.gradle.kts file. This tells swiftklib to compile the Swift sources in the onnx folder and expose them to Kotlin under the given package name:
swiftklib {
    create("iosOnnx") {
        path = file("../iosApp/iosApp/onnx")
        packageName("com.fierydinesh.onnx")
    }
}
Step 6: Implement ONNX Runtime in Swift
Create an OnnxManager.swift class in Xcode under the onnx folder with the following content. The @objc annotations are what make the class callable from Kotlin through the generated bindings, and OnnxRuntimeBindings comes from the ONNX Runtime Swift package:
import Foundation
import OnnxRuntimeBindings

@objc public class OnnxManager: NSObject {
    @objc public static func runWithModel(year: Float, model: String) -> NSNumber {
        do {
            let result = try OnnxInferenceManager.run(year: year, model: model)
            return NSNumber(value: result ?? -1)
        } catch {
            print("Failed to run OnnxInferenceManager: \(error)")
        }
        return -1.0
    }
}
Create the OnnxInferenceManager.swift class as well:
public class OnnxInferenceManager {
    public static func run(year: Float, model: String) throws -> Float? {
        // 1. Create an ORTEnv and load the model at `model` into an ORTSession.
        // 2. Wrap `year` in an input tensor and run the session.
        // 3. Read the predicted Float out of the output tensor.
        return nil // placeholder; see the full implementation linked below
    }
}
Check this link for the full implementation -> OnnxInferenceManager.swift
Step 7: Implement ONNX Inference in Kotlin
Don't have a model? Get mine from here -> linear_model.onnx
CommonMain
Define the expect class and function in commonMain: create a file called OnnxInference.kt in the commonMain module:
expect class OnnxInference() {
    fun predict(input: Float, model: String = getResPath("files/linear_model.onnx")): Float
}

fun getResPath(input: String): String = "composeResources/onnxkmpsample.composeapp.generated.resources/$input"
Note: you may need to update getResPath for your own package name, changing "onnxkmpsample.composeapp" to your "com.package.name".
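For example, with a hypothetical application namespace of com.example.myapp, the function would become:

// Hypothetical namespace "com.example.myapp"; substitute your project's own.
fun getResPath(input: String): String =
    "composeResources/com.example.myapp.generated.resources/$input"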
iOSMain
In the iosMain module, create the following OnnxInference.kt actual class:
import com.fierydinesh.onnx.OnnxManager
import kotlinx.cinterop.ExperimentalForeignApi

actual class OnnxInference {
    @OptIn(ExperimentalForeignApi::class)
    actual fun predict(input: Float, model: String): Float {
        return OnnxManager.runWithModel(input, model).floatValue
    }
}
AndroidMain
Implement the actual class in androidMain:
actual class OnnxInference actual constructor() {
    actual fun predict(input: Float, model: String): Float {
        TODO("Android ONNX Runtime inference logic; see the sketch below")
    }
}
Check this link for the full implementation -> OnnxInference.kt
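To give you a feel for what goes in there, here is a minimal sketch using the onnxruntime-android Java API. The readModelBytes helper is a placeholder of my own, since how you load the model bytes depends on how your project exposes compose resources; treat this as an outline, not the repo's exact code.

import ai.onnxruntime.OnnxTensor
import ai.onnxruntime.OrtEnvironment
import java.io.File

actual class OnnxInference actual constructor() {
    actual fun predict(input: Float, model: String): Float {
        val env = OrtEnvironment.getEnvironment()
        env.createSession(readModelBytes(model)).use { session ->
            // Shape [1, 1]: one sample with one feature.
            OnnxTensor.createTensor(env, arrayOf(floatArrayOf(input))).use { tensor ->
                session.run(mapOf(session.inputNames.first() to tensor)).use { results ->
                    @Suppress("UNCHECKED_CAST")
                    val output = results[0].value as Array<FloatArray>
                    return output[0][0]
                }
            }
        }
    }

    // Placeholder: in a real app, resolve the compose-resource path to bytes
    // (e.g. via the Android AssetManager) instead of reading a plain file.
    private fun readModelBytes(path: String): ByteArray = File(path).readBytes()
}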
Now we can use it from the commonMain module:
result = onnxInterface.predict(input = inputYear.toFloat())
Check this link for the full implementation -> App.kt
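For illustration, a minimal Composable wiring it up could look like the following. The names (PredictionScreen, inputYear) and the layout are placeholders of mine, not the repo's App.kt:

import androidx.compose.foundation.layout.Column
import androidx.compose.material3.Button
import androidx.compose.material3.Text
import androidx.compose.material3.TextField
import androidx.compose.runtime.*

@Composable
fun PredictionScreen() {
    val onnxInterface = remember { OnnxInference() }
    var inputYear by remember { mutableStateOf("2025") }
    var result by remember { mutableStateOf<Float?>(null) }

    Column {
        TextField(value = inputYear, onValueChange = { inputYear = it })
        Button(onClick = {
            // predict() falls back to the default model path from the expect class.
            inputYear.toFloatOrNull()?.let { result = onnxInterface.predict(input = it) }
        }) { Text("Predict") }
        result?.let { Text("Prediction: $it") }
    }
}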
Conclusion
Integrating ONNX Runtime into your Kotlin Multiplatform project opens up a world of possibilities for running machine learning models across multiple platforms like Android and iOS. With ONNX's versatility and Kotlin's power, you can seamlessly build AI-powered applications that perform well on any device. By following the steps in this guide, you'll be able to enhance your app's capabilities while leveraging the strengths of both Kotlin Multiplatform and ONNX Runtime. Happy coding, and feel free to explore more as you implement AI in your apps! 🚀