This guide covers deploying the Mobile AI Orchestrator to various mobile and embedded platforms.
- Platform Overview
- Android Deployment
- iOS Deployment
- Linux Mobile (Embedded)
- Cross-Compilation
- Size Optimization
- Testing and Validation
| Platform | Architecture | Status | Notes |
|---|---|---|---|
| Android | ARM64 (aarch64) | ✅ Primary | NDK r25+ |
| Android | ARMv7 (armv7) | ✅ Supported | Older devices |
| iOS | ARM64 (aarch64) | ✅ Primary | iOS 14+ |
| Linux Mobile | ARM64 | ✅ Primary | PinePhone, Librem 5 |
| RISC-V | rv64gc | 🔬 Experimental | Future platforms |
- RAM: 128MB minimum, 256MB recommended
- Storage: 5MB binary + 10-50MB for models
- CPU: ARMv7+ or RISC-V RV64
- OS: Android 8+, iOS 14+, Linux 5.4+
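If a deployment wants to verify the RAM requirement at startup on Linux-based targets (including Android), one option is to read MemAvailable from /proc/meminfo. The helper below is an illustrative sketch, not part of the orchestrator crate:

```rust
use std::fs;

/// Illustrative check: returns MemAvailable from /proc/meminfo in MB.
/// /proc/meminfo exists on Linux and Android; other platforms need their own probe.
fn available_memory_mb() -> Option<u64> {
    let meminfo = fs::read_to_string("/proc/meminfo").ok()?;
    let line = meminfo.lines().find(|l| l.starts_with("MemAvailable:"))?;
    let kb: u64 = line.split_whitespace().nth(1)?.parse().ok()?;
    Some(kb / 1024)
}

fn main() {
    match available_memory_mb() {
        Some(mb) if mb < 128 => eprintln!("Warning: {mb} MB available, below the 128 MB minimum"),
        Some(mb) => println!("{mb} MB available"),
        None => eprintln!("Could not read /proc/meminfo"),
    }
}
```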
This approach builds the Rust library as a native library (.so) and calls it from Java/Kotlin via JNI.
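The JNI functions below (and the C functions in the iOS section) call into the crate's Orchestrator and Query types. As a reference point, they assume an API shaped roughly like this minimal sketch; the real definitions live in the crate, and the struct fields and error type here are illustrative only:

```rust
// Minimal sketch of the API surface the FFI bindings rely on.
// Field layout and the error type are illustrative, not the crate's actual code.
pub struct Orchestrator {}

#[derive(Clone)]
pub struct Query {
    pub text: String,
}

pub struct Response {
    pub text: String,
}

impl Query {
    pub fn new(text: &str) -> Self {
        Self { text: text.to_string() }
    }
}

impl Orchestrator {
    pub fn new() -> Self {
        Self {}
    }

    /// Route a query to a local model and return its answer.
    pub fn process(&mut self, query: Query) -> Result<Response, String> {
        Ok(Response { text: format!("processed: {}", query.text) })
    }

    /// Switch the active project context.
    pub fn switch_project(&mut self, _name: &str) {}
}
```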
# Install Android NDK
# Download from https://developer.android.com/ndk/downloads
export ANDROID_NDK_HOME=/path/to/android-ndk-r25c
# Add Android targets
rustup target add aarch64-linux-android
rustup target add armv7-linux-androideabi
rustup target add x86_64-linux-android # For emulator
Create .cargo/config.toml:
[target.aarch64-linux-android]
linker = "/path/to/android-ndk/toolchains/llvm/prebuilt/linux-x86_64/bin/aarch64-linux-android30-clang"
[target.armv7-linux-androideabi]
linker = "/path/to/android-ndk/toolchains/llvm/prebuilt/linux-x86_64/bin/armv7a-linux-androideabi30-clang"
[target.x86_64-linux-android]
linker = "/path/to/android-ndk/toolchains/llvm/prebuilt/linux-x86_64/bin/x86_64-linux-android30-clang"Add to Cargo.toml:
[dependencies]
jni = "0.21"
lazy_static = "1.4" # Required by the bindings below
[lib]
crate-type = ["cdylib"] # For Android
Create src/android.rs:
use jni::JNIEnv;
use jni::objects::{JClass, JString};
use jni::sys::jstring;
use std::sync::Mutex;
use crate::Orchestrator;
lazy_static::lazy_static! {
static ref ORCHESTRATOR: Mutex<Orchestrator> = Mutex::new(Orchestrator::new());
}
#[no_mangle]
pub extern "C" fn Java_com_example_mobileai_MobileAI_initOrchestrator(
_env: JNIEnv,
_class: JClass,
) {
// Initialize orchestrator
let _ = ORCHESTRATOR.lock();
}
#[no_mangle]
pub extern "C" fn Java_com_example_mobileai_MobileAI_processQuery(
env: JNIEnv,
_class: JClass,
query_text: JString,
) -> jstring {
// Convert Java string to Rust
let query: String = env.get_string(query_text)
.expect("Failed to get string")
.into();
// Process query
let mut orch = ORCHESTRATOR.lock().unwrap();
let response = match orch.process(crate::Query::new(&query)) {
Ok(resp) => resp.text,
Err(e) => format!("Error: {}", e),
};
// Convert back to Java string
env.new_string(response)
.expect("Failed to create Java string")
.into_raw()
}
#[no_mangle]
pub extern "C" fn Java_com_example_mobileai_MobileAI_switchProject(
env: JNIEnv,
_class: JClass,
project_name: JString,
) {
let project: String = env.get_string(project_name)
.expect("Failed to get string")
.into();
let mut orch = ORCHESTRATOR.lock().unwrap();
orch.switch_project(&project);
}Add to src/lib.rs:
#[cfg(target_os = "android")]
pub mod android;
# Build for ARM64 (most modern Android devices)
cargo build --target aarch64-linux-android --release
# Build for ARMv7 (older devices)
cargo build --target armv7-linux-androideabi --release
# Build for x86_64 (emulator)
cargo build --target x86_64-linux-android --release
Outputs:
target/aarch64-linux-android/release/libmobile_ai_orchestrator.so
target/armv7-linux-androideabi/release/libmobile_ai_orchestrator.so
target/x86_64-linux-android/release/libmobile_ai_orchestrator.so
1. Create Android project in Android Studio
2. Add native libraries to `app/src/main/jniLibs/`:
app/src/main/jniLibs/
├── arm64-v8a/
│ └── libmobile_ai_orchestrator.so
├── armeabi-v7a/
│ └── libmobile_ai_orchestrator.so
└── x86_64/
└── libmobile_ai_orchestrator.so
3. Create Java wrapper (`app/src/main/java/com/example/mobileai/MobileAI.java`); the package and class name must match the `Java_com_example_mobileai_MobileAI_*` symbols exported from Rust:
package com.example.mobileai;
public class MobileAI {
static {
System.loadLibrary("mobile_ai_orchestrator");
}
public static native void initOrchestrator();
public static native String processQuery(String query);
public static native void switchProject(String projectName);
}
4. Use in Activity:
class MainActivity : AppCompatActivity() {
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
// Initialize
MobileAI.initOrchestrator()
// Process query
val response = MobileAI.processQuery("What is Rust?")
println("Response: $response")
// Switch project
MobileAI.switchProject("my-app")
}
}
Add to proguard-rules.pro:
# Keep native methods
-keepclasseswithmembernames class * {
native <methods>;
}
# Keep MobileAI class
-keep class com.example.mobileai.MobileAI { *; }
For testing on rooted devices or Termux:
# Install Rust in Termux
pkg install rust
# Clone repository
git clone https://github.com/Hyperpolymath/heterogenous-mobile-computing
cd heterogenous-mobile-computing
# Build
cargo build --release
# Run
./target/release/mobile-ai
# Install Xcode Command Line Tools
xcode-select --install
# Add iOS targets
rustup target add aarch64-apple-ios
rustup target add x86_64-apple-ios # For simulator
rustup target add aarch64-apple-ios-sim # For M1/M2 simulator
Update Cargo.toml:
[lib]
crate-type = ["staticlib"] # For iOSCreate mobile_ai.h:
#ifndef MOBILE_AI_H
#define MOBILE_AI_H
#include <stdint.h>
#include <stdbool.h>
// Initialize orchestrator
void mobile_ai_init(void);
// Process query (returns owned string, caller must free)
char* mobile_ai_process_query(const char* query_text);
// Switch project
void mobile_ai_switch_project(const char* project_name);
// Free string returned by mobile_ai_process_query
void mobile_ai_free_string(char* s);
#endif
Create src/ios.rs:
use std::ffi::{CStr, CString};
use std::os::raw::c_char;
use std::sync::Mutex;
use crate::Orchestrator;
lazy_static::lazy_static! {
static ref ORCHESTRATOR: Mutex<Orchestrator> = Mutex::new(Orchestrator::new());
}
#[no_mangle]
pub extern "C" fn mobile_ai_init() {
let _ = ORCHESTRATOR.lock();
}
#[no_mangle]
pub extern "C" fn mobile_ai_process_query(query_text: *const c_char) -> *mut c_char {
if query_text.is_null() {
return std::ptr::null_mut();
}
let c_str = unsafe { CStr::from_ptr(query_text) };
let query = match c_str.to_str() {
Ok(s) => s,
Err(_) => return std::ptr::null_mut(),
};
let mut orch = ORCHESTRATOR.lock().unwrap();
let response = match orch.process(crate::Query::new(query)) {
Ok(resp) => resp.text,
Err(e) => format!("Error: {}", e),
};
CString::new(response).unwrap().into_raw()
}
#[no_mangle]
pub extern "C" fn mobile_ai_switch_project(project_name: *const c_char) {
if project_name.is_null() {
return;
}
let c_str = unsafe { CStr::from_ptr(project_name) };
if let Ok(project) = c_str.to_str() {
let mut orch = ORCHESTRATOR.lock().unwrap();
orch.switch_project(project);
}
}
#[no_mangle]
pub extern "C" fn mobile_ai_free_string(s: *mut c_char) {
if !s.is_null() {
unsafe {
let _ = CString::from_raw(s);
}
}
}
Add to src/lib.rs:
#[cfg(target_os = "ios")]
pub mod ios;
# Build for device (ARM64)
cargo build --target aarch64-apple-ios --release
# Build for simulator (x86_64 Intel)
cargo build --target x86_64-apple-ios --release
# Build for simulator (ARM64 M1/M2)
cargo build --target aarch64-apple-ios-sim --release
# Create universal library
lipo -create \
target/aarch64-apple-ios/release/libmobile_ai_orchestrator.a \
target/x86_64-apple-ios/release/libmobile_ai_orchestrator.a \
-output libmobile_ai_universal.a
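Before wiring the library into Xcode, the exported C ABI can be sanity-checked with a Rust test that round-trips a query and frees the returned string. This is a sketch that would live in src/ios.rs; it only runs if the module is also compiled on the host during tests (for example by widening the cfg in src/lib.rs to `#[cfg(any(target_os = "ios", test))]`):

```rust
#[cfg(test)]
mod ffi_tests {
    use super::{mobile_ai_free_string, mobile_ai_init, mobile_ai_process_query};
    use std::ffi::{CStr, CString};

    #[test]
    fn query_round_trip_frees_cleanly() {
        mobile_ai_init();
        let query = CString::new("What is Rust?").unwrap();

        // The pointer returned by the FFI layer is owned by the caller.
        let raw = mobile_ai_process_query(query.as_ptr());
        assert!(!raw.is_null());

        // Copy into an owned String before handing the allocation back to Rust.
        let response = unsafe { CStr::from_ptr(raw) }.to_string_lossy().into_owned();
        mobile_ai_free_string(raw);

        assert!(!response.is_empty());
    }
}
```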
1. Create new iOS project in Xcode
2. Add library:
   - Drag `libmobile_ai_universal.a` to project
   - Drag `mobile_ai.h` to project
3. Create Swift wrapper (`MobileAI.swift`):
import Foundation
class MobileAI {
static let shared = MobileAI()
private init() {
mobile_ai_init()
}
func processQuery(_ query: String) -> String {
// Swift bridges the String to a temporary C string for the call
guard let cResponse = mobile_ai_process_query(query) else {
return "Error: Failed to process query"
}
let response = String(cString: cResponse)
// The Rust side owns the allocation, so hand it back to be freed
mobile_ai_free_string(cResponse)
return response
}
func switchProject(_ projectName: String) {
mobile_ai_switch_project(projectName)
}
}
4. Create bridging header (`MobileAI-Bridging-Header.h`):
#import "mobile_ai.h"
5. Use in SwiftUI:
struct ContentView: View {
@State private var query = ""
@State private var response = ""
var body: some View {
VStack {
TextField("Enter query", text: $query)
.padding()
Button("Process") {
response = MobileAI.shared.processQuery(query)
}
Text(response)
.padding()
}
}
}
- PinePhone (Manjaro, postmarketOS)
- Librem 5 (PureOS)
- Generic ARM64 Linux
# On device
cargo build --release
# Or cross-compile from Linux x86_64
rustup target add aarch64-unknown-linux-gnu
# Install cross-compilation tools
sudo apt install gcc-aarch64-linux-gnu
# Configure
export CARGO_TARGET_AARCH64_UNKNOWN_LINUX_GNU_LINKER=aarch64-linux-gnu-gcc
# Build
cargo build --target aarch64-unknown-linux-gnu --release
# Copy to device
scp target/aarch64-unknown-linux-gnu/release/mobile-ai user@pinephone:/usr/local/bin/
Create /etc/systemd/system/mobile-ai.service:
[Unit]
Description=Mobile AI Orchestrator Service
After=network.target
[Service]
Type=simple
User=mobile
ExecStart=/usr/local/bin/mobile-ai --daemon
Restart=on-failure
RestartSec=10
# Resource limits
MemoryMax=256M
CPUQuota=50%
[Install]
WantedBy=multi-user.target
Enable and start:
sudo systemctl enable mobile-ai
sudo systemctl start mobile-ai
sudo systemctl status mobile-ai
# Install cross
cargo install cross
# Build for various platforms
cross build --target aarch64-linux-android --release
cross build --target aarch64-unknown-linux-gnu --release
cross build --target armv7-unknown-linux-gnueabihf --release
# Build for Android
nix build .#packages.x86_64-linux.android-aarch64
# Build for Linux ARM64
nix build .#packages.x86_64-linux.linux-aarch64
# Build with network features
nix build .#packages.x86_64-linux.with-network
Update flake.nix to add cross-compilation targets:
{
outputs = { self, nixpkgs, ... }: let pkgs = nixpkgs.legacyPackages.x86_64-linux; in {
packages.x86_64-linux = {
android-aarch64 = pkgs.pkgsCross.aarch64-android.rustPlatform.buildRustPackage {
# ... Android build config
};
linux-aarch64 = pkgs.pkgsCross.aarch64-multiplatform.rustPlatform.buildRustPackage {
# ... Linux ARM64 build config
};
};
};
}
Default release build:
- Linux x86_64: ~2.5MB (stripped)
- Android ARM64: ~2.8MB (stripped)
- iOS ARM64: ~2.6MB (stripped)
Already configured in Cargo.toml:
[profile.release]
opt-level = "z" # Optimize for size
lto = true # Link-time optimization
codegen-units = 1 # Better optimization
strip = true # Strip symbols
panic = "abort" # Smaller binary# Build without debug info
cargo build --release
# Additional stripping
strip target/release/mobile-ai
# Result: ~2MB → ~1.8MB
# Build without network features
cargo build --release --no-default-features
# Result: ~2MB → ~1.5MB (removes tokio/reqwest)
# Install UPX
sudo apt install upx
# Compress binary
upx --best --lzma target/release/mobile-ai
# Result: ~1.5MB → ~600KB
# Note: Slower startup, but much smaller
Remove unused dependencies from Cargo.toml:
[dependencies]
# Only keep what you need
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
# Remove if not needed:
# tokio = ...
# reqwest = ...
| Configuration | Size | Use Case |
|---|---|---|
| Full (network + all features) | 2.5MB | Development |
| Release (no network) | 1.5MB | Offline deployment |
| Release + stripped + UPX | 600KB | Constrained devices |
| Minimal (no serde_json) | 400KB | Embedded systems |
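The no-network rows above assume the crate's online features sit behind a Cargo feature, with tokio and reqwest marked optional and pulled in by a default network feature, so `--no-default-features` drops them. Below is a sketch of how such gating can look in the source; the `network` feature name and `fetch_remote` function are illustrative, not the crate's actual API:

```rust
// Compiled only when the (hypothetical) `network` feature is enabled;
// building with --no-default-features drops this code and its
// tokio/reqwest dependencies from the binary.
#[cfg(feature = "network")]
pub async fn fetch_remote(url: &str) -> Result<String, String> {
    reqwest::get(url)
        .await
        .map_err(|e| e.to_string())?
        .text()
        .await
        .map_err(|e| e.to_string())
}

// Offline builds get a stub with the same signature, so callers compile either way.
#[cfg(not(feature = "network"))]
pub async fn fetch_remote(_url: &str) -> Result<String, String> {
    Err("built without the network feature".to_string())
}
```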
# Push binary to device
adb push target/aarch64-linux-android/release/mobile-ai /data/local/tmp/
# Make executable
adb shell chmod +x /data/local/tmp/mobile-ai
# Run
adb shell /data/local/tmp/mobile-ai
# Check logs
adb logcat | grep mobile-ai
# Deploy via Xcode
# Run on device or simulator
# View logs
xcrun simctl spawn booted log stream --predicate 'subsystem contains "mobile-ai"'
# SSH to device
ssh user@192.168.1.100
# Run
./mobile-ai
# Check memory usage
htop
free -h
# Check CPU usage
top -p $(pidof mobile-ai)
Create benches/mobile_bench.rs:
use criterion::{criterion_group, criterion_main, Criterion};
use mobile_ai_orchestrator::*;
fn bench_mobile_query(c: &mut Criterion) {
c.bench_function("mobile_query_processing", |b| {
let mut orch = Orchestrator::new();
let query = Query::new("What is Rust?");
b.iter(|| {
orch.process(query.clone())
});
});
}
criterion_group!(benches, bench_mobile_query);
criterion_main!(benches);
Run on device (note that Criterion benches also need `criterion` as a dev-dependency and a `[[bench]]` entry with `harness = false` in Cargo.toml):
# Android
adb push target/aarch64-linux-android/release/deps/mobile_bench /data/local/tmp/
adb shell /data/local/tmp/mobile_bench --bench
# iOS (via Xcode Instruments)
# Profile > CPU Profiler
# Linux
./target/aarch64-unknown-linux-gnu/release/deps/mobile_bench --bench
# Start battery monitoring
adb shell dumpsys batterystats --reset
adb shell dumpsys batterystats --enable full-wake-history
# Run app for 1 hour
# ...
# Get battery stats
adb shell dumpsys batterystats > battery_stats.txt
# Analyze
# Look for "mobile-ai" in power consumption breakdownUse Xcode Instruments:
- Open Instruments
- Select "Energy Log"
- Profile your app
- Analyze power usage over time
# Android (valgrind not available, use Android Studio Profiler)
# iOS (Instruments - Leaks)
# Linux (valgrind)
valgrind --leak-check=full ./target/release/mobile-ai
Create .github/workflows/mobile-deploy.yml:
name: Mobile Deployment
on:
push:
tags:
- 'v*'
jobs:
build-android:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Setup Android NDK
uses: nttld/setup-ndk@v1
with:
ndk-version: r25c
- name: Install Rust
uses: actions-rs/toolchain@v1
with:
toolchain: stable
target: aarch64-linux-android
- name: Build
run: |
cargo build --target aarch64-linux-android --release
- name: Upload
uses: actions/upload-artifact@v3
with:
name: android-arm64
path: target/aarch64-linux-android/release/libmobile_ai_orchestrator.so
build-ios:
runs-on: macos-latest
steps:
- uses: actions/checkout@v3
- name: Install Rust
uses: actions-rs/toolchain@v1
with:
toolchain: stable
target: aarch64-apple-ios
- name: Build
run: |
cargo build --target aarch64-apple-ios --release
- name: Upload
uses: actions/upload-artifact@v3
with:
name: ios-arm64
path: target/aarch64-apple-ios/release/libmobile_ai_orchestrator.a
build-linux-mobile:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Install Cross
run: cargo install cross
- name: Build
run: |
cross build --target aarch64-unknown-linux-gnu --release
- name: Upload
uses: actions/upload-artifact@v3
with:
name: linux-arm64
path: target/aarch64-unknown-linux-gnu/release/mobile-ai
Issue: UnsatisfiedLinkError when loading library
- Fix: Ensure the library is in the correct `jniLibs/` directory
- Fix: Check that the ABI matches (arm64-v8a for aarch64)
Issue: Binary too large for APK
- Fix: Use ProGuard/R8 shrinking
- Fix: Build with `opt-level = "z"`
- Fix: Split APKs by ABI
Issue: Linker errors with static library
- Fix: Ensure all dependencies are included
- Fix: Link with `-lresolv` if needed
Issue: App Store rejection due to size
- Fix: Enable bitcode compilation (note: bitcode is deprecated as of Xcode 14)
- Fix: Use app thinning for per-device optimization
Issue: Binary won't run ("Exec format error")
- Fix: Ensure the target matches the device architecture
- Fix: Check the output of `file mobile-ai`
Issue: High CPU usage
- Fix: Profile with `perf record`
- Fix: Optimize hot paths identified in benchmarks
- Test on real devices: Android phone, iPhone, PinePhone
- Profile power consumption: Use platform tools to measure battery impact
- Optimize for size: Target <1MB for embedded systems
- Create platform packages: APK for Android, IPA for iOS, deb for Linux
- Set up CI/CD: Automated builds for all platforms
For development setup, see the main README.md. For advanced features, see ADVANCED_FEATURES.md. For performance tuning, see the benchmarking section in ADVANCED_FEATURES.md.