3D & AR for every platform.
Build 3D and AR experiences with the UI frameworks you already know. Same concepts, same simplicity — Android, iOS, Web, Desktop, TV, Flutter, React Native.
```kotlin
// Android — Jetpack Compose
Scene(modifier = Modifier.fillMaxSize()) {
    rememberModelInstance(modelLoader, "models/helmet.glb")?.let {
        ModelNode(modelInstance = it, scaleToUnits = 1.0f, autoAnimate = true)
    }
}
```

```swift
// iOS — SwiftUI
SceneView(environment: .studio) {
    ModelNode(named: "helmet.usdz")
        .scaleToUnits(1.0)
}
```

No engine boilerplate. No lifecycle callbacks. The runtime handles everything.
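The scaleToUnits parameter normalizes the loaded model to a given world-unit size. A minimal sketch of the idea in plain Kotlin — assuming, purely for illustration, that the factor is derived from the model's largest bounding-box extent (this is not SceneView's actual implementation, and `scaleToUnitsFactor` is a made-up helper name):

```kotlin
import kotlin.math.max

// Compute a uniform scale factor so the model's largest
// bounding-box extent becomes `targetUnits` world units.
fun scaleToUnitsFactor(extentX: Float, extentY: Float, extentZ: Float, targetUnits: Float): Float {
    val largest = max(extentX, max(extentY, extentZ))
    require(largest > 0f) { "Model must have a non-zero extent" }
    return targetUnits / largest
}

fun main() {
    // A model 2 units tall, fitted into 1 world unit → factor 0.5
    println(scaleToUnitsFactor(1.0f, 2.0f, 1.5f, 1.0f)) // 0.5
}
```

So `scaleToUnits = 1.0f` above means "fit the helmet into a 1-meter box", regardless of the source file's units.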
| Platform | Renderer | Framework | Status |
|---|---|---|---|
| Android | Filament | Jetpack Compose | Stable |
| Android TV | Filament | Compose TV | Alpha |
| iOS / macOS / visionOS | RealityKit | SwiftUI | Alpha |
| Web | Filament.js (WASM) | Kotlin/JS + WebXR | Alpha |
| Desktop | Software renderer | Compose Desktop | Alpha |
| Flutter | Native per platform | PlatformView | Alpha |
| React Native | Native per platform | Fabric | Alpha |
Android (3D + AR):

```kotlin
dependencies {
    implementation("io.github.sceneview:sceneview:3.4.0")   // 3D
    implementation("io.github.sceneview:arsceneview:3.4.0") // AR (includes 3D)
}
```

iOS / macOS / visionOS (Swift Package Manager):

```
https://github.com/SceneView/SceneViewSwift.git (from: 3.3.0)
```

Web (Kotlin/JS + Filament.js):

```kotlin
dependencies {
    implementation("io.github.sceneview:sceneview-web:3.4.0")
}
```

Desktop / Flutter / React Native: see samples/
Scene is a composable that renders a Filament 3D viewport; nodes are composables placed inside it.
```kotlin
Scene(
    modifier = Modifier.fillMaxSize(),
    engine = rememberEngine(),
    modelLoader = rememberModelLoader(engine),
    environment = rememberEnvironment(engine, "envs/studio.hdr"),
    cameraManipulator = rememberCameraManipulator()
) {
    // Model — async loaded, appears when ready
    rememberModelInstance(modelLoader, "models/helmet.glb")?.let {
        ModelNode(modelInstance = it, scaleToUnits = 1.0f, autoAnimate = true)
    }

    // Geometry — procedural shapes
    CubeNode(size = Size(0.2f))
    SphereNode(radius = 0.1f, position = Position(x = 0.5f))

    // Nesting — same as Column { Row { } }
    Node(position = Position(y = 1.0f)) {
        LightNode(apply = { type(LightManager.Type.POINT); intensity(50_000f) })
        CubeNode(size = Size(0.05f))
    }
}
```

| Node | What it does |
|---|---|
| ModelNode | glTF/GLB model with animations. isEditable = true for gestures. |
| LightNode | Sun, directional, point, or spot light. apply is a named parameter. |
| CubeNode / SphereNode / CylinderNode / PlaneNode | Procedural geometry |
| ImageNode | Image on a plane |
| ViewNode | Compose UI rendered as a 3D surface |
| MeshNode | Custom GPU mesh |
| Node | Group / pivot |
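Nesting composes transforms the way nested layouts compose offsets: a child's world position is its local position offset by every ancestor's. A small sketch of that composition in plain Kotlin — translation only, ignoring rotation and scale, and using illustrative stand-in types (Vec3 and SimpleNode are not SceneView classes):

```kotlin
// Illustrative 3D vector with component-wise addition.
data class Vec3(val x: Float = 0f, val y: Float = 0f, val z: Float = 0f) {
    operator fun plus(o: Vec3) = Vec3(x + o.x, y + o.y, z + o.z)
}

// Illustrative node: a local position plus an optional parent.
// Real SceneView nodes also carry rotation and scale.
class SimpleNode(val localPosition: Vec3 = Vec3(), val parent: SimpleNode? = null) {
    // World position = sum of local positions up the parent chain.
    val worldPosition: Vec3
        get() = (parent?.worldPosition ?: Vec3()) + localPosition
}

fun main() {
    val group = SimpleNode(Vec3(y = 1.0f))                 // Node(position = Position(y = 1.0f))
    val cube = SimpleNode(Vec3(x = 0.5f), parent = group)  // child CubeNode
    println(cube.worldPosition)                            // Vec3(x=0.5, y=1.0, z=0.0)
}
```

Moving the parent Node therefore moves the light and cube together, exactly as moving a Column moves its children.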
ARScene is Scene with ARCore. The camera follows real-world tracking.
```kotlin
var anchor by remember { mutableStateOf<Anchor?>(null) }

ARScene(
    modifier = Modifier.fillMaxSize(),
    planeRenderer = true,
    onSessionUpdated = { _, frame ->
        if (anchor == null) {
            anchor = frame.getUpdatedPlanes()
                .firstOrNull { it.type == Plane.Type.HORIZONTAL_UPWARD_FACING }
                ?.let { frame.createAnchorOrNull(it.centerPose) }
        }
    }
) {
    anchor?.let {
        AnchorNode(anchor = it) {
            ModelNode(modelInstance = helmet, scaleToUnits = 0.5f)
        }
    }
}
```

Plane detected → anchor set → Compose recomposes → model appears. Clear the anchor → node removed. AR state is just Kotlin state.
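The gating pattern in onSessionUpdated — set state once on the first suitable plane, then let recomposition do the rest — can be sketched without any AR dependency. FakePlane and PlaneType below are stand-in types, not ARCore's:

```kotlin
// Stand-in types for ARCore's Plane (illustrative only).
enum class PlaneType { HORIZONTAL_UPWARD_FACING, VERTICAL }
data class FakePlane(val type: PlaneType)

var detectedPlane: FakePlane? = null

// Mirrors the onSessionUpdated callback: latch onto the first
// horizontal upward-facing plane, then ignore later updates.
fun onUpdate(updatedPlanes: List<FakePlane>) {
    if (detectedPlane == null) {
        detectedPlane = updatedPlanes.firstOrNull { it.type == PlaneType.HORIZONTAL_UPWARD_FACING }
    }
}

fun main() {
    onUpdate(listOf(FakePlane(PlaneType.VERTICAL)))                  // no match yet
    onUpdate(listOf(FakePlane(PlaneType.HORIZONTAL_UPWARD_FACING)))  // latched
    println(detectedPlane != null) // true
}
```

In the real composable the latched value is Compose state, so the null → non-null transition is what triggers the AnchorNode to appear.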
| Node | What it does |
|---|---|
| AnchorNode | Follows a real-world anchor |
| AugmentedImageNode | Tracks a detected image |
| AugmentedFaceNode | Face mesh overlay |
| CloudAnchorNode | Persistent cross-device anchor |
| StreetscapeGeometryNode | Geospatial streetscape mesh |
Native Swift Package built on RealityKit. 17 node types.
```swift
SceneView(environment: .studio) {
    ModelNode(named: "helmet.usdz").scaleToUnits(1.0)
    GeometryNode.cube(size: 0.1, color: .blue).position(x: 0.5)
    LightNode.directional(intensity: 1000)
}
.cameraControls(.orbit)
```

AR on iOS:

```swift
ARSceneView(planeDetection: .horizontal) { position, arView in
    GeometryNode.cube(size: 0.1, color: .blue)
        .position(position)
}
```

Install: https://github.com/SceneView/SceneViewSwift.git (SPM, from: 3.3.0)
Each platform uses its native renderer. Shared logic lives in KMP.
```
sceneview-core (Kotlin Multiplatform)
├── math, collision, geometry, physics, animation
│
├── sceneview (Android)      → Filament + Jetpack Compose
├── arsceneview (Android)    → ARCore
├── SceneViewSwift (Apple)   → RealityKit + SwiftUI
├── sceneview-web (Web)      → Filament.js + WebXR
└── sceneview-desktop (JVM)  → Compose Desktop
```
| Sample | Platform | Run |
|---|---|---|
| samples/android-demo | Android | ./gradlew :samples:android-demo:assembleDebug |
| samples/android-tv-demo | Android TV | ./gradlew :samples:android-tv-demo:assembleDebug |
| samples/ios-demo | iOS | Open in Xcode |
| samples/web-demo | Web | ./gradlew :samples:web-demo:jsBrowserRun |
| samples/desktop-demo | Desktop | ./gradlew :samples:desktop-demo:run |
| samples/flutter-demo | Flutter | cd samples/flutter-demo && flutter run |
| samples/react-native-demo | React Native | See README |
SceneView is on the MCP Registry — any AI assistant can use it to generate correct 3D/AR code.
```
npx sceneview-mcp
```

The MCP server provides API reference, code samples, setup guides, validation, and migration tools for all platforms.