Android Utility for Runtime AI
A native experimental environment for mobile devices. Explore the latent space without the safety veil. Pure inference. Zero telemetry. Physics-grade reality.
Built on the Google AI Edge LiteRT Stack.
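For orientation, here is a minimal sketch of what single-shot generation on this stack can look like, using the MediaPipe LLM Inference API from Google AI Edge. The model path, token limit, and function name are illustrative assumptions, and the exact options surface may differ between library versions.

```kotlin
import android.content.Context
import com.google.mediapipe.tasks.genai.llminference.LlmInference

// Sketch of on-device generation via the MediaPipe LLM Inference API (Google AI Edge).
// The model path below is a placeholder, not the app's actual bundle location.
fun runLocalInference(context: Context, prompt: String): String {
    val options = LlmInference.LlmInferenceOptions.builder()
        .setModelPath("/data/local/tmp/llm/model.task") // placeholder path to a .task model
        .setMaxTokens(512)                              // cap on combined input + output tokens
        .build()

    val llm = LlmInference.createFromOptions(context, options)
    return llm.generateResponse(prompt)                 // blocking, single-shot generation
}
```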
Refusal vectors have been orthogonally ablated. The model prioritizes instruction following over safety alignment boundaries.
Runs entirely offline. No API keys, nothing sent to the cloud. Perfect for analyzing sensitive datasets or red-teaming internal protocols.
No WebView wrappers. Built with Jetpack Compose and C++ JNI bindings for maximum inference speed.
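The kind of JNI bridge this implies is sketched below; the library name, object, and native method are hypothetical placeholders, not the app's actual symbols.

```kotlin
// Hypothetical Kotlin-to-C++ bridge. "llm_bridge" and nativeGenerate are illustrative names.
object NativeLlmBridge {
    init {
        // Loads libllm_bridge.so, assumed to be built with the NDK/CMake.
        System.loadLibrary("llm_bridge")
    }

    // Implemented on the C++ side as a JNI function matching this signature.
    external fun nativeGenerate(prompt: String, maxTokens: Int): String
}
```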
Advanced settings: adjust Temperature, Top-K, Top-P, and the System Prompt directly.
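For context on what these knobs do, the sketch below is an illustrative next-token sampler (not the app's actual decoding loop) showing how Temperature, Top-K, and Top-P interact.

```kotlin
import kotlin.math.exp
import kotlin.random.Random

// Teaching sketch: sample one token index from raw logits using
// temperature scaling, then Top-K truncation, then Top-P (nucleus) truncation.
fun sampleToken(
    logits: FloatArray,
    temperature: Float = 0.8f,
    topK: Int = 40,
    topP: Float = 0.95f,
    rng: Random = Random.Default
): Int {
    // Temperature: lower values sharpen the distribution, higher values flatten it.
    val scaled = logits.map { it / temperature }

    // Numerically stable softmax over the scaled logits.
    val maxLogit = scaled.maxOrNull() ?: 0f
    val exps = scaled.map { exp((it - maxLogit).toDouble()) }
    val sum = exps.sum()
    val probs = exps.map { it / sum }

    // Top-K: keep only the K most probable tokens.
    val ranked = probs.withIndex().sortedByDescending { it.value }.take(topK)

    // Top-P: keep the smallest prefix whose cumulative probability reaches topP.
    val nucleus = mutableListOf<IndexedValue<Double>>()
    var cumulative = 0.0
    for (candidate in ranked) {
        nucleus.add(candidate)
        cumulative += candidate.value
        if (cumulative >= topP) break
    }

    // Renormalize the truncated distribution and draw one token index from it.
    val total = nucleus.sumOf { it.value }
    var draw = rng.nextDouble() * total
    for (candidate in nucleus) {
        draw -= candidate.value
        if (draw <= 0.0) return candidate.index
    }
    return nucleus.last().index
}
```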
The interface is built for low-light environments. OLED-optimized blacks.
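A minimal sketch of what OLED-optimized theming amounts to in Compose, assuming Material 3: pure #000000 background and surface colors let OLED pixels switch off entirely. The theme and color-scheme names are placeholders.

```kotlin
import androidx.compose.material3.MaterialTheme
import androidx.compose.material3.darkColorScheme
import androidx.compose.runtime.Composable
import androidx.compose.ui.graphics.Color

// Hypothetical theme: true-black background and surfaces for low-light, OLED displays.
private val OledDarkColors = darkColorScheme(
    background = Color.Black,
    surface = Color.Black
)

@Composable
fun OledTheme(content: @Composable () -> Unit) {
    MaterialTheme(colorScheme = OledDarkColors, content = content)
}
```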