= NeuroPhone - Neurosymbolic AI Android Application
:toc:
:toclevels: 3
[TIP]
====
AI-Assisted Install: just tell any AI: "Set up NeuroPhone on my Android from https://github.com/hyperpolymath/neurophone"
====
== AI-Assisted Installation (Recommended)
=== Just Say It
You don't need to read this README. Just say this to any AI assistant:

----
Set up NeuroPhone on my Android from https://github.com/hyperpolymath/neurophone
----

That's it. You don't type commands, install packages, or configure anything. The AI fetches this repo, reads the installation guide inside it, figures out your device, and does everything. You just answer a few questions and confirm the privacy notice.

The URL is the key: it points the AI to this repo, where docs/AI_INSTALLATION_GUIDE.adoc contains the complete step-by-step recipe. Any AI that can read a URL and run commands (or generate commands for you to paste) can do this.
The AI handles all of this automatically:

- Checking your device and storage
- Installing Termux (if needed), Rust, Git, and dependencies
- Cloning and building NeuroPhone for your specific hardware
- Downloading the right LLM model for your device's RAM/storage
- Creating your configuration with sensible defaults
- Running the setup wizard
- Giving you a working NeuroPhone
=== Other Ways to Say It
If your AI already knows about NeuroPhone (e.g. it can search the web), even shorter versions work:
- "Make my phone a NeuroPhone"
- "Install NeuroPhone on my Android"
- "Turn my Oppo Reno 13 into a NeuroPhone"
If it doesn't know the project, just include the URL:

- "Set up https://github.com/hyperpolymath/neurophone on my phone"
- "I want neurosymbolic AI on my phone: install from https://github.com/hyperpolymath/neurophone"
=== What You'll Be Asked
Your AI will ask you:
- What device? (so the AI picks the right thread count and model size)
- Privacy confirmation: what sensors are used and how data stays on-device
- Cloud fallback? (optional Claude API for complex queries; the default is local-only)

That's it. Everything else is automatic. No package managers, no build flags, no config files.
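For illustration, the device question drives a simple sizing step. The sketch below shows the kind of heuristic an installer could apply; the RAM thresholds, function names, and return strings are assumptions for illustration, not the installer's actual logic:

```rust
// Hypothetical sizing heuristic: pick an LLM model and a thread count
// from device RAM and CPU core count. Thresholds are illustrative.
fn choose_model(ram_gb: u32) -> &'static str {
    if ram_gb >= 8 {
        "Llama 3.2 3B Instruct (Q4_K_M)"
    } else {
        "Llama 3.2 1B Instruct (Q4_K_M, ~700MB)"
    }
}

fn choose_threads(cores: u32) -> u32 {
    // leave headroom for the UI and sensor loops
    (cores / 2).max(1)
}

fn main() {
    // e.g. an Oppo Reno 13: 12 GB RAM, 8 cores
    println!("{} on {} threads", choose_model(12), choose_threads(8));
}
```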
=== Privacy & Security Notice
[IMPORTANT]
====
*What NeuroPhone does:* process sensor data on-device, entirely locally by default.

*What NeuroPhone does NOT do:* send sensor data off the device, unless you explicitly enable the cloud fallback.

You control everything: the cloud fallback toggle, and all config in config/default.toml.
====
=== After Install
Once your AI finishes setup, just use it:
[source,shell]
----
neurophone                                      # Start NeuroPhone
neurophone query "What am I doing right now?"   # Ask a question
neurophone status                               # Check system status
----

=== Uninstall
Tell your AI: "Uninstall NeuroPhone from my phone"
=== Troubleshooting
Tell your AI what went wrong; it can read the troubleshooting docs in this repo. Common issues:
|===
| Problem | Solution

| "Termux not found"
| AI will guide you to install from F-Droid (NOT Google Play)

| Build takes too long
| Normal for first build (5-10 min). AI adjusts thread count for your device.

| "Model download failed"
| AI will try alternate download methods or suggest a manual download

| "LSM crashes"
| Low RAM. AI will reduce model size or neuron count for your device.
|===
For manual installation without AI assistance, see the Getting Started section below.
== What This Is
neurophone is a complete Android application for neurosymbolic AI on mobile devices. It combines spiking neural networks with large language models for on-device intelligence.
[IMPORTANT]
====
This is an application, NOT a library. For the underlying platform-agnostic routing library, see mobile-ai-orchestrator.
====
== Target Device
Primary target: Oppo Reno 13 (MediaTek Dimensity 8350)
- 12GB RAM
- NPU acceleration available
- Android 14+
Also compatible with Android 8.0+ devices with 4GB+ RAM.
== Core Purpose
....
+-------------------------------------------------------------------+
|                           NEUROPHONE                              |
|                        (THIS APPLICATION)                         |
+-------------------------------------------------------------------+
|                                                                   |
|   +-------------+       +-------------+       +-------------+     |
|   |   Sensors   |------>|     LSM     |------>|   Bridge    |     |
|   | Accel/Gyro  |       |  (spiking   |       |   (state    |     |
|   | Light/Prox  |       |  reservoir) |       |  encoding)  |     |
|   +-------------+       +-------------+       +------+------+     |
|                                                      |            |
|                                                      v            |
|   +-------------+       +-------------+       +-------------+     |
|   |   Output    |<------|     ESN     |------>|     LLM     |     |
|   |  (actions)  |       |    (echo    |       | (Llama 3.2) |     |
|   |             |       |  reservoir) |       |             |     |
|   +-------------+       +-------------+       +-------------+     |
|                                                                   |
|   Processes: Sensor data -> Neural interpretation -> LLM query    |
|   Runs: ON THE DEVICE, with cloud fallback                        |
|                                                                   |
+-------------------------------------------------------------------+
                                 |
                                 v  (cloud fallback)
                      +---------------------+
                      |     Claude API      |
                      |  (complex queries)  |
                      +---------------------+
....

== Key Differentiators
|===
| Feature | This App | Typical Mobile AI Apps

| Neural Processing
| On-device LSM + ESN (spiking networks)
| Cloud-only or simple TFLite

| Sensor Integration
| Real-time sensor → neural → LLM pipeline
| Separate sensor and AI components

| LLM
| Local Llama 3.2 + Claude fallback
| Cloud-only

| Latency
| <100ms local inference
| 500ms+ network round-trip

| Privacy
| Sensor data stays on device
| Often sent to cloud
|===
== Architecture
=== Rust Crates (8 modules)
|===
| Crate | Purpose | Key Features

|
| Liquid State Machine
| 512 spiking neurons, 3D grid, 1kHz processing

|
| Echo State Network
| 300-neuron reservoir, ridge regression

|
| Neural → Symbolic
| State encoding, context generation

|
| Phone Sensors
| Accel, gyro, magnetometer, light, proximity

|
| Local Inference
| Llama 3.2 via llama.cpp, streaming

|
| Cloud Fallback
| Claude API, retry logic, context injection

|
| Orchestration
| Main coordinator, query routing

|
| Android JNI
| Kotlin ↔ Rust bridge
|===
=== Android App (Kotlin)
----
android/
├── app/src/main/
│   ├── java/ai/neurophone/
│   │   ├── MainActivity.kt
│   │   ├── NativeLib.kt       # JNI interface
│   │   ├── SensorManager.kt   # Sensor collection
│   │   └── ui/                # Compose UI
│   └── res/
└── build.gradle.kts
----

== Components
=== LSM (Liquid State Machine)
Spiking neural network for temporal sensor processing:
- 3D grid: 8×8×8 = 512 Leaky Integrate-and-Fire neurons
- Distance-dependent connectivity
- Excitatory/inhibitory balance
- Real-time spike processing at 1kHz
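For intuition, a single Leaky Integrate-and-Fire neuron of the kind filling this grid can be sketched as follows. This is a minimal illustration with made-up parameter values, not the LSM crate's real API:

```rust
// Minimal Leaky Integrate-and-Fire neuron: the membrane potential decays
// toward rest, integrates input current, and emits a spike (then resets)
// when it crosses threshold. All constants here are illustrative.
struct LifNeuron {
    v: f32,        // membrane potential (mV)
    tau: f32,      // membrane time constant (ms)
    v_rest: f32,   // resting potential
    v_thresh: f32, // spike threshold
    v_reset: f32,  // post-spike reset potential
}

impl LifNeuron {
    fn new() -> Self {
        Self { v: -65.0, tau: 20.0, v_rest: -65.0, v_thresh: -50.0, v_reset: -65.0 }
    }

    /// Advance one time step (dt in ms; 1.0 matches the 1kHz loop).
    /// Returns true when the neuron spikes.
    fn step(&mut self, input_current: f32, dt: f32) -> bool {
        // Euler integration of dv/dt = (v_rest - v)/tau + I
        self.v += dt * ((self.v_rest - self.v) / self.tau + input_current);
        if self.v >= self.v_thresh {
            self.v = self.v_reset;
            true
        } else {
            false
        }
    }
}

fn main() {
    let mut n = LifNeuron::new();
    let mut spikes = 0;
    for _ in 0..100 {
        if n.step(2.0, 1.0) {
            spikes += 1;
        }
    }
    println!("spikes in 100 ms: {spikes}");
}
```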
=== ESN (Echo State Network)
Reservoir for state prediction:
- 300-neuron reservoir
- Spectral radius: 0.95
- Leaky integrator dynamics
- Ridge regression output
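The leaky-integrator update these bullets describe is x' = (1 − a)·x + a·tanh(W_in·u + W·x). A toy sketch of one reservoir step (sizes and weights are illustrative; the real reservoir has 300 neurons with spectral radius 0.95):

```rust
// One ESN reservoir update with leaky-integrator dynamics:
//   x' = (1 - leak) * x + leak * tanh(W_in * u + W * x)
fn esn_step(x: &[f64], u: &[f64], w_in: &[Vec<f64>], w: &[Vec<f64>], leak: f64) -> Vec<f64> {
    (0..x.len())
        .map(|i| {
            // input drive plus recurrent drive for neuron i
            let drive: f64 = w_in[i].iter().zip(u).map(|(a, b)| a * b).sum::<f64>()
                + w[i].iter().zip(x).map(|(a, b)| a * b).sum::<f64>();
            (1.0 - leak) * x[i] + leak * drive.tanh()
        })
        .collect()
}

fn main() {
    // 3-neuron toy reservoir driven by a single constant input
    let w_in = vec![vec![0.5], vec![-0.3], vec![0.8]];
    let w = vec![
        vec![0.0, 0.4, -0.2],
        vec![0.3, 0.0, 0.1],
        vec![-0.1, 0.2, 0.0],
    ];
    let mut x = vec![0.0; 3];
    for _ in 0..10 {
        x = esn_step(&x, &[1.0], &w_in, &w, 0.3);
    }
    println!("reservoir state: {x:?}");
}
```

The tanh nonlinearity and the leak term keep the state bounded; the ridge-regression readout is then trained on these states.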
=== Sensors
Phone sensor integration:
- Accelerometer, gyroscope, magnetometer
- Light and proximity sensors
- IIR filtering (low-pass, high-pass)
- Feature extraction at 50Hz
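As a sketch of the IIR filtering step, here is a one-pole low-pass filter of the kind used to smooth raw sensor samples before feature extraction. The alpha value is illustrative, not the crate's actual configuration:

```rust
// One-pole IIR low-pass filter: y += alpha * (x - y).
// Small alpha = heavier smoothing; alpha = 1.0 passes input through.
struct LowPass {
    alpha: f32,
    y: f32,
}

impl LowPass {
    fn new(alpha: f32) -> Self {
        Self { alpha, y: 0.0 }
    }

    fn filter(&mut self, x: f32) -> f32 {
        self.y += self.alpha * (x - self.y);
        self.y
    }
}

fn main() {
    let mut lp = LowPass::new(0.2);
    // A noisy accelerometer axis hovering around gravity (~9.8 m/s^2)
    let samples = [9.5, 10.2, 9.9, 9.6, 10.1, 9.8, 9.7, 10.0];
    for s in samples {
        println!("{:.3}", lp.filter(s));
    }
}
```

A high-pass stage is the complement (input minus the low-passed signal), isolating motion from the gravity baseline.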
=== Bridge
Neural → Symbolic translation:

- Integrates LSM + ESN states
- Generates natural language context for LLMs
- Temporal pattern detection
- Salience and urgency computation
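The bridge step can be sketched as collapsing neural activity into a salience score and the `[NEURAL_STATE] ... [/NEURAL_STATE]` context string shown later in the Usage section. The inputs, weights, and thresholds below are illustrative assumptions, not the bridge crate's real API:

```rust
// Hypothetical neural -> symbolic bridge: combine LSM spiking activity
// and ESN prediction error into a salience score, then describe it in
// natural language for the LLM prompt. Weights/thresholds are assumptions.
fn salience(lsm_spike_rate: f32, esn_novelty: f32) -> f32 {
    // weight recent spiking activity and prediction novelty equally
    0.5 * lsm_spike_rate + 0.5 * esn_novelty
}

fn neural_context(lsm_spike_rate: f32, esn_novelty: f32) -> String {
    let s = salience(lsm_spike_rate, esn_novelty);
    let desc = if s > 0.7 {
        "high activity, likely vigorous motion"
    } else if s > 0.3 {
        "moderate activity"
    } else {
        "device mostly at rest"
    };
    format!("[NEURAL_STATE] Description: {desc} (salience {s:.2}) [/NEURAL_STATE]")
}

fn main() {
    println!("{}", neural_context(0.9, 0.8));
    println!("{}", neural_context(0.1, 0.05));
}
```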
=== Local LLM
On-device language model:
- Llama 3.2 1B/3B via llama.cpp
- Optimized for Dimensity 8350
- Q4_K_M quantization (~700MB)
- Neural context injection
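Neural context injection means the bridge's state description is prepended to the user's question before it reaches the local model. A sketch of that prompt assembly follows; the template text is an assumption for illustration, not NeuroPhone's or llama.cpp's exact format:

```rust
// Hypothetical prompt assembly: inject the bridge's neural-state block
// ahead of the user's query so the local model can condition on it.
fn build_prompt(neural_context: &str, user_query: &str) -> String {
    format!(
        "You are an on-device assistant with access to live sensor state.\n\
         {neural_context}\n\
         User: {user_query}\n\
         Assistant:"
    )
}

fn main() {
    let ctx = "[NEURAL_STATE] Description: walking, moderate light [/NEURAL_STATE]";
    println!("{}", build_prompt(ctx, "What am I doing right now?"));
}
```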
=== Claude Client
Cloud fallback for complex queries:
- Messages API integration
- Automatic retry with exponential backoff
- Hybrid inference (local/cloud decision)
- Neural state context injection
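The retry behavior can be sketched as the standard exponential-backoff pattern: retry a failed request with doubling delays until a cap. The function name, delay schedule, and synchronous shape below are illustrative (the real client is async), not the crate's actual API:

```rust
use std::{thread, time::Duration};

// Generic retry-with-exponential-backoff: rerun `attempt` on failure,
// sleeping 250ms, 500ms, 1s, ... between tries, up to `max_retries`.
fn retry_with_backoff<T, E>(
    mut attempt: impl FnMut() -> Result<T, E>,
    max_retries: u32,
) -> Result<T, E> {
    let mut delay_ms: u64 = 250;
    let mut tries = 0;
    loop {
        match attempt() {
            Ok(v) => return Ok(v),
            Err(e) if tries >= max_retries => return Err(e),
            Err(_) => {
                thread::sleep(Duration::from_millis(delay_ms));
                delay_ms *= 2;
                tries += 1;
            }
        }
    }
}

fn main() {
    // Simulate an API that fails twice, then succeeds.
    let mut calls = 0;
    let result: Result<&str, &str> = retry_with_backoff(
        || {
            calls += 1;
            if calls < 3 { Err("transient error") } else { Ok("response") }
        },
        5,
    );
    println!("{result:?} after {calls} calls");
}
```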
== Getting Started
=== Prerequisites
- Rust 1.75+
- Android NDK 26+
- Android Studio (for app development)
- Oppo Reno 13 or Android 8.0+ device
=== Build
[source,shell]
----
# Clone
git clone https://github.com/hyperpolymath/neurophone
cd neurophone

# Setup
./scripts/setup.sh

# Build native libraries for Android
./scripts/build-android.sh

# Open android/ in Android Studio
----

=== Download LLM Model

[source,shell]
----
# Download Llama 3.2 1B Instruct (Q4_K_M, ~700MB)
# From: https://huggingface.co/bartowski/Llama-3.2-1B-Instruct-GGUF

# Push to device
adb push llama-3.2-1b-instruct-q4_k_m.gguf /data/local/tmp/
----

=== Configure
Set the Claude API key (for cloud fallback):

[source,shell]
----
export ANTHROPIC_API_KEY="your-api-key"
----

Or in config/default.toml:

[source,toml]
----
[claude]
api_key = "your-api-key"
model = "claude-sonnet-4-20250514"

[llm]
model_path = "/data/local/tmp/llama-3.2-1b-q4_k_m.gguf"
n_threads = 4
context_size = 2048
----

== Usage
=== Kotlin API
[source,kotlin]
----
// Initialize
NativeLib.init()
NativeLib.start()

// Query with neural context
val response = NativeLib.query("What's my current activity?", preferLocal = true)

// Get raw neural state
val context = NativeLib.getNeuralContext()
// Returns: [NEURAL_STATE] Description: ... [/NEURAL_STATE]

// Cleanup
NativeLib.stop()
----

=== Rust API
[source,rust]
----
use neurophone_core::{NeuroSymbolicSystem, SystemConfig};

let mut system = NeuroSymbolicSystem::with_config(config)?;
let _rx = system.start().await?;

// Send sensor data
system.send_sensor(reading).await?;

// Query
let response = system.query("What's happening?", true).await?;

// Get neural context
let context = system.get_neural_context().await;
----

== Performance
Optimized for Oppo Reno 13 (Dimensity 8350):
|===
| Component | Latency | Notes

| Sensor processing
| <1ms
| 50Hz loop

| LSM step
| <2ms
| 512 neurons

| ESN step
| <1ms
| 300 neurons

| Bridge integration
| <1ms
| Per step

| Local LLM (1B)
| 50-100ms/token
| Q4 quantized

| Claude API
| 500-2000ms
| Network dependent
|===
== Relationship to mobile-ai-orchestrator
This application and mobile-ai-orchestrator are complementary:
|===
| | neurophone | mobile-ai-orchestrator

| Type
| Application
| Library

| Platform
| Android-specific
| Platform-agnostic

| Focus
| Sensor → Neural → LLM pipeline
| AI routing decisions

| Neural
| LSM, ESN (spiking networks)
| MLP, Reservoir (routing)

| Use Case
| Run on phone, process sensors
| Embed in any app for routing
|===
Future integration: neurophone may adopt mobile-ai-orchestrator for its routing decisions, combining:

- neurophone's sensor processing + neural interpretation
- mobile-ai-orchestrator's intelligent local/cloud routing
== Related Projects
|===
| Project | Relationship | Description

| mobile-ai-orchestrator
| Complementary library
| Platform-agnostic AI routing (may integrate)

|
| Related
| Conversation context preservation

|
| Related
| Safety-critical programming concepts
|===
== RSR Compliance
Bronze-level RSR (Rhodium Standard Repository) compliance:
- Type safety (Rust)
- Memory safety (ownership model)
- Comprehensive documentation
- Build automation
- Security policy
== Development
[source,shell]
----
# Run tests
cargo test

# Build for Android
./scripts/build-android.sh

# Generate docs
cargo doc --open
----

== Contributing
Contributions welcome! See CONTRIBUTING.md.
== License
Palimpsest-MPL-1.0 License - See LICENSE file
== Citation
[source,bibtex]
----
@software{neurophone_2025,
  author = {Jewell, Jonathan D.A.},
  title  = {NeuroPhone: Neurosymbolic AI Android Application},
  year   = {2025},
  url    = {https://github.com/hyperpolymath/neurophone},
  note   = {On-device LSM + ESN + LLM}
}
----

== Contact
- Author: Jonathan D.A. Jewell
- Email: hyperpolymath@protonmail.com

Android Application • On-Device Neural Processing • Spiking Networks • Local LLM
== Architecture Map
See TOPOLOGY.md for a visual architecture map and completion dashboard.