━━━ What is ResonanceNet? ━━━
ResonanceNet is a peer-to-peer intelligence currency. Instead of wasting energy on SHA-256 hash puzzles, miners train a shared neural network. Every block makes the AI measurably smarter. The consensus rule is a single mathematical inequality:
val_loss(checkpoint_new, D_val) < val_loss(checkpoint_parent, D_val)
A block is valid if and only if the model checkpoint it contains achieves strictly lower validation loss than the parent block. This is objective, verifiable by any node, and unforgeable — there is no shortcut to reducing loss other than genuine training.
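The validity rule can be sketched in a few lines of code. This is an illustrative sketch, not the project's actual C++ implementation; the function name and the sample loss values are hypothetical:

```python
def improves(parent_val_loss: float, new_val_loss: float) -> bool:
    """Consensus rule: a child checkpoint is valid only if it achieves
    strictly lower loss than its parent on the canonical set D_val."""
    return new_val_loss < parent_val_loss

# A chain of checkpoints is valid only if val_loss strictly decreases
# block by block (loss values here are made up for illustration).
losses = [2.41, 2.38, 2.35, 2.35]
chain_valid = all(improves(a, b) for a, b in zip(losses, losses[1:]))
# chain_valid is False: the last block does not strictly improve
```

The strictness matters: replaying the parent checkpoint unchanged (equal loss) is not a valid block, so every accepted block represents genuine progress.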
The result: a blockchain whose security grows alongside a continuously improving AI model — a living, permissionless intelligence that belongs to no one and serves everyone.
━━━ Why This Matters ━━━
The AI Crisis:
- Total AI sector market cap: $41 trillion (March 2026)
- OpenAI projects $74 billion operating losses in 2028 alone
- 95% of enterprise GenAI investments produce zero measurable return
- $527 billion in AI infrastructure capex planned for 2026
- DeepSeek erased $600 billion in market cap in a single day (January 2025)
- Centralized AI is locked behind APIs, subject to censorship, controlled by corporations
The Bitcoin Mining Problem:
- Post-halving margins at historic lows
- Miners pivoting GPU infrastructure to AI compute via centralized contracts
- 100% of PoW energy produces nothing but security
ResonanceNet solves both: Mining IS training. 100% useful compute. No wasted energy. Decentralized AI that runs on any device.
━━━ Key Specifications ━━━
| Algorithm | Proof-of-Training (PoT) |
| Hashing | Keccak-256d (original Keccak, 0x01 padding, Ethereum-compatible) |
| Signatures | Ed25519 (128-bit security, deterministic signing) |
| Total Supply | 21,000,000 RNT |
| Block Reward | 50 RNT + performance bonus (up to 2x for larger improvements) |
| Halving | Adaptive (based on effective circulating supply, not block count) |
| Block Time | Organic (determined by training difficulty, no fixed target) |
| P2P Port | 9555 |
| RPC Port | 9554 |
| Address Format | Bech32m — rnt1... (mainnet), trnt1... (testnet) |
| Min Denomination | 1 resonance = 0.00000001 RNT |
| P2P Transport | ChaCha20-Poly1305 encrypted, SOCKS5/Tor native |
| Lightning | Native from v0.1 (instant payments) |
| GPU Backends | CUDA, Metal, Vulkan (mine on any hardware) |
━━━ The Neural Architecture ━━━
ResonanceNet V5 is a recurrent neural architecture that replaces self-attention with three mechanisms:
1. Multi-Scale Causal Convolution
Five parallel branches with kernel sizes {3, 7, 15, 31, 63}. Captures local patterns at multiple time scales. No quadratic cost.
2. MinGRU with Log-Domain Parallel Scan
Minimal Gated Recurrent Units computed via Blelloch parallel prefix sum in log-space. O(n) training, O(1) inference per token. Numerical stability through log-domain computation.
3. Slot Memory (64 learnable slots)
Fixed set of key-value slots providing global context via cross-attention. Replaces O(n²) self-attention with O(n·64). Slots learn to represent persistent concepts: a compressed world model.
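The minGRU update is a linear recurrence, h_t = (1 − z_t)·h_{t−1} + z_t·h̃_t, which is what makes a parallel scan possible. The sketch below (illustrative names, single scalar channel, plain Python) shows the sequential reference and an equivalent log-domain closed form; for readability the closed form is expanded quadratically rather than as the O(n log n) Blelloch scan the text describes:

```python
import math

def mingru_step(h_prev, z, h_tilde):
    # minGRU update: h_t = (1 - z_t) * h_{t-1} + z_t * h~_t
    return (1.0 - z) * h_prev + z * h_tilde

def mingru_sequential(zs, h_tildes, h0=0.0):
    # Reference O(n) sequential scan over one scalar channel.
    h, out = h0, []
    for z, ht in zip(zs, h_tildes):
        h = mingru_step(h, z, ht)
        out.append(h)
    return out

def mingru_log_domain(zs, h_tildes, h0=0.0):
    # Same recurrence in closed form: products of (1 - z) are accumulated
    # as sums of logs for numerical stability (assumes z in (0, 1)).
    prefix, s = [], 0.0
    for z in zs:
        s += math.log1p(-z)      # log(1 - z_t)
        prefix.append(s)
    out = []
    for t in range(len(zs)):
        h = math.exp(prefix[t]) * h0
        for k in range(t + 1):
            # input at step k, decayed through steps k+1 .. t
            h += math.exp(prefix[t] - prefix[k]) * zs[k] * h_tildes[k]
        out.append(h)
    return out
```

Because each contribution is a product of per-step decays, the prefix sums of log(1 − z) can be computed with a parallel scan, which is where the O(n) training cost comes from.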
Data Efficiency:
- 33.6M parameter model trained on 86M tokens reaches perplexity 189.37
- A comparable transformer requires ~33.5 billion tokens for an equivalent result
- Conservative estimate: 388x more data-efficient
- Real measurement (data deficit at 43M tokens): 780x
- Theoretical range: 500–1000x depending on data quality
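The conservative figure follows directly from the two token counts quoted above; with the rounded numbers the ratio comes out near 390, consistent with the 388x figure computed from the exact counts:

```python
# Token counts as quoted in the text (rounded)
transformer_tokens = 33.5e9   # tokens a comparable transformer would need
resonancenet_tokens = 86e6    # tokens the 33.6M model was trained on

efficiency_ratio = transformer_tokens / resonancenet_tokens
# efficiency_ratio is roughly 390 with these rounded inputs
```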
Inference:
| Property | Transformer | ResonanceNet |
| State per token | O(n) KV cache | O(1) hidden state |
| Inference compute | O(n²) attention | O(1) recurrence |
| Memory at 100K context | ~8 GB (7B model) | ~2 MB |
| Memory at 3B params | ~6 GB KV cache | ~96 MB |
| Mobile inference | Impractical | Native (2-8K tok/s) |
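The asymmetry in the table can be made concrete with a back-of-the-envelope memory calculation. All parameter values below are illustrative assumptions (a generic 7B-class transformer layout); exact figures depend on precision and head layout, and grouped-query attention shrinks the cache several-fold, but the gap in scaling behaviour is the point:

```python
def kv_cache_bytes(n_layers, n_kv_heads, head_dim, seq_len, bytes_per_elem=2):
    # Per layer the cache stores K and V: each seq_len x n_kv_heads x head_dim.
    # Grows linearly with context length.
    return 2 * n_layers * seq_len * n_kv_heads * head_dim * bytes_per_elem

def recurrent_state_bytes(n_layers, d_model, bytes_per_elem=2):
    # A recurrent model carries one fixed-size hidden state per layer,
    # independent of context length.
    return n_layers * d_model * bytes_per_elem

# Hypothetical 7B-class transformer: 32 layers, 32 KV heads of dim 128, fp16
kv = kv_cache_bytes(32, 32, 128, 100_000)     # tens of GB at 100K context
# Hypothetical recurrent model of similar depth and width
rnn = recurrent_state_bytes(32, 4096)         # a few hundred KB, always
```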
━━━ How Mining Works ━━━
1. Miner downloads the tip checkpoint from the network
2. Miner selects ANY training data of their choice (unconstrained)
3. Miner trains the model for N steps on their GPU
4. Miner evaluates on the canonical validation set (D_val)
5. If val_loss improved → construct block, sign with Ed25519, broadcast
6. Network verifies by recomputing val_loss (forward pass only — 10,000x cheaper than training)
7. Block accepted if loss matches within 2% tolerance
8. Miner receives 50 RNT + performance bonus
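Steps 5-7 above boil down to a two-part check on the verifier's side. A minimal sketch (illustrative names; the real node is C++): the claimed loss must strictly beat the parent, and the verifier's own forward-pass measurement must agree within the 2% tolerance, which absorbs floating-point drift between different GPUs:

```python
def verify_block(parent_loss: float, claimed_loss: float,
                 recomputed_loss: float, tolerance: float = 0.02) -> bool:
    # Consensus rule: the new checkpoint must strictly improve on the parent.
    if not claimed_loss < parent_loss:
        return False
    # The verifier recomputes val_loss with a forward pass and accepts
    # only if it matches the miner's claim within the tolerance.
    return abs(recomputed_loss - claimed_loss) <= tolerance * claimed_loss

# Honest miner: small GPU-to-GPU numeric drift, block accepted.
assert verify_block(2.40, 2.35, 2.352)
# Cheater: claims an improvement the verifier cannot reproduce.
assert not verify_block(2.40, 2.35, 2.60)
```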
Key insight: A single miner with an RTX 5090 and unique, high-quality training data can outperform an entire data center running on exhausted public datasets. Proof-of-Training is proof-of-data, not proof-of-hardware. Data is distributed more equally than capital.
━━━ Continuous Model Growth ━━━
The model grows with every block. No hard forks. No scheduled upgrades.
- Each block: d_model += 2 (when loss improves)
- When stagnation occurs: growth accelerates — d_model += 2 × (1 + stagnation/10)
- New layer added every 128 units of cumulative d_model growth
- Max: d_model = 4096, 48 layers (~30B parameters)
- New weights zero-initialized → identity via residual → no performance loss from growth
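The growth rules above can be sketched as a small state machine. The constants (step of 2, layer every 128 units, caps at 4096/48) are the document's; how the stagnation term is rounded is an assumption of this sketch:

```python
def grow_one_block(d_model, n_layers, growth_since_layer, stagnation=0,
                   d_max=4096, layers_max=48):
    # d_model grows by 2 per improving block, faster under stagnation:
    # step = 2 * (1 + stagnation/10), rounded to an integer (assumption).
    step = round(2 * (1 + stagnation / 10))
    d_model = min(d_model + step, d_max)
    growth_since_layer += step
    # One new layer for every 128 units of cumulative d_model growth.
    if growth_since_layer >= 128:
        growth_since_layer -= 128
        n_layers = min(n_layers + 1, layers_max)
    return d_model, n_layers, growth_since_layer

# From genesis (d_model=384, 6 layers): 64 improving blocks of +2
# accumulate 128 units of growth and add the first new layer.
d, n, g = 384, 6, 0
for _ in range(64):
    d, n, g = grow_one_block(d, n, g)
# now d == 512 and n == 7
```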
Growth trajectory:
| Block | d_model | Layers | Params | Equivalent (at 500-1000x eff.) |
| 0 (genesis) | 384 | 6 | 33.6M | — |
| 500 | ~1,384 | 13 | ~500M | GPT-2 |
| 1,000 | ~2,200 | 20 | ~2B | — |
| 2,000 | ~3,500 | 30 | ~10B | LLaMA-7B class |
| 3,000+ | 4,096 | 48 | ~30B | LLaMA-13B+ class |
━━━ Wallet Recovery (Mandatory) ━━━
~20% of Bitcoin supply is permanently lost. ResonanceNet eliminates this. Every wallet MUST have a recovery policy:
1. Heartbeat Recovery — Periodic proof-of-life transaction. Miss it → recovery address claims funds.
2. Social Recovery (M-of-N) — Designated guardians can recover with threshold signatures. Owner can cancel.
3. Emission Return — After long inactivity, funds return to emission pool as future block rewards.
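Policy 1 (heartbeat recovery) reduces to a simple time comparison. A hypothetical sketch; the actual heartbeat period and grace rules are protocol parameters not specified in this document:

```python
from datetime import datetime, timedelta

def heartbeat_recoverable(last_proof_of_life: datetime, now: datetime,
                          heartbeat_period: timedelta) -> bool:
    # If the owner misses the periodic proof-of-life transaction,
    # the designated recovery address may claim the funds.
    return now - last_proof_of_life > heartbeat_period

# Illustrative one-year heartbeat period
last = datetime(2026, 1, 1)
period = timedelta(days=365)
assert not heartbeat_recoverable(last, datetime(2026, 6, 1), period)
assert heartbeat_recoverable(last, datetime(2027, 6, 1), period)
```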
Encrypted backups: BIP39 seed (24 words) + passphrase → Argon2id (64MB memory-hard) → AES-256-CBC. Store anywhere — useless without passphrase.
━━━ Comparison ━━━
| Property | Bitcoin | Bittensor (TAO) | ResonanceNet |
| Consensus | PoW (SHA-256d) | PoS + subjective | PoT (val_loss) |
| Mining output | Heat | Model marketplace | Single improving AI |
| Verification | Hash < target | Validator vote | Forward pass |
| Useful compute | 0% | Partial | 100% |
| Mobile inference | — | No (transformer) | Yes (O(1) state) |
| Objectivity | Full | Subjective | Full |
| Data efficiency | N/A | 1x (transformer) | 500–1000x |
| Censorship | Financial only | Partial | Full (AI + financial) |
| Wallet recovery | Optional | Optional | Mandatory |
| Lightning | Added 2018 | No | Native v0.1 |
━━━ Technical Details ━━━
- Language: C++20, ~80,000 LOC
- 17 static libraries, 6 binaries, 1 test suite
- GPU: CUDA (NVIDIA), Metal (Apple), Vulkan (AMD/Intel/Qualcomm/Mali)
- P2P: Bitcoin Core-style networking, encrypted transport, Tor/SOCKS5 native
- Lightning: HTLC with Keccak-256d, Sphinx onion routing, BOLT11 invoices (rnt1 prefix)
- Checkpoints: .rnet format, chunked P2P transfer (1MB chunks), prunable
- Build: CMake + Ninja, tested on MSVC 19.50, GCC, Clang
━━━ Testnet ━━━
Testnet is LIVE. Download binaries and sync the chain:
Seed node: 188.137.227.180:9555
Downloads:
- Linux: https://github.com/Kristian5013/resonancenet/releases/latest
- Windows: https://github.com/Kristian5013/resonancenet/releases/latest
- macOS: https://github.com/Kristian5013/resonancenet/releases/latest
Quick start:
# Start node and connect to testnet
./rnetd -testnet -listen -port=19555
# Check sync status
./rnet-cli -testnet getblockchaininfo
# Start mining (choose your own training data)
./rnet-miner -testnet -data=your_data.bin -address=trnt1...
━━━ Roadmap ━━━
Phase 0 — Genesis (NOW)
Architecture proven. 33.6M model. 500–1000x efficiency. V4 codebase complete. Testnet operational.
Phase 1 — Mainnet
Genesis block. Public mining. CLI wallet. Lightning channels. Seed nodes.
Phase 2 — Growth
Model surpasses 1B parameters. Data marketplace. Mobile inference engine. Community GUI.
Phase 3 — Ecosystem
Third-party apps. Multi-modal extensions. The model becomes global AI infrastructure.
━━━ Fair Launch ━━━
- No premine
- No ICO
- No team allocation
- No VC funding
- First block mines 50 RNT — same for everyone
- Fully open source (MIT license)
- Solo developer — Kristian Pilatovich, age 16
━━━ Links ━━━
Website: https://kristian5013.github.io/resonancenet/
Whitepaper: https://kristian5013.github.io/resonancenet/ResonanceNet_Whitepaper_V2.pdf
GitHub: https://github.com/Kristian5013/resonancenet
Releases: https://github.com/Kristian5013/resonancenet/releases

"The model is the chain. The chain is the model. The intelligence is permanent."