Bitcoin Forum
Author Topic: Proof-of-Training | Mining = AI Training | O(1) Inference | 500-1000x Efficiency  (Read 53 times)
Pilatovich Kristian 2 (OP)
Newbie
Activity: 1
Merit: 0
March 13, 2026, 11:18:50 AM
#1
◆ ResonanceNet [RNT] — Proof-of-Training Blockchain ◆
The world's first blockchain where mining trains a neural network

Website: https://kristian5013.github.io/resonancenet/
Whitepaper: https://kristian5013.github.io/ResonanceNet_Whitepaper_V2.pdf
Source Code: https://github.com/Kristian5013/resonancenet
License: MIT (fully open source)
Contact: pilatovichkristian2@gmail.com

━━━ What is ResonanceNet? ━━━

ResonanceNet is a peer-to-peer intelligence currency. Instead of wasting energy on SHA-256 hash puzzles, miners train a shared neural network. Every block makes the AI measurably smarter. The consensus rule is a single mathematical inequality:

val_loss(checkpoint_new, D_val) < val_loss(checkpoint_parent, D_val)

A block is valid if and only if the model checkpoint it contains achieves strictly lower validation loss than the parent block. This is objective, verifiable by any node, and unforgeable — there is no shortcut to reducing loss other than genuine training.
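The consensus rule above can be sketched in a few lines. This is a minimal illustration, not the project's code: `val_loss` here is a placeholder for a real forward pass over the canonical validation set D_val, and all names are invented for the sketch.

```python
# Hypothetical sketch of the PoT consensus rule: a block is valid iff its
# checkpoint achieves strictly lower validation loss than the parent's.

def val_loss(checkpoint: dict, d_val: list) -> float:
    # Placeholder: a real node runs the model forward over D_val.
    return checkpoint["loss"]

def is_valid_block(new_ckpt: dict, parent_ckpt: dict, d_val: list) -> bool:
    # Consensus inequality: val_loss(new, D_val) < val_loss(parent, D_val)
    return val_loss(new_ckpt, d_val) < val_loss(parent_ckpt, d_val)

parent = {"loss": 2.31}
child  = {"loss": 2.29}   # genuine training improved the model
print(is_valid_block(child, parent, []))   # True
print(is_valid_block(parent, child, []))   # False
```

Note the inequality is strict: a checkpoint that merely matches the parent's loss is not a valid block.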

The result: a blockchain whose security grows alongside a continuously improving AI model — a living, permissionless intelligence that belongs to no one and serves everyone.

━━━ Why This Matters ━━━

The AI Crisis:
- Total AI sector market cap: $41 trillion (March 2026)
- OpenAI projects $74 billion operating losses in 2028 alone
- 95% of enterprise GenAI investments produce zero measurable return
- $527 billion in AI infrastructure capex planned for 2026
- DeepSeek erased $600 billion in market cap in a single day (January 2025)
- Centralized AI is locked behind APIs, subject to censorship, controlled by corporations

The Bitcoin Mining Problem:
- Post-halving margins at historic lows
- Miners pivoting GPU infrastructure to AI compute via centralized contracts
- 100% of PoW energy produces nothing but security

ResonanceNet solves both: Mining IS training. 100% useful compute. No wasted energy. Decentralized AI that runs on any device.

━━━ Key Specifications ━━━

Algorithm: Proof-of-Training (PoT)
Hashing: Keccak-256d (original Keccak, 0x01 padding, Ethereum-compatible)
Signatures: Ed25519 (128-bit security, deterministic signing)
Total Supply: 21,000,000 RNT
Block Reward: 50 RNT + performance bonus (up to 2x for larger improvements)
Halving: Adaptive (based on effective circulating supply, not block count)
Block Time: Organic (determined by training difficulty, no fixed target)
P2P Port: 9555
RPC Port: 9554
Address Format: Bech32m — rnt1... (mainnet), trnt1... (testnet)
Min Denomination: 1 resonance = 0.00000001 RNT
P2P Transport: ChaCha20-Poly1305 encrypted, SOCKS5/Tor native
Lightning: Native from v0.1 (instant payments)
GPU Backends: CUDA, Metal, Vulkan (mine on any hardware)

━━━ The Neural Architecture ━━━

ResonanceNet V5 is a recurrent neural architecture that replaces self-attention with three mechanisms:

1. Multi-Scale Causal Convolution
5 parallel branches with kernel sizes {3, 7, 15, 31, 63}. Captures local patterns at multiple time scales. No quadratic cost.

2. MinGRU with Log-Domain Parallel Scan
Minimal Gated Recurrent Units computed via Blelloch parallel prefix sum in log-space. O(n) training, O(1) inference per token. Numerical stability through log-domain computation.

3. Slot Memory (64 learnable slots)
Fixed set of key-value slots providing global context via cross-attention. Replaces O(n²) self-attention with O(n·64). Slots learn to represent persistent concepts — a compressed world model.
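The MinGRU recurrence (mechanism 2) can be sketched in its plain sequential form. The log-domain Blelloch scan described above computes the same recurrence in parallel during training; at inference, the state is a single hidden vector per layer, which is where the O(1)-per-token claim comes from. All values below are made up for illustration.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def mingru_step(h_prev, z_pre, h_tilde):
    # Minimal GRU update: z_t = sigmoid(z_pre), where z_pre and h_tilde are
    # linear projections of the input x_t (weights omitted in this sketch).
    # h_t = (1 - z_t) * h_{t-1} + z_t * h~_t
    out = []
    for hp, zp, ht in zip(h_prev, z_pre, h_tilde):
        z = sigmoid(zp)
        out.append((1.0 - z) * hp + z * ht)
    return out

h = [0.0, 0.0]   # O(1) state, regardless of how many tokens have been seen
for z_pre, h_tilde in [([0.5, -0.5], [1.0, 1.0]),
                       ([2.0,  2.0], [0.0, 2.0])]:
    h = mingru_step(h, z_pre, h_tilde)
print(h)
```

The gate interpolates between keeping the old state (z near 0) and writing the new candidate (z near 1), so the model never needs to revisit past tokens.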

Data Efficiency:
- A 33.6M-parameter model trained on 86M tokens reaches perplexity 189.37
- A comparable transformer requires ~33.5 billion tokens for an equivalent result
- Conservative estimate: 388x more data-efficient
- Real measurement (data deficit at 43M tokens): 780x
- Theoretical range: 500–1000x depending on data quality
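The conservative multiplier follows directly from the two token counts quoted above; a quick arithmetic check (the small gap versus the quoted 388x is rounding in the post's figures):

```python
# Sanity check of the data-efficiency ratio using the post's own numbers.
transformer_tokens = 33.5e9   # tokens a comparable transformer is said to need
resonance_tokens   = 86e6     # tokens the 33.6M model was trained on
ratio = transformer_tokens / resonance_tokens
print(round(ratio))           # ~390, same order as the quoted "388x"
```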

Inference:
Property               | Transformer      | ResonanceNet
State per token        | O(n) KV cache    | O(1) hidden state
Inference compute      | O(n²) attention  | O(1) recurrence
Memory at 100K context | ~8 GB (7B model) | ~2 MB
Memory at 3B params    | ~6 GB KV cache   | ~96 MB
Mobile inference       | Impractical      | Native (2-8K tok/s)
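The shape of the memory comparison follows from how the two architectures store context. The exact figures in the table depend on model configuration; the dimensions below are assumptions for illustration, not the project's numbers. What matters is the scaling: a transformer's KV cache grows linearly with context length, a recurrent hidden state does not.

```python
# Hedged sketch of KV-cache vs recurrent-state memory scaling (fp16 assumed).

def kv_cache_bytes(n_layers, kv_dim, ctx_len, bytes_per_elem=2):
    # Transformer: keys + values cached per layer for every token in context
    return 2 * n_layers * kv_dim * ctx_len * bytes_per_elem

def recurrent_state_bytes(n_layers, d_model, bytes_per_elem=2):
    # Recurrent model: one hidden vector per layer, independent of context
    return n_layers * d_model * bytes_per_elem

for ctx in (1_000, 10_000, 100_000):
    print(ctx, kv_cache_bytes(32, 1024, ctx), recurrent_state_bytes(32, 1024))
```

Growing the context 100x grows the KV cache 100x, while the recurrent state stays at a few dozen kilobytes.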

━━━ How Mining Works ━━━

1. Miner downloads the tip checkpoint from the network
2. Miner selects ANY training data of their choice (unconstrained)
3. Miner trains the model for N steps on their GPU
4. Miner evaluates on the canonical validation set (D_val)
5. If val_loss improved → construct block, sign with Ed25519, broadcast
6. Network verifies by recomputing val_loss (forward pass only — 10,000x cheaper than training)
7. Block accepted if loss matches within 2% tolerance
8. Miner receives 50 RNT + performance bonus
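The steps above can be sketched as a loop. Every name here is an invented placeholder (the real miner and verifier are the project's C++ binaries), and `train` is a stand-in for N genuine gradient steps:

```python
import random

def train(ckpt, data, steps):
    # Stand-in for N local gradient steps on the miner's own data (step 3)
    return {"loss": ckpt["loss"] - random.uniform(0.001, 0.02)}

def mine_one_block(tip_ckpt, my_data, d_val, steps=1000):
    candidate = train(tip_ckpt, my_data, steps)
    new_loss = candidate["loss"]                  # step 4: eval on D_val
    if new_loss < tip_ckpt["loss"]:               # step 5: improved -> block
        return {"checkpoint": candidate, "claimed_loss": new_loss}
    return None                                   # otherwise keep training

def verify_block(block, tip_ckpt, d_val):
    # Steps 6-7: recompute val_loss (forward pass only) and accept if the
    # claimed loss matches within the 2% tolerance and beats the parent.
    recomputed = block["checkpoint"]["loss"]      # stand-in forward pass
    close = abs(recomputed - block["claimed_loss"]) <= 0.02 * block["claimed_loss"]
    return close and recomputed < tip_ckpt["loss"]

tip = {"loss": 2.31}
blk = mine_one_block(tip, b"my_data.bin", [])
print(verify_block(blk, tip, []))   # True
```

Verification is cheap because it only needs the forward pass over D_val, never the training itself.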

Key insight: A single miner with an RTX 5090 and unique, high-quality training data can outperform an entire data center running on exhausted public datasets. Proof-of-Training is proof-of-data, not proof-of-hardware. Data is distributed more equally than capital.

━━━ Continuous Model Growth ━━━

The model grows with every block. No hard forks. No scheduled upgrades.

- Each block: d_model += 2 (when loss improves)
- When stagnation occurs: growth accelerates — d_model += 2 × (1 + stagnation/10)
- New layer added every 128 units of cumulative d_model growth
- Max: d_model = 4096, 48 layers (~30B parameters)
- New weights zero-initialized → identity via residual → no performance loss from growth
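The no-stagnation path of this schedule is easy to compute. The sketch below assumes every block improves the loss (no stagnation acceleration), so later rows of the trajectory table, which presumably include accelerated growth, will differ:

```python
# Sketch of the growth schedule from the rules above: d_model += 2 per
# improving block, a new layer every 128 units of cumulative d_model growth,
# capped at d_model = 4096 and 48 layers.

def model_shape(blocks, d0=384, layers0=6):
    growth = 2 * blocks                        # no-stagnation assumption
    d_model = min(d0 + growth, 4096)
    layers  = min(layers0 + growth // 128, 48)
    return d_model, layers

print(model_shape(500))    # (1384, 13) -- matches the block-500 table row
```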

Growth trajectory:
Block       | d_model | Layers | Params | Equivalent (at 500-1000x eff.)
0 (genesis) | 384     | 6      | 33.6M  | —
500         | ~1,384  | 13     | ~500M  | GPT-2
1,000       | ~2,200  | 20     | ~2B    | —
2,000       | ~3,500  | 30     | ~10B   | LLaMA-7B class
3,000+      | 4,096   | 48     | ~30B   | LLaMA-13B+ class

━━━ Wallet Recovery (Mandatory) ━━━

~20% of Bitcoin supply is permanently lost. ResonanceNet eliminates this. Every wallet MUST have a recovery policy:

1. Heartbeat Recovery — Periodic proof-of-life transaction. Miss it → recovery address claims funds.
2. Social Recovery (M-of-N) — Designated guardians can recover with threshold signatures. Owner can cancel.
3. Emission Return — After long inactivity, funds return to emission pool as future block rewards.
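Policy 1 (Heartbeat Recovery) reduces to simple deadline logic. This is an illustrative sketch only: the names and interval are invented, and in the real system the rule would be enforced by consensus, not a wallet object.

```python
class HeartbeatWallet:
    def __init__(self, owner, recovery_addr, interval, now):
        self.owner = owner
        self.recovery_addr = recovery_addr
        self.interval = interval          # max time between proofs of life
        self.last_heartbeat = now

    def heartbeat(self, now):
        # Owner broadcasts a periodic proof-of-life transaction
        self.last_heartbeat = now

    def claimable_by_recovery(self, now):
        # Recovery address may claim only after a missed heartbeat window
        return now - self.last_heartbeat > self.interval

w = HeartbeatWallet("rnt1owner...", "rnt1recovery...", interval=100, now=0)
w.heartbeat(now=90)
print(w.claimable_by_recovery(now=150))   # False: heartbeat seen at t=90
print(w.claimable_by_recovery(now=250))   # True: window missed, funds claimable
```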

Encrypted backups: BIP39 seed (24 words) + passphrase → Argon2id (64MB memory-hard) → AES-256-CBC. Store anywhere — useless without passphrase.

━━━ Comparison ━━━

Property              | Bitcoin        | Bittensor (TAO)   | ResonanceNet
Consensus             | PoW (SHA-256d) | PoS + subjective  | PoT (val_loss)
Mining output         | Heat           | Model marketplace | Single improving AI
Verification          | Hash < target  | Validator vote    | Forward pass
Useful compute        | 0%             | Partial           | 100%
Mobile inference      | —              | No (transformer)  | Yes (O(1) state)
Objectivity           | Full           | Subjective        | Full
Data efficiency       | N/A            | 1x (transformer)  | 500–1000x
Censorship resistance | Financial only | Partial           | Full (AI + financial)
Wallet recovery       | Optional       | Optional          | Mandatory
Lightning             | Added 2018     | No                | Native v0.1

━━━ Technical Details ━━━

- Language: C++20, ~80,000 LOC
- 17 static libraries, 6 binaries, 1 test suite
- GPU: CUDA (NVIDIA), Metal (Apple), Vulkan (AMD/Intel/Qualcomm/Mali)
- P2P: Bitcoin Core-style networking, encrypted transport, Tor/SOCKS5 native
- Lightning: HTLC with Keccak-256d, Sphinx onion routing, BOLT11 invoices (rnt1 prefix)
- Checkpoints: .rnet format, chunked P2P transfer (1MB chunks), prunable
- Build: CMake + Ninja, tested on MSVC 19.50, GCC, Clang

━━━ Testnet ━━━

Testnet is LIVE. Download binaries and sync the chain:

Seed node: 188.137.227.180:9555

Downloads (Linux / Windows / macOS):
https://github.com/Kristian5013/resonancenet/releases/latest

Quick start:
Code:
# Start node and connect to testnet
./rnetd -testnet -listen -port=19555

# Check sync status
./rnet-cli -testnet getblockchaininfo

# Start mining (choose your own training data)
./rnet-miner -testnet -data=your_data.bin -address=trnt1...

━━━ Roadmap ━━━

Phase 0 — Genesis (NOW)
Architecture proven. 33.6M model. 500–1000x efficiency. V4 codebase complete. Testnet operational.

Phase 1 — Mainnet
Genesis block. Public mining. CLI wallet. Lightning channels. Seed nodes.

Phase 2 — Growth
Model surpasses 1B parameters. Data marketplace. Mobile inference engine. Community GUI.

Phase 3 — Ecosystem
Third-party apps. Multi-modal extensions. The model becomes global AI infrastructure.

━━━ Fair Launch ━━━

- No premine
- No ICO
- No team allocation
- No VC funding
- First block mines 50 RNT — same for everyone
- Fully open source (MIT license)
- Solo developer — Kristian Pilatovich, age 16

━━━ Links ━━━

Website: https://kristian5013.github.io/resonancenet/
Whitepaper: https://kristian5013.github.io/resonancenet/ResonanceNet_Whitepaper_V2.pdf
GitHub: https://github.com/Kristian5013/resonancenet
Releases: https://github.com/Kristian5013/resonancenet/releases

"The model is the chain. The chain is the model. The intelligence is permanent."