Hi all, I’d like to start a discussion at the intersection of AI and blockchain.
Bitcoin showed that finance can be transparent and leaderless. AI now faces a similar challenge: most large models (ChatGPT, Midjourney, DeepSeek, etc.) are trained on opaque datasets and pipelines. We rarely know what the “source of truth” is, or how it shapes outputs. That opacity can bias results and enable manipulation.
An idea being explored in the Ethereum community is to anchor data and model provenance, along with usage accounting, on-chain, and to make revenue splits deterministic and auditable. The proposed format is a Data Anchoring Token (DAT): a semi-fungible token that accounts not only for ownership but also for use (e.g., dataset reads, model inference calls, agent steps). The goal is to bring AI assets closer to core blockchain values: verifiability, attribution, and understandable payout rules.
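To make the accounting idea concrete, here is a minimal off-chain sketch in Python. All class and method names here are my own illustration, not the ERC-8028 interface: it just shows how a semi-fungible token could track ownership shares and metered usage per asset, then compute a deterministic pro-rata payout that anyone could verify.

```python
from collections import defaultdict

class DatSketch:
    """Toy model of a Data Anchoring Token: tracks ownership shares
    per asset (dataset/model) and meters usage events against it.
    Names and structure are illustrative, not the ERC-8028 spec."""

    def __init__(self):
        self.shares = defaultdict(dict)     # asset_id -> {holder: share_units}
        self.usage_fees = defaultdict(int)  # asset_id -> accrued fee units

    def mint(self, asset_id: str, holder: str, share_units: int) -> None:
        held = self.shares[asset_id].get(holder, 0)
        self.shares[asset_id][holder] = held + share_units

    def record_usage(self, asset_id: str, units: int, fee_per_unit: int) -> None:
        """Meter an event (dataset read, inference call, agent step)
        and accrue the fee it generates to the asset's pool."""
        self.usage_fees[asset_id] += units * fee_per_unit

    def payouts(self, asset_id: str) -> dict:
        """Deterministic pro-rata split of accrued fees by share units.
        On-chain, this arithmetic would be auditable by anyone."""
        pool = self.usage_fees[asset_id]
        total = sum(self.shares[asset_id].values())
        return {h: pool * s // total for h, s in self.shares[asset_id].items()}

dat = DatSketch()
dat.mint("dataset-42", "curator", 70)
dat.mint("dataset-42", "labeler", 30)
dat.record_usage("dataset-42", units=1_000, fee_per_unit=5)  # e.g., 1,000 inference calls
print(dat.payouts("dataset-42"))  # {'curator': 3500, 'labeler': 1500}
```

The key property is that the split is a pure function of recorded shares and metered usage, so the payout rule is as inspectable as the ledger itself.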
Questions I’d like to hear your opinions on:
- Does the AI ecosystem need on-chain transparency for data/model usage?
- How viable is “pay for the actual resource” (inference load, per-request fees) versus paying simply for asset ownership? (A toy comparison follows this list.)
- What risks or trade-offs do you see for altcoin markets and L2 infrastructure if this model gains traction?
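On the second question, a back-of-the-envelope comparison (all numbers hypothetical) shows why the two pricing models diverge depending on usage profile:

```python
# Hypothetical numbers: compare paying per metered resource vs. a flat
# ownership/license fee for two usage profiles of the same model.
FEE_PER_CALL = 0.002   # metered: price per inference call
FLAT_LICENSE = 500.00  # ownership: one-time asset/license price

for calls in (10_000, 1_000_000):
    metered = calls * FEE_PER_CALL
    print(f"{calls:>9} calls: metered={metered:>8.2f}, flat={FLAT_LICENSE:.2f}")
# Light users overpay under flat pricing; heavy users overpay under metering.
# With these numbers, break-even sits at 250,000 calls.
```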
If you want to dive deeper, here’s the formal discussion thread on the EIP forum (Ethereum Magicians):
👉 ERC-8028: Data Anchoring Token (DAT) - draft and comments:
https://ethereum-magicians.org/t/erc-8028-data-anchoring-token-dat/25512/7

This is not a token announcement or sale; the aim is to discuss the architecture and its applicability. Constructive critique, alternatives, and real-world examples are welcome.