This is the first installment in an ongoing series breaking down Stillcore’s thinking on Bittensor. We’ll be publishing long-form research walking through the full investment case for why we believe Bittensor represents the most asymmetric opportunity in markets today. Whether you’re new to Bittensor or deep in the weeds, there should be something here for you.
Introduction: Every Major Technology Shift Starts Centralized, Then Breaks Open
The internet started as ARPANET, a closed government network. It evolved through AOL, CompuServe, and Prodigy: walled gardens that controlled content, access, and monetization. By the mid-2000s, those walled gardens were dead or dying, and the open web had won.
AI is following the same pattern. A handful of well-funded labs control the frontier: OpenAI, Google DeepMind, Anthropic, Meta AI, xAI. They have the capital, the talent, and the infrastructure.
But they are the AOL of AI, not the TCP/IP.
This document presents a foundational investment thesis on Bittensor and its native token, TAO. The structural case suggests that Bittensor represents the most significant opportunity in cryptocurrency since Ethereum.
The thesis rests on a simple observation. Cryptocurrency and artificial intelligence, two of the most important technological transformations of this era, are converging. Bittensor sits squarely at their intersection. Crypto promised coordination without extraction and largely failed to deliver. AI is transforming every sector of the economy, yet centralizing into a handful of labs. Bittensor offers a structural solution to both problems at once.
The structural case arguably exceeds the case that existed for Ethereum at a comparable stage. Ethereum decentralized finance, creating a multi-hundred-billion-dollar ecosystem. But the total addressable market of decentralized AI dwarfs decentralized finance. AI is not a sector. It is an economic transformation that will touch every industry, every company, and eventually every human activity on Earth.
Part I: The Crypto Thesis
The Promise of Minimally Extractive Coordination
Consider Uber. It owns no vehicles and employs no drivers. Its value derives entirely from solving a coordination problem, matching supply with demand at scale. For this, Uber extracts 25 to 30 percent of every transaction.
Amazon coordinates millions of sellers with hundreds of millions of buyers while extracting fees and harvesting transaction data to compete against its own merchants.
In both cases, the people who create actual value capture only a fraction of what they produce. The platform sits between them, extracting rents in perpetuity.
Cryptocurrency’s foundational insight: this extraction is not technologically necessary. Encode the rules of coordination in a protocol rather than a company, and the platform layer drops to near-zero cost. Network effects still exist, but they accrue to participants rather than extractors.
Bitcoin proved it was achievable. No company, no employees, no president of Bitcoin. Yet people all over the world contributed electricity and compute to secure a monetary ledger, compensated only in newly minted coins. The result was trillions of dollars in value created through coordination without platform extraction.
Why Tokens Matter
Achieving this vision requires solving the bootstrap problem.
The challenge is circular: users won’t pay for an insecure network, but operators won’t invest without revenue. Traditional capital solves this with debt or equity, but external capital obligations reintroduce the very extraction that decentralization was supposed to eliminate.
Native tokens solve the bootstrap without reintroducing extraction. The protocol mints new tokens and distributes them to participants who contribute work. If the network succeeds, those tokens become valuable retroactively. The bootstrap is funded by the future value of the network itself.
But here’s the catch, and why most token projects fail: the tokens must actually accrue value as the network succeeds. If participants can capture the benefits without holding the token, the mechanism fails entirely.
The most robust designs combine multiple demand mechanisms: payment utility, staking that locks supply and aligns incentives, cash flows through burns or buybacks, and governance rights. The tokens that work are the ones that give holders reasons to keep them.
Even Bitcoin and Ethereum face a structural limitation: they each coordinate only one thing. Bitcoin coordinates SHA-256 hashing. Ethereum coordinates general computation. Neither can spawn new coordination problems as opportunities emerge. A protocol locked to a single problem has a ceiling on its relevance.
TAO is the first token to combine all four demand mechanisms while coordinating an unlimited number of problems.
Where Crypto Failed
Rather than building mechanisms that could spawn new coordination problems, the industry took a different path.
It produced endless variations on a single theme: blockchains that are slightly faster, slightly cheaper, or slightly more scalable. Ethereum begat Solana begat Avalanche begat Aptos begat Sui begat dozens of others, all competing for the same pool of applications. The industry now has more blockspace than demand requires.
Network effects, supposedly the moat that would make early winners unassailable, proved remarkably shallow. A dApp can migrate to an L2, deploy across multiple chains, or bridge liquidity with ease. If network effects don’t compound, what exactly are holders paying for?
Most tokens do not need to exist. They don’t solve coordination problems that couldn’t be solved otherwise. They exist primarily to create tradeable assets from which insiders extract value, precisely the dynamic crypto was supposed to eliminate.
What the industry never produced was the thing it claimed to be building: decentralized capital formation. A team designs an incentive mechanism that specifies what work needs doing and how to judge quality, and the protocol funds it. Continuously, through emissions, with miners competing and validators scoring. No token sale. No whitepaper promising future utility. A working business whose capital comes from the network itself.
That is what Bittensor built.
Part II: The Bittensor Protocol
Bitcoin decentralized money. Ethereum generalized Bitcoin’s limited scripting into a fully programmable ledger, leading to stablecoins, decentralized lending, and DEXs.
Bittensor expands the other side of Bitcoin. It takes Bitcoin’s incentive alignment engine, the mining and the token issuance, and makes that programmable. Once you do this, you can mine for all kinds of things. Including intelligence.
Bittensor abstracts the bootstrap mechanism itself. It creates a protocol that spawns, funds, and governs an unlimited number of specialized minimally extractive coordinators through market signals rather than committee decisions.
Just as the internet became a launchpad for websites and online businesses, Bittensor is a launchpad for AI startups. A network of hundreds, soon thousands, of specialized AI models and protocols. A World Wide Web of intelligence.
Yuma Consensus: Decentralized Agreement on Subjective Quality
Bitcoin’s consensus is elegant: miners compete to find hashes meeting a difficulty target. The hash either meets it or it doesn’t. Binary. No ambiguity.
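For contrast, the entire Bitcoin validity check can be sketched in a few lines. This is a simplified illustration, not consensus-critical code:

```python
# Simplified sketch of Bitcoin's binary validity check: one comparison, no judgment.
import hashlib

def meets_target(header: bytes, target: int) -> bool:
    """Double SHA-256 the block header; the block is valid iff the hash is at or below target."""
    digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()
    return int.from_bytes(digest, "little") <= target  # Bitcoin compares the hash as a little-endian integer
```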
But many valuable coordination problems can’t be reduced to binary validation. Was this model’s response good? Which image was generated by the superior model? These questions have answers, but the answers exist on gradients, depend on context, and require judgment.
The breakthrough came from recognizing that while individual quality judgments are subjective, patterns of agreement among judges are objective and measurable. If diverse evaluators independently conclude output A beats output B, that consensus signal contains real information, even if no individual judgment is provably correct.
Yuma Consensus implements this. Validators score on a continuous gradient from zero to one. Outlier scores get mathematically clipped. Validators with more TAO carry greater weight, creating a reputation market where accurate judges attract more stake. Validators who consistently diverge from consensus earn less. Economic pressure enforces honest evaluation.
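A minimal sketch of the clipping idea follows. The function names, the 0.5 consensus quantile, and the simple averaging are assumptions for illustration; the on-chain algorithm differs in detail:

```python
# Illustrative sketch of stake-weighted clipping in the spirit of Yuma Consensus.
# Not the production algorithm; the 0.5 quantile and the averaging are assumptions.
import numpy as np

def weighted_quantile(values, weights, q=0.5):
    """Stake-weighted quantile of validator scores for one miner."""
    order = np.argsort(values)
    values, weights = values[order], weights[order]
    cum = np.cumsum(weights) / weights.sum()
    return values[np.searchsorted(cum, q)]

def consensus_scores(scores, stake):
    """
    scores: (n_validators, n_miners) matrix of judgments, each in [0, 1].
    stake:  (n_validators,) TAO stake per validator.
    Returns one clipped, stake-weighted score per miner.
    """
    scores, stake = np.asarray(scores, float), np.asarray(stake, float)
    out = np.zeros(scores.shape[1])
    for m in range(scores.shape[1]):
        col = scores[:, m]
        cutoff = weighted_quantile(col, stake)  # the consensus level
        clipped = np.minimum(col, cutoff)       # outlier scores above it get clipped
        out[m] = np.average(clipped, weights=stake)
    return out
```

A validator who scores a miner far above what the stake-weighted majority sees gains nothing from the excess: the clip discards it.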
Bitcoin proved decentralized consensus on objective facts was possible. Yuma Consensus proves it works for subjective quality too, opening an entirely new category of coordination problems.
Subnets: Bittensor’s Mining Contests
Within Bittensor, there are currently over one hundred mining contests called subnets. Each functions as a standalone minimally extractive coordinator.
Each subnet has an owner who defines what it mines for and how the contest works. Miners compete to produce the best work. Validators evaluate using Yuma Consensus. A native alpha token coordinates economic participation.
The chain pays miners, not the subnet owner. This mirrors exactly how Bitcoin operates. The task was “find a nonce that puts the block hash below the difficulty target.” Miners competed. Winners got paid. Bitcoin was the world’s first subnet.
Everything is winner-take-all. Only the best miners get paid. At historical highs, daily earnings on top subnets have hit six figures. No hiding. No coasting. Every day is a new contest.
For subnet owners, the value proposition is straightforward: no recruiting, no geographic limits, no invoices, no salaries. The best minds and GPUs on the planet compete on your task.
For miners: work from anywhere, get paid immediately by the chain, and earn ownership in the subnet via alpha tokens.
If an entrepreneur wants to start an AI company today, the conventional path requires raising tens of millions and hiring aggressively. The alternative is to start a subnet and receive work funded by the protocol.
TAO: The Coordinator of Coordinators
If each subnet is a minimally extractive coordinator solving a specific problem, what is TAO?
TAO is the coordinator that coordinates the coordinators.
The chain mints new TAO continuously. These emissions must be distributed across 100+ competing subnets. Subnets that receive more emissions attract better talent and produce better outputs. Capital allocation at this level determines which coordination problems get solved.
Early Bittensor handled this through governance, with large stakeholders voting on allocations. This worked to bootstrap but couldn’t scale. It concentrated control within a small group, reintroducing the committee-based decision-making decentralization was supposed to eliminate.
Dynamic TAO and TAO Flow
The solution was to replace human judgment with market signals.
Dynamic TAO gave each subnet an automated market maker (AMM) pairing its alpha token with TAO. This AMM is not an app built on top of Bittensor. It IS the chain. The protocol provides liquidity.
Stake TAO into a subnet’s pool, receive alpha tokens. The price is set by the ratio of TAO to alpha in the pool. There is no other way to get subnet tokens. Every buyer goes through the TAO-denominated AMM.
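A toy version of the mechanic, assuming a standard constant-product (x · y = k) curve; the live chain’s exact invariant and fee handling may differ:

```python
# Toy subnet pool. Assumes a constant-product curve for illustration only.
class SubnetPool:
    def __init__(self, tao_reserve: float, alpha_reserve: float):
        self.tao = tao_reserve
        self.alpha = alpha_reserve

    @property
    def alpha_price_in_tao(self) -> float:
        # Spot price is the ratio of TAO to alpha in the pool.
        return self.tao / self.alpha

    def stake_tao(self, tao_in: float) -> float:
        """Swap TAO into the pool and receive alpha out."""
        k = self.tao * self.alpha
        self.tao += tao_in
        alpha_out = self.alpha - k / self.tao
        self.alpha -= alpha_out
        return alpha_out

pool = SubnetPool(tao_reserve=10_000, alpha_reserve=50_000)
print(pool.alpha_price_in_tao)   # 0.2 TAO per alpha
print(pool.stake_tao(1_000))     # ~4,545 alpha out
print(pool.alpha_price_in_tao)   # ~0.242: buying pushed the price up
```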
The original mechanism allocated emissions based on market cap. The flaw was that once a subnet reached a high valuation, it kept collecting outsized emissions whether or not anyone was still buying. Incumbents were rewarded for arriving early, not for continuing to deliver.
TAO Flow corrected this by making current buying pressure the signal rather than accumulated price. The protocol observes TAO flowing into each pool in real time. Which subnet has the most demand right now? That’s where emissions go.
If a subnet experiences net negative flow, meaning more TAO leaving than entering, the chain stops injecting new TAO. Miners still get alpha tokens, but with no TAO backing them, the result is dilution. Losing market interest doesn’t shrink your reward. It erodes the value underneath it.
Sustained negative flow triggers deregistration. The subnet loses its slot, which gets reclaimed for a new entrant. You cannot coast on past performance. The protocol is indifferent to history.
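In spirit, the allocation rule looks like the sketch below. The zero-injection behavior on negative flow follows the description above; everything else, including the names and the single-block window, is an illustrative assumption:

```python
# Hedged sketch of flow-based emission allocation. Names, window length,
# and normalization are assumptions, not protocol constants.
def allocate_emissions(net_flows: dict[str, float], block_emission: float) -> dict[str, float]:
    """Split one block's TAO emission across subnets by positive net inflow."""
    positive = {s: f for s, f in net_flows.items() if f > 0}
    total = sum(positive.values())
    if total == 0:
        return {s: 0.0 for s in net_flows}  # no demand anywhere: no injection
    return {s: block_emission * positive.get(s, 0.0) / total for s in net_flows}

flows = {"subnet_a": 900.0, "subnet_b": 600.0, "subnet_c": -300.0}
print(allocate_emissions(flows, block_emission=1.0))
# {'subnet_a': 0.6, 'subnet_b': 0.4, 'subnet_c': 0.0}
```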
Solving the Calculation Problem
This solves something fundamental: the calculation problem for resource allocation.
A centralized tech company can never fully solve its allocation problem. Which products to build? Which capabilities to prioritize? In practice, it guesses, using lagging feedback, biased intuition, and artificial metrics. What it lacks is real-time market prices aggregating distributed knowledge about what’s actually valuable.
Bittensor’s subnet prices are those prices. High market cap with positive TAO Flow tells the network “this service is valuable, deploy resources here.” A failing subnet approaching deregistration says “shut it down.”
Y Combinator processes applications on human timescales, funds a limited number of companies per year, and is subject to partner bias. Bittensor runs this process every block, for every subnet, with capital allocation determined by aggregated market intelligence rather than partner meetings.
The Bittensor Flywheel
TAO emissions seed an ecosystem of AI startups through market-directed capital allocation. But how does value flow back?
TAO functions as an index of the entire ecosystem, akin to an S&P 500 ETF for decentralized AI. A specific subnet token captures more upside if that subnet wins. But TAO captures value regardless of which subnets win. And the only way to access any of them is through TAO.
The structural demand cannot be circumvented.
TAO is the exclusive medium of exchange for all subnet tokens. Every dollar of value created anywhere in the ecosystem requires TAO to access. Subnets generate cash flows through products (inference APIs, compute marketplaces, agent platforms), and leading subnets use revenue for alpha token buybacks. Because all alpha pairs with TAO, subnet demand mechanically creates TAO demand.
TAO staked into pools earns emissions while removing supply from circulation. The more staked, the less available for sale.
When a subnet succeeds, demand for its alpha rises. Every buyer of alpha must purchase TAO first. TAO appreciation then lifts all subnet valuations, since alpha is priced in TAO, and increases the dollar value of emissions network-wide. Every subnet’s success benefits every other subnet.
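The index dynamic reduces to simple arithmetic. Because alpha is priced in TAO, a higher TAO price mechanically lifts every subnet’s dollar valuation; the figures below are purely illustrative:

```python
# Illustrative only: alpha is priced in TAO, so TAO/USD appreciation
# lifts every subnet's dollar market cap without any change in alpha terms.
alpha_price_tao = 0.2      # assumed pool ratio
alpha_supply = 1_000_000   # assumed circulating alpha

for tao_usd in (300, 600):
    mcap_usd = alpha_price_tao * alpha_supply * tao_usd
    print(f"TAO at ${tao_usd}: subnet market cap = ${mcap_usd:,.0f}")
# TAO at $300: subnet market cap = $60,000,000
# TAO at $600: subnet market cap = $120,000,000
```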
The reflexive loop is what makes it compound. Revenue grows, alpha prices rise, TAO rises with them, emissions become worth more in dollar terms, which means larger subsidies attracting more capable participants, who build more products, which generate more revenue. The cycle reinforces itself.
Value accrual tends to be front-loaded for early participants. Flywheel effects are recognized only after they’ve begun compounding.
The Moat
Network effects in crypto have proven shallow. Ethereum was supposed to dominate forever. Applications migrate. Liquidity bridges. Users follow incentives.
Bittensor’s network effects are structurally different.
A founder launching an open-source AI protocol faces an asymmetric choice. Bittensor means immediate access to established mining pools with proven talent, validators with years of reputation, deep DEX liquidity where the protocol provides the other side of every trade, composability with subnets that cut inference costs to a fraction of centralized rates, and alpha tokens giving miners ownership.
Launching elsewhere means bootstrapping from scratch, standing up untested infrastructure, paying centralized rates for compute. Miners on the alternative would be mercenaries, not owners.
The code is open source. What can’t be replicated is the accumulated human capital, the proven track records, the deep liquidity, and the composable infrastructure that took years and hundreds of millions in emissions to build. By the time a competitor reaches parity, Bittensor has compounded further.
Beyond human capital, the emissions mechanism itself constitutes a second moat. Annual emissions are one of the largest continuous funding streams for open-source AI in existence, and the number scales directly with TAO price. At previous highs, annual emissions approached a billion dollars. Not whitepaper numbers. Actual capital, every day, to contributors worldwide.
The fair launch constitutes a third. The founders rejected premining and VC discounts. TAO was distributed entirely through mining. Like Bitcoin: 21 million fixed supply, halving schedule, no insider allocations. The community that formed consists disproportionately of builders, not speculators.
Part III: The AI Thesis
AI’s Transformation of the Economy
AI is not a technology sector. It is a general-purpose economic transformation.
Hundreds of billions flow into AI infrastructure annually. Cloud providers are racing to deploy millions of GPUs. New data centers consume gigawatts of power.
Every sector will be affected: software development, customer service, legal, financial services, healthcare, creative industries, manufacturing. And this is before the physical-world transformations now visible on the horizon. Self-driving vehicles will restructure transportation. Humanoid robots are moving from labs to warehouses. The integration of AI into the physical economy will dwarf the impact of AI on software alone.
AI progress compounds across four simultaneous curves: compute hardware, training data, algorithmic innovation, and open-source collaboration. Any one would be significant. All four are converging.
The Centralization Problem
The frontier is controlled by a handful of labs: OpenAI, Anthropic, Google DeepMind, Meta AI, xAI. Training a frontier model costs hundreds of millions. Data access is a moat. Top researchers command compensation only the best-funded orgs can afford. Each generation raises the stakes.
This reintroduces the extraction dynamic crypto was supposed to eliminate: a small number of companies controlling critical infrastructure and extracting rents from everyone who depends on it. Single points of failure. Innovation channeled through corporate bureaucracies rather than distributed experimentation.
AI is increasingly understood as strategic infrastructure. Nations recognize that dependence on foreign-controlled AI creates vulnerabilities. China can’t rely on American AI. Europe worries about sovereignty. Smaller nations face permanent dependence on systems they neither control nor understand.
The Case for Decentralized AI
Open source has repeatedly demonstrated its superiority for infrastructure technology. TCP/IP, Linux, Python, PyTorch: some of humanity’s most important technological infrastructure emerged from open collaboration rather than corporate development.
Every layer of the digital economy eventually commoditizes into open-source infrastructure. Proprietary networks lost to open protocols. Proprietary operating systems lost to Linux. Proprietary mobile platforms lost to Android. The closed system dominates early, then open alternatives catch up on quality and win on cost, flexibility, and ecosystem breadth. No single company, no matter how well capitalized, can outproduce the entire world.
In 2025, Chinese laboratories proved it empirically. DeepSeek, Kimi, and others released frontier-class open-source models that matched or exceeded closed Western labs at a fraction of the cost. The lesson is permanent: frontier intelligence has no structural moat. It commoditizes on cost, and open source wins that fight every time.
If intelligence is a commodity, value shifts from producers to whoever can most efficiently coordinate, serve, and monetize it. The question is no longer whether open source wins. The question is who captures the value when it does.
Open source wins, but its contributors historically have not. Linux generated billions in value; the kernel developers captured almost none of it. PyTorch powers the AI revolution; its creators are salaried employees at Meta. The value flows to the companies that commercialize it rather than the people who built it.
Bittensor offers a structural solution. Contributors receive tokens proportional to their contribution value. If the network succeeds, those tokens appreciate. TAO is that token for open-source AI.
Why Bittensor Is Uniquely Positioned
Bittensor is not a blockchain that happens to focus on AI. It is an economic system purpose-built for the world the Chinese labs just proved we live in: one where intelligence is abundant and commoditizing, and value accrues to whoever can coordinate it most efficiently.
TAO coordinates thousands of AI developers across 100+ subnets. Millions of GPU-hours flow through the network. Annual emissions represent hundreds of millions of dollars funding this activity, approaching a billion at previous highs.
Each subnet is a startup anyone can join. The protocol handles funding. The market handles allocation. Competition handles quality control. What emerges is a distributed R&D lab operating at global scale.
When DeepSeek or Kimi release a new model, Bittensor’s subnets absorb it on day one. The Chinese labs bear the training cost. Bittensor captures the value downstream through inference, fine-tuning, and application. Every open-source model released anywhere in the world makes Bittensor’s subnets better instantly.
Elite AI talent is scarce, expensive, and geographically concentrated. Bittensor allows anyone to contribute based purely on output quality. Contribution is decoupled from credentials.
Part IV: The Evidence
Production Subnets With Real Revenue
Chutes (Subnet 64) provides serverless AI inference. Developers deploy models in seconds without managing infrastructure. The platform processes over 100 billion tokens daily across 600,000+ users and ranks as the number one provider on OpenRouter, handling roughly a quarter of that platform’s total daily volume. The majority of revenue comes from organic developer usage. All revenue converts to alpha buybacks.
The edge is structural. A permissionless compute marketplace where thousands of GPU nodes compete for requests. Miners often run hardware with idle capacity at rates centralized providers can’t match. VC-backed inference companies must raise capital to subsidize pricing. Chutes miners are subsidized by TAO emissions, creating a durable cost floor that doesn’t depend on venture funding cycles.
Targon (Subnet 4) offers hardware-attested confidential compute at scale. CPU and GPU-level encryption through Trusted Execution Environments means customer data stays cryptographically sealed even from the machine operators running the workload. This is not a contractual promise. It is a hardware guarantee. Among hyperscalers, only Azure has anything comparable, limited to a single SKU in two regions. AWS and GCP have no equivalent.
The production evidence: Dippy AI, a consumer AI companion app with millions of downloads, routes over 20 billion paid inference tokens per day through Targon after migrating from a well-funded centralized competitor. Revenue runs at eight-figure annualized levels, all committed to buybacks.
Ridges (Subnet 62) is a decentralized tournament for autonomous coding agents. Anyone can build and submit an agent. They compete daily against real software engineering challenges. Ridges agents have achieved 73.6% on SWE-Bench Verified, the best open-source score available, closing in on the roughly 79% ceiling set by proprietary systems. The full 500-question benchmark cost $1.26 in total inference by routing through Chutes. Positive unit economics from day one.
Hippius (Subnet 75) provides decentralized storage that is functionally equivalent to Amazon S3, with full API compatibility. Storing 10 TB on S3 costs roughly $200 per month. On Hippius, the same capacity costs a few dollars. The platform supports fiat payments through Stripe and publishes SDKs for Python and Rust.
Templar (Subnet 3) addresses whether frontier model pre-training can happen outside the major labs. Templar is the only fully open decentralized training network. Its gradient compression technique reduces communication overhead by approximately 97%. The current run, Covenant 72B, is the largest permissionless collaborative training effort ever attempted: 72 billion parameters, a 1.2 trillion token budget, and over 20 miners each running clusters of B200 GPUs. Both the optimizer and reward mechanism have been peer-reviewed and accepted at the NeurIPS OPT2025 Workshop.
Score (Subnet 44) turns competitive AI toward computer vision. Specialized sports analytics firms charge north of $100,000 per year with proprietary camera requirements. Score claims a 10 to 100x cost reduction through its Manako AI platform. Model accuracy is rapidly approaching the 79% human gold standard.
Nova (Subnet 68) applies the mechanism to drug discovery. Average cost to approval is $2.6 billion, timelines stretch up to a decade, and over 90% of candidates fail. Nova’s miners compete around the clock to discover novel compounds against pharmaceutical targets, reporting over a 400% improvement in hit quality over baseline. Compute cost to the team is near zero, since miners absorb it through emissions.
Subnet Composability: Why the Stack Compounds
The subnets do not operate in isolation. They build on each other, and the value dynamics scale faster than linear network effects would suggest.
Each new subnet doesn’t just add its own capability. It creates a new possible integration with every existing subnet, so the space of combinations grows combinatorially with each addition.
This plays out concretely in cost structures. When Ridges runs its SWE-Bench evaluations, it doesn’t pay AWS or Azure for inference. It uses Chutes and Targon. Result: 500 coding challenges solved for a fraction of what centralized infrastructure would cost.
When subnets stack, each layer pays perhaps one-sixth the centralized cost. Those savings compound multiplicatively. A subnet deeply integrated with the Bittensor stack would face massive cost increases if it tried to migrate to centralized infrastructure. The switching cost isn’t just technical integration. It’s the loss of compounding cost advantages across every dependency.
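A deliberately simplified model makes the compounding visible. Assume full cost pass-through and a uniform one-sixth discount at each layer of a three-layer pipeline:

```python
# Back-of-the-envelope model, assuming each layer's cost is dominated by the
# layer below it and each pays ~1/6 of centralized rates.
centralized_cost = 1.0
discount_per_layer = 1 / 6
depth = 3

stacked_cost = centralized_cost * discount_per_layer ** depth
print(f"{stacked_cost:.4f}")  # ~0.0046, roughly 216x cheaper end to end
```

Real pipelines won’t be this clean, but the direction holds: discounts at each layer multiply rather than add.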
The Decentralized Frontier Lab
Together, the subnets form a coordinated stack covering the entire AI development pipeline, assembled through market coordination rather than corporate planning.
The Data Layer handles collection, curation, and preparation. Subnets like Apex and Data Universe aggregate and clean the datasets that downstream models require.
The Storage Layer provides decentralized infrastructure for datasets and model weights. Hippius offers distributed storage that removes dependency on centralized cloud providers.
The Compute Layer aggregates GPU resources at scale. Targon coordinates tens of millions of dollars in NVIDIA-certified hardware. Chutes processes over a hundred billion tokens daily.
The Training Layer focuses on model development. IOTA pursues frontier model pre-training. Templar enables distributed pre-training across nodes.
The Post-Training Layer refines models after initial training. Affine specializes in reinforcement learning and RLHF fine-tuning.
The Application Layer deploys capabilities to end users. Ridges runs autonomous AI coding agents. BitMind develops domain-specific models. Synth operates prediction markets.
Data flows up through storage and compute into training, refinement, and application. Each layer draws on layers below it. The result is vertical integration achieved through horizontal coordination. No single entity controls the stack, yet the stack functions as an integrated whole.
Conclusion
The theoretical foundation is sound: TAO synthesizes value accrual at both protocol and application layers in ways that are mutually reinforcing.
The technical innovation is substantive: Yuma Consensus enables decentralized agreement on subjective quality, opening entirely new coordination problems to cryptoeconomic mechanisms.
The application domain is well-suited: open-source AI requires the fast iteration, talent aggregation, and continuous competition that Bittensor provides.
The evidence is concrete: functioning subnets generating revenue, achieving frontier-competitive performance, and delivering services at a fraction of centralized cost.
Network effects drive integration depth that makes the ecosystem increasingly difficult to replicate. Every subnet’s success benefits every other subnet through TAO-denominated economics.
If Bitcoin was the first great coin, decentralized money, and Ethereum the second, decentralized finance, TAO may prove to be the third: decentralized intelligence, applied to the defining economic transformation of the century.
Stillcore Capital is the first U.S.-based fund structured as an open-end, evergreen, 506(c) hedge fund exclusively dedicated to the Bittensor ecosystem. Get in touch to learn more about our fund.
This material is for informational purposes only and does not constitute an offer to sell or a solicitation of an offer to buy any security. Investments in digital assets involve substantial risks, including the possible loss of principal.