Here is a summary that I came up with after reading this comment in a message exchange elsewhere.
"..the single person having a eureka moment is hollywood, and is incredibly rare. in the 21st century it mostly occurs in extremely hard to understand fields such as quantum mechanics or theoretical physics.."
I'm repeating it because other important economic and social dimensions are involved.
Please bear with me.
Another short post from yours truly.
I don't think the above statement is right.
I have just glanced through Dan's "Why Sharding?" post.
Here:
Exploring Alternative Solutions
Starting off with Block Trees https://danxrd.substack.com/i/145944194/
Block Tree —> DAG —> CAST —> Bottleneck
Bottleneck —> (leads to seeing) the advantage of treating execution as an intent rather than processing transactions directly in the consensus process.
In other words, there is a separation of concerns between processing transactions and the consensus process.
This was a simplification or an abstraction, allowing logical decomposition.
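To make that separation of concerns concrete, here is a minimal sketch in Python. It is purely illustrative and assumes nothing about Radix's actual design or API: all names (Intent, ConsensusLayer, ExecutionLayer) are hypothetical. The point it shows is that consensus only orders opaque intents, while execution applies them in a separate step.

```python
from dataclasses import dataclass

# Hypothetical sketch — names and structure are illustrative,
# NOT Radix's actual implementation.

@dataclass(frozen=True)
class Intent:
    """A declaration of what a transaction wants, not how to execute it."""
    intent_id: str
    payload: str  # opaque to the consensus layer

class ConsensusLayer:
    """Agrees on an ordering of intents; never executes them."""
    def __init__(self):
        self.ordered = []

    def agree_on(self, intent):
        # In a real network this would be a quorum vote; here we just append.
        self.ordered.append(intent)
        return len(self.ordered) - 1  # position in the agreed order

class ExecutionLayer:
    """Executes the agreed intents independently, after ordering."""
    def __init__(self):
        self.state = {}

    def process(self, ordered_intents):
        for intent in ordered_intents:
            self.state[intent.intent_id] = intent.payload

consensus = ConsensusLayer()
execution = ExecutionLayer()
consensus.agree_on(Intent("tx-1", "transfer A->B"))
consensus.agree_on(Intent("tx-2", "mint C"))
execution.process(consensus.ordered)
print(execution.state)  # {'tx-1': 'transfer A->B', 'tx-2': 'mint C'}
```

Because the consensus layer never interprets payloads, either layer can be scaled or replaced without touching the other, which is the whole point of the decomposition.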
Dan then rejected these three alternatives:
Beacon Chain. Turning to a central chain for definitive signals.
Dynamic Sharding. Starting with a single chain and sharding it as the network grows.
Intra-validator sharding. A network-facing validator acts as a load balancer and orchestrator for a group of secondary validators behind it.
Finally, he arrived at a state-sharded ledger in which each shard maintained only a portion of the full state.
"The numerous consensus processes can operate near their maximum throughput potential for prolonged periods of time with the agreed state transitions forming a directed acyclic graph (DAG) that references all necessary consensus processes and their quorum outputs."
—
Topologically, the solution is still a DAG; the quoted paragraph roughly describes how it works.
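The quoted idea can be sketched in a few lines of Python. This is a toy model under loud assumptions: the classes (QuorumOutput, Shard) and their fields are my own invention, not Radix's data structures. It only shows the shape of the claim: each shard commits at its own pace, and a cross-shard state transition references quorum outputs from every shard it needs, so the committed history forms a DAG rather than a single chain.

```python
from dataclasses import dataclass

# Hypothetical sketch — a toy model of a state-sharded ledger,
# NOT Radix's actual data structures.

@dataclass(frozen=True)
class QuorumOutput:
    """The agreed result of one consensus step on one shard."""
    shard: int
    height: int
    parents: tuple  # earlier quorum outputs this transition depends on

class Shard:
    """Each shard runs its own consensus over its portion of state."""
    def __init__(self, shard_id):
        self.shard_id = shard_id
        self.outputs = []

    def commit(self, parents=()):
        # A shard commits near its own maximum throughput; it only waits
        # on other shards when a transition actually references them.
        out = QuorumOutput(self.shard_id, len(self.outputs), tuple(parents))
        self.outputs.append(out)
        return out

shard_a, shard_b = Shard(0), Shard(1)
a0 = shard_a.commit()
b0 = shard_b.commit()
# A cross-shard transition on shard A references shard B's quorum output:
a1 = shard_a.commit(parents=(a0, b0))
print(a1.parents)  # edges to outputs on both shards — a DAG, not a chain
```

Nothing here forces a global ordering across shards; only the explicit parent references constrain ordering, which is why the independent consensus processes can run near full throughput for long stretches.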
So while it is generally true that great, fundamental research takes, well, a lot of time and a lot of resources, it is also possible for an individual to sit down with a problem and patiently work it through.
Dan was helped by having many alternatives to hand, which he could test against.
It also required a leap of imagination to set very ambitious goals that he would not compromise.
This helped him organise his thoughts and the conceptual hierarchy to make the chain possible.
But the contrast with big-company R&D is interesting.
A huge percentage of the very expensive R&D done by big companies gets discarded for various reasons.
Not least, simply because that is just not where their market is at the moment.
What has happened with Radix, and perhaps with most of the crypto world, is that the idea of market need has been circumvented.
Dan's imaginative leap sets Radix far in the future.
Even if Xi'an were released tomorrow, there would be no immediate market, and the lack of a market could still spell its death.
This problem has been compounded, perhaps exploited, by the idea that discussing the actual possible use of this technology in any detail may involve some legal risk.
Radix has a strong bias towards the thrust of technology, as exemplified by Dan.
I very much doubt he will sketch out use cases on his Substack except in the most vague terms.
But Dan and others repeatedly comment in this way:
"If networks like Bitcoin were to handle programmable money and decentralized applications, the scalability demands would skyrocket. We're talking tens of thousands of transactions per second (which would sound obvious today), not just a few thousand."
Of course, other factors, such as bots and arbitrage, are at play here.
Smart contracts have not been factored in either, though these are also "bots" of a sort.
But enough for now.



