Starknet Decentralized Protocol IV - Proofs in the Protocol

Dear @matteo, thank you for this feedback on turns vs competition! I don’t have much to say now, but you have given us some food for thought.

Regarding the fee structure: I agree that only having one fee dimension is most convenient for users. However, if the user pays for consumption of distinct resources, each of them will need to be priced by the user (or wallet) somehow, in which case it seems we have not really spared the user any hassle. A different avenue is to consider the user not paying for particular resources and instead subsidizing them at the protocol level through inflation. We will look into such approaches if, for example, the total computational costs of proving turn out to be sufficiently low. What do you think?
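To make the point about pricing each resource concrete, here is a rough sketch (the resource names, units, and prices are entirely made up): with several fee dimensions, the user or wallet has to maintain an estimate for every price, whereas a single dimension needs only one.

```python
from typing import Dict

def estimate_fee(resource_usage: Dict[str, float], resource_prices: Dict[str, float]) -> float:
    """Total fee when each resource is priced separately (all amounts in the fee token)."""
    return sum(amount * resource_prices[name] for name, amount in resource_usage.items())

# A wallet would have to maintain (or fetch) one price per fee dimension:
usage = {"l2_computation": 5_000, "proving": 1_200, "l1_data": 300}   # hypothetical units
prices = {"l2_computation": 0.01, "proving": 0.05, "l1_data": 0.20}   # hypothetical prices
print(estimate_fee(usage, prices))  # 170.0 -> three prices to estimate instead of one
```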

16 Likes

For both the sequencing and the proving?

At a high level, I think that using inflation to cover proving costs can be an elegant solution. I particularly like EIP-1559 because it decouples the price required for validators to include transactions from the market price of inclusion (by which I mean how much a user is willing to pay to include a tx). I think the user should pay the market value of including a transaction, which of course depends on the proving cost but most importantly on congestion and demand for inclusion, and the protocol should handle the incentivization through its token issuance.

We could make inflation vary with proving demand (less demand → fewer incentives for provers → more token inflation required, and vice versa), but I feel that a mechanism closer to EIP-1559, with fee burning, is better, since we can get deflation if there's enough demand.
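To illustrate what I mean with rough numbers (the parameters and the flat prover payment are invented, not a proposal): an EIP-1559-style base fee that users pay and the protocol burns, while provers are paid by protocol issuance; the net issuance per block is then positive (inflation) when demand is low and negative (deflation) when demand is high.

```python
def update_base_fee(base_fee: float, used: float, target: float, max_change: float = 0.125) -> float:
    """Nudge the base fee toward equilibrium, like EIP-1559's +/-12.5% per-block rule."""
    delta = max_change * (used - target) / target
    return max(base_fee * (1 + delta), 1e-9)

def net_issuance(base_fee: float, proven_work: float, prover_payment: float) -> float:
    """Tokens minted to pay provers minus tokens burned from user base fees."""
    burned = base_fee * proven_work
    return prover_payment - burned

base_fee = 1.0
for used in [50, 120, 200]:  # demand per block; the target is 100 (made-up units)
    base_fee = update_base_fee(base_fee, used, target=100)
    print(used, round(base_fee, 4), round(net_issuance(base_fee, used, prover_payment=100), 2))
# low demand -> tokens minted (inflation); high demand -> more burned than minted (deflation)
```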

Moreover, I think this solution legitimizes having a token. If users are paying gas as a function of sequencing and proving costs, why use $STARK? I'm paying for a service, so I could just use a stablecoin (which would also remove some of the complexity of building the incentives). But if the protocol itself balances the incentives based on utilization, having a token makes sense, since the same thing cannot be done with any other coin (we cannot program its issuance as we wish). That way we're truly creating a network supported by an asset; in the end, tokens are so powerful because they're programmable, right?

Just throwing out a random thought I had while thinking about this as well, and I'm not even sure it makes sense at all: while the sequencing layer is bound to Starknet, maybe the proving layer can be more general. We know that we'll have a bunch of L3s and app chains built on top of Starknet. What if those rollups could use the same proving layer? That way those rollups could benefit from decentralized and performant proving, and the STARK token and ecosystem could accrue value horizontally across all the different rollups built on its tech stack.

12 Likes

Dear @matteo, thanks for the additional food for thought.

  • I was only referring to the prover resource.

  • I also think it's best to have users pay the market price. Beyond that, I am undecided on whether provers should usually be paid the market price or be constantly overpaid. I briefly touched on this in the buffer problem post (including a simplified 1559-type mechanism). There is also the factual matter of actual proving costs relative to the remaining operating costs of the protocol.

  • As far as the token goes, I agree the ability to control monetary policy is a powerful argument in favor of a native token.

  • We have toyed around with the idea of a “universal proving layer”. Perhaps this is a somewhat premature discussion, but let us at least distinguish between proving as an off-chain service and an actual protocol that explicitly involves designated “provers” and possibly enforces on-chain logic. I haven’t given much thought as to when and why the latter is necessary. Do you have any further thoughts?

13 Likes

From what I've seen, protocols choose to overpay at the beginning to bootstrap the network when there's low demand, but the major problem is finding the right balance so that it doesn't harm the token in terms of long-term value and centralization. I think it's also tricky to determine the right incentive: the token will be volatile, so it's hard to hard-code a predetermined inflation schedule that will always satisfy provers. Do you have any idea how to price the proving cost? Also, do you think that proving costs will follow some kind of Moore's law, with regular improvements in cost efficiency that would lead to a scheduled decrease of incentives?

I was thinking about the latter, yes. I'll explore this idea more when I have time this weekend, but again at a very high level, I think that any team wishing to build a zk-rollup leveraging the Starknet stack could benefit from a proving layer that is already decentralized, secure, audited, etc. That would enable STARK to accrue value from its whole ecosystem, and it could make more composability possible between the different chains of the ecosystem (my question here is whether we have to wait for proofs to be posted on L1 for messaging if the same proving layer is shared). I'm missing a lot of complexity here, especially since some rollups might settle on L1, others on Starknet, and maybe on upper layers, but sharing the proving layer might strengthen the interoperability between these chains and allow STARK to directly accrue value from this multi-rollup design.

7 Likes

@matteo valid points!

  • As far as pricing goes, I'm really out of my depth here, so I'll just share my primitive intuition. In my opinion the most natural foundation for pricing provers is a market mechanism of price discovery, regardless of whether or not provers will be overpaid. Perhaps a reasonable approximation of such a mechanism is an algorithmic base fee, which may in turn be fed into a minting mechanism to overpay provers (see the rough sketch after this list). Beyond that, it seems reasonable that proving costs will follow Moore's law, but my instinct is to pay provers according to demand (i.e. market price) as opposed to operation cost.

  • I would love to understand some concrete examples of the benefits you have in mind! Also, your mention of "secure, audited" provers raises a question: do you think provers should be protocol-level players with some sort of on-chain reputation system, or at least an on-chain commitment of open-source prover software?
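Here is the primitive sketch I have in mind (the margin, names, and numbers are all hypothetical): a base fee discovered algorithmically per unit of proving work, with a minting rule on top that deliberately overpays provers by some margin.

```python
def prover_payout(base_fee: float, proof_work: float, overpay_margin: float = 0.2):
    """Return (paid_from_fees, minted_top_up) for one proof."""
    market_payment = base_fee * proof_work           # what demand-based pricing alone would pay
    minted_top_up = market_payment * overpay_margin  # subsidy created by protocol issuance
    return market_payment, minted_top_up

paid, minted = prover_payout(base_fee=0.9, proof_work=1_000)
print(paid, minted)  # 900.0 from fees, plus 180.0 newly minted on top
```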

12 Likes

If it's market price discovery, then I tend to think there's no need to overpay, since those actors will want to do the job anyway and it's a fair price for everyone. The only reason I think would justify overpaying here may be to get more geographic diversity, since provers will have to move to regions with low-cost energy as the market matures (not sure that's the priority, though).

Just found this article while trying to survey the state of the art on the subject. I think the author is on the forum, so just tagging him: @stonecoldpat

I'll put someone from our research team at Empiric on this next week; it's such an interesting design space.

Yes, for the former I think that's the best way to strike the balance between decentralization and performance. For the latter, maybe heavily integrating this factor into the overall reputation system would make sense.

8 Likes
  • Disregarding geographic diversity, there is a concern that fluctuating exchange rates between the token and fiat will disturb market prices enough to cause unprofitable blocks (a rough numeric sketch follows this list). See this post.

  • I’ll read the article. Thanks.

  • I am not as convinced by the first statement, so I'd love to hear/read more: why do you think involving provers at the protocol layer achieves a finer balance between performance and decentralization than e.g. a free market? Do you suggest somehow enforcing the use of diverse proving software at the protocol level? If not, what prevents all the protocol-level provers from purchasing all their proofs from the same off-chain proof provider?
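Regarding the first bullet, a rough numeric sketch of the exchange-rate concern (all numbers invented): the prover's costs are in fiat, but the fee is fixed in tokens when the block is priced, so a rate drop between pricing and payout can turn a profitable block into an unprofitable one.

```python
def prover_margin(fee_in_tokens: float, token_price_usd: float, cost_usd: float) -> float:
    """Prover profit in USD for one block, given the token/fiat rate at payout time."""
    return fee_in_tokens * token_price_usd - cost_usd

fee = 100.0   # tokens promised for proving the block
cost = 150.0  # prover's hardware + electricity cost in USD (made up)
print(prover_margin(fee, token_price_usd=2.00, cost_usd=cost))  #  50.0 -> profitable
print(prover_margin(fee, token_price_usd=1.25, cost_usd=cost))  # -25.0 -> unprofitable block
```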

6 Likes

Hey @ilia! Sorry for the late answer,

I understand the buffer problem if the fees paid by the user are directly the fees used to pay the prover (and sequencer). But my take, as I explained before, is that users pay the market price for Starknet utilization (this price depends on how congested the network is), and separately the provers are paid the proving market price. Those two prices are somewhat correlated, but sometimes a user might pay less than the actual proving price to include a transaction if there's low demand for Starknet, and that difference is covered by the token's inflation (and vice versa).
However, this system raises a lot of questions if the proving layer is general purpose.
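To put my take into pseudo-numbers (everything here is invented): the user fee tracks Starknet congestion, the prover fee tracks the proving market, and the protocol mints or burns whatever gap is left between the two.

```python
def settle_transaction(congestion_fee: float, proving_cost: float) -> dict:
    """Split one transaction between the user, the prover, and the protocol's issuance."""
    gap = proving_cost - congestion_fee
    return {
        "user_pays": congestion_fee,
        "prover_receives": proving_cost,
        "minted": max(gap, 0.0),   # low demand: inflation covers the shortfall
        "burned": max(-gap, 0.0),  # high demand: the surplus is burned
    }

print(settle_transaction(congestion_fee=0.4, proving_cost=1.0))  # quiet network -> 0.6 minted
print(settle_transaction(congestion_fee=2.5, proving_cost=1.0))  # congested -> 1.5 burned
```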

If a reputation system is involved, then we can favor things outside the scope of performance and cost. In a free market, each prover will optimize its setup for the minimum performance required by the protocol in order to have the cheapest costs possible. With a reputation system, we can require a minimum performance, but we can also favor values such as open source, as you said, or prover diversity, and maybe other parameters, for instance involvement in governance or the relative improvement of a prover's performance.
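A loose sketch of the kind of score I mean (the fields and weights are mine and purely illustrative, not a concrete proposal): a hard minimum performance requirement, with the remaining weight going to the values the protocol wants to favor.

```python
from dataclasses import dataclass

@dataclass
class ProverProfile:
    meets_min_performance: bool      # hard requirement enforced by the protocol
    proving_speed: float             # 0..1, relative to the rest of the network
    open_source: bool
    distinct_software: bool          # not running the dominant prover implementation
    governance_participation: float  # 0..1

def reputation(p: ProverProfile) -> float:
    """Weighted score used to pick provers; performance is necessary but not sufficient."""
    if not p.meets_min_performance:
        return 0.0
    return (0.4 * p.proving_speed
            + 0.2 * p.open_source
            + 0.2 * p.distinct_software
            + 0.2 * p.governance_participation)

print(reputation(ProverProfile(True, 0.9, False, False, 0.1)))  # fast but closed: 0.38
print(reputation(ProverProfile(True, 0.6, True, True, 0.5)))    # slower but aligned: 0.74
```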

11 Likes

Dear @matteo, thank you for the many insights! Are you familiar with any promising ideas for reputation systems that are practically applicable in blockchain?

4 Likes

Hey @ilia

That's a good question; at the protocol layer I think that's somewhat hard to find.
I saw an interesting approach on a Layer 1 a while ago, but I can't remember the name or the paper. I'll try to find it.

3 Likes

:wink: :wink: :wink: "That's a good question; at the protocol layer I think that's somewhat hard to find."

2 Likes

Is there any possibility of making the recursive proof size automatically adjust to the proving cost? Because the user's transaction fee may be higher than what they can afford if the costs of the prover and sequencer are huge.

I’m not sure about automatic, however:

  1. We expect the cost of off-chain proof computation to be negligible compared to the gas cost of L1 state updates.

  2. L1 state updates will not be forced, and the updaters will be able to choose when to submit an L1 state update (taking into account several factors, including the amortization of L1 gas over L2 transactions; see the toy illustration below). I think this will provide sufficient adjustment.
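A toy illustration of the second point (the gas numbers and threshold are invented): the updater amortizes the roughly fixed L1 cost of a state update over the L2 transactions it covers, and simply waits until the per-transaction share is low enough.

```python
UPDATE_GAS = 1_000_000  # hypothetical L1 gas for one state update + proof verification
GAS_PRICE_GWEI = 20     # hypothetical L1 gas price

def per_tx_l1_cost_gwei(num_l2_txs: int) -> float:
    """Amortized L1 cost carried by each L2 transaction in the update."""
    return UPDATE_GAS * GAS_PRICE_GWEI / num_l2_txs

def should_update(num_l2_txs: int, max_per_tx_gwei: float = 50_000) -> bool:
    """Submit once the amortized L1 cost per L2 transaction drops below the target."""
    return per_tx_l1_cost_gwei(num_l2_txs) <= max_per_tx_gwei

for n in [100, 400, 1_000]:
    print(n, round(per_tx_l1_cost_gwei(n)), should_update(n))
# 100 txs -> 200000 gwei each, wait; 400 -> 50000, submit; 1000 -> 20000, submit
```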

What do you think?

Yeah, the second point is what I mean: the updaters will adjust the recursive proof size according to the market price.
Is the cost of off-chain proof computation really negligible? As far as I know, proof generation is computationally intensive work; is there any benchmark to see the real cost?

@F.F I don't have any data off the top of my head. Our rough estimates are that proof computation is around 1/100 of the capital costs incurred by stakers. Since we plan to couple block production with proof production, it seems reasonable to treat the compute cost as (at least) a second-order consideration.

We recently participated in the Taiko prover testnet, which had a competitive proving system: whoever got the proof on chain fastest won the reward. The results were disappointing for most participants, as 2 addresses ended up gathering most of the proofs, and a 3rd picked up what was left from those 2.

It also seems that those 3 provers were using gaming laptops, as they went offline during the night, which was when others got a chance to pick up 1-3 proofs.

Moreover, most of the time the proof was completed almost simultaneously by several provers, and the last step was to make a transaction on chain. But only one proof can be counted, so 1 transaction succeeded and the others wasted their gas on a transaction that eventually failed. So on testnet (Sepolia) it was OK, but when it comes to spending real ETH, I don't think there will be many people willing to waste electricity, hardware, and ETH on failed attempts.

So, in my opinion, competitive proving will only lead to centralization, since in deterministic computations like ZK proof generation the fastest always wins.

Eventually, provers will have to constantly upgrade their hardware, because the fastest always wins. This is not the PoW situation with probabilistic computations, where second-best hardware is still capable of mining blocks. No: as soon as there is better equipment out there, you can throw your server away. At the current pace of tech development this would mean switching hardware roughly 1-2 times a year. Obviously, the costs of constant upgrades are impossible to cover with proof fees, since we want to keep those low for end users.

That is why I advocate a turn-based rotation of proof generation, with some minimal proving-time threshold that would satisfy quality of service. It is possible to add some kind of slashing mechanism for violating the QoS. The threshold can be defined by the average proving time. It is also possible to introduce bigger rewards for faster proving to incentivize hardware upgrades.

I think it is better to give the block to a randomly selected small set of N provers, give most of the reward to the winner, and give the rest to the losers to keep them in the game. Then randomly select another subset, thus making sure that at least some provers get the job done when others fail. A mechanism like that can be seen in Polygon PoS, where validators take turns and, if the chosen validator messes up, the others get a chance to pick up the block. Or the BNB chain, where they have block producers and candidates.
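A rough sketch of the rotation I'm describing (the subset size, split, and threshold are arbitrary placeholders): each block goes to a small random subset of provers, the fastest one that meets the QoS threshold takes most of the reward, and the rest of the subset shares the remainder so they stay in the game.

```python
import random

def assign_and_reward(provers, proving_times, subset_size=3, qos_threshold=60.0,
                      reward=100.0, winner_share=0.7):
    """Pick a random subset for the block and split the reward among those meeting QoS."""
    subset = random.sample(provers, subset_size)
    # Provers that miss the QoS threshold (or fail entirely) earn nothing and could be slashed.
    eligible = [p for p in subset if proving_times.get(p, float("inf")) <= qos_threshold]
    if not eligible:
        return subset, {}
    winner = min(eligible, key=lambda p: proving_times[p])
    payouts = {p: reward * (1 - winner_share) / max(len(eligible) - 1, 1) for p in eligible}
    payouts[winner] = reward * winner_share
    return subset, payouts

random.seed(0)
provers = ["A", "B", "C", "D", "E"]
times = {"A": 40, "B": 55, "C": 90, "D": 35, "E": 70}  # seconds, made up
print(assign_and_reward(provers, times))
```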

Less than a year ago Ethereum went green overnight by switching from PoW to PoS, but now ZK rollups are bringing back those bad traits in the form of wasted electricity consumption, if to a smaller degree, since everything in L2 is supposed to be smaller than in L1.
Avoiding competitive proving will help avoid all those unnecessary costs, both for provers and for the environment.

Great insights from the Taiko experience; winner-take-all won't be a good solution.

@Anton_Gaev_p2p.org thank you for the useful feedback! We are strongly inclined in favor of the turn-based approach, at least at the beginning.

Taiko has also recognized the issues with competitive proving, and they are looking into an auction-based model.

But I do not see the value of developing chain-specific auctions while projects like https://nil.foundation already exist and can serve as a single marketplace for many rollups.

@Anton_Gaev_p2p.org could you roughly outline the workings of the nil marketplace?

When thinking about auctions, we felt an L1 auction contract is too cumbersome and expensive, while off-chain auctions may be vulnerable to censorship. Coupling proof production to block production circumvents these issues. Lastly, even if some proofs are decoupled from block production, it seems a turn-based system based on stake is still simpler to manage than an auction. What do you think?
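For concreteness, a minimal sketch of what "turn-based based on stake" could look like (the hashing and names are mine, not a spec): proving turns are assigned pseudo-randomly in proportion to stake, so no auction contract is needed and the schedule is cheap to compute and verify.

```python
import hashlib
import random

def prover_for_slot(slot: int, stakes: dict, epoch_seed: bytes) -> str:
    """Pick whose turn it is in `slot`, weighted by stake and derived from a shared seed."""
    digest = hashlib.sha256(epoch_seed + slot.to_bytes(8, "big")).digest()
    rng = random.Random(digest)                      # deterministic given the seed and slot
    provers, weights = zip(*sorted(stakes.items()))  # canonical order for reproducibility
    return rng.choices(provers, weights=weights, k=1)[0]

stakes = {"prover_a": 400, "prover_b": 250, "prover_c": 350}
for slot in range(5):
    print(slot, prover_for_slot(slot, stakes, epoch_seed=b"example-epoch"))
```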