* [Bitcoin-development] Incorporating block validation rule modifications into the block chain
From: Raph Frank @ 2013-02-12 13:49 UTC
To: bitcoin-development

Has this been considered?

If made sufficiently general, older clients could support any extension
of the rules.

Various "hard" parameters within the protocol are defined in main.h of
the official client.

BIP-34 suggests a way to make changes based on consensus:
https://en.bitcoin.it/wiki/BIP_0034

These could be made into a rule for changing the parameters directly.
The process for updating could be handled by adding a new field to the
coinbase transaction, in the same way the height was added in BIP-34.

Something like:

- a miner proposes a change by including the proposal in a block (name
  of parameter and new value)
- it is seconded by at least 6 of the next 10 blocks (otherwise the
  proposal dies)
- it becomes active if 750 of the last 1000 blocks voted yes, or 950 of
  any later run of 1000 blocks voted yes (with reduced thresholds on
  testnet)
- it dies if more than 500 of the previous 1000 blocks voted no
- blocks which don't reference the proposal are considered to abstain

(A sketch of this tallying logic appears below.)

This could also be used to update NOPs, so that complex signing
algorithms could be incorporated. However, that would require a more
complex scripting language for defining opcode functions. A proposal
would consist of an opcode number plus a script description of the
algorithm. This would also allow named methods, <method name> +
<script code>; one of the NOPs could be "call method".

The rule would require that the script is valid under the current rules
(NOPs treated as no-ops) and under the latest rules. This prevents
having to try all possible permutations. However, it reduces security.
A compromise would be to have each new opcode change count as a
version, with scripts required to be valid under all versions in the
chain so far.

Once an opcode is accepted, new client implementations would probably
create dedicated functions for performing the calculation. Older
clients would have to perform the calculations using the scripting
language.
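[The following is a minimal sketch of the tallying lifecycle proposed in the
message above. It is purely illustrative: the vote encoding, the handling of
the two activation thresholds, and all identifiers are assumptions of the
sketch, not a specification.]

    # Illustrative tally of the parameter-change lifecycle proposed above.
    # Votes are modeled as one entry per block following the proposing
    # block: "yes", "no", or None (abstain).  Everything not named in the
    # message (field layout, window handling) is an assumption.

    SECOND_WINDOW, SECONDS_NEEDED = 10, 6
    WINDOW, ACTIVATE, ACTIVATE_LATE, KILL = 1000, 750, 950, 500

    def proposal_status(votes):
        # Seconding phase: at least 6 of the next 10 blocks must vote yes.
        if sum(v == "yes" for v in votes[:SECOND_WINDOW]) < SECONDS_NEEDED:
            return "dead (not seconded)"
        # Then slide a 1000-block window forward from the proposal.
        for end in range(WINDOW, len(votes) + 1):
            window = votes[end - WINDOW:end]
            yes, no = window.count("yes"), window.count("no")
            if no > KILL:
                return "dead (voted down)"
            threshold = ACTIVATE if end == WINDOW else ACTIVATE_LATE
            if yes >= threshold:
                return "active %d blocks after the proposal" % end
        return "pending"

    print(proposal_status(["yes"] * 1000))
    # -> active 1000 blocks after the proposal

Under these assumed rules, a proposal seconded and approved by every
following block would activate exactly 1000 blocks after it was made.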
* Re: [Bitcoin-development] Incorporating block validation rule modifications into the block chain
From: Gregory Maxwell @ 2013-02-12 15:49 UTC
To: Raph Frank; +Cc: bitcoin-development

On Tue, Feb 12, 2013 at 5:49 AM, Raph Frank <raphfrk@gmail.com> wrote:
> Has this been considered?
>
> If made sufficiently general, older clients could support any
> extension of the rules.
>
> Various "hard" parameters within the protocol are defined in main.h of
> the official client.
>
> BIP-34 suggests a way to make changes based on consensus:
> https://en.bitcoin.it/wiki/BIP_0034

You misunderstand what BIP_0034 is doing— it's not gauging consensus,
it's making sure that the change is safe to enforce. This is a subtle
but important difference. The mechanism happens to be the same, but
we're not asking for anyone's approval there— the change is needed to
make Bitcoin as secure as people previously believed it to be, and no
serious alternatives have been tendered. As far as I can tell the
proposal has always had universal agreement from anyone who's thought
about it. The only open question was whether it was safe to deploy, and
that's what that process solves.

Bitcoin is not a democracy— it quite intentionally uses the consensus
mechanism for _only_ the one thing that nodes cannot autonomously and
independently validate (the ordering of transactions). This protects
the users of Bitcoin by making most of the system largely nonvolatile
"constitutional" rules instead of being controlled by popular whim,
where 'two wolves may vote to have the one sheep for dinner'. If it
were possible to run the whole thing autonomously it would be, but
alas...

Even if you accept the premise that voting is a just way of making
decisions (it isn't; it's just often the least unjust when something
must be done), mining is not a particularly just way of voting:
'hashpower isn't people', and currently the authority to control the
majority of the hashpower is vested in only a half dozen people.
Moreover, the incentives to abuse hashpower are sharply curtailed by
its limited authority (all one can do with it is reorder
transactions... which is powerful but still finite), and allowing
arbitrary rule changes would vastly increase that power.

There are some more subtle issues— if the acceptance of chain B depends
on whether you've seen orthogonal chains A or A' first, then there can
be a carefully timed announcement of A and A' which forever prevents
global convergence (thanks to the finite speed of light, an attacker
can make sure some nodes see A first and some A'). If a rule change can
be reorged out, then it's not really a rule— any actual rule prohibits
otherwise-valid blocks that violate it (and without this distinction
you might as well implement the 'rule' as miner preferences).
Additionally there is the very hard software-engineering QA problem for
a sufficiently complex rule language— Script isn't Turing complete, and
look at all the issues it has had.
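[For context on the mechanism being contrasted here: BIP 34 gates
enforcement of its new rule on counting recent block version numbers, so
the rule is only enforced once a supermajority of recent blocks already
complies — a measure of deployment safety, not of approval. A simplified
sketch using BIP 34's published mainnet thresholds; the function names are
the sketch's own.]

    # BIP 34-style readiness check: enforce the new rule only once enough
    # of the last 1000 blocks already carry the new block version.  This
    # gauges whether enforcement is safe, not whether anyone "voted" for it.

    def supermajority(versions, min_version, required, window=1000):
        recent = versions[-window:]
        return sum(v >= min_version for v in recent) >= required

    def bip34_enforcement(versions):
        if supermajority(versions, 2, 950):
            return "reject all version-1 blocks"
        if supermajority(versions, 2, 750):
            return "reject version-2 blocks lacking the height in the coinbase"
        return "not yet enforced"

    print(bip34_enforcement([1] * 300 + [2] * 700))   # -> not yet enforced
    print(bip34_enforcement([1] * 200 + [2] * 800))   # -> reject version-2 blocks lacking the height in the coinbase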
In summary— this sort of thing, which has come up before, is
technically interesting and fun to think about, but it would pose
substantial engineering challenges, it is not obviously compatible with
the economic motivations which make Bitcoin secure, and it would not be
morally compatible with the social contract embedded in the system
today.
* Re: [Bitcoin-development] Incorporating block validation rule modifications into the block chain
From: Raph Frank @ 2013-02-13 14:58 UTC
To: bitcoin-development

On Tue, Feb 12, 2013 at 3:49 PM, Gregory Maxwell <gmaxwell@gmail.com> wrote:
> You misunderstand what BIP_0034 is doing— it's not gauging consensus,
> it's making sure that the change is safe to enforce. This is a subtle
> but important difference.

Sounds reasonable. The change in BIP-34 doesn't cause old clients to
reject the main chain. An increase to the maximum block size would be
rejected by old clients, so it is different.

Adding new opcodes (as long as they act like a NOP on success) also
doesn't cause a disagreement about what the longest chain is, in the
end. At worst, miners might end up mining chains which are guaranteed
to be orphaned.

> Bitcoin is not a democracy— it quite intentionally uses the consensus
> mechanism for _only_ the one thing that nodes cannot autonomously and
> independently validate (the ordering of transactions).

So, how is the max block size to be decided, then?
* Re: [Bitcoin-development] Incorporating block validation rule modifications into the block chain
From: Gregory Maxwell @ 2013-02-13 15:42 UTC
To: Raph Frank; +Cc: bitcoin-development

On Wed, Feb 13, 2013 at 6:58 AM, Raph Frank <raphfrk@gmail.com> wrote:
>> Bitcoin is not a democracy— it quite intentionally uses the consensus
>> mechanism for _only_ the one thing that nodes cannot autonomously and
>> independently validate (the ordering of transactions).
> So, how is the max block size to be decided, then?

In one sense it already is decided— there is a protocol rule
implementing a hard maximum, and soft rules for lower targets.

If it's to be changed, it would only be by it being obvious to almost
everyone that it should _and_ must be. Since, in the long run, Bitcoin
can't meet its security and decentralization promises without
blockspace scarcity to drive non-trivial fees and without scaling
limits to keep it decentralized, it's not a change that could be made
more lightly than changing the supply of coin.

I hope that, should it become necessary to do so, the correct path will
be obvious to everyone; otherwise there is a grave risk of undermining
the justification for confidence in the immutability of any of the
rules of the system.
* Re: [Bitcoin-development] Incorporating block validation rule modifications into the block chain
From: Gavin Andresen @ 2013-02-13 21:02 UTC
To: Gregory Maxwell; +Cc: bitcoin-development

On Wed, Feb 13, 2013 at 10:42 AM, Gregory Maxwell <gmaxwell@gmail.com> wrote:
> Since, in the long run, Bitcoin can't meet its security and
> decentralization promises without blockspace scarcity to drive
> non-trivial fees and without scaling limits to keep it decentralized,
> it's not a change that could be made more lightly than changing the
> supply of coin.

I disagree with Gregory on this. I believe that Bitcoin CAN meet its
security and decentralization promises without any hard limit on block
size.

I had a fruitful discussion about this with an economist friend this
weekend, and I'll eventually get around to writing up why I believe
raising the block size limit will not be a problem.

--
Gavin Andresen
* Re: [Bitcoin-development] Incorporating block validation rule modifications into the block chain
From: Gregory Maxwell @ 2013-02-13 21:05 UTC
To: Gavin Andresen; +Cc: bitcoin-development

On Wed, Feb 13, 2013 at 1:02 PM, Gavin Andresen <gavinandresen@gmail.com> wrote:
> On Wed, Feb 13, 2013 at 10:42 AM, Gregory Maxwell <gmaxwell@gmail.com> wrote:
>> Since, in the long run, Bitcoin can't meet its security and
>> decentralization promises without blockspace scarcity to drive
>> non-trivial fees and without scaling limits to keep it decentralized,
>> it's not a change that could be made more lightly than changing the
>> supply of coin.
>
> I disagree with Gregory on this. I believe that Bitcoin CAN meet its
> security and decentralization promises without any hard limit on block
> size.
>
> I had a fruitful discussion about this with an economist friend this
> weekend, and I'll eventually get around to writing up why I believe
> raising the block size limit will not be a problem.

That would be fantastic.
* Re: [Bitcoin-development] Incorporating block validation rule modifications into the block chain
From: Stephen Pair @ 2013-02-13 23:10 UTC
To: Gavin Andresen; +Cc: Bitcoin Dev

On Wed, Feb 13, 2013 at 4:02 PM, Gavin Andresen <gavinandresen@gmail.com> wrote:
> I disagree with Gregory on this. I believe that Bitcoin CAN meet its
> security and decentralization promises without any hard limit on block
> size.
>
> I had a fruitful discussion about this with an economist friend this
> weekend, and I'll eventually get around to writing up why I believe
> raising the block size limit will not be a problem.

If you've already validated the majority of transactions in a block,
validating the block itself isn't all that compute intensive. Thus,
it's really not blocks that should be used to impose any sort of
scarcity; rather it's the costs associated with the validation and
propagation of the transactions themselves... which is the way it
should be.

When I think about issues like this, I like to remind myself that the
mesh network isn't really an essential part of Bitcoin. It is a way to
disseminate transactions and blocks, but it's by no means the only
possible way, and it could certainly be improved in various ways.

Nodes can at some point start to charge fees to collect and distribute
transactions and blocks. Collectives of such nodes could pool together
fees to ensure connected nodes can propagate and hear about
transactions and blocks. These nodes would charge based on the
bandwidth and the work required to validate transactions. They would
also charge for the propagation of blocks based on the work required to
validate them.

Miners would of course have a lot of incentive to pay for such
services, since they will want access to as many fee-bearing
transactions as possible (and to filter out the transactions they don't
want to include in blocks). They will also want the blocks, to ensure
they're always building on the latest valid block. That in turn would
give these relay nodes a window into the fees needed to ensure fast
inclusion into the block chain (something that wallets could use to
automatically set fees on transactions).

Note, I think the bitcoin protocol might actually be ideally suited for
this type of thing... nodes would broadcast INV messages all day long,
but as soon as one of your peers wants the actual transaction or block,
well, then you have to pay up. Two relay nodes sending transactions
between each other would pay each other when they have to download the
transaction body... if they trade roughly equal amounts of
transactions, they wouldn't end up owing each other anything... for a
given transaction they would pull the data exactly once, but would then
turn around and provide that transaction to many connected peers,
earning a fee for each delivery.

P.S. such a fee structure would address the spam issue with economics
rather than rules about spammy transactions

P.P.S. micropayment channels could be used as the payment method for
nodes that validate and relay transactions... this would actually be a
very good first use of that technology (people have talked about
micropayment channels for renting bandwidth... why not use them to pay
for the bandwidth and CPU needed to validate and relay transactions)
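[The pay-per-delivery relay idea above is only sketched in prose. To make
it concrete, here is one possible — entirely hypothetical — netted
accounting scheme between two relay peers, in which announcements are free,
fetching a body costs a small fee, and balanced trade nets out to zero as
suggested. Nothing like this exists in the protocol; the class, fee figure,
and settlement step are assumptions.]

    # Hypothetical netted accounting between two relay peers, as discussed
    # above: inv announcements are free, fetching a transaction body incurs
    # a small relay fee, and the net difference is settled periodically
    # (e.g. over a micropayment channel).

    class RelayLedger:
        def __init__(self, fee_per_tx=0.000001):   # fee in BTC, illustrative only
            self.fee_per_tx = fee_per_tx
            self.balance = 0.0    # positive: peer owes us; negative: we owe peer

        def we_served_tx(self, txid):
            # Peer asked for the full transaction after seeing our inv.
            self.balance += self.fee_per_tx

        def we_fetched_tx(self, txid):
            # We asked the peer for a transaction body they announced.
            self.balance -= self.fee_per_tx

        def settle(self):
            # Return and clear the net amount owed to us.
            owed, self.balance = self.balance, 0.0
            return owed

    ledger = RelayLedger()
    for txid in ("a1", "b2", "c3"):
        ledger.we_served_tx(txid)
    ledger.we_fetched_tx("d4")
    print(ledger.settle())    # net owed to us: 2 * fee_per_tx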
* Re: [Bitcoin-development] Incorporating block validation rule modifications into the block chain
From: Gregory Maxwell @ 2013-02-14 0:28 UTC
To: Stephen Pair; +Cc: Bitcoin Dev

On Wed, Feb 13, 2013 at 3:10 PM, Stephen Pair <stephen@bitpay.com> wrote:
> If you've already validated the majority of transactions in a block,
> validating the block itself isn't all that compute intensive. Thus,
> it's really not blocks that should be used to impose any sort of
> scarcity; rather it's the costs associated with the validation and
> propagation of the transactions themselves... which is the way it
> should be.

The cost to whom?

This is important because the cost of validating blocks is borne by all
the participants in Bitcoin— or at least all the participants who
haven't given up on the decentralized trustless stuff and are simply
trusting someone else. Even a small cost becomes large when multiplied
across hundreds of thousands of participants.

And perhaps you don't lament people delegating their trust to large
entities— but keep in mind: Bitcoin was created for the express purpose
of creating a money system which didn't require trust because it was
based on cryptographic proof— mathematical law— instead of politics and
human law. Take that away and you have a really poorly understood,
inefficient system operated by entities which are less trustworthy and
less rightfully entitled to authority than the ones operating the
established major currencies.

> When I think about issues like this, I like to remind myself that the
> mesh network isn't really an essential part of Bitcoin.

That's absolutely true— but I don't know that it's relevant in this
case.

> Nodes can at some point start to charge fees to collect and distribute
> transactions and blocks.

They can— but doing so would radically undermine Bitcoin.

A refresher: if you combine digital signatures with simple transaction
rules you can have a purely autonomous monetary system based entirely
on math. It would be perfect, anonymous, scalable... except for the
problem of double spending. To solve double spending the participants
must agree on which of a set of duplicated payments is the
authoritative one.

Coming to this agreement is fundamentally hard just from the basics of
physics— a result of relativity is that observers will perceive events
in different orders depending on the observers' and the events'
relative locations. If no observer is privileged (a decentralized
system) you have to have a way of reaching a consensus. The kind of
efficient consensus we need— in which participants can join or leave at
any time, which doesn't require exponential communication, and which is
robust against sock-puppet participants— was long believed to be
practically impossible.

Bitcoin solved the problem by using hashcash to vote— because real
resources are forever expended in the process, the sock-puppet problem
is solved. But the vote only works if everyone can see the results of
it: we assume that the majority of hashpower isn't a dishonest party,
and that honest nodes can't be prevented from hearing the honest
history. Nodes choose the rules-valid history that has the most work
(votes) expended on it... but they can only choose among what they know
of. As Satoshi wrote: "[Bitcoin] takes advantage of the nature of
information being easy to spread but hard to stifle".
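[The selection rule described in the last paragraph — follow the most-work,
rule-valid history among those a node has actually heard about — can be
stated compactly. The sketch below is a simplification with made-up
structures, not the reference client's logic.]

    # Simplified best-chain selection: among the candidate chains a node
    # has actually received, keep those that pass local validity rules,
    # then pick the one with the most accumulated proof of work.  Chain
    # and its fields are stand-ins, not real bitcoind structures.

    from dataclasses import dataclass

    @dataclass
    class Chain:
        tip_hash: str
        total_work: int      # sum of per-block work
        valid: bool          # result of full local validation

    def best_chain(known_chains):
        candidates = [c for c in known_chains if c.valid]
        if not candidates:
            return None
        # A node can only choose among chains it knows of; a better chain
        # it never hears about does not exist from its point of view.
        return max(candidates, key=lambda c: c.total_work)

    print(best_chain([Chain("aa", 120, True), Chain("bb", 150, False),
                      Chain("cc", 130, True)]).tip_hash)    # -> "cc"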
The requirement for everyone to hear the history doesn't get talked
about much— at least with reasonably sized blocks and today's technical
and political climates, the assumption that information is easy to
spread but hard to stifle is a very sound one. It's a good thing,
because this assumption is even more important than the hashpower
honesty assumption (a malicious party with a simple majority of
hashpower is much weaker than one who can partition the network).

... but all of that goes out the window if communicating blocks is
costly enough that the only way to sustain it is to jealously guard and
charge for access/forwarding. The consequence of such a change is that
the Bitcoin consensus algorithm would be handicapped. How long must you
wait before you know that the history you have won't get replaced by a
more authoritative one? Today an hour or two seems relatively solid. In
a world with non-uniform block forwarding, perhaps it takes days— if
ever— before any participant is confident that there isn't a better
history lurking. All doubly so if the bookkeeping required for these
payments ends up necessitating additional transactions and adds to the
load. [This is also the flaw in the 'Red Balloons' paper: making
transactions a dozen times longer just to attach credit for forwarding
doesn't seem wise compared to keeping transactions so cheap to transmit
that even a small number of altruists make liberal forwarding the
equilibrium state.]

> They would also charge for the propagation of blocks based on the
> work required to validate them.

Large miners would obviously locate and connect to each other. Even
enormous blocks are no problem for big industrial players. Don't want
to pay the cost to get their big blocks from them? Your loss: if you
don't take their blocks and they constitute the longest history, you'll
be believing the wrong history until such time as you wise up and pay
the piper. Your transactions will be reversed and you'll lose money.

You can hypothesize some cartel behavior external to the rules of the
system— where by some consensus mechanism (????) some super-large mass
of participants agrees to reject blocks according to 'extrajudicial
rules', some rule existing outside of Bitcoin itself— but there must be
a consensus, because rejecting blocks by yourself only gets you ripped
off. I don't see how this works— it basically embeds another hard
consensus problem (what are the criteria for blocks to be valid?)
inside our solution to a hard consensus problem (which are the best
valid blocks?), but doesn't benefit from the same incentive structure—
locally greedy miners obviously want to produce the largest blocks
possible— and in hashpower consensus non-miners don't have a voice.
That might be acceptable for ordering, but now you're deciding on the
rules of the system, which all non-trusting participants must validate.

You could instead solve that consensus problem with politically
stipulated regulation or industry cartels, or good old-fashioned
kneecap busting or what have you. But then Bitcoin loses the
transparency and determinism that make it worthwhile. I sure hope to
hear something better than that.

This is basically the gap: right now I could afford hardware that could
process multiple-gigabyte blocks— maybe it only costs as much as a
small house, which is not an insane cost for a large business. But the
cost would be decidedly non-negligible, and it would be rational for me
to let someone else take it.
Applied to everyone, you end up with a small number of the most vested
parties doing all the validation, and so they have full ability to
manipulate, like today's central banks. For a great many people to
perform validation— keeping the system honest and decentralized as it
was envisioned— without worrying about the cost requires that the cost
be almost unnoticeable: a tiny fraction of what some industrial player—
who profits from consolidation and manipulation— could easily handle.

I'm skeptical about the system internally self-regulating the size
because of what gets called "evaporative cooling" in the social
sciences— the cost goes up, some people cross their "hey, I'm better
off if I externalize the cost of keeping Bitcoin secure by not
participating" boundary, and they lose their voice. There is probably
some equilibrium where Bitcoin is compromised frequently enough that
more validators spin up (and ignore past rule violations which can't be
undone without economic Armageddon) and eat the costs, even though
there is an insane amount of freeloading going on. The trustworthiness
of today's monetary systems suggests to me that if there is an
equilibrium point here, it isn't a very trustworthy one.

There is an even stronger equilibrium state available too: don't use
Bitcoin at all. If you want a system which is dominated by political
whim, expedience, and large industrial players, and which is, as a
result, only somewhat trustworthy, you can just use government-issued
currencies— they're well established and have a lot less overhead than
this decentralized stuff.

(And generally— security makes for a terrible market; security is
naturally a lemon market. The need is only clear in hindsight. In our
case it would be one with an enormous freeloading problem.)

> P.S. such a fee structure would address the spam issue with economics
> rather than rules about spammy transactions

Our current anti-spam approach is primarily an economic one—
transactions are prioritized based on fee per KB in scarce block space
or on priority (another scarce commodity); the only really non-economic
part is the very-small-output heuristic. I would argue that our
economic anti-spam mechanisms are currently failing at their job:
various parties are engaging in transaction patterns with near-pessimal
efficiency— using a dozen (sometimes thousands) of transactions where
one or two would be adequate. This isn't limited to just one or two
sites— many parties are using inefficient transaction patterns—
creating externalized costs on all future Bitcoin users— simply because
there is hardly any incentive not to.

Despite much discussion among technical people, no one has come up with
any reparametrization that seems likely to achieve the desired
incentive alignment in the near term. Of all the elements of the
anti-spam policy, it seems to me that the least economic— the minimum
output size— is actually the most effective (the most spam suppression
relative to efficient-usage suppression), especially as we move to
focusing on the UTXO set size. (The minimum output value requirement
discourages the creation of UTXOs which will never be economically
rational to redeem.)
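[As a concrete illustration of the two anti-spam levers mentioned above —
fee-per-kB ordering of scarce block space and a minimum output value — the
toy selection below shows how they interact. The constants and names are
this sketch's assumptions, not the reference client's actual policy.]

    # Toy block-filling policy combining the two levers discussed above:
    #  * a minimum output value ("dust") rule that refuses outputs which
    #    would never be economical to spend, and
    #  * ordering the remaining transactions by fee per kilobyte so that
    #    scarce block space goes to the transactions paying the most.

    MIN_OUTPUT = 5000           # satoshis; assumed dust floor
    MAX_BLOCK_BYTES = 1_000_000

    def acceptable(tx):
        return all(value >= MIN_OUTPUT for value in tx["outputs"])

    def fee_per_kb(tx):
        return tx["fee"] * 1000 / tx["size"]

    def fill_block(mempool):
        chosen, used = [], 0
        for tx in sorted(filter(acceptable, mempool),
                         key=fee_per_kb, reverse=True):
            if used + tx["size"] <= MAX_BLOCK_BYTES:
                chosen.append(tx)
                used += tx["size"]
        return chosen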
* Re: [Bitcoin-development] Incorporating block validation rule modifications into the block chain
From: Stephen Pair @ 2013-02-14 2:44 UTC
To: Gregory Maxwell; +Cc: Bitcoin Dev

On Wed, Feb 13, 2013 at 7:28 PM, Gregory Maxwell <gmaxwell@gmail.com> wrote:
> <bunch of stuff>

I understand your arguments, but don't agree with many of your
conclusions.

> The requirement for everyone to hear the history doesn't get talked
> about much

One of the beauties of bitcoin is that the miners have a very strong
incentive to distribute the blocks they find as widely and as quickly
as possible... they also have a very strong incentive to hear about the
blocks that others find. There will not be an issue with blocks being
"jealously guarded"... what miners will want is a good feed of
transactions that they want to mine. They will be willing to pay for
those feeds (either by sharing the proceeds with highly connected
"relay" nodes or by operating highly connected nodes themselves).

Because miners will only want to pay to get a feed of profitable
transactions, they will not pay to receive transactions whose miner fee
does not cover the "relay" fee (by which I mean the fee or cost
associated with the bandwidth and validation that a transaction
requires) with some amount of profit. This means that the relay node
will not fetch and propagate those transactions whose fee is too small
(unless there were some other fee structure outside the miner's fee).
These are relatively easy businesses to operate... which means there
will be a lot of them and they'll compete on fees (with wallets
automatically discovering the cheapest of the services).

If the businesses of relaying and mining ever became too centralized,
other businesses with a vested interest in the success of bitcoin would
take the necessary steps to ensure there remained adequate
decentralization. It's important to remember that the centralization
that currently exists in the fiat currency world benefits one set of
businesses to the detriment of many others. Having a functioning and
trustworthy payment system benefits far more people and businesses than
a centralized system would.

It is good to be wary of these potential issues, but I don't see how
the economics are likely to yield the outcome you fear. And, really,
there's not a lot that can be done to prevent economics from dictating
the ultimate outcome. In fact, what I write above is not so much about
what I think *should* be built, it's more about what I *predict* will
be built.
* Re: [Bitcoin-development] Incorporating block validation rule modifications into the block chain
From: Gregory Maxwell @ 2013-02-14 3:38 UTC
To: Stephen Pair; +Cc: Bitcoin Dev

On Wed, Feb 13, 2013 at 6:44 PM, Stephen Pair <stephen@bitpay.com> wrote:
> One of the beauties of bitcoin is that the miners have a very strong
> incentive to distribute the blocks they find as widely and as quickly
> as possible... they also have a very strong incentive to hear about
> the blocks that others find. There will not be an issue with blocks
> being "jealously guarded"

Then perhaps I totally misunderstood what you were suggesting. I
believed you were saying block size would be controlled by people
having to pay to receive blocks and pay to have blocks forwarded.

> (by which I mean the fee or cost associated with the bandwidth and
> validation that a transaction requires) with some amount of profit.
> This means that the relay node will not fetch and propagate those
> transactions whose fee is too small (unless there were some other fee
> structure outside the miner's fee).

The only fee-or-cost they're worrying about is their own marginal
costs. This says nothing about the externalized cost borne by the
hundreds of thousands of other nodes which must also validate the block
they produce, many of which are not miners— if we are well
distributed— and thus don't have any way to monetize fees.

And even if they are all miners for some reason: if these fees are
paying the ever-growing validation/storage costs, what expenditure is
left for the proof of work that makes Bitcoin resistant to reversal? If
the cost is soaked up by validation/forwarding, then the capacity to
run a validating node ends up being the barrier to entry and difficulty
would be very low... which sounds fine until you realize that an
attacker doesn't have validation costs, and that selfish ("optimally
rational") miners could just eschew validation (who cares if you lose
some blocks to invalidity if you're producing them so much more cheaply
than the honest players?).

> It is good to be wary of these potential issues, but I don't see how
> the economics are likely to yield the outcome you fear. And, really,
> there's not a lot that can be done to prevent economics from dictating
> the ultimate outcome. In fact, what I write above is not so much about
> what I think *should* be built, it's more about what I *predict* will
> be built.

What I want is for economics to dictate a positive outcome. They can do
this with how the system is currently constructed, where the economics
of using the system are clearly aligned with securing it.
* Re: [Bitcoin-development] Incorporating block validation rule modifications into the block chain
From: Stephen Pair @ 2013-02-14 5:36 UTC
To: Gregory Maxwell; +Cc: Bitcoin Dev

On Wed, Feb 13, 2013 at 10:38 PM, Gregory Maxwell <gmaxwell@gmail.com> wrote:
> The only fee-or-cost they're worrying about is their own marginal
> costs. This says nothing about the externalized cost borne by the
> hundreds of thousands of other nodes which must also validate the
> block they produce, many of which are not miners— if we are well
> distributed— and thus don't have any way to monetize fees.

But this is exactly the point I'm making... the thousands of other
nodes do have a way to monetize the work they do in relaying and
validating transactions. Miners will pay them for the prompt delivery
of profitable transactions. So, in effect, the block reward and
transaction fees will be paying not only for the mining work, but also
for the validation and relaying work. Such nodes would get paid in
micro-transactions from the miners for that service. This would be one
way that full nodes could operate profitably (there may be many other,
indirect ways).

I think decentralization is pretty much guaranteed, because anyone with
profitable transactions would only deliver them to miners or other
peers that are willing to pay for them. This is in effect a rebate of a
portion of the transaction fee to the network for delivering the
transaction to the miner. Wallet software might cut out the middlemen
and submit directly to miners... other nodes with access to a large
volume of transactions and good infrastructure might be able to reduce
the infrastructure a miner has to maintain and deliver a larger volume
of fee-bearing transactions. And everyone would have a very good sense
of the market price for transaction fees for a given level of service
(speed of block inclusion).

The other side of it is that wallets will need to receive valid,
wallet-relevant transactions. They may also need to connect with
multiple nodes for independent verification of the validity of their
transactions. But I think that cost would be more than covered by the
fees they include in any transactions they originate (and if they
rarely originate fee-bearing transactions, they might need to pay
something to keep receiving an incoming transaction feed... it could be
as simple as an artificial transaction they pay to themselves, but that
includes a fee).

A while back everyone was worried that a tragedy-of-the-commons
situation would develop whereby all transactions that carried any fee
at all would get included by miners, thus destroying the mining
business as the block reward diminished... but I think the cost
involved in relaying and validating transactions ensures that situation
won't develop... mining nodes will have to connect only to relaying and
validating nodes, so that they can filter down the volume to something
that's profitable for them... and relaying and validating nodes will
ignore transactions with fees that are too low to be profitable.

It will be a few years before we see the kinds of volumes that will
force this infrastructure to evolve... I don't think there is an issue
with lifting or even eliminating the block size limit... there may be a
point at which the volume is sufficient that full nodes start dropping
offline... and the nodes that do remain will have to increasingly find
ways to cover their costs... which will be a forcing function for
solutions similar to these. There is no doubt that Bitcoin will be a
lot more valuable if it can handle very large volumes of transactions.

Also, Mike Hearn has done some analysis suggesting that even at Visa
scale, the hardware requirements to do full validation and relay may
not be all that substantial (enabling lots of small, but profitable,
node operators and low transaction fees... the key to profitability
would be access to a sufficient number of original fee-bearing
transactions).
* Re: [Bitcoin-development] Incorporating block validation rule modifications into the block chain
From: Peter Todd @ 2013-02-14 6:07 UTC
To: Stephen Pair; +Cc: Bitcoin Dev

On Wed, Feb 13, 2013 at 09:44:11PM -0500, Stephen Pair wrote:
> One of the beauties of bitcoin is that the miners have a very strong
> incentive to distribute the blocks they find as widely and as quickly
> as possible... they also have a very strong incentive to hear about
> the blocks that others find. There will not be an issue with blocks
> being "jealously

The idea that miners have a strong incentive to distribute blocks as
widely and as quickly as possible is a serious misconception. The
optimal situation for a miner is if they can guarantee their blocks
would reach just over 50% of the overall hashing power, but no more.
The reason is orphans.

Here's an example that makes this clear: suppose Alice, Bob, Charlie
and David are the only Bitcoin miners, and each of them has exactly the
same amount of hashing power. We will also assume that every block they
mine is exactly the same size, 1MiB. However, Alice and Bob both have
pretty fast internet connections, 2MiB/s and 1MiB/s respectively.
Charlie isn't so lucky; he's on an average internet connection for the
US, 0.25MiB/s. Finally, David lives in a country with a failing
currency, and his local government is trying to ban Bitcoin, so he has
to mine behind Tor and can only reliably transfer 50KiB/s.

Now the transactions themselves aren't a problem; 1MiB per 10 minutes
is just 1.8KiB/s on average. However, what happens when someone finds a
block? Say Alice finds one, and with her 2MiB/s connection she
simultaneously transfers her newfound block to her three peers. She has
enough bandwidth that she can do all three at once, so Bob has it in 1
second, Charlie in 4 seconds, and finally David in 20 seconds.

The thing is, David has effectively spent those 20 seconds doing
nothing. Even if he found a new block in that time he wouldn't be able
to upload it to his other peers fast enough to beat Alice's block. In
addition, there was also a probabilistic time window *before* Alice
found her block in which, even if David had found a block, he couldn't
have gotten it to the majority of the hashing power fast enough to
matter. Basically we can say David spent about 30 seconds doing
nothing, and thus his effective hash power is now down by 5%.

However, it gets worse. Let's say a rolling-average mechanism to
determine the maximum block size has been implemented, and since demand
is high enough that every block is at the maximum, the rolling average
lets the blocks get bigger. Let's say we're now at 10MiB blocks.
Average transaction volume is now 18KiB/s, so David has just 32KiB/s
left, and a 10MiB block takes 5.3 minutes to download. Including the
time window where David finds a new block but can't upload it, he's
down to doing useful mining a bit over 3 minutes per block on average.
Alice, on the other hand, now has 15% less competition, so she has
clearly benefited from the fact that her blocks can't propagate quickly
to 100% of the installed hashing power.
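[To make the arithmetic above easy to check, here is a small throwaway
calculation reproducing the propagation times and lost-mining-time
estimate. The modeling choices — receive bandwidth is the bottleneck, a
600-second block interval, download time alone counted as lost time — are
assumptions of this sketch, not part of the original argument.]

    # Reproduces the back-of-the-envelope numbers above: how long each peer
    # needs to receive a block, and roughly what fraction of the block
    # interval that peer loses.  Assumes the recipient's downstream
    # bandwidth is the bottleneck and a 600-second average block interval.

    BLOCK_INTERVAL = 600.0                                     # seconds
    peers = {"Bob": 1024.0, "Charlie": 256.0, "David": 50.0}   # KiB/s

    for size_mib in (1, 10):
        block_kib = size_mib * 1024.0
        tx_load = block_kib / BLOCK_INTERVAL    # steady relay load, KiB/s
        print("--- %d MiB blocks (tx load %.1f KiB/s) ---" % (size_mib, tx_load))
        for name, rate in peers.items():
            spare = max(rate - tx_load, 0.0)
            t = block_kib / spare if spare else float("inf")
            lost = min(t, BLOCK_INTERVAL) / BLOCK_INTERVAL
            print("%-8s %6.0f s to receive, ~%2.0f%% of the interval lost"
                  % (name, t, 100 * lost))

With 1MiB blocks David needs about 21 seconds per block; at 10MiB blocks
the same calculation gives roughly 310 seconds (about 5.2 minutes),
matching the figures quoted above before the pre-block window is added.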
Now I know you are going to complain that this is BS because obviously
we don't need to actually transmit the full block; everyone already has
the transactions, so you just need to transfer the tx hashes, roughly a
10x reduction in bandwidth. But that doesn't change the fundamental
principle: instead of David being pushed offline at 10MiB blocks, he'll
be pushed offline at 100MiB blocks. Either way, the incentive is to
create blocks so large that they only reliably propagate to a bit over
50% of the hashing power, *not* 100%.

Of course, who's to say Alice and Bob are mining blocks full of
transactions known to the network anyway? Right now the block reward is
still high and tx fees are low. If there isn't actually 10MiB of
transactions every ten minutes on the network, it still makes sense for
them to pad their blocks to that size anyway to force David out of the
mining business. They would gain from the reduced hashing power, and
get the tx fees he would have collected. Finally, since there are now
just three miners, for Alice and Bob whether or not their blocks ever
get to Charlie is now totally irrelevant; they have every reason to
make their blocks even bigger.

Would this happen in the real world? With pools, chances are people
would quit mining solo or via P2Pool and switch to central pools. Then,
as the block sizes got large enough, they would quit pools with higher
stale rates in preference for pools with lower ones, and eventually the
pools with lower stale rates would probably wind up clustering
geographically so that the cost of the high-bandwidth internet
connections between them would be cheaper. Already miners are very
sensitive to orphan rates, and will switch pools because of small
differences in that rate.

Ultimately the reality is that miners have very, very perverse
incentives when it comes to block size. If you assume malice, these
perverse incentives lead to nasty outcomes, and even if you don't
assume malice, for pool operators the natural cycle of slightly reduced
profitability leading to less ability to invest in and maintain fast
network connections, leading to more orphans, fewer miners, and finally
further reduced profitability due to higher overhead will inevitably
lead to centralization of mining capacity.

--
'peter'[:-1]@petertodd.org
* Re: [Bitcoin-development] Incorporating block validation rule modifications into the block chain
From: Stephen Pair @ 2013-02-14 12:59 UTC
To: Peter Todd; +Cc: Bitcoin Dev

On Thu, Feb 14, 2013 at 1:07 AM, Peter Todd <pete@petertodd.org> wrote:
> The idea that miners have a strong incentive to distribute blocks as
> widely and as quickly as possible is a serious misconception. The
> optimal situation for a miner is if they can guarantee their blocks
> would reach just over 50% of the overall hashing power, but no more.
> The reason is orphans.

Perhaps, but a miner trying to target just over 50% of the network will
run the very real risk that they'll only reach 49%.

What about the case for centralization if the block size remains
capped? I see a far greater risk of centralization in that scenario
than if the cap were to be removed. The reason is very simple: bitcoin
would ultimately become useful only for very high value, settlement
transactions. Only the mega-corporations and banks would be using it
directly; everyone else would be doing their daily transacting in
centrally issued currencies of one form or another. As the banks and
mega-corps learned about the utility of bitcoin and began to use it en
masse, they would start to take the whole network off the public
internet and put it on a higher-speed and more reliable backbone. Those
corporations would establish mining agreements among themselves to
ensure none of the participants could take over the system and
compromise it, while at the same time keeping the operational costs to
a minimum. Bitcoin would then be a great alternative to the wire
transfer system, but would have no value to the average person wanting
cheap and private transactions over the Internet. Maybe Litecoin starts
to fill that niche.
* Re: [Bitcoin-development] Incorporating block validation rule modifications into the block chain
From: Peter Todd @ 2013-02-18 16:22 UTC
To: Stephen Pair; +Cc: Bitcoin Dev

On Thu, Feb 14, 2013 at 07:59:04AM -0500, Stephen Pair wrote:
> Perhaps, but a miner trying to target just over 50% of the network
> will run the very real risk that they'll only reach 49%.

Then don't be so aggressive; target 90% as I suggested and the miner
still comes out ahead by having 10% less hashing power to compete with.
50% is only a maximum because when more than 50% of the network does
not see your blocks the majority will inevitably create a longer chain
than yours, but at less than 50% your part of the network will
inevitably create a longer chain than theirs.

> What about the case for centralization if the block size remains
> capped? I see a far greater risk of centralization in that scenario
> than if the cap were to be removed. The reason is very simple: bitcoin
> would ultimately become useful only for very high value, settlement
> transactions. Only the mega-corporations and banks would be using it
> directly; everyone else would be doing their daily transacting in
> centrally issued currencies of one form or another. As the banks and
> mega-corps learned about the utility of bitcoin and began to use it en
> masse, they would start to take the whole network off the public
> internet and put it on a higher-speed and more reliable backbone.
> Those corporations would establish mining agreements among themselves
> to ensure none of the participants could take over the system and
> compromise it, while at the same time keeping the operational costs to
> a minimum. Bitcoin would then be a great alternative to the wire
> transfer system, but would have no value to the average person wanting
> cheap and private transactions over the Internet. Maybe Litecoin
> starts to fill that niche.

What you are describing is either *voluntary* centralization, or won't
happen. Nothing in your scenario will stop people from transacting on
the Bitcoin network directly; it will just make it more expensive. For
instance, suppose fees rose to the point where the value of the fees
was 10x the value of the block reward today: miners would be taking in
$972,000/day, or $6,750/block. At 1MiB per block that implies
transaction fees of about $6.75/KiB, or roughly $2 per typical
transaction. Even if the fees were $20 per transaction, that would be
pretty cheap for direct access to the world's bank-to-bank financial
network; I can still transfer an unlimited amount of money across the
planet, and no one can stop me.

Importantly, there will be plenty of demand to have transactions mined
from people other than banks and large corporations. Because there will
continue to be demand, and because 1MiB blocks mean running a relay
node is trivial enough that people can do it just for fun, banks won't
be able to force people to use their "high-speed backbone". Not to say
they won't create one, but it won't have any real advantage over
something that can be run in your basement.
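[As a quick sanity check on the fee arithmetic above, here is the
calculation written out. The block reward value (25 BTC at roughly $27 at
the time) and the ~300-byte typical transaction size are assumptions chosen
to reproduce the quoted figures; they are not stated in the message.]

    # Reproduces the rough fee figures quoted above.  Assumed inputs:
    # 25 BTC block reward, a BTC price of about $27 (early 2013),
    # 144 blocks per day, and a typical transaction size of ~300 bytes.

    BLOCK_REWARD_BTC = 25
    BTC_PRICE_USD = 27
    BLOCKS_PER_DAY = 144
    BLOCK_SIZE_KIB = 1024          # 1 MiB
    AVG_TX_BYTES = 300

    fees_per_block = 10 * BLOCK_REWARD_BTC * BTC_PRICE_USD   # "10x the reward"
    fees_per_day = fees_per_block * BLOCKS_PER_DAY
    fee_per_kib = fees_per_block / BLOCK_SIZE_KIB
    fee_per_tx = fee_per_kib * AVG_TX_BYTES / 1024

    print("per block: $%.0f, per day: $%.0f" % (fees_per_block, fees_per_day))
    print("per KiB:   $%.2f, per typical tx: $%.2f" % (fee_per_kib, fee_per_tx))
    # -> per block: $6750, per day: $972000
    # -> per KiB: ~$6.59, per typical tx: ~$1.93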
On the mining side, with 1MiB blocks the fixed costs for setting up a
mining operation are just a moderately powered computer with a bunch of
hard drive space and a slow internet connection. The marginal costs are
still there of course, but the costs of power and cooling are lower at
small scale than at larger industrial scales; power is often available
for free in small amounts, and cooling isn't a problem in small setups.
Because small-scale miners will still exist, there will still be a
market for "consumer" mining gear, and trying to regulate mining
equipment will just turn it into a black-market good. Small blocks let
you set up a mining operation anywhere in the world - good luck
controlling that. Mining will also remain a way to import bitcoins into
places.

Banks can try setting up exclusive mining contracts, but unless they
control more than 50% of the network they'll still have to accept
blocks found by these highly decentralized, small-scale miners. They'd
be better off broadcasting their transactions to those miners as well
so they don't get double-spent. Thus decentralized miners can still
profit from transaction fees, and still have an incentive to mine.
Doesn't sound like centralization to me at all.

On the other hand, with large blocks, not only is mining solo
unprofitable due to the huge fixed costs required to process the
blocks, but miners on pools can't effectively secure the network
because they can't independently verify that the blocks they are mining
are valid. It would be easy then to co-opt the relatively small number
of pools, a number that is not particularly large even now. Transaction
relay nodes would also be very expensive to run, and again the small
number of them makes them targets for control. Sure, transactions would
be cheap, but that doesn't do you any good if the small number of
miners out there are all regulated and ignore your transactions. That
does sound like centralization to me.

--
'peter'[:-1]@petertodd.org
* Re: [Bitcoin-development] Incorporating block validation rule modifications into the block chain
From: Gregory Maxwell @ 2013-02-14 1:02 UTC
To: Raph Frank; +Cc: bitcoin-development

On Wed, Feb 13, 2013 at 7:42 AM, Gregory Maxwell <gmaxwell@gmail.com> wrote:
> I hope that, should it become necessary to do so, the correct path
> will be obvious to everyone; otherwise there is a grave risk of
> undermining the justification for confidence in the immutability of
> any of the rules of the system.

With all I wrote on the gloom side, I thought I should elaborate on how
I think such a change would work, assuming that my gloom isn't
convincingly disproven.

It's the year 2043— the Y2038 problem is behind us and everyone is
beginning to forget how terrible it turned out to be. By some amazing
chance Bitcoin still exists and is widely used. Off-chain systems like
fidelity-bonded banks are vibrant and widely used, providing scalable,
instant and completely private transactions to millions of people.

Someone posts to the infrequently used IETF Bitcoin working group with
a new draft. It points out that the transaction load is high enough
that even with a 100x increase in block size competition for fees would
hardly be impacted, and that— because computers are 2^20 times faster
per unit cost than they were in 2013, and networks have made similar
gains— even a common wristwatch (the personal computer embedded in
everyone's wrist at birth) could easily keep up with 100-megabyte
blocks... so the size should be increased as of block 2,047,500.

The only objections are filed by some bearded hippy at the museum of
internet trolling (their authentic reconstruction of Diablo-D3's
desktop exhibit couldn't keep up), and by some dictatorship which again
insists that its communist PeoplesCoin should be used instead— the
usual suspects. And so, after a couple of years of upgrades, it is so.
Or perhaps more likely, it would get revised alongside a hard-forking
cryptosystem upgrade (e.g. replacing SHA-256 in the hash trees with
SHA-4-512), thus amortizing out all the migration costs...

The trickiness and risk of changing it— of economic problems, of the
risk of undermining trust in the immutability of the system's rules—
only exists if there is genuine, considered, and honest controversy
about the parameters. At the moment any increase would be sure to be
controversial: common hardware and networks would not obviously keep up
with our current maximum size, and our current transaction load doesn't
produce a usable fees market. This cannot remain true forever.
* Re: [Bitcoin-development] Incorporating block validation rule modifications into the block chain
From: Peter Todd @ 2013-02-14 6:39 UTC
To: Gregory Maxwell; +Cc: bitcoin-development

On Wed, Feb 13, 2013 at 05:02:39PM -0800, Gregory Maxwell wrote:
> It's the year 2043— the Y2038 problem is behind us and everyone is
> beginning to forget how terrible it turned out to be. By some amazing
> chance Bitcoin still exists and is widely used. Off-chain systems like
> fidelity-bonded banks are vibrant and widely used, providing scalable,
> instant and completely private transactions to millions of people.

Speaking of fidelity-bonded banks, I think it needs to be made clear
that really trustworthy bonded banks require the maximum block size to
be kept limited. The problem is that even if you don't create any
transactions on the chain yourself, you still need to be able to watch
the chain to keep track of what the bank is doing.

For instance, if you are trying to decide whether you can trust the
bank with a 1 BTC deposit, and they've purchased a 1000 BTC fidelity
bond, you still need to be able to determine whether all the unspent
transaction outputs in the block chain that the bank could spend, plus
all the unspent transactions in the mempool, are worth less than the
value of their fidelity bond. With 1MiB blocks that will be practical
on smartphones with wireless internet connectivity, without having to
trust anyone else. With 1GiB blocks that just won't be true, and you'll
be forced to trust the relatively few nodes out there with the hardware
to deal with the blockchain. You'll pay for it too.

Potentially the various UTXO proposals will help, but they will need to
be quite sophisticated; we'll need sums of all txout values by
scriptPubKey and a fraud-notice system, for instance. All of this stuff
is at best many months away from even beginning to be deployed on the
network, and probably years away from getting to the point where it is
truly trustworthy. Maybe it'll never become trustworthy, either because
miners just don't bother, the code doesn't get written, or a flaw in
the whole idea is found. We're just not going to know until these
technologies are implemented and tested, and without them, large blocks
force us into trusting miners blindly and make many valuable
applications impossible.

--
'peter'[:-1]@petertodd.org
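[The solvency check described above — the total value the bank could spend
must stay below its fidelity bond — is simple to state. A hypothetical
sketch follows, with made-up data structures standing in for the UTXO set
and mempool; nothing here reflects a real client interface.]

    # Hypothetical check of the condition described above: the value of all
    # outputs the bank can spend (confirmed UTXOs plus unconfirmed mempool
    # outputs) must stay below the value of its fidelity bond.

    def bank_exposure(utxo_set, mempool_outputs, bank_scripts):
        confirmed = sum(value for script, value in utxo_set
                        if script in bank_scripts)
        unconfirmed = sum(value for script, value in mempool_outputs
                          if script in bank_scripts)
        return confirmed + unconfirmed

    def bond_covers_bank(utxo_set, mempool_outputs, bank_scripts, bond_value):
        return bank_exposure(utxo_set, mempool_outputs, bank_scripts) < bond_value

    bank_scripts = {"scriptA", "scriptB"}
    utxos = [("scriptA", 600), ("scriptX", 50)]
    mempool = [("scriptB", 300)]
    print(bond_covers_bank(utxos, mempool, bank_scripts, bond_value=1000))
    # -> True: the bank's 900 in spendable outputs is below its 1000 bond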