Gavin Andresen
Transcript By: Bryan Bishop
http://blog.circle.com/2015/02/10/devcore-livestream/
The instant transaction time.. you know, I walk up to a cash register, I put my phone there, and in a second or two the transaction is confirmed and I walk away with my coffee. Anything beyond that, 10 minutes versus 1 minute, doesn’t matter. So the problem you want to solve is how do we get instant confirmation. There are a bunch of ideas about this, like a trusted third party that promises not to double spend, or having some coins locked up in a multisig wallet like Green Address. There are ideas about risk assessment of unconfirmed transactions: broadcast, then get an idea of how much of the network has seen it and whether anyone has seen a double spend, make some assumptions about how likely it is to be double spent, and if that risk is low enough and the amount is low enough then you take the risk and proceed. I don’t know how that will work its way out, or which of these approaches will work.
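To make that risk-assessment idea concrete, here is a minimal sketch of the kind of acceptance rule a merchant wallet might apply to an unconfirmed transaction. The names and thresholds (PropagationReport, accept_zero_conf, the 85% propagation cutoff, the $20 limit) are all hypothetical assumptions for illustration, not from any real wallet or from Bitcoin Core.

```python
# Hypothetical zero-confirmation risk check: accept an unconfirmed payment
# only if it has propagated widely, no conflicting spend has been seen,
# and the amount at risk is small.

from dataclasses import dataclass

@dataclass
class PropagationReport:
    peers_polled: int    # peers we asked about the transaction
    peers_seen: int      # peers that have it in their mempool
    conflict_seen: bool  # any peer reported a conflicting (double-spend) tx

def accept_zero_conf(report: PropagationReport, amount_usd: float,
                     min_propagation: float = 0.85,
                     max_amount_usd: float = 20.0) -> bool:
    if report.conflict_seen:
        return False
    propagation = report.peers_seen / max(report.peers_polled, 1)
    return propagation >= min_propagation and amount_usd <= max_amount_usd

# Example: a $3 coffee purchase seen by 18 of 20 polled peers, no conflicts.
print(accept_zero_conf(PropagationReport(20, 18, False), amount_usd=3.0))  # True
```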
Yeah? Are you tracking what the speed of adoption of these is for bitcoin.. how can you speed that up to get really fast iterations? So the question was about speeding adoption, and automatic updates. For Bitcoin Core, we have been pretty against automatic updates for a bunch of reasons. Partly, we don’t want the responsibility of holding keys and flipping switches on the bitcoin network overnight. That’s not ideal. Nobody wants to hold those keys, and some subset of people don’t even want to jointly hold them. Rolling out slowly has an advantage: if we screw it up, then we don’t screw up everyone all at once. We can put out a new release if we messed up. With automatic roll-outs, you can roll out to a certain percentage over time. As for tracking releases, there are websites that track the version numbers of nodes on the network. There was an interesting paper at Financial Crypto where they tracked transaction fees; they looked at the changes in transaction fees in Bitcoin Core and, based on the transactions that happened, which versions of Bitcoin Core may have been used at any given time. I hope that answers your question.
Yeah, in the back? So the question was about Nick Szabo… Szabo? Thanks. He proposed BitGold, which was a bitcoin-like system. In BitGold, the value of the tokens was related to the difficulty of creating them, which is not true in Bitcoin. It doesn’t matter what the difficulty is in Bitcoin, whether it’s 10 billion or 1. I think it would have been better if Satoshi had somehow tied the difficulty to the number of BTC created. Although.. I think Satoshi did know that you should keep things pretty simple; simpler is better for security and for just the ability to get a system out. I would have to think long and hard about whether you could come up with a system that is more fair, because people putting more work into creating these tokens would be rewarded more. There may be some runaway effect. I think we’re over-secure with mining right now. If the higher the difficulty went, the more BTC miners got, that might create a bad feedback loop. That’s an interesting question.
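As a toy illustration of that feedback-loop worry (this is not how Bitcoin works, and every parameter below is made up purely for illustration), here is a sketch where the block reward is tied to difficulty and miners respond to bigger rewards by adding hashpower:

```python
# Toy model of the hypothetical runaway loop: issuance proportional to
# difficulty, and hashpower (hence difficulty) growing with the reward.
# All numbers are invented; none reflect real Bitcoin parameters.

def simulate(rounds: int = 8, difficulty: float = 1.0,
             reward_per_unit_difficulty: float = 0.5,
             hashpower_response: float = 0.2) -> None:
    for i in range(rounds):
        reward = reward_per_unit_difficulty * difficulty  # issuance tied to difficulty
        # Bigger rewards attract more hashpower, which raises difficulty at the
        # next adjustment, which raises the reward again.
        difficulty *= 1.0 + hashpower_response * reward
        print(f"round {i}: reward={reward:.2f} BTC, difficulty={difficulty:.2f}")

simulate()
```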
One more question. Yeah? Well, first of all, congrats on a successful 20 megabyte block size test. Oh, that’s not done yet. I saw the blog post about it. I know that the core team has been historically opposed to bloating the blockchain with non-payment transactions, suggesting sidechains as an alternative method. Given petertodd’s objections to sidechains for anything other than testing new Bitcoin protocol extensions, and the utility of having things on the Bitcoin network, do you think larger block sizes would pave the way to allowing more use cases for Bitcoin other than monetary transactions?
So the question was about bloating the blockchain. Bloating, scare quotes. With non-financial data, and whether raising the block size would allow more different applications. On the issue of bloating, uh, I think I have a slightly different opinion than some of the other core developers. We should be politically neutral about what the blockchain is used for, whether that’s paying people for something, or whether it’s a ledger that just records transactions. We should be politically neutral and let the fee system do what it was designed to do: the most valuable uses should pay to use the global ledger. This is different from some other people who have strong feelings about people who want to pay to put data in the blockchain. I think once we have pruning, people will realize that the blockchain is not a permanent record. If you want a permanent record of, like, large amounts of data, you need another system. This is designed to be a transaction system with an open set of transactions that haven’t settled yet.
Will increasing the block size enable other applications? Transactions will be cheaper. They will cost less. Whether that enables other applications of the blockchain, I think it depends on how cheap transactions get.
The core developers tend to be conservative. I think that the OP_RETURN 80-byte pull request will be accepted. There are all sorts of tricks you can play if you want to store more data than that. P2SH - you can store a bunch of data in your script signatures if you really want to. Those will get pruned away, so people won’t have to store them for you, and that should be the way it is. I think that if the block size doesn’t change, non-financial transactions might start to push out the financial transactions. If you are recording a car title on the blockchain, perhaps you don’t mind spending $10 to store it, whereas you are probably not going to pay $10 to buy a cup of coffee, or a month’s worth of webhosting.
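As a concrete example of the OP_RETURN approach mentioned above, here is a minimal sketch that builds the raw output script embedding up to 80 bytes of data (the limit under the proposed relay policy discussed here). It only constructs the script bytes; funding and broadcasting the transaction are out of scope, and the sample payload is made up.

```python
# Build a bare OP_RETURN output script carrying a small data payload.

OP_RETURN = 0x6a
OP_PUSHDATA1 = 0x4c

def op_return_script(data: bytes) -> bytes:
    if len(data) > 80:
        raise ValueError("relay policy discussed here allows at most 80 bytes")
    if len(data) <= 75:
        push = bytes([len(data)]) + data                 # direct push: length byte + data
    else:
        push = bytes([OP_PUSHDATA1, len(data)]) + data   # OP_PUSHDATA1 for 76-80 bytes
    return bytes([OP_RETURN]) + push

# Hypothetical payload, e.g. a note about a car title transfer.
script = op_return_script(b"car title #12345 -> new owner")
print(script.hex())
```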
I think we’re done here. I will be around all day so come grab me. Thanks.