Ultimate glossary of cryptocurrency terms, acronyms and abbreviations
You've probably been hearing a lot about Bitcoin recently and are wondering what the big deal is. Most of your questions should be answered by the resources below, but if you have additional questions feel free to ask them in the comments. It all started with the release of Satoshi Nakamoto's whitepaper; however, that will probably go over the heads of most readers, so we recommend the following videos as a good starting point for understanding how bitcoin works and a little about its long-term potential:
Limited Supply - There will only ever be 21,000,000 bitcoins created, and they are issued in a predictable fashion; you can view the inflation schedule here. Once they are all issued Bitcoin will be truly deflationary. The halving countdown can be found here.
Open source - Bitcoin code is fully auditable. You can read the source code yourself here.
Accountable - The public ledger is transparent; all transactions are seen by everyone.
Decentralized - Bitcoin is globally distributed across thousands of nodes with no single point of failure, and as such can't be shut down, similar to how BitTorrent works. You can even run a node on a Raspberry Pi.
Censorship resistant - No one can prevent you from interacting with the Bitcoin network, and no one can censor, alter or block transactions that they disagree with; see Operation Choke Point.
Push system - There are no chargebacks in bitcoin because only the person who owns the address where the bitcoins reside has the authority to move them.
Low fee scaling - On-chain transaction fees depend on network demand and how much priority you wish to assign to the transaction. Most wallets calculate on-chain fees automatically, but you can view current fees here and mempool activity here (see the fee-estimation sketch after this list). On-chain fees may rise occasionally due to network demand; however, instant micropayments that do not require confirmations are happening via the Lightning Network, a second-layer scaling solution currently rolling out on the Bitcoin mainnet.
Borderless - No country can stop it from going in/out, even in areas currently unserved by traditional banking as the ledger is globally distributed.
Portable - Bitcoins are digital, so they are easier to move than cash or gold. They can even be transported by simply memorizing a string of words for wallet recovery (while cool, this method is generally not recommended due to the potential for insecure key generation by inexperienced users; hardware wallets are the preferred method for new users due to ease of use and additional security).
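For the technically inclined, here is a minimal sketch of how a script might pull current fee estimates before sending. It assumes the public mempool.space REST endpoint and its response field names (any other fee source works the same way); this is an illustration, not any particular wallet's code.

```python
# Minimal fee-estimation sketch, assuming the public mempool.space API
# (endpoint and field names are assumptions; substitute your own source).
# Requires the third-party `requests` package.
import requests

def recommended_fees() -> dict:
    """Return current fee estimates in satoshis per vbyte."""
    resp = requests.get("https://mempool.space/api/v1/fees/recommended", timeout=10)
    resp.raise_for_status()
    return resp.json()  # e.g. {"fastestFee": 12, "halfHourFee": 8, "hourFee": 5, ...}

if __name__ == "__main__":
    fees = recommended_fees()
    vbytes = 140  # a typical 1-input, 2-output segwit transaction is ~140 vbytes
    print(f"high priority: ~{fees['fastestFee'] * vbytes} sats")
    print(f"low priority:  ~{fees['hourFee'] * vbytes} sats")
```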
Bitcoin.org and BuyBitcoinWorldwide.com are helpful sites for beginners. You can buy or sell any amount of bitcoin (even just a few dollars worth), and there are several easy methods to purchase bitcoin with cash, credit card or bank transfer. Some of the more popular resources are below; also check out the bitcoinity exchange resources for a larger list of purchase options.
Here is a listing of local ATMs. If you would like your paycheck automatically converted to bitcoin, use Bitwage. Note: Bitcoins are valued at whatever market price people are willing to pay for them in a balancing act of supply vs demand. Unlike traditional markets, bitcoin markets operate 24 hours per day, 365 days per year. Preev is a useful site that shows how much various denominations of bitcoin are worth in different currencies. Alternatively you can just Google "1 bitcoin in (your local currency)".
Securing your bitcoins
With bitcoin you can "Be your own bank" and personally secure your bitcoins OR you can use third party companies aka "Bitcoin banks" which will hold the bitcoins for you.
If you prefer to "Be your own bank" and have direct control over your coins without having to use a trusted third party, then you will need to create your own wallet and keep it secure. If you want easy and secure storage without having to learn computer security best practices, then a hardware wallet such as the Trezor, Ledger or ColdCard is recommended. Alternatively there are many software wallet options to choose from here depending on your use case.
If you prefer to let third party "Bitcoin banks" manage your coins, try Gemini, but be aware that you may not be in control of your private keys, in which case you would have to ask permission to access your funds and would be exposed to third-party risk.
Note: For increased security, use Two Factor Authentication (2FA) everywhere it is offered, including email! 2FA requires a second confirmation code to access your account making it much harder for thieves to gain access. Google Authenticator and Authy are the two most popular 2FA services, download links are below. Make sure you create backups of your 2FA codes.
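As an aside for the curious: the 6-digit codes those apps generate are plain TOTP (RFC 6238). Here is a minimal sketch using only the Python standard library; the base32 secret below is a placeholder standing in for the one a real service would give you via its QR code.

```python
# Minimal TOTP (RFC 6238) sketch -- the same scheme Google Authenticator
# and Authy implement. Standard library only. The secret is a placeholder.
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, digits: int = 6, period: int = 30) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period           # 30-second time step
    msg = struct.pack(">Q", counter)               # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                     # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # placeholder secret; prints a 6-digit code
```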
As mentioned above, Bitcoin is decentralized, which by definition means there is no official website, Twitter handle, spokesperson or CEO. However, all money attracts thieves. This combination unfortunately results in scammers using official-sounding names or pretending to be an authority on YouTube or social media. Many scammers throughout the years have claimed to be the inventor of Bitcoin. Websites like bitcoin(dot)com and the btc subreddit are active scams. Almost all altcoins (shitcoins) are marketed heavily with big promises but are really just designed to separate you from your bitcoin. So be careful: any resource, including all linked in this document, may in the future turn evil. Don't trust, verify. Also, as they say in our community, "Not your keys, not your coins".
Where can I spend bitcoins?
Check out Spendabit or the bitcoin directory for millions of merchant options. You can also spend bitcoin anywhere Visa is accepted with bitcoin debit cards such as the CashApp card. Some other useful sites are listed below.
Mining bitcoins can be a fun learning experience, but be aware that you will most likely operate at a loss. Newcomers are often advised to stay away from mining unless they are only interested in it as a hobby, similar to Folding@home. If you want to learn more about mining you can read more here. Still have mining questions? The crew at /BitcoinMining would be happy to help you out. If you want to contribute to the bitcoin network by hosting the blockchain and propagating transactions, you can run a full node using this setup guide. If you would prefer to keep it simple there are several good options. You can view the global node distribution here.
Just like any other form of money, you can also earn bitcoins by being paid to do a job.
You can also earn bitcoins by participating as a market maker on JoinMarket, allowing users to perform CoinJoin transactions with your bitcoins for a small fee (requires you to already have some bitcoins).
The following is a short list of ongoing projects that might be worth taking a look at if you are interested in current development in the bitcoin space.
One bitcoin is quite large in value (thousands of £/$/€), so people often deal in smaller units. The most common subunits are listed below:
millibitcoin (mBTC) - 1,000 per bitcoin; used as the default unit in recent Electrum wallet releases.
microbitcoin (μBTC) - 1,000,000 per bitcoin; "bits" is the colloquial slang term for microbitcoin.
satoshi (sat) - 100,000,000 per bitcoin; the smallest unit in bitcoin, named after its inventor. One bitcoin is equal to 100 million satoshis.
For example, assuming an arbitrary exchange rate of $10,000 for one bitcoin, a $10 meal would equal 0.001 BTC = 1 mBTC = 1,000 bits = 100,000 satoshis.
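A quick worked version of that arithmetic (the price and meal cost are the arbitrary figures above, not live data):

```python
# Worked version of the example above. The price and meal cost are the
# arbitrary figures from the text, not live market data.
PRICE_USD_PER_BTC = 10_000
MEAL_USD = 10

btc = MEAL_USD / PRICE_USD_PER_BTC
print(f"{btc} BTC")                        # 0.001 BTC
print(f"{btc * 1_000:g} mBTC")             # 1 mBTC (millibitcoin)
print(f"{btc * 1_000_000:g} bits")         # 1000 bits (microbitcoin)
print(f"{round(btc * 100_000_000)} sats")  # 100000 satoshis
```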
For more information check out the Bitcoin units wiki. Still have questions? Feel free to ask in the comments below or stick around for our weekly Mentor Monday thread. If you decide to post a question in /Bitcoin, please use the search bar to see if it has been answered before, and remember to follow the community rules outlined on the sidebar to receive a better response. The mods are busy helping manage our community so please do not message them unless you notice problems with the functionality of the subreddit. Note: This is a community created FAQ. If you notice anything missing from the FAQ or that requires clarification you can edit it here and it will be included in the next revision pending approval. Welcome to the Bitcoin community and the new decentralized economy!
Taproot, CoinJoins, and Cross-Input Signature Aggregation
It is a very common misconception that the upcoming Taproot upgrade helps CoinJoin. TLDR: The upcoming Taproot upgrade does not help equal-valued CoinJoin at all, though it potentially increases the privacy of other protocols, such as the Lightning Network, and escrow contract schemes. If you want to learn more, read on!
Let's start with equal-valued CoinJoins, the type JoinMarket and Wasabi use. Some number of participants agree on a common value all of them will use: with JoinMarket the taker defines this value and pays the makers to agree to it; with Wasabi the server defines a value of approximately 0.1 BTC. Then, each participant provides inputs that they unilaterally control, totaling equal to or greater than the common value. Since each input is unilaterally controlled, each input typically requires just a singlesig. Each participant also provides up to two addresses they control: one of these will be paid with the common value, while the other will be used for any extra value in the inputs they provided (i.e. the change output). The participants then make a single transaction that spends all the provided inputs and pays out to the appropriate outputs. The inputs and outputs are shuffled in some secure manner, and the unsigned transaction is distributed back to all participants. Finally, each participant checks that the transaction spends the inputs they provided (and, more importantly, does not spend any other coins they might own that they did not provide for this CoinJoin!) and that the transaction pays out to the appropriate address(es) they control. Once they have validated the transaction, they ratify it by signing each of the inputs they provided. Once every participant has provided signatures for all the inputs they registered, the transaction is completely signed and the CoinJoin transaction is validly confirmable. CoinJoin is a very simple and direct privacy boost: it requires no SCRIPTs, needs only singlesig, etc.
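Here is a toy sketch of that construction. The names and data structures are invented purely for illustration; this is not JoinMarket or Wasabi code, and it ignores mining fees.

```python
# Toy sketch of equal-valued CoinJoin construction as described above.
# All names and structures are invented for illustration; this is not
# JoinMarket or Wasabi code, and mining fees are ignored for simplicity.
import random
from dataclasses import dataclass

@dataclass
class Participant:
    input_values: list[int]   # sats of each coin this participant contributes
    mix_addr: str             # address receiving the common value
    change_addr: str          # address receiving any excess

def build_coinjoin(participants: list[Participant], common_value: int):
    inputs, outputs = [], []
    for p in participants:
        total = sum(p.input_values)
        assert total >= common_value, "must cover the common value"
        inputs.extend(p.input_values)
        outputs.append((p.mix_addr, common_value))             # equal-valued output
        if total > common_value:
            outputs.append((p.change_addr, total - common_value))  # change output
    random.shuffle(inputs)    # shuffle so position leaks nothing
    random.shuffle(outputs)
    return inputs, outputs    # each participant verifies this, then signs

# The two-participant example from the next paragraph (0.105 and 0.114 BTC
# in, common value 0.1 BTC), expressed in satoshis:
ins, outs = build_coinjoin(
    [Participant([10_500_000], "A_mix", "A_change"),
     Participant([11_400_000], "B_mix", "B_change")],
    common_value=10_000_000)
print(outs)  # two indistinguishable 0.1 BTC outputs plus two change outputs
```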
Let's say we have two participants who have agreed on a common amount of 0.1 BTC. One provides a 0.105 coin as input, the other provides a 0.114 coin as input. This results in a CoinJoin with a 0.105 coin and a 0.114 coin as input, and outputs with 0.1, 0.005, 0.014, and 0.1 BTC. Now obviously the 0.005 output came from the 0.105 input, and the 0.014 output came from the 0.114 input. But the two 0.1 BTC outputs cannot be correlated with either input! There is no correlating information, since either output could have come from either input. That is how common CoinJoin implementations like Wasabi and JoinMarket gain privacy.
Unfortunately, large-scale CoinJoins like those made by Wasabi and JoinMarket are very obvious. All you have to do is look for transactions where, say, more than 3 outputs have the same value, and the number of inputs is equal to or larger than the number of equal-valued outputs. Thus, it is trivial to identify equal-valued CoinJoins made by Wasabi and JoinMarket. You can even trivially differentiate them: Wasabi equal-valued CoinJoins are going to have a hundred or more inputs, with outputs in units of approximately 0.1 BTC, while JoinMarket CoinJoins have fewer than a dozen equal-valued outputs (usually 4 to 6), with the common value varying wildly from as low as 0.001 BTC to as high as a dozen BTC or more. This has led a number of anti-privacy exchanges to refuse to credit custodially-held accounts if the incoming deposit is within a few hops of an equal-valued CoinJoin, usually citing concerns about regulations. Crucially, the exchange continues to hold the private keys for those "banned" deposits, and can still spend them, so this is effectively theft. If your exchange does this to you, you should report that exchange as stealing money from its customers. Not your keys, not your coins. Thus, CoinJoins represent a privacy tradeoff (a sketch of the detection heuristic follows the list below):
It's very hard for everyone else to determine which output belongs to which input.
It's obvious to everyone else that the output was involved in a mixing operation.
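Here is the detection heuristic described above as a sketch. The transaction representation is invented for illustration, and the thresholds come straight from the description.

```python
# Sketch of the chain-analysis heuristic described above: flag a
# transaction as a likely equal-valued CoinJoin when more than 3 outputs
# share the same value and the input count is at least that large.
from collections import Counter

def looks_like_equal_valued_coinjoin(input_values, output_values) -> bool:
    value, count = Counter(output_values).most_common(1)[0]
    return count > 3 and len(input_values) >= count

# A Wasabi-style join: many inputs, many ~0.1 BTC outputs -> flagged.
print(looks_like_equal_valued_coinjoin(
    input_values=[12_000_000] * 8,
    output_values=[10_000_000] * 6 + [1_700_000, 2_100_000]))  # True

# An ordinary payment -> not flagged.
print(looks_like_equal_valued_coinjoin(
    input_values=[15_000_000],
    output_values=[10_000_000, 4_900_000]))  # False
```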
Let's now briefly discuss that nice new shiny thing called Taproot. Taproot includes two components:
The use of a Schnorr-based signature scheme, with multisignature support. Spending from a Schnorr pubkey is called a "keypath spend".
The ability to secretly commit to a set of scripts, one of which can be revealed later and its inputs provided correctly in order to spend the coin. Spending via a hidden script is called a "scriptpath spend".
This has some nice properties:
Direct multisignature support means all multisignature uses look the same. In current Bitcoin, a 2-of-2 "multisignature" is really a script which demands that two signatures be provided, from 2 different pre-specified public keys. To a cryptographer, the strict definition of a multisignature is a single signature that is cooperatively created by multiple parties.
A typical minimal "multisig" setup would be a 2-of-3, because that lets you lose one signing device while still being able to keep access to your money, and still providing an increase in security relative to a singlesig, since a 2-of-3 requires that potential thieves abscond with at least two signing devices. In current Bitcoin, a 2-of-3 is a SCRIPT containing 3 public keys, requiring that two signatures from those three public keys be provided.
But a Lightning Network channel has exactly two participants. Thus, it uses a 2-of-2, and is a SCRIPT containing 2 public keys, requiring that two signatures from those public keys be provided. If you look for 2-of-2 spends on the blockchain after Lightning became cool, the chances are very good that a random 2-of-2 spend is a Lightning Network channel being closed, because there are hardly ever any other uses of 2-of-2.
Just from there, you can easily differentiate the most common HODLer multisig of 2-of-3 (SCRIPT contains 3 pubkeys) from the Lightning channel 2-of-2 (SCRIPT contains 2 pubkeys).
Fortunately, with Taproot, 2-of-3 and 2-of-2 (and any arbitrary k-of-n) can look exactly the same, because Schnorr allows for the cryptographer's strict definition of "multisignature": a single signature cooperatively created by multiple parties.
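To make that concrete, here is a toy n-of-n Schnorr demonstration. It is emphatically not BIP340/MuSig2: it uses a toy multiplicative group and omits the key-aggregation coefficients real MuSig needs to prevent rogue-key attacks. It exists only to show that the final artifact is one signature against one aggregate key.

```python
# Toy n-of-n Schnorr "multisignature": the end result is ONE signature
# verifying against ONE aggregate key, so a 2-of-2 spend looks identical
# to a singlesig spend. Toy group, no rogue-key protection -- NOT
# BIP340/MuSig2, just the core idea.
import hashlib, random

p = 2**89 - 1                 # a Mersenne prime; exponents reduce mod p - 1
g = 3

def H(*parts) -> int:
    data = b"|".join(str(x).encode() for x in parts)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % (p - 1)

def sign_together(secrets, msg):
    pubs = [pow(g, x, p) for x in secrets]
    X = 1
    for P in pubs:
        X = X * P % p                               # aggregate public key
    nonces = [random.randrange(1, p - 1) for _ in secrets]
    R = 1
    for r in nonces:
        R = R * pow(g, r, p) % p                    # aggregate nonce
    e = H(R, X, msg)                                # shared challenge
    s = sum(r + e * x for r, x in zip(nonces, secrets)) % (p - 1)
    return X, (R, s)

def verify(X, sig, msg) -> bool:                    # ordinary single-sig check
    R, s = sig
    return pow(g, s, p) == R * pow(X, H(R, X, msg), p) % p

X, sig = sign_together([123456789, 987654321], b"close channel")
print(verify(X, sig, b"close channel"))             # True: one key, one signature
```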
Complex SCRIPTs, like HTLCs, can be hidden in a Taproot output.
For example, the output can have a keyspend branch that is an n-of-n of all participants, with hidden SCRIPTs that encode the conditions under which the output can be spent.
The hidden SCRIPTs ensure that the protocol is followed. If one of the participants drops from the protocol, the rest can reveal the hidden SCRIPTs and follow their conditions.
If everyone follows the protocol correctly, and agrees to the result, they can all cooperatively sign with the keyspend n-of-n. They can just all agree on what the result of the SCRIPTs would be, and sign a transaction that performs that, without revealing any SCRIPTs. Since all of them agreed on the result, nobody should complain (if one of them believes the result is not correct, they can just refuse to sign and force everyone else to publish the SCRIPTs onchain).
If everyone agrees, they get privacy: none of the SCRIPTs they were following ever get published onchain, and it looks like every other multisignature spend.
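A toy sketch of that commitment structure follows, in the same spirit as the toy group above. Real Taproot (BIP341) does this with elliptic-curve points and tagged hashes; this only shows the shape.

```python
# Toy sketch of the Taproot commitment: the on-chain output key Q commits
# to both an internal (keypath) key P and a hidden script-tree root.
# Real Taproot (BIP341) uses elliptic-curve points; this is just the shape.
import hashlib

p = 2**89 - 1
g = 3

def H(*parts) -> int:
    data = b"|".join(str(x).encode() for x in parts)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % (p - 1)

x = 123456789                        # aggregate internal secret (the n-of-n)
P = pow(g, x, p)                     # internal public key
script_root = H(b"HTLC script, timeout clause, ...")  # hidden script commitment

t = H(P, script_root)                # the "tweak"
Q = P * pow(g, t, p) % p             # output key placed on-chain

# Keypath spend: holders of x jointly sign with x + t; onlookers see only
# an ordinary key Q and a single signature -- the scripts stay secret.
assert pow(g, (x + t) % (p - 1), p) == Q

# Scriptpath spend: reveal P and the script; anyone can recheck that Q
# really committed to that script, then enforce the script's conditions.
assert P * pow(g, H(P, script_root), p) % p == Q
print("keypath and scriptpath both verify against the same output key Q")
```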
Taproot DOES NOT HELP CoinJoin
So let's review! CoinJoin:
CoinJoin inputs are singlesig
There are no SCRIPTs involved in CoinJoin.
Taproot:
Improves multisig privacy.
Improves SCRIPT privacy.
There is absolutely no overlap. Taproot helps things that CoinJoin does not use. CoinJoin uses things that Taproot does not improve.
B-but They Said!!
A lot of early reporting on Taproot claimed that Taproot benefits CoinJoin. What it confused is that earlier drafts of Taproot included a feature called cross-input signature aggregation. In current Bitcoin, every input, to be spent, has to be signed individually. With cross-input signature aggregation, all inputs that support this feature are signed with a single signature that covers all those inputs. So, for example, if you spend two inputs, current Bitcoin requires a signature for each input, but with cross-input signature aggregation you can sign both of them with a single signature. This works even if the inputs have different public keys: two inputs with cross-input signature aggregation effectively define a 2-of-2 public key, and you can only sign for those inputs if you know the private keys for both, or if you are cooperatively signing with somebody who knows the private key of the other input.

This helps CoinJoin costs. Since CoinJoins will have lots of inputs (each participant will provide at least one, and probably more, and larger participant sets give more privacy in CoinJoin), if all of them enabled cross-input signature aggregation, such large CoinJoins could have only a single signature. This complicates the signing process for CoinJoins (the signers now have to sign cooperatively), but it can be well worth it for the reduced signature size and onchain cost (a rough cost sketch follows below).

But note that while cross-input signature aggregation improves the cost of CoinJoins, it does not improve the privacy! Equal-valued CoinJoins are still obvious and still readily bannable by privacy-hating exchanges. It does not improve the privacy of CoinJoin. Instead, see https://old.reddit.com/Bitcoin/comments/gqb3udesign_for_a_coinswap_implementation_fo
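A rough back-of-envelope of that cost saving, using the facts that a Schnorr signature is 64 witness bytes and witness bytes count one quarter toward virtual size (all other transaction bytes are ignored here):

```python
# Rough back-of-envelope for the cost saving: a Schnorr signature is 64
# witness bytes, and witness bytes count 1/4 toward virtual size. With
# cross-input aggregation, N inputs share one signature instead of N.
def signature_vbytes(n_signatures: int) -> float:
    return n_signatures * 64 / 4          # 16 vbytes per signature

for n_inputs in (2, 10, 100):
    without = signature_vbytes(n_inputs)  # one signature per input
    with_agg = signature_vbytes(1)        # one aggregate signature
    print(f"{n_inputs:3} inputs: {without:6.0f} -> {with_agg:.0f} vbytes "
          f"of signatures ({without - with_agg:.0f} saved)")
```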
Why isn't cross-input signature aggregation in?
There are some fairly complex technical reasons why cross-input signature aggregation isn't in the current Taproot proposal. The primary reason was to reduce the technical complexity of Taproot, in the hope that it would be easier to convince users to activate (while support for Taproot is quite high, developers have become wary of hoping that new proposals will ever activate, given the previous difficulties with SegWit). The main technical complexity is that it interacts with future ways to extend Bitcoin. The rest of this writeup assumes you already know how Bitcoin SCRIPT works. If you don't understand Bitcoin SCRIPT at the low level, then the TLDR is that cross-input signature aggregation complicates how to extend Bitcoin in the future, so it was deferred to let the developers think more about it. (This is how I understand it; perhaps pwuille or ajtowns can give a better summary.)

In detail: Taproot also introduces OP_SUCCESS opcodes. If you know about the OP_NOP opcodes already defined in current Bitcoin, OP_SUCCESS is basically "OP_NOP done right". OP_NOP is a do-nothing operation. It can be replaced in future versions of Bitcoin by having that operation check some condition, and then fail if the condition is not satisfied. For example, both OP_CHECKLOCKTIMEVERIFY and OP_CHECKSEQUENCEVERIFY were previously OP_NOP opcodes. Older nodes will see an OP_CHECKLOCKTIMEVERIFY and think it does nothing, but newer nodes will check if the nLockTime field has the correct specified value, and fail if the condition is not satisfied. Since most of the nodes on the network are using much newer versions of the node software, older nodes are protected from miners who try to misspend any OP_CHECKLOCKTIMEVERIFY/OP_CHECKSEQUENCEVERIFY, and those older nodes will still remain capable of syncing with the rest of the network: a dedication to strict backward compatibility necessary for a consensus system.

Softforks basically mean that a script that passes in the latest version must also pass in all older versions. A script cannot pass in newer versions but fail in older versions, because that would kick older nodes off the network (i.e. it would be a hardfork). But OP_NOP is a very restricted way of adding opcodes. Opcodes that replace OP_NOP can only do one thing: check if some condition is true. They can't push new data onto the stack, and they can't pop items off the stack. For example, suppose instead of OP_CHECKLOCKTIMEVERIFY we had added an OP_GETBLOCKHEIGHT opcode, which pushes the height of the blockchain onto the stack. If this opcode replaced an older OP_NOP opcode, then a script like OP_GETBLOCKHEIGHT 650000 OP_EQUAL might pass in some future Bitcoin version, but older versions would see OP_NOP 650000 OP_EQUAL, which would fail because OP_EQUAL expects two items on the stack. So older versions would fail a SCRIPT that newer versions pass, which is a hardfork and thus a backwards incompatibility.

OP_SUCCESS is different. Old nodes, when parsing the SCRIPT, will see OP_SUCCESS and, without executing the body, will consider the SCRIPT as passing. So the OP_GETBLOCKHEIGHT 650000 OP_EQUAL example will now work: a future version of Bitcoin might pass it, and existing nodes that don't understand OP_GETBLOCKHEIGHT will see OP_SUCCESS 650000 OP_EQUAL and will not execute the SCRIPT at all, instead passing it immediately.
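Here is a toy interpreter (not real Bitcoin SCRIPT, just the upgrade semantics) contrasting the two mechanisms with the OP_GETBLOCKHEIGHT example above:

```python
# Toy interpreter contrasting OP_NOP-style and OP_SUCCESS-style upgrades.
# Not real Bitcoin Script -- only the upgrade semantics are modelled.
KNOWN_OLD = {"OP_EQUAL", "OP_NOP"}            # ops an old node understands
KNOWN_NEW = KNOWN_OLD | {"OP_GETBLOCKHEIGHT"}
BLOCK_HEIGHT = 650_000                        # pretend chain height

def run(script, known, unknown_rule):
    # OP_SUCCESS semantics: an unrecognised opcode makes the whole script
    # pass at parse time, before anything executes.
    if unknown_rule == "success" and any(
            isinstance(op, str) and op not in known for op in script):
        return True
    stack = []
    for op in script:
        if not isinstance(op, str):
            stack.append(op)                  # push a number
        elif op == "OP_NOP" or (op not in known and unknown_rule == "nop"):
            pass                              # do nothing
        elif op == "OP_EQUAL":
            if len(stack) < 2:
                return False                  # stack underflow: fail
            stack.append(1 if stack.pop() == stack.pop() else 0)
        elif op == "OP_GETBLOCKHEIGHT":
            stack.append(BLOCK_HEIGHT)
        else:
            return False
    return bool(stack) and stack[-1] == 1

script = ["OP_GETBLOCKHEIGHT", 650_000, "OP_EQUAL"]
print(run(script, KNOWN_NEW, "nop"))      # True : new node passes it
print(run(script, KNOWN_OLD, "nop"))      # False: OP_NOP upgrade -> hardfork risk
print(run(script, KNOWN_OLD, "success"))  # True : OP_SUCCESS keeps old nodes passing
```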
So a SCRIPT that might pass in newer versions will pass for older versions, which keeps the backward-compatibility consensus that a softfork needs.

So how does OP_SUCCESS make things difficult for cross-input signature aggregation? Well, one of the ways to ask for a signature to be verified is via the opcode OP_CHECKSIGVERIFY. With cross-input signature aggregation, if a public key indicates it can be used for cross-input signature aggregation, then instead of OP_CHECKSIGVERIFY actually requiring the signature on the stack, the stack will contain a dummy 0 value for the signature, and the public key is instead added to a "sum" public key (i.e. an n-of-n that is dynamically extended by one more pubkey for each OP_CHECKSIGVERIFY operation that executes) for the single signature that is verified later by the cross-input signature aggregation validation algorithm. The important part here is that the OP_CHECKSIGVERIFY has to execute in order to add its public key to the set of public keys to be checked in the single signature.

But remember that an OP_SUCCESS prevents execution! As soon as the SCRIPT is parsed, if any opcode is OP_SUCCESS, the SCRIPT is considered as passing, without actually being executed, because the OP_SUCCESS could mean something completely different in newer versions and current versions should assume nothing about what it means. If the SCRIPT contains some OP_CHECKSIGVERIFY command in addition to an OP_SUCCESS, that command is not executed by current versions, and thus they cannot add any public keys given by OP_CHECKSIGVERIFY. Future versions also have to accept that: if they parse an OP_SUCCESS command that has a new meaning in the future, and then execute an OP_CHECKSIGVERIFY in that SCRIPT, they cannot add the public key into the same "sum" public key that older nodes use, because older nodes cannot see it. This means that you might need more than one signature in the future, in the presence of an opcode that replaces some OP_SUCCESS. Thus, because of the complexity of making cross-input signature aggregation work compatibly with future extensions to the protocol, it was deferred.
Cosmos is a heterogeneous network of many independent parallel blockchains, each powered by classical BFT consensus algorithms like Tendermint. Developers can easily build custom application-specific blockchains, called Zones, through the Cosmos SDK framework. These Zones connect to Hubs, which are specifically designed to connect Zones together. The vision of Cosmos is to have thousands of Zones and Hubs that are interoperable through the Inter-Blockchain Communication Protocol (IBC). Cosmos can also connect to other systems through peg zones, which are specifically designed Zones that are each custom-made to interact with another ecosystem such as Ethereum or Bitcoin. Cosmos does not use sharding; each Zone and Hub is sovereign with its own validator set. For a more in-depth look at Cosmos and more background on the points made in this article, please see my three-part series — Part One, Part Two, Part Three (there's a quick YouTube overview of Cosmos in the Medium article: https://medium.com/ava-hub/comparison-between-avalanche-cosmos-and-polkadot-a2a98f46c03b).
Polkadot is a heterogeneous blockchain protocol that connects multiple specialised blockchains into one unified network. It achieves scalability through a sharding infrastructure with multiple blockchains running in parallel, called parachains, that connect to a central chain called the Relay Chain. Developers can easily build custom application-specific parachains through the Substrate development framework. The Relay Chain validates the state transitions of connected parachains, providing shared state across the entire ecosystem. If the Relay Chain must revert for any reason, then all of the parachains would also revert. This is to ensure that the validity of the entire system can persist and no individual part is corruptible. The shared state means that the trust assumptions when using parachains are only those of the Relay Chain validator set, and no other. Interoperability is enabled between parachains through the Cross-Chain Message Passing (XCMP) protocol, and it is also possible to connect to other systems through bridges, which are specifically designed parachains or parathreads that are each custom-made to interact with another ecosystem such as Ethereum or Bitcoin. The hope is to have 100 parachains connect to the Relay Chain. For a more in-depth look at Polkadot and more background on the points made in this article, please see my three-part series — Part One, Part Two, Part Three (there's a quick YouTube overview of Polkadot in the Medium article: https://medium.com/ava-hub/comparison-between-avalanche-cosmos-and-polkadot-a2a98f46c03b).
Avalanche is a platform of platforms, ultimately consisting of thousands of subnets that form a heterogeneous interoperable network of many blockchains. It takes advantage of the revolutionary Avalanche consensus protocols to provide a secure, globally distributed, interoperable and trustless framework offering unprecedented decentralisation whilst being able to comply with regulatory requirements. Avalanche allows anyone to create their own tailor-made application-specific blockchains, supporting multiple custom virtual machines such as the EVM and WASM, written in popular languages like Go (with others coming in the future) rather than lightly used, poorly understood languages like Solidity. A virtual machine can then be deployed on a custom blockchain network, called a subnet, which consists of a dynamic set of validators working together to achieve consensus on the state of a set of many blockchains, where complex rulesets can be configured to meet regulatory compliance. Avalanche was built with serving financial markets in mind: it has native support for easily creating and trading digital smart assets with complex custom rulesets that define how the asset is handled and traded, so that regulatory compliance can be met. Interoperability is enabled between blockchains within a subnet as well as between subnets. Like Cosmos and Polkadot, Avalanche is also able to connect to other systems through bridges, via custom virtual machines made to interact with another ecosystem such as Ethereum or Bitcoin. For a more in-depth look at Avalanche and more background on the points made in this article, please see here and here (there's a quick YouTube overview of Avalanche in the Medium article: https://medium.com/ava-hub/comparison-between-avalanche-cosmos-and-polkadot-a2a98f46c03b).
Comparison between Cosmos, Polkadot and Avalanche
A frequent question I see being asked is how Cosmos, Polkadot and Avalanche compare. Whilst there are similarities, there are also a lot of differences. This article is not intended to be an extensive in-depth list, but rather an overview based on some of the criteria that I feel are most important. For a more in-depth view I recommend reading the articles for each of the projects linked above and coming to your own conclusions. I want to stress that it’s not a case of one platform being the killer of all other platforms, far from it. There won’t be one platform to rule them all, and too often tribalism has plagued this space. Blockchains are going to completely revolutionise most industries and have a profound effect on the world we know today. It’s still very early in this space, with most adoption limited to speculation and trading, mainly due to the limitations of blockchains and the current iteration of Ethereum, which all three of these platforms hope to address. For those who just want a quick summary, see the image at the bottom of the article. With that said, let’s have a look.
Each Zone and Hub in Cosmos is capable of up to around 1,000 transactions per second, with bandwidth being the bottleneck in consensus. Cosmos aims to have thousands of Zones and Hubs all connected through IBC. There is no limit on the number of Zones / Hubs that can be created.
Parachains in Polkadot are capable of up to around 1,500 transactions per second. A portion of the parachain slots on the Relay Chain will be designated as part of the parathread pool: the performance of one parachain slot is split between many parathreads, which offer lower performance and compete amongst themselves in a per-block auction to have their transactions included in the next Relay Chain block. The number of parachains is limited by the number of validators on the Relay Chain; they hope to be able to achieve 100 parachains.
Avalanche is capable of around 4,500 transactions per second per subnet. This is based on modest hardware requirements (just 2 CPU cores and 4 GB of memory, to ensure maximum decentralisation) and a validator set of over 2,000 nodes. Performance is CPU-bound, and if higher performance is required then more specialised subnets can be created with higher minimum requirements, able to achieve 10,000+ tps in a subnet. Avalanche aims to have thousands of subnets (each with multiple virtual machines / blockchains), all interoperable with each other. There is no limit on the number of subnets that can be created.
All three platforms offer vastly superior performance to the likes of Bitcoin and Ethereum 1.0. Avalanche, with its higher transactions per second, no limit on the number of subnets / blockchains that can be created, and consensus that can scale to potentially millions of validators all participating, scores ✅✅✅. Polkadot claims to offer more tps than Cosmos, but is limited in the number of parachains (around 100), whereas with Cosmos there is no limit on the number of Hubs / Zones that can be created. Cosmos is limited to a fairly small validator set of around 200 before performance degrades, whereas Polkadot hopes to be able to reach 1,000 validators in the Relay Chain (albeit only a small number of validators are assigned to each parachain). Thus Cosmos and Polkadot each score ✅✅ (image: https://preview.redd.it/2o0brllyvpq51.png?width=1000&format=png&auto=webp&s=8f62bb696ecaafcf6184da005d5fe0129d504518)
Tendermint consensus is limited to around 200 validators before performance starts to degrade. Whilst there is the Cosmos Hub, it is one of many Hubs in the network; there is no central hub and no limit on the number of Zones / Hubs that can be created.
Polkadot has 1,000 validators in the Relay Chain, and these are split up into small groups that validate each parachain (a minimum of 14 each). The Relay Chain is a central point of failure, as all parachains connect to it, and the number of parachains is limited by the number of validators (they hope to achieve 100 parachains). Due to the limited number of parachain slots available, significant sums of DOT will need to be purchased to win an auction to lease a slot for up to 24 months at a time, which is likely to mean that only those with enough funds can secure a parachain slot. Parathreads are, however, an alternative for those that can't secure a parachain slot and require lower, more variable performance.
Avalanche consensus can scale to tens of thousands of validators, even potentially millions of validators, all participating in consensus through repeated sub-sampling. The more validators, the faster the network becomes, as the load is split between them. There are modest hardware requirements so anyone can run a node, and there is no limit on the number of subnets / virtual machines that can be created.
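As a rough illustration of repeated sub-sampling, here is a toy simulation in the spirit of the Snowball family of protocols underlying Avalanche consensus. The sample size and quorum parameters are illustrative, not Avalanche's production values.

```python
# Toy simulation of Avalanche-style repeated sub-sampling: every node
# repeatedly polls a small random sample of peers and adopts a colour when
# a quorum of the sample agrees. Parameters (k, alpha) are illustrative.
import random

def simulate(n_nodes=100, k=10, alpha=7, max_rounds=200, seed=42):
    rng = random.Random(seed)
    prefs = [rng.choice("AB") for _ in range(n_nodes)]
    for rnd in range(1, max_rounds + 1):
        new_prefs = []
        for i in range(n_nodes):
            sample = [prefs[rng.randrange(n_nodes)] for _ in range(k)]
            # Adopt a colour if at least alpha of the k sampled peers hold it.
            if sample.count("A") >= alpha:
                new_prefs.append("A")
            elif sample.count("B") >= alpha:
                new_prefs.append("B")
            else:
                new_prefs.append(prefs[i])  # no quorum: keep current colour
        prefs = new_prefs
        if len(set(prefs)) == 1:            # whole network agrees
            return prefs[0], rnd
    return None, max_rounds                 # did not converge in the budget

colour, rounds = simulate()
print(colour, rounds)  # typically snaps to one colour within a few dozen rounds
```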
Avalanche offers unparalleled decentralisation using its revolutionary consensus protocols that can scale to millions of validators all participating in consensus at the same time. There is no limit to the number of subnets and virtual machines that can be created, and they can be created by anyone for a small fee, so it scores ✅✅✅. Cosmos is limited to around 200 validators, but there is no limit on the number of Zones / Hubs that can be created, which anyone can create, so it scores ✅✅. Polkadot hopes to accommodate 1,000 validators in the Relay Chain (albeit split amongst the parachains). The number of parachains is limited and may be cost-prohibitive for many, and the Relay Chain is ultimately a single point of failure. I'm definitely not saying it's centralised, and it is more decentralised than many others, but in comparison between the three it scores ✅ (image: https://preview.redd.it/ckfamee0wpq51.png?width=1000&format=png&auto=webp&s=c4355f145d821fabf7785e238dbc96a5f5ce2846)
Tendermint consensus, used in Cosmos, reaches finality within 6 seconds. Cosmos consists of many Zones and Hubs that connect to each other; communication between two Zones can pass through many Hubs along the way, which can add to latency depending on the path taken, as explained in part two of the articles on Cosmos. It doesn't need to wait for an extended period of time with risk of rollbacks.
Polkadot provides a hybrid consensus protocol consisting of a block-producing protocol, BABE, and a finality gadget called GRANDPA that works to agree on a chain, out of many possible forks, by following some simpler fork-choice rule. Rather than voting on every block, it reaches agreement on chains: as soon as more than 2/3 of validators attest to a chain containing a certain block, all blocks leading up to that one are finalized at once. If an invalid block is detected after it has been finalised, then the Relay Chain would need to be reverted along with every parachain. This is particularly important when connecting to external blockchains, as those don't share the state of the Relay Chain and thus can't be rolled back. The longer the time period, the more secure the network is, as there is more time for additional checks to be performed and reported, but at the expense of finality. Finality is reached within 60 seconds between parachains, but for external ecosystems like Ethereum, whose state obviously can't be rolled back like a parachain's, finality will need to be much longer (60 minutes was suggested in the whitepaper); this is discussed in more detail in part three.
Avalanche consensus achieves finality within 3 seconds, with most transactions finalising in under 1 second, immutable and completely irreversible. Any subnet can connect directly to another without having to go through multiple hops, and any VM can talk to another VM within the same subnet as well as in external subnets. It doesn't need to wait for an extended period of time with risk of rollbacks.
With regards to performance, far too much emphasis is put on tps as a metric; the other equally important metric, if not more important with regards to finance, is latency. Throughput measures the amount of data that can be handled at any given time, whereas latency is the amount of time it takes for an action to complete. It's pointless saying you can process more transactions per second than Visa when it takes 60 seconds for a transaction to complete. Low latency also greatly increases general usability and customer satisfaction; nowadays everyone expects card payments and online payments to happen instantly. Avalanche achieves the best results, scoring ✅✅✅; Cosmos comes in second with 6-second finality, ✅✅; and Polkadot, with 60-second finality (which may be 60 minutes for external blockchains), scores ✅ (image: https://preview.redd.it/kzup5x42wpq51.png?width=1000&format=png&auto=webp&s=320eb4c25dc4fc0f443a7a2f7ff09567871648cd)
Every Zone and Hub in Cosmos has its own validator set and different trust assumptions. Cosmos is researching a shared security model where a Hub can validate the state of connected Zones for a fee, but this has not been released yet. Once available, this will make shared security optional rather than mandatory.
Shared security is mandatory with Polkadot, which uses a shared-state infrastructure between the Relay Chain and all of the connected parachains. If the Relay Chain must revert for any reason, then all of the parachains would also revert. Every parachain makes the same trust assumptions, and as such the Relay Chain validates state transitions and enables seamless interoperability between them. In return for this benefit, they have to purchase DOT and win an auction for one of the available parachain slots. However, parachains can't rely solely on the Relay Chain for their security: as discussed in part three, they also need to implement censorship-resistance measures and ensure sybil-resistance mechanisms using proof of work or proof of stake on the parachain as well.
A subnet in Avalanche consists of a dynamic set of validators working together to achieve consensus on the state of a set of many blockchains, where complex rulesets can be configured to meet regulatory compliance. So unlike in Cosmos, where each Zone / Hub has its own validators, a subnet can validate a single or many virtual machines / blockchains with a single validator set. Shared security is optional.
Shared security is mandatory in Polkadot and a key design decision in its infrastructure. The Relay Chain validates the state transitions of all connected parachains, and thus it scores ✅✅✅. Subnets in Avalanche can validate the state of either a single or many virtual machines. Each subnet can have its own token and shares a validator set, where complex rulesets can be configured to meet regulatory compliance; it scores ✅✅. Every Zone and Hub in Cosmos has its own validator set / token, but research is underway to have the Hub validate the state transitions of connected Zones; as this is still early in the research phase, it scores ✅ for now. (image: https://preview.redd.it/pbgyk3o3wpq51.png?width=1000&format=png&auto=webp&s=61c18e12932a250f5633c40633810d0f64520575)
The Cosmos project started in 2016, with an ICO held in April 2017. There are currently around 50 projects building on the Cosmos SDK; a full list can be seen here by filtering for Cosmos SDK. Not all of the projects will necessarily connect using the native Cosmos SDK and IBC; some, such as Binance Chain, have forked parts of the Cosmos SDK and utilise Tendermint consensus, but have said they will connect in the future.
The Polkadot project started in 2016, with an ICO held in October 2017. There are currently around 70 projects building on Substrate; a full list can be seen here by filtering for Substrate Based. As with Cosmos, not all projects built using Substrate will necessarily connect to Polkadot, and parachains and parathreads aren't currently implemented in either the live network or the test network (Kusama) as of the time of this writing.
Avalanche in comparison started much later, with Ava Labs being founded in 2018. Avalanche held its ICO in July 2020. Due to the much shorter time it has been in development, the number of confirmed projects is smaller, with around 14 currently building on Avalanche. Due to the customisability of the platform, though, many virtual machines can be used within a subnet, making it incredibly easy to port projects over. As an example, it will launch with the Ethereum Virtual Machine, which provides byte-for-byte compatibility, so all the tooling like MetaMask, Truffle etc. will work and projects can easily move over to benefit from the performance, decentralisation and low gas fees offered. In the future, Cosmos and Substrate virtual machines could be implemented on Avalanche.
Whilst it's still early for all three projects (and the entire blockchain space as a whole), there are currently more projects confirmed to be building on Cosmos and Polkadot, mostly due to their longer time in development. Whilst Cosmos has fewer projects than Polkadot, Zones are already implemented, whereas Polkadot doesn't currently have parachains; IBC, to connect Zones and Hubs together, is due to launch in Q2 2021. Thus both score ✅✅✅. Avalanche has been in development for a much shorter time, but is launching with an impressive feature set right from the start: the ability to create subnets, VMs, assets, NFTs, permissioned and permissionless blockchains, cross-chain atomic swaps within a subnet, smart contracts, a bridge to Ethereum, etc. Applications can easily port over from other platforms and use all the existing tooling such as MetaMask / Truffle etc., while benefiting from the performance, decentralisation and low gas fees offered. Currently, though, just based on the number of projects in comparison, it scores ✅. (image: https://preview.redd.it/4zpi6s85wpq51.png?width=1000&format=png&auto=webp&s=e91ade1a86a5d50f4976f3b23a46e9287b08e373)
Cosmos enables permissioned and permissionless Zones which can connect to each other, with the ability to have full control over who validates the blockchain. For permissionless Zones, each Zone / Hub can have its own token and is in control of who validates.
With Polkadot, the state transition is performed by a small, randomly selected group of validators assigned from the Relay Chain, with the possibility that state is rolled back if an invalid transaction from any of the other parachains is found. This may pose a problem for enterprises that need complete control over who performs validation for regulatory reasons. In addition, due to the limited number of parachain slots available, enterprises would have to acquire and lock up large amounts of a highly volatile asset (DOT), with the possibility that they are outbid in future auctions and find they can no longer have their parachain validated, while parathreads don't provide the guaranteed performance requirements for the application to function.
Avalanche enables permissioned and permissionless subnets, and complex rulesets can be configured to meet regulatory compliance. For example, a subnet can be created where it's mandatory that all validators are from a certain legal jurisdiction, or hold a specific licence and are regulated by the SEC, etc. Subnets are also able to scale to tens of thousands of validators, and even potentially millions of nodes, all participating in consensus, so every enterprise can run their own node rather than only a small number. Enterprises don't have to hold large amounts of a highly volatile asset; instead they pay a fee in AVAX for the creation of the subnets and blockchains, which is burnt.
Avalanche provides the customisability to run private permissioned blockchains as well as permissionless ones, where the enterprise is in control of who validates the blockchain, with the ability to use complex rulesets to meet regulatory compliance; it scores ✅✅✅. Cosmos is also able to run permissioned and permissionless Zones / Hubs, so enterprises have full control over who validates a blockchain, and it scores ✅✅. Polkadot requires locking up large amounts of a highly volatile asset, with the possibility of being outbid by competitors, being unable to run the application if guaranteed performance is required, and having to migrate away. The Relay Chain validates the state transitions and can roll back a parachain should an invalid block be detected on another parachain; thus it scores ✅. (image: https://preview.redd.it/li5jy6u6wpq51.png?width=1000&format=png&auto=webp&s=e2a95f1f88e5efbcf9e23c789ae0f002c8eb73fc)
Cosmos will connect Hubs and Zones together through its IBC protocol (due for release in 2021). Connecting to blockchains outside of the Cosmos ecosystem would either require the connected blockchain to fork its code to implement IBC, or, more likely, a custom "peg zone" will be created specifically to work with the particular blockchain it's trying to bridge to, such as Ethereum. Each Zone and Hub has different trust levels, and connectivity between two Zones can have different trust depending on which path it takes (this is discussed more in this article). Finality time is low at 6 seconds, but depending on the number of hops this can increase significantly.
Polkadot's shared state means each parachain that connects shares the same trust assumptions of the Relay Chain validators, and that if one blockchain needs to be reverted, all of them will need to be reverted. Interoperability is enabled between parachains through the Cross-Chain Message Passing (XCMP) protocol, and it is also possible to connect to other systems through bridges, which are specifically designed parachains or parathreads that are each custom-made to interact with another ecosystem such as Ethereum or Bitcoin. Finality time between parachains is around 60 seconds, but longer will be needed (initial figures of 60 minutes in the whitepaper) for connecting to external blockchains, limiting the appeal of connecting two external ecosystems together through Polkadot. Polkadot is also limited in the number of parachain slots available, thus limiting the number of blockchains that can be bridged. Parathreads could be used for lower-performance bridges, but the speed of future blockchains is only going to increase.
A subnet can validate multiple virtual machines / blockchains, and all blockchains within a subnet share the same trust assumptions / validator set, enabling cross-chain interoperability. Interoperability is also possible between any other subnet, with the hope that Avalanche will consist of thousands of subnets. Each subnet may have a different trust level, but as the primary network consists of all validators, it can be used as a source of trust if required. As Avalanche supports many virtual machines, bridges to other ecosystems are created by running the connected virtual machine; there will be an Ethereum bridge using the EVM shortly after mainnet. Finality time is much faster at sub-3 seconds (with most happening in under 1 second) with no chance of rolling back, so it is more appealing when connecting to external blockchains.
All three systems are able to perform interoperability within their ecosystem, transferring assets as well as data, and to use bridges to connect to external blockchains. Cosmos has different trust levels between its Zones and Hubs, which can create issues depending on the path taken, with additional latency added. Polkadot provides the same trust assumptions for all connected parachains, but has long finality and a limited number of parachain slots available. Avalanche provides the same trust assumptions for all blockchains within a subnet, and different trust levels between subnets; however, since the primary network consists of all validators, it can be used for trust. Avalanche also has a much faster finality time, with no limitation on the number of blockchains / subnets / bridges that can be created. Overall, all three blockchains excel at interoperability within their ecosystems, and each scores ✅✅. (image: https://preview.redd.it/ai0bkbq8wpq51.png?width=1000&format=png&auto=webp&s=3e85ee6a3c4670f388ccea00b0c906c3fb51e415)
The ATOM token is the native token of the Cosmos Hub. People commonly mistake it for a token used throughout the Cosmos ecosystem, whereas it is just the token of one of many Hubs in Cosmos, each of which has its own token. Currently ATOM has little utility, as IBC isn't released and the Hub has no connections to other Zones / Hubs. Once IBC is released, Zones may prefer to connect to a different Hub instead, in which case ATOM would not be used. ATOM isn't a fixed-cap supply token; supply will continuously increase with a yearly inflation of around 10%, depending on the percentage staked. The current market cap for ATOM as of the time of this writing is $1 billion, with a 203 million circulating supply. Rewards can be earned through staking to offset the dilution caused by inflation. Delegators can also get slashed and lose a portion of their ATOM should their validator misbehave.
Polkadot's native token is DOT, and it's used to secure the Relay Chain. Each parachain needs to acquire sufficient DOT to win an auction for an available parachain lease period of up to 24 months at a time. Parathreads have a fixed registration fee that would realistically be much lower than the cost of acquiring a parachain slot, and they compete with other parathreads in a per-block auction to have their transactions included in the next Relay Chain block. DOT isn't a fixed-cap supply token; supply will continuously increase with a yearly inflation of around 10%, depending on the percentage staked. The current market cap for DOT as of the time of this writing is $4.4 billion, with an 852 million circulating supply. Delegators can also get slashed and lose their DOT (potentially 100% of their DOT for serious attacks) should their validator misbehave.
AVAX is the native token of the primary network in Avalanche. Every validator of any subnet also has to validate the primary network and stake a minimum of 2,000 AVAX. Unlike other consensus methods, there is no limit to the number of validators, so this can cater for tens of thousands, even potentially millions, of validators. As every validator validates the primary network, it can be a source of trust for interoperability between subnets as well as for connecting to other ecosystems, increasing the amount of AVAX paid in transaction fees. There is no direct slashing in Avalanche, so there is no risk of losing your AVAX when selecting a validator; instead, the rewards earned for staking can be slashed should the validator misbehave. Because Avalanche doesn't have direct slashing, it is technically possible for someone to both stake AND deliver tokens for something like a flash loan, under the invariant that all tokens that are staked are returned, thus being able to make a profit with staked tokens outside of staking itself. There will also be a separate subnet for Athereum, which is a 'spoon', or friendly fork, of Ethereum, benefiting from the Avalanche consensus protocol and the applications in the Ethereum ecosystem. Its native token, ATH, will be airdropped to ETH holders, and potentially to AVAX holders as well. This can be done for other blockchains too. Transaction fees on the primary network for all three of its blockchains, as well as subscription fees for creating a subnet or a blockchain, are paid in AVAX and are burnt, creating deflationary pressure. AVAX has a fixed capped supply of 720 million tokens, creating scarcity, rather than an unlimited supply which continuously increases at a compounded rate each year like others. Initially 360 million tokens were minted at mainnet, with vesting periods between 1 and 10 years and tokens gradually unlocking each quarter. The circulating supply is 24.5 million AVAX, with tokens gradually released each quarter. The current market cap of AVAX is around $100 million.
Disclaimer: This is my editing, so there could be some errors, misunderstandings or exaggerations. Waiting for "IOTA TIME " (an era where IOTA defines nearly everything in terms of the block-chain world) niels12어제 오후 4:51 IOTA funds are public:https://thetangle.org/address/IDNAFP9FWWKYGNDMKGJWZD9GATGRPTJYTYHLKFNDEQSISPSETLZQOSPGOHC99LMPXDEHSH9XYHNVOLUBBQPCEGHYK9But they have probably other sources of income, like funding by government etc. And maybe also other IOTA funds on other addresses. I don't know. Balance: 59.68 Ti David Sønstebø어제 오후 9:41 I wonder how many times an out of context 2 year old private DM has to be addressed. At the time IOTA was approaching stagnation due to the actions of primarily CFB**, thus since we both started Jinn together which lead to IOTA,** I tried repeatedly to talk sense into him.I.E. "If you are going to torpedo all progress, let's just sell it all and start from scratch, fuck it"It's a figure of speech, while trying to talk sense into someone who insists that 1 + 1 = 3.59 My tax records show when I last sold iotas. February of 2018. Now stop reading into private DMs, especially ones taken out of context and especially those leaked by someone who's proclaimed he is going to ruin IOTAand my life. You need to go back to school if you think there is anything to 'speculate' on there. dom어제 오후 4:15 u/unsywe will release the condensed version of them once we want to.Just because you so desperately desire them for whatever reason doesn't make us do it faster. Being in this space for so fucking long,last thing I want is to attempt to act in good faith again and then be screwed over by those trying to misconstrue reality and spread lies.We've been at that for too long.Once they are fully ready, and we have them in a format we like, we will publish them. dom어제 오후 4:16 Our objective of the finance / legal department is to become one of the most trustworthy / transparent organizations in this space. Which is why we're setting up new and stricter policies in general dom어제 오후 4:18 quite frankly, with everything that has happened up until now, I would certainly say thatwe are one of the most transparent organization(if we wanted it or not)u/unsy dom어제 오후 4:21 u/unsyI am not worried about it.If we have problems, we always solve them - I think we've proven that by now. And as it stands right now with our current funding + our strategy, we are in good hands David Sønstebø오늘 오전 6:41 Don't worry, a shitty FUD piece in a cryptoblog is nada [오전 6:41] We were once numero uno target by Jeffrey Epstein funded Joi Ito's MIT DCI [오전 6:41] This is nothing ------------------------------------------------------------------------------------ Antonio Nardella [IF]어제 오후 11:13 IMO the community has matured a lot, we have community and certified developers working with the IF in the X-Teams, there are new people coming in with direct interest in the tech (yeah, also spec is still popular) and from the chats that I've had, there are devs waiting for the breaking changes of Chrysalis P2, before starting to develop again..But that's my assessment.. Jelle Millenaar [IF]어제 오후 9:15 Well, I can say the DID developments are going smooth. 
Starting publishing the first DIDs to the Tangle ;D Jelle Millenaar [IF]어제 오후 9:15 And since I am totally not biased towards Identity, but its gonna be revolutionary ;D Jelle Millenaar [IF]어제 오후 10:06 This is the perfect time to loose faith in the IOTA Foundations capability to deliver, especially after the network just received a major update with many improvements. Its just crypto being crypto, dom오늘 오전 2:12 Yeh we'll go through it. This is the usual game... Dominik Schiener There is more tech maturity, more adoption and more progress than ever. We are one of the only projects which gets funding from government grants and corporations.Stop the attention grabbing headlines and get your sources right. Long field You can track their iota address, and I can tell they didn't sell any iota tokens in last two months HusQy IOTA is like a large decentralized network cable that connects any number of nodes with each other and that enables data and values to be exchanged with one another, whereby the data is protected against manipulation and the value transactions against double spends. Thereon ... ... you can run any decentralized application (we call this layer) - e.g. a blockchainthat stores certain data for as long as you want and limits the amount of data to be saved via fees like Bitcoin.Each of these uses inherit ... ... your security from the basic protocol and can specifically only save the data that is relevant for you (also decentralized).To say that IOTA is not a DLT is in principle not that wrong -it is a platform for DLTs and therefore much more powerful than all ... ... existing DLTs because it is much more flexible. For example, you can run Hashgraph in IOTA, or Bitcoin or whatever. And IOTA is the token that connects the entire ecosystem. This is of course "not yet" the case, but Chrysalis Part 2 is the first step. HusQy @blocktrainerperhaps this explanation will enable you to understand where the journey is going.If a decentralized data storage is required, then you can build it with IOTA and it then has exactly the same properties in terms of permanent storage as Bitcoin. Block trainer We can also get a little more technical.The way you describe it, it sounds like an interoperability layer ... something like that here, which then equates to a polkadot etc. 📷 HusQy In principle yes, only that it doesn't connect Bitcoin and ETH but "IOTA Smart Contracts" with "IOTA Storage" etc. It is not there to connect other projects but to offer the same as other projects, only faster and cheaper. ------------------------------------------------------------------------------- Bitcoin Coach And in 5 years there will be a completely new project, which then claims to be better than IOTA. And then should all the infrastructure be thrown overboard and the partners simply change the DLT? HusQy This is how technology works.It makes no sense to run the Internet on the basis of 64k modems just because many people have one at home.The change does not take place overnight but creeping and if you look at the BTC Dominance you can see that too. Ultimately, everything will switch to the best technology and we'll see which that is :) Block trainer The "best" must also be defined. What are the classes to master? HusQy All classes. If there is a technology that can represent even one aspect better, then it is not yet good enough. Blockchain, for example, is a "degenerate" DAG with only one reference. The goal is that IOTA can also use blockchains if the use case requires it. 
HusQy: The future is not "either DAG or blockchain" but both, seamlessly linked within the same ecosystem. IOTA smart contracts use a blockchain, for example, but a separate chain for each smart contract, and that blockchain lives within the Tangle.

Block trainer: According to the new definition, they are no longer saved ... A double spend could change the reference retrospectively.

HusQy: That's not quite true. The Tangle itself contains all information for all eternity, and you cannot remove any information. Once the data has reached a certain age, it is no longer stored by every node in the network, but you can still prove what happened in the part of the Tangle that was "forgotten" by the nodes after a certain time. There are two ways to keep this evidence: 1. You save the evidence personally and can present it at any time. 2. Someone writes a plug-in for the node which monitors the Tangle for information of a certain type and keeps a copy of, say, all car-purchase-related data forever (or for at least 30 years, for example). All dealerships could then install this plug-in and jointly store this data in a decentralized way, in order to query the information if necessary. You would only selectively save the data that interests you, and the evidence produced can still be verified by any node on the network. If the server of a car dealership fails, it can download the data again from one of the other dealerships. Effectively an application-specific private blockchain secured by the Tangle. It is also conceivable that there are service providers for this ...

----------------------------------------------------------------------------------

HusQy: Data is only kept immutable. How do you intend to execute a token transaction over pure data? I could simply send the following two data transactions at the same time: 1. "I'm sending $100 from address A to address B." 2. "I'm sending $100 from address A to address C."

HusQy: To determine which transaction succeeds / came first, you need consensus. Data transactions do not allow token transfers.

Block trainer: Why doesn't that allow token transfer? I can simply use it to sign my values. The question is about the meaning of the token. I can also sign that I have transferred €10 for the petrol station. Or I transmit the proof via locked BTC ...

HusQy: As I just described, you can publish two conflicting data transactions and no one knows which is the correct one :P

Block trainer: Unless you agree on a consensus. Timestamp + BTC (locked) in hash = value transmitted ... What else is the IOTA token for?

HusQy: Whether information is correct can only be seen in context. Look at the difference between "data" and "information". For example, you could claim that you locked Bitcoin even though you didn't.

Block trainer: I may need proof of this. See how, for example, BTC is unlocked in Liquid or in the LN. The IOTA data layer is extremely similar in principle to Lightning. Accordingly, sending tokens would be possible there, which means I see the use case of the IOTA coin at risk.

HusQy: Such a proof is impossible. The reason this works with LN nodes is that LN nodes are Bitcoin nodes which know what is happening in the Bitcoin network and therefore have "information," not just "data" :P What you are describing is technically impossible.

Block trainer: Data = information. What can the LN not do that IOTA sometimes can?
HusQy: It is just not true that data = information. There is a huge difference between data and information, and inter-chain transactions are impossible precisely because of that difference. LN won't work - there are too many game-theory problems :P

--------------------------------------------------------------------------------------------

Dominik Schiener: As an innovation leader in Europe, I certainly say we deserve to get grants. There is usually a below-7% success chance. And yes, everything is fully audited (by externals, of course), showing clearly how and that the money was used in achieving the milestones of the grant.

----------------------------------------------------------------------------------------------------------------------------------
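A quick aside on the exchange above: the two simultaneous "$100 from address A" data transactions are exactly the double-spend problem that consensus has to resolve. Here is a toy sketch of my own (Rust, not IOTA code) showing that both transactions are individually valid against the starting balance, yet only whichever one is applied first can succeed; deciding that order is precisely what a consensus mechanism is for:

```rust
use std::collections::HashMap;

struct Tx {
    from: String,
    to: String,
    amount: u64,
}

/// Applies a spend only if the sender still has the funds. Which of two
/// conflicting transactions "wins" depends entirely on application order,
/// and choosing that order is the job of consensus.
fn apply(balances: &mut HashMap<String, u64>, tx: &Tx) -> bool {
    let funded = balances.get(&tx.from).map_or(false, |b| *b >= tx.amount);
    if !funded {
        return false;
    }
    *balances.get_mut(&tx.from).unwrap() -= tx.amount;
    *balances.entry(tx.to.clone()).or_insert(0) += tx.amount;
    true
}

fn main() {
    let mut balances = HashMap::from([("A".to_string(), 100u64)]);
    let tx1 = Tx { from: "A".into(), to: "B".into(), amount: 100 };
    let tx2 = Tx { from: "A".into(), to: "C".into(), amount: 100 };
    println!("tx1 applied: {}", apply(&mut balances, &tx1)); // true
    println!("tx2 applied: {}", apply(&mut balances, &tx2)); // false: A already spent
}
```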
How YFI came out of nowhere to become the fastest coin to reach $1B and the fastest coin to ever get listed on Coinbase
Note: As mentioned to the original 624 Reddit subscribers, there will be $YFI-based Exclusive Original Content released here by myself and others from time to time. These kinds of interactive deep dives, with a Q&A with fellow investors / beta testers right afterwards, are a rare thing in crypto, and will only be found with this level of immediacy, social interaction, permanence, depth, and complexity of analysis and feedback on a platform like Reddit.

A lot of projects have low innovation, just copying something someone else has already done with small tweaks to things like variables in smart contracts. A few rare projects have genuine innovation, providing real value to investors and users through attractive new products that simplify a lot of things in this space. Even rarer are the unicorns that not only have innovation, but have it in spades, oozing out of every pore. $YFI is one of those unicorns. The scope of products and the rapidity of release of revolutionary new products from this project have simply been unmatched in the short history of crypto. Since 2009, the world of crypto has never seen anything like this lightning-fast pace of development spanning such a wide scope of products: optimized automated yield farming and lending that relentlessly hunts the best yields, crypto insurance on smart contracts, and a revolutionary stablecoin idea that essentially makes a USD altcoin "smart" with built-in yield farming capabilities for the first time, to name a few - all built by a genius smart contract builder who gave the world the first Fair Launch token.

Key to wrapping your head around the advantages that the yEarn Finance ecosystem has over - well, every single other option out there at this time - are the concepts below:
CeFi vs. DeFi
Smart Contract Stacking
The power of a Talented and Diverse DAO
To discuss these concepts, and to educate beginners, we first have to understand what the terms above truly mean. This post doesn't discuss any particular products and their advantages, only the systemic advantages that are available only to $YFI. This project seems to attract the smartest and highest-risk-taking of crypto investors, and an important part of truly understanding all of the risks involved is knowing the terms and concepts first. Even veteran crypto and DeFi users may be thrown for a loop by some of the innovative products and concepts that keep coming out of the YFI Labs. This project is going through an expansion phase, where the scope of everything and the reach of the various released products is increasing (insurance, a truly pegged stablecoin, yETH Version 2, ySwap, yLiquidate, etc.). You know there's some motherforker or twenty now avidly waiting for every piece of code that Andre drops onto GitHub, so they can be among the first to copy it verbatim and then claim it as "their own variation" because they changed some variables and titles. Yawn.

From the definitive glossary for the DeFi space - yet another $YFI innovation - I'll list their definitions below. These may not be the final definitions once I finish any V1.1 edits, but they're good enough for now, and at least three YFI Dev Team members have read, reviewed, or edited these definitions. I've also invited my fellow beta testers to comment on my RFC on this subreddit and in the Governance forum (among the documentation volunteers). Yes, this is how early DeFi investors are in the development and maturation of the DeFi space: anyone reading this right now is so early into DeFi's evolution that the terms used for this space are literally still being finalized by the community.

I've given a little sneak peek into how technical documentation somehow self-organizes in a powerful DAO such as this one. In this example, it starts with a call for help on Twitter to improve our documentation by tracheopteryx. Interested and qualified volunteers show up (or don't) when such a call is made. Your writers and editors have spent many a moment pondering off into space, debating whether a term really means this or that, or whether it was succinctly described or fully sufficient. It's a usually thankless and anonymous job that is critical in providing enough relevant information to users and investors. [Note: Just like anything related to the $YFI project, you can help us improve this documentation - any of it - if you see errors or better ways of describing this information.]

All terms are shamelessly plagiarized from myself and my fellow writer-editors - u/tracheopteryx and Franklin - from the draft definitions in our new DeFi glossary: https://docs.yearn.finance/defi-glossary

1. CeFi vs. DeFi

CeFi - Centralized Finance. In terms of cryptocurrency, CeFi is represented by centralized cryptocurrency exchanges and by businesses or organizations with a physical address and usually some sort of corporate structure. These CeFi businesses must follow all applicable laws, rules, and regulations in each country, state, or region in which they operate.

DeFi - Decentralized Finance. At its root, DeFi is a set of smart contracts running independently on blockchains such as the Ethereum network. Smart contracts may or may not interact with other smart contracts and even other blockchains.
The goal of DeFi is to enhance the profitability of investors through automated smart contracts that seek to maximize yields for invested funds. DeFi is marked by rapid, innovative progression and the testing of new ideas and concepts. It often involves high-risk investing, sometimes in smart contracts that have not been audited or even thoroughly reviewed (a review is not as comprehensive as an audit, though it may be included as part of one). For this and other reasons, DeFi is conventionally considered riskier than CeFi or traditional investing.

Comment: DeFi is higher risk partly because it moves so fast. A lot of yams, hot dogs, and sushi can get lost when you move so fast that you can't even be bothered to do a thorough audit before releasing code. The cream-of-the-crop projects will all have had multiple audits done by multiple independent auditors. Auditors are expensive, and at such an embryonic stage most projects can't afford one audit, let alone five. But if you can live with the higher risk intrinsic to DeFi and are willing to be a part of "testing in prod," then financial innovation can truly blossom. And if you let the best and brightest members of your community focus only on doing what they do best, they don't have to try to grow a business like a Bezos, Musk, or Zuckerberg. Innovative entrepreneurs in this mold, such as Andre, don't even have to attempt this business growth on their own, because the DAO sets things up so they don't have to: the DAO grows the business while supporting these innovators and letting them simply innovate, instead of making nerds do backroom deals to gain market share and access to new customers. It turns out that nerds are much more productive when you just let them be nerds in their labs.
Composability - A measure of the usability of a product and of its ability to be used as a building block (or "money lego") in the construction of other products or domains. A protocol that is simple, powerful, and functions well with other protocols would be considered highly composable.

Comment: The maturity of the cryptocurrency ecosystem and the evolution of composable building tools in the DeFi space have made new products and concepts available. $YFI would not have been possible only 2 or 3 years ago; the tools and the ecosystem simply weren't ready for it yet. This is why you and many others are only now hearing about YFI. In 2018, Andre began providing free code reviews to Crypto Briefing. He had to learn to walk before he could run, and the composable tools needed to work on the embryonic ideas in his head simply weren't ready or available then. By reading and reviewing so many smart contracts, he learned to recognize good code from bad at what was still a very early stage in smart contract development: 2018 was only three years after ETH's launch in July 2015.
Smart Contract Stacking
Smart Contracts - A digital contract programmed in a language considered Turing complete, meaning that with enough processing power and time, a properly programmed smart contract should be able to use its code base and logical algorithms to perform almost any digital task or process. Ethereum's programming languages, such as Solidity and Vyper, are Turing complete.

Comment: Smart contracts have actually gotten smarter since ETH launched in July 2015, because smart contract builders first needed to learn Solidity and how it functions and interoperates before they could spread their wings as designers. With more time and experience under their belts, the early SC builders who stuck with it have gotten much better. In Andre Cronje, we may be witnessing the rise of the next Satoshi or Vitalik of crypto. There is a reason that, a couple of days ago, I counted 6 of 41 YF clones - nearly 15% - among the top gainers of the day. Success breeds copycats, and copying is a ton of flattery. Smart contracts are also smart enough to be stacked upon other smart contracts, such as those at Aave or Maker.

True innovation takes time, sacrifice, blood, sweat, and tears. It does not come without cost to those doing the innovating. There is not a single project in DeFi, CeFi, or even all of cryptocurrency that can claim the breadth and diversity of innovation and product reach found in the $YFI ecosystem. As a tech investor and professional nerd, I've been involved in research labs and around product development and testing since before the year 2000, and before that I read widely and keenly to keep up with technological changes and assess the investment potential of disruptive changes nearly my whole life. The amount of innovation shown in this project is breathtaking if you're a tech or FinTech researcher, and it's being released at a ridiculously rapid pace that is simply unmatched in any private or government research lab anywhere, let alone at any CeFi or traditional financial institution one can name. The only comparable levels of innovation from a project this young are typically seen during epochal periods such as the Renaissance, or times of strife and war such as World War II. Unless you've been in the industry and worked with coders, I don't think you can truly grasp that no one - no group - does this. This isn't normal. This rapid-fire release of truly innovative code and intelligent strategies is on par with the bursts of brilliance seen from thinkers like Newton, Einstein and Tesla, except with software code and concepts in decentralized finance. When the history of FinTech writes this chapter, $YFI may need its own section. And don't forget: all of the financial instruments we take for granted around us - IOU systems of credit, insurance, stocks, bonds, derivatives, futures, options, and so on - started as an idea somewhere that had to get tested sooner or later "in production."

One brilliant aspect of $YFI smart contracts is that they're built as a profitable layer atop existing DeFi protocols, extracting further value from base crypto assets and even primary crypto derivatives.
$YFI is built atop existing smart contracts to create further value where there was none before, and to help maximize gains for long-term investors.
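To make the "profitable layer atop existing protocols" idea concrete, here is a deliberately simplified sketch (my own illustration, not yEarn's actual code; the protocol list and APY figures are invented placeholders) of a vault that routes a deposit to whichever lending protocol currently advertises the best yield:

```rust
/// Anything the vault can lend through: it only needs to report a yield.
trait LendingProtocol {
    fn name(&self) -> &str;
    fn apy(&self) -> f64; // current annualized yield, as a fraction
}

struct Aave;
struct Compound;

impl LendingProtocol for Aave {
    fn name(&self) -> &str { "Aave" }
    fn apy(&self) -> f64 { 0.043 } // hypothetical figure
}

impl LendingProtocol for Compound {
    fn name(&self) -> &str { "Compound" }
    fn apy(&self) -> f64 { 0.051 } // hypothetical figure
}

/// The yield-aggregation step: pick the protocol with the highest yield.
fn best_protocol(protocols: &[Box<dyn LendingProtocol>]) -> &dyn LendingProtocol {
    protocols
        .iter()
        .max_by(|a, b| a.apy().partial_cmp(&b.apy()).unwrap())
        .map(|p| p.as_ref())
        .unwrap()
}

fn main() {
    let protocols: Vec<Box<dyn LendingProtocol>> = vec![Box::new(Aave), Box::new(Compound)];
    let best = best_protocol(&protocols);
    println!("deposit routed to {} at {:.1}% APY", best.name(), best.apy() * 100.0);
}
```

A real vault also has to weigh gas costs, rebalancing frequency, and withdrawal liquidity; that is where the actual strategy engineering lives.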
The Power of a Talented and Diverse DAO
DAO - Decentralized Autonomous Organization. The first DAO was started in 2016. According to Wikipedia's definition, it is an "organization represented by rules encoded as a computer program that is transparent, controlled by the organization members and not influenced by a central government. A DAO's financial transaction record and program rules are maintained on a blockchain." When implemented well, a DAO allows for real-world experiments in decentralized democratic organization and control, with more freedom of action and less regulatory oversight for DAO-controlled projects and products compared to legacy corporate structures and organizations.

Comment: yEarn Finance has shown us what a properly motivated and sufficiently powerful DAO can do in a short amount of time. There are many reasons why this project, with an already profitable business model, is the fastest original project in history to reach a $1B market cap in any market - traditional or crypto - accomplishing this amazing feat in less than two months, and why it is probably the fastest coin in history to get listed on Coinbase. A sufficiently talented and diverse development team and community is stunning in its power, speed, and ability to get things done quickly. There are risks aplenty with parts of this project, but $YFI is now seen as a "safe" place in DeFi, because you know that as far as yield farming goes, you probably couldn't do better yourself unless you took a chance on unaudited code with anonymous devs, or were doing the trading equivalent of throwing darts blindfolded and somehow winning, over and over.

Summary: There are reasons why YFI has been called the Bitcoin of DeFi and the Berkshire Hathaway Series A of crypto; I've listed some of them above. The confluence of these factors has helped lead to explosive growth for this project. This isn't financial advice, as I'm not a financial pro, but make no mistake: as a crypto OG around crypto since early 2013, deeply involved in multiple community projects as an early organizer, and a small investor during the DotCom era in early giants that went on to be gorillas, I don't say it lightly that the $YFI project is lightning in a bottle and a diamond in the rough. What $YFI allows, when all is said and done, is the rapid-fire implementation of great ideas that have gone through a rapid Darwinian evolution, where only the best are implemented. Thoughts and ideas are powerful things. The valuation of this coin and ecosystem has to take into account that this nascent financial innovation hub and ecosystem actually works and allows the best of these ideas to blossom rapidly. You just don't find many gems like this.
How DAO users can truly control their voting rights
https://blockchaintopbuzz.medium.com/how-dao-users-can-truly-control-their-voting-rights-f945c9c6b65e

Aelf has proposed a solution that gives control of voting rights back to users by classifying token permissions.

As of today, there are still few complete businesses in the blockchain industry; beyond mining and building trading platforms, it is difficult to create a complete business model. Meanwhile, the various trading platforms have gradually grown into enterprises with comprehensive product lines, including wallets, nodes, lending, mining pools, and more. At the same time, cloud services can reduce the cost of building small exchanges, but they can also lead to big trading platforms monopolizing data; some Internet companies, for example, provide free cloud services in order to collect more valuable data.

Currently, Ethereum, which has the richest DeFi ecosystem, is gradually upgrading to 2.0, and its consensus protocol is being upgraded to PoS. Governance voting can be regarded as the most important feature of a PoS ecosystem. This year, Yearn.Finance rose to sudden prominence, but due to a governance dispute its community members initiated a hard fork, resulting in YFII. Another DeFi project, YAM, had an unfixable rebase function error; the founding team apologized and announced a "Migration Plan" to turn the project over to the community. For a while, governance voting became all the rage. However, the ever-bigger trading platforms have been criticized by users over governance voting. Is there a proper solution for handling the relationship between trading platforms and governance voting?
What will we lose when trading platforms monopolize the blockchain industry?
In June 2018, during the BP node election before the EOS mainnet launch, node voting triggered a crisis of confidence between token holders and trading platforms. It is widely believed that the top 20 trading-platform wallets held about 40% of all the EOS in circulation. Since then, many trading platforms have enabled a "User Authorization" interface: EOS holders could authorize their token voting rights to the trading platform, which would then vote on behalf of the users. The rule caused a backlash, forcing these trading platforms to change it immediately so that EOS holders could vote for their preferred BP nodes themselves. Since the EOS BP node votes, whether a trading platform should hold token voting rights has been occasionally discussed, but few have paid attention.

Two years later, Justin Sun, founder of TRON, made a commercial acquisition of Steemit, a decentralized social networking platform. After the acquisition was announced, the Steemit community launched a soft fork to resist the project being controlled by TRON. However, Justin Sun, with the support of trading platforms such as Binance, Huobi and Poloniex, voted to prevent the soft fork. After being questioned by users, Binance and Huobi said they would no longer interfere in the voting of the Steemit community. However, hkdev 404 of the Steem community again received votes from Huobi accounts. It is said that nearly 40 million votes were cast during the incident, accounting for about 10% of the total circulation of STEEM tokens.

There is no doubt that when trading platforms monopolize the industry, we lose our voting rights.

How do we defend our voting rights? The fact that ownership of tokens belongs to their holders is indisputable, but what about the voting rights of tokens deposited on a trading platform? How can we defend our voting rights after trading platforms have monopolized the industry?
Trading Platform Model
Traditional centralized trading platforms assign each user a separate deposit address. After a deposit, the deposited amount is added to the platform's cold and hot wallets. When users want to withdraw their tokens, the trading platform transfers them out of the hot wallet; if the hot wallet balance is insufficient, tokens are first moved from the cold wallet to the hot wallet and then withdrawn. Under this traditional model, once users transfer their tokens to a trading platform, token ownership, including voting rights, is effectively transferred to that platform as well.

The aelf solution: classify token permissions and claim back voting rights. For the issue of "voting rights" between token holders and centralized trading platforms, aelf, a decentralized cloud computing blockchain network, has proposed a solution: establish an aelf Centre Asset Management Contract on the chain. The contract can limit the funds entering an exchange and define different permissions for controlling the assets.

The main feature of the aelf Centre Asset Management Contract is the "Main Virtual Address of the Trading Platform". Each exchange has a main virtual address which can only be used for transfer operations, not for voting, trading or other operations. As a result, the exchange cannot misappropriate users' assets for voting. At the same time, the assets at the main virtual address are publicly visible on the chain, which makes misappropriation harder still.

The contract also supports "address definition": the exchange can grant different permissions to different addresses, for example by amount, so that transactions exceeding a certain threshold can only be approved with multiple signatures, and assets can be frozen through the contract if the trading platform's assets are stolen.

For users of the trading platform, adopting the aelf Centre Asset Management Contract does not undermine the user experience. The contract's virtual address system assigns a virtual address to each user, offering the same experience as the traditional model. For the trading platform, each deposit address constructed by the virtual address system is generated by an algorithm and does not need to be created on the blockchain. This means the platform does not need to manage a large number of private keys, and there is no risk of those keys being lost.

On the all-important "voting rights" issue, the aelf Centre Asset Management Contract assigns each user a separate virtual address for voting:

Voting address = Hash(Exchange Main Address + Token + "VOTE")

Voting process: tokens are transferred from the exchange's main virtual address to the dedicated "voting address", and the vote is cast from there. After voting, the tokens are withdrawn from the voting address back to the main virtual address.

We can see that the aelf Centre Asset Management Contract can improve the efficiency of the trading platform without affecting user experience, while solving the problem of users losing their voting rights. According to data from Crypto Mode, the market value of PoS tokens has exceeded $33 billion, not counting Ethereum.
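For illustration, the derivation above can be sketched in a few lines of Rust. Note the assumptions: the article does not specify the hash function or the address encoding, so SHA-256 and hex output here are stand-ins (the sketch depends on the `sha2` crate):

```rust
use sha2::{Digest, Sha256};

/// Voting address = Hash(Exchange Main Address + Token + "VOTE"),
/// per the formula above. Hash choice and encoding are assumptions.
fn voting_address(main_address: &str, token: &str) -> String {
    let mut hasher = Sha256::new();
    hasher.update(main_address.as_bytes());
    hasher.update(token.as_bytes());
    hasher.update(b"VOTE");
    hasher
        .finalize()
        .iter()
        .map(|b| format!("{b:02x}"))
        .collect()
}

fn main() {
    // Deterministic: the exchange can always re-derive the same voting
    // address, yet it is distinct from the main address used for transfers.
    println!("{}", voting_address("EXCHANGE_MAIN_VIRTUAL_ADDRESS", "ELF"));
}
```

The useful property is the separation of powers: funds at the main virtual address can only be transferred, and only what has been explicitly moved to the derived voting address can vote.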
In the field of crypto, PoS is the biggest ecosystem next to Bitcoin, and its most important function is vote staking. Faced with big trading platforms, if the status quo continues, retail investors will gradually lose the "voting rights" that belong to them.

[Chart: Comparison of market value of PoS tokens (Source: Crypto Mode)]

The emergence of DAOs offers an alternative to trading platforms that misappropriate users' tokens, but it alone cannot change the situation. Of course, DAOs will not die out; small communities will still use them for community governance. The idea behind aelf's design is to start from the underlying trading platform and solve the issue at its source. Whether the solution works remains to be seen. However, as members of the crypto industry, we should understand the importance of "voting rights" and cannot allow exchanges to seize our rights at will.

Recently, aelf also announced its DeFi plan, a new blockchain 3.0 project with a large number of new technical features, such as cross-chain functionality, virtual addresses and cloud services. Aelf has also proposed a set of interoperability solutions for ERC-20 tokens: it can directly access the ETH ecosystem, allow ETH-based applications and wallets to connect directly, and maintain interoperability with ETH. In addition, aelf will provide a high-performance smart contract platform and cloud services that support cross-chain interaction. Users on major cloud servers can easily run aelf's services and adjust their cloud scale according to their own business needs. The slew of tools, cloud services and interoperability solutions developed by aelf means that centralized exchanges can connect directly to the aelf network, achieving one-click adaptation to the DeFi ecosystem. With aelf, CeFi and DeFi are able to learn from and complement each other.
The network effects of the web have allowed a handful of companies to capture huge numbers of users, hoarding their data to keep them from seeking alternatives. Likewise, these giant platforms attracted applications to build on their ecosystems, then either severed access or actively opposed the applications' interests once they became successful. As a result, these walled gardens have effectively hindered innovation and monopolized large sections of the web. With the emergence of blockchain technology and decentralized cryptocurrencies, the need for applications to support decentralization has grown, and several blockchain-based companies, applications and platforms have appeared. In this research report, we will explain the approach adopted by the NEAR decentralization platform in designing and implementing the core technology for its system. NEAR is a community-managed platform for cloud computing and decentralized storage, designed to enable the open web of the future: a web on which everything from new currencies to new applications to new industries can be created, opening the door to an entirely new future.
The richness of the web grows daily through the combined efforts of millions of people who have benefited from "innovation without permission": content and applications can be created without asking anyone. The loss of that freedom over data, however, has led to an environment hostile to the interests of its participants; as explained in the summary above, web hosting companies have hindered innovation and greatly monopolized the web. In the future, we can fix this by using new technologies to re-enable the permissionless innovation of the past, in a way that creates a more open web where users are free and applications support rather than oppose their interests.

Decentralization emerged after the global financial crisis of 2008, which created fundamental problems of confidence in a heavily indebted banking system; the decentralized financial sector based on blockchain technology has grown since 2009. Decentralized blockchain technology has made it easy for decentralized digital currencies like Bitcoin to exchange billions of dollars in peer-to-peer transfers for a fraction of the price of the traditional banking system. The same technology allows participants in the over $50 billion virtual goods economy to track, own and trade these goods without permission, and it allows real-world goods to cross into the digital domain, with verified ownership and tracking just like digital goods.

By default, an Internet where freedom over data enables innovation will lead to a new form of software development. On this web, developers can quickly create applications from open-state components and boost their efforts by using new business models enabled from within the program itself, rather than relying on parasitic relationships with their users. This not only accelerates the creation of applications that have a more honest and cooperative relationship with their users, but also allows entirely new businesses to be built on top of them.

To enable these new applications and the open web, the appropriate infrastructure is needed. The new web platform cannot be controlled by a single entity, and its use must not be limited by insufficient scalability. It should be decentralized in design, like the web itself, and supported by a broad community of operators, so that the value it stores cannot be monitored, modified or removed without permission from the users who store that value.

Note, though, that the cost of storing data or performing a computation on the Ethereum blockchain is thousands to millions of times higher than the cost of the same operation on Amazon Web Services. A developer can always create a "centralized" app or even a centralized currency for a fraction of the cost of doing the same on a decentralized platform, because a decentralized platform, by definition, replicates its operations and storage across many machines.
Bitcoin can be thought of as the first, very basic version of this global community-run cloud, though it is primarily used only to store and move the Bitcoin digital currency. Ethereum is the second and slightly more sophisticated version, which expanded the basic principles of Bitcoin into a more general computing and storage platform, though it remains a raw technology that hasn't achieved meaningful mainstream adoption.
1.1 WHY IS IT IMPORTANT TO PAY THE EXTRA COST TO SUPPORT DECENTRALIZATION?
Because some elements of value, for example bits representing digital currency ownership, personal identity, or asset notes, are very sensitive. In a centralized system, the following players can alter the balances of any value they come into direct contact with:
The developer who controls the release or update of the application’s code
The platform where the data is stored
The servers which run the application’s code
Even if none of these players intends to act in bad faith, the actions of governments, police forces and hackers can easily force their hands, turning them against their users to censor, modify or steal the balances they are supposed to protect. A typical user will trust a typical centralized application with everyday data and computation despite these potential vulnerabilities. Typically, only banks and governments are trusted enough to maintain custody of the most sensitive information: balances of wealth and identity. But these entities are also subject to the very human forces of hubris, corruption and theft, as the 2008 global financial crisis demonstrated for a highly indebted banking system, and governments around the world apply significant capital controls to citizens during times of crisis. On top of these examples, it has become a truism that hackers now hold most or all of your sensitive data.

Decentralized applications operate on a more complex infrastructure than today's web, but they have access to an instantaneous and global pool of currency, value and information that today's web, where data is siloed within individual corporations, cannot provide.
1.2 THE CHALLENGES OF CREATING A DECENTRALIZED CLOUD
A community-run system like this has very different challenges from centralized "cloud" infrastructure, which is run by a single entity or a group of known entities. For example:
It must be both inclusive to anyone and secure from manipulation or capture.
Participants must be fairly compensated for their work while avoiding creating incentives for negligent or malicious behavior.
It must be both game theoretically secure so good actors find the right equilibrium and resistant to manipulation so bad actors are actively prevented from negatively affecting the system.
NEAR is a global, community-run computing and storage cloud, organized to be permissionless and economically incentivized to create a strong and decentralized data layer for the new web. Essentially, it is a platform for running applications which have access to a shared (and secure) pool of money, identity and data owned by their users. More technically, it combines the features of partition-resistant networking, serverless compute and distributed storage into a new kind of platform, built on the same core blockchain ideas as Bitcoin. On this web, everything from new currencies to new applications to new industries can be created, opening the door to a brand new future.

NEAR is a scalable computing and storage platform with the potential to change how systems are designed, how applications are built and how the web itself works. Its technology allows developers and entrepreneurs to easily and sustainably build applications which reap the benefits of decentralization and participate in the Open Web while minimizing the associated costs for end users. NEAR aims to be the community-managed cloud powering the future of the open web: it is designed from the ground up to deliver intuitive experiences to end users, scale capacity across millions of devices, and provide developers with new and sustainable business models for their applications.

The NEAR platform uses a token, also called "NEAR". This token allows the users of these cloud resources, wherever they are in the world, to fairly compensate the providers of the services, and it helps ensure that participants operate in good faith.
2.1 WHY NEAR?
Platforms based on blockchain technologies like Bitcoin and Ethereum have made great progress and enriched the world with thousands of innovative applications, spanning games to decentralized finance. However, neither these original networks nor those that followed have been able to bridge the gap toward mainstream adoption of the applications built on top of them, nor do they provide the kind of standard that fully supports the web. This is a result of two key factors:
System design matters because the technical architecture of other platforms creates substantial problems with both usability and scalability, which have made adoption nearly impossible for any but the most technical innovators. End users experience 97-99% drop-off rates when using applications, and developers find the process of creating and maintaining their applications endlessly frustrating. Fixing these problems requires substantial and complex changes to current protocol architectures, something existing organizations haven't proven capable of implementing; instead, they create multi-year backlogs of specification design and implementation, which results in their technology falling further and further behind. NEAR's platform and organization are architected specifically to solve these problems. The technical design is fanatically focused on creating the world's most usable and scalable decentralized platform, so that global-scale applications can achieve real adoption, and the organization and governance structure are designed to ship rapidly and continuously evolve the protocol so that it never becomes obsolete.
2.1.1 Features which address these problems:
1. USABILITY FIRST

The most important problem to address is how to let developers create useful applications that users can use easily and that capture sustainable value for those developers.

2. END-USER USABILITY

Developers will only build applications which their end users can actually use. NEAR's "progressive security" model allows developers to create experiences for their users which more closely resemble familiar web experiences, by delaying onboarding, removing the need for users to learn "blockchain" concepts, and limiting the number of permission-asking interactions the user must have in order to use the application.

1. Simple Onboarding: NEAR allows developers to take actions on behalf of their users, which lets them onboard users without requiring those users to provide a wallet or interact with tokens immediately upon reaching an application. Because accounts keep track of application-specific keys, user accounts can also be used for the kind of "Single Sign On" (SSO) functionality that users are familiar with from the traditional web (e.g. "Login with Facebook/Google/Github/etc"). A sketch of the application-specific key idea follows this list.
2. Easy Subscriptions: Contract-based accounts allow for the easy creation of subscriptions and custom permissioning for particular applications.
3. Familiar Usage Styles: The NEAR economic model allows developers to pay for usage on behalf of their users in order to hide the costs of infrastructure, in a way that is in line with familiar web usage paradigms.
4. Predictable Pricing: NEAR prices transactions on the platform in simple terms, which allows end users to experience predictable pricing and less cognitive load when using the platform.
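As referenced in the list above, here is a rough sketch of the application-specific key idea behind "progressive security". The types and names are illustrative stand-ins of mine, not NEAR's actual account model, though NEAR does distinguish full-access keys from limited function-call keys:

```rust
use std::collections::HashMap;

enum Permission {
    /// The owner's key: can do anything with the account.
    FullAccess,
    /// An app-scoped key: may only call one contract, with a gas allowance.
    FunctionCall { contract: String, allowance: u128 },
}

struct Account {
    keys: HashMap<String, Permission>, // public key -> permission
}

impl Account {
    fn may_call(&self, key: &str, contract: &str) -> bool {
        match self.keys.get(key) {
            Some(Permission::FullAccess) => true,
            Some(Permission::FunctionCall { contract: c, .. }) => c == contract,
            None => false,
        }
    }
}

fn main() {
    let mut keys = HashMap::new();
    keys.insert("ed25519:owner-key".to_string(), Permission::FullAccess);
    keys.insert(
        "ed25519:game-key".to_string(),
        Permission::FunctionCall { contract: "game.near".to_string(), allowance: 250 },
    );
    let account = Account { keys };
    // The app's key works only against its own contract, which is what makes
    // wallet-free onboarding and SSO-style logins safe to hand out:
    assert!(account.may_call("ed25519:game-key", "game.near"));
    assert!(!account.may_call("ed25519:game-key", "bank.near"));
}
```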
2.1.2 Design principles guiding the development of NEAR's platform
1. Usability: Applications deployed to the platform should be seamless to use for end users and seamless to create for developers. Wherever possible, the underlying technology itself should fade into the background or be hidden completely from end users. Wherever possible, developers should use familiar languages and patterns during the development process. Basic applications should be intuitive and simple to create, while more robust applications should still be secure.
2. Scalability: The platform should scale with no upper limit as long as there is economic justification for doing so, in order to support enterprise-grade, globally used applications.
3. Sustainable Decentralization: The platform should encourage significant decentralization in both the short term and the long term in order to properly secure the value it hosts. The platform, and its community, should be widely and permissionlessly inclusive and actively encourage decentralization and participation. To maintain sustainability, both technological and community governance mechanisms should allow for practical iteration while avoiding capture by any single party.
4. Simplicity: The design of each of the system's components should be as simple as possible in order to achieve its primary purpose. Optimize for simplicity, pragmatism and ease of understanding above theoretical perfection.
2.2 HOW DOES NEAR WORK?
NEAR’s platform provides a community-operated cloud infrastructure for deploying and running decentralized applications. It combines the features of a decentralized database with others of a serverless compute platform. The token, which allows this platform to run also, enables applications built on top of it to interact with each other in new ways. Together, these features allow developers to create censorship resistant back-ends for applications that deal with high stakes data like money, identity, assets, and open-state components, which interact seamlessly with each other. These application back-ends and components are called “smart contracts,” though we will often refer to these all as simply “applications” here. The infrastructure, which makes up this cloud, is created from a potentially infinite number of “nodes” run by individuals around the world who offer portions of their CPU and hard drive space — whether on their laptops or more professionally deployed servers. Developers write smart contracts and deploy them to this cloud as if they were deploying to a single server, which is a process that feels very similar to how applications are deployed to existing centralized clouds. Once the developer has deployed an application, called a “smart contract”, and marked it unchangeable (“immutable”), the application will now run for as long as at least a handful of members of the NEAR community continue to exist. When end users interact with that deployed application, they will generally do so through a familiar web or mobile interface just like any one of a million apps today. In the central cloud hosted by some companies today like: Amazon or Google, developers pay for their apps every month based on the amount of usage needed, for example based on the number of requests created by users visiting their webpages. The NEAR platform similarly requires that either users or developers provide compensation for their usage to the community operators of this infrastructure. Like today’s cloud infrastructure, NEAR prices usage based on easy to understand metrics that aren’t heavily influenced by factors like system congestion. Such factors make it very complicated for developers on alternative blockchain-based systems today. In the centralized cloud, the controlling corporation makes decisions unilaterally. NEAR community-run cloud is decentralized so updates must ultimately be accepted by a sufficient quorum of the network participants. Updates about its future are generated from the community and subject to an inclusive governance process, which balances efficiency and security. In order to ensure that the operators of nodes — who are anonymous and potentially even malicious — run the code with good behavior, they participate in a staking process called “Proof of Stake”. In this process, they willingly put a portion of value at risk as a sort of deposit, which they will forfeit if it is proven that they have operated improperly.
2.2.1 Elements of the NEAR’s Platform
The NEAR platform is made up of many separate elements. Some of these are native to the platform itself while others are used in conjunction with or on top of it.

1. THE NEAR TOKEN

The NEAR token is the fundamental native asset of the NEAR ecosystem and its functionality is enabled for all accounts. Each token is a unique digital asset similar to Ether, which can be used to:

a) Pay the system for processing transactions and storing data.
b) Run a validating node as part of the network by participating in the staking process.
c) Help determine how network resources are allocated and where its future technical direction will go by participating in governance processes.

The NEAR token enables the economic coordination of all participants who operate the network, plus it enables new behaviors among the applications which are built on top of that network.

2. OTHER DIGITAL ASSETS

The platform is designed to easily store unique digital assets, which may include, but aren't limited to:
Other Tokens: Tokens bridged from other chains (“wrapped”) or created atop the NEAR Platform can be easily stored and moved using the underlying platform. This allows many kinds of tokens to be used atop the platform to pay for goods and services. “Stablecoins,” specific kinds of token which are designed to match the price of another asset (like the US Dollar), are particularly useful for transacting on the network in this way.
Unique Digital Assets: Similar to tokens, digital assets (sometimes called "Non-Fungible Tokens", or NFTs), ranging from in-game collectibles to representations of real-world asset ownership, can be stored and moved using the platform.
3. THE NEAR PLATFORM

The core platform, made up of the cloud of community-operated nodes, is the most basic piece of infrastructure provided. Developers can permissionlessly deploy smart contracts to this cloud and users can permissionlessly use the applications they power. Applications, which could range from consumer-facing games to digital currencies, can store their state (data) securely on the platform. This is conceptually similar to the Ethereum platform.

Operations which require an account, network usage, or storage on the platform require payment in the form of transaction fees, which the platform then distributes to its community of operators. These operations could include creating new accounts, deploying new contracts, executing contract code, and storing or modifying data within a contract. As long as the rules of the protocol are followed, any independent developer can write software which interfaces with it (for example, by submitting transactions, creating accounts or even running a new node client) without asking anyone's permission first.

4. THE NEAR DEVELOPMENT SUITE

A set of tools and reference implementations created to facilitate use of the platform by those developers and end users who prefer them. These tools include:
NEAR SDKs: The NEAR platform supports the Rust and AssemblyScript languages for writing smart contracts. To provide a great experience for developers, NEAR has a full SDK which includes standard data structures, examples and testing tools for these two languages.
Gitpod for NEAR: NEAR uses the existing Gitpod technology to create a zero-setup onboarding experience for developers. Gitpod provides an online "Integrated Development Environment" (IDE), which NEAR has customized to allow developers to easily write, test and deploy smart contracts from a web browser.
NEAR Wallet: A wallet is a basic place for developers and end users to store the assets they need to use the network. NEAR Wallet is a reference implementation that is intended to work seamlessly with the progressive security model that lets application developers design more effective user experiences. It will eventually include built-in functionality to easily enable participation by holders in staking and governance processes on the network.
NEAR Explorer: To aid with both debugging of contracts and the understanding of network performance, Explorer presents information from the blockchain in an easily digestible web-based format.
NEAR Command Line Tools: The NEAR team provides a set of straightforward command line tools to allow developers to easily create, test and deploy applications from their local environments.
All of these tools are being created in an open-source manner so they can be modified or deployed by anyone.
Economic forces primarily drive the ecosystem that makes up the NEAR platform. This economy creates the incentives which allow participants to permissionlessly organize to drive the platform's key functions, while creating strong disincentives for undesirable, irresponsible or malicious behavior. In order for the platform to be effective, these incentives need to exist both in the short term and in the long term. The NEAR platform is a market between participants on two sides:
On the supply side, validating node operators and other core infrastructure providers must be motivated to provide the services that make up the community cloud.
On the demand side, the developers and end users of the platform who pay for their usage need to be able to do so in a simple, clear and consistent way.
Further, economic forces can also be applied to support the ecosystem as a whole. They can be used at a micro level to create new business models by directly compensating the developers who create its most useful applications. They can also be used at a macro level by coordinating the efforts of a broader set of ecosystem participants who participate in everything from education to governance.
3.1 NEAR ECONOMY DESIGN PRINCIPLES
NEAR’s overall system design principles are used to inform its economic design according to the following interpretations: 1. Usability: End users and developers should have predictable and consistent pricing for their usage of the network. Users should never lose data forever. 2. Scalability: The platform should scale at economically justified thresholds. 3. Simplicity: The design of each of the system’s components should be as simple as possible in order to achieve their primary purpose. 4. Sustainable Decentralization: The barrier for participation in the platform as a validating node should be set as low as possible in order to bring a wide range of participants. Over time, their participation should not drive wealth and control into the hands of a small number. Individual transactions made far in the future must be at least as secure as those made today in order to safeguard the value they modify.
3.2 ECONOMIC OVERVIEW
The NEAR economy is optimized to provide developers and end users with the easiest possible experience while still providing proper incentives for network security and ecosystem development. Summary of the key ideas that drive the system:
Thresholded Proof of Stake: Validating node operators provide scarce and valuable compute resources to the network. To ensure that the computations they run are correct, they are required to "stake" NEAR tokens as a guarantee of their results; if those results are found to be inaccurate, the staker loses their tokens. This is a fundamental mechanism for securing the network. The threshold for participating in the system is set algorithmically at the lowest level possible, to allow the broadest possible participation of validating nodes in a given "epoch" period (half a day); a sketch of this seat-price computation follows this list.
Epoch Rewards: Node operators are paid for their service at a fixed percentage of total supply, a "security" fee of roughly 4.5% annualized. This rate targets sufficient participation levels among stakers to secure the network while balancing against other uses of the NEAR token in the ecosystem.
Protocol treasury: In addition to validators, the protocol treasury receives 0.5% of total supply annually to continuously reinvest in ecosystem development.
Transaction Costs: Usage of the network consumes two separate kinds of resources — instantaneous and long term. Instantaneous costs are generated by every transaction because each transaction requires the usage of both the network itself and some of its computation resources. These are priced together as a mostly-predictable cost per transaction, which is paid in NEAR tokens.
Storage Costs: Storage is a long-term cost because storing data represents an ongoing burden to the nodes of the network. Storage costs are covered by maintaining a minimum balance of NEAR tokens on the account or contract. This provides an indirect mechanism of payment, via inflation, to validators for maintaining contract and account state on their nodes.
Inflation: Inflation is determined as the combination of payouts to validators and the protocol treasury, minus collected transaction fees and a few other NEAR-burning mechanisms (like the name auction). Overall, the maximum inflation is 5%, which can decline over time as the network gets more usage and more transaction fees are burned. It is even possible for inflation to become negative (total supply decreases) if enough fees are burned.
Scaling Thresholds: In a network that scales its capacity relative to the amount of usage it receives, the thresholds that trigger the network to bring on additional capacity are economic in nature.
Security Thresholds: Some thresholds that provide for good behavior among participants are set using economic incentives; for example, "Fishermen" (described separately).
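To follow up the forward reference under "Thresholded Proof of Stake": the sketch below picks the largest seat price at which all validator seats are still filled, which is the flavor of algorithmic threshold described above. It is my own simplification, assuming total stake is at least enough to fill the seats; NEAR's real election rules are more involved:

```rust
/// Largest price p such that sum(stake_i / p) still fills `num_seats` seats.
/// Assumes the combined stakes can fill the seats at a price of at least 1.
fn seat_price(stakes: &[u128], num_seats: u128) -> u128 {
    let (mut lo, mut hi) = (1u128, *stakes.iter().max().unwrap());
    while lo < hi {
        let mid = (lo + hi + 1) / 2; // bias up: we search for the last valid price
        let seats: u128 = stakes.iter().map(|s| s / mid).sum();
        if seats >= num_seats {
            lo = mid;
        } else {
            hi = mid - 1;
        }
    }
    lo
}

fn main() {
    let stakes = [1_000_000u128, 600_000, 350_000, 50_000];
    // Everyone who can afford at least one seat at this price validates
    // during the epoch; the threshold re-adjusts every epoch (half a day).
    println!("seat price: {}", seat_price(&stakes, 10));
}
```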
A block chain is a transaction database shared by all nodes participating in a system based on the Bitcoin protocol. A full copy of a currency's block chain contains every transaction ever executed in that currency; with this information, one can find out how much value belonged to each address at any point in history. Every block contains a hash of the previous block. Blockchain technology is often wrongly associated with just Bitcoin, forgetting that a blockchain can underpin any of the 700+ cryptocurrencies and, as shown above, much more than just money.

Distributed Ledger: The Bitcoin blockchain protocol introduced a mechanism for making it expensive to copy digital values. A copy of the ledger is stored on multiple devices of a cryptographically secured P2P network. The ledger is a file, also called the blockchain, which maintains a continuously growing list of transaction data records, chained in blocks that are cryptographically secured from ...

This page describes the behavior of the reference client. The Bitcoin protocol is specified by the behavior of the reference client, not by this page. In particular, while this page is quite complete in describing the network protocol, it does not attempt to list all of the rules for block or transaction validity. Type names used in this documentation are from the C99 standard.

In short, Bitcoin is a digital form of money that runs on a distributed network of computers. In a broader sense, though, many people often use the word Bitcoin to refer to a few different things: a digital currency, a decentralized public ledger, a protocol, or simply the big ecosystem that encompasses all of these.
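The "every block contains a hash of the previous block" property is easy to show in code. A minimal sketch of mine (using the `sha2` crate) of how the link makes tampering evident:

```rust
use sha2::{Digest, Sha256};

struct Block {
    prev_hash: [u8; 32],
    data: String,
}

impl Block {
    // A block's identity commits to both its own data and its predecessor.
    fn hash(&self) -> [u8; 32] {
        let mut hasher = Sha256::new();
        hasher.update(self.prev_hash);
        hasher.update(self.data.as_bytes());
        hasher.finalize().into()
    }
}

fn main() {
    let genesis = Block { prev_hash: [0u8; 32], data: "genesis".into() };
    let next = Block { prev_hash: genesis.hash(), data: "tx: alice -> bob".into() };
    // Changing the genesis data would change its hash and break the link
    // stored in `next`, which is what makes the chain tamper-evident.
    assert_eq!(next.prev_hash, genesis.hash());
}
```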
Understand blockchain programming languages in 7 minutes!
Bitcoin Protocol Paper Playlist: http://www.youtube.com/watch?v=UieiMU-ImvI&list=PLQVvvaa0QuDcq2QME4pfeh0cE71mkb_qz&feature=share All Bitcoin Videos Playlist...

In this short video, the Bitcoin timestamp server / global ledger is discussed. As usual, we end with a challenge, to be covered in (you guessed it) the next video.

Interview about the documentary film on Bitcoin and cryptocurrency with the director of Protocole, Rémi Crussière.