Protocol congestion is a perennial problem in the blockchain ecosystem. Various measures have been implemented to avert congestion, but most struggle to offer a long-term solution. Protocols have tried increasing their block size to hold more transactions per block, and shortening block production time to generate blocks faster. Though these measures worked in the short term, they soon reached their limits. As a result, nearly all existing protocols still cannot match the transaction rates of centralized platforms.
The blockchain ecosystem has experienced transaction delays, massive transaction fees, and other inconveniences as a result of congestion within blockchain protocols.
Now, protocols like Aelf are out to change this narrative.
This article explores the congestion issue in blockchain systems, focusing on the Ethereum and EOS protocols. It also explains why Aelf will not be affected by the same problem.
More Users = More Transactions
According to a report by Deloitte, blockchain is changing the business landscape, causing industries to adjust their operations around the solutions it offers. Governments are following the same trend. The report also highlights that blockchain has yet to reach its full potential.
Blockchain is growing significantly, and one of the clearest examples of this is congestion on Ethereum. Back in 2017, one of the first signs of future congestion was the dApp CryptoKitties, which caused massive congestion on the Ethereum network, at one point resulting in a six-fold increase in total network requests.
These furry kittens were the source of great delays on the Ethereum network upon release | Source
It is also worth noting that during the peak bull run in 2017, Bitcoin suffered from a massively congested network and transaction delays as well. The situation got so bad that some transactions took over two weeks to complete!
The delays occurred because Ethereum could only process about 15 transactions per second (tps) at the time. Even without CryptoKitties, the platform was eventually going to suffer massive delays as more people used the protocol.
Ethereum is now living that congested future, as Tether transactions load its network with numerous requests that often lead to delays. Even after increasing block capacity by about 25%, the network cannot keep up with the growing number of transactions on the platform.
Attempts Towards Greater Scalability
Over at EOS, things are not going as planned.
The protocol is among the networks that ushered in blockchain 3.0 with the promise of faster transaction rates, and EOS did indeed outperform Ethereum and Bitcoin on that front.
However, its network design exposed a weakness in 2019, when EOS experienced massive delays caused by a specialized denial-of-service (DoS) attack.
A DoS attack succeeds when the targeted platform is flooded with so many transaction requests that legitimate requests cannot be processed in good time. A distributed denial-of-service (DDoS) attack goes further, flooding a single network or server from many sources at once, rendering the platform ineffective even faster.
For EOS, the weakness was in the blockchain layer itself. The attacker, operating through a dApp hosted on EOS, posted so many deferred transactions that when the time came to process them (deferred transactions are given priority over new ones), no new transactions could be processed. The flood of trash transactions effectively crowded out valid ones.
Because the issue had gone unaddressed since January, another attempt to slow down the network succeeded, likely carried out to probe the limits of the EOS network.
An airdrop was organized on the EOS network that rewarded users with tokens for frequently transferring EOS tokens in and out of the network. The sheer number of transactions the airdrop generated created congestion.
In both instances, the congestion can be attributed to EOS's ability to defer transactions to a later time: an attacker can effectively block other transactions for however long it takes to process the backlog of 'deferred' transactions.
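To make the mechanism concrete, here is a minimal TypeScript sketch (not EOS code; all names are illustrative) of a block producer that drains deferred transactions before new ones. If an attacker keeps the deferred queue full, fresh transactions are starved indefinitely:

```typescript
// Illustrative simulation: a block producer that processes deferred
// transactions before new ones, as described above.
interface Tx { id: string; deferred: boolean; }

class NaiveProducer {
  private deferredQueue: Tx[] = [];
  private pendingQueue: Tx[] = [];

  submit(tx: Tx): void {
    (tx.deferred ? this.deferredQueue : this.pendingQueue).push(tx);
  }

  // Each "block" can hold a fixed number of transactions.
  produceBlock(capacity: number): Tx[] {
    const block: Tx[] = [];
    // Deferred transactions are given priority over new ones.
    while (block.length < capacity && this.deferredQueue.length > 0) {
      block.push(this.deferredQueue.shift()!);
    }
    while (block.length < capacity && this.pendingQueue.length > 0) {
      block.push(this.pendingQueue.shift()!);
    }
    return block;
  }
}

const producer = new NaiveProducer();
// Attacker floods the deferred queue with trash transactions...
for (let i = 0; i < 1000; i++) {
  producer.submit({ id: `trash-${i}`, deferred: true });
}
// ...so a legitimate transaction is starved for many blocks.
producer.submit({ id: "legit-payment", deferred: false });
console.log(producer.produceBlock(10).map(tx => tx.id)); // trash-0 .. trash-9
```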
Aelf's Simple Brilliance
Ethereum and EOS are both suffering congestion as a result of the growing number of daily transactions, and both remain exposed to congestion from planned attacks on their networks.
Aelf drew lessons from both EOS and Ethereum to develop a platform that solves the issue of scalability.
On transaction rates, Aelf built a platform that achieves high tps, and it does so on-chain, through separation and specialization. The protocol separates transactional data from computational dependencies, which significantly improves throughput.
Building on that separation of transactional data, Aelf processes transactions in parallel, pushing its on-chain tps even higher.
The separation of transactional data is done using side chains. Aelf implements a branched-chain network, as opposed to the single-chain systems used by both EOS and Ethereum, and dedicates each side chain to a particular transaction type.
Aelf achieves this specialization with a "one chain to one type of contract" rule: a given side chain processes requests from only one type of contract. This keeps the system highly specialized while still maintaining a simple structure.
Moreover, within a dedicated side chain, further side chains can be formed depending on the demand and needs of the network. The mechanism resembles partitioning or sharding in database architecture and is known in the Aelf ecosystem as the "Tree Branch side chain extension".
The Tree Branch side chain extension acts as an emergency overflow system: when transaction requests outweigh Aelf's capacity at a given moment, new side chains are created to absorb the excess and protect the network from congestion.
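The following TypeScript sketch is a conceptual model of this routing, not aelf source code; the class names, capacities, and overflow policy are illustrative assumptions. Each contract type owns a dedicated side chain, and a chain at capacity sprouts a child branch:

```typescript
// Conceptual model of "one chain to one type of contract" routing
// with tree-branch overflow (all details illustrative).
class SideChain {
  private children: SideChain[] = [];
  private load = 0;

  constructor(
    readonly contractType: string,
    private capacity: number,
    readonly label = "root",
  ) {}

  process(txId: string): void {
    // Overflow: spawn (or reuse) a child branch when this chain is full.
    if (this.load >= this.capacity) {
      if (this.children.length === 0) {
        this.children.push(new SideChain(this.contractType, this.capacity, "branch"));
      }
      this.children[0].process(txId);
      return;
    }
    this.load++;
    console.log(`[${this.contractType}/${this.label}] processed ${txId}`);
  }
}

class MainChain {
  private chains = new Map<string, SideChain>();

  route(contractType: string, txId: string): void {
    // One dedicated side chain per contract type.
    let chain = this.chains.get(contractType);
    if (!chain) {
      chain = new SideChain(contractType, 2);
      this.chains.set(contractType, chain);
    }
    chain.process(txId);
  }
}

const main = new MainChain();
["tx1", "tx2", "tx3"].forEach(id => main.route("token-transfer", id)); // tx3 overflows to a branch
main.route("exchange", "tx4"); // separate dedicated chain
```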
A visual example of Aelf’s ‘Tree Branch’ | Source
Aelf's side chains communicate through the main chain in the form of a Merkle tree root; side chains never talk to each other directly. Information must pass through the main chain's filtering system, which determines, based on the protocol's guidelines, whether data can be passed from one side chain to another.
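The core idea is that a side chain can summarize an arbitrary number of transactions as a single Merkle root for the main chain. Here is a minimal sketch of that summarization; the hash function and leaf encoding are illustrative, not aelf's exact specification:

```typescript
// A side chain condenses its transactions into one Merkle root, which
// is all the main chain needs to later verify proofs about them.
import { createHash } from "crypto";

const sha256 = (data: string): string =>
  createHash("sha256").update(data).digest("hex");

function merkleRoot(leaves: string[]): string {
  if (leaves.length === 0) throw new Error("no leaves");
  let level = leaves.map(sha256);
  while (level.length > 1) {
    const next: string[] = [];
    for (let i = 0; i < level.length; i += 2) {
      // Duplicate the last node when the level has an odd count.
      const right = level[i + 1] ?? level[i];
      next.push(sha256(level[i] + right));
    }
    level = next;
  }
  return level[0];
}

console.log(merkleRoot(["txA", "txB", "txC"]));
```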
These design choices leave no room for deferred transactions, which makes it impossible for planned attacks to slow the network down with numerous "fake" transactions.
With this setup, Aelf is ahead in terms of scalability and security, and thus a worthy choice for deploying a dApp.
Having seen the limitations of EOS and Ethereum, it is clear that their congestion problems are inevitable. Aelf stands out as a platform built to be immune to network congestion. Using side chains to isolate and categorize transactions is a simple yet brilliant idea from the Aelf team, one that secures Aelf's scalability throughout its lifetime. Aelf may well have cemented its place in blockchain history with this platform.
For more information on Aelf's platform, please follow this link.
#Aelf #DPoS #Blockchain #ParallelProcessing $ELF
Disclaimer: Please take this information only as my OWN opinion; it should not be regarded as financial advice in any situation. Please remember to DYOR before making any decisions.
Hi, my name's Sal. If you found this article useful and would like to view my other work, please be sure to clap and follow me on Medium and LinkedIn! 😎
At 10:30 on September 12, Unitimes held its 40th online AMA about blockchain technologies and applications. We were glad to have Joanes Espanol, CEO and CTO of Amberdata, share with us on "Danger in Blockchain, Data Protection is Necessary". The AMA was composed of two parts: a fixed Q&A and a free Q&A. Check out the details below!
Amberdata is a blockchain and digital asset company which combines validated blockchain and market data from the top crypto exchanges into a unified platform and API, enabling customers to operate with confidence and build real-time data-powered applications.
We provide a standardized way to access blockchain data (blocks, transactions, account information, etc.) across different blockchain models like UTXO (Bitcoin, Litecoin, Dash, Zcash...) and account-based (Ethereum...), with contextualized pricing data from the top crypto exchanges, in one API call. If you wanted to build applications on top of different blockchains yourself, you would have to learn the intricacies of each distributed ledger, run multiple nodes, aggregate the data, etc. Instead of spending all that time and money, you can start immediately by using the APIs that we provide.
What can you get access to? Accounts, account balances, blocks, contracts, internal messages, logs and events, pending transactions, security audits, source code, tokens, token balances, token transfers, token supplies (circulating and total), and transactions, as well as prices, order books, trades, tickers, and best bids and offers for about 2,000 different assets.
One important thing to note is that most of the APIs return validated data that anybody can verify by themselves. Blockchain is all about trust - operating in a hostile and trustless environment, maintaining consensus while continuously under attack, etc. - and we want to make sure that we maintain that level of trust, so the API returns all the information that you would need to recalculate Merkle proofs yourself, hence guaranteeing the data was not tampered with and is authentic.
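The verification step itself is standard Merkle proof checking. The AMA does not show the exact proof format the Amberdata API returns, so the shape below is an assumption; the folding logic is the generic technique:

```typescript
// Generic Merkle proof check: hash the leaf, then fold in each sibling
// hash up to the root. The ProofStep shape is assumed for illustration.
import { createHash } from "crypto";

const sha256 = (data: string): string =>
  createHash("sha256").update(data).digest("hex");

interface ProofStep { sibling: string; siblingOnLeft: boolean; }

function verifyMerkleProof(leaf: string, proof: ProofStep[], expectedRoot: string): boolean {
  let hash = sha256(leaf);
  for (const step of proof) {
    hash = step.siblingOnLeft
      ? sha256(step.sibling + hash)   // sibling is the left child
      : sha256(hash + step.sibling);  // sibling is the right child
  }
  // The data is untampered iff we land on the root from the block header.
  return hash === expectedRoot;
}
```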
The genius of Satoshi Nakamoto was to combine and improve upon existing decentralized protocols with game theory, arriving at a consensus protocol able to circumvent the Byzantine Generals Problem. Participants now have incentives to follow the rules (they are financially rewarded for doing so, by mining for example, and penalized for misbehaving), which in turn results in a stable system. This was the first time crypto-economics was used in a working product, and it became the base and norm for a lot of today's systems.
Pricing data is needed as context to blockchain data: a lot of (ERC-20) tokens are created on Ethereum - it is very easy to clone an existing contract and configure it with a certain amount of initial tokens (most commonly millions or billions in volume). Each token has a value, as determined by the law of supply and demand and as traded on the exchanges. Price fluctuations have an impact on adoption and usage, meaning on the overall transaction volume (and to a certain extent transaction throughput) on the blockchain.
Blockchain data is needed as context to market data: activity on the blockchain can have an impact on market data. For example, one can look at incoming token transfers in the Ethereum transaction pool and see if there are any impending big transfers for a specific token, which could result in a significant price move on the other end. Being able to detect that kind of movement and act on it is exactly the kind of signal traders are looking for. Another example can be found with token supplies: exchanges want to be notified as soon as possible when a token's circulating supply changes, as it affects their trading ability; in the worst-case scenario, they would need to halt trading if a token contract gets compromised.
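As a hypothetical sketch of that "watch the pending pool" pattern in TypeScript: the endpoint URL, subscription message, and payload shape below are assumptions for illustration, not documented Amberdata values; consult the provider's docs for the real channel names.

```typescript
// Hypothetical: flag impending large token transfers from a websocket
// feed of pending transactions. Requires the "ws" npm package.
import WebSocket from "ws";

const API_KEY = process.env.API_KEY ?? "";
const WHALE_THRESHOLD = 1_000_000n; // flag transfers above this amount

const ws = new WebSocket("wss://example-data-provider.io/ws", {
  headers: { "x-api-key": API_KEY }, // assumed auth scheme
});

ws.on("open", () => {
  // Assumed subscription message format.
  ws.send(JSON.stringify({ op: "subscribe", channel: "pending_token_transfers" }));
});

ws.on("message", raw => {
  const transfer = JSON.parse(raw.toString()); // assumed payload shape
  if (BigInt(transfer.amount) >= WHALE_THRESHOLD) {
    console.log(`Impending large transfer of ${transfer.token}: ${transfer.amount}`);
  }
});
```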
In conclusion, events on the blockchain can influence price, and market events also have an impact on blockchain data: the two are intimately intertwined, and putting them both in context leads to better insights and better decision making.
Not quick: blockchain data structures were designed and optimized for achieving consensus in a hostile and trustless environment and for internal state management, not for random access and overall search. Imagine you want to list all the transactions your wallet address has participated in. The only way to do that would be to replay all the transactions from the beginning of time (starting at the genesis block), looking at the to and from addresses and retaining only the ones matching your wallet: at over 500 million transactions as of today, it would take an unacceptably long time to retrieve that list for a customer-facing application.
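To see why this is slow, here is a minimal sketch of the naive scan using only a node's standard Ethereum JSON-RPC interface (eth_getBlockByNumber is a real method; the RPC_URL is a placeholder). One round trip per block is fine for a small range and hopeless for the full chain:

```typescript
// Naive "replay from genesis" scan: fetch every block, check every
// transaction. Runnable on Node 18+ against any Ethereum JSON-RPC node.
const RPC_URL = "https://example-node.invalid"; // placeholder endpoint

async function rpc(method: string, params: unknown[]): Promise<any> {
  const res = await fetch(RPC_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ jsonrpc: "2.0", id: 1, method, params }),
  });
  return (await res.json()).result;
}

async function findTransactions(wallet: string, fromBlock: number, toBlock: number) {
  const matches: string[] = [];
  const target = wallet.toLowerCase();
  for (let n = fromBlock; n <= toBlock; n++) {
    // true = return full transaction objects, not just hashes.
    const block = await rpc("eth_getBlockByNumber", ["0x" + n.toString(16), true]);
    for (const tx of block.transactions) {
      if (tx.from?.toLowerCase() === target || tx.to?.toLowerCase() === target) {
        matches.push(tx.hash);
      }
    }
  }
  return matches;
}
```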
Not easy: Some very basic things that one would expect when dealing with financial assets and instruments are actually very difficult to get at, especially when related to tokens. For example, the current Ether balance of a wallet is easy to retrieve in one call to a Geth or Parity client - however, looking at time series of these balances starts to be a little hairy, as not all historical state is kept by these clients, unless you are running a full archive node. Looking at token holdings and balances gets even more complicated, as most of the token transfers are part of the transient state and not kept on chain. Moreover, token transfers and balance changes over time are triggered by different mechanisms (especially when dealing with contract to contract function calls), and detecting these changes accurately is prone to errors.
Not cheap: As mentioned above, most of the historical data and time series metrics are only available via a full archive node, which at the time of writing requires about 3TB of disk space, just to hold all the blockchain state - and remember, this state is in a compressed and not easily accessible format. To convert it to a more searchable format requires much more space. Also, running your own full archive node requires constant care, maintenance and monitoring, which has become very expensive and prohibitive to run.
· It can be used in the traditional REST way to augment your own processes or enrich your own data with hard to get pieces of information. For example, lots of our users retrieve historical information (blocks and transactions) and relay it in their applications to their own customers, while others are more interested in financial data (account & token balances) and time series for portfolio management.
· Other projects are more in need of real-time up-to-date data, for which we recommend using our websockets, so you can filter out data in real-time and match your exact needs, rather than getting the firehose of information and having to filter out and discard 99% of it.
· We have a few research projects tapping into our API as well. For example, some of our customers want access to historical market data to backtest their trading strategies and fine-tune their own algorithms.
· Our API is also fully JSON-RPC compliant, meaning some people use it as a drop-in replacement for their own node, or as an alternative to Infura for example (see the sketch after this list). We have some customers using both Amberdata and Infura as their web3 providers, with the benefit of getting additional enriched data when connecting to our API.
· And finally, we have also built an SDK on top of the API itself, so it is easier to integrate into your own application (https://www.npmjs.com/package/web3data-js).
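To illustrate the JSON-RPC point above: the same standard eth_* calls you would send to your own node can be pointed at a hosted provider instead. eth_getBalance and its parameters are the standard Ethereum JSON-RPC method; the endpoint URL and auth header below are placeholders, not documented Amberdata values.

```typescript
// Drop-in JSON-RPC usage (Node 18+, global fetch): query an account
// balance through a hosted provider instead of a self-run node.
const ENDPOINT = "https://example-web3-provider.invalid"; // placeholder
const API_KEY = process.env.API_KEY ?? "";

async function ethGetBalance(address: string): Promise<bigint> {
  const res = await fetch(ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json", "x-api-key": API_KEY },
    body: JSON.stringify({
      jsonrpc: "2.0",
      id: 1,
      method: "eth_getBalance",      // standard Ethereum JSON-RPC method
      params: [address, "latest"],
    }),
  });
  const { result } = await res.json();
  return BigInt(result); // balance in wei, hex-encoded by the node
}

ethGetBalance("0x742d35Cc6634C0532925a3b844Bc454e4438f44e")
  .then(wei => console.log(`${wei} wei`));
```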
We also have several subscriptions to match your needs. The developer tier is free and gets you access to 90% of all the data. If you are not sure about your usage patterns yet, we recommend the on-demand plan to get started, while for heavy users the professional and enterprise plans would be more adequate - see https://amberdata.io/pricing for more information.
All in all, we try really hard to make it as easy as possible for you. We do the heavy lifting so you don't have to worry about all the minutiae and can focus on bringing value to your customers. We work very closely with our customers and continuously improve upon and add new features to the API. If something is not supported, or you want something that is not in the API, chances are we already have the data - do not hesitate to ask us ;)
Even though this is a pretty severe bug (any/all Parity nodes can be remotely shut down with just one small call to the API), in practice the number of nodes at risk is probably small, because only operators who have enabled public-facing RPC calls (and possibly those who have enabled tracing as well) are affected - both are disabled by default. Kudos to the Parity team for fixing and releasing a patch less than 24 hours after the bug was reported!
A good starting point is to use our Postman collection, which is pretty complete and can give you a very good overview of all the capabilities: https://amberdata.io/docs/libraries and https://www.getpostman.com/collections/79afa5bafe91f0e676d6.
For more advanced users, the REST API is where you should start, but as I mentioned earlier, how you access the data depends on your use case: REST, websockets, JSON-RPC and the SDK are the most common ways of getting to it. We have a lot of tutorials and code examples available here: https://amberdata.io/docs.
For developers interested in getting access to Amberdata's blockchain and market data from within their own contract, they can use the Chainlink Oracle contract, which integrates directly with the API.
Free Q&A
Who are your competitors? What makes you better?
There are a few data providers out there offering similar information to Amberdata. For example, Etherscan has very complete blockchain data for Ethereum, and CoinMarketCap has asset rankings by market cap and some pricing information. We actually did a pretty thorough analysis of the different data providers and their pros and cons.
What makes Amberdata unique is threefold:
· Combination of blockchain and market data: typically other providers offer one or the other, but not both, and not integrated with each other - with Amberdata, in one API call I can get blockchain and historically accurate pricing data at the same time. We have also standardized access across multiple blockchains, so you get one interface for all and do not have to worry about understanding each and every one of them.
· Validated & verifiable data: we work hard to preserve transparency and trust and are very open about how our metrics are calculated. For example, blockchain data comes with all the pieces needed to recompute the Mekle proofs so the integrity of the data can be verified at any moment. Also, additional metrics like circulating supply are based on tangible and very concrete definitions so anybody can follow and recalculate them by themselves if needed.
· Enriched data: we have spent a lot of time enriching our APIs with (historical) off chain data like token names and symbols, mappings for token addresses and tradable market pairs, etc. At the same time, our APIs are very granular and provide a level of detail that only a few other providers offer, especially with market data (Level 2 with order books across multiple exchanges, Best Bid Offers, etc).
That's all for the 40th AMA. We would like to thank all the community members for their participation and cooperation! Thanks, Joanes!