How does Bitcoin work? - Bitcoin

Bitcoin 2

Bitcoin 2 is a scalable 1:1 fork of Bitcoin with private transactions and instant verified payments. Open source - Anonymous - Fast - Proof of Stake algorithm. All Bitcoin holders as of block 507850 (~February 5, 2018) received Bitcoin 2 on a 1:1 basis.
[link]

Bitcoin reborn as LiteBitcoin

Welcome! This is an official subreddit for Litebitcoin, created and managed by the current moderators (Litebitcoin), (RomanPetrush) & (LitebitcoinMooning), who will be looking for new applicants for additional moderator positions.
[link]

DigiByte: More Secure, Faster & Forward Thinking

DigiByte is more than a faster digital currency. It is an innovative blockchain that can be used for digital assets, smart contracts, decentralized applications and secure authentication.
[link]

@binance: #binance Adds Open-Source Implementation for Edwards-Curve Digital Signature Algorithm (EdDSA) in the TSS Library The library is compatible with ECDSA-based blockchains, including Binance Chain, #Bitcoin, and @ethereum networks. https://t.co/xNILYim9EV submitted by rulesforrebels to BinanceTrading [link] [comments]

StockSharp/StockSharp: Algorithmic trading and quantitative trading open source platform to develop trading robots (stock markets, forex, bitcoins and options).

submitted by TsukiZombina to coolgithubprojects [link] [comments]

Undergrad CS student here, about to start making a *very open source* bitcoin trading bot. It will crush all the other bots because it will use Genetic Evolution as its machine learning algorithm.

This is just an interest-check post to see if this is something that would be interesting to you guys. Most trading bots I see use hard-coded signals/indicators to make the buy/sell decisions. The rare and expensive bots that I do find that use Machine Learning are using outdated linear-regression algorithms that just don't work very well for the volatility of Bitcoin.
Why Genetic Evolution as the algorithm? Its strong ability to learn from mistakes. An algorithm like this is not trained on historical data, but rather teaches itself based on performance. Basically, this algorithm would play "dummy" trades and learn from its mistakes when it misses a target, or a certain gain percentage. (I'm not in finance, so I'm not really sure what is considered a loss or a win in day trading.) Eventually, it would genetically evolve to a point where it gets pretty good at day trading, in theory.
ELI5: The cost function would force the bot to learn like this. Let's say we use this algorithm to keep guessing a random phrase until it gets to the phrase "Hello, world!". We give the attempts a fitness score and a "cost". The first phrase would be gibberish. For example, if we have a capital "A" (ASCII 65) but it's supposed to be a capital "C" (ASCII 67), then our cost for that character is 4 (67 - 65 = 2, and 2² = 4). We square it so that the cost is always positive, and so that it penalizes outliers more.
The genetic algorithm would learn like this:

Gekmo+ xosmd! (7)
Gekln, worle" (5)
Fello, wosld! (5)
Gello, wprld! (2)
Hello, world! (0)
Where the total squared cost is on the right. It basically keeps trying with pseudorandom strategies at the beginning and evolves to be smarter as it starts to cut down on error. As dumb as it sounds, this kind of Machine Learning workflow would work great financially in my opinion, and ESPECIALLY with day trading. Obviously you could never get a 100% profit rate, but I am extremely confident I could build a model that reduces error and picks up on patterns that the human mind is not trained to comprehend.
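For readers who want to see the idea in code, here is a minimal sketch of the phrase-guessing example described above. The squared-ASCII cost function comes from the post; the population size, mutation rate, and keep-the-fittest-half selection are arbitrary illustrative choices, not anything the author specified.

```python
import random
import string

TARGET = "Hello, world!"
CHARSET = string.printable[:95]   # digits, letters, punctuation, and the space character
POP_SIZE = 200
MUTATION_RATE = 0.05

def cost(candidate):
    """Sum of squared ASCII differences against the target, as in the example above."""
    return sum((ord(a) - ord(b)) ** 2 for a, b in zip(candidate, TARGET))

def random_phrase():
    return "".join(random.choice(CHARSET) for _ in range(len(TARGET)))

def mutate(phrase):
    return "".join(random.choice(CHARSET) if random.random() < MUTATION_RATE else ch
                   for ch in phrase)

def crossover(a, b):
    cut = random.randrange(len(TARGET))
    return a[:cut] + b[cut:]

population = [random_phrase() for _ in range(POP_SIZE)]
for generation in range(5000):
    population.sort(key=cost)
    best = population[0]
    if generation % 50 == 0 or cost(best) == 0:
        print(f"gen {generation:4d}  {best!r}  cost={cost(best)}")
    if cost(best) == 0:
        break
    # Keep the fittest half; refill the rest with mutated crossovers of survivors.
    survivors = population[:POP_SIZE // 2]
    population = survivors + [mutate(crossover(random.choice(survivors),
                                               random.choice(survivors)))
                              for _ in range(POP_SIZE - len(survivors))]
```

A trading version would swap the string for a vector of strategy parameters and the cost for simulated trading losses, but the score-select-mutate loop stays the same.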
If it seems like something that would be cool, I would open source it and start working on it asap. The coolest thing is, you don't even have to trade real bitcoin! I would obviously have a test mode that calculates "would've been" profit basically.
the best way to understand what I'm trying to accomplish is by watching this video: https://www.youtube.com/watch?v=05rEefXlmhI This shows how a "dumb" program can evolve relatively quickly to be a "perfect" video game player.
submitted by sacstatebro to Bitcoin [link] [comments]

Best consensus algorithm for an open source project /r/Bitcoin submitted by HiIAMCaptainObvious to BitcoinAll [link] [comments]

Undergrad CS student here, about to start making a *very open source* bitcoin trading bot. It will crush all the other bots because it will use Genetic Evolution as its machine learning algorithm. /r/Bitcoin submitted by BitcoinAllBot to BitcoinAll [link] [comments]

OpenAT: Open Source Algorithmic Trading Library /r/Bitcoin submitted by BitcoinAllBot to BitcoinAll [link] [comments]

Bitcoin mentioned around Reddit: ELI5: How can encryption algorithms be open source without compromising their effectiveness? /r/explainlikeimfive submitted by BitcoinAllBot to BitcoinAll [link] [comments]

Proposal: The Sia Foundation

Vision Statement

A common sentiment is brewing online: a shared desire for the internet that might have been. After decades of corporate encroachment, you don't need to be a power user to realize that something has gone very wrong.
In the early days of the internet, the future was bright. In that future, when you sent an instant message, it traveled directly to the recipient. When you needed to pay a friend, you announced a transfer of value to their public key. When an app was missing a feature you wanted, you opened up the source code and implemented it. When you took a picture on your phone, it was immediately encrypted and backed up to storage that you controlled. In that future, people would laugh at the idea of having to authenticate themselves to some corporation before doing these things.
What did we get instead? Rather than a network of human-sized communities, we have a handful of enormous commons, each controlled by a faceless corporate entity. Hey user, want to send a message? You can, but we'll store a copy of it indefinitely, unencrypted, for our preference-learning algorithms to pore over; how else could we slap targeted ads on every piece of content you see? Want to pay a friend? You can—in our Monopoly money. Want a new feature? Submit a request to our Support Center and we'll totally maybe think about it. Want to back up a photo? You can—inside our walled garden, which only we (and the NSA, of course) can access. Just be careful what you share, because merely locking you out of your account and deleting all your data is far from the worst thing we could do.
You rationalize this: "MEGACORP would never do such a thing; it would be bad for business." But we all know, at some level, that this state of affairs, this inversion of power, is not merely "unfortunate" or "suboptimal" – No. It is degrading. Even if MEGACORP were purely benevolent, it is degrading that we must ask its permission to talk to our friends; that we must rely on it to safeguard our treasured memories; that our digital lives are completely beholden to those who seek only to extract value from us.
At the root of this issue is the centralization of data. MEGACORP can surveil you—because your emails and video chats flow through their servers. And MEGACORP can control you—because they hold your data hostage. But centralization is a solution to a technical problem: How can we make the user's data accessible from anywhere in the world, on any device? For a long time, no alternative solution to this problem was forthcoming.
Today, thanks to a confluence of established techniques and recent innovations, we have solved the accessibility problem without resorting to centralization. Hashing, encryption, and erasure encoding got us most of the way, but one barrier remained: incentives. How do you incentivize an anonymous stranger to store your data? Earlier protocols like BitTorrent worked around this limitation by relying on altruism, tit-for-tat requirements, or "points" – in other words, nothing you could pay your electric bill with. Finally, in 2009, a solution appeared: Bitcoin. Not long after, Sia was born.
Cryptography has unleashed the latent power of the internet by enabling interactions between mutually-distrustful parties. Sia harnesses this power to turn the cloud storage market into a proper marketplace, where buyers and sellers can transact directly, with no intermediaries, anywhere in the world. No more silos or walled gardens: your data is encrypted, so it can't be spied on, and it's stored on many servers, so no single entity can hold it hostage. Thanks to projects like Sia, the internet is being re-decentralized.
Sia began its life as a startup, which means it has always been subjected to two competing forces: the ideals of its founders, and the profit motive inherent to all businesses. Its founders have taken great pains to never compromise on the former, but this often threatened the company's financial viability. With the establishment of the Sia Foundation, this tension is resolved. The Foundation, freed of the obligation to generate profit, is a pure embodiment of the ideals from which Sia originally sprung.
The goals and responsibilities of the Foundation are numerous: to maintain core Sia protocols and consensus code; to support developers building on top of Sia and its protocols; to promote Sia and facilitate partnerships in other spheres and communities; to ensure that users can easily acquire and safely store siacoins; to develop network scalability solutions; to implement hardforks and lead the community through them; and much more. In a broader sense, its mission is to commoditize data storage, making it cheap, ubiquitous, and accessible to all, without compromising privacy or performance.
Sia is a perfect example of how we can achieve better living through cryptography. We now begin a new chapter in Sia's history. May our stewardship lead it into a bright future.
 

Overview

Today, we are proposing the creation of the Sia Foundation: a new non-profit entity that builds and supports distributed cloud storage infrastructure, with a specific focus on the Sia storage platform. What follows is an informal overview of the Sia Foundation, covering two major topics: how the Foundation will be funded, and what its funds will be used for.

Organizational Structure

The Sia Foundation will be structured as a non-profit entity incorporated in the United States, likely a 501(c)(3) organization or similar. The actions of the Foundation will be constrained by its charter, which formalizes the specific obligations and overall mission outlined in this document. The charter will be updated on an annual basis to reflect the current goals of the Sia community.
The organization will be operated by a board of directors, initially comprising Luke Champine as President and Eddie Wang as Chairman. Luke Champine will be leaving his position at Nebulous to work at the Foundation full-time, and will seek to divest his shares of Nebulous stock along with other potential conflicts of interest. Neither Luke nor Eddie personally own any siafunds or significant quantities of siacoin.

Funding

The primary source of funding for the Foundation will come from a new block subsidy. Following a hardfork, 30 KS per block will be allocated to the "Foundation Fund," continuing in perpetuity. The existing 30 KS per block miner reward is not affected. Additionally, one year's worth of block subsidies (approximately 1.57 GS) will be allocated to the Fund immediately upon activation of the hardfork.
As detailed below, the Foundation will provably burn any coins that it cannot meaningfully spend. As such, the 30 KS subsidy should be viewed as a maximum. This allows the Foundation to grow alongside Sia without requiring additional hardforks.
The Foundation will not be funded to any degree by the possession or sale of siafunds. Siafunds were originally introduced as a means of incentivizing growth, and we still believe in their effectiveness: a siafund holder wants to increase the amount of storage on Sia as much as possible. While the Foundation obviously wants Sia to succeed, its driving force should be its charter. Deriving significant revenue from siafunds would jeopardize the Foundation's impartiality and focus. Ultimately, we want the Foundation to act in the best interests of Sia, not in growing its own budget.

Responsibilities

The Foundation inherits a great number of responsibilities from Nebulous. Each quarter, the Foundation will publish the progress it has made over the past quarter, and list the responsibilities it intends to prioritize over the coming quarter. This will be accompanied by a financial report, detailing each area of expenditure over the past quarter, and forecasting expenditures for the coming quarter. Below, we summarize some of the myriad responsibilities towards which the Foundation is expected to allocate its resources.

Maintain and enhance core Sia software

Arguably, this is the most important responsibility of the Foundation. At the heart of Sia is its consensus algorithm: regardless of other differences, all Sia software must agree upon the content and rules of the blockchain. It is therefore crucial that the algorithm be stewarded by an entity that is accountable to the community, transparent in its decision-making, and has no profit motive or other conflicts of interest.
Accordingly, Sia’s consensus functionality will no longer be directly maintained by Nebulous. Instead, the Foundation will release and maintain an implementation of a "minimal Sia full node," comprising the Sia consensus algorithm and P2P networking code. The source code will be available in a public repository, and signed binaries will be published for each release.
Other parties may use this code to provide alternative full node software. For example, Nebulous may extend the minimal full node with wallet, renter, and host functionality. The source code of any such implementation may be submitted to the Foundation for review. If the code passes review, the Foundation will provide "endorsement signatures" for the commit hash used and for binaries compiled internally by the Foundation. Specifically, these signatures assert that the Foundation believes the software contains no consensus-breaking changes or other modifications to imported Foundation code. Endorsement signatures and Foundation-compiled binaries may be displayed and distributed by the receiving party, along with an appropriate disclaimer.
A minimal full node is not terribly useful on its own; the wallet, renter, host, and other extensions are what make Sia a proper developer platform. Currently, the only implementations of these extensions are maintained by Nebulous. The Foundation will contract Nebulous to ensure that these extensions continue to receive updates and enhancements. Later on, the Foundation intends to develop its own implementations of these extensions and others. As with the minimal node software, these extensions will be open source and available in public repositories for use by any Sia node software.
With the consensus code now managed by the Foundation, the task of implementing and orchestrating hardforks becomes its responsibility as well. When the Foundation determines that a hardfork is necessary (whether through internal discussion or via community petition), a formal proposal will be drafted and submitted for public review, during which arguments for and against the proposal may be submitted to a public repository. During this time, the hardfork code will be implemented, either by Foundation employees or by external contributors working closely with the Foundation. Once the implementation is finished, final arguments will be heard. The Foundation board will then vote whether to accept or reject the proposal, and announce their decision along with appropriate justification. Assuming the proposal was accepted, the Foundation will announce the block height at which the hardfork will activate, and will subsequently release source code and signed binaries that incorporate the hardfork code.
Regardless of the Foundation's decision, it is the community that ultimately determines whether a fork is accepted or rejected – nothing can change that. Foundation node software will never automatically update, so all forks must be explicitly adopted by users. Furthermore, the Foundation will provide replay and wipeout protection for its hard forks, protecting other chains from unintended or malicious reorgs. Similarly, the Foundation will ensure that any file contracts formed prior to a fork activation will continue to be honored on both chains until they expire.
Finally, the Foundation also intends to pursue scalability solutions for the Sia blockchain. In particular, work has already begun on an implementation of Utreexo, which will greatly reduce the space requirements of fully-validating nodes (allowing a full node to be run on a smartphone) while increasing throughput and decreasing initial sync time. A hardfork implementing Utreexo will be submitted to the community as per the process detailed above.
As this is the most important responsibility of the Foundation, it will receive a significant portion of the Foundation’s budget, primarily in the form of developer salaries and contracting agreements.

Support community services

We intend to allocate 25% of the Foundation Fund towards the community. This allocation will be held and disbursed in the form of siacoins, and will pay for grants, bounties, hackathons, and other community-driven endeavours.
Any community-run service, such as a Skynet portal, explorer or web wallet, may apply to have its costs covered by the Foundation. Upon approval, the Foundation will reimburse expenses incurred by the service, subject to the exact terms agreed to. The intent of these grants is not to provide a source of income, but rather to make such services "break even" for their operators, so that members of the community can enrich the Sia ecosystem without worrying about the impact on their own finances.

Ensure easy acquisition and storage of siacoins

Most users will acquire their siacoins via an exchange. The Foundation will provide support to Sia-compatible exchanges, and pursue relevant integrations at its discretion, such as Coinbase's new Rosetta standard. The Foundation may also release DEX software that enables trading cryptocurrencies without the need for a third party. (The Foundation itself will never operate as a money transmitter.)
Increasingly, users are storing their cryptocurrency on hardware wallets. The Foundation will maintain the existing Ledger Nano S integration, and pursue further integrations at its discretion.
Of course, all hardware wallets must be paired with software running on a computer or smartphone, so the Foundation will also develop and/or maintain client-side wallet software, including both full-node wallets and "lite" wallets. Community-operated wallet services, i.e. web wallets, may be funded via grants.
Like core software maintenance, this responsibility will be funded in the form of developer salaries and contracting agreements.

Protect the ecosystem

When it comes to cryptocurrency security, patching software vulnerabilities is table stakes; there are significant legal and social threats that we must be mindful of as well. As such, the Foundation will earmark a portion of its fund to defend the community from legal action. The Foundation will also safeguard the network from 51% attacks and other threats to network security by implementing softforks and/or hardforks where necessary.
The Foundation also intends to assist in the development of a new FOSS software license, and to solicit legal memos on various Sia-related matters, such as hosting in the United States and the EU.
In a broader sense, the establishment of the Foundation makes the ecosystem more robust by transferring core development to a more neutral entity. Thanks to its funding structure, the Foundation will be immune to various forms of pressure that for-profit companies are susceptible to.

Drive adoption of Sia

Although the overriding goal of the Foundation is to make Sia the best platform it can be, all that work will be in vain if no one uses the platform. There are a number of ways the Foundation can promote Sia and get it into the hands of potential users and developers.
In-person conferences are understandably far less popular now, but the Foundation can sponsor and/or participate in virtual conferences. (In-person conferences may be held in the future, circumstances permitting.) Similarly, the Foundation will provide prizes for hackathons, which may be organized by community members, Nebulous, or the Foundation itself. Lastly, partnerships with other companies in the cryptocurrency space—or the cloud storage space—are a great way to increase awareness of Sia. To handle these responsibilities, one of the early priorities of the Foundation will be to hire a marketing director.

Fund Management

The Foundation Fund will be controlled by a multisig address. Each member of the Foundation's board will control one of the signing keys, with the signature threshold to be determined once the final composition of the board is known. (This threshold may also be increased or decreased if the number of board members changes.) Additionally, one timelocked signing key will be controlled by David Vorick. This key will act as a “dead man’s switch,” to be used in the event of an emergency that prevents Foundation board members from reaching the signature threshold. The timelock ensures that this key cannot be used unless the Foundation fails to sign a transaction for several months.
On the 1st of each month, the Foundation will use its keys to transfer all siacoins in the Fund to two new addresses. The first address will be controlled by a high-security hot wallet, and will receive approximately one month's worth of Foundation expenditures. The second address, receiving the remaining siacoins, will be a modified version of the source address: specifically, it will increase the timelock on David Vorick's signing key by one month. Any other changes to the set of signing keys, such as the arrival or departure of board members, will be incorporated into this address as well.
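To make the rotation mechanics easier to follow, here is a purely illustrative model of the monthly sweep. It is not Sia wallet code; the 3-of-5 threshold, block counts, and balances are assumptions (the proposal leaves the real threshold to be determined once the board's composition is known).

```python
from dataclasses import dataclass, replace

BLOCKS_PER_MONTH = 4320   # assumption: ~30 days at Sia's ~10-minute block time

@dataclass(frozen=True)
class FundAddress:
    board_keys: tuple            # one signing key per board member
    threshold: int               # signatures required to spend
    deadman_key: str             # the timelocked "dead man's switch" key
    deadman_unlock_height: int   # block height before which that key cannot sign

def monthly_rotation(src: FundAddress, balance: int, monthly_budget: int):
    """Split the fund into a hot-wallet amount and a new cold address whose
    dead man's switch timelock is pushed out by one more month."""
    hot_amount = min(monthly_budget, balance)
    remainder = balance - hot_amount
    new_cold = replace(src,
                       deadman_unlock_height=src.deadman_unlock_height + BLOCKS_PER_MONTH)
    return hot_amount, new_cold, remainder

# Example with arbitrary amounts: a 3-of-5 board address and a monthly budget.
fund = FundAddress(board_keys=("k1", "k2", "k3", "k4", "k5"), threshold=3,
                   deadman_key="dv", deadman_unlock_height=300_000)
hot, cold, rest = monthly_rotation(fund, balance=100_000_000, monthly_budget=20_000_000)
print(hot, rest, cold.deadman_unlock_height)   # 20000000 80000000 304320
```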
The Foundation Fund is allocated in SC, but many of the Foundation's expenditures must be paid in USD or other fiat currency. Accordingly, the Foundation will convert, at its discretion, a portion of its monthly withdrawals to fiat currency. We expect this conversion to be primarily facilitated by private "OTC" sales to accredited investors. The Foundation currently has no plans to speculate in cryptocurrency or other assets.
Finally, it is important that the Foundation adds value to the Sia platform well in excess of the inflation introduced by the block subsidy. For this reason, the Foundation intends to provably burn, on a quarterly basis, any coins that it cannot allocate towards any justifiable expense. In other words, coins will be burned whenever doing so provides greater value to the platform than any other use. Furthermore, the Foundation will cap its SC treasury at 5% of the total supply, and will cap its USD treasury at 4 years’ worth of predicted expenses.
 
Addendum: Hardfork Timeline
We would like to see this proposal finalized and accepted by the community no later than September 30th. A new version of siad, implementing the hardfork, will be released no later than October 15th. The hardfork will activate at block 293220, which is expected to occur around 12pm EST on January 1st, 2021.
 
Addendum: Inflation specifics
The total supply of siacoins as of January 1st, 2021 will be approximately 45.243 GS. The initial subsidy of 1.57 GS thus increases the supply by 3.47%, and the total annual inflation in 2021 will be at most 10.4% (if zero coins are burned). In 2022, total annual inflation will be at most 6.28%, and will steadily decrease in subsequent years.
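A rough sanity check of these figures, assuming Sia's ~10-minute block time (about 52,560 blocks per year); the proposal itself only states the resulting totals, and the small differences from the numbers above come from rounding and the exact block count assumed.

```python
BLOCKS_PER_YEAR = 365 * 24 * 6      # ~52,560 blocks at a 10-minute block time (assumption)
FOUNDATION_SUBSIDY = 30_000         # 30 KS per block to the Foundation Fund
MINER_SUBSIDY = 30_000              # existing 30 KS per block miner reward (unchanged)
INITIAL_SUPPLY = 45.243e9           # ~45.243 GS at activation

initial_grant = FOUNDATION_SUBSIDY * BLOCKS_PER_YEAR      # one year's worth, ~1.58 GS
new_coins_2021 = initial_grant + (FOUNDATION_SUBSIDY + MINER_SUBSIDY) * BLOCKS_PER_YEAR

print(f"initial grant: {initial_grant / 1e9:.2f} GS "
      f"= {initial_grant / INITIAL_SUPPLY:.2%} of supply")
print(f"2021 inflation, zero burns: {new_coins_2021 / INITIAL_SUPPLY:.2%}")

supply_2022 = INITIAL_SUPPLY + new_coins_2021
new_coins_2022 = (FOUNDATION_SUBSIDY + MINER_SUBSIDY) * BLOCKS_PER_YEAR
print(f"2022 inflation, zero burns: {new_coins_2022 / supply_2022:.2%}")
```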
 

Conclusion

We see the establishment of the Foundation as an important step in the maturation of the Sia project. It provides the ecosystem with a sustainable source of funding that can be exclusively directed towards achieving Sia's ambitious goals. Compared to other projects with far deeper pockets, Sia has always punched above its weight; once we're on equal footing, there's no telling what we'll be able to achieve.
Nevertheless, we do not propose this change lightly, and have taken pains to ensure that the Foundation will act in accordance with the ideals that this community shares. It will operate transparently, keep inflation to a minimum, and respect the user's fundamental role in decentralized systems. We hope that everyone in the community will consider this proposal carefully, and look forward to a productive discussion.
submitted by lukechampine to siacoin [link] [comments]

Ledger Live adds Coin control: Here's why that matters.

Ledger Live adds Coin control: Here's why that matters.
Ledger Live version 2.11.1 (download link) adds Coin control for power users.
The coin control feature gives advanced users more granular control over their wallets. It enables them to change how and which coins are selected when making transactions. This increases their ability to manage their privacy and the network fees they will have to pay to spend their account balance.
More control over your coins

How does it work?

The account balance for Bitcoin and its derivatives consists of all the unspent transaction outputs (UTXOs) in the account. You can think of UTXOs as the coins in a regular wallet. When you receive money, you collect coins in your wallet. Then, when you want to make a payment, you get to choose which coins you pick from your wallet. Do you pick the largest coins first? Or do you want to spend all the smaller value coins to lighten up your wallet? Similar considerations can be made when creating a Bitcoin or Bitcoin derivative (altcoin) transaction.
Before the Coin Control feature was released, all transactions involving Bitcoin (and altcoins) automatically selected their coins using the First-In-First-Out (FIFO) algorithm. This strategy includes the oldest coin in the account, and when the amount is not sufficient the second-oldest coin is added, and so forth.
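For the curious, oldest-coins-first selection boils down to something like the following sketch. This is illustrative only, not Ledger Live's actual source code, and it ignores fees and change handling.

```python
from typing import NamedTuple

class Utxo(NamedTuple):
    txid: str
    value: int          # amount in satoshis
    block_height: int   # block it was received in; lower = older

def select_oldest_first(utxos, amount_needed):
    """FIFO selection: keep adding the oldest remaining coin until the target is covered."""
    selected, total = [], 0
    for utxo in sorted(utxos, key=lambda u: u.block_height):
        if total >= amount_needed:
            break
        selected.append(utxo)
        total += utxo.value
    if total < amount_needed:
        raise ValueError("insufficient balance")
    return selected

wallet = [Utxo("a1", 50_000, 600_000),
          Utxo("b2", 30_000, 610_000),
          Utxo("c3", 80_000, 650_000)]
# Paying 60,000 sats picks the two oldest coins (a1 then b2), not the single larger c3.
print(select_oldest_first(wallet, 60_000))
```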
As of Ledger Live version 2.11.1, users are able to make use of a dedicated Coin Control tool to choose the coin selection strategy and the coins that may be spent.

Using Coin control in Ledger Live

Coin control is available in Advanced options in the Send flow
  1. Click on Send, choose an account to debit, and enter a recipient address. Click on Continue.
  2. Enter an amount and click on Advanced options. You will then see:
     - The currently selected, default coin selection strategy: Oldest coins first (FIFO).
     - A toggle to enable Replace-By-Fee (RBF).
     - A toggle to include coins from unconfirmed, replaceable transactions.
  3. Click on Coin control. The coin control modal opens.
  4. Select a Coin selection strategy from the dropdown menu:
     - Oldest coins first (FIFO). This is the default strategy that spends the oldest coins first.
     - Minimize fees (optimize size). This strategy tries to minimize the byte size of the transaction by spending the lowest number of UTXOs. This results in a low network fee.
     - Minimize future fees (merge coins). This strategy includes the maximum number of inputs so that a potential future price rise does not make smaller UTXOs economically unspendable. If the price of a crypto asset increases too much, small UTXOs may become worth less than the cost of the network fees to spend them.
  5. Select which coins may not be included in the selection by unticking their checkbox. The SELECTED indicator shows which coins will be included in the transaction. By changing the selection strategy and/or coins to include, the user has precise control over which coins end up being spent. The Coins to spend and Change to return indicators show how much is spent from and returned to the account.
  6. Click on Done to return to the Send flow to verify and send the transaction.
The coin control window lets you select the strategy as well as pick the coins. Coins marked SELECTED will be included in the transaction.

Coin status

The following statuses can be displayed for a coin:
  • Coins received in a transaction with 0 confirmations without RBF enabled: PENDING
  • Coins received in a transaction with 0 confirmations with RBF enabled: REPLACEABLE
  • Coins received in a transaction with 1337 confirmations: 1337 CONFIRMATIONS
By enabling the toggle Include coins from unconfirmed, replaceable transactions, replaceable transactions can be selected in the Coin control screen.

The Privacy use case

One of the main use cases for Coin control is to protect one’s privacy. UTXOs are, unfortunately, not perfectly fungible due to their unique history on the blockchain. Therefore, users may want to spend coins from different sources without mixing them together, because this would indicate to an outside observer of the blockchain that these addresses belong to the same account. For instance, if one were to spend coins bought on a KYC exchange, which are associated with the user’s identity, together with coins bought anonymously using cash, the anonymous coins could be linked to the user’s identity.
Another example would be that you would like to prevent spending a high-value coin for smaller purchases, because this would unnecessarily show the person you're paying how much you have. This is similar to not showing the boulanger how much is in your bank account when buying a baguette.

Let us know what you think!

We are excited to release this new feature because we think it will fulfill real needs of an important part of our users. This version of Ledger Live marks an important milestone, but we will continue working on more features that our community wants.
So, we invite you to try out Coin control in Ledger Live and let us know what you think! All feedback is welcome on this thread, on ledgerwallet, and you can send suggestions or get help through our official contact form.
We'd like to close out by underlining our commitment to the Bitcoin community, and our willingness to build the best wallet ecosystem for newbies as well as for power users.
submitted by fabnormal to Bitcoin [link] [comments]

Stakenet (XSN) - A DEX with interchain capabilities (BTC-ETH), Huge Potential [Full Writeup]

Preface
Full disclosure here; I am heavily invested in this. I have picked up some real gems from here and was only in the position to buy so much of this because of you guys, so I thought it was time to give back. I only invest in Utility Coins. These are coins that actually DO something, and provide new/build upon the crypto infrastructure to work towards the end goal that Bitcoin itself set out to achieve (financial independence from the fiat banking system). This way, I avoid 99% of the scams in crypto that are functionless vapourware, and if you only invest in things that have strong fundamentals in the long term you are much more likely to make money.
Introduction
Stakenet is a Lightning Network-ready open-source platform for decentralized applications with its native cryptocurrency – XSN. It is powered by a Proof of Stake blockchain with trustless cold staking and Masternodes. Its use case is to provide a highly secure cross-chain infrastructure for these decentralized applications, where individuals can easily operate with any blockchain simply by using Stakenet and its native currency XSN.
Ok... but what does it actually do and solve?
The moonshot here is the DEX (Decentralised Exchange) that they are building. This is a lightning-network DEX with interchain capabilities. That means you could trade BTC directly for ETH: securely, instantly, cheaply and privately.
Right now, most crypto is traded to and from Centralised Exchanges like Binance. To buy and sell on these exchanges, you have to send your crypto to wallets on that exchange. That means the exchanges have your private keys, and they have control over your funds. When you use a centralised exchange, you are no longer in control of your assets, and depend on the trustworthiness of middlemen. We have, of course, seen infamous exit scams by centralised exchanges like Mt. Gox in the past.
The alternative? Decentralised Exchanges. DEXs have no central authority and, most importantly, your private keys (your crypto) never leave YOUR possession and are never in anyone else's possession. So you can trade peer-to-peer without any of the drawbacks of Centralised Exchanges.
The problem is that this technology has not been perfected yet, and the DEXs available to us now are not providing cheap, private, quick trading on a decentralised medium because of their technological inadequacies. Take Uniswap for example. This DEX accounts for over 60% of all DEX volume and facilitates trading of ERC-20 tokens over the Ethereum blockchain. The problem? Because of the huge number of transactions occurring over the Ethereum network, congestion has set in (too many transactions for the network to handle at one time), so fees have increased dramatically. Another big problem? It's only for Ethereum. You can't, for example, buy LINK with BTC. You must use ETH.
The solution? Layer 2 protocols. These are layers built ON TOP of existing blockchains, designed to solve the transaction and scaling difficulties that crypto as a whole is facing today (and that are ultimately holding back mass adoption). The developers at Stakenet have seen the big picture and have decided to implement the lightning network (a layer 2 protocol) into its DEX from the ground up. This will facilitate the functionality of a DEX without any of the drawbacks of the CEXs and the DEXs we have today.
Here's someone much more qualified than me, Andreas Antonopoulos, to explain this:
https://streamable.com/kzpimj
'Once we have efficient, well designed DEX's on layer 2, there won't even be any DEX's on layer 1'
Progress
The Stakenet team were the first to envision this grand solution and have been working on it since its conception in June 2019. They have been making steady progress ever since, and right now the DEX is in an open beta stage where rigorous testing is constantly carried out by the team and the public. For a project of this scale, stress testing is paramount. If the product were to launch with any bugs/errors that resulted in the loss of a user's funds, this would obviously be very damaging to Stakenet's reputation. So I believe that the developers' conservative approach is wise.
As of now the only pairs tradeable on the DEX are XSN/BTC and LTC/BTC. The DEX has only just launched as a public beta and is not in its full public release stage yet. As development moves forward, more lightning network and atomic swap compatible coins will be added to the DEX, and of course the team are hard at work on Raiden Integration - this will allow ETH and tokens on the Ethereum blockchain to be traded on the DEX between separate blockchains (instantly, cheaply, privately). This is where Stakenet enters top 50 territory on CMC if successful, and is the true value here. Raiden Integration is well underway and is being tested in a closed public group on Linux.
The full public DEX with Raiden Integration is expected to release by the end of the year. Given the state of development so far and the rate of progress, this seems realistic.
Tokenomics
2.6 Metrics overview (from whitepaper)
XSN is slightly inflationary, much like ETH, as this is necessary for the economy to be adopted and work in the long term. There is, however, a deflationary mechanism in place: all trading fees on the DEX get converted to XSN and 10% of these fees are burned. This puts constant buying pressure on XSN and acts as a deflationary mechanism. XSN has inherent value because it makes up the infrastructure that the DEX will run on, and as such Masternode operators and Stakers will receive the fees from the DEX.
Conclusion
We can clearly see that a layer 2 DEX is the future of cryptocurrency trading. It will facilitate secure, cheap, instant and private trading across all coins with lightning capabilities, thus solving the scaling and transaction issues that are holding back crypto today. I don't need to tell you the implications of this, and what it means for crypto as a whole. If Stakenet can launch a layer 2 DEX with Raiden Integration, it will become the primary DEX in terms of volume.
Stakenet DEX will most likely be the first layer 2 DEX (first-mover advantage), and its blockchain is the infrastructure that will host this DEX and subsequently receive its trading fees. It is not difficult to envision a time in the next year when Stakenet DEX is functional and hosting hundreds of millions of dollars' worth of trading every single day.
At a $30 million market cap, I can't see any other potential investment right now with this much potential upside.
This post has merely served as an introduction and a heads-up for this project; there is MUCH more to cover, like vortex liquidity, masternodes, TOR integration... For now, here is some additional reading. Resources
TLDR; No. Do you want to make money? I'd start with learning how to read.
submitted by hotprocession to CryptoMoonShots [link] [comments]

Zano Newcomers Introduction/FAQ - please read!

Welcome to the Zano Sticky Introduction/FAQ!

https://preview.redd.it/al1gy9t9v9q51.png?width=424&format=png&auto=webp&s=b29a60402d30576a4fd95f592b392fae202026ca
Hopefully any questions you have will be answered by the resources below, but if you have additional questions feel free to ask them in the comments. If you're quite technically-minded, the Zano whitepaper gives a thorough overview of Zano's design and its main features.
So, what is Zano? In brief, Zano is a project started by the original developers of CryptoNote. Coins with market caps totalling well over a billion dollars (Monero, Haven, Loki and countless others) run upon the codebase they created. Zano is a continuation of their efforts to create the "perfect money", and brings a wealth of enhancements to their original CryptoNote code.
Development happens at a lightning pace, as the Github activity shows, but Zano is still very much a work-in-progress. Let's cut right to it:
Here's why you should pay attention to Zano over the next 12-18 months. Quoting from a recent update:
Anton Sokolov has recently joined the Zano team. ... For the last months Anton has been working on theoretical work dedicated to log-size ring signatures. These signatures theoretically allow for a logarithmic relationship between the number of decoys and the size/performance of transactions. This means that we can set mixins at a level of up to 1000, keeping the reasonable size and processing speed of transactions. This will take Zano’s privacy to a whole new level, and we believe this technology will turn out to be groundbreaking!
If successful, this scheme will make Zano the most private, powerful and performant CryptoNote implementation on the planet. Bar none. A quantum leap in privacy with a minimal increase in resource usage. And if there's one team capable of pulling it off, it's this one.
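To get a feel for what a logarithmic relationship means in practice, here is a toy comparison. The "units" are made up; only the growth rates (linear for classic ring signatures versus logarithmic for the proposed scheme) reflect the claim in the quote above.

```python
import math

# Illustrative only: compares how signature data grows with the decoy count
# under a linear scheme versus a log-size scheme.
for decoys in (10, 100, 1000):
    linear = decoys                          # classic ring signatures: O(n) in ring size
    log_size = math.ceil(math.log2(decoys))  # log-size scheme: O(log n)
    print(f"{decoys:5d} decoys -> linear ~{linear:4d} units, log-size ~{log_size:2d} units")
```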

What else makes Zano special?

You mean aside from having "the Godfather of CryptoNote" as the project lead? ;) Actually, the calibre of the developers/researchers at Zano probably is the project's single greatest strength. Drawing on years of experience, they've made careful design choices, optimizing performance with an asynchronous core architecture, and flexibility and extensibility with a modular code structure. This means that the developers are able to build and iterate fast, refining features and adding new ones at a rate that makes bigger and better-funded teams look sluggish at best.
Zano also has some unique features that set it apart from similar projects:
Privacy
Firstly, if you're familiar with CryptoNote you won't be surprised that Zano transactions are private. The perfect money is fungible, and therefore must be untraceable. Bitcoin, for the most part, does little to hide your transaction data from unscrupulous observers. With Zano, privacy is the default.
The untraceability and unlinkability of Zano transactions come from its use of ring signatures and stealth addresses. What this means is that no outside observer is able to tell if two transactions were sent to the same address, and for each transaction there is a set of possible senders that make it impossible to determine who the real sender is.
Hybrid PoW-PoS consensus mechanism
Zano achieves an optimal level of security by utilizing both Proof of Work and Proof of Stake for consensus. By combining the two systems, it mitigates their individual vulnerabilities (see 51% attack and "nothing at stake" problem). For an attack on Zano to have even a remote chance of success the attacker would have to obtain not only a majority of hashing power, but also a majority of the coins involved in staking. The system and its design considerations are discussed at length in the whitepaper.
Aliases
Here's a stealth address: ZxDdULdxC7NRFYhCGdxkcTZoEGQoqvbZqcDHj5a7Gad8Y8wZKAGZZmVCUf9AvSPNMK68L8r8JfAfxP4z1GcFQVCS2Jb9wVzoe. I have a hard enough time remembering my phone number. Fortunately, Zano has an alias system that lets you register an address to a human-readable name. (@orsonj if you want to anonymously buy me a coffee)
Multisig
Multisignature (multisig) refers to requiring multiple keys to authorize a Zano transaction. It has a number of applications, such as dividing up responsibility for a single Zano wallet among multiple parties, or creating backups where loss of a single seed doesn't lead to loss of the wallet.
Multisig and escrow are key components of the planned Decentralized Marketplace (see below), so consideration was given to each of them from the design stages. Thus Zano's multisig, rather than being tacked on at the wallet level as an afterthought, is part of its core architecture, incorporated at the protocol level. This base-layer integration means months won't be spent in the future on complicated refactoring efforts in order to integrate multisig into a codebase that wasn't designed for it. Plus, it makes it far easier for third-party developers to include multisig (implemented correctly) in any Zano wallets and applications they create in the future.
(Double Deposit MAD) Escrow
With Zano's escrow service you can create fully customizable p2p contracts that are designed to, once signed by participants, enforce adherence to their conditions in such a way that no trusted third-party escrow agent is required.
https://preview.redd.it/jp4oghyhv9q51.png?width=1762&format=png&auto=webp&s=12a1e76f76f902ed328886283050e416db3838a5
The Particl project, aside from a couple of minor differences, uses an escrow scheme that works the same way, so I've borrowed the term they coined ("Double Deposit MAD Escrow") as I think it describes the scheme perfectly. The system requires participants to make additional deposits, which they will forfeit if there is any attempt to act in a way that breaches the terms of the contract. Full details can be found in the Escrow section of the whitepaper.
The usefulness of multisig and the escrow system may not seem obvious at first, but as mentioned before they'll form the backbone of Zano's Decentralized Marketplace service (described in the next section).

What does the future hold for Zano?

The planned upgrade to Zano's privacy, mentioned at the start, is obviously one of the most exciting things the team is working on, but it's not the only thing.
Zano Roadmap
Decentralized Marketplace
From the beginning, the Zano team's goal has been to create the perfect money. And money can't just be some vehicle for speculative investment; money must be used. To that end, the team have created a set of tools to make it as simple as possible for Zano to be integrated into eCommerce platforms. Zano's APIs and plugins are easy to use, allowing even those with very little coding experience to use them in their E-commerce-related ventures. The culmination of this effort will be a full Decentralized Anonymous Marketplace built on top of the Zano blockchain. Rather than being accessed via the wallet, it will act more as a service - Marketplace as a Service (MAAS) - for anyone who wishes to use it. The inclusion of a simple "snippet" of code into a website is all that's needed to become part of a global decentralized, trustless and private E-commerce network.
Atomic Swaps
Just as Zano's marketplace will allow you to transact without needing to trust your counterparty, atomic swaps will let you easily convert between Zano and other cryptocurrencies without having to trust a third-party service such as a centralized exchange. On top of that, it will also pave the way for Zano's inclusion in the many decentralized exchange (DEX) services that have emerged in recent years.

Where can I buy Zano?

Zano's currently listed on the following exchanges:
https://coinmarketcap.com/currencies/zano/markets/
It goes without saying, neither I nor the Zano team work for any of the exchanges or can vouch for their reliability. Use at your own risk and never leave coins on a centralized exchange for longer than necessary. Your keys, your coins!
If you have any old graphics cards lying around (both AMD & NVIDIA), then Zano is also mineable through its unique ProgPowZ algorithm. Here's a guide on how to get started.
Once you have some Zano, you can safely store it in one of the desktop or mobile wallets (available for all major platforms).

How can I support Zano?

Zano has no marketing department, which is why this post has been written by some guy and not the "Chief Growth Engineer @ Zano Enterprises". The hard part is already done: there's a team of world class developers and researchers gathered here. But, at least at the current prices, the team's funds are enough to cover the cost of development and little more. So the job of publicizing the project falls to the community. If you have any experience in community building/growth hacking at another cryptocurrency or open source project, or if you're a Zano holder who would like to ensure the project's long-term success by helping to spread the word, then send me a pm. We need to get organized.
Researchers and developers are also very welcome. Working at the cutting edge of mathematics and cryptography means Zano provides challenging and rewarding work for anyone in those fields. Please contact the project's Community Manager u/Jed_T if you're interested in joining the team.
Social Links:
Twitter
Discord Server
Telegram Group
Medium blog
I'll do my best to keep this post accurate and up to date. Message me please with any suggested improvements and leave any questions you have below.
Welcome to the Zano community and the new decentralized private economy!
submitted by OrsonJ to Zano [link] [comments]

Checker Thread (A list of hacked clients for easy reference)

Community contribution is the only way this thread will be useful. This is a WIP! More clients will be added as time goes on!
Quick note: The vast majority of these are files supplied by the community -- some may not work. The person who sent me 9b9t says it doesn't work, but I have not personally tested. Please let me know if it works or does not work.

Some additional notes:
I am open to community criticism so long as it can actually be used to benefit the thread. Issues with formatting, approach, client choice, etc. are all great things to come talk to me about so I may improve. However, people keep commenting the same things over and over and over — “VirusTotal can’t catch X” is a popular one — and I don’t have time to deal with it. If you have a security concern, you are more than welcome to raise it. HOWEVER, do not make unreasonable requests of me — I work, in addition to that I have a social life, I have responsibilities at home as well. This is a side thing that I do when I have the time. A lot of people have suggested looking through bytecode to determine if a client is malicious. I do not have the time to do this for every client. If you are willing to help and do more than say “you should do more than you already are, even though you have a job and a life outside of an obscure subreddit,” feel free to DM me. People who take the broken record approach and say things that are already covered in the thread, or refuse to offer help and just post complaints, will be blocked and ignored. I’m all down for making this post better, but I’m not gonna waste my time with people who won’t work with me. Thank you for your help!

BIG THANKS TO u/jpie726 FOR HIS MASSIVE CONTRIBUTIONS! WITHOUT HIM THIS LIST WOULD BE SIGNIFICANTLY LESS EXPANSIVE. GO UPVOTE THIS MAN INTO HEAVEN, HE DESERVES IT!

Eventually I would like to make a Python script that takes care of all the necessary tasks to install these clients. Python itself is available through the Windows Store as well as on Python's website, or through various other installers. It will install any additional dependencies via Pip and will use Curl to retrieve the files. (A rough sketch of what such a script might look like appears after the list below.) Anyone interested in helping with this script is more than welcome to do so. The two options for assisting me would be as follows:
1. You can install Visual Studio Code or Atom and work with me directly through those applications (more details soon™)
2. A GitHub page may be made and you could submit pull requests through that and edit the script alone.
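As mentioned above, here is a rough sketch of what the core of such a script might look like. Everything in it is a placeholder: the URL, the hash, and the file name are not real download locations, and the finished script would read them from a maintained catalogue covering each client in this thread.

```python
import hashlib
import subprocess
import sys
from pathlib import Path

# Placeholder catalogue: the URL and hash are NOT real; they only show the shape
# of the data the script would need for each client.
CLIENTS = {
    "impact": {
        "url": "https://example.com/ImpactInstaller.jar",
        "sha256": "0000000000000000000000000000000000000000000000000000000000000000",
    },
}

def download(url: str, dest: Path) -> None:
    # curl, as mentioned above; any HTTP client would work just as well.
    subprocess.run(["curl", "-L", "-o", str(dest), url], check=True)

def sha256_of(path: Path) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def install(name: str) -> None:
    meta = CLIENTS[name]
    dest = Path(f"{name}-installer.jar")
    download(meta["url"], dest)
    actual = sha256_of(dest)
    if actual != meta["sha256"].lower():
        dest.unlink()
        sys.exit(f"{name}: hash mismatch ({actual}), file deleted")
    print(f"{name}: hash verified, saved to {dest}")

if __name__ == "__main__":
    install(sys.argv[1] if len(sys.argv) > 1 else "impact")
```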

#######################################################################
There is a misconception that I am only here for free paid clients.
1. I can't use a paid client unless I have access to a cracked version, an account, and the HWID bound to said account.
2. People are also saying there's a way for clients to track if someone leaks them. While I suppose this is possible if they bothered to put in the absurd amount of effort it would take, it would give them no benefit and it would do no good as I am not running the software on my desktop -- I am sending the software to VirusTotal, grabbing the SHA-256 checksum, and deleting the file.
3. If all you're going to do is post that "VirusTotal can't catch x" or "muh client" do not waste my time, and don't waste yours. You will be ignored in favor of people who will actually help me construct what I hope to be a megathread for this subreddit, FOR YOU GUYS. I mean jeez, someone's trying to help and half the fucking responses are "muh client" like come on. No wonder this fucking subreddit is dead, sheesh. If someone's trying to help you and you just step on their toes that's just not very cool, not everyone is trying to scam you. I do just be trying to help people who come here doe.
4. If you have concerns, raise them respectfully. Do not attack me, you will be ignored or if I feel so inclined I will give you 110% of the shit you give me right back to you. I will answer questions, I will elaborate on my goals, I will take helpful advice. Everything else will be outright ignored, and misinformation will be countered. That is all, thread below.
#######################################################################

I named this the Checker Thread to make it easy to search for in the subreddit. Enjoy. Below is a list of hacked clients, with VirusTotal links, SHA-256 hashes, direct download links (skipping ad pages) and, eventually, features for each client. I'd also like to add which servers they do and don't work on, but I need the community's help with that.
The only client I use is Impact, send me additional clients in the comments and they will be added to this list.

Note about SHA-256 hashes:
An SHA-256 hash is a fingerprint of a file, produced algorithmically. This type of hash was developed by the United States National Security Agency and is typically used to verify that files are what they are supposed to be. It is widely used in cryptographic applications such as SSH, APT repositories, website transactions, file verification, Bitcoin, and more, and has been in use since 2001. An SHA-256 hash will be exactly the same for the same file regardless of the source of the file, so if your hash is different from the one listed here, you do not have the legitimate installer OR the hash is not up to date.
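If you want to check a download yourself, a few lines of Python will do it. The file name and the hash shown here are just an example pairing borrowed from the Impact entry below; substitute whatever you actually downloaded and the hash from its entry.

```python
import hashlib
from pathlib import Path

# Compare a downloaded installer against the hash listed in its entry in this thread.
expected = "4eaffb99759fbd949d0fbef58ae9ceb45ce8ca2b0d7dc22147d4ff0e46f010ec"
actual = hashlib.sha256(Path("ImpactInstaller.jar").read_bytes()).hexdigest()
print("MATCH" if actual == expected else f"MISMATCH: {actual}")
```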

Note about VirusTotal tests:
Several people have pointed out to me something I feel should be brought up in the main thread. VirusTotal is not a catch-all, just as your typical antivirus software is not a catch-all. It will not catch everything. It should be used as a general guideline only. Clean VirusTotal tests do not guarantee your safety, although if the program passes all ~70 or so antivirus engines it's probably safe to use. Any additional malicious activity should be caught by your antivirus program's heuristics. I take no responsibility for anything that is malicious, but I can say in good conscience that I have done everything I can to ensure that everything on this thread is safe. Keep in mind that programs that trip VirusTotal may still be safe to use, antivirus programs often pick up injectors as malware or Trojans because that's how those types of viruses behave in the real world. Use your best judgement. If your best judgement is not good enough, do not use these programs. You have been warned.

----- C --- L --- I --- E --- N --- T--- S ------ C --- L --- I --- E --- N --- T--- S ------ C --- L --- I --- E --- N --- T--- S -----

Impact | 1.11.2 to 1.15.2 | Java Edition
VirusTotal Link for *.exe Installer | VirusTotal Link for *.jar Installer
SHA-256 Hash: 4EAFFB99759FBD949D0FBEF58AE9CEB45CE8CA2B0D7DC22147D4FF0E46F010EC
Impact triggered 0/72 engines on both installers.
Direct Link to Impact's *.exe Installer | Direct Link to Impact's *.jar Installer
------------------------------------------------------------------------------------------------------------------------------------------------

Sigma | 1.8 to 1.16 | Java Edition
VirusTotal Link for *.jar Installer | VirusTotal Link for the *.zip the Installer is Contained In
SHA-256 Hash for the *.zip file: 3FCD397849358522BF0EEEAF117487DBA860919900A904551DF512BE2C34B48C
Sigma's *.zip file triggered 0/59 engines.
Sigma's *.jar file triggered 0/60 engines.
Direct Link to Sigma's *.zip File that Contains the Installer
------------------------------------------------------------------------------------------------------------------------------------------------

9b9t | 1.12.2, needs testing | Java Edition | Forge Mod
VirusTotal Link for the *.jar Forge Mod
SHA-256 Hash: 30E4F2778688D54CE7992AFDE509460A7BDDBDA77800219083D4D12BC696EEA0
9b9t triggered 0/60 engines.
AnonFile link to 9b9t's *.jar Mod
------------------------------------------------------------------------------------------------------------------------------------------------

Ingrosware | 1.12.2 | Java Edition | Forge Mod
VirusTotal Link for the *.jar Forge Mod
SHA-256 Hash: BD1A0F9079F4C834A251163C3A0ECBFF7DFC28AB00CF1C74008AADD042FAD358
Ingrosware triggered 0/59 engines.
AnonFile link to a pre-built *.jar Mod
Note: Ingrosware is open source, and is available on GitHub. If you want to build it yourself, you can do so here.
------------------------------------------------------------------------------------------------------------------------------------------------

Mercury | 1.12.2 | Java Edition | Forge Mod
VirusTotal Link for the *.jar Forge Mod
SHA-256 Hash: 70E585A94218149970410ACAE5BE7C1C1B731140F1AF55FE2D1292B1CA74DCB9
Mercury triggered 0/60 engines.
AnonFile link to Mercury's *.jar Mod
------------------------------------------------------------------------------------------------------------------------------------------------

Atlas | 1.12.2 | Java Edition | Forge Mod | Use with caution!
VirusTotal Link for the *.jar Forge Mod
SHA-256 Hash: 7AEB7220CBD5D7C4E4421A940357F14EC70B18DB905469E288529FE3A2C04D57
Note: The file is called AceHackGold-n3.0-release.jar in VirusTotal. The client is identified as Atlas in the .nfo file it came bundled with.
Atlas triggered 7/59 engines.
Note: Upon closer inspection of the VirusTotal scan, the client appears to be of the injected flavor. Injectors are commonly flagged as false-positive Trojans. The client appears to be safe, and there was nothing in the VirusTotal scan that is atypical for an injector. While this suggests the file is safe, use with caution.
AnonFile Link for the *.jar Mod
Note: This is a cracked client! The crack is pre-done, so no additional work is required to use -- just put it in your Forge Mods folder and click play.
Note 2: The file downloaded is called AceHackGold-n3.0-release.jar; I'm not sure why. This file was community-sourced, but it has been inspected in the same manner as all the others.
------------------------------------------------------------------------------------------------------------------------------------------------

Atom | 1.12.2 | Java Edition | Forge Mod | Use with extreme caution!
VirusTotal Link for the *.zip the Forge Mod is Contained In
SHA-256 Hash: 3B43F952EB5B14F2B01592057B27E92B0E38B6874EA10B8E893BFCBC71463377
Note: The file is called output.157312297.txt in VirusTotal. In addition, VirusTotal identifies the file type properly (*.zip).
Atom triggered 9/59 engines.
Note: Upon closer inspection of the VirusTotal scan, the client accesses numerous registry keys, which is a behavior I personally would consider to be unnecessary and incredibly suspicious. You can find more information in the VirusTotal scan. The client also exhibits typical Trojan false-positives.
AnonFile Link for the *.zip File
Note: This is a cracked client! The crack is pre-done, so no additional work is required to use -- just put it in your Forge Mods folder and click play.
Use this client with extreme caution. There are behaviors that I consider to be extremely suspicious; however, you must determine for yourself whether it's safe to use. This may just be how the client works. I do not know, and I can't be bothered to test it.
------------------------------------------------------------------------------------------------------------------------------------------------

Aurora | 1.12.2 | Java Edition | Forge Mod
VirusTotal Link for the *.zip the Forge Mod is Contained In
SHA-256 Hash: 9A66929B629AB076383340D33E0EF9B8CE221679EF79315240EA6C760651A533
Aurora triggered 0/61 engines.
AnonFile Link for the *.zip File
Note: This is a cracked client! The crack is pre-done, so no additional work is required to use -- just put it in your Forge Mods folder and click play.
------------------------------------------------------------------------------------------------------------------------------------------------

CandyCat | 1.12.2 | Java Edition | Forge Mod | Use with caution!
VirusTotal Link for the *.zip the Forge Mod is Contained In
SHA-256 Hash: 8CEC2F9F28AA3957504E0CC66BF1516080C7BAC50EADB54DC6DD97E0E6E9C745
CandyCat triggered 9/61 engines.
Note: Upon closer inspection of the VirusTotal scan, the client appears to be of the injected flavor. Injectors are commonly flagged as false-positive Trojans. The client appears to be safe, and there was nothing in the VirusTotal scan that is atypical for an injector. While this suggests the file is safe, use with caution.
AnonFile Link for the *.zip File
Note: This is a cracked client! The crack is pre-done, so no additional work is required to use -- just put it in your Forge Mods folder and click play.
------------------------------------------------------------------------------------------------------------------------------------------------

DayNightGod | 1.12.2 | Java Edition | Forge Mod | Use with caution!
VirusTotal Link for the *.zip the Forge Mod is Contained In
SHA-256 Hash: 9CEEB43476B18149C0DA76B7AE94713AAF60ED4D2BFD2339E863CC46A1808A0D
DayNightGod triggered 1/59 engines.
Note: Upon closer inspection, only one engine was triggered. The client did not trigger the usual false positives of a Trojan; use with caution.
AnonFile Link for the *.zip File
Note: This is a cracked client! The crack is pre-done, so no additional work is required to use -- just put it in your Forge Mods folder and click play.
------------------------------------------------------------------------------------------------------------------------------------------------

HyperLethal | 1.12.2 | Java Edition | Forge Mod | Use with caution!
VirusTotal Link for the *.zip the Forge Mod is Contained In
SHA-256 Hash: 77FACC1FDB0415438963CCC8DDB4081958563AAA962CE9C024E5063DA32E8FAD
HyperLethal triggered 2/59 engines.
Note: Upon closer inspection of the VirusTotal scan, the client appears to be of the injected flavor. Injectors are commonly flagged as false-positive Trojans. The client appears to be safe, and there was nothing in the VirusTotal scan that is atypical for an injector. While this suggests the file is safe, use with caution.
AnonFile Link for the *.zip File
Note: This is a cracked client! The crack is pre-done, so no additional work is required to use -- just put it in your Forge Mods folder and click play.
------------------------------------------------------------------------------------------------------------------------------------------------

LoveClient | 1.12.2 | Java Edition | Forge Mod | Use with extreme caution!
VirusTotal Link for the *.zip the Forge Mod is Contained In
SHA-256 Hash: C71EC42FF612D75CB7AA21B8400D164A74AAD9BB65D2DFEE232461DAF98034C2
LoveClient triggered 9/61 engines.
Note: Upon closer inspection of the VirusTotal scan, the client accesses numerous registry keys, which is a behavior I personally would consider to be unnecessary and incredibly suspicious. You can find more information in the VirusTotal scan. The client also exhibits typical Trojan false-positives.
AnonFile Link for the *.zip File
Use this client with extreme caution. There are behaviors that I consider to be extremely suspicious; however, you must determine for yourself whether it's safe to use. This may just be how the client works. I do not know, and I can't be bothered to test it.
------------------------------------------------------------------------------------------------------------------------------------------------

SnowHack | 1.12.2 | Java Edition | Forge Mod
VirusTotal Link for the *.zip the Forge Mod is Contained In
SHA-256 Hash: 7100C8D59CE06B279F7D03D834FC2C361F10BEAE913575FC7EFA74E498167D2C
SnowHack triggered 10/62 engines.
Note: Upon closer inspection of the VirusTotal scan, the client appears to be of the injected flavor. Injectors are commonly flagged as false-positive Trojans. The client appears to be safe, and there was nothing in the VirusTotal scan that is atypical for an injector. While this suggests the file is safe, use with caution.
AnonFile Link for the *.zip File
Note: This is a cracked client! The crack is pre-done, so no additional work is required to use -- just put it in your Forge Mods folder and click play.
------------------------------------------------------------------------------------------------------------------------------------------------
submitted by Daemris to minecraftclients [link] [comments]

I built a decentralized legal-binding smart contract system. I need peer reviewers and whitepaper proof readers. Help greatly appreciated!

I posted this on /cryptotechnology. It attracted quite a few upvotes but not many potential contributors. Someone mentioned I should try this sub. I read the rules and it seems to fit within them. Hope this kind of post is alright here...
EDIT: My mother language is french (I'm from Montreal/Canada). Please excuse any blatant grammatical errors.
TLDR: I built a decentralized legal-binding smart contract system. I need peer reviewers and whitepaper proof readers. If you're interested, send me an email to discuss: [email protected] . Thanks in advance!
Hi guys,
For the last few years, I've been working on a decentralized legal-binding contract system. Basically, I created PoW blockchain software that can receive a hash as an address, and another hash as a bucket, in each transaction.
The address hash is used to tell a specific entity (application/contract/company/person, etc) that uses the blockchain that this transaction might be addressed to them. The bucket hash simply tells the nodes which hashtree of files they need to download in order to execute that contract.
The buckets are shared within the network of nodes. Someone could, for example, write a contract with a series of nodes in order to host their data for them. Buckets can hold any kind of data, and can be of any size... including encrypted data.
The blockchain's blocks are chained together using a mining system similar to Bitcoin's (hashcash algorithm). Each block contains transactions. The required difficulty increases linearly with the number of transactions in a block. Then, when a block is mined properly, another smaller mining effort is requested to link the block to the network's head block.
To replace a block, you need to create another block containing more transactions than the total number transacted in the mined block and every block after it.
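To give a concrete picture of that mining loop, here is a toy hashcash-style sketch in Python; the linear difficulty rule and the field names are simplified placeholders rather than the actual implementation:

```python
# Toy hashcash-style mining: find a nonce so the block hash has `difficulty`
# leading zero hex digits. Difficulty grows with the number of transactions,
# loosely mirroring the "more transactions -> harder block" rule described above.
import hashlib
import json

def mine_block(prev_hash: str, transactions: list, base_difficulty: int = 3) -> dict:
    difficulty = base_difficulty + len(transactions) // 10  # assumed linear rule
    target = "0" * difficulty
    nonce = 0
    while True:
        header = json.dumps(
            {"prev": prev_hash, "txs": transactions, "nonce": nonce},
            sort_keys=True,
        )
        digest = hashlib.sha256(header.encode()).hexdigest()
        if digest.startswith(target):
            return {"hash": digest, "nonce": nonce, "difficulty": difficulty}
        nonce += 1

# Each transaction carries an address hash and a bucket hash, as described above.
txs = [{"address_hash": "ab12...", "bucket_hash": "cd34..."}]
print(mine_block(prev_hash="0000beef...", transactions=txs))
```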
I expect current payment processors to begin accepting transactions and mining them for their customers in parallel, making money with fees. Using such a mechanism, miners will need a lot of bandwidth available in order to keep downloading the blocks of other miners, just like current payment processors.
Contracts are code written in our custom programming language. Their code is pushed using a transaction and hosted in buckets. As you can see, the contract's data are off-chain; only its bucket hash is on-chain. A contract can be used to listen to events that occur on the blockchain, in any bucket hosted by nodes, or on any website that can be crawled and parsed in the contract.
There is also an identity system and a vouching system... which enable the creation of soft money (a promise of future payment in hard money (our cryptocurrency) if a series of events occurs).
The contracts can also be compiled to a legal-binding framework and potentially be used in court. The contracts currently compile to English and French only.
I also built a browser that contains a 3D viewport, using OpenGL. The browser contains a domain name system (DNS) in the form of contracts. Anyone can buy a new domain by creating a transaction with a bucket that contains code to reserve a specific name. When a user requests a domain name, the browser discovers the bucket that is attached to the domain, downloads that bucket and executes its scripts... which render in the 3D viewport.
When people interact with an application, the application can create contracts on behalf of the user and send them to the blockchain via a transaction. This enables normal users (non-developers) to interact with others using legal contracts, through GUI software.
The hard money (cryptocurrency) is all pre-mined and will be sold to entities (people/companies) that want to use the network. The hard money can be re-sold using the contract proposition system, for payment in cash or a bank transfer. The fiat funds will go to my company in order to create services that use this specific network of contracts. The goal is to use the funds to make the network grow and increase its demand for hard money. For now, we plan to create:
A logistic and transportation company
A delivery company
A company that buys and sells real estate options
A company that manages real estate
A software development company
A world-wide fiat money transfer company
A payment processor company
We chose these niches because our team has a lot of experience in these areas: we currently run companies in these fields. These niches also generate a lot of revenue and expenses, making the value of exchanges high. We expect this to drive volume in contracts, soft-money and hard-money exchanges.
We also plan to use the funds to create a venture capital fund that invests in startups that want to create contracts on our network to execute a specific service in a specific niche.
I'm about to release the software open source very soon and begin executing our commercial activities on the network. Before launching, I'd like to open a discussion with the community regarding the details of how this software works and how it is explained in the whitepaper.
If you'd like to read the whitepaper and open a discussion with me regarding how things work, please send me an email at [email protected] .
If you have any comment, please comment below and Ill try to answer every question. Please note that before peer-reviewing the software and the whitepaper, I'd like to keep the specific details of the software private, but can discuss the general details. A release date will be given once my work has been peer reviewed.
Thanks all in advance!
P.S: This project is not a competitor to Bitcoin. My goal with this project is to enable companies to write contracts together, easily follow the events executed in their contracts, understand what to expect from their partnerships and what they need to give in order to receive their share of deals... and sell contracts they no longer need to other community members.
Bitcoin already has a network of people that uses it. It has its own value. In fact, I plan to create contracts on our network to exchange value from our network for bitcoin and vice versa. The same goes for any commodity and currency that currently exists in this world.
submitted by steve-rodrigue to compsci [link] [comments]

August / September monthly report from v1docq47 (CCS + XMR.RU)

This is my monthly progress report (CCS + XMR.RU).
Below is a list of what has been done and translated into Russian for two months of my work.

Monero Video (YouTube)

The following videos were posted on the Monero Russian Community YouTube Channel.

Weekly News:

Short Q&A about Monero:

Monero into Russian (Translation)

The following articles / guides have been translated into Russian and posted on the XMR.RU website and my Github repository.
Note: If you would like to read the original article in English, open the article you are interested in; at the end of each article you will find a link to the source.

Critical Decentralisation Cluster 36c3 (transcriptions (EN + RU) + translation (RU)):

- 01 - Monero Introduction (Diego "rehrar" Salazar) | Transcriptions - EN / RU / XMR.RU
- 02 - RIAT Introduction (parasew) | Transcriptions - EN / RU / XMR.RU
- 03 - Swiss Cryptoeconomics Assembly (polto, Ome) | Transcriptions - EN / RU / XMR.RU
- 04 - Namecoin Introduction (Jeremy Rand) | Transcriptions - EN / RU / XMR.RU
- 05 - Open Hardware developed at FOSSASIA (Mario Behling) | Transcriptions - EN / RU / XMR.RU
- 06 - Paralelni Polis (Juraj Bednar) | Transcriptions - EN / RU / XMR.RU
- 07 - Introduction to Replicant (dllud, Denis ‘GNUtoo’ Carikli) | Transcriptions - EN / RU / XMR.RU
- 08 - Open Source Hardware and OSHWA (Drew Fustini) | Transcriptions - EN / RU / XMR.RU
- 09 - ImplicitCAD (Juila Longtin) | Transcriptions - EN / RU / XMR.RU
- 10 - Program in Detail | Transcriptions - EN / RU / XMR.RU
- 11 - about:freedom (Bonnie Mehring, Blipp) | Transcriptions - EN / RU / XMR.RU
- 13 - Funding Models of FOSS (Diego “rehrar” Salazar) | Transcriptions - EN / RU / XMR.RU
- 14 - The Sharp Forks We Follow | Transcriptions - EN / RU / XMR.RU
- 16 - P2P Trading in Cryptoanarchy | Transcriptions - EN / RU / XMR.RU
- 17 - Monero’s Adaptive Blockweight Approach to Scaling | Transcriptions - EN / RU / XMR.RU
- 18 - Nym (Harry Halpin) | Transcriptions - EN / RU / XMR.RU
- 19 - Digital Integrity of the Human Person | Transcriptions - EN / RU / XMR.RU
- 20 - cyber~Congress (Sergey Simanovsky) | Transcriptions - EN / RU / XMR.RU
- 21 - KYC & Crypto-AML Tools (polto) | Transcriptions - EN / RU / XMR.RU
- 22 - Parallel Polis, Temporary Autonomous Zones and Beyond | Transcriptions - EN / RU
- 23 - MandelBot:HAB - Open Source Ecotecture and Horizontalism | Transcriptions - EN / RU
- 24 - Adventures and Experiments Adding Namecoin to Tor Browser | Transcriptions - EN / RU
- 25 - Fair Data Society (Gregor Zavcer) | Transcriptions - EN / RU / XMR.RU
- 45 - Designing a Communal Computing Interface | Transcriptions - EN / RU / XMR.RU
- 47 - Hackatoshi’s Flying Circuit | Transcriptions - EN / RU / XMR.RU

Zero to Monero - Second Edition

https://www.overleaf.com/read/hcmqnvgtfmyh
- Chapter 00 - Abstract
- Chapter 01 - Introduction
- Chapter 02 - Basic Concepts
- Chapter 03 - Advanced Schnorr-like Signatures

Monero Outreach Articles

Getmonero.org Posts Blog

LocalMonero Articles

Note: You need to use "Change Language" to switch the site to Russian.
- Why Monero Has A Tail Emission
- How CLSAG Will Improve Monero's Efficiency
- How Monero Solved the Block Size Problem That Plagues Bitcoin
- How Ring Signatures Obscure Monero's Outputs
- Monero Best Practices for Beginners
- Monero Outputs Explained

Monero Meeting logs

CCS Result / Report

Monero News

Other Articles

Pull / Merge Request

Monero Project Translations (Weblate)

Thanks for your support!
submitted by v1docq47 to Monero [link] [comments]

Two Prime, under the radar coin worth looking into.

Two Prime has released their FF1 MacroToken.
"We show how this methodology can be applied as an Open Source application, in the vein of BTC and ETH, with all the creative and value generative potential that comes along with it. We leverage store of value functions of cryptocurrencies to arrive at value creation and accretion in the real economy by the intermediary of crypto exchanges on which we propose to provide protective measures. We detail treasury and reserve formation for the Open Source Finance Foundation, describe its relation to Two Prime and detail the emission of a new crypto-asset called the FF1 Token.
We seek liquidity for the FF1 treasury within the secondary exchanges for the purpose of applying M4 in the real world, both in the private and public sector. We first apply this to the vertical of cryptocurrencies while outlining the genericity and stability of the model, which we intend to apply to esoteric financial needs (e.g. Smart City financing). In so doing, we extend the scope and control of applications that a system of digital units of value stored on decentralized, public ledgers can aim to advance. We call this approach Open Source Finance and the resulting coin class a MacroToken.
MODERN MONETARY THEORY FRAMEWORK
Modern Monetary Theory states two interdependent phenomenological axioms, and the banking system operates on a resulting syllogism:
In the past 10 years, the formation and emergence of BTC and ETH have verifiably falsified Axiom 2 [1]. The phenomenon of crypto-currencies has created ab-initio global stores of value of type 1a. Crypto-currencies have displaced trust by means of government violence and associated, implied violence with, instead, open source distribution, cloud computing, objective mathematics, and the algorithmic integrity of blockchain ledgers. The first "killer app" of these open source ledgers is stores of value, e.g. Bitcoin, or "open source money" as it was first characterized by its semi-anonymous creators. Leading crypto-currencies have proven themselves as viable global stores of value. They are regulated as gold is in the United States. However, as type 1a units of value, they have tended towards high volatility, inevitably leading to speculative market behavior and a near-zero "real" asset backing or floor price [2], albeit with an aggregate value of $350bn of ab-initio creation.
We therefore advance Axiom 2 to Axiom 2’
At N < 1 we have dilutive debasement of fungible units of value, aka inflation. At N = 1, the new monies are therefore stable coins. At N > 1, these tokens are designed to grow with demand. Axiom Two Prime (or 2') displaces government-endorsed violence as our macro-social organizing principle with algorithmic objectivity and verifiable transparency. This occurs within the landscape we call Open Source Finance.
THE TWO PRIME MODEL
Two Prime refers to the financial management company managing the OSFF. FF1 refers to the Macro Token of the OSFF. The first stage is reserve and treasury formation, the second stage describes the mechanics of the public markets and the protective measures of the reserves and third stage is treasury liquidity via the Continuous Token Offering both in public and private markets. We will now describe these in more detail.
MACRO INVESTMENT THESIS AND RATIONALE FOR FF1
The FF1 MacroToken is a synthetic token based on the proven killer applications of crypto-currencies. More than a decade after the inception of blockchain technology, the killer apps of crypto are already here, and they are primarily financial, not technical. The historical killer apps are:
The FF1 MacroToken is a pot-pourri of these features: a synthetic token that mixes the best-of-breed practices of crypto, combining Store-of-Value, Capital Formation and Fractional Asset-Backing.
Treasury Generation: Ab-Initio Store of Value
On the supply side, the OSFF is creating 100,000,000 FF1 MacroTokens, which it keeps in treasury. They are pure stores of value, for they have no assets backing them at birth. They are ab-initio instruments. The FF1 MacroTokens are listed on public crypto exchanges. Two Prime operates market-making for these stores of value.
Treasury Management: Supply-Side Tokenomics
All FF1 are held in the Open Source Finance Foundation treasury. Crypto assets that enter the treasury are, at first, not traded. The FF1 supply will be offered upon sufficient demand, which Two Prime generates publicly and privately. The total supply will be finite in total units (100,000,000) but variable in its aggregate value, for supply and demand will make the price move. The proceeds are the property of the OSFF (not Two Prime), and Two Prime invests the liquid treasury (post FF1 liquidation) in crypto assets to protect against depreciation and create a macro-hedge reserve and/or floor for the price. It should be noted that the price and the NAV of assets are, by design, not equal. In other words, the additional OSFF treasury is locked and can enter circulation if, and only if, there is a corresponding demand, which is then invested in crypto assets with a target value N > 1. This results in fractional asset-backing at first.
EXCHANGES, CONTINUOUS TOKEN OFFERING, AND DEMAND-SIDE TOKENOMICS
Public Exchanges: Two Prime will maintain listings for the FF1 Tokens on behalf of the OSFF. Two Prime maintains market-making operations on public crypto exchanges on behalf of the OSFF.
Continuous Token Offering: Two Prime works on creating new liquidity for the FF1 MacroTokens to comply with the supply-side constraints detailed above, namely that a token enters circulation only when matched by demand. Two Prime does demand generation both publicly, as above, and privately. This CTO results in something akin to a reverse ICO, letting the reserves be set by public trading and then marketing to private investors (accredited US investors, for example) after the public liquidity event. Demand generation is done via marketing to relevant audiences, e.g. as a macro way to HODL with exclusive private equity investments for crypto holders, and as a diversified and de-risked way to gain crypto exposure for fiat holders (Sharpe ratio: 1.55, beta to BTC: 0.75).
PARTNER NETWORK, USE OF PROCEEDS, ACCRETION AND FLOOR PROTECTION
Though this mathematical approach allows for a broad and differentiated set of financial applications and outcomes, Two Prime's founding members will first apply this work to the realm of project finance within the blockchain space via algorithmic balancing of an equity- and debt-based treasury consisting of real crypto assets and future cash flows.
Proof of Value Mining in the Partner Network: Funds and projects can apply to the foundation for financing. This is the partner network, and it is akin to the way a network of miners secures the chain. Here, a network of partners protects the value. The Foundation invests the proceeds in liquid crypto assets, interest-bearing crypto assets and equity crypto assets via partner funds, creating a bridge to the real economy (crypto companies) in the last step. The Foundation holds these (real economic) assets.
M4 Asset Mix: The funds raised are invested in public- and private-sector projects. We consider the following mix:
This completes the M4 step and the flow of funds for the FF1 Token. It shows a feedback loop, for the Foundation can buy back its token, leading to idiosyncratic tokenomics: the FF1 Token has a fixed (and potentially diminishing) SUPPLY alongside (potentially increasing) endogenous and exogenous DEMAND."
This seems pretty interesting imo, thoughts?
submitted by Stock-Accountant to CryptoMoonShots [link] [comments]

Scaling Reddit Community Points with Arbitrum Rollup: a piece of cake

Submitted for consideration to The Great Reddit Scaling Bake-Off
Baked by the pastry chefs at Offchain Labs
Please send questions or comments to [email protected]
1. Overview
We're excited to submit Arbitrum Rollup for consideration to The Great Reddit Scaling Bake-Off. Arbitrum Rollup is the only Ethereum scaling solution that supports arbitrary smart contracts without compromising on Ethereum's security or adding points of centralization. For Reddit, this means that Arbitrum can not only scale the minting and transfer of Community Points, but it can foster a creative ecosystem built around Reddit Community Points enabling points to be used in a wide variety of third party applications. That's right -- you can have your cake and eat it too!
Arbitrum Rollup isn't just Ethereum-style. Its Layer 2 transactions are byte-for-byte identical to Ethereum, which means Ethereum users can continue to use their existing addresses and wallets, and Ethereum developers can continue to use their favorite toolchains and development environments out-of-the-box with Arbitrum. Coupling Arbitrum’s tooling-compatibility with its trustless asset interoperability, Reddit not only can scale but can onboard the entire Ethereum community at no cost by giving them the same experience they already know and love (well, certainly know).
To benchmark how Arbitrum can scale Reddit Community Points, we launched the Reddit contracts on an Arbitrum Rollup chain. Since Arbitrum provides full Solidity support, we didn't have to rewrite the Reddit contracts or try to mimic their functionality using an unfamiliar paradigm. Nope, none of that. We launched the Reddit contracts unmodified on Arbitrum Rollup complete with support for minting and distributing points. Like every Arbitrum Rollup chain, the chain included a bridge interface in which users can transfer Community Points or any other asset between the L1 and L2 chains. Arbitrum Rollup chains also support dynamic contract loading, which would allow third-party developers to launch custom ecosystem apps that integrate with Community Points on the very same chain that runs the Reddit contracts.
1.1 Why Ethereum
Perhaps the most exciting benefit of distributing Community Points using a blockchain is the ability to seamlessly port points to other applications and use them in a wide variety of contexts. Applications may include simple transfers such as a restaurant that allows Redditors to spend points on drinks. Or it may include complex smart contracts -- such as placing Community Points as a wager for a multiparty game or as collateral in a financial contract.
The common denominator between all of the fun uses of Reddit points is that it needs a thriving ecosystem of both users and developers, and the Ethereum blockchain is perhaps the only smart contract platform with significant adoption today. While many Layer 1 blockchains boast lower cost or higher throughput than the Ethereum blockchain, more often than not, these attributes mask the reality of little usage, weaker security, or both.
Perhaps another platform with significant usage will rise in the future. But today, Ethereum captures the mindshare of the blockchain community, and for Community Points to provide the most utility, the Ethereum blockchain is the natural choice.
1.2 Why Arbitrum
While Ethereum's ecosystem is unmatched, the reality is that fees are high and capacity is too low to support the scale of Reddit Community Points. Enter Arbitrum. Arbitrum Rollup provides all of the ecosystem benefits of Ethereum, but with orders of magnitude more capacity and at a fraction of the cost of native Ethereum smart contracts. And most of all, we don't change the experience from users. They continue to use the same wallets, addresses, languages, and tools.
Arbitrum Rollup is not the only solution that can scale payments, but it is the only developed solution that can scale both payments and arbitrary smart contracts trustlessly, which means that third party users can build highly scalable add-on apps that can be used without withdrawing money from the Rollup chain. If you believe that Reddit users will want to use their Community Points in smart contracts--and we believe they will--then it makes the most sense to choose a single scaling solution that can support the entire ecosystem, eliminating friction for users.
We view being able to run smart contracts in the same scaling solution as fundamentally critical since if there's significant demand in running smart contracts from Reddit's ecosystem, this would be a load on Ethereum and would itself require a scaling solution. Moreover, having different scaling solutions for the minting/distribution/spending of points and for third party apps would be burdensome for users as they'd have to constantly shuffle their Points back and forth.
2. Arbitrum at a glance
Arbitrum Rollup has a unique value proposition as it offers a combination of features that no other scaling solution achieves. Here we highlight its core attributes.
Decentralized. Arbitrum Rollup is as decentralized as Ethereum. Unlike some other Layer 2 scaling projects, Arbitrum Rollup doesn't have any centralized components or centralized operators who can censor users or delay transactions. Even in non-custodial systems, centralized components provide a risk as the operators are generally incentivized to increase their profit by extracting rent from users often in ways that severely degrade user experience. Even if centralized operators are altruistic, centralized components are subject to hacking, coercion, and potential liability.
Massive Scaling. Arbitrum achieves order of magnitude scaling over Ethereum's L1 smart contracts. Our software currently supports 453 transactions-per-second for basic transactions (at 1616 Ethereum gas per tx). We have a lot of room left to optimize (e.g. aggregating signatures), and over the next several months capacity will increase significantly. As described in detail below, Arbitrum can easily support and surpass Reddit's anticipated initial load, and its capacity will continue to improve as Reddit's capacity needs grow.
Low cost. The cost of running Arbitrum Rollup is quite low compared to L1 Ethereum and other scaling solutions such as those based on zero-knowledge proofs. Layer 2 fees are low, fixed, and predictable and should not be overly burdensome for Reddit to cover. Nobody needs to use special equipment or high-end machines. Arbitrum requires validators, which is a permissionless role that can be run on any reasonable on-line machine. Although anybody can act as a validator, in order to protect against a “tragedy of the commons” and make sure reputable validators are participating, we support a notion of “invited validators” that are compensated for their costs. In general, users pay (low) fees to cover the invited validators’ costs, but we imagine that Reddit may cover this cost for its users. See more on the costs and validator options below.
Ethereum Developer Experience. Not only does Arbitrum support EVM smart contracts, but the developer experience is identical to that of L1 Ethereum contracts and fully compatible with Ethereum tooling. Developers can port existing Solidity apps or write new ones using their favorite and familiar toolchains (e.g. Truffle, Buidler). There are no new languages or coding paradigms to learn.
Ethereum wallet compatibility. Just as in Ethereum, Arbitrum users need only hold keys, but do not have to store any coin history or additional data to protect or access their funds. Since Arbitrum transactions are semantically identical to Ethereum L1 transactions, existing Ethereum users can use their existing Ethereum keys with their existing wallet software such as Metamask.
Token interoperability. Users can easily transfer their ETH, ERC-20 and ERC-721 tokens between Ethereum and the Arbitrum Rollup chain. As we explain in detail below, it is possible to mint tokens in L2 that can subsequently be withdrawn and recognized by the L1 token contract.
Fast finality. Transactions complete with the same finality time as Ethereum L1 (and it's possible to get faster finality guarantees by trading away trust assumptions; see the Arbitrum Rollup whitepaper for details).
Non-custodial. Arbitrum Rollup is a non-custodial scaling solution, so users control their funds/points and neither Reddit nor anyone else can ever access or revoke points held by users.
Censorship Resistant. Since it's completely decentralized, and the Arbitrum protocol guarantees progress trustlessly, Arbitrum Rollup is just as censorship-proof as Ethereum.
Block explorer. The Arbitrum Rollup block explorer allows users to view and analyze transactions on the Rollup chain.
Limitations
Although this is a bake-off, we're not going to sugar coat anything. Arbitrum Rollup, like any Optimistic Rollup protocol, does have one limitation, and that's the delay on withdrawals.
As for the concrete length of the delay, we've done a good deal of internal modeling and have blogged about this as well. Our current modeling suggests a 3-hour delay is sufficient (but as discussed in the linked post there is a tradeoff space between the length of the challenge period and the size of the validators’ deposit).
Note that this doesn't mean that the chain is delayed for three hours. Arbitrum Rollup supports pipelining of execution, which means that validators can keep building new states even while previous ones are “in the pipeline” for confirmation. As the challenge delays expire for each update, a new state will be confirmed (read more about this here).
So activity and progress on the chain are not delayed by the challenge period. The only thing that's delayed is the consummation of withdrawals. Recall though that any single honest validator knows immediately (at the speed of L1 finality) which state updates are correct and can guarantee that they will eventually be confirmed, so once a valid withdrawal has been requested on-chain, every honest party knows that the withdrawal will definitely happen. There's a natural place here for a liquidity market in which a validator (or someone who trusts a validator) can provide withdrawal loans for a small interest fee. This is a no-risk business for them as they know which withdrawals will be confirmed (and can force their confirmation trustlessly no matter what anyone else does) but are just waiting for on-chain finality.
3. The recipe: How Arbitrum Rollup works
For a description of the technical components of Arbitrum Rollup and how they interact to create a highly scalable protocol with a developer experience that is identical to Ethereum, please refer to the following documents:
Arbitrum Rollup Whitepaper
Arbitrum academic paper (describes a previous version of Arbitrum)
4. Developer docs and APIs
For full details about how to set up and interact with an Arbitrum Rollup chain or validator, please refer to our developer docs, which can be found at https://developer.offchainlabs.com/.
Note that the Arbitrum version described on that site is older and will soon be replaced by the version we are entering in Reddit Bake-Off, which is still undergoing internal testing before public release.
5. Who are the validators?
As with any Layer 2 protocol, advancing the protocol correctly requires at least one validator (sometimes called block producers) that is honest and available. A natural question is: who are the validators?
Recall that the validator set for an Arbitrum chain is open and permissionless; anyone can start or stop validating at will. (A useful analogy is to full nodes on an L1 chain.) But we understand that even though anyone can participate, Reddit may want to guarantee that highly reputable nodes are validating their chain. Reddit may choose to validate the chain themselves and/or hire third-party validators. To this end, we have begun building a marketplace for validator-for-hire services so that dapp developers can outsource validation services to reputable nodes with high up-time. We've announced a partnership in which Chainlink nodes will provide Arbitrum validation services, and we expect to announce more partnerships shortly with other blockchain infrastructure providers.
Although there is no requirement that validators are paid, Arbitrum’s economic model tracks validators’ costs (e.g. amount of computation and storage) and can charge small fees on user transactions, using a gas-type system, to cover those costs. Alternatively, a single party such as Reddit can agree to cover the costs of invited validators.
6. Reddit Contract Support
Since Arbitrum contracts and transactions are byte-for-byte compatible with Ethereum, supporting the Reddit contracts is as simple as launching them on an Arbitrum chain.
Minting. Arbitrum Rollup supports hybrid L1/L2 tokens which can be minted in L2 and then withdrawn onto the L1. An L1 contract at address A can make a special call to the EthBridge which deploys a "buddy contract" to the same address A on an Arbitrum chain. Since it's deployed at the same address, users can know that the L2 contract is the authorized "buddy" of the L1 contract on the Arbitrum chain.
For minting, the L1 contract is a standard ERC-20 contract which mints and burns tokens when requested by the L2 contract. It is paired with an ERC-20 contract in L2 which mints tokens based on whatever programmer provided minting facility is desired and burns tokens when they are withdrawn from the rollup chain. Given this base infrastructure, Arbitrum can support any smart contract based method for minting tokens in L2, and indeed we directly support Reddit's signature/claim based minting in L2.
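As a rough illustration of this flow (a toy Python model for the sketch above, not the actual contract code), minting happens on the L2 side, and a withdrawal burns tokens on L2 while releasing the same amount from the paired L1 contract:

```python
# Toy model of a hybrid L1/L2 token (illustration only, not the real contracts):
# tokens are minted on the L2 "buddy" contract and, on withdrawal, burned on L2
# and minted by the paired L1 ERC-20 contract.
class L1Token:
    def __init__(self):
        self.balances = {}

    def mint_from_l2(self, user: str, amount: int) -> None:
        # In the real system this path is only reachable via the rollup's withdrawal flow.
        self.balances[user] = self.balances.get(user, 0) + amount


class L2Token:
    def __init__(self, l1_buddy: L1Token):
        self.balances = {}
        self.l1_buddy = l1_buddy

    def claim(self, user: str, amount: int) -> None:
        # Stand-in for whatever programmer-provided minting facility runs on L2.
        self.balances[user] = self.balances.get(user, 0) + amount

    def withdraw(self, user: str, amount: int) -> None:
        assert self.balances.get(user, 0) >= amount, "insufficient L2 balance"
        self.balances[user] -= amount             # burn on L2
        self.l1_buddy.mint_from_l2(user, amount)  # release on L1


l1 = L1Token()
l2 = L2Token(l1)
l2.claim("alice", 100)
l2.withdraw("alice", 40)
print(l2.balances["alice"], l1.balances["alice"])  # 60 40
```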
Batch minting. What's better than a mint cookie? A whole batch! In addition to supporting Reddit’s current minting/claiming scheme, we built a second minting design, which we believe outperforms the signature/claim system in many scenarios.
In the current system, Reddit periodically issues signed statements to users, who then take those statements to the blockchain to claim their tokens. An alternative approach would have Reddit directly submit the list of users/amounts to the blockchain and distribute the tokens to the users without the signature/claim process.
To optimize the cost efficiency of this approach, we designed an application-specific compression scheme to minimize the size of the batch distribution list. We analyzed the data from Reddit's previous distributions and found that the data is highly compressible since token amounts are small and repeated, and addresses appear multiple times. Our function groups transactions by size, and replaces previously-seen addresses with a shorter index value. We wrote client code to compress the data, wrote a Solidity decompressing function, and integrated that function into Reddit’s contract running on Arbitrum.
When we ran the compression function on the previous Reddit distribution data, we found that we could compress batched minting data down to 11.8 bytes per minting event (averaged over a 6-month trace of Reddit’s historical token grants), compared with roughly 174 bytes of on-chain data needed for the signature/claim approach to minting (roughly 43 for an RLP-encoded null transaction + 65 for Reddit's signature + 65 for the user's signature + roughly 8 for the number of Points).
The relative benefit of the two approaches with respect to on-chain call data cost depends on the percentage of users that will actually claim their tokens on chain. With the above figures, batch minting will be cheaper if roughly 5% of users redeem their claims. We stress that our compression scheme is not Arbitrum-specific and would be beneficial in any general-purpose smart contract platform.
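For intuition, here is a simplified Python sketch of the back-referencing idea described above; it is an illustration of the scheme, not the actual client or Solidity code, and it leaves out the grouping-by-amount step:

```python
# Simplified sketch of the batch-minting compression idea: a full address is sent
# only the first time it appears; repeats become a small back-reference index.
from typing import List, Tuple, Union

Entry = Tuple[str, Union[str, int], int]  # ("addr", address, amount) or ("idx", index, amount)

def compress_batch(grants: List[Tuple[str, int]]) -> List[Entry]:
    seen = {}               # address -> index of first appearance
    out: List[Entry] = []
    for addr, amount in grants:
        if addr in seen:
            out.append(("idx", seen[addr], amount))   # short back-reference
        else:
            seen[addr] = len(seen)
            out.append(("addr", addr, amount))        # full address, first sighting
    return out

batch = [("0xAb5801a7...", 50), ("0x00a329c0...", 50), ("0xAb5801a7...", 25)]
print(compress_batch(batch))
```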
8. Benchmarks and costs
In this section, we give the full costs of operating the Reddit contracts on an Arbitrum Rollup chain including the L1 gas costs for the Rollup chain, the costs of computation and storage for the L2 validators as well as the capital lockup requirements for staking.
Arbitrum Rollup is still on testnet, so we did not run mainnet benchmarks. Instead, we measured the L1 gas cost and L2 workload for Reddit operations on Arbitrum and calculated the total cost assuming current Ethereum gas prices. As noted below in detail, our measurements do not assume that Arbitrum is consuming the entire capacity of Ethereum. We will present the details of our model now, but for full transparency you can also play around with it yourself and adjust the parameters, by copying the spreadsheet found here.
Our cost model is based on measurements of Reddit’s contracts, running unmodified (except for the addition of a batch minting function) on Arbitrum Rollup on top of Ethereum.
On the distribution of transactions and frequency of assertions. Reddit's instructions specify the following minimum parameters that submissions should support:
Over a 5 day period, your scaling PoC should be able to handle:
  • 100,000 point claims (minting & distributing points)
  • 25,000 subscriptions
  • 75,000 one-off points burning
  • 100,000 transfers
We provide the full costs of operating an Arbitrum Rollup chain with this usage under the assumption that tokens are minted or granted to users in batches, but other transactions are uniformly distributed over the 5 day period. Unlike some other submissions, we do not make unrealistic assumptions that all operations can be submitted in enormous batches. We assume that batch minting is done in batches that use only a few percent on an L1 block’s gas, and that other operations come in evenly over time and are submitted in batches, with one batch every five minutes to keep latency reasonable. (Users are probably already waiting for L1 finality, which takes at least that long to achieve.)
We note that assuming that there are only 300,000 transactions that arrive uniformly over the 5 day period will make our benchmark numbers lower, but we believe that this will reflect the true cost of running the system. To see why, say that batches are submitted every five minutes (20 L1 blocks) and there's a fixed overhead of c bytes of calldata per batch, the cost of which will get amortized over all transactions executed in that batch. Assume that each individual transaction adds a marginal cost of t. Lastly assume the capacity of the scaling system is high enough that it can support all of Reddit's 300,000 transactions within a single 20-block batch (i.e. that there is more than c + 300,000*t bytes of calldata available in 20 blocks).
Consider what happens if c, the per-batch overhead, is large (which it is in some systems, but not in Arbitrum). In the scenario that transactions actually arrive at the system's capacity and each batch is full, then c gets amortized over 300,000 transactions. But if we assume that the system is not running at capacity--and only receives 300,000 transactions arriving uniformly over 5 days-- then each 20-block assertion will contain about 200 transactions, and thus each transaction will pay a nontrivial cost due to c.
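A quick back-of-the-envelope calculation makes the gap concrete; the values of c and t below are hypothetical placeholders, and only the batching arithmetic follows from the argument above:

```python
# Amortizing the per-batch overhead c over uniformly arriving transactions
# versus one fully packed mega-batch. c and t are made-up byte counts.
total_txs = 300_000
batches = (5 * 24 * 60) // 5         # one batch every 5 minutes over 5 days = 1440
txs_per_batch = total_txs / batches  # ~208 transactions per batch

c = 1_000   # hypothetical fixed calldata overhead per batch (bytes)
t = 12      # hypothetical marginal calldata per transaction (bytes)

per_tx_uniform = t + c / txs_per_batch       # what each tx really pays
per_tx_one_mega_batch = t + c / total_txs    # the overly optimistic assumption
print(round(per_tx_uniform, 2), round(per_tx_one_mega_batch, 4))
```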
We are aware that other proposals presented scaling numbers assuming that 300,000 transactions arrived at maximum capacity and was executed in a single mega-transaction, but according to our estimates, for at least one such report, this led to a reported gas price that was 2-3 orders of magnitude lower than it would have been assuming uniform arrival. We make more realistic batching assumptions, and we believe Arbitrum compares well when batch sizes are realistic.
Our model. Our cost model includes several sources of cost:
  • L1 gas costs: This is the cost of posting transactions as calldata on the L1 chain, as well as the overhead associated with each batch of transactions, and the L1 cost of settling transactions in the Arbitrum protocol.
  • Validator’s staking costs: In normal operation, one validator will need to be staked. The stake is assumed to be 0.2% of the total value of the chain (which is assumed to be $1 per user who is eligible to claim points). The cost of staking is the interest that could be earned on the money if it were not staked.
  • Validator computation and storage: Every validator must do computation to track the chain’s processing of transactions, and must maintain storage to keep track of the contracts’ EVM storage. The cost of computation and storage are estimated based on measurements, with the dollar cost of resources based on Amazon Web Services pricing.
It’s clear from our modeling that the predominant cost is for L1 calldata. This will probably be true for any plausible rollup-based system.
Our model also shows that Arbitrum can scale to workloads much larger than Reddit’s nominal workload, without exhausting L1 or L2 resources. The scaling bottleneck will ultimately be calldata on the L1 chain. We believe that cost could be reduced substantially if necessary by clever encoding of data. (In our design any compression / decompression of L2 transaction calldata would be done by client software and L2 programs, never by an L1 contract.)
9. Status of Arbitrum Rollup
Arbitrum Rollup is live on Ethereum testnet. All of the code written to date including everything included in the Reddit demo is open source and permissively licensed under the Apache V2 license. The first testnet version of Arbitrum Rollup was released on testnet in February. Our current internal version, which we used to benchmark the Reddit contracts, will be released soon and will be a major upgrade.
Both the Arbitrum design as well as the implementation are heavily audited by independent third parties. The Arbitrum academic paper was published at USENIX Security, a top-tier peer-reviewed academic venue. For the Arbitrum software, we have engaged Trail of Bits for a security audit, which is currently ongoing, and we are committed to have a clean report before launching on Ethereum mainnet.
10. Reddit Universe Arbitrum Rollup Chain
The benchmarks described in this document were all measured using the latest internal build of our software. When we release the new software upgrade publicly we will launch a Reddit Universe Arbitrum Rollup chain as a public demo, which will contain the Reddit contracts as well as a Uniswap instance and a Connext Hub, demonstrating how Community Points can be integrated into third party apps. We will also allow members of the public to dynamically launch ecosystem contracts. We at Offchain Labs will cover the validating costs for the Reddit Universe public demo.
If the folks at Reddit would like to evaluate our software prior to our public demo, please email us at [email protected] and we'd be more than happy to provide early access.
11. Even more scaling: Arbitrum Sidechains
Rollups are an excellent approach to scaling, and we are excited about Arbitrum Rollup which far surpasses Reddit's scaling needs. But looking forward to Reddit's eventual goal of supporting hundreds of millions of users, there will likely come a time when Reddit needs more scaling than any Rollup protocol can provide.
While Rollups greatly reduce costs, they don't break the linear barrier. That is, all transactions have an on-chain footprint (because all calldata must be posted on-chain), albeit a far smaller one than on native Ethereum, and the L1 limitations end up being the bottleneck for capacity and cost. Since Ethereum has limited capacity, this linear use of on-chain resources means that costs will eventually increase superlinearly with traffic.
The good news is that we at Offchain Labs have a solution in our roadmap that can satisfy this extreme-scaling setting as well: Arbitrum AnyTrust Sidechains. Arbitrum Sidechains are similar to Arbitrum Rollup, but deviate in that they name a permissioned set of validators. When a chain’s validators agree off-chain, they can greatly reduce the on-chain footprint of the protocol and require almost no data to be put on-chain. When validators can't reach unanimous agreement off-chain, the protocol reverts to Arbitrum Rollup. Technically, Arbitrum Sidechains can be viewed as a hybrid between state channels and Rollup, switching back and forth as necessary, and combining the performance and cost that state channels can achieve in the optimistic case, with the robustness of Rollup in other cases. The core technical challenge is how to switch seamlessly between modes and how to guarantee that security is maintained throughout.
Arbitrum Sidechains break through this linear barrier, while still maintaining a high level of security and decentralization. Arbitrum Sidechains provide the AnyTrust guarantee, which says that as long as any one validator is honest and available (even if you don't know which one will be), the L2 chain is guaranteed to execute correctly according to its code and guaranteed to make progress. Unlike in a state channel, offchain progress does not require unanimous consent, and liveness is preserved as long as there is a single honest validator.
Note that the trust model for Arbitrum Sidechains is much stronger than for typical BFT-style chains which introduce a consensus "voting" protocols among a small permissioned group of validators. BFT-based protocols require a supermajority (more than 2/3) of validators to agree. In Arbitrum Sidechains, by contrast, all you need is a single honest validator to achieve guaranteed correctness and progress. Notice that in Arbitrum adding validators strictly increases security since the AnyTrust guarantee provides correctness as long as any one validator is honest and available. By contrast, in BFT-style protocols, adding nodes can be dangerous as a coalition of dishonest nodes can break the protocol.
Like Arbitrum Rollup, the developer and user experiences for Arbitrum Sidechains will be identical to that of Ethereum. Reddit would be able to choose a large and diverse set of validators, and all that they would need to guarantee to break through the scaling barrier is that a single one of them will remain honest.
We hope to have Arbitrum Sidechains in production in early 2021, and thus when Reddit reaches the scale that surpasses the capacity of Rollups, Arbitrum Sidechains will be waiting and ready to help.
While the idea to switch between channels and Rollup to get the best of both worlds is conceptually simple, getting the details right and making sure that the switch does not introduce any attack vectors is highly non-trivial and has been the subject of years of our research (indeed, we were working on this design for years before the term Rollup was even coined).
12. How Arbitrum compares
We include a comparison to several other categories as well as specific projects when appropriate, and explain why we believe that Arbitrum is best suited for Reddit's purposes. We focus our attention on other Ethereum projects.
Payment only Rollups. Compared to Arbitrum Rollup, ZK-Rollups and other Rollups that only support token transfers have several disadvantages:
  • As outlined throughout the proposal, we believe that the entire draw of Ethereum is in its rich smart contracts support which is simply not achievable with today's zero-knowledge proof technology. Indeed, scaling with a ZK-Rollup will add friction to the deployment of smart contracts that interact with Community Points as users will have to withdraw their coins from the ZK-Rollup and transfer them to a smart contract system (like Arbitrum). The community will be best served if Reddit builds on a platform that has built-in, frictionless smart-contract support.
  • All other Rollup protocols of which we are aware employ a centralized operator. While it's true that users retain custody of their coins, the centralized operator can often profit from censoring, reordering, or delaying transactions. A common misconception is that since they're non-custodial protocols, a centralized sequencer does not pose a risk but this is incorrect as the sequencer can wreak havoc or shake down users for side payments without directly stealing funds.
  • Sidechain type protocols can eliminate some of these issues, but they are not trustless. Instead, they require trust in some quorum of a committee, often requiring two-third of the committee to be honest, compared to rollup protocols like Arbitrum that require only a single honest party. In addition, not all sidechain type protocols have committees that are diverse, or even non-centralized, in practice.
  • Plasma-style protocols have a centralized operator and do not support general smart contracts.
13. Concluding Remarks
While it's ultimately up to the judges’ palate, we believe that Arbitrum Rollup is the bakeoff choice that Reddit kneads. We far surpass Reddit's specified workload requirement at present, have much room to optimize Arbitrum Rollup in the near term, and have a clear path to get Reddit to hundreds of millions of users. Furthermore, we are the only project that gives developers and users the identical interface as the Ethereum blockchain and is fully interoperable and tooling-compatible, and we do this all without any new trust assumptions or centralized components.
But no matter how the cookie crumbles, we're glad to have participated in this bake-off and we thank you for your consideration.
About Offchain Labs
Offchain Labs, Inc. is a venture-funded New York company that spun out of Princeton University research, and is building the Arbitrum platform to usher in the next generation of scalable, interoperable, and compatible smart contracts. Offchain Labs is backed by Pantera Capital, Compound VC, Coinbase Ventures, and others.
Leadership Team
Ed Felten
Ed Felten is Co-founder and Chief Scientist at Offchain Labs. He is on leave from Princeton University, where he is the Robert E. Kahn Professor of Computer Science and Public Affairs. From 2015 to 2017 he served at the White House as Deputy United States Chief Technology Officer and senior advisor to the President. He is an ACM Fellow and member of the National Academy of Engineering. Outside of work, he is an avid runner, cook, and L.A. Dodgers fan.
Steven Goldfeder
Steven Goldfeder is Co-founder and Chief Executive Officer at Offchain Labs. He holds a PhD from Princeton University, where he worked at the intersection of cryptography and cryptocurrencies including threshold cryptography, zero-knowledge proof systems, and post-quantum signatures. He is a co-author of Bitcoin and Cryptocurrency Technologies, the leading textbook on cryptocurrencies, and he has previously worked at Google and Microsoft Research, where he co-invented the Picnic signature algorithm. When not working, you can find Steven spending time with his family, taking a nature walk, or twisting balloons.
Harry Kalodner
Harry Kalodner is Co-founder and Chief Technology Officer at Offchain Labs where he leads the engineering team. Before the company he attended Princeton as a Ph.D candidate where his research explored economics, anonymity, and incentive compatibility of cryptocurrencies, and he also has worked at Apple. When not up at 3:00am writing code, Harry occasionally sleeps.
submitted by hkalodner to ethereum [link] [comments]

Why I’m bullish on Zilliqa (long read)

Edit: TL;DR added in the comments
 
Hey all, I've been researching coins since 2017 and have gone through 100s of them in the last 3 years. I got introduced to blockchain via Bitcoin of course, analyzed Ethereum thereafter, and from that moment on I have had a keen interest in smart contract platforms. I’m passionate about Ethereum but I find Zilliqa to have a better risk-reward ratio, especially because Zilliqa has found an elegant balance between being secure, decentralized and scalable in my opinion.
 
Below I post my analysis of why from all the coins I went through I’m most bullish on Zilliqa (yes I went through Tezos, EOS, NEO, VeChain, Harmony, Algorand, Cardano etc.). Note that this is not investment advice and although it's a thorough analysis there is obviously some bias involved. Looking forward to what you all think!
 
Fun fact: the name Zilliqa is a play on ‘silica’ (silicon dioxide), as in “Silicon for the high-throughput consensus computer.”
 
This post is divided into (i) Technology, (ii) Business & Partnerships, and (iii) Marketing & Community. I’ve tried to make the technology part readable for a broad audience. If you’ve ever tried understanding the inner workings of Bitcoin and Ethereum you should be able to grasp most parts. Otherwise, just skim through and once you are zoning out head to the next part.
 
Technology and some more:
 
Introduction
 
The technology is one of the main reasons why I’m so bullish on Zilliqa. First thing you see on their website is: “Zilliqa is a high-performance, high-security blockchain platform for enterprises and next-generation applications.” These are some bold statements.
 
Before we deep dive into the technology let’s take a step back in time first as they have quite the history. The initial research paper from which Zilliqa originated dates back to August 2016: Elastico: A Secure Sharding Protocol For Open Blockchains where Loi Luu (Kyber Network) is one of the co-authors. Other ideas that led to the development of what Zilliqa has become today are: Bitcoin-NG, collective signing CoSi, ByzCoin and Omniledger.
 
The technical white paper was made public in August 2017 and since then they have achieved everything stated in it, and have also created their own open-source, intermediate-level smart contract language called Scilla (a functional programming language similar to OCaml).
 
Mainnet has been live since the end of January 2019 with daily transaction rates growing continuously. About a week ago mainnet reached 5 million transactions and 500,000+ addresses in total, with 2400 nodes keeping the network decentralized and secure. Circulating supply is nearing 11 billion and only mining rewards remain to be issued. The maximum supply is 21 billion, with annual inflation currently at 7.13% and set to decrease over time.
 
Zilliqa realized early on that the usage of public cryptocurrencies and smart contracts was increasing but decentralized, secure, and scalable alternatives were lacking in the crypto space. They proposed to apply sharding to a public smart contract blockchain where the transaction rate increases almost linearly with the number of nodes. More nodes = higher transaction throughput and increased decentralization. Sharding comes in many forms and Zilliqa uses network-, transaction- and computational sharding. Network sharding opens up the possibility of using transaction- and computational sharding on top. Zilliqa does not use state sharding for now. We’ll come back to this later.
 
Before we continue dissecting how Zilliqa achieves this from a technological standpoint, it’s good to keep in mind that making a blockchain decentralised, secure and scalable at the same time is still one of the main hurdles to widespread usage of decentralised networks. In my opinion this needs to be solved first before blockchains can get to the point where they can create and add large-scale value. So I invite you to read the next section to grasp the underlying fundamentals. Because after all, these premises need to be true, otherwise there isn’t a fundamental case to be bullish on Zilliqa, right?
 
Down the rabbit hole
 
How have they achieved this? Let’s define the basics first: key players on Zilliqa are the users and the miners. A user is anybody who uses the blockchain to transfer funds or run smart contracts. Miners are the (shard) nodes in the network who run the consensus protocol and get rewarded for their service in Zillings (ZIL). The mining network is divided into several smaller networks called shards, which is also referred to as ‘network sharding’. Miners subsequently are randomly assigned to a shard by another set of miners called DS (Directory Service) nodes. The regular shards process transactions and the outputs of these shards are eventually combined by the DS shard as they reach consensus on the final state. More on how these DS shards reach consensus (via pBFT) will be explained later on.
 
The Zilliqa network produces two types of blocks: DS blocks and Tx blocks. One DS Block consists of 100 Tx Blocks. And as previously mentioned there are two types of nodes concerned with reaching consensus: shard nodes and DS nodes. Whether a node becomes a shard node or a DS node is determined by the result of a PoW cycle (Ethash) at the beginning of the DS Block. All candidate mining nodes compete with each other and run the PoW (Proof-of-Work) cycle for 60 seconds, and the submissions achieving the highest difficulty are admitted to the network. To put it in perspective: the average difficulty for one DS node is ~2 Th/s, equaling 2,000,000 Mh/s or 55 thousand+ GeForce GTX 1070 / 8 GB GPUs at 35.4 Mh/s each. Each DS Block, 10 new DS nodes are admitted. A shard node needs to provide around 8.53 GH/s currently (around 240 GTX 1070s). Dual mining ETH/ETC and ZIL is possible and can be done via mining software such as Phoenix and Claymore. There are pools, and if you have large amounts of hashing power (Ethash) available you could mine solo.
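As a quick sanity check of those figures, here is a back-of-envelope calculation (the hash rates are the ones quoted above; the GPU counts are simply the required hash rate divided by a single card's throughput):

```python
# Back-of-envelope check of the mining numbers quoted above.
GPU_HASHRATE_MHS = 35.4   # Mh/s per GTX 1070 on Ethash (figure from the text)
DS_NODE_THS = 2.0         # ~2 Th/s needed for a DS-node submission
SHARD_NODE_GHS = 8.53     # ~8.53 GH/s needed for a shard-node submission

ds_node_mhs = DS_NODE_THS * 1_000_000     # 1 Th/s = 1,000,000 Mh/s
shard_node_mhs = SHARD_NODE_GHS * 1_000   # 1 GH/s = 1,000 Mh/s

print(f"DS node    ~ {ds_node_mhs / GPU_HASHRATE_MHS:,.0f} GTX 1070s")    # ~56,497
print(f"Shard node ~ {shard_node_mhs / GPU_HASHRATE_MHS:,.0f} GTX 1070s") # ~241
```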
 
The PoW cycle of 60 seconds is a short burst of peak performance and acts as an entry ticket to the network. The entry ticket serves as a sybil resistance mechanism and makes it incredibly hard for adversaries to spawn lots of identities and manipulate the network with them. After every 100 Tx Blocks, which corresponds to roughly 1.5 hours, this PoW process repeats. In between, no PoW needs to be done, meaning Zilliqa’s energy consumption to keep the network secure is low. For more detailed information on how mining works click here.
Okay, hats off to you. You have made it this far. Before we go any deeper down the rabbit hole we first must understand why Zilliqa goes through all of the above technicalities and understand a bit more what a blockchain on a more fundamental level is. Because the core of Zilliqa’s consensus protocol relies on the usage of pBFT (practical Byzantine Fault Tolerance) we need to know more about state machines and their function. Navigate to Viewblock, a Zilliqa block explorer, and just come back to this article. We will use this site to navigate through a few concepts.
 
We have established that Zilliqa is a public and distributed blockchain. Meaning that everyone with an internet connection can send ZILs, trigger smart contracts, etc. and there is no central authority who fully controls the network. Zilliqa and other public and distributed blockchains (like Bitcoin and Ethereum) can also be defined as state machines.
 
Taking the liberty of paraphrasing examples and definitions given by Samuel Brooks’ medium article, he describes the definition of a blockchain (like Zilliqa) as: “A peer-to-peer, append-only datastore that uses consensus to synchronize cryptographically-secure data”.
 
Next, he states that: “blockchains are fundamentally systems for managing valid state transitions”. For some more context, I recommend reading the whole medium article to get a better grasp of the definitions and understanding of state machines. Nevertheless, let’s try to simplify and compile it into a single paragraph. Take a traffic light as an example: all its states (red, amber, and green) are predefined, all possible outcomes are known, and it doesn’t matter if you encounter the traffic light today or tomorrow; it will still behave the same. Managing the states of a traffic light can be done by triggering a sensor on the road or pushing a button, resulting in one traffic light's state going from green to red (via amber) and another light's from red to green.
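A minimal Python sketch of such a state machine, just to make the traffic-light analogy concrete:

```python
# A toy state machine in the spirit of the traffic-light example above.
# Only predefined transitions are valid; anything else is rejected.
VALID_TRANSITIONS = {
    "green": "amber",
    "amber": "red",
    "red": "green",
}

def next_state(current: str) -> str:
    """Return the only valid next state, or raise if the transition is undefined."""
    if current not in VALID_TRANSITIONS:
        raise ValueError(f"unknown state: {current}")
    return VALID_TRANSITIONS[current]

state = "green"
for _ in range(4):
    state = next_state(state)
    print(state)   # prints: amber, red, green, amber
```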
 
With public blockchains like Zilliqa, this isn’t so straightforward and simple. It started with block #1 almost 1.5 years ago and every 45 seconds or so a new block linked to the previous block is added, resulting in a chain of blocks with transactions in them that everyone can verify, from block #1 to the current #647,000+ block. The state is ever changing and the states it can find itself in are infinite. And while a traffic light might work in tandem with various other traffic lights, it’s rather simple compared to a public blockchain, because Zilliqa consists of 2400 nodes who need to work together to achieve consensus on what the latest valid state is while some of these nodes may have latency or broadcast issues, drop offline, or deliberately try to attack the network.
 
Now go back to the Viewblock page, take a look at the number of transactions, addresses, and the block and DS height, and then hit refresh. As expected, you will see new incremented values for one or more of these parameters. And how did the Zilliqa blockchain manage to transition from the previous valid state to the latest valid state? By using pBFT to reach consensus on the latest valid state.
 
After having obtained the entry ticket, miners execute pBFT to reach consensus on the ever-changing state of the blockchain. pBFT requires a series of network communications between nodes and runs on CPU rather than GPU, so the total energy consumed to keep the blockchain secure, decentralized and scalable is low.
 
pBFT stands for practical Byzantine Fault Tolerance and is an optimization on the Byzantine Fault Tolerant algorithm. To quote Blockonomi: “In the context of distributed systems, Byzantine Fault Tolerance is the ability of a distributed computer network to function as desired and correctly reach a sufficient consensus despite malicious components (nodes) of the system failing or propagating incorrect information to other peers.” Zilliqa is such a distributed computer network and depends on the honesty of the nodes (shard and DS) to reach consensus and to continuously update the state with the latest block. If pBFT is a new term for you I can highly recommend the Blockonomi article.
 
The idea of pBFT was introduced in 1999 - one of the authors even won a Turing award for it - and it is well researched and applied in various blockchains and distributed systems nowadays. If you want more advanced information than the Blockonomi link provides click here. And if you’re in between Blockonomi and the University of Singapore read the Zilliqa Design Story Part 2 dating from October 2017.
Quoting from the Zilliqa tech whitepaper: “pBFT relies upon a correct leader (which is randomly selected) to begin each phase and proceed when the sufficient majority exists. In case the leader is byzantine it can stall the entire consensus protocol. To address this challenge, pBFT offers a view change protocol to replace the byzantine leader with another one.”
 
pBFT can tolerate fewer than ⅓ of the nodes being dishonest (offline counts as Byzantine = dishonest) and the consensus protocol will function without stalling or hiccups. Once more than ⅓ of the nodes are dishonest but no more than ⅔, the network will be stalled and a view change will be triggered to elect a new DS leader. Only when more than ⅔ of the nodes are dishonest do double-spend attacks become possible.
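To make those thresholds concrete, here is a small sketch for a 600-node committee such as a Zilliqa shard (the boundary handling is simplified; it just follows the ⅓ / ⅔ bands described above):

```python
# Illustrative pBFT fault bands for a 600-node committee.
def pbft_status(dishonest: int, total: int = 600) -> str:
    if dishonest * 3 < total:        # fewer than 1/3 dishonest
        return "consensus proceeds normally"
    elif dishonest * 3 < 2 * total:  # between 1/3 and 2/3 dishonest
        return "network stalls, view change needed"
    else:                            # 2/3 or more dishonest
        return "safety lost: double-spends become possible"

for n in (150, 199, 200, 399, 400, 450):
    print(n, "dishonest ->", pbft_status(n))
```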
 
If the network stalls no transactions can be processed and one has to wait until a new honest leader has been elected. When the mainnet was just launched and in its early phases, view changes happened regularly. As of today the last stalling of the network - and view change being triggered - was at the end of October 2019.
 
Another benefit of using pBFT for consensus besides low energy is the immediate finality it provides. Once your transaction is included in a block and the block is added to the chain it’s done. Lastly, take a look at this article where three types of finality are being defined: probabilistic, absolute and economic finality. Zilliqa falls under the absolute finality (just like Tendermint for example). Although lengthy already we skipped through some of the inner workings from Zilliqa’s consensus: read the Zilliqa Design Story Part 3 and you will be close to having a complete picture on it. Enough about PoW, sybil resistance mechanism, pBFT, etc. Another thing we haven’t looked at yet is the amount of decentralization.
 
Decentralisation
 
Currently, there are four shards, each consisting of 600 nodes: 1 shard with 600 so-called DS nodes (Directory Service - they need to achieve a higher difficulty than shard nodes) and 1800 shard nodes, of which 250 are shard guards (centralized nodes controlled by the team). The number of shard guards has been steadily declining from 1200 in January 2019 to 250 as of May 2020. On the Viewblock statistics, you can see that many of the nodes are located in the US, but those are only the (CPU parts of the) shard nodes who perform pBFT. There is no data on where the PoW sources are coming from. And when the Zilliqa blockchain starts reaching its transaction capacity limit, a network upgrade needs to be executed to lift the current cap of 2400 nodes, allowing more nodes and the formation of more shards so the network can keep scaling according to demand.
Besides shard nodes there are also seed nodes. The main role of seed nodes is to serve as direct access points (for end-users and clients) to the core Zilliqa network that validates transactions. Seed nodes consolidate transaction requests and forward these to the lookup nodes (another type of nodes) for distribution to the shards in the network. Seed nodes also maintain the entire transaction history and the global state of the blockchain which is needed to provide services such as block explorers. Seed nodes in the Zilliqa network are comparable to Infura on Ethereum.
 
The seed nodes were first only operated by Zilliqa themselves, exchanges and Viewblock. Operators of seed nodes like exchanges had no incentive to open them for the greater public. They were centralised at first. Decentralisation at the seed nodes level has been steadily rolled out since March 2020 ( Zilliqa Improvement Proposal 3 ). Currently the amount of seed nodes is being increased, they are public-facing and at the same time PoS is applied to incentivize seed node operators and make it possible for ZIL holders to stake and earn passive yields. Important distinction: seed nodes are not involved with consensus! That is still PoW as entry ticket and pBFT for the actual consensus.
 
5% of the block rewards have been assigned to seed nodes (from the beginning in 2019) and those are used to pay out ZIL stakers. At an annual yield of 10.03%, that 5% of block rewards translates to roughly 610 MM ZIL in total that can be staked. Exchanges use the custodial variant of staking and wallets like Moonlet will use the non-custodial version (starting in Q3 2020). Staking is done by sending ZILs to a smart contract created by Zilliqa and audited by Quantstamp.
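A quick back-of-envelope check of those staking figures (all numbers taken from the paragraph above):

```python
# Rough arithmetic behind the staking figures quoted above.
stakeable_zil = 610_000_000   # ~610 MM ZIL available for staking
annual_yield = 0.1003         # 10.03% annual yield

# ZIL paid out to stakers per year, funded by the 5% of block rewards
# that have been assigned to seed nodes since 2019.
annual_staking_rewards = stakeable_zil * annual_yield
print(f"~{annual_staking_rewards / 1_000_000:.1f} MM ZIL per year")   # ~61.2 MM
```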
 
With a high number of DS and shard nodes, and seed nodes becoming more decentralized too, Zilliqa qualifies for the label of decentralized in my opinion.
 
Smart contracts
 
Let me start by saying I’m not a developer and my programming skills are quite limited. So I‘m taking the ELI5 route (maybe 12) but if you are familiar with Javascript, Solidity or specifically OCaml please head straight to Scilla - read the docs to get a good initial grasp of how Zilliqa’s smart contract language Scilla works and if you ask yourself “why another programming language?” check this article. And if you want to play around with some sample contracts in an IDE click here. The faucet can be found here. And more information on architecture, dapp development and API can be found on the Developer Portal.
If you are more into listening and watching: check this recent webinar explaining Zilliqa and Scilla. Link is time-stamped so you’ll start right away with a platform introduction, roadmap 2020 and afterwards a proper Scilla introduction.
 
Generalized: programming languages can be divided into being ‘object-oriented’ or ‘functional’. Here is an ELI5 given by software development academy: “all programs have two basic components, data – what the program knows – and behavior – what the program can do with that data. So object-oriented programming states that combining data and related behaviors in one place, is called “object”, which makes it easier to understand how a particular program works. On the other hand, functional programming argues that data and behavior are different things and should be separated to ensure their clarity.”
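To make the distinction concrete, here is a toy sketch in Python (not Scilla or OCaml, purely for illustration) of the same tiny counter written in an object-oriented style and in a functional style:

```python
# Object-oriented: data (count) and behaviour (increment) live together and mutate state.
class Counter:
    def __init__(self) -> None:
        self.count = 0

    def increment(self) -> None:
        self.count += 1

# Functional: data and behaviour are separate; the function returns a new value
# instead of mutating anything.
def increment(count: int) -> int:
    return count + 1

c = Counter()
c.increment()
print(c.count)          # 1 (state was mutated in place)
print(increment(0))     # 1 (a new value was produced; the input is untouched)
```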
 
Scilla is on the functional side and shares similarities with OCaml: OCaml is a general-purpose programming language with an emphasis on expressiveness and safety. It has an advanced type system that helps catch your mistakes without getting in your way. It's used in environments where a single mistake can cost millions and speed matters, is supported by an active community, and has a rich set of libraries and development tools. For all its power, OCaml is also pretty simple, which is one reason it's often used as a teaching language.
 
Scilla is blockchain agnostic, can be implemented on other blockchains as well, is recognized by academics and won a Distinguished Artifact Award at the end of last year.
 
One of the reasons why the Zilliqa team decided to create their own programming language focused on preventing smart contract vulnerabilities is that adding logic on a blockchain, programming, means that you cannot afford to make mistakes. Otherwise, it could cost you. It’s all great and fun blockchains being immutable but updating your code because you found a bug isn’t the same as with a regular web application for example. And with smart contracts, it inherently involves cryptocurrencies in some form thus value.
 
Another difference with programming languages on a blockchain is gas. Every transaction you do on a smart contract platform like Zilliqa or Ethereum costs gas. With gas you basically pay for computational costs. Sending a ZIL from address A to address B costs 0.001 ZIL currently. Smart contracts are more complex, often involve various functions and require more gas (if gas is a new concept click here ).
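As a rough illustration of how such fees are computed (the split into gas units and per-unit price below is hypothetical; only the ~0.001 ZIL total for a plain transfer comes from the paragraph above):

```python
# Minimal sketch of how a transaction fee is computed on a gas-based chain.
gas_used = 1            # hypothetical number of gas units for a plain transfer
gas_price_zil = 0.001   # hypothetical price per gas unit, in ZIL

fee = gas_used * gas_price_zil
print(f"fee: {fee} ZIL")   # 0.001 ZIL; smart contract calls use more gas, so cost more
```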
 
So with Scilla, similar to Solidity, you need to make sure that “every function in your smart contract will run as expected without hitting gas limits. An improper resource analysis may lead to situations where funds may get stuck simply because a part of the smart contract code cannot be executed due to gas limits. Such constraints are not present in traditional software systems”. Scilla design story part 1
 
Some examples of smart contract issues you’d want to avoid are: leaking funds, ‘unexpected changes to critical state variables’ (example: someone other than you setting his or her address as the owner of the smart contract after creation) or simply killing a contract.
 
Scilla also allows for formal verification. Wikipedia to the rescue: In the context of hardware and software systems, formal verification is the act of proving or disproving the correctness of intended algorithms underlying a system with respect to a certain formal specification or property, using formal methods of mathematics.
 
Formal verification can be helpful in proving the correctness of systems such as: cryptographic protocols, combinational circuits, digital circuits with internal memory, and software expressed as source code.
 
Scilla is being developed hand-in-hand with formalization of its semantics and its embedding into the Coq proof assistant — a state-of-the-art tool for mechanized proofs about properties of programs.
 
Simply put, with Scilla and the accompanying tooling, developers can be mathematically sure - and prove - that the smart contract they’ve written does what they intend it to do.
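To give a feel for the *kind* of property you would want proven (this is only an illustration in Python; Scilla plus Coq would prove such an invariant for all possible inputs, rather than spot-checking one case):

```python
# Illustration only: a conservation invariant for a toy token transfer.
def transfer(balances: dict, sender: str, recipient: str, amount: int) -> dict:
    if amount < 0 or balances.get(sender, 0) < amount:
        return balances                      # reject: no state change
    new = dict(balances)
    new[sender] -= amount
    new[recipient] = new.get(recipient, 0) + amount
    return new

before = {"alice": 100, "bob": 50}
after = transfer(before, "alice", "bob", 30)
# The invariant we would want formally proven: total supply is conserved.
assert sum(before.values()) == sum(after.values())
```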
 
Smart contract on a sharded environment and state sharding
 
There is one more topic I’d like to touch on: smart contract execution in a sharded environment (and what is the effect of state sharding). This is a complex topic. I’m not able to explain it any easier than what is posted here. But I will try to compress the post into something easy to digest.
 
Earlier on we have established that Zilliqa can process transactions in parallel due to network sharding. This is where the linear scalability comes from. We can define simple transactions: a transaction from address A to B (Category 1), a transaction where a user interacts with one smart contract (Category 2), and the most complex ones where triggering a transaction results in multiple smart contracts being involved (Category 3). The shards are able to process transactions on their own without interference from the other shards. With Category 1 transactions that is doable, with Category 2 transactions sometimes (if the address is in the same shard as the smart contract), but with Category 3 you definitely need communication between the shards. Solving that requires making a set of communication rules the protocol needs to follow in order to process all transactions in a generalised fashion.
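As a toy illustration of these categories (the shard-assignment rule below is purely hypothetical, not Zilliqa's actual rule):

```python
# Toy classification of the three transaction categories described above.
NUM_SHARDS = 4

def shard_of(address: str) -> int:
    # Hypothetical stand-in for the real assignment rule: last hex digit of the address.
    return int(address[-1], 16) % NUM_SHARDS

def category(sender: str, contracts: list) -> str:
    if not contracts:
        return "Category 1: plain transfer, handled inside one shard"
    shards = {shard_of(sender)} | {shard_of(c) for c in contracts}
    if len(contracts) == 1 and len(shards) == 1:
        return "Category 2: single contract call in the same shard, no cross-shard communication"
    return "Category 3 (or cross-shard Category 2): shards must coordinate"

print(category("0xA1", []))
print(category("0xA1", ["0xC1"]))
print(category("0xA1", ["0xC1", "0xC2"]))
```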
 
And this is where the downside of state sharding would come in. All shards in Zilliqa currently have access to the complete state. Yes, the state size (0.1 GB at the moment) grows and all of the nodes need to store it, but it also means that they don’t need to shop around for information held by other shards, which would require more communication and add more complexity. Links if you want to dig further (computer science and/or developer knowledge required): Scilla - language grammar, Scilla - Foundations for Verifiable Decentralised Computations on a Blockchain, Gas Accounting, NUS x Zilliqa: Smart contract language workshop.
 
Easier-to-follow links on programming Scilla: https://learnscilla.com/home and Ivan on Tech.
 
Roadmap / Zilliqa 2.0
 
There is no strictly defined roadmap, but here are the topics being worked on. Via the Zilliqa website there is also more information on the projects they are working on.
 
Business & Partnerships
 
It’s not only technology in which Zilliqa seems to be excelling, as their ecosystem has been expanding and starting to grow rapidly. The project is on a mission to provide OpenFinance (OpFi) to the world and Singapore is the right place to be due to its progressive regulations and futuristic thinking. Singapore has taken a proactive approach towards cryptocurrencies by introducing the Payment Services Act 2019 (PS Act). Among other things, the PS Act will regulate intermediaries dealing with certain cryptocurrencies, with a particular focus on consumer protection and anti-money laundering. It will also provide a stable regulatory licensing and operating framework for cryptocurrency entities, effectively covering all crypto businesses and exchanges based in Singapore. According to PwC, 82% of the surveyed executives in Singapore reported blockchain initiatives underway and 13% of them have already brought the initiatives live to the market. There is also an increasing list of organizations that are starting to provide digital payment services. Moreover, Singaporean blockchain developer Building Cities Beyond has recently created a $15 million innovation grant to encourage development on its ecosystem. This all suggests that Singapore is trying to position itself as (one of) the leading blockchain hubs in the world.
 
Zilliqa seems to already take advantage of this and recently helped launch Hg Exchange on their platform, together with financial institutions PhillipCapital, PrimePartners and Fundnel. Hg Exchange, which is now approved by the Monetary Authority of Singapore (MAS), uses smart contracts to represent digital assets. Through Hg Exchange financial institutions worldwide can use Zilliqa's safe-by-design smart contracts to enable the trading of private equities. For example, think of companies such as Grab, Airbnb, SpaceX that are not available for public trading right now. Hg Exchange will allow investors to buy shares of private companies & unicorns and capture their value before an IPO. Anquan, the main company behind Zilliqa, has also recently announced that they became a partner and shareholder in TEN31 Bank, which is a fully regulated bank allowing for tokenization of assets and is aiming to bridge the gap between conventional banking and the blockchain world. If STOs, the tokenization of assets, and equity trading will continue to increase, then Zilliqa’s public blockchain would be the ideal candidate due to its strategic positioning, partnerships, regulatory compliance and the technology that is being built on top of it.
 
What is also very encouraging is their focus on banking the un(der)banked. They are launching a stablecoin basket starting with XSGD. As many of you know, stablecoins are currently mostly used for trading. However, Zilliqa is actively trying to broaden the use case of stablecoins. I recommend everybody to read this text that Amrit Kumar wrote (one of the co-founders). These stablecoins will be integrated in the traditional markets and bridge the gap between the crypto world and the traditional world. This could potentially revolutionize and legitimise the crypto space if retailers and companies will for example start to use stablecoins for payments or remittances, instead of it solely being used for trading.
 
Zilliqa also released their DeFi strategic roadmap (dating November 2019) which seems to be aligning well with their OpFi strategy. A non-custodial DEX is coming to Zilliqa made by Switcheo which allows cross-chain trading (atomic swaps) between ETH, EOS and ZIL based tokens. They also signed a Memorandum of Understanding for a (soon to be announced) USD stablecoin. And as Zilliqa is all about regulations and being compliant, I’m speculating on it to be a regulated USD stablecoin. Furthermore, XSGD is already created and visible on block explorer and XIDR (Indonesian Stablecoin) is also coming soon via StraitsX. Here also an overview of the Tech Stack for Financial Applications from September 2019. Further quoting Amrit Kumar on this:
 
“There are two basic building blocks in DeFi/OpFi though: 1) stablecoins as you need a non-volatile currency to get access to this market and 2) a dex to be able to trade all these financial assets. The rest are built on top of these blocks.
 
So far, together with our partners and community, we have worked on developing these building blocks with XSGD as a stablecoin. We are working on bringing a USD-backed stablecoin as well. We will soon have a decentralised exchange developed by Switcheo. And with HGX going live, we are also venturing into the tokenization space. More to come in the future.”
 
Additionally, they also have this ZILHive initiative that injects capital into projects. There have been already 6 waves of various teams working on infrastructure, innovation and research, and they are not from ASEAN or Singapore only but global: see Grantees breakdown by country. Over 60 project teams from over 20 countries have contributed to Zilliqa's ecosystem. This includes individuals and teams developing wallets, explorers, developer toolkits, smart contract testing frameworks, dapps, etc. As some of you may know, Unstoppable Domains (UD) blew up when they launched on Zilliqa. UD aims to replace cryptocurrency addresses with a human-readable name and allows for uncensorable websites. Zilliqa will probably be the only one able to handle all these transactions onchain due to ability to scale and its resulting low fees which is why the UD team launched this on Zilliqa in the first place. Furthermore, Zilliqa also has a strong emphasis on security, compliance, and privacy, which is why they partnered with companies like Elliptic, ChainSecurity (part of PwC Switzerland), and Incognito. Their sister company Aqilliz (Zilliqa spelled backwards) focuses on revolutionizing the digital advertising space and is doing interesting things like using Zilliqa to track outdoor digital ads with companies like Foodpanda.
 
Zilliqa is listed on nearly all major exchanges, having several different fiat-gateways and recently have been added to Binance’s margin trading and futures trading with really good volume. They also have a very impressive team with good credentials and experience. They don't just have “tech people”. They have a mix of tech people, business people, marketeers, scientists, and more. Naturally, it's good to have a mix of people with different skill sets if you work in the crypto space.
 
Marketing & Community
 
Zilliqa has a very strong community. If you follow their Twitter, you’ll notice their engagement is much higher than you would expect for a coin with approximately 80k followers. They also have been ‘coin of the day’ by LunarCrush many times. LunarCrush tracks real-time cryptocurrency value and social data. According to their data, it seems Zilliqa has a more fundamental and deeper understanding of marketing and community engagement than almost all other coins. While almost all coins have been a bit frozen in the last months, Zilliqa seems to be on its own bull run. It was ranked somewhere in the 100s a few months ago and is currently ranked #46 on CoinGecko. Their official Telegram also has over 20k people and is very active, and their community channel, which is over 7k now, is more active and larger than many other official channels. Their local communities also seem to be growing.
 
Moreover, their community started ‘Zillacracy’ together with the Zilliqa core team ( see www.zillacracy.com ). It’s a community-run initiative where people from all over the world are now helping with marketing and development on Zilliqa. Since its launch in February 2020 they have been doing a lot and will also run their own non-custodial seed node for staking. This seed node will also allow them to start generating revenue for them to become a self sustaining entity that could potentially scale up to become a decentralized company working in parallel with the Zilliqa core team. Comparing it to all the other smart contract platforms (e.g. Cardano, EOS, Tezos etc.) they don't seem to have started a similar initiative (correct me if I’m wrong though). This suggests in my opinion that these other smart contract platforms do not fully understand how to utilize the ‘power of the community’. This is something you cannot ‘buy with money’ and gives many projects in the space a disadvantage.
 
Zilliqa also released two social products called SocialPay and Zeeves. SocialPay allows users to earn ZILs while tweeting with a specific hashtag. They have recently used it in partnership with the Singapore Red Cross for a marketing campaign after their initial pilot program. It seems like a very valuable social product with a good use case. I can see a lot of traditional companies entering the space through this product, which they seem to suggest will happen. Tokenizing hashtags with smart contracts to get network effect is a very smart and innovative idea.
 
Regarding Zeeves, this is a tipping bot for Telegram. They already have 1000s of signups and they plan to keep upgrading it for more and more people to use it (e.g. they recently added a quiz feature). They also use it during AMAs to reward people in real-time. It’s a very smart approach to grow their communities and get people familiar with ZIL. I can see this becoming very big on Telegram. This tool suggests, again, that the Zilliqa team has a deeper understanding of what the crypto space and community needs and is good at finding the right innovative tools to grow and scale.
 
To be honest, I haven’t covered everything (I’m also reaching the character limit haha). So many updates have been happening lately that it's hard to keep up, such as the International Monetary Fund mentioning Zilliqa in their report, custodial and non-custodial staking, Binance Margin, Futures, the Widget, entering the Indian market, and more. The Head of Marketing Colin Miles has also released this as an overview of what is coming next. And last but not least, Vitalik Buterin has been mentioning Zilliqa lately, acknowledging Zilliqa and mentioning that both projects have a lot of room to grow. There is much more info of course and a good part of it has been served to you on a silver platter. I invite you to continue researching by yourself :-) And if you have any comments or questions please post here!
submitted by haveyouheardaboutit to CryptoCurrency [link] [comments]


Reports come in that Google has just released a new core algorithm update and that Google is allegedly censoring bitcoin. This is an auspicious time for censorship.

What is a Bitcoin robot? A Bitcoin robot is auto-trading software that uses complex algorithms and mechanisms to scan the Bitcoin markets, read signals and make decisions on which trades to

The process is almost the same as Bitcoin mining, except you use the scrypt algorithm instead of sha256d. There are many other alternative cryptocurrencies to choose from. Notes and references [1] Bitcoin mining seems like an NP (nondeterministic polynomial) problem since a solution can be quickly verified. However, there are a couple of issues

A BIP is a "Bitcoin improvement proposal", the format for making changes to Bitcoin. Similar to the relationship between a private and public key, the private key sequence that results from using an HD wallet is defined by a one-way relationship between inputs and outputs of an algorithm.

When mining bitcoin, the hashcash algorithm repeatedly hashes the block header while incrementing the counter & extraNonce fields. Incrementing the extraNonce field entails recomputing the merkle tree, as the coinbase transaction is the leftmost leaf node. The block is also occasionally updated as you are working on it.
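A minimal sketch of that hashcash-style loop (the header contents and difficulty below are simplified placeholders, not real Bitcoin consensus parameters):

```python
# Grind a nonce until the double SHA-256 of (header + nonce) falls below a target.
import hashlib

def mine(header: bytes, difficulty_bits: int = 16) -> int:
    target = 1 << (256 - difficulty_bits)   # toy difficulty: top bits must be zero
    nonce = 0
    while True:
        payload = header + nonce.to_bytes(8, "little")
        digest = hashlib.sha256(hashlib.sha256(payload).digest()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1   # in real mining, exhausting this field means bumping extraNonce
                     # in the coinbase transaction and recomputing the merkle root

print(mine(b"example block header"))
```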
