Meet the Builders of the Open Web: Top Speakers at NEARCON ’23
NEAR FOUNDATION
October 19, 2023
If you thought last year’s NEARCON was epic, then wait until you see what’s in store for NEARCON ’23 in Lisbon, kicking off on November 7th. From deep dives into the open web to insightful sessions featuring global leaders in blockchain, here’s the inside scoop on what to expect at this year’s iconic edition.
What awaits you at NEARCON ’23
NEARCON ’23 is set to be a smorgasbord of innovation and collaboration. Taking place across three venues in Lisbon, including the historic Convento do Beato, and spanning four days, NEARCON ‘23 will give attendees access to an array of exciting opportunities. From enlightening talks on AI and Web3’s future to hands-on developer hackathons, there’s something for everyone.
You also won’t want to miss exclusive networking sessions that connect you with NEAR and multi-chain enthusiasts and projects from around the globe. Attendees will also get firsthand experience building on the Blockchain Operating System (B.O.S.).
Meet the Big Brains of NEARCON ’23
There’s an incredible lineup of speakers and thought leaders scheduled for this year’s edition of NEARCON. Here’s a rundown of some of the top names you’ll see on various stages, panels, and sessions:
Mitchell Amador: Founder and CEO of Immunefi. Mitchell is a leading voice in blockchain security and will explore how incentivized bounties enhance security.
James Young: The Co-founder and CCO of Collab.Land will delve into the power of tokenized communities, from governance to engagement.
Samantha Bohbot: Chief Growth Officer at RockawayX. An expert in strategic growth, Samantha will share essential tips for project scalability.
Se Hyun Oh: SVP at SK Telecom. Expect insights into the convergence of telecom, AI, and the open web.
Michael Casey: Chief Content Officer at CoinDesk. Michael will dissect the evolving narratives around cryptocurrency and its impact on the traditional financial sector.
Richard Muirhead: Managing Partner at Fabric Ventures. Hear a venture capitalist’s perspective on what makes a blockchain project investable.
Oleg Fomenko: Co-founder of Sweat Economy. Join the discussion on how blockchain is revolutionizing health and fitness via Move-to-Earn.
Nicolai Reinbold: Global Head of Expansion and Innovation at CV Labs. Nicolai will cover blockchain innovation hubs and their role in nurturing early-stage startups.
Teana Baker-Taylor: Circle’s VP of Policy and Regulatory Strategy for the EU/UK. She’ll address the regulatory landscape surrounding blockchain and cryptocurrencies.
Joy Macknight: The editor at The Banker will offer an overview of how the open web and blockchain are disrupting traditional banking systems.
Check out the complete list of NEARCON ’23 speakers, full of founders, builders, VCs, and other open web innovators.
Maximizing your NEARCON ’23 experience
While the speakers are a definite draw, NEARCON ’23 is more than just speaker sessions. You can also join the NEARCON IRL Hackathon where developers will have the chance to build directly on the B.O.S and compete for over $140,000 in prizes. In case you didn’t already know, NEARCON is free for hackers participating in the hackathon! At NEARCON ‘23, there’s something for just about everyone.
Four Distinct Paths to Explore at NEARCON ’23
Developers: Engage in technical workshops and learn from the best in the business. These sessions are designed to meet you at any point in your open web journey.
Entrepreneurs: Explore how the open web can revolutionize the way businesses connect and engage with users through innovative products and solutions.
Creators: Discover how the open web can disrupt traditional systems, offering more equitable platforms for creators to get discovered and fairly compensated.
Regulators: Stay up-to-date with the evolving regulations and policies that are shaping the open web, keeping you ahead of the curve.
And don’t forget the networking opportunities, where you’ll get a chance to collaborate and engage in exciting conversations with NEAR community members from around the world.
Don’t miss the future at NEARCON ’23
Your chance to explore the entire NEAR ecosystem and the B.O.S is right around the corner. So, come play a part in driving the open web towards mass adoption. And with General Admission tickets still available, NEARCON ‘23 is a steal for anyone interested in helping shape the future of blockchain and decentralization.
So what are you waiting for? Register for NEARCON ’23 to secure your spot today. And if you’re a Ukrainian living in Portugal, or a student in Portugal or Spain, click those links for more information on how to register and attend NEARCON ‘23 in Lisbon for absolutely free!
---
NEP: 452
Title: Linkdrop Standard
Author: Ben Kurrek <[email protected]>, Ken Miyachi <[email protected]>
DiscussionsTo: https://gov.near.org/t/official-linkdrop-standard/32463/1
Status: Final
Type: Standards Track
Category: Contract
Version: 1.0.0
Created: 24-Jan-2023
Updated: 19-Apr-2023
---
## Summary
A standard interface for linkdrops that supports $NEAR, fungible tokens, and non-fungible tokens, and is extensible to support new asset types in the future.
Linkdrops are a simple way to send assets to someone by providing them with a link. This link can be embedded into a QR code, sent via email, text or any other means. Within the link, there is a private key that allows the holder to call a method that can create an account and send it assets. Alternatively, if the holder has an account, the assets can be sent there as well.
By definition, anyone with an access key can interact with the blockchain, and since a private key is embedded in the link, the end user does not need to have a wallet of their own.
## Motivation
Linkdrops are an extremely powerful tool that enable seamless onboarding and instant crypto experiences with the click of a link. The original [near-linkdrop](https://github.com/near/near-linkdrop) contract provides a minimal interface allowing users to embed $NEAR within an access key and create a simple Web2 style link that can then be used as a means of onboarding. This simple $NEAR linkdrop is not enough as many artists, developers, event coordinators, and applications want to drop more digital assets such as NFTs, FTs, tickets etc.
As linkdrop implementations start to push the boundaries of what’s possible, new data structures, methods, and interfaces are being developed. There needs to be a standard data model and interface put into place to ensure assets can be claimed independent of the contract they came from. If not, integrating any application with linkdrops will require customized solutions, which would become cumbersome for the developer and deteriorate the user onboarding experience. The linkdrop standard addresses these issues by providing a simple and extensible standard data model and interface.
The initial discussion can be found [here](https://gov.near.org/t/official-linkdrop-standard/32463/1).
## Specification
### Example Scenarios
_Pre-requisite Steps_: Linkdrop creation:
The linkdrop creator that has an account with some $NEAR:
- creates a keypair locally (`pubKey1`, `privKey1`). (The keypair is not written to chain at this time)
- calls a method on a contract that implements the linkdrop standard in order to create the drop. The `pubKey1` and desired $NEAR amount are both passed in as arguments.
- The contract maps the `pubKey1` to the desired balance for the linkdrop (`KeyInfo` record).
- The contract then adds the `pubKey1` as a function call access key with the ability to call `claim` and `create_account_and_claim`. This means that anyone with the `privKey1` (see above), can sign a transaction on behalf of the contract (signer id set to contract id) with a function call to call one of the mentioned functions to claim the assets.
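How the drop itself gets created is intentionally left outside this standard, but for illustration, a client-side sketch of these pre-requisite steps using near-api-js could look like the following. The `create_drop` method name and its arguments, the `linkdrop.near` contract, and the gas and deposit values are assumptions; `creatorAccount` is assumed to be an already connected near-api-js `Account`.
```js
import { KeyPair, utils } from "near-api-js";

// 1. Create a keypair locally (pubKey1 / privKey1). Nothing is written to chain yet.
const keyPair1 = KeyPair.fromRandom("ed25519");
const linkSecretKey = keyPair1.toString(); // privKey1, later embedded in the link or QR code

// 2. Register the drop on a contract implementing this standard.
//    `create_drop` is a hypothetical, contract-specific method (creation is not standardized).
await creatorAccount.functionCall({
  contractId: "linkdrop.near", // placeholder linkdrop contract
  methodName: "create_drop",
  args: { public_key: keyPair1.getPublicKey().toString() }, // pubKey1
  gas: "30000000000000", // 30 TGas, illustrative
  attachedDeposit: utils.format.parseNearAmount("1"), // the $NEAR to attach to the drop
});
```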
#### Claiming a linkdrop without a NEAR Account
A user with _no_ account can claim the assets associated with an existing public key, already registered in the linkdrop contract:
- generates a new keypair (`pubKey2`, `privKey2`) locally. (This new keypair is not written to chain)
- chooses a new account ID such as benji.near.
- calls `create_account_and_claim`. The transaction is signed on behalf of the linkdrop contract (`signer_id` is set to the contract address) using `privKey1`.
- the args of this function call will contain both `pubKey2` (which will be used to create a full access key for the new account) and the account ID itself.
- the linkdrop contract will delete the access key associated with `pubKey1` so that it cannot be used again.
- the linkdrop contract will create the new account and transfer the funds to it alongside any other assets.
- the user will be able to sign transactions on behalf of the new account using `privKey2`.
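For illustration, the same flow from the claimer's side could look roughly like the sketch below, using near-api-js. The network settings, contract name, and gas value are assumptions; only `create_account_and_claim` and its arguments come from this standard.
```js
import { connect, keyStores, KeyPair } from "near-api-js";

const LINKDROP_CONTRACT = "linkdrop.near"; // placeholder: the contract that holds the drop
const privKey1 = "ed25519:..."; // the secret key embedded in the link

// Sign on behalf of the linkdrop contract, authorized by the function-call access key (pubKey1).
const keyStore = new keyStores.InMemoryKeyStore();
await keyStore.setKey("mainnet", LINKDROP_CONTRACT, KeyPair.fromString(privKey1));
const near = await connect({ networkId: "mainnet", nodeUrl: "https://rpc.mainnet.near.org", keyStore });
const linkdropAccount = await near.account(LINKDROP_CONTRACT);

// New keypair (pubKey2 / privKey2) for the account that is about to be created.
const keyPair2 = KeyPair.fromRandom("ed25519");

await linkdropAccount.functionCall({
  contractId: LINKDROP_CONTRACT,
  methodName: "create_account_and_claim",
  args: {
    new_account_id: "benji.near",
    new_public_key: keyPair2.getPublicKey().toString(), // "ed25519:<base58>"
  },
  gas: "100000000000000", // should match the drop's `required_gas`
});
```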
#### Claiming a linkdrop with a NEAR Account
A user with an _existing_ account can claim the assets with an existing public key, already registered in the linkdrop contract:
- calls `claim`. The transaction is signed on behalf of the linkdrop contract (`signer_id` is set to the contract address) using `privKey1`.
- the args of this function call will simply contain the user's existing account ID.
- the linkdrop contract will delete the access key associated with `pubKey1` so that it cannot be used again.
- the linkdrop contract will transfer the funds to that account alongside any other assets.
```ts
/// Information about a specific public key.
type KeyInfo = {
/// How much Gas should be attached when the key is used to call `claim` or `create_account_and_claim`.
/// It is up to the smart contract developer to calculate the required gas (which can be done either automatically on the contract or on the client-side).
required_gas: string,
/// yoctoNEAR amount that will be sent to the account that claims the linkdrop (either new or existing)
/// when the key is successfully used.
yoctonear: string,
/// If using the NFT standard extension, a set of NFTData can be linked to the public key
/// indicating that all those assets will be sent to the account that claims the linkdrop (either new or
/// existing) when the key is successfully used.
nft_list: NFTData[] | null,
/// If using the FT standard extension, a set of FTData can be linked to the public key
/// indicating that all those assets will be sent to the account that claims the linkdrop (either new or
/// existing) when the key is successfully used.
ft_list: FTData[] | null
/// ... other types can be introduced and the standard is easily extendable.
}
/// Data outlining a specific Non-Fungible Token that should be sent to the claiming account
/// (either new or existing) when a key is successfully used.
type NFTData = {
/// the id of the token to transfer
token_id: string,
/// The valid NEAR account indicating the Non-Fungible Token contract.
contract_id: string
}
/// Data outlining Fungible Tokens that should be sent to the claiming account
/// (either new or existing) when a key is successfully used.
type FTData = {
/// The number of tokens to transfer, wrapped in quotes and treated
/// like a string, although the number will be stored as an unsigned integer
/// with 128 bits.
amount: string,
/// The valid NEAR account indicating the Fungible Token contract.
contract_id: string
}
/****************/
/* VIEW METHODS */
/****************/
/// Allows you to query for the amount of $NEAR tokens contained in a linkdrop corresponding to a given public key.
///
/// Requirements:
/// * Panics if the key does not exist.
///
/// Arguments:
/// * `key` the public counterpart of the key used to sign, expressed as a string with format "<key-type>:<base58-key-bytes>" (e.g. "ed25519:6TupyNrcHGTt5XRLmHTc2KGaiSbjhQi1KHtCXTgbcr4Y")
///
/// Returns a string representing the yoctoNEAR amount associated with a given public key
function get_key_balance(key: string) -> string;
/// Allows you to query for the `KeyInfo` corresponding to a given public key. This method is preferred over `get_key_balance` as it provides more information about the key.
///
/// Requirements:
/// * Panics if the key does not exist.
///
/// Arguments:
/// * `key` the public counterpart of the key used to sign, expressed as a string with format "<key-type>:<base58-key-bytes>" (e.g. "ed25519:6TupyNrcHGTt5XRLmHTc2KGaiSbjhQi1KHtCXTgbcr4Y")
///
/// Returns `KeyInfo` associated with a given public key
function get_key_information(key: string) -> KeyInfo;
/******************/
/* CHANGE METHODS */
/******************/
/// Transfer all assets linked to the signer’s public key to an `account_id`.
/// If the transfer fails for whatever reason, it is up to the smart contract developer to
/// choose what should happen. For example, the contract can choose to keep the assets
/// or send them back to the original linkdrop creator.
///
/// Requirements:
/// * The predecessor account *MUST* be the current contract ID.
/// * The `account_id` MUST be an *initialized* NEAR account.
/// * The assets being sent *MUST* be associated with the signer’s public key.
/// * The assets *MUST* be sent to the `account_id` passed in.
///
/// Arguments:
/// * `account_id` the account that should receive the linkdrop assets.
///
/// Returns `true` if the claim was successful meaning all assets were sent to the `account_id`.
function claim(account_id: string) -> Promise<boolean>;
/// Creates a new NEAR account and transfers all assets linked to the signer’s public key to
/// the *newly created account*. If the transfer fails for whatever reason, it is up to the
/// smart contract developer to choose what should happen. For example, the contract can
/// choose to keep the assets or return them to the original linkdrop creator.
///
/// Requirements:
/// * The predecessor account *MUST* be the current contract ID.
/// * The assets being sent *MUST* be associated with the signer’s public key.
/// * The assets *MUST* be sent to the `new_account_id` passed in.
/// * The newly created account *MUST* have a new access key added to its account (either
/// full or limited access) in the same receipt that the account was created in.
/// * The public key must be in a binary format with base58 string serialization and a human-readable curve prefix.
///   The key types currently supported are secp256k1 and ed25519. Accepted ed25519 public keys are 32 bytes,
///   and secp256k1 keys use the uncompressed 64-byte format.
///
/// Arguments:
/// * `new_account_id`: the valid NEAR account which is being created and should
/// receive the linkdrop assets
/// * `new_public_key`: the valid public key that should be used for the access key added to the newly created account (serialized with borsh).
///
/// Returns `true` if the claim was successful meaning the `new_account_id` was created and all assets were sent to it.
function create_account_and_claim(new_account_id: string, new_public_key: string) -> Promise<boolean>;
```
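For illustration, a client could exercise the interface above roughly as follows, assuming a near-api-js `Account` (`linkdropAccount`) already configured to sign with `privKey1` on behalf of the linkdrop contract, as in the scenarios above. Contract and account names are placeholders, and the exact near-api-js call shapes vary slightly between versions.
```js
const pubKey1 = "ed25519:6TupyNrcHGTt5XRLmHTc2KGaiSbjhQi1KHtCXTgbcr4Y"; // example key from above

// View call: inspect what the key holds before claiming.
const keyInfo = await linkdropAccount.viewFunction({
  contractId: "linkdrop.near",
  methodName: "get_key_information",
  args: { key: pubKey1 },
});
console.log(keyInfo.yoctonear, keyInfo.ft_list, keyInfo.nft_list);

// Change call: send everything attached to the key to an existing account.
await linkdropAccount.functionCall({
  contractId: "linkdrop.near",
  methodName: "claim",
  args: { account_id: "existing-user.near" },
  gas: keyInfo.required_gas,
});
```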
## Reference Implementation
Below are some references for linkdrop contracts.
- [Link Drop Contract](https://github.com/near/near-linkdrop)
- [Keypom Contract](https://github.com/keypom/keypom)
## Security Implications
1. Linkdrop Creation
Linkdrop creation involves creating keypairs that, when used, have access to assets such as $NEAR, FTs, NFTs, etc. These keys should be limited access and restricted to specific functionality. For example, they should only have permission to call `claim` and `create_account_and_claim`. Since the keys allow the holder to sign transactions on behalf of the linkdrop contract, without the proper security measures, they could be used in a malicious manner (for example executing private methods or owner-only functions).
Another important security implication of linkdrop creation is to ensure that only one key is mapped to a set of assets at any given time. Externally, assets such as FTs and NFTs belong to the overall linkdrop contract account rather than to a specific access key. It is important to ensure that specific keys can only claim assets that they are mapped to.
2. Linkdrop Key Management
Key management is a critical safety component of linkdrops. The linkdrop contract should implement a key management strategy for keys such that a reentrancy attack does not occur. For example, one strategy may be to "lock" or mark a key as "in transfer" such that it cannot be used again until the transfer is complete.
3. Asset Refunds & Failed Claims
Given that linkdrops could contain multiple different assets such as NFTs, or fungible tokens, sending assets might happen across multiple blocks. If the claim was unsuccessful (such as passing in an invalid account ID), it is important to ensure that all state is properly managed and assets are optionally refunded depending on the linkdrop contract's implementation.
4. Fungible Tokens & Future Data
Fungible token contracts require that anyone receiving tokens must be registered. For this reason, it is important to ensure that storage for accounts claiming linkdrops is paid for. This concept can be extended to any future data types that may be added. You must ensure that all the pre-requisite conditions have been met for the asset that is being transferred.
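For illustration only (this is not part of the standard), a linkdrop contract written with near-sdk-js might batch the registration and the transfer to the same FT contract into a single receipt, roughly as sketched below. The helper name, gas values, and deposit amounts are assumptions.
```js
import { NearPromise } from "near-sdk-js";

const STORAGE_DEPOSIT = BigInt("1250000000000000000000"); // ~0.00125 NEAR, typical storage cost
const GAS_STORAGE = BigInt("10000000000000"); // 10 TGas
const GAS_FT_TRANSFER = BigInt("15000000000000"); // 15 TGas

// Register the claiming account on the FT contract, then transfer the tokens.
// Both actions target the same receiver, so they are batched into one receipt.
function sendFt(ftContractId, receiverId, amount) {
  return NearPromise.new(ftContractId)
    .functionCall("storage_deposit", JSON.stringify({ account_id: receiverId }), STORAGE_DEPOSIT, GAS_STORAGE)
    .functionCall("ft_transfer", JSON.stringify({ receiver_id: receiverId, amount }), BigInt(1), GAS_FT_TRANSFER);
}
```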
5. Tokens Properly Sent to Linkdrop Contract
Since the linkdrop contract facilitates the transfer of assets including NFTs, and FTs, it is important to ensure that those tokens have been properly sent to the linkdrop contract prior to claiming. In addition, since all the tokens are in a shared pool, you must ensure that the linkdrop contract cannot claim assets that do not belong to the key that is being used to claim.
It is also important to note that not every linkdrop is valid. Drops can expire, funds can be lazily sent to the contract (as seen in the case of fungible and non-fungible tokens) and the supply can be limited.
## Alternatives
#### Why is this design the best in the space of possible designs?
This design allows for flexibility and extensibility of the standard while providing a set of criteria that cover the majority of current linkdrop use cases. The design was heavily inspired by current, functional NEPs such as the Fungible Token and Non-Fungible Token standards.
#### What other designs have been considered and what is the rationale for not choosing them?
A generic data struct that all drop types needed to inherit from. This struct contained a name and some metadata in the form of stringified JSON. This made it easily extensible for any new types down the road. The rationale for not choosing this design was both simplicity and flexibility. Having one data struct requires keys to be of one type only, whereas in reality there can be many at once. In addition, having a generic, open-ended metadata field could lead to many interpretations and different designs. We chose instead to use a `KeyInfo` struct that can be easily extended and can cover all use cases by having optional vectors of different data types. The proposed standard is simple, supports drops with multiple assets, is backwards compatible with all previous linkdrops, and can be extended very easily.
A standard linkdrop creation interface. A standardized linkdrop creation interface would provide data models and functions to ensure linkdrops were created and stored in a specific format. The rationale for not choosing this design was that it was too restrictive. Standardizing linkdrop creation adds complexity and reduces flexibility by restricting how linkdrop creators create drops, potentially limiting linkdrop functionality. The functionality of linkdrop creation, such as refunding of assets, access keys, and batch creation, should be chosen by the linkdrop creator and live within the linkdrop creator platform. Further, linkdrop creation is often not displayed to end users, and there is no inherent value proposition for a standardized linkdrop creation interface from a client perspective.
#### What is the impact of not doing this?
The impact of not doing this is creating a fragmented ecosystem of linkdrops, increasing the friction for user onboarding. Linkdrop claim pages (e.g. wallet providers) would have to implement custom integrations for every linkdrop provider platform. Inherently this would lead to a bad user experience when new users are onboarding and interacting with linkdrops in general.
## Future possibilities
- Linkdrop creation interface
- Bulk linkdrop management (create, update, claim)
- Function call data types (allowing for funder defined functions to be executed when a linkdrop is claimed)
- Optional configurations added to `KeyInfo`, which can include multi-use keys, time-based claiming, etc.
- Standard process for how links connect to claim pages (i.e. a standardized URL such as an app’s `baseUrl/contractId=[LINKDROP_CONTRACT]&secretKey=[SECRET_KEY]`)
- Standard for deleting keys and refunding assets.
## Copyright
Copyright and related rights waived via [CC0](https://creativecommons.org/publicdomain/zero/1.0/).
---
id: frontend-multiple-contracts
title: Frontend Interacting with Multiple Contracts
sidebar_label: Frontend & Multiple Contracts
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
import {CodeTabs, Language, Github} from "@site/src/components/codetabs"
This example showcases how to interact with multiple contracts from a single frontend.
Particularly, this example shows how to:
1. Query data from multiple contracts.
2. Call methods in multiple contracts simultaneously.
---
## Query Data from Multiple Contracts
To query multiple contracts, simply perform multiple `view` calls:
<Language value="js" language="ts">
<Github fname="index.js"
url="https://github.com/near-examples/frontend-multiple-contracts/blob/main/frontend/index.js"
start="70" end="76" />
</Language>
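Conceptually, the embedded snippet boils down to something like the sketch below. It assumes a `wallet` wrapper exposing `viewMethod({ contractId, method, args })`, as used in the NEAR frontend examples; the contract addresses and arguments are illustrative.
```js
const HELLO_CONTRACT = "hello.near-examples.near"; // assumed addresses
const GUESTBOOK_CONTRACT = "guestbook.near-examples.near";

// Two independent view calls, executed in parallel
const [greeting, messages] = await Promise.all([
  wallet.viewMethod({ contractId: HELLO_CONTRACT, method: "get_greeting" }),
  wallet.viewMethod({ contractId: GUESTBOOK_CONTRACT, method: "get_messages", args: { limit: 10 } }),
]);
```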
---
## Dispatching Multiple Transactions
The `wallet` object enables dispatching multiple transactions simultaneously. However, please note that the transactions execute independently of each other.
Dispatching multiple transactions at once is simply a nice way to improve the UX, because the user interacts with the wallet only once.
<Language value="js" language="ts">
<Github fname="index.js"
url="https://github.com/near-examples/frontend-multiple-contracts/blob/main/frontend/index.js"
start="39" end="66" />
</Language>
In this example, the user signs two independent transactions:
1. A transaction to call `set_greeting` in our [Hello NEAR example](https://github.com/near-examples/hello-near-examples)
2. A transaction to call `add_message` in our [GuestBook example](https://github.com/near-examples/guest-book-examples)
:::caution
Even when the user accepts signing the transactions at the same time, the
transactions remain **independent**. That is, if one fails, the other is **NOT** rolled back.
:::
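As a rough sketch, assuming the same `wallet` wrapper exposes `signAndSendTransactions` and reusing the contract constants from the previous sketch, the two transactions could be built and dispatched like this:
```js
const THIRTY_TGAS = "30000000000000";
const NO_DEPOSIT = "0";

const helloTx = {
  receiverId: HELLO_CONTRACT,
  actions: [{
    type: "FunctionCall",
    params: { methodName: "set_greeting", args: { greeting: "Hola" }, gas: THIRTY_TGAS, deposit: NO_DEPOSIT },
  }],
};

const guestBookTx = {
  receiverId: GUESTBOOK_CONTRACT,
  actions: [{
    type: "FunctionCall",
    params: { methodName: "add_message", args: { text: "Signed in one click" }, gas: THIRTY_TGAS, deposit: NO_DEPOSIT },
  }],
};

// One wallet interaction, two independent transactions
await wallet.signAndSendTransactions({ transactions: [helloTx, guestBookTx] });
```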
---
## Batch Actions
You can aggregate multiple [actions](../../2.build/2.smart-contracts/anatomy/actions.md) directed towards the same contract into a single transaction. Batched actions execute **sequentially**, with the added benefit that if **one fails** they **all** get reverted.
```js
// Register a user and transfer them FT in a single transaction
const REGISTER_DEPOSIT = "1250000000000000000000";

const ftTx = {
  receiverId: FT_ADDRESS,
  actions: [
    {
      type: 'FunctionCall',
      params: {
        methodName: 'storage_deposit',
        args: { account_id: "<receiver-account>" },
        gas: THIRTY_TGAS, deposit: REGISTER_DEPOSIT
      }
    },
    {
      type: 'FunctionCall',
      params: {
        methodName: 'ft_transfer',
        args: { receiver_id: "<receiver-account>", amount: amount_in_yocto },
        gas: THIRTY_TGAS, deposit: "1" // 1 yoctoNEAR required by the FT standard
      }
    }
  ]
}

// Ask the wallet to sign and send the transaction
await wallet.signAndSendTransactions({ transactions: [ ftTx ] })
```
---
id: welcome
title: Tools
sidebar_label: Home
---
Welcome! The NEAR ecosystem has a complete set of tools for you to make the most out of the NEAR network.
On this page you will find:
1. [Wallets](https://www.mynearwallet.com/) to handle your assets.
2. [Explorers](explorer.md) to quickly obtain information from the blockchain.
3. Websites to simplify creating and participating on governance projects.
4. Tools to query [past information](indexer4explorer.md) from the blockchain as well as [real time events](events.md).
5. Developer tools to deploy and interact with contracts such as the [NEAR CLI](cli.md) and [NEAR JavaScript API](/tools/near-api-js/quick-reference).
NEAR & Social Good: Change Through Crypto
COMMUNITY
July 18, 2022
Harnessing emerging technologies and showing no signs of slowing, social good initiatives focused on bettering society and our planet have become a force in 2022. Despite a rocky crypto market, activism investment and funds focused on environmental, social, and governance factors (ESG) are thriving on NEAR and across the Web3 world.
Growing to over $500B in 2021, ESG funds are making bold statements on the future of sustainability through innovation. Beyond traditional markets, significant social causes are using the blockchain and reaping benefits for the greater good, combating the ideas that crypto is environmentally wasteful and predatory in its practices.
With its sustainable and infinitely-scalable blockchain, NEAR Protocol is an ideal platform for social initiatives, which can and often do overlap with tackling climate change. These projects know that NEAR is climate-neutral and committed to funding grants to reimagine business, creativity, and community for a more inclusive and sustainable future.
From empowering underserved communities of musicians and disrupting big agriculture to supporting earth restoration projects and blazing a new path in community-based governance, countless social good initiatives are gravitating to NEAR. These projects are the perfect building blocks of a global community aligned with the Foundation’s goal: enabling community-driven innovation that ultimately benefits people worldwide.
Breathing life into art through blockchain
Emerging as a collaborative project in 2021 between artists from Nigeria, Cameroon, and Canada, Chapter One Foundation evolved quickly. What started as a collaboration track under DAO Records and the NxM guild has grown into a group of artists “fostering philanthropy and community for other creatives in less fortunate circumstances.”
Generating funds through NFT sales, C1 aims to “create opportunities for the less fortunate, by providing the tools needed to create revenue sources through the blockchain and NFTs.” The project provides up-and-coming artists with the tools, skills, and resources needed to breathe life into their art. Rooted in setting up recording and art facilities, C1 makes these spaces available to people who otherwise wouldn’t have access.
With an open-door policy that welcomes all artists to join, C1 provides the less fortunate with real-world savvy in navigating an ever-changing industry while earning revenue. To participate in this mission, people can contribute and purchase creative works from the C1 store. Under the Chapter One Global Charity Foundation, a staggering 30% of NFT sale proceeds go to the less privileged worldwide.
By providing artists with much-needed tools to see their vision through from concept to completion, at-risk youth and marginalized populations of creatives now have access to safe, creative spaces. Operating on NEAR, this innovative zone also serves as a platform for creators to hone their art and entertainment industry knowledge.
For the first time, creators will also have a vested interest in how the industry and its practices evolve. Beyond education and experiential learning, generating revenue through NFT sales creates a positive feedback loop in returning proceeds to C1, allowing the project to scale and expand its reach across the globe.
Nature is non-fungible
On NEAR, a number of projects are leveraging blockchain technology to tackle climate-related issues. SISU, a community-driven project and DAO, supports earth restoration projects through Web3 funding mechanisms. Operating as a global network of organizations working on climate, SISU engages artists and creatives to contribute and earn while supporting their initiatives.
SISU uses blockchain to create multi-stakeholder, market-driven communities of earth restoration projects to further ecological projects and initiatives through public art and storytelling. Designed as a Web3-powered “human safety net”, SISU supports concrete restoration and conservation projects in local communities worldwide.
One such project is the breeding of the native honey bees of Sierra Nevada. This honey bee initiative produces multiple positive outcomes. Not only is it a sustainable alternative for income generation and forest conservation, but it’s also a means of recovering and preserving ancestral knowledge from indigenous communities.
This project is working to recover stingless beekeeping, or meliponiculture, as ancestral knowledge for the La Sierra Nevada of Santa Marta indigenous communities. Through technical assistance, equipment, resources, and infrastructure provision, SISU is restoring beekeeping as a sustainable income alternative. The goals: improve the quality of life of those involved, while guaranteeing the conservation of ecosystems and water sources.
Transforming the way humans connect with food
Lisbon-based Raiz is taking another approach to social and ecological good. This project uses NFTs to help fund vertical farms—the practice of growing crops in vertically stacked layers—in under-utilized urban spaces. Founded by Emiliano Gutierrez, Raiz’s mission is to transform urban areas into epicenters for the growing and harvesting of foods for local communities and restaurants.
Raiz is ardently opposed to the agriculture industry’s stagnant business model, which ships harvested food across global markets. To disrupt this unsustainable model, it aims to bring a renewed focus back to the local community while maintaining incredible taste and nutrients in its crops. As part of its vertical farming revolution, Raiz incorporates controlled-environment agriculture, optimizing plant growth through sustainable technology that directly impacts our planet.
Raiz uses NEAR-powered NFTs to generate yield and offer “tokenized impact for the bold.” Through NFTs, Raiz makes and releases digital artworks of plants linked to impact metrics such as carbon emissions and water savings—significant issues in traditional agriculture and industrial farming markets.
Raiz is transforming local hydroponic, vertical farming, and solar energy systems into investable crypto assets by linking the physical with the digital. The Raiz NFT model has the potential to be used for other local food-growing efforts, resulting in people worldwide having greater access to tokenized environmental and local impact.
DAOs for positive social change on a global scale
Kin DAO emerged from a real-world co-op that was seeking solutions for the pressing issues of food insecurity, homelessness, poverty, fractured communities, and climate change. Since 2021, founders Asya Abdrahman and Adrian Bello have used blockchain to facilitate activism investments in art, regenerative innovation, and collective land stewardship.
In June 2022, Kin DAO announced Primordia DAO, a year-long project DAO designed to be experienced as a conceptual art exhibition of village building online and IRL. Founded by Bay Area arts community members, Primordia employs NEAR’s smart contracts to develop decentralized systems, such as a network that could underpin art-based communities or regenerative systems like permaculture. Its founders have created art and educational programs that have served 50,000 people and counting.
One of Primordia’s missions is providing historically-excluded communities with the skills to springboard them into the digital world by working with them to create collaborative artworks. Primordia DAO develops leaders with the abilities to self-govern, self-build, and establish digital villages based on real-world relationships that can meet the demands of real communities.
The project will onboard 100 DAOs to NEAR by the end of the year so they can work together to achieve common objectives. As society transitions into the realms of regeneration and community-based government, Primordia is a crucial use case for the future of global coordination and creative community autonomy.
A mission for change starting from within
While a flourishing global community is taking the NEAR Foundation’s mission to improve the lives of the world’s citizens to the next level, NEAR recognizes that change starts from within. To that end, it is taking unprecedented steps to achieve its mission and serve as a model for sustainable technology.
NEAR’s blockchain scaling technology, Nightshade, divides computation across parallel “shards,” optimizing the already super cheap, exceptionally fast, incredibly secure, and ecologically friendly platform. Its underlying proof-of-stake (PoS) architecture makes the NEAR Protocol more environmentally-friendly than other blockchains.
From inception, NEAR Foundation committed itself to climate-neutral certification. In 2021, the foundation engaged South Pole, a global climate solutions provider, to assess its climate footprint. At 174 tons of CO2 per year across the NEAR Foundation, all employees and contractors working on NEAR Protocol, and all validators, NEAR is a whopping 500,000 times more carbon efficient than Bitcoin.
NEAR is also committed to fully offsetting its remaining emissions through CO2 offsetting projects. With every transaction completed on the platform, NEAR plants trees on multiple continents to achieve carbon neutrality.
The beginnings of a lifelong commitment
These projects are just the tip of the iceberg. The NEAR community’s commitment to fostering social change is global in scale and ambition. As the community grows, more and more people are working together to develop real solutions to our most pressing problems.
DAOs and other community-based initiatives operating on NEAR prove once and for all that crypto is not just another financial market.
Stay tuned for a deeper look at NEAR’s commitment to promoting social good. Over the next month, we’ll focus on crypto’s dynamic aspects as a vehicle for positive change. Web3 is helping historically marginalized communities find their voice, and as the mission evolves, NEAR is unleashing a new level of community engagement and activism worldwide.
Case Study: Indexer’s Mary Korch on Building a Multichain Indexing Platform
NEAR FOUNDATION
August 18, 2023
NEAR is home to a number of amazing apps and projects. These developers, founders, and entrepreneurs are using NEAR to effortlessly create and distribute innovative decentralized apps, while helping build a more open web — free from centralized platforms. In these Case Study videos, NEAR Foundation showcases some of these projects.
In the latest NEAR Foundation Case Study video below, Mary Korch, Customer Success at Indexer, talks about how the project built its Multichain Indexing Platform on NEAR. As Korch notes in the case study, Indexer builds infrastructure to support developers, giving them the data they need to be successful on the NEAR Protocol.
“If you’re a Web2 company looking to build in Web3, and specifically in NEAR, we have done a lot of the hard work for you,” says Korch. “You’re going to need data from the blockchain but also some off-chain data like metadata and IPFS. We actually have that all at your disposal, and it makes it a lot easier to build your application and get to market a lot quicker.”
A Deep Dive into DAOs: How DAOs Are Changing Everything
NEAR FOUNDATION
March 17, 2022
The global community is being heard. Web3 continues to surge in popularity as developers and end-users alike instill confidence in decentralized blockchains, where information lives transparently and out in the open, in direct opposition to Big Tech’s private, opaque databases storing massive amounts of data.
Just as NFTs went viral in 2021, DAOs are now emerging as 2022’s hottest Web3 topic. Within the NEAR community, Decentralized Autonomous Organizations (DAOs) are exhibiting rapid growth while bringing more project diversity to the ecosystem.
Decision making from the bottom-up
A decentralized autonomous organization, or DAO, is a formalized, self-governing community. It is the Web3 analogue of an internet-native Web2 company, minus the centralized control. Owned and managed by its members, a DAO carries out a set of instructions determined by prewritten smart contracts, with voting required by members for any changes to be implemented. At its core, the DAO has a built-in treasury that no individual has the authority to access without the approval of the group, fostering the “trustless” environment championed by Web3 platforms.
The first-ever DAO, created in 2016, made waves when its token sale raised the equivalent of $120 million USD, making it one of the largest crowdfunding campaigns in history. While the DAO’s underlying principles and core values remain the same, they have experienced a remarkable evolution in the last several years.
Uniquely Web3 with unmatched power
In 2022, new Web3 users will find DAOs quick to set up, global in scale, extremely customizable, and exceptionally transparent. Think of them as a toolbox that invites anyone to create a business structure with elegant simplicity and flexibility baked in. As such, an immense diversity of structures and scenarios emerge from them. Flat or hierarchical, open to everyone or members-only, the power is invested in the stakeholders to determine how the community should be governed.
The power of DAOs is being harnessed right now by communities to advance a number of missions, made possible by NEAR’s extremely fast, super secure and infinitely scalable protocol. In grants giving, DAO members vote and collectively decide which people and projects to fund. Businesses can track owners’ equity and elect managers to make day-to-day decisions, while investors utilize DAOs to pool funds and vote on strategy and decisions. Clubs of any kind can use the DAO toolbox to elect leaders and decide where to spend group funds.
The DAOs on NEAR are diverse, creative, and boundless in terms of growth. Reinventing the social internet, decentralizing finance, providing structure for various Guilds, and working in areas like legal and software development, these are just a few examples of DAOs that leverage NEAR’s sharded blockchain infrastructure.
The types and number of DAO use cases grow daily, and its true potential is just beginning to be realized.
Together we can do so much
The lion’s share of DAO community growth on NEAR can be seen in blockchain infrastructure and the decentralized finance, or DeFi, space. By total assets, the largest DAOs are Aurora, a token exchange bridge between NEAR and Ethereum, and Octopus, a platform for launching and running independent, custom Web3 blockchains known as “appchains”.
DAOs don’t stop at providing structure for organizations blazing entirely new trails. They are also serving as the foundation for organizations challenging our preconceptions of social networks and even music recording and distribution.
Feiyu, a picture-based social network on NEAR, allows users to express creativity through memes and GIF sharing. In its novel take on social media, it all happens in an NFT-based metaverse where users earn tokens or NFT items for participating in the community.
Elsewhere on NEAR, DAO Records is reimagining the record label, building tools that will help write the future of the music industry, while working to fairly compensate musicians for their songwriting and recording work. Its founders are encouraging artists to release and package new music as audio NFTs, whose smart contracts allow for an entirely more equitable royalty structure, in perpetuity.
On DAO-powered social platform t² (short for time²), launching in late 2022, users explore a world of narratives “curated by time.” Feature articles are created, curated, and propagated under a curation market mechanism where all participants are fairly rewarded for the network value they contributed. t² acts as the first organization in the Web3 space attempting to monetize human time with the consensus achieved by “Proof of Attention.”
Supercharging the NEAR community
DAO launchpads are gearing up the NEAR community through platforms that allow individuals to join forces with others and create guilds for meaningful projects and more.
Sputnik DAO, a hub for NEAR’s DAO ecosystem, has a platform that welcomes and rewards creators of all kinds within the NEAR community. Individuals can receive rewards by submitting proposals to existing DAOs; it’s that simple. Meanwhile, developers are able to quickly and efficiently build apps on Sputnik because it makes use of coding languages that are already widely known, such as Rust and AssemblyScript. Paired with NEAR’s support and community engagement through Near Academy and the NEAR Certified Developer program, Sputnik is approaching 250 DAOs with a total of nearly 20,000 transactions completed on-chain.
Helping propel NEAR’s ecosystem to become one of crypto’s most diverse, AstroDAO is empowering groups anywhere in the world to make decisions together, collectively. In less than 10 minutes, and without a line of code, everyday users are able to launch a DAO using AstroDAO’s powerful platform. With over 100 DAOs and counting, and a total value locked approaching 90,000 $NEAR, AstroDAO is now a catalyst helping to bring together communities that thrive in the NEAR ecosystem.
The key to AstroDAO’s early success has been an incredibly well-executed feature set, powered by NEAR. Central tenets are democratic by default, ensuring that decisions like distributing funds and member admissions happen through an intuitive and transparent voting process. Thanks to smart contracts, funds are held in a treasury and always distributed through a community-defined process. Who votes and how it works can be precisely configured to meet a community’s preferred operating style.
Extremely versatile, DAOs act as one component of NEAR Protocol’s commitment to facilitate the decentralization of the network to a worldwide community on a fast, secure, and scalable blockchain. Through their democratic, flexible, and collaborative foundations, DAOs are empowering like-minded individuals to come together and take action in a truly global way.
Robust and disruptive in nature, DAOs are here to stay, so prepare yourself as 2022 shapes up to be the year of the DAO, powered by NEAR.
---
id: runtime
title: Runtime
---
This section contains videos that explore the core Runtime, its operation, and how it implements cross-contract calls.
## Runtime Overview
An in-depth code overview of NEAR Runtime.
<iframe
width="560"
height="315"
src="https://www.youtube-nocookie.com/embed/Xi_8PapFCjo"
frameborder="0"
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
allowfullscreen>
</iframe>
## Runtime Action and Data Receipts
An in-depth code review of how NEAR Runtime implements cross-contract calls.
<iframe
width="560"
height="315"
src="https://www.youtube-nocookie.com/embed/RBb3rJGtqOE"
frameborder="0"
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
allowfullscreen>
</iframe>
## Runtime State
An in-depth overview of how NEAR runtime operates with its state.
<iframe
width="560"
height="315"
src="https://www.youtube-nocookie.com/embed/JCkSNL4ie1U"
frameborder="0"
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
allowfullscreen>
</iframe>
Women in Web3 Changemakers: Zoe Leavitt
NEAR FOUNDATION
October 17, 2023
“Trust yourself. Believe in yourself. Having that trust and belief that if there’s something you really believe in, you can really take it forward,” says Zoe Leavitt, founder and CEO of Glass, the first Web3 platform purpose-built for the $1.5 trillion alcohol industry.
Leavitt is one of the 10 finalists of the 2023 Women in Web3 Changemakers List, an annual competition designed to champion women pushing boundaries in the space. Leavitt, along with nine other outstanding candidates, are showing the industry the change women are bringing about during the advent of the Open Web.
Watch an extended NEAR Foundation video interview with Zoe Leavitt below.
The potential of the Open Web
For Leavitt, however, the world of Web3 wasn’t where she started out. While working at the venture capital arm of InBev, the world’s largest brewing company, she saw brands like Budweiser, Michelob Ultra, and Stella Artois spend billions on marketing.
Events, sponsorships, billboards and more, all designed to reach people during crucial purchasing moments and celebratory occasions. But, says Leavitt, there were no real direct digital connections between alcohol brands and alcohol consumers. It was here she saw firsthand how Web3 technology could disrupt the beverage industry as she knew it.
“I really saw massive potential for this new technology to help bridge those gaps, to better connect consumers in that moment to their favourite drinks brands, and then enable folks to earn rewards for their brand engagement and get more value out of their drinks and social occasions.”
Historically, alcohol brands were not permitted to form direct relationships with consumers. But through careful collaboration with regulators and blockchain technology, Glass has emerged as the world’s first regulatory-compliant loyalty service for alcohol brands built on Web3.
Just like you don’t need to know how to code to visit a website, you don’t need to understand Web3 to use GLASS. When users sign up and create an account, a wallet is also generated, creating a secure way for users to accrue points and rewards with their favourite drinks brands.
Neither GLASS nor the brands have access to this wallet, but you can log in any time with your email address and password. No blockchain knowledge is required. What Leavitt discovered is that Web3 technologies offered a new way of thinking about ownership.
In Web3, you and your identity come first, rather than the platform. In other words, when you earn points, those points belong to you, not to GLASS or the brands. Because you own your own points, you can take them with you to access a range of redemption opportunities across sports, music, entertainment, dining, and more. Beverages can enhance social occasions anywhere, any time. Now, your points and rewards can, as well.
Unlocking the Open Web
In a little over a year, Glass has already started working with big brands like Hotaling & Co and Lobos Tequila, a high-end spirit brand backed by LeBron James and Arnold Schwarzenegger.
While helping the alcohol industry make the switch to Web3 technologies has been challenging, Leavitt has had her own challenges too. “While there are of course the technical challenges, there still is the issue of how to communicate this to the average consumer.”
For Leavitt, it was all about focusing on what matters: the customer. Bringing her unique skill set to Web3 brought with it a belief that until you’ve found a compelling reason for someone to try something new, it’s back to the drawing board.
“The consumer doesn’t even need to care if this is on Web3 or not,” says Leavitt. On the Glass website, the technology that powers this disruptive company sits in the background. It’s all about how this improves and enhances the experience for customers. This focus on the end user is something she feels women can and do bring to the Open Web table.
“I feel like with the influx of certainly women, but also the next generation of Web3 founders, we’re seeing a lot more folks come from other industries and identify this real need that crypto can now solve. Women really have this opportunity to bring these new perspectives and bring these Web3 tools into a wider range of use cases and industries.”
Does she have any advice for women thinking about getting into this space?
“Everything doesn’t have to be perfect when you first launch, it’s really about being obsessed with that consumer problem, that industry problem. You have the skills, you have the industry insight, take the plunge.”
Meet the Startups in OWC’s Batch II: Accelerating Web 3.0 Adoption
COMMUNITY
February 10, 2021
At NEAR, we’re excited at the prospect of championing and uplifting promising Web 3.0 founders – it’s the reason the Open Web Collective (OWC) was created. As a protocol-agnostic collective, the OWC puts user-centrism and metric-driven experimentation at the core of what it does. By bringing together the right resources, VCs, and seasoned advisors, NEAR helps founders de-risk, focus, and accelerate their open web projects.
Since Batch I concluded in autumn 2020, the nine startups in the cohort – 1inch Exchange, Arterra, Snowball Money, Leaf Global Fintech, BoxScore, Snark Art, Upshot, Brink Exchange, and OP Games – have become integral players in the Web3 ecosystem.
1inch Exchange is now the second-most-used application in DeFi, according to ETH Gas Station. 1inch has exchanged over $9 billion (USD) in crypto-assets and completed over $700K (USD) in total swaps. In December 2020, 1inch distributed over 90 million governance tokens to their users, liquidity providers, and community, making the application even more decentralized.
SnowSwap is an automated market maker and decentralized exchange for yield-bearing liquidity provider token swaps. As of February 4, 2021, SnowSwap’s total value locked exceeded $1.5 million (USD). Among the novel financial instruments introduced by SnowSwap is the eth2SNOW pool designed to incentivize ETH2 adoption. The eth2SNOW pool allows users to enter, exit, and swap between popular ETH2 staking services with low slippage.
Leaf Global Fintech relaunched its Leaf Wallet in September 2020. Since then, Leaf Global has helped over 1,410 refugees in Kenya, Rwanda, and Uganda gain access to digital financial services. As of December 2020, Leaf has successfully completed over 5,600 transactions worth $32,595 (USD).
NEAR is excited to share the participants in Batch II. This cohort’s members are focused across three innovation areas to reflect the stages and target markets of this stellar group of builders.
The Decentralized Revolution Continues with Batch II
Meet our Open Web Builders
Open web teams are builders focused on mission-critical infrastructure for the open web, such as privacy, accessibility, data management, and identity.
Nym is building the next generation of privacy infrastructure for Web3. Concerns about Big Tech, privacy, and data intrusion are driving consumers to adopt a privacy-conscious attitude. The Nym mixnet is fully decentralized, with no trusted third parties such as a VPN. Nym supports a high volume of service and low latency through network incentives.
Verida (Ver/id/a = Verified, Identity, Data) is developing the future of decentralized data. Verida enables developers to build web and mobile apps where users own their data. Their library provides decentralized single sign-on, data sharing, and profile management that fully integrates with the Verida Vault.
Sarcophagus is a general-purpose decentralized dead man’s switch built on Ethereum and Arweave. It is triggered if the human operator becomes incapacitated due to death, loss of consciousness, or being physically removed from control. Use cases include but are not limited to wills and trusts, password recovery, credential passdown, political activism, key material backup, and emergency communication.
Meet our Open Finance Builders
Open finance teams are pioneers emerging from stealth mode and challenging traditional finance. The future of fintech is interoperable, user-centric, transparent, and accessible to everyone.
Kamix helps the African diaspora send remittances to family for free. Currently operating in Cameroon, they will soon launch in Nigeria.
Heo Finance is the first stablecoin-powered crowdfunding platform that pays out rewards to donors. Each donation results in daily farming of tokens that allows donors to receive 2X to 10X their donation in return.
Meet our DApp Builders
DApps are the next generation of web applications, promising the convenience we’ve come to expect from traditional applications along with new key features that create trustless environments, lower barriers to services, and develop open markets with liquidity.
Vezt is the first mobile app where music fans can share royalty rights for songs and recordings. Users can buy royalty rights for artists such as Maluma, Panic! At The Disco, and Blond Don.
PESA is creating a decentralized carrier that transforms voice, cellular, and WiFi networks into tradable resources. Currently, PESA has a partnership with 37 airlines for in-flight internet and serves more than 68 million hotspots.
Hash Rush is an online sci-fi/fantasy RTS “Play-to-Earn” blockchain-enabled game. VZ studio will transition to a platform-as-a-service in the future, empowering game developers with a suite of tools to create in-game economies using blockchain.
MintGate is turning any online content into exclusive rewards. MintGate’s token economy incentivizes early followers to use their network and resources to support a community or a creator, and then redistributes the added-value to the backers.
Paras lets users create, trade, and collect digital art cards on NEAR Protocol. Since its release in December 2020, Paras has onboarded 70+ artists, who have released 472 minted cards with a trading volume of over $35,000 USD from 700+ transactions.
OWC Batch II cohort member teams and founders
Partner & Mentor Spotlight
A critical part of the OWC mission and value comes from the mentors who work with the cohort teams throughout their OWC journeys. These mentors, guest speakers, and advisors are an essential part of the OWC culture. By sharing their expertise, wisdom, and guidance, they create real, lasting relationships and results for the next generation of Web3 founders.
The current OWC Mentors and Partners are: Maria Shen, Jesse Walden, Kartik Talwar, Tara Tan, Jonathan Kol, Jake Brukhman, Ivan Bogatty, Evan Feng, Tom Serres, James Zhang, Joyce Yang, Jocy Lin, Damir Bandalo, Pantera Capital, Brex, CoinFund, Bollinger Investment Group, AdColony, and GreenField One.
Some of the OWC Batch II Mentors
Congratulations to all the members of Batch II! These are the Web3 teams to watch. Stay tuned for more updates from the teams.
Every Tuesday, OWC invites builders, investors, and entrepreneurs to share experiences and expertise on the OWC podcast and YouTube. We also invite you to join our community by subscribing to our newsletter and following us on Twitter.
---
id: functions
title: External Interface
hide_table_of_contents: true
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
import {CodeTabs, Language, Github} from "@site/src/components/codetabs"
import CodeBlock from '@theme/CodeBlock'
import {ExplainCode, Block, File} from '@site/src/components/CodeExplainer/code-explainer';
Smart contracts expose functions so users can interact with them. There are different types of functions including `read-only`, `private` and `payable`.
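Before walking through the annotated example below, here is a minimal, self-contained sketch (a simple counter, not the auction contract referenced later) showing how each kind of function can be declared with near-sdk-js:
```js
import { NearBindgen, near, call, view, initialize } from "near-sdk-js";

@NearBindgen({})
class Counter {
  value = 0;

  // Initialization: can only be called once, and only by the contract's account
  @initialize({ privateFunction: true })
  init({ start }) {
    this.value = start;
  }

  // State-changing: requires a signed transaction
  @call({})
  increment() {
    this.value += 1;
    near.log(`Incremented by ${near.predecessorAccountId()}`);
  }

  // Payable: may receive attached NEAR Tokens
  @call({ payableFunction: true })
  donate() {
    near.log(`Received ${near.attachedDeposit()} yoctoNEAR`);
  }

  // Read-only: free to call, no NEAR account needed
  @view({})
  get_value() {
    return this.value;
  }

  // No decorator: internal helper, not exposed in the contract's interface
  double(x) {
    return x * 2;
  }
}
```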
<ExplainCode languages={["js", "rust"]}>
<Block highlights={{"js": "14-17,20-39,42-44,47-49", "rust": "22-30,33-58,60-62,64-66"}} fname="auction">
### Contract's Interface
All public functions in the contract are part of its interface. They can be called by anyone, and are the only way to interact with the contract
</Block>
<Block highlights={{"js":"13-17", "rust": "22-30"}} fname="auction">
### Initialization Functions
A contract can opt to have an initialization function. If present, this function must be called before any other to [initialize the contract](./storage.md)
</Block>
<Block highlights={{"js": "13"}} fname="auction">
#### `@initialize({ privateFunction: true })`
The initialization function is marked with the `@initialize` decorator
</Block>
<Block highlights={{"rust": "20"}} fname="auction">
#### `#[init]`
The initialization function is marked with the `#[init]` macro in Rust
</Block>
<Block highlights={{"js":"14-17", "rust": "33-58"}} fname="auction">
### State Changing Functions
The functions that modify the [state](./storage.md) or perform [actions](./actions.md) need to be called by a user with a NEAR account, since a transaction is required to execute them
</Block>
<Block highlights={{"js": "19"}} fname="auction">
#### `@call`
State changing functions are marked with the `@call` decorator
</Block>
<Block highlights={{"rust": "33"}} fname="auction">
#### `&mut self`
State changing functions are those that take a **mutable** reference to `self` in Rust
</Block>
<Block highlights={{"js": "22,26", "rust": "36,42"}} fname="auction">
**Note:** The SDK provides [contextual information](./environment.md), such as which account is calling the function, or what time it is
</Block>
<Block highlights={{"js":"42-44,47-49", "rust": "60-62,64-66"}} fname="auction">
### Read-Only Functions
Contract's functions can be read-only, meaning they don't modify the state. Calling them is free for everyone, and does not require a NEAR account
</Block>
<Block highlights={{"js": "41,46"}} fname="auction">
#### `@view`
Read-only functions are marked with the `@view` decorator in TS/JS
</Block>
<Block highlights={{"rust": "60,64"}} fname="auction">
#### `&self`
Read-only functions are those that take an **immutable** reference to `self` in Rust
</Block>
<Block highlights={{"js":"13", "rust": "21"}} fname="auction">
### Private Functions
Often you will want functions that **are exposed** as part of the contract's interface, but **should not be called directly** by users
Besides initialization functions, [callbacks from cross-contract calls](./crosscontract.md) should always be `private`
These functions are marked as `private` in the contract's code, and can only be called by the contract itself
</Block>
<Block highlights={{"js": "13"}} fname="auction">
#### `decorator({privateFunction: true})`
Private functions are marked by setting `privateFunction: true` in the `@call` or `@initialize` decorators
</Block>
<Block highlights={{"rust": "21"}} fname="auction">
#### `#[private]`
Private functions are marked using the `#[private]` macro in Rust
</Block>
<Block highlights={{"js":"19,25", "rust": "32,41"}} fname="auction">
### Payable Functions
By default, functions will panic if the user attaches NEAR Tokens to the call. Functions that accept NEAR Tokens must be marked as `payable`
Within the function, the user will have access to the [attached deposit](./environment.md)
</Block>
<Block highlights={{"js": "19,25"}} fname="auction">
#### `@call({payableFunction: true})`
Payable functions are marked by setting `payableFunction: true` in the `@call` decorator
</Block>
<Block highlights={{"rust": "32,41"}} fname="auction">
#### `#[payable]`
Payable functions are marked using the `#[payable]` macro in Rust
</Block>
<Block highlights={{"js":"3-5"}} fname="example">
### Internal Functions
All the functions we covered so far are part of the interface, meaning they can be called by an external actor
However, contracts can also have private internal functions - such as helper or utility functions - that are **not exposed** to the outside world
To create internal private methods in a JS contract, simply omit the `@view` and `@call` decorators
</Block>
<Block highlights={{"rust": "5-7"}} fname="example">
### Internal Functions
All the functions we covered so far are part of the interface, meaning they can be called by an external actor
However, contracts can also have private internal functions - such as helper or utility functions - that are **not exposed** to the outside world
To create internal private methods in a Rust contract, do not declare them as public (`pub fn`)
</Block>
<Block highlights={{"rust": "9-11,13-15"}} fname="example">
### Pure Functions
Pure functions are a special kind of function that do not need to access data from the state
They are useful for returning hardcoded values stored in the contract
</Block>
<File language="js" fname="auction" url="https://github.com/near-examples/auction-examples/blob/main/contract-ts/src/contract.ts" start="2" end="51" />
<File language="rust" fname="auction" url="https://github.com/near-examples/auction-examples/blob/main/contract-rs/src/lib.rs" start="2" end="68" />
<CodeBlock language="js" fname="example">
```js
@NearBindgen({})
class Contract {
helper_function(params... ){
// this function cannot be called from the outside
}
@view({})
interface_view(params...){
// this function can be called from outside
}
@call({privateFunction: true})
interface_private(params...){
// this function can be called from outside, but
// only by the contract's account
}
}
```
</CodeBlock>
<CodeBlock language="rust" fname="example">
```rs
const SOME_VALUE: u64 = 8;
#[near_bindgen]
impl MyContractStructure {
fn internal_helper(&mut self, params... ){
// this function cannot be called from the outside
}
pub fn public_log(/* Parameters here */) {
near_sdk::log!("inside log message");
}
pub fn return_static_u64() -> u64 {
SOME_VALUE
}
}
```
</CodeBlock>
</ExplainCode>
|
---
id: best-practices
title: Indexing best practices
sidebar_label: Best Practices
---
In this article you can find suggested best practices when building blockchain indexers using [QueryAPI](intro.md).
If you're planning to design a production-ready indexer, please check the recommendations for [indexing development](#indexing-development) and [database design](#database-design).
## Indexing development
This section presents a recommended workflow when building a new indexer using QueryAPI.
### Design APIs for your UIs
If your application requires front-end User Interfaces (UIs), your first step should be to define the APIs that will be used by your front-end UI. The main objective here is to reduce the overall number of requests that each page makes to render all content. Once you define these APIs, you will have a good overview of the data that you need to index from the blockchain.
### Create a Database
Once you have a better idea of the indexed data, you can design the database to store the indexing results.
When defining your SQL database schema, consider these recommendations:
- Design for `UPSERT`s, so that indexed data can be replaced if needed
- Use foreign keys between entities for GraphQL linking
- Think of indexes (e.g. by accounts, by dates, etc.)
- Use views to generate more GraphQL queries – when you `CREATE VIEW`, QueryAPI generates a separate GraphQL query for it.
:::tip
Check the [Database design section](#database-design) to learn how to design optimal database schemas for your indexer.
:::
### Find blocks to test on
Using block explorer tools such as [NearBlocks](https://nearblocks.io/), you can find a few `block_height`s with transactions to your smart contracts that you need to index. These example blocks will help you to test and debug while writing your indexer code.
### Write JS code and debug
1. Start from a simple [`indexingLogic.js`](index-function.md) to get blockchain data dumped in a database, in a raw form. For example, start by getting the [FunctionCall](../../2.smart-contracts/anatomy/actions.md#function-call)'s arguments from the smart contract that you want to index. Then, use the [GraphQL playground](index-function.md#mutations-in-graphql) to understand the raw dump and further analyze the data.
![Playground](/docs/assets/QAPIScreen.gif)
:::tip
- Check the [NEAR Lake Primitives](https://near.github.io/near-lake-framework-js/) documentation
- Use [`context.db`](context.md#db) object to access your database tables
- Write logs
:::
2. Once you have figured out a good logic to process the raw data, test the processing logic by [enabling debug mode](index-function.md#local-debug-mode) in the `indexingLogic.js` editor, and set a list of block heights that contains different cases that your processing logic must handle.
![QueryAPI Dashboard](/docs/assets/QAPIdebug.png)
3. Once your indexing logic extracts all data correctly as expected, you might find that you need to create new tables or change your schema to better organize the data. In that case, fork the indexer, change the SQL schema, and update the indexer logic to process and store structured data.
4. If there were changes in the smart contracts, e.g. changes in method and event definitions, you might need to implement conditional logic on the block height, as sketched below.
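For instance, here is a minimal sketch of such conditional logic. The upgrade height, argument names, and helper function are hypothetical and only illustrate the branching pattern:

```js
// Hypothetical sketch: suppose a contract upgrade renamed the argument
// `sender_id` to `author_id` at some block height. The indexer normalizes
// the arguments by branching on the block height being processed.
const UPGRADE_BLOCK_HEIGHT = 105_000_000; // hypothetical upgrade height

function normalizeMessageArgs(args, blockHeight) {
  if (blockHeight < UPGRADE_BLOCK_HEIGHT) {
    // argument shape used by the old contract version
    return { text: args.message, author: args.sender_id };
  }
  // argument shape used after the contract upgrade
  return { text: args.text, author: args.author_id };
}
```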
### Deploy code and check logs
Make sure to wrap the processing of each block in `try { } catch { }`. In the `catch` block, log the problematic blocks so you can debug them later by enabling debug mode (set the block `height` of the problematic blocks and run a local debug).
```js
try {
console.log("Creating a Post Snapshot");
const postData = {
post_id: post_id,
account_id: accountId,
block_height: block_height,
};
await context.db.Posts.insert(postData);
console.log(
`Post Snapshot with post_id ${post_id} at block_height ${block_height} has been added to the database`
);
return null;
} catch (e) {
console.log(
`Error creating Post Snapshot with post_id ${post_id} at block_height ${block_height}: ${e}`
);
return e;
}
```
### Fix bugs and redeploy
You may have to do several iterations to fix all the bugs in your indexer so that it processes every block correctly. Currently, QueryAPI does not allow you to clean the database or change the schema, so you will need to fork your indexer, update the schema or `indexingLogic`, and try again. The new indexer can be named `YourIndexerName_v2`, `YourIndexerName_v3`, `..._v4`, and so on. If you don't do that, your new indexing logic will re-run on old blocks, and if you don't handle re-indexing in your `indexingLogic.js`, the same old data will be inserted again into the database, causing further errors. One way to handle re-indexing is sketched below.
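A minimal sketch of making re-runs idempotent, assuming a `posts` table keyed by `post_id` and that your QueryAPI environment exposes an upsert helper on [`context.db`](context.md#db) (table and column names are hypothetical):

```js
// Sketch only: upserting on the primary key means replaying old blocks
// overwrites existing rows instead of duplicating them.
await context.db.Posts.upsert(
  [{ post_id, account_id: accountId, block_height }],
  ["post_id"],                    // conflict target: the table's primary key
  ["account_id", "block_height"]  // columns to update when the row already exists
);
```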
:::tip
Remember to clean out old, unused indexers. If you get `YourIndexerName_v8` to work, delete `..._v7`, `..._v6`, and so on, so that they free up the resources taken from QueryAPI workers.
:::
### Generate GraphQL queries and export
When your indexer is deployed and ready, you can generate and export GraphQL queries that can be used in your front-end application, NEAR component, or any other integration.
To generate GraphQL queries:
- [Use GraphiQL playground](index-function.md#mutations-in-graphql)
- Click through and debug queries
- [Use code exporter to NEAR components](index-function.md#create-a-bos-component-from-query)
- Change `query` to `subscription` for WebSockets
## Database design
Designing an optimal database schema depends on the type of indexer that you want to build.
Focusing on the two most common blockchain indexing use cases, you can consider:
- a database schema for an indexer doing blockchain analytics, reporting, business intelligence, and big-data queries.
- a database schema for an indexer built as a backend for a web3 dApp with interactive and responsive UIs, which tracks interactions with a specific smart contract.
:::info
QueryAPI uses [PostgreSQL 14.9](https://www.postgresql.org/docs/14/index.html). You can find additional documentation about the PostgreSQL data definition language [in this link](https://postgrespro.com/docs/postgresql/14/ddl).
:::
### Schema for Blockchain analytics
- Consider using summary tables for precomputed analytics. Example:
```sql
CREATE TABLE summary_account_transactions_per_day (
dim_signer_account_id TEXT NOT NULL,
dim_transaction_date DATE NOT NULL,
metric_total_transactions BIGINT NOT NULL,
PRIMARY KEY (dim_signer_account_id, dim_transaction_date)
);
INSERT INTO summary_account_transactions_per_day (dim_signer_account_id, dim_transaction_date, metric_total_transactions)
SELECT
t.signer_account_id AS dim_signer_account_id,
t.transaction_date AS dim_transaction_date,
COUNT(*) AS metric_total_transactions
FROM
transactions t
WHERE
t.transaction_date = CURRENT_DATE
GROUP BY
t.signer_account_id, t.transaction_date
ON CONFLICT (dim_signer_account_id, dim_transaction_date)
DO UPDATE SET
metric_total_transactions = EXCLUDED.metric_total_transactions;
```
- If you want to do a SQL `JOIN` query, use a `VIEW`. For example:
```sql
CREATE VIEW
posts_with_latest_snapshot AS
SELECT
ps.post_id,
p.parent_id,
p.author_id,
ps.block_height,
ps.editor_id,
ps.labels,
ps.post_type,
ps.description,
ps.name,
ps.sponsorship_token,
ps.sponsorship_amount,
ps.sponsorship_supervisor
FROM
posts p
INNER JOIN (
SELECT
post_id,
MAX(block_height) AS max_block_height
FROM
post_snapshots
GROUP BY
post_id
) latest_snapshots ON p.id = latest_snapshots.post_id
INNER JOIN post_snapshots ps ON latest_snapshots.post_id = ps.post_id
AND latest_snapshots.max_block_height = ps.block_height;
```
### Schema for interactive UIs
#### Indexing
Add indexes for efficient querying. Example:
```sql
CREATE INDEX idx_transactions_signer_account_id ON transactions(signer_account_id);
```
#### Partitioning
Utilize partitioning for large tables.
```sql
CREATE TABLE transactions_partitioned_by_account_id (
transaction_hash text NOT NULL,
signer_account_id text NOT NULL,
receipt_conversion_tokens_burnt numeric(45) NULL,
PRIMARY KEY (signer_account_id, transaction_hash)
)
PARTITION BY LIST (signer_account_id);
```
|
# Multi Token Metadata
:::caution
This is part of the proposed spec [NEP-245](https://github.com/near/NEPs/blob/master/neps/nep-0245.md) and is subject to change.
:::
Version `1.0.0`
## Summary
An interface for a multi token's metadata. The goal is to keep the metadata future-proof as well as lightweight. This will be important to dApps needing additional information about multi token properties, and broadly compatible with other token standards such that the [NEAR Rainbow Bridge](https://near.org/blog/eth-near-rainbow-bridge/) can move tokens between chains.
## Motivation
The primary value of tokens comes from their metadata. While the [core standard](Core.md) provides the minimum interface that can be considered a multi token, most artists, developers, and dApps will want to associate more data with each token, and will want a predictable way to interact with any MT's metadata.
NEAR's unique [storage staking](https://docs.near.org/concepts/storage/storage-staking) approach makes it feasible to store more data on-chain than other blockchains. This standard leverages this strength for common metadata attributes, and provides a standard way to link to additional offchain data to support rapid community experimentation.
This standard also provides a `spec` version. This makes it easy for consumers of Multi Tokens, such as marketplaces, to know if they support all the features of a given token.
Prior art:
- NEAR's [Fungible Token Metadata Standard](../FungibleToken/Metadata.md)
- NEAR's [Non-Fungible Token Metadata Standard](../NonFungibleToken/Metadata.md)
- Discussion about NEAR's complete NFT standard: #171
- Discussion about NEAR's complete Multi Token standard: #245
## Interface
Metadata applies at both the class level (`MTBaseTokenMetadata`) and the specific instance level (`MTTokenMetadata`). The relevant metadata for each:
```ts
type MTContractMetadata = {
spec: string, // required, essentially a version like "mt-1.0.0"
name: string, // required, ex. "Zoink's Digital Sword Collection"
}
type MTBaseTokenMetadata = {
name: string, // required, ex. "Silver Swords" or "Metaverse 3"
id: string, // required, a unique identifier for the metadata
symbol: string|null, // required, ex. "MOCHI"
icon: string|null, // Data URL
decimals: string|null, // number of decimals for the token, useful for FT-related tokens
base_uri: string|null, // Centralized gateway known to have reliable access to decentralized storage assets referenced by `reference` or `media` URLs
reference: string|null, // URL to a JSON file with more info
copies: number|null, // number of copies of this set of metadata in existence when token was minted.
reference_hash: string|null, // Base64-encoded sha256 hash of JSON from reference field. Required if `reference` is included.
}
type MTTokenMetadata = {
title: string|null, // ex. "Arch Nemesis: Mail Carrier" or "Parcel #5055"
description: string|null, // free-form description
media: string|null, // URL to associated media, preferably to decentralized, content-addressed storage
media_hash: string|null, // Base64-encoded sha256 hash of content referenced by the `media` field. Required if `media` is included.
issued_at: string|null, // When token was issued or minted, Unix epoch in milliseconds
expires_at: string|null, // When token expires, Unix epoch in milliseconds
starts_at: string|null, // When token starts being valid, Unix epoch in milliseconds
updated_at: string|null, // When token was last updated, Unix epoch in milliseconds
extra: string|null, // Anything extra the MT wants to store on-chain. Can be stringified JSON.
reference: string|null, // URL to an off-chain JSON file with more info.
reference_hash: string|null // Base64-encoded sha256 hash of JSON from reference field. Required if `reference` is included.
}
type MTTokenMetadataAll = {
base: MTBaseTokenMetadata
token: MTTokenMetadata
}
```
A new set of functions MUST be supported on the MT contract:
```ts
// Returns the top-level contract metadata
function mt_metadata_contract(): MTContractMetadata {}
function mt_metadata_token_all(token_ids: string[]): MTTokenMetadataAll[]
function mt_metadata_token_by_token_id(token_ids: string[]): MTTokenMetadata[]
function mt_metadata_base_by_token_id(token_ids: string[]): MTBaseTokenMetadata[]
function mt_metadata_base_by_metadata_id(base_metadata_ids: string[]): MTBaseTokenMetadata[]
```
A new attribute MUST be added to each `Token` struct:
```diff
type Token = {
token_id: string,
+ token_metadata?: MTTokenMetadata,
+ base_metadata_id: string,
}
```
### An implementing contract MUST include the following fields on-chain
For `MTContractMetadata`:
- `spec`: a string that MUST be formatted `mt-1.0.0` to indicate that a Multi Token contract adheres to the current versions of this Metadata spec. This will allow consumers of the Multi Token to know if they support the features of a given contract.
- `name`: the human-readable name of the contract.
### An implementing contract MUST include the following fields on-chain
For `MTBaseTokenMetadata`:
- `name`: the human-readable name of the Token.
- `base_uri`: Centralized gateway known to have reliable access to decentralized storage assets referenced by `reference` or `media` URLs. Can be used by other frontends for initial retrieval of assets, even if these frontends then replicate the data to their own decentralized nodes, which they are encouraged to do.
### An implementing contract MAY include the following fields on-chain
For `MTBaseTokenMetadata`:
- `symbol`: the abbreviated symbol of the contract, like MOCHI or MV3
- `icon`: a small image associated with this contract. Encouraged to be a [data URL](https://developer.mozilla.org/en-US/docs/Web/HTTP/Basics_of_HTTP/Data_URIs), to help consumers display it quickly while protecting user data. Recommendation: use [optimized SVG](https://codepen.io/tigt/post/optimizing-svgs-in-data-uris), which can result in high-resolution images with only 100s of bytes of [storage cost](https://docs.near.org/concepts/storage/storage-staking). (Note that these storage costs are incurred to the contract deployer, but that querying these icons is a very cheap & cacheable read operation for all consumers of the contract and the RPC nodes that serve the data.) Recommendation: create icons that will work well with both light-mode and dark-mode websites by either using middle-tone color schemes, or by [embedding `media` queries in the SVG](https://timkadlec.com/2013/04/media-queries-within-svg/).
- `reference`: a link to a valid JSON file containing various keys offering supplementary details on the token. Example: "/ipfs/QmdmQXB2mzChmMeKY47C43LxUdg1NDJ5MWcKMKxDu7RgQm", etc. If the information given in this document conflicts with the on-chain attributes, the values in `reference` shall be considered the source of truth.
- `reference_hash`: the base64-encoded sha256 hash of the JSON file contained in the `reference` field. This is to guard against off-chain tampering.
- `copies`: The number of tokens with this set of metadata or `media` known to exist at time of minting. Supply is a more accurate current reflection.
For `MTTokenMetadata`:
- `title`: The title of this specific token.
- `description`: A longer description of the token.
- `media`: URL to associated media. Preferably to decentralized, content-addressed storage.
- `media_hash`: the base64-encoded sha256 hash of content referenced by the `media` field. This is to guard against off-chain tampering.
- `copies`: The number of tokens with this set of metadata or `media` known to exist at time of minting.
- `issued_at`: Unix epoch in milliseconds when token was issued or minted (an unsigned 32-bit integer would suffice until the year 2106)
- `expires_at`: Unix epoch in milliseconds when token expires
- `starts_at`: Unix epoch in milliseconds when token starts being valid
- `updated_at`: Unix epoch in milliseconds when token was last updated
- `extra`: anything extra the MT wants to store on-chain. Can be stringified JSON.
- `reference`: URL to an off-chain JSON file with more info.
- `reference_hash`: Base64-encoded sha256 hash of JSON from reference field. Required if `reference` is included.
For `MTTokenMetadataAll`:
- `base`: The base metadata that corresponds to `MTBaseTokenMetadata` for the token.
- `token`: The token specific metadata that corresponds to `MTTokenMetadata`.
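As a purely illustrative example (all names, URLs, and values below are made up), a base metadata record and a token metadata record satisfying the interfaces above might look like:

```js
const base_metadata = {
  name: "Silver Swords",              // human-readable name of the token class
  id: "base-metadata-1",              // unique identifier for this metadata
  symbol: "SWORD",
  icon: null,
  decimals: null,
  base_uri: "https://example-gateway.example/ipfs", // hypothetical gateway
  reference: null,
  copies: 1000,
  reference_hash: null,
};

const token_metadata = {
  title: "Arch Nemesis: Mail Carrier",
  description: "A limited-edition card from the example collection",
  media: "ipfs://<content-address-of-the-media>",        // placeholder
  media_hash: "<base64-encoded sha256 of the media>",    // required because `media` is set
  issued_at: "1640995200000", // Unix epoch in milliseconds
  expires_at: null,
  starts_at: null,
  updated_at: null,
  extra: null,
  reference: null,
  reference_hash: null,
};
```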
### No incurred cost for core MT behavior
Contracts should be implemented in a way to avoid extra gas fees for serialization & deserialization of metadata for calls to `mt_*` methods other than `mt_metadata*` or `mt_tokens`. See `near-contract-standards` [implementation using `LazyOption`](https://github.com/near/near-sdk-rs/blob/c2771af7fdfe01a4e8414046752ee16fb0d29d39/examples/fungible-token/ft/src/lib.rs#L71) as a reference example.
## Drawbacks
* When this MT contract is created and initialized, the storage use per-token will be higher than an MT Core version. Frontends can account for this by adding extra deposit when minting. This could be done by padding with a reasonable amount, or by the frontend using the [RPC call detailed here](https://docs.near.org/docs/develop/front-end/rpc#genesis-config) that gets the genesis configuration to determine precisely how much deposit is needed.
* Enforcement of the convention that `icon` is a data URL rather than a link to an HTTP endpoint that could contain privacy-violating code cannot be done on deploy or update of contract metadata, and must be done on the consumer/app side when displaying token data.
* If on-chain icon uses a data URL or is not set but the document given by `reference` contains a privacy-violating `icon` URL, consumers & apps of this data should not naïvely display the `reference` version, but should prefer the safe version. This is technically a violation of the "`reference` setting wins" policy described above.
## Future possibilities
- Detailed conventions that may be enforced for versions.
- A fleshed out schema for what the `reference` object should contain.
|
Limitations of Zilliqa’s Sharding approach
DEVELOPERS
August 29, 2018
Zilliqa published a blog post on their sharding design today.
It is evident that if you don’t have sharding from day 1, your blockchain has no chance of scaling with adoption. Building sharding after the fact is extremely hard. For instance, Ethereum has one of the strongest engineering teams in the blockchain space, and yet their sharding release was pushed back yet again. In their situation, integrating sharding into the system is similar to changing a car engine while the car is driving.
Zilliqa is one of the very few protocols that promises sharding, thus we followed it closely from the beginning.
I was engineer #1 at a database company called MemSQL. MemSQL builds a distributed analytics platform that has large clusters deployed at Goldman Sachs, Uber, Comcast, Akamai, and many other enterprise companies. While at MemSQL, I was responsible for its sharding implementation. Since then, I’ve co-founded Near Protocol, which has two other early MemSQL engineers who were responsible for cross-shard transactions and complex distributed joins, as well as four ex-Google engineers.
In the past, we have built sharding that powers large clusters in production and processes millions of transactions per second per aggregator node. From our experiences, we know well how to implement sharding on complex systems and what practical issues it will have.
To come back to the Zilliqa post, the essence of their message can be summarized in several bullet points:
Execute all the single-shard transactions in parallel;
Do not execute transactions that affect the same smart contract in parallel;
Do not execute any transaction that affects more than one shard in parallel with any other transaction.
Besides that, while not explicitly stated in the blog post, it follows that Zilliqa doesn’t shard state (this Ethereum FAQ provides the explanation of difference between sharded state and sharded processing).
Executing only single-shard transactions in parallel
Executing in parallel only those transactions for which the transaction initiator and the smart contract are on the same shard might not be a big problem. In Fleta, the payments are entirely designed around the idea that shards can be treated interchangeably. It doesn’t quite work the same way for Zilliqa, since in Fleta the shard is dictated by the sender, while in Zilliqa it is dictated by the shard of the contract, but it suggests that a similar idea might be applicable.
No state sharding
Not sharding the state makes life easier. For example, if the state is sharded, then even the very first example in Zilliqa’s blog post becomes obsolete: assigning the payment to the shard of the sender would not be enough, since the shard of the sender would not be able to update the state for the receiver. As a result, a task as simple as processing payments becomes very complex once the state is sharded. However, it is also worth noting that even in the absence of sharding by state, assigning payments to the sender’s shard only works if the accounts are represented as UTXOs. If accounts store the accumulated amount, then two shards processing transactions with the same receiver will apply conflicting updates to the receiver’s account.
Nevertheless, not sharding by state, while it simplifies the system design, imposes a huge limit on the scalability of the system. The only reason Ethereum nodes can still store the entire state is that Ethereum only processes 14 transactions per second. Once a system processes thousands of transactions per second, the state will explode, since transactions do leave a trace on the state. Introducing sharding by state later will be as hard as introducing sharded processing into modern non-sharded blockchain protocols.
Not executing transactions that affect the same smart contract in parallel
Similarly, not sharding smart contract processing, while making the implementation simpler, limits the scalability of the protocol. Ultimately, in any ecosystem only a few applications dominate the usage, and as Zilliqa scales to thousands of shards, the five top dApps will have to reside in five shards and be limited by each shard’s processing power (and its storage, once state sharding is introduced).
With the limitations described above and while also not processing contracts that by design affect multiple shards in parallel, Zilliqa will just make another incremental change in the landscape of scalable blockchains. They might outperform EOS, Thunder, and Algorand (or at least provide better decentralization than the former two), but are not future-proof, and such limitations will prevent them from scaling with the demand for the decentralized applications platform.
The area of research concerned with the execution of distributed transactions has a long history, and should not be ignored in the development of sharded blockchain protocols.
For example, implementations of Map-Reduce, or generally engines that involve parallel processing, shuffles, and aggregations, have been used for parallel execution of complex transactions for more than a decade.
Why then do we not see an emergence of sharded blockchain protocols that are powered by techniques proven in the industry? The primary reason is that building distributed systems in the presence of failures is an extremely complex engineering task. The number of production-tested distributed database systems that are not coming from engineering giants such as Amazon, Microsoft, Google or Facebook, who have access to the best-distributed systems engineering talent, is very small.
From this perspective, Near Protocol, with its exceptional team of distributed engineers is uniquely positioned to build a sharded decentralized applications engine.
At this stage, we do not have our sharding technical paper finished — but we will release it soon. The way we develop our approach is more practical in nature: we first build a prototype to test all of our hypotheses. In a field as complex as distributed systems, writing a whitepaper before having a working implementation is often a rushed decision, although it seems to be a widely adopted approach for blockchain projects.
At a high level, transactions in Near are split into a series of parallel “map” steps, interleaved by “shuffle” steps, and the state and execution of each smart contract is sharded. This enables execution of arbitrarily complex programs in parallel. Near also doesn’t introduce its own programming language and instead relies on the entire ecosystem of transpilers to WebAssembly, as well as access to the state in the form of SQL queries.
Stay tuned for our sharding technical paper!
To follow our progress you can use:
Twitter — https://twitter.com/nearprotocol,
Medium — https://medium.com/nearprotocol
https://upscri.be/633436/
Thanks to Bowen Wang, Aliaksandr Hudzilin, and Mikhail Kever for helping put together this post. |
```js
import { Wallet } from './near-wallet';
const AMM_CONTRACT_ADDRESS = "v2.ref-finance.near";
const wallet = new Wallet({ createAccessKeyFor: AMM_CONTRACT_ADDRESS });
await wallet.callMethod({
method: 'swap',
args: {
actions: [
{
pool_id: 79,
token_in: "token.v2.ref-finance.near",
token_out: "wrap.near",
amount_in: "100000000000000000",
min_amount_out: "1",
},
],
},
contractId: AMM_CONTRACT_ADDRESS,
gas: 300000000000000,
deposit: 1
});
```
_The `Wallet` object comes from our [quickstart template](https://github.com/near-examples/hello-near-examples/blob/main/frontend/near-wallet.js)_
<details>
<summary>Example response</summary>
```json
"5019606679394603179450"
```
</details> |
NEAR Web Wallet Security Update
COMMUNITY
August 4, 2022
The recent wallet hacks on other platforms have brought to light potentially serious security issues connected to the use of common analytics tools in Web3. In light of those hacks, we are sharing a perspective on a recent experience involving similar tools.
On June 6th, 2022, the NEAR Wallet team received a bug report indicating that sensitive information had been shared with a third party. The issue was fixed promptly the same day.
While the team was aware of this threat, and careful to sanitize data collected by the third party service, a code change nevertheless resulted in the collection of sensitive data for some users who had used email or SMS recovery with their wallets. Thankfully, @Hacxyk caught this before us and submitted the finding to our security team on June 6th (for which they have earned a bounty). The wallet team immediately remediated the situation, scrubbed all sensitive data, and identified any personnel who could have had the ability to access this data.
To date, we have found no indicators of compromise related to the accidental collection of this data, nor do we have reason to believe this data persists anywhere.
Regardless, we no longer allow users to create accounts using email or SMS for account recovery. Despite having no evidence of compromise, we strongly recommend that users who have used email or SMS recovery options in the past rotate their keys. This can be accomplished by visiting wallet.near.org and either enabling a Ledger device (your most secure option and highly recommended) or enabling passphrase security. After doing this, users should disable email or SMS recovery.
With the transition of the open source wallet codebase to the team at My NEAR Wallet, many improvements are in the works.
We remind our users that the security of wallet accounts is of utmost importance to us–and it doesn’t stop with us either. User choices and behavior also impact security. Please consider using a hardware device, like a Ledger, to secure your wallet. Use only trusted and secured devices when creating and accessing your wallet. Never give out your recovery phrase or private keys.
You can learn more about the future of wallet.near.org here:
blog/near-opens-the-door-to-more-wallets/
The MyNearWallet team is also actively improving the security of wallets as outlined here:
https://medium.com/mynearwallet-blog/mynearwallet-security-statement-fd24265d91f2 |
---
NEP: 297
Title: Events
Author: Olga Telezhnaya <[email protected]>
DiscussionsTo: https://github.com/near/NEPs/issues/297
Status: Final
Type: Standards Track
Category: Contract
Created: 03-Mar-2022
---
## Summary
Events format is a standard interface for tracking contract activity.
This document is a meta-part of other standards, such as [NEP-141](https://github.com/near/NEPs/issues/141) or [NEP-171](https://github.com/near/NEPs/discussions/171).
## Motivation
Apps usually perform many similar actions.
Each app may have its own way of performing these actions, introducing inconsistency in capturing these events.
NEAR and third-party applications need to track these and similar events consistently.
If not, tracking state across many apps becomes infeasible.
Events address this issue, providing other applications with the needed standardized data.
Initial discussion is [here](https://github.com/near/NEPs/issues/254).
## Rationale and alternatives
- Why is this design the best in the space of possible designs?
- What other designs have been considered and what is the rationale for not choosing them?
- What is the impact of not doing this?
## Specification
Many apps use different interfaces that represent the same action.
This interface standardizes that process by introducing event logs.
Events use the standard logs capability of NEAR.
Events are log entries that start with the `EVENT_JSON:` prefix followed by a single valid JSON string.
The JSON string may have any number of space characters at the beginning, in the middle, or at the end of the string.
It's guaranteed that space characters do not break its parsing.
All the examples below are pretty-formatted for better readability.
JSON string should have the following interface:
```ts
// Interface to capture data about an event
// Arguments
// * `standard`: name of standard, e.g. nep171
// * `version`: e.g. 1.0.0
// * `event`: type of the event, e.g. nft_mint
// * `data`: associate event data. Strictly typed for each set {standard, version, event} inside corresponding NEP
interface EventLogData {
standard: string;
version: string;
event: string;
data?: unknown;
}
```
Thus, to emit an event, you only need to log a string following the rules above. Here is a bare-bones example using Rust SDK `near_sdk::log!` macro (security note: prefer using `serde_json` or alternatives to serialize the JSON string to avoid potential injections and corrupted events):
```rust
use near_sdk::log;
// ...
log!(
    r#"EVENT_JSON:{{"standard": "nepXXX", "version": "1.0.0", "event": "YYY", "data": {{"token_id": "{}"}}}}"#,
    token_id
);
// ...
```
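For contracts written with the JavaScript SDK, the same string can be emitted through `near.log`. Below is a minimal sketch; the standard, event name, and helper function are placeholders, and building the payload with `JSON.stringify` keeps the log a single valid JSON string:

```js
import { near } from "near-sdk-js";

// Emit an NEP-297 style event from a JS contract method (sketch only).
function emitEvent(tokenId) {
  const event = {
    standard: "nepXXX",
    version: "1.0.0",
    event: "YYY",
    data: { token_id: tokenId },
  };
  near.log(`EVENT_JSON:${JSON.stringify(event)}`);
}
```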
#### Valid event logs
```js
EVENT_JSON:{
"standard": "nepXXX",
"version": "1.0.0",
"event": "xyz_is_triggered"
}
```
```js
EVENT_JSON:{
"standard": "nepXXX",
"version": "1.0.0",
"event": "xyz_is_triggered",
"data": {
"triggered_by": "foundation.near"
}
}
```
#### Invalid event logs
- Two events in a single log entry (instead, call `log` for each individual event)
```js
EVENT_JSON:{
"standard": "nepXXX",
"version": "1.0.0",
"event": "abc_is_triggered"
}
EVENT_JSON:{
"standard": "nepXXX",
"version": "1.0.0",
"event": "xyz_is_triggered"
}
```
- Invalid JSON data
```js
EVENT_JSON:invalid json
```
- Missing required fields `standard`, `version` or `event`
```js
EVENT_JSON:{
"standard": "nepXXX",
"event": "xyz_is_triggered",
"data": {
"triggered_by": "foundation.near"
}
}
```
## Reference Implementation
[Fungible Token Events Implementation](https://github.com/near/near-sdk-rs/blob/master/near-contract-standards/src/fungible_token/events.rs)
[Non-Fungible Token Events Implementation](https://github.com/near/near-sdk-rs/blob/master/near-contract-standards/src/non_fungible_token/events.rs)
## Drawbacks
There is a known limitation of 16kb strings when capturing logs.
This limits the number of events that can be processed.
## Copyright
Copyright and related rights waived via [CC0](https://creativecommons.org/publicdomain/zero/1.0/).
|
---
id: error-implementation
title: Source Code Survey
sidebar_label: Source Code Survey
---
This page provides a very high level, sometimes "pseudocode", view of error types and related messages as implemented by the NEAR platform.
Errors raised by the NEAR platform are implemented in the following locations in `nearcore`:
- [nearcore/core/primitives/src/errors.rs](https://github.com/near/nearcore/blob/master/core/primitives/src/errors.rs)
- [nearcore/runtime/near-vm-errors/src/lib.rs](https://github.com/near/nearcore/blob/master/runtime/near-vm-errors/src/lib.rs)
---
## RuntimeError and subtypes {#runtimeerror-and-subtypes}
### RuntimeError {#runtimeerror}
#### Definition {#definition}
```rust
/// Error returned from `Runtime::apply`
pub enum RuntimeError {
/// An unexpected integer overflow occurred. The likely issue is an invalid state or the transition.
UnexpectedIntegerOverflow,
/// An error happened during TX verification and account charging. It's likely the chunk is invalid
/// and should be challenged.
InvalidTxError(InvalidTxError),
/// Unexpected error which is typically related to node storage corruption.
/// It's possible the input state is invalid or malicious.
StorageError(StorageError),
/// An error happens if `check_balance` fails, which is likely an indication of an invalid state.
BalanceMismatchError(BalanceMismatchError),
}
```
#### Error Messages {#error-messages}
- see below: `InvalidTxError`, `StorageError` and `BalanceMismatchError`
### InvalidTxError {#invalidtxerror}
#### Definition {#definition-1}
```rust
/// An error happened during TX execution
pub enum InvalidTxError {
/// Happens if a wrong AccessKey used or AccessKey has not enough permissions
InvalidAccessKeyError(InvalidAccessKeyError),
/// TX signer_id is not in a valid format or not satisfy requirements see `near_core::primitives::utils::is_valid_account_id`
InvalidSignerId { signer_id: AccountId },
/// TX signer_id is not found in a storage
SignerDoesNotExist { signer_id: AccountId },
/// Transaction nonce must be account[access_key].nonce + 1
InvalidNonce { tx_nonce: Nonce, ak_nonce: Nonce },
/// TX receiver_id is not in a valid format or not satisfy requirements see `near_core::primitives::utils::is_valid_account_id`
InvalidReceiverId { receiver_id: AccountId },
/// TX signature is not valid
InvalidSignature,
/// Account does not have enough balance to cover TX cost
NotEnoughBalance {
signer_id: AccountId,
balance: Balance,
cost: Balance,
},
/// Signer account rent is unpaid
RentUnpaid {
/// An account which is required to pay the rent
signer_id: AccountId,
/// Required balance to cover the state rent
amount: Balance,
},
/// An integer overflow occurred during transaction cost estimation.
CostOverflow,
/// Transaction parent block hash doesn't belong to the current chain
InvalidChain,
/// Transaction has expired
Expired,
/// An error occurred while validating actions of a Transaction.
ActionsValidation(ActionsValidationError),
}
```
#### Error Messages {#error-messages-1}
```rust
InvalidTxError::InvalidSignerId { signer_id }
"Invalid signer account ID {:?} according to requirements"
InvalidTxError::SignerDoesNotExist { signer_id }
"Signer {:?} does not exist"
InvalidTxError::InvalidAccessKeyError(access_key_error)
InvalidTxError::InvalidNonce { tx_nonce, ak_nonce }
"Transaction nonce {} must be larger than nonce of the used access key {}"
InvalidTxError::InvalidReceiverId { receiver_id }
"Invalid receiver account ID {:?} according to requirements"
InvalidTxError::InvalidSignature
"Transaction is not signed with the given public key"
InvalidTxError::NotEnoughBalance { signer_id, balance, cost }
"Sender {:?} does not have enough balance {} for operation costing {}"
InvalidTxError::RentUnpaid { signer_id, amount }
"Failed to execute, because the account {:?} wouldn't have enough to pay required rent {}"
InvalidTxError::CostOverflow
"Transaction gas or balance cost is too high"
InvalidTxError::InvalidChain
"Transaction parent block hash doesn't belong to the current chain"
InvalidTxError::Expired
"Transaction has expired"
InvalidTxError::ActionsValidation(error)
"Transaction actions validation error: {}"
```
### StorageError {#storageerror}
#### Definition {#definition-2}
```rust
pub enum StorageError {
/// Key-value db internal failure
StorageInternalError,
/// Storage is PartialStorage and requested a missing trie node
TrieNodeMissing,
/// Either invalid state or key-value db is corrupted.
/// For PartialStorage it cannot be corrupted.
/// Error message is unreliable and for debugging purposes only. It's also probably ok to
/// panic in every place that produces this error.
/// We can check if db is corrupted by verifying everything in the state trie.
StorageInconsistentState(String),
}
```
### BalanceMismatchError {#balancemismatcherror}
#### Definition {#definition-3}
```rust
/// Happens when the input balance doesn't match the output balance in Runtime apply.
pub struct BalanceMismatchError {
// Input balances
pub incoming_validator_rewards: Balance,
pub initial_accounts_balance: Balance,
pub incoming_receipts_balance: Balance,
pub processed_delayed_receipts_balance: Balance,
pub initial_postponed_receipts_balance: Balance,
// Output balances
pub final_accounts_balance: Balance,
pub outgoing_receipts_balance: Balance,
pub new_delayed_receipts_balance: Balance,
pub final_postponed_receipts_balance: Balance,
pub total_rent_paid: Balance,
pub total_validator_reward: Balance,
pub total_balance_burnt: Balance,
pub total_balance_slashed: Balance,
}
```
#### Error Messages {#error-messages-2}
```rust
"Balance Mismatch Error. The input balance {} doesn't match output balance {}\n\
Inputs:\n\
\tIncoming validator rewards sum: {}\n\
\tInitial accounts balance sum: {}\n\
\tIncoming receipts balance sum: {}\n\
\tProcessed delayed receipts balance sum: {}\n\
\tInitial postponed receipts balance sum: {}\n\
Outputs:\n\
\tFinal accounts balance sum: {}\n\
\tOutgoing receipts balance sum: {}\n\
\tNew delayed receipts balance sum: {}\n\
\tFinal postponed receipts balance sum: {}\n\
\tTotal rent paid: {}\n\
\tTotal validators reward: {}\n\
\tTotal balance burnt: {}\n\
\tTotal balance slashed: {}",
```
### InvalidAccessKeyError {#invalidaccesskeyerror}
#### Definition {#definition-4}
```rust
pub enum InvalidAccessKeyError {
/// The access key identified by the `public_key` doesn't exist for the account
AccessKeyNotFound { account_id: AccountId, public_key: PublicKey },
/// Transaction `receiver_id` doesn't match the access key receiver_id
ReceiverMismatch { tx_receiver: AccountId, ak_receiver: AccountId },
/// Transaction method name isn't allowed by the access key
MethodNameMismatch { method_name: String },
/// Transaction requires a full permission access key.
RequiresFullAccess,
/// Access Key does not have enough allowance to cover transaction cost
NotEnoughAllowance {
account_id: AccountId,
public_key: PublicKey,
allowance: Balance,
cost: Balance,
},
/// Having a deposit with a function call action is not allowed with a function call access key.
DepositWithFunctionCall,
}
```
#### Error Messages {#error-messages-3}
```rust
InvalidAccessKeyError::AccessKeyNotFound { account_id, public_key }
"Signer {:?} doesn't have access key with the given public_key {}"
InvalidAccessKeyError::ReceiverMismatch { tx_receiver, ak_receiver }
"Transaction receiver_id {:?} doesn't match the access key receiver_id {:?}"
InvalidAccessKeyError::MethodNameMismatch { method_name }
"Transaction method name {:?} isn't allowed by the access key"
InvalidAccessKeyError::RequiresFullAccess
"The transaction contains more then one action, but it was signed \
with an access key which allows transaction to apply only one specific action. \
To apply more then one actions TX must be signed with a full access key"
InvalidAccessKeyError::NotEnoughAllowance { account_id, public_key, allowance, cost }
"Access Key {:?}:{} does not have enough balance {} for transaction costing {}"
InvalidAccessKeyError::DepositWithFunctionCall
"Having a deposit with a function call action is not allowed with a function call access key."
```
### ActionsValidationError {#actionsvalidationerror}
#### Definition {#definition-5}
```rust
/// Describes the error for validating a list of actions.
pub enum ActionsValidationError {
/// The total prepaid gas (for all given actions) exceeded the limit.
TotalPrepaidGasExceeded { total_prepaid_gas: Gas, limit: Gas },
/// The number of actions exceeded the given limit.
TotalNumberOfActionsExceeded { total_number_of_actions: u64, limit: u64 },
/// The total number of bytes of the method names exceeded the limit in a Add Key action.
AddKeyMethodNamesNumberOfBytesExceeded { total_number_of_bytes: u64, limit: u64 },
/// The length of some method name exceeded the limit in a Add Key action.
AddKeyMethodNameLengthExceeded { length: u64, limit: u64 },
/// Integer overflow during a compute.
IntegerOverflow,
/// Invalid account ID.
InvalidAccountId { account_id: AccountId },
/// The size of the contract code exceeded the limit in a DeployContract action.
ContractSizeExceeded { size: u64, limit: u64 },
/// The length of the method name exceeded the limit in a Function Call action.
FunctionCallMethodNameLengthExceeded { length: u64, limit: u64 },
/// The length of the arguments exceeded the limit in a Function Call action.
FunctionCallArgumentsLengthExceeded { length: u64, limit: u64 },
}
```
#### Error Messages {#error-messages-4}
```rust
ActionsValidationError::TotalPrepaidGasExceeded { total_prepaid_gas, limit }
"The total prepaid gas {} exceeds the limit {}"
ActionsValidationError::TotalNumberOfActionsExceeded {total_number_of_actions, limit }
"The total number of actions {} exceeds the limit {}"
ActionsValidationError::AddKeyMethodNamesNumberOfBytesExceeded { total_number_of_bytes, limit }
"The total number of bytes in allowed method names {} exceeds the maximum allowed number {} in a AddKey action"
ActionsValidationError::AddKeyMethodNameLengthExceeded { length, limit }
"The length of some method name {} exceeds the maximum allowed length {} in a AddKey action"
ActionsValidationError::IntegerOverflow
"Integer overflow during a compute"
ActionsValidationError::InvalidAccountId { account_id }
"Invalid account ID `{}`"
ActionsValidationError::ContractSizeExceeded { size, limit }
"The length of the contract size {} exceeds the maximum allowed size {} in a DeployContract action"
ActionsValidationError::FunctionCallMethodNameLengthExceeded { length, limit }
"The length of the method name {} exceeds the maximum allowed length {} in a FunctionCall action"
ActionsValidationError::FunctionCallArgumentsLengthExceeded { length, limit }
"The length of the arguments {} exceeds the maximum allowed length {} in a FunctionCall action"
```
## TxExecutionError and subtypes {#txexecutionerror-and-subtypes}
### TxExecutionError {#txexecutionerror}
#### Definition {#definition-6}
```rust
/// Error returned in the ExecutionOutcome in case of failure
pub enum TxExecutionError {
/// An error happened during Action execution
ActionError(ActionError),
/// An error happened during Transaction execution
InvalidTxError(InvalidTxError),
}
```
### ActionError {#actionerror}
#### Definition {#definition-7}
```rust
pub struct ActionError {
/// Index of the failed action in the transaction.
/// Action index is not defined if ActionError.kind is `ActionErrorKind::RentUnpaid`
pub index: Option<u64>,
/// The kind of ActionError happened
pub kind: ActionErrorKind,
}
```
### ActionErrorKind {#actionerrorkind}
#### Definition {#definition-8}
```rust
pub enum ActionErrorKind {
/// Happens when CreateAccount action tries to create an account with an account_id which already exists in the storage
AccountAlreadyExists { account_id: AccountId },
/// Happens when TX receiver_id doesn't exist (but action is not Action::CreateAccount)
AccountDoesNotExist { account_id: AccountId },
/// A newly created account must be under a namespace of the creator account
CreateAccountNotAllowed { account_id: AccountId, predecessor_id: AccountId },
/// Administrative actions like `DeployContract`, `Stake`, `AddKey`, `DeleteKey` can proceed only if sender=receiver
/// or the first TX action is a `CreateAccount` action
ActorNoPermission { account_id: AccountId, actor_id: AccountId },
/// Account tries to remove an access key that doesn't exist
DeleteKeyDoesNotExist { account_id: AccountId, public_key: PublicKey },
/// The public key is already used for an existing access key
AddKeyAlreadyExists { account_id: AccountId, public_key: PublicKey },
/// Account is staking and can not be deleted
DeleteAccountStaking { account_id: AccountId },
/// Foreign sender (sender!=receiver) can delete an account only if a target account hasn't enough tokens to pay rent
DeleteAccountHasRent {
account_id: AccountId,
balance: Balance,
},
/// ActionReceipt can't be completed, because the remaining balance will not be enough to pay rent.
RentUnpaid {
/// An account which is required to pay the rent
account_id: AccountId,
/// Rent due to pay.
amount: Balance,
},
/// Account is not yet staked, but tries to unstake
TriesToUnstake { account_id: AccountId },
/// The account doesn't have enough balance to increase the stake.
TriesToStake {
account_id: AccountId,
stake: Balance,
locked: Balance,
balance: Balance,
},
/// An error occurred during a `FunctionCall` Action.
FunctionCallError(FunctionCallError),
/// Error occurs when a new `ActionReceipt` created by the `FunctionCall` action fails
/// receipt validation.
NewReceiptValidationError(ReceiptValidationError),
}
```
#### Error Messages {#error-messages-5}
```rust
ActionErrorKind::AccountAlreadyExists { account_id }
"Can't create a new account {:?}, because it already exists"
ActionErrorKind::AccountDoesNotExist { account_id }
"Can't complete the action because account {:?} doesn't exist"
ActionErrorKind::ActorNoPermission { actor_id, account_id }
"Actor {:?} doesn't have permission to account {:?} to complete the action"
ActionErrorKind::RentUnpaid { account_id, amount }
"The account {} wouldn't have enough balance to pay required rent {}"
ActionErrorKind::TriesToUnstake { account_id }
"Account {:?} is not yet staked, but tries to unstake"
ActionErrorKind::TriesToStake { account_id, stake, locked, balance }
"Account {:?} tries to stake {}, but has staked {} and only has {}"
ActionErrorKind::CreateAccountNotAllowed { account_id, predecessor_id }
"The new account_id {:?} can't be created by {:?}"
ActionErrorKind::DeleteKeyDoesNotExist { account_id, .. }
"Account {:?} tries to remove an access key that doesn't exist"
ActionErrorKind::AddKeyAlreadyExists { public_key, .. }
"The public key {:?} is already used for an existing access key"
ActionErrorKind::DeleteAccountStaking { account_id }
"Account {:?} is staking and can not be deleted"
ActionErrorKind::DeleteAccountHasRent { account_id, balance }
"Account {:?} can't be deleted. It has {}, which is enough to cover the rent"
ActionErrorKind::FunctionCallError(s)
ActionErrorKind::NewReceiptValidationError(e)
"An new action receipt created during a FunctionCall is not valid: {}"
```
### ReceiptValidationError {#receiptvalidationerror}
#### Definition {#definition-9}
```rust
/// Describes the error for validating a receipt.
pub enum ReceiptValidationError {
/// The `predecessor_id` of a Receipt is not valid.
InvalidPredecessorId { account_id: AccountId },
/// The `receiver_id` of a Receipt is not valid.
InvalidReceiverId { account_id: AccountId },
/// The `signer_id` of an ActionReceipt is not valid.
InvalidSignerId { account_id: AccountId },
/// The `receiver_id` of a DataReceiver within an ActionReceipt is not valid.
InvalidDataReceiverId { account_id: AccountId },
/// The length of the returned data exceeded the limit in a DataReceipt.
ReturnedValueLengthExceeded { length: u64, limit: u64 },
/// The number of input data dependencies exceeds the limit in an ActionReceipt.
NumberInputDataDependenciesExceeded { number_of_input_data_dependencies: u64, limit: u64 },
/// An error occurred while validating actions of an ActionReceipt.
ActionsValidation(ActionsValidationError),
}
```
#### Error Messages {#error-messages-6}
```rust
ReceiptValidationError::InvalidPredecessorId { account_id }
"The predecessor_id `{}` of a Receipt is not valid."
ReceiptValidationError::InvalidReceiverId { account_id }
"The receiver_id `{}` of a Receipt is not valid."
ReceiptValidationError::InvalidSignerId { account_id }
"The signer_id `{}` of an ActionReceipt is not valid."
ReceiptValidationError::InvalidDataReceiverId { account_id }
"The receiver_id `{}` of a DataReceiver within an ActionReceipt is not valid."
ReceiptValidationError::ReturnedValueLengthExceeded { length, limit }
"The length of the returned data {} exceeded the limit {} in a DataReceipt"
ReceiptValidationError::NumberInputDataDependenciesExceeded { number_of_input_data_dependencies, limit }
"The number of input data dependencies {} exceeded the limit {} in an ActionReceipt"
ReceiptValidationError::ActionsValidation(e)
```
## VMError and subtypes {#vmerror-and-subtypes}
### VMError {#vmerror}
#### Definition {#definition-10}
```rust
pub enum VMError {
FunctionCallError(FunctionCallError),
/// Serialized external error from External trait implementation.
ExternalError(Vec<u8>),
/// An error that is caused by an operation on an inconsistent state.
/// E.g. an integer overflow by using a value from the given context.
InconsistentStateError(InconsistentStateError),
}
```
#### Error Messages {#error-messages-7}
```rust
VMError::ExternalError
"Serialized ExternalError"
```
### FunctionCallError {#functioncallerror}
#### Definition {#definition-11}
```rust
pub enum FunctionCallError {
CompilationError(CompilationError),
LinkError { msg: String },
MethodResolveError(MethodResolveError),
WasmTrap { msg: String },
HostError(HostError),
}
```
#### Error Messages {#error-messages-8}
```rust
FunctionCallError::WasmTrap
"WebAssembly trap: {}"
```
### MethodResolveError {#methodresolveerror}
#### Definition {#definition-12}
```rust
pub enum MethodResolveError {
MethodEmptyName,
MethodUTF8Error,
MethodNotFound,
MethodInvalidSignature,
}
```
### CompilationError {#compilationerror}
#### Definition {#definition-13}
```rust
pub enum CompilationError {
CodeDoesNotExist { account_id: String },
PrepareError(PrepareError),
WasmerCompileError { msg: String },
}
```
#### Error Messages {#error-messages-9}
```rust
CompilationError::CodeDoesNotExist
"cannot find contract code for account {}"
CompilationError::PrepareError(p)
"PrepareError: {}"
CompilationError::WasmerCompileError
"Wasmer compilation error: {}"
```
### PrepareError {#prepareerror}
#### Definition {#definition-14}
```rust
/// Error that can occur while preparing or executing Wasm smart-contract.
pub enum PrepareError {
/// Error happened while serializing the module.
Serialization,
/// Error happened while deserializing the module.
Deserialization,
/// Internal memory declaration has been found in the module.
InternalMemoryDeclared,
/// Gas instrumentation failed.
///
/// This most likely indicates the module isn't valid.
GasInstrumentation,
/// Stack instrumentation failed.
///
/// This most likely indicates the module isn't valid.
StackHeightInstrumentation,
/// Error happened during instantiation.
///
/// This might indicate that `start` function trapped, or module isn't
/// instantiable and/or unlinkable.
Instantiate,
/// Error creating memory.
Memory,
}
```
#### Error Messages {#error-messages-10}
```rust
Serialization
"Error happened while serializing the module."
Deserialization
"Error happened while deserializing the module."
InternalMemoryDeclared
"Internal memory declaration has been found in the module."
GasInstrumentation
"Gas instrumentation failed."
StackHeightInstrumentation
"Stack instrumentation failed."
Instantiate
"Error happened during instantiation."
Memory
"Error creating memory"
```
### HostError {#hosterror}
#### Definition {#definition-15}
```rust
pub enum HostError {
/// String encoding is bad UTF-16 sequence
BadUTF16,
/// String encoding is bad UTF-8 sequence
BadUTF8,
/// Exceeded the prepaid gas
GasExceeded,
/// Exceeded the maximum amount of gas allowed to burn per contract
GasLimitExceeded,
/// Exceeded the account balance
BalanceExceeded,
/// Tried to call an empty method name
EmptyMethodName,
/// Smart contract panicked
GuestPanic { panic_msg: String },
/// IntegerOverflow happened during a contract execution
IntegerOverflow,
/// `promise_idx` does not correspond to existing promises
InvalidPromiseIndex { promise_idx: u64 },
/// Actions can only be appended to non-joint promise.
CannotAppendActionToJointPromise,
/// Returning joint promise is currently prohibited
CannotReturnJointPromise,
/// Accessed invalid promise result index
InvalidPromiseResultIndex { result_idx: u64 },
/// Accessed invalid register id
InvalidRegisterId { register_id: u64 },
/// Iterator `iterator_index` was invalidated after its creation by performing a mutable operation on trie
IteratorWasInvalidated { iterator_index: u64 },
/// Accessed memory outside the bounds
MemoryAccessViolation,
/// VM Logic returned an invalid receipt index
InvalidReceiptIndex { receipt_index: u64 },
/// Iterator index `iterator_index` does not exist
InvalidIteratorIndex { iterator_index: u64 },
/// VM Logic returned an invalid account id
InvalidAccountId,
/// VM Logic returned an invalid method name
InvalidMethodName,
/// VM Logic provided an invalid public key
InvalidPublicKey,
/// `method_name` is not allowed in view calls
ProhibitedInView { method_name: String },
/// The total number of logs will exceed the limit.
NumberOfLogsExceeded { limit: u64 },
/// The storage key length exceeded the limit.
KeyLengthExceeded { length: u64, limit: u64 },
/// The storage value length exceeded the limit.
ValueLengthExceeded { length: u64, limit: u64 },
/// The total log length exceeded the limit.
TotalLogLengthExceeded { length: u64, limit: u64 },
/// The maximum number of promises within a FunctionCall exceeded the limit.
NumberPromisesExceeded { number_of_promises: u64, limit: u64 },
/// The maximum number of input data dependencies exceeded the limit.
NumberInputDataDependenciesExceeded { number_of_input_data_dependencies: u64, limit: u64 },
/// The returned value length exceeded the limit.
ReturnedValueLengthExceeded { length: u64, limit: u64 },
/// The contract size for DeployContract action exceeded the limit.
ContractSizeExceeded { size: u64, limit: u64 },
}
```
#### Error Messages {#error-messages-11}
```rust
BadUTF8
"String encoding is bad UTF-8 sequence."
BadUTF16
"String encoding is bad UTF-16 sequence."
GasExceeded
"Exceeded the prepaid gas."
GasLimitExceeded
"Exceeded the maximum amount of gas allowed to burn per contract."
BalanceExceeded
"Exceeded the account balance."
EmptyMethodName
"Tried to call an empty method name."
GuestPanic { panic_msg }
"Smart contract panicked: {}"
IntegerOverflow
"Integer overflow."
InvalidIteratorIndex { iterator_index }
"Iterator index {:?} does not exist"
InvalidPromiseIndex { promise_idx }
"{:?} does not correspond to existing promises"
CannotAppendActionToJointPromise
"Actions can only be appended to non-joint promise."
CannotReturnJointPromise
"Returning joint promise is currently prohibited."
InvalidPromiseResultIndex { result_idx }
"Accessed invalid promise result index: {:?}"
InvalidRegisterId { register_id }
"Accessed invalid register id: {:?}"
IteratorWasInvalidated { iterator_index }
"Iterator {:?} was invalidated after its creation by performing a mutable operation on trie"
MemoryAccessViolation
"Accessed memory outside the bounds."
InvalidReceiptIndex { receipt_index }
"VM Logic returned an invalid receipt index: {:?}"
InvalidAccountId
"VM Logic returned an invalid account id"
InvalidMethodName
"VM Logic returned an invalid method name"
InvalidPublicKey
"VM Logic provided an invalid public key"
ProhibitedInView { method_name }
"{} is not allowed in view calls"
NumberOfLogsExceeded { limit }
"The number of logs will exceed the limit {}"
KeyLengthExceeded { length, limit }
"The length of a storage key {} exceeds the limit {}"
ValueLengthExceeded { length, limit }
"The length of a storage value {} exceeds the limit {}"
TotalLogLengthExceeded{ length, limit }
"The length of a log message {} exceeds the limit {}"
NumberPromisesExceeded { number_of_promises, limit }
"The number of promises within a FunctionCall {} exceeds the limit {}"
NumberInputDataDependenciesExceeded { number_of_input_data_dependencies, limit }
"The number of input data dependencies {} exceeds the limit {}"
ReturnedValueLengthExceeded { length, limit }
"The length of a returned value {} exceeds the limit {}"
ContractSizeExceeded { size, limit }
"The size of a contract code in DeployContract action {} exceeds the limit {}"
```
### VMLogicError {#vmlogicerror}
#### Definition {#definition-16}
```rust
pub enum VMLogicError {
HostError(HostError),
/// Serialized external error from External trait implementation.
ExternalError(Vec<u8>),
/// An error that is caused by an operation on an inconsistent state.
InconsistentStateError(InconsistentStateError),
}
```
### InconsistentStateError {#inconsistentstateerror}
#### Definition {#definition-17}
```rust
pub enum InconsistentStateError {
/// Math operation with a value from the state resulted in a integer overflow.
IntegerOverflow,
}
```
#### Error Messages {#error-messages-12}
```rust
InconsistentStateError::IntegerOverflow
"Math operation with a value from the state resulted in a integer overflow."
```
## RPC interface {#rpc-interface}
Each error in the schema below is described by its:
- error name
- error subtype(s)
- error properties
### Error Schema {#error-schema}
```json
{
"schema": {
"BadUTF16": {
"name": "BadUTF16",
"subtypes": [],
"props": {}
},
"BadUTF8": {
"name": "BadUTF8",
"subtypes": [],
"props": {}
},
"BalanceExceeded": {
"name": "BalanceExceeded",
"subtypes": [],
"props": {}
},
"CannotAppendActionToJointPromise": {
"name": "CannotAppendActionToJointPromise",
"subtypes": [],
"props": {}
},
"CannotReturnJointPromise": {
"name": "CannotReturnJointPromise",
"subtypes": [],
"props": {}
},
"CodeDoesNotExist": {
"name": "CodeDoesNotExist",
"subtypes": [],
"props": {
"account_id": ""
}
},
"CompilationError": {
"name": "CompilationError",
"subtypes": [
"CodeDoesNotExist",
"PrepareError",
"WasmerCompileError"
],
"props": {}
},
"ContractSizeExceeded": {
"name": "ContractSizeExceeded",
"subtypes": [],
"props": {
"limit": "",
"size": ""
}
},
"Deserialization": {
"name": "Deserialization",
"subtypes": [],
"props": {}
},
"EmptyMethodName": {
"name": "EmptyMethodName",
"subtypes": [],
"props": {}
},
"FunctionCallError": {
"name": "FunctionCallError",
"subtypes": [
"CompilationError",
"LinkError",
"MethodResolveError",
"WasmTrap",
"HostError"
],
"props": {}
},
"GasExceeded": {
"name": "GasExceeded",
"subtypes": [],
"props": {}
},
"GasInstrumentation": {
"name": "GasInstrumentation",
"subtypes": [],
"props": {}
},
"GasLimitExceeded": {
"name": "GasLimitExceeded",
"subtypes": [],
"props": {}
},
"GuestPanic": {
"name": "GuestPanic",
"subtypes": [],
"props": {
"panic_msg": ""
}
},
"HostError": {
"name": "HostError",
"subtypes": [
"BadUTF16",
"BadUTF8",
"GasExceeded",
"GasLimitExceeded",
"BalanceExceeded",
"EmptyMethodName",
"GuestPanic",
"IntegerOverflow",
"InvalidPromiseIndex",
"CannotAppendActionToJointPromise",
"CannotReturnJointPromise",
"InvalidPromiseResultIndex",
"InvalidRegisterId",
"IteratorWasInvalidated",
"MemoryAccessViolation",
"InvalidReceiptIndex",
"InvalidIteratorIndex",
"InvalidAccountId",
"InvalidMethodName",
"InvalidPublicKey",
"ProhibitedInView",
"NumberOfLogsExceeded",
"KeyLengthExceeded",
"ValueLengthExceeded",
"TotalLogLengthExceeded",
"NumberPromisesExceeded",
"NumberInputDataDependenciesExceeded",
"ReturnedValueLengthExceeded",
"ContractSizeExceeded"
],
"props": {}
},
"Instantiate": {
"name": "Instantiate",
"subtypes": [],
"props": {}
},
"IntegerOverflow": {
"name": "IntegerOverflow",
"subtypes": [],
"props": {}
},
"InternalMemoryDeclared": {
"name": "InternalMemoryDeclared",
"subtypes": [],
"props": {}
},
"InvalidAccountId": {
"name": "InvalidAccountId",
"subtypes": [],
"props": {}
},
"InvalidIteratorIndex": {
"name": "InvalidIteratorIndex",
"subtypes": [],
"props": {
"iterator_index": ""
}
},
"InvalidMethodName": {
"name": "InvalidMethodName",
"subtypes": [],
"props": {}
},
"InvalidPromiseIndex": {
"name": "InvalidPromiseIndex",
"subtypes": [],
"props": {
"promise_idx": ""
}
},
"InvalidPromiseResultIndex": {
"name": "InvalidPromiseResultIndex",
"subtypes": [],
"props": {
"result_idx": ""
}
},
"InvalidPublicKey": {
"name": "InvalidPublicKey",
"subtypes": [],
"props": {}
},
"InvalidReceiptIndex": {
"name": "InvalidReceiptIndex",
"subtypes": [],
"props": {
"receipt_index": ""
}
},
"InvalidRegisterId": {
"name": "InvalidRegisterId",
"subtypes": [],
"props": {
"register_id": ""
}
},
"IteratorWasInvalidated": {
"name": "IteratorWasInvalidated",
"subtypes": [],
"props": {
"iterator_index": ""
}
},
"KeyLengthExceeded": {
"name": "KeyLengthExceeded",
"subtypes": [],
"props": {
"length": "",
"limit": ""
}
},
"LinkError": {
"name": "LinkError",
"subtypes": [],
"props": {
"msg": ""
}
},
"Memory": {
"name": "Memory",
"subtypes": [],
"props": {}
},
"MemoryAccessViolation": {
"name": "MemoryAccessViolation",
"subtypes": [],
"props": {}
},
"MethodEmptyName": {
"name": "MethodEmptyName",
"subtypes": [],
"props": {}
},
"MethodInvalidSignature": {
"name": "MethodInvalidSignature",
"subtypes": [],
"props": {}
},
"MethodNotFound": {
"name": "MethodNotFound",
"subtypes": [],
"props": {}
},
"MethodResolveError": {
"name": "MethodResolveError",
"subtypes": [
"MethodEmptyName",
"MethodUTF8Error",
"MethodNotFound",
"MethodInvalidSignature"
],
"props": {}
},
"MethodUTF8Error": {
"name": "MethodUTF8Error",
"subtypes": [],
"props": {}
},
"NumberInputDataDependenciesExceeded": {
"name": "NumberInputDataDependenciesExceeded",
"subtypes": [],
"props": {
"limit": "",
"number_of_input_data_dependencies": ""
}
},
"NumberOfLogsExceeded": {
"name": "NumberOfLogsExceeded",
"subtypes": [],
"props": {
"limit": ""
}
},
"NumberPromisesExceeded": {
"name": "NumberPromisesExceeded",
"subtypes": [],
"props": {
"limit": "",
"number_of_promises": ""
}
},
"PrepareError": {
"name": "PrepareError",
"subtypes": [
"Serialization",
"Deserialization",
"InternalMemoryDeclared",
"GasInstrumentation",
"StackHeightInstrumentation",
"Instantiate",
"Memory"
],
"props": {}
},
"ProhibitedInView": {
"name": "ProhibitedInView",
"subtypes": [],
"props": {
"method_name": ""
}
},
"ReturnedValueLengthExceeded": {
"name": "ReturnedValueLengthExceeded",
"subtypes": [],
"props": {
"length": "",
"limit": ""
}
},
"Serialization": {
"name": "Serialization",
"subtypes": [],
"props": {}
},
"StackHeightInstrumentation": {
"name": "StackHeightInstrumentation",
"subtypes": [],
"props": {}
},
"TotalLogLengthExceeded": {
"name": "TotalLogLengthExceeded",
"subtypes": [],
"props": {
"length": "",
"limit": ""
}
},
"ValueLengthExceeded": {
"name": "ValueLengthExceeded",
"subtypes": [],
"props": {
"length": "",
"limit": ""
}
},
"WasmTrap": {
"name": "WasmTrap",
"subtypes": [],
"props": {
"msg": ""
}
},
"WasmerCompileError": {
"name": "WasmerCompileError",
"subtypes": [],
"props": {
"msg": ""
}
},
"AccessKeyNotFound": {
"name": "AccessKeyNotFound",
"subtypes": [],
"props": {
"account_id": "",
"public_key": ""
}
},
"AccountAlreadyExists": {
"name": "AccountAlreadyExists",
"subtypes": [],
"props": {
"account_id": ""
}
},
"AccountDoesNotExist": {
"name": "AccountDoesNotExist",
"subtypes": [],
"props": {
"account_id": ""
}
},
"ActionError": {
"name": "ActionError",
"subtypes": [
"AccountAlreadyExists",
"AccountDoesNotExist",
"CreateAccountNotAllowed",
"ActorNoPermission",
"DeleteKeyDoesNotExist",
"AddKeyAlreadyExists",
"DeleteAccountStaking",
"DeleteAccountHasRent",
"RentUnpaid",
"TriesToUnstake",
"TriesToStake",
"FunctionCallError",
"NewReceiptValidationError"
],
"props": {
"index": ""
}
},
"ActorNoPermission": {
"name": "ActorNoPermission",
"subtypes": [],
"props": {
"account_id": "",
"actor_id": ""
}
},
"AddKeyAlreadyExists": {
"name": "AddKeyAlreadyExists",
"subtypes": [],
"props": {
"account_id": "",
"public_key": ""
}
},
"BalanceMismatchError": {
"name": "BalanceMismatchError",
"subtypes": [],
"props": {
"final_accounts_balance": "",
"final_postponed_receipts_balance": "",
"incoming_receipts_balance": "",
"incoming_validator_rewards": "",
"initial_accounts_balance": "",
"initial_postponed_receipts_balance": "",
"new_delayed_receipts_balance": "",
"outgoing_receipts_balance": "",
"processed_delayed_receipts_balance": "",
"total_balance_burnt": "",
"total_balance_slashed": "",
"total_rent_paid": "",
"total_validator_reward": ""
}
},
"CostOverflow": {
"name": "CostOverflow",
"subtypes": [],
"props": {}
},
"CreateAccountNotAllowed": {
"name": "CreateAccountNotAllowed",
"subtypes": [],
"props": {
"account_id": "",
"predecessor_id": ""
}
},
"DeleteAccountHasRent": {
"name": "DeleteAccountHasRent",
"subtypes": [],
"props": {
"account_id": "",
"balance": ""
}
},
"DeleteAccountStaking": {
"name": "DeleteAccountStaking",
"subtypes": [],
"props": {
"account_id": ""
}
},
"DeleteKeyDoesNotExist": {
"name": "DeleteKeyDoesNotExist",
"subtypes": [],
"props": {
"account_id": "",
"public_key": ""
}
},
"DepositWithFunctionCall": {
"name": "DepositWithFunctionCall",
"subtypes": [],
"props": {}
},
"Expired": {
"name": "Expired",
"subtypes": [],
"props": {}
},
"InvalidAccessKeyError": {
"name": "InvalidAccessKeyError",
"subtypes": [
"AccessKeyNotFound",
"ReceiverMismatch",
"MethodNameMismatch",
"RequiresFullAccess",
"NotEnoughAllowance",
"DepositWithFunctionCall"
],
"props": {}
},
"InvalidChain": {
"name": "InvalidChain",
"subtypes": [],
"props": {}
},
"InvalidNonce": {
"name": "InvalidNonce",
"subtypes": [],
"props": {
"ak_nonce": "",
"tx_nonce": ""
}
},
"InvalidReceiverId": {
"name": "InvalidReceiverId",
"subtypes": [],
"props": {
"receiver_id": ""
}
},
"InvalidSignature": {
"name": "InvalidSignature",
"subtypes": [],
"props": {}
},
"InvalidSignerId": {
"name": "InvalidSignerId",
"subtypes": [],
"props": {
"signer_id": ""
}
},
"InvalidTxError": {
"name": "InvalidTxError",
"subtypes": [
"InvalidAccessKeyError",
"InvalidSignerId",
"SignerDoesNotExist",
"InvalidNonce",
"InvalidReceiverId",
"InvalidSignature",
"NotEnoughBalance",
"RentUnpaid",
"CostOverflow",
"InvalidChain",
"Expired",
"ActionsValidation"
],
"props": {}
},
"MethodNameMismatch": {
"name": "MethodNameMismatch",
"subtypes": [],
"props": {
"method_name": ""
}
},
"NotEnoughAllowance": {
"name": "NotEnoughAllowance",
"subtypes": [],
"props": {
"account_id": "",
"allowance": "",
"cost": "",
"public_key": ""
}
},
"NotEnoughBalance": {
"name": "NotEnoughBalance",
"subtypes": [],
"props": {
"balance": "",
"cost": "",
"signer_id": ""
}
},
"ReceiverMismatch": {
"name": "ReceiverMismatch",
"subtypes": [],
"props": {
"ak_receiver": "",
"tx_receiver": ""
}
},
"RentUnpaid": {
"name": "RentUnpaid",
"subtypes": [],
"props": {
"account_id": "",
"amount": ""
}
},
"RequiresFullAccess": {
"name": "RequiresFullAccess",
"subtypes": [],
"props": {}
},
"SignerDoesNotExist": {
"name": "SignerDoesNotExist",
"subtypes": [],
"props": {
"signer_id": ""
}
},
"TriesToStake": {
"name": "TriesToStake",
"subtypes": [],
"props": {
"account_id": "",
"balance": "",
"locked": "",
"stake": ""
}
},
"TriesToUnstake": {
"name": "TriesToUnstake",
"subtypes": [],
"props": {
"account_id": ""
}
},
"TxExecutionError": {
"name": "TxExecutionError",
"subtypes": [
"ActionError",
"InvalidTxError"
],
"props": {}
},
"Closed": {
"name": "Closed",
"subtypes": [],
"props": {}
},
"ServerError": {
"name": "ServerError",
"subtypes": [
"TxExecutionError",
"Timeout",
"Closed"
],
"props": {}
},
"Timeout": {
"name": "Timeout",
"subtypes": [],
"props": {}
}
}
}
```
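For illustration only, here is a minimal sketch of how a client could use this schema to recognize which known error variants appear in a failure object returned over the RPC. The exact shape of RPC failure objects varies by method, and the helper below is hypothetical rather than part of any NEAR library; it assumes the schema above has been parsed into `errorSchema.schema`.
```js
// Hypothetical helper: given the schema above (parsed into `errorSchema.schema`)
// and a failure object from an RPC response, collect the names of the known error
// variants that appear anywhere inside it.
function findKnownErrors(schema, failure, found = []) {
  if (failure && typeof failure === "object") {
    for (const [key, value] of Object.entries(failure)) {
      if (schema[key]) found.push(key); // key matches an entry in the Error Schema
      findKnownErrors(schema, value, found);
    }
  }
  return found;
}

// Example (the nesting here is illustrative only):
// findKnownErrors(errorSchema.schema, {
//   ActionError: { kind: { FunctionCallError: { HostError: "GasExceeded" } } },
// });
// -> ["ActionError", "FunctionCallError", "HostError"]
```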
:::tip Got a question?
<a href="https://stackoverflow.com/questions/tagged/nearprotocol"> Ask it on StackOverflow! </a>
:::
|
---
id: db-migrations
title: Safe Database Migrations
sidebar_label: Safe Database Migrations
description: How to do database migrations without corrupting the database
sidebar_position: 3
---
# Database Migrations
## Background
Nodes use a [RocksDB](https://rocksdb.org/) database to store blockchain
information locally. Some node releases need to change the format of the data stored
in that database, for example to enable new protocol features. The process of
converting the data to the new format is called a **database migration**. Data
formats are numbered, and that number is called the **database version**.
The node binary can print its supported protocol version and the database version it needs.
For example, here we can see that the node expects database version 28, as indicated by `db 28`:
```
$ ./target/release/neard
neard (release 1.22.0) (build 5a6fb2bd2) (protocol 48) (db 28)
NEAR Protocol Node
```
If the existing database version is lower than what the binary needs, the node
performs a **database migration** at startup. Here is an example of running node
version `1.24.0` on a database instance created by node version `1.21.1`. You can
see several DB migrations running sequentially.
```
$ ./target/release/neard
INFO neard: Version: 1.24.0, Build: crates-0.11.0-80-g164991d7a, Latest Protocol: 51
INFO near: Opening store database at "/home/user/.near/data"
INFO near: Migrate DB from version 27 to 28
INFO near: Migrate DB from version 28 to 29
INFO near: Migrate DB from version 29 to 30
INFO near: Migrate DB from version 30 to 31
```
## What can go wrong?
Sometimes a database migration gets **interrupted**. This can happen for many
reasons, such as a machine failing during a long-running database migration, or
the user accidentally stopping the process with `Ctrl-C`. For efficiency reasons,
the data stored in the database carries no self-describing metadata, so it is
impossible to determine which database items were already converted to the new
format, and therefore impossible to resume the migration or start it over. This
means that an interrupted database migration leaves the database irrecoverably corrupted.
## Safe database migrations
Starting with neard release `1.26.0`, the node includes a way to **recover the
database instance** even if a database migration gets interrupted. This feature
is enabled by default but requires manual intervention if a database migration
actually gets interrupted.
One of the possible ways to restore a database is to use a known good state of
the database. Before `1.26.0`, this was mostly done by downloading a
[node database snapshot](/intro/node-data-snapshots).
Starting with `1.26.0`, it can be done locally, which is more convenient and
much faster.
For demonstration purposes, let's assume that the NEAR home directory is
`/home/user/.near`, and the database location is `/home/user/.near/data`. Then a
safe database migration works the following way:
1. Creates an instant and free snapshot of the existing database in
`/home/user/.near/data/migration-snapshot` using filesystem hard links.
<blockquote class="warning">
If your filesystem doesn't support hardlinks (or you’ve configured the snapshot
to be created on a different file system), this step can take significant
time and double the space used by the database.
</blockquote>
2. Runs the database migration.
<blockquote class="warning">
Even though a newly created snapshot takes no additional space, the space taken
by the snapshot will gradually increase as the database migration progresses.
</blockquote>
3. Deletes the snapshot.
4. Runs the node normally.
If the migration step is interrupted, the snapshot will not be deleted. Upon
restart, the node will detect the presence of the local snapshot, assume that
a database migration was interrupted (thus corrupting the database) and ask the
user to recover the database from that snapshot.
### Recovery
Assuming the corrupted database is in `/home/user/.near/data`, and the snapshot
is in its default location in the database directory (
`/home/user/.near/data/migration-snapshot`) a user may restore the database as
follows:
```sh
# Delete files of the corrupted database
rm /home/user/.near/data/*.sst
# Move not only the .sst files, but all files, to the data directory
mv /home/user/.near/data/migration-snapshot/* /home/user/.near/data/
# Delete the empty snapshot directory
rm -r /home/user/.near/data/migration-snapshot
# Restart
./target/release/neard
```
### Configuration
Starting with upcoming release 1.30, the safe database migrations feature is
configured by a `store.migration_snapshot` option (i.e., a `migration_snapshot`
property of a `store` object). It can be set to one of the following:
- an absolute path (e.g. `"/srv/neard/migration-snapshot"`) — the snapshot will
be created in the specified location;
- a relative path (e.g. `"migration-snapshot"`) — the snapshot will be created
in the specified sub-directory inside the database directory;
- `true` (the default) — equivalent to specifying the relative path
`"migration-snapshot"`; or
- `false` — the safe migration feature will be disabled.
Note that the default location of the snapshot is inside the database directory.
This ensures the snapshot is instant and free (so long as the filesystem
supports hardlinks).
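For illustration, the option might look like this in the node's `config.json`; only the relevant keys are shown, and the absolute path is a placeholder:
```json
{
  "store": {
    "migration_snapshot": "/srv/neard/migration-snapshot"
  }
}
```
Setting the value to `"migration-snapshot"` or `true` instead keeps the snapshot inside the database directory, which is the instant and free default described above.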
Prior to version 1.30, the feature was configured by the `use_db_migration_snapshot`
and `db_migration_snapshot_path` options. They are now deprecated, and if
the node detects that they are set, it will fail the migration with a message
explaining how to migrate to the new options.
:::tip Got a question?
<a href="https://stackoverflow.com/questions/tagged/nearprotocol"> Ask it on StackOverflow! </a>
:::
|
NEAR Weekly On-Chain Data Report: December 16
NEAR FOUNDATION
December 16, 2022
As part of the Foundation’s commitment to transparency, each week it will publish data to help the NEAR community understand the health of the ecosystem. This will be on top of the quarterly reports, and the now monthly funding reports.
You can find the Quarterly Transparency Reports here. (The Q3 report will be published next week.)
Last week’s On-Chain Data Report can be found here.
The Importance of Transparency
Transparency has always been one of NEAR Foundation’s core beliefs. Openness to the community, investors, builders, and creators is one of the core tenets of being a Web3 project.
In recent months, in response to community frustration, the Foundation has endeavored to do more. This NEAR Weekly On-Chain Data Report is just one of the ways in which the Foundation is being more proactively transparent.
New Accounts and Active Accounts
New Accounts are defined as new wallets created on the NEAR blockchain. While there was a decline in new accounts between the last week of November and the first week of December, this figure rose from an average of 14,000 per day to approximately 24,000 per day this week. New accounts reached a weekly high of 25,001 on December 12th.
This puts last week’s New Accounts data in line with November’s average figure of 24,000 wallets per 24 hours. As noted last week, the peak for account creation in Q4 was September 13, during which 130,000 new wallets were created in a single day. Collectively, these numbers equate to nearly 2M total wallets on the NEAR blockchain.
The Daily Number of Active Accounts measures the number of NEAR wallets making on-chain transactions. Over the last week, Active Accounts hit a high of 92,440 before trending down to about 51,000 per day.
Historically, there have been highs of more than 100,000 active accounts on NEAR. The high for Active Accounts on any one day in Q4 of 2022 was logged on September 14, during which 183,000 accounts were active.
New Contracts and Active Contracts
Smart contracts created on NEAR are programs stored on a blockchain that run when predetermined conditions are met. The Daily Number of New Contracts is valuable as a metric because it gives the community a way of measuring the NEAR ecosystem’s health and growth. If there are more active contracts, it follows that projects are more actively engaging with the NEAR protocol.
This week, the daily number of New Contracts has been trending upward, whereas last week it rose and fell. A weekly low of 17 New Contracts was measured on December 11th, and a weekly high of 57 contracts was created on December 13th, before the figure settled at 50 on the 14th.
Active Contracts measure contracts executed during a 24-hour period. This metric also trended upward this week, with 640 measured on December 11th and a high of 797 Active Contracts created on December 13th.
Used Gas and Gas Fee
Gas Fees is a term used to describe the cost of making transactions on the NEAR network. These fees are paid to validators for the network services they provide to the NEAR blockchain. Gas fees incentivize validators to secure the network.
In the last week, Used Gas on NEAR (PetaGas) was measured at a high of 7,569 on December 11th and 6,758 on the 14th. To learn more about Gas on NEAR, check out the NEAR White Paper. (Rises in gas used can be attributed to many factors, with a common one being increased user activity on the NEAR network.)
Over the last week, there has been a slight drop in the Gas Fee (in NEAR), which correlates with a drop in Used Gas. On December 11th, the Gas Fee was measured at 756, before falling to 675 on the 14th.
Daily Transactions
The Daily Number of Transactions illustrates the number of times the blockchain logged a transaction. The data from this week showcases an increase in the number of transactions, which was also the case over the previous week. Daily Number of Transactions hit a weekly high of 439,660 on December 11th before falling to 373,335 on the 14th. |
# Fraction
## numerator
_type: u64_
## denominator
_type: u64_
|
```js
import { Wallet } from './near-wallet';
const DAO_FACTORY_CONTRACT_ADDRESS = "sputnik-dao.near";
const wallet = new Wallet({ createAccessKeyFor: DAO_FACTORY_CONTRACT_ADDRESS });
await wallet.viewMethod({
method: 'get_dao_list',
args: {},
contractId: DAO_FACTORY_CONTRACT_ADDRESS
});
```
_The `Wallet` object comes from our [quickstart template](https://github.com/near-examples/hello-near-examples/blob/main/frontend/near-wallet.js)_ |
---
id: index-functions
title: Indexing Functions
sidebar_label: Indexing Functions
---
:::info
QueryAPI is a fully managed service that allows you to create and manage indexers on-chain seamlessly.
:::
## Indexing
Let's review a [very simple indexer](https://near.org/dataplatform.near/widget/QueryApi.App?selectedIndexerPath=roshaan.near/demo-indexer), which will help you to understand
how the indexer's indexing logic works.
```js title=indexingLogic.js
import { Block } from "@near-lake/primitives";
/**
* getBlock(block, context) applies your custom logic to a Block on Near and commits the data to a database.
*
 * @param {Block} block - A Near Protocol Block
 * @param {Object} context - A set of helper methods to retrieve and commit state
*/
async function getBlock(block: Block, context) {
const h = block.header().height;
await context.set("height", h);
}
```
The `getBlock()` function receives a `block` (a block on the Near blockchain) as
well as a `context` object, which provides a set of helper methods for committing
whatever you want to the database that QueryAPI has provisioned for you.
The code reads the `block`'s header, takes the block's `height`, and then uses
the `context` object to store that value in a key-value store.
Next, if you check out the database schema:
```sql title=schema.sql
CREATE TABLE
"indexer_storage" (
"function_name" TEXT NOT NULL,
"key_name" TEXT NOT NULL,
"value" TEXT NOT NULL,
PRIMARY KEY ("function_name", "key_name")
)
```
It's a very simple schema where you can store the `function_name`, `key_name`, and `value`.
:::tip
That's all this indexer function is doing: it sets the `height` value equal to the current block's height.
:::
<!-- ![QueryAPI Indexer Dashboard](/docs/assets/QAPIScreen2.png) -->
## Local Debug Mode
While you're iterating on your indexer, it's critical to have a way to debug and
test it, and _Debug Mode_ is very helpful for that.
![QueryAPI Dashboard](/docs/assets/QAPIdebug.png)
For example, if you want to test the [simple indexer](#indexing) explained in the previous section
using the local debugging mode:
- Enable <kbd>Debug Mode</kbd> on the **Indexer Editor**
- Add a block to your debug list (e.g., `97779559`)
- Go into your web browser's Console
- Finally, click <kbd>Play</kbd>.
On your browser's Console, you should see the indexer's debug execution where it sets the `height` key to `97779559`:
![QueryAPI Indexer Dashboard](/docs/assets/QAPIdebuglog.png)
:::info Local tests
All debug mode tests are happening **locally**, so they do not reach the database.
All your queries and mutations will return empty objects.
:::
:::tip
You can also click <kbd>Follow the Network</kbd> and it will show how your indexer logic works throughout.
:::
## Contract filters
QueryAPI uses contract filters to optimize its backend and speed up
historical indexing.
When you define a contract filter while creating an indexer,
QueryAPI sends every transaction event that passes the filter to your indexer function
so you can index it.
If you only want to index events from a single contract, simply enter the contract name in the **Contract Filter** text box.
In some cases you might want to either support indexing on multiple contracts,
or simply support every single contract that exists on the Near blockchain.
#### Single contract filter
For example, if you check out the [simple indexer](https://near.org/dataplatform.near/widget/QueryApi.App?selectedIndexerPath=roshaan.near/demo-indexer), you'll see that in this case
you have a `social.near` contract filter.
In this example, the indexer is only concerned with indexing events from the `social.near` contract.
#### Multiple contracts filter
For example, if you want to index all the contracts from AstroDAO, where there is an account created
for each and every different DAO, you should define `*.sputnik-dao.near` as the contract filter.
Likewise, if you want to get events from every contract on the blockchain, simply define `*` as the filter.
## Feed-indexer logic
Feed indexers call `context.graphql`, which lets your indexing logic run arbitrary
mutations and queries against the database that QueryAPI provisions for you.
If you're new to GraphQL, there are plenty of resources online on writing queries
and mutations.
In this case, the indexer passes in mutation data containing a `post` object and
inserts it into Postgres using GraphQL.
Creating these mutations is straightforward, as the sketch below illustrates.
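As a rough sketch (not the exact feed indexer code), such a call might look like the following. The `posts` table, its columns, and the mutation name are placeholders that would have to match your own `schema.sql`; the sketch assumes the `context.graphql(query, variables)` helper described above, with the mutation data object passed as the second parameter.
```js
// Sketch only: table name, columns, and mutation name are hypothetical placeholders.
const mutationData = {
  post: {
    block_height: block.header().height,
    content: "example post",
  },
};

await context.graphql(
  `mutation createPost($post: posts_insert_input!) {
    insert_posts_one(object: $post) {
      block_height
    }
  }`,
  mutationData
);
```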
## Mutations in GraphQL
If you go to the GraphiQL tab, you can access the GraphiQL Explorer that provides a user friendly GraphQL playground, where you can view and create queries and mutations based on the DB schema that you defined for the indexer.
![QueryAPI Indexer Dashboard](/docs/assets/QAPIgraphiql.png)
You can easily set some fields and select the data you want returned,
and the tool will build the query in the mutation panel on the right.
You can then copy the resulting query into your JavaScript code, either filling in
the data manually or passing a mutation data object as the second parameter.
For example, if you go and add a new mutation, click <kbd>+</kbd>, then you can do a bunch of actions here, such as creating, deleting, or inserting posts into your table.
![Playground](/docs/assets/QAPIScreen.gif)
If you want to test your mutation, using [Debug Mode](#local-debug-mode) you can add a specific
block to the list, and then play it to see how it works.
Based on the indexer logic you defined, you'll get a call to the GraphQL mutation with the object
and data passed into it.
:::tip Video Walkthrough
**Tip:** watch the video on how to [create mutations in GraphQL](https://www.youtube.com/watch?v=VwO6spk8D58&t=781s).
:::
## Create a NEAR component from query
Creating a NEAR component from a GraphQL query is simple when using the GraphQL Playground. Just follow these steps:
- go to the GraphiQL tab
- select the query that you want to use
- click on the <kbd>Show GraphiQL Code Exporter</kbd> button
- copy the generated boilerplate code
- go to the NEAR sandbox and paste it.
This boilerplate sets up code that executes the GraphQL query you built in the
playground, extracts the data, and renders it with a render-data function.
Once you have the NEAR component code, you can test it out by going to [the sandbox](https://near.org/sandbox),
pasting the generated code, and then selecting <kbd>Component Preview</kbd>.
Next, you can create a nice UI over this boilerplate code, and publish your new NEAR component.
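To give a feel for the result, here is a rough sketch of such a component. The GraphQL endpoint URL and the query are placeholders (the Code Exporter generates the real boilerplate for you), and the sketch assumes the sandbox's `asyncFetch` and `State` helpers:
```js
// Sketch only: the endpoint URL and the queried table are hypothetical placeholders.
const GRAPHQL_ENDPOINT = "https://your-queryapi-endpoint.example/v1/graphql";
const query = `
  query IndexerState {
    indexer_storage(limit: 10) {
      function_name
      key_name
      value
    }
  }
`;

State.init({ rows: [] });

asyncFetch(GRAPHQL_ENDPOINT, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ query }),
}).then((result) => {
  if (!result.ok) return;
  // Depending on the endpoint, the body may arrive as a string or a parsed object.
  const body = typeof result.body === "string" ? JSON.parse(result.body) : result.body;
  State.update({ rows: body.data.indexer_storage || [] });
});

// Render whatever rows were fetched.
return (
  <div>
    {state.rows.map((row) => (
      <p key={row.key_name}>
        {row.key_name}: {row.value}
      </p>
    ))}
  </div>
);
```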
|
Introducing the NEAR Foundation
NEAR FOUNDATION
May 13, 2020
The Why of NEAR
When Alex, Illia and the rest of the early team came together in 2018 to explore the ideas that would eventually become the NEAR Protocol, we each used slightly different words to express a single feeling — that the trends of both governments and corporations over the past decade have made it significantly more difficult to drive and distribute innovation than was the case before.
Each of us saw that developers and entrepreneurs need a better set of tools and it was clear that decentralized technologies offer a powerful new opportunity. Most promisingly, the blockchain technology pioneered by Bitcoin and extended by Ethereum uses a fully decentralized architecture to create a system where developers can build apps that have access to a new way of storing and transferring value which enables the development of disruptive business models.
The NEAR Project was born to make those ideas useful for a much wider group of participants. NEAR provides a highly performant open source backbone for running applications that tap into the flow of high value money and data. In today’s world, any app that isn’t a static HTML page can benefit from these tools. NEAR is the first decentralized application platform to allow these applications to reach mainstream usage because they are both performant and usable.
But the NEAR Purpose is larger than just the NEAR Protocol:
Our purpose is to enable community-driven innovation to benefit people around the world.
When we first assembled these words, they struck like a gong and it was clear we had finally articulated the Why which brought us together in the first place. Let’s look at each piece of this statement:
“Our purpose is…” “We” are a wide-ranging collective of developers, entrepreneurs and visionaries who have endorsed this purpose and begun working towards its enablement. While we share the same reason why we are here, we operate across many different areas of the ecosystem and in multiple entities and the world. There is no single leader or controller of this Purpose; it is permissionless.
“…to enable…” We aren’t here to do all the work ourselves… we are emphatically here to create the platform, tools and environment in which others can help themselves. That is how we achieve the greatest leverage for change.
“…community-driven innovation…” Many of the most important innovations which drive today’s businesses grew out of the collaborative nature of the early web and the openness of the entrepreneurial community in general. When the community is allowed to permissionlessly drive innovation, the scope of ideas will always exceed that which can be created through the top-down dictates of a single company or government.
“…to benefit people…” Not all innovation is created equal and we are specifically motivated by those innovations which can be harnessed to improve the condition of real people rather than finding better ways to weaponize their psychology against them as too many of today’s tools do.
“…around the world.” Not all innovation is distributed evenly around the world and so we must ensure that artificial barriers which prevent individuals from accessing markets, information or freedom are overcome.
The NEAR Foundation
While we are excited to say that the NEAR Protocol MainNet genesis occurred on April 22, 2020, NEAR is more than just a blockchain. It is a full suite of tools and protocols and components that are being researched and developed by world class teams who are distributed across the globe. Because of the fractured nature of the creators and the differing goals of each tool that makes up the NEAR platform, there is no single project which represents the fulfillment of the above-mentioned Purpose. Rather, it is the sum of the parts which do so.
The NEAR Foundation (https://near.foundation) is, at the broadest level, the steward of the *full* NEAR purpose. This foundation is an independent nonprofit entity based in Switzerland whose charter directly contains the words of that Purpose. To fulfill it, the Foundation plays a supporting and coordinating role between the players of the ecosystem. It is the lighthouse which helps keep the ecosystem oriented towards the North Star of that Purpose.
The Swiss “Stiftung” (German for “foundation”) structure, which the NEAR Foundation uses, is shared by some of the highest profile projects like Ethereum and Polkadot for a reason. This structure is neither flexible nor simple to operate because there is very strict regulatory oversight within the highly regulated Swiss jurisdiction. Stiftungs are legally bound to pursue their Purpose and funds that have been given to them cannot be removed for any reason except the fulfillment of that Purpose. These reasons are why most decentralized projects have opted for simpler or more flexible legal structures. But they are precisely the reasons why the Swiss Stiftung is the gold standard for projects who are operating reputably, transparently and with positive intentions.
NEAR is a global community effort which could become the backbone for the global economy so it is important to everyone building it — and building *on* it — that everything about it is held to the highest standard. Thus, we have worked to make sure everything from the structure of the Foundation to the code produced in support of the project are set up in a way that engenders trust.
To be clear, there will always be some tradeoffs between effective execution and community involvement, between transparency and privacy, and between short term decisions and long term value creation. But we would rather acknowledge those tradeoffs up front and will do our best to address them as openly as possible so the community has the trust it should in a project of this scope and magnitude.
The Foundation Council
A Swiss Stiftung is directed by an independent council which is similar to a corporate board of directors but with greater regulatory oversight. These members have the power to guide and ratify decisions of the Foundation but they can be removed by the regulatory authority if they are failing to pursue the Foundation’s purpose.
While the Foundation has obviously been operational for some time now, I am proud to officially introduce the first NEAR Foundation Council for the first time. These members were chosen for their particular blend of skills and outstanding reputation within the community:
Mona El Isa
is a serial entrepreneur with a deep history in both traditional finance and blockchain. She is currently the founder and CEO of Avantgarde Finance and previously founded Melonport which built the Melon protocol. As part of Melonport, Mona oversaw the first successful handoff of power from a blockchain foundation to fully decentralized governance.
Richard Muirhead
is a successful founder, operator and allocator in the technology space. He co-founded Orchestream in network management (LSE/NASDAQ in 2000, now part of Oracle), founded Tideway in application management (acquired in 2009 by BMC) and led Automic Software in business automation (acquired in 2016 by CA Technologies). Then as co-founder of Firestartr he backed firms like Tray, Citymapper and Transferwise; as GP at OpenOcean (founders of MySQL and MariaDB) backed Truecaller, Bitrise and Supermetrics. This evolved into leading the Europe-based Fabric Ventures, successor to Firestartr and sister to OpenOcean. The Fabric team specifically backs entrepreneurs in the decentralized app and Web3 arena like Orchid, Keep, Polkadot, Messari and, as I’m happy to finally share, NEAR too.
Illia Polosukhin
is a gifted technologist and entrepreneur who is cofounder and CEO of Near Inc, a company which has done much of the early research and development work for the NEAR project. Illia formerly managed a team at Google Research which worked on core question-answering capabilities and he co-developed the Tensorflow machine learning library used by the majority of applications in machine learning. He has been a driving force behind NEAR from its earliest inception.
Also of note, Ali Yahya has joined as a nonvoting advisor and observer of the council. Ali is an engineer who formerly excelled in Google’s Brain and Moonshot divisions and who is currently a partner at Andreessen Horowitz. He brings both technical depth and strategic guidance to the Foundation.
As you can see, this initial council — which is populated with successful execution-oriented entrepreneurs and technologists who have been working together since last year — is specifically designed to provide NEAR with the support it needs to build its technical infrastructure and deliver its vision to help developers and founders across the world. I’m privileged to support them and to have their support on this journey.
How the Foundation will Operate
There are many models for the operation of nonprofits and foundations but few which create the kind of operational efficiency, clarity of vision and quality of output which are absolutely necessary to deliver world class software and seed a global ecosystem. Technological development doesn’t behave the same way as charitable giving and there is no reason to confuse the two models.
The NEAR Foundation stewards a purpose without borders or boundaries and thus its operations will span the globe. We are competing with massive technology companies to marshal the best talent in the world to solve some of the hardest technological and societal challenges of our time. It will require significant skills and resources to attract world class teams and coordinate the governance of the protocol as it grows.
To achieve such an ambitious purpose, the NEAR Foundation has to combine the best practices of a high growth technology project with the purpose-driven nature of a nonprofit. Our DNA is fundamentally entrepreneurial. Everyone who is involved knows that we have to hold ourselves to the same standards of operational excellence that drive successful disruption in other industries.
In the path towards realizing its purpose, the NEAR Foundation exists to address a number of specific challenges in the NEAR ecosystem:
Coordination
Allocation
Advocacy
Governance
Let’s look at each of these in greater depth.
1. Coordination
This ecosystem is made up of an ever-growing number of participants. The research, development and evolution of the NEAR Project requires coordinating a number of participants. During the early phases of the ecosystem, this means providing technical leadership for the design of NEAR Protocol and related tooling plus gathering those teams who can best build, test and deploy applications related to it.
As the ecosystem grows, we anticipate the scope of coordination to increase across a wider portfolio of tools and groups. In particular, the most important asset we have is our broader community of developers, entrepreneurs and end users so the Foundation will do all it can to provide them with the framework within which they can organize to support each other while still remaining a nonessential part of their operations.
2. Allocation
For the ecosystem to grow, its participants need to be given sufficient incentives to drive their participation. Our early mission is to do so by ensuring that the protocol itself creates the proper incentives to drive adoption but many aspects of the ecosystem development, including the development of the core protocol itself, can only be done with additional assistance.
The Foundation will drive its financial and technical resources to support teams at all layers of the ecosystem. On the financial side, this will include a balance of grant funding (for projects, teams or communities unable to support themselves) and possibly also participatory funding (eg incubation, acceleration and traditional investment). On the technical side, this includes work across the spectrum from advisory to coordinating the direct support of projects building on NEAR who need it to overcome their own technical issues.
3. Advocacy
When building something completely new, the biggest obstacle is that, by default, no one cares. To fulfill our purpose, we will need to raise the awareness of this new suite of tools and the problems they solve among end users, developers, regulators, governments, businesses and beyond. Each of these groups will require a different approach but education is core to each of them so the NEAR Foundation will be well invested in this effort.
This explicitly includes empowering local communities to host meetups, run workshops, organize hackathons, write documentation and otherwise do whatever it takes to help bring new participants into the ecosystem and up to speed. This is an emphatically collaborative effort across a wide range of entities that make up the NEAR ecosystem.
4. Governance
As part of a decentralized ecosystem, the NEAR Foundation exists to steward the purpose of that ecosystem through servant leadership. While the structure of the Foundation is designed from the start to be independent and nonessential, we want to ensure that it is serving the key stakeholders of our ecosystem in a way which is representative of their interests.
To that end, the community will play an active part in each of the key activities the foundation undergoes. Our goal here isn’t to reinvent governance from day one but to provide a template of strong communication and transparency to start with and, as the ecosystem matures, to cede more and more of those activities back to the community’s guidance through adding elements of direct democracy, corporate governance and representative systems as appropriate.
Early Stages
We are frequently asked how the Foundation will participate in the active running of the protocol and the distribution of its overall token resources during its earliest stages.
As part of the Roadmap to MainNet rollout, the Foundation is helping to test the network and stand up its initial nodes. Since there is no inflation during this period, the Foundation doesn’t gain any tokens by undertaking these activities and they are intended to be temporary.
Once the network progresses to its next phase, the Foundation will hand off the running of nodes to the decentralized community of validators. The Foundation doesn’t intend to run full MainNet validating nodes again or to participate in voting with any of the tokens under its direct or indirect control as part of the governance functions it oversees, though it may delegate some portion of its Endowment to other validators (not affiliated with any of its directors or executives) during the early phases of the network. The NEAR network will run and govern itself independently.
As part of the initial rollout, the Foundation has a number of tokens under its control or custody. One portion of those, the Endowment (which totals 10% of the initial amount), are reserved for the Foundation in order to support its long term operations. Another portion is distributed to early collaborators and backers. The last portion (which begins large but decreases in size by the day) is intended for distribution to the community via a variety of different means from drops to grants to sales and beyond over the next several years.
To allay any concerns about the Foundation’s involvement, almost all of the tokens in its possession, whether intended for its own use or for distribution, will be programmatically locked up to varying degrees. Half of the Endowment will be locked up for long term release and each of the distribution buckets will be locked in rough accordance with the intended timeline of their use. It should go without saying that the Foundation has the network’s integrity and health as its top priorities.
The Future
My father always liked to say that you should strive to be “nonessential but irreplaceable” and that’s a good roadmap for how the Foundation’s role will progress through time.
While it is an important force in standing up the network for the first time, the Foundation will transition to a position as a cheerleader and advisor for the ecosystem as it hands off more and more of the operation and governance of the network to its participants. Ultimately, the Foundation will persist as long as its Purpose remains unfulfilled but its path will be increasingly overseen and steered by the community at large as the project grows.
The Foundation will always be a unique entity in the NEAR ecosystem but it should be a nonessential one for that ecosystem’s continued operation and success.
I look forward to a day when it is no longer necessary for the world at all.
Next Steps
With MainNet live, we have moved into a different phase of NEAR’s development. You will hear more from us as the early distribution activities are rolled out and the steady state system governance design is released. We hope to always be a helpful source of information about the NEAR network in all its phases (through both calm and difficult times) and to provide ways for participants to onboard to the project and get acquainted with each other along the way.
You can be a part of this story as well.
The NEAR Foundation represents the manifested voice of all the developers and entrepreneurs who struggle to bring innovation to the rest of the world and we need your help because there is a long way to go. Whether you’re a gifted engineer, a hustling marketer, a visionary product person, a diligent operator, a passionate lawyer or just a community enthusiast — we have a spot for you somewhere.
Find your place among the teams that drive this forward at https://pages.near.org/careers or get involved directly as a Contributor at https://pages.near.org/community. If you are a developer, you can jump into the code at https://docs.near.org and if you are already a founder, you can find support for your journey from the Open Web Collective, a platform-agnostic founder community, at https://openwebcollective.com. |
Open Call for NEAR Accelerator Service Partners
NEAR FOUNDATION
April 5, 2023
The NEAR blockchain is an open-source, decentralized blockchain protocol that is designed to be scalable, developer-friendly, and able to support a wide range of decentralized applications (dApps) and smart contracts. NEAR has built the Blockchain Operating System (BOS), a Web3 stack to integrate experiences from across the Open Web and streamline the discovery and onboarding experience for users and developers alike.
NEAR Foundation has committed over $90M in capital to ecosystem growth, and the NEAR ecosystem grew from 0 to over 1,000 teams. One of the central lessons learned during this time is that founding teams need a host of contributors — beyond investment — in order to be successful.
Partner with NEAR Foundation for Accelerator program
NEAR Foundation recently announced Updates to Decentralized Funding, Development DAO, and the Accelerator, and released an Open Call for NEAR Accelerator Partners. To further assist founders and startups, the NEAR Foundation is looking to partner with existing products, experts, and services to support teams participating in NEAR Accelerator programs in a variety of different functions:
Legal Partner RFP — Comprehensive legal and regulatory guidance for cryptocurrency, web3, and general business matters such as intellectual property, taxation, governance, regulatory rulings, outsourcing, cross-border transactions, and consumer/data protection. Expertise in Anti-Money Laundering/Know Your Customer (AML/KYC) compliance and overall startup support from formation to governance.
Marketing Partner RFP — Full-service marketing support for product engagement and go-to-market strategy execution, including digital marketing, branding, creative concept development, campaign planning, and design.
Finance Partner RFP — Comprehensive financial services tailored to web3 startups, including accounting, bookkeeping, tax services, and general financial advisory.
Talent / Recruiting Partner RFP — Specialized recruiting and talent acquisition services to help accelerator teams attract top-tier talent essential for building successful and sustainable businesses.
Technical Partner RFP — Robust infrastructure and development tools designed to expedite technical and product development, encompassing web3 services, audit services, traditional cloud services as well as UX/UI services.
Product Partner RFP — Expert guidance in product-market fit, go-to-market strategy, product vision and roadmap development, and tokenomics evaluation for web3 startups.
PR / Comm Partner RFP — Strategic public relations and communications services aimed at boosting product engagement and supporting go-to-market strategies, covering PR campaigns, social media, communications, and community outreach.
NEAR Foundation will offer service credits to teams participating in the accelerator program to use partner services to help them accomplish their vision. The ultimate goal is for founding teams to create great products and run successful fundraising campaigns, while also increasing revenue and market share for other stakeholders interacting with the product such as our service partners. NEAR Foundation commits to providing access to NEAR ecosystem teams for the selected service partner(s) as “preferred partners.” This will mean that the selected partner(s) are vetted and approved by the NEAR Foundation, and should lead to an increase in the number of projects that are interested in using the partner(s)’ product / services.
Important dates for NEAR accelerator service partners
We will be assessing proposals based on the requirements outlined in each of the function-specific service partner open calls. Service partners are expected to submit their proposals by Apr 23, 2023. The NEAR Foundation review team will be conducting interviews with selected candidates between May 13-17. The review team will shortlist the service partners between May 20-23 and will communicate further with the selected parties. Final approval will be provided by the NEAR Foundation Counsel. |
---
id: protocol
title: Protocol
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
The RPC API enables you to retrieve the current genesis and protocol configuration.
---
## Genesis Config {#genesis-config}
> Returns current genesis configuration.
- method: `EXPERIMENTAL_genesis_config`
- params: _none_
Example:
<Tabs>
<TabItem value="json" label="JSON" default>
```json
{
"jsonrpc": "2.0",
"id": "dontcare",
"method": "EXPERIMENTAL_genesis_config"
}
```
</TabItem>
<TabItem value="js" label="🌐 JavaScript" label="JavaScript">
```js
const response = await near.connection.provider.experimental_genesisConfig();
```
</TabItem>
<TabItem value="http" label="HTTPie">
```bash
http post https://rpc.testnet.near.org jsonrpc=2.0 id=dontcare method=EXPERIMENTAL_genesis_config
```
</TabItem>
</Tabs>
<details>
<summary>Example response: </summary>
<p>
```json
{
"jsonrpc": "2.0",
"result": {
"protocol_version": 29,
"genesis_time": "2020-07-31T03:39:42.911378Z",
"chain_id": "testnet",
"genesis_height": 10885359,
"num_block_producer_seats": 100,
"num_block_producer_seats_per_shard": [100],
"avg_hidden_validator_seats_per_shard": [0],
"dynamic_resharding": false,
"protocol_upgrade_stake_threshold": [4, 5],
"protocol_upgrade_num_epochs": 2,
"epoch_length": 43200,
"gas_limit": 1000000000000000,
"min_gas_price": "5000",
"max_gas_price": "10000000000000000000000",
"block_producer_kickout_threshold": 80,
"chunk_producer_kickout_threshold": 90,
"online_min_threshold": [90, 100],
"online_max_threshold": [99, 100],
"gas_price_adjustment_rate": [1, 100],
"runtime_config": {
"storage_amount_per_byte": "90949470177292823791",
"transaction_costs": {
"action_receipt_creation_config": {
"send_sir": 108059500000,
"send_not_sir": 108059500000,
"execution": 108059500000
},
"data_receipt_creation_config": {
"base_cost": {
"send_sir": 4697339419375,
"send_not_sir": 4697339419375,
"execution": 4697339419375
},
"cost_per_byte": {
"send_sir": 59357464,
"send_not_sir": 59357464,
"execution": 59357464
}
},
"action_creation_config": {
"create_account_cost": {
"send_sir": 99607375000,
"send_not_sir": 99607375000,
"execution": 99607375000
},
"deploy_contract_cost": {
"send_sir": 184765750000,
"send_not_sir": 184765750000,
"execution": 184765750000
},
"deploy_contract_cost_per_byte": {
"send_sir": 6812999,
"send_not_sir": 6812999,
"execution": 6812999
},
"function_call_cost": {
"send_sir": 2319861500000,
"send_not_sir": 2319861500000,
"execution": 2319861500000
},
"function_call_cost_per_byte": {
"send_sir": 2235934,
"send_not_sir": 2235934,
"execution": 2235934
},
"transfer_cost": {
"send_sir": 115123062500,
"send_not_sir": 115123062500,
"execution": 115123062500
},
"stake_cost": {
"send_sir": 141715687500,
"send_not_sir": 141715687500,
"execution": 102217625000
},
"add_key_cost": {
"full_access_cost": {
"send_sir": 101765125000,
"send_not_sir": 101765125000,
"execution": 101765125000
},
"function_call_cost": {
"send_sir": 102217625000,
"send_not_sir": 102217625000,
"execution": 102217625000
},
"function_call_cost_per_byte": {
"send_sir": 1925331,
"send_not_sir": 1925331,
"execution": 1925331
}
},
"delete_key_cost": {
"send_sir": 94946625000,
"send_not_sir": 94946625000,
"execution": 94946625000
},
"delete_account_cost": {
"send_sir": 147489000000,
"send_not_sir": 147489000000,
"execution": 147489000000
}
},
"storage_usage_config": {
"num_bytes_account": 100,
"num_extra_bytes_record": 40
},
"burnt_gas_reward": [3, 10],
"pessimistic_gas_price_inflation_ratio": [103, 100]
},
"wasm_config": {
"ext_costs": {
"base": 264768111,
"contract_compile_base": 35445963,
"contract_compile_bytes": 216750,
"read_memory_base": 2609863200,
"read_memory_byte": 3801333,
"write_memory_base": 2803794861,
"write_memory_byte": 2723772,
"read_register_base": 2517165186,
"read_register_byte": 98562,
"write_register_base": 2865522486,
"write_register_byte": 3801564,
"utf8_decoding_base": 3111779061,
"utf8_decoding_byte": 291580479,
"utf16_decoding_base": 3543313050,
"utf16_decoding_byte": 163577493,
"sha256_base": 4540970250,
"sha256_byte": 24117351,
"keccak256_base": 5879491275,
"keccak256_byte": 21471105,
"keccak512_base": 5811388236,
"keccak512_byte": 36649701,
"log_base": 3543313050,
"log_byte": 13198791,
"storage_write_base": 64196736000,
"storage_write_key_byte": 70482867,
"storage_write_value_byte": 31018539,
"storage_write_evicted_byte": 32117307,
"storage_read_base": 56356845750,
"storage_read_key_byte": 30952533,
"storage_read_value_byte": 5611005,
"storage_remove_base": 53473030500,
"storage_remove_key_byte": 38220384,
"storage_remove_ret_value_byte": 11531556,
"storage_has_key_base": 54039896625,
"storage_has_key_byte": 30790845,
"storage_iter_create_prefix_base": 0,
"storage_iter_create_prefix_byte": 0,
"storage_iter_create_range_base": 0,
"storage_iter_create_from_byte": 0,
"storage_iter_create_to_byte": 0,
"storage_iter_next_base": 0,
"storage_iter_next_key_byte": 0,
"storage_iter_next_value_byte": 0,
"touching_trie_node": 16101955926,
"promise_and_base": 1465013400,
"promise_and_per_promise": 5452176,
"promise_return": 560152386,
"validator_stake_base": 911834726400,
"validator_total_stake_base": 911834726400
},
"grow_mem_cost": 1,
"regular_op_cost": 3856371,
"limit_config": {
"max_gas_burnt": 200000000000000,
"max_gas_burnt_view": 200000000000000,
"max_stack_height": 16384,
"initial_memory_pages": 1024,
"max_memory_pages": 2048,
"registers_memory_limit": 1073741824,
"max_register_size": 104857600,
"max_number_registers": 100,
"max_number_logs": 100,
"max_total_log_length": 16384,
"max_total_prepaid_gas": 300000000000000,
"max_actions_per_receipt": 100,
"max_number_bytes_method_names": 2000,
"max_length_method_name": 256,
"max_arguments_length": 4194304,
"max_length_returned_data": 4194304,
"max_contract_size": 4194304,
"max_length_storage_key": 4194304,
"max_length_storage_value": 4194304,
"max_promises_per_function_call_action": 1024,
"max_number_input_data_dependencies": 128
}
},
"account_creation_config": {
"min_allowed_top_level_account_length": 0,
"registrar_account_id": "registrar"
}
},
"validators": [
{
"account_id": "node0",
"public_key": "ed25519:7PGseFbWxvYVgZ89K1uTJKYoKetWs7BJtbyXDzfbAcqX",
"amount": "1000000000000000000000000000000"
},
{
"account_id": "node1",
"public_key": "ed25519:6DSjZ8mvsRZDvFqFxo8tCKePG96omXW7eVYVSySmDk8e",
"amount": "1000000000000000000000000000000"
},
{
"account_id": "node2",
"public_key": "ed25519:GkDv7nSMS3xcqA45cpMvFmfV1o4fRF6zYo1JRR6mNqg5",
"amount": "1000000000000000000000000000000"
},
{
"account_id": "node3",
"public_key": "ed25519:ydgzeXHJ5Xyt7M1gXLxqLBW1Ejx6scNV5Nx2pxFM8su",
"amount": "1000000000000000000000000000000"
}
],
"transaction_validity_period": 86400,
"protocol_reward_rate": [1, 10],
"max_inflation_rate": [1, 20],
"total_supply": "1031467299046044096035532756810080",
"num_blocks_per_year": 31536000,
"protocol_treasury_account": "near",
"fishermen_threshold": "10000000000000000000",
"minimum_stake_divisor": 10
},
"id": "dontcare"
}
```
</p>
</details>
#### What could go wrong? {#what-could-go-wrong}
When an API request fails, the RPC server returns a structured error response with a limited number of well-defined error variants, so client code can exhaustively handle all the possible error cases. Our JSON-RPC errors follow the [verror](https://github.com/joyent/node-verror) convention for structuring the error response:
```json
{
"error": {
"name": <ERROR_TYPE>,
"cause": {
"info": {..},
"name": <ERROR_CAUSE>
},
"code": -32000,
"data": String,
"message": "Server error",
},
"id": "dontcare",
"jsonrpc": "2.0"
}
```
> **Heads up**
>
> The fields `code`, `data`, and `message` in the structure above are considered legacy ones and might be deprecated in the future. Please, don't rely on them.
Here is the exhaustive list of the error variants that can be returned by the `EXPERIMENTAL_genesis_config` method:
<table>
<thead>
<tr>
<th>
ERROR_TYPE<br />
<code>error.name</code>
</th>
<th>ERROR_CAUSE<br /><code>error.cause.name</code></th>
<th>Reason</th>
<th>Solution</th>
</tr>
</thead>
<tbody>
<tr>
<td>INTERNAL_ERROR</td>
<td>INTERNAL_ERROR</td>
<td>Something went wrong with the node itself, or it is overloaded</td>
<td>
<ul>
<li>Try again later</li>
<li>Send a request to a different node</li>
<li>Check <code>error.cause.info</code> for more details</li>
</ul>
</td>
</tr>
</tbody>
</table>
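The table above maps directly onto client-side handling. Below is a minimal, hedged sketch of how a client might call `EXPERIMENTAL_genesis_config` and branch on the structured error fields; it assumes a plain `fetch` against the testnet RPC endpoint shown in the examples above, and the function name is illustrative only.

```js
// Illustrative sketch: call EXPERIMENTAL_genesis_config and handle the structured error.
// Assumes Node 18+ (built-in fetch) and the public testnet RPC endpoint used above.
async function getGenesisConfig(rpcUrl = "https://rpc.testnet.near.org") {
  const res = await fetch(rpcUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      jsonrpc: "2.0",
      id: "dontcare",
      method: "EXPERIMENTAL_genesis_config",
    }),
  });
  const payload = await res.json();
  if (payload.error) {
    // Branch on the structured `name` / `cause.name` fields, not the legacy `code`/`message`.
    const { name, cause } = payload.error;
    if (name === "INTERNAL_ERROR") {
      // Node-side problem or overload: retry later or fall back to a different RPC node.
      throw new Error(`RPC node error: ${JSON.stringify(cause?.info)}`);
    }
    throw new Error(`Unexpected RPC error: ${name} / ${cause?.name}`);
  }
  return payload.result;
}
```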
---
## Protocol Config {#protocol-config}
> Returns the most recent protocol configuration, or the configuration at a specific queried block. Useful for finding current storage and transaction costs.
- method: `EXPERIMENTAL_protocol_config`
- params:
- [`finality`](/api/rpc/setup#using-finality-param) _OR_ [`block_id`](/api/rpc/setup#using-block_id-param)
Example:
<Tabs>
<TabItem value="json" label="JSON" default>
```json
{
"jsonrpc": "2.0",
"id": "dontcare",
"method": "EXPERIMENTAL_protocol_config",
"params": {
"finality": "final"
}
}
```
</TabItem>
<TabItem value="http" label="HTTPie">
```bash
http post https://rpc.testnet.near.org jsonrpc=2.0 id=dontcare method=EXPERIMENTAL_protocol_config \
params:='{
"finality": "final"
}'
```
</TabItem>
</Tabs>
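For completeness, here is a hedged near-api-js sketch of the same query. It assumes a connected `near` object as in the genesis config example above and uses the provider's generic `sendJsonRpc` passthrough; the exact helper available may differ between near-api-js versions.

```js
// Hedged sketch: query the protocol config via the provider's generic JSON-RPC passthrough.
const protocolConfig = await near.connection.provider.sendJsonRpc(
  "EXPERIMENTAL_protocol_config",
  { finality: "final" } // or { block_id: <height or hash> }
);
// Storage and transaction costs live under `runtime_config`, e.g.:
console.log(protocolConfig.runtime_config.storage_amount_per_byte);
```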
<details>
<summary>Example response: </summary>
<p>
```json
{
"jsonrpc": "2.0",
"result": {
"protocol_version": 45,
"genesis_time": "2020-07-31T03:39:42.911378Z",
"chain_id": "testnet",
"genesis_height": 42376888,
"num_block_producer_seats": 200,
"num_block_producer_seats_per_shard": [200],
"avg_hidden_validator_seats_per_shard": [0],
"dynamic_resharding": false,
"protocol_upgrade_stake_threshold": [4, 5],
"epoch_length": 43200,
"gas_limit": 1000000000000000,
"min_gas_price": "5000",
"max_gas_price": "10000000000000000000000",
"block_producer_kickout_threshold": 80,
"chunk_producer_kickout_threshold": 90,
"online_min_threshold": [90, 100],
"online_max_threshold": [99, 100],
"gas_price_adjustment_rate": [1, 100],
"runtime_config": {
"storage_amount_per_byte": "10000000000000000000",
"transaction_costs": {
"action_receipt_creation_config": {
"send_sir": 108059500000,
"send_not_sir": 108059500000,
"execution": 108059500000
},
"data_receipt_creation_config": {
"base_cost": {
"send_sir": 4697339419375,
"send_not_sir": 4697339419375,
"execution": 4697339419375
},
"cost_per_byte": {
"send_sir": 59357464,
"send_not_sir": 59357464,
"execution": 59357464
}
},
"action_creation_config": {
"create_account_cost": {
"send_sir": 99607375000,
"send_not_sir": 99607375000,
"execution": 99607375000
},
"deploy_contract_cost": {
"send_sir": 184765750000,
"send_not_sir": 184765750000,
"execution": 184765750000
},
"deploy_contract_cost_per_byte": {
"send_sir": 6812999,
"send_not_sir": 6812999,
"execution": 6812999
},
"function_call_cost": {
"send_sir": 2319861500000,
"send_not_sir": 2319861500000,
"execution": 2319861500000
},
"function_call_cost_per_byte": {
"send_sir": 2235934,
"send_not_sir": 2235934,
"execution": 2235934
},
"transfer_cost": {
"send_sir": 115123062500,
"send_not_sir": 115123062500,
"execution": 115123062500
},
"stake_cost": {
"send_sir": 141715687500,
"send_not_sir": 141715687500,
"execution": 102217625000
},
"add_key_cost": {
"full_access_cost": {
"send_sir": 101765125000,
"send_not_sir": 101765125000,
"execution": 101765125000
},
"function_call_cost": {
"send_sir": 102217625000,
"send_not_sir": 102217625000,
"execution": 102217625000
},
"function_call_cost_per_byte": {
"send_sir": 1925331,
"send_not_sir": 1925331,
"execution": 1925331
}
},
"delete_key_cost": {
"send_sir": 94946625000,
"send_not_sir": 94946625000,
"execution": 94946625000
},
"delete_account_cost": {
"send_sir": 147489000000,
"send_not_sir": 147489000000,
"execution": 147489000000
}
},
"storage_usage_config": {
"num_bytes_account": 100,
"num_extra_bytes_record": 40
},
"burnt_gas_reward": [3, 10],
"pessimistic_gas_price_inflation_ratio": [103, 100]
},
"wasm_config": {
"ext_costs": {
"base": 264768111,
"contract_compile_base": 35445963,
"contract_compile_bytes": 216750,
"read_memory_base": 2609863200,
"read_memory_byte": 3801333,
"write_memory_base": 2803794861,
"write_memory_byte": 2723772,
"read_register_base": 2517165186,
"read_register_byte": 98562,
"write_register_base": 2865522486,
"write_register_byte": 3801564,
"utf8_decoding_base": 3111779061,
"utf8_decoding_byte": 291580479,
"utf16_decoding_base": 3543313050,
"utf16_decoding_byte": 163577493,
"sha256_base": 4540970250,
"sha256_byte": 24117351,
"keccak256_base": 5879491275,
"keccak256_byte": 21471105,
"keccak512_base": 5811388236,
"keccak512_byte": 36649701,
"log_base": 3543313050,
"log_byte": 13198791,
"storage_write_base": 64196736000,
"storage_write_key_byte": 70482867,
"storage_write_value_byte": 31018539,
"storage_write_evicted_byte": 32117307,
"storage_read_base": 56356845750,
"storage_read_key_byte": 30952533,
"storage_read_value_byte": 5611005,
"storage_remove_base": 53473030500,
"storage_remove_key_byte": 38220384,
"storage_remove_ret_value_byte": 11531556,
"storage_has_key_base": 54039896625,
"storage_has_key_byte": 30790845,
"storage_iter_create_prefix_base": 0,
"storage_iter_create_prefix_byte": 0,
"storage_iter_create_range_base": 0,
"storage_iter_create_from_byte": 0,
"storage_iter_create_to_byte": 0,
"storage_iter_next_base": 0,
"storage_iter_next_key_byte": 0,
"storage_iter_next_value_byte": 0,
"touching_trie_node": 16101955926,
"promise_and_base": 1465013400,
"promise_and_per_promise": 5452176,
"promise_return": 560152386,
"validator_stake_base": 911834726400,
"validator_total_stake_base": 911834726400
},
"grow_mem_cost": 1,
"regular_op_cost": 3856371,
"limit_config": {
"max_gas_burnt": 200000000000000,
"max_gas_burnt_view": 200000000000000,
"max_stack_height": 16384,
"initial_memory_pages": 1024,
"max_memory_pages": 2048,
"registers_memory_limit": 1073741824,
"max_register_size": 104857600,
"max_number_registers": 100,
"max_number_logs": 100,
"max_total_log_length": 16384,
"max_total_prepaid_gas": 300000000000000,
"max_actions_per_receipt": 100,
"max_number_bytes_method_names": 2000,
"max_length_method_name": 256,
"max_arguments_length": 4194304,
"max_length_returned_data": 4194304,
"max_contract_size": 4194304,
"max_length_storage_key": 4194304,
"max_length_storage_value": 4194304,
"max_promises_per_function_call_action": 1024,
"max_number_input_data_dependencies": 128
}
},
"account_creation_config": {
"min_allowed_top_level_account_length": 0,
"registrar_account_id": "registrar"
}
},
"transaction_validity_period": 86400,
"protocol_reward_rate": [1, 10],
"max_inflation_rate": [1, 20],
"num_blocks_per_year": 31536000,
"protocol_treasury_account": "near",
"fishermen_threshold": "340282366920938463463374607431768211455",
"minimum_stake_divisor": 10
},
"id": "dontcare"
}
```
</p>
</details>
#### What could go wrong? {#what-could-go-wrong-1}
When an API request fails, the RPC server returns a structured error response with a limited number of well-defined error variants, so client code can exhaustively handle all the possible error cases. Our JSON-RPC errors follow the [verror](https://github.com/joyent/node-verror) convention for structuring the error response:
```json
{
"error": {
"name": <ERROR_TYPE>,
"cause": {
"info": {..},
"name": <ERROR_CAUSE>
},
"code": -32000,
"data": String,
"message": "Server error",
},
"id": "dontcare",
"jsonrpc": "2.0"
}
```
> **Heads up**
>
> The fields `code`, `data`, and `message` in the structure above are considered legacy ones and might be deprecated in the future. Please, don't rely on them.
Here is the exhaustive list of the error variants that can be returned by the `EXPERIMENTAL_protocol_config` method:
<table>
<thead>
<tr>
<th>
ERROR_TYPE<br />
<code>error.name</code>
</th>
<th>ERROR_CAUSE<br /><code>error.cause.name</code></th>
<th>Reason</th>
<th>Solution</th>
</tr>
</thead>
<tbody>
<tr>
<td>HANDLER_ERROR</td>
<td>UNKNOWN_BLOCK</td>
<td>The requested block has not been produced yet or it has been garbage-collected (cleaned up to save space on the RPC node)</td>
<td>
<ul>
<li>Check that the requested block is legit</li>
<li>If the block had been produced more than 5 epochs ago, try to send your request to <a href="https://near-nodes.io/intro/node-types#archival-node">an archival node</a></li>
</ul>
</td>
</tr>
<tr>
<td>INTERNAL_ERROR</td>
<td>INTERNAL_ERROR</td>
<td>Something went wrong with the node itself, or it is overloaded</td>
<td>
<ul>
<li>Try again later</li>
<li>Send a request to a different node</li>
<li>Check <code>error.cause.info</code> for more details</li>
</ul>
</td>
</tr>
</tbody>
</table>
---
|
```js
Near.call(
"primitives.sputnik-dao.near",
"act_proposal",
{ id: 0, action: "VoteApprove" },
300000000000000
);
```
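For context, the snippet above uses the BOS `Near.call` helper. A hedged near-api-js equivalent might look like the sketch below; it assumes you already have a connected `account` object, and the exact argument types for `gas` (string vs. BN/BigInt) depend on your near-api-js version.

```js
// Hedged sketch: the same `act_proposal` call made with near-api-js instead of BOS `Near.call`.
await account.functionCall({
  contractId: "primitives.sputnik-dao.near",
  methodName: "act_proposal",
  args: { id: 0, action: "VoteApprove" }, // see the note below for the other vote options
  gas: "300000000000000",                 // 300 Tgas; may need BN/BigInt depending on version
});
```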
:::note
Available vote options: `VoteApprove`, `VoteReject`, `VoteRemove`.
::: |
---
id: feature-request-and-bug-report
title: Feature Request and Bug Report
sidebar_label: Feature Request and Bug Report
description: NEAR Node Feature Request and Bug Report
---
<blockquote class="info">
<strong>Did You Know?</strong><br /><br />
As a node operator, you are welcome to create feature requests and submit bug reports on [Github](https://github.com/near/nearcore/issues) to improve the experience of running a node. Head to [`nearcore`](https://github.com/near/nearcore/issues) repository on Github to open an issue, and don't forget to tag the issue with the `nodeX` tag to indicate that this issue is related to `Node Experience`.
</blockquote>
## Feature Request and Bug Report {#feature-request-and-bug-report}
The NEAR team is actively soliciting feedback from the Node and Validator Community, and offers a process for the community to submit feature / enhancement requests and bug reports. Besides the existing NEAR Validator channels, the NEAR team is introducing a formal process for feature requests and bug reports.
With respect to the experience of operating a NEAR node, all bugs and feature enhancements are publicly tracked in the Github project tracker [`Node Experience`](https://github.com/orgs/near/projects/42). Node operators are welcome to create new issues for features and bugs, and to add these issues into the [Github project tracker](https://github.com/orgs/near/projects/42).
- **New Feature / Enhancement Request:**
- Please create a [`Github issue`](https://github.com/near/nearcore/issues)
- The issue will be reviewed and filtered through `Incoming Requests` column in the Github project [`Node Experience`](https://github.com/orgs/near/projects/42), where they will be groomed and slated for development based on priority
- **Bug Report:**
- For node operator issues, please create a Zendesk ticket for the Node team at https://near-node.zendesk.com/hc/en-us/requests/new, and attach your logs where possible.
>Got a question?
<a href="https://stackoverflow.com/questions/tagged/nearprotocol">
<h8>Ask it on StackOverflow!</h8></a>
|
Cosmose and NEAR Foundation Set To Revolutionize Retail with Web3 and AI
NEAR FOUNDATION
April 24, 2023
The Artificial Intelligence (AI) wave is touching just about every corner of commerce and culture, and retail is no exception. Cosmose AI, one of the leaders in AI and retail, has received a strategic investment from NEAR Foundation to create Web3, mobile, and retail experiences that enhance personalization without sacrificing privacy or security.
“NEAR is the most secure, scalable, and sustainable blockchain protocol,” says Miron Mironiuk, founder and CEO of Cosmose. “As such, we’re grateful for the ongoing support from NEAR Foundation and are excited about what’s to come.”
By utilizing the NEAR Blockchain Operating System (BOS) and AI-powered retail personalization, Cosmose gives users access to their data and personalized recommendations. This move toward a decentralized and user-focused Web3 future emphasizes how AI and blockchain have the power to completely alter current business structures in the retail sector.
With the help of NEAR’s technology and ecosystem, Cosmose can now change conventional retail business models and produce hyper-personalized shopping experiences that increase loyalty and happiness. These experiences will include its flagship mobile application KaiKai and a suite of AI-powered personalization tools.
Personalizing retail with Web3 experiences
Shopper and user personalization is nothing new in retail, but Cosmose is blazing a new trail with the power of its proprietary AI engine. Cosmose’s AI gathers and analyzes user data to produce suggestions and experiences that are uniquely tailored to each user.
By building on NEAR, Cosmose is able to help retailers develop stronger customer interactions while also addressing privacy and data security issues endemic to Web2 retail data collection. Cosmose and its KaiKai mobile app are trusted by top brands including LVMH, Richemont, L’Oréal, and Estée Lauder.
“Having built on NEAR in 2022 and while working with NEAR Foundation we discovered that our visions for the Web3-driven future are aligned,” Mironiuk continues.
Through the partnership, Cosmose will be able to offer individualized experiences to its worldwide clientele through a number of channels, including online, in-store, and mobile.
“This investment from NEAR Foundation is a testament to Cosmose AI’s strength and potential to revolutionize e-commerce and the retail industry,” said George Raymond Zage III, founder and CEO of Tiga Investments and a Cosmose board member. “We’re excited to see Cosmose AI’s continued growth and success.”
KaiKai: Introducing “Shoppertainment” on the NEAR Blockchain Operating System
KaiKai, Cosmose AI’s flagship product, combines shopping, retail, and gamification to create a new category altogether: Shoppertainment. The mobile shopping experience is constructed in a unique way, using the BOS as a backbone to make brand discovery and engagement more fun and rewarding.
Some of KaiKai’s features that brands are already using include:
Exclusive product drops available for a limited time
Augmented Reality (AR) technology that brings products to life
Livestreams featuring celebrities and influencers
Ads displayed on users’ lock screens without interruptions
Geolocation features displaying available products in the user’s area
Rewards for collecting products and writing reviews
A secure Near wallet accessible via the KaiKai app
KaiKai also features a native cryptocurrency called Kai-Ching, which users earn and spend like any other retail rewards program. The difference is that Cosmose’s AI provides more personalized recommendations and rewards suggestions, with customers easily transacting on the NEAR blockchain with a native KaiKai crypto wallet.
“We’re excited to support Cosmose as it continues to scale rapidly and create new ways for retailers to offer customers the best offline and online shopping experiences,” noted Marieke Flament, CEO of the NEAR Foundation. “Cosmose has already been building on NEAR testnet, and with this additional support it will have many more opportunities to grow and expand its offerings with Web3 in a sustainable, transparent, and infinitely scalable way.”
Flament added: “Cosmose’s excellent AI innovation will help to intensify its global marketplace lead, and with superior AI-driven personalization, its user base will undoubtedly continue to grow as new and existing customers are seamlessly transitioned into the world of Web3 and all the exciting opportunities it brings.”
It’s no secret that AI is turning industries, communities, and business models on their heads – and that’s not a bad thing. NEAR Foundation’s commitment to developing Cosmose’s retail technology, including KaiKai, signals that consumers will now be even more empowered with more personal recommendations, a “shoppertaining” experience, and the sound of “Kai-Ching” as they earn crypto rewards powered by NEAR and BOS.
“Together we’ll build a future where one billion users benefit from the ecosystem they’re part of, with complete control of their data and superior AI-driven personalization,” said Mironiuk. |
# Receipt
All cross-contract communication in NEAR happens through Receipts (we assume that each account lives in its own shard).
Receipts are stateful in the sense that they serve not only as messages between accounts but can also be stored in the account storage to await DataReceipts.
Each receipt has a [`predecessor_id`](#predecessor_id) (who sent it) and a [`receiver_id`](#receiver_id) (the current account).
Receipts are one of 2 types: action receipts or data receipts.
Data Receipts are receipts that contain some data for some `ActionReceipt` with the same `receiver_id`.
Data Receipts have 2 fields: the unique data identifier `data_id` and `data`, the received result.
`data` is an `Option` field and it indicates whether the result was a success or a failure. If it's `Some`, it means
the remote execution was successful and it represents the result as a vector of bytes.
Each `ActionReceipt` also contains fields related to data:
- [`input_data_ids`](#input_data_ids) - a vector of input data with the `data_id`s required for the execution of this receipt.
- [`output_data_receivers`](#output_data_receivers) - a vector of output data receivers. It indicates where to send outgoing data.
Each `DataReceiver` consists of `data_id` and `receiver_id` for routing.
Before any action receipt is executed, all input data dependencies need to be satisfied,
which means all corresponding data receipts have to be received.
If any of the data dependencies are missing, the action receipt is postponed until all missing data dependencies arrive.
Because the chain and runtime guarantee that no receipts are lost, we can rely on every action receipt being executed eventually ([Receipt Matching explanation](#receipt-matching)).
Each `Receipt` has the following fields:
#### predecessor_id
- **`type`**: `AccountId`
The account_id which issued a receipt.
In case of a gas or deposit refund, the account ID is `system`.
#### receiver_id
- **`type`**: `AccountId`
The destination account_id.
#### receipt_id
- **`type`**: `CryptoHash`
A unique id for the receipt.
#### receipt
- **`type`**: [ActionReceipt](#actionreceipt) | [DataReceipt](#datareceipt)
There are 2 types of Receipt: [ActionReceipt](#actionreceipt) and [DataReceipt](#datareceipt). An `ActionReceipt` is a request to apply [Actions](Actions.md), while a `DataReceipt` is a result of the application of these actions.
## ActionReceipt
`ActionReceipt` represents a request to apply actions on the `receiver_id` side. It could be derived as a result of a `Transaction` execution or another `ActionReceipt` processing. `ActionReceipt` consists of the following fields:
#### signer_id
- **`type`**: `AccountId`
An account_id which signed the original [transaction](Transactions.md).
In case of a deposit refund, the account ID is `system`.
#### signer_public_key
- **`type`**: `PublicKey`
The public key of an [AccessKey](../DataStructures/AccessKey.md) which was used to sign the original transaction.
In case of a deposit refund, the public key is empty (all bytes are 0).
#### gas_price
- **`type`**: `u128`
Gas price which was set in a block where the original [transaction](Transactions.md) has been applied.
#### output_data_receivers
- **`type`**: `[DataReceiver{ data_id: CryptoHash, receiver_id: AccountId }]`
If a smart contract finishes its execution with some value (not a Promise), the runtime creates a [`DataReceipt`](#datareceipt) for each of the `output_data_receivers`.
#### input_data_ids
- **`type`**: `[CryptoHash]`
`input_data_ids` are the receipt data dependencies. `input_data_ids` correspond to `DataReceipt.data_id`.
#### actions
- **`type`**: [`FunctionCall`](Actions.md#functioncallaction) | [`TransferAction`](Actions.md#transferaction) | [`StakeAction`](Actions.md#stakeaction) | [`AddKeyAction`](Actions.md#addkeyaction) | [`DeleteKeyAction`](Actions.md#deletekeyaction) | [`CreateAccountAction`](Actions.md#createaccountaction) | [`DeleteAccountAction`](Actions.md#deleteaccountaction)
## DataReceipt
`DataReceipt` represents a final result of some contract execution.
#### data_id
- **`type`**: `CryptoHash`
A unique `DataReceipt` identifier.
#### data
- **`type`**: `Option([u8])`
Associated data in bytes. `None` indicates an error during execution.
# Creating Receipt
Receipts can be generated during the execution of a [SignedTransaction](./Transactions.md#SignedTransaction) (see [example](./Scenarios/FinancialTransaction.md)) or during application of some `ActionReceipt` which contains a [`FunctionCall`](#actions) action. The result of the `FunctionCall` could be either another `ActionReceipt` or a `DataReceipt` (returned data).
# Receipt Matching
Runtime doesn't require that Receipts come in a particular order. Each Receipt is processed individually. The goal of the `Receipt Matching` process is to match all [`ActionReceipt`s](#actionreceipt) to the corresponding [`DataReceipt`s](#datareceipt).
## Processing ActionReceipt
For each incoming [`ActionReceipt`](#actionreceipt), the runtime checks whether we have all the [`DataReceipt`s](#datareceipt) (defined in [`ActionReceipt.input_data_ids`](#input_data_ids)) required for execution. If all the required [`DataReceipt`s](#datareceipt) are already in the [storage](#received-datareceipt), the runtime can apply this `ActionReceipt` immediately. Otherwise we save this receipt as a [Postponed ActionReceipt](#postponed-actionreceipt). We also save the [Pending DataReceipts Count](#pending-datareceipt-count) and [a link from each pending `DataReceipt` to the `Postponed ActionReceipt`](#pending-datareceipt-for-postponed-actionreceipt). The runtime will then wait for all the missing `DataReceipt`s before applying the `Postponed ActionReceipt`.
#### Postponed ActionReceipt
A Receipt which runtime stores until all the designated [`DataReceipt`s](#datareceipt) arrive.
- **`key`** = `account_id`,`receipt_id`
- **`value`** = `[u8]`
_Where `account_id` is [`Receipt.receiver_id`](#receiver_id), `receipt_id` is [`Receipt.receipt_id`](#receipt_id) and value is a serialized [`Receipt`](#receipt) (which [type](#type) must be [ActionReceipt](#actionreceipt))._
#### Pending DataReceipt Count
A counter which counts the pending [`DataReceipt`s](#datareceipt) for a [Postponed ActionReceipt](#postponed-actionreceipt). It is initially set to the number of missing [`input_data_ids`](#input_data_ids) of the incoming `ActionReceipt` and is decremented with every newly received [`DataReceipt`](#datareceipt):
- **`key`** = `account_id`,`receipt_id`
- **`value`** = `u32`
_Where `account_id` is AccountId, `receipt_id` is CryptoHash and value is an integer._
#### Pending DataReceipt for Postponed ActionReceipt
We index each pending `DataReceipt` so that when a new [`DataReceipt`](#datareceipt) arrives, we can connect it to the [Postponed ActionReceipt](#postponed-actionreceipt) it belongs to.
- **`key`** = `account_id`,`data_id`
- **`value`** = `receipt_id`
## Processing DataReceipt
#### Received DataReceipt
First of all, runtime saves the incoming `DataReceipt` to the storage as:
- **`key`** = `account_id`,`data_id`
- **`value`** = `[u8]`
_Where `account_id` is [`Receipt.receiver_id`](#receiver_id), `data_id` is [`DataReceipt.data_id`](#data_id) and value is a [`DataReceipt.data`](#data) (which is typically a serialized result of the call to a particular contract)._
Next, the runtime checks if there is any [`Postponed ActionReceipt`](#postponed-actionreceipt) waiting for this `DataReceipt` by querying the [`Pending DataReceipt` to the Postponed Receipt](#pending-datareceipt-for-postponed-actionreceipt) index. If there is no postponed `receipt_id` yet, we do nothing else. If there is a postponed `receipt_id`, we do the following:
- decrement [`Pending Data Count`](#pending-datareceipt-count) for the postponed `receipt_id`
- remove found [`Pending DataReceipt` to the `Postponed ActionReceipt`](#pending-datareceipt-for-postponed-actionreceipt)
If the [`Pending DataReceipt Count`](#pending-datareceipt-count) is now 0, it means all the [`Receipt.input_data_ids`](#input_data_ids) are in storage and the runtime can safely apply the [Postponed ActionReceipt](#postponed-actionreceipt) and remove it from the store.
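The bookkeeping above can be condensed into a short sketch. This is an illustrative model only, not the nearcore implementation: plain in-memory maps stand in for the per-account trie entries keyed as described above, and the `applyReceipt` callback stands in for actually executing the receipt.

```js
// Illustrative model of "Processing DataReceipt"; storage keys mirror the layouts above.
const postponedReceipts = new Map();       // "account_id,receipt_id" -> serialized ActionReceipt
const pendingDataReceiptCount = new Map(); // "account_id,receipt_id" -> u32
const pendingDataReceiptIndex = new Map(); // "account_id,data_id"    -> receipt_id
const dataReceipts = new Map();            // "account_id,data_id"    -> data

function processDataReceipt(accountId, dataReceipt, applyReceipt) {
  const dataKey = `${accountId},${dataReceipt.data_id}`;
  dataReceipts.set(dataKey, dataReceipt.data);            // "Received DataReceipt"

  const receiptId = pendingDataReceiptIndex.get(dataKey); // is a postponed receipt waiting?
  if (receiptId === undefined) return;                    // no postponed receipt yet: nothing else to do

  pendingDataReceiptIndex.delete(dataKey);
  const countKey = `${accountId},${receiptId}`;
  const remaining = pendingDataReceiptCount.get(countKey) - 1;

  if (remaining > 0) {
    pendingDataReceiptCount.set(countKey, remaining);
  } else {
    // All input_data_ids are now in storage: apply the postponed receipt and clean up.
    pendingDataReceiptCount.delete(countKey);
    const postponed = postponedReceipts.get(countKey);
    postponedReceipts.delete(countKey);
    applyReceipt(postponed);
  }
}
```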
## Case 1: Call to multiple contracts and await responses
Suppose runtime got the following `ActionReceipt`:
```python
# Non-relevant fields are omitted.
Receipt{
receiver_id: "alice",
receipt_id: "693406"
receipt: ActionReceipt {
input_data_ids: []
}
}
```
Since `input_data_ids` is empty, this receipt can be applied right away. If the execution returns a value (`Result::Value`), the runtime creates a `DataReceipt` for each of the receipt's `output_data_receivers` (see [Creating Receipt](#creating-receipt)).
Now suppose runtime got the following `ActionReceipt` (we use a python-like pseudo code):
```python
# Non-relevant fields are omitted.
Receipt{
receiver_id: "alice",
receipt_id: "5e73d4"
receipt: ActionReceipt {
input_data_ids: ["e5fa44", "7448d8"]
}
}
```
We can't apply this receipt right away: there are missing `DataReceipt`s with IDs `["e5fa44", "7448d8"]`. Runtime does the following:
```python
postponed_receipts["alice,5e73d4"] = borsh_serialize(
Receipt{
receiver_id: "alice",
receipt_id: "5e73d4"
receipt: ActionReceipt {
input_data_ids: ["e5fa44", "7448d8"]
}
}
)
pending_data_receipt_store["alice,e5fa44"] = "5e73d4"
pending_data_receipt_store["alice,7448d8"] = "5e73d4"
pending_data_receipt_count["alice,5e73d4"] = 2
```
_Note: the subsequent Receipts could arrive in the current block or the next one; that's why we save the [Postponed ActionReceipt](#postponed-actionreceipt) in the storage_
Then the first pending `Pending DataReceipt` arrives:
```python
# Non-relevant fields are omitted.
Receipt {
receiver_id: "alice",
receipt: DataReceipt {
data_id: "e5fa44",
data: "some data for alice",
}
}
```
```python
data_receipts["alice,e5fa44"] = borsh_serialize(Receipt{
receiver_id: "alice",
receipt: DataReceipt {
data_id: "e5fa44",
data: "some data for alice",
}
})
pending_data_receipt_count["alice,5e73d4"] = 1
del pending_data_receipt_store["alice,e5fa44"]
```
And finally the last `Pending DataReceipt` arrives:
```python
# Non-relevant fields are omitted.
Receipt{
receiver_id: "alice",
receipt: DataReceipt {
data_id: "7448d8",
data: "some more data for alice",
}
}
```
```python
data_receipts["alice,7448d8"] = borsh_serialize(Receipt{
receiver_id: "alice",
receipt: DataReceipt {
data_id: "7448d8",
data: "some more data for alice",
}
})
postponed_receipt_id = pending_data_receipt_store["alice,5e73d4"]
postponed_receipt = postponed_receipts[postponed_receipt_id]
del postponed_receipts[postponed_receipt_id]
del pending_data_receipt_count["alice,5e73d4"]
del pending_data_receipt_store["alice,7448d8"]
apply_receipt(postponed_receipt)
```
## Receipt Validation Error
Some postprocessing validation is done after an action receipt is applied. The validation includes:
* Whether the generated receipts are valid. A generated receipt can be invalid, if, for example, a function call
generates a receipt to call another function on some other contract, but the contract name is invalid. Here there are
mainly two types of errors:
- account id is invalid. If the receiver id of the receipt is invalid, a
```rust
/// The `receiver_id` of a Receipt is not valid.
InvalidReceiverId { account_id: AccountId },
```
error is returned.
- some action is invalid. The errors returned here are the same as the validation errors mentioned in [actions](Actions.md).
* Whether the account still has enough balance to pay for storage. If, for example, the execution of one function call
action leads to some receipts that require transfer to be generated as a result, the account may no longer have enough
balance after the transferred amount is deducted. In this case, a
```rust
/// ActionReceipt can't be completed, because the remaining balance will not be enough to cover storage.
LackBalanceForState {
/// An account which needs balance
account_id: AccountId,
/// Balance required to complete an action.
amount: Balance,
},
```
|
NEAR Now Integrated into Ledger Live For More Security and Ownership
NEAR FOUNDATION
January 30, 2023
Wallet security can be intimidating, especially for newcomers. But it just got a whole lot easier for NEAR users. NEAR Foundation is excited to announce full integration with Ledger Live, making it easier than ever for NEAR holders to secure their assets using Ledger’s devices and software.
Ledger Live is Ledger’s all-in-one digital asset management app enabling its users to manage and stake an ever-growing range of digital assets from the security and self-custody of their hardware wallets. Through its single, secure app, anyone from newcomers to crypto natives can follow the market, manage their DeFi portfolio, and explore Web3 with complete control and freedom.
“We are thrilled to help more people enjoy the benefits of self-custody, no matter what their level of crypto experience,” said Marieke Flament, CEO of the NEAR Foundation. “This new partnership will bring even easier and more secure access to cryptocurrency worldwide, as we remain committed to serving as the de facto entry point to Web3 – simplifying the onboarding experience for users even if they have never used crypto, tokens, keys, or wallets.”
Carl Anderson, VP B2C Engineering at Ledger, adds: “I’m pleased to see a crypto player like NEAR join the Ledger ecosystem. This integration highlights what Ledger Live really is: an all-in-one asset management app where users can manage their digital assets, visualize their NFTs, and explore an ever-growing range of Web3 apps from the security of their hardware wallets.”
How Ledger Live works
To use Ledger Live, users must have a Ledger hardware wallet and then download the desktop and/or mobile app. This all-in-one digital asset management app allows users to track, buy, sell and swap an ever-growing range of tokens, visualize NFTs, earn rewards, manage NFT collections, and securely stake their digital assets with Ledger by Figment validator while enjoying the benefits of self-custody.
Users can delegate their NEAR tokens to the Ledger by Figment validator, contributing to the protocol’s security while benefiting from coverage against slashing risks, low commission fees, and earning rewards from that contribution.
Visit the Ledger Live webpage for more details on using the app and staking with the Ledger by Figment validator.
DISCLOSURE NOTICE: Investment in cryptocurrency and other digital assets is highly speculative and comes with significant risks. The value of these assets can be highly volatile and may fluctuate significantly in a short period of time. It is important to conduct thorough research and consult with a financial advisor before making any investment decisions. Additionally, it is important to understand the regulatory environment and potential legal implications of investing in cryptocurrency. This disclosure is not a recommendation to buy or sell any particular asset and is for informational purposes only. |
---
id: wallet
title: Wallet Chain Key Rules
---
## Overview
In this article you'll find details on how to parse and present multichain transactions to the user so they can make an informed decision about their wallet's assets, while minimizing the number of times the user has to consent.
You'll also learn how to ensure that a signature on one chain is not used to take a meaningful action on another chain.
### Key derivation
When signing using [chain signatures](./chain-signatures.md), each account has an unlimited number of keys. Each key's public key is derived from the account name and the key extension, which is an arbitrary string.
A user's keys can be described as follows:
```
"david.near," A key with no extension
"david.near, " A key with an extension of " "
"david.near,cold_wallet" A key with an extension of "cold_wallet"
```
:::tip
If the keys aren't identical they have no relationship.
:::
### Ambiguous signatures
You're going to be potentially storing keys for users who hold assets on many chains. Different chains have different ways of serializing and signing transactions. Many chains take steps to ensure that their signatures are not valid signatures on other chains. EVM chains use `ChainID` to disambiguate signatures between different EVM chains. Dfinity uses a unique salt on the hash of the transaction.
Unfortunately, while this is a best practice, you can't guarantee that all chains do this. As such, a user could receive an innocent looking transaction on one chain that could be used to take a destructive action on another chain.
An apocryphal example:
```
Transaction: "7b73656e643a2022313030222c206e6f74652022227d"
Parsed SOL: claim free NFT
Parsed BTC: send 100 BTC to Attacker
```
The user would approve the `SOL` transaction but the attacker would also get the `BTC` transaction.
This can be solved by having different keys for any chains that you can't prove are free of ambiguous transactions. This means that while an attacker may create ambiguous transactions, they will only affect wallets without assets on the target chain.
## Serialization format
We're using the following format for our derivation paths.
```typescript
{
chain: number, // SLIP-44 coin type unsigned integer
domain: String, // The domain that owns this key
meta: any, // Catch all data structure
}
```
This is encoded in canonical [JSON RFC 8785](https://www.rfc-editor.org/rfc/rfc8785).
:::info
If you are not using a field, don't set it to `null`; omit the key entirely instead.
:::
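As a concrete illustration, the sketch below canonicalizes a derivation path before use. It assumes the third-party `canonicalize` npm package, which implements RFC 8785; any compliant canonical-JSON encoder would work the same way.

```js
// Hedged sketch: produce the canonical (RFC 8785) serialization of a derivation path.
import canonicalize from "canonicalize";

const derivationPath = {
  chain: 0,           // SLIP-44 coin type (0 = Bitcoin)
  domain: "near.org", // the domain that owns this key
  // `meta` is omitted entirely rather than set to null, per the note above
};

const serialized = canonicalize(derivationPath);
// => '{"chain":0,"domain":"near.org"}' (keys sorted, no insignificant whitespace)
```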
### User-defined fields
For user-defined fields, the `meta` field can include any data you like.
:::tip
Do not add any extra fields at the top level, as that may clash with future versions of this specification. If needed, put them in the `meta` field instead.
:::
For example, a simple way of selecting alternate keys is to use an object with an `id` field:
```typescript
{
meta: {id: 10}, // Pick the tenth bitcoin key
chain: 0,
}
```
### Examples
| Key | Description |
|-----|-------------|
| `{chain: 0, domain: "near.org"}` | A bitcoin key used on `near.org` |
| `{chain: 0, meta: {id: 3}}` | Use the third bitcoin key |
## Example user flows
In the following examples, the messages are coming from the user's wallet frontend.
- [Using a domain's Bitcoin key](#using-a-domains-bitcoin-key)
- [Using a personal Bitcoin key](#using-a-personal-bitcoin-key)
- [Using a personal EVM key to sign a Binance transaction](#using-a-personal-evm-key-to-sign-a-binance-transaction)
- [Using an untyped domain key](#using-an-untyped-domain-key)
- [Using another domain's Bitcoin key](#using-another-domains-bitcoin-key)
:::tip
Wallet developers should follow this user flow format.
:::
### Using a domain's Bitcoin key
An application at `near.org` wants to sign the Bitcoin transaction `Send 100 BTC` using the key `david.near,bitcoin,near.org,`.
```
Signed ✅
```
We sign the transaction without confirmation because the key is owned by `near.org`.
### Using a personal Bitcoin key
An application at `near.org` wants to sign the Bitcoin transaction `Send 100 BTC` using the key `david.near,bitcoin,`.
```
near.org would like to run the following Bitcoin transaction:
Send 100 BTC
[Accept] [Reject]
```
The user must make an informed decision about whether this is an action they would like to take.
```
Signed ✅
```
### Using a personal EVM key to sign a Binance transaction
An application at `near.org` wants to sign the Binance Smart Chain transaction `Send 100 BNB, ChainID 56` using the key `david.near,evm,`. The wallet knows this is a BSC transaction because of the corresponding `ChainID` (56) and because the `evm` key is being used.
```
near.org would like to run the following Binance Smart Chain transaction:
Send 100 BNB
[Accept] [Reject]
```
The user must make an informed decision about whether this is an action they would like to take.
```
Signed ✅
```
### Using an untyped domain key
An application at `near.org` wants to sign the Bitcoin transaction `Send 100 BTC` using the key `david.near,,near.org,`.
```
Signed ✅
```
While this is ill-advised, it's still the domain's key, so the domain can still choose whether to sign something with it.
### Using another domain's Bitcoin key
An application at `attacker.com` wants to sign the Bitcoin transaction `Send 100 BTC` using the key `david.near,bitcoin,near.org,`.
```
Attacker.com would like to sign a transaction using your credentials from near.org
Send 100 BTC
This is a suspicious transaction and likely not one you should accept
[Reject] [Accept (Are you sure!)]
```
The user must make an explicit decision to do something that is ill advised.
```
Signed ✅
```
The correct way for `attacker.com` to make this request is to somehow redirect the user to `near.org` and get the user to make a decision there.
|
# Fungible Token Event
Version `1.0.0`
## Summary
Standard interfaces for FT contract actions.
Extension of [NEP-297](../../EventsFormat.md)
## Motivation
NEAR and third-party applications need to track `mint`, `transfer`, `burn` events for all FT-driven apps consistently.
This extension addresses that.
Keep in mind that applications, including NEAR Wallet, could require implementing additional methods, such as [`ft_metadata`](Metadata.md), to display the FTs correctly.
## Interface
Fungible Token Events MUST have `standard` set to `"nep141"`, `version` set to `"1.0.0"`, `event` set to one of `ft_mint`, `ft_burn`, or `ft_transfer`, and `data` of one of the following relevant types: `FtMintLog[] | FtTransferLog[] | FtBurnLog[]`:
```ts
interface FtEventLogData {
standard: "nep141",
version: "1.0.0",
event: "ft_mint" | "ft_burn" | "ft_transfer",
data: FtMintLog[] | FtTransferLog[] | FtBurnLog[],
}
```
```ts
// An event log to capture tokens minting
// Arguments
// * `owner_id`: "account.near"
// * `amount`: the number of tokens to mint, wrapped in quotes and treated
// like a string, although the number will be stored as an unsigned integer
// with 128 bits.
// * `memo`: optional message
interface FtMintLog {
owner_id: string,
amount: string,
memo?: string
}
// An event log to capture tokens burning
// Arguments
// * `owner_id`: owner of tokens to burn
// * `amount`: the number of tokens to burn, wrapped in quotes and treated
// like a string, although the number will be stored as an unsigned integer
// with 128 bits.
// * `memo`: optional message
interface FtBurnLog {
owner_id: string,
amount: string,
memo?: string
}
// An event log to capture tokens transfer
// Arguments
// * `old_owner_id`: "owner.near"
// * `new_owner_id`: "receiver.near"
// * `amount`: the number of tokens to transfer, wrapped in quotes and treated
// like a string, although the number will be stored as an unsigned integer
// with 128 bits.
// * `memo`: optional message
interface FtTransferLog {
old_owner_id: string,
new_owner_id: string,
amount: string,
memo?: string
}
```
## Examples
Batch mint:
```js
EVENT_JSON:{
"standard": "nep141",
"version": "1.0.0",
"event": "ft_mint",
"data": [
{"owner_id": "foundation.near", "amount": "500"}
]
}
```
Batch transfer:
```js
EVENT_JSON:{
"standard": "nep141",
"version": "1.0.0",
"event": "ft_transfer",
"data": [
{"old_owner_id": "from.near", "new_owner_id": "to.near", "amount": "42", "memo": "hi hello bonjour"},
{"old_owner_id": "user1.near", "new_owner_id": "user2.near", "amount": "7500"}
]
}
```
Batch burn:
```js
EVENT_JSON:{
"standard": "nep141",
"version": "1.0.0",
"event": "ft_burn",
"data": [
{"owner_id": "foundation.near", "amount": "100"},
]
}
```
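To tie the examples together, the hedged sketch below shows how a contract written with near-sdk-js might emit one of these logs from inside a method. Per [NEP-297](../../EventsFormat.md), an event is an ordinary log whose message starts with the literal `EVENT_JSON:` prefix followed by the JSON payload; the helper name here is illustrative only.

```js
// Hedged sketch: emit an ft_transfer event from a near-sdk-js contract method.
import { near } from "near-sdk-js";

function emitFtTransfer(oldOwnerId, newOwnerId, amount, memo) {
  const event = {
    standard: "nep141",
    version: "1.0.0",
    event: "ft_transfer",
    data: [{ old_owner_id: oldOwnerId, new_owner_id: newOwnerId, amount, memo }],
  };
  // NEP-297 events are logs prefixed with the literal string "EVENT_JSON:".
  near.log(`EVENT_JSON:${JSON.stringify(event)}`);
}
```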
## Further methods
Note that the examples above cover two different kinds of events:
1. Events that are not specified in the FT Standard (`ft_mint`, `ft_burn`)
2. An event that is covered in the [FT Core Standard](Core.md). (`ft_transfer`)
Please feel free to open pull requests for extending the events standard detailed here as needs arise.
|
# Approval Management
## [NEP-178](https://github.com/near/NEPs/blob/master/neps/nep-0178.md)
Version `1.1.0`
## Summary
A system for allowing a set of users or contracts to transfer specific Non-Fungible Tokens on behalf of an owner. Similar to approval management systems in standards like [ERC-721].
[ERC-721]: https://eips.ethereum.org/EIPS/eip-721
## Motivation
People familiar with [ERC-721] may expect to need an approval management system for basic transfers, where a simple transfer from Alice to Bob requires that Alice first _approve_ Bob to spend one of her tokens, after which Bob can call `transfer_from` to actually transfer the token to himself.
NEAR's [core Non-Fungible Token standard](Core.md) includes good support for safe atomic transfers without such complexity. It even provides "transfer and call" functionality (`nft_transfer_call`) which allows a specific token to be "attached" to a call to a separate contract. For many Non-Fungible Token workflows, these options may circumvent the need for a full-blown Approval Management system.
However, some Non-Fungible Token developers, marketplaces, dApps, or artists may require greater control. This standard provides a uniform interface allowing token owners to approve other NEAR accounts, whether individuals or contracts, to transfer specific tokens on the owner's behalf.
Prior art:
- Ethereum's [ERC-721]
- [NEP-4](https://github.com/near/NEPs/pull/4), NEAR's old NFT standard that does not include approved_account_ids per token ID
## Example Scenarios
Let's consider some examples. Our cast of characters & apps:
* Alice: has account `alice` with no contract deployed to it
* Bob: has account `bob` with no contract deployed to it
* NFT: a contract with account `nft`, implementing only the [Core NFT standard](Core.md) with this Approval Management extension
* Market: a contract with account `market` which sells tokens from `nft` as well as other NFT contracts
* Bazaar: similar to Market, but implemented differently (spoiler alert: has no `nft_on_approve` function!), has account `bazaar`
Alice and Bob are already [registered](../../StorageManagement.md) with NFT, Market, and Bazaar, and Alice owns a token on the NFT contract with ID=`"1"`.
Let's examine the technical calls through the following scenarios:
1. [Simple approval](#1-simple-approval): Alice approves Bob to transfer her token.
2. [Approval with cross-contract call (XCC)](#2-approval-with-cross-contract-call): Alice approves Market to transfer one of her tokens and passes `msg` so that NFT will call `nft_on_approve` on Market's contract.
3. [Approval with XCC, edge case](#3-approval-with-cross-contract-call-edge-case): Alice approves Bazaar and passes `msg` again, but what's this? Bazaar doesn't implement `nft_on_approve`, so Alice sees an error in the transaction result. Not to worry, though, she checks `nft_is_approved` and sees that she did successfully approve Bazaar, despite the error.
4. [Approval IDs](#4-approval-ids): Bob buys Alice's token via Market.
5. [Approval IDs, edge case](#5-approval-ids-edge-case): Bob transfers same token back to Alice, Alice re-approves Market & Bazaar. Bazaar has an outdated cache. Bob tries to buy from Bazaar at the old price.
6. [Revoke one](#6-revoke-one): Alice revokes Market's approval for this token.
7. [Revoke all](#7-revoke-all): Alice revokes all approval for this token.
### 1. Simple Approval
Alice approves Bob to transfer her token.
**High-level explanation**
1. Alice approves Bob
2. Alice queries the token to verify
3. Alice verifies a different way
**Technical calls**
1. Alice calls `nft::nft_approve({ "token_id": "1", "account_id": "bob" })`. She attaches 1 yoctoⓃ, (.000000000000000000000001Ⓝ). Using [NEAR CLI](https://docs.near.org/tools/near-cli) to make this call, the command would be:
```bash
near call nft nft_approve \
'{ "token_id": "1", "account_id": "bob" }' \
--accountId alice --depositYocto 1
```
The response:
''
2. Alice calls view method `nft_token`:
```bash
near view nft nft_token \
'{ "token_id": "1" }'
```
The response:
```bash
{
"token_id": "1",
"owner_id": "alice.near",
"approved_account_ids": {
"bob": 1,
}
}
```
3. Alice calls view method `nft_is_approved`:
```bash
near view nft nft_is_approved \
'{ "token_id": "1", "approved_account_id": "bob" }'
```
The response:
true
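For dApp developers, here is a hedged near-api-js version of the same approve-and-verify flow. It assumes `aliceAccount` is a connected `Account` object for `alice`; the exact `functionCall` / `viewFunction` signatures and the accepted types for `gas` and `attachedDeposit` vary between near-api-js versions, so treat this as a sketch rather than copy-paste code.

```js
// Hedged sketch: approve `bob` and verify the approval with the view methods used above.
await aliceAccount.functionCall({
  contractId: "nft",
  methodName: "nft_approve",
  args: { token_id: "1", account_id: "bob" },
  attachedDeposit: "1",  // 1 yoctoNEAR, required for security (may need BN/BigInt)
  gas: "30000000000000", // 30 Tgas, an illustrative value
});

const token = await aliceAccount.viewFunction({
  contractId: "nft",
  methodName: "nft_token",
  args: { token_id: "1" },
});
console.log(token.approved_account_ids); // e.g. { bob: 1 }
```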
### 2. Approval with cross-contract call
Alice approves Market to transfer one of her tokens and passes `msg` so that NFT will call `nft_on_approve` on Market's contract. She probably does this via Market's frontend app which would know how to construct `msg` in a useful way.
**High-level explanation**
1. Alice calls `nft_approve` to approve `market` to transfer her token, and passes a `msg`
2. Since `msg` is included, `nft` will schedule a cross-contract call to `market`
3. Market can do whatever it wants with this info, such as listing the token for sale at a given price. The result of this operation is returned as the promise outcome to the original `nft_approve` call.
**Technical calls**
1. Using near-cli:
```bash
near call nft nft_approve '{
"token_id": "1",
"account_id": "market",
"msg": "{\"action\": \"list\", \"price\": \"100\", \"token\": \"nDAI\" }"
}' --accountId alice --depositYocto 1
```
At this point, near-cli will hang until the cross-contract call chain fully resolves, which would also be true if Alice used a Market frontend using [near-api-js](https://docs.near.org/tools/near-api-js/quick-reference). Alice's part is done, though. The rest happens behind the scenes.
2. `nft` schedules a call to `nft_on_approve` on `market`. Using near-cli notation for easy cross-reference with the above, this would look like:
```bash
near call market nft_on_approve '{
"token_id": "1",
"owner_id": "alice",
"approval_id": 2,
"msg": "{\"action\": \"list\", \"price\": \"100\", \"token\": \"nDAI\" }"
}' --accountId nft
```
3. `market` now knows that it can sell Alice's token for 100 [nDAI](https://explorer.mainnet.near.org/accounts/6b175474e89094c44da98b954eedeac495271d0f.factory.bridge.near), and that when it transfers it to a buyer using `nft_transfer`, it can pass along the given `approval_id` to ensure that Alice hasn't changed her mind. It can schedule any further cross-contract calls it wants, and if it returns these promises correctly, Alice's initial near-cli call will resolve with the outcome from the final step in the chain. If Alice actually made this call from a Market frontend, the frontend can use this return value for something useful.
### 3. Approval with cross-contract call, edge case
Alice approves Bazaar and passes `msg` again. Maybe she actually does this via near-cli, rather than using Bazaar's frontend, because what's this? Bazaar doesn't implement `nft_on_approve`, so Alice sees an error in the transaction result.
Not to worry, though, she checks `nft_is_approved` and sees that she did successfully approve Bazaar, despite the error. She will have to find a new way to list her token for sale in Bazaar, rather than using the same `msg` shortcut that worked for Market.
**High-level explanation**
1. Alice calls `nft_approve` to approve `bazaar` to transfer her token, and passes a `msg`.
2. Since `msg` is included, `nft` will schedule a cross-contract call to `bazaar`.
3. Bazaar doesn't implement `nft_on_approve`, so this call results in an error. The approval still worked, but Alice sees an error in her near-cli output.
4. Alice checks if `bazaar` is approved, and sees that it is, despite the error.
**Technical calls**
1. Using near-cli:
```bash
near call nft nft_approve '{
"token_id": "1",
"account_id": "bazaar",
"msg": "{\"action\": \"list\", \"price\": \"100\", \"token\": \"nDAI\" }"
}' --accountId alice --depositYocto 1
```
2. `nft` schedules a call to `nft_on_approve` on `bazaar`. Using near-cli notation for easy cross-reference with the above, this would look like:
```bash
near call bazaar nft_on_approve '{
"token_id": "1",
"owner_id": "alice",
"approval_id": 3,
"msg": "{\"action\": \"list\", \"price\": \"100\", \"token\": \"nDAI\" }"
}' --accountId nft
```
3. 💥 `bazaar` doesn't implement this method, so the call results in an error. Alice sees this error in the output from near-cli.
4. Alice checks if the approval itself worked, despite the error on the cross-contract call:
```bash
near view nft nft_is_approved \
'{ "token_id": "1", "approved_account_id": "bazaar" }'
```
The response:
true
### 4. Approval IDs
Bob buys Alice's token via Market. Bob probably does this via Market's frontend, which will probably initiate the transfer via a call to `ft_transfer_call` on the nDAI contract to transfer 100 nDAI to `market`. Like the NFT standard's "transfer and call" function, [Fungible Token](../FungibleToken/Core.md)'s `ft_transfer_call` takes a `msg` which `market` can use to pass along information it will need to pay Alice and actually transfer the NFT. The actual transfer of the NFT is the only part we care about here.
**High-level explanation**
1. Bob signs some transaction which results in the `market` contract calling `nft_transfer` on the `nft` contract, as described above. To be trustworthy and pass security audits, `market` needs to pass along `approval_id` so that it knows it has up-to-date information.
**Technical calls**
Using near-cli notation for consistency:
```bash
near call nft nft_transfer '{
"receiver_id": "bob",
"token_id": "1",
"approval_id": 2,
}' --accountId market --depositYocto 1
```
### 5. Approval IDs, edge case
Bob transfers same token back to Alice, Alice re-approves Market & Bazaar, listing her token at a higher price than before. Bazaar is somehow unaware of these changes, and still stores `approval_id: 3` internally along with Alice's old price. Bob tries to buy from Bazaar at the old price. Like the previous example, this probably starts with a call to a different contract, which eventually results in a call to `nft_transfer` on `bazaar`. Let's consider a possible scenario from that point.
**High-level explanation**
Bob signs some transaction which results in the `bazaar` contract calling `nft_transfer` on the `nft` contract, as described above. To be trustworthy and pass security audits, `bazaar` needs to pass along `approval_id` so that it knows it has up-to-date information. It does not have up-to-date information, so the call fails. If the initial `nft_transfer` call is part of a call chain originating from a call to `ft_transfer_call` on a fungible token, Bob's payment will be refunded and no assets will change hands.
**Technical calls**
Using near-cli notation for consistency:
```bash
near call nft nft_transfer '{
"receiver_id": "bob",
"token_id": "1",
"approval_id": 3,
}' --accountId bazaar --depositYocto 1
```
### 6. Revoke one
Alice revokes Market's approval for this token.
**Technical calls**
Using near-cli:
```bash
near call nft nft_revoke '{
"account_id": "market",
"token_id": "1",
}' --accountId alice --depositYocto 1
```
Note that `market` will not get a cross-contract call in this case. The implementors of the Market app should implement [cron](https://en.wikipedia.org/wiki/Cron)-type functionality to intermittently check that Market still has the access they expect.
### 7. Revoke all
Alice revokes all approval for this token.
**Technical calls**
Using near-cli:
```bash
near call nft nft_revoke_all '{
"token_id": "1",
}' --accountId alice --depositYocto 1
```
Again, note that no previous approvers will get cross-contract calls in this case.
## Reference-level explanation
The `Token` structure returned by `nft_token` must include an `approved_account_ids` field, which is a map of account IDs to approval IDs. Using TypeScript's [Record type](https://www.typescriptlang.org/docs/handbook/utility-types.html#recordkeystype) notation:
```diff
type Token = {
token_id: string,
owner_id: string,
+ approved_account_ids: Record<string, number>,
}
```
Example token data:
```json
{
"token_id": "1",
"owner_id": "alice.near",
"approved_account_ids": {
"bob.near": 1,
"carol.near": 2,
}
}
```
In Rust, the standard library `HashMap` is the recommended type for the `approved_account_ids` field, though any map may be used.
```diff
pub struct Token {
pub token_id: String,
pub owner_id: String,
+ pub approved_account_ids: std::collections::HashMap<String, u64>,
}
```
### What is an "approval ID"?
This is a unique number given to each approval that allows well-intentioned marketplaces or other 3rd-party NFT resellers to avoid a race condition. The race condition occurs when:
1. A token is listed in two marketplaces, which are both saved to the token as approved accounts.
2. One marketplace sells the token, which clears the approved accounts.
3. The new owner sells back to the original owner.
4. The original owner approves the token for the second marketplace again to list at a new price. But for some reason the second marketplace still lists the token at the previous price and is unaware of the transfers happening.
5. The second marketplace, operating from old information, attempts to again sell the token at the old price.
Note that while this describes an honest mistake, the possibility of such a bug can also be taken advantage of by malicious parties via [front-running](https://users.encs.concordia.ca/~clark/papers/2019_wtsc_front.pdf).
To avoid this possibility, the NFT contract generates a unique approval ID each time it approves an account. Then when calling `nft_transfer` or `nft_transfer_call`, the approved account passes `approval_id` with this value to make sure the underlying state of the token hasn't changed from what the approved account expects.
Keeping with the example above, say the initial approval of the second marketplace generated the following `approved_account_ids` data:
```json
{
"token_id": "1",
"owner_id": "alice.near",
"approved_account_ids": {
"marketplace_1.near": 1,
"marketplace_2.near": 2,
}
}
```
But after the transfers and re-approval described above, the token might have `approved_account_ids` as:
```json
{
"token_id": "1",
"owner_id": "alice.near",
"approved_account_ids": {
"marketplace_2.near": 3,
}
}
```
The marketplace then tries to call `nft_transfer`, passing outdated information:
```bash
# oops!
near call nft-contract.near nft_transfer '{ "approval_id": 2 }'
```
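To make the failure mode concrete, here is a minimal TypeScript sketch of the check a contract might perform during transfer. The `Token` shape follows the structure above; the function name and error messages are illustrative, not part of the standard.
```ts
type Token = {
  token_id: string,
  owner_id: string,
  approved_account_ids: Record<string, number>,
};

// Illustrative check performed by the NFT contract when `sender_id` calls
// `nft_transfer` (or `nft_transfer_call`) with an optional `approval_id`.
function assertCanTransfer(token: Token, sender_id: string, approval_id: number | null): void {
  if (sender_id === token.owner_id) return; // the owner never needs an approval
  if (!(sender_id in token.approved_account_ids)) {
    throw new Error(`${sender_id} is not approved for token ${token.token_id}`);
  }
  const stored_id = token.approved_account_ids[sender_id];
  // If the caller passed an approval ID, it must match the stored one;
  // otherwise the caller is acting on stale information.
  if (approval_id !== null && approval_id !== stored_id) {
    throw new Error(`approval_id ${approval_id} is stale; the current value is ${stored_id}`);
  }
}

// The failing call above: marketplace_2.near still holds approval ID 2,
// but the token now stores 3, so the transfer is rejected.
assertCanTransfer(
  { token_id: "1", owner_id: "alice.near", approved_account_ids: { "marketplace_2.near": 3 } },
  "marketplace_2.near",
  2,
); // throws
```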
### Interface
The NFT contract must implement the following methods:
```ts
/******************/
/* CHANGE METHODS */
/******************/
// Add an approved account for a specific token.
//
// Requirements
// * Caller of the method must attach a deposit of at least 1 yoctoⓃ for
// security purposes
// * Contract MAY require caller to attach larger deposit, to cover cost of
// storing approver data
// * Contract MUST panic if called by someone other than token owner
// * Contract MUST panic if addition would cause `nft_revoke_all` to exceed
// single-block gas limit. See below for more info.
// * Contract MUST increment approval ID even if re-approving an account
// * If successfully approved or if the account had already been approved, and if `msg` is
// present, contract MUST call `nft_on_approve` on `account_id`. See
// `nft_on_approve` description below for details.
//
// Arguments:
// * `token_id`: the token for which to add an approval
// * `account_id`: the account to add to `approved_account_ids`
// * `msg`: optional string to be passed to `nft_on_approve`
//
// Returns void, if no `msg` given. Otherwise, returns promise call to
// `nft_on_approve`, which can resolve with whatever it wants.
function nft_approve(
token_id: TokenId,
account_id: string,
msg: string|null,
): void|Promise<any> {}
// Revoke an approved account for a specific token.
//
// Requirements
// * Caller of the method must attach a deposit of 1 yoctoⓃ for security
// purposes
// * If contract requires >1yN deposit on `nft_approve`, contract
// MUST refund associated storage deposit when owner revokes approval
// * Contract MUST panic if called by someone other than token owner
//
// Arguments:
// * `token_id`: the token for which to revoke an approval
// * `account_id`: the account to remove from `approved_account_ids`
function nft_revoke(
token_id: string,
account_id: string
) {}
// Revoke all approved accounts for a specific token.
//
// Requirements
// * Caller of the method must attach a deposit of 1 yoctoⓃ for security
// purposes
// * If contract requires >1yN deposit on `nft_approve`, contract
// MUST refund all associated storage deposit when owner revokes approved_account_ids
// * Contract MUST panic if called by someone other than token owner
//
// Arguments:
// * `token_id`: the token with approved_account_ids to revoke
function nft_revoke_all(token_id: string) {}
/****************/
/* VIEW METHODS */
/****************/
// Check if a token is approved for transfer by a given account, optionally
// checking an approval_id
//
// Arguments:
// * `token_id`: the token for which to check an approval
// * `approved_account_id`: the account to check the existence of in `approved_account_ids`
// * `approval_id`: an optional approval ID to check against current approval ID for given account
//
// Returns:
// if `approval_id` given, `true` if `approved_account_id` is approved with given `approval_id`
// otherwise, `true` if `approved_account_id` is in list of approved accounts
function nft_is_approved(
token_id: string,
approved_account_id: string,
approval_id: number|null
): boolean {}
```
### Why must `nft_approve` panic if `nft_revoke_all` would fail later?
In the description of `nft_approve` above, it states:
Contract MUST panic if addition would cause `nft_revoke_all` to exceed
single-block gas limit.
What does this mean?
First, it's useful to understand what we mean by "single-block gas limit". This refers to the [hard cap on gas per block at the protocol layer](https://docs.near.org/docs/concepts/gas#thinking-in-gas). This number will increase over time.
Removing data from a contract uses gas, so if an NFT had a large enough number of approved_account_ids, `nft_revoke_all` would fail, because calling it would exceed the maximum gas.
Contracts must prevent this by capping the number of approved_account_ids for a given token. However, it is up to contract authors to determine a sensible cap for their contract, given the single-block gas limit at the time they deploy. Since contract implementations can vary, some implementations will be able to support a larger number of approved_account_ids than others, even with the same maximum gas per block.
Contract authors may choose to set a cap of something small and safe like 10 approved_account_ids, or they could dynamically calculate whether a new approval would break future calls to `nft_revoke_all`. But every contract MUST ensure that they never break the functionality of `nft_revoke_all`.
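As a rough illustration of the first option, the sketch below enforces a fixed cap inside the approval path. The `MAX_APPROVALS` constant, the `next_approval_id` counter, and the helper shape are assumptions of one possible implementation, not requirements of the standard.
```ts
// Hypothetical cap chosen by the contract author; the standard does not fix a value.
const MAX_APPROVALS = 10;

type Token = {
  owner_id: string,
  approved_account_ids: Record<string, number>,
  next_approval_id: number, // implementation detail; not part of the Token returned by nft_token
};

function approve(token: Token, caller: string, account_id: string): number {
  if (caller !== token.owner_id) throw new Error("only the owner may approve");

  const alreadyApproved = account_id in token.approved_account_ids;
  // Refuse approvals that could make a later `nft_revoke_all` exceed the
  // single-block gas limit (approximated here by a fixed count).
  if (!alreadyApproved && Object.keys(token.approved_account_ids).length >= MAX_APPROVALS) {
    throw new Error("too many approved accounts; revoke one first");
  }

  // The approval ID is incremented even when re-approving the same account.
  const approval_id = token.next_approval_id;
  token.approved_account_ids[account_id] = approval_id;
  token.next_approval_id += 1;
  return approval_id;
}
```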
### Approved Account Contract Interface
If a contract that gets approved to transfer NFTs wants to, it can implement `nft_on_approve` to update its own state when granted approval for a token:
```ts
// Respond to notification that contract has been granted approval for a token.
//
// Notes
// * Contract knows the token contract ID from `predecessor_account_id`
//
// Arguments:
// * `token_id`: the token to which this contract has been granted approval
// * `owner_id`: the owner of the token
// * `approval_id`: the approval ID stored by NFT contract for this approval.
// Expected to be a number within the 2^53 limit representable by JSON.
// * `msg`: specifies information needed by the approved contract in order to
// handle the approval. Can indicate both a function to call and the
// parameters to pass to that function.
function nft_on_approve(
token_id: TokenId,
owner_id: string,
approval_id: number,
msg: string,
) {}
```
Note that the NFT contract will fire-and-forget this call, ignoring any return values or errors generated. This means that even if the approved account does not have a contract or does not implement `nft_on_approve`, the approval will still work correctly from the point of view of the NFT contract.
Further note that there is no parallel `nft_on_revoke` when revoking either a single approval or when revoking all. This is partially because scheduling many `nft_on_revoke` calls when revoking all approved_account_ids could incur prohibitive [gas fees](https://docs.near.org/docs/concepts/gas). Apps and contracts which cache NFT approved_account_ids can therefore not rely on having up-to-date information, and should periodically refresh their caches. Since this will be the necessary reality for dealing with `nft_revoke_all`, there is no reason to complicate `nft_revoke` with an `nft_on_revoke` call.
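For example, a marketplace that caches approvals might periodically re-validate its listings against `nft_is_approved`. A minimal sketch, assuming a hypothetical `view` helper that performs the view call (for instance through an RPC client); the listing shape is illustrative.
```ts
// Hypothetical helper: performs a view call on `contractId` and returns the parsed result.
type ViewFunction = (contractId: string, methodName: string, args: Record<string, unknown>) => Promise<unknown>;

type CachedListing = { nft_contract_id: string; token_id: string; approval_id: number };

// Keep only the listings whose approval still exists and whose approval ID is current.
async function refreshListings(
  listings: CachedListing[],
  marketplaceId: string,
  view: ViewFunction,
): Promise<CachedListing[]> {
  const stillValid: CachedListing[] = [];
  for (const listing of listings) {
    const ok = await view(listing.nft_contract_id, "nft_is_approved", {
      token_id: listing.token_id,
      approved_account_id: marketplaceId,
      approval_id: listing.approval_id,
    });
    if (ok === true) stillValid.push(listing);
  }
  return stillValid;
}
```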
### No incurred cost for core NFT behavior
NFT contracts should be implemented in a way to avoid extra gas fees for serialization & deserialization of `approved_account_ids` for calls to `nft_*` methods other than `nft_token`. See `near-contract-standards` [implementation of `ft_metadata` using `LazyOption`](https://github.com/near/near-sdk-rs/blob/c2771af7fdfe01a4e8414046752ee16fb0d29d39/examples/fungible-token/ft/src/lib.rs#L71) as a reference example.
## Errata
* **2022-02-03**: updated `Token` struct field names. `id` was changed to `token_id` and `approvals` was changed to `approved_account_ids`. This is to be consistent with current implementations of the standard and the rust SDK docs.
|
Case Study: NEAR Foundation’s Laura Cunningham on the NEAR Horizon Startup Accelerator
CASE STUDIES
July 26, 2023
NEAR is home to a number of amazing apps and projects. These developers, founders, and entrepreneurs are using NEAR to effortlessly create and distribute innovative decentralized apps, while helping build a more open web — free from centralized platforms. In these Case Study videos, NEAR Foundation showcases some of these projects.
In the latest NEAR Foundation Case Study video, viewers hear from Laura Cunningham, Team Lead at NEAR Horizon and General Manager at NEAR Foundation. NEAR Horizon is a Web3 accelerator program revolutionizing how founders and builders receive support in the open web. Laura takes viewers on a deep dive into Horizon’s startup accelerator marketplace, which the Horizon team built on the Blockchain Operating System (BOS).
“The BOS was our platform of choice for building NEAR Horizon because it makes Horizon much more accessible to founders and projects that are already in the ecosystem. And then ones coming in new, you can be building on another chain as long as you have some component of your stack on NEAR then you can apply and be part of NEAR Horizon,” she adds. “It is also incredibly composable, so we were able to fork different attributes that were already available on BOS and make use of them in our front end, so it made it a lot quicker to actually develop.”
---
id: explorer
title: Explorer
sidebar_label: Explorers
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
An Explorer allows you to quickly obtain information from the blockchain through a friendly user interface.
These useful tools enable you, for example, to:
1. Search for a transaction and its receipts.
2. Check all the interactions in a smart contract.
3. See the balance of an account.
4. View block creations in real time.
---
## NEAR Explorer
Created by NEAR, [NEAR Explorer](https://nearblocks.io) lets you check information for transactions and accounts through a user-friendly interface.
![NEAR Explorer](/docs/assets/explorers/near-explorer.png)
*Main page of [NEAR Explorer](https://nearblocks.io)*
<hr className="subsection"/>
## Stats Gallery
Created by the community, [Stats Gallery](https://stats.gallery) gamifies the experience of looking at an account, adding levels and badges based on the account's activity.
One of its best features is that it allows you to see the contract's methods.
![Stats Gallery](/docs/assets/explorers/stats-gallery.png)
*Account view in [Stats Gallery](https://stats.gallery)*
<hr className="subsection"/>
## NearBlocks
Created by the community, [NearBlocks](https://nearblocks.io/) lets you check accounts and transactions with an interface similar to Etherscan.
![NearBlocks](/docs/assets/explorers/nearblocks.png)
*Main page of [NearBlocks](https://nearblocks.io/)*
<hr className="subsection"/>
## Nearscope
[Nearscope](https://nearscope.net/) provides a NEAR node validator and delegator explorer.
![Nearscope](/docs/assets/explorers/nearscope.png)
*Main page of [Nearscope](https://nearscope.net/)*
<hr className="subsection"/>
## DappLooker
[DappLooker](https://dapplooker.com/) lets you analyze and query NEAR blockchain data, build dashboards to visualize data and share with your community.
![DappLooker](/docs/assets/explorers/dapplooker.png)
*Main page of [DappLooker](https://dapplooker.com/)*
<hr className="subsection"/>
## Pikespeak
[Pikespeak](https://pikespeak.ai/) provides access to real-time and historical data on the NEAR Protocol.
|
- Proposal Name: Batched Transactions
- Start Date: 2019-07-22
- NEP PR: [nearprotocol/neps#0008](https://github.com/nearprotocol/neps/pull/8)
# Summary
[summary]: #summary
Refactor signed transactions and receipts to support batched atomic transactions and data dependency.
# Motivation
[motivation]: #motivation
It simplifies account creation by supporting the batching of multiple transactions together instead of creating more complicated transaction types.
For example, we want to create a new account with some account balance and one or many access keys, deploy a contract code on it and run an initialization method to restrict access keys permissions for a `proxy` function.
To be able to do this now, we need to have a `CreateAccount` transaction with all the parameters of a new account.
Then we need to handle it in one operation in a runtime code, which might have duplicated code for executing some WASM code with the rollback conditions.
Alternative to this is to execute multiple simple transactions in a batch within the same block.
It has to be done in a row without any commits to the state until the entire batch is completed.
We propose to support this type of transaction batching to simplify the runtime.
Currently callbacks are handled differently from async calls, this NEP simplifies data dependencies and callbacks by unifying them.
# Guide-level explanation
[guide-level-explanation]: #guide-level-explanation
### New transaction and receipts
Previously, to produce a block, the runtime first executed new signed transactions and then executed received receipts. This resulted in duplicated code that might otherwise be shared across similar actions, e.g. function calls for async calls, callbacks and self-calls.
It also increased the complexity of the runtime implementation.
This NEP proposes changing this by first converting all signed transactions into receipts and then either executing them immediately, before received receipts, or putting them into the list of new receipts to be routed.
To achieve this, the NEP introduces a new message `Action` that represents one of the atomic actions, e.g. a function call.
`TransactionBody` is now called just `Transaction`. It contains the list of actions that need to be performed in a single batch and the information shared across these actions.
`Transaction` contains the following fields:
- `signer_id` is an account ID of the transaction signer.
- `public_key` is a public key used to identify the access key and to sign the transaction.
- `nonce` is used to deduplicate and order transactions (per access key).
- `receiver_id` is the account ID of the destination of this transaction. It's where the generated receipt will be routed for execution.
- `action` is the list of actions to perform.
An `Action` can be one of the following (a type sketch follows this list):
- `CreateAccount` creates a new account with the `receiver_id` account ID. The action fails if the account already exists. `CreateAccount` also grants permission for all subsequent batched actions on the newly created account. For example, permission to deploy code on the new account. Permission details are described in the reference section below.
- `DeployContract` deploys given binary wasm code on the account. Either the `receiver_id` equals to the `signer_id`, or the batch of actions has started with `CreateAccount`, which granted that permission.
- `FunctionCall` executes a function call on the last deployed contract. The action fails if the account or the code doesn't exist. E.g. if the previous action was `DeployContract`, then the code to execute will be the newly deployed contract. `FunctionCall` has `method_name` and `args` to identify the method to call and its arguments. It also has `gas` and `deposit`. `gas` is a prepaid amount of gas for this call (the price of gas is determined when a signed transaction is converted to a receipt). `deposit` is the attached deposit balance of NEAR tokens that the contract can spend, e.g. 10 tokens to pay for a crypto-corgi.
- `Transfer` transfers the given `deposit` balance of tokens from the predecessor to the receiver.
- `Stake` stakes the new total `stake` balance with the given `public_key`. The difference in stake is taken from the account's balance (if the new stake is greater than the current one) at the moment when this action is executed, so it's not prepaid. There is no particular reason to stake on behalf of a newly created account, so we may disallow it.
- `DeleteKey` deletes an old `AccessKey` identified by the given `public_key` from the account. Fails if the access key with the given public key doesn't exist. All subsequent batched actions will continue to execute, even if the public key that authorized the transaction was removed.
- `AddKey` adds a new given `AccessKey` identified by a new given `public_key` to the account. Fails if an access key with the given public key already exists. We removed `SwapKeyTransaction`, because it can be replaced with 2 batched actions - delete an old key and add a new key.
- `DeleteAccount` deletes `receiver_id` account if the account doesn't have enough balance to pay the rent, or the `receiver_id` is the `predecessor_id`. Sends the remaining balance to the `beneficiary_id` account.
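The variants above map naturally onto a tagged union. A non-normative TypeScript sketch of those shapes (field names follow the protobuf definitions later in this NEP; the `Balance` and `AccessKey` aliases are placeholders):
```ts
type Balance = string;        // u128 amounts serialized as decimal strings
type PublicKey = string;      // placeholder for the PublicKey message
type AccessKey = unknown;     // defined elsewhere in the protocol

type Action =
  | { create_account: {} }
  | { deploy_contract: { code: Uint8Array } }
  | { function_call: { method_name: string; args: Uint8Array; gas: number; deposit: Balance } }
  | { transfer: { deposit: Balance } }
  | { stake: { stake: Balance; public_key: PublicKey } }
  | { add_key: { public_key: PublicKey; access_key: AccessKey } }
  | { delete_key: { public_key: PublicKey } }
  | { delete_account: { beneficiary_id: string } };

type Transaction = {
  signer_id: string;
  public_key: PublicKey;
  nonce: number;
  receiver_id: string;
  actions: Action[];
};
```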
The new `Receipt` contains the shared information and either one of the receipt actions or a list of actions:
- `predecessor_id` the account ID of the immediate previous sender (predecessor) of this receipt. It can be different from the `signer_id` in some cases, e.g. for promises.
- `receiver_id` the account ID of the current account, on which we need to perform action(s).
- `receipt_id` is a unique ID of this receipt (previously was called `nonce`). It's generated from either the signed transaction or the parent receipt.
- `receipt` can be one of 2 types:
- `ActionReceipt` is used to perform some actions on the receiver.
- `DataReceipt` is used when some data needs to be passed from the predecessor to the receiver, e.g. an execution result.
To support promises and callbacks we introduce a concept of cross-shard data sharing with dependencies. Each `ActionReceipt` may have a list of input `data_id`. The execution will not start until all required inputs are received. Once the execution completes and if there is `output_data_id`, it produces a `DataReceipt` that will be routed to the `output_receiver_id`.
`ActionReceipt` contains the following fields:
- `signer_id` the account ID of the signer, who signed the transaction.
- `signer_public_key` the public key that the signer used to sign the original signed transaction.
- `output_data_id` is the data ID to create DataReceipt. If it's absent, then the `DataReceipt` is not created.
- `output_receiver_id` is the account ID of the data receiver. It's needed to route `DataReceipt`. It's absent if the DataReceipt is not needed.
- `input_data_id` is the list of data IDs that are required for the execution of the `ActionReceipt`. If some of the data IDs are not available when the receipt is received, then the `ActionReceipt` is postponed until all data is available. Once the last `DataReceipt` for the required input data arrives, the action receipt execution is triggered.
- `action` is the list of actions to execute. The execution doesn't need to validate permissions of the actions, but it needs to fail in some cases, e.g. when the receiver's account doesn't exist and the action acts on the account, or when the action is a function call and the code is not present.
`DataReceipt` contains the following fields:
- `data_id` is the data ID to be used as an input.
- `success` is true if the `ActionReceipt` that generated this `DataReceipt` finished the execution without any failures.
- `data` is the binary data that is returned from the last action of the `ActionReceipt`. Right now, it's empty for all actions except for function calls. For function calls the data is the result of the code execution. But in the future we might introduce non-contract state reads.
Data should be stored at the same shard as the receiver's account, even if the receiver's account doesn't exist.
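Putting the receipt fields together, here is a non-normative sketch of the two receipt kinds (`Action` is the union sketched earlier; optional fields are modeled as nullable):
```ts
type Action = unknown;  // the tagged union sketched above

type ActionReceipt = {
  signer_id: string;
  signer_public_key: string;
  gas_price: string;               // fixed when the signed transaction was converted
  output_data_id: string | null;   // where to publish the result, if anywhere
  output_receiver_id: string | null;
  input_data_id: string[];         // execution waits until all of these arrive
  actions: Action[];
};

type DataReceipt = {
  data_id: string;
  success: boolean;
  data: Uint8Array | null;         // return value of the last action, if any
};

type Receipt = {
  predecessor_id: string;
  receiver_id: string;
  receipt_id: string;
  receipt: { action: ActionReceipt } | { data: DataReceipt };
};
```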
### Refunds
In case an `ActionReceipt` execution fails the runtime can generate a refund.
We've removed `refund_account_id` from receipts, because the account IDs for refunds can be determined from the `signer_id` and `predecessor_id` in the `ActionReceipt`.
All unused gas and action fees (also measured in gas) are always refunded back to the `signer_id`, because fees are always prepaid by the signer. The gas is converted into tokens using the `gas_price`.
The deposit balances from `FunctionCall` and `Transfer` are refunded back to the `predecessor_id`, because they were deducted from predecessor's account balance.
It's also important to note that the account ID of the predecessor for refund receipts is `system`.
This is done to prevent refund loops, e.g. when the account to receive the refund was deleted before the refund arrives. In this case the refund is burned.
If the function call action with the attached `deposit` fails in the middle of the execution, then 2 refund receipts can be generated, one for the unused gas and one for the deposits.
The runtime should combine them into one receipt if the `signer_id` and `predecessor_id` are the same.
Example of a receipt for a refund of `42000` atto-tokens to `vasya.near`:
```json
{
"predecessor_id": "system",
"receiver_id": "vasya.near",
"receipt_id": ...,
"action": {
"signer_id": "vasya.near",
"signer_public_key": ...,
"gas_price": "3",
"output_data_id": null,
"output_receiver_id": null,
"input_data_id": [],
"action": [
{
"transfer": {
"deposit": "42000"
}
}
]
}
}
```
### Examples
#### Account Creation
To create a new account we can create a new `Transaction`:
```json
{
"signer_id": "vasya.near",
"public_key": ...,
"nonce": 42,
"receiver_id": "vitalik.vasya.near",
"action": [
{
"create_account": {
}
},
{
"transfer": {
"deposit": "19231293123"
}
},
{
"deploy_contract": {
"code": ...
}
},
{
"add_key": {
"public_key": ...,
"access_key": ...
}
},
{
"function_call": {
"method_name": "init",
"args": ...,
"gas": 20000,
"deposit": "0"
}
}
]
}
```
This transaction is sent from `vasya.near` signed with a `public_key`.
The receiver is `vitalik.vasya.near`, which is a new account id.
The transaction contains a batch of actions.
First we create the account, then we transfer a few tokens to the newly created account, then we deploy code on the new account, add a new access key with some given public key, and as a final action initialize the deployed code by calling a method `init` with some arguments.
For this transaction to work, `vasya.near` needs to have enough balance on its account to cover gas and deposits for all actions at once.
Every action has an associated action gas fee, while `transfer` and `function_call` actions also need additional balance for deposits and gas (for executions and promises).
Once we have validated the transaction and subtracted the total amount from `vasya.near`'s account, it is transformed into a `Receipt`:
```json
{
"predecessor_id": "vasya.near",
"receiver_id": "vitalik.vasya.near",
"receipt_id": ...,
"action": {
"signer_id": "vasya.near",
"signer_public_key": ...,
"gas_price": "3",
"output_data_id": null,
"output_receiver_id": null,
"input_data_id": [],
"action": [...]
}
}
```
In this example the gas price at the moment when the transaction was processed was 3 per gas.
This receipt will be sent to `vitalik.vasya.near`'s shard to be executed.
In case the `vitalik.vasya.near` account already exists, the execution will fail and some amount of the prepaid fees will be refunded back to `vasya.near`.
If the account creation receipt succeeds, it wouldn't create a `DataReceipt`, because `output_data_id` is `null`.
But it will generate a refund receipt for the unused portion of prepaid function call `gas`.
#### Deploy code example
Deploying code with initialization is pretty similar to creating an account, except you can't deploy code on someone else's account. So the transaction's `receiver_id` has to be the same as the `signer_id`.
#### Simple promise with callback
Let's say the transaction contained a single action which is a function call to `a.contract.near`.
It created a new promise towards `b.contract.near` and added a callback to itself.
Once the execution completes it will result in the following new receipts:
The receipt for the new promise towards `b.contract.near`
```json
{
"predecessor_id": "a.contract.near",
"receiver_id": "b.contract.near",
"receipt_id": ...,
"action": {
"signer_id": "vasya.near",
"signer_public_key": ...,
"gas_price": "3",
"output_data_id": "data_123_1",
"output_receiver_id": "a.contract.near",
"input_data_id": [],
"action": [
{
"function_call": {
"method_name": "sum",
"args": ...,
"gas": 10000,
"deposit": "0"
}
}
]
}
}
```
Interesting details:
- `signer_id` is still `vasya.near`, because it's the account that initialized the transaction, even though it's not the creator of the promise.
- `output_data_id` contains some unique data ID. In this example we used `data_123_1`.
- `output_receiver_id` indicates where to route the result of the execution.
The other receipt is for the callback which will stay in the same shard.
```json
{
"predecessor_id": "a.contract.near",
"receiver_id": "a.contract.near",
"receipt_id": ...,
"action": {
"signer_id": "vasya.near",
"signer_public_key": ...,
"gas_price": "3",
"output_data_id": null,
"output_receiver_id": null,
"input_data_id": ["data_123_1"],
"action": [
{
"function_call": {
"method_name": "process_sum",
"args": ...,
"gas": 10000,
"deposit": "0"
}
}
]
}
}
```
It looks very similar to the new promise, but instead of `output_data_id` it has an `input_data_id`.
This action receipt will be postponed until the other receipt is routed and executed, and a data receipt is generated.
Once the new promise receipt is successfully executed, it will generate the following receipt:
```json
{
"predecessor_id": "b.contract.near",
"receiver_id": "a.contract.near",
"receipt_id": ...,
"data": {
"data_id": "data_123_1",
"success": true,
"data": ...
}
}
```
It contains the data ID `data_123_1` and is routed to `a.contract.near`.
If the callback receipt was already received and postponed, this data receipt will trigger execution of the callback receipt, because all the input data is now available.
#### Remote callback with 2 joined promises, with a callback on itself
Let's say `a.contract.near` wants to call `b.contract.near` and `c.contract.near`, and send the result to `d.contract.near` for joining before processing the result on itself.
It will generate 2 receipts for new promises, 1 receipt for the remote callback and 1 receipt for the callback on itself.
Part of the receipt (#1) for the promise towards `b.contract.near`:
```
...
"output_data_id": "data_123_b",
"output_receiver_id": "d.contract.near",
"input_data_id": [],
...
```
Part of the receipt (#2) for the promise towards `c.contract.near`:
```
...
"output_data_id": "data_321_c",
"output_receiver_id": "d.contract.near",
"input_data_id": [],
...
```
The receipt (#3) for the remote callback that has to be executed on `d.contract.near` with data from `b.contract.near` and `c.contract.near`:
```json
{
"predecessor_id": "a.contract.near",
"receiver_id": "d.contract.near",
"receipt_id": ...,
"action": {
"signer_id": "vasya.near",
"signer_public_key": ...,
"gas_price": "3",
"output_data_id": "bla_543",
"output_receiver_id": "a.contract.near",
"input_data_id": ["data_123_b", "data_321_c"],
"action": [
{
"function_call": {
"method_name": "join_data",
"args": ...,
"gas": 10000,
"deposit": "0"
}
}
]
}
}
```
It also has the `output_data_id` and `output_receiver_id` that point back towards `a.contract.near`.
And finally the part of the receipt (#4) for the local callback on `a.contract.near`:
```
...
"output_data_id": null,
"output_receiver_id": null,
"input_data_id": ["bla_543"],
...
```
For all of this to execute, the first 3 receipts need to go to the corresponding shards and be processed.
If for some reason the data arrived before the corresponding action receipt, then this data will be held there until the action receipt arrives.
An example of this is if receipt #3 is delayed for some reason, while receipt #2 was processed and generated a data receipt towards `d.contract.near` which arrived before #3.
Also, if any of the function calls fails, the receipt is still going to generate a new `DataReceipt` because it has `output_data_id` and `output_receiver_id`. Here is an example of a DataReceipt for a failed execution:
```json
{
"predecessor_id": "b.contract.near",
"receiver_id": "d.contract.near",
"receipt_id": ...,
"data": {
"data_id": "data_123_b",
"success": false,
"data": null
}
}
```
#### Swap Key example
Since there is no swap key action, we can just batch 2 actions together: one for adding a new key and one for deleting the old key. The actual order is not important if the public keys are different, but if the public key is the same then you need to first delete the old key and only then add the new key.
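For illustration, here is the key swap expressed as a two-action batch; a small self-contained sketch in which the action shapes mirror the ones described above.
```ts
type SwapAction =
  | { delete_key: { public_key: string } }
  | { add_key: { public_key: string; access_key: unknown } };

// Replace `oldKey` with `newKey` in a single atomic batch.
// Deleting first is always safe, and it is required when oldKey === newKey.
function swapKeyActions(oldKey: string, newKey: string, accessKey: unknown): SwapAction[] {
  return [
    { delete_key: { public_key: oldKey } },
    { add_key: { public_key: newKey, access_key: accessKey } },
  ];
}
```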
# Reference-level explanation
[reference-level-explanation]: #reference-level-explanation
### Updated protobufs
**public_key.proto**
```proto
syntax = "proto3";
message PublicKey {
enum KeyType {
ED25519 = 0;
}
KeyType key_type = 1;
bytes data = 2;
}
```
**signed_transaction.proto**
```proto
syntax = "proto3";
import "access_key.proto";
import "public_key.proto";
import "uint128.proto";
message Action {
message CreateAccount {
// empty
}
message DeployContract {
// Binary wasm code
bytes code = 1;
}
message FunctionCall {
string method_name = 1;
bytes args = 2;
uint64 gas = 3;
Uint128 deposit = 4;
}
message Transfer {
Uint128 deposit = 1;
}
message Stake {
// New total stake
Uint128 stake = 1;
PublicKey public_key = 2;
}
message AddKey {
PublicKey public_key = 1;
AccessKey access_key = 2;
}
message DeleteKey {
PublicKey public_key = 1;
}
message DeleteAccount {
// The account ID which would receive the remaining funds.
string beneficiary_id = 1;
}
oneof action {
CreateAccount create_account = 1;
DeployContract deploy_contract = 2;
FunctionCall function_call = 3;
Transfer transfer = 4;
Stake stake = 5;
AddKey add_key = 6;
DeleteKey delete_key = 7;
DeleteAccount delete_account = 8;
}
}
message Transaction {
string signer_id = 1;
PublicKey public_key = 2;
uint64 nonce = 3;
string receiver_id = 4;
repeated Action actions = 5;
}
message SignedTransaction {
bytes signature = 1;
Transaction transaction = 2;
}
```
**receipt.proto**
```proto
syntax = "proto3";
import "public_key.proto";
import "signed_transaction.proto";
import "uint128.proto";
import "wrappers.proto";
message DataReceipt {
bytes data_id = 1;
google.protobuf.BytesValue data = 2;
}
message ActionReceipt {
message DataReceiver {
bytes data_id = 1;
string receiver_id = 2;
}
string signer_id = 1;
PublicKey signer_public_key = 2;
// The price of gas is determined when the original SignedTransaction is
// converted into the Receipt. It's used for refunds.
Uint128 gas_price = 3;
// List of data receivers where to route the output data
// (e.g. result of execution)
repeated DataReceiver output_data_receivers = 4;
// Ordered list of data ID to provide as input results.
repeated bytes input_data_ids = 5;
repeated Action actions = 6;
}
message Receipt {
string predecessor_id = 1;
string receiver_id = 2;
bytes receipt_id = 3;
oneof receipt {
ActionReceipt action = 4;
DataReceipt data = 5;
}
}
```
### Validation and Permissions
To validate `SignedTransaction` we need to do the following:
- verify transaction hash against signature and the given public key
- verify `signer_id` is a valid account ID
- verify `receiver_id` is a valid account ID
- fetch account for the given `signer_id`
- fetch access key for the given `signer_id` and `public_key`
- verify access key `nonce`
- get the current price of gas
- compute the total required balance for the transaction, including action fees (in gas), deposits and prepaid gas (see the sketch after this list).
- verify account balance is larger than required balance.
- verify actions are allowed by the access key permissions, e.g. if the access key only allows function call, then need to verify receiver, method name and allowance.
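The balance check might look roughly like the sketch below; the `ActionCost` shape and the way fees are tabulated are assumptions used only to show how action fees, prepaid gas, and deposits combine.
```ts
type ActionCost = {
  fee_gas: number;      // fixed per-action fee, in gas units
  prepaid_gas: number;  // non-zero only for function calls
  deposit: bigint;      // attached tokens (transfer / function_call), in yocto
};

// Total tokens the signer must hold: all gas converted at the current gas
// price, plus all attached deposits.
function requiredBalance(actions: ActionCost[], gasPrice: bigint): bigint {
  let totalGas = 0;
  let totalDeposit = 0n;
  for (const a of actions) {
    totalGas += a.fee_gas + a.prepaid_gas;
    totalDeposit += a.deposit;
  }
  return BigInt(totalGas) * gasPrice + totalDeposit;
}
```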
Before we convert a `Transaction` to a new `ActionReceipt`, we don't need to validate permissions of the actions or their order. It's checked during `ActionReceipt` execution.
`ActionReceipt` doesn't need to be validated before we start executing it.
The actions in the `ActionReceipt` are executed in given order.
Each action has to check for the validity before execution.
Since `CreateAccount` gives permission to perform actions on the new account as if it were your own account, we introduce a temporary variable `actor_id`.
At the beginning of the execution `actor_id` is set to the value of `predecessor_id`.
Validation rules for actions:
- `CreateAccount`
- check the account `receiver_id` doesn't exist
- `DeployContract`, `Stake`, `AddKey`, `DeleteKey`
- check the account `receiver_id` exists
- check `actor_id` equals to `receiver_id`
- `FunctionCall`, `Transfer`
- check the account `receiver_id` exists
When `CreateAccount` completes, the `actor_id` changes to `receiver_id`.
NOTE: When we implement `DeleteAccount` action, its completion will change `actor_id` back to `predecessor_id`.
Once validated, each action might still do some additional checks, e.g. `FunctionCall` might check that the code exists and `method_name` is valid.
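A rough sketch of how these rules could be applied while executing a batch; the `exists` lookup and error messages are hypothetical stand-ins for the runtime's state access.
```ts
type ActionKind =
  | "CreateAccount" | "DeployContract" | "Stake" | "AddKey" | "DeleteKey"
  | "FunctionCall" | "Transfer";

function validateAction(
  kind: ActionKind,
  receiverId: string,
  actorId: string,
  exists: (accountId: string) => boolean,  // hypothetical state lookup
): void {
  switch (kind) {
    case "CreateAccount":
      if (exists(receiverId)) throw new Error("account already exists");
      break;
    case "DeployContract":
    case "Stake":
    case "AddKey":
    case "DeleteKey":
      if (!exists(receiverId)) throw new Error("account does not exist");
      if (actorId !== receiverId) throw new Error("actor may not modify this account");
      break;
    case "FunctionCall":
    case "Transfer":
      if (!exists(receiverId)) throw new Error("account does not exist");
      break;
  }
}

// The actor starts as the predecessor and becomes the receiver once
// `CreateAccount` completes (the `exists` lookup would then reflect the new account).
function executeBatch(
  kinds: ActionKind[],
  receiverId: string,
  predecessorId: string,
  exists: (accountId: string) => boolean,
): void {
  let actorId = predecessorId;
  for (const kind of kinds) {
    validateAction(kind, receiverId, actorId, exists);
    // ... perform the action itself ...
    if (kind === "CreateAccount") actorId = receiverId;
  }
}
```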
### `DataReceipt` generation rules
If `ActionReceipt` doesn't have `output_data_id` and `output_receiver_id`, then `DataReceipt` is not generated.
Otherwise, `DataReceipt` depends on the last action of the `ActionReceipt`. There are 4 different outcomes (sketched in code after this list):
1. Last action is invalid, failed or the execution stopped on some previous action.
- `DataReceipt` is generated
- `data_id` is set to the value of `output_data_id` from the `ActionReceipt`
- `success` is set to `false`
- `data` is set to `null`
2. Last action is valid and finished successfully, but it's not a `FunctionCall`. Or a `FunctionCall`, that returned no value.
- `DataReceipt` is generated
- `data_id` is set to the value of `output_data_id` from the `ActionReceipt`
- `success` is set to `true`
- `data` is set to `null`
3. Last action is `FunctionCall`, and the result of the execution is some value.
- `DataReceipt` is generated
- `data_id` is set to the value of `output_data_id` from the `ActionReceipt`
- `success` is set to `true`
- `data` is set to the bytes of the returned value
4. Last action is `FunctionCall`, and the result of the execution is a promise ID
- `DataReceipt` is NOT generated, because we don't have the value for the execution.
- Instead we should modify the `ActionReceipt` generated for the returned promise ID.
- In this receipt the `output_data_id` should be set to the `output_data_id` of the action receipt that we just finished executing.
- `output_receiver_id` is set the same way as `output_data_id` described above.
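The four outcomes can be summarized in code. A non-normative sketch where `LastActionResult` captures how the final action ended; the names are illustrative.
```ts
type DataReceipt = { data_id: string; success: boolean; data: Uint8Array | null };

type LastActionResult =
  | { kind: "failed" }                        // case 1: invalid, failed, or never reached
  | { kind: "ok"; value: Uint8Array | null }  // cases 2 and 3: finished, possibly with a value
  | { kind: "promise"; promise_id: string };  // case 4: a FunctionCall returned a promise

// Returns the DataReceipt to emit, or null when no receipt is produced
// (no output requested, or case 4 where the returned promise's ActionReceipt
// inherits `output_data_id` / `output_receiver_id` instead).
function makeDataReceipt(outputDataId: string | null, result: LastActionResult): DataReceipt | null {
  if (outputDataId === null) return null;   // nobody is waiting for this output
  switch (result.kind) {
    case "failed":
      return { data_id: outputDataId, success: false, data: null };
    case "ok":
      return { data_id: outputDataId, success: true, data: result.value };
    case "promise":
      return null;
  }
}
```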
#### Example for the case #4
A user called contract `a.app`, which called `b.app` and expects a callback to `a.app`. So `a.app` generated 2 receipts:
Towards `b.app`:
```
...
"receiver_id": "b.app",
...
"output_data_id": "data_a",
"output_receiver_id": "a.app",
"input_data_id": [],
...
```
Towards itself:
```
...
"receiver_id": "a.app",
...
"output_data_id": "null",
"output_receiver_id": "null",
"input_data_id": ["data_a"],
...
```
Now let's say `b.app` doesn't actually do the work, but it's just a middleman that charges some fees before redirecting the work to the actual contract `c.app`.
In this case `b.app` creates a new promise by calling `c.app` and returns it instead of data.
This triggers case #4, so it doesn't generate the data receipt yet; instead it creates an action receipt which would look like this:
```
...
"receiver_id": "c.app",
...
"output_data_id": "data_a",
"output_receiver_id": "a.app",
"input_data_id": [],
...
```
Once it completes, it would send a data receipt to `a.app` (unless `c.app` is a middleman as well).
But let's say `b.app` doesn't want to reveal it's a middleman.
In this case it would call `c.app`, but instead of returning data directly to `a.app`, `b.app` wants to wrap the result into some nice wrapper.
Then instead of returning the promise to `c.app`, `b.app` would attach a callback to itself and return the promise ID of that callback. Here is how it would look:
Towards `c.app`:
```
...
"receiver_id": "c.app",
...
"output_data_id": "data_b",
"output_receiver_id": "b.app",
"input_data_id": [],
...
```
So when the callback receipt is first generated, it looks like this:
```
...
"receiver_id": "b.app",
...
"output_data_id": "null",
"output_receiver_id": "null",
"input_data_id": ["data_b"],
...
```
But once its promise ID is returned with `promise_return`, it is updated to return data towards `a.app`:
```
...
"receiver_id": "b.app",
...
"output_data_id": "data_a",
"output_receiver_id": "a.app",
"input_data_id": ["data_b"],
...
```
### Data storage
We should maintain the following persistent maps per account (`receiver_id`):
- Received data: `data_id -> (success, data)`
- Postponed receipts: `receipt_id -> Receipt`
- Pending input data: `data_id -> receipt_id`
When an `ActionReceipt` is received, the runtime iterates through the list of `input_data_id`.
If an `input_data_id` is not present in the received data map, then a pair `(input_data_id, receipt_id)` is added to the pending input data map and the receipt is marked as postponed.
At the end of the iteration, if the receipt is marked as postponed, it's added to the map of postponed receipts keyed by `receipt_id`.
If all `input_data_id`s are available in the received data, then `ActionReceipt` is executed.
When `DataReceipt` is received, a pair `(data_id, (success, data))` is added to the received data map.
Then the runtime checks if `data_id` is present in the pending input data.
If it's present, then `data_id` is removed from the pending input data and the corresponding `ActionReceipt` is checked again (see above).
NOTE: we can optimize by not storing `data_id` in the received data map when the pending input data is present and it was the final input data item in the receipt.
When `ActionReceipt` is executed, the runtime deletes all `input_data_id` from the received data map.
The `receipt_id` is deleted from the postponed receipts map (if present).
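A simplified in-memory sketch of this bookkeeping (the real runtime keeps these maps in persistent per-account storage; the names are illustrative):
```ts
type ReceiptId = string;
type DataId = string;

class ReceiptBookkeeping {
  receivedData = new Map<DataId, { success: boolean; data: Uint8Array | null }>();
  postponed = new Map<ReceiptId, { inputDataIds: DataId[] }>();
  pendingInput = new Map<DataId, ReceiptId>();

  // Returns true if the receipt executed immediately, false if it was postponed.
  onActionReceipt(receiptId: ReceiptId, inputDataIds: DataId[]): boolean {
    let ready = true;
    for (const dataId of inputDataIds) {
      if (!this.receivedData.has(dataId)) {
        this.pendingInput.set(dataId, receiptId);
        ready = false;
      }
    }
    if (!ready) {
      this.postponed.set(receiptId, { inputDataIds });
      return false;                        // wait for the missing DataReceipts
    }
    this.execute(receiptId, inputDataIds);
    return true;
  }

  onDataReceipt(dataId: DataId, success: boolean, data: Uint8Array | null): void {
    this.receivedData.set(dataId, { success, data });
    const waiting = this.pendingInput.get(dataId);
    if (waiting === undefined) return;
    this.pendingInput.delete(dataId);
    const receipt = this.postponed.get(waiting);
    // Re-check the postponed receipt: execute only when every input is available.
    if (receipt && receipt.inputDataIds.every((id) => this.receivedData.has(id))) {
      this.postponed.delete(waiting);
      this.execute(waiting, receipt.inputDataIds);
    }
  }

  private execute(receiptId: ReceiptId, inputDataIds: DataId[]): void {
    // ... run the actions with the collected input data ...
    for (const id of inputDataIds) this.receivedData.delete(id);
  }
}
```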
### TODO Receipt execution
- input data is available to all function calls in the batched actions
- TODODO
# Future possibilities
[future-possibilities]: #future-possibilities
- We can add `or` based data selector, so data storage can be affected.
|
---
id: approvals
title: Approvals
sidebar_label: Approvals
---
import {Github} from "@site/src/components/codetabs"
In this tutorial you'll learn the basics of an approval management system which will allow you to grant others access to transfer NFTs on your behalf. This is the backbone of all NFT marketplaces and allows for some complex yet beautiful scenarios to happen. If you're joining us for the first time, feel free to clone [this repository](https://github.com/near-examples/nft-tutorial) and checkout the `4.core` branch to follow along.
```bash
git checkout 4.core
```
:::tip
If you wish to see the finished code for this _Approval_ tutorial, you can find it on the `5.approval` branch.
:::
## Introduction
Up until this point you've created a smart contract that allows users to mint and transfer NFTs as well as query for information using the [enumeration standard](https://nomicon.io/Standards/Tokens/NonFungibleToken/Enumeration). As we've been doing in the previous tutorials, let's break down the problem into smaller, more digestible, tasks. Let's first define some of the end goals that we want to accomplish as per the [approval management](https://nomicon.io/Standards/Tokens/NonFungibleToken/ApprovalManagement) extension of the standard. We want a user to have the ability to:
- Grant other accounts access to transfer their NFTs on a per token basis.
- Check if an account has access to a specific token.
- Revoke a specific account's ability to transfer an NFT.
- Revoke **all** other accounts' ability to transfer an NFT.
If you look at all these goals, they are all on a per token basis. This is a strong indication that you should change the `Token` struct which keeps track of information for each token.
## Allow an account to transfer your NFT
Let's start by trying to accomplish the first goal. How can you grant another account access to transfer an NFT on your behalf?
The simplest way that you can achieve this is to add a list of approved accounts to the `Token` struct. When transferring the NFT, if the caller is not the owner, you could check if they're in the list.
Before transferring, you would need to clear the list of approved accounts since the new owner wouldn't expect the accounts approved by the original owner to still have access to transfer their new NFT.
### The problem {#the-problem}
On the surface, this would work, but if you start thinking about the edge cases, some problems arise. Oftentimes when doing development, a common approach is to think about the easiest and most straightforward solution. Once you've figured it out, you can start to branch off and think about optimizations and edge cases.
Let's consider the following scenario. Benji has an NFT and gives two separate marketplaces access to transfer his token. By doing so, he's putting the NFT for sale (more about that in the [marketplace integrations](#marketplace-integrations) section). Let's say he put the NFT for sale for 1 NEAR on both markets. The token's list of approved account IDs would look like the following:
```
Token: {
owner_id: Benji
approved_accounts_ids: [marketplace A, marketplace B]
}
```
Josh then comes along and purchases the NFT on marketplace A for 1 NEAR. This would take the sale down from marketplace A and clear the list of approved accounts. Marketplace B, however, still has the token listed for sale for 1 NEAR and has no way of knowing that the token was purchased on marketplace A by Josh. The new token struct would look as follows:
```
Token: {
owner_id: Josh
approved_accounts_ids: []
}
```
Let's say Josh is low on cash and wants to flip this NFT and put it for sale for 10 times the price on marketplace B. He goes to put it for sale and for whatever reason, the marketplace is built in a way that if you try to put a token up for sale twice, it keeps the old sale data. This would mean that from marketplace B's perspective, the token is still for sale for 1 NEAR (which was the price that Benji had originally listed it for).
Since Josh approved the marketplace to try and put it for sale, the token struct would look as follows:
```
Token: {
owner_id: Josh
approved_accounts_ids: [marketplace A, marketplace B]
}
```
If Mike then comes along and purchases the NFT for only 1 NEAR on marketplace B, the marketplace would try to transfer the NFT, and since Josh technically approved the marketplace and it's in the list of approved accounts, the transaction would go through properly.
### The solution {#the-solution}
Now that we've identified a problem with the original solution, let's think about ways that we can fix it. What would happen if, instead of just keeping track of a list of approved accounts, you introduced a specific ID that went along with each approved account? The new approved accounts would now be a map instead of a list. It would map an account to its `approval id`.
For this to work, you need to make sure that the approval ID is **always** a unique, new ID. If you set it as an integer that always increases by 1 whenever you approve an account, this should work. Let's consider the same scenario with the new solution.
Benji puts his NFT for sale for 1 NEAR on marketplace A and marketplace B by approving both marketplaces. The "next approval ID" would start off at 0 when the NFT was first minted and will increase from there. This would result in the following token struct:
```
Token: {
owner_id: Benji
approved_accounts_ids: {
marketplace A: 0
marketplace B: 1
}
next_approval_id: 2
}
```
When Benji approved marketplace A, it took the original value of `next_approval_id`, which started off at 0. The marketplace was then inserted into the map and the next approval ID was incremented. This process happened again for marketplace B, and the next approval ID was incremented again, leaving it at 2.
Josh comes along and purchases the NFT on marketplace A for 1 NEAR. Notice how the next approval ID stayed at 2:
```
Token: {
owner_id: Josh
approved_accounts_ids: {}
next_approval_id: 2
}
```
Josh then flips the NFT because he's once again low on cash and approves marketplace B:
```
Token: {
owner_id: Josh
approved_accounts_ids: {
marketplace B: 2
}
next_approval_id: 3
}
```
The marketplace is inserted into the map and the next approval ID is incremented. From marketplace B's perspective, it stores its original approval ID from when Benji put the NFT up for sale, which has a value of 1. If Mike were to go and purchase the NFT on marketplace B for the original 1 NEAR sale price, the NFT contract should panic. This is because the marketplace is trying to transfer the NFT with an approval ID of 1, but the token struct shows that it **should** have an approval ID of 2.
### Expanding the `Token` and `JsonToken` structs
Now that you understand the proposed solution to the original problem of allowing an account to transfer your NFT, it's time to implement some of the logic. The first thing you should do is modify the `Token` and `JsonToken` structs to reflect the new changes. Let's switch over to the `nft-contract/src/metadata.ts` file:
<Github language="js" start="106" end="156" url="https://github.com/near-examples/nft-tutorial-js/blob/5.approval/src/nft-contract/metadata.ts" />
You'll then need to initialize both the `approved_account_ids` and `next_approval_id` to their default values when a token is minted. Switch to the `nft-contract/src/mint.ts` file and when creating the `Token` struct to store in the contract, let's set the next approval ID to be 0 and the approved account IDs to be an empty object:
<Github language="js" start="23" end="31" url="https://github.com/near-examples/nft-tutorial-js/blob/5.approval/src/nft-contract/mint.ts" />
### Approving accounts
Now that you've added the support for approved account IDs and the next approval ID on the token level, it's time to add the logic for populating and changing those fields through a function called `nft_approve`. This function should approve an account to have access to a specific token ID. Let's move to the `nft-contract/src/approval.ts` file and edit the `internalNftApprove` function:
<Github language="js" start="9" end="73" url="https://github.com/near-examples/nft-tutorial-js/blob/5.approval/src/nft-contract/approval.ts" />
The function will first assert that the user has attached **at least** one yoctoNEAR (which we'll implement soon). This is both for security and to cover storage. When someone approves an account ID, they're storing that information on the contract. As you saw in the [minting tutorial](/tutorials/nfts/js/minting), you can either have the smart contract account cover the storage, or you can have the users cover that cost. The latter is more scalable and it's the approach you'll be working with throughout this tutorial.
After the assertion comes back with no problems, you get the token object and make sure that only the owner is calling this method. Only the owner should be able to allow other accounts to transfer their NFTs. You then get the next approval ID and insert the passed-in account into the map with that approval ID. If the account wasn't already approved, storage must be paid for the new entry. If the account was already approved, no extra storage is needed and attaching just 1 yoctoNEAR is enough.
You then calculate how much storage is being used by adding that new account to the map and increment the token's `next_approval_id` by 1. After inserting the token object back into the `tokensById` map, you refund any excess storage.
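In outline, the approval flow performs roughly the following steps. This is a simplified sketch, not the tutorial's exact code: the `storageCostFor` and `refundExcess` helpers stand in for the internal functions linked above.
```ts
type Token = {
  owner_id: string;
  approved_account_ids: Record<string, number>;
  next_approval_id: number;
};

// Stand-in helpers; the tutorial implements the real versions in internal.ts.
const storageCostFor = (accountId: string): bigint =>
  BigInt(accountId.length) * 10_000_000_000_000_000_000n; // rough per-byte storage price in yocto
const refundExcess = (amount: bigint): void => { /* send `amount` back to the caller */ };

function approveSketch(
  tokensById: Map<string, Token>,
  tokenId: string,
  accountId: string,
  caller: string,
  attachedDeposit: bigint,
): void {
  if (attachedDeposit < 1n) throw new Error("attach at least 1 yoctoNEAR");

  const token = tokensById.get(tokenId);
  if (token === undefined || token.owner_id !== caller) {
    throw new Error("only the token owner can approve accounts");
  }

  // Insert (or re-insert) the account with a fresh approval ID.
  const isNewAccount = !(accountId in token.approved_account_ids);
  token.approved_account_ids[accountId] = token.next_approval_id;
  token.next_approval_id += 1;
  tokensById.set(tokenId, token);

  // Only brand-new approvals consume extra storage; refund whatever is left over.
  const storageCost = isNewAccount ? storageCostFor(accountId) : 0n;
  if (attachedDeposit < storageCost) throw new Error("deposit does not cover storage");
  refundExcess(attachedDeposit - storageCost);
}
```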
You'll notice that the function contains an optional `msg` parameter. This message is actually the foundation of all NFT marketplaces on NEAR.
#### Marketplace Integrations {#marketplace-integrations}
If a message was provided into the function, you're going to perform a cross contract call to the account being given access. This cross contract call will invoke the `nft_on_approve` function which will parse the message and act accordingly. Let's consider a general use case.
We have a marketplace that expects its sale conditions to be passed in through the message field. Benji approves the marketplace with the `nft_approve` function and passes in a stringified JSON to the message which will outline sale conditions. These sale conditions could look something like the following:
```json
{
  "sale_conditions": {
    "price": 5
  }
}
```
By leaving the message field typed as just a string, this generalizes the process and allows users to input sale conditions for many different marketplaces. It is up to the person approving to pass in an appropriate message that the marketplace can properly decode and use. This is usually done through the marketplace's frontend app, which would know how to construct the `msg` in a useful way.
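On the marketplace side, `nft_on_approve` might parse that message roughly as follows. This is a sketch only: the `SaleConditions` shape, the in-memory listing store, and the `predecessorAccountId` stand-in are assumptions, not part of the standard or the tutorial code.
```ts
type SaleConditions = { price: number };   // assumed shape of the msg payload

// Stand-in for reading the predecessor account ID inside a real contract.
const predecessorAccountId = (): string => "nft.examples.testnet";

// Hypothetical in-memory listing store keyed by "<nft contract>:<token id>".
const listings = new Map<string, { ownerId: string; approvalId: number; price: number }>();

// Called by the NFT contract as part of `nft_approve` when `msg` is provided.
function nft_on_approve(token_id: string, owner_id: string, approval_id: number, msg: string): void {
  const { sale_conditions } = JSON.parse(msg) as { sale_conditions: SaleConditions };
  listings.set(`${predecessorAccountId()}:${token_id}`, {
    ownerId: owner_id,
    approvalId: approval_id,   // kept so a later transfer can pass the current approval ID
    price: sale_conditions.price,
  });
}
```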
#### Internal functions
Now that the core logic for approving an account is finished, you need to implement the `assertAtLeastOneYocto` and `bytesForApprovedAccountId` functions. Move to the `nft-contract/src/internal.ts` file and copy the following function right below the `assertOneYocto` function.
<Github language="js" start="61" end="64" url="https://github.com/near-examples/nft-tutorial-js/blob/5.approval/src/nft-contract/internal.ts" />
Next, you'll need to copy the logic for calculating how many bytes it costs to store an account ID. Place this function at the very top of the page:
<Github language="js" start="55" end="59" url="https://github.com/near-examples/nft-tutorial-js/blob/5.approval/src/nft-contract/internal.ts" />
Now that the logic for approving accounts is finished, you need to change the restrictions for transferring.
### Changing the restrictions for transferring NFTs
Currently, an NFT can **only** be transferred by its owner. You need to change that restriction so that accounts that have been approved can also transfer NFTs. In addition, you'll make it so that if an approval ID is passed, the contract increases security by checking both that the account trying to transfer is in the approved list **and** that it corresponds to the correct approval ID. This is to address the problem we ran into earlier.
In the `internal.ts` file, you need to change the logic of the `internalTransfer` method as that's where the restrictions are being made. Change the internal transfer function to be the following:
<Github language="js" start="108" end="163" url="https://github.com/near-examples/nft-tutorial-js/blob/5.approval/src/nft-contract/internal.ts" />
This will check if the sender isn't the owner and then if they're not, it will check if the sender is in the approval list. If an approval ID was passed into the function, it will check if the sender's actual approval ID stored on the contract matches the one passed in.
#### Refunding storage on transfer
While you're in the internal file, you're going to need to add methods for refunding users who have paid for storing approved accounts on the contract when an NFT is transferred. This is because you'll be clearing the `approved_account_ids` object whenever NFTs are transferred and so the storage is no longer being used.
<Github language="js" start="13" end="28" url="https://github.com/near-examples/nft-tutorial-js/blob/5.approval/src/nft-contract/internal.ts" />
These will be useful in the next section where you'll be changing the `nft_core` functions to include the new approval logic.
### Changes to `nft_core.ts`
Head over to the `nft-contract/src/nft_core.ts` file and the first change that you'll want to make is to add an `approval_id` to the `internalTransfer` function. This is so that anyone trying to transfer the token that isn't the owner must pass in an approval ID to address the problem seen earlier. If they are the owner, the approval ID won't be used as we saw in the `internalTransfer` function.
For the `nft_transfer` function, the only change that you'll need to make is to pass the approval ID into the `internalTransfer` function and then refund the previous token's approved account IDs after the transfer is finished.
<Github language="js" start="38" end="72" url="https://github.com/near-examples/nft-tutorial-js/blob/5.approval/src/nft-contract/nft_core.ts" />
Next, you need to do the same to `nft_transfer_call` but instead of refunding immediately, you need to attach the previous token's approved account IDs to `nft_resolve_transfer` instead as there's still the possibility that the transfer gets reverted.
<Github language="js" start="74" end="135" url="https://github.com/near-examples/nft-tutorial-js/blob/5.approval/src/nft-contract/nft_core.ts" />
You'll also need to add the token's approved account IDs to the `JsonToken` being returned by `nft_token`.
<Github language="js" start="10" end="36" url="https://github.com/near-examples/nft-tutorial-js/blob/5.approval/src/nft-contract/nft_core.ts" />
Finally, you need to add the logic for refunding the approved account IDs in `internalResolveTransfer`. If the transfer went through, you should refund the owner for the storage being released by resetting the token's `approved_account_ids` field. If, however, you have to revert the transfer, it wouldn't be enough to just not refund anybody. Since the receiver briefly owned the token, they could have added their own approved account IDs, and so you should refund them if they did so.
<Github language="js" start="137" end="208" url="https://github.com/near-examples/nft-tutorial-js/blob/5.approval/src/nft-contract/nft_core.ts" />
With that finished, it's time to move on and complete the next task.
## Check if an account is approved
Now that the core logic is in place for approving and refunding accounts, it should be smooth sailing from this point on. You now need to implement the logic for checking if an account has been approved. This should take an account and token ID as well as an optional approval ID. If no approval ID was provided, it should simply return whether or not the account is approved.
If an approval ID was provided, it should return whether or not the account is approved and has the same approval ID as the one provided. Let's move to the `nft-contract/src/approval.ts` file and add the necessary logic to the `internalNftIsApproved` function.
<Github language="js" start="75" end="110" url="https://github.com/near-examples/nft-tutorial-js/blob/5.approval/src/nft-contract/approval.ts" />
Let's now move on and add the logic for revoking an account.
## Revoke an account
The next step in the tutorial is to allow a user to revoke a specific account from having access to their NFT. The first thing you'll want to do is assert one yocto for security purposes. You'll then need to make sure that the caller is the owner of the token. If those checks pass, you'll need to remove the passed-in account from the token's approved account IDs and refund the owner for the storage being released.
<Github language="js" start="112" end="145" url="https://github.com/near-examples/nft-tutorial-js/blob/5.approval/src/nft-contract/approval.ts" />
## Revoke all accounts
The final step in the tutorial is to allow a user to revoke all accounts from having access to their NFT. This should also assert one yocto for security purposes and make sure that the caller is the owner of the token. You then refund the owner for releasing all the accounts in the map and then clear the `approved_account_ids`.
<Github language="js" start="147" end="177" url="https://github.com/near-examples/nft-tutorial-js/blob/5.approval/src/nft-contract/approval.ts" />
With that finished, it's time to deploy and start testing the contract.
## Testing the new changes {#testing-changes}
Since these changes affect all the other tokens and the state won't automatically be inherited by the new code, simply redeploying the contract will lead to errors. For this reason, it's best practice to create a sub-account and deploy the contract there.
### Creating a sub-account {#creating-sub-account}
Run the following command to create a sub-account `approval` of your main account with an initial balance of 25 NEAR, which will be transferred from the original account to your new one.
```bash
near create-account approval.$NFT_CONTRACT_ID --masterAccount $NFT_CONTRACT_ID --initialBalance 25
```
Next, you'll want to export an environment variable for ease of development:
```bash
export APPROVAL_NFT_CONTRACT_ID=approval.$NFT_CONTRACT_ID
```
Using the build script, build and deploy the contract as you did in the previous tutorials:
```bash
yarn build && near deploy --wasmFile build/nft.wasm --accountId $APPROVAL_NFT_CONTRACT_ID
```
### Initialization and minting {#initialization-and-minting}
Since this is a new contract, you'll need to initialize and mint a token. Use the following command to initialize the contract:
```bash
near call $APPROVAL_NFT_CONTRACT_ID init '{"owner_id": "'$APPROVAL_NFT_CONTRACT_ID'"}' --accountId $APPROVAL_NFT_CONTRACT_ID
```
Next, you'll need to mint a token. By running this command, you'll mint a token with a token ID `"approval-token"` and the receiver will be your new account.
```bash
near call $APPROVAL_NFT_CONTRACT_ID nft_mint '{"token_id": "approval-token", "metadata": {"title": "Approval Token", "description": "testing out the new approval extension of the standard", "media": "https://bafybeiftczwrtyr3k7a2k4vutd3amkwsmaqyhrdzlhvpt33dyjivufqusq.ipfs.dweb.link/goteam-gif.gif"}, "receiver_id": "'$APPROVAL_NFT_CONTRACT_ID'"}' --accountId $APPROVAL_NFT_CONTRACT_ID --amount 0.1
```
You can check to see if everything went through properly by calling one of the enumeration functions:
```bash
near view $APPROVAL_NFT_CONTRACT_ID nft_tokens_for_owner '{"account_id": "'$APPROVAL_NFT_CONTRACT_ID'", "limit": 10}'
```
This should return an output similar to the following:
```json
[
{
"token_id": "approval-token",
"owner_id": "approval.goteam.examples.testnet",
"metadata": {
"title": "Approval Token",
"description": "testing out the new approval extension of the standard",
"media": "https://bafybeiftczwrtyr3k7a2k4vutd3amkwsmaqyhrdzlhvpt33dyjivufqusq.ipfs.dweb.link/goteam-gif.gif"
},
"approved_account_ids": {}
}
]
```
Notice how the approved account IDs are now being returned from the function? This is a great sign! You're now ready to move on and approve an account to have access to your token.
### Approving an account {#approving-an-account}
At this point, you should have two accounts. One stored under `$NFT_CONTRACT_ID` and the other under the `$APPROVAL_NFT_CONTRACT_ID` environment variable. You can use both of these accounts to test things out. If you approve your old account, it should have the ability to transfer the NFT to itself.
Execute the following command to approve the account stored under `$NFT_CONTRACT_ID` to have access to transfer your NFT with an ID `"approval-token"`. You don't need to pass a message since the old account didn't implement the `nft_on_approve` function. In addition, you'll need to attach enough NEAR to cover the cost of storing the account on the contract. 0.1 NEAR should be more than enough and you'll be refunded any excess that is unused.
```bash
near call $APPROVAL_NFT_CONTRACT_ID nft_approve '{"token_id": "approval-token", "account_id": "'$NFT_CONTRACT_ID'"}' --accountId $APPROVAL_NFT_CONTRACT_ID --deposit 0.1
```
If you call the same enumeration method as before, you should see the new approved account ID being returned.
```bash
near view $APPROVAL_NFT_CONTRACT_ID nft_tokens_for_owner '{"account_id": "'$APPROVAL_NFT_CONTRACT_ID'", "limit": 10}'
```
This should return an output similar to the following:
```json
[
{
"token_id": "approval-token",
"owner_id": "approval.goteam.examples.testnet",
"metadata": {
"title": "Approval Token",
"description": "testing out the new approval extension of the standard",
"media": "https://bafybeiftczwrtyr3k7a2k4vutd3amkwsmaqyhrdzlhvpt33dyjivufqusq.ipfs.dweb.link/goteam-gif.gif"
},
"approved_account_ids": { "goteam.examples.testnet": 0 }
}
]
```
### Transferring an NFT as an approved account {#transferring-the-nft}
Now that you've approved another account to transfer the token, you can test that behavior. You should be able to use the other account to transfer the NFT to itself, after which the approved account IDs should be reset. Let's test transferring the NFT with the wrong approval ID:
```bash
near call $APPROVAL_NFT_CONTRACT_ID nft_transfer '{"receiver_id": "'$NFT_CONTRACT_ID'", "token_id": "approval-token", "approval_id": 1}' --accountId $NFT_CONTRACT_ID --depositYocto 1
```
<details>
<summary>Example response: </summary>
<p>
```bash
kind: {
ExecutionError: "Smart contract panicked: panicked at 'assertion failed: `(left == right)`\n" +
' left: `0`,\n' +
" right: `1`: The actual approval_id 0 is different from the given approval_id 1', src/internal.ts:165:17"
},
```
</p>
</details>
If you pass the correct approval ID which is `0`, everything should work fine.
```bash
near call $APPROVAL_NFT_CONTRACT_ID nft_transfer '{"receiver_id": "'$NFT_CONTRACT_ID'", "token_id": "approval-token", "approval_id": 0}' --accountId $NFT_CONTRACT_ID --depositYocto 1
```
If you again call the enumeration method, you should see the owner updated and the approved account IDs reset.
```json
[
{
"token_id": "approval-token",
"owner_id": "goteam.examples.testnet",
"metadata": {
"title": "Approval Token",
"description": "testing out the new approval extension of the standard",
"media": "https://bafybeiftczwrtyr3k7a2k4vutd3amkwsmaqyhrdzlhvpt33dyjivufqusq.ipfs.dweb.link/goteam-gif.gif"
},
"approved_account_ids": {}
}
]
```
Let's now test the approval ID incrementing across different owners. If you approve the sub-account that originally minted the token, the approval ID should be 1 now.
```bash
near call $APPROVAL_NFT_CONTRACT_ID nft_approve '{"token_id": "approval-token", "account_id": "'$APPROVAL_NFT_CONTRACT_ID'"}' --accountId $NFT_CONTRACT_ID --deposit 0.1
```
Calling the view function again should now return an approval ID of 1 for the sub-account that was approved.
```bash
near view $APPROVAL_NFT_CONTRACT_ID nft_tokens_for_owner '{"account_id": "'$NFT_CONTRACT_ID'", "limit": 10}'
```
<details>
<summary>Example response: </summary>
<p>
```json
[
{
"token_id": "approval-token",
"owner_id": "goteam.examples.testnet",
"metadata": {
"title": "Approval Token",
"description": "testing out the new approval extension of the standard",
"media": "https://bafybeiftczwrtyr3k7a2k4vutd3amkwsmaqyhrdzlhvpt33dyjivufqusq.ipfs.dweb.link/goteam-gif.gif"
},
"approved_account_ids": { "approval.goteam.examples.testnet": 1 }
}
]
```
</p>
</details>
With the testing finished, you've successfully implemented the approvals extension to the standard!
## Conclusion
Today you went through a lot of logic to implement the [approvals extension](https://nomicon.io/Standards/Tokens/NonFungibleToken/ApprovalManagement) so let's break down exactly what you did.
First, you explored the [basic approach](#basic-solution) of how to solve the problem. You then went through and discovered some of the [problems](#the-problem) with that solution and learned how to [fix it](#the-solution).
After understanding what you should do to implement the approvals extension, you started to [modify](#expanding-json-and-token) the `JsonToken` and `Token` structs in the contract. You then implemented the logic for [approving accounts](#approving-accounts) and saw how [marketplaces](#marketplace-integrations) are integrated.
After implementing the logic behind approving accounts, you went and [changed the restrictions](#changing-restrictions) needed to transfer NFTs. The last step you did to finalize the approving logic was to go back and edit the [nft_core](#nft-core-changes) file to be compatible with the new changes.
At this point, everything was implemented in order to allow accounts to be approved and you extended the functionality of the [core standard](https://nomicon.io/Standards/Tokens/NonFungibleToken/Core) to allow for approved accounts to transfer tokens.
You implemented a view method to [check](#check-if-account-approved) if an account is approved and to finish the coding portion of the tutorial, you implemented the logic necessary to [revoke an account](#revoke-account) as well as [revoke all accounts](#revoke-all-accounts).
After this, the contract code was finished and it was time to move on to testing, where you created a [subaccount](#creating-sub-account) and tested [approving](#approving-an-account) and [transferring](#transferring-the-nft) your NFTs.
In the next tutorial, you'll learn about the royalty standards and how you can interact with NFT marketplaces.
:::note Versioning for this article
At the time of this writing, this example works with the following versions:
- near-cli: `3.0.0`
- NFT standard: [NEP171](https://nomicon.io/Standards/Tokens/NonFungibleToken/Core), version `1.0.0`
- Enumeration standard: [NEP181](https://nomicon.io/Standards/Tokens/NonFungibleToken/Enumeration), version `1.0.0`
- Approval standard: [NEP178](https://nomicon.io/Standards/Tokens/NonFungibleToken/ApprovalManagement), version `1.0.0`
:::
|
---
sidebar_position: 1
sidebar_label: "Introduction"
---
# Pagoda Alerts & Triggers
:::warning
Please be advised that these tools and services will be discontinued soon.
:::
## What are Alerts & Triggers?
Pagoda Alerts & Triggers are designed to notify you of, and trigger automated responses to, important events that occur on the NEAR blockchain. Behind the scenes, Alerts are powered by many mini-indexers, “Alertexers”, that stream blockchain data in real-time, enabling developers to know what’s happening to their dApp before their users do.
Alerts are broken into three parts:
1. The NEAR address the alert should listen to (account or contract)
2. The event condition (success & failed actions, account drains, and more)
3. The alert destination ([e-mail](setup.md#setting-up-e-mail-alerts), [Telegram](setup.md#setting-up-telegram-alerts), [webhooks](webhooks.md))
Alerts can be set up to listen for the following five conditions:
1. Successful Actions
2. Failed Actions
3. [Event Logged](https://nomicon.io/Standards/EventsFormat)
4. Function Called
5. Account Balance Change
## Setup
- [E-mail alerts](setup.md#setting-up-e-mail-alerts)
- [Telegram alerts](setup.md#setting-up-telegram-alerts)
- [Event Log Alerts](setup.md#setting-up-event-log-alerts)
- [Function Call Specific Alerts](setup.md#setting-up-function-call-specific-alerts)
## Using Webhooks with Alerts & Triggers
See an example on how to [set up alerts using webhooks](webhooks.md).
|
---
sidebar_position: 2
sidebar_label: "Webhooks Example"
---
# Turn on the (Hue) lights with NEAR NFTs and Pagoda Alerts & Triggers
:::warning
Please be advised that these tools and services will be discontinued soon.
:::
## Overview
How cool would it be to have your lights turn on, or your favorite song on Spotify play, when someone buys your NFT on NEAR?
With the Pagoda Console and IFTTT you can do both in minutes with zero code!
## What will we be doing?
Using a combination of the [Pagoda Console](https://console.pagoda.co) and [IFTTT](https://ifttt.com) we will turn on our lights when a successful transaction has been processed.
We will be using the webhook trigger to allow the Pagoda Console to call an endpoint on IFTTT, which will then turn on our Hue lights.
### What is IFTTT?
IFTTT stands for "If This Then That". It's a platform (at ifttt.com) that provides a variety of services, each with its own collection of applets that provide some unique functionality.
<img width="60%" src="/docs/pagoda/webhook1.png" />
#### If This
It starts with the "If This" trigger. For example, time could be your trigger, so that at 10pm your lights turn off. Or something more random: liking a song on Spotify could add the music video to a YouTube playlist.
There are a lot of triggers on this service, but to name a few examples:
- Time
- Temperature
- Webhooks (what we'll be using)
#### Then That
Next comes the "Then That" action. An action is what happens when your trigger has been tripped. For example, turning out the lights at 10pm OR turning them on when you mint an NFT on NEAR.
## Step 1: Getting the webhook address
We will be setting up a webhook trigger, so after you make an account on ifttt.com you will see this page...
<img width="70%" src="/docs/pagoda/webhook2.png" />
Go to the **Services** Tab and search for "webhook"
<img width="20%" src="/docs/pagoda/webhook3.png" />
Click on the webhooks icon and then you'll be sent to this page...
<img width="50%" src="/docs/pagoda/webhook4.png" />
Click on the "Documentation" button. This should open up a new tab...
<img width="70%" src="/docs/pagoda/webhook5.png" />
Leave that page alone for now; we'll come back to it. This is essentially where we get the webhook address we will call for our "If This".
## Step 2: Setting up your Trigger
Hit the **Create** button in the upper right corner of the screen...
<img width="20%" src="/docs/pagoda/webhook6.png" />
Next click on the "If This" Button...
<img width="40%" src="/docs/pagoda/webhook7.png" />
Again search for **webhooks** ...
<img width="60%" src="/docs/pagoda/webhook8.png" />
Select the **Receive a Web Request** trigger...
<img width="30%" src="/docs/pagoda/webhook9.png" />
Let's call this "**on_transaction**", then select **Create trigger**.
<img width="50%" src="/docs/pagoda/webhook10.png" />
## Step 3: Select your Action
For this tutorial, you will need to have:
- A Hue account
- Hue lights
Next click on **Then That** ...
<img width="50%" src="/docs/pagoda/webhook11.png" />
Search for **Hue**
<img width="60%" src="/docs/pagoda/webhook12.png" />
Select Turn On Lights
<img width="30%" src="/docs/pagoda/webhook13.png" />
Select the lights of your choosing; I will simply select all lights.
If you haven't already, create and connect your Hue account.
<img width="40%" src="/docs/pagoda/webhook14.png" />
After you do this, simply hit **Create Action**. Then you'll be redirected here...
<img width="50%" src="/docs/pagoda/webhook15.png" />
As you can see, you can add more than one trigger or action if you'd like. But for now we'll stick to the one. Hit **Continue**.
## Step 4: Setting up your endpoint
Once you hit **Continue** you'll be redirected here...
<img width="40%" src="/docs/pagoda/webhook16.png" />
Take note of the name "on_transaction" and copy it, then hit the **Finish** button...
Next, go back to the documentation tab we opened up earlier.
Where it says `{event}`, replace everything, including the curly braces, with "on_transaction":
<img width="90%" src="/docs/pagoda/webhook17.png" />
to
<img width="90%" src="/docs/pagoda/webhook18.png" />
Copy that entire line and head on over to console.pagoda.co.
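If you want to sanity-check the webhook before wiring it into the Pagoda Console, you can call it yourself. The snippet below is a rough sketch using Node 18+'s built-in `fetch` (run it as an ES module); the URL shape follows the IFTTT Maker webhooks documentation page shown above, and `YOUR_IFTTT_KEY` is a placeholder for the key from your own account:

```js
// Quick manual test of the IFTTT webhook; a 200 response means the applet fired.
// Replace YOUR_IFTTT_KEY with the key shown on your own webhooks documentation page.
const url = "https://maker.ifttt.com/trigger/on_transaction/with/key/YOUR_IFTTT_KEY";

const response = await fetch(url, { method: "POST" });
console.log(response.status, await response.text());
```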
## Step 5: Integrating Webhook into Pagoda Console
Once at console.pagoda.co, you should be greeted by the login page. Select the Non-Fungible Token (NFT) project to start exploring the NFT contract.
<img width="60%" src="/docs/pagoda/webhook19.png" />
Hit the "Deploy and Explore Contract" button. This will create a dev account for you and deploy the pre-written NFT smart contract onto that account.
<img width="40%" src="/docs/pagoda/webhook20.png" />
Head to the <kbd>Alerts</kbd> section
<img width="20%" src="/docs/pagoda/webhook21.png" />
And select <kbd>+ New Alert</kbd>
<img width="70%" src="/docs/pagoda/webhook22.png" />
You should see this page...
<img width="80%" src="/docs/pagoda/webhook23.png" />
Select the suggested contract, which should be the dev account that was created.
<img width="80%" src="/docs/pagoda/webhook24.png" />
:::info
You can use any contract running on mainnet or testnet for an alert. We'll just use this NFT testnet example for this guide.
:::
Under "Select Condition", hit "Successful Transaction". This means that an alert will be sent for any successful transaction. In this case, for any successful method call, the lights will turn on. If you want to, you can select "function call" for a specific method to be the trigger.
But for now, we'll keep it easy and select any Successful Action.
<img width="60%" src="/docs/pagoda/webhook25.png" />
We're almost done! Under destination, select webhooks. Now take the webhook URL we created earlier and copy and paste it in here. Then hit "Create".
:::tip
Don't forget to remove the `{}` around the name of your event! The URL should contain `.../trigger/on_transaction/...`, not `.../trigger/{on_transaction}/...`
:::
<img width="60%" src="/docs/pagoda/webhook26.png" />
Remember to hit the "+ Create Alert" button on this page...
<img width="60%" src="/docs/pagoda/webhook27.png" />
Now head on over to the "Contracts" Section.
<img width="60%" src="/docs/pagoda/webhook28.png" />
Select the contract we just created and navigate to the "Interact" tab to connect your wallet.
<img width="60%" src="/docs/pagoda/webhook29.png" />
Now here is the part we've all been waiting for... **Turn on the (hue) lights!**
Select the `new_default_metadata` function (we are choosing this one because we have to initialize our contract; it's still a transaction, which will trigger our new webhook). Fill in the `owner_id` field with your wallet account name and hit **Send Transaction**.
<img width="60%" src="/docs/pagoda/webhook30.png" />
## Wrapping up
And that's it! You've just triggered something in the real world with an event that happened on the NEAR Blockchain. Hopefully this inspires you to create your own webhook using IFTTT and the Pagoda Console.
We'd love to see what you create! Tag [@PagodaPlatform](https://twitter.com/PagodaPlatform) on Twitter with a novel implementation of a webhook and trigger and we might retweet it.
Happy hacking!
|
---
id: introduction
title: NFT Zero to Hero JavaScript Edition
sidebar_label: Introduction
---
> In this _Zero to Hero_ series, you'll find a set of tutorials that will cover every aspect of a non-fungible token (NFT) smart contract.
> You'll start by minting an NFT using a pre-deployed contract and by the end you'll end up building a fully-fledged NFT smart contract that supports every extension.
## Prerequisites
To complete these tutorials successfully, you'll need:
- [Node.js](/build/smart-contracts/quickstart#prerequisites#nodejs)
- [A NEAR Wallet](https://testnet.mynearwallet.com/create)
- [NEAR-CLI](/tools/near-cli#setup)
---
## Overview
These are the steps that will bring you from **_Zero_** to **_Hero_** in no time! 💪
| Step | Name | Description |
|------|------------------------------------------------------------------|------------------------------------------------------------------------------------------------------------------------|
| 1 | [Pre-deployed contract](/tutorials/nfts/js/predeployed-contract) | Mint an NFT without the need to code, create, or deploy a smart contract. |
| 2 | [Contract architecture](/tutorials/nfts/js/skeleton) | Learn the basic architecture of the NFT smart contract and compile code. |
| 3 | [Minting](/tutorials/nfts/js/minting) | Flesh out the skeleton so the smart contract can mint a non-fungible token. |
| 4 | [Upgrade a contract](/tutorials/nfts/js/upgrade-contract) | Discover the process to upgrade an existing smart contract. |
| 5 | [Enumeration](/tutorials/nfts/js/enumeration) | Explore enumeration methods that can be used to return the smart contract's states. |
| 6 | [Core](/tutorials/nfts/js/core) | Extend the NFT contract using the core standard, which allows token transfers. |
| 7 | [Approvals](/tutorials/nfts/js/approvals) | Expand the contract allowing other accounts to transfer NFTs on your behalf. |
| 8 | [Royalty](/tutorials/nfts/js/royalty) | Add NFT royalties allowing for a set percentage to be paid out to the token creator. |
| 9 | [Events](/tutorials/nfts/js/events) | Explore the events extension, allowing the contract to react to certain events. |
| 10 | [Marketplace](/tutorials/nfts/js/marketplace) | Learn how common marketplaces operate on NEAR and dive into some of the code that allows buying and selling NFTs. |
---
## Next steps
Ready to start? Jump to the [Pre-deployed Contract](/tutorials/nfts/js/predeployed-contract) tutorial and begin your learning journey!
If you already know about non-fungible tokens and smart contracts, feel free to skip and jump directly to the tutorial of your interest. The tutorials have been designed so you can start at any given point!
:::info Questions?
👉 Join us on [Discord](https://near.chat/) and let us know in the `#development` channels. 👈
We also host daily [Office Hours](https://pages.near.org/developers/get-help/office-hours/) live where the DevRel team will answer any questions you may have. 🤔
Monday – Friday 11AM – 12PM Pacific (6PM – 7PM UTC)
:::
|
Berry Club Part II: How Does The Berry Club Yield Farming App Work?
CASE STUDIES
February 17, 2021
Let’s Put It To The Test
For an introduction to Yield Farming on NEAR, see Part 1, Berry Club: A Fun, Creative Example of a DeFi Yield Farming App on NEAR Protocol
Berry Club Background
Berry Club is one of the oldest and certainly most engaging smart contracts on the NEAR blockchain. Berry Club has actually been around since before the NEAR mainnet was launched. Berry Club and sister contracts Berry Farm and Banana Swap are a suite of DeFi yield farming smart contracts. They were developed partly for community participation and creativity, but also to prove the viability of DeFi apps on NEAR.
Let’s start the tour with some details on each of the smart contracts related to Berry Club.
Artwork From Avocados
Berry Club is a yield farming application that allows users to purchase Avocados (tokens) with NEAR at fixed rates. These Avocados can then be “planted” on a public board of colored pixels. In aggregate, these pixels create an ever-changing shared piece of art.
It’s a crazy mix between a chat room and graffiti wall. Here are a few snapshots.
Some Examples of User-Created Pixel Art Using Avocados on Berry Club
Farming Avocados, Bananas, and Cucumbers
But there’s an even more interesting aspect to these pixels. Each pixel a user owns will pay a yield denominated either in Bananas or Avocados, based on the user’s preference (you will begin to notice the farming theme if you aren’t already familiar with Berry Club). This allows for beautiful game theory driven community-created artwork to be consistently changed and collaborated on.
Let’s add Cucumbers to the farm. Berry Farm allows these hard-working farmers (or users) to swap Bananas for Cucumbers at a 1:1 ratio. But why would they want a Cucumber when they have already purchased Avocados and earned Bananas?
Cucumbers pay a yield denominated in NEAR that is proportional to each farmer’s share of all Cucumbers. That is to say: if there are 100 Cucumbers total, and you own 1 Cucumber, then you will receive 1% of each of the NEAR reward distributions. These distributions are paid whenever any user draws on the pixel board (at most once per minute). The NEAR rewards distribution method highlights one of the coolest features about NEAR and Berry Farm: a portion of the NEAR distributions are actually taken from the gas that is used to interact with the smart contract, while the rest is distributed from Avocado purchases. This Contract Reward provides an incentive to build quality smart contracts on NEAR, and can help smart contracts become more self-sustaining.
This nicely highlights one of NEAR’s distinctive features: the creator of each smart contract can automatically receive a portion of rewards whenever the contract is used.
Banana Swap
Last but not least in the Berry Club ecosystem, there is Banana Swap: an automated market maker that services the Banana economy, allowing users to buy and sell Bananas directly for NEAR. This is a tiny version of Uniswap that was one of the first proofs of working AMM contracts on NEAR. Now that we have introduced all of the building blocks of Berry Club, let’s do some farming.
Trying Out Berry Club
DeFi is all about experimentation, so no blog post about DeFi would be complete without a little bit of first-hand user data.
Below is an example of what a new Berry Club farmer will experience starting out. For ease of understanding, it is narrated in the first person:
25 Avocados are given to each new user, so that's where I started the experiment. After those were received, I used 5 NEAR to purchase 1500 Avocados at a rate of 300 Avocados per NEAR (an advantaged rate compared to the 250 Avocados per NEAR for smaller NEAR amounts). I planted the Avocados with care on the Berry Club drawing board in batches of 150-250 at a time. The UI notes that you should not destroy the lovely artwork, and that if you would merely like to farm Bananas, you should set the opacity lower in order to preserve the underlying creations. So that is mostly what I did, occasionally adding darker pixels when I felt they would add to the artwork. Eventually, after some experimentation, I found that actually contributing original artwork to the board, rather than merely yield farming the clear pixels, was the optimal strategy.
My bountiful Avocado fields would yield me a harvest of 250 bananas per day. Fantastic! But not so fast. Bananas are only received for the spaces on the board that you currently own, so the 250 banana per day yield was quickly reduced as others drew pixels on top of mine. And herein lies the beauty of the design of Berry Club: each user can optimize their own strategy based on what they observe happening on the board in real-time.
A Good Harvest
When all was said and done and I had harvested my crops, the initial 5 NEAR that I had converted into 1500 Avocados yielded me 109.9 Bananas total – 75.9 of which were swapped back into 10 NEAR on Banana Swap, and the other 34 being converted into Cucumbers on Berry Farm.
This was good enough for a 0.022% share of all Cucumbers and generated a reward in NEAR. The NEAR-denominated price of Bananas on Banana Swap varied (as it should with a constant product automated market maker) on each of the swaps. In this way, the combination of Berry Farm and Banana Swap gives the user the choice between a high time preference and a low time preference. Users can simply farm Bananas and sell them directly for NEAR, or swap their Bananas for Cucumbers and collect NEAR passively. The entire process took two days (it could have been shorter, if I had been more active in drawing), which gave a point estimate of the yield earned of a mind-melting 18,250% per year (roughly 5 NEAR of profit on 5 NEAR in two days is 100% per two days, which annualizes to 100% × 365/2 ≈ 18,250%).
I attempted to replicate these results a second time with another 5 NEAR, and I got a point estimate of the annual yield earned of 6,000% amidst a falling Banana market. As I said earlier: may the best artist (and farmer) win! I could have chosen a completely Cucumber-centric model for testing, but deemed that the constant creation of more Cucumbers constitutes a dynamic system that would be too complex to analyze for such a blog post.
Testing Banana Swap
Seeing that there was a decentralized exchange, of course I had to go see if I could break it. Similar to Uniswap, Banana Swap uses a bonding curve that generates the relative token price based on the amount of NEAR and Bananas in the liquidity pool (less a fee, of course). In its current implementation, Banana Swap does not allow users to add more liquidity themselves, but rather charges a 10% fee on swaps and automatically adds those balances to the liquidity pool. In this way, liquidity on Banana Swap is self-sustaining, but also limited to natural growth. In its beta implementation, Banana Swap is technically vulnerable to an overflow exploit. However, thanks to the Rust programming language that NEAR supports, the overflow error cannot be exploited.
At this point it’s important to note that Berry Club is a proof of concept that is meant to highlight the ease of use of the NEAR blockchain as well as some of the newest protocol standards. One of those standards is NEP-122, the allowance-free vault-based token standard. Another example is the use of Restricted Access Keys, which allow for the end user experience to be much smoother as well as more secure. Restricted Access Keys can be restricted to specific NEAR allowances and method names (for example: the Restricted Access Keys for Berry Club only allow for the drawing method to be called, and not to make deposits from your account).
Berry Club is a fun and engaging way to participate in a live dApp on the NEAR mainnet. That is to say: you should only participate in Berry Club, Berry Farm, and Banana Swap with a degree of curiosity and as many NEAR as you can afford to lose. With that in mind, I hope you’ll try it out and see if the results of your yield farming strategy can beat my results!
Yield Farming Experimental Results
First Round:
SPENT: 5 NEAR for 1500 Avocados
30 Banana -> 30 Cucumber
1 NEAR @ 15.7 Banana = 15.7 Banana
1 NEAR @ 17.3 Banana = 17.3 Banana
4 NEAR @ 3.8 Banana = 15.2 Banana
2 NEAR @ 6.1 Banana = 12.2 Banana
1 NEAR @ 6.9 Banana = 6.9 Banana
1 NEAR @ 8.6 Banana = 8.6 Banana
Total: 75.9 Banana for 10 NEAR, 34 cucumbers (0.022% share) = 109.9 Banana total in 2 days
Next round:
Spent: 5 NEAR for 1500 Avocados
17 Banana -> 17 Cucumber = 17 Banana
2 NEAR @ 12.935 Banana = 25.87 Banana
1 NEAR @ 13.2 Banana = 13.2 Banana
1 NEAR @ 15.9 Banana = 15.9 Banana
1 NEAR @ 17.9 Banana = 17.9 Banana
1 NEAR @ 19.5 Banana = 19.5 Banana
1 NEAR @ 18.9 Banana = 18.9 Banana
TOTAL: 111.27 Banana for 7 NEAR, 17 cucumbers
GRAND TOTAL:
187.17 Banana/17 NEAR (11 Banana/NEAR average price) + 51 Cucumbers (0.032% share), 0.13 NEAR yield from Cucumbers = 71.3%
NEAR-denominated return in 4 days = about 6,000% APY
You can view some time-lapse summaries of shared Berry Club artwork on YouTube: https://youtu.be/PYF6RWd7ZgI
References:
https://pages.near.org/blog/2020-near-in-review-ecosystem-tour/
https://pages.near.org/blog/near-protocol-economics/
Get started on Berry Club today!
|
---
title: 1.2 The History of Money
description: The History of Money and Reserve Currencies
---
# 1.2 Bitcoin, The History of Money, and Reserve Currencies
## Introduction
This lecture is perhaps one of the most important in setting the stage for the ‘What’ and ‘Why’ of Web3. As a prequel, it is fair to say that in Crypto we have a multi-faceted industry that often leads to incongruence between the different parties involved. The finance people cannot understand the tech stack, the devs cannot do crypto-economics, and the marketers struggle to understand both!
The goal of this lecture is to give everyone here a concise and substantive overview of the financial and monetary context surrounding the emergence of the Internet of Value (also called Web3 or Crypto). And to do that we have to go on a bit of a journey through economic history.
## Money Over Time As A Social Construct. The Boring Basics
Monetary theory 101 - the very boring basics. This is conceptual and something to just keep in mind from a very high level:
Humans have consistently abstracted value away, as a convenience mechanism over time - they have optimized. What does this mean? It means when we used to barter, we evolved to use sea shells, from which we evolved to use record keeping of credit, and the state issuance of a minted coin. Then over time, we have further abstracted from that minted coin into paper money, and now we stand at a period where that level of abstraction is reaching even more incredible levels - namely the digital domain.
Crucial to understand, in this context and as you can see from [the title of this extremely boring article](https://www.academia.edu/723898/1973_Dinar_versus_the_ducat) - The Dinar and the Ducat - is that the origin of value - over time - has been first and foremost based upon the buy-in and social consciousness of the human collective. Money is a social relationship and it is largely based upon the belief in the value held by a specific asset by a certain group of people. This is crucial to remember as we enter the world of tokens and crypto-currencies.
## The Geopolitics of Money
Second, and building off of this first point: The institution or entity that has maintained the social agreement on the value of value (on what is money), has traditionally been the state or the king! And as states have evolved - and more specifically - the nation-state we have seen certain patterns emerge in terms of the strength and sustainability of monetary schemes. Here is a macro of what those look like:
![](@site/static/img/bootcamp/mod-em-1.2.1.png)
And there have been some fascinating lessons from this evolution in state-governed money in the past 600 years (which are directly relevant to what we will be diving into, on the topic of crypto and Web3):
* **(NEO) Mercantilism:** At first, everyone fought to strengthen their own currency standard against and at the cost of each other. This was known as mercantilism and subsequently neo-mercantilism. It was zero-sum, extractive, and competitive between states (mainly European ones).
* **Capitalism:** Over time, the winners realized that it was to their benefit and advantage to work together. This is the basis of the early stages of capitalism as described by Adam Smith in the Wealth of Nations. This transition is one of many factors often regarded as the basis of the ‘Great Enrichment’ or ‘the First Industrial Revolution’. Money, ideas, and people were able to share and collaborate from which innovation was born.
* **The Gold Standard:** By the 19th century, this morphed into an alliance among these countries no longer fighting each other, to adopt a ‘Gold Standard’. But the theory here far exceeded realistic expectations in reality. In reality, it was constantly debated whether Gold or Silver were reliable reserves, and if they could truly be trusted to back a currency regime. Nevertheless, the collective consciousness came to understand healthy monetary policy as requiring ‘gold’ backing. What needs to be remembered, is this was always controlled and modulated at the convenience of the governing state. There was never a ‘gold standard system’ floating outside of national monetary interests.
* **The Rise of the USA Federal Reserve and the Petro-Dollar:** At the end of WWII, we see a transition into the unprecedented situation of today (or recent times), which is characterized by the absolute dominance of the US Dollar. Absolute dominance here refers to the settlement currency used for the bulk of trade around the world. To keep things simple, thanks to the positioning of the USA at the end of WW II, and the destruction of other modern monetary regimes, the US Dollar was the natural currency enforced in the liberal world order (often spoken of by Peter Zeihan). This was then strengthened with the rapid digitization of finance.
* **The Digitization of Money:** Getting up to the present, we arrive at a relatively disturbing state of affairs: Widespread financial illiteracy, in a world dominated by the dollar, and printing excess amounts of money in the form of loans and debt obligations. Let’s pause here for a second and dig in on this, because it absolutely informs the environment which gave way to Bitcoin and crypto-currencies.
## Commercial Banking and Money Supply
As you will see from the recommended reading, there is a book titled _'Where does Money come From?'_ by Professor Richard Werner. And our interest in this book, beyond the structure of the Bank of England or the Federal Reserve, and the relationship between the Federal Reserve and commercial banking, is the nature of money and credit in society. This is summarized clearly in the following statement by the Bank of England:
_“Where does money come from? In the modern economy, most money takes the form of bank deposits. But how those bank deposits are created is often misunderstood. **The principal way in which they are created is through commercial banks making loans**: whenever a bank makes a loan, it creates a deposit in the borrower’s bank account, thereby **creating new money.** This description of how money is created differs from the story found in some economics textbooks.”([Bank of England](https://www.bankofengland.co.uk/-/media/boe/files/quarterly-bulletin/2014/money-creation-in-the-modern-economy.pdf))_
This is to say, that money supply, as controlled by Commercial Banks, is loosely enabled by a select few, within certain limited government currency schemes. In other words, the government has given the mandate of money creation to certain private banking institutions which it, in principle regulates.
And since the advent of the internet - these institutions – have been _printing_.
![](@site/static/img/bootcamp/mod-em-1.2.2.png)
So the hard truth is:
* 97% of current money supply is digitally created (not in paper form).
* We have printed an excessive amount in the last 20 years that is not correlated to reserves.
* And the game is gated under this legacy system of state denominated currency regulated by private banking institutions.
* Here is the game zoomed out over time.
![](@site/static/img/bootcamp/mod-em-1.2.3.png)
And there are a couple of things to say about this system fundamentally irrespective of how well it might have worked for you (or not):
* Money supply is controlled by first and foremost the government, and secondarily by private institutions.
* The individual is 100% dependent on both, in order to participate in the financial system.
* The system can eliminate or remove certain individuals or entities, based on the geopolitical control over the currency regime in question (think Iran being banned from SWIFT).
* Since the digitization of money, supply has drastically increased in the form of loans from commercial banks. This ‘up-only’ mentality has fueled speculation, bubbles, and hyper financialization.
## Introducing Bitcoin
_“The real exciting thing about this technology is not a blockchain. A blockchain is a database artifact created out of this protocol. No, the real excitement comes from the ability to achieve distributed consensus among parties that don’t trust each other, across great distances, without any central party, authority, or intermediary.”- (Andreas Antonopoulos, The Internet of Money, Volume III - 8)_
All of what has been mentioned has been a build-up and essential context towards understanding the value proposition of Bitcoin. It is only through this context, that one can come to grasp the underlying value proposition of Bitcoin, which is best understood through the following question:
“_Is government regulation of banking and money more harmful or helpful? This is the key hypothesis.”_
This is the clear picture of what Bitcoin is:
Bitcoin is a digital monetary system, in which the value of the system is maintained and ‘regulated’ by random operators plugged into the internet. These operators are known as miners, who compete in calculating complex calculations as a means of confirming the ‘state’ or ‘ledger’ of the system of money.
The value here is not in the technicalities of how the Bitcoin network works. That is what is indicated in the Antonopoulos quote:
**_“The real excitement comes from the ability to achieve distributed consensus among parties that don’t trust each other, across great distances, without any central party, authority, or intermediary.”_**
So for the first time in history, we have a monetary system governed in a ‘decentralized’ manner such that there is no national authority or commercial body regulating its supply and demand. But rather, the network has been programmed to operate a certain way such that there is no way a single authority could corrupt the network.
In practice, this leads to some entirely new possibilities and opportunities in the deinstitutionalization of money:
* (**Sovereignty**) Individuals can self-custody value, outside of the centralized institutions or regulated containers.
* (**Digital Reserve**) Countries now have alternatives to the reserve currency of the world (petro-dollar) - the thesis by Xapo Wences is that BTC is positioned to become the world’s reserve currency in the coming century.
* (**Money at the Speed and Access of the Internet**) Money can now move across borders and jurisdictions, on the internet, without oversight or interference from authorities.
_“**A System of Money on a Network:** A system of money that operates on a network. It is first and foremost a medium of exchange, a store of value, and one day (potentially) a unit of account. But it will never become a system of control: it refuses to become a system of control. In fact, its design principles are neutrality, openness, borderless access, censorship resistance.” (Antonopoulos, 48)_
For these reasons, Bitcoin has been referred to as digital gold. Or arguably, a digital manifestation of the theory behind gold and rare metals. There do remain some large questions vis a vis the long-term future of Bitcoin:
* Right now there are around 19.1mm BTC in existence. There will be a total supply of 21mm in around the year 2136. What happens then? Fees from the network are usually the most frequently cited answer.
**[IF TIME AND IN PERSON - BALAJI TALK to LEX ON HOW BITCOIN REINVENTS MONEY.]**
## Bitcoin and the Advent of Cryptocurrencies: Currencies vs. Tokens
With Bitcoin we have the advent of sovereign digital currencies - that is to say, forms of money tokenized on the internet, in such a way that they are used to pay for goods and services. This is the proper term for crypto-currencies - currencies that are cryptographically secured on a distributed ledger. Other forms of crypto-currencies include Litecoin, Blackcoin, and Ripple (XRP).
Separate from a crypto-currency is what is known as a token - or a digital token. These are _not_ crypto-currencies, but rather created on a distributed ledger for a different purpose: _for a specific utility, for governance, yield,_ or something else specific to a protocol or dApp built on a smart contract platform.
The conceptual distinction is to ask about the underlying nature of the system in question:
_Is there a utility for the token in question, or is the utility of the token to be used as an exchange or store of value? That is to say, is it a payment token, or does it serve a specific function in light of the nature it is created from and imbues value to?_
## The Importance of Crypto-Economics in A New World of Value
And herein lies the key principle which we will visit and revisit over and over again:
_In crypto- whether it be a crypto-currency or a crypto-token, the underlying paradigm shift is that of shifting the origin of value, from an entity in a system, to the system itself. And the token, as representative of a fraction of the system, enables participation among anyone who owns or manages the token._
That is to say, with Bitcoin, sovereign money is the result of a distributed network of Bitcoin nodes, maintaining and collaborating to manage the state of the Bitcoin network. This is in contrast to the Federal Reserve system whereby a single entity - the Federal Reserve chairman and board, dictates certain policies to the rest of the system.
With crypto-tokens, the same principle applies: When we are building a protocol or dApp on a smart contract platform, we are designing solutions which are value imbued with a fungible token of sorts (if these protocols and dApps are open source and public networks).
This principle is an allusion to the brand new world of crypto-economics, which refers to the design, implementation, and mechanics of a value-imbued system that offers a specific service to the world.
For now, we can conclude from this dichotomy that crypto is first and foremost a financial and political technology that happens to originate through code - and on the internet. And the story that crypto is inserting itself into tells us that it has originated at a very interesting time in economic history.
|
Stake Wars III is Halfway to the Finish Line: What Does This Mean for NEAR’s Sharding?
DEVELOPERS
August 19, 2022
The incredible Stake Wars III journey is still unfolding, but now is a good time to celebrate some important milestones. Let’s look at a few quick stats and updates, as the organizers move forward in this effort to prepare NEAR’s sharding to move from Phase 0 to 1.
Before we do that, there’s now a leaderboard where you can see the performance of the participants. As you can see, everyone in the top 25 has a 90% or above success rate in getting their chunks online. Great job!
During the program, shardnet, the testing network set up specifically for Stake Wars, frequently had more than 1,500 active nodes participating and 400 validators (chunk + block producers) competing across 11 successful challenges, with a few more lined up. The Stake Wars organizers will continue to operate Stake Wars exclusively in shardnet until the program ends on September 9th.
Stake Wars will not be moved to testnet as previously announced. Testnet is used as a final staging environment, and is part of the regular release cycle. Separating that cycle from Stake Wars will allow the team to focus on the community and not on the release requirements.
Major credit for Stake Wars’ success goes to the dedicated NEAR community and partners, without whom Stake Wars wouldn’t have been possible. Their thoughtful and continuous feedback shaped the program. Last but not least, a big shout-out to the entire Pagoda team, who stands tall and has been a backbone throughout the process.
Some Wins
Since the launch of Stake Wars III four weeks ago, the community has rallied a massive amount of participation–much more than previously anticipated. The stats have been stunning as shown above, and there is a tremendous amount of activity in the dedicated Discord channels, e.g. #stake-wars. Let’s keep this momentum up throughout the second part of the program all the way through to the finish line and beyond!
The 11 published challenges (competitions) have helped shed light on the work required to become a successful validator. They also created a battleground in shardnet for testing chunk-only producer code.
One of the most important takeaways so far is that Stake Wars is teaching community members the ins-and-outs of running successful validator nodes. It is also offering them a pathway, through stake support, to securing seats as mainnet validators.
Some Learnings
Win or lose, there’s always something to learn. In these four intense, amazing weeks, the Stake Wars organizers have faced a series of potential setbacks, but with the help of the community the team persevered, and NEAR is a stronger ecosystem for it.
The first of these challenges was the abuse of the distribution mechanism for the shardnet token. The Stake Wars organizers addressed this issue by switching over to a delegation model to protect the network. Although this created some additional friction for participants, it did have the silver lining of introducing them to the underlying staking and delegation models in mainnet.
The other issue the Stake Wars organizers faced was handling simultaneous setup and token-distribution, which warranted creating several hard forks. Fortunately, this decision essentially helped bring the network back into a functional state. While there is no expectation that these situations will be seen in mainnet, being able to “safely” learn from them during Stake Wars contributes to an optimal operational experience for program participants. Hence, running Stake Wars in a dedicated test network (shardnet) gives the team the flexibility to respond and adapt to the different situations that may be encountered throughout the duration of the program.
What Next? – Phase 1 of Sharding
In the run-up to the launch of Phase 1 of sharding, which is scheduled for next month, the chunk-only producer feature was released to testnet on Aug 15th. This will create more opportunities to earn rewards and further secure the NEAR Ecosystem. The chunk-only producer role is also a great onramp to becoming a validator for those who may not have the $NEAR or system requirements to run a full validator node, since chunk-only producers are solely responsible for single shards.
The number of validators will also increase from ~100 to ~300, and the seat price to become a validator will be lowered. This is a crucial step to better facilitate scaling, improve NEAR’s decentralization, and bring the Open Web to mass adoption. |
```bash
near call v2.keypom.near create_drop '{"public_keys": <PUBLIC_KEYS>, "deposit_per_use": "10000000000000000000000", "ftData": {"contractId": "ft.primitives.near", "senderId": "bob.near", "amount": "1"}}' --depositYocto 23000000000000000000000 --gas 100000000000000 --accountId bob.near
``` |
Community Update: NEAR EDU, Flux Beta Program, and GooGuild
COMMUNITY
August 4, 2020
Hello citizens of NEARverse! This is NiMA coming to you from a NEARby planet in the Milky Chain …
This week marks the beginning of a series of educational workshops that will help us onboard more web 2.0 and web 3.0 developers to the NEAR ecosystem. We would love to have you join us for the first workshop this Thursday with Sherif, our head of education. These workshops will be evolving based on your feedback and suggestions into a larger initiative that could onboard millions of developers to the most developer-friendly platform for building A Better Open Web.
Good news! We would like to have you join us on our official Telegram channel to talk all things NEAR with the rest of the community. We are re-opening the Telegram channel and will be hosting a series of AMAs with NEAR core team, backers and founders. First one will be this Saturday with Illia!
Use the hashtag #NEARAMA on Twitter or Telegram so we can find your questions and answer them during the AMA.
One last thing, don’t forget to upvote and comment on our Reddit scaling bakeoff submission 🙏✌️
Linking real world data to NEAR with Chainlink 🔗
We recently announced a partnership and integration with Chainlink, the leading decentralized oracle network in the Web3 space! We’re excited to see more DApps on NEAR connect to offchain data sources using Chainlink. This is particularly useful for DeFi and OpenFinance applications that require reliable price feeds. Here’s what our friends from Chainlink have to say about this integration:
“Chainlink integration provides NEAR with most versatile oracle network in the market, as well as the ability to connect to any off-chain API. These are key building blocks for connecting Web2 and Open Web environments, which ultimately allows our developers to build a much wider range of applications, ranging from decentralized financial products and NFTs, to asset tokenization and insurance contracts.”
You can read more about the Chainlink partnership here or dive deeper into TestNet repos and instructions on GitHub.
Welcoming Xoogler to the NEAR Guild program
We’re excited to welcome our friends from Xoogler.co to the NEAR Guild program. Xoogler is a group of Google alumni and current Googlers who have come together to help each other build and grow in the startup ecosystem. The Xoogler.co community consists of startup founders, early team members, angel investors, VCs, and mentors.
If you’re a Googler or Xoogler, make sure to get in touch with Rune, the Guild lead, to get involved in our joint activities and future ventures. Our first collaboration with Xoogler is going to be around educational workshops and an upcoming global NEAR hackathon! Beyond that, we are also excited to have experienced angel investors and builders from Xoogler grow our DApp ecosystem on NEAR.
MOAR Guilds … Designer Guild and Open Shard Alliance want you!
🎨 Designer Guild is looking for designers in the NEAR community …
The NEAR Design Guild is an independent community focused on crypto-related art and designs. As part of the Guild Program, the community is consistently contributing to the NEAR ecosystem and is looking for more like-minded artsy guild members to join their group. Get in touch with Bruce on Twitter to find out more!
The Open Shards Alliance (OSA) is recruiting new validators!
Are you interested in becoming a validator, but are unsure where to start or if you have the needed skills? The OSA can help you learn the skills necessary to become a validator on the NEAR decentralized network. Basic Linux or Windows PowerShell skills are preferred, but not required.
Join the OSA and help build the open web with NEAR Discord –> https://discord.gg/t9Kgbvf or directly email Henry –> [email protected]
One last thing … We would like to thank Henry, the leader of OSA for all his relentless support and welcome him to the NEAR collective in a more formal role and long term engagement with the NEAR Foundation. We hope to be able to collaborate with more Guild leads and community members in the future in a similar fashion. The best way to join the team building NEAR is by becoming an active member of our community.
NEAR Ecosystem updates: Flux is launching their Beta Program …
The team behind Flux are hard at work preparing for the phase 1 of their launch. They’re also launching a Beta Program for developers interested in building on Flux. Here’s a short snippet from their latest newsletter, but for more Fluxy goodness go to Substack to subscribe to their future updates!
“Over July, we’ve taken considerable strides toward Phase 1 of our main-net launch. Phase 1 will consist of Flux Protocol on the main-net with the introduction of DAI ( a stable coin pegged to the USD ) for trading. Phase 1 will also feature the update of our open-source app and SDK to enable developers to start preparing for testing and eventual launch on Flux. In addition to tech updates, we have many exciting programs and communities popping up on and around Flux to support our ecosystem.”
This week we are watching, reading, and listening to …
We released Alex’s white board session with Michael from Starkware → youtu.be/JIlLAxFUcwA
StarkWare Co-founder and Chief Architect, Michael Riabzev and NEAR Co-Founder & CTO, Alex Skidanov discuss StarkEx, STARKs vs. SNARKs, and do a deep dive into Stark’s algebra.
StarkWare solves the inherent problems of blockchains – scalability and privacy. They develop a full proof stack, using STARK technology to generate and verify proofs of computational integrity. StarkWare’s cryptographic proofs are zero-knowledge, succinct, transparent and post-quantum secure.
Open Web Collective published their latest episode with Jesse Walden of Variant Fund (ex-A16Z) where they talk about Ownership Economy which is the backbone of Jesse’s thesis for his new fund.
Listen on Spotify or Apple Podcasts
That’s it for this update. See you in the next one!
Your friends,
NiMA and the NEAR team |
NEAR’s Road to Decentralization: Building Bridges
NEAR FOUNDATION
March 14, 2022
The NEAR community’s big-picture vision has always been straightforward — onboard billions of people into Web3. In this new iteration of the Internet, blockchain platforms and apps will be decentralized, open-source, transparent, and humane. This Open Web will be full of utility, encouraging maximum participation from developers and end-users.
Web3 won’t be dependent on central authorities to access services like social media, games, Metaverse experiences, and more. Decentralization, coupled with permissionless systems (i.e, open or public smart contracts), will give people a greater sense of ownership, control, and protection over their data in alignment with the financial outcomes of decentralized apps (dapps). With this regulatory foundation—or governance, as it is referred to in crypto—humanity can evolve into a new era of borderless economic opportunity for all.
A future built on creativity, community, and inclusivity requires a core based on interoperability between blockchains. There is no single blockchain that will own Web3. NEAR and other blockchain communities understand that the future is multi-chain, and bridges and decentralization are critical to getting there.
Rainbow Bridge: the first step toward a multi-chain future
Enter Rainbow Bridge, one of the first fully permissionless, decentralized, and openly accessible bridges in the crypto-sphere. Almost a year after launch, Rainbow Bridge has more than a dozen projects utilizing this technical infrastructure.
In connecting NEAR and Ethereum, Rainbow Bridge is a major first step toward the multi-chain future, allowing developers to use NEAR’s fast, simple, and low-cost transaction network to move assets back and forth between the two protocols. Crucially, Rainbow Bridge gives NEAR users access to familiar projects on Ethereum.
The development of Rainbow Bridge evolved, as many blockchain projects do, out of a necessity to solve an immediate engineering problem: how does a project on one blockchain migrate its services to another without having to rebuild the entire product from scratch? Thanks to NEAR’s Rainbow Bridge, any token or dapp that uses Ethereum’s Virtual Machine can run on NEAR. With this capability in place, the NEAR ecosystem is able to integrate ERC-20 tokens with native NEAR solutions. Anyone can use NEAR as a technical backend while operating in the Ethereum ecosystem. The NEAR-ETH Rainbow Bridge is entirely permissionless and decentralized. No single entity controls it.
With Rainbow Bridge, NEAR effectively becomes a collaborative Layer 2 solution for existing applications and assets deployed on Ethereum. In addition to solving the problem of high gas fees and barriers to entry and scaling, NEAR brings about an additional value proposition: it’s incredibly easy to use and simple to get to grips with.
Rainbow Bridge furthers NEAR’s larger goal of advancing Web3 to create a more open and inclusive internet and financial ecosystem. The key is to empower NEAR community members to operate in a high-quality, scalable, and self-sufficient manner.
Pagoda: accelerating ideation to application for Web3
For developers looking to use Rainbow Bridge or other bridges, the NEAR community offers a number of projects, tools, and resources to get started.
Pagoda, the first-ever Web3 startup platform, gives developers and entrepreneurs access to interactive tutorials, scalable infrastructure, and operational analytics that are Web3 native. Using Pagoda’s Developer Console, tools and resources can be found at every layer of the stack, from the base blockchain protocol all the way up to the application and marketing and distribution layers. Aurora, the builders of Rainbow Bridge, also offers a number of resources on their website to developers looking to build atop their EVM.
With this technical foundation, NEAR has already begun its journey of creating a network of interoperable blockchains.
Other bridges and collaborative partnerships
NEAR is, at its core, an ecosystem. The technology is just one part of the equation, with the platform’s true infrastructure being its self-directed, decentralized community. Human decisions are the heartbeat of NEAR and its partnerships.
A number of exciting projects are already showcasing the collaborative efforts happening within the vibrant NEAR ecosystem. Let’s take a look at a few of them.
Aurora
A turnkey solution for developers seeking to extend their dapps to reach additional markets, Aurora uses a number of NEAR’s core technologies, including sharding and developer gas fee remuneration.
The solution consists of two core components. The first is the Aurora Engine, which allows for the seamless deployment of Solidity and Vyper smart contracts. The second is the Aurora Bridge. Based on NEAR’s Rainbow Bridge technology, Aurora Bridge enables the permissionless transfer of tokens and data between Ethereum and Aurora.
Allbridge
In December 2021, Allbridge announced an integration between Aurora and Terra, an open-source stablecoin network controlled by its stakeholders. The connection between the two networks brings Terra liquidity to the Aurora ecosystem.
As a modular and expanding token bridge with on-chain consensus, Allbridge provides a fast, affordable, and secure way to move liquidity between EVM, non-EVM, and L2 blockchains. Allbridge’s mission is to make the blockchain world borderless, with the ability to freely move assets across different networks.
Octopus Network
A decentralized network natively built on NEAR Protocol, Octopus provides out-of-box security, interoperability, and on-chain governance to projects that are creating application-specific blockchains (appchains) for their open web applications.
Built around the $OCT Token, Octopus Network functions as a composable means by which applications can operate with different security parameters on NEAR. Octopus Network Relay runs a smart contract on NEAR, which provides infrastructure for a validator network that brings partners together.
WOO Network
In December 2021, NEAR Foundation and WOO Network completed a $5M token swap to create a partnership between the two projects. Think of this partnership, in which NEAR and WOO Network share tokens, as a symbolic bridge of sorts.
For the NEAR community, WOO Network provides a liquidity ecosystem that connects traders, exchanges, institutions, and DeFi platforms with access to liquidity, trading execution, and yield-generating strategies. For WOO Network, NEAR provides a carbon-neutral, high-scalability platform built on a proof-of-stake, layer-1 blockchain.
Roadmap for connecting new networks
The future is collaborative — a web of blockchain protocols and decentralized projects working together to build Web3. While NEAR wants to be the onramp for a billion users, the community also wants to help enable all ships to rise to meet the challenge of onboarding the world to Web3. And the NEAR ecosystem has the grants funding, technical infrastructure, and community resources and tools to help make this happen.
In October 2021, the NEAR ecosystem announced an $800M grants fund to accelerate the development of its blockchain. Of this $800M, the largest allocation — $350M from Proximity Labs — will go towards developing the protocol’s DeFi sector. A DeFi DAO will govern how these funds are spent. These grant funds will be absolutely crucial in connecting with new networks for the multi-chain Web3 future.
On March 2, 2022, NEAR announced a new partnership with Multichain, the most popular bridging platform in the crypto space, which supports more than 1,600 tokens, hundreds of millions of dollars in daily transactions, and 540,000 users. As a bridge operator, Multichain has developed the technology for all blockchains to interoperate. With this partnership, NEAR’s user-friendly interfaces and tools are now extendable across the entire Web3 ecosystem. The partnership is a meaningful step forward towards the advancement of a global network of open Web3 platforms.
A decentralized future begins with the right infrastructure. NEAR is committed to enabling the tools to ground this vision in reality, right now.
Introducing Endlesss: Gamifying Music Creation with NFTs
COMMUNITY
July 1, 2022
Endlesss, the creation, marketing, and community building platform built for musicians, has just launched on NEAR mainnet. As music production software with social media features, Endlesss is a virtual space for musicians, fans, and collectors alike. With the app, musicians of all skill levels can make, share, collaborate, and collect music in a social way. And perhaps most importantly, its technology reintroduces musical improvisation and collaboration into the cultural landscape.
As part of the launch, Endlesss announced a new feature called “Bands”—an NFT marketplace powered by NEAR’s NEP-171 NFT standard. Endlesss debuted the Bands NFT marketplace at NYC.NFT, where attendees got the chance to make and mint tunes on Endlesss with various controllers. The team’s booth even included a MIDI-enabled Endlesss arcade game, which was a hit with the NYC.NFT crowd.
Before launching Endlesss, founder Tim Exile had several other careers. Trained as a violinist, Exile cut records for Warp Records and Planet Mu, two esteemed electronic music labels. Exile (real name Tim Shaw) also developed plugins for music software company Native Instruments, which paved the way for Endlesss.
Exile launched Endlesss as a mobile app in 2020, then followed it up with a desktop version in 2021. Over the past year, Exile and team have been integrating Endlesss with NEAR. And now, Endlesss is officially Web3-ready.
“NEAR offers great developer tools, a super-fast reliable network, and great Web3 onboarding through the linkdrop contract and human-readable wallet addresses,” says Exile.
NEAR Foundation recently caught up with Tim Exile, fresh off of the NYC.NFT announcement, to talk about Endlesss’ new NFT feature.
How collaborative jams work on Endlesss
Before diving into the Endlesss NFT marketplace, let’s first explore how the music software works.
After downloading the Endlesss app, users see a music production interface. This will be familiar to anyone who has used mobile music production apps or software like Ableton, Fruity Loops, or Native Instruments. With Endlesss, the twist is that it optimizes sharing and collaboration: it’s an amalgam of music production software and social media, with e-commerce functionality baked in.
Endlesss gives musicians the tools to start “Jams” (Endlesss-speak for “songs”), which Exile describes as something like musical chat groups.
“You can share that jam with friends anywhere, and they can dive in, download the app on iOS, MacOS, or Windows, and you can jam with people live from all over the world,” Exile explains. “And as you jam together, you build an entire history of everything that happened in that jam. So it’s like music creation as storytelling because as the jam evolves, you can look back to see and listen to how the music unfolds as the jam progresses.”
“Endlesss is really the culmination of my life’s work because I really love improvising,” says Exile. “Before the music industry got going 150 years ago, music was an activity we came together to do. It was something that cemented our social bonds, gave us that kind of ritual resonance that allowed us to transmit, propagate, and mutate these forms through folk music. And so music was very closely attached to the myths and legends that kept us together as a society. That’s the thing that we’re building instead of selling a million copies of one song.”
But if musicians want to jam solo, they can do that as well. Musicians can create solo Jams and share them as complete pieces with other musicians and collectors on Endlesss.
“The future of media is a game, not an industry,” he adds. “And games are designed as journeys, but they’re not prescriptive. Those are really the principles upon which we built Endlesss.”
Under the hood of Endlesss’s NFT marketplace
To integrate with NEAR, Endlesss first had to build an in-browser user experience. Exile reckons that 50% of the team’s work went into optimizing the front and back ends to deliver this seamless in-browser feature.
“We needed to develop a web audio player and API endpoints,” says Exile. “Obviously, we needed to do a really good job of designing the front end to make it look and feel responsive and slick in the browser, on desktop and mobile. But the real magic, the real secret sauce is how we then take assets that have been created and stored on Web2 servers and turn them into Web3 immutable assets.”
From a technical standpoint, musicians can share a Jam with anyone on the internet with a browser link. After clicking on the URL, users can see the jam in their browser, complete with the entire history of the Jam’s Riffs. And if the Jam unfolds in real time, Endlesss users see others adding Riffs as it all happens. Collectors can then click on any Riff that they’re listening to and trigger an NFT minting process.
The minting process begins with the bundling of all associated media assets into a fully-executable mini-DAW which loads in an iframe. This includes web audio code, web visualizer code, all metadata, all audio stems, and an image.
“That’s shipped off to Arweave where it’s stored on the permaweb,” Exile explains. “We get the hash back from Arweave, then mint the NFT using the NEAR NFT standard.”
If users connect their NEAR Wallet to Endlesss, a single click executes the NFT mint and purchase. Users simply hit “Collect” and then approve the transaction.
“We had to do a bit of code dancing on our back end,” Exile says. “So, we actually have a contract that mints a token on our behalf, then transfers the token to users.”
On the creator side, the royalty split is automatic, thanks to NEAR’s NFT standard. This was a major factor in Endlesss’ decision to build on NEAR.
“One of the real strengths of the NEAR NFT standard is that it supports collaborative splits,” he adds. “It’s baked into the smart contract, which is not the case with ERC-721 because any collaborative splits are held at the marketplace level. So, you can take that asset and list it on another marketplace and those collaborative splits won’t be on it.”
“Endlesss is deeply predicated on collaboration, so collaborative splits executed properly was very, very important for us.”
Public launch and upcoming features
The soft launch of Endlesss’ NFT marketplace has been live on NEAR mainnet for only a few weeks. So, it’s still a work-in-progress. But if the warm reception at NYC.NFT was any indication, the Web3 community already loves it.
Not only does Endlesss fuse the Web2 and Web3 worlds, but it can also integrate with hardware. Few Web3 apps can integrate with hardware like MIDI keyboards and controllers for the truly tactile user experience Endlesss offers.
“Before we shipped Bands, on Endlesss you could create jams solo or with other people around the world, you could jam in real time, you could jam asynchronously,” Exile explains. “And every jam has a full history of everything that was created in that jam. And what we did when we were building on NEAR was to basically build an NFT marketplace where all the Riffs in that Jam could be minted as an NFT.”
“We seeded it out to 30 people just really to understand what the behavior is, what the demand is, and what works,” says Exile. “And we’re already learning a huge amount about the dynamics, the curiosity, and the patterns of behavior of collection.”
In this closed Beta phase, the plan is to tweak Bands’ features to optimize the market dynamics of supply and demand. Endlesss will also give creators more tools to select just some of the Riffs instead of making every one fully available by default.
Exile says the team will add more curation features to the Bands NFT marketplace. The team is already at work on a feature that would make NFTs remixable. In other words, someone could remix an NFT Riff they collected, then mint their remix as a new NFT.
“We’ve got some features that we already know we’re going to build into Bands to help control the supply,” he says. “That basically means that if you want to be around to collect the best Riffs (from Jams), you really have to be there while they’re happening because we’re probably going to put this ‘timeout’ feature where they’re only mintable for the first few hours, for example.”
“We’ll probably do an official public launch of Bands around September, I imagine with a whole bunch of other creator features.”
---
id: transfers
title: Transferring Fungible Tokens
sidebar_label: Transferring FTs
---
import {Github} from "@site/src/components/codetabs"
In this tutorial, you'll learn how to implement the [core standards](https://nomicon.io/Standards/Tokens/FungibleToken/Core) into your smart contract. You'll implement the logic that allows you to transfer and receive tokens. If you're joining us for the first time, feel free to clone [this repository](https://github.com/near-examples/ft-tutorial) and follow along in the `4.storage` folder.
:::tip
If you wish to see the finished code for this _Core_ tutorial, you can find it in the `5.transfers` folder.
:::
## Introduction {#introduction}
Up until this point, you've created a simple FT smart contract that allows the owner to mint a total supply of tokens and view information about the Fungible Token itself. In addition, you've added the functionality to register accounts and emit events. Today, you'll expand your smart contract to allow for users to transfer and receive fungible tokens.
The logic for doing a simple transfer is quite easy to understand. Let's say Benji wants to transfer Mike 100 of his fungible tokens. The contract should do a few things:
- Check if Benji owns at least 100 tokens.
- Make sure Benji is calling the function.
- Ensure Mike is registered on the contract.
- Take 100 tokens out of Benji's account.
- Put 100 tokens into Mike's account.
At this point, you're ready to move on and make the necessary modifications to your smart contract.
## Modifications to the contract
Let's start our journey in the `src/ft_core.rs` file.
### Transfer function {#transfer-function}
You'll start by implementing the `ft_transfer` logic which is found in the `src/ft_core.rs` file. This function will transfer the specified `amount` to the `receiver_id` with an optional `memo` such as `"Happy Birthday Mike!"`.
<Github language="rust" start="60" end="72" url="https://github.com/near-examples/ft-tutorial/blob/main/5.transfers/src/ft_core.rs" />
There are a couple of things to notice here.
1. We've introduced a new function called `assert_one_yocto()`. This method will ensure that the user is signing the transaction with a full access key by requiring a deposit of exactly 1 yoctoNEAR, the smallest possible amount of $NEAR that can be transferred. Since the transfer function is potentially transferring very valuable assets, you'll want to make sure that whoever is calling the function has a full access key. A minimal sketch of this check is shown after this list.
2. We've introduced an `internal_transfer` method. This will perform all the logic necessary to transfer the tokens internally.
### Internal helper functions
Let's quickly move over to the `src/internal.rs` file so that you can implement the `internal_transfer` method, which is the core of this tutorial. This function will take the following parameters:
- **sender_id**: the account that's attempting to transfer the tokens.
- **receiver_id**: the account that's receiving the tokens.
- **amount**: the amount of FTs being transferred.
- **memo**: an optional memo to include.
The first thing you'll want to do is make sure the sender isn't sending tokens to themselves and that they're sending a positive number. After that, you'll want to withdraw the tokens from the sender's balance and deposit them into the receiver's balance. You've already written the logic to deposit FTs by using the `internal_deposit` function.
Let's use similar logic to implement `internal_withdraw`:
<Github language="rust" start="29" end="40" url="https://github.com/near-examples/ft-tutorial/blob/main/5.transfers/src/internal.rs" />
In this case, the contract will get the account's balance and ensure they are registered by calling the `internal_unwrap_balance_of` function. It will then subtract the amount from the user's balance and re-insert the new balance into the map.
Using the `internal_deposit` and `internal_withdraw` functions together, the core of the `internal_transfer` function is complete.
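To make the flow concrete, here is a minimal, dependency-free sketch of how `internal_transfer` composes the two helpers, assuming a plain balances map in place of the SDK collections used in the tutorial. The real implementation lives in `5.transfers/src/internal.rs` and, as the next paragraph explains, also emits an event.

```rust
use std::collections::HashMap;

// Dependency-free sketch of the transfer flow: withdraw from the sender,
// deposit to the receiver, with basic sanity checks. Field and method names
// mirror the tutorial, but the storage here is a plain HashMap for illustration.
struct Contract {
    accounts: HashMap<String, u128>,
}

impl Contract {
    fn internal_withdraw(&mut self, account_id: &str, amount: u128) {
        let balance = *self.accounts.get(account_id).expect("Account not registered");
        let new_balance = balance.checked_sub(amount).expect("Not enough balance");
        self.accounts.insert(account_id.to_string(), new_balance);
    }

    fn internal_deposit(&mut self, account_id: &str, amount: u128) {
        let balance = *self.accounts.get(account_id).expect("Account not registered");
        let new_balance = balance.checked_add(amount).expect("Balance overflow");
        self.accounts.insert(account_id.to_string(), new_balance);
    }

    fn internal_transfer(&mut self, sender_id: &str, receiver_id: &str, amount: u128) {
        // Sanity checks: no self-transfers and a positive amount.
        assert_ne!(sender_id, receiver_id, "Sender and receiver should be different");
        assert!(amount > 0, "The amount should be a positive number");
        self.internal_withdraw(sender_id, amount);
        self.internal_deposit(receiver_id, amount);
    }
}

fn main() {
    let mut contract = Contract {
        accounts: HashMap::from([("benji.near".to_string(), 100), ("mike.near".to_string(), 0)]),
    };
    contract.internal_transfer("benji.near", "mike.near", 100);
    assert_eq!(contract.accounts["mike.near"], 100);
}
```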
There's only one more thing you need to do. When transferring the tokens, you need to remember to emit a log as per the [events](https://nomicon.io/Standards/Tokens/FungibleToken/Event) standard:
<Github language="rust" start="42" end="67" url="https://github.com/near-examples/ft-tutorial/blob/main/5.transfers/src/internal.rs" />
Now that this is finished, the simple transfer case is done! You can now transfer FTs between registered users!
### Transfer call function {#transfer-call-function}
In this section, you'll learn about a new function `ft_transfer_call`. This will transfer FTs to a receiver and also call a method on the receiver's contract all in the same transaction.
Let's consider the following scenario. An account wants to transfer FTs to a smart contract for performing a service. The traditional approach would be to perform the service and then ask for the tokens in two separate transactions. If we introduce a “transfer and call” workflow baked into a single transaction, the process can be greatly improved:
<Github language="rust" start="74" end="104" url="https://github.com/near-examples/ft-tutorial/blob/main/5.transfers/src/ft_core.rs" />
This function will do several things:
1. Ensure the caller attached exactly 1 yoctoNEAR for security purposes.
2. Transfer the tokens using the `internal_transfer` function you just wrote.
3. Create a promise to call the method `ft_on_transfer` on the `receiver_id`'s contract.
4. After the promise finishes executing, call the function `ft_resolve_transfer` to handle the result.
:::info
This is a very common workflow when dealing with cross contract calls. You first initiate the call and wait for it to finish executing. Then, you invoke a function that resolves the result of the promise and act accordingly.
Learn more [here](../../2.build/2.smart-contracts/anatomy/crosscontract.md).
:::
When the receiver's `ft_on_transfer` finishes, it returns how many tokens the contract should refund to the original sender.
This is important for a couple of reasons:
1. The receiver might not need all the FTs that were sent, in which case their contract can return the excess to be refunded.
2. If any of the logic panics, all of the tokens should be refunded back to the sender.
This logic will all happen in the `ft_resolve_transfer` function:
<Github language="rust" start="174" end="221" url="https://github.com/near-examples/ft-tutorial/blob/main/5.transfers/src/ft_core.rs" />
The first thing you'll do is check the status of the promise. If anything failed, you'll refund the sender for the full amount of tokens. If the promise succeeded, you'll extract the amount of tokens to refund the sender based on the value returned from `ft_on_transfer`. Once you have the amount needed to be refunded, you'll perform the actual refund logic by using the `internal_transfer` function you wrote previously.
You'll then return the net amount of tokens that were transferred to the receiver. If the sender transferred 100 tokens but 20 were refunded, this function should return 80.
With that finished, you've now successfully added the necessary logic to allow users to transfer FTs. It's now time to deploy and do some testing.
## Deploying the contract {#redeploying-contract}
Let's create a new sub-account to deploy the contract to. Since these changes are only additive and not state-breaking, you could also have deployed a patch fix to the contract you deployed in the storage section. To learn more about upgrading contracts, see the [upgrading a contract](/tutorials/nfts/upgrade-contract) section in the NFT Zero to Hero tutorial.
### Creating a sub-account
Run the following command to create a sub-account `transfer` of your main account with an initial balance of 25 NEAR, which will be transferred from the original to your new account.
```bash
near create-account transfer.$FT_CONTRACT_ID --masterAccount $FT_CONTRACT_ID --initialBalance 25
```
Next, you'll want to export an environment variable for ease of development:
```bash
export TRANSFER_FT_CONTRACT_ID=transfer.$FT_CONTRACT_ID
```
Using the build script, build and deploy the contract as you did in the previous tutorials:
```bash
cd 1.skeleton && ./build.sh && cd .. && near deploy --wasmFile out/contract.wasm --accountId $TRANSFER_FT_CONTRACT_ID
```
:::tip
If you haven't completed the previous tutorials and are just following along with this one, simply create an account and log in with your CLI using `near login`. You can then export an environment variable `export TRANSFER_FT_CONTRACT_ID=YOUR_ACCOUNT_ID_HERE`. In addition, you can find the contract code by going to the `5.transfers` folder. Instead of building using `1.skeleton`, you can build by going to the `5.transfers` folder and running `./build.sh`.
:::
### Initialization {#initialization}
Now that the contract is deployed, it's time to initialize it and mint the total supply. Let's once again create an initial supply of 1000 `gtNEAR`.
```bash
near call $TRANSFER_FT_CONTRACT_ID new_default_meta '{"owner_id": "'$TRANSFER_FT_CONTRACT_ID'", "total_supply": "1000000000000000000000000000"}' --accountId $TRANSFER_FT_CONTRACT_ID
```
You can check if you own the FTs by running the following command:
```bash
near view $TRANSFER_FT_CONTRACT_ID ft_balance_of '{"account_id": "'$TRANSFER_FT_CONTRACT_ID'"}'
```
### Testing the transfer function
Let's test the transfer function by transferring 1 `gtNEAR` from the owner account to the account `benjiman.testnet`.
:::note
The FTs won't be recoverable unless the account `benjiman.testnet` transfers them back to you. If you don't want your FTs lost, make a new account and transfer the token to that account instead.
:::
You'll first need to register the account `benjiman.testnet` by running the following command.
```bash
near call $TRANSFER_FT_CONTRACT_ID storage_deposit '{"account_id": "benjiman.testnet"}' --accountId $TRANSFER_FT_CONTRACT_ID --amount 0.01
```
Once the account is registered, you can transfer the FTs by running the following command. Take note that you're also attaching exactly 1 yoctoNEAR by using the `--depositYocto` flag.
```bash
near call $TRANSFER_FT_CONTRACT_ID ft_transfer '{"receiver_id": "benjiman.testnet", "amount": "1000000000000000000000000", "memo": "Go Team :)"}' --accountId $TRANSFER_FT_CONTRACT_ID --depositYocto 1
```
You should see the `FtTransferEvent` being emitted in the console. At this point, if you check for the total supply, it should still be 1000 `gtNEAR` but if you check both the balance of Benji and the balance of the owner, they should reflect the transfer.
```bash
near view $TRANSFER_FT_CONTRACT_ID ft_balance_of '{"account_id": "'$TRANSFER_FT_CONTRACT_ID'"}'
```
```bash
near view $TRANSFER_FT_CONTRACT_ID ft_balance_of '{"account_id": "benjiman.testnet"}'
```
```bash
near view $TRANSFER_FT_CONTRACT_ID ft_total_supply
```
### Testing the transfer call function
Now that you've tested the `ft_transfer` function, it's time to test the `ft_transfer_call` function. If you try to transfer tokens to a receiver that does **not** implement the `ft_on_transfer` function, the contract will panic and the FTs will be **refunded**. Let's test this functionality below.
You can try to transfer the FTs to the account `no-contract.testnet` which, as the name suggests, doesn't have a contract. This means that the receiver doesn't implement the `ft_on_transfer` function and the FTs should remain yours after the transaction is complete. You'll first have to register the account, however.
```bash
near call $TRANSFER_FT_CONTRACT_ID storage_deposit '{"account_id": "no-contract.testnet"}' --accountId $TRANSFER_FT_CONTRACT_ID --amount 0.01
```
```bash
near call $TRANSFER_FT_CONTRACT_ID ft_transfer_call '{"receiver_id": "no-contract.testnet", "amount": "1000000000000000000000000", "msg": "foo"}' --accountId $TRANSFER_FT_CONTRACT_ID --depositYocto 1 --gas 200000000000000
```
The output response should be as follows.
```bash
Scheduling a call: transfer.dev-1660680326316-91393402417293.ft_transfer_call({"receiver_id": "no-contract.testnet", "amount": "1000000000000000000000000", "msg": "foo"}) with attached 0.000000000000000000000001 NEAR
Doing account.functionCall()
Receipts: AJ3yWv7tSiZRLtoTkyTgfdzmQP1dpjX9DxJDiD33uwTZ, EKtpDFoJWNnbyxJ7SriAFQYX8XV9ZTzwmCF2qhSaYMAc, 21UzDZ791pWZRKAHv8WaRKN8mqDVrz8zewLWGTNZTckh
Log [transfer.dev-1660680326316-91393402417293]: EVENT_JSON:{"standard":"nep141","version":"1.0.0","event":"ft_transfer","data":[{"old_owner_id":"transfer.dev-1660680326316-91393402417293","new_owner_id":"no-contract.testnet","amount":"1000000000000000000000000"}]}
Receipt: 5N2WV8picxwUNC5TYa3A3v4qGquQAhkU6P81tgRt1UFN
Failure [transfer.dev-1660680326316-91393402417293]: Error: Cannot find contract code for account no-contract.testnet
Receipt: AdT1bSZNCu9ACq7m6ynb12GgSb3zBenfzvvzRwfYPBmg
Log [transfer.dev-1660680326316-91393402417293]: EVENT_JSON:{"standard":"nep141","version":"1.0.0","event":"ft_transfer","data":[{"old_owner_id":"no-contract.testnet","new_owner_id":"transfer.dev-1660680326316-91393402417293","amount":"1000000000000000000000000","memo":"Refund"}]}
Transaction Id 2XVy8MZU8TWreW8C9zK6HSyBsxE5hyTbyUyxNdncxL8g
To see the transaction in the transaction explorer, please open this url in your browser
https://testnet.nearblocks.io/txns/2XVy8MZU8TWreW8C9zK6HSyBsxE5hyTbyUyxNdncxL8g
'0'
```
There should be a transfer event emitted for the initial transfer of tokens and then also for the refund. In addition, `0` should have been returned from the function because the sender ended up transferring net 0 tokens to the receiver since all the tokens were refunded.
If you query for the balance of `no-contract.testnet`, it should still be 0.
```bash
near view $TRANSFER_FT_CONTRACT_ID ft_balance_of '{"account_id": "no-contract.testnet"}'
```
Hurray! At this point, your FT contract is fully complete and all the functionality is working as expected. Go forth and experiment! The world is your oyster and don't forget, go team!
## Conclusion
In this tutorial, you learned how to expand a FT contract by adding ways for users to transfer FTs. You [broke down](#introduction) the problem into smaller, more digestible subtasks and took that information and implemented both the [FT transfer](#transfer-function) and [FT transfer call](#transfer-call-function) functions. In addition, you deployed another [contract](#redeploying-contract) and [tested](#testing-changes) the transfer functionality.
In the [next tutorial](/tutorials/fts/marketplace), you'll learn about how an NFT marketplace can operate to purchase NFTs by using Fungible Tokens.
Why NEARCON is the Must-See Global Web3 Event of 2023
NEAR FOUNDATION
November 1, 2023
Just less than a week until Lisbon rumbles as the epicenter of the NEAR ecosystem. Step into the open web at NEARCON ‘23 across three different stunning venues, with the Convento Do Beato being the main hub for things like founder and builder talks. Rua Pereira will pop off with creative and community action, and hardcore dev building will be going down at the Hacker HQ.
Don’t miss what’s next in NEAR’s vision for an open web that’s scalable, user-friendly, and secure. NEAR has grown to over 30 million active accounts and over 1,000 projects, and NEARCON ‘23 will showcase all these ecosystem successes, along with a slate of brand new product and ecosystem announcements.
Here’s exactly why you’ll want to be in Lisbon starting on November 7th.
A track for everyone at NEARCON ‘23
This year’s edition promises to be even more iconic than the last, with diverse tracks for any open web enthusiast, along with a melting pot of cultural and social activities to spice up the nights — Portuguese style.
No matter who you are, one of these tracks is likely tailor made for your interests:
Developers: Sharpen your skills in open web technology with hands-on workshops and coding sessions. This is your chance to get a first-hand look at the newest innovations in the blockchain space.
Entrepreneurs: Listen to industry leaders as they offer game-changing insights, actionable strategies, and practical advice you need to transform your vision into the next unicorn startup.
Creatives: Explore how blockchain technology can revolutionize art, marketing, and beyond. You’ll walk away with new perspectives on how to infuse blockchain into your creative process.
Regulators: Blockchain regulations can seem like a daunting maze of laws and policies. Get the low-down on best practices, legal frameworks, and the future of blockchain regulation from experts in the field.
Hackathons, bounty rewards, and AI at the bleeding edge
If you’re a developer, fling yourself into an intense 48-hour NEARCON hackathon where you’ll get the chance to build next-level apps for NEAR and potentially see your project in the spotlight. Developers can also earn NCON tokens through bounties, redeemable for perks from food to merch. Also, don’t miss the much anticipated “AI is NEAR” track led by AI and machine learning OGs Illia Polosukhin and Alex Skidanov, exploring where AI meets the Open Web.
So whether you’re an AI head who craves the latest insights from the baddest machine learning experts or a curious founder who delights in brainstorming over a local merlot at one of the many super cool side events, NEARCON ‘23 is sure to be memorable and valuable.
With tracks tailored to suit every interest, ecosystem milestones to celebrate, and a hackathon that brings the brightest minds together, NEARCON ‘23 is the can’t miss event of the year for anyone curious, passionate (or both) about what’s next for the open web.
So what are you waiting for? Sign up for NEARCON ’23 now to secure your spot!
And let’s not forget some special treatment is in store for Ukrainians, students in Spain and Portugal, and hackathon registrants. All of the above can head over to those pages and register for a totally free NEARCON ‘23 pass!
2020 NEAR In Review & Ecosystem Tour
COMMUNITY
December 23, 2020
Most of us won’t be sorry to say goodbye to 2020, but this year was a big one for NEAR. From Ready Layer One to announcing the NEAR Foundation, Hacking the Rainbow to launching the NEAR Mainnet, there are many milestones to be proud of. So, to every developer, staker, validator, builder, maker, and contributor, thank you for being a part of this community and building the Open Web with us.
Why not spend some holiday downtime trying out cool things around the NEARverse? We’ll get you started. If you’re new to the ecosystem or want a refresher, try the Beginner’s Guide deep dive. Or choose your own adventure below: Learn for watching and reading, Build for developer onboarding, Try to play around with live projects, Join for Community and Guild resources, and Give to share the love.
Learn 👩💻
Amazing projects are already taking advantage of NEAR’s building experience. Check out the use cases page for all the details on how Flux Protocol, Mintbase, and Zed.Run are leveraging NEAR’s speed and low-cost development to grow their user communities in ways that weren’t possible before.
Electric Capital released the 2020 edition of their Developer Report this month, and it features lots of excellent NEAR growth stats. Some highlights: NEAR grew its developer community by 3x, is one of the top 10 gainers of devs, and has about the same number of developers as Ethereum did at this point in its growth timeline!
The NEAR Whiteboard Series features a technical leader from another protocol in casual conversation with NEAR engineers—comparing approaches, unpacking concepts, and discussing challenges. Recent episodes feature Alexey Akhunov on TurboGeth and Adrian Manning from Sigma Prime.
Build 🛠️
To dive in and start tinkering with some example apps, head to NEAR.dev. For a high-level overview of how applications are put together on NEAR and a suggested path you can use to learn how to build your own, head to the Building Applications page.
For more in-depth intro content and tutorials to guide you on your way, check out our educational NEAR 101 video and NEAR 102 for ETH developers. You’ll find lots of other onboarding resources on our YouTube if you fall down the rabbit hole!
Join us on Discord to reach out, ask questions, get the latest news, and find fellow builders to exchange ideas and collaborate. We’re here to help.
Try 🤳
There are tons of cool applications live on NEAR today! The OG NEAR app, Berry Club, is a yield farming app that lets you draw with pixels and earn 🥑 tokens. Start eating and climb that leaderboard. (Fun fact: the first version was built in December 2018, making Berry Club even older than the NEAR TestNet! Happy 2nd Birthday, Berry!)
Paras, now live on MainNet, is an NFT digital art card marketplace powered by NEAR and IPFS. Collectors can discover and collect art cards built with tech that prevents forgery and provides provable ownership. Artists create digital art cards and sell them on the marketplace in just a few clicks; active artists on the platform have earned thousands of NEAR in just the first two weeks!
Is Santa bringing you a new phone this year? Glyde provides buyers and sellers of used mobile phones a higher level of transparency, security, and trust – and uses NEAR smart contracts to minimize fraud.
Join 🤝
A clear 2020 highlight for NEAR has been the growth of our community. Our vibrant network of contributors building with and on NEAR has evolved into a resilient, decentralized organization in their own right.
Our December Town Hall focused on NEAR Guilds. A guild is a community with a unique identity based on a shared purpose and goals. We have regional guilds, guilds focused around common knowledge and skills, domain-specific guilds focused on particular use cases, and more. The Town Hall featured 5 emerging Guild leaders from around the world, but there are lots more Guilds to explore on NEARGuilds.com.
Chloe gave a tour of Createbase, a guild supporting experimental art projects that contribute to the NFT ecosystem on NEAR in collaboration with Mintbase. Learn more about their funding support, education resources, and Advent Calendar on their Wiki.
Henry AKA Blaze shared Open Shard Alliance, a network of professional validators who launched GuildNet, the first decentralized, sharded implementation of NEAR. OSA are the masterminds behind near-staking.com and Narwallets.
Jitendra presented NEAR India, a regional guild focused on growing the NEAR India community through awareness, education (including materials in regional languages from across India), and empowering local developer communities. Talk to them!
Ash introduced the NEAR Marketing Guild, a global group of advocates dedicated to spreading the word about NEAR initiatives and making shareable developer content.
Michael AKA Ozymandius presented 4NTS, an OG guild dedicated to expanding the NEAR ecosystem and advancing the Open Web vision. 4NTS are also the maker-maintainers of the self-organized NEAR Guilds home base!
Give 🎁
To send us all off in the spirit of the season, the NEAR Foundation and a few friendly collaborators in the ecosystem decided to team up and donate to Coin Center today. Regulatory advocacy is more important than ever in the blockchain space and it’s important to support the people doing the hard work on our collective behalf. The NEAR Foundation has donated $25,000. Thanks to the projects who joined us: Celo Foundation, Solana Foundation, Interchain Foundation, and The Graph Foundation.
If you hold NEAR and want to spread cheer, try out NearNames. It’s a fun holiday app that lets you gift NEAR accounts in an easy flow for crypto natives and newbies alike. Bring them aboard with a personalized account name (24K+ accounts are live on NEAR so far!) and then send them this post to help them get started in the ecosystem.
The Future is NEAR! ✨
Early 2021 will bring lots of exciting initiatives, so don’t forget to follow along on Twitter and our governance Forum to hear the latest. We will announce the launch of the NEAR Grants Program, debut improvements to the NEAR community governance experience, and share exciting updates about the Rainbow Bridge. And that’s just in January.
Happy holidays and see you next year!
Don’t forget to join us on Twitter and subscribe to our newsletter.
---
NEP: 492
Title: Restrict creation of Ethereum Addresses
Authors: Bowen Wang <[email protected]>
Status: Final
DiscussionsTo: https://github.com/near/NEPs/pull/492
Type: Protocol
Version: 0.0.0
Created: 2023-07-27
LastUpdated: 2023-07-27
---
## Summary
This proposal aims to restrict the creation of top-level accounts (other than implicit accounts) on NEAR, both to prevent loss of funds due to careless user behavior and scams
and to create possibilities for future interoperability solutions.
## Motivation
Today an [Ethereum address](https://ethereum.org/en/developers/docs/accounts/) such as "0x32400084c286cf3e17e7b677ea9583e60a000324" is a valid account on NEAR, and because it is longer than 32 characters,
anyone can create such an account. This has unfortunately caused a few incidents where users lost their funds due to either a scam or careless behavior.
For example, when a user withdraws USDT from an exchange to their NEAR account, it is possible that they think they are withdrawing to Ethereum and therefore enter their Ethereum address.
If this address exists on NEAR, then the user would lose their funds. A malicious actor could exploit this by creating well-known Ethereum smart contract addresses on NEAR to trick users into sending tokens to those addresses. With the proliferation of BOS gateways, including Ethereum ones, such exploits may become more common as users switch between NEAR wallets and Ethereum wallets (mainly MetaMask).
In addition to preventing loss of funds for users, this change allows the possibility of Ethereum wallets supporting NEAR transactions, which could enable much more adoption of NEAR. The exact details of how that would be done are outside the scope of this proposal.
There are currently ~5000 Ethereum addresses already created on NEAR. It is also outside the scope of this proposal to discuss what to do with them.
## Specification
The proposed change is quite simple. Only the protocol registrar account can create top-level accounts that are not implicit accounts.
## Reference Implementation
The implementation roughly looks as follows:
```Rust
fn action_create_account(...) {
...
if account_id.is_top_level() && !account_id.is_implicit()
&& predecessor_id != &account_creation_config.registrar_account_id
{
// Top level accounts that are not implicit can only be created by registrar
result.result = Err(ActionErrorKind::CreateAccountOnlyByRegistrar {
account_id: account_id.clone(),
registrar_account_id: account_creation_config.registrar_account_id.clone(),
predecessor_id: predecessor_id.clone(),
}
.into());
return;
}
...
}
```
## Alternatives
There does not appear to be a good alternative for this problem.
## Future possibilities
Ethereum wallets such as Metamask could potentially support NEAR transactions through meta transactions.
## Consequences
In the short term, no new top-level accounts (other than implicit accounts) can be created, but this change would not create any problems for users.
### Backwards Compatibility
For Ethereum addresses specifically, there are ~5000 existing ones, but this proposal per se does not deal with existing accounts.
## Copyright
Copyright and related rights waived via [CC0](https://creativecommons.org/publicdomain/zero/1.0/).
---
id: dao
title: Decentralized Autonomous Organizations
sidebar_label: Autonomous Organizations (DAO)
hide_table_of_contents: false
---
import {FeatureList, Column, Feature} from "@site/src/components/featurelist"
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
import BOSGetDAOList from "./dao/bos/get-dao-list.md"
import BOSGetProposalList from "./dao/bos/get-proposal-list.md"
import BOSCreateDAO from "./dao/bos/create-dao.md"
import BOSCreateProposal from "./dao/bos/create-proposal.md"
import BOSVoteForProposal from "./dao/bos/vote-for-proposal.md"
import WebAppGetDAOList from "./dao/web-app/get-dao-list.md"
import WebAppGetProposalList from "./dao/web-app/get-proposal-list.md"
import WebAppCreateDAO from "./dao/web-app/create-dao.md"
import WebAppCreateProposal from "./dao/web-app/create-proposal.md"
import WebAppVoteForProposal from "./dao/web-app/vote-for-proposal.md"
import CLIGetDAOList from "./dao/near-cli/get-dao-list.md"
import CLIGetProposalList from "./dao/near-cli/get-proposal-list.md"
import CLICreateDAO from "./dao/near-cli/create-dao.md"
import CLICreateProposal from "./dao/near-cli/create-proposal.md"
import CLIVoteForProposal from "./dao/near-cli/vote-for-proposal.md"
import SmartContractCreateDAO from "./dao/smart-contract/create-dao.md"
import SmartContractCreateProposal from "./dao/smart-contract/create-proposal.md"
import SmartContractVoteForProposal from "./dao/smart-contract/vote-for-proposal.md"
Decentralized Autonomous Organizations (DAOs) are self-organized groups that form around common purposes. Membership, decision making, and funding are coordinated by publicly voting on proposals through a smart contract.
![dao](/docs/primitives/dao.png)
In contrast with [FT](ft.md) and [NFT](nft.md), DAO contracts are not standardized. Because of this, on this page we will use as
reference the [AstraDAO](https://near.org/astraplusplus.ndctools.near/widget/home?page=daos) [contract](https://github.com/near-daos/sputnik-dao-contract). The main concepts covered here should
be easily generalizable to other DAO implementations.
---
## Create a DAO
The simplest way to create and interact with a DAO is to go through the [AstraDAO UI](https://near.org/astraplusplus.ndctools.near/widget/home?page=daos).
You can also create a DAO by interacting with the `sputnik-dao` contract.
<Tabs groupId="code-tabs">
<TabItem value="⚛️ Component" label="⚛️ Component" default>
<BOSCreateDAO />
</TabItem>
<TabItem value="🌐 WebApp" label="🌐 WebApp">
<WebAppCreateDAO />
</TabItem>
<TabItem value="🖥️ CLI" label="🖥️ CLI">
<CLICreateDAO />
</TabItem>
<TabItem value="📄 Contract" label="📄 Contract">
<SmartContractCreateDAO />
</TabItem>
</Tabs>
<hr className="subsection" />
### Voting policy
Currently, DAOs support two different types of [voting policies](https://github.com/near-daos/sputnik-dao-contract#voting-policy): `TokenWeight`, and `RoleWeight`.
When the vote policy is `TokenWeight`, the council votes using [tokens](ft.md). The weight of a vote is the proportion of tokens used for voting over the token's total supply.
When the vote policy is `RoleWeight(role)`, the vote weight is computed as "one over the total number of people with the role".
<details>
<summary> Voting Threshold </summary>
Both voting policies further include a `threshold` for passing a proposal, which can be a ratio or a fixed number.
The ratio indicates that you need a proportion of people/tokens to approve the proposal (e.g. half the people need to vote, and to vote positively). A fixed number indicates that you need a specific number of votes/tokens to pass the proposal (e.g. 3 people/tokens are enough to approve the proposal).
</details>
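To make the ratio case concrete, here is a small illustrative sketch of how such a threshold could be evaluated for a `RoleWeight` policy. The function, parameter names, and the strict-majority comparison are assumptions made for the example; the exact semantics are defined by the sputnik-dao contract itself.

```rust
// Illustration only: evaluating a ratio threshold for a RoleWeight policy.
// The names and the strict-majority comparison are assumptions for this sketch,
// not the actual sputnik-dao types or logic.
fn proposal_passes(yes_votes: u64, members_with_role: u64, ratio: (u64, u64)) -> bool {
    // A ratio of (1, 2) would mean "more than half of the members must vote yes".
    yes_votes * ratio.1 > members_with_role * ratio.0
}

fn main() {
    // 3 out of 5 council members voted yes against a 1/2 threshold.
    assert!(proposal_passes(3, 5, (1, 2)));
    println!("proposal passes");
}
```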
---
## List of DAOs
Query the list of DAOs existing in Sputnik DAO.
<Tabs groupId="code-tabs">
<TabItem value="⚛️ Component" label="⚛️ Component" default>
<BOSGetDAOList />
</TabItem>
<TabItem value="🌐 WebApp" label="🌐 WebApp">
<WebAppGetDAOList />
</TabItem>
<TabItem value="🖥️ CLI" label="🖥️ CLI">
<CLIGetDAOList />
</TabItem>
</Tabs>
---
## Query Existing Proposals
These snippets will enable you to query the proposals existing in a particular DAO.
<Tabs groupId="code-tabs">
<TabItem value="⚛️ Component" label="⚛️ Component" default>
<BOSGetProposalList />
</TabItem>
<TabItem value="🌐 WebApp" label="🌐 WebApp">
<WebAppGetProposalList />
</TabItem>
<TabItem value="🖥️ CLI" label="🖥️ CLI">
<CLIGetProposalList />
</TabItem>
</Tabs>
---
## Create proposal
Create a proposal so other users can vote in favor or against it.
<Tabs groupId="code-tabs">
<TabItem value="⚛️ Component" label="⚛️ Component" default>
<BOSCreateProposal />
</TabItem>
<TabItem value="🌐 WebApp" label="🌐 WebApp">
<WebAppCreateProposal />
</TabItem>
<TabItem value="🖥️ CLI" label="🖥️ CLI">
<CLICreateProposal />
</TabItem>
<TabItem value="📄 Contract" label="📄 Contract">
<SmartContractCreateProposal />
</TabItem>
</Tabs>
:::info
By default, only **council members** can create proposals.
:::
---
## Vote for proposal
These snippets will enable your users to cast a vote on proposals of a particular DAO.
<Tabs groupId="code-tabs">
<TabItem value="⚛️ Component" label="⚛️ Component" default>
<BOSVoteForProposal />
</TabItem>
<TabItem value="🌐 WebApp" label="🌐 WebApp">
<WebAppVoteForProposal />
</TabItem>
<TabItem value="🖥️ CLI" label="🖥️ CLI">
<CLIVoteForProposal />
</TabItem>
<TabItem value="📄 Contract" label="📄 Contract">
<SmartContractVoteForProposal />
</TabItem>
</Tabs>
---
## Additional Resources
1. [AstroDAO UI](https://astrodao.com/) - the web app built on top of the Sputnik DAO Contract. Allows users to create and manage DAOs.
2. [List of DAOs as a NEAR component](https://near.org/onboarder.near/widget/DAOSocialSearch)
NEAR Boosts Web3 Gaming with New South Korea Regional Hub Launch
COMMUNITY
November 1, 2022
South Korea will soon have its own NEAR Regional Hub dedicated to Web3 innovation, business development, education, and talent throughout the country. The hub will be led by entrepreneurs Scott Lee and Ben Kang – both influential figures within South Korea’s growing blockchain community. Details of the plan include tapping into the country’s active gaming community, bringing more amazing projects and creators to the NEAR ecosystem.
NEAR has entered the game
This exciting development will bring all things NEAR to South Korea’s vibrant Web2 and Web3 scene. Most notably, game developers building on NEAR will be exposed to the country’s massive gaming markets, ranked fourth in the world, and earn commissions on the NEAR-based dapps they build.
South Korea’s gaming industry leaders include WEMADE’s P2E project WEMIX, NetMarble’s MarbleX, and KakaoGames’ METABORA. With gaming and crypto’s popularity in South Korea, the new hub will be integral in helping lay the foundation for Web3 gaming’s future; both locally within the country, and beyond.
Local leadership for local communities
Together, Scott Lee and Ben Kang bring an impressive depth of knowledge and experience to the NEAR ecosystem, as well as a vast network of industry leaders from the broader Web3 space.
When Lee wasn’t busy founding a Web3 startup or advising on multiple Korean-based projects, he was the Director of Business Development of Legendaries, a subsidiary of KAKAO Entertainment. Before joining KAKAO Entertainment, he led an accelerating division and growth team from SparkLabs and FuturePlay.
Kang began his career in financial services and worked in high profile roles at Merrill Lynch and the Korea Stock Exchange. After co-founding Klaytn, Korea’s leading Layer 1, he leveraged his expertise to advise on the company’s tokenomics and corporate investments.
The power duo hopes to attract the best quality projects and the hottest developers in South Korea to build on NEAR while helping companies transition from Web2 into Web3, particularly those within the gaming industry.
New places, new talent, new projects
To seize this opportunity to heighten awareness for NEAR in South Korea, the hub will also play a proactive role in community building through meetups, hackathons, and university relations, attracting high quality projects and growing NEAR’s presence in the region.
“NEAR has gained wide support among the exciting Web3 community for its ease of use and impressive user interface, which has helped to democratize access to building and bring a mix of exciting talent to the protocol. Moreover, there is a significant number of JavaScript developers in Korea, and they can now easily build on NEAR,” said Scott Lee. “We want to use the hub to leverage that momentum and to attract an even bigger audience of builders by giving them the funding and the tools to create an inclusive and sustainable Web3 future for Korea.”
“We will focus on enhancing the overall NEAR ecosystem and attracting Web3 projects to onboard – and also value campus blockchain communities, which will be the fundamentals of the NEAR Protocol.” said Ben Kang. “In order to do this we will work closely with the community to raise the profile of the NEAR and to shine a spotlight on the high level of scalability and useability that the protocol can offer creators.”
South Korea has one of the most established Web3 ecosystems in Asia and one of the most active cryptocurrency economies in the world. This, along with a combination of highly creative local talent and a growing desire for developers to build sustainably and without limits, makes this country an ideal addition to the NEAR Regional Hubs initiative.
“We are thrilled to be growing the blockchain dev community in Korea as the talent there is world class and they have the potential to create some fantastic Web3 projects on NEAR,” said Marieke Flament, CEO of the NEAR Foundation. “With a stellar leadership team at the helm of this exciting and inclusive hub, we are confident that we can deliver the necessary tools and opportunities to attract more of the best and brightest stars from the local community. Together we can give them the training and freedom to build sustainable and exciting Web3 applications without limits.”
Building a world without limits
Internationally, the excitement continues to build around the #NEARisNOW movement, and the potential NEAR has as the de facto blockchain of Web3. With its ecosystem-first approach, incredible community, and innovative tech, there is no better on ramp to the future of the web. Unlike other chains, NEAR gives software developers easy access to create without limits.
---
id: transactions
title: Transactions
---
Users interact with NEAR by creating transactions. Specifically, users use their account's [private keys](./access-keys.md) to sign transactions, which are then broadcast and processed by the network.
![](@site/static/docs/assets/welcome-pages/data-lake.png)
A transaction is composed of one or more [`Actions`](./transaction-anatomy.md), and each action costs a deterministic amount of [gas units](./gas.md). These gas units are translated into a cost in NEAR tokens, which the user must pay for the transaction to be processed.
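As a rough illustration of that conversion (both numbers below are assumptions for the example, not live protocol values), the token cost is simply the gas burnt multiplied by the gas price at execution time:

```rust
// Illustration only: converting burnt gas units into a NEAR token cost.
// Real gas usage depends on the actions in the transaction, and the gas price
// is set by the protocol per block; the figures here are assumed.
fn main() {
    let gas_burnt: u128 = 4_500_000_000_000; // ~4.5 Tgas (assumed)
    let gas_price: u128 = 100_000_000;       // yoctoNEAR per gas unit (assumed)
    let cost_yocto = gas_burnt * gas_price;
    println!("cost: {} yoctoNEAR (~{:.6} NEAR)", cost_yocto, cost_yocto as f64 / 1e24);
}
```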
:::tip
You can use an <a href="https://nearblocks.io/">Explorer</a> to inspect transactions in the NEAR network
:::
NEAR Wallet Will Remain Online Until the End of the Year
NEAR FOUNDATION
October 26, 2023
To avoid the rush of last minute transfers, and to support the inclusion of new wallets coming into the ecosystem, we are delaying the sunset of the NEAR wallet.
The NEAR wallet sunset is now targeted for December 31st, 2023.
We want to encourage users to continue to transfer their wallets using the Transfer Wizard available at wallet.near.org. Simply select “Get Started” on the banner at the top of the page displayed when you log in.
Learn more about how the Transfer Wizard works.
You can also learn more about why Pagoda is sunsetting the NEAR Wallet.
The future of the wallet.near.org landing page
Previously, we had intended to make wallet.near.org into a wallet-centric landing page as a place for ecosystem users to learn about all the amazing wallets available on the NEAR platform. However, due to resource and time constraints, we no longer believe we could provide the best experience. Instead, we are asking the community to create an information site that showcases all the NEAR wallets.
These plans are still in progress as we work with community partners. More to follow when we have more to share.
Thank you for your continued support and patience as we work hard to improve the user onboarding and transaction experience.
NEAR MainNet is now Community-Operated
COMMUNITY
September 24, 2020
As of today, the NEAR Foundation has spun down its nodes, making the NEAR Protocol MainNet fully community run. This is the most significant (though not the last!) milestone in NEAR’s multi-year journey from excited scribbles on a whiteboard to a fully decentralized and community operated network.
For context, the NEAR Protocol MainNet genesis occurred on April 22, 2020, and the network has been operated in “Proof of Authority” mode by the NEAR Foundation, which ran the network’s only nodes during the early stages. This can be considered the “training wheels” phase of the network because the Foundation had sufficient control to unilaterally address any challenges which might have arisen.
Over the past several months, in conjunction with the Stake Wars incentivized TestNet program, the NEAR team has onboarded and promoted additional participants to operate the validator nodes which run the network and given them the delegation they need to operate those nodes. At this point, dozens of independent node operators currently run the NEAR network.
NEAR has always been meant to be community operated, meaning that its node hardware is run by independent participants and no entity controls sufficient stake to unilaterally override decisions made on the network. That’s why, as planned, the NEAR Foundation today realized the Phase I milestone of its Road to MainNet by shutting down its nodes and placing the network in the hands of the other operators. While the Foundation still has the ability to delegate to support a diversity of these nodes, it no longer has control over the network. NEAR is now community operated.
Why this is Important
NEAR fills a major hole in the current blockchain ecosystem and many people have been impatient for the network to reach its final stage of MainNet launch.
The obvious problem today is the crippling fees on Ethereum which have sent a wave of refugees out into the world looking for a network that will not rewrite the economics of their businesses because of an unrelated bubble. Yes, NEAR has 10,000x lower fees than Ethereum and will scale dynamically. Yes, NEAR is the most developer-friendly way to achieve this scalability. But “faster and cheaper” is insufficient to be successful in supporting the next generation of applications.
Projects like Mintbase have chosen to build on NEAR because it is the fastest way for builders to get to market and do things that they didn’t expect were possible from a blockchain. NEAR interoperates with Ethereum with the ease of a layer 2 via the trustless Rainbow Bridge while providing a flexible account model that can hide the blockchain from users until they’re ready, using Progressive Onboarding. It gives them a full-stack suite of tooling and a way to keep 30% of transaction fees generated by their contracts. It easily allows sending tokens to unregistered users with “NEAR drops” and its human-readable accounts have easy multisig, so DAOs are almost native.
These features don’t just solve the problems of today (cost & scaling) but allow builders to create sustainable apps and experiences that no other chain can support… experiences that real users can use.
What Comes Next
The transition to Phase I means that the network is now community operated but it is still technically restricted from allowing token transfers between participants and there are no block rewards being offered. The first task of this new set of community operators will be to exercise their governance power to vote to enable each of these things, which will transition the network into its final stage, “MainNet Phase II: Community-Governed.”
The technical details of this process are described in more detail in the Transitioning NEAR MainNet post, but basically there is a stake-weighted vote where validators indicate their choice by flagging their decision. Over time, as more delegation joins each voting validator, the “yes, proceed to Phase II” votes should increase until they reach the necessary threshold (2/3rds of total stake) to pass.
We have no idea when this vote will pass because it is entirely up to the community now. Members of the NEAR Validator Advisory Board have been vocal in discussing the details of this transition. Generally, important criteria will include ensuring that enough tokens have been distributed and are participating in the delegation process and maintaining the network’s stability. This could be days, weeks or months… and it’s partially up to you.
What we Need from You
If you are a NEAR token holder, now is the time to start participating!
This means you should start by going through the claims process (the details of which will depend on your exact method of acquiring tokens) to accept and custody your tokens.
Next, if you are comfortable doing so, you can participate in both the network security and voting by delegating your stake to a validator. This is still a relatively technical process (see the docs) but several community members are working on human-usable interfaces to help do this in a more user-friendly way.
There are several upsides to delegating:
You get to participate in the vote to transition into Phase II, which will enable transfers and block rewards when it succeeds.
Once block rewards are enabled in Phase II, you can expect to receive a portion of them (which depends on the details of the validator you delegated to).
As with anything in the blockchain space, there are risks to consider, so you should delegate only if you are comfortable. But, as with anything in the blockchain space, your participation actually matters. A first step is to join us at our first community Town Hall on Monday September 28.
Outro
For the first time, we can say that NEAR is officially and undeniably community operated.
This almost feels like sending a favored son or daughter off to school for the first time — it can be a bit uncomfortable to place your trust in the hands of others. But everyone in the NEAR community has worked incredibly hard to get this far and we couldn’t be prouder of the group who are now carrying this network into the next phase. |
---
title: 5.2 Sputnik V2 and Astro DAO on NEAR
description: Sputnik and Astro and developing robust DAO infrastructure on NEAR
---
# 5.2 Sputnik V2 and Astro DAO on NEAR
Sputnik V2 and Astro DAO refer to NEAR DAO tooling logic, deployed exclusively on NEAR’s native network (not Aurora). In the context of the emergence of DAOs, Sputnik and Astro represent the future potential for developing robust DAO infrastructure on NEAR to power decentralized governance across verticals.
However, we are only at the very beginning of this paradigm shift: DAOs and OOs (on-chain organizations) are extremely immature - with the tooling running just slightly ahead. The goal of this lecture is to break down what Sputnik V2 is, how Astro DAO improves upon Sputnik V2, and what remains to be done in terms of DAO tooling on NEAR - looking forward to the future.
## Context and Possibilities
![](@site/static/img/bootcamp/mod-em-5.2.1.png)
From the outset, one of the most difficult things concerning DAO tooling is the massive scope of potential applications for DAOs. In short, _"How do we approach the logic of decentralized autonomous organizations when the very scope and nature of a DAO is rapidly expanding and changing?”_
On NEAR, Illia has designed a very robust logic for DAO tooling that is unique to the NEAR ecosystem. This tooling logic is known as Sputnik, and in April of 2021, Sputnik V2 launched with a number of highly innovative features. Sputnik V2 was then used as the basis for creating a simpler, more user-friendly tool for anyone to make use of - this tool is known as Astro DAO. In its current form, Astro DAO has been kept inside of Pagoda and has not spun out as its own product - meaning the playing field is wide open for disruption and innovation on NEAR specifically.
## Sputnik V2: Logic and Possibilities
The core logic of Sputnik V2 centers upon the idea of proposal kinds and actions. That is to say, types of proposals that a user could put forward, and a set of actions that a user may be allowed to take, in relation to a proposal. A user’s permission is the combination of the type of proposal kind, and the action that a particular user is allowed to take. This is the basis for different ‘Roles’ among different participants:
![](@site/static/img/bootcamp/mod-em-5.2.2.png)
In terms of what proposal kinds exist, and what actions can be taken, Sputnik is built [around the following logic](https://github.com/near-daos/sputnik-dao-contract):
![](@site/static/img/bootcamp/mod-em-5.2.3.png)
![](@site/static/img/bootcamp/mod-em-5.2.4.png)
This essentially means the following:
* A role is structured around:
* The kinds of proposals a user has permissions to create.
* The types of actions a user has permission to take, for a specific proposal kind.
Mathematically, this means the following:
![](@site/static/img/bootcamp/mod-em-5.2.5.png)
In its broadest and most expansive form, Sputnik V2 is designed to accommodate up to 3,432 possible roles.
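To make the "role = proposal kinds × permitted actions" idea concrete, here is a minimal TypeScript sketch of what a role could look like, loosely modeled on the permission strings used in the sputnik-dao-contract repository linked above. The field names and permission values here are illustrative assumptions, not a canonical interface:

```ts
// A role names a group of members and the "<ProposalKind>:<Action>" pairs they may perform.
type Role = {
  name: string;
  kind: { Group: string[] } | "Everyone"; // who holds the role
  permissions: string[];                  // "<proposal kind>:<action>", with "*" as a wildcard
};

const councilRole: Role = {
  name: "council",
  kind: { Group: ["alice.near", "bob.near"] },
  permissions: [
    "*:AddProposal",        // council can create any kind of proposal
    "transfer:VoteApprove", // ...and approve or reject treasury transfers
    "transfer:VoteReject",
    "policy:VoteApprove",   // ...and approve changes to the DAO policy itself
  ],
};
```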
Beyond the permissions for specific types of users, Sputnik is also built to provide specific vote policies for users:
This can be done in two ways (or a combination of both):
1. By token weight - meaning the passage of a proposal by users with the acceptable role is decided by the weight of the amount of tokens each user votes with. Notably, this is not limited strictly to NEAR tokens, but could be set as a combination of any NEP-141 tokens.
2. By role weight - meaning members vote from the role they possess, with each role having 1 vote.
On top of these parameters, the DAO must also decide on the quorum threshold (how much vote weight must be accumulated for a proposal to pass), a bond amount for posting a proposal, and a proposal period (how long each proposal is considered).
![](@site/static/img/bootcamp/mod-em-5.2.6.png)
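To tie these vote-policy parameters together, the following TypeScript sketch shows the kind of policy fragment a DAO might configure. It is loosely modeled on the sputnik-dao-contract policy object, and the field names and values are illustrative assumptions rather than the exact contract schema:

```ts
// Vote policy: how votes are weighted and how much weight is needed for a result.
type VotePolicy = {
  weight_kind: "RoleWeight" | "TokenWeight"; // vote by role membership or by token weight
  quorum: string;                            // minimum total vote weight that must accumulate
  threshold: [number, number];               // e.g. [1, 2] for a simple-majority ratio
};

const defaultVotePolicy: VotePolicy = {
  weight_kind: "RoleWeight",
  quorum: "0",
  threshold: [1, 2],
};

const policyFragment = {
  default_vote_policy: defaultVotePolicy,
  proposal_bond: "1000000000000000000000000", // bond posted with each proposal (yoctoNEAR, here 1 NEAR)
  proposal_period: "604800000000000",         // how long a proposal stays open (nanoseconds, ~7 days)
};
```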
Altogether, Sputnik is extremely modular: it allows any creator to set up an on-chain organization around different permissions for users to put forward types of proposals, actions to vote on those proposals, and policies for deciding how voting is done. Interestingly, the _FunctionCall_ proposal kind is extremely broad, since virtually any smart-contract call on NEAR can be wrapped in a proposal and executed by the DAO as a whole.
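As a hedged illustration of how broad _FunctionCall_ proposals are, the TypeScript sketch below shows the kind of arguments a member could submit to Sputnik V2's `add_proposal` method to have the DAO pay out a fungible-token transfer. The overall shape follows the public sputnik-dao-contract examples; the account names, amounts, and gas values are made up for illustration:

```ts
// Hypothetical arguments for a FunctionCall proposal: the DAO itself calls ft_transfer
// on a token contract if the proposal is approved.
const addProposalArgs = {
  proposal: {
    description: "Pay the design bounty from the DAO treasury",
    kind: {
      FunctionCall: {
        receiver_id: "token.example.near", // any contract on NEAR can be the target
        actions: [
          {
            method_name: "ft_transfer",
            // args are base64-encoded JSON for the target method
            args: Buffer.from(
              JSON.stringify({ receiver_id: "designer.example.near", amount: "1000000" })
            ).toString("base64"),
            deposit: "1",          // yoctoNEAR attached to the call
            gas: "30000000000000", // gas forwarded to the call
          },
        ],
      },
    },
  },
};
```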
## Astro DAO
With Sputnik V2, NEAR launched an exceptionally robust DAO tooling factory that needed to become more user-friendly in order to become operational. In 2021, Jordan was brought into Pagoda to launch Astro DAO as a user-friendly platform for quickly and easily spinning up DAOs on NEAR using the Sputnik V2 logic.
![](@site/static/img/bootcamp/mod-em-5.2.7.png)
The one significant innovation of Astro DAO - beyond a much improved UI/UX - is its communal _actions library_.
![](@site/static/img/bootcamp/mod-em-5.2.8.png)
In short, anyone using Astro DAO can create a custom function call or proposal kind for their DAO that can then be plugged into any other DAO on the platform. This represents a _significant_ opportunity for open-source collaboration and for the development of vertical-specific libraries for different types of DAOs and OOs.
![](@site/static/img/bootcamp/mod-em-5-multiple.png)
## V3 and Unfinished Business
While Sputnik and Astro DAO represent monumental leaps forward for DAO tooling on NEAR (as compared to Sputnik V1), there remain a number of areas of development that would vastly improve DAOs on NEAR. Dubbed Sputnik V3, these enhancements include:
* **Granular Vote Policies For Specific Proposal Kinds:** Being able to give certain types of proposals (e.g. adding a new member or allocating funds) different vote criteria from other proposals.
* **Reputation and Proof of Humanity / Attestations:** Carrying over past user history from other engagements in the ecosystem (dApps, other DAOs, etc.)
![](@site/static/img/bootcamp/mod-em-5.2.9.png)
* **DAO Hierarchies and Inter-DAO Communication:** Being able to structure roles and actions more specifically within each DAO, and merging DAO decisions, such that if DAO _A_ does X, DAO B can only do Y or Z.
* **Configurable Tokens Based Actions to Include:**
* **Staked / Delegated:** Stake for vote weight or delegate to certain accounts.
* **Bonding Curves:** Allowing proposals to remain open and evolving over time, based on staking, delegation to the proposal.
* **Fungible Token / Non-Fungible Token Dichotomy.** Allowing NFTs to be used for access and voting, or a combination of both NFTs and FTs.
* **Legal Wrappers for DAO incorporation similar to the LAO.** Legal setup such that the DAO can be legally organized at the same time as when it is created on-chain.
Most importantly, the takeaway here is that DAOs as a whole are in their infancy, and while much work remains to be done, there are a number of upgrades, iterations, and organizational opportunities for keen builders and ecosystem developers. The verdict is still very much out on which ecosystem has the best DAO tooling.
## DAO Business Models and tokens
Importantly, DAOs have struggled with business model design and monetization pathways. This is largely because ‘tooling’ and ‘infrastructure’ are largely open-source or easily replicable. This presents a unique opportunity for an ecosystem as a whole to develop a strategy around building and offering subsidized or free DAO tooling and libraries - a robust DAO toolkit that any dApp or protocol could make use of.
## The Opportunity Before Us For DAO Products
Looking ahead to the future, the opportunity in front of us for DAO products can be summarized as follows: Vertically, DAO tooling will become more complex and more granular, while open-source libraries become richer and more developed. Horizontally, DAOs will rapidly expand to encompass more and more industries - beyond the narrow scope of crypto-native treasuries, NFT communities, and protocols, eventually VCs, investment funds, community organizations, legal organizations, and even national governments can be imagined moving on-chain.
## Future DAO Product Recommendations for NEAR Specifically
* **The Figma of DAOs.** A full proposal of this can be found [here](https://docs.google.com/document/d/17q8HSlSIYjVyxV1YeBX--0HkJbi4SYZeHPKmM2oYqcw/edit). In short, the idea is to create two interfaces for DAO tooling on NEAR: (1) The first, for developers and builders, is an extremely easy-to-use interface from which DAO creators can leverage plug-and-play actions from Astro and Sputnik to drag and drop DAO organizations into existence and into communication with one another. While some have floated the idea of a ‘WordPress’ for DAOs, the complexity of the actions and underlying logic is more similar to Figma. (2) Second, a clear pathway for users to spin up different types of DAOs based on specific vertical needs: for protocol treasuries, for investment DAOs, for network states, and for NFT communities. In short, one-click setup of pre-set DAO configurations.
* **The Network State Spin-Up Structure.** Praxis, Afropolitan, Balaji and many, many others are pioneering a new future of start-up cities, network states, and break-away organizations that are seeking to create new polities fully on-chain. Not only is the narrative power around this movement fantastic, but there is serious capital, builder interest, and growth potential in this vertical. NEAR would be very smart and forward-looking to develop an easy-to-use Network State pipeline / product, including legal wrappers and national agreements, to incentivize builders to create their network states of the future in the NEAR ecosystem. Combined with products like Request Finance and Roketo, the idea of a startup country can be practical and implemented within a couple of months.
* **Custom Libraries for Specific Verticals.** The idea of building out the libraries of the future for DAO creators centers upon expanding the limits of what developers and users believe to be possible. The roll-out of DAOs and OOs is thus constrained by the roll-out and development of smart-contract logic on NEAR, as well as the different combinations of products and tools that exist (for example, Croncat). As more products build on NEAR, more smart contracts are deployed, and more products are synthesized, an ever-increasing pool of potential use-cases emerges for DAOs on NEAR. In order to ‘keep up’ with this innovation, there is a low-hanging-fruit opportunity to establish a public good around DAO library improvement, maintenance, and organization.
|
Women in Web3 Changemakers: Erica Kang
NEAR FOUNDATION
October 3, 2023
“The space needs more women,” says Erica Kang, the CEO and founder of KryptoSeoul, a community-building team focused on highlighting projects that are actively building in the space, connecting audiences in Asia and beyond.
“I think many people find it intimidating coming into this space, but I think women can add value.”
Kang is one of the 10 finalists of the 2023 Women in Web3 Changemakers List, an annual competition designed to champion women pushing boundaries in the space. Kang, along with nine other outstanding candidates, is showing the industry the change women are bringing about during the advent of the Open Web.
Kang was chosen because of her tireless work in the Web3 community. On top of creating one of the most vibrant meetups in Asia under KryptoSeoul, she’s also one of the main organisers of BUIDL Asia, as well as numerous hackathons and meetups, and has been a pivotal player in helping bring together some of the brightest technical talent under one roof. But getting to this point has been anything but easy.
Watch an extended NEAR Foundation video interview with Erica Kang below.
Stepping off the ladder
Kang graduated from Stanford with a major in international relations and policy, giving her the foundations for being able to bring people from diverse backgrounds closer together. After graduating, she joined Korea Telecom, one of the largest telco companies in Asia, working in the strategy division, but found it stifling.
“It’s a very conservative culture,” says Kang. “The hierarchy is dramatic. Juniors don’t have an opportunity to strive for creating new things or innovations, even though they have amazing ideas. It’s hard to bring it up.”
But a chance encounter with a friend took Kang on a different path. “A friend recommended a very unique seminar, back in 2017. It’s very rare in Seoul, Korea to kind of have that kind of a seminar, especially on crypto.”
The opportunity to jump into an emerging new field of technology, and away from the hierarchy of corporate South Korea was too tempting.
“Since then, I’ve been in the rabbit hole ever since.” Kang brought her experience in bringing people together in a role she describes as closer to a digital “diplomat” and started organising meet ups in and around South Korea from 2017 onwards.
Kang was well placed. South Korea has become one of the leaders in Web3 technology. The South Korean government has invested US$21 million into local services looking to utilize the metaverse, and has established a US$30 million metaverse fund to help startups expand.
Earlier this year, South Korea’s parliament approved the country’s first standalone crypto asset legislation, which integrates 19 crypto-related bills and authorises the Financial Services Commission (FSC) to oversee crypto asset operators and custodians.
But it hasn’t always been easy working in Web3.
Growing Pains
“I’ve been working in the space since 2017, and I did feel a lack of female culture here. It was devastating at times. And it was frustrating too. I really had to eliminate my femininity and had to prove I could really go up there and do my thing,” says Kang.
But the hard work, says Kang has paid off. “I think people get it now. I’ve just been building, and make a difference.” Kang is now considered one of the key pillars of the South Korean, and broader east Asian crypto community.
She has also become a sounding board for many young female entrepreneurs thinking about jumping into the Web3 space. But Kang suggests they tread with caution.
“I think it’s really important for newbies to do your own research. Learn a lot and indulge yourself but when it comes to engaging in certain projects, and certain people investing in those people, you really have to be careful.”
Kang stresses there are reputational challenges working on some projects, but the opportunity to innovate and come up with new solutions to old problems is unique in the Web3 space. |
---
id: contribute-nearcore
title: Contribute to nearcore
sidebar_label: NearCore Contributions
---
Please see nearcore's [CONTRIBUTING.md](https://github.com/near/nearcore/blob/master/CONTRIBUTING.md).
|
---
id: hardware
title: Hardware Requirements for Validator Node
sidebar_label: Hardware Requirements
sidebar_position: 1
description: NEAR Validator Node Hardware Requirements
---
This page covers the minimum and recommended hardware requirements for engaging with the NEAR platform as a validator node.
## Recommended Hardware Specifications {#recommended-hardware-specifications}
| Hardware | Recommended Specifications |
| -------------- | --------------------------------------------------------------- |
| CPU | x86_64 (Intel, AMD) processor with at least 8 physical cores |
| CPU Features | CMPXCHG16B, POPCNT, SSE4.1, SSE4.2, AVX |
| RAM | 24GB DDR4 |
| Storage | 2TB SSD (NVMe SSD is recommended. HDD will be enough for localnet only) |
Verify CPU feature support by running the following command on Linux:
```
lscpu | grep -P '(?=.*avx )(?=.*sse4.2 )(?=.*cx16 )(?=.*popcnt )' > /dev/null \
&& echo "Supported" \
|| echo "Not supported"
```
## Minimal Hardware Specifications {#minimal-hardware-specifications}
| Hardware | Minimal Specifications |
| -------------- | --------------------------------------------------------------- |
| CPU | x86_64 (Intel, AMD) processor with at least 8 physical cores |
| CPU Features | CMPXCHG16B, POPCNT, SSE4.1, SSE4.2, AVX |
| RAM | 16GB DDR4 |
| Storage | 1TB SSD (NVMe SSD is recommended. HDD will be enough for localnet only) |
Verify CPU feature support by running the following command on Linux:
```
lscpu | grep -P '(?=.*avx )(?=.*sse4.2 )(?=.*cx16 )(?=.*popcnt )' > /dev/null \
&& echo "Supported" \
|| echo "Not supported"
```
## Cost Estimation {#cost-estimation}
Estimated monthly costs by cloud provider (Linux):
| Cloud Provider | Machine Size | Linux |
| -------------- | --------------- | ---------------------- |
| AWS | m5.2xlarge | $330 CPU + $80 storage |
| GCP | n2-standard-8 | $280 CPU + $80 storage |
| Azure | Standard_F8s_v2 | $180 CPU + $40 storage |
<blockquote class="info">
<strong>Resources for Cost Estimation</strong><br /><br />
All prices reflect *reserved instances*, which offer deep discounts on all platforms with a 1-year commitment
- AWS
- cpu: https://aws.amazon.com/ec2/pricing/reserved-instances/pricing
- storage: https://aws.amazon.com/ebs/pricing
- GCP
- cpu: https://cloud.google.com/compute/vm-instance-pricing
- storage: https://cloud.google.com/compute/disks-image-pricing
- Azure — https://azure.microsoft.com/en-us/pricing/calculator
</blockquote>
>Got a question?
<a href="https://stackoverflow.com/questions/tagged/nearprotocol">
<h8>Ask it on StackOverflow!</h8></a>
|
John Smith joins NEAR Inc. as CISO to redefine safety in the Blockchain Ecosystem
NEAR FOUNDATION
October 29, 2021
We are thrilled to announce that John Smith has joined NEAR Inc as its Chief Information Security Officer (CISO). John will be leading NEAR Inc’s information security strategy, working closely with core development teams, partners, and the ever-expanding NEAR community.
“John has the right background, experience, and network to do the hard work of elevating trust in the blockchain community,” says Illia Polosukhin, the CEO of NEAR Inc. “He comes to us with a history of solving complex technical problems at scale across challenging industries.”
John is a seasoned cybersecurity professional—having led highly collaborative teams at iconic threat intelligence, security, and technology companies, serving clients at every scale across an array of industries. In his previous role as the VP of Business Information and Product Security at the global ad ratings company Nielsen, he led all aspects of application and cloud security. At Nielsen, he transformed the relationship between developers and the security organization, shifting DevOps to DevSecOps. Early in his career, John also served at the US Nuclear Regulatory Commission, working on digital transformation alongside reactor programs and human safety. There he learned to understand and appreciate the detailed nuances of risk analysis, applied in complex, high-stakes environments.
While John will be responsible for ensuring traditional security practices are maintained at NEAR, he will be particularly focused on developing and enabling scalable security practices for smart contract developers and ensuring the security of dApps when they go live.
Scalable Security
Interest in DeFi continues to soar and NEAR is uniquely positioned to drive massive adoption of this revolutionary technology while continuing to build the foundations of Web3.
NEAR Inc. is achieving this through dedication to developer friendliness, scalability, cross-chain interoperability, and performance. These are the features that drive adoption.
However, these features don’t guarantee that blockchain and cryptocurrency are free of issues. We’ve seen the ramifications of development mistakes and attacks on DeFi protocols and smart contracts.
This creates another barrier to adoption—trust—and it limits the adoption of not just a single protocol, but all of Web 3.0. Ensuring the security of the NEAR protocol and elevating the practices of the entire NEAR development community are crucial to earning that trust and attracting new participants to take part in the future of finance and the web.
We are excited that John has joined our team to help us achieve these goals. His work will help NEAR Inc. continue to build out a secure and scalable blockchain designed to onboard the world into Web3. |
Case Study: Building on the NEAR Blockchain Operating System (BOS)
CASE STUDIES
July 24, 2023
NEAR is home to a number of amazing apps and projects. These developers, founders, and entrepreneurs are using NEAR to effortlessly create and distribute innovative decentralized apps, while helping build a more open web — free from centralized platforms. In these Case Study videos, NEAR Foundation showcases some of these projects.
In the latest NEAR Foundation Case Study video, viewers take a deep dive into the Blockchain Operating System (BOS). We explore how the BOS, like a decentralized app store, isn’t controlled by any centralized entities. Other topics of conversation include BOS Gateways, the discoverability and composability of BOS apps and components, building on the BOS with Javascript, and integrations with Visual Studio Code and Github.
|
# Wallet Selector
An easy-to-navigate modal that allows users to select their preferred wallet to easily interact with the NEAR protocol.
Launched in March 2022 by the NEAR Foundation, this simple modal will appear whenever users are given the option to “Connect Wallet” to the NEAR blockchain.
![Preview](/docs/assets/wallet-selector-preview.png)
*Initial screen of [Wallet Selector](https://near.github.io/wallet-selector/)*
---
## Framework agnostic
[React](https://reactjs.org/) / [Next.js](https://nextjs.org/) and [Angular](https://angular.io/) variations of the [Guest Book](https://github.com/near-examples/guest-book-examples/) dApp can be found in the [`examples`](https://github.com/near/wallet-selector/tree/main/examples) directory. Developers can use these to gain a concrete understanding of how to integrate NEAR Wallet Selector into their own dApp.
### Unlocking the wallet ecosystem
Wallet Selector makes it easy for users to interact with dApps by providing an abstraction over various wallets and wallet types within the NEAR ecosystem.
:::info
You can check the current list of supported wallets in the [README.md](https://github.com/near/wallet-selector/blob/main/README.md) file of near/wallet-selector repository.
:::
Thanks to NEAR’s open and inclusive approach, other wallet developers can contribute to the NEAR ecosystem by following the documentation and instructions on the [NEAR Github repository](https://github.com/near/wallet-selector) on how to add a new wallet to the Wallet Selector.
:::tip
To learn more about how to include new wallets in Wallet Selector, you can check the listing criteria for third-party wallets at this [link](https://github.com/near/wallet-selector/blob/main/CONTRIBUTING.md#listing-criteria-for-third-party-wallet-on-wallet-selector).
:::
## Install
The easiest way to use NEAR Wallet Selector is to install the core package from the NPM registry. Some packages may require `near-api-js` v0.44.2 or above; check the individual packages for details.
```bash
npm install near-api-js@^0.44.2
```
```bash
npm install @near-wallet-selector/core
```
Next, you'll need to install the wallets you want to support:
```bash
npm install \
@near-wallet-selector/near-wallet \
@near-wallet-selector/my-near-wallet \
@near-wallet-selector/sender \
@near-wallet-selector/nearfi \
@near-wallet-selector/here-wallet \
@near-wallet-selector/math-wallet \
@near-wallet-selector/nightly \
@near-wallet-selector/meteor-wallet \
@near-wallet-selector/ledger \
@near-wallet-selector/wallet-connect \
@near-wallet-selector/nightly-connect \
@near-wallet-selector/default-wallets \
@near-wallet-selector/coin98-wallet
```
## Setup Wallet Selector
Optionally, you can install our [`modal-ui`](https://www.npmjs.com/package/@near-wallet-selector/modal-ui) or [`modal-ui-js`](https://www.npmjs.com/package/@near-wallet-selector/modal-ui-js) package for a pre-built interface that wraps the `core` API and presents the supported wallets:
```bash
npm install @near-wallet-selector/modal-ui
```
Then use it in your dApp:
```ts
import { setupWalletSelector } from "@near-wallet-selector/core";
import { setupModal } from "@near-wallet-selector/modal-ui";
import { setupNearWallet } from "@near-wallet-selector/near-wallet";
const selector = await setupWalletSelector({
network: "testnet",
modules: [setupNearWallet()],
});
const modal = setupModal(selector, {
contractId: "test.testnet",
});
modal.show();
```
:::info Required CSS
To integrate the Wallet Selector, you also need to include the required CSS:
```
import "@near-wallet-selector/modal-ui/styles.css"
```
:::
## API Reference
The API reference of the selector can be found [`here`](https://github.com/near/wallet-selector/blob/main/packages/core/docs/api/selector.md)
## Wallet API
### Sign in
```ts
// NEAR Wallet.
(async () => {
const wallet = await selector.wallet("my-near-wallet");
const accounts = await wallet.signIn({ contractId: "test.testnet" });
})();
```
### Sign out
```ts
(async () => {
const wallet = await selector.wallet("my-near-wallet");
await wallet.signOut();
})();
```
### Get accounts
```ts
(async () => {
const wallet = await selector.wallet("my-near-wallet");
const accounts = await wallet.getAccounts();
console.log(accounts); // [{ accountId: "test.testnet" }]
})();
```
### Verify Owner
```ts
// MyNearWallet
(async () => {
const wallet = await selector.wallet("my-near-wallet");
await wallet.verifyOwner({
message: "Test message",
});
})();
```
### Sign and send transaction
```ts
(async () => {
const wallet = await selector.wallet("my-near-wallet");
await wallet.signAndSendTransaction({
actions: [
{
type: "FunctionCall",
params: {
methodName: "addMessage",
args: { text: "Hello World!" },
gas: "30000000000000",
deposit: "10000000000000000000000",
},
},
],
});
})();
```
### Sign and send transactions
```ts
(async () => {
const wallet = await selector.wallet("my-near-wallet");
await wallet.signAndSendTransactions({
transactions: [
{
receiverId: "guest-book.testnet",
actions: [
{
type: "FunctionCall",
params: {
methodName: "addMessage",
args: { text: "Hello World!" },
gas: "30000000000000",
deposit: "10000000000000000000000",
},
},
],
},
],
});
})();
```
|
---
sidebar_position: 4
sidebar_label: "Hash the solution, unit tests, and an init method"
title: "Introduction to basic hashing and adding unit tests"
---
import {Github} from "@site/src/components/codetabs"
import batchCookieTray from '/docs/assets/crosswords/batch-of-actions--dobulyo.near--w_artsu.jpg';
# Hash the solution, add basic unit tests
In the previous section, we stored the crossword solution as plain text as a `String` type on the smart contract. If we're trying to hide the solution from the users, this isn't a great approach as it'll be public to anyone looking at the state. Let's instead hash our crossword solution and store that instead. There are different ways to hash data, but let's use `sha256` which is one of the hashing algorithms available in [the Rust SDK](https://docs.rs/near-sdk/latest/near_sdk/env/fn.sha256.html).
:::info Remind me about hashing
Without getting into much detail, hashing is a "one-way" function that will output a result from a given input. If you have input (in our case, the crossword puzzle solution) you can get a hash, but if you have a hash you cannot get the input. This basic idea is foundational to information theory and security.
Later on in this tutorial, we'll switch from using `sha256` to using cryptographic key pairs to illustrate additional NEAR concepts.
Learn more about hashing from [Evgeny Kapun](https://github.com/abacabadabacaba)'s presentation on the subject. You may find other NEAR-related videos from the channel linked in the screenshot below.
[![Evgeny Kapun presents details on hashing](/docs/assets/crosswords/kapun-hashing.png)](https://youtu.be/PfabikgnD08)
:::
## Helper unit test during rapid iteration
As mentioned in the first section of this **Basics** chapter, our smart contract is technically a library as defined in the manifest file. For our purposes, a consequence of writing a library in Rust is not having a "main" function that runs. You may find many online tutorials where the command `cargo run` is used during development. We don't have this luxury, but we can use unit tests to interact with our smart contract. This is likely more convenient than building the contract, deploying to a blockchain network, and calling a method.
We'll add a dependency to the [hex crate](https://crates.io/crates/hex) to make things easier. As you may remember, dependencies live in the manifest file.
<Github language="rust" start="10" end="12" url="https://github.com/near-examples/crossword-tutorial-chapter-1/blob/481a83f0c90398f3234ce8006af4e232d6c779d7/contract/Cargo.toml" />
Let's write a unit test that acts as a helper during development. This unit test will sha256 hash the input **"near nomicon ref finance"** and print it in a human-readable, hex format. (We'll typically put unit tests at the bottom of the `lib.rs` file.)
```rust
#[cfg(test)]
mod tests {
use super::*;
use near_sdk::test_utils::{get_logs, VMContextBuilder};
use near_sdk::{testing_env, AccountId};
#[test]
fn debug_get_hash() {
// Basic set up for a unit test
testing_env!(VMContextBuilder::new().build());
// Using a unit test to rapidly debug and iterate
let debug_solution = "near nomicon ref finance";
let debug_hash_bytes = env::sha256(debug_solution.as_bytes());
let debug_hash_string = hex::encode(debug_hash_bytes);
println!("Let's debug: {:?}", debug_hash_string);
}
}
```
:::info What is that `{:?}` thing?
Take a look at different formatting traits that are covered in the [`std` Rust docs](https://doc.rust-lang.org/std/fmt/index.html#formatting-traits) regarding this. This is a `Debug` formatting trait and can prove to be useful during development.
:::
Run the unit tests with the command:
```
cargo test -- --nocapture
```
You'll see this output:
```
…
running 1 test
Let's debug: "69c2feb084439956193f4c21936025f14a5a5a78979d67ae34762e18a7206a0f"
test tests::debug_get_hash ... ok
test result: ok. 1 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.00s
```
This means when you sha256 the input **"near nomicon ref finance"** it produces the hash:
`69c2feb084439956193f4c21936025f14a5a5a78979d67ae34762e18a7206a0f`
:::tip Note on the test flags
You may also run tests using:
```
cargo test
```
Note that the test command we ran had additional flags. Those flags told Rust *not to hide the output* from the tests. You can read more about this in [the cargo docs](https://doc.rust-lang.org/cargo/commands/cargo-test.html#display-options). Go ahead and try running the tests using the command above, without the additional flags, and note that we won't see the debug message.
:::
The unit test above is meant for debugging and quickly running snippets of code. Some may find this a useful technique when getting familiar with Rust and writing smart contracts. Next we'll write a real unit test that applies to this early version of our crossword puzzle contract.
## Write a regular unit test
Let's add this unit test (inside the `mod tests {}` block, under our previous unit test) and analyze it:
<Github language="rust" start="63" end="93" url="https://github.com/near-examples/crossword-tutorial-chapter-1/blob/5bce1c2a604fcb179e9789de1f299063f91abb4d/contract/src/lib.rs" />
The first few lines of code will be used commonly when writing unit tests. It uses the `VMContextBuilder` to create some basic context for a transaction, then sets up the testing environment.
Next, an object is created representing the contract and the `set_solution` function is called. After that, the `guess_solution` function is called twice: first with the incorrect solution and then the correct one. We can check the logs to determine that the function is acting as expected.
:::info Note on assertions
This unit test uses the [`assert_eq!`](https://doc.rust-lang.org/std/macro.assert_eq.html) macro. Similar macros like [`assert!`](https://doc.rust-lang.org/std/macro.assert.html) and [`assert_ne!`](https://doc.rust-lang.org/std/macro.assert_ne.html) are commonly used in Rust. These are great to use in unit tests. However, these will add unnecessary overhead when added to contract logic, and it's recommended to use the [`require!` macro](https://docs.rs/near-sdk/4.0.0-pre.2/near_sdk/macro.require.html). See more information on this and [other efficiency tips here](/sdk/rust/contract-size).
:::
Again, we can run all the unit tests with:
```
cargo test -- --nocapture
```
:::tip Run only one test
To only run this latest test, use the command:
```
cargo test check_guess_solution -- --nocapture
```
:::
## Modifying `set_solution`
The [overview section](00-overview.md) of this chapter tells us we want to have a single crossword puzzle and the user solving the puzzle should not be able to know the solution. Using a hash addresses this, and we can keep `crossword_solution`'s field type, as `String` will work just fine. The overview also indicates we only want the author of the crossword puzzle to be able to set the solution. As it stands, our function `set_solution` can be called by anyone with a full-access key. It's trivial for someone to create a NEAR account and call this function, changing the solution. Let's fix that.
Let's have the solution be set once, right after deploying the smart contract.
Here we'll add an initialization function called `new`, marked with the `#[init]` macro inside the [`#[near_bindgen]`](https://docs.rs/near-sdk/latest/near_sdk/attr.near_bindgen.html) impl, which is a common pattern.
<Github language="rust" start="10" end="17" url="https://github.com/near-examples/crossword-tutorial-chapter-1/blob/94f42e75cf70ed2aafb9c29a1faa1e21f079a49e/contract/src/lib.rs" />
Let's call this method on a fresh contract.
```bash
# Build (for Windows it's build.bat)
./build.sh
# Create fresh account if you wish, which is good practice
near delete crossword.friend.testnet friend.testnet
near create-account crossword.friend.testnet --masterAccount friend.testnet
# Deploy
near deploy crossword.friend.testnet --wasmFile res/my_crossword.wasm
# Call the "new" method
near call crossword.friend.testnet new '{"solution": "69c2feb084439956193f4c21936025f14a5a5a78979d67ae34762e18a7206a0f"}' --accountId crossword.friend.testnet
```
Now the crossword solution, as a hash, is stored instead. If you try calling the last command again, you'll get the error message, thanks to the `#[init]` macro:
`The contract has already been initialized`
## First use of Batch Actions
This is close to what we want, but what if a person deploys their smart contract and **someone else** quickly calls the `new` function before them? We want to make sure the same person who deployed the contract sets the solution, and we can do this using Batch Actions. Besides, why send two transactions when we can do it in one? (Technical details covered in the spec for a [batch transaction here](https://nomicon.io/RuntimeSpec/Transactions.html?highlight=batch#batched-transaction).)
<figure>
<img src={batchCookieTray} alt="Cookie sheet representing a transaction, where cookies are Deploy and FunctionCall Actions. Art created by dobulyo.near."/>
<figcaption className="full-width">Art by <a href="https://twitter.com/w_artsu" target="_blank">dobulyo.near</a></figcaption>
</figure><br/>
:::info Batch Actions in use
Batch Actions are common in this instance, where we want to deploy and call an initialization function. They're also common when using a factory pattern, where a subaccount is created, a smart contract is deployed to it, a key is added, and a function is called.
Here's a truncated snippet from a useful (though somewhat advanced) repository with a wealth of useful code:
<Github language="rust" start="172" end="177" url="https://github.com/near/core-contracts/blob/1720c0cfee238974ebeae8ad43076abeb951504f/staking-pool-factory/src/lib.rs" />
We'll get into Actions later in this tutorial, but in the meantime here's a handy [reference from the spec](https://nomicon.io/RuntimeSpec/Actions.html).
:::
As you can see from the info bubble above, we can batch [Deploy](https://docs.rs/near-sdk/3.1.0/near_sdk/struct.Promise.html#method.deploy_contract) and [FunctionCall](https://docs.rs/near-sdk/3.1.0/near_sdk/struct.Promise.html#method.function_call) Actions. This is exactly what we want to do for our crossword puzzle, and luckily, NEAR CLI has a [flag especially for this](https://docs.near.org/tools/near-cli#near-deploy).
Let's run this again with the handy `--initFunction` and `--initArgs` flags:
```bash
# Create fresh account if you wish, which is good practice
near delete crossword.friend.testnet friend.testnet
near create-account crossword.friend.testnet --masterAccount friend.testnet
# Deploy
near deploy crossword.friend.testnet --wasmFile res/my_crossword.wasm \
--initFunction 'new' \
--initArgs '{"solution": "69c2feb084439956193f4c21936025f14a5a5a78979d67ae34762e18a7206a0f"}'
```
Now that we're using Batch Actions, no one can call this `new` method before us.
:::note Batch action failures
If one Action in a set of Batch Actions fails, the entire transaction is reverted. This is good to note because sharded, proof-of-stake systems do not work like proof-of-work where a complex transaction with multiple cross-contract calls reverts if one call fails. With NEAR, cross-contract calls use callbacks to ensure expected behavior, but we'll get to that later.
:::
## Get ready for our frontend
In the previous section we showed that we could use a `curl` command to view the state of the contract without explicitly having a function that returns a value from state. Now that we've demonstrated that and hashed the solution, let's add a short view-only function `get_solution`.
In the next section we'll add a simple frontend for our single, hardcoded crossword puzzle. We'll want to easily call a function to get the final solution hash. We can use this opportunity to remove the function `get_puzzle_number` and the constant it returns, as these were used for informative purposes only.
We'll also modify our `guess_solution` to return a boolean value, which will also make things easier for our frontend.
<Github language="rust" start="19" end="34" url="https://github.com/near-examples/crossword-tutorial-chapter-1/blob/94f42e75cf70ed2aafb9c29a1faa1e21f079a49e/contract/src/lib.rs" />
The `get_solution` method can be called with:
```
near view crossword.friend.testnet get_solution
```
In the next section we'll add a simple frontend. Following chapters will illustrate more NEAR concepts built on top of this idea.
|
# Multi Token Standard Approval Management
:::caution
This is part of the proposed spec [NEP-245](https://github.com/near/NEPs/blob/master/neps/nep-0245.md) and is subject to change.
:::
Version `1.0.0`
## Summary
A system for allowing a set of users or contracts to transfer specific tokens on behalf of an owner. Similar to approval management systems in standards like [ERC-721] and [ERC-1155].
[ERC-721]: https://eips.ethereum.org/EIPS/eip-721
[ERC-1155]: https://eips.ethereum.org/EIPS/eip-1155
## Motivation
People familiar with [ERC-721] may expect to need an approval management system for basic transfers, where a simple transfer from Alice to Bob requires that Alice first _approve_ Bob to spend one of her tokens, after which Bob can call `transfer_from` to actually transfer the token to himself.
NEAR's [core Multi Token standard](README.md) includes good support for safe atomic transfers without such complexity. It even provides "transfer and call" functionality (`mt_transfer_call`) which allows specific tokens to be "attached" to a call to a separate contract. For many token workflows, these options may circumvent the need for a full-blown Approval Management system.
However, some Multi Token developers, marketplaces, dApps, or artists may require greater control. This standard provides a uniform interface allowing token owners to approve other NEAR accounts, whether individuals or contracts, to transfer specific tokens on the owner's behalf.
Prior art:
- Ethereum's [ERC-721]
- Ethereum's [ERC-1155]
## Example Scenarios
Let's consider some examples. Our cast of characters & apps:
* Alice: has account `alice` with no contract deployed to it
* Bob: has account `bob` with no contract deployed to it
* MT: a contract with account `mt`, implementing only the [Multi Token Standard](Core.md) with this Approval Management extension
* Market: a contract with account `market` which sells tokens from `mt` as well as other token contracts
* Bazaar: similar to Market, but implemented differently (spoiler alert: has no `mt_on_approve` function!), has account `bazaar`
Alice and Bob are already [registered](../../StorageManagement.md) with MT, Market, and Bazaar, and Alice owns a token on the MT contract with ID=`"1"` and a fungible-style token with ID=`"2"` and AMOUNT=`"100"`.
Let's examine the technical calls through the following scenarios:
1. [Simple approval](#1-simple-approval): Alice approves Bob to transfer her token.
2. [Approval with cross-contract call (XCC)](#2-approval-with-cross-contract-call): Alice approves Market to transfer one of her tokens and passes `msg` so that MT will call `mt_on_approve` on Market's contract.
3. [Approval with XCC, edge case](#3-approval-with-cross-contract-call-edge-case): Alice approves Bazaar and passes `msg` again, but what's this? Bazaar doesn't implement `mt_on_approve`, so Alice sees an error in the transaction result. Not to worry, though, she checks `mt_is_approved` and sees that she did successfully approve Bazaar, despite the error.
4. [Approval IDs](#4-approval-ids): Bob buys Alice's token via Market.
5. [Approval IDs, edge case](#5-approval-ids-edge-case): Bob transfers the same token back to Alice, Alice re-approves Market & Bazaar. Bazaar has an outdated cache. Bob tries to buy from Bazaar at the old price.
6. [Revoke one](#6-revoke-one): Alice revokes Market's approval for this token.
7. [Revoke all](#7-revoke-all): Alice revokes all approval for this token.
### 1. Simple Approval
Alice approves Bob to transfer her tokens.
**High-level explanation**
1. Alice approves Bob
2. Alice queries the token to verify
**Technical calls**
1. Alice calls `mt::mt_approve({ "token_ids": ["1","2"], "amounts": ["1","100"], "account_id": "bob" })`. She attaches 1 yoctoⓃ (.000000000000000000000001Ⓝ). Using [NEAR CLI](https://docs.near.org/tools/near-cli) to make this call, the command would be:
near call mt mt_approve \
'{ "token_ids": ["1","2"], amounts: ["1","100"], "account_id": "bob" }' \
--accountId alice --amount .000000000000000000000001
The response:
''
2. Alice calls view method `mt_is_approved`:
near view mt mt_is_approved \
'{ "token_ids": ["1", "2"], amounts:["1","100"], "approved_account_id": "bob" }'
The response:
true
### 2. Approval with cross-contract call
Alice approves Market to transfer some of her tokens and passes `msg` so that MT will call `mt_on_approve` on Market's contract. She probably does this via Market's frontend app which would know how to construct `msg` in a useful way.
**High-level explanation**
1. Alice calls `mt_approve` to approve `market` to transfer her token, and passes a `msg`
2. Since `msg` is included, `mt` will schedule a cross-contract call to `market`
3. Market can do whatever it wants with this info, such as listing the token for sale at a given price. The result of this operation is returned as the promise outcome to the original `mt_approve` call.
**Technical calls**
1. Using near-cli:
near call mt mt_approve '{
"token_ids": ["1","2"],
"amounts": ["1", "100"],
"account_id": "market",
"msg": "{\"action\": \"list\", \"price\": [\"100\",\"50\"],\"token\": \"nDAI\" }"
}' --accountId alice --amount .000000000000000000000001
At this point, near-cli will hang until the cross-contract call chain fully resolves, which would also be true if Alice used a Market frontend using [near-api-js](https://docs.near.org/tools/near-api-js/quick-reference). Alice's part is done, though. The rest happens behind the scenes.
2. `mt` schedules a call to `mt_on_approve` on `market`. Using near-cli notation for easy cross-reference with the above, this would look like:
near call market mt_on_approve '{
"token_ids": ["1","2"],
"amounts": ["1","100"],
"owner_id": "alice",
"approval_ids": ["4","5"],
"msg": "{\"action\": \"list\", \"price\": [\"100\",\"50\"], \"token\": \"nDAI\" }"
}' --accountId mt
3. `market` now knows that it can sell Alice's tokens for 100 [nDAI](https://explorer.mainnet.near.org/accounts/6b175474e89094c44da98b954eedeac495271d0f.factory.bridge.near) and 50 [nDAI](https://explorer.mainnet.near.org/accounts/6b175474e89094c44da98b954eedeac495271d0f.factory.bridge.near), and that when it transfers it to a buyer using `mt_batch_transfer`, it can pass along the given `approval_ids` to ensure that Alice hasn't changed her mind. It can schedule any further cross-contract calls it wants, and if it returns these promises correctly, Alice's initial near-cli call will resolve with the outcome from the final step in the chain. If Alice actually made this call from a Market frontend, the frontend can use this return value for something useful.
### 3. Approval with cross-contract call, edge case
Alice approves Bazaar and passes `msg` again. Maybe she actually does this via near-cli, rather than using Bazaar's frontend, because what's this? Bazaar doesn't implement `mt_on_approve`, so Alice sees an error in the transaction result.
Not to worry, though, she checks `mt_is_approved` and sees that she did successfully approve Bazaar, despite the error. She will have to find a new way to list her token for sale in Bazaar, rather than using the same `msg` shortcut that worked for Market.
**High-level explanation**
1. Alice calls `mt_approve` to approve `bazaar` to transfer her token, and passes a `msg`.
2. Since `msg` is included, `mt` will schedule a cross-contract call to `bazaar`.
3. Bazaar doesn't implement `mt_on_approve`, so this call results in an error. The approval still worked, but Alice sees an error in her near-cli output.
4. Alice checks if `bazaar` is approved, and sees that it is, despite the error.
**Technical calls**
1. Using near-cli:
near call mt mt_approve '{
"token_ids": ["1"],
"amounts: ["1000"],
"account_id": "bazaar",
"msg": "{\"action\": \"list\", \"price\": \"100\", \"token\": \"nDAI\" }"
}' --accountId alice --amount .000000000000000000000001
2. `mt` schedules a call to `mt_on_approve` on `bazaar`. Using near-cli notation for easy cross-reference with the above, this would look like:
near call bazaar mt_on_approve '{
"token_ids": ["1"],
"amounts": ["1000"],
"owner_id": "alice",
"approval_ids": [3],
"msg": "{\"action\": \"list\", \"price\": \"100\", \"token\": \"nDAI\" }"
}' --accountId mt
3. 💥 `bazaar` doesn't implement this method, so the call results in an error. Alice sees this error in the output from near-cli.
4. Alice checks if the approval itself worked, despite the error on the cross-contract call:
near view mt mt_is_approved \
'{ "token_ids": ["1","2"], "amounts":["1","100"], "approved_account_id": "bazaar" }'
The response:
true
### 4. Approval IDs
Bob buys Alice's token via Market. Bob probably does this via Market's frontend, which will probably initiate the transfer via a call to `ft_transfer_call` on the nDAI contract to transfer 100 nDAI to `market`. Like the MT standard's "transfer and call" function, [Fungible Token](../FungibleToken/Core.md)'s `ft_transfer_call` takes a `msg` which `market` can use to pass along information it will need to pay Alice and actually transfer the MT. The actual transfer of the MT is the only part we care about here.
**High-level explanation**
1. Bob signs some transaction which results in the `market` contract calling `mt_transfer` on the `mt` contract, as described above. To be trustworthy and pass security audits, `market` needs to pass along `approval_id` so that it knows it has up-to-date information.
**Technical calls**
Using near-cli notation for consistency:
near call mt mt_transfer '{
"receiver_id": "bob",
"token_id": "1",
"amount": "1",
"approval_id": 2,
}' --accountId market --amount .000000000000000000000001
### 5. Approval IDs, edge case
Bob transfers the same token back to Alice, Alice re-approves Market & Bazaar, listing her token at a higher price than before. Bazaar is somehow unaware of these changes, and still stores `approval_id: 3` internally along with Alice's old price. Bob tries to buy from Bazaar at the old price. Like the previous example, this probably starts with a call to a different contract, which eventually results in a call to `mt_transfer` on `bazaar`. Let's consider a possible scenario from that point.
**High-level explanation**
Bob signs some transaction which results in the `bazaar` contract calling `mt_transfer` on the `mt` contract, as described above. To be trustworthy and pass security audits, `bazaar` needs to pass along `approval_id` so that it knows it has up-to-date information. It does not have up-to-date information, so the call fails. If the initial `mt_transfer` call is part of a call chain originating from a call to `ft_transfer_call` on a fungible token, Bob's payment will be refunded and no assets will change hands.
**Technical calls**
Using near-cli notation for consistency:
near call mt mt_transfer '{
"receiver_id": "bob",
"token_id": "1",
"amount": "1",
"approval_id": 3,
}' --accountId bazaar --amount .000000000000000000000001
### 6. Revoke one
Alice revokes Market's approval for this token.
**Technical calls**
Using near-cli:
near call mt mt_revoke '{
"account_id": "market",
"token_ids": ["1"],
}' --accountId alice --amount .000000000000000000000001
Note that `market` will not get a cross-contract call in this case. The implementors of the Market app should implement [cron](https://en.wikipedia.org/wiki/Cron)-type functionality to intermittently check that Market still has the access they expect.
### 7. Revoke all
Alice revokes all approvals for these tokens.
**Technical calls**
Using near-cli:
near call mt mt_revoke_all '{
"token_ids": ["1", "2"],
}' --accountId alice --amount .000000000000000000000001
Again, note that no previous approvers will get cross-contract calls in this case.
## Reference-level explanation
The `TokenApproval` structure returned by `mt_token_approvals` contains the `approved_account_ids` field, which is a map of account IDs to `Approval`, and `approval_owner_id`, which is the account that owns the token the approvals apply to. The `amount` field, though wrapped in quotes and treated like a string, is stored as an unsigned 128-bit integer. Using TypeScript's [Record type](https://www.typescriptlang.org/docs/handbook/utility-types.html#recordkeystype) notation:
```diff
+ type Approval = {
+ amount: string
+ approval_id: string
+ }
+
+ type TokenApproval = {
+ approval_owner_id: string,
+ approved_account_ids: Record<string, Approval>,
+ };
```
Example token approval data:
```json
[{
  "approval_owner_id": "alice.near",
  "approved_account_ids": {
    "bob.near": {
      "amount": "100",
      "approval_id": 1
    },
    "carol.near": {
      "amount": "2",
      "approval_id": 2
    }
  }
}]
```
### What is an "approval ID"?
This is a unique number given to each approval that allows well-intentioned marketplaces or other 3rd-party MT resellers to avoid a race condition. The race condition occurs when:
1. A token is listed in two marketplaces, which are both saved to the token as approved accounts.
2. One marketplace sells the token, which clears the approved accounts.
3. The new owner sells back to the original owner.
4. The original owner approves the token for the second marketplace again to list at a new price. But for some reason the second marketplace still lists the token at the previous price and is unaware of the transfers happening.
5. The second marketplace, operating from old information, attempts to again sell the token at the old price.
Note that while this describes an honest mistake, the possibility of such a bug can also be taken advantage of by malicious parties via [front-running](https://users.encs.concordia.ca/~clark/papers/2019_wtsc_front.pdf).
To avoid this possibility, the MT contract generates a unique approval ID each time it approves an account. Then when calling `mt_transfer`, `mt_transfer_call`, `mt_batch_transfer`, or `mt_batch_transfer_call` the approved account passes `approval_id` or `approval_ids` with this value to make sure the underlying state of the token(s) hasn't changed from what the approved account expects.
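The following TypeScript sketch is not part of the standard; it only illustrates how an implementation might use the stored `Approval` data (defined below in the reference-level explanation) to reject a transfer carrying a stale `approval_id`, as in scenario 5. The helper name and error handling are assumptions:

```ts
type Approval = { amount: string; approval_id: string };

// Reject a transfer whose sender is not approved, or whose approval_id no longer matches
// the approval most recently granted by the token owner.
function assertApproval(
  approvedAccountIds: Record<string, Approval>,
  senderId: string,
  approvalId: string | null
): void {
  const approval = approvedAccountIds[senderId];
  if (!approval) {
    throw new Error(`${senderId} is not approved for this token`);
  }
  if (approvalId !== null && approvalId !== approval.approval_id) {
    // The sender is operating on outdated information (e.g. a re-approval happened),
    // so the transfer must fail rather than execute at a stale price.
    throw new Error("approval_id does not match the current approval");
  }
}
```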
Keeping with the example above, say the initial approval of the second marketplace generated the following `approved_account_ids` data:
```json
{
  "approval_owner_id": "alice.near",
  "approved_account_ids": {
    "marketplace_1.near": {
      "approval_id": 1,
      "amount": "100"
    },
    "marketplace_2.near": {
      "approval_id": 2,
      "amount": "50"
    }
  }
}
```
But after the transfers and re-approval described above, the token might have `approved_account_ids` as:
```json
{
"approval_owner_id": "alice.near",
"approved_account_ids": {
"marketplace_2.near": {
"approval_id": 3,
"amount": "50",
}
}
}
```
The marketplace then tries to call `mt_transfer`, passing outdated information:
```bash
# oops!
near call mt-contract.near mt_transfer '{"account_id": "someacct", "amount":"50", "approval_id": 2 }'
```
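To make the protection concrete, here is a minimal TypeScript sketch of the check a conforming MT contract performs before honoring an approved transfer: the stored approval for the caller must match the expected `approval_id` and hold a sufficient approved amount. This is illustrative only; the helper name `assertApprovedTransfer` and the in-memory data are assumptions, not part of the standard.

```ts
// Illustrative sketch only; a real contract reads approvals from on-chain state.
type Approval = { amount: string; approval_id: string };
type TokenApproval = {
  approval_owner_id: string;
  approved_account_ids: Record<string, Approval>;
};

// Hypothetical helper: validates an approved transfer request against stored state.
function assertApprovedTransfer(
  stored: TokenApproval,
  senderId: string,            // the approved account attempting the transfer
  amount: bigint,              // number of tokens to transfer
  expectedApprovalId?: string  // approval_id the sender believes is current
): void {
  const approval = stored.approved_account_ids[senderId];
  if (approval === undefined) {
    throw new Error(`${senderId} is not approved for this token`);
  }
  if (expectedApprovalId !== undefined && approval.approval_id !== expectedApprovalId) {
    // The approval was re-granted since the sender last looked: abort instead of
    // transferring on stale terms (this is what protects marketplace_2 above).
    throw new Error("approval_id mismatch: approvals changed since last read");
  }
  if (BigInt(approval.amount) < amount) {
    throw new Error("approved amount is insufficient for this transfer");
  }
}

// Usage with the stale data from the example above: approval_id 2 no longer matches 3.
const current: TokenApproval = {
  approval_owner_id: "alice.near",
  approved_account_ids: {
    "marketplace_2.near": { approval_id: "3", amount: "50" },
  },
};
assertApprovedTransfer(current, "marketplace_2.near", 50n, "2"); // throws
```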
### Interface
The MT contract must implement the following methods:
```ts
/******************/
/* CHANGE METHODS */
/******************/
// Add an approved account for a specific set of tokens.
//
// Requirements
// * Caller of the method must attach a deposit of at least 1 yoctoⓃ for
// security purposes
// * Contract MAY require caller to attach larger deposit, to cover cost of
// storing approver data
// * Contract MUST panic if called by someone other than token owner
// * Contract MUST panic if addition would cause `mt_revoke_all` to exceed
// single-block gas limit. See below for more info.
// * Contract MUST increment approval ID even if re-approving an account
// * If successfully approved or if had already been approved, and if `msg` is
// present, contract MUST call `mt_on_approve` on `account_id`. See
// `mt_on_approve` description below for details.
//
// Arguments:
// * `token_ids`: the token ids for which to add an approval
// * `account_id`: the account to add to `approved_account_ids`
// * `amounts`: the number of tokens to approve for transfer per token ID, passed as
//   an array of strings, although each number will be stored as an unsigned
//   128-bit integer.
// * `msg`: optional string to be passed to `mt_on_approve`
//
// Returns void, if no `msg` given. Otherwise, returns promise call to
// `mt_on_approve`, which can resolve with whatever it wants.
function mt_approve(
token_ids: [string],
amounts: [string],
account_id: string,
msg: string|null,
): void|Promise<any> {}
// Revoke an approved account for a specific token.
//
// Requirements
// * Caller of the method must attach a deposit of 1 yoctoⓃ for security
// purposes
// * If contract requires >1yN deposit on `mt_approve`, contract
// MUST refund associated storage deposit when owner revokes approval
// * Contract MUST panic if called by someone other than token owner
//
// Arguments:
// * `token_ids`: the token IDs for which to revoke the approval
// * `account_id`: the account to remove from `approvals`
function mt_revoke(
token_ids: [string],
account_id: string
) {}
// Revoke all approved accounts for a specific token.
//
// Requirements
// * Caller of the method must attach a deposit of 1 yoctoⓃ for security
// purposes
// * If contract requires >1yN deposit on `mt_approve`, contract
// MUST refund all associated storage deposit when owner revokes approved_account_ids
// * Contract MUST panic if called by someone other than token owner
//
// Arguments:
// * `token_ids`: the token ids with approved_account_ids to revoke
function mt_revoke_all(token_ids: [string]) {}
/****************/
/* VIEW METHODS */
/****************/
// Check if tokens are approved for transfer by a given account, optionally
// checking an approval_id
//
// Requirements:
// * Contract MUST panic if `approval_ids` is not null and the length of
//   `approval_ids` is not equal to the length of `token_ids`
//
// Arguments:
// * `token_ids`: the tokens for which to check an approval
// * `approved_account_id`: the account to check the existence of in `approved_account_ids`
// * `amounts`: the positionally corresponding amount for each `token_id` that must be
//   approved at a minimum, passed as an array of strings, although each number will
//   be stored as an unsigned 128-bit integer.
// * `approval_ids`: an optional array of approval IDs to check against
// current approval IDs for given account and `token_ids`.
//
// Returns:
// if `approval_ids` is given, returns `true` if `approved_account_id` is approved with the
// given approval IDs and has at least the specified amounts approved; otherwise, returns
// `true` if `approved_account_id` is in the list of approved accounts and has at least the
// specified amounts approved; returns `false` in all other cases
function mt_is_approved(
token_ids: [string],
approved_account_id: string,
amounts: [string],
approval_ids: number[]|null
): boolean {}
// Get the list of approvals for a given token_id and account_id
//
// Arguments:
// * `token_id`: the token for which to check an approval
// * `account_id`: the account to retrieve approvals for
//
// Returns a TokenApproval object, as described in Approval Management standard
function mt_token_approval(
token_id: string,
account_id: string,
): TokenApproval {}
// Get a list of all approvals for a given token_id
//
// Arguments:
// * `token_id`: the token for which to list approvals
// * `from_index`: a string representing an unsigned 128-bit integer,
//    representing the starting index of approvals to return
// * `limit`: the maximum number of approvals to return
//
// Returns an array of TokenApproval objects, as described in Approval Management standard, and an empty array if there are no approvals
function mt_token_approvals(
token_id: string,
from_index: string|null, // default: "0"
limit: number|null,
): TokenApproval[] {}
```
### Why must `mt_approve` panic if `mt_revoke_all` would fail later?
In the description of `mt_approve` above, it states:
Contract MUST panic if addition would cause `mt_revoke_all` to exceed
single-block gas limit.
What does this mean?
First, it's useful to understand what we mean by "single-block gas limit". This refers to the [hard cap on gas per block at the protocol layer](https://docs.near.org/docs/concepts/gas#thinking-in-gas). This number will increase over time.
Removing data from a contract uses gas, so if an MT had a large enough number of approvals, `mt_revoke_all` would fail, because calling it would exceed the maximum gas.
Contracts must prevent this by capping the number of approvals for a given token. However, it is up to contract authors to determine a sensible cap for their contract (and the single block gas limit at the time they deploy). Since contract implementations can vary, some implementations will be able to support a larger number of approvals than others, even with the same maximum gas per block.
Contract authors may choose to set a cap of something small and safe like 10 approvals, or they could dynamically calculate whether a new approval would break future calls to `mt_revoke_all`. But every contract MUST ensure that they never break the functionality of `mt_revoke_all`.
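As a sketch of the simpler approach (a fixed cap, using the value of 10 mentioned above), the following TypeScript fragment shows the kind of guard `mt_approve` can apply before persisting a new approval. It is purely illustrative: the `MAX_APPROVALS_PER_TOKEN` constant, the `nextApprovalId` callback, and the in-memory approvals map are assumptions, not part of the standard.

```ts
// Illustrative only: a fixed, conservative cap chosen so that mt_revoke_all can
// always clear every approval for a token within the single-block gas limit.
const MAX_APPROVALS_PER_TOKEN = 10; // assumed value; tune per contract and gas limit

type Approval = { amount: string; approval_id: string };

function addApproval(
  approvals: Record<string, Approval>, // approved_account_ids for one token
  accountId: string,
  amount: string,
  nextApprovalId: () => string
): void {
  const isReapproval = accountId in approvals;
  const resultingCount = Object.keys(approvals).length + (isReapproval ? 0 : 1);
  if (resultingCount > MAX_APPROVALS_PER_TOKEN) {
    // Panic rather than store an approval that could later make
    // mt_revoke_all exceed the single-block gas limit.
    throw new Error(
      `cannot approve more than ${MAX_APPROVALS_PER_TOKEN} accounts per token`
    );
  }
  // The approval ID is incremented even when re-approving an existing account.
  approvals[accountId] = { amount, approval_id: nextApprovalId() };
}
```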
### Approved Account Contract Interface
If a contract that gets approved to transfer MTs wants to, it can implement `mt_on_approve` to update its own state when granted approval for a token:
```ts
// Respond to notification that contract has been granted approval for a token.
//
// Notes
// * Contract knows the token contract ID from `predecessor_account_id`
//
// Arguments:
// * `token_ids`: the token_ids to which this contract has been granted approval
// * `amounts`: the positionally corresponding amount approved for each token_id,
//    passed as an array of strings, although each number will be stored
//    as an unsigned 128-bit integer.
// * `owner_id`: the owner of the token
// * `approval_ids`: the approval IDs stored by the MT contract for these approvals.
//    Each is expected to be a number within the 2^53 limit representable by JSON.
// * `msg`: specifies information needed by the approved contract in order to
// handle the approval. Can indicate both a function to call and the
// parameters to pass to that function.
function mt_on_approve(
token_ids: [TokenId],
amounts: [string],
owner_id: string,
approval_ids: [number],
msg: string,
) {}
```
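For example, a marketplace receiving this callback might parse `msg` as a listing request and record one listing per approved token. The TypeScript sketch below is illustrative only; the `Listing` type, the `listings` store, and the `msg` format are assumptions rather than part of the standard.

```ts
// Illustrative marketplace-side handler; not part of the standard.
type Listing = { ownerId: string; amount: string; approvalId: number; price: string };

// Assumed in-memory store keyed by token ID; a real contract would persist this on chain.
const listings: Record<string, Listing> = {};

function mt_on_approve(
  token_ids: string[],
  amounts: string[],
  owner_id: string,
  approval_ids: number[],
  msg: string
): void {
  // Assumed msg format: {"action":"list","prices":["10","20",...]} (one price per token).
  const { action, prices } = JSON.parse(msg) as { action: string; prices: string[] };
  if (action !== "list") return;

  token_ids.forEach((tokenId, i) => {
    // Remember the approval_id so later transfers can pass it and detect stale approvals.
    listings[tokenId] = {
      ownerId: owner_id,
      amount: amounts[i],
      approvalId: approval_ids[i],
      price: prices[i],
    };
  });
}
```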
Note that the MT contract will fire-and-forget this call, ignoring any return values or errors generated. This means that even if the approved account does not have a contract or does not implement `mt_on_approve`, the approval will still work correctly from the point of view of the MT contract.
Further note that there is no parallel `mt_on_revoke` when revoking either a single approval or when revoking all. This is partially because scheduling many `mt_on_revoke` calls when revoking all approvals could incur prohibitive [gas fees](https://docs.near.org/docs/concepts/gas). Apps and contracts which cache MT approvals can therefore not rely on having up-to-date information, and should periodically refresh their caches. Since this will be the necessary reality for dealing with `mt_revoke_all`, there is no reason to complicate `mt_revoke` with an `mt_on_revoke` call.
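A minimal sketch of such a cache refresh, assuming a `viewMtIsApproved` callback that performs the `mt_is_approved` view call through whatever RPC client the app already uses (the callback, cache shape, and polling interval are illustrative, not prescribed by the standard):

```ts
// Illustrative cache-refresh routine for an app that tracks MT approvals off chain.
type CachedListing = { tokenId: string; amount: string; approvalId: number };

// Assumed callback: performs the mt_is_approved view call against the MT contract.
type MtIsApprovedView = (
  tokenIds: string[],
  approvedAccountId: string,
  amounts: string[],
  approvalIds: number[]
) => Promise<boolean>;

async function refreshApprovalCache(
  marketplaceId: string,
  cache: Map<string, CachedListing>,
  viewMtIsApproved: MtIsApprovedView
): Promise<void> {
  for (const [tokenId, listing] of cache) {
    const stillApproved = await viewMtIsApproved(
      [tokenId],
      marketplaceId,
      [listing.amount],
      [listing.approvalId]
    );
    if (!stillApproved) {
      // The owner revoked, or re-approved with a new approval_id; drop the listing.
      cache.delete(tokenId);
    }
  }
}

// e.g. schedule from a cron-style job:
// setInterval(() => refreshApprovalCache("market.near", cache, rpcView), 5 * 60_000);
```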
### No incurred cost for core MT behavior
MT contracts should be implemented in a way to avoid extra gas fees for serialization & deserialization of `approved_account_ids` for calls to `mt_*` methods other than `mt_tokens`. See `near-contract-standards` [implementation of `ft_metadata` using `LazyOption`](https://github.com/near/near-sdk-rs/blob/c2771af7fdfe01a4e8414046752ee16fb0d29d39/examples/fungible-token/ft/src/lib.rs#L71) as a reference example.
|
How the NEAR Foundation is Supporting Network Decentralization
NEAR FOUNDATION
October 30, 2020
Now that inflationary rewards are live, token holders are able to earn rewards from staking. Even without running a node, they can support the stability and security of NEAR by staking to validators. In just a few weeks, more than 250 million NEAR tokens have been staked, representing 25% of all NEAR in existence. We’re overwhelmed by the amount of community support and engagement that’s ignited over the past few weeks and are excited to continue helping fuel the community’s growth.
In this article we will explore the importance of distributing stake among validators, the NEAR Foundation’s efforts to help kickstart the validator community with Delegation Funding, and two new recipients of NEAR Foundation Stake.
Distributing Stake to Validators
On Proof of Stake blockchains, it is common for token holders to converge on the few staking pools run by validators that already have most of the stake as they seem the most popular. While this is normal and repays the hard work of validators, it can have an impact on the protocol reliability and governance.
On one hand, in the unlikely scenario that the top 25% of validators go offline in a PoS blockchain, 25% of the blocks will need to be re-allocated to other nodes, which translates to slower transaction confirmations, or smart contracts needing additional time to be executed, at least until these top validator nodes come back. This may become even more evident if the top 33% goes offline: at that point, the consensus would be missing, no new blocks would be produced, and no transactions or smart contracts would be processed. In the opposite situation, if the smallest validators go offline, the protocol will miss only a few blocks, which would be almost undetected by the user.
On the other hand, this convergence can influence governance: in a liquid democracy, PoS blockchains use the staking size to weight the vote from a validator: the higher the stake, the bigger the influence on the vote. This means that every staked token gives a perpetual vote right to the validator of choice, at least until funds are delegated to a different staking pool, or unstaked completely. In a scenario where a PoS protocol is voting for a controversial change, the same top 33% validators can control the vote outcome, thanks to the delegation in their staking pool. 67% of stake is required for validators to “vote yes” on a proposal, so the most powerful validators can provide enough voting power to stall the vote or keep things unchanged.
While it’s extremely unlikely that NEAR validators will decide against the interest of the community, the NEAR Foundation is actively working to keep protocol decentralization one of its top priorities and to have as many validators as possible within the top 33%.
With that, here are three new Governance- and Validator-related initiatives that the NEAR Foundation is excited to announce:
The NEAR Governance Community
Unstaking Foundation’s Tokens
NEAR Foundation Delegation Pilot Program
NEAR Governance Community
The NEAR Governance Community launched last week as a forum for the community to discuss topical NEAR protocol improvements and governance initiatives. The first two issues submitted are the Phase II Governance Review and the Block Producer Selection Algorithm. While the content of proposals may at times be too technical for most token holders to contribute to, validators are encouraged to be actively involved in the conversation to signal to the community that they are aligned with its needs.
As a token holder selecting validators to stake to, you can follow these conversations to better understand each validator's commitment and the direction they want to take the community, and choose to stake with validators that align with your beliefs regardless of the size of their staking pool.
Unstaking NEAR Foundation Tokens
With the launch of Phase One in August, the NEAR Foundation delegated 28.8 million tokens to a small number of staking pools, to kickstart the network and help validators become active while the community was claiming their own tokens.
This week, NEAR Foundation will withdraw its funds from the staking pools in the top 33% in order to help more evenly distribute stake among all validators now that more token holders from the community have started to stake. In alphabetical order, these validators are: astro-stakers.poolv1.near, bisontrails.poolv1.near, chorusone.poolv1.near, cryptium.poolv1.near, dokiacapital.poolv1.near, figment.poolv1.near, staked.poolv1.near, and zavodil.poolv1.near.
While these validators represent an example of integrity and commitment to NEAR Protocol, this operation will reduce the gap from the lower 67% of validators, and will free up additional resources to onboard more validators to NEAR (see below).
NEAR Foundation Delegation pilot
Initially introduced in the Stake Wars is over blog post, the NEAR Foundation has been piloting a program to select new validator candidates to stake to in order to help onboard them into the NEAR community.
Currently, the minimum stake to become a validator is above 2.2 million NEAR tokens. Below this threshold, the staking pool is not elected and would not have the option to earn token rewards from producing blocks. This is the typical chicken-and-egg problem: as a delegator, why should you give your stake to a pool that is not yet big enough to be elected and produce new blocks? Or if you are interested in being a validator, how will you raise enough stake to start producing blocks and generate rewards to make being a validator a viable endeavor?
The NEAR Foundation is helping to alleviate this cold start issue by providing a temporary delegation of between 1-4 million tokens as a form of “startup” stake to kickstart new validators’ participation in the active validator set, and help them attract delegation from their followers and the NEAR token holder community at large.
As mentioned above, eventually, the NEAR Foundation will progressively unstake these tokens from the validators, as it did with the top 33% validators in the list above, and re-allocate them to other staking pools to continue onboarding more validators to NEAR.
The detailed criteria for receiving and maintaining the NEAR Foundation’s stake are outlined in this blog post about the program, but primarily come down to:
Reviewing application information potential validators submit in this form
Asking potential validators to outline goals and KPIs they have to better the NEAR community using this template as an example.
This “startup” stake funding from the NEAR Foundation will help validators earn a seat in the active set and earn staking rewards from the network. Over time, as the minimum stake to be an active validator increases, validators will need to find ways to attract more stake from the community in order to remain in the active set. Practically speaking, if validators receive NEAR Foundation stake, but fail to deliver on their community growth and marketing initiatives, they will fall out of the active set by not attracting enough community token holder stake to maintain their position.
Foundation Delegation Pilot Program Recipients
We’re excited to announce that the Foundation has onboarded the first two organizations in this program to receive stake: Everstake and AUDIT.one.
Why Everstake
Everstake provided a solid growth plan, together with a strong track record of writing original content for their community of token holders. They have already published a blog post providing basic information about NEAR, and were able to grow their staking pool above 4 million tokens even before the NEAR Foundation provided its official support. This is a great example of driving growth from an existing community of token holders. A copy of their submitted request can be found here.
Why AUDIT.one
AUDIT.one is the validator arm of Persistence.one, one of the largest blockchain organizations in India. AUDIT.one is built on the belief that the Validators of today will be the Auditors of tomorrow with economic skin-in-the-game. They are very active on Cosmos, and recently they started to support Polkadot.
Similarly to Everstake, they have a strong track record in generating high-quality content for their followers, along with the translation of technical documentation and papers.
Overall, AUDIT.one is an ideal partner for bringing Hindi-speaking enthusiasts into the community and helping them delegate to a local validator.
Looking Forward
A number of great potential validator candidates have applied to the Foundation Delegation Pilot Program and we will continue to review and support the community as needed to foster the growth of a vibrant decentralized validator community on NEAR.
We encourage all token holders to get involved with the community and meet validator teams who are playing an active role in the health and governance of the NEAR ecosystem. The NEAR Foundation will continue to make this decision process for Foundation Delegation as transparent as possible, while encouraging validators who receive this temporary support to do their best to support the growth of the community as a whole.
In addition to technical prowess and uptime, validators who are providing educational content, exceptional customer service, and helpful tooling and explorers will be the best candidates for the future growth of NEAR.
Join the Community
If you are interested in becoming a validator and participating in this pilot project, submit your application from this link.
If you’re just learning about NEAR for the first time and are seeing this post, we invite you to join us on the journey to make blockchain accessible to everyday people who are excited for the future of the Open Web with NEAR. If you’re a developer, be sure to sign up for our developer program and join the conversation in our Discord chat. You can also keep up to date on future releases, announcements, and event invitations by subscribing to our newsletter for the latest news on NEAR. |
NEAR Foundation Hires Chris Ghent To Lead Global Brand and Communications Efforts
NEAR FOUNDATION
November 2, 2021
NEAR Foundation is delighted to announce that Chris Ghent has joined NEAR Foundation as Global Head of Brand Strategy & Partnerships. Chris will be spearheading the Foundation’s efforts to build on its strategy of creating a blockchain that is simple to use, incredibly secure, and infinitely scalable.
He’ll do that by looking to build the NEAR brand to help join the dots between product, the organization, and the community to ensure the NEAR ecosystem is working collectively to bring more people into web3 and deliver on its mission of mass adoption.
“Chris brings exactly what this ecosystem needs right now — a clear and experienced voice to help surface the stories and ideas that matter so everyone can see what the NEAR ecosystem is capable of and what it’s doing well right now,” says Erik Trautman, NEAR Foundation’s CEO.
An award-winning marketer and entrepreneur, Chris started his career launching an eCommerce retail start-up in 2005. In the marketing and communications industry, he has specialized in strategy first, digital-led, technology-powered, and data-driven initiatives. Delivering business strategies and marketing approaches that transform how companies go to market is his primary objective.
Chris has worked with leading brands including Breaker (formerly SingularDTV), First Republic Bank, Harman International Industries, Infiniti Motor Company Ltd, Johnson & Johnson, Keds, Nielsen, Novartis, Prestige Brands, and Stanley Black & Decker.
Chris has already spent three years in the crypto industry, previously leading the development of a US-based marketing team in NYC, which focused on global brand coordination and paid marketing initiatives in the Tezos ecosystem. He has been an advocate and early adopter of non-traditional blockchain-based advertising solutions, partnering with Brave Browser from alpha to beta to mobile capabilities, and believes that traditional mediums like digitally connected out-of-home media will be key to mass adoption. He looks forward to driving awareness of the NEAR brand and showing that the NEAR ecosystem is the home of crypto/blockchain innovation.
“Having worked in the crypto/blockchain space full-time since 2018, I’ve always been attracted to projects that have very clear value propositions yet have a hard time telling their story,” explains Ghent.
“Teams and communities in crypto are fantastic at focusing on amazing technological advantages, but often overlook the core reasons why new users join the space or project: simplicity, a compelling brand, and a number of key features that can’t be found anywhere else. One of those is seamless UX for onboarding users, which is what attracted me the most to NEAR. It’s worked harder than most to make someone’s first step into crypto simple.”
Crypto Barriers
Crypto has struggled to identify the needs of developers, engineers, and end-users curious about the industry. Is it easy to use? Can programming languages from Web2 be brought over to Web3? Are there hidden costs to deploying and maintaining code?
It’s on the protocols, their foundations, and their communities to solve these issues if mass adoption is to come to crypto. It’s something the NEAR Foundation is focused on and looking to build on over the next 12 months.
NEAR’s developer ecosystem, the community and founders, have consistently put the end-user at the heart of technological development and decision making. We often forget that the category is still relatively new, but NEAR has become an innovation inspiration for so many other projects and teams.
“I can’t wait to learn from what’s working best from the community and Foundation team to harness those insights in order to drive net-new developers into the ecosystem who will build even more of the most end-user friendly NEAR-based experiences.”
We are excited that Chris has joined our team to help us achieve these goals. His work will help NEAR continue to build out its brand and storytelling capability, while simultaneously creating better ways to support NEAR’s thriving ecosystem. |
---
id: run-rpc-node-without-nearup
title: Run an RPC Node
sidebar_label: Run a Node 🚀
sidebar_position: 2
description: How to run an RPC Node without nearup
---
The following instructions are applicable across localnet, testnet, and mainnet.
If you are looking to learn how to compile and run a NEAR RPC node natively for one of the following networks, this guide is for you.
- [`testnet`](/rpc/run-rpc-node-without-nearup#testnet)
- [`mainnet`](/rpc/run-rpc-node-without-nearup#mainnet)
<blockquote class="info">
<strong>Heads up</strong><br /><br />
Running an RPC node is very similar to running a [validator node](/validator/running-a-node), as both types of node use the same `nearcore` release. The main difference is that a validator node additionally requires a `validator_key.json`, which it uses to support its work of validating blocks and chunks on the network.
</blockquote>
## Prerequisites
- [Rust](https://www.rust-lang.org/). If not already installed, please [follow these instructions](https://doc.rust-lang.org/book/ch01-01-installation.html).
- [Git](https://git-scm.com/)
- Installed developer tools:
- MacOS
```bash
$ brew install cmake protobuf clang llvm awscli
```
- Linux
```bash
$ apt update
$ apt install -y git binutils-dev libcurl4-openssl-dev zlib1g-dev libdw-dev libiberty-dev cmake gcc g++ python docker.io protobuf-compiler libssl-dev pkg-config clang llvm cargo awscli
```
---
### Choosing your `nearcore` version
When building your NEAR node you will have two branch options to choose from depending on your desired use:
- `master` : _(**Experimental**)_
- Use this if you want to play around with the latest code and experiment. This branch is not guaranteed to be in a fully working state and there is absolutely no guarantee it will be compatible with the current state of *mainnet* or *testnet*.
- [`Latest stable release`](https://github.com/near/nearcore/tags) : _(**Stable**)_
- Use this if you want to run a NEAR node for *mainnet*. For *mainnet*, please use the latest stable release. This version is used by mainnet validators and other nodes and is fully compatible with the current state of *mainnet*.
- [`Latest release candidates`](https://github.com/near/nearcore/tags) : _(**Release Candidates**)_
- Use this if you want to run a NEAR node for *testnet*. For *testnet*, we first release an RC version and then later make that release stable. For *testnet*, please run the latest RC version.
## `testnet`
### 1. Clone `nearcore` project from GitHub
First, clone the [`nearcore` repository](https://github.com/near/nearcore).
```bash
$ git clone https://github.com/near/nearcore
$ cd nearcore
$ git fetch origin --tags
```
Check out the branch you need if it is not `master` (the default). The latest release is recommended. Please check the [releases page on GitHub](https://github.com/near/nearcore/releases).
```bash
$ git checkout tags/1.25.0 -b mynode
```
### 2. Compile `nearcore` binary
In the `nearcore` folder run the following commands:
```bash
$ make release
```
This will start the compilation process. It will take some time
depending on your machine power (e.g. i9 8-core CPU, 32 GB RAM, SSD
takes approximately 25 minutes). Note that compilation will need over
1 GB of memory per virtual core the machine has. If the build fails
with processes being killed, you might want to try reducing number of
parallel jobs, for example: `CARGO_BUILD_JOBS=8 make release`.
If you’re familiar with Cargo, you could also run `cargo build -p neard
--release` instead, which might or might not be equivalent to `make release`. It
is equivalent at the time of writing, but we don't guarantee this. If in doubt,
consult the `Makefile`, or just stick with `make release`.
The binary path is `target/release/neard`
### 3. Initialize working directory
The NEAR node requires a working directory with a couple of configuration files. Generate the initial required working directory by running:
```bash
$ ./target/release/neard --home ~/.near init --chain-id testnet --download-genesis --download-config
```
> You can specify trusted boot nodes that you'd like to use by passing in a flag during init: `--boot-nodes ed25519:[email protected]:24567,ed25519:[email protected]:24567,ed25519:[email protected]:24567,ed25519:[email protected]:24567`
> You can skip the `--home` argument if you are fine with the default working directory in `~/.near`. If not, pass your preferred location.
This command will create the required directory structure and will generate `config.json`, `node_key.json`, and `genesis.json` for `testnet` network.
- `config.json` - Configuration parameters which are responsible for how the node will work.
- `genesis.json` - A file with all the data the network started with at genesis. This contains initial accounts, contracts, access keys, and other records which represent the initial state of the blockchain.
- `node_key.json` - A file which contains a public and private key for the node. Also includes an optional `account_id` parameter which is required to run a validator node (not covered in this doc).
- `data/` - A folder in which a NEAR node will write its state.
> **Heads up**
> The genesis file for `testnet` is big (6GB +) so this command will be running for a while and no progress will be shown.
### 4. Replacing the `config.json`
In the generated `config.json`, there are two parameters to modify:
- `boot_nodes`: If you did not specify boot nodes during init in Step 3, the generated `config.json` contains an empty array, so you will need to replace it with a full one specifying the boot nodes.
- `tracked_shards`: In the generated `config.json`, this field is empty. You will have to set it to `"tracked_shards": [0]`.
To replace the `config.json`, run the following commands:
```bash
$ rm ~/.near/config.json
$ wget https://s3-us-west-1.amazonaws.com/build.nearprotocol.com/nearcore-deploy/testnet/config.json -P ~/.near/
```
### 5. Get data backup
The node is ready to be started. However, you must first sync up with the network. This means your node needs to download all the headers and blocks that other nodes in the network already have.
```bash
$ aws s3 --no-sign-request cp s3://near-protocol-public/backups/testnet/rpc/latest .
$ LATEST=$(cat latest)
$ aws s3 --no-sign-request cp --recursive s3://near-protocol-public/backups/testnet/rpc/$LATEST ~/.near/data
```
### 6. Run the node
To start your node simply run the following command:
```bash
$ ./target/release/neard --home ~/.near run
```
That's all. The node is running and you can see log outputs in your console. It will download a bit of missing data since the last backup was performed, but it shouldn't take much time.
## `mainnet`
### 1. Clone `nearcore` project from GitHub
First, clone the [`nearcore` repository](https://github.com/near/nearcore).
```bash
$ git clone https://github.com/near/nearcore
$ cd nearcore
$ git fetch origin --tags
```
Next, check out the release branch you need (recommended) if you will not be using the default `master` branch. Please check the [releases page on GitHub](https://github.com/near/nearcore/releases) for the latest release.
For more information on choosing between `master` and latest release branch [ [click here](/validator/compile-and-run-a-node#choosing-your-nearcore-version) ].
```bash
$ git checkout tags/1.25.0 -b mynode
```
### 2. Compile `nearcore` binary
In the `nearcore` folder run the following commands:
```bash
$ make release
```
This will start the compilation process. It will take some time
depending on your machine power (e.g. i9 8-core CPU, 32 GB RAM, SSD
takes approximately 25 minutes). Note that compilation will need over
1 GB of memory per virtual core the machine has. If the build fails
with processes being killed, you might want to try reducing number of
parallel jobs, for example: `CARGO_BUILD_JOBS=8 make release`.
If you’re familiar with Cargo, you could also run `cargo build -p neard
--release` instead, which might or might not be equivalent to `make release`. It
is equivalent at the time of writing, but we don't guarantee this. If in doubt,
consult the `Makefile`, or just stick with `make release`.
The binary path is `target/release/neard`
### 3. Initialize working directory
The NEAR node requires a working directory with a couple of configuration files. Generate the initial required working directory by running:
```bash
$ ./target/release/neard --home ~/.near init --chain-id mainnet --download-genesis --download-config
```
> You can specify trusted boot nodes that you'd like to use by passing in a flag during init: `--boot-nodes ed25519:[email protected]:24567,ed25519:[email protected]:24567,ed25519:[email protected]:24567,ed25519:[email protected]:24567,ed25519:[email protected]:24567`
> You can skip the `--home` argument if you are fine with the default working directory in `~/.near`. If not, pass your preferred location.
This command will create the required directory structure by generating a `config.json`, `node_key.json`, and downloads a `genesis.json` for `mainnet`.
- `config.json` - Configuration parameters which are responsible for how the node will work.
- `genesis.json` - A file with all the data the network started with at genesis. This contains initial accounts, contracts, access keys, and other records which represent the initial state of the blockchain.
- `node_key.json` - A file which contains a public and private key for the node. Also includes an optional `account_id` parameter which is required to run a validator node (not covered in this doc).
- `data/` - A folder in which a NEAR node will write its state.
### 4. Replacing the `config.json`
In the generated `config.json`, there are two parameters to modify:
- `boot_nodes`: If you did not specify boot nodes during init in Step 3, the generated `config.json` contains an empty array, so you will need to replace it with a full one specifying the boot nodes.
- `tracked_shards`: In the generated `config.json`, this field is empty. You will have to set it to `"tracked_shards": [0]`.
To replace the `config.json`, run the following commands:
```bash
$ rm ~/.near/config.json
$ wget https://s3-us-west-1.amazonaws.com/build.nearprotocol.com/nearcore-deploy/mainnet/config.json -P ~/.near/
```
### 5. Get data backup
The node is ready to be started. However, you must first sync up with the network. This means your node needs to download all the headers and blocks that other nodes in the network already have.
```bash
$ aws s3 --no-sign-request cp s3://near-protocol-public/backups/mainnet/rpc/latest .
$ LATEST=$(cat latest)
$ aws s3 --no-sign-request cp --recursive s3://near-protocol-public/backups/mainnet/rpc/$LATEST ~/.near/data
```
### 6. Run the node
To start your node simply run the following command:
```bash
$ ./target/release/neard --home ~/.near run
```
That's all. The node is running and you can see log outputs in your console. It will download a bit of missing data since the last backup was performed but it shouldn't take much time.
## Running a node in `light` mode
Running a node in `light` mode allows the operator to access chain level data, not state level data. You can also use the `light` node to submit transactions or verify certain proofs. To run a node in a `light` mode that doesn't track any shards, the only change required is to update the `config.json` whereby `tracked_shards` is set to an empty array.
```
"tracked_shards": []
```
> Got a question? [Ask it on StackOverflow!](https://stackoverflow.com/questions/tagged/nearprotocol)
|
NEAR Community Update: May 17th, 2019
COMMUNITY
May 17, 2019
We had a blast in NYC, kicking off Consensus week on Monday with a DeFi panel in a completely packed room. We followed it up the next day with a fascinating panel exploring the future of gaming in blockchain (meme of the week right below from this panel!). We also jumped on a panel on Staking and Validation. High quality edits will be available soon, so make sure to keep an eye on our twitter! On the development side, the major push has been the new wallet, economics design and the new version of Nightshade consensus. In community news, we’ve wrapped up the hackathon and have been having a great time in New York, collaborating and discussing the future of blockchain with some very smart and cool crypto teams!
Au revoir New York! Until next time!
Blockchain Gaming in 2019
Apparently “we’re starting!!!” @Mitch_Kosowski @NEARProtocol @neondistrictRPG @opensea pic.twitter.com/6ku8N0JeXh
— Matt Lockyer (@mattdlockyer) May 14, 2019
COMMUNITY AND EVENTS
NY Consensus Week
It has been a HUGE week for NEAR events while we’ve been in New York for Consensus Week.
First off on Monday was a panel on the Future of DeFi, moderated by NEAR’s cofounder Illia, which had a completely packed house despite the rainy weather.
If you missed the livestream you can watch it here; we will be releasing the full videos of both our panels next week.
Also on Monday, Alex spoke on a panel about sharding with Quarkchain, Harmony, Thunder, Ontology, ETH and Zilliqa.
Fun meetup here in NYC, a lot of different teams presenting their take on blockchain sharding: @harmonyprotocol @Quark_Chain @zilliqa @OntologyNetwork @NEARProtocol @ethereum and more.#Consensus2019 #BlockchainWeekNYC pic.twitter.com/HFjF4yesIp
— protolambda (@protolambda) May 14, 2019
On Tuesday we ran two events simultaneously – a Crypto Community Managers’ Happy Hour hosted by Jess, where we met community managers from around the globe.
We also had a panel on the Future of Gaming hosted by Sasha from NEAR. The livestream is here if you missed it and the full video will be up soon.
We are so grateful to Nomadworks NY for hosting us for all NEAR’s events in New York; everyone on the team there is amazing and we had a blast!
Finally, Illia presented on a panel on Proof of Stake networks alongside Tezos, Cosmos, Parity, and Solana Labs. Livestream here. Illia will also be presenting on the economics of validation at this event tonight.
Talking next gen proof of stake protocols with @SolanaLabs, @cosmos, @NEARProtocol, @tezos and @ParityTech pic.twitter.com/Iw5QQdGYXm
— Wilson Withiam (@WilsonWithiam) May 15, 2019
Hackathon
NEAR’s first hackathon, Hack.One, is closed, the judging completed, and the winners announced. Thanks so much to our esteemed judges Linda Xie (Co-founder & Managing Director of Scalar Capital), Peter Kieltyka (Co-Founder & CEO at Horizon Blockchain Games Inc.), and Sina Habibian (Advisor at the Ethereum Foundation) for their thoughtful and constructive feedback.
*Drumroll*
And our winners are:
1st place ($5000) – Team twitr, with a blockchain implementation of Twitter
2nd place ($3000) – Team Zod.TV, with a decentralized video transcoder
3rd place ($1000) – Team azban, with a decentralized economy for data storage
Thanks to all the teams that participated, and keep an eye out for our next hackathon.
ENGINEERING HIGHLIGHTS
The Wallet has gone through even more iterations and is very close to release. We’ve started breaking the mega Pull Request (that encapsulates all of the new consensus research up this point) into merge-able PRs. Lastly, we’ve begun the process of breaking our mono repo into many smaller ones, so that contributing is much easier.
There were 34 PRs in our multiple repos from 6 different authors over the last couple of weeks. As always, the code is completely open source on Github.
Application/Development Layer
Implemented RLP in assemblyscript.
Added testing to near-runtime-ts.
Nearlib
Extracted into separate GitHub repository and re-configured continuous integration
Improved error handling
Make continuous integration tests work with shared TestNet
Two version releases of near-shell, the dApp command line tool
NEARStudio
Tracking downloads in Google Analytics to see who moves from online IDE to local development
Improving downloaded code compatibility with CLI development tools
Wallet
Completed phone verification service
Integrated phone verification with blockchain
Integrated app-specific access keys into the wallet
Introduced common request error handling components
Added base64 encoding and decoding to near-runtime-ts
Blockchain Layer
Implemented economics that deducts fees directly from the balance instead of using gas/mana
Discussed new Nightshade consensus in Santa Clara
Currently breaking the gigantic research PR on the new consensus into smaller, mergeable PRs
Big refactoring PR of existing consensus to prepare for merge of Nightshade
HOW YOU CAN GET INVOLVED
Join us! If you want to work with one of the most talented teams in the world right now to solve incredibly hard problems, check out our careers page for openings. And tell your friends!
Learn more about NEAR in The Beginner’s Guide to NEAR Protocol. Stay up to date with what we’re building by following us on Twitter for updates, joining the conversation on Discord and subscribing to our newsletter to receive updates right to your inbox.
Reminder: you can help out significantly by adding your thoughts in our 2-minute Developer Survey.
|
NEAR at ETHDenver 2022 Highlights
COMMUNITY
February 24, 2022
Couldn’t make it to the NEAR Lounge for #BUIDL Week at this year’s ETHDenver? This post has you covered with highlights from events at the NEAR Lounge and ETHDenver stages, as well as a major ecosystem announcement from NEAR Co-Founder Illia Polosukhin.
NEAR went to ETHDenver because the community believes in a collaborative, decentralized, multi-chain ecosystem of Open Web and Metaverse platforms. Projects like Aurora and Rainbow Bridge, for example, have helped build simple and secure bridges between NEAR and Ethererum, which are working to foster a truly global network of open-source Web3 projects.
So, if you weren’t able to meet and connect with the NEAR developers and ecosystem partners building the decentralized multi-chain future, get up to speed with these highlights, and start learning to build at NEAR Education.
Day 1 @ NEAR Lounge – Building Bridges with Community
On Day 1 of #BUIDL Week at the NEAR Lounge, the focus was on how the NEAR community is building bridges for a multi-chain future.
Improving Web3 UX Through Appchain Technology
The first presenters at the NEAR Lounge were the team leads from Octopus Network, who detailed their efforts to help kickstart an innovation wave on Web3.
“So far, Web3 user application development hasn’t had great user experience because distributed ledger technology is more complex and expensive to build, which results in a downgraded UI,” said Octopus Network Editor Suzanne Leigh. “One of the key design considerations for Octopus is making Web3 accessible to all.”
NEAR Foundation’s Jacob Lindahl
Leigh noted that new frameworks, like Substrate and Cosmos SDK, give Web3 developers a much bigger and better design space to deliver fully optimized Web3 applications. As Leigh explained, Octopus Network does this through application-specific blockchains (aka, appchains).
To hear more from Leigh and the rest of the Octopus Network team, listen to the Twitter Spaces recording.
Introducing Pagoda, a Web3 Startup Platform
NEAR Co-Founder Illia Polosukhin took to the stage to announce the launch of Pagoda, the first-ever Web3 startup platform. Pagoda gives developers all the tools they need to build, create, and maintain Web3 applications.
Illia Polosukhin introduces Pagoda
To hear Illia’s Pagoda announcement in its entirety, check out the Twitter Spaces recording, which also includes Pagoda’s Min Zhang going into detail on how NEAR’s sharding approach works.
Developer Console and NEAR 101 Fundamentals
After Illia’s announcement, Pagoda’s Josh Quintal demoed Developer Console—Pagoda’s developer platform for creating and maintaining dapps on NEAR.
Developer Console features a number of interactive tutorials, a scalable RPC service, and operational metrics. It takes care of Web3 infrastructure so that developers can focus on what makes their dapps unique.
Josh Quintal talking through the Developer Console
To hear Josh explain how to deploy a smart contract and mint an NFT using Developer Console, be sure to listen to the Twitter Spaces recording. You can also tune in to the “NEAR 101 Fundamentals” talk hosted by NEAR Foundation’s Jacob Lindahl, another great primer on getting started building on NEAR.
Day 2 @ NEAR Lounge – Rust, Aurora, Metapool, and Brave
The second day of ETHDenver at NEAR Lounge was full of variety, with talks, panels, and workshops covering everything from NEAR’s Rust programming language to Web3 funding, Brave browser’s gaming ambitions, and more.
Following the morning’s “In Rust We Trust” brunch, Pagoda’s Austin Abell taught attendees how to use the Rust programming language to create smart contracts on the NEAR protocol. Pagoda software engineer Phuong Nguyen followed this up with a workshop on how to test NEAR contracts on the near-workspaces Github.
After Pagoda’s opening workshops, NEAR Foundation’s Jacob Lindahl returned to the stage for a talk on how to get started with NEAR University’s borderless education efforts. You can listen to Jacob’s remarks on the Twitter Spaces recordings, or just sign up for a class at NEAR University.
After lunch, Aurora team members took the stage to talk about building on Aurora, the EVM scaling solution built on NEAR. Alex Shevchenko, Founder and CEO of Aurora, explained how to build apps on Aurora, which Aurora’s Business Development Lead Danny Bocanegra followed up with “Getting Onboarded to Aurora’s Ecosystem”.
The Aurora-centric talks continued with a presentation by team members from Endemic, Chronicle, and TENKBay—three creator-focused NFT platforms recently launched on the Aurora EVM and NEAR Protocol.
Day 2 also featured a fascinating fireside chat between CoinFund’s Jake Brukhman and NEAR Foundation Grants’ Josh Daniels. Brukhman spent a good deal of time talking about the future of NFTs.
Brave’s Luke Mulks and NEAR Protocol Founder Illia Polosukhin
“Our thesis at CoinFund is that this is just the beginning,” he said. “We have the entire mainstream world to convert to NFTs.”
Team members from the privacy-preserving browser Brave, which sponsored a MetaBUILD 2 challenge, also took the stage at the NEAR Lounge. Jonathan Sampson, Head of Developer Relations, talked about how gaming NFTs are a path to mainstream adoption, while Luke Mulks, VP of Business Operations at Brave, had a fireside chat with Illia on frictionless onboarding for a multi-chain future.
Check out the Day 2 Twitter Spaces recording to hear these talks, as well as others on fractionalized assets, API3’s Aurora oracle solutions, and MoonNoobs, an Aurora-based platform for investing in the blockchain startups that are regularly surfacing throughout the NEAR ecosystem.
Day 3 @ NEAR Lounge – Research & DAOvelopment and Governauts Assemblage
Day 3 kicked off with the “NEAR DAOVersity Brunch”, an event co-hosted by NEAR Foundation CEO Marieke Flament, NEAR Co-Founder Illia Polosukhin, and Tracey Bowen, founder of H.E.R. DAO, a womxn-led developer DAO championing innovation and diversity.
The day of DAOs got started with the “Building (Rainbow) Bridges” panel, which showcased the NEAR community’s commitment and active work toward inclusivity. NEAR Foundation Grants’ Nicole Tay moderated the panel, which included Tracey Bowen, Mildred Idada (Founder, Open Web Collective), Kimberly Adams (Co-founder, Bridge Network), Barbara Liau (Co-Founder & COO, Nomad), and Kelsey Ruiz (Head of Content, Akash Network).
NEAR Foundation CEO Marieke Flament
Interested in others exploring Web3 inclusivity and DAOs on the NEAR ecosystem? Check out the work of Chloe Lewis of Marma J Foundation, who gave an excellent workshop on multi-DAO governance within NEAR DAOs. Also, dig into the work of speakers Eugene Levanthal (Smart Contract Research Forum), Livia Deschermeyer (Token Engineering Commons), and Jessica Zartler (Commons Stack/BlockScience Forum Governauts + Token Engineering Academy), each of whom is working to make Web3 spaces inclusive and welcoming.
Once immersed in DAOs and governance, check out AstroDAO, founded by Jordan Gray. AstroDAO is the NEAR ecosystem’s DAO-launching platform, and Gray shared a number of techniques and strategies for creating momentum for dapp communities on NEAR.
NEAR at the ETHDenver Web3 Castle
The NEAR community’s presence at ETHDenver continued after the talks, workshops, panels, and hacking at the NEAR Lounge. A number of folks from NEAR spoke on the Web3 Castle stages, exploring topics such as mainstream blockchain adoption, the multi-chain future, NEAR sharding, and more.
Illia Polosukhin Talks Pagoda and NEAR
A major event for the NEAR community at ETHDenver was Illia Polosukhin’s announcement of Pagoda, the first-ever Web3 startup platform. Illia began with a brief talk on NEAR’s sharding, then demonstrated how Pagoda’s suite of tools and resources, including Developer Console, can help anyone build and maintain Web3 apps, in a simplified and elegant way.
As Illia sees it, Pagoda will play a major role in launching the next generation of crypto startups. Check out his talk in the video below.
Illia was also part of the panel “Is Blockchain the Future of Technology”, led by Time reporter Andrew Chow. Watch it below.
NEAR Foundation CEO Marieke Flament on Blue Chip Forays into Crypto
A lot of blockchain developers are interested in how traditional companies and financial institutions will integrate crypto technology. As part of the panel “Blue Chip Forays Into Crypto,” NEAR Foundation CEO Marieke Flament leaned on her experience as CMO of Circle and CEO of Mettle to make some forecasts.
“We are seeing more and more blue chips getting into this space but they need to wrap their heads around it,” said Flament, who emphasized how integral security will be for blue chip adoption. “There is a lot of education to be done and also a lot of streamline usability needed to make sure they get to the same understanding [as Web2].”
Listen to Marieke’s full remarks in the video below.
Alex Skidanov Imagines the Multi-Chain Future
A major theme for the NEAR community during #BUIDL Week and ETHDenver Main Event was the multi-chain future, where various protocols collaborate to elevate Web3. NEAR Co-Founder Alex Skidanov discussed this topic as part of the panel “A Multi-Chain Future”, which you can watch below.
Alex Shevchenko Talks Aurora and Mainstream Adoption
Aurora Founder and CEO Alex Shevchenko spoke twice during the ETHDenver Main Event. In a talk at the Web3 Castle, Alex explained the two ingredients necessary to onboard the next billion users: a scalable, sharded blockchain infrastructure and mass-market user experience like simple subscription models and FaceID.
Alex also took part in the panel “EVM Interoperability Across Ecosystems” at the Art Hotel, which you can watch below.
And that’s a wrap! See you at the next ETHDenver!
|
---
id: sybil
title: Sybil Attacks
---
While developing your smart contract, keep in mind that an individual can potentially create multiple NEAR accounts. This is especially relevant in ecosystems involving crowd decisions, such as [DAOs](../../../2.build/5.primitives/dao.md).
Imagine that you open the voting to anyone in the community. If each account can cast a vote, a malicious actor could create multiple accounts and gain a disproportionately large influence on the result.
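As a minimal illustration in plain TypeScript (not tied to any specific SDK; the account IDs are made up), the tally below counts one vote per account, so ten throwaway accounts controlled by a single person outweigh nine genuine voters:

```ts
// Naive one-account-one-vote tally: every distinct account ID counts equally.
function tally(votes: Map<string, "yes" | "no">): { yes: number; no: number } {
  let yes = 0;
  let no = 0;
  for (const choice of votes.values()) {
    if (choice === "yes") yes++;
    else no++;
  }
  return { yes, no };
}

const votes = new Map<string, "yes" | "no">();

// Nine genuine community members vote "no".
for (let i = 1; i <= 9; i++) votes.set(`member-${i}.near`, "no");

// One attacker creates ten fresh accounts and votes "yes" with each of them.
for (let i = 1; i <= 10; i++) votes.set(`attacker-${i}.near`, "yes");

console.log(tally(votes)); // { yes: 10, no: 9 } - the attacker swings the vote
```
|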
AI is NEAR: The Future of AI on the Open Web at NEARCON ‘23
NEAR FOUNDATION
October 25, 2023
If you’re as excited as we are about the future of AI and blockchain, then get ready because NEARCON ’23 is just around the corner. If you haven’t registered yet, consider this your invitation to do so now on the official NEARCON ’23 site.
This year’s edition has a special focus on the intersection of AI and Web3. NEAR co-founder Illia Polosukhin, a pioneer in the AI world, is leading the charge. Let’s explore the amazing speakers sharing their perspectives on this game-changing new field in must-see NEARCON sessions.
Keynote by Cosmose AI’s Miron Mironiuk
Miron Mironiuk, the mind behind Cosmose AI, will enlighten NEARCON attendees on how AI is shaping the future of consumer behavior analysis with KAIKAI, its unique “Shoppertainment” app that seamlessly blends personalization, loyalty, and blockchain-based rewards. KAIKAI has consistently ranked in the top 3 dApps in Web3 since its launch.
AI on NEAR: The Future of Work and Governance by Illia Polosukhin
Illia Polosukhin, NEAR co-founder and head of Pagoda, will bring his deep roots in AI to this discussion of the future of AI strategy on NEAR. He’ll explore how NEAR is integrating AI to revolutionize work and evolve governance on the blockchain. Topics will include why AI needs blockchain and the emergence of new business models as both AI and blockchain mature.
Illia Polosukhin in Conversation With TechCrunch’s Mike Butcher
In this candid discussion, Illia will join TechCrunch’s Mike Butcher to discuss the evolving role of AI in the open web and what we can expect in the coming years. As the case for open source AI becomes clearer and as regulators across the world introduce new policies designed to mitigate risk, how can Web3 help this revolutionary tech evolve in a more positive direction? Join this fireside chat to learn more.
AI + Blockchain = Future of Work
This dynamic panel discussion will delve into the transformative potential of integrating AI and blockchain technologies in modern work environments. If you’re curious what the future of work will look like with AI on the open web, you’ll find those thoughts here.
AI+Web3 Keynote from Niraj Pant
Niraj Pant, previously an investor at Polychain and now building a startup at the intersection of AI and Web3, will provide key insights into how AI can be leveraged for decentralized applications and services.
AI Keynote by Alexander Skidanov
When Alexander Skidanov and Illia Polosukhin originally founded NEAR, it was as an AI company called NEAR.ai. After transforming NEAR into a blockchain network, and working on its well-known Nightshade sharding technique, Alex turned his attention back to AI and founded an AI startup building smarter large language models, or LLMs, in a more decentralized way.
To get more details on these sessions, and to check out the full speakers list and agenda, head on over to nearcon.org.
Wrap-Up and Next Steps
NEARCON ’23 is set to blow last year’s away, with some of the brightest thought leaders in AI and the open web. And with NEAR’s co-founder Illia Polosukhin being an OG in the AI space, you can bet that the AI track at NEARCON will be awesome. So whether you’re a developer, an entrepreneur, or just a curious mind, NEARCON ‘23 has something for you.
And let’s not forget, if you’re a student in Spain or Portugal, or a Ukrainian citizen living in Portugal, you’re eligible for free tickets. Hackers can also get in for free by participating in our hackathon event.
NEAR is already delivering on its promise of a blockchain that is simple, scalable, and secure. Now’s your chance to be part of bringing AI into the open web. Register for NEARCON ’23 today and witness firsthand how NEAR is shaping the future of AI and decentralization. |
---
id: update-contract-migrate-state
title: Self Upgrade & State Migration
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
import {CodeTabs, Language, Github} from "@site/src/components/codetabs"
Three examples of how to handle updates and [state migration](../../2.build/2.smart-contracts/release/upgrade.md):
1. [State Migration](https://github.com/near-examples/update-migrate-rust/tree/main/basic-updates): How to implement a `migrate` method to migrate state between contract updates.
2. [State Versioning](https://github.com/near-examples/update-migrate-rust/tree/main/enum-updates): How to use versioning on a state to simplify updating it later.
3. [Self Update](https://github.com/near-examples/update-migrate-rust/tree/main/self-updates): How to implement a contract that can update itself.
---
## State Migration
The [State Migration example](https://github.com/near-examples/update-migrate-rust/tree/main/basic-updates) shows how to handle state-breaking changes
between contract updates.
It is composed of 2 contracts:
1. Base: A Guest Book where people can write messages.
2. Update: An update in which we remove a parameter and change the internal structure.
<CodeTabs>
<Language value="rust" language="rust">
<Github fname="migrate.rs"
url="https://github.com/near-examples/update-migrate-rust/blob/main/basic-updates/update/src/migrate.rs"
start="18" end="45" />
</Language>
</CodeTabs>
#### The Migration Method
The migration method deserializes the current state (`OldState`) and iterates through the messages, updating them
to the new `PostedMessage` that includes the `payment` field.
:::tip
Notice that migrate is actually an [initialization method](../../2.build/2.smart-contracts/anatomy/anatomy.md#initialization-method) that ignores the existing state (`#[init(ignore_state)]`), thus being able to execute and rewrite the state.
:::
---
## State Versioning
The [State Versioning example](https://github.com/near-examples/update-migrate-rust/tree/main/enum-updates) shows how to use
[Enums](https://doc.rust-lang.org/book/ch06-01-defining-an-enum.html) to implement state versioning on a contract.
Versioning simplifies updating the contract since you only need to add a new version of the structure.
All versions can coexist, so you will not need to change previously existing structures.
The example is composed of 2 contracts:
1. Base: The Guest Book contract using versioned `PostedMessages` (`PostedMessagesV1`).
2. Update: An update that adds a new version of `PostedMessages` (`PostedMessagesV2`).
<CodeTabs>
<Language value="rust" language="rust">
<Github fname="versioned_msg.rs"
url="https://github.com/near-examples/update-migrate-rust/blob/main/enum-updates/update/src/versioned_msg.rs"
start="18" end="36" />
</Language>
</CodeTabs>
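The gist of the pattern is sketched below, with simplified fields and assuming near-sdk 4.x: old entries keep their `V1` layout on-chain, while the code only ever works with the latest version by converting on read.

```rust
use near_sdk::borsh::{self, BorshDeserialize, BorshSerialize};
use near_sdk::Balance;

#[derive(BorshDeserialize, BorshSerialize)]
pub struct PostedMessageV1 {
    pub premium: bool,
    pub sender: String,
    pub text: String,
}

#[derive(BorshDeserialize, BorshSerialize)]
pub struct PostedMessageV2 {
    pub payment: Balance, // field introduced by the update
    pub premium: bool,
    pub sender: String,
    pub text: String,
}

// Stored values are wrapped in the enum, so both versions can coexist in state
// and no `migrate` call is needed when the contract is redeployed.
#[derive(BorshDeserialize, BorshSerialize)]
pub enum VersionedPostedMessage {
    V1(PostedMessageV1),
    V2(PostedMessageV2),
}

impl From<VersionedPostedMessage> for PostedMessageV2 {
    fn from(message: VersionedPostedMessage) -> Self {
        match message {
            VersionedPostedMessage::V2(posted) => posted,
            VersionedPostedMessage::V1(posted) => PostedMessageV2 {
                payment: 0, // V1 messages never carried a payment
                premium: posted.premium,
                sender: posted.sender,
                text: posted.text,
            },
        }
    }
}
```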
---
## Self Update
The [Self Update example](https://github.com/near-examples/update-migrate-rust/tree/main/self-updates) shows how to implement a contract
that can update itself.
It is composed of 2 contracts:
1. Base: A Guest Book where people can write messages, implementing an `update_contract` method.
2. Update: An update in which we remove a parameter and change the internal structure.
<CodeTabs>
<Language value="rust" language="rust">
<Github fname="update.rs"
url="https://github.com/near-examples/update-migrate-rust/blob/main/self-updates/base/src/update.rs"
start="10" end="31" />
</Language>
</CodeTabs>
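A minimal sketch of what such an `update_contract` method can look like is shown below; it assumes near-sdk 4.x, omits the manager/access-control checks of the real example, and hard-codes the gas for the follow-up `migrate` call.

```rust
use near_sdk::borsh::{self, BorshDeserialize, BorshSerialize};
use near_sdk::{env, near_bindgen, Gas, PanicOnDefault, Promise};

// Gas reserved for calling `migrate` on the freshly deployed code (5 TGas).
const MIGRATE_GAS: Gas = Gas(5_000_000_000_000);

#[near_bindgen]
#[derive(BorshDeserialize, BorshSerialize, PanicOnDefault)]
pub struct GuestBook {
    // messages omitted for brevity
}

#[near_bindgen]
impl GuestBook {
    /// Takes the new Wasm binary as raw input, deploys it over the current
    /// account, and then triggers the state migration on the new code.
    pub fn update_contract(&self) -> Promise {
        // In the real example, only an authorized manager account may call this.
        let code: Vec<u8> = env::input().expect("no contract code attached");

        Promise::new(env::current_account_id())
            .deploy_contract(code)
            .function_call("migrate".to_string(), Vec::new(), 0, MIGRATE_GAS)
    }
}
```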
|
---
id: multichain-server
title: Multichain Relayer Server
sidebar_label: Multichain Relayer Server
---
The [Multichain Relayer Server](https://github.com/near/multichain-relayer-server) facilitates cross-chain transactions and enables Chain Abstraction.
## Overview
The main function of this server is interfacing with foreign chain RPCs: it sends the presigned funding transaction that covers gas and, once the funding is confirmed, the actual presigned user transaction.
Although the multichain relayer is a server in the current design of the system, the goal is to package it as a library that can be called on the client side of the wallet, making the system more decentralized.
:::tip
The Multichain Relayer is meant to be deployed alongside the [Gas Station Event Indexer](https://github.com/near/gas-station-event-indexer) on the same server, so that the gas station event indexer can call the multichain relayer server via IPC instead of sending the request over the network, which would introduce extra latency to the system.
:::
## Technical system design
Below is a design diagram of the entire multichain relayer system:
![multichain_relayer_technical_design.png](/docs/multichain_relayer_technical_design.png)
- The [gas station contract](https://github.com/near/multichain-gas-station-contract) and the [MPC signing service contract](https://github.com/near/mpc-recovery/tree/main/contract) are in the green box; these take place on NEAR.
- This multichain relayer server focuses on the purple/blue Multichain Relayer Core Backend Services Box in the middle and the connections to the XChain systems in the red box via RPCs.
- The XChain Settlement that's happening in the yellow box is currently manual and will be automated in the future.
## Paymaster
A paymaster represents an address on a destination chain that holds a balance of that chain’s native gas token:
- User addresses on destination chains will be funded directly from paymaster accounts.
- Partners that want to integrate with the Multichain Gas Relayer service need to create, fund, and manage paymaster accounts on the destination chains that they want to have support for.
- [Manual settlement](gas-station.md#settlement) between the [NEAR Gas Station contract](gas-station.md) and paymaster accounts is also required on a regular basis to ensure a consistent balance of funds.
## System workflow
1. The wallet sends a NEAR transaction to the gas station contract that contains 2 actions:
   1. A transfer of `NEAR` (or FT Transfer in the future) to cover gas on the foreign chain
   2. A `create_transaction` function call to the gas station contract `canhazgas.testnet` containing the unsigned foreign chain transaction to be signed by the MPC signing service, assuming the unsigned transaction passes validation.
2. The Gas Station Contract calls the MPC signing service to sign both a funding transaction, which ensures the user's foreign chain account has sufficient gas to execute the desired transaction, and signs the unsigned foreign chain transaction.
3. Upon receipt of both signed transactions, the Gas Station Contract emits an event, which is picked up by the indexer, which then calls `/send_funding_and_user_signed_txns` with the 2 signed transactions.
4. The multichain relayer server sends the funding transaction to the foreign chain RPC to fund the user's account with gas.
5. After the gas funding transaction is confirmed, the multichain relayer server calls the foreign chain RPC again to send the signed transaction initiated by the user in step 1.
6. The Cross Chain Settlement takes care of selling the extra `NEAR` being sent to the gas station contract for gas tokens on foreign chains as well as bridging the tokens to the other chains. This process is currently manual, but will be automated in partnership with market makers in the future.
## Relayer Server Endpoints
1. `/send_funding_and_user_signed_txns`, which handles both of the following (an illustrative call is sketched below):
   1. Funding the user's XChain account with gas from the paymaster treasury account, provided as a raw signed transaction
   2. Sending the user's raw signed transaction (hex-encoded in the EVM case) after the funding transaction has been confirmed on the foreign chain
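Purely as an illustration, the sketch below shows what a call to that endpoint could look like from a Rust client using `reqwest`. The endpoint path comes from this page, but the port and the payload field names are assumptions, not the documented schema; check the relayer's repository for the actual request format.

```rust
use serde_json::json;

#[tokio::main]
async fn main() -> Result<(), reqwest::Error> {
    // Hypothetical payload: the two raw signed transactions produced by the MPC signer.
    let body = json!({
        "foreign_chain_id": "97",                  // e.g. BSC testnet (assumed field name)
        "signed_funding_transaction": "0x02f8...", // pays gas to the user's address
        "signed_user_transaction": "0x02f8..."     // the transaction the user asked for
    });

    // Assumes the relayer listens locally; in production the indexer calls it via IPC.
    let response = reqwest::Client::new()
        .post("http://localhost:3030/send_funding_and_user_signed_txns")
        .json(&body)
        .send()
        .await?;

    println!("relayer replied with status {}", response.status());
    Ok(())
}
```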
## Supported Chains
- BSC testnet
- BSC mainnet, Ethereum mainnet, and more chains coming soon!
:::info
Check the Relayer's [GitHub repository](https://github.com/near/multichain-relayer-server) to learn more about upcoming features and updates.
:::
## Limitations
When using the Multichain Gas Relayer solution, some limitations should be considered. Here's a list of potential issues you might encounter, along with suggested ways to mitigate them:
- Not enough gas for a cross-chain transaction to get included in time.
- **Solution:** overcharge for gas at the gas station and, when constructing the transaction, set a gas price above the current average.
- Slippage violations causing the gas token or foreign chain Fungible Token to get refunded to the user's foreign chain address.
- **Solution:** encourage your users to use high slippage settings in volatile or low liquidity market conditions.
- **Solution:** if such an error occurs, make the user aware of what happened and that their funds were not lost.
- **Note:** in future versions the solution will support retrying transactions.
- Nonce issues if Paymaster rotation isn't done properly. This issue is a function of concurrent usage, blockchain finality time, and the number of paymaster treasury accounts that the [Gas Station](gas-station.md) is rotating through.
- **Solution:** use a blockchain that has faster finality.
- **Solution:** increase the number of paymaster treasury accounts through which the gas station rotates.
## Total Time expectations for end users
It depends on the chain, but in our current estimation 50-90% of the time will be spent on NEAR, calling and waiting for the signing to complete on the MPC service.
The signing service will take 15-30 seconds.
We assume that both the signing of the foreign chain transaction and the gas funding transaction happen in parallel.
On BSC mainnet (not the BNB Beacon Chain, which has 1 second finality) with 3 second block times, there should be 2 blocks for confirmation, optimistically bringing the total to 6 seconds per transaction on BSC.
We need to make 2 transactions, so that's 12-24 seconds on BSC assuming 2-4 blocks for finality. Add in some network overhead for each step in the process, especially the indexer picking up the emitted event (~5-7 seconds), and we're at 30-60 seconds/transaction on BSC.
For Solana it would be closer to 20-30 seconds (0.4 second block time, 1 block confirmation). See [table 1](https://usa.visa.com/solutions/crypto/deep-dive-on-solana.html) for more confirmation times.
Real finality on L2s can take over a day unless we trust a centralized sequencer for soft confirmations, which may be as fast as a few seconds, as in the case of [zkSync Era](https://era.zksync.io/docs/reference/concepts/finality.html#instant-confirmations).
The difference between optimistic or soft confirmations vs real finality is something we are considering. We may get better finality guarantees when the [Eigenlayer-Near Partnership is live](https://pages.near.org/blog/near-foundation-and-eigen-labs-partner-to-enable-faster-cheaper-web3-transactions-for-ethereum-rollups-via-eigenlayer/). 3-4 second finality for all ETH L2s is much more manageable.
|
---
sidebar_position: 5
sidebar_label: Transaction
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
# `Transaction` Structure
## Definition
A transaction is the main way of interaction between a user and the blockchain. A transaction contains:
- Signer account ID
- Receiver account ID
- Actions
## `SignedTransactionView`
A transaction might be unsigned, but from the indexer's perspective, when we think about a transaction we think about a signed one.
<Tabs>
<TabItem value="rust" label="Rust">
```rust links=1
pub struct SignedTransactionView {
pub signer_id: AccountId,
pub public_key: PublicKey,
pub nonce: Nonce,
pub receiver_id: AccountId,
pub actions: Vec<ActionView>,
pub signature: Signature,
pub hash: CryptoHash,
}
```
</TabItem>
<TabItem value="typescript" label="TypeScript">
```ts links=1
export type Transaction = {
signerId: string;
publicKey: string;
nonce: number;
receiverId: string;
actions: Action[];
signature: string;
hash: string;
};
```
</TabItem>
</Tabs>
## `ActionView`
`ActionView` is an enum of the possible actions along with their parameters. This structure is used in Transactions and in [Receipts](receipt.mdx).
<Tabs>
<TabItem value="rust" label="Rust">
```rust links=1
pub enum ActionView {
CreateAccount,
DeployContract {
code: String,
},
FunctionCall {
method_name: String,
args: String,
gas: Gas,
#[serde(with = "u128_dec_format")]
deposit: Balance,
},
Transfer {
#[serde(with = "u128_dec_format")]
deposit: Balance,
},
Stake {
#[serde(with = "u128_dec_format")]
stake: Balance,
public_key: PublicKey,
},
AddKey {
public_key: PublicKey,
access_key: AccessKeyView,
},
DeleteKey {
public_key: PublicKey,
},
DeleteAccount {
beneficiary_id: AccountId,
},
}
```
</TabItem>
<TabItem value="typescript" label="TypeScript">
```ts links=1
export type Action =
| "CreateAccount"
| DeployContractAction
| FunctionCallAction
| TransferAction
| StakeAction
| AddKeyAction
| DeleteKeyAction
| DeleteAccountAction;
export type DeployContractAction = {
DeployContract: {
code: string;
};
};
export type FunctionCallAction = {
FunctionCall: {
methodName: string;
args: string;
gas: number;
deposit: string;
};
};
export type TransferAction = {
Transfer: {
deposit: string;
};
};
export type StakeAction = {
Stake: {
stake: number;
publicKey: string;
};
};
export type AddKeyAction = {
AddKey: {
publicKey: string;
accessKey: AccessKey;
};
};
export type DeleteKeyAction = {
DeleteKey: {
publicKey: string;
};
};
export type DeleteAccountAction = {
DeleteAccount: {
beneficiaryId: string;
};
};
```
</TabItem>
</Tabs>
|
- Proposal Name: NEAR economics specs
- Start Date: 2020-02-23
- NEP PR: [nearprotocol/neps#0000](https://github.com/nearprotocol/NEPs/pull/33)
- Issue(s): link to relevant issues in relevant repos (not required).
# Summary
[summary]: #summary
Adding economics specification for NEAR Protocol based on the NEAR whitepaper - https://pages.near.org/papers/the-official-near-white-paper/#economics
# Motivation
[motivation]: #motivation
Currently, the specification is defined by the implementation in https://github.com/near/nearcore. This codifies all the parameters and formulas and defines the main concepts.
# Guide-level explanation
[guide-level-explanation]: #guide-level-explanation
The goal is to build a set of specs about NEAR token economics, for analysts and adopters, to simplify their understanding of the protocol and its game-theoretical dynamics.
This initial release will be oriented to validators and staking in general.
# Reference-level explanation
[reference-level-explanation]: #reference-level-explanation
This part of the documentation is self-contained. It may provide material for third-party research papers, and spreadsheet analysis.
# Drawbacks
[drawbacks]: #drawbacks
We might just put this in the NEAR docs.
# Rationale and alternatives
[rationale-and-alternatives]: #rationale-and-alternatives
# Unresolved questions
[unresolved-questions]: #unresolved-questions
# Future possibilities
[future-possibilities]: #future-possibilities
This is an open document which may be used by NEAR's community to pull request a new economic policy. Having a formal document also for non-technical aspects opens new opportunities for governance.
|
---
id: welcome
title: NEAR - Your Gateway to an Open Web
hide_table_of_contents: true
---
import {FeatureList, Column, Feature} from "@site/src/components/featurelist"
Welcome, this is the starting point for all NEAR documentation. Learn to build and publish blockchain applications. Embrace the power of Web3.
<div className="row">
<div className="col col--4">
<a href="/concepts/welcome">
<div className="card">
<div className="card__image">
<img src={require("@site/static/docs/assets/welcome-pages/protocol.png").default} alt="Learn" />
</div>
<div className="card__body">
<h3>Understanding NEAR</h3>
Learn what NEAR is and how it works
</div>
</div>
</a>
</div>
<div className="col col--4">
<a href="/build/smart-contracts/what-is">
<div className="card">
<div className="card__image">
<img src={require("@site/static/docs/assets/welcome-pages/contracts.png").default} alt="Contracts" />
</div>
<div className="card__body">
<h3>Smart Contracts</h3>
Learn to build smart contracts in NEAR
</div>
</div>
</a>
</div>
<div className="col col--4">
<a href="/build/near-components/what-is">
<div className="card">
<div className="card__image">
<img src={require("@site/static/docs/assets/welcome-pages/bos-big.png").default} alt="Web3 Components" />
</div>
<div className="card__body">
<h3>Web3 Components</h3>
The building blocks for multi-chain apps
</div>
</div>
</a>
</div>
<div className="col col--4">
<a href="/build/web3-apps/what-is">
<div className="card">
<div className="card__image">
<img src={require("@site/static/docs/assets/welcome-pages/examples.png").default} alt="Solutions" />
</div>
<div className="card__body">
<h3>Web3 Applications</h3>
Supercharge your App using NEAR
</div>
</div>
</a>
</div>
<div className="col col--4">
<a href="/tools/welcome">
<div className="card">
<div className="card__image">
<img src={require("@site/static/docs/assets/welcome-pages/tools.png").default} alt="Tools" />
</div>
<div className="card__body">
<h3>NEAR Tools</h3>
Discover our SDK, API, CLI, and more
</div>
</div>
</a>
</div>
<div className="col col--4">
<a href="/build/data-infrastructure/what-is">
<div className="card">
<div className="card__image">
<img src={require("@site/static/docs/assets/welcome-pages/data-lake.png").default} alt="Data Lake" />
</div>
<div className="card__body">
<h3>Query On-Chain Information</h3>
Learn about indexers and our data lake
</div>
</div>
</a>
</div>
</div>
<hr className="subsection" />
<h1 className="text-center big-title" > Browse the Docs By Topic </h1>
<FeatureList>
<Column title="Understanding NEAR" size="3">
<Feature url="/concepts/basics/protocol" title="What is NEAR?" subtitle="Learn the Basics about NEAR" image="near-logo.png" />
<Feature url="/concepts/protocol/account-id" title="Named Accounts" subtitle="NEAR uses human-readable accounts" image="user.png" />
<Feature url="/concepts/protocol/access-keys" title="Multiple Access Keys" subtitle="More keys means more security" image="key.png" />
<Feature url="/concepts/protocol/smartcontract" title="Smart Contracts" subtitle="Learn about our contract technology" image="contract.png" />
</Column>
<Column title="Developer Docs" size="3">
<Feature url="/build/web3-apps/quickstart" title="Quickstart: WebApp" subtitle="Spin-up your first dApp" image="quickstart.png" />
<Feature url="/build/smart-contracts/quickstart" title="Quickstart: Contract"
subtitle="Learn how to write smart contracts" image="smartcontract.png" />
<Feature url="/build/near-components/anatomy/state" title="Multi-chain Components"
subtitle="Learn about multi-chain components" image="bos-lido.png" />
<Feature url="/build/data-infrastructure/query-api/intro" title="QueryAPI" subtitle="The simplest way to build indexers" image="blocks.png" />
</Column>
<Column title="Developer Tools" size="3">
<Feature url="/sdk/js/introduction" title="JavaScript SDK" subtitle="Write contracts in JavaScript" image="smartcontract-js.png" />
<Feature url="/sdk/rust/introduction" title="Rust SDK" subtitle="Write contracts in Rust" image="smartcontract-rust.png" />
<Feature url="/tools/near-cli" title="NEAR CLI" subtitle="Use NEAR from the Terminal" image="near-cli.png" />
<Feature url="/tools/near-api-js/quick-reference" title="NEAR API JS" subtitle="Interact with NEAR from JS" image="near-api-js.png" />
<Feature url="/api/rpc/introduction" title="RPC API" subtitle="Interact with the NEAR RPC API" image="rpc.png" />
</Column>
<Column title="Examples & Tutorials" size="3">
<Feature url="/tutorials/examples/donation" title="Donation" subtitle="Receive and send tokens" image="donation.png" />
<Feature url="/tutorials/examples/factory" title="Factory Contract" subtitle="Build a contract that deploys contracts" image="factory.png" />
<Feature url="/tutorials/examples/frontend-multiple-contracts" title="Multi-Contract Frontend" subtitle="Interact with multiple contracts" image="multiple.png" />
<Feature url="/tutorials/nfts/js/introduction" title="Master NFTs on NEAR (JS)" subtitle="Learn everything about NFT in JS" image="nft-marketplace-js.png" />
</Column>
</FeatureList>
---
## External Resources
Here are more sources from our ecosystem that can help you to learn more about NEAR.
<div className="row cards">
<div className="col col--6">
<a href="https://near.org/applications">
<div className="card">
<div className="card__image">
<img src={require("@site/static/docs/assets/welcome-pages/awesomenear.jpg").default} alt="Discover" />
</div>
<div className="card__body">
<h3>
Discover
</h3>
Discover awesome apps in the NEAR ecosystem.
</div>
</div>
</a>
</div>
<div className="col col--6">
<a href="https://nomicon.io">
<div className="card">
<div className="card__image">
<img src={require("@site/static/docs/assets/welcome-pages/nomicon.png").default} alt="Nomicon" />
</div>
<div className="card__body">
<h3>
Nomicon
</h3>
See how NEAR is implemented in the official protocol specification.
</div>
</div>
</a>
</div>
<div className="col col--6">
<a href="https://near-nodes.io">
<div className="card">
<div className="card__image">
<img src={require("@site/static/docs/assets/welcome-pages/validate.png").default} alt="Validate" />
</div>
<div className="card__body">
<h3>
Running a Node
</h3>
Documentation on becoming a validator to help keep the blockchain safe
</div>
</div>
</a>
</div>
<div className="col col--6">
<a href="https://templates.mintbase.xyz/">
<div className="card">
<div className="card__image">
<img src={require("@site/static/docs/assets/welcome-pages/mintbase-templates.png").default} alt="Templates" />
</div>
<div className="card__body">
<h3>Templates</h3>
Templates for creating web3 applications
</div>
</div>
</a>
</div>
</div>
|
NEAR Foundation Signs Up to the Ethereum Climate Platform
NEAR FOUNDATION
July 17, 2023
NEAR Foundation is thrilled to announce that it has signed up to the Ethereum Climate Platform. The Foundation has worked on making the NEAR Protocol and open web sustainable from the get-go, and this move is a major demonstration of that continued commitment to Web3 sustainability.
The announcement also illustrates NEAR Foundation’s efforts at working collaboratively with globally recognized leaders to minimize the environmental impact of Web3 development.
Ethereum Climate Platform: accelerating climate finance at scale
At COP27, ConsenSys and climate tech firm Allinfra convened and announced the Ethereum Climate Alliance. A group of technology companies, the ECA is aligned around the mission to accelerate decarbonization and scale climate action. It achieves this by deploying best-in-class technology and funding the development of digital environmental assets designed to mitigate greenhouse gas emissions and deliver positive environmental and social impact well into the future.
Inspired by the Ethereum Merge, the ECA is launching the Ethereum Climate Platform (ECP) — an initiative designed to collaboratively build a blockchain-based technology platform that will engage a decentralized community to accelerate climate finance at scale. The ECP is committed to redressing and counteracting the historical Ethereum-based carbon footprint, dating back to the network’s launch in 2015, as well as to accelerating the launch partners’ own decarbonization strategies.
NEAR as a sustainability model for ECP and beyond
By joining the ECP, NEAR Foundation is making clear its intention to act as a sustainability model to the broader tech and business community.
The collective includes a still-growing list of prominent stakeholders from Web3 ecosystems, including Laser Digital (Nomura), Microsoft, Polygon, The Climate Collective, Celo, Aave, Filecoin, and UPC Capital Ventures. Beyond the open web ecosystem, the ECP also includes civil society leaders who are committed to mitigating human-driven climate change.
NEAR Foundation’s environmental stewardship is an integral part of its ethos. From inception, NEAR has focused on a computational alternative to PoW.
In this system, known as Proof-of-Stake (PoS), there are no miners. Instead, validators stake a given token amount to take part. PoS blockchains incentivize communities to own and control the network. Only then can they validate blocks and collect a block reward. Functionally, NEAR’s PoS, combined with Nightshade sharding, allows the platform to realize its ambition of being simple, scalable, and secure.
NEAR sustainability a cornerstone to mass Web3 adoption
From the outset, NEAR Foundation has been dedicated to bringing a billion users into a more open web, said NEAR Foundation CEO, Marieke Flament.
“We recognised early on that sustainability would be a cornerstone of our approach to facilitating the widespread adoption of Web3 technologies,” Flament added. “By aligning with the Ethereum Climate Platform and implementing our Blockchain Operating System (BOS), we’re not just enhancing the user experience — we’re creating a paradigm shift in the blockchain industry that respects and supports our global environment.”
On a social and ecological level, NEAR’s Nightshade sharding approach allows NEAR Foundation to deliver on its carbon-neutral commitment. In 2021, NEAR Protocol was awarded the Climate Neutral Product Label, and its developers have been leading the charge in this era of extraordinary energy efficiency, creating an infrastructure that is lightning-fast, remarkably secure, and capable of accommodating millions of Web3 users.
NEAR Protocol achieved this after enlisting South Pole, a globally recognized climate solutions provider based in Zurich, Switzerland, to evaluate NEAR’s carbon footprint, find ways to reduce it, and counterbalance the remaining emissions with CO2 offsetting initiatives. The assessment revealed that NEAR Protocol was over 200,000 times more carbon efficient than Bitcoin, primarily due to the adoption of PoS over PoW.
Reforestation projects offset the minimal environmental impact of NEAR, rendering the NEAR Blockchain carbon neutral. Performing transactions on NEAR contributes to tree-planting initiatives in Colombia, Zimbabwe, and the United States via these carbon-offsetting projects.
“Having NEAR Foundation onboard this endeavor strengthens our joint mission significantly,” said Bill Kentrup, Executive Director of ECA. “Through persistent collective effort, we can bring to market climate finance tools that will accelerate capital into climate-aligned assets and, in turn, significantly mitigate greenhouse gasses.”
“As a fellow pioneer in the Web3 space, their participation is invaluable,” said Kentrup, “and we’re sincerely excited about the potential this partnership holds for the future.”
|
```bash
near view v2.ref-finance.near get_deposits '{"account_id": "bob.near"}'
```
<details>
<summary>Example response</summary>
<p>
```bash
{
  'token.v2.ref-finance.near': '0',
  'wrap.near': '0'
}
```
</p>
</details>
|
Register for NEAR’s June Town Hall: NEAR at Consensus
COMMUNITY
June 3, 2022
Howdy, NEAR World! This month the NEAR Town Hall will be broadcasting from Coindesk’s Consensus conference in Austin, Texas. Join us on Sunday, June 12, 2022 at 5:00pm GMT (1:00PM EDT/10:00AM PDT) for updates on the NEAR ecosystem and Web3 onboarding efforts, as well as for special Consensus-related segments.
Here’s a brief preview of what to expect from the June Town Hall. And don’t forget to register for the Town Hall below.
State of NEAR address
NEAR Foundation CEO Marieke Flament will open the June Town Hall with a brief segment on why NEAR is thrilled to be a Level 5 sponsor of Consensus. She will also give developers, projects, and investors ecosystem updates, and remind the community of its shared vision of mainstreaming Web3 technology and apps.
Flament’s segment will be a “State of NEAR”. In this segment, which comes at the mid-year point, Flament will reflect on NEAR’s tremendous growth. She will also address the bear market and speak of NEAR’s solid fundamentals for funding, support, marketing, and more.
To round out her Town Hall spot, Flament will also highlight NEAR’s presence and interactions at this year’s Consensus gathering.
NEAR Protocol update
Following Flament’s introduction, NEAR Co-founder Illia Polosukhin will give a mid-year overview and update on NEAR’s Nightshade sharding. He will also touch on what to expect from the NEAR Protocol throughout the rest of 2022 and into 2023.
This Town Hall segment will be a must-see for both new and prospective developers, projects, and investors curious about the state of NEAR’s scalability and security.
NEAR Foundation updates
After Illia’s NEAR Protocol update, the Town Hall will pivot into NEAR Foundation updates. Jack Collier, NEAR Foundation’s new CMO, will get into a number of new and exciting foundation activities. Afterwards, he will share his vision for the foundation’s marketing efforts.
Yadira Blocker, the foundation’s Experiential Marketing and Events Lead, will update the community on upcoming events. Her segment will feature spots on Ethereum Community Conference (EthCC) and NEARCON, including where to find information on these events and how to stay updated as they approach.
Blocker will also give an introduction to the NEAR Town Hall’s Consensus Spotlight.
Consensus Spotlight and Town Hall Panel
This special Town Hall segment will feature "man on the street" video interviews from Consensus, conducted by NEAR Foundation team members David Morrison (Community Moderator) and Chad Lamon (Video Producer).
After the video segment, Marieke Flament will introduce the Town Hall panel. Moderated by Cathy Hackl, a Tech Columnist and Thought Leader, this special panel will include a team of globally acclaimed and awarded creatives announcing their project on NEAR. So, be sure to tune in to the Town Hall to see what this super stealth and exciting project is all about. |
NEAR Weekly On Chain Data Report: December 9
NEAR FOUNDATION
December 9, 2022
As part of the Foundation’s commitment to transparency, each week it will publish data to help the NEAR community understand the health of the ecosystem. This will be on top of the quarterly reports, and the now monthly funding reports.
You can find the quarterly reports here.
You can find monthly reports on funding here.
Last week’s transparency report can be found here.
The importance of transparency
The NEAR Foundation has always held transparency as one of its core beliefs. Being open to the community, investors, builders and creators is one of the core tenets of being a Web3 project. But it’s become apparent the Foundation needs to do more.
The Foundation hears the frustration from the community, and it wants to be more proactive in when and how it communicates.
New Accounts and Active Accounts
New Accounts are new wallets being created on the NEAR blockchain. In the last days of November and the first days of December, the daily number of new accounts had been declining. This week, however, activity has been trending up, with the number of new accounts averaging 14,000 per day and a weekly peak of 14,752 recorded on December 6.
This is down from November’s figure of 24,000 wallets per 24 hours, on average. These numbers are consistent with the overall decline in sentiment around blockchain.
Looking more broadly, the peak for account creation in Q4 was September 13, when 130,000 new wallets were created in one day. Collectively, these numbers equate to 22,551,000 total wallets on the NEAR blockchain.
The Daily Number of Active Accounts is a measure of how many wallets on NEAR are making transactions on chain. Over the last week, the number of daily active accounts has oscillated between a low of 12,639 on December 5 and a peak of 14,752 on December 6.
Historically, this is a decline from highs of more than 100,000 active accounts on the network. The highest number of active accounts on any one day in Q4 this year was logged on September 14, where 183,000 accounts were active.
New Contracts and Active Contracts
Contracts on NEAR are simply programs stored on a blockchain that run when predetermined conditions are met. The Daily Number of New Contracts is a valuable metric for understanding the health and growth of an ecosystem.
The more active contracts there are, the more projects are actively engaging with the NEAR protocol. The chart below shows a cyclical rhythm to new contracts, with rises and falls. Over the last seven days, the number of new contracts reached a daily high of 54 on December 4 and a weekly low of 27 on December 6. This range is broader than the week before, which saw a high of 44 new contracts on November 30 and a low of 12 on November 27.
Active contracts is a measure of contracts that execute in a 24 hour period. This number has remained consistent throughout the last week with an average of more than 600 active contracts on the NEAR network. Taking a historical perspective on these numbers, the average has declined in Q4, with previous highs in active contract activity coming in the third week of September 2022.
Used Gas and Gas Fee
Gas Fees are a catch-all term for the cost of making transactions on the NEAR network. These fees are paid to validators for their services to the blockchain. Without these fees, there would be no incentive for anyone to keep the network secure.
Over the last week, the daily amount of gas, expressed here as PetaGas, which is the equivalent of 0.1 $NEAR, has maintained a fairly consistent value between 5,949 PetaGas and 6,743 PetaGas. To learn more about Gas on NEAR, there is an excellent explainer in the NEAR White Paper. Compared with last week, the Daily Amount of Used Gas has decreased, from highs of more than 8,000 PetaGas.
The Daily Amount of Gas correlates with the Daily Gas Fee used on the network. Over the last week, there has been an uptick in the amount of Gas used, which can be brought on by a number of different factors. One of the most common is increased activity among users of the network.
Daily Transactions
The daily number of transactions is a record of how many times the blockchain logged a transaction. This week’s data represents a healthy increase in the number of transactions, from a low of 328,000 transactions on December 3 to a weekly high of 388,000 transactions per day on December 6. Looking more broadly, NEAR transaction activity has been trending downwards in Q4, reflecting other on-chain data presented here.
These reports will be generated each week and published on Friday. |
The Evolution of the Open Web
COMMUNITY
April 29, 2020
Developers have been pitched Blockchain technology for many years based on a hand-wavy set of “use cases” which are tied to unclear definitions of how the technology works, what it’s actually good for and how the platforms that employ it are meaningfully different from each other. This has justifiably created both confusion and skepticism.
In this post, I want to break through that fog and provide a set of clear mental models for understanding how potential use cases drive the technical tradeoffs that each platform has been forced to make. These mental models derive from the progress that the technology has made over the past 10 years through three distinct generations from Open Money to Open Finance to Open Web.
My goal is that you leave with a much clearer and more concrete understanding of what blockchain technology is, what different platforms are good for and how the future might unfold.
A Quick Blockchain Primer
To cover the obligatory basics, a blockchain is essentially just a database that is run by a group of different operators rather than a single entity (like Amazon or Microsoft or Google). This gives it powerful properties because, unlike today’s clouds, you no longer have to trust the “owner” of the database (or their operational security) to maintain high value data. When the blockchain is public (as the largest ones are), anyone can use it for anything.
To enable this system to run across a wide range of potentially anonymous machines around the world, the system must have a digital token which facilitates the payment from users of the chain to the operators of the system (and provides security guarantees baked into the game theory behind it). Though it was co-opted by the misguided ICO boom in 2017, the core idea of “tokens” and of “tokenization” in general — that a single digital asset can be uniquely identified and transferred — is incredibly powerful.
It’s also important to separate the database portion which stores the data from the layer that actually modifies the data (the virtual machine). These are usually tied together to a degree because the technical properties of the database layer dictate what’s possible to do with the modification logic but each of them separately drives the higher level properties of the system.
The chain can be optimized along several dimensions, for example security (think Bitcoin), speed, cost or scalability. The modification logic on top of this can also be optimized along several dimensions — it could be a simple “add/subtract” calculator (like Bitcoin) or a fully Turing complete virtual machine (like Ethereum or NEAR).
So it is actually possible that two blockchain-powered platforms tune the underlying blockchain and VM to drive completely different functionalities and they might never compete with each other in the market. For example, Bitcoin is in a different world from Ethereum or NEAR which are totally separate from Ripple or Stellar, despite all being run using “blockchain technology.”
The 3 Generations of Blockchain
Both technological advances and specific system design decisions have driven blockchain’s expanding functionality during the 3 generations of its development, which occurred over the last 10 years. These generations can be grouped in the following way:
Open Money: Give anyone access to digital money.
Open Finance: Apply programmability to multiply the capabilities of that Open Money.
Open Web: Expand the scope of Open Finance to encompass all high value data and bridge it to consumer-scale.
Let’s start by examining Open Money.
First Generation: Open Money
Money is the basic building block of capitalism. Step one allowed anyone anywhere to have access to money.
One of the highest value pieces of data you can store in a database is money itself. That’s the innovation of Bitcoin — to provide a simple shared ledger that allows everyone to agree that Joe owns 30 BTC and just sent Jill 1.5 BTC. Bitcoin was tuned to optimize for security above all else. Their consensus is enormously expensive, time-consuming and bottlenecked while their modification layer is essentially just a “plus/minus calculator” which allows the transfers of balances and a few other tightly scoped operations.
Bitcoin is a good training tool because it illustrates the fundamental benefits of data stored on a blockchain: it is independent of any single third party and can be operated on permissionlessly (one holder of BTC can initiate a p2p transfer to another account without asking anyone).
Because of the simplicity and power of Bitcoin’s promise, “money” has been one of the earliest and most successful initial use cases for blockchain. But it turns out that people have many different ways that they want to use money so Bitcoin’s “extra slow, extra expensive, extra secure” tuning works well for storing value as if it were gold but doesn’t work well for uses like rapid e-commerce payments or cross-border transfers.
Tuning Open Money
For these other use cases, other chains have emerged and tuned their parameters accordingly:
Transfers: if you want to allow a million people to send occasional amounts of money around the world every day, you need something that’s much more performant and less expensive than Bitcoin but still maintains a healthy amount of security. Ripple and Stellar are two projects that have tuned their chains for this use case.
Rapid transactions: if you want to allow a billion people to spend digital money like they do with credit cards, you need to further tune the chain to be massively scalable, massively performant, and low cost. To do this, you generally have to make some sacrifices along the dimension of security. There are two approaches to this. First, build a faster “layer 2” on top of Bitcoin which optimizes for fast performance but then stores the assets back in the Bitcoin “vault” when transactions are complete, like Lightning Network. Second, build a new blockchain which provides as much security as possible while still allowing for fast, cheap transactions, like Libra.
Private transactions: If you want to ensure that no one can infer who sent what to who, you would need to add a layer that anonymizes transactions, which sacrifices performance and adds to cost. Zcash and Monero have done this.
Because the token that represents this money is actually a fully digital asset, it can also be programmed. This can occur at the root level — for example, the total amount of Bitcoin that will ever be produced is coded into the core Bitcoin system — or it can be taken to a whole other level by having the right computing system on top of it.[1]
This is where Open Finance comes in.
Second Generation: Open Finance
With Finance, Money is no longer just a store of value or transactional asset — it can be operated on in useful ways which multiply its potential.
Things get far more interesting for digital money when you realize that the same properties which allow people to permissionlessly initiate transfers of Bitcoin also allow developers to write programs that do the same thing. In this context, think of this digital money as having its own independent API which doesn’t require any company to provide you with an API key and Terms of Use contract.
This is the promise of “Open Finance”, also known as “Decentralized Finance (DeFi)”. And it’s where we again hit the issues of technological progress and platform tuning.
Ethereum
As mentioned above, Bitcoin’s API is quite simple and non-performant. This makes it possible to deploy scripts to the Bitcoin network which can move Bitcoin around but, to do anything more interesting, you generally need to move the Bitcoin itself to another blockchain platform (which is a nontrivial operation).
Other platforms have worked to marry the security necessary to handle digital money with a more complex modification layer. Ethereum was the first to offer this. Instead of the “add/subtract” calculator of Bitcoin, Ethereum created a full virtual machine on top of the storage layer which allowed developers to write full programs and run them on the chain natively.
This is important because the security of the digital asset (eg money) that is stored on the chain is only as good as the security and robustness of the programs which can natively modify its state. Ethereum’s “smart contract” programs are essentially serverless scripts that run on the chain the same way that a vanilla “send Jill 23 tokens” transaction runs on Bitcoin. Ethereum’s native token is “Ether” or “ETH”.
Blockchain Components as Plumbing
The power of this took some time to emerge. Because the API on top of ETH is permissionless (like Bitcoin) but infinitely programmable, it became possible to create a series of building block components which essentially pipe ETH from one to another to accomplish useful work for the end user.
In the default world, this would require, for example, a big bank to negotiate contracts and API access with each of the individual providers. On blockchain, each of these components was independently created by developers and quickly scaled to handling millions of dollars of throughput and holding over $1B of value in early 2020.
For example, let’s start with Dharma, a wallet that allows users to store digital tokens and earn interest on those balances. This is a fundamental use case of traditional banking. The developers of Dharma generate that interest for their users by pulling together multiple components that were created on top of Ethereum. For example, users’ dollars are converted to DAI, an Ethereum-based “stable coin” which maintains the value of the US dollar, then piped into Compound, a protocol which lends these balances, thus earning users instant interest on their balances.
Implications of Open Finance
The key takeaway is that a real consumer-facing product was created by relying on a stack of components which were each created by a different team and which were used without asking for any permission or API keys and now handle millions of dollars of value. It is almost like open source software but, whereas open source software relies on downloading a copy of a particular library again for each implementation, open components are only deployed once and then anyone can just call that specific instance to access its shared state.
Each of the teams who created those components can rest easy knowing that they won’t become liable for any oversized EC2 bills because someone is abusing their API — the metering and charging for use of these components is done intrinsically as part of the chain.
Performance and Tuning
Ethereum operates with similar parameters as Bitcoin, just slightly faster (blocks propagate about 30x faster) and slightly cheaper (instead of roughly $0.50 transaction cost on Bitcoin, it costs about $0.10[2]). This has allowed them to maintain comfortable security while allowing slow speed apps like asset management to flourish.
But, as a first-generation bare-metal technology, the Ethereum network has been brought to its knees in times of stress and suffers from a maximum 15-transaction-per-second bottleneck. This performance gap has left Open Finance stalled as a niche proof-of-concept. We are stuck with slow-moving assets similar to how the global financial system operated in the analog age of paper checks and telephone confirmations because Ethereum has less computing power than a 1990 graphing calculator.
Ethereum showed a hint of the power of composable components for financial use cases but also opened the door to a much broader set of use cases, called the Open Web.
Third Generation: Open Web
Now anything which has value can acquire the properties of money, merging the Internet with Open Finance to create the Internet of Value and the Open Web.
We saw previously that “Open Money” encompasses a wide spectrum of different use cases and how a next generation technology, Ethereum, made Open Money far more useful by creating the permissionless composability of “Open Finance.” Now, we’ll see how one more generation of technology expands the possibilities of Open Finance even further to unlock the true potential of Blockchain.
Fundamentally, all the “money” we’ve been talking about is just a line item of data stored on the blockchain with its own open API. But the database can store anything.
While the nature of blockchain makes it best suited for elements that have meaningful value, the definition of “meaningful value” is highly flexible. Any data which is potentially valuable to other people can be set free by “tokenizing” it. “Tokenization” in this context is the process by which an existing asset — not a newly generated one like Bitcoin — is transferred on to the blockchain and provided with the same sort of permissionless API that Bitcoin and Ethereum have. As with Bitcoin, this allows for the option of globally enforceable scarcity (whether 21 million or just one).
Consider community projects like Reddit where users earn online reputation in the form of “Karma”. Then consider projects like Sofi where a wide range of signals are used to evaluate whether someone should be approved for a loan. In today’s world, if a hackathon team building the next Sofi wanted to incorporate the Karma score into their credit algorithm, they’d need to negotiate a bilateral agreement with the Reddit team to acquire certified API access. If Karma is tokenized, they have all the tools they need to integrate with Karma and Reddit never needs to know about it — they simply benefit from more users who want to grow their Karma because it’s useful around the world.
Even further, 100 different teams at the next hackathon can figure out new ways to take this asset plus dozens more and create a new set of publicly reusable components or build new consumer-facing applications from them.
This is the vision of the Open Web.
Just like Ethereum enabled the easy piping of high value money through permissionless components, any asset which has been tokenized can be piped through a similar set of components and “spent”, exchanged, collateralized, modified or otherwise interacted with as per the specification of its open API.
Tuning for the Open Web
The Open Web isn’t fundamentally different from Open Finance — the first is really a superset of the second — but the expansion of Open Web’s use cases requires a substantial uptick in performance and also the capability to reach a new tranche of potential users.
The key requirements for a platform to drive the Open Web are:
Higher volume, higher speed, lower cost transactions — because the chain is no longer just handling slow-moving asset management decisions, it needs to scale to support more nuanced data types and use cases.
Usability — because the use cases will cross into consumer-facing applications, it’s critical that the components developers build or the apps on top of them allow for smooth end-user experiences, for example when they’re provisioning or linking accounts to various assets or platforms while still preserving user ownership of their data.
These specifications have not been met by any platform before because they are enormously complicated. It has taken years of research to arrive at the point where new consensus mechanisms merge with new runtimes and new scalability techniques while still maintaining the performance and security needed to maintain a backbone of monetary assets.
The Open Web Platform
There are dozens of blockchain projects coming to market this year which have tuned their platforms to address various subsets of the Open Money and Open Finance use cases. Given the limitations of the existing technology, it made sense for them to optimize for a specific niche.
NEAR is the only chain which has deliberately advanced its technology and tuned its performance characteristics to meet the full needs of the Open Web.
NEAR merges scalability approaches from the high performance database world with advances in runtime performance and years of progress on usability. Like Ethereum, it has a full virtual machine built on top of a blockchain but the underlying chain adjusts its capacity to meet demand by dynamically splitting computation into parallel processes (sharding) while still maintaining the security needed to keep the data safe.
This means that the full set of possible use cases can be built on NEAR — fiat-backed coins that give global access to stable currency, Open Finance tools that scale up to complex financial instruments and out to everyday people, and Open Web applications that integrate all of this to power everyday commerce and interaction.
Outro
The story of the Open Web is just beginning because we’ve only just developed the requisite technologies to bring it to its proper scale. With that milestone achieved, the future will be driven by the innovation that can be achieved on top of this new stack and how well supported are the developers and entrepreneurs who represent its leading edge.
For a sense of the potential impact of the Open Web, note the Cambrian explosion that occurred when the early Internet, after several challenging years, developed the necessary protocols to finally allow consumer spending of money online in the late 1990s. The ensuing 25 years have seen web-based commerce grow to generate over $2 trillion in spending each year.
Similarly, the Open Web expands the scope and reach of the financial primitives of Open Finance and enables their inclusion in business- and consumer-facing applications in ways that we can guess but certainly not predict.
[1]: Bitcoin Script does allow some programmability but this is quite limited.
[2]: Both are based on an auction, so all numbers are approximate. |
```bash
near call v2.keypom.near create_drop '{"public_keys": <PUBLIC_KEYS>, "deposit_per_use": "10000000000000000000000", "fcData": {"methods": [[{"receiverId": "nft.primitives.near","methodName": "nft_mint","args": {"token_id": "1", "metadata": {"title": "My NFT drop","description": "","media": ""}, "accountIdField": "receiver_id", "attachedDeposit": "10000000000000000000000"}]]}}' --depositYocto 23000000000000000000000 --gas 100000000000000 --accountId bob.near
``` |
# Miscellaneous API
```rust
value_return(value_len: u64, value_ptr: u64)
```
Sets the blob of data as the return value of the contract.
###### Panics
- If `value_len + value_ptr` exceeds the memory container or points to an unused register it panics with `MemoryAccessViolation`;
---
```rust
panic()
```
Terminates the execution of the program with panic `GuestPanic("explicit guest panic")`.
---
```rust
panic_utf8(len: u64, ptr: u64)
```
Terminates the execution of the program with panic `GuestPanic(s)`, where `s` is the given UTF-8 encoded string.
###### Normal behavior
If `len == u64::MAX` then treats the string as null-terminated with character `'\0'`;
###### Panics
- If string extends outside the memory of the guest with `MemoryAccessViolation`;
- If string is not UTF-8 returns `BadUtf8`.
- If string length without null-termination symbol is larger than `config.max_log_len` returns `BadUtf8`.
---
```rust
log_utf8(len: u64, ptr: u64)
```
Logs the UTF-8 encoded string.
###### Normal behavior
If `len == u64::MAX` then treats the string as null-terminated with character `'\0'`;
###### Panics
- If string extends outside the memory of the guest with `MemoryAccessViolation`;
- If string is not UTF-8 returns `BadUtf8`.
- If string length without null-termination symbol is larger than `config.max_log_len` returns `BadUtf8`.
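For illustration, the sketch below shows how Rust contracts usually reach these host functions indirectly, through the `near_sdk::env` wrappers; the wrapper names belong to the SDK and are not part of this spec.

```rust
use near_sdk::env;

pub fn example() {
    // `log_utf8` under the hood: record a UTF-8 log line.
    env::log_str("processing request");

    // `value_return` under the hood: set the blob returned to the caller.
    env::value_return(b"{\"ok\":true}");

    // `panic_utf8` under the hood: abort execution with the given message.
    // (Guarded so the example function can also demonstrate the calls above.)
    if false {
        env::panic_str("explicit guest panic");
    }
}
```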
---
```rust
log_utf16(len: u64, ptr: u64)
```
Logs the UTF-16 encoded string. `len` is the number of bytes in the string.
See https://stackoverflow.com/a/5923961, which explains that null termination is not defined through encoding.
###### Normal behavior
If `len == u64::MAX` then treats the string as null-terminated with two-byte sequence of `0x00 0x00`.
###### Panics
- If string extends outside the memory of the guest with `MemoryAccessViolation`;
---
```rust
abort(msg_ptr: u32, filename_ptr: u32, line: u32, col: u32)
```
Special import kept for compatibility with AssemblyScript contracts. Not called by smart contracts directly, but instead
called by the code generated by AssemblyScript.
# Future Improvements
In the future, some of the registers could live on the guest.
For instance, a guest can tell the host that it has some pre-allocated memory that it wants to be used for the register,
e.g.
```rust
set_guest_register(register_id: u64, register_ptr: u64, max_register_size: u64)
```
will assign `register_id` to a span of memory on the guest. The host would then also know the size of that buffer on the guest
and can throw a panic if an attempted copy exceeds the guest register size.
|
```js
import { Wallet } from './near-wallet';
const KEYPOM_CONTRACT_ADDRESS = "v2.keypom.near";
const DROP_AMOUNT = "10000000000000000000000"; // 0.01 NEAR in yoctoNEAR
const wallet = new Wallet({ createAccessKeyFor: KEYPOM_CONTRACT_ADDRESS });
await wallet.callMethod({
method: "create_drop",
contractId: KEYPOM_CONTRACT_ADDRESS,
args: {
public_keys: state.publicKeys,
deposit_per_use: DROP_AMOUNT,
},
deposit: "23000000000000000000000", // state.publicKeys.length * dropAmount + 3000000000000000000000
gas: "100000000000000",
});
```
_The `Wallet` object comes from our [quickstart template](https://github.com/near-examples/hello-near-examples/blob/main/frontend/near-wallet.js)_ |
Understanding Rust Lifetimes
DEVELOPERS
January 9, 2019
No, seriously, this time for real
Coming to Rust from C++ and learning about lifetimes is very similar to coming to C++ from Java and learning about pointers. At first, it looks like an unnecessary concept, something a compiler should have taken care of. Later, as you realize that it gifts you with more power — in case of Rust it is more safety and better optimizations — you feel encouraged to master it but fail because it is not exactly intuitive and the formal rules are hard to find. Arguably, C++ pointers are easier to get immersed into than Rust lifetimes because C++ pointers are everywhere in the code, while Rust lifetimes are usually hidden behind tons of syntactic sugar. And so you end up being exposed to the lifetimes when syntactic sugar does not apply which is usually some complex cases. It is hard to internalize the concept when the only things you are exposed to are the complex cases.
Intro
The first thing one needs to realize about lifetimes is that they are all about references, and nothing else. For example, when we see a struct with a lifetime type-parameter it refers to the lifetimes of the references owned by this struct and nothing else. There is no such thing as a lifetime of a struct or a closure, there are only lifetimes of the references inside the struct or the closure. Therefore our discussion of the lifetimes will inevitably be about Rust references.
The motivation behind the lifetimes
To understand the lifetimes we first need to understand the motivation behind them, which requires understanding the motivation behind the borrowing rules first. The borrowing rules state:
At no point in the code are there references to overlapping pieces of memory, a.k.a. aliasing, such that at least one of them is used to mutate the content of the memory.
Simultaneous mutation and aliasing are undesirable because it is unsafe and it prevents the compiler from doing various optimizations.
Example
Suppose we wanted to write a function that shifts the given coordinates along the x-axis twice in the given direction.
struct Coords {
pub x: i64,
pub y: i64,
}
fn shift_x_twice(coords: &mut Coords, delta: &i64) {
coords.x += *delta;
coords.x += *delta;
}
fn main() {
let mut a = Coords{x: 10, y: 10};
let delta_a = 10;
shift_x_twice(&mut a, &delta_a); // All good.
let mut b = Coords{x: 10, y: 10};
let delta_b = &b.x;
// shift_x_twice(&mut b, delta_b); // Compilation failure.
}
The last statement would have shifted the coordinate three times instead of two which could cause all kinds of bugs in the production system. The core problem is that delta_b and &mut b point to an overlapping memory, which is prevented in Rust through lifetimes and borrowing rules. Specifically, Rust compiler notices that delta_b requires holding an immutable reference to b until the end of main(), but in that scope, we also attempt creating a mutable reference to b, which is prohibited.
To be able to perform the borrowing rules check compiler needs to know the lifetimes of all references. In many cases compiler is able to derive the lifetimes on its own, but sometimes it is unable to and that’s where the developer needs to step in and annotate them manually. Additionally, it gives the developer a design tool, e.g. one can require that all structs that implement a certain trait have all their references live for the given duration, at least.
Compare Rust references to C++ references. In C++ one can also have const and non-const references, similar to Rust's &x and &mut x. However, there are no lifetimes in C++. This helps the C++ compiler with optimizations a bit, but it does not give complete safety guarantees, and so the above example would compile if it was written in C++.
Desugaring
Before we dive deep into understanding lifetimes, we need to clarify what a lifetime is, because various Rust documentation uses the word lifetime to refer to both scopes and type-parameters. Here we say lifetime to denote a scope, and lifetime-parameter to denote a parameter that the compiler substitutes with a real lifetime, just like when it infers the types of generics.
Example
To make the explanation transparent we will desugar some Rust code. Consider the following snippet:
fn announce(value: &impl Display) {
println!("Behold! {}!", value);
}
fn main() {
let num = 42;
let num_ref = &num;
announce(num_ref);
}
Here is the desugared version:
fn announce<'a, T>(value: &'a T) where T: Display {
println!("Behold! {}!", value);
}
fn main() {
'x: {
let num = 42;
'y: {
let num_ref = &'y num;
'z: {
announce(num_ref);
}
}
}
}
The above desugared code was explicitly annotated with the lifetime-parameter 'a and the lifetimes/scopes 'x, 'y, and 'z.
We have also used impl Display to compare lifetime-parameters with general type-parameters. Notice how sugar was hiding both a lifetime-parameter 'a and a general type-parameter T. Note, the scopes are not a part of the Rust language syntax, we use them for annotation purposes only, and so this code will not compile. Also, in this and other examples, we ignore non-lexical lifetimes that were added in Rust 2018 to simplify the explanation.
Subtyping
Technically, a lifetime is not a type, because we cannot construct an instance of it, unlike regular types like u64 or Vec<T>. However, when we parametrize functions or structs, lifetime-parameters are used just like type-parameters, see the announce example above. Also, the Variance rules that we will see later operate on lifetimes as if they were types, so we will go along with calling them types in this post.
It is useful to compare lifetimes to regular types, and lifetime-parameters to regular type-parameters:
When the compiler infers a type for a regular type-parameter, it complains if multiple types can satisfy the given type-parameter. In the case of lifetimes, multiple lifetimes can satisfy the given lifetime-parameter, and the compiler takes the minimal one.
Simple Rust types do not have subtyping; more specifically, a struct cannot be a subtype of another struct unless it has lifetime-parameters. However, lifetimes allow subtyping: if lifetime 'longer completely encloses the lifetime 'shorter, then 'longer is a subtype of 'shorter. Lifetime subtyping also enables limited subtyping on types that are parameterized with lifetimes. As we will see later, this implies that &'longer i64 is a subtype of &'shorter i64. The 'static lifetime is a subtype of all lifetimes because it is the longest. 'static is kind of the opposite of the Object type in Java, which is a supertype of all types.
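As a small illustration (ours, not from the original post), a &'static reference can be passed anywhere a shorter-lived reference is expected, precisely because 'static is a subtype of every lifetime:
fn print_ref<'a>(s: &'a str) {
    println!("{}", s);
}
fn main() {
    let greeting: &'static str = "hello"; // string literals live for the whole program
    print_ref(greeting); // fine: &'static str is a subtype of &'a str
}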
The Rules
Coercions vs Subtyping
Rust has a set of rules that allow one type to be coerced into another. While coercions and subtyping look similar, it is important to differentiate one from the other. The key difference is that subtyping does not change the underlying value while coercion does. Specifically, the compiler inserts some extra code at the site of a coercion to perform a low-level conversion, while subtyping is just a compiler check. Since this extra code is hidden from the developer, coercions and subtyping look visually similar, because both might look like this:
let b: B;
...
let a: A = b;
Side-by-side coercion and subtyping:
// This is coercion:
let values: [u32; 5] = [1, 2, 3, 4, 5];
let slice: &[u32] = &values;
// This is subtyping:
let val1 = 42;
let val2 = 24;
'x: {
let ref1 = &'x val1;
'y: {
let mut ref2 = &'y val2;
ref2 = ref1;
}
}
This code works, because 'x is a subtype of 'y and so &'x is a subtype of &'y.
It is easy to differentiate the two by learning some of the most common coercions; the rest are much less frequent, see the Rustonomicon (a short example follows the list):
Pointer weakening: &mut T to &T
Deref: &x of type &T to &*x of type &U, if T: Deref<Target=U>.
This allows using smart pointers like they were regular references.
[T; n] to [T].
T to dyn Trait, if T: Trait.
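Here is a tiny illustration (ours, not from the original post) of the first two coercions from the list:
fn main() {
    let mut x = 5u32;
    let mr: &mut u32 = &mut x;
    let r: &u32 = mr; // pointer weakening: &mut u32 coerces to &u32
    println!("{}", r);
    let boxed: Box<u32> = Box::new(7);
    let r2: &u32 = &boxed; // deref coercion: &Box<u32> coerces to &u32 via Deref
    println!("{}", r2);
}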
You might wonder why the fact that 'x is a subtype of 'y implies that &'x T is a subtype of &'y T. To answer this we need to discuss Variance.
Variance
It is easy to tell whether lifetime 'longer is a subtype of a lifetime 'shorter based on the previous section. You can even intuitively understand why &'longer T is a subtype of &'shorter T. But can you tell whether &'a mut &'longer T is a subtype of &'a mut &'shorter T? It is actually not, and to understand why we need the rules of Variance.
As we said before, lifetimes enable limited subtyping on types that are parameterized with lifetimes. Variance is a property of the type-constructors, where type-constructor is a type with parameters, e.g. Vec<T> or &mut T. Specifically, variance determines how subtyping of the parameters affects the subtyping of the resulting type. If type-constructor has multiple parameters, e.g. F<'a, T, U> or &'b mut V, then the variance is computed with respect to each parameter individually.
There are three kinds of variance:
F<T> is covariant over T if F<Subtype> is a subtype of F<Supertype>;
F<T> is contravariant over T if F<Subtype> is a supertype of F<Supertype>;
F<T> is invariant over T if F<Subtype> is neither a subtype nor a supertype of F<Supertype>, making them incompatible.
When a type-constructor has multiple parameters, we can talk about individual variances by saying that, for example, F<'a, T> is covariant over 'a and invariant over T. There is actually a fourth kind of variance, bivariance, but it is a compiler implementation detail that we do not need to touch here.
The table of variances for the most common type-constructors is given in the Rustonomicon; the cases we will need below are &'a T (covariant over 'a and T), &'a mut T (covariant over 'a, invariant over T), Box<T> and Vec<T> (covariant over T), Cell<T> and UnsafeCell<T> (invariant over T), and fn(T) -> U (contravariant over T, covariant over U).
Covariance is basically a pass-through rule. Contravariance is rare and occurs only when one passes a pointer to a function that uses higher-rank trait bounds. Invariance is the most important one and we will see its motivation when we start combining variances.
Variance arithmetic
Now we know what the subtypes and supertypes of &'a mut T and Vec<T> are, but do we know what the subtypes and supertypes of &'a mut Vec<T> and Vec<&'a mut T> are? To answer this we need to know how to combine the variances of type-constructors.
There are two mathematical operations for combining variances: Transform and the greatest lower bound, GLB. Transform is used for type composition, while GLB is used for all aggregates: struct, tuple, enum, and union. Let us denote in-, co-, and contravariance with 0, +, and -, respectively. Transform (x) behaves like sign multiplication with invariance as an absorbing element: + x + = +, + x - = -, - x - = +, and any combination involving 0 gives 0. GLB (^) returns the common variance when both operands agree and invariance otherwise: + ^ + = +, - ^ - = -, 0 ^ 0 = 0, and any mixed combination gives 0.
The full tables are taken from the post by kennytm referenced in the Credits, which also contains additional details.
Example
Suppose we want to know whether Box<&'longer bool> is a subtype of Box<&'shorter bool>. In other words, we want to know the variance of Box<&'a bool> with respect to 'a. &'a bool is covariant with respect to 'a, and Box<T> is covariant with respect to T. Since it is a composition, we apply Transform (x): covariant (+) x covariant (+) = covariant (+). This means we can assign a Box<&'longer bool> where a Box<&'shorter bool> is expected.
Similarly, Cell<&'longer bool> cannot be assigned to Cell<&'shorter bool> because covariant (+) x invariant (0) = invariant (0).
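A tiny illustration of this (ours, not from the original post): the Box version compiles, while the commented-out Cell version does not:
fn shorten<'a>(b: Box<&'static bool>) -> Box<&'a bool> {
    b // fine: Box<T> is covariant over T, so Box<&'static bool> is a subtype of Box<&'a bool>
}
// use std::cell::Cell;
// fn shorten_cell<'a>(c: Cell<&'static bool>) -> Cell<&'a bool> {
//     c // error: Cell<T> is invariant over T, so the lifetimes would have to match exactly
// }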
Example
The following example from the Rustonomicon demonstrates why we need invariance on some type-constructors. It attempts to construct code that uses a freed object.
fn evil_feeder<T>(input: &mut T, val: T) {
*input = val;
}
fn main() {
let mut mr_snuggles: &'static str = "meow! :3"; // mr. snuggles forever!!
{
let spike = String::from("bark! >:V");
let spike_str: &str = &spike; // Only lives for the block
evil_feeder(&mut mr_snuggles, spike_str); // EVIL!
}
println!("{}", mr_snuggles); // Use after free?
}
Taken from Rustonomicon
The Rust compiler will not allow it. To understand why, we first desugar some of the code:
fn evil_feeder<'a, T>(input: &'a mut T, val: T) {
*input = val;
}
fn main() {
let mut mr_snuggles: &'static str = "meow! :3";
{
let spike = String::from("bark! >:V");
'x: {
let spike_str: &'x str = &'x spike;
'y: {
evil_feeder(&'y mut mr_snuggles, spike_str);
}
}
}
println!("{}", mr_snuggles);
}
During compilation, the compiler will try to find a parameter T that satisfies the constraints. Recall that the compiler takes the minimal lifetime, so it will attempt to use &'x str for T. Now, the first argument of evil_feeder is &'y mut &'x str, and we are trying to pass &'y mut &'static str instead. Will it work?
For it to work, &'y mut &'z str would have to be covariant over 'z, because 'static is a subtype of 'x. Recall that &'y mut T is invariant with respect to T and &'z T is covariant with respect to 'z. So &'y mut &'z str is invariant with respect to 'z, because covariant (+) x invariant (0) = invariant (0). It will not compile. The crisis has been averted!
Interestingly, this code would compile if written in C++.
Example with structs
With structs, we need to use GLB instead of Transform, but it only matters when contravariance is involved, which only happens when we use function pointers. Here is an example that will not compile, because struct Owner is invariant with respect to the lifetime-parameter 'c, which is also indicated by the compiler error message: type annotation requires that `spike` is borrowed for `'static`. Invariance essentially disables subtyping, and so the lifetime of spike would have to match mr_snuggles exactly:
struct Owner<'a:'c, 'b:'c, 'c> {
pub dog: &'a &'c str,
pub cat: &'b mut &'c str,
}
fn main() {
let mut mr_snuggles: &'static str = "meow! :3";
let spike = String::from("bark! >:V");
let spike_str: &str = &spike;
let alice = Owner { dog: &spike_str, cat: &mut mr_snuggles };
}
Outro
It is hard to memorize all these rules and we do not want to search through them every time we encounter a difficult situation in Rust. The best way to develop intuition is to understand and remember unsafe situations that these rules prevent.
The first example with shifting coordinates allows us to remember the borrowing rules that prevent simultaneous mutation and aliasing.
&'a T and &'a mut T are covariant over 'a, because it is always ok to pass a longer-living reference where a shorter one is expected, except when they are wrapped in a mutable reference and the like.
&'a mut T, UnsafeCell<T>, Cell<T>, and *mut T allow mutable access, so to prevent situations like the evil_feeder example above we want them to be invariant over T, which implies an exact match of the lifetimes.
With each release, Rust moves in the direction of usability and friendliness; however, lifetimes remain a core concept that still requires a deep dive. This post assembled information from various resources so that you make one deep dive instead of many 🙂
Credits
The following resources were used to compile this post:
The entire Rustonomicon is a journey through ownership, the borrowing rules, and lifetimes.
Variance in Rust by kennytm does some ol’ compiler source code reading to present us with rigorous rules of Variance and Variance Arithmetic.
Subtyping and coercion in Rust by Nick Cameron discusses differences between coercion and subtyping.
Finally, thanks to Michael Kever, David Stolp a.k.a pieguy and everyone else in our Near Protocol team for the discussions of the lifetimes. For those interested in Near Protocol: we build a sharded general purpose blockchain with a huge emphasis on usability. If you like our write-ups, follow us on twitter to learn when we post new content:
http://twitter.com/nearprotocol
If you want to be more involved, join our Discord channel where we discuss all technical and non-technical aspects of Near Protocol, such as consensus, economics, and governance:
https://discord.gg/nqAXT7h
Near Protocol is being actively developed, and the code is open source, follow our progress on GitHub:
https://github.com/nearprotocol/nearcore
Upd
According to @centril, desugaring fn announce(value: &impl Display) to fn announce<'a, T>(value: &'a T) where T: Display is not actually a valid in-language desugaring, because turbofish would not allow applying a type like that. However, the point still stands.
In this post, we said that lifetimes are not types, but for simplicity we decided to call them types. Also, as pointed out by @centril lifetimes are what is called type-like. |
AI is NEAR: Illia Polosukhin on AI and the Open Web at NEARCON ’23
NEAR FOUNDATION
October 26, 2023
Artificial Intelligence and Web3 have been two of the hottest topics of 2023. There has been significant interest in how the two technology ecosystems may converge in the coming years, with lots of interest from investors.
For NEARCON ‘23, we’ve designed a special track for AI and the open web: ”AI is NEAR”. Leading the way will be NEAR co-founder Illia Polosukhin, a pioneer in the world of AI.
Illia will share his perspective on this past year’s AI revolution, including its intersection with the Web3 space on NEAR in the months to come.
Attention Is All You Need
By now, everyone knows ChatGPT, which kicked off the latest AI revolution. But years before OpenAI released its AI chatbot, a group of researchers published a paper that would revolutionize the fields of AI and machine learning.
The “Attention is All You Need” paper, of which Illia is a co-author, introduced transformers — the “T” in ChatGPT. Illia’s research was instrumental in laying the groundwork for the major LLMs of today, such as ChatGPT and Bard.
At NEARCON ‘23, Illia will draw on his deep expertise in AI to offer a perspective on how AI and Web3 will converge and how blockchains can help AI develop in a more positive direction, mitigating potential risks and introducing more transparency. He will share the strategy for AI on NEAR going forward, including evolving NEAR governance with AI agents and the emergence of new business models.
Illia Polosukhin and TechCrunch’s Mike Butcher in a fireside chat
In this candid discussion, Illia and TechCrunch’s Mike Butcher will explore the evolving role of AI in the open web, what we can expect next, and why the current state of AI controlled by major corporations needs to change.
As the case for open source AI becomes clearer, and as regulators across the globe mitigate risk with new policies, the question now becomes: how can Web3 help AI evolve in a more positive direction?
Join the AI and Open Web chat at NEARCON ‘23
With some of the brightest thought leaders in AI and the open web, NEARCON ‘23 is one not to be missed. Alongside Illia, you will also hear from NEAR co-founder Alex Skidanov, who is currently working on an AI startup that is building smarter large language models, or LLMs, in a more decentralized way.
There will also be a panel discussion on how AI and blockchain technologies will integrate with modern work environments. Niraj Pant, previously an investor at Polychain who is launching a startup at the intersection of AI and Web3, will also deliver a keynote on how AI can be leveraged for decentralized applications and services.
The full NEARCON ‘23 schedule is now live, so check out the other tracks!
If you’re a student in Spain or Portugal, or a Ukrainian citizen living in Portugal, you’re eligible for free tickets. Hackers also get in for free by applying to the NEARCON IRL Hackathon.
|
---
sidebar_position: 4
---
# Deploying Contracts
You might want your smart contract to deploy subsequent smart contract code for a few reasons:
* The contract acts as a Factory, a pattern where a parent contract creates many child contracts ([Mintbase](https://www.mintbase.io/) does this to create a new NFT store for anyone who wants one; [Rainbow Bridge](https://near.org/bridge/) does this to deploy separate Fungible Token contracts for [each bridged token](https://github.com/aurora-is-near/rainbow-token-connector/blob/ce7640da144f000e0a93b6d9373bbc2514e37f3b/bridge-token-factory/src/lib.rs#L311-L341))
* The contract [updates its own code](../../../2.build/2.smart-contracts/release/upgrade.md#programmatic-update) (calls `deploy` on itself).
* You could implement a "contract per user" system that creates app-specific subaccounts for users (`your-app.user1.near`, `your-app.user2.near`, etc) and deploys the same contract to each. This is currently prohibitively expensive due to NEAR's [storage fees](https://docs.near.org/concepts/storage/storage-staking), but that may be optimized in the future. If it is, this sort of "sharded app design" may become the more scalable, user-centric approach to contract standards and app mechanics. An early experiment with this paradigm was called [Meta NEAR](https://github.com/metanear).
If your goal is to deploy to a subaccount of your main contract like Mintbase or the Rainbow Bridge, you will also need to create the account. So, combining concepts from the last few pages, here's what you need:
```rust
use near_sdk::{env, Promise};

// Wasm of the contract to deploy, embedded into this contract at compile time
const CODE: &[u8] = include_bytes!("./path/to/compiled.wasm");

// Return this Promise from a contract method to schedule the actions
Promise::new("subaccount.example.near".parse().unwrap())
    .create_account()
    .add_full_access_key(env::signer_account_pk())
    .transfer(3_000_000_000_000_000_000_000_000) // 3e24yN, 3N
    .deploy_contract(CODE.to_vec())
```
Here's what a full contract might look like, showing a naïve way to pass `code` as an argument rather than hard-coding it with `include_bytes!`:
```rust
use near_sdk::borsh::{self, BorshDeserialize, BorshSerialize};
use near_sdk::{env, near_bindgen, AccountId, Balance, Promise};

const INITIAL_BALANCE: Balance = 3_000_000_000_000_000_000_000_000; // 3e24yN, 3N

#[near_bindgen]
#[derive(Default, BorshDeserialize, BorshSerialize)]
pub struct Contract {}
#[near_bindgen]
impl Contract {
#[private]
pub fn create_child_contract(prefix: AccountId, code: Vec<u8>) -> Promise {
let subaccount_id = AccountId::new_unchecked(
format!("{}.{}", prefix, env::current_account_id())
);
Promise::new(subaccount_id)
.create_account()
.add_full_access_key(env::signer_account_pk())
.transfer(INITIAL_BALANCE)
.deploy_contract(code)
}
}
```
Why is this a naïve approach? It could run into issues because of the 4MB transaction size limit – the function above would deserialize and heap-allocate a whole contract. For many situations, the `include_bytes!` approach is preferable. If you really need to attach compiled Wasm as an argument, you might be able to copy the approach [used by Sputnik DAO v2](https://github.com/near-daos/sputnik-dao-contract/blob/a8fc9a8c1cbde37610e56e1efda8e5971e79b845/sputnikdao2/src/types.rs#L74-L142).
|
Avalanche vs The new IOTA consensus algorithm, with a touch of Spacemesh
DEVELOPERS
June 5, 2019
Intro
IOTA Foundation recently announced a project called Coordicide, and an accompanying paper. The goal of the project is to remove the so-called IOTA Coordinator, a centralized service that finalizes transactions on IOTA.
The paper outlines multiple changes that are being considered in order to make it possible for the protocol to remove the coordinator and thus make IOTA completely decentralized.
One of the most interesting changes proposed is the new consensus algorithm for choosing between multiple conflicting transactions. The consensus algorithm is described in a separate paper that Dr. Serguei Popov, one of the core researchers at IOTA, published shortly before the Coordicide project was announced.
The new consensus design attracted a lot of interest because of the similarities people drew between it and Avalanche, a paper that was publicly endorsed by Emin Gun Sirer (who is also one of Avalanche’s likely co-authors) a few years ago.
DAGs of Transactions
Indeed, IOTA and Avalanche are both built around a directed acyclic graph (DAG) of transactions, with edges of the DAG representing approvals.
In both IOTA and Avalanche the DAG is used to reduce the amount of communication needed in order to finalize the transactions. If a certain transaction is final from the perspective of some node, then all the transactions approved by that transaction are also final (such transactions are called Ancestry in Avalanche and Past Cone in IOTA). If a particular transaction is rejected (which can only happen if the transaction spends the same UTXO as some other approved transaction) then all the transactions that approve the rejected transaction are also rejected (such transactions are called Progeny in Avalanche and Future Cone in IOTA).
Consensus
At the core of such a DAG-based protocol is the consensus algorithm that chooses one transaction amongst several conflicting transactions, i.e. transactions that spend the same UTXO. Such consensus doesn’t have to converge if multiple conflicting transactions appear at approximately the same time, and have an approximately equal number of participants initially preferring each one, since conflicting transactions can only come from malicious actors, and the protocol doesn’t need to guarantee finality for such actors. But if a particular transaction came first, and a sufficiently large percentage of participants preferred it for sufficiently long, a conflicting transaction that comes later shall never become the preferred transaction. The motivation behind this is that more transactions now exist in the Progeny / Future Cone of the transaction that arrived first, and if it gets rejected, all the new transactions from the honest nodes will get rejected too.
The consensus protocol that chooses between multiple conflicting transactions presented in the Avalanche paper is called Snowball. The consensus protocol for IOTA is presented in the aforementioned paper.
Since the consensus algorithms pursue similar goals, they also are quite similar underneath. At the core of both consensus algorithms is a node doing the following procedure iteratively until it becomes sufficiently certain that consensus has been reached:
Choose a small sample of other nodes (on the order of 10) and query the outcome they currently prefer;
Update the node’s current belief based on the resulting votes.
The “update the current belief” part is the core of such consensus algorithms. Since the consensus protocols need to work in the presence of adversaries that will behave in such a way as to prevent the network from reaching consensus, naive approaches (such as just preferring the outcome that was preferred by the majority of sampled nodes, or changing preference if a large percentage of sampled nodes (say 80%) believe in the opposite outcome) do not work.
Consensus properties
Before we proceed, let’s define what “do not work” means. There are three ways in which these consensus algorithms can break:
Agreement failure — when two nodes both decide that some outcome was agreed upon, but those outcomes differ;
Termination failure — when no consensus is reached after an arbitrarily long period of time;
Integrity failure — when consensus is reached on some outcome, but that outcome was not proposed by anyone. An example of integrity failure is reaching consensus on value 0, when all the participants initially proposed the value of 1.
In the context of Snowball and the new IOTA consensus protocol, Agreement failure is absolutely not acceptable, and Integrity failure is also not acceptable, but in a slightly adjusted way. It is not only necessary that consensus is reached on the outcome that was proposed by someone, but also that if a majority of nodes were proposing some outcome, no other outcome shall be agreed upon, even if some nodes in the minority were proposing it.
The termination would also be desirable, but both protocols deemphasize it for the cases with more than one proposed outcome, arguing that more than one proposed outcome means multiple transactions spending the same UTXO, which can only come from malicious actors.
Both Snowball and the new IOTA consensus protocol provide agreement, at least as far as I can tell (though it’s important to note that the Avalanche paper has a typo that currently means Snowball doesn’t provide Agreement; with the typo fixed it is unlikely that Agreement can be violated). For both of them, it is easy to argue that if a majority of nodes initially sway towards one of the outcomes, the nodes will not switch to the other outcome no matter what malicious actors do, so Integrity (as defined above) is also present.
Termination of the new IOTA consensus and Snowball
The important difference comes when we consider Termination.
Let’s get back to the “update the current belief” part. After sampling the votes of the 10 peers a node needs to somehow adjust their current preference. In snowball each node maintains several counters to remember its confidence in each of the outcomes, and waits for several consistent consecutive samples before it changes its belief. This way adversaries cannot easily sway nodes towards one decision or the other, and cannot violate Agreement.
However, this doesn’t help much with Termination. In our simulation of Snowball we show that with the parameters provided in the paper, consensus can be kept in a metastable state for thousands of iterations if just 4% of the nodes are adversarial. With just 10% adversaries, not only does the consensus process remain in the metastable state indefinitely, but also, the nodes that believe in each outcome increase their confidence in their preferred outcome to very large numbers and continue to become more confident, bringing the consensus process into a state from which it provably cannot escape.
Thus, Snowball as-is only has Termination in the binary consensus setting with a very low percentage of adversaries. Here’s a simulation with 17% adversaries:
Read more about it here.
The IOTA consensus uses a very different approach. It does not maintain any counters, and instead does the following to choose between 0 and 1 (a small sketch follows these steps):
Each node samples current beliefs of k (say k=10) other nodes;
After that, all the nodes run some distributed randomness generator to generate some threshold between some value beta and 1 – beta. I.e. if beta = 0.3, then the value will be picked between 0.3 and 0.7;
Each node then chooses 1 if the number of nodes that preferred 1 in its sample was bigger than threshold * k, and otherwise it chooses 0.
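Here is a minimal, illustrative sketch (ours, with made-up parameter names) of the voting rule just described; note that the shared random draw is only applied after the sample has been collected:
fn next_preference(sampled_prefs: &[bool], beta: f64, shared_random_in_0_1: f64) -> bool {
    // The threshold is picked between beta and 1 - beta *after* sampling,
    // so an adversary cannot tune its reported votes around it.
    let threshold = beta + shared_random_in_0_1 * (1.0 - 2.0 * beta);
    let ones = sampled_prefs.iter().filter(|&&p| p).count() as f64;
    let k = sampled_prefs.len() as f64;
    ones > threshold * k // prefer 1 if enough sampled nodes preferred 1
}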
The core idea here is that since the random value is chosen after the samples were performed, even an omniscient adversary (an adversary who knows the current preferences and states of all the nodes, but not the random value that will be generated in the future) doesn’t know which threshold to sway the samples towards.
To understand why this helps, imagine that the adversary can actually predict the value of the random generator, and knows it will be 0.7. If k=10, the adversary knows that in order to keep the consensus process in a metastable state, it wants approximately half of the nodes to sample less than 7 ones, and approximately half the nodes to sample more than 7 ones. If it also knows that in the current population of honest nodes 62 nodes prefer 1, it will make exactly 8 of the malicious nodes report 1 as well (so that together with the honest nodes, exactly 70 nodes report 1), and the remaining malicious nodes report 0. This way the median number of sampled ones will be 7, and thus approximately half of the nodes will end up sampling more than beta * k = 7 ones and choose one, while approximately half of the nodes will end up sampling less than 7 ones and choose zero. The adversary will then continue doing it in the future rounds, preventing the consensus process from converging.
However, consider what happens if the adversary doesn’t know the threshold. The adversary could try to report different outcomes to the different honest nodes that query them, but no matter what distribution of sampled votes they end up signalling, (with some non-trivial probability) the threshold they choose will cause a large percentage of the honest nodes to have queried samples that lie on the same side of the threshold. Thus, a large percentage of the participants will end up choosing the same outcome for the next round.
Such a consensus algorithm is significantly harder to stall. However, it relies on the existence of a distributed random number generator that generates the required randomness for the thresholds. Such randomness generation is a rather hard problem, especially given that the consensus strives for low network overhead. If Snowball had access to a distributed random number generator, the protocol’s designers could just make nodes choose the outcome of a random number generator in the event that the consensus process gets stuck in a metastable state.
For example, an idea like this is used in one of the components of Spacemesh. See professor Tal Moran explaining this approach here in a whiteboard session we recorded with him a few weeks ago.
Outro
In short, the new IOTA consensus is definitely in the same family of consensus algorithms as Snowball, but it is far from being just a Snowball copycat. As described, it is likely to have better liveness, but it relies on the existence of a distributed random number generator, which is itself a complex problem (though the IOTA paper provides several references to existing research). If such a generator were available, it could be used in a variety of ways to escape the metastable state.
I work on a blockchain protocol called NEAR. It is a sharded protocol with a huge emphasis on usability. I wrote a lot about sharding before (see one and two), and also have a piece on usability.
Myself and my co-founder Illia often invite founders and core researchers of other protocols to a room with a whiteboard and record an hour-long video diving deep into their tech. We have episodes with Ethereum Serenity, Cosmos, Polkadot, Ontology, QuarkChain and many other protocols. All the episodes are conveniently assembled into a playlist here.
Follow myself and NEAR on Twitter to stay up to date with our progress and see new write-ups and whiteboard sessions. |
NEAR APAC Day One: Scaling Local Adoption, Regional Regulation, and AI in Web3
COMMUNITY
September 9, 2023
After many months in the making, the inaugural NEAR APAC conference kicked off today in Ho Chi Minh City, Vietnam. Held in conjunction with the NEAR Vietnam community and NEAR Foundation, NEAR APAC is showcasing the vibrant crypto and Web3 community across the region as well as the potential of the B.O.S. and Open Web both locally and globally.
The day was charged with excitement, drawing over 8,000 attendees from various corners of the APAC region and beyond. With Web3 thought leaders, innovators, and enthusiasts converging in Vietnam, Day One of NEAR APAC kicked off with a slew of riveting discussions, industry insights, and speculations on the future of the Open Web with a focus on APAC.
From discussions around regional growth and investment potential to how AI and blockchain will transform the future of work, here’s everything that went down at an incredible start to NEAR APAC.
Setting the stage for NEAR APAC with Vietnam in focus
The event kicked off with a focus on southeast Asia, with NEAR’s Vietnam country director Riley Tran welcoming attendees and providing context as to Asia’s critical role in the overarching development and expansion of Web3. Tran was followed by two NEAR luminaries, CEO Marieke Flament and co-founder Illia Polosukhin.
“Asia’s young demographics and rapid digital growth, especially in Vietnam, position it as a leader in crypto and blockchain growth,” Riley explained. “With over 100 million new internet users in just three years, Asia’s momentum in Web3 is accelerating.”
Marieke added that the overall NEAR ecosystem remains strong with over 29M active wallets and 650K daily active accounts. She added that a collaborative approach to building tomorrow’s Open Web is critical for tackling significant societal and digital challenges.
“Today’s Web 2.0 is closed and siloed, with the prevailing digital culture being a toxic dialectic,” Marieke expressed. “But the value loop of an Open Web holds immense promise. We’re already seeing how projects like KaiKai, SWEAT, and PlayEmber can move the needle in the right direction.”
Echoing Marieke’s sentiment, Illia emphasized that it’s not necessarily about championing a specific blockchain, but about creating a common layer and interface for the Open Web. This is precisely what the Blockchain Operating System (B.O.S.) is in the process of achieving.
“Looking forward, tech isn’t the only focus,” said Illia. “It’s about blending scalable solutions with user-centric experiences, with the B.O.S. becoming the essential entry point into Web3 for both users and developers.”
Asia’s role in mainstream adoption and regulatory hurdles
The push for mainstream blockchain adoption in APAC was also in focus on Day One, with NEAR Foundation CMO Jack Collier leading a panel discussion on current adoption challenges and how APAC can potentially lead the way in overcoming them.
“I’m extremely bullish on gaming as a key part of mass adoption,” opined Don Pham, Google Cloud’s regional Web3 specialist. “Vietnam is the birthplace of Axie Infinity, which is just the beginning of blockchain gaming’s potential. Use cases like loyalty, gaming, and music NFTs combined with better UX will likely be the path forward.”
The discussion dove deeper into the importance of user experience in the adoption of blockchain. Marieke reiterated that while technology forms the backbone, the real test will be simplifying the user journey, which can significantly hasten adoption across the APAC region as well as globally.
Regulation was also in focus with a panel about blockchain policies and APAC’s vision for governments and associations. Mary Beth Buchanan, a board member of the Cardano Foundation, drove home the need for regulatory clarity while discussing regional frameworks in countries like Hong Kong.
“The biggest challenge is not necessarily regulation, but lack of clarity,” observed Buchanan. “If regulations are unclear that will cause investors not to want to invest in the space. Builders become hesitant to build and users aren’t going to know what the rules are. Regulators need to recognize that clarity is what everyone wants.”
Spotlight on Artificial Intelligence during the afternoon sessions
Day One of NEAR APAC dedicated a significant chunk of its schedule to exploring the intricate interplay between AI and blockchain, particularly their role in APAC’s future technology and Web3 landscape. Illia took the stage just after the lunch break to discuss the Convergence of AI and Web3, diving deep into AI-empowered DAO and the future of AI-assisted work.
“While technologies like ChatGPT have found product market fit, much of AI is still siloed and doesn’t empower open source communities and projects,” Illia posited. “Web3 can operate in the middle, with AI models incorporating all participants and data sources. DAOs, for example, can be co-piloted by AI to achieve goals and KPIs faster and more efficiently.”
In the following panel discussion entitled “The Roles of Blockchains in an AI World,” Illia was joined by several AI and blockchain experts from the APAC region. One of the more distinguished panel members was Dr. Nguyen An Khuong, a lecturer and blockchain researcher from Ho Chi Minh University of Technology, who provided an academic perspective.
“We have a lot of research endeavors in Vietnam around ensuring AI’s ethical applications using blockchain,” commented Dr. Nguyen. “We’re looking at how transparency and accountability in AI decisions can be drastically improved with blockchain technology. Combining a neutral AI with a blockchain’s public ledger could enhance the ecosystem and foster wider adoption.”
Illia drove home that the potential “Skynet” scenario of AI taking over the world is mostly a projection of what humans might do if they were given supercomputing powers. In reality, developments like the B.O.S. and NEAR Tasks will aid humans in success and productivity, with AI agents acting as middlemen for assistance, sourcing, and even creativity.
Day One of the first-ever NEAR APAC was a rousing success, with these highlights just being the tip of the iceberg. The crowd was treated to bánh mì and manga cosplay during lunch, B.O.S. building and gaming workshops on the Builder Stage, and an awe-inspiring NFT gallery from APAC creators.
As the sun sets in Saigon, the NEAR Foundation couldn’t be more appreciative of the Vietnamese crypto community. The momentum rolls on tomorrow with Day Two, so stay tuned for more updates and insights around emerging APAC trends in Web3, how blockchain gaming is taking off in Asia-Pacific, and local DeFi developments.
If Day One was any indication, the NEAR community’s growth, presence, and development in APAC will only continue accelerating through 2023 and beyond! |
NEAR Foundation and Seracle Team to Nurture the Web3 Ecosystem with DevOps Efficiency
NEAR FOUNDATION
June 7, 2023
NEAR Foundation and Seracle, a top blockchain cloud platform, have established a partnership to revolutionize the world of Web3 development. This alliance, forged in the name of innovation and cost-efficiency, capitalizes on Seracle’s unique Litenode architecture.
This groundbreaking technology aims to drive down costs, with the potential for reducing a project’s monthly expenditures on node maintenance and DevOps by up to 90%.
Introducing the Litenode and the Web3 Incubation Center
The Litenode architecture is one of the linchpins in this strategic collaboration. Its design facilitates dramatic cost savings, all while ensuring quality isn’t compromised. But it doesn’t stop there.
The integration of a Web3 incubation center, nestled in the bustling tech hub of Pune, India, gives developers access to a fertile innovation environment. This collaborative center presents developers with a unique opportunity to immerse themselves in a vibrant ecosystem dedicated to the development, learning, and scaling of Web3 projects.
“Seracle’s enthusiasm, technical capabilities, and experience in the Web3 space make them an ideal partner to grow NEAR,” said Arpit Sharma, Managing Director of APAC, MENA of NEAR Foundation, expressing enthusiasm for the collaboration’s potential.
This partnership, with its focus on communal growth and knowledge sharing, is poised to cultivate an enriched community of developers ready to take on the challenges of the Web3 landscape.
How NEAR and Seracle are financing the next wave of Web3
To bring these ambitions to fruition, financial support is vital. Seracle has committed to providing substantial backing, offering platform credits from a pool of $100,000. These credits serve to alleviate the financial burdens of upfront costs, letting developers channel their focus into the creative process.
NEAR Foundation, in step with this initiative, pledges to offer grants to select innovative projects based at Seracle’s incubation center. Looking ahead, the partnership has set its sights high. The ambitious goal to onboard around 100 Web3 projects and engage with 5,000 developers this year underscores the commitment to growth and innovation.
Shrikant Bhalerao, CEO of Seracle, shares this vision and expresses his optimism about the partnership, stating that “NEAR boasts some of the most innovative solutions in the Web3 space. We anticipate that our partnership with NEAR will contribute to a significant 30% growth in our revenue.”
With such dedication to growth and innovation, the collaboration between NEAR Foundation and Seracle is shaping up to be an inspiring chapter in the evolution of the Web3 ecosystem. Leveraging cost-efficient technology, fostering a vibrant community, and providing robust financial support, this partnership stands as a beacon for the future of Web3 development. |
---
id: maintenance-windows
title: Maintenance Windows
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
The RPC API enables you to query future maintenance windows for a specific validator in the current epoch.
---
## Maintenance windows {#maintenance-windows}
> The maintenance windows for a specific validator are future block height ranges in the current epoch, in which the validator does not need to produce a block or chunk.
> If the provided account is not a validator, then it will return the range from now to the end of the epoch.
- method: `EXPERIMENTAL_maintenance_windows`
- params:
- `account_id`
example:
<Tabs>
<TabItem value="json" label="JSON" default>
```json
{
"jsonrpc": "2.0",
"id": "dontcare",
"method": "EXPERIMENTAL_maintenance_windows",
"params": {
"account_id": "node0"
}
}
```
</TabItem>
<TabItem value="http" label="HTTPie">
```bash
http post https://rpc.testnet.near.org jsonrpc=2.0 id=dontcare method=EXPERIMENTAL_maintenance_windows \
params:='{
"account_id": "node0"
}'
```
</TabItem>
</Tabs>
<details>
<summary>Example response:</summary>
<p>
The result will be a list of future maintenance windows in the current epoch.
For example, a window `[1028, 1031]` includes heights 1028, 1029 and 1030.
```json
{
"jsonrpc": "2.0",
"result": [
[
1028,
1031
],
[
1034,
1038
]
],
"id": "dontcare"
}
```
</p>
</details>
#### What Could Go Wrong?? {#what-could-go-wrong}
When an API request fails, the RPC server returns a structured error response with a limited number of well-defined error variants, so client code can exhaustively handle all the possible error cases. Our JSON-RPC errors follow the [verror](https://github.com/joyent/node-verror) convention for structuring the error response:
```json
{
"error": {
"name": <ERROR_TYPE>,
"cause": {
"info": {..},
"name": <ERROR_CAUSE>
},
"code": -32000,
"data": String,
"message": "Server error",
},
"id": "dontcare",
"jsonrpc": "2.0"
}
```
Here is the exhaustive list of the error variants that can be returned by the `EXPERIMENTAL_maintenance_windows` method:
<table className="custom-stripe">
<thead>
<tr>
<th>
ERROR_TYPE<br />
<code>error.name</code>
</th>
<th>ERROR_CAUSE<br /><code>error.cause.name</code></th>
<th>Reason</th>
<th>Solution</th>
</tr>
</thead>
<tbody>
<tr>
<td>INTERNAL_ERROR</td>
<td>INTERNAL_ERROR</td>
<td>Something went wrong with the node itself, or it is overloaded</td>
<td>
<ul>
<li>Try again later</li>
<li>Send a request to a different node</li>
<li>Check <code>error.cause.info</code> for more details</li>
</ul>
</td>
</tr>
</tbody>
</table>
|
Case Study: Aurora’s Alex Shevchenko on Scaling Solidity Smart Contracts
COMMUNITY
August 16, 2023
NEAR is home to a number of amazing apps and projects. These developers, founders, and entrepreneurs are using NEAR to effortlessly create and distribute innovative decentralized apps, while helping build a more open web — free from centralized platforms. In these Case Study videos, NEAR Foundation showcases some of these projects.
In the latest NEAR Foundation Case Study video below, Alex Shevchenko, co-founder and CEO of Aurora Labs, talks about the developer studio’s work in making it easier for developers to integrate Solidity smart contracts on NEAR.
“Aurora is a solution that allows you to launch your Solidity smart contracts on NEAR,” says Shevchenko. “We’re also adding something new: a scalability feature for the NEAR Protocol — something that can make EVM future-safe and make a destination of Aurora plus NEAR a final one for businesses to come.”
“We are focusing on the user experience,” he adds. “We have a whole set of products that are helping blockchain businesses to make the user experience much more convenient.”
|
---
id: usecases
title: Interacting with a Contract
sidebar_label: 💡 Interacting with a Contract
---
Here we enumerate case scenarios, and point to where the documentation is present.
---
## Integrating Contracts into a Web App
If you are developing a website (or a web-app), then you will be using `near-api-js` to communicate with the blockchain. Go to the [website](/tools/near-api-js/quick-reference) for more information about it.
---
## Command Line Interface
You can use [NEAR CLI](./cli.md) to automate tasks from the command line, such as:
- Creating sub-accounts
- Deploying contracts to them
- Calling initialization methods
---
## Querying Post Hoc Information
The [NEAR Indexer](./indexer4explorer.md) enables you to query information from a deployed contract such as:
1. Which users called a specific method?
2. How much money did they attach?
3. How much GAS was used?
It is very useful for analyzing scenarios that happened in the past.
---
## Getting Real Time Information
If you want to track real time information from your contract, then you need the [Events framework](/tools/realtime).
|
---
id: mpc
title: Multi-Party Computation (MPC)
---
MPC, or multi-party computation, is about how multiple parties can do shared computations on private inputs without revealing the private data.
As an example, suppose two investors want to compare who holds more crypto tokens without revealing their account balances. MPC can solve this by jointly computing the function `x > y`, where `x` and `y` are private inputs. Each person submits a private value and learns only the result of `x > y`.
In general, MPC can be used to build all kinds of useful protocols, like threshold cryptography, dark pools, and private auctions. For example, MPC can be used to jointly encrypt a message, with the key split up among many different parties.
<details>
<summary> MPC versus key splitting</summary>
In secret sharing, the key has to get reassembled. At some point, some trusted party is going to have the entire key available to them. With MPC, the whole operation is done in MPC, meaning there's no point where the combined key could be extracted.
</details>
:::info
Want to learn more about multi-party computation? Check [this article](https://www.zellic.io/blog/mpc-from-scratch/).
:::
---
## MPC signature generation
- MPC nodes perform a multistep process called signature generation.
- They do it using user key shares derived from their root key shares.
- A root key is never reconstructed, but the protocol allows signatures to be created using its shares.
:::info
Using MPC, the root key is never reconstructed and is never available. The user key is never reconstructed either.
:::
## How MPC creates a new key
- Once MPC account verification is complete, a root key becomes available to sign a new signature that creates a new key
- This new key is created using [Additive Key Derivation](https://github.com/bitcoin/bips/blob/master/bip-0032.mediawiki#specification-key-derivation) (a mechanism for deriving many _sub-keys_ from a single _master key_)
- This new sub-key can now be used to sign a payload for a given account associated with a given blockchain |
```js
Near.call(
"primitives.sputnik-dao.near",
"add_proposal",
{
proposal: {
description: "My first proposal",
kind: {
Transfer: {
token_id: "",
receiver_id: "bob.near",
amount: "10000000000000000000000000",
},
},
},
},
  300000000000000, // attached gas: 300 Tgas
  100000000000000000000000 // attached deposit: 0.1 NEAR in yoctoNEAR
);
``` |
---
title: 2.5 Composability
description: Open source and open protocols are the foundation of Web3. This article explains how composability is the key to unlocking the full potential of Web3
---
# 2.5 Composability and Why It Matters
There is perhaps no more important thesis developed in crypto over the last five years than the notion of ‘_composability_’. Unlike the siloed and privatized Web 2 internet, crypto has largely been built out in the open, whereby protocols, dApps, and user activity are visible on-chain, or extractable by any person willing to build their own indexer.
What this fundamental design characteristic does is invert the ‘business models’ of internet systems up to this point: in Web 2, behavioral surplus and user data were the commodities used by private tech companies to advertise and manipulate user behavior. This ‘exclusive’ access to information jumpstarted the creation of the wealthiest companies of the last 50 years. In Web3, public open-source systems undergird the movement of value between systems in such a way that value can blend across them based on user interest and product design. This is known as composability, and it is the basis for a thesis arguing that value in the next 50 years is going to emerge within and between different products inside of blockchain ecosystems.
_Composability: The ability for protocols and dApps inside of crypto-ecosystems to be able to communicate with each other, such that the value from between different systems can integrate and harmonize over time, and automatically._
To break down the definition of composability: We are strictly referring to how the internet of value allows different types of value built or created on top of it, to talk to each other and to streamline into aggregate wholes.
The precedent for the composability thesis is firmly rooted in economic history and the theory of the firm (Douglass North and Ronald Coase): maximizing efficiency and minimizing transaction costs is one of the most, if not the most, fundamental determinants of a successful business. That is to say, making it simple and streamlined for value to be created, shared, and accessed greatly expands the amount of value created over time. Institutions, culture, and socialization strongly impact these factors (read: protocol design, community, and active users).
The composability thesis argues for a world with self-executing code (smart contracts) at the center of new economic systems, such that transaction costs can go down to close to zero, and efficiency can be maximized.
_The Composability Thesis: That value in Web3 accrues at the intersection of different systems talking to each other, such that new forms of value can be created by the automatic and smart-contract based management of different types of value._
The theory of the composability thesis suggests that the realization of the internet of value is best achieved when the different systems built on top of it are able to talk to each other. From the perspective of this thesis, one-dimensional products that do not integrate with other dApps are backwards looking, while dApps and protocols that actively connect, facilitate, and utilize other assets, tokens, accounts, or data are forward looking. While this leads us to a discussion of money legos and new forms of value as short-term considerations on how the _composability thesis_ will evolve over time, the long-term bet on composability centers on maximizing the network effects of dApps and products within L1 ecosystems.
## Money Legos
The idea of money legos is that certain tokens of value can be used across different protocols to create or optimize further forms of value. The classic example refers to lending money on a decentralized protocol:
_Example 1:_
[L1] [Infrastructure] Lending → Tokens → stable value
_The logic is that once there are a suite of liquid fungible tokens in an ecosystem - that represent some specific type of value - for example for fees from a DEX, for governance of a protocol, or from the original L1 itself - these tokens can be held in escrow, from which the user can receive a loan in a stable form of value for having locked their fungible asset. This essentially allows users to optimize the type of value they are managing, by locking up certain forms of value, so as to take loans in other forms of value that can then be used to purchase other assets or participate in other facets of an ecosystem._
_Example 2:_
[L1] [Infrastructure] Data Upload → Carbon Credit → Carbon Backed Stable Coin → Purchasing of Art → Loaning of Art for future value.
_This is a more complex example straight from what is being built on Open Forest Protocol. The idea of composability here is straightforward: Data collected about a forest in the field, can be uploaded on-chain and used as a basis for generating a carbon credit. This carbon credit can then be used to create a stable-coin backed by carbon, such that stable value can be derived from locked carbon assets. This stable value can then be used to purchase other assets, for example, tokenized art from Renaissance DAO. And this tokenized art could then be locked up to or loaned out to a gallery for further value._
Composability across systems is what allows raw data from a forest, to eventually turn into tokenized art loaned out to a gallery! And while normal systems of value operate in this way with cash, the facilitation between assets and cash, is done manually, with high transaction costs and under inefficient mechanisms. With smart contracts, we can facilitate value transfer and composability of value automatically, seamlessly, and solely with communication between user and software (no negotiation needed!).
What this ultimately means is that the way _existing value is managed_ can be drastically changed in Web3, with the composability thesis arguing that the ecosystem that can best facilitate this transformation of existing value will benefit the most.
## Tokenizing New Forms of Value
However, beyond the transformation of existing value, the composability thesis argues that Web3 is optimally positioned to create entirely new forms of value - hitherto unquantifiable, imaginable, or liquid for purchase and resale.
Some examples of these new forms of value include:
_(Dynamic) Non-Fungible Value:_ User actions, user participation, user time committed, user personal data, and so forth. Dynamically, this refers to the latter actions, at certain times, in certain locations, or in conjunction with certain events (circumstances / holidays / and so forth). Non-Fungible value would also apply to things like academic research and original content (music, film, etc.).
_Fungible Value:_ Systems value created from protocols and dApps - this could be sensor data, biological data, tokenized assets, commodities, real-estate, and other value that normally underlies liquid financial markets (bonds, ETF’s, etc.)
_From Emerging Technologies._ Game items, virtual reality events, AI-generated art, sensor data, live-stream video feed, etc.
The composability thesis would then argue, that the real value proposition of crypto is for someone to be able to use their genome, to earn money, to invest in tokenized real estate, to loan against stable value, to buy art to license to a gallery - all on-chain and all with interaction facilitated between the self-executing contract, and the user (whereby self-executing contract could also be an organized community such as a Decentralized Autonomous Organization or On-chain Organization).
So, from a 10,000 foot view, what is the composability thesis ultimately arguing? It is arguing that in the coming decade, the capacity for emerging technologies to quantify and collect information about the world, combined with our ability to digitize and tokenize facets of human experience (non-fungible value), conjoined with the ability for smart contracts to manage the transfer, exchange, collateralization, creation, and velocity of this value, will be the nexus from which the most successful, affluent, and influential ecosystems of the coming decades emerge.
## Geopolitics Once Again: Inflows and Outflows of Ecosystem Native Value
When we look at value in a national economy, we tend to look at key features like imports and exports to gauge its health: strong economies import products they need or could use in order to produce other types of products that they export. Some economies export services more than products. The import and export of such value is denominated in some type of currency, of which the most powerful or most socially accepted becomes the ‘reserve’ currency for value transfer between states.
In crypto, we are witnessing the emergence of a similar phenomenon, but slightly different in terms of the nature of the value in question: Productive L1 ecosystems, generate and export _ecosystem native value_ - that is to say, value that originated inside of the ecosystem itself. This could be from types of data, art, bridged assets (USDC backed by treasury bonds), and so forth. Emergent L1 ecosystems, _import value_ in preparation for leveraging that value to eventually export value. Dominant ecosystems, import and export value, with ever increasing users, communities, and organizations settling inside of the ecosystem to offer services and products from such value.
The composability thesis argues that maximizing outflows of ecosystem native value comes from maximizing composability within the ecosystem. Furthermore, inflows should be used for increasing the types and quantity of value created in an ecosystem, such that more users, organizations, and liquidity can accrue.
|
# Staking and slashing
## Stake invariant
`Account` has two fields representing its tokens: `amount` and `locked`. `amount + locked` is the total number of
tokens an account has: locking/unlocking actions involve transferring balance between the two fields, and slashing
is done by subtracting from the `locked` value.
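As a rough illustration of this invariant, here is a minimal Rust sketch. The field names come from the text above, but the `Balance` alias and the method bodies are simplified assumptions, not the actual nearcore implementation:

```rust
// Simplified sketch of the invariant described above; not the real nearcore types.
pub type Balance = u128;

pub struct Account {
    /// Liquid tokens.
    pub amount: Balance,
    /// Tokens locked for staking.
    pub locked: Balance,
}

impl Account {
    /// `amount + locked` is the account's total balance; locking and unlocking
    /// only move tokens between the two fields, so the total is unchanged.
    pub fn total(&self) -> Balance {
        self.amount + self.locked
    }

    /// Locking moves liquid balance into `locked` (e.g. on a stake action).
    pub fn lock(&mut self, value: Balance) {
        assert!(value <= self.amount, "not enough liquid balance");
        self.amount -= value;
        self.locked += value;
    }

    /// Unlocking moves balance back; per the text, this only happens at epoch start.
    pub fn unlock(&mut self, value: Balance) {
        assert!(value <= self.locked);
        self.locked -= value;
        self.amount += value;
    }

    /// Slashing subtracts directly from `locked`, reducing the total.
    pub fn slash(&mut self, value: Balance) {
        self.locked = self.locked.saturating_sub(value);
    }
}
```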
On a stake action the balance gets locked immediately (but the locked balance can only increase), and the stake proposal is
passed to the epoch manager. Proposals get accumulated during an epoch and get processed all at once when an epoch is finalized.
Unlocking only happens at the start of an epoch.
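As a rough sketch of the stake-action behavior just described (the function, its signature, and the plain `Vec` of proposals are invented for illustration; the real node forwards proposals to the epoch manager):

```rust
// Illustrative only: a stake action locks balance immediately (locked never
// decreases mid-epoch) and records the proposal for later processing.
fn handle_stake_action(
    amount: &mut u128,               // liquid balance
    locked: &mut u128,               // staked balance
    proposed_stake: u128,
    epoch_proposals: &mut Vec<u128>, // accumulated, processed when the epoch is finalized
) {
    if proposed_stake > *locked {
        let delta = proposed_stake - *locked;
        assert!(delta <= *amount, "not enough liquid balance to stake");
        *amount -= delta;
        *locked += delta;
    }
    // A lower proposal does not unlock anything now; unlocking waits for
    // the start of a future epoch.
    epoch_proposals.push(proposed_stake);
}
```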
An account's stake is defined per epoch and is stored in `EpochInfo`'s `validators` and `fishermen` sets. `locked` is always
equal to the maximum of the account's last three stakes and its highest proposal in the current epoch.
### Returning stake
`locked` is the number of tokens locked for staking; it's computed the following way:
- initially it's the value in genesis or `0` for new accounts
- on a staking proposal with a value higher than `locked`, it increases to that value
- at the start of each epoch it's recomputed:
1. consider the most recent 3 epochs
2. for non-slashed accounts, take the maximum of their stakes in those epochs
3. if an account made a proposal in the block that starts the epoch, also take the maximum with the proposal value
4. change `locked` to the resulting value (and update `amount` so that `amount + locked` stays the same)
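In code, the recomputation at the start of an epoch could look roughly like the following sketch of steps 1-4 for a non-slashed account. The function and its inputs are invented for illustration and do not mirror the real `EpochInfo` API:

```rust
// Sketch of steps 1-4 above, for a non-slashed account.
fn recompute_locked(
    amount: &mut u128,                         // liquid balance
    locked: &mut u128,                         // staked balance
    stakes_last_three_epochs: &[u128],         // step 1: stakes from the 3 most recent epochs
    proposal_in_first_block: Option<u128>,     // step 3: proposal in the epoch's first block, if any
) {
    let total = *amount + *locked; // invariant: `amount + locked` stays the same

    // Step 2: take the maximum stake over the most recent epochs.
    let mut new_locked = stakes_last_three_epochs.iter().copied().max().unwrap_or(0);

    // Step 3: a proposal in the block that starts the epoch can raise it further.
    if let Some(proposal) = proposal_in_first_block {
        new_locked = new_locked.max(proposal);
    }

    // Step 4: update `locked` and rebalance `amount` so the total is unchanged.
    *locked = new_locked;
    *amount = total - new_locked;
}
```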
### Slashing
TODO.
|
---
sidebar_position: 2
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
# Sending $NEAR
You might want to send tokens from a contract for many reasons.
* The contract uses something like the [Storage Standard](https://nomicon.io/Standards/StorageManagement) and needs to return deposits to users when they unregister.
* Users pay into the contract and the contract later pays these fees to the maintainers, redistributes them to users, or disburses them to some cause the users vote on.
* And more!
Blockchains give us programmable money, and the ability for a smart contract to send tokens lies at the heart of that ability.
NEAR makes this easy. Transferring NEAR tokens is the simplest transaction you can send from a smart contract. Here's all you need:
```rust
let amount: u128 = 1_000_000_000_000_000_000_000_000; // 1 $NEAR as yoctoNEAR
let account_id: AccountId = "example.near".parse().unwrap();
Promise::new(account_id).transfer(amount);
```
In the context of a full contract and function call, this could look like:
```rust
use near_sdk::{json_types::U128, near_bindgen, AccountId, Promise};

#[near_bindgen]
pub struct Contract {}

#[near_bindgen]
impl Contract {
    pub fn pay(amount: U128, to: AccountId) -> Promise {
        Promise::new(to).transfer(amount.0)
    }
}
```
Most of this is boilerplate you're probably familiar with by now – imports, setting up [`near_bindgen`](../contract-structure/near-bindgen.md), [borsh](../contract-interface/serialization-interface.md), etc. Some interesting details related to the transfer itself:
* `U128` with a capital `U`: The `pay` method defined here accepts JSON as input, and numbers in JS [cannot be larger than `2^53-1`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number/MAX_SAFE_INTEGER), so for compatibility with deserializing JSON to JS, the integer is serialized as a decimal string. Since the `transfer` method takes a number in [yocto](https://en.wikipedia.org/wiki/Yocto-)NEAR, it's likely to need numbers much larger than `2^53-1`.
When a function takes `U128` as input, it means that callers need to specify the number as a string. near-sdk-rs will then deserialize it into the `U128` type, which wraps Rust's native [`u128`](https://doc.rust-lang.org/std/primitive.u128.html). The underlying `u128` can be retrieved with `.0`, as used in `transfer(amount.0)` (see the short sketch after this list).
* `AccountId`: this will automatically check that the provided string is a well-formed NEAR account ID, and panic with a useful error if not.
* Returning `Promise`: This allows NEAR Explorer, near-cli, near-api-js, and other tooling to correctly determine whether a whole chain of transactions is successful. If your function does not return `Promise`, tools like near-cli will return immediately after your function call, and then even if the `transfer` fails, your function call will be considered successful. You can see an example of this behavior [here](/tutorials/examples/advanced-xcc).
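To make the `U128` and `AccountId` points above concrete, here's a small, hypothetical fragment in the same loose style as the first snippet on this page. It assumes near-sdk's re-export of `serde_json`, and the malformed account string is made up for illustration:

```rust
use near_sdk::json_types::U128;
use near_sdk::serde_json; // near-sdk's re-export of serde_json
use near_sdk::AccountId;

// `U128` round-trips through JSON as a decimal string, so amounts far larger
// than 2^53-1 survive JS clients; `.0` unwraps the native u128.
let amount: U128 = serde_json::from_str("\"1000000000000000000000000\"").unwrap();
assert_eq!(amount.0, 1_000_000_000_000_000_000_000_000u128);

// Parsing an `AccountId` validates the string; malformed IDs return an error
// (and calling `.unwrap()` on that inside a contract would panic).
let _ok: AccountId = "example.near".parse().unwrap();
assert!("Not--Valid!".parse::<AccountId>().is_err());
```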
Using near-cli or near-cli-rs, someone could invoke this function with a call like:
<Tabs className="language-tabs" groupId="code-tabs">
<TabItem value="near-cli">
```bash
near call <contract> pay '{"amount": "1000000000000000000000000", "to": "example.near"}' --accountId benjiman.near
```
</TabItem>
<TabItem value="near-cli-rs">
```bash
near contract call-function as-transaction <contract> pay json-args '{"amount": "1000000000000000000000000", "to": "example.near"}' prepaid-gas '30 TeraGas' attached-deposit '0 NEAR' sign-as benjiman.near network-config testnet sign-with-keychain send
```
</TabItem>
</Tabs> |