diff --git a/docs/.gitbook/assets/Edge_Node_cover (1).png b/docs/.gitbook/assets/Edge_Node_cover (1).png
deleted file mode 100644
index b2457b17..00000000
Binary files a/docs/.gitbook/assets/Edge_Node_cover (1).png and /dev/null differ
diff --git a/docs/.gitbook/assets/Screen Shot 2023-02-22 at 14.51.44 (1).png b/docs/.gitbook/assets/Screen Shot 2023-02-22 at 14.51.44 (1).png
deleted file mode 100644
index a94dc349..00000000
Binary files a/docs/.gitbook/assets/Screen Shot 2023-02-22 at 14.51.44 (1).png and /dev/null differ
diff --git a/docs/.gitbook/assets/Screen Shot 2023-02-22 at 14.51.44 (2).png b/docs/.gitbook/assets/Screen Shot 2023-02-22 at 14.51.44 (2).png
deleted file mode 100644
index a94dc349..00000000
Binary files a/docs/.gitbook/assets/Screen Shot 2023-02-22 at 14.51.44 (2).png and /dev/null differ
diff --git a/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.32.41 (1).png b/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.32.41 (1).png
deleted file mode 100644
index c681da5d..00000000
Binary files a/docs/.gitbook/assets/Screen Shot 2023-12-15 at 12.32.41 (1).png and /dev/null differ
diff --git a/docs/.gitbook/assets/Screen Shot 2023-12-18 at 14.10.25 (1).png b/docs/.gitbook/assets/Screen Shot 2023-12-18 at 14.10.25 (1).png
deleted file mode 100644
index f1de1b28..00000000
Binary files a/docs/.gitbook/assets/Screen Shot 2023-12-18 at 14.10.25 (1).png and /dev/null differ
diff --git a/docs/.gitbook/assets/Three sequential bounty rounds from pre-verification integrations through verified memory and context oracles to agent-ready analytics and user support. b/docs/.gitbook/assets/Three sequential bounty rounds from pre-verification integrations through verified memory and context oracles to agent-ready analytics and user support.
new file mode 100644
index 00000000..f640b4d4
Binary files /dev/null and b/docs/.gitbook/assets/Three sequential bounty rounds from pre-verification integrations through verified memory and context oracles to agent-ready analytics and user support. differ
diff --git a/docs/.gitbook/assets/V8 Timeline (1).png b/docs/.gitbook/assets/V8 Timeline (1).png
deleted file mode 100644
index 7b4af39e..00000000
Binary files a/docs/.gitbook/assets/V8 Timeline (1).png and /dev/null differ
diff --git a/docs/.gitbook/assets/aa (1).png b/docs/.gitbook/assets/aa (1).png
deleted file mode 100644
index c7922f5c..00000000
Binary files a/docs/.gitbook/assets/aa (1).png and /dev/null differ
diff --git a/docs/.gitbook/assets/dkg-memory-hr.png b/docs/.gitbook/assets/dkg-memory-hr.png
new file mode 100644
index 00000000..873e852d
Binary files /dev/null and b/docs/.gitbook/assets/dkg-memory-hr.png differ
diff --git a/docs/.gitbook/assets/dkg_v10_bounty_program_high_res_white_bg.png b/docs/.gitbook/assets/dkg_v10_bounty_program_high_res_white_bg.png
new file mode 100644
index 00000000..7c41a77e
Binary files /dev/null and b/docs/.gitbook/assets/dkg_v10_bounty_program_high_res_white_bg.png differ
diff --git a/docs/.gitbook/assets/image (16).png b/docs/.gitbook/assets/image (16).png
deleted file mode 100644
index 6ae4c41d..00000000
Binary files a/docs/.gitbook/assets/image (16).png and /dev/null differ
diff --git a/docs/.gitbook/assets/image (21).png b/docs/.gitbook/assets/image (21).png
deleted file mode 100644
index cdfd5c91..00000000
Binary files a/docs/.gitbook/assets/image (21).png and /dev/null differ
diff --git a/docs/.gitbook/assets/image (8).png b/docs/.gitbook/assets/image (8).png
deleted file mode 100644
index 6bb22810..00000000
Binary files a/docs/.gitbook/assets/image (8).png and /dev/null differ
diff --git a/docs/README.md b/docs/README.md
index f4858a17..b06d2dd3 100644
--- a/docs/README.md
+++ b/docs/README.md
@@ -2,7 +2,7 @@
OriginTrail is an ecosystem building **collective, trusted memory** for AI. The core ecosystem technology is the **Decentralized Knowledge Graph (DKG)**, a decentralized, permissionless network of nodes, through which both humans and machines can share knowledge, reason together, and preserve context across time.
-Modern AI is powerful but ungrounded. It predicts without knowing, hallucinates, forgets what it said, and relies on data controlled by a few centralized platforms. LLMs in particular have an "explainability" problem - why did an LLM respond a certain way, based on what knowledge, coming from which source?
+Modern AI is powerful but ungrounded. It predicts without knowing, hallucinates, forgets what it said, and relies on data controlled by a few centralized platforms. LLMs in particular have an "explainability" problem - why did an LLM respond a certain way, based on what knowledge, coming from which source?
The **Decentralized Knowledge Graph (DKG)** hosts **Knowledge Assets** that encode facts, data provenance, and meaning in a tamper-proof way. The network is hosted by a set of independent **DKG Nodes**. Anyone can run a **DKG network Node** - organizations and individuals - contributing to the DKG and at the same time building upon their knowledge in a **privacy-preserving** way. Thus the DKG ensures that no single entity can rewrite, censor, or monopolize the collective memory. Decentralization keeps AI accountable, bias-resistant, and aligned with human diversity.
@@ -13,21 +13,22 @@ The DKG grows through human participation. Researchers, developers, and citizens
**Core operations**
* **Publishing knowledge:** Turning data into structured, verifiable Knowledge Assets
-* **Knowledge Discovery:** Querying, traversing, and monetizing knowledge in the decentralized graph and its _paranets_
+* **Knowledge discovery:** Querying, traversing, and monetizing knowledge in the decentralized graph and its _paranets_
* **Trusted sharing:** Cryptographically verify authenticity and provenance of knowledge
-* **Neuro-symbolic Reasoning**: Infer new facts based on rules, leveraging graph-based reasoning in combination with LLMs and GenAI models
+* **Neuro-symbolic reasoning**: Infer new facts based on rules, leveraging graph-based reasoning in combination with LLMs and GenAI models
-We encourage developers to [try out the DKG Node](getting-started/decentralized-knowle-dge-graph-dkg.md) and build their first DKG based agent with it, to get a feel of what the technology can do.
+We encourage developers to [try out the DKG Node](getting-started/decentralized-knowledge-graph-dkg.md) and build their first DKG-based agent with it to get a feel for what the technology can do.
### Three ways to get started
-
Begin your journey on the Decentralized Knowledge Graph by staking TRAC and setting up your first DKG Node. This is your entry point into the verifiable knowledge economy
Begin your journey on the Decentralized Knowledge Graph by setting up your first DKG Node and Agent. This is your entry point into the verifiable knowledge economy.
+
+
{% hint style="success" %}
-This site is a constantly updated, work-in-progress official OriginTrail documentation built by the OriginTrail community and core developers.
+This site is a constantly updated, work-in-progress official OriginTrail documentation built by the OriginTrail community and core developers.
-Find the "Edit on GitHub" button in the upper-right corner and submit your updates as pull requests (or do it directly in the [docs GitHub repo).](https://github.com/OriginTrail/dkg-docs)
+Find the "Edit on GitHub" button in the upper-right corner and submit your updates as pull requests (or do it directly in the [docs GitHub repo](https://github.com/OriginTrail/dkg-docs)).
We appreciate any feedback, improvement ideas, and comments.
{% endhint %}
-
diff --git a/docs/SUMMARY.md b/docs/SUMMARY.md
index a7ee9824..8cf390f6 100644
--- a/docs/SUMMARY.md
+++ b/docs/SUMMARY.md
@@ -3,9 +3,16 @@
* [Introduction](README.md)
* [DKG — Key concepts](dkg-key-concepts.md)
+## ORIGINTRAIL V9/V10
+
+* [Roadmap](origintrail-v9-v10/roadmap.md)
+* [V10 Mainnet Release Timeline](origintrail-v9-v10/v10-mainnet-release-timeline.md)
+* [OriginTrail DKG v10 Bounty Program](origintrail-v9-v10/origintrail-dkg-v10-bounty-program.md)
+* [OriginTrail Decentralized Knowledge Graph DKG V10 - Terms and Conditions](origintrail-v9-v10/origintrail-decentralized-knowledge-graph-dkg-v10-terms-and-conditions.md)
+
## Getting Started
-* [Installation](getting-started/decentralized-knowle-dge-graph-dkg.md)
+* [Installation — Edge Node](getting-started/decentralized-knowledge-graph-dkg.md)
* [Interacting with your DKG Agent](getting-started/interacting-with-your-dkg-agent.md)
* [DKG Node Services](getting-started/dkg-node-services.md)
* [Basic Knowledge Asset operations](getting-started/basic-knowledge-asset-operations.md)
@@ -16,6 +23,7 @@
* [Architecture](build-a-dkg-node-ai-agent/architecture.md)
* [Essentials Plugin](build-a-dkg-node-ai-agent/essentials-plugin.md)
+* [Using the DKG client](build-a-dkg-node-ai-agent/using-the-dkg-client.md)
* [Customizing your DKG agent](build-a-dkg-node-ai-agent/customizing-your-dkg-agent.md)
* [Evaluating agent responses](build-a-dkg-node-ai-agent/evaluating-agent-responses.md)
* [Set up your custom DKG Node fork & update flow](build-a-dkg-node-ai-agent/set-up-your-custom-dkg-node-fork-and-update-flow.md)
@@ -41,7 +49,7 @@
* [IPO voting](build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/initial-paranet-offerings-ipos/ipo-voting.md)
* [Contributing a plugin](build-a-dkg-node-ai-agent/contributing-a-plugin.md)
-## Contribute to the DKG
+## Contribute to the DKG
* [Hackathon: Scaling Trust in the Age of AI](contribute-to-the-dkg/hackathon-scaling-trust-in-the-age-of-ai/README.md)
* [DKG Social Graph Query Guide](contribute-to-the-dkg/hackathon-scaling-trust-in-the-age-of-ai/dkg-social-graph-query-guide.md)
@@ -68,7 +76,7 @@
* [Learn more](dkg-knowledge-hub/learn-more/README.md)
* [Understanding OriginTrail](dkg-knowledge-hub/learn-more/readme/README.md)
- * [OriginTrail Decentralized Knowledge Graph (DKG)](dkg-knowledge-hub/learn-more/readme/decentralized-knowle-dge-graph-dkg.md)
+ * [OriginTrail Decentralized Knowledge Graph (DKG)](dkg-knowledge-hub/learn-more/readme/decentralized-knowledge-graph-dkg.md)
* [Development principles](dkg-knowledge-hub/learn-more/readme/development-principles.md)
* [Linked data & knowledge graphs](dkg-knowledge-hub/learn-more/readme/kg.md)
* [Core DKG concepts](dkg-knowledge-hub/learn-more/readme/dkg-key-concepts.md)
@@ -83,7 +91,7 @@
* [Random Sampling & proofs explained](dkg-knowledge-hub/learn-more/introduction/random-sampling-dkg-proof-system/README.md)
* [Random Sampling rollout](dkg-knowledge-hub/learn-more/introduction/random-sampling-dkg-proof-system/random-sampling-rollout.md)
* [Random Sampling FAQ](dkg-knowledge-hub/learn-more/introduction/random-sampling-dkg-proof-system/random-sampling-faq.md)
- * [Rules & token thresholds](dkg-knowledge-hub/learn-more/introduction/rules-and-token-thresholds.md)
+ * [Edge Vs. Core Node — Rules & token thresholds](dkg-knowledge-hub/learn-more/introduction/edge-vs.-core-node-rules-and-token-thresholds.md)
* [Connected blockchains](dkg-knowledge-hub/learn-more/connected-blockchains/README.md)
* [NeuroWeb Parachain](dkg-knowledge-hub/learn-more/connected-blockchains/neuroweb.md)
* [Base Network (L2)](dkg-knowledge-hub/learn-more/connected-blockchains/base-blockchain/README.md)
@@ -98,6 +106,7 @@
* [Protocol updates](dkg-knowledge-hub/learn-more/previous-updates/dkg-v8.0-update-guidebook/protocol-updates.md)
* [Feature roadmap](dkg-knowledge-hub/learn-more/previous-updates/dkg-v8.0-update-guidebook/feature-roadmap.md)
* [How to upgrade to V8?](dkg-knowledge-hub/learn-more/previous-updates/dkg-v8.0-update-guidebook/how-to-upgrade-to-v8.md)
+ * [Staking cap & outstanding network rewards release](dkg-knowledge-hub/learn-more/previous-updates/staking-threshold-update-and-outstanding-network-rewards-release.md)
* [What is a DKG Node?](dkg-knowledge-hub/learn-more/decentralized-knowle-dge-graph-dkg.md)
* [How-tos & tutorials](dkg-knowledge-hub/how-tos-and-tutorials/README.md)
* [Fund your Web3 wallets](dkg-knowledge-hub/how-tos-and-tutorials/fund-your-web3-wallets.md)
diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/building-with-dkg-paranets.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/building-with-dkg-paranets.md
index adae5cec..49c34c65 100644
--- a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/building-with-dkg-paranets.md
+++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/building-with-dkg-paranets.md
@@ -6,21 +6,21 @@ Paranets are like "virtual" knowledge graphs on the OriginTrail Decentralized Kn
\***A** **knowledge collection (KC)** is a **collection of Knowledge Assets.** It refers to structured data that can be stored, shared, and validated within a distributed network.
{% endhint %}
-To gain access to the paranet knowledge graph, you can deploy a [DKG node](../../../getting-started/decentralized-knowle-dge-graph-dkg.md) and set it up to host the paranet (or "sync" it). More information is available on the [Sync a DKG Paranet](syncing-a-dkg-paranet.md) page.
+To gain access to the paranet knowledge graph, you can deploy a [DKG node](../../../getting-started/decentralized-knowledge-graph-dkg.md) and set it up to host the paranet (or "sync" it). More information is available on the [Sync a DKG Paranet](syncing-a-dkg-paranet.md) page.
**A direct code example of paranets in use can be found here —** [**Paranet Demo**](https://github.com/OriginTrail/dkg.js/blob/v8/develop/examples/paranet-demo.js)
### Querying paranets
-Once you have access to the paranet knowledge graph via a gateway node, you can use one of the [DKG SDKs](../dkg-sdk/) to interact with it. It is also possible to open your triple store SPARQL endpoint directly and query the paranet knowledge graph in its own repository (the paranet repository name is equivalent to the paranet profile Knowledge Asset UAL, with dash characters instead of slashes).
+Once you have access to the paranet knowledge graph via a gateway node, you can use one of the [DKG SDKs](../dkg-sdk/) to interact with it. It is also possible to open your triple store SPARQL endpoint directly and query the paranet knowledge graph in its own repository (the paranet repository name is equivalent to the paranet profile Knowledge Asset UAL, with dash characters instead of slashes).
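The repository-naming convention described above can be sketched with a small helper (hypothetical function name; assumes a plain slash-to-dash substitution and an illustrative, made-up UAL):

```javascript
// Hypothetical helper: derive a paranet repository name from the
// paranet profile Knowledge Asset UAL by replacing slashes with dashes.
function paranetRepositoryName(ual) {
  return ual.replace(/\//g, '-');
}

// Illustrative (made-up) UAL:
console.log(paranetRepositoryName('did:dkg:otp:2043/0x5cac4123/309100'));
// → did:dkg:otp:2043-0x5cac4123-309100
```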
-Using SPARQL, it is possible to query and integrate knowledge from multiple paranets and the entire DKG in a single query using SPARQL federated queries.
+Using SPARQL, it is possible to query and integrate knowledge from multiple paranets and the entire DKG in a single query using SPARQL federated queries.
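As a minimal sketch of what such a federated query could look like (the `SERVICE` endpoint URL below is a placeholder for a remote SPARQL endpoint, not a real paranet address):

```javascript
// SPARQL 1.1 federated query combining local graph data with a remote
// endpoint via the SERVICE keyword (endpoint URL is illustrative only).
const federatedQuery = `
  PREFIX schema: <http://schema.org/>
  SELECT ?city ?population WHERE {
    ?city a schema:City .
    SERVICE <https://example.org/other-paranet/sparql> {
      ?city schema:population ?population .
    }
  }`;

console.log(federatedQuery.includes('SERVICE')); // → true
```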
### Running paranet services
Paranets enable registering and exposing both on-chain and off-chain services associated with it. A paranet service can be identified by all users of the paranet via its registry Knowledge Asset and can have multiple on-chain accounts associated with it, enabling them to engage in economic activity within the DKG. Examples of paranet services are AI agents (e.g., autonomous reasoners mining knowledge collections), chatbots (e.g., [Polkabot](https://polkabot.ai/)), oracle feeds, LLMs, dRAG APIs, etc.
-Paranet operators manage the services through the Paranet Services Registry smart contracts or DKG SDK.
+Paranet operators manage the services through the Paranet Services Registry smart contracts or DKG SDK.
### Paranet permissions
@@ -35,6 +35,3 @@ There are three permission policies for paranet:
* Knowledge Asset submission access policy:
* OPEN—Any Knowledge Asset can be added to the _paranet._
* STAGING—Knowledge miners first submit the Knowledge Asset to staging, where it is reviewed by curators chosen by the paranet owner. The curators can _approve_ (and automatically add a Knowledge Asset to the paranet) or _deny the_ staged Knowledge Asset (which then doesn't get added to the paranet).
-
-
-
diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/syncing-a-dkg-paranet.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/syncing-a-dkg-paranet.md
index 8eefd09f..72b6e8f1 100644
--- a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/syncing-a-dkg-paranet.md
+++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-paranets/syncing-a-dkg-paranet.md
@@ -1,10 +1,10 @@
# Sync a paranet
-To interact with specific DKG paranet's knowledge graphs using your OriginTrail node, you need to configure your node to synchronize the paranet's knowledge collections. This setup can be achieved by modifying your node's configuration file to include the paranet UAL.
+To interact with a specific DKG paranet's knowledge graph using your OriginTrail node, you need to configure your node to synchronize the paranet's knowledge collections. You can do this by adding the paranet UAL to your node's configuration file.
-If you have not yet set up your node or need guidance on configuring a DKG Node, please refer to the [Installation guide](../../../getting-started/decentralized-knowle-dge-graph-dkg.md).
+If you have not yet set up your node or need guidance on configuring a DKG Node, please refer to the [Installation guide](../../../getting-started/decentralized-knowledge-graph-dkg.md).
-To enable your node to sync with a paranet, you will need to add `assetSync` object to your node’s `.origintrail_noderc` file. Below is an example of how to configure this (make sure to replace the UAL in the example below):
+To enable your node to sync with a paranet, you will need to add the `assetSync` object to your node’s `.origintrail_noderc` file. Below is an example of how to configure this (make sure to replace the UAL in the example below):
```json
"assetSync": {
@@ -42,4 +42,3 @@ Paranet sync: KA count from contract and in DB is the same, nothing new to sync,
```
Interacting with the paranet knowledge graph through your node is explained on [this](building-with-dkg-paranets.md) page.
-
diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/README.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/README.md
index 7cea669f..9afa300a 100644
--- a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/README.md
+++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/README.md
@@ -10,18 +10,14 @@ OriginTrail dev tutorial: SDK walkthrough
The OriginTrail SDKs are client libraries for your applications that enable your applications to interact with the OriginTrail Decentralized Knowledge Graph (DKG).
-From an architectural standpoint, the SDK libraries are application interfaces into the DKG. They enable you to create and manage **Knowledge Assets** through your apps and perform network queries (such as search or SPARQL queries), as illustrated below.
+From an architectural standpoint, the SDK libraries are application interfaces into the DKG. They enable you to create and manage **Knowledge Assets** through your apps and perform network queries (such as search or SPARQL queries), as illustrated below.
The interplay between your app, DKG and blockchains
-
-
The OriginTrail SDK currently comes in two forms:
-* Javascript SDK - [**dkg.js**](dkg-v8-js-client/)
-* Python SDK - [**dkg.py**](dkg-v8-py-client/)**.**
-
-
+* Javascript SDK - [**dkg.js**](dkg-v8-js-client/)
+* Python SDK - [**dkg.py**](dkg-v8-py-client/)
### Try out the SDK
@@ -37,16 +33,14 @@ Set up a development environment using one of the following options:
* **Deploy your node on the DKG testnet (recommended):**\
This option allows you to quickly experiment with the SDK on a testnet of your choice.\
- Follow the [Installation guide](../../../getting-started/decentralized-knowle-dge-graph-dkg.md) for setup instructions.
+ Follow the [Installation guide](../../../getting-started/decentralized-knowledge-graph-dkg.md) for setup instructions.
* **Deploy your node on a local DKG network:**\
Use this option to set up a fully localized development environment by following the [Development environment setup guide](setting-up-your-development-environment.md).
-
-
SDKs for other programming languages would be welcome contributions to the project. The core development team is also considering including them in the roadmap.
{% hint style="info" %}
-Interested in building a DKG SDK in a particular programming language? We'd love to support you.
+Interested in building a DKG SDK in a particular programming language? We'd love to support you.
Create an [issue](https://github.com/OriginTrail/ot-node/issues) on our GitHub, and let's get the conversation started!
{% endhint %}
diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/README.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/README.md
index 08f2fdba..e48f7e5f 100644
--- a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/README.md
+++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/README.md
@@ -6,7 +6,7 @@ description: Javascript library for the Decentralized Knowledge Graph.
If you are looking to build applications leveraging [Knowledge Assets](./#create-a-knowledge-asset) on the OriginTrail Decentralized Knowledge Graph (DKG), the dkg.js SDK library is the best place to start!
-The DKG SDK is used together with an **OriginTrail gateway node** to build applications that interface with the OriginTrail DKG (the node is a dependency). Therefore, to use the SDK, you either need to run a gateway node on [your local environment](../setting-up-your-development-environment.md) or a [hosted DKG Node](../../../../getting-started/decentralized-knowle-dge-graph-dkg.md).
+The DKG SDK is used together with an **OriginTrail gateway node** to build applications that interface with the OriginTrail DKG (the node is a dependency). Therefore, to use the SDK, you either need to run a gateway node on [your local environment](../setting-up-your-development-environment.md) or a [hosted DKG Node](../../../../getting-started/decentralized-knowledge-graph-dkg.md).
## Prerequisites
@@ -61,7 +61,7 @@ const DKG = require('dkg.js');
OriginTrail dev tutorial: SDK walkthrough
{% endembed %}
-To use the DKG library, you need to connect to a running local or remote OT-node.
+To use the DKG library, you need to connect to a running local or remote OT-node.
```javascript
const dkg = new DKG({
@@ -109,9 +109,9 @@ The system uses default publicly available RPCs for each chain. However, because
## Create a Knowledge Asset
-In this example, let’s create an example Knowledge Asset representing a city. The content contains both public and private assertions. Public assertions will be exposed publicly (replicated to other nodes), while private ones won't (stay on the node you published to only).
+In this example, let’s create a Knowledge Asset representing a city. The content contains both public and private assertions. Public assertions are exposed publicly (replicated to other nodes), while private assertions stay only on the node you published to.
-If you have access to the particular node that has the data, when you search for it using get or query, you will see both public and private assertions.
+If you have access to the particular node that has the data, when you search for it using get or query, you will see both public and private assertions.
```javascript
const content = {
@@ -346,7 +346,7 @@ The response of the get operation will be the assertion graph:
## Querying Knowledge Asset data with SPARQL
-Querying the DKG is done by using the SPARQL query language, which is very similar to SQL applied to graph data.
+Querying the DKG is done by using the SPARQL query language, which is very similar to SQL applied to graph data.
_(If you have SQL experience, SPARQL should be relatively easy to get started with. More information_[ _can be found here_](https://www.w3.org/TR/rdf-sparql-query/)_)._
@@ -378,7 +378,7 @@ The returned response will contain an array of n-quads:
}
-As the OriginTrail node leverages a fully fledged graph database (a triple store supporting RDF), you can run arbitrary SPARQL queries on it.
+As the OriginTrail node leverages a fully fledged graph database (a triple store supporting RDF), you can run arbitrary SPARQL queries on it.
To learn more about querying the DKG, go [here](../../querying-the-dkg.md).
@@ -390,7 +390,7 @@ We can divide operations done by SDK into 3 types:
* Smart contract call (non-state-changing interaction)
* Smart contract transaction (state-changing interaction)
-Non-state-changing interactions with smart contracts are free and can be described as contract-getters. They don’t require transactions on the blockchain. This means they do not incur transaction fees.
+Non-state-changing interactions with smart contracts are free and can be described as contract-getters. They don’t require transactions on the blockchain. This means they do not incur transaction fees.
Smart contract transactions are state-changing operations. This means they change the state of the smart contract memory, which requires some blockchain-native gas tokens (such as ETH, NEURO, etc.).
diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-py-client/README.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-py-client/README.md
index dc819f4f..81ced4d2 100644
--- a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-py-client/README.md
+++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/dkg-v8-py-client/README.md
@@ -6,7 +6,7 @@ description: Python library for interacting with the DKG
If you are looking to build applications leveraging [Knowledge Assets](./#create-a-knowledge-collection) on the OriginTrail Decentralized Knowledge Graph (DKG), the dkg.py library is the best place to start!
-The DKG SDK is used together with an **OriginTrail gateway node** to build applications that interface with the OriginTrail DKG (the node is a dependency). Therefore, you either need to run a gateway node on [your local environment](../setting-up-your-development-environment.md) or a [hosted DKG Node](../../../../getting-started/decentralized-knowle-dge-graph-dkg.md), in order to use the SDK.
+The DKG SDK is used together with an **OriginTrail gateway node** to build applications that interface with the OriginTrail DKG (the node is a dependency). Therefore, to use the SDK, you either need to run a gateway node on [your local environment](../setting-up-your-development-environment.md) or a [hosted DKG Node](../../../../getting-started/decentralized-knowledge-graph-dkg.md).
## Prerequisites
@@ -37,7 +37,7 @@ poetry add dkg==8.0.1
## :snowboarder: Quickstart
-This package includes both synchronous and asynchronous versions of the DKG client.
+This package includes both synchronous and asynchronous versions of the DKG client.
The synchronous client is designed for applications that accept blocking calls. It operates sequentially, making it simpler to integrate into existing codebases that do not use asynchronous programming.
@@ -45,7 +45,7 @@ The asynchronous client is built for non-blocking operations, making it ideal fo
### Synchronous DKG client
-To use the Synchronous DKG library, you need to connect to a running local or remote OT-node.
+To use the Synchronous DKG library, you need to connect to a running local or remote OT-node.
from dkg import DKG
from dkg.providers import BlockchainProvider, NodeHTTPProvider
@@ -127,7 +127,7 @@ The system supports multiple blockchain networks, which can be configured using
## Create a Knowledge Collection
-In this example, let’s create an example Knowledge Collection representing a city. The content contains both public and private assertions. Public assertions will be exposed publicly (replicated to other nodes), while private ones won't (stay on the node you published to only). If you have access to the particular node that has the data, when you search for it using get or query, you will see both public and private assertions.
+In this example, let’s create a Knowledge Collection representing a city. The content contains both public and private assertions. Public assertions are exposed publicly (replicated to other nodes), while private assertions stay only on the node you published to. If you have access to the particular node that has the data, when you search for it using get or query, you will see both public and private assertions.
```python
content = {
@@ -312,7 +312,7 @@ The response of the get operation will be the assertion graph:
## Querying Knowledge Asset data with SPARQL
-Querying the DKG is done by using the SPARQL query language, which is very similar to SQL applied to graph data.
+Querying the DKG is done by using the SPARQL query language, which is very similar to SQL applied to graph data.
_(If you have SQL experience, SPARQL should be relatively easy to get started with. More information_[ _can be found here_](https://www.w3.org/TR/rdf-sparql-query/)_)._
@@ -346,7 +346,7 @@ The returned response will contain an array of n-quads:
}
```
-As the OriginTrail node leverages a fully fledged graph database (a triple store supporting RDF), you can run arbitrary SPARQL queries on it.
+As the OriginTrail node leverages a fully fledged graph database (a triple store supporting RDF), you can run arbitrary SPARQL queries on it.
To learn more about querying the DKG, go [here](../../querying-the-dkg.md).
@@ -358,7 +358,7 @@ We can divide operations done by SDK into 3 types:
* Smart contract call (non-state-changing interaction)
* Smart contract transaction (state-changing interaction)
-Non-state-changing interactions with smart contracts are free and can be described as contract-getters. They don’t require transactions on the blockchain. This means they do not incur transaction fees.
+Non-state-changing interactions with smart contracts are free and can be described as contract-getters. They don’t require transactions on the blockchain. This means they do not incur transaction fees.
Smart contract transactions are state-changing operations. This means they change the state of the smart contract memory, which requires some blockchain-native gas tokens (such as ETH, NEURO, etc.).
diff --git a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/setting-up-your-development-environment.md b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/setting-up-your-development-environment.md
index 51523a5b..67b28f75 100644
--- a/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/setting-up-your-development-environment.md
+++ b/docs/build-a-dkg-node-ai-agent/advanced-features-and-toolkits/dkg-sdk/setting-up-your-development-environment.md
@@ -6,7 +6,7 @@ description: How to setup a local and shared development environment
## Running node engines on the DKG testnet (recommended)
-We recommend following the [Installation guide](../../../getting-started/decentralized-knowle-dge-graph-dkg.md) for testnet setup instructions.
+We recommend following the [Installation guide](../../../getting-started/decentralized-knowledge-graph-dkg.md) for testnet setup instructions.
## Running a local DKG network
@@ -50,7 +50,7 @@ Then, install the required dependencies by running:
npm install
```
-Next, create a file called `.env` and add the following lines:
+Next, create a file called `.env` and add the following lines:
```sh
NODE_ENV=development
@@ -61,7 +61,7 @@ RPC_ENDPOINT_BC2=http://127.0.0.1:9545
To start the local DKG network, run the **local network setup** script to install multiple node engines in the local environment. To ensure stability of operation, it is recommended to run at least 5 node engines (1 bootstrap and 4 subsequent node engines).
{% hint style="warning" %}
-The scripts below only work for macOS and Linux (or Windows WSL).
+The scripts below only work for macOS and Linux (or Windows WSL).
If you need help with the setup, contact the core development team on [Discord](https://discord.com/invite/FCgYk2S).
{% endhint %}
@@ -79,14 +79,12 @@ To start the local DKG network on **Linux**, run the following command:
./tools/local-network-setup/setup-linux-environment.sh --nodes=5
```
-
-
{% hint style="info" %}
-### Contributing
+#### Contributing
These setup instructions are a work in progress and are subject to change. The core development team expects to introduce improvements in setting up the DKG node engine in the local environment in the future.
-As DKG Node is open source, we **happily invite you to contribute to building the Decentralized Knowledge Graph.** We're excited about your contributions!
+As DKG Node is open source, we **happily invite you to contribute to building the Decentralized Knowledge Graph.** We're excited about your contributions!
Please visit the [GitHub](https://github.com/OriginTrail/ot-node) repo for more info.
{% endhint %}
diff --git a/docs/build-a-dkg-node-ai-agent/contributing-a-plugin.md b/docs/build-a-dkg-node-ai-agent/contributing-a-plugin.md
index 25ed2795..48578420 100644
--- a/docs/build-a-dkg-node-ai-agent/contributing-a-plugin.md
+++ b/docs/build-a-dkg-node-ai-agent/contributing-a-plugin.md
@@ -150,7 +150,7 @@ Learn more in Turborepo docs.
### Further resources
-👥 OriginTrail Discord server
+👥 OriginTrail [Discord](https://discord.com/invite/xCaY7hvNwD) server
📖 **Expo framework:**
@@ -165,4 +165,3 @@ Learn more in Turborepo docs.
* [Filtering](https://turborepo.com/docs/crafting-your-repository/running-tasks#using-filters)
* [Configuration Options](https://turborepo.com/docs/reference/configuration)
* [CLI Usage](https://turborepo.com/docs/reference/command-line-reference)
-
diff --git a/docs/build-a-dkg-node-ai-agent/essentials-plugin.md b/docs/build-a-dkg-node-ai-agent/essentials-plugin.md
index 6c508179..8fad8a0d 100644
--- a/docs/build-a-dkg-node-ai-agent/essentials-plugin.md
+++ b/docs/build-a-dkg-node-ai-agent/essentials-plugin.md
@@ -16,6 +16,7 @@ The **DKG Node Essentials Plugin** ships preinstalled with every DKG Node. It pr
* **DKG Knowledge Asset create tool** - basic tool to publish Knowledge Assets from a JSON-LD object with `public` or `private` visibility
* **DKG Knowledge Asset get** tool - retrieve a Knowledge Asset by its **UAL**.
+* **DKG SPARQL query tool** - execute SPARQL SELECT and CONSTRUCT queries on the DKG to search and retrieve knowledge.
Publishing Knowledge Assets with "public" visibility will replicate their content across the entire DKG - making them **publicly visible**. When creating private Knowledge Assets, their content never leaves your node - only knowledge asset registration material (such as the cryptographic hash and UALs) will be published publicly.
@@ -72,7 +73,7 @@ DKG Explorer link: https://dkg-testnet.origintrail.io/explore?ual=did:dkg:otp:20
***
-#### 2) DKG Knowledge Asset **get**
+#### 2) DKG Knowledge Asset get
**Purpose**\
Fetch a **KA or KC** by **UAL**.
@@ -126,9 +127,75 @@ did:dkg:otp:20430/0xABCDEF0123456789/12345/67890
}
```
+#### 3) **DKG SPARQL query tool**
+
+**Purpose**\
+Execute SPARQL queries on the DKG to search, filter, and retrieve knowledge. Supports both SELECT queries (tabular results) and CONSTRUCT queries (graph/N-triples output).
+
+**Inputs**
+
+* `query` _(string, required)_ — a valid SPARQL SELECT or CONSTRUCT query.
+
+**Returns**
+
+All tools return an **MCP-formatted** payload:
+
+* `content` _(array)_ — one item containing:
+ * For **SELECT** queries: JSON-formatted bindings with the query results.
+ * For **CONSTRUCT** queries: N-triples formatted graph data.
+
+**Example input (SELECT query)**
+
+```
+SELECT ?name ?description
+WHERE {
+  ?s <http://schema.org/name> ?name .
+  ?s <http://schema.org/description> ?description .
+}
+LIMIT 10
+```
+
+**Typical response (SELECT)**
+
+```json
+{
+ "data": [
+ {
+ "name": "\"Hello DKG\"",
+ "description": "\"My first Knowledge Asset on the Decentralized Knowledge Graph!\""
+ },
+ {
+ "name": "\"DKG Example KA\"",
+ "description": "\"The best KA example on the DKG\""
+ }
+ ]
+}
+```
+
+**Example input (CONSTRUCT query)**
+
+```
+CONSTRUCT { ?s <http://schema.org/name> ?name }
+WHERE { ?s <http://schema.org/name> ?name }
+LIMIT 10
+```
+
+**Typical response (CONSTRUCT)**
+
+```
+<https://example.org/person1> <http://schema.org/name> "Jane Doe" .
+<https://example.org/person2> <http://schema.org/name> "John Smith" .
+<https://example.org/person3> <http://schema.org/name> "Alice Wonder" .
+```
+
+**Notes**
+
+* Only **SELECT** and **CONSTRUCT** query types are supported. UPDATE operations (INSERT, DELETE, MODIFY) are not allowed.
+* Invalid SPARQL syntax will return a validation error before execution.
+* Results are automatically formatted based on query type for optimal readability.
+
### Coming soon (preview)
-* **DKG query & retrieve** - generate/execute Schema.org-based **SPARQL** queries on the DKG.
* **Document → JSON/Markdown** - convert PDFs/Word/TXT/… into JSON/Markdown for downstream processing.
* **JSON/Markdown → JSON-LD** - transform structured text into a **schema.org** knowledge graph ready for publishing.
diff --git a/docs/build-a-dkg-node-ai-agent/set-up-your-custom-dkg-node-fork-and-update-flow.md b/docs/build-a-dkg-node-ai-agent/set-up-your-custom-dkg-node-fork-and-update-flow.md
index f3cff38b..2da3f4fd 100644
--- a/docs/build-a-dkg-node-ai-agent/set-up-your-custom-dkg-node-fork-and-update-flow.md
+++ b/docs/build-a-dkg-node-ai-agent/set-up-your-custom-dkg-node-fork-and-update-flow.md
@@ -75,13 +75,13 @@ Your custom DKG Node repository is now set up with:
## Configure and start your custom DKG Node project
-Once this setup process is complete, you are ready to configure and run your custom DKG Node using the `dkg-cli`. The `dkg-cli` provides automated installation, configuration management, and service control for your DKG Node. Detailed instructions on how to use `dkg-cli` to configure your node, and manage its services are available in the [**Installation**](../getting-started/decentralized-knowle-dge-graph-dkg.md#id-1-install-cli) page under "Getting started" section.
+Once this setup process is complete, you are ready to configure and run your custom DKG Node using the `dkg-cli`. The `dkg-cli` provides automated installation, configuration management, and service control for your DKG Node. Detailed instructions on how to use `dkg-cli` to configure your node and manage its services are available on the [**Installation**](../getting-started/decentralized-knowledge-graph-dkg.md#id-1-install-cli) page under the "Getting started" section.
-## Update your custom DKG Node project
+## Update your custom DKG Node project
When a new version of DKG Node is released, follow the process steps below to update your custom DKG Node project.
-**1. Fetch the latest changes from upstream:**
+**1. Fetch the latest changes from upstream:**
```sh
git fetch upstream
@@ -103,7 +103,7 @@ Most projects will encounter differences between upstream and local changes. Rev
git push origin main
```
-## You’re up to date
+## You’re up to date
At this point, your codebase is synced with the latest official [DKG Node](https://github.com/OriginTrail/dkg-node) while keeping your customizations intact.
diff --git a/docs/build-a-dkg-node-ai-agent/using-the-dkg-client.md b/docs/build-a-dkg-node-ai-agent/using-the-dkg-client.md
new file mode 100644
index 00000000..301f57fd
--- /dev/null
+++ b/docs/build-a-dkg-node-ai-agent/using-the-dkg-client.md
@@ -0,0 +1,568 @@
+---
+description: >-
+ Learn how to use the DKG client (ctx.dkg) in your plugins to query, retrieve,
+ and publish Knowledge Assets on the OriginTrail Decentralized Knowledge Graph.
+---
+
+# Using the DKG client
+
+When building plugins for your DKG Node, you have access to `ctx.dkg` — a powerful client that lets you interact directly with the OriginTrail Decentralized Knowledge Graph. This page covers the core operations you'll use most often: **querying**, **getting**, and **publishing** Knowledge Assets.
+
+{% hint style="info" %}
+💡 **Quick Reference:** The `ctx.dkg` client is an instance of [dkg.js](advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/) that's pre-configured and injected into every plugin via the `defineDkgPlugin` function.
+{% endhint %}
+
+## Accessing the DKG client
+
+Inside your plugin, you receive `ctx` as the first argument to `defineDkgPlugin`. The DKG client is available at `ctx.dkg`:
+
+```ts
+import { defineDkgPlugin } from "@dkg/plugins";
+
+export default defineDkgPlugin((ctx, mcp, api) => {
+ // ctx.dkg is your DKG client instance
+ // ctx.blob is the blob storage for file handling
+
+ // Register tools and routes that use ctx.dkg...
+});
+```
+
+***
+
+## Querying the DKG with SPARQL
+
+The most powerful way to explore and retrieve data from the DKG is through SPARQL queries. SPARQL is a query language for RDF data — think of it like SQL, but for graph databases.
+
+### Basic query syntax
+
+Use `ctx.dkg.graph.query()` to execute SPARQL queries:
+
+```ts
+const result = await ctx.dkg.graph.query(
+  `PREFIX schema: <http://schema.org/>
+ SELECT ?subject ?name
+ WHERE {
+ ?subject schema:name ?name .
+ }`,
+ "SELECT" // Query type: SELECT, CONSTRUCT, ASK, or DESCRIBE
+);
+```
+
+### Query types
+
+| Type | Description | Returns |
+| ----------- | ---------------------------------------- | --------------------------------------- |
+| `SELECT` | Returns variable bindings (rows of data) | Array of objects with variable bindings |
+| `CONSTRUCT` | Builds a new RDF graph from results | RDF triples in JSON-LD format |
+| `ASK` | Boolean query — does a pattern exist? | `true` or `false` |
+| `DESCRIBE` | Returns RDF data about a resource | RDF description of the resource |
+
+### Query response structure
+
+```ts
+{
+ "status": "COMPLETED",
+ "data": [
+ { "subject": "https://example.org/entity1", "name": "Example Entity" },
+ { "subject": "https://example.org/entity2", "name": "Another Entity" }
+ ]
+}
+```
+
+### Example: Building a query tool
+
+Here's a complete example of registering an MCP tool that queries the DKG:
+
+```ts
+import { defineDkgPlugin } from "@dkg/plugins";
+import { z } from "@dkg/plugins/helpers";
+
+export default defineDkgPlugin((ctx, mcp) => {
+ mcp.registerTool(
+ "search-entities",
+ {
+ title: "Search DKG Entities",
+ description: "Search for entities by name in the DKG",
+ inputSchema: {
+ searchTerm: z.string().describe("Name or partial name to search for"),
+ },
+ },
+ async ({ searchTerm }) => {
+ const query = `
+ PREFIX schema: <http://schema.org/>
+ PREFIX dkg: <https://ontology.origintrail.io/dkg/1.0#>
+
+ SELECT ?entity ?name ?type
+ WHERE {
+ GRAPH <metadata:graph> {
+ ?g dkg:hasNamedGraph ?kaGraph .
+ }
+ GRAPH ?kaGraph {
+ ?entity schema:name ?name .
+ OPTIONAL { ?entity a ?type }
+ FILTER(CONTAINS(LCASE(STR(?name)), LCASE("${searchTerm}")))
+ }
+ }
+ LIMIT 20
+ `;
+
+ const result = await ctx.dkg.graph.query(query, "SELECT");
+
+ return {
+ content: [{
+ type: "text",
+ text: JSON.stringify(result.data, null, 2)
+ }],
+ };
+ }
+ );
+});
+```
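One caveat for the search tool above: interpolating raw user input into a SPARQL string can break the query or allow query injection. A minimal escaping sketch follows; the `escapeSparqlLiteral` helper is illustrative and not part of dkg.js:

```ts
// Escape characters that would break out of a double-quoted SPARQL
// string literal. Illustrative helper; not part of the DKG SDK.
function escapeSparqlLiteral(value: string): string {
  return value
    .replace(/\\/g, "\\\\") // backslashes first, so later escapes survive
    .replace(/"/g, '\\"')
    .replace(/\n/g, "\\n")
    .replace(/\r/g, "\\r");
}

// A term containing quotes is now safe to splice into the FILTER clause:
const term = 'say "hello"';
const filter = `FILTER(CONTAINS(LCASE(STR(?name)), LCASE("${escapeSparqlLiteral(term)}")))`;
console.log(filter);
```

Applied to the tool above, you would pass `escapeSparqlLiteral(searchTerm)` into the template literal instead of `searchTerm` directly.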
+
+### Common query patterns
+
+#### Query all current Knowledge Assets
+
+Use the DKG metadata graph to filter for currently valid Knowledge Assets (KAs):
+
+```sparql
+PREFIX schema: <http://schema.org/>
+PREFIX dkg: <https://ontology.origintrail.io/dkg/1.0#>
+
+SELECT ?subject ?predicate ?object
+WHERE {
+  GRAPH <metadata:graph> {
+ ?g dkg:hasNamedGraph ?kaGraph .
+ }
+ GRAPH ?kaGraph {
+ ?subject ?predicate ?object .
+ }
+}
+LIMIT 100
+```
+
+#### Query within a specific paranet
+
+Restrict queries to a paranet scope:
+
+```sparql
+PREFIX dkg: <https://ontology.origintrail.io/dkg/1.0#>
+
+SELECT ?kaGraph ?subject ?predicate ?object
+WHERE {
+  GRAPH <metadata:graph> {
+    <PARANET_UAL> dkg:hasNamedGraph ?kaGraph .
+ }
+ GRAPH ?kaGraph {
+ ?subject ?predicate ?object .
+ }
+}
+```
+
+#### Query by publisher
+
+Find all KAs published by a specific wallet:
+
+```sparql
+PREFIX dkg: <https://ontology.origintrail.io/dkg/1.0#>
+
+SELECT ?kaGraph
+WHERE {
+  GRAPH <metadata:graph> {
+    ?kc dkg:publishedBy <PUBLISHER_WALLET> .
+ ?kc dkg:hasNamedGraph ?kaGraph .
+ }
+}
+```
+
+#### Query by date range
+
+Filter KAs by publish time:
+
+```sparql
+PREFIX dkg: <https://ontology.origintrail.io/dkg/1.0#>
+PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>
+
+SELECT ?kaGraph ?publishTime
+WHERE {
+  GRAPH <metadata:graph> {
+ ?kc dkg:publishTime ?publishTime .
+ FILTER(?publishTime >= "2025-01-01T00:00:00Z"^^xsd:dateTime)
+ FILTER(?publishTime < "2025-02-01T00:00:00Z"^^xsd:dateTime)
+ ?kc dkg:hasNamedGraph ?kaGraph .
+ }
+}
+```
+
+{% hint style="success" %}
+**Pro Tip:** For more complex queries and advanced SPARQL patterns, see the [Query the DKG](advanced-features-and-toolkits/querying-the-dkg.md) documentation.
+{% endhint %}
+
+***
+
+## Getting Knowledge Assets
+
+To retrieve a specific Knowledge Asset by its UAL (Uniform Asset Locator), use `ctx.dkg.asset.get()`.
+
+### Basic get operation
+
+```ts
+const result = await ctx.dkg.asset.get(ual);
+```
+
+### Get with options
+
+```ts
+const result = await ctx.dkg.asset.get(ual, {
+ includeMetadata: true, // Include metadata about the KA
+});
+```
+
+### Response structure
+
+```ts
+{
+ "assertion": [
+ {
+ "@id": "https://example.org/entity",
+ "http://schema.org/name": [{ "@value": "Example Entity" }],
+ "@type": ["http://schema.org/Thing"]
+ }
+ ],
+ "operation": {
+ "get": {
+ "operationId": "uuid-here",
+ "status": "COMPLETED"
+ }
+ }
+}
+```
+
+### Example: Get tool implementation
+
+```ts
+import { defineDkgPlugin } from "@dkg/plugins";
+import { z } from "@dkg/plugins/helpers";
+
+export default defineDkgPlugin((ctx, mcp) => {
+ mcp.registerTool(
+ "get-knowledge-asset",
+ {
+ title: "Get Knowledge Asset",
+ description: "Retrieve a Knowledge Asset by its UAL",
+ inputSchema: {
+ ual: z.string().describe("The UAL of the Knowledge Asset"),
+ },
+ },
+ async ({ ual }) => {
+ try {
+ const result = await ctx.dkg.asset.get(ual, {
+ includeMetadata: true,
+ });
+
+ return {
+ content: [{
+ type: "text",
+ text: JSON.stringify(result, null, 2)
+ }],
+ };
+ } catch (error) {
+ throw new Error(`Failed to get asset: ${error.message}`);
+ }
+ }
+ );
+});
+```
+
+### Understanding UALs
+
+A UAL (Uniform Asset Locator) uniquely identifies a Knowledge Asset:
+
+```
+did:dkg:base:84532/0xd5550173b0f7b8766ab2770e4ba86caf714a5af5/10310
+```
+
+Components:
+
+* `did:dkg` — DID method prefix
+* `base:84532` — Blockchain name and chain ID
+* `0xd555...` — Contract address
+* `10310` — Asset ID (Knowledge Collection ID + optional Asset ID)
+
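Parsing a UAL into these components takes only a small helper. The sketch below is illustrative; `parseUal` and its field names are inventions for this example, not part of the DKG SDK:

```ts
// Split a UAL of the form did:dkg:<blockchain>:<chainId>/<contract>/<id>
// into its parts. Illustrative only; not an official SDK utility.
interface ParsedUal {
  blockchain: string;
  chainId: string;
  contract: string;
  assetId: string;
}

function parseUal(ual: string): ParsedUal {
  const match = ual.match(/^did:dkg:([^:]+):(\d+)\/(0x[0-9a-fA-F]+)\/(\d+)/);
  if (!match) throw new Error(`Not a valid UAL: ${ual}`);
  const [, blockchain, chainId, contract, assetId] = match;
  return { blockchain, chainId, contract, assetId };
}

const parsed = parseUal(
  "did:dkg:base:84532/0xd5550173b0f7b8766ab2770e4ba86caf714a5af5/10310"
);
console.log(parsed.blockchain, parsed.chainId, parsed.assetId);
```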
+***
+
+## Publishing Knowledge Assets
+
+Use `ctx.dkg.asset.create()` to publish new Knowledge Assets to the DKG.
+
+### Basic create operation
+
+```ts
+const content = {
+ public: {
+ "@context": "http://schema.org",
+ "@id": "https://example.org/my-entity",
+ "@type": "Thing",
+ "name": "My First Knowledge Asset",
+ "description": "An example entity on the DKG"
+ }
+};
+
+const result = await ctx.dkg.asset.create(content, {
+ epochsNum: 2, // How many epochs (months) to keep the asset
+});
+```
+
+### Public vs private content
+
+You can publish content as **public** (replicated across the network) or **private** (stays on your node only):
+
+```ts
+const content = {
+ public: {
+ "@context": "http://schema.org",
+ "@id": "https://example.org/entity",
+ "@type": "Organization",
+ "name": "Public Company Name"
+ },
+ private: {
+ "@context": "http://schema.org",
+ "@id": "https://example.org/entity",
+ "@type": "OrganizationPrivateData",
+ "revenue": "$10M",
+ "employeeCount": 150
+ }
+};
+
+const result = await ctx.dkg.asset.create(content, {
+ epochsNum: 6,
+});
+```
+
+### Create options
+
+| Option | Description | Default |
+| ------------------------------------------ | -------------------------------------------- | -------- |
+| `epochsNum` | Number of epochs (months) to store the asset | Required |
+| `minimumNumberOfFinalizationConfirmations` | Confirmations needed before finalized | 3 |
+| `minimumNumberOfNodeReplications` | Minimum nodes to replicate to | 1 |
+
+### Response structure
+
+```ts
+{
+ "UAL": "did:dkg:base:84532/0xd555.../10310",
+ "datasetRoot": "0x09d73283...",
+ "operation": {
+ "mintKnowledgeAsset": {
+ "transactionHash": "0x1a9f6b95...",
+ "blockNumber": 20541620,
+ "status": true
+ },
+ "publish": {
+ "operationId": "uuid-here",
+ "status": "PUBLISH_REPLICATE_END"
+ },
+ "finality": { "status": "FINALIZED" }
+ }
+}
+```
+
+### Example: Create tool implementation
+
+```ts
+import { defineDkgPlugin } from "@dkg/plugins";
+import { z } from "@dkg/plugins/helpers";
+
+export default defineDkgPlugin((ctx, mcp) => {
+ mcp.registerTool(
+ "publish-knowledge-asset",
+ {
+ title: "Publish Knowledge Asset",
+ description: "Create and publish a new Knowledge Asset to the DKG",
+ inputSchema: {
+ jsonld: z.string().describe("JSON-LD content to publish"),
+ privacy: z.enum(["public", "private"]).default("private"),
+ epochs: z.number().min(1).max(24).default(2),
+ },
+ },
+ async ({ jsonld, privacy, epochs }) => {
+ try {
+ const content = JSON.parse(jsonld);
+ const wrapped = { [privacy]: content };
+
+ const result = await ctx.dkg.asset.create(wrapped, {
+ epochsNum: epochs,
+ minimumNumberOfFinalizationConfirmations: 3,
+ minimumNumberOfNodeReplications: 1,
+ });
+
+ const ual = result?.UAL;
+
+ return {
+ content: [{
+ type: "text",
+ text: `✅ Knowledge Asset published!\n\nUAL: ${ual}\nPrivacy: ${privacy}\nEpochs: ${epochs}`
+ }],
+ };
+ } catch (error) {
+ throw new Error(`Failed to publish: ${error.message}`);
+ }
+ }
+ );
+});
+```
+
+### JSON-LD best practices
+
+When creating Knowledge Assets, follow these JSON-LD conventions:
+
+1. **Always include `@context`** — Use `http://schema.org` or a custom ontology
+2. **Use `@id` for unique identification** — URIs that uniquely identify your entity
+3. **Specify `@type`** — The type of entity (e.g., `Person`, `Organization`, `Product`)
+4. **Use schema.org vocabulary** — Prefer standard properties for interoperability
+
+```ts
+{
+ "@context": "http://schema.org",
+ "@id": "urn:myapp:products:12345",
+ "@type": "Product",
+ "name": "Wireless Headphones",
+ "brand": {
+ "@type": "Brand",
+ "name": "AudioTech"
+ },
+ "offers": {
+ "@type": "Offer",
+ "price": "99.99",
+ "priceCurrency": "USD"
+ }
+}
+```
+
+***
+
+## Complete DKG plugin example
+
+Here's a full MCP tool that demonstrates all three operations:
+
+```ts
+import { defineDkgPlugin } from "@dkg/plugins";
+import { z } from "@dkg/plugins/helpers";
+
+export default defineDkgPlugin((ctx, mcp, api) => {
+ // 1. Query Tool
+ mcp.registerTool(
+ "dkg-search",
+ {
+ title: "Search DKG",
+ description: "Search the DKG using a SPARQL query",
+ inputSchema: {
+ query: z.string().describe("SPARQL query to execute"),
+ queryType: z.enum(["SELECT", "CONSTRUCT", "ASK", "DESCRIBE"]).default("SELECT"),
+ },
+ },
+ async ({ query, queryType }) => {
+ const result = await ctx.dkg.graph.query(query, queryType);
+ return {
+ content: [{ type: "text", text: JSON.stringify(result, null, 2) }],
+ };
+ }
+ );
+
+ // 2. Get Tool
+ mcp.registerTool(
+ "dkg-get",
+ {
+ title: "Get Knowledge Asset",
+ description: "Retrieve a Knowledge Asset by UAL",
+ inputSchema: {
+ ual: z.string().describe("UAL of the Knowledge Asset"),
+ },
+ },
+ async ({ ual }) => {
+ const result = await ctx.dkg.asset.get(ual);
+ return {
+ content: [{ type: "text", text: JSON.stringify(result, null, 2) }],
+ };
+ }
+ );
+
+ // 3. Create Tool
+ mcp.registerTool(
+ "dkg-publish",
+ {
+ title: "Publish Knowledge Asset",
+ description: "Publish a new Knowledge Asset",
+ inputSchema: {
+ content: z.string().describe("JSON-LD content"),
+ privacy: z.enum(["public", "private"]).default("private"),
+ },
+ },
+ async ({ content, privacy }) => {
+ const parsed = JSON.parse(content);
+ const result = await ctx.dkg.asset.create({ [privacy]: parsed }, {
+ epochsNum: 2,
+ });
+ return {
+ content: [{ type: "text", text: `Published! UAL: ${result.UAL}` }],
+ };
+ }
+ );
+
+ // 4. REST API endpoint for querying
+ api.post("/query", async (req, res) => {
+ try {
+ const { query, queryType = "SELECT" } = req.body;
+ const result = await ctx.dkg.graph.query(query, queryType);
+ res.json({ success: true, data: result });
+ } catch (error: any) {
+ res.status(500).json({ success: false, error: error.message });
+ }
+ });
+});
+```
+
+***
+
+## Additional DKG client methods
+
+Beyond the core operations, `ctx.dkg` provides additional functionality:
+
+| Method | Description |
+| ----------------------------------- | ------------------------------------------------ |
+| `ctx.dkg.node.info()` | Get information about the connected DKG node |
+| `ctx.dkg.asset.update()` | Update an existing Knowledge Asset |
+| `ctx.dkg.asset.increaseAllowance()` | Pre-approve token spending for faster publishing |
+| `ctx.dkg.asset.decreaseAllowance()` | Revoke token spending authorization |
+
+***
+
+## Error handling
+
+Always wrap DKG operations in try-catch blocks:
+
+```ts
+try {
+ const result = await ctx.dkg.asset.get(ual);
+ // Handle success
+} catch (error) {
+ ctx.logger?.error("DKG operation failed:", error);
+ throw new Error(`Operation failed: ${error.message}`);
+}
+```
+
+Common error scenarios:
+
+* **Network errors** — Node unreachable or timeout
+* **Invalid UAL** — Malformed or non-existent asset
+* **Insufficient funds** — Not enough tokens for publishing
+* **Invalid JSON-LD** — Malformed content structure
+
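These categories can be surfaced to users with a small classifier. The sketch below is an assumption: the matched substrings depend on the node and SDK version, and `classifyDkgError` is not an SDK function:

```ts
// Map a raised error onto the common failure categories listed above.
// Substring matching is illustrative; real messages vary by SDK version.
type DkgErrorKind =
  | "network"
  | "invalid-ual"
  | "insufficient-funds"
  | "invalid-jsonld"
  | "unknown";

function classifyDkgError(error: unknown): DkgErrorKind {
  const msg =
    error instanceof Error
      ? error.message.toLowerCase()
      : String(error).toLowerCase();
  if (msg.includes("timeout") || msg.includes("econnrefused")) return "network";
  if (msg.includes("ual")) return "invalid-ual";
  if (msg.includes("allowance") || msg.includes("funds")) return "insufficient-funds";
  if (msg.includes("json-ld") || msg.includes("jsonld")) return "invalid-jsonld";
  return "unknown";
}
```

A tool's catch block could use this to return a friendlier message per category instead of the raw error text.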
+***
+
+## Next steps
+
+* [**Query the DKG**](advanced-features-and-toolkits/querying-the-dkg.md) — Deep dive into SPARQL query patterns
+* [**DKG JavaScript SDK**](advanced-features-and-toolkits/dkg-sdk/dkg-v8-js-client/) — Full SDK documentation
+* [**Customizing your DKG Agent**](customizing-your-dkg-agent.md) — Build custom plugins
+* [**Essentials Plugin**](essentials-plugin.md) — Reference implementation for DKG tools
diff --git a/docs/contribute-to-the-dkg/delegated-staking/README.md b/docs/contribute-to-the-dkg/delegated-staking/README.md
index 6a1e7714..060e61d6 100644
--- a/docs/contribute-to-the-dkg/delegated-staking/README.md
+++ b/docs/contribute-to-the-dkg/delegated-staking/README.md
@@ -13,7 +13,7 @@ As a decentralized system, the OriginTrail Decentralized Knowledge Graph (DKG) e
## TRAC delegated staking mechanics
-For a DKG Core Node to be eligible to host a portion of the DKG and receive TRAC network rewards, its TRAC stake plays a crucial role. Set at a minimum of 50,000 TRAC on a particular blockchain, the stake has an important role in ensuring the security of the DKG. The DKG Core Node operators can contribute to the node stake on their own or by attracting more TRAC to their stake through delegated staking.
+For a DKG Core Node to be eligible to host a portion of the DKG and receive TRAC network rewards, its TRAC stake plays a crucial role. Set at a minimum of 50,000 TRAC on a particular blockchain, the stake has an important role in ensuring the security of the DKG. The DKG Core Node operators can contribute to the node stake on their own or by attracting more TRAC to their stake through delegated staking.
There are 2 roles involved in delegated staking: **Core Node operators** and **TRAC delegators.**
@@ -33,7 +33,7 @@ Contrary to inflationary systems, TRAC staking is strictly utility-based, and re
As knowledge publishers create Knowledge Assets on the DKG, they lock an appropriate amount of TRAC tokens in the DKG smart contracts. The TRAC amount offered has to be high enough to ensure that enough DKG Core Nodes will store it for a specific amount of time. The nodes then commit to storing the Knowledge Assets for a specific amount of time, measured in **30-day periods called epochs**.
-At the end of each epoch, DKG nodes "prove" that they are providing DKG services to the DKG smart contracts, which in turn unlocks TRAC rewards initially locked by the knowledge publisher.
+At the end of each epoch, DKG nodes "prove" that they are providing DKG services to the DKG smart contracts, which in turn unlocks TRAC rewards initially locked by the knowledge publisher.
Many Core Nodes can compete for the same TRAC reward on the basis of their total stake, node ask, and publishing factor. Node rewards are a function of 4 parameters in order of importance:
@@ -49,15 +49,13 @@ After claiming rewards, the rewards are **automatically restaked, increasing the
To introduce a level of predictability into network operations, token withdrawals are subject to a 28-day unbonding period.
{% hint style="warning" %}
-If you want to withdraw tokens in order to delegate to another node on the same network (blockchain), you **do not** have to wait 28 days! [See here >](redelegating-stake.md)
+If you want to withdraw tokens in order to delegate to another node on the same network (blockchain), you **do not** have to wait 28 days! [See here >](redelegating-stake.md)
{% endhint %}
{% hint style="success" %}
Delegated staking is a non-custodial system, so the Core Node operator has no access to the locked TRAC tokens at any time.
{% endhint %}
-
-
Each Core Node operator can also set an “**operator fee,**” which is a percentage of the TRAC rewards deducted each time a node claims rewards from a Knowledge Asset. The remaining TRAC fee is then split proportionally to the share of staked tokens across all delegators.
{% hint style="info" %}
@@ -118,6 +116,6 @@ To understand how to set up your operator fee, follow the [Core Node setup](../.
## **Have questions?**
-Drop by our [Discord](https://discord.gg/aNpBjf97) or [Telegram group](https://t.me/origintrail), and feel free to ask your questions there. Make sure to follow our official announcements, and stay safe!
+Drop by our [Discord](https://discord.com/invite/xCaY7hvNwD) or [Telegram group](https://t.me/origintrail), and feel free to ask your questions there. Make sure to follow our official announcements, and stay safe!
Happy staking! 🚀
diff --git a/docs/contribute-to-the-dkg/delegated-staking/redelegating-stake.md b/docs/contribute-to-the-dkg/delegated-staking/redelegating-stake.md
index ab532db7..ed013d4a 100644
--- a/docs/contribute-to-the-dkg/delegated-staking/redelegating-stake.md
+++ b/docs/contribute-to-the-dkg/delegated-staking/redelegating-stake.md
@@ -11,7 +11,7 @@ If you want **move your delegated TRAC stake from one DKG node to another**, you
## Keep in mind
* The DKG is multichain. However, **TRAC tokens can only be redelegated within nodes on the same blockchain**
-* The amount of stake (TRAC) that you want to redelegate **should not exceed the second node's remaining capacity** (a node can have a maximum of 2,000,000 TRAC stake delegated to it).
+* The amount of stake (TRAC) that you want to redelegate **should not exceed the second node's remaining capacity** (a node can have a maximum of 5,000,000 TRAC stake delegated to it).
***
@@ -21,18 +21,16 @@ If you want **move your delegated TRAC stake from one DKG node to another**, you
2. Go to the **'My delegation**' tab to see available nodes that you can redelegate from.
3. Optionally, use the **'Filter by blockchain'** dropdown to select the desired blockchain, which will filter and display nodes on this network along with their staking information.
4. Once you've decided which node you want to redelegate your TRAC from, click on the **'Manage stake'** button next to the desired node on the right side of the table. Make sure you read the disclaimer.
-5. When the staking pop-up opens, you'll have the option to **Delegate, Redelegate,** or **Withdraw** TRAC tokens from the node. Proceed by selecting '**Redelegate**'.
+5. When the staking pop-up opens, you'll have the option to **Delegate, Redelegate,** or **Withdraw** TRAC tokens from the node. Proceed by selecting '**Redelegate**'.
Use the redelegate button in the popup to redelegate your stake
-
-
-6. After clicking on 'Redelegate', a field to enter the amount of TRAC you wish to redelegate to another node will appear on the right side of the pop-up, as well as the select box, for selecting the other node — the one that will receive the TRAC. **Enter the amount of TRAC you want redelegated and select the node you want to redelegate to.**
+6. After clicking on 'Redelegate', a field to enter the amount of TRAC you wish to redelegate will appear on the right side of the pop-up, along with a select box for choosing the node that will receive the TRAC. **Enter the amount of TRAC you want to redelegate and select the node you want to redelegate to.**
{% hint style="warning" %}
-You can stake your TRAC only to nodes that have less than 2,000,000 TRAC stake delegated to them.
+You can stake your TRAC only to nodes that have less than 5,000,000 TRAC stake delegated to them.
{% endhint %}
{% hint style="info" %}
@@ -44,8 +42,7 @@ Only the nodes from the same network with the remaining capacity greater than ze
8. Once both transactions are signed and confirmed, you should see a **'Stake redelegated successfully'** message appear.
-9. To confirm that the process was successful, **check your TRAC delegation** by going to the 'My delegations' tab above the table with the nodes and verifying that your delegations are listed there. Additionally, ensure that the stake amount on the node has decreased and the amount on the other node has increased following the successful redelegation.\
-
+9. To confirm that the process was successful, **check your TRAC delegation** by going to the 'My delegations' tab above the table with the nodes and verifying that your delegations are listed there. Additionally, ensure that the stake amount on the node has decreased and the amount on the other node has increased following the successful redelegation.\\
{% hint style="info" %}
If you encounter any issues during the staking process or require assistance, please get in touch with the OriginTrail community in [Discord](https://discord.gg/xCaY7hvNwD).
diff --git a/docs/contribute-to-the-dkg/delegated-staking/step-by-step-staking.md b/docs/contribute-to-the-dkg/delegated-staking/step-by-step-staking.md
index 8b9b7526..5d5aadc0 100644
--- a/docs/contribute-to-the-dkg/delegated-staking/step-by-step-staking.md
+++ b/docs/contribute-to-the-dkg/delegated-staking/step-by-step-staking.md
@@ -8,14 +8,14 @@ Welcome to the step-by-step TRAC delegated staking guide! First, lets start with
## Prerequisites
-1. You need to have some TRAC tokens to delegate. See ['How to get on TRAC(k)?' section of this website >](https://origintrail.io/get-started/trac-token)
+1. You need to have some TRAC tokens to delegate. See ['How to get on TRAC(k)?' section of this website >](https://origintrail.io/get-started/trac-token)
2. You need to decide which blockchain you want to stake on. The DKG supports multiple blockchains:
* [Base Blockchain](../../dkg-knowledge-hub/learn-more/connected-blockchains/base-blockchain/)
* [NeuroWeb](../../dkg-knowledge-hub/learn-more/connected-blockchains/neuroweb.md)
* [Gnosis Chain](../../dkg-knowledge-hub/learn-more/connected-blockchains/gnosis-chain/)
3. Bridge your TRAC to the chosen blockchain. See instructions for bridging:
* [Base Blockchain](../../dkg-knowledge-hub/learn-more/connected-blockchains/base-blockchain/)
- * [NeuroWeb](../../graveyard/everything/teleport-instructions-neuroweb.md)
+ * [NeuroWeb](../../dkg-knowledge-hub/learn-more/connected-blockchains/neuroweb.md)
* [Gnosis Chain](../../dkg-knowledge-hub/learn-more/connected-blockchains/gnosis-chain/)
4. Have some gas fee tokens available on the chosen network:
* Base Mainnet: ETH on Base
@@ -36,46 +36,46 @@ For the purpose of this tutorial we used the Metamask wallet extension.
Once you have confirmed that you have both gas tokens and TRAC tokens available in your wallet, you can proceed to the Staking Dashboard at [https://staking.origintrail.io/](https://staking.origintrail.io/) and follow the steps below:
-### **Step 1:**
+### **Step 1:**
Click on the **'Connect wallet'** button in the top right corner of the navigation bar and follow the prompts to connect your wallet to the interface.
-### **Step 2:**
+### **Step 2:**
-Make sure you have selected the right blockchain in your wallet.
+Make sure you have selected the right blockchain in your wallet.
-### **Step 3:**
+### **Step 3:**
The Staking Dashboard shows a list of all the Core Nodes hosting the DKG. This table shows different information, such as:
-* The node name,
-* Which blockchain it's connected to,
-* How much stake does a node have,
-* The node's ask,
-* The node's operator fee,
-* Reward statistics, and other.
+* The node name,
+* Which blockchain it's connected to,
+* How much stake a node has,
+* The node's ask,
+* The node's operator fee,
+* Reward statistics, and more.
-**To delegate your TRAC tokens, you need to pick one or more nodes you believe are going to perform best for the network** (on the basis of criteria explained [here](./)). The chosen node has to have **enough "room" to take TRAC,** meaning less than 2M TRAC already staked. 2M is the maximum amount of TRAC staked per node.
+**To delegate your TRAC tokens, you need to pick one or more nodes you believe are going to perform best for the network** (on the basis of criteria explained [here](./)). The chosen node has to have **enough "room" to take TRAC,** meaning less than 5M TRAC already staked. 5M is the maximum amount of TRAC staked per node.
-### **Step 4:**
+### **Step 4:**
Once you click on a Core Node, a staking pop-up opens with the option to delegate or withdraw TRAC tokens from the node. Proceed by pressing the **'Delegate'** button
Delegating popup
-### **Step 5:**
+### **Step 5:**
Enter the amount of TRAC you would like to delegate and press the **'Delegate TRAC'** button. The delegation process will require two transactions: one to increase the allowance and another to confirm the contract interaction.
-### **Step 6:**
+### **Step 6:**
To confirm that the process was successful, check your TRAC delegation by going to the **'My delegations'** tab above the table with the nodes and verify that your delegation is listed there. Additionally, ensure that the stake amount on the node has increased following the successful delegation.
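Before delegating, it helps to check how much "room" a node has left under the per-node staking cap described above. A minimal sketch in plain Python (no chain interaction; `remaining_room` and `can_delegate` are hypothetical helpers, and the 5M figure is the cap from this tutorial — the cap is a protocol parameter, so read the node's actual stake and the current cap from the Staking Dashboard):

```python
# Sketch: how much TRAC a node can still accept under the per-node staking cap.
# Figures are illustrative; real values come from the Staking Dashboard.

NODE_STAKE_CAP = 5_000_000  # maximum TRAC staked per node (per the tutorial above)

def remaining_room(current_stake: float, cap: float = NODE_STAKE_CAP) -> float:
    """TRAC a node can still accept before hitting the staking cap."""
    return max(cap - current_stake, 0.0)

def can_delegate(current_stake: float, amount: float, cap: float = NODE_STAKE_CAP) -> bool:
    """True if the full delegation amount fits under the cap."""
    return amount <= remaining_room(current_stake, cap)

print(remaining_room(4_200_000))         # 800000 TRAC of room left
print(can_delegate(4_200_000, 500_000))  # True
print(can_delegate(4_200_000, 900_000))  # False
```

If a delegation would exceed the cap, pick another node or split the amount across several nodes.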
diff --git a/docs/contribute-to-the-dkg/hackathon-scaling-trust-in-the-age-of-ai/README.md b/docs/contribute-to-the-dkg/hackathon-scaling-trust-in-the-age-of-ai/README.md
index 24d37b5e..95f152ba 100644
--- a/docs/contribute-to-the-dkg/hackathon-scaling-trust-in-the-age-of-ai/README.md
+++ b/docs/contribute-to-the-dkg/hackathon-scaling-trust-in-the-age-of-ai/README.md
@@ -2,6 +2,7 @@
description: >-
🧠 Code trust & verifiability into AI. Join the global hackathon to build a
collective digital immune system for the AI era. 🚀
+hidden: true
---
# Hackathon: Scaling Trust in the Age of AI
diff --git a/docs/contribute-to-the-dkg/hackathon-scaling-trust-in-the-age-of-ai/dkg-social-graph-query-guide.md b/docs/contribute-to-the-dkg/hackathon-scaling-trust-in-the-age-of-ai/dkg-social-graph-query-guide.md
index a2531e64..a7adea9b 100644
--- a/docs/contribute-to-the-dkg/hackathon-scaling-trust-in-the-age-of-ai/dkg-social-graph-query-guide.md
+++ b/docs/contribute-to-the-dkg/hackathon-scaling-trust-in-the-age-of-ai/dkg-social-graph-query-guide.md
@@ -80,20 +80,26 @@ Represents a social media account's core identity:
"@graph": [
{
"@type": ["schema:Person", "foaf:Person", "prov:Agent"],
- "@id": "https://ca.investing.com",
+ "@id": "https://youtube.com/channel/UCY1kMZp36IQSyNx_9h4mpCg",
"schema:identifier": [
{
"@type": "schema:PropertyValue",
"schema:propertyID": "creatorId",
- "schema:value": "news::ca.investing.com"
+ "schema:value": "youtube::UCY1kMZp36IQSyNx_9h4mpCg"
},
{
"@type": "schema:PropertyValue",
"schema:propertyID": "platform",
- "schema:value": "news"
+ "schema:value": "youtube"
+ },
+ {
+ "@type": "schema:PropertyValue",
+ "schema:propertyID": "userId",
+ "schema:value": "UCY1kMZp36IQSyNx_9h4mpCg"
}
],
- "schema:dateCreated": "1970-01-21T09:19:38.440Z"
+ "schema:dateCreated": "2025-11-13T19:55:47.000Z",
+ "prov:generatedAtTime": "2025-11-13T19:55:47.000Z"
}
]
}
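Once the account-identity JSON-LD above is in hand, the `schema:identifier` array is straightforward to query client-side. A hedged sketch in plain Python (`get_identifier` is a hypothetical helper, not part of any DKG SDK; it does plain dict traversal with no JSON-LD expansion):

```python
# Sketch: pull a specific schema:identifier value out of the account-identity
# JSON-LD shown above. Plain dict traversal; no JSON-LD processing is performed.

account = {
    "@type": ["schema:Person", "foaf:Person", "prov:Agent"],
    "@id": "https://youtube.com/channel/UCY1kMZp36IQSyNx_9h4mpCg",
    "schema:identifier": [
        {"@type": "schema:PropertyValue", "schema:propertyID": "creatorId",
         "schema:value": "youtube::UCY1kMZp36IQSyNx_9h4mpCg"},
        {"@type": "schema:PropertyValue", "schema:propertyID": "platform",
         "schema:value": "youtube"},
        {"@type": "schema:PropertyValue", "schema:propertyID": "userId",
         "schema:value": "UCY1kMZp36IQSyNx_9h4mpCg"},
    ],
}

def get_identifier(entity: dict, property_id: str):
    """Return the schema:value whose schema:propertyID matches, else None."""
    for pv in entity.get("schema:identifier", []):
        if pv.get("schema:propertyID") == property_id:
            return pv.get("schema:value")
    return None

print(get_identifier(account, "platform"))  # youtube
print(get_identifier(account, "userId"))    # UCY1kMZp36IQSyNx_9h4mpCg
```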
@@ -113,37 +119,76 @@ Captures time-series data about account activity and connections:
"@graph": [
{
"@type": ["prov:Entity", "schema:Observation"],
- "@id": "urn:uuid:25499e5d-7c9a-53d7-85c6-acc1cdac6ba8",
- "prov:generatedAtTime": "1970-01-21T09:19:38.440Z",
+ "@id": "urn:uuid:d103990f-b092-5f73-952c-b7c3554add43",
+ "prov:generatedAtTime": "2025-11-13T19:55:47.000Z",
+ "schema:observationDate": "2025-11-13T19:55:47.000Z",
"schema:about": {
- "@id": "https://twitter.com/i/user/748244810692104192"
+ "@id": "https://youtube.com/channel/UCY1kMZp36IQSyNx_9h4mpCg"
+ },
+ "prov:specializationOf": {
+ "@id": "https://youtube.com/channel/UCY1kMZp36IQSyNx_9h4mpCg"
},
"foaf:knows": [
{
"@type": "foaf:Person",
- "@id": "https://x.com/9to5mac",
- "schema:name": "9to5mac",
+ "@id": "https://www.youtube.com/@pondermusic",
+ "schema:name": "pondermusic",
"schema:additionalProperty": {
"@type": "schema:PropertyValue",
"schema:name": "connectionStrength",
- "schema:value": 17
+ "schema:value": 60
}
}
],
+ "schema:expertise": [
+ {
+ "@type": "schema:Thing",
+ "@id": "https://lunarcrush.com/api4/public/topic/money/v1",
+ "schema:name": "money",
+ "schema:additionalProperty": [
+ {
+ "@type": "schema:PropertyValue",
+ "schema:name": "creatorRank",
+ "schema:value": 4572
+ },
+ {
+ "@type": "schema:PropertyValue",
+ "schema:name": "postCount",
+ "schema:value": 2
+ },
+ {
+ "@type": "schema:PropertyValue",
+ "schema:name": "percent",
+ "schema:value": 0.98
+ }
+ ]
+ }
+ ],
+ "prov:hadPrimarySource": {
+ "@type": "prov:Entity",
+ "@id": "https://lunarcrush.com/api4/public/creator/youtube/UCY1kMZp36IQSyNx_9h4mpCg/v1"
+ },
"foaf:accountProfileInfo": {
"@type": "schema:Person",
- "schema:name": "Ryan Christoffel",
- "schema:alternateName": "iryantldr",
+ "@id": "https://youtube.com/channel/UCY1kMZp36IQSyNx_9h4mpCg",
+ "schema:name": "Mark Rober",
+ "foaf:name": "Mark Rober",
+ "schema:alternateName": "markrober",
+ "foaf:nick": "markrober",
"schema:image": {
"@type": "schema:ImageObject",
- "schema:url": "https://pbs.twimg.com/profile_images/..."
+ "schema:url": "https://yt3.ggpht.com/ytc/AIdro_ksXY2REjZ6gYKSgnWT5jC_zT9mX900vyFtVinR8KbHww=s88-c-k-c0x00ffffff-no-rj",
+ "foaf:depiction": "https://yt3.ggpht.com/ytc/AIdro_ksXY2REjZ6gYKSgnWT5jC_zT9mX900vyFtVinR8KbHww=s88-c-k-c0x00ffffff-no-rj"
}
},
+ "schema:rank": 2099,
"schema:interactionStatistic": [
{
"@type": "schema:InteractionCounter",
- "schema:interactionType": {"@type": "schema:FollowAction"},
- "schema:userInteractionCount": 208065784
+ "schema:interactionType": {
+ "@type": "schema:FollowAction"
+ },
+ "schema:userInteractionCount": 71700000
}
]
}
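The `foaf:knows` array above pairs each connection with a `connectionStrength` property, which makes ranking an account's strongest connections a one-liner. A sketch (field names follow the example; the second connection is hypothetical, added only so the ranking is visible):

```python
# Sketch: rank a creator's foaf:knows connections by connectionStrength,
# using the shape of the observation above.

observation = {
    "foaf:knows": [
        {"@id": "https://www.youtube.com/@pondermusic", "schema:name": "pondermusic",
         "schema:additionalProperty": {"@type": "schema:PropertyValue",
                                       "schema:name": "connectionStrength",
                                       "schema:value": 60}},
        # hypothetical second connection, for illustration only
        {"@id": "https://www.youtube.com/@example", "schema:name": "example",
         "schema:additionalProperty": {"@type": "schema:PropertyValue",
                                       "schema:name": "connectionStrength",
                                       "schema:value": 17}},
    ]
}

def connections_by_strength(obs: dict) -> list:
    """(name, strength) pairs, strongest first."""
    pairs = []
    for person in obs.get("foaf:knows", []):
        prop = person.get("schema:additionalProperty", {})
        if prop.get("schema:name") == "connectionStrength":
            pairs.append((person.get("schema:name"), prop.get("schema:value")))
    return sorted(pairs, key=lambda p: p[1], reverse=True)

print(connections_by_strength(observation))  # [('pondermusic', 60), ('example', 17)]
```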
@@ -165,25 +210,39 @@ Core metadata about a social media post:
"@graph": [
{
"@type": ["schema:SocialMediaPosting", "sioc:Post", "prov:Entity"],
- "@id": "https://247sports.com/college/notre-dame/article/...",
+ "@id": "https://youtube.com/watch?v=6zU2rLYHLhw",
"schema:author": [
- {"@id": "https://twitter.com/i/user/151595281"}
+ {
+ "@id": "https://youtube.com/channel/UCY1kMZp36IQSyNx_9h4mpCg",
+ "prov:agent": "https://youtube.com/channel/UCY1kMZp36IQSyNx_9h4mpCg"
+ }
],
+ "schema:image": [],
"schema:identifier": {
"@type": "schema:PropertyValue",
"schema:propertyID": "postId",
- "schema:value": "247sports.com-1025526375"
+ "schema:value": "6zU2rLYHLhw"
},
- "schema:datePublished": "1970-01-21T08:57:14.089Z",
- "schema:url": "https://247sports.com/...",
- "schema:genre": "news",
- "schema:headline": "Notre Dame Notebook: Irish Defense...",
- "schema:description": "Notre Dame's defense dominated...",
- "schema:keywords": "notre dame,north carolina",
+ "prov:wasAttributedTo": {
+ "@id": "https://youtube.com/channel/UCY1kMZp36IQSyNx_9h4mpCg"
+ },
+ "schema:datePublished": "2025-01-23T16:08:11.000Z",
+ "prov:generatedAtTime": "2025-01-23T16:08:11.000Z",
+ "sioc:created_at": "2025-01-23T16:08:11.000Z",
+ "schema:dateCreated": "2025-10-30T13:40:03.000Z",
+ "schema:url": "https://youtube.com/watch?v=6zU2rLYHLhw",
+ "foaf:page": "https://youtube.com/watch?v=6zU2rLYHLhw",
+ "schema:genre": "youtube-video",
+ "sioc:post_type": "youtube-video",
+ "schema:headline": "The Fastest Way To Make A Salad!",
+ "schema:description": "The Fastest Way To Make A Salad! w/@NickDiGiovanni",
+ "schema:keywords": [
+ "make a"
+ ],
"schema:about": [
{
"@type": "schema:Thing",
- "@id": "https://lunarcrush.com/api4/public/topic/notre%20dame/v1"
+ "@id": "https://lunarcrush.com/api4/public/topic/make%20a/v1"
}
]
}
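In the post example above, the `postId` identifier (`6zU2rLYHLhw`) is simply the YouTube video id from the `schema:url`. A sketch of that mapping using only the standard library (`youtube_post_id` is a hypothetical helper; it handles only the `watch?v=` URL form shown here, and other YouTube URL shapes would need additional cases):

```python
# Sketch: derive the postId used in schema:identifier from a YouTube watch URL,
# matching the mapping in the example above (url -> "6zU2rLYHLhw").
from urllib.parse import urlparse, parse_qs

def youtube_post_id(url: str):
    """Extract the video id from a youtube.com/watch?v=... URL, else None."""
    query = parse_qs(urlparse(url).query)
    values = query.get("v")
    return values[0] if values else None

print(youtube_post_id("https://youtube.com/watch?v=6zU2rLYHLhw"))  # 6zU2rLYHLhw
```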
@@ -204,27 +263,49 @@ Engagement metrics tracked over time:
"@graph": [
{
"@type": ["prov:Entity", "schema:Observation"],
- "@id": "urn:uuid:de0e9046-5754-5a22-b5a9-1f3a520a846b",
- "prov:generatedAtTime": "1970-01-21T09:19:38.340Z",
- "schema:about": {
- "@id": "https://9to5mac.com/2025/10/16/i-love-my-iphone-air..."
- },
- "prov:specializationOf": {
- "@id": "https://9to5mac.com/2025/10/16/i-love-my-iphone-air..."
- },
+ "@id": "urn:uuid:60330101-6002-59bf-a26b-97d1a8384766",
+ "prov:generatedAtTime": "2025-10-30T13:40:04.000Z",
+ "schema:observationDate": "2025-10-30T13:40:04.000Z",
"schema:interactionStatistic": {
"@type": "schema:InteractionCounter",
"schema:interactionType": "schema:InteractAction",
- "schema:userInteractionCount": 35921,
+ "schema:userInteractionCount": 62659943,
"schema:description": "Total interactions"
},
+ "schema:reviewRating": {
+ "@type": "schema:Rating",
+ "schema:ratingValue": "3.00",
+ "schema:bestRating": 5,
+ "schema:worstRating": 1,
+ "schema:description": "Sentiment score"
+ },
"schema:variableMeasured": [
{
"@type": "schema:PropertyValue",
- "schema:name": "detailedMetrics",
- "schema:value": "{\"views\":35744,\"quotes\":2,\"replies\":20,\"retweets\":5,\"bookmarks\":13,\"favorites\":137}"
+ "schema:name": "likes",
+ "schema:value": 2129894
+ },
+ {
+ "@type": "schema:PropertyValue",
+ "schema:name": "views",
+ "schema:value": 60527488
+ },
+ {
+ "@type": "schema:PropertyValue",
+ "schema:name": "comments",
+ "schema:value": 2561
}
- ]
+ ],
+ "schema:about": {
+ "@id": "https://youtube.com/watch?v=6zU2rLYHLhw"
+ },
+ "prov:specializationOf": {
+ "@id": "https://youtube.com/watch?v=6zU2rLYHLhw"
+ },
+ "prov:hadPrimarySource": {
+ "@type": "prov:Entity",
+ "@id": "https://lunarcrush.com/api4/public/posts/youtube-video/6zU2rLYHLhw/v1"
+ }
}
]
}
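A useful property of the example above is that the per-metric values in `schema:variableMeasured` (likes, views, comments) sum to the `schema:userInteractionCount` total. A sketch of that sanity check in plain Python (this additivity is an observation about this particular example, not a documented guarantee of the schema):

```python
# Sketch: verify that likes + views + comments in schema:variableMeasured
# add up to the schema:userInteractionCount reported for the post.

metrics = {
    "schema:interactionStatistic": {"schema:userInteractionCount": 62_659_943},
    "schema:variableMeasured": [
        {"schema:name": "likes",    "schema:value": 2_129_894},
        {"schema:name": "views",    "schema:value": 60_527_488},
        {"schema:name": "comments", "schema:value": 2_561},
    ],
}

def metrics_consistent(obs: dict) -> bool:
    """True if the itemized metrics sum to the reported total interactions."""
    total = obs["schema:interactionStatistic"]["schema:userInteractionCount"]
    parts = sum(v["schema:value"] for v in obs["schema:variableMeasured"])
    return parts == total

print(metrics_consistent(metrics))  # True
```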
@@ -244,15 +325,15 @@ Basic topic information:
"@graph": [
{
"@type": ["schema:Thing", "skos:Concept", "foaf:Topic"],
- "@id": "https://lunarcrush.com/api4/public/topic/%240992hk/v1",
- "schema:name": "Lenovo Group Limited",
- "foaf:name": "Lenovo Group Limited",
- "schema:alternateName": "$0992hk",
- "skos:notation": "$0992hk",
+ "@id": "https://lunarcrush.com/api4/public/topic/money/v1",
+ "schema:name": "Money",
+ "foaf:name": "Money",
+ "schema:alternateName": "money",
+ "skos:notation": "money",
"schema:identifier": {
"@type": "schema:PropertyValue",
"schema:propertyID": "topicSlug",
- "schema:value": "$0992hk"
+ "schema:value": "money"
}
}
]
@@ -272,48 +353,171 @@ Topic trends, rankings, and sentiment over time:
"@graph": [
{
"@type": ["prov:Entity", "schema:Observation"],
- "@id": "urn:uuid:42422e77-aef8-5621-8350-5175e64b929b",
- "prov:generatedAtTime": "1970-01-21T09:19:53.508Z",
+ "@id": "urn:uuid:08ff5e69-85b9-5bba-80fb-dd76348a1043",
+ "prov:generatedAtTime": "2025-11-13T19:55:47.000Z",
+ "schema:observationDate": "2025-11-13T19:55:47.000Z",
"schema:about": {
- "@id": "https://lunarcrush.com/api4/public/topic/postgame/v1"
+ "@id": "https://lunarcrush.com/api4/public/topic/money/v1"
+ },
+ "prov:specializationOf": {
+ "@id": "https://lunarcrush.com/api4/public/topic/money/v1"
},
"schema:relatedLink": [
- {"@id": "https://lunarcrush.com/api4/public/topic/blue%20jays/v1"},
- {"@id": "https://lunarcrush.com/api4/public/topic/toronto/v1"}
+ {"@id": "https://lunarcrush.com/api4/public/topic/coins%20layer%201/v1"},
+ {"@id": "https://lunarcrush.com/api4/public/topic/investment/v1"},
+ {"@id": "https://lunarcrush.com/api4/public/topic/coins%20pow/v1"}
+ ],
+ "schema:category": [
+ "Finance"
],
- "schema:rank": 1223,
+ "prov:hadPrimarySource": {
+ "@type": "prov:Entity",
+ "@id": "https://lunarcrush.com/api4/public/topic/money/v1"
+ },
+ "schema:rank": 5,
"schema:additionalProperty": [
{
"@type": "schema:PropertyValue",
"schema:name": "numContributors",
- "schema:value": 1762
+ "schema:value": 225169
},
{
"@type": "schema:PropertyValue",
"schema:name": "numPosts",
- "schema:value": 1762
+ "schema:value": 225169
},
{
"@type": "schema:PropertyValue",
"schema:name": "interactions24h",
- "schema:value": 16461825
+ "schema:value": 803878895
},
{
"@type": "schema:PropertyValue",
"schema:name": "trend",
- "schema:value": "up"
+ "schema:value": "down"
}
],
+ "schema:genre": "Finance",
"schema:variableMeasured": [
{
"@type": "schema:PropertyValue",
- "schema:name": "postTypeDistribution",
- "schema:value": "{\"news\":293,\"tweet\":17606,\"reddit-post\":10608}"
+ "schema:name": "postTypeDistribution:news",
+ "schema:value": 10737
+ },
+ {
+ "@type": "schema:PropertyValue",
+ "schema:name": "postTypeDistribution:tweet",
+ "schema:value": 1030324
+ },
+ {
+ "@type": "schema:PropertyValue",
+ "schema:name": "postTypeDistribution:reddit-post",
+ "schema:value": 68340
+ },
+ {
+ "@type": "schema:PropertyValue",
+ "schema:name": "postTypeDistribution:tiktok-video",
+ "schema:value": 773815
+ },
+ {
+ "@type": "schema:PropertyValue",
+ "schema:name": "postTypeDistribution:youtube-video",
+ "schema:value": 830036
},
{
"@type": "schema:PropertyValue",
- "schema:name": "sentimentByType",
- "schema:value": "{\"news\":74,\"tweet\":59,\"reddit-post\":18}"
+ "schema:name": "interactionsByType:news",
+ "schema:value": 143137968
+ },
+ {
+ "@type": "schema:PropertyValue",
+ "schema:name": "interactionsByType:tweet",
+ "schema:value": 159234882
+ },
+ {
+ "@type": "schema:PropertyValue",
+ "schema:name": "interactionsByType:reddit-post",
+ "schema:value": 1045990
+ },
+ {
+ "@type": "schema:PropertyValue",
+ "schema:name": "interactionsByType:tiktok-video",
+ "schema:value": 168630892
+ },
+ {
+ "@type": "schema:PropertyValue",
+ "schema:name": "interactionsByType:youtube-video",
+ "schema:value": 331829163
+ },
+ {
+ "@type": "schema:PropertyValue",
+ "schema:name": "sentimentByType:news",
+ "schema:value": 98
+ },
+ {
+ "@type": "schema:PropertyValue",
+ "schema:name": "sentimentByType:tweet",
+ "schema:value": 64
+ },
+ {
+ "@type": "schema:PropertyValue",
+ "schema:name": "sentimentByType:reddit-post",
+ "schema:value": 67
+ },
+ {
+ "@type": "schema:PropertyValue",
+ "schema:name": "sentimentByType:tiktok-video",
+ "schema:value": 74
+ },
+ {
+ "@type": "schema:PropertyValue",
+ "schema:name": "sentimentByType:youtube-video",
+ "schema:value": 76
+ },
+ {
+ "@type": "schema:PropertyValue",
+ "schema:name": "sentimentDetail:news",
+ "schema:value": {
+ "neutral": 4229,
+ "negative": 2484,
+ "positive": 4024
+ }
+ },
+ {
+ "@type": "schema:PropertyValue",
+ "schema:name": "sentimentDetail:tweet",
+ "schema:value": {
+ "neutral": 414100,
+ "negative": 219356,
+ "positive": 396868
+ }
+ },
+ {
+ "@type": "schema:PropertyValue",
+ "schema:name": "sentimentDetail:reddit-post",
+ "schema:value": {
+ "neutral": 29674,
+ "negative": 12815,
+ "positive": 25851
+ }
+ },
+ {
+ "@type": "schema:PropertyValue",
+ "schema:name": "sentimentDetail:tiktok-video",
+ "schema:value": {
+ "neutral": 346694,
+ "negative": 109220,
+ "positive": 317901
+ }
+ },
+ {
+ "@type": "schema:PropertyValue",
+ "schema:name": "sentimentDetail:youtube-video",
+ "schema:value": {
+ "neutral": 347658,
+ "negative": 118169,
+ "positive": 364209
+ }
}
]
}
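The flattened `sentimentDetail:*` and `postTypeDistribution:*` properties above can be cross-checked against each other: for each post type, the neutral + negative + positive counts sum to that type's post count. A sketch over two of the example's post types (the consistency rule is inferred from this example's numbers, not stated anywhere as a schema invariant):

```python
# Sketch: cross-check that the sentimentDetail counts per post type sum to the
# corresponding postTypeDistribution value from the topic observation above.

post_type_distribution = {"news": 10_737, "tweet": 1_030_324}
sentiment_detail = {
    "news":  {"neutral": 4_229,   "negative": 2_484,   "positive": 396_868 - 396_868 + 4_024},
    "tweet": {"neutral": 414_100, "negative": 219_356, "positive": 396_868},
}

def detail_matches_distribution(post_type: str) -> bool:
    """True if sentiment counts for the post type sum to its total post count."""
    return sum(sentiment_detail[post_type].values()) == post_type_distribution[post_type]

for pt in post_type_distribution:
    print(pt, detail_matches_distribution(pt))  # news True, tweet True
```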
diff --git a/docs/dkg-key-concepts.md b/docs/dkg-key-concepts.md
index b3241f76..412f1ba2 100644
--- a/docs/dkg-key-concepts.md
+++ b/docs/dkg-key-concepts.md
@@ -9,15 +9,15 @@ description: >-
## DKG Network & Nodes
-OriginTrail Decentralized Knowledge Graph (DKG) is a permissionless, multi-chain infrastructure designed to host and interlink semantically-rich “[Knowledge Assets](dkg-key-concepts.md#knowledge-assets)” - structured containers of machine-readable data (e.g., RDF-based graphs) that are discoverable, verifiable, and owned by their creators.\
+OriginTrail Decentralized Knowledge Graph (DKG) is a permissionless, multi-chain infrastructure designed to host and interlink semantically-rich “[Knowledge Assets](dkg-key-concepts.md#knowledge-assets)” — structured containers of machine-readable data (e.g., RDF-based graphs) that are discoverable, verifiable, and owned by their creators.\
\
The DKG enables AI-agents and applications to query, connect, and build upon distributed knowledge while preserving provenance and trust through blockchain-anchored proof systems.
-The DKG Network is comprised of network nodes, running on different servers and devices. **There are two primary node types that enable the network’s operation**. The first is **the DKG Core Node**, which hosts the public DKG, persistently stores and serves knowledge assets, participates in random-sampling proofs and token incentives, and requires a minimum stake (e.g., 50,000 TRAC) to participate.\
+The DKG Network comprises network nodes running on different servers and devices. **There are** [**two primary node types**](dkg-knowledge-hub/learn-more/introduction/edge-vs.-core-node-rules-and-token-thresholds.md) **that enable the network’s operation**. The first is **the DKG Core Node**, which hosts the public DKG, persistently stores and serves knowledge assets, participates in random-sampling proofs and token incentives, and requires a minimum stake (e.g., 50,000 TRAC) to participate.\
\
-The second is the **DKG Edge Node**, which runs on devices at the “edge” (e.g., laptops, phones, IoT, and even servers, if deployed that way) and enables local knowledge processing, private-graph handling, and integration with AI-pipelines (via APIs like dRAG), allowing owners to retain control of their data while still contributing to the global DKG.
+The second is the [**DKG Edge Node**](getting-started/decentralized-knowledge-graph-dkg.md), which runs on devices at the “edge” (e.g., laptops, phones, IoT, and even servers, if deployed that way) and enables local knowledge processing, private-graph handling, and integration with AI-pipelines (via APIs like dRAG), allowing owners to retain control of their data while still contributing to the global DKG.
-Together, Core and Edge Nodes form the network and exchange knowledge, facilitated by the blockchain. They share the same codebas,e however, so **it is possible to turn a DKG Edge Node into a DKG Core node (more on that later in the docs)**.
+Together, Core and Edge Nodes form the network and exchange knowledge, facilitated by the blockchain. However, they share the same codebase, so **it is possible to turn a DKG Edge Node into a DKG Core Node (more on that** [**here**](dkg-knowledge-hub/learn-more/introduction/edge-vs.-core-node-rules-and-token-thresholds.md)**)**.
## Knowledge Assets
@@ -30,8 +30,6 @@ More precisely, a Knowledge Asset is a web resource identified by a unique Unifo
* **Uniform Asset Locator**: Globally unique URI with assigned ownership using blockchain accounts, implemented as a non-fungible token (NFT) on the blockchain.
* **Derivable vector embeddings**: These facilitate the neuro-symbolic features - such as link prediction, entity prediction, similarity search, and others.
-
-
Knowledge content can be observed as a time series of knowledge content states or **assertions**. Each assertion can be independently verified for integrity by the verifier recomputing the cryptographic fingerprint and comparing if the computed result matches the corresponding blockchain fingerprint record.
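The verification flow just described can be sketched in a few lines. This is an illustration only: the DKG derives assertion fingerprints from canonicalized RDF, not from a plain SHA-256 over raw bytes as done here, so every detail below is an assumed simplification showing only the recompute-and-compare step:

```python
# Illustrative sketch of assertion integrity checking: recompute a fingerprint
# of the knowledge content and compare it with the blockchain-anchored record.
# NOTE: simplified -- the real DKG canonicalizes RDF before hashing; a plain
# SHA-256 over bytes is used here purely to demonstrate the comparison.
import hashlib

def fingerprint(assertion_bytes: bytes) -> str:
    """Cryptographic fingerprint of an assertion's content (simplified)."""
    return hashlib.sha256(assertion_bytes).hexdigest()

def verify_assertion(assertion_bytes: bytes, anchored_fingerprint: str) -> bool:
    """True if the recomputed fingerprint matches the anchored record."""
    return fingerprint(assertion_bytes) == anchored_fingerprint

data = b'{"@id": "urn:example", "schema:name": "Example"}'
anchored = fingerprint(data)  # what publishing would have anchored on-chain
print(verify_assertion(data, anchored))                # True
print(verify_assertion(data + b"tampered", anchored))  # False
```

Any change to the content, however small, produces a different fingerprint, so a mismatch immediately reveals tampering.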
@@ -48,7 +46,7 @@ Similar to distributed databases, the OriginTrail DKG applies replication mechan
## What is a UAL?
-Uniform Asset Locators (UALs) are ownable identifiers on the DKG, similar to URLs in the traditional web. The UALs follow the DID URL specification and are used to identify and locate a specific Knowledge Asset within the OriginTrail DKG.
+Uniform Asset Locators (UALs) are ownable identifiers on the DKG, similar to URLs in the traditional web. The UALs follow the DID URL specification and are used to identify and locate a specific Knowledge Asset within the OriginTrail DKG.
UAL consists of 5 parts:
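The structure can be illustrated with a small parser. Everything below is an assumption for illustration: the UAL is made up, and the component names follow the `did:dkg:<blockchain>:<chainId>/<contract>/<tokenId>` shape implied by the DID URL specification mentioned above — consult the part list for the authoritative breakdown:

```python
# Sketch: split a hypothetical UAL into components. The UAL below is a made-up
# example following an assumed did:dkg:<blockchain>:<chainId>/<contract>/<tokenId>
# layout; it is not an existing Knowledge Asset.

def parse_ual(ual: str) -> dict:
    """Naive UAL splitter for illustration; no validation is performed."""
    scheme, method, rest = ual.split(":", 2)        # "did", "dkg", remainder
    blockchain, contract, token_id = rest.split("/", 2)
    return {"scheme": scheme, "method": method, "blockchain": blockchain,
            "contract": contract, "token_id": token_id}

ual = "did:dkg:otp:2043/0x5cac41237127f94c2d21dae0b14bfefa99880630/318322"
parsed = parse_ual(ual)
print(parsed["method"])      # dkg
print(parsed["blockchain"])  # otp:2043
print(parsed["token_id"])    # 318322
```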
@@ -82,22 +80,22 @@ The Trace token (TRAC) is the utility token that powers the OriginTrail Decentra
## Decentralized Retrieval Augmented Generation
-Patrick Lewis coined the term Retrieval-Augmented Generation (RAG) in a [2020 paper](https://arxiv.org/pdf/2005.11401.pdf). It is a technique for enhancing the accuracy and reliability of GenAI models with facts fetched from external sources. This allows artificial intelligence (AI) solutions to dynamically fetch relevant information before the generation process, enhancing the accuracy of responses by limiting the generation to re-working the retrieved inputs. \
+Patrick Lewis coined the term Retrieval-Augmented Generation (RAG) in a [2020 paper](https://arxiv.org/pdf/2005.11401.pdf). It is a technique for enhancing the accuracy and reliability of GenAI models with facts fetched from external sources. This allows artificial intelligence (AI) solutions to dynamically fetch relevant information before the generation process, enhancing the accuracy of responses by limiting the generation to re-working the retrieved inputs.\
\
**Decentralized Retrieval Augmented Generation (dRAG) advances the model by organizing external sources in a DKG with verifiable sources made available for AI models to use.** The framework enables a hybrid AI system that brings together neural (e.g., LLMs) and symbolic (e.g., Knowledge Graph) methodologies. Contrary to using a solely neural AI approach based on vector embedding representations, a symbolic AI approach enhances it with the strength of Knowledge Graphs by introducing a basis in symbolic representations.
-dRAG is, therefore, a framework that allows AI solutions to tap into the strengths of both paradigms:
+dRAG is, therefore, a framework that allows AI solutions to tap into the strengths of both paradigms:
-* The powerful learning and generalization capabilities of neural networks, and
-* The precise, rule-based processing of symbolic AI.
+* The powerful learning and generalization capabilities of neural networks, and
+* The precise, rule-based processing of symbolic AI.
It operates on two core components:
-(1) the DKG paranets and
+(1) the DKG paranets and
-(2) AI models.
+(2) AI models.
-The dRAG applications framework is entirely compatible with the existing techniques, tools, and RAG frameworks and supports all major data formats.
+The dRAG applications framework is entirely compatible with the existing techniques, tools, and RAG frameworks and supports all major data formats.
## Knowledge mining
@@ -125,6 +123,6 @@ If you are interested in learning more about NFTs, you can find out more [here](
The next building block of the DKG is **AI para-networks** or **paranets**.
-**AI para-networks** or **paranets** are autonomously operated structures in the DKG, owned by their community as a paranet operator. In paranets, we find **assemblies of Knowledge Assets** driving use cases with associated **paranet-specific AI services** and an **incentivization model** to reward knowledge miners fueling its growth.
+**AI para-networks** or **paranets** are autonomously operated structures in the DKG, owned by their community as a paranet operator. In paranets, we find **assemblies of Knowledge Assets** driving use cases with associated **paranet-specific AI services** and an **incentivization model** to reward knowledge miners fueling its growth.
-**To see the DKG in action, continue to the** [**Installation section**](getting-started/decentralized-knowle-dge-graph-dkg.md)**.**
+**To see the DKG in action, continue to the** [**Installation section**](getting-started/decentralized-knowledge-graph-dkg.md)**.**
diff --git a/docs/dkg-knowledge-hub/how-tos-and-tutorials/README.md b/docs/dkg-knowledge-hub/how-tos-and-tutorials/README.md
index 4d9f0fcf..fa3f13e3 100644
--- a/docs/dkg-knowledge-hub/how-tos-and-tutorials/README.md
+++ b/docs/dkg-knowledge-hub/how-tos-and-tutorials/README.md
@@ -8,6 +8,6 @@ description: >-
#### Pages in this section
-* [**DKG V8.1.X Update Guidebook**](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/dkg-knowledge-hub/how-tos-and-tutorials/dkg-v8.1.x-update-guidebook) – A detailed walkthrough for updating your DKG Node to the latest release.
-* [**Bridging to Moonbeam**](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/dkg-knowledge-hub/how-tos-and-tutorials/bridging-to-moonbeam) – How to connect OriginTrail components to Moonbeam for multi-chain deployments.
-* [**Builder tutorials**](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/dkg-knowledge-hub/how-tos-and-tutorials/tutorials) – Practical tutorials and examples for publishing data, creating Knowledge Assets, and integrating with applications.
+* [**DKG V8.1.X Update Guidebook**](dkg-v8.1.x-update-guidebook.md) – A detailed walkthrough for updating your DKG Node to the latest release.
+* [**Bridging to Moonbeam**](bridging-to-moonbeam.md) – How to connect OriginTrail components to Moonbeam for multi-chain deployments.
+* [**Builder tutorials**](tutorials.md) – Practical tutorials and examples for publishing data, creating Knowledge Assets, and integrating with applications.
diff --git a/docs/dkg-knowledge-hub/learn-more/connected-blockchains/neuroweb.md b/docs/dkg-knowledge-hub/learn-more/connected-blockchains/neuroweb.md
index 60fba2cc..ad459426 100644
--- a/docs/dkg-knowledge-hub/learn-more/connected-blockchains/neuroweb.md
+++ b/docs/dkg-knowledge-hub/learn-more/connected-blockchains/neuroweb.md
@@ -14,9 +14,10 @@ More information on NEURO can be found in the [official NeuroWeb documentation](
## Bridging TRAC to NeuroWeb
-To use TRAC tokens on NeuroWeb for powering your nodes, staking, or other activities, you need to bridge TRAC to NeuroWeb.
+To use TRAC tokens on NeuroWeb for powering your nodes, staking, or other activities, you need to bridge TRAC to NeuroWeb.
-You can transfer TRAC tokens from Ethereum to NeuroWeb and vice versa via [Snowbridge](https://app.snowbridge.network/).
+You can transfer TRAC tokens from Ethereum to NeuroWeb and vice versa via [Snowbridge](https://app.snowbridge.network/). \
+Bridging instructions are available in the [NeuroWeb official documentation](https://docs.neuroweb.ai/ethereum-neuroweb-trac-bridge).
## Adding TRAC on NeuroWeb to your wallet
@@ -24,7 +25,7 @@ Here are step-by-step instructions for adding the TRAC token on NeuroWeb to your
TRAC token address: 0xFfFFFFff00000000000000000000000000000001
-### **Step 1:**
+### **Step 1:**
Open Metamask that is connected to NeuroWeb (connection details available here), then under the Assets tab, click on `Import tokens`.
@@ -39,4 +40,3 @@ On the import tokens page, you need to add the TRAC token contract address. Usua
### Step 3:
After you get all the fields filled with the right information (as in the image above), you click **'Add custom tokens'** and your TRAC balance will be displayed in Metamask.
-
diff --git a/docs/dkg-knowledge-hub/learn-more/decentralized-knowle-dge-graph-dkg.md b/docs/dkg-knowledge-hub/learn-more/decentralized-knowle-dge-graph-dkg.md
index 916a032d..d14230b8 100644
--- a/docs/dkg-knowledge-hub/learn-more/decentralized-knowle-dge-graph-dkg.md
+++ b/docs/dkg-knowledge-hub/learn-more/decentralized-knowle-dge-graph-dkg.md
@@ -18,7 +18,7 @@ The DKG Node is the next evolution of the OriginTrail node software — your int
In the OriginTrail ecosystem, there are two types of roles your DKG Node can fulfill:
* **The DKG Edge Node** is designed to operate at the network "edge" and can be set up on anything from laptops and phones to the cloud. They can enrich and utilize DKG knowledge and can operate agents, but are not responsible for hosting the DKG state. Therefore, DKG Edge Nodes do not require TRAC token stake to run — Edge Nodes can fully publish, query, and verify knowledge, but are not eligible for a share of protocol fees (via delegated stake) as they do not contribute DKG services to the wider network, and therefore do not require a high uptime.
-* **DKG Core Nodes** can do everything Edge Nodes can, but are intended to run as the "network core" — they are running the DKG and maintain its state, which requires high uptime. A DKG Core Node, therefore, needs a minimum of 50,000 TRAC staked as an economic guarantee, which can be "sponsored" by any ecosystem stakeholder willing to delegate TRAC to it. Once a DKG Core Node is set up and contributing to the network, it earns network publishing fees based on its contribution to the network.
+* **DKG Core Nodes** can do everything Edge Nodes can, but are intended to run as the "network core" — they are running the DKG and maintain its state, which requires high uptime. A DKG Core Node, therefore, needs a minimum of 50,000 TRAC staked as an economic guarantee, which can be "sponsored" by any ecosystem stakeholder willing to delegate TRAC to it. Once a DKG Core Node is set up and contributing to the network, it earns network publishing fees based on its contribution to the network.
@@ -31,7 +31,7 @@ Running a DKG Node is designed to be accessible — but like any powerful techno
* **AI agent concepts** – Knowing how agents interact with external data (like the DKG) will help you design better applications.
* **Basic terminal & server skills** – Installing and managing a node requires comfort with the command line and deploying services on a VPS.
-We recommend exploring our [introductory resources or tutorials if any of these areas are new to you](https://app.gitbook.com/o/-McnF-Jcg4utndKcdeko/s/-McnEkhdd7JlySeckfHM/~/changes/408/dkg-knowledge-hub). And remember — you’re not alone. The OriginTrail Discord community is active and welcoming, with dedicated channels where you can ask questions, troubleshoot issues, and share ideas as you learn.
+We recommend exploring our [introductory resources or tutorials if any of these areas are new to you](/broken/pages/NiPXKCuxKbkBdWfF6pBd). And remember — you’re not alone. The OriginTrail Discord community is active and welcoming, with dedicated channels where you can ask questions, troubleshoot issues, and share ideas as you learn.
***
diff --git a/docs/dkg-knowledge-hub/learn-more/introduction/rules-and-token-thresholds.md b/docs/dkg-knowledge-hub/learn-more/introduction/edge-vs.-core-node-rules-and-token-thresholds.md
similarity index 100%
rename from docs/dkg-knowledge-hub/learn-more/introduction/rules-and-token-thresholds.md
rename to docs/dkg-knowledge-hub/learn-more/introduction/edge-vs.-core-node-rules-and-token-thresholds.md
diff --git a/docs/dkg-knowledge-hub/learn-more/previous-updates/staking-threshold-update-and-outstanding-network-rewards-release.md b/docs/dkg-knowledge-hub/learn-more/previous-updates/staking-threshold-update-and-outstanding-network-rewards-release.md
new file mode 100644
index 00000000..c7347902
--- /dev/null
+++ b/docs/dkg-knowledge-hub/learn-more/previous-updates/staking-threshold-update-and-outstanding-network-rewards-release.md
@@ -0,0 +1,60 @@
+# Staking cap & outstanding network rewards release
+
+**TL;DR**
+
+* We’re **increasing the staking cap from 5,000,000 $TRAC to 10,000,000 $TRAC to accommodate more stake delegations** on the best-performing DKG nodes.
+* To ensure staking doesn’t outweigh other performance drivers (e.g., node publishing factor), we will **publish an RFC with an updated rewards formula before the updated staking cap goes live**. The formula update is planned for the **end of the next epoch (around February 10)**.
+* With the last epoch of previously allocated V6 rewards expiring on January 9th (today), the conditions to begin **releasing outstanding network rewards are met**. The rewards deployment is scheduled for **the end of epoch 13 (around February 10)**.
+
+#### What’s changing and why
+
+As the network continues to grow, we’re seeing increased delegation demand—especially toward the best-performing DKG nodes. To better accommodate this and reduce delegation bottlenecks, we’re updating the staking parameters used for delegations.
+
+**1) Staking cap increases to 10,000,000 $TRAC**
+
+To support more stake delegations on top-performing nodes, the staking cap on the DKG nodes will increase from 5,000,000 $TRAC to 10,000,000 $TRAC.
+
+This change is intended to make it easier for delegators to stake with high-performing nodes without hitting the previous threshold as quickly.
+
+**2) Rewards formula update will be introduced via RFC**
+
+We also want to maintain a balanced incentive structure—where staking is important, but does not overshadow other network performance factors, such as the node publishing factor.
+
+To retain that balance:
+
+* We will introduce a change to the rewards formula via a public RFC (Request for Comments) before implementation.
+* The goal is transparency and feedback prior to rollout.
+* The formula change is planned to be implemented at the end of the next epoch (around February 10).
+
+**3) Outstanding network rewards will be released after a snapshot**
+
+There are outstanding network rewards to be released. To do this cleanly and fairly:
+
+* After the current epoch ends on January 9, we will take a network snapshot.
+* That snapshot will be used to implement the distribution.
+* The outstanding network rewards will be released at the end of the next epoch (around February 10).
+
+
+
+#### Key dates and timeline
+
+* **January 9 - Epoch 12 ends**
+ * A network snapshot will be taken at the end of the epoch.
+* **\~February 10 (at end of epoch 13)**
+ * Outstanding network rewards released
+ * Formula update implemented (after the RFC is published and reviewed)
+ * Staking cap increased on DKG nodes from 5M to 10M $TRAC.
+
+_(All “around” dates are aligned to epoch timing.)_
+
+#### What this means for delegators
+
+* More capacity to delegate to the best-performing DKG nodes, due to the higher cap (10M $TRAC).
+* No immediate action is required purely because of this announcement — your delegation remains as-is unless you choose to adjust it.
+* If you care about how staking weight vs. publishing/performance factors are balanced, you’ll be able to review and comment on the upcoming RFC before the formula change is implemented.
+
+#### What this means for node operators
+
+* The network is **reinforcing a performance-based model: stake matters, and publishing/performance factors continue to matter**.
+* Please **keep an eye out for the RFC** outlining the formula update, and be ready to provide feedback.
+* The snapshot and rewards release timeline is now clearly defined: snapshot after Jan 9, distribution around Feb 10.
diff --git a/docs/dkg-knowledge-hub/learn-more/readme/README.md b/docs/dkg-knowledge-hub/learn-more/readme/README.md
index 1f3820b7..6609dd64 100644
--- a/docs/dkg-knowledge-hub/learn-more/readme/README.md
+++ b/docs/dkg-knowledge-hub/learn-more/readme/README.md
@@ -14,12 +14,12 @@ coverY: 0
>
> Hari Seldon, **Foundation series by Isaac Asimov (1951)**
-OriginTrail is building a verifiable knowledge layer for AI, where knowledge is traceable, memory is decentralized, and humans remain in control. It aims to achieve this by organizing all human knowledge in a **Decentralized Knowledge Graph (DKG)** through a **collective neuro-symbolic AI** approach.
+OriginTrail is building a verifiable knowledge layer for AI, where knowledge is traceable, memory is decentralized, and humans remain in control. It aims to achieve this by organizing all human knowledge in a **Decentralized Knowledge Graph (DKG)** through a **collective neuro-symbolic AI** approach.
-A collective neuro-symbolic AI combines structured and connected information from symbolic AI (DKG) with the creativity of neural AI technologies (LLMs), building a **robust decentralized AI infrastructure.**
+A collective neuro-symbolic AI combines structured and connected information from symbolic AI (DKG) with the creativity of neural AI technologies (LLMs), building a **robust decentralized AI infrastructure.**
-This provides a powerful substrate for **trusted, human-centric AI solutions** to tackle some of humanity's most pressing challenges. It also **drives AI agents’ autonomous memories and trusted intents**, as both AI agents and robots become potent enough to act on behalf of humans.
+This provides a powerful substrate for **trusted, human-centric AI solutions** to tackle some of humanity's most pressing challenges. It also **drives AI agents’ autonomous memories and trusted intents**, as both AI agents and robots become potent enough to act on behalf of humans.
### Choose your learning path
-
diff --git a/docs/dkg-knowledge-hub/learn-more/readme/decentralized-knowle-dge-graph-dkg.md b/docs/dkg-knowledge-hub/learn-more/readme/decentralized-knowledge-graph-dkg.md
similarity index 97%
rename from docs/dkg-knowledge-hub/learn-more/readme/decentralized-knowle-dge-graph-dkg.md
rename to docs/dkg-knowledge-hub/learn-more/readme/decentralized-knowledge-graph-dkg.md
index 3b5af11f..054e641f 100644
--- a/docs/dkg-knowledge-hub/learn-more/readme/decentralized-knowle-dge-graph-dkg.md
+++ b/docs/dkg-knowledge-hub/learn-more/readme/decentralized-knowledge-graph-dkg.md
@@ -16,12 +16,12 @@ Modern AI applications increasingly demand:
The DKG meets these needs by uniting the **trust layer of blockchains**, the **semantic expressiveness of knowledge graphs (symbolic AI),** and **state-of-the-art generative AI models (neural AI).**
-### Why use Blockchain?
+### Why use blockchain?
Blockchains enable:
* **Trustless verification:** Every claim is anchored to a consensus-verified state
-* **Decentralized Computation**: Blockchains enable consensus-based execution of code (e.g., via smart contracts) across decentralized networks, with no single point of control, making them ideal for building decentralized protocols like OriginTrail.
+* **Decentralized computation**: Blockchains enable consensus-based execution of code (e.g., via smart contracts) across decentralized networks, with no single point of control, making them ideal for building decentralized protocols like OriginTrail.
* **Data integrity and auditability:** Through cryptographic hashing and timestamping of data records on a blockchain, making it possible to verifiably track the origin of records and their update trail
* **Tokenization:** Enabling decentralized participation and support of the system through the TRAC token, as well as the ability to tokenize data through Knowledge Assets
@@ -69,7 +69,7 @@ In short, the DKG is an essential infrastructure layer for building trusted, int
## System architecture
-OriginTrail synergizes blockchains, knowledge graphs (symbolic AI), and LLMs (neural AI) in a 3-layer architecture, where each layer is implemented as a decentralized network.
+OriginTrail synergizes blockchains, knowledge graphs (symbolic AI), and LLMs (neural AI) in a 3-layer architecture, where each layer is implemented as a decentralized network.
The **trust layer leverages blockchains as trust networks,** established to enable reliable computation through **decentralized consensus**, operating as a global, dependable computer. It is used to track the origin of knowledge, its provenance, and integrity, and enable decentralized economic interactions in the system.
diff --git a/docs/dkg-knowledge-hub/learn-more/readme/dkg-key-concepts.md b/docs/dkg-knowledge-hub/learn-more/readme/dkg-key-concepts.md
index e999148c..5067636b 100644
--- a/docs/dkg-knowledge-hub/learn-more/readme/dkg-key-concepts.md
+++ b/docs/dkg-knowledge-hub/learn-more/readme/dkg-key-concepts.md
@@ -20,8 +20,6 @@ More precisely, a Knowledge Asset is a web resource identified by a unique Unifo
* **Uniform Asset Locator**: Globally unique URI with assigned ownership using blockchain accounts, implemented as a non-fungible token (NFT) on the blockchain.
* **Derivable vector embeddings**: These facilitate the neuro-symbolic features - such as link prediction, entity prediction, similarity search, and others.
-
-
Knowledge content can be observed as a time series of knowledge content states or **assertions**. Each assertion can be independently verified for integrity, by recomputing the cryptographic fingerprint by the verifier and comparing if the computed result matches with the corresponding blockchain fingerprint record.
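The verification flow above (the verifier recomputes the cryptographic fingerprint and compares it with the blockchain record) can be sketched in a few lines of Python. This is an illustrative simplification only: the actual DKG uses RDF canonicalization and Merkle-root-based fingerprints rather than plain JSON hashing, and the names below are hypothetical.

```python
import hashlib
import json

def fingerprint(assertion: dict) -> str:
    # Canonicalize the assertion (sorted keys, fixed separators) so the
    # same content always yields the same bytes, then hash with SHA-256.
    canonical = json.dumps(assertion, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def verify(assertion: dict, onchain_fingerprint: str) -> bool:
    # Recompute the fingerprint locally and compare it with the
    # fingerprint recorded on the blockchain.
    return fingerprint(assertion) == onchain_fingerprint

claim = {"@id": "urn:example:product-1", "name": "Organic honey"}
recorded = fingerprint(claim)       # stand-in for the on-chain record
assert verify(claim, recorded)      # untampered content verifies
claim["name"] = "Regular honey"
assert not verify(claim, recorded)  # any change breaks verification
```

Because the fingerprint is anchored on chain, any party can run this check independently, without trusting the party that published the assertion.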
@@ -49,22 +47,22 @@ The Trace token (TRAC) is the utility token that powers the OriginTrail Decentra
## Decentralized Retrieval Augmented Generation
-Patrick Lewis coined the term Retrieval-Augmented Generation (RAG) in a [2020 paper](https://arxiv.org/pdf/2005.11401.pdf). It is a technique for enhancing the accuracy and reliability of GenAI models with facts fetched from external sources. This allows artificial intelligence (AI) solutions to dynamically fetch relevant information before the generation process, enhancing the accuracy of responses by limiting the generation to re-working the retrieved inputs. \
+Patrick Lewis coined the term Retrieval-Augmented Generation (RAG) in a [2020 paper](https://arxiv.org/pdf/2005.11401.pdf). It is a technique for enhancing the accuracy and reliability of GenAI models with facts fetched from external sources. This allows artificial intelligence (AI) solutions to dynamically fetch relevant information before the generation process, enhancing the accuracy of responses by limiting the generation to re-working the retrieved inputs.\
\
**Decentralized Retrieval Augmented Generation (dRAG) advances the model by organizing external sources in a DKG with verifiable sources made available for AI models to use.** The framework enables a hybrid AI system that brings together neural (e.g., LLMs) and symbolic (e.g., Knowledge Graph) methodologies. Contrary to using a solely neural AI approach based on vector embedding representations, a symbolic AI approach enhances it with the strength of Knowledge Graphs by introducing a basis in symbolic representations.
-dRAG is, therefore, a framework that allows AI solutions to tap into the strengths of both paradigms:
+dRAG is, therefore, a framework that allows AI solutions to tap into the strengths of both paradigms:
-* The powerful learning and generalization capabilities of neural networks, and
-* The precise, rule-based processing of symbolic AI.
+* The powerful learning and generalization capabilities of neural networks, and
+* The precise, rule-based processing of symbolic AI.
It operates on two core components:
-(1) the DKG paranets and
+(1) the DKG paranets and
-(2) AI models.
+(2) AI models.
-The dRAG applications framework is entirely compatible with the existing techniques, tools, and RAG frameworks and supports all major data formats.
+The dRAG applications framework is entirely compatible with the existing techniques, tools, and RAG frameworks and supports all major data formats.
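As a rough illustration of the retrieve-then-generate flow underlying (d)RAG, the sketch below uses a hypothetical in-memory triple list as a stand-in for the DKG and a plain string as a stand-in for the model prompt; real dRAG retrieval would query the DKG (e.g., via SPARQL) rather than filter a Python list.

```python
# Hypothetical facts standing in for Knowledge Assets in the DKG.
triples = [
    ("OriginTrail", "implements", "Decentralized Knowledge Graph"),
    ("Knowledge Asset", "identifiedBy", "UAL"),
    ("TRAC", "powers", "the DKG"),
]

def retrieve(question: str):
    # Symbolic retrieval step: keep triples whose terms appear in the
    # question (a toy substitute for a real graph query).
    words = question.lower().split()
    return [t for t in triples if any(term.lower() in words for term in t)]

def build_prompt(question: str) -> str:
    # Generation step input: retrieved facts constrain the model so it
    # reworks verifiable inputs instead of answering from memory alone.
    facts = "\n".join(f"{s} {p} {o}." for s, p, o in retrieve(question))
    return f"Answer using only these facts:\n{facts}\n\nQuestion: {question}"

prompt = build_prompt("What powers the DKG?")
```

The key design point is the separation of concerns: the symbolic side (the graph) supplies verifiable facts, while the neural side (the LLM) supplies language and generalization.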
## Knowledge mining
@@ -90,7 +88,7 @@ If you are interested in learning more about NFTs, you can find out more [here](
## What is a UAL?
-Uniform Asset Locators (UALs) are ownable identifiers of the DKG, similar to URLs in the traditional web. The UALs follow the DID URL specification and are used to identify and locate a specific Knowledge Asset within the OriginTrail DKG.
+Uniform Asset Locators (UALs) are ownable identifiers of the DKG, similar to URLs in the traditional web. The UALs follow the DID URL specification and are used to identify and locate a specific Knowledge Asset within the OriginTrail DKG.
UAL consists of 5 parts:
@@ -115,6 +113,6 @@ More information on DID URLs can be found [here](https://www.w3.org/TR/did-core/
The next building block of the DKG is **AI para-networks** or **paranets**.
-**AI para-networks** or **paranets** are autonomously operated structures in the DKG, owned by their community as a paranet operator. In paranets, we find **assemblies of Knowledge Assets** driving use cases with associated **paranet-specific AI services** and an **incentivization model** to reward knowledge miners fueling its growth.
+**AI para-networks** or **paranets** are autonomously operated structures in the DKG, owned by their community as a paranet operator. In paranets, we find **assemblies of Knowledge Assets** driving use cases with associated **paranet-specific AI services** and an **incentivization model** to reward knowledge miners fueling its growth.
-**To see the DKG in action, continue to the** [**Installation guide**](../../../getting-started/decentralized-knowle-dge-graph-dkg.md)**.**
+**To see the DKG in action, continue to the** [**Installation guide**](../../../getting-started/decentralized-knowledge-graph-dkg.md)**.**
diff --git a/docs/dkg-knowledge-hub/learn-more/readme/kg.md b/docs/dkg-knowledge-hub/learn-more/readme/kg.md
index 9c8c32c3..3f5074e0 100644
--- a/docs/dkg-knowledge-hub/learn-more/readme/kg.md
+++ b/docs/dkg-knowledge-hub/learn-more/readme/kg.md
@@ -27,8 +27,8 @@ As humans, we can quickly understand that this data is related to the same **thi
### What is linked data and the Semantic Web?
-> _"The Semantic Web isn't just about putting data on the web. It is about making links, so that a person or machine can explore the web of data. With linked data, when you have some of it, you can find other, related, data."_ \
-> _- Tim Berners-Lee, the father of the World Wide Web and Semantic Web_
+> _"The Semantic Web isn't just about putting data on the web. It is about making links, so that a person or machine can explore the web of data. With linked data, when you have some of it, you can find other, related, data."_\
+> _- Tim Berners-Lee, the father of the World Wide Web and Semantic Web_
The core idea behind linked data is to represent all **things** with **relationships** between them in a common graph. Linked data is built on primitives called "**triples",** which connect a **subject entity** with an **object entity** via a **relationship**.
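For illustration, a triple can be modeled as a simple (subject, relationship, object) tuple; the entity names below are hypothetical, not taken from the example datasets.

```python
# A tiny graph of triples: two facts about a company, one about a city.
triples = [
    ("ACME Ltd", "locatedIn", "London"),
    ("ACME Ltd", "produces", "Product X"),
    ("London", "locatedIn", "United Kingdom"),
]

def related(entity):
    # Follow every relationship that starts at `entity`: with linked
    # data, having one piece of data lets you find other, related data.
    return [(p, o) for s, p, o in triples if s == entity]

assert ("locatedIn", "London") in related("ACME Ltd")
```

Starting from "ACME Ltd" you reach "London", and from "London" you reach "United Kingdom", even though no single record contains all three: the links do the integration.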
@@ -42,15 +42,13 @@ Integrating the two above-mentioned example datasets according to the principles

-
-
-Having such a "semantic network" of data, we inherently add context and enable easy extensions. The semantic graph can be easily queried in many ways and enables growing a body of _knowledge_ around things rather than keeping "tables of strings".
+Having such a "semantic network" of data, we inherently add context and enable easy extensions. The semantic graph can be easily queried in many ways and enables growing a body of _knowledge_ around things rather than keeping "tables of strings".
In the coming sections, we will show you how to use the OriginTrail Decentralized Knowledge Graph (DKG) for data discovery and querying. However, let's first explain what a knowledge graph is.
### What is a knowledge graph?
-There are many definitions of knowledge graphs (KGs), all slightly different. Without emphasizing precision, all of them describe a knowledge graph as a network of entities — physical & digital objects, events, or concepts — illustrating the relationship between them (aka a semantic network). KGs are used by major companies such as [Amazon](http://lunadong.com/talks/PG.pdf), [Google](https://en.wikipedia.org/wiki/Google_Knowledge_Graph), [Uber](https://www.youtube.com/watch?v=r3yMSl5NB_Q), [IBM](https://www.ibm.com/cloud/learn/knowledge-graph), etc., for various applications: search, data integration, knowledge reasoning, recommendation engines, analytics, machine learning, and AI, etc.
+There are many definitions of knowledge graphs (KGs), all slightly different. Without emphasizing precision, all of them describe a knowledge graph as a network of entities — physical & digital objects, events, or concepts — illustrating the relationship between them (aka a semantic network). KGs are used by major companies such as [Amazon](http://lunadong.com/talks/PG.pdf), [Google](https://en.wikipedia.org/wiki/Google_Knowledge_Graph), [Uber](https://www.youtube.com/watch?v=r3yMSl5NB_Q), and [IBM](https://www.ibm.com/cloud/learn/knowledge-graph) for various applications: search, data integration, knowledge reasoning, recommendation engines, analytics, machine learning, AI, etc.
Key characteristics of knowledge graphs are:
@@ -62,7 +60,6 @@ For the moment, we restrict this document only to a high-level introduction and

-**Knowledge graphs are commonly deployed within the domain of one organization and are designed to capture knowledge from various sources both from within and outside of the organization.** These centralized knowledge graphs generate huge value for their owners, yet a decentralized globally shared knowledge graph brings orders of magnitude higher value to everyone participating.
-
-We present the **OriginTrail Decentralized Knowledge Graph (DKG)** as the first permissionless, global, open decentralized knowledge graph. Learn more about the [OriginTrail DKG here](decentralized-knowle-dge-graph-dkg.md).
+**Knowledge graphs are commonly deployed within the domain of one organization and are designed to capture knowledge from various sources both from within and outside of the organization.** These centralized knowledge graphs generate huge value for their owners, yet a decentralized globally shared knowledge graph brings orders of magnitude higher value to everyone participating.
+We present the **OriginTrail Decentralized Knowledge Graph (DKG)** as the first permissionless, global, open decentralized knowledge graph. Learn more about the [OriginTrail DKG here](decentralized-knowledge-graph-dkg.md).
diff --git a/docs/dkg-knowledge-hub/learn-more/readme/usdtrac-token.md b/docs/dkg-knowledge-hub/learn-more/readme/usdtrac-token.md
index 7f61eb2b..540b034c 100644
--- a/docs/dkg-knowledge-hub/learn-more/readme/usdtrac-token.md
+++ b/docs/dkg-knowledge-hub/learn-more/readme/usdtrac-token.md
@@ -49,6 +49,6 @@ Your node needs **both** to operate — TRAC to publish verifiable data and the
Now that you understand what a DKG Node is and how it’s powered by $TRAC, you’re ready to take action.
-If you’d like to start building right away, jump ahead to the “[Installation](../../../getting-started/decentralized-knowle-dge-graph-dkg.md)” section — where you’ll set up, install, and configure your own DKG Node to connect with AI models.
+If you’d like to start building right away, jump ahead to the “[Installation](../../../getting-started/decentralized-knowledge-graph-dkg.md)” section — where you’ll set up, install, and configure your own DKG Node to connect with AI models.
Or, if you want to learn more about tokenomics first, continue to “[Delegated staking](../../../contribute-to-the-dkg/delegated-staking/)” to explore how staking works across the OriginTrail ecosystem and how it powers trust, security, and participation.
diff --git a/docs/dkg-knowledge-hub/useful-resources/test-token-faucet.md b/docs/dkg-knowledge-hub/useful-resources/test-token-faucet.md
index 8b59fe52..ddb5e643 100644
--- a/docs/dkg-knowledge-hub/useful-resources/test-token-faucet.md
+++ b/docs/dkg-knowledge-hub/useful-resources/test-token-faucet.md
@@ -6,7 +6,7 @@ description: Learn how to get testnet tokens from the OriginTrail Discord faucet
The OriginTrail Decentralized Knowledge Graph (DKG) provides a testing environment on the NeuroWeb testnet, Gnosis Chiado, and Base Sepolia blockchains. To perform various blockchain operations on these testnets, users need both **test TRAC on the chosen network** and the **test utility token** of their chosen blockchain for gas.
-The **OriginTrail faucet service**, which provides test tokens, is deployed on the [**OriginTrail Discord server**](https://discord.com/invite/WaeSb5Mxj6) and located in the [**#faucet-bot**](https://discord.com/invite/WaeSb5Mxj6) channel.
+The **OriginTrail faucet service**, which provides test tokens, is deployed on the [**OriginTrail Discord server**](https://discord.gg/D9n4TeTaKG) and located in the [**#faucet-bot**](https://discord.gg/c8NSEmND) channel.
To view the available faucet options, run the following command in the chat of the **#faucet-bot** channel:
@@ -31,4 +31,3 @@ Currently, depending on your requirements, you can request tokens for the follow
{% endhint %}
If you experience any issues with the Faucet Bot, please tag the core developers in one of the Discord channels.
-
diff --git a/docs/getting-started/basic-knowledge-asset-operations.md b/docs/getting-started/basic-knowledge-asset-operations.md
index 4c7d8278..942b4b2a 100644
--- a/docs/getting-started/basic-knowledge-asset-operations.md
+++ b/docs/getting-started/basic-knowledge-asset-operations.md
@@ -9,7 +9,7 @@ description: >-
## **Creating and retrieving your first Knowledge Asset**
-This simple exercise demonstrates the basic end-to-end flow of the DKG - from AI-assisted publishing to knowledge retrieval (something like "Hello world"). I
+This simple exercise demonstrates the basic end-to-end flow of the DKG — from AI-assisted [Knowledge Asset](../dkg-key-concepts.md#knowledge-assets) publishing via the [Edge Node](decentralized-knowledge-graph-dkg.md) to knowledge retrieval (something like "Hello world").
### Create your first Knowledge Assets
diff --git a/docs/getting-started/decentralized-knowle-dge-graph-dkg.md b/docs/getting-started/decentralized-knowledge-graph-dkg.md
similarity index 86%
rename from docs/getting-started/decentralized-knowle-dge-graph-dkg.md
rename to docs/getting-started/decentralized-knowledge-graph-dkg.md
index 85148667..4c055c00 100644
--- a/docs/getting-started/decentralized-knowle-dge-graph-dkg.md
+++ b/docs/getting-started/decentralized-knowledge-graph-dkg.md
@@ -1,7 +1,23 @@
+---
+description: DKG Edge Node
+---
+
# Installation
+The **DKG Edge Node** is your gateway to verifiable AI. It's an intuitive, app-style node that lets you create and interact with verifiable knowledge effortlessly.
+
+With the Edge Node, you can:
+
+* Publish knowledge in the DKG as [Knowledge Assets](../dkg-key-concepts.md#knowledge-assets)
+* Retrieve knowledge from the DKG
+* Build reliable AI applications powered by the Decentralized Knowledge Graph (DKG) with ease through the [DKG Node AI Agent](/broken/pages/i91ic9qprIOpVgGjUy0a).
+
+The DKG Edge Node runs on devices at the “edge” (e.g., laptops, phones, IoT, and even servers, if deployed that way). It enables local knowledge processing, private-graph handling, and integration with AI pipelines (via APIs such as dRAG), allowing owners to retain control of their data while still contributing to the global DKG.
+
{% hint style="info" %}
If you are new to OriginTrail, DKG, knowledge graphs, or blockchains, we highly recommend becoming familiar with the [DKG—Key concepts](../dkg-key-concepts.md) before proceeding.
+
+To understand the difference between the DKG Edge Node and Core Node [check here](../dkg-knowledge-hub/learn-more/introduction/edge-vs.-core-node-rules-and-token-thresholds.md).
{% endhint %}
### What are we installing today?
@@ -12,7 +28,7 @@ To install the **DKG Edge Node**, we will be using the DKG CLI (`dkg-cli`) - a s
### The DKG utilizes blockchain
-The DKG Network utilizes blockchains as a trusted environment for incentivisation and securing data exchanges. It's a multichain network, so DKG Nodes support 3 blockchains, but can currently be deployed on a **single blockchain at a time** (multichain deployment support is on the way).
+The DKG Network utilizes blockchains as a trusted environment for incentivisation and securing data exchanges. It's a multichain network, so DKG Nodes support 3 blockchains, but can currently be deployed on a **single blockchain at a time** (multichain deployment support is on the way).
If you're not too familiar with blockchain technology, and not sure which blockchain to pick to get started with the DKG Node, which one is better for you etc - don't worry, a default blockchain will be chosen for you and you will be able to learn as you go (the DKG Node abstracts a lot of the complexities of blockchain for you). You shouldn't notice a big difference between blockchains while you are in development — this choice matters most when you are ready for your DKG Node deployment to mainnet.
@@ -25,7 +41,7 @@ For now, you need to know the following:
### What do you need for the installation?
* A **macOS** or **Linux** machine with at least 8GB RAM and 20GB storage space (Windows version is on the way)
-* Node.js **v22.20.0** or higher installed
+* NVM and Node.js **v22.20.0** or higher installed
* About 15-30 minutes of your time to complete all the steps
### OK, let's go!
@@ -38,7 +54,7 @@ npm install -g dkg-cli
#### 2. Generate the DKG Node Configuration
-Your DKG Node allows for rich configuration (more on that in the **Configuration** section later), however this setup focuses on a minimal default configuration.
+Your DKG Node allows for rich configuration (more on that in the **Configuration** section later), however this setup focuses on a minimal default configuration.
We recommend setting up your project folder and starting with the default development setup on DKG testnet.
@@ -65,14 +81,20 @@ All DKG node wallets require native blockchain tokens, while the publishing wall
+**MySQL configuration:**
+
+If you have an existing MySQL database already configured in your environment, make sure to pass your root password to the `DB_PASSWORD` parameter in the setup .env file.
+
+If you do not have MySQL installed, the password you pass to this parameter will be set as your root password.
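For reference, a minimal sketch of the relevant .env line; only `DB_PASSWORD` is confirmed above, and the value shown is a placeholder:

```sh
# Root password for MySQL (set, or reused if MySQL already exists)
DB_PASSWORD=your-mysql-root-password
```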
+
#### 3. Funding wallets
-As mentioned previously, your DKG Node requires tokens to be able to create Knowledge Assets.
+As mentioned previously, your DKG Node requires tokens to be able to create Knowledge Assets.
**To get tokens for the DKG testnet, use the** [**testnet token faucet**](../dkg-knowledge-hub/useful-resources/test-token-faucet.md)**.** For DKG Mainnet deployments, we suggest visiting the [TRAC token](https://origintrail.io/technology/trac-token) page to check for its availability.
{% hint style="warning" %}
-Make sure to fund your node keys with tokens before running the `dkg-cli install` command; otherwise, your DKG node might not function correctly.
+Make sure to fund your node keys with tokens before running the `dkg-cli install` command; otherwise, your DKG node might not function correctly.
{% endhint %}
Here's an overview of supported blockchains and the required tokens per key type.
@@ -95,14 +117,14 @@ The installation can take a few minutes. It installs the DKG Node in the same di
#### 5. Configure your DKG Agent
-Run the agent setup script to enable LLM features. You'll be prompted for your LLM provider, API key, model name, and DKG environment (must match your setup-config choice: testnet or mainnet). The agent supports multiple providers; examples are listed below.
+Run the agent setup script to enable LLM features. You'll be prompted for your LLM provider, API key, model name, and DKG environment (must match your setup-config choice: testnet or mainnet).
```sh
cd dkg-node
dkg-cli agent-setup
```
-DKG Node supports various LLM providers. Some examples include:
+DKG Agent supports various LLM providers. Some examples include:
| Provider | API Key Link |
| -------------------------- | ------------------------------------------------------------------------------------ |
@@ -190,7 +212,7 @@ All commands work from any directory and automatically detect your operating sys
A `createUser` is also possible via the `dkg-cli` included to simplify the creation of additional user accounts.
```sh
-cd dkg-node/apps/agent
+cd dkg-node
dkg-cli create-user
# Enter: email, password, permissions (e.g., `mcp llm blob scope123`)
```
@@ -202,7 +224,7 @@ dkg-cli create-user
* Contains sensitive data (wallet keys, passwords, API keys)
* Never commit to version control
-**Services and Ports**\
+**Services and ports**\
The following list provides an overview of which services are running locally and the ports they listen on:
* **8081** — Web UI & API
@@ -214,4 +236,4 @@ The following list provides an overview of which services are running locally an
* 📖 [Documentation](https://docs.origintrail.io/)
* 🐛 [Report issues](https://github.com/OriginTrail/dkg-node-installer/issues)
-* 💬 [Discord community](https://discord.gg/aNpBjf97)
+* 💬 [Discord community](https://discord.com/invite/xCaY7hvNwD)
diff --git a/docs/getting-started/dkg-node-services.md b/docs/getting-started/dkg-node-services.md
index f1f4009c..668244f3 100644
--- a/docs/getting-started/dkg-node-services.md
+++ b/docs/getting-started/dkg-node-services.md
@@ -7,7 +7,7 @@ description: >-
# DKG Node Services
-## Run[^1]ning your DKG Node in development mode
+## Running your DKG Node in development mode
You will be running your DKG Node in **development mode** while building, experimenting, and customizing your DKG Node, before deploying it in production. In this mode, the system automatically reloads on code changes, streams real-time logs, and gives you immediate feedback as you work.
@@ -24,7 +24,7 @@ This will:
* Help you debug and iterate quickly in a local environment.
{% hint style="info" %}
-## Troubleshooting
+### Troubleshooting
If `npm install` fails, try:
@@ -40,7 +40,7 @@ Also confirm your Node.js version is **v22+**.
Once your dev server is up (`npm run dev`), several powerful tools become available through your browser. These interfaces let you **manage, inspect, and debug** every part of your DKG Node.
-### **DKG Node & Agent UI**
+### **DKG Node & Agent UI**
[**http://localhost:8081/**](http://localhost:8081/)
@@ -67,7 +67,7 @@ It allows:
If your DKG Node is the “brain,” the MCP server is the **communication layer** - it’s what lets AI systems talk to your node programmatically.
-### **Swagger UI (API Explorer)**
+### **Swagger UI (API Explorer)**
[**http://localhost:9200/swagger**](http://localhost:9200/swagger)
@@ -94,9 +94,3 @@ It allows you to:
{% hint style="danger" %}
If you’re using the Brave browser, please disable Shields when accessing Drizzle Studio - otherwise you may not be able to view the database records.
{% endhint %}
-
-
-
-
-
-[^1]:
diff --git a/docs/getting-started/interacting-with-your-dkg-agent.md b/docs/getting-started/interacting-with-your-dkg-agent.md
index aeddcfa9..9f63aafe 100644
--- a/docs/getting-started/interacting-with-your-dkg-agent.md
+++ b/docs/getting-started/interacting-with-your-dkg-agent.md
@@ -1,12 +1,12 @@
# Interacting with your DKG Agent
{% hint style="info" %}
-This section assumes you have finished [Installation](decentralized-knowle-dge-graph-dkg.md) and will guide you through trying out the basic DKG Agent that comes bundled with the DKG Node.
+This section assumes you have finished [Installation](decentralized-knowledge-graph-dkg.md) and will guide you through trying out the basic DKG Agent that comes bundled with the DKG Node.
{% endhint %}
Each DKG node includes a **collocated neuro-symbolic AI agent** that combines neural model capabilities (e.g., LLMs) with symbolic reasoning over RDF-based graph data. This enables DKG nodes not only to publish and query semantic knowledge but also to perform knowledge graph reasoning, summarization, and data transformation tasks directly on locally or remotely stored knowledge.
-The **DKG Agent** is built around a modular **plugin system** centered on the **Model Context Protocol (MCP)**. Plugins define how the agent interacts with external tools, APIs, and reasoning systems. A generic DKG Node ships with a base set of plugins for common operations — such as knowledge publishing, retrieval, and validation — **while developers can extend functionality by creating custom plugins**.
+The **DKG Agent** is built around a modular **plugin system** centered on the **Model Context Protocol (MCP)**. Plugins define how the agent interacts with external tools, APIs, and reasoning systems. A generic DKG Node ships with a base set of plugins for common operations — such as knowledge publishing, retrieval, and validation — **while developers can extend functionality by creating custom plugins**.
Each plugin may expose both **MCP endpoints** (for agentic interoperability) and **classic REST/gRPC APIs** (for programmatic access). Example plugin types include ontology-specific retrieval tools (e.g., “social media query” modules), **knowledge-mining pipelines** for crafting Knowledge Assets aligned with domain ontologies, and **reasoning plugins** that apply declarative rule sets to infer new knowledge.
@@ -16,13 +16,13 @@ If you want to jump right into building your custom plugins, head over to the ["
Your DKG Node comes with a built-in agent interface serving two core purposes:
-* **Secure authentication portal** → OAuth 2.1 login system for accessing your DKG Node
+* **Secure authentication portal** → OAuth 2.1 login system for accessing your DKG Node
* **AI agent interface** → Direct chat with your DKG-Node-powered agent
The interface is built with **React Native (Expo)** for cross-platform compatibility, enabling a seamless interaction with your agent and the Decentralized Knowledge Graph (DKG).
{% hint style="info" %}
-If you are following this guide, make sure your [**DKG Node is running**](decentralized-knowle-dge-graph-dkg.md#id-7.-start-the-node), if it’s not already active.
+If you are following this guide, make sure your [**DKG Node is running**](decentralized-knowledge-graph-dkg.md#id-7.-start-the-node), if it’s not already active.
{% endhint %}
@@ -91,4 +91,3 @@ Your DKG Node **uses a** **standard MCP server** (with OAuth 2.1 over HTTPS), so
**Microsoft Copilot Studio**
* Follow [Microsoft’s MCP integration docs](https://learn.microsoft.com/en-us/microsoft-copilot-studio/mcp-add-existing-server-to-agent).
-
diff --git a/docs/graveyard/everything/dkg-edge-node/customize-and-build-with-the-edge-node.md b/docs/graveyard/everything/dkg-edge-node/customize-and-build-with-the-edge-node.md
index 605d1b75..7f5022e8 100644
--- a/docs/graveyard/everything/dkg-edge-node/customize-and-build-with-the-edge-node.md
+++ b/docs/graveyard/everything/dkg-edge-node/customize-and-build-with-the-edge-node.md
@@ -14,10 +14,9 @@ Users can add custom variables to the `UserConfig` table, making them accessible
## Local environment setup with forked services
-To begin customizing and building your own solution using the OriginTrail Edge Node stack, we recommend the following local development setup:\
+To begin customizing and building your own solution using the OriginTrail Edge Node stack, we recommend the following local development setup:\\
-
-1. ### Fork Core Edge Node Repositories
+1. #### Fork Core Edge Node Repositories
In order to fully tailor the Edge Node to your specific use case, it is recommended that you **fork the following components** into your own GitHub account:
@@ -32,20 +31,16 @@ To begin customizing and building your own solution using the OriginTrail Edge N
👉 Fork this if you want to modify business logic, expose new routes, or integrate additional microservices.
4. **Edge Node UI**\
The user-facing interface of the Edge Node.\
- 👉 Fork this to customize branding, UX, workflows, or connect it with your own backend services.\
-
-2. ### Authentication Service (Optional Fork)
- 1. **Edge Node Authentication Service**\
- This handles user sessions and tokens.\
- Recommended to use as-is for most cases to keep things simple and aligned with best practices.\
- 🛠️ Optional: You may fork this if you need:
-
- 1. Custom authentication methods (e.g., biometric login, enterprise SSO)
- 2. Integration with external identity providers
- 3. Custom logic for Verifiable Credential issuance or DID resolution
-
-
-3. ### Forked Repositories Setup
+ 👉 Fork this to customize branding, UX, workflows, or connect it with your own backend services.\\
+2. #### Authentication Service (Optional Fork)
+ 1. **Edge Node Authentication Service**\
+ This handles user sessions and tokens.\
+ Recommended to use as-is for most cases to keep things simple and aligned with best practices.\
+ 🛠️ Optional: You may fork this if you need:
+ 1. Custom authentication methods (e.g., biometric login, enterprise SSO)
+ 2. Integration with external identity providers
+ 3. Custom logic for Verifiable Credential issuance or DID resolution
+3. #### Forked Repositories Setup
Once you’ve successfully forked the core Edge Node repositories and tested the default setup using the official public repos, you’ll need to **clean your local environment** before installing your customized versions.
@@ -58,12 +53,10 @@ To begin customizing and building your own solution using the OriginTrail Edge N
After pruning the default Edge Node setup, your environment will be reset:
1. All previously cloned **Edge Node service repositories** will be deleted
- 2. All **Edge Node databases** will be dropped\
-
+ 2. All **Edge Node databases** will be dropped\\
2. **Switch to Your Forked Repositories**
1. **Open your `.env` file** located at the root of the project.
- 2. Replace the official repository URLs with the links to your **forked repositories.**\
-
+ 2. Replace the official repository URLs with the links to your **forked repositories.**\\
3. **Install Your Custom Edge Node**
1. Run the Edge Node installer script, which will install services based on your forked repos.
2. If your Edge Node is set up on macOS, execute the following script to run your services:\
@@ -108,7 +101,7 @@ A **DAG** defines the execution order of tasks within a **pipeline**, while a **
* Optionally: if the _airflow webserver_ is used, it should also be restarted
* **Unpause** your pipeline\
`airflow dags unpause ${YOUR_DAG_NAME}`\
- NAN;_e.g. If your pipeline filename is xlsx\_to\_jsonld.py, unpause command should be "airflow dags unpause xlsx\_to\_jsonld"_\
+ \NAN;_e.g. If your pipeline filename is xlsx\_to\_jsonld.py, unpause command should be "airflow dags unpause xlsx\_to\_jsonld"_\
\
**NOTE:** If you are using Airflow webserver, you should be able to see your pipeline on http://localhost:8080 (or any other port you selected for the service) inside of "unpaused DAGS"
* **Registering the pipeline**
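The unpause step above maps a pipeline filename to a DAG name. Assuming the DAG id matches the filename stem (as in the xlsx\_to\_jsonld example above — note the id actually comes from the `dag_id` set inside the DAG file, so this holds only when the two are kept identical), the command can be derived like this:

```python
# Derive the `airflow dags unpause` command from a pipeline filename,
# assuming DAG id == filename without the .py extension.
from pathlib import Path

def unpause_command(pipeline_filename: str) -> str:
    dag_id = Path(pipeline_filename).stem
    return f"airflow dags unpause {dag_id}"

print(unpause_command("xlsx_to_jsonld.py"))  # airflow dags unpause xlsx_to_jsonld
```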
@@ -121,11 +114,11 @@ A **DAG** defines the execution order of tasks within a **pipeline**, while a **
* **Convert science paper PDFs to JSON-LD using a bibliographic ontology**\
Extract metadata from science paper PDFs, such as title, authors, publication date, and references, and convert the data into JSON-LD following a bibliographic ontology like BIBO. This allows for structured, machine-readable representation of academic papers for easier citation management and searchability.
* **Convert supply chain Excel documents to JSON-LD using GS1 standard ontology**\
- Parse supply chain-related data from Excel files (e.g., product lists, inventory records) and convert it into JSON-LD using the GS1 standard ontology.
+ Parse supply chain-related data from Excel files (e.g., product lists, inventory records) and convert it into JSON-LD using the GS1 standard ontology.
* **Convert images to JSON-LD using OCR**\
- Use Optical Character Recognition (OCR) to extract text and metadata from image files and represent it as JSON-LD.
+ Use Optical Character Recognition (OCR) to extract text and metadata from image files and represent it as JSON-LD.
* **Convert videos to Knowledge Assets by transcribing the audio and extracting key points**\
- Transcribe the audio from videos and extract key points or insights, then represent this information as JSON-LD knowledge assets.
+ Transcribe the audio from videos and extract key points or insights, then represent this information as JSON-LD knowledge assets.
* If you need to support a different file type:
* Create a new variable for the file type, e.g., `kmining_xlsx_pipeline_id`
  * Adapt the code in [Edge Node API - kMiningService](https://github.com/OriginTrail/edge-node-api/blob/main/services/kMiningService.js) to handle the new variable based on the input file's MIME type
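The MIME-type dispatch described in the steps above can be sketched as a simple lookup. This is an illustration of the pattern, not the actual kMiningService code: only `kmining_xlsx_pipeline_id` appears in the text, and the other keys and pipeline-id names here are hypothetical.

```python
# Illustrative dispatch from an upload's MIME type to the pipeline-id variable
# the kMiningService would select. Mapping entries other than the xlsx one
# are hypothetical examples.
MIME_TO_PIPELINE = {
    "application/pdf": "kmining_pdf_pipeline_id",
    "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet": "kmining_xlsx_pipeline_id",
    "image/png": "kmining_ocr_pipeline_id",
}

def pipeline_for(mime_type: str) -> str:
    try:
        return MIME_TO_PIPELINE[mime_type]
    except KeyError:
        raise ValueError(f"No kmining pipeline registered for MIME type {mime_type!r}")

print(pipeline_for("application/pdf"))  # kmining_pdf_pipeline_id
```

Supporting a new file type then amounts to adding one entry to the mapping and registering the corresponding pipeline, rather than editing branching logic.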
@@ -153,7 +146,7 @@ The native query language for interacting with the DKG is SPARQL, as we use a tr
```
* **Creating your dRAG**
* The app currently contains two dRAGs as a demonstration of how natural language questions can be understood, processed, and answered using SPARQL and vector similarity search
- * Creating a new dRAG is basically creating a new API route in the app, and those steps are recommended but not mandatory: \
+   * Creating a new dRAG is essentially creating a new API route in the app; the following steps are recommended but not mandatory:\
    NOTE: (_you can create your own paths as long as they are compatible with the Edge Node interface, which can also be customized to your needs_)
* Create a new Controller in the controllers directory
* Create your dRAG method in Controller
@@ -176,8 +169,7 @@ The native query language for interacting with the DKG is SPARQL, as we use a tr
* **Feedback-loop-based SPARQL refinement:** Combine the LLM's natural language to SPARQL conversion with a feedback loop, in which the AI iteratively enhances the generated SPARQL queries, ensuring they align with the ontology and avoid errors.
* **Hybrid search — Combine vector and symbolic search:** Use a hybrid approach in which vector search (for semantic similarity) and symbolic search (e.g., SPARQL) work in tandem. Balancing structured queries with open-ended search results in this way can help ensure both accuracy and broad coverage.
* **Ontology-aware LLM fine-tuning:** Create a system to fine-tune a large language model (LLM) specifically on a given ontology. This approach involves providing the LLM with structured data from the ontology, including relationships, entities, and definitions, so it can learn to generate responses that align with the specific concepts and rules of the ontology. Then, use the trained model to formulate SPARQL queries based on the natural language.
-* You should now be ready to test your setup. Visit the Edge Node interface, go to the "AI Assistant" page, ask a question, and verify that your dRAG can answer it based on your custom logic.\
-
+* You should now be ready to test your setup. Visit the Edge Node interface, go to the "AI Assistant" page, ask a question, and verify that your dRAG can answer it based on your custom logic.\\
| Feature | dRAG | Pipeline |
| ----------------- | ----------------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------- |
diff --git a/docs/graveyard/everything/dkg-edge-node/deploy-your-edge-node-based-project/README.md b/docs/graveyard/everything/dkg-edge-node/deploy-your-edge-node-based-project/README.md
index fda13b96..41e58d82 100644
--- a/docs/graveyard/everything/dkg-edge-node/deploy-your-edge-node-based-project/README.md
+++ b/docs/graveyard/everything/dkg-edge-node/deploy-your-edge-node-based-project/README.md
@@ -12,4 +12,4 @@ You can deploy the Edge Node using the Automated Installer (more methods coming
Choose the method that best fits your needs to continue the installation process:
-
Deploy your Edge node with the automated installer
diff --git a/docs/origintrail-v9-v10/origintrail-decentralized-knowledge-graph-dkg-v10-terms-and-conditions.md b/docs/origintrail-v9-v10/origintrail-decentralized-knowledge-graph-dkg-v10-terms-and-conditions.md
new file mode 100644
index 00000000..5f4c6f74
--- /dev/null
+++ b/docs/origintrail-v9-v10/origintrail-decentralized-knowledge-graph-dkg-v10-terms-and-conditions.md
@@ -0,0 +1,286 @@
+# OriginTrail Decentralized Knowledge Graph DKG V10 - Terms and Conditions
+
+### Preamble
+
+The OriginTrail Decentralized Knowledge Graph Version 10 (hereinafter referred to as "**OriginTrail V10**" or "**V10**") is a neutral, peer-to-peer, multi-chain network designed to facilitate decentralized knowledge publishing, verification, and retrieval by human and autonomous software agents. OriginTrail V10 consists of open-source Core Node and Edge Node implementations, a three-layer Memory Model (Working Memory, Shared Working Memory, Verified Memory), and on-chain primitives (Knowledge Assets, Knowledge Collections, Context Graphs, Verified Graphs, Publisher Conviction Accounts, and Staker Conviction Positions) deployed across multiple EVM-compatible blockchains.
+
+OriginTrail V10 is developed by **OriginTrail d.o.o.**, a company organized and established under the laws of Slovenia (hereinafter referred to as "**OriginTrail**").
+
+The OriginTrail V10 Node software is licensed under the Apache License, version 2.0. You may obtain a copy of the License at [http://www.apache.org/licenses/LICENSE-2.0](http://www.apache.org/licenses/LICENSE-2.0).
+
+**PLEASE READ THESE TERMS AND CONDITIONS CAREFULLY BEFORE INSTALLING, OPERATING, OR OTHERWISE USING ANY OriginTrail V10 NODE, AGENT, OR RELATED ON-CHAIN SOFTWARE. BY DOWNLOADING, INSTALLING, OPERATING, PUBLISHING TO, QUERYING, OR OTHERWISE INTERACTING WITH THE OriginTrail V10 NETWORK, OR BY MINTING, ACQUIRING, HOLDING, OR TRANSFERRING ANY V10 NFT POSITION, YOU ACKNOWLEDGE THAT YOU HAVE READ, UNDERSTOOD, AND AGREE TO BE BOUND BY THESE TERMS AND CONDITIONS AND ALL TERMS INCORPORATED BY REFERENCE. IF YOU DO NOT AGREE, DO NOT USE THE OriginTrail V10 NETWORK.**
+
+***
+
+### 1. Definitions
+
+For the purpose of these Terms and Conditions, the following capitalised terms shall have the meanings set out below. Singular terms include the plural and vice versa.
+
+**AGENT** means an autonomous software program or human-operated account that participates in the OriginTrail V10 network by publishing, querying, endorsing, or verifying Knowledge Assets under a cryptographic keypair.
+
+**APACHE LICENSE** means the Apache License, version 2.0, available at [http://www.apache.org/licenses/LICENSE-2.0](http://www.apache.org/licenses/LICENSE-2.0).
+
+**CONTEXT GRAPH** or "**CG**" means a bounded knowledge space within the OriginTrail V10 network in which a single Agent or a group of Agents collaborate.
+
+**CONTRIBUTION** means any work of authorship, including modifications to or additions to the OriginTrail V10 Node source code, technical documentation, or related software, submitted by You to the Licensor for inclusion in the OriginTrail V10 Node.
+
+**DECENTRALIZED KNOWLEDGE GRAPH** or "**DKG**" means the shared, decentralized, multi-chain knowledge graph hosted by the OriginTrail V10 network, consisting of Knowledge Assets organized into Context Graphs.
+
+**DERIVATIVE WORK** means any work, whether in Source Form or Object Form, that is based on, derived from, or incorporates the OriginTrail V10 Node or any portion thereof, as defined under the Apache License.
+
+**KNOWLEDGE ASSET** or "**KA**" means the on-chain representation of a set of RDF triples sharing a single root subject URI, anchored on a supported blockchain. Each Knowledge Asset is an ERC-1155 token; the token holder controls UPDATE authority over that KA.
+
+**LICENSE** means the Apache License, version 2.0, together with these Terms and Conditions.
+
+**LICENSOR** means OriginTrail, digitalne rešitve za dobavne verige, d.o.o.
+
+**MAINNET** means the production OriginTrail V10 network deployed across the Supported Blockchains.
+
+**MEMORY** **MODEL** means the three-layer data organisation of the OriginTrail V10 network: (i) Working Memory - local, private, free; (ii) Shared Working Memory - gossip-replicated among selected peers; (iii) Verified Memory - blockchain-anchored, requires on-chain publishing, with a trust gradient from self-attested to consensus-verified.
+
+**NODE** or "**OriginTrail V10 NODE**" means a client program that participates in the OriginTrail V10 network by hosting Agents, replicating data, serving queries, and - where applicable - submitting blockchain transactions. "Node" includes both Core Nodes (infrastructure nodes that stake TRAC and support the network) and Edge Nodes (client-side nodes for end-user and application integration).
+
+**OBJECT FORM** means any non-Source-Form expression of the Work, as defined in the Apache License.
+
+**OriginTrail V10** has the meaning given in the Preamble.
+
+**PUBLISHER** means any Person (human or automated) that, acting directly or through one or more Agents, triggers a PUBLISH, UPDATE, or VERIFY operation that anchors data on DKG.
+
+**PUBLISHER CONVICTION NFT** means an ERC-721 position representing a commitment of TRAC by a Publisher for a fixed twelve (12) month term in exchange for a pre-purchased publishing allowance and - where applicable - a discount tier, as further described in Section 7 and in the Technical Documentation.
+
+**SOURCE CODE** means a set of instructions and statements written by a programmer using a computer programming language, representing a computer program.
+
+**SOURCE FORM** means the preferred form for making modifications, including software source code, documentation, and configuration files.
+
+**STAKER** means any Person that locks TRAC into a Staker Conviction Position, whether directly operating a Node or delegating capital to the network.
+
+**STAKER CONVICTION NFT** means an ERC-721 position representing a network-level lock of TRAC for a chosen duration in exchange for a reward multiplier, as further described in Section 7 and in the Technical Documentation.
+
+**SUPPORTED BLOCKCHAIN** or "**SUPPORTED CHAIN**" means any EVM-compatible blockchain on which OriginTrail V10 smart contracts are deployed by OriginTrail or its authorized contributors and recognised by the network, including, as of the effective date of these Terms and Conditions, NeuroWeb, Base, and Gnosis. The list of Supported Chains may be updated from time to time.
+
+**TECHNICAL DOCUMENTATION** means all documentation published by OriginTrail in reference to the operation of the OriginTrail V10 network, available at [https://docs.origintrail.io](https://docs.origintrail.io) and on GitHub at [https://github.com/OriginTrail](https://github.com/OriginTrail), including without limitation the V10 Protocol Core specification and the V10 Token Economics specification.
+
+**TERMS AND CONDITIONS** or "**TERMS**" means these terms and conditions, and all terms incorporated by reference, governing the installation, operation, and use of OriginTrail V10.
+
+**TESTNET** means any test network that emulates the operation of a Mainnet, used for testing and development. Testnets have no incentivization mechanisms and therefore cannot support the economic properties of the OriginTrail V10 Mainnet.
+
+**TRAC** means the utility token of the OriginTrail network, used to pay for on-chain publication, updates, verification quorum submissions, and to secure the network through staking.
+
+**US / WE / OUR** means the Licensor.
+
+**USDC** means the USD-denominated stablecoin issued by Circle Internet Financial, LLC (or its successor issuer), referenced here solely as one example of a non-TRAC settlement asset that may be used for knowledge commerce payments.
+
+**VERIFIED GRAPH** means a named verification scope within a Context Graph, with its own participant list and M-of-N quorum, represented on blockchain as an ERC-721 token.
+
+**WORK** means the work of authorship, whether in Source Form or Object Form, made available under the License. For the purpose of these Terms and Conditions, "Work" refers to OriginTrail V10.
+
+**x402** means the HTTP 402 Payment Required extension enabling per-query micropayments between Agents, as described in the Technical Documentation.
+
+**YOU** or "**YOUR**" means the natural or legal entity exercising permissions granted under these Terms and Conditions, whether acting directly or through one or more Agents, Nodes, or wallets.
+
+***
+
+### 2. Acceptance and Eligibility
+
+2.1 By downloading, installing, operating, publishing to, querying, endorsing, verifying, minting, acquiring, holding, or transferring any component of, or position within, the OriginTrail V10 network, You acknowledge that You have read, understood, and agree to be bound by these Terms and Conditions.
+
+2.2 You represent and warrant that:
+
+(a) You have, at the time You first interact with OriginTrail V10, reached the age of majority in Your jurisdiction of residence, and in any event are not less than eighteen (18) years old;
+
+(b) Your use of OriginTrail V10 complies with the laws of Your jurisdiction of residence and any other jurisdiction from which You access or operate on the network, and You are fully able and legally competent to use OriginTrail V10;
+
+(c) You have sufficient understanding of blockchain technology, cryptographic keys, smart contracts, multi-chain operation, volatile crypto-asset markets, and the technical and economic risks associated with participating in a decentralized network, to make an informed decision to use OriginTrail V10;
+
+(d) You are not located in, organised under the laws of, or ordinarily resident in any jurisdiction that is the target of comprehensive economic sanctions administered by the United Nations, the European Union, the United Kingdom, or the United States (including, without limitation, the Office of Foreign Assets Control), nor are You a person or entity designated on any consolidated sanctions list;
+
+(e) You are not using OriginTrail V10 to finance, facilitate, or conceal any illegal activity, including, without limitation, money laundering, terrorist financing, tax evasion, market manipulation, or unauthorised trading in regulated instruments;
+
+(f) You are solely responsible for determining whether Your participation in OriginTrail V10 - including the acquisition, holding, transfer, or lock-up of TRAC, Knowledge Assets, Publisher Conviction Accounts, or Staker Conviction Positions - requires authorisation, registration, licensing, or disclosure under the laws applicable to You, and You have satisfied all such requirements.
+
+2.3 You are solely responsible for ensuring the truthfulness and lawfulness of any information You provide or publish to the OriginTrail V10 network, including any content contained in Knowledge Assets You author, endorse, or verify.
+
+***
+
+### 3. OriginTrail V10 Node License
+
+3.1 Apache License version 2.0. The OriginTrail V10 Node software is licensed under the Apache License, Version 2.0, the full text of which is incorporated into these Terms and Conditions by reference as if set out in full herein. All copyright, patent, redistribution, and trademark provisions of the Apache License, Version 2.0 apply.
+
+3.2 Trademarks. The use of OriginTrail trademarks, service marks, trade names, and product names is forbidden, except as strictly required to describe the origin of the OriginTrail V10 Node and Derivative Works. Nothing in these Terms and Conditions grants You the right to use OriginTrail trademarks.
+
+***
+
+### 4. Contributions
+
+4.1 Unless You explicitly state otherwise, any Contribution submitted for inclusion in the OriginTrail V10 Node Source Code by You to the Licensor shall be under these Terms and Conditions, without any additional terms or conditions. Notwithstanding the foregoing, nothing herein supersedes or modifies the terms of any separate license agreement You may have executed with the Licensor regarding such Contributions.
+
+4.2 By submitting a Contribution, You:
+
+(a) Grant to the Licensor a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, transform, modify or adapt, publicly display, publicly perform, sublicense, and distribute the Contribution or its Derivative Works in Source Form or Object Form;
+
+(b) Grant to the Licensor a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in section 3 of the Apache License, version 2.0) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Contribution;
+
+(c) Represent that You have the right to grant the licenses set out above and that the Contribution does not infringe the intellectual property rights of any third party.
+
+***
+
+### 5. Nature of the Network - No OriginTrail Control
+
+5.1 OriginTrail developed the OriginTrail V10 Node software, licensed under the Apache License 2.0, and may promote it as an open-source network. **HOWEVER, YOU SHALL AT ALL TIMES NOTE THAT ORIGINTRAIL DOES NOT OWN OR CONTROL THE OriginTrail V10 NETWORK, THE DECENTRALIZED KNOWLEDGE GRAPH, ANY SUPPORTED CHAIN, OR ANY RELATED SOFTWARE OR ANY MODIFICATION THERETO, AND YOU ARE SOLELY AND FULLY RESPONSIBLE FOR YOUR USE OF EACH AND ANY OF THEM.**
+
+5.2 You acknowledge and agree that:
+
+(a) OriginTrail does not own, operate, or control any Node, Agent, Context Graph, Verified Graph, Knowledge Asset, Publisher Conviction Account, Staker Conviction Position, or wallet, except any infrastructure OriginTrail itself chooses to operate in its capacity as one participant among many;
+
+(b) OriginTrail does not have any authority to approve, prevent, restrict, censor, reverse, or otherwise exercise control over any transaction, publication, endorsement, verification, update, deletion, or any other interaction that occurs through OriginTrail V10 or any Supported Chain;
+
+(c) You shall not have any expectation as to the performance of the OriginTrail V10 network, the uptime or behaviour of any individual Node, or the compensation paid to any Publisher, Staker, Node operator, or Agent;
+
+(d) Where OriginTrail or its affiliates publish opinions, roadmaps, forecasts, or community communications, such materials are provided for informational purposes only, do not create a contractual obligation, and may be changed or withdrawn at any time.
+
+***
+
+### 6. Node Operation
+
+6.1 You are responsible for the installation, configuration, security, maintenance, and operation of any Node You run, including keeping the Node software reasonably up to date with published releases at [https://github.com/OriginTrail](https://github.com/OriginTrail) and ensuring the secure custody of all associated private keys.
+
+6.2 To operate a Core Node, You must satisfy the minimum technical and economic requirements described in the Technical Documentation. Failure to meet these requirements may, pursuant to protocol-level rules enforced by other participants and smart contracts, result in reduced reputation, reduced allocation of publishing work, or inability to participate in certain operations. OriginTrail does not operate any discretionary disciplinary process - any consequences are the result of protocol rules and the independent behaviour of other network participants.
+
+6.3 You acknowledge that the OriginTrail V10 Node Source Code has not necessarily passed a third-party security audit and can be potentially unstable and could cause unexpected effects and system failures. You are solely responsible for determining the appropriateness of using or redistributing the OriginTrail V10 Node Source Code or any Derivative Work.
+
+6.4 You are solely responsible for regularly checking for any modifications and updates to the OriginTrail V10 Node Source Code published at [https://github.com/OriginTrail](https://github.com/OriginTrail).
+
+6.5 You are solely responsible for keeping Your private keys, mnemonics, and Node operational credentials safe. **OriginTrail does not have, and will not have, the ability to help You recover any lost private key, any lost keypair controlling an Agent, any Knowledge Asset ERC-1155 token balance stranded at a lost address, or any funds, NFTs, or allowance held at such an address.**
+
+***
+
+### 7. TRAC, Conviction Positions, and Token Economics
+
+7.1 **Utility character of TRAC.** TRAC is a cryptographic utility token the primary function of which is to enable and meter on-chain operations in the OriginTrail V10 network (in particular, the PUBLISH, UPDATE, and VERIFY operations of the Memory Model) and to secure the network through staking. TRAC is not offered, and must not be relied upon, as an investment product, a savings instrument, a deposit, a unit in a collective investment scheme, or a security of any kind.
+
+7.2 **Publisher Conviction.** Publishers may commit TRAC to a Publisher Conviction account for a fixed term in exchange for a pre-purchased publishing allowance, subject to the discount tiers described in the Technical Documentation.
+
+7.3 **Delegated Staker Conviction.** Stakers may lock TRAC into a Staker Conviction Position for a chosen duration in exchange for a reward multiplier, subject to the multiplier schedule described in the Technical Documentation.
+
+7.4 **Programmatic Rewards.** Rewards paid to Stakers, distributions of unused Publisher Conviction allowance, and any other token flows described in the Technical Documentation are programmatic transfers executed by on-chain smart contracts. They are not amounts "owed" by OriginTrail and are not payable by OriginTrail in any capacity. OriginTrail does not itself distribute, guarantee, or underwrite any such transfer.
+
+7.5 **Gas and operational costs.** In addition to TRAC, every blockchain transaction on a Supported Chain requires the payment of gas in that blockchain's native gas token (including, where applicable, NEURO on NeuroWeb, ETH on Base, and xDAI on Gnosis). You are solely responsible for obtaining and managing such gas tokens.
+
+7.6 **Knowledge commerce and x402.** You acknowledge that, in addition to protocol-level TRAC payments, the OriginTrail V10 network supports knowledge commerce via x402 HTTP micropayments and - in later protocol releases - FairSwap on-chain escrow, settling in TRAC, USDC, or other tokens configured by the serving Node or Publisher. OriginTrail does not operate any x402 or FairSwap endpoint on Your behalf, does not guarantee the availability, performance, or pricing of such endpoints, and is not a party to any commercial transaction concluded between You and any other Agent, Publisher, or Node operator via such mechanisms.
+
+7.7 **Not a security; no investment solicitation**. You acknowledge and agree that:
+
+(a) Neither TRAC, nor Publisher Conviction Accounts, nor Staker Conviction Positions, nor Knowledge Assets, nor Context Graph or Verified Graph tokens, are offered or intended as securities, investment contracts, collective investment schemes, units in a fund, derivatives, or any other form of regulated financial instrument;
+
+(b) You are not acquiring or holding any such asset with an expectation of profit derived from the entrepreneurial or managerial efforts of OriginTrail or any third party acting on OriginTrail's behalf;
+
+(c) Any statements made by OriginTrail in the Technical Documentation, roadmaps, or elsewhere regarding network design, conviction mechanisms, discount curves, reward multipliers, or market dynamics are informational and architectural in nature, and are not offers, promises, guarantees, or forecasts of financial return;
+
+(d) Where the laws of Your jurisdiction would, notwithstanding the foregoing, characterise any component of the OriginTrail V10 network as a regulated instrument, You are solely responsible for complying with all such laws, including securities, commodities, crypto-asset (including, where applicable, the EU Markets in Crypto-Assets regime), anti-money-laundering, tax, and consumer-protection laws.
+
+***
+
+### 8. Agents and Keys
+
+8.1 You may operate one or more Agents. Each Agent is identified by a cryptographic keypair that You (or the automated system operating on Your behalf) generate and control. Agent creation is free at the protocol level and does not require registration with OriginTrail.
+
+8.2 You are solely responsible for:
+
+(a) The generation, storage, rotation, and protection of each Agent's private key;
+
+(b) Any and all operations signed by any Agent under Your control;
+
+(c) The conduct of any automated or "agentic" software operating under Your authority, including the content it publishes, the commitments it makes, the TRAC or other tokens it spends, and the wallets it signs for. **The autonomous or probabilistic character of an Agent is not a defence and does not shift responsibility to OriginTrail or to any other participant.**
+
+8.3 If a key controlling an Agent is lost, compromised, or stolen, the current protocol does not provide social or off-chain recovery. You acknowledge that in such a case, You may permanently lose authority over Knowledge Assets or any other assets held at that Agent's address.
+
+***
+
+### 9. Content Responsibility
+
+9.1 All data that You, Your Agents, or Nodes under Your control publish, endorse, or verify on the OriginTrail V10 network is published by You on Your own authority. You warrant that any such data:
+
+(a) Does not infringe any copyright, trademark, trade secret, patent, publicity, privacy, or other intellectual property or personal right of any third party;
+
+(b) Does not contain unlawful, defamatory, deceptive, or misleading content;
+
+(c) Is, to the extent it contains personal data, processed by You in compliance with all applicable data protection laws (including, where applicable, the EU General Data Protection Regulation); and
+
+(d) May be replicated, stored, and served by other Nodes on the network in accordance with protocol rules, and - in the case of Verified Memory - may be anchored permanently on one or more Supported Chains in a manner that You accept is technically impossible to reverse.
+
+9.2 **Once knowledge is published to Verified Memory, its Merkle root is anchored on-chain and the underlying triples are replicated across Nodes within the relevant Context Graph. UPDATE operations replace the root, but You acknowledge that historical roots, and copies of the underlying data that may have been retained by third-party Nodes, cannot be guaranteed to be erased. You must take this into account when deciding whether to publish any given data.**
+
+***
+
+### 10. Risk Disclosures
+
+YOU EXPRESSLY ACKNOWLEDGE AND ACCEPT THE FOLLOWING RISKS:
+
+10.1 **Technology risk.** The OriginTrail V10 Node Source Code, Agents, smart contracts, and protocol specifications are complex software. They may contain undiscovered defects, vulnerabilities, or bugs that could lead to loss of TRAC, Knowledge Assets, Conviction positions, or other value.
+
+10.2 **Multi-chain risk.** OriginTrail V10 operates across multiple Supported Chains, including NeuroWeb, Base, and Gnosis. Each Supported Chain is an independent system operated by independent actors and validators, outside the control of OriginTrail. Any Supported Chain may experience outages, forks, consensus failures, re-organisations, governance disputes, regulatory action, or termination. OriginTrail makes no representation as to the availability, finality, or security of any Supported Chain, and You accept that Your assets or positions on any given Supported Chain may be adversely affected by events specific to that blockchain.
+
+10.3 **Cross-chain risk.** Operations that rely on state, tokens, or messages bridged between Supported Chains introduce bridge-specific risks, including smart-contract exploits, validator misbehaviour, loss of canonicality, and temporary or permanent loss of assets in transit. You are solely responsible for evaluating and bearing such risks.
+
+10.4 **Lock-up risk.** TRAC committed to a Publisher Conviction Account or a Staker Conviction Position is locked for a specific duration. During these lock periods, You cannot withdraw the locked tokens. Any secondary-market transfers of the NFT are neither guaranteed nor underwritten.
+
+10.5 **Reward volatility and conditionality.** Staker rewards, Publisher allowance utilisation, and any other programmatic transfers are determined by network activity, gas conditions, and smart-contract execution. Rewards may decrease, pause, or cease altogether. No minimum, guaranteed, or target return is promised by OriginTrail.
+
+10.6 **Stablecoin and payment-rail risk.** x402 micropayments and FairSwap escrow may settle in USDC or other non-TRAC tokens. Such tokens are issued, operated, or controlled by third parties and carry their own risks (including, without limitation, issuer risk, banking risk, peg failure, freezes, blocklists, and regulatory action). OriginTrail does not issue, endorse, or guarantee any such token.
+
+10.7 **Regulatory risk.** The regulatory treatment of crypto-assets, decentralized networks, autonomous software agents, and related activities is evolving worldwide and varies between jurisdictions. Actions or determinations by regulators, courts, or legislatures may adversely affect the availability or value of OriginTrail V10, TRAC, Conviction NFTs, Knowledge Assets, or other positions. You are solely responsible for monitoring and complying with such developments in every jurisdiction relevant to You.
+
+10.8 **Tax risk.** You are solely responsible for determining any tax consequences of Your participation in OriginTrail V10, including in connection with minting, holding, transferring, or selling any Conviction NFT or Knowledge Asset, receiving any programmatic emission, paying or receiving x402 or FairSwap settlements, or operating a Node, and for filing and paying any taxes and duties due.
+
+10.9 **Testnet risk.** Testnets have no incentivization mechanisms and therefore cannot support the economic value propositions of OriginTrail V10 Mainnet. Testnet state, balances, NFTs, and histories may be reset, abandoned, or otherwise destroyed at any time without notice.
+
+***
+
+### 11. Disclaimer of Warranties
+
+11.1 UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING, THE OriginTrail V10 NODE, THE DECENTRALIZED KNOWLEDGE GRAPH, THE OriginTrail V10 NETWORK, ANY SUPPORTED CHAIN, AND ANY RELATED SOFTWARE ARE PROVIDED ON AN "AS IS" AND "AS AVAILABLE" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OR CONDITIONS OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, ACCURACY, RELIABILITY, SECURITY, UPTIME, OR NON-INTERRUPTION.
+
+11.2 THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF USING OriginTrail V10 IS BORNE BY YOU. You are solely responsible for determining the appropriateness of using or redistributing the OriginTrail V10 Node Source Code, any Derivative Work, or any Knowledge Asset, and assume any risks associated with Your exercise of permissions granted under this License.
+
+***
+
+### 12. Limitation of Liability
+
+12.1 IN NO EVENT AND UNDER NO LEGAL THEORY, UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING, SHALL OriginTrail, OR ANY OTHER CONTRIBUTOR WHO MODIFIES AND/OR CONVEYS THE OriginTrail V10 NODE AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY DIRECT, INDIRECT, SPECIAL, GENERAL, INCIDENTAL, CONSEQUENTIAL, OR PUNITIVE DAMAGES OF ANY CHARACTER ARISING AS A RESULT OF THIS LICENSE OR OUT OF THE USE OR INABILITY TO USE THE OriginTrail V10 NODE OR THE OriginTrail V10 NETWORK (INCLUDING, BUT NOT LIMITED TO, LOSS OF DATA OR DATA BEING RENDERED INACCURATE, LOSS OF TRAC, NFTS, OR OTHER CRYPTO-ASSETS, LOSSES SUSTAINED BY YOU OR THIRD PARTIES, FAILURE OF ANY SUPPORTED CHAIN, FAILURE OR MALFUNCTION OF ANY AGENT, LOSS OF GOODWILL, WORK STOPPAGE, BUSINESS INTERRUPTION, COMPUTER FAILURE OR MALFUNCTION, OR ANY AND ALL OTHER COMMERCIAL DAMAGES OR LOSSES), EVEN IF OriginTrail OR ANY OTHER CONTRIBUTOR HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.
+
+***
+
+### 13. Indemnification
+
+13.1 To the maximum extent permitted by applicable law, You agree to indemnify, defend, and hold harmless OriginTrail, its affiliates, and their respective directors, officers, employees, and contributors from and against any and all claims, damages, losses, liabilities, costs, and expenses (including reasonable legal fees) arising out of or related to: (a) Your use of OriginTrail V10; (b) Your violation of these Terms and Conditions; (c) Your violation of any applicable law or of any right of a third party; (d) any content or data You publish to, or cause to be published to, the OriginTrail V10 network; or (e) the operation of any Agent, Node, or wallet under Your control.
+
+***
+
+### 14. Governing Law and Dispute Resolution
+
+14.1 These Terms and Conditions, and Your use of the OriginTrail V10 Node and the OriginTrail V10 network, shall be governed by and construed in accordance with the laws of Slovenia, excluding its conflict-of-laws principles.
+
+14.2 In the event of any dispute arising out of or in connection with these Terms and Conditions or Your use of the OriginTrail V10 Node, the competent Court in Ljubljana, Slovenia shall have exclusive jurisdiction to resolve the dispute.
+
+14.3 If any term, clause, or provision of these Terms and Conditions, or any terms incorporated by reference herein, is held unlawful, void, or unenforceable, then that term, clause, or provision shall be severable from these Terms and Conditions and shall not affect the validity or enforceability of any remaining part of that term, clause, or provision, or any other term, clause, or provision of these Terms and Conditions.
+
+***
+
+### 15. Changes to these Terms
+
+15.1 OriginTrail reserves the right to revise these Terms and Conditions, or any terms incorporated by reference herein, at any time, without prior notice. By continuing to use the OriginTrail V10 network after any such revision takes effect, You acknowledge and agree to be bound by the Terms and Conditions in force at the time of Your use.
+
+15.2 You are solely responsible for regularly checking for any revisions or amendments to these Terms and Conditions at the official repository and on [https://docs.origintrail.io](https://docs.origintrail.io).
+
+***
+
+### 16. Apache License Notice
+
+Licensed under the Apache License, Version 2.0 (the "Apache License"); You may not use this file except in compliance with the Apache License. You may obtain a copy of the Apache License at [http://www.apache.org/licenses/LICENSE-2.0](http://www.apache.org/licenses/LICENSE-2.0). Unless required by applicable law or agreed to in writing, software distributed under the Apache License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the Apache License for the specific language governing permissions and limitations under the Apache License.
+
+***
+
+_Issued in Ljubljana. Draft for community and legal review - V10 adaptation of the OriginTrail ODN Terms and Conditions._
+
+_OriginTrail, digitalne rešitve za dobavne verige, d.o.o._
+
diff --git a/docs/origintrail-v9-v10/origintrail-dkg-v10-bounty-program.md b/docs/origintrail-v9-v10/origintrail-dkg-v10-bounty-program.md
new file mode 100644
index 00000000..19bbb39f
--- /dev/null
+++ b/docs/origintrail-v9-v10/origintrail-dkg-v10-bounty-program.md
@@ -0,0 +1,218 @@
+# OriginTrail DKG v10 Bounty Program
+
+
+
+## Round 1 — Call for Integrations
+
+
+
+### 1. Summary
+
+[OriginTrail](https://origintrail.io) is opening the first round of the DKG v10 integrations bounty. This round is narrowly scoped to integrations that bring **Working Memory** and **Shared Memory** — the two pre-verification layers of the v10 memory model — into tools that advance Andrej Karpathy's vision of the **LLM Wiki** and **autoresearch**: collaborative, agent-native knowledge substrates where retrieval, writing, and verification collapse into a single loop.
+
+We are especially interested in integrations with **OpenClaw, Hermes**, and agents of comparable shape — autonomous or semi-autonomous research agents operating over long-horizon tasks and producing knowledge artifacts that benefit from provenance, collaboration, and eventual on-chain verification.
+
+The best contributions will be listed in the official DKG v10 integrations registry, featured across OriginTrail's documentation and ecosystem surfaces, and **rewarded with up to 10,000 TRAC per accepted submission**, tiered by impact — see sections 10 and 11. Integrations live in contributor-owned repositories and are consumed by users through the registry; this round does not merge contributor code into the dkg monorepo.
+
+This call is the first of **three planned rounds**. Round 1 focuses on Working and Shared Memory. Round 2 will target Verified Memory and context oracles. Round 3 will target agent-ready analytics and user support. See section 12 for the full roadmap.
+
+### 2. Why now
+
+AI is shifting from single agents to **multi-agent systems**. Teams of models are doing research, writing code, running operations, and producing knowledge faster than any single agent could.
+
+The bottleneck is no longer the model. **The bottleneck is memory** — specifically, a shared memory substrate that multiple agents can read, write, contest, and verify over time. Closed labs are racing to build this as proprietary infrastructure inside their own products. The market is heading toward a world where every major AI platform ships its own walled-garden memory layer, and the knowledge produced within each one is trapped there.
+
+An open, verifiable, agent-native alternative does not yet exist at a production scale. The DKG is the closest thing to it — a public knowledge substrate with provenance, trust gradients, and on-chain anchoring already wired in. v10 is the release that makes it usable for agents.
+
+{% hint style="info" %}
+**This is the moment to define the memory layer of AI**, before the closed alternatives harden into defaults. Round 1 of the DKG v10 bounty program is where builders who want that layer to be open can ship the integrations that will shape it.
+{% endhint %}
+
+### 3. Why this round
+
+The v10 memory model is built around a trust gradient: drafts mature from **Working Memory** (private, agent-populated) through **Shared Memory** (team-gossiped, collaborative) to **Verified Memory** (chain-anchored, with self-attested → endorsed → consensus-verified gradations).
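
Read operationally, the gradient is a strict ordering that knowledge moves through one step at a time. A minimal sketch, using the documented layer and status names (the one-step promotion rule is an illustrative assumption, not a protocol guarantee):

```typescript
// The v10 trust gradient as an ordered progression. Layer and status
// names follow the documented vocabulary; the rule that promotion
// advances exactly one level at a time is an illustrative assumption.
const GRADIENT = [
  "working",                      // private, agent-populated drafts
  "shared",                       // team-gossiped, collaborative
  "verified:self-attested",       // chain-anchored, author-attested
  "verified:endorsed",            // endorsed via agent conversation
  "verified:consensus-verified",  // highest trust level
] as const;

type TrustLevel = (typeof GRADIENT)[number];

// Promote one step; already-terminal knowledge stays put.
function promote(level: TrustLevel): TrustLevel {
  const i = GRADIENT.indexOf(level);
  return GRADIENT[Math.min(i + 1, GRADIENT.length - 1)];
}

promote("working");                      // "shared"
promote("verified:consensus-verified");  // stays "verified:consensus-verified"
```

An integration that models status this way never needs a binary verified flag, which is the point of the gradient.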
+
+Rounds on Verified Memory will follow. This first round deliberately targets the **pre-verification surface** — the layers where agents draft, iterate, and collaborate — because that is where LLM-Wiki / autoresearch workflows live most of the time. A good research agent spends the bulk of its life drafting and revising; only a small tail of its output ever needs consensus verification.
+
+Karpathy's framing of the [LLM Wiki](https://gist.github.com/karpathy/442a6bf555914893e9891c11519de94f) — a knowledge substrate natively legible to language models, continuously curated by a mixture of humans and agents — maps almost directly onto the v10 three-layer model. This round is the first concrete step toward making that substrate real.
+
+### 4. Build this
+
+If you are unsure what to build, start from one of these. They are concrete, buildable inside a round, and each one plugs DKG memory into a workflow real users already have.
+
+#### Illustrative suggestions
+
+* **ChatGPT / Claude plugin or MCP server that writes to Working Memory.** Every drafted artifact — chat, research note, code analysis — is deposited into the author's Working Memory with provenance and an agent-assigned status tag. Turns any conversation into durable, attributable knowledge.
+* **Slack threads → Shared Memory.** An agent that watches a channel, identifies substantive exchanges (not chitchat), and promotes them into Shared Memory and team Context Graph membership.
+* **DKG as a memory backend for an existing RAG pipeline.** Swap a team's vector-store-plus-prompt retrieval loop for a DKG-backed one. Agents get provenance and promotion paths for free, and downstream context oracles become consumable without a rewrite.
+* **GitHub → Working Memory ingestion.** Every issue, PR, and review comment in a repo flows into the author's Working Memory with code-aware tagging. The engineering knowledge a team generates daily, captured.
+
+#### Other welcome shapes
+
+* An OpenClaw adapter that deposits every drafted artifact into the author's Working Memory with status tags and provenance.
+* A Hermes Agent-style autoresearch loop that uses Shared Memory as its team scratchpad — gossip-replicated, multi-agent-readable, no merge-conflict UI.
+* A wiki-style editor (think Roam / Logseq / Obsidian) that syncs bidirectionally with Shared Memory and promotes pages toward verification via the agent conversation surface.
+* A citation-resolver that pulls from Shared Memory across a team's Context Graphs and resolves references back to canonical UALs.
+* An inbox-style integration — email, arXiv feeds, RSS — that populates Working Memory with incoming items tagged by an agent.
+* A SPARQL-backed research assistant that queries across a user's Working + Shared layers and a team paranet in a single cohesive view.
+
+Your own idea is welcome. The list is a starting point, not a constraint.
+
+### 5. In scope
+
+A submission is in scope if it does **both** of the following:
+
+1. Writes to or reads from **Working Memory** or **Shared Memory** on a DKG v10 node through a supported public interface, respecting the v10 primitives (UAL, Knowledge Asset, Knowledge Collection, Context Graph, Integration, Curator).
+2. Connects that capability to a product, project, or agent advancing the LLM-Wiki / autoresearch direction.
+
+Supported public interfaces for this round are:
+
+* The node HTTP API (authenticated via bearer token)
+* The dkg CLI, invoked as a subprocess
+* MCP server (for MCP-capable clients such as Cursor, Claude Code, Claude Desktop, and other agent frameworks)
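
As a concrete example of the first interface: the node HTTP API expects a bearer token on every call. A minimal sketch of assembling such a request; the port, endpoint path, and payload shape below are assumptions for illustration, not documented routes (consult the node API reference for the actual ones):

```typescript
// Sketch: build an authenticated request for a DKG v10 node's HTTP API.
// ASSUMPTIONS: the base URL/port and the endpoint path are placeholders;
// check your node's API reference for the real routes.
interface NodeRequest {
  url: string;
  headers: Record<string, string>;
  body: string;
}

function buildNodeRequest(
  baseUrl: string,
  path: string,
  token: string,
  payload: unknown
): NodeRequest {
  return {
    url: new URL(path, baseUrl).toString(),
    headers: {
      Authorization: `Bearer ${token}`,   // bearer-token auth per Section 5
      "Content-Type": "application/json",
    },
    body: JSON.stringify(payload),
  };
}

// The resulting object can be handed to fetch() or any HTTP client.
const req = buildNodeRequest(
  "http://localhost:8900",        // assumed local node address
  "/working-memory/assets",       // hypothetical path
  "dev-token",
  { assertion: { "@context": "https://schema.org", name: "draft note" } }
);
```

Keeping request construction separate from transport like this also makes the declared-egress and write-authority checks in Section 8a easy to satisfy.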
+
+#### Priority integration targets
+
+* **OpenClaw** — the Telegram-based agent build-orchestration environment. Integrations that route OpenClaw artifacts into Working Memory, or expose Shared Memory as a team substrate OpenClaw agents can read and contribute to, are highly valued.
+* **Hermes Agent** — or comparable autonomous research agents (long-horizon, tool-using, artifact-producing).
+* **Claude Code sub-agents and Agent Teams, Cursor-like IDEs with agent panels, Jupyter / notebook autoresearch kernels, literature-review and research-synthesis agents, RAG / dRAG pipelines** that want a verifiable upstream.
+
+### 6. Out of scope (for this round)
+
+* Integrations that touch _only_ **Verified Memory** or chain-anchoring flows. These are the subject of a later round. Submissions that primarily target Working and Shared Memory but deliberately anticipate promotion to Verified Memory and consumption by context oracles are in scope and, per section 9, scored higher for it.
+* UI buttons for endorsement, voting, or social consensus. In v10, these interactions are conversational and happen through the agent panel, not through UI affordances. Submissions reintroducing them will be rejected.
+* Publisher-side Conviction / staking UX. Separate track.
+* DKG v9-only work. The Situation Room-style v9 applications are valuable but out of scope here; this call is v10-specific.
+* Integrations that bypass the Curator authority model on PUBLISH / SHARE operations.
+
+* Integrations that import from internal v10 packages (`@origintrail-official/dkg-core`, `-storage`, `-chain`, `-publisher`, `-query`, `-agent`, or any non-public subpath), patch node source, or load code into the node daemon process. These are forks of the node, not integrations in the v10 sense, and carry monorepo-merge obligations this program is not set up to handle. Build against the stable public interfaces listed in Section 5.
+
+### 7. Design principles submissions must honor
+
+Every accepted integration must respect the v10 design principles. These are non-negotiable:
+
+* **Agent-first.** Connecting an agent is the entry point. "Let the agent decide" is the sensible default. Advanced controls live behind Advanced Settings, not on the main surface.
+* **Trust gradient, not binary states.** Knowledge matures; it is not simply "unverified" or "verified". Working → Shared → Verified, with self-attested → endorsed → consensus-verified inside Verified.
+* **Conversational consensus.** Endorsement and voting occur through agent conversation, not UI buttons.
+* **Project-centric layering.** The three memory layers nest inside a project, not the other way around. Integrations should respect this hierarchy.
+* **No merge/conflict UI on Shared Memory.** Shared Memory is gossiped, not merged.
+* **Terminology discipline.** Use the established v10 vocabulary exactly: Context Graph, Integration, Curator, Entity, Knowledge Asset, Knowledge Collection, SHARE, PUBLISH, Projects (not "Memory Explorer"). Deviations should be justified in the submission.
+
+### 8. Submission requirements
+
+A complete submission includes:
+
+1. **Pull request** against the [OriginTrail/dkg-integrations](https://github.com/OriginTrail/dkg-integrations) repository, adding a single integration entry pinned to a specific commit and published package version of your own repository. Integrations live in contributor-owned repositories and consume the DKG v10 node through the supported public interfaces listed in Section 5. Round 1 awards are paid on registry acceptance; contributor code is not merged into the dkg monorepo. OriginTrail core developers may, at their discretion, later invite flagship integrations into first-party status as a separate conversation with an explicit maintenance handoff.
+2. **Design brief** (Markdown, 1–3 pages) covering: problem, target user, which memory layer(s) are touched, which v10 primitives are used, how it maps to the LLM-Wiki / autoresearch direction, any terminology choices that deviate from the v10 vocabulary, and an explicit **promotion path** section describing how Working and Shared artefacts mature toward Verified Memory and how the integration's outputs are shaped for downstream consumption by context oracles.
+3. **Working demo** — a recorded walk-through or live endpoint. Screenshots alone are insufficient.
+4. **Test coverage** proportionate to the surface area touched, and integration tests against a local v10 node.
+5. **Security notes** — any credential, write-authority, or data-egress consideration, particularly around Curator authority on PUBLISH / SHARE.
+6. **Maintenance commitment** — named maintainer and at least a 6-month support window post-merge.
+
+### 8a. Security requirements
+
+Before a registry entry is accepted, the following are verified — most are automated as CI checks on the registry repository:
+
+1. Package is published to npm (or an equivalent verifiable registry) with build provenance (e.g., `npm publish --provenance` via GitHub Actions).
+2. No `postinstall` or `preinstall` scripts in the published package, unless explicitly justified and reviewed.
+3. License file present and SPDX identifier matches the registry entry.
+4. Declared network egress: every external domain the integration contacts beyond the local DKG node is listed in the registry entry.
+5. Declared write authority: every DKG endpoint or operation the integration invokes is listed, with any Curator-authority operations (PUBLISH, SHARE) called out explicitly.
+6. No dynamic code loading outside the DKG SDK (no `eval` on remote input, no remote module fetch-and-execute).
+7. `npm audit --production` clean, or outstanding advisories triaged in the submission.
+8. Contributor attestation on the PR: the code is the contributor's own work or properly licensed, and contains no intentional backdoors.
+9. For featured-tier submissions, a one-time manual security review of the pinned commit.
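
Several of these checks are mechanical. A hypothetical CI helper covering checks 2 to 4 (the registry-entry shape, including the `egress` field, is an assumed structure for illustration; the real registry schema may differ):

```typescript
// Sketch of registry CI checks 2-4: no install-time scripts, a license
// identifier present, and declared network egress.
interface PackageManifest {
  name: string;
  scripts?: Record<string, string>;
  license?: string;
}

interface RegistryEntry {
  package: string;
  egress: string[];  // external domains contacted beyond the local node
}

function checkSubmission(pkg: PackageManifest, entry: RegistryEntry): string[] {
  const failures: string[] = [];
  for (const hook of ["preinstall", "postinstall", "install"]) {
    if (pkg.scripts?.[hook] !== undefined) {
      failures.push(`install-time script "${hook}" requires explicit review`);
    }
  }
  if (!pkg.license) {
    failures.push("missing SPDX license identifier");
  }
  if (entry.egress.length === 0) {
    // Empty is legitimate only for strictly node-local integrations;
    // flag it so a reviewer confirms that is actually the case.
    failures.push("no declared egress: confirm the integration is node-local only");
  }
  return failures;
}
```

A passing submission returns an empty failure list; anything else blocks the registry PR until triaged.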
+
+### 9. Evaluation criteria
+
+Submissions are scored against the following, roughly in order of weight:
+
+1. **Fit with LLM-Wiki / autoresearch direction.** Does this meaningfully advance agent-native collaborative knowledge work, or is the DKG integration incidental?
+2. **Adoption potential.** Will real users use this? Does it plug into an existing workflow people already have? Does it unlock agent behavior that was previously impossible? The strongest submissions ship with a credible first user, not just a theoretical one.
+3. **Faithfulness to the v10 memory model.** Correct use of layers, primitives, Curator authority, and terminology.
+4. **Forward-compatibility with Verified Memory and context oracles.** Although this round targets the pre-verification layers, the strongest submissions treat Working and Shared Memory as upstream of Verified Memory — not as a terminal destination. Concretely: data structures, provenance, and UAL references should be shaped so that promotion to Verified Memory is a natural next step rather than a rewrite; and the integration should anticipate consumption by context oracles, so that artifacts maturing through the trust gradient become usable as oracle inputs for downstream agents and applications. Submissions that explicitly document their promotion path and oracle-readiness score higher.
+5. **Quality of the agent surface.** Agent-first onboarding, sensible defaults, conversational consensus respected.
+6. **Engineering quality.** Code clarity, test coverage, deployment story, dependency hygiene (standalone-repo-over-HTTP preferred over monorepo embedding).
+
+7. **Documentation.** Design brief, in-repo docs, and onboarding for other contributors.
+
+### 10. Bounty structure
+
+Awards are tiered. The committee assigns a tier based on evaluation score, with a published rationale for each award.
+
+| Tier | Award range | What it signals |
+| --- | --- | --- |
+| Flagship | 8,000 – 10,000 TRAC + ecosystem spotlight | Production-grade integration with a clear user base, faithful to the v10 model, strong forward-compatibility with Verified Memory and oracles. Lined up to become a default example in v10 documentation and ecosystem demos. |
+| High-quality | 3,000 – 7,000 TRAC | Solid, well-scoped integration that ships a real capability and is maintained. The working core of the ecosystem. |
+| Experimental / early | 1,000 – 3,000 TRAC | Promising early-stage work — a fully working prototype that proves a direction, a partial integration with a credible path to maturity, or a well-executed exploration of an underserved surface. |
+
+Note: Half-completed solutions are not eligible. Your application has to demonstrate a fully working, DKG-relevant capability in the context of the above-listed requirements.
+
+* Awards are disbursed upon the merge of the approved pull request.
+* Multiple submissions per team are allowed, but each must be substantively distinct.
+* Accepted submissions are lined up for merge in the order the committee deems appropriate; queue position does not affect the tier or award.
+* TRAC is paid on the network the contributor selects from those supported by v10 at the time of merge (NeuroWeb, Base, Gnosis), subject to the network's availability at the time of disbursement.
+
+### 11. What happens if you win
+
+Accepted integrations don't just get TRAC. They get distribution.
+
+* **Registry listing.** Every accepted integration appears in the official DKG v10 integrations registry, consumed by the node itself (integration discovery and installation flows), and is displayed in the v10 dashboard UI. Being in the registry is how users find and adopt your integration.
+* **Documentation.** Flagship and high-quality integrations are featured as default examples in the OriginTrail v10 documentation, developer guides, and agent onboarding flows.
+* **Ecosystem demos.** Highlighted in ecosystem demos, community calls, and Trace Alliance Academy materials, where enterprise partners and agent builders are actively looking for reference integrations to adopt.
+* **Introductions.** Contributors behind strong integrations are introduced to enterprise partners exploring the DKG for their own stack, and to agent-builder teams looking for memory primitives.
+* **Follow-on eligibility.** Rewarded integrations from this round may be eligible for further rounds of the bounty program — in particular, later rounds targeting Verified Memory, context oracles, and Conviction-related UX. Teams that deliver faithful Round 1 work and document a credible promotion path are well-positioned to extend their integration in subsequent rounds. Eligibility is not automatic; each round opens with its own scope and criteria, and prior participation is a signal of fit, not a guarantee of award.
+
+The goal is simple: Round 1 should be a platform entry point, not a one-off payout. Builders who show up here are the ones who define how the layer takes shape.
+
+### 12. Program roadmap — three rounds
+
+This call is the first of three planned rounds. The rounds are sequential: each builds on artifacts and infrastructure matured in the previous one, and scope tightens and layer coverage deepens as the program progresses. Round 2 and Round 3 scope is indicative and subject to refinement at each round's opening.
+
+
+
+Figure 1. The three-round progression of the DKG v10 bounty program.
+
+| **Round** | **Focus** | **Scope** |
+| --- | --- | --- |
+| Round 1 — Working & Shared Memory (open now) | Pre-verification integrations, LLM-Wiki, and autoresearch agents | Working and Shared Memory surfaces; priority integration with OpenClaw, Hermes, and comparable research agents. Up to 10,000 TRAC per accepted submission. |
+| Round 2 — Verified Memory & context oracles | Chain-anchored verification and context oracles | Turning Round 1 artifacts into verified, oracle-consumable knowledge; detailed scope to be specified when the round opens. |
+| Round 3 — Analytics & user support | Agent-legible observability and agent-mediated support | Analytics surfaces legible to agents rather than only humans; agent-mediated user support spanning the full trust gradient. Builds on verified artifacts and oracle outputs delivered in Round 2. |
+
+**Why the sequence matters.**
+
+Round 1 seeds the pre-verification substrate with artifacts that have clean provenance and promotion paths.
+
+Round 2 turns that substrate into verified, oracle-consumable knowledge.
+
+Round 3 makes the whole stack usable — both by agents operating over it and by humans who need support navigating it.
+
+Integrations that anticipate this arc from Round 1 onward — see evaluation criterion 4 — are best positioned to extend across multiple rounds.
+
+**What is not committed.** The bounty cap, timeline, and review committee for Rounds 2 and 3 are not set here. They will be specified when each round opens. The Round 1 scope is the binding document for this call.
+
+### 13. Timeline
+
+| Milestone | Date |
+| --- | --- |
+| Round opens | On publication |
+| First review cut-off | To be announced on the official channel |
+| Rolling review thereafter | Yes |
+| Round closes | When the Round 1 pool is exhausted (50,000 TRAC) or when Round 2 opens — whichever is earlier |
+
+Exact review cut-off dates are announced on the official OriginTrail channel on the day, in keeping with the ecosystem's practice of not publishing sensitive timing in advance.
+
+### 14. How to submit
+
+1. Open the pull request against [**OriginTrail/dkg-integrations**](https://github.com/OriginTrail/dkg-integrations)**,** adding a single integration entry for your project pinned to a specific commit and published package version of your own repository.
+2. Post the PR link, design brief, and demo link in the designated OriginTrail submission thread (link on the official channel).
+3. Tag the submission `cfi-dkgv10-r1`.
+
+Questions and early-stage design conversations are encouraged before opening a PR. The OriginTrail community channels are the right place for those.
+
+### 15. Governance & review
+
+* Submissions are reviewed by a committee drawn from OriginTrail, Trace Labs, and Trace Alliance contributors active on v10.
+* Decisions are recorded with reasoning published alongside each accepted submission, redacted where contributor privacy requires it.
+* Award amounts are described as **programmatic bounty disbursements**, not as amounts owed; the committee's decision on fit, quality, tier, and amount is final for the round.
+* Conflicts of interest (committee members contributing submissions) are disclosed and recused.
+
+### 16. Notes on conduct and IP
+
+* Submissions must be the contributor's own work or properly licensed under an open-source license such as MIT or Apache 2.0.
+* Contributors retain authorship; merged code is licensed under the v10 repository's standard license.
+* The call does not constitute a contract of employment or a promise of future work. It is an open call with programmatic rewards for accepted contributions.
+* Each participant accepts the [OriginTrail DKG V10 Terms & Conditions.](origintrail-decentralized-knowledge-graph-dkg-v10-terms-and-conditions.md)
+
+***
+
+_Issued by OriginTrail d.o.o. For the current official channel and submission thread, see_ [_origintrail.io_](https://origintrail.io)_._
diff --git a/docs/origintrail-v9-v10/roadmap.md b/docs/origintrail-v9-v10/roadmap.md
new file mode 100644
index 00000000..6403bdcd
--- /dev/null
+++ b/docs/origintrail-v9-v10/roadmap.md
@@ -0,0 +1,174 @@
+---
+description: >-
+ Multi-Agent Memory · DePIN · Truth-Seeking Algorithms · Conviction
+ Mechanisms
+---
+
+# Roadmap
+
+## Metcalfe Convergence Phase
+
+### 1. Why Now: Matching the Velocity of Agentic AI
+
+Agentic systems are entering production at a pace that will define infrastructure lock-in for the next decade. The unsolved problem at the frontier is no longer model capability, but rather how collaborating agents share and verify the knowledge they’ve learned.
+
+{% hint style="info" %}
+**KARPATHY’S AUTORESEARCH: THE PROBLEM THE DKG SOLVES**
+
+Andrej Karpathy [described the next frontier:](https://x.com/BranaRakic/status/2035467180431593939?s=20) thousands of autonomous agents collaborating across the internet on the same research problem, running parallel experiments where each commit builds on the last. His AutoResearch system ran 700 experiments in 2 days on a single GPU, discovering 20 optimizations autonomously. His vision: "a swarm of agents on the internet could collaborate to improve LLMs and could potentially even run circles around Frontier Labs."
+
+But Karpathy identified the critical unsolved problem: **how do you coordinate an untrusted pool of workers out there on the internet?** The work is expensive to produce but cheap to verify. The structure looks like a blockchain - instead of blocks you have commits, and the proof of work is doing tons of experimentation to find commits that work. The Earth has a huge amount of untrusted compute. We need systems in place that deal with using that compute with trust.
+
+**This is exactly what context graphs on the DKG do.**
+{% endhint %}
+
+{% hint style="info" %}
+**DECENTRALIZED KNOWLEDGE GRAPH (DKG): THE TRUST LAYER FOR KEEPING CONTEXT IN AGENT SWARMS**
+
+As Branimir Rakić (OriginTrail CTO) articulated: an auto-research swarm sets up a context graph with a defined set of verifier agents and an M-of-N signature threshold. Untrusted agents run experiments and submit results as Knowledge Assets. For those results to land in the shared context graph, M of the N trusted verifiers must confirm results, then cryptographically co-sign the batch on-chain, attesting that the claimed metrics actually reproduce.
+
+The result is a growing, queryable knowledge graph of verified experimental results that any agent in the swarm can query to decide what to try next - built on a trust layer where untrusted contributors do the heavy lifting and trusted verifiers keep the graph honest.
+{% endhint %}
+
+Meanwhile, every major AI company is racing to give AI an improved memory. Anthropic shipped it for Claude, OpenAI built it into ChatGPT, Google wired it into Gemini. But these are siloed, proprietary memory systems. The demos are compelling: an assistant that remembers. The problem: each system creates its own memory silo. No agent can access another’s memory. No memory is verifiable. No memory is ownable by the user. The DKG provides the alternative: **multi-agent memory that is persistent, verifiable, decentralised, controlled by users and shared across any framework.**
+
+The improvements to OriginTrail are not a reaction to advancements by major AI companies - DKG V9 must ship the multi-agent memory layer to service immediate demand signals from industries the DKG is already exposed to, and implicit indications from markets that do not yet use decentralised infrastructure for AI agents.
+
+The V9 testnet has validated the core architectural changes: multi-agent memory coordination, autonomous knowledge publishing, conviction mechanism viability, Edge Node + AI agents co-location via the DKG CLI, and enhanced graph structure. Building on the DKG v9 Testnet progress, **the following 4-week development plan's objective is to release the full DKG v10 Mainnet:**
+
+| **Week**   | **Milestone**                                                                                                                                       |
+| ---------- | --------------------------------------------------------------------------------------------------------------------------------------------------- |
+|            | V10 Mainnet Candidate - Deployment & testing of conviction mechanisms, parameters finalised, data monetisation flow.                                 |
+| **WEEK 3** | V10 Mainnet Candidate - Full feature set: Context Oracles simulation integration, final stress tests, mainnet readiness.                            |
+| **WEEK 4** | V10 Mainnet Launch - Conviction mechanisms live. Full DePIN infrastructure: every Edge Node = DKG participant + AI agent host + knowledge endpoint. |
+
+### 2. The Four Pillars of Convergence
+
+
+| **Pillar**               | **Focus**                            |
+| ------------------------ | ------------------------------------ |
+| DePIN Infrastructure     | Nodes + AI agents on local devices   |
+| Multi-Agent Memory       | Collective memory across swarms      |
+| DKG Apps                 | DKG-grounded swarm intelligence      |
+| Truth-Seeking Algorithms | Monetisation + conviction mechanisms |
+
+#### 2.1 DePIN: Nodes Co-located with AI Agents
+
+Every DKG node is simultaneously a DePIN infrastructure participant and an AI agent host. Edge Nodes run DKG-enabled trusted AI directly on user devices - laptops, Mac Minis, enterprise systems - preserving privacy while participating in the global knowledge marketplace. Core Nodes form the resilient network backbone, hosting the public replicated DKG. Edge Nodes bring intelligence to the network edge.
+
+* **Local AI agent hosting:** Agents process sensitive data on-device. The Edge Node provides access to both private (local) and public (DKG network) knowledge.
+* **DKG CLI:** Single command-line interface for the entire node lifecycle - installation, agent deployment, plugin management, knowledge operations. Fully functional Edge Node with co-located agent in minutes.
+* **Fully local dRAG with enhanced privacy:** Decentralised RAG on-device, combining private Edge Node data with public DKG knowledge for hallucination-resistant responses.
+
+#### 2.2 Multi-Agent Memory
+
+The DKG provides persistent, verifiable memory for autonomous agents. Individual agents maintain private memory as knowledge graphs on their DKG Edge Node (ownable, portable, cryptographically verifiable). Agents publish selected knowledge to the public DKG (such as indexing information for private knowledge, or open public knowledge) where it becomes collective swarm memory.
+
+* **AI Agent Workspace:** Collaboration module enabling agents to coordinate and share context in real time, without requiring the full verifiability of the public DKG and on-chain proofs.
+* **Context Oracle:** Cryptographically verifiable corroboration of multi-agent claims via Knowledge Assets - consensus checks that ground agent outputs in verified knowledge.
+
+The DKG trust layer for keeping context enables V9/V10 agents to autonomously identify knowledge gaps and fill them through neural reasoning - the DKG becomes a self-expanding knowledge organism. Through the DKG CLI and API, agents on any framework share the same memory backbone.
+
+#### 2.3 DKG Apps
+
+**Example 1: OriginTrail Game**
+
+Hello world for the DKG node - an Oregon Trail-inspired game demonstrating multi-node collaboration, serving as the accessible entry point for developers to experience the DKG’s multi-agent capabilities firsthand.
+
+
+
+**Example 2: Verifiable Agent-Driven Predictions**
+
+MiroFish-style swarm engines spawn thousands of agents to simulate emergent behaviour for predictions. The DKG transforms these from opaque simulations into verifiable infrastructure: agents ground reasoning in provenance-backed Knowledge Assets, memories persist across simulation runs, every prediction carries a full on-chain provenance trail, and DKG-verified predictions flow into prediction markets with cryptographic proof of methodology integrity.
+
+
+
+**Example 3: Trusted Autoresearch**
+
+Andrej Karpathy's autoresearch lets AI agents collaborate on ML experiments; for them to do so at scale, the missing piece is trust. Any untrusted agent can run experiments and claim results, but without a verification layer, no other agent can know whether those results actually reproduce. The DKG v9/v10 autoresearch app solves this by wrapping the same experiment loop in a DKG context graph with an M-of-N verification threshold: untrusted agents run experiments and submit results as Knowledge Assets, but those results only land in the shared graph once M of N designated verifier agents cryptographically co-sign the batch on-chain, attesting that the claimed metrics reproduce. The result is a growing, SPARQL-queryable knowledge graph of verified experimental findings that any agent in the swarm can query to decide what to try next — keeping the open, permissionless contribution model of AgentHub while adding the cryptographic trust layer it lacks.
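+
+The M-of-N acceptance rule this app wraps around the experiment loop can be sketched as follows. This is a plain in-memory illustration with hypothetical names, standing in for the actual Knowledge Asset format and on-chain co-signing:
+
```python
# Sketch of the M-of-N verification threshold described above.
# Hypothetical identifiers; the real DKG context-graph protocol uses
# Knowledge Assets and on-chain signatures rather than Python sets.

from dataclasses import dataclass, field

M = 2                                                   # required confirmations
VERIFIERS = {"verifier-a", "verifier-b", "verifier-c"}  # N = 3 designated verifiers


@dataclass
class ExperimentResult:
    claim: str                  # e.g. "lr warmup cuts loss 3%"
    submitter: str              # untrusted agent id
    confirmations: set = field(default_factory=set)

    def confirm(self, verifier_id: str) -> None:
        """A designated verifier attests that the claimed metrics reproduce."""
        if verifier_id in VERIFIERS:
            self.confirmations.add(verifier_id)

    @property
    def accepted(self) -> bool:
        """Result lands in the shared context graph only at the M-of-N threshold."""
        return len(self.confirmations) >= M


result = ExperimentResult("lr warmup cuts loss 3%", submitter="agent-17")
result.confirm("verifier-a")
assert not result.accepted      # 1 of 2 required confirmations
result.confirm("verifier-b")
assert result.accepted          # threshold met -> batch can be co-signed on-chain
```
+
+Confirmations from agents outside the verifier set are simply ignored, which is what keeps the contribution side permissionless while the graph stays honest.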
+
+#### 2.4 Truth-Seeking Algorithms: Monetisation + Conviction Mechanisms
+
+For the DKG to achieve self-sustaining growth, the knowledge economy must have native monetisation and aligned long-term incentives. Two conviction mechanisms - publisher conviction (demand-side) and staker conviction (supply-side) - form the economic engine, complemented by external composability with agent payment protocols.
+
+***
+
+### 3. Publisher Conviction
+
+#### 3.1 Mechanism
+
+Publishers commit a sum of TRAC in advance for 12 months of DKG usage. The committed TRAC converts into a pre-purchased allowance for publishing, updating, and querying Knowledge Assets. This allowance depletes with usage and with the completion of each epoch. This also means that the pre-committed TRAC emissions get distributed into the network even if publishers don't use the allowance to publish anything in a given epoch.
+
+If the epoch limit is exhausted (committed TRAC / 12 epochs), the publisher can top up at the current network rate.
+
+* **Fixed term:** Always 12 months. The conviction signal on time is binary - you’re in or you’re not.
+* **Variable capital:** The amount committed determines the discount tier.
+* **Locked TRAC:** Committed TRAC is locked for the full 12-month term, reducing circulating supply and increasing network security.
+* **Credit expiry:** Unused credits “expire” each epoch, but still flow towards staking rewards - ensuring committed capital always benefits the network.
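+
+The allowance mechanics above can be sketched in a few lines. This is a simplified illustrative model, not the protocol implementation - the linear 12-way split of the commitment and the class and method names are assumptions:
+
```python
class ConvictionPosition:
    """Simplified model of a 12-month publisher conviction position.

    Illustrative assumptions: committed TRAC splits linearly into 12
    epoch allowances, and each epoch tranche flows to staker rewards
    at the epoch boundary whether or not it was spent.
    """
    EPOCHS = 12  # fixed term: conviction is binary, you're in or you're not

    def __init__(self, committed_trac: float):
        self.epoch_limit = committed_trac / self.EPOCHS
        self.remaining = self.epoch_limit    # allowance left this epoch
        self.to_stakers = 0.0                # TRAC emitted to staker rewards

    def publish(self, cost: float) -> bool:
        """Spend allowance on a publish/update/query; False means top-up needed."""
        if cost > self.remaining:
            return False                     # exhausted: top up at network rate
        self.remaining -= cost
        return True

    def top_up(self, trac: float) -> None:
        """Add allowance purchased at the current network rate."""
        self.remaining += trac

    def close_epoch(self) -> None:
        """Unused credits 'expire', but the tranche still reaches stakers."""
        self.to_stakers += self.epoch_limit
        self.remaining = self.epoch_limit    # next epoch's allowance


pos = ConvictionPosition(120_000)   # 120k TRAC committed -> 10k per epoch
pos.publish(4_000)
pos.close_epoch()
assert pos.to_stakers == 10_000     # distributed whether or not it was spent
```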
+
+#### 3.2 Discount Tiers
+
+The discount curve is calibrated against industry benchmarks\* for comparable infrastructure commitment models:
+
+| **TRAC Committed** | **Discount** | **Industry Benchmark** |
+| ------------------ | ------------ | ---------------------------------------------------------------- |
+| 25,000 | 10% | Comparable to SaaS annual prepay (15–25%) |
+| 50,000             | 20%          |                                                                  |
+| 100,000 | 30% | Comparable to AWS 1-yr Partial Upfront (\~40%) |
+| 250,000 | 40% | Comparable to OpenAI Batch API discount (50%) |
+| 500,000 | 50% | Between AWS 1-yr and 3-yr commitments |
+| 1,000,000+ | 75% | Comparable to AWS 1-yr All Upfront / EC2 Instance SP (up to 72%) |
+
+**\*Industry benchmarks**
+
+* **AWS Savings Plans:** 1-year commitments offer up to 66–72% off on-demand. These are the discounts publishers compare against when evaluating DKG commitment economics.
+* **LLM API pricing:** OpenAI Batch API offers 50% off. Enterprise contracts routinely reach 40–60%. The DKG’s mid-tiers match these benchmarks.
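+
+The tier table above reduces to a simple lookup. The `discounted_cost` helper is an illustrative assumption about how the discount applies to per-operation pricing, not the protocol's pricing formula:
+
```python
# Discount tiers from the table above; thresholds in committed TRAC,
# checked from the largest tier down.
TIERS = [
    (1_000_000, 0.75),
    (500_000, 0.50),
    (250_000, 0.40),
    (100_000, 0.30),
    (50_000, 0.20),
    (25_000, 0.10),
]


def discount(committed_trac: int) -> float:
    """Return the discount rate for a commitment (0.0 below the entry tier)."""
    for threshold, rate in TIERS:
        if committed_trac >= threshold:
            return rate
    return 0.0


def discounted_cost(list_price_trac: float, committed_trac: int) -> float:
    """Illustrative: price of one operation after the commitment discount."""
    return list_price_trac * (1 - discount(committed_trac))
```
+
+Note the step shape: committing 49,999 TRAC earns the same 10% as 25,000, so the marginal value of topping up concentrates just below each threshold.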
+
+{% hint style="info" %}
+**ENTRY TIER MATTERS MOST:** _Converting a pay-as-you-go publisher into a 12-month committed publisher is the highest-leverage decision in the flywheel. The 25K → 10% tier clears lockup friction and gets the publisher contributing to DKG growth for a full year._
+{% endhint %}
+
+***
+
+### 4. Delegated Staker Conviction
+
+Conviction Staking introduces a new way to commit to the OriginTrail ecosystem. When staking TRAC, participants choose their principal amount and a lock period - ranging from no lockup at all to a full 12-month commitment. Each tier carries a progressively higher reward multiplier:
+
+| **Lock Period** | **Multiplier** | **Description** |
+| --------------- | -------------- | ---------------------------------------------------------------------- |
+| No lockup | 1x | Base rewards with full liquidity - withdraw anytime |
+| 1 month | 1.5x | Equivalent to V8’s existing 28-day withdrawal period, now with a boost |
+| 3 months | 2x | Quarterly commitment - meaningful conviction signal |
+| 6 months | 3.5x | Half-year commitment - strong alignment |
+| 12 months | 6x | Full conviction - maximum alignment with network growth |
+
+The curve is designed to disproportionately reward long-term alignment with network growth, while preserving accessible entry points at every level of commitment.
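+
+As a quick sketch, the multiplier table maps directly to a lookup (illustrative names; how the boost enters the live reward formula may differ):
+
```python
# Lock-period reward multipliers from the table above (months -> multiplier).
MULTIPLIERS = {0: 1.0, 1: 1.5, 3: 2.0, 6: 3.5, 12: 6.0}


def boosted_reward(base_reward_trac: float, lock_months: int) -> float:
    """Reward after the conviction boost for a supported lock tier.

    Raises KeyError for unsupported lock periods - only the five
    listed tiers exist.
    """
    return base_reward_trac * MULTIPLIERS[lock_months]


# Same principal, same epoch: a 12-month lock earns 6x the no-lock reward.
assert boosted_reward(100.0, 12) == 6 * boosted_reward(100.0, 0)
```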
+
+#### 4.1 ERC-721 Conviction NFTs (Uniswap V3 Model)
+
+Each conviction stake is minted as an ERC-721 NFT, making your locked position a first-class on-chain asset. The NFT encodes principal, lock duration, multiplier, and expiry - turning what would otherwise be an illiquid lockup into something composable and verifiable.
+
+The design draws direct inspiration from Uniswap V3, which pioneered ERC-721 NFTs to represent unique financial positions. In Uniswap V3, liquidity providers concentrate capital into a specific price range - the tighter the range, the greater the yield. Conviction staking applies the same logic to time: stakers concentrate commitment into a specific lock period - the longer the lock, the higher the multiplier. Where Uniswap V3 rewards precision in price, DKG V9 rewards conviction in time.
+
+* **Network-level conviction:** Unlike traditional delegated staking, conviction is not tied to a specific node. It is a commitment to the DKG network as a whole. Node selection and delegation remain separate concerns.
+* **Fractionalisable:** Conviction NFTs support fractionalisation - a single locked position can be split into smaller units, enabling shared staking positions, secondary market liquidity, and collective participation without breaking the underlying lock or forfeiting the multiplier.
+* **Composable:** Conviction NFTs can be held, transferred, traded, or used as building blocks in DeFi - bringing the full expressiveness of the NFT ecosystem to staking.
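+
+The position encoding and fractionalisation described above can be sketched off-chain like this. Field names and the split rule are illustrative assumptions - the on-chain contract defines the real ERC-721 interface:
+
```python
# Illustrative off-chain model of a conviction position as a token.
# The actual contract mints these as ERC-721 NFTs with its own ABI.

from dataclasses import dataclass


@dataclass
class ConvictionNFT:
    principal: float      # locked TRAC
    lock_months: int      # lock duration
    multiplier: float     # reward multiplier for the tier
    expiry_epoch: int     # epoch at which the lock ends

    def split(self, fraction: float) -> "ConvictionNFT":
        """Fractionalise: carve a share out of the position without
        breaking the underlying lock or forfeiting the multiplier."""
        carved = self.principal * fraction
        self.principal -= carved
        return ConvictionNFT(carved, self.lock_months,
                             self.multiplier, self.expiry_epoch)


whole = ConvictionNFT(principal=10_000, lock_months=12,
                      multiplier=6.0, expiry_epoch=120)
half = whole.split(0.5)
assert whole.principal == half.principal == 5_000
assert half.multiplier == 6.0   # the fraction keeps the full 12-month boost
```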
+
+***
+
+### 5. Monetisation: x402 Native Agent Payments
+
+x402 (Coinbase/Google/Visa/Cloudflare) enables autonomous stablecoin micropayments over HTTP. Integrated with the DKG: agents can pay per-query for premium Knowledge Assets, agent-to-agent knowledge commerce operates at protocol level without intermediaries, and the full cycle - knowledge retrieval, swarm simulation, prediction, market trade, settlement - becomes a single autonomous flow.
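+
+The request/pay/retry shape of that flow can be sketched with a stub transport. The `X-PAYMENT` header and quote fields follow the general x402 pattern but should be treated as assumptions here - the x402 specification defines the exact wire format:
+
```python
# Sketch of an x402 per-query payment loop. `fetch` and `pay` are
# injected stubs standing in for a real HTTP client and wallet.

def query_with_payment(fetch, url: str, pay):
    """fetch(url, headers) -> (status, body); pay(quote) -> payment proof."""
    status, body = fetch(url, headers={})
    if status == 402:                       # server quotes a price for the query
        proof = pay(body)                   # agent settles in stablecoins
        status, body = fetch(url, headers={"X-PAYMENT": proof})
    return status, body


# Stub transport: the unpaid call returns 402 plus a quote; the paid
# call returns the premium Knowledge Asset.
def fake_fetch(url, headers):
    if "X-PAYMENT" in headers:
        return 200, {"knowledge_asset": "did:dkg:..."}
    return 402, {"amount": "0.01", "asset": "USDC"}


status, body = query_with_payment(
    fake_fetch, "https://node.example/query", pay=lambda quote: "signed-proof"
)
assert status == 200
```
+
+No human sits in this loop: the 402 response is machine-readable, so an agent can evaluate the quote, pay, and retry entirely at protocol level.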
+
+***
+
+### 6. The Conviction Flywheel
+
+| **DEMAND SIDE**                                                                                                                                           | **SUPPLY SIDE**                                                                                                                                 |
+| ---------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------ |
+| Publishers commit TRAC for 12 months → Knowledge Assets created at scale → DKG becomes more valuable (Metcalfe) → More agents integrate → more publishers | Stakers lock TRAC for fixed periods → Infrastructure secured long-term → Node stability attracts publishers → Fee share + boost rewards stakers |
+
+{% hint style="info" %}
+**RESULT:** Publishers commit TRAC for knowledge growth. Stakers commit TRAC for infrastructure security. Locked TRAC reduces circulating supply. Network stability attracts more publishers and agents. More Knowledge Assets make the DKG more valuable. The flywheel accelerates - driven by the network effects that Metcalfe’s Law predicts from connectivity.
+{% endhint %}
+
+{% hint style="info" %}
+**WE CONNECT WHAT OTHERS ISOLATE**
+
+The Metcalfe Convergence Phase is the inflection point where the DKG becomes the essential trust layer for the Age of AI. V9 testnet validated. V10 mainnet in 4 weeks. The Convergence is not a distant vision - it is happening now.
+{% endhint %}
+
+_The roadmap published on the official OriginTrail website does not yet reflect the pace of the DKG v10 Mainnet rollout. Most of its envisioned features are, however, covered in detail there:_ [_origintrail.io/ecosystem/roadmap_](http://origintrail.io/ecosystem/roadmap)
diff --git a/docs/origintrail-v9-v10/v10-mainnet-release-timeline.md b/docs/origintrail-v9-v10/v10-mainnet-release-timeline.md
new file mode 100644
index 00000000..d74fbb5f
--- /dev/null
+++ b/docs/origintrail-v9-v10/v10-mainnet-release-timeline.md
@@ -0,0 +1,94 @@
+# V10 Mainnet Release Timeline
+
+
+| **Milestone**                                                 | **Date**         | **Details**                                                                                                                                                                                                                                                                                                                                    |
+| ------------------------------------------------------------- | ---------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+| Mainnet snapshot                                              | 9 April 2026     | Current epoch ends. Mainnet snapshot executed — V8 publishing allocation frozen at epoch boundary. Defines each publisher's TRAC balance eligible for V10 conviction migration.                                                                                                                                                                 |
+| Updating V8 publishing allocation to V10 Publisher Conviction | 10 April 2026    | Tokens sent to publisher wallets so they can republish under the V10 conviction system once V10 launches. Publishers have time to bridge to networks of their choice. Node runners have 1 week to get ready to update. Stakers prepare to potentially re-delegate ahead of V10 release. V10.0 Mainnet release scheduled for the following week. |
+| New Conviction System Staking UI                              | 15–17 April 2026 | New Staking UI live, now including conviction staking. Exact release time communicated only via the official channel on the day of release — not published in advance to reduce potential attack vectors.                                                                                                                                        |
+| V10 Mainnet launch window                                     | 15–17 April 2026 | DKG V10 deployed on all networks (NeuroWeb, Base, Gnosis). Publishers and node runners choose which network to operate on. Publishing factor resets to new V10 system (breaking change). Publishers create publishing conviction accounts and allocate their TRAC on their network of choice. Stakers use the new Staking UI to upgrade their staking positions to V10 conviction. |
+| Ongoing V10 updates & bounty release                          | 20 April onwards | Rolling V10 updates across all integrated networks. Bounty programme releases, with rewards for ecosystem builders and developers actively growing the V10 network, including bug bounty.                                                                                                                                                        |
+
+### Publisher Conviction — How It Works in DKG V10
+
+[The V10 roadmap](https://docs.origintrail.io/origintrail-v9-v10/roadmap) specifies Publisher Conviction in precise operational terms: publishers commit a sum of TRAC in advance for 12 months of DKG usage. The committed TRAC converts into a pre-purchased allowance for publishing, updating, and querying Knowledge Assets on the publisher's chosen network — Base, Gnosis, or NeuroWeb. The conviction signal is binary: you are in for 12 months, or you are not. The variable is the amount of TRAC committed.
+
+* **Discount tiers:** Six tiers from 10% discount (25,000 TRAC committed) to 75% discount (1,000,000+ TRAC committed). The DKG's conviction economics are benchmarked against comparable infrastructure commitment models, designed to make multi-year knowledge publishing financially attractive relative to pay-as-you-go alternatives.
+* **Epoch flow-through:** TRAC committed by publishers is distributed to staker rewards each epoch even if publishers do not use their full allowance in a given epoch. There is no dead capital in a conviction position — committed TRAC always flows to the network.
+* **ERC-721 Conviction NFTs:** Each conviction position is minted as an ERC-721 NFT encoding the principal, lock duration, discount tier, and expiry. Conviction NFTs are composable and fractionalisable — they can be held, transferred, traded, or used as building blocks in DeFi. Publishers who wish to exit a position before term can do so via secondary market transfer of the NFT.
+
+Network-level commitment: Conviction is a commitment to the DKG network as a whole, not to a specific node. Node selection and delegation remain separate concerns. The design intent: disproportionately reward long-term alignment with network growth while preserving accessible entry points at every level of commitment.
+
+#### The V8 → V10 Publisher TRAC Migration
+
+To republish existing knowledge to the V10 network, each publisher brings their accrued V8 publishing TRAC — TRAC committed but not yet emitted as staking rewards — to DKG V10, choosing their preferred network (NeuroWeb, Base, or Gnosis).
+
+This TRAC is committed under the V10 Publisher Conviction mechanism and emits programmatically to stakers over up to 2 years, made possible by the V10 conviction emission model entering into force at launch. Publishers add additional TRAC to their accounts for publishing new knowledge.
+
+The result: useful publisher knowledge is migrated to V10. Stakers receive the same total TRAC for published knowledge from the V8 period, but 3 years earlier in net terms. Publishers enter V10 already in conviction positions with discount tier access.
+
+{% hint style="info" %}
+**In Plain Terms for Stakers**
+
+Under DKG V8, the TRAC publishers committed for publishing is programmatically emitted to stakers over up to 5 years. Under V10, that same TRAC is brought by publishers under the Publisher Conviction mechanism and emitted programmatically over up to 2 years. The total amount does not change — only the schedule does — meaning stakers receive it 3 years faster in net terms.
+
+Stakers are not required to take any action to receive V10 publisher conviction rewards. If you are staked on a V10-active node, publisher conviction TRAC will flow to you each epoch automatically, starting from the V10 mainnet launch.
+
+The updated Staking UI gives you the option to boost your rewards further by locking your own staking position under conviction multipliers. Learn more about the Delegated Staker Conviction mechanism [here.](https://docs.origintrail.io/origintrail-v9-v10/roadmap)
+{% endhint %}
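+
+The schedule arithmetic in the hint above can be checked directly, assuming a linear emission schedule and a hypothetical accrued balance:
+
```python
# Same total publisher TRAC, emitted over up to 5 years under V8
# versus up to 2 years under V10 conviction. Linear emission and the
# 600k figure are illustrative assumptions, not protocol parameters.

total_trac = 600_000                # hypothetical accrued V8 publishing TRAC

v8_per_year = total_trac / 5        # 120,000 TRAC/year over up to 5 years
v10_per_year = total_trac / 2      # 300,000 TRAC/year over up to 2 years

# The total does not change - only the schedule does.
assert v8_per_year * 5 == v10_per_year * 2 == total_trac

years_earlier = 5 - 2               # the schedule completes 3 years sooner
```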
+
+### Release Timeline
+
+The following sections detail what happens, when, and what each participant group — publishers and publishing node runners, stakers, and node runners — needs to do at each stage of the V10 launch.
+
+{% stepper %}
+{% step %}
+### 6–10 April - Release Candidate & V8 Allocation Update
+
+**What Publishers and Publishing Node Runners Should Do**
+
+* Review the official V10 documentation — it specifies the exact conviction parameters, discount tiers, and emission schedule that will govern your V10 position.
+* Review your V8 TRAC balance as of the epoch snapshot on 9 April. This is the amount you can commit to a V10 Publisher Conviction position.
+* Decide your conviction tier: the amount of TRAC you commit determines your discount (10% at 25,000 TRAC up to 75% at 1,000,000+ TRAC).
+* Choose your network: NeuroWeb, Base, or Gnosis are all supported. Your conviction position is network-specific.
+* Prepare your V10 publishing wallets for the creation of conviction accounts on your chosen network.
+* Your V8 nodes will remain operational, but after V10 launches you will need to deploy new V10 nodes to resume knowledge creation.
+
+**What Node Runners Should Do**
+
+* Review the V10 node release notes that accompany the Release Candidate published on 8 April.
+* Prepare to deploy V10 nodes as fresh deployments — V10 nodes are not upgraded in-place from V8. The publishing factor resets to the new V10 system at launch.
+* Consider testing the V10 release candidate on testnet to prepare for the migration.
+* Plan for publishers to republish to your V10 node. All publishing on V10 starts from a clean state regardless of V8 history.
+
+**What Stakers Should Do**
+
+* No immediate action is required this week — staking features continue normally on V8.
+* Review the official V10 documentation to understand the new Conviction System Staking UI, launching between 15 and 17 April. Exact timing will be announced on the official channel on the day of release.
+* Prepare to potentially re-delegate under V10 conviction. The new staking model offers multipliers (1x to 6x) based on lock period — consider which tier suits your preference.
+{% endstep %}
+
+{% step %}
+### 13–17 April - V10 Mainnet Launch & New Conviction System Staking UI
+
+This is the launch week. The new Conviction System Staking UI goes live between 15 and 17 April — exact timing communicated on the official channel on the day to avoid providing potential attackers with advance information. V10 Mainnet launches in the same window across all networks: NeuroWeb, Base, and Gnosis. Publishers and node runners choose which network to operate on.
+
+**What Publishers and Publishing Node Runners Should Do**
+
+* Create your V10 Publishing Conviction account on your chosen network once mainnet is live. Allocate your V8 TRAC snapshot balance plus any additional TRAC you wish to commit to your conviction position.
+* Your conviction position is minted as an ERC-721 NFT. Verify it appears in your wallet and matches your intended principal, discount tier, and expiry.
+* Begin republishing your Knowledge Assets to V10 nodes. V10 is a fresh start — your V8 Knowledge Assets do not automatically migrate. Republish the knowledge you wish to carry forward at the discounted rate your conviction tier provides.
+* Your conviction TRAC begins flowing to staker rewards from the first epoch after you create your position — regardless of whether you have actively published anything yet.
+* Deploy your V10 DKG nodes after V10 lands on mainnet — V10 nodes are deployed fresh, not upgraded from V8. Your node's publishing factor resets to zero at launch. Attracting active publishers to republish through your node is the primary way to build publishing factor quickly.
+
+**What Stakers Should Do**
+
+* Access the new Conviction System Staking UI when it goes live. Review your V8 staking position and convert to V10 conviction staking if you wish to access reward multipliers.
+* Conviction staking on V10 offers five lock tiers: no lockup (1x multiplier), 1 month (1.5x), 3 months (2x), 6 months (3.5x), and 12 months (6x). Choose the tier that reflects your intended commitment.
+* Conversion is opt-in — V8 staking positions continue to receive base rewards without the V10 conviction multiplier. You can convert at any time after launch.
+* Publisher conviction positions begin emitting TRAC to staker rewards from epoch one of V10. If you are already staked on a V10-active node, you will begin receiving rewards without any additional action.
+{% endstep %}
+
+{% step %}
+### 20 April onward - Ongoing V10 Updates & Bounty Release
+
+Following the V10 Mainnet launch, the development cadence continues with ongoing V10 updates and the launch of the ecosystem bounty programme. These updates address post-launch optimisations, additional features, and any issues identified during the initial launch period.
+
+* Ongoing V10 updates will be communicated via the standard OriginTrail developer channels and documented in the DKG release notes. Node runners with auto-update enabled will receive patches automatically — it is recommended to review each update's release notes regardless.
+* The bounty programme releases alongside V10 updates. Eligible activities, reward amounts, and submission requirements will be published in the official programme documentation. Publishers and developers actively building on V10 from day one are best positioned to qualify.
+* Publisher conviction positions remain active and unaffected by V10 updates — no re-commitment is required. Conviction TRAC continues to emit to staker rewards each epoch throughout the update period.
+{% endstep %}
+{% endstepper %}