Install and Run a Node

There are currently two types of OSMI nodes, both executable with a Node NFT license: memory (storage) nodes and inference nodes. Memory nodes handle storage needs for the OSMI ecosystem and do not require high-end computers. Inference nodes, on the other hand, have heavy GPU, CPU, and memory requirements and will likely be run by fewer participants in the OSMI network, thus receiving a greater portion of the OSMI rewards. In the future, daily inference token generation counts will be calculated and added to the node reward pool (i.e. "tokens for tokens").

OSMI Super Node

With one install package you can run any type of node. The OSMI Super Node is your main entry point into the OSMI node ecosystem and the only OSMI software you will need to install. By running this node software, you will be able to link your Node NFT and run any type of OSMI node. The super node determines which node workloads your system can run in the OSMI ecosystem based on its hardware specifications.

Linux

On Linux, you will need to enter these instructions using a terminal window:

# IMPORTANT: Keep the user service running when logged out
loginctl enable-linger $USER

# Install support software - you may need to use sudo
apt update && apt install -y unzip build-essential curl wget git

# Get and unzip osmi-node
wget https://links.osmi.ai/osmi-node_linux_amd64.zip
unzip osmi-node_linux_amd64.zip

# Install the software - you will be asked for your pairing code from the website
./osmi-node install

# When you get this prompt, use the code from the "Link" button on the node webpage
# Please paste your activation code and press enter to continue: <Your Code Here>

On some networks (typically home ISPs) you will need to open or port-forward a specific port for OSMI to run. The OSMI super node communicates with our servers over UDP/TCP on port 8670.
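On Ubuntu, the port can be opened with a few firewall commands. This is only a sketch: it assumes the ufw firewall (common on Ubuntu, but not guaranteed to be your setup), and the variable name OSMI_PORT is ours. Port-forwarding on a home router must still be configured in the router's own admin interface.

```shell
# Assumption: ufw firewall on Ubuntu; port 8670 is required by the OSMI super node.
OSMI_PORT=8670

# Only attempt firewall changes when ufw is present and we are root
if command -v ufw >/dev/null 2>&1 && [ "$(id -u)" -eq 0 ]; then
  ufw allow "${OSMI_PORT}/tcp"
  ufw allow "${OSMI_PORT}/udp"
fi
```

Once the node is running, `ss -lntup | grep 8670` can confirm that the software is actually listening on the port.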

Windows

For Windows PCs, download this installer and run it. The current installer is not code signed, so you may see a warning that the publisher is unknown. We will eventually enable code signing, but for now you can run the installer by clicking "Run anyway". After the installer runs, a terminal window will open asking for the activation code. This is the code you receive from the "Link" button on the node page. Copy and paste it into the window and you're all set!

Memory Nodes

Because of their lower hardware requirements, memory node workloads will be more widely run, so each node will receive a smaller share of node rewards as more nodes come online (the memory node reward pool is divided equally among all memory nodes).

Memory Nodes currently run on the Linux or Windows operating system and have the following recommended system requirements:

  • Ubuntu 22.04+

  • 4 or more CPU cores

  • 6GB or more memory

  • at least 100GB of hard drive space

The command-line instructions to install and run this node are in the installation sections above.
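To check whether a Linux machine meets the recommended memory-node specs, a rough sketch like the following can be used. The threshold numbers come from the list above; the helper name meets_spec is our own, and the disk check only looks at free space on the root filesystem.

```shell
# Compare this machine against the recommended memory-node specs (Linux only).
meets_spec() {  # usage: meets_spec <actual> <required>; succeeds when actual >= required
  [ "$1" -ge "$2" ]
}

CORES=$(nproc)
MEM_GB=$(( $(grep MemTotal /proc/meminfo | awk '{print $2}') / 1024 / 1024 ))
DISK_GB=$(( $(df --output=avail -k / | tail -n1) / 1024 / 1024 ))

meets_spec "$CORES" 4     && echo "CPU cores: OK ($CORES)"       || echo "CPU cores: LOW ($CORES)"
meets_spec "$MEM_GB" 6    && echo "Memory:    OK (${MEM_GB}GB)"  || echo "Memory:    LOW (${MEM_GB}GB)"
meets_spec "$DISK_GB" 100 && echo "Disk:      OK (${DISK_GB}GB)" || echo "Disk:      LOW (${DISK_GB}GB)"
```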

LLM/Inference Nodes (Coming Soon)

Inference nodes require much higher specifications than memory nodes, and running them involves a more thorough setup process plus custom verification software to ensure the nodes are performing properly. If interested, please contact us through our various communication channels. We will update this space with install instructions soon.

Preliminary Specifications:

Some preliminary specs for these nodes are as follows (these are only guidelines and will change):

  • Linux Ubuntu 22.04+

  • Nvidia-based GPU (3080 Ti or higher performance with 12GB VRAM)

  • 16GB+ memory

  • x64 CPU with 6+ cores

  • 250GB+ disk space
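A quick way to check the GPU guideline on Linux is to query the driver with nvidia-smi. This is only a sketch: the helper name vram_ok is ours, and the 12GB figure is the guideline listed above, not a hard requirement.

```shell
# Check total VRAM against the 12GB guideline using the Nvidia driver tools.
vram_ok() {  # usage: vram_ok <total VRAM in MiB>
  [ "$1" -ge $((12 * 1024)) ]
}

if command -v nvidia-smi >/dev/null 2>&1; then
  # memory.total is reported in MiB with these format flags
  VRAM_MB=$(nvidia-smi --query-gpu=memory.total --format=csv,noheader,nounits | head -n1)
  if vram_ok "$VRAM_MB"; then
    echo "GPU VRAM: OK (${VRAM_MB} MiB)"
  else
    echo "GPU VRAM: below the 12GB guideline (${VRAM_MB} MiB)"
  fi
else
  echo "nvidia-smi not found: no Nvidia driver detected"
fi
```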

Inference Node Distributions

Currently, inference nodes receive rewards from the same pool as memory nodes. Because they require much higher specifications and provide more compute power to the network, they will eventually receive distributions from their own pool and be eligible for "tokens for tokens" $OSMI rewards. In the future, token metrics will be recorded. In AI, tokens are the units of inference output (e.g. words are made up of tokens). With OSMI's API and product offerings, these tokens are generated by decentralized inference nodes and paid for by customers. When an inference node generates tokens, it qualifies for the daily reward of $OSMI tokens, hence "tokens for tokens". We also anticipate that fewer Node NFT owners will run systems powerful enough to do inference, so the daily reward distributions will be delivered to fewer Node NFTs. We anticipate this will be a self-balancing ecosystem among all the node workload types due to their computational needs.
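The equal-split mechanic can be illustrated with simple arithmetic. The pool size and node count below are made-up numbers for illustration only, not published OSMI figures:

```shell
# Illustrative only: rewards in a pool are divided equally among its nodes.
DAILY_POOL=1000     # hypothetical daily $OSMI reward pool for inference nodes
ACTIVE_NODES=25     # hypothetical number of qualifying inference nodes

PER_NODE=$(( DAILY_POOL / ACTIVE_NODES ))
echo "Each node earns ${PER_NODE} \$OSMI today"   # 1000 / 25 = 40
```

With fewer nodes in a pool, each node's share grows, which is why higher-spec inference nodes are expected to earn more per node.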

In the future, node software will automatically detect if a node meets the requirements to run inference and will serve the network as an inference node.
