# Architecture

<figure><img src="https://4263197242-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FtVM0qTwW1SmwvBaQwlXq%2Fuploads%2FEXkT73bPb7ROSbzJndNW%2Fslinky_architecture%20(1).png?alt=media&#x26;token=1275d662-2f3b-4e4c-b659-a6c5e7bc6b4d" alt=""><figcaption></figcaption></figure>

The diagram above shows how SlinkyLayer moves raw market data through model training and out to live trading signals, all under a single control plane. Each block is summarised below.

<table><thead><tr><th width="40">#</th><th width="169.29541015625">Component</th><th width="324.255615234375">Purpose</th><th>Key Tech</th></tr></thead><tbody><tr><td><strong>1</strong></td><td><strong>Front End</strong></td><td>Browser UI and REST / GraphQL API used to launch jobs, view metrics, and stake on models.</td><td>React + TypeScript, WalletConnect</td></tr><tr><td><strong>2</strong></td><td><strong>Control Plane</strong></td><td>Central service that authenticates users, stores configs, schedules jobs, and writes immutable run IDs to the registry.</td><td>Kubernetes, Postgres, gRPC</td></tr><tr><td><strong>3</strong></td><td><strong>Data Lake</strong></td><td>Holds raw OHLCV and tick data fetched from Binance, Coinbase, Bybit, and Kucoin. Partitions by symbol and timeframe for fast reads.</td><td>Parquet on object storage, daily hash checks</td></tr><tr><td><strong>4</strong></td><td><strong>Feature Store</strong></td><td>Pre-computes technical indicators and scalers. Each version is addressed by <code>dataHash + codeHash + params</code> so every job can pin its exact feature set.</td><td>pandas, numpy, joblib</td></tr><tr><td><strong>5</strong></td><td><strong>Training Cluster</strong></td><td>Distributed RL engine. CPU worker pods collect roll-outs (<code>n_envs × n_steps</code>), a GPU learner pod applies gradients, checkpoints every <em>k</em> updates, and streams logs back to the UI.</td><td>Stable-Baselines3, PyTorch, Redis parameter store</td></tr><tr><td><strong>6</strong></td><td><strong>Back-test Engine</strong></td><td>Replays the trained policy on the test split with the same fee and slippage rules. Outputs equity curve and risk metrics (Sharpe, Sortino, drawdown).</td><td>pandas, ffn, matplotlib</td></tr><tr><td><strong>7</strong></td><td><strong>Model Registry</strong></td><td>Stores private artifacts in the platform file store; pins public artifacts to IPFS. 
A smart-contract entry anchors <code>modelCID</code>, <code>configCID</code>, and owner address, enabling transparent discovery and staking.</td><td>IPFS / Filecoin, Solidity registry</td></tr><tr><td><strong>8</strong></td><td><strong>Signal Gateway</strong></td><td>Turns the latest checkpoint into a lightweight runtime container and streams JSON signals:<br><code>LONG_100</code>, <code>FLAT_0</code>, <code>SHORT_100</code>. Supports WebSocket, webhook push, or Kafka topic pull.</td><td>FastAPI, NATS or Kafka</td></tr><tr><td><strong>9</strong></td><td><strong>$SLINKY Staking &#x26; Voting</strong></td><td>Token contract where users stake on public models. Votes influence ranking and route a share of usage fees to model creators and curators.</td><td>ERC-20 + staking module</td></tr></tbody></table>
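The Feature Store (component 4) addresses each version by `dataHash + codeHash + params`. A minimal sketch of how such a composite address could be derived; the function name and the SHA-256 scheme are illustrative assumptions, not the platform's actual implementation:

```python
import hashlib
import json

def feature_version_id(data_hash: str, code_hash: str, params: dict) -> str:
    """Derive a deterministic feature-set address from dataHash + codeHash + params.

    Params are serialised with sorted keys so that logically identical
    configurations always map to the same version ID.
    """
    canonical_params = json.dumps(params, sort_keys=True)
    payload = f"{data_hash}:{code_hash}:{canonical_params}".encode()
    return hashlib.sha256(payload).hexdigest()
```

Because the ID is content-derived, any change to the raw data, the indicator code, or the parameters yields a new address, which is what lets a training job pin its exact feature set.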

## 4.1 End-to-End Flow

1. **Job Launch and Configuration**\
   The trader opens the browser wizard, chooses market pair, timeframe, train-test split, reward style, and, if desired, overrides defaults such as learning rate or rollout length. The Control Plane stores the full JSON config, assigns a run ID, and queues the job.
2. **Data to Features Pipeline**\
   Ingestion workers pull fresh candles from exchange APIs and write them to the Data Lake. A Feature Store task computes technical indicators and scalers, tags the table with a composite hash, and hands the feature CID to the scheduler.
3. **Model Training**\
   The scheduler starts a Training Cluster: CPU rollout pods sample the feature table under the configured fee, slippage, and short-selling rules, while a GPU learner pod consumes the roll-outs, applies gradients, and produces checkpoints. Logs stream back to the UI in real time.
4. **Back-test and Metrics**\
   After training, a back-test job replays the final policy on the held-out test range using the same environment settings. It writes equity curve CSV, trade ledger CSV, and risk metrics such as Sharpe, Sortino, and drawdown to the Model Registry.
5. **Artifact Registration**\
   Private artifacts live in the platform file store. If the creator selects Public, the artifact is anchored in the on-chain registry contract under the creator’s address.
6. **Staking and Ranking (optional)**\
   Community members review public metrics, stake $SLINKY on models they trust, and vote. A ranking formula blends live performance with stake weight. Usage fees flow to both creators and curators.
7. **Live Signal Streaming**\
   The Signal Gateway loads the latest checkpoint and streams timestamped actions (`LONG_100`, `FLAT_0`, `SHORT_100`) over WebSocket, webhook, or Kafka. Execution stays in user hands: bots, agent frameworks, or vault connectors consume the feed and place trades.
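A consumer of the Signal Gateway needs to map the action strings above to a target position. A minimal parsing sketch, assuming a JSON message shape like `{"ts": ..., "signal": "LONG_100"}` (the exact schema is an assumption):

```python
import json

def parse_signal(message: str) -> tuple[str, float]:
    """Parse a gateway message into (direction, target_exposure).

    Exposure is expressed in [-1.0, 1.0]: LONG_100 -> +1.0,
    SHORT_100 -> -1.0, FLAT_0 -> 0.0.
    """
    payload = json.loads(message)
    direction, pct = payload["signal"].rsplit("_", 1)
    exposure = int(pct) / 100.0
    if direction == "SHORT":
        exposure = -exposure
    elif direction == "FLAT":
        exposure = 0.0
    return direction, exposure
```

An execution bot would then resize its position toward the returned exposure rather than treating each message as a standalone order.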

## 4.2 Environment Parameters (defaults)

<table><thead><tr><th width="175.921875">Parameter</th><th width="244.7855224609375">Value</th><th>Notes</th></tr></thead><tbody><tr><td>Initial cash</td><td>10000 USD</td><td>Reference balance</td></tr><tr><td>Fee per side</td><td>0.10 %</td><td>Adjusted by exchange</td></tr><tr><td>Slippage</td><td>3 ticks</td><td>Synthetic spread</td></tr><tr><td>Short selling</td><td>ON</td><td>Borrow cost zero in spot sim</td></tr><tr><td>Episode mode</td><td>Full-split</td><td>Can switch to fixed-steps</td></tr></tbody></table>
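To make the defaults above concrete, a sketch of how the 0.10 % per-side fee and 3-tick synthetic slippage might combine into an effective fill price; the exact order in which the simulator applies fees and slippage is an assumption:

```python
def fill_price(mid: float, side: str, tick_size: float,
               fee_rate: float = 0.001, slippage_ticks: int = 3) -> float:
    """Apply per-side fee and synthetic slippage to a mid price.

    Buys fill above mid and pay the fee on top; sells fill below mid
    and have the fee deducted.
    """
    slip = slippage_ticks * tick_size
    if side == "buy":
        return (mid + slip) * (1 + fee_rate)
    return (mid - slip) * (1 - fee_rate)
```

Under these defaults a round trip costs roughly 0.2 % in fees plus six ticks of spread, which is why the back-test engine replays the same rules as training: a policy that ignores them looks profitable in simulation but not live.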

SlinkyLayer keeps heavy compute on managed GPU nodes, pins every artifact by hash for reproducibility, and links discovery and rewards through the $SLINKY token.
