Architecture

The diagram above shows how SlinkyLayer moves raw market data through model training and out to live trading signals, all under a single control plane. Each block is summarised below.

| # | Component | Purpose | Key Tech |
|---|-----------|---------|----------|
| 1 | Front End | Browser UI and REST / GraphQL API used to launch jobs, view metrics, and stake on models. | React + TypeScript, WalletConnect |
| 2 | Control Plane | Central service that authenticates users, stores configs, schedules jobs, and writes immutable run IDs to the registry. | Kubernetes, Postgres, gRPC |
| 3 | Data Lake | Holds raw OHLCV and tick data fetched from Binance, Coinbase, Bybit, and KuCoin, partitioned by symbol and timeframe for fast reads. | Parquet on object storage, daily hash checks |
| 4 | Feature Store | Pre-computes technical indicators and scalers. Each version is addressed by dataHash + codeHash + params so every job can pin its exact feature set. | pandas, numpy, joblib |
| 5 | Training Cluster | Distributed RL engine: CPU worker pods collect roll-outs (n_envs × n_steps), a GPU learner pod applies gradients, checkpoints every k updates, and streams logs back to the UI. | Stable-Baselines3, PyTorch, Redis parameter store |
| 6 | Back-test Engine | Replays the trained policy on the test split with the same fee and slippage rules. Outputs an equity curve and risk metrics (Sharpe, Sortino, drawdown). | pandas, ffn, matplotlib |
| 7 | Model Registry | Stores private artifacts in the platform file store; pins public artifacts to IPFS. A smart-contract entry anchors modelCID, configCID, and the owner address, enabling transparent discovery and staking. | IPFS / Filecoin, Solidity registry |
| 8 | Signal Gateway | Turns the latest checkpoint into a lightweight runtime container and streams JSON signals (LONG_100, FLAT_0, SHORT_100). Supports WebSocket, webhook push, or Kafka topic pull. | FastAPI, NATS or Kafka |
| 9 | $SLINKY Staking & Voting | Token contract where users stake on public models. Votes influence ranking and route a share of usage fees to model creators and curators. | ERC-20 + staking module |
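The Feature Store's composite addressing (dataHash + codeHash + params) can be sketched as below. The helper name and the exact hash recipe are illustrative assumptions, not the platform's actual scheme; the point is that changing any ingredient yields a new, pinnable version ID:

```python
import hashlib
import json

def feature_version_id(data_hash: str, code_hash: str, params: dict) -> str:
    """Illustrative composite key over dataHash + codeHash + params.

    Params are serialised canonically (sorted keys, no whitespace) so the
    same logical config always hashes to the same ID; any change to the raw
    data, the indicator code, or a parameter produces a new version.
    """
    canonical_params = json.dumps(params, sort_keys=True, separators=(",", ":"))
    payload = f"{data_hash}:{code_hash}:{canonical_params}".encode()
    return hashlib.sha256(payload).hexdigest()

# Same inputs give the same ID; changing one parameter changes it.
v1 = feature_version_id("abc123", "def456", {"rsi_window": 14, "ema": 20})
v2 = feature_version_id("abc123", "def456", {"rsi_window": 21, "ema": 20})
assert v1 != v2
```

A job that records this ID alongside its run ID can later re-materialise exactly the feature table it trained on.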

4.1 End-to-End Flow

  1. Job Launch and Configuration. The trader opens the browser wizard; chooses the market pair, timeframe, train-test split, and reward style; and, if desired, overrides defaults such as learning rate or rollout length. The Control Plane stores the full JSON config, assigns a run ID, and queues the job.

  2. Data to Features Pipeline. Ingestion workers pull fresh candles from exchange APIs and write them to the Data Lake. A Feature Store task computes technical indicators and scalers, tags the table with a composite hash, and hands the feature CID to the scheduler.

  3. Model Training. The scheduler starts a Training Cluster: CPU rollout pods sample the feature table under the configured fee, slippage, and short-selling rules, while a GPU learner pod consumes the roll-outs, applies gradient updates, and produces checkpoints. Logs stream back to the UI in real time.

  4. Back-test and Metrics. After training, a back-test job replays the final policy on the held-out test range using the same environment settings. It writes an equity-curve CSV, a trade-ledger CSV, and risk metrics such as Sharpe, Sortino, and drawdown to the Model Registry.

  5. Artifact Registration. Private artifacts live in the platform file store. If the creator selects Public, the artifact is anchored in the on-chain registry contract under the creator’s address.

  6. Staking and Ranking (optional). Community members review public metrics, stake $SLINKY on models they trust, and vote. A ranking formula blends live performance with stake weight. Usage fees flow to both creators and curators.

  7. Live Signal Streaming. The Signal Gateway loads the latest checkpoint and streams timestamped actions (LONG_100, FLAT_0, SHORT_100) over WebSocket, webhook, or Kafka. Execution stays in the user's hands: bots, agent frameworks, or vault connectors consume the feed and place trades.
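A consumer of the step-7 feed might translate actions into target positions as sketched below. Only the action vocabulary (LONG_100, FLAT_0, SHORT_100) comes from the document; the message fields and the position mapping are illustrative assumptions:

```python
import json

# Hypothetical mapping from gateway actions to a target position expressed
# as a fraction of equity: +1.0 fully long, 0.0 flat, -1.0 fully short.
ACTION_TO_POSITION = {"LONG_100": 1.0, "FLAT_0": 0.0, "SHORT_100": -1.0}

def target_position(message: str) -> float:
    """Parse one JSON signal from the gateway and return the target position."""
    signal = json.loads(message)
    return ACTION_TO_POSITION[signal["action"]]

# Example message shape (fields other than "action" are assumed):
msg = '{"ts": "2024-01-01T00:00:00Z", "pair": "BTC/USDT", "action": "SHORT_100"}'
print(target_position(msg))  # -1.0
```

The execution layer (bot, agent framework, or vault connector) would then rebalance the account toward that target; how it does so stays entirely in the user's hands.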

4.2 Environment Parameters (defaults)

| Parameter | Value | Notes |
|-----------|-------|-------|
| Initial cash | 10,000 USD | Reference balance |
| Fee per side | 0.10 % | Adjusted by exchange |
| Slippage | 3 ticks | Synthetic spread |
| Short selling | ON | Borrow cost zero in spot sim |
| Episode mode | Full-split | Can switch to fixed-steps |
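To make the fee and slippage defaults concrete, here is a minimal sketch of how a simulated fill price might be computed. The function, the tick size, and the exact order of applying slippage and fees are illustrative assumptions; the real environment may differ:

```python
def fill_price(mid: float, side: str, tick_size: float,
               slippage_ticks: int = 3, fee_per_side: float = 0.001) -> float:
    """Effective per-unit cost of a fill under the default simulator settings.

    Price slips against the trader by `slippage_ticks` ticks (buys pay up,
    sells receive less), and the 0.10 % fee is charged on each side.
    """
    slip = slippage_ticks * tick_size
    if side == "buy":
        return (mid + slip) * (1 + fee_per_side)   # pay price plus fee
    if side == "sell":
        return (mid - slip) * (1 - fee_per_side)   # receive price minus fee
    raise ValueError(f"unknown side: {side}")

# Buying at a 50,000 USD mid with 0.5 USD ticks: 3 ticks slippage + 0.10 % fee.
print(round(fill_price(50_000.0, "buy", 0.5), 2))  # 50051.5
```

Because the fee is charged per side, a round trip at these defaults costs roughly 0.20 % plus six ticks of spread, which is what the back-test engine replays against the test split.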

SlinkyLayer keeps heavy compute on managed GPU nodes, ensures every artifact is hash-pinned for reproducibility, and links discovery and rewards through the $SLINKY token.
