COMPAL | NVIDIA GPU Platforms

Scale from NVIDIA MGX™ building blocks to NVIDIA HGX™-class training platforms, with air-cooled, immersion-ready, and ORv3 options.

Scale-up performance

From NVIDIA MGX™ GPU servers to NVIDIA HGX™-class systems for large-model training.

Deployment flexibility

EIA 19" and ORv3 options to match datacenter standards.

Thermal headroom

Air + immersion cooling options for dense GPU configurations.

SG231-2-L1

2U DLC NVIDIA Rubin NVL8 platform for next‑gen AI training

  • 8× NVIDIA Rubin NVL8 GPUs (SXM) with NVLink
  • 6th‑gen NVLink fabric for scale‑up multi‑GPU
  • 2U single‑phase DLC for dense, efficient AI compute

SGX30-2

8× NVIDIA Blackwell Ultra GPU server for extreme AI training

  • Up to 8× NVIDIA Blackwell Ultra GPUs
  • Dual Intel® Xeon® 6 processors for balanced CPU and I/O throughput
  • Integrated scale-up fabric: NVSwitch, NVLink, and PCIe

SX420-2A

4U NVIDIA MGX™ platform on AMD EPYC for flexible 8‑GPU AI

  • NVIDIA MGX™ reference platform on 2× AMD EPYC™ 9005
  • Up to 8× NVIDIA H200 NVL, L40S, or RTX PRO Blackwell GPUs
  • Configurable storage: up to 16× E1.S/E3.S or 12× U.2

OX420-2A

ORv3 NVIDIA MGX™ 4U accelerated server for hyperscale AI racks

  • OCP ORv3 form factor for 21" hyperscale racks
  • NVIDIA MGX™ with 2× AMD EPYC™ 9005 and 8-GPU configuration options
  • Front I/O and serviceability optimized for fleet deployment

SG323-2A-I

2U 8-GPU immersion-cooled server for sustained AI loads

  • 2× AMD EPYC™ 9004/9005 with up to 6TB memory
  • Up to 8× AMD Instinct™ MI210 or NVIDIA H100 NVL GPUs
  • Immersion cooling sustains GPU clocks under heavy loads

SG223-2A-I

2U 8-GPU immersion-cooled server for sustained AI loads

  • Immersion cooling maximizes GPU clocks under heavy loads
  • Up to 8× PCIe GPUs: AMD Instinct™ MI210 or NVIDIA H100 NVL
  • 2× AMD EPYC™ 9004/9005 with up to 6TB memory

SG231-2-L1
MODEL                     SG231-2-L1
WORKLOAD                  AI training, inference, and GPU acceleration
FORM FACTOR               2U
PROCESSOR                 2 × Intel® Xeon® Scalable (6th Gen)
GPU                       8 × NVIDIA Rubin NVL8 GPUs (SXM)
MAX MEMORY                8TB
DRIVE BAY CAPACITY (MAX)  8 × E1.S
COOLING                   DLC (single-phase)
POWER INPUT               DC busbar
RACK STANDARD             EIA 19"

SGX30-2
MODEL                     SGX30-2
WORKLOAD                  AI training, inference, and GPU acceleration
FORM FACTOR               10U
PROCESSOR                 2 × Intel® Xeon® Scalable (6th Gen)
GPU                       8 × NVIDIA Blackwell Ultra GPUs
MAX MEMORY                8TB
DRIVE BAY CAPACITY (MAX)  8 × E1.S + 2 × M.2
COOLING                   Air
MAX POWER (SYSTEM)        3200W
RACK STANDARD             EIA 19"

SX420-2A
MODEL                     SX420-2A
WORKLOAD                  AI training, inference, and GPU acceleration
FORM FACTOR               4U
PROCESSOR                 2 × AMD EPYC™ 9005
GPU                       8 × NVIDIA H200 NVL, L40S, RTX PRO™ 6000 Blackwell Server Edition, or RTX PRO™ 4500 Blackwell Server Edition
MAX MEMORY                8TB
DRIVE BAY CAPACITY (MAX)  16 × E1.S/E3.S or 12 × U.2, plus 2 × M.2
COOLING                   Air
MAX POWER (SYSTEM)        3200W
RACK STANDARD             EIA 19"

OX420-2A
MODEL                     OX420-2A
WORKLOAD                  AI training, inference, and GPU acceleration
FORM FACTOR               4U
PROCESSOR                 2 × AMD EPYC™ 9005
GPU                       8 × NVIDIA H200 NVL, L40S, or RTX PRO™ 6000 Blackwell Server Edition
MAX MEMORY                8TB
DRIVE BAY CAPACITY (MAX)  8 × E1.S + 2 × M.2
COOLING                   Air
MAX POWER (SYSTEM)        -
RACK STANDARD             OCP ORv3

SG323-2A-I
MODEL                     SG323-2A-I
WORKLOAD                  AI training, inference, and GPU acceleration
FORM FACTOR               2U
PROCESSOR                 2 × AMD EPYC™ 9004/9005
GPU                       8 × AMD Instinct™ MI210 or NVIDIA H100 NVL
MAX MEMORY                6TB
DRIVE BAY CAPACITY (MAX)  Up to 4 × E1.S + 2 × M.2
COOLING                   Immersion
MAX POWER (SYSTEM)        3200W
RACK STANDARD             EIA 19"

SG223-2A-I
MODEL                     SG223-2A-I
WORKLOAD                  AI training, inference, and GPU acceleration
FORM FACTOR               2U
PROCESSOR                 2 × AMD EPYC™ 9004/9005
GPU                       8 × AMD Instinct™ MI210 or NVIDIA H100 NVL
MAX MEMORY                6TB
DRIVE BAY CAPACITY (MAX)  Up to 4 × E1.S + 2 × M.2
COOLING                   Immersion
MAX POWER (SYSTEM)        3200W
RACK STANDARD             EIA 19"

SG220-2
MODEL                     SG220-2
WORKLOAD                  AI training, inference, and GPU acceleration
FORM FACTOR               2U
PROCESSOR                 2 × Intel® Xeon® Scalable (4th/5th Gen)
GPU                       4 × NVIDIA H100 NVL or AMD Instinct™ MI210
MAX MEMORY                4TB
DRIVE BAY CAPACITY (MAX)  8 × 2.5" + 2 × M.2
COOLING                   Air
MAX POWER (SYSTEM)        2400W (max PSU)
RACK STANDARD             EIA 19"

Service

Platform validation and configuration tuning
Rack integration readiness
Cooling advisory: air vs. immersion decision support

Get the latest on AI infrastructure

Contact