Ideal for AI/Deep Learning training, Visualisation and HPC, this midrange AI offering from Boston features the power of up to 4 NVIDIA A40 GPUs. The ANNA Ampere M1 supports PCI-E Gen 4 for fast CPU-to-GPU connectivity and high-speed networking expansion cards.
GPUDirect RDMA with 1:1 mapping between network interconnects and GPUs, together with 1+1 power redundancy, makes the system ideal for HPC and AI workloads. Designed with AI/Deep Learning and inference in mind, the ANNA Ampere M1 is an ideal choice for a midrange system on which to start implementing advanced AI solutions.
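As a minimal sketch of how you might confirm the 1:1 GPU-to-NIC pairing on a deployed system before enabling GPUDirect RDMA (assumptions: Python 3, the NVIDIA driver installed and nvidia-smi available on the PATH; the helper name is illustrative and not part of the product):

import subprocess

def show_gpu_nic_topology() -> None:
    # 'nvidia-smi topo -m' prints the interconnect matrix between GPUs and NICs.
    # Entries such as PIX/PXB indicate devices sharing a PCIe switch, which is
    # the condition the 1:1 GPU-to-NIC mapping on this class of system provides.
    result = subprocess.run(
        ["nvidia-smi", "topo", "-m"],
        capture_output=True,
        text=True,
        check=True,
    )
    print(result.stdout)

if __name__ == "__main__":
    show_gpu_nic_topology()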
Certified by NVIDIA for use with the NVIDIA AI Enterprise suite, with access to NGC.
Chipset
Intel C621A
Drive Bays
2 Front Hot-swap 2.5" U.2 NVMe Gen4
Drive Support
NVMe
1x M.2 NVMe (boot drive only)
Expansion Slots
1 PCI-E Gen 4 x8 AIOM networking slot
6 PCI-E Gen 4 x16 slots (4 internal, 2 external)
Form Factor
1U Rackmount
GPU Manufacturer
NVIDIA
GPU Quantity
Up to 4x NVIDIA A40 double-width GPUs
Manufacturer
Supermicro
Memory (Maximum)
Up to 4TB: 16x 256GB DRAM and 8x 512GB PMem
Memory Type
3200/2933/2666MHz ECC DDR4 RDIMM/LRDIMM
Intel® Optane™ persistent memory 200 series
Network Connectivity
1 RJ45 1GbE Dedicated IPMI Management Port
2x 10GBase-T ports
Power Supply
2000W Redundant Power Supplies with PMBus
To help our clients make informed decisions about new technologies, we have opened up our research and development facilities, and we actively encourage customers to try the latest platforms using their own tools and, where necessary, alongside their existing hardware. Remote access is also available.