
Lambda

A100 SXM4 | 40 GB VRAM | gpu_1x_a100_sxm4

Explore Lambda's A100 cloud instance specifications and benchmarks. Compare hardware configurations and performance metrics to optimize your AI and ML workloads.

Hardware Specifications

GPU Configuration
  GPU Type: A100
  GPU Interconnect: SXM4
  GPU Model Name: NVIDIA A100-SXM4-40GB
  Driver Version: 535.129.03
  GPU VRAM: 40 GB
  Power Limit: 400 W
  GPU Temperature: 31 °C
  GPU Clock Speed: 210 MHz
  Memory Clock Speed: 1215 MHz
  Pstate: P0

CPU Configuration
  Model Name: AMD EPYC 7J13 64-Core Processor
  Vendor ID: AuthenticAMD
  CPUs: 30
  CPU Clock Speed: 4899.99 MHz
  Threads Per Core: 1
  Cores Per Socket: 1
  Sockets: 30

Memory
  Total: 216 GB
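
The GPU rows above are the values NVML/nvidia-smi reports for the instance. Here is a minimal sketch for reproducing them from inside the VM, assuming the nvidia-ml-py (pynvml) bindings are available; temperature and clock readings vary with load, so expect them to differ from the idle figures in the table:

```python
# Query the GPU properties listed above via NVML (pip install nvidia-ml-py).
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlSystemGetDriverVersion,
    nvmlDeviceGetHandleByIndex, nvmlDeviceGetName, nvmlDeviceGetMemoryInfo,
    nvmlDeviceGetPowerManagementLimit, nvmlDeviceGetTemperature,
    nvmlDeviceGetClockInfo, NVML_TEMPERATURE_GPU, NVML_CLOCK_SM, NVML_CLOCK_MEM,
)

def as_str(v):
    # Older pynvml releases return bytes rather than str.
    return v.decode() if isinstance(v, bytes) else v

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)  # single-GPU instance, so index 0
    print("GPU Model Name:     ", as_str(nvmlDeviceGetName(handle)))
    print("Driver Version:     ", as_str(nvmlSystemGetDriverVersion()))
    print("GPU VRAM (GB):      ", nvmlDeviceGetMemoryInfo(handle).total // 1024**3)
    print("Power Limit (W):    ", nvmlDeviceGetPowerManagementLimit(handle) / 1000)  # reported in mW
    print("GPU Temperature (C):", nvmlDeviceGetTemperature(handle, NVML_TEMPERATURE_GPU))
    print("GPU Clock (MHz):    ", nvmlDeviceGetClockInfo(handle, NVML_CLOCK_SM))
    print("Memory Clock (MHz): ", nvmlDeviceGetClockInfo(handle, NVML_CLOCK_MEM))
finally:
    nvmlShutdown()
```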

Disk Specifications

Storage
  Total: 512.00 GB

Available Disks

Disk 1
  Model: vda
  Size: 512 GB
  Type: HDD
  Mount Point: Unmounted
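
The rows above reflect what the Linux guest exposes for the vda volume. A minimal sketch, assuming the standard /sys/block and /proc/mounts interfaces, for listing block devices, their sizes, and whether anything is mounted on them:

```python
from pathlib import Path

# Device name (without /dev/) -> mount points, taken from /proc/mounts.
mounts: dict[str, list[str]] = {}
for line in Path("/proc/mounts").read_text().splitlines():
    device, mount_point = line.split()[:2]
    if device.startswith("/dev/"):
        mounts.setdefault(device.removeprefix("/dev/"), []).append(mount_point)

# Walk /sys/block for whole-disk devices (e.g. vda) and report size, type, mounts.
for dev in sorted(Path("/sys/block").iterdir()):
    size_gb = int((dev / "size").read_text()) * 512 / 1e9  # size is in 512-byte sectors
    rotational = (dev / "queue" / "rotational").read_text().strip() == "1"
    # A mounted partition such as vda1 counts as the parent disk being in use.
    mounted_at = [mp for name, mps in mounts.items() if name.startswith(dev.name) for mp in mps]
    print(f"{dev.name}: {size_gb:.2f} GB, {'HDD' if rotational else 'SSD'}, "
          f"mounted at {mounted_at or 'nowhere (unmounted)'}")
```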

Software Specifications

Software
  OS: Ubuntu
  OS Version: 22.04.3 LTS (Jammy Jellyfish)
  CUDA Driver: 12.2
  Docker Version: 24.0.7
  Python Version: 3.10.12

Benchmarks

ffmpeg: 158
CoreMark (iterations per sec): 29310.471
Llama 2 inference (tokens per sec): 44.25
TensorFlow MNIST training: 2.007
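
The Llama 2 row is a throughput figure (tokens generated per second). The exact model build, prompt, and serving stack behind it are not stated on this page, but the measurement itself is straightforward; a minimal sketch, where generate() is a hypothetical stand-in for whatever inference entry point is being benchmarked:

```python
import time

def tokens_per_second(generate, prompt: str, max_new_tokens: int = 256) -> float:
    """Time one generation call and return generated tokens per second.

    `generate` is a hypothetical stand-in for the real inference entry point
    (for example a wrapper around a transformers model.generate call); it is
    assumed to return the number of tokens it produced.
    """
    start = time.perf_counter()
    n_tokens = generate(prompt, max_new_tokens=max_new_tokens)
    elapsed = time.perf_counter() - start
    return n_tokens / elapsed

if __name__ == "__main__":
    # Dummy generator standing in for a real Llama 2 backend, for illustration only.
    def fake_generate(prompt, max_new_tokens=256):
        time.sleep(0.1)  # pretend to do work
        return max_new_tokens

    print(f"{tokens_per_second(fake_generate, 'Hello'):.2f} tokens/sec")
```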

nvidia-smi output

nvidia-smi topo -m output

Launch instance

Cloud: Lambda
GPU Type: A100
Shadeform Instance Type: A100_sxm4
Cloud Instance Type: gpu_1x_a100_sxm4
Spin Up Time: 5-10 mins
Hourly Price: $1.29
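
Instances can also be launched programmatically. Below is a minimal sketch using the requests library; the endpoint URL, header name, and payload field names are assumptions based on a typical REST create call, not a confirmed schema, so check Shadeform's API documentation for the exact request format:

```python
# Launching this instance type via a REST call (assumed endpoint and payload shape).
import os
import requests

API_KEY = os.environ["SHADEFORM_API_KEY"]  # hypothetical environment variable

payload = {
    "cloud": "lambdalabs",                  # values taken from the table above
    "shade_instance_type": "A100_sxm4",
    "cloud_instance_type": "gpu_1x_a100_sxm4",
    "name": "a100-sxm4-demo",               # illustrative instance name
}

resp = requests.post(
    "https://api.shadeform.ai/v1/instances/create",  # assumed endpoint path
    headers={"X-API-KEY": API_KEY},
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # instance id, status, etc.
```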
