
Lambda

A100 SXM4 | 40 GB VRAM | gpu_1x_a100_sxm4

Explore Lambda's A100 cloud instance specifications and benchmarks. Compare hardware configurations and performance metrics to optimize your AI and ML workloads.

Hardware Specifications

GPU Configuration
  GPU Type: A100
  GPU Interconnect: SXM4
  GPU Model Name: NVIDIA A100-SXM4-40GB
  Driver Version: 550.127.05
  GPU VRAM: 40 GB
  Power Limit: 400 W
  GPU Temperature: 34 °C
  GPU Clock Speed: 210 MHz
  Memory Clock Speed: 1215 MHz
  Pstate: P0

CPU Configuration
  Model Name: AMD EPYC 7J13 64-Core Processor
  Vendor ID: AuthenticAMD
  CPUs: 30
  CPU Clock Speed: 4899.99
  Threads Per Core: 1
  Cores Per Socket: 1
  Sockets: 30

Memory
  Total: 216 GB
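
The figures above can be reproduced on a running instance. The following is a minimal sketch, assuming nvidia-smi, lscpu, and free are on the PATH (standard on Lambda's Ubuntu images); it queries the same GPU, CPU, and memory fields shown in the tables.

    import subprocess

    def run(cmd):
        """Run a command and return its trimmed stdout."""
        return subprocess.run(cmd, capture_output=True, text=True, check=True).stdout.strip()

    # Fields matching the GPU Configuration table above
    gpu_fields = ("name,driver_version,memory.total,power.limit,"
                  "temperature.gpu,clocks.gr,clocks.mem,pstate")
    print("GPU:", run(["nvidia-smi", f"--query-gpu={gpu_fields}", "--format=csv,noheader"]))

    # CPU and memory figures (model name, socket/core counts, total RAM)
    print(run(["lscpu"]))
    print(run(["free", "-h"]))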

Disk Specifications

Total Storage: 958.00 GB

Available Disks

Disk 1
  Model: vda
  Size: 512 GB
  Type: HDD
  Mount Point: Unmounted

Disk 2
  Model: vdb
  Size: 446 KB
  Type: HDD
  Mount Point: Unmounted
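
The disk table maps directly onto the instance's virtio block devices (vda, vdb). A short sketch, assuming lsblk is available:

    import subprocess

    # List block devices with the same columns as the table above
    out = subprocess.run(
        ["lsblk", "--output", "NAME,SIZE,TYPE,MOUNTPOINT", "--noheadings"],
        capture_output=True, text=True, check=True,
    ).stdout
    print(out)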

Software Specifications

OS: Ubuntu
OS Version: 22.04.5 LTS (Jammy Jellyfish)
CUDA Driver: 12.4
Docker Version: 27.4.0
Python Version: 3.10.12
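
To confirm the listed software versions on an instance, a small hedged sketch follows; note that the CUDA driver version (12.4 above) appears in the header of plain nvidia-smi output rather than via a dedicated query field.

    import platform
    import subprocess

    def version(cmd):
        """Return a tool's version string, or a placeholder if it is missing."""
        try:
            return subprocess.run(cmd, capture_output=True, text=True, check=True).stdout.strip()
        except (OSError, subprocess.CalledProcessError):
            return "not found"

    print("OS:", version(["lsb_release", "-ds"]))        # e.g. Ubuntu 22.04.5 LTS
    print("Docker:", version(["docker", "--version"]))
    print("Python:", platform.python_version())

    # The nvidia-smi banner line reports both driver and CUDA versions
    banner = version(["nvidia-smi"])
    print(next((l.strip() for l in banner.splitlines() if "CUDA Version" in l), banner))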

Benchmarks

ffmpeg: 217 ms
Coremark: 29044.438 iterations/sec
llama2 Inference: 39.92 tokens/sec
TensorFlow MNIST Training: 1.986
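
The llama2 inference number is reported in tokens per second; the harness behind it is not described on this page. The snippet below is only a hedged sketch of how such a figure is typically measured with Hugging Face transformers (the model ID, prompt, and fp16 single-GPU setup are assumptions, not the benchmark's actual configuration).

    import time
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "meta-llama/Llama-2-7b-hf"   # placeholder; any causal LM works here
    tok = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16).to("cuda")

    inputs = tok("The A100 is", return_tensors="pt").to("cuda")
    start = time.perf_counter()
    out = model.generate(**inputs, max_new_tokens=128, do_sample=False)
    elapsed = time.perf_counter() - start

    new_tokens = out.shape[-1] - inputs["input_ids"].shape[-1]
    print(f"{new_tokens / elapsed:.2f} tokens/sec")  # compare against the 39.92 figure above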

Launch instance

Cloud: Lambda
GPU Type: A100
Shadeform Instance Type: A100 sxm4
Cloud Instance Type: gpu_1x_a100_sxm4
Spin Up Time: 5-10 mins
Hourly Price: $1.29
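
Instances of this type can also be launched programmatically. The sketch below assumes Lambda's public cloud API; the endpoint, payload field names, region, SSH key name, and the environment variable holding the API key are all assumptions, so check the provider's documentation before relying on them.

    import os
    import requests

    # Assumption: Lambda's public cloud API launch endpoint and payload shape
    API_KEY = os.environ["LAMBDA_API_KEY"]          # hypothetical env var name
    resp = requests.post(
        "https://cloud.lambdalabs.com/api/v1/instance-operations/launch",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "region_name": "us-east-1",             # placeholder region
            "instance_type_name": "gpu_1x_a100_sxm4",
            "ssh_key_names": ["my-key"],            # placeholder SSH key name
        },
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.json())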

