Janky AI

How many GPUs do you want to cram into your box? Yes.

DeltaSqueezer

Jul 9, 2024

A custom case, or special single-slot server cards, to fit four GPUs? No need, we'll just squash them in there.

Congrats to stonedoubt for this tetris-like feat and great thermal density!

Behold my dumb sh*t 😂😂😂
by u/stonedoubt in LocalLLaMA

Notable LLMs that are Apache/MIT licensed


Sometimes you want LLMs that are unencumbered by non-commercial licenses. Below is a list of some notable LLMs that have friendly license agreements.

  • Mistral family
      • Mistral 7B, Mixtral 8x7B, Mixtral 8x22B
      • Mistral NeMo 12B, with quantization-aware training for good FP8 performance
  • Qwen2-0.5B, Qwen2-1.5B, Qwen2-7B, Qwen2-57B-A14B
Jul 18, 2024 1 min read
Reducing idle power consumption for Nvidia P100 and P40 GPUs


One overlooked aspect of GPU usage is the power they consume when idle. Idle power draw refers to the amount of electricity a GPU consumes when it's not performing intensive tasks. This can significantly impact both energy consumption and electricity costs over time. Without any tricks, a P40
Jul 14, 2024 5 min read
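To put idle draw in perspective, here is a quick back-of-the-envelope estimate. The idle wattage and electricity price below are illustrative assumptions, not measurements from the post:

```python
# Rough yearly cost of one GPU's idle power draw.
# Both figures are assumptions for illustration only.
idle_watts = 50          # assumed idle draw for one P40-class GPU
price_per_kwh = 0.30     # assumed electricity price in $/kWh

hours_per_year = 24 * 365
kwh_per_year = idle_watts * hours_per_year / 1000
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year ~= ${cost_per_year:.2f}/year")
```

At these assumed numbers, a single always-on card burns roughly 438 kWh a year, which is why idle draw is worth tuning on a multi-GPU box.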
Sometimes when you don't have 340 GB of VRAM


You just have to resort to running on your computer with 12 sticks of 32GB RAM!
Jul 12, 2024
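The arithmetic behind the headline, as a trivial sketch (340 GB is the figure from the title):

```python
# 12 sticks of 32 GB system RAM vs. a ~340 GB model footprint
ram_gb = 12 * 32
model_gb = 340
print(ram_gb, ram_gb >= model_gb)  # 384 True
```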