Topic | Replies | Views | Activity
Llama.cpp and Ollama servers + plugins for VS Code / VS Codium and IntelliJ (AI) | 4 | 4694 | May 21, 2024
Setup of Multiple Discrete AMD GPUs in an Incus Container | 0 | 156 | June 29, 2024
Run Offline TTS with AMD GPU Acceleration in an Incus Container | 0 | 207 | June 25, 2024
Docker + Perplexica + Snap + Firefox in a GUI-enabled Incus Container (AI) | 7 | 327 | June 25, 2024
ROCm and PyTorch on AMD APU or GPU (AI) | 9 | 2813 | June 23, 2024
LLMs in LM Studio (AI) | 0 | 1197 | April 19, 2024
Stable Diffusion SDXL with Fooocus (AI) | 0 | 999 | April 19, 2024