From 58520a1f7534627f75be92216db4913dd3720390 Mon Sep 17 00:00:00 2001
From: oobabooga <112222186+oobabooga@users.noreply.github.com>
Date: Mon, 20 Feb 2023 12:44:31 -0300
Subject: [PATCH] Update README.md

---
 README.md | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/README.md b/README.md
index 6af36c1a..1dfe97c8 100644
--- a/README.md
+++ b/README.md
@@ -24,11 +24,11 @@ Its goal is to become the [AUTOMATIC1111/stable-diffusion-webui](https://github.
 * Load large models in 8-bit mode ([see here](https://github.com/oobabooga/text-generation-webui/issues/20#issuecomment-1411650652) if you are on Windows).
 * Split large models across your GPU(s), CPU, and disk.
 * CPU mode.
-* DeepSpeed ZeRO-3 offload ([guide](https://github.com/oobabooga/text-generation-webui/wiki/DeepSpeed)).
-* Get responses via API.
+* [DeepSpeed ZeRO-3 offload](https://github.com/oobabooga/text-generation-webui/wiki/DeepSpeed).
+* [Get responses via API](https://github.com/oobabooga/text-generation-webui/blob/main/api-example.py).
 * Supports softprompts.
-* Supports extensions ([guide](https://github.com/oobabooga/text-generation-webui/wiki/Extensions)).
-* Works on Google Colab ([guide](https://github.com/oobabooga/text-generation-webui/wiki/Running-on-Colab)).
+* [Supports extensions](https://github.com/oobabooga/text-generation-webui/wiki/Extensions).
+* [Works on Google Colab](https://github.com/oobabooga/text-generation-webui/wiki/Running-on-Colab).
 
 ## Installation option 1: conda