Conversation

Notices

  • kaia (kaia@brotka.st)'s status on Sunday, 15-Oct-2023 05:46:06 JST
    @cohle can you share how you are using Stable Diffusion please :akko_please:
  • cohle (cohle@shitposter.club)'s status on Sunday, 15-Oct-2023 06:05:28 JST, in reply to kaia

      @kaia It used to be a bit harder, but I think now you basically just have to follow https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Install-and-Run-on-AMD-GPUs#running-natively

      You have to install a bunch of rocm packages; I have these installed:

      rocm-clang-ocl 5.6.1-1
      rocm-cmake 5.6.1-1
      rocm-core 5.6.1-1
      rocm-device-libs 5.6.1-1
      rocm-hip-libraries 5.6.1-1
      rocm-hip-runtime 5.6.1-1
      rocm-hip-sdk 5.6.1-1
      rocm-language-runtime 5.6.1-1
      rocm-llvm 5.6.1-1
      rocm-opencl-runtime 5.6.1-1
      rocm-smi-lib 5.6.1-1
      rocminfo 5.6.1-1
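
      One way to pull that set in on Arch, as a sketch (it assumes all of the packages above are available from the repos you have enabled):

      # install the ROCm packages listed above; --needed skips ones already installed
      sudo pacman -S --needed \
          rocm-clang-ocl rocm-cmake rocm-core rocm-device-libs \
          rocm-hip-libraries rocm-hip-runtime rocm-hip-sdk rocm-language-runtime \
          rocm-llvm rocm-opencl-runtime rocm-smi-lib rocminfo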

      I also have python 3.11.5 installed via the AUR, since Arch already upgraded to 3.12 but pytorch requires 3.11. So I created the venv with python3.11 -m venv to make sure it used 3.11 for the virtual environment. I also changed the rocm URL in the guide from https://download.pytorch.org/whl/rocm5.1.1 to https://download.pytorch.org/whl/rocm5.4.2 (apparently there's now already rocm5.6, but I haven't tried that).
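
      Put concretely, that venv setup looks roughly like this (a sketch; it assumes python3.11 from the AUR is on PATH and you're inside the stable-diffusion-webui checkout, and the pip line is the same torch install the launch script below uses):

      # create and activate a Python 3.11 virtual environment
      python3.11 -m venv venv
      source venv/bin/activate
      # install torch/torchvision built against ROCm 5.4.2
      pip install torch torchvision --extra-index-url https://download.pytorch.org/whl/rocm5.4.2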

      For launching I just use this shell script:

      #!/bin/bash
      source venv/bin/activate
      git pull
      export TORCH_COMMAND='pip install torch torchvision --extra-index-url https://download.pytorch.org/whl/rocm5.4.2'
      python launch.py --opt-sub-quad-attention --no-half-vae --api
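
      (TORCH_COMMAND is picked up from the environment by launch.py, so it has to be exported, or put on the same line as the python call, rather than set as a plain shell variable.)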

      Lastly I have these variables in /etc/environment

      HSA_OVERRIDE_GFX_VERSION=10.3.0
      MIOPEN_DEBUG_COMGR_HIP_PCH_ENFORCE=0
      PATH=/opt/rocm/bin:/opt/rocm/llvm/bin
      LLVM_PATH=/opt/rocm/llvm
      ROCM_PATH=/opt/rocm

      I’m not sure if they’re still required though. You can check whether the virtual environment is set up correctly by just opening a python shell and then running

      import torch
      torch.cuda.is_available()
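
      The same check as a one-liner, with the device name as an extra sanity check (a sketch; on a ROCm build of torch the torch.cuda API drives the AMD GPU via HIP):

      # expect "True" followed by the GPU name; if the first line is False, the second call will error out
      python -c 'import torch; print(torch.cuda.is_available()); print(torch.cuda.get_device_name(0))'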

  • kaia (kaia@brotka.st)'s status on Sunday, 15-Oct-2023 06:05:33 JST, in reply to cohle
      @cohle thank you! I'll try!
  • cohle (cohle@shitposter.club)'s status on Sunday, 15-Oct-2023 06:36:21 JST, in reply to kaia
      @kaia good luck, you'll probably need it. I also just upgraded to rocm5.6 and it still works.
