Conversation

Notices

  • kaia (kaia@brotka.st), Tuesday, 24-Oct-2023 15:54:38 JST:
    looking for GPUs feels like checking on drug prices :puniko_wtf3:
  • kaia (kaia@brotka.st), Tuesday, 24-Oct-2023 16:19:41 JST, in reply to condret:
    @condret generating tomboys important AI research
  • condret :verified: :ancom: (condret@fedi.absturztau.be), Tuesday, 24-Oct-2023 16:19:50 JST:
    @kaia gpus for what purpose?
  • condret :verified: :ancom: (condret@fedi.absturztau.be), Tuesday, 24-Oct-2023 16:24:25 JST:
    @kaia as long as you don't want to heavily train models, it could work with some older gpu just fine, i believe.

    a friend runs some llama models on cpu.

    kaia likes this.
  • kaia (kaia@brotka.st), Tuesday, 24-Oct-2023 16:25:05 JST, in reply to condret:
    @condret
    llama is CPU though. I need cold hard CUDA cores for StableDiffusion
  • kaia (kaia@brotka.st), Tuesday, 24-Oct-2023 16:39:51 JST, in reply to condret:
    @condret yeah and it sucks so much :sadcat:
  • condret :verified: :ancom: (condret@fedi.absturztau.be), Tuesday, 24-Oct-2023 16:39:52 JST, in reply to condret:
    @kaia i mean https://github.com/ROCm-Developer-Tools/HIPIFY exists, so you don't have to necessarily rely on cuda apis

    Attachment: GitHub - ROCm-Developer-Tools/HIPIFY: Convert CUDA to Portable C++ Code

    kaia likes this.
  • condret :verified: :ancom: (condret@fedi.absturztau.be), Tuesday, 24-Oct-2023 16:39:54 JST:
    @kaia why cuda?
  • ロミンちゃん (romin@shitposter.club), Tuesday, 24-Oct-2023 16:46:02 JST, in reply to condret:
    @kaia @condret
    >llama is CPU though
    It isn't? If you meant llama.cpp, they've supported gpus for a while.

    kaia likes this.
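
A note on why kaia wants CUDA cores: Stable Diffusion image generation is normally driven through a GPU-backed framework such as PyTorch. Below is a minimal, hypothetical sketch using the Hugging Face diffusers library; the checkpoint name, prompt, and settings are placeholders, not anything stated in the thread.

    # A minimal sketch, not kaia's actual setup: text-to-image with Stable
    # Diffusion via the Hugging Face diffusers library on an NVIDIA GPU.
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",   # placeholder; any SD 1.x checkpoint
        torch_dtype=torch.float16,          # fp16 to fit consumer-card VRAM
    )
    pipe = pipe.to("cuda")                  # the step that wants those CUDA cores
    # On a ROCm build of PyTorch the same "cuda" device string targets AMD GPUs,
    # which is roughly condret's point about not having to rely on CUDA APIs.

    image = pipe("a tomboy assembling a graphics card",
                 num_inference_steps=25).images[0]
    image.save("out.png")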
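
On the llama side, condret's friend running models on the CPU and ロミンちゃん's point that llama.cpp has had GPU support for a while come down to one knob in the llama-cpp-python bindings. A minimal sketch, assuming a local GGUF model file; the path and layer count are placeholders.

    # llama.cpp inference through the llama-cpp-python bindings.
    # n_gpu_layers=0 keeps inference entirely on the CPU; a positive value
    # offloads that many transformer layers to the GPU, provided the library
    # was built with GPU support (cuBLAS at the time of this thread).
    from llama_cpp import Llama

    llm = Llama(
        model_path="./models/llama-2-7b.Q4_K_M.gguf",  # placeholder GGUF file
        n_gpu_layers=0,   # 0 = pure CPU; try e.g. 35 to offload a 7B model
    )

    out = llm("Q: Why are GPU prices so high? A:", max_tokens=64)
    print(out["choices"][0]["text"])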
