Conversation

Notices

  1. mekka okereke :verified: (mekkaokereke@hachyderm.io)'s status on Friday, 01-Nov-2024 21:35:35 JST

    100%! We did it! 🏆🥇

    Great job everyone!

    https://www.geekwire.com/2024/ai-overwhelmingly-prefers-white-and-male-job-candidates-in-new-test-of-resume-screening-bias/

    • Rich Felker repeated this.
    • Tom Bellin :picardfacepalm: (tob@hachyderm.io)'s status on Friday, 01-Nov-2024 21:41:04 JST
      in reply to
      • mekka okereke :verified:

      @mekkaokereke I'm sure this is an easy fix. Maybe put "don't be racist" in the prompt.

    • mekka okereke :verified: (mekkaokereke@hachyderm.io)'s status on Friday, 01-Nov-2024 21:45:55 JST
      in reply to
      • Tom Bellin :picardfacepalm:

      @tob 🤣

    • Rob Ricci (ricci@discuss.systems)'s status on Friday, 01-Nov-2024 21:49:13 JST
      in reply to
      • mekka okereke :verified:

      @mekkaokereke Turing Test but for racism

    • Dana Fried (tess@mastodon.social)'s status on Friday, 01-Nov-2024 23:53:34 JST
      in reply to
      • mekka okereke :verified:

      @mekkaokereke didn't Amazon try this - and immediately shut it down for the same reason - like a decade ago?

    • mekka okereke :verified: (mekkaokereke@hachyderm.io)'s status on Saturday, 02-Nov-2024 00:00:41 JST
      in reply to
      • Dana Fried

      @tess

      No, that was completely different! 🤡

      That was a primitive model, using random forests or something, that accidentally quantified and exposed bias. Old school. Remedial. An inch above trying to make fire by rubbing sticks together, or slapping flint rocks together above a clump of moss. Basic.

      This is cutting edge LLMs, based on Transformer architectures... that accidentally quantified and exposed bias. High tech! Much innovation!

      The future is now(tm)!

    • Galbinus Caeli 🌯 (skiphuffman@astrodon.social)'s status on Saturday, 02-Nov-2024 00:09:32 JST
      in reply to
      • mekka okereke :verified:

      @mekkaokereke

      Generally unsurprising. Use biased data to produce a biased algorithm, get biased results.

      But I am surprised that "name" is included in the evaluation factors. I can see no use for including that.

    • Rich Felker repeated this.
    • Dieu (hllizi@hespere.de)'s status on Saturday, 02-Nov-2024 03:19:29 JST
      in reply to
      • Galbinus Caeli 🌯

      @SkipHuffman @mekkaokereke That's part of the secret sauce! If half your workforce is named "John", another John cannot hurt. And that's the kind of invaluable functionality that's promising to convince droves of CEOs they have hit pure gold.

