"Columbia targeted me for my activism, creating a new authoritarian disciplinary office to bypass due process & silence students criticizing Israel. Columbia surrendered to federal pressure by disclosing student records to Congress & yielding to the Trump administration's latest threats."
""If Chinese models become more advanced & more widely used by Americans, China could manipulate the models or ignore harms to American users from "illicit & harmful activities such as identity fraud & intellectual property theft," OpenAI alleged."
Lol but OpenAI on the other hand is shielding us from such activities. They're not arguing that THEY should be the one to commit those. Right?
The idea that you shouldn't be expected to know what data you're using to train your systems, and that doing so is "an impossible task," is so normalized that it's hard to remember that this was not always the case, even in the field of AI.
Data theft and scraping became completely normalized, along with the exploitation of crowdworkers, with the advent of photo-sharing and other platforms, and of services like Amazon Mechanical Turk.
So now NOT exploiting people and stealing data is the anomaly.
"California legislators have begun debating a bill (A.B. 412) that would require AI developers to track and disclose every registered copyrighted work used in AI training. At first glance, this might sound like a reasonable step toward transparency. But it’s an impossible standard that could crush small AI startups and developers while giving big tech firms even more power."
Whether you're a small restaurant or not, you have to ensure that you're not stealing your ingredients. So why is this any different?
What about the 1-2 person creative startups? Who is protecting their works in a society that devalues artists so much that "starving artist" is an expectation?
""In their policy recommendations, OpenAI made it clear that it thinks funneling as much data as possible to AI companies—regardless of rights holders' concerns—is the only path to global AI leadership."
I've always wondered how these cars ever passed any type of safety test. RIP to the students.
"The Highway Patrol’s investigation into a November Cybertruck crash in Piedmont where three college kids died is finding two very Tesla problems: the vehicle immediately caught fire, and its doors would not open."
On information retrieval and what we need for healthy information access systems, please read Chirag Shah & @emilymbender's paper: "Envisioning Information Access Systems: What Makes for Good Tools and a Healthy Web?" https://dl.acm.org/doi/full/10.1145/3649468
Rather than responding that they won't do anything without due process, Yale University preemptively complies by putting her on administrative leave.
Watch what is coming next for you, courtesy of so-called "AI powered news" sites targeting you & institutions who can't wait to comply because they don't think they're setting the stage for their own targeting. Anything goes if you're accused of being "a terrorist."
Good. Let's replace search with "AI" then! Totally logical if you ask me. Even more worth it when you know they're exponentially overtaking the airline industry in their carbon footprint.
Meanwhile at Harvard, nothing special, just invited leaders of genocidal countries joking about sending exploding pagers to dissenting students during their speech on campus.
Columbia University's OWN faculty were targeting Mahmoud Khalil. One of those faculty was, surprise surprise, Shai Davidai.
The university has faculty & students who physically & chemically attacked students, & now we are seeing what has happened to this particular student, who has been abducted.
@netopwibby Kind of like how the press does a post mortem on Elizabeth Holmes or others, like "WHAT HAPPENED"?
What happens, always, is that everyone ignores those who know and say who these people are, elevates these so-called "geniuses" to godlike status, and makes us look like we're crazy for constantly warning the public.
@netopwibby people assumed we were talking about some "niche" issues when we discussed how he treated his workers, his eugenicist ideals, how he treats women, etc. And when he started keynoting effective altruist conferences, that was back in 2013 or so. Shit, the press amplified his stupid "pause AI" paper just a couple of years ago, and the whole TIME cover, the SNL opening.
Everyone wants to be like "how did this happen?"
Well, no one wants to listen to us. That's been my story over & over.
@netopwibby There have been competitors for more than 10 years now. And he definitely didn't keep to himself. We've been yelling about him for at least 10 years, and within the last few years is where Tesla stock took off. In sum, people don't listen to Black women even though we always pay the price for speaking up. I know how many attacks I have fielded for the constant ways in which I had been yelling about him. I even spent my precious time writing peer reviewed papers about his ideologies.
@SimonCHulse There were many people who did hear us, and chose to ignore us. And it was obvious, just that they thought it wasn't THEM who were impacted and so they were gonna be fine, like when we talked about what he was doing to his Black workers.
Fired from Google for raising issues of discrimination in the workplace and writing about the dangers of large language models: https://www.wired.com/story/google-timnit-gebru-ai-what-really-happened/
Founded The Distributed AI Research Institute (https://www.dair-institute.org/) to work on community-rooted AI research.
Author of forthcoming book: The View from Somewhere, a memoir & manifesto arguing for a technological future that serves our communities (to be published by One Signal / Atria).