Per Axbom (axbom@axbom.me)'s status on Saturday, 17-Feb-2024 18:21:19 JST
The electricity and water use required by data centres is becoming a cause for concern. Iowa and Ireland are calling for moratoriums on new development projects. Microsoft's global water consumption grew 34% from 2021 to 2022. And estimates say that around 80% of content in data centres could be stuff we never use again, stored by default and consuming energy in perpetuity for no purpose at all.
Abstract: Always-listening devices like smart speakers, smartphones, and other voice-activated technologies create enough privacy problems even when working correctly. But these devices can also misinterpret what they hear, and thus accidentally record their surroundings without the consent of those they record, a phenomenon known as a 'false positive.' The privacy practices of device users add another complication: a recent study of individual privacy expectations regarding false positives by voice assistants shows that people tend to carefully consider the privacy preferences of those closest to them when deciding whether to subject them to the risk of accidental recording, but often disregard the preferences of others. Accidental recordings compound the failure of device owners to get consent from those around them: the companies collecting the recordings aren't obtaining the consent to record their subjects that the Federal Wiretap Act, state wiretapping laws, and consumer protection laws require, and are contravening the stringent privacy assurances that these companies generally provide. The laws governing surreptitious recordings also frequently rely on individual and societal expectations of privacy, which are warped by the justifiable resignation to privacy invasions that most people eventually acquire.
The result is a legal regime ill-adapted to always-listening devices, with companies frequently violating wiretapping and consumer protection laws, regulators failing to enforce them, and widespread privacy violations. Ubiquitous, accidental wiretaps in our homes, workplaces, and schools are just one more example of why consent-centric approaches cannot sufficiently protect our privacy, and policymakers must learn from those failures rather than doubling down on a failed model of privacy governance.
OpenAI: "Training chat models is not a clean industrial process. Different training runs even using the same datasets can produce models that are noticeably different in personality, writing style, refusal behavior, evaluation performance, and even political bias."
They can say 'refusal behavior' and get away with it.
"[AGI] interprets the Turing Test as an engineering prediction, arguing that the machine “learning” algorithms of today will naturally evolve as they increase in power to think subjectively like humans, including emotion, social skills, consciousness and so on. The claims that increasing computer power will eventually result in fundamental change are hard to justify on technical grounds, and some say this is like arguing that if we make aeroplanes fly fast enough, eventually one will lay an egg."
From the book Moral Codes - Designing Alternatives to AI, by Alan Blackwell
Paris Marx is joined by Tim Schwab to discuss why the story we hear about Bill Gates and the Bill and Melinda Gates Foundation doesn’t reflect their real impact on education and health around the world.
Tim Schwab is an investigative journalist and the author of The Bill Gates Problem: Reckoning with the Myth of the Good Billionaire.
Tim Schwab: "I mean, the problem is that this one side of reporting, it's ended up producing what is essentially misinformation. It's a lot of fictions really: this idea that Bill Gates is giving away all his money. That's not true. His personal wealth has nearly doubled during his tenure as a philanthropist. You go to the Gates Foundation's website, and you'll see lots of pictures of Black and Brown women and children, the so-called targets of the Gates Foundation's charitable giving. But if you follow the money, almost all of the foundation's charitable gifts go to rich nations like the United States, Switzerland, the United Kingdom. So, once you start to really peel back the layers, you realize that a lot of this sort of prevailing news coverage of Gates is telling a story that's not just one-sided, it's wrong."
"News anchors" can be of all ages and backgrounds, and there can be 50% women presenting. The men in control can of course also give each woman exactly "the personality they want" and decide what they wear.
The men will definitely be applauded for their diversity efforts.
When I say in my talks that much of this forward motion is bringing us back in time, this is the type of "innovation" I am referring to.
Stop. Take a step back. Look at this state of mind from a distance.
If this is how it is making people feel, how is it helpful?
Creating fear and self-doubt is a sales technique. Notice and acknowledge when people are doing this to you, even as they are listing "50 AI tools you should be using today!"
I believe that by learning to ignore unhelpful assertions and claims designed to break down – rather than build – your confidence, you can improve your wellbeing.
My podcast colleague @Beantin alerted me to how Spotify have started attaching automated transcripts to our episodes.
And of course the automated tool gets my name quite wrong: "Hello I'm Pat Axbul".
So I had to check. How can I change that? As it turns out, I'm not allowed to. According to Spotify: "It's currently not possible to edit your transcripts in Spotify for Podcasters."
In my case it's a small thing, but now imagine what harmful faulty text may make it into transcripts and be attributed to a person who has said nothing of the sort. And they are not allowed to change it.
Don't you just love living in this time and age where foresight is 0/0?
Imagine a computer taking 90 seconds to start up in the morning. You press the on switch and during the wait you naturally take a moment to exchange pleasantries with your colleagues, ask them about their families, hobbies or more immediate concerns. They reciprocate with a smile.
Fast forward two years and the same computer takes 7 seconds to start. You have barely enough time to sit down and adjust your seat.
Now there’s no need to talk to your colleagues. No need to smile. Because technology has just saved you so much time. Almost 6 hours per year. Per employee.
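The "almost 6 hours" figure checks out as a back-of-envelope calculation. A minimal sketch, assuming one boot per workday and roughly 250 workdays per year (neither figure is stated in the post):

```python
# Sanity check of the "almost 6 hours per year" claim.
# Assumptions (not from the post): one boot per workday, ~250 workdays/year.

old_boot_s = 90   # seconds to start up, before
new_boot_s = 7    # seconds to start up, after

workdays = 250    # assumed workdays per year

saved_per_day_s = old_boot_s - new_boot_s            # 83 seconds per day
saved_per_year_h = saved_per_day_s * workdays / 3600  # convert seconds to hours

print(f"Saved per year: {saved_per_year_h:.2f} hours")  # ≈ 5.76 hours
```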
A sympathy notice means that a union declares their intent to take conflict measures that support another union in their ongoing negotiations. The difference between a sympathy notice and a "normal" notice is that the supporting union is not directly involved in the ongoing negotiations.
This means a total of seven(!) unions in Sweden are now taking action against Tesla.
1) IF Metall: No workshop repair work
2) Transport: Blockade of ports (no car deliveries in the 4 ports of Malmö, Gothenburg, Trelleborg, and Södertälje)
3) Fastighets: No cleaning of premises
4) Seko: Post and delivery blockade
5) Elektrikerna: No electricity work or repair
6) Målarna: No paintwork on vehicles
7) ST: No mail or package deliveries
Background: Tesla is refusing to sign a collective agreement with their union, IF Metall. Metal workers at Tesla’s seven Swedish repair shops have been on strike since October 27. For context, around 90% of Swedish employees are covered by collective agreements. These agreements outline terms of pay, pensions, and working conditions.
IF Metall has been working on getting Tesla to sign a collective agreement with workers in its repair shops since 2018. Union representatives have said they are ready for a long strike, if deemed necessary.
By way of @garymarcus newsletter I was made aware of the following:
In a New York Times article on the self-driving company Cruise, which recently suspended its cars, some interesting figures were revealed:
”Half of Cruise’s 400 cars were in San Francisco when the driverless operations were stopped. Those vehicles were supported by a vast operations staff, with 1.5 workers per vehicle. The workers intervened to assist the company’s vehicles every 2.5 to five miles, according to two people familiar with its operations. In other words, they frequently had to do something to remotely control a car after receiving a cellular signal that it was having problems.”
That’s a human intervention every 4-8 kilometres. More and more people are becoming aware of how many people are involved in the development, maintenance and running of machine-learning models. It’s safe to assume that machine-controlled cars are no different.
Most of the world is talking about self-driving and autonomous as if those are apt descriptions of what is already happening. Reality begs to differ. I think we need words that better describe what is really going on, and for media (and evangelists) to stop parroting whatever the companies feed them.
Autonomous used to mean something. Let’s ask the companies what they intend for the words to mean, and urge them to disclose the number of humans involved in making something appear ”autonomous”.
In light of these numbers being talked about, Cruise CEO Vogt clarifies (on Hacker News) that Cruise AVs are remotely assisted 2-4% of the time on average.* Interestingly he also says: ”This is low enough already that there isn’t a huge cost benefit to optimizing much further.” He also goes on to say that they are intentionally overstaffed ”in order to handle localized bursts of RA demand”.
So maybe that’s what self-driving means.
—————
*Note that the numbers ”every 2.5 to 5 miles” and ”2-4% of the time” are not necessarily in conflict, especially in San Francisco.
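A rough sketch of why the two figures can coexist. All numbers here are illustrative assumptions, not from the article: a slow average city speed and a short remote-assist episode are enough to turn "every 2.5-5 miles" into roughly "2-4% of the time":

```python
# Hypothetical reconciliation of "one intervention every 2.5-5 miles"
# with "remotely assisted 2-4% of the time".
# Assumed, illustrative numbers (not from the article):

avg_speed_mph = 12      # assumed average speed in San Francisco traffic
assist_duration_s = 30  # assumed length of one remote-assist episode

for miles_between in (2.5, 5.0):
    drive_time_s = miles_between / avg_speed_mph * 3600  # seconds between assists
    assisted_fraction = assist_duration_s / drive_time_s  # approximate share of time
    print(f"every {miles_between} mi -> assisted {assisted_fraction:.0%} of the time")
```

With these assumptions, an intervention every 2.5 miles works out to about 4% of driving time, and every 5 miles to about 2%, which is why the two reported figures are not necessarily in conflict.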
LET ME KNOW what other terms you find have been invented or shifted to mean something else to obscure limited functionality. I may have to make a glossary. ”Hallucination” is for example another one of those for me.
It’s still unclear what this will include, so your guess is as good as mine, including what impact US regulation may have on the rest of the world.
What I do feel is becoming more and more clear is a growing need for organisations to adopt a well-defined role around the area of anti-discrimination oversight.
With increased use, and increased liability, organisations will have to be accountable for the discrimination that everyday use and output may proliferate.
« the Oct. 23 draft order calls for extensive new checks on the technology, directing agencies to set standards to ensure data privacy and cybersecurity, prevent discrimination, enforce fairness and also closely monitor the competitive landscape of a fast-growing industry »
According to leaked drafts, Biden’s order will also direct ”the Federal Trade Commission, for instance, to focus on anti-competitive behavior and consumer harms in the AI industry”.
You are not learning fast enough. You are not using the new tools enough. You are not publishing enough. You are not in nature enough. You are not caring enough. You are not being social enough. You are not putting away your phone enough. You are not practicing enough. You are not recycling enough. You are not protesting enough. You are not planning enough. You are not moving quickly enough.
All of this is wrong. So very wrong. You are you. You matter. So much. And you are enough.
"AI is existing as it's supposed to exist," says McKernan. "I think it has had a lot of potential to make our lives easier, to make workflows more effective. My issue is that the implementation of it, especially with AI art, hasn't been ethical, in my opinion because of the way it is built off a massive data set with 5.1 billion images, and taxpayers' data, all of which was culled from the internet without consent."