Complaints about how some regulation on #AI might "slow down innovation" are a bit weird.
Sure, maybe, but the pace of innovation isn't a priority when structuring a society and its legal environment: it's about protecting _people's_ rights, wellbeing and thriving, and about protecting the ecological, social and political environment that society rests on. Innovation is so far down that list that I wouldn't even have time to scroll down that far.
So yeah, if "you as the developer of an #AI system will be responsible to a certain degree for the abuse it's used for" is a problem for you, maybe do something useful with your time. Because of course you should be responsible for the tools you put out into the world, the affordances they come with, and the mitigations they lack. Grow up, for fuck's sake.
(sparked by Fei-Fei Li's bad take in Fortune Magazine: https://fortune.com/2024/08/06/godmother-of-ai-says-californias-ai-bill-will-harm-us-ecosystem-tech-politics/ )