@datarama@hachyderm.io @inthehands@hachyderm.io I'm sorry to jump in, but I share these kinds of feelings (and I like to believe there are a fair number of us who do). It's one of the many reasons I'm so negative about today's generative AI technology. I studied AI in graduate school, and technically I might have cashed in on this, but I purposely chose not to pursue a career at any of the FAANG companies or the major corporate research labs for precisely these reasons. I'd like to live to see the neoliberal era end, not grab its hand and help pull it further along.
Not public yet. I’ll share when it’s ready to share, but honestly, you probably don’t want it: it’s •exactly• what I need (or on its way to being that), so it almost certainly won’t be what you need.
The thing is currently 261 lines of Ruby, and I already like it better than every alternative I’ve tried.
Still missing a few key features (the big one is components / partials that have some way to bundle up their associated CSS + JS), so who knows, it might be 1000 lines of code in a few days here.
Or I might actually try to make it fast, which will increase the size by ~100x and have me rewriting it in Swift or something.
@inthehands My approach to understanding a thing tends to be comparing three examples, so "a personally fitted SSG" would be a welcome part of compare and contrast learning.
There are many derivative static site generators which don't acknowledge that they are "Jekyll*, implemented in [language], plus [shiny feature!]"
*For Jekyll, substitute Hugo or Pelican or Eleventy or...
@jmeowmeow Yeah, Jekyll, Hugo, Eleventy were examples I looked at, along with Astro and Nanoc. Three deal-killers:
1. They’re all soooo complicated; figuring out how to fit inside their abstractions and avoid their unwanted help ultimately proved more trouble than building from zero
2. Most of this site is migrating from a Rails app, so I wanted to stick with Ruby + Haml
(2.5 Javascript annoys me, as do most templating langs; this project is supposed to be fun, and JS isn’t fun for me)
@jmeowmeow 3. SSGs tend to have an attitude of “you want to write a bunch of web pages except in .md instead of .html,” but I want something that’s much more data-driven, with a clean data/presentation separation (borrowed heavily from Astro’s content collections approach here)
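(To make the "data-driven" point concrete, here's a minimal sketch, not the project's actual code, of generating one page per record in a data collection. It uses stdlib ERB as a stand-in for Haml, and the data/projects.yaml path, field names, and output layout are all invented for the example.)

```ruby
# Hypothetical sketch: pages come from records in a data file,
# not from one hand-written source file per page.
require "erb"
require "yaml"
require "fileutils"

# Stand-in template (the real project uses Haml); the fields are made up.
TEMPLATE = ERB.new(<<~HTML)
  <article>
    <h1><%= project["title"] %></h1>
    <p><%= project["summary"] %></p>
  </article>
HTML

# Assumed shape: data/projects.yaml is an array of hashes with string keys.
projects = YAML.safe_load(File.read("data/projects.yaml"))

FileUtils.mkdir_p("output/projects")
projects.each do |project|
  html = TEMPLATE.result_with_hash(project: project)
  File.write("output/projects/#{project['slug']}.html", html)
end
```

Adding a project then means adding a record to the data file, not hand-writing another page.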
Its essential features that might be relevant for naming are:
- Intentionally simple
- Data / presentation separation
- I am unreasonable for having written it at all, there are way too many SSGs out there already, just use an existing tool, Paul, what are you doing, do you have no sense at all
“Superfluous” it is! Thanks, @tehstu and @CptSuperlative, and to everyone who gave suggestions.
Working on it over holiday break, gradually rebuilding my personal site with it as I build up Superfluous itself. It’s…actually really good for my needs?! I still feel sheepish about building an SSG from scratch, which is clearly unnecessary, but it’s fun — and I’m a firm believer that fun is reason enough!
OK, tossing a software terminology question out to the Fedi. Context:
- Static site generator
- Top-level separation between (1) data and (2) how data gets formatted / processed
- (1) and (2) have user-visible names, are separate dirs in project root
- (1) is basically a plain text DB (md, json, yaml, etc; see the loading sketch below)
- (2) may include: templates, scripting, partials, style, images, other static assets
I’m calling (1) “data”. ••What should I call (2)?••
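(For concreteness, here's a rough sketch, definitely not the real implementation, of how (1) could be loaded: walk the data dir and parse each file by extension into one hash keyed by relative path. The "data" directory name comes from the list above; blog/hello.md is a hypothetical entry.)

```ruby
# Rough sketch of treating (1) as a plain text DB: parse by file extension,
# key by path relative to the data dir. Not the tool's actual code.
require "json"
require "yaml"

def load_data(dir = "data")
  Dir.glob("**/*", base: dir).each_with_object({}) do |rel, db|
    path = File.join(dir, rel)
    next unless File.file?(path)

    db[rel] =
      case File.extname(rel)
      when ".json"         then JSON.parse(File.read(path))
      when ".yaml", ".yml" then YAML.safe_load(File.read(path))
      else                      File.read(path)  # .md and friends stay raw text
      end
  end
end

db = load_data
db["blog/hello.md"]  # => raw Markdown string (hypothetical entry)
```

In that framing, (2) is just whatever turns that hash into pages.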
@mattly It needs to be a noun. “Presentation” is one idea I was considering. It’s tough: it’s not strictly theme-related; it’s more like “all the content of the site, some of which may be data-driven templates.” But that’s definitely one in the mix.
“Presentation” is the clear winner, and it’s working well in context — in the code, and more importantly from the tool user’s perspective. I think I just needed some communal courage to slough off the Unix-flavored precedent and make a dir name that long. Thanks, everyone!
UPDATE: I’ve been continuing to poke at this project in my spare time, and it’s going shockingly well.
I will probably eventually document and share Superfluous for those who are interested — but I’m also really enjoying the old-school DIY feeling of scratching my own itch, without any expectations of wide adoption or monetization. It’s nice. Feels like an antidote to several things at once.
@datarama I really hope you find a way to shed that AI brainworm that’s taken hold. It would bring me joy knowing that you’re building cool things just for the hell of it, even if I never get to see them!
@datarama Reminds me so much of how performing musicians felt about mechanical instruments (player piano) and then audio recording. For all the changes those things brought, even in the year 2024 people •still• love live performance, treasure it, seek it out — and more importantly, more people than ever are making music on their own, even though so many others could do more/better/whatever, in that 19th-century spirit of “piano is at the center of the home, is for the player first.”
@datarama The point of my post above about the 19th century is that your concert-centric vision (“The spectacle, the joy of seeing a skilled performer applying their skills, the social context”) is a very specific view of what music is for — not the only one, but so dominant currently that it’s hard for us to imagine the alternatives.
@inthehands (That is to say: since software serves a purpose *beyond* existing, and we can fulfill that purpose without programming the software, programming seems a lot more pointless than playing music - in music, the making of the music *is* the point; it is not a means towards that point.)
@inthehands The second is that music serves no purpose beyond whatever it evokes simply by existing (and before recording, it only existed in the moment). Software serves some purpose - whether that be presenting a game world to interact with, generating pictures, doing accounting work or whatnot. Programming is the act of making software exist, and if we can make software exist without programming, then programming deteriorates into meaninglessness to an extent that performing music doesn't.
@datarama There have been times and places when music needed no purpose beyond the music-maker’s experience.
Piano music of the era of Chopin, Brahms, Schubert, etc. — probably Beethoven too — was not really for concert halls. It was for homes. People played it at home, sometimes for the household, sometimes just for themselves, sometimes for guests, but sometimes in the wee hours of the night for whatever ears the night had. Music that needs no audience, as my late teacher said.
@datarama That mode of relating to music — a personal, quasi-private pursuit, centered on the home, and primarily for the player — was so pervasive in the piano’s history that there were once many hundreds of piano manufacturers in the US alone. “The big-screen TV of the 19th century,” I’ve heard it called. And the early 20th! It was the Great Depression that did it in — and radio to some extent, and to a larger extent shifting values & economics post-war.
@datarama I believe that this way of relating to creative work is now both radical and necessary, an antidote to several of the things that ail us now.
@datarama The radicalness of what I propose upthread is that Big Tech •can’t• destroy it, for the same reason that sequencers, no matter how good they are, no matter whether they can play things humans never could, still can’t replace the embodied pleasure of playing an instrument yourself.
@abucci I do not think the current generation of AI is going to fulfill the oligarchs’ wet dream of ruling the world.
I •do• think it’s important for us to start thinking now about how concentration of power and wealth interacts with technological progress.
@abucci @inthehands I think the logical end point of where AI is currently heading (provided it keeps getting better) is that the neoliberal era *will* end. It will then be replaced by the neo-feudal era; a world of ultra-wealthy oligarchs presiding over hordes of desperate serfs competing for work where humans are still cheaper than robots.
*Right fucking now*, think tanks are talking about how an AI can generate content for less energy than a human, given that humans need food and shelter.
@datarama @abucci Yeah, in the near term, I’m a lot more worried about the disruption caused by bad decisions prompted by AI hype / wishful thinking / misapplication / FOMO / panic than I am about the disruption caused by the tech itself actually being good at things.
As I've said elsewhere: I'm not worried that the robot cultists will actually build an electronic god, or fuck it up and accidentally build an electronic satan. I'm worried about all the collateral damage from an infinite firehose of corporate money pumping into robot cultists who *think* they're building an electronic god.
@datarama@hachyderm.io @inthehands@hachyderm.io Oh I fully agree. I apologize, I should have finished my thought: having a bunch of people with power thinking that this is possible is dangerous. I don't believe that their fantasy world will ever be real, but I absolutely believe they wouldn't hesitate to hurt an enormous number of people in the process of trying to make it real.
Oof, please don't get me started on academia...that's a week-long rant, minimum!
@abucci @inthehands I agree (and a lot of the publications on AI and society on arXiv read more like speculative fiction than like serious social science papers!).
But it's terribly disturbing that actual academics are signing their names to that sort of thing.
(And seeing Sam Altman talk about how he wants to "replace the median human" ... well, shit. I don't think they're anywhere near actually being *able* to do that, but we have an entire section of the tech industry *cheering* dystopia.)
I've read one of those "papers" and it was absurd and not credible, full of obvious reasoning errors. I fully believe that there are neoliberals and would-be barons who'd want nothing more than to have fully automated luxury capitalism, but it's a fantasy just as colonizing Mars is a fantasy.
@abucci @inthehands As I said, *right now* there are respectable academics writing papers about how AI will allow "energy saving" because GPT-4 consumes less energy while writing a page of bullshit than a human being needs (because human beings require food and shelter, and take longer).
The part left *unsaid* in calling this a "saving" is chilling to consider.
@inthehands@hachyderm.io I don't recall if I shared this in the thread yet, but I think Dan McQuillan's thinking on this subject is good: https://danmcquillan.org/ai_thatcherism.html The real issue is not only that AI doesn't work as advertised, but the impact it will have before this becomes painfully obvious to everyone. AI is being used as a form of 'shock doctrine', where the sense of urgency generated by an allegedly world-transforming technology is used as an opportunity to transform social systems without democratic debate. This shock doctrine process is clearly well underway.