@annaleen Yes, thank you, you're right. The eugenics project, like "superintelligence", is all about producing beings who are "generally" "better" in every way. Acknowledging that this is impossible, and celebrating specific strengths instead of total "superiority", seems like a good way to avoid ending up like Bostrom.
Now that I think about it, many of the plot-relevant traits I gave my AIs, like immortality or a distributed brain, don't even require sentience.