It just can’t. A conscious mind might be harder, but sentience is just a very low bar. It seems to me that there has to be some insight we’re missing, because, empirically, the problem cannot be particularly hard.
What’s the least-complicated (or least-intelligent, or least-conscious, or whatever) animal that is sentient? Certainly my cat is sentient. Rats and mice and so on seem obviously sentient as well. I’m not an animal expert, so let’s go with rats, and if you know more than I do, substitute a simpler animal if you can.
So rats are sentient. Think about that. Every single rat is sentient. Every single rat going back tens of thousands of years has been sentient. That’s a shitton of different sentient brains, none of them particularly complex! That’s billions of slightly different arrangements of neurons, some for the better and some for the worse. That’s thousands of rats we poke in the brain for Science™ that stay sentient anyway. Smart rats, stupid rats, diseased rats, rats with birth defects, rats suffering neurotoxicity, malnourished rats, overfed rats, and on and on. All those brains, assembled sloppily from wetware, every last one sentient.
I assume you accept that this at the very least establishes that sentience is a goddamn robust property. If you take a sentient brain and make any random smallish perturbation to it or to its design (in the case of genetic blips), it remains a sentient brain (or design therefor). Which implies that once we’ve got our hands on a brain design, designing new, different brains will be extremely easy (although directing that design towards useful ends may prove to be difficult).
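To make the robustness point concrete, here’s a toy sketch in Python (nothing about it is a claim about actual brains; the task, the “network”, and the noise scales are all invented for illustration): train a tiny classifier whose function is spread across many small weighted units, then randomly jolt every weight and watch how gracefully it degrades.

```python
# Toy illustration, NOT a model of brains: systems whose function is spread
# across many small weighted units tend to shrug off random perturbation.
# The task, network size, and noise scales below are all made up.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data: two Gaussian blobs in 20 dimensions.
n, d = 2000, 20
X = np.vstack([rng.normal(-1.0, 1.0, (n // 2, d)),
               rng.normal(+1.0, 1.0, (n // 2, d))])
y = np.array([0] * (n // 2) + [1] * (n // 2))

# Train a tiny "brain" (logistic regression by gradient descent).
w, b = np.zeros(d), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.1 * (X.T @ (p - y)) / n
    b -= 0.1 * (p - y).mean()

def accuracy(w, b):
    return float((((X @ w + b) > 0) == y).mean())

print("intact:", accuracy(w, b))

# Jolt every weight with random noise; function degrades slowly, not abruptly.
for noise in (0.01, 0.1, 0.5, 1.0):
    accs = [accuracy(w + rng.normal(0, noise, d), b) for _ in range(20)]
    print(f"noise sd {noise}: mean accuracy {np.mean(accs):.3f}")
```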
I aver that this also establishes that sentience is a low bar. Sentience doesn’t require anything particularly complicated, and it is extraordinarily robust to errors. If we imagine “things [like neurons] interacting with each other using signals [like neurotransmitters, or like electrons]”-space, the sub-section of that space that results in a conscious mind-thing is large. Mice have something like 71,000,000 neurons and some elephants have like 257,000,000,000 for some goddamn reason, and it seems extraordinarily unlikely that adding neurons would eliminate consciousness (most added neurons will simply be no-ops; they won’t break anything), so mice give us an upper bound of roughly 70 million neurons on what sentience requires. And way less than that if you accept that a frog (16,000,000) or a shrew (36,000,000) is sentient. And lower still if you accept that damaged individuals with fewer neurons are sentient. And lower than that if you think that many neurons are not critical to sentience (which also seems obvious; how hard do you have to poke a brain before it becomes non-sentient? Pretty fucking hard, it turns out.*)
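Here’s the bound-chasing as a back-of-envelope snippet. The counts are the whole-brain figures cited above; treating “smallest brain you accept as sentient” as an upper bound on what sentience requires is this post’s assumption, not established fact.

```python
# Back-of-envelope version of the argument above. Counts are the whole-brain
# figures cited in the text; which animals count as "sentient" is up to you.
neuron_counts = {
    "elephant": 257_000_000_000,
    "mouse":         71_000_000,
    "shrew":         36_000_000,
    "frog":          16_000_000,
}

# Every animal you accept as sentient bounds the requirement from above,
# so the binding bound is the smallest accepted brain.
accepted = ["mouse", "shrew", "frog"]  # adjust to taste
bound = min(neuron_counts[a] for a in accepted)
print(f"sentience needs at most ~{bound:,} neurons, on these assumptions")
```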
So, okay, let’s say… 20 million neurons is enough to create a brain. Sure, that’s gotta be deeply inconvenient to simulate in software, but it’s not that much! We should have figured something out about this by now, because there’s billions of mice and rats and so on out there, and they each have their 20–70 million neurons in substantially different configurations, and they’re all goddamn sentient. So there’s gotta be some kind of core property that’s extremely easy to reproduce and extremely hard to remove, shared across all these different brains. It has to be a fairly small core, just because the number of neurons is not that big in absolute terms, and you can fuck it up so much before sentience disappears.
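How inconvenient is “inconvenient to simulate”, roughly? A quick feasibility sketch, where every parameter is an assumption (synapse counts, per-synapse state, firing rates, and per-event cost all vary wildly across models in the literature):

```python
# Rough feasibility math for "20 million neurons isn't that much".
# Every parameter below is an assumption, not a measurement.
neurons             = 20_000_000
synapses_per_neuron = 1_000   # assumed; biological estimates run ~1k-10k
bytes_per_synapse   = 8       # assumed: a weight plus a little state
mean_rate_hz        = 10      # assumed average firing rate
flops_per_event     = 10      # assumed cost of one synaptic update

memory_gb = neurons * synapses_per_neuron * bytes_per_synapse / 1e9
events    = neurons * synapses_per_neuron * mean_rate_hz  # synaptic events/s
tflops    = events * flops_per_event / 1e12

print(f"synaptic state: ~{memory_gb:,.0f} GB")      # -> ~160 GB
print(f"real-time compute: ~{tflops:.1f} TFLOP/s")  # -> ~2 TFLOP/s
```

On those (very squishy) numbers you get something like 160 GB of state and a couple of TFLOP/s to run it in real time: awkward, but well within what current hardware can do, which is the point. The hardware doesn’t look like the missing insight.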
WHAT ARE WE MISSING?!
* I’m aware that this is a deeply misleading comparison. If you want an actual justification, I would argue that you can probably start chopping out bits of brains responsible for things like ‘automatic breathing’ and ‘muscular control’ and retain sentience. If you can’t, then… what the fuck is even going on in brains?!