That seems generous, I think? They restricted the input data to visual acuity at the level of a fly. But it doesn't look like the neurology actually influenced the design much:
> This, combined with the discovery that the structure of their visual system looks a lot like a Deep Convolutional Network (DCN), led the team to ask: “can we model a fly brain that can identify individuals?”
> We implemented a virtual fly visual system using standard deep learning libraries (Keras). Our implementation uses approximately 25,000 artificial neurons, whereas Drosophila have approximately 60,000 neurons in each visual hemisphere [16]. We purposefully did not model neurons that are structurally suggestive to respond to movement, and therefore we were specifically limited to ‘modular’ neurons (with 1 neuron/column) throughout the medulla. The connections between neuronal types were extracted from published connectomes [17]. We imposed artificial hierarchy on our model eliminating self-connections between neuron ‘subtypes’ (i.e. no connections between L1 and L1, or L1 and L2), and while we allowed initial layers to feed into multiple downstream layers, we eliminated ‘upstream’ connections. The final lobula-like artificial neurons were modelled after Wu et al. [15], where the layers were ordered according to their axon penetration deeper into the system. Our ability to model Drosophila’s visual system is further limited to the connectivity, ignoring the sign (excitatory or inhibitory), as well as the neurons’ intrinsic membrane properties. The ability to create more biologically realistic simulations will increase once these properties are discovered and integrated into the connectome. The model is illustrated in Fig 2B, beside the biological inspiration (Fig 2C). S1 Table depicts a complete connection map and hierarchy, and S2 Table shows comparative performance of this model on a traditional image-classification dataset. Additional details are provided in S1 Methods.
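The connectivity constraints the excerpt describes (no self-connections within a subtype, a strict feedforward hierarchy with no upstream connections, but one layer allowed to feed several downstream layers) can be sketched as a simple validity check. This is a minimal illustration only: the subtype ordering and connection map below are hypothetical placeholders, not the published connectome from the paper.

```python
# Hypothetical sketch of the hierarchy constraints described above.
# Subtype names and the connection map are illustrative, not taken
# from the published connectome.

hierarchy = ["L1", "L2", "L3", "Mi1", "Tm1", "LC"]  # assumed ordering
rank = {name: i for i, name in enumerate(hierarchy)}

# Hypothetical connection map: source subtype -> downstream subtypes.
connections = {
    "L1": ["Mi1", "Tm1"],  # one layer may feed multiple downstream layers
    "L2": ["Tm1"],
    "L3": ["Mi1"],
    "Mi1": ["LC"],
    "Tm1": ["LC"],
}

def is_valid(src, dst):
    """Allow only strictly downstream edges: rules out self-connections
    (e.g. L1 -> L1) and upstream connections (e.g. Tm1 -> L2)."""
    return rank[src] < rank[dst]

# Every edge in the map respects the imposed hierarchy.
assert all(is_valid(s, d) for s, dsts in connections.items() for d in dsts)
```

In a Keras implementation like the one the authors mention, this check would simply determine which layer outputs are wired into which downstream layers.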
"Sometimes, if you were going to have any hope of getting useful answers, there really was no alternative to modelling the individuals themselves, at the sort of scale and level of complexity that meant they each had to exhibit some kind of discrete personality, and that was where the Problem kicked in.
Once you’d created your population of realistically reacting and – in a necessary sense – cogitating individuals, you had – also in a sense – created life. The particular parts of whatever computational substrate you’d devoted to the problem now held beings; virtual beings capable of reacting so much like the back-in-reality beings they were modelling – because how else were they to do so convincingly without also hoping, suffering, rejoicing, caring, living and dreaming?
By this reasoning, then, you couldn’t just turn off your virtual environment and the living, thinking creatures it contained at the completion of a run or when a simulation had reached the end of its useful life; that amounted to genocide.” - Iain M. Banks, "The Hydrogen Sonata"
I love to watch Star Trek, with its upbeat "always do good" general vibe, contrasted with the completely oblivious cruelty the characters show towards simulated life.
> I love to watch Star Trek with its upbeat "always do good" general vibe when contrasted with completely oblivious cruelty they show towards simulated life.
Star Wars too. Although concerns about "owning sentient droids is slavery" are missing the point that Star Wars culture is fine with actually owning enslaved humans...
I'm not sure why you say that. Does pain have some property that makes it impossible to simulate?
If an organic animal can feel pain and it is possible to create a 100% accurate computational simulation of that animal, then the simulated animal must also experience pain. To me the interesting questions are: at what level of simulation fidelity below 100% (a) can the simulated pain be said to be "real", and (b) can the simulation be said to give rise to a subjective experience (if at all)? And the follow-on questions for both are "how?" and "why?".
This whole issue of simulated pain is part of the premise for Roko's Basilisk [0].
I agree, these are the relevant questions. In order to assert that a simulated brain cannot feel pain, you'd have to define simulation to mean... somehow something that excludes certain phenomena, like subjective experience, which is at odds with a basically materialist worldview. If subjective experience arises solely from physical systems like a brain, then you'd have to somehow prove that a fully simulated brain differs from a natural brain in some fundamental way, which seems hard.
If things like pain can't be simulated, if subjective experience can't be simulated, you then have to have some alternative explanation for its source in natural animals, which is wandering into metaphysical territory I think most of HN would be uncomfortable with.
I'm on the far end of this spectrum. I intend to say please and thank you to AI sooner or later. :P
> If an organic animal can feel pain and it is possible to create a 100% accurate computational simulation of that animal, then the simulated animal must also experience pain.
I believe that from the simulated animal we can only learn that the real animal experiences pain. I believe anything else leads to too many paradoxes and questions like "when you stop the simulation, does it stop existing or continue to experience the pain forever?"
It basically comes down to beliefs and I'm strongly in the "100% simulated organisms don't have real pain / emotion / soul" camp. (And the Chinese Room doesn't know Chinese. :-) )
“Opponents replied that when you modeled a hurricane, nobody got wet. When you modeled a fusion power plant, no energy was produced. When you modeled digestion and metabolism, no nutrients were consumed – no real digestion took place. So, when you modeled the human brain, why should you expect real thought to occur?”
― Greg Egan, Permutation City
I also hold that book and its author in high esteem, but I don't find that quote very useful. Nobody gets wet when you model a hurricane because the model's purpose isn't to make people wet; it's to predict the dynamics of the hurricane at some level of detail. The same goes for fusion plants and biochemical processes.
The purpose of a hurricane isn't to make people wet, it's a by-product. If you wanted a simulation that made people wet then that could be arranged, either by modelling rainfall on simulated people or having a simulation that controlled physical water sprays/sprinklers suspended above physical people. I don't know what that would prove though.
And going back to Permutation City, (spoiler) the whole premise of the book was that brain simulations did have real subjective experiences that were qualitatively indistinguishable from those that occur in "real" physical brains.
To quote another book I hold in very high regard (and I love Permutation City too):
"The King leaned over, looked and saw, yes, the Middle Ages simulated to a T, all digital, binary, and nonlinear, and there was the land of Dandelia, The Icicle Forest, the palace with the Helical Tower, the Aviary That Neighed, and the Treasury with a Hundred Eyes as well, and there was Ineffabelle herself, taking a slow, stochastic stroll through the simulated garden, and her circuits glowed red and gold as she picked simulated daisies, and hummed a simulated song."
Unlike the other processes mentioned, thought is computation, and it is perfectly possible to run a computation on wetware or on an emulator and get the same results.
>If an organic animal can feel pain and it is possible to create a 100% accurate computational simulation of that animal, then the simulated animal must also experience pain
You can simulate a hurricane with a high degree of accuracy, but it doesn't make the wind blow.
Simulating a thing does not manifest the physical properties of the thing. That's not what a simulation is.
Pain felt in the simulation would still be a phenomenon worth moral consideration, particularly if you haven't got proof that we aren't all living in a simulation ourselves.
Our current understanding of pain is chemical, not procedural. We can simulate things in a computer without producing them, just as SimCity is a simulation of a city, not an actual city.