Comments (107)
- pjs_
Be careful about how you interpret that paper. It looks really impressive -- real neurons in a petri dish seem to successfully (if amateurishly) murk a few imps. https://www.youtube.com/watch?v=yRV8fSw6HaE

But there's more to the setup than you might assume from a casual reading. Here's the code used for that demo: https://github.com/SeanCole02/doom-neuron

So there is an entire PyTorch stack wrapped around the mysterious little blob of neurons -- they aren't just wired straight into WASD. There is a conventional convnet-based encoder, running on a GPU, in the critical path. The README tries to argue that the "neurons are doing the learning," but to my dilettante, critical eye it really looks as though there is a hell of a lot of learning happening in the convnet as well.

Are the neurons learning to play Doom, or are they learning to inject ever so slightly more effective noise into the critical path? Would this work just as well if we replaced the neurons with some other non-Markovian sludge? The authors run ablation experiments to try to get to the bottom of this, but I can't really tell how compelling the results are (due to my own ignorance/stupidity, of course).
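The ablation question above can be made concrete with a toy experiment. Everything here is invented for illustration -- none of these names or the "game" come from the doom-neuron repo: swap the blob for an input-ignoring noise source behind the same interface and check whether the wrapped pipeline's score actually drops.

```python
import random

random.seed(42)

def trained_encoder(state):
    # Stand-in for the convnet encoder in the critical path: any
    # adaptive mapping from game state to "stimulation" values.
    return [x * 2.0 for x in state]

def neuron_blob(stim):
    # The biological component: stimulation in, noisy spikes out,
    # but still coupled to its input.
    return [s + random.gauss(0, 0.5) for s in stim]

def random_sludge(stim):
    # Ablation: same interface, but ignores its input entirely.
    return [random.gauss(0, 0.5) for _ in stim]

def play_episode(middle, n_steps=1000):
    # Toy "game": reward is how often output channel 0 beats
    # channel 1 when the state favours channel 0.
    score = 0
    for _ in range(n_steps):
        state = [1.0, 0.0]
        out = middle(trained_encoder(state))
        score += out[0] > out[1]
    return score / n_steps

print(play_episode(neuron_blob))    # input-coupled: close to 1.0
print(play_episode(random_sludge))  # input-ignoring: close to 0.5
```

If the two scores barely differ in the real system, the convnet is doing the work; if they diverge sharply, the blob is contributing signal. That is the comparison the ablation experiments would need to make compelling.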
- philips
I think this raises the same ethical questions as veganism and our use/abuse of biological systems. This is an excerpt from "The Pig that Wants to be Eaten" by Julian Baggini:

> After forty years of vegetarianism, Max Berger was about to sit down to a feast of pork sausages, crispy bacon and pan-fried chicken breast. Max had always missed the taste of meat, but his principles were stronger than his culinary cravings. But now he was able to eat meat with a clear conscience.

> The sausages and bacon had come from a pig called Priscilla he had met the week before. The pig had been genetically engineered to be able to speak and, more importantly, to want to be eaten. Ending up on a human’s table was Priscilla’s lifetime ambition and she woke up on the day of her slaughter with a keen sense of anticipation. She had told all this to Max just before rushing off to the comfortable and humane slaughterhouse. Having heard her story, Max thought it would be disrespectful not to eat her.

> The chicken had come from a genetically modified bird which had been ‘decerebrated’. In other words, it lived the life of a vegetable, with no awareness of self, environment, pain or pleasure. Killing it was therefore no more barbarous than uprooting a carrot.

> Yet as the plate was placed before him, Max felt a twinge of nausea. Was this just a reflex reaction, caused by a lifetime of vegetarianism? Or was it the physical sign of a justifiable psychic distress? Collecting himself, he picked up his knife and fork . . .

> Source: The Restaurant at the End of the Universe by Douglas Adams (Pan Books, 1980)
- Imnimo
> But this is where the line slightly blurs in my head. Did we possibly just build the first human biocomputer and immediately put it in a simulated hell, playing the same game on loop, forever? Using the same reward mechanisms we use for LLMs?

This description does not seem to really match what was done in the Doom demo, and makes me skeptical that the author has actually looked into the details.
- slibhb
I read an interesting book about consciousness recently: The Hidden Spring by Mark Solms.

Solms argues, I think convincingly, that consciousness fundamentally has to do with emotions and not cognition. Consciousness is not produced by the cortex but rather by the brainstem, where signals from all over the body converge (e.g. pain, hunger, itchiness, etc.).

If that argument is true, then a petri dish of neurons is unlikely to be conscious, even if it performs some analogue of visual processing.

The book makes other arguments that I found less convincing -- for example, that consciousness is "felt homeostasis" and that a fairly simple system (somewhat more complex than a thermometer) will be conscious, albeit minimally.
- fh
A couple of years ago, the mad scientist in me thought about a business where we preserve the brains of people a la Futurama. When the body dies, the brain does not necessarily have to follow. Possible? Yes. Feed it the right chemical cocktail and O2, remove waste products. Ethical/moral? Who's to say? We are preserving life... in a sense. Profitable? Sure. Connect it to a keyboard/mouse interface. I mean, we already have businesses cryo-preserving people with the hope of unfreezing them in the distant future!
- marjipan200
The mind of the neuro-materialist is a radio so impressed with its own receiver that it's convinced it is the broadcasting tower.
- atleastoptimal
We will never draw the line, because morality among humans is coupled with looking human-like. For most people, morals have aesthetic prerequisites: neurons in a lab don't mean as much as neurons in a meat case (especially if that meat case is physically attractive).
- lukasb
Anyone who believes AI running on silicon could in principle be conscious has to believe that biological computers are conscious, right? Why aren't those people voicing more concerns?
- rolph
For now, this is a hyper-simplistic and hacky POC. You may find a look at how a full visual system is constructed to be a relief:

https://www.cell.com/fulltext/S0896-6273(07)00774-X

There is a good distance to go before this is anything beyond a reflex circuit.

https://www.sciencedirect.com/topics/neuroscience/spinal-ref...
- mrweasel
In the same line of thinking: I'm a little concerned that humans are, to some extent, just LLMs in a meat suit.
- AntiDyatlov
Yeah, we're totally fucked; there is no scientific theory that can tell you what is and isn't conscious. For all we know, my laptop, not running any LLM, is conscious and always has been. Or my chair. Or a proton. This consciousness thing is a nasty problem for the scientific worldview.
- mr-footprint
Reminds me of an ethical dilemma in the game "Detroit: Become Human". I found myself philosophically asking what it means to be alive, what it means to be conscious, and whether something without biological bones, blood and a brain can feel the same level of consciousness as humans, or greater.
- yegortk
ICML paper about that: https://proceedings.mlr.press/v235/tkachenko24a.html
- AISnakeOil
LLMs have awareness only for the time they are spawned into memory, and even then it's very limited. Imagine if you could use your brain to think, but only after someone asked you a question. After you think the answer, you are brain dead (unconscious) until another question is asked.
- keybored
I don’t believe that silicon has a soul (loosely speaking). For the same reason I don’t believe that some biomatter in a lab has a soul.
- LeCompteSftware
An underappreciated source of nonsense in 21st century discourse is people watching YouTube instead of reading things. It doesn't appear this author read anything, preferring to be spooked and misled by a YouTube video.

> trained them to play DOOM - honestly better than I do

Maybe the author really, really sucks at DOOM, but I think this is a false embellishment:

>> While the neurons can play the game better than a randomly firing player, they’re not very good. “Right now, the cells play a lot like a beginner who’s never seen a computer—and in all fairness, they haven’t,” Brett Kagan, chief scientific officer at Cortical Labs, says in the video. “But they show evidence that they can seek out enemies, they can shoot, they can spin. And while they die a lot, they are learning.” [https://www.smithsonianmag.com/smart-news/a-clump-of-human-b...]

> To play DOOM, the system feeds visual data to the neurons. For the neurons to react, they have to interpret that data in some way.

This is totally false - not even a misleading metaphor, just plain wrong. The neuronal computer doesn't get any visual information:

>> So how does a petri dish of brain cells play Doom when it doesn’t have any eyes? Or fingers? "We take a snapshot of the game with information like the player’s health and the position of enemies, pass it through a neural network, convert it into numbers, and send the data,” explains Cole. “This is called encoding – essentially turning the game state into signals the neurons can understand. The neurons then fire an output – move left, move right, walk forward, shoot or not shoot – which the system decodes and converts back into actions in the game." [https://www.theguardian.com/games/2026/mar/16/petri-dish-bra...]

I am also concerned about neuronal computing. But it doesn't really help anyone to spread childish ghost stories about it.

I really hate YouTube, by the way. My dad used to read newspapers and had interesting ideas. Now he watches a bunch of YouTube and he's a huge idiot. It's not (directly) because of age: nobody is immune to narcotic slop. I had to delete my account when I realized how much of my life and cognition I was wasting. I wish others would do the same.
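The encode/decode pipeline described in that Guardian quote is worth spelling out, because it shows how little the neurons actually "see". A toy sketch -- the channel layout, function names, and stimulation model are all invented here, since the quote doesn't specify them:

```python
import random

random.seed(7)

ACTIONS = ["left", "right", "forward", "shoot", "no_shoot"]

def encode(health, enemy_x, enemy_y):
    # "Turning the game state into signals the neurons can understand":
    # a stimulation level per input channel. No pixels involved -- the
    # game state is already pre-digested into a handful of numbers.
    return [health / 100.0, enemy_x, enemy_y]

def stimulate(stimulation):
    # Stand-in for the dish: a spike count per action channel,
    # loosely driven by the total stimulation it receives.
    rate = 1.0 + sum(stimulation)
    return [random.expovariate(1.0 / rate) for _ in ACTIONS]

def decode(spikes):
    # "The system decodes and converts back into actions in the game":
    # pick the action whose output channel fired the most.
    return ACTIONS[max(range(len(spikes)), key=spikes.__getitem__)]

action = decode(stimulate(encode(health=72, enemy_x=0.3, enemy_y=-0.1)))
print(action)
```

The point the quote makes stands out in the sketch: the neurons sit between two conventional software stages and never touch visual data at all.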
- FrustratedMonky
> Where do we draw the line?

There will be no line as long as there is the rush to win the capitalist game.

UNTIL -> the ball of neurons begins outthinking the humans. Probably also fused with some AI augmentation.

It only takes a few percentage points for a human to outthink a chimp. This new 'thing' will dominate the humans.
- smitty1e
Contrarian take: the Promethean efforts will continue, and asymptotically approach the axis of The Real Thing, until we realize that Prometheus is a variation on the theme of Sisyphus.

Only in this telling, Sisyphus is rolling his uneven boulder along that asymptotic curve a little further with every iteration, toward a smiling Zeus.
- Etoro1942 [flagged]
- qoez
We treat actual biological animals a lot worse in some cases, so until the neuron count climbs significantly above that of the lowest-tier animals beneath us, I don't think we should stop the experiments.