- cross-posted to:
- [email protected]
cross-posted from: https://feddit.uk/post/17039986
Engineers gave a mushroom a robot body and let it run wild
Nobody knows what sleeping mushrooms dream of when their vast mycelial networks flicker and pulse with electrochemical responses akin to those of our own brain cells.
But given a chance, what might this web of impulses do if granted a moment of freedom?
An interdisciplinary team of researchers from Cornell University in the US and the University of Florence in Italy took steps to find out, putting a culture of the edible mushroom species Pleurotus eryngii (also known as the king oyster mushroom) in control of a pair of vehicles that could twitch and roll across a flat surface.
Through a series of experiments, the researchers showed it was possible to use the mushroom’s electrophysiological activity as a means of translating environmental cues into directives, which could, in turn, be used to drive a mechanical device’s movements.
“By growing mycelium into the electronics of a robot, we were able to allow the biohybrid machine to sense and respond to the environment,” says senior researcher Rob Shepherd, a materials scientist at Cornell.
Knowing humanity, we’ll hit Last of Us zombies long before the spore drive.
And they’ll be coming for us on their insanely fast robot spider legs that we built for them for some fucking reason.
There is something about the idea of fungus given that sort of agency that I find incredibly disturbing.
I really hate to break this to you, but:
Yeah, but that’s not really the same thing. That’s more following basic programming: get the ant to the top of the leaf. This feels different to me.
Not that cordyceps isn’t also creepy…
The idea that it mind controls the ant into its death is creepy as fuck to me. It’s this kind of thing that, if it ever made the jump to infecting humans, would probably wipe out the species.
Sure, but now imagine it can run toward you in order to infect you.
I wonder how much these results would also apply to making random connections between sensor or actuator signals and our brains. Like, if we hooked up a microphone to the brain, would we hear through it similarly to how we hear through our ears? If the microphone was stationary, would our awareness expand to always include audio awareness of the location the mic is in, or would it just confuse audio processing from our ears? Would it go through the same audio processing “circuitry” as our ears, or would the brain develop a new channel?
And then the same question, but this time with a camera instead of a mic. And if that camera could see a wider spectrum than our eyes can, would we see new colours through it, or would our existing colours just get remapped?