this post was submitted on 15 Nov 2024
1116 points (98.8% liked)
I think you people are vastly overestimating how much we actually know about the brain or severely underestimating how freaking complex it is.
The "you" reading this right now is a fucking stack of six A4-sized sheets, each a fraction of a millimeter thick, crumpled into something that, to an external observer, looks like an oversized walnut kernel, cooled and maintained by a network of roughly 400 miles of capillaries, and isolated from the world by the blood-brain barrier, which can only be described as a fucking miracle.
No. No one is going to be implanting any memories soon.
Maybe memories are actually really simple. Like the words on a screen. An arrangement of symbols, then a boatload of meaning and interpretation and rationalization. So all you need to do to make memories is to insert a few words. The brain's "memory interpreter" does the rest of the work.
For example, we insert the words "brother appears". Then, for the "new memory", we reference your memories of your brother. His appearance and the sound of his voice. Then we contrive a narrative explaining why "brother" is at this place and time. Etc. Voila! You now have a memory of your brother standing there saying some stuff.
So to make a memory, it wouldn't require a grand delicate manipulation of brainstuff. Just a simple thing.
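The "symbols plus interpreter" idea above can be sketched as a toy program. This is purely an illustration of the commenter's hypothesis, not any real neuroscience model; every name in it (`associations`, `implant`, the stored details) is hypothetical.

```python
# Toy sketch of the "insert a few symbols, let the interpreter do the rest"
# hypothesis. A "memory" here is a few sparse symbols expanded with
# pre-existing associations and a contrived connecting narrative.

# Existing long-term associations (what you already "know" about an entity).
associations = {
    "brother": {
        "appearance": "tall, glasses",
        "voice": "low, slow",
    },
}

def implant(symbols):
    """Expand a short (entity, event) symbol pair into a full 'memory'
    by referencing stored associations and contriving a narrative."""
    entity, event = symbols
    details = associations.get(entity, {})  # reuse what the brain already stores
    narrative = f"{entity} {event}, plausibly because he was visiting"
    return {"symbols": symbols, "details": details, "narrative": narrative}

memory = implant(("brother", "appears"))
print(memory["narrative"])  # the interpreter has filled in the rest
```

The point of the sketch is only that the inserted data (two symbols) is tiny compared to the reconstructed "memory", which is the crux of the commenter's argument.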
AI is better at recognizing patterns than we are. The brain may be unfathomable to us, but technology already exists which could recognize the signals in your brain that represent memories and reproduce or alter them.
Neuralink and similar devices are being used right now, today, to record the thoughts of animals. The first Neuralink patient is alive and well, meaning it's already being used on humans.
Do you really think this technology won't exist in our lifetime?
Yes, absolutely. What you're describing is AGI. If an AI could untangle engrams from branched clusters of extremely plastic neurons, it could understand and improve it's own thinking. It would be self-aware before it could untangle the mess that our brains are. And I don't see AGI happening under our current material and resource constraints before I die. The gap between watching brain regions light up and de-novo engram implantation is about as wide as the gap between an LLM and AGI.
It is as you say; the technology at that scale doesn't even exist at this point.
Even the recent fly brain mapping, enhanced with AI, had to take a destructive approach to map a half-milligram brain, and these people are already thinking Matrix Reloaded.
Respectfully, this sounds like opinion and doubt rather than a credible timeline. Other than rattling off industry terms, the only support you've given your argument is "I don't see AGI happening". You've collected an impressive shopping basket of buzzwords but done little to convince me, or the engineers developing this technology, that it won't be ready within a lifetime. Stay tuned.
Oh, and "its own thinking" not "it's own thinking". His, hers, its.
Your extrapolation has about as much support. I don't really know what bothers you about the vocabulary I used but I can say I don't play much attention to punctuation marks when inputting text with a swipe keyboard on my phone.
"pay much attention" not "play". I'd be more careful with that keyboard if I were you. Wouldn't want to lose any credibility.
I thought I made it clear enough I didn't give a shit.
But you expect us to care about your opinion? Be correct and be nice, or you won't get to finish the discussion. It's like a recipe: you have to do the work to get the product.
If you primarily engage with typos rather than ideas, I don't particularly consider you worth discussing anything with anyway.
It's been fun.
Being 70-80 years old sucks. My condolences. We'll mess around with AGI when you're gone and I'll think about you
Haha bro thinks the AGI will not be messing around with him LMAO 🤣