What the heck is consciousness anyway?
#11
Bear - That depends on your definition of sentience. Right at the bottom, something like a central heating thermostat could be said to have 1-bit sentience, in that it senses one variable and acts whenever that variable changes. A slightly less ridiculous example is an earthworm: it moves away from harmful stimuli (heat, saltiness, perhaps too much light) and towards beneficial ones such as the smell of food. I very much doubt, however, that most people would take this to mean earthworms have emotions.
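
Just to make the "1-bit" point concrete, here's a toy sketch (Python; all names invented for illustration) of everything a thermostat's "inner life" amounts to - one sensed variable, one bit derived from it:

class Thermostat:
    """Toy '1-bit sentience': one sensed variable, one action."""
    def __init__(self, setpoint_c: float):
        self.setpoint_c = setpoint_c

    def step(self, sensed_temp_c: float) -> bool:
        # The entire "inner life": a single bit derived from one variable.
        return sensed_temp_c < self.setpoint_c

stat = Thermostat(setpoint_c=20.0)
print(stat.step(18.5))  # True: below setpoint, heater switches on
print(stat.step(21.0))  # False: above setpoint, heater switches off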

Incidentally, this does not mean that everything is sentient. A lump of granite reacts to being wet by (VERY slowly) changing its chemical composition - but it's not doing so actively.
#12
One interesting thing about animals is that they can have psychological states. Animals can be happy or sad, confident or depressed; this behaviour is often obvious in mammals, but can be seen in birds and reptiles, and sometimes even in arthropods (invade an arthropod's territory too many times and it appears to become apprehensive and cautious). Any entity that is sophisticated enough to have a state of mind is sentient, and there must be something that it is like to be one.

Sophonce seems to be a sophisticated kind of sentience-wrangling. If you are aware enough to model another being's behaviour and predict it accurately enough to catch it, mate with it or drive it away, then you are sentient; if you are self-aware enough to model your own behaviour and imagine what you want to do over the next five days, weeks or months, and to worry about your own state of mind and what you can do to improve it, then you are sophont.
#13
To me, consciousness is the feeling of experiencing a self-aware program from the inside. Build a sophont program (and a computer to run it on) and it will be self-aware, and there will be something it is like to be that program; it will be conscious, even if its experience of consciousness is totally different to your own.
#14
Maybe. I'd call the thermostat one-bit sapience, not one-bit sentience. Temperature is a symbol that it has no reason to care about - it reacts to temperature, but the reaction isn't solving a problem that the thermostat has. The thermostat would be just fine if the temperature went up to fifty or down to zero. The thermostat itself has no needs or desires or threats to respond to, and no need to organize its information about the world (its one bit of symbolic information) into qualia. And if it did, can we even conceive of that signal *being* qualia?

You're right, it gets very fuzzy down there at the bottom of the scale. For most purposes I think I'd be okay calling earthworms non-sentient; I certainly don't feel guilty if I happen to kill one. But what they've got is the same as the foundation of sentience in more complex organisms: they have needs, they must avoid risks and injuries, and they have sensory input specifically to give them information about the world, organized in ways that allow them to do that. Even if it's not highly developed, I can't really claim it's not sentience.
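
To make that contrast concrete, here's a minimal sketch (illustrative Python; the thresholds and names are invented) of what the earthworm has that the thermostat lacks - needs, and sensory input organized around them:

def worm_step(heat: float, light: float, food_smell: float) -> int:
    """Minimal needs-driven agent: returns -1 (retreat), 0 (stay), +1 (advance)."""
    harm = heat + light     # stimuli lumped together because the worm needs to avoid injury
    benefit = food_smell    # stimulus relevant to the worm's need to feed
    if harm > 0.5:          # the survival need dominates
        return -1
    if benefit > 0.2:       # otherwise, pursue the feeding need
        return +1
    return 0

print(worm_step(heat=0.7, light=0.1, food_smell=0.9))  # -1: retreats despite food
print(worm_step(heat=0.0, light=0.1, food_smell=0.6))  # +1: advances toward food

Unlike the thermostat, the inputs here are organized by what they mean for the agent's needs, not by what they physically are.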
#15
Not to exhume a (very) old thread, but in light of the recent PIT vs CIT (Pattern Identity Theory vs Continuity Identity Theory) "mini" debate, I thought it might be a good time to follow up on this discussion and get a conversation going about how we should define and talk about consciousness in the EG.

As I stated in my reply to the above thread, I have a personal favorite definition of "consciousness," which I don't expect everyone else to agree with, but which I find especially compelling and which I think really gets to the heart of what's actually interesting about this topic. This is the definition espoused by the philosopher Thomas Nagel in his 1974 essay "What Is It Like to Be a Bat?", which has since been picked up and discussed at length by the neuroscientist/philosopher/author/podcaster Sam Harris.

The basic thesis of Nagel's paper is that the central mystery (and definition) of consciousness is the fact of subjective, first-person experience: as he puts it, "an organism has conscious mental states if and only if there is something that it is like to be that organism—something it is like for the organism." There is something it is like to be a person; there is presumably something it is like to be a bat, although it will necessarily be very different from what it's like to be a human; there is presumably not something it is like to be a rock. For us, and likely bats and other mammals, and probably most animals down to, perhaps, insects, "the lights are on" in a way that we generally do not assume to be the case for inanimate objects, machines, or computers. That said, if one accepts the viability of Artificial General Intelligence, one might assume that a suitably configured computer could have a similar subjective first-person experience.

This "seeming," "something it is like to be," "the lights being on," this subjective first-person experience of the world that we all claim to have but which no one can objectively verify, is what Nagel, Harris, and by extension I personally call "consciousness." It does not necessarily entail being self-aware, capable of metacognition, or even having thoughts at all- one could (maybe) imagine a "mind" devoid of any thoughts whatsoever, but which nevertheless has some basal level of experience.

There's a whole host of related concepts - philosophical zombies, the hard problem of consciousness, qualia, and various other things - which I can see some of you are already familiar with. (By the way: if I've been banging on about stuff you all are well aware of, please excuse me for wasting your time and probably sounding like a pompous windbag.)

As it currently stands, I think the EG article on consciousness could do with a lot of love, ideally involving adding a discussion of the above topics. As I mentioned earlier, though, I'm aware that the definition of consciousness I've been blabbing about is not universally shared, and I'd be interested in hearing if any of you guys have different ones. I think many or all of our respective views could and should be factored into an updated consciousness EG article.
#16
(06-16-2024, 02:42 PM)Andrew P. Wrote: ... This is the definition espoused by the philosopher Thomas Nagel in his 1974 essay "What Is It Like to Be a Bat?", which has since been picked up and discussed at length by the neuroscientist/philosopher/author/podcaster Sam Harris. The basic thesis of Nagel's paper is that the central mystery (and definition) of consciousness is the fact of subjective, first-person experience ...
Yeah, I've been thinking a lot about Nagel recently. In the OA Current Era it is fairly commonplace for a human-derived sophont to choose to become something completely different, including a bat (usually, but not always, a sophont bat) or maybe something even more exotic like a To'ul'h. This implies that there must be procedures, processes and interpretation software available that can convert the mental experience of being a human into the experience of being a bat, or a To'ul'h, or any of the other options available in the Terragen Sphere (and vice versa).

I expect that this would be easier if the original modosophont could upload themself, then modify their uploaded consciousness with effective neuro-translation software. The end result would be a human personality that is fully capable of experiencing bat or To'ul'h qualia, whatever those may be like.

It may even be possible to medically transplant a human brain directly into the body of a large animal of some kind, so that even staunch believers in CIT could gain the experience of being that animal (assuming suitable neurotranslation hardware could be manufactured). But the host body has to have room for a human brain if continuity is to be retained, so CIT proponents could be implanted into an elephant, or a dolphin, or a whale, but not a mouse.
#17
Just found this on the BBC:
https://www.bbc.co.uk/news/articles/cv223z15mpmo
Are animals conscious? How new research is changing minds
#18
(06-17-2024, 01:43 AM)stevebowers Wrote: ... This implies that there must be procedures, processes and interpretation software available that can convert the mental experience of being a human into the experience of being a bat, or a To'ul'h, or any of the other options available in the Terragen Sphere (and vice versa). ...

That's fascinating to think about - what must it entail to take a baseline human mind (for example) and port it into a very different body with very different senses? Would it be easiest/most common to leave the mind itself essentially unchanged, and just wire the new senses into the old mind, translating them into a format that's easy for a baseline human mind to interpret and understand? Or would it be necessary, or somehow easier, to tweak the mind's architecture in such a way as to make it more, say, bat-like? So that it would end up thinking like a bat does, to some extent?

I don't think there are any right or wrong answers to the above questions, at least none that we can evaluate right now, but it's fun to think about.
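
Purely for fun, here's one way the first option (an unchanged mind behind a translation layer) might be structured - a sketch only, with every name and representation invented for the example:

from typing import List

def echolocation_to_depth_image(echo_returns: List[float],
                                max_range_m: float = 10.0) -> List[float]:
    """Map a sweep of echo delays (seconds) onto a 'depth image' in [0, 1],
    where 1.0 reads as 'very close' and 0.0 as 'nothing in range'."""
    speed_of_sound = 343.0  # m/s in air
    depths = []
    for delay_s in echo_returns:
        distance_m = delay_s * speed_of_sound / 2  # sound travels out and back
        depths.append(max(0.0, 1.0 - distance_m / max_range_m))
    return depths

# A sweep of three echoes becomes something a baseline visual system
# (or its emulation) could plausibly learn to read like a picture.
print(echolocation_to_depth_image([0.005, 0.02, 0.05]))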
#19
(06-20-2024, 06:02 PM)Andrew P. Wrote: ... what must it entail to take a baseline human mind (for example) and port it into a very different body with very different senses? Would it be easiest/most common to leave the mind itself essentially unchanged, and just wire the new senses into the old mind, translating them into a format that's easy for a baseline human mind to interpret and understand? Or would it be necessary, or somehow easier, to tweak the mind's architecture in such a way as to make it more, say, bat-like? So that it would end up thinking like a bat does, to some extent?

I'd say that in OA, especially in the Current Era but probably for some time before that, both would be possible. A PIT believer could be uploaded, then choose to change their internal mental architecture to conform to that of a bat (perhaps a sophont bat, or perhaps a non-sophont one).

A CIT believer couldn't chop their brain up small enough to fit in a bat's body without destroying part of their mind, but they could teleoperate a bat's body, utilising highly sophisticated mental translation software to change bat perceptions and skills into human ones, while still inhabiting their own human body (probably in some kind of biostasis).

Of course, the person teleoperating the bat need not be a human, since OA modosophonts are very diverse; and they might be a PIT believer as well, one who simply prefers not to undergo radical neurotechnological modification. The person temporarily inhabiting the bat's body might even be a xenosophont, an aioid, or an archailect.

