4

We're biologically incapable of ignoring our senses. I want to clarify what I mean by this, because, as posters have suggested, we actually can do so, for example by blinding or gouging out our eyes. I'm talking about ignoring our senses by a pure act of will. One could of course argue that an act of will is involved in blinding oneself, and this is true. By a pure act of will I mean one in which no external physical act is involved. For example:

  1. Looking at a human being and seeing it as an orangutan, with no discernible difference from seeing an actual orangutan.

  2. Holding an ice-cube and willing it to feel hot.

I imagine a computer would have complete control of its sensory input and how it can be viewed. Whereas I can imagine a human being as an orangutan, but not physically see one as such, a computer should be able to see it exactly as that, if it so chose.

Given this, would a conscious computer naturally decide that the world was a figment of its imagination?

I am aware that humans are capable of sensory adaptation to a certain degree. Although humans can hallucinate or dream, these are generally not things that we can consciously will. A human takes drugs to hallucinate, but even then cannot will the form of his hallucination; a human dreams when he is asleep, but the same goes (yes, there may be a slight degree of control, but generally not).

I would also suggest that evolution would have made it very difficult to ignore our senses, as it would simply be dangerous.

I am suggesting a computer would have complete control of how its sensory input is represented to itself. We're not conscious of the nerve impulse when a photon strikes a rod/cone in the eye; what we sense is the whole picture. I'm suggesting that this whole picture, for the computer, would be entirely under its control. A simple example would be inverting all the colours. A more sophisticated way would be to change the form of objects in a faithful way.
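The colour-inversion example can be put as a toy sketch: a "sensor" delivers raw pixels it cannot refuse, while the machine freely chooses the filter through which that input is represented to itself. All names here are illustrative assumptions, not a claim about how a conscious computer would actually work:

```python
# Toy sketch: the sensor cannot ignore the data,
# but the interpretation layer is freely swappable.

def identity(pixel):
    """Represent the raw input as-is."""
    return pixel

def invert(pixel):
    """Invert all the colours: each RGB channel c becomes 255 - c."""
    r, g, b = pixel
    return (255 - r, 255 - g, 255 - b)

def perceive(raw_frame, representation=identity):
    """Apply the currently chosen representation to every raw pixel."""
    return [representation(p) for p in raw_frame]

frame = [(0, 0, 0), (255, 255, 255), (10, 200, 30)]   # raw sensor data
print(perceive(frame))          # seen as-is
print(perceive(frame, invert))  # the same world, wilfully seen inverted
```

The point the sketch makes is only that the choice of `representation` sits entirely on the machine's side of the sensor, which is the kind of control over "the whole picture" being suggested.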

The equipment that takes in the sensory input cannot ignore the sensory data, but the interpretation is modifiable. Using Kant's language, the forms of perception are not under conscious control for a human, but I'm holding that they are for a computer.

  • 1
    Is it not questionable that a computer would be capable of ignoring its sensors? A basic video program can't ignore video input. A complex computer program could be hard-wired to constantly receive input without having any idea where it's from. Your second sentence just seems like a non sequitur.
    –  commando
    Apr 25, 2012 at 1:26
  • 3
    My grandpa used to turn off his hearing aid when my grandma was talking too much.
    –  user1746
    Apr 25, 2012 at 1:58
  • 2
    Hmm, I'd say that the very first sentence contains a problematic assumption. Do you have a citation for the claim that humans are "biologically incapable of ignoring [their] senses"? In fact, sensory adaptation is a known phenomenon whereby the sensitivity of sensory receptors is dramatically reduced (or even inhibited entirely) in the presence of sustained exposure to a particular stimulus. These are actually chemical changes that take place. Moreover, differential sensory perception is an accepted biological phenomenon. The mind is part of biology, too, even if it is all "just in your head". Apr 25, 2012 at 3:35
  • @commando: yes, the video camera itself cannot ignore the input coming in, but the program that interprets the input can. It can just output a blank picture. Apr 25, 2012 at 9:04
  • @cody: I agree the mind is part of biology, and that adaptation can occur. But these are minor issues (I think). I don't have a citation, but I think it's obvious that evolution would have made it all but impossible to ignore our senses, as it would be dangerous. People who hallucinate (without drugs) are generally not seen as functional. Apr 25, 2012 at 9:07
  • 2
    "We're biologically incapable of ignoring our senses." This is not actually true. Not only can we gouge out our eyes, mutilate our noses, sever our tongues, take a brief skinny dip in sulfuric acid, but we also possess the ability to simply destroy the somatosensory cortex, rendering us incapable of sensory perception. And those are just the purely physical ways of doing so. Certain monks (e.g. Buddhist) have been known to become so removed from their sensory, physical selves that they've been able to ignore even extreme sensory experiences, such as intense pain.
    –  stoicfury
    Apr 25, 2012 at 14:39
  • 1
    Evolution doesn't act on temporary conditions like the influence of psychoactive substances... Apr 25, 2012 at 19:33
  • I should have been clearer in my question that I was talking of ignoring our senses via an act of will, so that would discount all of these critiques apart from the Buddhist one, and even then this takes special techniques & training. I'll make that clearer in the question. Apr 25, 2012 at 21:55
  • Your second example does not match what I have seen of hypnosis. People can immerse their hand in ice water and calmly declare that it is warm simply because they were told as much. If that does not qualify as an "act of pure will," then we will need to agree upon a definition of "pure will" which excludes such hypnotic suggestions.
    –  Cort Ammon
    Jan 1, 2015 at 21:30
  • There is also the case of [Saccadic Masking](https://en.wikipedia.org/wiki/Saccadic_masking). To discount those, we will also need to define "pure will" to exclude any effects explained by the brain.
    –  Cort Ammon
    Jan 1, 2015 at 21:33
  • I am not convinced that a sufficiently intelligent and autonomous computer would have a better ability to ignore its senses. If a computer is to gain our level of intelligence and autonomy, it would need to have a similar learning capacity. Which means it would need the ability to learn independently, without being programmed. This implies an inability to ignore reality, and a drive to seek out explanations for existing perceptions. An autonomous learning computer may have to have the same relationship to its perceptions as we do. I would need to be further convinced of this part.
    –  Misha R
    Jul 10, 2018 at 16:01

3 Answers

6

We're biologically incapable of ignoring our senses. I imagine a computer would have complete control of its sensory input and how it can be viewed. Whereas I can imagine a human being as an orangutan, but not physically see one as such, a computer should be able to see it exactly as that, if it so chose.

This strikes me as profoundly wrong.

First of all, we are capable of ignoring our senses, and do so every time we close our eyes. Any conscious computer (if we stipulate such a thing) with control over its sensory input devices would clearly understand that the external world continues to send input which is being ignored.

Furthermore, the distinction between "imagining" and "seeing" would apply to a conscious computer in precisely the same manner. Just as I can picture "in my mind's eye" an orangutan and distinguish this imagined orangutan from a veridically perceived one, a computer would be able to clearly distinguish between data which it has constructed, and data which has come to it via the aforementioned sensory input devices.

A conscious computer would have no more reason than us to believe in solipsism; nor would it have any less reason. We're both worlded in precisely the same way: aware that we are receiving input from sources outside of our control. The fact that the computer is able to manipulate the input after receiving it is simply not germane.

  • On the whole I agree with your critique. Imagining is under our control, but the imagined orangutan is very different from one we actually see. And of course you are right, we are aware of the difference. But when we dream, hallucinate under drugs, or suffer from schizophrenia, we take what we see as reality; we are not aware of the difference. Apr 25, 2012 at 13:51
  • 1
    That's true, and that's because the sensory input in those cases is not under our conscious control (although it is still internal.) Interestingly, Buddhist philosophy speaks of six senses, with "mind" (i.e., imagination) being the sixth sense. Apr 25, 2012 at 14:14
  • It's an interesting perspective. I recall it vaguely from my reading somewhere. Apr 25, 2012 at 14:52
1

You shouldn't think a conscious computer would be aware of its internal data any more than you are aware that neuron number 7,331,521,021,321 just shot out a new dendrite, or is having a hard time coping with your caffeine intake. People tend to see fewer levels of processing between a computer's conscious mind and the low-level implementation details, but this is an illusion due to the primitive state of computers and computer programming today. We can see all the processing steps in a typical computer program with relative ease, but this is reflected in the fact that we can't even get such programs to read the newspaper accurately.

But your question: "Can you persuade a computer not to be a solipsist?" can be turned around and asked of people: can you convince a person not to be a solipsist?

There are easy ways of convincing a solipsist to take the outside world seriously (even if not to believe it "exists"). The outside world is capable of solving problems that the individual cannot. For example, you can look up and learn the proof of Fermat's last theorem. Suppose you do so--- you would not claim that you came up with it yourself! This means that whether you assign the outside world the property of "existence" or not, you must assign the outside world the ability to produce stuff you can't produce yourself by shutting yourself off from it. So you have to assign it computational competence greater than your own. This is true whether you are a human or a computer.

Logical positivist interlude

The question of "existence" is logically positivistically meaningless. If you take away this property from a chair you're sitting on, no other perception is altered. So ultimately, from a logical positivist perspective, there is no meaning to the question of solipsism. But likewise, because it is meaningless, the existence of the outside world, or the lack thereof, can't come back to bite you--- it can't affect your decisions or measurable behavior in any way. Whatever decision you make should not depend on an unmeasurable attribute bit, the "exist" bit, but only on measurable attribute bits, like "if I sit where I see a chair, will my bottom fall to the floor or not?"

The undecidability of existence is positivistically just as true of you, by the way. Even if you don't exist, your nonexistent self will still need to pee (assuming you aren't a conscious computer), and you should probably go use a toilet, or the most convincing illusion of a toilet you can find, if you don't want to admit the toilet exists, or else you will be doing some laundry.

Achievable perceptual alteration

While you can't easily look at a person and see an orangutan, it is possible to take hallucinogenic drugs that will make the people around you seem unnaturally deformed, and not recognizably human. This does lead to a sense of separation, as the drug abuser tends to disconnect, but this separation can lead to a camaraderie among those whose perceptions are altered in the same way. Does a habitual drug user tend to view non-users as somehow deficiently human?

Many people who regularly get stoned have an unnerving awareness of social minutiae that is not commensurate with their simultaneous obliviousness to bigger-picture issues, and they consider a person who does not pay attention to every minutia as borderline autistic, or somehow mentally deficient. They amuse each other with extremely subtle nuances of speech and gesture, which an unstoned person would completely miss. There are whole genres of music and art filled with these tiny, barely perceptible gestures. This leads people to over-value instantaneous present-awareness and consider those who do not ingest marijuana as somehow deficient: incapable or unwilling to process all the possible sensory minutiae available at any given instant.

The point is that the solipsism issue is exactly parallel here, since one is making the choice to perceive or not to perceive certain minutiae based on ingestion of a chemical. This choice does not seem to be any different from a computer deciding to use a different input program, or to apply a filter to the input. It is also no different from non-chemical filters, like viewing the world through lenses that turn the image on your retina upside-down.

Ultimately it has no bearing on true solipsism.

  • Asking people to understand Fermat's theorem is a huge undertaking, and a step that pretty much the whole of humanity isn't interested in. I don't see how you can use that as an example to convince anyone of the reality of the world. The camaraderie among drug users is the same as for any group with its rituals. Only a few will ponder what they see and build a theology around it, e.g. Amazonian shamans. Apr 25, 2012 at 9:11
  • @MoziburUllah: I see your point--- the Fermat thing was just an example of something big that the world does--- Beethoven's 7th symphony, or Marlowe's Faustus might be equally good, more universal examples--- the world is richer than the individual, so it has observable attributes of complexity independent of the metaphysical attribute "existence". As far as building a theology around drug use, there are parts of the US I think you would be surprised to visit (the modern drug cultures in the US can be traced in significant part to Native American drug shamanism through Aldous Huxley)
    –  Ron Maimon
    Apr 25, 2012 at 9:15
  • @maimon: It doesn't surprise me; we have a drug scene in the UK, and it's significantly mixed in with either music or poverty. Apr 25, 2012 at 13:54
  • @MoziburUllah: In the US, a certain amount and kind of hidden moderate drug use is socially expected of the Nietzschean "superman", and is associated with wealth and political power, at least since the Beatles.
    –  Ron Maimon
    Apr 25, 2012 at 16:36
  • @maimon: yes, you're right. It's socially differentiated in the UK too. Crack cocaine for the poor, and coke for the rich. Apr 25, 2012 at 16:59
1

It might well be that a conscious computer does not construct an image of itself. It might be that it does not evaluate the concept of self at all. The self that you think you are is just a projection of what you mentally create. But you are not aware of the subjectivity of your creation (I assume; otherwise you would not ask this question, I think).

Your question implicitly assumes that the conscious computer has an erroneous perception of reality, in the sense that it is not aware that it is computing an image of itself. You use the word 'naturally' in your question. Is being delusional (having an erroneous perception) natural by definition? A computer might well be conscious and just compute, without having to compute an image of itself in the way that you 'compute/construct' an image of your self. Hence solipsism is a concept that a computer without an image of itself cannot even evaluate, because the evaluation requires the concept of 'self' to be true. Having a self requires a belief (as the image of the self is constructed by your own mind, and you believe it to be objective). If this conscious computer is aware of all that it creates, it would know it constructs an image of itself (subjectively), and hence disregards it as objective truth, and solipsism is out of the question.

But if all is one and dualism is created by the mind/computer, then indeed the self is all there is, and nothing else is verifiable; hence solipsism is the only way to go.

So my perspective on this is that it all comes down to: is the conscious computer so delusional that it believes the self to be objective, rather than subjective (created by itself)? If that is not the case, then this question does not apply.

You might want to update your question to include that this computer is, or is not, aware that it creates the image of itself.
