Our beliefs are probably false


Dean Mullen


I believe in a lot of things, but I can admit that some of my beliefs are probably false. The problem is that I don't know which ones, so I still hold onto my beliefs, though for most of them I avoid holding them with certainty. In my opinion 100% of the world is the same. We all have various beliefs mixed together from what theories and movements tell us, like science, religion and philosophy, and of course we all have our own original opinions. But let's be honest: many of those beliefs are probably false.


OK, so I have had a few beers, but let me know if this

"In my opinion 100% of the world is the same."

turns out to mean anything.

 

In the meantime, perhaps someone should think about the difference between things they believe because there is evidence, and things they believe as a matter of faith.


In the meantime, perhaps someone should think about the difference between things they believe because there is evidence, and things they believe as a matter of faith.

Believing in evidence still requires faith: faith that the evidence is valid, and faith in the logic linking the evidence to the claim it is supposed to prove or disprove. It makes me laugh when I hear people say that Galileo (or Copernicus, or someone like that) offered to let the pope look through his telescope to see for himself that the planets move around the sun. How would looking through a telescope (once) provide direct empirical evidence of anything except the fact that there was light reaching the lens?

 

I don't know what it means to say that "100% of the world is the same." The same as what? In what sense?

 

 


I think you will find that what Galileo offered the telescope for (to a bishop, I think, rather than the Pope) was to see the moons of Jupiter, which you can do with just one look.

This revelation was at odds with the Church's teaching at the time: the Universe was created for man's benefit, so a moon of a distant planet couldn't be useful and therefore shouldn't exist. It also rather comprehensively messed up the "crystal spheres" model (though not as much as watching the moons move would do).

 

What I meant was that, for example, I can look through a telescope at some distant object. Then I can walk to that object and confirm that the telescope does indeed give me the same effect as looking at the thing from closer up.

I can do this for lots of things (and I can get my friends to do so too).

Then, having confirmed that 'scopes let you get a better view of things, I can point one at the sky.

If I see that Jupiter has blobs round it, then I can reasonably deduce that, if I were a lot nearer to Jupiter, I would be able to see these blobs.

I also know that, in general, if I can see something, it is there. I have verified this a huge number of times by, for example, finding a biscuit in the tin, and eating it.

 

So, since I know that I can usually assume the reality of things I see, and I have confirmed that the telescope lets me see things I wouldn't otherwise see by reason of their being too far away, I can conclude that the moons of Jupiter are real and distant.

 

Not proof, I grant you; but good evidence.

 

What we are arguing about is the degree of robustness of the evidence we need to accept things.

Some people will believe anything they see in the Sun; my viewpoint differs.


You can vastly cut down on the number of your beliefs which are false just by adopting a positivist approach. Refuse to accept anything as true unless it is based directly on empirical data, or on the simplest and most non-controversial inferential steps needed to generate the most economical hypotheses to organize and explain that data. This means being constantly vigilant against being stampeded into accepting, without critical examination, the various generally held values and the 'facts' conjured up by the fevered imagination of the public to 'confirm' those values. An obvious example of this sort of ideologized 'fact' is the old theory that masturbation, because it was socially disapproved of as a value, must also be objectively harmful, so it had to cause blindness, impotence, and hairy palms. A less obvious example is the common assumption that anything other than adherence to currently accepted bourgeois values and mores must produce either individual or social harm.


  • 1 month later...

I believe in a lot of things, but I can admit that some of my beliefs are probably false. The problem is that I don't know which ones, so I still hold onto my beliefs, though for most of them I avoid holding them with certainty. In my opinion 100% of the world is the same. We all have various beliefs mixed together from what theories and movements tell us, like science, religion and philosophy, and of course we all have our own original opinions. But let's be honest: many of those beliefs are probably false.

 

You should read Descartes. You seem to agree with him on the basics. He, however, took it further to say that he should eliminate any belief that is doubtful and test his basic beliefs to see what he *can* be certain of.

 

He reached the conclusion (among others, but this is one of the most important and notable ones) that the only truly known truth is that he is a thinking thing. ("I think therefore I am"). I personally have my qualms about some of his logic and process but he makes a good read, and your post reminded me of his meditations.

 

You can read them for free here: http://oregonstate.edu/instruct/phl302/texts/descartes/meditations/meditations.html

 

Specifically, read Meditation 1 and Meditation 2.

 

~mooey


Descartes' basic insight, that sophisticated thinking has to begin with radical doubt so that secure foundations can be provided for whatever is developed afterwards, became a basic premise of scientific thinking. It is no mere accident that Descartes' century, the seventeenth century, was really the foundation of modern science (Galileo, Kepler, Hooke, Newton).

 

But the specific application that Descartes made of his insight, that he can only be absolutely sure that he is a thinking being, while other things have a subsidiary reliability, is generally taken today to have been disproved by Wittgenstein and other writers in the 20th century. Simply put, the only possible source of stable rule use is a community of independent subjects whose cooperative interaction ensures that the rule use is kept stable. Now since stable rule use is essential for the possibility of using a language, and since only using a language allows us to pick out and re-identify objects from the continuum of our experience, we can only really know our own inner subjectivity as something, as an object of experience, if we already exist in a world of other independent selves whose use of language with us stabilizes that rule use and language. So the certainty of knowing yourself as a thinking being is itself parasitic on your experience of the outside world and other thinking beings, thus showing that it is not primary.


 

But the specific application that Descartes made of his insight, that he can only be absolutely sure that he is a thinking being, while other things have a subsidiary reliability, is generally taken today to have been disproved by Wittgenstein and other writers in the 20th century. Simply put, the only possible source of stable rule use is a community of independent subjects whose cooperative interaction ensures that the rule use is kept stable. Now since stable rule use is essential for the possibility of using a language, and since only using a language allows us to pick out and re-identify objects from the continuum of our experience, we can only really know our own inner subjectivity as something, as an object of experience, if we already exist in a world of other independent selves whose use of language with us stabilizes that rule use and language. So the certainty of knowing yourself as a thinking being is itself parasitic on your experience of the outside world and other thinking beings, thus showing that it is not primary.

 

Do you mean by this that we are so wrapped up in our particular thinking that language cannot possibly express it to other people, because the language of expression would be too general, and therefore subjective? So that I could not possibly verbalise to you what I am, or what I think, in a complete manner? I just wonder if it really matters whether I am totally definite and specific in what I say to you. I find that expressing my need to eat chocolate, and spending a bit of money, usually results in a chocolate bar which is quickly demolished. Rather than thought being secondary, I would suggest that our world is built on the foundations of thought and sensory experience. I take it that sensory deprivation from birth would not result in any thought. The outside world and the 'inside world' of thought are linked together by sensory experience.

 

We seem to be bound to what our senses tell us and cannot go beyond to see what is truly 'real' or 'true'. If a supercomputer gave us the impression of touch, heat and hearing, by wiring us all up into a virtual reality world, replicating the 'impressions' made by these senses, how would we know it was false anyway?


Jimmy: I think Wittgenstein would essentially agree with you. All capacity for generating stable rule use, which is of course the basis of having a usable language or determinate words as names for anything, is derivative of having experience of an outside world of other intellects who can make our rule use determinate by the pressure of mutual consensus. If I'm living alone on a desert island, perhaps I might want to make up a word for 'palm tree,' but then what stabilizes my conception of the meaning of that label as I encounter felled palm trees, floating palm trees, burned palm trees, and large bushes that look a lot like palm trees? Do I retain the same word or abandon it? Only in a community of speakers, each with his own independent rationality, is the rule use fixed, and thus rule use itself is parasitic on the prior existence of other people. This already means that Descartes' "cogito, ergo sum" can't be the primary certainty, since he couldn't even formulate that in language unless he already knew of other thinkers besides himself, so the logical primacy of his own consciousness is lost.

 

Similarly, the confused fugue of sensations which constitutes our inner states could never appear even to us as a determinate object, which we call 'the self,' unless it were framed into determinateness and stability by something lasting and determinate being given which was opposed to it. So our knowledge of the self is itself logically parasitic on our knowledge of the outside world. Again, this disproves Descartes' argument, that knowledge of the self is the first certainty.


Jimmy: I think Wittgenstein would essentially agree with you. All capacity for generating stable rule use, which is of course the basis of having a usable language or determinate words as names for anything, is derivative of having experience of an outside world of other intellects who can make our rule use determinate by the pressure of mutual consensus. If I'm living alone on a desert island, perhaps I might want to make up a word for 'palm tree,' but then what stabilizes my conception of the meaning of that label as I encounter felled palm trees, floating palm trees, burned palm trees, and large bushes that look a lot like palm trees? Do I retain the same word or abandon it? Only in a community of speakers, each with his own independent rationality, is the rule use fixed, and thus rule use itself is parasitic on the prior existence of other people. This already means that Descartes' "cogito, ergo sum" can't be the primary certainty, since he couldn't even formulate that in language unless he already knew of other thinkers besides himself, so the logical primacy of his own consciousness is lost.

 

Similarly, the confused fugue of sensations which constitutes our inner states could never appear even to us as a determinate object, which we call 'the self,' unless it were framed into determinateness and stability by something lasting and determinate being given which was opposed to it. So our knowledge of the self is itself logically parasitic on our knowledge of the outside world. Again, this disproves Descartes' argument, that knowledge of the self is the first certainty.

 

Really clearly explained Marat. Thank you. And I agree with you about the world of sensory experience giving rise to a knowledge of 'self'. However, in the first point that you made, we also have the example of babies and other animals who have a concept of 'self' whilst harbouring an inchoate sensory experience and possibly not using language. I'm not sure of any research on this but do babies recognise Mummy and Daddy whilst expressing an internal language and coming up with categories for surrounding objects?

 

I found this interesting snippet:

Until recently, brain experts generally agreed that the newborn, like the beloved storybook character Winnie the Pooh, was "a bear of little brain." A recent book on the nature of the child by a noted Harvard psychologist says the cortex of the young infant resembles that of an adult rat!

With such poor equipment, how could a newborn think? Academic psychologists use big words to deny infant mental activity: pre-symbolic, pre-representational, pre-reflective. In other words, babies are without words and cannot think. This relates to another myth: that in order to think, you must have language. Recent investigations have shown that babies do a lot of thinking, with or without language. You will see evidence of this thinking when your newborn purposefully reaches out, gives an inquisitive look, frowns (or screams) in protest, gurgles in satisfaction, or gasps in excitement. Newborns also listen intently to their mothers reading stories and prefer to hear again those heard weeks before birth. And note this: they listen attentively as long as mother reads forward, but will stop listening as soon as she reads backward (nonsense), another indication of good thinking.

Link


There may well be some evolutionary hard-wiring to frame certain similarity spaces into which sensations are preferentially packed rather than others, and Chomskians might want to call this a 'depth grammar' that constitutes a sort of proto-language we are born with to make language use possible. Certainly there must be some fact about the nature of the brain which causes all speakers of languages around the world to factor the scene of a forest into numbers of brown tree trunks against a green background, rather than into just two large but discontinuous objects, one a predominantly vertically arrayed brown object and the other a laterally arrayed green object.

 

But to be truly aware of ourselves as subjects of experience, or of the outside world as something different from us, or to use word names consistently over time, seems to require something more. Psychologists often maintain that when a baby cries, it believes that by crying it feeds itself: since it does not yet have the conceptual apparatus to distinguish itself and its wishes from the outside world, and since it is regularly given milk when it cries, it comes to believe that crying is a self-feeding device.


I believe in a lot of things, but I can admit that some of my beliefs are probably false. The problem is that I don't know which ones, so I still hold onto my beliefs, though for most of them I avoid holding them with certainty. In my opinion 100% of the world is the same. We all have various beliefs mixed together from what theories and movements tell us, like science, religion and philosophy, and of course we all have our own original opinions. But let's be honest: many of those beliefs are probably false.

 

Taking the pragmatic approach, it is likely that almost everything we hold to be true or logical will either evolve in complexity and void our current truth, or be shown to be totally fallacious by the new truth, which means almost every belief (other than perhaps existence itself) is or will be proved void. To me, things like morals and ethics are just like taste: we all have our own preferences, some are willing to try new things and some are not, some like sweet and some like sour. We can't dictate our taste, we just grow to accept it, and in some cases, if we know it's bad for us or others, we choose not to eat it anymore (like fatty food, or having a conscience).

 

I think actual truth is exclusive to the particular subject; however, to grow as a race we must create some collective consciousness in which we all accept our duty to the human race within a certain set of laws, and be open with each other about our own subjective beliefs. In the long run the rationals like me will destroy us all anyway, so forget it, have fun and get laid...


There is an important distinction between personal taste -- like saying that chocolate is better than vanilla -- and moral conviction -- like saying that murder is wrong. In the former case, if my neighbors disagree with me and assert that vanilla is in fact the better flavor, I just shrug and agree that tastes differ. I have no objection to their eating vanilla ice cream in front of me. But in the latter case, if the woman next door says murder is good, I will shun her and regard her as mentally ill. And if she starts to kill her husband I will call the police or intervene with force to stop her, as I would not if she were also wrong about eating vanilla ice cream instead of chocolate. Tastes are personal preferences, whereas moral injunctions are understood as objective, or at least intersubjective, truths. Is it objectively more true that the freezing point of water is 0 degrees C or that murder is wrong?


There is an important distinction between personal taste -- like saying that chocolate is better than vanilla -- and moral conviction -- like saying that murder is wrong. In the former case, if my neighbors disagree with me and assert that vanilla is in fact the better flavor, I just shrug and agree that tastes differ. I have no objection to their eating vanilla ice cream in front of me. But in the latter case, if the woman next door says murder is good, I will shun her and regard her as mentally ill. And if she starts to kill her husband I will call the police or intervene with force to stop her, as I would not if she were also wrong about eating vanilla ice cream instead of chocolate. Tastes are personal preferences, whereas moral injunctions are understood as objective, or at least intersubjective, truths. Is it objectively more true that the freezing point of water is 0 degrees C or that murder is wrong?

 

No, there really is no difference other than the way you have been raised. If murder had been a predominant part of your upbringing, i.e. for survival, then you would certainly no longer claim that your neighbour is insane; and on that note, I don't think you'd ring the police when you found out she was trying to kill him because he and a few of his mates drugged her (GHB/rohypnol) and made her life a misery. If we lived 1000 or even 2000 years ago, I'm sure things would have been very different. As you said, it's intersubjective, that's to say it's a majority vote, but still not objective. That's why we have laws in place: these laws are not right or wrong, they are just agreed upon by the majority, just as I'm sure the majority enjoy sweet over sour; it doesn't mean sweet is absolutely the best taste.

 

You're raised not to question people's tastes; you're raised with religion and laws (generally? ;) ).


Of course there are different gradations of certainty. If I met a society of people from another planet who regularly murdered each other and considered it fine, would I find them immoral or would I just accept that this behavior was acceptable for the kind of beings that they were? I might still find it immoral, given our strong projection of the negative moral significance of killing into the sphere of objective truths. In contrast, if these same alien beings asserted that 1 + 1 = 1.73, I would probably even more strongly want to assert that they were objectively wrong. So perhaps this has an even stronger anchor in objectivity. But in comparison with these two statements having the very highest degree of objectivity -- 1 + 1 = 2, and a very high degree of objectivity -- murder is wrong, it is clear that the assertion that chocolate is better than vanilla has only a very much weaker claim to be objectively true; in fact most people would admit that it is purely subjective.


Of course there are different gradations of certainty. If I met a society of people from another planet who regularly murdered each other and considered it fine, would I find them immoral or would I just accept that this behavior was acceptable for the kind of beings that they were? I might still find it immoral, given our strong projection of the negative moral significance of killing into the sphere of objective truths. In contrast, if these same alien beings asserted that 1 + 1 = 1.73, I would probably even more strongly want to assert that they were objectively wrong. So perhaps this has an even stronger anchor in objectivity. But in comparison with these two statements having the very highest degree of objectivity -- 1 + 1 = 2, and a very high degree of objectivity -- murder is wrong, it is clear that the assertion that chocolate is better than vanilla has only a very much weaker claim to be objectively true; in fact most people would admit that it is purely subjective.

 

I'm not concerned whether most people can admit that taste is subjective, or whether mathematics is objective. You said yourself that ethics are intersubjective, which is just combined subjectivity, not objectivity. In other words, it's my opinion that there are not many actually deducible truths other than perhaps logical, mathematical and scientific ones; all other truths are intersubjective, which to all intents and purposes is truth in our reality (hard to break away from that belief system).


There is a branch of mathematics that deals with choices called "Game Theory". Game theory shows that there are certain behaviours that are detrimental to groups.

 

For instance: Is murder good for the group?

 

Well, if a group were free to murder any other member of its group, then the group could quickly and severely be reduced in number.

 

If we then apply evolution and biology to this:

 

In species like humans, large groups have a survival advantage over smaller groups for several reasons:

 

1) In large groups that share (sharing can also be shown by game theory to be beneficial to a group), if an individual by bad luck is not able to get enough food to survive, then they can survive off the rest of the group. The larger the group, the smaller the cost to any individual. This is thus an advantage to the group and to the individuals involved.

 

2) In a large group you are less likely to be attacked by a predator. This is because a large group can more successfully defend themselves or frighten off predators and because if a predator randomly takes someone from the group then you have less chance of being the one taken.

 

So, if a group had a behaviour that caused the group to become smaller quickly, then this group would always be at a disadvantage against groups that didn't have that behaviour.

 

Thus, we can conclude that murder is bad for group survival.

 

Also, trust is an important part of group maintenance. If a group cannot trust the members of the group, then the group cannot act as a group, but instead acts as a collection of individuals. This means any benefits that the group would normally confer (as shown above) would not apply, and this kind of collection of individuals would be at a disadvantage against a group that could trust its members.

 

Again, this shows that murder is bad for group survival.

 

This type of analysis can show how certain behaviours are objectively bad (in most situations), and that these behaviours correlate closely with many immoral behaviours.
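To make the shape of that argument concrete, here is a minimal toy simulation (a sketch only; the growth, predation and killing rates are invented for illustration and are not drawn from any real data or model). It tracks two otherwise identical groups over many generations; in one of them a small fraction of members is killed internally each generation, and per-individual predation risk is assumed to fall as a group grows.

```python
import random

def simulate(generations=50, start_size=100, murder_rate=0.0,
             growth_rate=0.10, predation_base=0.08, seed=0):
    """Toy group-survival model. All parameter values are illustrative
    assumptions, not measurements: the group grows a little each
    generation, loses members to predation (less likely per individual
    in a larger group), and loses members to internal killing."""
    random.seed(seed)  # same random stream for both runs, for a fair comparison
    size = start_size
    for _ in range(generations):
        births = int(size * growth_rate)
        # Assumed: per-individual predation risk shrinks as the group grows.
        predation_rate = min(predation_base * (start_size / max(size, 1)), 1.0)
        predation_deaths = sum(random.random() < predation_rate
                               for _ in range(size))
        murders = int(size * murder_rate)
        size = max(size + births - predation_deaths - murders, 0)
    return size

print("final size, no internal killing:", simulate(murder_rate=0.0))
print("final size, 5% internal killing:", simulate(murder_rate=0.05))
```

With these made-up numbers the non-killing group grows while the killing group dwindles, which is the point of the argument above: a behaviour that shrinks the group compounds every other group-level disadvantage, so groups without it tend to out-survive groups with it.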


There is a branch of mathematics that deals with choices called "Game Theory". Game theory shows that there are certain behaviours that are detrimental to groups.

 

For instance: Is murder good for the group?

 

Well, if a group were free to murder any other member of its group, then the group could quickly and severely be reduced in number.

 

If we then apply evolution and biology to this:

 

In species like humans, large groups have a survival advantage over smaller groups for several reasons:

 

1) In large groups that share (sharing can also be shown by game theory to be beneficial to a group), if an individual by bad luck is not able to get enough food to survive, then they can survive off the rest of the group. The larger the group, the smaller the cost to any individual. This is thus an advantage to the group and to the individuals involved.

 

2) In a large group you are less likely to be attacked by a predator. This is because a large group can more successfully defend themselves or frighten off predators and because if a predator randomly takes someone from the group then you have less chance of being the one taken.

 

So, if a group had a behaviour that caused the group to become smaller quickly, then this group would always be at a disadvantage against groups that didn't have that behaviour.

 

Thus, we can conclude that murder is bad for group survival.

 

Also, trust is an important part of group maintenance. If a group cannot trust the members of the group, then the group cannot act as a group, but instead acts as a collection of individuals. This means any benefits that the group would normally confer (as shown above) would not apply, and this kind of collection of individuals would be at a disadvantage against a group that could trust its members.

 

Again, this shows that murder is bad for group survival.

 

This type of analysis can show how certain behaviours are objectively bad (in most situations), and that these behaviours correlate closely with many immoral behaviours.

 

This is taking a purely logical approach, though; there's no room for emotional accounts. E.g., an eye for an eye might not be logical for the group, but it is ethically "fair" for the individual.

 

This kind of means you made something objective that was subjective, by applying objective criteria to it; but in reality you have just subtracted the subjective aspect altogether.

 

P.S. I'm a rational, so I rather like game theory ;)


But suppose some alien beings were rapidly subdividing and reproducing, so like a batch of bacteria, the greatest threat to their survival was that they would be poisoned by the accumulation of their own uncleared metabolic by-products acting as toxins. In such a group of aliens murder might be a survival advantage to the group, by culling the surplus population, just as it has proved to be in certain emergencies among humans, such as when some passengers were tossed overboard to prevent a sinking boat from being swamped by the waves (U.S. v. Holmes, 1842). Perhaps such a group of aliens would approve of murder, just as some primitive tribes now do, when they refuse to accept young males as full members of the group unless they have killed one person from another tribe.


But suppose some alien beings were rapidly subdividing and reproducing, so like a batch of bacteria, the greatest threat to their survival was that they would be poisoned by the accumulation of their own uncleared metabolic by-products acting as toxins. In such a group of aliens murder might be a survival advantage to the group, by culling the surplus population, just as it has proved to be in certain emergencies among humans, such as when some passengers were tossed overboard to prevent a sinking boat from being swamped by the waves (U.S. v. Holmes, 1842). Perhaps such a group of aliens would approve of murder, just as some primitive tribes now do, when they refuse to accept young males as full members of the group unless they have killed one person from another tribe.

 

Right, I get that; that was pretty much what I was trying to say when I said morals and ethics are subjective, meaning they are relative to the individual's conditions. Game theory aside, if murder is needed for survival, or is instinctively built into oneself, it's hard to prosecute it or define it as a bad thing.

 

Back to the question at hand: it follows that everything that isn't logical, mathematical or scientific is subjective, and therefore true to oneself but false of reality. Everything YOU know is TRUE, there are a few things that are collectively TRUE to most of us, and finally there are TRUTHS which are exclusive to ourselves and more than likely fallacious in reality (such as my belief that you can create energy).


This is taking a purely logical approach, though; there's no room for emotional accounts. E.g., an eye for an eye might not be logical for the group, but it is ethically "fair" for the individual.

 

This kind of means you made something objective that was subjective, by applying objective criteria to it; but in reality you have just subtracted the subjective aspect altogether.

 

P.S. I'm a rational, so I rather like game theory ;)

 

Why would it not be logical for the group? It punishes someone who has done wrong and shows others that deviance will be punished. So if the punishment of one dissuades two others from becoming deviant, it is cost-effective.


Why would it not be logical for the group? It punishes someone who has done wrong and shows others that deviance will be punished. So if the punishment of one dissuades two others from becoming deviant, it is cost-effective.

 

If you take the eye of somebody who would usually be an active member of the group (say a fireman; physical harm usually takes certain abilities away), then it certainly doesn't benefit the group, because we would be a fireman down. Putting them in prison for a certain amount of time, taking away their liberty and rights, might be a better way of stopping deviants (though an eye for an eye isn't really devious, it's straightforward).

 

In other words, physical revenge usually isn't the best way to teach a lesson, and hence isn't really logical (there is logic there, but when you add other aspects it becomes either illogical or inadequate logic); in fact, fighting fire with fire only makes it hotter and burn quicker.


But imprisonment also takes that member away from society, and for a longer period. Not to mention it allows a deviant to intermingle with other deviants and learn how to commit crimes more effectively. For the most part, there is no 'perfect' answer to deviance.

