
Why INTP-types cannot really discuss anything with each other

MEDICaustik

Member
Local time
Today 3:05 PM
Joined
Mar 5, 2012
Messages
85
---
Whenever Inquisitor disagrees with someone, all of his paragraphs tend to begin with some sort of annoying passive-aggressive term like "Lol" or "Hilarious" or "Umm". It's too annoying to reply to.

It's not passive-aggression.

It's condescension.
 

Intolerable

Banned
Local time
Today 3:05 PM
Joined
Nov 13, 2015
Messages
1,139
---
I would say it depends on age and, with that, the value of one's own convictions.

I am your basic INTP in all those regards (not reading much, contesting a lot, etc.), but mostly because my inquisitive years are behind me. I know enough about the world, life and the universe, and anything I don't know I don't care to explore any further.

When I was younger I had much more conviction to understand life. I suspect that's true for most of us.
 

Tannhauser

angry insecure male
Local time
Today 9:05 PM
Joined
Jul 18, 2015
Messages
1,462
---
Ok I can see what you mean by the term "different way-of-thinking", but it does not imply that the theories of those two historical figures are based on different idealized realities. If anything, the fact that both probabilities are magnitudes/numbers between 0 and 1, and you can use the probabilities interchangeably, is evidence of the opposite.

What probabilities essentially are, are ratios. Fermat's probability theory is worded differently from that of Bayes, and may have been worded in a different language, but it relies on the same intuitive framework of numbers+relations and uses this framework in the same way, as ratios.

Just reading this section: "update it as more information is available to you" – I translate it as meaning that Bayes adopts Fermat's definition of probability as soon as it becomes applicable over time.

In other words, Bayes' definition comes down to a hypothetical question: what would the probability (Fermat's definition) be if the experiment went the way you believe it would go? And later, once the experiment stops being the uncertain future and becomes the observable past, your belief, and probability, adjust (according to Fermat's definition).

What Bayes and Fermat did was not invent a new idealization of reality that they called "probabilities". Rather, they found words that would make creatures/humans who already have the ability to perceive an idealized reality aware of implications of this idealization – the inevitable implication of ratios. That's what I meant by "stumbling in a dark room": we are all relatively stupid, we can't easily see any but the most rudimentary implications of our own idealization. That doesn't change the fact that those implications are still there.
Even if you say that a probability is a ratio, that in itself doesn't mean anything. As I mentioned, the most intuitive interpretation is that if this ratio is, say, 1/3, the frequentist (or Pascalian/Fermatian) interpretation is that the long-run stable ratio of this event, over all possible outcomes, is 1/3 – i.e. the event will show up 1/3 of the time in a repeated experiment. But if you ask "what is the probability of my house being robbed tomorrow", that interpretation is no longer useful/valid, because you cannot repeatedly observe your house being robbed or not robbed tomorrow – there is no stable ratio.

A Bayesian probability is not just another version of a frequentist probability; it is more like a bet. There is nothing that stops us from even setting the probability outside the range [0, 1], introducing negative probabilities. A probability is just a mapping from the space of outcomes onto some interval of numbers – it can only be [0, 1] in the frequentist interpretation, but can be any interval in another.
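A minimal Python sketch of the frequentist reading above, assuming a simple repeatable event with ratio 1/3; the event and trial count are illustrative only, not something from the thread:

```python
# Minimal sketch of the frequentist reading: for a repeatable experiment,
# the relative frequency of an event settles near its ratio (here 1/3).
# A one-off event like "my house is robbed tomorrow" has no such sequence
# of repetitions to average over.
import random

random.seed(0)
trials = 100_000
hits = sum(1 for _ in range(trials) if random.randrange(3) == 0)
print(hits / trials)  # close to 1/3 for large trial counts
```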

Hence, you see – it is easy to get blinded by established teachings and become convinced that what you have been taught is the one and only possibility. As an aside point, I think people would have a much easier time learning math if they first learned that math is just a system built by humans with a particular way of thinking – it is not some magical, universal system inscribed into the matter of the universe.

If it's still relevant, maybe you can give an example of what you mean? Because the case I made about human idealization dictates one idealized reality.

How familiar are you with the concept of induction? (With it, it might be easier to explain my point of view, maybe.)
I actually don't have any good examples for the language claim right now, so I might retract that one. But yeah, I am fully familiar with induction.
 

Inquisitor

Well-Known Member
Local time
Today 3:05 PM
Joined
Mar 31, 2015
Messages
840
---
Whenever Inquisitor disagrees with someone, all of his paragraphs tend to begin with some sort of annoying passive-aggressive term like "Lol" or "Hilarious" or "Umm". It's too annoying to reply to.

You just did. The real reason you and I don't agree is because you flatly dismiss Jung even though you haven't actually read anything by him. This whole thread is completely hypocritical. With regards to typology, you consistently do the very thing you purport to dislike about "INTP types." You're not even willing to consider the possibility that Jung might be correct, let alone that you could find something of value in his writings. It's a complete dismissal without even the beginnings of any investigation, like you know in advance exactly what is contained in the fairly substantial literature on this topic.

It's not passive-aggression.

It's condescension.

No. It's actually just funny. It's like there's a complete disconnect in his mind between what he says and does.
 

Tannhauser

angry insecure male
Local time
Today 9:05 PM
Joined
Jul 18, 2015
Messages
1,462
---
You just did. The real reason you and I don't agree is because you flatly dismiss Jung even though you haven't actually read anything by him. This whole thread is completely hypocritical. With regards to typology, you consistently do the very thing you purport to dislike about "INTP types." You're not even willing to consider the possibility that Jung might be correct, let alone that you could find something of value in his writings. It's a complete dismissal without even the beginnings of any investigation, like you know in advance exactly what is contained in the fairly substantial literature on this topic.



No. It's actually just funny. It's like there's a complete disconnect in his mind between what he says and does.

You have basically misunderstood the whole point of the OP. Respecting literature and the works of our predecessors means being able to analyze ideas in the context of a bigger collection of ideas. When you treat the works of Jung the way you do, you are committing the mistake I am describing in the OP: building an esoteric system of concepts with complete disregard for everything else – for example the tradition of epistemology. I believe we have discussed MBTI in light of epistemology quite extensively. I tried to relate it to, for example, Popper's ideas of falsification and the difference between MBTI and Einstein's theory of relativity; I think I even quoted Kant at some point. That was my attempt at bringing the theory into a bigger frame of ideas. Meanwhile, all I hear from you is that I haven't read every word of Jung's works, therefore I cannot reject it. But the thing is that I can – because I know what sort of epistemological grounds it is built on. All the intricacies of Jung's elaborate scheme are not interesting to me, and I doubt that somewhere in his books he hid a paragraph saying "Just kidding, I actually have a falsifiable and testable theory which no one has heard about."

Other than that, I'd like to point out that I have never claimed to be free from the faults I am describing in the OP. In fact, most of the time I am writing shit about INTP-types, I am writing about my own faults.
 

Teax

huh?
Local time
Today 9:05 PM
Joined
Oct 17, 2014
Messages
392
---
Location
in orbit of a friendly star <3
As an aside point, I think people would have a much easier time learning math if they first learned that math is just a system built by humans with a particular way of thinking – it is not some magical, universal system inscribed into the matter of the universe.
For people who value history above all else, maybe... actually no, I don't understand. Because for some, this information would be a major motivation killer. I mean, what's so wrong with a universal system? Wouldn't it make people want to engage with it more if they knew they're discovering a fundamental truth, rather than merely memorizing what some historical figure dictates?

(Also, just in case you were referring to my stance, I have by no means proposed that math is a system that the objective reality is based on, isn't that what you meant by "universal"?)

Even if you say that a probability is a ratio, that in itself doesn't mean anything. As I mentioned, the most intuitive interpretation is that if this ratio is, say, 1/3, the frequentist (or Pascalian/Fermatian) interpretation is that the long-run stable ratio of this event, over all possible outcomes is 1/3 – i.e. the event will show up 1/3 of the time in a repeated experiment. But if you ask "what is the probability of my house being robbed tomorrow", that interpretation is no longer useful/valid, because you cannot repeatedly observe your house being robbed or not robbed tomorrow – there is no stable ratio.
No, there still can be.

This video also talks about your scenario. (what a coincidence ;) )
https://www.youtube.com/watch?v=GC-l345c1FY#t=

Consider the coin toss experiment. If you play back a recording of the first coin toss, the coin is launched exactly the same way every time and yields the same result <-- useless... So when talking about probabilities, what we implicitly mean is "With the randomized initial state given by tossing a coin, all other things being equal/repeated, what is the long-run stable ratio?".

With the question about the house we're not sure what part of the state to randomize; it's not 100% clear/implied by the context. That is why we perceive it as an invalid question. However, analogous to the coin toss, one way of wording the burglar question is "If I were to repeat this entire week over and over again, given a randomized initial state of the burglar-dudes, everything I personally did and know being equal, what is the long-run stable ratio of such week-repetitions where my house gets robbed?".
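A rough Python sketch of that week-repetition reading; the 2% burglar-activity rate is a made-up number purely for illustration. Everything "I personally did and know" is held fixed, and only the uncertain part of the initial state is randomized on each repetition:

```python
# Illustrative only: the 2% burglar-activity rate is invented for the sketch.
import random

random.seed(1)

def week_gets_robbed(burglar_activity=0.02):
    # Randomized initial state: does a burglar target the house this week?
    return random.random() < burglar_activity

repetitions = 200_000
robbed = sum(week_gets_robbed() for _ in range(repetitions))
print(robbed / repetitions)  # long-run ratio of week-repetitions with a robbery
```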

This is a good example of the kind of understanding you only get by not basing everything on the literal words of those historical figures, who in those early years of math had not yet seen as much of the big picture as we are privileged to see today.

Tannhauser said:
A Bayesian probability is not just another version of a frequentist probability, it is more like a bet.
Yes, just because Bayes called his abstract concept by the same word, "probability", doesn't mean that Bayes is talking every step of the way about what we consider probabilities. He only arrives at probabilities by the end of his theory. I'll use your word "bet" to separate Bayes.

Tannhauser said:
There is nothing that stops us from even setting the probability outside the range [0, 1], introducing negative probabilities. A probability is just a mapping from the space of outcomes onto some interval of numbers – it can only be [0, 1] in the frequentist interpretation, but can be any interval in another.
This should stop you: if it doesn't make sense. It sounds like you repeated point 1 on the slide at minute 13:30 of the video, but tell me, how does an arbitrary assignment lay a useful groundwork for real bets? And if it does not, it's nonsense – how could it be a useful basis for describing probabilities in reality?

Point 2 on the same slide in the video imposes a restriction: to be based on reality. This is what Bayes called a "rational betting strategy", and as a result he finds that the resulting bet-theory is consistent with all the rules of frequentist probability. A rational bet is equivalent to, e.g., asking about the above hypothetical burglar question/experiment what you believe the long-run ratio would be.

Bayes' work is further proof that there is only one rational way to define probabilities, in harmony with my proposition that there is only one idealized reality.

Tannhauser said:
Hence, you see – it is easy to get blinded by established teachings and become convinced that what you have been taught is the one and only possibility
Well, I wasn't pulling that claim out of thin air :p and I did not arrive at it through analysis of every aspect of math (negative numbers, complex numbers, probabilities) separately. Rather, my stance is inferred directly from my proposition, which I (tried to in my previous posts but) doubt I can explain without induction:

We perceive reality by telling things apart. The term "object" or "thing" or "unit" is the one most basic concept you can find, it applies to everything. Telling objects apart/identifying things allows us to see/memorize relations between objects, because we can tell these relations apart (and we can tell the memories about the relations apart etc..etc..).

Formal math in its current form is induced over set theory. All axioms of set theory are induced over 1) telling objects apart and 2) relations between objects. Some argue with me that 1) and 2) are the same/implied, but I'm separating this for understandability. That's why the structure of reality and math seem to be so closely related: when you observe reality, you involuntarily idealize it through 1) and 2), which results in patterns all of which are part of math because math is the induction over 1) and 2).

As a side-note: The brain/neural network is a highly parallelized machine with the express purpose of (surprise) telling things apart.
:crazy:
 

JimJambones

sPaCe CaDeT
Local time
Today 3:05 PM
Joined
Mar 18, 2013
Messages
412
---
You have basically misunderstood the whole point of the OP. Respecting literature and the works of our predecessors means being able to analyze ideas in the context of a bigger collection of ideas. When you treat the works of Jung the way you do, you are committing the mistake I am describing in the OP: building an esoteric system of concepts with complete disregard for everything else – for example the tradition of epistemology. I believe we have discussed MBTI in light of epistemology quite extensively. I tried to relate it to, for example, Popper's ideas of falsification and the difference between MBTI and Einstein's theory of relativity; I think I even quoted Kant at some point. That was my attempt at bringing the theory into a bigger frame of ideas. Meanwhile, all I hear from you is that I haven't read every word of Jung's works, therefore I cannot reject it. But the thing is that I can – because I know what sort of epistemological grounds it is built on. All the intricacies of Jung's elaborate scheme are not interesting to me, and I doubt that somewhere in his books he hid a paragraph saying "Just kidding, I actually have a falsifiable and testable theory which no one has heard about."

Other than that, I'd like to point out that I have never claimed to be free from the faults I am describing in the OP. In fact, most of the time I am writing shit about INTP-types, I am writing about my own faults.

"Introverted intuition apprehends the images which arise from the a priori, i.e. the inherited foundations of the unconscious mind. These archetypes, whose innermost nature is inaccessible to experience, represent the precipitate of psychic functioning of the whole ancestral line, i.e. the heaped-up, or pooled, experiences of organic existence in general, a million times repeated, and condensed into types. Hence, in these archetypes all experiences are [p. 508] represented which since primeval time have happened on this planet. Their archetypal distinctness is the more marked, the more frequently and intensely they have been experienced. The archetype would be -- to borrow from Kant -- the noumenon of the image which intuition perceives and, in perceiving, creates." Carl Jung

Sounds pretty incompatible with, I don't know: a physical, naturalistic explanation of the universe. I would rather wait to see what we can discover about the mind through scientific understanding than claim to "know" the "objective" subjective life of other people without any data whatsoever.

So far, no patterns can be discerned when individual minds are interviewed and studied that would correlate with the a priori categories Jung proposed. This alone should be enough to disregard them. Instead, what you find are researchers who will take the data and make them fit their theory, making the endeavor pseudoscience.

INTP is in actuality a very broad category and is dependent on the subject's interpretation of the description and not anything empirical.
 

QuickTwist

Spiritual "Woo"
Local time
Today 2:05 PM
Joined
Jan 24, 2013
Messages
7,182
---
Location
...
INTP is in actuality a very broad category and is dependent on the subject's interpretation of the description and not anything empirical.

I like this, however, I would have said "MBTI type is in actuality a very broad category and is dependent on the subject's interpretation of the description and not anything empirical."

Back on topic: I noticed no one even bothered to ask the OP why he thinks this. What is the OP drawing from to get the conclusion that INTPs can't relate to each other (if those are the right words) compared to other types? What about other IT types? What about other I types? What makes INTPs not relate? What do other types have that INTPs do not? Sorry to frame it this way, but the OP talks about a communal problem having to do with other INTPs, so I just assumed the OP is not wasting everyone's time with a personal one that cannot be answered because of the reference to themselves as the primary source of the problem.
 

Sinny91

Banned
Local time
Today 8:05 PM
Joined
May 16, 2015
Messages
6,299
---
Location
Birmingham, UK
I presumed Tan asked because of his experiences with other INTPs... online, perhaps.

I would suggest that his hypothesis isn't wholly accurate, and 'it' just depends on the individuals in question.
 

Tannhauser

angry insecure male
Local time
Today 9:05 PM
Joined
Jul 18, 2015
Messages
1,462
---

INTP is in actuality a very broad category and is dependent on the subject's interpretation of the description and not anything empirical.

Exactly. Therefore, the point that some people have made that the problem applies to more people than just INTPs seems somewhat void to me. One can replace "INTP" in the OP with "many people" or whatever, or one can try to recognize the broad tendencies of people usually described as INTP. Or one can take an idealized version of an INTP – the one from the typical INTP profile description – and recognize that the INTP's will to systems and to reducing everything to logic can often lead to blinding oneself to a multitude of sources of outside information. Or one can even define INTP, in this context, as people who are the perpetrators of the mistakes I described. None of these options should actually change the discussion.
 

Tannhauser

angry insecure male
Local time
Today 9:05 PM
Joined
Jul 18, 2015
Messages
1,462
---
This should stop you: if it doesn't make sense. It sounds like you repeated point 1 on the slide at minute 13:30 of the video, but tell me, how does an arbitrary assignment lay a useful groundwork for real bets? And if it does not, it's nonsense – how could it be a useful basis for describing probabilities in reality?
It's very simple. Probability, as it is defined in axiomatic probability theory, is a function which maps values from a set of events into the interval [0, 1]. But this definition was made by Kolmogorov to satisfy the intuitions of frequentist probability. Whether it makes sense or not is a matter of utility. We can, for example, do the following: instead of mapping values into [0, 1], we map them into, say, the interval [-0.5, 1.5]. What is the use of this? Well, in some contexts you are only interested in a weighting of future events. For example, let's say you win $10 with probability -0.5 and win $15 with probability 0.5. Your expectation for the bet is -0.5*10 + 0.5*15 = 2.5.

For example, in finance we have the concept of a risk-neutral probability measure. In that context, probability doesn't really mean probability as you think of it; it just means, as in my example, a weighting of future events to arrive at an expectation.
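Tannhauser's weighting example, written out as a tiny Python sketch; this is just the arithmetic on his numbers, not a claim about standard (Kolmogorov) probability:

```python
# Expectation under weights that are allowed to fall outside [0, 1],
# following the example in the post; the payoffs and weights are his.
def expectation(payoffs_and_weights):
    return sum(weight * payoff for payoff, weight in payoffs_and_weights)

print(expectation([(10, -0.5), (15, 0.5)]))  # -0.5*10 + 0.5*15 = 2.5
```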

But anyway, I would agree that most, if not all, mathematics makes intuitive sense. But it does so, as I have argued, because it was created to serve specific purposes. The fact that mathematics "makes sense" is nowhere near a proof that mathematics is a way of discovering the one and only idealized version of reality. Would your theory be disproved if we found one case where it didn't make intuitive sense?

We perceive reality by telling things apart. The term "object" or "thing" or "unit" is the one most basic concept you can find, it applies to everything. Telling objects apart/identifying things allows us to see/memorize relations between objects, because we can tell these relations apart (and we can tell the memories about the relations apart etc..etc..).

Formal math in its current form is induced over set theory. All axioms of set theory are induced over 1) telling objects apart and 2) relations between objects. Some argue with me that 1) and 2) are the same/implied, but I'm separating this for understandability. That's why the structure of reality and math seem to be so closely related: when you observe reality, you involuntarily idealize it through 1) and 2), which results in patterns all of which are part of math because math is the induction over 1) and 2).
Yes, but if I had lived in a culture where, say, anytime we have N apples we mash them into an apple purée, a sensible axiom for us to start with might be: 1 + 1 + ... + 1 = 1 for any number of 1's. So when you and I met, we would disagree on basically all mathematical results.
 

Matt3737

INFJ
Local time
Today 2:05 PM
Joined
Oct 7, 2012
Messages
155
---
Location
Arkansas
Yes, even abstract knowledge like that cannot exist without some assumptions.

The assumptions for 2+2=4 are not imposed by culture, they exist as part of the human condition. Axiom is the word for such assumptions. Axioms do not have to be assumed, you assume them by default through interaction with what you perceive as reality.

Therefore axioms (and most of math) do not have to be developed and put forth by anyone to be useful. They're used by all kinds of illiterate creatures, and are merely discovered by culture to assign language-constructs to it so we can communicate to each other about it.

As used in mathematics, the term axiom is used in two related but distinguishable senses: "logical axioms" and "non-logical axioms". Logical axioms are usually statements that are taken to be true within the system of logic they define (e.g., (A and B) implies A), while non-logical axioms (e.g., a + b = b + a) are actually substantive assertions about the elements of the domain of a specific mathematical theory (such as arithmetic). When used in the latter sense, "axiom", "postulate", and "assumption" may be used interchangeably. In general, a non-logical axiom is not a self-evident truth, but rather a formal logical expression used in deduction to build a mathematical theory. As modern mathematics admits multiple, equally "true" systems of logic, precisely the same thing must be said for logical axioms - they both define and are specific to the particular system of logic that is being invoked. To axiomatize a system of knowledge is to show that its claims can be derived from a small, well-understood set of sentences (the axioms). There are typically multiple ways to axiomatize a given mathematical domain.

No you don't. The understanding of the number zero and negative numbers is innate in all of us. Numbers don't exist in a vacuum, they always have a purpose, a direction. The sign is nothing more than a direction modifier. The minus sign is the opposite direction.

Records show that the ancient Greeks seemed unsure about the status of zero as a number. They asked themselves, "How can nothing be something?", leading to philosophical and, by the Medieval period, religious arguments about the nature and existence of zero and the vacuum. The paradoxes of Zeno of Elea depend in large part on the uncertain interpretation of zero.

By 130 AD, Ptolemy, influenced by Hipparchus and the Babylonians, was using a symbol for zero (a small circle with a long overbar) within a sexagesimal numeral system otherwise using alphabetic Greek numerals. Because it was used alone, not just as a placeholder, this Hellenistic zero was perhaps the first documented use of a number zero in the Old World. However, the positions were usually limited to the fractional part of a number (called minutes, seconds, thirds, fourths, etc.)—they were not used for the integral part of a number. In later Byzantine manuscripts of Ptolemy's Syntaxis Mathematica (also known as the Almagest), the Hellenistic zero had morphed into the Greek letter omicron (otherwise meaning 70).

Another zero was used in tables alongside Roman numerals by 525 (first known use by Dionysius Exiguus), but as a word, nulla meaning "nothing", not as a symbol. When division produced zero as a remainder, nihil, also meaning "nothing", was used. These medieval zeros were used by all future medieval computists (calculators of Easter). The initial "N" was used as a zero symbol in a table of Roman numerals by Bede or his colleague around 725.

(All Gödel did was prove that there is no set of axioms that can simultaneously "be used to prove/disprove every possible equation" and "provide exactly one solution for every possible equation". It's an interesting result, but nothing that pertains to our discussion. But many people misunderstand the implications of this, so the internet is full of crap about this topic, as you can imagine.)

You acknowledge there is no single set of axioms that makes Zermelo-Fraenkel set theory complete, yet you seem intent on believing that there is one singular set of axioms applicable to all people under the broad category of 'mathematics' or possibly that mathematics is the solely accurate representation of a singular, objective reality and so all people are beholden to it?

Mathematics is not a singular entity. There are many alternative theories within mathematics that are not even necessarily compatible with one another.

"No man ever steps into the same river twice." - Heraclitus
 

Tannhauser

angry insecure male
Local time
Today 9:05 PM
Joined
Jul 18, 2015
Messages
1,462
---
^ an example of reasoning with a historical context. Great post, sir.
 

Teax

huh?
Local time
Today 9:05 PM
Joined
Oct 17, 2014
Messages
392
---
Location
in orbit of a friendly star <3
While in principle I have nothing against you quoting stuff, you probably should write at least one short sentence about how you think your quote relates to what I said. When you don't, I am left with nothing to indicate what you might have misunderstood, and thus have no basis for a direct response to your stance. The following 2 responses are based on a guess:

<Excerpt from Wikipedia article Axiom>
Nothing that the wiki article says contradicts what I meant. I claim that some axioms are embedded into the idealization process of reality, and since axioms have to be assumed to work within an axiomatized system, through interaction with what you perceive as reality you automatically assume these axioms.

<Excerpt from Wikipedia article Zero>
Again, no problems. While it is entertaining how the old folk were kept from progress by their religious superstitions, this still shows that they had intuitive knowledge about the existence of a zero. About the bottom paragraphs, I already said:

It doesn't matter what we call the number: call it "0", call it "N" or call it "月". It's uniquely identifiable as the number just before "one", and that's all that matters.

You acknowledge there is no single set of axioms that makes Zermelo-Fraenkel set theory complete, <=> yet you seem intent on believing that there is one singular set of axioms applicable to all people
Yes, the part where you quoted me implied what you just said. So all your post amounts to is "how dare you believe that". If you think that not being able to make a set theory complete is a contradiction of my proposition, please show your reasoning so I can at least know your understanding of the matter and how to properly respond.

In other words: There is a non-sequitur at the red "<=>" thingy.

possibly that mathematics is the solely accurate representation of a singular, objective reality and so all people are beholden to it?
This is not what I mean, but rather, even if you're not convinced about the existence of an objective/physical reality, math exists between you and what you perceive as a subjective reality, as the idealization of what you perceive.

If you want to know more, I suggest you read my other posts too, since all my posts in this entire thread relate to this one proposition.

Mathematics is not a singular entity. There are many alternative theories within mathematics that are not even necessarily compatible with one another.
That doesn't prevent them from all being based on one idealized reality.
 

Teax

huh?
Local time
Today 9:05 PM
Joined
Oct 17, 2014
Messages
392
---
Location
in orbit of a friendly star <3
Tannhauser said:
But anyway, I would agree that most, if not all mathematics makes intuitive sense.
Great, this new common ground gave me an idea about how we might steer the conversation in a less theoretical direction:

If the INTP has, as the OP said, "built a matrix of conceptual connection in his mind", and if mathematics makes intuitive sense, then his conceptual connections correspond to some mathematical equation/formal system. This implies that you cannot dismiss what an INTP believes based on the mere fact that he disrespects existing literature.

Because even without respecting existing literature, his intuitive understanding can potentially be a solution that is equivalent to a solution that respects existing literature (if he didn't contradict himself in his theory). And two INTPs, where one or both disrespect existing literature, can discuss, relate and learn from each other, and their "tiny aquarium" is not disjoint, but intersects greatly with ideas present in culture, even without them knowing it.




If this argument doesn't end the discussion, I'll have to make a meta-post inside this spoiler, to put some things straight. Otherwise it seems we might argue in a circle. And that wouldn't be fun.
Tannhauser said:
Respecting literature and the works of our predecessors means being able to analyze ideas in the context of a bigger collection of ideas.
Thank you! I couldn't put my finger on what was missing. Still, even with this explanation, you must realize that I am advocating the same thing, if we were to interpret your words differently.
:D

I believe the OP advocates a very unhealthy attitude towards history as an ultimate authority. To demonstrate what I mean here, just look at Matt3737's post. To me it looks like he simply takes the literal wording of whatever he is quoting, and if it sounds like it contradicts my position, then I must be wrong, right? It sounds like the OP advocates that we all behave like this and treat history as a collection of constructed ideas or "quotes", which we must all adhere to, parrot, and use "as is", instead of even considering that there exists a unifying framework for these ideas – without proof that there is none.

Tannhauser said:
The fact that mathematics "makes sense" is nowhere near a proof that mathematics is a way of discovering the one and only idealized version of reality.
Yeah, in my previous posts I was replying to your misgivings about math making sense and showing you merely the implications of my theory.

I realize now that I was implicitly hoping that you would pick up on how my premise relates to these implications on your own. We all do that when we communicate; it's just that I created too big a leap between premise and implication. My explanation that involved the word "induction" bridges that disconnect in smaller steps.
Tannhauser said:
But it does so, as I have argued, because it was created to serve specific purposes.
You must realize that from my perspective, we have no reason to believe your claim about history being the basis of math.

I am not advocating ignoring history altogether. What I mean by "communication is important" is: we cannot discover math on our own in just one lifetime, so we should rely on the work of our forefathers as a basis, to learn from it. The USA may be built on the discovery of Columbus, but that doesn't mean that Columbus invented the continent.

I am also not saying that physics is "one and only", since physics is not an induction of the fundamental axioms, but a distorted view of what could be an objective reality, through our limited lens of math. And if you want to talk about purpose: if you want to design a car, you should learn from the experience of other engineers about designing cars, since a car is an invention with that specific purpose.

But mere purpose does not imply that every invention is actually an invention, rather than a discovery. If you also don't have a reason to believe that, the rational position for you would be to be impartial about this topic, rather than clinging to the OP. That is my central point for this thread.

Everything beyond that boils down to you asking me what reason I have to believe the exact opposite.

Tannhauser said:
Would your theory be disproved if we found one case where it couldn't make intuitive sense?
Yes. Since that is the implication of my theory, if you disprove the implication, you also disprove the theory. But notice I have changed one word, from "didn't" to "couldn't".

Without this modification, the answer is no. Because how are we to know whether there really exists no intuitive understanding, or whether we both are just too dumb to see it? After all, you already demonstrated how easy it is to overlook the intuitive understanding of a/b when you've been taught it in a historical context in school (which is advocated in the OP) as a mere manipulation of symbols.

Tannhauser said:
We perceive reality by telling things apart. The term "object" or "thing" or "unit" is the one most basic concept you can find, it applies to everything. Telling objects apart/identifying things allows us to see/memorize relations between objects, because we can tell these relations apart (and we can tell the memories about the relations apart etc..etc..).

Formal math in its current form is induced over set theory. All axioms of set theory are induced over 1) telling objects apart and 2) relations between objects. Some argue with me that 1) and 2) are the same/implied, but I'm separating this for understandability. That's why the structure of reality and math seem to be so closely related: when you observe reality, you involuntarily idealize it through 1) and 2), which results in patterns all of which are part of math because math is the induction over 1) and 2).
Yes, but if I had lived in a culture where, say, anytime we have N apples we mash them into an apple purée, a sensible axiom for us to start with might be: 1 + 1 + ... + 1 = 1 for any number of 1's. So when you and I met, we would disagree on basically all mathematical results.
It looks like you are ignoring the difference between the semantic and syntactic definitions of an idealized concept.

I concede that the number zero is assigned the symbol "0" because of our culture. In the same manner I concede that the abstract concept of addition is assigned the symbol "+". This is all syntactic sugar, and irrelevant to the question about whether the semantic understanding of what "addition" is, is universal to all humans.

You automatically start with axioms 1) and 2), independent of what kind of culture you lived in... call them "fundamental axioms". Humans who live in your mash-culture would still have a chance to observe 2 apples, to observe each other, to distinguish 2 moments of time or to hear 2 "punch noises" when mashing apples, as being something more than one, because of axiom 1).

Given the magnitude 1, which is akin to an "I think therefore I am" axiom. Or just 1 apple. Given the realization that there exists another number, which is (by relation) the next number after one. This is akin to "I just met someone else, who is not me". Or, having 2 apples.

This is all you need to induce all natural numbers, and all the theoretical groundwork you need for addition. (Is this obvious to you, or should I explain?)
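One way to make that induction concrete is a Peano-style sketch in Python; the names Zero, Succ and add are labels chosen here for the "first object" and the "next" relation, not anything from the thread:

```python
# Peano-style sketch: naturals built only from a first object ("Zero") and
# a "next" relation ("Succ"), with addition defined as a combinator of "next".
class Zero:
    pass

class Succ:
    def __init__(self, pred):
        self.pred = pred  # the number this one is "next" after

def add(a, b):
    # a + 0 = a;  a + next(b) = next(a + b)
    if isinstance(b, Zero):
        return a
    return Succ(add(a, b.pred))

def to_int(n):
    # display helper only
    return 0 if isinstance(n, Zero) else 1 + to_int(n.pred)

one = Succ(Zero())
two = Succ(one)
print(to_int(add(two, one)))  # 3
```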

If the humans in your mash-culture are not smart enough to discover the "+" relation (which is just a basic combinator of the "next" relations I just mentioned) because of their apple-mashing habit, maybe because they're being put in a state of pre-conditioned berserk every time they see 2 apples, then their culture is the cause of their stupidity, but it does not in any way render them incapable of understanding and discovering the one and only existing semantic concept of addition.

Just like me talking to one such creature does not prevent me from understanding the one and only concept of 1[+]1=1. I differentiate + from [+] merely for syntactic reasons, this has nothing to do with the semantic meaning of + and [+].

It's very simple. Probability, as it is defined in axiomatic probability theory, is a function which maps values from a set of events into the interval [0, 1]. But this definition was made by Kolmogorov to satisfy the intuitions of frequentist probability. Whether it makes sense or not is a matter of utility. We can, for example, do the following: instead of mapping values into [0, 1], we map them into, say, the interval [-0.5, 1.5]. What is the use of this? Well, in some contexts you are only interested in a weighting of future events. For example, let's say you win $10 with probability -0.5 and win $15 with probability 0.5. Your expectation for the bet is -0.5*10 + 0.5*15 = 2.5.

For example, in finance we have the concept of a risk-neutral probability measure. In that context, probability doesn't really mean probability as you think of it; it just means, as in my example, a weighting of future events to arrive at an expectation.
This discussion branch feels like a dead end. I have not studied economics; all I know about risk-neutral probabilities is the "fact" that the risk is embedded into the number. But I don't see how this topic can help either me or you.

I see no reason to universally limit the probabilities to [0, 1] ("just because Kolmogorov said so" is the kind of arbitrary historical limitation that kept Kolmogorov, and keeps you, from seeing the bigger picture). But when you re-map the probability 0 to the number -0.5, there must be a damn good reason to do so, which I cannot see yet because I didn't study economics and do not know the implications of such a mapping.

Either that, or you are talking about an entirely different semantic concept that has nothing to do with probabilities, and are again confused by the semantic/syntactic meaning of a concept – namely the fact that calling it a probability does not make it work like a probability. Ignore this if it doesn't apply to you.
 

Analyzer

Hide thy life
Local time
Today 12:05 PM
Joined
Aug 23, 2012
Messages
1,241
---
Location
West
Ludwig von Mises wrote a book titled Theory and History: An Interpretation of Social and Economic Evolution which is pertinent to the discussion in this thread.

His main thesis is the idea of a methodological dualism towards ideas (epistemology). He argues that certain knowledge can be known independently of history, like what Teax has discussed (abstract tools), things like math or praxeology, while other forms of knowledge, mainly empirical ones, require constant testing and falsification. Of course the two can blur when they overlap, but they are strong distinctions for understanding reality.
 

Tannhauser

angry insecure male
Local time
Today 9:05 PM
Joined
Jul 18, 2015
Messages
1,462
---
But mere purpose does not imply that every invention is actually an invention, rather than a discovery. If you also don't have a reason to believe that, the rational position for you would be to be impartial about this topic, rather than clinging to the OP. That is my central point for this thread.


Everything beyond that boils down to you asking me what reason I have to believe the exact opposite.

Your position looks somewhat like this: you know for certain that mathematics has been developed by humans (disregard whether "developed" means "invented" or "discovered" here). Then you posit that it was not invented, but discovered as the one and only possible system. It is trivially easy, and it has been done in this thread, to create any other arbitrary system which may or may not be intuitive. What is the Null hypothesis then? That we should have the religious belief that mathematics is the one and only possible metaphysical system we could have used to model the world – or that it was indeed invented by humans and that it could have been invented in a completely different manner?

The examples which show that we could reinvent mathematics completely differently refute your whole position, but you seem determined to interpret it as just a variation of notation. It is not. It is taking the operators and redefining their behavior. We can just as easily define new operators and build any system we want.

Furthermore, you hold the position that humans (and supposedly all possible creatures) have an innate understanding of arithmetic, the number 0 and negative numbers. Unfortunately you have not provided any justification for this whatsoever, so it remains a void point.

In your case, the history of mathematics is seen as a streamlined process, going in one direction and in the only trajectory it could take. This is one problem of disrespecting history: the inability to envision that the world we know today might have easily looked completely different.


Yes. Since that is the implication of my theory, if you disprove the implication, you also disprove the theory. But notice I have changed one word, from "didn't" to "couldn't".
"couldn't" to whom? And what do you mean by "couldn't"?

If the humans in your mash-culture are not smart enough to discover the "+" relation (which is just a basic combinator of the "next" relations which I just mentioned) because of their apple-mashing habit, maybe because they're being put in a state of pre-conditioned berserk everytime they see 2 apples, then their culture is the cause of their stupidity, but does not in any way render them incapable of understanding and discovering the one and only existing semantic concept of addition.
Here you are either misunderstanding or trying to obscure the point. They posit that 1+1 is not equal to 2, not because of their "stupidity" but because they have defined the operation of addition as a different abstraction than what you have been taught in school. Their system is no less valid than your arithmetic, and their system is not just a different notation from the one used by you.

Just like me talking to one such creature does not prevent me from understanding the one and only concept of 1[+]1=1. I differentiate + from [+] merely for syntactic reasons, this has nothing to do with the semantic meaning of + and [+].
Same error again.

This discussion branch feels like a dead end. I have not studied economics; all I know about risk-neutral probabilities is the "fact" that the risk is embedded into the number. But I don't see how this topic can help either me or you.

I see no reason to universally limit the probabilities to [0, 1] ("just because Kolmogorov said so" is the kind of arbitrary historical limitation that kept Kolmogorov, and keeps you, from seeing the bigger picture). But when you re-map the probability 0 to the number -0.5, there must be a damn good reason to do so, which I cannot see yet because I didn't study economics and do not know the implications of such a mapping.

Either that, or you are talking about an entirely different semantic concept that has nothing to do with probabilities, and are again confused by the semantic/syntactic meaning of a concept – namely the fact that calling it a probability does not make it work like a probability. Ignore this if it doesn't apply to you.
But you claimed that any concept of probability must be grounded in an intuitive understanding of reality – in particular that a probability is necessarily a ratio: "What probabilities essentially are, are ratios". Now that you are provided with outside information to the contrary, you take the opposite stance – "I see no reason to universally limit the probabilities to [0, 1]". Again – the problems of reasoning inside an aquarium..
 

QuickTwist

Spiritual "Woo"
Local time
Today 2:05 PM
Joined
Jan 24, 2013
Messages
7,182
---
Location
...
@Teax, you are fighting a losing battle, because as long as you both disagree with one another, Tannhauser is going to be correct, given his default position that INTPs can't relate to each other. Basically, all Tannhauser has to do is come up with a reasonable argument against you and his point is made.
 

Teax

huh?
Local time
Today 9:05 PM
Joined
Oct 17, 2014
Messages
392
---
Location
in orbit of a friendly star <3
PS: this one has a lot of questions, so please slow down before answering them; I'm trying to understand the situation we're in.

Your position looks somewhat like this: you know for certain that mathematics has been developed by humans (disregard whether "developed" means "invented" or "discovered" here). Then you posit that it was not invented, but discovered as the one and only possible system.
Yes. What you seem to misunderstand is that when I say "one and only system", I mean the structure of math, not the notation. What do you think I mean by the term "structure"? (Because this is apparently the main issue here, and I ran out of ideas as to how else to explain "structure", so I'm asking you to give me feedback)

And if you don't know,
then why didn't you ask back when I first used the word?

Hmm..... and what did you think I meant by it back then?

Tannhauser said:
The examples which show that we could reinvent mathematics completely differently refute your whole position, but you seem determined to interpret it as just a variation of notation. It is not. It is taking the operators and redefining their behavior. We can just as easily define new operators and build any system we want.
That's what I'm saying: + and [+] are semantically different. (I can see now how you misinterpreted what I said before, and it's my own fault for saying it so ambiguously.)

The apple-mashers, in your example, are using the symbol + with a different semantic, which implies 1+1=1. Humans have a semantic assigned to + that implies 1+1=2. All I said, is that I choose to assign the symbol [+] to what you call +, because in our human culture the symbol + is already defined to have one specific semantic, and the symbol [+] is still devoid of meaning. (I hope it's not still ambiguous... Makes sense now?)

Now, you claimed for some reason that a creature who assigned the semantic of [+] to the symbol + would disagree on the nature of math; here I quote:
Tannhauser said:
Yes, but if I had lived in a culture where, say, anytime we have N apples we mash them into an apple purée, a sensible axiom for us to start with might be: 1 + 1 + ... + 1 = 1 for any number of 1's. So when you and I met, we would disagree on basically all mathematical results.
let's say the apple-mashers swapped the meaning of + and [+]. Would humans and apple-mashers disagree about the fundamental structure of math? Or would they merely recognize that the only difference between their cultures is that + and [+] are swapped?

I'm asking for your version of the story: what would happen? So please don't comment on mine before you write your own. Here's my story:
These two cultures would not disagree about math at all, as you claimed; rather, they would merely disagree about the assignment of the symbol + to a specific concept. They would even have no problem identifying which concept was swapped with which other. (Can you see how that works? I could elaborate if you want.)

Therefore the concept still exists and is understood/shared innately between these two creatures. Both the humans and the apple-mashers would use the same concept (semantic), but different notation (syntax), when counting people.

The apple-mashers might use a different operator (and thus also a different semantic) for counting apples. But that's fine, it's a cultural thing, and this inconsistency would be explicitly taught as part of their culture. They would never get confused about "counting in general"; they can count everything else just fine with the operator that is semantically the same as the one humans use to count things. Because they understand what the concept of "counting" means.
I'm doing my best here to lay it out as I see it, no obscuring; please point out specifically what you disagree with in the above story. Or, if you agree, I will proceed to explain how this story relates to my claim.

Tannhauser said:
It is trivially easy, and it has been done in this thread, to create any other arbitrary system which may or may not be intuitive.
1[+]1=1 is not a new system or a new math; it's just a new operator defined in an existing math, the one math induced over the fundamental axioms (call it set theory if you want). The [+] is merely one possible relation of numbers among all possible relations induced over set theory, with special properties that [+] and only [+] has, that no other operator, like e.g. +, can possibly have.
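As a concrete illustration of "a new operator inside the same math", here is a small Python sketch; reading the apple-masher operator [+] as "any nonzero quantities mash into a single unit" is an assumption made only for the example:

```python
# Ordinary addition and an "apple-masher" operator side by side in the same
# arithmetic. Interpreting [+] as "anything nonzero mashes into one unit"
# is an assumption for this illustration.
def plus(a: int, b: int) -> int:
    return a + b                          # the usual +: 1 + 1 = 2

def mash(a: int, b: int) -> int:
    return 1 if (a > 0 or b > 0) else 0   # the thread's [+]: 1 [+] 1 = 1

print(plus(1, 1), mash(1, 1))  # 2 1
```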

Tannhauser said:
What is the Null hypothesis then? That we should have the religious belief that mathematics is the one and only possible metaphysical system we could have used to model the world – or that it was indeed invented by humans and that it could have been invented in a completely different manner?
This smells like a false dichotomy. Both options are too ambiguous/ridiculous for me to say for sure what you mean. So let me recap:

You cannot invent null in a different manner. The only point of my null story was to show you that there exists, for all humans, exactly one theoretical construct that has the role of being a null. The number null has the exact same implications/semantics for whoever is restricted to the human condition. There are no possible other idealized realities humans could invent where null would be something other than null. And if those humanoid aliens visit us, they will recognize our null and we will recognize their null in their math, because it will have exactly the same semantics in both.

Tannhauser said:
Furthermore, you hold the position that humans (and supposedly all possible creatures) have an innate understanding of arithmetic, the number 0 and negative numbers. Unfortunately you have not provided any justification for this whatsoever, so it remains a void point.
None whatsoever? This is sad.

I have provided a justification for why the fundamental axioms hold for all humans. I have provided a proof that in a math induced/axiomatized by the fundamental axioms, zero is a semantically uniquely identifiable concept. This is the justification that I provide for you to see that zero is a concept that all humanoid creatures share regardless of culture.

Instead of dismissing all my justifications outright again, please, rather, pinpoint exactly where you think the gap is in my proof, and ask me to clarify. :storks:

Tannhauser said:
"couldn't" to whom? And what do you mean by "couldn't"?
If you find a proof that says "there can be no intuitive understanding", rather than just saying "I don't have an intuitive understanding at this point in time".

Tannhauser said:
Here you are either misunderstanding or trying to obscure the point. They posit that 1+1 is not equal to 2, not because of their "stupidity" but because they have defined the operation of addition as a different abstraction than what you have been taught in school. Their system is no less "valid" than your arithmetic, and their system is not just a different notation from the one used by you.
Why did you stop there? Think things through, please, because I cannot, since I'm trying to grasp what you mean. You stopped right before the most interesting part of justifying why I should regard their system as being equally "valid".

Tannhauser said:
But you claimed that any concept of probability must be grounded in a intuitive understanding of reality
I didn't say it must be, in order for my theory to be valid. I said that if my theory is valid, then it must be.

Whether we realize what the intuitive understanding of a concept is or not, there's gonna be one. That's what I meant when I said that it's an implication of my theory.

Tannhauser said:
in particular that a probability is necessarily a ratio: "What probabilities essentially are, are ratios". Now that you are provided with outside information to the contrary, you take the opposite stance – "I see no reason to universally limit the probabilities to [0, 1] ". Again – the problems of reasoning inside an aquarium..
I see no contradiction. So you cannot think of any ratios outside of [0, 1]?

All concepts in math are defined by their operators. A ratio does not have to be within [0, 1] to behave like a ratio; the restriction to [0, 1] is merely an additional property of a ratio when talking about a certain class of real/hypothetical experiments. But since the operators of ratios do not limit the numbers to [0, 1], other ratios, and probabilities, can exist outside it.

The problem when talking about probabilities as they were historically defined is that the people who first defined probabilities never thought that ratios outside of [0, 1] would make useful probabilities, because they concentrated on a small aquarium of physics, in which such probabilities do not appear. By taking such people at their word, you seem (from my perspective) to have brainwashed yourself into the same aquarium of believing that every other probability must be fundamentally different, and not merely an extrapolation of the existing probability concept. I never claimed that probabilities or ratios must fundamentally always be inside [0, 1], because I see no basis for that.
 

Haim

Worlds creator
Local time
Today 11:05 PM
Joined
May 26, 2015
Messages
817
---
Location
Israel
The axioms 1) and 2) you automatically start with. Independent of what kind of culture you lived in.... call em "fundamental axioms". Humans who live in your mash-culture, would still have a chance to observe 2 apples, and to observe each other, distinguish 2 moments of time or hear 2 "punch noises" when mashing apples, as being something more than one, because of axiom 1).

Given the magnitude 1, which is akin to a "I think therefore I am" axiom. Or just 1 apple. Given the realization that there exist another number, which is the (relation) next number after one. This is akin to the "I just met someone else, who is not me". Or, having 2 apples.

This is all you need to induce all natural numbers, and all the theoretical ground work you need for addition. (is this obvious to you or should I explain?)

If the humans in your mash-culture are not smart enough to discover the "+" relation (which is just a basic combinator of the "next" relations which I just mentioned) because of their apple-mashing habit, maybe because they're being put in a state of pre-conditioned berserk everytime they see 2 apples, then their culture is the cause of their stupidity, but does not in any way render them incapable of understanding and discovering the one and only existing semantic concept of addition.

Just like me talking to one such creature does not prevent me from understanding the one and only concept of 1[+]1=1. I differentiate + from [+] merely for syntactic reasons, this has nothing to do with the semantic meaning of + and [+].
Objects are our invention, our way of labeling the world. There are no objects in reality, it is just in our brain. You wouldn't hear 2 sounds but one stream of sound; you just labeled it as 2 sounds, but an alien creature might not. You don't even have to use an alien creature: Siri has no notion of two sounds, it takes sound and makes it into words. You divide the world into objects because that is the way your brain works. There are no 2 apples, it is all particles, and no, you cannot count particles; an apple is not made of X particles, it is more complex than that. Also, there are no two of the same exact apple.
 

Teax

huh?
Local time
Today 9:05 PM
Joined
Oct 17, 2014
Messages
392
---
Location
in orbit of a friendly star <3
Objects are our invention, our way for labeling the world,
This depends on what you mean by invention. Can you think about anything, real or not, that is not an "object" in the abstract sense? If objects are an involuntary invention, then we didn't really invent them, did we?
Haim said:
there are no objects in reality, it is just in our brain. You wouldn't hear 2 sounds but one stream of sound; you just labeled it as 2 sounds, but an alien creature might not,
Yes, that's what I meant by idealized reality. We live in a reality of ideas. Objects. You have your own idealized reality, I have mine. The human-alien has his.

Haim said:
There are no 2 apples, it is all particles, and no, you cannot count particles; an apple is not made of X particles, it is more complex than that. Also, there are no two of the same exact apple.
Why do you care what an apple "really" is? Remember:

Teax said:
@Haim: Hereby it is irrelevant whether these things we see as "apart" really are apart in some sort of objective/physical reality or not. Because we're talking about idealization, not the actual reality.

I'm talking only about the subjective realities here. For my argument it doesn't matter what an apple really objectively is, only how you perceive(=idealize) it to be.
 

Tannhauser

angry insecure male
Local time
Today 9:05 PM
Joined
Jul 18, 2015
Messages
1,462
---
I'll start with this one:
That's what I'm saying, + and [+] are semantically different. (I can see now how you misinterpreted what I said before, and it's my own fault for saying it so ambiguously)

The apple-mashers, in your example, are using the symbol + with a different semantic, which implies 1+1=1. Humans have a semantic assigned to + that implies 1+1=2. All I said, is that I choose to assign the symbol [+] to what you call +, because in our human culture the symbol + is already defined to have one specific semantic, and the symbol [+] is still devoid of meaning. (I hope it's not still ambiguous... Makes sense now?)

Now, you claimed for some reason that a creature who assigned the semantics of [+] to the symbol + would disagree about the nature of math; here I quote:

let's say the apple-mashers swapped the meaning of + and [+]. Would humans and apple-mashers disagree about the fundamental structure of math? Or would they merely recognize that the only difference between their cultures is that + and [+] are swapped?

I'm asking for your version of the story, what would happen? So don't comment on mine before you write your own please. Here's my story:
These two cultures would not disagree about math at all, as you claimed; rather they would merely disagree about the assignment of the symbol + to a specific concept. They would even have no problem identifying which concept was swapped with which other. (Can you see how that works? I could elaborate if you want)

Therefore the concept still exists and is understood/shared innately between these two cultures. Both the humans and the apple-mashers would use the same concept (semantic), but different notation (syntax), when counting people.

The apple-mashers might use a different operator (and thus also a different semantic) for counting apples. But that's fine, it's a cultural thing, and this inconsistency would be explicitly taught as part of their culture. They would never get confused about "counting in general"; they can count everything else just fine with the operator that is semantically the same one humans use to count things. Because they understand what the concept of "counting" means.
I'm doing my best here to lay it out as I see it, no obscuring; please point out specifically what you disagree with in the above story. Or, if you agree, I will proceed to explain how this story relates to my claim.


1[+]1=1 is not a new system or a new math, it's just a new operator defined in an existing math, the one math induced over the fundamental axioms (call it set theory if you want). The [+] is merely one possible relation of numbers out of all possible relations induced over set theory, with special properties that [+] and only [+] has, and that no other operator, like e.g. +, can possibly have.
The reason that 1+1=2 makes sense to most of us is that we can quite easily interpret non-identities as identities: one tiger + one tiger = two tigers. But there has never been a case where we have two identical tigers; you need to do an approximation to the truth in your mind so as to say that "tigers belong to the same species, so in this case we treat them as identical". I find it quite easy to imagine an evolution of the human mind, or a culture, where one doesn't have a habit of forming abstract identities, where one treats each individual case as individual. In that scenario, 1+1=2 doesn't make sense to anyone, because in that scenario, even 1=1 doesn't hold.

(btw: the tiger example is stolen from Nietzsche)

Which brings us to the next point: you seem to treat as equivalent the empirical application of math, and math itself. One has to realize that mathematics as a formal system is one thing, and its application to reality is another. You can say 1+1=2 as part of a formal system, which is always going to be true as long as you don't break any rules of that system, or you can say 1+1=2 as an empirical claim – for example that 1 apple + 1 apple = 2 apples. That is a completely different category, because it depends on the parameters of the experiment. For example, 2 apples will not remain 2 apples for more than a couple of weeks, because they will eventually disintegrate into other material.

Hence I am curious about what you actually refer to when you say "math". If it is just an arbitrary formal system, then I hope we can agree that one can create arbitrary formal systems, which in many cases will not map onto each other just by changing the notation. If you instead mean it is not an arbitrary formal system, but the particular formal system that can be applied to model physical phenomena, there are two things to say: 1) The reason current "math" works well in the physical world is that we can both choose which phenomena to model and which formalism to use to define the model. 2) We can create other formal systems and then pick phenomena to model. There again, the statements of the formal system might become empirical claims which then must be verified by experiment. This formal system, however, may or may not map onto our current math just by changing notation. It might enable abstractions that are completely unimaginable in current math, and vice versa.
 

Teax

huh?
Local time
Today 9:05 PM
Joined
Oct 17, 2014
Messages
392
---
Location
in orbit of a friendly star <3
The reason that 1+1=2 makes sense to most of us is that we can quite easily interpret non-identities as identities: one tiger + one tiger = two tigers. But there has never been a case where we have two identical tigers; you need to do an approximation to the truth in your mind so as to say that "tigers belong to the same species, so in this case we treat them as identical".
Doesn't matter if it's an approximation (to my argument).

How about counting prizes? You win three things:
  • a new car
  • ownership of the letter H
  • and a feeling of self satisfaction :kodama1:

My point is: We all are able to understand the above prizes. You just won 3 things, and it makes perfect intuitive sense. Because our thinking process is not an approximation of the physical reality, it's an idealization.

[Image: game show host Bob Barker amongst a sea of prizes]

I find it quite easy to imagine an evolution of the human mind, or a culture, where one doesn't have a habit of forming abstract identities, where one treats each individual case as individual. In that scenario, 1+1=2 doesn't make sense to anyone, because in that scenario, even 1=1 doesn't hold.
Really, easy to imagine? How can you imagine a culture/mind in which the idea behind 1=1 does not hold? How can "I am not me" ever be true in an idealized reality? That stool is not that stool. How can you imagine a creature who can hear a clap but thinks that that clap is not the clap it heard?

Again, you haven't elaborated on the implications of your hypothetical scenario for the life and culture, a story, so my next natural question is: have you really thought this through? Or did you mean by "I can imagine" that you can easily imagine a premise, but you leave it to me to pick it apart? (If so then it's OK, I just wanna know what's going on)

For me it's quite clear: a mind that cannot form abstract identities is a mind that cannot form relations (by definition of what an abstract identity is). At this point I wouldn't even call it a "mind", since it cannot remember things, reason, learn, or solve anything.

Which brings us to the next point: you seem to treat as equivalent the empirical application of math, and math itself. One has to realize that mathematics as a formal system is one thing, and its application to reality is another. You can say 1+1=2 as part of a formal system, which is always going to be true as long as you don't break any rules of that system, or you can say 1+1=2 as an empirical claim – for example that 1 apple + 1 apple = 2 apples. That is a completely different category, because it depends on the parameters of the experiment. For example, 2 apples will not remain 2 apples for more than a couple of weeks, because they will eventually disintegrate into other material.
The description you just made about the mistake is a LOT easier to understand than how I tried to explain it :facepalm: . Thanks.

Yes, of course, math is not the same as the applicability of math. And it's funny how, from my perspective, you are the one who is making the specific mistake you just described, and I've been trying to confront you with it. For example: you said that the apple-mashers will define the concept of addition as 1[+]1=1, because any time they have more than 1 apple, they mash it together into 1 apple-purée. This claim mixes up mathematics and empirical observation.

It sounds to me like you see a way that the concept of addition (or the concept of zero) can vary from culture to culture and still play the role of "addition", is that true? If yes, can you give an example? Or demonstrate what you mean with that hypothetical apple-masher culture's 1[+]1=1: how does [+] play the same role of "addition" throughout their culture and their thoughts as + plays in ours? How would we as humans see them, and how would they see us?

I explicitly conceded that most (dunno about all) empirical knowledge depends on culture, and I agree that your apple-mashers' culture would have a physical paper about an "operator for calculating mashed apples". So even saying that "an apple will not always remain one apple, it will rot and disappear" does not imply 1=0 in their math; it implies a piece of physics knowledge about the world, which we can express in mathematical terms, but which will not change the structure of that culture's math. And 1=0 is not the equation that will be used to describe it in that paper.
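To make that concrete, here is a minimal sketch, assuming the apple-mashers' [+] is simply "combine non-empty piles into one purée". It is just another function defined inside our ordinary math, side by side with +; the name mash is my own label for [+]:

def mash(a: int, b: int) -> int:
    """The apple-mashers' [+]: combining two non-empty piles yields one puree."""
    if a > 0 and b > 0:
        return 1
    return a + b  # mashing with an empty pile changes nothing

# Ordinary addition and [+] coexist in the same framework without conflict.
assert 1 + 1 == 2        # counting people, claps, prizes
assert mash(1, 1) == 1   # "adding" apples the apple-masher way
assert mash(3, 2) == 1   # any two piles still mash into one

Nothing in the definition of mash forces 1=0 or 1+1=1 anywhere else; it is one more relation living in the same framework.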

Off topic:
I can bring an argument about how an alien culture probably will discover, or has discovered, the theory of relativity exactly like we did, and suggest that the very fundamental physics is discovered as well, rather than invented. But let's not discuss this right now. My posts are already too long.

Hence I am curious about what you actually refer to when you say "math". If it is just an arbitrary formal system, then I hope we can agree that one can create arbitrary formal systems, which in many cases will not map onto each other just by changing the notation.
No, this is a good starting point but I see too much potential for misunderstanding. Please, example? Take two formal systems, maybe from our previous examples, that you believe are contradictory (Apple masher-example?). And I'll take it from there. That should resolve any term-definition issues between us.
 

Tannhauser

angry insecure male
Local time
Today 9:05 PM
Joined
Jul 18, 2015
Messages
1,462
---
Really, easy to imagine? How can you imagine a culture/mind in which the idea behind 1=1 does not hold? How can "I am not me" ever be true in an idealized reality? That stool is not that stool. How can you imagine a creature who can hear a clap but thinks that that clap is not the clap it heard?
I suspected you would try to misunderstand that point. Saying that 1=1 is not generally true is not the same as saying 1=1 is generally false. It means, for example, that one person is not equal to one person, because the persons might be different persons. In the current abstraction we call math, we can say, for example, 6-5 = 1 <=> 1=1, because we treat all integer values as identical objects. That is a learned abstraction, because no person has ever seen two fully identical objects. Moreover, no person has ever seen a scenario in which 5 people leave a room, with 1 person remaining, and that one person morphing arbitrarily into any person in the world.

You again, haven't elaborated on implications of your hypothetical scenario about the life and culture, a story, so my next natural question is: have you really thought this through? Or did you mean by "I can imagine" that you can easily imagine a premise, but you leave it to me to pick it apart? (If so then it's ok, I just wanna know what's going on)
No, by "can imagine" I meant that if I see a coin land heads, I can imagine it could have easily landed tails. Your position is saying "the coin landed heads, therefore it is impossible for it to land tails".

No, this is a good starting point but I see too much potential for misunderstanding. Please, example? Take two formal systems, maybe from our previous examples, that you believe are contradictory (Apple masher-example?). And I'll take it from there. That should resolve any term-definition issues between us.
Just take your pick: https://en.wikipedia.org/wiki/List_of_formal_systems

Maybe try to show that Peano arithmetic and Formal ethics are actually the same system? Or show that all the formal systems are actually the same system, just with different notation? I think there would be a Nobel prize for that.
 

Teax

huh?
Local time
Today 9:05 PM
Joined
Oct 17, 2014
Messages
392
---
Location
in orbit of a friendly star <3
we treat all integer values as identical objects. That is a learned abstraction, because no person has ever seen two fully identical objects.
Yes you have: every object/idea/abstract concept is fully identical to itself in every respect you can imagine. That's why it's one of my fundamental axioms (axiom 1). Every creature, if it is able to distinguish things, can identify every idea as that particular idea, by definition.

A tiger is always identical to itself. <-- can you give me a counterexample for something that is not a tiger?

The fact that we think about the same abstract concept of "two" in our heads every time we see a different "2" on paper is learned behavior, true. That's what I call syntax. You however ignored the fact that a semantic exists behind every syntax. There is still only one abstract concept "two", which is identical to itself, and which is the semantic meaning of the symbol "2" on paper.

Your previous faith in history seems to be supported by a profound misunderstanding of the math defined in your own historical context. This misunderstanding has influenced how you think about math:
Saying that 1=1 is not generally true is not the same as saying 1=1 is generally false. It means, for example, that one person is not equal to one person, because the persons might be different persons. In the current abstraction we call math, we can say, for example, 6-5 = 1 <=> 1=1, because we treat all integer values as identical objects.
You claim math to be a formal system, yet you abuse a syntax to mean whatever empirical thing you want disregarding what a formal system is. No wonder I couldn't grok what you were saying, you're all over the place.

So despite your claims to the contrary, in a formal system, 1=1 cannot ever possibly mean "one person is equal to another person". Identity of objects is the basis for any formal system, because without identity, a formal system has no meaning; it would not reflect our thinking process:
<Excerpt from Wikipedia article "formal system">
Formal systems in mathematics consist of the following elements:

  1. A finite set of symbols (i.e. the alphabet), that can be used for constructing formulas (i.e. finite strings of symbols).

Thus 1=1 can only mean:

  • <the possibly abstract thing labeled "1"> is the same object/thing as <the possibly abstract thing labeled "1">.
  • Which is the exact same thing as saying: <the possibly abstract thing labeled "1"> is equivalent to itself.
  • In a foreign math, this could mean <some specific person named "1"> is equivalent to himself.
  • And in our math, this specifically means <the magnitude 1> is equivalent to itself.

Moreover, no person has ever seen a scenario in which 5 people leave a room, with 1 person remaining, and that one person morphing arbitrarily into any person in the world.
If you want to talk about how reality corresponds to some formal system that you construct, that would be empiricism/physics, which you yourself said should never be confused with what a formal system is. And yet you insist repeatedly that you can imagine basing a formal system on an empirical observation.

Even if you saw a person that morphed into another person, the identity of the object still remains. The identity of the people still remains. All you did by watching such a (mesmerizing) show is create relations between existing identities. <-- this is a tautological statement, it's not a claim


No, by "can imagine" I meant that if I see a coin land heads, I can imagine it could have easily landed tails.

Your position is saying "the coin landed heads, therefore it is impossible for it to land tails".
No, my position is: "The implication of a coin landing tails would make that coin not conscious, therefore, all conscious coins land heads."

It's a claim of deduction, not a claim of belief. Why do you keep ignoring this fact, despite me explicitly re-stating this every time, <pick your favorite condescending remark>.

Just take your pick: https://en.wikipedia.org/wiki/List_of_formal_systems

Maybe try to show that Peano arithmetic and Formal ethics are actually the same system? Or show that all the formal systems are actually the same system, just with different notation? I think there would be a Nobel prize for that.
Seriously? Why couldn't you pick the most minimalistic example you can construct? You are the one who claims that it's easy to imagine two formal systems. This response seems explicitly constructed so as not to progress the discussion.
 

The Grey Man

το φως εν τη σκοτια φαινει
Local time
Today 3:05 PM
Joined
Oct 6, 2014
Messages
931
---
Location
Canada
I suspected you would try to misunderstand that point. Saying that 1=1 is not generally true is not the same as saying 1=1 is generally false. It means, for example, that one person is not equal to one person, because the persons might be different persons. In the current abstraction we call math, we can say, for example, 6-5 = 1 <=> 1=1, because we treat all integer values as identical objects. That is a learned abstraction, because no person has ever seen two fully identical objects. Moreover, no person has ever seen a scenario in which 5 people leave a room, with 1 person remaining, and that one person morphing arbitrarily into any person in the world.

I'm not so sure we treat all integer values as identical objects, but we do treat what the symbol "1" represents as identical to itself, because it is (in fact, anything you can think of is itself, right?). One person is not equal to another (otherwise they would be the same person), but your referring to them as two separate persons indicates that you have abstracted to the point where both persons are reduced to an identical quantity of 1. This is possible because a quantity and the thing it is attributed to are not one and the same. Like any useful abstraction, a given quantity can result from abstracting from an infinite number of things that are different from each other.
 

Tannhauser

angry insecure male
Local time
Today 9:05 PM
Joined
Jul 18, 2015
Messages
1,462
---
A tiger is always identical to itself. <-- can you give me a counterexample for something that is not a tiger?
That is a philosophical question which by no means must be answered in order to establish whether a formal system must or must not contain that property. But as a thought experiment, consider the fact that the only reason you could look at a tiger and say that it is identical to itself, is that the atoms which comprise the tiger remain relatively stable so as to still comprise a thing closely resembling the tiger in the next moment. But the actual configuration of atoms changes continuously. For a creature that has a lifespan of, say, 10 million years, there might be no tigers at all – only a continuum of matter flowing from place to place.

The fact that we think about the same abstract concept of "two" in our heads every time we see a different "2" on paper is learned behavior, true. That's what I call syntax. You however ignored the fact that a semantic exists behind every syntax. There is still only one abstract concept "two", which is identical to itself, and which is the semantic meaning of the symbol "2" on paper.

Your previous faith in history seems to be supported by a profound misunderstanding of the math defined in your own historical context. This misunderstanding has influenced how you think about math:

You claim math to be a formal system, yet you abuse a syntax to mean whatever empirical thing you want disregarding what a formal system is. No wonder I couldn't grok what you were saying, you're all over the place.

So despite your claims to the contrary, in a formal system, 1=1 can not ever possibly mean "one person is equal to another person". Identity of objects is the basis for any formal system, because without identity, a formal system has no meaning, it would not reflect our thinking process:
Let's just note that you are flat out wrong here. The property x=x as a relation between an object and itself (not talking about arithmetic now) is an example of a reflexive relation. It may or may not be a part of a formal system, but it is in no way a necessary condition for that formal system to have a "meaning".

But sure, let us agree that it is very useful to establish the reflexive property x=x. Again, that is not a universal property of a formal system, as you claim, but it is a useful assumption.

Next problem is that it doesn't mean that it will always hold, as an arithmetical operation, that 1=1. I can define a system, for example, where I say that if I divide a number N by itself, it is a different object than the unit I get from doing the operation N+1-N. So now, in Peano arithmetic, we have

N/N = N+1-N <=> 1=1

but in my arithmetic we have N/N=1 and N+1-N=1, but

N/N != N+1-N <=> 1!=1

This is what I mean by the abstraction of the concept of identity – we all need to agree on what identities we have before doing operations within the formal system. And it is not an objective, universal rule that defines the rules of identity.
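A minimal sketch of how such a system could be set up, assuming the two units are simply tagged by the operation that produced them; the class name TaggedUnit and the function names are my own illustration, not standard notation:

from dataclasses import dataclass

@dataclass(frozen=True)
class TaggedUnit:
    """A 'one' that remembers which operation produced it."""
    origin: str

def div_self(n: int) -> TaggedUnit:
    # N / N yields a unit tagged as coming from division
    return TaggedUnit("division")

def succ_minus(n: int) -> TaggedUnit:
    # (N + 1) - N yields a unit tagged as coming from subtraction
    return TaggedUnit("subtraction")

# In Peano arithmetic these two results are the same object; here, by construction, they are not.
assert div_self(7) != succ_minus(7)

Whether such a tagged system still deserves to be called an arithmetic is, of course, exactly what is being disputed here.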
No, my position is: "The implication of a coin landing tails would make that coin not conscious, therefore, all conscious coins land heads."
Well, then it all depends on your definition of math.

Seriously? Why couldn't you pick the most minimalistic example you can construct? You are the one who claims that it's easy to imagine two formal systems. This response seems to be explicitly constructed as to not progress the discussion.
Well, it basically shows that there are many distinct formal systems designed to serve various purposes. But OK, here is an example: take all the rules of our usual arithmetic, but without the commutativity property x+y = y+x.
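One concrete model of an operation like that, offered only as a hedged illustration rather than a full axiomatization: concatenation of sequences behaves like an addition in that it is associative and has a neutral element, but it is not commutative. The choice of tuples below is just for demonstration:

# Concatenation as a non-commutative "addition" on sequences.
x = (1,)   # read these tuples as formal terms, not as counts
y = (2,)

assert x + y == (1, 2)
assert y + x == (2, 1)
assert x + y != y + x                # commutativity fails
assert (x + y) + x == x + (y + x)    # associativity still holds
assert x + () == x                   # the empty tuple is a neutral element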
 

Teax

huh?
Local time
Today 9:05 PM
Joined
Oct 17, 2014
Messages
392
---
Location
in orbit of a friendly star <3
Actually The Grey Man's interpretation of what you said makes a lot of sense, but apparently it wasn't what you meant.

That is a philosophical question which by no means must be answered in order to establish whether a formal system must or must not contain that property. But as a thought experiment, consider the fact that the only reason you could look at a tiger and say that it is identical to itself, is that the atoms which comprise the tiger remain relatively stable so as to still comprise a thing closely resembling the tiger in the next moment. But the actual configuration of atoms changes continuously. For a creature that has a lifespan of, say, 10 million years, there might be no tigers at all – only a continuum of matter flowing from place to place.
I don't see what has changed. "The tiger" continues to be just one abstract concept in your mind for that creature.

The creature you identify as "the tiger" spans space and time.... Just as a thought experiment, I could unfold time and say: at every moment in time in which "the tiger" is well defined/observable, it is exactly equivalent to itself at that moment. Whether what we call "the tiger" is also a continuum of matter is irrelevant here, because that is a relation between "the tiger" and "matter", which is not what I was talking about.

I'm talking about identities/things/ideas, not relations.

Let's just note that you are flat out wrong here. The property x=x as a relation between an object and itself (not talking about arithmetic now) is an example of a reflexive relation. It may or may not be a part of a formal system, but it is in no way a necessary condition for that formal system to have a "meaning". But sure, let us agree that it is very useful to establish the reflexive property x=x.

Next problem is that it doesn't mean that it will always hold, as an arithmetical operation, that 1=1. I can define a system, for example, where I say that if I divide a number N by itself, it is a different object than the unit I get from doing the operation N+1-N. So now, in Peano arithmetic, we have

N/N = N+1-N <=> 1=1

but in my arithmetic we have N/N=1 and N+1-N=1, but

N/N != N+1-N <=> 1!=1

This is what I mean by the abstraction of the concept of identity – we all need to agree on what identities we have before doing operations within the formal system. And it is not an objective, universal rule that defines the rules of identity.
I'm not talking about just any reflexive relation, I'm talking about the identity relation. (But you can see now why some argue that my fundamental axioms 1) and 2) are the same, can't you? ;) You just did the same.)

The identity relation (the "=") is implied in every formal system, even if it's not explicitly stated on Wikipedia (because it's implied, get it?). Every formal system defines a set of symbols as an "alphabet". Every symbol in that alphabet is uniquely recognizable, and therefore every occurrence of it in a formula is identifiable as that same symbol. Of course other relations exist besides identity (the "="), but identity (the "=") is a special relation that we do not have to "agree" on; we know what that identity is innately, because it is the foundation of our thoughts, and that's why it is also the foundation of how we define formal systems.

If you don't think so, give me an example of a formal system definition that does not imply/rely on the existence of an identity relation. Then you've won.

You called the relation "=" merely useful, but I say it's more than that: it is meaningful in every context, because of the above. In fact, I do not know of any work that redefines the symbol "=" as blatantly as you do. E.g. congruence uses three lines instead of two, so as not to conflict with "=".

Also, your example is about structural equivalence: how does your new_one behave? Does it behave like the original one in every respect? Since it does not, it's not a 1, but merely something that you syntactically defined to look like a 1 but with a different semantic meaning. This is invalid practice; you must assign a new symbol to this new_one concept, let it be [1]. (Otherwise you are purposefully making your system self-contradictory.)

Well, it basically shows that there are many, distinct formal systems designed to serve various purposes.
You only thought it relevant because you misunderstood my position. I am already aware, and concede, that there exist different historically defined systems; if that were really what I meant by "different maths", the discussion would have been over a long time ago.

The symbol + is assigned a different set-theoretical meaning based on convenience in different historical inventions. True. Doesn't affect my argument.

My argument is: all of what you called math in your previous posts is induced over the same axioms, set theory, and therefore exists as part of the same framework of what I usually call "math".

But OK, here is an example: Take all the rules of our usual arithmetic, but without the commutativity property x+y = y+x.
OK, let me see if I remember my lectures:
In such a case, it holds that

The semantic meaning of "+" in arithmetic is plus.
The semantic meaning of "+" in new_arithmetic is new_plus

Where plus and new_plus are not equivalent. Therefore with no loss of generality, we can rename "+" into each of those new names (syntax doesn't matter).

Since both plus and new_arithmetic are axiomatized by set theory, and plus is missing from new_arithmetic, plus can be added to new_arithmetic, with some name, without loss of generality (syntax doesn't matter).

Same goes for new_plus and arithmetic.

As a result we have two equivalent arithmetics. This, together with the fact that we preserved the axiomatized domain, implies that the two initially proposed arithmetics do not contradict each other.

For example this means that these two arithmetics can be used interchangeably within one framework of reasoning, or, simultaneously as one formal system.

As a side note: commutativity is an induced property over the numbers, not just a switch you can flick on or off. Therefore the lack of commutativity will have some other consequences beyond just not having this property, which are, you know... probably irrelevant to this discussion. ;)
From this proof it follows that both of your arithmetics are actually subsets of one bigger math. They don't conflict. That "bigger math" is what I call math.
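A minimal sketch of the renaming argument, with new_plus modelled as some made-up non-commutative operator over the same carrier (the integers); the point is only that, once the names are kept apart, both operators can be used side by side in one framework without any statement about one contradicting a statement about the other:

def plus(x: int, y: int) -> int:
    """Ordinary commutative addition."""
    return x + y

def new_plus(x: int, y: int) -> int:
    """A made-up non-commutative operator on the same carrier."""
    return 2 * x + y

# Both relations coexist in one framework and can even be mixed freely.
assert plus(2, 3) == plus(3, 2)          # commutativity holds for plus
assert new_plus(2, 3) != new_plus(3, 2)  # and fails for new_plus
assert plus(new_plus(1, 1), 1) == 4      # 2*1 + 1 = 3, then 3 + 1 = 4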
 

Matt3737

INFJ
Local time
Today 2:05 PM
Joined
Oct 7, 2012
Messages
155
---
Location
Arkansas
@Teax

You're masquerading a metaphysical proposition on the problem of universals as an argument. There's nothing wrong with subscribing to any particular philosophical school of thought, but it doesn't really make for a compelling argument.

You acknowledge historical context and temporal distinctions, but dismiss them out of hand as irrelevant because "ideals." You might as well substitute God in there instead.

In metaphysics, the problem of universals refers to the question of whether properties exist, and if so, what they are. Properties are qualities or relations that two or more entities have in common. The various kinds of properties, such as qualities and relations are referred to as universals. For instance, one can imagine three cup holders on a table that have in common the quality of being circular or exemplifying circularity, or two daughters that have in common being the daughter of Frank. There are many such properties, such as being human, red, male or female, liquid, big or small, taller than, father of, etc.

While philosophers agree that human beings talk and think about properties, they disagree on whether these universals exist in reality or merely in thought and speech.

I'm sort of confused if you yourself are even using your philosophy consistently. At times it seems like Platonic realism and at other times it seems like idealism. It seems like a strange mixture of the two.

There are many philosophical positions regarding universals. Taking "beauty" as example, three positions are:

  • Idealism: beauty is a property constructed in the mind, so exists only in descriptions of things.
  • Platonic realism: beauty is a property that exists in an ideal form independently of any mind or description.
  • Aristotelean realism: beauty is a property that only exists when beautiful things exist.
Taking a broader view, the main positions are generally considered to be classifiable as: realism, nominalism, and idealism (sometimes simply called "anti-realism" with regard to universals).
 

Matt3737

INFJ
Local time
Today 2:05 PM
Joined
Oct 7, 2012
Messages
155
---
Location
Arkansas
Here's some fun distinctions:

In mathematics, the repeating decimal 0.999… (sometimes written with more or fewer 9s before the final ellipsis, for example as 0.9…, or in a variety of other variants such as 0.(9) or 0.9 with an overbar) denotes a real number that can be shown to be the number one. In other words, the symbols "0.999…" and "1" represent the same number. Proofs of this equality have been formulated with varying degrees of mathematical rigor, taking into account preferred development of the real numbers, background assumptions, historical context, and target audience.

Every nonzero, terminating decimal (with infinitely many trailing 0s) has an equal twin representation with infinitely many trailing 9s (for example, 8.32 and 8.31999…). The terminating decimal representation is usually preferred, contributing to the misconception that it is the only representation. The same phenomenon occurs in all other bases (with a given base's largest digit) or in any similar representation of the real numbers.

The equality of 0.999… and 1 is closely related to the absence of nonzero infinitesimals in the real number system, the most commonly used system in mathematical analysis. Some alternative number systems, such as the hyperreals, do contain nonzero infinitesimals. In most such number systems, the standard interpretation of the expression 0.999… makes it equal to 1, but in some of these number systems, the symbol "0.999…" admits other interpretations that contain infinitely many 9s while falling infinitesimally short of 1.

The equality 0.999… = 1 has long been accepted by mathematicians and is part of general mathematical education. Nonetheless, some students find it sufficiently counterintuitive that they question or reject it. Such skepticism is common enough that the difficulty of convincing them of the validity of this identity has been the subject of several studies in mathematics education.
Another fun identity paradox is Theseus' ship:

The ship of Theseus, also known as Theseus' paradox, is a thought experiment that raises the question of whether an object that has had all of its components replaced remains fundamentally the same object. The paradox is most notably recorded by Plutarch in Life of Theseus from the late first century. Plutarch asked whether a ship that had been restored by replacing every single wooden part remained the same ship.

The paradox had been discussed by more ancient philosophers such as Heraclitus, Socrates, and Plato prior to Plutarch's writings; and more recently by Thomas Hobbes and John Locke. Several variants are known, including the grandfather's axe, which has had both head and handle replaced.
 

Teax

huh?
Local time
Today 9:05 PM
Joined
Oct 17, 2014
Messages
392
---
Location
in orbit of a friendly star <3
@Teax
You're masquerading a metaphysical proposition on the problem of universals as an argument. There's nothing wrong with subscribing to any particular philosophical school of thought, but it doesn't really make for a compelling argument.

You acknowledge historical context and temporal distinctions, but dismiss them out of hand as irrelevant because "ideals." You might as well substitute God in there instead.
I have no idea where I dismissed anything based on ideals. I didn't even know I had ideals. :elephant:

My argument against the OP ended a loooooong time ago: when I said that Tannhauser does not have any reason to believe the OP claim that "math is based on history". Because there is no reason to believe that a civilization can be built on a math that is fundamentally different from the one we have now.

So I suggested that the only rational choice is to remain agnostic about this.

Since then Tannhauser was merely asking me about what reason I have to believe the opposite. If my reasons sound to you guys more like philosophical reasons, I guess there's nothing I can do about that.

Anyway, since nothing I have said in this thread makes a statement about objective reality, it shouldn't sound too philosophical.

I'm sort of confused if you yourself are even using your philosophy consistently. At times it seems like Platonic realism and at other times it seems like idealism.
Do I understand you correctly, your view is that everyone must pick an existing philosophy?
 

Teax

huh?
Local time
Today 9:05 PM
Joined
Oct 17, 2014
Messages
392
---
Location
in orbit of a friendly star <3
yeah I remember these :)

0.9999... = 1
Here, we have two different syntaxes, but their semantics is the same.

Which means you can use them interchangeably, without ever stumbling on any contradictions. Because (at least this is how semantics is defined in CS) all relations for both numbers are equivalent.

This is really only hard to understand if you think that all formulas are just strings of symbols, which admittedly is exactly how they teach it in school.
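As a small check in exact arithmetic, assuming we read "0.999…" as the geometric series 9/10 + 9/100 + …: every partial sum falls short of 1 by exactly one power of ten, and the closed form of the full series is exactly 1, so the two strings really do name one and the same semantic object:

from fractions import Fraction

# Partial sums 0.9, 0.99, 0.999, ... computed exactly.
partial = Fraction(0)
for k in range(1, 11):
    partial += Fraction(9, 10 ** k)

# Each partial sum misses 1 by exactly 1/10^k (here k = 10) ...
assert Fraction(1) - partial == Fraction(1, 10 ** 10)

# ... and the geometric series formula 9/10 * 1/(1 - 1/10) gives exactly 1.
assert Fraction(9, 10) / (1 - Fraction(1, 10)) == 1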

Another fun identity paradox is Theseus' ship:
This paradox is a nice demonstration of why I don't think my argument should sound too philosophical to anyone: because I do not propose a solution to the paradox.

All I'm saying is that we as humans can think about every aspect of that paradox individually: "the grandfather's axe", "the head", "the handle". That's what I called an "idealized reality", just these things/objects/ideas, they're like individual points on a white piece of paper. You can assign symbols to them if you like. But these points already have a label: "the grandfather's axe", "the head", "the handle".

And you'll never get confused about which of these three points you're currently thinking about.

You don't have to subscribe to any specific philosophy to understand what I just said, no? You don't have to know anything about physics/atoms...etc...

Anyone get what I mean? If not, I'll just give up, cause what's the point.
 

Matt3737

INFJ
Local time
Today 2:05 PM
Joined
Oct 7, 2012
Messages
155
---
Location
Arkansas
I have no idea where I dismissed anything based on ideals. I didn't even know I had ideals.

Idealization is the process by which scientific models assume facts about the phenomenon being modeled that are strictly false but make models easier to understand or solve. That is, it is determined whether the phenomenon approximates an "ideal case," then the model is applied to make a prediction based on that ideal case.

If an approximation is accurate, the model will have high predictive accuracy; for example, it is not usually necessary to account for air resistance when determining the acceleration of a falling bowling ball, and doing so would be more complicated. In this case, air resistance is idealized to be zero. Although this is not strictly true, it is a good approximation because its effect is negligible compared to that of gravity.

Idealizations may allow predictions to be made when none otherwise could be. For example, the approximation of air resistance as zero was the only option before the formulation of Stokes' law allowed the calculation of drag forces. Many debates surrounding the usefulness of a particular model are about the appropriateness of different idealizations.
It's even more basic than that, it's an "idealization". Even if you're not convinced about the existence of an objective/physical reality, math still exists between you and what you perceive as a subjective reality, as the idealization of what you perceive.

It looks like you are ignoring the difference between the semantic and syntactic definitions of an idealized concept.

How have you not been discussing ideals?

My argument against the OP ended a loooooong time ago: when I said that Tannhauser does not have any reason to believe the OP claim that "math is based on history". Because there is no reason to believe that a civilization can be built on a math that is fundamentally different from the one we have now.

So I suggested that the only rational choice is to remain agnostic about this.

Since then Tannhauser was merely asking me about what reason I have to believe the opposite. If my reasons sound to you guys more like philosophical reasons, I guess there's nothing I can do about that.

Anyway, since nothing I have said in this thread makes a statement about objective reality, it shouldn't sound too philosophical.

Yes, that's what I meant by idealized reality. We live in a reality of ideas. Objects. You have your own idealized reality, I have mine. The human-alien has his.

I'm talking only about the subjective realities here. For my argument it doesn't matter what an apple really objectively is, only how you perceive(=idealize) it to be.

You just seem to throw around words like "reality," "objective," "subjective," seemingly without any concern as to what they mean.

Do I understand you correctly, your view is that everyone must pick an existing philosophy?

No. I mean to say that I am having difficulty understanding what your stance is and if you are being consistent or not in your responses. This seems to say you are not being consistent:

I have provided a justification for why the fundamental axioms hold for all humans.

(All Gödel did was prove that there is no set of axioms that can simultaneously "be used to prove/disprove every possible equation" and "provide exactly one solution for every possible equation". It's an interesting result, but nothing that pertains to our discussion. But many people misunderstand the implications of this, so the internet is full of crap about this topic, as you can imagine)

You acknowledge there is no single set of axioms that makes Zermelo-Fraenkel set theory complete, yet you seem intent on believing that there is one singular set of axioms applicable to all people under the broad category of 'mathematics' or possibly that mathematics is the solely accurate representation of a singular, objective reality and so all people are beholden to it?

Mathematics is not a singular entity. There are many alternative theories within mathematics that are not even necessarily compatible with one another.

"No man ever steps into the same river twice." - Heraclitus

Yes, the part where you quoted me implied what you just said. So all your post amounts to is "how dare you believe that". If you think that not being able to make a set theory complete is a contradiction of my proposition, please show your reasoning, so I can at least know your understanding of the matter and how to respond properly.

This is not what I mean, but rather, even if you're not convinced about the existence of an objective/physical reality, math exists between you and what you perceive as a subjective reality, as the idealization of what you perceive.

That doesn't prevent them from all being based on one idealized reality.
 

Teax

huh?
Local time
Today 9:05 PM
Joined
Oct 17, 2014
Messages
392
---
Location
in orbit of a friendly star <3
Idealization is the process by which scientific models assume facts about the phenomenon being modeled that are strictly false but make models easier to understand or solve.
Actually, I do not make it a habit to look up, for every word, whether someone somewhere defined the same word in a different context. I mean, have you seen some of the <disambiguation> pages on Wikipedia? It's crazy. I just try to make it as obvious as possible. I was not talking about that "scientific models" idealization process, that's for sure. I meant it literally as a combination of two words:

Idea + suffix that makes it sound like you convert something into an idea.

Don't look up on Wikipedia what "idea" means; what I mean is: one specific thing that you can think about. Like a point. Or a letter. Or "joy", or the color red. Each of these is just one "idea".

The word "red" and the color red are 2 different ideas because you can think about them individually.

Tannhauser and I also used the term "abstract concept".

How have you not been discussing ideals?
Wait, does that mean that I should have said Ideaization? (without the "L") Is that even a word? I can't find it....

Idea + ization.

You just seem to throw around words like "reality," "objective," "subjective," seemingly without any concern as to what they mean.
Mean to whom?

"subjective reality" is that, what you yourself perceive to be your reality.

"objective reality" is a reality that exists independently of the perceiver, for all humans, shared.

I wasn't trying to make life difficult for you guys.
:chocor:

No. I mean to say that I am having difficulty understanding what your stance is and if you are being consistent or not in your responses. This seems to say you are not being consistent:
Please realize that, from my perspective, since I wrote those quotes, I don't see the inconsistency. Can you phrase it as a question, so that I can provide you with insight by attempting to answer it?
 

Tannhauser

angry insecure male
Local time
Today 9:05 PM
Joined
Jul 18, 2015
Messages
1,462
---
My argument is: all of what you called math in your previous posts is induced over the same axioms, set theory, and therefore exists as part of the same framework of what I usually call "math".

So basically, you're saying that by "math" you mean set theory? And if yes, which one of the set theories do you mean?
 

Teax

huh?
Local time
Today 9:05 PM
Joined
Oct 17, 2014
Messages
392
---
Location
in orbit of a friendly star <3
So basically, you're saying that by "math" you mean set theory? And if yes, which one of the set theories do you mean?
Yeah, sort of. I hope we can agree that set theory doesn't have to be defined/worded in exactly the same way to have the expressive power it has. I pretty much use the nitty-gritty details only when we really need them; otherwise I stick to the definition of set theory as "a system of points and relations between points".

And since you like quotes, here's one from Wiki:
<Excerpt from Wikipedia article 'set theory'>
Set theory is commonly employed as a foundational system for mathematics, particularly in the form of Zermelo–Fraenkel set theory with the axiom of choice.

I'm saying set theory is "a foundation", not "the foundation", because, technically speaking, some branches of mathematics (which themselves are based on set theory) could be as expressively powerful as set theory itself. Then all of mathematics would be expressible from there as well, including set theory. It's just a matter of perspective, you know? The argument about "how our mind works" is the reason I see to pick set theory over any other foundation.
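As a toy illustration of what "induced over set theory" can mean, here is the standard von Neumann encoding, where each natural number is literally the set of the previous ones; the construction is textbook, only the Python wrapping is mine:

# Von Neumann naturals: 0 = {}, n+1 = n ∪ {n}.  Built from nothing but sets.
zero = frozenset()

def successor(n: frozenset) -> frozenset:
    """n + 1 is the set containing every element of n, plus n itself."""
    return n | frozenset({n})

one = successor(zero)
two = successor(one)
three = successor(two)

# The "size" of each encoded number falls out of the sets themselves...
assert len(zero) == 0 and len(one) == 1 and len(two) == 2 and len(three) == 3
# ...and the order relation is just set membership: 1 ∈ 3 and 2 ∈ 3.
assert one in three and two in three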
 