
Philosophy is dangerous

The Grey Man

το φως εν τη σκοτια φαινει
Local time
Today 3:05 AM
Joined
Oct 6, 2014
Messages
931
---
Location
Canada
but there seems to be no point in warning people about it. Any youngster who has a real vocation to philosophy, who genuinely loves the truth, will read Hume and Kant even if you tell him not to, and attempting to dissuade him from doing so will only make the situation worse, for no sooner will you have engaged him than you will have entangled the two of you in a philosophical discussion, precisely what you meant to avoid. As Aristotle said in his Protrepticus, whether it is right or wrong to philosophize, we must philosophize to find out. The most we can hope for is that young people do not become philosophical dilettantes, but rather undergo that long and rigorous study which alone can make them wise, or at least less foolish.
 

Black Rose

An unbreakable bond
Local time
Today 1:05 AM
Joined
Apr 4, 2010
Messages
11,431
---
Location
with mama
Wisdom is growth in intellectual and emotional maturity.

Remembering how I was, I have advanced to a great degree.

I have just enough youth left to realize I will be wiser in the future.
 

The Grey Man

το φως εν τη σκοτια φαινει
Local time
Today 3:05 AM
Joined
Oct 6, 2014
Messages
931
---
Location
Canada
Pascal said:
Les sciences ont deux extrémités qui se touchent, la première est la pure ignorance naturelle où se trouvent tous les hommes en naissant, l’autre extrémité est celle où arrivent les grandes âmes qui ayant parcouru tout ce que les hommes peuvent savoir trouvent qu’ils ne savent rien et se rencontrent en cette même ignorance d’où ils étaient partis, mais c’est une ignorance savante qui se connaît. Ceux d’entre deux qui sont sortis de l’ignorance naturelle et n’ont pu arriver à l’autre, ont quelque teinture de cette science suffisante, et font les entendus. Ceux-là troublent le monde et jugent mal de tout.
"The sciences have two extremes which meet, the first being that pure and natural ignorance in which all men themselves at birth, the other where those great souls arrive who have gone through all that a man can know only to find themselves at that same ignorance from which they started, only now it is a learned ignorance that recognizes itself. Those in between, who have left the natural ignorance but have not arrived at the other, have some tincture of this perfected knowledge, and make themselves heard. It is these last who trouble the world and judge everything badly."

(Pascal's ignorance savante = the docta ignorantia of Cusanus)
 

Black Rose

An unbreakable bond
Local time
Today 1:05 AM
Joined
Apr 4, 2010
Messages
11,431
---
Location
with mama
sounds like a dead end to me.

damned if you do, damned if you don't.

 

The Grey Man

το φως εν τη σκοτια φαινει
Local time
Today 3:05 AM
Joined
Oct 6, 2014
Messages
931
---
Location
Canada
You're only damned if you do if you do it halfway. People who learn just enough to persuade themselves that they know better than you are the worst, but people who actually do know better than you are alright.
 

Black Rose

An unbreakable bond
Local time
Today 1:05 AM
Joined
Apr 4, 2010
Messages
11,431
---
Location
with mama
You're only damned if you do if you do it halfway. People who learn just enough to persuade themselves that they know better than you are the worst, but people who actually do know better than you are alright.

isn't that about honesty, not ignorance?

I know people smarter than I am who are complete bastards. (exaggeration, term emphasized)
 

The Grey Man

το φως εν τη σκοτια φαινει
Local time
Today 3:05 AM
Joined
Oct 6, 2014
Messages
931
---
Location
Canada
Wisdom and cleverness are two different things, and intelligence is a third. A person who learns from philosophy only how to be more clever, how to achieve his ends without discerning the worthiness of these very ends, this is a person who 'disturbs the world'. He may be a successful Machiavellian, but he is not wise.
 

Black Rose

An unbreakable bond
Local time
Today 1:05 AM
Joined
Apr 4, 2010
Messages
11,431
---
Location
with mama
intelligence is a third

raw, it is just the ability to put parts together and deconstruct things (in the mind).

wisdom is understanding "why" you are doing so.

cleverness just operates in real time - dexterity
 

EndogenousRebel

Even a mean person is trying their best, right?
Local time
Today 2:05 AM
Joined
Jun 13, 2019
Messages
2,252
---
Location
Narnia
Is your problem with linguistics or philosophy?

Wittgenstein would say that language, which is shaped by our cultures, is the determining factor of our understanding of the world.

Hence, if you believe that philosophy is bad when articulated, then perhaps you mean when it is represented in words?

Nietzsche I think also said something similar IIRC.

Plato is well known for creating examples and dialogues to elaborate philosophical ideas. Socrates as we know him may be more fiction than fact for example.

I'm fairly sure Odin's scarred eye was the price of this pursuit of wisdom, leaving him with only one eye. Of course, this is a king of gods with that affliction, so we mere mortals may indeed be blinded by it.

There are some things we won't learn until we are humbled by life or see someone be humbled by life with our own eyes.

Pandora's box, the apple of Eden, anime. There are warnings of seeking knowledge everywhere. What exactly are you proposing?
 

The Grey Man

το φως εν τη σκοτια φαινει
Local time
Today 3:05 AM
Joined
Oct 6, 2014
Messages
931
---
Location
Canada
The fruit of the Tree of the Knowledge of Good and Evil corresponds to the knowledge of those who are, as Pascal put it, in between the two ignorances, who have left behind the innocence of youth but have not yet reached that learned ignorance which recognizes itself and which we recognize in Socrates. "Amen I say to you, unless you be converted, and become as little children, you shall not enter into the kingdom of heaven." What I propose is that we who would call ourselves philosophers not lose sight of the aim, which is no mere knowledge of dualities like Good and Evil, but integral wisdom, the fruit of the Tree of Life.
 

EndogenousRebel

Even a mean person is trying their best, right?
Local time
Today 2:05 AM
Joined
Jun 13, 2019
Messages
2,252
---
Location
Narnia
We must be man or beast in a world full of people who can be both.

Innocence is an invention of man. Perhaps because we recognize value in a purity of life, but also to excuse our "vile" nature.

But that nature is a part of a whole. When we try to cast out parts of it, it ends up festering like a rotting corpse always kept at arm's length.

Yes, control is key, and this is antithetical to innocence, but what recourse does one have when they have desires in their heart and no ability to seek them unencumbered?
 

onesteptwostep

Junior Hegelian
Local time
Today 5:05 PM
Joined
Dec 7, 2014
Messages
4,253
---
Honestly I don't think I've gotten smarter or 'wiser' by studying philosophy. I think I just have a bigger general knowledge of western civilization and western history than most people. That's about it. I think my education in theology as well as philosophy grounds me in the true liberal arts, and to be honest this knowledge doesn't mean much when the wider society doesn't share it, since you can't carry forward discussions that would be fruitful for society as a whole. What I know is honestly just basic knowledge, and most of it probably should be known by anyone who calls themselves a cultural critic.

I think at the end of the road, your studies in philosophy should lead you to an understanding of politics, economics and cultural trends. Whatever your ideological inclinations are, knowledge of human history leads you to become sensitive to the cultural programs in your own country.
 

Daddy

Making the Frogs Gay
Local time
Today 3:05 AM
Joined
Sep 1, 2019
Messages
462
---
Pascal said:
Les sciences ont deux extrémités qui se touchent, la première est la pure ignorance naturelle où se trouvent tous les hommes en naissant, l’autre extrémité est celle où arrivent les grandes âmes qui ayant parcouru tout ce que les hommes peuvent savoir trouvent qu’ils ne savent rien et se rencontrent en cette même ignorance d’où ils étaient partis, mais c’est une ignorance savante qui se connaît. Ceux d’entre deux qui sont sortis de l’ignorance naturelle et n’ont pu arriver à l’autre, ont quelque teinture de cette science suffisante, et font les entendus. Ceux-là troublent le monde et jugent mal de tout.
"The sciences have two extremes which meet, the first being that pure and natural ignorance in which all men themselves at birth, the other where those great souls arrive who have gone through all that a man can know only to find themselves at that same ignorance from which they started, only now it is a learned ignorance that recognizes itself. Those in between, who have left the natural ignorance but have not arrived at the other, have some tincture of this perfected knowledge, and make themselves heard. It is these last who trouble the world and judge everything badly."

(Pascal's ignorance savante = the docta ignorantia of Cusanus)

I agree somewhat.

Human thinking naturally starts out in that in-between, even without philosophy. So the fault isn't all on philosophy, but rather on that sad condition of being human.

Philosophy just gives you the meta tools to realize your own ignorance, but that’s definitely not a normal human way of being. Humans will do what they do regardless. Socrates shouldn’t have been found guilty.
 

The Grey Man

το φως εν τη σκοτια φαινει
Local time
Today 3:05 AM
Joined
Oct 6, 2014
Messages
931
---
Location
Canada
Thinking is difficult, therefore let the herd pronounce judgement! - Carl Jung
Your signature actually complements the Pascal quote rather well. I think Pascal goes on to say that the 'herd' judge each other well, since they judge badly.

Plato is well known for creating examples and dialogues to elaborate philosophical ideas. Socrates as we know him may be more fiction than fact for example.
Socrates shouldn’t have been found guilty.
When Plato wrote, somewhere in the Republic, that the just man can be recognized only by the fact that he is misunderstood and persecuted, since only by these signs will we know that he represents the truth and not just popular opinions, he was probably thinking of his master. "Men of Athens," says Socrates in the Apology, "I honour and love you, but I will obey God rather than you." Job's suffering similarly gives the lie to the claim of Satan (i.e. the Accuser) that he is only pious because God protects him from misfortune. In this sense both Socrates and Job are 'types' or prefigurations of Christ.

If Plato did not share Socrates's spirit of 'learned ignorance', it is probably because he perceived that it is not at all possible to 'know that one knows nothing', and that the fool who seeks wisdom is, by the very fact that he seeks wisdom, no fool. It is these paradoxes that prevent me from becoming a confessed Socratic. Given a choice between rigid dogmatism and Protean skepticism, I believe dogmatism is the lesser of the two errors. "Ask, and it shall be given you: seek, and you shall find: knock, and it shall be opened to you." "And you shall know the truth, and the truth shall make you free."

...So I suppose that, in the end, I, too, only 'somewhat' agree with Pascal. The fruit of the Tree of Life is not the pseudonymous gnosis of philosophical dilettantes, but it is not ignorance either. It resembles the innocence of children not by its privative nature, but by its simplicity.
 

EndogenousRebel

Even a mean person is trying their best, right?
Local time
Today 2:05 AM
Joined
Jun 13, 2019
Messages
2,252
---
Location
Narnia
Group consensus, when there is one, does often lead to a "right" answer.

On Who Wants to Be a Millionaire, when the audience was polled on the right answer to a trivia question, they were right 70-90% of the time.

I think this would depend on the question being asked though.

Can I not say that this represents fear of change rather than some correct assessment that can be generalized?

It could simply be that these controversial thinkers have ideas that are too ambitious? That the audience simply knows that the prescription, and the knowledge behind it, is not compatible with the system in its current state?

It takes a lot to build trust in anything new. But newness itself is a lie, and yet new information does indeed need to be scrutinized, and yes, that can take you down a rabbit hole. What is the danger in that?
 

The Grey Man

το φως εν τη σκοτια φαινει
Local time
Today 3:05 AM
Joined
Oct 6, 2014
Messages
931
---
Location
Canada
The danger of going down the rabbit hole is getting lost, but there is also danger in staying above ground, namely that of a superficial, rootless, inherently unstable existence. Danger is unavoidable, but perhaps the biggest danger is to think that there is no danger, no sacrifices to be made, that we can escape the consequences of our actions. But I'm philosophizing again. See? It's not possible to assess the dangers of philosophy without doing it.
 

EndogenousRebel

Even a mean person is trying their best, right?
Local time
Today 2:05 AM
Joined
Jun 13, 2019
Messages
2,252
---
Location
Narnia
Find ground and build from there. If you're lucky there will be other people willing to do it with you.
 

Rook

enter text
Local time
Today 10:05 AM
Joined
Aug 14, 2013
Messages
2,544
---
Location
look at flag
but then again religion sparks genocide so *shrug* the nihilist armies of the future massacring folk en masse, the elites spouting their 'socially correct societies'........

yeah I guess philosophy can be dangerous

as with religion tho, that don't mean it always is---lots of religious folk not committing genocide and all, just living their lives


don't fall off bridges tho

that shit'll kill u
 

EndogenousRebel

Even a mean person is trying their best, right?
Local time
Today 2:05 AM
Joined
Jun 13, 2019
Messages
2,252
---
Location
Narnia
I don't want to live in a world where people have to ask if I was pushed
 

birdsnestfern

Earthling
Local time
Today 3:05 AM
Joined
Oct 7, 2021
Messages
1,897
---
Now, who said it's dangerous and why? Only someone who has a stake in keeping the status quo; usually that's government or religion fearing changes to its place in power.

Plato is where I'd start. I am not proficient enough to speak fluently on it, but I've listened to audio books on Plato and have taken some philosophy, just not enough to discuss yet.

The first rule is that anything you say at ALL is an open invitation to another point of view, and as long as you understand that first rule and expect other ideas, it ends up taking you to a far greater mind, one that is open to new ideas.
I like philosophy in general because it stretches the mind; it helps you question what you know, why you know it, and keep learning. It's a participatory thing.

Ok, and it could be dangerous for the philosopher who wanted to change public minds; historically, yes, it was. But for modern man it's academic and should be safe, not dangerous.
 

EndogenousRebel

Even a mean person is trying their best, right?
Local time
Today 2:05 AM
Joined
Jun 13, 2019
Messages
2,252
---
Location
Narnia
Yes, perhaps form is just as important as function. A circle, a square, and a triangle: how are they different, and what are the implications? Spooky.
 

ZenRaiden

One atom of me
Local time
Today 8:05 AM
Joined
Jul 27, 2013
Messages
5,262
---
Location
Between concrete walls
To me, and I have thought about philosophy for some time, philosophy is what happens between our first assumptions about the world -----> the conclusions we naturally draw about it.
Since humans are first driven by instincts, at some point in life we learn that the instincts which drive us are not enough.
In comes the brain, which rewires itself to adapt to the new reality.
Ergo you realize your natural instinct is not enough to get from A to B, and philosophy explores your thinking process between pure instinct and the conclusions it makes about the world.
The danger of human brains and intellect is what one author calls over-adaptation.
Ergo over-adaptation is suppressing our nature in favor of the intellectual self.

Since intellect is the driving force behind human life, we cannot suppress it completely, but on the flip side we can have so much intellect that it suppresses our nature.

The key is balance, but no one knows what that is, since the way civilization evolved favors intellect over instinct.
Not only is our intellect more beneficial socially, sexually in mating or child rearing, or at our job, it also means more security, ergo not putting fingers in electric sockets etc.

So it's only natural to see why humans developed a brain.
The brain, whether we like it or not, is the thing that makes us survive.

The problem with mind over matter is that if you actually get to a point where mind rules over matter, it can kill you in a way.

For instance you can ignore pain or suffering.
Or ignore feelings.
Or ignore many other things that inform us about the world.

The question then becomes what informs our intellect about what is good.
The answer could be pure reason.
But we know from psychology there is no such thing as pure reason, as humans are fallible and slow thinkers.
We are especially slow and inefficient in the most prized faculty and that is analytical thinking.

However we prize analytical thinking most, because it seems to be the hardest commodity to come by, being time-consuming and hard to develop.

And would you be interested in our savior, the man himself, Jesus Christ Marx?
I have a book for you...
 

Daddy

Making the Frogs Gay
Local time
Today 3:05 AM
Joined
Sep 1, 2019
Messages
462
---
If Plato did not share Socrates's spirit of 'learned ignorance', it is probably because he perceived that it is not at all possible to 'know that one knows nothing', and that the fool who seeks wisdom is, by the very fact that he seeks wisdom, no fool. It is these paradoxes that prevent me from becoming a confessed Socratic. Given a choice between rigid dogmatism and Protean skepticism, I believe dogmatism is the lesser of the two errors. "Ask, and it shall be given you: seek, and you shall find: knock, and it shall be opened to you." "And you shall know the truth, and the truth shall make you free."

...So I suppose that, in the end, I, too, only 'somewhat' agree with Pascal. The fruit of the Tree of Life is not the pseudonymous gnosis of philosophical dilettantes, but it is not ignorance either. It resembles the innocence of children not by its privative nature, but by its simplicity.

So I know you chose two extremes to contrast here (rigid dogmatism vs protean skepticism), but I think that's misleading.

Rather, why would it not be possible to 'know that one knows nothing'? Philosophy, by its intensive and extensive reflection/inquiry, seems to give all answers, yet counter-intuitively contradicts them all at the same time. In a way, it seems to really be a process of learning the limits of knowledge, or its biases.

Dogmatism can, I suppose, be a tool (and a great political/emotional/motivational tool at that) for trying to form reality into a vision of the human mind. But if the dogmatic person doesn't understand/know how they are wrong/limited in their vision, how can they improve it, or even be communicated with in any productive capacity? I think this is a huge problem (or maybe even the main problem) with why humans can't or refuse to get along. Illuminating this problem could actually be the only real practical use that philosophy has.

Now that doesn't have to mean that we must change our minds or views easily. Just that we have to put more thought into what we think and why, and accept that there will be flaws, and that there will be ways of dealing with those flaws that will mean compromising the vision. It seems more appropriate, then, to reframe your original contrast into more realistic shades of grey - dogmatic vs compromising (as opposed to rigid dogmatism vs protean skepticism), where it seems you want to be a mix of dogmatic and compromising.
 

Black Rose

An unbreakable bond
Local time
Today 1:05 AM
Joined
Apr 4, 2010
Messages
11,431
---
Location
with mama
maybe some people understand reality better than others do?

so why not use this as an agenda? but what agenda?

if reality is the way it is then that is descriptive, but it is the individual that decides the prescriptive. but without an understanding of reality, obeying the prescriptions of those who also lack one is dangerous if those prescriptions are not in accordance with reality.

e.g. the tide pod challenge

we enlightened people know it's dumb, but why then do those who don't know better do it?

dumbness?
 

ZenRaiden

One atom of me
Local time
Today 8:05 AM
Joined
Jul 27, 2013
Messages
5,262
---
Location
Between concrete walls
Dogmatism can, I suppose, be a tool (and a great political/emotional/motivational tool at that) for trying to form reality into a vision of the human mind. But if the dogmatic person doesn't understand/know how they are wrong/limited in their vision, how can they improve it, or even be communicated with in any productive capacity?
I think the key here is, for example, where logic is used as an exploration tool, to go to the limits of reason.
Where you use logic to see how far logic can take us.
Without that exploration we cannot really "know" the limits of logic.
But logic is hard for humans. It's an artificial process for our brains.
It has to be methodical and slow.
Which means that without the work put into the discipline we would never even get half the things we today take for granted, such as in math or any sort of reasoning.

One could view philosophy as a form of exploration of the structure of human thought, following it up to the extreme.
Such as Kant.
 

scorpiomover

The little professor
Local time
Today 8:05 AM
Joined
May 3, 2011
Messages
3,383
---
but there seems to be no point in warning people about it.
Of course you can warn people against doing something stupid. If you tell people to not drink bleach, you think they'd drink bleach?

Any youngster who has a real vocation to philosophy, who genuinely loves the truth, will read Hume and Kant even if you tell him not to, and attempting to dissuade him from doing so will only make the situation worse, for no sooner will you have engaged him then you will have entangled the two of you in a philosophical discussion, precisely what you meant to avoid.
Hume and Kant wrote their books to be read. Of course reading Hume and Kant would be good things. But if you read Hume and Kant with the wrong attitude, then even Hume and Kant would tell you that you're doing the same to your mind as drinking bleach.

It's the attitude that people take to philosophy that is the problem, not the books themselves.

As Socrates said "the unexamined life is not worth living". The point of philosophy is not to know things to show off, but to examine your life and improve it.
 

Black Rose

An unbreakable bond
Local time
Today 1:05 AM
Joined
Apr 4, 2010
Messages
11,431
---
Location
with mama
The point of philosophy is not to know things to show off, but to examine your life and improve it.

when I was 23 I believed all the conspiracy theories about the world ending and economic collapse because you know, 2008 and all that.

I see the same thing happen to younger people now because of covid. A 27-year-old told me: why will technology get better? The economy will collapse in 4 years.

To me, the myth of progress is no myth.

I remember the difference between my gigaflop 1998 computer and my teraflop 2014 computer.

In 2024 I will get a petaflop computer and use a.i. tools to develop a.i. in VR.
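For what it's worth, here is the growth rate those three figures imply, taking the gigaflop/teraflop/petaflop labels at face value (a rough sketch, nothing more):

```python
# Implied growth rates for the flops figures quoted above
# (gigaflop 1998, teraflop 2014, petaflop 2024, taken at face value).
steps = [(1998, 1e9), (2014, 1e12), (2024, 1e15)]

for (y0, f0), (y1, f1) in zip(steps, steps[1:]):
    rate = (f1 / f0) ** (1 / (y1 - y0)) - 1
    print(f"{y0}-{y1}: {f1 / f0:,.0f}x overall, ~{rate:.0%} per year")

# 1998-2014: 1,000x overall, ~54% per year
# 2014-2024: 1,000x overall, ~100% per year
```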

Intelligence peaks at age 35 so I guess I can make more progress with myself.
Crystallized intelligence does not peak until 55. I have 20 years left to learn a shit ton more.
 

scorpiomover

The little professor
Local time
Today 8:05 AM
Joined
May 3, 2011
Messages
3,383
---
The point of philosophy is not to know things to show off, but to examine your life and improve it.
when I was 23 I believed all the conspiracy theories about the world ending and economic collapse because you know, 2008 and all that.
When I was 23, I heard the conspiracy theories but didn't believe them. I am shocked to see how many of them have been coming true around the time that was predicted.

I see the same thing happen to younger people now because of covid. A 27 yo told me why will technology get better? the economy will collapse in 4 years.
If that is all that happens, they should count themselves very fortunate indeed.

To me, the myth of progress is no myth.

I remember the difference between my gigaflop 1998 computer and my teraflop 2014 computer.

In 2024 I will get a petaflop computer and use a.i. tools to develop a.i. in VR.
I remember back in 1988, when a friend of mine developed software to run an entire pharmacy, with contraindications, ordering supplies and doing the tax and VAT, on a BBC Micro that had an 8 MHz CPU and 16 kB of memory.

Today, you wouldn't see an entire pharmacy run on a single laptop that runs at easily 250 times the speed with 125,000 times the memory.

We have far more power. But far less usage of that power, and so the returns are swiftly diminishing.
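As a quick sanity check of those multipliers (the 2 GHz / 2 GB laptop baseline is an assumption for illustration, not something stated in the post):

```python
# BBC Micro figures as quoted above; the modern-laptop figures are assumed.
bbc_clock_hz = 8e6            # 8 MHz
bbc_ram_bytes = 16e3          # 16 kB

laptop_clock_hz = 2e9         # assumed 2 GHz
laptop_ram_bytes = 2e9        # assumed 2 GB

print(laptop_clock_hz / bbc_clock_hz)    # 250.0     -> "250 times the speed"
print(laptop_ram_bytes / bbc_ram_bytes)  # 125000.0  -> "125,000 times the memory"
```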

Intelligence peaks at age 35 so I guess I can make more progress with myself.
Crystallized intelligence does not peak until 55. I have 20 years left to learn a shit ton more.
If you don't really make any effort and just coast on what you learned in the past, then that would be accurate. But that's only because you're not working your brain, and like any part of the body, it atrophies with lack of use. If, however, you keep working on learning new things and developing your intelligence, then studies on neuroplasticity show the mind will keep developing.

The proof of this is all the people who were in their 50s and over by the time they embraced the internet. If crystallised intelligence were fixed at that point, then none of the oldies would have embraced the internet. But they're all over the internet and email.

So the evidence of modern real life proves that the mind keeps developing, if you only let it.
 

Black Rose

An unbreakable bond
Local time
Today 1:05 AM
Joined
Apr 4, 2010
Messages
11,431
---
Location
with mama
When I was 23, I heard the conspiracy theories but didn't believe them. I am shocked to see how many of them have been coming true around the time that was predicted.

name names please :)

what came true?

who predicted it and under what circumstances did they become correct?

We have far more power. But far less usage of that power, and so the returns are swiftly diminishing.

When Moore's law runs out we will need to prioritize efficiency over scale.
Computers will need to become smarter, not more powerful, because that is what we need.

So the evidence of modern real life proves that the mind keeps developing, if you only let it.

Tools shape the mind. In VR you can learn things in 3D environments, not just 2D youtube videos. And like in video games you can try and fail with no consequences or social pressure. A.I. will be made that instructs people without needing to be as intelligent as people, but it will become so as we teach it and it teaches us.
 

The Grey Man

το φως εν τη σκοτια φαινει
Local time
Today 3:05 AM
Joined
Oct 6, 2014
Messages
931
---
Location
Canada
It seems more appropriate, then, to reframe your original contrast into more realistic shades of grey - dogmatic vs compromising (as opposed to rigid dogmatism vs protean skepticism), where it seems you want to be a mix of dogmatic and compromising.
True, human judgment is fallible, and there should be a limit to the stubbornness with which one clings to a conjecture in the face of refuting evidence, but one has to have a conjecture to be refuted in the first place. This is why I think dogmatism has a certain superiority or (maybe more accurately) priority over skepticism, though I agree that both pure dogmatism and pure skepticism are unreasonable.

Hume and Kant wrote their books to be read. Of course reading Hume and Kant would be good things. But if you read Hume and Kant with the wrong attitude, then even Hume and Kant would tell you that you're doing the same to your mind as drinking bleach.

It's the attitude that people take to philosophy that is the problem, not the books themselves.

As Socrates said "the unexamined life is not worth living". The point of philosophy is not to know things to show off, but to examine your life and improve it.
I was that youngster who read Hume and Kant even though people told me not to: this is the evidence for my claim that there is no point in warning people about the dangers of philosophy. I disregarded all those people who told me that I would be drinking intellectual bleach because, for some people, the unexamined life really is not worth living (though some can live their lives without reflecting, and I bear them no ill will). Consequently, I know all about those people Pascal mentions who "trouble the world and judge everything badly" because I was one and, to a certain extent, probably still am. And this is why I can't stop philosophizing: if my efforts are to bear fruit, I must see them through to the end.

St. Paul said:
Know you not that they that run in the race, all run indeed, but one receiveth the prize? So run that you may obtain.
I overestimated folks' knowledge of my life when writing the OP, but in case it interests anyone, here are the three stages of my love affair with Kant:
  1. The Honeymoon Phase: Like Schopenhauer, I read the Critique of Pure Reason and thought it was the greatest thing since sliced bread.
  2. The First Fight: In poring over Kant's works and the secondary literature, I discovered, to my dismay, that his baroque architectonic system is inconsistent.
  3. Reconciliation: I slowly realized that, to make his system consistent, he would have had to suppress many of the wonderful insights that mark him as a genius, some of which (e.g. the analytic-synthetic distinction, intuition as the foundation of geometry, his discussion of the schematism of concepts in sense-perception) will likely continue to influence me till I die.
 

scorpiomover

The little professor
Local time
Today 8:05 AM
Joined
May 3, 2011
Messages
3,383
---
When I was 23, I heard the conspiracy theories but didn't believe them. I am shocked to see how many of them have been coming true around the time that was predicted.

name names please :)

what came true?
Agenda 21, Naomi Wolf's 10 steps to fascism.

who predicted it
Don't know, Naomi Wolf.

and under what circumstances did they become correct?
All of them.

We have far more power. But far less usage of that power, and so the returns are swiftly diminishing.
When Moore's laws runs out we will need to prioritize efficiently over scale.
Moore's Law is about the speed of a single core of a CPU, which kept getting faster and faster until about 2000, when it reached about 2.5GHz. It really hasn't gotten that much faster, not compared to the meteoric rises in speed of cores that we saw beforehand.

Computers will need to become smarter not more powerful because we need that.
Does that explain the interest in AI and Machine Learning (computers teaching themselves)?

So the evidence of modern real life proves that the mind keeps developing, if you only let it.
Tools shape the mind. In VR you can learn things in 3D environments, not just 2D youtube videos. And like in video games you can try and fail with no consequences of social pressure.
VR has been around for a looong time. But I don't really know anyone who uses it. E.G. Did you learn to make a fire in VR?

A.I. will be made that instruct people without needing to be as intelligent as people but will become so as we teach it and it teaches us.
That's the hype. Does the hype match the reality of your life? Why aren't you being taught by A.I. how to solve the problems that you have in your life?
 

scorpiomover

The little professor
Local time
Today 8:05 AM
Joined
May 3, 2011
Messages
3,383
---
It seems more appropriate, then, to reframe your original contrast into more realistic shades of grey - dogmatic vs compromising (as opposed to rigid dogmatism vs protean skepticism), where it seems you want to be a mix of dogmatic and compromising.
True, human judgment is fallible, and there should be a limit to the stubbornness with which one clings to a conjecture in the face of refuting evidence, but one has to have a conjecture to be refuted in the first place. This is why I think dogmatism has a certain superiority or (maybe more accurately) priority over skepticism, though I agree that both pure dogmatism and pure skepticism are unreasonable.

Hume and Kant wrote their books to be read. Of course reading Hume and Kant would be good things. But if you read Hume and Kant with the wrong attitude, then even Hume and Kant would tell you that you're doing the same to your mind as drinking bleach.

It's the attitude that people take to philosophy that is the problem, not the books themselves.

As Socrates said "the unexamined life is not worth living". The point of philosophy is not to know things to show off, but to examine your life and improve it.
I was that youngster who read Hume and Kant even though people told me not to: this is the evidence for my claim that there is no point in warning people about the dangers of philosophy. I disregarded all those people who told me that I would be drinking intellectual bleach because, for some people, the unexamined life really is not worth living (though some can live their lives without reflecting, and I bear them no ill will).
You can tell someone that taking drugs is dangerous. Doesn't mean they'll listen.

Consequently, I know all about those people Pascal mentions who "trouble the world and judge everything badly" because I was one and, to a certain extent, probably still am.
Sounds like he is saying that amateurs are a pain in the backside.

And this is why I can't stop philosophizing: if my efforts are to bear fruit, I must see them through to the end.
If you started drinking bleach, but thought it was safe, and then discovered how harmful it was, why would you keep drinking it?

St. Paul said:
Know you not that they that run in the race, all run indeed, but one receiveth the prize? So run that you may obtain.
I overestimated folks' knowledge of my life when writing the OP, but in case it interests anyone, here are the three stages of my love affair with Kant:
  1. The Honeymoon Phase: Like Schopenhauer, I read the Critique of Pure Reason and thought it was the greatest thing since sliced bread.
  2. The First Fight: In poring over Kant's works and the secondary literature, I discovered, to my dismay, that his baroque architectonic system is inconsistent.
  3. Reconciliation: I slowly realized that, to make his system consistent, he would have had to suppress many of the wonderful insights that mark him as a genius, some of which (e.g. the analytic-synthetic distinction, intuition as the foundation of geometry, his discussion of the schematism of concepts in sense-perception) will likely continue to influence me till I die.
Sounds like you were treating Kant the same way that most people seem to be interested in Kant, as someone who reveals a HIGHER TRUTH, some esoteric understanding that explains everything, and makes us feel smug and self-satisfied that we understand the true nature of reality.

Something that you do, that makes you feel good, but doesn't make your life better in a material way, and even detracts from your life by taking up your time that you could have used to improve your life, has the same properties as taking heroin.

If, OTOH, you look at philosophers as "lovers of wisdom", and wisdom as "useful knowledge", then reading the works of philosophers is about "obtaining useful knowledge." Knowledge that is useful can improve your life. But it's unlikely to give you esoteric knowledge, as most of what is likely to be useful is not esoteric and is rather humdrum.

E.G. the analytic-synthetic distinction. I eventually figured out that all it means is that some ideas are "synthetic", constructed from other ideas, and some are not. But if an idea is to be regarded as "synthetic" and "new", then it must be of a nature where you cannot look at the components of its construction and think that it is an obvious conclusion from the components. Hence, an analytic idea is something so clearly and easily derived from the components it is made of that it seems almost simplistic by comparison.

By that definition, synthetic ideas seem clever, and analytic ideas seem not ideas at all.

OTOH, since analytic ideas are so clearly derived from their components that one scarcely considers them ideas at all, if you understand the underlying ideas that they are based on, then you'll almost never make a mistake in understanding how to use and apply the analytic idea either.

Not so with a synthetic idea, because it's so far from the underlying ideas that it is based on, that we think of it as a "new" idea that is divorced from its foundations. So we wouldn't automatically conclude that any conditions that apply to its foundations would also apply to a synthetic idea, and hence are likely to make many wrong assumptions about it, and thus it is much more prone to error and misunderstandings.

Which is better?

Analytic ideas won't impress anyone. But they're as likely to work for you, as the ideas they were built on.

Synthetic ideas are impressive, and you'll thus get lots of people willing to support you doing something with them. But you're much more likely to make lots of major errors and misunderstandings and make things even worse than if you'd done nothing at all.

This has relevance, because when you look at philosophy in a "synthetic" way, you love reading it, but it probably will just screw you up. If you look for the analytic in philosophy, it's rather trite-sounding, but has the power to vastly improve your life.
 

The Grey Man

το φως εν τη σκοτια φαινει
Local time
Today 3:05 AM
Joined
Oct 6, 2014
Messages
931
---
Location
Canada
@scorpiomover the object of philosophia is not "useful knowledge" or prudence (prudentia; φρόνησις), but wisdom (sapientia; σοφία), of which prudence is the practical aspect. Practical philosophy (ethics, politics) is incomplete without speculative philosophy (physics, metaphysics), which seeks, among other things, the principles that justify our understanding of man as a political animal and the ethical judgments that we use to regulate our lives.

To reduce philosophy to analytic judgments, similarly, is to reduce wisdom, not to its practical aspect, but to its rational aspect. Wisdom is not exhausted by rational knowledge (scientia; ἐπιστήμη), by knowledge of hypothetical necessities, but includes intellectual, noetic knowledge, which is knowledge of principles, those independent 'first causes' on which these necessities ultimately depend. Now, the statement of a principle cannot be analytic, since the content of such a statement must be independent, but the meaning of an analytic judgment is by definition dependent on the definitions of its terms (that an analytic judgment cannot be the statement of a principle is in fact an analytic judgment); therefore, all such statements must be synthetic.

Thus, just as denying the non-practical aspect of wisdom leads to a voluntaristic pragmatism, so does denying its non-rational (though by no means ir-rational in the sense of being contrary to reason!) dimension lead to scientism. Both are forms of relativism: pragmatism demands that everything be useful, forgetting that usefulness of things is determined by values which themselves have no use; scientism demands that every judgment be analytic, ignoring the fact that all analytic judgments are grounded in other judgments, and this leads to an indefinite regress of epistemic reasons analogous to that indefinite regress from means to ends to which pragmatism leads.

That these simple facts could be ignored so that relativism today enjoys an almost unchallenged dominance is, it seems, due to an irrational and epidemic preference for skepticism over dogmatism the roots of which go back to the seventeenth century (Galileo and Descartes) if not earlier. Studying philosophy helps me to uncover these deep prejudices and, at the same time, the principles that they tacitly deny. And this is why I continue to "drink the bleach": as I said to AK, "You're only damned if you do it halfway."

Revelation said:
I know thy works, that thou art neither cold, nor hot. I would thou wert cold, or hot. But because thou art lukewarm, and neither cold, nor hot, I will begin to vomit thee out of my mouth.
 

Black Rose

An unbreakable bond
Local time
Today 1:05 AM
Joined
Apr 4, 2010
Messages
11,431
---
Location
with mama
Moore's Law is about the speed of a single core of a CPU, which kept getting faster and faster until about 2000, when it reached about 2.5GHz. It really hasn't gotten that much faster, not compared to the meteoric rises in speed of cores that we saw beforehand.

In 2000 a GPU had 25 million transistors. Today they have 80 billion.

Speed may be important but so too is parallelism. It is the way in which the components are used that matters most. Utilization of all resources is paramount.
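A back-of-the-envelope check of what those two transistor counts imply (the roughly 23-year span to "today" is an assumption):

```python
import math

# Transistor counts as quoted above: 25 million (2000) vs 80 billion
# ("today", assumed here to be ~2023).
t0, n0 = 2000, 25e6
t1, n1 = 2023, 80e9

doublings = math.log2(n1 / n0)                 # ~11.6 doublings
years_per_doubling = (t1 - t0) / doublings
print(f"~{years_per_doubling:.1f} years per doubling")  # ~2.0, i.e. the classic Moore's-law pace
```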

VR has been around for a looong time. But I don't really know anyone who uses it. E.G. Did you learn to make a fire in VR?

What matters is mass adoption.

Von Neumann architecture was invented in the 50s but mass adoption did not happen till the 90s.

We are on the verge of mass adoption of VR because of the cost lowering each year. Some transition will happen just like when the iPhone came out in 2007.

That's the hype. Does the hype match the reality of your life? Why aren't you being taught by A.I. how to solve the problems that you have in your life?

Hype does not matter to me. What matters is the trends, and the trends say that mass adoption creates ecosystems where everything becomes feasible. Look at video games. It was hype in the 80s but is a reality today. I have been following the trends since I was 12 and I can tell you that there are no technical barriers. The ecosystem just has not developed far enough for everyone to get involved and out-compete each other in making the best a.i. - That will change because of VR, where a.i. has real application. Right now it does not, because no one cares about improving 2D apps. But in a 3D world that is a different matter. Especially one in which you are in for real, not on a flat screen.
 

scorpiomover

The little professor
Local time
Today 8:05 AM
Joined
May 3, 2011
Messages
3,383
---
Moore's Law is about the speed of a single core of a CPU, which kept getting faster and faster until about 2000, when it reached about 2.5GHz. It really hasn't gotten that much faster, not compared to the meteoric rises in speed of cores that we saw beforehand.
In 2000 a GPU had 25 million transistors. Today they have 80 billion.
Moore's law is about the speed of a single core of a CPU, not a GPU.

What you are implying is that Moore's law tapped out in 2000, and so motherboard designers chose to start making GPUs faster instead, by doing for GPUs what they did for CPUs.

But why didn't they do that before? Because they thought that the CPUs could handle the load, by increasing the speed of CPUs.

So there is no real improvement in terms of Moore's law.

Speed may be important but so too is parallelism. It is the way in which the components are used that matters most. Utilization of all resources is paramount.
I agree. The first system that used parallelism was Multics, the precursor of Unix. Unix didn't use parallelism. So why are modern OSes based on Unix and not on Multics?

Moreover, why does code only use parallel processing by running different programs on different cores, which is effectively NOT using parallel processing at all?

Parallelism has barely been touched.
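A minimal sketch of the distinction being drawn here: one program splitting a single workload across several cores, as opposed to the OS merely scheduling unrelated programs on different cores. Standard-library Python only; the workload is a made-up example, not anything from the thread.

```python
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))     # one chunk of the shared workload

if __name__ == "__main__":
    n, workers = 10_000_000, 4
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]

    with Pool(workers) as pool:                   # the same program fans out across cores
        total = sum(pool.map(partial_sum, chunks))

    assert total == sum(i * i for i in range(n))  # same answer as the serial version
    print(total)
```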

VR has been around for a looong time. But I don't really know anyone who uses it. E.G. Did you learn to make a fire in VR?
What matters is mass adoption.

Von Neumann architecture was invented in the 50s but mass adoption did not happen till the 90s.

We are on the verge of mass adoption of VR because of the cost lowering each year. Some transition will happen just like when the iPhone came out in 2007.
You are welcome to believe that flying pigs are real. What really changed the iPhone was the App Store, and Apple's rule that they wouldn't allow anything to be publicly available on the App Store until they had checked it out properly.

Likewise, VR will become popular when something else makes it worth using.

That's the hype. Does the hype match the reality of your life? Why aren't you being taught by A.I. how to solve the problems that you have in your life?
Hype does not matter to me. What matters is the trends and the trends say that mass adoption creates ecosystems where everything becomes feasible. Look at video games. It was hype in the 80s but is a reality today.
Everyone was playing video games in the 70s and 80s. Anyone who thinks video games were hype in the 1980s doesn't remember the 1980s, and has invented a false version of history.

I have been following the trends since 12 and I can tell you that there are no technical barriers. The ecosystem just has not developed far enough for everyone to get involved and out-compete each other in making the best a.i. - That will change because of VR where a.i. has real application. Right now it does not because no one cares about improving 2D apps.
That's because no-one seems to have figured out real-world viable uses for 3D apps that people want, and that are not already provided by 2D apps.

If you've been following trends since you were 12, then you've already noticed that any form of technology rises until it reaches maximum saturation and maximum feasible improvement. Most of the current tech tapped out in the 1990s.

But in a 3D world that is a different matter. Especially one in which you are in for real, not a flat screen.
When we come up with new tech that can take advantage of 3D, such as the stuff in the novel Neuromancer, the film Johnny Mnemonic, and the film Jurassic Park, then 3D will take off.

But right now, it's a toy.
 

scorpiomover

The little professor
Local time
Today 8:05 AM
Joined
May 3, 2011
Messages
3,383
---
@scorpiomover the object of philosophia is not "useful knowledge" or prudence (prudentia; φρόνησις), but wisdom (sapientia; σοφία), of which prudence is the practical aspect.
It's the lack of usefulness in what Modernists call "philosophy" that makes so many people turn against philosophy.

But then, that's probably intentional. If philosophy were useful, then so is what Socrates had to say, and then it's worth studying the past as well as the Modern, and then Modernism must be wrong. Do you disagree with my hypothesis?

Practical philosophy (ethics, politics) is incomplete without speculative philosophy (physics, metaphysics), which seeks, among other things, the principles that justify our understanding of man as a political animal and the ethical judgments that we use to regulate our lives.
Speculation is not worth anything until you can use it in some way. But the point of speculation is that some things are unsolvable from your current perspective, and so often, when you choose to speculate from a different perspective, you gain a much greater understanding that causes massive breakthroughs in machine technology and social technology.

Speculating about non-Euclidean geometry led to Einstein's theory of relativity, which made the timekeeping of satellite transmissions accurate enough to make GPS, satnavs, and mobile phone communications feasible.
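To put a number on the timekeeping point, here is a rough back-of-the-envelope calculation of how far a GPS satellite clock would drift per day if the relativistic corrections were ignored; the constants are standard textbook values, not anything from the thread.

```python
# Rough relativistic clock drift for a GPS satellite, standard constants assumed.
GM = 3.986004418e14        # Earth's gravitational parameter, m^3/s^2
c = 299_792_458.0          # speed of light, m/s
R_earth = 6.371e6          # mean Earth radius, m
r_orbit = 2.656e7          # GPS orbital radius, m (~20,200 km altitude)
day = 86_400               # seconds per day

v = (GM / r_orbit) ** 0.5                             # orbital speed, ~3.9 km/s
special = -v**2 / (2 * c**2)                          # moving clock runs slower
general = (GM / c**2) * (1 / R_earth - 1 / r_orbit)   # clock higher in the gravity well runs faster

drift_us = (special + general) * day * 1e6
print(f"net drift = {drift_us:.1f} microseconds/day")  # roughly +38 microseconds/day
# Uncorrected, ~38 microseconds of clock error per day is on the order of
# 11 km of ranging error, which is why the correction matters for GPS.
```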

To reduce philosophy to analytic judgments, similarly, is to reduce wisdom, not to its practical aspect, but to its rational aspect. Wisdom is not exhausted by rational knowledge (scientia; ἐπιστήμη), by knowledge of hypothetical necessities, but includes intellectual, noetic knowledge, which is knowledge of principles, those independent 'first causes' on which these necessities ultimately depend.
I would agree that philosophy is not just about "analytic philosophy". Where we differ, is about the meaning of the word "analytic".

Now, the statement of a principle cannot be analytic, since the content of such a statement must be independent, but the meaning of an analytic judgment is by definition dependent on the definitions of its terms (that an analytic judgment cannot be the statement of a principle is in fact an analytic judgment); therefore, all such statements must be synthetic.
If a statement is not dependent on the definitions of its terms, then it doesn't have any basis. Even if someone has an intuition, they still need a basis for trusting it, even if it's just that they have observed that their intuitions are more often right than not, and so ignoring one's intuitions is ignoring valuable information and making one's life worse unnecessarily, which is tantamount to being self-masochistic.

What I suggest is that there are 2 types of ideas within mathematics:

1) Ideas that are not obvious at first glance to those who are already familiar with their axioms, such as Pythagoras' Theorem, which is based on the basics of geometry and algebra, but not obvious to those who have learned both but not yet learned the proof of Pythagoras' Theorem. They are brilliant. But they are also easy to misunderstand and easy to use in error. In mathematics, a synthetic idea is usually called a "theorem". If an idea is extremely synthetic, it is usually called a "theory", such as the "theory of calculus".

2) Ideas that are obvious at first glance to those who are already familiar with their axioms. They are so obvious that they hardly count as ideas at all. But anyone familiar with their axioms is unlikely to get them wrong. If the idea is a simple conclusion from axioms, then it is called a "proposition". If the idea is a simple conclusion from a theorem, then it is called a "corollary".

I propose that Kant called #1 "synthetic", and #2 "analytic".
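A concrete illustration of the two categories, in the mathematical sense described above (the example is mine, not scorpiomover's):

```latex
\documentclass{article}
\newtheorem{theorem}{Theorem}
\newtheorem{corollary}{Corollary}
\begin{document}

% "Synthetic" in sense #1: not evident from the axioms of geometry alone,
% even to someone who already knows them well.
\begin{theorem}[Pythagoras]
In a right triangle with legs $a$, $b$ and hypotenuse $c$, we have $a^2 + b^2 = c^2$.
\end{theorem}

% "Analytic" in sense #2: once the theorem is granted, this follows at a
% glance, so it is filed as a mere corollary.
\begin{corollary}
The diagonal of a unit square has length $\sqrt{1^2 + 1^2} = \sqrt{2}$.
\end{corollary}

\end{document}
```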

What Modernists mean by "analytic" is something wholly different, as it is clearly not intuitionistic, but is so at odds with all of the rigour of mathematical logic that I really see no virtue in it, other than being able to waffle on and on in philosophy lectures and make a good middle-class living as a philosophy professor while saying little that is worth listening to.

Do you understand what people mean by "Analytic Philosophy" today? Can you explain it to me?

Thus, just as denying the non-practical aspect of wisdom leads to a voluntaristic pragmatism, so does denying its non-rational (though by no means ir-rational in the sense of being contrary to reason!) dimension lead to scientism. Both are forms of relativism: pragmatism demands that everything be useful, forgetting that usefulness of things is determined by values which themselves have no use;
Pragmatism is more about choosing to look at everything through the lens of "what is pragmatic". What people mean by "what is pragmatic" tends to be "what I can personally benefit from at the current time".

As a result, Einstein's theory of Relativity was not "pragmatic", because at the time, no-one could benefit from it, and now that we have satellites and mobile phones, the technology has already been applied, and so again we don't need it.

So not everything that is useful, is pragmatic, and thus, some things are not pragmatic, but are still useful.

scientism demands that every judgment be analytic, ignoring the fact that all analytic judgments are grounded in other judgments, and this leads to an indefinite regress of epistemic reasons
If science was held to the same standards as mathematics, I'd agree. But in maths, if something is true for all but 1 out of a trillion cases, it's thrown out, and even if some proof is so strong that it's got only a 1 in a trillion chance of being wrong, it's still thrown out. Even if you have a perfect proof, but make a tiny mistake somewhere, like Andrew Wiles did with his 1st proof of Fermat's Last Theorem, it's thrown out. So quite clearly, science is not held to the standards of mathematics.

Logical empiricists would hold science to the same standards as those of mathematics, which is why they are forced to reject 99.9% of science.

Everyone else clings to Scientific Positivism, when Positivism was a philosophy of the French Enlightenment that no-one really seems to understand or describe all that clearly.

Do you understand Positivism? Can you explain it to me? What's your explanation of Positivism?

analogous to that indefinite regress from means to ends to which pragmatism leads.
Same problem as Zeno's Paradox. Until you have solved the paradox, you'll be confused. But paradoxes are things that seem to not make sense, but make sense when you think about them. So once you think about Zeno's Paradox, you understand why that's not an issue. Can you see the solution?

That these simple facts could be ignored so that relativism today enjoys an almost unchallenged dominance is, it seems, due to an irrational and epidemic preference for skepticism over dogmatism the roots of which go back to the seventeenth century (Galileo and Descartes) if not earlier. Studying philosophy helps me to uncover these deep prejudices and, at the same time, the principles that they tacitly deny. And this is why I continue to "drink the bleach": as I said to AK, "You're only damned if you do it halfway."
Then it sounds like you are saying that philosophy is USEFUL to you. Hence, to you, philosophy counts as "useful knowledge". Do you agree?
 

Black Rose

An unbreakable bond
Local time
Today 1:05 AM
Joined
Apr 4, 2010
Messages
11,431
---
Location
with mama
Moore's law is about the speed of a single core of a CPU, not a GPU.

Wrong, it is not about speed but transistor count per square centimeter.
That is what Gordon Moore predicted and it did not stop in 2000.

Parallelism has barely been touched.

so room for improvement exists.

You are welcome to believe that flying pigs are real.

You are welcome to believe computers have not improved since 2000 but that is a delusion on your part.

VR will become popular when something else makes it worth using.

Which was my point. The iPhone got its ecosystem; I just said the same will happen for VR and a.i. Can you not grasp the argument?

If you've been following trends since you were 12, then you've already noticed that any form of technology rises until it reaches maximum saturation and maximum feasible improvement. Most of the current tech tapped out in the 1990s.

which again is a delusion if you think progress is impossible.

"Most" mean not all yet.

But right now, it's a toy.

Video games are only now maxing out, and because they are, developers are turning to physics simulation, where everything in the environment can be manipulated.

There is no way a.i. or VR has been maxed out yet so we will see.
 

scorpiomover

The little professor
Local time
Today 8:05 AM
Joined
May 3, 2011
Messages
3,383
---
Moore's law is about the speed of a single core of a CPU, not a GPU.
Wrong, it is not about speed but transistor count per square centimeter.
I looked this up. I stand corrected. Thank you.

That is what Gordon Moore predicted and it did not stop in 2000.
Well, it seems to have, with respect to CPUs. It has just been applied lately to GPUs, and to the number of cores. Do you agree with that?

Parallelism has barely been touched.
so room for improvement exists.
Yes. It's just that it hasn't happened yet. Do you agree with that?

If you agree, then the question remains: WHY hasn't it happened yet, and what does that answer tell us about the future?

I've looked into the issue, and realised that it requires a change in programming paradigms, i.e. a complete change in the way computers are programmed, and thus requires a change in the way people look at things in general. Such an attitude would require a more pluralistic POV, which would end the culture wars, and would reveal the corruption in politics that benefits those in power. So those in power don't want people to think that way, which in turn makes them want to hold the technology back.

Why do YOU think it hasn't happened yet?

You are welcome to believe that flying pigs are real.
You are welcome to believe computers have not improved since 2000
I started using computers in the 1970s. So I've had a lot of time to see if and when computers have improved, and if and when computers have not improved. I've also spent a lot of time reading about the history of computers. How long have you been using computers for? How much of the history of computers are you familiar with?

but that is a delusion on your part.
A delusion is a belief in something that is clearly false. As I said, I have plenty of evidence to prove my points. Do you? If I can easily prove my claims and you cannot, but merely believe your claims to be true because that's what you have been told, then who is deluded?

VR will become popular when something else makes it worth using.
Which was my point. I just said that what happened with the iPhone's ecosystem will happen for VR and a.i. - can you not grasp the argument?
I can grasp the argument. It's just that it strikes me as more of a positive hope, than an observation about reality. Do you see why I might be concerned that it's more of a hope than a reasonable conclusion?

If you've been following trends since you were 12, then you've already noticed that any form of technology rises until it reaches maximum saturation and maximum feasible improvement. Most of the current tech tapped out in the 1990s.
which again is delusion if you think progress is impossible.
If I believed that progress was impossible, then I would also believe that humans never developed farming.

However, progress in any one area or direction may reach a maximum feasibility. Then to progress further, one must progress in other areas and other directions. Can you agree with that?

But right now, it's a toy.
Videogames are only currently maxing out, and because they are, developers are turning to physics simulation, where everything in the environment can be manipulated.
Yes. But why would they turn to physics simulations, when they can run real-life tests? Only because they've already run real-life tests, and found one of 2 options:

1) They have already developed the technology but it is being held back for economic reasons, such as the way DVDs were held back for 10 years to help recoup the investment into CDs.

2) Their efforts are getting them nowhere, and they have no genuine ideas, and so are buying time by saying "Oh, but they work in simulations", so they will continue to get investment to fully develop the technology. That would be a positive hope which may not come to fruition and thus is a colossal waste of time & money.

I don't see either option as a positive. Do you?

There is no way a.i. or VR has been maxed out yet so we will see.
I agree that we don't know the future, and so we can only see what the future brings, when it arrives. But I am allowing for the possibility that your vision of a great new future may not materialise. That way, I cover myself for all possible outcomes. Do you see the benefits of covering yourself for all possible futures?
 

Black Rose

An unbreakable bond
Local time
Today 1:05 AM
Joined
Apr 4, 2010
Messages
11,431
---
Location
with mama
As I said, I have plenty of evidence to prove my points. Do you?

Yes, you said so yourself.

I stand corrected.

I am allowing for the possibility that your vision of a great new future may not materialise. That way, I cover myself for all possible outcomes.

An elephant could cause the stock market to crash via the butterfly effect.

But some things are worth speculating on while others are not.

I can grasp the argument. It's just that it strikes me as more of a positive hope, than an observation about reality. Do you see why I might be concerned that it's more of a hope than a reasonable conclusion?

If you cover all your bases then you cannot discount my ideas. Or elephants or giant meteors wiping out civilization.

So far I have seen what happened with the iPhone, and I am seeing what is happening today.

I remember in 2000 that people invested all their money in Beanie Babies and dog.com and lost a shit ton of money. But that is not what happened to Microsoft or Apple or Amazon, Google, or Nvidia. Before the internet and before social media I was into a.i., read every book I could about it, and made my own designs. And if I am being reasonable, my gigaflop 1998 computer was way better than my 1992 Oregon Trail computer. The GameCube was better than the NES. I can see how things saturate, but the games I played in 2004 (Knights of the Old Republic) have different graphics from the games I see today. Some are good and some suck, but I also remember the Wii games that were good. The Wii U sucks, but things suck not because of the tech but because of the development. I never had a touch phone until 2012, and the phone I have now is not a touch phone. But phones today have voice recognition and face recognition. In 1996 I wanted a portable TV/radio I saw at the mall, but now everyone has the equivalent.

I saw a video of people talking about creating VR avatars that are intelligent chatbots, but designed so that they can actually look into your eyes. They have some social intelligence. But only like 50 people in the world are doing this. Just because the Facebook metaverse sucks does not mean VR will suck forever. I am predicting that people will still do research, still program, and still come up with new things. It is not unreasonable, because I see new things appear all the time, like the 4K OLED screens that we knew about in 2006. When the price point matches up is when consumer products happen. YouTube has billions of videos today but had less than a thousand in 2005. The first video to receive a billion views was in 2012.

In 2024 I can afford a computer a thousand times more powerful than the one I am on right now, for the same price. That means it can be programmed to do things impossible for this computer. And my internet will be a thousand times faster as well than it was in 2014. But eventually, without 3D nanotube chips, people will focus on utilizing everything as efficiently as possible.
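For the arithmetic behind "a thousand times more powerful": under an assumed doubling every two years (an assumption, not a measurement), ten doublings is about twenty years and a bit over a 1,000x factor.

```python
# How many doublings does a 1,000x improvement require, and how long does
# that take at an assumed doubling period of 2 years?
import math

target_factor = 1_000
doubling_period_years = 2  # assumption for the sake of the estimate

doublings = math.log2(target_factor)       # ~9.97 doublings
years = doublings * doubling_period_years  # ~19.9 years

print(f"{doublings:.2f} doublings ≈ {years:.1f} years for a {target_factor}x gain")
print(f"Check: 2**10 = {2 ** 10}")         # 1024, slightly over 1,000
```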

The only thing I need to know is that we will use our resources better than before, because we will need to. We will run out of oil and coal and gas, and computers will stop getting more powerful. But then we will need to move away from general-purpose chips to application-specific processors, and toward energy efficiency, and quantum computers to discover new maths, to max out all potential.

Everything is consolidating. And people are looking for new things to do/make. If people make new things the price will go down and they will do more.

Blue Brain / the Human Brain Project just completed their rat brain atlas. They can run simulations on a whole rat brain (in real time, I think). But intelligence is not just about the duplication of brains. It is about a model that can self-program and correct errors in its thinking/logic. The algorithms the brain uses can be applied without all the complexity of the brain, but with the efficiency necessary to make it human-level.

The only problem is ethics - what does a human-level a.i. need in order to be considered sentient in the eyes of the law? Will it be given rights? Will it multiply and fight humans for survival? These questions are highly dependent on the ecosystem the a.i. goes into, and the number of people working in that ecosystem.

We will max out systems and move on to new systems, systems where we need intelligent computers, and everyone will shift to the a.i. stuff because that is the only area not maxed out yet.
 

onesteptwostep

Junior Hegelian
Local time
Today 5:05 PM
Joined
Dec 7, 2014
Messages
4,253
---
Philosophy is history, and philosophy is useless unless the larger population learns of it so that we as a society can debate how to progress forward with that knowledge of history. Truth is engagement with society and knowing how to make it better, in the light of all the history, morality, and human desires and accumulated knowledge.

This is ultimately what politics is, the ultimate 'game' of human existence. Until you realize this, you are not really an agent of 'humanity', a program that has been running for 5,000+ years.
 

The Grey Man

το φως εν τη σκοτια φαινει
Local time
Today 3:05 AM
Joined
Oct 6, 2014
Messages
931
---
Location
Canada
Then it sounds like you are saying that philosophy is USEFUL to you. Hence, to you, philosophy counts as "useful knowledge". Do you agree?
Philosophy is not knowledge at all, but an appetite the object of which is knowledge of a particular kind, namely wisdom, which, as I've said, includes both noetic knowledge of principles and rational, scientific/epistemic knowledge of applications, only some of which are political or ethical in nature. We seem to be going in circles, nor do I have time to pursue every line of discussion you've opened up. Your comment on the connection between the analytic-synthetic distinction and 'obviousness' is, however, very interesting, and I happen to have spent a lot of time thinking about this very topic, so I'll respond at some length.
What I suggest is that there are 2 types of ideas within mathematics:

1) Ideas that are not obvious at first glance to those who are already familiar with their axioms, such as Pythagoras' Theorem, which is based on the basics of geometry and algebra, but not obvious to those who have learned both but not yet learned the proof of Pythagoras' Theorem. They are brilliant. But they are also easy to misunderstand and easy to use in error. In mathematics, a synthetic idea is usually called a "theorem". If an idea is extremely synthetic, it is usually called a "theory", such as the "theory of calculus".

2) Ideas that are obvious at first glance to those who are already familiar with their axioms. They are so obvious that they hardly count as ideas at all. But anyone familiar with their axioms is unlikely to get them wrong. If the idea is a simple conclusion from axioms, then it is called a "proposition". If the idea is a simple conclusion from a theorem, then it is called a "corollary".

I propose that Kant called #1 "synthetic", and #2 "analytic".
Kant calls a judgment "analytic" if its predicate is contained in the subject, so that to know what the judgment means is the same as knowing that it is true; a "synthetic" judgment is one the truth of which cannot be known simply by knowing what it means. Kant thus defines synthetic judgments 'apophatically' as those which are not analytic, just as he defines a priori judgments simply as those which are not based in experience. Does the mere fact that an analytic judgment is true by virtue of its meaning make it obvious? To be sure, Kant's famous example of an analytic judgment, 'All bachelors are unmarried', is obvious to anyone who knows what a bachelor is; but one of his examples of a synthetic judgment seems no less obvious: '2 + 2 = 4'. What are we to make of this? Are we to conclude that, as some have suggested, this last statement is not synthetic at all, that, perhaps, the sum of 2 and itself is by definition 4? By what definition, whose? It is easy to prove that 2 + 2 = 4 using the Peano axioms,

4 = S(S(S(S(0)))) ∧ 2 = S(S(0)) ∧ n + 2 = n + S(1) = S(n + 1) = S(S(n)) ∀n ∈ ℕ
⇒ 2 + 2 = S(S(2)) = S(S(S(S(0)))) = 4, q.e.d.
and so it would seem that the predicate, 4, is, in a way, contained in the subject, 2 + 2 (this is a simplification: any element of the proposition '2 + 2 = 4' could be considered its subject or that which it is 'about', even the copula), but no axiomatization of arithmetic was available in Kant's time. Did arithmetic sums suddenly become analytic in the late nineteenth century? I think not. Rather do I think that whether a statement is analytic or synthetic depends on the linguistic context in which it is uttered. Arguably the biggest weakness of the Critique of Pure Reason is that (as J.G. Hamann, its earliest critic, pointed out) it fails to account for the role of language. This, however, does not change the fact that no one, literally no one, had even thought to distinguish analyticity from apriority before Kant. As Schopenhauer said, genius does not merely hit a mark that no one else can, but hits a mark that no one else can see. Kant may have missed his mark, but he was the only one who could see it in the eighteenth century, and this is one proof of his genius. But I digress.
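For the curious, the same derivation can be mechanized in a few lines. The encoding of numerals as nested tuples below is merely one convenient representation of the successor function, not anything essential to the argument; only the two Peano clauses for addition are doing the work.

```python
# Peano-style naturals: ZERO is the empty tuple, and S(n) wraps n in another tuple.
ZERO = ()

def S(n):
    """Successor: S(n)."""
    return (n,)

def add(m, n):
    """Addition by the Peano recursion: m + 0 = m, and m + S(k) = S(m + k)."""
    if n == ZERO:
        return m
    (k,) = n            # n = S(k)
    return S(add(m, k))

TWO = S(S(ZERO))
FOUR = S(S(S(S(ZERO))))

print(add(TWO, TWO) == FOUR)  # True: 2 + 2 = 4 follows from the definitions alone
```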

Speaking of genius, this is what I take to be the mark of the genius: that he makes a new linguistic context within which statements that appear synthetic (and therefore dubitable) or even absurd from without are actually analytic. Newton is a perfect example: 'F = ma' is absurd in the context of seventeenth century physics, but no sooner does one understand what Newton means by force, mass, and acceleration, and how (together with the equally 'absurd' hypothesis of gravitation) it throws light on the results of Galileo in the terrestrial domain and Kepler's laws in the celestial domain, than one sees the superiority of the new physics over the old. It becomes 'obvious', like the usefulness of the cat flap (the invention of which is, incidentally, attributed by some to Newton).
 

scorpiomover

The little professor
Local time
Today 8:05 AM
Joined
May 3, 2011
Messages
3,383
---
Then it sounds like you are saying that philosophy is USEFUL to you. Hence, to you, philosophy counts as "useful knowledge". Do you agree?
Philosophy is not knowledge at all, but an appetite the object of which is knowledge of a particular kind, namely wisdom, which, as I've said, includes both noetic knowledge of principles and rational, scientific/epistemic knowledge of applications, only some of which are political or ethical in nature. We seem to be going in circles, nor do I have time to pursue every line of discussion you've opened up.
We are going in circles for a few very simple reasons.

1) I would say that your description of "wisdom" is more akin to what most people call "philosophy". Most people say that philosophy is a waste of time, or in your case, "dangerous", because it is foolishness that one is better off NOT knowing.

2) In English, "wisdom" is what "wise men" say. "Wise men" are men who say very clever and useful things that most people would not have thought of, that are extremely useful to everyone. AFAIK, there are similar terms in almost every language. E.G. in Hebrew, "wisdom" is called "Chochmah" and a "wise man" is called a "Chacham". So these 2 terms seem to be pretty ubiquitous concepts.

When you combine #1 with #2, it is pretty clear that what you call "wisdom", which is what most people call "philosophy", is the opposite of what the word "wisdom" means.

Your comment on the connection between the analytic-synthetic distinction and 'obviousness' is, however, very interesting, and I happen to have spent a lot of time thinking about this very topic, so I'll respond at some length.
Kant calls a judgment "analytic" if its predicate is contained in the subject, so that to know what the judgment means is the same as knowing that it is true; a "synthetic" judgment is one the truth of which cannot be known simply by knowing what it means. Kant thus defines synthetic judgments 'apophatically' as those which are not analytic, just as he defines a priori judgments simply as those which are not based in experience. Does the mere fact that an analytic judgment is true by virtue of its meaning make it obvious? To be sure, Kant's famous example of an analytic judgment, 'All bachelors are unmarried', is obvious to anyone who knows what a bachelor is; but one of his examples of a synthetic judgment seems no less obvious: '2 + 2 = 4'. What are we to make of this? Are we to conclude that, as some have suggested, this last statement is not synthetic at all, that, perhaps, the sum of 2 and itself is by definition 4? By what definition, whose? It is easy to prove that 2 + 2 = 4 using the Peano axioms,

4 = S(S(S(S(0)))) ∧ 2 = S(S(0)) ∧ n + 2 = n + S(1) = S(n + 1) = S(S(n)) ∀n ∈ ℕ
⇒ 2 + 2 = S(S(2)) = S(S(S(S(0)))) = 4, q.e.d.
and so it would seem that the predicate, 4, is, in a way, contained in the subject, 2 + 2 (this is a simplification: any element of the proposition '2 + 2 = 4' could be considered its subject or that which it is 'about', even the copula), but no axiomatization of arithmetic was available in Kant's time. Did arithmetic sums suddenly become analytic in the late nineteenth century? I think not. Rather do I think that whether a statement is analytic or synthetic depends on the linguistic context in which it is uttered. Arguably the biggest weakness of the Critique of Pure Reason is that (as J.G. Hamann, its earliest critic, pointed out) it fails to account for the role of language. This, however, does not change the fact that no one, literally no one, had even thought to distinguish analyticity from apriority before Kant. As Schopenhauer said, genius does not merely hit a mark that no one else can, but hits a mark that no one else can see. Kant may have missed his mark, but he was the only one who could see it in the eighteenth century, and this is one proof of his genius. But I digress.

Speaking of genius, this is what I take to be the mark of the genius: that he makes a new linguistic context within which statements that appear synthetic (and therefore dubitable) or even absurd from without are actually analytic. Newton is a perfect example: 'F = ma' is absurd in the context of seventeenth century physics, but no sooner does one understand what Newton means by force, mass, and acceleration, and how (together with the equally 'absurd' hypothesis of gravitation) it throws light on the results of Galileo in the terrestrial domain and Kepler's laws in the celestial domain, than one sees the superiority of the new physics over the old. It becomes 'obvious', like the usefulness of the cat flap (the invention of which is, incidentally, attributed by some to Newton).
Well, I'd say that the idea of a "cat flap" seems to be the work of "genius", because it allows cats to come and go as they please, and to sh*t outside without the owner needing cat litter, without risking anyone burgling your home, as it's far too small for even a child to fit through, and without losing heat from your house, as the cat flap would swing closed once the cat has gone through, and yet it's such a simple idea that almost anyone with basic DIY skills could make a cat flap.

However, coming up with a distinction between trivially obvious trite statements like "all bachelors are unmarried" and every other type of statement, isn't really the work of genius, and is much closer to the work of an idiot.

So your description seems to imply that Kant was an idiot.

If you think that Kant was a genius, by all means, prove that Kant was a genius.

You could even do this by explaining how the analytic/synthetic distinction is so clever that it could only be the work of genius.
 

ZenRaiden

One atom of me
Local time
Today 8:05 AM
Joined
Jul 27, 2013
Messages
5,262
---
Location
Between concrete walls
My guess is wisdom comes from words like wiz, witt, witz, ergo the ability to react in a smart way, but wit can also come from words such as videt or visible or vision, ergo someone who sees something... In the abstract this could mean seeing solutions that others don't see; in the concrete it could mean someone who has seen many things.
Wise people are often depicted as travelers in the past, as travel usually implied knowledge beyond the known world.

The word wizard probably has a similar connotation, as in the ability to envision, to see, to be a prophet, to see solutions, and to pertain to knowledge, etc.

Wiz and wit are very alike as words.

In Slavic languages knowledge is called vediet (or veda in Sanskrit), videt means to see, and wise comes close to this, so the root etymology for wise is someone who has seen things.

Ergo wisdom comes from observation.

Analytical thinking often gives us the ability to see things people usually don't get when they are being intuitive.

So ancient philosophers trained themselves in analytical thinking so much that they could often see solutions most people did not even have the ability to see.

Analytical thinking as a discipline began only as writing and harnessing knowledge became a permanent staple of civilization.

Most people analyze things.

But analyzing things systematically, ergo logic for example, was something done only when people had the energy to support academics such as Plato or Archimedes or Socrates.

I am sure people were wise before professional academics existed though.
It's just that they did not write this knowledge down, or it was not preserved or found yet, and it probably was not systematic.

Newton's genius was finding the connection between mathematical methods and pretty obvious physical phenomena.

Everyone knows intuitively how mechanics work, but no one knew how to get precise numbers and add things up using math.

Since math in Newton's time was a still-developing discipline, and things like gravity were taken for granted but not understood, it took an analytical mind to look for the connection between math and observable nature.

All Newton did was take the most basic knowledge of math and the most basic, most obvious physical phenomena and formulate a model that worked.
He was a genius in the sense that he worked out the not-so-obvious way to connect the two worlds.

But people have always used mechanical logic and knowledge, and they have always used measurements in architecture or geodesy.

He simply put effort into making it more logical and clear.
 

The Grey Man

το φως εν τη σκοτια φαινει
Local time
Today 3:05 AM
Joined
Oct 6, 2014
Messages
931
---
Location
Canada
We are going in circles for a few very simple reasons.

1) I would say that your description of "wisdom" is more akin to what most people call "philosophy". Most people say that philosophy is a waste of time, or in your case, "dangerous", because it is foolishness that one is better off NOT knowing.
I would like to know what these people think is worth knowing if not principles (the object of noetic knowledge) and their applications (the object of rational knowledge), both of which pertain to wisdom as I've described it. Even prudential (i.e. useful) knowledge is included in sapiential knowledge, though wisdom is by no means exhausted by prudence.
2) In English, "wisdom" is what "wise men" say. "Wise men" are men who say very clever and useful things that most people would not have thought of, that are extremely useful to everyone. AFAIK, there are similar terms in almost every language. E.G. in Hebrew, "wisdom" is called "Chochmah" and a "wise man" is called a "Chacham". So these 2 terms seem to be pretty ubiquitous concepts.
The word 'wisdom' no more has a single, unanimously recognized meaning than 'intelligence' does, for the simple reason that English is not an artificially univocal, 'one word, one meaning' language like a mathematical deductive system, and is moreover quite unsuited to discussing immaterial things like the different kinds of knowledge. The Hebrew word 'Chochmah', likewise, has very different meanings in exegetical and Kabbalistic contexts from the one you described here. Convention is not always the best guide when one wishes to express oneself precisely. One must also rely on etymology, as I have.
If you think that Kant was a genius, by all means, prove that Kant was a genius.

You could even do this by explaining how the analytic/synthetic distinction is so clever that it could only be the work of genius.
I've already offered a proof of Kant's genius. Evidently, it didn't satisfy you, though I'm not sure if you've understood it, since you seem to think that it's based on his analytic-synthetic distinction whereas I pointed to his distinction between analyticity and apriority. This was, in any case, a digression from my main point, which was that the 'obviousness' of a statement depends on its analyticity, which in turn depends on its context.
 

Black Rose

An unbreakable bond
Local time
Today 1:05 AM
Joined
Apr 4, 2010
Messages
11,431
---
Location
with mama
The 3 Methods For Learning Wisdom, According to Confucius

“By three methods we may learn wisdom: First, by reflection, which is noblest; Second, by imitation, which is easiest; and third by experience, which is the bitterest.”
 

ZenRaiden

One atom of me
Local time
Today 8:05 AM
Joined
Jul 27, 2013
Messages
5,262
---
Location
Between concrete walls
The 3 Methods For Learning Wisdom, According to Confucius

“By three methods we may learn wisdom: First, by reflection, which is noblest; Second, by imitation, which is easiest; and third by experience, which is the bitterest.”
I would agree with this, but I assume this has to do with social order, which was and still is prominent in China, and lo and behold what a doormat nation they are.
Not saying it in a bad way. Some people being doormats is good for social order in a massive collective.
But too much of a doormat and you get very bad results in every other way.
To me Confucius was a typical autocratic bureaucrat.
He is not wrong in a didactic sense. It makes teaching bureaucrats easy when you need them to be proficient and serving and able to work in the organized ant hill of a kingdom full of thousands of people.

Also I would not say experience is the bitterest, not unless you do something wrong.
But there are times where experience is the only other way to push knowledge.
At times I find that schools even today push experience furthest from teaching, and I find the "experience is bitter" mentality a little curious.
I think the truth is experience is usually random and uncertain.
Sometimes it can kill you, and sometimes it can be a gold mine.
But no one found a gold mine by reflecting or imitation.
 