# Do NT's have an advantage with Computers?

#### Animekitty

##### INFP
You need logic to understand computers.
So thinking types would understand.
Intuition recognizes patterns.
So they would solve problems creatively.

Machine code
Compilers
Software development kits
Object-oriented programming
Operating systems

I do not know about these much at all.

I might know more if I went to school, but some people just learn on their own.

Personality has a large impact on whether people can just learn computers on their own.

#### Jennywocky

##### guud languager
I grew up before home computers, so when I saw my first Apple IIe, I fell in love. Computers always made sense. I always knew how to find things on the system, and understood how things should work, and... it was all intuitive.

I was actually composing music in single note BASICA on IBM compats in 1985 (if you speed up the notes, you can "roll" chords), and taught myself enough Fortran that I was programming for the college computer lab in the mid/late 80's without majoring in computer science.

Was designing and programming webpages in HTML and ASP/vbscript in the late 90's. My career has always involved computers from the first day of my first job.

But I think the only computer course I ever took in college was two semesters of Turbo Pascal / basic computer language. Now I'm a systems analyst for disability systems.

#### Nymus Anon

##### AnoNymus
I believe I'm an ISTP, and everything about computers is the funnest thing ever. I made the mistake of taking the comp sci 1 class in my high school this year, so I'm gonna jump into the comp sci 2 class in the middle of the year, but by that time even that class will probably not be good enough for me.

#### Cognisant

##### Condescending Bastard
Machine Code
Learn how to count in binary, learn how to do binary arithmetic (regular math, just with binary numbers), and learn about the different kinds of logic gates; you now know 90% of machine code. The remaining 10% is dealing with hardware instability.
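A sketch of that gate-level arithmetic in Java rather than actual machine code (names are mine, not from the post): a full adder built from XOR, AND, and OR gates, chained one stage per bit.

```java
// A full adder built from logic gates: XOR gives the sum bit, AND/OR give the carry.
// Chaining one of these per bit is (most of) how a CPU does binary addition.
public class BinaryAdder {
    // One full adder stage: two input bits plus a carry-in.
    // Returns { sumBit, carryOut }.
    static int[] fullAdder(int a, int b, int carryIn) {
        int sum = a ^ b ^ carryIn;                 // XOR gates
        int carry = (a & b) | (carryIn & (a ^ b)); // AND and OR gates
        return new int[] { sum, carry };
    }

    // Add two non-negative ints one bit at a time, gate by gate.
    static int add(int x, int y) {
        int result = 0, carry = 0;
        for (int i = 0; i < 31; i++) {
            int[] stage = fullAdder((x >> i) & 1, (y >> i) & 1, carry);
            result |= stage[0] << i;
            carry = stage[1];
        }
        return result;
    }

    public static void main(String[] args) {
        // 101 + 011 = 1000 in binary
        System.out.println(Integer.toBinaryString(5) + " + " + Integer.toBinaryString(3)
                + " = " + Integer.toBinaryString(add(5, 3)));
    }
}
```

The same ripple-carry structure, laid out in silicon instead of a loop, is what an ALU's adder actually is.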

Compilers
A compiler is just a program used to abstract away the pedantry of writing machine code: instead of telling the computer how to store a variable every single time, you just tell the compiler you want to store a variable and it does all the boring repetitive shit for you.
People who write compilers are a rare and special breed.
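To make that "abstracts away the pedantry" point concrete, here is a hedged toy sketch (the instruction set and all names are invented for illustration, nothing like a real compiler): a two-number expression gets "compiled" into the repetitive stack-machine steps the programmer would otherwise write by hand.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Toy "compiler" plus toy "machine": the programmer writes 2 + 3 once;
// the compiler emits the PUSH/ADD steps the hardware-level program needs.
public class ToyCompiler {
    // "Compile" a tiny expression like 2 + 3 into a list of instructions.
    static String[] compile(int a, char op, int b) {
        return new String[] { "PUSH " + a, "PUSH " + b, op == '+' ? "ADD" : "MUL" };
    }

    // A stack machine that executes the compiled instructions.
    static int run(String[] program) {
        Deque<Integer> stack = new ArrayDeque<>();
        for (String instr : program) {
            if (instr.startsWith("PUSH")) stack.push(Integer.parseInt(instr.substring(5)));
            else if (instr.equals("ADD")) stack.push(stack.pop() + stack.pop());
            else if (instr.equals("MUL")) stack.push(stack.pop() * stack.pop());
        }
        return stack.pop();
    }

    public static void main(String[] args) {
        String[] program = compile(2, '+', 3);
        System.out.println(run(program)); // 5
    }
}
```

Real compilers do vastly more (parsing, type checking, optimization, register allocation), which is why the people who write them are a rare breed.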

Drivers & Firmware
The software that makes the hardware do its thing right. In a way, the hard drive in your computer is its own machine with its own software separate from (but interacting with) the software in the rest of your computer, as is every other computer component; even peripherals like USB flash drives, mice, and keyboards have their own firmware.

Software Development Kits
An SDK is a compiler plus software that emulates another platform, this enables you to develop software for a phone or tablet using a computer and see how the software performs before installing it in the final product.

Object-Oriented Programming
Remember how the compiler automates storing variables for you? Well, OOP is kind of like that but on a larger scale: it assists with the creation of data structures and relationships between data sets. For example, if I tell you mammals have four legs and that a dog is a mammal, then you know a dog has four legs, because you understand that a dog is a mammal-type object with the property of "four legs". Without OOP, a computer would have to be told the dog has four legs, because it has no concept of the dog object.
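The mammal/dog example maps directly onto inheritance in Java; a minimal sketch (class names chosen to match the post):

```java
// Dog inherits legs() from Mammal, so the program "knows" a dog has
// four legs without ever being told so for Dog specifically.
public class Inheritance {
    static class Mammal {
        int legs() { return 4; }          // every mammal object carries this property
    }

    static class Dog extends Mammal {
        String speak() { return "woof"; } // dogs add behavior of their own
    }

    public static void main(String[] args) {
        Dog rex = new Dog();
        System.out.println("legs: " + rex.legs()); // inherited, never stated for Dog
        System.out.println(rex.speak());
    }
}
```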

Operating System
Each computer component has its own firmware; the motherboard connects all of this into a whole computer, and the operating system is like the firmware for operating the computer as a whole. It's basically a program used to organize files and programs, but it does many other housekeeping-type things so your computer works as intended.

#### Animekitty

##### INFP
A computer, let's say Windows 98 with a Pentium chip and a million transistors, ran on 1 or 2 threads. Let's just say that today a chip has 1 billion transistors and 8 threads. Because the number of threads is small, the logic units need to do more work. Practically, the bandwidth and the cache used are just going back and forth between batch processing inside the chip. The logic units are working on the computations and the thread delivers back results. I could guess that a chip with 1 billion transistors has 100,000 logic units running at once.

Object-Oriented Programming

A function lets you input something then receive a result.

You can create any function you want with higher-order structures.

From my program:

Code:
// This shows accumulated change of connections in Data.
for (int x = 0; x < ws; x++) {
    for (int y = 0; y < ws; y++) {
        difference_node[x][y] = Math.abs(Cortex1[x][y] - Cortex2[x][y]);
        Data1[x][y] += difference_node[x][y];
    }
}
The tricky part for me is that I did not understand functions until 2 years ago, because threads are only used in big software projects where you need to change the data in complex ways. Threading bothered me for a long time and I still have no idea how to make it work. I copied a program with a working thread and I do not mess with it at all; I break my programs that way. It is also impossible for me to make buttons and interactive interfaces with mouse clicks. (Java has this thing called an interface, but it has nothing to do with interactions between humans and computers.)
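For what it's worth, a minimal worked example of a thread in Java (a generic sketch, not the copied program mentioned above): the main thread starts a worker, waits for it with `join()`, then reads the result.

```java
import java.util.concurrent.atomic.AtomicInteger;

// The smallest useful threading pattern: start a worker, join it, read the result.
public class ThreadDemo {
    static int sumTo(int n) {
        AtomicInteger total = new AtomicInteger(0);
        Thread worker = new Thread(() -> {
            for (int i = 1; i <= n; i++) total.addAndGet(i); // runs off the main thread
        });
        worker.start();            // worker runs alongside the main thread
        try {
            worker.join();         // wait here until the worker is finished
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return total.get();
    }

    public static void main(String[] args) {
        System.out.println(sumTo(10)); // 55
    }
}
```

The hard part of threading is never the `start()`/`join()` mechanics; it is two threads touching the same data at once, which is why `AtomicInteger` is used here instead of a plain `int`.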

#### Serac

##### Prolific Member
I think NTs have an advantage in stuff that requires both creativity and in-depth knowledge. Writing OS or compilers are good examples I think.

Otherwise I think most programmers are not NTs. I don't think OOP is attractive to NTs, for example. I personally hate it as it presupposes humans need forced abstraction mechanisms in order to reason effectively about programs.

#### Cognisant

##### Condescending Bastard
> I don't think OOP is attractive to NTs, for example. I personally hate it as it presupposes humans need forced abstraction mechanisms in order to reason effectively about programs.
Lol I'm actually trying to explain OOP to AK right now and I've gotten myself confused.

It makes sense when talking about in-game objects like a car: you might want several different types of car, but you want them all to do the same sort of thing, and it makes debugging much easier when a fault with the car controller function can be fixed for all cars at once rather than having to find every relevant line for every type of car.

But then a car might have a physics object and a model object and a sounds object and an AI object and an interaction object, and these objects aren't intuitively "objects" as we generally use the word. Then you've got child objects of those objects, and you're wondering if they should be objects, and you've forgotten what an object is.
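The shared car-controller idea sketched in Java (all names hypothetical): the `drive` logic lives in one place, so fixing a bug there fixes every car type at once.

```java
// Many car types, one controller: a fix in Car.drive() applies to every subclass.
public class Cars {
    static class Car {
        int x = 0;
        int speed() { return 1; }
        void drive() { x += speed(); } // the one place the "car controller" lives
    }

    static class SportsCar extends Car {
        @Override int speed() { return 3; } // differs only where it needs to
    }

    public static void main(String[] args) {
        Car a = new Car();
        Car b = new SportsCar();
        a.drive(); // both calls run the same controller code
        b.drive();
        System.out.println(a.x + " " + b.x); // 1 3
    }
}
```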

#### Animekitty

##### INFP
function1(function1(variable1 + variable2), function2(variable1 - variable2))

variable1 = 4
variable2 = 3

x = function1(function2(4 - 3))

x = 1
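One hedged reading of the example above: the post never defines the two functions, so assume both simply return their input, under which the nested call does evaluate to 1.

```java
// Hypothetical functions matching the example; both are assumed to be
// pass-throughs since the post never defines them.
public class Compose {
    static int function1(int n) { return n; }
    static int function2(int n) { return n; }

    public static void main(String[] args) {
        int variable1 = 4, variable2 = 3;
        int x = function1(function2(variable1 - variable2));
        System.out.println(x); // 1
    }
}
```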

#### Animekitty

##### INFP
Class
Methods
Objects
Functions

Object McDonald passes by Method cheeseburger in Class restaurant to BurgerKing. BurgerKing adds pickles with pickles function then passes by Method cheeseburger to McDonald's.
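A hedged attempt to straighten that analogy out in Java (all names made up for illustration): `Restaurant` is the class, `mcDonalds` and `burgerKing` are objects of that class, and `addPickles` is a method.

```java
import java.util.ArrayList;
import java.util.List;

// One class, two objects; the burger (a topping list) gets passed between them.
public class Restaurants {
    static class Restaurant {
        List<String> toppings = new ArrayList<>();

        void addPickles() { toppings.add("pickles"); } // a method of the class

        // Hand the burger's topping list to another restaurant object.
        void passBurgerTo(Restaurant other) { other.toppings = this.toppings; }
    }

    public static void main(String[] args) {
        Restaurant mcDonalds = new Restaurant();  // two objects of one class
        Restaurant burgerKing = new Restaurant();
        mcDonalds.passBurgerTo(burgerKing);
        burgerKing.addPickles();                  // BurgerKing adds pickles
        burgerKing.passBurgerTo(mcDonalds);
        System.out.println(mcDonalds.toppings);   // [pickles]
    }
}
```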

#### Serac

##### Prolific Member
> Lol I'm actually trying to explain OOP to AK right now and I've gotten myself confused.
>
> It makes sense when talking about in-game objects like a car: you might want several different types of car, but you want them all to do the same sort of thing, and it makes debugging much easier when a fault with the car controller function can be fixed for all cars at once rather than having to find every relevant line for every type of car.
>
> But then a car might have a physics object and a model object and a sounds object and an AI object and an interaction object, and these objects aren't intuitively "objects" as we generally use the word. Then you've got child objects of those objects, and you're wondering if they should be objects, and you've forgotten what an object is.
I guess there are good uses of OOP, like the one you described. My experience, though, is that OOP is a perversely abused concept where people just mindlessly write classes for everything instead of thinking about what the essence of the problem is. It ends up making programs extremely bloated, inefficient, and a nightmare to understand and debug.

#### gps

##### INTP 5w4 Iconoclast
> Do NT's have an advantage with Computers?
Posed this way, the power of suggestion might find one proceeding as if an NT were an NT were an NT ... as if extroverted NTs use or interact with a multipurpose, multifaceted tool the same way introverted NTs do ... or as if _NTJs didn't prefer early binding and closure/certainty from bottom-to-top and top-to-bottom, where _NTPs might experience comfort with late binding, lazy binding, and emergent phenomena.

As a professional programmer I noticed that ALL of the hardcore coders were INTx ... not an ENT_ to be found.
So the _NT_ generalization applied to computers might be overly general UNLESS the ENT_ types end up using said computers at a higher level of abstraction;
to wit, an ENTJ 'field marshal' may use computer apps which aid in logistics, scheduling, and such, while an ENTP 'inventor' may use ready-made apps for computer-aided design, spreadsheeting, databases, and such.
I would be VERY surprised to see an ENTx doing low-level bit twiddling and manifesting the attention to detail which INT_ types are better suited for.
ENT_ types seem either interested AT ALL, or MORE interested THAN INT_ types, in what the general public thinks about what they DO with computers; they play to the crowd.
As real hard-core programmers, the INT_ types may care what their peers think and feel about their CODE, apps, APIs, libraries, and such ... they are less likely to attention-whore to The General Public than their extroverted cousins in temperament.

Though I can imagine extroverted NTs doing as well as we introverts in a limited-interval class or course, I can't imagine an extroverted NT being able to churn out code in a cubicle without experiencing something lacking at a very substantial level of gratification.

My brother is an ENTP who has wanted to develop his own app for a decade or so, but he can't bring himself to co- or inter-operate with his INTP pro programmer pursuant to implementing his invention-in-process.

I met another ENTP in the summer of 2016 who watched me do my thing with programming, but whose eyes glazed over, which led me to believe that his notions of programming hit the metaphorical 'stop bath' with his use of the imperative paradigm through programming robots in his teen years; e.g. an 'inventor' cannot necessarily expand his or her range of operations, either very well or AT ALL, into the domain of an INTP 'architect' capable of weighing programming paradigms, language features, and such in light of a domain of application.

As strategists, I can imagine ALL NTs starting with a desirable End Result then working backwards in an attempt to cobble together the modules to facilitate the desired outcomes; I believe the introverts can and do BETTER traverse the depths, while the ENT_ bob on the surface, near where the General Public uses 'computers'-cum-'smart' devices ... phones and tablets.

#### Animekitty

##### INFP

I made this one in 2015

I made one other one but I have no more ideas left.
It would require CUDA / an Nvidia graphics card.
A card can have over a thousand threads operating at once.
It would be an avatar with a brain.
A CPU does not have the bandwidth for parallelism.

I do not think much will happen till AR and VR go mainstream.

I am trying to deconstruct the brain so I will have an architecture.

I need to learn how to manage my emotions so I am calmer and can think through problems and structure.

##### think again losers
Not sure if I'm an NT, but I fucking suck with computers. Like... context specific learning disability level suck. Not sure why, because I use them all the time and would think that they should fall within a category of my relative strengths. I have very little understanding of them, and get very frustrated with them quickly.

#### Animekitty

##### INFP
> Not sure if I'm an NT, but I fucking suck with computers. Like... context specific learning disability level suck. Not sure why, because I use them all the time and would think that they should fall within a category of my relative strengths. I have very little understanding of them, and get very frustrated with them quickly.
Last time something was wrong with the forum, Kuu fixed it. You are a mod, so you must know some stuff? What kind of stuff do you know about computers that you don't suck at?

What do you mean by this?

"context specific learning disability level suck"

##### think again losers
I do absolutely none of the technical stuff for the forum. I do the social side of things only.

When I say learning disability, I'm thinking attentional impediment? As in, there's no reason why someone of my general aptitude shouldn't be able to computer, especially since I spend like 8 hours a day on one. But I can't computer, so there might be something going on.

While I'm an average typist, anything more complicated than a word document is reminiscent of introducing a touch-screen to your average 90yo. I'm just not well equipped to deal with it. I have no patience for that particular brand of learning. If I'm forced to trouble-shoot something I'll probably have a tantrum, get half way through, and give up. There are the things I do everyday, and then there is the stuff I cannot and will not do.

#### gps

##### INTP 5w4 Iconoclast
> I made one other one but I have no more ideas left.
> It would require CUDA / an Nvidia graphics card.
> A card can have over a thousand threads operating at once.
> It would be an avatar with a brain.
> A CPU does not have the bandwidth for parallelism.
Graphics cards are just one example of concurrent processing.
Both clusters and multi-core processors can do it too.
Though from a programming perspective many -- if not most -- developers can get by with multithreading.
NetLogo comes to mind as an example; a programmer can put multiple 'turtles' to work generating a simulation, graphics image, or whatnot.

Clojure is pretty impressive too, as it allows multiple processors on the web to be used concurrently to perform parallel algorithms.

> I do not think much will happen till AR and VR go mainstream.
Happen ... at what scope?
If an individual NT wants to develop an algorithm, app, or proof of concept, he or she can do so in their own backwater billabong regardless of what the puddle ducks are doing, paddling along in the 'mainstream'.

> I am trying to deconstruct the brain so I will have an architecture.
Which hemisphere?
The left hemisphere closely resembles a serial processor; the right operates along massively parallel lines.
Then when it comes time to model the Corpus callosum which connects our serial processor with our parallel processor(s) you're in for all kinds of fun.

> I need to learn how to manage my emotions so I am calmer and can think through problems and structure.
Why (mis)manage them when you can acknowledge that they (e)motivate you ... and that if you allow them to flow they can and will carry your would-be-objective 'thought'/cognition processes downstream to produce results far better than if you attempt to fight -- or 'manage' -- their upwellings and flows?
I used to attend Emotions Anonymous meetings in which 'emotions' were vilified, corresponding to Alcohol as per AA.
The 'old timers' used to formulaically start their 'shares' with "My name is <name> and I'm powerless over my emotions."
I used to mess with them by saying, "My name is Gene, and I'm EMPOWERED by my emotions."

IMNSHO, emotions don't need to be 'managed' so much as surfed, kayaked, ridden, etc.
C'mon! We're Perceivers with a Play Ethic.
The first myth of 'management' is that it EXISTS; the emotions attendant with playing can empower all sorts of experimenting via computers.

#### gps

##### INTP 5w4 Iconoclast
> You need logic to understand computers.
> So thinking types would understand.
I believe I understand the level at which you're intending to pin down 'computers'.

However, the semantics of 'computer' qua computer have shifted so much in recent decades that, with 'smart phones', 'tablets', and all, *many* think of a 'computer' as something more like a laptop or desktop, even though so-called 'smart phones' and 'tablets' either *are* or *contain* computers.

Nowadays I imagine 'understanding computers' is closing the gap with 'understanding a car', or understanding an abacus.
Most people don't have much interest in understanding cars; they are content to use them, drive them, operate them.
So too with their programmable digital devices;
most have low to no desire to 'understand' them any more than their cars.
And -- if truth be told -- they don't use them for 'computing' so much as communicating with others; in this regard they are used more like the CB radios of the 70s and early 80s.

NTs, over the developmental course of what lay persons have regarded as 'a computer', have been instrumental in both their development and their application.
The huge difference between primitive barely-computers -- the first computer I programmed had ferrite core memory and no compiler or interpreter for higher-than-machine-code programming -- and modern smart phones, tablets, laptops, desktops, clusters, and such is several layers of abstraction.
So an NT's 'understanding' may extend over a broad range or be narrowly scoped.
Two NTs may each have an 'understanding' of computers or 'work with computers' and yet not have enough overlap to hold a coherent discussion, if/when they do both have sufficient interest to bridge their co-engendered gap.
A firmware engineer and someone developing apps for GUIized systems may work so many levels of abstraction apart that they may each experience what people speaking different 'natural' languages do when they have no common language to intermediate.

And even if/when both NTs are working and playing at the same level of abstraction, the tools, programming languages, and paradigms they have used or are interested in may put them at odds.

One's 'understanding' of a domain-specific problem when looked at through an OOP lens may be experientially quite different from the view one gets when looking at the same domain space through a functional-paradigm lens, where a domain-specific language may be crafted via metalinguistic abstraction.

All this said, it seems that as computers have evolved from the days of tubes, relays, and flip-flops made from discrete transistors, the nature of any conversation about 'understanding computers' has become increasingly multi-meta-leveled.
So much so that individuals sufficiently well-intentioned may either talk past each other OR end up pinning down their mutually-agreed-upon notion of 'computer' to the extent they co-manifest Folie à deux.

In recent decades the OOP paradigm was so entrenched in the thralls of CS departments around the world that its (mal)practitioners couldn't -- or wouldn't -- even hold a conversation which didn't beg the paradigm; they could only talk in terms of the prejudicial cognitive framework they collectively manifested via Folie à plusieurs.
No! I don't want to talk about 'methods', which beg the paradigm; I want to talk about 'algorithms' and 'data structures', which don't favor any one paradigm over another AT THE OUTSET.

With every additional degree of nested complexity, the so-called 'natural' language used to talk about arguably MORE-complex languages, and the computer subsystems in which they apply, becomes more strained and inadequate to the task.

How can one employ mere 'English' to convey the depth and breadth of NTs' 'understanding' of, or 'advantages' with, 'computers', to whatever extent we/they have them?

In recent years I've noticed that such conversations take on an air of developmental psychology in that for me to communicate with someone effectively I have to become aware of which paradigms, languages, concepts, and preferential biases they hold.
My skills and abilities in linguistics, psychology, and philosophy have to improve above and beyond those one might imagine sufficient for 'understanding' computers to a level which advantages myself over mere-mortal non-NTs.

#### gps

##### INTP 5w4 Iconoclast
> The tricky part for me is that I did not understand functions until 2 years ago, because threads are only used in big software projects where you need to change the data in complex ways.
'Function'? ... 'Because threads'?

It seems you might be suffering from the stinkin' thinkin' induced by (mis)use of Java as a language and OOP as a programming paradigm, as per the Linguistic Relativity addressed by the Sapir-Whorf-Korzybski hypothesis.

By way of a prescription to help clear up what appears to be a nasty cognitive rash, I'm prescribing a liberal topical dose of Execution in the Kingdom of Nouns.
As there is nothing toxic in this ointment, it won't hurt if this would-be rash is some post-objectified, post-reified 'thing' -- where 'Object' = figment of imagination arising from reification -- akin to adolescent acne correlated with the growing pains associated with becoming A Real Programmer.
Not that you asked, but IMNSHO 'Real Programmers' exhibit signs of eclecticism and would-be 'artificial'-linguistic polyglotism.

And 'threads' can be manifested/implemented without the prejudicial use of the term 'thread', for example via closures in a programming language which supports first-class functions.
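A minimal illustration of such closures, using Java's lambdas (which capture, i.e. "close over", their surrounding environment; the counter example here is mine, not from the post):

```java
import java.util.function.IntSupplier;

// Java's lambdas are (limited) closures: the returned function closes over
// the captured state and keeps carrying that environment around.
public class Closures {
    static IntSupplier counterFrom(int start) {
        int[] state = { start };  // the captured environment (array, so it's mutable)
        return () -> state[0]++;  // each call advances the captured state
    }

    public static void main(String[] args) {
        IntSupplier c = counterFrom(10);
        System.out.println(c.getAsInt()); // 10
        System.out.println(c.getAsInt()); // 11
    }
}
```

The closure carries its own private state between calls, which is the sense in which stateful, thread-like behavior can exist without anything named 'thread'.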

As for the semantics 'encapsulated' in your code block:
Code:
// This shows accumulated change of connections in Data.
for (int x = 0; x < ws; x++) {
    for (int y = 0; y < ws; y++) {
        difference_node[x][y] = Math.abs(Cortex1[x][y] - Cortex2[x][y]);
        Data1[x][y] += difference_node[x][y];
    }
}
a/the delta function is used throughout science and technology.
That your code 'uses' or 'applies' this function inside a nested-loop structure might trigger an awareness that one *might* use applicative, functional, or applicative-functional programming.
One might even implement the delta function as an anonymous function, just to demonstrate how arbitrary and capricious NAMING -- as in the Kingdom of Nouns -- and the reification it pathoCognitively induces is-qua-IS ... and is sharply expressed via E-Sharp as contrasted with E-Prime.
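An anonymous-function version of the delta loop along those lines (the wrapper method and array names are hypothetical, echoing the earlier snippet):

```java
import java.util.function.IntBinaryOperator;

// The delta from the loop written as an anonymous function: the operation
// is passed around as a value instead of being named inside a class.
public class Delta {
    static int[][] apply(int[][] a, int[][] b, IntBinaryOperator delta) {
        int ws = a.length;
        int[][] out = new int[ws][ws];
        for (int x = 0; x < ws; x++)
            for (int y = 0; y < ws; y++)
                out[x][y] = delta.applyAsInt(a[x][y], b[x][y]);
        return out;
    }

    public static void main(String[] args) {
        int[][] cortex1 = { { 5, 2 }, { 1, 9 } };
        int[][] cortex2 = { { 3, 4 }, { 6, 9 } };
        // the anonymous delta: never named beyond the call site
        int[][] d = apply(cortex1, cortex2, (p, q) -> Math.abs(p - q));
        System.out.println(d[0][0] + " " + d[1][0]); // 2 5
    }
}
```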

<snipped long-winded, indubitably-boring demonstration of functional programming via lisp>