Not Invented Here
I first became aware of the concept of "Not Invented Here" in the context of software development. While it applies to many domains, it's particularly prevalent in software because of the typical hubris of your average programmer: the conviction that it is more fun to write new code than to read and adapt someone else's code for a given purpose.
We programmers love to create, but no one wants to maintain.
This concept applies to human ideation as well. I regret that I spent much of my youth, in high school and college, devising little philosophical concepts in my head, shaped by the inevitably skewed cognitive biases that all human beings possess. I did this instead of accepting that the ideas I was conceiving were philosophical in nature, attaining a strong foundation in existing ideas, learning the vocabulary of the domain, and then building my own ideas to extend or augment what already existed.
I suspect that we are more likely to get attached to ideas that we have conceived of ourselves, as opposed to those ideas that we have learned from others. In cognitive psychology, this concept must certainly exist, be well-defined, and have a name. Now that I'm dwelling on it here, I'm almost certain that I've read about such a thing in the past, but I don't remember what it is called. Just as well; I might be less inclined to leave this paragraph in this post if I found confirmation that this idea wasn't my own.
Nobel Prize-winning physicist Richard Feynman was fond of telling a story about his first reading of a book by Paul Dirac, the physicist who revolutionized quantum mechanics in much the same way that Einstein revolutionized our understanding of gravity and time. At the end of the book, Feynman recalls, Dirac wrote that the current state of quantum physics required "new ideas". Feynman was young at the time, but he had already mastered much of the existing mathematical and scientific foundations of physics, and he took Dirac's closing remark as an invitation.
This is where progress actually happens: on the fringes. Dirac had illuminated a patch of the dark wood of our understanding of the natural world. He pointed to the boundary of his own discoveries, to the places where the light could not yet reach, and he challenged those who came after him to kindle their own lights and venture into that great unknown.
This is a thing I have never done. I was raised to understand that the world is a preexisting system, into which we all must step and find some way of contributing, if for no other reason than to be employed and to be able to survive. The surest way of doing this is to gain an understanding of existing systems and learn to do something that someone else will readily pay you for.
I went to college to find a trade, to learn a skill that would be useful when I graduated so I could readily find places to which I could contribute my knowledge. I did a poor job of even this, since I spent my time doing what I had been conditioned to do in high school: memorize lots of information.
People have referred to me as intelligent; the older I get, the less I understand what that label even means. Imagine a person who has learned all there is to know about a particular, practical subject; let's say the subject is chemistry. Now suppose a physically debilitating disease renders that person unable to communicate in any significant way. Of what use is their knowledge now?
Intelligence, it seems, has a social component: if you cannot communicate what you know to others, to teach them what you know, to show them something they cannot see, then it's hard to argue that such a person is intelligent. You must be able to apply knowledge to the world in some meaningful way.
A human being wouldn't be physically attractive if they were the only person left alive on planet Earth. Without any social context, the concept is rendered meaningless.