Many of the greatest ideas in science have waited centuries for even indirect confirmation.
Just as Nicholas of Cusa looked at infinity and boldly declared 'terra non est centrum mundi', the earth is not the centre of the universe, before the Church had yet realized how dangerous and how revolutionary this idea truly was.
Just as the Greek philosopher Democritus postulated in the fourth century B.C. that matter was composed of atoms, and more than two millennia later, in 1906, Ludwig Boltzmann committed suicide in part because he had been mercilessly ridiculed for believing in such atoms, for which there was still no direct proof.
Just as Copernicus’s heliocentric system, after struggling for nearly a century against the dictums of the Church, finally began to find acceptance.
“I place myself in a certain opposition to widespread views on the mathematical infinite and to oft-defended opinions on the essence of number.” - Georg Cantor (1845 - 1918), German mathematician
Avogadro’s law was first proposed in 1811 by Amedeo Avogadro, a professor of higher physics at the University of Turin for many years, but it was not generally accepted until after 1858, when an Italian chemist, Stanislao Cannizzaro, constructed a logical system of chemistry based on it.
In the first half of the 19th century the theory of evolution was mired in conjecture, until Charles Darwin and Alfred Russel Wallace compiled a body of evidence and posited a mechanism – natural selection – for powering the evolution machine.
Svante Arrhenius uncovered the relationship between electricity and chemistry in 1883, but he was doubted and rejected by the scientific community of his time. Only after gradually winning over the minds of scientists for twenty years did he finally receive the Nobel Prize in 1903.
The theory of continental drift, proposed in 1915 by Alfred Wegener, was not accepted by most scientists until the 1960s, with the discovery of mid-oceanic ridges, geomagnetic patterns corresponding to continental plate movement, and plate tectonics as the driving motor.
Einstein’s particle theory of light was not accepted for two decades. Eight, eleven and seventeen years after the publication of Einstein’s paper, even respected physicists such as Max Planck, Robert A. Millikan and Niels Bohr were still rejecting it.
Charles T.R. Wilson’s cloud chamber method for making the paths of electrically charged particles (electrons and positrons) visible by condensation of vapour was not acknowledged and accepted for 15 years – not until a deeper understanding of the atomic structure had been obtained.
John Michell, an amateur British astronomer, proposed the idea of an object with gravity strong enough to prevent light from escaping in 1783. In 1795, Pierre-Simon Laplace, a French physicist, independently came to the same conclusion. J. Robert Oppenheimer’s prediction of the existence of a black hole was first published in 1939, but no one took him seriously for many years.
Yoichiro Nambu, Holger Bech Nielsen and Leonard Susskind co-founded string theory in 1970; initially derided, it eventually became, some 40 years later, the leading candidate for a unified theory of nature.
Barbara McClintock’s discovery of mobile genetic elements, which she published in the 1940s and 1950s, was not recognised for more than 30 years.
James Clerk Maxwell’s discovery that visible light is an electromagnetic wave travelling at the speed of light (not instantaneously) was so revolutionary that his idea was ignored for many years in the physics departments of most universities, where professors stuck to teaching the classical physics of Newton.
Ignaz Semmelweis, the Hungarian physician who suggested obstetricians wash their hands between deliveries, was at first publicly discredited by the scientific community.
How unwise it is to confine the human intellect to a tiny corner of knowledge and convince oneself that it is not wise to go beyond. Scoff not at strange notions or isolated facts; let them be explored. Don’t ever think you’re done. We haven’t even explored the solar system, let alone the galaxy, let alone a galaxy cluster, let alone a supercluster, let alone a great void, let alone a great wall, let alone the great foam of our local region of the observable universe. And yet somebody sitting on a tiny planet, one not even visible from the sun, and certainly not visible from the centre of the Milky Way, says ‘I know’, on a planet that is not even visible in the universe.
Think about this little earth: imagine shrinking it down to the size of a pinhead and looking at it from the centre of the sun, 93,000,000 miles away. You realize it’s pretty small. Now go to the centre of the galaxy. It takes our sun roughly 225 million years to complete one orbit around the centre of the galaxy; it’s a long way to the centre of the Milky Way, which is on the order of 100,000 light-years across. Astronomers estimate that the Milky Way contains a few hundred billion stars, and our sun is just one of them. Our star is just one among those hundreds of billions, and there is a tiny planet going around it that’s not even visible from the centre of the galaxy. Now go out to our galaxy cluster, a grouping of hundreds to thousands of galaxies like our Milky Way spread across millions of parsecs, from which we can’t even pick out our galaxy, let alone our solar system inside that galaxy, or our planet inside that solar system. Now go out even further and see that the galaxy cluster is just a component in a supercluster, and that superclusters are arranged along the walls of giant voids, agglomerated into vast sheets and filaments, and those structures make up a cosmic foam, and that foam is just the observable universe, and outside that lie unimaginable reaches of space that we haven’t even observed yet, and you begin to realise that we still have a long way to go in learning.
The evidence for our insignificance is overwhelming. What science has been able to unravel is merely a fraction of the cosmic phenomena. We are seedlings of infinitesimal smallness, growing and learning. I’d rather say that I am learning, that I have had a moment of relative awareness based on the limited data I possess, data that may or may not even be so, and that I am a growing individual with relative insight, on to the next step in my illusionary journey. Keep growing; don’t ever be done. As long as you’re green you’re growing; as soon as you’re ripe you rot.
“Wisdom leads us back to childhood. It is in a state of natural ignorance where man really belongs. Knowledge has two extremes which meet: one is the pure natural ignorance of every man at birth, the other is the extreme reached by great minds who run through the whole range of human knowledge, only to find that they know nothing and come back to the same ignorance from which they set out, but it is a wise ignorance which knows itself. Those that stand half-way have put their natural ignorance behind them without yet attaining the other; they have some smattering of adequate knowledge and pretend to understand everything…For, after all, what is man in nature? A nothing in relation to the infinite, all in relation to nothing, a central point between nothing and all and infinitely far from understanding either. The ends of things and their beginnings are impregnably concealed from him in an impenetrable secret. He is equally incapable of seeing the nothingness out of which he was drawn and the infinite in which he is engulfed…How can a part know the whole? Perhaps man will aspire to know at least the parts to which he bears some proportion. But the parts of the world are all so related and linked together that I think it is impossible to know one without the other and without the whole.” – Blaise Pascal (1623 - 1662), French mathematician, physicist, and religious philosopher, from: Pensées
A prominent feature of the Western tradition is a propensity to think in terms of parallel dichotomies, so that the opposition between animality and humanity is aligned with those between nature and culture, body and mind, emotion and reason, instinct and art, and so on. The trouble arises because the legacy of dualistic thinking invades our very conception of what a human being is, for it has given us the vocabulary for expressing it. We are, according to this conception, constitutionally divided creatures: one part immersed in the physical condition of animality (the lower aspects of mind), the other in the moral condition of humanity (the higher aspects of mind).
"A great many people think they are thinking when they are merely rearranging their prejudices." – William James, Pioneering American psychologist and philosopher who was trained as a medical doctor.
Nature stands classically opposed to culture, the former an external reality, the latter a reality as it exists inside people's heads.
Ubique, semper, ab omnibus (“everywhere, always, by all”): let not mere human opinion be the ultimate authority; only a truly universal belief could be a true belief.
If one could experience all the lies and contradictions of one’s life at once, one could not bear it, so buffers are created to lessen the brunt. Since one cannot eliminate all the contradictions in life, self-righteousness results: buffers evolved to protect us from the pain of facing the truth about ourselves.
Wisdom is not the realization of how much we know; rather it is the realization of how much we don’t. One of the most liberating, wonderful things in the world is when you openly admit you're an ass. Wisdom tends to grow in proportion to the awareness of our own ignorance. When we stop and realize how little rationality we really have, that is the beginning of real reasoning.
Discoveries have almost no impact in the world of science unless they are published in standard journals. And it often takes 25 to 30 years, sometimes more, before scientific discoveries reach the general population.
Nature may be elusive, but she does not intentionally lie; people do. Scientists try to be:
1. Not motivated to do science for personal gain, advancement or other rewards (yet they often are)
2. Objective and impartial when gathering data (yet they often aren’t)
3. Not dogmatically attached to an idea (yet they often are)
4. Not given to rhetorical exaggeration in promoting their ideas (yet they often are)
5. Not permitting their judgements to be swayed by authority (yet they often are)
When scientists do not follow the rules to the letter, they can slip into fraud, falsification, plagiarism and deception. True science is designed to detect deception of one’s self and of others through colleague collaboration, graduate-student mentoring, peer review, experimental corroboration and replication of results.
Scientists from all backgrounds form their beliefs for a variety of subjective, emotional and psychological reasons in their context of environments created by their family, friends, colleagues, culture and society at large. After forming their beliefs they then defend, justify and rationalize them with a host of intellectual reasons, cogent arguments and rational explanations. Beliefs come first, explanations for beliefs follow. Their perceptions about reality are dependent on the beliefs that they hold about it.
Actuality/reality exists independent of people’s minds, but their understanding of it depends on the beliefs they hold at any given time. Physicists argue that because no one model is adequate to explain reality, “one cannot be said to be more real than the other.” When these models are coupled to theories, they form entire worldviews. Once people form beliefs and make commitments to them, they often maintain and reinforce them through a number of powerful cognitive biases that distort their perceptions to fit their beliefs.
Among them are:
• Anchoring bias: relying too heavily on one reference anchor or piece of information when making decisions.
• Authority bias: valuing the opinions of an authority, especially in the evaluation of something one knows little about.
• Belief bias: evaluating the strength of an argument based on the believability of its conclusion.
• Confirmation bias: seeking and finding confirming evidence in support of already existing beliefs and ignoring or reinterpreting disconfirming evidence.
• In-group bias: placing more value on the beliefs of those perceived to be fellow members of one’s group and less on the beliefs of those from different groups (the tribal-brain effect).
• Metabias (bias blind spot): recognising the power of cognitive biases in other people while remaining blind to their influence on one’s own beliefs.
• Experimenter expectation bias: noticing, selecting and publishing data that agree with one’s expectations for the outcome of an experiment while ignoring, discarding or disbelieving data that do not.
What can be asserted without evidence can also be dismissed without evidence. Scientists prefer to tether evidence, when it is available, to logical analysis in support of a claim, or to proffer counterevidence that disputes a claim.
Although Kepler’s view of a living earth now seems whimsical, he reminds us that science is asymptotic: it never arrives at but only approaches the tantalizing goal of final knowledge.
One of the leading figures in natural philosophy and in the field of scientific methodology in the period of transition from the Renaissance to the early modern era, Sir Francis Bacon, in his book ‘Novum Organum’, once stated: “The human understanding when it has once adopted an opinion…draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects or despises…in order that by this great and pernicious predetermination the authority of its former conclusions may remain inviolate.”
No matter what the issue under discussion, both sides are equally convinced that the evidence overwhelmingly supports their position. This surety is called the confirmation bias, whereby both parties seek and find confirmatory evidence in support of already existing beliefs and ignore or reinterpret disconfirmatory evidence. This bias is unconscious and driven by emotions. When it occurs, the orbitofrontal cortex processes the emotions driving the bias, the anterior cingulate cortex processes moral-accountability judgments, and the ventral striatum processes the rewards and pleasures of being right (dopamine, oxytocin, etc.).
Scientists choose to think about what they are philosophically predisposed to. Science can be partitioned into hard sciences (physical), medium sciences (biological) and soft sciences (social), and into technical and popular sciences. Charles Darwin’s dictum holds that if observations are to be of any use they must be tested against some view: a thesis, model, hypothesis, theory or paradigm. The facts that we measure or perceive never just speak for themselves, but must be interpreted through the coloured lenses of ideas. Percepts need concepts, and vice versa.
In theory scientists proclaim themselves ready to follow the facts wherever they might lead. But in practice the social mechanisms of the scientific community, the institutional imperative, set limits beyond which its members in good standing may venture only at their peril. When eminent authorities announce their rejection of certain categories of evidence, others hesitate to mention similar evidence out of fear of ridicule. Thus “anomalous” evidence gradually slides from dispute into complete oblivion. In science there are certain theories that fill the textbooks and are what is taught, believed, promoted, marketed and funded. Certain theories are researched, funded and believed in because some noted authority endorsed them, and everybody else jumps on for the ride. That does not necessarily make them true; they are just the existing paradigm. The same thing occurred in Galileo’s time, Copernicus’s time, Giordano Bruno’s time, and Charles Darwin’s as well.
“I by no means expect to convince experienced naturalists whose minds are stocked with a multitude of facts all viewed, during a long course of years, from a point of view directly opposite to mine. But I look with confidence to the future, to young and rising naturalists, who will be able to view both sides of the question with impartiality.” – Charles Darwin
“A new scientific truth does not triumph by convincing its opponents and making them see the light but rather because its opponents eventually die, and a new generation grows up that is familiar with it.” – Max Planck, (1858 – 1947) German physicist
The commonest ways of dealing with anomalies are to explain them away, ignore them or declare them frauds. Reports of evidence conforming to the standard view generally receive greater credibility than reports of non-conforming evidence. Thus deeply held beliefs, rather than purely objective standards, may become the determining factor in the acceptance or rejection of reports about controversial evidence. Knowledge filters tend to keep evidence tinged with dispute outside official channels. A challenge to conventional wisdom in science, no matter what the field, is no trivial matter; it usually invites vigorous and sometimes scathing rebuke. “Anomalous” evidence that goes against an accepted theory tends to be subjected to intense critical scrutiny and is expected to meet very high standards of proof. It is fairly typical for scientists to demand higher levels of proof for anomalous finds than for evidence that fits within established ideas: the greater the degree of departure from the previous idea, the greater the degree of certainty required. There is, in short, a double standard, an impossibly strict standard for anomalous evidence and an exceedingly lenient one for acceptable evidence, and though common, it is unwise. Prevalence and acceptance do not make an idea wise or truthful.
No matter how many experiments you may carry out to verify a claim, you can never exclude the possibility that another repetition of the experiment might yield a different result. Hence, empirical evidence can never establish a mathematical fact. Man ultimately cannot understand, represent, or prove any kind of experience or phenomenon; scientific theories, models, or impressions can only approximate the true nature of things. Fortunately, the errors involved in these approximations have often been small enough to make this approach meaningful and useful. The value of any physical theory depends upon its practicability in real life; and the most that man can say about this or that theory is not whether it is true, but only whether it works, i.e., whether it does what it is supposed to do. It takes but a little honest searching to discover that many well-known “facts” are not so. Statements you initially believe to be true often turn out to be illusions. As the Psalmist stated, “all men are liars”.
Humans are fallible, and even those with high integrity and the best intentions don’t know everything. One can never rest assured that one has not fallen into error.
“Vulnerable, like all men, to the temptations of arrogance, of which intellectual pride is the worst, the scientist must nevertheless remain sincere and modest, if only because his studies constantly bring home to him that, compared with the gigantic aims of science, his own contribution, no matter how important, is only a drop in the ocean of truth.” - Prince Louis-Victor de Broglie, New Perspectives in Physics (1962), 215.
There is enough in the scientific literature to help us understand the limitations of human knowledge. No one can claim a monopoly on understanding, or on the relationship between man and the cosmos. One must apply the sharpest and most insightful science to help further develop our understanding. None but a fool is certain in all his beliefs and opinions, for he who believes he can live without folly is not as wise as he thinks.
“Being ignorant is not so much a shame, as being unwilling to learn.” - Benjamin Franklin, American polymath and Founding Father
"Condemnation without investigation is the height of ignorance." - attributed to Albert Einstein
“The greatest ignorance is to reject something you know nothing about.” - Derek Bok, American lawyer and former president of Harvard University
“Fear always springs from ignorance.” - Ralph Waldo Emerson, American lecturer, essayist, and poet
"When your colleagues aren't up on it, they're down on it." - Linus Pauling, American chemist, activist, author and educator. Listed as one of the most influential chemists in history and ranks among the most important scientists of the 20th century. Pauling was among the first scientists to work in the fields of quantum chemistry and molecular biology. Pauling is one of only four individuals throughout history to have won more than one Nobel Prize.
“Everyone calls barbarity what he is not accustomed to.” - Michel de Montaigne (1533 –1592), influential writer of the French Renaissance
“One only sees what one looks for, one only looks for what one knows.” - Johann Wolfgang von Goethe (1749-1832)
"Predominant opinions are generally the opinions of the generation that is vanishing." - Benjamin Disraeli (1804 - 1881)
"There are no facts, only interpretations" - Nietzsche (1844 - 1900)
"If you hide your ignorance, no one will hit you and you'll never learn" - Ray Bradbury, Fahrenheit 451
“The recipe for perpetual ignorance is: Be satisfied with your opinions and content with your knowledge.” - Elbert Hubbard
“He must be very ignorant, for he answers every question he is asked.” – Voltaire
Most people cannot live continually in the question; sooner or later they come to: “Well, this is the way it is, and don’t tell me no. That’s the way it is!”
Human ignorance has ever stood in fear of the unknown. Yet often the fear is not of the unknown but of losing the ‘known’: the unknown does not incite fear, but dependence on the known does. The mind accepts a pattern of existence and tenaciously clings to it. The mind is held together by an idea, as by a driven nail, and around that idea it lives and has its being. A mind in this state of fear is never free or pliable, for it is always anchored; it moves within the radius, narrow or wide, of its own biased locality. From its position it dare not wander, and when it does, it is lost in fear.
Few possess the ability, and still fewer the desire or inclination, to take a comprehensive view. By confining the view to a particular fraction and taking things out of context, it is easy to pick out apparent contradictions. To speak is to distort. For a view to be effective it must be complete, for to fractions there is no end.
We all do that to some degree and at some level, and that becomes our Peter Principle of learning. But as long as we can continue to live in the question, we can continue our learning process. Still, some things are fairly solid principles, and the more of those solid principles you accumulate, the more certainty you can build upon them.