The roots of creativity and genius

Dr Piotr Wozniak Summer 2001


This article is an attempt to formulate a prescription for genius and creativity. It was written in equal part to inspire the bright and to help those who consider themselves less intellectually fortunate. In short, it will reiterate the claim that training can work miracles on your mind. It will attempt to demonstrate that a majority of the population can reach today's standards of genius. It will list nearly forty preconditions and fallacies related to genius and creativity. It will also present a simplified metaphor of genius for the sake of demystifying the concept. Hopefully, it will also encourage parents to spare no effort in providing a rich and loving environment for their kids to grow in.

Important! For the most up-to-date version of this text, see: "The roots of creativity and genius."


High IQ in high demand

Intelligence, creativity and genius are generally regarded as highly valuable assets of the human mind. As a strong positive correlation exists between IQ and median earned income, most people would gladly boost their IQ, improve their creativity or accept being called a genius. Exceptions to this rule are few, and most revolve around the claim that intelligence may be an obstacle on the way to universal happiness. Here are a few examples of arguments against human intelligence listed by the detractors of genius:

  • high intelligence reveals existential truths and as such is deeply depressing
  • high intelligence prevents atavistic enjoyment of relationships
  • high intelligence is a source of envy and other bad feelings in others
  • high intelligence leads to inhuman behaviors and the most sophisticated forms of evil

In this article, I will tacitly ignore the above claims and assume that you would gladly become more intelligent, creative or innovative. I believe it can be shown that an increase in knowledge and creative power statistically leads to more "goodness" (see: Goodness of knowledge). I will tacitly assume throughout this text that achieving creative genius is a desirable goal.

Nature-vs-nurture dilemma resolved

Many books on psychology put a substantial emphasis on the nature-vs-nurture debate. Psychologists ask which factors are decisive in developing human behavioral characteristics: genetic background or education and upbringing? As far as intelligence is concerned, both genetics and upbringing determine the final outcome. Using reductio ad absurdum, we quickly notice that we have not yet recorded a case of success in science by an individual affected with Down syndrome, i.e. we can easily show that genetics can stifle intellectual development. At the same time, we notice that individuals deprived of education and human contact may be deprived of the ability to read, speak or conduct abstract reasoning, i.e. we can show that lack of education may be equally devastating to the human mind (see: Feral children).

The power of genetics over the functioning of the brain is illustrated by afflictions such as Down syndrome (mental retardation), dyslexia (reading problems), amusia (problems with recognizing sounds and music), unipolar and bipolar disorders (depression and manic-depressive disorder), and many more. On one hand, these afflictions illustrate that we may be handicapped at birth in the quest for genius. At the same time, the behavioral therapies used in all the listed cases show the tremendous power of training in developing compensation for disability.

If you look at the human brain from 100,000 years ago, you will not see much difference when compared with today's brains. Yet training and education, as well as the ability to communicate and work collectively, have lifted the human potential to unimaginable levels. See the gray insets for more insights on the potential and limitations of the human brain.

Throughout this article, gray inserts will provide additional illustrative material. All inserts are optional! These are stories that can either explain major points by example or simply serve as a source of additional inspiration. The order of inserts is arbitrary. Each insert is a separate, independent reading. Some may require more knowledge in a given field (e.g. biology, computing sciences, etc.). You do not need to read the gray inserts to understand the text. You can read all inserts now, later, or not at all.
Down syndrome

Of the inborn disorders that affect intellectual capacity, Down syndrome is the most prevalent and best studied. Down syndrome is a term used to encompass a number of genetic disorders of which trisomy 21 is the most representative (95% of cases). Trisomy 21 is the existence of a third copy of chromosome 21 in cells throughout the body of the affected person. Other Down syndrome disorders are based on the duplication of the same subset of genes (e.g. various translocations of chromosome 21). Depending on the actual etiology, the mental retardation may range from mild to severe. Trisomy 21 results in over-expression of genes located on chromosome 21. One of these is the superoxide dismutase gene. Some (but not all) studies have shown that the activity of the superoxide dismutase enzyme (SOD) is elevated in Down syndrome. SOD converts oxygen radicals to hydrogen peroxide and water. Oxygen radicals produced in cells can be damaging to cellular structures; hence the important role of SOD. However, the hypothesis says that once SOD activity increases disproportionately to the enzymes responsible for the removal of hydrogen peroxide (e.g. glutathione peroxidase), the cells will suffer peroxide damage. Some scientists believe that the treatment of Down syndrome neurons with free-radical scavengers can substantially prevent neuronal degeneration. Oxidative damage to neurons results in rapid brain aging similar to that of Alzheimer's disease. Another chromosome 21 gene that might predispose Down syndrome individuals to develop Alzheimer's pathology is the gene that encodes the precursor of the amyloid protein. Neurofibrillary tangles and amyloid plaques are commonly found in both Down syndrome and Alzheimer's individuals. Layer II of the entorhinal cortex and the subiculum, both critical for memory consolidation, are among the first affected by the damage. A gradual decrease in the number of nerve cells throughout the cortex follows. A few years ago, Johns Hopkins scientists created a genetically engineered mouse called Ts65Dn (segmental trisomy 16 mouse) as an excellent model for studying Down syndrome. The Ts65Dn mouse has genes on chromosome 16 that are very similar to the human chromosome 21 genes. With this animal model, the exact causes of Down syndrome neurological symptoms will soon be elucidated (for amazing genetic science in action see: Cytogenetics Resources Ts65Dn, including pictures of the "Down syndrome mouse"). Naturally, Ts65Dn research is also likely to greatly benefit Alzheimer's research.

Whatever the actual molecular reason, over-expression of chromosome 21 genes puts children with Down syndrome at an immediate disadvantage compared with other kids. Their IQ rarely goes beyond 60. The brain of children with Down syndrome is usually small and underweight. The cerebellum and brain stem are unusually small. So is the superior temporal gyrus. Their intellectual potential is further limited by a number of ailments such as recurring infectious diseases, heart problems, poor eyesight, etc. Genetics is a true roadblock here. People with Down syndrome have (until now) never become great scientists, novelists, politicians, etc.

At the same time, medical treatment, a conducive family environment, vocational training, etc. can increasingly produce marked improvement in the overall development of kids with Down syndrome. On one hand, Down syndrome shows that we cannot jump over genetic limitations; on the other, it shows that intense training can produce miracles whatever the starting point. In conclusion, the optimum path to excellence goes via mental training, independent of genetic limitations.

What is intelligence?

You will find many definitions of human intelligence, of which three account for most of the daily use of the word:

  1. problem solving ability - the power of the human mind to process information and solve problems. When you see a bright scientist with wide knowledge and numerous discoveries to his credit, you may say: This person is really intelligent! Look at his record! To use a computer metaphor, the scientist is endowed with the best hardware and software money can buy. He or she is optimally equipped for problem solving
  2. processing power - the raw nimbleness and agility of the human mind. When you see a smart student quickly learn new things, think logically, solve puzzles and show uncanny wit, you may say: This guy is really intelligent! See how fast his brain reacts! The student has a fast processor installed and his RAM has a lightning access time. He may still need years of study, though, to "build" good software. IQ tests attempt to measure this sort of intelligence in abstraction from knowledge. Improving processing power through training is difficult for much the same reason that programming cannot speed up the processor
  3. intelligence potential - the potential to develop intelligence in the senses listed above. When you see a young child who shows a number of talents and seems to be on a straight path to becoming a nimble student or a prolific scientist, you may say: This kid is really intelligent! The sky is the limit for him. The kid is equipped with a high-quality, extensible hardware infrastructure. He is on the best path to reaching the highest intelligence both in terms of processing power (Definition 2) and problem solving ability (Definition 1)

In this article, I will focus on ways towards developing the intelligence in the sense of problem solving ability (i.e. Definition 1). After all, the whole purpose of education is to improve our problem solving ability, i.e. the ability to optimally answer questions such as What to eat for dinner? What job to take? How to build a better mouse-trap? What should my position on abortion be? Which party should I vote for? etc.

High IQ is welcome, but it makes up only a fraction of intelligence (Definition 1), just as a fast processor accounts for only a fraction of what we expect of a good computer.

Later in the article, I will argue in support of the scientifically obvious statement: well-designed training can produce amazing results in enhancing intelligence (Definition 1). However, this statement is surprisingly little understood among the general population. It falls into the category of scientific facts that may find more skeptics than believers. Naturally, vox populi does not detract from the merits of evolution, genetic engineering, human cloning, the Big Bang theory, sociocybernetics, the neuropsychological interpretation of thought and consciousness, etc. However, to make the obvious more digestible, I will use the computer metaphor to illustrate the building blocks of intelligence and genius.

The computing brain 

The neural network of the brain can be seen as mental hardware. It includes inborn ROM memory as well as highly plastic RAM. The inborn wiring and structure of the brain may roughly be compared to ROM. If you stop eating for a day, a program stored in your ROM will make you experience hunger. The things we learn in life can be considered software stored in RAM.

If you doubt that a mental ROM exists, try the following experiment: look at the computer screen, keep your eyes open, stay conscious, and yet try not to perceive the picture of the screen. Seems impossible? Now try to superimpose the face of a loved person by using the power of your imagination. This is easy for most people. Here is your RAM in action, superimposing over a ROM-enforced perception. You can even imagine touching parts of the imaginary face. Yet the screen underneath does not seem ready to go away. The impulses from the retina hit the visual cortex, and you can do little about it.

Knowledge is encoded in the modifiable strength of connections between neurons, much as bits are stored as electrical charges in the cells of RAM. Our software can roughly be compared to an expert system. An expert system is a software application that can be used in problem solving, such as producing a medical diagnosis. An expert system is built of two components: factual knowledge and an inference engine. They roughly correspond to data and software in a computer, or to knowledge and reason in the human brain.

Expert systems

Expert systems are computer programs that take over the job of an expert in a highly specialized field such as medical diagnosis, production management, criminal profiling, etc. An expert system is fed with data and its job is to answer questions such as: "What is the list of the most likely diseases the patient is suffering from?", "Which supplies need to be ordered next?", or "Which offenders in the database match the profile of the described crime?". Expert systems provide an excellent metaphor for studying human problem solving and provide clues for enhancing creativity.

An expert system is usually built of a knowledge base (a collection of facts representing factual knowledge) and an inference engine (a collection of rules representing inferential knowledge). An expert system may store facts such as "E. coli bacteria is not resistant to norfloxacin" and "E. coli can cause urinary tract infections". It can also store updateable facts such as: "Pain during urination is associated in X% of cases with urinary tract infection" (where X is a number regularly updated as the expert system improves its knowledge), or "E. coli is a cause of Y% of urinary tract infections". The expert system can also store rules such as "If (A is an antibiotic) and (B is a bacterium) and (patient is infected with B) then (suggest administration of A)". Some rules can be fuzzy, i.e. applicable with a degree of probability or producing a given outcome with a given probability, for example, "If (patient infected with E. coli) then (probability of success with norfloxacin is P%)" or "If (probability of E. coli infection is greater than P%) then (use norfloxacin)". By analyzing the facts stored in the database and the facts fed into the expert system, the expert system can use its inference rules to answer questions on the optimum antibiotic therapy. It can also generate the probability profile of the successful application of individual antibiotics. Although the difference between static facts and if-then rules in an expert system is very clear-cut, there is no sharp fact-rule distinction in the human brain, which uses neural representations for storing knowledge. However, the difference between facts and rules is very valuable in explaining the difference between smart and dumb learning.
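
To make the fact/rule split concrete, here is a minimal sketch in Python (my own illustration, not code from any real diagnostic system; the organisms, drugs and relations are placeholders taken from the example above):

    # Facts are stored as simple (relation, subject, object) triples.
    facts = {
        ("causes", "E. coli", "urinary tract infection"),
        ("susceptible_to", "E. coli", "norfloxacin"),
        ("infected_with", "patient", "E. coli"),
    }

    def suggest_antibiotics(facts):
        # One inferential rule: if the patient is infected with a bacterium that
        # is susceptible to some antibiotic, suggest administering that antibiotic.
        suggestions = set()
        for relation, subject, bacterium in facts:
            if relation != "infected_with":
                continue
            for relation2, bacterium2, drug in facts:
                if relation2 == "susceptible_to" and bacterium2 == bacterium:
                    suggestions.add(drug)
        return suggestions

    print(suggest_antibiotics(facts))   # {'norfloxacin'}

Adding a new fact (say, another susceptible bacterium) changes the answer without touching the rule, which is exactly the separation of factual and inferential knowledge described above.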

Expert systems are always based on storing large amounts of information. They are built by peeking at human experts in action and drawing conclusions about their reasoning. A knowledge engineer, or the expert himself, needs to formulate the rules that are used in arriving at a solution to a problem. Consequently, there is a very direct parallel between an expert system and a human expert in action.

Much of expert thinking is simpler than what happens in a child's brain in the course of ordinary play! The reason for this is that we are born with powerful computing machinery for visual processing, for association, for analyzing motion, for spatial orientation, for phonological analysis, for language parsing, etc. A child recognizing a simple ba-ba language may be harder to imitate in a computer than an expert botanist recognizing one of a thousand species of plants. As Marvin Minsky put it: It can be harder to be a novice than to be an expert! A program written in 1961 by James Slagle could solve calculus problems that are normally given to college students. This program was able to score an A on an MIT exam. It needed only about a hundred algebraic rules to solve all the required calculus problems! Calculus permeates engineering and forms part of the foundation of the industrial world. It is also a classroom nightmare for many students. Yet in essence it is very simple and compact. The simplicity of calculus powerfully illustrates what our brains were not born to do. It also shows what new powers our brains can acquire with relatively little effort if the new knowledge is selected in the right way. Algebra can serve as a model of the abstractness of rules. After all, it is based on symbols that can stand for anything: a plane, a bird, or any other object. As stated throughout this article, the abstractness of rules stored in the human brain lies at the foundation of creative thinking.
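
To give a taste of how far a handful of algebraic rules can go, here is a toy sketch (not Slagle's program, which is not reproduced here; just a few textbook differentiation rules encoded by hand):

    # A toy echo of the "hundred algebraic rules" idea: four rewrite rules
    # already differentiate simple expressions built from +, * and x.
    def d(expr, var):
        if isinstance(expr, (int, float)):     # rule: the derivative of a constant is 0
            return 0
        if expr == var:                        # rule: the derivative of x is 1
            return 1
        op, a, b = expr                        # expressions are tuples: (operator, left, right)
        if op == "+":                          # rule: (a + b)' = a' + b'
            return ("+", d(a, var), d(b, var))
        if op == "*":                          # rule: (a * b)' = a'*b + a*b'
            return ("+", ("*", d(a, var), b), ("*", a, d(b, var)))
        raise ValueError("no rule for operator " + op)

    # derivative of x*x + 3 with respect to x:
    print(d(("+", ("*", "x", "x"), 3), "x"))
    # ('+', ('+', ('*', 1, 'x'), ('*', 'x', 1)), 0)  -- i.e. 2x before simplification

Each rule is abstract: it applies to any subexpression whatsoever, which is precisely what makes such a small rule set so powerful.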

What an expert needs to know can indeed be simple. However, it is often not simple to discover or explicitly formulate it in the first place. Many students may have problems with calculus because of the simple fact that some rules of calculus are highly heuristic and cannot be found in math books. Good (or rather hard working) students acquire those rules implicitly by solving a large number of calculus tasks. Poor students could easily catch up if their books or teachers explicitly formulated those hazy rules, e.g. if you see those two symbols on the left, go for rule X rather than wasting time on the remaining five possibilities that can cost you an hour each. Human experts seem more intuitive than computers. But this only comes from the fact that they apply rules that they themselves have a hard time formulating. There is no qualitative difference between a human and a computer expert in that respect. Intuition is not a magic power. Intuition is an inability to explicitly express knowledge that is already wired into the neural network of the brain.

As with the haziness of the rules, similar uncertainty may concern the actual application of inference rules: the problem solving strategy. A creative individual will often not be able to clearly say how and why he or she arrived at the solution. When later writing a scientific paper on the solution to the problem, the creator will often need to look for a clear path towards the solution even though he has definitely arrived at the goal before. Expert systems may use various strategies such as data-driven derivation called forward chaining (going from the facts to a conclusion, e.g. deriving symptoms from a disease database), goal-driven derivation called backward chaining (going back from the goal to test a hypothesis, e.g. testing for a disease given the symptoms), search (applying simple rules repetitively over a large number of combinations that could yield a solution), and various combinations of these strategies. As for the problem solving strategy, the human brain is even harder to simulate. Usually the search space for major problems is huge and no simple strategy can be used (otherwise the problem would not be a problem in the first place). Then the lucky stroke of genius, the brilliant association, insight, breakthrough, etc. is nothing other than applying the right rule to the right data at the right time. The "right time" here refers to the different states of the brain at different moments in time. The brain works associatively, and two or more neuronal assemblies must be active at the same time for an association to be formed. Archimedes could have thought of volume when entering his bathtub before he yelled: Eureka! Newton's brain must have been sensitized to gravity when he was struck by a falling fruit. James Watt must have had his engine-power neurons potentiated when looking at a rattling kettle. Millions of people see kettles daily, yet few think of a steam engine as a result. The genius breakthrough comes from an association of ideas in the brain. In terms of an expert system, the right rule must be applied to the right set of facts. The best term to describe human problem solving is heuristic search. We apply the available rules using best-search rules, which may themselves be subject to another layer of meta-rules implicitly interwoven into the intricacies of the neural circuitry of the brain.
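
Forward chaining, the simplest of the strategies mentioned above, fits in a few lines. The sketch below is my own illustration (the facts and rules are invented for the example, not taken from any real system):

    # Data-driven derivation (forward chaining): keep firing if-then rules
    # against the known facts until no rule can add anything new.
    facts = {"fever", "painful urination"}
    rules = [
        ({"fever", "painful urination"}, "suspect urinary tract infection"),
        ({"suspect urinary tract infection"}, "order urine culture"),
    ]

    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)      # the rule fires and contributes a new fact
                changed = True

    print(facts)
    # {'fever', 'painful urination', 'suspect urinary tract infection', 'order urine culture'}

Backward chaining would run the same rules in the opposite direction, starting from a hypothesis and checking whether the facts needed to support it are present.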

In conclusion, knowledge is the key to problem solving. In particular, highly abstract inferential knowledge is central to a creative search for solutions. Severing the link between knowledge and genius can be harmful. The confusion usually comes from the fact that memorizing worthless data is not differentiated from memorizing useful rules. Many people fail to recognize the associative power of human memory and mistakenly conclude that relying on external sources of information may suffice in their particular field of activity. As a result, memorizing is perceived as a dumb act. Some articles at supermemo.com illustrate this problem:

  • In today's world, information is so abundant and can be accessed so readily that it is hardly necessary to lumber one's memory with it (from SuperMemo is Useless)
  • It should be enough to create in our memory some sort of an index to the global wealth of information (from No force in the world ...)

The most important thing we learn from expert systems is that extensive knowledge helps solve problems. We also learn that the way we represent knowledge may determine the successful outcome of problem solving. Conclusions: we need to keep on learning, and we need to pay special attention to how we represent things in our memory to ensure we understand the implications of the things we learn.

For a quick course on basic concepts of Expert Systems and Artificial Intelligence see: ABC of AI

Factual knowledge is made of facts. A fact may have the form "Jimmy Carter was elected the US president in 1976" or "Abraham Lincoln was assassinated in 1865". The inference engine is based on inferential knowledge. Inferential knowledge is made of a set of rules. Unlike static facts, rules can be applied to facts to produce more facts, assertions, statements, theorems, formulas, etc. For example, a rule may say "Since the 22nd Amendment, a US president cannot serve for more than ten years" (i.e. two terms plus two years of possible succession). From the fact "Jimmy Carter was the president" and the rule "A president cannot serve for more than 10 years" we can derive new knowledge: "President Carter served no more than 10 years". In mathematics, a fact may say that x=3 and a rule may say that x+x=2*x. By applying the rule to the fact we can conclude that 3+3=2*3. Rules can then be used to derive new facts and new rules. If we know that x+x>x (for x>0) then we can also derive a new rule: 2*x>x. In the course of problem solving, our brain will often develop new rules and store them in memory. These new rules will form a highly valuable component of your knowledge and will decide on your creative powers. René Descartes said: "Each problem that I solved became a rule which served afterwards to solve other problems"
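
The algebra example above can be run literally, which makes the fact/rule distinction tangible (a toy illustration of my own, nothing more):

    # A fact (x = 3) plus an abstract rule (x + x = 2*x, for any x) yields a new,
    # checkable fact (3 + 3 = 2*3).
    x = 3                                      # fact
    double_rule = lambda v: (v + v, 2 * v)     # rule, stated for any value v
    lhs, rhs = double_rule(x)
    print(lhs, "=", rhs, "->", lhs == rhs)     # 6 = 6 -> True

    # Because the rule is abstract, it keeps producing new facts for new inputs:
    print(double_rule(7))                      # (14, 14)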

Apart from the declarative facts and rules which we can learn from a textbook, our nervous system also includes other forms of knowledge. For our analysis we will mostly need to distinguish two: inborn knowledge and procedural knowledge. Inborn knowledge can be compared to rules stored in our ROM. For example, when feeling a burning pain in the fingers, retract the arm. Procedural knowledge is knowledge that is acquired by trial and error via punishment-reward stimuli. For example, when we ride a bicycle, each time we lose balance, information is sent to the motor system not to repeat the recent moves, which should be considered an error. At the same time, the elation of a smooth ride reinforces the circuits responsible for the sequential stimulation of the muscles involved in cycling.

Apart from an inference engine, our brain is equipped with a sort of "interference engine". Our brain was programmed for survival. It is supposed to make you search for sources of water when you are thirsty or react with interest to an attractive representative of the opposite sex. We are driven by instincts and emotions. Emotions helped humans survive thousands of years of evolution. However, emotions also interfere with intellectual effort. Isaac Newton might have been the brightest scientific mind of the 17th century, yet the last 25 years of his life were marred by a bitter battle with Leibniz over the claim to having invented calculus. Alan Turing, the father of the famous Turing Test, committed suicide by cyanide poisoning under the burden of the intolerance brought forth by his homosexuality. His mind might have been affected by a hormonal therapy that was supposed to "cure" him of homosexuality. Even the greatest mind may be incapacitated by strong interference from hormones or lower-level brain circuits. Emotions can literally kill genius.

Here is the summary of the computer metaphor of the human mind. Terminology defined here will be used throughout the rest of this article:

Hardware - the brain
  • Infrastructure - brain components: cortex, thalamus, cerebellum, basal ganglia, etc.
  • ROM - inborn knowledge (e.g. acrophobia)

Software - knowledge
  • Declarative knowledge - textbook knowledge
      • Facts - e.g. Mary is a pilot
      • Rules - e.g. All snakes are reptiles, the formula for solving quadratic equations, etc.
  • Procedural knowledge - skills (e.g. playing the piano, touch typing, swimming, etc.)

Interference - emotions, instincts, reflexes (e.g. hunger, thirst, orgasm, etc.)
  • positive emotions (e.g. passion, laughter, elation, zeal, energy, etc.)
  • negative emotions, instincts, reflexes (e.g. anger, envy, hate, malice, etc.)

In the above light, we can simplify genius to the following:

Genius is based on good hardware, excellent knowledge, strong motivation, and minimum negative interference.

In other words:

  1. it is helpful to be blessed with a healthy brain (hardware)
  2. this brain must be subject to lifelong training in acquiring useful knowledge (software), esp. problem solving knowledge
  3. the knowledgeable brain must be driven by strong motivational factors (drive), including positive emotions (passion, enthusiasm, love, etc.)
  4. the well-driven, knowledgeable brain must avoid negative interference from inborn weaknesses and destructive emotions (e.g. few things cloud judgment as badly as anger, and few things are as distracting as love)

What is special about a genius brain?

Using the "simplified brain model" above, I will try to look for factors that determine a genius brain and how these factors could be influenced.

A genius brain in action will tackle a problem, quickly find an appropriate set of rules, and derive a solution. Actually, the speed of processing the rules is not as critical as the skill in choosing the appropriate rules. For a genius breakthrough, speed is usually quite unimportant. It took Darwin five years of collecting data during his Beagle trip to come up with a vision of the evolutionary process. Yet it took him another 20 years of collecting all the necessary material and opinions before he mustered the courage to publish On the Origin of Species. The book has changed our view of the human species forever. It is hard to pinpoint a single breakthrough or a stroke of genius. Darwin's reasoning wasn't blindingly fast either. Yet Darwin's impact on the ways of mankind was monumental.

Biological basis of genius

Humans do differ in their brain power. Some get a biological head start, others are handicapped from early childhood. It cannot be stressed enough, though, that the optimum path towards maximum achievement is always through training. The starting point is not relevant for choosing a hard-work learning trajectory. It is also important to know that, in the majority of cases, mental limitations can be overcome. Some major disabilities, such as Down syndrome or brain injury, can pose a formidable challenge. However, practice shows that a huge proportion of the population see a problem where it does not exist. Many people write to me about their memory problems just to discover (e.g. with SuperMemo analytical tools) that qualitatively their memory does not differ from that of their peers. What usually prevents people from reaching intellectual heights is personality and the environment (school, family, etc.). Many do not live up to their potential simply because of insufficient motivation or belief in their own powers. Others fail due to parental inattention. Those factors are statistically by far more important than inborn limitations.

Scientists have studied Einstein's brain to look for clues to his genius. On cursory examination, they could hardly find any. Later it transpired that some areas of his brain were indeed better developed and nourished by a rich fabric of glial cells, i.e. brain cells that are, among other things, responsible for providing the right environment for neurons to work in. Yet it is difficult to say whether all these differences were inborn or were rather a result of his training in abstract thinking.

Anatomical studies show that various areas of the human brain may substantially differ in size between individuals. Yet it is not easy to find correlations between these differences and mental powers. In people with a normal range of IQ, the volume of the cerebral cortex may vary twofold from one person to the next. So may the differences in metabolic rates in the same organ. Similar differences have been found between such critical brain structures as the hippocampus, entorhinal cortex, and the amygdala. Connections between the hemispheres can dramatically differ in volume (e.g. a seven-fold difference for the anterior commissure). The left inferior-parietal lobule (located just above the level of the ears in the parietal cortex) is larger in men, and was also found to be larger in Einstein's brain as well as in the brains of mathematicians and physicists. On the other hand, the two language areas of the cortex, Broca's and Wernicke's areas, are larger in women, which may explain why women might be superior in language processing and verbal tasks. Bigger men have bigger brains but are not smarter.

The racially sensitive subject of lower SAT test scores among blacks and Hispanics in the US has been a matter of debate for a number of years. The differences could not be explained by the material status of families or the neighborhood factor. Stanford psychology professor Claude Steele has conducted revealing experiments in which black students did equally well on the test as long as they were not told they were being scored.

Although we can point to differences based on sex or ethnicity, the ultimate difference in creative potential depends far more on upbringing, education and the student's personality. As explained in Genius in Chess, despite chess being a "male game", the female chess player Judit Polgar developed skills that are superior to those of 99.99997% of the male population.

When we tried to see if student IQ makes it easier to do well in learning and in exams, we found that some personality factors matter more. A small group of students learned with SuperMemo, and the main success factor was the perfectionism trait, not the actual IQ (Wozniak 1994, Gorzelanczyk et al. 1998). Most optimistically, SuperMemo and memory research show that our memory works in the same way at the very basic molecular and synaptic level. Our forgetting is described by the same forgetting curve, whose steepness is mostly determined by knowledge representation. As the analysis of success stories with SuperMemo shows, the main learning differences between individuals can be found in (1) personality (perseverance, delayed gratification, optimism, etc.) and (2) knowledge representation skills. A week-long course in mnemonic techniques immediately illustrates that knowledge representation skills can be learned very fast indeed. Those skills also develop in proportion to the amount of learning, as demonstrated by differences between primary, secondary, undergraduate and graduate levels. All users of SuperMemo, unless primed beforehand, start with building clumsy collections of learning material that is quite difficult to retain in memory. Within months, most users develop reasonable strategies on how knowledge should be represented to minimize the effort of learning (see: 20 rules of formulating knowledge in learning).
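
For readers who prefer a formula, a common simplified formulation of the forgetting curve mentioned above (an assumption of this illustration, not a quote from this article) is exponential decay, with knowledge representation governing the stability term:

    import math

    # Retrievability R of a memory after t days, for a trace of stability S days:
    # R(t) = exp(-t / S). Better-formulated knowledge corresponds to a higher S,
    # i.e. a flatter forgetting curve.
    def retrievability(t_days, stability_days):
        return math.exp(-t_days / stability_days)

    for s in (2.0, 10.0):     # clumsy vs. well-formulated material (illustrative values)
        print("S =", s, "days:", [round(retrievability(t, s), 2) for t in (1, 3, 7, 14)])
    # S = 2.0 days: [0.61, 0.22, 0.03, 0.0]
    # S = 10.0 days: [0.9, 0.74, 0.5, 0.25]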

To produce breakthrough ideas, the most valuable rules are those that are highly abstract (i.e. detached from a particular subject matter). They should be applicable to a wide range of problems. This is why various branches of mathematics should be taught to students of all professions. Logic, probability calculus, and statistics are highly abstract and highly applicable. The same formula of logic may be the basis of dozens of other highly abstract rules. Surprisingly, many professionals find it hard to differentiate between conjunctions such as AND, OR, AND/OR, or XOR. Let alone the difference between deduction and induction, which forms the basis of scientific investigation, as well as the basis of logical (read "correct") thinking about such simple choices in life as selecting the appropriate brand of cereal for breakfast.
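
For the record, the distinction that trips so many people up can be spelled out in a few lines (a quick illustration, not tied to any particular example in this article):

    # Truth table: inclusive OR is true when either input holds,
    # XOR only when exactly one of them does.
    print("a      b      AND    OR     XOR")
    for a in (False, True):
        for b in (False, True):
            row = (a, b, a and b, a or b, a != b)
            print("  ".join(str(v).ljust(5) for v in row))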

Rule abstractness: If you learn the rule "Wheat contains 340 kcal per 100 grams", it is only applicable to wheat. If you narrow the term wheat down to a single concept (i.e. not the grain of all species of plant called "wheat"), this rule can be interpreted as a fact. However, the rule "Most cereals contain 330-360 kcal per 100 grams" is probabilistically applicable to both wheat and maize. The latter rule is more abstract and statistically more valuable in problem solving (i.e. you can use the cereal rule in more circumstances than the wheat rule).

The applicability of rules does not depend only on their express meaning. The actual representation of the rule in the human brain is paramount! The same rule in the mind of a genius can find a dozen more applications than can be born of the effort of a plain crammer. The skill of learning rules the right way is a critical component of genius. The genetic component may play a minor role here. Many individuals find it difficult to represent knowledge in their minds in a way that can lead to a genius breakthrough. Understanding the right forms of training for abstract representation of rules in the human mind may bring untold benefits to mankind in years to come.

Let us use our computer metaphor to illustrate the problem:

Take this rule: "if HardDiskSpace<5MB then raise(HardwareAlert('Running out of hard disk space'))"

If you type this rule into MS Word and save it in a doc file, the rule will be as useless as any rule crammed into your memory without understanding. Yet the same rule encoded in a hardware monitor DLL can be a blessing to the security of the data stored on your computer. The way we represent rules in our brain determines their applicability.
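
The point can be made concrete: here is the same rule placed where it can actually act, as a minimal Python sketch (the 5 MB threshold comes from the example above; the function name and error message are my own):

    import shutil

    THRESHOLD_BYTES = 5 * 1024 * 1024     # the 5 MB limit from the rule above

    def check_disk_space(path="/"):
        # The rule, represented in an executable context rather than in a document.
        free = shutil.disk_usage(path).free
        if free < THRESHOLD_BYTES:
            raise RuntimeError("Running out of hard disk space")
        return free

    print(check_disk_space(), "bytes free")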

For the same reason, I started this article with a computer metaphor. This way I tried to represent the foundation knowledge of this text in a form that is easily understood by everyone. The rules I am expressing can hopefully be easier to digest and store in your mind with a more tangible long-term benefit. With the appropriate representation, no scientific theory is complex. All great theories were born in the human mind. Einstein, Turing, Gödel or Heisenberg did not have to be inherently brighter than you. However, they were able to arrange the pieces of the puzzle in their minds in such a way that they could easily see the light. There is nothing inherently complex in the theory of relativity, the incompleteness theorem or the uncertainty principle. Some theories may be more voluminous than others. Some may be voluminous enough, even in their digestible simple representation, to discourage many from digging in. An important conclusion: No product of human thought is inherently complex or incomprehensible. The difference between easy subjects and difficult ones can always be explained by representation and volume.

Abstractness calls for a particularly well-chosen representation. The fact that dinosaurs became extinct 65 million years ago may require no special approach. Abstract mathematics, on the other hand, may be introduced to a student in a number of ways that differ in their effectiveness by many orders of magnitude. There are many more students who fear algebra than those who tremble before a literature class. The symbols of algebra do not have specialized brain circuits to process and simplify them. Student problems with algebra can usually be traced back to insufficient training in math at primary and secondary levels. Consequently, a motivational factor builds up another inhibitory layer. The gratification from reading an excellent novel is instant. The benefits of math require a good command of the raw basics, starting with the multiplication table and sums. We have not been able to find many shortcuts from basic-level math towards solving differential equations. However, only a few years ago, you could hear from many: Computers? That's not for me. I have never been good at technical subjects. Today, the same people surf the net for hours. Seniors are flocking to the net in droves. We have succeeded in simplifying the way people see and use computers. We have changed the way computing is represented in the public mind.

Blue inserts in this article are dedicated solely to users of SuperMemo. If you are not a user, you can skip these
Popularity of SuperMemo vs. knowledge representation
SuperMemo is still far from being widely accepted. It still awaits the moment when it is packaged in a way that is digestible for the average citizen. Its problem is its representation in the public mind. It is surprisingly difficult to explain the benefits of SuperMemo. Try to convince your classmates or colleagues at work to use SuperMemo to experience this difficulty first hand. It is even more difficult to explain the program itself and how to use it. And it is by far the hardest to illustrate the destruction wrought on learned knowledge by giving up spaced repetition. Without finding a formula for simplicity and popular appeal, SuperMemo will long remain a tool for only those with the highest intellectual aspirations

In acquiring knowledge, never say "this article or book is too hard for me". When listing books he read in his youth, Charles Babbage, the inventor of the first mechanical computer, wrote "Amongst these were Humphry Ditton's 'Fluxions', of which I could make nothing". We know that Babbage was the last person you would suspect of having problems with mathematical texts. If you see a text of which "you could make nothing", go to the first sentence and analyze it. More often than not, it is simply that the author uses language or structure that is either inappropriate or does not match your present knowledge of the field. If you encounter problems, and there is no explanation, no introduction, or if specialist terminology strays outside the field without a suitable glossary, you may safely excuse your comprehension problems. Do not attempt to dig into an advanced chemistry article without the basic chemistry background. Every fourth word may fall outside your vocabulary range. It may take months or years to build the necessary background! Least of all, blame your own perception. Just keep on working harder and one day you will see the light.

If you find difficult material, do not waste time on depression or despair. Abstractness is inherently harder to digest than plain facts. Methodically analyze the reasons for which you cannot comprehend the given material. Either the material is badly presented, or you need new knowledge that will resolve your problem. Be patient and remember: Everything is difficult before it becomes easy!

If you are a user of SuperMemo 2004, see Dealing with complexity in SuperMemo 2004 later in this article

Working out genius

High achievements in all fields require hours of training. This refers to music, chess, the sciences, sports and what not. I wholeheartedly subscribe to the famous statement by Edison: "Genius is 1% inspiration and 99% perspiration". Training can have a miraculous impact on the human brain. It does not matter much how well you were endowed by genetics. You have no better choice than to commit yourself to a lifelong course of learning. If you are in the minority that shows identifiable genetic limitations, you may need to hone your routine to your particular needs; however, if you have already arrived at this point in the article, health permitting, you are highly likely to be equipped with all the basic intellectual components for building genius.

Genius in chess 

It is a pity that not all those genius chess brains have been sufficiently employed in the betterment of this planet. However, they all provide highly valuable material for studying human brain power. There are a couple of reasons why chess is so valuable to study. Chess rules are clear-cut. The competitive achievement is measurable. Individual games are available for study move-by-move on the Internet. Last but not least, chess is often associated with an aura of genius, and world champions generate lots of excitement that results in numerous books and studies on scientific and popular-scientific platforms. Under those conditions, we can study factors that help some people reach processing power that is hard to match with present computing technology.

Chess is a great metaphor for creativity. Chessboard positions roughly correspond to facts and applicable moves correspond to inferential rules (see: facts and rules). The more abstract the rules, the more positions they can resolve. The more abstract the rules you acquire, the less sheer computation your brain needs to do in the game of chess. Consequently, the better your chess score. The move rules will often be based on pattern recognition rules that can filter a complex position into simply identifiable patterns. The better your arsenal of pattern recognition rules, the more applicable your move rules become. The rules are the key to chess genius.

British chess player and author Jonathan Levitt proposed a formula linking chess scores with IQ (the Levitt Equation: Elo ~ (10 x IQ) + 1000). Although the formula does not represent exact science, it is a good illustration of the difference between the two concepts of intelligence and genius: one of true mental processing power and the other of the potential to develop it. Levitt's formula determines the approximate maximum chess score for a given IQ assuming years of extensive training. The purpose of IQ is to distil innate mental skills from expertise. Although this is never entirely possible, people with little expertise in any selected field may still show high IQ, which is indicative of high intellectual potential. In chess, adding new recognition and move rules to memory will plateau with time, and the quality of reshuffling them in conditions of maximum concentration will determine the champion. However, there is no substitute for hard work on the way to success in chess. No amount of lateral thinking or transcendental meditation will help. The chess player's brain needs to be equipped with an arsenal of thousands of position patterns. The chess scores reflect the true processing power of a player's brain in the narrow specialty of chess. In real life, high IQ is welcome; however, what will determine a person's success in a given field is the actual ability to solve problems in that field. This ability is always related to knowledge, skills and expertise. One of the greatest geniuses of the past century, Herbert Simon (Nobel prize in economics, 1978), devoted his whole life to studying expertise and proposed another (very rough) formula: it takes 10 years for an individual to reach the top-rank level in any field of expertise (be it chess, medical diagnosis or botany). This number reflects the fact that we tend to measure human accomplishments relative to the accomplishments of other individuals in the same class. With classical learning methods, acquired knowledge tends to plateau after a period of time in which the forgetting rate becomes comparable with the acquisition rate. Today, this plateau can be overcome with spaced repetition (see: SuperMemo), which linearizes the acquisition of knowledge over a lifetime. Simon's 10-year period reflects the approximate acquisition plateau in non-linear learning. If an individual works hard enough, he will sail close to his maximum knowledge acquisition potential in more or less ten years. His knowledge and skills, as compared with his peers, will then be most noticeable. Due to the law of diminishing returns, the increase in expertise will not be as easy to notice later on. Levitt's formula links the intellectual potential expressed by IQ with the maximum level of expertise in the field expressed by chess score. Herbert Simon's "formula" fits well with chess. The brightest stars of chess, Bobby Fischer and Judit Polgar, both got their grandmaster titles in just under ten years. Some estimates put the number of position patterns recognized by a grandmaster at 50,000. This is more or less as much knowledge as you accumulate with several-hours-per-day extensive learning over a period of ten years in any field (or in a much shorter period with SuperMemo).
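
Expressed as code, the Levitt Equation is a one-liner, which makes it easy to see what it predicts for a few IQ values (a rough rule of thumb, as stressed above, not exact science):

    # Levitt's approximate ceiling on chess rating after years of extensive training.
    def levitt_ceiling(iq):
        return 10 * iq + 1000

    for iq in (100, 130, 160):
        print("IQ", iq, "-> about", levitt_ceiling(iq), "Elo")
    # IQ 100 -> about 2000 Elo
    # IQ 130 -> about 2300 Elo
    # IQ 160 -> about 2600 Elo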

An important component of success in chess is the way chess knowledge is represented in the brain. Optimum representation cannot be described verbally, but it is acquired with time via the inherent properties of the neural networks employed in processing chessboard configurations. Herbert Simon noticed that grandmasters show a huge advantage over amateur players in their ability to memorize or recognize meaningful positions in chess. At the same time, their advantage all but evaporates when it comes to memorizing meaningless positions (i.e. those that are not likely to result from a real game). Grandmasters see the chessboard in their own special way. They use their own representation. Their own language. Their own pattern recognition. This special representation is the key to getting away from the complexity of chess and reducing the game to the (relatively) simple task of applying thousands of memorized rules of the winning strategy. As with memorizing the result of 199 x 199, good rules make it possible to replace lengthy computation with quick retrieval of a solution or the application of a succession of just a few well-fitting rules. This is also why it is so difficult to write computer programs that could match grandmaster skills. Those non-verbal skills are difficult to convert to unfailing algorithms.

In essence, chess training is based on memorizing positions and moves (see: smart vs. dumb learning if the word memorizing raises objections here). A chess player's brain subconsciously develops a specific chess language in which it expresses the events on the chessboard. This language is a form of knowledge representation which, as is always the case in learning, plays a central role in success. Once this internal language develops and becomes the player's second nature, all games analyzed and played leave a trace of memorized chess knowledge in the player's memory. Over the years, the player's memory acts like an efficient pattern recognition computer. One look at the chessboard results in a quick retrieval of relevant patterns from memory and a quick analysis of the not-so-many applicable move rules and their outcomes. Unlike Deep Blue, which beat Kasparov by juggling 200 million positions per second in its digital memory, a chess player, with a high error rate, quickly guesses the best moves in a process that is hard to replicate in a computer.

Of numerous interlinking factors, the personality of a chess player may be one of the most important for his or her ultimate success. The baseline IQ may determine the realistic ceiling of achievement. However, it is hard work and training that make a great chess player. For this, you need a truly neurotic personality with an extreme obsession for the game. Scrupulous analysis of the game and a highly competitive spirit are crucial ingredients. It is the personality that turns a budding player into a computer-like achieving machine where chess permeates all aspects of an individual's life. Training, tournaments, game analysis and the highest accomplishment are the central points of a chess champion's mind throughout his day. With training, further qualities develop: the art of concentration, and chess expertise. On-demand concentration plays a greater role in chess than in other areas of creative activity. A chess player must reach top concentration at the right moment and sustain a high level of game processing power until the next move is chosen. On the other hand, success in the sciences, engineering, business, etc. will rely on the quality of the creative output independent of the speed at which it is reached. More like in correspondence chess. If you can produce a better result in 3 hours of thinking than another genius in 3 minutes of thinking, you can still arrive at a better business plan, better scientific theory, better algorithm, better design, better marketing idea, etc. Your creation over many years will accumulate those incremental points. In creativity, quality counts more than speed.

In chess, it is easy to notice that, statistically, it is better to be Jewish, middle-class, and male for top achievement. The Jewish factor has more to do with home environment and family values than with genetics. The male factor may have more to do with the genes; however, Judit Polgar could still beat 99.99997% of the males on this planet (i.e. just about all of them except a few). Additionally, women's incentives to enter the chess world are miserable (judging by the lesser glamour and insulting prize offers), and the disincentives to leave it are by far greater (see the issue of marriage and children in the Polgar sisters insert). Probably sex and race, like baseline IQ, can influence the hard-to-measure ceiling of achievement; however, in practical terms they appear inconsequential. It is the quality and the amount of training that will determine the outcome.

Ultimately the short formula for genius in chess is: (1) the right competitive personality that makes one work hard and able to reach the peaks of concentration at critical times, and (2) the resulting hard work that leads to mastering thousands of highly abstract chessboard rules

Similar preconditions hold for creativity in general: it all begins with the rage to master and years of training towards problem-solving expertise in a given field.

A well-planned training regimen has been shown to lead to remarkable progress in people suffering from various inborn limitations to the functioning of the brain. The brain's amazing ability to compensate for the limited functionality of its components is well illustrated by the excellent prognosis for kids after hemispherectomy (i.e. surgery in which half of the brain is removed). If hemispherectomy is conducted early enough, the kid is likely to return to normal life. Due to the brain's symmetry, damage to the same area on both sides of the brain may be harder to compensate for, but still not impossible. Dyslexia is a genetically based condition in which reading may pose a particular challenge to otherwise bright people. Dyslectics show reduced activity in the language center on the left side of their brain. In dyslexia, training can be very frustrating, but the right hemisphere can compensate for the limitations of the left side. To experience the hardship of dyslectic training, pick up a pen in your non-dominant hand and write, now, the letter that has waited years to be written. Don't just plod through it; try to match the speed of your dominant hand. See the pain? Incidentally, Edison was a dyslectic too. And so was Einstein.

Dyslexia

People who experience reading difficulty without being otherwise intellectually disabled are said to suffer from dyslexia. Studying dyslexia is very valuable for understanding intelligence and creativity. It illustrates the power of inborn wiring of the brain in developing mental skills. At the same time it can show how inborn limitations can be overcome by using the compensatory power of the brain. Dyslexia is caused by an inability to handle linguistic information in visual form.

5-15% of the population can be diagnosed as suffering from various degrees of dyslexia. Its main manifestation is a difficulty in developing reading skills in elementary school children. Those difficulties result from a reduced ability to link up visual symbols with sounds. In the past, dyslexia was mistakenly thought to have a motivational background. Researchers studying the brains of dyslectics have, however, found that in reading tasks dyslexics show reduced activity in the left inferior parietal cortex. Otherwise, dyslectics are known to often show higher than average intelligence. A number of bright brains are said to have suffered from varying degrees of dyslexia. Those include Einstein, Edison, Alexander Graham Bell, Faraday and many others. Dyslectics may show a natural dislike of reading and, in consequence, compensate by developing unique verbal communication, inter-personal and leadership skills. Hence so many prominent CEOs list minor to severe dyslexia among their childhood disabilities. Those include Richard Branson (Virgin Enterprises), Henry Ford, Ted Turner (AOL - Time Warner), John Chambers (Cisco), as well as prominent statesmen: Winston Churchill, George Washington, Thomas Jefferson, John F. Kennedy and others. Perhaps for similar reasons, many dyslexics tend to take up the arts (e.g. Tom Cruise or Whoopi Goldberg).

The list above indicates that those who show reading difficulties in childhood can also cope well with their deficiency later in life and become avid readers and skilled writers. Research shows that intense training helps dyslectics use the right part of their brain to take over the limited functionality of the left part. Even a few weeks of intense phonological training (e.g. breaking down and rearranging sounds to produce different words) can noticeably improve reading skills. Unlike in normal adults, phonological training in dyslexics increases activity in the right temporoparietal cortex. This part of the brain works in spatial tasks and may be the main compensatory structure in phonological training. It is the sister region of the left temporoparietal cortex responsible for visual motion processing, which is underactive in many dyslexics. The earlier the phonological regimen is taken on, the better the overall result. Advanced brain scans could identify children at risk of dyslexia before they can even read.

In 1979, anatomical differences in the brain of a young dyslexic were documented. Albert Galaburda of Harvard Medical School noticed that language centers in dyslectic brains showed microscopic flaws known as ectopias and microgyria. Both affect the normal six-layer structure of the cortex. An ectopia is a collection of neurons that have pushed up from lower cortical layers into the outermost one. A microgyrus is an area of cortex that includes only four layers instead of six. These flaws affect the connectivity and functionality of the cortex in critical areas related to sound and visual processing. These and similar structural abnormalities may be the basis of the inevitable and hard-to-overcome difficulty in reading.

Several genetic regions on chromosomes 1 and 6 have been found that might be linked to dyslexia. In all likelihood, dyslexia is a conglomeration of disorders that all affect similar and associated areas of the cortex. With time, science is likely to identify and classify all the individual sub-disorders, with benefits to our understanding of how low-level genetic flaws can affect the wiring of the brain and enhance or reduce a particular component of human mental capacity.

Whether today's models of dyslexia are correct or not, the main lesson of dyslexia is that minor genetic changes affecting the layering of the cortex in a minor area of the brain may impose inborn limitation on the overall intellectual function. At the same time, dyslexia shows that the brain exhibits a strong ability to compensate for its inborn or acquired limitations, and intense training can often result in miraculous turnabouts 

Smart lifelong training is an essential component of the formula for genius! Even though genetic background or health may handicap a minority, the optimum strategy for maximizing intellectual power is still the same: as much quality learning as possible. Learning is your genius brain work-out. Commit yourself to heavy learning for life today! Be sure that this is smart learning (as emphasized in the next section). The genius of spatial symmetry, Buckminster Fuller, said: I'm not a genius. I'm just a tremendous bundle of experience. See also: Practice can make a perfect genius

Most average students today could amaze Aristotle with their ability to draw conclusions in many areas of science. They would laugh at the great philosopher's theories. Their brains are better primed for scientific thinking than the brain of the greatest philosopher of the 4th century B.C. In today's world, your IQ or the folding of your cerebral cortex are valuable assets but they are ultimately less important than your ability to solve problems. This ability is based on knowledge. And knowledge is inherently acquirable. One thing you must not forget though: Make your learning smart:

Smart and dumb learning

To build genius, your learning program must be based on high applicability of newly acquired skills and knowledge. If you memorize the whole phone book (i.e. a big set of facts), you won't be much closer to a genius mind and your problem solving ability will increase only slightly (mostly through the beneficial effect of memory training on the health of your brain). On the other hand, a simple formula for expected payoff may affect all decisions you make in problem solving and in life in general. It can, for example, save you years of wasted investment in lottery tickets. Millions of people are enticed by huge lottery jackpots, yet they would never agree to give up their whole income for life in order to get it back at retirement in a one-off payment, which is a frequent probabilistic payoff equivalent of taking part in lotteries. Using the terminology defined above, you will find most benefit in mastering and understanding highly abstract rules of logical thinking and decision making.
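
To make the expected-payoff idea concrete, here is a minimal sketch in Python (the ticket price, jackpot and odds below are made-up illustrative numbers, not data from any real lottery):

    # Expected payoff of a hypothetical lottery ticket (illustrative numbers only)
    ticket_price = 2.00            # cost of one ticket
    jackpot = 10_000_000.00        # advertised prize
    odds = 1 / 140_000_000         # assumed chance of winning the jackpot

    expected_win = jackpot * odds                  # average prize per ticket
    expected_payoff = expected_win - ticket_price  # what you gain (or lose) on average

    print(f"Expected win per ticket:    ${expected_win:.2f}")
    print(f"Expected payoff per ticket: ${expected_payoff:.2f}")
    print(f"Expected loss on a weekly ticket over 40 years: ${-expected_payoff * 52 * 40:.2f}")

With these numbers the expected payoff is negative by nearly the full ticket price, and the same three-line computation generalizes to any decision with known probabilities and payoffs.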

To accomplish smart learning, you will need to constantly pay utmost attention to what material you decide to study. You must avoid short-term gratification at the cost of long-term learning. It may be great fun to learn all the Roman emperors and the details of their interesting lives and reigns. However, unless you study with a big picture in mind (e.g. in an attempt to understand why civilizations thrive or fall), your genius may benefit less than by slogging through less entertaining but highly applicable formulas of operations research (those can, for example, help you optimize your diet, investment, daily schedule, etc.). In other words, you cannot be guided just by the fun of learning but by your goals and needs. In time, you will learn to see the link between long-term learning and long-term benefits. You will simply condition yourself to love beneficial learning. Hard study material can still provide instant gratification.

While you focus on your goals, you cannot forget about the overall context of human life. You cannot dig solely into studying car engines only because this happens to be your profession. This would put you at risk of developing tunnel vision. Your genius could be severely handicapped. You might spend years improving liquid fuel engine efficiency while others would leap years ahead by getting involved in hydrogen engines. Their decisions would come not from genius itself but from an extensive knowledge of the field, relevant sciences and the human endeavor in general. One of the main reasons for which companies go bankrupt is that their leadership fails to spot the change. As corporate darwinism eliminates short-sighted teams, future society will witness more and more intellectual darwinism. To understand the trends and the future, you need to study human nature, economics, sociology, history, neurophysiology, mathematics and computing sciences, and more. The more you learn, the stronger your predictive powers, your problem solving capacity and your creative strength.

A bright 25-year-old Microsoft programmer recently suggested to me that I use the wrong examples in my articles on learning. He specifically referred to the question "Which year was the Internet born?", which he classified as a piece of trivia. He implied I should use more "useful" examples to encourage readers. Here my own tunnel vision showed up, as I found his position very surprising. I misjudged the concept of trivia in the eyes of people who do meet the criteria of genius. The term trivia excellently reflects the sort of knowledge we do not want to learn in the quest for genius: not-so-useful facts or rules of low applicability. However, the concept of trivia is highly relative. To a child in kindergarten, the birth of the Internet is rather meaningless. At this stage of development, the child may find it difficult to grasp the concept of the Internet itself. Most parents will wait until primary school before showing a child a web browser (especially as reading skills may be needed to appreciate the concept). The value of putting a date on the birth of the Internet probably develops only in the context of an effort to understand the history of technological development. In this context, 1969 may be as important as the years of Gutenberg. Only when multiple events of the 1960s and 1970s dovetail together does the commissioning of ARPANET become meaningful. When we figure out that we landed a man on the moon before making the first connection via the net, 1969 looms larger. If we dig deeper, we may find it inspiring to know that when Charley Kline tried to log in on October 29, 1969, the network crashed as he typed the letter G. This little detail may still contribute to your genius! Say you are commissioning a major installation you have worked on for several years. You know that the installation implements revolutionary concepts, yet it keeps on crashing. You are about to lose heart. This may not necessarily be an emotional event; after all, you also need to apply probability to deciding when to give up blind-alley pursuits even after years of investment. The juxtaposition of the small letter G and the groundbreaking concept of the interconnected world will help you see the big picture. If your concept is great enough, you will go on through another 100 crashes in hope of diagnosing the reason. If you win, your measure of genius will be enhanced.

Listen to other people's advice and evaluations. The younger you are, the more you should listen. In the end though, it must be you who determines the criteria for sifting golden knowledge from trivia. Only you can measure the value of knowledge in the light of your own goals.

Remember that not all knowledge can easily be formulated in a declarative manner. Remember then to use the power of your own neural networks: solve problems, practice your skills, compute, abstract, associate, etc. You and others may not be able to see or verbalize some rules, but your brain will extract them in the course of practice. Once the rules have been developed, try to formulate them and write them down. This can be of benefit to you and others.

Sifting trivia in SuperMemo

In early versions of SuperMemo, your decisions related to sifting trivia from valuable knowledge would be binary in nature: memorize or forget. In 1991, the concept of the forgetting index made it possible to memorize items with a given probability of recall. In SuperMemo 2004, with incremental reading, there is a continuous transition from trivia to your platinum genius-building knowledge. Apart from the forgetting index, you can use ordinals and rescheduling tools to manage unheard-of quantities of knowledge.
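
As a rough illustration of how a forgetting index translates into review timing, here is a simplified sketch in Python based on a textbook exponential forgetting curve; the formula and the stability value are assumptions made for illustration, not SuperMemo's actual algorithm:

    import math

    # Simplified forgetting curve: recall probability R(t) = exp(-t / S),
    # where S is memory stability in days (an assumed, illustrative value).
    def interval_for_forgetting_index(stability_days, forgetting_index):
        """Interval after which recall is expected to drop to 1 - forgetting_index."""
        return -stability_days * math.log(1.0 - forgetting_index)

    stability = 30.0  # hypothetical stability of a well-learned item
    for fi in (0.05, 0.10, 0.20):
        days = interval_for_forgetting_index(stability, fi)
        print(f"forgetting index {fi:.0%} -> review after {days:.1f} days")

The lower the requested forgetting index, the shorter the intervals and the greater the review workload; that trade-off is what the forgetting index setting controls.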

Predicting the future 

The ability to "see" the future is one of the best tests for genius. The nature of spacetime does not seem to make it possible to probe the future the way we can probe the past via historical records. However, the laws of physics provide a strong platform for peeking into what may happen. A ball falling freely to earth may be an easy guess based on the Newtonian laws of gravity. However, the true difficulty of predicting the winner of the Gore-Bush clash in October 2000 came out only after the election day on November 7. Guessing the winner of the 2004 election today would be yet harder. Guessing at the state of mankind beyond 2100 is a game reserved for only the best-equipped futurist minds. Predictive powers are so good at probing genius because they test all of these: (1) nimbleness of the mind, (2) extensive knowledge of the mechanics of the universe and the society, and (3) the abstractness of reasoning rules. Write down your predictions of the future today. In five years you will be amazed at your own predictive lapses. When will we be able to cure AIDS or cancer? When will we talk freely to computers? What job will you land after graduation? Would you have predicted the web explosion in 1990 (i.e. before the publishing of the web protocols)? Or in 1994 (i.e. already after Filo and Yang started collecting their Yahoo links)? What knowledge do you think you lack today to make your predictions more accurate?

Predictive powers are the cornerstone of success in business. Those who can see the technologies and trends that will shape a market in 3-5 years are poised to do well. Here comes the value of basic sciences such as math and physics in extracting trends from the chaos of the modern world. The value of math and physics comes from the fact that they equip you with highly abstract rules with a wide range of applications. This is why it pays highly to learn artificial intelligence, neural networks, sociology, neurophysiology, systems theory, statistics, evolutionary psychology, history, etc. Those sciences formulate rules that make it possible to better understand reality and, most of all, draw conclusions about it. Those rules are the tools of computation for processing the picture of reality in your mind.

Here is an example: when Alan Turing developed the concept of his Turing machine, he equipped his genius brain with a tool for understanding computation. The Turing machine is a sort of a toy computer that scans a tape of symbols and stamps the tape depending on the currently read symbols and its own state. Turing's early intuition was that his toy computer, given enough time, could compute everything that is computable. If the future were deterministically computable from the quantum states of subatomic particles, the Turing machine could compute it. If the future were non-deterministic, the density function of individual outcomes could be computed too. The Turing machine became the simplest possible metaphor for the human brain. Turing could see the parallel between the shifting states of the Turing machine and the states of the human mind, including emotional states and the most complex computations of human thought. Turing could then state boldly that one day machines would be as intelligent as humans. The famed Turing test is based on putting a computer in one room, a human in another, and testing whether outside observers could distinguish between the two by means of a conversation (e.g. via a computer terminal). Once computers become indistinguishable from humans, they will be said to have passed the Turing test. Most people living in Turing's time (the 1930s) would disagree, but their predictive powers were limited by the lack of tools for understanding the mind and computation. The Turing machine, and basic truths about its properties, equipped Turing's brain with tools that made it easy for him to see the simple parallel between the mind and the machine. For most researchers in the area of artificial intelligence, it is obvious that the Turing test will be passed sooner or later. Perhaps in 2010, perhaps in 2040, but it will happen. In the 1950s, Herbert Simon, using the same abstract rules related to computation, spoke loudly about his belief that the computer would beat the world chess champion within ten years. He was off by thirty years. This illustrates the difficulty of predicting the future, as well as the power of some basic abstract rules. In this case, Simon concluded that given an appropriate objective function for evaluating chess positions, it is only a matter of the number of moves the computer can process before it can produce better moves than a human being. He underestimated the power of the human brain in simplifying (read: representing) the chessboard situation. Yet the ultimate outcome of Simon's prediction was inevitable and obviously true. This example illustrates how a simple abstract tool (the Turing machine) can be used to predict the future (the fate of the Turing test) by providing a simple model of complex reality (the human brain and its behavioral characteristics).
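
To make the "toy computer" image tangible, here is a minimal Turing machine simulator in Python; the example machine (a unary incrementer) and its transition table are my own illustrative choices, not anything taken from Turing's papers:

    # A tiny Turing machine: rules map (state, symbol) -> (new symbol, move, new state).
    # The example machine appends one '1' to a unary number, i.e. it increments it.
    def run_turing_machine(tape, rules, state="scan", blank="_", max_steps=1000):
        cells = dict(enumerate(tape))    # sparse tape: position -> symbol
        head = 0
        for _ in range(max_steps):
            if state == "halt":
                break
            symbol = cells.get(head, blank)
            new_symbol, move, state = rules[(state, symbol)]
            cells[head] = new_symbol
            head += 1 if move == "R" else -1
        return "".join(cells[i] for i in sorted(cells)).strip(blank)

    rules = {
        ("scan", "1"): ("1", "R", "scan"),   # skip over the existing 1s
        ("scan", "_"): ("1", "R", "halt"),   # write one more 1, then stop
    }

    print(run_turing_machine("111", rules))  # prints '1111'

Everything the machine "knows" sits in its transition table; richer tables give richer behavior, which is exactly the intuition behind treating the Turing machine as the simplest metaphor for computation.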

Ray Kurzweil is probably best known for his improbable-sounding predictions of the future. Machine intelligence is not only obvious to him. It should also come sooner than most AI researchers predict. Kurzweil's predictive powers come from an immense knowledge of technology, sciences, and society. Kurzweil's case shows how extensive learning equips the brain with genius powers, of which predictive powers are so noticeable. Kurzweil's predictions (including the world wide web) have already materialized in a number of cases. Read Kurzweil's lips. That could be the shortest way towards reading the future, save for your own years of heavy learning.

In 1977, the bright mind of Ken Olson, President of the Digital Equipment Corporation, committed a notorious blunder expressed at the Convention of the World Future Society. Olson said: There is no reason for any individual to have a computer in their home. Possibly reading this text on your home PC, you may wonder how Ken Olson could possibly be considered bright if he could not see the obvious value of the PC. His blunder does not detract a bit from Olson's brain powers. After all, he did not reach the top of DEC by chance or connections. He built it from the ground up. His creative powers were in this particular case curtailed by his own experience with computing (fascination with the power of VAX and VMS in juxtaposition to a weakling microcomputer). Yes, knowledge can be detrimental too. Einstein's relativity theory gained him the most identifiable status of the ultimate genius of science mostly due to the fact that he was able to extricate himself from the Newtonian mold that is so natural to our day-to-day thinking. Not being able to break the mold is not a sign of lacking genius! It is simply a sign of being burdened with the prejudice of one's current knowledge. In no way should this mean that learning on its own can be detrimental. It never is, as long as we do not apply the creative mold to the learning process itself. One of the most important rules your genius brain needs to store in the very beginning is that no rule is true forever. Rules can be added, modified, deleted or replaced. You need to strengthen your rules related to fuzzy logic. In simple words, you have to learn to think in terms of the probability of truth.

SuperMemo vs. contradiction

SuperMemo makes it easy to see that the knowledge we are fed daily via various media is rich in contradictions. If we learn with a lower degree of retention (classical learning), new contradictory knowledge easily obliterates old knowledge. We often do not even see the contradiction. If you learn for high retention (say 95-99% in SuperMemo), contradictions become painfully visible. This helps you become critical in evaluating the sources of information. If this article tells you that Einstein was dyslexic, take into account the rules of memetics: this comforting piece of news propagates easily. It propagates by far more easily than the core meaning behind Einstein's theory of relativity. From article to article. From website to website. From person to person.

Ken Olson blundered by claiming no demand for personal computers, but his brain was able to quickly absorb the new reality (esp. in the context of DEC's rapid decline). Olson's enlightenment might have been too late for DEC, but not too late for Olson's ability to creatively contribute to the computer industry. Long before Olson's blunder, the founders of Apple had already known the truth: microcomputers would take the planet by storm. The power of the storm was still a surprise to Steve Wozniak. So was the fact that the clunky PC was later to displace his cherished Apple line. The PC storm surprised even the man who made the most of it: Bill Gates, the man whose predictive powers made him as valuable as the economies of whole countries. Bill Gates's wealth attracts as much envy as it attracts admiration. This is why his own blunders were studied to the last detail. Bill Gates blundered dismally on more than one occasion. And again, it does not detract from his true software business genius. Gates was clearly late in noticing the power of the Internet, yet his .NET initiative shows that he and his team were able to correct the strategy on the go. Actually, the .NET credit goes to Microsoft employees who were able to contact their boss directly with their own ideas on strengths, weaknesses, opportunities, and threats. In the end, treating the company's electronic communications as a nervous system returns credit back to Gates and his managerial skills. In 1981, Gates is reported to have said: "640K ought to be enough (memory) for anybody", thus contributing to the infamous 640K lock. It is also Gates who predicted that OS/2 would be the most important piece of software ever developed. So what? Bill Gates, as all true geniuses, keeps on learning. To err is human. As long as we do not stick to the old mold. There is no fool like an old fool.

Notorious predictive lapses 

Predictive lapses do not detract from human genius. They have befallen presidents, Nobel Prize winners, CEOs, analysts and the most amazing genius minds such as that of John von Neumann. Because they often comfort those who are less brisk intellectually, many are only a myth:

  • It would appear that we have reached the limits of what it is possible to achieve with computer technology, although one should be careful with such statements, as they tend to sound pretty silly in 5 years. John Von Neumann, a major contributor to the design of modern computer architecture, ca. 1949
  • Television won't hold on to any market it captures after the first six months. People will soon tire of staring at a box every night. Darryl Zanuck, head of 20th Century Fox, 1946
  • This telephone has too many shortcomings to be seriously considered as a means of communication. The device is of no value to us. Western Union internal memo, 1876 (shortly before years of a patent battle with Bell Company)
  • An amazing invention, but who would ever want to use one? U.S. president Rutherford Hayes, after participating in a trial telephone call between Washington and Philadelphia in 1876
  • No matter what happens, the U.S. Navy is not going to be caught napping. Secretary of the Navy, December 4, 1941 (three days before Pearl Harbor)
  • There is no likelihood man can ever tap the power of the atom. Robert Millikan, Nobel Prize winner in physics, 1920
  • I have traveled the length and breadth of this country and talked with the best people, and I can assure you that data processing is a fad that won't last out the year. The editor in charge of business books for Prentice Hall, 1957
  • I think there is a world market for about 5 computers. Thomas Watson Sr., President of IBM, 1943 (probably an urban legend)
  • Everything that can be invented has been. US Patent Office 1899 (an urban legend)

For a taste of excellent knowledge-based predictive powers in action, see the highly educational and heart-warming "Long Boom" by a long-view guru: Peter Schwartz (with Peter Leyden; Wired, July 1997). Even though the article is only four years old, we can see that the authors underestimated the destructive power of investor greed for the dot-com economy. But the future holds positive surprises for optimists too.

What is creativity? 

Creativity is usually defined as the ability to generate new ideas that are both highly innovative and highly useful. A new idea will not be called creative unless it is quite hard to come by. For example, if you decide to paint your car orange with little blue ants all over it, you won't fall into a highly creative field. After all, everyone can paint her car like this. That you do not see blue ants in the streets comes from the fact that the number of objects that could take the ants' place is nearly infinite. An art expert passing a judgment on your car's artistry could perhaps change the verdict. On the other hand, if you keep churning out dozens of ideas which have little or no practical value, few will consider this a highly creative effort. Similarly, potentially valuable ideas that live and die in your brain without ever being converted into a practical application will not pass the test of the definition used herein. In this article, we will adhere to the pragmatic criterion in judging creativity. Let us analyze the basis of creativity and ways to improve it via training and the application of relevant tools and/or techniques. We will skirt around artistic creativity, which falls outside my own professional focus, and is by far more relativistic: artistic creativity is in the eye (or ear) of the beholder.

Here are some examples of creative breakthroughs that we will use in an effort to find a prescription for creativity:

  • Johannes Gutenberg built upon the idea of metal blocks with letters, combined existing technologies, and sparked one of the greatest revolutions in the history of mankind
  • Steve Wozniak combined his knowledge of electronics with a vision of a computer displaying images on a TV screen, and working with a typewriter-like keyboard. Those ideas opened a path towards a personal computer for the masses
  • Tim Berners-Lee, inspired by the idea of hypertext and in need of an efficient communication tool for large teams, came up with a protocol framework for the future world wide web. He converted multiple ideas and hours of design and programming into a foundation of the greatest communication breakthrough since Gutenberg
Tim Berners-Lee

In 1980, Tim Berners-Lee wrote a little program called Enquire that helped him link pieces of information together. The program itself was inspired by an old computer game, Adventure. Unlike the later HyperCard, Enquire would run on a multi-user system and make it possible for people to share data. Using his experience and the inspiration from the hypertext concept coined in the 1960s by Ted Nelson and derived from Vannevar Bush's Memex system (early 1940s), Tim Berners-Lee envisioned a system that could improve information exchange in large teams. In March 1989, while employed at CERN, Tim Berners-Lee wrote a proposal for improved information management. His main concern was to improve keeping track of large projects. His proposal was to build a system that would be distributed across remote machines, allow heterogeneity, be decentralized (i.e. grow freely at its nodes independently of other nodes), and be privately extensible. He proposed a team of two people to develop the project within a year. His proposal's reference section clearly points to the seminal influences of Ted Nelson and other authors. By November of 1990, Tim started working on the prototype. The world-wide-web, as it was then called, went into use at CERN in May 1991. By August 1991, its existence was announced to a number of Internet newsgroups. By 1994, the web edged out telnet to become the second most popular service on the Internet. In percentage of byte traffic, it was behind only FTP data. Today, the web is the single most important tool of global transformation.

Tim Berners-Lee creatively combined his experience and existing ideas into a breakthrough concept that changed the world (and we have barely seen the beginning). The building blocks of the world wide web are simple enough to be understood by a high school student. Yet their unique combination into a simple, extensible, and cohesive concept deservedly rewarded the genius of Tim Berners-Lee with the credit for the greatest human breakthrough since Gutenberg.

If you look at Gutenberg's, Steve Wozniak's or Tim Berners-Lee's breakthrough ideas, you may think: "That's simple. I could have invented it". The greatest power of an invention is often in its simplicity. Yet creative molds often prevent dozens of inventors from hitting the right idea. The fact that great creative breakthroughs seem so simple in retrospect gave origin to the popular saying: The darkest place is under the candlestick. The creative mold is simply very hard to overcome even for the most insightful mind. Ted Nelson had spent years perfecting his genius Xanadu ideas. Yet Tim Berners-Lee, in the course of two short years, combined a couple of simple concepts to turn the world upside down. The simplicity and near-obvious nature of their inventions make it hard for the inventors themselves to recognize the invention's potential early. Without Steve Jobs, Wozniak might never have come to believe that his new computer design could be used beyond his hobbyist club, let alone by millions. Great creative breakthroughs combine luck, coincidence, timing, and persistence. They are also helped greatly by a very specific kind of creative mind: at times inattentive, hyperactive, distractingly creative, obsessive, often paranoid, and even nearly psychotic (as in the case of John Nash depicted in the Oscar-winning A Beautiful Mind). As for luck and timing, Gutenberg's ideas would not have worked had they been originally transplanted to China (see Gutenberg). Steve Wozniak without Steve Jobs might not have worked (see: Steve Wozniak). A WWW striving for the perfection of Xanadu might not have worked. For your creative genius to change the world, you need to look for workable solutions that fit the present world. As Vannevar Bush noted, the genius of Charles Babbage, the inventor of the first mechanical computer, was born a century too early. Not only was his mechanical computer hard to implement. Not only would it have been uneconomical and unworkable had it been constructed. Not only would it have been hard to educate his contemporaries about its usefulness. The most painful mismatch in timing was that Babbage's work was largely misunderstood, forgotten, and had little impact on the design of the first electronic computers a century later. The timing was not right. Only for Babbage's 200th birthday was a machine along his design proven workable by researchers at the Science Museum in London. One common feature of the greatest failed inventions is that their fathers gave them up upon hitting a better idea. In such cases, obsessive-compulsive creativity may be a hindrance. Babbage gave up his Difference Engine as soon as a vision of a far better Analytical Engine dawned upon him. In contrast, Seymour Cray, when designing his supercomputers, would maximally simplify the architecture in order to be sure his designs would see the market. Cleverly balancing implementation speed against perfection, Cray outdistanced everyone in the supercomputing field (until new creative breakthroughs set him back). Creativity and cold meticulousness are often at odds. They require a different type of mind. A biologist will notice that they are based on a different neurohormonal brain profile! Tim Berners-Lee is an excellent example of a brilliantly creative mind, which is still able to focus on a task at hand, efficiently execute the plan of action, and make things happen.

Steve Wozniak

Steve Wozniak is universally credited with initiating the entry of computers into private homes. Although his contribution may be seen as a compilation of a few well-known ideas that have perfectly coincided with the technological readiness for a mass-produced computer, Steve Wozniak's ingenuity and relentless creativity made him uniquely suitable to pick up the credit for starting the PC revolution.

Successor to Tom Swift

Wozniak's early inspirations came from his father Jerry, who was a Lockheed engineer, and from a fictional wonder-boy: Tom Swift. His father infected him with a fascination for electronics and would often check over young Woz's creations. Tom Swift, on the other hand, was for Woz an epitome of creative freedom, scientific knowledge, and the ability to find solutions to problems. Tom Swift would also attractively illustrate the big rewards that await the inventor. To this day, Wozniak returns to Tom Swift books and reads them to his own kids as a form of inspiration.

Woz's values were shaped and strengthened over the years by his family, Christian philosophy (turning the other cheek), radio amateur ethics (helping people in emergencies), books (Swift's utilitarian and humanitarian attitude) and others.

As a lasting Swift legacy, throughout his life, Wozniak loved all projects that required heavy thinking. He learned the basics of mathematics and electronics from his father. He would at times be so absorbed in his projects that his mother would have to shake him back to reality. When Woz was 11, he built his own amateur radio station, and got a ham-radio license. At age 13, he was elected president of his high school electronics club, and won first prize at a science fair for a transistor-based calculator. Also at 13, Woz built his first computer that laid the engineering foundation of his later success. 

First home computer

With all engineering skills at hand, it was not hard for the Wizard of Woz to envisage the simple computer of his dreams. The keyboard would work like a typewriter. The messages would be displayed on a TV-like monitor. The computer could be assembled with relatively cheap circuitry. By 1975, Woz had dropped out of the University of California at Berkeley and would come up with a computer that could sweep the nation. However, he was largely working within the scope of the Homebrew Computer Club, a local group of electronics hobbyists. His project had no wider ambition. As often happens in history, Woz was just a single hemisphere of a genius brain. The other component was Steve Jobs, whom Wozniak met when Jobs was 16. Jobs, five years Woz's junior, who himself had dropped out of Reed College in 1972, was a perennial starry-eyed visionary who could see far beyond the possible. Jobs and Wozniak came to the conclusion that a completely assembled and inexpensive computer would be in demand. They sold some of their prized possessions (e.g. Woz's scientific calculator), raised $1300, and assembled the first prototype in Jobs' garage. Their first computer was quite unimpressive by today's standards, but in 1975 it was an engineering breakthrough that would change the course of history. In simplicity of use it went years ahead of the Altair, which was introduced earlier in 1975. The Altair had no display and no true storage. It received commands via a series of switches, and a single program would require thousands of toggles without an error. Altair output was presented in the form of flashing lights. The Altair was great for true geeks, Bill Gates and Paul Allen among the first of them, but it was not really usable for a wider public. It would not even come assembled. Woz's computer, on the other hand, named Apple I, was a fully assembled and functional unit that contained a $25 microprocessor on a first-ever single-circuit board with ROM memory. On April 1, 1976, Jobs and Wozniak formed the Apple Computer Company. Wozniak quit his job at HP and became the vice president in charge of research and development at Apple. Apple I was priced at $666. Jobs and Wozniak made a killing by selling their first 25 computers to a local dealer.

Wozniak could now focus full-time on fixing the shortcomings of Apple I and adding new functionality. His genius was in full creative swing. Apple I earned his company close to a million dollars. His new design was to retain the most important characteristics: simplicity and usability. Woz introduced high-resolution graphics in Apple II. His computer could now display pictures instead of just letters: "I threw in high-res. It was only two chips. I didn't know if people would use it". By 1978, he also designed an inexpensive floppy-disk drive. He and Randy Wigginton wrote a simple disk operating system. In addition to his hardware wizardry, Wozniak wrote most of the software that ran the Apple. He wrote a Basic interpreter, a Breakout game (which was also a reason to add sound to the Apple), the code needed to control the disk drive, and more. On the software side, Apple II was also made more attractive to business users by the famous pioneering spreadsheet: Dan Bricklin and Bob Frankston's VisiCalc. This unique combination of new ideas resulted in a screaming market success. In 1980, Apple went public and made Jobs and Wozniak instant millionaires. At the age of 27, Jobs was the youngest man on the 1982 Forbes 400 list -- a rare case before the dot-com bubble era. Incidentally, in 1978, when the company cut the price of Apple II, it helped to launch yet another meteoric software career: that of Mitch Kapor. Kapor scraped together enough money to buy his own Apple. Inspired by VisiCalc and a meeting with its inventors, he went on to develop Lotus 1-2-3 and swept the spreadsheet marketplace for years to follow.

Plane crash

In February of 1981, Wozniak nearly lost his genius in an accident that could have easily claimed his life at age 30. While he was taking off from Scotts Valley airport, an engine failed in his Beechcraft Bonanza airplane and it crashed. In addition to facial injuries, Woz experienced retrograde amnesia. This means that he could not recall things from before the accident. He also had problems forming new memories. At worst, years of training could have been permanently erased from his memory. Those memories laid the foundation of his genius thinking. Luckily, five weeks after the accident, his memory powers returned. The genius was ready for more breakthroughs, but his passions shifted from technology to people.

Woz became less enthusiastic about his work for Apple. He got married and returned to the university under the name "Rocky Clark" to get his degrees in computer science and electrical engineering in 1982. In 1983, he decided to return to mainstream Apple development. However, he wanted to be no more than just an engineer and a motivational factor for the Apple workforce. Here he demonstrated a typical characteristic of a creative mind: craving for creative opportunities away from the spotlight (cf. William James Sidis). Woz stunned the world by leaving Apple for good in 1985, nine years after setting up the company. Jobs was also forced to leave Apple as a result of a power struggle. Wozniak and Jobs are proud to have originated an anti-corporate ethic among the big players of the computer market. Jobs focused on not-always-practical innovation with his NeXT vision, while Woz went on to fulfill his other passions: teaching fifth grade and charitable activities in the field of education. Today, Steve Wozniak's passion is to help young talent catch the train of opportunity. He provides kids with computers, Internet accounts, and lessons in programming. In September 2000, Steve Wozniak was inducted into the National Inventors Hall of Fame alongside Bell, Edison, Fermi, Marconi, Pasteur, Tesla and others.

The future of creativity

Wozniak said: "Apple is not the company I had hoped it would be. I always thought that a major player in the personal computer business, with its label on the products, would be composed of top engineers and multiple labs full of scientists developing new devices out of physics and chemistry".

Geniuses dislike corporate structures because corporations tend to bend creativity to commercial purposes. Creative minds tend to be in the minority. At the same time, they are convinced that their visions are the only valid ones. This inevitably leads to tension and disruptions at work. Some corporations create independent R&D departments and lavish their most precious brains with generous research funds. However, only leading high-tech companies can afford such solutions. Luckily, the new economy based on the Internet has provided grounds for breeding countless young geniuses sprouting here and there. Many creative minds are now operating via their small websites that provide planetary access to the products of their intellectual effort. Two young users of SuperMemo have recently set up a website in their living room. Antimoon.com, set up at nearly zero investment, is now quickly gaining popularity.

In addition to opening ways towards individual creativity, the Internet helps corporations hire geniuses without restraining their creativity. A creative mind operating from a small country over a shoddy Internet connection can be hired on conditions that provide a unique coincidence of needs. The genius gets a creative job, not available in his area or country, at competitive pay. The corporation gets the most precious commodity their money can buy: creative genius.

Ingredients of creativity

All creative individuals experience periods of time when new ideas come into their mind in droves and there is hardly enough time to write them all down. A creative individual can hardly hope to implement a fraction of his or her ideas. Some people are born with highly creative minds. They are privileged from the outset, but they are also more likely to suffer from side effects of the neurohormonal aspects of creativity such as inattention, anxiety, depression, etc. For those who are born with less poetic minds, an understanding of the creative process can be of great help. Ordinary brains can be made to work in a highly creative mode. Let us list the conditions needed for the brain to churn out ideas en masse:

  • suitable state of mind: alert, excited, and excitable
  • suitable environment: minimum irrelevant interference from the outside world (e.g. a ringing phone) and maximum creative stimulation (e.g. creative reading, incremental reading, brainstorming, etc.)
  • time: the more time you give for an idea to grow, the greater the likelihood of a breakthrough association; two hours separated by a period of sleep may do more than two continuous hours
  • motivation: there must be a need to come up with a solution and strong motivation to document and analyze the transition steps
  • curiosity: the mind must curiously stray into unexplored paths where new associations and unexpected solutions can be found
  • knowledge: knowledge in relevant areas

Of the above factors, genetic endowment may greatly help in achieving the suitable state of mind, which also entails motivation and curiosity. However, the neurohormonal advantage given by the lucky genotype can be made up for with relatively simple tools and techniques such as massive learning, a cup of coffee, brainstorming, good health, etc.

Computer metaphor of the creative mind

A creative mind can be compared to an expert system that must go beyond its current field of expertise and generate new facts and rules. The goal is often reasonably defined but the path towards the goal is unclear. At other times, even the goal is not well-defined. It is simply supposed to crop up suddenly as a creative enlightenment in an effort hazily targeted at innovation. Let us introduce the concept of a creative computer system as a metaphor for the creative mind. Using the computer metaphor we can redefine the previously listed preconditions of creativity in the following manner:

  • speed: the more parallel processing paths the creative system can spawn, the richer its output (equivalent: state of mind). The rat maze model of creativity shows that massively parallel processing is the key factor underlying the "creative speed". It also provides hints on how the parallel processing power can be harnessed
  • minimum interference: we want to avoid power failure, additional computational tasks (e.g. computing the factorial of one billion at the time of running the creative process, etc.) (equivalent: environment)
  • time: the more time we allow for the exploration, the greater the number of tasks we can explore (equivalent: time). Running parallel creative processes can cut down on time needed for a creative breakthrough
  • goal: the creative system by definition will focus on churning new ideas and recording the results with a general goal defined by its program (equivalent: motivation). Unbridled random creativity can often come with valuable outcomes (e.g. in poetry), but may be less useful in highly focused activities (e.g. engineering)
  • branching: the creative system will use all heuristics available at hand to drive exploration of innumerable paths that could potentially result in a breakthrough (equivalent: curiosity). Curiosity will spawn conscious branching, but spontaneous branching of the creative mode is also welcome (as shown in the rat maze model of creativity)
  • knowledge: the more rules and well-chosen facts our system has in store, the greater its explorative potential. Our system is parallel and associative. In other words, we do not need to worry much about operational overhead. We do not need to worry about knowledge overload. However, we do want to maximize relevant knowledge and the abstractness of the rules (equivalent: knowledge)

In short, our creative system can be improved by adding speed (esp. through stimulation of parallel processing), minimizing interference, improving the exploratory heuristics, adding more knowledge and allocating more time.

In simple terms, the above means stimulating creative powers, using creative techniques, avoiding interruption, and pursuing lifelong learning.

Rat maze metaphor of the creative mind

Our brain metaphor presented earlier does not suffice to efficiently explain the mind in the creative mode. It is in the creative mind that the parallel processing of neural networks comes into play in the most prominent way. Your PC can run many processes in parallel (e.g. display this article, scan TCP/IP ports, run multiple service threads, run background applications, etc.). None of these, however, comes close to what is happening in your brain right now. Zillions of neural assemblies and centers get activated and inhibited in parallel, fighting for your attention, running voluntary and involuntary control processes, filtering information or amplifying it, etc. In addition, the text you are reading produces processes that are in part hidden from your attention and may, at any minute, spring up with new inspiration or a creative breakthrough.

The way the brain controls major parallel processes can roughly be illustrated with the rat maze model. Imagine a lab rat in a maze. It runs around until it hits the reward. Let a running rat illustrate one of the parallel computation processes running in your brain. Let hitting the reward illustrate the moment when a creative association of ideas breaks through to your attention. Your brain is able to unleash many rats running in parallel. You are basically unaware of the rodential infestation crawling in your mind. Imagine your rats are able to breed on the run. One rat can breed when hitting a reward center or just breed spontaneously. If you drink a cup of coffee, your rats run faster and spawn more offspring. If you drink more coffee, rats start jumping over the maze walls and find new shortcuts and solutions. After more cups of coffee, rats start jumping one over another and scream for your attention, resulting in chaos. The same happens when you get overly enthusiastic. You might transition from enthusiasm, via hypomania, to a manic state that meets the criteria of psychosis. Coffee, excess serotonin, excess dopamine, etc. can breed more ideas in your mind, but you gradually lose the ability to focus on the task at hand until your mind dissolves into psychotic chaos. The reverse happens if you take a sleeping pill. Your rats start dying out or hiding in remote corners of the maze. In bipolar disorder, a manic swing can tire the rats. Once they run out of steam, the brain may collapse into clinical depression. Once rats slow down, you run out of creative powers. If you spray a portion of the maze with a cocaine inhalant, that portion of the maze may suffer a particular infestation of manic rats. Rats jump over one another and spread out amok in various directions. This may happen in an overly stimulated portion of a schizophrenic brain. If the cocaine-sprayed part of the maze is responsible for representing a particular place or person, you may experience hallucinations or delusions. You may hear or even see people who do not exist. Your mind becomes paranoid. If the same overexcited rats run amok in your temporal cortex during sleep, you may experience nightmares. When the rats start running in circles, you can experience obsessive-compulsive behaviors. You can wash your hands many times during the day and your rats will still make you think your hands are dirty. You can step in front of the class and the circling rats will make you repeat "I am bound to fail", and you do. The same happens if you repeat "I will not fall asleep tonight" and you don't. The rats are responsible for writer's block. There is an optimum number of rats at an optimum level of excitation for any given task, including writing. If the rats are drowsy, you won't invent much. If they run amok, you won't be able to convert the chaos of new ideas into a coherent stream of thoughts. If they run in circles, you may get stuck with a self-imposed limitation: "I won't write anything creative today"; you keep tossing balls of scratch paper into the basket. You will want to keep your rats at bay when you cross a street or when you drive. They can cost you your life. Creative minds may be unwelcome in an air traffic control tower. On the other hand, you will let the rats play to their heart's content when you brainstorm a new marketing slogan or an idea for a movie script. It is vital to understand that attention and creativity are two difficult-to-reconcile qualities. A genius mind must find the right balance.
It is the prefrontal cortex that helps channel rampant creativity into focused attention; this channeling works through prefrontal executive inhibition. Try LSD and creative chaos will unravel your reasoning. The creative mode balance will also change during the day. Your brain may welcome hyperactive creativity at work time and a complete mode switchover by the time you are ready to rebuild your neural fabric in sleep. Creativity is a blessing when needed and may be a bane for insomniacs, schizophrenics, bipolar patients, ADHD kids, OCD people, etc. It can be gently stimulated with a cup of coffee, or turn your mind into chaos with LSD, cocaine or even stress. Brainstorming, a conducive social environment, incremental reading, and creativity tools (e.g. Ideafisher) can stimulate creativity non-pharmacologically by unleashing new rats without a major impact on their agility.
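
For readers who like to tinker, here is a toy simulation of the rat maze picture in Python; the maze length, the reward position and the numbers of "rats" (parallel random walkers) are made-up parameters, intended only to show that more parallel explorers tend to stumble on the reward sooner:

    import random

    def steps_to_reward(n_rats, maze_length=50, reward=37, max_steps=10_000, seed=1):
        """Toy model: n_rats random walkers on a line of maze_length cells;
        returns how many steps pass before any walker reaches the reward cell."""
        rng = random.Random(seed)
        positions = [0] * n_rats
        for step in range(1, max_steps + 1):
            for i in range(n_rats):
                positions[i] = max(0, min(maze_length, positions[i] + rng.choice((-1, 1))))
                if positions[i] == reward:
                    return step
        return max_steps  # no walker found the reward in time

    for n in (1, 4, 16, 64):
        print(f"{n:3d} rats -> reward found after ~{steps_to_reward(n)} steps")

The sketch deliberately leaves out the "chaos" end of the metaphor (over-excited rats drowning each other out); it only illustrates why unleashing more parallel exploration shortens the wait for a lucky association.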

Some people are born with highly creative minds. They invent new things faster than they can be utilized. However, the same people are at much higher risk of mental disorder. They may also show less ability to focus or persist. They may crave novelty that makes them jump on new tasks before old ones are complete. However, the way their prefrontal cortex harnesses creativity may determine the thin line between true inborn genius and mental disease.

If an increase in creativity were purely beneficial, it would certainly have been far better promoted in the course of evolution. However, evolutionary development of creative powers had to be matched by the development of executive powers that govern attention. Otherwise the side effects of creativity would act as an evolutionary ballast: low stress tolerance, relationship problems, increased suicide rate, increased divorce rate, risk and novelty seeking, tendency to get bored, increased risk of mental disorders, etc.

To take only one example, the high suicide rate and low reproduction rate of people with bipolar disorder, combined with its significant prevalence in society (1%), clearly indicate that its genetic component (documented as very strong) must carry some evolutionary advantage. This advantage is only increasing in the knowledge economy based on innovation. See: Creativity, Evolution and Mental Illness

Conclusion: Creative balance is the key! There is an optimum level of creativity for any given task. You can learn to stimulate and extinguish creativity. Your genes will largely determine how difficult it is for you to control your creativity. Your skills in that respect may determine whether you ascend from an average mind towards genius without falling into a variety of mental ailment traps.

Johannes Gutenberg

Before Johannes Gutenberg invented the printing press in 1438, there were only 30,000 books in Europe. Most of these were related to the Bible and religious writings. Gutenberg combined pieces of knowledge available in his time into a breakthrough technology. The unique elements in his invention: a mold with punch-stamped matrices, a type-metal alloy, a press similar to those used in winemaking, and an oil-based printing ink. None of these elements could be found in earlier Korean or Chinese print or in woodblock printing. Little is known of the mental process which led to the invention. Most of what we know of Gutenberg comes from documents recording his financial or legal troubles. At the time of his invention, Gutenberg also worked on less groundbreaking technologies such as polishing precious stones and mirror manufacturing. All we know about his print invention is that he started with the idea of producing blocks of metal with letters on them. The idea might have been his own or picked up from others or from difficult-to-track Asian sources. By combining existing technologies, he converted his ideas into a working device and initiated one of the greatest revolutions in the history of mankind. Books became the world's first mass-produced items. In the 50 years after the invention, as many books were produced as in the preceding millennium. Only after Gutenberg did affordable books make literacy a highly-valued skill. In consequence, sciences and technology went into a period of unprecedented growth. This also sparked social change as well as global democratization in the 20th century.

It is important to add that it was a Chinese blacksmith and alchemist, Pi Sheng, who should be credited with the first documented invention of printing with movable type in the 11th century. He produced Chinese ideograms in clay and baked them in a fire. Then he stuck them on an iron plate with resins mixed with wax. He would cover the ideograms with ink and impress them on paper. Pi Sheng's invention, however, did not get far. Ideograms are too meaningful. As a result there are 50,000 of them. Far too many for convenience. Only the simple Latin alphabet helped Gutenberg claim the greatest invention of the millennium.

Gutenberg illustrates important aspects of creativity:

  1. a great breakthrough may begin with a very simple association of ideas
  2. the inventor himself may take long to realize the importance of his own invention. If it were otherwise, why would Gutenberg jeopardize his legacy by wasting time on gems and mirrors?
  3. the creative process is subject to multiple constraints that can stifle the brightest idea for the most trifling reason. Gutenberg would probably have gotten nowhere had he had to deal with Chinese ideograms. For someone to propose a simplified alphabet in China, it would take an entirely different size of creative and social leap. Take a look at the great idea of Esperanto. Via social and economic mechanisms, it is English that became the international lingua franca even though Esperanto may be seven times as easy to learn and use

Enhancing creativity: summary

Using the above creative system models, let us quickly list the areas where your creativity can receive a boost:

  • suitable state of mind: nurture your mental health, get enough sleep, avoid stress, learn about the neurophysiology of the mental effort, and work on understanding your own mental states to optimize the conditions and the timing of creative effort. In a healthy, well-managed individual, the best creative results can be obtained early in the morning (e.g. after a cup of coffee). If you are rather mentally slow in the morning and prefer to work late in the night, read: Good sleep for good learning for reasons why you might be different
  • suitable environment: turn off the phone, lock your door, turn off the radio, CD or MP3-player (even your favorite music can be distracting), and focus 100% on the studied subject. For inspiration turn to brainstorming, "creative walking", creative blackboard doodles, Ideafisher, or use incremental reading with a heavy load of related study material
  • time: for a breakthrough solution, give up as many of the little things in life as you can and focus on your goal: keep on learning and thinking. One of the most creative periods in Newton's life was when he was forced to the countryside as the bubonic plague closed his university in 1665. The Beagle trip gave Darwin five years to digest new observations on the variability of species. Heisenberg's best time might have been when he was recovering from a bout of hay fever on the island of Helgoland in 1925. Seymour Cray found his work at CDC so distracting that he had R&D facilities built out of town at Chippewa Falls
  • motivation: this one is hard to develop. The vicious circle of bad motivation comes from the fact that once there is no motivation, you have no motivation to develop it. Luckily, the fact that you are reading this article may testify that your motivation is sufficient (though this may not yet translate to self-discipline and execution). Psychogenic motivation of a creative mind comes from an unshakeable hierarchy of values, a lofty goal well-rooted in that hierarchy, and an understanding of self-discipline techniques. Sticking carefully to all points of the Genius Checklist is critical for boosting your motivation
  • curiosity: lifelong learning is a sine qua non of creativity. The more you learn, the more curious you become. What is irrelevant trivia to most may become a fascinating aspect of the universe for you. An average man curses a rock he stumbles against. A great scientist can pick up a rock and write a dissertation about it. This is exactly how the groundbreaking theories of continental drift and plate tectonics began. The more curiosity you find in your mind, the better your creative prospects. If you doubt your own curiosity, try incremental reading on subjects that are interesting to you today. If you persist for some time and hone your incremental reading skills, your curiosity may in time grow beyond manageable limits. You will hopefully discover that even if there were twenty of you, you could hardly scratch the surface of interesting things to do or study
  • knowledge: knowledge is the software of your creative engine. The more knowledge you throw at a problem, the more new ideas and associations you will be able to generate. Most new associations are chaff, but with scrupulous recording habits you will learn how to sift the wheat. No major breakthrough in science or engineering is produced in a knowledge vacuum. The human brain works incrementally. It is basically incapable of great leaps. Even Einstein arrived at his breakthroughs in an incremental manner by piecing together the blocks of the jigsaw puzzle produced by the non-relativistic physics of his time. Your creative breakthrough in the area of chemistry may come while studying architecture. See also: Ideafisher

The shortest formula for enhanced creativity: quiet, focus, curiosity, understanding the creative process, and inter-disciplinary knowledge. To boost your creativity, keep fit, work on ensuring peaceful and creative working conditions, learn to focus ("keep rats in control"), learn to boost parallel processing ("unleash the rats"), add more time to creative training, and keep on learning new things that could potentially be a source of inspiration.

Ideafisher - creativity software based on associational thinking

Creativity can be molded, enhanced and directed. This is the basis of the Ideafisher software. Ideafisher is based on an immense database of facts, statements, quotations and loose ideas. Combined with a set of guiding questions, it can be used by writers or managers to generate new ideas, either individually or in a brainstorming group. The underlying assumption is that once you intensely focus on a subject and look at it from the most unusual directions, you will be able to massively produce new associations and generate new quality.

Incremental reading in SuperMemo 2004 can be employed in a similar fashion. In incremental reading, you read short pieces of articles belonging to various domains. Those pieces tend to juxtapose at random. The main purpose of incremental reading is learning; however, if you intensely focus on a single problem to solve, and appropriately select the reading material, you can use incremental reading in the creative process.

The main differences between Ideafisher and incremental reading:

  • incremental reading was designed for the purpose of learning; Ideafisher focuses specifically on generating new ideas
  • Ideafisher comes with a ready-made database of association cues; in incremental reading you must import the relevant reading material based on your focus of interest
  • Ideafisher can help you direct your creative effort towards a specific goal; incremental reading tends to be more explorative in nature. You can direct the exploration only by the appropriate selection of the material as well as by using prioritization tools
  • Ideafisher capitalizes on improbable associations that might be beyond the reach of your present knowledge. In incremental reading, high retention might lessen explorative creativity. However, overloaded incremental reading (i.e. when you import far more knowledge than you can learn) may result in a substantial decrease in knowledge retention. Such a decrease may be unwelcome in the learning process; however, it further increases the associational and recall effort that is highly stimulating in the creative process
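
To make the mechanics of associational juxtaposition more concrete, here is a minimal Python sketch of the idea; the domain names, extracts and the juxtapose function are purely hypothetical illustrations and are not part of Ideafisher or SuperMemo. It simply collides short extracts from unrelated domains with a stated problem, in the spirit of the random juxtapositions that occur in overloaded incremental reading:

    import random

    # A minimal sketch of associational juxtaposition: collide short extracts
    # from unrelated domains with the problem you are trying to solve, and
    # note any association worth keeping. All names and data are hypothetical.
    extracts = {
        "geology": ["continents drift on a slowly convecting mantle"],
        "architecture": ["load-bearing walls channel stress into foundations"],
        "chemistry": ["catalysts lower activation energy without being consumed"],
    }

    def juxtapose(problem, extracts, rounds=3):
        """Print random cross-domain pairs of extracts next to the problem."""
        domains = list(extracts)
        for _ in range(rounds):
            a, b = random.sample(domains, 2)  # two unrelated domains per round
            print(f"PROBLEM: {problem}")
            print(f"  {a}: {random.choice(extracts[a])}")
            print(f"  {b}: {random.choice(extracts[b])}")
            print("  -> record any association, however improbable\n")

    juxtapose("how to speed up vocabulary learning", extracts)

Real tools naturally work with far richer databases, directing questions and the reader's own prioritized material; the sketch only illustrates that improbable cross-domain collisions are cheap to generate and occasionally valuable.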

Personality factor

In the 1970s, Laszlo Polgar, a teacher from Hungary, concluded that all normal children could be driven to a genius level with sufficient attention and training. If this does not happen on a regular basis, he claimed, it only comes from parental inattention and lack of patience. An average parent is busy with her or his own life and does not devote sufficient time to raising the kids. According to Polgar, it is easy for a parent to say: "Oh, this child has no genius!" and do nothing further. Interestingly, Polgar had no impressive scientific credentials in the field of child care and education (unlike Boris Sidis), so when he decided to experiment with his own kids, many accused him of using dictatorial methods for the sake of a genius show. Few took Polgar seriously; his methods even led to a clash with the Hungarian government. For details of the Polgar experiment, see Polgar sisters.

Polgar's optimistic claim does not leave much room for genetics. Throughout history, most prodigy training occurred in families with a high average IQ. Hence it is again hard to separate nature from nurture.

If genetics comes into play in limiting genius, it is less so in the area of sheer brain power, processing speed, associative power, number of neurons, creative power, etc. Human genius seems to be far more limited by the personality profile, which has a strong genetic background. In simple terms, if the child is ready and willing to be trained for genius, the training will likely succeed. The main obstacle is that a child may not want to accept a heavy load of training.

Except for mental disorders, important personality factors that limit overall creativity include low stress tolerance, aggression, impulsivity, depression, and the resulting poor motivation. On the other hand, traits such as curiosity, perfectionism, runaway creativity, and compulsiveness may enhance development if properly channeled and rationalized.

Largely inborn factors, such as the overall levels of serotonin or dopamine in the brain, can determine stress tolerance, the probability of suicide, and the propensity for aggressive and violent behaviors.

Destructive personality factors are highly correlated with each other. For example, non-virgin adolescent girls are 6 times more likely to attempt suicide, 6 times more likely to use alcohol, and 18 times more likely to run away from home than their counterparts who were able to delay their early sexual experience.

Early signs of genius personality are best described as the rage to master. By 1899 the Wright brothers had read all books in their local library that were relevant to their interest in flying. They decided to write to the Smithsonian Institution for suggestions as to further reading in aeronautics. This unorthodox approach is quite typical of young prodigies and future Nobel Prize winners. With those personality traits, the job of a parent is greatly simplified. Actually, it is often not parents that push prodigies, but prodigies that push parents who may often be worried by their child's unusual hobbies (e.g. dissecting dead animals to study anatomy). 

Nathan Myhrvold, former chief scientist at Microsoft, claims that a great programmer is worth 10,000 times more than an average programmer. The number may well be exaggerated, but studies indeed show that efficiency comparisons between programmers yield much larger differences than one might expect by studying their IQ. I have had the great pleasure of working with a number of programmers: the amazing whiz-kids and those who slowly slog through the code; those who program structurally with lots of foresight and those who produce tons of spaghetti code that is impossible to maintain. If I were to name the most important factor that differentiates programmers, I would choose personality. There are those who slug it out red-eyed until the early morning until they see the problem solved. They seem not to see the world beyond the problem they are solving. There are also those who impatiently peek at their watch at 14:55 to see when they can rush back to their families. The latter are more likely to wade through manuals and ask for assistance once they get stuck with a problem. They are rather dispassionate and far easier to discourage with difficulty. The young red-eyed enthusiast would rather not eat and not sleep than give up the quest for the solution. In the early stages of their development, the red-eyed geeks may start with awfully buggy code, but in time they will spurt ahead and develop healthy programming habits. In 4-6 years they may be in Myhrvold's 10,000 league. Those who do not show passion for programming will never reach the levels of concentration necessary to resolve pieces of programming puzzles in their minds. Those resolved puzzles gradually build strong and highly abstract models of universal algorithmic solutions to most of the logic problems a programmer is likely to encounter on his way. In other words, knowledge and skills seem secondary to the personality factor here. The excellent programmer will begin his career with entangled code full of bugs and poor design, but with passing months he will rapidly change his style, organize, improve day by day, and surprise others. After years, he may be unbeatable, while the poorly motivated one stagnates at a plateau and plods along from day to day.

Motivation and belief in one's abilities may be a key to sustained development in a young man. Ask an 18-year-old: What problem would you like to solve? What is important? Often, you will get no conclusive response! Then suggest that the young man become the world's best footballer or get the newest Lamborghini, and you will notice that his life is not as goalless as it seems at first. However, if you ask: How do you want to get to your footballer-Lamborghini goal? you will get no response again. The reason: the young man does not believe there is a snowball's chance in hell he will ever reach those goals. At the same time, minor goals in-between do not seem worth the effort. The net result: the young man plays videogames and gets nowhere. This is why the rage to master, self-discipline and will power may be the factors that make or break the genius in a young man.

Inability to cope with stress can also stifle creativity. Stress takes your mind away from the creative problems to solve and makes you focus on yourself and petty problems that are not worth your attention. It affects your health and your self-discipline. It is also detrimental to the growth of nervous tissue and to memory consolidation (see: The Medical Basis of Stress).

Lewis Terman, a pioneer of IQ testing, agreed that the greatest differences between gifted and normal kids are the former's greater drive to achieve and greater mental and social adjustment.

Dr Ellen Winner in her book Gifted Children (1996) presents her analysis of what makes the personality of a future genius. She tends to disagree that all kids can be made into geniuses and puts a special emphasis on the need for a child to exhibit a natural rage to master. Prodigies show obsessive fascination with certain content such as numbers, visual patterns, auditory stimulation, symmetry, etc. This fascination begets curiosity and an indomitable will to master selected domains of activity.

Polgar sisters  

A Hungarian teacher, Laszlo Polgar, had a theory about child-raising: every child can be made into a genius with sufficient training. Those ideas solidified when he got married, and he decided to test his theory on his own children. As is often the case with far-out ideas, Laszlo did not get much sympathy from others. When he started putting his ideas into practice with his three little daughters, Laszlo was accused of using his kids as guinea pigs. His determination ultimately led to a clash with the Hungarian government, which attempted to stop him in his tracks. One day, armed government officials knocked at Polgar's door. His experiment was considered potentially abusive. Laszlo invoked his "parental immunity" and proceeded with his educational experiment. He and his wife Klara taught and supervised the daughters solely on their own. At some point, it became obvious that specialization might bring more tangible and measurable results. Mathematics and languages could be the fields in which Polgar's daughters could develop genius. However, the ultimate choice came by accident. When the eldest, Susan (Zsuzsa), was only three and a half years old, she discovered chess by accident. She asked Laszlo what chess was, and soon she and her father were exploring the intellectual challenges of the game. Early indications were that Susan could learn chess quite quickly, and it is hard to say whether her father's prior training or some inborn predisposition helped her make fast progress. Laszlo decided to focus on chess, which is objective and makes it easy to measure accomplishment through championships, tournaments and players' rankings. Chess may not be as important to the future of this planet as physics or medicine; however, it was excellent material for Polgar's experiment, which on its own could yield highly valuable insights into a child's upbringing, whatever the experiment's final outcome. Laszlo explained chess to Susan as if he were telling a fable, and she soon acquired a natural love for the game. Some six years later, Polgar's experiment bore its first amazing fruits. Susan broke into the Guinness Book of Records as the youngest chess master at age 10. At age 12 she became a world champion for girls under 16. By age 15 she was the strongest player in the world in her age category! From there on, Susan's progress bears the typical trademarks of a young prodigy: a unique mix of genius on one hand, and problems fitting into gray reality on the other. At the heart of Susan's problem was her struggle against the male establishment in chess. For ages, male and female chess used to be disjoint worlds. Susan refused to accept sexist divisions and struggled for equal rights to participate in male tournaments. She became the third woman ever to earn the male grandmaster title. Despite her determination to avoid sex differences, Susan decided to try for the Women's World Championship. She won it easily in 1996. She was later stripped of the title when she refused to play her mandatory challenge matches in the period when she became a young mother. Her battle with FIDE to regain her title is another example of a mismatch between feisty genius and established rules and tradition (she won the lawsuit against FIDE in March 2001 but did not get her title back). Despite appearances, Susan is a nice, down-to-earth and complete human being. Besides Hungarian, she speaks 7 other languages (Bulgarian, English, Esperanto, French, German, Russian and Spanish). The Polgar experiment worked!
The theory of training for genius scored one of the most powerful case studies in history. We have no way of knowing Susan's fate had her father taken a different approach to education. Susan's husband observes, however: Had Susan found a thermometer at home instead of a chess set, perhaps we would have cures for cancer and AIDS today.

Interestingly, Susan's success does not complete the Polgar sisters' story. The younger sister Sofia (Zsofia) shocked the world in 1989 at the age of 14. In a chess tournament in Rome, she defeated a string of Soviet Grandmasters and reached the highest performance rating of any chess player (male or female) in any tournament in history (2879 for scoring 8.5 of 9). In 1999, Sofia married a Georgian chess player and now lives in Israel with chess lower on her priority list.

Last but not least, the youngest of the Polgar sisters, Judit (born 1976), is an amazing and still ongoing story of her own. She started learning chess at age 5. At age 11, Judit earned an International Master title, i.e. at an earlier age than Bobby Fischer or Garry Kasparov! At age 13 she was the World Under-14 Champion for boys and FIDE's highest rated woman. At age 15 she beat Bobby Fischer's record by becoming the youngest grandmaster ever, at 15 years and 4 months (in 1958, Fischer became a grandmaster at 15 years and 6 months). In 1993, at age 17, Judit defeated former world champion Boris Spassky 5.5-4.5 in a 10-game exhibition match. In 1998 she beat FIDE world champion Karpov 5-3 and also became the first woman to win the US Open Championship. She is also the only woman ever to reach the top ten of the FIDE list (see the current Top 100 ranking). Today, Judit is the strongest of the three Polgar sisters. She leads the women's rankings by a safe margin (see the current Top 50 women ranking) and may one day become a World Champion (male or female). She played Kasparov in May 2000 over the Internet and lost. Like her sisters, her meteoric rise ultimately plateaued as she got married in September 2000. To track her progress see: http://www.controltheweb.com/polgar/news.htm

It is known that chess players often experience various psychological problems. The level of concentration and the game stakes often lead to exposing and amplifying aggressive behavior. Except for an unflinching drive to win, the Polgar sisters show very little of that in their lives outside the chess world. The Polgar sisters will long remain a valuable case study for understanding the role of personality and training in developing the qualities of genius.

Could indeed any normal child be made a genius with sufficient training? Unfortunately, apart from major intellectual deficiencies or physical disabilities at birth, there is also a personality factor. Not all children will be able to efficiently develop the rage to master. Of the creativity factors listed earlier, curiosity, motivation and the suitable state of mind may be hard to develop in some. All great geniuses showed remarkable curiosity. Curiosity does not have to show itself in the area of sciences or art. If it extends to other areas of life, it bodes well for mental growth. If a child obsessively collects teenage magazines or devours news about pop-stars, you can take it in good part. This sort of curiosity can easily be converted into a more productive force. A more worrying sign would be to see a young person mindlessly stare at the ceiling. Motivation may be the hardest personality component to develop. It has a strong neurohormonal basis that is not yet entirely understood. Depression and bipolar disorders clearly show how the levels of hormones affect motivation. The manic state may often lead to enhanced creativity. Depression is a powerful inhibitor of mental processes and creativity. Then there is the elusive skill of delayed gratification. Those who are most successful are people who are able to wait for the reward. This is true in nearly all professions (Mike Tyson might be a rare exception, in a profession where instinctive behavior might actually give one some edge). Impulsive people might differ in the neural activity of the prefrontal cortex and of a component of the reward system, the nucleus accumbens. To a large degree, impulsiveness is genetically determined. The skill of delaying gratification is particularly precious in creative individuals. Gregor Mendel's research in genetics was entirely based on scientific curiosity, and subject to a religious regimen of rigorous observation. There was no reward expected and no reward granted (in his lifetime). His groundbreaking findings waited 35 years to be rediscovered, while Mendel himself focused on his duties after being promoted to the position of abbot of his monastery.
The ability to delay gratification can be easily recognized by the parent. It can also be measured scientifically as well as improved by training. Children that are poor at delaying gratification are also poorly organized and show little stress tolerance. They tend to be shy and introverted, and make social contacts slowly. Some will show bully characteristics. On the other hand, children that exhibit a better ability to delay gratification will also be more verbally fluent, more skilled in rationalizing their behavior, more attentive and better able to concentrate. They are better at planning, more curious and open to new experiences. Early forms of training should be based on showing the relationship between the delay of gratification and the potential increase in the size of the reward ("if you do not eat this cake, you will get money that will let you buy more cakes or ... whatever you decide to buy"). In short, the ability to master one's own emotional drives may be as important as the standard school curriculum.

A blind belief that every normal child can be trained to be a genius can be dangerous if there is a disparity between the parent's expectations and the actual results, esp. if the parent is not emotionally ready to accept setbacks. There are quite a number of cases of prodigy kids who reached astonishing skills at a young age only to fizzle out later in life. The most notorious, although frequently misrepresented, case is that of William James Sidis. His father, Boris Sidis, started with assumptions similar to Laszlo Polgar's. His training plan worked out great in William's early years. However, William's rebellious nature led him to a lifestyle that many considered wasteful, esp. in the context of his amazing mental skills. In essence though, there is nothing wrong with a prodigy child leading a withdrawn or "normal" life.

William James Sidis: The smartest brain in history

Some sources claim that William James Sidis (born 1898) was a genius who reached the highest estimated IQ of all people in history. Yet you will not find his name among Nobel Prize winners or famous writers. William James Sidis is important for our analysis of genius mostly because he is often quoted as a textbook example of an early prodigy burnout. Authors quote his life story to warn parents against being overly zealous in bringing up their prodigy children. William James Sidis became a poster child of the adage: Early ripe, early rot. His case is juxtaposed with those of the Polgar sisters, Norbert Wiener, or John Stuart Mill. Incidentally, the latter is also credited by some with the highest IQ in history. Note, however, that the concept of IQ was developed decades after Mill's death in 1873.

Formula for genius by Boris Sidis

To understand the William James Sidis case, we need to look back to the days of his grandfather Moses Sidis, a Ukrainian Jew who was a well-to-do merchant and an intellectual. Moses Sidis, following the family tradition of early tutelage, took fatherly care of his son Boris Sidis, born in 1867 in Kiev. Boris Sidis was a prodigy in his own right and showed an early interest in poetry and languages. His childhood experiences probably influenced his thinking on the way he would bring up his own children. His life story also shows traces of an inclination toward social rebellion. During the czarist pogroms of the Jews in the 1880s, Boris was arrested for teaching peasants the skills of reading and writing. He spent two years in solitary confinement and was subject to severe torture. This may have subsequently shaped his nature as an intellectual with a profound scorn of ignorance. Hopeless and penniless, Boris Sidis emigrated to the US and quickly became a respected physician, innovative psychologist and a pioneer in the field of psychopathology. He was an intellectual of wide interests and extensive reading. He wrote profusely on a variety of subjects, showing deep concern for the future of society. In Philistine and Genius (1911) he wrote: "We spend on barracks and prisons more than we do on schools and colleges. What is the level of a civilization in which the cost of crime and war far exceeds that of the education of its future citizens?". His writings are peppered with radical ideas that may or may not have been the root of his problems in bringing up William James Sidis. Boris Sidis writes: "the school-system should be abolished and with it should go the present psychologizing educator, the schoolmaster and the schoolma'am". Boris Sidis showed deep-rooted contempt for mindless memorization, irrational tradition, superstition, myths, creeds and dogmas. Opposition to irrationality should be harmless unless it is tainted with negative emotion. In Philistine and Genius (Chapter II), Sidis wrote angrily:

... uncritical belief in authority, meaningless imitation of jingles and gibberish, memorization of mother-goose wisdom, repetition of incomprehensible prayers and articles of creed, unintelligent aping of good manners, silly games, prejudices and superstitions and fears of the supernormal and supernatural, are censured in adults, why should we approve their cultivation in the young? "At home and at school we drill into the child's mind uncritical beliefs in stories and tales, fictions and figments, fables and myths, creeds and dogmas which poison the very sources of the child's mind. At home and at school we give the child over as a prey to all sorts of fatal germs of mental diseases and moral depravity

Boris dedicated a substantial effort to making his son an example of what education can do to a young human brain: "Education must aim at the bringing out of the genius in man"; "I appeal to you, fathers and mothers, and to you, liberal-minded readers, asking you to turn your attention to the education of your children, to the training of the young generation of future citizens"

Unfortunately, Boris's early success in bringing up William James gave little warning of the sad things to come. In the spirit of avoiding "uncritical belief in authority", his son went on to defy the authority of others, including that of his own father. Ultimately, William James Sidis became estranged from his parents. The unorthodox father, Boris Sidis, died suddenly of a cerebral hemorrhage in 1923, at the early age of 56.

Amazing prodigy

Young William James Sidis was tutored relentlessly from the cradle. He could read English at the age of 3. He amazed everyone around him at the age of four when he would use a typewriter to write in English and in French. When he was five, he could already speak five languages. He would read Plato in Greek. At the age of nine he was ready to pass the entrance exams to Harvard but was not admitted on the grounds that he was not mature enough for college life. At 11, after three years of trying and being refused because of his youth, young Sidis entered Harvard. His stardom began with a lecture about "Four-Dimensional Bodies". After the lecture, Prof. Daniel Comstock of MIT hailed him as material for the greatest mathematician of the 20th century. The New York Times picked up the story and gave him national prominence. Norbert Wiener, William's Harvard contemporary, wrote in 1953 about the lecture: "The talk would have done credit to a first- or second-year graduate student of any age. Sidis had no access to existing sources so that the talk represented the triumph of the unaided efforts of a very brilliant child"

Prodigy downfall

At Harvard, young Sidis appeared to be rather poorly adapted emotionally and socially. His colleagues considered him an eccentric. He was reclusive and did not make many friends. He is said to have suffered from various mental problems at that time and had to be treated in his father's Psychotherapeutic Institute. After 4.5 years, William James Sidis nevertheless graduated from Harvard cum laude in 1914. At about the same time, other Harvard prodigies graduated without receiving a fraction of the attention granted to Sidis. The list includes Norbert Wiener, Buck Fuller, and Roger Sessions. On the graduation day, Sidis was in the company of students such as Julius Spencer Morgan, Gilbert Seldes, Vinton Freedley, and Laurence Schwab. Sidis told a reporter: "I want to live the perfect life, and the only way to live the perfect life is to live it in seclusion". He went on to study law at Harvard Law School, which he quit shortly before graduation. In 1918, Sidis began teaching mathematics at Rice University, Texas. Rice students would ridicule their childish teacher for eight months. Sidis was constantly annoyed with the attention of the media. Not feeling at home in academia, William James Sidis left his academic pursuits once and for all in 1918. In the meantime, he espoused leftist ideas and became a budding Marxist. In 1918, he was arrested during a May Day anti-war rally in his home town of Boston. He was accused of inciting a riot and sentenced to 1.5 years in jail. With his father's backing, the sentence was not served. However, the young genius William James Sidis disappeared from the public eye. In 1923, a reporter found him in New York working a menial job. Sidis told him he was not the genius people wanted him to be. He also declared his wish to remain anonymous for the rest of his life. Sidis expressed a simple desire to focus on undemanding jobs that gave him more satisfaction. And that seemed to be the end of the William James Sidis miracle.

In 1937, the New Yorker dug out the young prodigy story again and ran a piece entitled "April Fool". In the article, Sidis is reported as saying: "I hated Harvard (...) anyone who sends his son to college is a fool -- a boy can learn more in a public library". The article paints a picture of an incredible young Harvard genius who went on to live the life of a low-paid bookkeeper. Sidis considered the article a gross violation of his privacy and sued, ultimately receiving a small out-of-court settlement (see: US Supreme Court decision).

One of the reasons for the lasting ridicule of Sidis was one of the books he wrote. While most people never write any book in their lifetime, when it transpired that Sidis, the future greatest mathematician, had written a lengthy hobby book about streetcar transfers, eyebrows went up. A transfer is a piece of paper given to passengers who want to continue their journey on a connecting route. Sidis collected streetcar transfers obsessively, and considered it a "reasonable" hobby, which he called peridromophilia. His classification of transfers was Aristotelean, his scrupulousness Darwinian. Yet the book, published in 1926 under the pseudonym of Frank Folupa, was not found to be a reason for admiration. Supposedly, this was "the most boring book ever written". Throughout his life, Sidis struggled to escape the publicity and the aura of expectation. To no avail. When he died of a cerebral hemorrhage in 1944 at the age of 46, he sparked yet another suspicion in reference to early prodigy training: "too much thinking can harm your brain, too". Sidis's father had died early of the same affliction (see above).

Fallen star or an anonymous giant

It is not true that Sidis contributed little to society beyond providing a remarkable case study for the growth and decline of genius. The only book he published under his own name was The Animate and the Inanimate, in 1925, at the age of 27 (as befits a true rebel, he argued spuriously therein against the second law of thermodynamics). In 1979, another giant of the century, Buckminster Fuller, rediscovered the book and wrote in a letter: "Imagine my excitement and joy on being handed the xerox of Sidis' 1925 book, in which he clearly predicts the black hole" (Subramanyan Chandrasekhar came up with the idea only a few years later). Incidentally, Buck Fuller was Sidis's classmate at Harvard and was saddened by William's story. Unlike Sidis, Fuller bloomed late in his life. In 1927, at the age of 32, bankrupt and jobless, he was on the verge of suicide. However, he decided to embark on an experiment: to see what a single individual could contribute to change the world and benefit all humanity. Documenting his life scrupulously in a half-century daily diary, Fuller went on to produce miracles of the mind (an array of inventions and 47 honorary doctorates are just a drop in the bucket of Buck's achievements).

Fuller got Sidis's book from Dan Mahony, a researcher who decided to unearth the truth about Sidis and answer the question: Where has all the genius energy gone? Mahony notes that Sidis's interest in the Okamakammessetts Indians may have resulted in his espousing the principle of anonymous contribution to society. Mahony began a search for possible anonymous contributions of Sidis. One of the fruits of this search is a remarkable 600-page unpublished manuscript on the history of the red race in America, dug out from a dusty attic: "The Tribes and the States". Lots of evidence indicates that this book, packed with most unusual theories, had been written by young Sidis. The search continues (see also: Sidis discoveries; incl. Sidis's patent on a perpetual calendar and its picture, Unconscious Intelligence where teenage Sidis self-analytically writes about how the subconscious brain works, Sidis's letter written shortly before his death with his troubles attributed to being a conscientious objector, http://home8.swipnet.se/~w-80790/Q&A/Q&A_2.htm. and http://www.uh.edu/engines/epi969.htm).

Sidis Fallacy

Multiple studies indicate that young prodigies end up better adapted to life in society than their peers (incl. the famous Terman Study). The Sidis case is an exception rather than the rule. One theory says that an increase in IQ is welcome only up to a certain point, beyond which it leads to extreme attitudes and social problems. Another theory says that there is an optimum span of time in which a domineering father figure helps a child's development. The inflection point comes at the moment when a young man crystallizes his own views on how to run his life. Once the father's and the child's goals diverge, fatherly domination may spark negative emotions in an immature mind. Instead of fostering further growth, these emotions may channel young energy in the wrong direction. The spiral of negative emotions escalates and the parent-child partnership becomes a struggle for dominance in goals and values. Parental dominance shows up, for example, in a strong correlation between the occupations of parents and children (e.g. among doctors, lawyers, musicians, etc.). In Sidis's case, fatherly supervision may have gone beyond this theoretical inflection point, and William James Sidis was transformed from an admired prodigy into a haunted freak forced into behaviors incongruent with his mental profile and his own set of values. There is no hard evidence that Boris Sidis made any major upbringing errors. His unrelenting tutoring and scorn of ignorance might have put too much burden on William, yet this could only be judged in retrospect. Boris's radical views might have affected the family atmosphere: less love, more rigor (in contrast to the warm atmosphere of the families of Bohr, Darwin, etc.). The theory of reserve energy might have pushed Boris to neglect the natural circadian underpinnings of healthy mental effort. Early in the 20th century, knowledge of the function of sleep in learning was scant. Even today, many students fall into the treacherous trap of believing that you can learn more by adding study hours and cutting down on sleep (this is how the cruel myth of polyphasic sleep gained a foothold in numerous young lives). Kathleen Montour believes that the Sidis Fallacy has badly affected the education of gifted children. In her view, a rational approach to special education for gifted children can only lead to positive results, and the main obstacle is what she calls a creator parent, whose ambition for the child's future is excessive in proportion to his or her knowledge of the psychology of growth. Yet, as a psychiatrist, Boris Sidis could not have been better primed to be a good educator. The question may never be answered satisfactorily.

Young rebels

The world is populated with young rebel geniuses. They tend to be radical. After all, only a genius mind fully exposes the imperfections and contradictions of the surrounding world. Young geniuses power rebel environmentalist movements, animal-rights groups, the open source community, human rights organizations or groups for legalizing narcotic drugs. They espouse socialism, communism, or ... vegetarianism. They struggle to abolish nuclear power, GM foods, human cloning, urban sprawl, McDonald's, air-polluting cars, stem cell research, animal testing, etc. They often abhor the riches of Bill Gates, look with horror at the George W. Bush presidency, or glorify Linus Torvalds, Mike "Gorbie" Gorbachev or Nelson Mandela. Driven by similar but often not completely crystallized motives, they join opposing camps. Some end up fighting for a monolingual world, others struggle for "language ecology" against the "linguistic imperialism of English". Some will fight for the United States of the World, while others see globalization as a veil under which global corporations exploit poorer countries in Africa, Asia and South America. Those radical minds usually end up being highly productive and balanced individuals. They learn to temper negative emotions and start to favor maths over myths.

William James Sidis definitely grew up to be a young rebel. It is difficult to diagnose how much of his attitude was pathological or neurotic. His fascination with Indian tribes and support for anonymous contributions to society could have concealed his true opus vitae. The peer-review pressure and publish-or-perish principles have driven many bright minds away from academic pursuits. A contemporary, little-known dyslexic rebel genius, Dr Robert Skoyles, writes: "I prefer to be without an income and academic address. I would rather starve than be ignorant. I am puzzled why many bright people allow themselves to be time-lobotomized by the const