Ability & Disability

The case of an 82-year-old woman with probable Alzheimer’s disease (AD), who developed unusual artistic creativity after the onset of her disease, is described, and the possible pathogenetic mechanism is discussed. The patient showed no inclination toward the visual arts in her premorbid years. However, four years after the onset of AD-suggestive symptoms, she began painting beautiful pictures, rather impulsively. Some of these paintings have been praised even by a qualified art expert. Such de novo development of artistic creativity had previously been described in subjects with the semantic form of frontotemporal dementia (FTD), but not in AD. The prevailing concept of lateralized compromise and paradoxical functional facilitation, proposed in connection with FTD subjects, may not apply in AD, where the involvement is more diffuse and more posterior in the brain. Hence, the pathogenetic mechanism in the case described remains uncertain; possibilities are discussed.

Annals of Indian Academy of Neurology


To summarize, the exact mechanism behind the prolific artistic creativity in the present case remains a mystery. The question of why this phenomenon is not observed in more patients with AD also remains unanswered.

The Mathematician and the Mystic

“Once there were a brother and sister who devoted themselves to the search for truth. A brother who spent his long life solving problems. A sister who died before she could solve the problem of life.” The sister was Simone Weil (pronounced “vay”), a philosopher and political activist who died in 1943 at age thirty-four and gained fame with the posthumous publication of works, assembled from her voluminous notebooks, on society, justice, and the mystical life of faith. Her elder brother André, who lived to ninety-two, was a prodigy who became one of the twentieth century’s preeminent mathematicians.

Book review at New Atlantis

The Weil Conjectures: On Math and the Pursuit of the Unknown

Crisis in Cosmology

The two main components of the Universe that we’re familiar with are ordinary matter and light. The matter is mostly atoms, and the light mostly microwaves.

The matter is strongly clumped together into stars and galaxies separated by vast regions of empty space. But the light is distributed with almost perfect uniformity. The cosmic microwave background is the same everywhere, with differences only in the third decimal place: the microwave temperature ranges from 2.721 to 2.729 Kelvin across the sky. Matter in a star, by contrast, is 26 orders of magnitude denser than the isolated atoms we find in the space between galaxies.

We’d like to understand the difference in terms of gravity pulling so much more strongly on the matter than on the microwaves*. Fair enough. Let’s go back to a time when light and matter were stuck together. Much earlier in the history of the universe, the radiation was much more energetic and the temperature was much hotter. When it was hotter than about 10,000 K, there could be no atoms: the heat was enough to free electrons from their atoms, creating a gas of charged particles called a plasma.

A gas made of atoms is transparent; light goes right through it. But a plasma is like a dense fog that traps light. So in the hot, early universe, light and matter were tied together, such that hotter places were also denser. Random, white-noise waves in this plasma are what we think account for today’s tiny differences in the microwave temperature across the sky.

When the universe was 400,000 years old, the electrons and protons became cool enough to settle down into atoms, and from that point, matter and light were free to go their separate ways. Gravity caused the matter to start clumping, while the light streamed in all directions, nearly unaffected by gravity.

This is a story that was known by the end of the last century, when I was studying astrophysics. In 1997, the observations and the calculations became accurate enough to ask whether the known force of gravity could account for all the clumping of matter that has occurred since the universe was 400,000 years old. The answer was “no”, and cosmologists started looking for some previously unknown form of matter, which they called “dark matter”. Compared to the electrons, neutrons, and protons of ordinary matter, there would have to be about 5 times more dark matter to account for the observed clumping.

This is an embarrassment, of course. Scientists don’t like to make arbitrary assumptions about a substance that is all around us but which streams right through ordinary matter without interacting, and which evades every form of detection we have ever devised.

But this past summer, the situation has gotten worse, precipitating what Joe Silk calls a “crisis in cosmology”. The crisis is this:

What happened in 1997 was that two lines of evidence converged to answer a long-standing question in cosmology: how fast is the expansion of the universe slowing down under mutual gravitational attraction? The first line of inquiry came from refinement of the project begun by Edwin Hubble in the 1920s. The redshift of various supernovae in distant galaxies (redshift comes from recession) is plotted against the apparent brightness of those supernovae (apparent brightness is dimmed by distance). The second line is to count galaxies farther and farther away to get a sense of how much space is out there. Our intuition tells us that if you look out to a distance r, you see a sphere with area 4πr². The number of galaxies at distance r should be proportional to 4πr². But in general relativity, space can be structured differently from this: a lot of mass would create a strong gravitational pull, so there is actually less space than 4πr² at very large distance r.

The two lines are tied together by general relativity, which relates the structure of space to the matter in it. Both lines agreed: the expansion isn’t slowing down; it’s speeding up. And there is actually more space than 4πr² at large distance r. The implication was that the average mass density of the universe is negative. In response, cosmologists invented the concept of dark energy, a hypothetical substance with negative mass, something no one has ever seen on earth or in space. Dark matter is a different, unknown substance: it has a gravitational pull, clumps up like ordinary matter, and thins out as the universe expands. But dark energy is spread uniformly and has the same density today as it did when the universe was small. You can’t dilute it.
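The Euclidean intuition behind the galaxy-count test can be checked with a few lines of arithmetic. A minimal sketch, assuming a uniform density of galaxies in perfectly flat space (the density value and shell sizes are arbitrary illustrations, not observational numbers):

```python
import math

def shell_count(r, dr, density=1.0):
    """Galaxies in a thin shell [r, r+dr] in flat (Euclidean) space:
    surface area 4*pi*r^2, times thickness dr, times number density."""
    return density * 4 * math.pi * r**2 * dr

# In flat space, doubling the distance quadruples the count per shell;
# a curved universe would show up as a deviation from this ratio.
near = shell_count(r=100.0, dr=1.0)
far = shell_count(r=200.0, dr=1.0)
print(far / near)  # -> 4.0
```

A real survey compares observed counts against this flat-space baseline; more galaxies than 4πr² predicts means more space out there than Euclidean geometry allows.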

This is strange enough, but Silk’s crisis is a further paradox. There is a third way to measure the average density of gravitating matter in the universe, and that involves gravitational lensing of radio waves from the 3-degree background. These radio waves are left over from the glow of ordinary matter when it was opaque, less than 400,000 years after the Big Bang. They are almost but not quite uniform, for reasons that physicists like to explain as statistical fluctuations. And these tiny fluctuations are distorted when we look at them because of gravitational lensing from the matter that the radio waves have passed through along the way. How much gravitating matter would it take to cause the lensing? The answer comes out positive, as though there were no dark energy. So this line of evidence says the expansion of the universe should be slowing down, not speeding up. And the galaxy count should increase less rapidly than 4πr² at large distance r, when the observations show that the count increases more rapidly.

The venerable Joe Silk says this is a “crisis”. Is that too strong a word? Maybe not. The whole science of cosmology is based on the conceit that the physics we study in the laboratory (and in particle accelerators) continues to work at the largest distances and the earliest times: the laws of physics were born with the Big Bang and are the same everywhere and for all time. We’ve already had to invent two strange new forms of mass-energy that have never been observed in the lab, which calls that conceit into question. If Silk is right, we might have to modify the laws of physics themselves, and then anything goes. Our conceit was unjustified; physicists would have to say that the laws we know and understand can’t explain what we see. We would lose the science of cosmology. Silk puts the discrepancy at 3.4 standard deviations, a 99.93% probability that it is not a statistical fluke.
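The correspondence between 3.4 standard deviations and 99.93% is just the standard normal distribution; it can be verified with the error function from Python's standard library:

```python
import math

def two_sided_confidence(z):
    """Probability that a standard normal variable falls within +/- z sigma."""
    return math.erf(z / math.sqrt(2))

# 3.4 sigma corresponds to roughly 99.93% two-sided confidence.
print(round(two_sided_confidence(3.4) * 100, 2))  # -> 99.93
```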


*Microwaves and other forms of light are also subject to gravity, but the force is too small to be important.

What is the cosmic web? - Big Think

Does the Brain Generate Consciousness or Channel Consciousness?

There are two ideas about the relationship between conscious awareness and the brain.

  1. The brain creates consciousness through its computational functions. Consciousness is an emergent feature of physical computation having attained a certain kind of complexity. For example, Hofstadter says that consciousness arises when a computational system is able to reference itself.
  2. Consciousness is a primitive element of reality with an existence independent of matter. The brain is a transceiver that channels sensory input from the physical world into consciousness, and transduces intention into thought, nerve signals, and ultimately into motion of the body. This idea is associated generally with Descartes, and it was articulated explicitly by William James.

Many of the smartest people in the world are trained in physics and steeped in computation. Overlapping this group is the set of researchers in artificial intelligence. Almost all of these people favor #1.

The Scientific World-View that has arisen over the last 150 years regards physical matter and its fields as a closed system in which all causality is accounted for. There is no need, nor indeed room, for consciousness as a separate element of reality.

Weighing in for #2 is a 30-year course of experimentation in the Princeton University laboratory of Robert Jahn and Brenda Dunne, 1975-2005. Jahn had a PhD in physics, and was department chair and at one time Dean of the School of Engineering. Their signature experiment demonstrated the ability of human consciousness to modify probabilities that are treated as random in standard quantum mechanics.

They used voltage fluctuations in a Zener diode to generate random numbers, and asked self-selected experimental subjects to sit in front of a computer screen and use their will to make some function of the quantum-random input higher or lower in each trial. At the end of 800,000 trials, the difference between those in which the subjects intended “high” and intended “low” was over 5 sigma. The probability of this occurring by chance is less than 1 in 10 million.
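The tail-probability arithmetic here is standard. A minimal sketch of the calculation (this is the generic normal-distribution computation, not the original analysis; the z-values below are illustrative, with "over 5 sigma" meaning somewhere past 5.0):

```python
import math

def one_sided_tail(z):
    """One-sided tail probability of a standard normal: P(Z > z)."""
    return 0.5 * math.erfc(z / math.sqrt(2))

# At exactly 5 sigma the one-sided odds are about 1 in 3.5 million;
# around 5.2 sigma and beyond, they pass 1 in 10 million.
print(one_sided_tail(5.0))  # ~2.9e-7
print(one_sided_tail(5.2))  # ~1.0e-7
```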


If you believe the result of Jahn and Dunne, it is decisive evidence in favor of paradigm #2.

(Incidentally, in parallel to the two views of the mind, there are two views of the measurement problem in quantum mechanics. 

  1. The wave function never collapses, and quantum probabilities derive from the relative number of branches of the universe in which an observer might find himself. (Everett, de Witt, Tegmark) 
  2. The wave function collapses when the result of a measurement is registered by a conscious being. (Bohr, von Neumann, Wigner, Bohm)

The results of Jahn and Dunne provide support for #2 here as well, and the Inverse Quantum Zeno Effect provides a mechanism. Kauffman’s experiments on quantum criticality in superposition states of neurotransmitters are evidence that the brain is evolved not for maximum reliability (as we engineer computers) but rather for the opposite: maximum quantum indeterminacy at the level of individual neurons.)

Why do so many smart physicists and computer scientists ignore the experiments of Jahn and Dunne? Some don’t know about the experiments, which mainstream physics journals have refused to publish. Most assume there must be some mistake in Jahn’s methodology, and feel confident enough in their paradigm that they don’t judge it a worthwhile use of their time to delve into details of the experiments looking for the error. We should not underestimate the psychological power of cognitive dissonance. Scientists, like most humans, are subject to confirmation bias and herd mentality.

Here are some reasons we should take Jahn and Dunne seriously:

  • Jahn was uber-competent as a statistician and experimental scientist.
  • Recognizing the high burden of proof to which he was subject, Jahn included far more confirmations and calibrations in his experimental design than would be expected in any other kind of physics experiment.
  • The most direct check was to substitute computer-generated pseudo-random numbers for the quantum random numbers. When they did this, results were within 2 sigma of expectation.
  • Jahn and Dunne’s results were replicated directly in two European laboratories.
  • Working with a very different physical system but just as much experimental rigor, Dean Radin has also found an effect of human consciousness on quantum systems.
  • The effect that Jahn found was ~10⁻⁴, a few bits altered out of every 10,000, with subjects drawn randomly from the population. Experiments using screened, talented psychics yield much larger effects. Effect size also rises when subjects have an emotional stake in the outcome, rather than just a random number on a computer screen.
  • Jahn was not motivated to fabricate or inflate his results. In fact, simply because he took parapsychology seriously, he paid a high price in his personal life, in his academic life, and in his career as an aerospace engineer.
  • Jahn and Dunne’s result is much less surprising in light of a well-established and well-maligned science of experimental parapsychology that has produced a body of knowledge which the Scientific World-View is at pains to accommodate.

If our commitment to the methodology of science is to have any meaning at all, we have to pay careful attention to replicated findings that contradict our fundamental theories.

Is this guy for real?

Eric Weinstein is deeply disaffected from the academic community that provided his pedigree. He holds a PhD in mathematical physics from Harvard (1992). But he has never held an academic job, and has been working largely in isolation on the Big Enchilada, the Theory of Everything, the Unified Field Theory, the integration of quantum field theory with Einstein’s theory of gravity that the biggest brains in physics have been trying to crack for lo these last 80 years.

Last month, Weinstein came forward with details of his Geometric Unity theory. He claims that a 14-dimensional space naturally divides into the 4 dimensions that we all know and love plus 10 dimensions of spinors, which Paul Dirac identified 90 years ago as the appropriate wave function form for electrons and the other familiar particles.

I have a PhD in theoretical physics, but his presentation is way over my head. There are probably 20 or 30 elite physicists in the world who are qualified to judge whether Weinstein’s theory is sound. All of them are already committed to one or another approach of their own. Where is the motivation for them to invest months studying and evaluating Geometric Unity, which comes from an outsider, and which is unlikely to enhance their own professional standing, no matter how the theory eventually fares?

Weinstein has claimed that academic science is populated by the most brilliant minds on the planet, but that their university culture is deeply troubled and unable to assimilate truly new ideas. He says that anyone whose thinking is radical enough to have a chance of solving fundamental new problems will not survive in today’s academic environment. From my personal experience, I agree.

What I and many others find hard to understand, and even suspicious, is why Weinstein has not submitted a journal article so that his ideas can be evaluated through established channels.

Paul Erdős

Paul Erdős, born this day in 1913, was a prolific and eccentric Hungarian mathematician.  He was known both for his social practice of mathematics (he engaged more than 500 collaborators) and for his eccentric lifestyle. He lived out of a suitcase, and traveled the world, staying with other mathematicians, talking shop long into the night, sleeping on the living room sofa. “Property is a nuisance.” He devoted his waking hours to mathematics, even into his later years—indeed, his death came only hours after he solved a geometry problem at a conference in Warsaw.

Erdős asked questions only a mathematician could love (or understand) and proposed problems for others in the most abstract and recondite areas of math, but also in practical questions of approximation, set theory, and probability. Much of his work centered around discrete mathematics, cracking many previously unsolved problems in the field. Overall, his work leaned towards solving previously open problems, rather than developing or exploring new areas of mathematics.

Erdős published around 1,500 mathematical papers during his lifetime, a figure that remains unsurpassed. He firmly believed mathematics to be a social activity, living an itinerant lifestyle with the sole purpose of writing mathematical papers with other mathematicians. Erdős’s prolific output with co-authors prompted the creation of the Erdős number, a point of pride for all mathematicians of a certain age. Your Erdős number is 1 if you co-authored a paper with Erdős, 2 if you have co-authored a paper with someone whose Erdős number is 1, 3 if you co-authored a paper with someone whose Erdős number is 2, etc.

[This text was edited from his Wikipedia entry.]

Ancient technology exceeding anything we can do today

It’s a technical argument dependent on engineering and material science, but well worth following in detail because it is some of the solidest evidence we have that human history is in need of a thorough rewrite.

There are drill holes in ancient Egyptian granite that could only have been made with a very sharp, very hard drill under enormous pressure. Granite is so hard that it can only be cut by corundum or diamond. And spiral scars on the cores removed from these holes tell us that with every turn of the core-drill, the blade advanced several millimeters into the stone. The drills we have today operate at high speed, and with each turn they advance much less than 1 mm into the stone. Hence the conclusion that these ancient drills were made of a very hard material and were backed by several tons of pressure.


Traditional archaeology dates these drill holes to the earliest dynasty or pre-dynastic Egypt, and people at that time did not have iron, let alone diamond drill bits or power tools. They had not yet invented the wheel.

A growing consensus among rebel archaeologists is that an ancient high-tech civilization left us the Pyramids and the megaliths of Machu Picchu and some other examples of precision work with giant pieces of stone. Their technology was lost in a global catastrophe that suddenly ended the last ice age, 12,700 years ago. Civilization rebooted from a few surviving hunter-gatherers, but our surviving myths of floods and a race of gods remind us today of that earlier civilization.

One of my all-time favorite human beings

It is remarkable that mind enters into our awareness of nature on two separate levels. At the highest level, the level of human consciousness, our minds are somehow directly aware of the complicated flow of electrical and chemical patterns in our brains. At the lowest level, the level of single atoms and electrons, the mind of an observer is again involved in the description of events. Between lies the level of molecular biology, where mechanical models are adequate and mind appears to be irrelevant. But I, as a physicist, cannot help suspecting that there is a logical connection between the two ways in which mind appears in my universe. I cannot help thinking that our awareness of our own brains has something to do with the process which we call “observation” in atomic physics.

— Freeman Dyson, 1923 – 2020

My favorite Dyson story is the one that got him started in his career. Twenty years after quantum mechanics, 40 years after relativity, no one had been able to combine the two in a coherent, unified framework. Then, in 1948, there was an embarrassment of riches. Julian Schwinger, a fireplug who spouted equations as if by direct channel from God, told us it was all about particles and sources.


Richard Feynman, who played the bongo drums and made wisecracks, told us that while we’re not looking, the electron does every possible and impossible thing, and he gave us a cartoon system for keeping track of what all those things are.


Dyson was a 24-year-old grad student, and undeterred by their mutually antagonistic personalities, he befriended both of them, channeled both their thought processes, and published a paper in which he demonstrated that the two theories, though they looked so different on paper, always produced the same predictions.

After that, the rigid hierarchical system of academic physics carved a place for him, and allowed him to do whatever he wanted, so that he never again had to take a test or qualify for a degree. He hung out at Princeton’s Institute for Advanced Study, home to Einstein, Oppenheimer, von Neumann, and Gödel.

He was an original and independent thinker on all subjects. He thought about culture and politics and technology as well as the deep structure of science. He was still writing for the New York Review of Books in his 96th year.

Dyson had a gift for the memorable line and a disarming honesty that admitted the possibility of error. It was, he would say, better to be wrong than to be vague, and much more fun to be contradicted than to be ignored. Dyson was by instinct and reason a pacifist, but he understood the fascination with nuclear weaponry.
Guardian Obit

“The more I examine the universe and the details of its architecture, the more evidence I find that the universe must in some sense have known we were coming.”

What do you make of this?

This material stretches our credulity, and clearly Dr. Wood doesn’t expect to be believed. Nazis landing on the moon in the 1930s? Races of lizard-like aliens maintaining earth bases in Antarctica to this day?

What he documents most convincingly is that there is a sophisticated CIA program of deceit to keep us from knowing the truth about an alien presence on earth. Maybe Dr. Wood is part of that program. Maybe much of what he reports is an elaborate fantasy designed to deceive. But he, along with other sources, has convinced me that there is something there worth covering up, and this bare fact is astounding.



We could be SOOO much healthier

Do you know that there’s not a pharmaceutical drug on earth that works for more than seventy percent of the population? Not one drug. Pharma companies consider a drug a success if it’s effective in a much smaller proportion of patients.
— Zia Haider Rahman

Thanks to Sanders and Warren, the Democratic debates are highlighting the huge inefficiencies and inequities in the way we pay for medicine in America. Health care in America costs twice as much per capita as other modern, industrialized countries, and our outcomes are worse than all of them [Harvard Gazette]

But even more important than changing the way we pay for medicine is changing the way we practice medicine, and this is a discussion that has been marginalized.

The gold standard for validating a treatment is the placebo-controlled, double-blind (PCDB) study. If you practice medicine that is not based on PCDB studies, you can’t get third-party payments, and you can’t even get malpractice insurance.

And yet, we know that PCDB studies are effective validation for only about a quarter of medical practice: half of a half. We’re excluding ¾ of what we know to be effective.

“Placebo-controlled” means that we are focused on the body, not the mind. We deliberately exclude from study anything that works through the mind, treating it as an annoying artifact in our scientific study. Medicine that works with the mind as well as the body can be twice as effective. Yet medical employers ensure that doctors’ calendars are so crowded that they have no time to develop a caring relationship with their patients. We make sure our doctors function only as diagnosticians and prescribers, confining them to the part of the job that computer algorithms can actually do better. We forbid them to function as healers, or to bring empathy, intuition, and caring to their practice.


The structure of a PCDB study specifies uniformity. Every subject in the study receives the same treatment. We know that choosing the right treatment for each individual patient is half the story, and yet we are not even studying individualized medicine, let alone practicing it. Genetics, personality, and the microbiome make each patient unique; yet every medical intervention in use today has to be validated in a study that treats patients as if they were the same.


Medicine could be at least four times as effective for the same expense and effort, based on individualized medicine, and considering the mind together with the body. And this is in addition to the low-hanging opportunities to eliminate insurance overhead and administrative costs which are peculiarly American inefficiencies.