Mid-morning.
Today is your son-in-law’s birthday, as well as that of the ‘Griffith’ twins, Kay and Ann – a reminder to wish them well. – Amorella
0838 hours. It is a beautiful summer morning, low humidity and in the low seventies.
We have the windows open and Jadah and Spooky are quite content with this
arrangement. Carol and I saw a fawn traveling through the back tree line, in front of our cement slab in the woods. She still has her spots – I can’t tell if she’s the same one I saw last week or not.
1714 hours. Two good articles were published on the BBC today. I think both are interesting and might actually be useful somewhere down the pike.
I agree. Post, boy. - Amorella
** **
BBC Future: “Why true coincidences are hard to find”
By William Park
11 July 2016
It
would appear we cannot go for more than a few days in 2016 before there is
another high-profile celebrity death or mass shooting reported somewhere. More
recently, in the space of a fortnight Harambe the gorilla was killed for putting the
life of a toddler in danger and an American alligator killed a two-year-old in
Florida. And in the UK, in little more than a week, the prime minister, the leader of Ukip, the England football team manager, the host of the BBC’s Top Gear and 63 key Labour figures all resigned.
But seemingly coincidental headlines, or spates of disasters, are more likely to reflect how we consume news and save memories.
“The
sceptic’s response is that a coincidence is bound to happen if you have a
sample large enough,” says Bernard Beitman, visiting professor at the
University of Virginia and author of the book Connecting with Coincidence. “If
you toss a coin 1,000 times, through chance the odds are that you’ll have a run
of seven or eight consecutive heads. However, if you tried to toss eight
consecutive heads it’s incredibly unlikely. But, because in the first example
you’re dealing with a large sample, it’s not surprising that something seemingly
unlikely happened.
“This
is a random clustering effect of events. For example, with aircraft accidents,
there are probably dozens each year, and eventually you’ll have four or five
that all happen in the same week purely by random clustering. It doesn’t
necessarily mean that week was much more dangerous for global air travel,
though.”
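A side note of my own, not part of the article: Beitman’s coin example is easy to check with a short, illustrative Python simulation (the numbers below are rough, simulated estimates, not figures from the article).

import random

def longest_head_run(n_tosses):
    # Length of the longest run of consecutive heads in n_tosses fair flips.
    best = run = 0
    for _ in range(n_tosses):
        if random.random() < 0.5:  # heads
            run += 1
            best = max(best, run)
        else:
            run = 0
    return best

# Repeat the 1,000-toss experiment many times and count how often a run of
# at least seven heads turns up purely by chance.
trials = 2000
hits = sum(longest_head_run(1000) >= 7 for _ in range(trials))
print(f"A run of 7+ heads appeared in {hits} of {trials} trials")

# By contrast, the chance of getting eight heads on eight named tosses is
# (1/2)**8, roughly 0.4% – Beitman's point about a large sample versus a
# single attempt.
print(f"P(eight heads on eight specified tosses) = {0.5**8:.4f}")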
Apparent spates of disasters, like aircraft accidents, are more likely the result of random clustering.
“A
sceptical statistician will say ‘in a large population, any strange thing could
happen’, and dismiss the coincidence as random.”
However,
there’s a problem with using the word random. “The statistical definition of
random would be that two events are entirely unconnected,” he adds. “But in
reality, it’s almost impossible to prove that two events are unconnected,
particularly when we’re aware of so much of the discourse in the media. A
random universe is hard to prove.”
Consider
reports of apparently simultaneous discoveries.
Alexander Graham Bell and Elisha Gray, for example, both had their lawyers file patents for the telephone in different offices on the same day – February 14, 1876. Both designs, and the prototypes tested, contained many similarities and some key differences.
Bell’s patent application was judged to show he had achieved a working design earlier than Gray.
However,
this isn’t such a massive coincidence. Their work was based on research being
carried out by themselves and many other scientists experimenting with the
exciting new field of telephonography. The inventions didn’t magically appear out of thin air on exactly the same day; they were an end goal being worked towards, and in the end it was a race to prove a working design first. It’s not surprising, then, that two researchers
both trying to design an instrument for transmitting and receiving vocal sounds
filed their patents at the same time.
In 2016 there were a ‘phenomenal’ number of high-profile deaths reported.
Even
the apparent increase in the number of celebrity deaths reported this year can
be explained in a similar way. According to the BBC’s obituary editor, Nick
Serpell, in the first three months of 2016 there were a ‘phenomenal’ number of high-profile deaths reported,
from David Bowie to Prince, compared to previous years. Between 1 January and
31 March, 24 obituaries were published, compared to five in the same period in
2012.
Serpell
suggests there are a few reasons why this might be the case: firstly, a rise in population 50 years ago means that more people are now dying. “People who
started becoming famous in the 1960s are now entering their 70s and are
starting to die,” he told BBC News.
“There
are also more famous people than there used to be,” he adds, as technology has
brought more public figures into our living rooms. “In my father or
grandfather’s generation, the only famous people really were from cinema –
there was no television.”
Similar
reasoning may explain a bizarre coincidence in the history of comics. In 1951,
two new series of comic strips – one British, one American – were published for
the first time. Both featured a young boy, his dog and the mischief the two got
up to – both were called Dennis the Menace. The two
comic strips are entirely unrelated, but were so similar they had to be renamed
when distributed in each other’s countries to avoid confusion.
The
chances of two cartoons with the same name and the same concept being published
within five days of each other on different sides of the Atlantic seem highly
unlikely, but as Beitman explains, such a coincidence reveals much about the
zeitgeist.
A demand for a certain type of story means news editors will look out for similar stories.
“There
are ideas in the air,” he adds. “Even though these two cartoons were published
entirely separately, the creators were probably drawing on ideas that were
circulating in the population. It’s the same with news events: similarities
between seemingly unrelated events crop up because we’re aware of current
zeitgeists.”
Random clustering could explain apparent spates in aircraft accidents.
This
is then reinforced by news editors who see a demand for a certain type of
story, whether it’s about celebrity deaths or animal attacks, and so are more
likely to cover similar events – however loosely related – to play up to the
zeitgeist, because they know that this is what people are talking about.
We’re
also more likely to remember remarkable coincidences, explains Beitman. We
think that bad news comes in threes, but in reality, pairs of news events are
far more likely to occur; we just remember when it comes in threes because
it’s more unusual. It reflects a bias in the way we store memories – unusual
events stand out to us more.
So if it seems like it never rains but it pours with depressing headlines, that may say more about your own temperament than the state of the world.
Selected and edited from - http://www.bbc.com/future/story/20160708-why-true-coincidences-are-hard-to-find
** **
BBC Earth: “The Quantum Origin of Time”
By Philip Ball
11 July 2016
Science
has a habit of asking stupid questions. Stupid, that is, by the standards of
common sense. But time and time again we have found that common sense is a poor
guide to what really goes on in the world.
So
if your response to the question “Why
does time always go forward, not backwards?” is that this is a daft thing
to ask, just be patient.
Surely
we can just say that the future does not affect the past because (duh!) it has
not happened yet? Not really, for the question of where time's arrow comes from
is more subtle and complicated than it seems.
What's
more, that statement might not even be true. Some scientists and philosophers
think the future might indeed affect the past – although we would only find out
when the future arrives. And it may be able to do so due to an emergent property of
quantum mechanics.
To all intents and purposes, time seems to have a direction.
Our
everyday experience insists that things only happen one way. Cups of coffee
always get colder, never warmer, when left to stand. If they are knocked to the
floor, the cup becomes shards and the coffee goes everywhere, but shards and
splashes never spontaneously reassemble into a cup of coffee.
Yet
none of this one-way flow of time is apparent when you look at the fundamental
laws of physics: the laws, say, that describe how atoms bounce off each other.
Those
laws of motion make no distinction about the direction of time. If you watched
a video of two billiard balls colliding and bouncing away, you would be unable
to tell if it was being run forwards or backwards.
The
same time symmetry is found in the equations of quantum mechanics, which govern
the behaviour of tiny things like atoms. So where does time's arrow come into
the picture?
There
is a long-standing answer to this, which says that the arrow only enters once
you start thinking about lots and lots of particles.
The
process of two atoms colliding looks perfectly reversible. But when there are
lots of atoms, their interactions lead inevitably to an increase in randomness
– simply because that is by far the most likely thing to happen.
Say
you have a gas of nitrogen molecules in one half of a box and oxygen molecules
in the other, separated by a partition. If you take away the partition, the
random movements of the molecules will quickly mix the two gases completely.
There
is nothing in the laws of physics to prevent the reverse. A mixture of the two
gases could spontaneously separate into oxygen in one half of the box and
nitrogen in the other, just by chance.
But
this is never likely to happen in practice, because the chance of all those
billions of molecules just happening to move this way in concert is tiny. You
would have to wait for longer than the age of the Universe for spontaneous
separation to occur.
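Another aside of mine, not from the article: the “tiny” chance can be put in rough numbers. Assuming, for illustration, that each of N molecules is equally likely to be found in either half of the box, the chance that a single snapshot catches them all back on their original sides is about (1/2)^N.

import math

# Order-of-magnitude sketch under the 50/50-per-molecule assumption above
# (my own illustration, not a calculation from the article).
for n in (10, 100, 10**6, 10**23):
    exponent = n * math.log10(2)   # (1/2)**n is about 1 in 10**exponent
    print(f"N = {n:.0e}: roughly 1 chance in 10^{exponent:.3g}")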
This
inexorable growth of randomness is enshrined in the second law of
thermodynamics. The amount of randomness is measured by a quantity called
entropy, and the second law says that, in any process, the total entropy of the
Universe always increases.
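(For reference – my addition, not the article’s: the standard quantitative form of this is Boltzmann’s entropy formula, with W counting the microscopic arrangements compatible with the macroscopic state.)

\[
S = k_B \ln W, \qquad \Delta S_{\text{total}} \ge 0 \ \text{for any real process.}
\]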
Of
course, we can decrease the entropy of a group of molecules, say by sorting
them one from another. But doing that work inevitably releases heat, which
creates more disorder – more entropy – somewhere else. Ordinarily, there is no
getting around this.
However,
the entropic arrow of time gets less well-defined at smaller scales. For
example, the chances of three oxygen and two nitrogen molecules briefly
"un-mixing" are pretty good.
This
was illustrated by a 2015 study.
Researchers studying single molecules found evidence that the growth of entropy
is a good measure of how far the system is from being reversible in time.
This
argument about entropy, which was worked out in the late 19th Century by the
Austrian scientist Ludwig Boltzmann, is often seen as a complete and satisfying
answer to the puzzle of time’s arrow.
But
it turns out the Universe holds deeper secrets. When you start looking at very small
things, Boltzmann's neat story gets increasingly muddled.
In
Boltzmann's picture, it takes a while for the arrow of time to find its
direction. In the tiny fractions of a second after the partition between the
two gases is removed, before any of the molecules have really moved anywhere,
there is nothing to show which direction of time is forwards.
Entropy
increases when collisions between atoms even out their energies, as for example
when the heat of hot coffee spreads out into the surrounding air. This process,
which washes away reservoirs of energy, is called dissipation.
Until
dissipation starts to happen, a process looks much the same backwards or
forwards in time. It does not really have a thermodynamic arrow.
But
there is a one-way process in quantum mechanics that happens much faster. It is
called decoherence.
At
the quantum scale, particles behave as if they were waves. This has peculiar
consequences.
For
example, if you shoot individual electrons or whole atoms through two closely
spaced slits in a screen, they will interfere with each other as if they were waves.
But this does not happen with ordinary-sized objects. If you throw two coffee
cups through two open windows, they do not interfere with each other.
Decoherence
explains why objects on the everyday scale of coffee cups do not show the
wave-like behaviour of quantum objects.
It
arises because quantum particles can be coordinated in their quantum waviness,
but if there are lots of them – like the countless atoms in a coffee cup – they
rapidly lose any coordination. This means the object they constitute cannot
show quantum behaviour.
Decoherence
happens because of interactions between the objects and their environment: for
example, the impact of air molecules on the cup. Quantum theory shows that
these interactions rapidly cause a large object's quantumness to
"leak" into its environment.
This
means the object takes on unique characteristics. Quantum theory tells us that
objects can show one of many possible properties when they are measured, but in
our everyday world objects only have a single well-defined position, speed and
so on. Decoherence is thought to be how this "choice" is enforced.
Quantum
decoherence is incredibly fast, because interactions between particles are
extremely efficient at dispersing quantum coherence.
For
a dust grain one-thousandth of a centimetre across, floating in air, the
collisions of other air molecules will destroy any quantum behaviour in around 0.0000000000000000000000000000001
seconds. That is a trillionth of the time it takes for light to cross the face
of a single hydrogen atom.
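A quick arithmetic check of my own (the round numbers below are assumptions of mine, not figures given in the article):

# Rough sanity check of the comparison above.
c = 3.0e8             # speed of light, m/s
d_hydrogen = 1.0e-10  # approximate diameter of a hydrogen atom, m
t_cross = d_hydrogen / c   # time for light to cross the atom, about 3e-19 s
t_decohere = 1e-31         # the decoherence time quoted above, read as ~10**-31 s
print(f"light-crossing time: {t_cross:.1e} s")
print(f"decoherence time / crossing time: {t_decohere / t_cross:.1e}")
# The ratio comes out near 3e-13, i.e. of the order of a trillionth,
# consistent with the article's comparison.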
This
is much faster than the time it takes for heat in a dust grain to get
redistributed in the environment. In other words, decoherence is faster than
dissipation – and it seems to work only one way. That means decoherence reveals
the arrow of time faster than dissipation.
This
implies that the arrow of time really comes from quantum mechanics, not
thermodynamics as Boltzmann thought.
In
a sense it has to, because everything is made of atoms, and quantum mechanics
is the right theory to use for atoms. "The thermodynamic arrow of time
must emerge from the quantum one," says George Ellis of the University of Cape Town in South Africa.
Yet
in the end, the quantum and thermodynamic explanations amount to the same
thing: the scrambling of information.
It
is easy to see that the mixing of two types of gas molecule is a kind of
scrambling, a destruction of orderliness.
But
decoherence involves scrambling too: of the coordination between the
"waves" that describe quantum objects. In effect, decoherence comes
from the way interactions with atoms, photons and so on in an object's
environment carry away information about the object and scatter it around. This
is, in fact, a quantum version of entropy.
In both the classical and the quantum cases,
then, time's arrow comes from a loss of information.
This
offers a better way to think about time's arrow. It points in the direction in
which information is lost and can never be retrieved.
A
process is only truly irreversible when the information about the change is
lost, so that you cannot retrace your steps. If you could keep track of the
movement of every single particle, then in principle you could reverse it and
get back to exactly where you started. But once you have lost some of that
information, there is no return.
"The
loss of information is a key aspect," says Ellis. "At the macroscopic
scale this gives the second law."
It
is still not entirely clear when, in the quantum world, the information is
truly lost.
Some
researchers think that decoherence alone is enough. But others say that the
information, although smeared and dispersed in the environment, is still recoverable
in principle. They think an additional, rather mysterious process called
"collapse of the wave function" – in which the quantum waviness is
irreversibly lost – takes place. Only then, they say, does the arrow of time
point unambiguously in one direction.
In
either case, in quantum physics we can only really say that an event has
happened if we have lost the option of making it "unhappen".
The
arrow of time seems to reflect a process of the Universe "committing
itself" to something, rather than hedging its bets by allowing for many
different outcomes. It is this "crystallising" of the classical
present from the quantum past, says Ellis, that produces a direction in time.
This
idea fits neatly with one of the most famous thought experiments in physics.
The
second law of thermodynamics says that things tend to become more random, but
only because randomness is so likely. In the late nineteenth century, the
Scottish physicist James Clerk Maxwell came up with what looked like a way to
get around this.
Maxwell
imagined a tiny intelligent being that could observe the random motions of
molecules. This "demon" could un-mix two gases, by opening and
closing a door at just the right moments.
Maxwell's
demon seems to violate the second law, but in fact it cannot. The reason is
that the demon has to accumulate information in its brain as it observes the
molecular motions. To keep this up, it has to delete the older information, and
that increases the entropy.
It
is the act of erasure, of forgetting, that guarantees the thermodynamic arrow
of time. Once again, the key thing is loss of information.
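(A quantitative aside of my own, not stated in the article: the usual formal version of this cost is Landauer’s principle – erasing one bit of information must release at least

\[
Q \ge k_B T \ln 2
\]

of heat into surroundings at temperature T, raising their entropy by at least k_B ln 2.)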
However,
the quantum-mechanical picture of time's arrow leads to something deeply
peculiar. In some experiments, it looks as though influences can work backwards
in time. The future can affect the past.
Take
the double-slit experiment, in which a quantum particle such as a photon of
light is fired at two narrow slits in a screen.
Suppose
we do not measure which way the particle went, and so cannot tell which slit it
went through. In this case, we see an interference pattern – a series of light
and dark bands – when the particles emerge on the far side.
This
reflects the wave-like character of quantum particles, because interference is
a wave property. The interference even persists when the particles pass through
the slits one at a time, which only "makes sense" intuitively if we
imagine each particle passing, wave-like, through both slits at once.
However,
now suppose we place a detector by the slits to reveal which one the particle
passes through. In this case the interference pattern disappears, and the
particles act more like sand grains being fired through holes. Measuring a
particle's path destroys its waviness.
Here
is the really strange thing. We can set up the experiment so that we only
detect which slit a particle passed through after it has done so. And yet we
still see no interference.
How
does the particle "know" that it is going to be detected after
passing through the screen, so that when it reaches the slits it
"knows" whether to go through both slits or just one? How can the
later measurement seem to affect the earlier behaviour?
This
effect is called "retrocausality", and it seems to imply that the
arrow of time is not as strictly one-way as it seems. But does it really?
Most
physicists think that retrocausality in these delayed-choice experiments is an
illusion created by the counterintuitive nature of quantum mechanics.
Detecting
a particle "after" it has passed through the slits does not really
influence the path it takes, they say. That is just the way we are forced to
imagine what is happening when we try to apply our classical intuition to
quantum events.
"Post-selection
is like a parlour trick that makes it seem like there is backwards causation
where there actually is none," says Tod
Brun of the University of Southern California. "It's like the guy
who shoots at the side of a barn and then goes and draws a target around the
bullet hole."
However,
others say that backwards effects in time are a perfectly valid way of interpreting
such processes.
According
to Ellis, we can regard retrocausality as a kind of fuzziness in the
"crystallisation of the present". "Quantum physics appears to
allow some degree of influence of the present on the past, as indicated by
delayed-choice experiments," he says.
Ellis
has argued that the past is not
always fully defined at any instant. It is like a block of ice that contains
little blobs of water that have not yet crystallised.
Even though the broad outline of events at a
particular instant has been decided, some of the fine details remain fluid
until a later time. Then, when this "fixing" of the details happens,
it looks like they have retrospective consequences.
It is hard to find the right words to
describe this situation.
Certainly
we should not imagine a "force" reaching back through time and
altering earlier events. Instead, those things are slow to acquire the true
status of events – of stuff that has actually "happened" – at all.
Alternatively,
perhaps those past events really did happen at the time, but the laws of
quantum mechanics forbid us from seeing them until later. "If the future
detector choice causes the particle to behave a certain way in the past, one
should consider the past behavior 'real' when it originally happened,"
says Ken Wharton of San José State
University in California.
Quantum
retrocausality crops up quite naturally if, instead of trying to force an arrow
of time onto quantum theory, we simply let quantum mechanics work equally well
in both directions in time. Wharton's colleague Huw Price of the University of Cambridge in the UK has argued that such a theory really would allow backwards
causation. Still, Wharton admits that such retrocausal theories are
speculative.
Only
a handful of physicists and philosophers have embraced retrocausality. Most
consider backwards causality "too high a price to swallow", says
Wharton.
But
he feels that we only resist this idea because we are not used to seeing it in
daily life.
"The
view that the past does not depend on the future is largely
anthropocentric," says Wharton. "We should take apparent backwards causation
more seriously than we usually do. Our intuition has been wrong before,
and this time symmetry on quantum scales is a reason to think we could be wrong
again."
If
time's arrow is not quite as one-way as it seems, that raises one last
question: why do we perceive it as always pointing one way? Why should the
"psychological arrow of time" be aligned with the physical ones?
Again,
it might sound like an odd question. If the laws of physics dictate an arrow of
time, surely our perception of it should follow quite naturally? But that is
not as obvious as it sounds.
Suppose
you have some particles moving around in a box according to physical laws. If
you know all the exact positions and speeds of the particles right now, you can
predict the future exactly.
In
principle, there is nothing in the future that could not be known in the
present. Why could that not be experienced as a "memory" of the
future; a little like a master chess player seeing well in advance how the game
will inevitably end?
This
question is usually swept under the carpet. "Since biology rests on a
foundation of chemistry, which rests in turn on foundations of quantum
mechanics and thermodynamics, I think most people believe that the biological
arrow of time is a consequence of the thermodynamic arrow," says Brun.
But
this is not a foregone conclusion. Brun and his colleague Leonard Mlodinow at the California Institute of Technology in
Pasadena have argued that the psychological and thermodynamic arrows of time
are actually independent. They only coincide because of how our memories work.
The
first part of the argument was put forward by David Wolpert of the Santa Fe Institute in New Mexico. He said
that, before you can remember something, you must initialize your memory in
some standard starting state; like wiping your hard drive to make space for new
data.
But
as Maxwell's demon showed us, erasing information always increases entropy.
This means initialization is an irreversible process, and so only works forward
in time.
However,
Mlodinow and Brun say that this argument is not quite complete. In principle,
you can eliminate any need for erasure and initialization just by remembering
everything. Then, recording information in the memory can be fully reversed,
and has no arrow of time.
In
this case, they argue that the psychological arrow of time is still preserved,
by something they call "generality".
The
problem with remembering the future now is that there can always be
"surprise" events that wreck the link between the system's state now
and its state in the future. I can "remember" that my clock will show
11 o'clock in an hour's time – but if the battery runs out in ten minutes, my
"memory" will be wrong.
A
real memory cannot be contingent on the system behaving a certain way, Mlodinow
and Brun say. It has to remain general, meaning it is true whatever happens. So
even if memory is recorded in a reversible system, the requirement of
generality imposes a directionality in time.
In
this case, says Brun, the reason we cannot remember the future is not simply
because it has not happened yet. Instead, it is because the "memory"
would really only be a prediction, which might or might not be correct. And it
is not a true memory unless it is correct.
In
other words, a putative "future memory" is fine-tuned to a particular
outcome, and must be readjusted if any slight change occurs between now and the
appearance of the "remembered" future state. That means it is too
fragile to count as a genuine memory.
Not
everyone is convinced that Brun and Mlodinow have cracked the problem.
Some
think their argument is circular. They had to put in the asymmetry of time by
hand, making it possible for contingent events to intervene in the future but
not the past – which some feel is a bit of a cheat.
All
the same, simply by posing the question, Brun and Mlodinow have shown that the
answer is not as obvious as we might suppose. Our perception of time may not
have much to do with the actual passage of time.
What is clear is that the arrow
of time, which seems like such a common-sense fact of life, is actually a
profoundly tricky concept. The closer we look, the less we can be sure that the
arrow is really always one-way.
Selected and edited from -- http://www.bbc.com/earth/story/20160708-the-past-is-not-set-in-stone-so-we-may-be-able-to-change-it