Unwinding the Body Clock
Your body, if you pay attention to the way it ticks, may be your best timekeeper. At dawn, your blood pressure has its sharpest rise, allowing you to safely assume a vertical position. Around lunchtime, your liver enzymes kick into full gear in anticipation of food. In the evening, the pineal gland in the base of your brain begins producing the hormone melatonin, which makes you feel sleepy. As you sleep, your body temperature drops. In the morning, as the sun comes up and light hits your retinas, your body stops making melatonin and your temperature rises, revving up your metabolism for the day ahead.
James Collins
External cues keep the body in sync.
Many basic biological functions follow a 24-hour cycle. These rhythms—called circadian, Latin for "about a day"—are hardwired, controlled by a master clock, a cluster of specialized nerve cells in the hypothalamus, and other subservient clocks in the tissues of the body—the liver, for example. Still, cues like the sun or an alarm clock are crucial for keeping these internal clocks in sync with the external world.
While researchers know that the clock in the hypothalamus is in charge, they're not sure how it works with the millions of other clocks in the cells of the brain and body. "Biological timing is even more complicated than we thought five years ago," biologist Gene Block told attendees at the 2004 Penn State Lectures on the Frontiers of Science. Most of the research today is at the molecular level, using genes and proteins from fruit flies and mice to tease apart clock mechanisms, he added. But the more researchers learn, the more complicated the body's timekeeping system seems.
Sleep bunkers and zeitgebers
Early circadian rhythm experiments were a little simpler and usually involved tracking the sleep patterns of people, Block noted. Groundbreaking work took place in the late 1960s and early 1970s, when dozens of volunteers spent weeks in underground bunkers at the Max Planck Institute near Munich, Germany. The volunteers—mostly students looking for extra money and a quiet place to study for exams—lived in isolation, with no exposure to daylight, no clocks, and no way to measure the passage of time. Researchers monitored their sleep patterns, body temperature, and a host of other rhythms.
Over the course of the experiment, almost all of the people settled into a 25-hour cycle, gradually falling out of sync with above-ground dwellers on a standard 24-hour cycle. Interestingly, a small percentage of them developed a cycle closer to 48 hours, often staying awake and active for very long stretches of time. However, their body temperatures still fluctuated on a cycle that was close to 25 hours.
Courtesy Gene Block
Gene Block
These "sleep bunker" experiments, said Block, told researchers that body clocks are somewhat independent from the 24-hour clock the world runs on. Exposure to light and other external cues—zeitgebers, as the German
researchers called them—is necessary to reset internal clocks and keep
people in sync with the natural cycles of day and night. The experiments also demonstrated that some people—those 48-hour folks, for example—have their own unique rhythms.
Clock genes
Around the time these sleep experiments were taking place at the Max Planck Institute, researchers were beginning to explore the genetic basis for biological rhythms. In 1971, the first clock gene was discovered in the common fruit fly, Drosophila melanogaster. Advances in genetic and molecular techniques over the last decade have led to an explosion in the number of clock genes discovered, most of them in more complex animals like mice. Today, researchers are still identifying new clock genes and proteins, but they're also trying to understand how all of these timekeeping components work together.
A clock within a cell is really just a simple chemical loop. For example, the clock genes period (per) and timeless (tim) switch on early in the night and begin making their corresponding proteins, PER and TIM. When the proteins reach a certain concentration in the cell, they bind together, move into the nucleus, and block the cell's protein-making mechanism, shutting down the work of the genes. After a while, the protein structure falls apart and the genes switch on again. When normal copies of the genes are present in the cell, this chemical loop plays out over a roughly 24-hour period. You can actually observe the cycle in a Petri dish, said Block. "The cells are outside of the brain, in culture, and they're still generating electrical activity in cycles." That 24-hour electrical rhythm is generated by the genes switching on and off as they make the proteins.
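This on-off loop is easy to caricature in a few lines of code. The sketch below is a generic delayed negative-feedback oscillator (a Goodwin-type model), not the actual per/tim biochemistry: the variable names, equations, and parameter values are illustrative assumptions, chosen only so that the simulated gene switches on and off rhythmically, the way Block described cultured cells doing.

```python
# A minimal caricature of a cellular clock as a delayed negative-feedback loop
# (a Goodwin-style oscillator). Illustrative only: the equations and numbers
# below are assumptions, not the per/tim model itself.

def simulate(duration=96.0, dt=0.01):
    # m ~ clock-gene mRNA, p ~ protein in the cytoplasm,
    # r ~ protein that has entered the nucleus, where it represses the gene
    m, p, r = 0.1, 0.1, 0.1
    v_max, hill_n, k_decay = 1.0, 12, 0.2   # arbitrary values that give oscillations
    history, t = [], 0.0
    while t < duration:
        dm = v_max / (1.0 + r ** hill_n) - k_decay * m   # transcription, shut off by r
        dp = m - k_decay * p                             # translation
        dr = p - k_decay * r                             # nuclear entry of the repressor
        m, p, r = m + dm * dt, p + dp * dt, r + dr * dt
        history.append((t, m))
        t += dt
    return history

if __name__ == "__main__":
    # Print the mRNA level at regular intervals; it rises and falls in waves,
    # the same on-off cycling Block described in cultured cells.
    for t, m in simulate()[::400]:
        print(f"t = {t:5.1f}  mRNA = {m:.3f}")
```

In a real cell model, the production and decay rates are what tune the period to roughly 24 hours; here the time units are only nominal.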
The bigger question, said Block, is how all of the clocks in the cells throughout the brain and body stay calibrated and running smoothly. And what happens if someone has a sleep disorder, or works the night shift, or travels across multiple time zones? Researchers at several labs, including the Center for Biological Timing at the University of Virginia, which Block directs, are tackling the question of clock coordination. In one experiment, Block and his team exposed different cells from mice to changing cycles of light and dark, meant to represent time zone changes travelers might experience. While the master clock cells adjusted rapidly to each new "time zone," clock cells from other parts of the brain and body took much longer to adjust.
Understanding body clocks at the molecular level could help researchers develop drugs for people whose clocks are out of sync, said Block. But that's not an easy task. To be successful, a drug would have to do more than target just the master clock. "The reason you feel terrible is because all of the clocks are out of sync. It's a system problem."
—Dana Bauer
Keeping Time
Time marches on, the vintage newsreels blare. We can't stop it. We have real trouble defining it. But we have done a remarkable job of figuring out how to measure its passage.
James Collins
Kurt Gibble with his atomic clock
Admittedly, the first step was fairly obvious. The Sun rises and sets. We wake, we sleep. Then we wake again: Day 2. Soon, however, early humans began to notice larger increments. The cycles of the moon produced the month. The return of seasons yielded the year. Recognizing these patterns would quickly prove crucial to the development of agriculture and trade.
Breaking the day into sub-units took a little longer. In due time, however, our ancestors were parsing out hours, and minutes, and seconds. These details — and precision in measuring them — became particularly important for the business of navigation. For mariners judging longitude by the stars, knowing the exact time was crucial to getting a bead on where they were.
Galileo gets credit for realizing that a pendulum's period — the time it takes to swing back and forth—is of constant duration. But it was Dutch astronomer Christiaan Huygens, in 1656, who first applied this idea as a useful measure. By harnessing a swinging pendulum to an interlocking series of gears, Huygens showed, you could keep count of all those periods, and translate the "ticks" into moving hands on the face of a clock.
The Atomic Age
"Atomic clocks work on the same principle," says Kurt Gibble, associate professor of physics at Penn State, who gave the third lecture in the 2004 Frontiers of Science series. According to the laws of quantum mechanics, Gibble explained, atoms can only have discrete, sharply defined energies. "They have to be either in one state or the other. They can't be in-between." By shining light on an atom in a low energy state, you can excite it, driving it to a higher energy state. But this transition, for a given atom, occurs only at precisely what is known as that atom's resonant frequency. The incoming light has to match this frequency exactly, or the atom will not be excited.
Every atom of a given type has the same resonant frequency. An atomic clock takes advantage of this universality, deploying the chosen atom (actually a gaseous cloud of atoms of the same type) as an extremely accurate frequency regulator. The transition-triggering frequency of the light, kept stable by the regulator, stands in for a pendulum.
Nobel laureate Isidor Rabi hatched the idea for an atomic clock back in the late 1930s, as an offshoot of his pioneering work on the fundamental properties of atoms and molecules. By 1949, the National Bureau of Standards (now the National Institute of Standards and Technology, or NIST) had produced the first such timepiece, using an ammonia molecule as its core. Since then, better and better atomic clocks have been built using atoms of cesium 133, a heavy element whose high resonant frequency produces a high tick rate. (Counting more ticks over a given interval, Gibble explained, allows you to make a more accurate clock.) In a standard atomic clock, a gaseous beam of cesium atoms is fired through a vacuum chamber, where it is zapped with brief pulses of light — actually, microwave energy. If the pulses are of the right frequency, they goose the atoms into changing states.
By 1967, the precision of such clocks had become so refined that the definition of a second was changed. No longer would time's fundamental unit be based on dividing up the period of Earth's rotation, a duration which can be slightly affected by physical forces like friction. Instead, "a second is defined by measuring the transitions of a cesium atom exposed to microwave radiation at the proper frequency." By international agreement, that's exactly 9,192,631,770 ticks per second.
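In other words, an atomic clock tells time by counting. A toy illustration of the bookkeeping (the counter reading below is made up):

```python
# The SI second: 9,192,631,770 oscillations of cesium-133's hyperfine transition.
CESIUM_TICKS_PER_SECOND = 9_192_631_770

def elapsed_seconds(tick_count):
    """Convert a raw count of cesium 'ticks' into elapsed time."""
    return tick_count / CESIUM_TICKS_PER_SECOND

# A hypothetical counter reading:
print(f"{elapsed_seconds(1_654_673_718_600):.1f} seconds")   # 180.0 seconds, i.e., three minutes
```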
Just Chill
The best of the beam atomic clocks, called NIST 7, is a thousand times more accurate than the 1967 version. Amazingly, according to Gibble, it's dead-on to within one second over six million years. And recent advances in physics have uncovered several approaches to even greater precision.
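That figure translates directly into a fractional accuracy, and a back-of-the-envelope sketch shows how small it is (the only assumption is the length of a year):

```python
# Convert "off by one second over six million years" into a fractional error.
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # about 3.16e7 seconds

seconds_off, years = 1.0, 6e6
print(f"Fractional accuracy: {seconds_off / (years * SECONDS_PER_YEAR):.1e}")
# prints roughly 5.3e-15 -- a few parts in a quadrillion
```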
"NIST 7 uses room-temperature atoms," Gibble explained, "At room temperature, atoms move at the speed of sound, or a jet airplane." At that velocity, it took only a thousandth of a second for a cesium atom to fly through the ten-foot tunnel that houses NIST's vacuum chamber. "That brief observation period limits the accuracy of the clock."
Now, however, there's a good way to get those atoms to chill out. In 1997, Steven Chu, Claude Cohen-Tannoudji, and William Phillips won the Nobel Prize in physics for their work on a technique called laser cooling. If you bombard atoms with laser light tuned to just the right frequency, Gibble explained, "the momentum of the atom is taken away by the momentum of the light." It's not unlike the way velocity is transferred when two pool balls collide. And the result of losing speed is rapid cooling.
By this technique, Gibble said, "You can cool a gas of atoms to within one one-millionth of a degree of absolute zero — and you can do it in a thousandth of a second." In that fraction of a second, "the atoms go from the speed of a jet airplane to the speed of an ant."
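The airplane-to-ant comparison can be checked against the standard kinetic-theory estimate of thermal speed, v ≈ √(3kT/m). The sketch below assumes cesium-133 atoms and uses the root-mean-square speed, a rough stand-in for the full velocity distribution of a laser-cooled cloud:

```python
import math

# Rough check of the "jet airplane to ant" comparison using the rms thermal
# speed v = sqrt(3*k_B*T/m), with cesium-133 assumed as the atom.
K_B = 1.380649e-23        # Boltzmann constant, J/K
M_CS = 133 * 1.6605e-27   # approximate mass of a cesium-133 atom, kg

def rms_speed(temperature_kelvin):
    return math.sqrt(3 * K_B * temperature_kelvin / M_CS)

print(f"Room temperature (300 K): {rms_speed(300):.0f} m/s")             # ~240 m/s
print(f"Laser-cooled (1 microkelvin): {rms_speed(1e-6) * 100:.1f} cm/s") # ~1.4 cm/s
```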
Clock-makers have learned to lengthen observations even more by configuring their instruments vertically, as atomic "fountains." In a fountain set-up, target atoms are cooled and collected at the bottom of the vacuum chamber by a set of six lasers placed opposite each other and aimed toward the center. Then a small change in frequency sends the huddle of atoms up through a microwave cavity, where they undergo the tell-tale transition. All the lasers are then switched off, and the atoms fall back down through the cavity.
"It looks like a water fountain," Gibble said, and — by making use of gravity — it doubles the available observation time. "You get about a half second on the way up, and another half second on the way down. With a full second to look at the atoms, you can make a very accurate clock."
But cooling atoms to near absolute zero also creates some distracting quantum effects. At very low temperatures, he noted, atoms look more like waves than they do particles. "They get bigger, and when they're bigger, they're more likely to collide — and every time there's a collision it gives your pendulum a little kick."
To Infinity and Beyond
To get around this problem, Gibble and his graduate student Chad Fertig designed and built a clock that substitutes rubidium atoms for the standard cesium. They showed that "rubidium atoms don't collide as often," Gibble explained, "and their collisions don't have as great an effect." This work has led other groups around the world to build similar clocks based on rubidium.
One of these clocks, built by French researchers, is currently the world's most accurate clock. Rubidium clocks now under construction at the U.S. Naval Observatory will soon be the basis for the Global Positioning System. Used around the world for everything from emergency response to precision agriculture, GPS relies on 24 to 30 Earth-orbiting satellites, each outfitted with an atomic clock. These satellites transmit radio signals to receivers on the ground, like the one in your car's onboard navigation system. The receiver calculates the distance the signal has traveled from how long it takes to arrive. By repeating the process with at least four satellites, the receiver can pinpoint its own location anywhere on Earth. "The entire system," Gibble said, "depends on satellites and receivers having accurate clocks that can be precisely synchronized."
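Here is a minimal sketch of the position fix just described: each satellite's signal yields a pseudorange (travel time multiplied by the speed of light), and with four or more of them the receiver can solve for its three position coordinates plus its own clock error. The satellite positions, the noise-free measurements, and the simple Gauss-Newton solver are all illustrative assumptions; real receivers also correct for atmospheric delays and relativistic effects.

```python
import numpy as np

# Toy GPS fix: solve for receiver position (x, y, z) and clock bias from
# pseudoranges to four hypothetical satellites. Noise-free and highly simplified.
C = 299_792_458.0  # speed of light, m/s

def solve_fix(sat_positions, pseudoranges, iterations=10):
    est = np.zeros(4)  # [x, y, z, clock_bias * c]
    for _ in range(iterations):
        diffs = est[:3] - sat_positions            # satellite-to-receiver vectors
        ranges = np.linalg.norm(diffs, axis=1)     # geometric distances
        residuals = pseudoranges - (ranges + est[3])
        # Jacobian: unit line-of-sight vectors, plus 1 for the clock term
        J = np.hstack([diffs / ranges[:, None], np.ones((len(ranges), 1))])
        est += np.linalg.lstsq(J, residuals, rcond=None)[0]
    return est[:3], est[3] / C

# Hypothetical satellite positions (meters) and a receiver with a small clock error
sats = np.array([
    [15_600e3,  7_540e3, 20_140e3],
    [18_760e3,  2_750e3, 18_610e3],
    [17_610e3, 14_630e3, 13_480e3],
    [19_170e3,    610e3, 18_390e3],
])
true_position = np.array([1_113e3, 5_001e3, 3_000e3])
clock_bias = 1e-6  # seconds
pseudoranges = np.linalg.norm(sats - true_position, axis=1) + C * clock_bias

position, bias = solve_fix(sats, pseudoranges)
print("Recovered position (m):", np.round(position))
print(f"Recovered clock error (s): {bias:.2e}")
```

A microsecond of unmodeled clock error corresponds to about 300 meters of range error, which is why the synchronization Gibble describes matters so much.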
In the future, Gibble added, rubidium-based clocks could be capable of an accuracy that is difficult to imagine: within five seconds over the entire life of the universe. "Better clocks," he said, will have potential impacts in many areas, from advanced communication systems to navigation to better tests of the theory of general relativity. For interplanetary navigation (landing on Mars, for example), clocks of unprecedented accuracy will be essential.
That's why Gibble, with colleagues at NASA's Jet Propulsion Laboratory, has been working on a rubidium clock that is deployable in space. By removing the limiting factor on an Earth-bound atomic fountain (gravity), a space-based clock will yield much longer observation times of up to ten seconds, Gibble said.
In addition, he said, he and his clock-making colleagues are working on other approaches: fine-tuning laser-cooling techniques to allow time-lapse "juggling" of atoms in atomic fountains, and using ultra-short-pulse lasers that let researchers count the very high tick rates of laser light itself. Fundamentally, he said, there's no end to the possible improvements:
"There's no known limitation to how accurately we can measure time."
—David Pacchioli
How Long Can Humans Live? Time, Longevity, and Human Aging
In front of a mixed university audience—aging baby boomers and invincible twenty-somethings—at the 2004 Penn State Lectures on the Frontiers of Science, Robert Mitchell, professor of biology at Penn State, tackled that nagging question: Why get old anyway? What follows is a conversation Research/Penn State had with Mitchell after his talk.
R/PS: How long can humans live?
RM: We have to distinguish between maximum lifespan, which is as long as any human has ever lived, and life expectancy, which is how long you could live predictably, based on insurance statistics, for example. As far as maximum lifespan goes, we suspect it's around 125 years. There's no evidence that humans can live any longer than that. But as far as life expectancy goes, decade by decade we see that going up and up and up. We're at the point where it's somewhere around 76, 77 years. And that will continue to rise.
Most geneticists will tell us that about 30 percent of our longevity depends on our genetic make-up. That surprises a lot of people. That's saying that 70 percent depends on environment, behavior, what you eat, how careful a driver you are, if you smoke, if you wear your seatbelt. These things have a profound impact.
I don't suspect that we'll see maximum lifespan increasing beyond 125, unless we start juggling people's genetic codes, and I don't think anybody's ready to do that.
Jennifer Howell
A baby born today has a life expectancy of about 76 years.
R/PS: Who is the oldest recorded living person?
RM: Jeanne-Louise Calment of France, who died in 1997, lived to be 122 years and five months. This woman was quite a character, if you read about her. She was bright and spry into her later years. She took up fencing at age 87. I hesitate to say this, but she smoked until she was 117. The only reason she quit is because she got tired of asking people to light her cigarettes for her. She couldn't see well enough to do it herself.
R/PS: Why is 125 the magic number?
RM: It's evolved. The program, so to speak, within our system has evolved to the point that we just can't get beyond that. It's a matter of repair. We only have so many repair systems that have evolved over millions of years. The best an individual can do is keep repairing damage, up to a certain point, and then you just can't do it anymore. Long-lived animals have much better repair systems than short-lived animals. That's genetic and it's evolved that way.
R/PS: So what about other animals? How long do they live?
RM: Longevity correlates with a number of other factors, like body size, duration of growth period, fecundity (rate of reproduction), and rate of energy metabolism. In general, larger animals live longer. The smallest and shortest-lived of the mammals, and also the one with the fastest metabolism, is the shrew. We all know that if you want to know how old your dog is you take your dog's age and multiply it by seven and that gives you human years for your dog. What's really going on is that a dog has a metabolic rate that's about seven times faster than a human's metabolic rate.
R/PS: How did you get started in this research?
RM: I first got interested in aging by accepting a fellowship with the National Institutes of Health, which was trying to encourage young Ph.D. graduates to do research in that area. Prior to that, I hadn't thought much about aging. I don't think very many 26-year-olds do.
At that time, in the late 1960s and early 1970s, there was little mention of aging in any biology textbooks. Our library at Penn State had just a few journals specific to aging. Today there are dozens. And quite frankly, there was very little good biological research about aging. That began to change in 1974 when the NIH formed the National Institute on Aging.
R/PS: Why the increased interest?
RM: In 1900, only three million people in America were over 65. Today, 37 million people are over 65. That's 12 and a half percent of the population. In the next 30 years, 70 million people will be over the age of 65. That's 22 and a half percent. There's no question that people are living longer and that people are generally healthier in their late lives. Also, the number of people in this country living past 100 has increased. Today there are 50,000 people over the age of 100 in America. By 2050, 800,000 people will be over the age of 100. There are lots and lots of people living longer and longer and that's interesting from a biological, economic, and sociological point of view.
R/PS: Can our economy and our health care system handle so many people who are going to be living for so long?
RM: It's an issue, not so much for the biologists, but for the sociologists and the health policy people. To have so many people living for so long is costly. It used to be that people could expect to live 12 years after retirement, but now they can expect to live 25 more years, without a regular salary. I think younger people are worried that they're going to have to work longer to support the older people who are living longer.
Courtesy Robert Mitchell
Robert Mitchell
R/PS: What are some of the most exciting discoveries you've seen in the field of aging in the past 30 years?
RM: Two things. The first: It was always thought that once you took human cells out of the body and grew them in a dish they would divide and divide and divide and not age. In the sixties, we learned that there is a limit to how many times a cell could divide in a dish. They do age and have a limited lifespan. We know now there is some kind of a clock that limits the longevity of these cells. That discovery in the sixties led hundreds of investigators to start working with human cells in culture—we say in vitro—to see what it was that was limiting their life span. That's called the Hayflick limit, the limited number of times a cell is able to divide. So, that discovery probably did more than anything else to spark all kinds of research, cell and molecular biology research, on the basic mechanisms of aging.
The second thing, in the nineties, has been the discovery of specific genes in simple animals—fruit flies and roundworms that we use to study the aging process—that have profound effects on the rate of aging and natural lifespan of those organisms. Many people believe that even though the code within our cells is made up of thousands of genes, it might be only several dozen that are really key as far as programming our rate of aging and longevity. A lot of people right now are looking for those genes. And it's one thing to find a gene, but it's another to figure out what it does. What does the protein it makes do? That's the next hurdle. For example, in one experiment, if you insert a gene which codes for an enzyme that neutralizes dangerous free radicals into a fly or a roundworm, then those animals live longer. That enzyme is called SOD, for superoxide dismutase. Free radicals cause havoc in cells and accelerate the process of aging.
And we've only had the ability in the last 10 years to do these kinds of genetic manipulations—identify genes, knock out genes, insert genes—and that's opened a whole line of investigations.
—Dana Bauer
The Arrow of Time
Maybe Immanuel Kant was right. It's impossible, the greatest of modern philosophers thought, to step away from time sufficiently to try to explain it.
"As long as we don't think about it very much, we have a feeling that we understand it," said Joel Lebowitz, the last of this year's Frontiers of Science lecturers. Lebowitz, the George William Hill professor of mathematics and physics at Rutgers University, took as his topic the arrow of time, or as he put it, "the unidirectional nature" of events we observe. "Why," he asked, "can we remember the past but not the future?"
Courtesy Joel Lebowitz
Joel Lebowitz
Philosophers tend to divide into two camps over the nature of time, Lebowitz said. From an every day, intuitive point of view, "the passage of time is an objective feature of reality. The present is always advancing into the future. What is real is the present."
A more subjective stance, sometimes known as the block universe theory, "regards past, present, and future as a single entity, in which time is an ingredient. In this view, the present is a subjective notion, and 'now' depends upon your viewpoint, in the same way that 'here' does.
"Most physicists take this view," Lebowitz said. Einstein, for example, in a famous letter, wrote: "For those of us who believe in physics, this separation between past, present, and future is only an illusion, although a persistent one."
No Turning Back
A part of that persistent illusion—if such it is—is what physicists call the problem of irreversibility. Think of it as the Humpty Dumpty syndrome. "You drop an egg, and you can't put it together again. Milk spills, and you can't unspill it. This seems to us very natural," Lebowitz said. "Many of the phenomena that we observe are asymmetrical."
A problem arises, however, because "the basic laws of the universe, as we understand them, are symmetric in time—they do not have this unidirectionality." How do we reconcile this seeming paradox? Or, as Lebowitz put it: "What is the relationship between the irreversible behavior of the objects that we can see and touch and the reversible dynamics of the atoms and molecules that make up those objects?
"Let me begin by considering the relationship between microscopic and macroscopic laws," he said. "Our understanding of nature is that it has a very hierarchical structure." We divide the world into scales, ranging from the infinitesimally small to the unfathomably large. "To some extent, we can discuss these scales independently. We have to—it doesn't do any good to bring in quarks when you want to understand protein folding, or to bring in atoms when you want to study ocean currents.
"Nevertheless, it is a central lesson of science over the last 300 years that there are no new fundamental laws, only new phenomena, as one goes up the hierarchy. Explanations, therefore, are always to be looked for in the microscopic scales.
"In the language of classical mechanics, matter is made up of particles in perpetual motion; they attract each other when they are at a certain distance apart, and repel each other when very close to each other." The laws of mechanics that govern these motions work equally well whether time moves forward or in reverse.
To illustrate the point, Lebowitz flashed on the lecture-hall screen an old Physics Today cover, with a sequence of drawings of stick-figure athletes running around an oval racetrack. In the first drawing, the athletes are bunched together at the starting line as the gun goes off. In the second and third images, the athletes grow increasingly separated as they round the track at varying speeds. In the fourth image a second gunshot tells them to reverse directions. Their respective velocities are now the reverse of what they were, and by the sixth panel the runners arrive, all together, back at the starting line.
"Given that microscopic physical laws are reversible," Lebowitz asked, "why do all macroscopic events have a preferred time direction?" What keeps Humpty from getting back together?
The Odds are Against It
The best explanation, Lebowitz said, comes from 19th-century Austrian physicist Ludwig Boltzmann, inventor of the field of statistical mechanics. Boltzmann's answer, he said, has to do with the hugely disproportionate number of possible microscopic states that correspond to a single macroscopic state. Lebowitz borrowed an image from his present-day colleague Brian Greene, now famous as host of the PBS television series "The Elegant Universe." In a recent book, Lebowitz noted, "Greene asks us to imagine taking an unbound copy of Tolstoy's War and Peace—697 pages—and throwing it into the air. What is the probability that those pages will land in exactly the right order?" Conversely, how many possible wrong orders are there? Greene's answer to the latter question fills two-thirds of a page with digits—the number is incomprehensible.
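The size of that number is easy to check: the count of possible landing orders for 697 loose pages is 697 factorial, and a couple of lines of arithmetic show that writing it out takes well over a thousand digits (treating every page as a separate loose sheet is an assumption made only for this calculation).

```python
import math

# The number of possible landing orders for 697 loose pages is 697 factorial.
# Python integers have arbitrary precision, so the digit count is exact.
orderings = math.factorial(697)
print(f"697! has {len(str(orderings)):,} digits")   # about 1,680 digits
print("Exactly one of those orderings puts every page back in the right place.")
```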
Similarly, Lebowitz said, "a glass of milk is a macroscopic system composed of many, many microscopic particles. There is only one orderly state for all those particles—or at best a very few—compared to many, many possible disorderly states." Once that orderly system is bumped out of its equilibrium, i.e., the milk is dashed to the floor, "it would be very, very difficult" to get it back to the precise order it had previously assumed.
Boltzmann's "is a probabilistic explanation," Lebowitz acknowledged: The probability of recreating that one orderly state is extremely low. "But it explains almost everything," he added. What it doesn't explain, is why there is an experimentalist on hand to topple that milk in the first place. Or, as Lebowitz asked, "Why are we here, when—if you look at all the possible microscopic states—this is such an unlikely state?"
Copyright 2004 Physics Today
Backwards and forwards: Why do all macroscopic events have a preferred time direction?
The Old Order
Boltzmann assumes the initial conditions of his experiment, in other words. "We are always assuming an initial state in which there is more order," Lebowitz said. To justify that assumption, he added, you have to scroll back to the real initial conditions, i.e., the origin of the universe. And throw in the second law of thermodynamics: Entropy increases. That is, things tend to spread out, to move from order to disorder.
"The universe began in a state of very low entropy, a very ordered state—there was a uniform distribution of energy," Lebowitz said. "It was not clumpy.
"Unlike the case with regular matter," he continued, "where being disorganized, spread out, is a state of higher entropy, with gravitation the state of increasing entropy is actually the state of clumping. That's why matter collects into planets, planets collect into solar systems, stars collect into galaxies, and galaxies collect into supergalaxies." As the universe has spread out, it has become highly irregular. "That's why we're here in this lecture hall," Lebowitz said.
And that's also why we can remember the past. Unlike a glass of milk, which has no memory of where it came from, whether it was boiling or in the refrigerator, we humans are not in a state of equilibrium, he said. "We are moving toward equilibrium," he conceded. "That's the future we predict.
"But given initial conditions starting with a state of very low entropy, it is not unreasonable to see what we see."
—David Pacchioli
Deep Time
During three weeks in May 2004, two hardy Penn State geoscientists traveled through 12 stunning National Parks of the southwestern United States with 13 lucky students. The trip was sponsored by CAUSE (Collaborative Active Undergraduate Student Experience), an annual course offered by the College of Earth and Mineral Sciences. Richard Alley, Evan Pugh professor of geosciences, led the expedition. CAUSE 2004 was an extension of his course, "Geology of National Parks," and allowed students to interact with and learn from the rocks and landscapes of Arizona, Utah, and Colorado. Here, Alley explains the concept of deep time, how it tells the history of our planet, and how it affects our lives.
—Emily Wiley
Richard Alley, Ph.D., is Evan Pugh professor of geosciences in the College of Earth and Mineral Sciences, rba6@psu.edu.
Time in Film
From its beginnings in the 1890s, the cinema has shaped both time and space like putty. But of all the modernist arts, film has been perhaps the most obsessed with time. When director D.W. Griffith proposed the use of flashbacks in his 1908 short film "After Many Years," his wife recalled that panicky studio executives asked, "But how can you tell a story like that, jumping around in time?" Griffith knew that audiences in the urban storefront theaters of the time saw movies as a reflection of the city life around them: hectic, fragmented, a psychic realm of excitement and the unexpected. Since Griffith's day, the flashback, and later the flashforward, have become conventions of film narration. What is more remarkable is how conventional the cinema's tricks with time have become for viewers. The movies have trained their audiences to follow the most contorted temporal patterns with such ease that it all seems "natural": even the most routine films skip back and forth between narrative worlds (cross-cutting), elongate or compress specific moments, as in The Matrix, or repeat incidents from multiple perspectives, as in the great Japanese film Rashomon, which revisits a single crime through the eyes of several witnesses.
Getty Images
Even those films which use the so-called "continuity" system of editing to regularize viewers' understanding of time and space within a narrative count on the cognition of a skilled viewer to put back together the temporal fragments that cinematic editing shatters. Since Griffith's day, gifted filmmakers have exploited this skill to raise the most profound issues about the time of our lives. Directors like Alain Resnais in Hiroshima Mon Amour and Christopher Nolan in Memento weave time in and out of their plots with such daring that they call into question how real time unfolds; why, films like these ask, must time be the linear construction we've always assumed it is?
Kevin Hagopian, senior lecturer in media studies in the College of Communications, kxh24@psu.edu.
Books about Time
Stephen Hawking, A Brief History of Time (updated edition, 1998)—Hawking's comprehensive history of contemporary cosmology centers on the nature of time.
Dava Sobel, Longitude—Story of the marine chronometer, the clock that revolutionized ocean navigation.
Alan Lightman, Einstein's Dreams—An enchanting series of fables about the nature of time.
Huw Price, Time's Arrow and Archimedes' Point—Philosopher argues that time is not one-directional; only our perspective makes it seem so.
Julian Barbour, The End of Time: The Next Revolution in Physics—Argument for a timeless universe.
Brian Greene, The Fabric of the Cosmos—From Newton to Einstein to string theory, human understanding of space and time.
Web sites about Time
Official U.S. time: http://www.time.gov
U.S. Naval Observatory and the Global Positioning System: http://tycho.usno.navy.mil/time.html
Worldtime interactive world atlas: http://www.worldtime.com/
—Dana Bauer
SIDEBAR
Quotations About Time
"The first creatures on earth to become aware of time were also the first creatures to smile." —Nabokov, Speak, Memory
"What is time? If nobody asks me, I know; but if I were desirous to explain it to one that should ask me, plainly I know not." —St. Augustine
"Time flies like an arrow; fruit flies like a banana." —Marx
"For those of us who believe in physics, this separation between past, present, and future is only an illusion, although a persistent one." —Albert Einstein
"Time present and time past Are both perhaps present in time future, And time future contained in time past. If all time is eternally present All time is unredeemable." —T.S. Eliot, "Four Quartets"
"Oh! Do not attack me with your watch. A watch is too slow. I cannot be dictated to by a watch." —Jane Austen, Mansfield Park
"Time is an illusion. Lunchtime doubly so." —Douglas Adams