The Myth of the 100-foot Whale

Blue whale off the California coast. (Photo: Erik Olsen)

Not So Big: How We Overstate the Length of the Blue Whale, Earth’s Largest Creature

One of the most extraordinary privileges of living in California, especially near the coast, is witnessing the annual arrival of blue whales. I’ve been at sea on several occasions when these giants surfaced nearby, and to see one in person, or even through my drone’s camera, is astonishing and unforgettable. I once had the rare and mind-blowing opportunity to swim with and film blue whales off the southern tip of Sri Lanka for a story I wrote and produced, an experience that will forever be seared into my memory.

For decades, the blue whale has been celebrated as the largest creature ever to exist (Bigger than dinosaurs! True.), with many popular accounts claiming that these animals can reach lengths of 100 feet or more. Yet in all the videos, photographs, and encounters I’ve seen, not a single whale has come close to that. Still, article after article and documentary after documentary continues to repeat the claim that blue whales “reach 100 feet or more.” Nearly every whale-watching company in California makes the same assertion, and it is echoed endlessly across Instagram and TikTok.

But is it true? Most blue whales I’ve seen off the coast of California are half that size, or maybe two-thirds. It felt misleading to say otherwise. So I did a lot of digging: reading, reaching out to experts, and poring over historical records. The fact is that no single blue whale has ever been scientifically measured at 100 feet. Close, as you will soon read, but not 100 feet or more. Especially not off the coast of California.

This discrepancy not only distorts our understanding of these magnificent creatures, but also highlights the broader issue of how media can shape and sometimes mislead public perception of scientific facts.

Blue whale tail fluke in Sri Lanka. (Photo: Erik Olsen)

In other words: the perception that blue whales commonly reach lengths of 100 feet or more likely stems from a combination of historical anecdotes, estimation errors, and a tendency to highlight extreme examples.

All that said, the blue whale (Balaenoptera musculus) is a truly magnificent creature. Hunted nearly to extinction in the late 19th and 20th centuries, the blue whale has staged a hopeful recovery in the last five decades, since the International Whaling Commission banned hunting the species in 1966 (although some Soviet whale hunting continued into the early 1970s). And California, in particular, has been blessed with the annual appearance of the largest population of blue whales in the world, called the Eastern North Pacific population, consisting of some 2,000 animals. That population migrates from the warm waters of Baja California to Alaska and back every year. This is the group I’ve seen off Newport Beach.

These numbers are painfully, tragically small compared to what existed before commercial whaling began. Prior to that, an estimated 400,000 blue whales swam the world’s oceans; some 360,000 were killed in the Antarctic alone. (In my opinion, this stands as one of the most shameful acts in human history.)


The International Union for Conservation of Nature estimates that there are probably between 5,000 and 15,000 blue whales worldwide today, divided among some five separate populations or groups, including the Eastern North Pacific population. Many now swim so close to shore that an entire whale-watching industry has flourished along the California coast, especially in the south, with many former fishing boats converted into whale-watching vessels.

But back to size, or, more specifically, length: there are two credible references in the scientific literature to blue whales approaching 100 feet. The first is a measurement dating back to 1937, at an Antarctic whaling station, where the animal was said to measure 98 feet. But even that figure is shrouded in some suspicion. 1937 was a long time ago, and while the size of a foot or meter has not changed, much of the record-keeping of that era is suspect, as whales were not measured using standard zoological techniques (see below). The 98-foot specimen was recorded by Lieut. Quentin R. Walsh of the US Coast Guard, who was acting as a whaling inspector on the factory ship Ulysses. Sadly, scant detail is available about this measurement, and it remains suspect in the scientific community.

Blue whale in Sri Lanka. (Photo: Erik Olsen)

The second is from a book and a 1973 paper by the late biologist Dale W. Rice, who references a single female in Antarctica whose “authenticated” measurement was also 98 feet. The measurement was conducted by the late Japanese biologist Masaharu Nishiwaki. Nishiwaki and Rice were friends, and while both are deceased, a record of their correspondence exists in a collection of Rice’s papers held by Sally Mizroch, co-trustee of the Dale W. Rice Research Library in Seattle. Reached by email, Dr. Mizroch said that Nishiwaki, who died in 1984, was a very well-respected scientist and that the figure he cited should be treated as reliable.

According to Mizroch, who has reviewed many of the records from the Antarctic whaling era, whales were often measured in pieces after they were cut up, which greatly increases the possibility of error. That is likely not the case with the 98-foot measurement, which took place in 1947 at a whaling station in Antarctica where Nishiwaki was stationed as a scientific observer.

Blue whale (NOAA)

Proper scientific measurements, using the so-called “standard method,” are taken in a straight line from the tip of the snout to the notch in the tail flukes. This technique was likely not used until well into the 20th century, said Mizroch. In fact, it wasn’t until the 1940s that the use of a metal tape measure became commonplace. According to Dan Bortolotti, author of Wild Blue: A Natural History of the World’s Largest Animal, many of the larger whales in the whaling records, especially those said to be over 100 feet, were probably measured incorrectly or even deliberately exaggerated, because whalers were paid bonus money based on the size of the animal caught.

So, according to the best records we have, the largest blue whale ever properly measured was 98 feet long. Granted, 98 feet is close to 100 feet, but it’s not 100 feet, and it’s certainly not over 100 feet, as so many otherwise reputable references state.

So, setting aside the fact that so many sources say the blue whale reaches 100 feet or more, despite the lack of scientific evidence, a key question remains: how large can a whale become?

Blue whale from the National Oceanic and Atmospheric Administration

Most baleen whales are so-called lunge feeders. They open their mouths wide and lunge at prey like krill or copepods, drawing in hundreds of pounds of food at a time. Lunge-feeding baleen whales, it turns out, are wonderfully efficient feeders. The larger they become, the larger their gulps are, and the more food they draw in. But they also migrate vast distances, and oftentimes have to dive deep to find prey, both of which consume a large amount of energy.

A 2019 scientific paper in Science described how a team of researchers used an ocean-going Fitbit-like tag to track whales’ foraging patterns, hoping to measure the animals’ energetic efficiency, or the total amount of energy gained from foraging, relative to the energy expended in finding and consuming prey. The team concluded that there are likely ecological limits to how large a whale can become and that maximum size in filter feeders “is likely constrained by prey availability across space and time.” That is especially the case in today’s era, when overfishing and illegal fishing, including krill harvesting in Antarctica, have reduced the amount of prey available even in regions that used to be very prolific.
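
To make that scaling logic concrete, here is a toy back-of-the-envelope model (my own illustration, with made-up numbers; it is not the paper’s model). It assumes a whale’s gulp grows somewhat faster than its body mass as length increases, so bigger whales win in dense prey patches, while a small or sparse patch caps the payoff no matter how big the whale is:

```python
# Toy model of lunge-feeding efficiency vs. body length.
# All constants and scaling exponents here are assumptions for illustration,
# not values from the Science paper.
KRILL_ENERGY_PER_M3 = 500.0  # assumed prey energy density, kJ per m^3 engulfed

def efficiency(length_m: float, patch_m3: float) -> float:
    """Energy gained / energy spent for one lunge (toy numbers only)."""
    gulp_m3 = 0.01 * length_m ** 3.5   # engulfment capacity, assumed to outpace mass
    cost_kj = 2.0 * length_m ** 3      # lunge cost ~ body mass ~ length^3 (assumed)
    gained_kj = KRILL_ENERGY_PER_M3 * min(gulp_m3, patch_m3)  # can't eat more than the patch
    return gained_kj / cost_kj

for L in (15, 20, 25, 30):
    print(f"{L} m whale: dense patch {efficiency(L, 5000):.1f}x, "
          f"sparse patch {efficiency(L, 50):.1f}x")
# Dense patches reward size; sparse patches punish it. Prey availability
# sets the ceiling, which is the paper's central point.
```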

Whale fall off the California Coast (Ocean Exploration Trust)

John Calambokidis, a Senior Research Biologist and co-founder of Cascadia Research, a non-profit research organization formed in 1979 and based in Olympia, Washington, has studied blue whales up and down the West Coast for decades. He told California Curated that the persistent use of the 100-foot figure can be misleading, especially when the number is used as a reference to blue whales off the coast of California.

The sizes of different blue whale groups differ significantly depending on their location around the globe. Antarctic whales tend to be much bigger, largely due to the amount of available food in cold Southern Ocean waters. The blue whales we see off the coasts of California, Oregon, Washington, and Alaska are part of a different group from those in Antarctica. They differ both morphologically and genetically, and they consume different types and quantities of food. North Pacific blue whales, our whales, tend to be smaller and likely have always been so. Calambokidis believes the chance of any blue whale off the West Coast of the US ever reaching anything close to 100 feet is “almost non-existent.”

I emailed Regina Asmutis-Silvia, Executive Director of Whale and Dolphin Conservation North America, to ask about this discrepancy among so many seemingly authoritative outlets. She wrote: “While it appears biologically possible for blue whales to reach or exceed lengths of 100’, the current (and limited) photogrammetry data suggest that the larger blue whales which have been more recently sampled are under 80 feet.” Photogrammetry is the process of using several photos of an object, like a blue whale, to extract a three-dimensional measurement from two-dimensional data. It is widely used in biology, as well as engineering, architecture, and many other disciplines. Photogrammetry measurements are now often acquired by drones and have proven to be a more accurate means of measuring whale size at sea.
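
The core idea is simple enough to sketch in a few lines of code. The version below is the single-image case, with hypothetical camera numbers I’ve chosen for illustration (real workflows use calibrated cameras, measured altitude, and multiple overlapping images): knowing the lens geometry and the altitude tells you how much sea surface each pixel covers, and from there a whale’s pixel length becomes meters.

```python
# Minimal single-image sketch of drone measurement of whale length.
# Camera numbers and pixel count below are hypothetical, for illustration only.

def ground_sample_distance(altitude_m, focal_len_mm, sensor_width_mm, image_width_px):
    """Meters of sea surface covered by one image pixel (camera pointed straight down)."""
    return (altitude_m * sensor_width_mm) / (focal_len_mm * image_width_px)

gsd = ground_sample_distance(altitude_m=40.0, focal_len_mm=8.8,
                             sensor_width_mm=13.2, image_width_px=5472)
whale_px = 1500  # measured in the photo, snout tip to fluke notch
print(f"{gsd * 100:.1f} cm per pixel -> whale is about {whale_px * gsd:.1f} m "
      f"({whale_px * gsd * 3.28:.0f} ft)")  # ~16.4 m, ~54 ft
```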

Antarctic whaling station.

Here’s a key point: in the early part of the 20th century and before, whales were measured by whalers for the purpose of whaling, not by scientists for the purpose of science. Again, none of this is to say that blue whales aren’t gargantuan animals. They are massive and magnificent, but if we are striving for precision, it is not accurate to declare, as so many articles and other media do, that blue whales reach lengths of 100 feet or more, or to insinuate that this size is in any way common. This is not to say it’s impossible that whales have grown to or above 100 feet; it’s that, according to the scientific record, none ever has.

A relevant point from Dr. Asmutis-Silvia about the early days of Antarctic whaling: “Given that whales are long-lived and we don’t know at what age each species reaches its maximum length, it is possible that we took some very big, very old whales before we started to measure what we were taking.”

In an email exchange, Jeremy Goldbogen, the Stanford scientist who authored the Science study mentioned above, said that measurements with drones off California “have been as high as 26 meters,” or about 85 feet.

So why do so many sources, online and elsewhere, cite the 100-foot number? It probably has to do with our love of superlatives and round numbers. We have a deep, visceral NEED to be able to say that such and such animal is the biggest or the heaviest or the smallest or whatever. And, when it comes down to it, 100 feet is a nice round number that rolls easily off the tongue or the typing fingers.

All said, blue whales remain incredible and incredibly large animals, and deserve our appreciation and protection. Their impressive rebound over the last half-century is to be widely celebrated, but let’s not, in the spirit of scientific inquiry, overstate their magnificence. They are magnificent enough.

California Is a Nobel Powerhouse

You can keep your Oscars, Emmys, Grammys, and Tonys. Take your Pulitzers, Bookers, and Peabodys, too. Even the Pritzker and the Fields Medal don’t quite measure up. For me, nothing competes with the Nobel Prize as a symbol that someone has truly changed the world.

I’m not a scientist, but my mind lives in that space. Science, more than anything else, runs the world and reshapes it. This newsletter was born out of my fascination with how things work, the quiet mechanics behind the visible world, and my love for all that California has to offer in the way of innovation and natural beauty. I love standing in front of something familiar and asking: why? how? what exactly is going on here? And nothing satisfies that intense curiosity more than science.

That said, I’ve never loved the word science. It feels cold and sometimes intimidating, as if it applies to people in lab coats and not to everyone else. I kinda wish there were a better word for that spirit of discovery that lives in all of us. Maybe it’s wonder. Maybe curiosity. I dunno. “Science” turns people off sometimes, unfortunately.

Whatever you call it, the Nobel Prize represents the highest acknowledgment of that pursuit. It is the world’s way of saying: this mattered. This changed something. And there are few places (if any) on Earth that can rival California when it comes to the number of people who have earned that honor.

This year, 2025, was no different. Three of the Nobel Prizes announced this week carried California fingerprints, adding to a tradition that stretches back more than a century.

The Nobel Prize in Physiology or Medicine came first. It went to Mary Brunkow, Shimon Sakaguchi, and Fred Ramsdell. (In epic California fashion, Ramsdell, who studied at UCLA and UC San Diego, didn’t even learn he’d become a Nobel laureate until after returning from a trip deep into the Wyoming wilderness, where he’d been out of contact with the outside world. What’s more Californian than that?) Their research on regulatory T cells explained how the immune system knows when to attack and when to stand down. Ramsdell’s discovery of a key gene that controls these cells has transformed how scientists think about autoimmune disease and organ transplantation.

Next came the Nobel Prize in Physics, awarded to John Clarke of UC Berkeley, Michel H. Devoret of UC Santa Barbara and Yale, and John M. Martinis of UC Santa Barbara (big shout out to UCSB!). Their award honored pioneering work that revealed how the strange laws of quantum mechanics can be seen in circuits large enough to hold in your hand. Beginning in Clarke’s Berkeley lab in the 1980s, the trio built superconducting loops that behaved like subatomic particles, “tunneling” and flipping between quantum energy states. Those experiments helped create the foundation for today’s quantum computers.

The Chemistry Prize followed a day later, shared by Susumu Kitagawa, Richard Robson, and Omar M. Yaghi of UC Berkeley for discoveries in metal–organic frameworks, or MOFs. These are crystalline materials so porous that a single gram can hold an entire roomful of gas (mind blown). MOFs are now used to capture carbon dioxide, filter water, and even pull drinking water from desert air. Yaghi’s Berkeley lab coined the term “reticular chemistry” to describe this new molecular architecture. His work has become one of California’s most important contributions to the climate sciences.

California Institute of Technology (Photo: Erik Olsen)

Those three announcements in as many days lit up California’s scientific community, garnered headlines around the world, and carried on a tradition that has made the state one of the world’s most reliable engines of Nobel-level discovery.

The University of California system now counts 74 Nobel Prizes among its faculty and researchers, including 23 in physics and 16 in chemistry. Berkeley leads the list, with 26 laureates, followed by UC San Diego, UCLA, UC Santa Barbara, and UC San Francisco. Even smaller campuses, such as UC Riverside, have ties to winners like Barry Barish, who shared the 2017 Nobel in Physics for detecting gravitational waves.

Linus Pauling with an inset of his Nobel Prize in 1955 (Wikipedia – public domain)

Caltech, which I have written about extensively and is quite close to my own home, counts 47 Nobel laureates (faculty, alumni, or postdocs). Its history is the stuff of legend. In 1923, Robert Millikan won for measuring the charge of the electron. In 1954, Linus Pauling received the Chemistry Prize for explaining the nature of the chemical bond. He later won the Peace Prize for his anti-nuclear activism, making him the only person to win two unshared Nobels.

Stanford University sits not far behind, with 36 Nobel winners in its history and about 20 currently active in its community. From the development of transistors and lasers to modern work in medicine and economics, Stanford’s laureates have changed the modern world in ways that are impossible to quantify but profound in their impact.

These numbers tell a clear story: since the mid-twentieth century, about one in every four Nobel Prizes in the sciences awarded to Americans has gone to researchers based at California institutions, an extraordinary concentration of curiosity, intellect, and ambition within a single state.

University of California Santa Barbara (Photo: Erik Olsen)

California’s Nobel dominance began early. In the 1930s, UC Berkeley’s Ernest Lawrence invented the cyclotron, a device that would transform physics and eventually medicine. Caltech, meanwhile, became a magnet for the world’s brightest physicists and chemists.

Over the decades, California’s universities turned their focus to molecular biology, biochemistry, and genetics. In the 1980s, the state’s physicists and engineers drove advances in lasers, semiconductors, and now, quantum circuits. And as biotechnology rose, San Diego and the Bay Area became ground zero for breakthroughs in medicine and life sciences. One of the great moments in genetics, the 1975 conference on recombinant DNA, took place at Asilomar on the Monterey coast.

Nobel Museum in Stockholm, Sweden (Photo: Erik Olsen)

This is all about more than geography and climate (although those are a big sell, for sure). California’s research institutions kick ass because they operate as ecosystems rather than islands. Berkeley physicists collaborate with engineers at Stanford. Caltech chemists trade ideas with biotech firms in San Diego. Graduate students drift between labs, startups, and national research centers like Lawrence Livermore and JPL. The boundaries between university and industry blur, with campuses like Stanford turning breakthrough discoveries into thriving commercial ventures (look how many of our big tech brains came out of Stanford). In California, research doesn’t end in the lab, it often turns into companies, technologies, and treatments that generate both knowledge and enormous economic value. Just look at AI today. 


I think the secret is cultural. Over the years, I’ve lived on the East Coast for almost two decades, and abroad for several years as well, and nothing compares to the California vibe. California has never been afraid of big risks. Its scientists are encouraged to chase questions that might take decades to answer (see our recent story on just this idea). There’s an openness to uncertainty here that works well in the natural sciences, but can also be found in Hollywood, Silicon Valley and, of course, space exploration.

When next year’s round of early morning calls comes from Stockholm, it is a good bet that someone in California will pick up. Maybe a physicist in Pasadena, a chemist in Berkeley, or a physician in La Jolla. Maybe they’ll answer the phone in bed, or get the news by text from a spouse while camping or on a morning jog. That’s when a Swedish-accented voice tells them that the world has just caught up to what they’ve been quietly building for years.

The Unsung California Labs That Powered the Digital Revolution

Researchers at Lawrence Livermore National Laboratory working with the Big Aperture Thulium (BAT) laser system, part of the laser and plasma research that laid the groundwork for generating the extreme ultraviolet light at the heart of today’s most advanced chipmaking machines. (Photo: Jason Laurea/LLNL)

When I started this website, my hope was to share California’s astonishing range of landscapes, laboratories, and ideas. This state is overflowing with scientific discovery and natural marvels, and I want readers to understand, and enjoy, how unusually fertile it is for discovery. If you’re not curious about the world, then this website is definitely not for you. If you are, then I hope you get something out of it when you step outside and look around.

I spend a lot of time in the California mountains and at sea, and I am endlessly amazed by the natural world at our doorstep. I am also fascinated by California’s industrial past, the way mining, oil, and agriculture built its wealth, and how it later became a cradle for the technologies and industries now driving human society forward. Of course, some people see technologies like gene editing and AI as existential risks. I’m an optimist. I see tools that, while potentially dangerous, can expand what is possible when used wisely.

An aerial view of Lawrence Livermore National Laboratory in 1960, when the Cold War spurred rapid expansion of America’s nuclear and scientific research campus east of San Francisco Bay. (Photo: LLNL Public Domain)

Today’s story turns toward technology, and one breakthrough in particular that has reshaped the modern world. It is not just in the phone in your pocket, but in the computers that train artificial intelligence, in advanced manufacturing, and in the systems that keep the entire digital economy running. The technology is extreme ultraviolet lithography (EUV). And one of the most important points I want to leave you with is that the origins of EUV are not found in Silicon Valley startups or corporate boardrooms but in California’s national laboratories, where government-funded science made the impossible possible.

This article is not a political argument, though it comes at a time when government funding is often questioned or dismissed. My purpose is to underscore how much California’s national labs have accomplished and to affirm their value.

This story begins in the late 1980s and 1990s, when it became clear that if Moore’s Law was going to hold, chipmakers would need shorter and shorter wavelengths of light to keep shrinking transistors. Extreme ultraviolet light, or EUV, sits way beyond the visible spectrum, at a wavelength of just 13.5 nanometers, far shorter than that of ordinary ultraviolet lamps. That short wavelength makes it possible to draw patterns on silicon at the tiniest scales…and I mean REALLY tiny.
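
You can see why wavelength matters with a quick calculation. In optical lithography, the smallest printable feature is commonly approximated by the Rayleigh criterion, CD = k1 × λ / NA, where NA is the numerical aperture of the optics and k1 is a process factor. The sketch below plugs in typical published numbers (the k1 value is an assumption on my part; real processes vary):

```python
# Rayleigh criterion: the smallest printable feature (critical dimension)
# scales with wavelength over numerical aperture. k1 ~ 0.3 is a typical
# process factor, assumed here for illustration.

def min_feature_nm(wavelength_nm: float, numerical_aperture: float, k1: float = 0.3) -> float:
    """Approximate minimum printable feature size, in nanometers."""
    return k1 * wavelength_nm / numerical_aperture

print(f"Deep UV (193 nm, NA 1.35 immersion): ~{min_feature_nm(193, 1.35):.0f} nm")
print(f"EUV (13.5 nm, NA 0.33):              ~{min_feature_nm(13.5, 0.33):.0f} nm")
# Dropping the wavelength from 193 nm to 13.5 nm is what lets EUV print
# features several times smaller in a single exposure.
```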

Ernest Orlando Lawrence at the controls of the 37-inch cyclotron in 1938. A Nobel Prize–winning physicist and co-founder of Lawrence Livermore National Laboratory, Lawrence’s legacy in nuclear science and high-energy research paved the way for the laboratory’s later breakthroughs in lasers and plasma physics — work that ultimately fed into the extreme ultraviolet light sources now powering the world’s most advanced chipmaking machines. (LLNL Public Domain)

At Lawrence Livermore and at Sandia’s Livermore campus, researchers with expertise in lasers and plasmas were tasked with figuring out how to generate a powerful, reliable source of extreme ultraviolet light for chipmaking. Their solution was to fire high-energy laser pulses at microscopic droplets of tin, creating a superheated plasma that emits at just the right (tiny) wavelength for etching circuits onto silicon.

The movement of light on mirrors in an ASML EUV lithography machine. More on it below.

Generating the light was only the first step. To turn it into a working lithography system required other national labs to solve equally daunting problems. Scientists at Berkeley’s Center for X-Ray Optics developed multilayer mirrors that could reflect the right slice of light with surprising efficiency. A branch of Sandia National Laboratories located in Livermore, California, worked on the pieces that translate light into patterns. So, in all: Livermore built and tested exposure systems, Berkeley measured and perfected optics and materials, and Sandia helped prove that the whole chain could run as a single machine.

Each EUV lithography machine is about the size of a bus, costs more than $150 million, and shipping one requires 40 freight containers, three cargo planes, and 20 trucks. (Photo: ASML)

It matters that this happened in public laboratories. The labs had the patient funding and the unusual mix of skills to attempt something that might not pay off for many years. The Department of Energy supported the facilities and the people. DARPA helped connect the labs with industry partners and kept the effort moving when it was still risky. There was no guarantee that the plasma would be bright enough, that the mirrors would reflect cleanly, or that the resists (the light-sensitive materials coated onto silicon wafers) would behave. The national labs could take that on because they are designed to tackle long-horizon problems that industry would otherwise avoid.

Only later did private industry scale the laboratory breakthroughs into the giant tools that now anchor modern chip factories. The Dutch company ASML became the central player, building the scanners that move wafers with incredible precision under the fragile EUV light. Those systems are now capable of etching transistor features as small as 5 nanometers…5 billionths of a meter. You really can’t even use the “smaller than a human hair” comparison here since human hair variation is so large at this scale as to render that comparison kind of useless. However, many people still do.

The ASML machines are marvels of technology and engineering. Truly amazing feats of human design. And they integrate subsystems from all over the world: Zeiss in Germany manufactures the mirrors, polished to near-atomic perfection, while San Diego’s Cymer (now part of ASML) supplies the laser-driven plasma light sources. The technology is so complex that a single scanner involves hundreds of thousands of components and takes months to assemble.

ASML’s EXE:5000 High-NA EUV lithography machine — a room-sized tool that etches the tiniest features on the world’s most advanced computer chips. (ASML)

It was TSMC and Samsung that then poured billions of dollars into making these tools reliable at scale, building the factories that now turn EUV light into the chips powering AI, smartphones, and countless other devices. Trillions of dollars are at stake. Some say the fate of humanity lies in the balance should Artificial General Intelligence eventually emerge (again, I don’t say that, but some do). All of this grew from the ingenuity and perseverance, along with the public funding, that sustained these California labs.

It’s disappointing that many of the companies profiting most from these technological breakthroughs are not based in the United States, even though the core science was proven here in California. That is fodder for a much longer essay, and perhaps even for a broader conversation about national industrial policy, something the CHIPS Act is only beginning to deal with.

However, if you look closely at the architecture of those monster machines, you can still see the fingerprints of the California work. A tin plasma for the light. Vacuum chambers that keep the beam alive. Reflective optics that never existed at this level before EUV research made them possible.

A photorealistic rendering of an advanced microprocessor, etched in silicon with extreme ultraviolet light — the kind of breakthrough technology pioneered in U.S. national labs, but now fabricated almost entirely in Taiwan, where the future of digital society is being made.

We often celebrate garages, founders, and the venture playbook. Those are real parts of the California story. This is a different part, just as important. The laboratories in Livermore, Berkeley, and Sandia are public assets. They exist because voters and policymakers chose to fund places where hard problems can be worked on for as long as it takes. The payoff can feel distant at first, then suddenly it is in your pocket. Like EUV. Years of quiet experiments on lasers, mirrors, and materials became the hidden machinery of the digital age.

The Salton Sea Was California’s Strangest Catastrophe

In California’s southeastern desert, the Salton Sea stretches across a wide, shimmering basin, a lake where there shouldn’t be one. At about 340 square miles, it’s the state’s largest lake. But it wasn’t created by natural forces. It was the result of a major engineering failure. I’ve long been fascinated with the place: its contradictions, its strangeness, its collision of nature and human ambition. It reflects so many of California’s tensions: water and drought, industry and wilderness, beauty and decay. And it was only relatively recently that I came to understand not just how the Salton Sea came to exist, but how remarkable the region’s geological past really is, and how it could play a major role in the country’s sustainable energy future.

In the early 1900s, the Imperial Valley was seen as promising farmland: its deep, silty soil was ideal for agriculture, but the land was arid and desperately needed irrigation. To bring water from the Colorado River, engineers created the Imperial Canal, a massive infrastructure project meant to transform the desert into productive farmland. But the job was rushed. The canal had to cross into Mexico and loop back into California, and much of it ran through highly erodible soil. Maintenance was difficult, and by 1904, silt and sediment had clogged portions of the canal.

The Southern Pacific Railroad was forced to move its lines several times as the raging, unleashed Colorado River expanded the Salton Sea. (Credit: Imperial Irrigation District)

To keep water flowing, engineers hastily dug a temporary bypass channel south of the clogged area, hoping it would only be used for a few months. But they failed to build proper headgates, critical structures for controlling water flow. In 1905, an unusually heavy season of rain and snowmelt in the Rockies caused the Colorado River to swell. The torrent surged downriver and overwhelmed the temporary channel, carving it wider and deeper. Before long, the river completely abandoned its natural course and began flowing unchecked into the Salton Sink, an ancient, dry lakebed that had once held water during wetter epochs but had long since evaporated. (This has happened many times over in the region’s history).

For nearly two years, the Colorado River flowed uncontrolled into this depression, creating what is now known as the Salton Sea. Efforts to redirect the river back to its original course involved a frantic, expensive engineering campaign that included the Southern Pacific Railroad and U.S. government assistance. The breach wasn’t fully sealed until early 1907. By then, the sea had already formed: a shimmering, accidental lake nearly 35 miles long and 15 miles wide, with no natural outlet, in the middle of the California desert.

In the 1950s and early ’60s, the Salton Sea was a glamorous desert escape, drawing crowds with boating, fishing, and waterskiing. Resorts popped up along the shore, and celebrities like Frank Sinatra, Jerry Lewis, Rock Hudson, the Beach Boys, and the Marx Brothers came to visit and perform. It was billed as a new Palm Springs with water, until rising salinity and environmental decline ended the dream. There have been few, if any, ecological accidents quite as strange.

The erosive power of the floodwaters was immense. The river repeatedly scoured channels that created waterfalls, which cut back through the ground, eroding soil at a rate of about 1,200 meters per day and carving gorges 15 to 25 meters deep and more than 300 meters wide. (Credit: Imperial Irrigation District)

The creation of the Salton Sea was both a blessing and a curse for the people of the Imperial Valley. On the one hand, the lake provided a new source of water for irrigation, and the fertile soil around its shores proved ideal for growing crops. On the other hand, the water was highly saline, and the lake became increasingly polluted over time, posing a threat to both human health and the environment.

Recently, with most flows diverted from the Salton Sea for irrigation, it has begun to dry up and is now considered a major health hazard, as toxic dust is whipped up by heavy winds in the area. The shrinking of the Salton Sea has also been killing off fish species that attract migratory birds.


 The New York Times recently wrote about the struggles that farmers face as the Salton Sea disappears, and how the sea itself will likely disappear entirely at some point.

“There’s going to be collateral damage everywhere,” Frank Ruiz, a program director with California Audubon, told the Times. “Less water coming to the farmers, less water coming into the Salton Sea. That’s just the pure math.”

Salton Sea can be beautiful, if toxic (Photo: Wikipedia)

To me, the story of the Salton Sea is fascinating: a vivid example of how human intervention can radically reshape the environment. Of course, there are countless cases of humans altering the natural world, but this one feels particularly surreal: an enormous inland lake created entirely by accident, simply because a river, the Colorado, one of the most powerful in North America, was diverted from its course. It’s incredible, and incredibly strange. What makes the region even more fascinating, though, is that the human-made lake sits in a landscape already full of geological drama.

The area around the Salton Sea sits in a tectonically active region, with the San Andreas Fault running directly through it. The San Andreas Fault is a major plate boundary, where the Pacific Plate is moving north relative to the North American Plate (see our story about how fast it’s moving here). As pretty much every Californian knows, the legendary fault is responsible for earthquakes and other tectonic activity across much of California.

If you look at a map of the area, you can see how the low-lying southern portion of the Salton Sea basin runs directly toward the Gulf of California. Over millions of years, the desert basin has been flooded numerous times by what is now the Gulf of California. As the fault system cuts through the region, the Pacific Plate is slowly sliding northwest, gradually pulling the Baja Peninsula away from mainland Mexico. Over millions of years, this tectonic motion is stretching and thinning the crust beneath the Imperial Valley and Salton Basin. If the process continues, geologists believe the area could eventually flood again, forming a vast inland sea, perhaps even making an island out of what is today Baja California. (We wrote about this earlier.)

Entrance to the Salton Sea Recreation Area (Wikipedia)

Yet even as the land shifts beneath it, the Salton Sea’s future may be shaped not just by geology, but by energy. Despite the ongoing controversy over the evaporating water body, the Salton Sea may play a crucial role in California’s renewable energy future. The region sits atop the Imperial Valley’s geothermal hotspot, where underground heat from all that tectonic activity creates ideal conditions for producing clean, reliable energy. Already home to one of the largest geothermal fields in the country, the area is now gaining attention for something even more strategic: lithium.

An aerial view of geothermal power plants among the farmland around the southern shore of the Salton Sea.
(Credit: Courtesy Lawrence Berkeley National Lab)

Beneath the surface, the hot, mineral-rich brine used in geothermal energy production contains high concentrations of lithium, a critical component in electric vehicle batteries. Known as “Lithium Valley,” the Salton Sea region has become the focus of several ambitious extraction projects aiming to tap into this resource without the large environmental footprint of traditional lithium mining. Gov. Gavin Newsom has called the area “the Saudi Arabia of lithium.” Even the Los Angeles Times has weighed in, claiming that “California’s Imperial Valley will be a major player in the clean energy transition.”

Companies like Controlled Thermal Resources (CTR) and EnergySource are developing direct lithium extraction (DLE) technologies that pull lithium from brine as part of their geothermal operations. The promise is a closed-loop system that produces both renewable energy and battery-grade lithium on the same site. If it proves viable, the Salton Sea could significantly reduce U.S. dependence on foreign lithium and cement California’s role in the global shift to clean energy. That’s a big if…and one we’ll be exploring in depth in future articles.

Toxic salt ponds along the Western shoreline (Photo: EmpireFootage)

Such projects could also provide significant economic investment in the region and help power California’s green energy ambitions. So for a place that looks kind of wrecked and desolate, there’s actually a lot going on. We promise to keep an eye on what happens. Stay tuned.

The Caltech Experiment That Proved How Life Copies Itself

DNA molecule (Midjourney)

I love reading New York Times obituaries, not because of any morbid fascination with death, but because they offer a window into extraordinary lives that might otherwise go unnoticed. These tributes often highlight people whose work had real impact, even if their names were never widely known. Unlike the celebrity coverage that fills so much of the media, these obituaries can be quietly riveting, full of depth, insight, and genuine accomplishment.

For two years I managed the New York Times video obituary series called Last Word. We interviewed people of high accomplishment who had made a difference in the world BEFORE they died, thus giving them a chance, at a later age (in our case 75 was the youngest, but more often people were in their 80s), to tell their own stories about their lives. They signed an agreement acknowledging that the interview would not be shown until after their death. Hence the series title: Last Word. Anyway, when I ran the program, I produced video obituaries for people as varied as Neil Simon, Hugh Hefner, Sandra Day O’Connor, Philip Roth, Edward Albee, and my favorite, the great Harvard biologist E.O. Wilson. Spending time with them and learning about their lives in their own words was a joy.

All of that is to say that obituaries often reveal the lives and accomplishments of people who have changed the world. These are stories that might never be told so thoughtfully or thoroughly anywhere else.

California Institute of Technology (Photo: Erik Olsen)

Which brings us to a quiet lab at Caltech in 1958, where two young biologists performed what some still call “the most beautiful experiment in biology.” Their names were Matthew Meselson and Franklin Stahl, and what they uncovered helped confirm the foundational model of modern genetics. With a simple centrifuge, a dash of heavy nitrogen, and a bold hypothesis, they confirmed how DNA, life’s instruction manual, copies itself. And all of it took place right here in California at one of the world’s preeminent scientific institutions: the California Institute of Technology, or Caltech, in Pasadena. The state is blessed to have so many great scientific minds and institutions where people work intensely, often in obscurity, to uncover the secrets of life and the universe.


Franklin Stahl died recently at his home in Oregon, where he had spent much of his career teaching and researching genetics. The New York Times obituary offered a thoughtful account of his life and work, capturing his contributions to science with typical respect. But after reading it, I realized I still didn’t fully grasp the experiment that made him famous, the Meselson-Stahl experiment, the one he conducted with Matthew Meselson at Caltech. It was mentioned, of course, but not explained in a way that brought its brilliance to life. So I decided to dig a little deeper.

Franklin Stahl in an undated photo. (Cold Spring Harbor Laboratory Library and Archives)

The Meselson-Stahl experiment didn’t just prove a point. It told a story about how knowledge is built: carefully, creatively, and with a precision that leaves no room for doubt. It became a model for how science can answer big questions with simple, clean logic and careful experimentation. And it all happened in California.

Let’s back up: When Watson and Crick proposed their now-famous double helix structure of DNA in 1953 (with significant, poorly recognized help from Rosalind Franklin), they also suggested a theory about how it might replicate. Their idea was that DNA separates into two strands, and each strand acts as a template to build a new one. That would mean each new DNA molecule is made of one old strand and one new. It was called the semi-conservative model. But there were other theories too. One proposed that the entire double helix stayed together and served as a model for building an entirely new molecule, leaving the original untouched. Another suggested that DNA might break apart and reassemble in fragments, mixing old and new in chunks. These were all plausible ideas. But only one could be true.

Watson and Crick with their model of the DNA molecule (Photo: A Barrington Brown/Gonville & Caius College/Science Photo Library)

To find out, Meselson and Stahl grew E. coli bacteria in a medium containing heavy nitrogen (nitrogen is a key component of DNA), a stable isotope that made the DNA denser than normal. After several generations, all the bacterial DNA was fully “heavy.” Then they transferred the bacteria into a medium with normal nitrogen and let them divide. After one generation, they spun the DNA in a centrifuge that separated it by weight. If DNA copied itself conservatively, the centrifuge would show two bands: one heavy, one light. If it was semi-conservative, it would show a single band at an intermediate weight. When they performed the experiment, the result was clear. There was only one band, right between the two expected extremes. One generation later, the DNA split into two bands: one light, one intermediate. The semi-conservative model was correct.
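
The experiment’s logic is so clean that it can be captured in a few lines of code. Here’s a small simulation (my own illustration, modeling no chemistry, just the bookkeeping of strands) that reproduces the banding pattern they saw:

```python
# Simulating the logic of the Meselson-Stahl experiment: start with
# all-"heavy" DNA, replicate semi-conservatively in "light" medium,
# and tally the density bands a centrifuge would show.
from collections import Counter

def replicate(molecules):
    """Semi-conservative copying: each duplex splits, and every old strand
    is paired with a newly built 'light' strand."""
    offspring = []
    for strand_a, strand_b in molecules:
        offspring.append((strand_a, "light"))
        offspring.append((strand_b, "light"))
    return offspring

def band(molecule):
    heavy_strands = sum(strand == "heavy" for strand in molecule)
    return {2: "heavy", 1: "intermediate", 0: "light"}[heavy_strands]

dna = [("heavy", "heavy")] * 4  # generation 0: grown on heavy nitrogen
for generation in (1, 2):
    dna = replicate(dna)
    print(f"generation {generation}:", dict(Counter(band(m) for m in dna)))
# generation 1: all "intermediate" -- a single band between the extremes.
# generation 2: half "intermediate", half "light" -- two bands,
# exactly what Meselson and Stahl observed.
```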

Their results were published in Proceedings of the National Academy of Sciences in 1958 and sent shockwaves through the biological sciences.

Meselson and Stahl experiment in diagram.

To me, the experiment brought to mind the work of Gregor Mendel, an Augustinian monk who, in the mid-1800s, quietly conducted his experiments in the garden of a monastery in Brno, now part of the Czech Republic. By breeding pea plants and meticulously tracking their traits over generations, Mendel discovered the basic principles of heredity, dominant and recessive traits, segregation, and independent assortment, decades before the word “gene” even existed. Like Mendel’s experiments, the Meselson-Stahl study was striking in its simplicity and clarity. Mendel revealed the rules; Meselson and Stahl uncovered the mechanism.

There’s a fantastic video in which the two men discuss the experiment; it’s worth watching. It was produced by iBiology, part of the nonprofit Science Communication Lab in Berkeley. In it, Meselson remembers how the intellectual climate at Caltech at the time was one of taking bold steps, not with the idea of making a profit, but for the sheer joy of discovery: “We could do whatever we wanted,” he says. “It was very unusual for such young guys to do such an important experiment.”

California Institute of Technology (Photo: Erik Olsen)

Most people think of Caltech as a temple of physics. It’s where Einstein lectured, where the Jet Propulsion Laboratory was born (Caltech still runs it), and where the gravitational waves that rippled through spacetime were detected. But Caltech has a quieter legacy in biology. Its biologists were among the first to take on the structure and function of molecules inside cells. The institute helped shape molecular biology as a new discipline at a time when biology was still often considered a descriptive science. Long before Silicon Valley made biotech a household term, breakthroughs in genetics and neurobiology were already happening in Southern California.

Meselson and Stahl in the iBiology video (Screen grab: Science Communication Lab)

The Meselson-Stahl experiment is still taught in biology classrooms (my high-school-age daughter knew of it) because of how perfectly it answered the question it set out to ask. It was elegant, efficient, and unmistakably clear. And it showed how a well-constructed experiment can illuminate a fundamental truth. Their discovery laid the groundwork for everything from cancer research to forensic DNA analysis to CRISPR gene editing. Any time a scientist edits a gene or maps a mutation, they are relying on that basic understanding of how DNA replicates.

In a time when science often feels far too complex, messy, or inaccessible, the Meselson-Stahl experiment is a reminder that some of the most important discoveries are also the simplest. Think Occam’s Razor. Two young scientists, some nitrogen, a centrifuge, a clever idea, and a result that changed biology forever.


California’s Eye on the Cosmos: The SLAC-Built Camera That Will Time-Lapse the Universe

Images from the most powerful astronomical discovery machine ever created, and built in California

A breathtaking zoomed-in glimpse of the cosmos: this first image from the Vera C. Rubin Observatory reveals a deep field crowded with galaxies, offering just a taste of the observatory’s power to map the universe in unprecedented detail.
(Credit: NSF–DOE Vera C. Rubin Observatory)

I woke up this morning to watch a much-anticipated press conference about the release of the first images from the Vera C. Rubin Observatory. It left me flabbergasted, not just for what we saw today, but for what is still to come. The images weren’t just beautiful; they hinted at a decade of discovery that could reshape what we know about the cosmos. I just finished watching and have to catch my breath. What lies ahead is very, very exciting.

The first images released today mark the observatory’s “first light,” the ceremonial debut of a new telescope. These images are the result of decades of effort by a vast and diverse global team who together helped build one of the most advanced scientific instruments ever constructed. In the presser, Željko Ivezić, Director of the Rubin Observatory and the guy who revealed the first images, called it “the greatest astronomical discovery machine ever built.”

This image combines 678 separate images taken by NSF–DOE Vera C. Rubin Observatory in just over seven hours of observing time. Combining many images in this way clearly reveals otherwise faint or invisible details, such as the clouds of gas and dust that comprise the Trifid nebula (top) and the Lagoon nebula, which are several thousand light-years away from Earth.
(Credit: NSF–DOE Vera C. Rubin Observatory)

The images shown today are a mere hors d’oeuvre of what’s to come, and you could tell by the enthusiasm and giddiness of the scientists involved how excited they are about what lies ahead. Here’s a clip of Željko Ivezić as the presser ended. It made me laugh.

So, that first image you can see above. Check out the detail. What would normally be perceived as black, empty space to us star-gazing earthlings shows anything but. It shows that in each tiny patch of sky, if you look deep enough, galaxies and stars are out there blazing. If you know the famous Hubble Deep Field image, later expanded by NASA’s James Webb Space Telescope, you may already be aware that there is no such thing as empty sky. The universe contains so much stuff, it is truly impossible for our brains (or at least my brain) to comprehend. Vera Rubin will improve our understanding of what’s out there and what we’ve seen before by orders of magnitude.   

This image captures a small section of NSF–DOE Vera C. Rubin Observatory’s view of the Virgo Cluster, revealing both the grand scale and the faint details of this dynamic region of the cosmos. Bright stars from our own Milky Way shine in the foreground, while a sea of distant reddish galaxies speckle the background.
(Credit: NSF–DOE Vera C. Rubin Observatory)

I’ve been following the Rubin Observatory for years, ever since I first spoke with engineers at the SLAC National Accelerator Laboratory about the digital camera they were building for a potential story for an episode of the PBS show NOVA that I produced (sadly, the production timeline ultimately didn’t work out). SLAC is one of California’s leading scientific institutions, known for groundbreaking work across fields from particle physics to astrophysics. (We wrote about it a while back.)

The night sky seen from inside the Vera Rubin Observatory (Credit: NSF–DOE Vera C. Rubin Observatory)

Now fully assembled atop Chile’s Cerro Pachón, the Vera C. Rubin Observatory is beginning its incredible and ambitious mission. Today’s presser focused on unveiling the first images captured by its groundbreaking camera, offering an early glimpse of the observatory’s vast potential. At the heart of the facility is SLAC’s creation: the world’s largest digital camera, a 3.2-gigapixel behemoth developed by the U.S. Department of Energy.

This extraordinary instrument is the central engine of the Legacy Survey of Space and Time (LSST), a decade-long sky survey designed to study dark energy, dark matter, and the changing night sky with unprecedented precision and frequency. We are essentially creating a decade-long time-lapse of the universe in detail that has never been captured before, revealing the dynamic cosmos in ways previously impossible. Over the course of ten years, it will catalog 37 billion individual astronomical objects, returning to observe each one every three nights to monitor changes, movements, and events across the sky. I want to learn more about how Artificial Intelligence and machine learning are being brought to bear to help scientists understand what they are seeing.

The camera, over 5 feet tall and weighing about three tons, took more than a decade to build. Its focal plane is 64 cm wide, roughly the size of a small coffee table, and consists of 189 custom-designed charge-coupled devices (CCDs) stitched together in a highly precise mosaic. These sensors operate at cryogenic temperatures to reduce noise and can detect the faintest cosmic light, comparable to spotting a candle from thousands of miles away.

The LSST Camera was moved from the summit clean room and attached to the camera rotator for the first time in February 2025. (Credit: RubinObs/NOIRLab/SLAC/DOE/NSF/AURA)

Rubin’s camera captures a massive 3.5-degree field of view, more than most telescopes can map in a single shot. That’s a swath about seven full moons across. Each image takes just 15 seconds to capture and only two seconds to download. A single Rubin image contains roughly as much data as all the words The New York Times has published since 1851. The observatory will generate about 20 terabytes of raw data every night, which will be transmitted via a high-speed 600 Gbps link to processing centers in California, France, and the UK. The data will then be routed through SLAC’s U.S. Data Facility for full analysis.
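
Those figures are easier to appreciate with a little arithmetic. The sketch below is my own sanity check on the numbers quoted above (the bytes-per-pixel figure is an assumption; the actual raw format and compression differ):

```python
# Back-of-the-envelope check on Rubin's data numbers.
pixels = 3.2e9           # 3.2-gigapixel focal plane
bytes_per_pixel = 2      # assuming uncompressed 16-bit reads (illustrative)
image_gb = pixels * bytes_per_pixel / 1e9
print(f"one raw exposure: ~{image_gb:.1f} GB")

nightly_bytes = 20e12    # ~20 TB of raw data per night (stated above)
link_bits_per_s = 600e9  # the 600 Gbps network link (stated above)
transfer_min = nightly_bytes * 8 / link_bits_per_s / 60
print(f"a full night's data at line rate: ~{transfer_min:.1f} minutes of transfer")
# ~6.4 GB per exposure, and only ~4.4 minutes to move a night's 20 TB at
# 600 Gbps, which is why images can stream off the mountain almost as
# fast as they are taken.
```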

The complete focal plane of the future LSST Camera is more than 2 feet wide and contains 189 individual sensors that will produce 3,200-megapixel images. Crews at SLAC have now taken the first images with it. (Credit: Jacqueline Orrell/SLAC National Accelerator Laboratory)

The images produced will be staggering in both detail and scale. Each exposure will be sharp enough to reveal distant galaxies, supernovae, near-Earth asteroids, and other transient cosmic phenomena in real time. By revisiting the same patches of sky repeatedly, the Rubin Observatory will produce an evolving map of the dynamic universe, something no previous observatory has achieved at this scale.

What sets Rubin apart from even the giants like Hubble or James Webb is its speed, scope, and focus on change over time. Where Hubble peers deeply at narrow regions of space and Webb focuses on the early universe in infrared, Rubin will cast a wide and persistent net, watching the night sky for what moves, vanishes, appears, or explodes. It’s designed not just to look, but to watch. Just imagine the kind of stuff we will see!

The LSST Camera’s imaging sensors are grouped into units called “rafts.” Twenty-one square rafts, each with nine sensors, will capture science images, while four smaller rafts with three sensors each handle focus and telescope alignment. (Credit: Farrin Abbott/SLAC National Accelerator Laboratory)

This means discoveries won’t just be about what is out there, but what happens out there. Astronomers expect Rubin to vastly expand our knowledge of dark matter by observing how mass distorts space through gravitational lensing. It will also help map dark energy by charting the expansion of the universe with unprecedented precision. Meanwhile, its real-time scanning will act as a planetary defense system, spotting potentially hazardous asteroids headed toward Earth.

But the magic lies in the possibility of the unexpected. Rubin may detect rare cosmic collisions, unknown types of supernovae, or entirely new classes of astronomical phenomena. Over ten years, it’s expected to generate more than 60 petabytes of data, more than any other optical astronomy project to date. Scientists across the globe are already preparing for the data deluge, building machine learning tools to help sift through the torrent of discovery.

And none of it would be possible without SLAC’s camera. A triumph of optics, engineering, and digital sensor technology, the camera is arguably one of the most complex and capable scientific instruments ever built. I don’t care if you’re a Canon or a Sony person, this is way beyond all that. It’s a monument to what happens when curiosity meets collaboration, with California’s innovation engine powering the view.

As first light filters through the Rubin Observatory’s massive mirror and into SLAC’s camera, we are entering a new era of astronomy, one where the universe is not just observed, but filmed, in exquisite, evolving detail. This camera won’t just capture stars. It will reveal how the universe dances.

Caltech’s Einstein Papers Project is a Window into the Mind of a Genius

Albert Einstein on the beach in Santa Barbara in 1931 (The Caltech Archives)

We wrote a piece a while back about the three winters Albert Einstein spent in Pasadena, a little-known chapter in the life of a man who changed how we understand the universe. It was our way of showing how Einstein, often seen as a figure of European academia and global science, formed a real affection for California and for Pasadena in particular. It’s easy to picture him walking the streets here, lost in thought or sharing a laugh with Charlie Chaplin. The idea of those two geniuses, one transforming physics and the other revolutionizing comedy, striking up a friendship is something worth imagining.

But Einstein’s connection to Pasadena didn’t end there. It lives on in a small, nondescript building near the Caltech campus, where a group of researchers continues to study and preserve the legacy he left behind.

The Einstein Papers Project (EPP) at Caltech is one of the most ambitious and influential scientific archival efforts of the modern era. It’s not just about preserving Albert Einstein’s work—it’s about opening a window into the mind of one of the most brilliant thinkers in history. Since the late 1970s, a dedicated team of scholars has been working to collect, translate, and annotate every significant document Einstein left behind. While the project is headquartered at the California Institute of Technology, it collaborates closely with Princeton University Press and the Hebrew University of Jerusalem, which houses the original manuscripts.

Einstein at the Santa Barbara home of Caltech trustee Ben Meyer on Feb. 6, 1933.
(The Caltech Archives)

The idea began with Harvard physicist and historian Gerald Holton, who saw early on that Einstein’s vast output—scientific papers, personal letters, philosophical musings—deserved a meticulously curated collection. That vision became the Einstein Papers Project, which has since grown into a decades-long effort to publish The Collected Papers of Albert Einstein, now spanning over 15 volumes (and counting). The project’s goal is as bold as Einstein himself: to assemble a comprehensive record of his life and work, from his earliest student notebooks to the letters he wrote in the final years of his life.

Albert Einstein and Charlie Chaplin during the premiere of the film ‘City Lights’. (Wikipedia)

Rather than being stored in a traditional library, these documents are carefully edited and presented in both print and online editions. And what a treasure trove it is. You’ll find the famous 1905 “miracle year” papers that revolutionized physics, laying the foundation for both quantum mechanics (which Einstein famously derided) and special relativity. You’ll also find handwritten drafts, scribbled calculations, and long chains of correspondence—sometimes with world leaders, sometimes with lifelong friends. These documents don’t just chart the course of scientific discovery; they reveal the very human process behind it: doubt, revision, flashes of inspiration, and stubborn persistence.

At the Mount Wilson Observatory with the Austrian mathematician Walther Mayer, left, and Charles St. John of the observatory staff. (The Caltech Archives)

Some of the most fascinating material involves Einstein’s attempts at a unified field theory, an ambitious effort to merge gravity and electromagnetism into one grand framework. He never quite got there, but his notebooks show a mind constantly working, refining, rethinking—sometimes over decades.

But the project also captures Einstein the person: the political thinker, the pacifist, the refugee, the cultural icon. His letters reflect a deep concern with justice and human rights, from anti-Semitism in Europe to segregation in the United States. He corresponded with Sigmund Freud about the roots of violence, with Mahatma Gandhi about nonviolent resistance, and with presidents and schoolchildren alike. The archive gives us access to the full spectrum of who he was, not just a scientist, but a citizen of the world.

The Einstein Papers Project home near Caltech in Pasadena (Photo: Erik Olsen)

One of the most exciting developments has been the digitization of the archive. Thanks to a collaboration with Princeton University Press, a large portion of the Collected Papers is now freely available online through the Digital Einstein Papers website. Students, teachers, historians, and science nerds around the globe can now browse through Einstein’s original documents, many of them translated and annotated by experts. The most recent release, Volume 17, spans June 1929 to November 1930, capturing Einstein’s life primarily in Berlin as he travels across Europe for scientific conferences and to accept honorary degrees. The volume ends just before his departure for the United States. Princeton has a nice story on the significance of that particular volume by EPP Editor Josh Eisenthal.

The California Institute of Technology, CalTech (Photo: Erik Olsen)

For scholars, the project is a goldmine. It’s not just about Einstein—it’s about the entire intellectual climate of the 20th century. His collaborations and rivalries, his responses to global upheaval, and his reflections on science, faith, and ethics all provide insight into a remarkable era of discovery and change. His writings also show a playful, curious side—his love of music, his wit, and his habit of thinking in visual metaphors.

Caltech’s role in all this goes beyond simple stewardship. The Einstein Papers Project is a reflection of the institute’s broader mission: to explore the frontiers of science and human understanding. For decades, Caltech has been a breeding ground for great minds. As of January 23, 2025, there are 80 Nobel laureates who have been affiliated with Caltech, making it the institution with the highest number of Nobelists per capita in America. By preserving and sharing Einstein’s legacy, Caltech helps keep alive a conversation about curiosity, responsibility, and the enduring power of ideas.