I was struck by this work from a group of UC Davis scientists, as reported by the Merc: Lake Tahoe: Warmest water temperatures ever recorded threaten famed clarity, new study shows
Last year, the lake's average clarity was 73.1 feet, the UC Davis scientists reported. That's a 4.8-foot decrease from the previous year. The worst recorded average clarity was 64.1 feet in 1997, and the best was 102 feet, in 1968, the year measurements began. As the lake's crystal clarity began to suffer in the 1970s and '80s from fertilizer, polluted runoff and erosion, environmentalists launched a Keep Tahoe Blue effort. Federal officials and scientists began devoting more money and attention to the lake.
Local rules were tightened and education campaigns launched to reduce erosion and the amount of nitrogen and phosphorus -- chemicals that can cause algae to increase in the lake -- from fertilizer and old septic systems and air pollution from vehicles. Homeowners were required to capture stormwater from their properties -- channeling water from gutters into filters in gardens, for example, to keep it from running into the lake.
The work paid off: Lake clarity has generally improved over the past decade.
But now the warmer water and weather are presenting a new challenge.
The notion of using water clarity as a unifying measure of the overall ecological health of the Lake Tahoe basin is a fascinating one.
For one thing, water clarity seems to be quite straightforward to measure.
For another thing, we've been doing it for 50 years, which means we're starting to accumulate a non-trivial dataset (a sketch of the sort of computation it supports follows this list).
For yet another thing, it has already proven to be a valid sentinel for problematic changes in the basin, as the phosphate example from 35 years ago demonstrates.
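Just to underline how tractable a metric it is: each observation is a single Secchi-disk depth (the depth at which a lowered white disk vanishes from view), and annual averages and year-over-year changes fall out of trivial arithmetic. Here's a minimal sketch in Python, with made-up readings standing in for the real UC Davis record:

```python
from statistics import mean

# (year, secchi_depth_ft) -- hypothetical readings; the real record is
# the UC Davis Secchi-disk series, measured since 1968.
readings = [
    (1968, 102.4), (1968, 101.6),
    (1997, 63.8),  (1997, 64.4),
    (2015, 77.6),  (2015, 78.2),
    (2016, 72.8),  (2016, 73.4),
]

# Average the readings within each year.
by_year = {}
for year, depth in readings:
    by_year.setdefault(year, []).append(depth)
annual = {year: mean(depths) for year, depths in sorted(by_year.items())}

for year, avg in annual.items():
    print(f"{year}: average clarity {avg:.1f} ft")

# Year-over-year change -- the figure the Merc article reports.
years = sorted(annual)
for prev, cur in zip(years, years[1:]):
    print(f"{prev} -> {cur}: {annual[cur] - annual[prev]:+.1f} ft")
```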
However, simple as it sounds, measuring Lake Tahoe's conditions is still a challenge:
In recent years, UC Davis researchers have built 10 scientific monitoring stations to take readings every 30 seconds of water temperature, wave height, algae concentrations and other key indicators -- and then report them online. Each one, which sits in 7 feet of water, costs $50,000.
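I don't know what the stations' actual data format looks like, but conceptually each 30-second report is just a timestamped record of a handful of sensor values. A sketch, with field names and units entirely of my own invention:

```python
import time
from dataclasses import dataclass, asdict

@dataclass
class StationReading:
    """One 30-second report from a monitoring station (hypothetical format)."""
    station_id: str        # which of the ten stations
    timestamp: float       # Unix epoch seconds
    water_temp_c: float    # water temperature
    wave_height_m: float   # wave height
    algae_ug_per_l: float  # algae concentration

def poll_station(station_id: str) -> StationReading:
    """Stand-in for querying a station's sensors."""
    return StationReading(
        station_id=station_id,
        timestamp=time.time(),
        water_temp_c=14.2,     # placeholder values
        wave_height_m=0.3,
        algae_ug_per_l=1.8,
    )

# One report every 30 seconds, as the article describes; in reality
# the stations post these online.
reading = poll_station("tahoe-nearshore-01")
print(asdict(reading))
```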
I'm not sure I've ever noticed these monitoring stations; next time I'm up at the lake, I'll have to see if I can spot them.
One of the interesting things about trying to measure water systems and their behavior is that they are large and complex.
I loved this article over at 99% Invisible, with its coyly tongue-in-cheek title: America’s Last Top Model
In 1943, the Corps began construction on a model that could test all 1.25 million square miles of the Mississippi River. It would be a three-dimensional map of nearly half of the continental United States, rendered to a 1/2000 horizontal scale, spanning more than 200 acres. It was so big that the only way to see all of it at once was from a four-story observation tower.
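Those numbers, by the way, are internally consistent: shrinking both horizontal dimensions by a factor of 2,000 shrinks area by a factor of 2,000 squared, and 1.25 million square miles divided by 4 million comes to 0.3125 square miles, or exactly 200 acres. A quick check:

```python
# Sanity-checking the model's footprint from the article's figures.
basin_sq_mi = 1.25e6        # area the model represents
horizontal_scale = 2000     # 1/2000 horizontal scale
acres_per_sq_mi = 640

model_sq_mi = basin_sq_mi / horizontal_scale**2  # area scales with the square
print(model_sq_mi * acres_per_sq_mi)             # -> 200.0 acres
```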
It's possible that, as a child, I went through or near Clinton, Mississippi, though I don't remember ever doing so. I certainly don't remember seeing the model, though I'm sure I'd have been fascinated.
About all I remember from those years in South Louisiana is that I loved to make hydrology models of my own: when the rain would fall (and, oh boy, would it ever fall!), I would go outside and find a place near the curb or at the end of the property where the water was running, and I would funnel out little channels with my fingers, and launch tiny leaf-boats down the rivulets to see how they would proceed.
Perhaps I secretly always wanted to be a hydrologist. I bet I'd love visiting the Chesapeake Bay Hydraulic Model.
But I must confess: somehow, after living here for decades, I still have never managed to spend any time at the far-better-known San Francisco Bay Model.
In the late 1940s, John Reber proposed to build two large dams in the San Francisco Bay as a way to provide a more reliable water supply to residents and farms and to connect local communities. In 1953, the U.S. Army Corps of Engineers proposed a detailed study of the so-called Reber Plan. Authorized by Section 110 of the Rivers and Harbors Act of 1950, the Bay Model was constructed in 1957 to study the plan. The tests proved that the plan was not viable, and the Reber Plan was scuttled.
In software engineering, we often do this sort of experimentation. It commonly goes by the name "rapid prototyping" and is intended to be a way to learn quickly, at low risk and low cost. "Build one to throw away," they say.
Fifty-five years later, I still love to build models and experiment.