Thank you for writing. I am so very, very sorry.
Mental health issues are, in my opinion, the greatest unsolved tragedy of my time here on earth.
Erik Larson is deservedly famed for The Devil in the White City, which I enjoyed quite a lot when I read it years ago.
But Larson has written a number of books in addition to The Devil in the White City, and recently I happened to read Thunderstruck.
Thunderstruck is set at the turn of the 20th century, from about 1895 through about 1910, and focuses primarily on the unusual and turbulent life of Guglielmo Marconi, the Irish-Italian inventor who, at a very young age, moved to London and invented a set of devices which could send and receive messages using radio waves, something which quickly became known as the "Wireless Telegraph".
As a narrative device, Larson spins Marconi's story together with another story, that of H.H. Crippen, a physician of sorts and a peddler of the sorts of homeopathic "cures" that were popular at that time.
Alternating back and forth between the two stories, Larson brings things to a climax with an undeniably entertaining depiction of a heinous crime and its unraveling by Scotland Yard.
But, overall, I found myself rather unaffected, for a variety of reasons.
Most importantly, the two stories didn't really have anything to do with each other, except for the rather boring observation that Scotland Yard were able to make use of the Wireless Telegraph during their apprehension of the murderer.
Furthermore, neither of the main characters is all that gripping. Marconi certainly had a vivid time in the world spotlight, rushing here and there to demonstrate his invention, build a company to deliver it, and grow it into a worldwide success. But Marconi (at least in Larson's telling) was a quiet, private, almost reclusive man, with not much more than a string of failed relationships to add background and color to the story of his tireless work on refining and improving the wireless transmitters and receivers.
Crippen, meanwhile, is even less appealing, and Larson is forced to tell Crippen's story primarily through the stories of those he came in contact with: wives, girlfriends, business associates, etc.
The most interesting character in the book, I thought, was the completely fascinating John Nevil Maskelyne: magician, entertainer, skeptic, and general gadabout; the best part of the entire book, in my opinion, is the wonderful story of how Maskelyne attempts to disrupt one of Marconi's public exhibitions of wireless technology by commandeering a wireless transmitter of his own and beaming risque limericks into the on-stage receiver. And, we must not forget, Maskelyne is one of those rare creatures whose own book is still read and enjoyed now, 125 years after it was first published!
As he always does, Larson does a fine job of painting the picture of the time, and the book is pleasant enough to read.
But I think he tried to force his material to carry more than its fair share of a story, and the result, for me, was one of those situations where Thunderstruck's enormous build-up led to unavoidable disappointment when its reality sank in.
... but here's a little snapshot, anyway.
The flight management computer is a computer. What that means is that it’s not full of aluminum bits, cables, fuel lines, or all the other accoutrements of aviation. It’s full of lines of code. And that’s where things get dangerous.
Those lines of code were no doubt created by people at the direction of managers. Neither such coders nor their managers are as in touch with the particular culture and mores of the aviation world as the people who are down on the factory floor, riveting wings on, designing control yokes, and fitting landing gears. Those people have decades of institutional memory about what has worked in the past and what has not worked. Software people do not.
In the 737 Max, only one of the flight management computers is active at a time—either the pilot’s computer or the copilot’s computer. And the active computer takes inputs only from the sensors on its own side of the aircraft.
When the two computers disagree, the solution for the humans in the cockpit is to look across the control panel to see what the other instruments are saying and then sort it out. In the Boeing system, the flight management computer does not “look across” at the other instruments. It believes only the instruments on its side. It doesn’t go old-school. It’s modern. It’s software.
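To make the contrast concrete, here is a toy sketch, entirely my own illustration (invented function names and thresholds, not anything from Boeing's software), of the difference between trusting a single side's sensor and cross-checking both sides:

```python
# Hypothetical sketch contrasting two designs: a computer that trusts
# only its own side's sensor, versus one that "looks across" and flags
# a disagreement for the pilots instead of acting on bad data.
from typing import Optional

DISAGREEMENT_LIMIT = 5.0  # degrees; an illustrative threshold

def own_side_only(own_aoa: float, other_aoa: float) -> float:
    """Single-source design: the other side's sensor is simply ignored."""
    return own_aoa

def cross_checked(own_aoa: float, other_aoa: float) -> Optional[float]:
    """Cross-check design: on disagreement, return None and hand the
    decision back to the humans in the cockpit."""
    if abs(own_aoa - other_aoa) > DISAGREEMENT_LIMIT:
        return None
    return (own_aoa + other_aoa) / 2.0

# A failed left sensor reads 40 degrees while the right reads 2:
print(own_side_only(40.0, 2.0))   # 40.0 -- the bad value is acted upon
print(cross_checked(40.0, 2.0))   # None -- the disagreement is caught
```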
The TRITON intrusion is shrouded in mystery. There has been some public discussion surrounding the TRITON framework and its impact at the target site, yet little to no information has been shared on the tactics, techniques, and procedures (TTPs) related to the intrusion lifecycle, or how the attack made it deep enough to impact the industrial processes. The TRITON framework itself and the intrusion tools the actor used were built and deployed by humans, all of whom had observable human strategies, preferences, and conventions for the custom tooling of the intrusion operation. It is our goal to discuss these adversary methods and highlight exactly how the developer(s), operator(s) and others involved used custom tools in the intrusion.
In this report we continue our research of the actor’s operations with a specific focus on a selection of custom information technology (IT) tools and tactics the threat actor leveraged during the early stages of the targeted attack lifecycle (Figure 1). The information in this report is derived from multiple TRITON-related incident responses carried out by FireEye Mandiant.
Using the methodologies described in this post, FireEye Mandiant incident responders have uncovered additional intrusion activity from this threat actor – including new custom tool sets – at a second critical infrastructure facility. As such, we strongly encourage industrial control system (ICS) asset owners to leverage the indicators, TTPs, and detections included in this post to improve their defenses and hunt for related activity in their networks.
(Yes, he was the man who invented Erlang. And did a surprising amount of other stuff that you never knew about.)
Over lunch at an Avenue Matignon café in Paris, I asked a literate media entrepreneur and political expert to explain the Yellow Vests mystery. Who are they, what do they want, who leads them? He started by offering a metaphor. The interior of France, some call it La France Profonde (think of Red States), is like a forest that has been progressively desiccated by climate change. Then gasoline was poured on the trees. Sooner or later a spark would set it ablaze.
I’ve seen the desiccation, the emptying of France. In an October 2017 Monday Note titled The Compostelle Walk Into Another France, I recounted how, with daughter Marie, we walked through empty villages, no café, no bakery, no garage. Barking dogs roamed the streets during the day, the inhabitants having rushed to their distant jobs. I had had a similar impression years before, driving small country roads in Northern Burgundy, but walking the Compostelle (Camino de Santiago) made it more vivid.
Over the past three or four decades, La France Profonde has been slowly hollowed out as jobs moved out of the country or to larger urban areas. This desiccation isn’t unique to France, but gasoline was poured on the forest in the form of an accumulation of laws and regulations that exasperated the remaining France Profonde population, creating a schism between “real people” and “those people in Paris”.
This year’s developer festival will be held May 7-9 at the Shoreline Amphitheatre in Mountain View, CA
We all know about catastrophic headline-generating failures like AWS East-1 region falling apart or a major provider being down for a day or two. Then there are failures known only to those who care, like losing a major exchange point. However, I’m becoming more and more certain that the known failures are not even the tip of the iceberg - they seem to be the climber at the iceberg summit.
There are some cases when a person really does want control. If the person wants to determine their own path, if having choice is itself a personal goal, then you need control. That’s a goal about who you are not just what you get. It’s worth identifying moments when this is important. But if a person does not pay attention to something then that person probably does not identify with the topic and is not seeking control over it. “Privacy advocates” pay attention to privacy, and attain a sense of identity from the very act of being mindful of their own privacy. Everyone else does not.
Overall, it’s a nice design. They have adopted a conservative process node and frequency. They have taken a pretty much standard approach to inference by mostly leaning on a fairly large multiply/add array, in this case a 96×96 unit. What’s particularly interesting to me is what’s around the multiply/add array and their approach to extracting good performance from a conventional low-cost memory subsystem. I was also interested in their use of two redundant inference chips per car, with each exchanging results each iteration to detect errors before passing the final plan (the actuator instructions) to a safety system for validation before the commands are sent to the actuators. Performance and price/performance look quite good.
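The redundancy scheme described above boils down to a lockstep comparison. Here is a minimal sketch of the idea, purely my own illustration with invented names, not Tesla's implementation:

```python
# Illustrative lockstep sketch: two redundant inference units each
# produce an actuator plan per iteration and exchange results; a plan
# moves forward only when the two units agree.

def exchange_and_compare(plan_a, plan_b):
    """Forward the plan only when both chips computed the same result."""
    if plan_a != plan_b:
        # Mismatch: a hardware or transient error on one chip; drop
        # this iteration rather than pass a corrupted command along.
        return None
    return plan_a  # agreement: hand off to the safety system

# Matching plans pass through; a disagreement is caught:
print(exchange_and_compare([0.8, -0.1], [0.8, -0.1]))  # [0.8, -0.1]
print(exchange_and_compare([0.8, -0.1], [0.8, -0.2]))  # None
```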
An important optimization for Git servers is that the format for transmitted objects is the same as the heavily-compressed on-disk packfiles. That means that in many cases, Git can serve repositories to clients by simply copying bytes off disk without having to inflate individual objects.
But sometimes this assumption breaks down. Objects on disk may be stored as “deltas” against one another. When two versions of a file have similar content, we might store the full contents of one version (the “base”), but only the differences against the base for the other version. This creates a complication when serving a fetch. If object A is stored as a delta against object B, we can only send the client our on-disk version of A if we are also sending them B (or if we know they already have B). Otherwise, we have to reconstruct the full contents of A and re-compress it.
This happens rarely in many repositories where clients clone all of the objects stored by the server. But it can be quite common when multiple distinct but overlapping sets of objects are stored in the same packfile (for example, due to repository forks or unmerged pull requests). Git may store a delta between objects found only in two different forks. When someone clones one of the forks, they want only one of the objects, and we have to discard the delta.
Git 2.20 solves this by introducing the concept of “delta islands“. Repository administrators can partition the ref namespace into distinct “islands”, and Git will avoid making deltas between islands. The end result is a repository which is slightly larger on disk but is still able to serve client fetches much more cheaply.
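Concretely, Git's documentation describes delta islands being configured with `pack.island` regular expressions in the server repository's config, where a capture group in the pattern names the island. A sketch for a hosting setup that namespaces forks under `refs/virtual/<fork-id>/` (that ref layout is assumed here for illustration):

```ini
[pack]
	# Each fork's refs form their own island: the capture group
	# ([0-9]+) names the island, so Git avoids creating deltas that
	# cross fork boundaries and would have to be discarded at fetch time.
	island = refs/virtual/([0-9]+)/(heads|tags)/
```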
It had been six months, so I picked up the next episode of The Expanse: Babylon's Ashes.
I'm not actually sure how many series I've read this far, i.e., all the way to Book Six. Patrick O'Brian's series of stories about Jack Aubrey and Stephen Maturin is the only other that comes to mind.
Of course, O'Brian's books were one of the great literary achievements of the 20th century; that's a high bar!
But clearly The Expanse has something going on.
I agree with those who say that Babylon's Ashes was not the strongest of the series so far. Book Five was much better. The Free Navy are not very interesting, and I'm not sure where I stand on the investigation into the soul of Filip Nagata.
However, one of the strongest parts of the overall series is the way that characters with dark pasts are developed into rich and fascinating stories. Think of Clarissa Mao, or Basia Merton, or even Joe Miller from Book One (Leviathan Wakes).
And, we get some great space battles, and a very interesting new character in Michio Pa.
My pattern of late has been to take a bit of a break between books. After all, I have much else to read.
But I'm sure I'll be back for Book Seven. I suspect that, just as summer turns into fall, Persepolis Rising will be finding its way into my hands...
Here's a pretty amazing resource:
In 2006, documentary filmmaker Jason Scott began production on GET LAMP, a video documentary about the realm of text adventures and interactive fiction. Shooting and research time was roughly 4 years, during which Jason interacted with a large variety of members of the various communities and companies that made up the story of text adventures. Among these was Steve Meretzky, developer at Infocom, Inc, arguably the largest and most influential of the 1980s adventure game companies.
During his time at Infocom, Steve Meretzky meticulously gathered thousands of pages of notes, journals, maps, memos, forms and other printable materials related to all aspects of Infocom, and kept them in his basement for decades. During the GET LAMP production, Jason Scott scanned in roughly 9,000 pages of these documents across a number of months, borrowing the materials from Steve and scanning them as quickly as possible, at around 600dpi. From these scans, a portion was used in the GET LAMP movie to illustrate various scenes and descriptions by interviewees.
A collection of historical source files, for education and perusal.
A generous benefactor has very kindly offered retro fans and the wider internet a glorious gift this morning: a dump of source code from classic Infocom text games, including the original Zork adventures, Shogun, and Infocom's adaptation of the legendary Douglas Adams novel, Hitchhiker's Guide To The Galaxy.
Now, about those other 16 hours per day that I requested in order to study all this...
Generally speaking, I feel like I ought to like Medium.
So why is it that I seem to be actively avoiding every Medium link that shows up in my various feeds nowadays?
I can't clearly express the unease I have about the platform.
What do you think? Is Medium to be avoided, embraced, or is it just "meh"?
Every year, as we approach Earth Day, it's good to remember, and good to consider, this magical aggregate of dust upon which we all survive:
I’m paddling the length of the river, to try and understand that risk, my own and other people’s, and to see, from river level, what we could stand to lose if we don’t change how we use and allocate water. “Throughout the whole last century, if you needed more water it always worked out somehow, but it doesn’t work when you get to the point where you’re storing every last drop,” Doug Kenney, Director of the Western Water Policy program at the University of Colorado, tells me before I set out on the river. “You have to talk people through it, and explain that for every new reservoir you try and fill you’re putting more stress on the other parts of the system. Things are changing and we should behave in a way that limits our risk.”
On a map, Glen Canyon before its submersion looks like a centipede: a 200-mile-long central canyon bending and twisting with a host of little canyons like legs on either side. Those side canyons were sometimes hundreds of feet deep; some were so cramped you could touch both walls with your outstretched hands; some had year-round running water in them or potholes that explorers had to swim across. Sometimes in the cool shade of side-canyon ledges and crevices, ferns and other moisture-loving plants made hanging gardens. Even the names of these places are beautiful: Forbidding Canyon, Face Canyon, Dove Canyon, Red Canyon, Twilight Canyon, Balanced Rock Canyon, Ribbon Canyon. Like Dungeon Canyon, they are now mostly underwater.
When the Sierra Club pronounced Glen Canyon dead in 1963, the organization’s leaders expected it to stay dead under Lake Powell. But this old world is re-emerging, and its fate is being debated again. The future we foresee is often not the one we get, and Lake Powell is shriveling, thanks to more water consumption and less water supply than anyone anticipated. Beneath it lies not just canyons but spires, crests, labyrinths of sandstone, Anasazi ruins, petroglyphs, and burial sites, an intricate complexity hidden by water, depth lost in surface. The uninvited guest, the unanticipated disaster, reducing rainfall and snowmelt and increasing drought and evaporation in the Southwest, is climate change.
Why did Flint’s river pose so many problems? Before processing, the water itself is polluted from four sources: natural biological waste; treated industrial and human waste; untreated waste intentionally or accidentally dumped into the river; and contaminants washed into the river by rain or snow. The river is also warmer than Lake Huron, and its flow is less constant, particularly in the summer. All of this raises levels of bacteria and organic matter and introduces a wide range of other potential contaminants, whether natural or human-made.
In fact, while the Flint River had been improving thanks to the new regulations, the departure of heavy industry, and local cleanup efforts, it had long been known as an exceptionally polluted river. Until very recently, it had been repeatedly ruled out as a primary source for the city’s drinking water. It is hard to imagine why anyone familiar with the river’s history would ever decide to use it even as a temporary water source. But they did.
A neglected step caused the reactor’s power to plunge, and frantic attempts to revive it created an unexpected power surge. Poorly trained operators panicked. An explosion of hydrogen and oxygen sent Elena into the air “like a flipped coin” and destroyed the reactor. Operators vainly tried to stop a meltdown by planning to shove control rods in by hand. Escaping radiation shot a pillar of blue-white phosphorescent light into the air.
The explosion occurs less than 100 pages into this 366-page book (plus more than 100 pages of notes, glossary, cast of characters and explanation of radiation units). But what follows is equally gripping. Radio-controlled repair bulldozers became stuck in the rubble. Exposure to radiation made voices grow high and squeaky. A dying man whispered to his nurse to step back because he was too radioactive. A workman’s radioactive shoe was the first sign in Sweden of a nuclear accident 1,000 miles upwind. Soviet bigwigs entered the area with high-tech dosimeters they didn’t know how to turn on. Investigations blamed the accident on six tweakers, portrayed them as “hooligans” and convicted them.
President Trump, fresh off a trip to the U.S. southern border, doubled down on his message that “the country is full”
“The country is full. We have ... our system is full. We can't do it anymore,” Trump said
The president shared the same message earlier in the day at a roundtable with law enforcement and immigration officials, telling any potential migrants to “turn around” because the U.S. “can’t take you anymore.”
And here's another: Heartland Visas Report
- U.S. population growth has fallen to 80-year lows. The country now adds approximately 900,000 fewer people each year than it did in the early 2000s.
- The last decade marks the first time in the past century that the United States has experienced low population growth and low prime working age growth on a sustained basis at the same time.
- Uneven population growth is leaving more places behind. 86% of counties now grow more slowly than the nation as a whole, up from 64% in the 1990s.
- In total, 61 million Americans live in counties with stagnant or shrinking populations and 38 million live in the 41% of U.S. counties experiencing rates of demographic decline similar to Japan’s.
- 80% of U.S. counties, home to 149 million Americans, lost prime working age adults from 2007 to 2017, and 65% will again over the next decade.
- By 2037, two-thirds of U.S. counties will contain fewer prime working age adults than they did in 1997, even though the country will add 24.1 million prime working age adults and 98.8 million people in total over that same period.
- Population decline affects communities in every state. Half of U.S. states lost prime working age adults from 2007-2017. 43% of counties in the average state lost population in that same time period, and 76% lost prime working age adults.
- Shrinking places are also aging the most rapidly. By 2027, 26% of the population in the fastest shrinking counties will be 65 and older compared to 20% nationwide.
- Population loss is hitting many places with already weak socioeconomic foundations. The share of the adult population with at least a bachelor’s degree in the bottom decile of population loss is half that in the top decile of population growth. Educational attainment in the fastest shrinking counties is on average equivalent to that of Mexico today or the United States in 1978.
- Population loss itself perpetuates economic decline. Its deleterious effects on housing markets, local government finances, productivity, and dynamism make it harder for communities to bounce back. For example, this analysis found that a 1 percentage point decline in a county’s population growth rate is associated with a 2-3 percentage point decline in its startup rate over the past decade.
Happily for me, I live in one of those areas where immigrants are welcomed; nearly everyone that I spend my waking hours with is either an immigrant or a child of an immigrant, and my part of the country is experiencing the most breathtaking growth, both cultural growth and economic growth, since the pre-Civil-War "Gold Rush" years of 1849-1850.
But I understand that other areas of the country are different.
Two great tastes that taste great together! Alex Honnold Breaks Down Iconic Rock Climbing Scenes.
Thanks very much, GQ, for making and sharing this very entertaining 15-minute short feature!