It's another beautiful Bay Area summer afternoon, by which I mean that it's 2:00 PM and the sun has finally come out. At least, the next five hours will be beautiful. In the meantime, here are some of the things I'm reading this weekend:
- Moving forward
There is never a perfect time for this type of transition, but now is the right time. My original thoughts on timing would have had my retirement happen in the middle of our transformation to a devices and services company focused on empowering customers in the activities they value most. We need a CEO who will be here longer term for this new direction.
- Meet the Town That's Being Swallowed by a Sinkhole
What happened in Bayou Corne, as near as anyone can tell, is that one of the salt caverns Texas Brine hollowed out—a mine dubbed Oxy3—collapsed. The sinkhole initially spanned about an acre. Today it covers more than 24 acres and is an estimated 750 feet deep. It subsists on a diet of swamp life and cypress trees, which it occasionally swallows whole. It celebrated its first birthday recently, and like most one-year-olds, it is both growing and prone to uncontrollable burps, in which a noxious brew of crude oil and rotten debris bubbles to the surface. But the biggest danger is invisible; the collapse unlocked tens of millions of cubic feet of explosive gases, which have seeped into the aquifer and wafted up to the community. The town blames the regulators. The regulators blame Texas Brine. Texas Brine blames some other company, or maybe the regulators, or maybe just God.
- The Cost Of Creating A New Drug Now $5 Billion, Pushing Big Pharma To Change
A 2012 article in Nature Reviews Drug Discovery says the number of drugs invented per billion dollars of R&D invested has been cut in half every nine years for half a century. Reversing this merciless trend has caught the attention of the U.S. government. Francis Collins, the director of the National Institutes of Health, in 2011 started a new National Center for Advancing Translational Sciences to remove the roadblocks that keep new drugs from reaching patients.
- What Is Medium?
All this built the idea that Medium was something more than yet another blogging platform. It was a place to be seen. Pieces that might have run on The Atlantic, The New Yorker, or Wired would pop up on Medium, and I'd be like, "Dang. How'd that happen?"
- What Medium Is
…rich guys buy “credible” publications in order to have big platforms for their ideas.
But, even though I like what Hughes seems to represent, and he seems to have a thoughtful touch in how he’s running TNR, I’m pretty sure I’d forgotten the magazine existed by the time he bought it. I have no doubt there is a small but significant audience to whom the brand is really important, but cultural credibility is no longer based entirely on having an august old name atop some writing.
By contrast, Medium is a free-for-all, with the most perversely obtuse branding for a platform since Google named its nearly-chromeless browser Chrome. There’s some amount of crap on the site, for which it’s justifiably earning criticism, but there are also paid pieces which will undoubtedly start to meet or exceed the quality of the average TNR article.
- The Datacenter as a Computer: An Introduction to the Design of Warehouse-Scale Machines, Second edition
It's such a far-ranging book that it's impossible to characterize simply. It covers an amazing diversity of topics: an introduction to warehouse-scale computing; workloads and software infrastructure; hardware; datacenter architecture; energy and power efficiency; cost structures; how to deal with failures and repairs; and it closes with a discussion of key challenges, which include rapidly changing workloads, building responsive large-scale systems, energy proportionality of non-CPU components, overcoming the end of Dennard scaling, and Amdahl's cruel law.
- Of Dice and Men: The Story of Dungeons & Dragons and The People Who Play It
In Of Dice and Men, David Ewalt recounts the development of Dungeons & Dragons from the game’s roots on the battlefields of ancient Europe, through the hysteria that linked it to satanic rituals and teen suicides, to its apotheosis as father of the modern video-game industry. As he chronicles the surprising history of the game’s origins (a history largely unknown even to hardcore players) and examines D&D’s profound impact, Ewalt weaves laser-sharp subculture analysis with his own present-day gaming experiences. An enticing blend of history, journalism, narrative, and memoir, Of Dice and Men sheds light on America’s most popular (and widely misunderstood) form of collaborative entertainment.
- The Pentagon as Silicon Valley’s Incubator
Though Silicon Valley sees itself as an industry far removed from the Beltway, the two power centers have had a longstanding symbiotic relationship. And some say the cozy personal connections of ex-intelligence operatives to the military could invite abuse, like the divulging of private information to former colleagues in the agencies.
“They have enormous opportunities to cash in on their Washington experience, sometimes in ways that fund further innovation and other times in ways that might be very troubling to many people,” said Marc Rotenberg, executive director at the Electronic Privacy Information Center in Washington. “Both sides like to maintain a myth of distant relations. The ties have been in place for a long time.”
- John Carmack discusses the art and science of software engineering
I talked a lot last year about the work that we’ve done with static analysis and trying to run all of our code through static analysis and get it to run squeaky clean through all of these things, and it turns up hundreds and hundreds, even thousands of issues. Now it's great when you wind up with something that says, now clearly this is a bug, you made a mistake here, this is a bug, and you can point that out to everyone. And everyone will agree, okay, I won’t do that next time. But the problem is that the best of intentions really don’t matter. If something can syntactically be entered incorrectly, it eventually will be. And that’s one of the reasons why I’ve gotten very big on the static analysis.
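Carmack is talking about C/C++ analyzers, but the "if it can be entered incorrectly, it eventually will be" point is easy to see in any language with a static checker. Here's a tiny, hypothetical Python sketch of the kind of thing a tool like mypy flags before the unlucky code path ever runs:

```python
# Hypothetical example: syntactically legal, and works fine until count <= 0,
# but it violates the declared contract. A static checker catches it up front.

def describe(count: int) -> str:
    if count > 0:
        return f"{count} items"
    return None  # legal syntax, wrong type: the caller was promised a str

# Running mypy over this reports roughly:
#   error: Incompatible return value type (got "None", expected "str")
```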
- 100x faster Postgres performance by changing 1 line
Postgres is reading Table C using a Bitmap Heap Scan. When the number of keys to check stays small, it can efficiently use the index to build the bitmap in memory. If the bitmap gets too large, the query optimizer changes the way it looks up data. In our case it has a large number of keys to check, so it uses a more approximate way to retrieve the candidate rows and checks each row individually for a match on x_key and tags. All this “loading in memory” and “checking individual rows” takes time (the Recheck Cond in the plan).
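The easiest way to see whether you're in this situation is to look at the plan yourself. A minimal sketch from Python (hypothetical table name and connection string; the x_key and tags column names are borrowed from the excerpt above):

```python
# Print the query plan and look for "Bitmap Heap Scan", "Recheck Cond",
# and "Rows Removed by Index Recheck" to see how much rechecking is happening.
import psycopg2

conn = psycopg2.connect("dbname=test")  # assumed local database
cur = conn.cursor()

cur.execute("""
    EXPLAIN (ANALYZE, BUFFERS)
    SELECT *
    FROM   some_table                 -- hypothetical table
    WHERE  x_key = ANY(%s)            -- a large key list pushes the planner toward the bitmap path
      AND  tags && %s
""", (list(range(10000)), ["postgres"]))

for (line,) in cur.fetchall():        # each plan line comes back as a one-column row
    print(line)
```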
- what does "Bitmap Heap Scan" phase do?
A plain indexscan fetches one tuple-pointer at a time from the index, and immediately visits that tuple in the table. A bitmap scan fetches all the tuple-pointers from the index in one go, sorts them using an in-memory "bitmap" data structure, and then visits the table tuples in physical tuple-location order.
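To make that concrete, here's a toy sketch in plain Python (not Postgres internals) of why regrouping tuple pointers by heap page pays off:

```python
# Index matches arrive in index-key order as (heap_page, line_pointer) pairs.
matches = [(17, 3), (2, 5), (17, 8), (2, 1), (9, 4)]

# Plain index scan: visit the heap once per match, in index order.
plain_page_visits = [page for page, _ in matches]    # [17, 2, 17, 2, 9] -> repeated, scattered page reads

# Bitmap scan: build a page -> offsets "bitmap" first, then read pages in physical order.
bitmap = {}
for page, offset in matches:
    bitmap.setdefault(page, set()).add(offset)
bitmap_page_visits = sorted(bitmap)                  # [2, 9, 17] -> each page read once, in order

print(plain_page_visits, bitmap_page_visits)
```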
- I will not do your tech interview.
As I informally observed the track record of those pipelines in hiring great people, I began to realize that the only real predictor of great hires was if the candidate already knew someone on the team.
- Cultivating Hybrids: 4 Key Data Architectures for Scaling Infinitely
When a transaction happens on an in-memory data grid (IMDG), it is distributed across nodes with microsecond latency. Of course, not all businesses require this type of performance, but thousands of simultaneous transactions per second basically mandate it. With processing via functions, procedures, or queries, each member receives a request, computes partial results, and sends them back to be combined. This scatter-gather or MapReduce-style approach is the same model Hadoop uses, but it runs in real time with memory-level latency. Unlike in-memory databases, which replicate the entire data set across each member of a cluster, IMDGs distribute parts of the data across members. The system is responsible for tracking itself and knowing where each piece of data is, making the location transparent to clients. Part of the approach to architecting and managing IMDGs is optimizing the data’s distribution and replication. For example, strongly correlated data is colocated on a peer to remove network hops within a single query. The system also distributes functions, procedures, and queries transparently to the nodes hosting specific shards of data. There is still a limit here: there must be a financially reasonable way to store the entire data set in memory, and petabyte-scale data sets can push past it.
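The scatter-gather idea is simple enough to sketch by hand. Here's a minimal Python illustration with a made-up hash partitioning scheme and threads standing in for grid members (a real IMDG such as Hazelcast or GemFire does this across machines, not threads):

```python
# Each dict stands in for one grid member's shard of the data.
from concurrent.futures import ThreadPoolExecutor

NUM_PARTITIONS = 4
partitions = [dict() for _ in range(NUM_PARTITIONS)]

def owner(key):
    return hash(key) % NUM_PARTITIONS          # each key lives on exactly one member

def put(key, value):
    partitions[owner(key)][key] = value

def partial_total(partition, prefix):
    # Runs "next to the data": each member scans only its own shard.
    return sum(v for k, v in partition.items() if k.startswith(prefix))

# Load some illustrative per-account balances.
for i in range(1000):
    put(f"acct:{i}", i * 0.5)

# Scatter the query to every partition, then gather and combine the partial results.
with ThreadPoolExecutor(max_workers=NUM_PARTITIONS) as pool:
    futures = [pool.submit(partial_total, p, "acct:") for p in partitions]
    grand_total = sum(f.result() for f in futures)

print(grand_total)
```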
- Engineers Unplugged Series 3 Episode 9 – Overlay Networking
While attending Cisco Live USA this year, Amy Lewis put me in a headlock and refused to let me go until I agreed to appear in a video for the current series of Engineers Unplugged.
- Chesscademy: A fun and free way to learn chess!
Well, that should keep me busy for a few hours...