Wednesday, September 28, 2011

Silk Fire Kindling

Far and away the most insightful take on today's monumental set of announcements from Amazon comes from Chris Espinosa, who says:
Amazon now has what every storefront lusts for: the knowledge of what other stores your customers are shopping in and what prices they’re being offered there. What’s more, Amazon is getting this not by expensive, proactive scraping the Web, like Google has to do; they’re getting it passively by offering a simple caching service, and letting Fire users do the hard work of crawling the Web. In essence the Fire user base is Amazon’s Mechanical Turk, scraping the Web for free and providing Amazon with the most valuable cache of user behavior in existence.

Read through the comments, too. Great discussion.

Nothing stands still. The world evolves. As the Red Queen said to Alice:

"Now, here, you see, it takes all the running you can do, to keep in the same place. If you want to get somewhere else, you must run at least twice as fast as that!"

Can we really already be talking about a post-Android market?

As the gallows humor goes: "once I get one, they're obsolete". Still, it seems startling that Peter-Paul Koch, among others, is already talking openly about what happens "after Android". He's written a terse but readable analysis of how he sees the story playing out, and what might happen next. His rather bold conclusion:
The Android market share is up for grabs. Apple will capture a juicy part with the iPhone Nano, and the rest will remain with the Android vendors, although it’s unclear which OSs they’ll use.

We’ll only start to notice all this in the market in Q2 or so. All vendors have traditional Android devices in the pipeline, and they’ll be released according to schedule for Christmas and Chinese New Year. It’s the next set of devices that will run other OSs.

Tuesday, September 27, 2011

It depends on how you look at it

Our modern life in the always-on Internet has become quite complicated and entangled; sometimes things look very different depending on which perspective you take. Below are three unrelated, and yet quite related, reflections on the implications of that choice of point of view:

  1. People are rhapsodizing over the new If This Then That web service platform, and there's no denying that it's a fascinating service. But before you declare it to be the future of everything for all time, you might want to spend a bit of time thinking about Jon Udell's critique:
    What if I only want to give IFTTT the power to tweet on my behalf, though, and not give up access to my private direct messages? More generally, how can I think about the tradeoffs involved in delegating all versus some versus no powers to IFTTT, across a range of services I might authorize it to use on my behalf?
  2. Some years back, the various browser vendors blurred the distinction between the "address bar" in your browser and the "search box" in your browser. It used to be the case that these were separate, and you had to use each for the correct purpose: if you wanted to go to Apple's web site, you would type its URL into the address bar, and the browser would take the string you entered, interpret it according to RFC 1738, and retrieve the specified resource. Meanwhile, if you entered 'apple' into the search box, the browser would contact your preferred search engine, send it the search request "apple", and display the results, probably something like these.

    Well, at some point this changed, and the browsers became more "user friendly", and it became possible to forget all that complexity, and just enter whatever you wanted into the address bar of your browser; if it was a URL, the browser went there directly; if it wasn't, the browser treated it as a search. In fact, Chrome has no "search box" at all, only an address bar.

    However, this blurring of the lines between "asking the browser to search for things for me" and "telling the browser exactly what to do", while generally easing the user experience, has opened up a grey area in your use of the Internet, one you may not have been aware of, in which intermediate servers on the Internet examine your search traffic and potentially alter your searches in ways that those intermediaries, and not necessarily you, find "better":

    • The New Scientist provides great coverage of this story, noting that
      Users entering the term "apple" into their browser's search bar, for example, would normally get a page of results from their search engine of choice. The ISPs involved in the scheme intercept such requests before they reach a search engine, however. They pass the search to an online marketing company, which directs the user straight to Apple's online retail website.
    • The EFF blog provides some more detail, with links to the original research, and explains that:
      Paxfire provides a product for ISPs that rewrites DNS errors (effectively conveying "the name you asked for doesn't exist") to responses sending users to search pages that host advertisements, for which Paxfire then shares the corresponding ad-related revenue with the ISPs. This practice has already been controversial.
    • There has been some discussion about whether this is, technically speaking, illegal. Julian Sanchez spends some time thinking about the problem, and issues his assessment:
      The mechanics are opaque to the average user, but Paxfire is in effect combing through all these messages to find the ones that maybe, possibly, perchance the user really meant to be an address rather than a search request, because they don’t really understand how their browsers work. And thaaats kinda wiretappy.
  3. Over at The eXileD, Yasha Levine spends some time thinking about an Internet service called CloudFlare, which is supposed to protect your web site from the dangerous parts of the Internet. Says CloudFlare:
    We automatically optimize the delivery of your web pages so your visitors get the fastest page load times and best performance. We also block threats and limit abusive bots and crawlers from wasting your bandwidth and server resources. The result: CloudFlare-powered websites see a significant improvement in performance and a decrease in spam and other attacks.

    Well, OK, notes Levine, but that also means that:

    People who sign up for the service are allowing CloudFlare to monitor, observe and scrutinize all of their site’s traffic, which makes it much easier for intel or law enforcement agencies to collect info on websites and without having to hack or request the logs from each hosting company separately. But there’s more. Because CloudFlare doesn’t just passively monitor internet traffic but works like a dynamic firewall to selectively block traffic from sources it deems to be “hostile,” website operators are giving it a whole lotta power over who gets to see their content. The whole point of CloudFlare is to restrict access to websites from specific locations/IP addresses on the fly, without notifying or bothering the website owner with the details. It’s all boils down to a question of trust.
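Returning to the second reflection above: the address-bar ambiguity is essentially a tiny classifier inside the browser, deciding whether your input is a URL to navigate to or a query to hand to a search engine. Here is a minimal sketch in Python; the heuristics and the search URL are simplified assumptions for illustration, not any real browser's logic:

```python
from urllib.parse import quote, urlparse

def resolve_omnibox(text):
    """Hypothetical, simplified version of a browser's omnibox decision."""
    text = text.strip()
    # Explicit scheme: treat as a URL (RFC 1738 / RFC 3986 syntax).
    if urlparse(text).scheme in ("http", "https", "ftp"):
        return ("navigate", text)
    # No spaces plus a dot suggests a bare hostname, e.g. "www.apple.com".
    if " " not in text and "." in text:
        return ("navigate", "http://" + text)
    # Otherwise, hand the text to the user's search engine.
    return ("search", "https://search.example.com/?q=" + quote(text))

print(resolve_omnibox("www.apple.com"))  # treated as an address
print(resolve_omnibox("apple"))          # treated as a search
```

Everything that falls through to the search branch is precisely the traffic that intermediaries like Paxfire were positioned to intercept before it ever reached the search engine.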

When you're connected to everything, all the time, and all of your services are keeping track of you, and sharing and discussing and analyzing your activities in real time, you're interacting with an overall entity that is considerably more aware of who you are and what you do than you might realize, no matter how security-conscious or privacy-aware you might think you are.

It's not obvious what the answers are to any of these concerns, but it's great to see various people taking the time to raise the questions and study them and point out the various pros and cons.

Monday, September 26, 2011

Ingres is now Actian

Over the weekend I read that Ingres Corporation has renamed itself, and is now Actian, Inc. I confess that I haven't paid a lot of attention to Ingres Corporation in a few years, so I spent a bit of time reading some of the coverage.

From what I can tell, this is mostly about company names and branding/positioning. The Ingres team would like to be considered a player in the modern DBMS markets, such as "cloud computing" and "big data". They felt that the old "Ingres" name was holding them back, and decided to rename the company.

This is the sort of thing that marketing departments at software companies do.

In fact, even when I was at the company, this was going on:

  • When I joined in 1988, I joined Relational Technology, Inc. (RTI)
  • Later, RTI renamed itself to Ingres Corporation to emphasize the tight link between the company and the product
  • Later, following a merger, we became known as ASK, Inc., which was an acronym standing for Ari and Sandra Kurtzig, the couple that founded the company we merged with.
  • Then in 1994, after I left, the company became an un-named subdivision of Computer Associates

Given my own experience, I would say that renaming your company is not an effective solution to problems of market positioning. But I'm not a marketer, and not really qualified to comment. If it's part of an overall plan to revise product features, target modern usage patterns, and adopt a company vision that they believe in, then hopefully it will work for them.

I wonder what the Ingres code looks like nowadays? When it was open-sourced, about 4-5 years ago, I downloaded a copy and looked through it. The overall structure was similar to what I remembered, although the details had changed, as well they might since it was (by then) more than a decade since I had worked on the code.

It's interesting that the company and product continue to live on; best of luck, Ingres / Actian!

Saturday, September 24, 2011

Victoria trip

Recently, my wife and I snuck away for our 25th anniversary celebration (only about 15 months late -- sorry!): a long weekend trip to Victoria, Canada.

We flew up to Victoria on a Thursday, and returned on a Monday. United Airlines offers a convenient non-stop daily flight between Victoria and San Francisco, so we had most of Thursday, some of Monday, and all of Friday/Saturday/Sunday to spend in Canada. Just us! No kids! No dog! No email! No phone!

We chose to stay at the Laurel Point Inn, which is a nice, if somewhat aging, hotel with a spectacular location on Victoria's Inner Harbor.

From VictoriaHoliday2011
Behind me you can see downtown Victoria, with the Fairmont Empress just to the right of center. The Royal British Columbia Museum is at the far right of the picture, with the Parliament building just off camera to the right. Immediately behind me are the Victoria Clipper ferry docks.

Victoria is a beautiful city, if a little tourist-y. Cruise ships stop here regularly (3 visited during our stay), and there are many tourist-related activities, such as taking a ride in a horse-drawn carriage through downtown.

From VictoriaHoliday2011
From VictoriaHoliday2011

But the entire city is beautiful, not just the downtown. Public schools decorate their chain-link fences with beautiful wooden painted fish sculptures, and paint murals on the schoolroom walls.

From VictoriaHoliday2011
The city features sculptures, totem poles, historical plaques, and many other cultural objects throughout, and there are many smaller touches: where other cities nail ugly steel blocks to their stone walls to discourage skateboarders from sliding along them, Victoria chooses to mount small metallic sculptures of starfish and salmon instead. In many ways, it's similar to other Pacific Northwest cities in this respect; I've noticed this in Portland, Oregon and Seattle, Washington as well, but it was particularly noticeable in Victoria.
From VictoriaHoliday2011

Friday was forecast to have nice weather, so we set out for the Butchart Gardens near Sidney. Before we left Victoria, though, we drove up above Beacon Hill Park to see Craigdarroch Castle. It was only 9:15 AM, and the castle didn't open until 10:00 (in fact, most establishments in Victoria don't open until 10:00 AM, we found), so we decided not to linger and just took some pictures of the outside of this impressive mansion.

From VictoriaHoliday2011

The Butchart Gardens are a wonderful destination on a sunny summer day. Mr. Butchart ran a cement factory, for which immense amounts of limestone were quarried out of the ground, leaving behind a deep cavity. Mrs. Butchart decided to turn the unused quarry areas into a sunken garden, and contracted with various landscape designers over the years to extend and improve the grounds. The result is superb! It's a three-dimensional experience, as you climb up and down, over and around, in and out of the various garden plots.

From VictoriaHoliday2011
At the very bottom of the quarry, a tremendous fountain sends dancing waters 75 feet into the air.
From VictoriaHoliday2011

It was Dahlia season, so of course we took many pictures (and bought some seeds to bring home!)

From VictoriaHoliday2011
From VictoriaHoliday2011

We had a nice late lunch in Sidney, a small town on the coast of the upper Saanich Peninsula. After lunch, we visited a very nice local winery named Muse, where we found the Chardonnay quite pleasant.

Saturday the weather looked threatening, but we were not to be deterred, and we set out up-island. Our first stop was Goldstream Provincial Park, which runs along a river valley where salmon come to spawn in the fall and winter. Various birds (gulls, crows, eagles) and land animals (raccoons, bears, lynx), attracted by the salmon, are also drawn to the park at that time, and humans come to watch the spectacle.

Unfortunately, we were about 6 weeks too early for the salmon run, so we just took a peaceful walk along the river down to the visitor center. We found the visitor center all dressed up, for it was just time for the annual Art Show. We admired the work of several dozen local artists, some of which was quite fine indeed, then set back on the road.

Heading north from Goldstream Park the Trans Canada Highway climbs some fairly dramatic mountain cliffs, and the views over the Strait of Georgia are superb. After some ups and downs, we exited the superhighway and drove down to the coast at Cowichan Bay. This area had been recommended to us by the winemaker at Muse Winery, who told us to look for Hilary's Cheese Store on the waterfront. We found a parking spot and climbed out to see Cowichan Bay, which is a delightful small town quite similar to Point Reyes Station or Carmel-by-the-Sea, full of art galleries, boutiques, cafes, and, right smack in the middle, Hilary's Cheese Store. We bought two small cheeses there, and went next door to the all-natural fully-organic bakery to get a French baguette (and a slice of fresh apple strudel, since it was there and warm and ready :) ).

Although we could have stayed in Cowichan Bay all day, time was a-wasting, so it was back on the road and up, through Duncan (City of Totems) and on to Chemainus (World Famous Murals). I was kind-of hoping we would find a place to eat in Chemainus, but it turned out to be more of a mill-town-with-some-beautiful-murals-on-the-buildings than a place we really wanted to stop for lunch, so we took some pictures and continued on up-island.

From VictoriaHoliday2011

After Chemainus comes Ladysmith, voted "most beautiful town in Canada", and famous for being exactly on the 49th parallel. Now, at last, we were truly "in Canada" (north of the Washington border, that is). But Ladysmith didn't move us much, either, so it was on, on, on to Nanaimo.

Nanaimo is the second-largest city on Vancouver Island and has a rich history. The Hudson's Bay Company established a trading outpost there around 1850, and the thriving fur trade quickly developed the region. Not long after that, immense seams of coal were discovered underground, and Nanaimo became one of the largest coal-mining regions on the West Coast; it was from these coal mines that Mr. Dunsmuir got the money to build Craigdarroch Castle. Amazingly, the coal seams were actually underneath the harbor; the miners dug shafts from the main island and from several surrounding islands, then dug laterally out under the ocean waters to retrieve the rich coal.

By this time we were quite hungry, so we found a nice restaurant at the marina and had a delightful lunch. You can see the restaurant if you follow the link above, or here's the view from our table:

From VictoriaHoliday2011

Of course, no lunch in Nanaimo is complete without a Nanaimo Bar, so we (happily) shared one:

From VictoriaHoliday2011

After lunch, we wandered around Nanaimo Harbor a bit. The Women's Masters Dragon Boat Rowing Teams were having their annual championship, and we walked through the park where the teams were celebrating their success, then headed back towards our car. Just as we were about to leave, we discovered that, without realizing it, we had parked right underneath the Nanaimo Regional Museum. The museum had been closed for the day for a special event, but our timing was perfect and the museum attendant let us in for free. It's a delightful museum; my particular favorite part was the coal mine exhibit, while Donna was delighted by the special exhibit on the fashion of the early 1900s. We were very glad we found the Nanaimo Museum; even without it, Nanaimo was a treat, but the museum was much better than we thought it might be.

Sunday was forecast to be the coldest and rainiest day of the trip, so we had saved up various indoor activities. But the day dawned clear and the weather was steady, so we wavered. Still, a plan is a plan, so off we went to the Royal British Columbia Museum. I'm rather a nut for museums, and this one is world-famous, so I was eagerly looking forward to it. It's important to get to the museum early, since by the early afternoon the crowds are overwhelming and there are lines everywhere. So we got an early start.

This summer, the museum is featuring a special exhibit on Emily Carr, and, given the layout of the museum, this is the first place you visit after you take the escalator upstairs. Emily Carr was born on Government Street and raised in Victoria, and spent most of her life on Vancouver Island. After the tragic death of her brother she embarked on a missionary trip to live among the First Peoples farther up the island, and spent years there learning the language(s), the art and culture and traditions, and returned with a deep appreciation for their approach to living on the island. Rather late in her life she became a world-renowned painter and author, but was always known as somewhat of an eccentric, and hard to reach.

Outdoors, in front of the museum and directly on the Inner Harbor, is this fascinating statue of Carr with her pet monkey on her shoulder and her pet dog beside her.

From VictoriaHoliday2011

The Carr exhibit is very nicely done. It takes an unusual approach, thematically organized around a "dialogue" between the museum's curator and a present-day artist, Manon Elder, as they explore Carr's life and work. The exhibit includes photographs of Carr's life, essays and letters by Carr, other memorabilia of Carr, and lots and lots of paintings. Some of the art is by Carr herself; some is by Manon Elder, inspired by Carr's work and by events in her life. My favorite work of art was the magnificent Tanoo painting from 1913, which is impossible to do justice to on the small screen, and must be seen in person to appreciate the glowing light and spiritual feeling of the work. In addition to showing Tanoo itself, this section of the exhibit has a delightful approach to appreciating the painting:

  • There is a short note from Carr, remembering how she came to paint it.
  • There is a photograph of Carr at her easel, working in front of the totems in the village
  • Inspired by the photograph, there is a companion piece by Elder, showing Carr at work, creating her masterpiece.

Upstairs from the Carr exhibit, the Royal BC Museum has a spectacular exhibit gallery called the First Peoples Gallery. The gallery is multi-level; you can move up and down and appreciate the art and exhibits from various arrangements and perspectives. The section on the smallpox epidemic of 1862 is extremely dramatic and well-presented, as is the Totem Room with its display of a variety of totem poles.

Another little tidbit: the museum has a nice display on the maritime explorers who ventured along the Pacific Northwest coast, including the dagger which is claimed to be the one used to stab Captain Cook to death in Hawaii. Since Donna and I, years ago, visited the beach where Cook died, this was an interesting addendum to our perspective on that most unusual man, whose statue is prominently displayed on the Victoria waterfront.

From VictoriaHoliday2011

Since the museum has an IMax theater, our original plan was to spend the rainy day indoors, watching some IMax movies after we toured the museum. But the weather was steadily improving outside, and so we decided that, rather than watch movies about the great outdoors, we'd go out into the great outdoors ourselves! So we exited the museum, took a quick walk through Thunderbird Park next door, and headed back to the hotel to get the car.

From VictoriaHoliday2011

This time, we decided to strike out south and west, toward Sooke and Jordan River and Port Renfrew. Sooke has a certain amount of fame because it is the southernmost harbor in Canada. There is also a very fancy restaurant there (the Sooke Harbor House), and a nice small park down by the harbor at a location called Whiffen Spit (which sounds rather like something that seems to always be going on at the baseball game whenever I tune in).

Although we made good time, we had got a bit of a late start, so after we passed through Sooke we started looking for a destination. About midway between Sooke and Jordan River is the spectacular French Beach Provincial Park. Had we known ahead of time what an incredibly beautiful place this was, we would have cut our trip to the museum in half and spent an entire day here! As it was, I was amazed that there were only a few dozen others at the park, which features dozens of picnic spots with clean tables in good repair, grassy areas for play and relaxing, trails to walk along by the water, stunning views of the Juan de Fuca Strait, and one of the most dramatic waterfronts I've seen in many years.

From VictoriaHoliday2011

French Beach is not really a beach at all; it is a waterfront composed of extremely weather-beaten rocks. The rocks are rounded and smoothed from an eternity of processing by the tides and waves, but it is quite different from any other beach I've been to. One of the most unusual and eerie things is the sound of it all: as a wave crashes against the shore, it carries some number of the rocks slightly up the land; then, when the water recedes, the rocks slip back down the slope, clattering as they do so. It is a sound I had never heard before!

From VictoriaHoliday2011

Unfortunately, we hadn't really come prepared to spend much time at the park, and the day was getting late, so after an hour or so wandering along the beach, we climbed back in the car and headed back to Victoria. After a yummy dinner at the Tapa Bar on Trounce Alley, we spent the evening walking around downtown; when the lights are turned on it is very beautiful.

From VictoriaHoliday2011

Monday morning we found ourselves with an hour or so to spare before it was time to head to the airport, so we drove across Victoria out to the old neighborhood of Oak Bay. This is a very desirable neighborhood, and there are many nice houses and apartment buildings facing the ocean. Just as we were getting ready to leave Oak Bay, Donna spotted, out of the corner of her eye, some movement in the water.

From VictoriaHoliday2011

We walked down closer to the water and found to our amazement that a group of four otters were feeding on the fish in Oak Bay! The otters would dive into the water, grab a fish, then trot up onto the rocks with their catch to eat. Then they'd jump back into the water, swim about, find another fish, and repeat the process. They were so active that the pictures turned out a little blurry, but they were a treat to watch! We stayed for about 20 minutes watching them frolic in the water, then reluctantly climbed back into the car and headed home.

From VictoriaHoliday2011

Even after all this, we barely scratched the surface of things to do and see in Victoria. If you get the chance, don't miss the opportunity to visit this wonderful area.

Thursday, September 22, 2011

I feel the NEED for SPEED

As Maverick said to Goose, sometimes you feel the need for speed. Well, apparently the train from Switzerland to Italy arrived 60 nanoseconds early today!

Randall Munroe is taking bets.

Counter Tunnel Robotics

Here's a great essay by Geoff Manaugh on his BldgBlog site about robots that are capable of exploring, mapping, comprehending, and navigating underground tunnel environments. As Manaugh observes:
the implication here is that these autonomous spelunking units are perhaps seen as a new type of ordnance—that is, they are intelligent bombs that don't explode so much as explore. They are artillery and surveillance rolled into one. Imagine a bomb that doesn't destroy a building: instead, it drops into that building and proceeds to map every room and hallway.

Although Manaugh focuses mostly on the military aspects of this technology, imagine how wonderful it would be for peaceful use. One of the most dangerous and least appealing jobs on the planet is that of the human miner, working underground in dangerous and unpleasant conditions.

Could we soon be able to make robots that are capable of doing this work for us, and free the human race from the terrors of cave-ins, explosions, mine floods, and all the other terrible disasters that all-too-frequently plague the mining industry?

Ethical Artificial Intelligence

This summer, Professor Nick Bostrom of Oxford, together with Eliezer Yudkowsky of the Singularity Institute, published a fascinating paper on the area where philosophy meets ethics meets computer science: The Ethics of Artificial Intelligence.

The paper explores a number of fascinating issues, including:

  • Ethical issues that might arise in AI software
  • Building AI that operates safely
  • Whether a program can have moral status
  • Differences in the ethical assessment of software
  • Whether software can become more ethical than humans

The paper opens with an intriguing hypothetical situation:

Imagine, in the near future, a bank using a machine learning algorithm to recommend mortgage applications for approval.  A rejected applicant brings a lawsuit against the bank, alleging that the algorithm is discriminating racially against mortgage applicants.  The bank replies that this is impossible, since the algorithm is deliberately blinded to the race of the applicants.  Indeed, that was part of the bank’s rationale for implementing the system.  Even so, statistics show that the bank’s approval rate for black applicants has been steadily dropping.  Submitting ten apparently equally qualified genuine applicants (as determined by a separate panel of human judges) shows that the algorithm accepts white applicants and rejects black applicants.  What could possibly be happening?

Finding an answer may not be easy.  If the machine learning algorithm is based on a complicated neural network, or a genetic algorithm produced by directed evolution, then it may prove nearly impossible to understand why, or even how, the algorithm is judging applicants based on their race.
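The proxy-variable effect behind this hypothetical is easy to reproduce in miniature: a feature the model does see (here an invented zip code) happens to be correlated with the attribute it is blinded to. A toy sketch, with all data and the decision rule invented purely for illustration:

```python
# Toy illustration: a model blinded to race can still discriminate if a
# visible feature (zip code) correlates with race. All data is invented.
history = [
    # (zip_code, repaid_loan) -- hypothetical past lending records
    ("94100", True), ("94100", True), ("94100", False),
    ("94200", False), ("94200", False), ("94200", True),
]

def train(records):
    # "Model": approve a zip code if most past loans there were repaid.
    stats = {}
    for zip_code, repaid in records:
        ok, total = stats.get(zip_code, (0, 0))
        stats[zip_code] = (ok + int(repaid), total + 1)
    return {z: ok / total >= 0.5 for z, (ok, total) in stats.items()}

model = train(history)

# Race was never an input, but if applicants of one race live mostly in
# one zip code, the decisions still split along racial lines.
for name, zip_code in [("Alice", "94100"), ("Bob", "94200")]:
    print(name, "approved" if model[zip_code] else "rejected")
```

The model here is deliberately transparent; with a neural network or evolved program in its place, the same correlation would be buried in thousands of opaque weights, which is exactly the diagnostic difficulty the paper describes.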

The closest I've come to diagnosing behaviors of this sort is in working with the query optimizers of modern relational database systems; these systems tend to use complex strategies to choose query execution plans for user-provided queries, and when you don't get the query plan that you want, it can be quite challenging to figure out why. Modern DBMS implementations provide extremely powerful tools for studying such problems, but it is still quite hard.
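As one small concrete example of those tools, most DBMSs offer some form of EXPLAIN that asks the optimizer to reveal its chosen plan. Here is SQLite's variant, driven from Python's built-in sqlite3 module; the table and index are invented for the example:

```python
import sqlite3

# In-memory database with a hypothetical table and index.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, owner TEXT)")
db.execute("CREATE INDEX accounts_owner ON accounts (owner)")

# Ask the optimizer how it would execute the query, without running it.
plan = db.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM accounts WHERE owner = ?", ("alice",)
).fetchall()
for row in plan:
    print(row)
# The plan rows show whether the optimizer chose the accounts_owner
# index or fell back to a full table scan.
```

When the plan isn't what you expect (a scan where you wanted an index search, say), this output is the starting point for the kind of detective work described above.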

Later in the paper, the authors consider the question of whether it is possible for humans to write a computer program which is more ethical than any human. This is kind of a strange question, as it raises the whole topic of anthropomorphism, but the discussion is quite interesting regardless. They consider a (perhaps) similar question, which is whether it is possible for humans to write a computer program which is better at playing chess than any human. This has clearly been done. If you accept that the two questions are similar, then you can follow their reasoning, which leads them to suggest that the same thing should be possible in the realm of ethics:

  • Firstly, they note that the approach to this should not be merely to describe to the computer what we humans consider to be the best ethics, since that approach did not work for chess:
    if the programmers had manually input what they considered a good move in each possible situation, the resulting system would not have been able to make stronger chess moves than its creators.  Since the programmers themselves were not world champions, such a system would not have been able to defeat Garry Kasparov.
  • Secondly, they note that we will need to improve our own knowledge about ethics in order to be able to describe to the computer what we mean:
     Perhaps the question we should be considering, rather, is how an AI programmed by Archimedes, with no more moral expertise than Archimedes, could recognize (at least some of) our own civilization’s ethics as moral progress as opposed to mere moral instability.  This would require that we begin to comprehend the structure of ethical questions in the way that we have already comprehended the structure of chess.

I enjoyed reading this paper; it is approachable and intriguing. If the subject interests you, the authors also provide many other references to follow to learn more. Although this is far-off-in-the-distance stuff, it is still entertaining and rewarding to read.

Wednesday, September 21, 2011


The latest NVIDIA graphics chips are incredibly sophisticated. Check out this description of how they work:
NVIDIA’s Project Kal-El processor implements a novel new Variable Symmetric Multiprocessing (vSMP) technology. Not previously disclosed publicly, vSMP includes a fifth CPU core (the “Companion” core) built using a special low power silicon process that executes tasks at low frequency for active standby mode, music playback, and even video playback. The four main “quad” cores are built using a standard silicon process to reach higher frequencies, while consuming lower power than dual core solutions for many tasks. All five CPU cores are identical ARM Cortex A9 CPUs, and are individually enabled and disabled (via aggressive power gating) based on the work load. The “Companion” core is OS transparent, unlike current Asynchronous SMP architectures, meaning the OS and applications are not aware of this core, but automatically take advantage of it.

"Aggressive power gating" ... I love it!

There's more information available from the links in the NVIDIA blog.


Wow, is this for real? It seems that way.

The first album that my girlfriend and I bought for our new (shared) stereo in 1985 was an R.E.M. album; three months later we were married. We don't have the record anymore, but we replaced it with the CD when we upgraded our stereo.

And one of our favorite shows of all time was an R.E.M. show at the Greek Theater in Berkeley. We haven't had time to go to the Greek in years, but that one will stay in our memories.

Thanks for all the great music, Mike, Peter, and Michael!

Wednesday, September 14, 2011

The challenge of online anonymity

The Bitcoin system has been the center of much interest of late, for a variety of reasons. One interesting aspect of the Bitcoin system is that it attempts to provide secure, private, anonymous online transactions. From the Bitcoin paper:
The traditional banking model achieves a level of privacy by limiting access to information to the parties involved and the trusted third party. The necessity to announce all transactions publicly precludes this method, but privacy can still be maintained by breaking the flow of information in another place: by keeping public keys anonymous. The public can see that someone is sending an amount to someone else, but without information linking the transaction to anyone. This is similar to the level of information released by stock exchanges, where the time and size of individual trades, the "tape", is made public, but without telling who the parties were.

As an additional firewall, a new key pair should be used for each transaction to keep them from being linked to a common owner. Some linking is still unavoidable with multi-input transactions, which necessarily reveal that their inputs were owned by the same owner. The risk is that if the owner of a key is revealed, linking could reveal other transactions that belonged to the same owner.

This idea of online anonymity has been around for a while; one primary source is Wei Dai's famous "b-money" proposal, which begins as follows:

I am fascinated by Tim May's crypto-anarchy. Unlike the communities traditionally associated with the word "anarchy", in a crypto-anarchy the government is not temporarily destroyed but permanently forbidden and permanently unnecessary. It's a community where the threat of violence is impotent because violence is impossible, and violence is impossible because its participants cannot be linked to their true names or physical locations.

You can learn more about Tim May's crypto-anarchy here. May traces these ideas back to the mid-1980's, and describes them as follows:

Computer technology is on the verge of providing the ability for individuals and groups to communicate and interact with each other in a totally anonymous manner. Two persons may exchange messages, conduct business, and negotiate electronic contracts without ever knowing the True Name, or legal identity, of the other. Interactions over networks will be untraceable, via extensive re-routing of encrypted packets and tamper-proof boxes which implement cryptographic protocols with nearly perfect assurance against any tampering. Reputations will be of central importance, far more important in dealings than even the credit ratings of today. These developments will alter completely the nature of government regulation, the ability to tax and control economic interactions, the ability to keep information secret, and will even alter the nature of trust and reputation.

Now that the Bitcoin implementation is online and operational, the question naturally arises: is it successful? Does it provide privacy, with assurance against any tampering, because its participants cannot be linked to their true names?

Two computer scientists at University College Dublin have been studying this question, and this summer they published An Analysis of Anonymity in the Bitcoin System. The paper discusses the general notions of online anonymity, and then looks specifically into the question of whether Bitcoin is achieving its goals. The core of their analysis explores several particular attacks on Bitcoin anonymity:

  1. Integrating Off-Network Information, that is, correlating Bitcoin transactions with other information that may be available from other sources outside of Bitcoin, such as from those organizations and services that accept Bitcoins as payment.
  2. Egocentric Analysis and Visualization of the User Network
  3. Context Discovery
  4. Flow and Temporal Analyses, which involves developing circumstantial evidence about possible correlations and identities of Bitcoin actors
and points to a number of other possible attacks.
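The multi-input linking attack, in particular, is simple to sketch: treat every set of addresses that appear together as inputs to one transaction as belonging to a single owner, and merge those sets with a union-find. This is an illustration of the heuristic as the paper describes it, not the authors' actual code, and the transaction data is made up:

```python
# Sketch of the multi-input linking heuristic: all input addresses of a
# single transaction are assumed to share an owner, so we union them.

def link_owners(transactions):
    """Group addresses that co-appear as inputs of some transaction.

    transactions: a list of input-address lists, one per transaction.
    Returns a list of sets, each set being one inferred owner.
    """
    parent = {}

    def find(a):
        parent.setdefault(a, a)
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path compression
            a = parent[a]
        return a

    def union(a, b):
        parent[find(a)] = find(b)

    for inputs in transactions:
        for addr in inputs[1:]:
            union(inputs[0], addr)

    groups = {}
    for a in parent:
        groups.setdefault(find(a), set()).add(a)
    return list(groups.values())
```

Note how transitive: if address A appears with B in one transaction, and B with C in another, all three are linked, even though A and C never appear together. That is why reusing keys is so corrosive to anonymity.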

The authors argue that the techniques they've developed so far have been significantly effective:

Using an appropriate network representation, it is possible to map many users to public-keys. This is performed using a passive analysis only. Active analyses, where an interested party can potentially deploy marked Bitcoins and collaborate with other users, can discover even more information.

Lastly, as the authors point out, security attacks on a system which intentionally preserves all known transaction data publicly, forever, will only get better over time, so the approaches they've developed so far may well be improved upon.

The goals that the Bitcoin developers have set for themselves are challenging. I believe they welcome this sort of analysis and study; it's the sign of a serious project that they take their work seriously. From my own perspective, I enjoyed following the techniques that the authors used to explore the possible weaknesses in Bitcoin. I'll look forward to following the discussion as it develops.

Tuesday, September 13, 2011

Type safety defense in depth

One of the more interesting quotes from Preliminary Design of the SAFE Platform is the following, regarding the new programming language Breeze that they are working on:
The Breeze language used in SAFE is a mostly functional language, similar in spirit to ML, except that it enforces an information flow model to support both confidentiality and integrity. Breeze will be both statically and dynamically typed in keeping with the standard security principle of defense-in-depth. Currently, the type system, which is based on the Myers-Liskov decentralized label model, is only dynamically-typed.
I'm having a little bit of trouble understanding what they mean by a language being both statically and dynamically typed. Has a programming language attempted this in the past? What does this look like, concretely, in the syntax and behavior of the language?

Generally, I've understood static typing to mean that the type of each variable is declared explicitly to the compiler, which then verifies at compile time that only fields and methods known to be valid for that type are accessed, while dynamic typing means that the compiler does not restrict access through the variable, and may not even know the variable's intended type until runtime.

So what would it mean for a language to be both statically and dynamically typed?
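One possible reading, which may or may not be what the SAFE team intends, is gradual typing: a static checker verifies the annotated parts of a program ahead of time, while runtime checks backstop everything the checker cannot prove — defense in depth. The dynamic half of that arrangement can be sketched in Python, where annotations are statically checkable by external tools but ignored at runtime unless you enforce them yourself. The decorator and function names here are illustrative:

```python
# Sketch of the "dynamic layer" of a gradually typed system: a decorator
# that re-checks a function's type annotations at call time, catching
# anything that slipped past (or was never seen by) the static checker.

import inspect

def runtime_checked(func):
    """Verify annotated argument types at call time."""
    sig = inspect.signature(func)

    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        for name, value in bound.arguments.items():
            expected = func.__annotations__.get(name)
            if expected is not None and not isinstance(value, expected):
                raise TypeError(f"{name} must be {expected.__name__}")
        return func(*args, **kwargs)
    return wrapper

@runtime_checked
def credit(account: str, amount: int) -> int:
    return amount
```

Here `credit("savings", 100)` passes both layers, while `credit("savings", "100")` is rejected at runtime even if a static checker never analyzed the call site. Presumably Breeze intends something analogous for its information-flow labels.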

Friday, September 9, 2011

I love a mystery...

... but I am puzzled and confused about what happened down in Southern California yesterday.

As near as I can make out, this is what we think occurred:

  • A utility worker near Yuma, AZ was replacing a capacitor, and there was a malfunction of some sort
  • In response to resulting conditions on the line, a major transmission link between the Colorado River power systems along the AZ/CA border and the San Diego area detected irregularities, and was shut down
  • Some sort of secondary problem developed, which caused the gigantic San Onofre nuclear plant to experience a power cut (Yes, the power plant lost power! Huh!)
  • Available power existed in Arizona, but it could not be delivered to California because the transmission link was down.
  • San Onofre operators decided that the safe thing to do was to initiate a system shutdown. This deprived, oh, 5 million people of electricity.
  • Some sort of higher-level desperation cut-off stopped the cascading shutdown before it hit L.A. Otherwise 20 million more people would have been affected.
  • The lines were checked, the system was reset, San Onofre went through its (multi-hour?) re-boot
  • 18 hours later, "the system" was back online.

I spend a lot of my time studying and thinking about failures. From what I can tell, there is a lot to study and learn about here.

Any good pointers to send me? Drop me a line, and let me know!

An accidental witness to history

Wow. For the most part, I've been avoiding the 10-years-after coverage, but I had never heard this story before.

Thursday, September 8, 2011

Bushwhack grading

I like this light-hearted Bushwhack Rating System. Thank goodness I've never had to move above BW2 (and only rarely above BW1).

I do sort of think that the gradings for BW4 and BW5 got reversed somehow...

Wednesday, September 7, 2011

Digi Notice all those updates?

You may or may not have noticed, but over the last 96 hours your computer has probably been going completely nuts with automatic updating. Windows Update, Microsoft Update, Firefox Update, Mac OS X Software Update, Chrome Update, Thunderbird Update; pretty much every piece of software you have that connects to the Internet as part of its daily business has been frantically updating.

If you didn't notice, that's fine; it means that the automatic updates are working as intended, and that's good. Maybe you just clicked "OK" a few times, or tolerated an extra reboot or a sluggish startup in the morning when you got to the office. Cool if that's so, because unobtrusive background updating is one of the Great Good Joys of these times, and I'm pleased when it turns out as it should.

But if you did notice all that updating, and wondered what it was all about, here is (I think) the scoop: it all had to do with a tiny little company in Holland that nobody had ever heard of, named DigiNotar.

DigiNotar was (yes, I am using the past tense intentionally) one of the global Certificate Authorities: a commercial organization charged with the right, and responsibility, of issuing signed SSL certificates. These certificates are the foundation of server identity verification on the Internet; they are what makes that little "lock" icon appear in your browser when you access an "https" URL; they are what provides the software on your computer with the vital assurance that, when it ventures out into the cruel and heartless jungle of the public Internet, it is talking to the partners you think it is.

Concretely, when you sit down at your computer and enter GMail's address, a web page appears in your browser. But how do you know that you are actually talking to the real GMail server? That is what SSL certificates do: your browser does a bunch of cryptography and ascertains that the server it got connected to presented a bit of data that can be independently verified as something only an authentic GMail server could provide.
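That "bunch of cryptography" is visible in ordinary code, too. In Python's ssl module, for example, the default context verifies the server's certificate chain against the trusted CA store and checks that the certificate matches the hostname you asked for. The function below is an illustrative sketch of that handshake, not part of any particular browser:

```python
# Sketch of certificate-verified connection setup. create_default_context()
# loads the system's trusted CA certificates, requires the server to present
# a certificate signed by one of them (CERT_REQUIRED), and checks that the
# certificate's name matches the hostname we intended to reach.

import socket
import ssl

def fetch_peer_cert(hostname, port=443):
    """Connect with full CA and hostname verification; return the peer cert."""
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port)) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            return tls.getpeercert()
```

The whole DigiNotar mess lives in that one line loading the CA store: if a trusted CA has been compromised and has signed a forged certificate, every check above passes, and your software believes it is talking to the genuine server.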

Unless, that is, some tiny Certificate Authority in someplace that you've never heard of, gets hacked. By a powerful government agency. And is compromised for many months, possibly even years, allowing the Bad Guys to issue more than five hundred forged certificates for, essentially, every important web site on the planet.

At this point, my rambling isn't making things much clearer, though, so you need to get the facts. Read this, and then read this, and then read this.

I'd like to tell you there's a simple answer, but there isn't. SSL and its Certificate Authority trust chain are widely felt to be flawed beyond repair, but it isn't clear what could replace them. A number of people are working hard on a new basic security mechanism called DNSSEC, but it has both technical and political obstacles to overcome.

It may be a rocky road over the next few years. Hang on tight.

UPDATE: The technical team over at the Electronic Frontier Foundation has a new detailed discussion of the attack.

Faint Praise

StackOverflow's algorithms have decided to award me the Tenacious badge:
Zero score accepted answers: more than 5 and 20% of total.
I'm not exactly sure how to read that, but I think it means: "you keep answering questions on StackOverflow, even though nobody up-votes or accepts your answers."

I think that part of what's going on is that I've been participating in the Derby sub-community on StackOverflow, which is a particularly small and quiet community.

I've found that even my smallest contributions to the Perforce sub-community on StackOverflow have drawn much more voting and commenting activity, as that group of StackOverflow users seems much more active.


Tuesday, September 6, 2011

Perforce 2011.1 has entered beta testing!

The latest Perforce release, 2011.1, is now online and available for beta testing: here are more details. The two main features of this release are the new "Streams" functionality and the re-written integration engine. I didn't directly work on either of those features, but was pleased to be a part of some of the other, smaller, aspects of the new release. Give it a try! Let us know what you think!

Presence in Endeavor

The board game Endeavor is one of my wife's favorites, and a favorite of mine as well. It is a very well-balanced game with a moderate playtime and a nice initial random-ness in the setup that leads each game to be pleasantly different.

Recently, we've been exploring the 2-player variant rules published by Jarratt Gray (one of the game's designers) himself. Although Gray issues several disclaimers for these rules, we've found them to be simple and very playable, and they've renewed our interest in the game.

One thing we've discovered as we play the game some more, however, is that in our initial games we had completely overlooked the crucial concept of presence in Endeavor. We simply weren't using this concept, and so we were playing the game entirely wrong (this is not uncommon with me; in my haste to start playing a game I often read the rules very fast, and miss some fundamental concept until I re-read the rules some time later).

When the game initially starts, the only open region is Europe, and all players automatically have presence (of '0') in that region. Subsequently, as other regions open, a player only has presence in that region if they have placed a player marker there. Presence is a numerically-measured concept: if you have 3 player markers in a region, you have a presence of 3.

Presence affects the game in a number of ways:

  • You cannot occupy a city, nor attack a city, unless you already have presence in that region
  • You cannot draw a card from an open region unless you have an equal or greater presence in that region (for example, to draw a value 2 card from a region, you must first establish 2 or more player markers in that region).

Except for the initial region of Europe, where no shipping occurs, presence must be initially established by shipping to a region, since that is the only action you are allowed to take in a region where you have no presence. In Europe, your presence is established solely by the number of cities you occupy, but in the other regions (once open), your presence counts the number of shipments you have made and the number of cities you have occupied.
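Our reading of these rules boils down to a couple of lines. This is just a sketch of our interpretation, not official rules text, and the function names are ours:

```python
# Sketch of the Endeavor presence rules as we understand them, for a
# non-European region: presence counts shipments made plus cities occupied,
# and drawing a card requires presence at least equal to the card's value.

def presence(shipments, cities):
    """Presence in an open, non-European region."""
    return shipments + cities

def can_draw(card_value, region_presence):
    """You may draw a card only with equal or greater presence."""
    return region_presence >= card_value
```

So a player with 1 shipment and 1 occupied city in a region (presence 2) may draw a value 2 card there, but not a value 5 card; in Europe, only the cities term applies.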

The concept of presence explains a number of aspects of Endeavor that are somewhat puzzling without it:

  • The Docks building, which allows you to ship and occupy in the same turn, allows a player who is "late to the party" to establish presence in an already-open region by shipping to it (even though the shipping lane is already complete, you just "over-ship" and place your marker alongside the full shipping lane) and then occupying a city in that region, in a single turn.
  • The more valuable cards in the game are much harder to acquire, since in order to legally draw a value 5 card from a region you must first have established (at least) 5 population markers in that region.
  • The otherwise "poisoned" Slavery cards are much more attractive. Firstly, it will take a while for slavery to be abolished (if at all), because for that to occur, a single player must accumulate (at least) 5 cities in Europe, which is a challenging feat since there are only 10 total cities in Europe, and in most games they will end up split rather evenly. Secondly, since drawing cards requires a superior presence, the low-valued Slavery cards are compelling to players with a lesser presence in Europe.
  • The limit on population markers per player (though we rarely reach it) means that a player will find that they cannot establish presence throughout the board, but will end up having a greater presence in some regions, and a lesser presence in others. This subtle imbalance is crucial to gameplay.

Another rule which we often forget to play in Endeavor involves the re-filing of discarded over-capacity cards, although now that we aren't drawing cards so willy-nilly, it is less common that we end up discarding cards. When a player chooses to discard a card (to get down to their required limit according to the number of Politics tokens they have scored), the player returns the discarded card to the stack it came from, refiling it in numeric order, unless it is a Slavery card, which is always retained, upside down, for negative points at the end of the game.

This means, importantly, that a low-value card may return to the game in an open region late in the game, so if you weren't able to draw that card initially, and are locked out of drawing the higher value card due to insufficient presence, you may find that the lower value card's return provides you that opportunity. Similarly, the return of that lower-valued card may block some other player, who has high presence, from drawing a higher-valued card that they have their eye on (thus the point of the Trade Office, which lets you draw twice from an open region in a single turn).

Monday, September 5, 2011

Backups remain non-trivial

Here's a nice post from Scott Hanselman about his recent re-work of his home computer backup strategy.

Backups remain the poor step-child of computing. Nobody does them, and so every day a vast amount of valuable data is lost to computer crashes or accidental fumble-fingering on the keyboard. Reading Scott's essay, you can see why nobody does their backups properly: it's a lot of work!

My parents are more dedicated to their backup strategy than most people I know; they have decades of digital photography, genealogy, writing, and other treasures on their computers. But still, even though they know the importance of backups and do the best they can, it's hard work. It requires constant attention and discipline to ensure that you have complete, verified, and reliable backups.

Like Scott, we've switched to using multiple spare external hard drives as the basic backup technique; online cloud backup tools seem nice, but just aren't up to the task of backing up hundreds of gigabytes. We let one run on our somewhat sizable machine, and after days it was still reporting less than 5% complete. The far simpler technique is to periodically hook up an external drive, copy everything to it (overnight), and then label that drive with an index card and store it someplace far away for safekeeping.

One problem is that those spare external drives age, too, and there's no guarantee that you'll be able to fire one up in case of emergency. Not so very long ago we did a restore from backup, and sadly had to go through three spare external drives before we found one we could restore from (and we thanked our stars, since the third one was the last one we had!).

So here's a Labor Day thought for those of you sitting around relaxing on this fine morning: take a few minutes and think about your backup strategy. If you haven't got one, make one. And if you've got one: I bet it's time for you to make a new backup!

Saturday, September 3, 2011

Stuff I'm reading

It's been a busy few weeks, so this is one of those "link dump" sort of posts. Hope you find something interesting in here:

Thursday, September 1, 2011

Google Summer of Code 2011 nearly complete

The 2011 edition of the Google Summer of Code is winding down. This year, I worked with Houx Zhang on internationalization issues in the Derby test harness. I think it was a successful year, and we had a chance to dig into some problems with the Derby testing techniques that had languished for years. It isn't enough to internationalize the software that you distribute; you must also internationalize your documentation, your tools and infrastructure, and your tests. Derby has a rich and mature test suite, and making it possible to run the Derby test suite in non-English locales makes the test suite even stronger. I haven't had much time to work with Derby recently, but I still enjoy the team and the community tremendously, and I'm glad to stay involved as time permits.