10.01.11

New ES5 strict mode support: new vars created by strict mode eval code are local to that code only

tl;dr

Ideally you shouldn’t use eval, because it inhibits many optimizations and makes code run slower. new Function(argname1, argname2, ..., code) doesn’t inhibit optimizations, so it’s a better way to dynamically generate code. (This may require passing in arguments instead of using names for local variables. That’s the price you pay to play nice with compilers that could optimize but for eval.)
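To make the comparison concrete, here's a sketch (my example, with console.log standing in for print): both calls build the same adder at runtime, but the Function version compiles its body in the global scope, so it needn't pessimize its caller's local variables.

```javascript
// Runtime code generation two ways. Function compiles the body in the
// global scope, so engines can keep optimizing the surrounding code.
var addWithEval = eval("(function(a, b) { return a + b; })");
var addWithFunction = new Function("a", "b", "return a + b;");
console.log(addWithEval(2, 3));     // 5
console.log(addWithFunction(2, 3)); // 5
```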

Nevertheless, it’s still possible to use eval in ES5. eval of normal code behaves as it always has. But eval of strict mode code behaves differently: any variables created by the code being evaluated affect only that code, not the enclosing scope. (The enclosing scope’s variables are still visible in the eval code if they aren’t shadowed.) Firefox now correctly binds variables created by strict mode eval code, completing the last major component of strict mode (though bugs remain). These programs demonstrate the idea:

var x = 2, y = 3;
print(eval("var x = 9; x"));               // prints 9
print(x);                                  // prints 9
print(eval("'use strict'; var x = 5; x")); // prints 5
print(eval("'use strict'; var x = 7; x")); // prints 7
print(eval("'use strict'; y"));            // prints 3
print(x);                                  // prints 9
"use strict";
var x = 2, y = 3;
// NB: Strictness propagates into eval code evaluated by a
//     direct call to eval — a call occurring through an
//     expression of the form eval(...).
print(eval("var x = 5; x")); // prints 5
print(eval("var x = 7; x")); // prints 7
print(eval("y"));            // prints 3
print(x);                    // prints 2

This partially defangs eval. But even strict mode eval inhibits optimizations, so you are still better off avoiding it.

eval is a double-edged sword

eval is one of the most powerful parts of JavaScript: it enables runtime code generation. You can compile code to perform specific operations, avoiding unnecessary general-purpose overhead — a powerful concept. (But you’d be better off using new Function(argname1, argname2, ..., code), which doesn’t inhibit optimizations and still enables code generation, at the loss of the ability to capture the local scope. Code using eval may see considerable speedups: for example, roc’s CPU emulator sped up ~14% by switching from eval to Function. Less beefy code won’t see a win of that magnitude, yet why give up performance when you have a ready alternative?)

Yet at the same time, eval is too powerful. As inline assembly is to C or C++ (at least without the information gcc’s asm syntax requires), so is eval to JavaScript. In both instances a powerful construct inhibits many optimizations. Even if you don’t care about optimizations or performance, eval’s ability to introduce and delete bindings makes code that uses it much harder to reason about.

eval arbitrarily mutates variables

At its simplest, eval can change the value of any variable:

function test(code)
{
  var v = 1;
  eval(code);
  return v;
}
assert(test("v = 2") === 2);

Thus you can’t reorder or constant-fold assignments past eval: eval forces everything to be “written to disk” so that the eval code can observe it, and likewise it forces everything to be read back “from disk” when needed next. Without costly analysis you can’t store v in a register across the call.
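For contrast, here's a sketch (my example, not the original post's) of why Function escapes this penalty: code it compiles can't see or mutate the caller's locals, so v below can safely live in a register across the call.

```javascript
// A var in the Function-compiled body is local to that body alone;
// testFn's v is unreachable from the generated code.
function testFn(code)
{
  var v = 1;
  new Function(code)();
  return v;
}
console.log(testFn("var v = 2;")); // 1: the generated code's v is its own
```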

eval can insert bindings after compilation

eval’s ability to add bindings is worse. This can make it impossible to say what a name refers to until runtime:

var v;
function test(code)
{
  eval(code);
  return v;
}

Does the v in the return statement mean the global variable? You can’t know without knowing the code eval will compile and run. If that code is "var v = 17;" it refers to a new variable. If that code is "/* psych! */" it refers to the global variable. eval in a function will deoptimize any name in that function which refers to a variable in an enclosing scope. (And don’t forget that the name test itself is in an enclosing scope: if the function returned test instead of v, you couldn’t say whether that test referred to the enclosing function or to a new variable without knowing code.)

eval can remove bindings added after compilation

You can also delete bindings introduced by eval (but not any other variables):

var v = 42;
function test(code)
{
  eval(code);
  function f(code2)
  {
    eval(code2);
    return function g(code3) { eval(code3); return v; };
  }
  return f;
}
var f = test("var v = 17;");
var g = f("var v = 8675309;");
assert(g("/* nada */") === 8675309);
assert(g("var v = 5;") === 5);
assert(g("delete v") === 17);
assert(g("delete v") === 42);
assert(g("delete v") === 42); // can't delete non-eval var (thankfully)

So not only can you not be sure what binding a name refers to given eval, you can’t even be sure what binding it refers to over time! (Throw generators into the game and you also have to account for a scope without a binding containing that binding even after a function has “returned”.)

eval can affect enclosing scopes

Worst, none of these complications (and I’ve listed only a few) are limited to purely local variables. eval can affect any variable it can see at runtime, whether in its immediate function or in any enclosing function or globally. eval is the fruit of the poisonous tree: it taints not just the scope containing it, but all scopes containing it.

Save us, ES5!

ES5 brings some relief from this madness: strict mode eval can no longer introduce or delete bindings. (Normal eval remains unchanged.) Deleting a binding is impossible in strict mode because delete name is a syntax error. And instead of introducing bindings in the calling scope, eval of strict mode code introduces bindings for that code only:

var x = 2, y = 3;
print(eval("var x = 9; x"));               // prints 9
print(x);                                  // prints 9
print(eval("'use strict'; var x = 5; x")); // prints 5
print(eval("'use strict'; var x = 7; x")); // prints 7
print(eval("'use strict'; y"));            // prints 3
print(x);                                  // prints 9

This works best if you have strict mode all the way down, so that eval can never affect the bindings of any scope (and so you don’t need "use strict" at the start of every string of code you eval):

"use strict";
var x = 2, y = 3;
// NB: Strictness propagates into eval code evaluated by a
//     direct call to eval — a call occurring through an
//     expression of the form eval(...).
print(eval("var x = 5; x")); // prints 5
print(eval("var x = 7; x")); // prints 7
print(eval("y"));            // prints 3
print(x);                    // prints 2

Names in strict mode code can thus be associated with their bindings at compile time, without worrying that eval in strict mode code will alter those bindings, preserving additional optimization opportunities.

Firefox now correctly implements strict mode eval code binding semantics (modulo bugs, of course).

So if I write strict mode code, should I use eval?

eval’s worst aspects are gone in strict mode, but using it still isn’t a good idea. It can still change variables in ways the JavaScript compiler can’t detect, so strict mode eval still generally forces every variable to be saved to memory before the eval occurs and reloaded when next needed. This deoptimization is unavoidable if runtime code generation can affect dynamically-determined local variables. It’s still better to use Function than to use eval.

Also, as a temporary SpiderMonkey-specific concern, we don’t yet perform many of the binding optimizations strict mode eval enables. Indeed, the extra binding-semantics work might even make strict eval slightly slower than normal eval (I haven’t tested, and it’s entirely possible the difference is unnoticeable in practice). We’ll fix this over time, but for now don’t expect strict mode eval to improve performance. (If you really need performance, don’t use eval at all.)

Conclusion

eval is powerful — arguably too powerful. ES5’s strict mode blunts eval’s sharpest corners to simplify it and permit typical optimizations in code using it. But while strict mode eval is better than regular eval, Function is still the best way to generate code at runtime. If you must use eval, consider using strict mode eval for a simpler binding model and eventual performance benefits.

You can experiment with a version of Firefox with these changes by downloading a nightly build. (Don’t forget to use the profile manager if you want to keep the settings you use with your primary Firefox installation pristine.)

09.01.11

New ES5 requirement: getters and setters in object literals must not conflict with each other or with data properties, even outside strict mode

Conflicting properties in object literals

Object literals in ECMAScript can contain the same property multiple times:

var obj = { prop: 42, prop: 17 };

How does this behave? The object, when fully initialized, has that property with its last assigned value:

var obj = { prop: 17 }; // same effect

The expression 42 is still evaluated in source order, but its value isn’t found in the final object once construction and initialization complete.
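A sketch making this observable (the note helper is mine, purely for illustration): every duplicated initializer still runs, in source order, but only the last value survives in the finished object.

```javascript
// Track which initializer expressions actually execute.
var calls = [];
function note(tag, value) { calls.push(tag); return value; }
var obj = { prop: note("first", 42), prop: note("second", 17) };
console.log(obj.prop);        // 17
console.log(calls.join(",")); // "first,second": 42 was computed, then dropped
```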

Are conflicting properties desirable?

Duplicating property names is at best innocuous, but at worst it’s a source of bugs. Repeated assignment of the same side-effect-free expression is aesthetically unpleasing but harmless. But what if that expression has side effects? Or what if the two expressions are ever made to differ? (This needn’t be purely human error. For example, a conflict might be the result of a bad merge of your changes with changes made by others.) What if a developer accidentally changes the first instance of a property but doesn’t notice the second? You can see how this might cause bugs.

Compatibility

ES5 generally avoids breaking compatibility with ES3. For the sake of existing code, duplicate property names are a syntax error only in strict mode code.

function good() { return { p: 1, p: 2 }; } // okay
function bad() { "use strict"; return { p: 1, p: 2 }; } // ERROR

What about getters and setters?

ES5 standardizes syntax for getters and setters in object literals. Using getters you can write properties which lazily compute their values only when asked; using setters you can write properties which post-process values assigned to them: validating them, transforming them at assignment time, and so on. Standardized accessor syntax is new in ES5 (previously it was a non-standard extension), so it doesn’t present compatibility concerns.
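For instance, here's a sketch of both uses in ES5 literal syntax (the property names are hypothetical, chosen for illustration):

```javascript
// fahrenheit is computed lazily on read and validated on write.
var temps = {
  _celsius: 0,
  get fahrenheit() { return this._celsius * 9 / 5 + 32; },
  set fahrenheit(v) {
    if (typeof v !== "number")
      throw new TypeError("temperature must be a number");
    this._celsius = (v - 32) * 5 / 9;
  }
};
temps.fahrenheit = 212;
console.log(temps._celsius);   // 100
console.log(temps.fahrenheit); // 212
```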

Conflicts with accessors are worse than conflicts with data properties. What if a setter and a data property conflict? Properties in an initializer don’t invoke setters, so a conflicting data property might blow away an accessor pair entirely! Also, since getters and setters quite often involve side effects, or reliance on object structure and internals, errant fixes of one of a pair of getters or setters are likely to cause worse problems than conflicting data properties.

Therefore ES5 prohibits conflicting property getters and setters, either with each other or with existing data properties. You can’t have both an accessor and a data property, and you can’t have multiple getters or multiple setters for the same property. This applies even outside strict mode!

/* syntax errors in any code */
({ p: 1, get p() { } });
({ get p() { }, p: 1 });
({ p: 1, set p(v) { } });
({ set p(v) { }, p: 1 });
({ get p() { }, get p() { } });
({ set p(v) { }, set p(v) { } });
({ get p() { }, set p(v) { }, get p() { } });
({ set p(v) { }, get p() { }, set p(v) { } });

/* syntax error only in strict mode code */
function fail() { "use strict"; ({ p: 1, p: 2 }); }

SpiderMonkey and Firefox no longer permit conflicts involving accessor properties in object literals

Firefox 4 nightlies now reject any property-name conflicts in object literals. The only exception is when the object literal is outside strict mode and all assignments are for data properties. Previously we implemented accessor conflict detection only in strict mode, but now Firefox 4 fully conforms to the ES5 specification when parsing object literals. (While I’m here let me give a brief hat-tip to the ECMAScript 5 Conformance Suite for revealing this mistake, the result of spec misreading by multiple SpiderMonkey hackers.)

If you ever have conflicting properties in an object literal, odds are they were a mistake. If you’ve done this only with data properties, no sweat now — but you’ll have to fix that if you ever opt your code into strict mode. If you’ve done this with accessor properties (previously a non-standard, implementation-specific feature), you’ll need to change your code to eliminate the conflict. Conflicts are reported as syntax errors (but note the bug that syntax errors aren’t reported for JavaScript XPCOM components), and they should be easy to fix.

Conclusion

Object literals containing the same property multiple times are bug-prone, as only one of the properties will actually be respected when the code executes. That mistake can’t be fixed for data properties in normal code, but ES5 can prohibit new conflicts involving accessor properties; Firefox now properly treats such conflicts as syntax errors. You can experiment with a version of Firefox with these changes by downloading a nightly build. (Don’t forget to use the profile manager if you want to keep the settings you use with your primary Firefox installation pristine.)

27.12.10

Merry Christmas again!

Regrettably the Washington Post chose to wait until after Christmas to publish this, so this post is not quite on time. Nevertheless, in the vein of the post of Christmas past I present you with Amazon.com’s patented method (as yet unimplemented) for ameliorating the deadweight loss of Christmas, to its benefit and the recipient’s, and in some sense to the giver’s benefit as well:

Amazon patents procedure to let recipients avoid undesirable gifts

Apparently returned purchases are a major cost for retailers, especially otherwise largely-automated ones like Amazon.com. So avoiding shipping bad gifts, only to then have to process them again when returned, and possibly resell them at a loss, is a good way for Amazon.com to cut costs. (And though the article doesn’t mention it, presumably this system would act as an incentive for shoppers to shop exclusively at Amazon.com rather than elsewhere — even better from Amazon.com’s point of view.) Strangely (or perhaps not so strangely in today’s newspaper world, alas) the article doesn’t link to the patent itself, but it’s not particularly hard to find. I doubt many patents these days include “mildred” in their text. :-)

I express no position on the wisdom of permitting Amazon.com to patent this. But the idea itself is a good one.

11.12.10

A rather-belated response to Brain Drain Vs Foreign Invasion

(I considered posting this as a comment on the original post, but it’s grown enough that it warrants its own post. As a further aside, I find it interesting that the prevalence of post-to-reply varies across different planets. My impression is it’s much more common on Planet GNOME than on Planet Mozilla, for example.

And as one last aside, I’m amazed how much easier I find it to write on a topic I care about, presenting analysis I actually believe important, than to write on a topic I find uninteresting, or to espouse a position to which I hold little to no attachment — as most paper-writing in school tends to be, especially for one who is generally apathetic concerning literary analysis.)

Introduction

Shortly before Thanksgiving roc ruminated about the effects of choosing a foreign college over a local college, in particular making these observations:

I’ve always found it ironic that at the same time Americans complain about foreigners stealing US jobs, people in the originating countries complain about the “brain drain” of talent moving to the US. Can both groups be right? Would everyone be better off if talent stayed at home?

I think the right conclusion is that sound bites are rarely “right”. :-) There’s some truth and some incompleteness in both (even if the answer to the second question is unequivocally “no”).

The simplistic view

The obvious “cost” of “brain drain”, stated as such, is to the country losing the talent, and the obvious “gain” is to the country gaining it. Looking only at it in this narrow sense it’s just a zero-sum game. Of course more visas must be better for the gainer and worse for the loser!

The most obvious “cost” of incoming talent is that you must work harder for your position, and the corresponding gain is to those who get better positions than they had. Again it’s zero-sum: better for my labor force competitors, worse for me.

But since immigrants and not nations benefit in the latter case, “brain drain” can only be bad for the nation as a whole when it loses its best and brightest. So nations are better off if everyone stays at home in a state of autarky, right?

The unconsidered benefits

The narrow views ignore the benefits of migration. (One easy way to win a game of war: deal yourself the entire deck.) Examine both at once, and you see the incompleteness of either view.

The expats are better off

First, consider the value the expats, the people actually migrating, derive in doing so. They benefit from concentrations of people in their fields, or close to them, or perhaps even just of similar mental acuity, which would not necessarily be available if they couldn’t broaden their search radius to include the destination country. A country of four million like New Zealand may not be able to sustain world-class universities specializing in and job markets covering all of computer science, nuclear physics, oncology, aeronautics, and quantum mechanics. (Add more industries if you think New Zealand could field these.) If you can only study locally, you probably can’t study with the best in the field. If you can only work locally, you may not find your ideal job.

Incidentally, this point flatly answers the question, “Would everyone be better off if talent stayed at home?” Certainly some may see moving as a complete loss, but most will not.

Skill concentrations are more efficient

Second, consider the externalities from skill concentrations afforded by talent migration. If you get a lot of smart people working on a problem in the same location, you’ll likely get more progress than if they were geographically dispersed. The portion of the SpiderMonkey team that works in Mountain View, for example, is helped by being able to sit down and discuss issues, from small to large, in person. (Although to be sure, this is rarely necessary, as IRC, Bugzilla, and so on are adequate for all but the most intricate communication.) Functionally instantaneous communication lessens the gains of physical proximity, but it’s no substitute. This translates into more effective universities and more efficient companies. Heightened efficiency translates into reduced costs to provide products and services to the market (education is but one product/service). Reduced costs, in competitive markets (universities certainly do compete, as do most businesses), translate to reduced prices or increased quality. This is the oft-neglected good that reducing barriers to immigration provides; it is also one notably absent from the sound bites.

Restrictions burden even desirable immigration

Third, supposing that some restrictions are nevertheless desirable, consider that restrictions and impositions on visa quotas make it harder for the desired level of talent to migrate. Businesses and universities must fill out more forms and employ more people to process foreign talent. Immigrants who would be acceptable must still undergo more interviews, pay higher entry fees, and suffer more onerous restrictions on their freedom to modify their future plans. It’s hard to see how this yak shaving is good for anyone but the excess government workers employed to administer it (and demagogues who gain power promoting it).

A certain level of screening necessary to reject utter lowlifes may be unavoidable. Yet I see no rational relationship between this aim and, say, the rough US requirement that “you must remain continuously enrolled in a university or permanently employed while you remain on your visa”. And even a rational and properly limited policy might be the camel’s nose prior to truly excessive restrictions (no doubt spurred on by demagoguery and special interest groups).

Circling back

So if you add it all up, is brain drain good or bad when considered in total?

I think the benefits are much greater than popular rhetoric makes them out to be. Moreover, we should acknowledge that not every emigrant, say, who goes off to CMU to study computer science is forever lost to a country like New Zealand. Making exit easier for skilled workers does not necessarily doom a state to permanent loss.

Still, some people certainly will be worse off at the individual level. It’s also sometimes the case that groups of people, even entire industries, may be worse off with greater trade: the people the simplistic views focus on to the exclusion of all others.

To sum it up: the marginal gains from mobility of talent are widespread but small, while the losses are isolated and larger. But don’t expect special interest groups or demagogues, of whatever stripes, to acknowledge this.

A closing question, and answer

Having said my piece on the opening questions, I will close with one of my own, with an answer I hope may illuminate a deeper issue.

Take as a given that restricting “brain drain”, or restricting labor competition, is sometimes selfishly good policy. Why apply restrictions nationally and not at other levels? Why not at the level of the Swiss canton, the Indonesian province, or the American state (or in the special case of the European Union, the European country)?

It seems to me that the reason we see far fewer restrictions at non-national levels is that the overarching governmental units prohibit or severely curtail them, and special interests can’t overcome obstacles to changing that. But nationally, disparate special interests reach the critical mass to successfully push for restrictions. (At the international level the multitude of self-centered sovereignties make effective advocacy much more difficult.)

The national level isn’t really the appropriate level for restrictions on talent mobility. It’s merely the one at which special interests can be effective enough to get them enacted.

25.11.10

John Muir Trail: Thousand Island Lake to Squaw Lake

September 14

(17; 0 side; 60 total, 151 to go)

A pre-dawn panorama from my campsite of Banner Peak and the surrounding area

As planned I wake up early enough to catch the tail end of darkness before sunrise; this being the middle of a fairly large valley, I should be able to see it unimpeded. Surprisingly, given last night, little wind blows past the lake, and it’s much more comfortable than it was or than I had expected it would be. The sunrise is excellent but blinding; my camera has some difficulty capturing both the brightness of the sky and the comparative darkness of the ground. Nevertheless, I take a few pictures as I huddle inside sleeping bag and bivy sack, waiting for the sun to rise high enough to provide the warmth I need to leave them.

Hiking poles lean against a rock in the foreground, while in the background brightness limns the surrounding hills
The light, it burns!
Banner Peak in the lingering stages of dawn
Banner Peak greets the early-morning sunlight

Hiking begins relatively early today, in accord with rising for an early sunrise. The first several miles of trail wind around several lakes named for gems, climbing up and over and down ridges along the way. Today is the fourth day of hiking, and my ankles are beginning to adjust to the inclines and constant pounding through which I’m putting them. But for now, I’m far more engrossed in enjoying the thoroughly ridiculous scenery than in feeling any lingering pain.

Teal-blue Ruby Lake, ringed by cliffs and a gently sloping trail through trees
Ruby Lake
Reflections and color against the shallows of Ruby Lake; fallen trees and the occasional small rock are clearly visible against the soil floor
Reflections and color against the shallows of Ruby Lake

The largest of the precious-stone lakes, Garnet Lake, provides the greatest views. It dominates the landscape through its size, and its gently-rippling waters are a blurred mirror for the peaks in the distance behind it.

Mount Ritter and Banner Peak, seen over the rocky shore of Garnet Lake
Mount Ritter (left) and Banner Peak (right), seen over the rocky shore of Garnet Lake; curiously, Mount Ritter is the taller of the two — a matter of perspective

(Interestingly, my first picture of it and the mountains in the background is a near-exact copy of the cover of the guidebook I carried, even though I didn’t intend to precisely replicate the picture. [I probably aimed for the general idea — towering mountains above lake with some ground and trail in the foreground — but I didn’t notice the exact spot of that picture was mere steps away, even though I usually try to look for the settings of pictures in guidebooks I use.] The guidebook picture is obviously older, but beyond that the major difference is that my picture captures reflection in the lake while the guidebook doesn’t. I suspect it was deliberately airbrushed out of the picture to reduce busyness.)

Looking across the eastern expanse of Garnet Lake toward its outlet, crossed by a barely-visible wooden bridge
Mount Ritter and Banner Peak, mirrored in Garnet Lake; the mirror effect progresses from near-perfect closest to the camera to significantly blurred in the distance, as slight ripples in the water accumulate to distort the reflection
Mount Ritter and Banner Peak, mirrored in Garnet Lake, from near the footbridge across its outlet

Past Garnet Lake the trail ascends out of Garnet’s bowl, then generally descends on the way toward Devils Postpile National Monument. I pass by more lakes, none of which strike me enough to merit a picture.

The start of the descent past Shadow Lake toward Devils Postpile; far in the distance lies the black Volcanic Ridge
This descent, covered in baseball-sized rocks as it is, reminded me of some of the worst stretches of the Appalachian Trail in Pennsylvania *shudder*
Shadow Lake is surrounded by mountains, except for its outlet near a notch
Looking down toward Shadow Lake
Looking across Shadow Lake toward mountains within a few miles of it
Looking across Shadow Lake, with San Joaquin Mountain and Two Tears/Two Teats (web searches find both names: maybe it was bowdlerized?) in the distance

Just past one stream crossing I wander by another deer. Unlike previous deer on the JMT, this one cautiously watches me as I stop and take its picture, starting briefly at my experimental, abrupt move intended to gauge its reaction. Yet as with earlier deer, it generally ignores me. In the past I’ve considered this unnatural: wild animals should be afraid of humans, and they should retreat when humans approach. Yet this deer makes me reconsider. The primary problem with Shenandoah deer (often brazen beggars) was not their willingness to be near humans: it was their willingness to be near humans to beg. Proximity, and even some level of ease, is not inherently bad. The problem occurs when this is taken for granted: then, fearlessness and misguided beneficence produce a vicious cycle by which wildlife becomes no longer truly wild.

Shenandoah is too easily accessible for wildlife’s cautious acceptance of human presence to be workable. Throngs of visitors will to a sufficient extent ignore signs, act carelessly, and inexorably lead deer and other wildlife to mendicancy. In Shenandoah it really would be better for deer and other wildlife to be fearful of human presence to the point of fleeing it. (Bears in Shenandoah actually do this, mostly, I suspect, because SNP deals with problem bears much more aggressively than it deals with the vastly greater multitude of problem deer. Of course, bears being much more fearsome than deer also reduces interaction. 😉 )

But in many sections of the John Muir Trail, in the middle of remote wilderness, the deer that turns a wary eye in my direction yet continues about his business presents no problem. Nor does he induce any. Backpackers generally well-educated about interacting with wildlife (and usually not carrying food to spare!) won’t be much of an issue. Horseback visitors from nearby Devils Postpile are inherently hindered from over-close interaction, and they’re often supervised by informed guides. Less-educated day hikers are most problematic, but fewer of them will be here simply because it’s difficult to get to much of the JMT, severely blunting their ill effects. Complete lack of fear in wildlife is likely unworkable; it lowers barriers to interaction too far. But wildlife’s cautious acceptance along the JMT of human presence at a small distance, so long as the JMT remains remote, is a fragile yet stable equilibrium.

A deer eats of the grass and greens in front of it
The aforementioned deer

After much more descent I finally reach relatively flat ground: Devils Postpile National Monument is at hand. Devils Postpile’s main attraction is its bizarre natural rock formations: tall, regular hexagonal basalt columns (columns with other numbers of sides occur less frequently) formed by volcanic action. The JMT passes through Devils Postpile’s periphery, so I’d have to detour to see the formations, partly contributing to my decision not to go see them. But more than the delay, I decide not to go because the monument feels like it’s a Pacific Crest Trail experience, not a JMT experience. If I’m not deliberately visiting Devils Postpile, I’m going to leave seeing it for when I thru-hike the PCT. (For the same reason you won’t find me hiking a section of the PCT to hike it, except as part of a thru-hike.)

Boundary sign for Devils Postpile National Monument, indicating the dividing line between it and Inyo National Forest
Now entering Devils Postpile National Monument
Trail sign: Muir Trail to left and straight ahead, Pacific Crest to left and to right
The trail north bifurcates as the JMT winds around several lakes while the PCT travels the crest

I follow the trail through Devils Postpile, guided mostly by a picture I took of the map at a trail junction shortly inside it. (The guidebook strangely foregoes a map to awkwardly describe it in prose, making it less useful and more confusing than one might hope.) It’s mostly deep sand, so the going is a bit slow. Finally, I reach the turnoff to visit Reds Meadow, a campground, store, and restaurant just off-trail where I hope (likely quixotically, given the wide variety of digital cameras and batteries) to find a replacement camera battery. The meter on my camera’s been declining much more quickly than I’d expected, so I’m worried about running out partway down the trail and thus missing the end. I’m not in luck: the store has nothing more than standard batteries and regular rolls of film. I consider eating dinner at the restaurant, but I propel myself southward in hope of reaching a camping spot with some daylight. It’s now 17:00, and if I move quickly I can reach Crater Meadow in daylight.

The San Joaquin Fork heads south to Rainbow Falls seen in late afternoon from a footbridge on the JMT
The San Joaquin Fork heads south to Rainbow Falls

The trail south of here turns a bit eerie as I pass through the remains of a forest fire eighteen years ago. Blackened trees are everywhere, but smaller growth abounds. The trail curves through the area before heading up into the mountains again, and I hit a solid pace as I push to the end of the day.

Numerous short (under twenty feet) denuded, burnt tree trunks cover the hillside; new tree growth is mostly limited to small (no taller than a person) conifers, amidst grass, shrubs, and other ground growth
The tree cemetery south of Devils Postpile, devastated by the 1992 Rainbow Fire
Reddish-yellow spiky berries surrounded by small green leaves
Edible (insides only 😉 ) reddish-yellow gooseberries; if only I'd known what they were, and that they were edible, at the time...

The trail leaves the burned area and starts ascending, and I notice a few decent campsites. However, having set Crater Meadow as a goal, I feel compelled not to stop until I reach it. It starts to get dusky as I finish out the day, but I make it to a campsite near a small river crossing with light to spare and call it a day. The site’s partially occupied by Michelle, another JMT thru-hiker (albeit one starting from Tuolumne Meadows due to scheduling mishaps, hiking the entire stretch without resupply: a very aggressive pace and load, but not an inconceivable one). She’s started a small campfire, a nice break from my usual habit of forgoing campfires while backpacking. I eat dinner and we talk off and on as night falls.

I’m still carrying that boxed wine, but there’s a lake roughly a day’s hike from here. Maybe I can break it out tomorrow night, cool it in the lake, and finally get rid of its weight with dinner.

September 15

(18.5; 0 side; 78.5 total, 132.5 to go)

It’s up and out around the usual time this morning. Michelle and I end up leaving about the same time, but I overtake her shortly as I move faster with much less food to carry. I speed through the first six-odd miles of the day: there’s some scenery but no water, so I have little reason to stop.

Mountains in the distance, framed by pine trees
Double Peck East, seen over Cascade Valley from the JMT as it carves across a mountainside near Mammoth Crest
The trail continues south at far left; panning right the view passes Double Peck East and other mountains before concluding in tall evergreen trees
A panorama of Double Peck East from the trail
Mountains in the distance after an evergreen-covered valley just below
A less-obscured view of Double Peck East

Duck Creek ends the drought, and even though it’s a little early in the day I take the opportunity and stop for lunch, Michelle passing me as I eat. Readily available water is always good at mealtime, especially as tortillas with thick peanut butter or Nutella contain little water. I continue south again after lunch, curving around a small mountain before descending to Purple Lake. It would make a nice camping spot, had I reached it at the end of the day. I exchange pleasantries with a few people near the lake’s outlet, learning that Michelle is shortly ahead of me, and ascend again toward windswept Lake Virginia, passing Michelle along the way. Lake Virginia’s slopes are much flatter than those of other recent lakes, probably because it’s a larger lake.

The teal lake is surrounded by rocky slopes sprinkled with evergreens
An unnamed lake/pond south of Purple Lake
A small boulder-strewn depression just adjacent to the trail; the granite blocks remind me of toy blocks
A youthful giant's playpen

Past Lake Virginia the trail switchbacks steeply down into Tully’s Hole, then follows a creek to a trail junction. It’s around 17:00 now, so if I move quickly and keep moving I should reach Squaw Lake, a bit short of nearby Silver Pass, with daylight to spare: just about perfect for lightening my pack of a liter of wine. I do so, arriving at the sublime Squaw Lake shortly after 18:00.

A massive wall of rock, dotted with pine trees, makes up the backdrop for Squaw Lake, in the foreground, as dusk approaches; the sun has set far enough that land before the lake is in shadows, while the lake and slope behind it are yet sunlit
Squaw Lake against the mammoth expanses of part of the Silver Divide, from the rocks where I stayed for the night; that's Michelle in the bottom right, with her tent just barely in view in the foreground in the bottom left

I scout around for some sort of decent campsite before settling for bare rock halfway between the lake and the trail proper. (Bivy sacks are versatile — and mine is certainly more versatile than my non-freestanding tent.) Michelle arrives and contemplates continuing to Pocket Meadow, which looks to be about five miles south (most of which she’d be traveling after dark) before deciding to stop here as well.

Dinner is scampi (Knorr pasta as always) with salmon and a splash of pinot grigio. I have no idea whether I’m significantly improving the taste, or if I’m adding the optimal amount at the correct time, but I figure I can’t go wrong (and in any case, there’s nothing wrong with the placebo effect :-) ). I have more than I really need to drink, so it’s painless to experiment. Michelle also takes a splash in her dinner since I have so much.

I finish off dinner, then the remaining part of the liter, as darkness falls. But it’s not dark! Entirely by accident I have scheduled my hike to occur during the moon’s waxing phase, ending a couple days into its waning phase. (I don’t believe I could have timed my hike any better if I’d tried.) The moon is large and bright in the sky, enough so that I eventually turn off my flashlight; I really don’t need it to see as long as I don’t have to walk around much. Once I finally finish the wine it’s off to sleep underneath stars and moon by an alpine lake. Can it get any better than this? I am extremely hard-pressed to think how.

…but, as has happened before, this is not the end of the day! Around 03:00 I wake up to the flashlights of two hikers passing from the north. They don’t stop, perhaps recognizing a campsite with sleeping hikers when they see it, and I’m in no mood to wake up and find out why they’re hiking now, so back to sleep I go.
