tim: text: "I'm not offended, I'm defiant" (defiant)
Content warning: discussion of rape, domestic violence, sexual harassment, and victim-blaming in linked-to articles.

I just submitted a pull request to add my name to the Tech Event Attendance Pledge. Specifically:

  1. I will not attend any tech events where Joe O'Brien (also known as @objo) is in attendance.
  2. I will not attend any tech events without a clear code of conduct and/or anti-harassment policy.

For me, the first item is likely to be a moot point, since I'm not a Rubyist (although I play one on podcasts). Even so, I think it's important for me to explicitly say that a space that's unsafe for women is a space that's unsafe for me. And a space that accepts harassers, abusers, or rapists who have not been held accountable or shown remorse for their actions -- whether we're talking about Joe O'Brien, Michael Schwern, or Thomas Dubuisson, just to pick a few out of many examples -- is an unsafe space.

The second item is more likely to affect my day-to-day activities, but fortunately, the two conferences I'm most likely to attend in the future already have anti-harassment policies. Open Source Bridge's code of conduct is a model for all other events of its kind. And ICFP (along with all other SIGPLAN conferences) has an anti-harassment policy. At this point, there's no reason for any conference organizers to not have already done the work of establishing an anti-harassment policy (and it's not much work, since the Citizen Code of Conduct is available and Creative-Commons-licensed to permit derivative works; it's the basis for Open Source Bridge's code of conduct), so there's no reason for me to speak at or attend a conference that doesn't have one.
tim: Tim with short hair, smiling, wearing a black jacket over a white T-shirt (Default)
I've mostly been quiet lately since a lot of what I've been doing is wrestling with Linux test bots that behave slightly differently from other Linux test bots, and that's not very useful to write about (plus, if I did, it would mostly be swearing). However, I do want to make an exciting announcement! Copied verbatim from the mailing list:

This year, the Rust and Servo projects are participating in the GNOME
Outreach Program for Women (OPW). OPW is an opportunity for women to
take on a paid, full-time, three-month internship working on one of
several open-source projects. Interns can be located anywhere in the
world and will primarily work from home; some funding will be available
to travel to meet with the project mentors for a brief period of time.

This is the third time that Mozilla has participated as a source of
project mentors, but it's the first time that the Rust and Servo
projects specifically have participated. We're very excited about this
and encourage you to apply if you're a woman who's interested in Rust.
(OPW is an inclusive program: as per their information for applicants,
"This program is open to anyone who is a female assigned at birth and
anyone who identifies as a woman, genderqueer, genderfluid, or
genderfree regardless of gender presentation or assigned sex at birth.
Participants must be at least 18 years old at the start date of the
internship.") If you're not eligible, please spread the word to any
eligible individuals you know who have any interest in Rust or Servo.

No previous experience with Rust is required, but we do request some
experience with C and systems programming, and of course people who have
previously used Rust may make faster progress.

The deadline to apply is November 11.

The details are available at:

https://wiki.mozilla.org/GNOME_Outreach_December2013 (specific to Mozilla)
https://wiki.gnome.org/OutreachProgramForWomen (applying to OPW)

If you are not a woman, but interested in an internship working on Rust,
never fear! If you are currently enrolled in an undergraduate or
graduate program, you can apply for an internship at Mozilla Research
for the winter, spring, or summer term. (Note that OPW is open to both
students and non-students.) We will be announcing this program to the
mailing list as well as soon as the details are finalized.

If you have any questions, please join #opw on irc.mozilla.org and/or
contact Tim Chevalier (tchevalier at mozilla.com, tjc on IRC) (Rust contact
person) or Lars Bergstrom (lbergstrom at mozilla.com, lbergstrom on IRC)
(Servo contact person).
I'm coordinating Mozilla's involvement in OPW this year as well as any potential Rust projects (others may end up mentoring interns depending on the intern's preferred project, but the buck stops with me), so I'd be very grateful if readers would publicize this notice to relevant venues, as well as direct any questions to me (via email, IRC, or in a comment on this post).
tim: "System Status: Degraded" (degraded)
I wrote this comment on a LinkedIn forum thread a few weeks ago for alums of my undergrad school, and I wanted to save the comment somewhere less ephemeral. Most of it is replying to another comment that I won't quote directly since it was posted to a private forum, but it's from a faculty member who I respect very much.
I haven't read the blog post Daniel linked to yet, but I would suspect that the reasons based on humanities/social science apply to us as CS people more than you might think.

As far as [REDACTED]'s points: I don't know of any CS Ph.D programs that will put you into debt either... well, sort of. I went from having ~ $5000 in student loans post-Wellesley to having ~ $40000 in student loans after being a Ph.D student at Portland State for 4 years (and not graduating). Why? Because my grant didn't pay for health insurance, so I had to pay for that out-of-pocket (and the cost of student health insurance went up from about $1200/year to $4000/year over that four years). I had a lot of other expenses, mostly medical, that I couldn't pay for just out of my grad student stipend either. So while a CS Ph.D program may seem like it's not going to cost you anything, you also have to think about the opportunity cost of spending 4-10 years making $15K-$25K/year, most likely with no benefits (since RAs and TAs at many schools are hired at 0.45 FTE so the university won't have to give them benefits -- the exception is schools where RAs and TAs are unionized, which anyone looking to go to grad school should seek out). Especially for CS majors, those 4-10 years could be critical in your career -- people who don't go to grad school spend them developing and gaining experience that you're likely going to have to start over with anyway if/when you leave or graduate and can't or don't want to work in academia.

I agree that being your own boss is very appealing, and it's a big reason why I once wanted to be a professor. But it's not quite right to say that professors get to decide what they work on, and especially not grad students. Applying for grants limits what you can work on and also means you have to do a little or a lot of "spin" to make your work sound like something that will "help the war fight" (at least if you're applying for defense funding, which is what a lot of CS funding is). For some people, this may feel dishonest.

Teaching and mentoring are also appealing. There are many jobs that have mentoring as a component.

I think there should be more women in CS, but I also wouldn't want to make any individual woman deciding on a career feel like she *has* to do something that's difficult, costly (financially and emotionally), and will likely expose her to various kinds of harassment just to further the cause of equality. Ultimately, it's men's job to stop driving women out of CS.

In the end, I think if somebody reads all of this stuff and makes an informed decision that they still want to go to grad school, then they should go. That's why I went, two separate times! I won't exactly say that I wish I hadn't, but I also wonder how it could have been if I had spent those years working on projects with clear goals and expectations where I would have been compensated fairly.
tim: Tim with short hair, smiling, wearing a black jacket over a white T-shirt (Default)
It took me about six weeks, but I finally finished reading Samuel Delany's recent novel _Through the Valley of the Nest of Spiders_. Maybe I just have a kink for long books -- it's 804 pages and, like _Infinite Jest_ (which is even longer), I suspect it's going to be one of those books that keeps being important to me for a really long time. (The third one like that is _A Suitable Boy_ by Vikram Seth, though it hasn't stayed with me quite the same way _Infinite Jest_ has; I've also only read it once.)

In lieu of more thoughts, some quotations from it:

"'There ain't no normal,' Shit said. 'That's what he always told me.' With his scruffy beard, Shit pointed his chin toward Dynamite. 'There's just comfortable and uncomfortable. And I like to be comfortable with pretty much everything.'" (p. 305)

"'Well--' Eric looked back up and put his hand on Shit's warm shoulder--'state supported marriage comes with a whole lot of assumptions about how it's gonna be, a history of who has to obey who, when you're justified in callin' it quits, all sorts of things like that. Now, you could agree with each other to change some of those things or do 'em differently, but for thousands and thousands of years gay men and women didn't have even that--except for a few Christian monasteries here and there, where the monks were allowed to marry each other. But nobody likes to think about those. For us, decidin' to be with someone else wasn't a matter of acceptin' a ready-made set of assumptions. You had to work 'em all out from the bottom up, every time--whether you was gonna be monogamous or open; and if you was gonna be open, how you was gonna do it so that it didn't bother the other person and even helped the relationship along. Workin' all that stuff out for yourselves was half the reason you went into a relationship with somebody else. We had some friends once--back when we lived in the Dump--that was faithful for ten months out the year, but for two months they'd go on vacation and do all their tom-cattin' around.' He realized he was making that up, but hell, it was plausible. 'Then they'd be faithful again. But that's how they liked to do it. Then there were guys like us that just had to make real sure that the other person was feelin' good about things, when they did it and knew they were number one and didn't mind. See, that's what people who get married don't have. Or don't have in the same way.'" (pp. 785-786)

"'Bein' a pervert was the only way I ever learned anything worth knowin'.'" (p. 792)

There's also this epigraph, which, if I ever wrote papers anymore, I would try to include in a paper about GC:

"Except there's garbage, which is part of what we're trying to include in our work and our thought, which is to say, we are attentive still to what remains, what gets tossed away and off. We want to include the trash in many ways, thinking of this refuse according to all sorts of disposal systems." -- Avital Ronell
tim: Tim with short hair, smiling, wearing a black jacket over a white T-shirt (Default)
The last five days have been one full day of travel, preceded by three full days of Mozilla Summit in Toronto, preceded by another full day of travel. It's been quite a time, with not much sleep involved, but (for me) a surprising amount of code getting written and interesting sessions being attended and pure fun being had (karaoke night Friday and dance party Sunday -- if any of the organizers were reading, both of those were really good ideas). Actually, watching "Code Rush" -- a documentary made in 2000 about the very early history of Mozilla -- with Mozilla folks was quite an experience as well. You can watch it for free online; it's only about an hour, and I encourage you to if you have any interest at all in where we came from. (Passive sexism and other kinds of isms aside.)

So far as sessions, one of the highlights was Jack's talk on the Servo strategy on Saturday -- featuring about 30 minutes of lecture followed by a full 80 minutes of Q&A. The questions were the most focused, on-topic, respectful set of questions I've heard after any talk I've been to in recent memory (and it wasn't just because Jack prepared a list of suggested questions to ask in advance!) The other highlight for me was the excellent "Designing your project for participation" session with David Eaves, Emma Irwin, and Jess Klein on Sunday. One of the reasons why this session was so great was that it was mostly small-group discussion, so everybody got to be more engaged than they might have been if most of the time had been sitting and listening. I came away with a better understanding that a lot of different people are thinking about how to make the volunteer experience better in their specific project, and there are lessons to learn about that that are transferable. For example: volunteers are much more likely to keep participating if they get a prompt review for their first bug; and, it might actually be okay to ask volunteers to commit to finishing a particular task by a deadline.

During downtime, I got a few pull requests in, some of which even landed. One of the bigger ones, though, was #9732, for making automatically checked-out source files read-only and overhauling where rustpkg keeps temporary files. (That one hasn't been reviewed yet.) #9736, making it easier to get started with rustpkg by not requiring a workspace to be created (it will build files in the current directory if there's a crate file with the right name) is also important to me -- alas, it bounced on the Linux bot, like so many of my pull requests have been doing lately, so I'll have to investigate that when I'm back on the office network and able to log in to that bot tomorrow. Finally, I was excited to be able to fix #9756, an error-message-quality bug that's been biting me a lot lately, while I was on the plane back to San Francisco, and surprised that the fix was so easy!

Finally, I request that anyone reading this who's part of the Rust community read Lindsey Kuper's blog post about the IRC channel and how we should be treating each other. Lindsey notes, and I agree, that the incident that she describes is an anomaly in an otherwise very respectful and decorous IRC channel. However, as the Rust community grows, it's my job, and the job of the other core Rust team members, to keep it that way. I know that communities don't stay healthy by themselves -- every community, no matter how small, exists within a kyriarchy that rewards everybody for exercising unearned power and privilege. Stopping people from following those patterns of non-consensual domination and submission -- whether in such a (seemingly) small way as inappropriately calling attention to another person's gender on IRC, or in larger ways like sexual assault at conferences -- requires active effort. I can't change the world as a whole on my own, but I do have a lot of influence over a very small part of the world -- the same is true of most other human beings. So, I'm still thinking about how to put in that effort for Rust. The same is true for community processes as for code: patches welcome.
tim: Mike Slackernerny thinking "Scientific progress never smelled better" (science)
Open to: All, detailed results viewable to: All, participants: 31

Which of the following vegetables would you generally peel before cooking with?

butternut squash: 21 (67.7%)
kabocha squash: 10 (32.3%)
None of the above: 0 (0.0%)
I have never cooked any of these vegetables: 0 (0.0%)
I don't eat vegetables: 0 (0.0%)
I don't cook: 1 (3.2%)
I don't eat: 0 (0.0%)

Which of the following fruits would you generally peel before eating (raw)?

kiwi fruit: 16 (53.3%)
None of the above: 5 (16.7%)
I have never eaten any of these fruits: 0 (0.0%)
I don't eat fruit: 1 (3.3%)
I don't eat: 0 (0.0%)

(Labels for the remaining poll options did not survive the page export.)

tim: Tim with short hair, smiling, wearing a black jacket over a white T-shirt (Default)
Once #9654 lands (after ~ 24 hours in the queue, it finally started building only to fail because of an overly long line... and I thought I checked that every time I pushed a new version, but apparently not), we'll have completed the "build Servo with rustpkg" milestone. The next milestone is the first part of community adoption, which, roughly speaking, is the list of features we think rustpkg has to have in order for it to get widespread use in the Rust community (such as it is).

The first thing on that list, which I started working on today, is #6480: making locally-cached copies of source files that rustpkg automatically fetched from github (or, in the future, elsewhere) read-only. The intent behind this bug is to prevent a sad situation -- one that really happened to a Go developer, and could happen again with a similar package manager -- where somebody makes local changes to packages that their project depends on without realizing they're editing automatically-fetched files, and then loses those changes when they redistribute the code (without their local changes).

One solution is for rustpkg to make any source files that it automatically fetches read-only. Of course, anybody can just change the permissions, but that requires deliberate action and (with hope) will remind people that they might be making a poor life decision. That's easy enough to implement, and in fact, I implemented it today.

But I realized that this really goes along with another issue I filed, #9514. Servo depends on many packages that the Servo developers are also hacking on in tandem with hacking on Servo -- so if we started making everything read-only, that would make it hard to work with local (uncommitted) changes.

I think the right thing to do (and as per discussion in meetings, it sounds like it's the general consensus) is to make rustpkg do two things differently: (1) store fetched sources in a subdirectory of a workspace's build directory, not its src directory (it's less confusing if src contains only things you created yourself); and (2) always search for sources under src before searching under build. That way, if you want to make local changes to a package that you refer to with an extern mod that points to a remote package, you can manually git clone the sources for it under a workspace's src directory, and then rustpkg will automatically find your locally-changed copy (which it will also never overwrite with newly fetched sources).

I looked at what Go does, and it looks like it does more or less what I want to do in rustpkg: "By default, get uses the network to check out missing packages but does not use it to look for updates to existing packages." So, the only case where rustpkg will pull a fresh copy of a repository off github is if there's no local copy of the sources, or installed version of the binaries for it, that can be found by searching the RUST_PATH.
tim: Tim with short hair, smiling, wearing a black jacket over a white T-shirt (Default)
It's been a while since my last Rust blog post. The latest thing that's happened is that I just submitted a pull request that gives rustpkg packages that have custom build scripts the ability to have C dependencies (or, in fact, dependencies on any sort of artifact that you can write a Rust program to build). There's no magic here -- in fact, for the most part, it's just a matter of writing an example to show what needs to be done. Someday there will be a higher-level interface to rustpkg where you can just declare some source file names and rustpkg will do the rest, but not today. In the meantime, the build script explicitly sets up a workcache context and declares foreign source files as inputs, and so on.

The thing I did have to change in rustpkg itself was changing the install_pkg function (in the API that build scripts can call) so as to take a vector of extra dependencies. Currently, the types of dependencies supported are files (any source files, which get checked for freshness based on hashing their contents and metadata) and binaries (any files at all, which get checked for freshness based on their modification times).

Completing this bug means that (once it's reviewed and merged) I'll have completed the "build all of Servo" milestone for rustpkg, and on time, no less (albeit with less than 30 minutes remaining). Jack has been steadily working on actually building Servo with rustpkg, discovering many issues in the process, and once this pull request is merged, we should be able to do the rest.
tim: Tim with short hair, smiling, wearing a black jacket over a white T-shirt (Default)
The 2013-2014 summary of benefits for Mozilla's Anthem PPO insurance plan is missing one sentence that was in last year's version:

"Sex Transformation. Procedures or treatments to change characteristics of the body to those of the opposite sex."

(I'm not even going to talk about all the ways that that sentence does harm and is factually incorrect; feel free to point them out if you want to.)

Since the "procedures and treatments" the phrase clumsily attempts to designate are medically necessary, that was the only edit that needed to be made to make the plan non-discriminatory against trans people. If that sentence had been removed two years ago instead of now, I'd be in $15K of debt instead of $64K. I'd have been able to pay off all of my student loans.

I do not know whether all Anthem plans in California lack the problematic sentence as well. All health insurance plans in California (except for "health and life insurance plans" offered by out-of-state companies) are now required to cover any procedure for trans people that would be covered for cis people. But potentially, this still allows plans to exclude certain procedures that only trans people need. So I don't know whether Anthem is just doing the minimum that's legally necessary here, or whether Mozilla specially arranged for additional coverage. I also don't know what the situation is for US employees who work for Mozilla at offices outside of California.

But for me, this makes certain possibilities a lot easier, and more importantly, it means trans people who work for Mozilla, now and in the future, will know that their health care will be covered, just like that of their cis colleagues. It also means that one more company is setting an example for others to emulate of how to treat employees with respect.

It's an injustice that health care is a for-profit industry in the US and that access to good-quality coverage is usually tied to employment. But while we fight to change that, we can also break down the barriers that make trans people's lives harder and shorter. Removing trans exclusion clauses doesn't remove the barriers that exist for trans people obtaining housing or being considered fairly for employment -- particularly trans women facing intersecting oppressions. It doesn't decriminalize sex work or change how the prison-industrial complex oppresses trans women. But I still think that removing one continuous message that says that my friends and I just aren't as human as cis people are is a good thing.

ETA: I'm still not completely sure, but it appears that all Anthem Blue Cross plans in California will be trans-inclusive as of January 1, 2014. According to that blog post, employees in the US but outside California who are covered by a California-based employer's plan through Anthem will also have trans-inclusive benefits. The post seems to suggest that Anthem is going beyond the legal minimum by covering all transition-related care (though it sounds like there is still room for them to deny certain procedures, like facial surgery for trans women (which can be crucial for some people's mental health and personal safety) on the grounds that they are "cosmetic").
tim: Tim with short hair, smiling, wearing a black jacket over a white T-shirt (Default)
I've got tests passing for my fix for #7879, making dependencies work more sensibly. I was proud of my voluminous status update for the past week, so it was a bit sad to only work on one bug today. It was a hard one, though, and I finished it! Aside from fixing tidy errors and such, anyway.

Amusingly/frustratingly enough, I did the real work over the weekend and the work I did today was almost completely centered around testability. I realized I can't use timestamps from stat() to ensure that a given file wasn't rebuilt. Someone else discovered this too: file timestamp granularity on many Unix filesystems is only 1 second, so it's easy for two files that are created in quick succession, but sequentially, to have equal timestamps. The next idea was to compare hashes (which is what workcache already does!), but that won't do either: if no dependencies have changed, but a file gets rebuilt anyway, the rebuilt file is likely to be bitwise identical. Jack suggested making the test case set the output to be read-only, so that if the second invocation of rustpkg tries to overwrite it, that's an error. This required a little bit of fiddling due to details about conditions and tasks that I fought with before (see #9001), but the basic approach works.

So I can get a pull request in for this tomorrow, and happily, it's likely to fix several other bugs without additional work (or with minimal work): #7420, #9112, #8892, #9240, and I'm probably missing some.
tim: Tim with short hair, smiling, wearing a black jacket over a white T-shirt (Default)
I landed four pull requests between middle-of-yesterday and now! See the schedule -- #6408, #8524, #7402, and #8672.

The remaining work that's necessary for building Servo for rustpkg is making dependencies between crates in the same package work right; building C libraries; and (added today) rustpkg test. I worked mostly on the first one today (but worked a little on making a test case for the second one).

A community member reported this bug about rustpkg going into an infinite loop when compiling a package with two crates in it, where one depends on the other. Digging in, I realized that various parts of rustpkg assume that a package contains either only one crate, or several crates that are all independent of each other. The key thing is that it was computing dependencies at the package level, not the crate level. So if you have a package P containing crates A and B, and B depends on A, rustpkg goes back and tries to compile all the crates in P first, rather than just compiling A.

So I'm working on that; it requires some refactoring, but nothing too complicated. I was thinking at first that I would have to topologically sort the dependencies, but after talking to Erick Tryzelaar on IRC, I realized that that approach (the declarative one) isn't necessary in the framework of workcache, where function calls express dependencies. The issue was just that I was treating packages, rather than crates, as the unit of recompilation.

This past week was the Servo work week, so today I also spent some time in person working with Jack to get more Servo packages to build with rustpkg. He found a bug in how extern mod works that I fixed, but only after he had to leave for the airport.
tim: Tim with short hair, smiling, wearing a black jacket over a white T-shirt (Default)
Today I was able to finish the first three items on the "Build all of Servo" milestone on the schedule:

  • Sub-package-IDs allow you to name a subdirectory of a package directory by its own package ID. For example, if your package foo has an extras directory with several subdirectories, and foo/extras/baz has a crate file in it, you can rustpkg build foo/extras/baz without building the other crates in the foo package. This pull request landed already.
  • Recursive dependencies: actually, this almost worked before, and it was a matter of fixing a bug that assumed that all of the dependencies for a package (except for system libraries) were in the same workspace. This is building on the bots and should land in another ten minutes or so.
  • Installing to RUST_PATH: previously, rustpkg would install a package to the same workspace its sources were in. Now, it installs the package to the first workspace in RUST_PATH, if you've set a RUST_PATH in the environment; if you didn't, it defaults to the old behavior. This is in the queue to build.
I'm working on putting build output in a target-specific subdirectory now, which is easy except for cleaning up all of my crufty test code that made assumptions in lots of different places about the directory structure. Along those lines, I'm also cleaning up some of the test code to make it less crufty.

I looked at #7879 (infinite loop compiling dependencies) too, and it's harder than I thought; right now, rustpkg assumes it's okay to build the crates in a multi-crate package in any order. That's not necessarily true (as the examples given in the original bug and in my comment show) and so rustpkg can end up telling workcache that there's a circular dependency between two files (when there's not). I think fixing this will require topologically sorting the crates in a package in dependency order, which in turn will actually require... writing a graph library in Rust, since there isn't one. So that will be fun, but I don't have time today.
tim: Tim with short hair, smiling, wearing a black jacket over a white T-shirt (Default)
Today's pull request: making rustpkg find dependencies recursively. It wouldn't be much of a build system if it didn't do that, and it actually almost did without deliberate effort -- the missing piece was making the code that infers dependencies and searches for them actually search different workspaces other than the workspace of the package that has the dependencies. Other than that, it was just a matter of adding a test case, and then #8524 can close.
tim: Tim with short hair, smiling, wearing a black jacket over a white T-shirt (Default)
I just submitted a pull request that implements most of the commonly-used rustc flags as flags for rustpkg. This took longer than it would have taken if I had taken the approach of just passing any "unknown" flag to rustc, but I think it's useful for rustpkg to do a bit of checking so as to tell you if you're passing nonsensical combinations of flag and command. For example, most rustc flags don't make sense unless you're either building or installing a package, so rustpkg errors out in those cases (for example, if you do rustpkg list --opt-level=3).

I didn't add a test case for every flag, and there's still some refactoring of boilerplate that could be done, but I prioritized getting this done on schedule over completeness.
tim: Tim with short hair, smiling, wearing a black jacket over a white T-shirt (Default)
Now that workcache is in the queue, I've moved on to teaching rustpkg how to pass command-line flags to rustc. At first, I thought I was going to implement a syntax like:

rustpkg +RUSTC --linker myspeciallinker --emit-llvm -RUSTC build foo

(ripped off GHC's +RTS/-RTS options), where +RUSTC and -RUSTC delimit arguments that rustpkg should just pass straight to rustc. But then I realized that Rust's getopts library doesn't support this kind of flag. I didn't want to change it, and also realized that many flags are only valid when you're actually building or installing a package. So, I decided to enumerate the flags you can use in rustpkg (a subset of rustc's flags), as well as the subset of flags that work only with build, or with build and install.

So far, this approach is working fine; I'm writing a test for each flag, as well as a test that the same flag gets rejected when in the wrong context. I don't expect any surprises with this one.
tim: Tim with short hair, smiling, wearing a black jacket over a white T-shirt (Default)
Short post, since I stayed up late getting the workcache pull request ready. But: the workcache pull request is ready! Once this lands, I won't feel like I have to use finger air quotes anymore when saying that I'm working on a build system.

I thought I might be able to come up with something profound to say about it, but apparently not.
tim: Tim with short hair, smiling, wearing a black jacket over a white T-shirt (Default)
Today, I finished the initial work to integrate rustpkg with the workcache library, so that it remembers dependencies and only recompiles sources when they or their dependencies have changed. I'll write about that in a later post, though; for now I want to talk about a bug, or infelicity, that I discovered today.

Conditions are Rust's answer to exceptions; they're a lot simpler and more predictable than exceptions, and the conditions tutorial is quite clear, so you should probably just go read that first.

I've tried to exercise conditions a fair bit in rustpkg, since they are a fairly new feature and it's good to be able to see how they work out. For testing purposes, I wrote one regression test that makes sure the right conditions get raised when you try to install a package that doesn't exist.

When I finished with workcache, that test failed. That was weird at first, since avoiding recompilation shouldn't affect the behavior of a test that never compiles anything. Eventually, I realized the problem: conditions don't propagate across tasks. Before I added workcache, rustpkg was entirely single-task; but when workcache performs a unit of work that can be cached, it spawns a new task for it. That's great, because it means we get parallel compilation for free (at the crate level, anyway). But it also means that if the process of compiling one crate raises a condition, it manifests as that entire task failing.

I'm not sure what to do about this in the long term, but for now I just changed that test case to use task::try, which catches task failure, and to test something weaker than it did before (that the task fails, rather than that it raises three particular conditions in sequence).

That solves my problem, but I'm not totally sure that this interaction between conditions and tasks was intentional, so I filed an issue. It seems to me like it's confusing to have to know whether a particular piece of code spawns multiple tasks in order to know whether the condition handlers you wrap around it will actually handle all the conditions that that code raises. That seems like an abstraction violation. On the other hand, maybe I'm misguided for thinking that conditions should be used for anything but signaling anomalous cases within a single task.
tim: text: "I'm not offended, I'm defiant" (defiant)
I've got a new post on geekfeminism.org today about the tone argument and how it gets used to stop women from occupying positions of power.

Coincidentally, today I ran into a passage I'd saved from Jonathan Kozol's book The Night Is Dark and I Am Far from Home that says a lot of the same things; no doubt the ideas in it incubated in my head for five years, then came out in another form. I'm just going to copy and paste all of it, because it's that good. Emphasis added.

Thoreau wrote in 1854: "I fear chiefly lest my expression may not be _extra-vagant_ enough. ... I desire to speak somewhere _without_ bounds." In terms of syntax, style, and word-preference, the message of the public school is the exact reverse. Children come to realize, early in their school careers, the terrible danger to their own success in statements that give voice to strong intensities or to extravagant convictions. Instead, they are instructed, in a number of clear ways, not only not to speak but also not to think or feel or weep or walk beyond the clearest bounds laid out by public school. They learn whole sequences of moral obviation. They learn to abhor and to distrust what is known as "unconstructive" criticism. They learn to be suspicious of "extreme" opinion, most of all if it is stated with "extreme emotion." They learn to round off honest judgments, based upon conviction, to consensus-viewpoints, based solely on convenience, and to call the final product "reason." Above all, they learn how to tone down, cushion and absorb each serious form of realistic confrontation.

Anger between two parties, conflict starting up between two sides, is not accepted as the honest manifestation of irreconcilable interests (power and its victim; exploitation and its cause; victimization and the one who has the spoils) but solely as a consequence of poor communication, bad static on the inter-urban network, poor telephone connections between Roxbury and Evanston, or Harlem and Seattle. Nobody _really_ disagrees with someone else once he explains himself with proper care. Confrontation, in the lexicon of public school, is a perceptual mistake. It is the consequence of poorly chosen words or of inadequate reception: "We have to learn not just to talk, but also how to listen, how to understand ..." The message here is that, if we once learn to listen well, we will not hear things we do not like. To hear things that we do not like is not to hear correctly. (The teacher tells us that we need more exercise on "listening skills.")

The level of speech which is accepted, offered and purveyed within the public schools is the level appropriate to that person who has no reason to be angry, or no mandate to be brave. The implication is conveyed to kids that almost anything they ever say, or hope to say, will, by the odds, be "somewhat stronger," "somewhat less temperate," than the limits of the truth require; that there will be, in every case, a heightened likelihood of untruth in a statement that appears to carry strong conviction, _more_ truth in a statement that appears to carry _less_ investment of belief. Conviction in itself, as children come to understand, is the real enemy; but it is the presentation, not the content, which is held up to attack.

"Linda," says the teacher, in the classic formula of admonition, "isn't that a bit strong?" The teacher seldom comes right out and says the sort of thing that might be true, or at least half-true: "Look, we're going to have a much less complicated day if you can learn to cut into your sense of conscience and integrity a bit."

Instead he asks the children, "Aren't we overstating?"

As the first assertion is restated to conform to satisfactory limits of conviction, the viewpoint it conveys begins to seem "more true," and finally wins the badge of mild approval: "That sounds more sensible ...." In practice, as there comes to be less to believe, it comes to seem more readily believable. It is rare indeed, during twelve years of school and four of college, that pupils get back papers from their teachers with the comment, "Be more angry! Go further! You have stated this with too much caution!" Emphasis is all the other way.

Equally distrusted is unique opinion which has not been rounded off to fit the class consensus. "Okay ... David's said the Negro people have been fighting for their rights ... and Susan says that we need law and order ... Well, there might be truth in _both_ of their positions ... Let's see if we couldn't find a _third_ position ... " It is not argued in a candid manner by the teacher that the third position may well prove to be _convenient_; rather, there is the implication that the third position will be more "true" than either of the two extremes, that truth dwells somehow closer to the middle.


It is an easy step from this to the convenient view that all extremes of action end up in the same place, that radical change must bring inevitable repression. The phrase "EXTREMISTS AT BOTH ENDS" is, for this reason, a manipulative phrase. Its function is to tell us: (a) There is, in every case, a "greater truth" residing some place in the middle; (b) There _is_, in every case, a "middle situation" --- one which is not artificial, or dishonest, or contrived.

-- Jonathan Kozol, _The Night Is Dark and I Am Far from Home_. Continuum (New York), 1975.

tim: Tim with short hair, smiling, wearing a black jacket over a white T-shirt (Default)
I'm continuing to make progress on workcache/rustpkg integration. Today I realized that I'd been doing things a bit wrong, and that it would be important for rustpkg to declare all the inputs (for a particular package) before starting to do actual work (building).

Let me back up a bit, though. workcache is a library for tracking cached function results. A function, in this case, is the process of generating build artifacts (libraries and/or executables) from inputs (source files). workcache knows about declared inputs, discovered inputs, and discovered outputs. In the case of rustpkg:

  • A declared input is a source file that you know about when you start building. For example, if the user wanted to build package foo in someworkspace, and someworkspace/src/foo/lib.rs exists, then you know that someworkspace/src/foo/lib.rs is a declared input.
  • A discovered input is a source file or binary file that you discover while searching for dependencies. In the previous example, if lib.rs contains extern mod bar, and rustpkg determines that bar lives in $HOME/.rust/lib/libbar-whatever-someversion.dylib, then $HOME/.rust/lib/libbar-whatever-someversion.dylib gets added as a discovered input.
  • A discovered output is what you're trying to build when you build the package. So, in this example, someworkspace/lib/libfoo-whatever-someversion.dylib would be a discovered output.

The code I'd written so far was interleaving declaring inputs with doing actual work (building, which produces discovered output). That won't do, because when you look up a cached function result, you're looking it up by the name of the function paired with the list of declared inputs. If you don't know all the inputs yet, you won't get the right function result, and you also won't save it in a way that lets it be found later.

So, I fixed that, and am putting it all back together now. It's slow going since workcache is not entirely documented (although it has more documentation than some Rust libraries do!) and I've had to add functionality to it.
tim: Tim with short hair, smiling, wearing a black jacket over a white T-shirt (Default)

I know my daily posts are usually pretty low-level, so today I want to step back just a bit and explain a little bit of how rustpkg works. rustpkg is heavily modeled on the Go package manager.

rustpkg operates on workspaces. A workspace is a directory. It has several subdirectories: src, lib, bin, build. rustpkg creates the last three automatically if they don't exist.

If you are using rustpkg, you probably want to build and install Rust code that you wrote yourself. You probably also want to build and install code that other people wrote, living in version control repositories that are available online (for example, on Github).

When you check out a remote repository to build, you check it out into a workspace you already created. So if you want to use my library called "quux" (which doesn't really exist), you would do:

mkdir -p Workspace/src
cd Workspace/src
git clone http://github.com/catamorphism/quux
cd ..
rustpkg install quux

This would make a directory called Workspace/src/quux. Here, quux is a package ID. A workspace can contain several different packages with different IDs.

rustpkg actually makes this a little bit easier for you. You can also do:

mkdir -p Workspace/src
cd Workspace
rustpkg install github.com/catamorphism/quux

n.b. Next paragraph edited for clarity on 8/29/2013

This would make a directory called Workspace/src/github.com/catamorphism/quux. Since the source of the package is a remote repository, here the package ID is a URL fragment: github.com/catamorphism/quux. Supposing quux is a library, it would be installed in Workspace/.rust/lib/libquux-HASH-VERSION.dylib, where HASH is computed by rustc and VERSION is quux's version (as per the most recent git tag). You might assume that it would be installed in Workspace/lib/..., but since the source code was fetched remotely, rustpkg uses the first workspace in the Rust path as the destination workspace. By default, the first workspace in the Rust path is CWD/.rust, where CWD is the current working directory. You can change the Rust path by changing the RUST_PATH environment variable, but I'll get to that.

You can use one workspace for many different projects. The idea is that the sources for each project live in its own subdirectory under Workspace/src (or whatever you want to call your workspace). Or, you can make one workspace per project. Then the src/ directory would contain one subdirectory for the sources for your project, and one subdirectory for the sources for each of its dependencies.

The motivation here is to avoid so-called "dependency hell" by compartmentalizing the dependencies for different projects, rather than storing all libraries in one system directory. So if project A needs version 1.2 of library Z, and project B needs version 3.6 of library Z, no problem! The downside is potential multiple copies of libraries; we'll see how that works out in practice.

You do have a choice if you want to avoid multiple copies: you can use the RUST_PATH. RUST_PATH is an environment variable that rustpkg knows about; it's a colon-separated list of workspaces. Like:

declare -x RUST_PATH=/home/tjc/rust:/home/tjc/servo:/home/tjc/miscellaneous-useful-libraries

(if you use bash, anyway.)

It's assumed that if you add a directory to the RUST_PATH, that directory is a workspace: that is, it has a src/ directory with one or more project-specific subdirectories. If it doesn't have a src/ directory, rustpkg won't find any source files -- or installed libraries -- there.

After I implemented all of that, Jack began trying to port some of the Servo submodules to rustpkg. He ran into the issue that rustpkg always installed the build artifacts into the same workspace as the sources, assuming the sources are in a workspace. He wanted to be able to have the source in one workspace, and tell rustpkg to put the build artifacts into a different workspace. As he pointed out, RUST_PATH represents both "where to find binaries" and "where to find sources". This was by design, but that doesn't mean it's necessarily the best design.

Jack wanted to be able to support the Servo submodule scenario without too much restructuring. Once ported to rustpkg, the Servo repository will look something like:


bin.rs is the main crate module for the Servo executable. foo is an example submodule of Servo; lib.rs is its crate module. There are many other submodules under servo/deps/src. We want to be able to build foo with rustpkg and put its build output in servo/build and servo/lib, so that they can be found when building Servo with rustpkg.

At this point, I said: why not just put the deps directory under servo/src? Jack's answer is that the Servo repository has many subprojects, so the developers want to organize them somewhat. And if the deps directory has subdirectories, the subdirectory names become part of the package ID. Since these subprojects all have their own git repositories, it starts looking a little strange for the local directory structure to affect how you refer to the package.

Then we considered whether or not to just use the extern mod foo = ... mechanism for everything. That is, don't manually store the subprojects locally; rather, write a directive in the Servo crate like extern mod foo = "github.com/mozilla/foo"; that tells rustpkg "fetch the sources in the repository named mozilla/foo on github.com, cache a local copy, and build them". That way it's up to rustpkg how to organize the local files, and the developers won't have to think about it. However, this solution does not work well if the Servo developers often make local uncommitted changes to the libraries they depend on, concurrently with developing Servo. In this scenario, rustpkg will use the revision it found in the remote repository, and not the locally modified version.

The solution we came up with is to change what's allowed to appear in the RUST_PATH. So when building Servo, if somebody wanted to build a particular submodule -- rust-geom -- first, their RUST_PATH would look like:


and they would type rustpkg build rust-geom to install rust-geom into the $HOME/servo workspace.

I have a pull request in the queue to implement this change. I wasn't sure whether we wanted it to be transitional -- until, perhaps, the Servo repository gets restructured to make it conform to rustpkg's view of the world -- or permanent, so I put it behind a command-line flag, --rust-path-hack. Perhaps instead, rustpkg should conform to Servo's view of the world, since we are motivated by wanting to build Servo and rustc with rustpkg.

The main advantage of allowing the RUST_PATH to only contain workspaces seems to be simplicity. But for a project with many submodules that may be organized in an arbitrarily complicated way, perhaps making the RUST_PATH more flexible makes it simpler to build packages.


tim: Tim with short hair, smiling, wearing a black jacket over a white T-shirt (Default)
Tim Chevalier

March 2014
