On WSL performance

As somebody who does a lot of work in a Linux environment, WSL (the Windows Subsystem for Linux) has become almost a required tool for me.  A while back, I looked up ways to share files between native Windows and WSL.  For various reasons, the most convenient workflow for me is to do much of my work on the code from within Windows, but then run various tests and parts of the build process in Linux.  So I wanted to see what my options were.

The option I had been using was the mount point that WSL sets up in Linux for the Windows filesystem (by default, your C: drive shows up under /mnt/c).  It turns out there are also a couple of ways to go the other direction and read Linux files from Windows: the direct, unsupported way, or the supported way using a network share (\\wsl$\<distro>\ in Explorer).  Sadly, it turns out none of these are really good for me.

My main problem and motivation for looking into this was simple: performance.  When crossing the container boundary, filesystem performance takes a nose-dive.  And I'm not just talking about an "I notice it and it's annoying" performance hit, I'm talking about a "this is actively reducing my productivity" hit.  For filesystem-intensive processes run in Linux, a conservative estimate is that things take at least 2 to 3 times as long when the files are hosted in Windows as when they're hosted in Linux.  And it's frequently much worse than that.  For one project I was working on, the build process took upwards of 20 minutes when the files were on Windows, but when I moved them to Linux it was around 3 minutes.  And it's not just that project.  Even for smaller jobs, like running PHPStan over a different project, the difference is still on the order of several minutes vs. 30 seconds or so.  Perhaps this has improved in more recent versions, but I'm still stuck on Windows 10 and it's seriously painful.

My solution?  Go old-school: I wrote a quick script to rsync my project code from Windows to Linux.  Not that rsync is super-fast either, but it's not bad after the initial sync.  I just set it up to skip external dependencies and run NPM et al. on Linux, so even when there are "a lot of files", it's not nearly as many as it could be.  Of course, then I need to remember to sync the code before running my commands, which is not ideal.  But still, the time difference is enough that I can run the command, realize I forgot to sync, do the sync, and run the command again in less time than it takes to run it once on the Windows-hosted code.
