My Key Learnings after 30,000 LOC in Rust
I used to love C and C++. Going back to the mid '90s, I wrote C, probably poor C++ that I thought was great, and Assembly, exclusively as part of my reverse engineering/security work.
There’s something about being mindful of the price of the instructions and data you use that feels intuitive and safe, and generates confidence. You get low-level building blocks and build your own blocks, paying little for them, because there’s a notion of “zero cost”, or at least you directly control the cost. You had few moving parts, and that, as we all know, is a strong trait of simple design.
Then, around 2010–2011, I found an outlier: Go. Not functional, not dynamic, and it feels like home (home being C). When Node.js was failing me, I built Go-based systems and took them to production from the early Go versions with great success (I remember that around a year later Segment.io publicized their move from Node.js to Go, as did others).
I felt that Go struck the balance of robust systems programming and great performance with almost zero investment, plus a great developer experience. Some of that old '90s feel, and a lot of modern, fantastic development practices.
But there’s a reason I’m being nostalgic and making you read all that, which leads me to the first two things you should know about Rust.
1. You Have to be Ready To Commit
For me, Rust takes a stroll down the memory lane above and brings home the best experiences from all of those languages, which is a mind-boggling, insane achievement.
To pay for all that greatness (as I like to think of it), it comes with what people commonly experience as an annoyance: the borrow checker, and something that is making a comeback (looking at you, gradual typing!): static typing. This means that if you’re used to working with a GC at your disposal and/or a dynamic language, be ready to commit a good chunk of your time to thinking and designing around memory management and types, and to that experience in general: slow, safe thinking.
And here’s another curveball: it takes around 30 seconds to compile one of my larger Rust projects, and a type-check cycle takes around 3 seconds. It’s so bad that when I switch from Rust to Go, it takes my brain a few minutes to adjust to Go’s compilation speed; I type Go code and then gaze at the screen waiting for the compiler to catch up, when in fact it finished the moment I typed :).
I’m sure you’ll find your own challenges; which brings me to the best Rust advice I can give you.
Be ready to commit. Don’t expect to walk in, write code, and win, especially not when you’re starting. Expect to walk in, fall on your face, facepalm a lot, write code, and walk out after twice the amount of time you planned to invest (more like ten times, but I didn’t want to sound gloomy).
Which leads me to the second best Rust advice: survive the beginning.
Or in the words of the great John Carmack, Rust code feels very wholesome.
2. Rust is Wholesome
If you’re planning to dive into Rust, here’s what you should expect starting out, whether you’re coming from dynamic languages, static garbage-collected languages, or static languages with manual memory management.
One of the major discussion points around how Rust “feels” is the borrow checker and ownership. Here’s my makeshift lifecycle of a Rust adopter and the borrow checker.
- Fight it
- Almost lose it
- Agree with it
- Embrace it
- Understand it
Coming From Dynamic Languages
This is a meaningful change in how you design and build software. Because of the borrow checker and static typing, your design needs to be bolted firmly into place, and there is little headroom for “leaving things for later”.
It means that from time to time, the borrow checker will tell you that your design stinks. Sometimes it doesn’t matter where you shift code to satisfy the borrow checker; after hours of fighting, and only after you take a few steps back and come to it later, you’ll realise your design was fundamentally flawed. You’ll negotiate your way around redesigns by cloning, copying, and boxing, and if you’re lucky that will be enough.
If you’re luckier still, you’ll actually go through a redesign, which will force you to look at your object graph, responsibilities, and dependencies, at which point you’ll finally get Rust.
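To make the “negotiating by cloning” stage concrete, here’s a minimal sketch (the names are illustrative, not from any real project) of the kind of fight the borrow checker picks, and the clone that buys peace:

```rust
fn main() {
    let mut names = vec![String::from("a"), String::from("b")];

    // Holding a reference into the Vec while mutating it is rejected:
    // let first = &names[0];          // immutable borrow starts here...
    // names.push(String::from("c"));  // ...error: cannot borrow `names` as mutable
    // println!("{}", first);

    // Negotiating with a clone keeps both sides happy: we own a copy
    // of the data instead of borrowing into the Vec.
    let first = names[0].clone();
    names.push(String::from("c"));
    println!("{} {}", first, names.len()); // prints "a 3"
}
```

Often the clone is fine; sometimes it’s the smell that tells you a redesign is due.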
Coming From Static Languages, Garbage Collected
You will feel most of your pains with the borrow checker, the same as if you came from dynamic languages; only now you’ll also have to unlearn parts of your static-typing intuition, and unlearning things is hard.
Rust will feel bizarre and boilerplatey. If you can’t get over it at this point, try to challenge what you’re building: does it have to be built with Rust? Is it something you want to build with a systems programming language? Do you need a predictable memory model, zero cost, great performance (which arguably you can already get with a good VM, trading off memory for performance)? Maybe not. Better to pick the right tool for the job.
Coming from Static Languages, Manual Memory Management
If you’re coming from C++, soon after getting into Rust you’ll feel like you’ve been handed a new pair of shoes; I say “soon after” and not “immediately after” because chances are your first impression will be that Rust “decides for you” too much, and then, I would hope, cargo and time will do their thing and soften you up.
Compared to C++, where you have (and want!) full control, some of Rust will feel like magic you won’t agree with. Also remember that Rust is relatively very young; a good part of the ecosystem you’d expect to be there, coming from C++, doesn’t exist yet.
3. Sweet Spot for Building Tools
Building tools with Rust is amazing. You have an amazing CLI library, a just-as-amazing serialisation library for configuration and data structures, a great logging library, and even safe parallelism for free. Safe parallelism for free!
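The “safe parallelism for free” claim can be sketched with nothing but the standard library; this is a minimal illustration, not any particular tool’s code:

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // Shared mutable state must be wrapped: Arc for shared ownership,
    // Mutex for synchronised access. Forgetting either is a compile
    // error, not a runtime data race.
    let counter = Arc::new(Mutex::new(0u32));

    let handles: Vec<_> = (0..4)
        .map(|_| {
            let counter = Arc::clone(&counter);
            thread::spawn(move || {
                *counter.lock().unwrap() += 1;
            })
        })
        .collect();

    for h in handles {
        h.join().unwrap();
    }

    println!("{}", counter.lock().unwrap()); // prints 4
}
```

The compiler rejects the unsynchronised version outright, which is the “free” part: the safety is checked, not tested for.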
The rest is batteries included, as the standard library is extremely rich.
Rust is also performant and safe by default, has a very good cross-compilation story, and it doesn’t hurt that binaries are relatively compact, ranging from 5 MB to 20 MB (in most cases smaller than Go binaries), with jaw-dropping start-up times.
4. Great for Interop and Replacing Existing Infra
In my mind, Rust’s killer feature was supposed to be C++ interop, but maybe that’s just me. And like many languages that try to integrate with C++, it struggles; the short story is that it’s not fully there yet (because of C++’s name mangling, among other things).
However, Rust’s FFI and its direct integration with dynamic languages have quickly become unmatched, which puts it well underway to replacing C/C++ as the low-level language for performance-critical extensions to dynamic languages. But that’s my own prediction.
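The FFI surface itself is tiny; here’s a minimal sketch (the function name is illustrative). Built as a cdylib, the symbol below is callable from Python via ctypes, Ruby via FFI, and so on:

```rust
// #[no_mangle] keeps the symbol name stable, and extern "C" gives
// the function the C calling convention, so any language that can
// speak the C ABI can call it from the compiled library.
#[no_mangle]
pub extern "C" fn add(a: i32, b: i32) -> i32 {
    a + b
}

fn main() {
    // It's a perfectly ordinary Rust function too; the C ABI only
    // changes the symbol and calling convention, not the semantics.
    println!("{}", add(2, 3)); // prints 5
}
```

Add `crate-type = ["cdylib"]` under `[lib]` in Cargo.toml and the same code ships as a shared library.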
5. It’s Consistent and Stable
I started using Rust very early on, and, to share a confession, I ditched it in anger. I couldn’t get a build to work consistently, with breakage across the language, crates, and the standard library.
When I picked it up again a couple of years later, it wasn’t the same. I’ve never since had a problem with crates, dependencies, packages, or build infrastructure, and there is a good reason for that.
It takes that kind of polarized experience to appreciate stability as a highly valued feature.
In a project with a large surface area in terms of crates, I’ve been upgrading my nightly Rust every weekend for the last half a year without a single breakage, and in fact with gains in performance and compilation times!
6. Slicing and Dicing Software
One Rust feature that’s easy to overlook is features. Features let you mix and match your software at compile time; you can ship reduced versions, differently licensed versions (using different crates with different licenses), or ship a menu for others to choose from, based on their individual constraints.
I use features as part of a plan to ship a commercial as well as a free version of software from the same codebase, which guarantees the free version binary has none of the licensed code, at all.
I also like to control compilation times by excluding features I don’t use from other crates (more on that later).
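A minimal sketch of the commercial/free split described above (the feature and function names are made up for illustration); Cargo.toml would declare a `commercial` feature under `[features]`, and the paid build runs with `--features commercial`:

```rust
// With the feature off, this function (and anything it pulls in)
// is not compiled at all: it's absent from the free binary.
#[cfg(feature = "commercial")]
fn license_banner() -> &'static str {
    "commercial build"
}

// The fallback compiled into the free version.
#[cfg(not(feature = "commercial"))]
fn license_banner() -> &'static str {
    "free build"
}

fn main() {
    println!("{}", license_banner()); // prints "free build" without the feature
}
```

Because the exclusion happens at compile time, the guarantee is strong: the licensed code is not merely disabled, it’s not in the artifact.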
7. Amazing Performance
Clean, simple Rust is very fast by default.
I see performance gains every time I update my Rust, so much so that I look forward to updating Rust weekly (I use nightly).
8. Use Go When the Ecosystem Isn’t There
Some of the ecosystem isn’t there yet, and that’s fine. Obviously the better karma is to build and share what’s missing, but when pressed for time, I use Go to complete the Rust story, compiling the Go code as c-shared and loading it with libloading. Go has a richer ecosystem, and I can iterate faster that way when the trade-offs allow for it; you do have to account for build complexity (cross-compiling the Go code and making sure enough of it is statically linked to be loaded easily).
Why Go of all languages?
- Can be compiled into a C library, and has a fantastic cross-compilation story around it.
- By now has an enormous ecosystem.
- Easy to pick up in a day, small number of concepts to learn and so a low mental overhead. Fast in, fast out, back to your Rust code.
- Shares some performance traits with Rust.
9. Slow Compilation Speeds
In my mind this is a big one, so I also want to share some tips for mitigating it.
Starting with a cargo new --bin and jumping into your editor, compilation speeds are great and editor feedback is fast. Add around five crates, and you’re in the 5-second zone for compilation.
Fast forward to a medium-to-large project and you’re looking at 30 seconds for a build and around 5 seconds for a cargo check, which is what my editor (and probably yours) uses for quick error checking, the thing that sets the squigglies in your fast feedback loop.
To get a faster build you can break your project into pieces and compile each separately. I put this off for a while, preferring to suffer the compilation times, because I didn’t believe it would go smoothly (it did, massively so); I thought I’d be stepping into a Rust-flavoured rabbit hole, with the borrow checker somehow getting the upper hand on me in all this.
I went from a painful 30 seconds to 9 seconds per build by shattering the main project crate into four (read: splitting the main code base into multiple independent libraries); most of those 9 seconds is now spent linking the binary. And it turned out to be a breeze.
Additional benefits I’ve discovered with the new project layout:
- Easy to identify optional features.
- Easy to flag some of the sub crates as features and not include them.
- Easier to isolate and performance test smaller crates.
So, there’s a lot of power in breaking things into small pieces. This means project layout becomes much more important in Rust, as it is a workflow-optimization enabler for individuals, and even more so for teams.
To break down cleanly into crates, as with other languages and modern software architecture, try to identify your:
- Core: data, types, abstractions
- Logic: algorithms (can sit with core)
- Peripherals: networking, HTTP, reporting
- App: entry point, uses and glues everything
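That breakdown maps naturally onto a Cargo workspace; a minimal sketch of the manifest, with illustrative member names matching the list above:

```toml
# Workspace Cargo.toml at the repository root (names are illustrative).
# Each member is an independent crate that compiles, caches, and
# version-bumps on its own; `app` depends on the other three.
[workspace]
members = ["core", "logic", "peripherals", "app"]
```

Each member directory then carries its own Cargo.toml, and cargo builds only what changed.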
Here’s another nice surprise that happens which reminds you Rust is smart:
4 | use crate::project::Project;
  |            ^^^^^^^ unresolved import
  |            help: a similar path exists: `aural_engine::project`
Rust will help you: if it sees an import path that used to resolve to local code, but you’ve since moved that code into a crate, it will point you to the new path.
Note that to enjoy the speed, you might have to open your editor rooted at one of the sub-crates. This means it will run checks in the context of that crate; if you open at the workspace root, your editor (or cargo) will run checks in the context of all dependencies. Namely, if B depends on A, and A changes, cargo/rustc will check both A and B, nullifying the purpose of crate shattering. However, if you open your project in the context of B and change B, only B will be checked, which means speed.
To finish with some reading material about diagnosing, thinking about, and fixing performance, here are some more compiler tips, issue tracking progress, notes on how to profile the compiler, and some more profiling and benchmarking.
For better linking times, experiment with linking with LLD on your OS. Sadly, many of these options aren’t available on macOS (including linking with LLD). For a clue about how to profile on macOS (since it does not have perf), read this, or more simply use macos-profiler, which packages those instructions more or less.
10. Services Story is Still In Progress
Services, APIs, and web frameworks can be a dream in Rust, but the infrastructure story to support them is still being told.
A few crates which I find exciting are hyper, tokio, and actix. I’ve settled on actix for services, but still, I recognize that this is not what will make me pick Rust for a traditional backend service project at this stage.
Bonus: Documentation Is Great
There’s a high chance you already know this, which is why I didn’t count it. The consensus is that Rust has one of the better documentation stories, and again, factoring in the number of years Rust has existed, it’s nothing short of amazing. You have the Rust editions guide, Rust by Example, effective Rust (although in the older book), and the regular documentation you’d find through googling anything Rust-related, AKA “the book”.
If I ignore the fact that Rust is still new, I’d be happy to see a learnyou variant in Rust.
When you’re ready, check out my other Rust related article (part of this series), that covers great Rust libraries that you should try.