Just: A Command Runner (github.com/casey)
482 points by tnorthcutt on Jan 9, 2023 | 199 comments


I've been using this for about six months now and I absolutely love it.

Make never stuck for me - I couldn't quite get it to fit inside my head.

Just has the exact set of features I want.

Here's one example of one of my Justfiles: https://github.com/simonw/sqlite-utils/blob/fc221f9b62ed8624... - documented here: https://sqlite-utils.datasette.io/en/stable/contributing.htm...

I also wrote about using Just with Django in this TIL: https://til.simonwillison.net/django/just-with-django


As someone who uses neither Just nor Make, I'm trying to understand the value proposition.

From what I can tell, Just accomplishes what a set of scripts in a directory could also do, but in a more ergonomic fashion. Since the justfile is a single file, you're not cluttering up your directory. You don't have to look for all *.sh files, but can instead do a "just --list". And using "just target" is probably easier to type than "./target.sh" since the just version has no punctuation.

What are some of the other benefits of Just that makes it superior to "a set of scripts in a directory"?


They are both directed graphs that support dependencies, i.e. if you run `make a` or `just a` and they depend on b and b depends on c, both tools will build c first, then b, then a.

What often happens is that people should use `make` (or `just`), but don't, and instead end up writing a poor replica of `make` as a custom python script for example.

And then that python script perhaps shells out to a subprocess because the tool can't be called directly as a library, so now you have to import and invoke subprocess, and so on. Building and deploying a static website using just or make is 4 lines.
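
For example, a hypothetical static-site justfile where `just deploy` runs the build first (the tools, paths, and host here are made up; the point is the dependency line):

```just
# build the site into ./public
build:
    hugo --minify

# `just deploy` runs build first because of the dependency
deploy: build
    rsync -avz public/ user@host:/var/www/site
```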


This should be in their README. After looking at the repo and some of the comments here, yours was the moment I went from “what exactly is the value prop?” to “that could be useful for me”


"Just accomplishes what a set of scripts in a directory could also do, but in a more ergonomic fashion"

That's exactly the value proposition here. It's a small quality of life improvement, nothing more.


Don’t forget discoverability - writing «make» and whacking tab gives you a nice menu of what can be done with the Makefile. (Just, too.)


But any shell that tab completes make targets should also tab complete executables? Just write ./scripts (or whatever) and whack tab.


I don’t know where to start when the script is in ./Scripts/ or ./tools or ./bin or ./shared/tools. Make has a convention that its file is called Makefile. Just has Justfile. Easy to find. Justfile - here I don’t even have to find it, the «just» executable will search up the directory tree.


Most people know about scripts and what they are for. Most people don't know anything about just or what a "Justfile" is.


Most people with a programming or Unix sysadmin background know about Make and what it is for (i.e. compiling source code, plus building and installing system tools).

Once it's known that 'Just' is an alternative to Make for running project utility scripts, it may gain popularity and become just as well known.


I can't defend Just but everyone on UNIX should know what a Makefile is. Make is a standard. Scripts are just a mess.


I'm not sure `make` is any less of a mess, to be fair. Especially when you're trying to use `make` for general project automation rather than the standard set of build tasks.


Then this is good, but hopefully scope creep doesn't set in.

I'm not a software developer, but a scientist, and eventually something like 5 shell scripts fill a directory. Having them all in one place sounds neater and easier to document and come back to a few months later, without needing to either open up each individual file to read its comments or maintain a separate readme file. Seems good for small projects.


You can also run recipes from any subdirectory as if you were in the root of the project (it searches the parents for the nearest justfile). E.g. 'just build' from /proj is the same as when it is executed from /proj/lib1/module1/, without having to think about what the relative path should be.


That sounds like one path traversal bug away from a bad idea.


Obviously ginger snark. This is also how cargo works.


Because task runners come from the lineage of build systems, I (previously) would have thought that smart management of dependencies was a critical feature, especially skipping outputs whose inputs haven't changed, so that tasks can be run efficiently. Make and rake both do this.

But I guess, enough of the time, what people want is just a nice tidy way of organising tasks to run. Not drastically different to a set of shell scripts, but organised differently, with a few extra features and less boilerplate per task.


It isn't anything revolutionary, just helpful. I think of it like a lightweight framework for writing shell scripts.


Job pipelines - it's easy to say that the build-release step requires the test step. Admittedly this is still possible in a shell script, but it is either messy (folder of helper scripts?) or requires code duplication.


You know shell scripts can have functions right?

And nice stuff like command/options parsing + nice help messages.


Just has pretty ergonomic support for argument parsing [1] and help messages [2]. You're right about functions of course.

1: https://github.com/casey/just#recipe-parameters

2: https://github.com/casey/just#documentation-comments
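
As a tiny sketch of both (the recipe name and command are made up): the comment directly above a recipe becomes its description in `just --list`, and parameters can take defaults:

```just
# serve the docs locally on the given port
serve port="8000":
    python -m http.server {{port}}
```

Running `just serve 9090` overrides the default.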


> Since the justfile is a single file, you're not cluttering up your directory.

You can do the same with any other scripting language, because with most popular languages it's quite easy to handle CLI parameters at that level.

And using a single messy file is usually not a good pattern. But on the other hand, "just" comes with built-in tab completion, which you would not get for any random script out of the box. And if you are only using one-liners or very short scripts, the messiness of a single file does not matter that much. Though how well this will scale over time is a different topic.


> And using a single messy file is usually not a good pattern.

I find my Justfiles rarely ever get large enough to become messy, and then I just factor out the large tasks as scripts and it's all nice and tidy again. Almost all of my projects with a Dockerfile have docker-build and docker-run-local just tasks and those are pretty much always one-liners, and usually that's as complex as it gets. I treat Justfiles more like runnable readmes than a framework for homebrew build systems; if I need a proper build system, I'm probably going to invoke it via a one-liner in a just task.


https://github.com/casey/just#recipe-parameters

Accepting arguments alone seems to be a pretty big value add.


It's nice you don't have to set it up yourself, but it's hardly difficult to do in a shell script.


I think the more reasonable comparison is with Make.


Variables can be set on the command-line: https://www.gnu.org/software/make/manual/make.html#Overridin...

  $ make HOST=dev1.example.org deploy
Environment variables are also available in makefiles.


Uniform interface that abstracts over commands? Nice when you have either bad command / build system or multiple in same project?


Dependencies.


I also love just, but I try to restrict my usage to projects that don't have larger communities or user populations, as getting it installed is nowhere near as universal as make. My favorite feature is mixing scripting languages and shell in the same file; albeit it's got some rope... but it's productive and intuitive. https://github.com/kapilt/aws-sdk-api-changes/blob/master/ju...


There are something like 15 supported platform-specific installation paths for Just.

It’s easy to install on any internet connected server that you have install privileges to.


You don’t need to use all of make’s features (I’ve been using it for something like 30 years and still can’t fathom parts of it). But TBH I just don’t see anything in those Justfiles I couldn’t do with make without needing to do anything special…


I'm sure you can do that stuff with Make. Like I said though, I have tried and failed to get Make to stick for me in the past.

Maybe it's just that my brain has such a strong negative reaction to that PHONY hack that I was never able to get past it.

Maybe Just's tagline should be "it's Make, but you don't have to remember what PHONY means".


All phony means is there’s no output file. And frankly, it works fine without phony, unless you accidentally create that file.

Make’s built around creating files. Phony works around that.
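
A minimal illustration of the workaround (the script name is hypothetical): marking `test` as phony tells make to always run the recipe, even if a file or directory named `test` exists. Note that recipe lines must be tab-indented:

```make
.PHONY: test
test:
	./run-tests.sh
```

Without the `.PHONY` line, `touch test` would make `make test` report "'test' is up to date" and run nothing.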


Most people don't have the advantage/burden of three decades of experience with the tool. Make's only advantage as a task runner I see is being ubiquitous. If that isn't a concern, no reason to use upper Pleistocene Make that isn't even a task runner over a more ergonomic tool. Make is so warty I have yet to see a team using Makefiles not run into any of its idiosyncrasies, but with just? Smooth sailing. Just works. No surprises. I'm a fan.


Getting the list of make targets by default is pretty nifty. As is not having to preface everything with PHONY.

Do any of these make alternatives reinvent the paradigm? No, but they do offer some quality of life improvements I wish were within reach without jumping through hoops.


What's the problem with ".PHONY" ? Just the name ? Would it be alright for you if it had a different name, like "INTERFACE" or "COMMANDS", or "NOFILE" ?

I never thought about .PHONY as a workaround to anything. Just a slightly unnecessary annotation that you may add to the makefile if you want to be pedantically correct.


Invoking scripts with a language env and arguments both stand out to me.

Both are possible in make, but are extremely non-obvious and come with a bunch of caveats and require some heady code blobs at the beginning of the file.


Make is a mess to install on windows. Just just works.


Weird. I’ve always used make on Windows by dropping in a single binary.


Where is that released? Would be good to know.


I've been using Just since at least mid-2018 (that's the oldest commit I can find), and we're using it on almost every single project at $WORK. It's easier to comprehend than make, doesn't have random GNUisms or BSDisms, it's easier to work with than a collection of random 5-10 line scripts, and despite being a bespoke tool, it's intuitive enough to a point where it immediately feels familiar.


One of the listed benefits of just is "Recipes can be listed from the command line."

There is a nice trick that gives makefiles this ability: http://marmelab.com/blog/2016/02/29/auto-documented-makefile...

Adding .PHONY targets and so forth is a bit inelegant, but I can share a makefile with confidence that any Linux/Mac OS/BSD user can use it without needing additional software, and I will never have to worry about make becoming unavailable or no longer maintained. Just my personal opinion.
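
The linked trick boils down to something like this (target names here are placeholders): annotate each target with a trailing `##` comment, then have a `help` target grep and format them:

```make
.PHONY: help build test

help:  ## Show this help
	@grep -E '^[a-zA-Z_-]+:.*?## ' $(MAKEFILE_LIST) | \
		awk 'BEGIN {FS = ":.*?## "}; {printf "%-18s %s\n", $$1, $$2}'

build:  ## Compile the project
	cc *.c -o main

test: build  ## Run the test suite
	./test --all
```

`make help` then prints each annotated target next to its description.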


> but I can share a makefile with confidence that any Linux/Mac OS/BSD user can use it without needing any additional software

I'm sure you're kidding, but in the case that you're not: Make portability is gross.

We have ./configure steps precisely because Make is difficult w.r.t. portability, but even if that wasn't the case and you were just using make as a command executor: you still have enormous warts.

Oh, and yeah, you'd need whatever additional software too.

Be it: headers, linters, formatters, libraries or test suites that you've bundled.

Valgrind is a popular make target, but "Make" does not bring in valgrind (for example).

Honestly, one of the most backwards things the Go community did was adopt "Make"; it's so kludgey even as a pure command executor that I can't really take anyone seriously who argues for its use.

I'm not saying "Just" is a replacement, I don't know what is.

Most of the arguments for using Make boil down to "I enjoy typing `make <something>`" and "you probably have it installed already?".


This. People here are acting like make is installed by default on all Linuxes, but it absolutely is not. And the various BSD makes are very different to GNU make.

Make portability is bad enough on UNIX-like OSes, to say nothing of what a crapshoot it is on Windows. Even if you do have make installed on Windows, there's no guarantee that what it shells to is going to be able to run all the commands people tend to put in there.

Plain old shell scripting is much more portable than make, because a shell is definitely installed on every UNIX-like OS, and there is a very clear baseline of functionality that works in every Bourne/POSIX descendant. And it's quite likely to exist on a modern Windows developer machine too because bash is bundled together with Git.

My theory of writing developer scripts is to prefer the tool that already exists in the language you're developing. Gradle for JVM, npm for Node etc. Otherwise just use shell. Make feels like wrong tool for the job.


> there is a very clear baseline of functionality that works in every Bourne/POSIX descendant

Where is the best place to learn what this is? I'd love to make sure I'm writing portable shell scripts when I do have to.

There's also the issue that "shell scripts" often involve using binaries which you might not even realise are binaries (is "echo" a shell builtin or a binary? I forget) and which may differ from system to system. I've been bitten by grep issues before writing scripts across Ubuntu and OSX.


https://www.shellcheck.net/ and its accompanying command-line tool / LSP integrations are a lifesaver for preventing that kind of thing. It'll warn you if you're doing anything non-portable and even smartly changes its behavior depending on whether your shebang line uses bash or sh (IIRC).


Do you need portable shell scripts though? IMHO, it really depends on the context.

If I was about to ship an open source application that came bundled with some shell scripts, then I agree portability is good, so that I know the script would run for people who might not have Bash installed.

But at ${DAYJOB} I much prefer to let Bash run all my scripts, and I make that explicit via ‘#!/usr/bin/env bash’.

Bash is still evolving and the Bash devs are adding new features that I would miss in pure sh. Case in point: A (somewhat) reasonable way of working with arrays.


ChatGPT has done wonders for my understanding of bash. It presents me with lots of tedious things I would never spend time learning.


This just in, Windows isn't UNIXy, film at 11

Let me know what shell scripting selectively builds only the artifacts whose dependencies have changed (honest question).
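
For what it's worth, a rough sketch of the up-to-date check in plain bash (the function and file names are made up, and `cat` stands in for a real compile command). It uses the shell's `-nt` ("newer than") file test, which is the same mtime comparison make relies on:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Rebuild $1 (the target) from the remaining arguments (its sources)
# only if the target is missing or any source is newer than it.
build_if_stale() {
  local target=$1
  shift
  local src needs_build=0
  if [ ! -e "$target" ]; then
    needs_build=1
  else
    for src in "$@"; do
      if [ "$src" -nt "$target" ]; then
        needs_build=1
        break
      fi
    done
  fi
  if [ "$needs_build" -eq 1 ]; then
    echo "rebuilding $target"
    cat "$@" > "$target"  # stand-in for a real compile command
  else
    echo "$target is up to date"
  fi
}
```

It handles a single target; make's real value is doing this transitively over a whole dependency graph.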


Make works remarkably well. You’re just confusing it with C compilation. None of these complaints have anything to do with make.

Autotools, which generates configure scripts, was built to work around the specific issues associated with old-school C cross-platform compilation (with shared libraries, version differences, and misc libc editions). Ditto valgrind et al.

So, yeah. Make’s fine. You just don’t like C. Which, that’s cool, just unrelated.


I'm not writing C, so I'm not sure what you mean. I mentioned ./configure as a solution to a problem because it was obviously a big enough problem.

To go into issues though:

Make itself executes by default with `sh`, which is wildly different between platforms.

Even if you write portable enough shell; Paths are still incompatible between OS's and distros.

You still must ship your tools, which is a direct contradiction of what was mentioned.

In fact, I just googled it, and this chapter from Managing Projects with GNU Make talks about the issues in making Make portable (GNU Make, as opposed to BSD Make, which is different enough to have broken things for me!):

https://www.oreilly.com/library/view/managing-projects-with/...


./configure works around C cross-compilation issues, where different platforms have different files with the same names.

Fixing that within Make would require it to be platform aware. Not just “is this Linux” but “which flavor of Linux is this and what version of that flavor”. It’s also highly specific to C.

Perhaps your other complaints here are valid, but they’re issues I’ve never run into myself, in my 20 odd years with it.

EDIT: Would you blame Just for not having C cross-compilation capability built in? Would you blame Just if someone automated the creation of Just command files to work around a particularly nasty workflow?


I don't need to google to find fault, but I figured the make book might have something to say about portability and in better words than I can construct at 3am.

Look, I'm not taking your tools away, there is no need to be defensive.

Make isn't going anywhere, but it is a bad tool, the syntax is completely arcane and it's designed for things few people actually need these days.

The most common case I've seen of modern Make usage is `make docker` and for Golang, where it doesn't get an opportunity to stretch its legs as a dependency manager at all -- making it a glorified task runner.

The portability aspect is all I mentioned, because that was all that was in question.

But if you really want me to get into it, I can be quite cruel.

Just because you spent 20 years learning or using something does not mean it is a good tool. I'm glad it works for you, truly, but it is an abomination, and people only continue to use it out of sunk-cost fallacy or by telling themselves that "most people have it installed already".


You're confusing make with GNU Make.

One of the reasons autotools is so complicated is that it is used to build GNU Make, so it has to work with whatever crummy make your Unix vendor shipped.


I’m...talking about using make for its intended purpose, which is selectively building artifacts whose dependencies have changed, e.g. compile executable A if source code files X Y or Z have changed and leave it alone if not.

What exactly are you talking about? Are you upset that make isn't a cross-platform package management system?

You don't know what alternative there is to Make and then in the next breath you say the only arguments for it are personal preference?


Oh excellent, then better (and more portable!) tools are available:

http://pants.build

https://ninja-build.org

https://buck.build

and, if you hate yourself: https://bazel.build

> What exactly are you talking about? Are you upset that make isn't a cross-platform package management system?

Ok, so you were serious with your claims of portability, that is concerning.

The majority of times I've seen Make used it's primarily been as a task runner.

For example:

    TAG=some-service

    SVC=website.com/$(TAG)
    BUILDER=golang:1.19-alpine
    export REVISION_ID?=unknown
    export BUILD_DATE?=unknown

    RUN=docker run --rm \
     -v $(CURDIR):/opt/go/src/$(SVC) \
     -w /opt/go/src/$(SVC) \
     -e GO111MODULE=on

    build:
    ifeq ($(OS),Windows_NT)
    # Workaround on Windows for https://github.com/golang/dep/issues/1407
     $(RUN) $(BUILDER) rm -rf vendor vendor.orig
     $(RUN) $(BUILDER) rm -rf vendor vendor.orig
    endif
     $(RUN) -e CGO_ENABLED=0 -e GOOS=linux $(BUILDER) \
      go build -o service -ldflags "-s -X main.revisionID=$(REVISION_ID) -X main.buildDate=$(BUILD_DATE)" \
      ./cmd/some-service/...

    # $(RUN) $(BUILDER) rhash --sha256 service -o service.sha256
     docker build --tag="$(TAG):$(REVISION_ID)" --tag="$(TAG):latest" .

    run:
     docker-compose up

    serve:
     go run ./cmd/some-service/...

    dev:
     ulimit -n 1000 #increase the file watch limit, might required on MacOS
     reflex -s -r '\.go$$' make serve
^ the only part of Make this is really "using" is the name, and it's so much worse to actually debug than a bash script; it even has workarounds for various platforms inside it.


Are we going to dive into cmake now (sorry)

Edit: We're actually using this - and I remember there was a modern version with colors and a cool short form. But I can't find it - anybody else got nice examples?


The question isn't if make is portable, the question is: is make any less portable than just?


> I'm sure you're kidding, but in the case that you're not

https://news.ycombinator.com/newsguidelines.html


Probably you're referring to:

> Be kind. Don't be snarky.

But, I'm not being snarky or rude... I'm just not 100% sure he is being sarcastic.

Given that Make is not portable and does not include _any_ tools that it is commonly used to execute, I consider it hugely sarcastic.

However, other lines were true and it sounds like it was written seriously, so, can't tell if sarcasm or not.


Being fair, your rant is about autotools and C compilation, not make, so your own post is equally suspect for being satire.


There is more to my rant than just autotools and C compilation though, and it applies equally.

I went into more detail for you here: https://news.ycombinator.com/item?id=34319254

(btw, most of times I've used "Make" has been times where the pain has been done for me or I'm using Python, Go, Docker or in one case: Rust. I have never touched Autotools or C-compilation myself with Make, my arguments have nothing to do with C compilation at all.)


If you’re complaining about ./configure or valgrind, you’re soundly in the world of autotools and C compilation.

Not make.


These are just examples of poor portability (thus the workarounds) and of tools that are not installed when you run `make`, as the GP suggested. The claim was that you could write a makefile and kinda not have to worry about anything else.

Not the case.

I could just as easily talk about the pathing issues between MacOS and Linux, or the multiplicity of issues surrounding `sed` (GNU) and `sed` (BSD), or that the `black` python formatter won't be installed by default.

It is not as easy as the author claimed, you need whatever dependencies you call on, obviously.


Unlike OP, I get the impression you’re referring to the good faith clause. I think they did that admirably, balancing a stated assumption that the intent was well meaning and knowledgeable jest with a friendly and informative rebuttal clearly intended to be helpful if their assumption was flawed.

But you could be referencing any number of other things in the guidelines. It would be incredibly helpful if people on HN who engage in community moderation actually explain their reasoning. Here I’ll help:

> Comments should get more thoughtful and substantive, not less, as a topic gets more divisive.

If you’re bothered enough by someone’s comments to link the guidelines, sharing a thought about which part of them is pertinent would be more thoughtful and would facilitate a healthier discussion.

> Please don't post shallow dismissals […] A good critical comment teaches us something.

This is one I’m working on too. If it’s worth challenging something, it’s probably worth challenging it with some proverbial meat. It might seem obviously wrong to you, or to me, but it doesn’t always seem that obvious to everyone else.

I’m certain you meant well with this, but I also think OP would benefit from clarification. I know I would too, and I expect the discussion would benefit as well.


> I can share a makefile with confidence that any Linux/Mac OS/BSD user can use it without needing additional software

Makefile portability can be tricky. Especially if you try to do something fancy with the makefiles.

GNU Make has features and syntax that the other Makes don’t. Likewise there are features and syntax that some Make programs have that GNU Make doesn’t.


EDIT: Not really a serious suggestion

Make is not _that_ portable. If you're using high level languages and only need a task runner to kick off your compiler, watch rules or similar and need portability, you could write an executable bash script with functions that serve as your commands;

  #!/bin/bash
  set -e

  function build() {
    echo "Your build steps"
  }

  function clean() {
    echo "Your clean steps"
  }

  eval "$1" "${@:2}"
Which you can run with

  $ ./task clean
  $ ./task build
You can also write these kinds of task scripts in JavaScript or Python, which might make Windows compatibility easier to manage.


All of my makefiles have SHELL = /bin/bash as the first line.

Let me know what a bash script would look like that does dependency checking for each build artifact


Yeah, that's fair. I only use make as a task runner for high-level languages where the compilers take care of those aspects of compilation. My makefile commands are never much more complicated than "cargo build" or adding compiler flags to my "go build" command.


I actually do exactly this in some of my personal projects. You don't need the eval: store $1 in a variable and shift $@. And you might also want to check that $1 is a function defined in the script, lest the task runner execute some other arbitrary command.
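
A sketch along those lines (the build/clean bodies are placeholders): store the task name, shift it off, and use `declare -F` to refuse anything that isn't a function defined in the script:

```shell
#!/usr/bin/env bash
set -euo pipefail

build() { echo "building"; }
clean() { echo "cleaning"; }

run_task() {
  local task=${1:-}
  shift || true
  # declare -F only succeeds for functions defined in this script,
  # so arbitrary commands like `./task rm` are rejected.
  if declare -F "$task" > /dev/null; then
    "$task" "$@"
  else
    echo "unknown task: $task" >&2
    return 1
  fi
}

# In a real ./task script, the last line would be: run_task "$@"
```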


Nice! Any chance you could share a GitHub gist or similar?


Why not write two files ./clean.sh and ./build.sh instead?


No reason why you couldn't do that too - it's to taste. I generally like to have fewer files in the top level of my repos to keep them approachable. You could also have a folder that contains scripts like: `./scripts/build`


It's a little more convenient to share code / variables between tasks in a single script


Actually, I did a hack that does that too (this one is nicer, though). I can’t find mine offhand (too many makefiles), but it is also a Makefile target that runs grep -B1 on the Makefile itself and spits out each target’s name and the comment I usually add to it… And my Makefile template uses .env files too… I think I cover most of the list in my day-to-day use.


That works well if you have targets with simple comments that are directly above the target and don't extend beyond a single line. It may be important to provide several lines of help. This gist has other solutions as well:

https://gist.github.com/prwhite/8168133


There's also a whole GH gist on the topic that a few of us have golf'd on.

https://gist.github.com/prwhite/8168133

The linked blog's solution is also mentioned in the gist comments.

https://gist.github.com/prwhite/8168133?permalink_comment_id...


I like similar tricks. Another trick I like to use is being able to do a `make showconfig` and have it print the list of variables & their values that I care about. You can see that here. You can also see my Make+BASH solution for documenting targets.

https://github.com/lpsantil/oop0/blob/master/Makefile#L87-L1...


> ... any Linux/Mac OS/BSD user can use it without needing additional software ...

In my experience, `make` also needs to be installed. (On ubuntu it's part of `build-essential`).

I guess more precisely, `just` might not be available in all package managers?


One nice thing fish does is that it can enumerate your make commands via tab completion


As can bash and zsh!


If you're going to use it like Make without the build system parts, why not just have a directory of tiny scripts? More portable than either, the only boilerplate is a shebang line, and you can use static analysis tools (shellcheck), formatters (shfmt) and the like.

Take the example they've screenshotted in the readme[1]:

  alias b := build
  
  host := `uname -a`
  
  # build main
  build:
      cc *.c -o main
  
  # test everything
  test-all: build
      ./test --all
  
  # run a specific test
  test TEST: build
      ./test --test {{TEST}}
The equivalent would just be something like the following:

build.bash:

  #!/usr/bin/env bash
  set -o errexit
  cc *.c -o main
test-all.bash:

  #!/usr/bin/env bash
  set -o errexit
  "$(dirname "${BASH_SOURCE[0]}")/build.bash"
  ./test --all
test.bash:

  #!/usr/bin/env bash
  set -o errexit
  "$(dirname "${BASH_SOURCE[0]}")/build.bash"
  ./test --test "$@"
If you want to get really fancy you could make a common.bash with safety pragmas and things like the host string.

[1]: https://github.com/casey/just/blob/master/examples/screensho...


Just runs fine on windows, too, which most people seem to miss. Bash is not always available, make even less so.


As a matter of practice, I always install git-bash or MinGW or Cygwin on my Windows boxes. There's only one instance I've found where these solutions were not sufficient for my use cases, and it was an admittedly horrible corner case where a customer's network forced me to not "do the right thing(tm)".


Having tried all 3, I’ll warmly recommend WSL2 instead, which is basically linux. But many people do work on windows, even write software for it. While I’m in your Linux based corner, some people aren’t, and it’s nice to share a task runner with them.

Last time I checked, git bash doesn’t ship make, either.


Huge fan of just; I add a Justfile to pretty much every new repo I create regardless of language or stack.

My personal favorite feature is the ability to load environment variables from a `.env` file and set them for all commands run. Just have to add this to the top of your `Justfile` to make it happen:

`set dotenv-load := true`
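
A hypothetical sketch (the variable and command are made up): with a `.env` in the project root containing `DATABASE_URL=postgres://localhost/dev`, every recipe sees it as an ordinary environment variable:

```just
set dotenv-load := true

# DATABASE_URL comes from the .env file in the project root
db-shell:
    psql "$DATABASE_URL"
```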


> the ability to load environment variables from a `.env` file

make can do that too. Put

  include .env
  export
at the top of the Makefile. Works for me on GNU Make 3.81 on MacOS.


Thanks, I was just looking for a way to do this today!

Looks like there are some caveats and a slightly different approach: https://unix.stackexchange.com/a/235254


As someone who _extensively_ uses Makefiles everywhere to speed up things (why bother remembering how to start a server in a particular language when “make serve” will work anywhere), I almost understand why this exists, but then I remember that make is available everywhere and has tab auto completion and I have to wonder why…


Same! The fact that Make is pre-installed everywhere and you can create a consistent interface to tasks across projects is a big win.

I wrote a post about that here: https://rosszurowski.com/log/2022/makefiles


> Make is pre-installed everywhere

No, it's not, and Just supports Windows, where that is particularly true.


Make also has a windows version, it just needs to be installed, just like Just.


Okay, but a ton of people are saying “make comes preinstalled on every system” as the reason to choose it over something like Just.


Because, statistically, it is true. Pretty much every POSIX machine under the sun has make available.


GP and GGP were talking about Windows, which is a significant portion of all machines, so I don’t see how you can say that’s statistically true?


Because make is dog shit if you need to intertwine make with bash. You have to remember various escaping rules (double $$ signs or not, depending on whether you want to refer to a make variable or interpolate a bash variable), tabs instead of spaces that new devs often (quite rightly) get tripped up by, and various other idiosyncrasies you can waste hours on.
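
A small illustration of the escaping rule in question (variable names are made up): a single `$` is expanded by make before the recipe runs, while `$$` passes a literal `$` through to the shell:

```make
NAME = world

greet:
	@echo "make says $(NAME)"
	@for w in a b c; do echo "shell says $$w"; done
```

Writing `$w` instead of `$$w` in that loop would silently expand to an empty make variable followed by a literal `w`, which is exactly the kind of thing you can waste hours on.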


Yup. All anyone needs is to learn Make properly once, which can be done from following a simple guide like https://medium.com/stack-me-up/using-makefiles-the-right-way...


sure, let's stop looking for better ways to do things because this one 30-year old way is capable of being twisted into what you need, no matter what it is.

I get the reverence around make but don't blind yourself to potentially better ways of doing things.

I would even go so far as to say that intentionally avoiding new ways to approach these things will necessarily blind you to better ways once they do come along, and I wonder how many good tools have died because "[x] does all I need." (replace "x" with whatever you like.)


Is there really any propriety around this tool when something as abysmal as autotools has to exist to generate the makefiles/etc.?


This simple-ish guide made me realise that make is not as suited to my needs as a mostly-web developer, and that just would work better. npm scripts are doing just fine for me now, and for the routines I'd map to npm commands anyway, I don't know how much benefit it would actually bring me.


Pretty much my reaction too. Seems like Make, reinvented.


Which is a good thing, as it supports a use case that Make wasn't made for.


What use case is that? I looked over the README, including the parts where it claims to do things Make can't (I don't agree with much of it). I remain unconvinced that it supports use cases Make can't.


It's not that Make can't work as a task launcher, it's that it's not designed as such.

The simple fact that a build target isn't executed if the build file already exists, requiring a workaround to run it, points to the difference in purpose and philosophy. Those little details, together with the more modern design, accumulate to make Just a tool that is simpler to use for that purpose.
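The workaround being referred to is worth seeing concretely. A minimal sketch (assumes GNU make) of how a file that happens to share a target's name shadows the task until you declare it `.PHONY`:

```shell
# A file named "test" makes the "test" target a no-op until it's .PHONY.
cd "$(mktemp -d)"
printf 'test:\n\t@echo running tests\n' > Makefile
touch test
make test    # make: 'test' is up to date.  (tests never run)
printf '.PHONY: test\ntest:\n\t@echo running tests\n' > Makefile
make test    # running tests
```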


and if make is too much for you, if you just need to run some tasks and have no need for managing artifacts, there are always bash functions

https://github.com/adriancooney/Taskfile
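The pattern in that repo is tiny enough to sketch here (task names below are made up; the real template uses a `default` task):

```shell
#!/usr/bin/env bash
# Each function is a task; the last line dispatches to whichever task
# was named on the command line, falling back to "help".
function build  { echo "building..."; }
function deploy { build; echo "deploying..."; }
function help   { echo "tasks: build deploy"; }

"${@:-help}"
```

Run as `./Taskfile deploy` and you get the dependency chain for free, since tasks call each other as ordinary functions.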


Just is built in Rust rather than C. Rust is generally nicer to work with and maintain than C, so that's a huge win development-wise. Feature-wise, I only see a couple of things in Just right now that differentiate it from Make.


I don’t understand how Just being written in rust as opposed to C has anything to do with this?


I'm not sure what you're confused about, maybe you missed the point of this being shared on ycombinator news? Just being written in Rust versus C is part of an open source movement to modernize these old tools and the significance of using a modern language versus an ancient one is better support and features going forward. The reason this tool is currently top voted on ycombinator news is purely because it's written in Rust and of interest to programmers who care about their open source tooling.


Implementation language is not very relevant. If you have to look at the source code of your build tool, you've already lost.


This is a weird opinion to me, but I reckon you're an ops person who doesn't code a lot so I respect not wanting to have to look at code if you're an ops/admin type. Most programmers would grasp the relevance of modernizing their tools and the maintainability and feature gains from it. The open source movement in general is based entirely upon being able to look at the source code of your tools and modify and update them.


I hate to be the bearer of bad news, but your assumption is incorrect. I possess deep expertise in a number of areas; code flows with ease. Reading code is its own skill, too. I'm not easily intimidated by any programming language or problem. On the technical front I've done everything from development, SRE, ML/AI, founding a company, to being a leader and executive at small and very large companies... it's all fascinating in its way. But the most fascinating things I've found in the universe are people.

When something doesn't work as expected, I dive in as deep as the rabbit hole goes to get across the line.

Curious what led you to arrive at "Aha! They must be an ops person", will you humor me with an explanation?


Would you throw out sqlite, written in C for a Rust clone?

It is possible to reimplement a relatively easy tool like make, and having learnt from its historical shortcomings it can be better irrespective of the implementation language. But that’s a different point.


Wait until you find out that Rust uses llvm.


Why not just have a shell script that you run with ./serve? Even simpler.


This is actually very different than `make`, since it will always run the tasks even if their inputs haven't changed[0].

It's a bit bizarre to me that their example involves building code, since that application generally benefits from this "idiosyncratic" behavior of `make`.

You probably would want the behavior of `make` for the `test` command too, to avoid running potentially time-consuming tests unnecessarily. Bazel caches test results to avoid this, for example.

[0] https://github.com/casey/just#what-are-the-idiosyncrasies-of...
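The behavior in question is easy to see in miniature (a sketch assuming GNU make; file names are arbitrary):

```shell
# make compares timestamps: a target whose output is newer than its
# inputs is skipped, where a pure command runner re-runs it every time.
cd "$(mktemp -d)"
printf 'out.txt: in.txt\n\t@cp in.txt out.txt\n\t@echo built\n' > Makefile
echo data > in.txt
make out.txt    # built
make out.txt    # make: 'out.txt' is up to date.
```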


Caching test results sounds like a horrible idea, since the test could rely on external state or dependencies that weren't checked. For example, if you're testing something that opens a socket.


People often come up with this absurd argument. If all your tests are opening sockets, it must be a nightmare as they may fail randomly every time you run them.

Keep tests that open sockets and otherwise interact with resources not controlled by the test environment itself separate; most people call these integration tests. Only those cannot be cached. It's a trade-off: you get a lot of value from those tests, but exactly because they can't be cached, you want to keep them to a minimum.


Most applications I've ever worked on talk to the internet or to other servers on the network, you can't close your eyes and pretend everything runs on one machine. You eventually have to test all of your software.

Certainly, you can optimize some of your tests by caching them! But they should be fast anyway, because your software should be fast. The software I currently work on has massive test suites containing around 50k tests that run in about 3 minutes per suite on my PC. It's time consuming, I suppose, but caching would mean that I'm not actually flushing out latent bugs. Instead, we run all of the tests every time and do things like randomize the execution order, so we find bugs before customers do.

At the point where your software interacts with threads (or the kernel!) you've already brought in non-determinism and trying to act like your tests are really deterministic is naive.


I've written so many tests in my life I think I know what I am talking about. It's not naive, it's what professionals do.

Learn to separate tests that use outside resources and behave nondeterministically from the ones that are never flaky, and make sure to cache the latter. If you don't do that, you're still in the dark ages of testing, when tests are a pain to touch and fail all the time for no reason.


Many folks believe that builds and tests should ideally be "hermetic", and not depend on external state. This can make tests more robust, and facilitates tasks such as bisecting to identify the cause of regressions.


I'm a big fan of fast, self-contained hermetic tests, but at the end of the day you should be testing the actual behavior of your software. This means all your dependencies should be versioned so you can bisect actual behavior, not mocked test behavior.

I've also rarely ever touched software that is actually free of nondeterminism, so I am deeply skeptical of caching anything but the simplest test case. And those are fast.


You should avoid flaky tests. They slow down presubmits and make it hard to do bisections.


Sounds like you're testing wrong then ;)


How do you propose testing software that is meant to use sockets? Just don't?

Mock the entire socket API, so you're not actually testing your software?

At the end of the day you need to run some tests with external interactions, and you shouldn't cache those.

Also keep in mind any time you perform syscalls, you have external interactions. Their behavior could change and you want to know if they have intermittent failure scenarios. Using threads? You're nondeterministic, you should keep running tests.


> For example, if you're testing something that opens a socket.

Then the destination of that socket should be managed by the test too.

For example: https://pkg.go.dev/net/http/httptest#Server


I do love me some make, and have since forever. but we do have to look for a successor as cleaning out all of make's historical baggage would be a disservice to too many (really any is too many).

If you are not sure of what I am talking about, try typing

`make -p`

Those builtin rules can be disabled if you are not building ancient artifacts, but the fact that we have to disable them is why one of these work-alikes is going to win some day.


I dig the general idea, but question the value add over a directory of scripts that follow sane conventions (i.e. `script/test`, `script/build`, etc). Is the main thing that you can do `just -l` to see available commands? I have never really reached for `make` when I've had a choice, as I've done mostly Ruby, JS, or Java, where you have more sane, humane tools (i.e. Rake, Yarn, Maven, though that one is never fun).

My general approach is every repo should have something that follows https://github.com/github/scripts-to-rule-them-all, written in sh (maybe bash, it's 2023), linted with shellcheck. When you need something fancy, Rake is great, or grab some nice bash command-line helper and source it from all your scripts. Is a command listing really worth another dependency over what you get from `ls script` or `cat script/README`?


I've migrated from a ./scripts directory to a justfile and greatly appreciate the concision and modularity. I understand and have tried source'ing common bash files, but have never managed to make it feel quite right. `just` has bought my current project a lot of time until we replace it with some behemoth like Bazel.


If you don't have enough yaml in your life, https://taskfile.dev/ is excellent.


I use https://github.com/davetron5000/gli for this, since I work in ruby. Adding something like just or gli to your project is a huge win. Every dev can just `just update_db` to refresh their dev db, `just update_secrets` to update dev secrets. Whatever. So much better than putting snippets in a wiki or whatever.

I like gli because it gives you subcommands, like `gli database refresh` etc.


Another simple tool similar to this is makesure[1]. It’s written in shell so the idea is you include makesure itself in your repo, which avoids needing to install another tool to run commands on your project.

It’s very simple so isn’t good for everything, but works well as a simple command runner.

[1] https://github.com/xonixx/makesure


Looks cool! Including the script in the project directory is the way, and I created makesh[1] with that in mind. Since it's a submodule, it can be easily pulled around with a repo and updated.

Glad to see people are finally going back to shell scripts, since they are very ergonomic and, with a little portability in mind, they can be cross-platform too.

[1] https://github.com/Baldomo/makesh


Competition: https://pydoit.org/

I've been using it for a while and it's pretty flexible:

  - dependencies
  - parallelism
  - programmatically generated tasks (since the config file is just a Python file)
  - "udf" to specify when a task is up-to-date and can be skipped


It seems to me like the selling point of this tool is that it is language agnostic - you don't have to write all your jobs in Python.

There are a lot of language specific tools like this. I love Ruby's rake.


pydoit is language agnostic (apart from using python for your configuration, but you'll need to use something)


> It seems to me like the selling point of this tool is that it is language agnostic

How is having to use Python different from having to use the Justfile language?


I like the idea of trying to rethink the Make interface, but it just seems like most projects could actually benefit from build targets (conditional execution based on file existence/age), even if it's not the first thing you need to automate. I don't want to give that up because PHONY is confusing.


Nah, just does not support automatic parallelization of the dependency graph. This is a deal breaker for me. Also I'm not a huge fan of yet another built-in language that tries to mimic a full-featured programming language.


I assume you write everything in assembly then


I have been using Just for 3 months and it is such a fantastic tool. I never looked back at make. I don't even look at npm scripts anymore. I love Just.


I don't see anything about including other Justfiles or just generally other file. Might be nice to split up huge projects.

The alternative Taskfile can do that https://taskfile.dev/usage/#including-other-taskfiles


That looks cool but I fundamentally hate yaml so it’s a no go.

I would rather lose my hair screwing with a makefile like thing than add more yaml to my day to day; I currently say “I hate yaml” at least 4 times a day.

If it’s not too much trouble and I don’t need comments I just write json as yaml.


What makes you hate YAML so much? What alternative would you suggest for a tool like this?


+1. YAML is the bane of my existence.


It seems to me much nicer choice is https://taskfile.dev/

Just doesn't seem to even support caching task results (by declaring inputs/outputs)?


Tools like these are handy, for sure. Problem is: If you're collaborating in a project, then you're requiring a new dependency to be installed in everyone's machine.

Stuff like `make` is already there, always. In my case, I assume everyone has Python 3 and include a `task` script using this template, which does something similar: https://gist.github.com/sirikon/d4327b6cc3de5cc244dbe5529d8f...


I feel like this a pretty non-issue so long as you document both that the dependency is required, and how to install it. Just is much easier to install than Python!


Is it? Python is basically everywhere, and if it's not already installed, your package manager has it ready. `apt install python3`, done.

Also, with Python you have... Python, all the language's power to extend your scripts as needed. `Just` works with a shell like bash, and I pretty much prefer python for scripting. Bash scripts get complicated very quickly.


It does, but you probably don't want to use the Python from your package manager. That tends to cause all sorts of problems when you need a newer version or different versions for different projects down the line. This is especially the case if you intend to use any packages in your python code. Managing dependencies in python gets complex quickly.

Your package manager likely also has just: https://github.com/casey/just#packages.


These issues are part of my daily work. I’ve started converting the make targets/commands to shell scripts because the hacks and ugliness that you have to do to provide make with arguments isn’t worth it. It seems like the more advanced shell features you want to use in a makefile, the more make gets in your way.

Not that I fault it. It’s supposed to be for making programs hence the name. We’re abusing it by turning into a script collection.


My personal favorite for small projects is invoke: https://www.pyinvoke.org/. I prefer it with python because it is just another lean dependency I can directly install along with other dependencies. Works pretty well unless you wanna chain and run long shell commands.


My biggest pet peeve with pyinvoke is that you can’t pass arbitrary arguments through to the underlying task. For something like invoking pytest you need to replicate the arguments you use in the task definition.


Yup! I totally get that as I ran into it the first time I used pyinvoke. I got around this limitation by using pyinvoke to just specify the tasks, their arguments and what other tasks they rely on, and let the tasks that share arguments delegate their core to the common function. It is an inconvenience, so I was planning on contributing this missing feature upstream to the library.


Another huge fan of just, here. I love that it's installable with a one-liner and I've added this as an option to my dotfile setup.

I'm done trying to cast spells at Make


Make is firmly in a category of 'better the devil you know' for me. Not that I deeply know it, I use just a subset of it anyway and for that it's fine.

If there was a short Make: The Good Parts O'Reilly book, then I would probably read it.


Just is designed for running commands, make is designed for tracking dependencies. I use them both in the same project calling each other if need be: there isn’t a competition here.


If you already use make, I don't see a reason to also use just, since you can do what just does with make.


That's often pragmatic if you have 3-4 commands. But when you have more, eventually you'll end up with a cluttered Makefile and a poor command runner. `just -l` shows all the tasks available. Just commands take CLI arguments instead of forcing you to use environment variables, which are susceptible to typos, etc. I can write a 3-line script in normal bash syntax instead of slightly different make syntax, and then easily move it to a separate shell script when it gets bigger.
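For illustration, a parameterized recipe might look like this in a justfile (the recipe and parameter names here are made up):

```
serve port="8000":
    python -m http.server {{port}}
```

Invoked as `just serve` or `just serve 3000`, with no environment-variable plumbing in between.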


I use github.com/TekWizely/run for this use case. It's a robust way to build one-off command-line "APIs" for managing projects and documenting processes. I deploy it in production environments along with application-specific Runfiles for devops.

It makes life a lot easier when onboarding new folks, or for remembering how to do something months later.

I also use make, but limit it to just building software. All of my repos have a Runfile and a Makefile.


I use make extensively for a company wide build system across most/all products. Normalized everything.

Someone at the company introduced Just in their projects. I’ve used it quite a bit now, it’s great EXCEPT that you cannot include other Justfiles. So abstraction is impossible. If I want to implement something like a push feature, that has to be implemented everywhere, with no way to centrally update all projects.


Look at task instead. It's the superior project IMHO.

https://taskfile.dev/


I invite you take a look at Run, a similar tool that I maintain:

* https://github.com/TekWizely/run

Support for including other Runfiles was recently introduced, with support for globbing and the ability to indicate if an error should be generated if no files are found.


I've come across this a few times, it seems to cope very well with all the things I'm abusing Make to do. I'm hesitant to add niche tooling requirements to my projects though.

Can anyone comment with their experience using this? (in particular the social ramifications)


Well, let me put it this way: I never use a tool that isn’t bundled with the OS or the runtime I’m developing for, because I don’t want my environment to be a special snowflake (and I develop stuff on Linux and macOS, with essentially the same CLI tools).


Using the Nix package manager solves these problems.

With nix, all sorts of niche packages are available, so installing just isn't difficult. (& packages nix installed only go in /nix/store, so the filesystem isn't messed up).

If you want the same programs (the same version of programs, even) across different Linux distributions, and macOS, nix is the best tool for that.

If you're worried about your environment being an unreproducible snowflake, nix's main advertised feature is reproducing package installation.

Although, yeah, Nix suffers the same cost of "1 more niche thing to install".


Direnv is the real QOL improvement. You just add a .envrc file, write a flake.nix file listing your dependencies and anytime you enter a project directory you have every project dependency/tooling instantly available, but they otherwise don’t bother you at all.
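As a sketch, the wiring is small (this assumes direnv with the nix-direnv extension installed):

```
# .envrc
use flake
```

with a flake.nix that declares a dev shell, e.g. `devShells.<system>.default = pkgs.mkShell { packages = [ pkgs.just ]; };`. Run `direnv allow` once, and the tools appear whenever you enter the directory.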


Are you writing everything in sh?


It’s funny how things evolve. This is clearly something similar to make.

All sorts of tools use xxxxfile for their config. But the tool "make" is actually short for "make file". That's what make does: it makes files (and has modified-date dependencies).


Well, just doesn't make files. But it is quite clearly a revamped make.


A tool that I created in the same category:

RUN : A Task runner that helps you easily manage and invoke small scripts and wrappers

* https://github.com/TekWizely/run

Do you find yourself using tools like make to manage non-build-related scripts?

Build tools are great, but they are not optimized for general script management.

Run aims to be better at managing small scripts and wrappers, while incorporating a familiar make-like syntax.

Some features:

* Auto generates command list

* Auto generates help for commands

* Supports defining command-line arguments, which are presented in help text

* Can be composed via includes

* Each command can define its own shell (or even python, etc)

If you're interested in task runners, I hope you'll take a look!


What would it take to unseat Make? Make is installed everywhere, so it is really hard for me to justify leaning on new tooling. Make is ok, but it has enough deficiencies that I longingly look at tools like this.


I definitely don't think Just will ever unseat Make. Just doesn't have file-based dependencies, so it's not a build system, just a command runner.

As far as unseating Make as a command runner, I think that might just take Just being available in more places, since one of the main advantages of Make that many users cite is that it's available everywhere. Just is already available in a lot of package repos, but not all of them. Finally packaging Just for Debian[0] would help a lot.

[0] https://github.com/casey/just/issues/429


This looks great, and it's the first time I've come across it. One thing I notice straight away is the incredible amount and quality of the documentation!


The just dev has questionable privacy views; he used to copy all Justfiles on GitHub:

- https://github.com/casey/just/issues/1163

- https://github.com/casey/just/issues/503


The issue appears to be that if you make a public Github repo, his bot will crawl it and add it to a repository of Justfiles for him to check against? That really isn't a privacy view in my mind. As soon as you add something to a public Github repo it is instantly indexed and made available on a bunch of (shady and legit) Github mirrors.


I'm still waiting on a tool that installs an activate.sh/activate.bat file that will bootstrap the tool when it's not installed, and then load up the environmental variables so that clean/build/deploy/test becomes active.

Manually installing stuff sucks. It should be cross platform, automated and local by default whenever possible.


There is also cargo make which, at least in rust land, serves a similar purpose. https://github.com/sagiegurari/cargo-make

I ended up using it over just because it felt easier to use cross-platform, and TOML seemed like the right choice.


This looks incredibly useful. Random question since I couldn't find it in the README – can i define a recipe (or recipe list) in a common location in my home folder and then use them from anywhere? While having a per project config is the intended use case, I'd also like a bunch of global ones.


The just manual describes some ways to do this. https://just.systems/man/en/chapter_65.html

The intended way seems to be to invoke just with explicit --justfile and --workdir arguments, and wrap that invocation with a shell alias/function.



Not exactly relevant. You'd still need a justfile in any location you are running the command from even after this change.


Make is acceptable for running arbitrary bunches of commands. For anything grander, I just write shell scripts that call shell scripts. Once you know shell well, it's very easy to throw together a simple build system. For a complex build system I'd go find a complex build tool.


The manual appears to be busted. The installation docs at https://just.systems/man/en/chapter_2.html#installation just (ha) shows an empty page.


I believe this is just due to how they're parsing the README.md on https://github.com/casey/just. (Though this should probably still be updated)

The installation section has no direct body, but there are subheaders with the actual instructions.

Ex. https://just.systems/man/en/chapter_3.html


Try going to the subsections inside of Installation.

https://just.systems/man/en/chapter_4.html


Ah thank you


Hey, this is really cool! Coincidentally I've been doing something really similar with `alias j='make -f ~/Makefile'` for a single giant makefile across multiple projects; I also combine it with fzf to allow fuzzy-searching for target names.


For rust cargo-make is also pretty good and got lots of cool rust specific features https://github.com/sagiegurari/cargo-make (just is written in rust)


How is this different to a shell script / command alias?


You can easily create a separate Justfile for every one of your projects. Shell aliases are generally global, not project-specific.


This ships with your project and is project specific.

Sure you could accomplish something similar with direnv to maintain the project-level scope. But Just makes it more explicit.


Right.

I think looking at the features, you see "minor quality of life UX improvements", and it's not mind-blowing. (just commands implicitly run with the workdir of the justfile, just --choose uses fzf, soft tabs not required, etc.).

The effect does end up being a nicer tool to use.


Has anyone got a just recipe for looping through commands using a list of strings as inputs?


That README is quite the novel. I guess I prefer overly verbose documentation to under though.


My dream is that git will add basic build capabilities


That would go against the Unix philosophy.


Great project, 40 years late though.


Smells like alias just=make



