Code signing installed executables


(Marco Masser) #1

First of all: I’m one of the developers of Little Snitch, a user-friendly firewall for macOS that allows creating rules on a per-application (or per-executable) basis.

We occasionally have users who run into issues because executables installed via Homebrew do not have a code signature. This is relevant to Little Snitch because rules in Little Snitch can be configured to require an executable to have a valid code signature. If an executable doesn’t, Little Snitch will not allow its connections and will instead report the situation to the user.

We can improve the UI and UX on our side, but ultimately, the underlying issue is that executables without a code signature are a security risk because they can be modified or replaced without the user knowing about it. This risk is even greater if these executables are in user-writable locations.

I thought I’d run an idea by the community to see if there’s a desire to tackle this problem on Homebrew’s side. After all, validating an executable’s integrity is a benefit for every user of Homebrew, not just those who use Little Snitch.

One more disclaimer: I’m a very light user of Homebrew and have no experience whatsoever with creating my own packages. It is entirely possible that I got some of the naming wrong and I would very much appreciate any feedback on these things.

What follows is the text for a GitHub issue I’m planning on opening, but I wanted to run it by the community first. If you have any comments, I would be happy to hear them.


First of all: I’m one of the developers of Little Snitch, which will be relevant further below.

Description

Executables and libraries installed via Homebrew from source are not code signed. This makes sense since the code is compiled on the user’s machine. This is usually not a huge problem, but it can get in the way if other software tries to verify the integrity of the installed products. Also, there is no easy way to verify whether the executables installed via Homebrew were modified or replaced with something else.

I’m proposing to add an opt-in configuration setting to Homebrew that allows users to specify a code signing identity to be used for signing installed products. Code signing could be done by brew in a post-install step using the codesign command.

Since the signatures would only have to be accepted on the user’s machine and no other, it would be sufficient to use a self-generated signing identity. This could even be generated by Homebrew during setup and stored in the user’s login keychain. That way, no certificate authority is needed to provide signing identities, which avoids both maintenance effort and monetary cost.
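To sketch what such a post-install step could look like (the identity name, formula, and paths below are hypothetical):

    # Hypothetical post-install step. "Homebrew Local Signing" is an assumed
    # name for a self-generated identity stored in the login keychain.
    codesign --force --sign "Homebrew Local Signing" /usr/local/Cellar/foo/1.0/bin/foo

    # Anyone, including Homebrew itself, could later verify the signature:
    codesign --verify --verbose=2 /usr/local/Cellar/foo/1.0/bin/foo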

Motivation

I will not explain the motivation for code signing in general here. But as previously stated, code signing would allow Homebrew itself, as well as other software, to verify that executables were not modified since they were signed.

This is where my initial disclaimer comes into play. Little Snitch allows users to create rules for allowing or denying network connections on a per-executable and per-destination basis. By default, these rules require an executable to have a valid code signature (it is of course possible to completely ignore the code signature). Little Snitch users sometimes run into quite complex situations because of this, but that’s a UI and UX issue on our side. Nonetheless, if all executables were signed, users wouldn’t run into these issues and we’d have fewer users who are confused by this.

Therefore, my motivation for proposing this feature is more from a Little Snitch developer’s standpoint than from a Homebrew user’s.

Relevance to Homebrew users

I personally think being able to verify that an executable installed via Homebrew was not modified in any way is highly desirable for everyone from maintainers to end users. I realize that the way this verification is done is debatable.

Alternatives considered

None at this time.


(Mike McQuaid) #2

I’m not sure I understand the benefit here; if you’re just signing it on the local machine and this can be done non-interactively: what’s to stop an attacker from following the same process to self-sign a malicious executable?

While we don’t check that installed executables are unmodified, we do make use of checksums to ensure that the source or binary packages being used have not been modified. This isn’t flawless or a replacement for code signing, but it bears mentioning.
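In shell terms, that verification conceptually amounts to something like this (the file name is a placeholder):

    # Hash the downloaded archive and compare the result against the
    # sha256 recorded in the formula:
    shasum -a 256 ~/Library/Caches/Homebrew/foo-1.0.tar.gz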

I don’t think we’ll ever get to a situation where all Homebrew executables are signed. If we/you want to get the majority signed, however, we’ll need to move the signing process to our CI, which generates the binary packages the vast majority of users consume. If we introduce code signing, this is arguably the first place where it would make sense to do so. If we can figure out a solution that doesn’t cost Homebrew too much money and is sufficiently integrated into our CI process, I’d consider accepting this functionality.

https://github.com/homebrew/homebrew-test-bot is where the majority of our CI logic lives and would likely be the first place that codesigning would be done (on any files in bin or sbin, I’m guessing).
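A minimal sketch of what that could look like (hypothetical, not actual test-bot code; $KEG and $SIGNING_IDENTITY are assumed variables):

    # Hypothetical CI step: sign every executable in a keg's bin and sbin
    # directories with an identity held by the CI machine.
    for f in "$KEG"/bin/* "$KEG"/sbin/*; do
      [ -f "$f" ] && [ -x "$f" ] && codesign --force --sign "$SIGNING_IDENTITY" "$f"
    done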

To ask another question: even if all binaries are signed: how does one verify that libraries haven’t been replaced with malicious/modified versions?


(Marco Masser) #3

Thank you for your time. I’m glad to hear this isn’t completely out of the question.

You can make this argument just as well for code signing on any developer’s computer. The macOS keychain where the signing identity is stored requires a password to access the private key used for signing, but the dialog that asks for the password offers an “Always Allow” button, and I would assume that the vast majority of developers choose to click that. If they don’t, they have to enter their password for every build that performs code signing; for iOS developers, that’s every build they test on a real device.
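For reference, this non-interactive access is exactly what CI setups typically configure explicitly using the standard macOS security tool (none of this is Homebrew-specific; file names and passwords are placeholders):

    # Create a dedicated keychain, import the signing key, and allow
    # codesign to use the key without a password prompt:
    security create-keychain -p "$KEYCHAIN_PW" build.keychain
    security import signing.p12 -k build.keychain -P "$P12_PW" -T /usr/bin/codesign
    security set-key-partition-list -S apple-tool:,apple: -s -k "$KEYCHAIN_PW" build.keychain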

That is probably one of the most problematic points in this whole story. If code signing should happen on a Homebrew user’s machine non-interactively, there’s probably nothing stopping an attacker from signing whatever they want.

I don’t have a good answer here, to be honest.

That is good to know (regarding the checksums).

That sounds very good! Honestly, I didn’t do any research on that beforehand, but I assumed that most users get most things in source form and compile them on their machines. If binary packages are (more) common, that’s an excellent place to start code signing and also the more sensible thing to do.

This poses a question, though: if your CI signs the executables, what signing identity from which certificate authority does it use? It must be one that is accepted on users’ computers, which means the CA must either be well known and pre-installed with macOS, or it must be installed during Homebrew setup.

I think the first option is preferable, but it will likely involve money. (I only know that Let’s Encrypt does not offer code signing certificates, so I would assume you can’t get free code signing certificates.)

What would be the next steps to formally propose something like this?

Regarding the verification of libraries: that’s basically the same thing as for the binaries themselves. If the signature is invalidated, it gets a bit more complicated, though, because all of a sudden you have to know about static and dynamic code signatures, i.e. the code signature of the binary on disk vs. the code signature of the running process.

I summarized this in the Little Snitch 4 help in two short sections that are not actually Little Snitch specific:

High level overview:
Applications with a valid code signature that is broken

More detailed description including examples of codesign commands relevant to this case:
Independently verifying code signature issues reported by Little Snitch
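To illustrate the static vs. dynamic distinction with codesign directly (the path and PID are placeholders):

    # Static signature: verify the binary on disk.
    codesign --verify --verbose=2 /usr/local/bin/foo

    # Dynamic signature: verify the running process by passing its PID.
    codesign --verify --verbose=2 12345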


(Mike McQuaid) #4

Yep. The difference is that for a developer’s software that is redistributed to others, there’s normally a fairly small window of opportunity and/or a manual process that would need to be circumvented for it to be exploited in this way.

I think it has to be non-interactive to be useful. We could tell people to “always allow” the first time but, as you’ve mentioned, this eliminates some of the security benefits. This strengthens the argument for me that the best place to do this would be in our CI.

I think creating a PR with some of the code would be the best start for discussion. We’re not opposed to paying for a certificate. Presumably this is something e.g. Apple provides as a paid service?

Ok. Does that mean we’d want to sign all the libraries too?

Another problem that’s just occurred to me is that we use install_name_tool to relocate binaries on installation. Presumably this would invalidate and/or not play nice with code signing?


(Marco Masser) #5

OK, thank you for the input. I have no idea if or when I will have the time to actually work on this. Due to my lack of experience with Ruby and the inner workings of Homebrew, it would be a terrific learning experience, though.

For the record: I would not mind at all if anyone else wants to work on this.

Apple provides code signing certificates to paid Apple Developer Program members ($99/year, IIRC), and come to think of it, the Developer ID certificate could be suitable for this, though I’m not sure. That is at least what you use to distribute apps outside of the Mac App Store, but I don’t know off the top of my head whether the license agreement allows using the certificate in this way.

Yes. For the process to have a valid code signature, the code in the executable and all code the process dynamically links to must have a valid code signature.
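For example, you can list what a binary links against and then check each library individually (paths are placeholders):

    # List the dynamic libraries an executable links against:
    otool -L /usr/local/bin/foo

    # Each of those libraries can then be verified on its own:
    codesign --verify --verbose=2 /usr/local/opt/bar/lib/libbar.dylib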

I have no experience with install_name_tool, but looking at its man page, it sounds like this is a problem. Also, a quick test with -add_rpath confirms that this breaks the code signature of an executable. I think this might be a showstopper.
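That quick test looked roughly like this (the binary name is assumed):

    # Ad-hoc sign a test binary ("-" means ad-hoc), verify, then modify it:
    codesign --force --sign - ./hello
    codesign --verify --verbose=2 ./hello    # passes
    install_name_tool -add_rpath /opt/test ./hello
    codesign --verify --verbose=2 ./hello    # now reports that the code was
                                             # modified after signing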

I assume this tool is used because package maintainers can’t know the paths to the dynamic libraries for sure at build time?

Edit: After a quick search, it seems like users of Intel’s Fortran libraries also had a related problem. I don’t know enough about this whole topic to know if this is applicable to Homebrew, but does this forum post help in any way?


(Mike McQuaid) #6

Good to know, thanks.

Cool, yeh, I think this would be the best fit.

Gotcha, good to know.

Yeh, I agree. It looks like we could consider using @loader_path instead. This would need to be a wider change to Homebrew, though, and I/we can’t commit to doing that in the short term.
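For illustration, the idea would be to give libraries a relocatable install name at build time instead of rewriting paths afterwards (a sketch; names and paths are hypothetical):

    # Build the library with an @loader_path-relative install name up front,
    # so no post-install rewriting (and no signature breakage) is needed:
    clang -dynamiclib foo.c -o libfoo.dylib \
        -install_name @loader_path/../lib/libfoo.dylib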


(Marco Masser) #7

Yeah, I imagine this isn’t a trivial change. But good to know there are options.

Mike, I’d like to thank you very much for all your input. While I won’t/can’t take this any further at the moment, I think this thread might prove helpful if someone else takes this up in the future.


(Shaun Drong) #8

I’m not sure this is directly related to the Little Snitch / code signing issue, but having sat through the WWDC security sessions, it’s pretty clear the days of unsigned code execution are numbered, and at some point the Homebrew community will need to deal with it in some fashion. @marcomasser Not sure if you had time to look at Mojave’s Gatekeeper changes, but could we get away with just using the new notarization service once it’s better established, vs. a full code signing review? See: https://help.apple.com/xcode/mac/current/#/dev88332a81e
As an avid Homebrew open source user, I’m concerned there will “come a day there won’t be room for naughty locally compiled code to run about in macOS”. If there’s any way I can help, let me know where to look.


(Marco Masser) #9

I think you’re mixing up a few things here.

  • Developer ID is a program by Apple that provides code signing certificates, among other things.
  • Developer ID has nothing to do with any review process whatsoever.
  • Notarized apps are an extension of the Developer ID program.
  • Notarized apps must be code signed and go through an automated review that checks for “malicious things”.

For Homebrew this means that to use the notary service, it must first have a code signing certificate by Apple (through the Developer ID program) and sign all its code.
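At the command level, that workflow would look roughly like this as of Xcode 10 (the identity, bundle ID, and credentials are placeholders):

    # Sign with a Developer ID certificate and the hardened runtime,
    # then submit the result for notarization:
    codesign --force --options runtime \
        --sign "Developer ID Application: Example Corp" ./foo
    zip foo.zip foo
    xcrun altool --notarize-app --primary-bundle-id "org.example.foo" \
        --username "dev@example.com" --password "@keychain:AC_PASSWORD" \
        --file foo.zip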


I just finished watching session 702 “Your Apps and the Future of macOS Security” and my takeaways are:

  • Developer ID signed apps must be code signed (as has always been the case).
  • Notary Service is a new extension to Developer ID.
  • Notary Service requires the new Hardened Runtime.
  • The new Hardened Runtime allows disabling code signature validation, but the app must still be signed in the first place. This is intended e.g. for processes that modify themselves at runtime, thereby breaking their own code signature.
  • Notary Service is optional for now but will be required in the future.

I assume you’re referring to what I listed as the last point. It was not explicitly stated in the session what “required in the future” means, but I assume that is for Developer ID apps only. Apps that are not Developer-ID-signed are unaffected by all this. If this assumption is true, Homebrew should be able to continue as is.

But that doesn’t mean Apple will not prevent unsigned apps at some point in the future.


(Shaun Drong) #10

Gotcha, thanks for the clarification @marcomasser! I’m definitely coming from a novice Apple developer deployment position as a Ruby/PHP coder.
I do feel like the use case Apple presented for notarized apps was for developers that want to deploy via their own services/sites vs. the App Store, but it sounds like I missed a step about signing for local use, as you suggested for compatibility with Little Snitch (fan, btw). My suggestion was more for the binaries pre-compiled by Homebrew’s CI vs. locally built ones, but if self-signing can work for this use case, so much the better. Again, let me know if I can support this feature change.

thanks


(Marco Masser) #11

Don’t worry, you’ll get the hang of all this code signing stuff if you need it. It’s just a bit much at the beginning and Apple adding or changing something every year doesn’t exactly make it easier, especially for newcomers.

Glad to hear you’re a fan of Little Snitch!

In order from no restrictions to more restrictive, it’s like this on macOS:

  1. Unsigned code: Homebrew as it is today. It doesn’t matter whether the executables are compiled by someone else or by you. Self-signed code does not provide much more security than this level.
  2. Developer ID: Code is signed by the developer using a certificate provided by Apple. Apps are distributed by the developer. If the developer screws up in a major way with one of their apps, Apple can revoke that certificate and all apps by that developer become unusable for all users.
  3. Notarized Apps (new): Code is signed by the developer using a certificate provided by Apple. Some limited automated testing is performed by Apple. Apps are distributed by the developer. If the developer screws up in a major way with one of their apps, Apple can revoke that single app, and only that app becomes unusable for all users. All other apps by that developer continue to work fine.
  4. App Store: Apple checks what the developer builds and distributes it via the Mac App Store. If the developer screws up in a major way with one of their apps, Apple can revoke their certificate and all apps by that developer become unusable for all users.
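Incidentally, spctl shows how Gatekeeper classifies a given executable under these tiers (the path is a placeholder):

    # Ask Gatekeeper how it assesses an executable. Unsigned or self-signed
    # code is rejected; Developer ID-signed or notarized code is accepted.
    spctl --assess --type execute --verbose /usr/local/bin/foo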

As @MikeMcQuaid and I concluded before, having code signed by Homebrew’s CI would probably be the way to go. I can’t speak for any planning or coordination around that since I’m not working on this or anything else related to Homebrew. But I’m happy to hear that you care about this too!