Munki, Gitlab & Git-LFS

NOTE: This is not a HOW TO, more of a rant; the how-to may or may not come later. This is written from the perspective of someone who manages (for the most part) their infrastructure end to end: VM config, service management, monitoring, and patching. When it comes to scale and complexity, your mileage may vary.


  • there are some caveats to this route
  • it’s worth it
  • git isn’t a one-way street; you can use it as little or as much as you want
  • Gitlab as a tool sets you up for more in the CI/CD realm

The idea

This post is just some general process notes / lessons learned from migrating a Munki repo to a git-tracked repository.

fwiw, “git” and “git-lfs” don’t need Gitlab; it’s just my life, and these are my thoughts.


  • vcs
    • a version control system was huge; the ability to see what I changed and when I changed it was really rad
  • pipelines
    • though I didn’t have that word for it at the time, some flow of dev/test/deploy of a repo
  • multiple users
    • though this still isn’t prod, I wanted the ability for multiple users to make changes, and for said changes to be tracked
  • no more smb/sftp
    • all the transport security can be done with pub/priv keys, etc

Why Gitlab?

  • I could keep it locally
    • OR use the cloud, privately too
  • The CE edition is rad (and I will do my best to link exclusively to the Community Edition docs)
    • and free
  • It has CI/CD built in
    • (though I didn’t realize how great this was ’till later)

The Reality

There were some kinks that had to be worked out, the first and maybe most obvious was managing large files with git. Gitlab has LFS support built in, which is rad, but still comes with its own nuances.


Git Large File Storage

Git Large File Storage (LFS) replaces large files such as audio samples, videos, datasets, and graphics with text pointers inside Git, while storing the file contents on a remote server like GitHub.com or GitHub Enterprise.

Here’s a handy tutorial: Getting started with Git LFS

Git LFS on Gitlab

Sounds great, and can be enabled in Gitlab easily.

LFS on Gitlab also requires you to use HTTPS for auth and transport, rather than SSH. Digging into the Gitlab administration docs, we can see they straight up list some pretty big limitations:

  • Support for removing unreferenced LFS objects was added in 8.14 onwards.
  • LFS authentications via SSH was added with GitLab 8.12
  • Only compatible with the GitLFS client versions 1.1.0 and up, or 1.0.2.
  • The storage statistics currently count each LFS object multiple times for every project linking to it

But some stuff they don’t tell you that I found out the hard way-

LFS caches (this, srsly)

What I found was that when Gitlab was receiving LFS files, it could cache them in a /cache location, then move them to the configured storage location.

This was noticed when the OS disk on my VM filled. womp. By modifying gitlab.rb you can change the LFS storage location.

Looking at gitlab.rb.template, we have these LFS options:

### Git LFS
# gitlab_rails['lfs_enabled'] = true
# gitlab_rails['lfs_storage_path'] = "/var/opt/gitlab/gitlab-rails/shared/lfs-objects"
# gitlab_rails['lfs_object_store_enabled'] = false # EE only
# gitlab_rails['lfs_object_store_direct_upload'] = false
# gitlab_rails['lfs_object_store_background_upload'] = true
# gitlab_rails['lfs_object_store_proxy_download'] = false
# gitlab_rails['lfs_object_store_remote_directory'] = "lfs-objects"

Oh nice! A gitlab_rails['lfs_storage_path'] option, sweet, so you can store your LFS objects on /dev/sd(whatever). This is good to know- so that, say, your OS disk doesn’t fill…
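A minimal sketch of that change, assuming a dedicated data disk mounted at /mnt/gitlab-data (that path is my example, not a default). Uncomment and edit the relevant lines in /etc/gitlab/gitlab.rb, then apply with `sudo gitlab-ctl reconfigure`:

```
### Git LFS -- uncommented and pointed at a bigger disk (example path)
gitlab_rails['lfs_enabled'] = true
gitlab_rails['lfs_storage_path'] = "/mnt/gitlab-data/lfs-objects"
```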

And what about your client? Most people have Munki running with autopkg or MunkiAdmin on a macOS box. So you have to have Git-LFS on your Mac, and it’s a little less supported.

LFS on macOS

You can do this via the installer git-lfs provides (macOS), which is basically the command line extension and a shell script. Or you could use brew; loads of people have loads of opinions about brew, so use it or don’t, and keep that to yourself.

Regardless, you will need to initialize it-

# Update global git config
$ git lfs install
# Update system git config
$ git lfs install --system

Which is like HEY GIT, we’re going to use LFS now. But for what? And when? Touché, git.

Looking into Configuring Git Large File Storage, once you’re in a tracked dir, you can run:

$  git lfs track "*.psd"
Adding path *.psd

Which is cool, and it gets added to your .gitattributes file, but most admins know what their big files in their munki repo are… so something like this in your .gitattributes file may be more applicable:

*.pkg filter=lfs diff=lfs merge=lfs -text
*.mpkg filter=lfs diff=lfs merge=lfs -text
*.dmg filter=lfs diff=lfs merge=lfs -text
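One gotcha worth hedging on: those patterns only affect files added after the rules exist, so commit .gitattributes before importing anything (binaries that are already committed need `git lfs migrate import` instead). A throwaway sketch of the order of operations, paths illustrative:

```shell
# Fresh repo: commit the LFS tracking rules before importing any packages.
cd "$(mktemp -d)" && git init -q .
cat > .gitattributes <<'EOF'
*.pkg filter=lfs diff=lfs merge=lfs -text
*.mpkg filter=lfs diff=lfs merge=lfs -text
*.dmg filter=lfs diff=lfs merge=lfs -text
EOF
git add .gitattributes
git -c user.name=demo -c user.email=demo@example.com commit -q -m "Track installers with LFS"
# Anything matching those patterns is now routed through the lfs filter,
# which you can confirm (even before the file exists) with:
git check-attr filter -- Firefox-115.pkg
```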

It still CACHES

Let’s take a look at our macOS git env…

$ git lfs env

The output points at a local cache: the LFS client keeps its own copy of every fetched object under .git/lfs, in addition to the checked-out files. That effectively means your local repo folder can be double its actual size. So plan accordingly.

Once it’s tracked, it is as simple as a git push to get those files and changes to the Gitlab server.


I know it sounds pretty negative up to this point; the benefit, though, on the other side of the hump of getting it set up, is git. And you can “git” as much or as little as you want.

Munki & Git

There’s some great documentation on that here. But check out Munki’s Repo Plugins, specifically the GitFileRepo:

GitFileRepo – a proof-of-concept/demonstration plugin. It inherits the behavior of FileRepo and does git commits for file changes in the repo (which must be already configured/initialized as a git repo)

This is rad because once your repo is tracked it does git commits for file changes in the repo. Rad.
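For reference (hedging that key names can shift between Munki versions), wiring munkiimport up to a repo plugin is just a preference; `munkiimport --configure` prompts for it, and conceptually you end up with something like:

```
# com.googlecode.munki.munkiimport preferences (illustrative values)
plugin   = GitFileRepo
repo_url = file:///Users/you/munki_repo
```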

Git Theories

master Branch All Day

Once you have all your stuff in git, you can choose how you’re going to use it- I have seen a lot of benefit in a tracked repo that simply commits changes to master. Since dev, test, and prod are all contained in “munki logic”, committing any changes to master allows you to track any changes made to .pkginfo or manifest files.

This works, is totally legit, and will get you a load of good info from tracked files.
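Day to day, that flow is nothing fancier than commit-and-push. A minimal sketch against a throwaway repo (in real life you’d be sitting in your munki_repo with a Gitlab remote, and the file names are made up):

```shell
# Simulate the munki_repo layout in a scratch dir.
cd "$(mktemp -d)" && git init -q .
mkdir -p pkgsinfo manifests
echo '<plist version="1.0"></plist>' > pkgsinfo/Firefox-115.plist
# Edit pkginfo/manifest files, then commit straight to master:
git add pkgsinfo manifests
git -c user.name=demo -c user.email=demo@example.com commit -q -m "Promote Firefox 115 to production"
# ...followed by `git push origin master` against your real remote.
git log --oneline
```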

Let’s Face(book) It, we need more

Or maybe you don’t but Facebook’s CPE team has a really rad option…

Check out their CPE resources, specifically the autopkg_tools

This is an AutoPkg wrapper script that creates a separate git feature branch and puts up a commit for each item that is imported into a git-managed Munki repo.

They have a Getting Started Guide if this sounds like more of what you’re looking for-

Stuff I didn’t touch on, that I could

  • catalogs, not tracking them, making them, making them with a runner
  • git-fat as an option?
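On the catalogs point, the short version I’d sketch (paths hypothetical): catalogs are generated from pkginfo files, so they can stay out of git entirely and be rebuilt server-side, e.g. by a Gitlab CI runner calling Munki’s makecatalogs after each push:

```shell
# Keep generated catalogs out of version control.
cd "$(mktemp -d)" && git init -q .
echo "catalogs/" >> .gitignore
git add .gitignore
git -c user.name=demo -c user.email=demo@example.com commit -q -m "Ignore generated catalogs"
# A runner (or post-checkout hook) then regenerates them with:
#   /usr/local/munki/makecatalogs /path/to/munki_repo
```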

In summary

I personally won’t go back; there was a little tweaking to get it all sorted, but the information and tracking git provides to the munki repo is well worth it.

Coming Soon

  • I am going to try and sanitize some of the helpful CI/CD stuff I got rolling in Gitlab and talk about it here.
  • Munki in the cloud stuff
  • macOS monitoring?
  • maybe a similar rant on osquery and


Macadmin Resources

🎥 Mac Justice – Intro to Gitlab (MacDevOPs, shorter)
🎥 Mac Justice – Intro to Gitlab (PSU Macadmins, longer)

🔗 Advanced Munki Infrastructure: Moving to Cloud Services by Rick Heil
🎥 Advanced Munki Infrastructure: Moving to Cloud Services

General Resources

🔗 Git LFS
🔗 Gitlab CE
🔗 Gitlab and LFS

🎥 Git Large File Storage – How to Work with Big Files
🎥 Git LFS Training – GitHub Universe 2015
🎥 Tracking huge files with Git LFS, GlueCon 2016

NEMS – Nagios for your Pi

NEMS, or Nagios Enterprise Monitoring Server, developed by Robbie Ferguson, is a modernized version of NagiosPi.

NEMS is a modern pre-configured, customized and ready-to-deploy Nagios Core image designed to run on the Raspberry Pi 3 micro computer. At its core it is a lightweight Debian Stretch deployment optimized for performance, reliability and ease of use.

I had used FAN (Fully Automated Nagios) for my home instance until development stopped around 2013. NagiosPi was a good alternative, and I liked the idea of Nagios living on a Pi rather than as another VM on a server; it seemed counterintuitive to have it live on a virtual host, and a Pi allows for an inexpensive platform for a standalone service.

NEMS is built for the RPi3 and requires one. The beauty of NEMS (as with FAN or NagiosPi) is that it’s all prebuilt: download the .img, flash it to a Micro SD and you’re off.
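For the record, “flash it” on Linux is the usual dd dance (the image filename and device name below are placeholders- triple-check the device with `lsblk` first, dd will cheerfully overwrite the wrong disk):

```
unzip nems-image.zip
sudo dd if=nems.img of=/dev/sdX bs=4M conv=fsync status=progress
```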

NEMS bundles a lot of great features on top of Nagios Core, and it can be a simple box performing check_pings or it can be as robust as Nagios NRPE can get- that’s up to you.
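To make the “simple box performing check_pings” concrete, a stock Nagios Core service definition looks roughly like this (host name and thresholds are made up; the warning/critical pairs are round-trip ms and packet loss):

```
define service {
    use                   generic-service
    host_name             homeserver
    service_description   PING
    check_command         check_ping!100.0,20%!500.0,60%
}
```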

Within an hour I had it up and running, performing basic checks on my home environment, at no more than $65 USD for all the parts (Pi, case, power supply, Micro SD). It’s an easy and robust solution. I recommend checking it out, as there is an excellent write-up and a direct download to the image.

If you have a small environment or home infrastructure I highly recommend NEMS  (Nagios Enterprise Monitoring Server) by Robbie Ferguson.

LinuxFest Northwest

I am super excited to announce I’ll be presenting at LinuxFest Northwest May 6th on “Managing macOS, without macOS (almost)”; you can read more about the session here. LinuxFest Northwest is an annual open source event held at Bellingham Technical College.

What is LinuxFest Northwest? LFNW features presentations and exhibits on various F/OSS topics, as well as Linux distributions and applications. LinuxFest Northwest has something for everyone, from the novice to the professional. The hours are 9:00 a.m. to 5:00 p.m. both days.

LinuxFest Northwest is a great conference and you cannot argue with the price. I hope to see you there!

Macadmins Meetup

“Unofficial” Apple Admins of Seattle and the Great Northwest social, following Saturday’s sessions, will be held at Elizabeth Station at around 5pm. They should have a food truck outside and an overabundance of beer/cider selection. There is also the incredible Primer Coffee right next door if that’s more your speed. As always, find us on Slack; hope to meet you soon.