Saving Bandwidth With Apt-Cacher : Revisited

By | 2007/10/03

Those of you who are long-time readers may remember my previous article on Saving Bandwidth With Multiple Machines Using Apt-Cacher. With the next Ubuntu release coming down the pike in just a few weeks, I wanted to revisit this article for those of you who will be upgrading via aptitude. If you have multiple machines you’ll really want to look into setting this up!

I’ve been testing and retesting different hardware running Ubuntu 7.10 alpha and now beta releases. Nothing is worse than installing two machines and downloading the same updates twice, once for each machine. Apt-Cacher steps in at this point and lets us download each update only once, to be shared by all of the machines. Let’s revisit the setup:

Installing Apt-Cacher

The first thing you’ll need to do is select a central machine that you’ll want to act as your apt-caching service. I use one of my old servers sitting in the closet, but this could be one of your desktops if you like. You should note that this machine will need to be on any time another networked machine wants to request an update. On this single machine we’ll install the apt-caching service:

sudo aptitude install apt-cacher

Configure Apt-Cacher

You’ll also want to set this service to start automatically at boot time. To do this, set AUTOSTART=1 in the following file:

sudo vim /etc/default/apt-cacher
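If you’d rather not open an editor, a one-line sed can make the same change. This is a sketch assuming the stock Ubuntu packaging, where /etc/default/apt-cacher contains an AUTOSTART= line:

```shell
# Flip AUTOSTART to 1 so the apt-cacher daemon starts at boot,
# then print the line back to confirm the edit took effect.
sudo sed -i 's/^AUTOSTART=.*/AUTOSTART=1/' /etc/default/apt-cacher
grep '^AUTOSTART=' /etc/default/apt-cacher
```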


You’ll also want to update the access restrictions to allow your local machines access to this service. By default only localhost (127.0.0.1) will be allowed.

sudo vim /etc/apt-cacher/apt-cacher.conf

change allowed_hosts= to allowed_hosts=192.168.0.0/24 (update to your range as appropriate)

Once you’ve made these changes you’ll need to restart the apt-cacher service on the central machine:

sudo /etc/init.d/apt-cacher restart
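After restarting you can sanity-check that the daemon is answering. This sketch assumes curl is installed and that apt-cacher is listening on its default port, 3142:

```shell
# Request the cache's front page and print just the HTTP status code.
# Getting any status code back means the daemon is up on port 3142.
curl -s -o /dev/null -w '%{http_code}\n' http://localhost:3142/
```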

Configure the Clients

You can configure your client machines (the other machines on your network) to use this apt-caching system with a simple edit to the apt sources.list file. Make sure you know the IP address of the apt-caching server you configured above.

If your sources.list currently looks something like this:

deb http://archive.ubuntu.com/ubuntu gutsy main restricted universe multiverse
deb http://archive.ubuntu.com/ubuntu gutsy-updates main restricted universe multiverse
deb http://archive.ubuntu.com/ubuntu gutsy-security main restricted universe multiverse

prefix each address with the caching-server IP address and port 3142: (replace 192.168.0.10 with your local server IP)

deb http://192.168.0.10:3142/archive.ubuntu.com/ubuntu gutsy main restricted universe multiverse
deb http://192.168.0.10:3142/archive.ubuntu.com/ubuntu gutsy-updates main restricted universe multiverse
deb http://192.168.0.10:3142/archive.ubuntu.com/ubuntu gutsy-security main restricted universe multiverse
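Rather than editing each line by hand, a sed one-liner can rewrite the whole file on a client. The address 192.168.0.10 is just a placeholder for your own cache server; the backup copy lets you revert if anything goes wrong:

```shell
# Keep a backup, then route every http:// mirror line through the cache.
# "http://archive.ubuntu.com/..." becomes
# "http://192.168.0.10:3142/archive.ubuntu.com/...".
sudo cp /etc/apt/sources.list /etc/apt/sources.list.bak
sudo sed -i 's|http://|http://192.168.0.10:3142/|g' /etc/apt/sources.list
```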

Once you’ve done this on the clients you should be able to fetch and install updates via the central machine and save bandwidth. Once a package has been fetched for one machine it stays available centrally for each of the others, saving bandwidth and speeding up updates by using LAN speeds vs WAN speeds.

If you’re interested in more information see my previous post on the topic or see the man page for apt-cacher (man apt-cacher).

14 thoughts on “Saving Bandwidth With Apt-Cacher : Revisited”

  1. Matt Mossholder

    The alternative approach on your clients is to leave the URLs in sources.list alone and configure a proxy server in apt.conf that points to apt-cacher. It makes it a lot easier to turn on and off, and requires less effort.

  2. Soren Stoutner

    Mr Troll,

    When it comes to something as important as package management, I prefer to not have my computers automatically doing anything, especially selecting new repositories. Perhaps having that functionality automated would be nice, but it should never be enabled by default. The setup here is truly not that difficult and basically involves 1) installing the apt-cacher package, 2) turning it on, and 3) pointing the other boxes to it. That’s exactly how it should be.

    Now, if you want to make the argument that there should be a nice little gui for the process that shows you all the options and automatically scans for available cache servers on the network and provides them in a list that you can choose from, that is all well and good (as long as you are offering to write the gui yourself). But please don’t try to make the computer smarter than I am and do a bunch of dumb things because it is trying to “help” me. I work on networks for a living and get paid a whole bunch of money to fix those exact types of problems in other operating systems.

  3. Karl Bowden

    Hey Christer,
    I would also recommend approx.
    It seems that approx and apt-cacher do almost exactly the same thing, and setting up approx was very much the same, except the mirror is defined in the approx config file instead of in the client sources.list.
    The only advantage I found with this is that when one of the mirrors I was using started timing out, I simply changed the mirror name in approx.conf and restarted approx, and all of the machines using it as their source kept functioning as normal.


  4. Richard

    The key line to put in a file under /etc/apt/apt.conf.d is:

    Acquire::http::Proxy "http://cache-host:3142";

  5. Thorne

    Well, I need some info about this apt-cacher.
    I have dapper (LTS), feisty, and gutsy. How can I set it up to use apt-cacher? Do I use different subdirectories for each, or have them all going into the same directory?
    I would very much like to use this program for my repository cache but just don’t know how to set it up. Can anyone please help…

  6. Ivan

    Also, if you change sources.list, sudo apt-get update is required.

  7. Mark Waters

    I’d like to thank both Christer for the article and Brian for his comments.

    I got apt-cacher-ng installed last night for our small network and it’s working perfectly, saving us bandwidth and time.

    Good work guys!
    Mark Waters

  8. Alec

    Hi – just wanted to say thanks to Christer and everyone else for their comments.

    I’ve just installed apt-cacher again, and I came back to these same instructions!

    Very helpful instructions, so I was pleased to find that they are still around.

    Many thanks,
    Alec Simpson

  9. Alec


    PS – one thing I did do differently from the script was to use the hostname in place of the host IP address.

    Seems to be working well for me, and it removes dependency on static IP addresses.

    Any comments on whether this was a good idea are welcome!


