How to create a Ubuntu repository server - LinuxConfig.org

This tutorial was quite helpful, thanks for your time. Your instructions were not to blame for the problems I experienced. They were caused by the version of apt-mirror that ships with Ubuntu 20.04.
Further details can be found in these GitHub issues:
apt-mirror/apt-mirror issue #102
apt-mirror/apt-mirror issue #118

Here’s the updated apt-mirror package on GitHub that I used to solve this problem:
Stifler6996/apt-mirror
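In case it helps anyone, swapping the patched script in can be as simple as something like the following (assuming the fork’s apt-mirror script is a drop-in replacement for the one the Ubuntu package installs at /usr/bin/apt-mirror - check the fork’s README for its recommended steps):

# install the packaged apt-mirror first so the default config, user and cron job are in place
sudo apt install apt-mirror
# fetch the patched script and drop it in over the packaged one (keeping a backup)
git clone https://github.com/Stifler6996/apt-mirror.git
sudo cp /usr/bin/apt-mirror /usr/bin/apt-mirror.orig
sudo cp apt-mirror/apt-mirror /usr/bin/apt-mirror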


Hi, and thank you for your tutorial.
I followed all your steps, but I can’t get an upgrade to work.
My local server is working fine, but after apt update on my client I get a mix of Get, Ign and Err outputs.
My system tells me it has fetched 21.8 MB, yet when I then try to upgrade the client, it just finishes with 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
I would appreciate some help, @sandmann.

Hi Chmio57,

Welcome to our forums.

Could you please post the error messages you encounter? That would help us understand your issue better.

Of course, here you go:

apt-get update (…)
Fetched 483 kB in 1s (346kB/s)
Reading package lists… Done
E:Failed to fetch http:/192.168.9.250/ubuntu/dists/focal-backports/main/binary-i386/Packages 404 Not Found [IP: 192.168.9.250 80]
apt-get upgrade
Reading package lists… Done
Building dependency tree
Reading state information… Done
Calculating upgrade… Done
0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded

The 404 is not my problem, as I didn’t download everything for testing.

It always says 0 to upgrade and so on… Even when it fetches about 20 MB, it has nothing to upgrade.

(I can’t put links in posts; that’s why the http is wrong.)

Seems to me that the 404 error is indeed your problem, because there is no metadata for apt to process (the Packages index file the client failed to fetch). You don’t need to download the whole repository for testing, but the metadata is mandatory for it to work - that’s what tells apt where to find everything else.
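As an illustration (assuming you only mirrored the focal and focal-updates suites with the usual components - adjust this to whatever is actually in your mirror.list), the client’s /etc/apt/sources.list should reference only what the mirror carries:

# /etc/apt/sources.list on the client - list only the suites present on the mirror
# (if only amd64 was mirrored, "deb [arch=amd64] ..." also stops apt asking for i386 lists)
deb http://192.168.9.250/ubuntu focal main restricted universe multiverse
deb http://192.168.9.250/ubuntu focal-updates main restricted universe multiverse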


Hello,

I’m looking at this guide, but I’m not sure if it’s right for my use case… Can someone help me out?

I’m trying to keep an air-gapped Ubuntu 20.04 LTS GNS3 server up to date. I need to copy the repos needed for Ubuntu 20.04 LTS Server and the GNS3 repo to keep its files current. This system is not allowed to touch the Internet, so I’m basically having to build the updates at the house and use a USB drive to get them onto the air-gapped network. Any idea of what would work best for this?

Thank you!

Hi justjag16,

Welcome to our forums.

Since the system is not allowed to reach the Internet directly, there don’t seem to be many options other than transferring the needed data by hand. If it is allowed, you could create a repository server on the air-gapped network and let it reach the Internet only while it is isolated from the rest of the secured network. When syncing is done, close the Internet connection and re-open the internal network access - after making sure the downloaded data is safe and the repository server is not compromised. It is much the same as getting the data onto a USB drive, but with less handwork.
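For the mirroring itself, the apt-mirror configuration could look roughly like this - only a sketch, and the GNS3 PPA path and component lists are assumptions, so check them against what the clients currently have enabled in their sources:

# /etc/apt/mirror.list - mirror Ubuntu 20.04 (focal) plus the GNS3 PPA
set base_path /var/spool/apt-mirror
set nthreads 20
deb http://archive.ubuntu.com/ubuntu focal main restricted universe multiverse
deb http://archive.ubuntu.com/ubuntu focal-updates main restricted universe multiverse
deb http://archive.ubuntu.com/ubuntu focal-security main restricted universe multiverse
# GNS3 PPA - path assumed, verify it against the gns3 entry under /etc/apt/sources.list.d/ on a client
deb http://ppa.launchpad.net/gns3/ppa/ubuntu focal main
clean http://archive.ubuntu.com/ubuntu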

Thanks for the guide - it worked very well, as long as I read everything. I’m a little new to Ubuntu, but I have run Red Hat/CentOS for years and maintain a CentOS repo on an air-gapped network. I use a script to pull the RPMs; it makes a copy of those files in a burn directory for me to put on a CD/DVD and carry over to the air-gapped network to add to the air-gapped repo. My question is this: is there a way to monitor the files being downloaded? I can use find and make a list from that, but that seems a little clunky. I would really like to find a way to copy just the new files rather than the entire directory every time - that’s not really a practical way to operate.
I do plan to have a repo on the air-gapped network, but it won’t ever be able to connect to the internet.

Hi OldCrow,

Welcome to our forums.

You could use something like aide to create a reference database of the contents of your local repository before you download the new files. Running a check against this database would then list all the files that were changed while refreshing the repository. You could use the output of the check to copy only the packages that changed to the place where you write the media from.
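A rough outline of that workflow, assuming the mirror lives under apt-mirror’s default /var/spool/apt-mirror (exact flags and database paths can differ between AIDE versions and packagings, so double-check the aide man page on your system):

# /etc/aide/aide.conf should contain a rule covering the mirror tree, e.g.: /var/spool/apt-mirror R
# 1) build the reference database before refreshing the mirror
sudo aide --init -c /etc/aide/aide.conf
sudo mv /var/lib/aide/aide.db.new /var/lib/aide/aide.db
# 2) refresh the mirror
apt-mirror
# 3) report everything that changed compared to the reference database
sudo aide --check -c /etc/aide/aide.conf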

You may be on to something there; I’ll give that a look. “find /path -type f -mtime -1 -exec ls -la {} \;” gives me a pretty decent list, but it takes a little while to run. The only thing about doing that is I don’t know how much I trust it to be accurate. I know I’ll need to get all the files, not just the .deb files, unless I want to rebuild the Release files with dpkg.
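If I get the chance I might also try pinning the window to a marker file instead of guessing with -mtime - just an untested scribble at this point, and the paths are made up:

# assumes the marker was touched at the end of the previous run
mkdir -p /srv/burn
find /var/spool/apt-mirror -type f -newer /var/spool/apt-mirror/.last-burn -exec cp --parents -t /srv/burn {} +
# mark the state this burn was taken from
touch /var/spool/apt-mirror/.last-burn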

On the topic of how long any solution will run, I am afraid all of them will run for quite some time, since the Debian way is that many, many little packages constitute a service or a function, and the same goes for downloading them. Traversing through so many files will take some time for sure.

I can also suggest checking our rsync incremental backup guide, since it may provide a way to copy only the files that changed since the last “backup”, and to do so in one step, as opposed to “1) find all files that differ, 2) copy/move them to the staging area”.
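As a sketch of that one-step idea (the staging paths below are only examples): keep a local “shadow” copy of what is already on the air-gapped repository, and let rsync place only new or changed files into the burn directory:

# only files missing from (or different to) the shadow copy end up in /srv/burn
rsync -av --compare-dest=/srv/airgap-shadow/ /var/spool/apt-mirror/mirror/ /srv/burn/
# once the media has been carried over successfully, fold the burned files into the shadow copy
rsync -av /srv/burn/ /srv/airgap-shadow/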