So after the great power outage fiasco I decided to look at some settings and perform some tests. Sure enough the server BIOS was set to 'power off' after AC fail, so I changed this to 'on'. Then did a full AC outage test. It lasted well over an hour, but when it got to about ten minutes remaining both the server and the firewall shut down. I then returned the AC supply. The UPS did as it should: cut the output relay, charged for a bit and then turned the output on. The firewall dutifully powered back up, however the server didn't. Because the server had shut down gracefully, it didn't actually suffer an AC fail while running, so didn't power back on. Houston, we have a problem.

So off to the BIOS again looking for a 'wake on LAN' setting; nowhere to be found. Played around with a couple of Ubuntu utils and identified the network card etc. It said it could support WOL using the magic packet, but it was currently disabled. Some Googling later and a service was set up to re-enable WOL after every boot. However, no matter how many packets I sent it, and whether it was a full shutdown, sleep or hibernate, it would not wake up. As the setting wasn't in the BIOS my guess is the hardware behind the NIC just doesn't support it.

Back to the BIOS. One interesting thing I did notice in there was a setting for RTC Alarm; this looked interesting. I set the clock for a few minutes later and shut down the machine. Sure enough, when the clock struck, the machine rebooted. So I've now set it to one minute past midnight (UTC). I'm now running a full AC fail test again; there is about fifteen minutes of power left and it's coming up to ten-fifteen. So when it shuts down at about ten-thirty I'll restore the power. Hopefully then just after eleven (we are on BST at the moment so UTC+1) all should be back up.
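For reference, the service that re-enables WOL after every boot was a one-shot systemd unit along these lines; the interface name enp3s0 is a placeholder for whatever the Ubuntu utils reported on the box.

```ini
# /etc/systemd/system/wol.service (interface name is a placeholder)
[Unit]
Description=Re-enable Wake-on-LAN on the NIC
After=network.target

[Service]
Type=oneshot
# 'g' = wake on magic packet
ExecStart=/usr/sbin/ethtool -s enp3s0 wol g

[Install]
WantedBy=multi-user.target
```

Enabled with `systemctl enable wol.service`; not that it helped in the end, since the hardware behind the NIC apparently doesn't honour it.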
So we live in a virtual world
So the last couple of days I resurrected a very old laptop, and I mean very old, to the tune of twenty-plus years. It has videos my ex-husband made when we first started going out. I wanted to use it for other things, I'll come on to that in a bit. But there was some stuff on there I would like to keep. Trying to connect it to a modern network share wasn't happening. Trying to back up to a 128GB USB stick wasn't happening. Eventually managed to get it to talk to an old 300GB external drive. But rather than just do a backup, instead I 'virtualised' it. Using VMware Converter I turned it into a bunch of files. I then transferred these to the USB stick and used VMware tools to create a virtual machine. Works perfectly, boots up into Windows XP in a window and you can use it just like the old laptop.
Now the old laptop I wanted to reuse as an Ubuntu desktop machine just for playing about. This laptop only has a 32-bit processor so I had to go back quite a few versions to find one which would install. Eventually, after a couple of days of pissing about and various updates, I've got it up to 18.04. Sadly it isn't really usable, the machine only has 512MB of RAM and a very slow 'M' series processor. I looked and I could buy a 1GB memory upgrade and a new battery for around £60. But sadly, looking around, I could get a second-hand laptop with much higher specs for the same money. So I'm going to reinstall Windows XP Home on it and see how it runs, but I have a feeling, with its dead CMOS battery, dead main battery, slow processor and lack of RAM, it will be making its final journey to the recycling box. Shame, I have many fond memories of that machine.
Still, I have found another unused laptop which has a 64-bit processor and 2GB of RAM; I think this may be my next Ubuntu desktop target.
Comes to something when you have a will to look after your thirteen-year-old dog
So went back to Unity and they actually had me down for an appointment. So got stabbed with a needle and took a swab of my arse. Spent a lot of the day doing accounts and admin stuff. Finally got round to my ‘will pack’ from Battersea Dogs Home. Sasha and Dillon are now taken care of in the event of my death. They are thirteen and eleven. Thinking they may outlive me is worrying.
In other news I'm making great progress on SSL certificates. I've now produced a CSR with both the firewall and backup firewall subdomains, produced a certificate and tested it on the pfSense box. Even got as far as SSHing into the firewall, pushing the key, pem and PHP script and then running the script to copy and install the certificate and restart the web interface. Need to do some bash magic server side now.
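A multi-subdomain CSR like that is one openssl command using subjectAltName; this sketch uses placeholder hostnames since the real ones aren't in the post (needs OpenSSL 1.1.1 or later for -addext).

```shell
# One key and one CSR covering both firewall hostnames via subjectAltName.
# firewall.example.com / backup-fw.example.com are placeholders.
openssl req -new -newkey rsa:2048 -nodes \
  -keyout firewall.key -out firewall.csr \
  -subj "/CN=firewall.example.com" \
  -addext "subjectAltName=DNS:firewall.example.com,DNS:backup-fw.example.com"

# Sanity-check the request before sending it off to the CA.
openssl req -in firewall.csr -noout -verify
```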
So what happened
I was sat watching crap telly last night, fiddling on the laptop connected to the server over SSH. I was trying to get fail2ban to work with WordPress, so it would ban an IP based on a bad login. While fiddling I was checking the log files and they showed that iptables was not working properly. After much Googling I ended up trying to reinstall the kernel. This did indeed reinstall; unfortunately it didn't install the network drivers, which apparently were part of some extension package. So I was now left with a system that booted but had no network connectivity.

So travelled upstairs and got on the console. Ended up restoring from a backup. Didn't work. Tried various restores. Didn't work. Tried deleting the /boot directory and restoring that, except it wasn't on the backup. So now had a system that wouldn't even boot and just sat there at a grub prompt. Bugger.

So reinstalled the OS as a clean install. Thankfully it only took a few minutes. System then booted with full networking. Except now of course it was a blank system. So tried restoring the backup again and then rebooted. All came back up and instantly started working again. Relief. So expanded the volume and checked a few things. This wasn't bad activity considering I'd downed two bottles of wine during the panic.
So today I just monitored the logs. Thankfully everything was fine and, as a bonus, fail2ban was now working. So I added the WordPress jails and all seems well.
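For my own future reference, a WordPress jail in jail.local looks roughly like this. The filter name, log path and ban times here are illustrative rather than copied from my config, and WordPress needs something like the WP fail2ban plugin to write login failures to syslog in the first place.

```ini
# /etc/fail2ban/jail.local - illustrative values, not my exact config
[wordpress]
enabled  = true
port     = http,https
filter   = wordpress
logpath  = /var/log/auth.log
maxretry = 3
bantime  = 3600
```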
I’m not going to touch it again now until I come back.
Back wax lyrical
So this morning I had my back waxed by Sharlene. Yes, it's vain. I really don't care. I just don't like a hairy back, plus I can't reach it and I've known Sharlene for years. I shave everything else myself. Plus I only do it if I'm going somewhere where I'll be on a sun bed.
It rained a lot. The weather in this country is really pissing me off. I’ll miss the dogs, I won’t miss the rain. Still, managed to walk everyone and myself. Still haven’t reached my weight goal.
Sat down this evening and removed the private keys from my courier .pem files. These have always annoyed me as having them in there makes the private key readable. So I removed the private key, put it in its own key file and made it readable only by the courier user, updated the config and restarted the service. After a couple of cockups, mail reading was working again. I moved them to the certs folder; I'm happier now. I'll need to update the doc.
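The split itself is a couple of openssl commands. This sketch builds a throwaway self-signed pair standing in for the real courier .pem so it runs anywhere; on the server the key file also gets chowned to the courier user.

```shell
# Throwaway combined .pem standing in for the real courier one.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 -subj "/CN=mail" \
  -keyout key.pem -out cert.pem
cat key.pem cert.pem > imapd.pem

# Pull the private key out into its own file...
openssl pkey -in imapd.pem -out imapd.key
chmod 400 imapd.key   # readable by the service user only
# ...and keep just the certificate.
openssl x509 -in imapd.pem -out imapd.crt
```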
Also logged into the SFTP server and downloaded last night's backup, decrypted it with my private key and then tested the integrity of the tar file. All good.
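The restore test boils down to two commands. The post doesn't show the exact openssl mode, so this sketch assumes smime encryption to a certificate, and it builds its own throwaway key pair and archive so the commands actually run; in reality backup.tar.gz.enc is the file pulled down from the SFTP server.

```shell
set -e
# Throwaway key pair and archive standing in for my real backup key
# and last night's download.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 -subj "/CN=backup" \
  -keyout backup_key.pem -out backup_cert.pem
echo "some data" > data.txt
tar -czf backup.tar.gz data.txt
openssl smime -encrypt -binary -aes-256-cbc -outform DER \
  -in backup.tar.gz -out backup.tar.gz.enc backup_cert.pem

# The actual test: decrypt with the private key, then list the tar
# without extracting it to prove it's intact.
openssl smime -decrypt -binary -inform DER -in backup.tar.gz.enc \
  -inkey backup_key.pem -out restored.tar.gz
tar -tzf restored.tar.gz && echo "archive OK"
```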
My new hat arrived. I like it. Feel like I need matching shoes.
Now backed up to the cloud
So spent a few hours yesterday SSHed into the server. Got an account with 'adrive.com', which is one of the only ones which is a) cheap and b) allows FTP access, more importantly SFTP access. So created a public/private key pair with ssh-keygen. Uploaded the public key to 'adrive' and magically I can now log in with OpenSSH. This is handy as it has a batch mode, so you can log in and upload from a bash script. So I modified the backup script to back up to the tmp directory and then copy this to the external drive. It then uses openssl to encrypt the archive with another key, logs into the server using SFTP and uploads the file. Finding out whether the upload was successful was a bit more tricky. The script uses 'stat' to get the size of the local file, then uses 'ls -l' on the remote file and stores that in a file. I then had to use a series of 'sed', 'grep' and 'cut' commands to extract the remote file size. I dump this to a file and then just compare the two files. It failed multiple times in testing until I got the script right. Satisfaction.
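The moving parts, sketched with placeholder account details; the sftp call itself is commented out since it obviously needs the real account and key registered with adrive.

```shell
# One-off: create the key pair; the public half gets uploaded to adrive.
ssh-keygen -t ed25519 -N "" -f ./adrive_key

# Nightly: drive sftp in batch mode. With -b, sftp exits non-zero if
# any command in the batch fails, which the backup script can check.
BACKUP=backup.tar.gz.enc
printf 'put %s /backups/\nbye\n' "$BACKUP" > upload.batch
# sftp -b upload.batch -i ./adrive_key user@sftp.adrive.com
```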
Also bought a new hat.
Dodged a bullet
So I've now got to Saturday and failed to come down with anything. Much to be said for spraying vast amounts of chemicals up your nose. I did a fifteen-mile walk as well as walking the dogs. Got back and immersed myself in a bash script. The object of the exercise was to produce a script that could upload a file to an FTP server. First issue was said server requires an SSH public/private key to authenticate. To be fair, after a bit of Googling this wasn't much of an issue. Producing a batch file to do this, again, wasn't a problem. Validating the file was there, however, turned out to be a challenge. There isn't any response to say 'file is uploaded'. So ended up with a very interesting combination of 'sed' pipes to extract the remote file's size from a directory listing, and validated the upload by comparing that against the local file size. Still. It was a night in.
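The size check reduces to the pipeline below. Here the remote listing is faked into a file so the commands run anywhere; in the real script remote_ls.txt is captured from the sftp batch session's 'ls -l'.

```shell
# Fake a 1 MiB upload and the matching remote directory listing.
head -c 1048576 /dev/zero > backup.tar.gz.enc
echo '-rw-r--r--    1 user  users   1048576 Oct  1 02:00 backup.tar.gz.enc' > remote_ls.txt

# Squash runs of spaces so the size is always field 5, then compare.
remote_size=$(grep 'backup.tar.gz.enc' remote_ls.txt | sed 's/  */ /g' | cut -d' ' -f5)
local_size=$(stat -c%s backup.tar.gz.enc)
[ "$remote_size" = "$local_size" ] && echo "upload verified"
```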
The joys of FTP servers
So I have an FTP server set up on the, er hmm, server. In fact I have two of them running. This is due to apache running two web servers, each under its own domain and each under its own IP address. WordPress likes to update using FTP, but the directories are all owned by the website users (as in I have a unique user account per web domain). This is for security reasons; I don't want the user of one to be able to access the other. So each FTP server is bound to the IP address of its own domain and set to the certificate of that domain (even though it's only really communicating with itself inside the box; there is no external FTP access as it's blocked by two levels of firewalls). This all works fine, except when you want to transfer a file to / from the server. You can piss about and log into one of the existing servers using the credentials for that domain and end up uploading / downloading files from the /var/www/domain directory (when you've finally figured out which directory you actually have write access to). But that's an arse and I wanted to just be able to move files to some home-type directory.
So now I've added a third FTP server daemon. This one is bound to the DHCP IP address of the server (local subnet), and is just using the snake-oil certificate as again I'm only transferring inside the local network. I had to create a new unique user. But the fun thing is, when I logged in using the user credentials I got an error about chroot. It appears the only way to fix it was to make the user's home directory not writeable and then add a subdirectory under it called 'upload' and make this writeable by the same user. This works fine. Also you cannot escape the home directory, so all is good with the world.
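For the record, that chroot complaint sounds like vsftpd's "refusing to run with writable root inside chroot()" check; the post doesn't name the daemon, so that's an assumption. The relevant vsftpd.conf lines would be:

```ini
# Lock local users into their home directories.
chroot_local_user=YES
# The chroot root itself must NOT be writable; uploads go into a
# writable subdirectory (e.g. ~/upload) instead. Newer vsftpd can
# alternatively relax the check with:
# allow_writeable_chroot=YES
```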
Next job is remote server backup storage over SFTP. So far I've got as far as encrypting the backup files with a key using openssl. More joy for the weekend I'm sure.
She’s alive and has shiny new boots
So after almost two years of procrastinating and a PC on my bench, I've finally finished the upgrade. The server had been running Ubuntu 12.04 LTS (long-term support) for almost ten years. I know it's long-term support, but that ran out a few years ago. So started to upgrade it to 16.04 LTS, I think, and started writing down the entire procedure in a notebook. Some eighteen months later and it hadn't been touched. Decided it really was time, as sooner or later the server was going to die. I'd got as far as getting the web server working and updating all the SQL stuff, so basically this blog was working.
So a couple of weeks ago I decided to take what I did and actually document it in a Word document. Three weeks later and it's done (well, very nearly). It's completely rebuilt from scratch and now running Ubuntu 20.04.3 LTS, which is apparently good until 2030… I wonder if I'll upgrade in time. Doubt it. Everything was fairly smooth when I followed all the instructions correctly. The original Shorewall config didn't work, so that put up a fight trying to get something going there. Also I have a fault in fstab somewhere and need to work out which root disc to mount, but that's quite minor in the grand scheme of things.
Also upgraded the hardware as the old server was struggling with memory. Had a look around and found a ‘mini PC’ on Amazon for £270. It has 8GB of RAM and a 128GB SSD, plus six USB ports and VGA. So was pretty much perfect. It came with Windows 10 Pro, but soon said goodbye to that.
It's now all working perfectly except the mount issue, and it's getting late. But now I have a very secure, up-to-date server running Apache, Postfix, Courier and lots of other goodies. It's all set up correctly with SSL and a very strict double firewall in my DMZ.
So overall, very pleased.
So I finally updated my SSL certificates
Just a note for myself here as I won't remember it otherwise. Copy domain.crt and gd_bundle.crt to /etc/ssl/certs. But for postfix, concat domain.crt and gd_bundle.crt into server.pem; that seems to fix postfix. Apache and courier don't seem to require anything extra to work. Verify with TLS Receiver.
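The note above as actual commands, using placeholder cert contents so they run anywhere; on the server, domain.crt and gd_bundle.crt are the real files from the CA and server.pem lands in /etc/ssl/certs.

```shell
# Stand-ins for the real leaf cert and GoDaddy intermediate bundle.
echo 'LEAF-CERT' > domain.crt
echo 'CA-BUNDLE' > gd_bundle.crt

# Postfix wants the leaf certificate followed by the intermediate
# bundle in a single file.
cat domain.crt gd_bundle.crt > server.pem
```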
I’m sure the above makes a great deal of sense to you. Had a fairly mundane day apart from that. About to have a shower and eat salad.