A week after the release of macOS Mojave I decided to do the upgrade. During the last few major macOS upgrades I have not experienced any problems whatsoever, but this time I ran into a small problem that took me about 30 minutes to figure out.
I downloaded macOS Mojave and was ready to install, but for some reason the installation would not go through. At the last step of the installer, where you select the disk on which to install the new macOS, I got the following error: “Unable to install, Damaged Core Storage Users”. This error did not make any sense at all. A little Google kung fu did not give me any useful results, but after a while of searching I started to wonder whether FileVault could be the issue. The data on my MacBook Pro is encrypted using macOS FileVault, so perhaps there was a problem between my user and FileVault. I opened System Preferences, went to Security & Privacy and then the FileVault tab, and bingo: my user was unable to unlock the disk.
The trick is to click the “Enable Users” button and choose your user. Then quit the macOS Mojave installer and reopen it; now you are able to complete the installation. An hour later I was running macOS Mojave, nice. I love the dark UI, by the way.
After some consideration I decided that today was the day to upgrade my Intel NUC to Ubuntu 18.04 LTS server (Bionic Beaver). The upgrade took 15 minutes with no problems at all, yay. After reading some articles on the big Internet I ended up with the following approach:
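For reference, the approach boils down to a couple of commands; this is just a sketch, since `do-release-upgrade` walks you through the rest interactively and the exact prompts may vary:

```shell
# Make sure the current install is fully up to date first
sudo apt update
sudo apt upgrade

# Start the interactive release upgrade to 18.04 (Bionic Beaver)
sudo do-release-upgrade
```

`do-release-upgrade` is Ubuntu's standard tool for jumping between releases and handles the sources.list changes for you.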
At some point you are asked whether the upgrade should remove obsolete packages. Say no to that; from what I have heard it can cause problems later on. Instead, run the apt autoremove command after the upgrade has finished. Ubuntu 18.04 contains the following new versions:
- Python 3.6.5 (from 3.5.1)
- Ruby 2.5 (from 2.3)
- Go 1.10 (from 1.6)
- PHP 7.2 (from 7.0)
- Node.js 8.10 (from 4.2.6)
Let us see how the system runs over the next few days. My PostgreSQL version is now 10, and both my JIRA and Confluence use it as their primary datastore. But all is good, so far.
PostgreSQL autovacuum daemon
PostgreSQL has an autovacuum daemon that does database housekeeping for the following reasons:
- To recover or reuse disk space occupied by updated or deleted rows.
- To update data statistics used by the PostgreSQL query planner.
- To update the visibility map, which speeds up index-only scans.
- To protect against loss of very old data due to transaction ID wraparound or multixact ID wraparound.
Recent versions of PostgreSQL have autovacuum enabled by default, but you can check whether the daemon is running with the following command:
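A sketch of such a check, assuming the default `postgres` OS user and a local `psql`:

```shell
# Ask the server whether autovacuum is enabled (prints "on" or "off")
sudo -u postgres psql -c "SHOW autovacuum;"

# Or look for the autovacuum launcher process directly
ps -ef | grep '[a]utovacuum'
```

The `[a]utovacuum` pattern is a small grep trick that keeps the grep process itself out of the match.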
PostgreSQL full vacuum with crontab
It might be overkill, as my PostgreSQL databases are not that big, but I still like to do a full vacuum from time to time.
It is easy to set up a crontab entry that runs a full database vacuum at a specific time.
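A sketch of what such a crontab entry could look like; the schedule, the log path and running it as the `postgres` user are assumptions, while `vacuumdb` itself ships with PostgreSQL:

```shell
# Edit the postgres user's crontab:
#   sudo crontab -u postgres -e
#
# Run a full vacuum with analyze on all databases every Sunday at 03:00
0 3 * * 0 /usr/bin/vacuumdb --all --full --analyze >> /var/log/vacuumdb.log 2>&1
```

Note that `--full` takes exclusive locks and rewrites the tables, so schedule it for a quiet period.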
This is just the basics, but it is still powerful enough to use.
Finally got Langhorn Web up and running. Over the last year I have tried to convince myself that the best way to learn about Confluence, JIRA, Bamboo and Bitbucket is to read about, use and experiment with the applications.
Well, I use Confluence, JIRA, Bamboo and Bitbucket at work on a daily basis and assist/consult Netic A/S customers with configuration and hosting issues regarding these applications. But I would like to have my own Confluence running, where I can share my code (some of it) and my experiences (some of them) when it comes to the Atlassian application suite.
So should I run my own Confluence at home or use one in the cloud? I decided to use my Intel NUC as the server for my Confluence setup, and I must say, this tiny little machine can do the work. So Langhorn Web is running on an Intel NUC with an i3 CPU, 16 GB RAM and a 256 GB SSD with Ubuntu Server installed. Works like a charm.