Using a dedicated NUC for Roon and a Synology DS918+ NAS as the storage for all music seemed like the perfect solution. After a while something started to annoy me.
The four spinning hard drives in the DS918+ fell asleep after twenty minutes. So if I had not played any music for twenty minutes, they would have to spin up again before I could hear anything. This process probably takes less than a minute, but when encountered often enough it started to feel like an eternity.
I had already expanded the DS918+ with a 4GB SO-DIMM, and even though the extra memory is allegedly also used for caching, I did not notice any improvement. Besides, memory usage was quite low. My thoughts turned to the M.2 slots at the bottom of the NAS.
In articles and posts I read about the vulnerability of SSD caches, but I wondered how serious an issue this would be, especially for my use case. I would use a read-only cache that would not have to deal with constant, intensive writes, the idea being that it would only be filled with albums I regularly listen to.
So I ordered a Synology SNV3400-400G, a 400GB M.2 NVMe drive. The install was a piece of cake. No screws needed. Just pop out one of the covers on the bottom of the NAS, insert the drive, return the cover and start up the NAS. After that I only had to assign the SSD to cache duties.
At first, of course, the cache was empty, so when I played some music after the drives had gone to sleep, I still had to wait for the one-minute spin-up cycle. Even after playing the same album several times, I could hardly see the cache growing. I felt a bit disappointed.
I have no idea how the caching algorithm works, but I am starting to see a change. After a week I do not experience the lag anymore. Albums I play frequently start immediately. The cache hit rate over this week is over 80% which sounds good, no pun intended. The user interface of the NAS also feels snappier.
It is a bit surprising to see that the cache size is still only at 4.6GB. I play a lot of hi-res audio and would expect a larger cache. Let us see how the cache evolves with longer use.
Yesterday a computer, which I had ordered quite a while ago, finally arrived. It is an Intel NUC 10i7FNH with 64GB of memory and a 500GB Samsung 970 EVO Plus. I now have three of these. All the same specs.
I bought them over a period of several months. The 10i7FNH is not the most current model, yet the price of every machine I bought was higher than the previous one. Between the first and the last machine there is a price difference of 160 euros. Quite a difference if you take into account that the first one cost 790 euros. It is just another effect of COVID-19. Let us hope we can leave this whole pandemic behind us soon.
The Kubernetes cluster now has 144GB of RAM to run applications in. There are three master nodes for high availability, and with three masters etcd can maintain quorum even if one of them fails.
Adding another master and worker node to a running Kubernetes cluster is quite a job. I could not have done it without the help of this article.
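The exact steps depend on how the cluster was bootstrapped, but for a kubeadm-based cluster the join process looks roughly like the sketch below. The endpoint, token, hash and certificate key are placeholders, not values from my cluster.

```shell
# On an existing master: re-upload the control-plane certificates and
# print a fresh join command (the certificate key expires after 2 hours).
sudo kubeadm init phase upload-certs --upload-certs
sudo kubeadm token create --print-join-command

# On the new master: join as an additional control-plane member.
sudo kubeadm join cluster.example.com:6443 \
  --token <token> \
  --discovery-token-ca-cert-hash sha256:<hash> \
  --control-plane \
  --certificate-key <key>

# On the new worker: the same join command, minus the control-plane flags.
sudo kubeadm join cluster.example.com:6443 \
  --token <token> \
  --discovery-token-ca-cert-hash sha256:<hash>
```

Going from two masters to three is also the moment etcd gets a proper majority: with three members the cluster keeps quorum when one member is down.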
Now I can safely wait for one of the SSDs to break. Master nodes write so much data to disk, it’s just a matter of time before one of the consumer SSDs in the nodes breaks. Or at least that is my expectation. We will see.
When I read that Bitbucket Server is going to be discontinued in the future, I could have done two things. I could have waited as I could still use Bitbucket Server for quite a long time or I could go out and search for a new solution. I did the latter. Well, at least the searching part. I am still trying to find the best solution.
I am still trying to work with Bitbucket Cloud, but I am running into some issues:
I am still not very pleased with having to put the credentials for my Nexus server into someone else’s web application.
Pipelines in Bitbucket Cloud aren’t very fast.
Creating a Docker image with the spring-boot-maven-plugin fails at this time and it seems this problem isn’t going to be fixed any time soon.
I’d better have a look at GitLab and see what it can do for me, but there’s a good chance I’ll stick with Bitbucket Cloud and my own Jenkins server. More on that later.
COVID-19 and the possible financial ramifications made me back off big investments in audio. The physical size of my home did not help either.
An audio system made up of individual components takes up a lot of space. Something I tend to forget when I am listening to shiny speakers and a big stack of components in an audio store. The speakers I could easily place in the room. I would just have to remove my current speakers and the makeshift stands they are on.
Putting the amplifier somewhere is a different story. The cupboard under the tv is just too shallow to house a decent amplifier. A Naim Supernait 3, for example, is over 30 centimeters deep, and that is without cables connected to the back. Under the tv is practically the only place where I could put audio equipment. A solution would be to remove the cupboard under the tv and replace it with something else, but that would cost extra money.
Also, should I start my journey by getting my end-game rig? A Naim Supernait 3 and a pair of Spendor D7.2 speakers cost close to 10.000 euros, and that is without any cables. Should I not start a little smaller? Work my way up and learn what I like?
Would big speakers like the Spendor D7.2 even work in my small living room?
These complications continuously make me rethink what I should get. Should I start out simple, with for example a Cambridge Audio CXA81 with a pair of KEF R3s?
Or should I look at active speakers again? My mind wanders to the KEF LS50W again, but I am afraid the little drivers in that speaker would give me an underwhelming amount of bass. If only KEF made a wireless version of the R3.
The Poly and Mojo combination is a powerful bit of hardware, but you need a case. The two fit snugly together, but I did not feel comfortable walking around with them without a case keeping them fixed together. So I ordered the case. At 95 euros it is not cheap. It also took a month to arrive at my doorstep. According to the retailer where I bought it, this was due to COVID-19. Oh well, in the end I got it.
After setting up Roon I searched for the best way to use Roon and be sort of mobile with my Sennheiser HD 660 S. By mobile I mean having high quality audio that I can move around the house with relative ease. I started out with my iPhone connected to the Chord Mojo. This was not very convenient: the Roon app on the iPhone crashes a lot, and moving around a phone connected to a DAC with two cables was quite awkward.
On Chord Electronics’ site I saw their Poly, a streaming attachment to their Mojo DAC, is Roon ready. At €599 it is not exactly cheap, but it seemed like just the solution I needed.
So I ordered it and waited… I ordered it at wifimedia.eu and they didn’t have one in stock. Which is fine, it happens, you cannot have every product on the shelf. Unfortunately it was sent to me through DHL, which in my country isn’t exactly trustworthy. I stayed at home all day on a Saturday, they did not show up, and their tracking site then told me I was not at home that day.
When it finally came on Monday, I had to wait another six hours for it to charge. I then plugged it into the Mojo and set it up over Bluetooth through a bit of a clumsy app. I set it to Roon mode, added it to Roon and that was that.
My first impression was that the music now sounded a lot better than when I had the Mojo connected to my phone. Tighter, more balanced, voices sounded more lively and instruments more real. I did have an issue with playing DSD. It stuttered. Not quite sure what I can do about that or what the cause was.
I am not sure how battery life will be. I do not think I will get the play time advertised by Chord. Time will tell if charging takes too long. The Poly manual states you can charge while listening, but that charging will be slow. I also wonder how much heat will accumulate playing and charging at the same time.
If you get the Poly, also get the case: it keeps dust out of the gap where the two connect and it makes sure the Poly and Mojo stay attached. That will set you back another 90 to 100 euros…
It seems it’s the second time this week that Roon is having trouble with their systems. I’m playing Bob Dylan’s Desire as we speak over Tidal and Roon, but I’m reading lots of articles about an outage. I also can’t reach their knowledge base.
I’ve just read it’s due to a Google Cloud Platform outage. Let’s hope things get fixed soon.
About a month ago I tried out Roon on the fastest NUC I had, a 6th generation i5. Not exactly a recent machine. It had 32GB of memory and a 256GB M.2 drive.
I downloaded Roon ROCK, the Roon distribution specifically made for the NUC and burnt it to a USB stick. In less than 15 minutes Roon was up and running. Quite a small and quick install indeed. It immediately noticed the Bluesound Node2i and Naim Mu So QB Gen 2 on the network.
After reformatting a two-drive Synology I put all my music on it and configured it as a source in Roon.
Then I ran downstairs to check out the sound. The audio systems I have aren’t anything special, so I mostly enjoyed the ease of use of the Roon app on my iPad.
Just when my trial was up I came to the conclusion that I couldn’t spare that little old NUC as a server, so I ended the trial.
Then followed two weeks of missing the convenience of Roon. I had gotten used to sitting on my couch in the evening with a good glass of wine, going back in time and listening to a lot of music I loved, but had forgotten.
So I ordered an eighth generation NUC with an i7, 32GB of memory and the fastest 500GB M.2 drive I could find. It’s overkill, the 500GB and 32GB, but better safe than sorry. ROCK must do caching, right? It’s running on Linux after all.
When the parts arrived I put them together and immediately ordered a year’s subscription to Roon.
I can’t say it’s all running flawlessly. The iOS apps are the biggest problem. The iPhone app crashes quite often and it lacks a lot of features the iPad app has. The macOS app is brilliant though.
One day when I was listening the music just stopped and the Roon Core seemed to be unreachable. I could reach its web server and restart it, but that hasn’t happened again. Ever since the latest update it’s been clear sailing. Not for all though. There are a lot of articles on the Roon community site about Roon breaking its connection with Qobuz and Tidal.
One day I moved all my LXC containers to one host. This was done to use one of my NUCs as a Roon ROCK server. Moving the containers was easy with LXC. Just take a snapshot of the container and copy it to another server. Start it there and well, that was that.
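The move really is that short. Assuming the LXD command line, it can be sketched like this; the container name and the remote name are made up for the example.

```shell
# On the source host: snapshot the container
lxc snapshot bitbucket pre-move

# Copy the snapshot to the target host, which was added
# beforehand as a remote named "nuc2"
lxc copy bitbucket/pre-move nuc2:bitbucket

# Start the container on its new host
lxc start nuc2:bitbucket
```

After verifying the container runs happily on the new host, the original can be deleted to free up the NUC.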
In the back of my head a voice was telling me that all my LXC containers have boot.autostart set to true. The voice was also telling me this might become an issue. What if the Bitbucket server starts before the PostgreSQL server running on the same host?
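LXD does have knobs for exactly this ordering problem. A sketch, assuming LXD and made-up container names: give the database a higher autostart priority and delay the dependent container.

```shell
# Higher priority starts first
lxc config set postgresql boot.autostart.priority 10
lxc config set bitbucket boot.autostart.priority 5

# Wait 30 seconds after starting Bitbucket's predecessor,
# giving PostgreSQL time to accept connections
lxc config set bitbucket boot.autostart.delay 30
```

Note that a delay is a guess, not a guarantee; the database could still be slow to come up on a bad day.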
Anyway, quite soon, after a few reboots, I got into trouble. Bitbucket was stuck at “Migrating home directory”.
I’m not saying booting all containers at the same time is the problem. It might be. It might also be that I shut down the SQL server before Bitbucket.
Looking for a solution wasn’t easy, as I couldn’t find anything useful in the Bitbucket logs. The PostgreSQL log did contain this message:
******@bitbucket ERROR: function aurora_version() does not exist at character 8
Apparently there is a PostgreSQL-compatible database you can run on the Amazon cloud, called Aurora. You learn something new every day…
I thought I had found the root cause, but then realised that none of the people mentioning these log messages said their server failed to boot.
Then I started googling the message “Migrating home directory” and quickly found a solution. It turned out a migration lock in my database had never been released. This statement allowed my server to boot Bitbucket successfully again:
UPDATE DATABASECHANGELOGLOCK SET LOCKED=false, LOCKGRANTED=null, LOCKEDBY=null where ID=1;
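For context: the DATABASECHANGELOGLOCK table belongs to Liquibase, the schema migration tool Bitbucket uses, and the lock stays set if the process dies mid-migration. Before clearing it, it is worth checking who holds it; a sketch, assuming the database and user are both called bitbucket.

```shell
# Inspect the Liquibase lock row before touching it
psql -U bitbucket -d bitbucket \
  -c "SELECT id, locked, lockgranted, lockedby FROM databasechangeloglock;"
```

If LOCKEDBY points at a host and timestamp from a crashed startup rather than a running migration, clearing the row as above is safe.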