

  1. #1

    Server hardware stuff

    In this thread I'll try to keep track of what goes on with the server, hardware-wise.

    The last publicly documented server change was this: [completed] New server plans (These are no longer 'new' plans)
    We upgraded the server to a 6-core/12-thread Intel Core i7 5820K CPU with 64GB of RAM.

    Unfortunately that setup gave us trouble running VMware ESXi, which resulted in an unstable server. It took a few weeks to track down the exact cause: the problem was the CPU. If you ever do any PC building: it's almost never the CPU that's causing stability issues, unless you're overclocking. But that wasn't the case here.

    After that I decided to upgrade the CPU to a 14-core/28-thread Intel Xeon E5-2680 v4 with 96GB of RAM. This is what all our servers are currently running on, and it took care of the stability issues.

    Then the next issue was with server performance. Or to be more precise: disk performance. The servers are running off two 7200 rpm SATA drives. And even though they are connected to an Areca 1680i RAID controller with 4GB cache and a dual-core 1.2GHz PowerPC CPU, it's not enough to run all the additional servers we're now running. It used to be:
    • website
    • email
    • Minecraft, 10 or so instances.

    And now we added:
    • ARK Survival Evolved, 3 modded instances.

    Disk I/O was lagging behind and that caused some noticeable performance issues.

    So I purchased an Icy Dock ToughArmor 4 x 2.5" mobile rack for 1 x 5.25" device bay. In this dock I've placed three second-hand 300GB 10K rpm SAS drives. These disks are meant to take over the disk I/O that's been hammering the OS drives. I also swapped out the four 80GB Intel Postville SSDs and replaced them with two new Samsung 850 Pro 256GB SSDs.
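    As a rough sketch of why the extra 10K spindles should help, here's the back-of-the-envelope IOPS math. The per-drive numbers are common rules of thumb for random I/O, not measurements from this server:

```python
# Rough random-IOPS rules of thumb for spinning disks (assumptions,
# not measurements): a 7200 rpm SATA drive manages roughly 80 IOPS,
# a 10K rpm SAS drive roughly 140 IOPS.
SATA_7200_IOPS = 80
SAS_10K_IOPS = 140

# Old situation: everything ran on two 7200 rpm SATA drives.
old_budget = 2 * SATA_7200_IOPS

# New situation: three 10K SAS drives take over the VM I/O.
# Note that RAID-5 writes cost extra parity I/O, so this simple sum
# of raw per-drive IOPS is optimistic for write-heavy workloads.
new_budget = old_budget + 3 * SAS_10K_IOPS

print(old_budget)   # 160
print(new_budget)   # 580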

    The additional drives worked well for a few weeks. Unfortunately this morning at around 5:02AM one of those three drives failed. This is not a big deal since they were running in RAID-5. However it does mean I now have to move the virtual machines that were running on those drives back to the other disks they were on before. Which means we may experience some slowdowns in the time to come.

    The 10K rpm SAS drives were second-hand with no warranty, so the faulty drive will have to be replaced by buying another. We just had a beautiful baby girl, and with all the stuff we need for that I'm not allowed (haha ) to spend more money on my hobby. So if anyone wants to help out, please consider donating (see the front page). Those SAS drives cost about $55 each:
    HP 300GB 6G SAS 10K 2.5 inch

    At any rate the servers will continue to run. The ARK servers are running on two Samsung 850 PRO SSDs. It's just everything else that will take a performance hit now that it has to share the slower storage.

  2. #2
    Would this lower TPS on the freebietfc server? It was around 18 this morning with 2 players on, and around 14 now with 4 on.
    If it is, I'll let people know if they ask about it. And this is in no way a push to get it fixed; family comes first .

  3. #3
    I'm not sure if TPS will drop when loading chunks takes a bit longer. It might.

    As for fixing the issue, that won't take much time. If/when I have the funds it's just a matter of ordering a drive, then pulling the defective drive from the server and replacing it with the new one. The RAID controller will then start rebuilding the array automatically. After that's done I'll move the virtual machines back to the SAS drives. All in all it won't take more than half an hour of button pushing. The rebuilding and virtual machine moving will take longer, but that's just a progress bar filling up.

  4. #4
    Is there anything I can do to check what may be causing the TPS drop?
    I used /lag and the entities were around 4000, below the 10k number I have seen in the message.
    Also, at one point mem had 646 or so remaining.

    /lag now shows:
    2065 chunks, 990 entities, 126,537 tiles. It looks like it reset about 3 hours ago and is back to 19.87 TPS.
    Though the one on the TAB screen shows a different number of about 4. Bukkit TPS?
    Last edited by Rainnmannx; 27th March 2017 at 02:12.

  5. #5
    /lag shows TPS from the Bukkit side of the server, the same value shown when pressing TAB. Use /forge tps instead to get a more accurate reading of how the server is doing. Bukkit TPS tends to always be slightly lower than the Forge TPS and less than 20 (19.xx). To me it looks like one sits on top of the other, since Forge and Forge mods seem to take precedence over Bukkit and Bukkit plugins.
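    For reference, here's roughly where numbers like 19.87 come from: TPS is derived from the average tick time, capped at 20. This is a minimal sketch of that calculation; the fixed-size averaging window is an assumption, not the exact formula either server side uses:

```python
# Minecraft targets 20 ticks per second, i.e. 50 ms per tick.
# If ticks take longer than 50 ms on average, TPS drops below 20;
# ticks that finish early don't push TPS above 20.
def tps_from_tick_times(tick_times_ms):
    avg = sum(tick_times_ms) / len(tick_times_ms)
    return min(20.0, 1000.0 / avg)

print(tps_from_tick_times([50.0] * 100))   # 20.0 (healthy server)
print(tps_from_tick_times([100.0] * 100))  # 10.0 (badly lagging)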

    The TAB TPS being low was due to a glitch in BungeeCord and the plugin that takes care of that information. I've restarted BungeeCord and now the 'Bukkit TPS' in the TAB screen is similar to that when you type /lag or /tps

    [Attachment: 2017-03-27_10.53.00.png]

    [Attachment: 2017-03-27_10.54.13.png]

    Bottom pic shows output of:
    /lag
    /tps
    /forge tps

  6. #6
    I received three donations last night. Thanks, guys!
    With this I'm going to purchase two new drives. Unfortunately the drives I linked above won't ship to the Netherlands. Buying them here (European Amazon) they are €89.98 each. Which means that to avoid an angry wife we could really use another donation or two

    The two drives will replace the faulty one and expand the array, giving us a net storage of 900GB with more I/O. (RAID-5 usable capacity is n-1 drives, meaning: 4x300GB - 300GB.) The drives should arrive in a few days.
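    The capacity figure follows from the usual RAID-5 formula; a quick sketch:

```python
def raid5_capacity(num_drives, drive_size_gb):
    """RAID-5 spreads one drive's worth of parity across the array,
    so usable capacity is (n - 1) times the size of a single drive."""
    assert num_drives >= 3, "RAID-5 needs at least three drives"
    return (num_drives - 1) * drive_size_gb

print(raid5_capacity(3, 300))  # 600 GB: the original three-drive array
print(raid5_capacity(4, 300))  # 900 GB: after replacing the faulty drive
                               # and adding the second new one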

  7. #7
    Quote Originally Posted by InsaneJ View Post
    In this thread I'll try to keep track of what goes on with the server, hardware-wise. [...]
    Hi InsaneJ,
    I have a small job as a web developer and realised I have an unused 860 EVO 256GB lying around, which I can sell to you for cheap. If I'm right you live in the Netherlands as well. I would also suggest running the website on a cheap hosting service to make some space on the SSDs for ARK and MC. Website hosting is really cheap at the moment. I can also host something for you for free if you're interested. So contact me if you want anything.
    Greetings, Bram.
    Last edited by bram_dc; 23rd October 2018 at 00:36.

  8. #8
    Quote Originally Posted by bram_dc View Post
    Hi InsaneJ, [...]
    Thanks for your offer, Bram

    Right now we have no shortage of disk storage. Recently I've upgraded our server with two 6TB SAS drives. In addition to four 300GB 10K SAS drives, two 256GB Samsung 850 PRO SSDs, six 3TB WD Red drives and a 1TB NVMe SSD for caching, we have plenty of space to put everything.

    Compared to hosting ARK and heavily modded Minecraft servers, the web server doesn't use up a whole lot of resources. We upgraded our server to 128GB of RAM a short while ago. Right now that is sufficient to run all the servers we want.

    All those figures combined add up to quite a large sum of money. But compared to having to rent servers, it's cheap. Also, it's a bit of a hobby

  9. #9
    Some of you may know about our crazy plans to set up another HappyDiggers server in the US. That plan seems to be going forward. Jiro bought my current 14-core Xeon, motherboard and 64GB of RAM. This allowed me to buy the following parts to upgrade our current server with:

    AMD Ryzen Threadripper 1920X 12 core / 24 thread - CPU € 335,00
    ASRock X399 TAICHI - Motherboard € 348,33
    Noctua NH-U14S - CPU cooler € 79,95
    Corsair RMx Series RM850x (2018) - power supply € 118,00
    USB drive € 15,99

    I put together these components along with 32GB of RAM and an old Nvidia Quadro card I had lying around. Unfortunately the Quadro shorted and let the smoke out. Yikes...

    After that I removed the card from the motherboard and booted the system. Oddly enough it displayed the all-OK POST codes, and the num lock LED on the keyboard turned on/off when pressing the num lock key. So it seems that apart from the Quadro the other hardware is fine.

    I don't have any other graphics cards lying around to test with and I don't feel like taking one out of our HTPC. So I've bought an ATI-102-B17002(B) 256MB PCI-e x1 card for 2 euro. That's right. ATI. From way back before it was bought by AMD. Essentially all it has to do is display the BIOS and a VMWare terminal. If I could have found a 4MB non-3D card that fits in a PCI-e x1 slot I would have gotten that. But this card should do nicely

    Once I get the 'new' graphics card I'll try installing VMWare and do some testing. If that all goes well I'll do the server upgrade after that. This involves removing the current motherboard and expansion cards from the server case and replacing it with the new parts. This shouldn't take too long. After that I'll have to do some work on VMWare to get all the VMs up and running again.

    I won't send the parts to Jiro after that just yet. He asked me to wait until he can upgrade his Internet connection to allow for more upstream bandwidth. This means the current server will continue to run with 128GB of memory, but with a faster CPU.

    Pics!

  10. #10
    I think at one point I had an ATI card with specs similar to your "new" one. It met the same fate as your old Quadro
