New Server Project

TopHatProductions115

  • 1.  New Server Project

    Posted Jul 23, 2020 10:10 PM

    Hello! It's been a while since I last posted here with my own topic. I now have a dedicated ESXi server in the works. The server project is meant to replace (and exceed) my previous workstation - a Dell Precision T7500. Here are the specs for the hardware:

     

    HPE ProLiant DL580 G7

     

     

        OS   :: VMware ESXi 6.5u3 Enterprise Plus
        CPU  :: 4x Intel Xeon E7-8870's (10c/20t each; 40c/80t total)
        RAM  :: 256GB (64x4GB) PC3-10600R DDR3-1333 ECC
        PCIe :: 1x HP 512843-001/591196-001 System I/O board + 
                    1x HP 588137-B21; 591205-001/591204-001 PCIe Riser board
        GPU  :: 1x nVIDIA GeForce GTX Titan Xp +
                    1x AMD FirePro S9300 x2 (2x "AMD Radeon Fury X's")
        SFX  :: 1x Creative Sound Blaster Audigy Rx
        NIC  :: 1x HPE NC524SFP (489892-B21) +
                    2x Silicom PE310G4SPI9L-XR-CX3's
        STR  :: 1x HP Smart Array P410i Controller (integrated) +
                    1x HGST HUSMM8040ASS200 MLC 400GB SSD (ESXi, vCenter Appliance, ISOs) + 
                    4x HP 507127-B21 300GB HDDs (ESXi guest datastores) +
                    1x Western Digital WD Blue 3D NAND 500GB SSD + 
                    1x Intel 320 Series SSDSA2CW600G3 600GB SSD +
                    1x Seagate Video ST500VT003 500GB HDD
        STR  :: 1x LSI SAS 9201-16e HBA SAS card +
                    1x Mini-SAS SFF-8088 cable + 
                            1x Dell EMC KTN-STL3 (15x 3.5in HDD enclosure) + 
                                    4x HITACHI Ultrastar HUH728080AL4205 8TB HDDs +
                                    4x IBM Storewise XIV v7000 98Y3241 4TB HDDs
        I/O  :: 1x Inateck KU8212 (USB 3.2) +
                    1x Logitech K845 (Cherry MX Blue) +
                    1x Dell MS819 Wired Mouse
                1x Sonnet Allegro USB3-PRO-4P10-E (USB 3.X) +
                    1x LG WH16NS40 BD-RE ODD
        PRP  :: 1x Samsung ViewFinity S70A UHD 32" (S32A700)
                1x Sony Optiarc BluRay drive
        PSU  :: 4x HP 1200W PSUs (441830-001/438203-001)

     

     


    The details for the ProLiant DL380 Gen9 will appear here once data migration is complete. VMware Horizon (VDI) will have to wait for a future phase (if it's implemented at all). The current state of self-hosted VDI is Windows-centric, with second-class support for Linux and no proper support for macOS.

    The planned software/VM configurations have been moved back to the LTT post, and will be changing often for the foreseeable future.

    Product links and details can be found here.

     

    ESXi itself is usually run from a USB thumb drive, but I have a drive dedicated to it. No harm done. A small amount of thin provisioning/overbooking (RAM only) won't hurt. macOS and Linux would have gotten a Radeon/FirePro (e.g., an RX Vega 64) for best compatibility and stability, but market forces originally prevented this. Windows 10 gets the Audigy Rx and the Titan Xp. The macOS and Linux VMs get whatever audio the FirePro S9300 x2 can provide. The whole purpose of Nextcloud is to phase out the use of Google Drive/Photos, iCloud, Box.com, and other externally-hosted cloud services (Mega can stay, though).

     

    There are three other mirrors for this project, in case you're interested in following individual conversations from the other sites (in addition to this thread).

     

    P.S. Out of all the sites that I've ever used, this forum has one of the best WYSIWYG editors I've seen in a while. 🙂

    Kudos to the devs!



  • 2.  RE: New Server Project

    Posted Jul 24, 2020 10:08 AM

    Just a quick note that flash read cache is now deprecated - vFlash Read Cache Deprecation Announced - VMware vSphere Blog

    Edited to add that you would be violating the macOS EULA (and, I suspect, VMware's EULA) by running macOS on non-Apple hardware.

    It also looks like the latest version of ESXi that a DL580 G7 supports is 6.0 U3.



  • 3.  RE: New Server Project

    Posted Jul 25, 2020 02:19 AM

    The version(s) of ESXi that apply to my use case were released before the deprecation, weren't they? Did VMware go back and retroactively remove the feature from older ESXi images? Or is it just no longer officially supported? Just trying to make sure I know what I'll be facing.



  • 4.  RE: New Server Project

    Posted Jul 25, 2020 07:05 AM

    Deprecated means the feature will be removed in a future release; however, it remains in earlier versions.

    Out of interest, why are you choosing a DL580? Are you able to get one super cheap? I would have thought it would cost an absolute fortune to run.



  • 5.  RE: New Server Project

    Posted Jul 25, 2020 07:49 PM
    1. Thanks for the clarification on deprecation.
    2. I've been using a Dell Precision T7500 with 48GB of DDR3 ECC and a mid-range graphics card. The DL580 G7 came to me for under 300 USD, and can take a lot of the parts that I already have on-hand. Parts for it also have been super cheap lately, which allows me to have spare parts inventory for it (in case something goes wrong). I'm also working toward using HPE-branded drives, to keep the fans from ramping up too easily. The only concern I have at the moment is power consumption, but that's being looked into by my household already.


  • 6.  RE: New Server Project

    Posted Jul 24, 2020 03:11 PM

    Moderator: Running MacOS on anything other than Apple hardware would be a violation of the Apple EULA and cannot be discussed here.

    This post can remain in relation to all the other things you are doing, but please keep MacOS out of any discussion.



  • 7.  RE: New Server Project

    Posted Jul 25, 2020 02:17 AM

    Understood. I'll keep MacOS out of the discussion here. Thanks for the heads-up.



  • 8.  RE: New Server Project

    Posted Jul 27, 2020 02:59 AM

    Currently waiting on some HP-branded SAS drives for the server, since those should improve the acoustics (reduced sound output). Can't wait to test them out when they arrive. Then I'll be able to test a VMware ISO on the server. I'll be sure to document how that goes.



  • 9.  RE: New Server Project

    Posted Jul 28, 2020 09:13 PM

    Made some changes to the SAS HDD choices I'm using, for compatibility and acoustics reasons. While I could go and LLF the wacky NetApp drives I purchased, I'd still have to put up with a noisier server afterward. I'd rather move in a different direction, and restrict that issue to my choices in PCIe cards instead. Also removed the old HITACHI HDD, since it didn't really belong in this project - it's SATA 1 or 2, IIRC. Here are the items I kicked from the project:

    • (1x) 250GB HITACHI HTS542525K9SA00
    • (4x) 600GB HGST NetApp X422A-R5 SAS

    Still looking to see if I can get the Dell mouse...



  • 10.  RE: New Server Project

    Posted Jul 30, 2020 03:22 AM

    Currently looking into making a custom ESXi 6.5 image for the DL580 G7, since official support was axed after 6.0. I already own the license, and I'd rather not waste it in laziness. It wouldn't be the first time I had to do something like this. On a side note:



  • 11.  RE: New Server Project

    Posted Jul 30, 2020 09:43 PM

    Just removed a Tesla K10 from the project. It's been reduced to a spare component, for the sake of noise reduction and power concerns. Artix Linux is no longer in line to receive a GPU. MacOS will take over the F@H role. If you have any questions, feel free to ask.



  • 12.  RE: New Server Project

    Posted Aug 01, 2020 10:16 PM
    Once I buy this cable (to power the HBA disk array), the server project will be ready to go. I definitely should list the E7-2870's, since I can't use those with the server.

    The interesting part is, it has Molex and other connector ends on it, too. Multipurpose...



  • 13.  RE: New Server Project

    Posted Aug 09, 2020 03:46 PM

    I've ordered the cable. Now I'm just waiting on the postal service to get it to me, so I can (possibly) begin the project this Wednesday; the last week will go to pre-installation prep. I should be able to test an ESXi 6.5 installation today, as long as nothing interferes this afternoon...



  • 14.  RE: New Server Project

    Posted Aug 11, 2020 03:49 AM

    Delaying initial ESXi testing to this Thursday, since today got swamped with unexpected events.



  • 15.  RE: New Server Project

    Posted Aug 14, 2020 03:35 AM

    Currently looking into VMware Horizon 7, for app virtualisation. Time to see if I can beat Turbo/Spoon at their own game.

    If I can move UnGoogledChromium and KeePass 2 to a remote instance, I'd call that the first step to success...



  • 16.  RE: New Server Project

    Posted Aug 15, 2020 01:32 PM

    Replaced the Rosewill RASA-11001 with a Kingwin MKS-435TL, since the latter doesn't need Molex power and has a cleaner look to it.



  • 17.  RE: New Server Project

    Posted Aug 16, 2020 03:33 AM

    Grabbed the wired mouse. Now back to the waiting game, to see when everything will arrive in the mail...



  • 18.  RE: New Server Project

    Posted Aug 21, 2020 02:58 AM

    The new drive cage arrived via Amazon. There's only one item left that hasn't arrived in the mail yet - my new mouse...



  • 19.  RE: New Server Project

    Posted Aug 26, 2020 11:30 PM


  • 20.  RE: New Server Project

    Posted Aug 28, 2020 02:30 AM

    Onto the next issue:

    Also have to look into this at some point (even though the server will be mostly hiding behind a VPN):



  • 21.  RE: New Server Project

    Posted Aug 28, 2020 09:04 PM

    Now to address the SSD issue in the background, while I look at security patches and initial VM setup.



  • 22.  RE: New Server Project

    Posted Aug 28, 2020 11:50 PM

    Currently trying to make a new datastore via SSH. More info here:
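
    For reference, a sketch of how that usually goes from the ESXi shell. The device identifier and datastore name below are placeholders, and the long GUID is the standard VMFS partition-type GUID; double-check everything against VMware's KB before running anything like this:

```shell
# List the available disks and pick the target device
# (the naa.* identifier below is hypothetical)
ls /vmfs/devices/disks/

# Write a GPT label, then a single VMFS partition spanning the usable sectors
partedUtil mklabel /vmfs/devices/disks/naa.XXXXXXXX gpt
partedUtil getUsableSectors /vmfs/devices/disks/naa.XXXXXXXX
partedUtil setptbl /vmfs/devices/disks/naa.XXXXXXXX gpt \
  "1 2048 <last-usable-sector> AA31E02A400F11DB9590000C2911D1B8 0"

# Create the datastore on partition 1 (vmfs5 also works on 6.5)
vmkfstools -C vmfs6 -S NewDatastore /vmfs/devices/disks/naa.XXXXXXXX:1
```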



  • 23.  RE: New Server Project

    Posted Aug 30, 2020 04:43 AM


  • 24.  RE: New Server Project

    Posted Aug 30, 2020 11:48 PM

    While I'm waiting on comments for the previous issue, and for a few ISOs to upload to my server, I can start investigating this:

    Gotta search for the patches through this page, by entering the details mentioned in the last 3 KB pages:

    On a side note, I also ran into this when setting up my first VM:

    Reserve Memory beforehand, I guess.
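
    For context, the "Reserve Memory" complaint comes up because VMs with passthrough devices need their full memory reserved up front. The usual fix is the "Reserve all guest memory" checkbox, which (as far as I can tell) maps to .vmx entries along these lines - the values assume a hypothetical 16 GB guest:

```
# <vm>.vmx fragment - reserve and pin all guest memory
# (16384 MB matches a 16 GB guest; adjust to the VM's memsize)
sched.mem.min = "16384"
sched.mem.minSize = "16384"
sched.mem.pin = "TRUE"
```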



  • 25.  RE: New Server Project

    Posted Sep 06, 2020 03:25 PM

    Just finished installing the vCenter Appliance, and will be using the FLEX (Flash web) client to set up the SSD for Virtual Flash in a bit. Stay tuned 🙂 One step closer...

    TXP-Network Does :: ESXi Test Stream! - YouTube



  • 26.  RE: New Server Project

    Posted Sep 11, 2020 03:48 AM

    Solved the SSD/Virtual Flash issue! Now onto the next one...



  • 27.  RE: New Server Project

    Posted Sep 11, 2020 02:18 PM

    Removed the HP 491838-001 (NC375i) - https://support.hpe.com/hpesc/public/docDisplay?docId=emr_na-c01951393 - due to space constraints, increased RAM to 128GB, purchased 4TB HDDs to replace the 2TB HUA722020ALA330's, and delayed the addition of the SolarFlare NIC.



  • 28.  RE: New Server Project

    Posted Sep 13, 2020 01:18 PM

    Currently working on DNS, after which I'll focus on setting up the first VPN solution - SoftEther.



  • 29.  RE: New Server Project

    Posted Sep 16, 2020 12:50 AM


  • 30.  RE: New Server Project

    Posted Sep 17, 2020 01:15 PM

    Just got a new GPU in the mail, which may end up replacing the GTX 1060 6GB. Still troubleshooting this issue...



  • 31.  RE: New Server Project

    Posted Sep 19, 2020 01:17 PM


  • 32.  RE: New Server Project

    Posted Sep 20, 2020 09:32 PM


  • 33.  RE: New Server Project

    Posted Sep 23, 2020 10:23 AM

    The K80's are coming...



  • 34.  RE: New Server Project

    Posted Sep 26, 2020 09:12 PM

    I've got a GRID K520 coming in the mail in about 2 weeks, to replace the Tesla K10. Perhaps I can buy a GRID K2 in the near future, so that I can have all three of the major variants for this card. GRID K520 looks like a GeForce card from inside a VM, if my memory isn't failing me. GRID K2 would be the Quadro variant. Tesla K10 is a pure compute version. I wonder if anything like that exists for Tesla K80...



  • 35.  RE: New Server Project

    Posted Sep 27, 2020 08:26 PM

    Just solved another looming issue for the server project. Now to get that SSD working and added to the Virtual Flash resource pool...

    TXP-Network Does :: ESXi Server - HBA Storage Array Update! - YouTube



  • 36.  RE: New Server Project

    Posted Oct 12, 2020 01:26 PM

    Time for a long-overdue project update. I'm omitting a lot of steps/details here, for relative brevity. A friend of mine from Discord (the same one who was kind enough to help me troubleshoot many of the issues I encountered) had me run a Linux LiveCD on the server to troubleshoot the LSI HBA. For those of you who did not know, the LSI HBA wasn't working as expected until a few hours ago (late last night). I tested it in my current workstation (Precision T7500 - Windows 10), the server (DL580 G7 - ESXi 6.5u3), and even on my laptop (EliteBook 8770w - Windows 10). On the T7500, the HBA showed up - but none of the 4TB hard drives did. Same for the laptop and the server.

    After a bit of Googling (as the cool kids say), I decided it might behoove me to flash the HBA with the IT firmware, to see if that would fix it. I did so from my laptop, using a powered PCIe dock (to prevent further downtime on the T7500, which runs a Minecraft server) and a GUI application called MegaRAID Storage Manager. The HBA was on v17.X, and now it's on v20.X. The drives also appeared in Windows Device Manager for once - but they didn't stay there for long, popping in and out sporadically. I was instructed to reboot after the firmware update was applied, and MegaRAID Storage Manager stopped being able to connect to the local server after that reboot. That meant that, if the firmware I flashed was the wrong one, I'd have to resort to using sas2flash.

    After no luck checking on the HBA from my laptop, I put it in the server with the Linux LiveCD (as mentioned earlier). The LiveCD was running an older build of Manjaro, and managed to see all of the drives in GParted. However, we were unable to get SMART data for most of the HDDs - if you look closely at the HDD models, you may or may not be able to tell why. While I was in the LiveCD, I also tried writing a GPT partition table to the Intel SSD, since messing with it in Windows simply did not work for some reason. A short while later, we tried the latest Manjaro LiveCD available (because Manjaro is my preferred distro with systemd). That one didn't see the drives at all, but did still see the HBA.

    At this point, I saw no other way to validate the HDDs further, so I decided to test them in ESXi and try to pull SMART data from esxcli. The drives showed up in ESXi, and even let us pull SMART data - but it was limited, and in a different format than most common drives on the market. I was able to add the Intel SSD to the Virtual Flash pool for once, though. As such, this is strictly a partial victory: the drives are ready for use, presumably, but we don't know how they're doing - very different from all of my previous experiences, where I could pull up SMART data immediately after installing the drives. The game is afoot.
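
    For anyone curious, the limited SMART pull went through esxcli's storage namespace. A quick sketch (the device identifier is hypothetical):

```shell
# Find the device identifiers ESXi assigned to the drives
esxcli storage core device list

# Pull whatever SMART attributes the drive exposes
# (SAS drives often report far fewer fields than common SATA ones)
esxcli storage core device smart get -d naa.5000cca2XXXXXXXX
```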



  • 37.  RE: New Server Project

    Posted Oct 14, 2020 12:12 AM

    On a side note, the results of last night's livestreaming attempt are tempting me to make YouPHPTube part of the project again. If this keeps up, I might actually go for it...



  • 38.  RE: New Server Project

    Posted Oct 17, 2020 02:10 PM


  • 39.  RE: New Server Project

    Posted Oct 31, 2020 01:16 AM


  • 40.  RE: New Server Project

    Posted Nov 08, 2020 04:41 AM

    ToDo List for the next few days:

    • Figure out Split Horizon DNS records (Technitium)
    • Set up ejabberd and hMailServer
      • FQDNs and subdomains
      • AD/LDAP integrations
    • Set up Artix Linux VM
      • secondary Technitium instance (AD DNS forwarding)


  • 41.  RE: New Server Project



  • 42.  RE: New Server Project

    Posted May 16, 2022 09:41 PM

    Thanks for this link



  • 43.  RE: New Server Project

    Posted Nov 11, 2020 07:21 PM

    Just removed YaCy from the project, in favour of researching YaCy Grid. Here's to hoping I can get it working in a shared environment...



  • 44.  RE: New Server Project

    Posted Nov 28, 2020 08:02 PM

    I currently have a PCIe WiFi NIC coming in the mail, plus a pair of Ethernet NICs sitting in inventory, and the server already has a SolarFlare SFN5322F in it. What if I threw FRRouting onto a Linux VM and passed the mentioned NICs through to it? Sounds like a virtual managed switch in the making.

    The idea: have the Linux VM use the wireless NIC to connect to the house WiFi on one network (192.168.1.0), sitting at an arbitrary address (perhaps 192.168.1.2). Then use the wired NICs for an internally-managed network (10.0.0.0). Set up the Linux VM as the default gateway (maybe 10.12.7.1), and have it handle DHCP and internal DNS. The last step would be to route all outbound traffic from clients on 10.0.0.0 through 10.12.7.1 => 192.168.1.2. All outbound traffic from 10.0.0.0 clients would then appear to come from 192.168.1.2, which is essentially NAT (many clients/private IPs behind one gateway/public IP). Set up forwarding rules and throw the Linux VM at 192.168.1.2 into the DMZ (since port forwarding on the new ISP router is utter garbage for some reason).

    That would kill off the need for a router/extender in my room. Also still need to work on this. The rack-mounting kit for my server is ~200 USD by itself - yikes...
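
    If the FRRouting VM idea goes ahead, the NAT half of it is fairly standard on Linux. A minimal sketch with iptables - the interface names (wlan0 for the WiFi uplink, eth0 for the wired side) are assumptions:

```shell
# Let the kernel route between the wired LAN and the WiFi uplink
sysctl -w net.ipv4.ip_forward=1

# Masquerade outbound traffic so 10.0.0.0/8 clients appear as the VM's
# WiFi address (192.168.1.2) - i.e., plain source NAT
iptables -t nat -A POSTROUTING -s 10.0.0.0/8 -o wlan0 -j MASQUERADE

# Forward LAN->uplink freely; only allow established replies back in
iptables -A FORWARD -i eth0 -o wlan0 -s 10.0.0.0/8 -j ACCEPT
iptables -A FORWARD -i wlan0 -o eth0 -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT
```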



  • 45.  RE: New Server Project



  • 46.  RE: New Server Project

    Posted Jan 03, 2021 09:59 PM

    It's been a slow weekend playing with the server. On Thursday, I couldn't get anything done because of New Year's (which I am fine with). On Friday, I slept in due to how late I stayed up, and then had surprise visitors; I didn't get any work done that day, since I was busy keeping the visitors' kids out of the room. On Saturday, I finally got to throw in the HP NC524SFP NIC (along with its memory module). Once they were attached to the SPI board, I fired up the server and checked whether the 16TB drive cage and ~1TB Virtual Flash resource pool showed up in ESXi - which they did.

    FYI, just about every time I add new hardware to the DL580 G7, I check for those two things - because, strangely enough, they tend to act as immediate indicators that something is wrong, at least when no other problem indicators are present (and there rarely are). vCenter has thrown an occasional warning, but nothing of consequence from what I've seen thus far.

    After that, I spent most of last night changing my AD and DNS settings, to prepare for adding my first devices to AD. That went on until close to midnight, and is still not quite done yet. Today, I replaced the SolarFlare SFN5322F with an HPE 641255-001 (PCIe ioDuo MLC I/O Accelerator) - a gutsy move, given how finicky the server can be about new hardware. At first, only 2 of the 4 SAS HDDs showed up in ESXi. After a reboot, and letting the server warm up for a bit, all storage devices and new components showed up. So far, so good!

    However, due to how slow testing has been, I had to put off testing the Tesla K80's and DERAPID PCE-AX200T wireless NIC. If I can get the DERAPID PCE-AX200T working, the Linux VM is definitely going to run an FRRouting instance. Still need to figure out the vCenter startup time issue. At least I can start the 10GbE transition soon...

     

    https://www.youtube.com/watch?v=BsHh6jOhrxI

     



  • 47.  RE: New Server Project

    Posted Jan 04, 2021 04:07 AM

    Onto the next task!

    Initial hardware testing is coming close to an end...



  • 48.  RE: New Server Project

    Posted Jan 11, 2021 04:27 AM

    Just attached the rail kit to the server, in preparation for the rack that's coming in the mail this week. Can't wait to take photos of the finished result...



  • 49.  RE: New Server Project



  • 50.  RE: New Server Project

    Posted Jan 17, 2021 12:42 AM

    Getting ready to kick ejabberd off of Windows Server, due to reliability issues observed during initial testing; it's probably going to the Arch VM instead. Also need to upgrade the vCenter Appliance from 6.5 to 6.7u3, due to FLEX getting EOL'd...

    https://www.reddit.com/r/activedirectory/comments/kyxf73/setting_up_my_first_active_directory/

     



  • 51.  RE: New Server Project

    Posted Jan 27, 2021 02:48 AM

    From what I can tell, I may have to start from scratch with both vCenter and AD. But, if I manage to pull it off, I would have a few spare CPU cores and a datastore to use for something else.

    Also, found this:

    Time to see if I can find instructions for OS's outside of Windows...



  • 52.  RE: New Server Project

    Posted Feb 01, 2021 03:15 AM

    I would have held out for VCSA 6.5 indefinitely if the HTML5 UI were able to manage Virtual Flash/Host Cache resource pools. As noted in past updates, the VCSA took anywhere from 20-45 minutes to initialise. And with the deprecation of the FLEX UI (reliant on Adobe Flash - unsupported in 2021), the now-neutered vCenter Server Appliance VM (6.5) had no practical place in this project. Since it offers no in-place upgrade path, I can't move to VCSA 6.7 either. It has been replaced, and will soon be decommissioned. vCenter has been moved to the Windows Server 2016 VM, for practicality reasons. The next step is to re-build the failed MS AD instance and promote a new domain controller. That will happen later this week. Hopefully, things will go a bit better this time around...



  • 53.  RE: New Server Project

    Posted Feb 07, 2021 02:12 PM

    Alright - everything is almost ready for Active Directory setup, attempt #2. Not only did I kick ejabberd to Linux (due to issues when installed on Windows), but I also had to re-install multiple other applications. Demoting the AD DC appears to have been what led to it. So, I got to start from scratch in some sense. Still need to make a new SQL db for hMailServer, unlike last time. But, that should be relatively easy. Already installed vCenter Server, and it starts up way faster than the VCSA. Windows doesn't even take longer to boot from what I've seen. Had what appears to have been an unexpected part failure as well - the Mini-SAS SFF-8088 to SATA Forward Breakout x4 cable. Got that replaced, and now can see all of my SAS HDDs once again. Last step is to (re-)promote the DC and test client devices. This time, I'll set the intended domain from the start (instead of setting it to something else by accident and having to change it twice later).



  • 54.  RE: New Server Project

    Posted Feb 07, 2021 03:01 PM

    Due to your "not to be discussed here" flag - just for your information:

    Using the Unlocker patch on recent VMFS 6 volumes can cause vmfs-locks of type: abcdef03.
    I would not use it on production hosts  ...

    Ulli



  • 55.  RE: New Server Project

    Posted Feb 07, 2021 03:52 PM

    Thank you for letting me know. I will keep this in mind for when the time comes  



  • 56.  RE: New Server Project

    Posted Feb 12, 2021 10:20 PM

    Screenshot (76).png

     

    Backup Complete!



  • 57.  RE: New Server Project

    Posted Feb 16, 2021 01:12 AM

    I had to disable the vCenter Server for Windows instance to get the Active Directory instance installed - but I didn't think to change any networking settings on vCenter Server (embedded - 6.7) before disabling it. With the help of a friend, I managed to fix my DNS and get the Active Directory instance working; it came down to some missing NS records. Once I cleaned up the DNS, I was actually able to get a client device joined to the AD.

    Now I have to figure out how to make Windows clients connect to the VPN before attempting LDAP sign-in, since the AD is VPN-locked. Once I figure that out, I will be able to add any devices I want. I also have to see if I can bind vCenter Server for Windows to a single IP address while it's disabled. Otherwise, I'll have to resort to using the VCSA again - and who knows how that will go in the long term. The last time I used it, it threw up more warnings and errors than ever before, which leads me to question its overall longevity and performance. ESXi itself was just fine when I monitored the performance metrics, and the server was nowhere near full utilisation - ever. I'm almost tempted to go without vCenter and Virtual Flash because of the trouble, but then I'd lose out on features, and the funds I used to acquire vCenter in the first place. At least I can start focusing on the rest of the project more in the near future...

    Without vCenter, though, how will I be able to add the Precision T7500, as an ESXi host, to my datacentre? For vMotion?

    Also have to figure out whether (and if so, how) to destroy the Virtual Flash resource pool...



  • 58.  RE: New Server Project

    Posted Feb 19, 2021 01:42 AM

    Just bought four of these:

    HITACHI Ultrastar HUH728080AL4205 (HGST)

    32TB upgrade, here I come...



  • 59.  RE: New Server Project

    Posted Feb 21, 2021 02:34 PM

    vCSA 6.5 is practically neutered without Adobe Flash, and the HTML5 UI was almost useless until at least 6.7u3. The settings I do have in the current install are mostly small ones, but could only be reversed via the FLEX (Flash) UI. vCenter also doesn't allow for in-place upgrades. So, it's time to kill the current vCSA and start from scratch. If I had known to look out for the death of Flash, I could have been ahead of this, but I got held up by other responsibilities. Today, I'm re-installing the vCSA. It's going to be a long day...



  • 60.  RE: New Server Project

    Posted Feb 27, 2021 09:50 AM

    Well, I have more news. I managed to kill the old vCSA (6.5) instance and replace it with a newer (6.7) version. The newer version has a dark theme - nice. It's also pretty well organised, and connected to my ESXi server with no issues. However, Virtual Flash is pretty much dead, so I will have to assign the SSDs to something else now. Perhaps I can start setting up the next VM...

     

    On a side note, the current Reddit project mirror is ded again - because those expire every 6 months, regardless of activity. I think it'll stay ded this time. Not in the mood to make yet another one...



  • 61.  RE: New Server Project

    Posted Mar 07, 2021 07:49 PM

    Currently installing MS SQL Server 2019 for a test drive. Then I'm migrating over to the 8TB SAS HDDs completely.

    Gonna have to redo the backups - I had no way of imaging the 4TB HDD before swapping in the 8TB HDD, and enough changes have been made that the old backup is no longer valid.



  • 62.  RE: New Server Project

    Posted Mar 12, 2021 10:26 AM

    GPU Interest Checks:

    Not sure if I'll ever get my hands on the AMD card. That would be an interesting card to try out, once I figure out the K80's. The K80's are due for a VirtualGL experiment soon.


    Also, just updated the OP(s) for each mirror. Please let me know if anything seems to be missing from one mirror or the other. I intend to work on the server later today, assuming that nothing interferes...



  • 63.  RE: New Server Project

    Posted Mar 14, 2021 06:03 AM

    Troubleshooting the hMailServer installation:

    Definitely not a fun time. Kinda wishing that SQL Compact Edition worked like it did last time (no idea how, though)...



  • 64.  RE: New Server Project

    Posted Mar 14, 2021 01:42 PM

    I may divert my attention from hMailServer for a bit and skip right to the Linux VM if this doesn't get resolved in the next week or so. This has been dragging on for a while now, and I want to get the rest of the server ready. While hMailServer would be nice, I also have other matters to attend to. It also appears that hMailServer's most recent release is 32-bit, which may be why it's having issues with my 64-bit install of MS SQL Express (the free edition of MS SQL Server). If this keeps up, I may take the mail server role and toss it to Linux as well. Can't even begin to think about touching Exchange Server...



  • 65.  RE: New Server Project

    Posted Mar 19, 2021 03:21 AM

    Had a momentary power outage today, which took most of my equipment offline again (for the umpteenth time). I've finally decided to just tough out the cost and buy a pair of UPS's this weekend. Time to see if I can get things straightened up around here. They'll have to sit on the floor since I haven't purchased proper rack shelves for them yet. They're both going to be Liebert GXT3 1350W units. No more playing with fire...

    https://www.youtube.com/watch?v=tYsejb8ht8o



  • 66.  RE: New Server Project

    Posted Mar 25, 2021 12:43 AM

    Once the UPS's get here, I'll be able to get both the server and the workstation protected. Also found out that one of the DIMM slots on the T7500's motherboard went out, so I swapped a 4GB stick for a 16GB DIMM I had lying around.

    MariaDB works well with hMailServer so far, and now I'm trying to add a CA to the project, for future security considerations:
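
    For the record, pointing hMailServer at MariaDB mostly just needs an empty database and a dedicated user. A sketch (the database name, username, and password are placeholders):

```shell
# Create a dedicated database and user for hMailServer on MariaDB
mysql -u root -p <<'SQL'
CREATE DATABASE hmailserver CHARACTER SET utf8mb4;
CREATE USER 'hmail'@'localhost' IDENTIFIED BY 'changeme';
GRANT ALL PRIVILEGES ON hmailserver.* TO 'hmail'@'localhost';
FLUSH PRIVILEGES;
SQL
```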

    Once the CA is ready, it'll be time to get crackin' on the Artix Linux VM.



  • 67.  RE: New Server Project

    Posted Mar 27, 2021 05:33 AM

    vCenter certificate replacement, with an MS AD CA:

    Also managed to actually get ejabberd working in one go, from what I can tell. Looks like this one could be here to stay. One less thing to save for later...

     



  • 68.  RE: New Server Project

    Posted Mar 31, 2021 01:44 AM

    Firstly, I need to re-install my Artix OpenRC VM - got the partitions all wrong. Also need to get the WiFi adapter back in the server, for the Linux VM (router/NAT). I'll do those tasks sometime this week, after work.

    Then I need to dust out and service my first UPS this weekend. It arrived this afternoon. When I plugged it in, it showed the following symptoms:

    • beeps every 5-6 seconds
    • Fault and AC Input indicators glow steadily
    • Battery indicator blinks
    • Bypass and Inverter indicators are off

    A second, pristine UPS should be arriving in the next week or so. I'll use that one on the server when it arrives, and clean up the current one for the T7500.



  • 69.  RE: New Server Project

    Posted Apr 02, 2021 05:55 AM

    New partition setup for Artix OpenRC VM:

    • 300GB SAS HDD
      • 8MB, unformatted, [!mnt_point] (bios_grub)
      • 512MB, FAT32, /boot;/boot/efi (esp)
      • 8GB, linuxswap, [!mnt_point] (swap)
      • 256GB, EXT4, / (root;system)
      • 32GB, XFS or ZFS, /home (home)
    • 8TB SAS HDD
      • /srv, still deciding on size and filesystem. Would like to use ZFS possibly
      • /var, still deciding on size and filesystem. Would like to use ZFS possibly

     

    On a side note, seriously considering Docker, podman, or similar for containerisation, to keep things a bit more isolated and cleaner.



  • 70.  RE: New Server Project

    Posted Apr 06, 2021 12:47 AM

    Okay, finally got around to updating Technitium DNS. The newest installer, for v6, doesn't appear to allow selection of a different install location in the GUI. So I grabbed the portable installer and a copy of .NET v5 instead. Installed .NET v5 first. Then, made a .zip backup of the previous install (because reasons). Nuked everything in the DNS server folder except /config and the backup.zip. Finally, copied the new DNS server files over to the DNS server folder. Also had to register a new Windows service, since the old one does not work with the newer version. Not too difficult, if I do say so myself - just tedious. And I'll have to repeat the process by hand for every future update, so I may look into automating it myself. May also need to see if the DNS server can have a self-signed (CA) certificate. 

     

    Also waiting on a second drive cage and a mini-SAS SFF-8088 to SATA forward breakout cable to arrive, so I can put the 4TB SAS HDDs to use with the Linux VM. If the DL580 G7 can handle powering 2 drive cages at once, I'll give the 16TB drive cage to the Linux VM - use it in either RAID10 or RAID0 (OpenZFS pool) and let Nextcloud have free rein over it. 

     

    New partition setup for Artix OpenRC VM (GPT, BIOS), as of a few nights ago:

    • 300GB SAS HDD
      • 8MB, unformatted, [!mnt_point] (bios_grub)
      • 512MB, FAT32, /boot (esp)
      • 256GB, EXT4, / (root, system)
      • 32GB, EXT4, /home (home - would like to convert to ZFS in the future)
      • 8GB, linuxswap, [!mnt_point] (swap)
    • 8TB SAS HDD
      • 5TB, EXT4, /srv, (Would like to convert to ZFS in the future)
      • 2TB, EXT4, /var, (Would like to convert to ZFS in the future)

    Coming soon - either:

    • 16TB (4x4TB), ZFS RAID0, /nextcloud
      • or...
    • 8TB (4x4TB), ZFS RAID10, /nextcloud
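    If the ZFS route wins out, the RAID10 option above maps onto a pool of striped mirrors. A guarded sketch - the pool name, mountpoint, and device names (/dev/sdb through /dev/sde) are all assumptions, and the block is a no-op on a machine without ZFS:

```shell
# Sketch only: ZFS "RAID10" (a stripe of two mirrored vdevs) across the 4x4TB SAS drives.
# Device names, pool name, and mountpoint are placeholders - run only on the target VM.
if command -v zpool >/dev/null 2>&1; then
  zpool create -m /nextcloud ncpool \
    mirror /dev/sdb /dev/sdc \
    mirror /dev/sdd /dev/sde    # ~8TB usable; survives one disk failure per mirror
  # RAID0 alternative (16TB, no redundancy): zpool create ncpool /dev/sd{b,c,d,e}
  NOTE="pool create attempted"
else
  NOTE="zpool not available; commands shown for reference only"
  echo "$NOTE"
fi
```

    The mirrored layout trades half the capacity for being able to lose a drive on each side of the stripe.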


    It's all coming together now...



  • 71.  RE: New Server Project

    Posted Apr 14, 2021 03:01 AM

    The test with the 2nd drive cage installed didn't go too well two nights ago. When connected, the drives in the 2nd cage did not appear in ESXi. In addition, only 3 of the HDDs from the original/first cage showed up, and one of those 3 appeared only intermittently. I think I may have encountered a power issue. While the activity indicators on both cages did light up, they weren't indicative of the true status of the drives. I also checked the ESXi kernel logs (Alt+F12) during runtime, and saw some interesting errors. I tried rebooting the server, to see if it needed some time to get acquainted with the new hardware, but two reboots did nothing. Everything appears to be working as expected after removing the 2nd drive cage. If it had been a bad data cable on the 2nd drive cage, I would expect the issue not to affect the drives from the 1st cage. But perhaps I've overlooked something. Now I'm stuck trying to figure out how to power the second drive cage, since internal power appears to be off the table. Perhaps an external SATA-only PSU or DC power supply?

    On a different note, I also can't seem to get the WiFi NIC to show up in ESXi - which leaves me with 3 conclusions:

    • the card needs drivers
    • the card needs to be re-seated (for the 7th time)
    • the card is DOA and needs to be replaced

    The first one seems most likely, seeing that ESXi may need drivers for anything that wouldn't be found in a normal enterprise environment. The second seems unlikely, given how many times I've already attempted that solution. The third is the worst-case scenario, and the one that would incur the most up-front monetary cost. If I do have to install drivers for the wireless NIC in ESXi, a backup of the host config needs to be made first. Otherwise, I'll be in hot water if the installation fails. I'll be attempting to use a 3rd-party Linux driver in ESXi, with no way to know in advance whether it'll work.
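    For the config backup itself, ESXi can produce a downloadable bundle from its own shell. A guarded sketch - the VIB path is a placeholder, and vim-cmd/esxcli only exist on an ESXi host, so the block falls through to a note anywhere else:

```shell
# Sketch: back up the ESXi host config before attempting the 3rd-party driver VIB.
if command -v vim-cmd >/dev/null 2>&1; then
  vim-cmd hostsvc/firmware/sync_config      # flush pending config changes to disk
  vim-cmd hostsvc/firmware/backup_config    # prints a URL to download configBundle.tgz
  esxcli software acceptance set --level=CommunitySupported
  esxcli software vib install -v /tmp/wifi-driver.vib   # placeholder path for the driver
  NOTE="backup and install attempted"
else
  NOTE="not an ESXi shell; commands shown for reference only"
  echo "$NOTE"
fi
```

    With the configBundle.tgz saved off-host, a failed driver install can be walked back by reinstalling ESXi and restoring the bundle.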

    On a side note, the second drive cage was the only real way I was ever going to get to play with ZFS in Linux. That would have been a pool of four SAS HDDs that I could have experimented with, using ZFS's RAIDz options. Since that's not in the cards at the moment, the extra 16TB of SAS storage is back to sitting without a use.



  • 72.  RE: New Server Project

    Posted Apr 22, 2021 03:27 AM

    Just purchased a GXT3-2000RT120 without the batteries. Waiting for it to arrive in the mail. Then need to see if I can get it working in the next few weeks with some fresh batteries...



  • 73.  RE: New Server Project

    Posted Apr 28, 2021 03:07 AM

    The GXT3-2000RT120 arrived in the mail this afternoon, in what appeared to be pristine condition. I went on and purchased 4 batteries for it, and am now waiting for those to get here next. This Friday, I will need to purchase 2 rack shelves. One will be for a new printer that was gifted to me recently (Lexmark Prevail Pro 705), in addition to the drive cage that sits on the back of the DL580 G7. The other will be for the T7500 to sit on. The UPS will end up sitting on the floor for a while, until I can get the rack mount kit for it in a few months. From what I can tell, the new UPS will be more than capable of handling every device on the rack, which will save me space. Nice not needing to consider buying a second UPS. Current rack setup plan thus far:

    • Top sliding shelf (S1):
      • 1-2x Kingwin MKS-435TL, 1x Lexmark Prevail Pro 705, router/AP (if applicable)
    • 1x HPE ProLiant DL580 G7 (S2)
    • Mid sliding shelf (S3):
      • 1x Kenwood 104AR
    • Lower sliding shelf (S4):
      • 1x Dell Precision T7500
    • Bottom drawer(s - S5)
      • Spare parts, tools, etc.
    • Bottom sliding shelf (S6):
      • 1x Liebert GXT3-2000RT120

    1-2 PDUs are planned for this setup as well. Just a matter of time. The Kenwood and Liebert will not have shelves until at least later this year, due to budget constraints. The rack drawers are in the same category as of now.

    On a different note, getting open-vm-tools installed onto Artix+OpenRC is proving to be a fun little challenge. Almost tempted to write my own init script for it...
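    For what it's worth, a home-grown OpenRC script for vmtoolsd doesn't need much. A minimal sketch - written to /tmp here rather than /etc/init.d, and the binary path is an assumption:

```shell
# Sketch of an OpenRC runscript for open-vm-tools' vmtoolsd (paths assumed).
cat > /tmp/vmtoolsd <<'EOF'
#!/sbin/openrc-run
description="VMware Tools daemon"
command="/usr/bin/vmtoolsd"
command_background="yes"
pidfile="/run/vmtoolsd.pid"

depend() {
    need localmount
    use logger
}
EOF
chmod +x /tmp/vmtoolsd
# then, on the VM itself:
#   mv /tmp/vmtoolsd /etc/init.d/vmtoolsd && rc-update add vmtoolsd default
```

    `command_background` lets start-stop-daemon handle backgrounding and the pidfile, so the script stays tiny.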



  • 74.  RE: New Server Project

    Posted May 01, 2021 11:45 AM

    I bought 2 of the 4 planned shelves yesterday, from here:

    Now I'm waiting for them to arrive in the mail. Also have UPS batteries to install today.



  • 75.  RE: New Server Project

    Posted May 08, 2021 10:36 PM

    More coming soon, once I get up the energy to power on the beast tonight



  • 76.  RE: New Server Project

    Posted May 09, 2021 02:17 AM

    The one thing that always causes trouble is when I have to fiddle with that mini-SAS SFF-8088 to SATA breakout cable. If I mess with it too much and accidentally damage it, that's another 15-20 USD down the drain. Not saying that it's inevitable - I simply treat the cable pretty badly at times. The last rack shelf installation may have damaged the previous cable a bit. At least I have a spare cable this time, since I still can't connect the other drive cage. It just means I'm now out of spare breakout cables to trash; the next one has to come out of my paycheck. Happens about every 60 days with my luck XD Really have to look out for that...



  • 77.  RE: New Server Project

    Posted May 09, 2021 02:58 AM

    Tasks that I want to get done tonight, assuming nothing goes wrong:

    • Installing the nVIDIA drivers for a GRID K520
    • Installing a new terminal emulator (terminology)
    • Installing a new file manager (nemo)
    • Installing a browser (unGoogled Chromium)
    • May add it to the AD domain I have running as well
    • Installing docker for container management
    • Adding nextcloud via docker

    Time to see how helpless I really am XD



  • 78.  RE: New Server Project

    Posted May 09, 2021 06:10 AM

    Okay, I blew through most of the tasks set out for today. But a few major ones still remain:

    Those all can take hours each on their own. Glad to get the other tasks out the way first, so I can have an easier time with those in a bit.



  • 79.  RE: New Server Project

    Posted May 23, 2021 11:32 PM



  • 80.  RE: New Server Project

    Posted Jul 15, 2021 07:54 PM

    Okay, a lot has happened since I last posted here:

    Not much of a fun month or two XD Here’s to ditching the previous dry spell…



  • 81.  RE: New Server Project

    Posted Jul 17, 2021 02:31 AM


  • 82.  RE: New Server Project

    Posted Jul 29, 2021 03:18 AM

    May consider doing this once I add the Linux VM to my AD domain:

    Also hoping that the disk shelf from Project Rackcentre​​​​​​​ can eliminate the need for the drive cage(s) I've been relying on for so long...



  • 83.  RE: New Server Project

    Posted Jul 29, 2021 10:42 PM

    Just made a few part swaps, due to inventory changes.

    • The Kingwin MKS-435TL's now belong to Project Personal Datacentre (2nd node)
    • The DL580 G7 now uses a Dell EMC KTN-STL3 (as shown here) for direct-attached local storage
    • The TPM chip for the DL580 G7 has been installed, and will be used for security purposes in the future
    • The DL580 G7 will actually be keeping the GTX 1080, due to lack of compatible power cables
    • Project Personal Datacentre (2nd node) will have the Titan Xp for the foreseeable future, until I get the PCI 8+8 cable for the DL580 G7
    • The Arctic F9 PWM 92mm fans have been moved to Project Personal Datacentre (2nd node) as well

     

    Getting ready to update parts listings to reflect this in a few...



  • 84.  RE: New Server Project

    Posted Aug 02, 2021 09:40 AM

    Re-made the Linux VM, so I can do it properly this time around. Will be implementing backups this week...



  • 85.  RE: New Server Project

    Posted Aug 07, 2021 03:49 AM

    Managed to remove the need for OTP-based 2FA clients like WinAuth, and am now looking into whether I can replace Ditto (and its Linux companion) with CopyQ. Time to start looking into more cross-platform applications and sync-friendly solutions in general.

    Also just enabled shared clipboard and drag-n-drop for my VMs, for easier use through VMware Workstation Pro. The steps were easy enough, and now I can work a little quicker as a result.

    Only thing left to do is set up proper backups for the Linux VM, so I can safely attempt the driver install for the GRID K520...



  • 86.  RE: New Server Project

    Posted Aug 25, 2021 04:41 PM

    The GRID K520 is a go. Sunshine gamestreaming server is next:


    Then, Docker+Compose and Nextcloud...



  • 87.  RE: New Server Project

    Posted Aug 26, 2021 05:13 PM

    Sunshine gamestreaming server will take a bit to sort out:

    At least Docker+Compose is installed. Need to install and configure a Nextcloud instance soon...



  • 88.  RE: New Server Project

    Posted Aug 31, 2021 12:43 PM

    A second ESXi node will be coming in the 2021/2022 transition, hopefully...

    Windows 11's requirements seem to have ditched 1st gen Threadripper, so no need to even consider a dedicated Windows machine in the future.



  • 89.  RE: New Server Project

    Posted Sep 19, 2021 05:19 AM

    Titan Z time!

     




  • 90.  RE: New Server Project

    Posted Sep 19, 2021 05:21 AM


  • 91.  RE: New Server Project

    Posted Sep 21, 2021 03:14 AM




  • 92.  RE: New Server Project

    Posted Sep 21, 2021 03:22 AM

    Last note for the night, I've decided to replace the GRID K520 with the GTX Titan Z. Need this to work in Linux and macOS, while also being supported in Sunshine when the time comes. Can't currently do that with the GRID K520 for some reason. Need to work that out later, when the rest of the project has caught up...



  • 93.  RE: New Server Project

    Posted Sep 24, 2021 05:29 PM

    With all of the difficulty I've had getting realmd installed to Artix, I'm beginning to think that I simply shouldn't push any further with adding the VM to AD...

    I'll give it another 2 weeks before I make a decision.



  • 94.  RE: New Server Project

    Posted Sep 25, 2021 03:30 AM

    Background context: A few days ago, the GRID K520 was swapped out for the Titan Z. The Linux VM had half of the Titan Z passed through to it. However, no displays/dummy plugs were attached to it afterward.

    An unexpected development for the Linux VM has occurred. While running through regular maintenance and installing updates, I decided to try running Sunshine once more (sudo sunshine). The logs kept mentioning permission denied for pulseaudio, whenever I attempted to remote in via Moonlight. In the past, running Sunshine without sudo had never worked - the attempt would always error out. But this time, it ran without error. 

    When I attempted to remote in, I actually made it to the desktop - and it had audio passthrough. It was streaming the same video output that the VMware adapter would show. Somehow, I can stream a screen/display that isn't rendered by an nVIDIA card.

    More Nextcloud/MariaDB troubleshooting tomorrow...



  • 95.  RE: New Server Project

    Posted Sep 27, 2021 04:16 AM

    Okay, that took a bit longer than expected. I had to tweak the DB for Nextcloud before I could attempt installation. But then, a mystery power outage struck. The UPS kicked in, giving me enough time for an emergency power-off procedure, but the Nextcloud install got interrupted. So, I had to go in and:

    • remove the container(s)
    • remove the container(s)'s volumes and networks
    • clear the directories I had mounted to the container(s)
    • drop and recreate/reconfig the DB
    • re-attempt installation
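    The container side of that list came down to a few commands. A guarded sketch - the compose project name and the bind-mount paths are assumptions, and the block only does anything on the Docker host itself:

```shell
# Sketch of the post-outage container cleanup (project name and paths assumed).
if command -v docker-compose >/dev/null 2>&1; then
  docker-compose -p nextcloud down --volumes --remove-orphans  # containers, volumes, networks
  rm -rf /srv/nextcloud/html/* /srv/nextcloud/data/*           # clear the assumed bind-mount dirs
  NOTE="cleanup attempted"
else
  NOTE="docker-compose not available; commands shown for reference only"
  echo "$NOTE"
fi
```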

     

    It went something like this in MariaDB:

     

    [CODE]
    DROP DATABASE nextcloud;
    CREATE DATABASE nextcloud;
    GRANT ALL ON nextcloud.* TO 'admin'@'remotehost' IDENTIFIED BY 'password' WITH GRANT OPTION;
    ALTER DATABASE nextcloud CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci;
    SET GLOBAL innodb_read_only_compressed = OFF;
    FLUSH PRIVILEGES;
    [/CODE]

    By the time I was done, hours had passed. Still have to finish configuring AD/LDAP integration tomorrow...



  • 96.  RE: New Server Project

    Posted Oct 03, 2021 03:53 AM

    It would seem that there are still a few issues to address with the Docker setup on Artix OpenRC. I pretty much have to re-install docker, compose, and the openrc scripts after a full system upgrade at times:

    sudo pacman -R docker-openrc docker-compose docker   # remove the current (broken) set
    shutdown -r 0                                        # reboot immediately
    sudo pacman -S docker docker-openrc                  # re-install from the repos
    sudo pacman -U docker-compose.tar.zst                # docker-compose from a local package
    sudo rc-service docker start

     

    After that, everything else tends to be fine. Portainer and Redis run as if nothing has happened. However, Nextcloud is a different story. After a clean install and no other configuration changes, I get this whenever I attempt login:

     

    Internal Server Error

    The server was unable to complete your request.

    If this happens again, please send the technical details below to the server administrator.

    More details can be found in the server log.

    Technical details

    Remote Address: <INSERT_CLIENT_IP_HERE>

    Request ID: <INSERT_NEW_REQUEST_ID_HERE>

     

    Also need to look into backup solutions for this setup:

    Time to see if the Nextcloud community can help me figure this out:

    And about that push for LDAP integration…

     

    On a better note, I found a way to get Sunshine to autostart on login. Used the same solution for Syncthing and F@H as well. It’s specific to the DE that I’m using, sadly. So it won’t be useful to all Linux users following this. Only if you’re using Xfce, I think.
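    For anyone curious: Xfce's "Session and Startup" entries end up as XDG autostart files under ~/.config/autostart, so the Sunshine entry probably looks something like this sketch (the filename and Exec line are assumptions):

```shell
# Sketch: an autostart entry like the one Xfce's "Session and Startup" dialog creates.
mkdir -p "$HOME/.config/autostart"
cat > "$HOME/.config/autostart/sunshine.desktop" <<'EOF'
[Desktop Entry]
Type=Application
Name=Sunshine
Comment=Start the Sunshine game-streaming host at login
Exec=sunshine
Hidden=false
EOF
```

    The same pattern would cover Syncthing and F@H - one .desktop file per program.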



  • 97.  RE: New Server Project

    Posted Oct 04, 2021 04:04 AM

    More news on the Docker front:

    Also considering dropping Redis from the Nextcloud compose file, due to a suggestion...



  • 98.  RE: New Server Project

    Posted Oct 05, 2021 04:25 AM

    Yeah, the Nextcloud instance needs more work. Docker now appears to be doing fine, surprisingly (L1T project mirror):

     


  • 99.  RE: New Server Project

    Posted Oct 06, 2021 02:27 AM

    The real test begins now:




  • 100.  RE: New Server Project

    Posted Oct 06, 2021 03:32 AM


     

    On a side note, Portainer made me pacman -Syu tonight. Couldn't access it until I caved and did the system upgrade. Was worried that something might break.



  • 101.  RE: New Server Project

    Posted Oct 08, 2021 04:26 AM

    If you set up your ESXi and vCenter with just IP addresses initially, and added domain names after the fact, this may be of interest to you:

    For setting that, and redoing your server's certs.
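    The host-name half can be done from the ESXi shell. A guarded sketch - the host and domain names are placeholders, and esxcli only exists on an ESXi host:

```shell
# Sketch: give an IP-only ESXi host its FQDN, then regenerate the self-signed certs.
if command -v esxcli >/dev/null 2>&1; then
  esxcli system hostname set --host=esxi01 --domain=example.lan   # placeholder names
  esxcli system hostname get
  /sbin/generate-certificates            # rebuild the self-signed cert with the new name
  /etc/init.d/hostd restart && /etc/init.d/vpxa restart
  NOTE="hostname set and certs regenerated"
else
  NOTE="esxcli not available; commands shown for reference only"
  echo "$NOTE"
fi
```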



  • 102.  RE: New Server Project

    Posted Oct 08, 2021 08:17 AM



  • 103.  RE: New Server Project

    Posted Oct 14, 2021 10:11 PM


  • 104.  RE: New Server Project

    Posted Nov 11, 2021 07:10 AM

    It has been a long month since the last update, and a lot has changed. Here's what has been completed thus far:

    • activated EaseUS Todo Backup Server for easier backups of Windows Server 2016
    • created AD integration/query users for Nextcloud, ejabberd, and FreePBX
    • initiated AD integration config for ejabberd
    • updated, broke, and revived the Artix VM
    • kicked F@H from the Artix VM, to re-add it as a container later on
    • initial planning for the move to ZFS (the entire Artix VM)
    • purchased the MikroTik RB4011iGS+RM
    • initiated Samba setup for the Artix VM

    And now I'm preparing to move ejabberd to a Docker container. Gonna have to change the OP once the dust settles. Still more to announce, once things get under way...



  • 105.  RE: New Server Project

    Posted Nov 11, 2021 06:01 PM

    Just received a MikroTik RB4011iGS+RM in the mail, purchased a MikroTik CCR2004-1G-12S+2XS, and put in an offer for a MikroTik Audience RBD25GR-5HPac, to act as the wireless gateway to my serverside network. Also purchased 50x 12-24 rack screws+cage nuts and 50x 10-32 rack screws+cage nuts. That should be able to mount most of my upcoming equipment...



  • 106.  RE: New Server Project

    Posted Nov 12, 2021 02:54 AM

    Just joined the Artix OpenRC VM to the Windows Server AD, with Samba. We're one step closer to getting the Artix VM ready for production use.

    Now I need an automated way to assign the following to existing AD objects, and new ones on-the-fly:

    • GID (primary group ID)
    • UID (user's ID number)
    • LSH (user login shell)
    • UHD (user's *nix home directory)

    These RFC 2307 attributes will be required for a single identity across the setup, if I go with Samba. With them in place, I can enhance the user experience further...
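    Those four map onto standard AD attribute names, so a non-automated baseline is just an LDIF per user applied against a DC. A sketch - the DN, IDs, and paths are all placeholders:

```shell
# Sketch: RFC 2307 attributes for one AD user, as an LDIF (placeholder values).
cat > /tmp/rfc2307.ldif <<'EOF'
dn: CN=jdoe,CN=Users,DC=example,DC=lan
changetype: modify
replace: uidNumber
uidNumber: 10001
-
replace: gidNumber
gidNumber: 10000
-
replace: loginShell
loginShell: /bin/bash
-
replace: unixHomeDirectory
unixHomeDirectory: /home/jdoe
EOF
# apply against a domain controller, e.g.:
#   ldapmodify -H ldap://dc01.example.lan -D "admin@example.lan" -W -f /tmp/rfc2307.ldif
```

    Automating the "on-the-fly" part would mean generating one of these (with the next free uidNumber) whenever a user is created.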



  • 107.  RE: New Server Project

    Posted Nov 13, 2021 05:38 PM


  • 108.  RE: New Server Project

    Posted Nov 18, 2021 06:21 PM




  • 109.  RE: New Server Project

    Posted Nov 20, 2021 03:26 PM

    I got AD/LDAP integration working in Nextcloud, and got NGINX Reverse Proxy Manager working (had to use built-in DB). HTTPS and Asterisk coming next...



  • 110.  RE: New Server Project

    Posted Nov 23, 2021 05:29 PM

    More networking equipment added to the rack last night. Waiting for one more piece to arrive. Then, Verizon needs to get my service activated next month.

     

    For some reason, the image keeps getting rotated on its side in the editor. Haven't figured it out yet...



  • 111.  RE: New Server Project

    Posted Nov 24, 2021 04:12 AM

    A decent PDU never hurts...

     

    I think it posted right-side-up for once  



  • 112.  RE: New Server Project

    Posted Nov 28, 2021 04:35 PM

    The disk shelf is now connected to the PDU as well. PDU now handles Networking and Storage. Only items directly connected to the UPS are servers. Also, the Titan V arrived. Threadripper node is a go, photos posted on the other mirrors for convenience. Need photo galleries here  



  • 113.  RE: New Server Project

    Posted Nov 29, 2021 05:20 PM

    Installing ESXi 6.7 on the Threadripper is proving to be a pain. Getting nothing but a black screen whenever I attempt it. Will have to try again once I'm home...



  • 114.  RE: New Server Project

    Posted Nov 30, 2021 05:55 AM

    The new node has ESXi 6.7u3 installed. Connecting to vCenter tomorrow, when time permits. Had to swap out the SolarFlare 9021-r7 4a for a SolarFlare SFN5322F that I had sitting in the spare parts inventory. The previous NIC kept getting the server either stuck at POST code 92 (PCI init, iirc) or at SolarFlare Boot Manager screen (with no way to skip to OS). After enabling PCI passthrough on the Titan V and the swapped NIC, the new host had an issue rebooting on its own. Had to hit the Reset switch in order to get back into ESXi. Will have to look out for that if I ever add any new devices. But, adding new devices requires me to be in the same room as the server, so not that problematic. This is the first 24/7 ESXi host that I'll run.

     

    Finally, after dealing with a pesky NIC...



  • 115.  RE: New Server Project

    Posted Nov 30, 2021 01:35 PM

    The new node has a name! 

    Adding to vCenter next...



  • 116.  RE: New Server Project

    Posted Dec 04, 2021 06:19 PM

    The new Ethernet bridge is in, but I need some more SFP+ cables to hook it in properly…



  • 117.  RE: New Server Project

    Posted Dec 23, 2021 05:06 PM

    The rack drawer is here!



  • 118.  RE: New Server Project



  • 119.  RE: New Server Project

    Posted Dec 27, 2021 12:15 PM

    Just bought:

    • 8x HGST HUSMM8040ASS200* MLC 400GB SSDs

     

    * / HUSMM8040ASS201



  • 120.  RE: New Server Project

    Posted Dec 30, 2021 12:31 AM


     

     



  • 121.  RE: New Server Project

    Posted Dec 31, 2021 03:27 PM

    I guess this counts as multitasking?

    The Titan V is also giving me trouble on ESXi 6.7, so it looks like Threadripper will have to wait. On a side note, I'm also trying to set up a KMS server, since Windows 10 Enterprise LTSC (Titan Xp) appears to require KMS, and won't activate via Microsoft's servers. Perhaps I need to log a Microsoft account into that VM sometime today...



  • 122.  RE: New Server Project

    Posted Jan 01, 2022 07:44 PM

    Resolved the Nextcloud issue, the other issues still remain. Focusing on macOS VM and KMS for now...



  • 123.  RE: New Server Project

    Posted Jan 01, 2022 09:52 PM


  • 124.  RE: New Server Project

    Posted Jan 02, 2022 12:50 AM

    More things I'm doing on the side, to streamline domain UX:

    https://www.youtube.com/watch?v=sJ_E7I4CHw0

    https://www.youtube.com/watch?v=YreKZqdu4fo

    A lot of the benefits won't work until Windows 10 is activated, sadly. So, it's all just prep work.



  • 125.  RE: New Server Project

    Posted Jan 06, 2022 07:41 AM

    Considering a virtual audio device for Artix Linux, since the GPU's audio device didn't work well with PCI passthrough. Still need to set up xBrowserSync and NGINX Reverse Proxy Manager. YouPHPTube will undergo a final validation/testing phase after this. If the MariaDB issue can't be resolved, I'll just rely on LBRY and other existing alternatives, and shift focus to YaCy Grid as the final Docker container instance. Still need to set up the Windows 10 Enterprise VM (Titan Xp) for daily use. Also still troubleshooting the Titan V issue...
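    If PulseAudio is in play, the virtual audio device could simply be a null sink whose monitor Sunshine captures. A guarded sketch - the sink name and description are assumptions:

```shell
# Sketch: a PulseAudio null sink standing in for the GPU's audio device.
if command -v pactl >/dev/null 2>&1; then
  pactl load-module module-null-sink sink_name=vspeaker \
    sink_properties=device.description=VirtualSpeaker
  pactl set-default-sink vspeaker    # route app audio to the virtual device
  NOTE="null sink load attempted"
else
  NOTE="pactl not available; commands shown for reference only"
  echo "$NOTE"
fi
```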



  • 126.  RE: New Server Project

    Posted Jan 11, 2022 08:14 AM

    I've decided to merge the Win10 and Remote Dev VMs. I've worked in this kind of environment before, and it hasn't been an issue in the past; it saves time and resources in my case. Also, I may have to kick the Threadripper from the VM server project if its issue can't be resolved by month's end. I can just toss Windows 10 on that thing if necessary, and the Titan V should work...



  • 127.  RE: New Server Project

    Posted Jan 12, 2022 03:28 PM

    Getting ready to test Windows 10 Enterprise VM with this enabled:

     



  • 128.  RE: New Server Project

    Posted Jan 13, 2022 07:15 PM

    Finished initial setup for xBrowserSync last night. Now looking to attack the last container - YaCy...

    *YouPHPTube is being delayed until the rest of the server project is finished, to avoid unnecessary delays.



  • 129.  RE: New Server Project

    Posted Jan 16, 2022 01:30 AM

    Welp, here are the major changes for the project thus far:

    • YouPHPTube is probably getting jettisoned, in favour of using LBRY instead
    • YaCy Grid is going to be a long-term experiment, and has to be built from source
    • xBrowserSync is officially part of the project now - and it's got more on the way!
    • PleX has Movie and Music streaming, but doesn't have external LDAP integration
      • added Trakt.tv plugin (scrobbler) for additional functionality
    • Would like to add eBooks management to Nextcloud, but may need a new container instead
    • Azure Active Directory was added, to enable possible MFA in the future
    • The majority of the server project is complete - looking into 24/7 testing soon...


  • 130.  RE: New Server Project

    Posted Jan 17, 2022 01:13 AM


  • 131.  RE: New Server Project

    Posted Jan 20, 2022 11:25 PM

    The task list has been updated:

     



  • 132.  RE: New Server Project

    Posted Jan 23, 2022 06:35 PM

    Okay, you guys may get a laugh from this. I was remoted into the Windows 10 Enterprise VM (equipped with a Titan Xp). I installed 3DMark, thinking I was gonna do some benchmark runs today. Instead, the entire ESXi host rebooted. All of the VMs went offline, and I'm waiting to see if the server comes back from this in one piece. Hoping this also doesn't rule out just gaming in general. If the power draw from the Titan Xp is too much, I may have to consider other options...



  • 133.  RE: New Server Project



  • 134.  RE: New Server Project

    Posted Jan 26, 2022 03:41 AM

    The job is never finished:

    I march onward...

     



  • 135.  RE: New Server Project

    Posted Jan 27, 2022 05:47 AM

    And here I was, thinking I was close to being done  

    I need sleep.



  • 136.  RE: New Server Project

    Posted Jan 29, 2022 03:20 AM

    What I've managed to get done thus far:

    • install a 2nd PCIe SSD
    • install a USB adapter card
    • plug in a 2nd PSU
    • buy 128GB of RAM
    • decommission Threadripper

    Off to a rough week. Still looking for a good PCIe enclosure...



  • 137.  RE: New Server Project

    Posted Jan 30, 2022 05:52 AM


  • 138.  RE: New Server Project

    Posted Feb 10, 2022 03:22 AM

    Just purchased a Magma EB7-X8G2-RAS-F (7-slot PCIe 2.0 expansion enclosure), with what appear to be two x8 ports on its expansion interface. I will most likely need to acquire the following next:

    • x16 interface card
    • x16 host adapter
    • x16 PCIe cable

    This is going to be a long one...

    https://vimeo.com/46778368

     



  • 139.  RE: New Server Project

    Posted Feb 10, 2022 09:25 PM

    The PCIe x16 Host adapter and PCIe x16 cable arrived today! Pictures in a few...

    EDIT: There's 128GB more RAM on the way as well. And guess what's going in the enclosure?...




  • 140.  RE: New Server Project

    Posted Feb 11, 2022 10:44 PM

    It's finally starting to warm up where I'm at. But that also means I can't run F@H anymore, for thermal reasons. Until next autumn, the folding will be paused.



  • 141.  RE: New Server Project

    Posted Feb 15, 2022 03:47 PM

    The spec sheet for the server has changed, in anticipation of the PCIe enclosure that arrived recently. Still waiting for a few more components to arrive in the mail, but this is the way forward from here. Some parts have been moved from the DL580 G7 to the enclosure, to free up space and reduce power draw. Certain parts that don't see regular use don't necessarily need to be in the DL580 G7; optional parts will live in the enclosure instead. The server has received a RAM upgrade as well, from 128GB to 256GB. The SAS HDD-to-SSD cloning operation will occur after the migration from ESXi 6.5 to 6.7. If you have any questions, feel free to ask!



  • 142.  RE: New Server Project

    Posted Feb 16, 2022 10:21 PM

    Folding@Home is off the table for the foreseeable future:

    https://linustechtips.com/status/316035/



  • 143.  RE: New Server Project

    Posted Feb 17, 2022 11:48 PM

    I've been forced to hold off on the OpenStreetMaps backend (routing) container. This was due to the insane memory usage, which appears to have been what crashed my once-stable Linux VM. I'd need to move beyond 32GB RAM for that one VM, which would be pretty crazy. The ESXi host only has 256GB RAM, with all slots filled (4GB sticks). To move beyond that would cost me a fortune, buying 8GB and 16GB sticks off the used market. The current market does not lend itself to that errand too easily. I'll focus on just YaCy Grid for the time being.



  • 144.  RE: New Server Project

    Posted Feb 27, 2022 04:15 PM

    The PCIe enclosure is being removed from the project. Unable to get it working, and OEM/ODM won't communicate to assist with troubleshooting. No way to justify keeping it in the rack at this point.



  • 145.  RE: New Server Project

    Posted Mar 06, 2022 03:37 AM


    **Current ToDo's:**
     - Windows Server 2016:
            - UNIX/POSIX attributes in AD
                    - <https://github.com/wruppelx/win2016setuid>
     - Artix OpenRC:
            - Docker container: YaCy Grid
                    - <https://blog.fossasia.org/creating-a-dockerfile-for-yacy-grid-mcp/>
                    - <https://github.com/yacy/yacy_grid_mcp/blob/master/docker/all-in-one/docker-compose.yml>
                    - initiate web crawl
     - Windows 10 Enterprise:
            - Gaming VM troubleshooting (<https://www.reddit.com/r/VFIO/>)
    
    **Upcoming ToDo's:**
     - Server/Networking:
            - migrate from ESXi 6.5 to 6.7 **
    
    **Long-term ToDo's:**
     - Server/Networking:
            - clone HDDs to SAS SSDs
                    - Acronis True Image
            - VDI host when?
                    - pushed to 2023, due to performance requirements
            - DL580 Gen8/9 planning...


  • 146.  RE: New Server Project

    Posted Mar 07, 2022 03:00 AM

    I've finally managed to set up wireless Time Machine backups for the MacBook. Next will be the EliteBook, if I can figure out how to do so, in addition to the other tasks in front of me.

    WARNING: Apple only supports virtualisation of their OS, on their own hardware. I will not discuss unsupported configurations on this website.

    The other way of doing this task would be to get the macOS Server app (~20 USD on the App Store) and setup the role from the app.



  • 147.  RE: New Server Project