Channel: StarWind Software

StarWind Virtual SAN (VSAN) [+Free], HCI Appliance (HCA), Virtual HCI Appliance (VHCA) [+Free], StarWind x Proxmox VE SAN Integration Services • Network Config Guidance for a 2-Node Hyper-V Cluster

I'm in the process of evaluating VSAN (the free version for now) for a new 2-node Hyper-V 2022 cluster. I've been reading the quick start guides as well as these forums and can't quite work out the best way to accomplish my goals.

I have 2 identical servers, each with two dual-port ConnectX-6 cards (4x 100 GbE ports total), two dual-port E810 cards (4x 10 GbE ports total), and one quad-port i350 card (4x 1 GbE ports total) for networking. The goal is to provide maximum fault tolerance and uptime for critical hosted VMs within the cluster.

My original design was to team two of the 100 GbE ports for a VSAN synchronization channel, team the other two 100 GbE ports for the iSCSI channel, team two of the 1 GbE ports for a VSAN heartbeat channel, and team the other two 1 GbE ports for a management channel. That would leave the four 10 GbE ports for VM network access, and I was going to use the iSCSI channel for failover cluster communication/live migration.
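For reference, the original design above would translate into something like the following on each host. This is only a sketch: the adapter names (CX6-1..4, I350-1..4) and switch names are hypothetical placeholders for whatever Get-NetAdapter reports on the actual hardware, and it uses Switch Embedded Teaming (SET), since classic LBFO teams bound to a vSwitch are deprecated on Windows Server 2022.

```powershell
# Hypothetical sketch of the teamed layout described above.
# Replace the adapter names with the real ones from Get-NetAdapter.

# 2x 100 GbE -> VSAN synchronization channel
New-VMSwitch -Name "SET-Sync" -NetAdapterName "CX6-1","CX6-2" `
    -EnableEmbeddedTeaming $true -AllowManagementOS $true

# 2x 100 GbE -> iSCSI channel (also cluster communication / live migration)
New-VMSwitch -Name "SET-iSCSI" -NetAdapterName "CX6-3","CX6-4" `
    -EnableEmbeddedTeaming $true -AllowManagementOS $true

# 2x 1 GbE -> VSAN heartbeat channel
New-VMSwitch -Name "SET-HB" -NetAdapterName "I350-1","I350-2" `
    -EnableEmbeddedTeaming $true -AllowManagementOS $true

# 2x 1 GbE -> management channel
New-VMSwitch -Name "SET-Mgmt" -NetAdapterName "I350-3","I350-4" `
    -EnableEmbeddedTeaming $true -AllowManagementOS $true
```

As the next paragraph notes, this teamed layout is exactly what the StarWind documentation appears to recommend against for the sync and iSCSI links, which is the crux of my question.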

However, after reading everything, it seems that you cannot have separate iSCSI and heartbeat channels, and that NIC teaming should be avoided on the VSAN links.

Given my hardware, what would be the recommended setup to achieve these goals?

Statistics: Posted by AGWin — Wed Feb 26, 2025 2:31 pm


