ESXi

  • 1.  Cisco Nexus 1000V VDS - vmotion

    Posted Feb 11, 2011 11:26 AM

    I have a question about connecting VDSs together to allow vMotion and Storage vMotion of guests between different ESXi vSphere 4.1 clusters.

    We have Prod, Dev, and Test clusters, each with two Dell R810 servers, and each server has two 10 Gbit NICs. We have deployed a Cisco Nexus 1000V VDS in each cluster environment, and each cluster has its own datastores. We want the ability to Storage vMotion guests from one cluster to another. Being quite new to VDS, and especially the Cisco Nexus, we wondered: can we just virtually connect each VDS to the others to allow vMotion of guests between clusters that each have their own VDS and datastores? All the clusters are deployed under one datacenter.

    Thanks

    Andy



  • 2.  RE: Cisco Nexus 1000V VDS - vmotion

    Posted Feb 11, 2011 01:00 PM

    Hi,

    You can answer this by looking at the configuration maximums.

    The maximum number of hosts in a cluster is 32, while the maximum number of hosts attached to a single vDS is 350. This tells you that you can spread one vDS across far more than one cluster.



  • 3.  RE: Cisco Nexus 1000V VDS - vmotion

    Posted Feb 11, 2011 01:48 PM

    Josh, we have a requirement from a security perspective that we must have a dedicated VDS for each environment and are not allowed to spread a VDS across clusters. Can we link two VDSs together? Apologies, but my VDS/LAN knowledge is not that great.

    Thanks

    Andy



  • 4.  RE: Cisco Nexus 1000V VDS - vmotion

    Posted Feb 11, 2011 03:08 PM

    Is there a shared datastore between the clusters? You'll need that to do vMotion. As for linking the N1K domains, I suppose you could create an uplink on each side that carries the same Layer 2 VLANs and then create your VMkernel interfaces on hosts in both clusters, so the vMotion networks can reach each other and the N1Ks are effectively "linked" together.
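
    As a rough sketch of what that could look like on each Nexus 1000V: assuming a dedicated vMotion VLAN that the physical switches trunk between both clusters' uplinks (the profile names and VLAN 100 below are placeholders, not anything from your environment), you'd want roughly equivalent profiles on each VSM:

        ! Uplink profile: trunk the shared vMotion VLAN to the upstream switches
        port-profile type ethernet dv-uplink
          vmware port-group
          switchport mode trunk
          switchport trunk allowed vlan 100
          no shutdown
          state enabled

        ! vEthernet profile for the vMotion VMkernel ports on each host
        port-profile type vethernet vmotion
          vmware port-group
          switchport mode access
          switchport access vlan 100
          no shutdown
          state enabled

    With that VLAN allowed end to end on the physical network, you create a vMotion-enabled VMkernel interface in vCenter on each host and attach it to the vmotion port group, giving the hosts Layer 2 adjacency for vMotion traffic even though each cluster keeps its own N1K.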