Bridging Virtual Networking into Real LAN on an OpenNebula Cluster

Posted by user101012 on Server Fault
Published on 2011-11-15T15:51:26Z

I'm running OpenNebula with 1 cluster controller and 3 nodes.

I registered the nodes at the front-end controller and I can start an Ubuntu virtual machine on one of the nodes.

However, from my network I cannot ping the virtual machine, and I am not quite sure whether I have set it up correctly.

The nodes all have a br0 interface, which is bridged with eth0; their IP addresses are in the 192.168.1.x range.
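For completeness, the bridge on each node is configured roughly like this (Debian/Ubuntu /etc/network/interfaces style; the address 192.168.1.10 is a placeholder for each node's own IP):

# /etc/network/interfaces on each node -- sketch, adjust per node
auto br0
iface br0 inet static
    address 192.168.1.10
    netmask 255.255.255.0
    gateway 192.168.1.1
    bridge_ports eth0
    bridge_stp off
    bridge_fd 0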

The Template file I used for the vmnet is:

NAME = "VM LAN"
TYPE = RANGED

BRIDGE = br0 # Replace br0 with the bridge interface from the cluster nodes

NETWORK_ADDRESS = 192.168.1.128 # Replace with corresponding IP address
NETWORK_SIZE    = 126
NETMASK         = 255.255.255.0
GATEWAY         = 192.168.1.1
NS              = 192.168.1.1
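For reference, the VM template attaches the guest to this network with a NIC section naming the vmnet defined above (a minimal sketch; the guest's interface is then enslaved to br0 on the node that runs it):

# VM template fragment (sketch)
NIC = [ NETWORK = "VM LAN" ]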

However, I cannot reach any of the virtual machines, even though Sunstone says that the virtual machine is running and onevm list also states that the VM is running.
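One quick sanity check on the template values: with NETWORK_ADDRESS = 192.168.1.128 and NETWORK_SIZE = 126, the leased addresses should all fall inside the bridged 192.168.1.0/24 subnet. A small Python sketch (assuming, as a simplification, that leases run from NETWORK_ADDRESS + 1 for NETWORK_SIZE addresses):

```python
import ipaddress

# Values from the vmnet template above
network_address = ipaddress.ip_address("192.168.1.128")
network_size = 126
gateway = ipaddress.ip_address("192.168.1.1")
# Subnet implied by NETMASK = 255.255.255.0
subnet = ipaddress.ip_network("192.168.1.0/24")

# Assumed lease range: NETWORK_ADDRESS + 1 .. NETWORK_ADDRESS + NETWORK_SIZE
first_lease = network_address + 1
last_lease = network_address + network_size

# Both ends of the range and the gateway must share the bridged /24,
# otherwise guests could never reach the LAN through br0
assert first_lease in subnet and last_lease in subnet
assert gateway in subnet
print(first_lease, last_lease)  # 192.168.1.129 192.168.1.254
```

So the addressing itself is consistent; if ping still fails, the problem is more likely in how the guest's interface is bridged than in the range.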

It might be helpful to know that we are using KVM as the hypervisor, and I am not quite sure whether the virbr0 interface, which was automatically created when installing KVM (it belongs to libvirt's default NAT network), might be a problem.

© Server Fault or respective owner