I have a machine booting ESXi 5.1 from an Intel 520 180GB SSD, running FreeNAS-8.3.0-RELEASE-x64 (r12701M) in a VM. The four local 1TB 7200 RPM SATA disks are passed to that VM as RDMs: http://blog.davidwarburton.net/2010/10/25/rdm-mapping-of-local-sata-storage-for-esxi/
FreeNAS runs ZFS on the 4 x 1TB (RDM) drives. I also added a 32GB vmdk on the SSD, which FreeNAS uses as a ZFS cache (L2ARC) device.
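Attaching a cache device to an existing pool is a one-liner from the FreeNAS shell. A sketch, assuming the pool is named `tank` and the 32GB vmdk shows up as `da4` (both are placeholder names, not my actual values):

```shell
# Attach the SSD-backed vmdk as an L2ARC cache device.
# "tank" and "da4" are placeholders for this sketch.
zpool add tank cache da4

# Verify the cache vdev now shows up under the pool
zpool status tank
```

Note that an L2ARC device can be added or removed at any time without risking the pool's data, unlike a log (ZIL) device on older ZFS versions.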
Mounting an NFS export requires name resolution of the client. When a client requests an NFS share, the server checks its export list for the requested directory and looks up the client's name in the access list for that share. If the server fails to resolve the initiator's name, it denies the mount request. To avoid this you either need a DNS server on the network, or you must manually enter the client names and IP addresses into the server's hosts file.
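Before blaming permissions or exports, it's worth confirming the server side can actually map the client's IP back to a name. A minimal sketch as a shell helper (works on both FreeBSD and Linux; 127.0.0.1 is used purely for illustration, your client's IP goes there):

```shell
#!/bin/sh
# can_resolve: succeed if this host can map an IP address to a name
# (via /etc/hosts or DNS) -- the same lookup the NFS server performs
# when it checks a mounting client against its export list.
can_resolve() {
    getent hosts "$1" > /dev/null 2>&1
}

# 127.0.0.1 shown for illustration; substitute the NFS client's IP
if can_resolve 127.0.0.1; then
    echo "client resolves; NFS mount should pass the name check"
else
    echo "client does not resolve; add it to /etc/hosts or DNS"
fi
```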
That little bit of news was very helpful during this build, as I had not been able to connect from the ESXi host to the FreeNAS VM running on that same host, despite much tinkering with permissions and network settings. I opened the console on the FreeNAS VM and edited /etc/hosts to give the ESXi server's IP a host name. BAM, NFS mounted with no problem.
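For reference, the whole fix boils down to two steps. The host name, IPs, and share path below are placeholders for this sketch, not my actual lab values:

```shell
# On the FreeNAS VM: give the ESXi host's IP a name it can resolve
echo "192.168.1.10  esx01.lab.local  esx01" >> /etc/hosts

# On the ESXi host: mount the NFS export as a datastore
# (-o = NFS server address, -s = exported path, last arg = datastore label)
esxcfg-nas -a -o 192.168.1.20 -s /mnt/tank/nfs nfs01
```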
Alastair from ProfessionalVMware.com blog has put up a very comprehensive vSphere Lab kit @ http://www.labguides.com/autolab/
I’m sure everyone has some variation of this same setup, but Alastair took the time to automate and document the steps. Now you too can keep a fully operational vSphere 5 cluster with iSCSI storage in as little as 8GB of RAM!!
This is a very easy-to-use network latency simulator for testing how applications behave over high-latency WAN links.
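If you'd rather not run a dedicated appliance, Linux's built-in netem queueing discipline can fake WAN latency on any box sitting in the traffic path. A sketch, assuming `eth0` is the interface facing the test traffic (requires root, and `eth0` is a placeholder interface name):

```shell
# Add 100ms of delay with +/-20ms jitter to everything leaving eth0
tc qdisc add dev eth0 root netem delay 100ms 20ms

# Confirm the netem qdisc is in place
tc qdisc show dev eth0

# Remove it when finished testing
tc qdisc del dev eth0 root
```

Note that netem only shapes egress on the interface it's attached to, so to delay traffic in both directions you'd apply it on both endpoints (or on a router between them).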
vSphere 5 is almost GA and View 5 Beta is already in the hands of approved clients. XenServer 6 beta is out. It’s time for me to update the home lab. Obligatory link to ESX Whitebox HCL!
My ASUS motherboards have onboard Realtek 8111E Gigabit adapters which ESX 4.1 U1 does not support. HOWEVER, much to my surprise ESX5 DOES see my Realtek adapters!!! This makes it kinda depressing when going back to 4.1 U1.
I have the typical 3 box scenario. 2 x ESX hosts and 1 x NAS.
I boot from 2GB USB flash drives since, like most people, I have several lying around from every vendor and partner imaginable. I'm able to swap from ESX5 to 4.1 U1, and back, very quickly.
I've repurposed as many existing hardware components as possible: an HP xw4300 workstation serves as my NAS, with NFS exports for the VMs and ISOs. I also had, as my primary workstation, a very swift AMD Athlon 7750 x2 with 8GB of DDR2-667 RAM, which holds its own against the new AMD Phenom II x6 system.
ESX01 Host (AMD 6-core, 16GB RAM, 3 x GigE NICs, 1 x 80GB SATAII)
ESX02 Host (repurposed AMD 2-core, 8GB RAM, 3 x GigE NICs, 1 x 500GB SATAII)
NAS Host (repurposed HP xw4300, 4GB RAM, 2 x GigE NICs, 2 x 500GB SATAII software RAID1, 1 x 64GB SATAIII SSD)
Switch (8-port, Gigabit, VLAN-capable)
1 x NETGEAR ProSafe GS108T-200NAS Gigabit Smart Switch 10/100/1000Mbps 8 x RJ45
How did I connect all this equipment together? Did it actually work? Is it even functional? All these answers and more when I get time to post Home Lab Configuration and Testing. Hint: With much cursing, yes, and yes.