UP! Edit: the question is whether a Linux-based OS, with its efficient write-back caching, would make a much more efficient VMware host than a Windows OS.
But that is not even a Linux/XP issue. The point here is:
* you have a 64-bit host that most likely has more RAM than...
* ...the outdated 32-bit operating system IN the virtual machine can even address, and
* the host uses that additional RAM for caching.
Run XP 64-bit or a more modern variant, and that advantage disappears, because the guest OS can then address the complete RAM itself.
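The arithmetic behind those bullets can be sketched quickly. This is just an illustration of the address-space gap, not a measurement; the 8 GiB host size is an assumed example:

```python
# A 32-bit OS can address at most 2**32 bytes (~4 GiB), so on a host
# with more RAM than that, the excess is invisible to a 32-bit guest
# but fully usable by a 64-bit host for file-system caching.

GIB = 1024 ** 3

addressable_32bit = 2 ** 32      # hard 4 GiB ceiling of a 32-bit address space
host_ram = 8 * GIB               # assumed example host

# RAM the 32-bit guest can never see, but the 64-bit host can cache with
invisible_to_guest = host_ram - addressable_32bit

print(addressable_32bit // GIB)  # 4
print(invisible_to_guest // GIB) # 4
```

In practice 32-bit XP sees even less than 4 GiB (around 3 GiB, as noted below) because device address ranges are carved out of the same space.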
'That advantage disappears'? I know that - for example, a 64-bit OS will allow access to 4 or 8 GB while the 32-bit XP Pro client will be limited to about 3 GB... but that is not the issue. I'm not sure I understand your reasoning: are you suggesting I use Ubuntu 64 as the host in a VMware environment instead of XP Pro x64, or not? Or does it make no difference, since XP Pro x64 is efficient enough to compete with Ubuntu 64 at caching? The guest(s) will be Windows XP Pro x86 - 32-bit - anyway, with 1 or 2 GB of RAM allocated.
That actually IS the issue when you talk about performance gains through better I/O caching in virtualization. The 64-bit host has access to more RAM to cache with - RAM that the terribly outdated 32-bit operating system you are running cannot even see, so it cannot use it for caching. Any particular reason you are stuck on a TERRIBLY suboptimal 32-bit XP? I am just now upgrading all my stations to Windows 7, and I used 64-bit Vista for ages without any problems. Going 64-bit "native" kills most of the problems one can have with 32-bit (which, agreed, imposes some really bad limits for certain kinds of operations).
That is right. I installed Windows 7 on an ages-old 32-bit computer and configured it as a server. It runs much quicker than 32-bit XP. I updated one Dell 8250 with WS2008R2 and several other 64-bit workstations with Win 7, all configured for maximum power saving. When the monthly electricity bill came, I saw a 3/5 reduction. The new Windows has 5 levels of programmatically accessible power control to gain additional savings. All the computers and the server run significantly faster. The most interesting thing is that the old computer is based on an MSI motherboard and an AMD Duron manufactured in 1999. It has SSE SIMD instructions, and it seems to me Win 7 is using them to good effect.
Similar here, though my 2 local servers did NOT see their power usage go down (according to load levels on the UPS) when I upgraded them to Server 2008 R2. Now I have to upgrade half a dozen machines in my data center location and then start the long haul with the virtual machines - I am waiting for the DVD image of 2008 R2 WITHOUT Hyper-V to be released for those. There is no sense in having Hyper-V installed on a virtual machine that, by definition, will never utilize it. I seriously cannot understand why someone would still run XP, or why someone would be stuck with a 32-bit system if his hardware supports more.
It depends on the load and the type of application. I programmed my server with our code generator to do quick computations at maximum performance. It works in bursts; then the usage goes down, and with it the power draw. Also, there are a lot of old programs not optimized to use all processor resources, so if you power parts down gradually you can actually use only part of the processor. It needs tweaking. Yes, Hyper-V needs CPU support. Why not download the DVD from MSDN? A mystery to me too. I have been on 64-bit for 5 years now; every new computer I bought is 64-bit. It was new 5 years ago, but now all new computers are 64-bit.
Nothing to download - they are not there yet. Note again: I do not mean 2008 R2, I mean 2008 R2 WITHOUT HYPER-V - that is a special DVD of the OS that has no Hyper-V on it at all. It is also cheaper in licensing. I prefer to install that version on the virtual machines, as those, running IN a hypervisor, will never get a chance to run Hyper-V themselves. The physical machines are fine with the full version that I have.