Using virtualization to automate deployment: is it a good idea or not?

As the number of servers needed to run SlideShare increases, we are spending more and more of our time simply deploying our software. Each new box has to have a lot of software installed, configured, and tested before it can be hooked up. Scripting common tasks makes things go faster, but it doesn't resolve the fundamental problem: there's no way to prove that Server A has the exact same configuration as Server B. This makes troubleshooting tricky, obviously.
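To make that concrete, here's roughly the check I wish I could run on any two boxes and then diff the output. This is just a sketch in Python: the use of dpkg and the /etc path are assumptions about a Debian-style setup, not a description of our actual stack.

```python
# Sketch: fingerprint a box's configuration so two servers can be diffed.
# Assumes a Debian-style package manager; paths are illustrative only.
import hashlib
import os
import subprocess


def package_manifest():
    """List installed packages and their versions."""
    out = subprocess.run(
        ["dpkg-query", "-W", "-f=${Package} ${Version}\n"],
        capture_output=True, text=True, check=True,
    )
    return sorted(out.stdout.splitlines())


def config_checksums(root="/etc"):
    """Checksum every readable file under /etc so config drift shows up."""
    sums = []
    for dirpath, _, filenames in os.walk(root):
        for name in sorted(filenames):
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as f:
                    digest = hashlib.sha256(f.read()).hexdigest()
            except OSError:
                continue  # skip unreadable or vanished files
            sums.append(f"{digest}  {path}")
    return sums


if __name__ == "__main__":
    # Run on Server A and Server B, save the output, and diff the two files.
    for line in package_manifest() + config_checksums():
        print(line)
```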
One path we’re starting to consider is virtualization. I haven’t heard of this as a common use for virtualization. Typically, people seem to use software like Xen or VMware to run multiple virtual servers on one physical server, so they can get more use out of existing hardware. We don’t have that problem: all our boxes are in the red! But we would like to be able to roll out new servers reliably, at the push of a button, the way you can spin up a new instance of an image on Amazon EC2 just by typing a single command.
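For the curious, that push-of-a-button step looks something like the sketch below, written with Amazon's boto3 Python library; the AMI ID and instance type are placeholders, not anything we actually run.

```python
# Sketch: launch a new server from a pre-built image on EC2.
# The AMI ID and instance type below are placeholders.
import boto3

ec2 = boto3.client("ec2")
response = ec2.run_instances(
    ImageId="ami-12345678",   # the image that captures the box's configuration
    InstanceType="m1.small",  # placeholder size
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])
```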
The way I look at it, the configuration of a machine is valuable intellectual property, and it needs to be captured so that it can be reproduced whenever we need it. Of course there’s a performance penalty: something like 5-10% of CPU will be consumed by the virtualization software, meaning that overall we’ll need more boxes than we would otherwise. But we’ll be able to set up or rebuild boxes faster, and right now that seems more important to me.
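A quick back-of-the-envelope on what that penalty costs in hardware (the 7% figure is just the midpoint of the range above, and the fleet size is made up):

```python
# Rough math: how many extra boxes does a 5-10% CPU tax translate into?
overhead = 0.07       # assume ~7% of CPU lost to the hypervisor
physical_boxes = 20   # hypothetical current fleet size

# Effective capacity shrinks by the overhead, so the fleet has to grow
# by a factor of 1 / (1 - overhead) to deliver the same total CPU.
needed = physical_boxes / (1 - overhead)
print(f"{needed:.1f} boxes needed, i.e. about {needed - physical_boxes:.1f} extra")
# -> 21.5 boxes needed, i.e. about 1.5 extra
```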
Thoughts? Is this a good idea or not? Has anyone used virtualization in this way? Any recommendations on which software to try first? As always, reply in the comments field below.
Also: a special bonus slideshow on virtualization for your reading pleasure!