Most companies share a common
goal: create a corporate-standard desktop configuration based on a
common image for each operating system version. They want to apply a
common image to any desktop in any region at any time, and then
customize that image quickly to provide services to users.
In reality, most
organizations build and maintain many images, sometimes hundreds. By
making technical and support compromises, purchasing hardware in a
disciplined way, and using advanced scripting techniques, some
organizations have reduced the number of images they maintain to between
one and three. These organizations tend to have the sophisticated
software distribution infrastructures necessary to deploy
applications, often before first use, and to keep them updated.
Business requirements
usually drive the need to reduce the number of images that an
organization maintains. Of course, the primary business requirement is
to reduce ownership costs. The following list describes costs associated
with building, maintaining, and deploying disk images:
Development costs
Development costs include creating a well-engineered image to lower
future support costs and improve security and reliability. They also
include creating a predictable work environment that balances maximum
productivity with flexibility. Higher levels of automation lower
development costs.
Test costs
Test costs include testing time and labor costs for the standard image,
the applications that might reside inside it, and those applications
applied after deployment. Test costs also include the development time
required to stabilize disk images.
Storage costs
Storage costs include storage of the distribution points, disk images,
migration data, and backup images. Storage costs can be significant,
depending on the number of disk images, number of computers in each
deployment run, and so on.
Network costs
Network costs include moving disk images to distribution points and to
desktops. The disk-imaging technologies that Microsoft provides do not
support multicasting, so network costs scale linearly with the number of
distribution points you must replicate and the number of computers to
which you’re deploying.
As the size of an image
file increases, so do its costs. Larger images carry higher updating,
testing, distribution, network, and storage costs.
Even if you update only a small portion of the image, you must
distribute the entire file.
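To put this linear scaling in perspective, here is a rough
back-of-the-envelope calculation; the 4-GB image size and the computer
counts are illustrative assumptions, not figures from any particular
deployment:

    deployment transfer  ≈ image size × number of target computers
    replication transfer ≈ image size × number of distribution points

    4 GB × 100 computers   ≈ 400 GB
    4 GB × 1,000 computers ≈ 4 TB

Doubling the image size doubles every one of these transfers, which is
why trimming what goes into the image pays off quickly at scale.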
Thick Images
Thick images
are monolithic images that contain core applications and other files.
Part of the image-development process is installing core applications
prior to capturing the disk image, as shown in Figure 1. To date, most organizations that use disk imaging to deploy operating systems build thick images.
The
advantage of thick images is simplicity. You create a disk image that
contains core applications and thus have only a single step to deploy
the disk image and core applications to the destination computer. Thick
images can also be less costly to develop, as advanced scripting
techniques are not often required to build them. In fact, you can build
thick images by using BDD with little or no scripting work. Finally, in
thick images, core applications are available on first start.
The disadvantages of
thick images are higher maintenance, storage, and network costs. For
example, updating a thick image with a new version of an application
requires you to rebuild, retest, and redistribute the image. Thick
images also require more storage and consume more network bandwidth in a
short span of time during deployment.
If you choose to build
thick images that include applications, you will want to install the
applications during the disk-imaging process, as in the sketch that follows.
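As a concrete example, the pre-capture step might look like the
following minimal batch sketch. The build share, package names, and
silent-install switches are illustrative assumptions; the fixed idea is
that every core application installs silently on the reference computer
before you run Sysprep and capture the image:

    rem install-core-apps.cmd - run on the reference computer before
    rem Sysprep and image capture. Share and package names are examples.

    rem Core application 1: an .exe-based setup with a silent switch.
    \\buildserver\packages\app1\setup.exe /quiet
    if errorlevel 1 goto :failed

    rem Core application 2: a Windows Installer package.
    msiexec.exe /i \\buildserver\packages\app2\app2.msi /qn
    if errorlevel 1 goto :failed

    echo Core applications installed; ready to capture the image.
    exit /b 0

    :failed
    echo An installation failed; fix it before capturing the image.
    exit /b 1

Checking exit codes matters at this stage: any installation failure here
gets baked into every computer you later deploy.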
Thin Images
The key to reducing image
count, size, and cost is compromise. The more you put in an image, the
bigger and less broadly applicable it becomes. Big images are less
attractive to deploy over a network, more difficult to update regularly,
more difficult to test, and more expensive to store. By compromising on
what you include in images, you reduce both the number of images you
maintain and their size. Ideally, you build and maintain a single,
worldwide image that you customize after deployment. Choosing to build
thin images is a key compromise.
Thin images contain few if any core applications. You install applications separately from the disk image, as shown in Figure 2.
Installing the applications separately from the image usually takes
more time at the desktop and might transfer more total bytes over the
network, but those transfers are spread over a longer period than a
single large image transfer. You can mitigate the network load by using
the trickle-down technology that many software distribution
infrastructures provide.
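For example, the Background Intelligent Transfer Service (BITS) in
Windows can trickle a package down at low priority, using otherwise idle
network bandwidth. The following is a minimal sketch using the bitsadmin
command-line tool; the server, package, and local paths are
illustrative, and BITS is only one example of such trickle-down
technology:

    rem Make sure the local destination folder exists.
    md C:\Packages 2>nul

    rem Queue a low-priority background download so the package
    rem trickles down without saturating the network.
    bitsadmin /transfer AppTrickleJob /download /priority low ^
        http://deployserver/packages/app1.msi C:\Packages\app1.msi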
Thin images have
many advantages. First, they cost less to build, maintain, and test.
Second, network and storage costs associated with the disk image are
lower, because the image file is physically smaller. The primary
disadvantage of thin images is that post-installation configuration can
be more complex to develop initially, but this is offset by the
reduction in costs to build successive images. Deploying applications
outside of the disk image often requires scripting and usually requires a
software distribution infrastructure. Another disadvantage of thin
images is that core applications aren't available on first start, even
though immediate availability might be necessary in high-security scenarios.
If you choose to build
thin images that do not include applications, you should have a
systems-management infrastructure, such as Microsoft Systems Management
Server (SMS) or Microsoft System Center Configuration Manager (SCCM), in
place to deploy applications. With a thin image strategy, you rely on
this infrastructure to deploy applications after installing the thin
image. You can also use this infrastructure for other post-installation
configuration tasks, such as customizing operating system settings.
Hybrid Images
Hybrid images
mix the thin and thick image strategies. In a hybrid image, you
configure the disk image to install applications on first run, giving
the illusion of a thick image, but the applications actually install
from a network source. Hybrid images have most of the advantages of thin
images, yet they aren't as complex to develop and do not require a
software distribution infrastructure. They do require longer
installation times, however, which can raise initial deployment costs.
An alternative is to
build one-off thick images from a thin image. In this case, you build a
reference thin image. After the thin image is complete, you add the core
applications and then capture, test, and distribute the resulting thick
image. Testing is minimized because creating the thick image from the
thin image is essentially the same as a regular deployment. Be wary of
applications that are not compatible with the disk-imaging process, however.
If you choose
to build hybrid images, you will store applications on the network but
include the commands to install them when you deploy the disk image.
This is different from installing the applications in the disk image
itself: you defer application installs that would normally occur during
the disk-imaging process to the image-deployment process, making them a
post-installation task, as the sketch below illustrates. Also, if you
have a systems-management infrastructure in place, you will likely use
it to install supplemental applications post-deployment.
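One common way to defer installation to first run is to register the
install commands in the RunOnce registry key during deployment, so they
execute the first time the computer starts. The following is a minimal
sketch, assuming a hypothetical distribution share and script name;
RunOnce is only one of several mechanisms you could use:

    rem Run during image deployment: register a command that installs
    rem core applications from the network at first start. The share
    rem and script names are illustrative.
    reg.exe add HKLM\Software\Microsoft\Windows\CurrentVersion\RunOnce ^
        /v InstallCoreApps /t REG_SZ ^
        /d "\\distpoint\packages\install-core-apps.cmd" /f

Because the command runs at first start rather than during imaging, the
applications appear to be part of the image even though the bits come
from the network.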