


October 25, 2011
October 2011 Newsletter


TrueSec

News and Geek Stuff

October 2011

 

Great news for all deployment masters-to-be out there: Johan Arwidmark's epic class "Deploying Windows 7 using MDT 2010 and SCCM 2007 SP2 R3" is now available as a video training. Johan takes you through the deployment process in this training, recorded in high definition. It covers the full 3-day class, and in addition to the lab manual with its exercises you also get all the sample scripts used in the training. This is the perfect deployment training in times when travel budgets are in short supply, or when time away from the office is in even shorter supply… Read more.

The market seems ready for the new version, ConfigMgr 2012, judging by the great response to our Mastering ConfigMgr 2012 class with Kent Agerlund. We have three more sessions scheduled in the US over the coming months, so if you want to be ready for the release, or just learn about the future, make sure to book a seat in one of these Master classes.

In our series of free Live Meetings, we have a late addition with Johan Arwidmark and Mikael Nystrom: Windows Intune v2 Managed PC Light. Already this Thursday, October 27, you can hear them give their view on how and why Windows Intune works. Sign up here.

Last but not least, there are 5 more days left to enroll in the upcoming Deployment Geek Week in December at the Early Bird price. Save $300 on this one-of-a-kind training.

Happy reading and stay safe.

Kent Agerlund:

Understanding the new content library store in 5 minutes

Johan Arwidmark:

How CopyProfile really works in Windows 7 deployments

Michael Petersen:
Start Task Sequence from CD on ConfigMgr Enabled PC

Mikael Nystrom:
Diff disk and mklink

Understanding the new content library store in 5 minutes

In ConfigMgr 2012 you will find a lot of improvements in terms of content management. Content management has been largely rewritten for this version and does require "a little getting used to". Below are some of the changes related to the Content Library and a quick recap of the new distribution point features.

Distribution Points  

We only have one type of DP in ConfigMgr 2012: no more branch DPs, DP shares, or hidden PXE shares, just a standard DP which can be installed on a Windows client (Vista) or a Windows server (Windows Server 2003 SP2).

·        You can control the bandwidth when sending content to a remote distribution point

·        You can PXE enable a distribution point

·        You can configure the distribution point for prestaging and only send content changes over the wire.

·        You can prestage multiple applications on a Distribution Point with one command

·        Only files not already present on the distribution point will be sent, even if the file is already in use by another application.

Site Server components

On the site server we still have the Distribution Manager (distmgr.log), which is responsible for handling content between sites. To handle content within a site, we now have a new guy in the class: the Package Transfer Manager (PkgXferMgr.log). In this log file you will be able to see the file copy process between the site server and the distribution point.

Distribution Point components

On the local distribution point you will find a new distribution point WMI provider and the Content Library. The Content Library is very different from the traditional ConfigMgr 2007 shared folder where you used to find all the content.

SMS_DP$

The SMS_DP$ share is used to host the distribution point log files, the files used to validate content, and prestaged content, and it also stores a temporary copy of a package before it is integrated into the Content Library. By reading the PkgXferMgr.log you can easily follow the process.

Content Library

The Content Library consists of three subfolders:
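
As a quick orientation, the layout on the distribution point drive looks roughly like this (the root folder name is the default one; the drive letter is just an example):

D:\SCCMContentLib
    PkgLib   - the Package Library
    DataLib  - the Data Library
    FileLib  - the File Library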

 

The Package Library (pkgLib)

In this folder you will find a PackageID.ini file for every package. In this illustration, it is the second version of the package that exists in the library. ConfigMgr uses the term package when talking about objects in the Content Library, regardless of whether it is an application or an actual package.

 

The Data Library

In the Data Library you will find a PackageID.ver.ini file for each package and a PackageID.ver folder. The folder is a "copy" of the exact package folder structure, but without any of the real files. For each file in the original package you will find a corresponding .INI file with information about the original file.

 

Above you see the Data Library with two packages. Notice the .INI file for each package. Open the .INI file and you will find the HASH for the package.

 

Open the PackageID.ver folder and you will see a copy of the original folder structure, with an .INI file replacing each of the original files.


Each .INI file contains information about the file, such as the HASH value, size, attributes, and a last modified time stamp.

 

The File Library

In here you will find the actual files that are used in the different packages. All files are grouped into folders, where the folder name is the first 4 characters of the file hash (you can see the file hash in the Data Library). In this example I look at the folder for the ccmsetup.cab file, whose hash begins with 504C.

 

Inside the folder you will find 3 files.

 

First you will see a file with no extension; this is the actual file, named with its HASH value. You can copy the file, rename it to the original file name, and start the installation (provided that this is the only file in the package).
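
As a minimal sketch, assuming the content library sits on D: and using a placeholder for the full hash name (take the real value from the .INI file in the Data Library):

REM Copy the extensionless, hash-named file out of FileLib and give it back its original name
copy "D:\SCCMContentLib\FileLib\504C\504C<rest-of-hash>" "C:\Temp\ccmsetup.cab"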

The Hash.ini file contains a link between the file and the packages that use it. If multiple packages refer to the same file, you will simply see an entry for each package.

 

The third file is a signature file called Hash.sig. It contains the original package signature.

I hope the above cleared up a few things about the Content Library.
//Kent

How CopyProfile really works in Windows 7 Deployments

By Johan Arwidmark
Microsoft MVP – Setup and Deployment

There seems to be quite a lot of confusion about how CopyProfile works in Windows 7. The major source of the confusion is that the Windows AIK documentation is incorrect. I felt it was about time to write a post on how it really works :)

The scenario is that you want to make changes to the default user profile. The CopyProfile setting in the unattend.xml file is just one of many ways to do that. In the Resources section I have put links to other methods not covered in this post.

CopyProfile

In the Windows 7 unattend.xml there is a CopyProfile value you can set to true. If you set CopyProfile to true in the specialize pass of the unattend.xml file used to deploy your image, the administrator profile in your image will be copied to the default user profile.
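
As a sketch, the relevant part of such an unattend.xml could look like this (the example assumes a 64-bit image; for a 32-bit image the processorArchitecture would be x86):

<settings pass="specialize">
    <component name="Microsoft-Windows-Shell-Setup" processorArchitecture="amd64" publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS">
        <CopyProfile>true</CopyProfile>
    </component>
</settings>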

Please note that the copy happens when you deploy the image, not when Sysprep runs (like the WAIK documentation incorrectly states).

Step-by-Step Guide

If you are using MDT 2010, steps 1-3 are done automatically, but I explain them anyway.

1.    Deploy Windows 7 to a virtual machine and make sure you have only one single enabled account: the local administrator account (see Limitations).

2.    Customize the administrator profile as you want it to be

3.    Sysprep the machine and capture it

4.    Deploy the captured image by running setup.exe with an answer file where CopyProfile is set to true in the unattend.xml (specialize pass); see the example command below.
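
A minimal sketch of step 4, assuming the captured image's setup files and the answer file are both available on D: (the paths are assumptions):

REM Run Windows Setup with the answer file that has CopyProfile set to true
D:\setup.exe /unattend:D:\unattend.xml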

Limitations

All Customizations to Default User Profile Lost
http://support.microsoft.com/kb/2101557

Additional Resources

Configuring Default User Settings – Full Update for Windows 7 and Windows Server 2008 R2
http://blogs.technet.com/b/deploymentguys/archive/2009/10/29/configuring-default-user-settings-full-update-for-windows-7-and-windows-server-2008-r2.aspx

How to customize the default local user profile when you prepare an image of Windows Vista, Windows Server 2008, Windows 7, and Windows Server 2008 R2
http://support.microsoft.com/kb/973289

 

//Johan

Start Task Sequence from CD on ConfigMgr Enabled PC!

By: Michael Petersen | http://blog.coretech.dk/mip

I guess the first question that comes to mind is WHY!! Well, for me at least, it is sometimes quite annoying that during testing the error messages (if any) shown in the full OS are somewhat limited, and advertisements do not always show up right away.

Here are some Scenarios where problems might occur:

1.     The Windows client is in a workgroup, and advertisements do not seem to show up in Run Advertised Programs (RAP).

2.     The CM client is not working properly, and can't evaluate policies.

3.     You get an error saying "The requested software cannot be located…". This could be caused by a missing package referenced by the TS, packages being updated on the DP, or even policies being re-evaluated.



If we were to boot the machine to the boot image, we would be presented with the TS choice list right away (provided there is a TS available).
In case there is a problem with the TS, then, upon selecting it, we would be presented with an error message box telling us what is wrong; in this case revealing the identity of a missing package.



You could even look in the SMSTS.log and get the same info.

So why not do the same from the running OS? If you, like me, run most tests in a virtual environment, chances are you have already attached a boot media ISO to the client (as it is much faster than PXE). If that is not the case, create a Task Sequence boot media and use that.

If you try to run the CD/ISO without any customization, you will get the message below, stating that you must run all advertised programs from RAP.

All you have to do to make this work, though, is change the following registry value:

For X86: [HKLM\SOFTWARE\MICROSOFT\SMS\Mobile Client]
ProductVersion=4.xx.xxxx.xxxx

For X64: [HKLM\SOFTWARE\WoW6432Node\MICROSOFT\SMS\Mobile Client]
ProductVersion=4.xx.xxxx.xxxx

Change it to some other value, for example ProductVersion=!4.xx.xxxx.xxxx
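
A sketch of the change using reg.exe (the version string is a placeholder; use the actual value you find on your client, and remember to change it back afterwards):

REM x86 client
reg add "HKLM\SOFTWARE\Microsoft\SMS\Mobile Client" /v ProductVersion /t REG_SZ /d "!4.xx.xxxx.xxxx" /f

REM x64 client
reg add "HKLM\SOFTWARE\Wow6432Node\Microsoft\SMS\Mobile Client" /v ProductVersion /t REG_SZ /d "!4.xx.xxxx.xxxx" /f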

Now just right-click the CD drive and choose Open AutoPlay.

You should now be presented with the same Task Sequence Wizard you would see if you booted the machine to WinPE. This also means you will get the same list of task sequences available to the machine (if any), notifications of missing packages, and an SMSTS.log in the CCM\LOGS\SMSTSlog folder.

//Michael

Diff disk and mklink

I work with OS deployment and I also work with our own lab environment, which in combination means that I build task sequences for our own labs. These labs really tend to consume space; we are talking many GB, close to 200 GB for one lab. The problem is not really the space the data consumes, it is more the time it takes to copy it all; so far we do this mostly in the evening or during the night. So, how can we get rid of data we don't need?

Start using reference disks and differencing disks:

To be able to create VMs really fast we can't wait for the installation to finish the "normal" way, which just takes too long, which is why we use a reference image. A reference image is just plain vanilla Windows installed, maybe with some Windows Update at the end to have the image updated with patches, plus some settings. A differencing disk is also easy to understand: it is just a disk that behaves like "read from me as the source, but write to this child disk instead". From a Hyper-V standpoint, it cannot see the difference at all. The effect is less space used, since most of the servers in a lab situation are running the same OS anyway… Let's look at an example.

That means that if we install 4 servers, we base them all on one reference disk. The base disk will consume around 8 GB, and the 4 unique differencing disks (one per VM) will then hold only the difference from the base disk, approximately 2 GB each. So my 4 VMs will use 8 (the base image) + 2 + 2 + 2 + 2 (2 GB of additional unique data each) = 16 GB instead of (8 + 2) * 4 = 40 GB. The bigger the lab, the more I save. To be able to use a differencing disk you first need to create the reference disk.

 Create a reference image:

You can create one using various methods, but here is what I use:

1.      Use Microsoft Deployment Toolkit to create a Ref Image using LiteTouch

2.      Use WIM2VHD to create a Ref Image (see the sketch after this list)
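
A sketch of option 2 (the WIM path, SKU name and destination are assumptions; check the WIM2VHD documentation for the exact parameters of your version):

REM Build a reference VHD straight from the install.wim on the Windows Server 2008 R2 media
cscript WIM2VHD.wsf /wim:D:\sources\install.wim /sku:SERVERENTERPRISE /vhd:C:\Ref\W2K8R2X64SP1.vhd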

Create VM’s using Differencing Disk:

It is not that hard: you just create a new VM, but when it comes to creating the disk you just pick "later".

 

“Later”, meaning when the machine is created (without any hard disk), you open up settings

-        Browse to IDE Controller 0

-        Select Hard Drive and click Add

-        Select New

-        Select Differencing

 


Here is where you select where to create the disk

 

And here is where you select which VHD this differencing VHD should be based on.
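
If you prefer the command line over the wizard, the same kind of differencing disk can be created with diskpart; save the line below as a text file and run it with "diskpart /s <file>" (the child disk path is an assumption, the parent path matches the example used later in this post):

create vdisk file="d:\VMs\DEMO-DC01\Virtual Hard Disks\DEMO-DC01-diff.vhd" parent="c:\Ref\W2K8R2X64SP1.vhd"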

 

The BIG problem

Well, so far everything is nice and friendly, but what happens if you export those VMs so you can import them on another machine? This is what happens: Hyper-V will export all resources and create an import XML file so we can import the VMs somewhere else, and that is almost what I want; I don't need the reference disk, but it is going to be copied anyway. Also a small note: the export function also modifies the path of the parent disk so that it is expected to be in the same folder where the child differencing disk is stored. To make this simple to understand: as soon as you export the machines, all the space savings are lost, since every VM also copies the parent disk to its export folder.

It looks like this after an export:

 

So now it will take just as much disk space when you import the VMs, and this is sad…

The TINY fix – mklink.exe

So one easy way to fix this would be, before importing, to keep just one copy of the parent disk and then copy it to each and every location, but that would not really save space at the destination, only in transit, and that is not good enough. No, the fix is to use "links" in the file system, something that has been around in NTFS for some years now. So the basic steps are going to be:

-        Create Ref

-        Use ref as Parent Difference Disk

-        Export VM’s

-        Create one folder in the export folder

-        Store all the parent disks in that folder

-        Remove all copies of those files from the export folder's subfolders

-        Export done

To import them somewhere else:

-        Copy the content of the Export folder to the location where the VMs are going to run

-        Recreate the missing parent disks using "mklink.exe" and point them to the location where the real files now are

-        Import the machines

-        Use them

Here is the command for mklink.exe:

Mklink.exe "d:\VMs\New Virtual Machine\Virtual Hard Disks\W2K8R2X64SP1.vhd" "c:\Ref\W2K8R2X64SP1.vhd"

This creates a link in the location where Hyper-V believes the disk should be, pointing to the location where it is actually stored.

(Yes I know, you should not lie, but sometimes a lie can really make life a bit easier…)

So from a file system perspective it looks like this when I open the imported folder
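
To verify the result from a command prompt (path as in the mklink example above), a dir of the folder shows the linked disk as a <SYMLINK> entry with the real location in brackets:

dir "d:\VMs\New Virtual Machine\Virtual Hard Disks"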

 

As you can see, my reference disk in this case is just a link to the real object, but Hyper-V thinks it is a working disk and will use it.

/mike

Sample SCRIPT

Yeah, I know it should be PowerShell, VBScript or something really cool, but… Anyway, here is a sample batch file that will import two VMs that have been exported and then modified so that the parent disk is located in the RefDisk folder. You can download this here (including the VBScript that imports the VMs).

REM SET Stuff
SET SOURCE=E:\labfiles\LABX
SET DEST=D:\LAB

REM Set working dir
CD /D %SOURCE%

REM Copy Files
RoboCopy %SOURCE%\source %DEST% /s

REM Link VHD
mklink "%DEST%\DEMO-DC01\Virtual Hard Disks\W2K8R2X64SP1.vhd" "%DEST%\RefDisk\W2K8R2X64SP1.vhd"
mklink "%DEST%\DEMO-DC02\Virtual Hard Disks\W2K8R2X64SP1.vhd" "%DEST%\RefDisk\W2K8R2X64SP1.vhd"

REM Import VM's
cscript importvms.vbs %DEST%\DEMO-DC01 DEMO-DC01
cscript importvms.vbs %DEST%\DEMO-DC02 DEMO-DC02

 //Mike

 

 

Where to find us......

Deploying Windows 7 using MDT 2012 and ConfigMgr 2012 with Johan Arwidmark

Online Live

November 8-10, 2011

Master the deployment process with Lite & Zero Touch with Mikael Nystrom

Minneapolis

December 5-7, 2011

Deployment Geek Week with Johan Arwidmark and Mikael Nystrom

 Redmond

 December 12-16, 2011

Mastering ConfigMgr 2012 with Kent Agerlund

Atlanta

December 12-15, 2011

Master the deployment process with Lite & Zero Touch with Johan Arwidmark

Minneapolis

January 31 - February 2, 2012

 
Full schedule at http://www.truesec.com

 

 

 

 

 

 

 

Unsubscribe | To contact us please email info@truesec.com

TrueSec Inc.
8201 164th Ave NE, Redmond, WA 98052


 




TrueSec Inc    |     +1(425) 285-4477     |     info[at]truesec.com    |     Infrastructure    |     Security    |     Pentesting    |     TrueSec Inc. Website Privacy Statement