


March 23, 2011
March 2011 Newsletter


TrueSec

News and Geek Stuff

March 2011

Thus far, March has been dominated by meetings and classes: the Microsoft MVP Summit, MMS (ongoing as we speak), and, in between, our Deployment Geek Week. The latter was five full days at the Microsoft campus with Mikael Nystrom and Johan Arwidmark at the helm of the deployment ship and a crew of 30 students from all over the globe.

 

 “It was great to have a course taught from experience and not just because the instructor read the book”

“ …It was a rare privilege for me to experience the level of excellence demonstrated by Mikael, Johan, and Per. Although it was a very intense week of taking in vast amounts of information at a break-neck pace, it was enjoyable, professional, and an experience that I will fondly remember. The instructors' knowledge and real world experience was surpassed only by their graciousness. Being instructed on the Microsoft campus was like being at the Geek Vatican. Brother Johan and Archangel Mikael led us in seminary (and sometimes prayer) as we were absolved of our syntax…... I feel very lucky to have been a part of something this special.”


Deployment Geek Week class of March 2011

Next opportunity to attend Deployment Geek Week: London June 6-10 and Seattle July 18-22.

In this issue we introduce one more of our consultants: Niklas Goude, who will contribute on SharePoint 2010 and PowerShell. Niklas is a Microsoft MVP in that area and the author of the book “PowerShell for Microsoft SharePoint 2010 Administrators”.

We will shortly start offering some of our classes together with New Horizons as “OnLine Live” (OLL), a learning delivery method that they have run successfully for the majority of their classes. This way we will be able to offer our labs to a bigger audience that may not have the means or time to travel. Rest assured, however, that we will continue to run our normal in-classroom labs as well!

 


Niklas Goude:
SharePoint 2010 & PowerShell

Mikael Nystrom:
How to create a Windows 7/2008 R2 Reference image for deployment

Marcus Murray:
One of the most critical IT-based attacks ever revealed!

Kent Agerlund:
First Look at System Center Update Publisher 2011

Michael Petersen:
Auto adding files to ConfigMgr Boot Images

One of the most critical IT-based attacks ever revealed!

Today you can read an open letter to all RSA customers on the RSA website: http://www.rsa.com/node.aspx?id=3872

The letter states that an extremely sophisticated cyber-attack has been mounted against RSA and that sensitive information regarding the RSA SecurID products has been successfully extracted from their systems.

 

The severity of this issue is clear to anyone interested or involved in IT security. RSA SecurID products are used to authenticate to systems and networks all around the globe, and right now it is impossible to know how this will affect the massive number of companies, governments, and authorities using RSA products to secure logins to their systems.

 

Also, RSA had, at least until today, one of the highest reputations you can possibly get when it comes to trust. It’s a cold awakening, at least for me, that my trust in their product will never be the same again. It simply can’t be, since it’s now official that bad guys we don’t even know may have access to secrets that could potentially compromise the security of the RSA SecurID solution.

 

RSA states that there is, as of today, no evidence that the attack will directly affect their customers, but at the same time they recommend that customers increase security in several areas of their IT environments.

 

These are the recommendations RSA gives to all their customers:

• We recommend customers increase their focus on security for social media applications and the use of those applications and websites by anyone with access to their critical networks.

• We recommend customers enforce strong password and pin policies.

• We recommend customers follow the rule of least privilege when assigning roles and responsibilities to security administrators.

• We recommend customers re-educate employees on the importance of avoiding suspicious emails, and remind them not to provide user names or other credentials to anyone without verifying that person’s identity and authority. Employees should not comply with email or phone-based requests for credentials and should report any such attempts.

• We recommend customers pay special attention to security around their active directories, making full use of their SIEM products and also implementing two-factor authentication to control access to active directories.

• We recommend customers watch closely for changes in user privilege levels and access rights using security monitoring technologies such as SIEM, and consider adding more levels of manual approval for those changes.

• We recommend customers harden, closely monitor, and limit remote and physical access to infrastructure that is hosting critical security software.

• We recommend customers examine their help desk practices for information leakage that could help an attacker perform a social engineering attack.

• We recommend customers update their security products and the operating systems hosting them with the latest patches.

 

These are very demanding recommendations from RSA, and reading between the lines, I get worried. I don’t think RSA would ask their customers to do all this unless the result of the intrusion was VERY serious.

 

As a security consultant, my guess is that my team and I will be very busy helping RSA customers follow these recommendations for months to come. I have to say, however, that the recommendations are valid for anyone interested in protecting their IT infrastructure, so let the message to RSA customers be a message to all of us.

 

And if you need help in this area, you know where to find me and the Truesec Security Team!

 

Feel free to contact me marcus.murray[at]Truesec[dot]com

 

Stay safe!

/Marcus Murray,

Security Team Manager, Truesec

MVP – Enterprise Security

SharePoint 2010 & PowerShell

 

SharePoint is one of the fastest growing products in history, and it is quickly becoming mission critical for numerous companies around the world. Whereas SharePoint 2007 was a really cool product with an automation API, its use for automation purposes was a bit complicated for the average SharePoint admin. This is where the inclusion of Windows PowerShell as a management tool for SharePoint 2010 comes into play.

In SharePoint 2010 you can start Windows PowerShell through the SharePoint 2010 Management Shell. The shell runs the SharePoint.ps1 script at startup and executes the following code:

 

$ver = $host | select version

if ($ver.Version.Major -gt 1)  {$Host.Runspace.ThreadOptions = "ReuseThread"}

Add-PsSnapin Microsoft.SharePoint.PowerShell

Set-location $home

 

The code in the example above stores the host’s version in a variable, and if the major version is greater than 1 (that is, if you are running PowerShell v2), the ThreadOptions property is set to “ReuseThread”, which runs each line, function, or script on the same thread. When working with the SharePoint object model from PowerShell, running code on separate threads can cause memory leaks, while commands running on the same thread have a smaller chance of doing so. This is because some SharePoint objects still use unmanaged code, and because of the way memory is allocated to those objects. Finally, the SharePoint snap-in is loaded (snap-ins are Microsoft .NET Framework assemblies that may contain custom Windows PowerShell cmdlets).
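If you prefer working from a plain (64-bit) Windows PowerShell console instead of the SharePoint 2010 Management Shell, you can load the snap-in yourself. A minimal sketch; the check simply avoids an error if the snap-in is already loaded:

  PS > if ((Get-PSSnapin -Name Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue) -eq $null) {
  >>     Add-PSSnapin Microsoft.SharePoint.PowerShell
  >> }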

 

The SharePoint 2010 cmdlets

 

The SharePoint 2010 snap-in for Windows PowerShell contains more than 500 cmdlets that you can use to perform a large variety of administrative tasks. Let’s see how we can list all the SharePoint cmdlets using the Get-Command cmdlet. Get-Command returns basic information about cmdlets and other elements of Windows PowerShell commands, such as functions, aliases, filters, scripts, and applications. All nouns of the SharePoint 2010 cmdlets start with “SP”. Knowing this, we can use Get-Command’s –noun parameter followed by SP*:

 PS > Get-Command -Noun SP*

 

The list of cmdlets returned is pretty long. You can use Get-Command to find specific SharePoint 2010 cmdlets. If, for example, you want to find all the cmdlets that are used to manage site collections, you can simply type:

 

  PS > Get-Command -Noun SPSite 

 

  CommandType     Name               Definition
  -----------     ----               ----------
  Cmdlet          Backup-SPSite      Backup-SPSite [-Identity] <SPSitePip
  Cmdlet          Get-SPSite         Get-SPSite [-Limit <String>] [-WebAp
  Cmdlet          Move-SPSite        Move-SPSite [-Identity] <SPSitePipeB
  Cmdlet          New-SPSite         New-SPSite [-Url] <String> [-Languag
  Cmdlet          Remove-SPSite      Remove-SPSite [-Identity] <SPSitePip
  Cmdlet          Restore-SPSite     Restore-SPSite [-Identity] <String>
  Cmdlet          Set-SPSite         Set-SPSite [-Identity] <SPSitePipeBi

 

 

Working with the SharePoint 2010 cmdlets

 

Let’s see what we can do with the Get-SPSite cmdlet. Typing the cmdlet in PowerShell returns the Site Collections available:

 

  PS > Get-SPSite

 

  Url
  ---
  http://spserver

 

Notice how the command displays only the site collection’s Url property, although the returned objects have a lot more properties. The properties displayed by default are controlled by a set of formatting files. Windows PowerShell includes ten formatting files, and SharePoint 2010 comes with 13 additional formatting files that are used to generate a default display of various .NET objects.
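If you want to see every property on the returned object rather than the default view, a quick option is to pipe it to the standard Format-List cmdlet:

  PS > Get-SPSite -Identity http://SPServer | Format-List -Property *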

 

We can display additional properties using the Select-Object cmdlet.  In the example below we use the –Identity parameter supported by the Get-SPSite cmdlet to retrieve a specific Site Collection and pipe the object to the Select-Object cmdlet.

 

  PS > Get-SPSite -Identity http://SPServer | Select-Object -Property Url, Zone, Port

 

  Url               Zone      Port
  ---               ----      ----
  http://spserver   Default   80

 

It’s also possible to change specific properties on a Site Collection. First, let’s see how we can add a secondary contact to the Site Collection using the Set-SPSite cmdlet.

 

  PS > Get-SPSite -Identity http://SPServer |
  >> Set-SPSite -SecondaryOwnerAlias domain\user

 

If we use the Select-Object cmdlet again and display the SecondaryContact property, we’ll see that the user has been added as a secondary contact to the Site Collection.

 

  PS > Get-SPSite -Identity http://SPServer | Select SecondaryContact

 

  SecondaryContact
  ----------------
  Domain\user

 

You can also store an object of the type SPSite in a variable and set the SecondaryContact property. The property requires an object of the type Microsoft.SharePoint.SPUser – which is just the type of object that the Get-SPUser cmdlet returns. Note that the user has to exist in the Site Collection.

 

  PS > $spSite = Get-SPSite -Identity http://SPServer
  PS > $spSite.SecondaryContact =
  >> (Get-SPUser -Web http://SPServer -Identity domain\user)

 

What if you want to add a user that exists in Active Directory but does not exist in the Site Collection? Simply use the New-SPUser cmdlet to add the user to the Site Collection and then assign the object to the SecondaryContact property.

 

  PS > $spUser = New-SPUser -Web http://SPServer -UserAlias domain\newuser
  PS > $spSite.SecondaryContact = $spUser

 

When we are done with the object stored in the $spSite variable, it is important to dispose of it correctly. One way of doing this is by calling the Dispose() method, as shown below.

 

PS > $spSite.Dispose()

 

Why do we have to dispose of the object? Well, SPWeb, SPSite, and SPSiteAdministration objects can sometimes take up large amounts of memory, so using any of these objects in PowerShell requires proper memory management. Normally, instances of these objects obtained through cmdlets such as Get-SPSite are disposed of automatically at the end of the pipeline, but this does not happen to instances stored in variables. You can dispose of objects using the Dispose() method as demonstrated in the example above or you can use the Start-SPAssignment and Stop-SPAssignment cmdlets that were introduced in SharePoint 2010 to spare scripters the need to dispose of such objects individually.
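As a small sketch of the assignment approach, reusing the http://SPServer site and domain\user account from the examples above: Start-SPAssignment with the -Global switch tracks the objects retrieved until the matching Stop-SPAssignment, which then disposes of them for you.

  PS > Start-SPAssignment -Global
  PS > $spSite = Get-SPSite -Identity http://SPServer
  PS > $spSite.SecondaryContact = (Get-SPUser -Web http://SPServer -Identity domain\user)
  PS > Stop-SPAssignment -Global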

 

Be sure to check out “PowerShell for SharePoint 2010 Administrators” for detailed examples on how to automate your SharePoint 2010 environment using Windows PowerShell.

Regards

Niklas Goude

 

 

 

Auto adding files to ConfigMgr Boot Images

Sitting in my seat at approximately 30,000 feet on my way to MMS in Vegas, I thought I might as well spend some time writing a post on the good old OSDInjection.xml file.

As some might know, this file can be used to customize which files get added to the boot images when updating the distribution point. This method of adding files to the boot image has been around since at least ConfigMgr SP1 (or that’s when I started using it, anyway), but it still seems that not many people take advantage of it.

So here is how I use it, and a description of the syntax:

First, navigate to \\siteserver\sms_sitecode\bin\i386 and make a copy of OSDInjection.xml. This is to make sure we can return to the original file if something goes wrong; OSDInjection.xml is also the file that we are going to modify.
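For example, something like this (just a sketch; siteserver and sms_sitecode stand in for your own site server and site code, and the .bak name is only an example):

  PS > Copy-Item \\siteserver\sms_sitecode\bin\i386\osdinjection.xml `
  >>   \\siteserver\sms_sitecode\bin\i386\osdinjection.xml.bak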

Now navigate to \\siteserver\sms_sitecode\OSD and create a folder called custom.

 

 

 

In that custom folder we will now place the files we want to add to the boot image whenever we update the distribution point (remember that the boot image is rebuilt each time this is done).

In my case I have placed a tools folder for different tools I might want to add, a custom media hook file, the media hook executable, and Trace32.

 

 

 

I obviously want to place these files in different locations in the boot image: TSConfig.ini must of course be in the root directory, and Trace32 I want in System32 so I can launch it from anywhere. That’s where the OSDInjection.xml file comes into play. Open OSDInjection.xml and add the following custom section just beneath <InjectionFiles>.
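To give a rough idea, the sketch below shows what such a custom section could look like. The element and attribute names are assumptions on my part, so mirror the layout of the entries already present in your own OSDInjection.xml; the values themselves (the file names, the custom source folder, the destinations, and imgArch) are the ones discussed below.

  <!-- Sketch only: copy the element/attribute layout from the existing entries
       in your OSDInjection.xml; the names below are illustrative assumptions. -->
  <file name="TSConfig.ini" imgArch="x86" source="custom\TSConfig.ini" destination="\" />
  <file name="RunHTA.exe"   imgArch="x86" source="custom\RunHTA.exe"   destination="\" />
  <file name="*"            imgArch="x86" source="custom\tools"        destination="Tools" />
  <file name="Trace32.exe"  imgArch="x86" source="custom\Trace32.exe"  destination="windows\system32" />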

 

 

 

 

You will notice that I placed TSConfig.ini and RunHTA.exe in the destination “\”, which resolves to the root of the boot image (the X: drive, if you will).

The tools folder content is placed in the destination Tools, and since that folder does not exist, it will be created in the root. Also notice that by using file name=”*”, all files from the source custom\tools will be copied to that destination in the boot image.

Finally, Trace32.exe is placed in the destination windows\system32. If you want to place many files in System32, you might want to use the same wildcard approach as for the tools folder.

Now all you have to do is update your x86 boot image on the distribution point, and the files will automatically be added. Should you then want more files, or perhaps want to change the existing ones, all you have to do is make the necessary changes to your custom folder and update the distribution point again.

Note: If you want to update your x64 boot image, you must change “imgArch” to “x64”.

 

Best Regards

Michael Petersen

First look at System Center Update Publisher 2011

Here at MMS 2011, day 1, I have had the pleasure of attending two great SCUP 2011 sessions. From my test drive I have made my personal top list of new and improved features.

  1. Performance
    1. The application has undergone a major improvement when it comes to performance. Where it could take 10-15 minutes to load some of the larger catalogs in SCUP 4.5, the same process today takes less than a minute.
  2. New UI
    1. The admin UI has moved away from the MMC (big applause) and is now built to look like the other System Center products.
  3. New authoring experience
    1. Authoring updates is much easier and also introduces some new features, like adding a list of required updates to your update.
    2. You can control which update(s) are being superseded.
    3. The applicability rules are also much easier to create and modify; you get the same look and feel as with the DCM feature in ConfigMgr.
  4. Rules
    1. We can create “global” rules that can be used in multiple updates. You can use the rules as templates; once a rule is added to an update, you can still modify it. You add a saved rule in the same way as any other applicability rule.
  5. Publications
    1. Publications are a new feature that allows you to group your updates into logical groups. Each group can be published independently; there is no need to publish all updates as in version 4.5.
  6. Import catalogs
    1. The way we import catalogs has changed. We are no longer forced to choose between a single .cab file and a bulk import; all subscribed catalogs are listed and can be multi-selected.
  7. Compatibility
    1. Finally, the schema hasn’t changed, which allows us to reuse all previous updates. There is no direct upgrade path; updates must be exported from SCUP 4.5 and can then be imported directly into SCUP 2011.
    2. Certificates can be reused, eliminating the need to redistribute anything to existing endpoints.

Best Regards

Kent

 

How to create a Windows 7/2008 R2 Reference Image for deployment

Over the years I have met many customers, which means I have also seen many different approaches to the subject “How to create a reference image”. I have finally decided to, once and for all, really explain how. It doesn’t fit into one blog post, so there will be a series of them; here is the first one:

http://itbloggen.se/cs/blogs/micke/archive/2011/03/22/how-to-create-a-windows-7-2008r2-reference-image-for-deployment-part-1.aspx

This one only covers the basics, but my plan is to add the rest as fast as I can (I need to work from time to time, sorry about that).

There are some highlights you need to think of, and the most important are these:

 

Never, ever create reference images on physical hardware if you are going to use the same image on other systems. Yes, I know, you have tried it and it did work. The problem is that it will most likely give you a fantastic headache sooner or later, so don’t do it. Install the reference image on a “generic” machine, that is, a VM.

The reasons are many, but here is an example: let’s say that you install the reference image on a physical machine, and during the installation it downloads a driver for some piece of hardware. Fine, but here comes the problem: along with that driver it also installs an application, tattoos the registry, turns off a service, and does some other nasty things. Everyone seems to think that the driver itself is the issue, but it’s not; it is the other things that are nasty.

Always Sysprep the machine. Strangely, I keep having to tell people about this over and over again. It is not an option, it is a “must do”, and there are a bunch of reasons for this, but the basic one is that the Sysprep process generalizes the image, preparing it to be re-imaged; otherwise it is not prepared. Most people argue that it is not needed since the SID is going to be changed anyway. The SID change is just 1% of what Sysprep does, and that part is not the important one; it is the other 99% of things that happen in the OS that matter.
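If you let MDT build and capture the reference image, Sysprep is run for you as part of the capture task sequence. For reference, a manual generalize-and-shut-down pass on the reference machine typically looks something like this:

  C:\Windows\System32\Sysprep\sysprep.exe /generalize /oobe /shutdown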

So do it the easy way, use MDT to create the ref image.

/mike

 

 

 

 



Where to find us...

Mastering SCCM 2007 SP2 R3 with Kent Agerlund

Cleveland, OH    

April 18-21

 Deploying Windows 7 using MDT 2010 and SCCM 2007 SP2 with Johan Arwidmark

New York City

April 12-14

Mastering MDT and WDS with Mikael Nystrom

Chicago

April 4-6

Build the Cloud Infrastructure with Mikael Nystrom

Chicago

April 7

Build the Cloud Infrastructure with Mikael Nystrom

Boston

April 8

Deployment Geek Week with Johan Arwidmark and Mikael Nystrom

London, UK

June 6-10

Deployment Geek Week with Johan Arwidmark and Mikael Nystrom

Redmond, WA

July 18-22

Full schedule at http://www.truesec.com


TrueSec Inc.
8201 164th Ave NE, Redmond, WA 98052


 



