You are browsing the archive for 2011 August.

Live meeting: How to prepare for SCOM 2012

7:22 am in Uncategorized by Dieter Wijckmans


On the 20th of September I’ll be hosting a live meeting where we’ll go over the different steps to prepare yourself and your environment for the move from SCOM 2007 to SCOM 2012. The upgrade path is said to be easier than the one from MOM 2005 to SCOM 2007 (thank God), but there are still some things to keep in mind and consider before moving to the new version when it’s released.

So join me on the 20th of September to prepare yourself for the next version of the SCOM software family.

The abstract of the topics covered (more to come):

  • Backup and Document your environment.
  • Health Check your SCOM installation.
  • Compare management packs for SCOM2007 and SCOM2012.
  • Supported upgrade paths.

Link to join in:

I’ll be prepared for SCOM 2012… will you?

SCOM 2007: How to add Custom Field to monitor

8:39 am in Uncategorized by Dieter Wijckmans

In SCOM 2007 it’s possible to fill in custom fields with rules like you did in MOM 2005, as explained here:

However, this is not possible in monitors because there’s a fundamental change in how the alerts are created. In rules, the GenerateAlert module is used to create the alerts, and this module can be passed extra data such as the custom fields. In monitors, alert creation is slightly different: the alert is generated with parameters in the monitor itself, so it’s not possible to pass extra data.


For a client I’m migrating from MOM 2005 to SCOM 2007 R2, and of course I would like to take advantage of the fact that I can create monitors instead of rules. My client has a mainframe-based problem management system (it could also be any other system without a connector) which uses a mail scrubber to read incoming mails and scan for specific keywords to create tickets.


The specific keywords were passed in MOM through the custom fields. This is also possible in SCOM, but only by using rules, not monitors. A workaround could be to create a monitor plus a separate alert-generating rule for it. This solves the issue but is hard to manage: if things change, you have to update both the monitor and the rule to make sure they reflect the new situation.


Therefore I came up with another solution. Because there are only 15 possible combinations of keywords at my client, I chose to use the subscription / notification channel to insert the keywords in the dbase before sending them to the problem management system. I could have just passed the parameters into the mail and sent it, but I prefer to update the dbase as well so the changes are also reflected in the alert.

I’ve based my script on the script I used earlier, which is featured here:

The main difference with the script above is that instead of reading the custom fields out of the dbase, I pass them with the notification channel. This makes the keywords centrally manageable when they change.

The script:

As mentioned above, I mostly reused the script from my previous blog post, but for the record I’ll explain it here once more:

First of all. You can download the script here:

Preparing the environment and reading the parameters:

The main difference with the previous script is that we are not reading the data out of the dbase (the $_.customfield fields) but inserting data into the dbase through parameters by using the script.

Parameters: The “param” statement needs to be on the first line of the script. In my case I’m reading 3 parameters: the AlertID (which is mandatory for the script), the Problemtype and the Objecttype.

The last 2 fields will be inserted into the $_.customfield dbase fields and are needed by the third-party problem management solution to make the proper escalation.
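As a sketch (the parameter names match the notification channel command line later in this post), the first lines of the script look roughly like this:

```powershell
# Must be the very first statement in the script.
# $sAlertid is mandatory; the other two carry the keywords
# for the problem management system.
param([string]$sAlertid, [string]$Problemtype, [string]$Objecttype)

# Bail out early when no AlertID was passed.
if ($sAlertid -eq "") {
    Write-Host "No AlertID passed - exiting."
    exit 1
}
```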

RMS: Read the RMS server name of your environment. If you are using a clustered RMS, it’s better to fill in the name of the cluster and comment out the automatic retrieval of the name to avoid problems.
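A minimal sketch of the RMS retrieval (the registry value is the standard SCOM 2007 setup location on a management server):

```powershell
# Read the RMS name from the local SCOM registry settings.
$regkey = "HKLM:\SOFTWARE\Microsoft\Microsoft Operations Manager\3.0\Setup"
$RMS = (Get-ItemProperty -Path $regkey).DefaultSDKServiceMachine

# For a clustered RMS, comment the lines above out and hard-code
# the cluster name instead (placeholder value):
# $RMS = "rmscluster.yourdomain.local"
```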

Resolution State: The resolution state needs to be defined here and also in the SCOM environment. For more details on how to configure this in SCOM, check here:

Loading the SCOM cmdlets:

Culture Info: To make sure the date format is correct you need to fill in the localization. In my case it’s nl-BE.
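A sketch of the culture setup, assuming nl-BE; swap in your own locale:

```powershell
# Create a CultureInfo object so dates are formatted in the
# localization the problem management system expects.
$cultureinfo = New-Object System.Globalization.CultureInfo("nl-BE")

# Example: short date in the Belgian-Dutch pattern (dd/MM/yyyy).
$shortdate = (Get-Date).ToString("d", $cultureinfo)
```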

Read in alert + fill in custom fields: The alertID which was passed as a parameter is read here and its data is retrieved from the dbase. The custom fields required by the problem management system are filled in and updated in the dbase. Technically there’s no obligation to store the fields in the dbase, but to make sure the custom fields show up when you open the alert in the console, I update the alert anyway.
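A hedged sketch of this step, assuming $RMS and the parameters come from the earlier steps; the resolution state value 15 is an assumed custom state (use whatever ID you defined in SCOM):

```powershell
# Load the SCOM cmdlets and connect to the RMS.
Add-PSSnapin "Microsoft.EnterpriseManagement.OperationsManager.Client"
Set-Location "OperationsManagerMonitoring::"
New-ManagementGroupConnection -ConnectionString:$RMS | Out-Null
Set-Location Monitoring:\$RMS

# Retrieve the alert by its ID and fill in the custom fields so the
# console reflects the data sent to the problem management system.
$oalert = Get-Alert -Id $sAlertid
$oalert.CustomField1 = $Problemtype
$oalert.CustomField2 = $Objecttype
$oalert.ResolutionState = 15    # assumed custom resolution state
$oalert.Update("Custom fields filled in by notification script")
```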



Error handling: set up to make sure we have proper error handling.
Conversion: the data is converted before being dumped to the file.

Note that I needed to modify the date format to reflect the localization here. All the data is dumped to a file which is kept for future reference. The file path (highlighted in yellow in the screenshot) can be changed to reflect your location.
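A sketch of the conversion and dump step; the output folder is an assumption, change it to your own location:

```powershell
# Format the alert timestamp with the nl-BE culture before writing.
$cultureinfo = New-Object System.Globalization.CultureInfo("nl-BE")
$timeraised  = $oalert.TimeRaised.ToLocalTime().ToString("dd/MM/yyyy HH:mm:ss", $cultureinfo)

# Dump the alert data to a file kept for future reference.
$filepath = "C:\scripts\monitorcustomfields\output\$sAlertid.txt"   # adjust to your location
"AlertID     : $sAlertid"          | Out-File $filepath
"Name        : $($oalert.Name)"    | Out-File $filepath -Append
"TimeRaised  : $timeraised"        | Out-File $filepath -Append
"ProblemType : $Problemtype"       | Out-File $filepath -Append
"ObjectType  : $Objecttype"        | Out-File $filepath -Append
```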


Mailing section:

The file is mailed to the problem management system, or, in case an error occurred, the SCOM admin is alerted. Make sure you fill in the OK recipient, the NOK recipient and the SMTP server used to send out the mail.
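A sketch of the mailing section using System.Net.Mail (works on PowerShell v1); the addresses and server are placeholders, and $errorcount stands for whatever error flag your script tracks:

```powershell
# Placeholders - fill in your own addresses and server.
$okrecipient  = "problemmgmt@yourdomain.local"   # normal delivery
$nokrecipient = "scomadmin@yourdomain.local"     # on error
$smtpserver   = "smtp.yourdomain.local"

$msg = New-Object System.Net.Mail.MailMessage
$msg.From = "scom@yourdomain.local"
if ($errorcount -eq 0) {
    # Send the dump file to the problem management system.
    $msg.To.Add($okrecipient)
    $msg.Subject = "Problem ticket request for alert $sAlertid"
    $msg.Attachments.Add((New-Object System.Net.Mail.Attachment($filepath)))
} else {
    # Something went wrong: alert the SCOM admin instead.
    $msg.To.Add($nokrecipient)
    $msg.Subject = "ERROR: problem creation script failed for alert $sAlertid"
}
$smtp = New-Object System.Net.Mail.SmtpClient($smtpserver)
$smtp.Send($msg)
```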


Last but not least, we write an event to the event log stating whether the operation was successful or not. This gives us the opportunity to monitor the problem creation script from within SCOM.
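Sketch of the event log write; the event IDs are arbitrary choices and $success stands for the script’s own error flag:

```powershell
# Write the result to the Operations Manager event log so the
# script itself can be monitored from within SCOM.
$evtlog = New-Object System.Diagnostics.EventLog("Operations Manager")
$evtlog.Source = "Health Service Script"   # existing source on a SCOM agent/server
if ($success) {
    $evtlog.WriteEntry("Problem created for alert $sAlertid", "Information", 9000)
} else {
    $evtlog.WriteEntry("Problem creation FAILED for alert $sAlertid", "Error", 9001)
}
```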

This solution works for me because I have a limited number of possible combinations.

A couple of things you need to configure before this script can be used in production:

  1. Create custom resolution state in SCOM:
  2. Create a notification channel per combination of the values of the custom fields:


    • The following data is needed for the notification channel
        • Full Path of the command file: In my case this is PowerShell as I would like to run a PowerShell script: C:\Windows\System32\WindowsPowerShell\v1.0\Powershell.exe
        • Command Line Parameters: In my case I’m running a PowerShell script and I’m passing the AlertID of the specific alert, problemtype and objecttype as arguments which I’m using in my script. Again you can use any arguments here if you like: C:\scripts\monitorcustomfields\create_customfields_monitors.ps1 -sAlertid ‘$Data/Context/DataItem/AlertId$’ -Problemtype ‘OSSUP’ -Objecttype ‘server’
        • Startup folder for the command line: This is basically the path of your program you want to run. C:\scripts\monitorcustomfields

The script must run on the RMS (if it’s a clustered RMS, make sure the script is present on both cluster nodes in the same location).

Note: If you want to use more parameters or different names, you have to change the following things:

  • In the command line parameters: add -parametername ‘desired value’ to the command line
  • In the script:
    • change the first line in the script to reflect your custom situation: param([string]$parametername,…)
    • change the section where the custom fields are filled in to reflect your situation:
      • $oalert.customfield# = desired value

There are 10 custom fields available in the dbase, so you can pass up to 10 parameters into the script and thus into the custom fields.
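For example, adding a hypothetical extra keyword “Priority” would mean (sketch):

```powershell
# 1. Extra argument on the notification channel command line:
#    ... -Objecttype 'server' -Priority 'high'

# 2. Extended param statement on the first line of the script:
param([string]$sAlertid, [string]$Problemtype, [string]$Objecttype, [string]$Priority)

# 3. Extra custom field assignment in the alert update section:
$oalert.CustomField3 = $Priority
```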

If you have remarks or questions regarding the script, please do not hesitate to drop me a line or contact me on Twitter: @dieterwijckmans

Scug Event: System Center Orchestrator notes from the field

8:37 am in Uncategorized by Dieter Wijckmans

SCUG Belgium is holding its first offsite event after the summer.

Join us on Thursday 29 September 2011 to discuss the brand new version of SCORCH 2012 (the product formerly known as Opalis).

This event will consist of two one-hour sessions:

Session 1: System Center Orchestrator 2012 overview will go over the new version of SCORCH and how SCORCH can help you automate tasks and create workflows. This session is given by Kurt Van Hoecke, a member of the SCUG and an SCSM specialist. You can find his blog here:

Session 2: System Center Orchestrator 2012 migration plan will give you an overview of the things to consider when migrating from a previous version to the new one. This session is given by Christophe Keyaert, a member of the SCUG specialized in SCORCH / Opalis. You can find his blog here:

Don’t miss this great opportunity to view this new release in action and sign up here:

Update: We’ve added a live meeting as well, so if you can’t make it to Belgium, make sure to follow along by live meeting!

See you there!

SCOM 2007: Utility to Count Total Instances in Your Management Group

7:24 am in Uncategorized by Dieter Wijckmans

SCOM is a great product, but from time to time you need a custom-built tool or script to do just the thing, or change just the bit, that’s not possible in the SCOM console.

I’m personally a huge fan of the PowerShell cmdlets supplied with SCOM. For most tasks (whether automating or extending SCOM) they do the trick quickly and easily.

From time to time a tool passes by on the web that fills a gap and makes our lives as SCOM admins easier.

Yesterday another of these fine tools emerged:

Note: You need to register to download the tool.

This is the first version of a nice tool to count the instances per management group, which can be helpful when troubleshooting your environment. The PowerShell script posted in the community a while back sometimes took 3 hours to complete the task, while this nice .NET program takes minutes…

You need .NET Framework 4 to run the tool.

Keep an eye on the topic, because I’m sure the tool will progress over the next days, as the authors mentioned in the topic itself.

SCOM 2007R2: Ramp Up Guide

1:16 pm in Uncategorized by Dieter Wijckmans

I came across an excellent list, written by Sonda (an MPFE in the UK), of all the resources you’ll need to get up to speed with SCOM 2007 R2.

There’s plenty of info for the absolute rookie and the novice.

You can find the exhaustive list here:

If you are looking for a similar list of links to resources for Service Manager, look no further: Kurt Van Hoecke created a nice list of all the resources you’ll need to get going.

Happy reading!

SCOM 2007: Cumulative Update 5 is released

12:16 pm in Uncategorized by Dieter Wijckmans

Yesterday Microsoft released Cumulative Update 5 (CU5) for SCOM 2007 R2.

This new update contains additional fixes for Operations Manager 2007 R2 and adds support for Red Hat Enterprise Linux 6.

You can download the CU5 package (948.0 MB) here:

The KB2495674 article apparently is not online yet but can be found here:

For instructions on how to install a CU package in general (the blog is written for CU4, but the best practices apply to the installation of CU5 as well) you can check this:

SCOM2007: How to backup your Reporting

1:38 pm in Uncategorized by Dieter Wijckmans

This blog post is part of a series on how to back up your SCOM environment.

You can find the other parts here:

Another part of the process of backing up your environment, and thus making sure all the data is available to restore it, is backing up the Reporting Services dbase, which basically contains all your reports.

The standard reports can easily be recreated by reimporting the management packs, but custom reports will be lost if you do not have a backup.

This process consists of 4 steps:

  • Backing up the report server databases (Reportserver and Reportservertempdb)
  • Backing up the encryption keys
  • Backing up the config files
  • Backing up the data files

Let’s get started!

Backing up the Report Server Databases

The 2 dbases to back up are Reportserver and Reportservertempdb. Although it’s not absolutely necessary to back up the Reportservertempdb to restore your environment, it will definitely save you some time in the process: if you lose your Reportservertempdb you’ll have to recreate it. So while you’re at it, take a backup of the Reportservertempdb as well.

You can use any backup method allowed by SQL, whether it’s System Center Data Protection Manager, third-party software or the built-in SQL backup process.

I’ll be using the built-in SQL backup:

Open Microsoft SQL Server Management Studio and browse to your server / dbase:


Right click your reporting dbase and choose Tasks > Back Up…

Leave the backup type as Full, change the name (if you like; otherwise use the default name) and check the location of the file.

Caution: Make sure you choose a file location that is included in your normal day-to-day file backups, so you still have the file in your backup system if your server is completely lost.

If all goes well you’ll get the message that your backup was successful.


Now repeat the steps above for the Reportservertempdb and save it in the same location as your Reportserver backup.
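The same two backups can also be scripted instead of using the GUI, for example with sqlcmd (the instance name and backup path are assumptions, change them to your own):

```powershell
# Full backups of both Reporting Services dbases via sqlcmd.
$instance = "."            # default instance; use ".\INSTANCENAME" for a named instance
$bakdir   = "D:\Backups"   # a location covered by your file backups

sqlcmd -S $instance -Q "BACKUP DATABASE [ReportServer] TO DISK = N'$bakdir\ReportServer.bak' WITH INIT"
sqlcmd -S $instance -Q "BACKUP DATABASE [ReportServerTempDB] TO DISK = N'$bakdir\ReportServerTempDB.bak' WITH INIT"
```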

Backing up the Encryption Keys

This encryption key is used to encrypt sensitive information in the dbase to ensure the safety of the data in it. You normally only have to save this key once, as it has a 1-on-1 relationship with the dbase and the symmetric key.

The key needs to be restored in the following cases:

  • Changing the Report Server Windows service account name or resetting the password.
  • Migrating a report server installation to use a different report server dbase.
  • Recovering a report server installation after hardware failure.
  • Renaming the computer or instance that hosts the report server.

Open your Reporting Services configuration connection by choosing Start > All Programs > Microsoft SQL Server ‘version’ > Configuration Tools > Reporting Services Configuration Manager

A dialog box will appear to check the Server name and the report Server Instance:


If they are correct, click Connect.

On the next page choose Encryption Keys and in the right pane click the Backup button.

Choose the file location + name by clicking the … button.


Fill in a password. This password is used to encrypt the file, so make sure you use one you’ll remember: there’s no way to restore the key without it, and there’s also no way to reset the password on the exported SNK file.

If all goes well, the key has been backed up and you receive the “Creating Encryption Key Backup” successful message at the bottom.
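If you prefer the command line over the Configuration Manager GUI, the rskeymgmt utility that ships with Reporting Services can do the same extraction (the file path and password here are placeholders):

```powershell
# Run from the Reporting Services Tools\binn folder on the report server.
# -e extracts the encryption key, -f is the target file,
# -p the password protecting the exported key.
rskeymgmt -e -f "D:\Backups\rs_encryption_key.snk" -p "YourStrongPassword"
```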


Backing up the Config Files

Reporting Services uses different files to store its application settings. It’s very important to have these config files handy when disaster strikes, because they contain all your settings and customizations.

Best practice is to take a backup of these files after installing the server, after deploying custom extensions, and whenever you run a full backup of your environment for DRP reasons.

The following files must be included in a backup location which is covered by your filebackup system:

  • Rsreportserver.config
  • Rssrvpolicy.config
  • Rsmgrpolicy.config
  • Reportingservicesservice.exe.config
  • Web.config for both the Report Server and Report Manager ASP.NET applications
  • Machine.config for ASP.NET
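A minimal sketch that copies these files to a backed-up share; the install path assumes a default SQL 2008 Reporting Services instance and the destination is a placeholder, so adjust both to your environment:

```powershell
# Paths assume a default SQL 2008 Reporting Services instance - adjust to yours.
$rsdir = "C:\Program Files\Microsoft SQL Server\MSRS10.MSSQLSERVER\Reporting Services"
$dest  = "\\backupserver\rsconfigbackup"   # placeholder share covered by your file backups

Copy-Item "$rsdir\ReportServer\rsreportserver.config" $dest
Copy-Item "$rsdir\ReportServer\rssrvpolicy.config" $dest
Copy-Item "$rsdir\ReportManager\rsmgrpolicy.config" $dest
Copy-Item "$rsdir\ReportServer\bin\ReportingServicesService.exe.config" $dest

# The two web.config files share a name, so rename them on copy.
Copy-Item "$rsdir\ReportServer\web.config" "$dest\web.config.reportserver"
Copy-Item "$rsdir\ReportManager\web.config" "$dest\web.config.reportmanager"

# Machine.config for ASP.NET (the version folder depends on your .NET install).
Copy-Item "C:\Windows\Microsoft.NET\Framework64\v2.0.50727\CONFIG\machine.config" "$dest\machine.config"
```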

Backing up Data Files

Backup the files that you create and maintain in Report Designer and Model Designer. These include report definition (.rdl) files, report model (.smdl) files, shared data source (.rds) files, data view (.dv) files, data source (.ds) files, report server project (.rptproj) files, and report solution (.sln) files.

Remember to backup any script files (.rss) that you created for administration or deployment tasks.

Verify that you have a backup copy of any custom extensions and custom assemblies you are using.
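The data file sweep above can be sketched like this; the source folder is an assumption (wherever your designer projects live):

```powershell
$projectdir = "C:\ReportProjects"            # where your designer projects live
$dest       = "\\backupserver\rsdatafiles"   # placeholder, covered by your file backups

# Report definitions, models, data sources, views, projects, solutions
# and administration scripts, swept recursively into the backup share.
Get-ChildItem $projectdir -Recurse -Include *.rdl,*.smdl,*.rds,*.dv,*.ds,*.rptproj,*.sln,*.rss |
    Copy-Item -Destination $dest
```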


This was the last blog post in the series “How to backup your SCOM environment”. If you follow these guidelines you’ll have a pretty good chance of recovering from a disaster with as little downtime and data loss as possible.

In the next series I’ll be posting how to recover it all using the backups we took, so stay tuned. As usual, if you have remarks or feedback you can reach me on Facebook / Twitter.