
TechEd NA and TechEd Madrid: What to see

8:22 am in TechED by mikeresseler

Introduction

It has been a great week being at TechEd NA in New Orleans. I had to be there to deliver 2 presentations (see later), but I was of course very anxious to learn about all the new stuff that is coming with System Center 2012 R2 and Windows Server 2012 R2. Yes, for those who were living on another planet last week: the new release will be named R2, unlike the client release, which is named Windows 8.1. And there was a lot to learn and explore this week for sure!

Below you will find an overview of all the sessions you must watch if you didn’t have the chance to visit TechEd NA.  However, if you are going to TechEd Madrid, then don’t watch the sessions yet; have a look at this list and use it to plan your agenda.

Let’s start with the first session of the event (pre-conference not included).

The Keynote

Brad Anderson (@inTheCloudMSFT) delivered the keynote and it can be watched here:

http://channel9.msdn.com/Events/TechEd/NorthAmerica/2013/Key01#fbid=gadmougCguJ

Besides a nice car driven on stage and a very funny Bond-like opening movie, the keynote showed us quite a bit of stuff that needed to be checked out, and it certainly drew my attention.  Unfortunately, as always at a conference, it is quite impossible to see all the sessions that you want to explore, so I will have to spend some time on the TechEd NA Channel 9 site to see more presentations.  Luckily I am one of those who will also go to TechEd Europe in Madrid, so that will give me the possibility to see some more sessions :-)

Let’s start with the list of sessions about Windows Server 2012 R2

Windows Server 2012 R2

Start off with What’s new in Windows Server 2012 R2 to get general information. This session will give you a quick overview of all the interesting new improvements, and then you can select deep-dives on the topics you want to learn more about.

Some of the sessions that I have seen or that are still on my wish list:

Windows Server Work Folders overview – my corporate data on all my devices: As a road warrior and home worker I have multiple (read: LOTS of) different devices that I use to do my work.  Because I never visit the corporate office, I’m not even connected to the domain. The Work Folders capability in Windows Server 2012 R2 will let me have my files on all of my devices, while IT can still be sure that it is done securely and keeps control over those files. This is pretty cool technology and I can’t wait to see this in my environment! (PS: for the deep-dive on this topic: http://channel9.msdn.com/Events/TechEd/NorthAmerica/2013/WCA-B332)

Hyper-V – What’s New in Windows Server 2012 R2: This session is worth at least a blog post (probably multiple) on its own. Ben Armstrong (@virtualPCGuy) delivered this session about all the new stuff in Hyper-V. And there is a lot of new and cool stuff coming.  Mark my words and start learning… Generation 2 virtual machines…

What’s new in Windows Server 2012 R2 Networking: Many improvements have been made in Windows Server 2012 on the networking level, and R2 just goes further. If you work a lot with these capabilities, then this is the session to get you up-to-date with the enhancements (and afterwards you can start investigating more in depth :-))

Continuous Availability: Deploying and Managing Clusters using Windows Server 2012 R2: If you have clusters today, then you will probably have clusters in R2. This session shows you the updates in R2.

System Center

Yep, there was a lot of good content on System Center, and much of it was delivered by MVPs and people who are actually in the field. Here is a list of good sessions:

Mission: IT Operations for a Good Night’s Sleep by Walter Eikenboom (@wwwally). A good session for those who want to take component monitoring to LOB monitoring.

Effective Capacity Planning of Your Infrastructure Resources with Microsoft System Center 2012 – Operations Manager Reporting by Gordon McKenna. A session on capacity planning with Operations Manager, SharePoint and the data warehouse.

 

PowerShell

Windows PowerShell Unplugged: A session delivered by Jeffrey Snover (@jsnover), and this is a session that you must watch whether you are a beginner at PowerShell or already an advanced user. I love PowerShell and work with it on many occasions, and this is just one of those sessions that teaches me how to learn and use PowerShell. I benefit from this one every day.

Advanced Automation Using Windows PowerShell: After seeing the previous session, you will quickly want to go further with PowerShell. Don’t rush, but the moment you feel confident in PowerShell, take this session and take the next step.
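
If you are brand new to PowerShell, here is a tiny sketch (my own illustration, not material taken from the sessions) of the self-discovery approach that makes it so easy to learn:

# Three built-in cmdlets that let PowerShell teach you PowerShell
Get-Command -Noun Service          # which cmdlets work with services?
Get-Help Get-Service -Examples     # how do I use one of them?
Get-Service | Get-Member           # what can I do with the objects it returns?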

Key Metrics and Practices for Monitoring Virtualization Platforms by Raymond Chou (@exotrinity) and Alec King. If you want to know more about the metrics you need to monitor mixed environments, then this is the best place to start.

Monitoring and Managing the Network and Storage Infrastructure with Microsoft System Center 2012 – Operations Manager by Maarten Goet (@Maarten_Goet). Yes, you can monitor more than servers alone. Find out more here about monitoring your storage and network stack.

Tips and Tricks for Creating Custom Management Packs for Microsoft System Center – Operations Manager by Mickey Gousset (@Mickey_Gousset) is a good session if you want to start creating your own management packs.

And of course: Introduction to System Center 2012 R2

Upgrading your Private Cloud

For those who are totally convinced of the new technologies that are heading our way… Upgrading your Private Cloud with Windows Server 2012 R2 is probably something for you :-)

Last but not least!

I also did some presentations so here are the must-see (:-)) sessions:

Managing Multi-Hypervisor Environments with System Center 2012

Microsoft System Center Data Protection Manager and Veeam: Better Together

As you can read, there is a lot of good stuff, and I will be digging into the catalog over the next 2 weeks to schedule the sessions I want to see in Madrid.

For more information on TechEd Europe: http://channel9.msdn.com/Blogs/TechEd

But there is more than learning and working alone… At TechEd NA we had the possibility to buy a Surface Pro and RT for a very sharp price, and I grabbed that opportunity with both hands. Now let’s hope that we get the same opportunity in Madrid. I would watch this space and get your money ready :-)

http://channel9.msdn.com/Blogs/TechEd/Special-Surface-Offer-for-TechEd-Attendees-via-Microsoft-Retail-Store#fbid=OQulHhLXnf6

Have fun

Mike

Wrong assumptions about the System Center Suite

6:45 pm in Uncategorized by mikeresseler

Hey All,

It’s Friday (TGIF) and I’m sitting on the train (for quite a long time) heading home after an exciting but exhausting week.  I’ve had some nice successes with my customer and I went to TechDays 2010 in Belgium.  As I sit here, I am thinking about a few things that were said at TechDays about System Center, and also about what the new products will mean for system administrators in their day-to-day work.  And I realized that many people have the wrong ideas about the System Center suite.  So here are my thoughts about some wrong assumptions.

 

Wrong assumption #1: System Center is a technical solution

 

Every time I’m talking to IT pros, I basically meet two kinds of people: the ones that are convinced about System Center tell me that it can do practically everything.  And the ones that are not yet convinced will tell me that it is a great suite but needs some additional work, and most of them will tell me that they either use it in their environment or are certainly looking at it on a constant basis.  But this is where they all go wrong.  We look at it as a great tool that can do great technical things.  So what are most people doing?  They integrate it as a technical solution, deploy as many management packs, DCM rules, packages or whatever as possible, and finally deploy all the agents.  OK, we are monitoring everything, deploying everything, backing up everything… how cool is that.

Yes, it is cool, but we should also look at it from a business perspective, and preferably BEFORE we start to implement this solution.  Why?  To deploy it as a monitoring tool for services.  Or a deployment solution for services or…  Think about it.  What if you can say to your peers (and I know I am using an easy example here) that you monitor the email system as a whole?  They will like it (and probably ask for a monthly report about it).  Imagine that you say to your peers that you can use Operations Manager to monitor all your Exchange servers, both the virtual and physical ones, your switches, your internet line for sending and receiving emails, your gateway, your anti-virus, your…  What will the response be?  If you are lucky, they will say ‘Good for you’ and if not they will say ‘I don’t care’ or even worse ‘Did I pay that much money and that’s the only thing you can do with it?’.

Back to the ‘email service’.  Peers or managers or directors or whatever title your bosses have will not care what technical magic you are doing.  They only care about the fact that their email is working, that it is safe, and that they can’t catch viruses.  That’s it.  So if you say that you monitor the email service (call it the messaging service, that sounds even better for them) and can give a report about it, then they will say “Job well done” and you are on your way to buying your next System Center product 😉

Doing it like this will give you additional benefits.  Think about it: the only thing that you need to monitor is the “messaging service”.  As long as it runs, no problem.  So if you have failover, clustering, DAG, live migration and who knows what extra protection, then you can deliver your 99.99% messaging service availability almost every time.  Even if one of the components goes down, you get an alert, your SLA is not violated, you fix it with the knowledge in the system, case closed, nobody cared, no time pressure, and no slap on the head because you didn’t meet your SLAs.
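
To put a number on that 99.99%: a quick back-of-the-envelope calculation (my own example, assuming a 30-day month) shows how little downtime the service-level target actually allows, and why the component redundancy above matters:

# Rough arithmetic: allowed downtime for a 99.99% monthly SLA (30-day month assumed)
$minutesPerMonth = 30 * 24 * 60                      # 43,200 minutes
$allowedDowntime = $minutesPerMonth * (1 - 0.9999)   # about 4.3 minutes
"Allowed downtime per month: {0:N1} minutes" -f $allowedDowntime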

 

Wrong assumption #2: Opalis takes the ITPro’s work away

A new product is coming to the System Center suite.  For the moment it is still called System Center Opalis (no idea if they are going to rebrand it) and it is the talk of the town.  Opalis is what they call an ITPA (IT Process Automation) or RBA (Run Book Automation) tool.  A few weeks ago, I was telling a colleague about Opalis and the first thing he said was ‘Damn, something to create automatic ITIL processes, keep it away from me as long as possible’.  Wrong assumption.  While it can be perfectly used to implement ITIL processes (or COBIT, or MOF), it can also do general maintenance.  So, he says, it is a ‘scheduler’.  Wrong again; true, this can be done with it, but it is also event-driven.  ‘So what is it?’ he asked.  I explained to him that the purpose of this tool is to create processes in such a way that it takes away as much as possible of the day-to-day work and to automate solutions to problems when things go wrong.  Before I could say more, he drew his gun and shot me down :-).  The reason for that?  I was talking away his work.  He would get fired because he wouldn’t be necessary anymore.  And if it wasn’t him, it would be one of his colleagues.  Wrong assumption.  Although he was wrong three times, he is quite a clever IT guy.  I told him that people are still necessary to build these processes.  And that these processes aren’t always working.  I’ll explain that a little bit later with an example given by Maarten Goet yesterday at TechDays in Belgium, which I found a very good example.  Anyway, what Opalis will be doing is:

  1. Try to solve issues on its own
  2. Fill in data (ticketing systems, alerts, …)
  3. Do day-to-day maintenance on its own without a user intervening, thus stopping human errors (and we all know that these happen most of the time when you need to do repetitive work)
  4. Be faster than humans! Think about it: if you can automate many things, the problem will be resolved much faster than you can resolve it as an IT pro.  And is that a bad thing?  No way, the fewer fires you need to put out, the better.

So what is the IT Pro going to do instead?  Simple.  As I said, he is a clever guy, so he can use his time to improve the IT infrastructure and processes.  He can use his time to think actively about the requirements of the users, his customers.  If he’s lucky, he will even have the chance to discuss the future roadmap of the company.  And that is exactly what he should do.  No, chances are that he won’t be taking the decisions.  Hey, he will probably not even be in the meeting where the decision is taken.  But he will be actively involved in finding a good solution for the business requirements and assist his manager in finding and defending a good solution towards the management.  The manager who doesn’t want this is, in my opinion, not a good manager.  And the IT Pro who doesn’t want time to investigate new things on the market or study business requirements, but only wants to fight fires?  I still have to meet this guy.  And even if he doesn’t get the ability to do these things, it will still mean less evening work, less weekend work, and less stress.  Hooray!

Now about that example.  Maarten showed us a process where Operations Manager alerted that a certain Windows service went down.  The process created a ticket in the helpdesk system, changed the alert in Operations Manager, tried to start the service again, and on success updated the alert and the ticket, and case closed.  Nobody did anything except see that there was a new ticket in the helpdesk system.  But Maarten also created the path for when the service didn’t start again.  Then the ticket got escalated.  An SMS was sent to an engineer (in the demo a pop-up, but you get the picture, right :-)) and now it was time for the engineer to troubleshoot why the service doesn’t start anymore.  Still, the first step, restarting the service, was already taken and the ticket already existed, so basically the engineer won at least 15 minutes, if not more.
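
To make that flow concrete, here is a minimal PowerShell sketch of the same logic.  This is my own illustration, not Maarten’s actual runbook (a real Opalis policy would use its built-in activities), and the ticket functions are hypothetical placeholders:

# Sketch of the remediation flow: restart the service, close the ticket on success, escalate on failure
# New-HelpdeskTicket, Close-HelpdeskTicket and Escalate-HelpdeskTicket are hypothetical helper functions
$serviceName = 'Spooler'   # example service
$ticketId = New-HelpdeskTicket -Summary "Service $serviceName is down"
Restart-Service -Name $serviceName -ErrorAction SilentlyContinue
if ((Get-Service -Name $serviceName).Status -eq 'Running') {
    Close-HelpdeskTicket -Id $ticketId -Resolution 'Service restarted automatically'
} else {
    Escalate-HelpdeskTicket -Id $ticketId -Notify 'oncall-engineer@contoso.com'
}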

 

Wrong assumption #3: The suite is a set of stand-alone, separate products

The System Center suite consists of a number of applications, and they don’t work together.  Difficult one, because it is true, and not true.  Yes, you can perfectly set up an infrastructure where all these tools are running next to each other and where they don’t work together.  In fact, I hear that many projects are handled this way.  Why?  Probably because it is much cheaper this way, and the project is implemented much faster so you have quicker results.  This is (of course) short-term thinking, and should be avoided.  Operations Manager and Virtual Machine Manager work perfectly together.  Configuration Manager and Operations Manager work perfectly together.  And while you are busy, deploy your virtual machines with Configuration Manager, just the same way you are deploying your physical machines.  Don’t forget to use the patch management within Configuration Manager and view the failures in Operations Manager.  Data Protection Manager and Operations Manager.  Check.  Data Protection Manager and Virtual Machine Manager.  Check.  Deploy your different agents with Configuration Manager.  Check.  (I can continue for a while but I hope you get the picture.)

The statement is also true, because the suite was lacking two important things:

  1. A central helpdesk system (or service desk, whatever term you are using, and don’t get me wrong, I refuse to look down on the “helpdesk”.  I invite every engineer, architect or whoever to spend a few weeks in an average helpdesk; they will start smoking and drinking after day 3 ;-))
  2. An automation system

Hey, number 2 was just discussed, and number 1 will be there very soon.  Say again that Microsoft has no vision; they just proved you wrong :-)

But for all those guys who like to shoot at Microsoft: it was already possible before.  There were connectors to other products that could do the job just fine.

 

Wrong assumption #4: It is only for technical guys

No, no, three times no.  Almost every product within this suite can be set up so well that whoever needs access to whatever data can get it, WITHOUT risking that those persons screw up everything.  This takes me back quite a few years, to when I wanted to delegate the right to create, update and disable (not delete) users within Active Directory to the secretary of Human Resources.  Reasons:

  1. I’m lazy and type names wrong too many times :-)
  2. By the time I knew that somebody new had started at department X, he was already standing at my desk for a username (don’t you just hate that, but hey, another reason for Configuration Manager and OS Deployment, or MDT if that fits your needs)
  3. I never had the time (I used to be an excellent firefighter) to drop everything and create that user.  Not to mention that I always wanted to see paperwork from HR; you never know who is trying to fool you :-)

Just about every colleague I had started to shout that this was about the most stupid idea I could have (and I can assure you, from time to time I really can have stupid ideas ;-)).  Why, I asked?  Answer: nobody wanted the secretaries (there were actually 2) to do something inside the heart of our infrastructure.  Now you have to imagine that Identity Management solutions were either extremely expensive or basically not usable, so that was not an option.  So I tried to convince them with delegation of rights, tutored the secretaries for 2 hours, and let them use it, closely monitored by the colleagues.  After one month, nobody cared anymore, and none of us needed to create a user anymore.  Only if a user needed to be deleted (which I think is bad practice…) did we need to get into action.  No more changing AD data because a user had a new address, changed her last name to her married one, and so on.

Back to 2010 and System Center.  Why not delegate certain jobs to other people?  Wouldn’t it be great if the helpdesk could use Configuration Manager to take over desktops?  To deploy workstations?  (I even have the scenario where the guy from the warehouse accepts the new workstations, takes the attached email sheet received from VendorX (I can’t advertise here, right :-)), puts it in SCCM, takes the computers out to a desk, plugs in the cable and magic happens… 50 new workstations ready to be given to new users.)  Or what about the SQL team?  Give them a limited view in Operations Manager so that they can see their alerts.  And how about some managers?  Give them reports about the software metering, or the number of alerts on a monthly basis (imagine this: a manager sees 500 alerts in one month, but no user complaints and, more importantly, he never noticed anything… Meet your new nickname: Speedy Gonzalez).  I can probably continue here with hundreds of examples, but just think about your own situation in the office.  How many times do you need to deliver data or reports or whatever to managers, other IT teams and so on?  Give them the rights to look at them themselves.  Then they won’t bother you, and you just have some more free time.

 

Wrong assumption #5: First we implement a project, then we think about System Center.

Yeah, common mistake, but a big mistake.  Try to convince your co-workers, peers and whoever is involved to think about the management up front.  I promise you, you will gain from it.  We actually sometimes use Operations Manager to take human errors out of our implementations.  If it is there, use it.  Oh, and never forget backup before you start a new project.  I hate it when people skip that, and it always costs me way too much time to resolve everything afterwards.  First think, then touch the keyboard.

 

Conclusion

Do you agree with me?  Some will say no, others will say yes, and others will say yes, but…  And that is normal.  You should look at the System Center suite as a framework.  Don’t just install everything with next-next-next-finish, but think about it before you start.  Think about the advantages that you can have if you model the suite to the business needs.  You will save valuable time to spend with your family, investigate new stuff, or get a go from your boss to be at TechDays next year and come over to the SCUG booth and have a discussion with me :-).

If you have comments about these Friday evening thoughts, just shoot.  Don’t agree?  Fire away.  The more people, the more ideas and, hey, the better the results.  Some of you will probably have even more wrong assumptions about System Center.  I will be glad to hear them.

Hoping that you do the things above so that you don’t have to read this during the weekend 😉 Oh, and sorry about my prose, it is Friday evening after all, and I really just typed my thoughts :-)

Cheers,

Mike

System Center: Integration Software from Dell

1:40 pm in Uncategorized by mikeresseler

Hey all,

You probably noticed that it has been quite silent here on this blog, mostly because the SCUG team allowed me to post the DPM blogs straight to the SCDPM blog of SCUG.  (Thanks again for that, guys…)

But this doesn’t mean I’ve forgotten about my own little blog, and today I’m going to discuss a recently posted announcement from Dell.

The Press Release dates from 16 February 2010:

Dell Helps Customers Efficiently Deploy, Update and Monitor IT Resources For Enterprises Using Microsoft System Center

Original release can be found here: http://content.dell.com/us/en/corp/d/press-releases/2010-2-16-dell-ms-integration-tools.aspx

With a title like that, it had to draw my attention :-).  Anyway, I decided to read on and see what they actually have done.

Basically, Dell has released a set of software tools that integrate directly into the system center suite from Microsoft, and they do that at no charge for Dell customers.  Now this is good news!  The more integration you can have into one platform, the easier it is to work and the bigger the return on investment.  So I continued to read and see what they had to offer.

Most of the tools you will probably know already.  Actually, all the tools have been out there for a while.  Why Dell comes with this press release now is a mystery to me, but still, the press release is an important one.

By announcing it to the world, Dell has committed itself to delivering add-ons for the System Center suite from Microsoft.  And I’m quite sure that other hardware vendors will come out with integration tools as well.  Again, this is not really new, since Dell and HP and IBM and … have already been doing this for a longer period, but it could step up the speed at which more integration tools arrive for the System Center suite.  And preferably free, of course ;-).  And this is something I can only like.

So what does Dell have to offer us?

1. Dell Management Pack for Microsoft System Center Operations Manager (link)

This management pack is at release 4.0, A00, dated 31 August 2009, and monitors the status of Dell PowerEdge / PowerVault servers, Dell Remote Access Controllers (DRAC) and Chassis Management Controllers (CMC) on a defined network segment.

It is supported on SCOM 2007 SP1 / R2 and SCE 2007 SP1

2. Dell MD Storage Array Management Pack Suite for System Center Operations Manager (link)

Also at release 4.0, A00, dated 18 January 2010.  The management pack allows SCOM SP1 / R2 and SCE 2007 SP1 to discover, monitor and give the status of Dell MD Storage Arrays (MD3000, MD3000i, MD1000 daisy-chained to an MD3000 or MD3000i).

3. Dell Client Management Pack for Microsoft System Center Operations Manager (link)

Release 4.0, A00, dated 26 October 2009. This management pack allows SCOM SP1 / R2 and SCE 2007 SP1 to discover, monitor and give the status of Dell business client computers.

4. Dell Printer Management Pack for System Center Operations Manager (link)

Release 4.0, A00, dated 11 November 2009.

The Dell Printer Management Pack v4.0 enables System Center Operations Manager (SCOM) R2/SP1 and System Center Essentials (SCE) SP1 to Discover, Monitor and accurately depict the status of Dell Printers on a defined network segment.

5. Dell Server Deployment Pack for Microsoft System Center Configuration Manager (link)

Release 1.1, A00, dated 17 July 2009.

The Dell Server Deployment Pack is an easy-to-use graphical user interface (GUI) based tool that integrates directly into the Microsoft System Center Configuration Manager 2007 (ConfigMgr) console.
It eliminates the need for command-line tools and scripts normally used in the DTK. To configure and deploy your Dell systems, you need to select configuration options and commands on the GUI using drop-down lists and check boxes. This makes your system deployment an easy, automated task.

6. Dell ConfigMgr – DLC Plugin (link)

Released 30 October 2009, version A00

The Dell Lifecycle Controller Integration for Microsoft System Center Configuration Manager v 1.0 allows customers to simplify and improve their Operating Systems Deployment experience on Dell Servers. Combining the capabilities of our embedded Lifecycle Controller (LC 1.2), the Dell Server Deployment Pack (DSDP 1.1) for ConfigMgr and the Integrated Dell Remote Access Controller (iDRAC6), Dell is positioned to offer a very powerful remote provisioning solution to MS ConfigMgr users.

7. Dell Client Deployment Pack for Microsoft System Center Configuration Manager 2007 (link)

Version 1.1, A01, released on 18 December 2009.

The Dell Client Deployment Plugin Installer pack, installed as an add-on to Microsoft ConfigMgr 2007 SP2, helps administrators simplify the image deployment process on Dell systems in their enterprise environment. With the plug-in installed, the end user will be able to:
1. Import driver packs released in ‘CAB’ format from the Dell Support website
2. Create Windows Vista and XP deployment task sequences
3. Remotely configure the BIOS using the Client Configuration Toolkit

8. Dell Server PRO Management Pack for Microsoft System Center Virtual Machine Manager 2008 (link)

Released on 29 December 2009, version 2.0, A00.

The Dell PRO-enabled Management Pack integrates with System Center Virtual Machine Manager (SCVMM) 2008 / R2 and SCOM 2007 SP1 / R2 or SCE 2007 SP1 to provide customers with effective management of Dell virtualized server hardware through PRO. It enables remedial actions and recommendations, such as migrating virtual machines and placing the host into Restrict mode, based on alerts generated by the host with workloads running in a Hyper-V based virtualized environment.

 

Need more information?

Make sure you go to http://www.dell.com/content/topics/global.aspx/sitelets/solutions/management/microsoft_sms?c=us&cs=555&l=en&s=biz and find the information you need.

 

Conclusion:

I have already worked with most of the tools that Dell delivers for System Center, and the first versions contained many issues.  But many of those issues have been resolved now, and the tools are getting better with each version.  So I certainly advise everybody to try them out if you have Dell hardware running in your environment.

I also hope that more vendors will release tools like Dell has, and I’m not only talking about the major hardware vendors; vendors such as 3Com, Cisco, Juniper and so on are hopefully also getting the picture. The more integration into one suite of management tools, the easier it will be for system administrators to effectively monitor their environments in a proactive way, instead of doing the firefighting which many of us know, still do or have done.  And even further, it will allow system administrators to get closer to the much-wanted dynamic data center… But more on that topic later.

Just my thoughts…

Cheers,

Mike

System Center Data Protection Manager 2007: Knowledgebase article for the new hotfix package is now available

4:17 pm in Uncategorized by mikeresseler

http://support.microsoft.com/default.aspx?scid=kb;EN-US;976542

Cheers,

Mike

System Center Data Protection Manager V3: Worth to look at?

6:44 pm in Uncategorized by mikeresseler

Hey All,

I've just been on vacation for a week with the wife and kids and had a great time swimming, walking and playing in the playgrounds with the children.  Now that the DPM V3 public beta has been announced and is available, I thought it was time to look again at the promised features for V3 and to check whether the product is worth our attention.  (Hey, a silent evening in a cottage, with the kids and wife sleeping, a good glass of wine in my hand and some good music on… ;-))

So here’s what’s new, based on what was already in V2

For Exchange:  Of course they will support E14 in addition to Ex2007 and Ex2003.  There will also be improved restore granularity.  What exactly is not yet known, but the DPM team worked together with the Exchange team to see what is possible and supported.

For SQL: You will be able to protect an entire SQL instance, meaning that DPM will auto-discover new databases within that instance.  DPM V3 will also be able to protect 1000 databases per DPM server, which is a huge improvement over the 300 databases that V2 could handle.  There will also be role-based access so a SQL admin can do his or her work with the DPM console.

For SharePoint: Support for Office 14, SharePoint Server 2007 and 2003.  There will be no recovery farm requirement for Office 14, and auto-detection of new content databases within the farm.

For Hyper-V: Item-level restore from VHD backup, support for Hyper-V v2 deployments using Live Migration (CSV), and dynamic VM guest migration support (meaning you should be able to restore to alternate Hyper-V hosts).

What’s completely new?

For AD: Active Directory will now appear as a data source and no longer be part of the system state.  This will allow IT administrators to centrally manage backups from DPM (although performed locally) and lets a local admin restore from Windows Server Backup.  It will also allow a DPM restore of a whole domain controller.

For Bare Metal Recovery: Windows Server 2003 will continue with SRT, but Windows Server 2008 will use Windows Server Backup for image-based restore, again centrally managed from DPM but locally executed.

For new data sources: Protection of Windows guests on VMware hosts will be supported, and Microsoft Dynamics AX will be a new data source.  SAP running on SQL Server will also be a new data source.

For laptops: Backup over VPN will be possible, and Windows 7 is of course the new OS that will be included.  Per DPM server you will be able to scale up to 1000 clients.  Only unique user data will be protected so that the entire OS doesn’t end up repeatedly on your expensive storage ;-).  DPM will also integrate with local shadow copies for Vista and Windows 7, which will be centrally configured from the DPM admin user interface, BUT the end user will be able to restore from local copies offline and online, as well as from DPM copies online.

Server Side:

DPM V3 promises to be enterprise-ready: scalability is increased and each DPM server will now be able to hold up to 80 TB of storage.

New management pack updates will be ready for SCOM, and SQL admins will have role-based management at hand.

The one thing I am really looking forward to is the fact that Microsoft promises to give us automatic rerunning of jobs and improved self-healing.  The automatic protection of new SQL databases or MOSS content databases also seems very promising.  They have also promised fewer “inconsistent replica” errors and a reduced alert volume.

The DPM-to-DPM replication will also be improved and (more importantly, I think) there will be a one-click DPM-to-DPM failover and failback scenario available.  Improved scheduling will also be there.

For SAN restore, there will be continued support using scripts and whitepapers delivered through the vendors, but there is no change from the previous version.

And last but not least, the DPM server should be 64-bit and running Windows Server 2008 or better.

Conclusion: This seems like a lot of improvements and is definitely worth checking out.

Next post: Installation of DPM v3 Beta

Cheers,

Mike

System Center Virtual Machine Manager: Hosts having 0kb free memory

5:24 pm in Uncategorized by mikeresseler

Hey All,

One of the big issues with SCVMM 2008 for me was that the memory available on the hosts is not reported correctly.  If you are running quite a large environment with a lot of virtual machines, it is difficult to know how much memory you have left.  You can of course start calculating it manually, but… 😉

So here’s a screenshot of the problem

image

As you can see, all my hosts are reporting zero KB of free memory.  That’s not exactly correct.

Here’s how to solve this one:

On each host that has this problem, open the registry and go to the following key:

HKLM\SYSTEM\CurrentControlSet\Services\PerfOS\Parameters

If you see the value “Disable Performance Counters”, change it to 0.

Now you need to restart the WMI service on the host.  Please pay attention because it will restart other services too and you don’t want to do this during production hours…
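
For those who prefer PowerShell over regedit, here is a minimal sketch of the same fix (my own equivalent, to be run locally on the affected host):

# Set 'Disable Performance Counters' back to 0 and restart WMI (this restarts dependent services too!)
$key = 'HKLM:\SYSTEM\CurrentControlSet\Services\PerfOS\Parameters'
if ((Get-ItemProperty -Path $key -ErrorAction SilentlyContinue).'Disable Performance Counters') {
    Set-ItemProperty -Path $key -Name 'Disable Performance Counters' -Value 0
}
Restart-Service -Name winmgmt -Force   # do this outside production hours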

image

After the services are restarted, you can refresh your configuration in SCVMM and then you get the correct figures.

image

 

Cheers,

Mike

System Center Data Protection Manager 2007: Installation Part I: Prerequisites

9:02 am in Uncategorized by mikeresseler

Hey All,

In the next 3 posts, an installation guide for System Center Data Protection Manager 2007.  Let’s start by explaining the environment I’ve used and the prerequisites.

First, let’s look at the prerequisites as they are provided by Microsoft. (http://technet.microsoft.com/en-us/library/bb808832.aspx)


1) Security Requirements

To install DPM you must be a local admin and log on to the computer where you want to install DPM.  OK, this can be arranged :-)

After the installation of DPM, you need a domain user account with administrator access to use the DPM Administrator Console.  Basically, this means that if you want to use the Administrator Console, you need to be a local administrator on the DPM server.

2) Network Requirements

According to Microsoft, you need a Windows Server 2003 or Windows Server 2008 Active Directory domain.  My test environment is a native Windows Server 2008 domain, so no problems there.  You also need persistent connectivity with the servers and desktop computers it protects, and if you are protecting data across a WAN, you need a minimum network bandwidth of 512 Kbps.  So far so good.

3) Hardware Requirements

Disks: You should have a disk that contains the storage pool and one that is dedicated to system files, DPM installation files, prerequisite software and database files.  Basically, this means that you need an OS and program files disk just as on any other server (you can divide it into multiple volumes), but the storage dedicated to the backups can’t be the same disk as above.  When DPM takes the storage, it allocates it and it is not usable for anything else.

CPU: Minimum 1Ghz, Recommended 2.33 Ghz

Memory: Minimum 2 GB RAM, Recommended 4 GB Ram

Pagefile: 0.2 percent of the size of all recovery point volumes combined, in addition to the recommended pagefile size (1.5 times the amount of RAM)

Disk Space:

– Minimum: Program Files: 410 MB, Database: 900 MB, System Drive: 2650 MB

– Recommended: 2 to 3 GB of free space on the program files volume

Disk space for storage pool: 1.5 times the size of the protected data; recommended: 2 to 3 times the size of the protected data (a small worked example follows the hardware requirements)

LUNS: Maximum of 17 TB for GPT dynamic disks, 2 TB for MBR disks

32-bit servers: 150 data sources

64-bit servers: 300 data sources
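
To make the disk sizing above concrete, here is a small PowerShell sketch with made-up numbers (the 2 GB of RAM, 500 GB of protected data and 750 GB of recovery point volumes are just example values; the formulas are the ones listed above):

# Worked example of the pagefile and storage pool sizing rules above (example numbers only)
$ramGB           = 2      # RAM in the DPM server
$protectedDataGB = 500    # data you plan to protect
$recoveryPtGB    = 750    # combined size of all recovery point volumes
$pagefileGB = (1.5 * $ramGB) + (0.002 * $recoveryPtGB)   # recommended size + 0.2 percent of recovery point volumes
$poolMinGB  = 1.5 * $protectedDataGB                     # minimum storage pool
$poolMaxGB  = 3 * $protectedDataGB                       # recommended upper end
"Pagefile: {0:N1} GB, storage pool: {1} to {2} GB" -f $pagefileGB, $poolMinGB, $poolMaxGB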

Since this will be a test environment, there is no way I’m going to follow the complete requirements for DPM. :-)

4) Operating System Prerequisites

32-bit or 64-bit servers.  IA64 is not supported.

The DPM server cannot be a management server for SCOM

The DPM server cannot be an application server or a domain controller.

There is a VSS limitation on 32-bit servers: it can’t go beyond 10 TB of protected data, and 4 TB is the preferred maximum on a 32-bit server.

OS: Windows Server 2008 Standard and Enterprise Edition, Windows Server 2003 (R2) SP2 or later, Windows Server 2003 Advanced Server, or Windows Server 2003 Storage Server.

The management shell can also be run on Windows XP SP2, Vista and Windows Server 2003 SP2 or later.

 

My Environment

I will run the server on a Hyper-V platform (not yet R2), and I’ve created a virtual machine with the following specifications:

CPU: 2 * 3.60 GHz Xeon

Memory: 2048 MB

Storage: C:\ 40 GB  (OS and application) D:\ 60 GB (Play storage for backup)

OS: Windows Server 2008 x64 SP2, Fully patched

Software Prerequisites

Before starting with the installation, I installed the following things first:

– IIS 7 with the following options enabled (a scripted alternative follows the list):

  • Common HTTP Features
    • Static Content
    • Default Document
    • Directory Browsing
    • HTTP Errors
    • HTTP Redirection
  • Application Development
    • ASP.NET
    • .NET Extensibility
    • ISAPI Extensions
    • ISAPI Filters
    • Server Side Includes
  • Health and Diagnostics
    • HTTP Logging
    • Request Monitor
  • Security
    • Windows Authentication
    • Request Filtering
  • Performance
    • Static Content Compression
  • Management Tools
    • IIS Management Console (not really a prerequisite but handy ;-))
    • IIS 6 Management Compatibility
      • IIS 6 Metabase Compatibility
      • IIS 6 WMI Compatibility
      • IIS 6 Scripting Tools
      • IIS 6 Management Console
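
If you would rather script these role services than click through Server Manager, something like the following should work on Windows Server 2008.  This is a rough sketch: the identifiers below are what I believe the matching ServerManagerCmd.exe names are, so verify them with ServerManagerCmd.exe -query before running.

# Install the IIS role services listed above via ServerManagerCmd.exe (Windows Server 2008)
# The identifiers are my best guess at the matching names - verify with: ServerManagerCmd.exe -query
$features = 'Web-Server', 'Web-Static-Content', 'Web-Default-Doc', 'Web-Dir-Browsing', 'Web-Http-Errors',
            'Web-Http-Redirect', 'Web-Asp-Net', 'Web-Net-Ext', 'Web-ISAPI-Ext', 'Web-ISAPI-Filter',
            'Web-Includes', 'Web-Http-Logging', 'Web-Request-Monitor', 'Web-Windows-Auth', 'Web-Filtering',
            'Web-Stat-Compression', 'Web-Mgmt-Console', 'Web-Metabase', 'Web-WMI', 'Web-Lgcy-Scripting',
            'Web-Lgcy-Mgmt-Console'
foreach ($feature in $features) { ServerManagerCmd.exe -install $feature }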

 

– SIS (Single Instance Store)

To enable this, open a command prompt with administrator privileges and run the following command:

start /wait ocsetup.exe SIS-Limited /Quiet /norestart

image

After this is finished, restart the server.  To check if the SIS is enabled, open regedit and check the existence of the following key:

HKEY_LOCAL_Machine\SYSTEM\CurrentControlSet\Services\SIS

image
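
If you prefer to check from PowerShell instead of regedit, this quick equivalent (my own sketch) does the same test:

# Quick check that the SIS registry key exists after the reboot
$sisEnabled = Test-Path 'HKLM:\SYSTEM\CurrentControlSet\Services\SIS'
if ($sisEnabled) { 'SIS appears to be enabled' } else { 'SIS key not found - check the ocsetup step and reboot' }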

That’s it

In the next part, the installation of SCDPM 2007

Till then,

Cheers,

Mike

System Center Service Manager: Questions Answered

7:43 am in Uncategorized by mikeresseler

Hey All,

As said in my previous posts, I had a few questions concerning the product (probably more to come but…) so I asked them at the Microsoft Forum.  I also promised to let you know the results so here they are:

 

Q1: No R2 Components Allowed

As stated in the documentation, you can’t install any Operations Manager R2 components on the server(s) running the Service Manager components.  This is the answer I got:

 

Quote from Anders Bengtsson

With the current version it is not supported to install any operations manager 2007 SP1 or R2 components on a service manager server, doing that might result in collisions as both the products are built on the same platform. This will be fixed before RTM.

Unquote

Fair answer; let’s hope they indeed fix this before RTM.

 

Q2: I noticed that you have to fill in a name for the management group, and that I had to do it twice (once during the Data Warehouse installation and once during the Service Manager installation).  I installed it by entering the same name twice, and this seemed to be the correct decision.  I asked the support team what the meaning of these management groups is and whether my decision was the right one.  Here is the answer, again from Anders Bengtsson:

Hi,
With the current installation wizard you have to input the name twice, that might change before RTM. If you look under the hood it is two seperate installations of the R2 platform, that is why you have to input two management group names. There are more info about what a management group is at http://technet.microsoft.com/en-us/library/cc540367.aspx . In operations manger a management group name is the name of the logical installation, most often based on responsibility or geography, for example EMEA or "Exchange Team".

 

Cheers,

Mike

System Center Service Manager: Installation Part III: The Service Manager Management server, CMDB and a console

6:50 pm in Uncategorized by mikeresseler

Hey All, this is part 3 of the installation of System Center Service Manager Beta 1.  This time I’m going to install the Service Manager management server, the CMDB and a console.  Please note that you shouldn’t install this on the same computer as the Data Warehouse: according to Microsoft it will work, but the results will be unpredictable.

image

After running setup.exe, I got the start screen again.  This time I chose Install Service Manager management server.

image

Again I get the Product registration page.  I changed the user name and company and accepted the license agreement.  Again, no key is needed for the time being.

 image

Choose the location where you want to install the program.

image

The prerequisite wizard has run, and again I got a warning.

image

In the SCSMSetupWizard log (in my case found under the directory where the installation media was extracted) I notice (again) that I don’t have enough memory.  Still, I continue (it is a virtual machine after all, so I can add more memory if necessary).

image

On this page, I needed to tell the program where the SQL Server was located.  Since this was the same server where my Data Warehouse was located, I chose that one, and again it listed the only possible instance.

I wanted to create a new database and chose that option.  The setup program filled in the rest for me.

image

Here I need to point to the server which contains the Data Warehouse.  This is optional: if you have installed a data warehouse, you can point to it here; otherwise it is not necessary to fill in.

image

Again I need to enter the name of the management group, which I can’t change afterwards, so I decided to take the same name for the management group as I did for the Data Warehouse.  I’m not sure if that is the right decision, so I’m going to post a question on the Microsoft forums and let you know what their answer is.

image

The account for Service Manager, which is in my case the SM_Acct account.

image

Finally the installation summary.

image

image

And job done.  Again, this took less than 10 minutes to install.

image

To check if everything works, I started the console on the server.

image

I get this window, where I filled in the server name of the Service Manager management server.

image

And yes, it worked.  The first view of Service Manager 2010 Beta 1.

To check for more, find the logs that I talked about in the previous post.

The next posts will cover the first configuration.

Cheers,

Mike

System Center Service Manager: Installation Part II: The Datawarehouse

6:18 pm in Uncategorized by mikeresseler

Hey All, this is part two of the installation post of System Center Service Manager Beta 1.  In this part, we will install the Service Manager Data Warehouse.

I started the installation with the Setup.exe (of course ;-))

 image

I chose Install Service Manager Data Warehouse.

image

Now it asks for some data: the traditional name, organization and key.  Since this is a beta, I don’t need a key yet.  I checked the license acceptance box and continued.

 image

Here I chose the installation location.

image

Now the prerequisite checker has run, and it has given me warnings.

 

image

In the SCSMSetupWizard.log (in my case found under the directory where the installation media was extracted) I can read that it warns me that I don’t have enough memory (still, I’m going to try with the 2 GB ;-)).

 

image

In this screen, I need to select the instance for the databases.  Since it only shows the possible instances, this is an easy one :-)

 image

On the next page, I need to choose the Data Warehouse management group name.  I left it at the default, but be aware that you can’t change this name afterwards.

image

And finally, on that same screen, I need to choose the admin(s) for that management group.  I chose the previously created admin group, SM_Admins.

image

For the Data Warehouse account, I chose the user we created in the 1st post named sm_acct.  The great thing here is that you have a button to test the credentials to see if your account is ok.

image

Finally the summary, and off we go.

image

The installation is running

image

After less than 10 minutes the installation was finished.

For more information about the installation, or to troubleshoot a failed installation, check the log at \users\<user name>\AppData\Local\SCSM\Setup or at the location where the installation files are located.

And finally, to validate that everything is OK, check whether the following databases have been created: DWSMDB, SCDM and SCDW.
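
A quick, roughly equivalent check from a PowerShell or command prompt (my own sketch; replace .\INSTANCENAME with the SQL Server instance you selected in the wizard):

# List the Service Manager databases on the instance used during setup
sqlcmd -S .\INSTANCENAME -E -Q "SELECT name FROM sys.databases WHERE name IN ('DWSMDB','SCDM','SCDW')"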

On to the next part of the installation.

Cheers,

Mike