You are browsing the archive for dpm.

System Center Data Protection Manager 2007: Knowledgebase article for the new hotfix package is now available

4:17 pm in Uncategorized by mikeresseler

http://support.microsoft.com/default.aspx?scid=kb;EN-US;976542

Cheers,

Mike

System Center Data Protection Manager V3: Worth a look?

6:44 pm in Uncategorized by mikeresseler

Hey All,

Just back from a week's vacation with the wife and kids, and we had a great time swimming, walking and playing in the playgrounds with the children.  Now that the DPM V3 public beta is announced and available, I thought it was time to look at the promised features for V3 and to check whether the product is worth our attention.  (Hey, a quiet evening in a cottage, with the kids and wife sleeping, a good glass of wine in my hand and some good music on… ;-))

So here's what's new, compared to what was already in V2.

For Exchange: Of course they will support E14 in addition to Exchange 2007 and Exchange 2003.  There will also be improved restore granularity.  What exactly that means is not known yet, but the DPM team worked together with the Exchange team to see what is possible and supported.

For SQL: You will be able to protect an entire SQL instance, meaning that DPM will auto-discover new databases within that instance.  DPM V3 will also be able to protect 1,000 databases per DPM server, a huge improvement over the 300 databases that V2 could handle.  There will also be role-based access so a SQL admin can do his or her work in the DPM console.

For SharePoint: Support for Office 14, SharePoint Server 2007 and 2003.  There will be no recovery-farm requirement for Office 14, and auto-detection of new content databases within the farm.

For Hyper-V: Item-level restore from VHD backup, support for Hyper-V R2 deployments using Live Migration (CSV), and dynamic VM guest migration support (meaning you should be able to restore to alternate Hyper-V hosts).

What’s completely new?

For AD: Active Directory will now appear as a data source and will no longer be part of the system state.  This allows IT administrators to centrally manage backups from DPM (although they are performed locally), with local admin restore from Windows Server Backup.  It will also allow a DPM restore of an entire domain controller.

For bare metal recovery: Windows Server 2003 will continue with SRT, but Windows Server 2008 will use Windows Server Backup for image-based restore, again centrally managed from DPM but locally executed.

For new data sources: Protection of Windows guests on VMware hosts will be supported, and Microsoft Dynamics AX will be a new data source.  SAP running on SQL Server will also be a new data source.

For laptops: Backup over VPN will be possible, and Windows 7 is of course the new OS that will be included.  Each DPM server will scale up to 1,000 clients.  Only unique user data will be protected, so the entire OS doesn't sit repeatedly on your expensive storage ;-).  DPM will also integrate with local shadow copies for Vista and Windows 7, centrally configured from the DPM admin user interface, BUT the end user will be able to restore from local copies offline and online, as well as from DPM copies online.

Server Side:

DPM V3 promises to be enterprise-ready: scalability is increased, and each DPM server will now be able to hold up to 80 TB of storage.

New management pack updates will be ready for SCOM, and SQL admins will have role-based management at their disposal.

The one thing I am really looking forward to is that Microsoft promises automatic rerunning of jobs and improved self-healing.  The automatic protection of new SQL databases or MOSS content databases also seems very promising.  They also promised fewer “Inconsistent Replica” errors and a reduced alert volume.

DPM-to-DPM replication will also be improved, and (more importantly, I think) there will be a one-click DPM failover and failback scenario available.  Improved scheduling will also be there.

For SAN restore, there will be continued support through scripts and whitepapers delivered by the vendors, but there is no change from the previous version.

And last but not least, the DPM server must be 64-bit and run Windows Server 2008 or later.

Conclusion: This looks like a lot of improvements, and it is definitely worth checking out.

Next post: Installation of DPM v3 Beta

Cheers,

Mike

System Center Data Protection Manager 2007: Installation Part I: Prerequisites

9:02 am in Uncategorized by mikeresseler

Hey All,

In the next three posts, an installation guide for System Center Data Protection Manager 2007.  Let's start by explaining the environment I've used and the prerequisites.

First, let’s look at the prerequisites as they are provided by Microsoft. (http://technet.microsoft.com/en-us/library/bb808832.aspx)

1) Security Requirements

To install DPM, you must log on to the computer where you want to install DPM as a local administrator.  OK, this can be arranged :-)

After the installation of DPM, you need a domain user account with administrator access to use the DPM Administrator Console.  Basically, this means that to use the Administrator Console, you need to be a local administrator on the DPM server.

2) Network Requirements

According to Microsoft, you need a Windows Server 2003 or Windows Server 2008 Active Directory domain.  My test environment is a native Windows Server 2008 domain, so no problems there.  You also need persistent connectivity with the servers and desktop computers DPM protects, and if you are protecting data across a WAN, you need a minimum network bandwidth of 512 Kbps.  So far so good.

3) Hardware Requirements

Disks: You should have a disk that contains the storage pool and one that is dedicated to system files, DPM installation files, prerequisite software and database files.  Basically, this means you need an OS and program-files disk just like any other server (you can divide it into multiple volumes), but the storage dedicated to backups can't be on the same disk.  When DPM takes the storage, it allocates it completely, and it is not usable for anything else.

CPU: Minimum 1 GHz, recommended 2.33 GHz

Memory: Minimum 2 GB RAM, recommended 4 GB RAM

Pagefile: 0.2 percent of the combined size of all recovery point volumes, in addition to the recommended size (1.5 times the memory)

Disk Space:

– Minimum: Program files: 410 MB, database: 900 MB, system drive: 2650 MB

– Recommended: 2 to 3 GB of free space on the program files volume

Disk space for storage pool: 1.5 times the size of the protected data, recommended 2 to 3 times the size of the protected data

LUNs: Maximum of 17 TB for GPT dynamic disks, 2 TB for MBR disks

32-bit servers: 150 data sources

64-bit servers: 300 data sources
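As a quick sanity check, the pagefile rule above can be turned into a small calculation (a sketch of my own; the function name and example numbers are mine, not Microsoft's):

```python
def pagefile_mb(ram_mb, recovery_point_volumes_mb):
    """Recommended pagefile: 1.5 x RAM, plus 0.2 percent of the
    combined size of all recovery point volumes."""
    return 1.5 * ram_mb + 0.002 * recovery_point_volumes_mb

# Example: 4 GB of RAM and 2 TB of recovery point volumes
print(pagefile_mb(4096, 2 * 1024 * 1024))  # 6144 + ~4194 = ~10338 MB
```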

Since this will be a test environment, there is no way I’m going to follow the complete requirements for DPM. :-)

4) Operating System Prerequisites

32-bit or 64-bit servers.  Itanium (IA-64) is not supported.

The DPM server cannot be a management server for SCOM.

The DPM server cannot be an application server or a domain controller.

There is a VSS limitation on 32-bit servers: protected data can't outgrow 10 TB, and 4 TB is the preferred maximum on a 32-bit server.

OS: Windows Server 2008 Standard or Enterprise Edition, Windows Server 2003 (R2) SP2 or later, Windows Server 2003 Advanced Server, or Windows Server 2003 Storage Server.

The management shell can also be run on Windows XP SP2, Vista and Windows Server 2003 SP2 or later.

 

My Environment

I will run the server on a Hyper-V platform (Not yet R2) and I’ve created a virtual machine with the following specifications

CPU: 2 * 3.60 GHz Xeon

Memory: 2048 MB

Storage: C:\ 40 GB  (OS and application) D:\ 60 GB (Play storage for backup)

OS: Windows Server 2008 x64 SP2, Fully patched

Software Prerequisites

Before starting the installation, I installed the following first:

– IIS 7 with the following options enabled:

  • Common HTTP Features
    • Static Content
    • Default Document
    • Directory Browsing
    • HTTP Errors
    • HTTP Redirection
  • Application Development
    • ASP.NET
    • .NET Extensibility
    • ISAPI Extensions
    • ISAPI Filters
    • Server Side Includes
  • Health and Diagnostics
    • HTTP Logging
    • Request Monitor
  • Security
    • Windows Authentication
    • Request Filtering
  • Performance
    • Static Content Compression
  • Management Tools
    • IIS Management Console (not really a prerequisite but handy ;-))
    • IIS 6 Management Compatibility
      • IIS 6 Metabase Compatibility
      • IIS 6 WMI Compatibility
      • IIS 6 Scripting Tools
      • IIS 6 Management Console

 

– SIS (Single Instance Store)

To enable this, open a command prompt with administrator privileges and run the following command:

start /wait ocsetup.exe SIS-Limited /Quiet /norestart


After this is finished, restart the server.  To check whether SIS is enabled, open regedit and verify the existence of the following key:

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\SIS


That's it.

In the next part: the installation of SCDPM 2007.

Till then,

Cheers,

Mike

System Center Data Protection Manager SP1: How to start a DPM project

2:43 am in Uncategorized by mikeresseler

// Note: For some reason, my tables are not shown.  I’ll try to fix this but I don’t seem to be able to figure out why… Sorry about that.

As promised, here is guidance on how to start a DPM project, based on Microsoft's IPD guides.

The first thing to know when you implement a DPM project (and this goes for just about ALL of the System Center suite): you need to know and understand the business requirements exactly.  Without them, it is impossible to deliver a great implementation.  With them, and understanding them, the implementation will be quick and easy, and your backup worries will decrease a lot.

OK, here goes.

First, the decision flow according to the IPD guide.  All of its steps will now be discussed.

Step 1: Project Scope

In this Step, you will need to collect all the information necessary for the implementation.

  • AD domain & Forest information

You start with the AD domain and forest information.  The servers you need to protect must be in the same domain, or there has to be a two-way cross-domain or cross-forest trust between the domains where the protected servers are located.

I normally use the following table to write down the information:

Domain Name | FQDN | NetBIOS Name | DC
  • Network Topology and Bandwidth

Make sure that you have an overview of the Network Topology and Bandwidth.  It will be difficult to protect a server every 15 minutes if it is located on a WAN connection that has high latency or doesn’t always have connectivity.  A drawing of the topology can be very handy when designing the solution.

  • Data Loss Tolerance

The business will need to give this input: what amount of data loss is tolerated in case of a disaster?  This is the equivalent of the recovery point objective (RPO) and is necessary to determine the load on servers, storage and tapes.  Don't let the business tell you that there is no tolerance for data loss if they are not prepared to pay the price in storage, servers and tapes.  If there is not enough budget, then you can't have it all…

  • Retention Range

Q from IT: How long must data be kept for availability?

A from business:…

Mostly it is important to verify whether all services / data / … have the same retention range.  Sometimes it is not necessary to keep certain data for 6 months or longer.  You should always ask the business whether they need a 3-month copy or whether the last week is OK.  The more precise your questions about the different applications, the better the answers will be, and you will save storage and tapes that you can use for more important things.

Q from IT: Are you under some regulatory compliance? (HIPAA, SOX…)

A from business:…

If you are under compliance, the retention ranges are already defined for you.  Read them and implement them.  End of story.

  • Speed of data Recovery

This is similar to the Recovery Time Objective (RTO) and will determine when disk is used or when tape is used.  The quicker you need to be able to recover, the more disk you will use and vice versa.

  • End-User Recovery

Will end-users be able to recover their own deleted files without the intervention of IT? What’s the business requirement on this one?

  • BCP / DRP

Will this implementation be part of the Business Continuity Plan (BCP) and/or Disaster Recovery Plan (DRP)?  In other words, if it is part of the BCP, you need to be able to recover crashed items ASAP.  If it is only part of the DRP, then you need a good strategy to recover when things fail, but it is not necessary to recover on the spot.

  • Future plans

Are there any business acquisitions or divestments planned in the near future?  Will the DPM solution be used for these?  Are there servers or applications that will be retired in the near future?  Do we need to account for a new application in the design?

Step 2: Determine What Data Will Be Protected

In this step, you will need to figure out what kind of data you will be protecting.

  • Virtual Machines

As you know, DPM can protect entire guest VMs.  Fill in the next table to get an overview of all VMs you need to protect (this includes Hyper-V and Virtual Server 2005 SP1 virtual machines).

Additional note: Pass-through disks are NOT protected with this method.  You will need to back up those disks with an agent inside the VM.

Host | Host IP | Guest | Guest IP

 

  • Exchange Server

DPM can protect only mailbox servers.  So no edge servers or other roles.  Only the data is protected.

Additional information:

– Exchange Server 2003 and Exchange Server 2007 Single Copy Cluster (SCC): install the DPM agent on all nodes in the cluster

– Exchange Server 2007 local continuous replication (LCR): Install the DPM agent on both the active and passive node

– Exchange Server 2007 cluster continuous replication (CCR): The DPM agent must be installed on both nodes in the cluster

– Exchange Server 2007 SP1 standby continuous replication (SCR): Install the DPM agent on the active node and standby nodes

Again, I use a simple table and fill in the data

Servername | OS | Server IP | Storage Group | Database
  • Sharepoint services

What are we going to protect here?  Again my tables… :-)

Servername | OS | Server IP | Farmname | Site

This can be for SharePoint Services 3.0, MOSS 2007 or SharePoint Portal Server 2003.

Please note that when you protect a SharePoint farm or a SharePoint Services site, you don't need to back up the underlying database separately afterwards.  That will only cause you trouble.

Also note that for recovering your SharePoint sites, you will need a recovery server.  Keep that in mind when you need to ask for more servers.

  • Volumes, folders and shares

No explanation necessary here, I think; just remember that we are talking about Windows Server 2003 SP1 or later.  No more Windows Server 2000!

Servername | IP | Name
  • System State

Things that are protected by the system state are listed in the following article: http://technet.microsoft.com/en-us/dpm/bb808714.aspx

Servername | OS | Server IP
  • Exclusions

Write down the exclusions here.  These can be folder-based, file-based or file-extension-based.  Maybe interesting if you don't want to back up the entire company's MP3 collection 😉

Also think about the following: if DFS-N (Distributed File System Namespaces) is in place, map to the actual file locations, because shares in the DFS hierarchy cannot be selected for protection; only the target paths can be selected.  If DFS-R (Replication) is used, map to all the replicas and then select one of them for protection.

Servername | Server IP | Excluded folder

Servername | Server IP | Excluded files

Servername | Server IP | Excluded file extension

 

Step 3: Create Logical Design for Protection

In this step, the protection requirements will be translated into a logical design, and that logical design will be configured as one or more protection groups.  But before you start, stop for a moment and consider the following VSS limitations:

  • File protection to disk is limited to 64 shadow copies
  • File protection can have a maximum of 8 scheduled recovery points for each protection group each day
  • Application protection to disk is limited to 512 shadow copies, incremental backups are not counted towards this limit
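A protection-group design can be checked against these limits with a quick sketch (my own helper, assuming each scheduled recovery point consumes one shadow copy):

```python
def within_vss_limits(recovery_points_per_day, retention_days, application=False):
    """Check a recovery point schedule against the VSS limits listed above.
    File protection: at most 8 recovery points/day and 64 shadow copies total.
    Application protection: at most 512 shadow copies (incrementals excluded)."""
    copies = recovery_points_per_day * retention_days
    if application:
        return copies <= 512
    return recovery_points_per_day <= 8 and copies <= 64

print(within_vss_limits(8, 8))   # True: 64 copies, right at the file limit
print(within_vss_limits(8, 9))   # False: 72 copies exceed the 64-copy limit
```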

Keep these in mind while designing this step.

To do this, fill in the next table so you can determine the recovery goals, the protection media and how the replicas will be created.

Here is the explanation of the various parameters:

Server or workstation: Name of the server, and whether it is a server or a workstation
Location: Location of the data
Data to be protected: Application data or file data
Data Size: Current size of the data
Rate of Change: How fast does the data change?
Protected volume: The name of the protected volume (if applicable)
Synchronization Frequency: How often do we need to apply the changes to the replica?
Retention Range: How long must this data be kept available (online and/or offline)?
Recovery Point Schedule: How much time between recovery points?
Media: Which media is used? (disk, tape or disk/tape)
Replica Creation Method: Automatic or manual (backup/restore)?
Protection group name: Choose a name
DPM Server: Choose the correct DPM server (if more than one server will be in place)
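These parameters map naturally onto a simple record; here is a sketch (the field names are my own) filled in with the SQL production example that follows:

```python
# A hypothetical record capturing the design parameters listed above,
# filled in with the SQL production example from the text.
sql_production = {
    "server_or_workstation": ["SQL1", "SQL2", "SQL3"],
    "location": "Antwerp Office",
    "data_to_be_protected": "Application",
    "data_size_gb": 600,          # projected to grow to 1 TB in five years
    "rate_of_change": "Frequent",
    "protected_volume": "SQL Store",
    "synchronization_frequency_min": 15,
    "retention_range_days": 7,
    "recovery_point_schedule": ["09:00", "12:00", "15:00", "18:00", "21:00"],
    "media": "Disk/Tape",
    "replica_creation_method": "Automatic",
    "protection_group_name": "SQL Production",
    "dpm_server": "DPM01",
}
print(sql_production["protection_group_name"])  # SQL Production
```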

 

SQL Production Databases

Server or workstation: SQL1, SQL2, SQL3
Location: Physical location, e.g. Antwerp office
Data to be Protected: Application
Data Source Type: Disk
Data Size: 600 GB in total, expected to grow to 1 TB in five years
Rate of Change: Frequent
Protected Volume: SQL Store
Synchronization Frequency: 15 min
Retention Range: 7 days
Recovery Point Schedule: 9.00, 12.00, 15.00, 18.00, 21.00
Media: Disk/Tape
Replica Creation Method: Automatic
Protection Group Name: SQL Production
DPM Server: DPM01

If you fill in this table for each set of data you need to back up, you have already designed your protection groups.

Step 4: Design the Storage

Here's the tricky part.  It is almost impossible to calculate exactly how much storage you need.  There are a few helpful resources on the internet, but in my experience, taking the total protected data and doubling it is usually good enough.  This is (of course) when you want to use the synchronization features to the fullest.  If you are only interested in the traditional way of backing up, then you can get by with less.
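That rule of thumb is trivial to express as a sketch (the function name and example values are mine):

```python
def storage_pool_gb(protected_gb, factor=2.0):
    """Rule of thumb from above: size the storage pool at roughly twice the
    protected data (1.5x is the documented minimum, 2-3x the recommendation)."""
    return protected_gb * factor

# 600 GB of protected SQL data, sized at the 2x rule of thumb:
print(storage_pool_gb(600))  # 1200.0
```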

Anyway, here are a few links for storage calculation

http://technet.microsoft.com/en-us/library/bb795684.aspx

http://technet.microsoft.com/en-us/library/bb808859.aspx

http://blogs.technet.com/dpm/archive/2007/10/31/data-protection-manager-2007-storage-calculator.aspx

http://www.microsoft.com/downloads/details.aspx?FamilyID=445BC0CD-FC93-480D-98F0-3A5FB05D18D0&displaylang=en

  • Custom Volumes

Do we need to consider custom volumes?  Only if:

  1. Critical data must be manually separated onto a high performance LUN
  2. To meet regulatory requirements
  3. To separate IO-intensive workloads across multiple spindles
  • Choose the Disk Subsystem

If you have the option, decide what you are going to use as the disk subsystem.  Will you be using DAS, SAN, iSCSI?  What RAID configuration?  Choose based on the peak IOPS during backup or restore, but in my humble opinion, a good iSCSI solution will do the trick without any problems (think Dell MD3000i, for example…).

  • Tape Storage

Which tape drive model or robotic library will you be using?  Is it on the compatibility list?

Check http://technet.microsoft.com/en-us/dpm/cc678583.aspx for compatibility.

  • Placement of Disk and Tape Storage

Where are the disk and tape storage located relative to the DPM server?  Are they close?  Are they connected over the network, Fibre Channel, or SCSI?

Step 5: Design the DPM Server

Finally you are getting to the end of this process.  You can design the DPM server itself.

  • Calculate how many DPM servers are needed

These are the limitations of one DPM server:

  1. Maximum 250 storage groups
  2. Maximum 10 TB for 32-bit DPM servers
  3. Maximum 45 TB for 64-bit DPM servers
  4. Maximum 256 data sources per DPM server (64-bit), where each data source needs two volumes
  5. Maximum 128 data sources per DPM server (32-bit)
  6. Maximum 8000 VSS shadow copies
  7. VSS addressing limits: add a DPM server for each 5 TB (32-bit) or 22 TB (64-bit)
  8. Maximum 75 protected servers and 150 protected workstations per server
  9. Data sources in another, untrusted domain or forest: add a new DPM server
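A rough sketch of how these limits translate into a server count (my own helper; it only checks the data-source and capacity limits, so the other constraints above still have to be verified separately):

```python
import math

def dpm_servers_needed(data_sources, protected_tb, is_64bit=True):
    """Worst case across the per-server data-source and capacity limits
    listed above (256 sources / 45 TB for 64-bit, 128 / 10 TB for 32-bit)."""
    max_sources = 256 if is_64bit else 128
    max_tb = 45 if is_64bit else 10
    return max(math.ceil(data_sources / max_sources),
               math.ceil(protected_tb / max_tb))

print(dpm_servers_needed(300, 50))                # 2: both limits exceed one 64-bit server
print(dpm_servers_needed(100, 8, is_64bit=False)) # 1: fits on a single 32-bit server
```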
  • Map protection groups to servers and storage.

Well, as already said, if more than one DPM server is in place, map the table to the correct server.  A few pointers here:

  1. Separate data that cannot coexist on the same server for legal or compliance reasons
  2. Group protection groups that have different synchronization frequencies
  3. Group protection groups with the same media requirements
  4. Group protection groups that comprise data sources that are within the same high-speed network.
  5. Group protection groups that will be backed up from or to VM’s.
  • Hardware requirements

According to Microsoft:

What | Minimum | Recommended

Processor | 1 GHz | 2.33 GHz quad-core CPUs

Memory | 2 GB | 4 GB RAM

Pagefile | 0.2% of the size of all recovery points + 1.5 times the RAM | N/A

Disk Space | Program files: 410 MB, database file drive: 900 MB, system drive: 2650 MB | 2-3 GB free on the program files volume

Disk Space for Storage Pool | 1.5 times the size of the protected data | 2-3 times the size of the protected data

Logical Unit Number (LUN) | N/A | Maximum of 17 TB for GPT dynamic disks, 2 TB for MBR disks

  • Software Requirements

You need to know these 5 things before deciding to place DPM on a server.

  1. NO ia64-bit OS
  2. NO Microsoft System Center Operations Manager on same server.
  3. NO domain controller or application server
  4. Windows Server 2008 (Standard & Enterprise Edition)
  5. Windows Server 2003 with SP2 (R2)
  • Virtual or not?

Yes, you can run DPM in a virtual machine if you use pass-through disks or an iSCSI device.  Please note that a virtualized DPM server can't use a tape library directly attached to that server.

  • Database

Please keep in mind that you need to run the DPM database on a separate SQL instance!  You also need to plan for SQL Server Reporting Services (SSRS) on each DPM server.  It is required; you can't do without it.

  • Dedicated Network

Will you be using a dedicated network?  If so, write it down.

  • Fault Tolerance and protection for DPM

Two components of DPM can be made fault tolerant: The DPM server and the DPM database.  However, keep this in mind for fault tolerance:

  1. Server cannot be run as an MSCS clustered application
  2. Server can run in a VM, which can be a part of a clustered environment
  3. Database is not supported in an MSCS cluster
  4. A DPM server can back up its own databases to tape.
  5. A DPM Server can be used to protect the data from other DPM Servers.

 

OK, that's it.  Before you have even started to deploy anything, you have gathered all the information necessary for a good DPM implementation.

This will lower the chances of failure and can even (if necessary) show management that additional resources are needed, or that you cannot deliver on the stated business requirements.

Cheers,

 

Mike

 


System Center Data Protection Manager 2007 SP1: How it works

9:34 am in Uncategorized by mikeresseler

Hey All,

As promised, a post about how Data Protection Manager works.

First, System Center Data Protection Manager 2007 supports three types of backup:

 

1) Disk-to-disk (D2D)

2) Disk-to-tape (D2T)

3) Disk-to-disk-to-tape (D2D2T)

Many organizations suffer from the same problem: the size of the data keeps growing, and the backup window is getting smaller, or is simply not big enough anymore to back up all the necessary data.  Even the weekend is becoming problematic these days.  Because of that, a lot of backup programs now work with disk-to-disk backups.  Why?  Disk storage is getting cheaper, it is more reliable than tape, and restores are much faster than from tape.  SCDPM does this too.

Still, the end of tape is far away.  Many organizations will keep tape backups to store offsite in case of a disaster, so a backup to tape is still necessary.  SCDPM supports this with D2T, better known as the ‘old school’ backup to tape, or in combination with D2D, which makes it D2D2T: the disk-to-disk-to-tape backup.

 

First, how does it work?  As an example, we will talk about the backup of a file server.

 


As you can see, I will back up a volume (D:) on a file server.  The DPM server will have a replica of this volume.

After that, the DPM server will take snapshots at different times (adjustable by yourself), based on changes to the file system.  Only the changes will be replicated to the replica.


In the picture, you see that the DPM server will synchronize every 15 minutes.  It will keep the data for 12 days, and a recovery point will be taken 5 times a day.


This screenshot shows how a user can see the different versions of a file on a file server.

With this mechanism, you can actually give users the ability to restore their documents without the intervention of an IT engineer.  An administrator / IT engineer can decide on a schedule for when the DPM server will synchronize the newer versions with the replica residing on the DPM server.  This process is called Continuous Data Protection (CDP).


Example of a real-life problem

DPM does not only have this mechanism for files, but also for a few key applications within your environment, such as Exchange, SQL and SharePoint.  As an example, I will discuss the Exchange technology.

Just as it does with files, DPM will make a full replica of the Exchange databases.  Then, depending on your settings, it will synchronize a copy of the closed transaction logs to the DPM server every x minutes (by default every 15 minutes).


First step: Full replica


Second step: Synchronization of the closed transaction logs

At minimum every 12 hours, there is an express full backup of the Exchange replica.  This applies every change since the last express full backup to the replica.  DPM uses a special filter that keeps track of changes at byte level, so only a minimum of content is transferred over the network.
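The restore logic this enables can be sketched as follows (a simplification with hypothetical names; a real DPM restore selects the recovery point for you):

```python
def restore_plan(failure_min, express_full_every_min=720, log_sync_min=15):
    """Find the most recent express full backup before a failure, and the
    number of 15-minute log syncs to replay on top of it.
    Times are minutes since midnight."""
    last_full = (failure_min // express_full_every_min) * express_full_every_min
    log_syncs = (failure_min - last_full) // log_sync_min
    return last_full, log_syncs

# Failure at 14:37 (minute 877): restore the 12:00 express full, replay 10 log syncs
print(restore_plan(877))  # (720, 10)
```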

The combination of the express full backups and the synchronization of the Exchange logs gives an IT engineer / administrator the ability to restore the full database and the transaction logs up to just before the problem arose or the Exchange server went down.


Another (great) possibility is the option to create a disaster recovery setup with a secondary DPM server protecting the first one.  If you can locate this second DPM server in another datacenter or location, you have a very good disaster recovery plan and the ability to quickly recover from problems or disaster.


One of the most important things in these scenarios is the calculation of the storage necessary to support these technologies.  But I will come back to that in my next post.

Although this all seems nice, and many IT administrators / engineers will want this in their environment, one of the questions IT decision makers will ask is how difficult this is to work with.  Will the IT staff need special training?  Can they easily monitor the system?  What about reporting?  How can I verify that everything is running as smoothly as it should?  And maybe last but not least, how much work will this application give my IT staff on a daily basis?

The good news is, I can answer all of these questions very positively.

No special training is needed, because the interface is based on the same GUI as, for example, Outlook or Microsoft System Center Operations Manager 2007 (SCOM).


Monitoring the system is easy if you have SCOM in place (there is a dedicated management pack for this technology), and if not, you can always have the system send emails each time there is a problem.  And, as with every System Center product, when there is an alert, the application will always suggest possible solutions.


Reporting is also no problem; there are a lot of predefined, usable reports in the system that can be mailed daily.


One of the major advantages of the system is that it will automatically do a consistency check after each “backup”.  This allows the IT staff to quickly find inconsistent data in the environment.

And last but not least, will this give the IT staff a lot of work?  Honestly, no.  DPM is not a product that requires a “babysitter”.  As long as everything is well designed and implemented, your staff can read the daily reports, check for errors in SCOM or view the alerts in the console once a day (or through email), and the system will run on its own.  Second, you will gain a lot of time each time a restore needs to be done, because of the speed and ease with which you can recover.  Of course, you will need to invest a lot of time in the initial architecture / design and implementation of the system.  How can you achieve this?  Check out my next post on designing a DPM solution.

Cheers,

Mike

System Center Data Protection Manager 2007 SP1: A good backup choice?

1:48 pm in Uncategorized by mikeresseler

With the release of SP1 two months ago, Microsoft has created a product that can actually take a large share of this market.  So if you ever think that this product suits your environment, here are a few pointers to help you sell it to upper management.

Ok, now about the product.

First, why would you not choose this solution?

  1. If you only want one backup solution in your environment, and you're not running exclusively on Windows, then stop thinking about this product.
  2. If you have servers in a DMZ (workgroup) or in a domain without a full trust, the story also ends, unless you want to install multiple SCDPM servers.

Only two reasons and end of story… This product must be fantastic ;-).  Unfortunately, there aren't many environments without a DMZ, or that run exclusively on Windows.  So these two items can really be a killer when defending this product to management…

How you can solve this will be explained later on.  First, we’ll start with what is new to DPM 2007 Service Pack 1.

  • Protection of Hyper-V machines, including 2008 Hyper-V and Hyper-V Server.  And since it is still possible to do a complete backup of Virtual Server 2005 R2 guests, the whole Microsoft virtualization suite is covered.  And we are really talking here about online backup of an entire guest!  How cool is that…
  • SQL Server protection now covers SQL 2008, and it gives you added protection capabilities for mirrored databases.
  • SharePoint Server 2007 and SharePoint Services 3.0 receive index protection, catalog optimization and support for mirrored content databases.
  • Exchange Server 2007 SCR protection
  • Cross-forest protection: it is possible to back up from other domains, but a forest-level trust is necessary!

Overview of DPM Functionality 

For more information about the product, make sure you check out the FAQ and the product page

If you want to use this product, you need to “sell” following advantages to your management:

  • Lower license costs.  Make sure you check out the video on Edge (edge.technet.com/media/DPM-2007-sp1-licensing).  And if you already have SMSE licenses, or are thinking about buying them, this product is included!
  • Fast restore.  If you have enough storage (SAN, iSCSI…) you can create protection groups so that you can easily and quickly restore files, mailboxes, mails, databases, SharePoint items…
  • You can back up to online (disk) storage for various products and afterwards back up to tape without interfering with the infrastructure.  This means that the “backup window” disappears and that you can do your backups to tape whenever you want.
  • Whether we like it or not, it is a Microsoft product, and the functionality for Exchange, SQL and SharePoint is fantastic.
  • As already said, full host server based backups for Virtual machines.
  • Users can perform their own recoveries.
  • One DPM server can protect another, making it possible to have a backup server in another location so you can get up and running again very fast after a disaster.

 


  • Great reporting possibilities


I think these eight points are already a great start for building a business case for the decision makers.

But now for the problems I mentioned in the beginning of this post.

DPM is not capable of backing up non-Windows systems, so for those you need another solution.  A true SAN backup is also not possible.  So what can we do about it?

In many cases, we implemented the following solution:

We implemented DPM to do backups to SAN space or iSCSI systems.  After that, we used another solution (mostly HP Data Protector) to take a backup of that storage.  Combining both systems is perfectly possible, as described in the following TechNet article: http://technet.microsoft.com/en-us/library/bb795753.aspx

For the non-Windows environments, HP Data Protector is also a solution, but you can of course go with Veritas, ArcServe or other products.

For the next release of DPM, I think Microsoft should think about the following features:

  • SAN backup
  • Backup of file systems in non-Windows environments (hey, if they can monitor them through SCOM, why not back them up?)
  • The possibility of protecting servers that are not in the same domain, or that are in a workgroup / forest without a full trust

Cheers,

Mike

System Center Data Protection Manager 2007 SP1: Terminology

1:02 pm in Uncategorized by mikeresseler

In my next few posts, I will discuss three topics about DPM 2007 SP1.

  1. A good backup choice?  This will discuss how to “sell” this product to the IT decision makers and talks about the ROI of the product.
  2. How it works.  This will discuss how the product works and what snapshotting is, at a high level, without getting too technical.
  3. How to start a DPM project.  This will discuss (based on the IPD from Microsoft) the best way to prepare for a DPM project.

To understand the next few topics, I will give here a few definitions that are used in the product (taken from the TechNet site).

RPO: RPO means Recovery Point Objective: the amount of data loss that can be tolerated in case of a disaster.  In many cases, the business will say no loss :-).  But that is almost impossible, unless you have sophisticated systems such as SAN replication.

Retention Range: Basically, how long data must be kept available.  The best-known example is the GFS principle, which stands for Grandfather, Father, Son.
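As a rough sketch (my own illustration with example retention values, not how DPM actually schedules tapes), a GFS rotation boils down to classifying each daily backup into a tier with its own retention period:

```python
from datetime import date, timedelta

def gfs_label(day: date) -> str:
    """Classify a daily backup under a Grandfather-Father-Son rotation:
    monthly ("grandfather") on the 1st, weekly ("father") on Sundays,
    daily ("son") otherwise.  Retention periods are example values."""
    if day.day == 1:
        return "grandfather (keep ~12 months)"
    if day.weekday() == 6:  # Sunday
        return "father (keep ~5 weeks)"
    return "son (keep ~2 weeks)"

# March 1, 2009 is both the 1st and a Sunday; March 8 is the next Sunday.
for offset in range(9):
    d = date(2009, 3, 1) + timedelta(days=offset)
    print(d, gfs_label(d))
```

The point is that older data survives at coarser granularity: you can go back a year, but only to monthly points, while the last two weeks are recoverable day by day.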

RTO: Recovery Time Objective.  How fast do you need to be online again?  Most of the time, IT decision makers need to find a balance between the RTO and the RPO, which is a very difficult exercise.
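That balance can be made concrete with a toy model (my own back-of-the-envelope numbers, not DPM internals): synchronizing more often shrinks the worst-case data loss, but lengthens the chain of incremental backups that must be replayed on restore.

```python
def worst_case_loss_minutes(sync_interval_min: int) -> int:
    # RPO: a disaster just before the next sync loses one full interval.
    return sync_interval_min

def restore_chain_length(sync_interval_min: int, minutes_since_full: int) -> int:
    # RTO driver: the last full/replica plus every incremental taken
    # since it must be replayed during recovery.
    return 1 + minutes_since_full // sync_interval_min

# One day after the last full backup:
for interval in (15, 60, 240):
    print(f"sync every {interval:3d} min -> "
          f"RPO {worst_case_loss_minutes(interval):3d} min, "
          f"restore chain {restore_chain_length(interval, 24 * 60)} backups")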

End-User Recovery: Will the end users be able to do recoveries themselves, without the intervention of IT?

BCP / DRP: Business Continuity Plan / Disaster Recovery Plan.  The first is about making sure the business can continue to run after a disaster; the second is more about recovering what is lost.

Data Source: The data that DPM considers as a unit for protection.  DPM allocates separate storage for each data source.

Differential Backup: This term is not used with DPM.  It refers to a backup in which all of the files that are new, or have changed since the last backup, are backed up.

Express Full Backup: A type of backup that only transfers blocks that changed since the previous express full backup.  This creates a recovery point.

Incremental backup (sync): A type of backup that uses the application’s native form of incremental data protection.

For example, for a server running SQL Server, it’s a log file backup; and for a server running Exchange Server, it’s a VSS incremental backup. Each incremental backup creates a new recovery point. With incremental backups, only the data that has changed since the most recent full or incremental backup is backed up, which means that the size of the incremental backup is usually much smaller than a full backup.
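To illustrate the idea behind an express full backup (a simplified sketch of mine, not DPM's actual filter driver), here is a block-level comparison that finds only the blocks that changed since the previous version, which are all that would need to be transferred:

```python
import hashlib

BLOCK = 4096  # illustrative block size, not DPM's actual granularity

def changed_blocks(previous: bytes, current: bytes) -> list:
    """Return the indexes of fixed-size blocks that differ between two
    versions of a file: only these would be transferred."""
    changed = []
    for i in range(max(len(previous), len(current)) // BLOCK + 1):
        old = previous[i * BLOCK:(i + 1) * BLOCK]
        new = current[i * BLOCK:(i + 1) * BLOCK]
        if hashlib.sha1(old).digest() != hashlib.sha1(new).digest():
            changed.append(i)
    return changed

v1 = b"a" * (3 * BLOCK)
v2 = b"a" * BLOCK + b"b" * BLOCK + b"a" * BLOCK
print(changed_blocks(v1, v2))  # only block 1 changed
```

Even though the result on the DPM server is a full, directly restorable copy, the network cost is proportional to the changed blocks, not to the total data size.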

Replica: A full copy of a protected data source that reflects the result of the most recent DPM operation for that data source.

Replica Volume: A volume that holds the current copy of the protected data for a data source.

Synchronization: The process by which DPM transfers changes to protected data from the protected computer to the DPM server, and applies the changes to the replica of the protected volume.

Volume Shadow Copy Services (VSS): Provides the backup infrastructure for the Windows XP, Windows Vista®, Windows Server 2003, and Windows Server 2008 operating systems, as well as a mechanism for creating consistent point-in-time copies of data known as shadow copies.

Protection Group: One or more data sources that are grouped together, to be protected in the same way, by the same DPM server.

Consistency Check: The process by which DPM checks for and corrects inconsistencies between a protected data source and its replica. A consistency check is performed only when normal mechanisms for recording changes to protected data, and for applying those changes to replicas, have been interrupted.
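A consistency check can be sketched as a checksum comparison between the data source and its replica (a toy model assuming per-file checksums; the real product works against the replica volume, not a Python dict):

```python
import hashlib

def consistency_check(source: dict, replica: dict) -> list:
    """Compare per-file checksums and repair the replica in place,
    returning the paths that had to be fixed."""
    repaired = []
    for path, data in source.items():
        if hashlib.md5(replica.get(path, b"")).digest() != hashlib.md5(data).digest():
            replica[path] = data  # re-copy drifted or missing data
            repaired.append(path)
    for path in [p for p in replica if p not in source]:
        del replica[path]  # drop data the source no longer has
        repaired.append(path)
    return repaired

source = {"a.txt": b"1", "b.txt": b"2"}
replica = {"a.txt": b"1", "b.txt": b"stale", "c.txt": b"deleted"}
print(sorted(consistency_check(source, replica)))  # -> ['b.txt', 'c.txt']
```

This is why a consistency check is expensive and only run after change tracking has been interrupted: it has to walk everything to find the drift, whereas normal synchronization already knows what changed.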

Recovery Point: The point in time view of a previous version of a data source that is available for recovery from media managed by DPM.

That’s it for now, next post: A good backup choice?

Cheers,

Mike