DPM 2010 launch week @ MMS 2010: Part 3: Protecting Windows Clients

April 28, 2010 at 6:37 am in Uncategorized by mikeresseler

Hey All,

Here’s part 3 of our DPM 2010 launch week overview

For the full set:

DPM 2010 launch week @ MMS 2010: Part 1: Technical Introduction

DPM 2010 launch week @ MMS 2010: Part 2: Protection Applications

DPM 2010 launch week @ MMS 2010: Part 3: Protecting Windows Clients

DPM 2010 launch week @ MMS 2010: Part 4: Virtualization and Data Protection, better together

DPM 2010 launch week @ MMS 2010: Part 5: Disaster recovery and advanced scenarios

DPM 2010 launch week @ MMS 2010: Part 6: Partner announcements

This session was given by Tim Kremer with, you guessed it, Jason Buffington as backup :-)

This session was all about protecting your clients.  The first thing we started with was the reason why you would want to protect clients at all.  Many companies or IT pros will answer that users should save their valuable data somewhere on the network or take backups on their own.  While this probably works in one or two percent of companies, I’m sure it fails in the other 98 percent.  The reason is simple.  When people are travelling, they won’t be uploading their data to a network share, and even when they are in the office and need to copy their data to the server on a Friday evening… guess what will happen :-).  And if they need to back up their own data, you will probably end up with some users who keep 100 copies of their data on an expensive network share, and others who never bother or who back up to the local drive of their laptop.  So if the laptop gets stolen or the disk dies…

 

According to one of the research companies (something like Forrester or Gartner, I forgot which one), about 60% of the intelligence of a company resides on users’ local disks.  Now that’s a lot.  So if we want to protect that knowledge, we need to find a good way to do it without too much trouble, and without disturbing the users or leaving it up to them.  That just won’t happen.  Period.  (Making end users do their own backups is often called the “tax” on the end users.)

 

When designing the solution, the architects @ Microsoft had the following challenges:

  • Mobile workforce
  • Different users with different needs
  • Large scale (many many desktops / laptops)

So they created the following goals:

  • Remove the end user tax
  • Support roaming user backups
  • Allow customizability for specific users
  • Enforce admin defined restrictions
  • Keep IT costs low

So how did they solve those requirements?

With the same agent as the one for the servers, you can start protecting your clients.  Using your favorite deployment method (SCE, SCCM, AD, MDT…) you can get the agents out there.  Remember, you don’t pay a license for an agent if you don’t use it, so deploying it across your entire network is not going to give you a licensing issue.  You start paying the moment you start protecting a client.  Period.

Second, an IT pro can create different policies.  Let’s say we want a client to protect its My Documents folder, a specific company directory, and maybe some more folders that are important to the user, such as Favorites.  But of course, we don’t want the My Pictures or My Music folder to be protected.  The company is not interested in collecting the vacation pictures or mp3 libraries of its employees.  (OK, the IT pros might be interested in the mp3 collection :-)).  By defining a policy and including/excluding folders you can achieve this.  And it gets even better: you don’t need to know the exact location of the My Documents folder, because DPM uses the path variable to find out where it is.  And last but not least, you can actually deny certain extensions; “no .mp3 files” is a good example.  Whether we like it or not, end users are mostly smart enough to notice that certain folders are excluded and will move their “valuable data” to a folder that is protected.
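To make the precedence concrete, the include/exclude/deny logic can be sketched roughly like this in Python.  This is a hypothetical helper to illustrate the idea (denied extensions win, then exclusions, then inclusions), not DPM’s actual implementation:

```python
from pathlib import PureWindowsPath

def is_protected(path, include_dirs, exclude_dirs, denied_exts):
    """Illustrative sketch of a DPM-style client protection policy check."""
    p = PureWindowsPath(path)
    # Denied extensions (e.g. ".mp3") are never protected, wherever they live.
    if p.suffix.lower() in denied_exts:
        return False

    def under(folder):
        # Case-insensitive "is this path inside that folder?" check.
        parts = PureWindowsPath(folder).parts
        return tuple(s.lower() for s in p.parts[:len(parts)]) == \
               tuple(s.lower() for s in parts)

    # Exclusions win over inclusions, so "My Documents\My Music"
    # can be carved out of a protected "My Documents".
    if any(under(d) for d in exclude_dirs):
        return False
    return any(under(d) for d in include_dirs)
```

So a `.docx` under My Documents would be protected, anything under the excluded My Music would not, and an `.mp3` would be skipped even inside a protected folder.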

 

Now what if users want to protect some specific folders of their own?  Folders that are not standard in the company but still contain valuable information.  By giving end users (or some of them) the right to do so, they can choose certain folders themselves to be protected.


Now what about users on the road?  How is this going to work?  Here’s the answer.

1. Backup is supported over VPN and DirectAccess.  So whenever a client is connected to the main office over VPN or DirectAccess, it can synchronize with the office.  Remember the block-level copies from part 2: the amount of data that is sent over is really not that much.

2. DPM provides you with two mechanisms at once.  While performing a backup, it will send the data to the DPM server if it is reachable, and at the same time it will keep a local copy on the laptop.  So users will be able to restore from their local cache if necessary.  Will this protect you from hardware failure or from a stolen laptop?  No, it won’t, but users will be able to go back to a previous version of a document when necessary, even while they are working on the road.

3. What about notifications?  Everybody who has ever worked with DPM 2007, or with any backup solution for that matter, knows that the system will start complaining whenever it can’t reach its clients.  DPM will do that too, but it has a built-in setting for how long it waits before it starts to complain.  Consider the fact that many people take 14 days of vacation.  Add the weekends to that and you get 18 days.  So only after 18 days do you let the DPM server complain that it is missing a connection to a client.  This way you will avoid a lot of false alarms, and only the clients of those who take more than two weeks of vacation, or who are travelling longer, go into alert.
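The two mechanisms in points 2 and 3 can be sketched roughly in Python.  Everything here (function names, the `grace_days` parameter, the dict-based stores) is illustrative, assumed for the sketch, and not DPM’s actual API or setting names:

```python
from datetime import date, timedelta

def synchronize(changed_blocks, server_reachable, local_cache, server_replica):
    # Point 2: changed blocks always land in the local cache on the laptop,
    # and also on the DPM server whenever it is reachable (in the office,
    # or over VPN/DirectAccess).
    local_cache.update(changed_blocks)         # restore point on the laptop itself
    if server_reachable:
        server_replica.update(changed_blocks)  # off-machine copy on the DPM server
        return "server + local cache"
    return "local cache only"                  # server catches up on next contact

def should_alert(last_contact, today, grace_days=18):
    # Point 3: only complain about an unreachable client after a configurable
    # grace period (18 days ~ 14 days of vacation plus the surrounding weekends).
    return (today - last_contact) > timedelta(days=grace_days)
```

A disconnected laptop keeps restorable versions in its local cache, and a client that has been quiet for 18 days or less stays out of the alert list.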

What about the costs?  You can imagine that all that user data will take a lot of disk space.  First, you can use low-cost storage for this, and second, because the system works pretty well, it doesn’t require much human effort.  Compare that with letting users back up their own data to a network share: that is mostly high-end storage, which costs a lot, is never cleaned up by the users, and will probably hold 50 copies of many files.  DPM doesn’t need that much space because it only stores the changes.  Second, think about the value of the data.  Ask the business what it costs when a road warrior loses a laptop and the data it contains.  You can do the math quickly.

 

So how does the end user see this?

Below are a few screenshots of the end-user experience.

[Screenshot: End-user recovery]

[Screenshot: Agent in the notification area]

[Screenshot: Agent UI]

 

Want more?  How about this…

A user loses his or her laptop, or the machine just dies.  You have yesterday’s backup on your DPM server.  The deployment team quickly prepares a new laptop with their favorite OSD tool, with the agent installed or sysprepped onto it.  You jump into the DPM console and do a restore to another location.  The user gets the data back :-)

Even more?

The DPM agent allows the end user to synchronize now.  So suppose they made some important changes to a document: they can synchronize whenever they want, to the DPM server if they have a connection, or to the local cache if they are not connected.  So if end users really did some important work, they can create a “backup” of their own before flying out or going on vacation.  With one simple click, the system does the work.

 

Till next time for part 4!

Cheers,

Mike