
Who cares about the hypervisor? Part 1

7:24 am in Cloud, Hypervisor by mikeresseler

I have been following the announcements at VMworld San Francisco this week to find out what is coming in VMware vSphere 5.5. Those who know me a bit may wonder why I did that, as I am normally very Hyper-V focused. I have (in my opinion at least) a very good reason: the more we advance in the virtualization and cloud world, the more convinced I become that the hypervisor has become a commodity. A couple of months ago I had a (virtual) bar-type conversation with Greg Shields about this, and it becomes truer every minute. Before I start to explain why I feel that way, allow me to take a minute to explain why I don’t like hypervisor comparisons…

I don’t like hypervisor comparisons on blogs, in whitepapers, in independent research (if you can call it independent…) and so on, because they are always biased. It’s that simple. If I had to create a comparison post or whitepaper, I’m pretty sure that by the end of the day Microsoft Hyper-V would be the winner, because I am biased towards Hyper-V. Is that because I believe it is the best hypervisor? No, it is because I know that hypervisor best. Those who write these posts try to compare the different features with each other, but in the end they have a preference, and depending on what they see as necessary, optional or nice-to-have, the answer to the best-hypervisor question will differ. It gets even worse when we try to compare more than two hypervisors.

With that said, here are my observations from the last month. (Please note that I’m looking at VMware and Hyper-V here… I can’t judge the other hypervisors as I don’t have enough knowledge about them.) Hyper-V and VMware can easily be placed next to each other. They both have an impressive set of capabilities, and if VMware says they have A, then Hyper-V probably has it too, only it is called B. In the recent past it was always Hyper-V that needed to catch up with VMware, but these days I simply see two hypervisors with a lot of capabilities that are pretty much equal. The time when one fan base could laugh at the other for missing X is practically over (some gap will always remain, since they will never be fully equal, but bear with me here for a second).

A simple example: Microsoft Hyper-V and its VHDX format support 64 TB of disk space. This was announced with Hyper-V 3 (or Windows Server 2012), which meant the Microsoft guys could start shooting at the VMware guys because of it. At VMworld San Francisco, VMware announced 64 TB support for VMDK as well (technically it is 62 TB, but hey…). But do we really care? When I need to choose a hypervisor for my environment, is this one of my requirements? The possibility to have one (1!) virtual disk that is 60+ TB in size? Or what about the fact that somebody supports x number of logical CPUs and x amount of memory… (PS for the Microsoft guys: I could just as easily have taken an example that works the other way around 😉)

Whenever I talk to somebody about which hypervisor he or she needs, I ask the same simple question: what are your requirements? Nine times out of ten they don’t have a single clue; they say something like “performance” and “stability” and then they are lost. So instead of focusing on which product has more features than the other, ask yourself the following questions:

  • What features do I need in my environment?
  • What will it cost to train the administrators in the technology?
  • What kind of support do I want (and how much do I want to pay for it)?
  • How many VMs do I want to run on a single host (density)?
  • Do I have different locations?
  • (And many more questions that need to be asked upfront…)

Once you have the answers to those questions, you can start looking for the solution that BEST fits your needs. Not mine, and not that of anyone else out there with a preference for one of the hypervisors.

All right, you made it this far… now it is time to come to my point. While I don’t care about the hypervisor anymore (as long as it fits your specific needs, I’m happy), it now comes down to the next part in the chain: automation, management, monitoring, backup, security… Setting up an environment is one thing. Keeping that environment running, healthy and protected is much more work and costs a lot more than the initial project.

I strongly believe that you need to ask yourself these questions during the initial phase of the project. What are we going to do when we are live? What will our patch management look like? How am I going to see and foresee issues without staring at a monitor 24/7? What happens in case of disaster? How am I going to protect my environment against security risks?

In the follow-up post I’m going to discuss an integrated approach, not only for your virtual environment but for your entire environment as a whole. And yes, I’m biased again, towards a certain suite of management tools, but even if you don’t like those, you might want to read the follow-up post to catch my vision on management.

System Center Data Protection Manager 2007: New release for KB970867

9:26 am in Uncategorized by mikeresseler

Hey All,

Yesterday Microsoft released a new version of KB970867 (http://support.microsoft.com/kb/970867). Although the KB article has not been updated yet, the new download is already available: http://www.microsoft.com/downloads/details.aspx?FamilyID=14e1a04b-2323-4344-b737-a3194b9ab3ed&displaylang=en

So let’s see if I can use this download in my test environment. I installed the hotfix and, after the installation, opened the console.


There I immediately saw “Update Available”, so I decided to run the update to check whether the protected servers would need a restart.


Press “Update Agents” to start.


Make sure you select “Manually restart the selected servers later”.


After the upgrade, I can see that the version of the agent has changed to 2.0.8851.0.
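If you want to double-check from a protected server itself, you can look at the file version of the agent binary. A minimal PowerShell sketch; the path below is an assumption based on the default DPM 2007 agent install location, so adjust it to your environment:

    # Check the DPM protection agent version on a protected server.
    # NOTE: the path is an assumption (default DPM 2007 agent location).
    $agent = "C:\Program Files\Microsoft Data Protection Manager\DPM\bin\DPMRA.exe"
    (Get-Item $agent).VersionInfo.FileVersion   # should read 2.0.8851.0 after the update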

 

Cheers,

 

Mike

UPDATE: In the meantime, the associated KB is online. It is quite an impressive list of fixes… http://support.microsoft.com/default.aspx?scid=kb;EN-US;970868

Management of Security?

7:05 pm in Uncategorized by mikeresseler

Last week I got the chance to go to a round-table meeting with Kai Axford from Microsoft. Kai is a Senior Security Strategist in Microsoft’s Trustworthy Computing Group.

It was a nice evening in a small restaurant somewhere in Belgium. Normally a security specialist from our team would have joined the meeting, but for unforeseen reasons he couldn’t make it, so I got the chance to go instead.

Although I am interested in security, I’m certainly not a specialist. In preparation for the meeting I read the blog of Steve Riley, basically the inventor of the concept of the fortified datacenter.

I have to admit, listening to Kai during the meeting, who practically made the word security “cool”, was great; if this guy were a salesperson, he would probably sell the fortified datacenter concept to everybody. Just imagine that whenever you start your laptop and have an internet connection, you are basically on the intranet. That means that at all times you can access your programs, file shares, email… and at the same time you are protected by every rule of your company, without starting a VPN! If your company blocks the website xyz.com in the office then, I’m very sorry, it will also be blocked when you are at the airport using the free wifi, and so on. How great is that from an IT point of view :-)

Later in the evening, after listening to a lot of security questions and answers, I popped the question from a management standpoint. All nice and well: after a big infrastructure project the datacenter suddenly works, is fortified, and everything is completely transparent to the end user. I think management would kill to have this. But how are we (the IT pros) going to manage it in a convenient way? How are we going to make sure that we keep the overview, and that we know at all times which user has access to what? How are we going to check who had access to what, and when?

Luckily for me, Kai was prepared for this question, and he not only proposed a solution; in my opinion it is a solution that is basically a best practice for everything. Here it is…

  1. All access to any application, file share or resource should be managed from within Active Directory.
  2. ILM should be present in your environment, with well-thought-out workflows to provision new users, handle users that change department, etc.
  3. Management systems should be used to monitor, provision and maintain systems automatically, as much as possible.

Of course, I’m a System Center dude, so I will use System Center products as much as possible, plus one more (ILMv2, soon to be released).

Great. Probably quite a lot of IT pros are already convinced, but how are we going to sell this investment to our management? Let me try to do that for you with a few pointers.

 

All access to any application, file share or resource should be managed from within Active Directory

Cost: man-hours, and a good functional design.

What you need to do is give each access to a shared resource its own dedicated security group, or even multiple security groups if different access levels are necessary (e.g. read rights, admin rights, reporting rights…). Give each security group a logical name and a good description.

Example: suppose you have a file share called “Company Branding”. You want every user in your company to have read access to this share, because they need it for their presentations, sales work, etc., but a few people need write rights to make changes when necessary, and another few people need full control over it. Sounds like an everyday situation in any company.

I would create three security groups in AD:

Name                         Description
GG-S-CompanyBranding-Read    Read access to the CompanyBranding file share
GG-S-CompanyBranding-Write   Write access to the CompanyBranding file share
GG-S-CompanyBranding-Full    Full control of the CompanyBranding file share

GG stands for Global Group (you can of course also have U(niversal) G(roups) and D(omain) L(ocal) groups).

S stands for Security (the other option is D(istribution)).

On the file share, you give the different groups the correct rights; afterwards, the only thing you need to do is put the correct users in the correct security groups (or other groups in these groups…).
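To make this concrete, here is a minimal sketch of that setup using the ActiveDirectory PowerShell module (which ships with Server 2008 R2 / RSAT) and icacls. The OU, domain name and share path are assumptions for the example:

    Import-Module ActiveDirectory

    # One global security group per access level, with a descriptive name.
    $groups = @{
        "GG-S-CompanyBranding-Read"  = "Read access to the CompanyBranding file share"
        "GG-S-CompanyBranding-Write" = "Write access to the CompanyBranding file share"
        "GG-S-CompanyBranding-Full"  = "Full control of the CompanyBranding file share"
    }
    foreach ($name in $groups.Keys) {
        New-ADGroup -Name $name -GroupScope Global -GroupCategory Security `
            -Description $groups[$name] -Path "OU=ResourceGroups,DC=contoso,DC=com"
    }

    # Grant each group the matching NTFS rights on the shared folder.
    icacls "D:\Shares\CompanyBranding" /grant "CONTOSO\GG-S-CompanyBranding-Read:(OI)(CI)RX"
    icacls "D:\Shares\CompanyBranding" /grant "CONTOSO\GG-S-CompanyBranding-Write:(OI)(CI)M"
    icacls "D:\Shares\CompanyBranding" /grant "CONTOSO\GG-S-CompanyBranding-Full:(OI)(CI)F"

    # From here on, access management is just group membership.
    Add-ADGroupMember -Identity "GG-S-CompanyBranding-Read" -Members "Domain Users"

From that point on, granting or revoking access is nothing more than changing group membership.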

What are the advantages? Well, basically, you will at all times have an overview of who has access to which resource. Whenever an auditor (and I just had one ask the question…) asks you to tell him or her who has access to xyz, you open up the security group(s) and show which members have access. Or, even worse, whenever an auditor asks to see the access rights of person X, you pull up the “member of” list for that user and you have a great overview. If you have documented things well, you can make the links easily and prove to the auditor which resources the user has access to.

Imagine that you do this for each and every resource you have. (OK, I admit, the first time this will cost a lot of work and research, but afterwards you have a complete overview of access to your resources, and it is much easier to maintain. If a user leaves the company, remove his memberships and the case is closed…) This way, you could easily outsource the creation of new users to a helpdesk or service desk and give them a controlled MMC with limited AD access. Everything else they can’t touch or change.
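Those auditor questions, and the leaver scenario, then become one-liners. A sketch, again assuming the ActiveDirectory module; “jdoe” is a hypothetical user:

    # Auditor question 1: who has access to the CompanyBranding share?
    Get-ADGroupMember -Identity "GG-S-CompanyBranding-Read" -Recursive |
        Select-Object Name, SamAccountName

    # Auditor question 2: which resources does user X have access to?
    Get-ADPrincipalGroupMembership -Identity jdoe |
        Where-Object { $_.Name -like "GG-S-*" }

    # A user leaves the company: remove the memberships and the case is closed.
    Get-ADPrincipalGroupMembership -Identity jdoe |
        Where-Object { $_.Name -like "GG-S-*" } |
        ForEach-Object { Remove-ADGroupMember -Identity $_ -Members jdoe -Confirm:$false }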

Now, management wants to see ROI. So let’s try to calculate this effort.

  • Investment: initially this will cost you some time and will need to be thought through thoroughly. You will not only invest time from a system engineer, but also from some people who are good at thinking this through functionally. Depending on the size of the environment and the current state of the infrastructure, this will take more or less time.
  • Gains: creating a user or changing somebody’s rights becomes much faster than before. Every resource will be well documented, and once your base is done, securing a new resource with groups takes no more time than setting individual rights on it. You will even be faster with groups when you need to give multiple users access to the new resource.
  • Gains: documentation. As every IT pro knows, and any manager too, documentation is crucial. This documentation can immediately be used for your Business Continuity Plan and your Disaster Recovery Plan. How cool is that…
  • Transparency: control your access rights from one single point, your Active Directory, making it easy to provide better security.
  • No more time lost checking 20 Excel sheets for who has access where. Everything is in Active Directory. If you need a list, run an LDAP query and get the “member of” list for your users, and there you go (see the query sketch above).

 

ILM should be present in your environment, with well-thought-out workflows to provision new users, handle users that change department, etc.

I think a lot of IT pros know this scenario. You get the notification from HR that in two hours a new guy will start in department X. So here is your work list for the next two hours:

  • Create the user in AD
  • Create an email account for the user
  • Add the user to the correct distribution lists (let’s cheat and copy another user from that department, because who knows what this guy needs…)
  • Manually give him or her access to resources
  • Prepare his or her laptop/desktop at the same time (if you’re lucky you have WDS or SCCM or equivalent; otherwise, start building manually) (or worse, you just don’t have any hardware in stock 😉)
  • Prepare the ICT letter for this user
  • Done

The user starts, gets his things, and for the next five days he sends you tons of email because he doesn’t have access to this or access to that… which you grant him or her ad hoc. (Or, you already have the security groups in place… ;-))

The point is, this is a terrible way to let a user start; he or she already has a bad feeling about local IT, and you have started off on the wrong foot. Why? There is no man capable of doing all this in two hours or less without making a mistake or forgetting something. And even if you have the best procedure in place, timing is crucial here, so you will slip up.
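To be fair, parts of that work list can be scripted by hand. A rough sketch of what those two hours look like in script form; all names, OUs and groups are hypothetical, and Enable-Mailbox assumes the Exchange management tools are loaded (on Exchange 2007 you would also pick a -Database):

    Import-Module ActiveDirectory

    $name = "John Doe"; $sam = "jdoe"; $dept = "Sales"

    # 1. Create the user in AD (the OU structure is an assumption).
    New-ADUser -Name $name -SamAccountName $sam -Department $dept `
        -Path "OU=$dept,DC=contoso,DC=com" -Enabled $true `
        -AccountPassword (Read-Host -AsSecureString "Initial password")

    # 2. Create the mailbox (Exchange cmdlet; parameters vary per version).
    Enable-Mailbox -Identity $sam

    # 3. Group memberships -- ideally from a documented per-department list,
    #    not copied from a random colleague.
    "GG-S-CompanyBranding-Read", "DL-Sales-Team" |
        ForEach-Object { Add-ADGroupMember -Identity $_ -Members $sam }

But a script does not handle approvals, HR triggers or department moves, and it still has to be run by somebody, on time, without typos. That is exactly the gap a workflow product fills.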

What’s the solution for this? ILMv2, or Identity Lifecycle Manager version 2 (http://www.microsoft.com/windowsserver/ilm2/default.mspx). For me, this should be a System Center product, because it is meant to manage an environment, to “provision” identity management and to give users great self-service capabilities. It is not out yet, but it is expected to RTM in April 2009.

What can this thing do? Basically, it gives IT pros the ability to manage this information flow (such as creating a new user, adding it to the correct groups, giving it an email address, and so on…) through a defined workflow (with policy management!).

It has policy management, credential management, user management, group management and so on. Imagine this:

HR enters a new user in the HR system. Automatically a workflow starts and sends out emails to the service desk, where the service desk manager approves it, and bam: the user is created, has an email address and resides in the correct groups. At the same time, emails are sent to the application owners, and when they approve, the user goes into the correct groups and has access. Furthermore, application owners can change subscriptions to these groups at any time through ILM (which is embedded in a SharePoint console). You could make user rights and security very transparent through workflows.

ROI?

  • Costs: I don’t know the pricing yet, but this will probably be a big investment, in time as well, because you need to create workflows that suit your environment.
  • Savings: no more human error. Once the system is running, users get the rights they need. When a user moves to another department, it only costs one drag from department A to B, and after approval by the department managers (if you want to work with approvals…) the system does its job and the user has his or her new rights.
  • Savings: safely delegate administrative tasks to users, with one place to fill in data that then ends up everywhere in the correct place (no more filling in a user’s address in five places…).
  • Savings: users will be happier with internal IT, because everything is ready when they arrive, or they change department and have their new rights immediately.
  • Savings: it may sound stupid, but you will finally be able to tell your manager that you no longer hold all the strings. No more “GOD” IT pro. I’ve learned that this is a huge deal!

This product is certainly something to look at when you want easy and secure identity lifecycle management. With this process you can actually provision a user quickly, effortlessly and flawlessly.

Microsoft, please consider branding it as a System Center product 😉

Management systems should be used to monitor, provision and maintain systems automatically, as much as possible

As said, I am a System Center lover, so here goes… :-)

When you want to work with the fortified datacenter, you should have a good overview of who is accessing what, and of who is changing what, where.

With a good policy, Audit Collection Services (ACS) from Operations Manager 2007 is the solution. I even installed it at a bigger company in Belgium that is SOX compliant, and the product does the job. (Although, if you really want to use this thoroughly, also consider SecureVantage.) There are quite a few good posts about ACS on Alkin’s blog; make sure you check them out.

I’m not going to talk about the ROI of this one; I’ll keep that for another post.

Also consider that the more tools you give to your users, the more System Center products you should be thinking about (mobile phones, self-service Hyper-V servers/desktops…). But again, that is for another post.

Conclusion

Everybody is convinced of the importance of security. And the fortified datacenter, while probably still far away, is a great concept that will gain a lot of interest and popularity over the next few years. But one thing remains the same: after the setup, it needs to be managed. And how are you going to convince your peers to buy tools and suites for something that is not a direct benefit to a user? After my meeting with Kai, I became convinced that managing this (and OK, it is actually more than security alone) is necessary to keep mistakes out of the picture. In this post I have given a few pointers on what you can do to make your life as an IT pro easier, and how you can prove to your peers that this really adds value to your company and infrastructure.

A more detailed post on the Operations Manager products will follow, because they can do much more than focus on this datacenter alone 😉

Comments/remarks are welcome.

Cheers,

Mike