Thursday, February 16, 2017

System Center Data Protection Manager 2016, Best Design Practices

The purpose of this blog article is to inspire and inform you, as a reader, about what you should consider before you start implementing System Center Data Protection Manager 2016 as the restore feature within your datacenter.
Throughout the years, I have seen a lot of different designs that haven't played out well. Hopefully, readers of this article will avoid the pitfalls that lead either to reinstalling System Center Data Protection Manager or to realizing that you are not able to perform the restore scenario you first had in mind.
Having sorted out those questions, I can assure you that the road to a successful DPM implementation based on the BaaS concept will be that much closer at hand.

The start, building the restore scenario

The most important part when it comes to backup has always been, and will always be, one thing: restore. This should always be your major and primary focus in any design, regardless of technology.
Having a proper recovery plan will provide you with an optimal backup design and strategy that meets the company's need for recovery. However, this is exactly what the majority of companies miss out on. Building the restore scenario first will give you a more optimal design when it comes to, for example, the number of DPM servers needed, the time for recovery point creation, and archiving/long-term protection. The best way to get started with this strategy is to identify two things:

  • Infrastructural Services
  • Business Services

Your Infrastructural Services are, for example, your Active Directory, while your Business Services could be your CRM solution that consumes the Infrastructural Services it is built upon. The initial step in building a true restore scenario is to break down all the Infrastructural Services that cooperate to deliver the Business Services. Which Windows Servers are involved? Does the Business Service use SQL Server, and is there an IIS that is also a dependency for successful delivery of the Business Service from an end-user perspective, and so on.
Having a detailed breakdown of a Business Service will also help you identify and understand how you could consume Azure services like Azure Site Recovery and other IaaS, PaaS or SaaS services in the most optimal way for the restore process of the Business Service.

Virtual vs. physical DPM servers

This is a very common discussion, even today, where people tend to keep the physical concept for their DPM servers instead of virtualizing them. The most important part of providing restore concepts for a company is being able to easily scale and provide more resources to the BaaS (Backup-as-a-Service) service that you deliver for your company, regardless of the company's size.
This is a simple task when you adopt a virtual concept for your DPM servers, where you are able to provide a deployment plan for new DPM servers that will be associated with your BaaS.
The most important take-away: if you haven't virtualized your DPM servers yet and are thinking of deploying DPM 2016, Microsoft highly recommends that you virtualize them using Hyper-V to achieve the possibility of scale.

DPM disk setup

To get the most out of your DPM servers, both from a performance perspective and from a disaster recovery perspective (restoring the DPM server itself), you should use the following disk setup.

  • OS disk               (your %systemdrive%)
  • Program disk       (where you install all software)
  • DPMDB disk        (dedicated disk for the DPMDB)
  • Azure Scratch      (a disk dedicated to the MARS agent's scratch folder)
  • Recovery disk      (a disk dedicated to the prestaging procedure for the MARS agent)

All disks should be in the VHDX format, and the dedicated DPMDB disk should be fixed in size for performance reasons; all other disks can be dynamic.
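As a sketch, the disk set above can be provisioned on the Hyper-V host with the Hyper-V PowerShell module. The paths and sizes below are assumptions for illustration, not sizing recommendations:

```powershell
# Create the DPM server's data disks as VHDX files on the Hyper-V host.
# Only the DPMDB disk is fixed in size; the rest can be dynamically expanding.
New-VHD -Path 'D:\VHD\DPM01_DPMDB.vhdx'    -SizeBytes 100GB -Fixed    # dedicated DPMDB disk
New-VHD -Path 'D:\VHD\DPM01_Program.vhdx'  -SizeBytes 80GB  -Dynamic  # program disk
New-VHD -Path 'D:\VHD\DPM01_Scratch.vhdx'  -SizeBytes 200GB -Dynamic  # Azure scratch disk
New-VHD -Path 'D:\VHD\DPM01_Recovery.vhdx' -SizeBytes 200GB -Dynamic  # recovery/prestaging disk
```

Attach the disks to the DPM virtual machine with Add-VMHardDiskDrive afterwards.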

SQL Server installation

There are some key points when it comes to delivering an optimal SQL Server installation for System Center Data Protection Manager 2016. Having dedicated service accounts for your DPM server is a must, as is using the correct collation for the SQL instance hosting the DPMDB. The only collation you should use is SQL_Latin1_General_CP1_CI_AS.
All other collations are unsupported, so keep in mind to use the right one or you will end up reinstalling your DPM servers from the beginning.
Also, remember to set the amount of memory your SQL Server is allowed to consume. This should be set according to the amount of RAM you have; reserve at least 4-6 GB of RAM for the operating system. The SQL memory configuration is defined on the SQL instance that hosts the DPMDB.
A poor or incorrect setup will give you a negative impression of DPM. Keep in mind that the SQL Server is the engine behind it all; if it's configured poorly, the engine will perform badly, simple as that.
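As a minimal sketch of both checks, assuming the SqlServer PowerShell module is available and an instance name of DPM01\DPMINSTANCE (an assumption, adjust to your environment):

```powershell
# Verify the collation of the instance that will host the DPMDB -
# it must return SQL_Latin1_General_CP1_CI_AS.
Invoke-Sqlcmd -ServerInstance 'DPM01\DPMINSTANCE' `
    -Query "SELECT SERVERPROPERTY('Collation') AS Collation"

# Cap SQL Server memory so the OS keeps 4-6 GB. The value is in MB;
# 26624 MB on a 32 GB server leaves 6 GB for the operating system.
Invoke-Sqlcmd -ServerInstance 'DPM01\DPMINSTANCE' -Query @"
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 26624; RECONFIGURE;
"@
```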

Antivirus exclusions DPM and the MARS agent

The most common performance challenge regarding DPM is not setting up the real-time protection in the antivirus software correctly. If you don't configure exceptions for folders and processes, you may end up with corrupted data.
For System Center Data Protection Manager you should exclude the following folders, which reside within the DPM installation folder:
  • XSD
  • Temp\MTA
  • Bin

For DPM servers that have the MARS agent installed and push data to Azure, it is important to exclude the following folders:
  • Microsoft Azure Recovery Agent\Bin
  • Scratch folder

The following processes must be excluded from real-time scanning:
  • CBEngine.exe
  • CSC.EXE (Goes for both DPM and the MARS agent)

Not having the correct exclusions will end up with your local antivirus scanning your DPM disk pool or the scratch area, for example.
However, there is one more thing you should keep in mind. In the case where your antivirus software does find a threat, you shouldn't quarantine it (which is the most common policy); you should have it deleted by default.
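If your antivirus is Windows Defender, the exclusions above can be sketched with the built-in Defender cmdlets; other products have their own configuration. The DPM installation path below is an assumption, so adjust it to where you installed DPM:

```powershell
# Assumed DPM installation folder - verify against your own setup.
$dpmPath = 'C:\Program Files\Microsoft System Center 2016\DPM\DPM'

# Folder exclusions for the DPM server itself
Add-MpPreference -ExclusionPath "$dpmPath\XSD", "$dpmPath\Temp\MTA", "$dpmPath\bin"

# Folder exclusion for the MARS agent scratch area (assumed default path)
Add-MpPreference -ExclusionPath 'C:\Program Files\Microsoft Azure Recovery Services Agent\Scratch'

# Process exclusions for the MARS agent and the compiler used by both DPM and MARS
Add-MpPreference -ExclusionProcess 'CBEngine.exe', 'csc.exe'
```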

Protection Group Designs

This is one of my favorite topics from the field. I have seen a lot of interesting Protection Group designs throughout the years, and here are some of my thoughts that have played out very well for a large number of companies.
The first thing to keep in mind is the Protection Group name. The best starting point is to build your Protection Group designs according to your Business Services' RTO, RLO and SLA (if any). The Recovery Time Objective (RTO) is the definition of how fast you should be able to be back on track with your Business Service. System Center Data Protection Manager can synchronize your data changes every fifteen minutes for workloads like SQL, Exchange and File. For other workloads, like SharePoint, Hyper-V etc., DPM can create Recovery Points every thirty minutes. So, having a clear understanding of your Business Services and your Infrastructural Services is crucial, since the Protection Group is where you set the actual backup strategy or plan that should correspond to your restore plans.
Regarding the naming of Protection Groups, there are a few tips that will hopefully inspire some DPM administrators. To get a decent start, consider using the following naming convention for your Protection Groups:
  • Workload + “number of recovery points per day or week” + time + SYNC info + Azure + time

An example for a Protection Group having this naming convention would be:
  • File (1RP/d 01:00PM | 6h Sync | Azure 01:00 AM)

In some cases, you could also include the retention range for your on-prem disk and for Azure to make it even clearer. An example would be:
  • File (1RP/d 01:00 PM 30 days|No Sync|Azure 01:00 AM 180 days)

In this latter example, you get a clear understanding of what kind of members should be associated with this Protection Group, but most importantly of your restore capabilities. Let's break it down, shall we? The members of this Protection Group are for the File workload. Every day a recovery point is made at 01:00 PM, with a retention policy that states that the data should be available for 30 days on-prem, meaning in the DPM disk pool. There is no extra synchronization made for the members of this Protection Group, and all protection data will be sent to Azure at 01:00 AM, where it will be stored for 180 days back in time.
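One nice side effect of a convention like this is that the Protection Group names themselves document the backup plan. A quick inventory from the DPM Management Shell could then double as a plan overview; the server name and the ProtectionMethod property below are assumptions, so verify against your own environment:

```powershell
# List all Protection Groups on a DPM server, sorted by name.
# With the naming convention above, the Name column reads as the backup plan.
Get-DPMProtectionGroup -DPMServerName 'DPM01' |
    Sort-Object Name |
    Format-Table Name, ProtectionMethod -AutoSize
```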

Replace your tapes, use Azure since it really works…

Many companies have now started their journey toward using Azure instead of tapes for their long-term retention. In many business cases I have made, Azure is by far the most optimal and cost-effective solution for long-term retention of System Center Data Protection Manager protected workloads.
In more than 50% of the cases, using Azure will cut the cost in half, even for those companies that use TaaS (Tape-as-a-Service) solutions or a VTL.

One important fact to point out is the possibility to restore data from a Recovery Services Vault that is shared between DPM servers. Let's say that you have two or more DPM servers associated with the same Recovery Services Vault. If one DPM server fails, you are still able to restore the data that that DPM server pushed to Azure by adding the Azure passphrase generated in the MARS setup. This can be done via the Add External DPM feature in the Recovery pane in DPM. This means one simple thing: as long as you have your protected data in Azure, you will always be able to restore it.

Friday, September 23, 2016

VMware Protection

At Ignite 2015, Microsoft announced that DPM would be able to back up and, most importantly, restore VMware. With the just-released UR 11, Microsoft kept their promise and provided an agentless solution for VMware protection.

All protected virtual servers within the DPM protection will also be able to leverage cloud-integrated long-term protection via an Azure Backup vault.

In short, DPM will be able to auto-protect newly deployed servers that are provisioned via vCenter. This is due to the fact that vCenter lets you organize your virtual machines in VM folders.
In the scenario where protected virtual servers are load-balanced in the VMware environment, DPM will keep track of and continue to protect those VMs.

For more information regarding this new feature that Microsoft provided in UR 11, please read the following TechNet article.

Friday, September 9, 2016

Update Rollup 11

This update rollup adds backup and recovery support for VMware server 5.5 and 6.0. Create protection groups, manage policies, and protect and recover VMware VMs with the user interface or new PowerShell cmdlets. This update is available through Microsoft Update or by manual download. UR 11 includes these features:
  • Agentless backup: DPM doesn't require an agent on the VMware server. You can use the IP address or fully qualified domain name (FQDN) and login credentials to authenticate the VMware server with DPM.
  • Cloud-integrated backup: DPM protects workloads to both disk and cloud. DPM's backup and recovery workflow helps you manage long-term retention and offsite backup.
  • Folder-level auto-protection: vCenter lets you organize your VMs in VM folders. DPM detects these folders and enables you to protect VMs at the folder level. Once you have turned on folder-level auto-protection, DPM protects the VMs in that folder and its sub-folders, and any new VMs added to the folder or sub-folders.
  • DPM can recover individual files and folders from a Windows Server VM.

Friday, August 5, 2016

Hyper-V backups keep failing with 0x800423F4

A few days ago I bumped into an interesting error where the backup of some virtual machines that were managed via System Center Virtual Machine Manager (SCVMM) kept failing and could never create a Recovery Point. The backup job always ended with the same error code: ID 30111, or 0x800423F4.

As always when it comes to troubleshooting DPM (I should write a blog post about this…), the first step is to verify the dependent technology that DPM "integrates" with to deliver the backup or restore job for the protected data sources.

Since all VSS writers were reporting Healthy, I moved the troubleshooting on to VMM itself. When DPM makes a backup of a virtual machine, it verifies the internal VSS writers present in the guest OS before it makes a snapshot of the virtual machine.

In this case, in VMM, I right-clicked the production server that I couldn't get a backup of and chose Properties. In the properties for the virtual machine I chose "Hardware Configuration", and under "Advanced" in the scroll list I chose "Integration Services".

I unchecked "Backup (volume snapshot)" and saved the configuration to the VM. When that job was done, I re-enabled "Backup (volume snapshot)" again.

After I had re-initialized the "Integration Service" for Backup, I was able to get the backups up and running again.

Never underestimate the power of re-initializing an “Integration Service”.

Friday, June 24, 2016

Session @ Ignite 2016

I got the question whether I could consider delivering a session at Ignite in Atlanta later this year; the answer was, for sure, yes!

I will be delivering a Business Continuity focused session aimed at clarifying HOW you can adapt Microsoft technology within, or for, your Business Continuity concept. I will also clarify what Business Continuity is and how you can get started with it when you come back home from Ignite.

I hope to see you at my session and at the other great sessions @ Ignite!

Tuesday, May 24, 2016

Update Rollup 10

Microsoft has just released Update Rollup 10, and with that a great deal of optimization!
The following list of issues (listed in KB 3143871) has been fixed in this update rollup:

  • If you try to exclude a page file for a VM running on a Microsoft Hyper-V Server 2012 R2 server, DPM may still continue to back up the page file
  • DPM provides an Active Directory schema extension tool to make required changes to Active Directory for DPM End-User Recovery. However, the tool may not work on System Center 2012 R2 Data Protection Manager
  • If you try to protect a SharePoint content database that has AlwaysOn enabled, and there is a failover of the database, you may notice that new sites, lists, and items are not displayed on the Recovery tab. This applies only to new items that are created after failover. Additionally, the issue is automatically resolved after failback
  • If you run Update Rollup 7 for System Center 2012 R2 Data Protection Manager, or a later version, and then try to do item-level recovery for a Hyper-V VM, you may receive the following error message when you click the VHDX file on the Recovery tab: "DPM cannot browse the contents of the virtual machine on the protected computer DPMServerName"
  • The DPM console crashes when you try to open any of the six built-in DPM reports
  • Optimized item-level recovery doesn't work for a SharePoint farm. This causes the full farm data to be copied to the staging server that's running Microsoft SQL Server
  • The DPM UI crashes when you try to recover online recovery points by using another DPM server that is registered to the same backup vault as the original server
  • The Get-DPMJob cmdlet does not provide any information about recovery jobs that are performed through the external DPM server
  • This update revises the message of error code 33504 to add details about the issue and steps to resolve the issue
  • If you try to protect Microsoft Exchange Server 2016 while you create a protection group, the server that's running Exchange Server is displayed as "Exchange 2013 Database" instead of "Exchange 2016 Database"
  • If you use the DPM Central Console, and you receive an EvalShareInquiryAlert (3123) alert on DPM, the alert is still displayed as active in System Center Operations Manager even though the issue is resolved on DPM
  • DPM crashes when you try to configure SMTP settings
  • If you try to stop protection of a data source whose FQDN contains more than 64 characters, the DPM service crashes

Friday, April 8, 2016

Sessions @ MMS

I got the email saying that the nice community conference in Minneapolis called Midwest Management Summit, or MMS, wanted me as one of their speakers. I will be delivering three sessions this year (or maybe more... we will see):
  • Business Continuity using Microsoft Solutions
  • Be a Hero or be fired. Backup and restore strategy
  • Tearing down IT Silos
I will be around the conference the whole week and am really looking forward to connecting with new community members and also old friends.

Looking forward to meeting up in Minneapolis!

Friday, April 1, 2016

News in DPM 2016

Hopefully around this year's Ignite, Microsoft will release the new version, DPM 2016. With that release comes not just updated code but also a few nice additions regarding newly supported scenarios from Microsoft.

The following features are new in DPM 2016:
  • Mixed-Mode Cluster protection
  • Support for Storage Spaces Direct
  • Support for Resilient Change Tracking (RCT)
  • Virtual TPM

Friday, March 18, 2016

Can’t make any Recovery Points, volsnap error 39

The DPM software is dependent on the VSS architecture, which of course is nothing new. But when it comes to troubleshooting DPM, many seem to forget that the initial troubleshooting steps start with having a healthy set of VSS writers present in the affected operating system.

It's also recommended that you verify the System log on the server that you are trying to protect. In one scenario that I bumped into regarding issues with DPM creating new Recovery Points, the local System log was filled with volsnap errors with event ID 39.
In this particular situation the affected volume had lost its association to the shadow copy area and therefore could not create any new Recovery Points.
So how did I solve it? Well, you need to create a new unbounded shadow copy area. This is done via this simple set of instructions:
  1. Open an elevated PowerShell prompt.
  2. If your D: drive is not getting any Recovery Points, this is what you type: "Vssadmin add shadowstorage /For=D: /on=D: /maxsize=unbounded"
  3. You can now create Recovery Points again in DPM
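The steps above can be sketched as follows from an elevated PowerShell prompt, including a verification step so you can confirm the new association before retrying the backup:

```powershell
# Recreate an unbounded shadow copy storage association for the D: drive
vssadmin add shadowstorage /For=D: /On=D: /MaxSize=UNBOUNDED

# Verify that the association now exists and shows an UNBOUNDED max size
vssadmin list shadowstorage /For=D:
```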

Friday, March 4, 2016

The DPM console won't open...

Last week I ran into a strange error when I tried to open a DPM console. All of a sudden I got an error saying "MMC cannot open the file . This may be because the file does not exist, is not an MMC console, or was created by a later version of MMC. This may also be because you do not have sufficient rights to the file."

So to solve this, we need to dig deep into the user profiles. What is really happening behind the scenes? Every time the DPM console opens, the file "Microsoft System Center 2012 R2 Data Protection Manager" that is located in the Roaming AppData folder of your profile is verified. If that file is corrupt or faulty, it needs to be re-initialized.

This is done via the simple trick of deleting the file and restarting the DPM server console.

The file location is "C:\Users\USERNAME\AppData\Roaming\Microsoft\MMC", and keep in mind to enable "Hidden items" in File Explorer, otherwise the AppData folder will not show up.
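A minimal sketch of the fix from PowerShell; the file name below assumes a DPM 2012 R2 console, so adjust it to the file you actually find under the MMC folder:

```powershell
# Delete the cached MMC console file so it is re-created the next time
# the DPM console starts. $env:APPDATA resolves to the Roaming profile.
Remove-Item "$env:APPDATA\Microsoft\MMC\Microsoft System Center 2012 R2 Data Protection Manager"
```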

Friday, February 19, 2016

Error opening the DPM console, ID 948

Now and then I bump into scenarios where people can't seem to open their DPM console. DPM throws an error saying "Unable to connect", together with error ID 948.
This error ID says that a connection can't be made since the DPM server's database (DPMDB) is no longer in sync.

To solve this error you must get the DPMDB in sync again; this is done via the DPM command-line tool DpmSync.

Open an elevated DPM Management Shell and run the following command: "DpmSync -Sync". Once the command has finished, you are able to log on to your console again.

You may also need to run a consistency check on all your protected data sources to bring the protection back in sync as well.
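A sketch of the whole procedure from an elevated DPM Management Shell; the server name is an assumption, and enumerating every data source may take a while on a large DPM server:

```powershell
# Re-synchronize the DPM database with the installed agents and the disk pool
DpmSync -Sync

# Then trigger a consistency check for every protected data source
Get-DPMDatasource -DPMServerName 'DPM01' |
    ForEach-Object { Start-DPMDatasourceConsistencyCheck -Datasource $_ }
```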

Thursday, February 11, 2016

Update Rollup 9

Update Rollup 9 has been released, and with that some optimization of the product.
This update rollup fixes the bugs that are listed in KB 3112306. It also includes the following features:
  • DPM now automatically repairs DPM filter corruption without requiring a consistency check.
    When there is DPM filter corruption, DPM automatically repairs the corruption by triggering a synchronization job within 15 minutes of the previous sync/backup job failure. (DPM does this instead of running a consistency check.)
  • In the DPM UI, the Offline/Online tags have been removed from Hyper-V VM names.
  • If you already upgraded to DPM 2012 R2 Update Rollup 6 or a later version, you will not have to restart the production server when you install this update.
This update is available through Microsoft Update or by manual download. You can download the package and find installation instructions and information about agent updates in KB 3112306.

Friday, January 29, 2016

Community feedback – got any ideas?

Do you have any ideas on how DPM can be improved? Please share them with me so I can provide the feedback directly to the Microsoft Product Group for DPM.

Summarize your thoughts or the features/functions that you would like to see in DPM and send them to

Friday, January 15, 2016

Online Recovery Points keep failing

Some time ago, Microsoft introduced the great feature of providing long-term protection using a Backup vault in Azure to replace on-prem tape solutions.

In the scenario where DPM protects data sources within a workload that has a Recovery Point volume smaller than 3 GB, the Online Recovery Point job will constantly fail with error ID 100034.

To solve this you must increase the size of the Recovery Point and replica volumes to 3 GB each. After this is done, your Online Recovery Point jobs will start to function again.

This is done via the "Modify Disk Allocation" function, which you access by right-clicking either a Protection Group or a single data source that is a member of the Protection Group.
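The same change can be scripted from the DPM Management Shell. The sketch below assumes the server name, protection group name and data source name, and the parameter names should be verified with Get-Help Set-DPMDatasourceDiskAllocation in your environment:

```powershell
# Grow the replica and recovery point volumes for one data source to 3 GB each.
$pg  = Get-DPMProtectionGroup -DPMServerName 'DPM01' | Where-Object Name -like 'File*'
$mpg = Get-DPMModifiedProtectionGroup $pg
$ds  = Get-DPMDatasource -ProtectionGroup $pg | Where-Object Name -eq 'D:\'

Set-DPMDatasourceDiskAllocation -Datasource $ds -ProtectionGroup $mpg `
    -Manual -ReplicaArea 3GB -ShadowCopyArea 3GB
Set-DPMProtectionGroup -ProtectionGroup $mpg   # commit the change
```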

Friday, January 1, 2016

MVP award

Once again I'm moved and deeply honored. For the sixth time in a row I have been presented with the MVP award from Microsoft.

Dedication and hard work always pay off. I will do my best to continue to bring the communities' voices to Microsoft and Microsoft's voice to the communities.

2016 will be a blast =)

Tuesday, October 27, 2015

Update Rollup 8

Microsoft has just released Update Rollup 8, which contains some bug fixes.

  • The DPM Agent crashes intermittently during a backup
  • If you are trying to recover data from an imported tape, DPM may crash with a "Connection to the DPM service has been lost" error
  • If you try to back up a SharePoint site that uses SQL Always On as a content database, SQL logs are not truncated as expected
  • You cannot verify tape library compatibility for tapes that use RSMCompatmode settings such as IBM 35xx, 2900, and so on
  • If you have multiple SharePoint farms hosted on the same SQL cluster with different instances but the same database names, DPM cannot back up the correct SharePoint farm content
  • If you run Update Rollup 7 for Data Protection Manager 2012 R2, and you have already configured online protection for one or more protection groups, trying to change the protection group populates the default DPM settings in the "Select long-term goals" wizard instead of the previously configured values
  • When you try to protect a SQL failover cluster, the Data Protection Manager UI crashes for every backup or synchronization operation
  • If you install Update Rollup 7 for Data Protection Manager 2012 R2, self-service recovery for SQL databases may not work, and you receive the following error message:

This update rollup fixes bugs listed in KB 3086084. It's available through Microsoft Update or by manual download.

Wednesday, July 29, 2015

Update Rollup 7

Update Rollup 7 has been released! With this Update Rollup comes the ability to share Backup vaults in Azure via the "Add External DPM" feature!

This update rollup fix includes:
  • Bugs listed in KB 3065246.
  • Increased support for DPM backup to Azure. We’ve added:
    • The ability to recover data to any DPM server that's registered in an Azure Backup vault. If two or more servers are registered in a vault and you back up data to Azure, then any of those registered DPM servers can recover data from the vault. To use this feature after installing Update Rollup 7, enable the setting Add External DPM on the Recovery tab in the DPM console and specify the vault credentials. Note that credentials are only valid for two days; if they've expired, you'll need to download new credentials from the vault. Select the DPM server whose data needs to be recovered, and provide the encryption password. You can close the dialog after the server appears. Then you can recover the data by browsing the recovery points. To return to the local DPM data view, click Clear external DPM.
      Note that if you want to use this feature for existing backed-up DPM servers, you'll need to wait at least a day for the DPM protection group metadata to be uploaded to Azure (this occurs for the first time during the nightly job).
  • Support for protection and backup of data on client computers running Windows 10. You back up in the same way you did with client computers running earlier versions of Windows.
This update is available through Microsoft Update or by manual download. You can download the package and find installation instructions and information about agent updates in KB 3065246.

Friday, May 22, 2015

Ignite session

I had the great pleasure of delivering the session Enterprise Backup: Custom Reporting, BaaS and Real-World Deployments in Data Protection Manager together with Steve Buchanan and Islam Gomaa at Ignite.

If you weren't able to attend the session live, or Ignite at all, here is the link so you can watch the session on-demand.

I would like to extend my gratitude to the Product Group team for DPM/Azure/System Center that made this session possible.

Wednesday, May 20, 2015

Update Rollup 6

Update Rollup 6 for DPM 2012 R2 has been released, and with it some good fixes!
This update rollup fix includes:
  • Bugs listed in KB 3030574.
  • Support for using SQL Server 2014 as the DPM database. Support for protecting SQL Server 2014 was introduced in Update 4. Now, in Update 6, you can configure a SQL Server running SQL Server 2014 as your database. You'll need to install Update 6 and then configure DPM to use SQL Server 2014.
  • If you're backing up DPM to Azure you can now select to keep online backed up data when deleting a protection group. When you delete a protection group you can choose whether to preserve backed up data in Azure or on a local disk. You can always browse the retained data, together with other recovery points, from the server on the Recovery tab.
This update is available through Microsoft Update or by manual download. You can download the package and find installation instructions and information about agent updates in KB 3030574.

Friday, May 15, 2015

DPM 2012 R2 book released

I’m very happy to announce that my book for System Center Data Protection Manager 2012 R2 is finished and is available for you at the following link:

The book covers both basic content for DPM beginners but also more advanced design, configuration and deployments scenarios.

Happy reading =)

Friday, May 8, 2015

Ignite interview

During Ignite I had the great pleasure of talking to Lee Berg regarding the information released in the two DPM sessions.

If you have any questions or need help on Backup, Restore or Disaster Recovery design…let me know

Friday, April 10, 2015

Interview regarding the Enterprise backup session @Ignite

I did an interview with Lee Berg regarding the session that is coming up at Ignite. In our session we will cover UR5 information, Custom Reporting, BaaS and Real-World Deployments.

Happy viewing =)

Thursday, March 5, 2015

Error 197 after applying UR5 for DPM 2012 R2

This is an issue with UR5, where Create\Modify\Delete on a Protection Group with clustered File Server data sources may fail with error 197, or cause a console crash.

Please use this workaround and see if it resolves the issue:

  1. Close the UI and all DPM services
  2. Important: Take a full database backup of the DPMDB to a safe location
  3. Run the SQL script on the DPM DB that is posted below
  4. Start all DPM services, and open the UI

This issue may recur whenever the underlying volume of the File Server migrates across nodes of the cluster and an inquiry is triggered on the File Server cluster role. You can run the script again as required.

DELETE FROM dbo.tbl_IM_ProtectedObject
WHERE ProtectedObjectId IN
(
    SELECT PO.ProtectedObjectId
    FROM dbo.tbl_IM_ProtectedObject AS PO
    JOIN dbo.tbl_AM_Server AS SRVR
        ON PO.ServerId = SRVR.ServerId
    JOIN dbo.tbl_IM_DataSource AS DS
        ON PO.DataSourceId = DS.DataSourceId
    WHERE DS.AppId = '00000000-0000-0000-0000-000000000000'
        AND PO.ProtectedInPlan = 0
        AND PO.ProtectedObjectId NOT IN (SELECT ProtectedObjectId FROM dbo.tbl_IM_ProtectedObjectAlerts)
        AND CONVERT(XML, PO.LogicalPath).exist(
        ) = 1
        AND PO.ProtectedObjectId != PO.DataSourceId
        AND SRVR.ServerId = DS.ServerId
)

Saturday, February 21, 2015

DPM 2012 R2 UR5 released

UR 5 has just been released, and it is a really feature-rich update rollup. This UR includes:

  • Improved workload support regarding backup to Azure
  • Custom Reporting
  • Offline transfer of protected production data to Azure
  • Multiple retention ranges for long term retention with Azure Backup

Features that are implemented in this update rollup:

  • Protect SharePoint with SQL Server AlwaysOn configuration

With Update Rollup 5 for System Center 2012 R2 Data Protection Manager, Data Protection Manager can now protect Microsoft SharePoint Server farms that are hosted on instances of Microsoft SQL Server with an AlwaysOn cluster.

There is no change in the Data Protection Manager UI for the backup and recovery steps. If there is a failover within the SQL Server AlwaysOn cluster, Data Protection Manager will automatically detect the failover and continue to back up from the active SQL Server availability instance without requiring user intervention.

Note The Data Protection Manager Agent must be installed on all the nodes of a SQL Server AlwaysOn cluster.

  • Protect SharePoint Server, Exchange Server, and Windows Client workloads to Microsoft Azure by using Data Protection Manager

With Update Rollup 5 for System Center 2012 R2 Data Protection Manager, Data Protection Manager can now protect Windows Client, Microsoft Exchange Server, and SharePoint Server workloads to Microsoft Azure.

  • Support for multiple retention ranges for long-term backup on Microsoft Azure

With Update Rollup 5 for System Center 2012 R2 Data Protection Manager, Data Protection Manager will enable users to configure multiple retention policies for long-term retention of backup data on Microsoft Azure. Users can choose between daily, weekly, monthly, and yearly retention policies and can configure the number of recovery points (retention range) for each policy.


  • Data Protection Manager with Update Rollup 5 will enable up to 366 recovery points for each data source.
  • This option will be available only for protection groups that enable online protection for the first time.
  • Ability to transfer initial backup copy offline to Microsoft Azure

When creating a new protection group, or adding data sources to a protection group, Data Protection Manager has to create an initial backup copy of the data sources that were added. Depending on the data source, this data could be large, which can make it difficult to send over the network.

This update provides an option to transfer the initial backup copy to the Microsoft Azure data centers. Large data can now quickly be transferred to Microsoft Azure without consuming Internet bandwidth. If a user decides to transfer the initial backup copy offline to Microsoft Azure, the backup data will be exported on a disk. This disk is then shipped to Azure data centers. After the data is imported to the customer storage account in Azure, the disk is returned to the user.

More information about this feature can be found here.

  • Support for protecting Microsoft workloads that are hosted in VMware

With this update, Data Protection Manager can protect Microsoft workloads that are hosted on VMware. It will provide application-consistent protection at the guest OS level. Data Protection Manager Agents have to be installed on the guest OS of the VMware virtual machines that are hosting the workloads to be protected.

Note VMware VM backup/recovery is not yet supported.

  • Display missed SLA alerts on the Data Protection Manager console

In Update Rollup 4 for System Center 2012 R2 Data Protection Manager, a feature for configuring a backup SLA by using the Set-DPMProtectionGroupSLA cmdlet was added. For any protection group that missed the configured backup SLA, Data Protection Manager raised an alert that was visible only from Operations Manager. With this update, the missed SLA alert is also displayed in the Data Protection Manager console.

During its nightly job, Data Protection Manager checks every protection group that has an SLA configured and raises an alert for each SLA window that was missed. For example, if a user configured a backup SLA of 8 hours for a protection group but no recovery point was created in the past 24 hours, the missed SLA job that runs at midnight would raise three missed SLA alerts for that protection group (one per missed 8-hour window).

Note Data Protection Manager missed SLA alerts will not be resolved automatically on the next successful recovery point. Users have to resolve the alert manually.
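As a sketch of how the SLA is configured in the first place, the UR4 cmdlet can be run from the DPM Management Shell. The protection group name below is a placeholder and the exact parameter spelling is an assumption, so verify with Get-Help Set-DPMProtectionGroupSLA on your own server:

```powershell
# Run in the DPM Management Shell on the DPM server.
# "SQL Backups" is a placeholder protection group name.
$pg = Get-DPMProtectionGroup -DPMServerName "DPM1" |
    Where-Object { $_.FriendlyName -eq "SQL Backups" }

# Raise a missed-SLA alert if no recovery point is created within 8 hours.
Set-DPMProtectionGroupSLA -ProtectionGroup $pg -SLAInHours 8
```

Once set, the nightly job described above compares each group's newest recovery point against this value.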

  • Enhanced reporting with Data Protection Manager Central Console

This update also provides a new enhanced reporting infrastructure that can be used to create customized reports instead of just relying on the standard canned reports that were shipped with Data Protection Manager. We also expose the reporting schema to enable the creation of custom reports.

Users can now have aggregated reports from various Data Protection Manager servers that are managed in Operations Manager. A demonstration report is also available to help users create custom reports.

The new reporting infrastructure will be available after users upgrade their Data Protection Manager servers to Update Rollup 5, install the Data Protection Manager Central Console on Operations Manager, and import the new Data Protection Manager Reporting Management Pack that is available on the Microsoft Download Center.

To install the new Data Protection Manager Central Console update, follow these steps:
1. For existing customers who already use Data Protection Manager Central Console
a. Update the Data Protection Manager Central Console to Update Rollup 5 for both the client and the server by using the Update Rollup 5 download link.
b. Delete the following older Data Protection Manager MOM packs:
Discovery and Management (version 1126)
Library (version 1126)
c. Import the following Update Rollup 5 Data Protection Manager MOM packs:
Discovery and Management (version 1276)
Library (version 1276)
Reporting (version 1276)
2. For new customers who use the Data Protection Manager Central Console for the first time
a. Import the release version of the following Data Protection Manager MOM packs on the Operations Manager server:
Discovery and Management (version 1126)
Library (version 1126)
b. Install Data Protection Manager Central Console client and server from the release version of Data Protection Manager 2012 R2. For example, run the setup program from the CDLayout folder of the Data Protection Manager server.
c. Update the Data Protection Manager Central Console client to Update Rollup 5 for both the client and the server by using the Update Rollup 5 download link.
d. Delete the following older Data Protection Manager MOM packs:
Discovery and Management (version 1126)
Library (version 1126)
e. Import the following Update Rollup 5 Data Protection Manager MOM packs:
Discovery and Management (version 1276)
Library (version 1276)
Reporting (version 1276)

fredag 13 februari 2015

Delivering sessions @ NIC conf

During this week I had the great pleasure of delivering two sessions at NIC in Norway. The first was about how to deliver BaaS, RaaS and DRaaS using Windows Server, System Center and Azure.

The second one was Monitoring and Managing your datacenter…let's get started.

lördag 7 februari 2015

Session @ TechX Azure

I had the great pleasure of presenting a session on how to create a proactive monitoring concept; this session was delivered in Swedish.

fredag 9 januari 2015

The purpose of a backup network

The majority of the datacenters that are built using the modern datacenter approach are committed to redundancy as well as smart designs for providing uptime for the hosted services.

Providing a dedicated backup network that the backup traffic can use and rely on is a good strategy, since it offloads the primary network architecture and removes the dependency on the network hardware that builds up the production network.

There are some prerequisites for enabling a backup network:

  • Secondary network card
  • Enabling name resolution 
  • Verifying network connectivity 

The configuration of a backup network is done via PowerShell, but first you must have a secondary network card installed, configured with an IP address that is a member of the backup network subnet.

The next step is to verify name resolution: edit the hosts file on both the production server and the DPM server, and enter the NetBIOS names of the servers involved.

Verify that you can ping the IP address of the DPM server's backup network interface from the production environment, both by IP address and by NetBIOS name.
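A minimal PowerShell sketch of those two verification steps is shown below. The IP address and server name are example values, and editing the hosts file requires an elevated prompt:

```powershell
# On the production server: map the DPM server's backup-network IP
# to its NetBIOS name (10.10.10.5 and DPM1 are example values).
Add-Content -Path "$env:SystemRoot\System32\drivers\etc\hosts" -Value "10.10.10.5`tDPM1"

# Verify connectivity over the backup network, by IP and by name.
Test-Connection 10.10.10.5
Test-Connection DPM1
```

Repeat the hosts file entry on the DPM server, pointing at the production server's backup-network address.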

To configure the backup network, open the DPM Management Shell and use the PowerShell cmdlet Add-BackupNetworkAddress.

To configure the DPM server to use the dedicated network, enter the following syntax:
Add-BackupNetworkAddress -Address <backup subnet> -DpmServername DPM1 -SequenceNumber 1

The three switches are:

  • Address
  • DPMServer
  • SequenceNumber

The Address switch provides the address of the backup network. The DpmServername switch provides the NetBIOS name of the DPM server. With the SequenceNumber switch you define which network is the primary backup network and which is the failover backup network.

The DPM agent reads its XML configuration to determine which network it should use as its primary backup network. Normal communication between the DPM agent and the DPM server still travels over the production network. If backup network connectivity fails, the DPM agent can fail over to a secondary backup network, but that failover network is something you must configure explicitly for it to work.
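Putting the pieces together, a sketch of configuring both a primary and a failover backup network might look like this. The subnet values are examples; check Get-Help Add-BackupNetworkAddress for the exact parameter spelling in your DPM version:

```powershell
# Run in the DPM Management Shell.
# Primary backup network (SequenceNumber 1):
Add-BackupNetworkAddress -DpmServername DPM1 -Address 10.10.10.0/24 -SequenceNumber 1

# Optional failover backup network (SequenceNumber 2), used by the
# agent only if the primary backup network is unreachable:
Add-BackupNetworkAddress -DpmServername DPM1 -Address 10.10.20.0/24 -SequenceNumber 2
```

The SequenceNumber ordering is what makes the second entry a failover rather than a second primary.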

fredag 2 januari 2015

MVP Renewal

This year starts off really well. For the fifth time I have been awarded the Microsoft Most Valuable Professional (MVP) award, which I'm very honored by.

I’m looking forward to being the community’s voice into the very core of Microsoft for another year, and to continued close cooperation with my Product Group team members.

torsdag 27 november 2014

What can be improved?

I have had some interesting meetings with the Product Group managers at Microsoft regarding the development of Data Protection Manager, also known as DPM. I’m very pleased that Microsoft is working very hard to improve the product, and that is the reason for this blog post.

I want to understand the community wishes regarding HOW Microsoft can improve the DPM software from two scenarios:

  1. The on-prem tape story. What do you think Microsoft should improve regarding tape management and features in DPM?
  2. SharePoint protection. What do you think Microsoft should improve regarding SharePoint protection, restore and features? What is missing from your point of view?

Just to be clear: I will read all of your emails and I will summarize them without changing your words. The information you provide to me will be passed on to the Product Group managers for System Center and DPM.

Let Microsoft know what they can improve: send me an email.

fredag 21 november 2014

Almost in progress

Dear readers,

Lately I have received some questions regarding whether the new DPM book will be out and released this year, and I have great news for you: I have only three chapters left to write out of a total of 14.

I will have my part done as soon as possible, and then the great people who are reviewing my work will do theirs. The nice people at Packt Publishing will produce the book as soon as all the material is gathered and reviewed.

Hang in there, the book is coming. Thank you for asking =)

fredag 14 november 2014


Are you interested in learning how to start your proactive monitoring experience, and how to get started with your business continuity planning based on the features and functions of the Microsoft product stack?

If the answer is yes, I would like to personally welcome you to the two sessions that I will deliver at the TechDays event in Stockholm.

For more information regarding the sessions have a look at this webpage!

See you all there!

onsdag 29 oktober 2014

DPM 2012 R2 UR4 released

Microsoft has just released UR4, and with it comes a required reboot of the production environment that you want to protect.

It is important to emphasize why you need to reboot your production environment. When Microsoft makes changes to the drivers that are part of the deduplication process, the DPM team must update their code to adapt to those changes. For DPM to pick up the changes completely, the operating system kernel needs to be rebooted so that the deduplication updates made by the Windows Server team are re-initialized. Hence, the reboot is not due to DPM itself, but to re-initializing the code changes for the deduplication engine made by the Windows Server team.

The content of UR4 is a mixture of many good things; the following list describes what has been fixed or added to the product:

  • Item level recovery does not restore custom permissions when Microsoft SharePoint is running localized language packs.

You try to back up SharePoint when SharePoint has various files with custom access permission settings and uses a non-English locale for SharePoint sites. When you try to recover the files, the data is recovered successfully; however, the custom permissions may be lost. This causes some users to lose access to the files until permissions are restored by the SharePoint administrator.

  • The DPM user interface crashes, and you receive the following error message: Connection to the DPM service has been lost.

This issue occurs when you change some protection groups after you upgrade to DPM 2012 R2 Update Rollup 2 or a later version.

  • You experience issues when you try to restore a SQL Server database that has error 0x800423f4.

You have a SQL Server database that uses multiple mount points that point to the same volume, and you protect it by using DPM. When you try to recover from an existing recovery point, you receive error message 0x800423f4.

  • An instance of SQL Server is missing when you try to create a protection group (PG).

You have two or more instances of SQL Server on the same server, where a database name in one instance is a substring of the name of another SQL Server instance. For example, you have a default instance of SQL Server and another instance that is named "contosoinstance." Then, you create a database that is named "contosoins" in the default instance of SQL Server.

In this case, when you try to create or change a protection group, the “contosoinstance” SQL instance and its databases do not appear in the Create Protection Group Wizard.

  • The DPM console or service crashes after VMs are removed from the Hyper-V host.

Several virtual machines on the Hyper-V host are physically removed. Then, the DPM UI crashes when you try to stop protection for these virtual machines.

  • DPM crashes with NullReferenceException.

When you protect a SQL database by using long-term protection on tape, and the SQL logs are moved to a different disk volume, the DPM UI crashes when you try to make a backup for the first time after the logs have moved. However, after you restart the UI, everything works correctly.

  • Scheduled backup to tape runs on an incorrect date for quarterly, semi-annually, and yearly backups. In some cases, you see that the backup runs before the specific date.

  • The DPM UI sometimes crashes when you change a protection group, with the following error: Connection to the DPM service has been lost. (ID: 917)

The following features have been added in UR4:

  • Added support for protecting SQL Server 2014

This update lets you protect SQL Server 2014 as a workload. There is no change in the user experience or in the scenarios that are supported. Therefore, you can continue to back up SQL Server 2014 by using DPM in the same way that you protected older versions of SQL Server.

Unsupported scenarios in SQL Server 2014:
You cannot use SQL Server 2014 as the DPM configuration database.
You cannot back up the data that is stored as Windows Azure blobs from SQL Server 2014.
The "Prefer secondary backup" preference for the SQL Always On option is not supported completely. DPM always takes a backup from the secondary. If no secondary can be found, the backup fails.

  • Added support for SQL Server 2012 Service Pack 2 (SP2)

This update lets you protect the latest SQL Server 2012 SP2 as a workload and also use it as a DPM configuration database. There is no change in the user experience or scenarios that are supported. Therefore, you can continue to back up SQL Server 2012 SP2 by using DPM in the same way that you protected older SQL Server versions.

  • Simplified steps for Online backup registration

Note Existing DPM to Azure customers should upgrade to the latest agent (version 2.0.8689.0 or a later version). If this is not installed, online backups fail, and no DPM to Azure operations will work.

By adopting the latest features and improvements in this update, DPM customers can register for Azure protection in a few clicks. These improvements remove the dependency on creating a certificate by using makecert and uploading it to the Azure portal. Instead, you can download vault credentials from the vault page on the Azure portal and use those credentials for registration.

To register the DPM server to the Azure backup vault, follow these steps.

You must have a subscription on the Azure portal, and a backup vault must be created on the portal.

If you already use DPM-A
1. From the backup vault page on the portal, download the agent for SC Data Protection Manager.
2. Run the installer on the server. To do this, follow these steps:
a. Select the folder to install the server.
b. Enter the correct proxy settings.
c. Opt in for Microsoft updates, if you are not already opted in. Critical updates are usually propagated through Microsoft updates.
d. The installer verifies the prerequisite software and installs it.

Please be aware that you do not have to register the server again if the server is already registered and backups are in progress. In this case, you have only to upgrade the agent.

Getting started with DPM-A
3. From the backup vault page on the portal, follow these steps:
a. Download the vault credentials.
b. Download the agent for System Center Data Protection Manager.
4. Run the installer on the server. To do this, follow these steps:
a. Select the folder where you want to install the agent.
b. Enter the correct proxy settings.
c. Opt in for Microsoft updates, if you have not already opted in. Critical updates are usually propagated through Microsoft updates.
d. The installer verifies the prerequisite software and installs it.
5. Register the DPM server. To do this, follow these steps:
a. In Data Protection Manager, click Online, and then click Register Server.
b. In the Registration window, select the vault credential file that was downloaded from the portal. In the case of a remote scenario, only the network share path is enabled.
c. Follow the rest of the steps, and complete the registration.

Click on the following link to download UR4, and remember to reboot your environment.

fredag 19 september 2014

Mastering System Center 2012 R2 Data Protection Manager

There is a new opportunity coming up to attend the "Mastering System Center 2012 R2 Data Protection Manager" course that I will teach in Stockholm on the 8th of December this year.

If you are looking for an education that gives you the whole picture of how to integrate DPM with your business continuity plan and make the product work optimally with the other System Center families and Azure, this is the course for you.

We will also be focusing on restore, since that is the reason for backup!

fredag 5 september 2014

Want to protect your Azure IaaS workloads? Use DPM!

A few hours ago the product group released the great news that DPM is now supported running as an IaaS virtual machine in Azure, protecting other Azure IaaS servers. There are some important facts that you need to consider, and this blog post covers the basics you need to get started.

The Azure IaaS virtual machine must be of size A2 or higher. Please keep in mind that DPM can also protect workloads that run across multiple Azure cloud services, as long as they share the same Azure virtual network and Azure subscription. The number of disks that can be used for the DPM disk pool is limited by the size of the virtual machine. If you want to know more about size limits, please read the information regarding Azure Virtual Machines.

The workloads that are supported for backup in Azure are:
  • Windows Server 2012 R2 – Datacenter and Standard
  • Windows Server 2012 – Datacenter and 
  • Windows Server 2008 R2 SP1 – Standard and 
  • SQL Server 
  • SQL Server 2008 
  • SQL Server 
  • SharePoint 
  • SharePoint 2010  

For further information please read this blog post published by the DPM team


fredag 29 augusti 2014

Longer retention for the Backup Vault, how about 3360 days?

The community has for a while mentioned that they need more retention time for the recovery points they choose to store in Azure. With the UR3 release for DPM 2012 R2 about a month ago, some final changes and optimizations were made by the DPM team to make DPM ready to deliver fully supported, optimized and efficient private cloud protection. With that came the architecture needed on the DPM server side to interact with the new version of the Azure agent; please keep in mind that there is a dedicated Azure team, closely cooperating with the DPM team, that develops the Azure services.

To get this up and running you need to apply UR3 for DPM 2012 R2 and the new version of the Azure agent on your DPM box. Remember that when you apply UR3 you need to reboot your DPM box, and also your production servers when you choose to update your DPM agents via the console or via SCCM (System Center Configuration Manager).

From a more personal view, I find this update a milestone: it offers the great possibility of partially removing tape from your restore plans, or even removing it entirely. There is a very positive feeling regarding the development of the DPM technology, and a strong focus on delivering great interactions between systems.

For more information regarding the new retention time in the Backup Vault recovery service, please read this article that Sherees published a few days ago.

onsdag 30 juli 2014

DPM 2012 R2 UR3 released

Today UR3 for System Center Data Protection Manager 2012 R2 was released, and with that the Microsoft Product Group team in India sealed the deal regarding private cloud protection. More blog posts will follow regarding protecting your private cloud using the great features of System Center Data Protection Manager 2012 R2.

Features that are implemented in this update rollup

  • Scalable VM backup

    This update rollup improves the reliability at scale for Virtual Machine (VM) backups on Hyper-V and Windows Server 2012 R2 infrastructures. This feature is supported on both Cluster Shared Volumes (CSV) and scale-out file server (SOFS) storage configurations for VMs.

  • Backup and consistency check window

    Important This feature is supported only for disk protection for VM data sources.

    This feature, configured through Windows PowerShell, enables specific time windows to restrict backup and consistency check (CC) jobs. The window can be applied per protection group and will limit all jobs for that protection group to the specified time window.

    After the backup window has ended, all in-progress jobs can continue. Any queued jobs outside the backup and consistency check window will be automatically canceled.

    This feature affects only scheduled jobs and does not affect specific jobs that are triggered by the user.

    Windows PowerShell script examples are available on Microsoft TechNet. These examples show how to use PowerShell cmdlets to create the backup and consistency window.

    • This feature is not supported for tape or cloud protection jobs.
    • This feature is not supported for non-VM data sources.
    • Setting these windows is the same as running a Modify Protection Group workflow.
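As a hedged sketch of how such a window might be set through PowerShell: the cmdlet names Set-DPMBackupWindow and Set-DPMConsistencyCheckWindow follow the TechNet examples mentioned above, but treat the group name and exact parameters as assumptions and verify with Get-Help before relying on them.

```powershell
# Run in the DPM Management Shell. "Hyper-V VMs" is a placeholder group name.
$pg  = Get-DPMProtectionGroup | Where-Object { $_.FriendlyName -eq "Hyper-V VMs" }
$mpg = Get-DPMModifiableProtectionGroup $pg   # open the group for editing

# Restrict scheduled backup jobs and consistency checks to a
# 10-hour window starting at 20:00.
Set-DPMBackupWindow -ProtectionGroup $mpg -StartTime 20:00 -DurationInHours 10
Set-DPMConsistencyCheckWindow -ProtectionGroup $mpg -StartTime 20:00 -DurationInHours 10

# Commit the change; as noted above, this behaves like running the
# Modify Protection Group workflow.
Set-DPMProtectionGroup -ProtectionGroup $mpg
```

Remember that the window applies only to scheduled jobs, so ad hoc jobs triggered by a user are unaffected.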

  • Support for synthetic fiber channel-to-tape

    This update rollup introduces support for the synthetic fiber channel-to-tape process. Follow the tape certification process for third-party tape devices when you use Data Protection Manager 2012 R2 and Windows Server 2012 R2.

Issues that are fixed in this update rollup

  • A backup of a mirrored SQL instance fails if the principal SQL instance that was first backed up is now the mirror.
  • DPM console crashes while a recatalog or "mark as free" operation is performed on an imported tape.
  • The MSDPM service crashes when protected data sources have long names.
  • The DPMRA service crashes during replica creation when the database name on one of the SQL instances matches or is a substring of a SQL instance name that is hosted on the protected server.
  • This update lets administrators configure the DPMRA port and select a nondefault port by following these steps:
    1. Install UR3 for DPM 2012 R2.
    2. Run Setagentcfg.exe by using the following command:

      setagentcfg.exe s <port number>

      • In this command, <port number> represents the nondefault port.
      • By default, this command should be located in the following folder:

        %PROGRAMFILES%\Microsoft System Center 2012 R2\DPM\DPM\Setup
    3. Verify that a new entry is created in the following registry subkey:

      HKEY_LOCAL_MACHINE\Software\Microsoft\Microsoft Data Protection Manager\Agent\2.0\PsPortConfig
    4. Copy the setagentcfg.exe file from the DPM server to the following folder on the protected computer:

      %PROGRAMFILES%\Microsoft Data Protection Manager\DPM\bin
    5. Run the following command on the protected server:

      setagentcfg.exe e DPMRA <port number>
      Note Use the same port number that was specified in step 2.
    6. Restart the DPM server.
    7. Restart the DPMRA service on the protected server.
To get the UR3 for DPM click the following link:

tisdag 20 maj 2014

DPM 2012 R2 UR2 re-released

Today Microsoft re-released the UR2 to resolve the DPMAMSERVICE crash that occurred if the original update was applied (KB2958100).

The updated bits can be downloaded from here:

fredag 25 april 2014

Great features keep on coming, now support for backing up Hyper-V replica

Microsoft keeps on delivering great technologies to their customers, the last contribution is the support for backing up Hyper-V replicated servers.

When building a disaster recovery plan for a customer, one important key component is the Hyper-V Replica feature, which lets you asynchronously replicate a Hyper-V machine between two hosting servers. Previously, the only supported scenario was to back up the virtual machines running at the primary site; this has now changed. On the 24th of April Microsoft announced that they now support backup of the replicated server located at the replica site, which makes it easier to provide a decent and more optimal disaster recovery scenario or strategy to customers.
Please note that backup of a Hyper-V virtual machine should only be considered a disaster recovery strategy. From a DPM perspective you can recover items that are defined as flat files from within the backup of the virtual machine; this does not apply to the applications (SQL, Exchange etc.) running inside the virtual machine. To summarize: to create a basic recovery strategy for the hosted application, you need to deploy a DPM agent to the virtual machine OS and perform what Microsoft calls a guest-level backup.

The Hyper-V Replica feature has been a great contribution from Microsoft to their customers, and with this announcement of support it is now possible to build a more optimal and strategic disaster recovery plan, one that maps directly to the customer's business continuity plan, which means a great deal.

For more information regarding Hyper-V replica please have a look at this website:

For more information regarding the new support, please read this article from Neela Syam Kolli, the Product Group Manager for the DPM team:

torsdag 10 april 2014

Bringing back the support for Windows Server 2003

With the release of the R2 version of System Center, Microsoft dropped support for backing up Windows Server 2003. This raised a discussion in the community and among the MVPs.
Microsoft realized the challenge this created for many companies and decided to bring the support back with UR2, which is due at the end of April.
I would like to see this as a result of Microsoft really listening to the community and their customers. If you have any feedback regarding DPM, please send me your thoughts.

fredag 28 mars 2014

Article on Technet, how to get started with DPM and Azure

A little while back I wrote an article for Microsoft that was published on TechNet. The article covers how to get started using the Recovery Services in Microsoft Azure (note the name change!), with focus on the backup vault and DPM integration.



fredag 21 mars 2014

New DPM book coming soon

I just want to inform you that there will soon be a new book about DPM 2012 R2, published via Packt Publishing.

I’m writing the book right now and will try to give the reader good content, not just regarding DPM as a product but the whole picture of design, implementation and so on.

Happy Friday!


fredag 7 mars 2014

1 – 1 Q&A @ TechEd NA

As I mentioned in my previous blog post, I’m going to TechEd NA to deliver a session together with John Joyner. We will talk about how to deliver Backup as a Service (BaaS), Restore as a Service (RaaS) and Disaster Recovery as a Service (DRaaS) using SCDPM and the other members of the System Center 2012 R2 stack, Windows Server 2012 R2 and Azure.

If you are attending TechEd NA I would of course be very happy to see you in the audience, but I would be even happier to get the opportunity to listen to your challenges, thoughts or any ideas you may have regarding DPM, SCOM or other members of the System Center family.

If you would like to have a 1-on-1 Q&A at TechEd NA, please send me an email with the subject TechEd2014.

Looking forward to listening to you.