Category Archives: DPM

DPM: What’s next? It’s up to you.

A user is not always a malicious addition to a program. Actually (however odd that sounds), software is produced for its users. Well, some would call that a secondary reason, but a product that has not found its users has definitely not succeeded. The same applies to DPM. Each new version brings us new features, but a human being always wants more, of course. People ask how to communicate their wish lists to the DPM product group, or simply request new features on the forums. The good news is that this is now easy: fill in a survey and the developers will read it. No, really, they analyze what is submitted ;) so the next version may include some features you want.

DPM as a statistics tool

Remember the old days of DPM 2007 RTM? I don’t know about others, but I used to use it in an OpsMgr fashion. No, really, Data Protection Manager was a great monitoring tool back then: as soon as something happened to, say, your Exchange server or SQL Server, the backups just stopped working. It is not that way now (which is rather good =) ). But we can still use DPM for some tasks not actually related to data protection itself.

What I, as a DPM person, am asked quite often is… to calculate how much data on a server changes. We can get approximate figures from the size of the corresponding recovery point volume. For example: if we have a database which uses 3000 GB on the recovery point volume and the retention period is, say, 30 days, then the change rate can be approximated as 3000 GB / (30 days × 24 hours) ≈ 4 GB/hour. Of course this doesn’t tell us the peak values, and of course it is only an approximation, but sometimes it is enough.
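The back-of-the-envelope math above can be captured in a few lines of Python (the function name is mine, not anything from DPM):

```python
# Rough change-rate estimate from recovery point volume usage.
# Assumes (as the text does) that the recovery point volume mostly
# holds the churn accumulated over the retention range.
def approx_change_rate_gb_per_hour(rp_volume_used_gb: float,
                                   retention_days: int) -> float:
    """Average change rate implied by recovery point volume usage."""
    return rp_volume_used_gb / (retention_days * 24)

print(f"{approx_change_rate_gb_per_hour(3000, 30):.2f} GB/hour")  # ~4.17
```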

DPM: Scripting Additions to a Backup

It’s beautiful when we can use DPM’s features to back up an application as a whole (say, SharePoint or a Hyper-V virtual machine), but what happens when it is not enough to just copy files and system state (as with OCS in some cases, or Windows Server 2008 BMR backup), or when our MOSS 2007 farm is configured in a way that we can’t back it up at once (some SQL mirroring scenarios prevent us from doing so)?

Well… that is where scripting steps in. DPM actually lets us run any script before a backup starts or after it finishes. Is it difficult? Not at all. Is it something I can recommend for creating comprehensive backup packages or running a bunch of pre-/post-backup tasks? Not really. Let us discuss why:

1) Difficulty. You just have to place your scripts on a local drive of the protected computer and change one configuration file. The script may be of any type: shell, VBS, PowerShell or even Perl. For example, a script for BMR backup on Windows 2008 may look like this:

@echo off
setlocal enabledelayedexpansion
set BACKUP_TARGET=\\BackupServer\ServerBackup
rd /s /q "%BACKUP_TARGET%\WindowsImageBackup\%computername%"
wbadmin start backup -backuptarget:"%BACKUP_TARGET%" -allcritical -quiet
if %ERRORLEVEL% == 0 (
rem    pushd "%BACKUP_TARGET%\WindowsImageBackup\%computername%"
rem    for /f "tokens=*" %%i in ('dir /b /s *.vhd') do move /Y "%%i"
)


(the code is taken almost verbatim from this document)

Anyway, the most important part of the task is the file ScriptingConfig.xml, which is usually located in the folder C:\Program Files\Microsoft Data Protection Manager\DPM\Scripting.

The default content of the file is:

<?xml version="1.0" encoding="utf-8" ?>
<ScriptConfiguration xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                     xmlns:xsd="http://www.w3.org/2001/XMLSchema"
                     xmlns="http://schemas.microsoft.com/2003/dls/ScriptingConfig.xsd">
</ScriptConfiguration>

What we need is to add a DatasourceScriptConfig section before the closing </ScriptConfiguration> tag. The final file looks like this:

<?xml version="1.0" encoding="utf-8" ?>
<ScriptConfiguration xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                     xmlns:xsd="http://www.w3.org/2001/XMLSchema"
                     xmlns="http://schemas.microsoft.com/2003/dls/ScriptingConfig.xsd">
  <DatasourceScriptConfig DataSourceName="c:">
    <PreBackupScript>"Path-To-Script or command line to run script"</PreBackupScript>
    <PostBackupScript />
    <TimeOut>30</TimeOut>
  </DatasourceScriptConfig>
</ScriptConfiguration>

As you can see, everything in the file is fairly self-explanatory:

  • DataSourceName – the data source before (or after) whose backup the script runs.
  • PreBackupScript – the command line to run before the backup.
  • PostBackupScript – the command line to run after the backup.
  • TimeOut – the script timeout in minutes.

Pretty simple, isn’t it?
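If you have many data sources, hand-editing the XML gets tedious. Here is a small Python sketch (my own helper, not a DPM tool) that appends a DatasourceScriptConfig entry; the schema namespace is the one commonly shown in the DPM documentation, so verify it against your own ScriptingConfig.xml before use:

```python
# Hypothetical helper that appends a DatasourceScriptConfig entry to a
# ScriptingConfig.xml document. Element names follow the layout shown
# above; check the namespace against your DPM version.
import xml.etree.ElementTree as ET

NS = "http://schemas.microsoft.com/2003/dls/ScriptingConfig.xsd"

def add_script_config(xml_text, data_source, pre_script, timeout_min=30):
    ET.register_namespace("", NS)  # serialize without a namespace prefix
    root = ET.fromstring(xml_text)
    ds = ET.SubElement(root, f"{{{NS}}}DatasourceScriptConfig",
                       {"DataSourceName": data_source})
    ET.SubElement(ds, f"{{{NS}}}PreBackupScript").text = pre_script
    ET.SubElement(ds, f"{{{NS}}}PostBackupScript")
    ET.SubElement(ds, f"{{{NS}}}TimeOut").text = str(timeout_min)
    return ET.tostring(root, encoding="unicode")

empty = f'<ScriptConfiguration xmlns="{NS}"></ScriptConfiguration>'
print(add_script_config(empty, "c:", '"c:\\scripts\\bmr-backup.cmd"'))
```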

2) Why not do it unless you really have to? Well… it is hard to monitor whether the script worked or not, and the DPM console won’t tell you. For example, with Windows 2008 Bare Metal Recovery backups we may protect any file on the file system and precede its backup with a script that performs the BMR backup via WBAdmin. If that file itself is backed up successfully, your DPM server will show a green mark even if the BMR backup script failed. The only ways to know whether everything went well are:

  • Make a test recovery (usually a good idea, but just imagine hundreds of servers backed up every week… impossible to check them all every week).
  • Make the script complex enough to catch backup errors and report them. Again, not a bad idea, but it is completely unmanaged and may become very costly in large environments.
  • Use 3rd-party tools.
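As a sketch of the second bullet — only a sketch, with a made-up log path and a stub standing in for the real wbadmin command line — a wrapper that at least records the exit code somewhere you can collect later might look like this:

```python
# Wrap the real backup command so a non-zero exit code is logged,
# since the DPM console will not surface pre-backup script failures.
# The command and log path below are illustrative placeholders.
import datetime
import subprocess
import sys

def run_and_log(cmd, log_path):
    """Run the backup command; append an OK/FAILED line to a log file."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    stamp = datetime.datetime.now().isoformat(timespec="seconds")
    status = "OK" if result.returncode == 0 else f"FAILED rc={result.returncode}"
    with open(log_path, "a") as log:
        log.write(f"{stamp} {' '.join(cmd)} -> {status}\n")
    return result.returncode

# Stand-in for the real command, e.g. the wbadmin line from the script above.
rc = run_and_log([sys.executable, "-c", "print('backup stub')"], "backup.log")
```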

Still, for some tasks it is genuinely useful, and you may well use it.

DPM 2010 Upgrade Advisor

The guys from the DPM team have released a small Excel tool that gives an instant overview of the steps required to upgrade your current DPM infrastructure.

The tool covers upgrades from DPM 2007 RTM to 2010 RTM (pre-release) installed on various OSes and in diverse configurations. According to the utility, by the way, we’ll be able to upgrade to the RTM from the Release Candidate. However, that doesn’t mean that DPM 2010 RC is supported. =)


Source.