Topic: Data Deduplication in Windows Server 2012

Now this is something I'm excited about!
http://blogs.technet.com/b/filecab/arch … ected=true

Short summary:
Old way: 5 copies of a 100 MB file would mean 500 MB of storage space used.

New way: 5 copies of a 100 MB file would mean 100 MB of storage space used (5 "pointers" to the location of the "real" data).
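To make the "pointers" idea concrete, here's a toy sketch in Python of a content-addressed store: each file name is just a pointer (a hash) to a chunk, and identical chunks are stored once. This is purely illustrative, not how Windows Server 2012 actually implements it (the real feature splits files into variable-size chunks and deduplicates in a background optimization job).

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: identical data is kept once;
    files are just pointers (hashes) to the stored chunks."""

    def __init__(self):
        self.chunks = {}  # sha256 digest -> actual bytes
        self.files = {}   # filename -> digest (the "pointer")

    def put(self, name, data):
        digest = hashlib.sha256(data).hexdigest()
        self.chunks.setdefault(digest, data)  # payload stored only once
        self.files[name] = digest

    def get(self, name):
        return self.chunks[self.files[name]]

    def bytes_stored(self):
        return sum(len(c) for c in self.chunks.values())

store = DedupStore()
payload = b"x" * 100  # stand-in for the 100 MB file
for i in range(5):
    store.put(f"copy{i}.bin", payload)

print(store.bytes_stored())  # 100, not 500: five pointers, one chunk
```

Five logical copies, one physical chunk; reading any of the five names returns the same bytes.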


Re: Data Deduplication in Windows Server 2012

This is a very old idea ... MS used to do this on their NT4 install CDs.

If you tried to copy the disc to HDD you would end up with well over 640 MB of data, because the file index referenced a single chunk of data multiple times.

A) It saved disk space.
B) It had the side effect of defeating simple disc duplication.

So we are excited because they dug out 15-year-old tech and dusted it off?
What's next, DriveSpace 2? lol

PS: what if the master gets pooped ... then all the copies poop too... NICE.


Re: Data Deduplication in Windows Server 2012

That seems best suited to SSDs. I hope the new backup services support this strategy.
The following article seemed relevant in a tangential way:  http://www.theregister.co.uk/2012/06/04 … a_too_big/