Recently I came across code in a project that had been built to check for changes, in order to give the user warnings like "remember to save..", "do you really want to navigate away..", and so on. This project uses typed DataSets for all they're worth (which is quite a lot, as long as the complexity of the domain is easily managed), so the actual check for changes was done by calling the DataSet's GetXml() method, holding that string in the session, and constantly checking against it as the user went about his business.
Now, this project was in NO way running on a stateless web server, but then the requirements didn't call for that either. Still: each dataset (of which the user would typically have 1-20 within a session) was saved in the session, and its XML was saved alongside it to perform the change-watching.
The problem was that although there were few users (<10), the XML for each dataset had a footprint of about 1.6 MB.
So what did I do to ease the pain? I changed the code to save and check not the raw XML, but a hash of it. To do this I wrote one small method that translates a string into a hashed string. All up I spent about 20 minutes implementing and testing this scheme, and the estimated footprint savings are in the area of 150 MB of server memory.
So what about the code, eh? Well, here it is:
// Requires: using System.Text; using System.Security.Cryptography;
public static string ComputeHashFromString(string message, HashAlgorithm ha)
{
    // Note: ASCII is fine for plain DataSet XML; use Encoding.UTF8 if the data may contain non-ASCII characters.
    byte[] msgBytes = ASCIIEncoding.ASCII.GetBytes(message);
    byte[] hashBytes = ha.ComputeHash(msgBytes);
    StringBuilder sb = new StringBuilder();
    foreach (byte b in hashBytes)
        sb.Append(b.ToString("x2")); // lowercase hex, two digits per byte
    return sb.ToString();
}

public static string ComputeHashFromString(string message)
{
    return ComputeHashFromString(message, new MD5CryptoServiceProvider());
}
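For completeness, here is a rough sketch of how the hash can replace the raw XML in the change check. The session key and the myDataSet variable are hypothetical names for illustration, not from the original project:

// Hypothetical usage: store the hash when the dataset is loaded,
// then compare against a fresh hash before prompting the user.
string originalHash = ComputeHashFromString(myDataSet.GetXml());
Session["customerDsHash"] = originalHash; // 32 hex chars instead of ~1.6 MB of XML

// Later, when the user tries to navigate away:
string currentHash = ComputeHashFromString(myDataSet.GetXml());
bool isDirty = currentHash != (string)Session["customerDsHash"];
if (isDirty)
{
    // warn the user: "remember to save..", "do you really want to navigate away.."
}

The hash only tells you *whether* something changed, not *what* changed, but for "remember to save" style warnings that is all you need.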