DH Is Getting A New And Improved Compression Algorithm
I will be releasing DH with a new algorithm I wrote that improves the data compression of your history content by about 50%. I’m calling it an adaptive cache compression algorithm.
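The post doesn't spell out how the new algorithm works internally, so here is only a minimal sketch of one way an adaptive, cache-seeded compression scheme could be structured, using zlib's preset-dictionary support. The idea of seeding compression with previously stored pages from the same site, the function names, and the 32 KB trim are my illustrative assumptions, not necessarily how DH actually does it:

```typescript
import { deflateSync, inflateSync } from "zlib";

// Illustrative only: compress a page's text against a "cache" dictionary
// built from previously stored pages of the same site, so repeated
// boilerplate (nav bars, headers, templates) is encoded as back-references
// instead of being stored again.
function compressWithCache(pageText: string, siteCache: string[]): Buffer {
  // Use the most recent cached content as a preset dictionary
  // (zlib's window limits the effective dictionary to 32 KB).
  const dictionary = Buffer.from(siteCache.join("\n")).subarray(-32768);
  return deflateSync(Buffer.from(pageText), { dictionary });
}

function decompressWithCache(blob: Buffer, siteCache: string[]): string {
  const dictionary = Buffer.from(siteCache.join("\n")).subarray(-32768);
  return inflateSync(blob, { dictionary }).toString();
}
```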
Also, the existing compression algorithm has been improved to reduce or eliminate “noise” from sites like Facebook and Google Drive, which inject a lot of extra data into their pages.
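As a rough illustration of what "noise" removal might look like, here is a small filter pass. The specific rules (dropping script/style blocks and very long opaque tokens) are my guesses at the kind of injected data those sites produce, not DH's actual filters:

```typescript
// Illustrative only: strip likely "noise" from a page before indexing.
function stripNoise(html: string): string {
  return html
    // Drop inline scripts, styles, and embedded JSON state blobs.
    .replace(/<(script|style)[\s\S]*?<\/\1>/gi, "")
    // Drop very long unbroken tokens (base64 payloads, tracking IDs, etc.)
    // that bloat the database without being readable content.
    .replace(/\S{80,}/g, "")
    // Collapse the leftover whitespace.
    .replace(/\s+/g, " ")
    .trim();
}
```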
Lastly, I’ve decided to implement a site blacklist. I was not in favor of this before because I prefer to avoid non-general solutions, but with Google’s search page my reasoning is twofold:
- Why would anyone recall the content of a search page? You would just Google it again, since that’s faster.
- The indexed content from this page provided very little value, yet its database footprint was large.
So after weighing those two reasons I decided that, in this case, a “fitted” solution was warranted. A rough sketch of what that check might look like is below.
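This is only a sketch of a blacklist check run before a page is indexed. Google's search results page is the one case the post names; the pattern format and the helper name are my assumptions:

```typescript
// Illustrative only: skip blacklisted URLs before indexing.
const BLACKLIST: RegExp[] = [
  // Google search results pages (any country TLD).
  /^https?:\/\/(www\.)?google\.[^/]+\/search/i,
];

function shouldIndex(url: string): boolean {
  return !BLACKLIST.some((pattern) => pattern.test(url));
}

// Usage:
// shouldIndex("https://www.google.com/search?q=dh") === false
// shouldIndex("https://example.com/article")        === true
```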