Quality Software Engineering

DH Is Getting A New And Improved Compression Algorithm

I will be releasing DH with a new algorithm I wrote that improves the data compression of your history content by about 50%. I’m calling it an adaptive cache compression algorithm. 
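
DH’s actual algorithm isn’t published in this post, so what follows is only a minimal sketch, assuming “adaptive cache” means a chunk dictionary shared across all stored pages: text that repeats between history entries is cached once, and later occurrences are stored as short references. The class name, chunk size, and token format are all hypothetical.

```typescript
// Hypothetical sketch (not DH's published algorithm): an "adaptive cache"
// compressor that keeps a dictionary of chunks shared across all stored
// pages, so text repeated between history entries is stored only once.

class AdaptiveCacheCompressor {
  private dictionary: string[] = [];          // chunks cached so far
  private slots = new Map<string, number>();  // chunk -> dictionary index
  private static readonly CHUNK = 64;         // chunk size in characters

  // Assumes \x01/\x02 never occur in page text; a real implementation
  // would escape them.
  compress(page: string): string {
    const out: string[] = [];
    for (let i = 0; i < page.length; i += AdaptiveCacheCompressor.CHUNK) {
      const chunk = page.slice(i, i + AdaptiveCacheCompressor.CHUNK);
      const slot = this.slots.get(chunk);
      if (slot !== undefined) {
        out.push(`\x01${slot}\x02`);          // seen before: emit a reference
      } else {
        this.slots.set(chunk, this.dictionary.length);
        this.dictionary.push(chunk);          // adapt: cache the new chunk
        out.push(chunk);                      // first occurrence stays literal
      }
    }
    return out.join("");
  }

  decompress(data: string): string {
    return data.replace(/\x01(\d+)\x02/g, (_, n) => this.dictionary[Number(n)]);
  }
}
```

On real history data, where pages from the same site share navigation bars, headers, and footers, this kind of cross-page deduplication is a plausible source of a roughly 50% size reduction.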

Also, the old compression algorithm has been improved to reduce or eliminate “noise” from sites like Facebook and Google Drive that inject a lot of data into their pages.
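
The post doesn’t say how that noise gets filtered, but one plausible approach is to strip injected scripts and bulky inline payloads from the captured page before it reaches the compressor. The patterns below are illustrative guesses, not DH’s actual filter rules.

```typescript
// Hypothetical sketch: strip injected "noise" from a captured page before
// compression. These patterns are illustrative, not DH's actual filter.

const NOISE_PATTERNS: RegExp[] = [
  /<script[\s\S]*?<\/script>/gi, // inline scripts and injected state blobs
  /<style[\s\S]*?<\/style>/gi,   // inline styles
  /\sdata-[\w-]+="[^"]*"/g,      // bulky data-* attribute payloads
];

function denoise(html: string): string {
  // Apply each pattern in turn, deleting whatever it matches.
  return NOISE_PATTERNS.reduce((text, re) => text.replace(re, ""), html);
}
```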

Lastly, I’ve decided to implement a site blacklist. I was not in favor of this before, since I refuse to implement non-general solutions, but for Google’s search page my reasoning is twofold:

  1. Why would anyone recall the content of a search page? You would just Google it again; it’s faster.
  2. The value of the indexed content from this page was so small, yet its database footprint was so large.

So, after weighing those two reasons, I decided that in this case a “fitted” solution was warranted.
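
Here is a minimal sketch of what such a blacklist check might look like; the entry for Google’s search host and the hostname-only match are assumptions, and the real list and matching rules are DH’s to define.

```typescript
// Hypothetical sketch: consult a site blacklist before indexing a page.
// The entry below and the hostname-only matching rule are assumptions.

const BLACKLIST = new Set<string>([
  "www.google.com", // e.g. Google search result pages
]);

function shouldIndex(url: string): boolean {
  try {
    return !BLACKLIST.has(new URL(url).hostname);
  } catch {
    return false; // unparseable URL: skip rather than index garbage
  }
}

// shouldIndex("https://www.google.com/search?q=dh") -> false
// shouldIndex("https://example.com/article")        -> true
```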

 
