Science behind botlist

I haven't done a lot of work on botlist recently; I have been captivated by other projects. But I have a couple of doable features that I want to implement.

First, I need to develop the push-pull system a little further. It allows botlist to be updated remotely. Right now it works fine, but I want to ensure data integrity when the client communicates with the server: when the client uploads data, I want to be sure the server received exactly what was sent. It will probably be a simple MD5 scheme to validate a binary upload, or something similar. As of right now, the push-pull system simply ignores invalid records when it encounters them.
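
A minimal sketch of what that server-side check might look like in Java (assuming the client posts the raw payload bytes along with its own MD5 digest; the class and method names here are just illustrative, not botlist's actual code):

import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class UploadValidator {

    // Compute the MD5 digest of an uploaded payload as a lowercase hex string.
    public static String md5Hex(byte[] payload) throws NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("MD5");
        byte[] digest = md.digest(payload);
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }

    // Compare the client-supplied checksum against the digest of the bytes received.
    public static boolean isIntact(byte[] payload, String clientChecksum)
            throws NoSuchAlgorithmException {
        return md5Hex(payload).equalsIgnoreCase(clientChecksum);
    }
}

The idea would be for the server to recompute the digest over the bytes it actually received and reject (or re-request) the upload if the two values disagree, rather than silently dropping bad records.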

Recommendation System


Botlist is a really simple system right now. New links are submitted by users or by the remote bots and added to the table. I am currently working on a system to fill the main page with popular links and actual content. Recently I picked up a USA Today newspaper and was amazed at the layout. Who decides, and under what conditions, which articles should be placed on the front page? How much text is too much or too little, and what is captivating to the reader? It would be nice to mimic that kind of functionality. If you have visited the Drudge Report, it would be similar to how that works (minus Drudge manually updating his popular links).
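
I haven't settled on a ranking yet, but a first cut could be as simple as weighting a link's clicks against its age, so that fresh, frequently visited links float to the top of the front page. A rough sketch (the decay constant and the inputs are placeholders, not anything botlist actually stores):

public class LinkScore {

    // Score a link by its click count, decayed by age in hours.
    // Newer, frequently clicked links score higher and rise to the front page.
    public static double score(int clicks, double ageHours) {
        final double gravity = 1.5; // placeholder decay constant, tune empirically
        return clicks / Math.pow(ageHours + 2.0, gravity);
    }
}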
