
Yep, great points. The HTTP workflow definitely adds a lot of overhead, but I think it is still usable for a whole class of applications. For example, any time you can saturate the CPU or the outgoing bandwidth (use the client's machine as a spider), this works really well.
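To make that concrete, here's a minimal sketch of the kind of pull-based loop I mean. The coordinator URL, the /job and /result endpoints, and the JSON shape are all assumptions for illustration, not any real service:

  import json
  import urllib.request

  SERVER = "http://example.com"  # hypothetical coordinator

  def fetch_job():
      # Ask the coordinator for the next unit of work.
      with urllib.request.urlopen(SERVER + "/job") as resp:
          return json.loads(resp.read())

  def run_spider_job(job):
      # Bandwidth-bound work: fetch the target page over the
      # client's own connection, not the coordinator's.
      with urllib.request.urlopen(job["url"]) as resp:
          return resp.read()

  def post_result(job, body):
      # Ship the fetched page back to the coordinator.
      req = urllib.request.Request(
          SERVER + "/result?id=" + str(job["id"]),
          data=body, method="POST")
      urllib.request.urlopen(req)

  while True:
      job = fetch_job()
      post_result(job, run_spider_job(job))

The per-job HTTP round trips are the overhead, but as long as each job keeps the CPU or the pipe busy for much longer than the round trip, the overhead amortizes away.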

As far as centralized servers and data storage go: I didn't cover this in the post, but I've been thinking a lot about using BitTorrent to address it, and I think it's totally feasible. All you need is several thousand seed servers and you'll have a worldwide file system / job-tracking queue. ;)
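The reason this is feasible is that BitTorrent is already content-addressed: any blob hashes to a stable 20-byte infohash (SHA-1 of the bencoded info dict), so every seed serves it under the same worldwide address. A back-of-the-envelope sketch, with a deliberately minimal bencoder:

  import hashlib

  def bencode(obj):
      # Minimal bencoder covering only the types used below.
      if isinstance(obj, int):
          return b"i%de" % obj
      if isinstance(obj, bytes):
          return b"%d:%s" % (len(obj), obj)
      if isinstance(obj, dict):
          items = sorted(obj.items())  # spec requires sorted keys
          return b"d" + b"".join(bencode(k) + bencode(v)
                                 for k, v in items) + b"e"
      raise TypeError(obj)

  def infohash(name: bytes, data: bytes) -> str:
      piece_length = 2 ** 18
      # Concatenated SHA-1 of each fixed-size piece of the payload.
      pieces = b"".join(
          hashlib.sha1(data[i:i + piece_length]).digest()
          for i in range(0, len(data), piece_length))
      info = {b"name": name, b"piece length": piece_length,
              b"length": len(data), b"pieces": pieces}
      return hashlib.sha1(bencode(info)).hexdigest()

  # Any seed holding this blob serves it under the same address.
  print(infohash(b"job-queue-snapshot", b"example payload"))

The job-queue part is the hand-wavier half: you'd still need some way to coordinate writes, since torrents are immutable once published.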


