
According to the article they have 2,000 servers. These are dual Xeon 5620 machines: 4-core CPUs with hyper-threading.

So that averages out to: 70000 / (4 * 2000) = 8.75 requests per second per CPU.
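A quick sketch of that arithmetic (counting each server as 4 cores, as the comment does, and ignoring hyper-threading):

```python
# Naive estimate: spread 70k req/s evenly over every server CPU in the fleet.
total_rps = 70_000
servers = 2_000
cores_per_server = 4

per_core = total_rps / (cores_per_server * servers)
print(per_core)  # 8.75 req/s per core
```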

That seems like quite a low number. I would presume that the load is shared among fewer servers than this, with the others used for replication/redundancy.



It says 12 datacenters distributed around the globe.

I think we can safely assume each datacenter holds a dedicated load-balancing server. If they have only one of these at each datacenter, that gives us 70000 / (4 * 12) ≈ 1458 requests per second per CPU, which is in line with reliably handling over 10,000 requests per second of live traffic to WordPress applications from a single server.

and

"In April 2008 Automattic converted all WordPress.com load balancers from Pound to NGINX", which points to the fact that they're talking only about the load balancers, and that they have several of them.

This link implies that one NGINX load-balancing instance cannot handle 70k r/s.
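For comparison, the one-load-balancer-per-datacenter estimate above works out to:

```python
# Same 70k req/s, but spread over only one 4-core load balancer per datacenter.
total_rps = 70_000
datacenters = 12
cores_per_lb = 4

per_core = total_rps / (cores_per_lb * datacenters)
print(round(per_core))  # ~1458 req/s per core
```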


It's definitely not one load balancer, but nor is it 70k rps distributed across 2000 servers. I also think having only one per datacenter would be a bit risky -- I'd have at least 2. It's very frustrating that the article doesn't make clear how many load balancers they run.


WordPress.com currently has 12 load balancers per data center. They are HA and used for different subsets of traffic.


Okay, so we can now do the "math":

70000 / (4 * 12 * 12) ≈ 121.5 requests per second per CPU
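Spelled out (12 datacenters, 12 balancers each, 4 cores per balancer):

```python
total_rps = 70_000
datacenters = 12
lbs_per_dc = 12
cores_per_lb = 4

per_core = total_rps / (datacenters * lbs_per_dc * cores_per_lb)
print(round(per_core, 2))  # 121.53 req/s per core
```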


I don't think that is right. It says that Automattic has 2000 servers, but they also run things like Gravatar, Akismet, and VaultPress, which makes me think that using that number is completely wrong. (Also, these are just the load balancers, not the back end, which could take up a large share of that 2000-server count.)

From the article it seems more like they have ~100 load balancers, maybe fewer, which comes out to 175 requests/second per CPU. That's getting a little more reasonable.


Of the 2000 servers, about 90% of them are running something related to WordPress.com. There are 36 "load balancer" servers in total. I added up the req/sec across those 36 machines and came up with the 70k/sec number in the article. The requests aren't evenly distributed across that subset of machines though, so you can't just divide evenly to figure out a req/sec/CPU rate. I left another comment in this thread that mentions 5k req/sec on a "normal" load balancer and that the limiting factor isn't Nginx CPU usage.
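To illustrate why dividing evenly is misleading, here's a toy split: the 70k total and the 36 machines come from the comment above, but the per-machine numbers are entirely made up.

```python
# Hypothetical uneven split: a few hot balancers carry most of the traffic.
hot = [5_000] * 10            # made-up busy machines at ~5k req/s each
cool = [20_000 / 26] * 26     # the rest share the remaining 20k req/s
rates = hot + cool

assert round(sum(rates)) == 70_000        # totals match the article's figure
print(round(sum(rates) / len(rates)))     # fleet average, ~1944 req/s
print(round(max(rates) / min(rates), 1))  # hottest vs. quietest: 6.5x
```

The fleet-wide average (~1944 req/s per machine) says little about any individual balancer, which is the author's point.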


If this is just about load balancing, then they could probably get better performance from HAProxy, right? Anyone should be able to exceed 200 req/s per CPU with HAProxy. In fact, you should be able to get around double that or more.

The point is the title is useless. It tells you almost nothing. That blog is a joke.


Well, it is average req/s, so peak should be much higher, and they probably have headroom to average much more than they currently do in case of spikes and the like.

But yeah the title is pretty much useless.


They have two thousand servers total; that includes database servers, memcached servers, regular backends, and load balancers.

Peter Westwood gave a talk on WP.com's infrastructure in London in January; I've tried to find slides online but I can't, which is a shame because he went into quite a bit of detail about their nginx/HyperDB/memcached/mogileFS setup.


There's a video from WordCamp 2012 where Barry does some Q&A about large WordPress setups:

http://wordpress.tv/2011/08/31/barry-abrahamson-ask-barry-ab...


2000 servers? My god. Aren't most blog posts just static pages, or generated static pages? A good HTML cache setup should help a lot.


They can't possibly mean 2000 of the beefy servers described as the load balancers, or it wouldn't be impressive at all. But it's very frustrating that the article gives the specs of the load balancers without saying how many of them they need.



