If you want to compare security track records, you have several huge vulnerability databases to consult; Secunia is the one that seems to have done the best job of SEO.
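For concreteness, here's roughly what "consulting" one of these databases looks like. This is a minimal sketch against NIST's public NVD CVE API 2.0, standing in for Secunia (whose data isn't freely queryable); the endpoint and `keywordSearch` parameter are real NVD API features, but treating per-vendor keyword counts as a "track record" is exactly the naive comparison at issue here:

```python
import json
import urllib.parse
import urllib.request

# Public NVD CVE API 2.0 endpoint (NVD stands in for Secunia here).
NVD_URL = "https://services.nvd.nist.gov/rest/json/cves/2.0"

def cve_count(keyword: str) -> int:
    # resultsPerPage=1 keeps the response tiny; we only read totalResults.
    query = urllib.parse.urlencode(
        {"keywordSearch": keyword, "resultsPerPage": 1}
    )
    with urllib.request.urlopen(f"{NVD_URL}?{query}") as resp:
        return json.load(resp)["totalResults"]

# Crude per-vendor "track record": a raw count of CVEs matching each name.
for vendor in ("Cisco", "Juniper", "Astaro"):
    print(vendor, cve_count(vendor))
```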
But there are two huge problems with this approach.
First, nobody cares. Nobody really evaluates products based on "security track records". If I did a "Month of Cisco Bugs", do you think any company in the world with more than 500 employees would switch to Juniper or Astaro? No, they would not.
Second, vulnerability track records don't directly measure product security, or even the responsiveness of a product security team. They measure researcher interest and researcher effectiveness. Microsoft, Adobe, and Apple are deluged with reports because researchers are incentivized to target them. It also tends to take (say) Microsoft longer to fix things than J-Random-Vendor: QA is harder, releases are more expensive, and there are more security issues on the plate to begin with.