
Blog post from same company in two years:

"How we switched from a custom database to Postgres".



I used to work at Man, and they have been using ArcticDB (including the earlier iteration) for over 10 years now.


I don't understand your cynicism here. Should people/businesses not try new things?


>> Should people/businesses not try new things?

Well yes, in the right context, like hobby/personal programming. But things like "we built our own database" tend to be really hard to justify and mostly represent technical people playing with toys when they actually have an obligation to the business that is spending the money: not to play with toys, but to spend that money wisely and build a sound architecture, especially for business-critical systems.

It's indulgent and irresponsible to do otherwise if standard technology will get you there. The other point is that there are very few applications in 2024 with data storage requirements that cannot be met by Postgres or some other common database, and if not, then perhaps the architecture should be changed to do things in a way that is compatible with existing data storage systems.

Databases in particular, as the core of a business system, have not only the core "put data in/get data out" requirement but hundreds of sub-requirements relating to deployment, operations, and a million other things. You build a database for that core set/get requirement, and before you know it you're wondering how to fulfill that vast array of other requirements.

This happens everywhere in corporate development: some CTO with a personal liking for some technology makes the business use it when in fact the business needs nothing more than ordinary technology, which would avoid all sorts of issues such as recruiting. The CTO moves on, leaving behind a project built with the technology flavor of the month, which the business either struggles to maintain into the future or has to replace with a more ordinary way of doing things. Likely even the CTO has by now lost interest in the playtime technology fad that the industry toyed with and decided isn't a great idea.

So I stand by my comment - they are likely to replace this within a few years with something normal, likely Postgres.


I think you just fundamentally misjudge what he is optimising for. It's live or die by alpha generation, and his tradeoffs are not HFT. There is going to be a whole separate infra and opsec setup for executing the strategy.

A versioned, dataframe-native database is a perfect fit for attracting productive quant researchers.
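
For the curious, here's a minimal sketch of what "versioned, dataframe-native" buys you in practice, based on ArcticDB's public Python API as documented (the storage path, library name, symbol, and data are all made up for illustration):

    import pandas as pd
    from arcticdb import Arctic

    # Local LMDB backend; S3 and other stores are also supported.
    ac = Arctic("lmdb:///tmp/arcticdb_demo")
    lib = ac.get_library("prices", create_if_missing=True)

    df = pd.DataFrame({"close": [100.0, 101.5]},
                      index=pd.date_range("2024-01-01", periods=2))

    lib.write("AAPL", df)         # creates version 0
    lib.write("AAPL", df * 1.01)  # version 1; version 0 is kept

    latest = lib.read("AAPL").data       # newest version
    v0 = lib.read("AAPL", as_of=0).data  # time-travel to version 0

Every write is a new immutable version, so a researcher can pin a backtest to the exact data it originally saw. That is the part that attracts productive quant researchers.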

(Disclaimer, I'm also CTO of a quant fund and understand what he was optimising for)


I think it is somewhat like git's creation story. Sometimes a senior dev sees a tool that is close to ideal but needs to work a little differently than what the industry has built.

Databases are up there with encryption in the "don't roll your own" mentality.

But sometimes they don't fit the problem you're solving. Sometimes the data never changes, so why have infrastructure for updates?

Having a big DB running all the time could be too expensive for your business model.

Also, it's good to be curious about things like: what is an index? What does a Parquet file look like in a hex editor? Why can't I write the underlying DB table outside of Postgres? Why are joins hard?
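
That kind of curiosity is cheap to satisfy. A minimal sketch, assuming only the documented Parquet file layout (the filename is made up): the file starts and ends with the 4-byte magic "PAR1", and the 4 bytes before the trailing magic hold the footer length.

    import struct

    # Peek at Parquet's on-disk framing: leading magic, data pages,
    # Thrift-encoded footer metadata, 4-byte little-endian footer
    # length, trailing magic.
    with open("trades.parquet", "rb") as f:
        head = f.read(4)                             # b"PAR1"
        f.seek(-8, 2)                                # last 8 bytes
        footer_len = struct.unpack("<I", f.read(4))[0]
        tail = f.read(4)                             # b"PAR1" again
    print(head, footer_len, tail)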

And then you discover your tools give you a competitive edge.

Most of the time there are existing tools, but sometimes there aren't.


They have tons of money, enough to support a development team improving their database. In addition, there is a long legacy, both technical and political, making it hard to even propose getting rid of it. The only likely switch is a gradual decline, with parts of the system moved to another database.



