Yes, it's a niche (some may even say it's not software, it's hardware, but it is indeed a large code base that requires a software engineering environment/toolchain/practices/etc.). Again, just answering your question for the benefit of the discussion. It would be interesting to hear of other such closed-source areas.
Kdb is a very niche product that has open source alternatives that are made use of by the rest of the software industry outside of FinTech. It's not bad, it's just not what the market wants to use because it's a closed box.
Wikipedia has an article on time series databases with a short list of popular TSDBs [0]. On that list, 11 are libre software to some degree and 4 are commercial. Relational TSDBs are new to the open source space, but they're still being built despite kdb+ already existing. This shows that people will spend considerable funds to engineer a replacement (and give that replacement away for free) to avoid using a closed source product for this use case.
Unfortunately, there are no open source alternatives to kdb+. Effortlessly handling hundreds of terabytes of data on a single machine, while the binary is only about 300 kB (yes, kilobytes)... Nobody has even come close.
Market adoption could be better, but the license costs around $100,000/year (probably the most expensive software per kilobyte). Fintech can afford it, other industries can’t.
You'll also end up spending at least that much on consultants. While KDB is remarkable in many ways -- the first time I saw it I thought the demo was fake, the performance was so great -- it is very difficult to develop in.
They have an SQL-ish interface that allows non-specialists to get work done, but anything serious needs to move beyond it.
I haven't used it in a few years, but queries had to be carefully optimized -- swapping the order of predicates in a where clause could cause order-of-magnitude differences in performance. Also, it is append-optimized: if you need to update or insert data, it is a nontrivial exercise.
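As I understand it, q applies where-clause predicates left to right, each one only over the rows that survived the previous one, which is why ordering matters so much. A rough sketch of the effect in plain Python, over a hypothetical two-column table (the column names and data are invented for illustration):

```python
# Hypothetical columnar table: two parallel lists ("columns").
n = 200_000
sym = ["AAPL" if i % 1000 == 0 else "MSFT" for i in range(n)]  # ~0.1% match
px = [float(i % 7) for i in range(n)]                          # ~43% match px > 3

def filter_selective_first():
    # Highly selective predicate first: the second test only runs
    # on the ~0.1% of rows that survive the symbol filter.
    idx = [i for i in range(n) if sym[i] == "AAPL"]
    return [i for i in idx if px[i] > 3.0]

def filter_unselective_first():
    # Same result, but the string comparison now runs on the
    # ~43% of rows that pass the price test -- far more work.
    idx = [i for i in range(n) if px[i] > 3.0]
    return [i for i in idx if sym[i] == "AAPL"]

assert filter_selective_first() == filter_unselective_first()
```

Both orderings return identical rows; only the amount of work differs, which matches the order-of-magnitude swings described above.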
Regarding being append-only: these days, when reactive programming, immutability and FP are all in vogue, that's a feature. As for the rest, yes, K/Q are idiosyncratic, but they are simpler than they look at first sight (and second... and fifth). :)
Other options exist: TS products like OneTick, RDBMSs for TAQ data (esp. if you need non-TS indexes), StreamBase for UI backends.
I even met a programmer who hacked something together in Java using the same columnar kdb layout but, you know, multithreaded, so different basket optimization jobs could run simultaneously on the same store.
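That sort of design, a shared read-only columnar store scanned by several jobs at once, can be sketched in a few lines of Python (the columns and the "basket job" are invented for illustration; note that CPython's GIL limits true parallelism for pure-Python scans, which is presumably why that programmer reached for Java):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical shared columnar store: one list per column.
store = {
    "sym": ["A", "B"] * 50_000,
    "qty": [i % 10 for i in range(100_000)],
}

def basket_job(symbol):
    # Each "basket optimization" job scans the same immutable columns.
    # Read-only sharing needs no locking, so jobs can run concurrently.
    sym, qty = store["sym"], store["qty"]
    return sum(q for s, q in zip(sym, qty) if s == symbol)

with ThreadPoolExecutor(max_workers=2) as pool:
    totals = dict(zip(["A", "B"], pool.map(basket_job, ["A", "B"])))
```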
kdb is frequently described as an in-memory database, or with great emphasis on how efficiently it utilizes a ton of RAM. I've no idea how much its performance actually drops when the amount of data goes past, say, 2x the available memory, but I'm pretty sure hardly anyone can afford multiple terabytes of RAM in addition to the software license these days…
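For what it's worth, kdb+ memory-maps its on-disk columns, so "in-memory" is partly about letting the OS page cache do the work: past RAM size you'd expect degradation toward disk bandwidth rather than a cliff. A toy sketch of the same idea using Python's stdlib mmap (the file layout here is invented and is nothing like kdb's actual format):

```python
import mmap
import os
import struct
import tempfile

# Write a flat column of float64 "prices" to disk.
path = os.path.join(tempfile.mkdtemp(), "px")
n = 100_000
with open(path, "wb") as f:
    for i in range(n):
        f.write(struct.pack("<d", float(i % 500)))

# Scan it via mmap: the OS pages 8-byte values in on demand,
# so the process never needs the whole column resident in RAM.
with open(path, "rb") as f:
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    total = sum(v for (v,) in struct.iter_unpack("<d", mm) if v > 400.0)
    mm.close()
```

When the mapped file fits in RAM the second scan runs at memory speed; when it doesn't, you pay for page faults, which is the gradual slowdown the question is really about.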
You wouldn't happen to be a commenter/author for this Reddit post[0], would you? Your sentiment here seems quite similar. I have no experience at all with hardware design, but it seems like that domain is especially geared towards closed-source for specific reasons.
That is an interesting post. I think there are two other potential ways FPGA tools could become open source:
1) If there are some widely used open hardware devices or open standards. The fact that FPGAs are proprietary designs that can't be copied is a significant barrier. If there were an analogue of ARM in the FPGA world, that could open up open source dev tools too.
2) DIY FPGA. This one is almost certainly much further down the road than any of the other options. But if the fab and design tools for the chips themselves become a commodity, then there will be open source dev tools. It used to be pretty difficult to make a custom PCB, but now with PCB-as-a-service, open tools are seeing more use. I think the reason this is so far down the line is that FPGAs offer cutting-edge performance for specialized applications. If you can wait several generations or are OK with reduced performance, you can stick with a generic CPU/GPU. With Moore's law running down, there may be more and more things that aren't possible without an FPGA. And I don't think open source dev tools will follow until DIY FPGA gets that far.
For my comment I was thinking about HDL simulators, which are very expensive, proprietary tools (from vendors like Synopsys, Cadence, Mentor Graphics) with a lot of vendor lock-in, yet are based on standard, open languages: Verilog/SystemVerilog.
I know there are some open-source simulators that have been around for a while, but they aren't widely used, and the big 3 EDA vendors still seem to have a stranglehold on the HDL simulator market.
Simulation is generally more important for ASIC hardware than FPGA hardware, since you can quickly iterate on an FPGA design with logic/design fixes, whereas with ASICs design iteration can be prohibitively expensive and time consuming.
RISC-V was explicitly created to solve the problem of there being basically no open source architectures, and it's still fairly new, so it really is the exception that proves the rule.
I agree in terms of the equivalent of standard libraries. Ex: if you want a DMA engine, a PCIe interface, Ethernet, etc., you either have to build it from scratch or buy IP from a vendor; there's not a lot of open source. There is opencores.org, but when a company is investing a lot in its own hardware design, it tends to want higher quality or more recent/updated designs.
But the IC/hardware toolchain is a separate issue from the design itself. It's largely proprietary, probably due to the niche market and large R&D costs involved in some of the tools.
If you know of any chips taped-out, and shipping in volume, that used any of these as their primary tool, I’d be very interested to know more about that.
I’m aware of chisel used to verify some RISC-V cores that were fabricated, but that’s a research POC, not a volume/production ASIC.
All of SiFive's chips use Chisel; it's our primary design language. We use commercial tools post-Verilog (i.e., synthesis and place-and-route). We use Verilator for some simulation, but also use commercial Verilog simulators.
>What programming language stack do you use that’s closed source?
Verilog/SystemVerilog (generally HDL/FPGA/ASIC design/verification).