alwillis's comments | Hacker News

> They just need to start deprecating and removing old features.

That's not going to happen.

They can't break millions of websites by removing old features. Besides, for the most part, current developers can ignore most of the old stuff.

But a site made in 1997 has to render on current browsers.


>On the contrary, a lot of the reason CSS is confusing is because it's full of insane hacks people have to do to get the behaviour they want.

CSS is confusing because the vast majority of web developers never learned it properly. Many developers won't learn any "new" CSS (like CSS Grid which shipped in all browsers in 2017) beyond the hacks they learned in the '90s and early 2000's.

That's not the fault of CSS.


> CSS is confusing because the vast majority of web developers never learned it properly. Many developers won't learn any "new" CSS (like CSS Grid which shipped in all browsers in 2017) beyond the hacks they learned in the '90s and early 2000's.

Disagree. The newer stuff is, if anything, more confusing. The old stuff, awful as it was, at least had a consistent model.


> Disagree. The newer stuff is, if anything, more confusing. The old stuff, awful as it was, at least had a consistent model.

With the "old stuff", we didn't have a layout model or an alignment model. Everything required float and positioning hacks to do things they weren't designed to do. There's no logical way that was "better."

There were several different grid systems, all mostly incompatible with each other, which were required to do anything interesting.

Many layouts that are common today were impossible to do with just HTML & CSS back in the '90s and 2000s.

Capabilities that almost all developers had to reach for a framework like Bootstrap or Foundation (or lots of JavaScript) for are built into CSS today.
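
As a rough sketch (the class names here are hypothetical, not from any particular framework), this is the kind of layout that used to require Bootstrap's grid or float hacks, done with nothing but CSS that has shipped in every major browser since 2017:

    /* A responsive card grid: no framework, no floats. */
    .card-grid {
      display: grid;
      grid-template-columns: repeat(auto-fill, minmax(250px, 1fr));
      gap: 1rem;
    }

    /* Centering, the classic float-era pain point, is now one declaration. */
    .card-grid .featured {
      display: grid;
      place-items: center; /* centers content horizontally and vertically */
    }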


A gentle reminder: conditionals aren't new to CSS; @supports and @media are conditionals; so are style queries.

if() just codifies behaviors and hacks [1] developers were already doing.

[1]: https://lea.verou.me/blog/2020/10/the-var-space-hack-to-togg...
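
To make that concrete, here's a rough sketch of the "space toggle" pattern described in [1], the sort of thing if() formalizes, alongside the conditionals CSS has had for years. (The class names are hypothetical.)

    .card {
      /* "off": `initial` makes the custom property guaranteed-invalid */
      --featured: initial;

      /* If --featured is invalid, --featured-bg becomes invalid too,
         so the fallback (white) wins below. */
      --featured-bg: var(--featured) gold;
      background: var(--featured-bg, white);
    }

    .card.is-featured {
      /* "on": a single space is a valid (empty) value,
         so --featured-bg resolves to "gold" */
      --featured: ;
    }

    /* Conditionals that have long been part of CSS: */
    @supports (display: grid) {
      .card { display: grid; }
    }

    @media (prefers-color-scheme: dark) {
      .card { background: var(--featured-bg, #222); }
    }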


> Prompt ingested (time to first token) in "18 seconds" with the new chip... end of the joke

18 seconds on the M5, which is 4.4x faster than the previous M4 running one of Qwen’s 8-billion-parameter local models.

That’s quite impressive for a tablet and faster than most laptops available today.


> Apple feeling like they have to partner with Google to replace their own "Apple Intelligence"

They’re not replacing Apple Intelligence; the partnership with Google is for the backend of Siri.


Yeah, but this is what I mean. They are unable to produce a decent AI solution, so they have to outsource. And if it were just about being limited by on-device processing power, they could just switch to using their cloud infrastructure.

To me that's just dumb. They are spending money on a competitor's solution just to keep control and keep up appearances. At this point they could just forget about AI altogether and/or open their OSs to let others integrate their solutions. But Apple is way too proud to admit being wrong, and we will get a half-assed solution that will both cost them money and be inferior to the competition because of their own self-imposed limitations.

They have been riding on their privacy/security narrative for a while, yet they work with governments and ads/tracking hasn't gone anywhere. And they'll have to open up the App Store golden goose at some point anyway. They should recognise when the strategy isn't working for the long run and switch it up...

But they have tried to compete with Google on maps; it is still largely inferior in many ways, and they seem to be fine with that. So I think it will end up in a similar situation.


You know Google pays Apple ~$20 billion a year to be the default search engine, right? The rumored deal with Google is $1 billion, so Apple is still up $19 billion.

There are only three frontier model makers: OpenAI, Anthropic, and Google. So everyone using a frontier model behind the scenes is using one of those three.

It's likely that whatever happens, it'll be running on Apple's Private Cloud Compute [1], so it won't be mined for ads, etc.

What you don't understand is that on-device AI isn't going away because Siri uses Gemini; they're two separate products. Using Gemini is the fastest way to add "world knowledge" to Siri.

[1]: https://security.apple.com/blog/private-cloud-compute/


I do understand, and I mostly agree with your perspective. But narrowing the problem to money isn't really helpful. Sure, Apple will be able to make a shitton of money regardless.

For consumers, it would be more beneficial if Apple actually used all the money they get to build standalone products. It feels like Apple is not trying very hard (or has become too incompetent) to make a viable competitive product because they profit enough regardless. Don't you think it would be better if Apple were able to build a search engine to compete with Google instead of just pocketing the profit?

I'm not saying that on-device AI is going away because of Gemini. I just mean that Apple's strategy wasn't good, and they wasted resources on a dead end (a common trend with Apple recently), and they will rely on a competitor until they can provide their own in-house solution (that may or may not be good). They marketed the shit out of their Apple Intelligence for their hardware, and that ended up not being good.

The Private Cloud Compute push does not inspire great confidence because it basically relies on their hardware, just at a bigger scale. If they can't get good results with a Mac Studio, it's not clear the results will be that much better. It seems like the approach has a built-in limitation from both a hardware and software standpoint. Apple hardware is only decent at inference, and that's only when going for memory-heavy models and comparing to single consumer GPU machines. I think they have to get the model from Google in part because even if they have in-house talent, they would need to rely on 3rd-party hardware anyway.

They are not very competitive with local AI for average consumer devices, and they are not competitive with local AI at the high end (prompt processing). You can load high-quality models if you spend lots of money on their unified RAM, but it's slow. The Nvidia DGX and the newer APUs from AMD are rendering their memory advantage somewhat irrelevant for the midrange. And they can't provide competitive cloud AI for consumers who will never pay for such powerful hardware. So why should we pay more money for Apple hardware when we are going to end up using their competitors' technology? It's like buying an item from a white-label brand on Amazon that is just rebadging generic Chinese stuff. There is nothing wrong per se, but you are just giving money to a useless middleman; it's not efficient, and in a way, you get scammed.

You mention privacy as if it were still relevant. It's all moral posturing and virtue signaling from Apple. As I said, they don't work with governments any less than the other tech companies, and that's the kind of privacy that actually matters. As for ads/tracking, not only does using their hardware not prevent that at all, but they are also starting to sell ads themselves: just in Maps for now, but it's likely to spread.

This is why I talked about Maps. Apple does not offer a competitive solution; the routing is pretty good, but today we use mapping apps for discovery and reviews, and personally, I also use transit and bicycle routes. Apple Maps is really not a decent equivalent, and I say this as someone who had a lot of hope when Apple Maps came out and used it almost exclusively for as long as I could. Even using both a Mac and an iPhone, I find the Google solution more convenient and useful. I live in the EU, where Google was forced to remove Maps links in their search engine, which was quite convenient. But the Maps macOS app is even more useless, so it might as well not exist. Poor data, poor UI, poor performance, it's just plain bad.

So in the end I use Google Maps, and the privacy argument is largely pointless because Google gets my data anyway. The same goes for needing to use Google Search, YouTube, and various other Google properties. Ultimately, Apple receives large sums of money for nothing of value, and they continue to take the moral high ground as though they were providing their customers with satisfactory service. It's effortless to paint competitors negatively, but if you can't offer an adequate solution, then nobody benefits except yourself. And they have the money, so it's either greed or incompetence.

And this is why I said that about their AI marketing. It's largely useless compared to cloud AI that most consumers get to use on competitors' devices/solutions. Even for strictly local AI, they are not very competitive at any price point. Best-case scenario, it's a built-in solution that could be decent if they figure out the model part correctly.

The reality is that the hype is widely overblown, but I have been buying Apple hardware since the early 2000s, so I'm used to it. But they are just losing relevancy, and at some point having good chips and nice premium hardware won't be enough.


> All of the fuss seems to be entirely driven by Mitchell's clout, and maybe some interest in Zig.

Nope, that's not it.

It's mostly because he noticed that the majority of terminal applications were okay but not great. So he decided to address this by creating a cross-platform terminal app that's faster and more compatible than pretty much every existing terminal app, with a native macOS UI written in Swift that doesn't compromise its cross-platform features.

Kind of out of nowhere, Ghostty is in the conversation for best terminal app available. "Best" doesn't mean the most features, but it nails speed and compatibility. (I’d love to see iTerm switch to using libghostty in the near future. That would be a killer combination!)

From "State of Terminal Emulators in 2025: The Errant Champions": [1]

> Before presenting the latest results, Ghostty warrants particular attention, not only because it scored the highest among all terminals tested, but that it was publicly released only this year by Mitchell Hashimoto. It is a significant advancement. Developed from scratch in Zig, the Unicode support implementation is thoroughly correct.

> In 2023, Mitchell published Grapheme Clusters and Terminal Emulators, demonstrating a commitment to understanding and implementing the fundamentals. His recent announcement of libghostty provides a welcome alternative to libvte, potentially enabling a new generation of terminals on a foundation of strong Unicode support.

[1]: https://www.jeffquast.com/post/state-of-terminal-emulation-2...


> I don't think there are any. It is likely just a social media hype.

It's not hype. Here's a comprehensive review of a lot of terminals, and Ghostty did very well: "State of Terminal Emulators in 2025: The Errant Champions" [1]

[1]: https://www.jeffquast.com/post/state-of-terminal-emulation-2...


Can ghostty finally search in the scrollback? The last time I tried it, it didn't support a freaking search. This is the #1 feature I need from any terminal.

No sign of alacritty :(

Have you read the post you linked, and do you understand what it is about?

tmux is another terminal layer inside of any terminal.

Newer terminal apps like WezTerm have a multiplexer built-in.


WezTerm was my daily driver for a long time—it’s a great app.

Ghostty is blazing fast and the attention to detail is fabulous.

The theme picker is next level, for example; so are the typographical controls.

It feels like an app made by a craftsman at the top of his game.


> Mac, alt-minus.

I've been using Macs for decades; it's called the Option key, and no seasoned Mac user calls it "Alt".

I know that when a PC-style keyboard is attached to a Mac, the Alt key functions as the Option key. [1]

- Option-minus creates an en dash

- Option-Shift-minus creates an em dash

[1]: https://support.apple.com/guide/mac-help/intro-to-mac-keyboa...

[2]: https://www.merriam-webster.com/grammar/em-dash-en-dash-how-...


Mac since 1995 or so, pretty seasoned.

But I also have windows keyboards plugged in. Hard enough getting the ones I like around here without also constraining them to Apple's preferred symbols printed on the keys.


It says “alt” on it

Not on my MacBook Pro.

Seasoned Mac users have ones that do ;)
