Hacker News | modeless's comments

The price is 5x Opus: "Claude Mythos Preview will be available to [Project Glasswing] participants at $25/$125 per million input/output tokens." However, "We do not plan to make Claude Mythos Preview generally available."


It's so ridiculous that Google made a custom SoC for their phones, touting its AI performance, even calling it Tensor, and yet Apple is still faster at running Google's own model.

Google really ought to shut down their phone chip team. Literally every chip from them has been a disappointment. As much as I hate to say it, sticking with Qualcomm would have been the right choice.


It runs very fast on my Oppo Find N6 with its Qualcomm Elite Gen 5 SoC.

How many tokens per second? Also, does it get warm/hot?

If this Gemma tokenizer I found online is accurate, my Pixel 10 Pro XL is getting ~22 tok/s on Gemma 4 E2B using the NPU, vs. the ~40 tok/s people are reporting for the MLX version on iPhone.

Actually I found official performance numbers from Google saying iPhone gets 56 tok/s and Qualcomm gets 52. They don't even bother listing Tensor in their table. Maybe because it would be too embarrassing. Ouch! https://ai.google.dev/edge/litert-lm/overview


My problem isn't that these people exist in the world. My problem is they're increasingly drowning out other voices in a community I'm part of. I would prefer significantly more active moderation against politics and general non-technical negativity on this site.

Huh, I didn't know AV2 was out. What are the new features besides (I assume) incremental compression efficiency?

> AV2 is the next-generation video coding specification from the Alliance for Open Media (AOMedia). Building on the foundation of AV1, AV2 is engineered to provide superior compression efficiency, enabling high-quality video delivery at significantly lower bitrates. It is optimized for the evolving demands of streaming, broadcasting, and real-time video conferencing.

> AV2 provides enhanced support for AR/VR applications, split-screen delivery of multiple programs, improved handling of screen content, and an ability to operate over a wider visual quality range.


The general rule for video codecs is that each release reduces the required bitrate by ~30%. A release cycle is 7-10 years, so each year the research version of the new codec can be expected to improve by ~5%.
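Sanity-checking that compounding (a sketch; the 30% reduction and 7-10 year cycle are just the rule of thumb above, not measured figures):

```python
# Back-of-the-envelope: if a codec generation cuts bitrate by ~30%,
# what yearly improvement compounds to that over a 7-10 year cycle?

def annual_gain(total_reduction: float, years: float) -> float:
    """Yearly bitrate reduction that compounds to total_reduction."""
    return 1 - (1 - total_reduction) ** (1 / years)

for years in (7, 10):
    print(f"{years} years: {annual_gain(0.30, years):.1%} per year")
# 7 years works out to ~5.0% per year, 10 years to ~3.5%
```

So the ~5% figure corresponds to the short end of the cycle; a 10-year cycle implies closer to 3.5%/year.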


It’s in the first sentence of the article.

Back of the class you go.


I don't think it actually is. Could you indulge me and quote it here, please?

There’s some blue text with an underline labeled “AV2 Specification”. That’s called a link. If you click that, you’ll see the date the spec was ratified and some details about it.

Are you suggesting that I read the whole spec and then read the whole AV1 spec and diff them in my head? Or are you referring to the only text at that link describing differences from AV1: "enhanced support for AR/VR applications, split-screen delivery of multiple programs, improved handling of screen content, and an ability to operate over a wider visual quality range"? This is not a description of technical features; it's a high-level statement of aspirations. I'm asking what features they added to achieve these goals.

I was hoping someone familiar with AV2 might be frequenting this site alongside the much larger population of smartass pedants, and they might be able to summarize the new features in a way useful to me and others.


I am actually interested in an explainer of the technical differences between AppImage and Flatpak and Snap and why one is better than the others, but I didn't find it here.

Personally, as a user, I've found AppImages annoying: there's no install process that gets a binary into your PATH and an app-launcher icon automatically, updating them is usually a manual process, and I always get a FUSE error that I have to google how to fix. Snaps I've found annoying because applications packaged that way seem to have limitations the non-snap versions don't. Flatpak I have no experience with.

All that said, I like the idea of an app being a single file, and if they just provided a standard way for AppImages to register with app launchers and your PATH on first launch, made them update themselves as seamlessly as Chrome, and fixed that damn FUSE error, then I'd prefer them.


I just start AppImages from the command line and put them in my /home/$username/bin; that seems to take care of most of the annoying edge cases. Snaps are ridiculously hostile, abusing the mount system and polluting all kinds of places where they have no business going; I've completely purged the whole snap subsystem from my machine. Flatpak I've managed to avoid so far.

I unpack them because the app runs faster, there are no container/fs problems, and I use apps where I want to access the files inside the app. KiCad in particular has a load of component files that I always want to copy from /usr/lib into each project so that the project is fully self-contained, and in a nicer way than KiCad does it itself.

FreeCAD has a problem in that it uses Python, and python3 defaults to shitting turds (__pycache__ directories) everywhere it's executed. There is a way to tell Python not to do that, but that mechanism isn't possible when the app is an AppImage. It is possible if the AppImage is unpacked.
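For reference, the mechanism is Python's dont-write-bytecode switch; a minimal sketch of the two usual ways to set it:

```shell
# Setting this env var stops Python from writing __pycache__ directories;
# `python3 -B` does the same for a single invocation.
export PYTHONDONTWRITEBYTECODE=1

# Confirm Python sees it:
python3 -c 'import sys; print(sys.dont_write_bytecode)'   # prints: True
```

The catch the comment describes is that a packed AppImage controls its own process environment and launch wrapper, so you have nowhere convenient to inject this until it's unpacked.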

It's a simple but totally manual process to unpack one and integrate it into the desktop fully, so that file types auto-launch the app and the file manager displays its icon. I started to write a script to automate it, but there's just enough variation in how AppImages are constructed that it's annoying to script, even though it's pretty easy to do manually.
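For anyone curious about the manual route: after `./SomeApp.AppImage --appimage-extract` (the standard runtime flag, which unpacks into `squashfs-root/`), desktop integration is mostly a matter of dropping a .desktop file into `~/.local/share/applications/`. A minimal sketch; every name and path here is a hypothetical example:

```ini
# ~/.local/share/applications/someapp.desktop (illustrative)
[Desktop Entry]
Type=Application
Name=SomeApp
Exec=/home/you/apps/squashfs-root/AppRun %F
Icon=/home/you/apps/squashfs-root/someapp.png
MimeType=application/x-someapp-project;
Categories=Utility;
```

The MimeType line plus a `update-desktop-database ~/.local/share/applications` run afterwards is what gets file types auto-launching the app; the variation the comment mentions is mostly in where each AppImage hides its icon and existing .desktop file.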


If you look at Apple / Google mobile platforms, these are the requirements for modern desktop apps:

1. providing a build environment for app developers to build something that can run on any distro

Both Flatpak and Snap solve this by providing an SDK; for Snap there is one SDK built out of Ubuntu packages, while for Flatpak there is a choice of various options, most built on the Freedesktop.org SDK (Gnome/KDE), plus some independent ones. AppImage provides nothing to solve this problem.

2. providing a runtime environment that conveniently integrates the app on users' desktops

Flatpak and Snap solve this via integration into Gnome Software, KDE Discover and similar UIs; AppImage also solves it in a way by being just a single file that the user clicks on.

3. sandboxing to keep users safe

Flatpak provides sandboxing via Bubblewrap, which works on any Linux distro. Snap provides sandboxing mostly via AppArmor, which requires (last I checked) out-of-tree Linux patches, and only works fully on Ubuntu. AppImage does not provide sandboxing, but the expert user can manually run an AppImage with firejail to sandbox it.

4. a convenient way for users to find applications to install

Flatpak has Flathub as a vendor-independent central app store with volunteer reviewers, and also provides the option to self-host apps conveniently. Snap has Snap Store as a central app store that is run and monetized by Canonical, and it's not possible to set up an independent alternative. AppImages are typically hosted directly by the upstream project, but now there is also an AppImageHub.

5. automated updates

Flatpak and Snap provide this automatically from Flathub/Snap Store; AppImages may be auto-updatable in several different ways but it requires the application author to implement support for it.


A few years ago I read this small post[0] that talks about Flatpak and its use of OSTree[1].

It doesn't exactly compare it to the other formats, but still interesting on its own.

[0]: https://blogs.gnome.org/alexl/2017/10/02/on-application-size...

[1]: https://docs.flatpak.org/en/latest/under-the-hood.html


AppImages are just a portable packaging format, while Flatpaks and Snaps also offer sandboxing, and Flatpak is more distro-independent than Snap, which is developed by Canonical. Ubuntu used to be extremely pushy about Snap, which turned me off, and now Flatpak just seems a lot more popular.

Ubuntu is still pushing snap: they've kept the practice of silently replacing apt packages with snaps, I think the default Firefox is still a snap, and so is Node.

I'd love to see snap go the way of upstart...

The Ubuntu defaultism still puzzles me to this day... Canonical has repeatedly subjected users to its horrible science experiments, pushing broken software that sometimes persists for half a decade or more (see PulseAudio: it shipped in Ubuntu for literal years and never worked...). Snap is their latest science experiment.

Though I'm not sure what the default should be, as I can think of disadvantages to several alternatives.


I've generally preferred AppImages because they're just a single-file (in appearance) binary I can put in a ~/bin folder and run. Flatpaks require an external tool to run and update them, there's confusion about whether you need the --user flag, and after every graphics-driver update on the main OS you need to upgrade all the Flatpaks again... It's such a hassle. The permissions isolation is nice in theory, but firejail works for that too, arguably better in some ways.

In the space of retro gaming, the DuckStation devs recently had some drama (I think primarily with Arch users) and it resulted in purging the Flatpak builds; now there's only an AppImage. I'm sure there was much righteous rage like this post, but aimed at Flatpak, or who knows.


My only experience with Flatpaks is on Fedora Kinoite and Silverblue, so YMMV.

Flatpaks are updated at the same time as the system by the GNOME and KDE update GUIs, or in one step from the command line with "flatpak update" (or "sudo dbus-launch flatpak update" when running outside a graphical environment). I've never run into problems with graphics drivers, though admittedly I've only used them on systems with Intel and AMD GPUs supported by in-tree drivers (what you're saying makes sense, though, because Flatpak runtimes do bundle user-mode graphics driver components).

While you're not wrong that running Flatpaks requires an external tool, installing them creates both symlinks to wrapper scripts in a common directory that can be added to your PATH, and launchable desktop application icons in GNOME and KDE that work no differently than those for applications installed through other means.

The wrapper scripts and symlinks have qualified names, e.g. "io.mpv.Mpv", but that's trivially fixed with an alias or additional symlink if desired.
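To make that concrete, here's a sketch of the PATH-plus-symlink approach (the io.mpv.Mpv name is from above; the exports directories are the standard Flatpak locations for per-user installs, with system installs under /var/lib/flatpak instead):

```shell
# Flatpak exports wrapper scripts under qualified names; put the exports
# dir on PATH once in your shell rc (system installs use
# /var/lib/flatpak/exports/bin):
export PATH="$HOME/.local/share/flatpak/exports/bin:$PATH"

# Short name via a symlink into a directory already on PATH...
mkdir -p "$HOME/bin"
ln -sf "$HOME/.local/share/flatpak/exports/bin/io.mpv.Mpv" "$HOME/bin/mpv"

# ...or via an alias:
alias mpv='flatpak run io.mpv.Mpv'
```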

The only problems I've run into with Flatpaks are limitations due to sandboxing, e.g., the Wireshark Flatpak can't capture packets, which makes it useless in common scenarios.

"--user" is for working with per-user Flatpaks rather than system-wide ones. I've personally never had a reason to use it, since all my Fedora systems are personal, but it doesn't seem any more confusing than similar switches in other package management systems.


Generally speaking, between flatpaks and appimages, flatpaks are actually a lot easier to make and get out to people. The thing you're talking about here, not being able to easily run or get them, is kinda the core issue with appimages.

I've never actually worked with snaps before, but they're Canonical's format and somewhat specific to Ubuntu. I can't say too much about them.


Flatpak gives me one package to install and a little bit of a minor security boundary around the application.

Out of Appimage, snap, and flatpak, flatpak has been the only one after years that seems sane.

It also works really well with atomic desktops tbf.

Fuck snap, fuck snap so hard.


Because it lets you ignore your 5 hour quota for a while.

The only listed qualification is "You’ve subscribed to a Pro, Max, or Team plan by April 3, 2026 at 9 AM PT", but I'm not getting the banner for this credit. I suspect there's an unstated additional qualification: that your account hasn't previously received an extra usage credit.

I don't know if they should be handing out more credits right now considering that my Sonnet requests in Claude Code are routinely delayed by several minutes, presumably due to capacity issues...


I previously received one and was able to activate this one too. I agree it seems like there's more to this, just wanted to add another anecdote to further confuse things. :)

I'm not seeing the usage credit banner either. Didn't manage to claim it.

> I suspect that there's an unstated additional qualification: that your account hasn't previously received an extra usage credit.

My account has never received any credit. I subscribed to Pro only a few weeks ago.


I was able to claim the bonus usage today. It just randomly appeared.

> I suspect that there's an unstated additional qualification: that your account hasn't previously received an extra usage credit.

I received extra usage credit in February, and I got the email today asking me to claim this extra credit. Managed to claim this one too.


Dunno, maybe there is a bug. I was a subscriber, had extra usage enabled, and had paid for extra usage before, and still didn't get the extra credits. I'm on the Max plan, so I was rather looking forward to the extra $100 to burn on /fast mode.

I guess it was just delayed. I received the credit banner just now and redeemed it successfully.

I received the old one and I've received the new one too, so that's not the rule.

I agree. I happened to see Boris' tweet about it as soon as it posted, and the endpoint for redeeming the credits (the one that fires when you click it on the usage page) was already returning sporadic 400s for everyone.

I wouldn't be surprised if they pulled it back so they could spread out the load.

edit: whoops, meant to leave this as a reply to the (now-sibling) comment from 'flutas.


Yeah, same here. I think it's just rolling out gradually, mine wasn't showing for the longest time then just randomly appeared.

Maybe they've changed the page since you checked, but there are two requirements, the second one being "You’ve enabled extra usage".

If you haven't, then you don't get the credit.


That's not really a qualification so much as a required step, as you can do it anytime in the next two weeks to get the credit. And I have, but no credit.

Agreed, watching national or world news is useless. If you want to know what is likely to happen instead of what someone wants you to think will happen, we now have prediction markets. Whenever I see a headline I'm curious about, now instead of reading the article I just go to a prediction market and check the probabilities.

Prediction markets miss all the experts, whether academics or lay wonks, who simply don't care to have a financial stake in the outcome. I can't imagine how it'd be representative. In any case, the people weighing in are getting their information from somewhere, and it's not thin air. How can you understand an issue without knowing the motivations and vested interests on all sides, the history leading up to it, etc.?

If I have a specific interest in a topic I can do extensive research over many hours and come to my own conclusions. But for the vast majority of news headlines I see, including almost all "national" or "world" news, I don't have the time to do hours of research. In that case, reading one or three news articles is far more likely to give me a biased and ultimately incorrect take than looking at a prediction market, which takes all the available information and condenses it into the one number that matters.

Polymarket publishes accuracy statistics

https://polymarket.com/accuracy


Thanks. I wish it could be drilled into by category, i.e., what are the stats for the categories that matter (filtering out sports, crypto, etc.)? My worry is that the average could look rosier if the share of trivial events is high.

I don't know if it's just getting older or some deeper change in society, but more and more the reading of how my peers view the world depresses me. Even beyond the specific issues with prediction markets, there is a whole lot more to understanding our world than merely knowing the rough odds of possible outcomes.

On the other hand it's a boon to those establishing new businesses. And a huge boon to employees. And a boon to the overall economy because it accelerates transfer of know-how out of older and more dysfunctional companies into newer and more nimble ones. This is what made Silicon Valley what it is, starting all the way back with the Traitorous Eight in 1957 and continuing today.

There are so many wannabe "New Silicon Valley" areas that are unwilling to copy the non-compete ban, and they consequently fail to compete with the real Silicon Valley. It's a necessary ingredient, in my opinion.

