Hacker News | xbenjii's comments

I'm confused. They're claiming "Apple’s M4 Max is the first production CPU to pass 4000 Single-Core score in Geekbench 6," yet I can see hundreds of other single-core results above 4000 from the last two years?


Are those production results?

https://browser.geekbench.com/v6/cpu/1962935 says it was running at 13.54 GHz. https://browser.geekbench.com/v6/cpu/4913899 looks... questionable.




7614 MT/s on the RAM is a pretty large overclock for desktop DDR5.


There are 8000 MT/s CUDIMMs for the new Intel chips now...


They've been announced, within the past two weeks, and as far as I can tell aren't actually available for purchase from retailers yet: the only thing I've seen actually purchasable is Crucial's 6400MT/s CUDIMMs, and Newegg has an out-of-stock listing for a G.Skill kit rated for 9600MT/s.

The linked Geekbench result from August running at 7614 MT/s clearly wasn't using CUDIMMs; it was a highly-overclocked system running the memory almost 20% faster than the typical overclocked memory speeds available from reasonably-priced modules.


Geekbench is run pre-release by the manufacturers.


But by definition that means it’s not a production machine yet.

So it doesn’t invalidate Apple’s chip being the fastest in single core for a production machine.


The post doesn't say anything about production machine. It talks about consumer computing.


Yeah that's fair lol


As far as I can tell those are all scores from overclocked CPUs.



That result is completely different from pretty much every other 13700k result and it is definitely not reflective of how a 13700k performs out of the box.


Geekbench's summary report doesn't really give accurate (or enough) information to draw that kind of conclusion for an individual result. The one bit of information it does reliably give, memory frequency, says the CPU's memory controller was OC'd to 7600 MT/s from the stock 5600 MT/s, so it feels safe to say that a result with 42% more performance than the entry in the processor chart also had some other tweaks going on (if not actual frequency OCs/static frequency locks, then exotic cooling or the like). The main processor chart at https://browser.geekbench.com/processor-benchmarks will give you a solid idea of where stock CPUs rank - if a result has double-digit differences from that number, assume it's not a stock result.

E.g. this is one of the top single-core benchmark results for any Intel CPU https://browser.geekbench.com/v6/cpu/5568973 and it claims the maximum frequency was stock as well (actually 300 MHz less than the thermal velocity boost limit, if you count that).


Could those be overclockers? I often see strange results on there that look like either overclockers or prototypes. Maybe they mean this is the fastest general-purpose single core you can buy off the shelf, with no tinkering.


AMD's upcoming flagship desktop CPU (the 9800X3D) reaches about 3300 points in single-core (the previous X3D hit around 2700).


Are you saying a product that has not been released yet will be faster than a product that is?

And that a desktop part is going to outperform a laptop part?


I think he was backing up Apple's claim.


No, neither of those.


We had to temporarily disable the fraud protection on our account to be able to accept transactions; luckily we have alternative fraud protection in place too.


I love that you can literally just send an envelope of cash with your account number to pay.



Awesome! Do you know if/when there will be support for self-hosted Gitlab instances?


Same with the main domain too www.thunderbird.net


Looks like they have corrected it by moving to a Let's Encrypt certificate for the main www.thunderbird.net domain.


UNetbootin doesn't support multiple installs on the same device.


This is why I block at the router level with AdGuard/PiHole.


The issue with deeper and deeper ad blocking technology is that you're going to end up putting more and more trust into your ad blocker.

uBlock Origin requires access to the DOM, where it can do nasty things like overwrite window.fetch or window.XMLHttpRequest and intercept network traffic. PiHole, just based on the way it operates, has to route all your network traffic along a different path, and it's up to you to watch its upstream output to make sure it's not doing something bad.

I think there's some benefit to the way Apple is intentionally limiting the available surface for content blockers, but it'd be nice to expand that surface in limited ways (e.g. freeze library functions so more of the DOM can be accessed, but at the risk of breaking badly-behaved websites) or at least to get a better, plain-English explanation of _why_ those decisions are being made.


> The issue with deeper and deeper ad blocking technology is that you're going to end up putting more and more trust into your ad blocker.

To be honest, I trust my ad blocker more than I trust Apple.

This is not a joke. Remember that Apple takes literally billions of dollars per year in payoff to make Google the default search engine in Safari. Apple's interests are not exactly aligned with mine.


But it’s not about trusting Apple or the content blocker. It’s about trusting Apple AND the content blocker and all its dependencies.


They allow the DEV1-M boxes now, which are 8 EUR a month.


I hope that's not your real password!


EDIT: it's actually his address. I thought it was just a coincidence but you can see the house on Street View... I removed the actual name as doxxing isn't great, sorry.


Thanks. Not the address with the WiFi garage thankfully.

I was actually thinking "better not commit that" ... before, y'know, I committed it :/


I got just the thing for you:

https://gist.github.com/hraban/10c7f72ba6ec55247f2d

Every time you write some code that you need to remember to remove before committing, surround it with a comment containing "NOCOMMIT". With this script as a pre-commit hook, git will print an error message and refuse the commit.

E.g.:

  print("debug: ", myval)

becomes:

  print("debug: ", myval) # NOCOMMIT

I end up relying on this every day I program. Can't go back.
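For anyone curious how the hook works: it boils down to grepping the staged diff for added lines containing the marker. Here's a minimal sketch of the idea (a reimplementation, not the linked gist itself), demonstrated in a throwaway repo with made-up file names:

```shell
#!/bin/sh
# Sketch of a NOCOMMIT pre-commit hook, demonstrated end to end.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
git config user.email demo@example.com
git config user.name demo

# The hook: reject the commit if any staged added line contains NOCOMMIT.
cat > .git/hooks/pre-commit <<'EOF'
#!/bin/sh
if git diff --cached | grep -q '^+.*NOCOMMIT'; then
    echo "commit blocked: staged changes contain NOCOMMIT" >&2
    exit 1
fi
exit 0
EOF
chmod +x .git/hooks/pre-commit

# Stage a file with a marked debug line and try to commit it.
echo 'print("debug: ", myval) # NOCOMMIT' > app.py
git add app.py
if git commit -q -m "wip" 2>/dev/null; then
    result=accepted
else
    result=rejected
fi
echo "commit was $result"
```

Running it prints "commit was rejected", since the hook fails the commit while the marker is still in the staged diff.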


Thanks! I'm not sure how easy it would be to put the git hook on all my machines though? I have a collection of laptops (and one desktop) that I work on and I often don't use the same machine for a few weeks :-/

I ended up using a "env.h" file... is there a C-equivalent of the PHP (?) .env file?


https://direnv.net would be my recommendation.
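For what it's worth, the .env idea isn't PHP-specific: you put export-style assignments in a gitignored file and read them from the environment (getenv() in C). A minimal sketch with made-up file and variable names; direnv just automates the sourcing step whenever you cd into the directory:

```shell
#!/bin/sh
# Hand-rolled version of the .env pattern (direnv automates the sourcing).
set -e
dir=$(mktemp -d)
cd "$dir"
printf 'export WIFI_PASSWORD=placeholder\n' > .envrc
printf '.envrc\n' > .gitignore   # keep the secrets file out of version control
. ./.envrc                        # direnv would do this for you on `cd`
echo "password is ${WIFI_PASSWORD}"
```

The C side then just calls getenv("WIFI_PASSWORD") instead of including a committed env.h.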


It's possible to remove that from the history of your repo, although it breaks any forks.

https://rtyley.github.io/bfg-repo-cleaner/


Heh. I think I did worse ... made a local copy of the repo, nuked it on GitHub, then re-created the three commits by hand ... minus the credentials.

That looks like a much more useful tool, though.


git add -p

It will let you approve each hunk in a file to commit or not.

git commit -e -v

Will force you to edit the commit message and in the editor show you the diff of the commit against HEAD.
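You can sanity-check that the diff really lands in the editor by swapping the editor for `cat` (throwaway repo, made-up file names):

```shell
#!/bin/sh
# Shows that `git commit -v` embeds the staged diff in the commit message
# template. GIT_EDITOR=cat prints the template instead of opening an editor,
# so the commit is ultimately aborted with an empty message.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
git config user.email demo@example.com
git config user.name demo
echo 'hello' > notes.txt
git add notes.txt
template=$(GIT_EDITOR=cat git commit -v 2>/dev/null || true)
echo "$template" | grep '^+hello'   # the staged diff is right there for review
```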


Thank you! I use 'git add -p' all the time, but didn't know the trick with commit. I am a sucker for nice commits so I check every commit's diff multiple times. When I don't, I usually end up including pieces of code that aren't ready yet, or that are meant for debugging...


Well looks like a nice neighborhood at least.


Well according to Google it’s an address in Auckland NZ so I hope it’s not his address either.


Hey, I'm in Auckland, maybe I can go around and hack his garage :)


Bring beer.


I kinda hope you come home to find multiple beers with notes saying "pull request pending"!


Me too!


Aw man, I missed the address, otherwise I would.

