Hacker News | new | past | comments | ask | show | jobs | submit | immutable_ai's comments | login

Fascinating. It is good to see that some people are actually dreaming and acting on a bigger scale than rent seeking mobile apps.


On time at only double the original cost is a bargain by Californian standards.

As you say, better to wait until it's done before getting too excited.


I would define "post-scarcity" as when you organize your economic system to constrain supply in order to prop up prices.

For instance, the U.S. burns 30% of its corn crop to produce ethanol, which would never happen in a market system, but it happens because of a government mandate.


This article seems very misleading, at least for California.

If you look at aggregate water usage for California, agriculture uses 4X what "urban" does, and "urban" includes industrial, golf courses, parks, etc.

So even if lawns cover more area, they are much less water intensive than agriculture.

http://www.ppic.org/publication/water-use-in-california/


The majority of usage goes towards power generation (https://water.usgs.gov/edu/wateruse-total.html). Urban areas consume the power, but they don't produce it; they hide their water usage in the rural areas.


Looking at that data, it appears that a majority of California's water for power generation comes from saline sources, so at least for California the largest user of fresh water is still irrigation (agriculture).

Either way, the point that agriculture takes a lot more water than people watering their lawns still stands.


Is this basically the same transition as V8 did with Crankshaft to Ignition+TurboFan?


I think that was the exact opposite - they generated an interpreter from their JIT. This generates a JIT from their interpreter, like PyPy and Truffle do.


It seems to generate JIT snippets from the Rust compiler, not from an interpreter.


As I understand it, you write an interpreter in Rust, then generate a JIT compiler from that interpreter.
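The "interpreter first, JIT derived from it" idea mentioned above (the PyPy/Truffle approach) is easiest to see in a toy dispatch loop. Here is a hedged sketch in Python rather than Rust; the opcodes and program are invented for illustration, but a loop of exactly this shape is what a meta-tracing or partial-evaluation framework specializes into compiled code:

```python
# Toy stack-based bytecode interpreter. The while-loop below is the
# part a meta-JIT (PyPy-style) would trace and specialize per program.
PUSH, ADD, MUL, HALT = range(4)

def run(code):
    """Plain dispatch loop: fetch opcode, branch, repeat."""
    pc = 0
    stack = []
    while True:
        op = code[pc]
        if op == PUSH:
            stack.append(code[pc + 1])
            pc += 2
        elif op == ADD:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
            pc += 1
        elif op == MUL:
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
            pc += 1
        elif op == HALT:
            return stack.pop()

# (2 + 3) * 4
program = [PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL, HALT]
```

The point of the approach is that you only ever maintain this interpreter; the JIT is generated from it mechanically, so the two can't drift apart.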


I use Ubuntu for programming and it is great.

Still use Windows for HTPC though because I gave up on trying to get Ubuntu to sync video playback correctly after banging my head against the wall for a while.


It is my understanding that difficult tasks should be attempted before banging one's head against the wall.


Directions unclear; head stuck in a computer


Redditization of HN, ladies and gentlemen. Jokes/Puns will ruin this place.


This "vulnerability" is extremely hypothetical and they have not given proof that it can be exploited in the field. Just conjecture and a demo in a totally unrealistic environment.


It's not hypothetical at all. Watch the demos; they clearly demonstrate data being exfiltrated. Additionally, the article mentions several times that yes, the bits/sec is quite low due to several factors. I don't think the author is exaggerating the situation at all.

Would you rather wait for the API to go live and then be abused to steal real data? I would much rather researchers discover and report on possible attack vectors long before they are enabled by default. "Trust by default" long ago proved foolish.


> Although in our proof of concept demonstrations we rely on the assumption that the light conditions do not change during the exfiltration phase, extending the demos to handle these situations shouldn’t be a problem

They say themselves that their demo is not real world and won't work in the real world, and then say it "shouldn't be a problem" to make it work.

Not to mention that it takes 20 seconds of flashing the user's screen to do the thing (how is that supposed to work without setting off alarm bells).

As I said, they have no proof of a real world vulnerability, only proof in a staged environment, and they readily admit it.
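The low-bandwidth point both sides are arguing over is easy to make concrete. The rates below are illustrative assumptions, not the researchers' measured numbers, but the arithmetic shows why even a few bits per second matters for small secrets:

```python
def exfil_seconds(n_bytes, bits_per_sec):
    """Time to leak n_bytes over a covert channel of the given bandwidth."""
    return n_bytes * 8 / bits_per_sec

# At an assumed 5 bits/sec: a 16-byte key needs ~26 s of screen
# flashing, while a 1 KB document needs ~27 minutes.
key_time = exfil_seconds(16, 5)    # 25.6 seconds
doc_time = exfil_seconds(1024, 5)  # 1638.4 seconds
```

A key fits in one "alarm-bell" window; bulk data plainly does not, which is roughly the disagreement in this subthread.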


> they have no proof of a real world vulnerability

So what. This "default allow" attitude is easily more damaging than any other source of security problems. You (or anyone else) cannot know all of the ways exposing new data could be exploited, or might already be exploited in ways that we are not lucky enough to know about.

Caring about security - which includes the future unknown unknowns you don't yet know to even look for - means minimizing what is exposed to the public attack surface to what is both needed (which does not include anything merely "wanted") and demonstrated/proven to have trivial risk with known limits.


Yes, multi-threading has high overhead to provide the illusion of parallel operation, but when all the cores are saturated you are in the same boat, whether you have 1 or 100.

The benefit to programs that don't use threading, and instead use an event loop with shared-nothing multi-process, is that they don't have that overhead when things are maxed out.

This is why virtually every high performance server (nginx, redis, memcached, etc) is written this way and things like varnish (thread per request) are multiples or orders of magnitude slower.

Funny to see people criticizing node.js for using the same architecture that all the best-in-class products use.
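The event-loop style described above (one thread multiplexing many connections via readiness notification, as in nginx/redis/node) can be sketched with the standard library. This is a minimal illustration, not any of those servers' actual code; `socketpair()` stands in for real network clients so it is self-contained, and names like `handle_read` are made up here:

```python
import selectors
import socket

# One selector, one thread, many "connections": the core of the
# event-loop/shared-nothing architecture discussed above.
sel = selectors.DefaultSelector()
results = []

def handle_read(conn):
    """Called only when conn is readable, so recv() never blocks the loop."""
    data = conn.recv(1024)
    if data:
        results.append(data.decode())

# Two connections, both watched by the same single-threaded loop.
pairs = [socket.socketpair() for _ in range(2)]
for server_side, _client in pairs:
    server_side.setblocking(False)
    sel.register(server_side, selectors.EVENT_READ, handle_read)

pairs[0][1].send(b"hello")
pairs[1][1].send(b"world")

# The loop: service whichever sockets are ready, nothing else.
while len(results) < 2:
    for key, _mask in sel.select(timeout=1):
        key.data(key.fileobj)

for server_side, client in pairs:
    sel.unregister(server_side)
    server_side.close()
    client.close()
```

The design choice is that no callback may block: since a single thread serves everyone, one slow handler stalls every connection, which is exactly the trade-off node.js critics and defenders argue about.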


Nile is great for this


So net neutrality applies to their networks too right?


If they were offering transit services as a common carrier, and enjoyed the associated protections, absolutely.

