
> Their value is going to stay limited if people don't want to actually use them.

Namely, in the case of PeerTube, content creators. YouTube is convenient because it comes with built-in monetization. You can probably expect loud objections (rightfully so) from some of them if you download their stuff from YouTube to upload it to PeerTube.

If you don't have the content creators, you don't have content consumers and you cannot bootstrap a network effect (some services did bootstrap a network effect with plain and simple piracy, though).

I believe the UX is secondary to available content. People do make the necessary efforts if they think the benefit is worth it.


Apparently 2 meters is enough: "A 2 meters (6 ft 7 in) high tsunami hit Chiba Prefecture about 2+1⁄2 hours after the quake, causing heavy damage to cities such as Asahi." (Tohoku 2011) [1]

WRT the comparison with hurricane waves, I assume they carry a lot less energy than tsunamis, because they are surface waves - caused by the friction of the wind on the water - whereas a tsunami wave is caused by the movement of a huge mass of matter.

[1] https://en.wikipedia.org/wiki/2011_T%C5%8Dhoku_earthquake_an...


People vastly underestimate the danger of a moving body of water in general, but especially when that water is where it isn't normally. Even a relatively tame storm surge picks up sewage, dangerous chemicals, debris, and confused wild animals.

And what does "modern" have to do with it, anyway?

Nope. That's the divine invisible hand of the market.

I wonder, if the device is equipped with a microphone and/or a webcam, does it mean that the school has the right to remotely activate them for "monitoring" purposes? It's not too far from what they did when the monitoring software sent the screenshots of an email that never existed.

And what if he had joked about stabbing his girlfriend/boyfriend? Would the school report him to the police? What would the police do in this case?


This is a substantially more serious scenario now that transcribing and analyzing audio / video content is so much faster & cheaper today. Previously some freak had to watch or listen to everything in nearly 1:1 time to eavesdrop.

I have worked extensively with this technology and have witnessed many of its pros and cons firsthand. I have seen it misused, but I have also observed it saving students' lives and preventing mass violence events.

A major point to consider in the public conversation is what happens after a tragic event occurs. The school district is often called out for ignoring the warning signs, not paying attention to things that could have prevented the event, etc. So the other side of abolishing this technology is that school districts no longer have those tools and public expectations should be adjusted accordingly. What really happens is that public opinion ebbs and flows between support and opposition, depending on what tragedies have happened near term.

The policy and legal frameworks used by US schools clearly state that school district staff are not allowed to remotely activate the microphone or camera on a student's device.

There's also legal precedence: https://en.wikipedia.org/wiki/Robbins_v._Lower_Merion_School...

When the Robbins case occurred, school districts everywhere took notice. In the organizations I've worked with since then, we no longer activate the microphone or webcam, regardless of the Chromebook's location. However, I can't speak for every school district, whose morals and ethics may vary greatly.

> It not too far from what they did when the monitoring software sent the screenshots of an email that never existed.

It did exist, but it was never sent. The software runs as an agent on the student device and inspects the DOM tree for text phrases it considers alert-worthy: self-harm, threats, drug use, etc.

> And what if he joked about stabbing his girlfriend/boyfriend? Would the school report him to the police? What the police would do in this case?

This entirely depends on the school and police personalities involved, but the answer is "possibly" or "probably", depending on the jurisdiction.

Regardless of the outcome, I think what's really important includes the following:

- There ARE bad actors employed in every school district! Many of them would love to spy on students, collect naked photos and share them.

- School districts need STRICT AND ENFORCED use policy and minimal "need to know" access for TRAINED district staff. No hand slaps. Termination of employment, and legal and criminal consequences are in order.

- Auditing flipped on for everything possible (for CYA, if anything). If school staff flips on a webcam, that should be logged somewhere that cannot be scrubbed. In the case of a webcam activation, I'd have it auto-notify key personnel and probably legal. Those audit logs should be reviewed often by multiple auditors -- preferably a third party. Audit events should be backed by extensive documentation, such as a help desk ticket.

- If possible, students should obscure the webcam when not in use to protect themselves. If feasible, I also suggest they get a cheap dummy mic off of Amazon and keep that plugged in.

If this type of product survives litigation, we need to move toward assurances of privacy (e.g. a verifiable Private Cloud Compute model), so this doesn't turn into another Flock situation where certain government entities may have a global/national single pane of glass.

I almost said "on-premises" there, but that would be a disaster because school districts don't patch their stuff.


> but I've also seen it save students' lives and stop mass violence events.

The saving lives thing is always the excuse for total surveillance. Trading away your freedom for security gets you neither.


Touche. I get that and agree. It's certainly a polarizing conversation.

I'm hoping the conversation and courts arrive at definitive guidance and regulations that preserve freedom, don't add to the surveillance state, and provide some kind of answer to the half or more of the population that expects school districts to surveil everything kids do on their devices (self-harm, harm, bullying, etc.).

It's a really weird experience to hear the same powerful people argue both sides. How do you expect us to do one without the other?

And again, it's... safe to assume there are a lot of bad actors in education where enforced safeguards are needed.


Just keep the spying to a minimum. Any spying on a kid or his family outside of school is off-limits. 1984 was a warning, not an instruction manual.

Already do, but that doesn’t help with the many more curious and nosy administrators out there, which is why you need regulations, enforcement, and auditing.

Making the spying impossible is better than any regulations or auditing.

With a separate camera and microphone, I just unplug them when not in use. For the builtin camera, I put a sticker over it.

I have zero faith in any software lockouts.


On a Linux-like system, one could block access using file permissions, e.g. chmod a-r /dev/snd/pcmC0D0c for a microphone. Probably less easy to do on Android, though.

It also relies on knowledge of a counterfactual situation. Was the guy arrested for a threat genuinely going to hurt people, or was it a dumb joke that was taken seriously by somebody snooping in a conversation they lacked the context to even understand?

From experience, this is generally easy to figure out. The ratio between dumb joke threats vs actual threats is something like >99.9%.

> The ratio between dumb joke threats vs actual threats is something like >99.9%.

This makes the situation ripe for false positives. If 99.9% of threats are jokes and you correctly identify joke threats as jokes 99% of the time, the misidentified ~1% is still several times larger than the share of real threats.

And while the difference may seem obvious to you and me, being able to perceive the difference probably becomes harder across cultural divides (such as between teachers and a younger generation of students.) Furthermore, any bitter teacher with an axe to grind can leverage "zero tolerance" rules and strategic ignorance to deliberately construe a joke as a real threat to get rid of a kid they don't like (I've seen this sort of thing happen.) The more surveillance there is, the more opportunities there are to catch somebody making an edgy joke in what they thought was a private conversation.


Thanks for your answer. The fact that the legal framework you suggest aimed at preventing abuse doesn't exist already is terrifying, when you think about it.

The situation is also deeply unfair: wealthy students can keep the device provided by the school in a box and use their own instead, while less wealthy students will have to use the school's device and be spied on. "But that's actually a good thing," some might say, "it's always the poor who cause trouble."

IMO, this spyware shouldn't have been there in the first place, even if it means that in some cases it could have prevented yet another shooting, or suicide, or drug abuse. The school should have the right to inspect the device anytime (when at school, not remotely) to make sure it is not being misused, and nothing more.

More surveillance won't make those problems disappear - quite the opposite, I believe - because learning that your classmate was suspended because he said the wrong words while talking to himself, and some forgotten microphone caught them, is really stressful or depressing, depending on your personality.


I just don't see how the pros outweigh the cons here. Even if the laws and policies were well written, enforcement seems impossible. As you've said, there are going to be bad actors sprinkled throughout any of these systems (and in a big enough system, this will always be the case). This power, in my opinion, is simply too great to be allowed. It will be abused. A lot.

I wonder if this minimal cell could be described instead as something between a bacterium and a virus. I am not a biologist, but IIRC viruses penetrate cells then hijack the cell's standard machinery to replicate themselves, until the cell explodes; sort of like a DNA/RNA injection exploit.


> I think the whole message would be more palatable if it weren't written as a decree including the dig on "retro computers"

Yes, and more generally, as far as I am concerned, the antagonizing tone of the message, which is probably partly responsible for this micro-drama, is typical of some Rust zealots who never miss an occasion to remind C/C++ programmers that they are dinosaurs (in their eyes). When you promote your thing by belittling others, you are doing it wrong.


Summarized translation:

Following propaganda from the Ministry of the Interior, several articles were published in the press about GrapheneOS, which is described as a solution for criminals because it allows them to hide things.

La Quadrature du Net [similar to the FSF with regard to defending users' rights] argues that the purpose is of course not cybercrime, but to secure and protect the privacy of its users.

The head of the anti-cybercrime brigade of Paris threatens to sue the developers of GrapheneOS if connections with organized crime were to be found.

The government has repeatedly tried to extend cyber-surveillance before. They are trying to use a law designed to fight drug traffickers to force backdoors into services that use cryptography, such as Signal or WhatsApp, without any success for the moment.

---

So, it's a threat before having any proof. They also mention the arrest of Pavel Durov, who was arrested because Telegram failed to answer legal requests, which was then construed as complicity with criminals using Telegram - but that's obviously a very different case.

But of course, if they succeed in forcing backdoors, criminals will just use other ways to communicate (it doesn't matter whether those are legal or not because, well, they are criminals...) or tricks; for instance, back in the day when (analog) phone calls could be wiretapped, they were already using code words. They could use e.g. steganography tomorrow.

But we will be left with backdoors that are an unacceptable compromise on security and privacy. This is a recipe for dystopia considering that far-right parties are getting stronger in Europe, including France.


TFA shows that most vulnerabilities have a "window of opportunity" smaller than one day. Are you anxious going into the weekend because a zero-day or a major bug could be made public on Friday evening?


Well, then you agree that the answer is yes. At the end of the article a 14-day window is mentioned but not dismissed, and the downsides are not mentioned.


Yes, I observe a 30-40x slowdown with that method on my interpreter. I was perfectly aware of the cost; Anton Ertl had shown back in the '90s that it wasn't the best approach even on the Pentium [1].

The trick is that you can regain 100% of the speed of C (or machine code) by native-coding the critical parts. It's pretty much the same cheating method as JIT or calling native code with an FFI, just manual.

In this regard, a simple, naive subroutine-threaded scheme makes it very trivial to do. Think Lua extensions, but even easier because you don't have dynamic types or GC memory. Seriously, when I hear that language X is "easy to extend" and I look at it... No, it is not.

I have come to the same conclusions as eForth "independently" (I have looked at many Forth systems before making my own, so there could be some influences), except I wasn't interested in compatibility with the standard, so I ditched the counted strings for C's ASCIIZ strings. This makes interfacing with C/C++ libraries as straightforward as you can get.

I don't quite understand why eForth goes for C++; I do see its value for parsing more complex languages, but Forth? I also don't see the value of multithreading. Cooperative schemes usually work well enough and are easier to handle. If concurrency is needed, multiple programs with RPC, shared memory or pipelines are options usually available (some options are more portable than others, though).

> So, the question is, how to encourage today's world of C programmers to take a look at Forth.

This is a huge mistake. If you make a Forth for others instead of doing it for your own needs, you are doing it wrong. Doing it for yourself, and not being tied by backwards compatibility because you have published your Forth and you don't want to lose your audience, leads to vastly different answers.

[1] http://www.complang.tuwien.ac.at/projects/forth.html

