A much more likely scenario is that every vehicle runs on closed-source software controlled by the state, which can halt a citizen's movement at will based on social credit or something like that.
If software is only used to create data, that's okay, like editing text files in Microsoft Windows. But computational software such as Wolfram being closed source bothers me a lot. There is no way to verify that the science you do is correct.
Quite often it gives you a result that you can then prove directly, or check with other tools. The benefit of MMA is that it has a lot of tools, a good interface, good documentation, and a large community.
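As a concrete example of "check with other tools", here is a minimal sketch, assuming SymPy and SciPy as the independent tools (any second CAS or numerical library serves the same purpose): take a closed form a CAS gives you and confirm it numerically.

```python
# Take a closed form that a CAS produced and confirm it numerically with an
# independent library: SymPy gives the symbolic integral, SciPy's quadrature
# is the independent check. (The specific libraries are just one possible choice.)
import math

import sympy as sp
from scipy.integrate import quad

x = sp.symbols("x")
closed_form = sp.integrate(sp.exp(-x**2), (x, 0, sp.oo))   # sqrt(pi)/2
numeric, _err = quad(lambda t: math.exp(-t**2), 0, math.inf)

print(closed_form, float(closed_form), numeric)
assert abs(float(closed_form) - numeric) < 1e-7
```

This doesn't prove the symbolic engine is correct, but agreement with an independent numerical method catches most practical errors.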
In practice, pretty much no one doing science has the expertise or time to completely verify the science they are doing - they are building on centuries of knowledge across many disciplines, and for the most part the community verifies each part as they build knowledge.
And certainly open source does not allow the vast majority of people "to verify the science you do is correct." They'd have to check the code, the compiler, the hardware, ensure no cosmic rays flipped bits during computation, and so on.
So I'd not worry too much about the closed source vs open source nature of it. It's a solid tool that enables lots of research.
The “cosmic rays” argument, to me, is inane. It simply doesn’t practically apply and is certainly not an argument against the benefits of open source code. You’re castigating the whole practice of code review, computer-aided proofs, automated theorem proving, etc.
Have you ever looked at the rates, or do you just dismiss it without looking? Note that the current rate is higher than older references suggest, since feature sizes have shrunk and lower-energy events can flip bits on newer hardware.
Scientific computation, especially at the level of most researchers, is affected by cosmic ray bitflips, without question.
Since the OP was complaining about not being able to check everything ad absurdum, this effect is certainly on the table. If a researcher is ignorant of it, it's more likely to affect their research than the difference between closed and open source.
It's also why good researchers, who know this is a real effect, try to run a computation with multiple methods at different times, until they feel the consensus on the calculations is robust enough.
If you've never done it, write a program to watch memory for bit flips, and be amazed.
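A crude version is only a few lines. This sketch assumes non-ECC RAM and that the OS keeps the buffer resident; a serious test harness would pin the memory and checksum in chunks.

```python
import time

# Fill a large buffer with a known pattern, then rescan it periodically;
# any byte that no longer matches was flipped by *something* (cosmic rays,
# marginal DRAM, heat...). A flip in either copy shows up as a mismatch.
# Expect to wait days, not minutes, on a single machine.
PATTERN = 0xAA                      # 0b10101010, easy to spot single-bit flips
SIZE = 1024 * 1024 * 1024           # watch 1 GiB; more memory = more chances

reference = bytes([PATTERN]) * SIZE
buf = bytearray(reference)

while True:
    time.sleep(3600)                # rescan hourly
    if buf != reference:            # cheap C-level comparison
        for i in range(SIZE):
            if buf[i] != reference[i]:
                print(f"bit flip at offset {i}: {buf[i]:#04x} vs {reference[i]:#04x}")
        buf[:] = reference          # reset so only new flips are reported
```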
Here's an intro - do a back of the envelope calculation and see if you still think these events are rare enough that they don't affect common scientific work.
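For illustration, something like this, with an assumed round-number error rate plugged in; the point is how the expectation scales with memory size and runtime, not the exact figure.

```python
# Purely illustrative: the FIT rate below is an assumed round number, not a
# measurement. Published soft-error rates span orders of magnitude depending
# on hardware, altitude, and ECC, so substitute whichever figure you trust.
FIT_PER_MBIT = 1000          # assumed: failures per 1e9 device-hours, per Mbit
RAM_GB = 64                  # a typical research workstation
HOURS = 24 * 7               # one week-long computation

mbits = RAM_GB * 1024 * 8                             # 64 GB -> 524288 Mbit
expected_flips = FIT_PER_MBIT * mbits * HOURS / 1e9   # ~88 with these inputs
print(f"expected flips over {HOURS} h on {RAM_GB} GB: {expected_flips:.1f}")
```

Even if that assumed rate is off by an order of magnitude, the expected count over a week on a large-memory machine is not obviously negligible, and ECC only covers you when it's present and the error is correctable.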
>You’re castigating the whole practice of code review, computer-aided proofs, automated theorem proving, etc.
No I'm not. Those are but one avenue for reducing the probability of error during computation. All of those only ensure that the code part is solid - there is an entire other world on the physical side that needs incredible engineering: noise reduction, error correction, defect mitigation, thermal issues, quantum effects, physical data decay, memory leakage, and so on.
I think by focusing only on the code aspects, you miss a large part of what it takes to ensure modern computing is accurate.
And how many people doing "science" do code review, computer-aided proofs, or automated theorem proving to verify their code is correct? Very, very, very few.
No, Microsoft did not adapt to GitHub. The first thing Microsoft did after acquiring GitHub was force you to log in to view commits. I felt like some huge corporation took over the park I visit daily and started charging me. It felt terrible.
My manager (a people manager) has the same opinion. But in reality he is afraid that people might question what he even does if everyone works from home. Companies are going to find out that people managers' jobs are really not that important.
In my company it's the managers telling people to work from home when possible, and the parents asking to come in at least every other day to get some hours of distraction-free work.
The example problem is unclear. Could you please describe it in more detail? Which DSP algorithm falls under the halting problem, and is there any database or general programming problem that falls under the halting problem?
Suppose your company is developing a framework for creating mobile apps. You work on optimizing the GUI renderer and are assigned a task to build a module that finds computationally equivalent render functions (or a company making a programming-assignment plagiarism system, or a duplicate config file detector when the config language is accidentally Turing-complete).
You may try removing whitespace, finding invariants, ordering the variables by type, converting for loops to while loops, applying tail call optimization, but your colleague always has a trick to make equivalent functions undetectable by your module. You were trying to solve the halting problem. Instead, you should use your CS knowledge and tell them it's impossible, but that they can have a non-optimal search. Perhaps the fastest way to complete the task would be to score similarities between the functions' machine code or ASTs.
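As a rough illustration of the AST route, assuming Python source as input and using only the standard library (canonical renaming plus a difflib ratio is just one possible normalization, not a definitive design):

```python
import ast
import difflib

def normalized_dump(src: str) -> str:
    """Parse source and dump its AST with variable/argument names
    canonicalized, so purely cosmetic renames don't change the score."""
    names: dict[str, str] = {}

    class Renamer(ast.NodeTransformer):
        def visit_Name(self, node):
            node.id = names.setdefault(node.id, f"v{len(names)}")
            return node

        def visit_arg(self, node):
            node.arg = names.setdefault(node.arg, f"v{len(names)}")
            return node

    return ast.dump(Renamer().visit(ast.parse(src)))

def similarity(src_a: str, src_b: str) -> float:
    """Rough similarity score in [0, 1] between two snippets."""
    return difflib.SequenceMatcher(
        None, normalized_dump(src_a), normalized_dump(src_b)
    ).ratio()

# A for-loop sum vs. an equivalent while-loop sum with renamed variables:
# the renames cost nothing, the structural rewrite still lowers the score.
a = "def f(n):\n    s = 0\n    for i in range(n):\n        s += i\n    return s\n"
b = "def f(n):\n    total = 0\n    i = 0\n    while i < n:\n        total += i\n        i += 1\n    return total\n"
print(similarity(a, b))
```

That is exactly the non-optimal trade-off: it's a similarity heuristic you can tune, not an equivalence decider, which the halting problem rules out.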
Asking for an email address is asking for a huge favor. It's not free. Just because a product took hard work to create does not give you the right to call it free when you ask for an email address. Let that sink in.
In India it is to fulfill the ego of professors, nothing more. Unlike in America, professors are paid less than in industry, and those who can't get jobs in industry become professors and treat students like slaves. I did not understand this while in college, but now I get it: it is a coping mechanism for incompetence. One good thing about corona is that it forces online education, like the recent MIT online classes; I hope this sets off a chain reaction and takes the monopoly on education away from traditional colleges.
> Unlike in America, professors are paid less than in industry, and those who can't get jobs in industry become professors and treat students like slaves
Just sharing experiences: I was in industry at the time of my education, which caused a huge backlash from the professors. These salty old men were dinosaurs, and although I wasn't at some prestigious company, I was still learning what it meant to run large-scale and highly-available web services - i.e., they were stuck in the 8/16-bit era and I was HUNGRY to learn the web.
Although I never felt like a slave, I felt like I was always a "lesser" to them no matter what. Coming to school with a fair amount of programming ability subverted them, as I had a ton of people calling me up or IMing me for homework help instead of going to their restrictive office hours.
I wasn't the best student, but I aced the tests even with horrid attendance, organized collaboration on Google Docs so everyone had a consistent study guide, and tutored half the class because I had better "office hours" that lined up with when college kids actually work on homework and assignments... they haaated me.
I'm pretty sure that in the U.S. professors are paid way less than what they could get in industry. They probably make at least six figures, but so does anyone in industry, especially with as much experience as a professor would have.
Depends on the field and on the individual. In many fields, the professors would be all but unemployable. And even in hot fields like CS, some professors would not be very useful. It's hard to remember now, but back in the 80s, a PhD in CS was almost a strike on your resume rather than a plus.
I have observed that when I am alone for three or more days without talking to anyone, I get depressed. Even having conversations with anonymous people on the internet over text chat reduces the depression a lot. Is there any scientific research on human contact over text and depression?
You can't change an asshole, but you need to be more of an asshole to an asshole in order to inspire respect among your peers and the people around you. In my career, before I got promoted I was already the go-to guy among my peers, because people knew I don't expect sympathy from an asshole and will push their ideas if I see merit in them.