It took me almost a decade to become licensed because I couldn't find any tests that weren't at 9 a.m. on a Sunday, an hour's drive away. Maybe that seems like a low bar, because it is!
But I got into ham radio to do interesting things, not to communicate. After all, why bother making sure you have line of sight to a repeater when you can just fire off a text message? I'm much more interested in telemetry, instrumentation, and remote control.
I don't think dropping the CW requirement was a bad move; as you mentioned, it can be difficult to make the trip to a testing session. When I got my General ticket, I was stationed down in San Diego, which does have a pretty big ham presence. Even then, it was still an hour-and-a-half drive to a testing session.
While it's a massive restriction, you can prevent spoofing (but not eavesdropping!) by signing your payloads. Maybe it's because ham radio folks and privacy-aware people form a Venn diagram with a lot of overlap, but it feels like perfect being the enemy of good.
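To make the signing idea concrete, here's a minimal Python sketch. It uses stdlib HMAC-SHA256 as a stand-in for a real asymmetric signature scheme like Ed25519 (which would let any listener verify without being able to forge); the key and the telemetry payload are made up for illustration:

```python
import hashlib
import hmac

# Hypothetical shared secret between stations. A real deployment would
# use an asymmetric scheme (e.g. Ed25519) so that listeners can verify
# signatures without holding forging capability.
KEY = b"hypothetical-shared-key"

def sign(payload: bytes) -> bytes:
    """Append an HMAC tag so receivers can reject forged payloads.

    The payload itself stays in the clear, so nothing is obscured --
    only authenticity is added.
    """
    tag = hmac.new(KEY, payload, hashlib.sha256).digest()
    return payload + tag

def verify(message: bytes):
    """Return the payload if the 32-byte tag checks out, else None."""
    payload, tag = message[:-32], message[-32:]
    expected = hmac.new(KEY, payload, hashlib.sha256).digest()
    return payload if hmac.compare_digest(tag, expected) else None

msg = sign(b"TEMP=21.4C BAT=3.9V")          # telemetry stays readable
assert verify(msg) == b"TEMP=21.4C BAT=3.9V"
tampered = msg[:-1] + bytes([msg[-1] ^ 1])  # flip a bit in the tag
assert verify(tampered) is None             # forgery rejected
```

The point of the design is that the message content remains plain text (nothing is obscured on the air), while receivers can drop anything not signed by a known key.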
Completely different threat model. I want to send messages to my friend that are private, and it would be awesome if we could do it p2p on a mesh net that doesn't require any centralized infrastructure.
Modification of those messages isn't something I'm worried about. It's not perfectionism, it's a different use case.
Yes, but ham radio is all about being open and a community of people communicating together.
And if that's the use case you want, you should look elsewhere... At least, that's what the people leading the ham radio community and the ARRL say.
I'm not sure I agree with that, but I'm not sure I disagree either. I certainly think the community aspect is less important today due to the internet, but it's probably still crucial.
Yeah, I definitely think they come from an era where we all didn't have a pocket communicator that plugs us into any community at will.
When I flung packets across the bay with a friend, we used cell phones to figure out antenna positioning and alignment. Violating the spirit of these rules, imo, but hard to say.
> Yeah, I definitely think they come from an era where we all didn't have a pocket communicator that plugs us into any community at will.
Makes sense. And my cohort comes from an era when we can talk to anyone, anywhere, but the channels are controlled and watched. The ability to communicate isn't special anymore, but the ability to communicate independently and privately is. We don't want communication, we want known-good (dare I say, safe?) communication channels.
Do you think that ham radio couldn't be? Or isn't? It sounds like what you really want is privacy, authn/authz, and decentralization. And TCP/IP is about as useful a "PHY" layer for your application as ham radio is.
Plus, solutions like what you're describing require a relative ease of use - unless you only want to talk to the 17 other people in your geographical area who have similar technical backgrounds.
The whole idea of long range is that my geographical area isn't a limiting factor. I'd be on a 20kbps VPN LAN with the 200 other people in the world who have similar technical backgrounds - especially if we were running this over shortwave.
And that's just the beginning, before people put in the effort to make it more accessible.
TCP/IP isn't the point, it's the encryption that lets me send encrypted, secure emails to my friends over a network that only wideband jamming or a natural disaster can take down.
Not to mention the amazing software we could build in a distributed P2P way now that we have a massive amount of research into things like CRDTs and consensus algorithms.
We could have a leaderless IRC server, leaderless peer discovery and routing, IP/DNS controlled by consensus.
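As a taste of what "leaderless" can look like, here's a minimal grow-only counter, the textbook G-Counter CRDT, sketched in Python. This is illustrative, not a protocol; the node IDs are made up, and a real mesh would also need transport and peer discovery:

```python
class GCounter:
    """Grow-only counter CRDT.

    Each node only increments its own slot; merging takes the
    element-wise max, so replicas converge to the same value no matter
    what order updates arrive in -- no leader or coordinator required.
    """

    def __init__(self, node_id: str):
        self.node_id = node_id
        self.counts = {}  # node_id -> count observed from that node

    def increment(self, n: int = 1) -> None:
        self.counts[self.node_id] = self.counts.get(self.node_id, 0) + n

    def merge(self, other: "GCounter") -> None:
        # Commutative, associative, idempotent: safe to apply in any order.
        for node, count in other.counts.items():
            self.counts[node] = max(self.counts.get(node, 0), count)

    def value(self) -> int:
        return sum(self.counts.values())

# Two replicas diverge, then sync in either order and agree.
a, b = GCounter("a"), GCounter("b")
a.increment(3)
b.increment(2)
a.merge(b)
b.merge(a)
assert a.value() == b.value() == 5
```

The same merge-by-max idea generalizes to sets, registers, and sequences, which is the building block the leaderless IRC/routing ideas above would lean on.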
Yes, but ham radio is all about being open and a community of people communicating together.
My question is whether this kind of regulated openness is more worthy of the spectrum assigned to hams than some alternative scheme might be. It's certainly not a very efficient use of spectrum.
>Yes, but ham radio is all about being open and a community of people communicating together.
Maybe on paper. Where I live it seems to be about old grouchy white guys gatekeeping and LARPing that they would be of any value in a natural disaster.
When they are the de facto form of communication for a significant percentage of the population, it starts to go from "their rules" territory to "society's rules".
Can you imagine being cut off from the phone system 40 years ago because you were selling answering machines?
> Can you imagine being cut off from the phone system 40 years ago because you were selling answering machines?
Yes, actually. It was illegal to connect any equipment besides Bell's to the US telephone system. Not only would you be disconnected and possibly fined by your phone provider for doing so, but American law also made it illegal to sell such devices for use on the phone network. If you wanted to use an answering machine not sold by Bell, you had to get it custom-rewired by Bell and pay a monthly rental fee for the privilege:
> AT&T, citing the Communications Act of 1934, which stated in part that the company had the right to make changes and dictate "the classifications, practices, and regulations affecting such charges," claimed the right to "forbid attachment to the telephone of any device 'not furnished by the telephone company.'"
> Initially, the Federal Communications Commission (FCC) ruled in AT&T's favor. It found that the device was a "foreign attachment" subject to AT&T control and that unrestricted use of the device could, in the commission's opinion, result in a general deterioration of the quality of telephone service.
It was challenged and the seller of the amplifier device ultimately won in federal court: https://en.wikipedia.org/wiki/Hush-A-Phone_Corp._v._United_S... (Even then, you couldn't actually electronically connect a device, you could only acoustically couple it. Direct connection of modems wouldn't be legal until the 1980s.)
> Even then, you couldn't actually electronically connect a device, you could only acoustically couple it. Direct connection of modems wouldn't be legal until the 1980s.
In the late 90s, I remember watching the scenes in WarGames (which came out in 1983) where Matthew Broderick's character is using a modem where you had to place the phone handset on top of it, and thinking: why would you ever design a modem that way?
And of course the reason was to work around this stupidity.
> It was not until a landmark U.S. court ruling regarding the Hush-A-Phone in 1956 that the use of a phone attachment (by a third party vendor) was allowed for the first time; though AT&T's right to regulate any device connected to the telephone system was upheld by the courts, they were instructed to cease interference towards Hush-A-Phone users. A second court decision in 1968 regarding the Carterfone further allowed any device not harmful to the system to be connected directly to the AT&T network. This decision enabled the proliferation of later innovations like answering machines, fax machines, and modems.
From [2]:
> After the ruling, it was still illegal to connect some equipment to the AT&T network. For example, modems could not electronically connect to the phone system. Instead, Americans had to connect their modems mechanically by attaching a phone receiver to an acoustic coupler via suction cups.
It's not entirely stupidity. Acoustic modems also made sense for portability. Reporters in the 80s used things like a TRS-80 Model 100 plus an acoustic modem to send stories back to the office over public telephones, rather than having to hunt down a phone jack somewhere.
Australian here. I remember that scene, and I've always wondered what that was about too. I was using modems in the mid 90s and I never saw anything like that cradle device! Thank you. That makes horrible, awful sense to me now.
+1, and this mentality persisted all the way through the Lucent bankruptcy (prior to which their business model was to sue everyone who had ever talked on a phone), right up to the previous presidential administration, which tried to place former telco execs on FCC regulatory boards to rewrite rules that aren't really rules.
But what they don't address is that HBOMax is already the worst streaming app on my TV, and therefore it doesn't matter how much money AT&T throws at politics. Their stuff sucks because they're AT&T, not because of some political misfortune.
The existing AT&T is just a zombie formed from 5 of the original 8 fragments of the old AT&T. I can't help but wonder how many former coworkers at AT&T in 1981 were reunited in 2014 without ever having left the fragments.
At least some of the fragments were doing a lot of pushing out of older employees in the 2000s, so probably not a whole lot left, but I like the concept.
No, it's the previous AT&T with some parts factored out (most notably, Verizon). It also apparently managed to re-merge several parts of itself over the years, which is mind-boggling (how that didn't trigger immediate court action is beyond me): https://en.wikipedia.org/wiki/Breakup_of_the_Bell_System
Sounds like you can imagine the law permitting something horrific & ghastly.
I too can imagine that. But the restraints the law allowed to be imposed on our freedom sound absurd, sound outlandish to me now. We were in a situation that de-legitimizes the law & the legal system, and eventually we fixed that.
> Can you imagine being cut off from the phone system 40 years ago because you were selling answering machines?
Also, sure, it was illegal to sell connecting equipment, but AT&T didn't go nearly as far as what we see today. They didn't do anything this bad. The question posed wasn't about the legality or the ability to interoperate and make devices.
The question was about the repercussions. Hush-A-Phone and other companies did not have their corporate phone numbers dropped, and did not lose their ability to make phone calls, when they started making a device AT&T didn't like. AT&T took them to court and tried to get them to stop making devices, but it didn't retaliate by kicking their corporate entity off the network. AT&T also didn't search for people using the phone system to talk about other means of communication and kick them off the phone network (something we've seen repeatedly, recently with Mastodon, although those policies may or may not have been improved recently). Facebook is acting far more like a bully than AT&T did, in my view.
>> It was illegal to connect any equipment beside Bell's equipment to the US telephone system. Not only would you be disconnected and possibly fined by your phone provider for doing so, but American law also made it illegal to sell such devices as well for use on the phone network.
Which is exactly the thing that gave rise to phone phreaking and getting around the limitations of the Bell System.
"Exploding The Phone" by Phil Lapsley is a great book that examines these early hackers:
Directly-connected modems were an obvious threat to AT&T as this would enable high-bandwidth packet-based distribution over AT&T's network, including ultimately what we know as VOIP. AT&T fought packet-switched networks from the start:
Paul Baran: The one hurdle packet switching faced was AT&T. They fought it tooth and nail at the beginning. They tried all sorts of things to stop it. They pretty much had a monopoly in all communications. And somebody from outside saying that there’s a better way to do it of course doesn’t make sense.
But they didn't ban the seller from using the phone system; they banned people from attaching a different machine to the system. That's like Facebook trying to detect the extension and block it. It's similar to some websites blocking ad blockers: while it would not go down well with Facebook users, I think it's very different from what Facebook is doing here.
I wonder how much of the ban came from concern about device damage from poorly regulated voltages. Now FCC regulations alone protect against the worst of it, because anything that cheap also tends to output significant noise on reserved spectrum.
Yeah, I always argue there should be a "law of scale". When you start your own project, taking a risk and spending hours and hours working on something, it's fair for a single individual to have the right to make any call about what they're doing. But when it expands to thousands of workers and millions of users (or even much, much less), your responsibilities and reach cannot stay the same. Saying "I built it" is no justification. The growth, and the contributions that make something possible, users included, no longer support the logic of "my house, my rules".
This wasn't particularly related to this specific case, and visions and missions of companies should still be respected, but society does have a very warped concept of "property" when it involves their work or ideas.
Facebook isn't a monopoly, as evidenced by their recent outage resulting in 70M new Telegram users overnight.
Even so, as others have pointed out, the telcos did have arguably reasonable restrictions placed on what one could connect to the network.
But putting glorified web sites in the same class as the government-sanctioned monopolies that utilities tend to necessarily be is asinine. Your telco had to run physical wires across the land, your gas company physical pipes everywhere; there was no practical means of a free competitive market. It's a completely different situation.
Nah. Nobody needs to use FB to communicate, there are many dozens of available communication platforms, you can't even sign up for FB without a phone number anyway, this idea that FB is some critical communication infrastructure is totally false.
For some countries, Facebook, Instagram, and WhatsApp are the internet.
Official entities and companies use them as the only form of communication with citizens and customers.
This argument gives the platforms more credit than they're worth. It's been obvious for half a decade that social media is bad for mental health. I've cut it out of my life. I tell others to cut it out of their life. No one's under the impression that these platforms are good for anything. They're popular now, but they are not important.
Cigarettes are obviously bad for people's individual health, but we don't rely on individual responsibility to ensure children don't purchase themselves cigarettes.
I cut Facebook out of my life almost 5 years ago, before it was "cool" to do so. It's like junk food or cigarettes or anything else that is net bad for a person. I would say let people decide for themselves if they want to use it, and hope they make the smart choice of just saying no to Facebook and all the toxicity that comes with it.
They are important because they contain a significant portion of many people's address books. When Facebook was offline a few days ago, I had no way of reaching about two thirds of my contacts, and I'm someone who's made a significant effort to move off of Facebook. For some people I wanted to contact that day, the only way to reach them would have been to ask mutual friends for other contact details; for a few, I either don't have mutual friends, or our mutual friends were themselves only reachable via Facebook. If legislation aims for some form of "interoperability", the main condition should be that, if Facebook were to disappear again, I could still reach all of my Facebook contacts via another network.
I loathe Facebook and am hesitant to take its side on any issue. But if you cannot be bothered to ask your "contacts" for a phone number, email address, Telegram, whatever, I don't see why it is Facebook's responsibility to ensure you have access to these people 24/7.
That's a social issue, not a Facebook one. Interoperability is an insane ask that has absolutely no precedent, and I say that as one of the biggest FOSS enthusiasts this side of the Mississippi. There's simply no way that the United States government could force a private company's hand like that, and even if they did the fallout from that would be insane. Where do we stop with interoperability? Do all browsers need to share the same history storage format? Do all cloud storage providers need to use the same app? Do all of us need to use the same operating system, communication protocols and news outlets?
No, because we're different people. Some people are drawn to Facebook's firehose feed, and there's not really anything you can do to stop them in a free world. It's a disgusting, albeit perfectly legal exchange of goods and services. Microsoft and Apple fought long and hard to make sure consumer protection laws like that never saw the light of day.
>When they are the de facto form of communication for a significant percentage of the population, it starts to go from "their rules" territory to "society's rules".
Facebook is nowhere near the de facto form of communication for a significant percentage of the population, as evidenced by the fact that the world didn't crash to a halt when it went down a few days ago. It's merely popular, but being popular doesn't mean it controls society or dictates its rules.
>Can you imagine being cut off from the phone system 40 years ago because you were selling answering machines?
That would be a valid comparison if Facebook owned the infrastructure of the internet, but it doesn't. It's trivially easy to communicate without Facebook.
This is like saying if you don't like the rules you can build your own multinational telephone network. There's a reason telecoms are subject to common carrier rules and I don't see why tech monopolies should be any different.
The crucial difference is the lack of a right of way. That is what creates an actual monopoly, as opposed to the degradation of "monopoly" to mean "but it is big and I don't like it!"
My understanding of common carrier rules is that they want to avoid a situation where a railroad or telephone operator who controlled the only available line could charge exorbitant rates to customers who had no alternatives. I don't really see how the same concern is true for Facebook - we have lots of options to disseminate information online.
I think you are seriously and intentionally misunderstanding the point. So far it was completely fine for Facebook to ban whoever they wanted and it was justified by them being a private company. Anybody who complained about it was told that they can build their own social network/cloud provider/payment provider.
It's fine if you're a small or medium size business that commands at most single digit % of the market. Facebook dominates ad spending and reach to the point that you can't just build your own, because they have a de facto monopoly/oligopoly over digital ads.
Let me ask you this: do you think Apple should be allowed to ban whoever they want from their platform justified by them being a private company? If you say no, then you should also say no to Facebook being allowed to do so. Otherwise you're just twisting the facts to support your political position.
Different people on the Internet will say different things. You can't really assign one collective motive to everything on the Internet and then say it's hypocritical.
And your own social network will fail because of network effects. If Facebook can be as terrible as they have been and retain their users, it's really because of their users that they're being propped up with a successful business. I gotta say at that point even I start thinking they owe their users more than a free market exchange would imply.
Not to mention we're talking about them sending a pretty formal legal threat. Wouldn't your philosophy in this case be "if you don't like their browser extension, don't use it"?
If you're building a new social network today, it makes sense to tap into an existing social graph so you can bootstrap your network with an existing ecosystem. Michelle Lim made a great case for this in her post here:
> And your own social network will fail because of network effects
Speak for yourself. Not everything needs to be 'planet scale', I run a few social networks and they do just fine.
And I agree with Facebook in this case, if you have someone come into your house with the sole intent of burning it down, of course you're going to kick them out. It's no different than dealing with trolls or other bad actors.
Burning down Facebook? What on Earth are you talking about? It makes it easier for users to remove their own accounts across multiple services. It's a common interface to features the social networks themselves provide. This is the opposite of a bad actor.
I certainly think the same argument applies, FWIW - I don't think Apple is being reasonable if they ban people from services for fixing their own devices or other people's devices. Not sure if you were implying I would feel differently about that instance.
If you don't like the rules of English you can always invent your own language - sure you won't be able to talk to anyone but isn't that freedom enough?
All these large technology platforms are as ubiquitous as utilities, as powerful as governments, and as unregulated as can be. Their network effects and access to capital give them unusually strong protection from competition, and also the ability to just copy smaller competitors with impunity. After all, what legal action could a cash-tight startup take against a behemoth with a war chest in the tens of billions of dollars? Given their size, scope, and the lack of healthy competition, they need to be reined in. We need to treat social media platforms like we treat telecom services, as common carriers. And we need to treat other large tech platforms as public utilities as well.
Sure, but that has literally no impact on the incentives that created this problem in the first place. It's not related to volume or frequency, and if anything less frequent and smaller shipments might exacerbate this issue in the current system. Supply chains are counter-intuitive like that sometimes.
I'm not entirely convinced that fully disconnecting is necessary, although it is nice. Instead, I take an old phone without any work stuff on it, and go moto camping. Occasionally, messages will come in from friends and family, and I can respond to them.
But the most important thing, imo, is not attaching yourself to outside responsibilities for a few days.
I started a project that I hope to maybe turn into a product (a FOSS/community-edition hybrid approach). One of the tech stacks I evaluated was Mono-based, using C# and the .NET libraries.
I ended up turning down that choice due to a couple of factors, but one of them was the glacial pace of the .NET process and development. It just didn't look or feel like a healthy ecosystem if you weren't paying Microsoft $$$$. Just a ton of small warning flags around the community, the stack, and the maintenance of core projects.
C#, .NET, and the associated ecosystem don't feel good to work with; they're out of date and in desperate need of modernization. The alternatives (Spring, Flask/Falcon, Lightbend) all seemed much more modern and easier to work with.
I actually have a concrete example of this: one of the features of my project is that you can define YAML-based configuration documents and share them. Using something that was easy to write for small but coherent configurations was crucial; XML was right out. The .NET ecosystem's YAML support is not great, feels kind of janky, and YAML validation is a paid product! That's completely untenable for something that's still in the weekend-work phase. And it wasn't just that: tons of things I had become accustomed to in Python were either incredibly immature or paid. Not to mention that the built-in build system was horrific, confusing, and weakly documented. It's probably a lot easier when you can just write a check to a contractor to set up a template for your team, but I don't have that luxury. Sure, some of this is on me, but I wanted to get started building, and the .NET ecosystem repeatedly got in my way until I was forced to give up.
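For contrast, the kind of config validation being described is a few lines of free, stdlib-only Python once the document is parsed (PyYAML would handle the parsing itself; the schema and field names here are hypothetical, just to show the shape of it):

```python
# Hand-rolled check of an already-parsed config document against a
# tiny schema. Field names ("name", "retries", "endpoints") are made
# up for illustration; a real project would likely use a schema
# library instead of this sketch.
SCHEMA = {"name": str, "retries": int, "endpoints": list}

def validate(doc: dict) -> list:
    """Return a list of human-readable problems; empty means valid."""
    errors = []
    for field, expected in SCHEMA.items():
        if field not in doc:
            errors.append("missing field: " + field)
        elif not isinstance(doc[field], expected):
            errors.append("%s: expected %s, got %s" % (
                field, expected.__name__, type(doc[field]).__name__))
    return errors

assert validate({"name": "demo", "retries": 3, "endpoints": []}) == []
assert validate({"name": "demo"}) == [
    "missing field: retries",
    "missing field: endpoints",
]
```

Nothing sophisticated, but it's the sort of thing that, in the Python world, never shows up as a line item on an invoice.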
So, upon hearing that the .NET Foundation is spending all of its time generating stacks of bureaucracy and causing internal drama, I feel like I made the right choice. At this point, it's unlikely I'll re-evaluate .NET for future projects unless I hear massively good things (like what happened with Java, which took a decade).
> C#, .NET/FCL, and the associated ecosystem doesn't feel good to work with, out of date, and in desperate need of modernization.
I recently built a company over the past year with .NET (after not using it for 5+ years) and felt the total opposite. I suspect you may be referring to the old .NET back when you needed Mono to run it on anything but Windows? .NET Core has been out for a few years and is an amazing improvement, the APIs (especially for web dev) feel very complete and modern, and there have been a lot of neat language features added in the recent major versions. The old .NET was definitely clunky though for targeting anything but Windows apps or servers.
Not sure what you are referring to for the paid products: .NET Core is now open source and even accepts PRs. Although there is still no official YAML support, so maybe the third-party library was paid? JSON is a great option these days, and .NET's new JSON parser is really fast and uses much less memory. (Years ago C# got support for the new Span&lt;T&gt; and Memory&lt;T&gt; types, which allow type-safe access to the underlying memory, making serialization much better since there isn't a bunch of copying and allocating involved any more.)
This is our experience too. We have 5 software engineers maintaining 3 mobile apps, 1 desktop app, 1 SDK product, and 1 web app (Blazor), all with a shared C#/.NET code base that brings in about $6M annually if you include the associated hardware product revenue. I'm not aware of many other tech stacks that can "natively" integrate with hardware/sensors and get that level of code sharing maintained by such a small team.
How is your experience with using Blazor and also creating the mobile apps (not even sure what .NET tech is used for that these days)? I maintain a web and mobile app so using C# across all platforms would be amazing, but I thought Blazor was still too new so I decided to use TypeScript & React for the web and mobile apps.
While Xamarin has some pain points it definitely satisfies a market need for “native” cross platform development. My company uses it quite a bit.
With .NET MAUI on the horizon as the evolution of Xamarin it has a future. Although I do wonder if the Blazor (mobile bindings) might end up having the brighter future. Kind of seems like the back up plan to appeal more to the HTML/CSS web devs who don’t want to adopt/learn the XAML UI markup.
Sadly (or luckily, for the people who haven't had any contact with the ecosystem), you must have missed the semi-recent open-source developments around .NET Core. It's open, has a very healthy community, and is well supported by Microsoft.
Sure, back in the days when you had to rely on Mono for cross-platform work, this is exactly the answer you'd expect from anyone, but it couldn't be further from the truth nowadays with the modern runtime and ASP.NET Core.
I can't refute your examples, but I think, on one hand, C#/.NET has a lot of modern stuff you maybe didn't see, which you'd wish other languages had. The language has great features (e.g. properties, the collections/concurrent collections libraries, System.Linq, and Tasks, to name some). VS has great profiling tools (CPU/memory) if you know how to use them. Package and dependency management were bad, but I believe they got some needed improvements starting with .NET Core. Things like that.
OTOH, that said, I don't actually prefer C#/.NET, when I think of those techs I don't think of a great OSS ecosystem (more like marketing blog posts by Microsoft MVPs, as opposed to technical deep dives by passionate engineers). I also think of github repos with at least 10x more open issues than stars or forks. These techs have great features, and they're used for successful projects/companies, but there's something about the fact MSFT owns them that makes me not want to use them at all, and just prefer golang/java/etc.
They're not Microsoft employees. It's just that MS has explicitly recognized them.
There's an irony that the award MS gives to passionate engineers in the .Net community has been completely misunderstood by someone outside of it.
If you want to see a great example of a passionate engineer who's been awarded the MVP, go read a few Rick Strahl blog posts.
Another one was Scott Hanselman, though I believe he joined MS like 10 years ago.
They admittedly muddy the waters by awarding the MVP to their own employees writing about their new stuff, but even they can be passionate and interesting. Scott Guthrie's time on ASP.NET is a good example.
When you join the MVP program, you are an evangelist. You are briefed, you are under NDA, and you start to have a very different view than the broader community. Not every MVP is like that, but while I like the content they create, I see them as seriously disconnected from many .NET developers and actual use cases.
I think that's because product teams at Microsoft more or less expect MVPs to function as evangelists for the things they want to communicate marketing-wise. Originally the program recognized community leaders and technical experts in specific MS technologies, but I feel things got hijacked somewhat along the way. Teams see MVPs as credible voices to their audience, so naturally the temptation to use them as a marketing voice is there.
Add to that the fact that MVPs themselves are passionate about those MS technologies, so for the most part they do want to repeat the marketing line.
I used to be an MS MVP, not for developer tech but for the IT pro side of things, and I definitely saw these "asks" in the messaging the product team sent us from time to time.
> The alternatives (spring, flask/falcon, lightbend) all seemed much more modern and easy to work with
I never thought I'd see Spring (or any of those) described as more modern and easier to work with than .NET, haha. Spring is the Java-land web framework bloatlord. It's quickly fallen behind and out of favor in recent years, with the hip crowd preferring micro-frameworks like Quarkus or Micronaut. Spring abandoned hot reload in favor of restarting the app context, which takes FOR EV ER on larger projects; .NET is releasing hot reload as a first-class platform feature as we speak.
The .NET runtime is one of the hottest pieces of action on the planet: AOT compilation, multi-core threading, async/await, Span&lt;T&gt;. Mono is pulling its weight too with compile-to-WASM. Then you have ASP.NET, Entity Framework, and a lot of other really great and performant building blocks.
Ah yeah, I was using Core, I think. Sorry, it's been a while; I kinda glossed over the acronyms and names over time. Since it was my own project, I really was trying to do things "the modern way".
Frankly, I don't remember. It's been a long time and using .net was a failed experiment.
>Why were you targeting mono then?
Well here's another issue with .net I ran into... really unclear guidance from numerous sources who all really wanted you to use THEIR stack and thus end up paying THEM.
I did multiple things, multiple attempts to make it work. I tried to do it "the right way" from various sources.
The reason I was targeting Mono at some point was that I wanted to run a microservice without paying Windows Server license fees, and as best I can tell, that was the root runtime I'd need to target.
Keep in mind that this was me, going in blind into a large and established ecosystem. I don't think going astray detracts from some of my bigger complaints, such as the poor quality of community open source libraries.
> Well here's another issue with .net I ran into... really unclear guidance from numerous sources who all really wanted you to use THEIR stack and thus end up paying THEM.
Meaning no offense, but it’s hard to see how you ended up so far off the mark. The .Net frameworks and runtimes (.Net Framework, .Net Core, and mono) have always been free. Likewise for the Asp.Net web framework you would have been using for web applications. I’ve been in the .Net ecosystem for over 10 years and I’m not aware of _any_ paid web stacks that would align with the quote above. Not to say that they don’t exist somewhere out there to extract money from foolish enterprise IT departments, but they aren’t even close to mainstream. If you did this experiment within the last 5 years it would have been extraordinarily hard for you to miss out on .Net Core, especially considering it was built to do exactly what you want - microservices with Linux as a first class citizen.
> really unclear guidance from numerous sources who all really wanted you to use THEIR stack and thus end up paying THEM
Sorry but at this point it is hard to believe you are posting in good faith and not trolling. Even back in the Windows-only days the runtime was free and there were free dev tools (VS C# Express). Mono was free too, and open source. I know of no paid runtime and I've been using the language and tools as a daily driver for a decade.
It's been possible to build micro-services in ASP.NET Core running on Linux (and in Docker containers) very easily for the last 3 years.
Agree, I decided not to reply - if, a year ago, you ended up on the path of "compile to run on mono", you didn't bother to read any of the .net5 documentation - and without checking, likely just the main download page for dotnet core. /shrug
Why would you start a project on a platform then build that product on a foundation of one of that platforms weaknesses instead of one of its strengths? This anecdote is not good evidence against the platform.
You're referring to the YAML point? The way I read it, it sounds like the poster only learned about the issue when they ran into it during development. That being said, a couple of specific complaints is probably not the best argument against a broader platform.
I remember XML support in .NET being excellent. The author could have used C# classes that would serialize/deserialize to/from the XML, then used built-in tools to generate an XML schema for the config files. The schema could then be consumed by vendors or other connected projects using .NET and have perfect, code-first, statically typed interop. You'd never be writing XML itself or the schemas or the serialization/deserialization code. You can't get any of that with YAML, it doesn't support that sort of safety. You'd have to manually implement all the safety checks and serialization/deserialization for a worse end result. Of course that would be a bad outcome.
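A rough sketch of that code-first workflow, from memory (the `ServiceConfig` shape and names here are invented for illustration, not from the thread): you define a plain class, and `XmlSerializer` handles both directions without any hand-written XML or parsing code.

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

// Hypothetical config shape -- illustrative only.
public class ServiceConfig
{
    public string Host { get; set; }
    public int Port { get; set; }
}

public static class ConfigXml
{
    // Round-trip the typed object through XML; the serializer derives
    // the element names from the class, so code and XML can't drift apart.
    public static string Serialize(ServiceConfig cfg)
    {
        var serializer = new XmlSerializer(typeof(ServiceConfig));
        using var writer = new StringWriter();
        serializer.Serialize(writer, cfg);
        return writer.ToString();
    }

    public static ServiceConfig Deserialize(string xml)
    {
        var serializer = new XmlSerializer(typeof(ServiceConfig));
        using var reader = new StringReader(xml);
        return (ServiceConfig)serializer.Deserialize(reader);
    }
}
```

As I recall, the schema half can then be generated from the same classes with the `xsd.exe` tool (or `XmlSchemaExporter`), so the XSD handed to vendors always matches the code.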
the .net ecosystem is somewhat pay to win - it's like the apple store of the devtool world, clients are actually willing to pay. if your company has the funds, you have access to some of the best tools in development, but you can get pretty far with its free options too.
C# is quite a nice language to work with. historically projects were very enterprisey oop monsters but cleaner functional paradigms are trending in the .net space.
You can absolutely write in a functional style. It does not have to be the enterprise OOP monsters of the past.
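A small sketch of what that functional style can look like in modern C# (sample data and names invented for illustration): records, LINQ pipelines, and switch expressions instead of mutable class hierarchies.

```csharp
using System;
using System.Linq;

// An immutable record instead of a mutable entity class.
public record Order(string Customer, decimal Total);

public static class Report
{
    // Pure function: total spend per customer, sorted descending,
    // expressed as a LINQ pipeline with no intermediate mutation.
    public static (string Customer, decimal Spend)[] TopSpenders(Order[] orders) =>
        orders
            .GroupBy(o => o.Customer)
            .Select(g => (Customer: g.Key, Spend: g.Sum(o => o.Total)))
            .OrderByDescending(t => t.Spend)
            .ToArray();

    // A switch expression with relational patterns (C# 9+)
    // instead of an if/else chain.
    public static string Tier(decimal spend) => spend switch
    {
        >= 1000m => "gold",
        >= 100m  => "silver",
        _        => "bronze",
    };
}
```

None of this needs an interface, a factory, or a DI container; it's just expressions over immutable data.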
Apart from something radical such as adopting a Hindley-Milner type system, I'm not sure it can support functional programming much better than it already does.
I think most people simply oppose the idea of a static type system, but my experience is that static type checking is an absolute time saver in the long run for large, long-lived projects. Significantly.
This list to me reflects a different element of the .NET ecosystem: .NET teams don't just buy into .NET, they also often buy whole hog into a grab bag of other MS solutions alongside it.
Are Windows Server, SQL Server, Azure DevOps, Teams, Chocolatey, and PowerGrep really a better dev experience than Linux, Postgres, Github, Slack, apt, and actual grep?
Many are debatable, but SQL Server & Windows Server are honestly a huge pain to deal with, and on top of that cost a boatload in licensing fees. Even Microsoft has decided Azure DevOps is a waste of time post-GitHub acquisition, and it is apparently eventually going away. Teams honestly sucks in comparison to Slack.
That is seriously outdated in 2021. A modern .NET developer uses Postgres, SPAs like everyone else, VS Code notebooks, VS Code, etc.
Visual Studio or JetBrains Rider is the only thing a professional developer needs, and it's a tiny investment for a professional. The rest of the products I haven't seen used in new projects for years.
It's not, although Microsoft absolutely tries to make it much easier to write/deploy/maintain C# in an all-Azure tech stack. That's good or bad, depending on whether you have other reasons to be on Azure.
I'd say Visual Studio (not VSCode) is pretty much the only not-really-optional, not-really-free dependency. I suppose that means you also need to buy Windows.
Personally, while I use(d) C# for work and I think it comes with a solid toolchain and I have no big complaints, I have never chosen it when I could choose the stack from the beginning.
Closest I can think of is the days of yore when .NET component licensing was a thing, encountered frequently enough that it was mentioned prominently in books. It was useful for a lot of 3rd-party developers that wanted to provide either tooling or runtime libraries to IT shops. UI toolkits and other 'specialized' libraries would use this mechanism as a way to keep some level of IP protection.
I ran into this scheme in the wild with Devart's DotConnect for Oracle; at the time it was worlds better than the free 1st party providers, and came with some interesting T4 templates for generating models for either EF or their special adaptation of L2S, and MVC CRUD HTML generation. Fairly cheap for an unlimited site license too.
it's been about a year but it was a death by a thousand papercuts kinda thing.
I want to add that I was not an experienced python developer at the time - I'd used it for a handful of simple automation scripts. My day job was writing scala.
- targeting mono under nginx on linux, from within windows was really difficult, I could never quite get it to work
- no support for monodevelop on windows - cross platform was crucial to me and it just got so onerous. And maybe it's just me, but I was really hitting a lot of frustration with visual studio.
- i wanted to use both yamlschema (broken at the time) and jsonschema (paid)
- msbuild is abysmal with useless error messages. I struggled to run a simple shell script during the build process.
- a couple of libraries I tried were broken or unmaintained. The .net ecosystem seems to have very low-quality community-maintained tools.
- no clear guidance around using DAPR without paying microsoft (which i guess is fair, but being vendor locked before I even start is a rough look)
- ... and that's all I can remember. There was more.
I fully acknowledge that some, if not all, of those complaints may be self-inflicted. I can't say that, for my first complicated project in python, it's been completely smooth sailing.
However, with python/falcon, it didn't feel like I was fighting with the ecosystem to define a package, import the package into a microservice codebase, and then get the service running with full test coverage. It didn't feel like I was fighting to bring a third party library in, verify it's actively being maintained, and start using it.
Getting a simple microservice hello world with authn/authz validation and a reasonable packaging system set up in python took me about as much time as it took to give up on msbuild.
I've found bootstrapping python microservice development on windows to be quite well supported. I didn't find that with .net.
I think the only time I've found people this far off the main track is users who only know C#/mono, coming from Unity Engine.
> targeting mono under nginx on linux...no support for monodevelop on windows
Because you should have never been anywhere near mono.
Even a year ago, if you followed the main .NET documentation it would have directed you to install the .NET 3.1 or 5 SDK on to your linux environment and get setup with VSCode. It's the first step in the hello world tutorial: https://dotnet.microsoft.com/learn/dotnet/hello-world-tutori...
> no clear guidance around using DAPR without paying microsoft (which i guess is fair, but being vendor locked before I even start is a rough look)
Dapr is not a Microsoft product. Microsoft does not offer commercial support for it. There's docs for the .NET library on the Dapr website, along with various community support channels available.
> I've found bootstrapping python microservice development on windows to be quite well supported. I didn't find that with .net.
Bootstrapping a worker service or ASP.NET Core on windows is either a few clicks in VS, or a one line command via the dotnet CLI.
Have you tried using .NET Core? It's a pretty thorough reworking of .NET Framework that is officially supported by Microsoft on Mac, Linux, and Windows, and pretty much all of my experiences with it thus far have been positive. VS Code has great built-in support for it.
IMHO for startups, productivity and development speed usually beats performance and enterprisy-ness (looking at you, Spring).
F# dominates C# there, but Django, FastAPI, various other options for Node or other languages, are simply easier to learn and faster to iterate. You can change/reload a Django app several times before the C# compiler finds a problem with your static types in a bigger codebase...
And to me it seems like you can acquire technical debt in C# just as fast as with any other language. Maybe faster because it is harder to learn and less "obvious" with all the DI magic. Of course, if you are proficient in C# and its ecosystem, you can write some damn fine software in it. If so, I wish you the best of luck finding affordable coworkers who can do the same.
I'm not saying it's at the same time. And yes, type feedback is mostly instant.
The codebase I'm currently working on takes a long time to compile, and sometimes there are errors. Or there are runtime errors. Combined with the much higher volume of code I have to write and the greatly reduced readability, C# does feel a lot less productive than Python.
Usually when I see that, it's because someone's implemented an inappropriate pattern, or gone overboard on 'SOLID' principles or whatever.
C# developers seem especially prone to getting trapped in stupid implementations that massively bloat the code: taking some Uncle Bob diktat to an unnecessary extreme, or implementing a dangerous pattern like CQRS stupidly for a simple CRUD app. Less a reflection on the language and more a reflection of the enterprisey culture.
I was interviewing in the C# world recently, it was quite disheartening to hear the same phrases repeated 'hybrid microservices' (i.e. we've got the worst of both worlds in one project! But our TA got to put microservices on his CV!) or 'we value SOLID code' (i.e. we've massively over-engineered our code, development has slowed to a crawl, there are far too many interfaces, and debugging is a nightmare!).
But you can write great, terse code in C#, and I feel much quicker than in dynamically typed languages. I feel like I can implement the same code much faster in C# than the equivalent javascript, even though I've been writing javascript for longer, and that's down to the autocompletion and snippets of a powerful IDE, which dynamic languages will never be able to match. It's also much easier and quicker to move code, refactor, and generally redesign your code on the fly as you're exploring the problem space.
Even Javascript has quite a bit of autocompletion nowadays, with all the annotations coming from typescript. Much less typescript itself. Frameworks on the JS side also massively improved.
Same thing with Python. Not that it really needed annotation much, the productivity always was quite high, whatever people not using Python said. Now there is static type checking and the IDEs got even smarter. In my experience even statically typed software still needs automated and manual testing so much of the benefits are limited.
>UK has the highest tech salaries in Europe and trade unions were buried in the 70s.
Regardless of what we may or may not disagree upon, connecting these the way that you have betrays your ignorance of how salaries, employer relations, and unions work.
Car accidents resulting in fatality are relatively uncommon, thanks to modern safety standards, and Tesla FSD doesn't have nearly as many miles as it needs.
Your sample size is way too small to be implying that FSD is safe - given a year or so, with millions more miles, we'll see what the score is.
By the way, and this is anecdotal, but I was personally given an FSD demonstration a few weeks ago. The car immediately made an unsafe lane change in the middle of an intersection. Granted, that was one of the pilot program beta versions.