My first thought looking at that first "new suburb" photo was: where's the sidewalk? And the second photo has sidewalks but no visible amenities.
Sure they're denser, and townhouses are at least less materials-intensive to build, but I'm curious how this is otherwise an improvement on the old planning model of swathes of detached homes?
> I'm curious how this is otherwise an improvement on the old planning model of swathes of detached homes?
It's not. (Or at least, I'm not arguing it's better.) My only point is that "more density" isn't the magic-bullet fix people sometimes think it is -- the US has been building a lot more density than it did historically over the past 60-ish years, and the result isn't meaningfully different in terms of lifestyle or planning.
If you want public transit to be a good alternative, you have to just decide to build it first, and you kind of have to do it almost everywhere nearby (not just Division St, not just Woodward Ave, not just Hiawatha Ave, etc.).
"Crank up density" doesn't magically result in a "car-free" city. You can't townhouse your way into having the Chicago Elevated, you have to actually decide to build the L. (similar to, say, Seattle SoundTransit)
This is what is happening in the US city I live in. Developers are gobbling up land and putting in “high density” housing, but without building any additional amenities or restructuring roads in ways that make the area more enjoyable to live in. And this is in a city with very permissive zoning rules, so mixed use is not restricted. The price per square foot is about the same for the higher-density housing as it is for a house and a yard just a 10-20 minute drive further out, so given the choice between high density / no amenities / no parking / not walkable / surrounded by high-speed roads vs. a house with a yard and a garage (I'll have to drive in either scenario)…I’ll take the bigger house and the yard.
As someone shopping for a PHEV in Ontario right now: yeah, it's dire. For the Prime, I've consistently heard a 2-3 year wait, with dealers saying Ontario is getting somewhere around 12 for the whole province because they're all going to BC and Quebec (higher provincial incentives there), and even there it's years out. Kia is the fastest right now, but even that's a 10+ month wait for their most comparable PHEV.
The only one that seems easy to get is the Outlander, but that car is also way too big for my needs among other issues.
> IMO Molly should be getting public good funding from the ETH ecosystem for being such a good & reliable source
It's a generous thought, but it's very bad optics to take a bunch of money from someone that you report on. Yes I know this happens all the time in bigger media circles, but they're big enough that it's either not going to be a dominant revenue stream or the appearance of integrity is irrelevant (the ur-example being Fox News and their claims of being "entertainment")
> There are no moats to being a plumber, a baker, a restaurant...
This line is interesting to me, because I actually think there _is_ a major moat there: locality. I don't disagree with the rest of your comment, but for those examples specifically, a lot of the value of specific instances of those businesses comes from their being in your neighborhood. If I live in Toronto, I'm not going to fly a plumber in from Manhattan to fix my pipes; if I want a loaf of sourdough, I'm not going to get it from San Francisco, I'm going to get it from the bakery around the corner; I might travel out of town for a particularly unique and amazing restaurant, but not every week, since I've got solid enough options within a ten-minute drive. Software is different because that physical accessibility hurdle doesn't exist.
What you're describing is less a statement about moats and more a statement about markets. Plumbers in one location share a market (the potential clients in that area), and as the parent comment states, there is no moat within that given market. A moat is a barrier to competing within a given market. So if something made it really difficult for a new plumber to serve an already-served clientele, that would be a moat. But individuals on the other side of the planet are, by simple physical constraint, not actually clientele... they're not even in the market.
From my experience, extrapolating retention like that isn't particularly valuable. Past a certain point it's basically the same as making something up whole cloth -- it feels more like lying than projecting.
(I also read the OP's story as the investor asking to see actual numbers, and not projections--if someone asked to see those kinds of numbers my first thought would be that they were asking for historical data)
Lived in London during and slightly after university: it's not. The city is pretty much dominated by cars; there's technically a transit system, but it's slow and infrequent, and the way the city is laid out, with three major N/S and three major E/W roads, makes it hard to get around during construction season. Downtown is...kind of walkable, but it's bordering on a food desert, with only one (expensive) grocery store at the edge of the core.
I generally agree with all your points, but I also remember the forum posts and I find that SO has at least one big advantage over those: it's really quick to pick out a good answer compared to a long thread. Just the fact that a) answers are clearly demarcated vs comments and b) the author can select a "correct" answer makes them way more skimmable than, say, a phpBB thread where every post looks the same.
I definitely see that side of it, though I think both formats have pros and cons.
Especially in the JS space[1], there are lots of historic answers upvoted on SO that are now wrong, and you have to scroll around (or find a slight rephrasing of the same question that was asked more recently) to find a good answer. Unfortunately, the longer a question is around, the more answers it has and the more likely those answers are to be wrong. On forums / mailing lists, the question would just be re-asked. On SO, it is supposedly a wiki, so new versions of the same question get closed.
[1] Because it has evolved a lot. Rust has similar issues, as I'm sure lots of languages do.
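To make that concrete with Rust (the one I know best), here's a minimal sketch of the kind of drift I mean: the `try!` macro was the standard way to propagate errors before the `?` operator landed in Rust 1.13, and plenty of older, well-upvoted answers still show it.

```rust
use std::fs::File;
use std::io::{self, Read};

// Older answers typically propagate errors with the pre-1.13 idiom:
//
//     let mut file = try!(File::open(path));
//     try!(file.read_to_string(&mut contents));
//
// which new readers shouldn't copy (`try` is even a reserved word in the
// 2018+ editions). The modern equivalent uses the `?` operator:
fn read_config(path: &str) -> io::Result<String> {
    let mut contents = String::new();
    File::open(path)?.read_to_string(&mut contents)?;
    Ok(contents)
}

fn main() {
    match read_config("Cargo.toml") {
        Ok(text) => println!("read {} bytes", text.len()),
        Err(e) => eprintln!("error: {}", e),
    }
}
```

Both forms are easy to find on SO; the problem is that nothing in the UI tells a newcomer which one to prefer.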
I really do try my hardest to keep old questions and answers relevant. Stack Overflow allows editing Q&A for a reason, and any time I stumble on an older post, I at least think about updating it.
One thing that I've occasionally found frustrating is that someone sees a question was asked 2 years ago and thus assumes that it must be invalid / out-of-date. They simply don't look at the edit dates.
Remember that Rust has a pretty strong backwards compatibility guarantee. Any answer using only the standard library in the last 3 years should still be valid (crates are a different concern). If you find a question you think has become stale, add a bounty to it to raise attention — that's the SO recommended path of action.
We also have a Rust chatroom on SO where a bunch of regulars hang out and people are welcome to pop in and ask if a Q&A is still valid.
Should the answers be edited? Everyone's had to work with software on the job that wasn't the latest version. If we edit answers after every software update, doesn't that just fuck over everyone who isn't able to update?
As your other responses have stated, editing doesn't necessarily mean "destroy the old and replace it".
A lot of the editing I do is to improve the grammar, reduce fluff, use Rust-standard indentation, improve the formatting, include complete error messages, update links, etc. None of that should affect users of older versions other than to make it easier to get to the core content of the Q/A.
When a new Rust feature comes out, usually the original part of the answer gets a header denoting the compatibility. https://stackoverflow.com/a/28953618/155423 is such an example.
Note that SO does keep a revision log of edits (e.g. https://stackoverflow.com/posts/28953618/revisions) to a post. You can browse that if you think there might be something hidden that's worth exploring. This doesn't necessarily help with search engines, of course.
I often update my answers over time to cover multiple versions, usually as the result of someone leaving a comment pointing out a problem. It's pretty easy to do in a way that preserves the old information (eg. https://stackoverflow.com/a/1576221/8376).
You shouldn't necessarily erase history, but it's pretty easy to move your old text under an Old Answer banner or something, if you still think the info is relevant to someone or even that surfacing the delta between versions is educational.
To be clear, Rust is easily 10x better than JS in that regard, it was just an obvious example for me to use where <1.0 code is now weird and wrong :-)
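For anyone who never saw pre-1.0 Rust, here's roughly the flavor of it. The old 0.x-era syntax is shown only in comments (it hasn't compiled in years, and my memory of the exact details may be a little off), with the post-1.0 equivalent below it.

```rust
// Roughly what a 2013-era (Rust 0.x) answer might have looked like: owned
// pointers spelled `~`, `fmt!`-style format strings, closure-based loops:
//
//     let names: ~[~str] = ~[~"alice", ~"bob"];
//     for names.each |name| {
//         println(fmt!("hello, %s", *name));
//     }
//
// None of that compiles today. The modern equivalent:
fn main() {
    let names: Vec<String> = vec!["alice".to_string(), "bob".to_string()];
    for name in &names {
        println!("hello, {}", name);
    }
}
```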
I have nearly given up looking for JS answers in SO because they are such garbage these days. I am learning Rust right now and still find answers on there useful.
We actually have a specific tag for those cases (`rust-obsolete`), so if the question / answer cannot be rehabilitated for Rust 1.0, that's a useful thing (I also try to add an in-question disclaimer to make it easier to spot)
> there are lots of historic answers upvoted on SO that are now wrong, and you have to scroll around (or find a slight rephrasing of the same question that was asked more recently)
And how many of those slight rephrasings never got a chance to get a more relevant, correct, modern answer because they were marked as a duplicate?
Under the StackOverflow-as-Wikipedia model where it's a card catalogue of best answers, the right thing to happen is:
- they get marked as duplicates, so that all future people coming in via Google get funnelled to one place
- that place gets new modern, more relevant answers
- they get voted up because they're useful
- people add comments about answer compatibility
- the new answers either overtake the old ones (and the old ones stay around as history), or at least there is a single place where all the relevant answers can be found.
One question with 10 answers is a lot more discoverable than 10 questions with one answer each.
The former leads to 9 answers to skim and ignore and 1 answer to use.
The latter leads to finding 6 questions and 6 answers you don't like, 0 answers to use, and missing the 4 you didn't know existed.
You misunderstood me; my points there are different.
On SO, re-asking is not allowed, but as a question becomes less and less similar, at some point it is. The problem is that the "canonical" question for a given problem is really just the first one that came along, and there are no good tools for keeping it up to date. It doesn't matter how much karma you have; there are no tools for getting rid of highly-upvoted or accepted answers that are now wrong / dangerous. So the trick is to find similar-but-different questions that approach the problem in a more modern way but haven't been closed as dupes.
On forums, re-asking is expected, and if you ask a JS question 10 years later you will get answers relevant to that time. Posts are not expected to be evergreen, so you can look at when something was asked and have context for the environment it was asked in, as opposed to SO, where the date is sort of meaningless because the site is sort of a wiki.
If you know the modern answer, you can downvote the bad old answer, add a comment saying "this answer is wrong/dangerous", flag it for moderator attention, edit it to correct it, or add your own answer even though the question is already 'answered'.
Re-asking a question is worse - then there's that big famous first answer with 75 upvotes prominent on Google. And your question/answer with 1 upvote and a "correct" modern answer.
Nobody will find that.
But it's a good point - if you don't know the modern answer and want to, and you can't "re-ask" for new input, then what?
Not in my experience in the workplace, though - in some cases the answer the OP needs is "NO, do not do that", but the highest-rated answer is a long one that really does give the answer that is needed.
Sometimes you have to work out what the underlying question is and answer that, even if the OP doesn't want to hear it.
I much prefer answers that, when you really shouldn't do something, explain why in addition to answering the actual question. Usually there are perfectly valid reasons to do things you "shouldn't", and until you know there isn't one, you should just answer the question.
I believe that you can't be down-voted, and therefore lose your hard-won points, by commenting. But if you answer, you can be down-voted. So for some participants, commenting is prudent.
And then there are the poor souls who have a useful comment to make, but can't because you need reputation points to make comments. So they leave an answer instead, which is completely inappropriate and just leads to more discouragement.
I can't figure out what the issue(s?) is. What was the context of the moderations and associated apologies? And was another complaint that he was promoting specific companies to help them win more business within the Node community? And I guess he doesn't respect the concept of a Code of Conduct?
Maybe I'm dense but it's not exactly Damore level stuff here.
The issue is that people have apparently on many occasions said "Hey Rod, don't do that, it's against the CoC (code of conduct)" and his attitude has been "I don't care about the CoC, I don't believe in CoCs, and I'm important enough that you can't touch me." This creates a really difficult environment for completely non-SJW-related reasons (I am very sad that people keep treating this as a social justice issue).
Codes of conduct are only meaningful if they apply to core contributors with at least as much force as they do to occasional contributors. The chief accusation leveled against a CoC is "oh, you just use those to go after people you don't like", much like, say, the police sometimes use US law. The only security against this accusation is that the CoC is applied with more forgiveness at the lowest levels and more stringently to the people at the top. It's not social justice; it's consistency.
Admittedly, I think a chunk of HN would prefer the anti-SJW approach of "let's just burn the CoC and forget it ever existed," and in that sense I suppose it is "social-justice-ish", but these are orthogonal concerns. One concern is whether members of the Node.js CTC are beyond reproach for the shitty things they do; the other is defining clearly which things are shitty.
> The issue is that people have apparently on many occasions said "Hey Rod, don't do that, it's against the CoC (code of conduct)" and his attitude has been "I don't care about the CoC, I don't believe in CoCs, and I'm important enough that you can't touch me."
The accusations would carry more weight if they were accompanied by links to examples of the behavior. As it stands, it reads like you're quoting him even though he probably didn't say that.
Two big things: 1) it's a lot faster to hit the Pocket button and mark something as read than to manage bookmarks, IMO; but the real killer feature for me is 2) offline saving of articles. Pocket is killer when I'm commuting or without Wi-Fi because it means I can keep reading things without worrying about finding a hotspot or wasting my (very limited) data.
> Pocket is killer when I'm commuting or without Wi-Fi
... and yet this thread and every other Mozilla-apologist thread are always filled with folks suggesting that it's no big deal to assume an always-on, always-available network connection.
Yeah, the problem with all these web apps is the lack of functionality when offline. Most just don't work, and some have a half-working offline mode, but imperfect connectivity or no connectivity really limits productivity today.
You should check again; it's tucked under the labs section but there's an HTML5 audio player available as a Flash alternative. Doesn't work in Safari though.