albertsun's comments | Hacker News

The only reliable way is to have the clause negotiated into a union contract which also has a "just cause" termination clause.

One more reason for tech workers to start organizing.


This is pretty interesting too: https://www.carsized.com/en/cars/compare/bmw-3-1997-sedan-vs...

Same car has grown by 5% over 15 years.


A domain name can go away just as easily, maybe more easily than a popular URL shortening service.


It can? I trust my $12/yr domain to stick around longer than a hypothetical $12/yr subscription to a SaaS link shortening service. The only exception I think is bit.ly who seem to have staying power because they're actively fighting spam and have B2B revenue.


Try https://www.newsblur.com/ - it does a lot of what you're asking for and works well on the web and in the mobile app.


It's only a tradeoff if you assume that the process must be automated and operate with a minimum of actual human scrutiny or consideration.

Any qualified human review would immediately be able to distinguish legitimate content from violations and would vastly improve both the false negative and false positive rates.

But because these tech companies devalue that kind of work they aren't willing to invest in it.


A couple of things to consider:

1. There's a real social cost associated with constantly subjecting human reviewers to traumatizing content. It can lead to devastating mental health problems. (some relevant articles: https://www.theverge.com/2019/2/25/18229714/cognizant-facebo..., https://www.telegraph.co.uk/technology/2019/12/16/youtube-mo...)

2. Humans make mistakes as well, especially as volume of content increases and policies get more nuanced.

3. There's way more violating content than you could ever imagine. As just one example, YouTube reported having to remove >1.8 BILLION comments for spam or other policy violations in 2019 (https://transparencyreport.google.com/youtube-policy).

4. The large tech companies spend billions on human moderation as it is. YouTube alone for example has >10,000 full time human reviewers. (https://youtube.googleblog.com/2017/12/expanding-our-work-ag...)


Regarding 1: I find it unlikely that most DMCA takedown notices contain traumatizing content. (Arguably) most of it has a copyright on it, which means that someone probably wanted to sell it to a sizeable market. The DMCA takedown notices identify URLs, so you don't really need to use the same pool of people who screen for child abuse and various horrors, and they don't get an unfiltered view of every sort of terms-of-service violation from every platform, just potential copyright violations, a completely separable issue.


So why not raise the cost of submitting arbitrary user content?


Because I, the user, don't want that. I don't have a problem subjecting myself to Content ID in exchange for the ability to just host all sorts of crap.


Umm, you do realize that this is basically a DDoS attack and that automated tools are a necessity, right? What needs to happen is a law/legal process where you can countersue against false DMCA/copyright complaints, with large penalties for indiscriminate and abusive filings. That would put the onus on copyright holders to put in some validation effort before filing.


Well, almost no one does, but someone has filed a whistle-blower claim with the IRS that this pricing arrangement is an illegal tax dodge: https://www.nytimes.com/2016/02/07/your-money/vanguard-a-cha...

It seems as though that case is still pending resolution.



Any backend openings? I've been looking to move into frontend work more but don't think my experience is there yet.


If you're looking for a backend role, we're also hiring in New York Times Data Engineering--application engineers, data engineers, and database reliability engineers:

https://nytimes.wd5.myworkdayjobs.com/en-US/Tech/job/New-Yor...

https://nytimes.wd5.myworkdayjobs.com/en-US/Tech/job/New-Yor...

https://nytimes.wd5.myworkdayjobs.com/en-US/Tech/job/New-Yor...


I'm an engineer on our CMS team and as Albert mentioned we're currently seeking both back-end and front-end team members. We're not opposed to hiring skilled developers who don't have deep experience in the specific language or frameworks we use. There's also always the option of joining the back-end and transitioning over time to a more front-end role. If you're interested, I'm happy to answer any questions or connect you to the hiring manager.


NYT seems remote friendly, but is NYT remote non-US friendly? Front-end/Full-stack engineer from The Netherlands here.


We do have news employees in bureaus around the world, but I'm not sure what the exact policy is for tech employees. I'll see if I can find someone with the answer for you.


A CMS team? Christ, no wonder NYTimes is losing money.


Could you please not post unsubstantive comments to HN?

https://news.ycombinator.com/newsguidelines.html


As someone who has worked on the integration of off-the-shelf CMSes as well as built in-house ones, I question this statement.

There’s a lot of value in building tools that fit the domain and workflow perfectly well. It’s expensive in the short term but in the long term saves a lot of time and money.


Sure. But it doesn't take a "team" to build up a CMS. This is why media "tech teams" are viewed as a joke in the tech industry.


WTF?

It takes a 'team' to provide any sort of professional level of support for an organization of their size. If the one-person "CMS/tech wizard" decides to stop working, or dies, or gets sick, or takes an extra 4 hours to be with their family, should the entirety of NYT operations just say "oh well, we'll just wait around for a while until we find our next one person to be our tech saviour"?


It is not constructive for me to engage with this level of cynicism.


There are! We're hiring for lots of open technology jobs here https://nytimes.wd5.myworkdayjobs.com/en-US/Tech

For backend specifically, on the CMS team there's this job https://nytimes.wd5.myworkdayjobs.com/en-US/Tech/job/New-Yor... or a fun one with the cooking team https://nytimes.wd5.myworkdayjobs.com/en-US/Tech/job/New-Yor...


Any openings that need Python folks?


A word of unsolicited advice: Someone who thinks "I am a Python developer" is generally less useful to a team than someone who thinks "I am a developer who is good at Python."


Not all languages are created equal. While, yes, a developer with years of experience in Python may have lots of transferable skills to other languages, my team aren't going to hire someone for a mid-level/senior position without a good knowledge of C++.


C++ is somewhat of an outlier due to the sheer size of the spec (and it's getting bigger all the time). Not everyone will use the same subset of features either, which complicates things.

That said, aren't you using C++ as a proxy for other needed skills? Many Python programmers are not exposed to concepts like vtables, or they do not know how a linker works (or why it is even needed). Heck, many won't have done manual memory allocation or manipulated pointers.


Even though most people won't use everything, all C++ programmers will need to understand RAII, manual memory management, lifetimes, the effects of inheritance, and some undefined behaviour (there's a minimal RAII sketch below). We'd probably take people with C experience, and maybe some D or Rust, but really for a mid-level C++ job we wouldn't accept much less than C.

Funny that you mention linkers, as they're a bit of a minefield. Lots of the behaviour of linkers is platform (and implementation) dependent, i.e. are you building a shared library or an exe on Windows or Linux, and there are different gotchas for all of them. We wouldn't expect people to know all of those things, but it definitely helps when it comes up, and you don't have the same problem (ODR violations come to mind) in many other languages...
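
For anyone following along who hasn't written much C++, here's a minimal sketch of the RAII vs. manual-management point. It's purely illustrative; the function names and buffer size are made up:

    #include <cstdio>
    #include <memory>

    // Manual memory management: the caller owns the cleanup. An early return
    // or an exception between new[] and delete[] leaks the buffer.
    void manual() {
        int* buffer = new int[64];
        buffer[0] = 42;
        delete[] buffer;
    }

    // RAII: the owning object's destructor releases the resource automatically
    // when it goes out of scope, even on early return or exception.
    void raii() {
        auto buffer = std::make_unique<int[]>(64);  // C++14
        buffer[0] = 42;
    }   // buffer freed here

    int main() {
        manual();
        raii();
        std::puts("done");
    }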


I like your way of putting it, but I think those concepts (RAII etc) are fairly easily learned by a competent developer in any language. Most languages have analogous concepts if you use them long enough.

A shop that says it requires years of experience in a particular language is probably making its own hiring situation more difficult than it needs to be.

C++ for example is definitely a sharp language that takes some time to adjust to, and experience in C will help.

However: I think any competent programmer with a few years of experience would be able to gain basic proficiency in a month or two and probably daily-level-mastery in 6-12 months.

I say this as someone who had 10+ years in Java and only passive experience in C. I've managed to become mostly proficient in C++14 in about a month of full-time learning. After 10+ years of programming, I can pretty quickly adapt to the C++ way of thinking about things. I don't think I'm alone in this style of learning a language, and many hiring teams seem to agree.


> Not all languages are created equal

No language is inherently "better" than another. They are each useful for specific cases, and understanding more languages means understanding their contextual strengths and weaknesses.


Apologies, I didn't mean to imply that C++ was better (or worse) than any language, just that it's not the same as others.


May I ask, what's the company (and the particular team)?


Sorry, it was more of a hypothetical. I'm not involved in hiring for my current company and definitely don't speak on behalf of them.

I work in games on gameplay and engine code.


In most hiring settings this isn't true though. Language-specific skills actually do matter, for all but entry-level positions at massive tech companies. It's not like my 3 years of experience in Ruby is immediately translatable to Java/Python/whatever. Could I get up to speed in those languages in due time? Sure. But if you were on a hiring committee, hiring for a team that works mostly with Java (or whatever) would you rather hire me or the version of me that's had the same amount of experience but with Java?


> It's not like my 3 years of experience in Ruby is immediately translatable to Java/Python/whatever.

It should be.

Picking up language X is very easy for most values of X (more esoteric ones excluded). After you know a few (and the more diverse the set, the less new concepts you'll see), picking up a new one is quick. If it is something like Go, it's a weekend's worth.

Using the language in idiomatic ways and knowing the most useful libraries and frameworks does take more significant ramp-up time. If you are a solo developer, or if the entire team is ramping up at the same time, then it is bad.

But a new, experienced team member, with zero knowledge in the most used language in the team? Sure, why not? They may even bring new skills to an otherwise homogeneous group.

I always let the interviewee pick the language during my interviews. Usually it is something like Javascript or Python. Sometimes it is Bash. I keep waiting for the day someone will offer to do it in Scheme, Haskell, Elixir, Prolog, D, Rust, even Ada. Pseudocode is actually fine too, unless I have a reason to think the individual lacks programming experience.

Whatever you do, DO NOT brand yourself as an X developer, if X is a programming language. I can understand specializations like machine learning. But languages? It's a mere tool, and you need more than one in your toolbox.


> But a new, experienced team member, with zero knowledge in the most used language in the team? Sure, why not?

It really depends on how much time you have to wait for a new hire to ramp up before they are useful. If you need immediate help, then hiring someone who doesn't even have experience with your programming language is a complete non-starter. If, on the other hand, you are hiring for the long term and can afford to pay a ~3-month tax for a good engineer for (hopefully) several years, then you are correct and the ramp-up cost becomes pretty insignificant. But don't make the mistake of thinking that this is an exception to the rule—what languages you know absolutely does matter to any company needing to get work done now.


> Picking up language X is very easy for most values of X

I disagree. If you know Y and you're learning X, it depends much more on the value of Y. If Y is Lisp you will be quite successful at learning any X. If Y is something like C or assembly language, then you will not (it's always easy to spot the C guy who just started using Python etc.)


In my experience it is like learning a spoken language. If you know a language in the same family then learning the new language is relatively easier (e.g, Java -> Scala, Lisp -> F#). I also think the ease of learning a language is proportional to the ratio of expressiveness of the language you are coming from to the expressiveness of the language you are going to (e.g., Lisp to F# might have a much easier time than F# to Lisp).


>Using the language in idiomatic ways and knowing the most useful libraries and frameworks does take more significant ramp-up time.

Yeah. As in years. This stuff also changes all the time so requires perpetual learning.

>But languages? It's a mere tool, and you need more than one in your toolbox.

It's not a mere tool and languages are more than just trivially learned syntax. They are the umbrella under which a panoply of tools, approaches, ideas, priorities and culture aggregate.

Sure, specialization isn't necessarily always the optimal approach, and sure, there are benefits to the cross-fertilization of ideas across community lines. However, being a jack of all programming languages, master of none, probably means you'll suck more than a specialist at a lot of stuff.

Moreover, some ecosystems plain suck, and avoidance is the sign of a good programmer. Statistically, you're probably a worse programmer if you've used a lot of PHP and a better one if you've learned Rust/F#.


> Language-specific skills actually do matter, for all but entry-level positions at massive tech companies.

Absolutely, which is why I didn't stop at "I am a developer". That "who is good at Python" bit is definitely important.

However, seeing yourself as a "Python developer" rather than "a developer who is good at Python" makes it look like you see every problem as something you'll solve using Python. That's bad. That will put a lot of companies off hiring you (companies that believe in a 'best tool for the job' approach anyway, and those are the ones you probably want to work for).


>However, seeing yourself as a "Python developer" rather than "a developer who is good at Python" makes it look like you see every problem as something you'll solve using Python. That's bad.

If a company didn't hire me because they idiotically interpreted "I am a python developer" to mean "I would try to build a rendering engine in python" then I'd say that's probably a bullet dodged.

I'd interpret it as "I don't build rendering engines but I build other stuff for which python is a good fit", but maybe that's just me being weird.

The trope about great programmers being people who reverse binary trees in their sleep and being able to swap ecosystems at the drop of a hat needs to die. Great programmers specialise.


Being able to interpret around your words does not make your words inherently better.

You aren't honestly trying to sift out clueless prospective employers by their ability to parse a title, are you?

The advice here is to put focus on ability to write software, and experience using python as a tool to do so, rather than implying that your ability is limited to python.


That is true.

What parent was trying to explain is that an individual is not defined by a specific subset of his/her experience.

Just as fluency in Spanish does not make me Hispanic, experience in python does not make me a "python developer". That title puts focus in the wrong place.


It depends on whether you're taking the perspective "if someone offered me this job, would I take it?" or "if I applied to this job, would they hire me?".


Don't suppose you'd consider remote candidates?


NYT is remote friendly, but most engineers are based in NYC. Wirecutter (https://thewirecutter.com/) is owned by The New York Times and is fully remote: https://nytimes.wd5.myworkdayjobs.com/Wirecutter


Thanks :D! I love Wirecutter, didn't know they were fully remote.


This is like a less thought-out version of the carbon fee and dividend plan https://citizensclimatelobby.org/carbon-fee-and-dividend/ that has wide-ranging support - though not from the people who currently matter.


This section is troubling.

> In a not-so-distant future, if we're not there already, it may be that if you're going to put content on the Internet you'll need to use a company with a giant network like Cloudflare, Google, Microsoft, Facebook, Amazon, or Alibaba.

For context, Cloudflare currently handles around 10% of Internet requests.

Without a clear framework as a guide for content regulation, a small number of companies will largely determine what can and cannot be online.


Bonus question: what is Cloudflare's role in either enabling or preventing the DDoS attacks that one needs Cloudflare to survive?


Do they enable any? I don't recall hearing about that.


They protect a large amount of the DDoS-as-a-service sites ("stress testers" or "booters").


That is kind of a BS statement; there are hundreds of thousands of web hosts out there in some form. I am not sure where he is getting that.


From the linked post:

> The size and scale of the attacks that can now easily be launched online make it such that if you don't have a network like Cloudflare in front of your content, and you upset anyone, you will be knocked offline. In fact, in the case of the Daily Stormer, the initial requests we received to terminate their service came from hackers who literally said: "Get out of the way so we can DDoS this site off the Internet."

As far as I know, he's right. It's basically only Cloudflare, Google, and a handful of other megacorps that can keep your content online if someone's willing to pay a vigilante with a botnet to get rid of it.


I don't think that's true. Cloudflare is just the most visible option at the moment, and probably (one of) the most cost effective.


His point is that with the amount of DDoS power available out there to various parties, without a major ISP or CDN hosting your content you can trivially be booted off the internet. Once you accept that as a given, if one of the major ISP or CDN networks won't host your content, then you're open to censorship from anyone who doesn't like your message; and if you're controversial enough that the ISPs and CDNs won't host you, it's probably a given that someone is going to want to DDoS you out of existence. To further complicate things, most small ISPs, when faced with a substantial and prolonged DDoS of one of their clients, will terminate that client in order to preserve service to their other clients. That means, once again, that not being fronted by a major ISP or CDN will likely leave you hopping from ISP to ISP until eventually nobody is willing to host your content.


I think the point is, if you make a site forgo any sort of DDoS protection it effectively does not exist, especially if DDoSers want to take your site offline. Some website running on a VPS at a small hosting company likely won't have the resources to keep their site running... which in my opinion is fine. If people want to shout you down in public because they don't want others to hear what you have to say, well then find somewhere else to express your views.


And yet, everyone trying to work against this gets immediately downvoted on HN, because everyone considers the work of these companies just so convenient.

It’s classical short-term vs. long-term thinking, and it’s damaging not just to privacy, but also to the startup economy as a whole.


To me it seems like decentralization is actually very popular on HN.


As long as it's written in Go, Rust or formerly Haskell


In fairness your particular example isn't necessarily a good one. Snapchat is a trivial idea and it's frankly amazing it got as far as it did.


Imho his example illustrates his point well. Snapchat was fairly disruptive and lost a lot of value when Facebook just ripped it off. If we allow giant companies to engage in anti-competitive practices it will hurt us in the long run, as people won't even try to innovate. The Snapchat story is pretty demotivating. Why bother when one of the largest 5-6 companies in your space will just shut you down or steal your ideas?


My point is that Snapchat isn't a good example of a company pushed out by anti-competitive practices. Their core product is technically trivial and uninteresting. The fact that there is a market demand for it and being first to do it at large scale doesn't in my mind meet any minimal standard for protection from anti-competitive practices as you imply.


If you had something that would disrupt it would :)


I'm really confused, have all the grown-ups returned to HN? Suddenly, after several years of self-congratulatory virtue-signalling, HN realizes that self-righteous censorship is not risk-free and has long-term consequences? I'm glad I started coming back to HN. Maybe the long recess from reason is over.


We are going to have to have regulation to rein in these companies.

FB, GOOG, MSFT, etc. all serve billions of people. FB's network alone has more people than China.

The pro-censorship crowd wants to distract with the "government vs. private company" argument, but that really doesn't fly when these companies are larger, wealthier, and more powerful than a handful of countries.

FB censorship would affect more people than the Communist Chinese government censoring content in China. That is extremely dangerous.


They're not really growth curves; the curves show relative growth rates. Not quite the same thing.

And if you think some central mechanism has gone horribly awry, what central mechanisms changed around 1980?


IIRC, one of the things that changed back then was a movement away from workers having retirement plans with the companies they worked for (pensions) and toward managing their retirement individually (IRAs). This could conceivably have led to a new, steady source of investment in the stock market through mutual funds and whatnot (and presumably more aggressive investments), which, if the changes shown on that graph are largely helped along by investment income, may explain some of this.

Disclaimer: I don't know enough about this stuff to make a definitive argument.


> what central mechanisms changed around 1980?

https://en.wikipedia.org/wiki/Reaganomics

?


> what central mechanisms changed around 1980?

The Copyright Act of 1976, perhaps? If we look at the list of wealthiest Americans, most of them are there because they provide something to the world that nobody else is able to directly copy. Without competition, there is no mechanism to spread the money around.


This would certainly disproportionately impact authors, entertainers, and anyone else who makes royalties from media, but I don't know that they represent a large enough swath of the economy to account for the broad trends we see here.

To my original point, though, making money off of a copyrighted work represents income from an asset, the value of which compounds over time, due to inflation.


> I don't know that they represent a large enough swath of the economy to account for the broad trends we see here.

I'm not sure it is that broad to begin with. The 99.999th percentile of adults with these near-infinite income gains represents only ~2,000 people. While wealth and income aren't the same thing, I expect there is a lot of overlap with this list[1] that outlines 400 of the potential names.

If you look at how they made their fortunes, many are directly attributable to copyright. Bill Gates certainly wouldn't have topped the list if anyone was free to copy/resell Windows.

> To my original point, though, making money off of a copyrighted work represents income from an asset, the value of which compounds over time, due to inflation.

Increased value of an asset does not necessarily translate to increased income. A primary residence is a good example: the value of your home may be increasing, but nobody is paying you to live there. Many people are quite happy to pay top dollar for an asset that generates no revenue, on the hope that capital gains alone will make the purchase worthwhile.

But you do rightfully point out that those who have entire control over a certain asset have complete control over the streams of money directed at that asset. That is not exclusively limited to copyright, but changes to copyright opened up a whole new set of wide-ranging assets to hold that were previously not there. Anyone holding hard assets was presumably already milking them for everything they were worth; thus, while incredibly profitable, they were not increasingly profitable.

[1] https://www.forbes.com/sites/chasewithorn/2016/10/04/forbes-...


> They're not really growth curves, the curves show relative growth rates. Not quite the same thing.

An exponential growth curve is an exponential growth curve, irrespective of what it measures. Thanks anyway for the pedantry.

> And if you think some central mechanism has gone horribly awry, what central mechanisms changed around 1980?

In 1974, the US dollar switched from being a store of value to a store of debt: https://www.amazon.com/Creature-Jekyll-Island-Federal-Reserv...


I know people without valid credentials sometimes have very insightful ideas.

And sometimes it's hard to have a discussion with someone when they don't understand the fundamentals of a field.

However, it would be nice if you went into detail as opposed to just linking to that book. It's not part of the standard economic body of knowledge, so I don't think it would be appropriate to expect your would-be arguers to read it, especially when the author's Wikipedia page reads:

G. Edward Griffin (born November 7, 1931) is an American far-right conspiracy theorist, author, lecturer, and filmmaker. He is the author of The Creature from Jekyll Island (1994), which promotes theories about the motives behind the creation of the Federal Reserve System.[1][2] Griffin's writings include a number of views regarding various political, defense and health care interests. In his book World Without Cancer, he argues that cancer is a nutritional deficiency that can be cured by consuming amygdalin, a view regarded as quackery by the medical community.[1][3][4] He is an HIV/AIDS denialist, supports the 9/11 Truth movement, and supports a specific John F. Kennedy assassination conspiracy theory.[1] Also, he believes the actual geographical location of the biblical Noah's Ark is located at the Durupınar site in Turkey.[5]


And James Watson has some pretty wacky ideas about biology, but he still discovered the structure of DNA. I'm not sure how to take most of this, other than as ad hominem.

Other people have cited links elsewhere in the thread [0], and I don't feel especially compelled to be redundant. Not sure what to suggest other than googling some Austrian economists and their views on inflation.

[0] https://news.ycombinator.com/item?id=14960011


> some Austrian economists

Aren't there multiple schools of thought about economics? How do I know which one is the Right One to believe? Do they differ fundamentally on some levels?

I feel like googling for some school's teachings runs the risk of drinking one side's kool-aid, without even realizing what the counter-viewpoint is. Is there a good way to get a neutral overview of the different economic schools of thought, other than taking courses at a local university?


One of the differences is that Austrians don't rely on empiricism. They start off with first principles about how humans behave and basically derive all of their conclusions about human behavior and economics from there.

Mainstream economics has a much larger reliance on empiricism (especially recently, after the Great Recession invalidated a lot of Chicago school models).

They differ fundamentally, but like any field you should start with the mainstream. Start with the consensus of how the recognized leaders of a field agree the world works. Then start branching off into sub-fields (some of which will be labeled quackery, like the Austrians). This doesn't mean they're wrong. Many times a sub-field is labeled as quackery before the mainstream finally accepts it. But more often than not the mainstream is right.

Some recognized leaders would be Greg Mankiw(representing conservative mainstream economic thought) and Paul Krugman/Brad Delong(representing liberal mainstream economic thought).

Also I think the best way to start an education in a particular field is with a textbook. They are great overviews of a field.


Thank you for great starting points. I think my biggest hurdle was that I didn't feel like I had a good handle on how to discern the mainstream from the quacks. I appreciate your concise overview :)

> I think the best way to start an education in a particular field is with a textbook.

Excellent suggestion.


None of them are right, but some of them are useful.

Many of their followers forget the fact that they are thinking in simplified models. If reality seems to contradict the model, it is because reality is somehow deficient compared to the ideal of the model, and if we could only just shape reality to be closer to the model... or so most economic arguments seem to go, especially when it comes to public policy.

Wikipedia is actually pretty good at being a neutral source in this area.

https://en.wikipedia.org/wiki/Schools_of_economic_thought

Understand the implications of the inevitable simplifications and you'll avoid drinking anyone's Kool-Aid.


You referenced G. Edward Griffin's book, which implicitly points to him as an authority on the subject, so I think that it's completely acceptable to question his authority. (I'm more familiar with ad hominem as an attack on the person giving the argument, i.e. you, not someone they reference as an authority.)

I'm also not aware of James Watson's wacky ideas about biology. I know he has some controversial and maybe racist ideas about intelligence, but I don't think any of his ideas about biology are described as quackery.

And it's about Bayesian inference: sometimes the mainstream is wrong. Not as often as it's right, but sometimes it's wrong. Sometimes people without credentials make large contributions to a field, not nearly as often as the credentialed make contributions, but sometimes. Rarely, a conspiracy theorist is right. But I think people who maybe don't know that much about economics should at least be aware of his position in the field before investing a considerable amount of time reading his book.

Maybe vaccines cause autism, but it's a weird place to start a biology education.


You're seeing something shaped like an exponential growth curve and missing the fact that it's not a growth curve at all. Think about what's on the x-axis here. The graph is really a histogram with average income growth binned into one-percentile intervals.

On the second, thanks for the reference, haven't read that. Would be interested to hear how you think that change would lead to these effects.


> You're seeing something shaped like an exponential growth curve and missing the fact that it's not a growth curve at all.

Okay, slice the upper quartile and graph it over time. Now it's a proper growth curve, but it communicates the exact same idea.

> On the second, thanks for the reference, haven't read that. Would be interested to hear how you think that change would lead to these effects.

Quick thought experiment that expands on my explanation in the original comment: Suppose you have (a) a wage-earner who spends 90% of their income on goods & saves/invests the other 10%, and (b) an investor who earns an income from assets, spends 10% on goods and reinvests the other 90%.

Now (for the sake of simplicity) suppose inflation is 10% per year. What happens after year 1? What about year 2?


Massive shift of the taxation burden from corporations to citizens.


Charts / sources?

