Approved Cameras (netflixstudios.com)
305 points by lai-yin on Jan 8, 2022 | 307 comments


The technical quality of Netflix originals is really high. One thing that annoys me, though, is that despite the rigorous camera standards, it seems to be a common trend across numerous Netflix Originals to use the Vantage Hawk V-Lite Vintage '74 line of anamorphic lenses. I guess it's a stylistic choice, but it results in some (for me at least) annoying image distortion that can really distract from the show. They are literally marketed with the flaws as a feature: "The Vintage’74 version has certain gentle aberrations and other characteristics that might be considered flaws by others but are welcomed storytelling tools for cinematographers"

You can see this in some example footage: https://www.youtube.com/watch?v=GhbRkoeNb9U

Note the distortion and blurred top and bottom of the image.


Might be related, but I'm finding recent TV shows and films have a really narrow field of focus, while the rest of the screen is either blurry like your example or softened by a depth-of-field effect.

My eyes generally wander to look at the surrounding scene or detail but I get the feeling I'm not supposed to be looking there.


It's probably to blur out any imperfections in the background green screen, or to hide any uncanny-valley lack of detail in the CG set. It's become basically standard practice to use a CG set of some kind in almost everything, because it's cheaper than shooting on site or on a fully detailed sound stage.


To some degree the DoF feels like a trend; there will probably be movies in the '50s made in the Netflix style of the '20s.


Is this at least partly due to higher resolution / larger displays? Better-quality displays with higher-resolution sources will make the difference between out-of-focus and in-focus more apparent.


Maybe, but crazy shallow DoF goes back at least to Kubrick famously shooting parts of Barry Lyndon at f0.7.


Barry Lyndon was shot with an f/0.7 lens because Kubrick wanted period-correct lighting: candlelight, which required a super fast lens with the film used. To my knowledge he wasn’t going for shallow DoF.


You are correct. This wonderful article taken from American Cinematographer goes over this very topic in great detail:

http://www.visual-memory.co.uk/sk/ac/len/page1.htm


I just want to say thank you for that link; it was an inspiring read. I think it's time to watch Barry Lyndon again.


Be that as it may, the DoF is unavoidable unless every shot is insanely tight.


Not sure about TV shows, but for films (especially Netflix originals), that's partly due to the recent trend of films being shot for the TV viewing experience. These films tend to have fewer "wide" shots and more "character" shots compared to earlier theatre-release-only films.


I'm not sure what you're watching, but it's probably at least partly because you're not supposed to be looking there - VFX cost money, and if they can blur or darken a bunch of the frame, that makes it cheaper.


That kind of distortion is naturally common to most anamorphic lenses, but especially apparent in older glass. See: Soderbergh’s vintage anamorphic choices in No Sudden Move.

In the old days, they would mostly restrict anamorphics to locked-off shots/specific compositions to minimize how visible these types of distortions can be.


Cinematographers almost always want part of the image blurred - an entire image in sharp focus would make it difficult to perceive depth and would take away the ability to direct the gaze of the audience.


In this case the OOF points of light are actually vertically oriented ovals, the idea being that it's reminiscent of "1974"-style anamorphic lenses. It's an affectation not unlike digital retro photo filters, except that it's the physical lens. It's not unlike musicians who use vintage equipment because they want to reproduce something as it was done in the past. It gets a bit dissonant if the movie is set in the present, though.


I just imagined sitting for a photography course test and the first question is "why do anamorphic lenses not produce round out-of-focus points of light, since the in-focus parts are not stretched". Naturally the course material didn't cover any of this.


The out of focus spots are an 'image' of the aperture mask. So they'll take on whatever shape the aperture has... not its physical shape, but its optical shape.

In an anamorphic lens the image is squeezed, so the effective aperture is squeezed... well, in most of them. There are anamorphic lenses with the anamorphic part behind the iris, and for those you don't get the oval shaped oof highlights.


A lot of cinematographers don’t really like the extreme resolving power inherent to 4K+ sensors, and optics that diffuse that sharpness (for instance, a lot of anamorphics) are VERY much in vogue as a result.


I don't know a whole lot about camera lenses, but is this the reason shows like Sabrina have that extremely annoying distortion/aberration near the edges of the frame, covering almost everything not in focus at close range? I find it extremely jarring.

Edit: grammar


They toned that down by the end of Sabrina season 1. I think the idea there was to make a sense of otherworldly spookiness.


And yet they gave her that thirsty wig. I stopped watching the show with Season 2 solely based on the poor wig work.


I’ll take terrible wig work over ‘oh shit, am I about to get a migraine?’ camera work any day. Nevertheless, I think my wife and I quit watching around the beginning of season 3 when it was clear the show had utterly jumped the shark.


The show jumped with the wig I tells you! That accursed wig! Ah!


thanks for the laugh, seriously XD


Oh wow, I've noticed that a lot of Netflix Originals blur out regions of the frame that aren't front and center and it bugs the hell out of me. I assumed it was some weird post-production attempt to mimic depth of field, I didn't realize it was a lens choice.


Oh thanks. I thought I was crazy because I could notice the blur on the bottom and top of the screen but not anywhere else. I thought "Maybe they did this so the subtitles have a better contrast" but I found it stupid. At least it was done for artistic motives...


It looks pretty cool during a rack focus, but I agree, it's definitely used to give a filmic look but gets tired quickly.


Are you kidding? They look very cheap


I had a quick google and found a set for rent at 13900 SEK per day, which is apparently about $1500.


Am I the only one who feels continually let down by the script writing in Netflix Originals? Not all of them, but many of them seem to have a budget that allows for “decent” special effects and scenery, but when it comes to actual writing, which in my mind is what carries or collapses a series to begin with, it sometimes feels like they just grabbed interns or whoever was available for the lowest cost.

The latest example of a promising series torpedoed by shoddy writing is in my opinion “The Defeated”. I mean, it’s about killing nazis in post war Berlin, how do you mess this up? And yet the characters are boring, cliché and the story is predictable to the point that it feels like a chore to watch.

Maybe I’m not the majority in this opinion but I really feel like Netflix could probably make better series if they spent less on production and more on writing, but now that I’m saying it out loud I guess I am the minority after all. Most people probably prefer a well produced series with a bad script over a well produced, pretentious indie movie or bottle episode.


You're definitely not alone. I was trying to watch Netflix's newest Harlan Coben adaptation, "Stay Close", and it all looked very nice (especially the title sequence) but the script was abominable, the acting was bad, and it was just... dull. Apparently they're slated to make 14 adaptations of this guy's books, which really sums up the problem: quantity over quality.


I agree. My partner is a script writer and there is actually a talent shortage on the writing side. No shortage of writers, of course; there never is! But it's hard to come up with actually coherent stories, on tight deadlines, that can be translated directly into a video shoot and assembled into a movie or series. Clearly Netflix is under-investing here, or perhaps relying on a cabal of existing creative "buddies" on the executive side to pump this stuff out to meet KPIs.


My perspective on this is that this happens because they are in a position to afford what is basically a "throw everything at the wall and see what sticks" approach, which can lead to some really great stories/writing but will also produce subpar results. I personally like this, because I feel like they are able to give a chance to a lot more diverse ideas that won't fit traditional "Hollywood".


I've watched several that I've thought were written by AI / GPT.


This means Netflix operates as a production and post studio for their many and talented film-makers. The stuff with the red N on the poster -- some of it is REALLY good -- uses Netflix-operated handling.

Any video production goes through a lot of steps from raw studio footage to edited, finished product. This web site says that Netflix processes and people handle a lot of that.

Also it looks like 4K is their minimum standard now. I bet the higher resolution makes it easier, with big*ss GPUs to clean up the raw footage.

(Footage? WTF? Video measured by distance? Maybe we should say "frameage" or "pix.")


> I bet the higher resolution makes it easier, with big*ss GPUs to clean up the raw footage.

Even if you don't do any other post-processing, downscaled video looks better than video captured at sensor resolution. This is because you can implement a steeper low-pass filter (closer to ideal brick-wall filter) in software than you can in actual optics. Hardware lowpass filters tuned to block most stuff above nyquist will also block a lot below it.

A 1080p video captured off a good 1080p sensor will have vastly lower amplitude at, say, 500 vertical line pairs than a 1080p video captured on a 4k sensor and lowpassed in software with a good filter.
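GP's point is easy to demonstrate with a toy 1D signal: decimating without a lowpass filter folds out-of-band energy back into the result, while filtering first removes it. A rough numpy sketch (the frequencies here are arbitrary, chosen only so the tone sits above the new Nyquist):

```python
import numpy as np

fs = 1000          # original sample rate
factor = 4         # downsample 4x -> new Nyquist = 125 Hz
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 400 * t)   # 400 Hz: above the new Nyquist

# Naive decimation: the 400 Hz tone aliases down in-band at full amplitude
naive = tone[::factor]

# Windowed-sinc lowpass at the new Nyquist, then decimate
cutoff = 0.5 / factor                 # as a fraction of the sample rate
n = np.arange(-64, 65)
h = 2 * cutoff * np.sinc(2 * cutoff * n) * np.hamming(len(n))
filtered = np.convolve(tone, h, mode="same")[::factor]

print(np.sqrt(np.mean(naive**2)))     # ~0.707: aliased energy survives
print(np.sqrt(np.mean(filtered**2)))  # near 0: out-of-band energy removed
```

The same logic applies per scanline of a video frame; a good software resampler just uses a much better-behaved kernel than this hand-rolled one.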


It was always standard to acquire at higher resolution than you deliver. Hell, they shot 35mm film for SD for decades, 16mm for non-extravagant budgets.

The first use of 4k wasn't for 4k delivery, but for single-camera shoots of talking heads, reframed in post: full frame for the wide, cropped in for the MED/CU shots. A larger image also allows for cleaner image stabilization, more detail when pulling keys in post, etc.

There are lots of benefits to acquiring a larger image than you deliver.


> Hell, they shot 35mm film for SD for decades

I'm not sure this is the case. Low-ISO cinema 35mm film (like Kodak Vision3 50D/250D) far exceeds SD resolution. How do you think Star Wars was ever scanned into 4k? Or 2001: A Space Odyssey? Film captures highlight detail logarithmically as well, while digital sensors still haven't really figured out how to render highlights well. This allows greater shadow detail with a skilled cinematographer, with no loss of quality in brightly lit scenes. Theoretically, a single 35mm still can be scanned into 4k with no loss in detail, and higher-resolution scans won't lose quality -- there are just diminishing returns, and in the end you might just be getting higher-res grain. But it is a far cry from standard definition.

Even 16mm can do HD scans just fine.

There's a reason Tarantino, Wes Anderson, PTA, Nolan, Sam Raimi, Scorsese, Spielberg, and David Lynch still swear by film (even though many of these directors have dabbled with digital). The idea that film is inherently "low-res" is a really frustrating and common misconception among younger generations unfamiliar with the technology. I find it kind of unbelievable that DPs would shoot 35mm just for an SD scan. That type of use case seems better suited for mini-DV or other contemporary video formats.


Maybe you misinterpreted what I meant. What you quoted from my post was saying that they took all of the image from a 35mm frame just to shoot a measly SD image. That's like shooting 4k to deliver SD. It's seemingly way overkill.

>That type of use case seems better suited for mini-DV or other contemporary video formats.

You realize that people were shooting film for TV production long long before DV was ever invented right?


> I find it kind of unbelievable that DPs would shoot 35mm just for an SD scan. That type of use case seems better suited for mini-DV or other contemporary video formats

Lots of shows were filmed on 35mm. For example, Star Trek: The Original Series and Star Trek: The Next Generation were filmed on 35mm, so in modern times they've been able to re-scan them and make HD Blu-ray releases. I believe shooting direct to video for anything higher budget than daytime soap operas didn't become commonplace until the mid-90's.


> Hell, they shot 35mm film for SD for decades, 16mm for non-extravagent budgets.

This probably was also to deal with period TV cameras. Video cameras of that time couldn’t deal with bright lights and had terrible dynamic range. CCDs improved this a lot. Direct to video productions looked cheap.


> Hell, they shot 35mm film for SD for decades

They shot 35mm for the big screen.


Primetime TV shows were shot 35mm. Lower budget was shot on 16mm.

They weren't doing it for screen-size considerations. It's just what they had. You shot 35/16 or you shot video (shudder).


What does low pass filter mean in the context of video (i understand in audio)?


The same thing: a filter that only lets through low frequencies. A blur effect is a spatial low-pass filter. A motion blur effect is a temporal low-pass filter.

What GP is talking about is the antialiasing low-pass filter used when downsampling video to smaller resolutions, or when initially sampling it at the sensor. This is the same exact concept as the antialiasing low-pass filter used when downsampling audio to lower sample rates, or when initially sampling it at the ADC.
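To make the "blur = spatial low-pass filter" idea concrete, here's a tiny sketch: a checkerboard is the highest spatial frequency an image can carry, and even the crudest blur (a 2x2 box average) removes it entirely.

```python
import numpy as np

# 8x8 checkerboard: the highest spatial frequency an image can hold
img = np.indices((8, 8)).sum(axis=0) % 2

# 2x2 box blur = a crude spatial low-pass filter
blurred = (img[:-1, :-1] + img[1:, :-1] + img[:-1, 1:] + img[1:, 1:]) / 4

print(img.var())      # 0.25: full-amplitude alternation
print(blurred.var())  # 0.0: the Nyquist-frequency pattern is wiped out
```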


Perfect. Thanks!


Downscaling or blurring - these are the same thing if you pretend it’s bandlimited. Which it isn’t of course.


The signal is typically bandlimited by hardware — the optics and sensor. When it's not, you get weird effects, though that's rare because cameras have optical low pass filters to avoid this.

When they don't, like the Canon 5D mark ii when used in video mode (which just threw out most of the sensor's pixels to get to 1080p), you get aliasing producing moire patterns like this:

https://youtu.be/dMHDB5fG6ys?t=218


Also, video is not actually bandlimited so the supposedly perfect filter isn’t actually perfect; you can do a better nonlinear one.


What does a better filter look like?


Selective Gaussian blur or hq2x are examples of image filters.

Font hinting is an example of a nonlinear downscaling though most people don’t think of it that way. It preserves line sharpness, which is a non-bandlimited feature since lines are infinitely sharp = made of infinite frequencies.

Overly sharp linear filters produce artifacts like blockiness and ringing; nonlinear ones would try to avoid that but you could still get more difficult ones like moiré.


This is mostly obsolete with High-DPI screens though; we're at the point where such hacks are no longer needed and we can use close to ideal filters (e.g. lanczos, which is a windowed sinc). macOS no longer does font subpixel AA or strong hinting by default for this reason.

This is good, because it means we can finally treat computer graphics as properly scalable images without introducing "blurriness".

Note that lines were never infinitely sharp; pixels aren't infinitely sharp, they have a shape and the screen construction determines how sharp they look. They are, however, sharper than what you'd get in a nyquist-limited system.
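For reference, the Lanczos kernel mentioned here is just a sinc windowed by a wider sinc, truncated to a support of ±a samples. A minimal sketch:

```python
import numpy as np

def lanczos(x, a=3):
    """Lanczos kernel: sinc(x) windowed by sinc(x/a), support |x| < a."""
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) < a, np.sinc(x) * np.sinc(x / a), 0.0)

print(round(float(lanczos(0.0)), 6))  # 1.0 at the center tap
print(round(float(lanczos(1.0)), 6))  # 0.0: zero-crossings at the other integers
```

Resampling then amounts to centering this kernel at each output sample position and taking a weighted sum of the nearby input samples.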


> Also it looks like 4K is their minimum standard now. I bet the higher resolution makes it easier, with big*ss GPUs to clean up the raw footage.

Resolution is far less important than color bit depth. We barely need 4K (the Alexa used to be 2K), much less 8K footage, for something to look good on a TV.


I would think that capturing at higher resolutions makes it feasible to crop and zoom? In digital production, you always want to work at high resolution for all intermediate stages, then reduce bit depth only during the final output stage.

When I worked in music production a while back, we often had the problem of source material recorded at low amplitude to 16-bit media (e.g. DAT, CDDA, ADAT). While 16-bit is theoretically sufficient, in the real world you need headroom while recording in order to avoid clipping. So we'd often wind up with grungy sound from inadequate bit-depth during capture and intermediate production.


I know recording live music can get hairy, but yikes! How was 16 bits not enough? Were all the preamps set to max? Who records at -12 dB to -6 dB? Record at a way lower level and boost at the amp stage?


Imagine you're recording highly dynamic music, like Jazz. Clipping is absolutely never OK. You have to leave tons of headroom, there's no other way. So 16 bits is not enough for live capture — especially when the source is dynamic or the recording conditions are chaotic.

The studio I used to work at was one of the first ADAT studios back in the 90s, and they did a lot of rock. One of the ways they got past the limitations of the medium was to compress and limit to tape. You generally don't want to do that unless you have tons of time to set up and get it right (a la Rudy Van Gelder), which was not in most budgets. But it worked well enough and the ADAT recordings out of this studio sounded way better than most.

Mercifully storage has gotten cheaper in the decades since and you really shouldn't capture to 16 bits any more.


How much post-production dynamic range do you really need/want for live jazz recordings? Surely 50 dB would be plenty, and even that would be pretty impractical to listen to except in very quiet environments. Wouldn’t 16 bit digital audio easily give you 50 dB of usable dynamic range and still have like another 50 dB of headroom?


You need 8 bits to do 48dB, so if you want your music to have 96dB of dynamic range and you want your editor to be able to slide things around 48dB without perceptible loss of quality, you need to record at 24 bits of depth.

If you record at 16 bits of depth you are pretty much automatically forcing your engineer to choose between quality and range when they are normalizing the recording.
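The arithmetic behind those numbers is just the ~6.02 dB-per-bit rule of thumb, i.e. 20·log10(2^n) for an ideal n-bit quantizer:

```python
import math

def dynamic_range_db(bits: int) -> float:
    """Theoretical dynamic range of an ideal n-bit quantizer."""
    return 20 * math.log10(2 ** bits)

print(round(dynamic_range_db(8)))   # 48
print(round(dynamic_range_db(16)))  # 96
print(round(dynamic_range_db(24)))  # 144
```

(This ignores dither and real converter noise, so treat it as an upper bound rather than what actual hardware delivers.)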


Listening to music with 96 dB of dynamic range is extreme though, right?


You seem quite determined to leave inadequate margin for error. :(

The most important rule of audio recording is "always be ready to roll tape". What the engineer does is of little consequence compared to the talent. Don't ever fuck up a keeper take.

Recording with some extra bits means that even if your gain staging was bad, you just get some extra uncorrelated noise, not correlated quantization distortion.

As for whether this stuff is perceptible, there are two issues at play. First, differences which are not perceptible during production can result in a perceptible difference in the final product. This is easiest to understand with a visual analogy: if you have inadequate resolution of an image during processing, the end result may be slightly defocused and smeared.

Second, as I described elsethread, quantization distortion is a particularly pernicious form of degradation, poisonous in small amounts. https://www.youtube.com/watch?v=8KkS4NSv-0M It is hard to listen for it during production, so we defend against it by giving ourselves lots and lots of headroom.


I get it. I’m asking what dynamic range you need for the completed production, and how much additional headroom you need when recording. I don’t think you normally would need 96 dB for the completed production, because that would be very impractical to listen to.


16 bit certainly suffices as a delivery format. Many listening environments provide far less dynamic range (e.g. cars) and environments where the lowest level details aren't buried in the ambient noise floor are going to be few and far between. Even though it is possible to encode information below the noise floor in practice information that far down isn't going to be perceptible. (A practical definition of "perceptible" might be "able to be distinguished in an ABX test by a trained listener".)

As for how much safety margin is "enough" during initial capture... We're looking at a stark choice between either 2 or 3 bytes per sample. 2 bytes seems to be borderline, while 3 bytes seems to be plenty. It would be hard to quantify a safety margin more precisely than that.


Say that I need a 300x400 image for a webpage. Does it make sense to set my camera to take 300x400 pictures? What happens when we crop and postprocess?

If we want the best quality end product, free from low-res artifacting, we need to preserve an excess of resolution all the way through production until the final export. Exactly how much headroom is enough? Enough that artifacts arising from insufficient resolution are not perceivable in the end product.


Yes, I understand the reasons to have enough dynamic range and enough headroom. My question is about what specifically is enough.


I think 16-bit quantization noise level is usually lower than the noise of all the analog hardware before the ADC, but 24-bit quantization noise seems to be lower than even the internal noise of audio ADCs so there seems to be no reason to go higher.


I had the privilege of visiting the van Gelder studio in high school. I asked him about analog vs digital and surprisingly he said he was all in on digital and spending his days remastering his old work with modern tools.


My understanding is that 24-bit source material (the standard when I was last recording, which was several years ago) provides enough dynamic range that you can bring up the desired volume without overly muddying quiet sections with noise, especially when mixing multiple sources where that noise adds up quickly. My own personal experience recording at 16 bits was… "I must play louder and overstate my song's understatements," because I never could get a 16-bit output that didn't sound like garbage. Granted, I've never been more than a novice at recording, and I'm certain there's more I could learn to work with the lower bit depth, but… why?


Also: this got me wondering about how much the “loudness wars” and “brick wall” music products overlap (in time) with the almost total transition to digital recording. It also got me wondering how much the more gradual digital transition affected musical style and taste. If analog recording captured a much more flexible dynamic range, it’s easy to imagine a lot of other recording artists similarly deciding “fuck it play it louder”. And it’s easy to imagine that culturally influencing louder recording targets generally.


The resolution bottlenecks of early digital as described in this thread are not related to the loudness wars of the aughts. If you were doing big budget recordings back then you were either doing 2" analog or you had enough setup time to set gain levels carefully and "fill up the meters" for your digital recorder.

Recording close-mic'd drums to analog tape is actually a useful technique when maxing out levels because analog tape saturation can be aesthetically superior to digital peak limiting for shaving drum peaks — analog tape saturates high frequencies first. It wasn't uncommon to either record basic tracks to analog then transfer to Pro Tools, or even to track to Pro Tools then bounce the drums out to analog tape and back.

For the record, there were loudness wars back in the vinyl days, since it was seen as important to have your 45 rpm single "compete" with others in the jukebox. It wasn't something new in the aughts.

> If analog recording captured a much more flexible dynamic range

It doesn't, especially not in comparison to modern digital. Dynamic range is not a strong point of analog because of tape hiss.

The issue with low res digital is that truncation distortion is non-harmonic (not an integer multiple of the input frequency) and so is disproportionately aesthetically damaging even in small quantities — in other words, digital grunge sounds worse than tape hiss.


It is trivial for modern hardware to work at 24-bit. 16-bit hardware is probably old, overly value-engineered, or deliberately operating below its capacity. These are all more plausible reasons for it to sound bad.


> capturing at higher resolutions makes it feasible to crop and zoom?

The more you capture, the more you have to play with at later production stages.


And pay for. Larger data size across the chain isn’t costless.


Overall, data storage is not a large fraction of the cost of a TV series or movie production. If your assets grow too big and you don't plan on accessing them, you can store them in cold cloud storage, or on archive-grade BDs and tapes.


I wish we would move to higher bit color for playback as well. Dark scenes very easily result in banding (especially with streaming compression). With video shot with cameras there is usually noise and grain (real or artificial) to help break it up some, but dark scenes in video games almost always look like garbage to me. Some games, like Playdead’s Inside, get around this by, again, adding a ton of noise to the image, but that doesn’t always work stylistically.


Also, compression. ARRI 2.5k cameras made decent images, as they were a lot less compressed than some 4K cameras at the time.

Netflix streams 4K, but it often doesn’t look so great over the internet compared to any cinema in my area, which projects only 2K on a huge screen but from bigger files.


> Resolution is far less important than bit color depth.

I discovered this playing with my scanner, looking for the best settings. It surprised me.


Sure is nice to crop in though.


If you’re shooting 4K, you have to be calculated with your lens configuration to actually get that resolution at the sensor. After that kind of effort, cropping your shot in post is going to be a silly step to take.


You don’t have to crop in significantly, but just trim edges/etc.


When you used actual film it was long lengths of the stuff, hence footage. Lots of words have weird no longer relevant sources. Like “logs”.


Not what I expected at all:

log (n.2) "record of observations, readings, etc.," 1842, sailor's shortening of log-book "daily record of a ship's speed, progress, etc." (1670s), from log (n.1). The book so called because a wooden float at the end of a line was cast out to measure a ship's speed. General sense by 1913.

So it does have a relationship to the wooden log - it was originally a series of measurements from a floating bit of wood.

https://www.etymonline.com/word/log


On set, depending on the size of crew, part of the job for an AC or 1st AC is to make a camera report. Each shot is logged to note things like which lens was used, what filters were used, the height of the camera from the floor, the distance of the film plane to the focus point, the frame rate, aperture, ISO, etc. All of this info can then be used if they needed to do a pick up later to put the camera in the same setup. Also, it can be used in post production so that the camera can be recreated in 3D to have the special f/x match.

Also, log can refer to the gamma curve which affects how roll-offs in highlights/shadows are saved (rough description).

"Log" is still used daily in production work.


This may not be universal, but when I was a 2nd AC/Loader - the 1st AC didn’t log anything. Always 2nd AC along with keeping the slate.


"depending on size of crew"

To get to 2nd AC, you had a good sized crew


I suppose, but we’re talking about Netflix here. DP, Operator, 1st AC, 2nd AC, and a loader is pretty standard on even sub-10 mil budgets (which is nothing).


Debating the size of the crew is fun and all, and yes, traditionally for a full crew, this is the purview of the 2nd AC. As the crew gets smaller, those duties roll up to the next person.

The fact that a "log" is still a thing in today's productions is not dependent on which crew member is doing it. In fact, with all of the digital f/x, they are crucial.


Absolutely. No disagreement here.


See also the unit of speed for ships and aircraft, knots, after the knots tied in the line attached to the log.


"mark twain" means 12 ft. Samuel Clemens adopted it as a pen name.

https://www.mvd.usace.army.mil/Portals/52/docs/regional_floo...


Yeah and knots from the literal knots in that rope attached to that literal log. Not sure what you expected though, look at the etymology of any word and you get back to something concrete and tangible eventually (with rare exceptions for a few feelings and such.) We haven't been abstract that long, so most of it is metaphor or analogy.


It's becoming less and less of a thing as film is used less and less, but some of the older NLEs would let you view your timeline as feet+frames, allowing you to specify between 35mm and 16mm.

Film was also sold by the foot. You didn't buy film by 15 minutes. You bought film by 1000' rolls (35mm).


Or a fraction of a second at 10,000fps, where 400’ would be spooled through in under 3s (16mm)

As ingenious as our mechanical past was, I am so happy to have lived through the transition to 100% solid-state everything.


> Film was also sold by the foot. You didn't buy film by 15 minutes. You bought film by 1000' rolls (35mm).

But that just makes sense doesn't it? It's only 15 minutes' worth if you run it at 1000'/15m.


Yes, I was just mentioning that for those that might not be aware.

Developing your film was also charged by the foot. The soup didn't care about your framerate, just how much was needed to process.
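For anyone who wants the feet-to-time arithmetic: standard 4-perf 35mm runs 16 frames per foot and 16mm runs 40, so:

```python
FRAMES_PER_FOOT = {"35mm": 16, "16mm": 40}  # 4-perf 35mm; standard 16mm

def runtime_seconds(feet: float, gauge: str, fps: float = 24) -> float:
    return feet * FRAMES_PER_FOOT[gauge] / fps

print(round(runtime_seconds(1000, "35mm") / 60, 1))       # ~11.1 min per 1000' roll at 24 fps
print(round(runtime_seconds(400, "16mm", fps=10000), 1))  # 1.6 s for 400' at 10,000 fps
```

(Which also matches the high-speed example upthread: a 400' roll of 16mm lasts well under 3 seconds at 10,000 fps.)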


Thank you. Now I understand why it’s "métrage” in French, and also why a “long métrage” means a long movie!


> Also it looks like 4K is their minimum standard now. I bet the higher resolution makes it easier, with big*ss GPUs to clean up the raw footage.

I assume it's their minimum standard for Netflix Originals as they want more content in 4K rather than making it easier for them, particularly as 4K is an up-charge subscription.


From what I've read on their documentation [1], it's mostly for future-proofing their own content.

[1] https://partnerhelp.netflixstudios.com/hc/en-us/articles/150...

--- StartQuote ---

Why does Netflix require UHD on Netflix Originals?

Answer: In 2014, Netflix made the decision to begin shooting and delivering all Originals in UHD. This decision was made for several reasons, the most important of which is to future-proof our content. UHD is here, and adoption of UHD in the home is increasing. In just a few years, it will be harder to find an HD television than a UHD television. For this reason, we feel it only makes sense to shoot natively in the format that most of our customers will see for years to come. The experience for customers viewing HD is still fantastic. Our encoding pipeline takes a UHD master and produces beautiful HD (and SD) streams for all those customers who are viewing on smaller or older displays.

--- EndQuote ---


Well that and 4K60 is the new standard definition. It’s been 4 years since flagship phones that could record 4K60 were released and most of those devices are now discontinued and obsolete. It’d be hilarious if studios used their fancy optics with cameras that have a lower res than something you can get for $80 on eBay


Eh, just because it has the pixels doesn't mean it has the resolution. Often 1080p content from high quality systems still looks more detailed than 4k content from smartphones, even fairly modern smartphones, as the pixels can only record detail that gets through the optics and isn't marred by noise. With phones having relatively low quality lenses (for the size of the pixels they must resolve) and smaller sensors, this can quickly become an issue. With that said, it is great to see 4k becoming standard for productions!


Regarding "footage", notice that you also refer to them as "film-makers." There really isn't film involved. And Netflix is named after the term "flick" which refers to the flickery effect you get with movies (especially in the old days), which hopefully isn't there anymore.


Netflix farms out all this post work. That's why they have these delivery specs. You follow the white papers and then they run it through their automated systems that do QC (followed by people verifying). Then they run it through an auto pipeline to create final streaming files and archives but they don't do any further manipulation of the image. Once you start doing one Netflix job they return to your post facility consistently.


> (Footage? WTF? Video measured by distance? Maybe we should say "frameage" or "pix.")

I'll start saying one of those when my phone dialer icon stops looking like a Western Electric Model 500 handset.

https://en.wikipedia.org/wiki/Model_500_telephone


> The stuff with the red N on the poster -- some of it is REALLY good

An overwhelming majority of it is total garbage. It's such a bad ratio that I treat the red N as a negative signal when I'm choosing something.


The McDonald's of TV show production. With very, very few exceptions (Midnight Mass, maybe?) I at least know what I'm getting into: designed-by-committee, very very "safe", out of place (most likely contracted) references to other Netflix shows to try to cross-pollinate viewers... not exactly a recipe for success.

Similar to how Amazon Originals now seem synonymous with "we cut the writer's outline from 12 episodes to 6, enjoy the rushed character and plot development!"


> designed-by-committee, very very "safe"

This is why I was so surprised by some of the background references Inside Job was able to drop in its first season. (Do they still call them seasons when they release all at once? Either way it's really good)


Agreed, except for your last example. Most shows are much too long now, and extend way past the point where they have exhausted what little they had to say.


Yea, I can agree to Midnight Mass. Best quality was the plot of “X discovered by Y who thought it was Z”. But now that I think about it, the cinematography was pretty good. I know some people were put off by the long monologues, so it wasn’t without some odd choices.


I thought Dark was excellent and not at all like you described.


DAЯK was excellent, but as the saying goes, it's the exception that proves the rule.


Agreed. Netflix has so much pulp now it's not even funny. It used to produce much higher quality content. After other copyright holders started pulling out they seem to have started focusing on quantity.


The streaming services are in strong competition for original content. Sometimes buying exclusive rights to completed work off the shelf. This content seems to get labeled as an "XYZ original," be it Apple, Netflix or something else.

Do you differentiate between content greenlit and produced by the streaming services from those purchased? I'm curious because they are two different capabilities and it seems each service is attempting some amount of both.


Yes, there is a clear quality difference between the stuff they acquire/license (some of which is quite good) and the utter dreck they produce themselves.


I wasn’t aware that „dreck“ is being used in English. TIL


What other language(s) is that word used in?

It's a moderately rare word in everyday spoken English.


“Dreck” is a German word.


I didn't know that! I got it from my mum who presumably got it from her British mum.


It's traveled fairly broadly on account of having been adopted into Yiddish - I picked it up as a very young child from an issue of MAD Magazine, and was somewhat surprised to find the New Englander side of my family very familiar with it and only bemused at the happenstance of hearing it from a five-year-old.


There is also some original content which is quite good.


Yeah they greenlight almost anything. It means they take a lot of risks which can produce some really creative and unexpected treasures but also a ton of crap. It's basically the VC model applied to content creation. One blockbuster is worth ten flops.


Most stuff on Github is garbage as well, but all the best projects are there.


This analogy doesn’t hold since the best projects aren’t on Netflix. HBO’s TV catalog destroys Netflix from a quality and critical acclaim perspective. I can’t even think of a Netflix show I’d consider to be HBO-level quality.


> total garbage

Which historically correlates with commercial success.

It's why Netflix is more than happy to green-light awful shows such as Emily in Paris.


Definitely.

McDonalds grosses a helluva lot more than Wolfgang Puck -- $19.21 billion vs $57.50 million.


Yeah, footage.


Since there were questions about what this list is for: it doesn't determine which cameras a TV show or film has to be shot with to be carried on Netflix. It's rather one of several technical specifications (there are others, e.g. for the post-production workflow) for production companies who produce the so-called Netflix Originals for Netflix.


They'd probably do better to put more work into writing a guide for approved writers.

So many (many) shows with great actors completely let down by terrible scripts....


I agree so much. The recent Cowboy Bebop TV show had decent casting and high production value but the worst writing I've seen in a very long time. It was like watching The Room for TV shows. It's no surprise it got canceled. I really wish their writers could be better.


Let's just add some explosions! That'll fix a terrible script.

The king of witless, trite scripts has got to be the recent "La Brea" series. There's such a lack of creativity that the characters recycle each other's lines.


Let me guess: they stare at each other to express feelings, very very often, as a way of stretching the duration of scenes?


They'd say things like "I promise I'll come back for you", and "I can't leave XXX" ad infinitum.

They'd also do the usual plot device of splitting up and wandering around alone, for the sole purpose of getting into a fix so everyone else can rescue them.

Although sabre tooth tigers and dire wolves regularly eat the red shirts, the rest never think of arming themselves.

I watched the show out of incredulousness. And because I enjoy snarking on bad shows :-)


yes but this has worked to create uniformly beautiful "netflix original" documentaries


On their anime production requirements (https://partnerhelp.netflixstudios.com/hc/en-us/articles/360...):

> The working resolution must be 1920x1080 or higher throughout the production, including drawing, scanning, background, CG and VFX.

At least a few years ago, there was a belief that most anime were produced at under 1080p, then upscaled if necessary. Therefore 1080p anime didn't offer any advantages over 720p releases. I wonder if this is still true to this day.


IIRC, the quality concerns in anime have never really been about resolution — in all the circles I used to dabble in, the real concern was color bit-depth, with 1080p just being a given. Last time I checked in, 10-bit rips of bluray releases were the holy grail and anyone “serious” about their anime quality made sure to have all the proper media players and codecs to handle it.


You're not wrong, 10-bit depth is essential to avoid color banding. However, I believe that all anime Blu-rays are encoded in 8-bit AVC, so the issue is not the bit depth in the source material. Rather, fansub groups re-encode Blu-ray releases to achieve a smaller bitrate, and the 8-bit => 10-bit conversion allows a higher compression rate while still avoiding banding.
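To see why bit depth matters so much for banding: here's a toy quantizer (illustrative only, not any real encoder's math) applied to a dark gradient covering just the bottom 10% of the signal range, where banding is most visible. The 10-bit version gets roughly 4x as many distinct steps, which is the difference between visible bands and a smooth ramp:

```python
def quantize(x, bits):
    # map a [0, 1] signal value onto the integer code range for a bit depth
    levels = (1 << bits) - 1
    return int(x * levels + 0.5)  # round half up

# a dark gradient covering only the bottom 10% of the range
gradient = [i / 1000 * 0.1 for i in range(1001)]

bands_8 = len({quantize(v, 8) for v in gradient})    # 27 distinct steps
bands_10 = len({quantize(v, 10) for v in gradient})  # 103 distinct steps
```

Real codecs also dither and model noise, so the visible benefit is subtler than raw step counts, but the ~4x ratio is why 10-bit encodes can hold gradients at lower bitrates.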


Did Arcane fall under those requirements? It's been a long time since I saw something animated and styled so beautifully.

Before that, I was usually impressed by Ufotable's quality (Garden of Sinners, Fate/Stay Night stuff).


Arcane is 3D animation (vectors and polygons) over painted backgrounds for 100% of it I believe.

Anime is computer-edited 2D cels drawn on tablets, for foreground and background, which are then layered. The 3D work is limited to certain scenes.

3D animation can be scaled up and down, but the 2D cels, which are hand drawn and converted to raster images, can't necessarily be rescaled.

I imagine with anime they just agree on a resolution for all the rasters and plug away at it, because they're under insane time/budget pressure.

The painted backgrounds who knows.


Just to add, Arcane also makes use of 2D effects animation. You can notice this when there is fire, for instance, and in many of the explosions as well. Impact frames are also likely hand drawn.


I imagine the 2D effects animation is also done in vectors like the 3D.

Maybe a better way of phrasing my answer is that Arcane was done in vector shapes.

Whilst anime is sometimes done "quick and dirty" by agreeing on a raster resolution and mastering everything in raster.


Netflix is like an in-flight entertainment system. There's really nothing there you genuinely want to watch, but here you are, you might as well watch something.

Every once in a while, they do have a good show. And even that sucks, because you know they will now ruin it by adding 3 seasons where absolutely nothing happens.

And there's the woke thing spreading its wings. Even in the Witcher, where it's perfectly fine to behead people and burn down villages, one must not engage in acts of "toxic masculinity" by making dismissive remarks at women. This requires a lengthy dialogue to correct.

Yes, I should cancel.


Agreed.

I’ve been at a friend’s with all the streaming subscriptions. The most used is HBO (been watching Silicon Valley and they’ve got tons of movies), Amazon is second (terrible ads at the beginning, sells movies as well), then Disney+.


I would like to see the film industry focus on not only increasing resolution, but frame rate as well. 24fps has remained a standard for too long mostly because of legacy and stylistic reasons, but with the switch to digital productions and projections, and internet bandwidth increases across the board, there's no reason why shows and movies couldn't be produced and delivered at 60fps and beyond.

The main argument against this is that consumers would find the "soap opera effect" jarring, or that it would add even more strain on traffic for ISPs. These are minor complaints that would go away as more high FPS content is produced, and bandwidth is increased. With the advent of cloud gaming, the infrastructure should already be prepared for these high bitrates anyway. And consumers could always be given the choice to stream at different frame rates if it bothers them stylistically.

I for one would appreciate more detail in fast moving scenes that otherwise become a blurry mess at 24fps.


>stream at different frame rates if it bothers them stylistically.

I think that would be less than ideal. I think usually with 24fps, the shutter is open for 1/48th of a second. So you get a certain amount of motion blur, and movie viewers are used to that amount of motion blur. If you shoot at 60fps, the shutter will likely be open for 1/120th of a second. If you then play it at 30fps by cutting out half the frames, the motion blur will still be 1/120th, which is probably not what the viewer wants.


The shutter-open-for-half-a-frame or "180° shutter angle" is commonly recommended as a default for shooting video. It is the result of having to compromise between inaccurate recording and a blurry video when your frame rate is low.

With 24fps and a small "shutter angle" you are capturing wrong information (e.g. resulting in wheels turning backwards) and you get a choppy look. With a large shutter angle you get too much motion blur. 180° is neither really correct nor really sharp, but it's the most commonly used compromise.

These problems are really the result of a too slow frame rate though - with 60fps you can shoot with a 1/60 shutter, and you get 100% accurate playback while still having frames sharper than a 24fps movie (with 1/48 s shutter) and a smooth video.

When you capture with a high frame rate (e.g. 120fps) and a 360° shutter angle (1/120s) you can also easily reduce the frame rate later without any ridiculous hacks like 3:2 pulldown by combining (averaging) frames or leaving them out, depending on the motion blur amount you want.

Applying the "180-degree-rule" to frame rates other than 24 or 30 fps just comes from missing knowledge or misunderstandings from what I can tell.
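The arithmetic in the comments above is simple enough to sketch. Shutter time is just (angle / 360) of the frame interval, and with a 360° shutter at a high frame rate you can reduce the frame rate later by averaging frames, since the exposures tile the timeline with no gaps (a sketch of the idea, not any production tool):

```python
def shutter_time(fps, shutter_angle_deg):
    # the shutter is open for (angle / 360) of each frame interval
    return (shutter_angle_deg / 360.0) / fps

t_film = shutter_time(24, 180)   # the classic 1/48 s film look
t_hfr = shutter_time(120, 360)   # 1/120 s, gapless capture

def reduce_frame_rate(frames, factor):
    """Average groups of `factor` frames, e.g. 120 fps -> 24 fps.

    With a 360-degree shutter, averaging 5 consecutive 1/120 s
    exposures reproduces the motion blur of one 1/24 s exposure.
    Frames are modeled as plain numbers here for simplicity.
    """
    return [sum(frames[i:i + factor]) / factor
            for i in range(0, len(frames) - factor + 1, factor)]
```

Leaving frames out instead of averaging them would give the sharper, choppier small-shutter-angle look, which is the trade-off the parent comment describes.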


When I watched The Hobbit in high frame rate (HFR) it was jarring for the first 10 minutes, but after that I really enjoyed the look, and wished it would become the new standard.

Unfortunately the list of HFR movies remains rather short, with each grand new attempt at popularising HFR having failed: https://en.wikipedia.org/wiki/List_of_films_with_high_frame_...


I suspect production costs, especially for special effects, rise significantly with a higher frame rate.


Agreed. There's a new platform called TrueCut Motion [0] for handling this sort of thing, which I can see becoming big. There were some examples of it at CES this year [1]

[0]: https://www.pixelworks.com/en/truecut [1]: https://twitter.com/Vincent_Teoh/status/1479507421546622977


James Cameron’s latest installment of Avatar will be in high frame rate. In fact he’s been advocating for high frame rate for many years now in the industry. Went so far as to shoot a mini movie demonstrating various formats and their pros/cons.

Was shown at CinemaCon years ago and is available within the industry.


A couple of months ago I had a Zoom meeting with the CEO of another startup. I remarked his camera must be good because the capture was of incredible quality. His "webcam" was actually one of the Canon ones on the list.

https://www.bhphotovideo.com/c/product/1633519-REG/canon_379...


Having a C for webcams is really over the top! The qualities of a digital cinema camera are wasted on such a compressed stream. A mirrorless and basic prime will give you the same effect for 1/10 the price.

That said it's entirely possible he needs to record serious footage on it for marketing and stuff and they sent him this because it shoots to their standards. Still, it's funny.


Maybe it's a pre-production prototype, repurposed to fit a need :)


I’m a hobbyist videographer and I’m not above setting up my lighting (3-point with octobox, strip box, and hair snoot) and camera (a7s iii with G lens) as a webcam to impress. I’ve recorded a few instructional videos at work with it, set it up for the first couple weeks after acquisition to meet the team, and I’d use it for job interviews if I was looking.

It’s not why I have the gear but… “smoke ‘em if you got ‘em.”


Canon released an app to allow most of their cameras to run as a fairly low-res webcam (it's a streaming capture from the liveview system rather than 720p).

This was really great for me with Zoom because I could get a sharp colourful image even in the low lights I had available, where a standard webcam would have dissolved into a grey mush of noise reduction.


I've got a 720p $50 (pre-pandemic price) Logitech at home. I regularly get the same questions about picture quality. This is because everyone is using Apple laptops with awfully shitty cameras and is just used to it.


I've had the opportunity to work on two short productions for ESPN in the last two years. My role is writing the code for the screen overlays and real time scoring system.

I have very little knowledge in cameras but did have to review all the production guidelines.

Netflix doesn't mention image sensor size (CCD) but ESPN requires 2/3" or more. I'm guessing every camera they list as approved would meet that spec even though they don't specify it.

At the time I was surprised to learn that ESPN wants you to deliver all content in 720p. Wikipedia says that their Digital Center is equipped for 2160p, but they still broadcast in 720p. The archivist in me would want to receive the material greater than 720p even knowing it is to be broadcast in 720p, but I can understand the additional complexity that such a small thing can add, especially when dealing with hundreds of production companies.

I speculate that Netflix wants you to turn in higher quality video because they are factoring in that over time encoding and streaming algorithms and increased bandwidth will allow them to push higher quality feeds to end users, and they want the highest possible raw capture so they can improve the stream as technology improves, whereas ABC is content with airing footage in the future that looks as it did when it aired originally.


On a similar note, working at a TV station I've noticed that if you look carefully, about a third to half of a person's channel lineup is still in SD. Not only is there a wealth of 1080 content, but it's also difficult to even buy equipment that only produces standard definition. Still, it's cheaper to downconvert high resolution content than it is to upgrade the amount of bandwidth one has available. To this day internet streaming is considered inadequate for a network and transmission is being done over satellites and antennas which are pretty costly to upgrade.


I'd like to see their guidelines for the social content of their productions. Every Netflix production I've seen seems to feature commonly themed social commentary, in some cases more subtle than others.


Looking for blueprints for Jewish space lasers, are we?


It would be like asking to see Facebook's algorithm for the news feed.


Why would you like to see it? (Assuming any such thing exists, which I can't imagine)


To confirm their own assumptions as to how the media operates.


I know we can't do this here, but can we talk about the scroll-locked header which takes almost a third of my screen?


Don’t forget the squashed pictures of the actual cameras.


Came to say exactly this: the designer of this site doesn't know a thing about web design.


It looks better if you view it on their Roku app ;-)


It's too bad they don't screen their actual content with the same rigor as they do their technical side. I don't doubt their technical side is excellent.

Most of Netflix is utterly mindless garbage content for background consumption. I struggle to name even 5 of their originals over the years that you could call good "TV" or Cinema.

Black Mirror is the only one that really comes to mind.


Stranger Things

The Crown

Bojack Horseman

Lupin

Unbreakable Kimmy Schmidt

Russian Doll

Glow

Narcos

Sex Education

House Of Cards (although, this is less fun to watch now considering Kevin Spacey's real life actions but the show itself was and is quite good)

Dark

The Witcher

Daredevil

The Haunting of Hill House (similarly Midnight Mass and Haunting of Bly Manor)

All of these are award winning as well as just being incredibly good shows. I think Netflix just pumps out so much content. There's plenty of bad stuff, but there's also plenty of very very good stuff.


Black Mirror was originally a Channel 4 (UK) production. It moved to Netflix in the third season, so I wouldn't quite call it a Netflix original.


> I struggle to name even 5 of their originals over the years that you could call good "TV" or Cinema.

Netflix puts out plenty of good stuff, some people just can't distinguish "I don't like it" from "it's bad."

Comments like this are usually just an invitation for someone to put forth what shows they think are decent so you have a chance to dunk on their taste.


Well, someone did actually put out a list, and some of those shows look pretty interesting, although I do disagree with some picks.

A decent outcome nevertheless!


If you can name some things you like, I'll try to name some Netflix originals that might match?

I think you could be pleasantly surprised


Now do "sound must be remastered for stereo speakers."


This is the most ridiculous thing about Netflix. So many great brains there yet none of them can perceive a world where not everyone has a high end home cinema.


I mean, this isn't for movies that are bought by Netflix from an existing rights holder, it's for movies and series that are commissioned by Netflix. It seems reasonable that they'd want to capture as high quality as possible for as much content as possible, as you can always lose quality in post (or distribution). I don't suppose this would preclude directors from using non-approved cameras when needed (like a clip from a cell phone camera).

I also have a hard time believing Netflix would be ok putting up money to capture a picture/series all in 16mm film (or that anyone would suggest the commercial success or artistic vision demanded it).

So it doesn't stifle creativity, it future-proofs Netflix for the lucrative high end market, AND Netflix is paying (at least indirectly) for that level of quality. I don't see the problem here.


You may have mixed up where you're responding to: This particular comment thread is not really about the cameras that are in use, but instead about the sound. (Sound recorded by the camera itself -- if it can even record sound -- is essentially never used.)

The problem is that while the camera requirements are defined, presumably to set some lower bound on the quality of the visuals, there is no requirement that the audio sound good on stereo speakers, which is probably what the majority of consumers are using.

The idea that Netflix have such a strict definition on camera quality is also a bit of a farce, given how woeful the image looks after it goes through their incredibly overenthusiastic level of compression, but that's neither here nor there.


I think the compression complaint is very fair. It totally negates the excellent camera quality! It doesn’t matter if your camera has excellent dynamic range if dark scenes in Netflix are compressed to shit and look terrible.

Compare to Disney+. As much as I dislike Disney, they have 4K Dolby vision for no extra cost, and quality is excellent. Prime and Netflix could both take examples out of their book. (Some of the compression in recent Amazon shows has also been very bad. For example, the trees in the background of the wheel of time were often a mess of compression artifacts even at 4K.)


I think you're right, I mistook OP's comment as being about the video, not the sound. It does seem weird they wouldn't have a minimum sound quality, but it is a tricky thing moving 5.1 to stereo, so that's more about the mixing.


This is their studio PRODUCTION standard. It has very little to do with the exotic capabilities, or not, of each viewing endpoint.


I guess they’re optimizing for customers who do, and then transcoding to lower qualities for all the people who do not?


Or allowing the home hardware to do the mix for them. This is when it sounds the worst in my not so humble opinion.


Seems prudent.


Sadly the requirements for sound mastering state that having only a 5.1 mix is fine, and if you’re doing a stereo mix it’s alright to just slap the centre/surround channels into left/right after dropping the volume a bit. https://partnerhelp.netflixstudios.com/hc/en-us/articles/360...

That last point means that dialogue is likely to be quieter than sound effects and music, since dialogue is usually on the centre channel.
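For illustration, the "slap the centre/surround channels into left/right after dropping the volume a bit" downmix described above looks roughly like this. The -3 dB coefficients are the common ITU-style defaults, an assumption here, not anything the Netflix spec necessarily mandates:

```python
def downmix_5_1(L, R, C, Ls, Rs, center_gain_db=-3.0, surround_gain_db=-3.0):
    """Fold 5.1 (ignoring LFE, which is usually dropped) down to stereo.

    -3 dB (~0.707x) on centre and surrounds are ITU-R BS.775-style
    defaults; a dedicated stereo mix would instead be balanced by ear.
    """
    c_gain = 10 ** (center_gain_db / 20)
    s_gain = 10 ** (surround_gain_db / 20)
    lo = L + c_gain * C + s_gain * Ls
    ro = R + c_gain * C + s_gain * Rs
    return lo, ro

# dialogue-only input: the centre channel lands in both outputs at ~0.707x,
# while hard-panned effects keep full level, so dialogue sits lower in the mix
lo, ro = downmix_5_1(0.0, 0.0, 1.0, 0.0, 0.0)
```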


This pretty much ended around Blu-ray/HD-DVD switch from DVD. It was pretty standard to have a dedicated 2.0 Lt/Rt (sometimes a Lo/Ro) mix that was typically included as the first (default friendly) audio track on the DVD. This was convenient for people that did not have a surround system. Ideally, these were separate mixes vs automated downmix. They typically sounded good in stereo only. With the HD formats, it became the norm to no longer include the stereo source on the disc. Now, the expense of this specific mix could be avoided.

Who in the world only has 2 speakers? /s


Still, Blu-ray discs with the downmix metadata set correctly should still sound better in stereo than a streaming source that was downmixed without dynamic range and mix preference metadata.

The technology is old enough to be long out of patent either way.


Dolby's DialNorm setting was their true bit of gold. Incorrectly setting that value on surround encodings had a huge impact. Correctly getting that DialNorm value required Dolby gear. Many people could not afford that gear, so a -27 value was set across the board whether it was correct or not.

Metadata only isn't a panacea



What's the problem?


When the sound design is targeted for 5.1, most of the dialogue comes from the center speaker.

This is great if you have one.

However, if it's downmixed on the fly (i.e. done on the client side) the center channel is often just played out of both speakers. This means that sound can be muffled, because it's drowned out by the incidental noise that comes from the left/right channels.


> the center channel is often just played out of both speakers

Well what other option do you have when there are only two speakers? Only play it out of one?


The center channel needs to be made louder in the mixdown to 2 channels


But why can't that be done automatically from the 5.1 signal on the consumer's device?


It could. But it's an artistic and a sound engineering decision that changes based on what you actually want the listener to hear.

The defaults for automatic sound mixing will almost always be wrong. And they will differ in how they are wrong from consumer box to consumer box.


I wonder if, instead of a separate mix, 5.1 could be modified to include hints for how to better down-mix a given production.


It can. That's part of the Blu-ray spec. But it's not standardized in streaming video AFAIK (not that Netflix has to care about that, they have their own player) and, even if the feature exists, somebody still has to go do it.


A speech recognition model can give you a reading on how understandable the speech is and use that information to guide the channel volume in the mixing.

OTOH, a lot of the models end up trained on features that are very different from what humans hear.


Exactly, and I have no idea why this is not done - is it a technical reason, or just lazy consumer device programming?


No,

When you master for 5.1, because you have the centre speaker, you can have dialogue at 100% volume, while out of left and right you can have incidental noise, be that music or "atmosphere", also at 100%.

(It's been a while since I've mastered in 5.1.) However, two channels at 100% volume (well, 0 dBu) are louder than just one channel, which means that if you have lots of music, wind or other foley it'll drown out the dialogue.

It requires artistic choices from the sound team to make work properly.


The next sentence explains why this is a problem.

> This means that sound can be muffled, because its drowned out by the incidental noise that comes from the left/right channel.


Yeah I read the same comment you did, but if you're taking a signal with 6 channels, and only have two channels out, you don't really have any option but to send the middle channel to both outputs, else it'll sound far too intense in a single ear.


A separate stereo audio mix.


How else would you mix the dialogue except to come out of both speakers though?


You mix the dialog louder for the entire piece, and every other sound in the middle. No extreme highs, no extreme lows. General compression with dialog forward choices.


I don't know why compression isn't built in to consumer media devices, it's so often called for (and closely followed by volume normalisation ... but I guess the advertisers veto that).


> I don't know why compression isn't built in to consumer media devices

As always, it depends on the device. Dynamic range compression seems to be a relatively common feature, usually as an option described (inaccurately) with something like "Reduce Loud Sounds" like it is on the Apple TV.
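A "Reduce Loud Sounds" style feature is, in essence, a dynamic range compressor. A minimal hard-knee, per-sample sketch (the threshold and ratio are made-up illustrative numbers; real devices operate on a smoothed signal envelope with attack/release times rather than individual samples):

```python
def compress(sample, threshold=0.5, ratio=4.0):
    """Hard-knee compressor on a single sample in [-1, 1].

    Level below the threshold passes through unchanged; level above it
    is scaled down by `ratio`, taming explosions relative to dialogue.
    """
    mag = abs(sample)
    if mag <= threshold:
        return sample
    out = threshold + (mag - threshold) / ratio
    return out if sample >= 0 else -out
```

With these numbers a 0.9 peak comes out at 0.6 while quiet dialogue at 0.3 is untouched, shrinking the gap between explosions and speech.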


Just make it a bit louder, before summing it into the left and right channel


Right... so why isn't that done already from the 5.1 signal?


Because other times when it's not dialogue (or even when it is--the busy/crowded street effect) you may not want the center channel gain to be raised during mixdown.

Consumer hardware can only guess. A sound engineer can know.


I have no idea why that’s not being automatically done when the mixdown occurs in the player.


Wild volume swings, insane punches for explosions but silent dialogue.


It’s SOP for Hollywood. Sad we’re not getting Real Innovation around this problem.


Blu-ray supports downmix metadata that I think lets you mix volumes with the full 6x2 matrix, and I don't remember if there's EQ but there may be. Also a 5.1 AV receiver is like 500 bucks if you have the space. I don't know what other innovations you need. The technology to stream different audio streams and let the user choose is there.

Thing is most studios now do only one mix for theaters and also most streaming providers don't give many fucks about audio. It's not an innovation problem it's a product problem.


Yeah, I have no interest in surround sound, I just want to hear the words. I specifically don't want to be rumbled by gunshots and explosions.


Top of the list is Munich based ARRI. A fascinating story by itself: https://en.wikipedia.org/wiki/Arri


Industry holds Arri color science above all others. Even when Red and others were shooting higher than 2k images, people preferred to shoot Arri for the color.

Arri was quite proud of their RAW format. I was told that on top of the price of the camera body, it was $20k to license the RAW format for the camera. Their RAW format was a sequence of frames that used the frame number, derived from timecode, appended to the filename. Had a strange situation with a client's Arri RAW footage. Typically, the camera's timecode is set to time of day. However, this particular day the camera op failed to do this. During one of the shots, the timecode rolled over "midnight", so the 23:59:59:23 timecode had a frame number of 2073599, but the very next frame 00:00:00:00 had a frame number of 0000000. Importing the footage broke this shot into 2 pieces. The last half of the shot showed up as the very first file in the imports, and the first half showed up as the last. RAW formats that are saved into a container format avoid this.
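The frame numbering described above is just HH:MM:SS:FF timecode flattened into a running frame count, which makes the midnight rollover easy to see (a sketch of the arithmetic, not Arri's actual code):

```python
def tc_to_frame_number(hh, mm, ss, ff, fps=24):
    # flatten HH:MM:SS:FF timecode into a single running frame count
    return (hh * 3600 + mm * 60 + ss) * fps + ff

last_before_midnight = tc_to_frame_number(23, 59, 59, 23)  # 2073599
first_after_midnight = tc_to_frame_number(0, 0, 0, 0)      # 0

# the frame shot immediately after midnight gets a filename that sorts
# before every other file in the sequence, splitting the shot on import
```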

TL;DR: camera peeps love Arri, post peeps have other opinions.


All that color science just for some lazy editor to drop a blue LUT over it and make everything nice and drab and samey.


An editor isn't the one who picks the direction of the color grading. A--wait for it--director is.

The idea that an editor is "lazy" for doing what they're told to do by the director and DP is pretty offensive.


Some editors are chosen for their creative input. Sometimes it is the editor that convinces a director, but it's often a collaboration. Editors are part of the creative team of a production rather than just grunt-level workers. The director, DP, and editor can all collaborate so that the image is conveying the desired feel to match the story and vision. Or so the theory goes.


Absolutely. But within epsilon of zero editors on a real production are just having it dumped in their lap without direction. It's a collaborative thing at best and the post to which I replied is the same sort of sneer-at-the-grunts attitude that makes people lecture the Starbucks worker for company policy.


Definitely not unique to video.

People are happy to take a Leica camera and then apply the same Kodak Portra colour filter that every influencer seems to use.


At the director's approval.


The really interesting part is that ARRI Alexa was pretty much the #1 camera in Hollywood but Netflix won't accept it because it's not 4K.


Alexa is a camera line/category, not a specific camera.


The original Alexa, yes, but that's from 2010.


The last generation Alexa Mini too.

To OP's point, it's shortsighted that they'd accept a Mark I FS7 over nearly any image any Alexa has ever produced. The extra .8K it lacks on paper is more than made up for in overall image quality.


I guess this list is not exhaustive, or they are more flexible with their major films, because some of the 2021 Netflix films I've heard of were shot with the Alexa Mini, which isn't on the list. One was shot partly on Betamax, partly on Super 8 film.


curious - does anyone know the least expensive camera on their approved list? It's a lot for me to look up price for each one


Second-hand prices will always be variable, but the cheapest brand new are probably:

Panasonic BGH1 - $2k

Panasonic S1H - $3.7k

Canon C70 - $5.5k

Blackmagic Ursa - $6k


Those prices are likely body-only. By the time you add lenses and other accessories you're probably looking at 2-4x the price of the body.


Thank you!

NB: The BGH1 is literally just a body with a sensor and a couple of sockets, and that's it.


Thanks for that! Not as bad as I would have thought.


From a quick look I’d say Panasonic S1H

It’s the only one that’s just a mirrorless camera built with video in mind, while the others are full blown cinema cameras


I was going to ask the same thing if no one else had.


> Limited use of non-approved cameras is allowable in certain circumstances (e.g. crash, POV, drone, underwater, Pan-Tilt-Zoom/Robocam etc). In all cases, such cameras must be explicitly approved by Netflix for a specific project.

The implications here are amazing.

Say someone wants to like film a mountain bike scene on a GoPro. Someone has to realize how this is going to work in advance. Then they have to tell someone to write Netflix an email to ask for permission. Someone writes that email. Someone at Netflix receives the email, and asks their team whether or not to approve the exception. A meeting is scheduled and it's discussed. They decide yes! They make a note of the approval in their project planning system. Or rather, try to... the system doesn't support that. They file a bug report. The engineering team's product manager takes a look and decides it's important, so they have a short call with the team's manager to get it prioritized. It gets prioritized. The product manager tells the Video Camera Approvals department that it will be looked at in the next couple weeks. Meanwhile an engineer picks up the ticket, decides it makes sense, and adds a new column to the ongoing_production table: 'boolean unapproved_camera_exception_granted default false'. Some glue is put together to make this a checkbox, and in just a week, the feature is launched! The camera exceptions department is informed. They say they actually wanted it to be a list of camera models that were approved, the date that each camera model was approved, and the user who approved it. Also, can you add a button to the client-facing UI so that production companies can make a request there? The product manager says this is a great idea and prioritizes it. The engineer that added the checkbox is on vacation, so the more complicated version of this feature is given to the new hire as their starter project. The new hire puts together a small document with an overview of what they plan to do: a database migration to add a new table to store approved cameras, the UI work, a messaging system to tell the video camera approvals team that there is a request to review, etc. A design review is scheduled for next Thursday.
Don't work on it until then, we'll do the rest next sprint. Meanwhile over in the video camera exceptions approval department, the production company writes in "hey, we're running out of time, should we just use a GoPro?" The exceptions manager writes "This is approved. We're having some trouble adding a note to the CRM, but that should be fixed in the next couple months. For now, if anyone asks, this email is the record of your approval." Unfortunately, right before they press send, a cat picture is posted to Slack. Slack makes a loud sound to notify everyone of this, and the email window is closed as everyone goes over to check out the cat picture. As it turns out, this email just lives in the Drafts folder for the rest of eternity. It was the Exceptions Manager's last day; turns out Hulu is paying Video Camera Exceptions Managers twice as much, and in theory has better software. Meanwhile, over at the video production company, they are frustrated that they can't get ahold of anyone at Netflix to approve the camera, so they decide to just strap a Sony F65 to the rider's handlebars. There is no off-the-shelf mount for this, so they shop the idea around to local machine shops. One offers to design and build it over the weekend so they can start using it on Monday. $20,000 for the design and first prototype; $10,000 for the rush job. Approved. On Monday morning, they attach the F65 to a bicycle in the studio and pedal it around. A little wobbly, but nothing our stunt rider can't handle. They book her for a shoot tomorrow. She inspects the bike and says "whoa, this is weird" but gets used to it after a while. They head out to location and begin the technical descent. So far so good, even with the very wobbly cockpit. She hits a huge jump and gets higher up in the air. The bike unexpectedly pitches downwards. The landing is not good! CRUNCH! She's off the bike. The camera flies into a ravine.
The chase crew stops and throws their bikes away as quickly as possible to check on the stunt rider. She's conscious. They quickly call 911 to get a medical team out here to assist. Broken neck? Broken back? They don't know what to do. Unfortunately, there is no cell reception. Someone volunteers to hike to the summit to call. 20 minutes later, they're talking to 911. They can't get an ambulance out there, so they're sending a helicopter. Do you have the GPS coordinates? They don't, but offer to signal to the helicopter when they hear it. Someone probably has a mirror that they can signal with, right? 20 minutes pass, and the helicopter is overhead. The rest of the team doesn't know that one is coming, but someone has the bright idea of using one of their cheaper lenses as a mirror. They attract the helicopter pilot's attention. A paramedic descends through the trees, attaches a back brace to the stunt rider, and they're off to the hospital. The rest of the crew descends the slope on their bikes, reaching the production van and telling them the news. They're weeks behind schedule at this point, they might have to just cut this technical scene.

Some time passes. The stunt rider makes a full recovery. It looked worse than it was, and she's back in the saddle within a week. The scene is cut from the film, and it's delivered in time. The film gets amazing reviews. Subscriptions are at an all time high. The CRM software is modified to support the video camera approval workflow, and it works great. The junior engineer is promoted. A quarterly business review shows that the processes are working great, and that the scrappy Netflix is out-innovating all of the competition. High fives all around. Their share price increases and all the employees are richer.

This is the American economy in a nutshell, and I'll be honest -- I don't fucking get it.


Send this script to Netflix, they can make an Original Series out of it.


Seriously laughed out loud when the ending brings it back to the junior engineer and the engineering processes. Well done!


I'll have what he's having.


What is the purpose of this document, i.e. who does it apply to? Does it mean that a movie like Tangerine[1] (filmed on an iPhone) would never come to Netflix?

[1] https://en.m.wikipedia.org/wiki/Tangerine_(film)


I suspect any film that was good enough could bypass many, many of the guidelines they have for partners, including use of a camera that is not on that list.

The collection of documents seems to be guidelines and templates to begin from if seeking to partner with Netflix.

It doesn't seem like a set of requirements a film must meet to be considered for inclusion in the catalogue, but it at least offers a baseline standard that can be deviated from as necessary.

For example, the general safety and guidelines document they provide for the US includes guidelines on weapons handling and stunts.[1] Having documents like this to start from might remove some of the administrative overhead to creating art in film.

To offer a comparison, YC has the SAFE investment template, an enterprise sales agreement template and recommendations for organization of startups. Using these might make a new company an easier fold into the portfolio but if the startup is successful enough, many things likely can be seen past.

Regarding the iPhone specifically, the device has been able to record in 4K since the 6S and 6S Plus, one generation after the 5S that Tangerine was shot on. That said, I don't know the details of video capture well enough to say whether anything up through the 13 Pro compares with these other cameras.

[1] https://drive.google.com/file/d/1gs6n32bMC_2zLGp2jI4U3JA7k7q...


> I suspect any film that was good enough could bypass many, many of the guidelines they have for partners...

To clarify, I believe this is for original Netflix content (partners) where Netflix does the production. 3rd party studios licensing their content would be exempt from this (ABC/NBC/CBS/FOX). The big issue is that you have to get pre-approval for non-approved cameras.

I suspect that Netflix/Apple/Amazon is trying to make their name in content beating out cable and movie production studios. High quality 4k video is what people are clamoring for. I've seen a lot of cable/broadcast 1080p (The local CW station comes to mind) that's been recompressed to a lower bit rate. Effectively giving you 720p resolution in 1080p format -- it's not nearly as enjoyable.
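One rough way to quantify that effect (a back-of-the-envelope sketch; the 3 Mbps figure is an illustrative assumption, not a measured Netflix or CW number) is bits per pixel: at the same bitrate, a 1080p stream has far fewer bits to spend on each pixel than a 720p one, which is why aggressive recompression can leave 1080p looking worse than honest 720p:

```python
def bits_per_pixel(bitrate_bps: float, width: int, height: int, fps: float) -> float:
    """Average number of encoded bits available per pixel per frame."""
    return bitrate_bps / (width * height * fps)

# A hypothetical 3 Mbps feed at 30 fps, at two resolutions:
print(round(bits_per_pixel(3_000_000, 1920, 1080, 30), 3))  # 0.048
print(round(bits_per_pixel(3_000_000, 1280, 720, 30), 3))   # 0.109
```

The same bit budget spread over 2.25x as many pixels is where the macroblocking comes from.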


Look at the domain name. It's Netflix Studios, not Netflix the consumer service. If you aren't doing business with Netflix Studios then ignore it. I'm sure you already know that there are many films and TV programmes on Netflix that were made before any of these cameras existed that they're perfectly happy to accept for Netflix?


High Flying Bird was a Netflix movie shot on iPhones. Granted, that was Soderbergh, so he probably gets a lot more pull than most other filmmakers. It was a movie where they bought the distribution rights and didn't produce, so maybe that stuff is different. I'm pretty unclear on what really constitutes a "Netflix Original" — lots of stuff that gets that label is really just distributed by them.

Pretty sure some of their other original stuff was shot on film too, which obviously wouldn't meet these requirements. At least, I know the non de-aged scenes for The Irishman were done on film. But again, maybe the big guys get more leeway here.


You'd be amazed at the number of people involved in creating content who know nothing about the technical aspects of content creation. In most situations, they hire people to do things for them, and they will typically ask what kind of specs they are looking to deliver.

Any company that produces something will have similar type of "specs".


I once thought about a high-end camera rental side business. Never really completed the research, but I am curious again... Do studios and creators own these cameras or they usually rent? Is there an aggregated marketplace where I could be able to put my camera for rent if I do eventually get one?


Renting out cameras might give you influencers or youtubers as customers, but for producing a feature film or a TV show the rentals include everything from lights and lenses to generator trucks, with the camera only being the very tip of the iceberg. Production companies will continue to go to the rental companies where they can get everything at once. For reference, a look at perhaps the biggest rental company might give some perspective: https://www.arrirental.com/en


The old saying was you buy the lenses, rent the camera. It's even more applicable today. Camera bodies change almost annually, but the lenses can carry over from body to body.

However, in feature film work, some lens collections can easily cost $250,000 for a set of 6 lenses (Leica Summicron-C). Cooke lenses could reach $100k. So not many people owned these either. You just rent, and you get to write it all off as an expense on the production's budget. Camera gear used to be so expensive that few owned it.


Don't do it. Cameras and lenses are a low margin, high labor industry. Professionals are exceptionally picky about their gear, so you need skilled technicians working hard to keep your gear in top shape, and the prices are very low.

The bigger rental houses make all their money on other things -- the ROI on sandbags, sound blankets, even consumables is ridiculous compared to cameras. (Granted I'm blending grip and camera, but some houses do both.)

You're also competing with owner/operators, and even in-house studio operations. It doesn't scale -- relationships matter and it's highly regional. Compared to other ways to put your capital to work, it's not a great investment.


My FIL (a serious amateur photographer) regularly rents specialty lenses from http://lensrentals.com

They rent cameras too, not just lenses. The RED Raven (one of the Netflix-approved models) rents for $490/week. That's just for the "body" (what RED calls the "brain"). A full kit with accessories is $850/week. Lenses are on top of that.


I’ve seen people try to make “uber for camera rental”, I’m not sure if any of them succeeded. Most photography specialized retailers rent equipment.

I think there’s a good mix between rent and own as there’s a lot of diversity in creators.


It went horribly wrong when people realized that theft is easy and it wasn't covered by insurance. https://www.documentary.org/column/peer-peer-rental-cautiona...


There's a lot of rental but the real scarcity is in certain lenses, which don't really become obsolete in the same way.

Arri puts out a new Alexa and you can maybe get one if you have a hundred grand in your pocket, but there are some master primes or legacy sets that are simply not for sale.


Surely legacy lenses can be reproduced (or modelled digitally!) so that it is impossible to tell the difference. For the component lenses, don't you 'just' use the same refractive index and make them the same shape?

We can make microchips for a few quid but a metal tube with a few small lumps of shaped glass costs 10,000 times as much? Seems unlikely.

Is it a fashion thing, like unless it has the right logo on the lens body then the producers reject it?


Unfortunately it isn't that simple. A lot of these lenses are expensive not just due to rarity but from the sheer complexity and difficulty of producing them. Some can take literally months of production time. They frequently contain dozens of pieces of specialised glass which are ground to tolerances measured in nanometers. The way the light interacts with these lenses is simply not something that you can simulate, at least not on anything less than a decent-sized supercomputer.

And the reason you want one of these expensive lenses, is that they can do things that other lenses physically cannot and can never be persuaded to do.

The famous Carl Zeiss Planar 50mm f/0.7 lens used in Barry Lyndon is special because it can physically let more light into your camera than any other lens. That's why NASA commissioned a set of them from Zeiss to use in filming the moon landing.


I believe it's the other way around: Kubrick got a version of the NASA lens modified to accept a Mitchell mount so he could put it on a 35mm motion film camera. He only used it for the candlelight scene because he had to remove the spinning mirror assembly used for monitoring, so they were shooting blind.

Don't remember the sources for the anecdote, sorry. But the wikipedia page corresponds:

https://en.wikipedia.org/wiki/Carl_Zeiss_Planar_50mm_f/0.7


This video goes into it: https://youtu.be/2p5E7iXxeQE

Note that even then they had to use brighter candles (they had 3 wicks instead of one).


The lenses are expensive because they're made to very exacting standards but also they're a low volume product line. A cinema lens is hand-assembled from hundreds of components that must be aligned to extremely fine precision, with a lot of human-in-the-loop workflows requiring trained technicians. Why optics are expensive is a long and deep conversation.

As for quality, the highest of high end lenses are much sharper for a given f-stop, and sometimes have higher t-stop ratings, than cheaper lenses you can get. They will look consistent through all focus settings, and won't "breathe" (zoom while focusing). The lenses also come as a matched set, so you can switch between focal lengths and the images look the same -- no color shift or aesthetic change. This puts a huge number of constraints on the lens designer, requiring different materials and manufacturing processes. Cinema lenses aren't irrationally priced, even if the demand is driven by aesthetics.
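The t-stop/f-stop distinction mentioned above has a simple relationship: the t-stop is the f-number adjusted for how much light the glass actually transmits. A minimal sketch (the 75% transmittance figure is an illustrative assumption, not a spec for any particular lens):

```python
import math

def t_stop(f_number: float, transmittance: float) -> float:
    """T-stop = f-number / sqrt(transmittance), with transmittance in 0..1.

    Two lenses with the same f-number can pass different amounts of light;
    cinema lenses are rated in t-stops so exposure stays consistent when
    swapping lenses within a matched set.
    """
    return f_number / math.sqrt(transmittance)

# A hypothetical f/2 lens that transmits 75% of incoming light:
print(round(t_stop(2.0, 0.75), 2))  # 2.31, i.e. roughly T2.3
```

This is why a cinema lens's t-stop rating is always a slightly "slower" number than its geometric f-stop.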

And yes, modern lens designers could probably recreate classic lenses. No, it's not economically viable, even at outrageous prices. (Because you can digitally fake most of the obvious artifacts, and the few people who really care about the in-camera look aren't enough to justify the engineering cost.)

There may also be a minor materials issue; some of the glass used in very old lenses was pretty exotic, some were even mildly radioactive. It's possible those glasses are not available since there are better replacement options in glass catalogs now. But this is just speculative, I haven't tried to source thorium glass. (Yet?)


This fellow has explained it better than I ever could.


The lenses Kubrick used for Barry Lyndon are prime examples. pun intended


> Do studios and creators own these cameras or they usually rent?

Depends. Some of the gear mentioned on the Netflix page can be afforded even by ambitious hobbyists; other pieces, especially the lenses, are way too expensive for anyone but big studios.

Major studios will have a selection of their own equipment, but rent from companies like ARRI (who, coincidentally or not, make their own cameras of which some are rent-only) for projects requiring more than that - or anything on location in foreign countries. It's easier to call up Arri and have them and their local offices deal with cameras, lighting equipment and the likes than to ship all that stuff around the world, deal with customs etc.

> Is there an aggregated marketplace where I could be able to put my camera for rent if I do eventually get one?

In Germany there's gearo, no idea about any other.


You could maybe do a deal with a bigger outfit to contribute some capital for some gear and get a cut when they rent it? But for having your own rental company (for the kind of people you want to rent to), you need to invest in lot of gear because they want to hire as much as possible from as few places as possible. Hundreds of thousands of dollars in cameras, accessories, lenses (especially important), lights, rigging gear etc.


They rent, usually in the country where they are filming. I worked as a contractor making an ecommerce-like website for renting high-end movie-making equipment.


PRG is one of the biggest in the US for camera rentals; it's rather unexciting. They have all the cameras, but for whatever reason most of them you have to specifically ask for. https://prggear.com/product-category/camera/camera-camera/ca...


It's not a bad gig, but you need contacts, and one of your primary competitors is a camera manufacturer: I think it was Panavision that has a big rental branch (and they rent Arri or Red too, I think).


So much for "shoot your next movie on iPhone". I was hoping for a Canon EOS, but it is the $5.5k "cinema" one.

Gotta re-read Rodriguez' "Rebel without a crew" (and try to forget his way subpar episode of the next Disney Star Wars humiliation).


This is really high end stuff. Makes me realize how much I'm getting for just 10 bucks.


You times how many millions of other users?


How does that matter? They are just stating how much value they are getting out of $10.


it just goes to show how these large budgets are possible. nothing nefarious to see here.


This Netflix site lists the preferred capture formats (RAW, REDCODE RAW, X-OCN, etc.) for recording the video. What file format would be used to submit the final video (edited with audio and captions) to Netflix?



What’s with the Java vulnerability warning that takes up 1/3 of my phone screen? I know about the log4j vulnerability, and this still made me think this was a pwned website trying to serve up malware…


Maybe this is a little off topic, but I was curious to know why, when the Blackmagic pocket camera came out like 3 years ago, it was such a revolution/attention-grabbing camera? As in, people were like, this upends the dominance of ARRI, etc etc.?

Was it just the resolution available for the price point at the time? How did the company achieve it, and why weren't any of the major camera manufacturers able to do that up until that time?


I work professionally as a colorist, and the Blackmagic cameras aren't professional cameras. They just don't look nearly as good (especially in difficult scenarios) and the footage is far less flexible than Arri, RED, or the high-end Sony Venice line.

The only projects that use them in my experience are low budget music videos (I'm talking very low budget - typically sub $10k) and documentaries. Occasionally you'll see them as a C cam for higher end jobs.


Oh interesting, thanks for the reply!

Is it that they don't record the same bit depth (?) What aspects of the output or format of a camera make it suitable for pro use?


Talking about brushes instead of paintings.

An overwhelming part of society has lost the essence in everything and gone full-on meta. Some 10 years ago one could find wonderful album reviews on Discogs, for example, but not anymore: the vinyl press quality is the topic of 99% of the comments over there now. The only feeling music seems to induce in those people is annoyance at clicks, pops and scratches.

6-21-3-11-ing metaworld.


For reference: BBC[0] 'technical specification for the delivery of television programmes'. 35 pages, with links to additional specs for cameras etc.

[0] http://downloads.bbc.co.uk/scotland/commissioning/TechnicalD...


For some odd reason, I expected an affiliate purchasing link on that camera list. Even though it is a multi-billion-dollar company, I have been Pavloved by webpages with lists of products.


As the comments below explain, production companies don't buy equipment for most things anyway. It's all rented.

You can’t even buy some of the stuff here even if you wanted to either. Studio-grade camera equipment is like trying to buy a Ferrari.


I wonder when Netflix will have their own video editing software...


how much is the cheapest on that list?

the Canon EOS C70 looks most consumer still.


I would guess the Panasonic S1H or the BGH1 is cheapest. The Blackmagic cameras on the list aren’t that pricey either.


I saw something getting shot today in downtown SF where a ton of extras were all walking down the street wearing VR headsets. Anyone know what that was?


For anyone into photography, an unrelated note: Amazon Prime offers unlimited photo storage, so you can upload all your RAWs as long as you keep your subscription.

The next generation of cameras should move away from SD cards. I can't wait until you can stick NVMe in one; it's a shame they don't even make any eMMC SD cards. Even the iPhone has had that for half a decade now.

Digital cameras really suck. They're all horrible proprietary hardware devices: basically expensive locked-down computers with large sensors that attach to lenses through proprietary mounts.


It's a bit random that the Blackmagic Ursa 4.6K is approved but the Ursa 12K and Pocket Cinema Camera 6K are not.


The Canon R5 and R6 aren't approved. A good example of whitelisting fail.


Are the 4.5K cameras intended for 4K output? If so, why 4.5K?


Wonder why they approve the Canon C70 but not Canon C200


The C70 has the same sensor as the C300 Mark III, so that could be why. But then you have to wonder why the A7S III or FX3 isn't approved when the FX6 is, since they all share the same sensor.


My guess is that it's what they've been asked to approve and upon which they've done their due diligence.

For Netflix Originals, I would assume most productions are done on rental gear. The difference between renting an FX3 and an FX6 isn't going to be significant at production-budget scales. (And if you had a real burning reason to use something off-list I would bet they can write you an exception.)


That's what I'm curious about as well, as I'm looking at the FX3.


If it were up to me, a camera without a global shutter would not be allowed. Nothing is more annoying to me than when flash/strobes appear on partial, multiple frames. <shudder> Then again, if it were up to me, it wouldn't be allowed to be called cinema unless it used a global shutter.


I agree with you about how ugly the split screen effect is, but that's not strictly a camera issue -- you can sync strobes to your genlock signal. That reflects a production choice not a camera choice.

The quality of global shutter cameras is simply nowhere near that of rolling shutter cameras. Though more acceptable options are showing up these days, they still typically have lower dynamic range and generally worse specs.


The Arri Alexa, the camera on which essentially every Oscar-winning film of the past decade has been shot, has a rolling shutter, as did the film cameras before it.


You are aware film cameras use a rolling shutter, too?


Nothing from NIKON?


Even on set, for static reference images (i.e. to capture lighting for light extraction, plus a colour chart and grey/reflective ball for matching the set to the shot footage and for colour correction), it mostly seems to have been Canon EOS 5Ds (II -> IV) over the past 10 years, at least in the film VFX industry.


Nikon doesn’t target the cinema segment of the camera market, but hopefully the recently released Nikon Z9 gets added to the list soon!


Nikon does video cameras?


Not much separates high-end still cameras from video cameras besides taxes and a bit of firmware (some other stuff too, but all minor issues).

The big reason all high-end still cameras aren't also video cameras is mostly taxes in some regions and lazy product differentiation.


Unless you're the camera operator. Shooting motion on a stills camera in a cinema environment is absolutely atrocious. While the small form factors are attractive for various reasons, they are notoriously hated to actually operate. Tiny, tiny buttons. The most actively used button, start/stop, tends to be in a difficult location to access when the camera is not being used point-and-shoot style; mounted on gear, it becomes hard to reach. Other settings that have discrete/dedicated knobs and buttons on a cinema camera are also tucked away on the smaller camera. Sometimes, small can be too small to be friendly. Some of these can now be accessed via an app to help, but it's still a pain.

If you're a YouTube creator, these cameras can make you look "pro" on a small budget. Price points like this open up content creation to a much wider market, which can be a good thing. But bringing this gear to a full camera crew with DP, AC, etc. makes integration harder. Good times!


Yeah I guess I should have narrowed my statement to the “capturing frames” aspect, surely the interface is better adapted to specific use case like you say.

Though you definitely hear about productions using them anyway.


When the desire to make content is greater than your budget, you find a way. I know, I've been there. Graduating to "real" gear is like a spotlight from heaven beaming down on you with a chorus of angels singing as you suddenly realize how much nicer it is to use. In the meantime, you make do with what you've got. The great thing about this newer gear is that it doesn't look as obviously cheap as lower-end gear. Fake it till you make it.


The most obvious Nikon problem is the absence of cine lenses (I'll skip ergonomics because that's a common 'photo camera for video' problem). Of course, you can use photo lenses, but it's so much pain: more flaws and more time.

- Cine zoom lenses must provide a fixed point of focus when you are zooming. This leads to a more complex mechanism.

- Different cine fixed lenses in one project usually must have identical image quality. That is why they are produced in series (for example 'XEEN PRO Cinema Lenses') with the same technologies.

- Anamorphic lenses... they should exist. They complicate lens production because stable quality (across a series) of an aspherical lens can be achieved only from the central part of a large piece of glass.

- Focusing is the process that often requires a separate person (focus puller) and additional machinery (follow focus system). All cine lenses have exactly the same standard focus wheel. There is no such standard for a photo lens focus wheel.

Nikon F has very few cine lenses and no series of lenses. Nikon Z has nothing, thanks to the closed, very secret, unique, perfect, bla-bla specification of the Z mount and zero collaborations with other manufacturers.


No Apple iPhone?


>netflix requires approval if you want to film a crash with a slightly lower spec camera

centralized platforms strike again!


Too bad their streaming video platform uses low bitrates.

It's so disgusting seeing big square macroblocks, especially in dark scenes.

Cancelled it months ago.


This is totally arbitrary. Many Nikon cameras match those requirements but are randomly not listed.

You could use an “unapproved” camera and just overwrite the metadata. Nobody would know (unless you’re using an iPhone or something).


Canon and Sony make cine cameras. Nikon makes still cameras that happen to shoot video, but not all that well - I shoot Nikon myself, but I shoot Nikon because I want the best still cameras I can get. If I cared about video at all, I'd probably have gone with Canon instead, not least because their cine and still bodies can share EF-mount lenses.


It’s not just the number of photosites, or even the bit depth - Netflix approved cameras need timecode support, among other features that distinguish a “production ready” camera from a consumer one.



