About Bokeh (bokehtests.com)
152 points by luu on Dec 20, 2019 | hide | past | favorite | 86 comments


I am surprised at how biased this article is in assigning attributes like ‘good’, ‘bad’ or ‘perfect’ to bokeh shapes. You can only talk in those terms from a purely optical-engineering point of view.

The shape of the bokeh is often why a lens gets picked for a shot.

What the author calls ‘bad bokeh’ other people call ‘soap bubble bokeh’ (google it), and they are willing to pay [2] to own a lens that can produce this sort of thing. It is an artistic choice[1].

This article’s blanket claim that this sort of bokeh creates images where the out-of-focus background distracts from the in-focus subject shows a narrow understanding of how bokeh can be used in composing an image.

[1] https://www.shutterbug.com/content/creating-unique-macro-ima...

[2] Full disclosure: I bought a Meyer Optik Trioplan 100mm f/2.8 replica on Kickstarter for this very reason: the ‘bad’ soap-bubble bokeh. I love this lens.


Sentences 3 and 4 in the article seem to excuse this. Is there further information that I may be missing regarding why the bias is still a problem?


The "bad" pattern he calls out is not exactly the Trioplan soap-bubble style; it's what you get from a mirror telephoto lens.

In Trioplan images[1], the light intensity smoothly increases towards the edge of the disk of confusion, creating a sharp rim. I agree that this looks kind of nice, although it's quite a specialized effect.

In mirror lens images[2], the "disk" of confusion is actually a ring, with two sharp edges rather than just one. This is not at all pretty, and as far as I know basically nobody buys these lenses for this effect.

This actually matters a lot because, other than this, mirrors are superior to refractive glass lenses in almost every way (note that large astronomical telescopes are always mirror-based), so if it weren't for the bad bokeh we'd all use them.

[1] https://fujilove.com/the-legend-and-the-bubble-bokeh-review-... [2] https://hoaiphai.wordpress.com/2011/10/18/recumbent-review-t...


>>This actually matters a lot, because other than this, mirrors are superior to refractive glass lenses in almost every way

No. Reflective designs (mirrors) are usually of the Cassegrain type, and Cassegrain designs offer a very narrow field of view with little or no zoom range. This comes at the cost of obscuration by the secondary mirror. Note that there are still refractive elements (lenses) in a Cassegrain design. This very narrow field of view is why such designs are preferred in telescopes and other narrow-FOV application areas.

The majority of lens designs are refractive, and this should give you an indication that a blanket statement like "mirrors are superior to refractive designs" simply is not true.

Edit - added application areas


I didn’t say the author was talking about the Trioplan. I only said the Trioplan’s bokeh shape would be an example of what he deems undesirable.

As a photographer I don't care so much what optical design choice gave rise to an artifact (that I want to exploit artistically).


One thing I notice is that when people get into photography, bokeh is one of the first things they want to emulate (another is HDR). But once they know the basic rules[1] of photography well enough to break them, they tend to view bokeh as a puppy-love attraction and move on to more serious photography. That is to say, bokeh is sometimes used as a crutch before people learn the basic rules, and then learn how and when to break them to create a better photograph.

[1] https://www.flickr.com/groups/94761711@N00/discuss/721576221...


Exactly my experience! This is especially true for people who are into street photography. I've come full circle and I find myself looking for the sharpest picture across the whole frame.

Aesthetics aside, I think the reason we reach for bokeh first is that it's an easy way to isolate subjects and obscure clutter. That gives your photography an instant boost, and it's a lot easier than relying on composition, timing, and luck.


It's also a quasi-metric for "professional quality", so first-timers tend to spend the money on these fast lenses in the hope that it will suddenly make them quality photographers.


> Two key terms in this definition are SUBJECTIVE and IMAGE. One cannot measure bokeh, as it is something that may be pleasing to one person and not another. I have my opinion of what makes for good bokeh, and, as I own this site, I get to define good and bad Bokeh for the purposes here!

And every time he mentions it he uses those caveats.

> You may have your own idea of what good bokeh is, but this site will be evaluating lenses based on MY opinions. And lets face it, there are actually times when BAD bokeh really works for a shot, but that is usually for people who have a whole lot more artistic talent than I do.


To my personal taste, the bubbles in the photos in your link [1] are very distracting.


Yeah, and the swirls in Van Gogh's Starry Night are distracting too.


There's an R package, `rayshader`, that can emulate bokeh. It's a nice read: https://www.tylermw.com/portrait-mode-data/


Thanks for the shoutout!


If anyone's interested in the technical details of implementing state of the art real-time bokeh in a game engine, Guillaume Abadie's "Life of a Bokeh" slides from SIGGRAPH 2018 are really good:

https://epicgames.ent.box.com/s/s86j70iamxvsuu6j35pilypficzn...

Unfortunately, a video of the talk doesn't seem to be available online.


As a very nearsighted person, I can see bokeh if I take off my glasses and look at points of light - either nearby small lights, or far-away large lights. Also, when I look at a bright sky through a tree's canopy, the canopy transforms into a breathtaking mosaic of bokeh, which becomes animated if the tree is swaying in the wind. Can people with normal vision experience this?


Anyone can see what it looks like to be (strongly) nearsighted by putting a magnifying lens in front of their eyes.

Anyone nearsighted can estimate the strength of their nearsightedness by looking at an object close enough to be perfectly clear, then increasing the distance until the farthest distance at which it still appears clear. Take that distance in meters and invert it. The result is the amount of nearsightedness in diopters. For example: if you see clearly up to 50 cm (0.5 m), you have 2 diopters; if you see clearly up to 10 cm (0.1 m), you have 10 diopters.

Anyone with normal (or corrected) vision can estimate the strength of a magnifying lens by the same method. Most magnifying glasses are 5 to 20 diopters.

Well, perhaps not strictly anyone can, but you get the idea.
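That rule of thumb is just the reciprocal of the far point in meters; a trivial sketch, using the same numbers as the comment above:

```python
# Estimate nearsightedness from the farthest distance that still appears sharp:
# diopters ~= 1 / (far point in meters).

def diopters(far_point_m):
    """Reciprocal of the far point gives the correction strength in diopters."""
    return 1.0 / far_point_m

print(diopters(0.5))  # clear up to 50 cm -> 2.0 diopters
print(diopters(0.1))  # clear up to 10 cm -> 10.0 diopters
```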


For me, points of light look like the canonical star shapes you see in telescope photos, which I don’t really understand, because from what I’ve read that effect is due to the braces (spider vanes) which hold the telescope’s secondary mirror in place.


That's called a 'starburst' and it happens in human eyes for all sorts of reasons. It's possible naturally and also as a LASIK complication, for instance.

Look for 'starburst' here: https://visionsimulations.com/ Sadly the simulation itself is down because of the new cross-origin restrictions on images in canvases, but perhaps you can see what it looks like in the preview images on the main page, or by googling 'starburst vision'.


I have good vision, and I can see good bokeh if I focus closer than the objects I see, and bad bokeh if I focus farther than them (which is much easier to do).


Not sure what it is I'm doing but intentionally blurring my vision creates a sort of visual bokeh


I can if I squint. I have glasses, though.


finally, an advantage for the near-blind


And it's all done in software (or ML) on recent phones to get around the lack of a good lens system. For most people it seems to work satisfactorily.

This has been one of the complaints against camera manufacturers recently: that the firmware on dedicated cameras is wildly archaic.


If we're talking about non-point-and-shoot cameras, such as DSLRs or mirrorless cameras that cost $500+, like my Sony A6000, I absolutely DO NOT want post-processing done on my camera.

Portrait mode on my iPhone 7+ looks okay at a glance, but things like hair, wires, and twigs are TERRIBLE on closer inspection.

Not to say the software couldn't be improved. For example, after taking a 20 second exposure, I have to wait another ~15 seconds before I can take another photo. I'd love for that processing time to be cut down.


The delay after taking long exposures is caused by the camera taking a dark frame with the same exposure length as your actual image. When you take a long exposure, the sensor can exhibit noise caused by "hot pixels". The dark frame is taken with the shutter closed so that the camera can get an image of just the hot pixel noise, which it then subtracts from your actual long exposure.

You can turn this off in the settings, but I imagine it's way easier for the camera to correct for this specific kind of noise than for Lightroom to do so.

I believe it kicks in once the exposure length is >2" on my Sony A7ii.
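The subtraction itself is simple; here's a toy NumPy sketch of the idea (the array sizes and values are made up for illustration):

```python
import numpy as np

# Fixed-pattern "hot pixel" offsets show up identically in any long exposure.
hot = np.zeros((4, 4))
hot[1, 2] = 200.0                 # one hot pixel

scene = np.full((4, 4), 50.0)     # the true scene signal

light_frame = scene + hot         # shutter open: scene plus hot pixels
dark_frame = hot                  # shutter closed: hot pixels only

corrected = light_frame - dark_frame  # hot pixel cancels, scene survives
print(np.allclose(corrected, scene))  # True
```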


On the one hand that's correct. On the other hand you could completely avoid that if you did long exposures the way a Google Pixel (2+ I think) does it (in software):

Take lots of short exposures and fuse them. If the device is handheld you get variability in positioning for free; if it's on a tripod, it will automatically wiggle the OIS slightly to achieve the same effect.
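The averaging intuition: random read noise falls roughly as 1/sqrt(N) over N fused frames (though, as noted, this does nothing for fixed-pattern hot pixels). A quick NumPy sketch with illustrative numbers:

```python
import numpy as np

rng = np.random.default_rng(42)
scene = np.full((100, 100), 100.0)

def short_exposure(sigma=10.0):
    """One noisy short frame: the scene plus Gaussian read noise."""
    return scene + rng.normal(0.0, sigma, scene.shape)

single = short_exposure()
stacked = np.mean([short_exposure() for _ in range(64)], axis=0)

print(round(float(np.std(single - scene)), 1))   # ~10
print(round(float(np.std(stacked - scene)), 1))  # ~10 / sqrt(64) = ~1.2
```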


It seems like movement would be effective for averaging out random noise, but the described process isn’t for eliminating random noise; it’s for eliminating persistently hot pixels on the sensor.


Indeed, and in fact the Pixel does exactly the same dark-frame trick to identify hot pixels. I don't know that it takes as long, though; perhaps it's hidden by the fact that you can keep doing other things while the photo is processing.


It cannot do the same thing as an a6000, as it lacks a physical shutter.


That's great information and makes me much less frustrated with my slow a6000.


You can also try turning off long exposure NR in the menus. I don't know if it makes enough of a difference on the a6000 to be worth leaving enabled and paying the extra cost in time, but that's a way to find out.


Oh thanks for letting me know! I have an A6000, so that makes sense!


> Portrait mode on my iPhone 7+ looks okay at a glance, but things like hair, wires, and twigs are TERRIBLE on closer inspection.

Portrait mode on any phone looks crap. I have not yet seen a picture where the fake shallow DOF doesn't fall apart somewhere.


If you have solid edges (e.g. hair pulled back or bald), and a fairly smooth background already... it looks acceptable.

But it's not a substitute for a really nice lens and sensor combo. Maybe someday.

I just wish the DSLR/mirrorless manufacturers would invest more in making their RAW files more intelligent. I don't want 'portrait mode', but I do want to be able to focus-stack in one image (with the originals preserved), do true adjustable HDR (combining exposures), etc.

Sony and Nikon each hint at a couple of these capabilities, but the software is not easy to use, and don't get me started on moving pictures between camera bodies and wherever they need to be to work with them (laptop, iPad, phone).


These improvements to bokeh on the Pixel 4 and Pixel 4 XL show some pretty good results: https://ai.googleblog.com/2019/12/improvements-to-portrait-m...


Impossible to say, because the sample[1] is highly compressed and only 800 pixels high (including a white border).

[1] https://1.bp.blogspot.com/-3Sm2XHss0lA/XffZ95s_KjI/AAAAAAAAF...


Here’s the full size album: https://photos.app.goo.gl/cwaKemRDusTK5W6SA


Still not full-res, and unfortunately the first two samples[1] I opened show that it's still crap and can't handle fine detail/hair. Many people have hair.

[1] https://imgur.com/a/MXCeYaU


I mean, you are using an iPhone 7; IIRC that’s the first iPhone to implement portrait mode. The processing time on the iPhone 11 I just got is minimal. I upgraded from an iPhone 6S, so I’m not sure what the processing time was like on iPhones with portrait mode prior to the 11 models, though.

As well, who’s to say what Apple could do when given an image from a high end camera sensor instead of the tiny one in an iPhone?

And lastly, why not just enable power users to turn it off? Sure you can say manpower would be better directed at other problems, but who cares if post processing capabilities are on the camera if you can turn it off? Those who want it use it. Those who don’t don’t.


> who’s to say what Apple could do when given an image from a high end camera sensor instead of the tiny one in an iPhone?

Ruin it. But they wouldn't, because there'd be no need.

The thing about a big high-end camera sensor is that having one means you don't need to fake bokeh. You can put a good fast prime lens in front of it and get the real thing. Phone cameras have to post-process their way to it because the physics of their small sensors mean they simply aren't capable of capturing enough off-axis light to produce true bokeh.


My wife has an iPhone 8, and I can't find portrait mode on it (I'm coming from a Pixel 3, so I was wondering how it looks on iPhones).

When I scroll the photo modes there is no such thing as "portrait".

Or maybe it is only for the Plus models with two cameras?


Yeah, it's only on the Plus models, including Animoji.


> I'd love for that processing time to be cut down.

Yeah, it's called an iPhone 11 Pro.



faster than a 7+


As a photographer I really don't ever want to use the camera software. Even when it's perfect for my application, I want RAW photographs because you have to fix so many other things before you ever get to special effects.

You need to crop, color correct, distortion correct, noise correct, exposure correct. Only after that would you ever really want to apply some kind of bokeh filter.

The entire reason to use a high-end camera is to do it right. Camera software is never going to do it right until they start putting Photoshop on cameras.


> You need to crop, color correct, distortion correct, noise correct, exposure correct. Only after that would you ever really want to apply some kind of bokeh filter.

Modern phones already do all that - and seemingly in that order.

As a photographer, what I find lacking are the lenses. You can only squeeze so much light through the relatively tiny lenses. A perfect recent example was the kids' Christmas play. Sitting 25+ meters back, with somewhat dim lighting and fast-moving kids, I could see the other parents with their smartphones and tablets getting very poor photos and videos, whereas my Nikon had zoom galore and excellent, clear focus.

Physics matters.


I've never seen any phone get even half as good results as any DSLR released this decade. Just because the phone software can do it doesn't mean it does it well. I mean, my phone has Photoshop on it. That's pretty good, but nothing beats a proper suite of tools on a color-calibrated monitor.


Modern phones do not automatically crop, to my knowledge. They do attempt to color correct, but they don't always get the correct colors. (There needs to be a reference to infer the appropriate white balance, and such is not always available.)


It's not the software that holds back phone cameras. It's the literal size and shape of the sensors and lens.

You can already shoot RAW on a phone, but putting Photoshop on the phone won't solve the lack of depth of field inherent to the lens and sensor setup.


Does the RAW format save the kind of information that the software uses to guess at bokeh (depth sensors, multiple lenses, multiple shots, defocused pixels)? Or is there some other format that could save all that info, so you could really use the phone but do the processing later?


No, but there isn’t really a need. The R package mentioned elsewhere in this thread achieves a similar effect using no additional data.

Ray tracing will suffice!


It shows a hand made depth map. You can't just 'raytrace' a 2D image and magically add depth of field.


You can shoot in RAW on most high-end smartphones now if you want to handle that after the fact, but the default used by the camera software does all of that.


It works horribly with hair and sharp edges even on the flagship phones.

Plus it can't possibly work correctly with transparent and semi-transparent objects like glass.


My main complaint isn't that it doesn't work through glass, but rather that it can't handle eyeglass frames at all. They are always blurred if the photo isn't taken from directly in front. It's a bit surprising, as so many people wear glasses -- a third, maybe? You'd think that would be a major use case.


> And for most people it seems to work satisfactorily

Most people apparently don't look at their pictures.

The "enhanced" photos I've gotten from the iPhone for the last few models have just been getting worse. They look OK if you don't look too closely, but then they're just wrong: blurry details on the same focal plane as crisp ones, crappy detail around translucency, hair, or anything else that confuses the foreground/background detector. They're just bad.

My DSLR will be in service for quite a while longer.


Same experience here. My guess is most people view their photos on their phones, via Instagram or Facebook. Rarely are photos looked at in full size on a high quality monitor.

I have no intention of giving up my Olympus mirrorless system. I'll probably add at least one more lens to my collection next year - I've been relying on my 12-40 f2.8 for portraits, but I'm liking what I've read about the Sigma 56 f1.4.


The de-noiser in iOS produces watercolour-esque patterns in the final image and washes away most of the fine detail at 1:1.

FWIW, I have an iPhone X. The watercolour trend started around the iPhone 5s, IIRC.


"the lack of a good lens system"

This is very much a subjective interpretation, and I don't mean to nitpick, but most smartphones have tremendous lens systems. They just happen to have a short focal length (e.g. 1.54mm, 4.25mm and 6mm on the iPhone 11). Depth of field is primarily a function of focal length: the shorter the focal length, the larger the depth of field, which is why portrait photographers often use long telephoto lenses.

In the real world, though, 99% of the time what people are aiming for is simply a sharp photo with the subject in focus. That short focal length makes that much, much more likely, yielding a dramatically higher percentage of "keeper" photos. Shallow depth of field is a gimmick in most cases.
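To make the focal-length point concrete, here's a rough back-of-the-envelope sketch (thin-lens approximation; the f-numbers, distances, and circle-of-confusion values are illustrative, not from the thread):

```python
# Approximate total depth of field, assuming subject distance >> focal length:
#   DOF ~= 2 * N * c * d^2 / f^2
# N = f-number, c = circle of confusion, d = subject distance, f = focal length.

def dof_mm(f_mm, n, d_mm, c_mm):
    """Thin-lens depth-of-field estimate, everything in millimeters."""
    return 2 * n * c_mm * d_mm ** 2 / f_mm ** 2

# Phone-style wide camera: f ~ 4.25 mm at f/1.8, tiny sensor so c ~ 0.003 mm.
phone = dof_mm(4.25, 1.8, 2000, 0.003)
# Full-frame portrait lens: f = 85 mm at f/1.8, c ~ 0.03 mm.
full_frame = dof_mm(85, 1.8, 2000, 0.03)

# At a 2 m subject distance: roughly 2.4 m of DOF on the phone vs ~6 cm full frame.
print(f"phone: {phone / 1000:.2f} m, full frame: {full_frame / 1000:.3f} m")
```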


No, it hasn't. This comes up every time someone on HN mentions anything to do with photography - mostly, it seems, from people who don't do a lot of photography or at the very least aren't closely familiar with interchangeable-lens cameras - and it never stops being nonsense.

There are plenty of things that could be improved about DSLR and mirrorless cameras, in the firmware and out of it. The number of those things which are in any way related to, or could be in any way improved by, the idea of jamming a smartphone into an interchangeable-lens camera, is zero.

(In case anyone's inclined to question my own chops: that's fair, considering my line of argument. I don't post much of my work lately, but here's my best shot of 2019, taken from a distance of six inches with a Nikon D500 and a 105mm macro lens: https://aaron-m.com/wp-content/uploads/2019/07/DSC_9393.jpg - so you can look at that and judge for yourself whether I'm qualified to speak to the question of how familiarity with interchangeable-lens cameras influences the perceived value of junking up their UI with a lot of smartphone garbage.)


I'm curious to see how good it will get in 5-10 years, because right now, to a mildly trained eye, it's very noticeably bad.


Ugh, most people probably eat fast food three times a week too.


Someone should mention the oval-shaped, very cinematic anamorphic bokeh. Bokeh is not just the shape of blurry small light sources; it also changes the quality of any out-of-focus region, since the blur is doubled vertically compared to horizontally. A good read on the subject, with images: https://www.provideocoalition.com/three-lenses-a-look-at-bok...


I cringe whenever I see that oval bokeh in a film. It's so jarring. I can't believe a competent cinematographer would ever allow it in their shot.


Competent cinematography is about artistry and emotion, and whatever technical standards are necessary to achieve that. And everything with technical constraints involves tradeoffs. Movies are chock-full of tradeoffs.

Oval bokeh is... hardly noticeable to anyone except professional photographers, and also isn't inherently "bad", it's just different, and has its own personality.

If your standards are so perfectionist, I have a hard time believing you find any of Hollywood's cinematographers -- whether blockbuster or indie -- up to your competent standards. ;)


One of my favorite bokeh lenses is the Nikon 500mm Reflex telephoto. The "Mak"-style design, with a rear primary mirror and a front secondary, works a lot like common compact telescopes. The bokeh is tricky to apply: it's a fixed-focal-length telephoto, so your subject must be at least 10 meters or so away, and it's hard to handhold, but the "donut" aperture of the lens makes for some very cool (or weird, depending on your tastes) effects. The article calls this a "bad" lens, but I love the effect.


Almost anything can be used well for artistic effect.

Here are some shots from the Nikkor 500mm:

http://www.indeed.co.jp/blog/scrap2/lens/reflexnikkor_500mm_...


Oh, very nice! Since we're sharing bokeh photos, here's my favorite of the ones I've taken. It's the "creamy" style, not circular. Olympus E-P5 with Olympus 75mm F/1.8 (my favorite portrait lens):

https://geary.smugmug.com/Pets/Dogs/i-dNMQW2v/A


This is a lovely example, thank you for the link!


Your definition conflicts with the author's.

1st paragraph:

>as I own this site, I get to define good and bad Bokeh for the purposes here! The other key term is “image.” Bokeh is a property of IMAGES, not a property of LENSES.


The shape of the bokeh is given by the shape of the aperture, so it is also a property of the lens.

The specific word is related to the aesthetic quality of the effect in context, but that doesn’t mean it is truly independent of the lens.
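A toy numerical illustration of that point: a strongly defocused point source renders roughly as the shape of the aperture (the lens's blur kernel, or PSF), so a filled aperture gives a disk and a mirror lens's obstructed aperture gives a donut. The radii below are arbitrary:

```python
import numpy as np

def aperture_psf(r, inner=0):
    """Binary aperture-shaped blur kernel: a filled disk, or a ring if inner > 0."""
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    d2 = x ** 2 + y ** 2
    k = ((d2 <= r ** 2) & (d2 >= inner ** 2)).astype(float)
    return k / k.sum()           # normalize so blurring preserves brightness

disk = aperture_psf(3)           # refractive lens: filled disk, one soft edge
ring = aperture_psf(3, inner=2)  # mirror lens: hollow donut, two hard edges

# The ring kernel is zero in the middle -- that's the "donut" bokeh.
print(int((disk > 0).sum()), int((ring > 0).sum()))  # 29 20
print(disk[3, 3], ring[3, 3])
```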


Whenever bokeh is mentioned in /r/pics, they are usually talking about the blurred background in an image that has shallow depth of field, not any particular quality of light in the image. Drives me nuts, but not worth arguing about.


What exactly drives you nuts? That they shouldn't talk about the subjective quality of the blur, but rather the quality of light?


Lots of people think bokeh is just a blurred background.


...and that drives me nuts too.


And yet more than half of the page is about lenses.


Not sure why I'm getting downvoted here for simply pointing out that the author gives his definition of bokeh in the first paragraph.


It’s because you’re coming off as clueless and tone-deaf. Your post is argumentative just to make a point, and clearly demonstrates you don’t know much about the subject.

Who cares if the definition is slightly different? We’re talking about the same emergent phenomenon, and I think you realize that.

People upvote contributions, not trolls.

Welcome to HN!


I would guess that people think that the author's preferred definition doesn't preclude other people from having different definitions. You seem to assume it does, at least in the context of this discussion, and that assumption is noxious to the people who are downvoting you.


I'd add that while the author is correct (bokeh is a property of images, not lenses), it's not materially different from how most photographers understand the term. While a lot of photographers may say things like "that lens has great bokeh", they don't think that bokeh is some quality in the lens. They are just shortening an idea like "this lens produces images with great bokeh." It is a lazy way of phrasing it, but I have never met a photographer who thinks bokeh does not refer to the image.


Shape is important. I find that some lenses have a different look on-axis versus off-axis, either leading to ugly smearing, or pretty twirly bokeh.

The aesthetic also depends on the image. Nice crisp circles of light are sometimes nicer than a Gaussian blur. As someone pointed out, mirror lenses have their uses too.

But there are lenses with just bad bokeh too.


> I find that some lenses have a different look on-axis versus off-axis, either leading to ugly smearing, or pretty twirly bokeh.

This is indeed pretty interesting. It seems like double-Gauss lens designs like the Zeiss Planar and Biotar are especially prone to this kind of distortion. There's a Russian copy of the 58mm Biotar called the Helios 44 that's popular on eBay for this reason (lots of people use an adapter to put it on a MFT camera and shoot video with it).


Lots of mirror lens bashing :(

Bokeh is subjective. While the rings instead of filled circles resulting from a mirror lens look worse in many cases, I have seen cases where the 'rings' actually add to the image.


Isn't the Airy disk the right PSF to use here?


If bokeh gets to this point, I don't know how it is going to look in the future. Is photography going to die?



