
2 Nikon D90 cameras, 3D portraits

Alex Fry and Jamie Nimmo will open their 3D photography exhibition "Stereo Portrait Project" in Sydney, Australia, on 27th May (scroll down for the full press release).

Here is the technical background:

For the Stereo Portrait Project, we used off-the-shelf hardware and software for everything, from the cameras, lighting and triggering to post-production and the printing of the book. We shot using Nikon D90s with shutter cables connected to an RF trigger. The lighting system was three Nikon SB-900s: one in a softbox connected via PC sync, the other two optically slaved.

A custom camera rig was needed to put the cameras as close together as possible, roughly emulating the distance between the human eyes. There was no quick or cheap way to do this, so we built a number of prototypes out of wood and brackets. Since we were taking portraits, mounting the cameras vertically, base to base, made the most sense, giving us a relatively small interocular distance and a vertical frame. We ended up making three versions of the rig: the first to test how the base plates attach to the cameras, a hand-held version that was unwieldy to hold, and the final tripod-mounted version you see in the pictures.

To synchronize the cameras we used an RF trigger split out to two preloaded shutter-release cables. We tested how fast we could sync both shutters together with the flashes and got reliable sync up to 1/160s, which let people move around and talk to us without inhibiting their performance. This was very important, since hands in front of the body look fantastic in 3D.

The image pairs were then sorted using Aperture before being exported into Nuke, where final tweaks were made before combining them into the final anaglyph 3D images.

Aperture made sorting and tagging hundreds of very similar left and right images much more manageable, while Nuke allowed us to simply adapt many of the same 3D workflows we use in our day jobs as stereoscopic visual effects compositors.
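
The actual combine was done in Nuke, but the final red/cyan step is simple enough to sketch outside of it. The following is only an illustration in Python with NumPy and Pillow (the file names are placeholders, and this is not the authors' script): the red channel is taken from the left eye, and green and blue from the right.

    import numpy as np
    from PIL import Image

    # Hypothetical file names for one sorted left/right pair.
    left = np.asarray(Image.open("pair_042_left.jpg"), dtype=np.float32) / 255.0
    right = np.asarray(Image.open("pair_042_right.jpg"), dtype=np.float32) / 255.0

    # Classic red/cyan anaglyph: red from the left eye, green and blue from the right.
    anaglyph = np.empty_like(left)
    anaglyph[..., 0] = left[..., 0]
    anaglyph[..., 1] = right[..., 1]
    anaglyph[..., 2] = right[..., 2]

    Image.fromarray(np.uint8(anaglyph * 255.0)).save("pair_042_anaglyph.jpg")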

We printed a limited-edition run of 25 books for the exhibition using the self-publishing website Blurb. The process from layout to finished product was smooth and easy, and when we got our books back they looked exactly as expected. This is a great service that we used to complement the larger prints on display in the gallery, giving people a chance to take home the entire collection for tabletop stereoscopic enjoyment. The book is now available through their online bookstore.

There is a lot of helpful information regarding 3D photography in the comments section of this post.

Press Release:

Stereo Portrait Project

Thursday 27th of May

Stereo Portrait Project is the brainchild of visual effects artists Alex Fry and Jamie Nimmo. Born from their shared interest in stereo imagery whilst working on the upcoming Legend of the Guardians: The Owls of Ga'Hoole, they started a collaborative project aiming to document people in society using 3D portraiture.

Their debut show concentrates on Australian creatives including Children's Author Morris Gleitzman, Archibald Prize winner Craig Ruddy, Writer and Futurist Richard Neville, Musician Rai Thistlethwayte, Director Michael Gracey and more. The exhibition invites the audience to face themselves as someone else using the sculptural aspects of 3D photography, opening up the inner lives of fascinating faces at Sydney's emerging art breeding ground, Oh Really Gallery in Newtown.

Stereo Portrait Project will be participating in the launch of Creative Sydney, Friday the 4th of June at the Museum of Contemporary Art (MCA) in Sydney. To add something different on the night, SPP will be setting up a stereo booth to give everyone the chance to see themselves in 3D whilst streaming those images live to Twitter.

Stereo Portrait Project is using off-the-shelf, accessible technologies throughout all aspects of the show. It is supported by Blurb Inc., an award-winning creative publishing platform. The authors have produced a print-on-demand book, available from the Blurb Bookstore.

Stereo Portrait Project Debut Show

Opening 27th May, 6pm-9pm; runs until 8th June. Oh Really Gallery, 55 Enmore Road.

Stereo Booth 4th June. Creative Sydney. (see website for details)

www.stereoportraitproject.com
www.ohreallymagazine.com
www.creativesydney.com.au
au.blurb.com
www.animallogic.com
legendoftheguardians.warnerbros.com

Update: one of the photographers, Alex Fry, has answered many of the questions raised here (lenses, focusing, interocular distance, convergence and the anaglyph conversion); see his reply in the comments section below, which also includes links to the anaglyph formulas he used and to his downloadable Nuke gizmo.


This entry was posted in Nikon D90.
  • Alexander

    Cool as shit.

    • Alexander

      Anyone know what lens that is they’re using?

      • http://www.flickr.com/photos/philograf/ Philipp Hilpert

        50mm 1.4 G i assume

        • manimal

          They are 35mm afs 1.8g lenses.

          • http://www.almondbutterscotch.com Almond

            they look to have focus distance scales. 50mm f/1.4G

          • Anonymous

            wash your eyez its a 35

          • http://www.flickr.com/photos/friedtoast/ Fried Toast

            I don’t own the 35, but can see that they’re definitely not the 50G. If they are, then I’ve got a super limited special edition 50G that looks completely different :D

    • Global Guy

      I really like that they did this. I don’t care that people say the images aren’t great. Or that there are parallax (angular focusing) problems or that the “eyes” are too far apart, therefore making an awkward look.

      The point is that these guys did it. Which is a lot more than most of any of the people on this forum will ever do, let alone come close to in 3D. So pretty fun project. Not really sure why this post doesn’t focus on professionals who work in 3D, but still, interesting hobby — and the set up is an easy DIY, which means others here could try it, if they have a friend with similar gear.

      • QuBe

        “I really like that they did this. I don’t care that people say the images aren’t great. Or that there are parallax (angular focusing) problems or that the “eyes” are too far apart, therefore making an awkward look. The point is that these guys did it.”

        Did what?
        If the images suck, they’ve done nothing. The whole point is the images! They don’t get credit just for taking another go at a tired old gimmick….certainly not in my book they dont.

        “Which is a lot more than most of any of the people on this forum will ever do, let alone come close to in 3D.”

        Um, speak for yourself bud.

        • Anonymous

          3D suxxxxxx. It was invented a hundred years ago and messes up your eyes. I’m waiting for hologram technology, the rest is gimmicks!!

        • hybris

          I second that
          not very interesting photos, which PS could do better

          it looks like something has gone wrong

          bad rumor / not interesting anyway

  • http://www.mattwilson.cl Matt Wilson

    Very interesting idea. Good luck with the expo

  • http://nikonrumors.com/ [NR] admin

    If you guys have any questions to the photographers regarding this project, let me know (email me the questions or enter them as a comment to this post) and we can organize a Q&A session.

    • tobi

      I can go there and personally ask the questions for ya ! :-)

  • woble

    What about focusing?

  • Mike

    The distance between the two “eyes” is too great (about 4 1/2 inches, estimating from the picture), which exaggerates the 3D effect and makes it look artificial. The girl looks like she’s somehow glued together. Greater distances are used for landscapes and other big objects, certainly not for portraits.

    • Tim

      RE: The lenses being too far apart. I don’t know much about 3D, but I know quite a bit about custom metalwork. They could have gotten those lenses much closer together by mounting them to a single 1/4″ or 3/8″ thick piece of aluminum, fastened with flathead screws countersunk into the backside of the aluminum to attach to the bottom of each camera. A fabrication shop would probably charge them less than $100.00 to do the work, and would probably even weld on a piece to attach to the tripod mount as well.

      • Worminator

        Yeah, but this is the kind of garbage art students tend to spout. “Oh, it was really hard, we had to make three different versions…” etc etc.

        They bolted the cameras onto whatever bits of ply-board they had handy… and it took them a couple of hours. It looks like crap, and the two cameras probably don’t even point in the same direction.

      • George

        If they used Nuke, then they probably used The Foundry’s Ocula plugin to readjust the interocular distance. This is what was used on Avatar.

        • Louis Rosenthal

          yeah i was just thinking, if 4 inches is too much, how on earth can it be done in cinematography with how large those cameras are sometimes!
          glad to see somebody who seems to know what they’re talking about :)

  • http://josephthomas.org Joe

    Looks interesting. I am interested in seeing the result of this quasi replica of the human eye.

  • AS

    With all the fuss, he still couldn’t make good 3D images at all…The one above has a lot of ghosting artifacts on the entire left part of the woman’s body.

    If you want to see real good 3-Dimensional pictures check this site, this is a Pro:
    http://3d-experience-publishing.com/products.php

  • http://micahmedia.com Micah

    I don’t see any adjustment for parallax. That’s a bigger issue than the distance between the two. The cameras have to be pointed slightly inward for anything closer than infinity. The eyes do this automatically and most people aren’t aware of it.

    • http://sdickinson.com Sam

      That’s the first thing I noticed when I saw this rig too.

  • amcon

    channel-game-play
    ;-)))

  • grumps

    How are the images processed?
    How is the distance between the two cameras determined for different focal-length lenses?

    Thanks Admin!

    • Anonymous

      They are processing in Nuke and hopefully using The Foundry’s plugin Ocula to fix any issues between the images when combining them. I use this software all the time in film and it does an excellent job (used on Avatar and other features): http://www.thefoundry.co.uk/pkg_overview.aspx?ui=39DEE70B-C88F-48F1-9BEC-99A9BAFE2850

      It should be able to determine the distance from the cameras and fix the convergence so it doesn’t look as wonky as some have pointed out.

  • http://www.robbannister.com Rob Bannister

    Oops, I didn’t mean to post as anonymous. Also, with Nuke’s 3D camera tracker it is possible to pull lens data and recreate a scene in its 3D environment. Now I’m not sure if this is possible with a still, but you can definitely input the focal length and other data manually.

  • Dave

    The comments about the cameras being too far apart are correct. 2-2 1/2 inches is best for the distances used. Turning the cameras in slightly is not really needed, although your eyes DO turn in more as objects get closer. More care could have been taken to line up the left and right images so they register better (less ghosting). As for processing: take a left and a right eye image. In Photoshop, drop the output level to zero in the red channel of the right-eye image, drop the blue and green channel output levels to zero in the left-eye image, drag one image over the other to form an image with two layers, choose SCREEN as the blending mode, then move the top layer around until it registers as closely as possible with the bottom photo. If you are shooting static subjects, you can do this with a single camera, just shoot the two images slightly left and right of each other. For moving subjects, connect two cameras together at the tripod mount using a 1/4-20 threaded length of metal (available at any hardware store). Shameless self-promotion follows:
    http://diverdave.smugmug.com/Three-Dimensional-Photography/3D-images/3129155_ACeiQ#171799089_JkKGm

  • darksideofnikon

    Hmm, I think if Nikon releases a camera that produces 3D it will be a sell out :)

    Imagine an innovation.

    • Anonymous

      Sony is working on it so…

    • http://www.flickr.com/photos/friedtoast/ Fried Toast

      A sell out? Because the name is Nikon? It certainly won’t be because they were the first out w/ a 3D camera.

      Here’s the latest. Not sure if it’s actually the first 3D digital camera or not; there may have been others before it:
      http://www.fujifilm.com/products/3d/camera/finepix_real3dw1/

  • Anonymous

    i never understand the benefit of 3D photography, they all look the same.

  • Greenwood_Geoff

    It’s not something I want to do since you need to have special glasses to see the cool effect, /shrug

  • NikoDoby

    QUESTION: Will they be doing admin’s official Nikon Rumors 3D portrait?

    • http://nikonrumors.com/ [NR] admin

      I am not famous :)

  • http://larry-bolch.com Larry N.Bolch

    I have been using the Fujifilm 3D W1 digital stereo camera since December last year, and am totally pleased with it. Prior to that, I used the Pentax stereo adapter on the D700 mounted on a 50mm f/1.8 lens, giving an effective focal length of 100mm for each eye. The Pentax stereo adapter has no parallax control, so results were variable. The Fuji camera has both auto-parallax based upon auto-focus distance and manual parallax which works well for macro-photography. It has a 2D mode that lets one use one lens set to wide angle and the other to telephoto allowing one to shoot both simultaneously. It also shoots 3D movies.

    The monitor is glasses-free stereo using something similar to the old lenticular prints. Shooting a picture of someone and immediately showing the results with glasses-free stereo viewing tends to draw an amazed reaction from the subject.

    I have automated printing with a couple of Photoshop Actions, one producing anaglyph prints and the other Holmes Stereoscope cards for viewing with an actual antique Stereoscope. Stereoscopes are still in production and give superb results. I print either the anaglyph or stereo cards with an Epson photo printer. IrfanView so far has produced the best quality self-running anaglyph slide-shows for viewing – with glasses – on any HDTV. I recently acquired Fuji’s viewing frame, which also uses the same lenticular technology for glasses-free viewing.

    Much easier than working with my old Stereo Realist and film. The Fuji camera is way sharper and has all the features of a fairly advanced digital camera, with the exception of RAW, which would be nice. Its native format is .mpo – which is a container format – that is split into JPEG stereo pairs by included software. 3D movies are .avi, and they too can be split into left and right eye views for projection or YouTube viewing.

    My contact at Vistek – sort of the Canadian version of B&H – dropped me an e-mail yesterday saying that he just was handed a dongle that will allow viewing on the new 3D TVs. He had not had time to test it yet.

    The view-frame handles .mpo and 3D .avi files directly. It has very sophisticated firmware, making slide-shows a breeze. Create folders on SD cards, drop in the .mpo files and .mp3 files if background music is desired. Set the frame-rate from one per second to one per day, point the frame at the folder and hit play. One can intermix movie and still files. The music stops with a movie file and the stereo sound-track plays instead.

    It is the first and only digital stereo camera, and is a very well thought-out system. Shooting stereo is more complex than 2D, and the camera design makes it fairly easy. I expect that it will be the first of many, however. The camera is quite compact, with a 77mm optical base. It has a sliding clam-shell that serves as both a lens guard and on/off switch, 35-105mm equivalent zoom. It is my carry everywhere pocket camera – not a monster like the two D90s. At least for the present, it is the state of the art.

    • http://nikonrumors.com/ [NR] admin

      thanks for sharing!

    • http://www.flickr.com/photos/friedtoast/ Fried Toast

      I’ve been fooling w/ the idea of picking one up for some time now, but haven’t convinced myself (having bought a Canon S90 at the beginning of the year put a pinch on my desire for *another* P&S). Having read your summary, it brings the itch back a bit ;)

  • alvix

    It’s like the Neumann KU100 head used for stereo recordings.. one day someone will design a camera that looks like a head with two “eyes” (@50mm) that can take stereo pics of the outside… mmh.. maybe it’s already here?.. is it a robo-head?

  • Christina

    How are you viewing the picture in 3D??
    I don’t have any of those funky glasses, and I have been trying to figure out how to look at the pix just using the computer.

    Help…??

    • http://larry-bolch.com Larry N.Bolch

      Multiple methods of viewing. The Fuji viewer shows 3D stereo without glasses. The camera monitor can also be used as a viewer, though it is much smaller than the viewer.

      The Holmes Stereoscope allows one to make stereo cards with a high-quality ink-jet printer for top-notch viewing. Though the Stereoscope was invented in the mid 1800s by Oliver Wendell Holmes, it is still in production, and readily available. The US Library of Congress has a huge collection of high-resolution Stereoscope images from the 1800s and early 1900s that can be downloaded and printed out. However, the best quality I have seen are those shot with the Fuji.

      There are also methods of viewing stereo pairs with bare-eye viewing. I have managed to achieve it a few times after a good bit of trying, but for me it is not worth the effort. Evidently, some can achieve it easily. The images can be viewed printed on cards or on the computer monitor.

      I can either make the traditional red/cyan anaglyphs or the newer green/magenta, for viewing with glasses. The green/magenta produces considerably less cross-talk between eyes. These can be either anaglyph prints or viewed as a slide show, using the glasses.

      I just got word yesterday, about some method of viewing on the new 3D HDTV sets, but have no details. I expect that they will use shutter-glasses. Shutter glasses have been around for a long time, for computer viewing.

      A long time back, I had a stereo projector with polarizing filters on each lens, 90° out of phase, that required matching polarized glasses. Something similar could be done with a pair of digital projectors. In all, there are a huge range of options.

      A huge stock of 3D stereo items can be ordered from http://www.berezin.com/3d/Default.htm

      I also do 3D modeling and rendering with the program Shade10.5, which allows easy rendering in stereo as well as 2D. The same viewing methods apply.

  • Sean

    If you want 3D, you probably need more cameras and you need to spread them out. Here is a company that sells a system with 4 cameras that they put together. They can add more if you want 360 degree coverage.
    http://www.dirdim.com/port_featuredprojects.php?fileName=fp_di3d

  • Fredbare

    I think it’s safe to say that Nikon (or any other sensible camera manufacturer) will bring out their higher end cameras before their cheaper ‘stripped down’ ones. Otherwise they would be shooting themselves in the foot.
    So expect D4 before D800
    D400 before D9000
    A better ‘entry level’ model than the D3000/5000 to get customers into the Nikon ‘family’.
    They may have a D90S with better video capabilities but no other improvements in the shorter term – if they’ve figured out how to freeze Jello ….

  • Alex Fry

    Hey..
    I’m one of the guys doing this series…

    I’ll just answer a few of the points brought up in the comments here.

    Yes, the lenses are 35mm AF-S 1.8.
    Focusing was done by having the subjects stand on a mark, flipping to live view, finding manual focus for both cameras, switching off live view, then engaging the preload triggers. Everything was shot at f/2.8, which caused issues both with the subject moving off their mark and with asymmetrical focus when something was bumped or nudged accidentally.

    As for the oversize interocular, it was a byproduct of the construction technique. We have no metal-working experience, so the rig was bolted together using available materials and skills. It’s backyard, but it works.
    The impact on the stereo is to exaggerate the depth, but the effect is negligible at the kind of display sizes we are viewing at; life is also made easier by having a blank background.

    As for the rig’s lack of ability to converge the cameras, we subscribe to the parallel school when it comes to camera setup. In a perfect world we would have purpose-designed camera bodies with mechanically shiftable sensors, or be shooting with tilt/shift lenses to offset the sensor. But we don’t have those things, so we shift in post. It means you lose some resolution and field of view, but it’s fast, easy and avoids all the keystoning and vertical disparity issues inherent in shooting converged.
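
    A rough sketch of what that post shift amounts to, in NumPy rather than Nuke (illustrative only, not the actual Nuke setup, and the direction conventions here are assumptions): cropping matched strips off opposite edges of a parallel-shot pair slides the convergence plane without introducing any keystoning.

        import numpy as np

        def converge_in_post(left, right, offset_px):
            # For a parallel-shot pair, crop offset_px columns from the left edge of
            # the left view and from the right edge of the right view. Objects whose
            # screen disparity equals offset_px land on the convergence plane
            # (zero parallax); the cost is a slightly narrower frame.
            if offset_px <= 0:
                return left, right
            return left[:, offset_px:], right[:, :-offset_px]

        # e.g. nudge the convergence plane onto the subject's eyes:
        # left_c, right_c = converge_in_post(left, right, offset_px=40)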

    The anaglyph conversion was done in Nuke. I implemented the conversion formulas found here:
    http://www.3dtv.at/knowhow/anaglyphcomparison_en.aspx
    Specifically the Optimized Anaglyph formula: it dramatically reduces luminance asymmetry by knocking out colours that only appear in one eye or the other. You lose colour range but reduce the visual buzzing. It’s responsible for the heavy green cast, but it feels better in 3D relative to traditional anaglyph.
    If anyone is interested, they can download my Nuke gizmo below:
    http://dl.dropbox.com/u/3656251/OptimizedAnaglyph.gizmo.zip
    We didn’t use Ocula but most likely will in the future. We aren’t using it in our day jobs because Guardians is all CG and doesn’t suffer from the issues that Ocula is designed to address (mainly repairing problems with mismatches in live-action stereo footage).
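
    The channel mixing behind that formula looks roughly like the following (a sketch only, with coefficients recalled from the 3dtv.at page linked above rather than taken from the gizmo; check the page before relying on the exact numbers):

        import numpy as np

        def optimized_anaglyph(left, right):
            # left/right are float RGB arrays in [0, 1].
            # Red is rebuilt from the left eye's green and blue channels rather than
            # its own red; green and blue come straight from the right eye. Dropping
            # the pure-red contribution tames the retinal rivalry and is also what
            # causes the green cast mentioned above.
            out = np.empty_like(left)
            out[..., 0] = 0.7 * left[..., 1] + 0.3 * left[..., 2]
            out[..., 1] = right[..., 1]
            out[..., 2] = right[..., 2]
            return np.clip(out, 0.0, 1.0)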

    @Larry N.Bolch
    Are you the guy who takes all the great 3d Lambo and Ferrari stuff on Flickr?

    As for the W1:
    I have one now; it’s great and lives in my bag everywhere I go. It takes seconds to fire up, unlike the 30 minutes of frigging around with the D90s. I love it.
    It’s worth noting that the W1 is also a parallel camera, doing the convergence as a post process.

    @Rob Bannister
    Yes, reconstructing 3D point clouds from stereo pairs is an area that interests us.
    It’s non-trivial, but we have tried some stuff with projecting into a deep image. It’s cool, but we need more hours in the day.
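
    One common off-the-shelf route to a rough point cloud from a stereo pair (not the deep-image projection mentioned above, purely an illustration) is OpenCV’s block matching; it assumes a rectified pair, and every parameter below is a made-up placeholder.

        import cv2
        import numpy as np

        left = cv2.imread("pair_042_left.jpg", cv2.IMREAD_GRAYSCALE)
        right = cv2.imread("pair_042_right.jpg", cv2.IMREAD_GRAYSCALE)

        # Dense disparity via semi-global block matching; SGBM returns
        # fixed-point disparities scaled by 16.
        matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
        disparity = matcher.compute(left, right).astype(np.float32) / 16.0

        # The reprojection matrix normally comes from cv2.stereoRectify(); the
        # focal length and baseline here are guessed placeholders.
        f, baseline = 2500.0, 0.1          # pixels, metres (assumed)
        cx, cy = left.shape[1] / 2.0, left.shape[0] / 2.0
        Q = np.float32([[1, 0, 0, -cx],
                        [0, 1, 0, -cy],
                        [0, 0, 0,  f],
                        [0, 0, -1.0 / baseline, 0]])

        points = cv2.reprojectImageTo3D(disparity, Q)   # H x W x 3 point positions
        valid = disparity > disparity.min()
        cloud = points[valid]                           # N x 3 point cloud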

    As for Nikon making dedicated stereo cameras in the future?
    I think it’s going to have to evolve out of wherever they go with mirrorless cameras; smaller lenses for the same size sensor will make it easier to position the sensors close together.
    See similar ideas with people getting excited about stereo rigs built on RED’s small-sensor Scarlet.

    • http://larry-bolch.com Larry N.Bolch

      I covered all aspects of motor racing as my job for more than a dozen years, but that was film and shot for publication a lot of years ago. So no shots on Flickr. Well over 90% of my shots so far have been decisive moment people shots.

      Photoshop makes anaglyphs a breeze. Paste the right-eye view over the left as a new layer. Select Layer->Layer Style->Blending Options and click the Red button for red/cyan or the Green button for green/magenta. Use the cursor keys to set the convergence plane while viewing through glasses. I have an Action that does the layering and blending, so all that is left is using the cursor keys to set the easiest and most natural viewing. Very slick, and it works with either colour or monochrome images. I have one version of the Action that de-saturates the images prior to layering, which produces less cross-talk between the eyes.

    • http://nikonrumors.com/ [NR] admin

      Thanks for your answers Alex!

  • Jivee

    Ah! The ‘D180’, the D90 replacement! Right, now to go and build my Panasonic ‘FZ100’. I knew there was a darn good reason I bought that second FZ50.

  • zzddrr

    OK, this means that in 3D I am 3 times uglier >> I am already ugly :-)

  • Nikon L35

    The hardware used is certainly interesting to say the least. For those of you interested in stereo photography I would highly recommend this company’s hardware

    http://www.stereoscopy.com/jasper/

    I met the owner. He does all his own machining and has been quietly making them for about 25 years. I own the panorama head and the single vertical mount. The construction is top-notch.

  • Gav

    In the 3D TV world, cameras are placed closer together (otherwise known as reducing the interaxial distance) by placing the second camera on top of the other, shooting its view reflected through a mirror. Obviously these are much larger cameras than a D90, and there is a quality loss, but it’s only on one eye.

  • glu

    can anyone explain the camera straps? I mean, is anyone going to use them with the combo or what?

    • ArtTwisted

      those are for style clearly and they also let the user guide the rig around like a horse.
