
Lens Adaptors for iPhone4S: technical details, issues and usage

August 31, 2012 · by parasam

[Note: before moving on with this post, a comment on stupid spell-checkers… my blog writer (and even Microsoft Word!) insists that “adaptor” is a misspelling. Not so… an “adaptor” is a device that adapts one thing to another that would otherwise be incompatible, while an “adapter” is a person who adapts to new situations or environments… I’ve seen countless instances of misuse… I fear that even educated users are deferring to software, assuming that it’s always correct. The number of flat-out wrong rulings on both spelling and grammar in most major software applications is actually scary…]

Okay, now for the good stuff…  While the iPhone (in all models) is a fantastic camera for a cellphone, it does have many limitations, some of which have been discussed in previous articles on this blog. The one we’ll address today is the fixed field-of-view (FOV) of the camera lens. Most users are familiar with the 35mm SLR (Single Lens Reflex) – or, if you are young enough to never have used film, the DSLR (Digital SLR) – and so have at least an acquaintance with the relative FOV of different focal length lenses. As a quick review, the so-called “normal” lens for a 35mm sensor size is a 50mm focal length. Anything less than that is termed a “wide angle” lens; anything greater is termed a “telephoto” lens. This is a somewhat loose description, and at very small focal lengths (which lead to very wide angles of view) the terminology changes to a “fisheye” lens. For a more detailed explanation of focal length and other issues please see my original post on the iPhone4S camera “Basic Overview” here.

Overview

The lens that is part of the iPhone4S camera system is a fixed-aperture / fixed-focal-length lens. The aperture is set at f2.4, while the 35mm-equivalent focal length of the lens is 32mm – a moderately wide angle lens. The FOV (Field of View) for this lens is 62° for still photos and 46° for video. {Note: since the video mode of 1920×1080 pixels uses a smaller area of the sensor than still photos (3264×2448), the angle of view changes even though the focal length is constant.} The fixed FOV (i.e. not a zoom lens) affects composition of the image, as well as depth of field. A quick note: yes, the iPhone (like most other cellphone cameras) has a “zoom” function, but this is a so-called “digital zoom,” achieved by cropping and magnifying a small portion of the original image as captured on the sensor. This produces a poor quality image with low resolution, and should be avoided for any serious photography. A true zoom lens (sometimes called ‘optical zoom’) achieves this function by mechanically changing the focal length – something that is impractical to engineer into a cellphone. As a rule of thumb, the smaller the focal length, the greater the depth of field (the range of distances from the lens that are in focus) and the greater the field of view (how much of the total scene fits into the captured image).
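
To make the numbers concrete, here is a small Python sketch of the standard relation FOV = 2·arctan(d / 2f) between sensor dimension, focal length and field of view. The focal length and sensor width below are approximate, assumed values rather than Apple specifications (the published 62°/46° figures likely reflect the exact sensor geometry and measurement axis), but the sketch shows why cropping the sensor for 1080p video narrows the FOV even though the focal length never changes:

```python
import math

def fov_degrees(sensor_dim_mm, focal_mm):
    # pinhole-camera relation: FOV = 2 * arctan(d / (2 * f))
    return math.degrees(2 * math.atan(sensor_dim_mm / (2 * focal_mm)))

FOCAL_MM = 4.3          # approximate true focal length of the iPhone4S lens (assumed)
STILL_WIDTH_MM = 4.54   # illustrative width of a ~1/3.2" sensor (assumed)
VIDEO_WIDTH_MM = STILL_WIDTH_MM * 1920 / 3264   # central crop used for 1080p video

print(fov_degrees(STILL_WIDTH_MM, FOCAL_MM))    # full-width FOV for stills (~56 deg)
print(fov_degrees(VIDEO_WIDTH_MM, FOCAL_MM))    # narrower FOV for the video crop (~35 deg)
```

The same relation explains why the 2X telephoto adaptor described below roughly halves the FOV (doubling f halves the argument of the arctangent, which for moderate angles nearly halves the angle itself), and why a digital zoom – a crop – narrows the FOV without any optics at all.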

In order to add some variety to the compositional choices afforded by the fixed iPhone lens, the only option is to fit external adaptor lenses to the iPhone. There are several manufacturers that offer these, using a variety of mechanical devices to mount the lens. There are two basic divisions of adaptor type: those that provide external lenses and the mounting hardware; and those that provide a mechanical adaptor to use commonly available 35mm lenses with the iPhone. One example of an adaptor for 35mm lenses is here, while an example of lens+mount is here.

I personally don’t find a use for adapting 35mm lenses to the iPhone: if I am going to deal with the bulk of a full-sized lens then I will always choose to attach a real camera body and take advantage of the resolution and control that a purpose-built camera provides. Not everyone may share this sentiment, and for those who find this useful there are several adaptors available. I do shoot a lot with the iPhone, and found that I really did want a relatively small and lightweight set of adaptor lenses to offer more choice in framing an image. I researched the several vendors offering such devices, and for my personal use I chose the iPro lens system manufactured by Schneider Optics. I made this choice based on two primary factors: my prior experience with lenses made by Schneider (their unparalleled Super Angulon wide angle for my view camera), and the precision, quality and versatility of the iPro system. This is a personal choice – ultimately any user will find what works for them – but the principles discussed here will apply to any external adaptor lens. As I have mentioned in previous posts, I am not a professional reviewer, have no relationship with any hardware or software vendor (other than the support offered to an end user), and have no commercial interest in any product I mention in this blog. I pick what I like, then write about it.

I do want to point out, however, that once I started using the iPro lenses and had some questions, I received a generous amount of time and assistance from the staff at Schneider Optics, particularly Niki Mustain. I would like to thank her and all the staff who so generously answered my incessant questions, and who did a fair amount of additional research and testing prompted by some of my observations. They kindly made available an internal report on iPro lens performance and its interactions with the iPhone camera (some of these issues are discussed below). When and if they make that public (likely as an application note on their website) I will update this blog with a comment pointing to it; in the meantime they have allowed me to use some of their comments on the general technology and limitations of any adaptor lens system as background for this post.

Technical specs on the iPro lens adaptor system

This particular system offers three different adaptor lenses (they can be purchased individually or as a set): Wide Angle, Telephoto and Fisheye. Here are the basic specifications:

As can be seen from the above details, the Telephoto is a 2X magnification, doubling the focal length and halving the FOV (Field of View). The Wide Angle changes the stock medium wide-angle view of the iPhone to a “very wide” wide angle (19mm equivalent – about the widest FOV provided by most variable focal length** 35mm lenses). The Fisheye offers what I would consider a ‘medium’ fisheye look, with a 12mm equivalent focal length. With fisheye lenses generally accepted as having focal lengths of 18mm or less, this falls about midway between 6mm*** and 18mm.

**There is a difference between “variable focal length” and “zoom” lenses, although most people use the terms interchangeably, not being aware of the distinction. A variable focal length lens allows a continuous change of focal length, but once the new focal length is established, the image must be refocused. A true zoom lens will maintain focus throughout the entire range of focal lengths allowed by the lens design. Obviously a true zoom lens is more difficult (and therefore more costly) to manufacture. Typically, zoom lenses are larger and heavier than variable focal length lenses. It is also more difficult to create such a lens with a wide aperture (low f/stop number). To give an example, you can purchase a reasonable 70-200mm zoom lens for about $200 (with a maximum aperture of f5.6); a high quality zoom lens of the same range (70-200mm) that opens up to f2.8 will run about $2,500.

Another thing to keep in mind is that most ‘variable focal length’ lenses are not advertised as such; they are often marketed as zoom lenses, but careful testing will show that accurate focus is not maintained throughout the full range of focal lengths. Not surprising, as this is a difficult optical feat to do well, which is why high quality zoom lenses cost so much. Really good HD video or cinemaphotography zoom lenses with an extremely wide range (often used for sports television – for example the Canon DigiSuper 80 with a zoom range of 8.8 to 710mm) can cost upwards of $163,000. Warning: playing with one of these for a few days will produce depression and optical frustration once you return to ‘normal’ inexpensive zoom lenses… A good lens is simply the most important factor in getting a great image. Period.

*** The extreme wide end of fisheye lenses is held by the Nikkor 6mm/f2.8, a masterpiece of engineering. With an almost insane 220° FOV, this is the widest lens for 35mm cameras of which I am aware. You won’t find it in your local camera shop, however; only a few hundred were ever made, during the 1970s and 1980s. The last time one went on auction (in the UK in April 2012) it sold for just over $160,000. The objective lens is a bit over 236mm (9.25″) in diameter! Here are a few pix of this awesome lens:

actual image taken with 6mm f2.8 Nikkor fisheye

Ok, back to reality (both size and wallet-wise…)

Here are some images of my iPro lenses to give the reader a better idea of the devices which we’ll be discussing further:

The 3-lens iPro kit fully assembled for carrying/storage.

An ‘exploded view’ of all the bits that make up the 3-lens iPro system.

The Fisheye, Telephoto and Wide Angle iPro lenses.

Front view of iPro case mounted to iPhone4S, showing the attached tripod adaptor.

Rear view of the iPro case mounted on iPhone4S.

Close-up of the bayonet lens mounting feature of the iPro case.

2X Telephoto mounted on iPhone.

WideAngle lens mounted on iPhone.

Fisheye lens mounted on iPhone.

Basic use of the iPro lens system

The essential parts of the iPro lens system are the case, which allows precision alignment of the lens with the iPhone camera, and the detachable lens elements themselves. As we will discuss below, the precision and accuracy of mounting an external adaptor lens is crucial to good optical performance. It may seem trivial, but the material and design of the case are an important part of the overall performance of this adaptor lens system. Due to the necessary rigidity of the case material, once it is installed on the iPhone it is not the easiest to remove… I missed this important part of the instructions provided: you must attach the tripod adaptor to the case body to provide the additional leverage needed to slightly flex the case for removal. (The hole in the rear of the case that shows the Apple logo is actually a critical design element: that is where you push with a finger of your opposite hand while flexing the case in order to pop the phone out of the case.)

In addition to providing the necessary means for taking the iPhone out of the case should you need to (and you really won’t: I found that this case works just fine as an everyday shell for the phone, protecting the edges, insulating the metallic sideband to avoid the infamous ‘hand soaking up microwaves dropped-call iPhone effect’, and staying slim enough to fit perfectly in my belt-mounted carrying case), the tripod mounting screw provides a very important improvement for iPhonography: stability. Even if you don’t use any of the adaptor lenses, the ability to affix the phone to a tripod (or even a small mono-pod) is a boon to getting better photographs with the iPhone. Rather than bore you with various laws of physics and optical science, just know that the smaller the sensor, the more the resultant image is affected by camera movement. The simple truth is that the very small sensor size of the iPhone camera, coupled with the light weight and small case size of the phone, means that most users unconsciously jiggle the camera a lot when taking an image. This is the single greatest reason for lack of sharpness in iPhone images. To compound things, the smaller the sensor size, the less sensitive it is at gathering light, which means that in virtually anything but direct sunlight the iPhone is shooting at relatively slow shutter speeds, which only exaggerates camera movement.

Since the EXIF data (camera image metadata) is recorded with each shot, you can see afterwards what shutter speed the iPhone used for each of your shots. The range of shutter speeds on the iPhone4S is from 1/15 sec to 1/2000 sec. Any shutter speed slower than 1/250 sec will show some blurring if the camera moves at all during the shot. So, whenever possible, brace your phone against a rigid object when shooting, particularly in partial shade or darker surroundings. Since a suitable fence post, lamp pole or other object is often not right where you need it for your shot, the ability to use some form of tripod will frequently give a superior result.
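
If you want to check this on your own shots, here is a minimal sketch that reads the shutter speed from a photo’s EXIF data. It assumes a recent version of the Pillow library (the tag layout is standard EXIF, nothing iPhone-specific), and the filename is hypothetical:

```python
from PIL import Image
from PIL.ExifTags import IFD

EXPOSURE_TIME = 0x829A  # standard EXIF tag ID for shutter speed

def shutter_seconds(path):
    exif = Image.open(path).getexif()
    camera_ifd = exif.get_ifd(IFD.Exif)    # exposure settings live in the Exif sub-IFD
    value = camera_ifd.get(EXPOSURE_TIME)  # a fraction of a second, e.g. 0.004 = 1/250
    return float(value) if value is not None else None

speed = shutter_seconds("IMG_0001.JPG")    # hypothetical filename
if speed is not None and speed > 1 / 250:
    print(f"{speed:.4f} sec exposure: brace the phone or use a tripod")
```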

The adaptor lenses themselves twist into the case with a simple bayonet mount. As usual with any fine optics, take care to avoid dropping, scratching or otherwise damaging the delicate optical surfaces of the lenses. The telephoto lens will most benefit from tripod use (when possible), as the narrower the angle of view, the more pronounced camera shake is on the image. On the other hand, the fisheye lens can be handheld for most work with no visible impairment. A note on use of the fisheye lens:  the FOV is so wide that it’s easy for your hand to end up in the image… take some care and practice with how you hold the phone when using this lens.

Optical issues with adaptor lenses, including the iPro lens system

After using the adaptor lenses for a short time, I found several impairments in the images taken. Essentially the artifacts amount to a lack of sharpness towards the edge of the image, and color fringing of certain objects near the edge of the frame. I went on to perform extensive tests of each of the lenses and then forwarded my concerns to the staff at Schneider Optics. To my pleasure, they were open to my concerns, and performed a number of tests in their own lab as well. While I will discuss the details below, the bottom line is that both I and the iPro team agree that external adaptor lenses are not a perfect science, particularly with the iPhone. We must remember, for all the fantastic capabilities that this device exhibits… it’s a bloody cellphone! I have every confidence that Schneider (and probably other vendors as well) have made every effort, within the scope of practicality and budget for such lenses, to minimize the side-effects. I have found the actual optical precision of the iPro lenses (as measured for such things as MTF [Modulation Transfer Function – an objective measurement of the resolving capability of a lens system], illumination fall-off, chromatic and geometric aberrations, optical alignment and contrast ratio) to be excellent – particularly for lenses that are really quite inexpensive for their quality.

The real issue lies with the iPhone camera system itself: Apple never designed this camera to interoperate with external adaptor lenses. One cannot fault the original manufacturer for producing a piece of hardware that offers good performance at a reasonable price within a self-contained system. The iPhone designers have treated the totality of the hardware and software of the camera system as a fixed and closed universe. This is typical of the way that Apple designs both their hardware and software. There are both pros and cons to this philosophy: the strong advantage is the ability to blend the design characteristics of hardware and software to complement each other, meeting design criteria within a time/cost budget; the disadvantage is that external hardware or software often cannot easily interoperate with Apple products. For example, the software development guidelines for Apple devices are the most stringent in the entire industry. You work within the framework provided, or you don’t get approval for your app. Every app intended for any iDevice must be submitted to Apple directly for testing and approval. This is virtually unique in the entire computer/cellphone industry. (I’m obviously not talking about the gray area of ‘jailbroken’ phones and software.)

The way in which this design philosophy shows up in relation to external adaptor lenses is this: the iPhone camera is an amazingly good camera for its size, cost and weight, but it was never designed to be complementary to external lenses. Certain design choices that are not evident when images are taken with the native camera show up, sometimes rather glaringly, when external lenses are coupled with the iPhone camera. One might say that latent issues in the lens and sensor design are significantly amplified by external adaptor lenses. This issue is endemic to any external lens, not just the iPro lenses I am discussing here. Each one will of course have its own unique ‘fingerprint’ of interaction with the iPhone camera, but the general issues discussed will be the same.

As usual, I bring all this up to share with my readers the best information I can find or develop about what’s realistically possible with this great little camera. The better we know the capabilities and limitations of our tools, the better able we are to make the images we want. I have taken some great shots with these adaptor lenses that would have been impossible to create any other way. I can live with the distortions introduced as a compromise to get the kind of shot that I want. The more aware I am of what the issues are, the better I can attempt (while composing a shot) to minimize the visibility of some of these artifacts.

To get started, here are some example shots:

[Note:  all shots are unretouched from the iPhone camera, the only adjustment is resizing to fit the constraints of this blog format]

iPhone4 Normal (no adaptor lens)

iPhone4 WideAngle adaptor lens

iPhone4 Fisheye adaptor lens

iPhone4 Telephoto adaptor lens

iPhone4S #1 Normal

iPhone4S #1 WideAngle

iPhone4S #1 Fisheye

iPhone4S #1 Telephoto

iPhone4S #2 Normal

iPhone4S #2 WideAngle

iPhone4S #2 Fisheye

iPhone4S #2 Telephoto

The above shots were taken to test one of the first potential causes for the artifacts in the images: the softening towards the edges as well as the color fringing of bright areas near the edge of the image (chromatic aberration). A big potential issue with externally mounted adaptor lenses for the iPhone is lens alignment. The iPhone lens is physically aligned to the sensor as part of the entire camera assembly. This unitary assembly is then inserted into the case during final manufacture of the device. Since Apple never considered the use of external adaptor lenses, no effort was made to ensure perfect alignment of the camera assembly into the case. As can be seen from my blog on the iPhone hardware (showing detailed images of an iPhone torn apart), the camera assembly is simply pressed into place – there is no precision mechanical lock to align the optical axis of the camera with the case. In addition, the actual camera lens is protected by being installed behind a clear plastic window that is part of the outer case itself.

What this means is that if the camera assembly is tilted even very slightly it will produce a “tilt-shift” de-focus effect when coupled with an external lens:  the center of the image will be in focus, but both edges will be out of focus. One side will actually be focused a bit behind the sensor plane, the other side will be focused a bit in front of the sensor plane.

The above diagram represents an extreme example, but you can see that if the lens is tilted in relation to the image sensor plane, the plane of focus changes. Objects at the edge of the frame will no longer be in focus, while objects in the center of the frame will remain in focus.
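
To get a feel for the scale involved, here is a toy calculation of how far the plane of sharp focus shifts at the sensor edges for a tiny mounting tilt. All numbers are illustrative assumptions, not measured iPhone values:

```python
import math

SENSOR_HALF_WIDTH_MM = 2.3  # roughly half the width of a ~1/3.2" sensor (assumed)
TILT_DEGREES = 0.5          # a hypothetical mounting tilt, far too small to see by eye

# With the lens tilted, one sensor edge sits closer to the lens and the opposite
# edge farther away, each by approximately half-width * tan(tilt):
shift_mm = SENSOR_HALF_WIDTH_MM * math.tan(math.radians(TILT_DEGREES))
print(f"focus-plane offset at each edge: {shift_mm * 1000:.0f} micrometers")  # ~20
```

Twenty micrometers sounds negligible, but at this sensor scale it is comparable to the depth of focus, so one edge falls behind best focus while the opposite edge falls in front of it, producing exactly the symmetric edge softness seen in the test shots.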

In order to rule out a fault in any single phone, I used three separate iPhones (one iPhone4 and two iPhone4S models). While not a large sample statistically, it did provide some certainty that the issues I was observing were not related to a single iPhone. You can see from the examples above that all of the adaptor lens shots exhibit some degree of the two artifacts (defocused edges and chromatic aberration). So further investigation was required in order to attempt to understand the root cause of these distortions.

Since the first set of test shots was not overly ‘scientific’ (back yard), I was advised by the staff at Schneider that a brick wall was a good test subject. It was easy to visualize the truth of this, so I went off in search of a large public test chart (brick wall…)

WideAngle taken from 15 ft.

Fisheye taken from 15 ft.

Telephoto taken from 15 ft.

To add some control to the shots, and reduce potential errors of camera movement that may affect sharpness in the image, the above and all subsequent test shots were taken while the iPhone was mounted on a stable tripod. In addition, each shot was taken from exactly the same camera position (in the above shots, 15 feet from the wall). Two things stood out here: 1) there was a lack of visible chromatic aberration [I think likely due to the flat lighting on the wall and lack of high contrast edges, which typically enhance that form of artifact]; and 2) the soft focus artifact is more pronounced on the left and right sides as opposed to the top and bottom edges. [More on why I think this may occur later in this article].

WideAngle, 8 ft.

Fisheye, 8 ft.

WideAngle, 30 ft.

Fisheye, 30 ft.

Telephoto, 30 ft.

WideAngle, 50 ft.

Fisheye, 50 ft.

Telephoto, 150 ft.

Telephoto, 150 ft.

Telephoto, 500 ft.

The above set of images represents the next series of test shots. Here, various distances to the “test chart” [this time I needed an even larger ‘chart’, so had to find a 3-story brick building…] were used in order to see what effect distance may have on the resultant image. A few ‘real world’ images were shot using just the telephoto at long distances – here the large distance from camera to subject, using a telephoto lens, would normally result in a completely ‘flat’ image with everything in the same focal plane. Once again, we continue to see soft focus and chromatic aberrations at the edges.

Normal (no adaptor), auto-focus

Normal, Selective Focus

WideAngle, Auto Focus

WideAngle, Selective Focus

Fisheye, Auto Focus

Fisheye, Selective Focus

Telephoto, Auto Focus

Telephoto, Selective Focus

This last set of test shots was suggested by the Schneider staff, based on some tests they ran and subsequent discussions. One theory is that there is a difference in how the iPhone camera internal software (firmware + OS kernel software – not anything a camera app developer has access to) handles auto-focus vs selective focus. Selective focus is where the user can select the focus area, usually with a little square that can be moved to different parts of the image. In all the above tests, the selective focus area was set to the center of the image. In theory, since my test subjects were flat and all at the same distance from the camera, there should have been no difference between auto-focus and selective focus, no matter which lens was used. Careful examination of the above images shows an inconsistent result: the fisheye showed no difference between the two focus modes, the normal and telephoto looked better with selective focus, while the wide angle looked best with auto focus.

The internal test report I received from Schneider pointed out another potential anomaly, one I have not yet had time to attempt to reproduce: using selective focus off-center in the image. This usage appeared to generate results that would be unexpected in normal photographic work: the area of selective focus was sharp, most of the rest of the image was a bit softer, but a mirror image position of the original selective focus region was once again sharp on the opposite side of the image. This does seem to clearly point to some image-enhancement algorithms behaving in an unexpected fashion.

The issue of auto-focus methods is a bit beyond the scope of this article, but considerable research shows that the most likely methodology used in the iPhone camera is passive detection (that much is certain – there is no range finder on an iPhone!) controlling lens barrel or lens element adjustment. A large number of vendors support this form of auto-focus. (By ‘auto-focus’ here I mean ‘not manual focus’, since there is no mechanical focus ring on cellphones – the auto-focus can either be entirely automatic [as I use the term “auto-focus” in my tests above] or selective-area auto-focus, where the user indicates a region of the image on which the auto-focus is concentrated.) One of the most advanced methods is MEMS (Micro-Electro-Mechanical Systems), which moves a single optical element within the lens barrel; another popular method is the ‘voice-coil’ micro-motor, which moves the entire lens barrel to effect focus.
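
For readers unfamiliar with passive (contrast-detection) auto-focus, the core idea is easy to sketch: step the lens through its travel, score each frame for local contrast, and keep the position that scores highest. This is a generic illustration in Python, not Apple’s actual algorithm, and all names are mine:

```python
import numpy as np

def contrast_score(gray):
    """Gradient energy: in-focus frames show stronger pixel-to-pixel differences."""
    g = gray.astype(float)
    gx = np.diff(g, axis=1)   # horizontal neighbor differences
    gy = np.diff(g, axis=0)   # vertical neighbor differences
    return float((gx ** 2).sum() + (gy ** 2).sum())

def best_focus_position(frames):
    """frames: {lens_position: 2-D grayscale array captured at that position}."""
    return max(frames, key=lambda pos: contrast_score(frames[pos]))
```

Selective focus simply restricts the scoring to the user-chosen region; an external adaptor changes the contrast distribution the scorer sees, which is one plausible way the artifacts described above could arise.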

With the advances brought by iOS5, including face recognition (the camera attempts to recognize faces in the image and focus on them when in full auto-focus mode), it is apparent that significant image recognition and processing is being done at the kernel level, before any camera app ‘gets its hands on’ the camera controls. The bottom line is that there may well be interactions in which the passive detection and image processing algorithms are affected by the (to the iPhone software) unexpected presence of an external adaptor lens. Another way to put this is that the internal software of the camera is likely ‘tuned’ to the lens that is part of the camera assembly, and a significant change to the optical pattern drawn on the sensor (now that a telephoto adaptor is attached) alters the focusing algorithm in an unexpected manner, producing the artifacts we see in the examples.

This issue is not at all unknown in engineering and quality control: a holistically designed system, where all of the variables are thought to be known, can be significantly degraded when even one element is externally modified without knowledge of the full scope of design parameters. This often occurs with after-market additions or changes to automobiles. One simple example: if you change the tire size (radius, not width), the speedometer is no longer accurate – the entire system of the car, including wheel and tire diameter, was part of the calculus for determining how many turns of the axle per minute (all the speedometer mechanism actually measures) are required to indicate X kph (or mph) on the instrument panel.
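
As a worked example (the tire sizes are hypothetical), the speedometer converts axle revolutions to speed using the circumference it was calibrated for, so a larger tire makes the car travel farther per revolution than the dash assumes:

```python
import math

AXLE_RPM = 800            # what the speedometer mechanism actually measures
CALIBRATED_DIA_M = 0.632  # tire diameter the factory calibration assumes (hypothetical)
ACTUAL_DIA_M = 0.660      # larger after-market tire (hypothetical)

def kph(rpm, tire_diameter_m):
    # distance per revolution is the tire circumference: pi * diameter
    return rpm * 60 * math.pi * tire_diameter_m / 1000

indicated = kph(AXLE_RPM, CALIBRATED_DIA_M)  # ~95 kph shown on the dash
actual    = kph(AXLE_RPM, ACTUAL_DIA_M)      # ~100 kph true road speed
```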

Another factor that may have a material effect on the focus and observed chromatic aberration is the lens design itself, and how an external adaptor lens may interact with the native design. Simple lenses are often portions of a sphere, so-called “spherical lenses.” Such a lens suffers from significant optical aberrations, as not all of the light rays focused by a spherical lens converge to a single point (producing a lack of sharp focus). Also, such lenses bend different colors of light differently, leading to chromatic aberrations (where one sees color fringing, usually blue/purple on one side of a high contrast object and green/yellow on the opposite side). Most high quality modern camera lenses are either aspherical (specially modified shapes that deviate from a perfect spheroid) or groups of elements, some of which may be spherical and others aspherical. Several examples are shown below:

We know from published literature that the lens used in the iPhone4S is a 5-element lens system with several aspherical elements. A diagram released by Apple is shown below:

iPhone4 lens system [top] and iPhone4S lens system [bottom]

Again, as described earlier, the iPhone camera system was designed as a unitary system, with factors from the lens system, the individual lens elements, the sensor, firmware and kernel software all becoming known variables in a highly complex opto-electronic equation. The introduction of an external adaptor array of additional elements can produce unplanned effects. All in all, the various vendors of such adaptor lenses, including iPro, have done a good job in dealing with many unknowns. Apple is a highly secretive manufacturer, and does not publish much information. Attempts to gain further technical knowledge are very difficult, at some point one invariably comes up against Apple’s draconian NDAs (Non-Disclosure Agreements) which have penalties large enough to deter even the most aggressive seekers of information. Even the accumulation of knowledge that I have acquired over the past year while writing about the iPhone has been slow, tedious and has taken a tremendous amount of research and ‘fact comparison.’

As a final example, using a more real-world subject, here are a few camera images and screen shots that demonstrate the challenge if one attempts to correct, using post-production techniques, some of the errors introduced by such a lens adaptor:

Original image, unretouched but annotated.

The original image shows significant chromatic aberrations (color fringing) around the reflections in the shop window, grout lines in the brickwork on the pavement, and on the left side of the man’s shirt.

Editing using Photoshop ‘CameraRaw’ to attempt to correct the chromatic aberrations.

Using the Photoshop Camera Raw module, it is possible to manually correct for color fringing shifts… but this affects the entire image. So a fix for the edges causes a new set of errors in the middle of the image.

Chromatic aberrations removed from around the reflections in the window…

Notice here that the color fringing is gone from around the bright reflections in the window, but now the left edge of the man’s shirt has the color shifted, leaving only the monochromatic outline behind, producing a dark gray edge instead of the uniform blue that should exist.

…but reciprocal chromatic edge errors are introduced in the central portion of the image where highly saturated colors abut more neutral areas.

Likewise, the green paint on the steel column has shifted, revealing a gray line on the right of the woman’s leg, with a corresponding shift of the flesh tone onto the green steelwork on the left side of her leg.

final retouched shot after ‘painting in’ was performed to resolve the chroma offset errors in the central portion of the image.

To fix all these new errors, a technique known as ‘painting in’ was used, sampling and filling the color errors with the correct shade, texture and intensity. This takes time, skill and patience. It is impractical for the most part – it was done here only as an example.
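
For the curious, here is a hedged sketch of what the global Camera Raw-style correction above does under the hood: resample the red and blue planes by slightly different scale factors about the image center, which is effectively what a lateral-CA slider does. The scale factors are invented for illustration, and NumPy/SciPy are assumed:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def scale_plane(plane, k):
    """Resample one color plane, magnified by factor k about the image center."""
    h, w = plane.shape
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    # sampling source positions shrunk by 1/k magnifies the output by k
    return map_coordinates(plane, [(yy - cy) / k + cy, (xx - cx) / k + cx],
                           order=1, mode='nearest')

def correct_lateral_ca(rgb, k_red=1.0008, k_blue=0.9992):
    """Pull red/blue fringes back into register; factors are illustrative only."""
    out = rgb.astype(float)
    out[..., 0] = scale_plane(out[..., 0], k_red)
    out[..., 2] = scale_plane(out[..., 2], k_blue)
    return np.clip(out, 0, 255).astype(np.uint8)
```

Because the model is global and purely radial, a factor chosen to neutralize fringing at the frame edges shifts every other radius proportionally; when the real aberration does not follow that radial law (as appears to be the case with these adaptor combinations), mid-frame colors are pushed out of register instead, which is exactly the trade-off shown in the screenshots above.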

Summary

The use of external adaptor lenses, including the iPro system discussed here, can offer a useful extension to the creative composition of images with the iPhone. Such lenses bring a set of compromises with them, but once these are known, careful choice of lighting, camera position and other factors can reduce the visibility of such effects. As with any ‘creative device’, less is often more… sparing use of such adaptors will likely bring the best results. However, there are shots that I have obtained with the iPro that would have been impossible with the basic iPhone camera/lens, so I am happy to have this additional tool.

To close, here are a few more examples using the iPro lenses:

Fisheye

WideAngle

Telephoto

iPhone Cinemaphotography – A Proof of Concept (Part 1)

August 3, 2012 · by parasam

I’m introducing a concept that I hope some of my readers may find interesting: the production of an HD video built entirely on the iPhone (and/or iPad). Everything from storyboard through all photography, editing, sound, titles and credits, graphics and special effects – and final distribution – can now be performed on a “cellphone.” I’ll show you how. Most of the attention paid to the new crop of highly capable ‘cellphone cameras’, such as those in the iPhone and certain Android phones, has been focused on still photography. While motion photography (video) is certainly well-known, it has not received the same attention and detail – nor the number of apps – as its single-image sibling.

While I am using a single platform with which I am familiar (iOS on the iPhone/iPad), I believe this concept can be carried out on the Android class of devices as well. I have not (nor do I intend to) research that possibility – I’ll leave that for others who are more familiar with that platform. The purpose is to show that such a feat CAN be done – and hopefully done reasonably well. It’s only been a few years since the production of HD video was strictly in the realm of serious professionals, with budgets of hundreds of thousands of dollars or more. There are of course many compromises – I don’t for a minute pretend that the range of possible shots, or their quality, will come anywhere near what a high quality DSLR, RED, Arri or other professional video camera can produce – but I do know that a full HD (1080p) video can now be totally produced on a low-cost mobile platform.

This POC (Proof Of Concept) is intended as more than just a lark or a geeky way to eat some spare time:  the real purpose is to bring awareness that the previous bar of high cost cinemaphotography/editing/distribution has been virtually eliminated. This paves the way for creative individuals almost anywhere in the world to express themselves in a way that was heretofore impossible. Outside of America and Western Europe both budgets and skilled operator/engineers are in far lower supply. But there are just as many people who have a good story to tell in South Africa, Nigeria, Uruguay, Aruba, Nepal, Palestine, Montenegro and many other places as there are in France, Canada or the USA. The internet has now connected all of us – information is being democratized in a huge way. Of course there are still the ‘firewalls’ of North Korea, China and a few others – but the human thirst for knowledge, not to mention the unbelievable cleverness and endurance of 13-year-old boys and girls in figuring out ‘holes in the wall’ shows us that these last bastions of stolidity are doomed to fall in short order.

With Apple and other manufacturers doing their best to leave nary a potential customer anywhere in the world ‘out in the cold’, availability, in terms of both distribution and affordability, is almost ubiquitous. With apps now typically costing a few dollars (it’s almost insane – the Avid editor for iOS is $5; the Avid Media Composer software for PC/Mac is $2,500), an entire production / post-production platform can be assembled for under $1,000. This exercise is about what’s possible, not what is easiest or most capable. Yes, there are many limitations. Yes, some things will take a lot longer. But what you CAN do is nothing short of amazing. That’s the story I’m going to share with you.

A note to my readers:  None of the hardware or software used in this exercise was provided by any vendor. I have no commercial relationship with any vendor, manufacturer or distributor. Choices I have made or examples I use in this post are based purely on my own preference. I am not a professional reviewer, and have made no attempt to exhaustively research every possible solution for the hardware or software that I felt was required to produce this video. All of the hardware and software used in this exercise is currently commercially available – any reasonably competent user should be able to reproduce this process.

Before I get into detail on hardware or software, I need to remind you that the most important part of any video is the story. Just having a low-cost, relatively high quality platform on which to tell your ‘story’ won’t help if you don’t have something compelling to say – and the people/places/things in front of the lens to say it. We have all seen that vast amounts of money and technical talent mean nothing in the face of a lousy script or poor production values – just look over some of the (unfortunately many) Hollywood bombs… I’m the first to admit that motion picture storytelling is not my strong point. I’m an engineer by training and my personal passion is still photography – telling a story with a single image. So… in order to bring this idea to fruition, I needed help. After some thought, I decided that ‘piggybacking’ on an existing production was the most feasible way to realize this idea: basically adding a few iPhone cameras to a shoot where I could take advantage of the existing set, actors, lighting, direction, etc. For me, this was the only practical way to make this happen in a relatively short time frame.

I was lucky enough to know a very talented director, Ambika Leigh, who was receptive and supportive of my idea. After we discussed my general idea of ‘piggybacking’ she kindly identified a potential shoot. After initial discussions with the producers, the green light for the project was given. The details of the process will come in future posts, but what I can say now (the project is an upcoming series that is not released yet – so be patient! It will be worth the wait) is that without the support and willingness of these three incredible women (Ambika Leigh, director; Tiffany Price & Lauren DeLong, producers/actors/writers) this project would not have moved forward with the speed, professionalism and just plain fun that it has. At a very high level, the series brings us into the clever and humorous world of the “Craft Ladies” – a couple of friends that, well, like to craft – and drink wine.

“Craft Ladies is the story of Karen and Jane, best friends forever, who love to
craft…they just aren’t any good at it. Over the years Karen and Jane’s lives
have taken slightly different paths but their love of crafting (and wine)
remains strong. Tune in in September to watch these ladies fulfill their
dream…a craft show to call their own. You won’t find Martha Stewart here,
this is crafting Craft Ladies style. Craft Up Nice Things!”

Please check out their links for further updates and details on the ‘real thing’:

www.facebook.com/CraftUpNiceThings
www.twitter.com/#!/2craftladies
www.CraftUpNiceThings.com

I am solely responsible for the iPhone portion of this program – so all errors, technical gaffes, editorial bloops and other stumbles are mine. As said, this is a proof of concept – not the next Spielberg epic… My intention is to follow – as closely as my expertise and the available iOS technology will allow – the editorial decisions, effects, titles, etc. that end up in the ‘real show’. To this end I will necessarily lag a bit in my production, as I have to review the assembled and edited footage first. However, I will make every effort to have my iPhone version of this series ready for distribution shortly after the real version launches. Currently this is planned for some time in September.

For the iPhone shoot, two iPhone4S devices were used. I need to thank my capable 2nd camerawoman – Tara Lacarna – for her endurance, professionalism and support over two very long days of shooting! In addition to her new career as an iPhonographer (ha!) she is a highly capable engineer, musician and creative spirit. While more detail will be provided later in this post, I would also like to thank Niki Mustain of Schneider Optics for her time (and the efforts of others at this company) in helping me get the best possible performance from the “iPro” supplementary lenses that I used on portions of the shoot.

Before getting down to the technical details of equipment and procedure, I’ll lay out the environment in which I shot the video. Of course, this can vary widely, and therefore the exact technique used, as well as some hardware, may have to change and adapt as required. In this case the entire shoot was indoors using two sets. Professional lighting was provided (3200 K) for the principal photography (which used various high-end DSLR cameras with cinema lenses). I had to work around the available camera positions for the two iPhone cameras, so my shots will not be the same as were used in principal photography. Most shots were locked off with both iPhones on tripods; there were some camera moves and a few handheld shots. The first set of episodes was filmed over two days (two very, very long days!!) and resulted in about 116GB of video material from the two iPhones. In addition to Ambika, Tiffany, Lauren and Tara there was a dedicated and professional crew of camera operators, gaffers, grips, etc. (with many functions often performed by just one person – this was after all about quality, not quantity – not to mention the lack of a 7-figure Hollywood budget!). A full list of credits will be in a later post.

Aside from the technical challenges; the basic job of getting lines and emotion on camera; taking enough camera angles, close-ups, inserts and so on to ensure raw material for editorial continuity; and just plain endurance (San Fernando Valley, middle of summer, had to close all windows and turn off all fans and A/C for each shot due to noise, a pile of people on a small set, hot lights… you get the picture…) – the single most important ingredient was laughter. And there was lots of it!! At one time or another, we had to stop down for several minutes until one or the other of us stopped laughing so hard that we couldn’t hold a camera, say a line or direct the next sequence. That alone should prompt you to check this series out – these women are just plain hilarious.

Hardware:

As mentioned previously, two iPhone4S cameras were used. Each one was the 32GB model. Since shooting video generates large files, most user data was temporarily deleted off each phone (easy to restore later with a sync using iTunes). Approximately 20GB free space was made available on each phone. If one was going to use an iPhone for a significant amount of video photography the 64GB version would probably be useful. The down side is that (unless you are shooting very short events) you will still have to download several times a day to an external storage device or computer – and the more you have to download the longer that takes! As in any process, good advance planning is critical. In my case with this shoot, I needed to coordinate ‘dumping times’ with the rest of the shoot:  there was a tight schedule and the production would not wait for me to finish dumping data off the phones. The DSLR cameras use removable memory cards, so it only takes a few minutes to swap cards, then those cameras are ready to roll again. I’ll discuss the logistics of dumping files from the phones in more detail in the software section below. If one was going to attempt long takes with insufficient break time to fully dump the phone before needing to shoot again, the best solution would be to have two iPhones for each camera position, so that one phone could be transferring data while the other one was filming.

In order to provide more visual control, as well as interest, a set of external adaptor lenses (the “iPro” system by Schneider Optics) was used on various shots. Three different lenses are available: a telephoto, a wide-angle and a fisheye. A detailed post on these lenses – and adaptor lenses in general – is here. For now, you can visit their site for further detail. These lenses attach to a custom shell that is affixed to the iPhone. The lenses are easily interchanged with a bayonet mounting system. Another vital feature of the iPro shell is the provision for tripod mounting – a must for serious cinemaphotography – especially with the telephoto lens, which magnifies camera movement. Each phone was fitted with one of the iPro shells to facilitate tripod mounting. This also made each phone available for attaching one of the lenses as required for the shot.

iPro “Fisheye” lens

iPro “Wide Angle” lens

iPro “Telephoto” lens

Another hardware requirement is power:  shooting video kills batteries just about faster than any other activity on the iPhone. You are using most of the highest power consuming parts of the phone – all at the same time:  the camera sensor, the display, the processor, and high bandwidth memory writing. A fully charged iPhone won’t even last two hours shooting video, so one must run the phone on external power, or plan the shoot for frequent (and lengthy!) recharge sessions. Bring plenty of extra cables, spare chargers, extension cords, etc. – it’s very cheap insurance to keep the phones running. Damage to cables while on a shoot is almost a guaranteed experience – don’t let that ruin your session.

A particular challenge I had was the lack of a ‘feed-through’ docking connector on the Line6 “Mobile In” audio adaptor (more on this below). This meant that while using this high quality audio input adaptor I was forced to run on battery, since I could not plug the Mobile In device and the power cable into the docking connector on the bottom of the phone at the same time. I’m not aware of a “Y” adaptor for iPhone docking connectors, but that would have really helped. It took a lot of juggling to keep that phone charged enough to keep shooting. On several shots I had to forgo the high quality audio, as I had insufficient power remaining and had to plug in the charger.

As can be seen, the lack of both removable storage and a removable battery is a significant challenge for using the iPhone in cinemaphotography. This can be managed, but it’s a critical point that requires careful attention. Another point to keep in mind is heat. Continual use of the phone as a video camera definitely heats it up. While neither phone ever overheated to the point where it became an issue, one should be aware of this fact. If shooting outside, it may be helpful to shade the phone(s) from direct sunlight as much as practical. However, do not put the iPhones in the ice bucket to keep them cool…

Gitzo tripod with fluid head attached

Close-up of fluid head

Tripods are a must for any real video work: camera judder and shake are very distracting to the viewer, and impossible to remove (with any current iPhone app). Even with serious desktop horsepower (there is a rather good toolset in Adobe After Effects for helping to remove camera shake) it takes a lot of time, skill and computing power. Far better to avoid it in the first place whenever possible. Since ‘locked off’ shots are not as interesting, it’s worth getting fluid heads for your tripods so you can pan and tilt smoothly. A good high quality tripod is also well worth the investment: flimsy ones will bend and shake. While the iPhone is very light – and this may tempt one to go with a very lightweight tripod – this will work against you if you want to make any camera tilts or pans. The very light weight of the phone actually causes problems here: it’s hard to smoothly move a camera that has almost no mass. A very rigid and sturdy tripod will at least help in this regard. You will need considerable practice to get used to the feel of your particular fluid head, get the tension settings just right, etc., in order to effect the smoothest camera movements. Remember this is a very small sensor, and the best results will be obtained with slow and even camera pans/tilts.

For certain situations, miniature tripods or dollies can be very useful, but they don’t take the place of a normal tripod. I used a tiny tripod for one shot, and experimented with the Pico Dolly (sort of a miniature skateboard that holds a small camera), although I did not actually use it for a finished shot. This is where the small size and light weight of the iPhone can be a plus: you can hang it and place it in locations that would be difficult or impossible with a normal camera. Like anything else though, don’t get too creative and gimmicky: the job of the camera is to record the story, not call attention to itself or its technology. If a trick or a gadget helps you visually tell the story, it’s useful. Otherwise stick with the basics.

Another useful trick I discovered that helped stabilize my hand-held shots:  my tripod (as many do) has a removable center post on which the fluid head is mounted (that in turn holds the camera). By removing the entire camera/fluid-head/center-post assembly I was able to hold the camera with far greater accuracy and stability. The added weight of the central post and fluid head, while not much – maybe 500 grams – certainly added stability to those shots.

Tripod showing center shaft extended before removal.

Center shaft removed for “hand-held” use

If you are planning on any camera moves while on the tripod (pans or tilts), it is imperative that the tripod be leveled first – and rechecked every time you move it or dismount the phone. Nothing is worse than watching a camera pan move uphill as it traverses from left to right… A small circular spirit level is the perfect accessory. While I have seen very small circular levels actually attached to tripod heads, I find them too small for real accuracy. I prefer a small removable device that I can place on top of the phone itself, which then accounts for all the hardware, up to and including the shell, that can affect alignment. The one I use is 25mm (1″) in diameter.

I touched on the external audio input adaptor earlier while discussing power for the iPhones; I’ll detail that now. For any serious video photography you must use external microphones: the one in the phone itself, although amazingly sensitive, has many drawbacks. It is single channel, where the iPhone hardware (and several of the better video camera apps) is capable of recording stereo; you can’t focus the sensitivity of the microphone; and most importantly, the mike is on the front of the phone at the bottom – pointing away from where your lens is aimed!

While it is possible to plug a microphone into the combination headphone/microphone connector on the top of the phone, there are a number of drawbacks. The first is that it’s still a mono input – only 1 channel of sound. The next is that the audio quality is not that great. This input was designed for telephone conversation headset use, so extended frequency response, low noise and reduced harmonic distortion were not part of the design parameters. Far better audio quality is available on the digital docking connector on the bottom of the phone. That said, there are very few devices actually on the market today (that I have been able to locate) that will function in the environment of video cinemaphotography, particularly if one is using the iPro shell and tripod-mounting the iPhone. Many of the devices treat the iPhone as just an audio device (the phone actually snaps into several of the units, making it impossible to use as a camera); with others the mechanical design is not compatible with either the iPro case or tripod mounting. Others offer only a single channel input (these are mostly designed for guitar input so budding Hendrix types can strum into GarageBand). The only unit I was able to find that met all of my requirements (stereo line input, high audio quality, mechanical compatibility with the tripod and the iPro case) was the “Mobile In” manufactured by Line6. Even this device is primarily a guitar input unit, but it does have a stereo line-in connector that works very well. In order to use the hardware, you must download and install their free app (and it’s on the fat side, about 55MB), which contains a huge amount of guitar effects. Totally useless for the line input – but the hardware won’t work without it. So just install it and forget about it. You never need to open the MobilePOD app in order to use the line input connector. As discussed above in the section on power, the only major drawback is that once this device is plugged in you can’t run your phone off external power. I really need to find that “Y” adaptor for the docking connector…

“Mobile In” audio input adapter attached.

Now you may ask, why do I need a line input connector when I’m using microphones?? My aim here is to produce the highest quality content possible, while still using the iPhone as the camera/recorder. For the reasons already discussed above, the use of external microphones is required. Typically a number of mikes will be placed, fed into a mixer, and then a line level feed (usually stereo) will be sent to the sound recorder. In all ‘normal’ (aka not using cellphones as cameras!!) video shoots, the sound is almost always recorded on a separate device, synchronized in some fashion to each of the cameras so the entire shoot is in sync. In this particular shoot, the two actors on the set were individually miked with lavalier microphones (there is a whole hysterical story on that subject, but it will have to wait until after that episode airs…) and a third, directional boom mike was used for ambient sound. The three mikes were fed into a small portable mixer/sound recorder. The stereo output (usually used for headphone monitoring – a line level output) was fed (through a “Y” cable) to both the monitoring headphones and the input of the Mobile In device. Essentially, I just ‘piggybacked’ on top of the existing audio feed for the shoot.

This didn’t violate my POC, as one would need this same equipment – or something like it – on any professional shoot. At a minimum, one could just use a small mixer; obviously, if the iPhone is recording the sound, an external recorder is not required. I won’t attempt to further discuss all the issues in recording high quality sound – that would take a full post (if not a book!) – but there is a massive amount of literature out there on the web if one looks. Good sound recording is an art – if possible, avail yourself of someone who knows this skill to assist you on your shoot; it will be invaluable. I’ll just mention a few pointers to complete this part of the discussion:

  • Record the most dynamic range possible without distortion (big range between soft and loud sounds). This will markedly improve the presence of your audio tracks.
  • Keep all background noise to an absolute minimum. Turn off all cellphones! (Put the iPhones that are ‘cameras’ in “airplane mode” so they won’t be disturbed by phone calls, texts or e-mails.) Turn off fans, air conditioners, refrigerators (if you are near a kitchen), etc. Take a few moments after calling ‘quiet on the set’ to sit still and really listen to your headphones to ensure you don’t hear any noise.
  • As much as possible, keep the loudness levels consistent from take to take – it will help keep your editor (or yourself…) from taking out the long knives after way too many hours trying to normalize levels between takes…
  • If you use lavalier mikes (those tiny microphones that clip onto clothing – they are available in ‘wired’ or ‘wireless’ versions) you need to listen carefully during rehearsals and actual takes for clothing rustle. That can be very distracting – you may have to stop and reposition the mike so that the housing is not touching any clothing. These mikes come with little clips that mount onto the cable just below the actual microphone body, thereby keeping clothing movement (rustle) from being transmitted to the sensor through the body of the microphone. Take care in mounting, test with your actor as they move – and remind them that clasping their hands to their chest in excitement (and thumping the mike) will make your sound person deaf – and ruin the audio for that shot!

Actors’ view of the camera setup for a shot (2 iPhones, 3 DSLRs)

Storage – and the process of dumping (transferring video files from the iPhones to external storage) – involves hardware, software and procedure. The hardware I used will be discussed here; the software and procedure are covered in the next section. Since the HD video files consume about 2.5GB for every 10 minutes of filming, even the largest capacity iPhone (64GB) will run out of space in short order. As mentioned earlier, I used the 32GB models on this shoot, with about 20GB of free space on each phone. That meant that, at a maximum, I had a little over an hour’s storage on each phone. During the two days of shooting, we shot just under 5 hours of actual footage – which amounted to a total of 116GB from the two iPhones. (Not every shot was shadowed by the iPhones: some of the close-ups and inserts could not be covered, as the iPhones would have been visible in the shots composed by the DSLR cameras.)
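The arithmetic is worth running before your own shoot. A back-of-envelope sketch using this shoot’s numbers (your bitrates and free space will differ, and the planned-footage figure is just an example):

```python
# Back-of-envelope storage planning, using this shoot's numbers:
# ~2.5GB per 10 minutes of 1080p footage, ~20GB free per 32GB iPhone.
gb_per_min = 2.5 / 10                      # ~0.25 GB per minute

free_gb = 20.0
print(f"Capacity per phone: ~{free_gb / gb_per_min:.0f} min")          # ~80 min

planned_min = 464                          # minutes of footage kept (example)
print(f"External storage needed: ~{planned_min * gb_per_min:.0f} GB")  # ~116 GB
```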

The challenge of this project was to not involve anything other than the iPhone/iPad for all aspects of the production. The dumping of footage from the iPhones to external storage is one area where neither Apple nor any 3rd-party developer I have found offers a purely iOS solution. With the lack of removable storage, there are only two ways to move files off the iPhone: Wi-Fi or the USB cable attached to the docking connector. Wi-Fi is not a practical solution in this environment: the main reason is it’s too slow. You can find as many ‘facts’ on iPhone Wi-Fi speed as there are types of orchids in the Amazon, but my research (verified by personal tests) shows that, in real-world use, 8Mb/s is a top-end average for upload (which is what you need to transmit files FROM the phone to an external storage device). That’s only about 800KB/s – so it would take nearly an hour to upload one 2.5GB movie file, which is only 10 minutes of shooting! Not to mention the issues of Wi-Fi interference, dropped connections, etc. etc.
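To see why Wi-Fi fails, run the numbers yourself (a sketch – the 8Mb/s figure is my own measured top end, and I allow roughly 10 bits per byte for protocol overhead):

```python
# Transfer time for one clip over Wi-Fi at a realistic upload rate.
file_gb = 2.5                      # one 10-minute clip
upload_mbps = 8.0                  # measured real-world top end (megabits/s)
mbytes_per_s = upload_mbps / 10    # ~10 bits/byte once overhead is counted
minutes = file_gb * 1000 / mbytes_per_s / 60
print(f"{file_gb}GB clip over Wi-Fi: ~{minutes:.0f} minutes")   # ~52 minutes
```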

That brings us to cabled connections. Currently, the only way to move data off (or onto, for that matter) an iPhone is to use a computer. While the Apple Time Capsule could in theory function as a direct-to-phone data storage device, it only connects via Wi-Fi. However, the method I chose only uses the computer as a ‘connection link’ to an external hard drive, so in my view it does not break my premise of an “all iOS” project. When I get to the editing stage, I just reverse the process and pull files from the external drive through the computer back to the phone (in this case using iTunes).

I will discuss the precise technique and software used below, but suffice it to say here that I used a PC as the computer – mainly just because that is the laptop I have. It also proves, however, that there is no “Mac vs PC” issue as far as the computer goes. I feel this is an important point, as in many countries outside the USA and Western Europe the price premium on Apple computers is such that they are very scarce. For this project, I wanted to make sure the required elements were as widely available as possible.

The choice of external storage is important for speed and reliability’s sake. Since the USB connection from the phone to the computer is limited to v2.0 (480Mb/s theoretical), one may assume that just any USB2.0 external drive would be sufficient. That’s not actually the case, as we shall see… While the 480Mb/s link speed of USB2.0 works out to 60MB/s on paper, that is never matched in reality. USB chipsets in the internal hub in the computer, processing power in the phone and the computer, other processes running on the computer during transfer, bus and CPU speed in the computer, and the actual disk controller and disk speed of the external storage – all these factors significantly reduce transfer speed.

Probably the most important is the actual speed of the external disk. Most common portable USB2.0 disks (the small 2.5″ format) run at 5400RPM, with disk controller chipsets to match, yielding actual performance in the 5-10MB/s range. This is too slow for our purposes. The best solution is to use an external RAID array of two ‘striped’ disks [RAID 0] using high-performance 7200RPM SATA disks with an appropriately designed disk controller. The G-RAID Mini is a good example. If you are using a PC, get the best performance with an eSATA connection to the drive (my laptop has a built-in eSATA connector, but PC Card adaptors are available that easily add this connectivity to computers that don’t have it built in). This offers the highest performance (real-world tests show average write speeds of 115MB/s with this device). If you are using an Apple computer, opt for the FW800 connection (I’m not aware of eSATA on any Mac computer). While this limits performance to around 70MB/s maximum, that’s still much faster than the USB2.0 interface from the phone, so it’s not an issue. My testing has shown that a significant amount of speed ‘headroom’ on the external drive is desirable – you just don’t want the drive slowing anything down.
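Rather than trust the label on the box, measure your candidate dump drive. Here is a quick-and-dirty sequential write benchmark in Python – the mount point is hypothetical, and buffered writes plus a final fsync only give a ballpark figure, not a lab measurement:

```python
# Quick-and-dirty sequential write benchmark for a candidate dump drive.
# TARGET is a hypothetical mount point - point it at the drive under test.
import os
import time

TARGET = "/Volumes/G-RAID/benchmark.tmp"
CHUNK = b"\0" * (8 * 1024 * 1024)      # 8MB per write
TOTAL_MB = 1024                        # write 1GB in total

start = time.time()
with open(TARGET, "wb") as f:
    for _ in range(TOTAL_MB // 8):
        f.write(CHUNK)
    f.flush()
    os.fsync(f.fileno())               # make sure data actually hit the disk
elapsed = time.time() - start
os.remove(TARGET)
print(f"Sequential write: ~{TOTAL_MB / elapsed:.0f} MB/s")
```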

There are other viable alternatives for external drives, particularly if one needs a drive that does not require an external power supply (which the G-RAID does, due to its performance). Keep in mind that while it’s possible to run a laptop and external drive entirely off battery power, you really won’t want to – for one, unless you are on a remote outdoor location shoot, you will have AC power – and disk writing at continuous high throughput is a battery killer! That said, a good alternative (for PC) is one of the Seagate GoFlex USB3.0 drives. I use a 1.5TB model that houses a high-performance 7200RPM drive and supports up to 50MB/s write speeds. For the Mac, Seagate has a Thunderbolt model. Although the Thunderbolt interface is twice as fast as USB3.0 (10Gb/s vs 5Gb/s), it makes no difference in transfer speed (these single-drive storage devices can’t approach the transfer speeds of either interface). However, there is a very good reason to go with USB3.0/eSATA/Thunderbolt instead of USB2.0 – overall performance. With the newer high-speed interfaces, the full system (hard disk controller, interface chipset, etc.) is designed for high-speed data transfer, and I have proved to myself that it DOES make a difference. It’s very hard to find a USB2.0 system that matches the performance of a USB3.0/etc. system – even on a 2.5″ single-drive subsystem.

The last thing to cover here under storage is backup. Your video footage is irreplaceable. Procedure will be covered below, but under hardware: provide a second external drive on the set. It’s simply imperative that you immediately back up the footage onto a second physical drive as soon as practical – NOT at the end of the day! If you have a powerful enough computer, with the correct connectivity, etc., you can actually copy the iPhone files to two drives simultaneously (the best solution); otherwise plan on copying the files from one external drive to the backup while the next scenes are being shot (as a background task).
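If your computer can keep up, the “two drives simultaneously” approach can be as simple as reading each clip once and writing it twice. A minimal sketch – all paths are hypothetical mount points, and this assumes the clip is reachable as a plain file path:

```python
# "Read once, write twice": copy a clip to the primary and backup drives
# in one pass, so the slow read from the phone happens only once.
# All paths are hypothetical - adjust for your own mount points.
import os

def copy_to_both(src, primary_dir, backup_dir, chunk=16 * 1024 * 1024):
    name = os.path.basename(src)
    with open(src, "rb") as s, \
         open(os.path.join(primary_dir, name), "wb") as p, \
         open(os.path.join(backup_dir, name), "wb") as b:
        while True:
            data = s.read(chunk)       # one read from the (slow) source...
            if not data:
                break
            p.write(data)              # ...two writes, one per drive
            b.write(data)

copy_to_both("IMG_2334.mov", "/Volumes/G-RAID", "/Volumes/Backup")
```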

I’ll close with a final suggestion: while this description of hardware and process is not meant in any way to be a tutorial on cinematography, audio, etc. etc. – here is a small list (again, this is under ‘hardware’ as it concerns ‘stuff’) of useful items that will make your life easier “on the set”:

  • Proper transport cases, bags, etc. to store and carry all these bits. Organization, labeling, color-coding, etc. all helps a lot when on a set with lots of activity and other equipment.
  • Spare cables for everything! Murphy will see to it that the one item for which you have no duplicate will get bent during the shoot…
  • Plenty of power strips and extension cords.
  • Gorilla tape or camera tape (this is NOT ‘duct tape’). Find a gaffer and he/she will explain it to you…
  • Small folding table or platform (for your PC/Mac and drives) – putting high-value equipment on the floor is asking for BigFoot to visit…
  • Small folding stool (appropriate for the table above), or an ‘apple box’ – crouching in front of a computer while manipulating high-value content files is distracting, not to mention tiring.
  • If you are shooting outside, more issues come into play. Dust is the big one. Cans of compressed air, lens tissue, camel-hair brushes, zip-lock baggies, etc. etc. – none of the items discussed in this entire post appreciate dust or dirt…
  • Cooling. Mentioned earlier, but you’ll need to keep the phone and computer as cool as practical (unless of course you are shooting in Scotland in February, in which case the opposite will be true: trying to figure out how to keep things warm and dry in the middle of a wet and freezing moor will become paramount).
  • Special mention for ocean-front shoots: corrosion is a deadly enemy of iPhones and other such equipment. Wipe down ALL equipment (with appropriate cloths and solutions) every night after the shoot. Even the salt air makes deposits on every exposed metal surface – and, later on, a very hard to remove scale will become apparent.
  • A final note for sunny outdoor shoots: seeing the iPhone screen is almost impossible in bright sunlight, and unlike DSLRs the iPhone does not have an optical viewfinder. Some sort of ‘sunshade’ will be required. While researching this online, I came across this little video that shows one possible solution. Obviously it would have to be modified to accommodate the audio adaptor, iPro lenses, etc. shown in my project, but it will hopefully give you some ideas. (Thanks to triplelucky for this video.)

Software:

As amazing as the hardware capabilities of the above system are (iPhone, supplemental lenses, audio adaptors, etc.), none of this would be possible without the sophisticated software that is now available for this platform at such low cost. The list of software I am currently using to produce this video is purely my own choosing – there may be other equally viable solutions for each step or process. I feel what is important is the possibility of the process, not the precise piece of kit used to accomplish the task. Obviously, as I am using the iOS platform, all the apps are “Apple iPhone/iPad compliant”. The reader who chooses an alternative platform will need to do a bit of research to find similar functionality.

As a parallel project, I am currently describing my experiences with the iPhone camera in general, as well as many of the software packages (apps) that support the iPhone still and video camera. Those posts are elsewhere on this blog. For that reason, I will not describe the apps in any detail here. If software that is discussed or listed here is not yet in my stable of posts, please be patient – I promise that each app used in this project will be discussed on this blog at some point. I will refer the reader to this post, where an initial list of the apps to be discussed is located.

Here is a short list of the apps I am currently using. I may add to this list before I complete this project! If so, I will update this and other posts appropriately.

Storyboard Composer Excellent app for building storyboards from shot or library photos, adding actors, camera motion, script, etc. Powerful.

Movie*Slate A very good slate app.

Splice Unbelievable – a full video editor for the iPhone/iPad. Yes, you can: drop movies and stills on a timeline, add multiple sound tracks and mix them, work in full HD, apply loads of video and audio efx, add transitions, burn in titles, resize, crop, etc. etc. Now that doesn’t mean that I would choose to edit my next feature on a phone…

Avid Studio  The renowned capability of Avid now stuffed into the iPad. Video, audio, transitions, etc. etc. Similar in capability to Splice (above) – I’ll have a lot more to say after these two apps get a serious test drive while editing all the footage I have shot.

iTC Calc The ultimate time code app for iDevices. I use it on both iPad and iPhone.

FilmiC Pro Serious movie camera app for iPhone. Select shooting mode, resolution, 26 frame rates, in-camera slating, colorbars, multiple bitrates for each resolution, etc. etc.

Camera+ I use this as much for editing stills as for shooting; its biggest advantage over the native iPhone camera app is that you can set different parts of the frame for exposure and focus.

almost DSLR The closest thing to fully manual control of the iPhone camera you can get. Takes some training, but is very powerful once you get the hang of it.

PhotoForge2 Powerful editing app. Basically Photoshop on the iPhone.

TrueDoF This one calculates true depth-of-field for a given lens, sensor size, etc. I use this to plan my range of focus once I know my shooting distance.

OptimumCS-Pro This is sort of the inverse of the above app – here you enter the depth of field you want, then OCSP tells you the shooting distance and aperture you need for that. (The thin-lens math behind these two calculators is sketched just after this list.)

Juxtaposer This app lets you layer two different photos onto each other, with very controllable blending.

Phonto One of the best apps for adding titles and text to shots.

Some of the above apps are designed for still photography only, but since stills can be laid down in the video timeline, they will likely come into use during transitions, effects, title sequences, etc.
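For the curious, the math behind DoF calculators like TrueDoF and OptimumCS-Pro is the standard thin-lens approximation. A sketch – the iPhone 4S values (actual focal length ~4.3mm, f/2.4) and the ~0.004mm circle of confusion are my assumptions for the phone’s small sensor, so verify them against your own references:

```python
# Thin-lens depth-of-field math, the basis of calculators like TrueDoF.
# All distances in mm. The iPhone 4S values below are assumptions
# (actual focal length ~4.3mm, f/2.4, circle of confusion ~0.004mm).
def dof(f, N, s, c=0.004):
    H = f * f / (N * c) + f                        # hyperfocal distance
    near = s * (H - f) / (H + s - 2 * f)
    far = s * (H - f) / (H - s) if s < H else float("inf")
    return near, far

near, far = dof(f=4.3, N=2.4, s=2000)              # subject at 2m
print(f"In focus from {near / 1000:.2f}m to {far / 1000:.2f}m")
```

With those numbers, a subject at 2m is in focus from roughly 1m to infinity – exactly the enormous depth of field the rule of thumb predicts for such a short focal length.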

I used FilmiC Pro as the only video camera app for this project. This was based firstly on personal preference and on the capabilities it provided (the ability to lock focus, exposure and white balance was, in my opinion, critical to maintaining continuity across takes). Once I had selected a video camera app with which I was comfortable, I felt it important to use it on both iPhones – again for continuity of the content. There may be other equally capable apps for this purpose. My focus was on producing as high a quality product as possible within the means and capabilities at my disposal. The particular tools are less important than the totality of the process.

The process of dumping footage off the iPhone (transferring video files to external storage) requires some additional discussion. The required hardware has been mentioned above, now let’s dive into process and the required software. The biggest challenge is logistics: finding enough time in between takes to transfer footage. If the iPhones are the only cameras used, then in one way this is easier – you have control over the timeline in that regard. In my case, this was even more challenging, as I was ‘piggybacking’ on an existing shoot so I had to fit in with the timeline and process in place. Since professional video cameras all use removable storage, they only require a few minutes to effectively be ready to shoot again after the on-camera storage is full. But even if iPhones are the only cameras, taking long ‘time-outs’ to dump footage will hinder your production.

There are several ways to maximize the transfer speed of files off the iPhone, but the best approach is time management: try to schedule dumping for normal ‘down time’ on the set (breaks, scene changes, wardrobe changes, meal breaks, etc.). In order to do this you need to have your ‘transfer station’ [computer and external drive] ready and powered up so you can take advantage of even a short break to clear files from the phone. I typically transferred only one to three files at a time, so that if we started up sooner than expected I was not stuck in the middle of a long transfer. The other advantage in my situation was that the iPhone charges while connected via USB cable, so I was able to accomplish two things at once: replenish the battery (shooting with the Mobile In audio adaptor does not allow running on line power) and dump the files to external storage.

My 2nd camerawoman, Tara, brought her MacBook Air for file transfer to an external USB drive; I used a Dell PC laptop (discussed above in the hardware section). In both cases, I found that using the native OS file management (Image Capture [part of the OS] on the Mac, Windows Explorer on the PC) was hideously slow. It does work (after plugging the iPhone into the USB connector on the computer, the iPhone shows up as just another external disk; you can navigate down through a few folders and find your video files). On my PC (which BTW is a very fast machine – basically a 4-core mobile workstation that can routinely transfer files to/from external drives at over 150MB/s), the best transfer speed I could obtain with Windows Explorer amounted to needing almost an hour to transfer 10 minutes of video off the iPhone – a complete non-starter in this case. After some research, I located software from Wide Angle Software called TouchCopy that solved my problem. They make versions for both Mac and PC, and it allowed transfers off the iPhone to external storage about 6x faster than Windows Explorer. My average transfer times were approximately ‘real time’ – i.e. 10 minutes of footage took about 10 minutes to transfer. There may be other similar applications out there – as mentioned earlier, I am not in the software reviewing business – once I find something that works for me I will use it, until I find something “better/faster/cheaper.”

To summarize the challenging file transfer issue:

  • Use the fastest hardware connections and drives that you can.
  • Use time management skills and basic logistics to optimize your ‘windows’ for file transfer.
  • Use supplemental software to maximize your transfer speed from phone to external storage.
  • Transfer in small chunks so you don’t hold up production.

The last bit that requires a mention is file backup. Your original footage is impossible to replace, so you need to take exquisite care with it. The first thing to do is back it up to a second external physical drive immediately after the file transfer. Typically I started this task as soon as I was done dumping files off the iPhone – it could run unsupervised during the next takes. However, one thing to consider before doing that (and this may depend on how much time you have during breaks): the relabeling of the video files. The footage is stored on your iPhone as a generically labeled .mov file, usually something like IMG_2334.mov – not a terribly insightful description of your scene/take. I never change the original label, only add to it. There is a reason… it helps keep all the files in sequential order when starting the scene selection and editorial process later. This can be very helpful when things go a bit askew – as they always do during a shoot. For instance, if the slate is missing on a clip (you DO slate every take, correct??), having the original ‘shot order’ can really help place the orphan take into its correct sequence. In my case, this happened several times due to slate placement: since my iPhone cameras were in different locations, sometimes the slate was pointed where it was in frame for the DSLR cameras but was not visible to the iPhones.

I developed a short-hand description, taken from the slate at the head of each shot, that I appended to the original file name. This takes a few seconds per clip (launch QuickTime or VLC, shuttle in to the slate, pause and note the slate info), but the sooner you do this, the better. If you have time to rename the shots before the backup, then you don’t have to rename twice – or face the possibility of human error during that task. Here is a sample of one of my files after renaming: IMG_2334_Roll-A1_EP1-1_T-3.mov – short for Roll A1, Episode 1, Scene 1, Take 3.
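If you end up renaming dozens of clips per day, even this can be scripted. A minimal sketch that appends the slate info while preserving the original name (the slate values are still read by eye and typed in by hand – this just removes the drudgery):

```python
# Append slate info to a clip without losing the original file name,
# so the files stay in shot order for editorial.
import os

def rename_with_slate(path, roll, ep_scene, take):
    base, ext = os.path.splitext(path)
    new = f"{base}_Roll-{roll}_EP{ep_scene}_T-{take}{ext}"
    os.rename(path, new)
    return new

# IMG_2334.mov -> IMG_2334_Roll-A1_EP1-1_T-3.mov
print(rename_with_slate("IMG_2334.mov", "A1", "1-1", "3"))
```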

However you go about this, just ensure that you back up the original files quickly. The last step of course is to delete the original video files off the iPhone so you have room for more footage. To double-check this process (you NEVER want to realize you just deleted footage that was not successfully transferred!!!) I do three things:

  1. Play into the file with headphones on to ensure that I have video and audio at head, middle and end of each clip. That only takes a few seconds, but just do it.
  2. Using Finder or Explorer, get the file size directly off the still-connected iPhone and compare it to the copied file on your external drive (look at actual file size, not ‘size on disk’, as your external disk may have different sector sizes than the iPhone). If they are different, re-transfer the file. (A scripted version of this check is sketched below.)
  3. Using the ‘scrub bar’, quickly traverse the entire file using your player of choice (Quicktime, VLC, etc.) and make sure you have picture from end to end in the clip.

Then and only then, double-check exactly what you are about to delete, offer a small prayer to your production spirit of choice, and delete the file(s).
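Step 2 of the checklist above is easy to script. A minimal sketch that compares actual byte counts (not ‘size on disk’) between the still-connected phone and both copies – the paths are hypothetical, and it assumes the phone’s storage is reachable as a plain file path, the way it appears in Explorer/Finder on my setup:

```python
# Pre-delete safety check: compare actual byte counts between the
# still-connected phone and both copies. Paths are hypothetical.
import os

def sizes_match(phone_file, *copies):
    want = os.path.getsize(phone_file)
    for copy in copies:
        got = os.path.getsize(copy)
        if got != want:
            print(f"MISMATCH: {copy} is {got} bytes, expected {want}")
            return False
    return True

if sizes_match("E:/DCIM/100APPLE/IMG_2334.mov",
               "F:/shoot/IMG_2334_Roll-A1_EP1-1_T-3.mov",
               "G:/backup/IMG_2334_Roll-A1_EP1-1_T-3.mov"):
    print("Byte counts match - safe to move on to the scrub check.")
```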

Summary:

This is only the beginning! I will write more as this project moves ahead, but I wanted to introduce the concept to my audience. A deep thanks to all of you who have read my past posts on various subjects – please return for more of this journey. Your comments and appreciation provide the fuel for this blog.

Support and Contact Details:

Please visit and support the talented women who have enabled me to produce this experiment. This would not have been possible otherwise.

Tiffany Price, Writer, Producer, Actress
Lauren DeLong, Writer, Producer, Actress
Ambika Leigh, Director, Producer
