
iPhone Cinemaphotography – A Proof of Concept (Part 1)

August 3, 2012 · by parasam

I’m introducing a concept that I hope some of my readers may find interesting: the production of an HD video built entirely with the iPhone (and/or iPad). Everything from storyboard to photography, editing, sound, titles and credits, graphics and special effects – and final distribution – can now be performed on a “cellphone.” I’ll show you how. Most of the attention paid to the new crop of highly capable ‘cellphone cameras,’ such as those in the iPhone and certain Android phones, has gone to still photography. While motion photography (video) is certainly well known, it has not received the same attention and detail – nor the same number of apps – as its single-image sibling.

While I am using a single platform with which I am familiar (iOS on the iPhone/iPad), I believe this concept could be carried out on the Android class of devices as well. I have not researched that possibility (nor do I intend to) – I’ll leave that for others who are more familiar with that platform. The purpose is to show that such a feat CAN be done – and hopefully done reasonably well. It’s only been a few years since the production of HD video was strictly the realm of serious professionals, with budgets of hundreds of thousands of dollars or more. There are of course many compromises – I don’t for a minute pretend that the range of possible shots or the quality will come anywhere near what a high-quality DSLR, RED, Arri or other professional video camera can produce – but I do know that a full HD (1080p) video can now be produced entirely on a low-cost mobile platform.

This POC (Proof of Concept) is intended as more than just a lark or a geeky way to eat some spare time: the real purpose is to bring awareness that the previously high cost barrier to cinemaphotography/editing/distribution has been virtually eliminated. This paves the way for creative individuals almost anywhere in the world to express themselves in a way that was heretofore impossible. Outside of America and Western Europe, both budgets and skilled operators/engineers are in far shorter supply. But there are just as many people with a good story to tell in South Africa, Nigeria, Uruguay, Aruba, Nepal, Palestine, Montenegro and many other places as there are in France, Canada or the USA. The internet has now connected all of us – information is being democratized in a huge way. Of course there are still the ‘firewalls’ of North Korea, China and a few others – but the human thirst for knowledge, not to mention the unbelievable cleverness and endurance of 13-year-old boys and girls in finding ‘holes in the wall’, shows us that these last bastions of stolidity are doomed to fall in short order.

With Apple and other manufacturers doing their best to leave nary a potential customer anywhere in the world ‘out in the cold’, these devices are now both available and affordable almost everywhere. With apps typically costing a few dollars (it’s almost insane – the Avid editor for iOS is $5; the Avid Media Composer software for PC/Mac is $2,500), an entire production/post-production platform can be assembled for under $1,000. This exercise is about what’s possible, not about what is easiest or most capable. Yes, there are many limitations. Yes, some things will take a lot longer. But what you CAN do is nothing short of amazing. That’s the story I’m going to share with you.

A note to my readers:  None of the hardware or software used in this exercise was provided by any vendor. I have no commercial relationship with any vendor, manufacturer or distributor. Choices I have made or examples I use in this post are based purely on my own preference. I am not a professional reviewer, and have made no attempt to exhaustively research every possible solution for the hardware or software that I felt was required to produce this video. All of the hardware and software used in this exercise is currently commercially available – any reasonably competent user should be able to reproduce this process.

Before I get into detail on hardware or software, I need to remind you that the most important part of any video is the story. Just having a low-cost, relatively high quality platform on which to tell your ‘story’ won’t help if you don’t have something compelling to say – and the people/places/things in front of the lens to say it. We have all seen that vast amounts of money and technical talent mean nothing in the face of a lousy script or poor production values – just look over some of the (unfortunately many) Hollywood bombs… I’m the first one to admit that motion picture storytelling is not my strong point. I’m an engineer by training and my personal passion is still photography – telling a story with a single image. So… in order to bring this idea to fruition, I needed help. After some thought, I decided that ‘piggybacking’ on an existing production was the most feasible way to realize this idea: basically adding a few iPhone cameras to a shoot where I could take advantage of an existing set, actors, lighting, direction, etc. For me, this was the only practical way to make this happen in a relatively short time frame.

I was lucky enough to know a very talented director, Ambika Leigh, who was receptive and supportive of my idea. After we discussed my general idea of ‘piggybacking’ she kindly identified a potential shoot. After initial discussions with the producers, the green light for the project was given. The details of the process will come in future posts, but what I can say now (the project is an upcoming series that is not released yet – so be patient! It will be worth the wait) is that without the support and willingness of these three incredible women (Ambika Leigh, director; Tiffany Price & Lauren DeLong, producers/actors/writers) this project would not have moved forward with the speed, professionalism and just plain fun that it has. At a very high level, the series brings us into the clever and humorous world of the “Craft Ladies” – a couple of friends that, well, like to craft – and drink wine.

“Craft Ladies is the story of Karen and Jane, best friends forever, who love to craft… they just aren’t any good at it. Over the years Karen and Jane’s lives have taken slightly different paths but their love of crafting (and wine) remains strong. Tune in in September to watch these ladies fulfill their dream… a craft show to call their own. You won’t find Martha Stewart here, this is crafting Craft Ladies style. Craft Up Nice Things!”

Please check out their links for further updates and details on the ‘real thing’:

www.facebook.com/CraftUpNiceThings
www.twitter.com/#!/2craftladies
www.CraftUpNiceThings.com

I am solely responsible for the iPhone portion of this program – so all errors, technical gaffes, editorial bloops and other stumbles are mine. As I said, this is a proof of concept – not the next Spielberg epic… My intention is to follow – as closely as my expertise and the available iOS technology will allow – the editorial decisions, effects, titles, etc. that end up in the ‘real show’. To this end I will necessarily lag a bit in my production, as I have to review the assembled and edited footage first. However, I will make every effort to have my iPhone version of this series ready for distribution shortly after the real version launches. Currently this is planned for some time in September.

For the iPhone shoot, two iPhone4S devices were used. I need to thank my capable 2nd camerawoman – Tara Lacarna – for her endurance, professionalism and support over two very long days of shooting! In addition to her new career as an iPhonographer (ha!) she is a highly capable engineer, musician and creative spirit. While more detail will be provided later in this post, I would also like to thank Niki Mustain of Schneider Optics for her time (and the efforts of others at this company) in helping me get the best possible performance from the “iPro” supplementary lenses that I used on portions of the shoot.

Before getting down to the technical details of equipment and procedure, I’ll lay out the environment in which I shot the video. Of course, this can vary widely, and therefore the exact technique used, as well as some hardware, may have to change and adapt as required. In this case the entire shoot was indoors using two sets. Professional lighting was provided (3200°K) for the principal photography (which used various high-end DSLR cameras with cinema lenses). I had to work around the available camera positions for the two iPhone cameras, so my shots will not be the same as were used in principal photography. Most shots were locked off with both iPhones on tripods; there were some camera moves and a few handheld shots. The first set of episodes was filmed over two days (two very, very long days!!) and resulted in about 116GB of video material from the two iPhones. In addition to Ambika, Tiffany, Lauren and Tara there was a dedicated and professional crew of camera operators, gaffers, grips, etc. (with many functions often performed by just one person – this was after all about quality not quantity – not to mention the lack of a 7-figure Hollywood budget!). A full list of credits will be in a later post.

Aside from the technical challenges; the basic job of getting lines and emotion on camera; taking enough camera angles, close-ups, inserts and so on to ensure raw material for editorial continuity; and just plain endurance (San Fernando Valley, middle of summer, had to close all windows and turn off all fans and A/C for each shot due to noise, a pile of people on a small set, hot lights… you get the picture…) – the single most important ingredient was laughter. And there was lots of it!! At one time or another, we had to stop down for several minutes until one or the other of us stopped laughing so hard that we couldn’t hold a camera, say a line or direct the next sequence. That alone should prompt you to check this series out – these women are just plain hilarious.

Hardware:

As mentioned previously, two iPhone4S cameras were used, each one the 32GB model. Since shooting video generates large files, most user data was temporarily deleted off each phone (easy to restore later with a sync using iTunes). Approximately 20GB of free space was made available on each phone. If one were going to use an iPhone for a significant amount of video photography, the 64GB version would probably be worthwhile. The downside is that (unless you are shooting very short events) you will still have to download several times a day to an external storage device or computer – and the more you have to download, the longer that takes! As in any process, good advance planning is critical. On this shoot, I needed to coordinate ‘dumping times’ with the rest of the production: there was a tight schedule, and the production would not wait for me to finish dumping data off the phones. The DSLR cameras use removable memory cards, so it only takes a few minutes to swap cards and those cameras are ready to roll again. I’ll discuss the logistics of dumping files from the phones in more detail in the software section below. If one were going to attempt long takes with insufficient break time to fully dump the phone before needing to shoot again, the best solution would be to have two iPhones for each camera position, so that one phone could be transferring data while the other was filming.

In order to provide more visual control, as well as interest, a set of external adapter lenses (the “iPro” system by Schneider Optics) was used on various shots. A total of three different lenses are available: telephoto, wide-angle and a fisheye. A detailed post on these lenses – and adaptor lenses in general – is here. For now, you can visit their site for further detail. These lenses attach to a custom shell that is affixed to the iPhone. The lenses are easily interchanged with a bayonet mounting system. Another vital feature of the iPro shell for the phone is the provision for tripod mounting – a must for serious cinemaphotography – especially with the telephoto lens which magnifies camera movement. Each phone was fitted with one of the iPro shells to facilitate tripod mounting. This also made each phone available for attaching one of the lenses as required for the shot.

iPro “Fisheye” lens

iPro “Wide Angle” lens

iPro “Telephoto” lens

Another hardware requirement is power:  shooting video kills batteries faster than just about any other activity on the iPhone. You are using most of the highest power-consuming parts of the phone – all at the same time:  the camera sensor, the display, the processor, and high-bandwidth memory writing. A fully charged iPhone won’t even last two hours shooting video, so one must run the phone on external power, or plan the shoot for frequent (and lengthy!) recharge sessions. Bring plenty of extra cables, spare chargers, extension cords, etc. – it’s very cheap insurance to keep the phones running. Damage to cables while on a shoot is almost a guaranteed experience – don’t let that ruin your session.

A particular challenge that I had was a lack of a ‘feed through’ docking connector on the Line6 “Mobile In” audio adapter (more on this below). This meant that while I was using this high quality audio input adapter I was forced to run on battery, since I could not plug in the Mobile In device and the power cable at the same time to the docking connector on the bottom of the phone. I’m not aware of a “Y” adapter for iPhone docking connectors, but that would have really helped. It took a lot of juggling to keep that phone charged enough to keep shooting. On several shots, I had to forgo the high quality audio as I had insufficient power remaining and had to plug in to the charger.

As can be seen, the lack of both removable storage and a removable battery are significant challenges for using the iPhone in cinemaphotography. This can be managed, but it’s a critical point that requires careful attention. Another point to keep in mind is heat. Continual use of the phone as a video camera definitely heats up the phone. While neither phone ever overheated to the point where it became an issue, one should be aware of this fact. If one was shooting outside, it may be helpful to (if possible) shade the phone(s) from direct sunlight as much as practical. However, do not put the iPhones in the ice bucket to keep them cool…

Gitzo tripod with fluid head attached

Close-up of fluid head

Tripods are a must for any real video work:  camera judder and shake are very distracting to the viewer, and are impossible to remove (with any current iPhone app). Even with serious desktop horsepower (there is a rather good toolset in Adobe After Effects for helping to remove camera shake) it takes a lot of time, skill and computing power. Far better to avoid it in the first place whenever possible. Since ‘locked off’ shots are not as interesting, it’s worth getting fluid heads for your tripods so you can pan and tilt smoothly. A good high quality tripod is also well worth the investment:  flimsy ones will bend and shake. While the iPhone is very light – and this may tempt one to go with a very lightweight tripod – this will work against you if you want to make any camera tilts or pans. The very light weight of the phone actually causes problems here: it’s hard to smoothly move a camera that has almost no mass. A very rigid and sturdy tripod will at least help in this regard. You will need considerable practice to get used to the feel of your particular fluid head and get the tension settings just right in order to effect the smoothest camera movements. Remember this is a very small sensor, and the best results will be obtained with slow and even camera pans/tilts.

For certain situations, miniature tripods or dollies can be very useful, but they don’t take the place of a normal tripod. I used a tiny tripod for one shot, and experimented with the Pico Dolly (sort of a miniature skateboard that holds a small camera), although I did not actually use it for a finished shot. This is where the small size and light weight of the iPhone can be a plus: you can hang it and place it in locations that would be difficult or impossible with a normal camera. Like anything else though, don’t get too creative and gimmicky:  the job of the camera is to record the story, not call attention to itself or the technology. If a trick or a gadget can help you visually tell the story – then it’s useful. Otherwise stick with the basics.

Another useful trick I discovered that helped stabilize my hand-held shots:  my tripod (as many do) has a removable center post on which the fluid head is mounted (that in turn holds the camera). By removing the entire camera/fluid-head/center-post assembly I was able to hold the camera with far greater accuracy and stability. The added weight of the central post and fluid head, while not much – maybe 500 grams – certainly added stability to those shots.

Tripod showing center shaft extended before removal.

Center shaft removed for “hand-held” use

If you are planning any camera moves while on the tripod (pans or tilts), it is imperative that the tripod be leveled first – and rechecked every time you move it or dismount the phone. Nothing is worse than watching a camera pan move uphill as you traverse from left to right… A small circular spirit level is the perfect accessory. While I have seen very small circular levels actually attached to tripod heads, I find them too small for real accuracy. I prefer a small removable device that I can place on top of the phone itself, which then accounts for all the hardware (up to and including the shell) that can affect alignment. The one I use is 25mm (1″) in diameter.

I touched on the external audio input adapter earlier while discussing power for the iPhones; I’ll detail that now. For any serious video photography you must use external microphones: the one in the phone itself – although amazingly sensitive – has many drawbacks. It is single channel, where the iPhone hardware (and several of the better video camera apps) is capable of recording stereo; you can’t focus the sensitivity of the microphone; and most importantly, the mike is on the front of the phone at the bottom – pointing away from where your lens is aimed!

While it is possible to plug a microphone into the combination headphone/microphone connector on the top of the phone, there are a number of drawbacks. The first is that it’s still a mono input – only 1 channel of sound. The next is that the audio quality is not that great. This input was designed for telephone headset use, so extended frequency response, low noise and reduced harmonic distortion were not part of the design parameters. Far better audio quality is available on the digital docking connector on the bottom of the phone. That said, there are very few devices actually on the market today (that I have been able to locate) that will function in the environment of video cinemaphotography, particularly if one is using the iPro shell and tripod mounting the iPhone. Many of the devices treat the iPhone as just an audio device (the phone actually snaps into several of the units, making it impossible to use as a camera); with others the mechanical design is not compatible with either the iPro case or tripod mounting. Others offer only a single channel input (these are mostly designed for guitar input so budding Hendrix types can strum into GarageBand). The only unit I was able to find that met all of my requirements (stereo line input, high audio quality, mechanical design that did not interfere with the tripod or the iPro case) was the “Mobile In” manufactured by Line6. Even this device is primarily a guitar input unit, but it does have a stereo line-in connector that works very well. In order to use the hardware, you must download and install their free app (and it’s on the fat side, about 55MB), which contains a huge amount of guitar effects. Totally useless for the line input – but the hardware won’t work without it. So just install it and forget about it. You never need to open the MobilePOD app in order to use the line input connector. As discussed above in the section on power, the only major drawback is that once this device is plugged in you can’t run your phone off external power. I really need to find that “Y” adapter for the docking connector…

“Mobile In” audio input adapter attached.

Now you may ask, why do I need a line input connector when I’m using microphones? My aim here is to produce the highest quality content possible while still using the iPhone as the camera/recorder. For the reasons already discussed above, the use of external microphones is required. Typically a number of mikes will be placed, fed into a mixer, and then a line level feed (usually stereo) will be sent to the sound recorder. In all ‘normal’ (i.e. not using cellphones as cameras!!) video shoots, the sound is almost always recorded on a separate device, synchronized in some fashion to each of the cameras so the entire shoot is in sync. On this particular shoot, the two actors on the set were individually miked with lavalier microphones (there is a whole hysterical story on that subject, but it will have to wait until after that episode airs…) and a third, directional boom mike was used for ambient sound. The three mikes were fed into a small portable mixer/sound recorder. The stereo output (usually used for headphone monitoring – a line level output) was fed (through a “Y” cable) to both the monitoring headphones and the input of the Mobile In device. Essentially, I just ‘piggybacked’ on top of the existing audio feed for the shoot.

This didn’t violate my POC, as one would need this same equipment – or something like it – on any professional shoot. At a minimum, one could just use a small mixer; obviously, if the iPhone is recording the sound, an external recorder is not required. I won’t attempt to further discuss all the issues in recording high quality sound – that would take a full post (if not a book!) – but there is a massive amount of literature out there on the web if one looks. Good sound recording is an art – if possible, avail yourself of someone who knows this skill to assist you on your shoot – it will be invaluable. I’ll just mention a few pointers to complete this part of the discussion:

  • Record the most dynamic range possible without distortion (big range between soft and loud sounds). This will markedly improve the presence of your audio tracks.
  • Keep all background noise to an absolute minimum. Turn off all cellphones! (Put the iPhones that are ‘cameras’ in “airplane mode” so they won’t be disturbed by phone calls, texts or e-mails.) Turn off fans, air conditioners, refrigerators (if you are near a kitchen), etc. Take a few moments after calling ‘quiet on the set’ to sit still and really listen to your headphones to ensure you don’t hear any noise.
  • As much as possible, keep the loudness levels consistent from take to take – it will help keep your editor (or yourself…) from taking out the long knives after way too many hours trying to normalize levels between takes…
  • If you use lavalier mikes (those tiny microphones that clip onto clothing – they are available in ‘wired’ or ‘wireless’ versions) you need to listen carefully during rehearsals and actual takes for clothing rustle. That can be very distracting – you may have to stop and reposition the mike so that the housing is not touching any clothing. These mikes come with little clips that actually mount on to the cable just below the actual microphone body – thereby insulating clothing movement (rustle) from being transmitted to the sensor through the body of the microphone. Take care in mounting and test with your actor as they move – and remind them that clasping their hands to their chest in excitement (and thumping the mike) will make your sound person deaf – and ruin the audio for that shot!

Actors’ view of the camera setup for a shot (2 iPhones, 3 DSLRs)

Storage – and the process of dumping (transferring video files from the iPhones to external storage) – is a vital part of hardware, software and procedure. The hardware I used will be discussed here; the software and procedure are covered in the next section. Since the HD video files consume about 2.5GB for every 10 minutes of filming, even the largest capacity iPhone (64GB) will run out of space in short order. As mentioned earlier, I used the 32GB models on this shoot, with about 20GB free space on each phone. That meant that, at a maximum, I had a little over an hour’s storage on each phone. During the two days of shooting, we shot just under 5 hours of actual footage – a total of 116GB from the two iPhones. (Not every shot was shadowed by the iPhones: some of the close-ups and inserts could not be covered by the iPhones, as they would have been visible in the shot composed by the DSLR cameras.)
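To make the storage math concrete, here is a minimal sketch (in Swift, with the figures from this shoot treated as rough assumptions – your bitrate and overhead may differ) that estimates how many minutes of footage fit in a given amount of free space:

```swift
// Rough capacity planning for iPhone video shoots.
// Assumes ~2.5 GB per 10 minutes of 1080p footage, the rate observed on this shoot.
let gbPerMinute = 2.5 / 10.0          // ≈ 0.25 GB per minute of footage
let freeSpaceGB = 20.0                // free space left on the phone (GB)

let minutesOfFootage = freeSpaceGB / gbPerMinute
print("≈ \(Int(minutesOfFootage)) minutes of footage before the phone is full")
// With 20 GB free this prints ≈ 80 minutes – a little over an hour per phone,
// which matches the experience described above.
```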

The challenge of this project was to involve nothing other than the iPhone/iPad for all aspects of the production. The dumping of footage from the iPhones to external storage is one area where neither Apple nor any 3rd party developer that I have found offers a purely iOS solution. With the lack of removable storage, there are only two ways to move files off the iPhone: Wi-Fi or the USB cable attached to the docking connector. Wi-Fi is not a practical solution in this environment:  the main reason is that it’s too slow. You can find as many ‘facts’ on iPhone Wi-Fi speed as there are types of orchids in the Amazon, but my research (verified by personal tests) shows that, in a real-world and practical sense, 8Mb/s is a top-end average for upload (which is what you need to transmit files FROM the phone to an external storage device). That’s only about 1MB/s – so it would take over 40 minutes to upload a single 2.5GB movie file, which represents just 10 minutes of shooting! Not to mention the issues of Wi-Fi interference, dropped connections, etc.
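For comparison, a quick back-of-the-envelope sketch of the Wi-Fi bottleneck (Swift; the 8 Mb/s figure is the real-world upload average quoted above, treated here as an assumption):

```swift
// How long does it take to move one clip off the phone over Wi-Fi?
// 8 Mb/s sustained upload ≈ 1 MB/s; a cabled USB 2.0 transfer is many times faster in practice.
let clipSizeGB = 2.5                        // one 10-minute 1080p clip
let wifiMBps = 8.0 / 8.0                    // 8 megabits/s -> 1 megabyte/s
let clipSizeMB = clipSizeGB * 1024.0

let secondsOverWifi = clipSizeMB / wifiMBps
print("≈ \(Int(secondsOverWifi / 60)) minutes per clip over Wi-Fi")   // ≈ 42 minutes
// Four times longer than the clip itself – before counting dropped connections –
// which is why a cabled transfer is the only workable option on set.
```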

That brings us to cabled connections. Currently, the only way to move data off of (or onto, for that matter) an iPhone is to use a computer. While the Apple Time Capsule could in theory function as a direct-to-phone data storage device, it only connects via Wi-Fi. However, the method I chose only uses the computer as a ‘connection link’ to an external hard drive, so in my view it does not break my premise of an “all iOS” project. When I get to the editing stage, I just reverse the process and pull files back from the external drive through the computer to the phone (in this case using iTunes).

I will discuss the precise technique and software used below, but suffice it to say here that I used a PC as the computer – mainly because that is the laptop I have. It does prove, however, that there is no issue of “Mac vs PC” as far as the computer goes. I feel this is an important point, as in many countries outside the USA and Western Europe the price premium on Apple computers is such that they are very scarce. For this project, I wanted to make sure the required elements were as widely available as possible.

The choice of external storage is important for speed and reliability’s sake. Since the USB connection from the phone to the computer is limited to v2.0 (480Mb/s theoretical), one may assume that just any USB2.0 external drive would be sufficient. That’s not actually the case, as we shall see…  While the link speed of USB2.0 can supposedly provide a maximum of 60MB/s (480Mb/s), that is never matched in reality. USB chipsets in the internal hub in the computer, processing power in the phone and the computer, other processes running on the computer during transfer, bus and CPU speed in the computer, and the actual disk controller and disk speed of the external storage – all these factors serve to significantly affect transfer speed.

Probably the most important is the actual speed of the external disk. Most common portable USB2.0 disks (the small 2.5″ format) run at 5400RPM, and have disk controller chipsets that are commensurate, with actual performance in the 5-10MB/s range. This is too slow for our purposes. The best solution is to use an external RAID array of two ‘striped’ disks [RAID 0] using high performance 7200RPM SATA disks with an appropriately designed disk controller. The G-RAID Mini system is a good example. If you are using a PC, get the best performance with an eSATA connection to the drive (my laptop has a built-in eSATA connector, but PC Card adapters are available that easily support this connectivity for computers that don’t have it built in). This offers the highest performance (real-world tests show average write speeds of 115MB/s using this device). If you are using an Apple computer, opt for the FW800 connection (I’m not aware of eSATA on any Mac computer). While this limits the performance to around 70MB/s maximum, it’s still much faster than the USB2.0 interface from the phone, so it’s not an issue. I have found that having a significant amount of speed ‘headroom’ on the external drive is desirable: you just don’t need the drive to slow things down any further.

There are other viable alternatives for external drives, particularly if one needed a drive that did not require an external power supply (which the G-RAID does due to the performance). Keep in mind that while it’s possible to run a laptop and external drive all off battery power, you really won’t want to do this – for one, unless you are on a remote outdoor location shoot, you will have AC power – and disk writing at continuous high throughput is a battery killer! That said, a good alternative (for PC) is one of the Seagate GoFlex USB3.0 drives. I use a 1.5TB model that houses a high-performance 7200RPM drive and supports up to 50MB/s write speeds. For the Mac, Seagate has a Thunderbolt model. Although the Thunderbolt interface is twice as fast (10Gb/s vs 5Gb/s) as USB3.0 it makes no difference in transfer speed (these single drive storage devices can’t approach the transfer speeds of either interface). However, there is a very good reason to go with USB3.0/eSATA/Thunderbolt instead of USB2.0 – overall performance. With the newer high-speed interfaces, the full system (hard disk controller, interface chipset, etc.) is designed for high-speed data transfer, and I have proved to myself that it DOES make a difference. It’s very hard to find a USB2.0 system that matches the performance of a USB3.0/etc system – even on a 2.5″ single drive subsystem.

The last thing to cover here under storage is backup. Your video footage is irreplaceable. Procedure will be covered below, but under hardware, provide a second external drive on the set. It’s simply imperative that you immediately back up the footage on to a second physical drive as soon as practical – NOT at the end of the day! If you have a powerful enough computer, with the correct connectivity, etc. – you can actually copy the iPhone files to two drives simultaneously (best solution), but otherwise plan on copying the files from one external drive to the backup while the next scenes are being shot (background task).

I’ll close with a final suggestion:  while this description of hardware and process is not meant in any way to be a tutorial on cinemaphotography, audio, etc. etc. – here is a small list (again, this is under ‘hardware’ as it concerns ‘stuff’) of useful items that will make your life easier “on the set”:

  • Proper transport cases, bags, etc. to store and carry all these bits. Organization, labeling, color-coding, etc. all helps a lot when on a set with lots of activity and other equipment.
  • Spare cables for everything! Murphy will see to it that the one item for which you have no duplicate will get bent during the shoot…
  • Plenty of power strips and extension cords.
  • Gorilla tape or camera tape (this is NOT ‘duct tape’). Find a gaffer and he/she will explain it to you…
  • Small folding table or platform (for your PC/Mac and drives) – putting high value equipment on the floor is asking for BigFoot to visit…
  • Small folding stool (appropriate for the table above), or an ‘apple box’ – crouching in front of computer while manipulating high value content files is distracting, not to mention tiring.
  • If you are shooting outside, more issues come into play. Dust is the big one. Cans of compressed air, lens tissue, camel-hair brushes, zip-lock baggies, etc. etc. – none of the items discussed in this entire post appreciate dust or dirt…
    • Cooling. Mentioned earlier, but you’ll need to keep the phone and computer as cool as practical (unless of course you are shooting in Scotland in February in which case the opposite will be true: trying to figure out how to keep things warm and dry in the middle of a wet and freezing moor will become paramount).
    • Special mention for ocean-front shoots:  corrosion is a deadly enemy of iPhones and other such equipment. Wipe down ALL equipment (with appropriate cloths and solutions) every night after the shoot. Even the salt air makes deposits on every exposed metal surface – and later on a very hard to remove scale will become apparent.
  • A final note for sunny outdoor shoots: seeing the iPhone screen is almost impossible in bright sunlight, and unlike DSLRs the iPhone does not have an optical viewfinder. Some sort of ‘sunshade’ will be required. While researching this online, I came across this little video that shows one possible solution. Obviously this would have to be modified to accommodate the audio adapter, iPro lenses, etc. shown in my project, but it will hopefully give you some ideas. (Thanks to triplelucky for this video).

Software:

As amazing as the hardware capabilities of the above system are (iPhone, supplemental lenses, audio adapters, etc.) – none of this would be possible without the sophisticated software that is now available for this platform at such low cost. The list of software that I am currently using to produce this video is purely of my own choosing – there may be other equally viable solutions for each step or process. I feel what is important is the possibility of the process, not the precise piece of kit used to accomplish the task. Obviously, as I am using the iOS platform, all the apps are “Apple iPhone/iPad compliant”. The reader that chooses an alternate platform will need to do a bit of research to find similar functionality.

As a parallel project, I am currently describing my experiences with the iPhone camera in general, as well as many of the software packages (apps) that support the iPhone still and video camera. These posts are elsewhere in this same blog location. For that reason, I will not describe in any detail the apps here. If software that is discussed or listed here is not yet in my stable of posts, please be patient – I promise that each app used in this project will be discussed in this blog at some point. I will refer the reader to this post where an initial list of apps that will be discussed is located.

Here is a short list of the apps I am currently using. I may add to this list before I complete this project! If so, I will update this and other posts appropriately.

Storyboard Composer Excellent app for building storyboards from shot or library photos, adding actors, camera motion, script, etc. Powerful.

Movie*Slate A very good slate app.

Splice Unbelievable – a full video editor for the iPhone/iPad. Yes, you can: drop movies and stills on a timeline, add multiple sound tracks and mix them, work in full HD, apply loads of video and audio efx, add transitions, burn in titles, resize, crop, etc. Now that doesn’t mean that I would choose to edit my next feature on a phone…

Avid Studio  The renowned capability of Avid now stuffed into the iPad. Video, audio, transitions, etc. etc. Similar in capability to Splice (above) – I’ll have a lot more to say after these two apps get a serious test drive while editing all the footage I have shot.

iTC Calc The ultimate time code app for iDevices. I use on both iPad and iPhone.

FilmiC Pro Serious movie camera app for iPhone. Select shooting mode, resolution, 26 frame rates, in-camera slating, colorbars, multiple bitrates for each resolution, etc. etc.

Camera+ I use this as much for editing stills as for shooting; its biggest advantage over the native iPhone camera app is that you can set different parts of the frame for exposure and focus.

almost DSLR The closest thing to fully manual control of the iPhone camera you can get. Takes some training, but it is very powerful once you get the hang of it.

PhotoForge2 Powerful editing app. Basically Photoshop on the iPhone.

TrueDoF This one calculates true depth-of-field for a given lens, sensor size, etc. I use this to plan my range of focus once I know my shooting distance.

OptimumCS-Pro This is sort of the inverse of the above app – here you enter the depth of field you want, then OCSP tells you the shooting distance and aperture you need for that.

Juxtaposer This app lets you layer two different photos onto each other, with very controllable blending.

Phonto One of the best apps for adding titles and text to shots.

Some of the above apps are designed for still photography only, but since stills can be laid down in the video timeline, they will likely come into use during transitions, effects, title sequences, etc.

I used Filmic Pro as the only video camera app for this project. This was based firstly on personal preference and on the capabilities it provided (the ability to lock focus, exposure and white balance was, in my opinion, critical to maintaining continuity across takes). Once I had selected a video camera app with which I was comfortable, I felt it important to use it on both iPhones – again for continuity of the content. There may be other equally capable apps for this purpose. My focus was on producing as high a quality product as possible within the means and capabilities at my disposal. The particular tools are less important than the totality of the process.

The process of dumping footage off the iPhone (transferring video files to external storage) requires some additional discussion. The required hardware has been mentioned above, now let’s dive into process and the required software. The biggest challenge is logistics: finding enough time in between takes to transfer footage. If the iPhones are the only cameras used, then in one way this is easier – you have control over the timeline in that regard. In my case, this was even more challenging, as I was ‘piggybacking’ on an existing shoot so I had to fit in with the timeline and process in place. Since professional video cameras all use removable storage, they only require a few minutes to effectively be ready to shoot again after the on-camera storage is full. But even if iPhones are the only cameras, taking long ‘time-outs’ to dump footage will hinder your production.

There are several ways to maximize the transfer speed of files off the iPhone, but the best approach is simple time management:  try to schedule dumping for normal ‘down time’ on the set (breaks, scene changes, wardrobe changes, meal breaks, etc.). In order to do this you need to have your ‘transfer station’ [computer and external drive] ready and powered up so you can take advantage of even a short break to clear files from the phone. I typically transferred only one to three files at a time, so that if we started up sooner than expected I was not stuck in the middle of a long transfer. The other advantage in my situation was that the iPhone charges while connected via USB cable, so I was able to accomplish two things at once: replenish battery capacity (depleted because the Mobile In audio adapter prevented shooting on line power) and dump the files to external storage.

My 2nd camerawoman, Tara, brought her Mac Air laptop for file transfer to an external USB drive, while I used a Dell PC laptop (discussed above in the hardware section). In both cases, I found that using the native OS file management (Image Capture [part of the OS] for the Mac, Windows Explorer for the PC) was hideously slow. It does work: after plugging the iPhone into the USB connector on the computer, the iPhone shows up as just another external disk, and you can navigate down through a few folders and find your video files. On my PC (which, BTW, is a very fast machine – basically a 4-core mobile workstation that can routinely transfer files to/from external drives at over 150MB/s) the best transfer speed I could obtain with Windows Explorer amounted to needing almost an hour to transfer 10 minutes of video off the iPhone – a complete non-starter in this case. After some research, I located software from WideAngle Software called TouchCopy that solved my problem. They make versions for both Mac and PC, and it allowed transfers off the iPhone to external storage about 6x faster than Windows Explorer. My average transfer times were approximately ‘real time’ – i.e. 10 minutes of footage took about 10 minutes to transfer. There may be other similar applications out there – as mentioned earlier, I am not in the software reviewing business – once I find something that works for me I will use that, until I find something “better/faster/cheaper.”

To summarize the challenging file transfer issue:

  • Use the fastest hardware connections and drives that you can.
  • Use time management skills and basic logistics to optimize your ‘windows’ for file transfer.
  • Use supplemental software to maximize your transfer speed from phone to external storage.
  • Transfer in small chunks so you don’t hold up production.

The last bit that requires a mention is file backup. Your original footage is impossible to replace, so you need to take exquisite care with it. The first thing to do is back it up to a second external physical drive immediately after the file transfer. Typically I started this task as soon as I was done dumping files off the iPhone – it could run unsupervised during the next takes. However, one thing to consider before doing that (and this may depend on how much time you have during breaks) is the relabeling of the video files. The footage is stored on your iPhone as a generically labeled .mov file, usually something like IMG_2334.mov – not a terribly insightful description of your scene/take. I never change the original label, only add to it. There is a reason… it helps keep all the files in sequential order when starting the scene selection and editorial process later. This can be very helpful when things go a bit askew – as they always do during a shoot. For instance, if the slate is missing on a clip (you DO slate every take, correct??), having the original ‘shot order’ can really help place the orphan take into its correct sequence. In my case, this happened several times due to slate placement:  since my iPhone cameras were in different locations, sometimes the slate was pointed so that it was in frame for the DSLR cameras but was not visible to the iPhones.

I developed a short-hand description, taken from the slate at the head of each shot, that I appended to the original file name. This does take a few seconds (to launch QuickTime or VLC, shuttle in to the slate, pause and get the slate info), but the sooner you do this, the better. If you have time to rename the shots before the backup, then you don’t have to rename twice – or face the possibility of human error during that task. Here is a sample of one of my files after renaming: IMG_2334_Roll-A1_EP1-1_T-3.mov  This is short for Roll A1, Episode 1, Scene 1, Take 3.
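As an illustration of the renaming step, here is a minimal sketch (Swift/Foundation, run on the transfer computer; the file path and slate string are hypothetical examples, not from the actual shoot) that appends the slate info while keeping the original name:

```swift
import Foundation

// Append slate information to the original camera file name, keeping the
// original IMG_xxxx prefix so clips still sort in shooting order.
let original = URL(fileURLWithPath: "/Volumes/SHOOT_BACKUP/IMG_2334.mov")  // example path
let slate = "Roll-A1_EP1-1_T-3"          // Roll A1, Episode 1, Scene 1, Take 3

let newName = original.deletingPathExtension().lastPathComponent + "_" + slate + ".mov"
let renamed = original.deletingLastPathComponent().appendingPathComponent(newName)

do {
    try FileManager.default.moveItem(at: original, to: renamed)
    print("Renamed to \(renamed.lastPathComponent)")   // IMG_2334_Roll-A1_EP1-1_T-3.mov
} catch {
    print("Rename failed: \(error)")
}
```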

However you go about this, just ensure that you back up the original files quickly. The last step of course is to delete the original video files off the iPhone so you have room for more footage. To double-check this process (you NEVER want to realize you just deleted footage that was not successfully transferred!!!) I do three things:

  1. Play into the file with headphones on to ensure that I have video and audio at head, middle and end of each clip. That only takes a few seconds, but just do it.
  2. Using Finder or Explorer, get the file size directly off the still-connected iPhone and compare it to the copied file on your external drive. (Look at the actual file size, not ‘size on disk’, as your external disk may have different sector sizes than the iPhone.) If they are different, re-transfer the file. A scripted version of this check is sketched just after this list.
  3. Using the ‘scrub bar’, quickly traverse the entire file using your player of choice (Quicktime, VLC, etc.) and make sure you have picture from end to end in the clip.
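Here is that scripted size comparison as a minimal sketch (Swift/Foundation; the two paths are placeholders standing in for the mounted phone folder and the external drive):

```swift
import Foundation

// Compare the exact byte size of the clip still on the phone with the copy on
// the external drive before anything is deleted. Paths are placeholders only.
func fileSize(_ path: String) -> Int64? {
    let attrs = try? FileManager.default.attributesOfItem(atPath: path)
    return (attrs?[.size] as? NSNumber)?.int64Value
}

let onPhone = "/Volumes/PHONE_MOUNT/DCIM/100APPLE/IMG_2334.MOV"              // placeholder
let onDrive = "/Volumes/SHOOT_BACKUP/IMG_2334_Roll-A1_EP1-1_T-3.mov"         // placeholder

if let a = fileSize(onPhone), let b = fileSize(onDrive), a == b {
    print("Sizes match (\(a) bytes) – safe to move on to the scrub check.")
} else {
    print("Size mismatch or missing file – re-transfer before deleting anything!")
}
```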

Then and only then, double-check exactly what you are about to delete, offer a small prayer to your production spirit of choice, and delete the file(s).

Summary:

This is only the beginning! I will write more as this project moves ahead, but I wanted to introduce the concept to my audience. A deep thanks to all of you who have read my past posts on various subjects, and please return for more of this journey. Your comments and appreciation provide the fuel for this blog.

Support and Contact Details:

Please visit and support the talented women that have enabled me to produce this experiment. This would not have been possible otherwise.

Tiffany Price, Writer, Producer, Actress
Lauren DeLong, Writer, Producer, Actress
Ambika Leigh, Director, Producer

iPhone4S – Section 4: Software

March 13, 2012 · by parasam

This section of the series of posts on the iPhone4S camera system will address the all-important aspect of software – the glue that connects the hardware we discussed in the last section with the human operator. Without software, the camera would have little function. Our discussion will be divided into three parts:  an overview; the camera subsystem of the iOS operating system; and the actual applications (apps) through which users normally interact to take and process images.

As the audience of this post will likely cover a wide range of knowledge, I will try to not assume too much – and yet also attempt not to bore those of you who likely know far more than I do about writing software and getting it to behave in a somewhat consistent fashion…

Overview

The iPhone, surprise-surprise – is a computer. A full-fledged computer, just like what sits on your desk (or your lap). It has a CPU (brain), memory, graphics controller, keyboard, touch surface (i.e. mouse), network card (WiFi & Bluetooth), a sound card and many other chips and circuits. It even has things most desktops and laptops don’t have:  a GPS radio for location services, an accelerometer (a really tiny gyroscope-like device that senses movement and position of the phone), a vibrating motor (to bzzzzzz at you when you get a phone call in a meeting) – and a camera. A rather cool, capable little camera. Which is rather the point of our discussion…

So… like any good computer, it needs an operating system – a basic set of instructions that allows the phone to make and receive calls, data to be written to and read from memory, information to be sent and retrieved via WiFi – and on and on. In the case of the iDevice crowd (iPod, iPhone, iPad) this is called iOS. It’s a specialized, somewhat scaled down version of the full-blown OS that runs on a Mac. (Actually it’s quite different in the details, but the concept is exactly the same.) The important part of all this for our discussion is that a number of basic functions that affect camera operation are baked into the operating system. All an app has to do is interact via software with these command structures in the OS, present the variables to the user in a friendly manner (like turning the flash on or off), and most importantly, take the image data (i.e. the photograph) and allow the user to save or modify it, based on the capability of the app in question.

The basic parameters that are available to the developer of an app are the same for everyone. It’s an equal playing field. Every app developer has exactly the same toolset, the same available parameters from the OS, and the same hardware. It’s up to the cleverness of the development team to achieve either brilliance or mediocrity.

The Core OS functions – iOS Camera subsystem

The following is a very brief introduction to some of the basic functions that the OS exposes to any app developer – which forms the basis for what an app can and cannot do. This is not an attempt to show anyone how to program a camera app for the iPhone! Rather, a small glimpse into some of the constraints that are put on ALL app developers – the only connection any app has with the actual hardware is through the iOS software interface – also known as the API (Application Programming Interface). For instance, Apple passes on to the developers through the API only 3 focus modes. That’s it. So you will start to see certain similarities between all camera apps, as they all have common roots.

There are many differences, due to the way a given developer uses the functions of the camera, the human interface, the graphical design, the accuracy and speed of computations in the app, etc. It’s a wide open field, even if everyone starts from the same place.

In addition, the feature sets made available through the iOS API change with each hardware model, and can (and do!) change with upgrades of the iOS. Of course, each time Apple changes the underlying API, each app developer is likely to need to update their software as well. So then you’ll get the little red number on your App Store icon, telling you it’s time to upgrade your app – again.

The capabilities of the two cameras (front-facing and rear-facing) are markedly different. In fact, all of the discussion in this series has dealt only with the rear-facing camera. That will continue to be the case, since the front-facing camera is of very low resolution, intended pretty much just to support FaceTime and other video calling apps.

Basic iOS structure

The iOS is like an onion, layers built upon layers. At the center of the universe… is the Core. The most basic is the Core OS. Built on top of this are additional Core Layers: Services, Data, Foundation, Graphics, Audio, Video, Motion, Media, Location, Text, Image, Bluetooth – you get the idea…

Wrapped around these “apple cores” are Layers, Frameworks and Kits. These Apple-provided structures further simplify the work of the developer, provide a common and well tuned user interface, and expand the basic functionality of the core systems. Some examples are:  Media Layer (including MediaPlayer, MessageUI, etc.); the AddressBook Framework; the Game Kit; and so on.

Our concern here will be only with a few structures – the whole reason for bringing this up is to allow you, the user, to understand what parameters on the camera and imaging systems can be changed and what can’t.

Focus Modes

There are three focus modes:

  • AVCaptureFocusModeLocked: the focal area is fixed.

This is useful when you want to allow the user to compose a scene then lock the focus.

  • AVCaptureFocusModeAutoFocus: the camera does a single scan focus then reverts to locked.

This is suitable for a situation where you want to select a particular item on which to focus and then maintain focus on that item even if it is not the center of the scene.

  • AVCaptureFocusModeContinuousAutoFocus: the camera continuously auto-focuses as needed.

Exposure Modes

There are two exposure modes:

  • AVCaptureExposureModeLocked: the exposure mode is fixed.
  • AVCaptureExposureModeAutoExpose: the camera continuously changes the exposure level as needed.

Flash Modes

There are three flash modes:

  • AVCaptureFlashModeOff: the flash will never fire.
  • AVCaptureFlashModeOn: the flash will always fire.
  • AVCaptureFlashModeAuto: the flash will fire if needed.

Torch Mode

Torch mode is where a camera uses the flash continuously at a low power to illuminate a video capture. There are three torch modes:

  •    AVCaptureTorchModeOff: the torch is always off.
  •    AVCaptureTorchModeOn: the torch is always on.
  •    AVCaptureTorchModeAuto: the torch is switched on and off as needed.

White Balance Mode

There are two white balance modes:

  •    AVCaptureWhiteBalanceModeLocked: the white balance mode is fixed.
  •    AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance: the camera continuously changes the white balance as needed.

You can see from the above examples that many of the features of the camera apps you use today inherit these basic structures from the underlying AVFoundation capture API. There are obviously many, many more parameters available for control by a development team – depending on whether you are doing basic image capture, video capture, audio playback, modifying images with built-in filters, etc.
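To show how an app actually reaches these modes, here is a minimal Swift/AVFoundation sketch (using the modern Swift spellings of the constants above, e.g. .locked for AVCaptureFocusModeLocked) that locks focus, exposure and white balance the way a continuity-minded video camera app might:

```swift
import AVFoundation

// Lock focus, exposure and white balance on the camera – the kind of control a
// video camera app uses to keep takes consistent.
if let camera = AVCaptureDevice.default(for: .video) {
    do {
        try camera.lockForConfiguration()          // required before changing modes

        if camera.isFocusModeSupported(.locked) {
            camera.focusMode = .locked             // AVCaptureFocusModeLocked
        }
        if camera.isExposureModeSupported(.locked) {
            camera.exposureMode = .locked          // AVCaptureExposureModeLocked
        }
        if camera.isWhiteBalanceModeSupported(.locked) {
            camera.whiteBalanceMode = .locked      // AVCaptureWhiteBalanceModeLocked
        }

        camera.unlockForConfiguration()
    } catch {
        print("Could not configure camera: \(error)")
    }
}
```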

While we are on the subject of core functionality exposed by Apple, let’s discuss camera resolution.

Yes, I know we have heard a million times already that the iPhone4S has an 8MP maximum resolution (3264×2448). But there ARE other resolutions available. Sometimes you don’t want or need the full resolution – particularly if the photo function is only a portion of your app (ID, inventory control, etc.) – or even as a photographer you want more memory capacity and for the purpose at hand a lower resolution image is acceptable.

It’s almost impossible to find this data, even on Apple’s website. Very few apps give access to different resolutions, and the ones that do don’t give numbers – it’s ‘shirt sizes’ [S-M-L]. Deep in the programming guidelines for AVFoundation I found a parameter, AVCaptureStillImageOutput, that allows ‘presetting the session’ to one of the values below:

PresetNameStill     PresetResolutionStill
Photo               3264×2448
High                1920×1080
Med                 640×480
Lo                  192×144

PresetNameVideo     PresetResolutionVideo
1080P               1920×1080
720P                1280×720
480P                640×480

I then found one of the very few apps that support ALL of these resolutions (almost DSLR) and shot test stills and video at each resolution to verify. Everything matched the above settings EXCEPT for the “Lo” preset in still image capture. The output frame measured 640×480, the same as “Med” – however the image quality was much lower. I believe that the actual image IS captured at 192×144 and then scaled up to 640×480 – why, I am not sure, but it is apparent that the Lo image is of far lower quality than Med. The file size was smaller for the Lo quality image – but not enough that I would ever use it. On the tests I shot, Lo = 86kB, Med = 91kB. The very small difference in size is not worth the big drop in quality.

So… now you know. You may never have need of this, or not have an app that supports it – but if you do require the ability to shoot thousands of images and have them all fit in your phone, now you know how.
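For the curious, the presets above correspond (in today’s Swift spelling) to AVCaptureSession presets; a minimal sketch, assuming a capture session is being configured for video rather than full-resolution stills:

```swift
import AVFoundation

// Choose a capture resolution by preset rather than full sensor resolution.
// .photo = full-resolution stills; .hd1920x1080 / .hd1280x720 / .vga640x480 = video sizes;
// .medium / .low correspond to the reduced "shirt sizes" discussed above.
let session = AVCaptureSession()

let wanted: AVCaptureSession.Preset = .hd1920x1080
if session.canSetSessionPreset(wanted) {
    session.sessionPreset = wanted
} else {
    session.sessionPreset = .high        // fall back to the device's best video preset
}
```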

There are two other important aspects of image capture that are set by the OS and not changeable by any app:  color space and image compression format. These are fixed, but different, for still images and video footage. The color space (which for the uninitiated is essentially the gamut – or range of colors – that can be reproduced by a color imaging system) is set to sRGB. This is a common and standard setting for many digital cameras, whether full sized DSLR or cellphones.

It’s beyond the scope of this post to get into color space, but I personally will be overjoyed when the relatively limited gamut of sRGB is put to rest… however, it is appropriate for the iPhone and other cellphone camera systems due to the limitations of the small sensors.

The image compression format used by the iPhone (all models) is JPEG, producing the well-known .jpg file format. This format, and its potential artifacts, were discussed in the last post. Since there is nothing one can do about this setting, no further discussion is needed at this time.
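If you want to verify the color space and format for yourself, the metadata embedded in a captured still tells the story. A small sketch (Swift, using Apple’s ImageIO framework; the file name is just a placeholder):

import Foundation
import ImageIO

// Sketch: inspect a captured still's properties. "IMG_0001.JPG" is a placeholder path.
let url = URL(fileURLWithPath: "IMG_0001.JPG") as CFURL
if let source = CGImageSourceCreateWithURL(url, nil),
   let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any] {
    print("Color model:", props[kCGImagePropertyColorModel] as? String ?? "unknown")  // e.g. RGB
    print("Profile:", props[kCGImagePropertyProfileName] as? String ?? "none")        // e.g. sRGB IEC61966-2.1
    print("Pixels:", props[kCGImagePropertyPixelWidth] as? Int ?? 0, "x",
                     props[kCGImagePropertyPixelHeight] as? Int ?? 0)
}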

In the video world, things are a little different. We have to be aware of audio as well – we get an audio track along with the video – so we have two different compression formats to consider (audio and video), as well as the wrapper format (think of this as the envelope that contains the audio and video tracks together, in sync).

One note on audio:  if you use a stereo external microphone, you can record stereo audio along with the video shot by the iPhone4S. This requires an external device which connects via the 30-pin docking connector. You will get far superior results – but of course it’s not as convenient. Video recordings made with the on-board microphone (same one you use to speak into the phone) are mono only.

The parameters of the video and audio streams are detailed below (this example is for the full 1080P resolution):

General

Format : MPEG-4
Format profile : QuickTime
Codec ID : qt
Overall bit rate : 22.9 Mbps

Video

ID : 1
Format : AVC
Format/Info : Advanced Video Codec
Format profile : Baseline@L4.1
Format settings, CABAC : No
Format settings, ReFrames : 1 frame
Codec ID : avc1
Codec ID/Info : Advanced Video Coding
Bit rate : 22.4 Mbps
Width : 1 920 pixels
Height : 1 080 pixels
Display aspect ratio : 16:9
Rotation : 90°
Frame rate mode : Variable
Frame rate : 29.500 fps
Minimum frame rate : 15.000 fps
Maximum frame rate : 30.000 fps
Color space : YUV
Chroma subsampling : 4:2:0
Bit depth : 8 bits
Scan type : Progressive
Bits/(Pixel*Frame) : 0.367
Title : Core Media Video
Color primaries : BT.709-5, BT.1361, IEC 61966-2-4, SMPTE RP177
Transfer characteristics : BT.709-5, BT.1361
Matrix coefficients : BT.709-5, BT.1361, IEC 61966-2-4 709, SMPTE RP177

Audio

ID : 2
Format : AAC
Format/Info : Advanced Audio Codec
Format profile : LC
Codec ID : 40
Bit rate mode : Constant
Bit rate : 64.0 Kbps
Channel(s) : 1 channel
Channel positions : Front: C
Sampling rate : 44.1 KHz
Compression mode : Lossy
Title : Core Media Audio

The highlights of the video/audio stream format are:

  • H.264 (MPEG-4) video compression, Baseline Profile @ Level 4.1, 22Mb/s
  • QuickTime wrapper (.mov)
  • AAC-LC audio compression, 44.1kHz, 64kb/s

The color space for the video is the standard adopted for HD television, Rec709. Importantly, this means that videos shot on the iPhone will look correct when played out on an HDTV.

This particular sample video, shot for this exercise, was recorded at just under 30 frames per second (fps); the video camera supports a range of 15-30fps, controlled by the application.
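Most of the numbers above can also be read straight out of a recorded clip in code. A rough sketch (Swift, AVFoundation asset classes; ‘clip.mov’ is a placeholder, and I’m using the older synchronous property accessors to keep it short):

import AVFoundation

// Sketch: read back some of the stream parameters from a recorded clip.
let asset = AVURLAsset(url: URL(fileURLWithPath: "clip.mov"))
if let video = asset.tracks(withMediaType: .video).first {
    print("Dimensions:", video.naturalSize)                       // e.g. 1920.0 x 1080.0
    print("Nominal frame rate:", video.nominalFrameRate, "fps")   // e.g. ~29.5 (variable)
    print("Data rate:", video.estimatedDataRate / 1_000_000, "Mb/s")
    // video.preferredTransform carries the 90° rotation flag seen in the metadata above
}
if let audio = asset.tracks(withMediaType: .audio).first {
    print("Audio data rate:", audio.estimatedDataRate / 1000, "kb/s")
}
print("Duration:", CMTimeGetSeconds(asset.duration), "s")

For the full level of detail shown in the listing above you would normally turn to a desktop analysis tool.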

Software Applications for Still & Video Imaging on the iPhone4S

The following part of the discussion will cover a few of the apps that I use on the iPhone4S. These are just the ones I have come across and find useful – this is not even close to all the imaging apps available for the iPhone. I obtained all of the apps through the normal retail Apple App Store – I have no relationship with any of the vendors – they are unaware of this article (well, at least until it’s published…)

I am not a professional reviewer, and make no claim of absolute objectivity – I do always try to be accurate in my observations, but reserve the right to have favorites! The purpose of this section is really to give examples of how a few representative apps expose the hardware and underlying iOS software to the user, showing the differences in design and functionality.

These apps are mostly ‘purpose-built’ for photography – as opposed to some other apps that have a different overall purpose but contain imaging capabilities as part of the overall feature set. One example (that I have included below) is EasyRelease, an app for obtaining a ‘model release’ [legal approval from the subject to use his/her likeness for commercial purposes]. This app allows taking a picture with the iPhone/iPad for identification purposes – so has some very basic image capture abilities – it’s not a true ‘photo app’.

BTW, this entire post has been focused on only the iPhone camera, not the iPad (both 2nd & 3rd generation iPads contain cameras) – I personally don’t think a tablet is an ideal imaging device – it’s more a handy accessory, if you have your tablet out and need to take a quick snap, than a camera. Evidently Apple feels this way as well, since the camera hardware in the iPads has always lagged significantly behind that of the iPhone. However, most photo apps will work on the iPad as well as the iPhone (even on the 1st generation model – with no camera), since many of the apps support working with photos from the Camera Roll (library) as well as directly from the camera.

I frequently work this way – shoot on iPhone, transfer to iPad for easier editing (better for tired eyes and big fingers…), then store or share. I won’t get into the workflows of moving images around – it’s not anywhere near as easy as it should be, even with iCloud – but it’s certainly possible and often worth the effort.

Here is the list of apps that will be covered. For quick reference I have listed them all below with a simple description; a more detailed discussion of each app follows.

[Note:  due to the level of detail, including many screenshots and photo examples used for the discussion of each app, I have separated the detailed discussions into separate posts – one for each app. This allows the reader to select only the app(s) they may be interested in, and keeps each individual post to a reasonable size. This is important for mobile readers…]

Still Imaging

Each of the app names (except for original Camera) is a link that will take you to the corresponding page in the App Store.

Camera  The original photo app included on every iPhone. Basic but intuitive – and of course the biggest plus is the ability to fast-launch it from the lock screen without unlocking the phone first. For street photography (my genre) this is a big feature.

Camera+  I use this as much for editing as shooting; its biggest advantage over the native iPhone camera app is that you can set different parts of the frame for exposure and focus. The info here covers the just-released version 3.0.

Camera Plus Pro  This is similar to the above app (Camera+) – with some additional features, not the least of which is that it shoots video as well as still images. Although made by a different company, it has many similar features, filters, etc. It allows for some additional editing functions and features ‘live filters’ – you can apply the filter before you start shooting, instead of as a post-production step as in Camera+. However, there are tradeoffs (compression ratio, shooting speed, etc.)  Compare the apps carefully – as always, know your tools…  {NOTE: There are two different apps with very similar names: Camera+, made by TapTapTap with the help of pixel wizard Lisa Bettany; and Camera Plus, made by Global Delight Technologies – who also make Camera Plus Pro – the app under discussion here. Camera+ costs $0.99 at the time of this post; Camera Plus is free; Camera Plus Pro is $1.99 — are you confused yet? I was… to the point where I felt I needed to clarify this situation of unfortunately very similar brand names for somewhat similar apps – but there are indeed differences. I’m going to be as objective in my observations as possible. I am not reviewing Camera Plus, as I don’t use it. Don’t infer anything from that – this whole blog is about what I personally use. I will be as scientific and accurate as possible when I write about a topic, but what I use is just personal preference.}

almost DSLR  The closest thing to fully manual control of the iPhone camera you can get. Takes some training, but is very powerful once you get the hang of it.

ProHDR  I use this a lot for HDR photography. The pic below was taken with it – unretouched! That’s how it came out of the camera…

Big Lens  This allows you to manually ‘blur’ the background to simulate shallow depth of field. Quite useful, since the roughly 30mm (35mm-equivalent) focal length puts almost everything in focus.

Squareready  If you use Instagram then you know you need to upload in square format. Here’s the best way to do that.

PhotoForge2  Powerful editing app. Basically Photoshop on the iPhone.

Snapseed  Another very good editing app. I use this for straightening pix, as well as for its ability to tweak small areas of a picture differently. On some iPhone snaps I have adjusted 9 different areas of the picture with things like saturation, contrast, brightness, etc.

TrueDoF  This one calculates true depth-of-field for a given lens, sensor size, etc. I use this when shooting DSLR to plan my range of focus once I know my shooting distance.

OptimumCS-Pro  This is sort of the inverse of the above app – here you enter the depth of field you want, then OCSP tells you the shooting distance and aperture you need to achieve it.

Iris Photo Suite  A powerful editing app, particularly in color balance, changing histograms, etc. Can work with layers like Photoshop, perform noise reduction, etc.

Filterstorm  I use this app to add visible watermarks to images, as well as many other editing functions. Works with layers, masks, variable brushes for effects, etc.

Genius Scan+  While this app was intended for scanning documents with the camera to PDF (and I use it for that as well), I found that it works really well to straighten photos… like when you are shooting architecture and have unavoidable keystone distortion… Just be sure to pull back and give yourself some surround on your subject, as the perspective-cropping technique used to straighten costs you some of your frame…

Juxtaposer  This app lets you layer two different photos onto each other, with very controllable blending.

Frame X Frame  Camera app, used for stop motion video production as well as general photography.

Phonto  One of the best apps for adding titles and text to shots.

SkipBleach  This mimics the effect of skipping (or reducing) the bleach step in photochemical film processing. It’s what gives that high contrast, faded and harsh ‘look’.

Monochromia  You probably know that getting a good B&W shot out of a color original is not as simple as just desaturating… here’s the best iPhone app for that.

MagicShutter  This app is for time exposures on iPhone, also ‘light painting’ techniques.

Easy Release  Professional model release. Really, really good – I use it on iPad and have never gone back to paper. Full contractual terms & conditions, you can customize with your additional wording, logo, etc. – a relatively expensive app ($10) but totally worth it in terms of convenience and time saved if you need this function.

Photoshop Express  This is actually a bit disappointing for a $5 app – others above do more for less – except that the noise reduction (a new feature) is worth the price alone. That part is really, really good.

Motion Imaging

Movie*Slate  A very good slate app.

Storyboard Composer  Excellent app for building storyboards from shot or library photos, adding actors, camera motion, script, etc. Powerful.

Splice  Unbelievable – a full video editor for the iPhone/iPad. Yes, you can: drop movies and stills on a timeline, add multiple sound tracks and mix them, work in full HD, use loads of video and audio efx, add transitions, burn in titles, resize, crop, etc. etc. Now that doesn’t mean I would choose to edit my next feature on a phone…

iTC Calc  The ultimate time code app for iDevices. I use on both iPad and iPhone.

FilmiC Pro  Serious movie camera app for iPhone. Select shooting mode, resolution, 26 frame rates, in-camera slating, colorbars, multiple bitrates for each resolution, etc. etc.

Camera Plus Pro  This app is listed under both sections, as it has so many features for both still and motion photography. The video capture/edit portion even has numerous filters that can be used during capture.

Camcorder Pro  Simple but powerful HD camera app. Anti-shake and other features.

This concludes this post on the iPhone4S camera software. Please check out the individual posts following for each app mentioned above. I will be posting each app discussion as I complete it, so it may be a few days before all the app posts are uploaded. Please remember these discussions on the apps are merely my observations on their behavior – they are not intended to be a full tutorial, operations manual or other such guide. However, in many cases, the app publisher offers little or no extra information, so I believe the data provided will be useful.
