Cinematographer & Filmmaker
Corey Steib
by Corey Steib on February 5th, 2017

I have had this happen to me twice on set during my 13-year career so far. You are getting ready to roll the camera and out of nowhere it just stops. You start freaking out, your mind racing at 100 mph, the producer wants to know what's going on, and you are trying so hard to figure out why the camera shut down.

Sadly, there is no way to prevent this, but there are a few steps you can follow that I learned early on in my career as a camera assistant. I did my college internship at Panavision, where I learned the ins and outs of every camera. I encourage you to go spend as much time at a rental house as you can, because those camera technicians have heard and seen it all.

One of the most important things I learned was how to narrow down the source of a camera shutdown. These tips can help you out in a pinch while on set.

Shooting on Film

  1. Check the battery (throw a fully charged one on just to make sure).
  2. Check the battery cable.
  3. Check the gate and film feed to make sure the film did not jam.
  4. Check the fuses inside the camera.
If all else fails, send it back to the rental house, as it could be the motherboard inside the camera.

Shooting on Digital

  1. Check the battery (throw a fully charged one on just to make sure).
  2. Check the P-Tap power connector if you have one.
  3. Does the camera feel hot? If so, let it sit and cool down.
  4. Check your media to make sure that it has a fast write speed (a quick way to test this is sketched at the end of this post).
Weather and temperature play a role in a camera shutdown as well. Some cameras have a fuse and some don't. Make sure that you do not do a firmware update in the middle of a production. It's always best to let the camera sit for 5 to 10 minutes and then attempt to start it back up. These simple steps can help whether you are a solo shooter or a full production crew. If you are renting, call the rental house ASAP so that, if need be, they can overnight you a replacement camera.
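On that last point about media, here is a minimal Python sketch of a card write-speed test you could run before a shoot. It just times how long it takes to write some dummy data to a file on the mounted card; the mount path and test size are placeholders you would swap for your own setup, and the speed you actually need depends on your camera's recording format.

```python
import os
import time

# Placeholder path: point this at a file on your mounted media card.
TEST_FILE = "/Volumes/CAMERA_CARD/write_speed_test.bin"
TEST_SIZE_MB = 512                  # how much dummy data to write
CHUNK = b"\0" * (1024 * 1024)       # 1 MB of zeros per write

start = time.time()
with open(TEST_FILE, "wb") as f:
    for _ in range(TEST_SIZE_MB):
        f.write(CHUNK)
    f.flush()
    os.fsync(f.fileno())            # force the data onto the card, not just the OS cache
elapsed = time.time() - start

os.remove(TEST_FILE)                # clean up the test file
print(f"Wrote {TEST_SIZE_MB} MB in {elapsed:.1f} s -> {TEST_SIZE_MB / elapsed:.1f} MB/s")
```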

by Corey Steib on January 30th, 2017

As the use of 4K and other applications becomes the norm on many productions, so does shooting in raw. So first, what is raw?

A camera raw image file contains minimally processed data from the image sensor of either a digital camera, image scanner, or motion picture film scanner. Raw files are named so because they are not yet processed and therefore are not ready to be printed or edited with a bitmap graphics editor.

Normally, the image is processed by a raw converter in a wide-gamut internal color space where precise adjustments can be made before conversion to a "positive" file format such as TIFF or JPEG for storage, printing, or further manipulation. This often encodes the image in a device-dependent color space. There are dozens, if not hundreds, of raw formats in use by different models of digital equipment (like cameras or film scanners).
The go-to system for handling your raw files is DaVinci Resolve by Blackmagic. But what if you have an older computer (like mine) that can't handle Resolve? Well, have no fear, as I have a solution for you. Since my computer, a 2009 iMac, can't handle it, I had to come up with an alternative for my raw Cinema DNG files. There is this beautiful thing called Adobe Lightroom, which you would normally use for still photos, but since there is no wrapper around Cinema DNG footage, it is really just a sequence of individual DNG images that Lightroom can open. I have created two videos which give you step-by-step instructions on how to edit your images and bring them over to Final Cut Pro X.
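If you would rather script the last step, here is a rough Python sketch of one way to stitch the frames you export from Lightroom back into a single movie by calling ffmpeg (which you would need installed separately). This is not the exact workflow from the videos; the frame-number pattern, frame rate, and output name are placeholders to match to your own export.

```python
import subprocess

# Placeholder values: match these to your own Lightroom export.
FRAME_PATTERN = "export/frame_%04d.tif"   # numbered frames exported from Lightroom
FRAME_RATE = "24"                         # playback frame rate of the original footage
OUTPUT = "graded_clip.mov"

subprocess.run([
    "ffmpeg",
    "-framerate", FRAME_RATE,   # treat the stills as video at this rate
    "-i", FRAME_PATTERN,        # numbered input frames
    "-c:v", "prores_ks",        # ProRes encoder, which Final Cut Pro X handles natively
    "-profile:v", "3",          # ProRes 422 HQ
    OUTPUT,
], check=True)
```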
If you have any questions, feel free to leave a comment below or hit me up on Twitter.

by Corey Steib on January 29th, 2017

As much as I love technology, there is nothing more important than shot composition. It is one of the best things I learned in film school, as my instructors drilled it into us with hours and hours of hands-on training. Of course, you only get better over time and over years of working on different projects.

So what is shot composition, you ask? It refers to the frame of the image and how the elements of the mise-en-scène appear within it. Composition guidelines should be observed when telling stories visually, as in filmmaking.

Rule of Thirds

By placing points of interest along one or more of the imaginary horizontal and vertical lines, or on one or more of the four intersections, you make your image more pleasing to look at.
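If it helps to see where those lines actually fall, here is a tiny Python sketch that works out the third lines and the four intersection points for any frame size; 1920x1080 is just an example resolution.

```python
# Tiny sketch: where the rule-of-thirds lines and intersections land for a given frame size.
WIDTH, HEIGHT = 1920, 1080   # example frame size; swap in your own resolution

vertical_lines = [WIDTH // 3, 2 * WIDTH // 3]       # x positions of the two vertical lines
horizontal_lines = [HEIGHT // 3, 2 * HEIGHT // 3]   # y positions of the two horizontal lines

# The four intersection points where the lines cross
intersections = [(x, y) for x in vertical_lines for y in horizontal_lines]

print("Vertical lines at x =", vertical_lines)
print("Horizontal lines at y =", horizontal_lines)
print("Intersections:", intersections)
```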

Leading Lines

These imaginary lines, also called vectors, help lead your viewer's eye into your image, which creates depth -- a must for our two-dimensional medium. They create a sense of kinesis and movement, which adds to your image's aesthetic energy.

Diagonals

Like leading lines, diagonals are vectors that lead your viewer's eye, but instead of leading it into your image, they lead it across, which creates "movement". This is probably more important for still photography, but it still applies if you're shooting a static shot, even if elements within the frame are moving.

Framing

You can use something already in the scene, like windows and doors, to create a frame within a frame.

Figure to Ground

We tend to notice things that contrast -- in fact, it's one of the main ideas in Gestalt psychology. By creating contrast between your subject and the background, you can create depth, as well as help your viewer orient the subjects within the space.

Fill the Frame

Get close! According to many aesthetic theories, the size of an object within the frame directly determines how much aesthetic energy (i.e. importance) it has: the bigger it is, the more "important" it is. (Remember also that this will be the first thing that your audience is most likely to look at.)

Center the Dominant Eye

Positioning the dominant eye of your character in the center of the frame gives the illusion that the character's gaze is following you.

Patterns and Repetition

Humans are naturally attracted to patterns -- I guess we don't like, or can't easily make sense of, chaos. So, using repetition will immediately attract your viewer to your image, but including an element that breaks the pattern will keep your images interesting and your audience engaged.

All of these points are guidelines, as there really is no hard rule per se. I once owned four cameras, as each one did something a little different, but my framing and storytelling through the lens is what mattered more than the technology. You can see below the past two projects I have worked on and my framing for each one. Both of these films were written, shot, and edited in 48 hours.
If you are interested in a more in-depth breakdown video, check out this one by Videomaker.
So for any filmmaker starting out, I would say pick up any camera you have, start lining up shots, and work on framing your subject. If nothing else, pick up your phone and download an app like pCAM or the Panavision one, which allows you to see your frame through different lenses.

Before I became a cinematographer, I cut my teeth as a camera assistant and film loader. This allowed me to understand framing and storytelling through the lens, and how each shot represents or tells something about the story. So go out and shoot, tell a story, and most of all have fun doing it.

by Corey Steib on January 16th, 2017

I was asked last week for my thoughts on where the industry is going in the next 5 years. Well, if you had asked me that 5 years ago, I would have said 3D and 4K.

But now it's 2017, a new year with some interesting products coming out. My thoughts might differ from yours depending on what part of the industry you are in, but I think it comes full circle in the end. 4K has been around for 10 years now, and it's only just starting to be used more and more on productions.

I am sure you have seen that RED announced a new 8K camera with a Super 35 sensor just last week. So of course I think things will be moving to 8K, but we don't have enough 4K TVs, or consumers who want that, just yet.

Of course, there are drones up the yin-yang nowadays shooting video in 4K. Everyone wants to know how well a camera does in low light, and whether it shoots slow motion.

So what are my thoughts?

I do believe that we will continue to have 4K and that it will become the norm, but for the time being clients are happy with 1080.

Sensors are getting better and better, and it is only a matter of time before every camera handles low-light situations well.

Slow motion is hit or miss. There are cameras designed specifically for it, because you are not going to have that many shots that need it, and when a shot does, it's only for 5, maybe 10 seconds.

Drones are in the same boat as slow-motion cameras: they are not needed for every shot, just as a wide shot is not needed for every shot unless that's the director's vision.

So please post your own thoughts in the comments below and let me know what you think.



by Gannon Burgett (DigitalTrends.com) on December 22nd, 2016

Polarizing filters have long been used as a tool to reduce glare, minimize reflections, and improve the color of the sky in photographs by absorbing certain wavelengths of light before they hit the sensor (or film) inside a camera.

Traditionally, these filters have been placed in front of the lens via screw-on and drop-in setups, but there have been attempts to add polarizers directly inside the camera to reduce the need for fragile external components. The technology has yet to go mainstream, though, due to complicated designs, limited capabilities, and cost. Soon, though, that might not be the case.

Sony has announced that it is working on a new backside-illuminated CMOS sensor with an integrated polarizing filter. Instead of placing a glass polarizing filter in front of your lens, Sony’s prototype sensor uses a metal grid atop the sensor’s photodiodes that acts as a polarizer to selectively reduce unwanted glare and reflections in an image.
To achieve this, Sony angles the metal grid into four distinct polarization directions: 0, 45, 90, and 135 degrees. Such a setup allows each of the polarizers to be used selectively and block out only certain light, a vital component since this leaves more creative control in the hands of the photographer holding the camera.
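As a rough illustration of why four angles are useful, here is a small Python sketch of the standard Stokes-parameter math commonly used with 0/45/90/135-degree polarizer arrays. To be clear, this is not Sony's published processing; the arrays below are random stand-ins for the four per-angle intensity images you would read off such a sensor.

```python
import numpy as np

# Standard Stokes-parameter math for a 0/45/90/135-degree polarizer array.
# i0, i45, i90, i135 stand in for the per-angle intensity images from such a
# sensor (random data here, just so the script runs end to end).
rng = np.random.default_rng(0)
i0, i45, i90, i135 = rng.random((4, 8, 8))

s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
s1 = i0 - i90                        # horizontal vs. vertical polarization
s2 = i45 - i135                      # diagonal polarization

dolp = np.sqrt(s1**2 + s2**2) / (s0 + 1e-8)   # degree of linear polarization (0..1 for physical data)
aolp = 0.5 * np.arctan2(s2, s1)               # angle of linear polarization, in radians

# Pixels with a high degree of linear polarization are dominated by polarized
# light (glare, reflections); attenuating those pixels is one way to suppress
# reflections after capture.
print("Mean degree of linear polarization:", float(dolp.mean()))
```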
In the example images Sony shared, we can see the prototype sensor was able to capture objects inside a glass box both with and without reflections, all in-body and without the need to fiddle with a circular polarizing filter.

The sensor is also capable of capturing photos with minimal “ghosting,” a phenomenon that occurs when the polarizer acts as a mirror and reflects its own image onto the sensor. To do this, Sony uses an anti-reflection layer above the gridded polarizers.
By using this new design, Sony should be able to develop smaller, more affordable polarization sensors that could some day make it into consumer cameras. According to data that Sony presented, this method of polarizing proved to be more effective and efficient than what is offered by previously developed polarization cameras.
