Back in the day – from the 1940s to the 1980s – television came in just one format: 4:3 at 525 lines, otherwise known as the NTSC standard, analog TV, or Standard Definition (SD). This simply meant that the picture was a bit wider than it was high, and its resolution was pretty low by today’s digital standards. On June 12, 2009, the old-school analog standard was retired in favor of digital television (DTV), ushering in the age of High Definition (HD) television.
DTV describes a way to capture and broadcast video, but there are a lot of variations of digital television and HD is only one of them. We’re talking about this because there’s been some confusion about what HD is and why different kinds of HD cameras produce very different-looking pictures.
HD is a set of standards that defines the shape of the picture, its resolution, scanning type and data rate. HD commonly comes in these formats:

| Format | Frame Size (in pixels – W x H) |
|--------|--------------------------------|
| 720p   | 1280 x 720                     |
| 1080i  | 1920 x 1080                    |
| 1080p  | 1920 x 1080                    |
Progressive vs. interlaced: a pretty geeky topic, but the short version is that progressive images hold up better for images that move, while interlaced is fine for images that don’t. We know what you’re thinking: “It’s video – doesn’t everything move all the time? So progressive is better, right?” Well, not exactly. Progressive vs. interlaced describes both how video is captured and how it’s reproduced (broadcast/webcast). For example, 720p recording mode has roughly half the pixels of 1080p, but it’s still HD and is often used to save recording storage space.
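To make that resolution comparison concrete, here’s a quick pixel-count check for the HD frame sizes above – the frame dimensions come straight from the standards, and the percentage is simple arithmetic:

```python
# Pixel counts for the common HD frame sizes discussed above.
resolutions = {
    "720p":  (1280, 720),
    "1080i": (1920, 1080),
    "1080p": (1920, 1080),
}

for name, (width, height) in resolutions.items():
    print(f"{name}: {width} x {height} = {width * height:,} pixels")

# How much of 1080p's resolution does 720p actually have?
ratio = (1280 * 720) / (1920 * 1080)
print(f"720p carries {ratio:.0%} of 1080p's pixels")  # 44% – "roughly one-half"
```

So “roughly one-half” is fair shorthand: a 720p frame captures a bit under half the pixels of a 1080p frame.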
You can pick up an HD camera at your local electronics store for a fraction of the price of a professional HD camera used by videographers and video production companies. So, what’s the difference? Why is the pro camera so much more expensive than the prosumer camera?
There are many factors that account for the variation in cost from one camera to another, including lens quality; control over exposure, focus, and color rendering; the ability to capture slow motion; and the quality of the camera’s audio recording. A principal factor that distinguishes a pro camera from its prosumer cousin is the type and size of the chip used to capture the video, along with the data rate and color space at which that video is captured. The larger the chip, the more data it produces; the higher the quality of the chip, the more information it captures.
A camera’s data rate refers to its ability to capture data to some kind of recording media – a hard drive or a card (SD or CF, for example) – and how that data is encoded with an algorithm, or codec. Because of the amount of data produced by an HD camera’s sensor, very few cameras capture video in native or RAW formats.
To give you some idea of what that means: if you were to capture a 1080 HD signal in full native or RAW form, one minute of video would produce about 420MB of data, and an hour about 25GB. Anyone can go out and buy a 32GB or even 64GB SD or CF card, but at those sizes they can be quite costly.
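Those storage figures are easy to sanity-check. Here’s a small sketch using the ~420MB-per-minute estimate above (the 64GB card is just an example size, not a recommendation):

```python
# Storage math based on the estimates above: ~420MB per minute of
# RAW-ish 1080 HD video. These are round numbers, not a spec.
mb_per_minute = 420

mb_per_hour = mb_per_minute * 60        # 25,200 MB
gb_per_hour = mb_per_hour / 1024        # ~24.6 GB – the "about 25GB" above
print(f"One hour: {gb_per_hour:.1f} GB")

card_gb = 64                            # a large consumer SD/CF card
minutes_on_card = card_gb * 1024 / mb_per_minute
print(f"A {card_gb}GB card holds about {minutes_on_card:.0f} minutes")
```

Even a pricey 64GB card buys you only about two and a half hours at that data rate – which is why prosumer cameras compress.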
Professional videographers and production companies routinely capture at those kinds of data rates with the high-end cameras they employ. Prosumer cameras typically use a codec to compress the data so that it will fit on the much more modestly sized storage media those cameras use. But the compression produces lossy images – video that has permanently discarded detail, which becomes obvious when the picture is enlarged.
A pro camera’s chip and circuitry are able to capture more color information than a prosumer camera’s. This is known as color space. Color theory is a very complex topic, and we’re only going to graze the surface here.
Imagine, if you will, that you’re an eight-year-old kid with a box of 24 crayons and the classmate sitting next to you has a box of 64. Besides your understandable jealousy, your classmate has more colors to work with, enabling her to draw a more accurate picture of her mommy, daddy, and her dog Wrigley than you could. It’s a pretty crude analogy, we know, but professional equipment can capture a larger color space than a prosumer camera, which makes things like blue- or green-screen production substantially higher quality.
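To put numbers on the crayon-box analogy: bit depth per color channel determines how many distinct colors a camera can record. The 8-bit and 10-bit depths below are common values used purely for illustration – the post doesn’t specify which depths any particular camera uses.

```python
# Distinct representable colors for a given bit depth per channel.
# 8-bit and 10-bit are illustrative values, not figures from the post.
def color_count(bits_per_channel: int) -> int:
    levels = 2 ** bits_per_channel      # gradations per channel
    return levels ** 3                  # three channels: R, G, B

print(f"8-bit:  {color_count(8):,} colors")    # 16,777,216 (~16.7 million)
print(f"10-bit: {color_count(10):,} colors")   # 1,073,741,824 (~1.07 billion)
```

That 64x jump from 8-bit to 10-bit is the pro camera’s bigger crayon box: far finer gradations of each color, which is exactly what clean green-screen keys depend on.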
2K and 4K? More data and higher quality. 2K refers to a frame size of 2048 x 1080 (although there are several variations of this, too), while 4K comes in two flavors: the cinema (DCI) standard at 4096 x 2160 and Ultra High Definition (UHD) at 3840 x 2160. 4K is gaining greater acceptance as a capture resolution among film and television professionals. Although 4K television sets are beginning to show up in retail channels, we’d caution against being an early adopter and buying one just yet.
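For scale, here are the pixel counts for those larger frame sizes next to 1080 HD (DCI 4K is the 4096 x 2160 cinema size; UHD at 3840 x 2160 is the consumer-TV variant):

```python
# Pixel counts for 1080 HD, 2K, and the two common 4K frame sizes.
frames = {
    "1080 HD": (1920, 1080),
    "2K":      (2048, 1080),
    "UHD":     (3840, 2160),
    "DCI 4K":  (4096, 2160),
}
for name, (width, height) in frames.items():
    print(f"{name}: {width * height:,} pixels")

# DCI 4K doubles both dimensions of 2K, so it carries 4x the pixels.
assert 4096 * 2160 == 4 * (2048 * 1080)
```

That fourfold jump in pixels per frame is why 4K capture multiplies the data-rate and storage pressures described above.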
We’ve tried to cram a lot of technical information into a very small post (kind of like video compression, eh?) and we hope that we’ve answered some of the questions you might have about HD.