While high definition has become a reality for many consumers, the technical jargon associated with this exciting new technology is causing much confusion. Just as we were beginning to understand the differences between Blu-ray and HD DVD, along comes a new high-definition format: 1080p.
But why do we need another high-definition format anyway? Many of us have bought our HD Ready screens and were ready to sit back and enjoy this new viewing experience, but now we are all wondering if we bought the right kit in the first place.
Many of the more recent HD Ready flat screens feature a resolution of 1,366×768 pixels. This will display the commonly used 720p and 1080i formats, although 1080i/1080p signals will be downscaled to fit. To display 1080i/1080p signals in their entirety, you’ll need a screen with a resolution of 1,920×1,080 pixels, coined ‘Full HD’ by the marketing men.
However, just because a screen has a resolution of 1,920×1,080 pixels, it does not necessarily mean that it will accept a 1080p input – so check before you buy.
Remember, 720p, 1080i and 1080p are formats in which sources of high-definition content are presented for viewing on a particular output device, such as your LCD or plasma screen. The source could originate from your TV cable provider, for example, or your Xbox 360. To restate the point, a 1080i/1080p signal needs a screen resolution of 1,920×1,080 pixels to be displayed in its entirety, but you don’t have to have a screen with this resolution to display a 1080i/1080p signal – lower-resolution screens downscale the signal to fit.
720p and 1080i were initially set out as the two key standards for high-definition content, with Sky HD, HD DVD and the Xbox 360 supporting these formats. Any TV that supports 720p and 1080i is classed as HD Ready. Let’s take a step back for a moment and take a quick look at the development of TV technology to see how we arrived at these standards.
In a CRT display (the TV you grew up with), a stream of electrons is generated by a gun, and is scanned across the face of the tube in scan lines, left to right and top to bottom. The face is coated in phosphors, which glow when hit by the electron stream. A method of scanning was required that would reduce the transmitted TV picture’s bandwidth and work in accordance with the electricity supply frequency (50Hz in the UK and Europe and 60Hz in the US). The result was interlaced scanning.
A method of reducing bandwidth was required because early sets were not able to draw the whole picture on screen before the top of the picture began to fade, resulting in a picture of uneven brightness and intensity. To overcome this, the screen was split in half, with only half the lines (every alternate line) being refreshed each cycle. Hence, the signal is interlaced to deliver a full screen refresh every second cycle. So if the interlaced signal refreshes half the lines on a screen 50 times per second, this results in a full screen (or frame) refresh rate of 25 times per second. The problem with interlacing is the distortion that appears when an image moves quickly, because only one set of lines – odd or even – is ever being refreshed at a time.
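The field-to-frame arithmetic above can be sketched in a few lines of Python (a toy illustration of the sums, not anything from a broadcast standard):

```python
def full_frame_rate(field_rate_hz):
    # Each cycle refreshes only half the lines (one "field"), so two
    # fields are needed to build one complete frame.
    return field_rate_hz / 2

print(full_frame_rate(50))  # UK/Europe mains frequency: 25.0 frames per second
print(full_frame_rate(60))  # US mains frequency: 30.0 frames per second
```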
As TV screen technologies have progressed, another system called progressive scan has been developed. With progressive scanning the frames are not split into two fields of odd and even lines. Instead, all of the image scan lines are drawn in one go from top to bottom. This method is sometimes referred to as ‘sequential scanning’ or ‘non-interlaced’. The fact that frames are shown as a whole makes it similar in principle to the way film is shown at the cinema.
At this point it is worth considering what we mean by resolution in relation to TVs:
Resolution: HD-Ready TVs need to be able to display pictures at the resolution set by the new standard. Resolution can be described either in terms of “lines of resolution,” or pixels. The resolution you see on your TV depends on two factors, namely the resolution of your display and the resolution of the video signal you receive. Because video images are always rectangular in shape, there is both horizontal resolution and vertical resolution to consider.
Vertical resolution: This is the number of horizontal lines that can be resolved in an image from top to bottom. The old familiar CRT TV displays 576 lines, while Digital HD television operates at a resolution of either 720 or 1080 lines. This is the most important resolution as it is most noticeable to the human eye.
Horizontal resolution: This is the number of vertical lines that can be resolved from one side of an image to the other. Horizontal resolution varies depending on the source. The number of horizontal pixels is not quite so critical as vertical resolution as it is not as obvious to the human eye during normal viewing.
An analogue TV signal in Europe, where the PAL standard is used, has 625 horizontal lines of which 576 lines are displayed and the image (or frame) is refreshed 25 times a second. This is the standard we have been used to for years.
A High Definition Digital TV signal delivers significantly more picture detail and audio quality than a standard signal, producing pictures that are significantly better, sharper and clearer:
720p: 1,280×720 pixel resolution. High-definition picture that is displayed progressively: all the lines of each frame are drawn in a single pass, so motion appears smoother than with an interlaced picture.
1080i: 1,920×1,080 pixel resolution. High-definition picture that is displayed interlaced: each odd line of the picture is displayed, followed by each even line, so the resulting image is not as smooth as a progressive feed. 1080i therefore offers a more detailed picture suited to documentaries and wildlife footage, but is less suitable for action-oriented material such as sports and movies.
1080p: 1,920×1,080 pixel resolution. High-definition picture that is displayed progressively: all the lines of each frame are drawn in a single pass, so motion appears smoother than with an interlaced picture. This is the ultimate high-definition standard: the most detailed picture, displayed progressively.
There are two main formats for HDTV broadcasting, namely 720p (a 720-line picture progressively scanned 50 times a second) and 1080i (1,080 lines interlaced at 50 cycles per second). The picture resolution of a high-definition digital TV is roughly four to five times greater than that of a typical 576-line TV picture.
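Comparing raw pixel counts makes the “times greater” figure concrete. A rough sketch, assuming the usual 720×576 digital sampling of a 576-line PAL picture (an assumption – the analogue signal itself has no fixed horizontal pixel count):

```python
# Rough pixel-count comparison between SD and the two HD broadcast formats.
RESOLUTIONS = {
    "576 (SD PAL)": (720, 576),    # assumed digital sampling of analogue PAL
    "720p":         (1280, 720),
    "1080i/1080p":  (1920, 1080),
}

sd_pixels = 720 * 576  # 414,720 pixels
for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {w * h:,} pixels ({w * h / sd_pixels:.1f}x SD)")
```

720p works out at about 2.2 times the pixels of SD and 1080-line formats at 5 times, which is where the commonly quoted “about four times” figure sits.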
Not having a screen which is able to display 1080p may not be important to you. However, there are exceptions, and if you are a serious game player you will probably already know two of them: the Xbox 360 (with a little tweaking) and the PlayStation 3 both produce output at 1080p. The new high-definition disc format, Blu-ray, has also been designed for 1080p output. Is the difference worth the extra investment? Maybe – that is something you will have to judge for yourselves …
Although HDTV broadcasts had been demonstrated in Europe since the early 1990s, the first regular broadcasts started on January 1, 2004 when Euro1080 launched the HD1 channel with the traditional Vienna New Year’s Concert. Test transmissions had been active since the IBC exhibition in September 2003, but the New Year’s Day broadcast marked the official start of the HD1 channel, and the start of HDTV in Europe.
Euro1080, a division of the Belgian TV services company Alfacam, broadcast HDTV channels to break the pan-European stalemate of “no HD broadcasts mean no HD TVs bought means no HD broadcasts…” and kick-start HDTV interest in Europe.
The HD1 channel was initially free-to-air and mainly comprised sporting, dramatic, musical and other cultural events broadcast with a multi-lingual soundtrack on a rolling schedule of 4 or 5 hours per day.
These first European HDTV broadcasts used the 1080i format with MPEG-2 compression on a DVB-S signal from SES Astra’s 1H satellite at Europe’s main DTH Astra 19.2°E position. Euro1080 transmissions later changed to MPEG-4/AVC compression on a DVB-S2 signal in line with subsequent broadcast channels in Europe.
New Series of HDMI Video Fiber Optic Extenders Aimed at Pro Install, Medical, and Government Markets
Capable of sending HDMI v1.3 digital audio/video signals through multimode optical fibers, the AT-HDF20SR and AT-HDF30SR allow any HD display to receive signals extended up to 1,320 ft at WUXGA or HDTV resolutions. They offer self-adjustment with no compression, bit reduction, or signal degradation, and can run multiple signals in a single conduit without crosstalk. The CEC-compliant units support 12-bit color depth and have an 8 dB input equalizer, which compensates for losses over 16 ft.
Atlona Technologies is releasing a new line of video extenders that transfer high-definition video over multimode fiber optic cable up to 1000 ft without any signal loss. These new extenders not only let companies send video signals far beyond the lengths of conventional cables, but also provide a measure of security through their use of fiber optic cable. Since fiber optic cable is immune to interference caused by electromagnetic fields (EMF/EMI), these extenders are well suited to medical imaging applications, where a perfect picture is not only ideal but necessary.
The Atlona HDMI versions, models AT-HDF20SR and AT-HDF30SR, are capable of sending HDMI v1.3 digital audio/video signals through multimode fibers (SC connection), allowing any HD display to be extended up to 400 meters (1,320 ft) at WUXGA (1,920×1,200 @ 60Hz) or the full range of HDTV resolutions: 720p/1080i/1080p.
Along with being HDCP/DDC compliant, these UL/CE approved Atlona units are equipped with advanced digital fiber optic technology allowing for self adjustment with no compression, bit reduction, or signal degradation. Multiple signals can be run in a single conduit without crosstalk. These CEC compliant units support 12-bit color depth and have an 8dB input equalizer, which compensates losses over 5 meters (16 ft).
The AT-HDF20SR is an adaptor-style balun with a small form factor that lends itself perfectly to HDMI matrix switchers where space is limited. The larger of the two baluns is sold as separate transmitter and receiver units, giving users the ability to convert among multiple HD video types depending on the receiver unit. It is compatible with Atlona’s RGB and DVI receiver units (models AT-DVIF30R and AT-RGBF30R), making it versatile enough to handle even the most complicated digital signage applications.
Choosing an HDMI cable can be a complex task. There are several factors which you must consider in order to select the best HDMI cable to meet your requirements:
- HDMI standards compliance
- HDMI Cable Categories
- Cable length
- Cable quality
- Active cables
- HDMI devices
HDMI Standards Compliance
Each HDMI cable is rated to comply with a specific revision of the HDMI standards. A cable rated for HDMI 1.2a should meet the requirements of HDMI 1.0, 1.1, and 1.2 — but is not guaranteed to meet the standards for HDMI 1.3.
HDMI Cable Categories
The HDMI standards define two categories of cables. Category 1 HDMI cables are designed to support HDTV resolutions and frame rates. Category 2 cables are required for higher resolutions or higher frame rates.
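As a toy illustration of how the two categories map onto video formats, here is a hypothetical helper function. The 74.25 MHz and 340 MHz test frequencies are the figures used by the HDMI 1.3 specification; the mapping function itself is our own sketch, not anything from the standard:

```python
def hdmi_cable_category(pixel_clock_mhz):
    # Category 1 ("Standard") cables are tested at 74.25 MHz,
    # enough for 720p/1080i HDTV.
    if pixel_clock_mhz <= 74.25:
        return 1
    # Category 2 ("High Speed") cables are tested at 340 MHz,
    # covering 1080p, higher frame rates and deeper color.
    if pixel_clock_mhz <= 340.0:
        return 2
    raise ValueError("beyond the single-link HDMI 1.3 limit")

print(hdmi_cable_category(74.25))   # 720p/1080i -> 1
print(hdmi_cable_category(148.5))   # 1080p60 -> 2
```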
The HDMI specification does not define a maximum cable length. HDMI cables are commonly available in 3′ to 50′ lengths.
Purchasing a cable longer than necessary will not only cost you more money; it will also increase signal loss due to attenuation.
All other factors being equal, a cable which is built to higher tolerances using better materials will outperform a cable which is built merely to meet a standards specification. In addition, these premium cables will often provide longer service lives.
An HDMI cable can be made using 28 AWG wire, but the use of 24 AWG wire will create a sturdier cable which is more resistant to attenuation.
As with traditional analog stereo cables, premium HDMI cables are often furnished with gold plated connectors to ensure the best possible signal quality.
For specialized high-end applications, some manufacturers sell active HDMI cables. These use a variety of technologies that boost transmission distance or signal quality by adding electrical power to the cable connection.
Some of these active cables run over fiber optics or Cat-5 cable.
Another approach to supporting extremely long cable runs is to chain multiple HDMI cables together with amplifiers, repeaters, or equalizers.
An HDMI cable only has to be good enough to support the equipment which it connects. It is useless to pay for a premium gold-plated HDMI cable for a low-end television set.
How the designers of the HDMI standard screwed up, and what’s to be done about it.
HDMI, as we’ve pointed out elsewhere, is a format which was designed primarily to serve the interests of the content-provider industries, not to serve the interests of the consumer. The result is a mess, and in particular, the signal is quite hard to route and switch, cable assemblies are unnecessarily complicated, and distance runs are chancy. Why is this, and what did the designers of the standard do wrong? And what can we do about it?
The story begins with another badly-developed standard, DVI. A few years ago, there was a movement within the computer industry to develop a new digital video display standard to replace the traditional analog VGA/RGBHV arrangement still found on most computer video cards and monitors. Interested parties grouped together to form the Digital Display Working Group (DDWG), which developed the DVI standard.
DVI had all the earmarks of a standard designed by committee, and it remains one of the most confusing video interfaces ever. DVI could run analog signals, digital signals, or both, and it could run digital signals either in a single-link configuration (in a cable using four twisted pairs for the signal), or in a dual-link configuration (using seven). Identifying which DVI standard or standards any particular device supported was not always easy, and the DVI connector came in various flavors and was never really manufactured in any form that wasn’t well-nigh impossible to terminate.
But the worst thing about DVI was something that the computer-display professionals involved in its development really didn’t give much thought to: distance runs. Most computer displays are mounted at most a few feet away from the CPU, so it didn’t seem imperative that DVI work well over distance. This lack of concern for function at a distance, coupled with common use of twisted-pair cable (e.g., CAT 5) in computer interconnection, led to a decision that DVI would be run in twisted-pair cable.
Had the DVI standard been designed by broadcast engineers rather than computer engineers, things probably would have turned out very differently. In the broadcast world, everything from lowly composite video to High-Definition Serial Digital Video is run in coaxial cables, and for good reasons, which we’ll get to in a bit. Long-distance runs of VGA, in fact, are always handled in coaxial cable (though there may be a number of miniature coaxes in a small bundle, rather than something which obviously appears to be coax).
DVI lacked a couple of things which the consumer audio/video industry wanted. It was implemented on a variety of HD displays and source devices, but it was confusing for the consumer because of the many variants on the standard and different connector configurations, and it didn’t carry audio signals. A consortium to develop and promote a new interface, HDMI, was formed; the idea was to come up with a standard which could be implemented more uniformly, was less confusing, and offered the option of routing audio signals along with video.
Here, again, was an opportunity to avoid problems. The difficulties of running DVI-D signals over long distances were well known, and the mistakes of the past could have been avoided by developing HDMI as a wholly new standard, independent of DVI. Instead, the HDMI group elected to modify the DVI standard, using the same encoding scheme and the same basic interface design, but adding embedded audio and designing a new plug. Instead of many DVI options, analog, digital, single and dual link, there was one “flavor” of HDMI (actually, there is also a dual-link version in the HDMI spec–but you won’t find it implemented on any currently available device). This provided the advantage of making HDMI backward-compatible with some existing DVI hardware, but it locked the interface into the electrical requirements of the DVI interface. Specifically, that means that the signals have to be run balanced, on 100 ohm impedance twisted pairs.
We’re often asked why that’s so bad. After all, CAT 5 cable can run high-speed data from point to point very reliably–why can’t one count on twisted-pair cable to do a good job with digital video signals as well? And what makes coax so great for that type of application?
First, it’s important to understand that a lot of other protocols which run over twisted-pair wire are two-way communications with error correction. A packet that doesn’t arrive on a computer network connection can be re-sent; an HDMI or DVI signal is a real-time, one-way stream of pixels that doesn’t stop, doesn’t error-check, and doesn’t repair its mistakes–it just runs and runs, regardless of what’s happening at the other end of the signal chain.
Second, HDMI runs fast–at 1080p, the rate is around 150 Megapixels/second. CAT5, by contrast, is rated at 100 megabits per second–and that’s bits, not pixels.
Third, HDMI runs parallel, not serially. There are three color signals riding on three pairs, with a clock circuit running on the fourth. These signals can’t fall out of time with one another, or with the clock, without trouble–and the faster the bitrate, the shorter the bits are, and consequently the tighter the time window becomes for each bit to be registered.
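To put numbers on that shrinking time window, here is a back-of-the-envelope sketch. The 148.5 MHz pixel clock for 1080p at 60 frames per second and TMDS’s 10-bits-per-pixel-per-channel encoding are figures we are assuming, not ones stated in the text above:

```python
PIXEL_CLOCK_HZ = 148.5e6       # assumed pixel clock for 1080p60
BITS_PER_PIXEL_PER_PAIR = 10   # TMDS encodes each 8-bit value as 10 bits

# Serial bit rate carried on each of the three color pairs:
bit_rate = PIXEL_CLOCK_HZ * BITS_PER_PIXEL_PER_PAIR

# Duration of a single bit on the wire, in picoseconds:
bit_period_ps = 1e12 / bit_rate

print(f"{bit_rate / 1e9:.3f} Gbit/s per pair, "
      f"bit period ~{bit_period_ps:.0f} ps")
```

At roughly 1.5 Gbit/s per pair, each bit occupies well under a nanosecond, so even tiny skew between the pairs, or between a pair and the clock, can push a bit outside its window.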
Consider, by contrast, what the broadcast world did when it needed to route digital video from