It might be early days for 8K television, but one thing is already clear: The bandwidth demands it’s going to put on the smart homes of tomorrow will be extreme. And tomorrow is closer than most people think — some studies estimate 8K will become widely adopted as early as 2023.

If you’re an installer, this means it’s the perfect time to educate clients and consumers on the need for structured wiring installations, to futureproof their homes for the bandwidth crunch lurking just around the corner.

With that in mind, here’s Part 1 of our multi-part series on the nuts and bolts of what affects network speed and bandwidth.

The bits and bytes of digital video

We all know computers and televisions use pixels, and the more pixels a display has, the higher the resolution. But there’s more to bandwidth than resolution. We also need to account for bits, colors, frames, fields, and frequency, because together they determine how much data a video signal has to move.

In simpler times, we had 1-bit video: each pixel was either black or white (or another two-color combination, such as green and black or yellow and black). Then came 4-bit video and the miracle of color displays. Each pixel combined the primary colors of red, green, and blue, with one bit assigned to each color and a fourth bit controlling intensity. All in all, 4-bit video could produce 16 colors.
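The relationship between bit depth and color count is simple doubling: every extra bit doubles the number of values a pixel can take. Here’s a quick illustrative sketch (Python is used purely as a calculator):

```python
# Each pixel with n bits can represent 2^n distinct colors.
for bits in (1, 4, 8, 24):
    print(f"{bits}-bit color: {2 ** bits:,} possible colors")

# Output:
# 1-bit color: 2 possible colors
# 4-bit color: 16 possible colors
# 8-bit color: 256 possible colors
# 24-bit color: 16,777,216 possible colors
```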

What does that mean for bandwidth?

Take IBM’s early CGA display from 1981 as an example. It had a resolution of 320 x 200 (64,000 pixels) and used 4-bit color (4 bits per pixel). Multiply 64,000 by 4 and you get 256,000 bits of data.

When VGA appeared in 1987, it increased resolution to 640 x 480, nearly quintupling the pixel count (307,200) and, at the same 4-bit color depth, the data per frame (1,228,800 bits).

Fast forward to today, where we have high definition (1920 x 1080) and 24-bit color. That’s 2,073,600 pixels and nearly 50,000,000 bits of data per frame to wrangle.
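The per-frame arithmetic is the same in every case: multiply the pixel count by the bits stored for each pixel. Here’s a minimal sketch that reproduces the figures above (the display list and bit depths simply mirror this article’s examples):

```python
# Bits per frame = horizontal pixels * vertical pixels * bits per pixel
displays = [
    ("CGA (1981)", 320, 200, 4),    # 4-bit color
    ("VGA (1987)", 640, 480, 4),    # 4-bit color
    ("HD 1080", 1920, 1080, 24),    # 24-bit color
]

for name, width, height, bpp in displays:
    pixels = width * height
    bits_per_frame = pixels * bpp
    print(f"{name}: {pixels:,} pixels x {bpp} bits = {bits_per_frame:,} bits per frame")

# Output:
# CGA (1981): 64,000 pixels x 4 bits = 256,000 bits per frame
# VGA (1987): 307,200 pixels x 4 bits = 1,228,800 bits per frame
# HD 1080: 2,073,600 pixels x 24 bits = 49,766,400 bits per frame
```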

Understanding fields and frames

Of course, the story doesn’t end with resolution and color. Frames, fields, and frequency also impact bandwidth demands.

In North America, standard definition television began as an analog technology that used horizontally scanned lines of light in a cathode ray tube. The rate at which the scanned lines created an image on screen was tied to the frequency of the TV’s AC power, which in North America is 60 Hz (or 60 times per second).

This means there are 60 images shown on screen each second. These are known as fields. Each field only shows every other horizontal line, so the two fields combine to create a complete frame. This is known as interlacing. Under this technology, video consists of 30 complete frames per second.
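The field-versus-frame relationship is easy to express directly. Here’s a short sketch using the North American timing described above:

```python
# Interlaced video: each field carries only every other scan line,
# so it takes two fields to assemble one complete frame.
fields_per_second = 60  # tied to 60 Hz AC power in North America
fields_per_frame = 2    # one field of odd lines, one field of even lines

frames_per_second = fields_per_second // fields_per_frame
print(f"{fields_per_second} fields per second / {fields_per_frame} fields per frame "
      f"= {frames_per_second} complete frames per second")

# Output: 60 fields per second / 2 fields per frame = 30 complete frames per second
```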

The impact of progressive scan technology

When HDTV first appeared on the scene at 1080i, it was still using interlacing; that’s what the “i” stood for. Then came 1080p. What’s the main difference? The “p” stands for progressive scan, which means every pass of the display draws every horizontal line instead of every other one.

This requires double the bandwidth of 1080i, as every update on a 1080p display delivers all 2,073,600 pixels. If the display is using 24-bit color, each of those updates carries 49,766,400 bits of data. With 60 updates each second, that works out to 2,985,984,000 bits (roughly 3 gigabits) of data per second.
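Spelled out as a calculation, the uncompressed 1080p figure looks like this (assuming the 24-bit color and 60 updates per second used in the example above):

```python
# Uncompressed 1080p data rate = pixels per frame * bits per pixel * updates per second
width, height = 1920, 1080
bits_per_pixel = 24         # 24-bit color
updates_per_second = 60     # 60 full-resolution updates each second

pixels_per_frame = width * height                   # 2,073,600
bits_per_frame = pixels_per_frame * bits_per_pixel  # 49,766,400
bits_per_second = bits_per_frame * updates_per_second

print(f"{bits_per_second:,} bits per second "
      f"(about {bits_per_second / 1e9:.0f} gigabits per second)")

# Output: 2,985,984,000 bits per second (about 3 gigabits per second)
```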

Today, this is entirely manageable. But 1080p has already given way to 4K ultra high definition, and forecasts are already in place for when 8K will begin to replace 4K. With each advance in display technology, bandwidth demands multiply, further underscoring the need to futureproof new and existing homes with structured wiring.
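To give a rough sense of how quickly the raw numbers climb, here is the same uncompressed calculation extended to 4K and 8K. The resolutions (3840 x 2160 for 4K, 7680 x 4320 for 8K), 24-bit color, and 60 updates per second are assumptions chosen for illustration, and real-world signals are compressed, so treat these as upper-bound raw figures rather than delivered bitrates:

```python
# Raw (uncompressed) data rates at an assumed 24-bit color depth and
# 60 updates per second; real signals are compressed well below these figures.
bits_per_pixel = 24
updates_per_second = 60

resolutions = [
    ("1080p", 1920, 1080),
    ("4K UHD", 3840, 2160),   # assumed 3840 x 2160
    ("8K UHD", 7680, 4320),   # assumed 7680 x 4320
]

for name, width, height in resolutions:
    gbps = width * height * bits_per_pixel * updates_per_second / 1e9
    print(f"{name}: roughly {gbps:.0f} Gb/s uncompressed")

# Output:
# 1080p: roughly 3 Gb/s uncompressed
# 4K UHD: roughly 12 Gb/s uncompressed
# 8K UHD: roughly 48 Gb/s uncompressed
```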

Check back soon for our next installment in the series, as we continue to explore how to manage the demands of 4K and 8K video.