A video wall is only as good as the sum of its parts. Just one component can impact overall visual performance, and it’s important to spot the red flags. With changing standards and wide range of product quality, questioning the available solutions will ensure you make the best choice. Here are the top questions you must ask before buying an LED display.
Because the manufacturing process is so complex, it’s impossible to make identical pixels. The best manufacturers sort pixels by brightness and color, a process called binning. Using pixels from a tightly controlled bin range ensures a better-quality display. Since all pixels are unique, it’s impossible to match them exactly, but choosing from a narrow range of color and brightness allows manufacturers to achieve a more uniform picture across the screen while maximizing power efficiency.
How much of the screen is covered by light-emitting pixels? That’s your fill factor. It’s impossible to achieve a fill factor of 100%, but this number should be high (at least 40%) to achieve the crispest and most vibrant picture.
Throughout an LED display, each pixel is held in place by an SMD package. For the deepest levels of black and highest contrast, these packages should be constructed with a dark encapsulation material, typically an epoxy resin. Be sure to ask for black SMD LEDs.
ICs are integral to the performance of your display. Without quality driver and switch ICs, the pixels in your display may not receive the best control. This will affect brightness and color uniformity. Over time, you can easily see the impact of low-quality ICs: the screen could show ghosting, dim lines, and color shifts.
Quality LED cabinets allow heat to escape the display, so that it can cool passively (instead of using an active fan). This decreases the overall size of the display, and opens more possibilities for your design and installation. It also provides noise-free operation, which is critical in many applications—especially if the viewing distance is close.
The best designs allow for full access from the front, so tiles can be replaced and electronics can be repaired without deconstructing your installation. Most products will also require rear access for certain components. It’s important that you understand which components can be accessed, and from which side of the tile. The best products allow individual tiles to be replaced from the front of the LED video wall, without having to remove the entire cabinet, dramatically reducing service time and operational costs.
Attending industry trade shows or requesting a live demonstration with multiple vendors will help you ensure top visual quality. Keep in mind, many vendors will offer their own content to maximize impact and minimize deficiencies. If possible, use a consistent package of visual images (and, preferably, video) to compare the vendors. Make sure to choose content that is representative of your specific application.
When viewing the images:
› look for a uniform image across all modules
› check for ghosting, ripping, or other anomalies
› use a light meter to test brightness
When LED components are used together, their interactions can cause additional concerns, such as EMC emission levels that may not meet standards. Completing a system-level certification ensures that the entire system is safe and provides you with peace of mind. If an LED product is purchased and installed without a system-level certification, a site certification may be required to complete your installation. Ask each potential vendor whether site certifications are required, as these will increase installation costs and cause project delays.
The best suppliers will work with you to create a solution that turns your vision into a reality. Ask if they offer additional products, like structures, processing and content management. The right mix of technologies and services will help create an entire experience for your audience. Make sure that your manufacturer will stand behind their product and make the installation right, should any problems arise.
The prospect of purchasing commercial digital display solutions is daunting, in no small part due to the complexity of ever-changing industry jargon. The fluidity of many of the industry’s most common, yet most confusing, specifications can make their precise definitions quite difficult to pin down. With this guide, however, we hope to do exactly that: nail down these concepts with concrete definitions so consumers can confidently counter sales speak with specificity.
Organizations who make the decision to purchase a large-format LED display solution are not simply buying a piece of technology off the shelf.
They are partnering with a manufacturer like NanoLumens to arrive at a fully customized solution that is built to fit a specific space and serve a specific purpose.
Creating this perfect visualization solution requires a manufacturer to know exactly what a customer is looking for, but in many cases these desires can be hard to articulate because industry vernacular is so malleable. Following the initial decision to purchase a display, a customer will likely establish a few things up front, like viewing distance, pixel pitch, display size, and intended use. Each of these metrics is fairly easily understood, and they work in tandem to set the basic
dimensions of each display project. Where the project path goes from there can get a lot more nuanced. We’ve found that even if primary specification discussions were productive, the confusing technical language that dominates secondary conversations can derail even the most promising projects. This is why we have created this guide: to finally pin down the confusing specifications that have long avoided concrete definitions. Equipped with the knowledge you’ll find within these pages, consumers and manufacturers alike will be able to move towards stronger long-term relationships because they will at last be speaking the same language.
Before diving into the specifics about display specifications, what customers need to realize first is that there is no real governing body judging how specifications are created, claimed, and tested, so manufacturers can pretty much say anything. With no authority monitoring what display providers can and cannot claim, customers have to shoulder this fact-checking responsibility themselves. There are a few steps in this process. First, the absence of regulated standardization within the industry does not preclude the existence of informal but commonly-met benchmarks. For each display metric, customers should learn these benchmarks and use them to set their initial expectations. Next, customers should research the intricacies of each specification so they can smoothly counter sales speak with actual knowledge. The final step is to request that manufacturers test and prove their claims. To get this process started, let’s begin by dissecting one ubiquitous industry term: diode lifespan.
The first thing a customer should know about the lifespan of a diode is the commonly-met benchmark. Here, the industry standard is 100,000 hours, which translates to roughly 11 years and five months. That number means that operating at 100% brightness, full white, 24 hours a day, 7 days a week, a diode will last 100,000 hours until its brightness degrades to 50% of the level it met on Day 1. A few display manufacturers have started to advertise that there’s no need to pay up for displays with diodes guaranteed to last 100,000 hours when the 50,000-hour lifespan of their diodes is plenty good enough. You should look skeptically at this claim, but it can indeed be true in certain cases. If a customer plans to use their display sparingly, or is particularly concerned with cutting costs, shorter-lifespan diodes can prove a decent option. Displays rarely operate at their peak brightness, and diodes with a lifespan of 50,000 hours can be perfectly viable in smaller-scale boardroom or corporate settings. Still, a less robust diode with a shorter lifespan is going to exhibit a higher fail rate, meaning it may need to be replaced even more quickly than its 50,000-hour characterization would portend. All we’re saying is: if someone advertises that 50,000 hours is all you need, take that with a grain of salt. The 100,000 (and 50,000) hour ratings are set by the suppliers who create diodes and then sell them to manufacturers, who subsequently assemble them into the full-size displays you see out in the field. During this assembly process, manufacturers will often reduce the maximum brightness levels of these diodes. They do this because LED displays almost never need to run at full brightness in the field, and setting a reduced upper brightness bound upon assembly helps extend the point in time at which the diode’s brightness will degrade to half of its original maximum.
In other words, the diodes are designed to last 100,000 hours until degrading to half brightness, but by artificially capping their brightness once assembled into a display, manufacturers extend this lifespan and decrease the fail rate of the diodes while not materially reducing performance. Most large-format LED displays rarely operate in the field at even 50% brightness, so this 100,000 hour benchmark exists mostly to indicate the lifespan of a diode under uncommonly rigorous usage patterns. Capping maximum diode brightness upon assembly and then operating the display at a lower brightness is a common best practice to maximize the lifespan and efficiency of a display.
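The arithmetic behind these lifespan claims is easy to sanity-check. Here is a rough sketch in Python, assuming (purely for illustration) exponential brightness decay whose rate scales linearly with drive level; real degradation curves vary by diode and supplier, so treat this as a back-of-the-envelope model rather than a datasheet formula.

```python
import math

RATED_HOURS = 100_000   # industry benchmark: hours to 50% brightness at full drive

def hours_to_half_brightness(drive_fraction):
    """Estimated hours until output falls to 50% of its Day-1 level.

    Toy model: exponential decay exp(-k * t), with k assumed to scale
    linearly with drive level (an illustrative assumption, not a
    supplier specification).
    """
    k_full = math.log(2) / RATED_HOURS          # exp(-k_full * 100,000) = 0.5
    return math.log(2) / (k_full * drive_fraction)

print(f"Full drive, 24/7: {RATED_HOURS / (24 * 365.25):.1f} years")        # ~11.4 years
print(f"Capped at 50% drive: {hours_to_half_brightness(0.5):,.0f} hours")  # 200,000
```

Under this simple model, halving the drive level doubles the time to half brightness, which is the intuition behind capping maximum brightness at assembly.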
With the knowledge that diode lifespan is intrinsically tied to brightness, customers should next develop an understanding of how display brightness works. In the digital display industry, brightness is frequently measured in “nits,” with one nit being equivalent to one candela per square meter (cd/m²), the SI unit of luminance. For reference, a typical LCD display like a computer screen or home television runs around 300 nits, give or take. Commercial LED displays, on the other hand, range in brightness from roughly 750 nits in small-scale indoor applications to 10,000 nits in huge outdoor installations. Many customers turn to LED because alternative digital display technologies have not proven bright enough. This can lead them to desire the brightest possible LED solution, but in truth that is rarely necessary. In most large-format LED applications, even in high-bright cases, 1,200-1,500 nits is sufficient. The intended use case for your display should serve as a guide for the brightness level you require from your diodes, and NanoLumens allows our customers to specify virtually any brightness level they may desire. While NanoLumens is capable of producing outdoor LED displays with soaring brightness levels, we often find the best solution for most indoor installations is a smaller, sharper product with a lower brightness maximum and more efficient energy usage. In any case, keep in mind that going for a display with a brightness maximum well beyond the intended regular usage level not only leaves customers paying for performance they don’t need but also compromises the grayscale of the display.
Grayscale is the industry term used to indicate how well a display differentiates separate tones of similar colors, and it is a result of how well the display captures the range of grays between full black and full white. This scale is a product of the display’s bit depth, but it is based on the display running at full brightness, so the further from full bright a display goes, the more its grayscale quality will erode. For example, a display with a maximum brightness of 7,000 nits turned down to 1,000 will exhibit much worse grayscale than a display with a maximum brightness of 2,000 nits turned down to 1,000, even if the displays have the same bit depth. The reason for this has to do with how the displays are powered. When judging our grayscale quality against that of competitors, we import onto each display the same photo depicting twelve trees in a field vanishing into the fog. We do these sorts of comparisons for the same reason Best Buy tunes all their televisions to the same channel: when the content is the same, differences stand out more visibly. The picture of foggy trees functions as an ideal litmus test for grayscale quality for a pair of reasons. First, it is a fairly generic image, and second, the foggy nature of the picture means that as grayscale quality changes, the number of trees visible changes as well. Customers should ask to see these comparisons, too.
While the brightness of the display plays a major factor in grayscale quality, grayscale is determined more fully by a display’s bit depth, as mentioned earlier. The bit depth of a display is essentially a representation of its processing capacity. All content shown on digital displays comes from a digital file, and these files are made up of pieces of data known as “bits,” each of which is either a 1 or a 0. The term bit depth refers to the number of bits used to display the color channels of red, green, and blue for each pixel in a given LED display. An 8-bit display can create over 16 million colors, while a 10-bit display can create over a billion.
Most people can only perceive about 10 million colors, so the industry standard for digital displays and cameras is a bit depth of 8 bits. High-end photography, cinema, and gaming can require higher bit depths to prevent any banding in over- or underexposed parts of their content, but for most purposes, 8 bits will be sufficient. Increasing bit depth will increase the number of colors your display can show, but it will also require greater energy consumption and greater processing power from your software.
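The color counts quoted above follow directly from the per-channel math: each channel gets 2^bits levels, and the three channels multiply together. A quick sketch:

```python
def color_count(bits_per_channel):
    """Total displayable colors for an RGB pixel: (2^bits)^3."""
    levels = 2 ** bits_per_channel   # shades per red/green/blue channel
    return levels ** 3

print(f"8-bit:  {color_count(8):,}")    # 16,777,216 (over 16 million)
print(f"10-bit: {color_count(10):,}")   # 1,073,741,824 (over 1 billion)
```

Each extra bit per channel multiplies the total color count by eight, which is why the jump from 8-bit to 10-bit is so large.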
While grayscale quality controls the tonality of your colors and bit depth controls the number of colors your display can possibly create, the color gamut of a display determines the spectrum within which all those colors exist. To explain how color gamut, or color space, works, let’s start simple: only a certain range of colors is perceptible to the human eye. To scientifically differentiate between the colors in this range, you need to introduce math. To graphically represent the differences between colors, the International Commission on Illumination (CIE) created, through a series of experiments, a 2-dimensional graph that plots every visible color by a pair of chromaticity coordinates (x and y, which together capture hue and saturation), with luminance factored out. This was first created in 1931, and is thus commonly referred to as the CIE 1931 Chromaticity Diagram. The diagram, as seen below, graphically represents the visible color space.
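For the curious, the coordinates plotted on the diagram are derived from CIE XYZ tristimulus values by dividing out overall luminance. A minimal sketch (the D65 values below are the standard published tristimulus values for that white point):

```python
def xy_chromaticity(X, Y, Z):
    """CIE 1931 xy chromaticity: normalize away overall luminance so
    only the 'color' of the light remains."""
    total = X + Y + Z
    return X / total, Y / total

# Standard D65 white point tristimulus values (normalized so Y = 1)
x, y = xy_chromaticity(0.95047, 1.0, 1.08883)
print(f"D65 plots at x = {x:.4f}, y = {y:.4f}")   # ~ (0.3127, 0.3290)
```

This is why two lights of very different brightness can land on the same point of the diagram: the coordinates describe only the color, not the intensity.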
Within the range of colors we can perceive, as plotted in the 1931 Diagram, there are smaller subsections of colors that digital displays can recreate. These subsections were standardized at the urging of content creators, who wanted uniformity in the way their content was displayed across varying display technologies and manufacturers. The first baseline color gamut for digital content was established in 1990 and dubbed Rec. 709. As the first standard gamut introduced, it covers the smallest portion of the visible color spectrum and is met by nearly every digital display product. As time went on and technologies improved, content creators demanded greater color space to show their work, so display manufacturers improved their technology to keep up. The newest standard gamut is called DCI-P3, and it was first published by the Society of Motion Picture and Television Engineers (SMPTE). DCI-P3 is wider and richer than Rec. 709. The next-generation color space is called Rec. 2020, but right now few if any digital technologies are able to reproduce it. Digital displays are getting better and better at recreating the colors of real life, pushing their native color spaces closer and closer to the limits of the visible color space. Still, less than half of people can even tell the difference between Rec. 709 and DCI-P3, so as long as a brand’s signature colors are represented true to form, displays used for basically anything other than cinema can get away with the common Rec. 709 gamut.
Often tethered to color space, the white point of a display determines its color temperature. White point settings are specified on the Kelvin scale; 6,500 Kelvin, corresponding to the standard illuminant D65, is the most common setting. One white point isn’t inherently better than another, and the reason to choose one over another depends nearly entirely on use case and use environment. This specification can be adjusted as needs change, and customers should make sure the white point they request aligns with their intended display environment.
Many of the aforementioned terms govern what an audience sees, but off-axis viewing governs where an audience sees it. Off-axis versatility is crucial because LED displays are often not viewed by a stationary audience. As viewers walk by a display, the amount of time they are able to see content is a function of the off-axis viewing capabilities of the display. For example, if a display only has an off-axis angle range of 90 degrees, or 45 degrees from center each way, a viewer will only be exposed to the content while within that range. Once they have moved beyond 45 degrees to the right or left of center, the display content is no longer cleanly visible. In contrast, a display with a viewing angle range of 160 degrees, or 80 degrees from center each way, allows for over four times the total visibility. This extra time is precious. In the image below, the pink region represents the visible area for a display with a 90 degree off-axis angle range, while the green region represents the additional area that becomes visible for a display with a 160 degree range.
In order to be effective, content needs to be seen. The better the off-axis ability of a display, the longer its content will be viewable, and the more likely it will be to influence audience members.
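The practical payoff is easy to quantify with a simple model. Assuming a viewer walking a straight path parallel to the display (the 5-meter distance below is hypothetical), the stretch of that path from which content is cleanly visible grows with the tangent of the half-angle, so widening the cone from 90 to 160 degrees buys far more than a proportional gain:

```python
import math

def visible_path_length(viewing_angle_deg, distance_m):
    """Length of a walking path parallel to the display, at the given
    distance, that falls inside the display's viewing cone."""
    half_angle = math.radians(viewing_angle_deg / 2)
    return 2 * distance_m * math.tan(half_angle)

d = 5.0   # meters from the display (hypothetical)
narrow = visible_path_length(90, d)    # 10.0 m
wide = visible_path_length(160, d)     # ~56.7 m
print(f"90°: {narrow:.1f} m | 160°: {wide:.1f} m | ratio: {wide / narrow:.1f}x")
```

Under this simple model the gain is roughly 5.7x; the exact multiple depends on the geometry of the space, which is why "over four times" is a reasonable ballpark claim.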
Beyond the major secondary specifications detailed above, there are further, more nuanced terms that customers can dive into if they feel so inclined. Without dedicating too much time to these more granular details, here are a few brief descriptions.
Often confused with pixel pitch, which is the distance between the centers of adjacent pixels, fill ratio measures the blank space between the closest edges of adjacent pixels. Essentially, it measures the empty space of the display. A lower fill ratio can make the pixel pitch seem smaller without actually changing it. A higher fill ratio, on the other hand, can make the pitch seem wider, but it facilitates a better contrast ratio because there is more black visible against which the pixels can stand out. Lower fill ratios often correspond with larger pixels, all else equal, and the larger a pixel is, the more expensive it will be.
The frame rate of a display indicates the number of frames per second it can show. The higher the frame rate, the larger the digital file for the content will be and the greater the demands it will place on your display. A common frame rate is 24 frames per second (fps), but customers should expect their digital display to be able to process around 60 fps. Some newer processors can handle 120 fps, but this sort of performance only really comes into play with sophisticated gaming and high-end cinematography.
The British Thermal Units (BTUs) produced by a display are a function of its heat output efficiency. One BTU is the amount of heat needed to raise the temperature of one pound of water by one degree Fahrenheit, roughly 1,055 joules. The efficiency of a diode and its circuitry determines the heat output of the LED driver board, while the efficiency and loading of the power supply determines the heat output of its components.
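For HVAC planning, a handy rule of thumb is that nearly all of a display's electrical draw ultimately ends up as heat in the room, and one watt sustained for an hour is about 3.412 BTU. A minimal sketch (the 2 kW draw below is a hypothetical figure, not a product specification):

```python
WATT_TO_BTU_PER_HOUR = 3600 / 1055.06   # 3,600 J per watt-hour / ~1,055 J per BTU = ~3.412

def btu_per_hour(power_draw_watts):
    """Worst-case heat load estimate: treat the display's full power
    draw as heat delivered to the room."""
    return power_draw_watts * WATT_TO_BTU_PER_HOUR

print(f"{btu_per_hour(2000):,.0f} BTU/hr")   # a hypothetical 2 kW display -> ~6,824 BTU/hr
```

Sizing cooling against the full power draw errs on the safe side; a more efficient display simply leaves more headroom.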
Today, there is a wide variety of large-format display options available, each with their own positives and negatives. How can you easily navigate these waters and understand which technology makes sense for your intended purpose?
First, some background: LED displays use discrete red, green and blue light-emitting diodes (LEDs) for each pixel (“dot” on the screen). These shouldn’t be confused with LED-backlit LCD displays, such as “LED” televisions, which are still LCD (liquid crystal display), but with an LED backlight. How are these technologies different? For starters, an LED display has virtually no limitations of size or shape, whereas LCD displays generally must conform to the native 16:9 format, either portrait (vertical) or landscape (horizontal), or some grouping of these shapes into a video wall. Second, each LCD display can be high-resolution (up to 1080p), giving the entire video wall a very high theoretical resolution, but this comes at a cost. Even “zero bezel” LCD displays still have bezel gaps between the displays. It is also difficult to calibrate and maintain calibration between all of the LCD displays (often nine or more individual displays).
So what about video projection? Projection comes in two “flavors”: rear projection (the image is projected onto the back side of the screen) and front projection (projected onto the front, like a conventional movie theater). Generally speaking, projection is the least expensive option, especially at large sizes. But it certainly isn’t without complications! Both front and rear projection require reasonably light-controlled rooms, with no direct sunlight. This can be overcome to a small degree by “ganging up” two or more projectors to overlay the same image at a higher brightness, but this can only help so much, and it causes the cost to increase dramatically. Front projection must have an unimpeded light path to the screen, meaning it generally must hang from a ceiling. Rear projection requires a “throw distance”, or a space behind the screen defined by the size of the screen image. This could chop hundreds of square feet of usable space out of your location! Add in the high cost of replacement bulbs (sometimes hundreds of dollars each), plus the labor to change the bulbs and to recalibrate and realign multiple projectors, and these costs shouldn’t be ignored when looking at the total cost of operation.
Self-contained projection units (“tiles” or “mosaic display”) are another option for large format displays. These are essentially small rear projection displays in a single unit that can be joined together to create much larger displays. These systems can look very nice, but their primary drawbacks are cost, weight, thickness, visible gaps between units and uniformity between modules, especially in aging displays.
That leaves us with LED displays. Are they the right choice for your application? They may well be, but we should begin by listing the places where an LED display is not a good choice. How far away is your expected viewer? Are they expected to be near? The exact distance varies by size and application, but as a general rule, inside 10-12 feet is too close for most LED displays. Does your application need to be touchscreen? This is also not a good fit for most LED displays, although there could be exceptions. Outside of these considerations, an indoor-specific LED
display such as NanoLumens’ NanoSlim line is an excellent choice. These displays are bright, even in direct sunlight, thin and lightweight for architectural considerations, use relatively small amounts of power for their size, produce almost zero heat or noise, can be formed into concave or convex curves, and fit into an almost unlimited variety of shapes and sizes!
LED Displays have come of age. As companies turn to digital signage for their messaging, competition in the industry has become more aggressive. Before making an investment decision there are several key questions buyers should ask: Are the LED displays available in any size, shape or curvature? This is important if space is limited or challenging. Is the color evenly distributed? It is normal for an LED to degrade in color over time, but the display color should remain constant across the screen. With the use of color calibration at NanoLumens, all displays are guaranteed to be the best quality.
LED displays are organized in an array format. Some arrangements can be found in rows and columns, and others by blocks. This is because LED manufacturers create and produce LEDs by grouping them through a process called binning. When an LED manufacturer bins LEDs, they group them by their common qualities — usually by light intensity and color wavelength. However, when a display combines multiple “bins” of LEDs, each bin will have a different quality. Individual bins will be relatively consistent, but as a whole, the display will look blotchy.
“Many times, batches of LEDs can be faulty and show obvious signs of color variation,” says Rick Morrison, Senior Electrical Engineer at NanoLumens. The result is an LED display that looks sloppy and unmatched. Even to the untrained eye, a display that has not been calibrated is often quite noticeable, with poorly distributed color.
To have color consistency throughout a display, color calibration should be done. Because all LEDs have different binning ranges, calibration finds a common value of brightness and intensity across the LEDs. For example, if one bin of LEDs has a luminance of 700 nits and another bin has 1,000 nits, the display should be adjusted to the base value of 700 nits for evenness. Think of it like a camera: optical equipment digitally processes images to gather data for each pixel on the display, then generates a coefficient to apply to the LEDs to make them brighter or dimmer. When an LED display shows a solid color, usually white, calibration can make an obvious change for the better. In short, calibration will always make a display look better.

NanoLumens ensures quality displays through its calibration process. After NanoLumens puts its displays together, we don’t just rely on the LED manufacturers’ binning process; we go the extra mile to establish color maintenance. Our process for calibrating displays is unparalleled in the large-scale visualization industry. NanoLumens calibrates displays through a series of images taken with a high-end digital camera that are then synced to software that analyzes the display pixels’ brightness and color. The software breaks the image into partitions and focuses on color and intensity. “NanoLumens’ software creates a database in which we can adjust the brightness of all LEDs to the same level,” says Jammie Proctor, Electrical Engineer at NanoLumens. Our software adjusts the LEDs by calculating a coefficient that can then be applied to the display. This coefficient brings the LEDs to an even level, eradicating any blotchiness found on a solid color background. Calibration brings uniformity to displays. With calibration, customers need not worry how their display will look with different types of images.
Any picture they choose, even a solid color, will look amazing. At times, manufacturers bin LEDs very well, rendering calibration unnecessary, but calibration will always guarantee consistency.
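The coefficient idea described above can be sketched in a few lines. Assuming hypothetical per-module luminance measurements (in nits, as a camera-based system might report them), each module is scaled toward the dimmest one, mirroring the 700-vs-1,000-nit example:

```python
def calibration_coefficients(measured_nits):
    """Scale factor per module: dim every module to match the dimmest
    measurement so the wall reads as one uniform surface."""
    target = min(measured_nits.values())
    return {name: target / nits for name, nits in measured_nits.items()}

# Hypothetical readings for three modules of one wall
readings = {"module_A": 700, "module_B": 1000, "module_C": 850}
print(calibration_coefficients(readings))
# module_A keeps coefficient 1.0; brighter modules are dimmed toward 700 nits
```

Dimming toward the lowest measurement sacrifices a little peak brightness in exchange for uniformity, which is usually the right trade for a video wall; production calibration systems also correct per-channel color, not just overall brightness.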