Glossary:Graphics card

From PCGamingWiki, the wiki about fixing PC games
{{Video settings sidebar}}
The '''graphics card''', or video card, is a system component that houses, amongst other things, the GPU (Graphics Processing Unit), the video BIOS, and the various outputs that connect the card to the monitor. The GPU is the processor on the graphics card that renders and outputs video from the computer to the screen. While the graphics card and GPU are closely related, the two terms are not interchangeable: the GPU is a component of the graphics card.
 
  
== Graphics Card & GPU Manufacturers ==

Computers have always had to display their output in some way, and for a long time this was done through a text-only interface; only on later systems did the default interface become graphical. Early systems had built-in dedicated hardware for drawing the interface and graphics. Once DirectX and 3D gaming became popular, more powerful hardware was needed to render games at acceptable speeds, if at all. This is where graphics cards from 3dfx, Nvidia, and ATI (later bought by AMD) came into play: add-on cards housing powerful graphics processing units and their own memory, which added graphics capability to systems that did not have it previously.

=== Graphics Processing Units ===

The three major consumer GPU manufacturers are NVIDIA, AMD (formerly ATI)<ref>[[Wikipedia:ATI Technologies#History]]</ref>, and Intel, marketed under the GeForce, Radeon, and GMA/HD Graphics brands respectively. NVIDIA and AMD products are then utilised by graphics card manufacturers in the construction of complete cards.
 
  
==== AMD ====

AMD produce processors with comparatively high-end integrated mobile graphics known as APUs (Accelerated Processing Units). These are a good option for low-budget builds.
  
==== Intel ====

Intel produce integrated GPUs as components of their CPUs.<ref>http://www.intel.com/content/www/us/en/architecture-and-technology/hd-graphics/hd-graphics-developer.html</ref> Initial offerings in the consumer GPU space were under the Intel Graphics Media Accelerator (GMA) brand, found on mainboards and serving only to provide basic video functionality.<ref>http://en.wikipedia.org/wiki/Intel_GMA</ref> Near the end of the GMA brand's life, Intel's chipset-integrated chips began to compete with older, very basic GPUs from Nvidia and AMD, and could play old games at reduced settings. Intel has since moved away from chipset-integrated graphics; with the Core i3/i5/i7 line of chips it began offering GPUs built into the CPU die. The HD Graphics 3000 is found on some Sandy Bridge i5 and i7 processors, whilst the HD Graphics 4000 is exclusive to Ivy Bridge chips and, amongst other things, includes support for DirectX 11.<ref>http://www.notebookcheck.net/Intel-HD-Graphics-4000-Benchmarked.73567.0.html</ref>

Intel has produced a [http://www.intel.com/support/graphics/intelhdgraphics3000_2000/sb/CS-032052.htm list of games] that run on HD 3000 integrated graphics.
==Switchable graphics==

Gaming and workstation laptops come with dedicated graphics processors that are normally used when running an intensive workload or game. This is not always reliable: the operating system has to decide whether a task is intensive enough to warrant the dedicated processor, which can lead to some tasks or games running poorly on the integrated GPU.

Systems can rely on several mechanisms to decide which processor to use, such as using the dedicated GPU when plugged in and the integrated GPU when on battery. Nvidia's Optimus technology is one such example of switchable graphics.
  
== Dedicated versus integrated graphics ==

Graphics come in two forms, dedicated and integrated. On modern computers both are capable of running games, but they differ greatly. Dedicated graphics have a more powerful processor and their own memory (VRAM), separate from the CPU and system RAM, which offers better performance. They are usually separate cards added to a computer, whilst laptops and other form factors can have built-in dedicated graphics. Integrated graphics are part of the CPU or motherboard, removing the need for an additional card at the expense of performance. Memory is usually shared with the computer's RAM, apart from a small amount of dedicated video memory, which reduces the total usable memory the system has. Dedicated graphics typically outperform integrated graphics on comparable computers, although advances in integrated graphics are catching up with their dedicated competitors.

The most well known and common integrated graphics system is made by Intel for use with their processors. AMD (formerly ATI) also have integrated Radeon lines for AMD processors.
Some laptops include [[#Switchable graphics|switchable graphics]], having both an integrated chipset for basic computing and a dedicated GPU for gaming.

== Identifying the graphics card ==

=== Checking the physical card ===

Graphics cards have information either printed on them or on a sticker which helps with identification.

=== Using DirectX Diagnostics ===

{{Fixbox|description=Run dxdiag|ref=<ref>{{Refcheck|user=Mine18|date=2024-02-16}}</ref>|fix=
For Windows systems:
# Run the DirectX Diagnostic Tool: type <code>dxdiag</code> into the Start search (or the Run dialog on older versions of Windows) and press Enter.
# Go to the Display tab. The computer's video card is listed there.
# On a device with switchable graphics, the dedicated GPU may be listed on the Render tab instead.
}}
 
 
[[Image:NVIDIAControlPanelSystemInfo.png|300px|right|thumb|Using the NVIDIA Control Panel to obtain detailed graphics card & GPU information.]]
=== Using [[GPU-Z]] ===

# Go to [http://www.techpowerup.com/gpuz/ TechPowerUp's website].
# Download the latest GPU-Z and install it.
# Open GPU-Z.

GPU-Z provides far more information than dxdiag and is also useful for monitoring voltages and temperatures.
 
 
 
=== Using the [[NVIDIA Control Panel]] ===
 
If you have an NVIDIA-branded GPU, you can use the NVIDIA Control Panel to obtain detailed information about your graphics card. Open the program and click 'System Information' in the bottom left-hand corner for the full specifications of your graphics card.
 
 
 
=== Using the [[AMD VISION Engine Control Center]] ===
 
If you have an AMD-branded GPU, you can use the AMD VISION Engine Control Center (formerly Catalyst Control Center) to obtain detailed information about your graphics card. Open the program, click 'Information' in the bottom left panel, and select Hardware for the full specifications of your graphics card.
 
  
== RAM Type ==

There are four common types of GPU RAM: DDR, DDR2, GDDR3, and GDDR5. Broadly speaking, each generation offers roughly double the memory bandwidth of the one before it.
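The reason memory generation matters is bandwidth. Peak theoretical bandwidth follows from clock speed, bus width, and transfers per clock; the sketch below uses the standard formula, but the example clock and bus width are illustrative figures, not taken from any specific card.

```python
def memory_bandwidth_gb_s(memory_clock_mhz: float, bus_width_bits: int,
                          transfers_per_clock: int) -> float:
    """Peak theoretical memory bandwidth in GB/s.

    transfers_per_clock: 2 for DDR/DDR2/GDDR3 (double data rate),
    4 for GDDR5 (quad-pumped).
    """
    bytes_per_transfer = bus_width_bits / 8
    transfers_per_second = memory_clock_mhz * 1e6 * transfers_per_clock
    return transfers_per_second * bytes_per_transfer / 1e9

# Illustrative: a 1500 MHz GDDR5 chip on a 256-bit bus.
print(memory_bandwidth_gb_s(1500, 256, 4))  # 192.0 GB/s
```

Doubling the data rate at the same clock and bus width doubles the result, which is the sense in which each generation is "twice as fast".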
== Multiple GPUs ==

Some graphics cards can be used in tandem with up to three other graphics cards to boost overall output. With Nvidia cards this technique is called [[SLI]]; with AMD/ATI cards it is called [[Crossfire]]. Nvidia's technology requires video cards of exactly the same type (e.g. a GTX 760 paired with another GTX 760), while AMD/ATI's technology requires cards from the same series (e.g. an HD 7970 can be combined with an HD 7950). The benefit has always been variable and does not give a perfect 2x performance boost, if any boost at all. Over time the feature was deprioritised until being dropped entirely from Nvidia cards starting with the RTX 4000 series.<ref>{{Refurl|url=https://youtu.be/sKHrLA2whNo|title=A Farewell To SLI|date=2024-02-16}}</ref>

== Overclocking ==

Overclocking is running the GPU at speeds beyond what the manufacturer recommends. Performed improperly it can damage the GPU; done properly it can provide a notable increase in performance. The gain depends on the type of GPU and on the workload or game, so it is recommended to look up information online to find the "sweet spot" for overclocking a particular GPU.

Overclocking can be done through software such as EVGA's Precision X, MSI's Afterburner, or AMD's Radeon Software.

'''Please note that overclocking will usually void the warranty and that it increases the chance of a GPU malfunctioning. Overclocking is done at the risk of the user and is not recommended for novices.'''
 
 
 
== Graphics Settings ==
 
Most games allow graphical settings to be adjusted.
 
 
 
=== Anisotropic filtering ===
 
Also known as AF. The main purpose of AF is to sharpen the appearance of textures that are farther away from the player. It can noticeably increase visual quality, but can also be resource intensive. More information on AF can be found [http://www.extremetech.com/computing/78546-antialiasing-and-anisotropic-filtering-explained/6 here].
 
 
 
[[Anisotropic Filtering (AF)]]
 
 
 
=== Anti-Aliasing ===
 
Also known as AA. The main purpose of AA is to reduce "jaggies", the pixelated edges along objects in a game. A slightly more in-depth description can be found [http://www.extremetech.com/computing/78546-antialiasing-and-anisotropic-filtering-explained/2 here].
 
 
 
[[Anti-Aliasing (AA)]]
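One brute-force form of anti-aliasing, supersampling, renders the scene at a higher resolution and averages blocks of pixels down to the display resolution. A minimal sketch of the resolve step for 4x supersampling on a grayscale image, using plain lists rather than any graphics API:

```python
def downsample_2x2(image):
    """Average each 2x2 block of a grayscale image (a list of rows)
    into one output pixel - the resolve step of 4x supersampling."""
    out = []
    for y in range(0, len(image), 2):
        row = []
        for x in range(0, len(image[y]), 2):
            block = (image[y][x] + image[y][x + 1] +
                     image[y + 1][x] + image[y + 1][x + 1])
            row.append(block / 4)
        out.append(row)
    return out

# A hard black/white edge becomes a softened gray at the boundary.
edge = [[0, 0, 255, 255],
        [0, 0, 255, 255],
        [0, 255, 255, 255],
        [0, 255, 255, 255]]
print(downsample_2x2(edge))  # [[0.0, 255.0], [127.5, 255.0]]
```

The intermediate 127.5 value is what smooths the jagged edge; real GPUs do the same averaging in hardware over subpixel samples.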
 
 
 
=== High Dynamic Range ===
 
Also known as HDR. This increases the range of contrast, making the image much closer to how the human eye perceives light.
 
 
 
=== Tessellation ===
 
A newer technique for [[DirectX]] 11. Tessellation uses the GPU to increase the complexity of the polygon mesh. This can infamously be seen in [http://techreport.com/articles.x/21404/2 Crysis 2's concrete slabs].
 
 
 
=== Render Distance ===
 
Render Distance or View Distance is how far in the game world you can see. This can usually be increased to see farther in the game at the cost of performance. At lower settings, distant areas are covered by fog.
 
 
 
=== Vertical Sync (Vsync) ===
 
[[Vertical Sync (Vsync)|Vsync]] limits a game's frame rate to the monitor's refresh rate or an integer fraction of it, which reduces screen tearing. For example, on a 60 Hz monitor Vsync limits the frame rate to 60, 30, 20, 15, etc. frames per second. On systems that can sustain the monitor's full refresh rate, Vsync should be unnoticeable; however, if the graphics card can only achieve, say, 55 FPS on a 60 Hz monitor, the frame rate will drop to 30 FPS for as long as the achievable frame rate stays below 60. An in-depth explanation can be found [http://hardforum.com/showthread.php?t=928593 here].
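The drop to an integer fraction of the refresh rate can be sketched as follows. This is a simplified model of strict double-buffered Vsync; triple buffering and adaptive sync technologies behave differently.

```python
def vsync_frame_rate(refresh_hz: int, achievable_fps: float) -> float:
    """Effective frame rate under strict double-buffered Vsync:
    the highest refresh_hz / n that the GPU can still sustain."""
    n = 1
    while refresh_hz / n > achievable_fps:
        n += 1
    return refresh_hz / n

print(vsync_frame_rate(60, 55))   # 30.0 - just missing 60 FPS halves the rate
print(vsync_frame_rate(60, 100))  # 60.0 - capped at the refresh rate
print(vsync_frame_rate(60, 25))   # 20.0
```

The first case is the painful one described above: a card capable of 55 FPS is held at 30 FPS because 60 is out of reach and 30 is the next step down.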
 
  
 
==GPU scaling==

'''GPU scaling''' allows the GPU to determine how non-native resolutions are displayed on the display. If scaling is instead configured to be performed on the display, the monitor's video scaler determines it. See the [[Glossary:Scaling#GPU/Display scaling|glossary page]] for more information.

{{ii}} If the display has a special built-in scaling mode it may need to be disabled for GPU scaling to work (many TVs and some monitors have this feature).
{{ii}} Some TVs and other non-monitor displays may show black borders on widescreen resolutions. GPU scaling does not affect this; see [[#Overscan|Overscan]] for solutions.

{|align=center
|width=33% valign=top style="padding:8px;"|[[File:GPU scaling (fullscreen).png|200px|center|GPU scaling: Fullscreen / scaled]]
<center>'''Fullscreen / scaled'''</center>
In this mode the output stretches to fit the monitor, often with unwanted results (e.g. fat characters). Some non-widescreen games have a setting for use with this mode to make the stretched output have the correct widescreen aspect ratio.
|width=33% valign=top style="padding:8px;"|[[File:GPU scaling (aspect).png|200px|center|GPU scaling: Maintain aspect ratio]]
<center>'''Maintain aspect ratio'''</center>
In this mode the output expands to the biggest size possible while retaining its original aspect ratio. The unused space is left black.
|width=33% valign=top style="padding:8px;"|[[File:GPU scaling (centered).png|200px|center|GPU scaling: Centered / no scaling]]
<center>'''Centered / no scaling'''</center>
In this mode the output displays at its original resolution. Graphics are sharp and have the correct aspect ratio, but the result may be very small depending on the resolution of the output and the monitor.
|}
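The three modes pictured above differ only in the rectangle the image is mapped onto. A sketch of the geometry (an illustrative helper, not any driver's actual API):

```python
def scaled_rect(src_w, src_h, dst_w, dst_h, mode):
    """Return (x, y, w, h) of the image on a dst_w x dst_h screen."""
    if mode == "fullscreen":          # stretch to fill, may distort
        w, h = dst_w, dst_h
    elif mode == "aspect":            # largest fit keeping aspect ratio
        scale = min(dst_w / src_w, dst_h / src_h)
        w, h = round(src_w * scale), round(src_h * scale)
    elif mode == "centered":          # no scaling at all
        w, h = src_w, src_h
    else:
        raise ValueError(mode)
    return ((dst_w - w) // 2, (dst_h - h) // 2, w, h)

# An 800x600 (4:3) image on a 1920x1080 (16:9) monitor:
print(scaled_rect(800, 600, 1920, 1080, "fullscreen"))  # (0, 0, 1920, 1080)
print(scaled_rect(800, 600, 1920, 1080, "aspect"))      # (240, 0, 1440, 1080)
print(scaled_rect(800, 600, 1920, 1080, "centered"))    # (560, 240, 800, 600)
```

In aspect mode the 240-pixel offsets are the black pillarbox bars; in centered mode the image simply sits in the middle at its original size.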
 
===AMD/ATI===
 
[[File:GPU scaling settings (AMD).png|right|200px]]
 
{{Fixbox|1=
 
{{Fixbox/fix|Set GPU scaling}}
 
{{ii}} You must first set the desktop to a non-native resolution otherwise the settings will be greyed out.
 
# Open the AMD Vision Engine Control Center or Catalyst Control Center (depending on your card).
 
# From the main view choose My Digital Flat Panels.
 
# Choose Properties (Digital Flat-Panel).
 
# Choose Enable GPU scaling.
 
# Choose the setting you want.
 
# Click Apply.
 
{{ii}} Your screen may temporarily go black while the new mode is being applied.
 
}}
 
===NVIDIA===
 
[[File:GPU scaling settings (NVIDIA latest).png|right|200px]]
 
{{Fixbox|1=
 
{{Fixbox/fix|Set GPU scaling}}
 
{{ii}} NVIDIA Optimus GPU scaling is controlled by the Intel driver; see [[#Intel|Intel]] for details.
 
{{ii}} Built-in scaling is not supported by some displays (e.g. laptops) so will be greyed out in those cases.
 
# Open the NVIDIA Control Panel.
 
# Open the Display section (if it is closed).
 
# Choose Adjust desktop size and position.
 
# Choose your display (if you have more than one).
 
# Choose the setting you want.
 
}}
 
===Intel===
 
[[File:GPU scaling settings (Intel latest).jpg|right|200px]]
 
{{Fixbox|1=
 
{{Fixbox/fix|Set GPU scaling}}
 
# Launch the Intel Graphics Control Panel.
 
# Choose Display Settings or General Settings (depending on the driver version).
 
# Change the Scaling or Display Expansion option accordingly (depending on the driver version).
 
# Click Apply.
 
{{ii}} On some older Intel drivers this setting is behind an Aspect Ratio Options button.
 
}}
 
<br clear="all"/>
 
==Overscan==
:[[Wikipedia:Overscan|Overscan article on Wikipedia]]

'''Overscan''' and '''underscan''' refer to the behavior of certain television sets and displays showing the image incorrectly, typically as a result of mismatched configurations or expectations between the TV/display and the graphics card that sends out the video signal. The issue originates from early analogue televisions having quite loose manufacturing standards, and from the differing<ref>[https://mjg59.dreamwidth.org/8705.html TVs are all awful | mjg59]</ref> solutions that TV producers came up with to counteract these differences. While the [[Wikipedia:Overscan|overscan article on Wikipedia]] covers the subject in more detail, the important point is that the solution TV producers settled on involved adding black borders around the actual image: instead of the video signal containing only the intended image, it would also include black borders around that image.

* '''Underscan''' refers to a TV/display showing the black borders that the source device added to the video signal. The receiving display ends up showing these borders because it did not expect them to be part of the signal, or because it expected them to be smaller than they were sent.
* '''Overscan''' refers to a TV/display cropping parts of the actual image. The receiving display does this because it expected additional black borders around the image to be part of the video signal, when in fact the signal either did not include black borders at all or included smaller ones than expected.

The solution to both scenarios is to tweak either or both devices so that their configurations match. For a crisper result, disable or tweak the setting on the display if at all possible before implementing overscan correction on the graphics card side.

'''How-to's'''
* [https://nvidia.custhelp.com/app/answers/detail/a_id/2593/~/how-do-i-setup-my-nvidia-based-graphics-card-to-work-with-my-hdtv%3F NVIDIA cards]
* [https://superuser.com/a/64264 AMD/ATI]
* [http://www.vidabox.com/forum/showthread.php?t=778 Intel]
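To get a feel for how much image an overscanning TV can eat, the sketch below computes the visible region for a given crop fraction per edge. The 5% default is a commonly cited ballpark for older TVs, not a fixed standard.

```python
def visible_after_overscan(width, height, crop_fraction=0.05):
    """Resolution remaining after a TV crops crop_fraction from
    each edge (overscan). Returns (visible_w, visible_h)."""
    visible_w = round(width * (1 - 2 * crop_fraction))
    visible_h = round(height * (1 - 2 * crop_fraction))
    return visible_w, visible_h

print(visible_after_overscan(1920, 1080))  # (1728, 972) - 10% of each axis lost
```

This is why on-screen text near the edges of a game's HUD can vanish entirely on an uncorrected TV.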
 
 
 
 
  
===Custom resolutions===

{{ii}} Custom resolutions allow running games at resolutions the monitor can't normally display.
 
{{ii}} This can be used for downsampling [[Anti-aliasing (AA)|anti-aliasing]]; it is also useful for making custom 4:3 resolutions for games that stretch from 4:3 with normal widescreen resolutions.
 
{{ii}} [[#GPU scaling|GPU scaling]] must be enabled and set to "Maintain aspect ratio".
 
{{--}} This doesn't work for Intel graphics.
 
 
 
{{Fixbox|1=
 
{{Fixbox/fix|Use Custom Resolution Utility}}
 
# Download and run the [http://www.monitortests.com/forum/Thread-Custom-Resolution-Utility-CRU Custom Resolution Utility].
 
# Click the Add button under detailed resolutions.
 
# Change timing to Automatic - LCD Standard.
 
# Fill in the horizontal, vertical and refresh rate boxes (refresh rate is usually 60).
 
# Click OK. Click OK again to close the program.
 
# Restart your computer (required).
 
# Test it by temporarily making it the Windows desktop resolution; if it works there it will work for games.
 
# Some games won't detect the custom resolution so you may need to set it manually in a configuration file.
 
{{ii}} If the output is skewed or out of range ensure [[#GPU scaling|GPU scaling]] is enabled and set to "Maintain aspect ratio".
 
}}
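For the custom 4:3 use case mentioned above, the resolution to enter is easy to derive: keep the monitor's native vertical resolution and compute the width from the target aspect ratio. A small helper to sketch the arithmetic (illustrative only, not part of the Custom Resolution Utility):

```python
from fractions import Fraction

def custom_resolution(native_h, aspect_w=4, aspect_h=3):
    """Width x height for a custom mode that keeps the monitor's
    native height but uses the given aspect ratio."""
    width = native_h * Fraction(aspect_w, aspect_h)
    if width.denominator != 1:
        raise ValueError("height not evenly divisible; pick another height")
    return int(width), native_h

print(custom_resolution(1080))          # (1440, 1080) - 4:3 on a 1080p monitor
print(custom_resolution(1200, 16, 10))  # (1920, 1200)
```

With GPU scaling set to "Maintain aspect ratio", a 1440x1080 mode on a 1080p monitor renders 4:3 games pillarboxed instead of stretched.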
 
  
== General Information ==

* [[Wikipedia:Graphics processing unit|GPU article on Wikipedia]]

==Glide Emulation==

The 3dfx Voodoo card was the first true 3D-accelerated video card; prior cards simply increased the number and sizes of available display modes and/or the available color depth. It used its own API known as Glide, itself essentially a subset of the OpenGL API. After 3dfx's acquisition by Nvidia the Glide API was abandoned, and Nvidia did little to add support for it. Fortunately it can be emulated through various tools:
* [https://www.zeus-software.com/downloads/nglide nGlide (recommended)]
* [http://dege.freeweb.hu DgVoodoo 2]
* [http://www.zeckensack.de/glide/ Zeckensack's Glide Wrapper]
* [https://sourceforge.net/projects/openglide/ OpenGLide]

{{References}}

[[Category:Introduction]]

Latest revision as of 20:55, 16 February 2024