Modern computers often come with two types of graphics processors: integrated graphics (built into the CPU) and dedicated graphics cards (such as those from NVIDIA or AMD). Integrated graphics are perfect for light tasks like web browsing or watching videos, but for top gaming performance, video editing, or 3D rendering, you’ll want your system to use the dedicated GPU.
However, sometimes your computer doesn’t switch between the two automatically, or keeps using the integrated graphics for demanding applications, which can slow performance. In such cases, disabling integrated graphics manually can help.
In this comprehensive guide, we’ll walk you through how to disable integrated graphics using both Windows and BIOS/UEFI settings. We’ll also explore why you might want to disable them, potential risks, and tips for ensuring a smooth transition to dedicated graphics.
TL;DR
Disabling integrated graphics ensures your system uses the dedicated GPU for all graphical tasks, improving performance and reducing driver conflicts. You can disable integrated graphics via Device Manager in Windows or through your system’s BIOS/UEFI settings. Always confirm your monitor is connected to the dedicated GPU before making changes, and be cautious on laptops where integrated graphics help conserve battery life.
What Are Integrated Graphics?
Integrated graphics are GPUs built directly into your computer’s processor (CPU). They share the system’s RAM and power, which makes them energy-efficient but less powerful than dedicated graphics cards.
For example, Intel CPUs include Intel UHD or Iris Xe Graphics, while AMD CPUs may feature Radeon Vega Graphics. They’re ideal for everyday computing but not meant for high-end gaming or professional graphic workloads.
A dedicated GPU, on the other hand, has its own memory (VRAM) and processing power, designed specifically for demanding visual tasks. Disabling integrated graphics forces your system to use the dedicated card full-time for maximum performance.
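Not sure which GPUs your machine actually has? You can list them from Windows itself before changing anything. Below is a minimal Python sketch (Windows only) that shells out to PowerShell and queries the built-in Win32_VideoController WMI class; the adapter names it prints depend on your hardware.

```python
# List every GPU Windows can see, with its driver version.
# Windows only; uses the built-in Win32_VideoController WMI class.
import subprocess

result = subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "Get-CimInstance Win32_VideoController | "
     "Select-Object Name, DriverVersion | Format-Table -AutoSize"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```

If the output shows two adapters (for example, an Intel UHD entry alongside an NVIDIA or AMD Radeon entry), your system has both an integrated and a dedicated GPU.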
Why Disable Integrated Graphics?
Disabling integrated graphics isn’t always necessary, but there are several compelling reasons to do so:
Boost Gaming or Rendering Performance:
Some systems split tasks between GPUs, but disabling the integrated one ensures your dedicated card handles everything.
Prevent Auto-Switching Issues:
Windows sometimes uses integrated graphics for certain apps even when a dedicated GPU is available. Disabling the integrated GPU fixes this.
Reduce Driver Conflicts:
Running both GPUs can occasionally cause crashes, black screens, or performance bugs, especially in older laptops or PCs.
Free Up Shared Memory:
Integrated graphics reserve part of your system’s RAM. Disabling them may slightly increase available memory for other tasks.
Thermal Management:
Running only one GPU can make heat output easier to manage on some systems. (On laptops, note that the integrated GPU usually saves power rather than adding heat.)
Multi-Monitor Setup:
Simplifies configuration when using multiple displays connected to a dedicated GPU.
However, disabling integrated graphics should only be done if you’re confident your dedicated GPU is functioning properly and compatible with your system.
Before You Disable Integrated Graphics
Before jumping into the steps, keep a few important things in mind:
- Make sure your dedicated GPU is installed properly. Check that your graphics card is seated in the PCIe slot and connected to the monitor.
- Install the latest GPU drivers. Download and install drivers from NVIDIA, AMD, or your laptop manufacturer’s official website.
- Connect your display to the GPU output port. If your monitor cable is still plugged into the motherboard’s HDMI or DisplayPort, you won’t get any video output after disabling integrated graphics.
Once you’ve ensured all these, you’re ready to proceed.
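If you prefer to double-check from the command line, here is a small Python sketch (Windows only) using the built-in Get-PnpDevice cmdlet. It lists each display adapter’s status and instance ID; the instance ID also comes in handy later if you disable the device from the command line instead of Device Manager.

```python
# Pre-flight check: list display adapters with their status and instance IDs.
# Windows only; Get-PnpDevice ships with modern Windows PowerShell.
import subprocess

out = subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "Get-PnpDevice -Class Display | "
     "Select-Object Status, FriendlyName, InstanceId | Format-Table -AutoSize"],
    capture_output=True, text=True, check=True,
).stdout
print(out)

# Proceed only if your dedicated GPU (the NVIDIA or AMD Radeon entry)
# is listed with Status "OK".
```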
Method 1: Disable Integrated Graphics via Device Manager (Windows)
This method is straightforward and doesn’t require entering BIOS. It’s ideal for users who want to quickly switch off the integrated GPU.
Step-by-Step Instructions:
- Open Device Manager:
  - Press Windows + X and select Device Manager.
  - Alternatively, type “Device Manager” in the Start menu search bar.
- Expand Display Adapters:
  - Click the arrow next to Display adapters to view all installed GPUs.
- Identify the Integrated GPU:
  - Look for entries like Intel UHD Graphics, Intel Iris Xe, or AMD Radeon Vega.
  - Your dedicated GPU will typically be labeled NVIDIA or AMD Radeon.
- Disable the Integrated GPU:
  - Right-click the integrated GPU and select Disable device.
  - Confirm the action when prompted.
- Restart Your Computer:
  - This ensures the changes take effect and your system defaults to the dedicated GPU.
Important Note: If you disable the integrated GPU and your monitor is connected to its output port, you may lose display functionality. Always ensure your monitor is connected to the dedicated GPU’s output.
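For those comfortable with the command line, recent Windows 10 and 11 builds include pnputil, which can do the same thing as Device Manager. The sketch below is a hedged example: it must be run from an elevated (Administrator) prompt, and the instance ID is a placeholder you must replace with the integrated GPU’s real ID from the enumeration output.

```python
# Command-line equivalent of Method 1, via the built-in pnputil tool.
# Requires a recent Windows 10/11 build and an Administrator prompt.
# Run once to see the adapter list, then fill in INSTANCE_ID and run again.
import subprocess

# List display adapters and their instance IDs.
subprocess.run(["pnputil", "/enum-devices", "/class", "Display"], check=True)

# Placeholder: replace with the integrated GPU's instance ID from the output.
INSTANCE_ID = ""  # e.g. r"PCI\VEN_8086&DEV_XXXX\..." (hypothetical)

if INSTANCE_ID:
    # Disable the integrated GPU; reverse later with /enable-device.
    subprocess.run(["pnputil", "/disable-device", INSTANCE_ID], check=True)
```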
Method 2: Disable Integrated Graphics via BIOS/UEFI
For a more permanent solution, you can disable integrated graphics at the firmware level. This method is especially useful for systems where the integrated GPU reactivates after driver updates or system resets.
Step-by-Step Instructions:
- Restart Your Computer:
  - Begin by restarting your PC to access BIOS/UEFI settings.
- Enter BIOS/UEFI:
  - During startup, press the designated key to enter BIOS. Common keys include Delete, F2, F10, or Esc.
  - Watch for a message like “Press DEL to enter setup” during boot.
- Navigate to Advanced Settings:
  - Use the arrow keys or mouse to find the Advanced, Chipset, or Graphics Configuration tab.
- Locate Integrated Graphics Settings:
  - Look for options labeled Integrated Graphics, IGPU, or Onboard Graphics.
  - Set the option to Disabled or Auto (if Auto prioritizes the dedicated GPU).
- Save and Exit:
  - Press F10 or select Save & Exit to apply changes.
  - Your system will reboot using only the dedicated GPU.
Tip: BIOS interfaces vary by manufacturer. Consult your motherboard’s manual or support site for exact navigation steps.
Method 3: Use GPU Software Control Panels
If you don’t want to disable integrated graphics completely, you can instead force specific applications (or all of them) to run on the dedicated GPU through your GPU vendor’s control panel.
For NVIDIA Users:
- Right-click your desktop and open the NVIDIA Control Panel.
- Go to Manage 3D settings > Global Settings.
- Under Preferred graphics processor, choose High-performance NVIDIA processor.
- Click Apply to save changes.
For AMD Users:
- Open AMD Radeon Settings.
- Go to Preferences > Additional Settings > Power > Switchable Graphics Application Settings.
- Select the program and choose High Performance to force the dedicated GPU.
This method ensures the right GPU is used without disabling integrated graphics completely, which is safer for laptops.
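Separately from the vendor panels, Windows 10 and 11 also offer their own per-app GPU preference under Settings > System > Display > Graphics. As a hedged sketch, the same preference can be written directly to the registry key that this Settings page uses; the application path below is a hypothetical placeholder, and GpuPreference=2 corresponds to the High performance (dedicated GPU) option.

```python
# Set Windows' own per-app GPU preference ("High performance") via the
# registry key used by Settings > System > Display > Graphics.
# The application path is a placeholder - point it at your own .exe.
import winreg

APP_PATH = r"C:\Games\Example\game.exe"  # hypothetical path

key = winreg.CreateKey(
    winreg.HKEY_CURRENT_USER,
    r"Software\Microsoft\DirectX\UserGpuPreferences",
)
# GpuPreference=2 -> high performance (dedicated GPU); 1 -> power saving.
winreg.SetValueEx(key, APP_PATH, 0, winreg.REG_SZ, "GpuPreference=2;")
winreg.CloseKey(key)
print(f"High-performance GPU preference set for {APP_PATH}")
```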
Method 4: Disable Integrated GPU via BIOS (Desktop PCs Only)
If you’re using a custom desktop build, your motherboard likely includes an auto-detect feature that disables integrated graphics when a dedicated GPU is installed.
However, in rare cases, both remain active. To fix that:
- Reboot your PC and open BIOS.
- Under Primary Display Adapter or Initiate Graphic Adapter, set it to PEG (PCI Express Graphics).
- Disable Internal Graphics or Onboard GPU.
- Save and restart.
This ensures your PC boots using only the dedicated GPU.
What Happens After Disabling Integrated Graphics?
Once disabled, your system will rely solely on the dedicated GPU for all graphical tasks. This can lead to:
- Improved performance in games and creative applications.
- Reduced power consumption if the integrated GPU was causing unnecessary load.
- Simplified driver management, as only one GPU is active.
However, if your dedicated GPU fails or is removed, your system may not display anything until the integrated GPU is re-enabled.
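To confirm the change took effect after rebooting, you can count the display adapters Windows still reports as active. A minimal Python sketch (Windows only), again using Get-PnpDevice:

```python
# Post-reboot check: list display adapters that are still active ("OK").
# If only the dedicated GPU appears, the integrated GPU is disabled.
import subprocess

out = subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "(Get-PnpDevice -Class Display | Where-Object Status -eq 'OK').FriendlyName"],
    capture_output=True, text=True, check=True,
).stdout
active = [line.strip() for line in out.splitlines() if line.strip()]
print("Active display adapters:", active)
```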
When You Should NOT Disable Integrated Graphics
There are some cases where keeping your integrated graphics enabled is beneficial:
- Laptop Battery Life: Integrated GPUs consume less power, helping extend battery life.
- Dual Display Setups: Some systems use both GPUs for multiple monitors.
- Video Encoding: Certain programs use Intel Quick Sync (integrated GPU) to speed up rendering and video export.
- Fail-Safe Option: If your dedicated GPU fails, the integrated GPU can serve as a backup.
So, unless you’re facing performance or compatibility issues, consider leaving it enabled.
Risks and Considerations
Before disabling integrated graphics, consider the following:
- Loss of Display Output: If your monitor is connected to the integrated GPU port, disabling it will cut off the signal.
- Laptop Limitations: Many laptops use integrated graphics for power efficiency and switch to dedicated GPUs only when needed. Disabling the integrated GPU may reduce battery life or cause instability.
- Driver Dependencies: Some applications rely on integrated graphics for specific tasks. Disabling it could affect compatibility.
Always ensure your dedicated GPU is properly installed, updated, and connected before disabling the integrated graphics.
Modern systems often manage GPU switching automatically, so manual disabling isn’t always necessary unless you’re troubleshooting or optimizing for specific tasks.
Final Thoughts
Disabling integrated graphics can help maximize your system’s gaming or rendering performance — but only if your dedicated GPU is properly installed and configured. The easiest way is through Device Manager or BIOS settings, depending on your device type.
However, remember that integrated graphics have benefits too, such as power efficiency and serving as a backup GPU. Always double-check before disabling them permanently.
With the right setup, your dedicated GPU will handle all demanding tasks smoothly, giving you faster frame rates, improved visuals, and a better computing experience overall.
Frequently Asked Questions
What are integrated graphics?
Integrated graphics are built-in GPUs within your CPU or motherboard. They share system memory and are designed for basic tasks like browsing and video playback.
Why should I disable integrated graphics?
Disabling integrated graphics ensures your dedicated GPU handles all graphical tasks, improving performance, reducing heat, and avoiding driver conflicts.
How do I disable integrated graphics in Windows?
Open Device Manager, expand Display adapters, right-click the integrated GPU (e.g., Intel UHD Graphics), and select Disable device.
Can I disable integrated graphics from BIOS?
Yes. Restart your PC, enter BIOS/UEFI, navigate to Advanced or Chipset settings, and set Integrated Graphics to Disabled or Auto.
Will disabling integrated graphics affect my display?
Yes—if your monitor is connected to the integrated GPU port, you may lose display output. Always connect your monitor to the dedicated GPU before disabling.
Is it safe to disable integrated graphics on a laptop?
Not always. Laptops often use integrated graphics to conserve battery. Disabling it may reduce battery life or cause instability.
Can I re-enable integrated graphics later?
Yes. You can re-enable it via Device Manager or BIOS/UEFI by reversing the previous steps.
Do all systems allow disabling integrated graphics?
No. Some laptops and prebuilt systems may not offer BIOS options to disable integrated graphics. In such cases, Windows-based methods are your best bet.