Nvidia Optimus
Nvidia Optimus is a GPU switching technology created by Nvidia that, depending on the load generated by client applications, seamlessly switches between two graphics adapters in a computer system to provide either maximum rendering performance or minimum power draw.
A typical platform pairs a lower-performance integrated graphics unit from Intel with a higher-performance discrete one from Nvidia. Optimus saves battery life by automatically switching the discrete graphics processing unit (GPU) off when it is not needed and switching it back on when it is. The technology mainly targets mobile PCs such as notebooks.[1] When the GPU is powered off, the driver redirects graphics commands to the integrated graphics chip. The switching is designed to be completely seamless and to happen "behind the scenes".
Nvidia officially supports Optimus on Windows and Linux. A project called Bumblebee[4] brings open-source support for Optimus to Linux.[5]
Operation
When a user launches an application, the graphics driver tries to determine whether the application would benefit from the discrete GPU. If so, the GPU is powered up from an idle state and is passed all rendering calls. Even in this case, though, the integrated graphics processor (IGP) is used to output the final image. When less demanding applications are used, the IGP takes sole control, allowing for longer battery life and less fan noise. Under Microsoft Windows, the Nvidia driver also provides an option to manually select the GPU from the right-click context menu when launching an executable.
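Beyond these runtime heuristics, an application on Windows can declare its preference up front: Nvidia's driver documentation describes an exported global variable, NvOptimusEnablement, that the Optimus driver checks in the executable when choosing an adapter. The following is a minimal C++ sketch of that opt-in, assuming a Windows toolchain and a driver recent enough to honor the export; it is illustrative rather than a definitive recipe:

```cpp
// Sketch: an application hinting that it prefers the discrete GPU under
// Optimus on Windows. The Nvidia driver looks for this exported symbol in
// the executable; 0x00000001 requests the high-performance GPU, while
// 0x00000000 leaves the choice to the driver's own heuristics.
#include <windows.h>

extern "C" {
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
}

int main() {
    // Ordinary application code follows; rendering calls issued by this
    // process are routed to the discrete GPU when the export is honored.
    return 0;
}
```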
Within the hardware interface layer of the Nvidia GPU driver, the Optimus Routing Layer provides graphics management. It includes a kernel-level library for recognizing and managing specific classes and objects associated with different graphics devices, and it performs state and context management, allocating architectural resources as needed for each driver client (i.e., application). In this context-management scheme, each application is unaware of other applications concurrently using the GPU.
By recognizing designated classes, the Optimus Routing Layer can help determine when the GPU can be used to improve rendering performance. Specifically, it sends a signal to power on the GPU when it finds any of the following three call types:
- DX Calls: Any 3D game engine or DirectX application will trigger these calls
- DXVA Calls: Video playback will trigger these calls (DXVA = DirectX Video Acceleration)
- CUDA Calls: CUDA applications will trigger these calls (see the sketch after this list)
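A program does not have to render anything for the routing layer to act; under the scheme described above, even a trivial call into the CUDA runtime falls into the third category. The sketch below uses only the standard CUDA runtime API (cudaGetDeviceCount, cudaGetErrorString) and is meant as an illustration of such a call, not as a statement of exactly when the driver powers the GPU on:

```cpp
// Minimal sketch: a single CUDA runtime call counts as a "CUDA call" in the
// sense above, giving the Optimus Routing Layer a reason to enable the
// discrete GPU for this process.
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    int deviceCount = 0;
    // First call into the CUDA runtime made by this process; CUDA calls are
    // one of the triggers the routing layer watches for.
    cudaError_t err = cudaGetDeviceCount(&deviceCount);
    if (err != cudaSuccess) {
        std::printf("CUDA unavailable: %s\n", cudaGetErrorString(err));
        return 1;
    }
    std::printf("CUDA devices visible: %d\n", deviceCount);
    return 0;
}
```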
Predefined profiles also assist in determining whether extra graphics power is needed. These can be managed using the NVIDIA Control Panel.
Optimus avoids the need for a hardware multiplexer and prevents the glitches associated with switching display output from the IGP to the GPU by transferring the display surface from the GPU framebuffer over the PCI Express bus to the main-memory-based framebuffer used by the IGP. The Optimus Copy Engine is an alternative to traditional DMA transfers between the GPU framebuffer memory and the main memory used by the IGP.
Linux support
The proprietary Nvidia driver added partial Optimus support on May 3, 2013, in release 319.17.[6] As of May 2013, power management for the discrete card was not supported, meaning the driver could not save battery by turning the Nvidia graphics card off completely.[7]
The open-source project Bumblebee aims to provide support for switching between graphics chips. As in the Windows implementation, all applications run on the integrated graphics processor by default. As of 2013, a program can be run on the discrete GPU only by invoking it explicitly, for example from the command line or through specially configured shortcut icons; automatic detection and switching between graphics processors is not yet available.
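As a sketch of what such an explicit invocation can look like, the small launcher below shells out to Bumblebee's optirun wrapper when it is available and otherwise starts the program normally. The target program glxgears is only an illustrative placeholder, and the launcher itself is a hypothetical convenience, not part of Bumblebee:

```cpp
// Sketch of a tiny launcher for a Bumblebee system: run a program through
// optirun (Bumblebee's command-line wrapper) so it renders on the discrete
// GPU, falling back to a normal launch if optirun is not installed.
#include <cstdlib>

int main() {
    // "command -v" succeeds only if optirun is on the PATH.
    bool haveOptirun =
        std::system("command -v optirun > /dev/null 2>&1") == 0;

    int status = haveOptirun
        ? std::system("optirun glxgears")   // render on the Nvidia GPU
        : std::system("glxgears");          // integrated GPU only
    return status == 0 ? 0 : 1;
}
```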
A graphical interface, bumblebee-ui, is under development to make it more convenient to start programs on the discrete GPU when its extra performance is needed.
Steam for Linux can be set up to run games using the discrete GPU (Steam Community: Optimus and Steam for Linux).
The Bumblebee Project continues to evolve as the necessary changes are made to the Linux graphics architecture. To make the most of it, a recent Linux distribution is recommended. As of 2013, Bumblebee software repositories are available for Arch Linux, Debian, Fedora, Gentoo, Mandriva, OpenSuSE (OpenSuSE Bumblebee repository) and Ubuntu; the source package can be used with other distributions.
An attempt by Nvidia to support Optimus through DMA BUF, a Linux kernel mechanism for sharing buffers across hardware (potentially GPUs), was rebuffed by kernel developers in January 2012 because of the license incompatibility between the GPL-licensed kernel code and the proprietary Nvidia blob.[8]
When no software mechanism exists for switching between graphics adapters, the system cannot use the NVIDIA GPU at all, even if an installed graphics driver would support it.[9] Some older computers contain a BIOS setting to manually select the state of the hardware multiplexer to switch output between the two video devices. However, this setting is no longer part of the Optimus platform.
References
- ↑ http://www.nvidia.com/object/optimus_technology.html
- ↑ http://vr-zone.com/articles/nvidia-to-launch-desktop-optimus--synergy-at-computex/11946.html
- ↑ http://news.softpedia.com/news/NVIDIA-Optimus-Lands-on-Desktops-196761.shtml
- ↑ Bumblebee
- ↑ "Bumblebee version 3.0 'Tumbleweed'" release", ', January 20, 2012, accessed January 20, 2012.
- ↑ Official Nvidia dev zone
- ↑ http://www.opennet.ru/opennews/art.shtml?num=36848
- ↑
- ↑ "On laptops that don't have that hardware mux you currently cannot use the NVIDIA GPU for display.", ', July 23, 2010, accessed November 27, 2010.