Nvidia brings Optimus switchable graphics to notebooks
Optimus in the real world
The big question now, of course, is whether Optimus works. I've been playing with a UL50Vf for about a week, and thus far, I'd have to say yes. Nvidia equipped this system with a handy little desktop widget that displays whether the discrete GPU is currently in use. According to that applet, and the general performance I've experienced, the Optimus routing layer does a good job of intelligently activating the system's GeForce GPU when it detects a CUDA app, HD or Flash video playback, or a game. I haven't been able to detect the fraction of a frame latency associated with Optimus' frame buffer copy, nor have I noticed having to wait for the discrete GPU to be awakened from its dormant state.
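The routing behavior described above can be sketched as a simple dispatch policy. This is a hypothetical illustration of the concept, not Nvidia's actual driver logic; the workload classes and function names are our own.

```python
# Hypothetical sketch of an Optimus-style routing decision.
# The real driver matches applications against profiles shipped by
# Nvidia; the workload names below are illustrative only.

# Workload classes the routing layer promotes to the discrete GPU.
DGPU_WORKLOADS = {"cuda", "hd_video", "flash_video", "3d_game"}

def route(workload: str) -> str:
    """Return which GPU should handle the given workload."""
    if workload in DGPU_WORKLOADS:
        # Wake the GeForce GPU; its output is copied back into the
        # IGP's frame buffer for display, so no hardware mux is needed.
        return "discrete"
    # Everything else stays on the power-friendly integrated GPU.
    return "integrated"

print(route("flash_video"))  # discrete
print(route("desktop"))      # integrated
```

The key design point is that the decision happens per application, up front, rather than per frame, which is why the frame buffer copy is the only runtime cost.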
Nowhere is the graphics power of a discrete GPU needed more than in games, so that's where I began my testing. Darwinia was the first one I fired up, and as luck would have it, Optimus didn't recognize the game's 3D engine. Fair enough. Darwinia may be critically acclaimed, but it's hardly a mainstream or popular title. Plus, the game actually runs pretty well on the GMA 4500MHD as long as you turn down the pixel shader effects.
Borderlands is quite a bit more demanding than the first Modern Warfare and a complete waste of time if you're stuck with Intel integrated graphics. The UL50Vf's lowly GeForce G210M also had a difficult time maintaining smooth frame rates. At 1366x768, we had to run the game at medium detail levels with only a few effects turned on. Applying 16X aniso didn't slow performance much, but FRAPS still spent most of its time displaying frame rates in the low twenties. Dropping the resolution to 640x480 allowed us to enable all of Borderlands' pseudo-cel-shaded visual goodness and get frame rates up to the low thirties.
One of the better driving games to hit the PC in recent years, Need for Speed: Shift is yet another console port that's a little too demanding for low-end GPUs, let alone integrated graphics processors. Running at our system's native resolution, with the lowest in-game detail levels, the G210M could only muster about 20 FPS. Scaling the display resolution down to 640x480 didn't improve frame rates much, either. The game really isn't playable at even that visibly blocky resolution, and worse, it looks positively ugly with all the details turned down.
Keep in mind, of course, that the GeForce G210M is the weakest GPU to support Optimus. Hopefully, notebook makers will use Optimus on systems with considerably more potent graphics processors, as well. Video playback tests have become a staple of our notebook reviews, so we fired up a collection of local and streaming videos to see how Optimus would fare. I've compiled the results of our tests in a handy little chart below that details the approximate CPU utilization gleaned from Task Manager, a subjective assessment of playback quality, and whether the discrete GPU was active during playback. Windows Media Player was used for local video playback, while Firefox and the Flash 10 beta were used with streaming content.
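For readers who want to reproduce the CPU utilization numbers, the averaging we did by eyeballing Task Manager's graph can be sketched as follows. The sampler is injectable so the arithmetic is shown without OS-specific dependencies; in practice you'd plug in a live source such as `psutil.cpu_percent`.

```python
# Sketch: averaging per-second CPU readings, roughly automating what
# we read off Task Manager by hand. Sample count and interval are
# arbitrary choices for illustration.
def average_cpu_percent(sampler, samples=10):
    """Average utilization over `samples` calls to `sampler`."""
    readings = [sampler() for _ in range(samples)]
    return sum(readings) / len(readings)

# Example with canned readings standing in for a live sampler:
canned = iter([12.0, 15.0, 11.0, 14.0, 13.0, 12.0, 16.0, 13.0, 12.0, 12.0])
print(round(average_cpu_percent(lambda: next(canned)), 1))  # 13.0
```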
Those who keep multiple Flash video tabs open at once should keep in mind that even when a YouTube video has finished playing, Optimus keeps the GPU enabled. Only when the tab is closed or you browse to a page that doesn't include Flash video is power cut to the discrete GPU. The same applies to HD video playback, at least with the latest version of Windows Media Player built into Win7. As long as WMP has an HD video open, the discrete GPU remains active. Pausing or even stopping the video has no effect. Drag and drop a standard definition clip into the app, however, and the GPU is shut down immediately.
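The power-gating behavior we observed amounts to a simple rule: the discrete GPU stays powered while any qualifying content is open, regardless of playback state, and shuts down only once that content is gone. A minimal model of that rule, with class and method names that are our own invention rather than any Nvidia API:

```python
# Hypothetical model of the observed Optimus power policy.
# Pausing or stopping playback is irrelevant; only whether
# dGPU-triggering content (a Flash tab, an HD video) remains
# open determines whether the GeForce stays powered.
class OptimusPowerModel:
    def __init__(self):
        self.open_dgpu_content = set()  # e.g. Flash tabs, HD videos

    def open_content(self, name, needs_dgpu):
        if needs_dgpu:
            self.open_dgpu_content.add(name)

    def close_content(self, name):
        self.open_dgpu_content.discard(name)

    @property
    def dgpu_powered(self):
        return bool(self.open_dgpu_content)

m = OptimusPowerModel()
m.open_content("youtube_tab", needs_dgpu=True)
print(m.dgpu_powered)   # True  (even after the clip finishes playing)
m.close_content("youtube_tab")
print(m.dgpu_powered)   # False (power cut once the tab is closed)
```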
I asked Nvidia whether Optimus could be made to shut down the discrete GPU if video was stopped or paused, and the company said that would be possible with cooperation from software vendors. Nvidia might also want to look more closely at how CUDA and other general purpose computing applications interact with Optimus. The Badaboom video transcoding application that Nvidia has used as a CUDA poster child activated the discrete GPU in our Optimus test system before we even had a chance to load a video to transcode. We left Badaboom sitting there with nothing loaded and no transcoding taking place for more than half an hour, and Nvidia's Optimus widget still showed the discrete GPU as enabled.