
ASUS ENGTX560 Ti DCII TOP Video Card
Reviews - Featured Reviews: Video Cards
Written by Servando Silva   
Monday, 14 February 2011
Table of Contents: Page Index
Nvidia GeForce GTX 560 Ti Features
Closer Look: ASUS GTX 560 Ti
ENGTX560 Ti Detailed Features
ASUS ENGTX560 Ti Software
Video Card Testing Methodology
DX10: 3DMark Vantage
DX10: Crysis Warhead
DX11: Aliens vs. Predator
DX11: BattleForge
DX10: Just Cause 2
DX11: Lost Planet 2
DX11: Metro 2033
DX11: Unigine Heaven 2.1
VGA Power Consumption
ASUS ENGTX560 Ti Conclusion

Nvidia GeForce GTX 560 Ti Features

NVIDIA's GeForce GTX 560 Ti introduces the new GF114 GPU, which is largely based on the GF104 Fermi chip that drove the GTX 460 to great success. The differences are fourfold: full utilization of the die (no disabled cores), architecture improvements, more widespread use of low-leakage transistors, and layout changes based on signal-traffic analysis.

While the GF104 enabled only seven out of eight possible Streaming Multiprocessors (SM), the GF114 is able to use that last SM to make even more cores available, a total of 384 compared to 336 in the GTX 460. Each SM still offers 48 CUDA cores, four dispatch units, and eight texture/special function units. The architecture improvements are the addition of full speed FP16 texture filtering and new tile formats that improve Z-cull efficiency. These enhancements alone offer performance improvements ranging from 5% on tessellation-heavy benchmarks like Heaven 2.1, to a whopping 14% gain in 3DMark Vantage, where it's all about shader power.

The last two improvements go hand in hand to improve both the power usage and the maximum clock rates that the GPU can support. Low-leakage transistors run cooler, use less power, and can be driven faster due to their lower gate capacitance. NVIDIA increased the usage of this more expensive device type primarily to reduce power consumption, but also to gain some overclocking headroom. They also looked at signal flow across the various sections of the GPU and did some rearranging to shorten signal paths in the high-traffic areas. It's not that the GTX 460 was particularly bad in this regard, but the luxury of a second chance yielded some improvements. Taken together, NVIDIA was able to increase the base clock on the core from 675 MHz to 822 MHz, a 22% increase that supposedly doesn't eat into any overclocking headroom. We'll test that supposition later in the review.
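The core-count and clock arithmetic above is easy to sanity-check. The snippet below is a quick sketch using only figures quoted in this review; the variable names are our own:

```python
# Sanity check of the GF104 -> GF114 gains described above.
# All figures come from the review text; nothing here is measured.

GTX460_SMS, GTX560TI_SMS = 7, 8   # enabled Streaming Multiprocessors
CORES_PER_SM = 48                 # CUDA cores per SM (GF104/GF114)

gtx460_cores = GTX460_SMS * CORES_PER_SM      # 7 x 48 = 336
gtx560ti_cores = GTX560TI_SMS * CORES_PER_SM  # 8 x 48 = 384

GTX460_CLOCK, GTX560TI_CLOCK = 675, 822       # base core clocks, MHz
clock_gain_pct = (GTX560TI_CLOCK / GTX460_CLOCK - 1) * 100  # ~22%

print(gtx460_cores, gtx560ti_cores, round(clock_gain_pct, 1))
```

Strictly, 675 to 822 MHz is a 21.8% gain, which NVIDIA and the press round to 22%.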

As for the rest of the capabilities of this very advanced graphics card, here is the complete list of GPU features, as supplied by NVIDIA:

NVIDIA GeForce GTX 5xx GPU Feature Summary:

3D Graphics

  • Full Microsoft DirectX 11 Shader Model 5.0 support:
    • NVIDIA PolyMorph Engine with distributed HW tessellation engines
    • BC6H and BC7 texture compression formats
    • Gather4 extensions
  • OpenGL 4.1 support
  • Advanced image quality features:
    • 32× coverage sample antialiasing
    • Transparent multisampling and transparent supersampling
    • 16× angle independent anisotropic filtering
    • 128-bit floating point high dynamic-range (HDR) lighting with antialiasing; 32-bit per-component floating point texture filtering and blending
  • Interactive ray tracing support
  • Full-speed frame buffer blending
  • Advanced lossless compression algorithms for color, texture, and Z data
  • Support for normal map compression
  • Z-cull
  • Early-Z

GPU Computing

  • NVIDIA CUDA™ technology: allows the GPU cores to provide performance improvements for applications such as video transcoding, gaming, ray tracing, and physics. API support includes:
    • CUDA C
    • CUDA C++
    • DirectCompute 5.0
    • OpenCL
    • Java, Python, and Fortran
  • Third Generation Streaming Multiprocessor (SM)
    • 48 CUDA cores per SM
    • Dual Warp Scheduler simultaneously schedules and dispatches instructions from two independent warps
    • 64 KB of RAM with a configurable partitioning of shared memory and L1 cache
  • Second Generation Parallel Thread Execution ISA
    • Unified Address Space with Full C++ Support
    • Optimized for OpenCL and DirectCompute
    • Full IEEE 754-2008 32-bit and 64-bit precision
    • Full 32-bit integer path with 64-bit extensions
    • Memory access instructions to support transition to 64-bit addressing
    • Improved Performance through Predication
  • Improved Memory Subsystem
    • NVIDIA Parallel DataCache™ hierarchy with Configurable L1 and Unified L2 Caches
    • Greatly improved atomic memory operation performance
  • NVIDIA GigaThread™ Engine
    • 10x faster application context switching
    • Concurrent kernel execution
    • Out of order thread block execution

NVIDIA Technology

  • NVIDIA SLI technology: patented hardware and software technology that allows up to four NVIDIA GeForce GPUs to run in parallel to scale performance and enhance image quality in today's top games.
  • NVIDIA PhysX™ technology: allows advanced physics effects to be simulated and rendered on the GPU.
  • NVIDIA 3D Vision™ Ready: GeForce GPU support for NVIDIA 3D Vision, bringing a fully immersive stereoscopic 3D experience to the PC.
  • NVIDIA 3D Vision Surround™ Ready: scale games across three panels by leveraging the power of multiple GPUs in an NVIDIA SLI configuration. Combine with 3D Vision technology for the ultimate three-panel stereoscopic 3D gaming experience.

GPU Interfaces

  • Designed for PCI Express 2.0 ×16 for a peak bandwidth (counting both directions) of up to 20 gigabytes (GB) per second (PCIe 2.0 devices are backwards compatible with PCI Express 1.x devices).
  • Up to 384-bit GDDR5 memory interface (memory interface width may vary by model)
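The "up to 20 gigabytes per second" figure counts raw signaling in both directions. As a quick sketch (our own derivation, not taken from NVIDIA's literature): PCIe 2.0 signals at 5 GT/s per lane with 8b/10b encoding, so the usable payload bandwidth is somewhat lower than the headline number:

```python
# How the "20 GB/s" PCIe 2.0 x16 figure is derived (a sketch).
# PCIe 2.0 runs 5 GT/s per lane; 8b/10b encoding carries 8 payload
# bits in every 10 transferred bits.

GT_PER_LANE = 5e9   # transfers/sec per lane (PCIe 2.0)
LANES = 16

raw_per_direction = GT_PER_LANE / 8 * LANES        # 10 GB/s each way
raw_both_directions = raw_per_direction * 2        # 20 GB/s (quoted figure)
payload_both_directions = raw_both_directions * 8 / 10  # 16 GB/s usable

print(raw_both_directions / 1e9, payload_both_directions / 1e9)
```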

Advanced Display Functionality

  • Two pipelines for dual independent display
  • Two dual-link DVI outputs for digital flat panel display resolutions up to 2560×1600
  • Dual integrated 400 MHz RAMDACs for analog display resolutions up to and including 2048×1536 at 85 Hz
  • HDMI 1.4a support, including GPU-accelerated Blu-ray 3D, x.v.Color, HDMI Deep Color, and 7.1 digital surround sound. (Blu-ray 3D playback requires a compatible software player.)
  • DisplayPort 1.1a support
  • HDCP support up to 2560×1600 resolution on all digital outputs
  • 10-bit internal display processing, including hardware support for 10-bit scanout
  • Underscan/overscan compensation and hardware scaling


  • NVIDIA PureVideo HD technology with VP4 programmable video processor
  • Decode acceleration for MPEG-2, MPEG-4 Part 2 Advanced Simple Profile, H.264, MVC, VC1, DivX (version 3.11 and later), and Flash (10.1 and later)
  • Blu-ray dual-stream hardware acceleration (supporting HD picture-in-picture playback)
  • Advanced spatial-temporal de-interlacing
  • Noise reduction
  • Edge enhancement
  • Bad edit correction
  • Inverse telecine (2:2 and 3:2 pull-down correction)
  • High-quality scaling
  • Motion Compensation
  • Video color correction
  • Dynamic contrast enhancement and color stretch

Digital Audio

  • Support for the following audio modes:
    • Dolby Digital (AC3), DTS 5.1, Multi-channel (7.1) LPCM, Dolby Digital Plus (DD+), MPEG2/MPEG4 AAC
  • Audio sample rates of 44.1 kHz, 48 kHz, 88.2 kHz, 96 kHz, 176.4 kHz, and 192 kHz
  • Word sizes of 16-bit, 20-bit, and 24-bit

Power Management Technology

  • Advanced power and thermal management for optimal acoustics, power, and performance based on usage:
  • ASPM power management
  • Adaptive Clocking
  • Adaptive Power States
  • Advanced fan control and temperature monitoring

NVIDIA GeForce GTX 560 Ti GPU Detail Specifications

GPU Engine Specs:

  • Fabrication Process: TSMC 40nm Bulk CMOS
  • Die Size: 332 mm² (estimated)
  • No. of Transistors: 1.95 Billion
  • Graphics Processing Clusters: 2
  • Streaming Multiprocessors: 8
  • CUDA Cores: 384
  • Texture Units: 64
  • ROP Units: 32
  • Engine Clock Speed: 822 MHz (ASUS OC @ 900 MHz)
  • Texel Fill Rate (bilinear filtered): 52.6 Gigatexels/sec (at reference clock)
  • Pixel Fill Rate: 26.3 Gigapixels/sec (at reference clock)

Memory Specs:

  • Memory Clock: 2100 MHz - DDR
  • Memory Configurations: 1 GB GDDR5
  • Memory Interface Width: 256-bit
  • Memory data rate: 4.2 Gbps
  • Memory Bandwidth: 134.4 GB/sec
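The bandwidth figure follows directly from the per-pin data rate and bus width, and fill rates scale with clock and unit counts. A brief sketch, using the reference 822 MHz core clock and our own variable names:

```python
# Reproducing the spec-table arithmetic above (a sketch, not measurement).

DATA_RATE_GBPS = 4.2      # per-pin GDDR5 data rate (2100 MHz DDR)
BUS_WIDTH_BITS = 256

bandwidth_gb_s = DATA_RATE_GBPS * BUS_WIDTH_BITS / 8   # 134.4 GB/s

# Fill rates at the 822 MHz reference clock; an overclocked card
# like the ASUS TOP (900 MHz) scales these figures linearly.
CLOCK_MHZ, TMUS, ROPS = 822, 64, 32
texel_rate = CLOCK_MHZ * TMUS / 1000    # ~52.6 Gigatexels/s
pixel_rate = CLOCK_MHZ * ROPS / 1000    # ~26.3 Gigapixels/s

print(round(bandwidth_gb_s, 1), round(texel_rate, 1), round(pixel_rate, 1))
```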

Display Support:

  • Maximum DVI Resolution: 2560x1600
  • Maximum VGA Resolution: 2048x1536
  • Maximum Display Output: 4x - 1920x1200
  • Standard Display Connectors:
    • 2x Dual-Link DVI-I
    • 1x Mini HDMI v1.4a

Graphics card Dimensions:

  • Height: 4.376 inches (111 mm)
  • Length: 9.37 inches (238 mm)
  • Width: Dual-slot (37 mm)
  • Weight: 669 g

Thermal and Power Specs:

  • Maximum GPU Temperature: 104 °C
  • Maximum Graphics Card Power: 170 W
  • Minimum Recommended System Power: 500 W
  • Power Connectors: Two 6-pin PCI Express (PCI-E)




# nice looking - RealNeil 2011-02-14 04:51
That's a damn good looking card, and I'm not speaking of the three sexy red stripes, either. It just looks very functional.
(be back after I read this)
# RE: nice looking - Servando Silva 2011-02-14 09:05
That's true. Sadly, you won't be that happy after reading the "Temperatures & Overclocking" section though...
# RE: RE: nice looking - Adam 2011-02-14 12:43
That's strange, considering that the previous DirectCU was pretty decent compared to the stock heatsink. Wonder if perhaps this one was poorly seated, or they might have just #ed up the design this time round.
Shame, I like ASUS normally.
# RE: RE: RE: nice looking - Servando Silva 2011-02-14 13:12
I re-installed the heatsink and changed the TIM, but the results are the same. Other sites have tested this model, and even if they don't get such bad results, it falls behind MSI, eVGA, and GIGABYTE's solutions.
# RE: RE: RE: RE: nice looking - Alejandro 2011-09-22 09:20
The card gets hot because the "Auto" option in ASUS Smart Doctor doesn't work; it just keeps the fans running at ~1100 RPM even under load. I've found that manually setting the speed for certain ranges of temperature keeps it well under 65 °C.
Hope this helps. I have this card and it's plain awesome :D
# egg - egg chan 2012-08-17 16:53
Because it's an 800 MHz GPU, 1000 MHz GDDR5 card that's been overclocked to 900 MHz, 1050 MHz.

WTF do you really expect?

A new GPU and memory with the same name on the card?

It is the same card with a new BIOS and higher clocks; no "TOP" written anywhere but the packaging.

Hmmm, how can we sell the leftover stock and still make a profit?
# WOow - HWMSTR 2011-02-14 06:16
The card is really cool!! I wanna have it! But it's too expensive for a high school student like me.. T_T
# Cool - Elite360 2011-03-02 09:04
I am getting it by the end of this month, and I am in high school as well.
My PC is amazing; bet it is better than yours.
# @Elite360 - Shane 2011-03-03 22:14
We can tell you're in high school, but don't gloat about what you've got, because there's always something better.
# Will it underclock itself in idle? - lowpoxm 2011-03-28 13:01
I just bought this card.
I do have a major headache though. When I run CPU-Z or ASUS Smart Doctor, it says the GPU clock is only running at 830 MHz and the shader at 1660 MHz.
Does anyone know if it will underclock itself in idle mode?
I am getting desperate, because it IS the "TOP" version of the card. And I can't find an answer anywhere.
I haven't had the time to test it with FurMark or anything like that.
# RE: Will it underclock itself in idle? - Olin Coles 2011-03-28 13:31
Yes, in idle mode all video cards turn the GPU clocks down to conserve energy.
# RE: RE: Will it underclock itself in idle? - lowpoxm 2011-03-28 23:28
I really hope that's the case with this card as well!
I haven't been able to find that info anywhere else...
I was really bummed out, because it seemed like I paid for the "TOP" version of the card, but only got the stock one.

But I don't get it, if that's the case: why do they even make a "TOP" version of the card, if the stock version will overclock itself as well?
# RE: RE: RE: Will it underclock itself in idle? - Olin Coles 2011-03-29 08:45
Every video card does this. Once you use a program/game that forces 3D mode, the GPU clock speed returns to whatever it's been set to. Turning the GPU clock down during idle helps save electricity.
# RE: RE: RE: RE: Will it underclock itself in idle? - lowpoxm 2011-03-30 00:23
Well, I had FurMark running at max settings for 5 minutes, logged the data with GPU-Z, and the speed remained the same the whole time...
So the card I received is the stock version, even though the box says "TOP".
# RE: ASUS ENGTX560 Ti DCII TOP Video Card - Servando Silva 2011-04-01 06:02
Check out the VGA Power Consumption section to read more about the different load frequencies.
In idle mode it should underclock far more than that.
# RE: ASUS ENGTX560 Ti DCII TOP Video Card - Zmol 2011-11-05 08:05
Bought one, but returned it the same day, as IT SOUNDS LIKE A HAIR DRYER and it runs hot.

35% fan speed: inaudible
40%: humming
50%: high-pitched humming/whining

Got a Gigabyte OC card instead, 900 MHz also. Don't hear it until 60%, and even then it's just air noise as opposed to screeching and whining.
# RE: ASUS ENGTX560 Ti DCII TOP Video Card - iv 2012-01-22 13:47
- Right, a good-looking card.
- Low temps at idle, but unimpressive temps under full load or stress; I expected lower because of my Antec 900 case.
- With my E8500 processor, I'm pleased with its performance.
- It's relatively easy to clean it of dust.
- As usual, the shroud of this video card cannot be removed for a better cleanup.
- It doesn't strike me as a solid construction, because the heatpipes aren't attached rigidly enough to the card at the junction with the upper cooler.

