
GIGABYTE GeForce GT-240 HDMI Video Card
Reviews - Featured Reviews: Video Cards
Written by David Ramsey   
Tuesday, 05 January 2010
Table of Contents: Page Index
GIGABYTE GeForce GT-240 HDMI Video Card
GIGABYTE GeForce GT240 Features
GIGABYTE GeForce GT240 Closer Look
GV-N240D5-512I Detailed Features
Video Card Testing Methodology
3DMark Vantage GPU Tests
Crysis Warhead Tests
Devil May Cry 4 Benchmark
Far Cry 2 Benchmark
Resident Evil 5 Tests
GIGABYTE GeForce GT240 Temperatures
VGA Power Consumption
GeForce GT240 Final Thoughts
GIGABYTE GV-N240D5-512I Conclusion

VGA Testing Methodology

As of October 2009, Benchmark Reviews has discontinued testing on the Windows XP (DirectX 9) operating system, even though an estimated 52% or more of the gaming world still uses it. DirectX 11 is native to the Microsoft Windows 7 operating system, which will be the centerpiece of our test platform for the foreseeable future. Many tests still use DirectX 10 on the Windows 7 platform.

According to the Steam Hardware Survey published around the Windows 7 launch, the most popular gaming resolution is 1280x1024 (17-19" standard LCD monitors), closely followed by 1024x768 (15-17" standard LCD). Normally our benchmark tests concentrate on the up-and-coming higher-demand resolutions of 1680x1050 (22-24" widescreen LCD) and 1920x1200 (24-28" widescreen LCD monitors), but for cards in this class we've replaced 1920x1200 with 1280x1024 and adjusted the benchmark settings to better match their lower capabilities, reducing rendering quality and post-processing features. You're going to see some pretty low frame rates, but remember that cards in this class, advertising hyperbole notwithstanding, are not intended for gamers. You need at least 30 frames per second for a smooth, stutter-free gaming experience, but in general you should aim for a much higher average frame rate, since any game has more-demanding sections that can drop the frame rate well below the average for the game as a whole.
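The 30 FPS smoothness threshold is easier to reason about in terms of per-frame render time. A minimal Python sketch (the helper name is our own, not from the review):

```python
def frame_time_ms(fps):
    """Convert an average frame rate to per-frame render time in milliseconds."""
    return 1000.0 / fps

# At the 30 FPS threshold each frame must finish in ~33.3 ms;
# a demanding scene that drops to 20 FPS stretches that budget to 50 ms.
print(round(frame_time_ms(30), 1))  # 33.3
print(round(frame_time_ms(20), 1))  # 50.0
```

This is why an average well above 30 FPS matters: the average hides the slowest scenes, and it is those scenes the frame-time budget must survive.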

For each benchmark, five test runs are collected at each setting, and the highest and lowest results are discarded. The remaining three results are averaged and displayed in the performance charts.
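The procedure above is a simple trimmed mean; a minimal Python sketch (function name hypothetical, sample numbers illustrative only):

```python
def trimmed_average(results):
    """Average five benchmark runs after discarding the highest and lowest."""
    if len(results) != 5:
        raise ValueError("expected five runs per setting")
    trimmed = sorted(results)[1:-1]  # drop the single lowest and highest run
    return sum(trimmed) / len(trimmed)

# Example: five FPS results at one resolution/quality setting
print(round(trimmed_average([58.2, 61.0, 59.4, 57.9, 60.1]), 2))  # 59.23
```

Discarding the outliers keeps one anomalous run (a background task, a driver hiccup) from skewing the charted result.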

Intel P55 Test System

  • Motherboard: ASUS Sabertooth 55i
  • Processor: Intel Core i5-750 Lynnfield BX80605I5750 @ 2.67 GHz
  • System Memory: 4GB (2x 2GB) OCZ Dual-Channel 1333 MHz DDR3 CL 9-9-9-24
  • Primary Drive: Hitachi HTS543232L9SA0 250GB
  • Power Supply Unit: Cooler Master UCP 900W
  • Monitor: 22-Inch Widescreen LCD

Benchmark Applications

  • 3DMark Vantage v1.01 (Entry Quality, 1x Multisample, 2x Anisotropic Filtering, 1:2 Scale)
  • Crysis Warhead v1.1 with HOC Benchmark (DX10, Medium Quality, 4x Anti-Aliasing, 16x Anisotropic Filtering, Airfield Demo)
  • Devil May Cry 4 Benchmark Demo (DX10, Super-High Quality, 8x MSAA)
  • Far Cry 2 v1.02 (DX10, Medium Performance, High Quality, 4x Anti-Aliasing, HDR + Bloom)
  • Resident Evil 5 Benchmark Demo (DX10, High Quality, 8x MSAA)

Video Card Test Products

  • NVIDIA GeForce 9400GT Reference Design (550 MHz GPU / 1440 MHz Shader / 400 MHz vRAM - Forceware 191.07)
  • NVIDIA GeForce 8600GT Reference Design (540 MHz GPU / 1180 MHz Shader / 700 MHz vRAM - Forceware 191.07)
  • GIGABYTE GV-N240D5-512I (550 MHz GPU / 1340 MHz Shader / 1700 MHz vRAM - Forceware 195.62)
  • NVIDIA GeForce GTS 250 Reference Design (740 MHz GPU / 1836 MHz Shader / 1100 MHz vRAM - Forceware 191.07)
  • MSI Radeon HD 4770 (750 MHz GPU / 1100 MHz vRAM - ATI Catalyst 9.10)
Product Series       GeForce 9400GT   GeForce 8600GT   GIGABYTE GT240   GeForce GTS250   Radeon 4770
Stream Processors    16               32               96               128              640
Core Clock (MHz)     550              540              550              740              750
Shader Clock (MHz)   1440             1180             1340             1836             N/A
Memory Clock (MHz)   400              700              1700             1100             1100
Memory Amount        512MB GDDR2      256MB GDDR3      512MB GDDR5      512MB GDDR3      512MB GDDR5
Memory Interface     128-bit          128-bit          128-bit          256-bit          128-bit
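The memory clock and interface columns together determine peak memory bandwidth, which is where the GT240's GDDR5 pays off despite its 128-bit bus. A sketch of the standard calculation (treating the listed GDDR3/GDDR5 clocks as half the effective transfer rate, a common convention but an assumption on our part):

```python
def bandwidth_gbs(effective_mtps, bus_bits):
    """Peak memory bandwidth in GB/s from effective transfer rate (MT/s) and bus width."""
    return effective_mtps * (bus_bits / 8) / 1000

# GT240: 1700 MHz GDDR5 clock -> 3400 MT/s effective, 128-bit bus
print(bandwidth_gbs(2 * 1700, 128))  # 54.4
# GTS 250: 1100 MHz GDDR3 clock -> 2200 MT/s effective, 256-bit bus
print(bandwidth_gbs(2 * 1100, 256))  # 70.4
```

Under these assumptions the GTS 250's wide 256-bit bus still out-muscles the GT240's fast GDDR5, which helps explain the performance gap in the charts that follow.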



# Now I'm really confused! - Ian D. Samson, 2010-06-20 08:27
With the HDMI/VGA/DVI ports on the card, does it contain a TV tuner that would enable viewing television signals on the PC monitor? I have an HDMI cable from the decoder to the PC, but so far I haven't been able to use it. There's a 9600GT card in the PC (8GB RAM, 1TB HDD, Vista Ultimate 64-bit) because I want to do VHS to HD-DVD video editing. I'm less than a novice at this, though I have 30+ years of experience in the PC industry. Thanks.
# RE: Now I'm really confused! - Olin Coles, 2010-06-20 08:30
Video cards do not come with TV tuners, as there is a separate market for that product. The HDMI/VGA/DVI ports simply allow the user to connect the computer to an HDTV, HD monitor, or digital/analog monitor.
# RE: RE: Now I'm really confused! - Ian D. Samson, 2010-06-20 08:50
Oh, so I need a separate TV Tuner (presumably PCI-E) in another slot in the main board? Then the 9600GT in the machine already is sufficient?
# RE: RE: RE: Now I'm really confused! - Olin Coles, 2010-06-20 17:18
That is correct, either a PCI or PCI-E digital tuner card would be ideal, and the 9600GT you have can help output the video.
# Gigabyte GT 240 1GB display issue - Thomas Moss, 2010-08-20 19:31
Running the system into a Vivo 32-inch full HD LCD TV, I get a max resolution of 1360x768. I can go higher using the NVIDIA control panel (max 1838x1002) on both DVI and HDMI. When the computer is connected via VGA, it scales the screen to fit 1920x1080 (the recommended display resolution).

HDMI cable: version 1.3
Adapter: DVI to VGA

The card is Full HD capable.

