<?xml version='1.0' encoding='UTF-8'?><?xml-stylesheet href="http://www.blogger.com/styles/atom.css" type="text/css"?><feed xmlns='http://www.w3.org/2005/Atom' xmlns:openSearch='http://a9.com/-/spec/opensearchrss/1.0/' xmlns:blogger='http://schemas.google.com/blogger/2008' xmlns:georss='http://www.georss.org/georss' xmlns:gd="http://schemas.google.com/g/2005" xmlns:thr='http://purl.org/syndication/thread/1.0'><id>tag:blogger.com,1999:blog-1394573190797201130</id><updated>2024-09-14T01:40:48.725-07:00</updated><title type='text'>GRAPHIC CARDS - NVIDIA , AMD ATI</title><subtitle type='html'>Specification on latest high quality graphic cards nvidia / asus / AMD Raedon HD7970 / 7990 / 6770 / 6950, MSI , GeForce pci-e (PCI Express ) , AGP video cards , Ultra GTX , GTS , GT ,Latest GTX 550 / 570 / 590</subtitle><link rel='http://schemas.google.com/g/2005#feed' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/posts/default'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default?redirect=false'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/'/><link rel='hub' href='http://pubsubhubbub.appspot.com/'/><link rel='next' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default?start-index=26&amp;max-results=25&amp;redirect=false'/><author><name>supremegraphiccards</name><uri>http://www.blogger.com/profile/00185843048343540012</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><generator version='7.00' 
uri='http://www.blogger.com'>Blogger</generator><openSearch:totalResults>39</openSearch:totalResults><openSearch:startIndex>1</openSearch:startIndex><openSearch:itemsPerPage>25</openSearch:itemsPerPage><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-3184882195454604187</id><published>2012-03-22T06:24:00.002-07:00</published><updated>2013-07-16T10:46:28.544-07:00</updated><title type='text'>Introducing the GeForce GTX 680 GPU</title><content type='html'>&lt;div dir=&quot;ltr&quot; style=&quot;text-align: left;&quot; trbidi=&quot;on&quot;&gt;&lt;div class=&quot;separator&quot; style=&quot;clear: both; text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEicvB3S63N_XRyK4OACZ7Y3XFvyW3t13afmkz5CyafPHx-gHBEyGVlxOEk8MIODqZ7hJ6cuHAKbmI3VejdwvOHykzgt5hXyUhC7fmSLVbWA_ICkRa-jcepIc1aVCPyRllphsy_15vqkRUb-/s1600/xfx_geforce_gtx_280_itocp.jpg&quot; imageanchor=&quot;1&quot; style=&quot;clear: left; float: left; margin-bottom: 1em; margin-right: 1em;&quot;&gt;&lt;img alt=&quot;Introducing the GeForce GTX 680 GPU&quot; border=&quot;0&quot; height=&quot;249&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEicvB3S63N_XRyK4OACZ7Y3XFvyW3t13afmkz5CyafPHx-gHBEyGVlxOEk8MIODqZ7hJ6cuHAKbmI3VejdwvOHykzgt5hXyUhC7fmSLVbWA_ICkRa-jcepIc1aVCPyRllphsy_15vqkRUb-/s400/xfx_geforce_gtx_280_itocp.jpg&quot; width=&quot;400&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;br /&gt;
Today NVIDIA is unveiling its flagship single-GPU graphics card, squarely aimed at the well-heeled enthusiast. Readers with a penchant for graphics will have seen the numerous leaks springing up over the preceding two weeks, with NVIDIA dutifully plugging as many as it could, but now is the time to set the record straight with an in-depth review of the Kepler-based GeForce GTX 680 2GB GPU.&lt;br /&gt;
&lt;br /&gt;
Back in 2010, NVIDIA&#39;s publicly-known GPU roadmap, as divulged by CEO Jen-Hsun Huang, informed us that the Fermi GPU architecture was to be succeeded by Kepler in late 2011, which in turn is set to be replaced by Maxwell in late 2013, with the primary focus being to increase the performance-per-watt metric through a combination of die shrinks and general optimisations.&lt;br /&gt;
&lt;br /&gt;
Fermi, if you recall, debuted in the consumer space in the form of the now-maligned GeForce GTX 480 in March 2010. Aiming to be all things to every type of user, NVIDIA&#39;s one-size-fits-all strategy for the consumer, workstation and professional markets meant that the big-die GPU, a hybrid of sorts, traded pure gaming performance for a forward-looking architecture. Worse still, the innate complexities of bringing up such an overarching GPU left NVIDIA with a manufacturing and marketing headache.&lt;br /&gt;
&lt;br /&gt;
Perhaps the first incarnation of consumer Fermi came too early; it seemed as if sales and marketing won the battle against engineering, resulting in the release of a half-baked product. With time being the greatest teacher, six months later NVIDIA cleaned up Fermi and effectively re-released it as the GeForce GTX 580. Higher clocks and a fully enabled architecture made it what Fermi should have been in the first place, and even today the GTX 580 continues to offer reasonable value at the readjusted £315 price point.&lt;br /&gt;
&lt;br /&gt;

Quality Settings:&lt;br /&gt;
Crysis 2, 30.33 FPS: DirectX 11 Ultra Upgrade installed, high-resolution textures enabled, Extreme detail level.&lt;br /&gt;
Deus Ex: Human Revolution, 46.20 FPS: Highest possible settings, tessellation enabled, FXAA High enabled.&lt;br /&gt;
Just Cause 2, 46.60 FPS: Maximum settings, CUDA water enabled, 4xMSAA, 16xAF.&lt;br /&gt;
Left 4 Dead 2, 126.10 FPS: Maximum settings, 4xMSAA, 16xAF.&lt;br /&gt;
Mafia 2, 51.35 FPS: Maximum settings, PhysX Medium enabled, AA enabled, AF enabled.&lt;br /&gt;
Metro 2033, 40.72 FPS: Maximum settings, PhysX disabled, 4xMSAA, 16xAF.&lt;br /&gt;
Portal 2, 127.90 FPS: Maximum settings, 4xMSAA, 16xAF.&lt;br /&gt;
The Elder Scrolls V: Skyrim, 59.55 FPS: Ultra preset, Bethesda high-resolution texture pack, indoor cave scene.&lt;br /&gt;
&lt;br /&gt;
How Much Boost?&lt;br /&gt;
Because GPU Boost happens in real time and the boost factor varies depending on exactly what&#39;s being rendered, it&#39;s hard to pin the performance gain down to a single number. To help clarify the typical gain, all Kepler GPUs with GPU Boost list two clock speeds on their specification sheets: the base clock and the boost clock. The base clock equates to the current graphics clock on all NVIDIA GPUs; for Kepler, it&#39;s also the minimum clock speed that the GPU cores will run at in a 3D application. The boost clock is the typical clock speed that the GPU will run at in a 3D application.&lt;br /&gt;
&lt;br /&gt;
For example, the GeForce GTX 680 has a base clock of 1006 MHz and a boost clock of 1058 MHz. This means that in 3D games, the lowest the GPU will run at is 1006 MHz, but most of the time it&#39;ll run at around 1058 MHz. It won&#39;t sit at exactly this speed; based on real-time monitoring and feedback it may go higher or lower, but in most cases it will run close to it.&lt;br /&gt;
&lt;br /&gt;
GPU Boost doesn&#39;t take away from overclocking. In fact, with GPU Boost you now have more than one way to overclock your GPU: you can still increase the base clock as before, and the boost clock will rise correspondingly, or you can increase the power target, which is most useful in games that consume nearly 100% of it.&lt;br /&gt;
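The interplay between base clock, boost clock and power target described above can be sketched as a toy model. This is an illustration only, not NVIDIA&#39;s actual algorithm; the clock figures are the GTX 680 numbers quoted here, while the step size and power model are invented for the example.&lt;br /&gt;

```python
# Toy model of GPU Boost (illustrative only, not NVIDIA's real algorithm).
# Base/boost clocks are the GTX 680 figures from the text; the step size
# and the power-headroom model are invented for this sketch.

BASE_CLOCK_MHZ = 1006   # guaranteed minimum clock in a 3D application
BOOST_CLOCK_MHZ = 1058  # typical clock in a 3D application
STEP_MHZ = 13           # hypothetical boost increment

def boosted_clock(power_draw_pct, power_target_pct=100, offset_mhz=0):
    """Pick a clock for the current workload.

    power_draw_pct: estimated board power as a percentage of the stock
    power target.  Raising power_target_pct (the second overclocking
    knob described above) leaves more headroom, so the clock can climb.
    offset_mhz models a traditional base-clock overclock, which shifts
    base and boost clocks together.
    """
    clock = BOOST_CLOCK_MHZ + offset_mhz
    headroom = power_target_pct - power_draw_pct
    # step the clock up or down with the available power headroom
    clock += (headroom // 10) * STEP_MHZ
    # never drop below the base clock in a 3D application
    return max(BASE_CLOCK_MHZ + offset_mhz, clock)

print(boosted_clock(100))                        # at the limit -> 1058 (boost clock)
print(boosted_clock(115))                        # power-hungry scene -> 1032
print(boosted_clock(100, power_target_pct=115))  # raised power target -> 1071
```

Note how a power-hungry scene pulls the clock back toward the base clock, while raising the power target lets the same scene run faster, matching the overclocking behaviour described above.&lt;br /&gt;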
&lt;br /&gt;
Kepler vs. Fermi - the differences&lt;br /&gt;
&lt;br /&gt;
What you&#39;re looking at is a hugely simplified, high-level overview of the Kepler and Fermi GPU architectures. A cursory glance suggests that Kepler is a super-sized version of Fermi, which is a reasonably accurate way to describe it, but the devil is in the details: Kepler is super-sized in core count yet physically almost half the size.&lt;br /&gt;
&lt;br /&gt;
Both NVIDIA architectures, as used on the GTX 680 and GTX 580, are based on combining mini-GPUs that are known as Graphics Processing Clusters (GPC). Both GPUs have four of these GPC &#39;squares,&#39; flanked by the usual ROPs and memory controllers - more on those later. Each GPC is now home to two beefed-up Streaming Multiprocessor units (SMs) - the thin rectangular sections - rather than four in Fermi, though the overhaul is radical enough for NVIDIA to term them SMX - hey, &#39;X&#39; just sounds cooler, right? - rather than plain ol&#39; SM.&lt;br /&gt;
&lt;br /&gt;
Going from top to bottom, Kepler&#39;s host interface to the motherboard has been jacked up from PCIe 2.0 to PCIe 3.0, potentially doubling the bandwidth, which is useful as GPUs become increasingly powerful. Controlling the thread scheduling on both chips is what NVIDIA terms the master GigaThread Engine, which feeds each GPC. In this sense, nothing much has changed.&lt;br /&gt;
&lt;br /&gt;
However, the main focus of Kepler rests with improving the oomph provided by each GPC, and each of these little green squares you see is a CUDA core. It doesn&#39;t take a genius to figure out that Kepler has more, lots more, than Fermi, so the next step is to provide an exploded view of an SMX, to see just what NVIDIA has been up to.&lt;br /&gt;
&lt;br /&gt;
Why Is Power Efficiency Important?&lt;br /&gt;
When we first launched Fermi with the GeForce GTX 480, people told us how much they loved the performance, but they also told us they wished it consumed less power. Gamers want top performance, but they want it in a quiet, power efficient form factor. The feedback we received from Fermi really drove this point home. With Kepler, one of our top priorities was building a flagship GPU that was also a pleasure to game with.&lt;br /&gt;
&lt;br /&gt;
Kepler introduces two key changes that greatly improve the GPU&#39;s efficiency. First, we completely redesigned the streaming multiprocessor, the most important building block of our GPUs, for optimal performance per watt. Second, we added a feature called GPU Boost that dynamically increases clock speed to improve performance within the card&#39;s power budget.&lt;br /&gt;
&lt;br /&gt;
Kepler&#39;s new SM, called SMX, is a radical departure from past designs. SMX eliminates Fermi&#39;s &quot;2x&quot; processor clock and uses the same base clock across the GPU. To balance out this change, SMX uses an ultra-wide design with 192 CUDA cores. With a total of 1,536 cores across the chip, the GeForce GTX 680 handily outperforms the GeForce GTX 580.&lt;br /&gt;
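As a quick arithmetic check, the per-GPC layout described earlier (four GPCs, each home to two SMX units) multiplies out to exactly the 1,536-core total quoted here:&lt;br /&gt;

```python
# GeForce GTX 680 (Kepler): 4 GPCs, each with two SMX units of 192 CUDA cores,
# as described in the architecture overview above.
CORES_PER_SMX = 192
SMX_PER_GPC = 2
GPCS = 4

total_cores = CORES_PER_SMX * SMX_PER_GPC * GPCS
print(total_cores)  # 1536, matching the GTX 680's quoted core count
```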
&lt;br /&gt;
But what&#39;s benefited the most is power efficiency. Compared with the original Fermi SM, SMX delivers twice the performance per watt: given a watt of power, Kepler&#39;s SMX can do twice the work of Fermi&#39;s SM. And this is measured apples-to-apples, on the same manufacturing process. Imagine a conventional 50-watt light bulb that shines as brightly as a 100-watt bulb; that&#39;s what Kepler is like when gaming.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;/div&gt;</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/3184882195454604187/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/3184882195454604187' title='1 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/3184882195454604187'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/3184882195454604187'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2012/03/introducing-geforce-gtx-680-gpu.html' title='Introducing the GeForce GTX 680 GPU'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEicvB3S63N_XRyK4OACZ7Y3XFvyW3t13afmkz5CyafPHx-gHBEyGVlxOEk8MIODqZ7hJ6cuHAKbmI3VejdwvOHykzgt5hXyUhC7fmSLVbWA_ICkRa-jcepIc1aVCPyRllphsy_15vqkRUb-/s72-c/xfx_geforce_gtx_280_itocp.jpg" height="72" width="72"/><thr:total>1</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-6879232278269928276</id><published>2012-01-11T03:10:00.001-08:00</published><updated>2012-06-20T23:59:54.436-07:00</updated><title type='text'>AMD Radeon HD 7970</title><content type='html'>&lt;div dir=&quot;ltr&quot; style=&quot;text-align: left;&quot; trbidi=&quot;on&quot;&gt;&lt;div class=&quot;separator&quot; style=&quot;clear: both; text-align: center;&quot;&gt;&lt;a 
href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg31CGWo6F37XvOsZr96Wr385D-nxFznS0z6idUyRpmf2LrrYeD87KUELmZfPKv-XjZUxeEJ-DQyYf_Hg-suiY0qGJyEXW-6032ye7idHn2K-lhO2xkGL4yB9fgqRxtRBBOtpvI3rS7Kax4/s1600/HD+7970.jpg&quot; imageanchor=&quot;1&quot; style=&quot;clear: right; float: right; margin-bottom: 1em; margin-left: 1em;&quot;&gt;&lt;img alt=&quot;AMD RADEON HD 7970 graphic card&quot;  border=&quot;0&quot; card&quot;=&quot;&quot; graphic=&quot;&quot; height=&quot;304&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg31CGWo6F37XvOsZr96Wr385D-nxFznS0z6idUyRpmf2LrrYeD87KUELmZfPKv-XjZUxeEJ-DQyYf_Hg-suiY0qGJyEXW-6032ye7idHn2K-lhO2xkGL4yB9fgqRxtRBBOtpvI3rS7Kax4/s400/HD+7970.jpg&quot; width=&quot;320&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;br /&gt;
AMD is using Taiwan Semiconductor Manufacturing Company&#39;s (TSMC) new 28nm manufacturing process to build the new high-end Radeon HD 7970 GPU. The HD 7970 packs 4.3 billion transistors into a 365mm2 die and supports OpenCL and OpenGL 4.2. It is assembled from 32 GCN compute units, which translates to 2,048 stream processors, each based on AMD&#39;s new SIMD-plus-scalar architecture. The 7970 also includes 768KB of L2 cache and eight render back-ends, and features a 384-bit interface to 3GB of GDDR5 memory plus a PCIe 3.0 interface.&lt;br /&gt;
&lt;br /&gt;
AMD’s soon-to-be-available Radeon HD 7970 graphics card appears to be a beast of an overclocker: two enthusiasts recently managed to increase the card’s GPU clock from the standard 925MHz to an impressive 1,700MHz.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
AMD took advantage of TSMC’s new 28nm manufacturing process to build its new high-end GPU. The Radeon HD 7970 sports 4.3 billion transistors in a surprisingly small 365mm2 die. AMD product marketing manager Devon Nekechuk tells us AMD’s 28nm yields have been both “good” and “predictable.”&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Tahiti is assembled from 32 GCN compute units, which translates to 2,048 stream processors, each of which is based on AMD’s new SIMD-plus-scalar architecture. The existing Radeon HD 6970, by contrast, is equipped with just 1,536 stream processors and doesn’t benefit from the new architecture. The 7970 includes 768KB of L2 cache and eight render back-ends capable of pushing 32 color ROPs per clock and 128 Z/stencil ROPs per clock cycle. The existing 6970 provides the same quantity of render back-ends, but the newer card boasts higher throughput and much-improved efficiency; plus, the 7970 features a 384-bit interface to 3GB GDDR5 memory and a PCIe 3.0 interface. The GPU is capable of peak throughput of 264GB/s.&lt;br /&gt;
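That quoted 264GB/s peak follows directly from the 384-bit bus width and the memory speed. The 5.5Gbps effective per-pin rate is not stated in the text; it is implied by the 1,375MHz memory clock listed in the benchmark table below, since GDDR5 transfers four bits per pin per memory-clock cycle:&lt;br /&gt;

```python
# Peak memory bandwidth of the Radeon HD 7970 from its bus width and
# memory clock.  GDDR5 transfers 4 bits per pin per memory-clock cycle,
# so a 1375MHz memory clock gives a 5.5Gbps effective data rate per pin.
BUS_WIDTH_BITS = 384
MEM_CLOCK_MHZ = 1375
EFFECTIVE_GBPS_PER_PIN = MEM_CLOCK_MHZ * 4 / 1000  # 5.5 Gbps

bandwidth_gbs = BUS_WIDTH_BITS / 8 * EFFECTIVE_GBPS_PER_PIN
print(bandwidth_gbs)  # 264.0 GB/s, matching the quoted peak throughput
```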
&lt;br /&gt;
&lt;div class=&quot;separator&quot; style=&quot;clear: both; text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh3WFFEypzyZjPV9MuGOK7_Zd26hE0kZxBpt9VrGcwsDtbWAqq7gOlEU5cyKkp0yeXp-bxHMjhMvShLvN3QhtCsFSEQUx4BVmjag8yPUH-8fsT-rpIJRnEg_yE0xOByFYdDVmFZ_Hvw-tsL/s1600/7970prev-eyefinity.jpg&quot; imageanchor=&quot;1&quot; style=&quot;margin-left: 1em; margin-right: 1em;&quot;&gt;&lt;img alt=&quot;AMD Radeon HD 7970 graphic card&quot; border=&quot;0&quot; card&quot;=&quot;&quot; graphic=&quot;&quot; height=&quot;134&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh3WFFEypzyZjPV9MuGOK7_Zd26hE0kZxBpt9VrGcwsDtbWAqq7gOlEU5cyKkp0yeXp-bxHMjhMvShLvN3QhtCsFSEQUx4BVmjag8yPUH-8fsT-rpIJRnEg_yE0xOByFYdDVmFZ_Hvw-tsL/s400/7970prev-eyefinity.jpg&quot; width=&quot;400&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The semiconductor maker extends its GPU leadership with the AMD Radeon HD 7970 through AMD App Acceleration. This technology enables exquisite high-definition video images and exceptional performance improvements for everyday applications.&lt;br /&gt;
&lt;br /&gt;
Also introduced in the card is AMD Eyefinity technology which enables gamers and desktop enthusiasts to connect up to six displays to one graphics card, delivering immersive stereoscopic 3D experiences and astonishing 16k x 16k display resolutions.&lt;br /&gt;
&lt;br /&gt;
Equipped with the latest PCI Express 3.0 standard and armed with fast GDDR5 memory, the card delivers uncompromised image quality and accelerated GPU performance.&lt;br /&gt;
&lt;br /&gt;
Based on intelligent AMD ZeroCore Power and AMD PowerTune technologies, the AMD Radeon HD 7970 enables higher performance levels while maximizing power efficiencies.&lt;br /&gt;
&lt;br /&gt;
GCN marks a major shift in how AMD GPUs operate, behaving more like a general-purpose vector processor than a pure graphics engine. What’s more, each basic building block, called a GCN Compute Unit, includes a scalar coprocessor that can behave like a traditional—but non-pipelined—CPU. AMD has beefed up the caches that are distributed throughout the GPU. Each GCN core (yes, AMD is calling them cores) has its own dedicated L1 read/write cache. Each group of four cores shares a 16KB instruction cache and a 32KB scalar data cache. All the cores communicate over a shared bus to a partitioned L2 cache that can be sized differently depending on the graphics card and particular GPU die.&lt;br /&gt;
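The cache hierarchy just described can be summarised in a small sketch. The sizes are the ones quoted in the paragraph above; the per-core L1 size is not stated in the text, so it is left unspecified here.&lt;br /&gt;

```python
# Sketch of the GCN cache hierarchy as described above.
gcn_caches = {
    "per_core": {
        "L1_read_write": "dedicated (size not stated)",
    },
    "per_4_core_group": {
        "instruction_cache_kb": 16,
        "scalar_data_cache_kb": 32,
    },
    "chip_wide": {
        "L2": "partitioned; sized per graphics card / GPU die",
    },
}

for level, caches in gcn_caches.items():
    print(level, caches)
```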
AMD intends for GCN to serve as the basis for several product families. The first product, code-named Tahiti, is aimed at gaming enthusiasts who want maximum frame rates with maximum eye candy. The next product, code-named Pitcairn, will supersede the Radeon HD 6800 series. Pitcairn will be followed by a series code-named Cape Verde, which AMD believes will redefine the segment now held by products such as the Radeon HD 6700 series.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Benchmarks&lt;br /&gt;
&lt;table border=&quot;1&quot; cellpadding=&quot;4&quot;&gt;
&lt;tr&gt;&lt;th&gt;&lt;/th&gt;&lt;th&gt;AMD Radeon HD 7970 Reference&lt;/th&gt;&lt;th&gt;XFX Radeon HD 6970&lt;/th&gt;&lt;th&gt;EVGA GTX 580 SC&lt;/th&gt;&lt;th&gt;EVGA GTX 580 Classified&lt;/th&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;3DMark 2011 Perf&lt;/td&gt;&lt;td&gt;&lt;b&gt;7,985&lt;/b&gt;&lt;/td&gt;&lt;td&gt;5,750&lt;/td&gt;&lt;td&gt;6,747&lt;/td&gt;&lt;td&gt;7,321&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;3DMark Vantage Perf&lt;/td&gt;&lt;td&gt;&lt;b&gt;31,873&lt;/b&gt;&lt;/td&gt;&lt;td&gt;24,453&lt;/td&gt;&lt;td&gt;26,936&lt;/td&gt;&lt;td&gt;28,559&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Unigine Heaven 2.5 (fps)&lt;/td&gt;&lt;td&gt;&lt;b&gt;28&lt;/b&gt;&lt;/td&gt;&lt;td&gt;17&lt;/td&gt;&lt;td&gt;22&lt;/td&gt;&lt;td&gt;23&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Shogun 2 (fps)&lt;/td&gt;&lt;td&gt;&lt;b&gt;28&lt;/b&gt;&lt;/td&gt;&lt;td&gt;19&lt;/td&gt;&lt;td&gt;22&lt;/td&gt;&lt;td&gt;24&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Far Cry 2 / Long (fps)&lt;/td&gt;&lt;td&gt;&lt;b&gt;96&lt;/b&gt;&lt;/td&gt;&lt;td&gt;75&lt;/td&gt;&lt;td&gt;85&lt;/td&gt;&lt;td&gt;92&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;HAWX 2 DX11 (fps)&lt;/td&gt;&lt;td&gt;113&lt;/td&gt;&lt;td&gt;73&lt;/td&gt;&lt;td&gt;120&lt;/td&gt;&lt;td&gt;&lt;b&gt;128&lt;/b&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;STALKER: CoP DX11 (fps)&lt;/td&gt;&lt;td&gt;&lt;b&gt;37&lt;/b&gt;&lt;/td&gt;&lt;td&gt;25&lt;/td&gt;&lt;td&gt;28&lt;/td&gt;&lt;td&gt;29&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Just Cause 2 (fps)&lt;/td&gt;&lt;td&gt;&lt;b&gt;48&lt;/b&gt;&lt;/td&gt;&lt;td&gt;31&lt;/td&gt;&lt;td&gt;41&lt;/td&gt;&lt;td&gt;&lt;b&gt;48&lt;/b&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Batman: Arkham City (fps)&lt;/td&gt;&lt;td&gt;&lt;b&gt;51&lt;/b&gt;&lt;/td&gt;&lt;td&gt;36&lt;/td&gt;&lt;td&gt;45&lt;/td&gt;&lt;td&gt;47&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Metro 2033 (fps)&lt;/td&gt;&lt;td&gt;&lt;b&gt;17&lt;/b&gt;&lt;/td&gt;&lt;td&gt;14&lt;/td&gt;&lt;td&gt;15&lt;/td&gt;&lt;td&gt;&lt;b&gt;17&lt;/b&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;DiRT3 (fps)&lt;/td&gt;&lt;td&gt;&lt;b&gt;60&lt;/b&gt;&lt;/td&gt;&lt;td&gt;44&lt;/td&gt;&lt;td&gt;50&lt;/td&gt;&lt;td&gt;55&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Core / Memory Clock Speeds (MHz)&lt;/td&gt;&lt;td&gt;925 / 1375&lt;/td&gt;&lt;td&gt;880 / 1375&lt;/td&gt;&lt;td&gt;797 / 1013&lt;/td&gt;&lt;td&gt;855 / 1053&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Power @ idle (W)&lt;/td&gt;&lt;td&gt;124&lt;/td&gt;&lt;td&gt;126&lt;/td&gt;&lt;td&gt;140&lt;/td&gt;&lt;td&gt;140&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Power @ full throttle (W)&lt;/td&gt;&lt;td&gt;325*&lt;/td&gt;&lt;td&gt;296&lt;/td&gt;&lt;td&gt;344&lt;/td&gt;&lt;td&gt;385&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Price&lt;/td&gt;&lt;td&gt;$549&lt;/td&gt;&lt;td&gt;$350&lt;/td&gt;&lt;td&gt;$550&lt;/td&gt;&lt;td&gt;$600&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;
* &quot;Long dark&quot; system power was 109W&lt;br /&gt;
Best scores are bolded. Our test bed is a 3.33GHz Core i7 3960X Extreme Edition in an Asus P979X Deluxe motherboard with 16GB of Corsair DDR3/1600 and an AX1200 Corsair PSU. The OS is 64-bit Windows Ultimate. All games are run at 2560 x 1600 with 4x AA, except for the 3DMark tests.&lt;/div&gt;</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/6879232278269928276/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/6879232278269928276' title='0 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/6879232278269928276'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/6879232278269928276'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2012/01/amd-radeon-hd-7970.html' title='AMD Radeon HD 7970'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg31CGWo6F37XvOsZr96Wr385D-nxFznS0z6idUyRpmf2LrrYeD87KUELmZfPKv-XjZUxeEJ-DQyYf_Hg-suiY0qGJyEXW-6032ye7idHn2K-lhO2xkGL4yB9fgqRxtRBBOtpvI3rS7Kax4/s72-c/HD+7970.jpg" height="72" width="72"/><thr:total>0</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-8821762224534850894</id><published>2011-12-23T21:30:00.001-08:00</published><updated>2012-06-21T00:08:32.911-07:00</updated><title type='text'>ATI HD 7950</title><content type='html'>&lt;div dir=&quot;ltr&quot; style=&quot;text-align: left;&quot; trbidi=&quot;on&quot;&gt;The release of Radeon HD 
7970 prompted a lower-priced version for enthusiasts, featuring the same 28nm GPU architecture.&lt;br /&gt;
&lt;br /&gt;
The Radeon HD 7950 is a Tahiti-class GPU, code-named &quot;Tahiti Pro&quot;: the same chip as on the HD 7970, but with fewer compute units.&lt;br /&gt;
&lt;br /&gt;
A leaked slide shows the card&#39;s specifications: the same 4.3 billion transistors, 1,792 stream processors compared with 2,048 on the HD 7970, and 3GB of GDDR5 connected over the same 384-bit bus at 5Gbps. The card offers one DVI, one HDMI and two mini-DisplayPort video outputs, along with PCI Express 3.0 and DirectX 11.1 compatibility. The memory speed has not been announced yet, but we guess it will be about 150MHz lower than the HD 7970&#39;s.&lt;br /&gt;
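From the leaked figures above, the HD 7950&#39;s peak memory bandwidth works out as follows. This is a straightforward calculation assuming, as usual for GDDR5, that the 5Gbps figure is the effective data rate per pin:&lt;br /&gt;

```python
# Peak memory bandwidth implied by the leaked HD 7950 specs:
# a 384-bit bus at an effective 5Gbps per pin (assumed per-pin rate).
BUS_WIDTH_BITS = 384
GBPS_PER_PIN = 5

bandwidth_gbs = BUS_WIDTH_BITS / 8 * GBPS_PER_PIN
print(bandwidth_gbs)  # 240.0 GB/s
```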
The price and release date are still unknown, but they will surely be revealed very soon, so check back with us.&lt;br /&gt;
&lt;br /&gt;
The Radeon HD 7970 uses the new 28nm process technology to deliver more performance at lower power consumption. It is now the fastest single-GPU graphics card, with a 925MHz core clock that can reach 1.1GHz when overclocked, 2,048 stream processors, and a wide 384-bit memory bus connected to 3GB of GDDR5, all over PCI-E 3.0. At $549, the price is $200 more than the previous HD 6970.&lt;br /&gt;
&lt;br /&gt;
High-end HD 7900-series GPUs will come in Tahiti Pro and Tahiti XT variants, with the monstrous dual-GPU card code-named New Zealand. These GPUs use the new Graphics Core Next (GCN) architecture; a more programming-friendly design, GCN is intended not only to improve performance but also to improve the functionality of GPGPU applications.&lt;br /&gt;
&lt;br /&gt;
&lt;a href=&quot;http://i.imgur.com/Osyqu.jpg&quot;&gt;http://i.imgur.com/Osyqu.jpg&lt;/a&gt;&lt;br /&gt;
&lt;br /&gt;
28nm HPL Technology from TSMC&lt;br /&gt;
&lt;br /&gt;
TSMC’s 28-nanometer HPL technology provides high performance with low power consumption. The AMD Radeon HD 7870 has a maximum power consumption of only 120 watts, compared with 250 watts for its 40nm-made predecessor, while Pro-Thames, Lombok XT and Lombok Pro all use the same power-efficient 28nm technology, with consumption ranging from 90 watts down to 50 watts.&lt;br /&gt;
&lt;br /&gt;
&lt;a href=&quot;http://i.imgur.com/Du4kh.jpg&quot;&gt;http://i.imgur.com/Du4kh.jpg&lt;/a&gt;&lt;br /&gt;
&lt;br /&gt;
Apparently, in its efforts to maintain a good balance between bandwidth and the memory bus, AMD has chosen to abandon GDDR5 memory and instead use Rambus XDR2 memory. AMD has paid royalties to Rambus for the use of its memory technologies for several years, so a decision to implement them in its chips would not sound strange.&lt;br /&gt;
&lt;br /&gt;
&lt;a href=&quot;http://i.imgur.com/OeRlM.jpg&quot;&gt;http://i.imgur.com/OeRlM.jpg&lt;/a&gt;&lt;br /&gt;
&lt;br /&gt;
XDR2 Rambus memory performs better, but costs more too&lt;br /&gt;
http://www.youtube.com/watch?v=v81VchLF2vE&amp;amp;feature=youtu.be&lt;br /&gt;
&lt;br /&gt;
First Stage of Filtered Data&lt;br /&gt;
&lt;br /&gt;
These cards can be expected between October and December of this year, Q4 2011, while the high-end HD 7900 series is expected in Q1 2012. It is too early to get a complete picture of what AMD has prepared for us, and these details and rumours cannot be taken as valid or official information. This is just the first stage of filtered data, and we have no doubt that in the coming days and weeks there will be many more leaks like this, perhaps even some showing performance reviews of the Radeon HD 7000 series.&lt;br /&gt;
&lt;br /&gt;
Source: http://lenzfire.com/2011/09/amd-radeon-hd-7000-series-graphic-card-details-exposed-40795/2/&lt;br /&gt;
&lt;br /&gt;
Figured I would post it here so you didn&#39;t have to leave OCF.&lt;br /&gt;
&lt;div class=&quot;separator&quot; style=&quot;clear: both; text-align: center;&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjOxl08pM9PUdZnmE66brdT4DE-0BK1pkXCR6hRnzKXPL901_YEpwxZ4k4-Bry2ZPKFnGTS-kyA9IrwjlzukLVN6oVn217YseX8n3nJacCwkHCH-C4n4DmjgD_E7g7shuKAfX2fSIdjUZH2/s1600/HD+7970.jpg&quot; imageanchor=&quot;1&quot; style=&quot;clear: right; float: right; margin-bottom: 1em; margin-left: 1em;&quot;&gt;&lt;img alt=&quot;ATI HD 7950 graphic card&quot; border=&quot;0&quot; height=&quot;304&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjOxl08pM9PUdZnmE66brdT4DE-0BK1pkXCR6hRnzKXPL901_YEpwxZ4k4-Bry2ZPKFnGTS-kyA9IrwjlzukLVN6oVn217YseX8n3nJacCwkHCH-C4n4DmjgD_E7g7shuKAfX2fSIdjUZH2/s400/HD+7970.jpg&quot; width=&quot;320&quot; /&gt;&lt;/a&gt;&lt;/div&gt;&lt;/div&gt;</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/8821762224534850894/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/8821762224534850894' title='0 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/8821762224534850894'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/8821762224534850894'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2011/12/ati-hd-7950.html' title='ATI HD 7950'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" 
url="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjOxl08pM9PUdZnmE66brdT4DE-0BK1pkXCR6hRnzKXPL901_YEpwxZ4k4-Bry2ZPKFnGTS-kyA9IrwjlzukLVN6oVn217YseX8n3nJacCwkHCH-C4n4DmjgD_E7g7shuKAfX2fSIdjUZH2/s72-c/HD+7970.jpg" height="72" width="72"/><thr:total>0</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-837847764489482776</id><published>2011-01-10T05:58:00.001-08:00</published><updated>2012-06-21T00:09:55.161-07:00</updated><title type='text'>ATI Radeon HD 6970 2GB</title><content type='html'>&lt;div dir=&quot;ltr&quot; style=&quot;text-align: left;&quot; trbidi=&quot;on&quot;&gt;&lt;br /&gt;
&lt;br /&gt;
&lt;div dir=&quot;ltr&quot; style=&quot;text-align: left;&quot; trbidi=&quot;on&quot;&gt;&lt;img alt=&quot;ATI Radeon HD 6970 2GB graphic card&quot; border=&quot;0&quot; id=&quot;BLOGGER_PHOTO_ID_5560557113695252754&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgCLf67w_EDyn9FLubIGSWmtWC45-pRp16JDw6KVxYskzR4rzvgbfPvXDgqckTTERyA0y_KSxksFBY-rT2BjTjTQ9fJjyaT0PFDfhqJZoXleMEQTdLvN3KfzOqbhqGkKZiVu3PdpqWM5W_2/s400/hd-6970-1-w.jpg&quot; style=&quot;cursor: hand; cursor: pointer; float: right; height: 163px; margin: 0 0 10px 10px; width: 400px;&quot; /&gt;&lt;br /&gt;
&lt;span class=&quot;Apple-style-span&quot; style=&quot;font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif; font-size: 12px;&quot;&gt;The previous Radeon HD 6800-series got our hopes up in two ways – firstly it sounded like it’d be more than a mid-range GPU that went toe-to-toe with Nvidia’s GeForce GTX 460 cards, and secondly it wasn’t all that new. But while the Barts GPU of the HD 6800 looked more like an overclocked HD 5830 with a tweaked front-end unit, the Cayman GPU of the Radeon HD 6900 is a completely new design throughout.&lt;br /&gt;
&lt;/span&gt;&lt;br /&gt;
&lt;div&gt;&lt;span class=&quot;Apple-style-span&quot; style=&quot;font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif; font-size: 12px;&quot;&gt;&lt;br /&gt;
&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span class=&quot;Apple-style-span&quot; style=&quot;font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif; font-size: 12px;&quot;&gt;ATI might have only just launched its &lt;a href=&quot;http://www.bit-tech.net/hardware/graphics/2010/10/22/ati-radeon-hd-6870-review/1&quot; style=&quot;outline-color: initial; outline-style: none; outline-width: initial; text-decoration: none;&quot; target=&quot;_blank&quot;&gt;Radeon HD 6870 1GB&lt;/a&gt; and &lt;a href=&quot;http://www.bit-tech.net/hardware/graphics/2010/10/22/ati-radeon-hd-6850-review/1&quot; style=&quot;outline-color: initial; outline-style: none; outline-width: initial; text-decoration: none;&quot; target=&quot;_blank&quot;&gt;Radeon HD 6850 1GB&lt;/a&gt; cards, but enthusiasts are already looking to the next big thing: the company&#39;s Cayman-based Radeon HD 6970.&lt;br /&gt;
&lt;br /&gt;
Chinese-language site Zol.com claims to have got its hands on &lt;a href=&quot;http://translate.google.com/translate?js=n&amp;amp;prev=_t&amp;amp;hl=en&amp;amp;ie=UTF-8&amp;amp;layout=2&amp;amp;eotf=1&amp;amp;sl=auto&amp;amp;tl=en&amp;amp;u=http%3A%2F%2Fvga.zol.com.cn%2F195%2F1953715.html&quot; style=&quot;outline-color: initial; outline-style: none; outline-width: initial; text-decoration: none;&quot; target=&quot;_blank&quot;&gt;benchmarks for the ATI Radeon HD 6970&lt;/a&gt;, although it&#39;s not revealing where the figures have come from.&lt;br /&gt;
&lt;br /&gt;
The site is reporting, however, that the Cayman XT-based AMD Radeon HD 6970, believed to be the 1GB GDDR5 model, has performed admirably in the 3DMark Vantage test suite, scoring 23,499 3DMarks in Performance mode. In the Unigine Heaven benchmark, the card managed 36.6fps at a resolution of 1,920 x 1,200 with 4x anti-aliasing and 16x anisotropic filtering.&lt;br /&gt;
&lt;br /&gt;
Those scores, which represent a pre-release version of the card with unoptimised drivers, are certainly a boost over the company&#39;s Radeon HD 5870 1GB card, which managed 19,337 3DMarks and 17.3 fps in Unigine Heaven at the same settings.&lt;br /&gt;
&lt;br /&gt;
Impressively, the figures also beat Nvidia&#39;s current high-end card, the GeForce GTX 480 1.5GB, quite considerably: at the same settings, the GTX 480 1.5GB scored 21,106 in 3DMark Vantage and 29.5fps in Unigine Heaven.&lt;br /&gt;
&lt;br /&gt;
Further details of the as-yet unannounced card aren&#39;t available, but it is rumoured to feature a TDP of 255W or higher, and require a six- and an eight-pin power supply connection.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span class=&quot;Apple-style-span&quot; style=&quot;color: #222222; font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif; font-size: 12px;&quot;&gt;&lt;br /&gt;
&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span class=&quot;Apple-style-span&quot; style=&quot;color: #222222; font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif; font-size: 12px;&quot;&gt;&lt;br /&gt;
&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span class=&quot;Apple-style-span&quot; style=&quot;color: #222222; font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif; font-size: 12px;&quot;&gt;For specifications, click the link below:&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span class=&quot;Apple-style-span&quot; style=&quot;color: #222222; font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif; font-size: 12px;&quot;&gt;&lt;a href=&quot;http://www.microcenter.com/single_product_results.phtml?product_id=0354422&quot;&gt;http://www.microcenter.com/single_product_results.phtml?product_id=0354422&lt;/a&gt;&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;br /&gt;
&lt;/div&gt;&lt;div&gt;&lt;br /&gt;
&lt;/div&gt;&lt;div&gt;Check out the video for more details:&lt;/div&gt;&lt;div&gt;&lt;br /&gt;
&lt;/div&gt;&lt;div&gt;&lt;a href=&quot;http://videos.wittysparks.com/id/1866934023&quot;&gt;http://videos.wittysparks.com/id/1866934023&lt;/a&gt;&lt;/div&gt;&lt;div&gt;&lt;br /&gt;
&lt;/div&gt;&lt;div&gt;&lt;br /&gt;
&lt;/div&gt;&lt;div&gt;Check out comparisons against the NVIDIA GeForce and ATI HD 5000 series:&lt;/div&gt;&lt;div&gt;&lt;br /&gt;
&lt;/div&gt;&lt;div&gt;&lt;a href=&quot;http://benchmarkreviews.com/index.php?option=com_content&amp;amp;task=view&amp;amp;id=13177&amp;amp;Itemid=8&quot;&gt;http://benchmarkreviews.com/index.php?option=com_content&amp;amp;task=view&amp;amp;id=13177&amp;amp;Itemid=8&lt;/a&gt;&lt;/div&gt;&lt;div&gt;&lt;br /&gt;
&lt;/div&gt;&lt;div&gt;&lt;span class=&quot;Apple-style-span&quot; style=&quot;color: #3c3b3b; font-family: &amp;quot;Verdana&amp;quot;, &amp;quot;Arial&amp;quot;, sans-serif, &amp;quot;Helvetica&amp;quot;; font-size: 12px; line-height: 18px;&quot;&gt;The 6970 definitely beats the 5870 in benchmarks, and it also offers much better scaling performance in CrossFire and, in my opinion, better performance in heavy DX11 workloads in general. Furthermore, the 6970&#39;s 2GB of memory gives it better performance at resolutions above 1080p. Since the 5870 has dropped in price to around $250, it is still good value if you are going for a single card. For $300, though, I would much rather go with the 6950, which has good potential to be unlocked into a 6970 with a BIOS flash, making the 6950 the best buy in my opinion. At your resolution even a 5850/6850/6870 would be more than acceptable; these are all slightly slower than the 5870 (with the exception of an overclocked 6870/5850/6850), so the differences are hard to quantify, but I would say the 6000 series is worth the price increase, and if you can afford it, go for a 6950. I believe the unlocking guides are on Guru3D.&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span class=&quot;Apple-style-span&quot; style=&quot;color: #222222; font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif; font-size: 12px;&quot;&gt;&lt;br /&gt;
&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span class=&quot;Apple-style-span&quot; style=&quot;color: #222222; font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif; font-size: 12px;&quot;&gt;&lt;br /&gt;
&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span class=&quot;Apple-style-span&quot; style=&quot;font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif; font-size: 12px;&quot;&gt;&lt;/span&gt;&lt;br /&gt;
&lt;h2 style=&quot;font-size: 13px; margin-bottom: 0.5em; margin-left: 0px; margin-right: 0px; margin-top: 0.5em;&quot;&gt;&lt;span class=&quot;Apple-style-span&quot; style=&quot;font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif; font-size: 12px;&quot;&gt;ATI Radeon HD 6970 2GB&lt;/span&gt;&lt;/h2&gt;&lt;span class=&quot;Apple-style-span&quot; style=&quot;font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif; font-size: 12px;&quot;&gt;&lt;b&gt;Manufacturer&lt;/b&gt; &lt;a href=&quot;http://www.amd.com/uk/Pages/AMDHomePage.aspx&quot; style=&quot;outline-color: initial; outline-style: none; outline-width: initial; text-decoration: none;&quot; target=&quot;_blank&quot;&gt;ATI&lt;/a&gt;&lt;br /&gt;
&lt;b&gt;UK price (as reviewed)&lt;/b&gt; £310 (inc VAT) MSRP&lt;br /&gt;
&lt;b&gt;US price (as reviewed)&lt;/b&gt; $369 (ex tax) MSRP&lt;/span&gt;&lt;/div&gt;&lt;div&gt;&lt;span class=&quot;Apple-style-span&quot; style=&quot;color: #222222; font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif; font-size: 12px;&quot;&gt;&lt;br /&gt;
&lt;/span&gt;&lt;br /&gt;
&lt;span class=&quot;Apple-style-span&quot; style=&quot;color: #222222; font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif; font-size: 12px;&quot;&gt;&lt;br /&gt;
&lt;/span&gt;&lt;br /&gt;
&lt;span class=&quot;Apple-style-span&quot; style=&quot;color: #222222; font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif; font-size: 12px;&quot;&gt;&lt;br /&gt;
&lt;/span&gt;&lt;br /&gt;
&lt;span style=&quot;color: red; font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif;&quot;&gt;&lt;span style=&quot;font-size: 12px;&quot;&gt;&lt;b&gt;ATI 6970 user review:&lt;/b&gt;&lt;/span&gt;&lt;/span&gt;&lt;br /&gt;
&lt;span style=&quot;color: #222222; font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif;&quot;&gt;&lt;span style=&quot;font-size: 12px;&quot;&gt;&lt;b&gt;&lt;br /&gt;
&lt;/b&gt;&lt;/span&gt;&lt;/span&gt;&lt;br /&gt;
&lt;span style=&quot;color: #222222; font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif;&quot;&gt;&lt;span style=&quot;font-size: 12px;&quot;&gt;&lt;b&gt;&lt;br /&gt;
&lt;/b&gt;&lt;/span&gt;&lt;/span&gt;&lt;br /&gt;
&lt;span style=&quot;color: #222222; font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif;&quot;&gt;&lt;span style=&quot;font-size: 12px;&quot;&gt;&lt;b&gt;&lt;a href=&quot;http://www.amd.com/us/products/desktop/graphics/amd-radeon-hd-6000/hd-6970/pages/amd-radeon-hd-6970-overview.aspx&quot;&gt;http://www.amd.com/us/products/desktop/graphics/amd-radeon-hd-6000/hd-6970/pages/amd-radeon-hd-6970-overview.aspx&lt;/a&gt;&lt;/b&gt;&lt;/span&gt;&lt;/span&gt;&lt;br /&gt;
&lt;span style=&quot;color: #222222; font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif;&quot;&gt;&lt;span style=&quot;font-size: 12px;&quot;&gt;&lt;br /&gt;
&lt;/span&gt;&lt;/span&gt;&lt;br /&gt;
&lt;span style=&quot;color: #222222; font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif;&quot;&gt;&lt;span style=&quot;font-size: 12px;&quot;&gt;&lt;br /&gt;
&lt;/span&gt;&lt;/span&gt;&lt;br /&gt;
&lt;span style=&quot;background-color: white; color: #222222; font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif; font-size: 12px;&quot;&gt;While the Barts GPU of the HD 6800-series was a bit same-old, same-old, the new Cayman GPU of the HD 6900-series is a big change. For a start, there are two entire Front-End Engines, so the Cayman GPU has two triangle setup units, two tessellators, two rasterisers and can send twice as much work per clock to the stream processors as previous Radeon GPUs.&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;background-color: white; color: #222222; font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif; font-size: 12px;&quot;&gt;Those stream processors have been upgraded, so they’re all capable of the high-precision work that only a fifth of the stream processors of previous generations were able to calculate. They’re organised in groups of four, with 16 of these groups per SIMD Engine (or stream processor cluster).&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Below are some user review links; check them out:&lt;br /&gt;
&lt;a href=&quot;http://www.amazon.com/HIS-Eyefinity-Mini-DisplayPort-Express-H697F2G2M/dp/B004FPYL9C&quot;&gt;http://www.amazon.com/HIS-Eyefinity-Mini-DisplayPort-Express-H697F2G2M/dp/B004FPYL9C&lt;/a&gt;&lt;br /&gt;
&lt;br /&gt;
&lt;div style=&quot;text-align: left;&quot;&gt;&lt;br /&gt;
&lt;/div&gt;&lt;div style=&quot;text-align: left;&quot;&gt;&lt;br /&gt;
&lt;/div&gt;&lt;div style=&quot;text-align: left;&quot;&gt;&lt;a href=&quot;http://www.overclock.net/t/861225/fz-some-cayman-radeon-hd6970-specs&quot;&gt;http://www.overclock.net/t/861225/fz-some-cayman-radeon-hd6970-specs&lt;/a&gt;&lt;/div&gt;&lt;br /&gt;
&lt;span style=&quot;background-color: white; color: #222222; font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif; font-size: 12px;&quot;&gt;While this means that the HD 6970 2GB has fewer stream processors than the HD 5870 1GB, they’re more capable. ATI says the new layout, called VLIW4, is 10 per cent faster per mm&lt;/span&gt;&lt;sup style=&quot;background-color: white; color: #222222; font-family: Arial, Helvetica, sans-serif; text-align: -webkit-auto;&quot;&gt;2&lt;/sup&gt;&lt;span style=&quot;background-color: white; color: #222222; font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif; font-size: 12px;&quot;&gt;.&lt;/span&gt; &lt;br /&gt;
&lt;br /&gt;
&lt;/div&gt;&lt;div&gt;&lt;span class=&quot;Apple-style-span&quot; style=&quot;color: #222222; font-family: &amp;quot;Arial&amp;quot;, &amp;quot;Helvetica&amp;quot;, sans-serif; font-size: 12px;&quot;&gt;&lt;br /&gt;
&lt;/span&gt;&lt;/div&gt;&lt;/div&gt;&lt;br /&gt;
&lt;/div&gt;</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/837847764489482776/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/837847764489482776' title='0 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/837847764489482776'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/837847764489482776'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2011/01/ati-radeon-hd-6970-2gb.html' title='ATI Radeon HD 6970 2GB'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgCLf67w_EDyn9FLubIGSWmtWC45-pRp16JDw6KVxYskzR4rzvgbfPvXDgqckTTERyA0y_KSxksFBY-rT2BjTjTQ9fJjyaT0PFDfhqJZoXleMEQTdLvN3KfzOqbhqGkKZiVu3PdpqWM5W_2/s72-c/hd-6970-1-w.jpg" height="72" width="72"/><thr:total>0</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-597641761588957126</id><published>2010-12-21T08:08:00.000-08:00</published><updated>2010-12-21T08:08:02.739-08:00</updated><title type='text'>Ubuntu blogspot: Call For Testing: ATI and nVIDIA graphic cards</title><content type='html'>&lt;a href=&quot;http://latestpcperipheral.blogspot.com/&quot;&gt; Latest computer hardware &lt;/a&gt;&lt;br /&gt;&lt;br /&gt;&lt;a href=&quot;http://free-mobile-phone-games-download.blogspot.com/&quot;&gt; Free phone games &lt;/a&gt;&lt;br /&gt;&lt;br /&gt;&lt;a href=&quot;http://football-player-photos.blogspot.com/&quot;&gt; 
football player photos &lt;/a&gt;&lt;br /&gt;&lt;br /&gt;&lt;a href=&quot;http://supremegraphiccards.blogspot.com/&quot;&gt; Supreme graphic cards&lt;/a&gt;</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/597641761588957126/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/597641761588957126' title='0 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/597641761588957126'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/597641761588957126'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2010/12/ubuntu-blogspot-call-for-testing-ati.html' title='Ubuntu blogspot: Call For Testing: ATI and nVIDIA graphic cards'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><thr:total>0</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-7445048413302734137</id><published>2010-09-11T09:18:00.001-07:00</published><updated>2012-06-21T00:11:43.015-07:00</updated><title type='text'>NVIDIA GeForce GTX 460</title><content type='html'>&lt;div dir=&quot;ltr&quot; style=&quot;text-align: left;&quot; trbidi=&quot;on&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi6zqgwUE3amuweiaiSdkWtre88VIeSdXkG3A-2BFoKkMmUjdnx9P5cKqoxuVhsIlAge-FHeAtLMMMNPjIPHjSFEf4OjPbGHWMKC0AkOoMsl5uq98BtEZnvsSF4xeJ3aKNspiDn4hthYrNs/s1600/16010306160l.jpg&quot; onblur=&quot;try {parent.deselectBloggerImageGracefully();} catch(e) {}&quot;&gt;&lt;img alt=&quot;&quot; border=&quot;0&quot; 
id=&quot;BLOGGER_PHOTO_ID_5515694194122007426&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi6zqgwUE3amuweiaiSdkWtre88VIeSdXkG3A-2BFoKkMmUjdnx9P5cKqoxuVhsIlAge-FHeAtLMMMNPjIPHjSFEf4OjPbGHWMKC0AkOoMsl5uq98BtEZnvsSF4xeJ3aKNspiDn4hthYrNs/s400/16010306160l.jpg&quot; style=&quot;cursor: hand; cursor: pointer; float: right; height: 295px; margin: 0 0 10px 10px; width: 400px;&quot; alt=&quot;NVIDIA GeForce GTX 460 graphic card&quot;/&gt;&lt;/a&gt;&lt;br /&gt;
Over the last 12 months, nVidia have been all about one thing: delays. Month after month, I&#39;m sure many of you waited for the range-topping GeForce GTX 470 and 480 graphics cards. Indeed, nVidia were late to the DirectX 11 party, but today no one seems to care. The bottom line is that team green have taken back the performance crown and are succeeding in their catch-up endeavours. Not so long ago, we also saw the release of nVidia&#39;s competitor to Eyefinity, known as 3D Vision Surround. Step by step, it seems there is light at the end of the tunnel.&lt;br /&gt;
&lt;br /&gt;
One slight problem, though: not everyone on the market today can afford to drop upwards of £300 on a new DX11 graphics card. With ATi enjoying healthy sales figures from Radeon HD 5670, 5750 and 5770 SKUs, it is about time that nVidia returned with some more competition. Today we are pleased to present exactly that: meet the nVidia GeForce GTX 460.&lt;br /&gt;
&lt;br /&gt;
I have lied slightly, in that nVidia already have a &quot;mid-range&quot; offering on the market today, known as the GTX 465. As my colleague highlighted, the cut-down GTX 470/480 graphics card suffered from all of the power and heat issues of its larger siblings, but without any of the performance that made them worthwhile. To make matters worse, it is priced well above the £200 mark, making it terrible value for money. Not good news at all.&lt;br /&gt;
&lt;br /&gt;
This is where the GTX 460 comes into the picture. Like the observant chaps that nVidia are, they went away and reworked the GF100 architecture to offer a native mid range graphics card. That&#39;s right, no added baggage and no inflated price tags. I think we&#39;re all eager to see what it has to offer so let&#39;s jump straight to the specifications.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Can you believe that, after more than a year of rumors, debate and supposition, it has already been over four months since NVIDIA’s GTX 400-series launched? The GTX 480 and GTX 470 are, for all intents and purposes, still extremely popular and have shown us their gaming capabilities again and again. Only a few weeks ago these two high-end cards were joined by the GTX 465, which was met with a decidedly lukewarm reception from us as well as most other publications. In an effort to move on quickly from that bump in the road, NVIDIA is following up with yet another mid-range card: the GTX 460. &lt;br /&gt;
&lt;br /&gt;
Now we’re sure that some of you may be rolling your eyes towards the ceiling and thinking “not another power hungry, hot and expensive NVIDIA 400-series card”. Believe it or not, we’ll go on record right now by saying that this is one graphics card you&#39;ll want to pay attention to because the GTX 460 actually bucks several preconceptions many have had about the GTX 400-series cards. If we don’t yet have your attention, read on and I am sure you’ll start getting excited. &lt;br /&gt;
&lt;br /&gt;
Based on the GF104 core, the GTX 460 doesn’t sport a 3 billion transistor GF100 with a bunch of disabled cores like the GTX 465 did. Rather, it uses a slimmed-down 1.95 billion transistor die which is supposed to offer a much leaner power consumption envelope while being less expensive to produce and extremely compact. The result is beneficial for consumers on a number of fronts, especially considering NVIDIA will be releasing two versions of the GTX 460 right off the bat. There will be a 1GB, 256-bit SKU that will retail for around $230, while a slightly lower-end 768MB, 192-bit product should hit the magical $199 price point. Both are compatible with all of NVIDIA’s “Graphics Plus” technologies, including CUDA and 3D Vision Surround, which we talked about at length here.&lt;br /&gt;
&lt;br /&gt;
With the current price points as they are, the GTX 460 768MB directly targets the HD 5830’s performance envelope, but its price is the same as or slightly below that of most HD 5830 cards on the market. Meanwhile, the $230 GTX 460 1GB is aiming to bridge the sometimes-minuscule gap between the HD 5830 and the higher-end HD 5850s. This also bodes well for those of you who held off buying the $270 GTX 465 since, as you will see on the next pages, there are several areas in which this new card has the 465 beaten clean in the specs department. Just be aware that in preparation for the GTX 460 landing on store shelves, several of NVIDIA’s board partners have effectively cut the price of their GTX 465s to around $255. &lt;br /&gt;
&lt;br /&gt;
On a final note, while we are reviewing both the 768MB and 1GB cards in this article, it is quite likely that only the 768MB cards will be widely available come launch time. The 1GB cards will slowly trickle in throughout this week, with wide availability in the week of July 19th. &lt;br /&gt;
&lt;br /&gt;
All in all, the GTX 460 looks like a worthy successor to its predecessors, but the biggest question is whether it can actually surpass the higher-end cards when it comes to capturing the attention of a market that has been waiting a long time for a proper sub-$250 GPU.&lt;/div&gt;</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/7445048413302734137/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/7445048413302734137' title='0 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/7445048413302734137'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/7445048413302734137'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2010/09/nvidia-geforce-gtx-460.html' title='NVIDIA GeForce GTX 460'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi6zqgwUE3amuweiaiSdkWtre88VIeSdXkG3A-2BFoKkMmUjdnx9P5cKqoxuVhsIlAge-FHeAtLMMMNPjIPHjSFEf4OjPbGHWMKC0AkOoMsl5uq98BtEZnvsSF4xeJ3aKNspiDn4hthYrNs/s72-c/16010306160l.jpg" height="72" width="72"/><thr:total>0</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-4950983805107959087</id><published>2010-07-20T03:46:00.000-07:00</published><updated>2011-12-26T01:09:26.274-08:00</updated><title type='text'>ATI NEXT GENERATION GPU&#39;s HD6XXX SERIES</title><content type='html'>After the blockbuster 5000-series cards, ATI is planning a new next-generation GPU lineup for late 2010 or 
by early 2011.&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;Caicos, planned for late 2010, would be the successor to Cedar (Radeon HD5450)&lt;br /&gt;Turks, planned for late 2010, would be the successor to Redwood (HD55xx/HD5670)&lt;br /&gt;Barts, scheduled for early 2011, would be the successor to Juniper (Radeon HD57xx)&lt;br /&gt;Cayman, planned for early 2011, would be the successor to Cypress (Radeon HD58xx)&lt;br /&gt;Caribbean, scheduled for early 2011, would be the successor to Hemlock (Radeon HD5970)&lt;br /&gt;This whole new range of GPUs, codenamed Southern Islands, would be fabricated on a 40nm process, and AMD would focus on tessellation to catch up with Nvidia&#39;s Fermi cards.</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/4950983805107959087/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/4950983805107959087' title='0 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/4950983805107959087'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/4950983805107959087'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2010/07/ati-next-generation-gpus-hd6xxx-series.html' title='ATI NEXT GENERATION GPU&#39;s HD6XXX SERIES'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><thr:total>0</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-3663257869929745666</id><published>2010-04-17T00:13:00.001-07:00</published><updated>2012-06-21T00:12:55.114-07:00</updated><title type='text'>NVIDIA GTX 480 ,470</title><content 
type='html'>&lt;div dir=&quot;ltr&quot; style=&quot;text-align: left;&quot; trbidi=&quot;on&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEipVL4LY72jiS05j98_stwbrYDko3MwnbQJcAaZeJo4BzcxK70osJT3DQAbRXIe2Ltvbr6wxtvGEuy4-fB2LVJWKnmjNCHlReBE9gbaK6ouzHu-jnKIe8QZGElJ4Tik79fneot2XYW-LFpR/s1600/imageview.php.jpeg&quot; onblur=&quot;try {parent.deselectBloggerImageGracefully();} catch(e) {}&quot;&gt;&lt;img alt=&quot;NVIDIA GTX 480 ,470 graphic card&quot; border=&quot;0&quot; id=&quot;BLOGGER_PHOTO_ID_5461002322268904946&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEipVL4LY72jiS05j98_stwbrYDko3MwnbQJcAaZeJo4BzcxK70osJT3DQAbRXIe2Ltvbr6wxtvGEuy4-fB2LVJWKnmjNCHlReBE9gbaK6ouzHu-jnKIe8QZGElJ4Tik79fneot2XYW-LFpR/s400/imageview.php.jpeg&quot; style=&quot;cursor: hand; cursor: pointer; float: right; height: 242px; margin: 0 0 10px 10px; width: 400px;&quot; /&gt;&lt;/a&gt;&lt;br /&gt;
&lt;br /&gt;
Next-generation gaming has arrived. NVIDIA® GeForce® GTX 480 gives your games an adrenaline shot with the world’s fastest performance and futuristic, visually-stunning graphics. Experience heart-pounding, cinematic visuals on your favorite games with the combined power of DirectX 11, CUDA™, and NVIDIA® PhysX® technologies. And expand your visual real estate across three HD displays in jaw-dropping stereoscopic 3D for the ultimate in immersive gaming. NVIDIA® GeForce® GTX 480: pure adrenaline meets visual bliss.&lt;br /&gt;
&lt;br /&gt;
Stunning Gaming Effects&lt;br /&gt;
Cutting-edge Microsoft DirectX 11 graphics and NVIDIA® PhysX® technology take gaming to a new level. Witness effects so realistic that you’ll have to remind yourself it’s just a game. Repeat: It’s just a game.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Total Immersion&lt;br /&gt;
Break through the boundaries of your screen. Games, movies and photos enter a new dimension with NVIDIA® 3D Vision™ technology. Thrill seeker? Use two GeForce GTX 480 cards in an NVIDIA® SLI® configuration to project across three displays with NVIDIA 3D Vision Surround&lt;sup&gt;1&lt;/sup&gt; technology. *Warning: may cause screen envy.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Futuristic Graphics&lt;br /&gt;
Prepare yourself for the future of gaming. GeForce GTX 480 powers interactive raytracing, bringing spectacular, photo-realistic renderings to your screen for the first time. Go ahead, spoil your eyes.&lt;br /&gt;
&lt;br /&gt;
&lt;sup&gt;1&lt;/sup&gt;NVIDIA 3D Vision Surround requires the upcoming NVIDIA Release 256 driver, two or more graphics cards in an NVIDIA SLI configuration, 3D Vision glasses, and 3D Vision-Ready displays. See www.nvidia.com/surround for more information.&lt;br /&gt;
&lt;br /&gt;
* Microsoft® DirectX® 11 Support&lt;br /&gt;
DirectX 11 GPU with Shader Model 5.0 support designed for ultra high performance in the new API’s key graphics feature, GPU-accelerated tessellation.&lt;br /&gt;
* NVIDIA® 3D Vision™ Surround Ready*&lt;br /&gt;
Expand your games across three displays in full stereoscopic 3D for the ultimate “inside the game” experience with the power of NVIDIA 3D Vision and SLI technologies. NVIDIA® Surround™ also supports triple screen gaming with non-stereo displays.&lt;br /&gt;
* Interactive Ray Tracing&lt;br /&gt;
By tracing the path of light through a 3D scene, ray tracing uses the power of the GPU to create spectacular, photo-realistic visuals. Get a glimpse into the future of gaming with ray tracing.&lt;br /&gt;
* 3-way NVIDIA SLI® Technology**&lt;br /&gt;
Industry leading 3-way NVIDIA SLI technology offers amazing performance scaling by implementing 3-way AFR (Alternate Frame Rendering) for the world’s premier gaming solution under Windows 7 with solid, state-of-the-art drivers.&lt;br /&gt;
* NVIDIA PhysX® Technology&lt;br /&gt;
Full support for NVIDIA PhysX technology, enabling a totally new class of physical gaming interaction for a more dynamic and realistic experience with GeForce.&lt;br /&gt;
* NVIDIA CUDA™ Technology&lt;br /&gt;
CUDA technology unlocks the power of the GPU’s processor cores to accelerate the most demanding tasks such as video transcoding, physics simulation, ray tracing, and more, delivering incredible performance improvements over traditional CPUs.&lt;br /&gt;
* 32x Anti-aliasing Technology&lt;br /&gt;
Lightning fast, high-quality anti-aliasing at up to 32x sample rates obliterates jagged edges.&lt;br /&gt;
* NVIDIA® PureVideo® HD Technology***&lt;br /&gt;
The combination of high-definition video decode acceleration and post-processing that delivers unprecedented picture clarity, smooth video, accurate color, and precise image scaling for movies and video.&lt;br /&gt;
* PCI Express 2.0 Support&lt;br /&gt;
Designed for the new PCI Express 2.0 bus architecture offering the highest data transfer speeds for the most bandwidth-hungry games and 3D applications, while maintaining backwards compatibility with existing PCI Express motherboards for the broadest support.&lt;br /&gt;
* Dual-link DVI Support&lt;br /&gt;
Able to drive the industry’s largest and highest-resolution flat-panel displays, up to 2560x1600, with support for High-bandwidth Digital Content Protection (HDCP).&lt;br /&gt;
* HDMI 1.3a Support&lt;br /&gt;
Fully integrated support for HDMI 1.3a including xvYCC, deep color, and 7.1 digital surround sound.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Overclocking GeForce GTX 480 with extra GPU voltage&lt;br /&gt;
&lt;br /&gt;
The GeForce GTX 400 series has definitely been a topic of much discussion. Though extremely powerful, this graphics card runs hot and is rather loud. These sure are lackluster traits... but that doesn&#39;t mean the product doesn&#39;t have any further potential. On the contrary, as today&#39;s article will show, there&#39;s in fact a lot of additional performance to be found in the GeForce GTX 480, that is, if you know how to tweak it properly. Today we&#39;ll touch on the holy grail of overclocking: voltage tweaking.&lt;br /&gt;
&lt;br /&gt;
Yesterday we released an update to Afterburner: voltage control for the 470/480 cards was implemented in MSI Afterburner revision 1.6.0 beta 4 and upwards.&lt;br /&gt;
&lt;br /&gt;
You can use MSI Afterburner with ANY brand of GeForce GTX 480; trust me when I say the cards are all the same, as partners have purchased them from NVIDIA directly.&lt;br /&gt;
&lt;br /&gt;
Now, as you guys probably know, the default clock frequencies for the GeForce GTX 480 are 700 MHz on the core, 1400 MHz on the shader domain and 3700 MHz on the memory. With additional voltage we were able to push close to 850 MHz on the core and roughly 1700 MHz on the shader processor domain.&lt;br /&gt;
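Those stock and tweaked clocks work out to roughly a 21% bump on the core and shader domains; here is a quick back-of-the-envelope sketch (numbers taken straight from the paragraph above, with memory assumed left at stock):

```python
# Overclocking headroom for the GeForce GTX 480, stock vs. voltage-tweaked.
stock   = {"core": 700, "shader": 1400, "memory": 3700}  # MHz (defaults)
tweaked = {"core": 850, "shader": 1700, "memory": 3700}  # MHz (memory at stock)

for domain in stock:
    gain = (tweaked[domain] - stock[domain]) / stock[domain] * 100
    print(f"{domain}: {stock[domain]} -> {tweaked[domain]} MHz (+{gain:.1f}%)")
# core: 700 -> 850 MHz (+21.4%)
# shader: 1400 -> 1700 MHz (+21.4%)
# memory: 3700 -> 3700 MHz (+0.0%)
```

That clock headroom lines up with the 15 to 20% real-world gain mentioned below, since performance rarely scales perfectly with clock speed.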
&lt;br /&gt;
The first group of people who will really benefit from what we&#39;ll show you today are those who opt for a liquid-cooled GeForce GTX 470 or 480, as they will be in control of noise levels and get much better GPU temperatures. As such they should, fairly straightforwardly, be able to get another 15 to 20% of performance out of the card.&lt;/div&gt;</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/3663257869929745666/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/3663257869929745666' title='0 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/3663257869929745666'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/3663257869929745666'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2010/04/nvidia-gtx-480-470.html' title='NVIDIA GTX 480 ,470'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEipVL4LY72jiS05j98_stwbrYDko3MwnbQJcAaZeJo4BzcxK70osJT3DQAbRXIe2Ltvbr6wxtvGEuy4-fB2LVJWKnmjNCHlReBE9gbaK6ouzHu-jnKIe8QZGElJ4Tik79fneot2XYW-LFpR/s72-c/imageview.php.jpeg" height="72" width="72"/><thr:total>0</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-6872509841901490621</id><published>2010-03-15T11:33:00.001-07:00</published><updated>2012-06-21T00:12:57.973-07:00</updated><title type='text'>3DTV Play</title><content type='html'>&lt;div dir=&quot;ltr&quot; style=&quot;text-align: 
left;&quot; trbidi=&quot;on&quot;&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhf7RGCX88_w11ghyphenhyphenUO-7bHRS6V_CpgumXngvmb8yzao39jXJAuOhhKp-lgQjPgipqkuCtnWQp870gQLlGVGH7oZWvXI_9u9DS3kptwSaetXxHTDuDOI5mklWB-uIX5at_sY0uZjZXhZpKZ/s1600-h/NV_3DVision_Header_Alienware.jpg&quot; onblur=&quot;try {parent.deselectBloggerImageGracefully();} catch(e) {}&quot;&gt;&lt;img alt=&quot;3DTV Play nvidia&quot; border=&quot;0&quot; id=&quot;BLOGGER_PHOTO_ID_5448930851346390546&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhf7RGCX88_w11ghyphenhyphenUO-7bHRS6V_CpgumXngvmb8yzao39jXJAuOhhKp-lgQjPgipqkuCtnWQp870gQLlGVGH7oZWvXI_9u9DS3kptwSaetXxHTDuDOI5mklWB-uIX5at_sY0uZjZXhZpKZ/s400/NV_3DVision_Header_Alienware.jpg&quot; style=&quot;cursor: hand; cursor: pointer; float: right; height: 171px; margin: 0 0 10px 10px; width: 400px;&quot; /&gt;&lt;/a&gt;&lt;br /&gt;
&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi49TAnnsmJuVOhdrArjpxLGU8wgD1OJi383KuMSllbtmBwxRQtXjsGtyHiC-j5TG222Zx7ht2jxkmBHvIPWFebrxZqAlOtzX9fGb4w5pSNAc4QuuPyDa6qFbt7dMH_3I8dAdj-Cfv3HOHY/s1600-h/header_3dtv_play.jpg&quot; onblur=&quot;try {parent.deselectBloggerImageGracefully();} catch(e) {}&quot;&gt;&lt;img alt=&quot;&quot; border=&quot;0&quot; id=&quot;BLOGGER_PHOTO_ID_5448930778160934322&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi49TAnnsmJuVOhdrArjpxLGU8wgD1OJi383KuMSllbtmBwxRQtXjsGtyHiC-j5TG222Zx7ht2jxkmBHvIPWFebrxZqAlOtzX9fGb4w5pSNAc4QuuPyDa6qFbt7dMH_3I8dAdj-Cfv3HOHY/s400/header_3dtv_play.jpg&quot; style=&quot;cursor: hand; cursor: pointer; float: left; height: 171px; margin: 0 10px 10px 0; width: 400px;&quot; /&gt;&lt;/a&gt;&lt;br /&gt;
NVIDIA just announced a new 3D software technology called 3DTV Play. We&#39;re taking it on a 15-city U.S. tour. Check out the blog for details, and RSVP on the events tab.&lt;br /&gt;
&lt;br /&gt;
3DTV Play will be released later this spring with an anticipated retail price of $39.99 USD. It will also be available for free for current NVIDIA 3D Vision customers.&lt;br /&gt;
&lt;br /&gt;
Seeing is believing, especially with 3DTV Play, so starting today we&#39;re kicking off a 15-city road trip with Panasonic. If you&#39;re in one of the destination cities across the US, you&#39;ll be able to see 3DTV Play in action, running on the brand-new Panasonic VIERA full HD 3D TVs.&lt;br /&gt;
&lt;br /&gt;
3DTV Play lets you connect your compatible GeForce-based desktop or notebook PC to a full HD 3D TV. 3DTV Play also lets you use the active-shutter (or passive) glasses supplied with new 3D TVs, with the synchronization between the TV and the PC happening over the HDMI 1.4 interface. And, because 3DTV Play is based largely on what we do on the 3D Vision software side, you will be able to play your entire library of 3D games on the big screen. You’ll also be able to take advantage of our other 3D Vision technology benefits, including watching 3D Blu-ray movies, viewing videos and photographs in 3D, browsing the Web in 3D, and streaming 3D content.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
NVIDIA® 3DTV Play software lets you connect your NVIDIA GeForce® GPU-powered PC or notebook to 3DTVs using HDMI 1.4 for the ultimate, high-definition, big-screen, 3D entertainment experience. By leveraging the processing power of your NVIDIA GeForce GPU, 3DTV Play Software delivers the best your PC has to offer:&lt;br /&gt;
&lt;br /&gt;
&amp;gt; Supports all HDMI 1.4 3D TVs and any compatible 3D glasses system, including Panasonic® VIERA® Plasma HDTVs&lt;br /&gt;
&amp;gt; Play hundreds of standard PC games in stunning 3D environments like World of Warcraft® – Wrath of the Lich King™, Battlefield Bad Company™ 2, and James Cameron’s Avatar™: The Game&lt;br /&gt;
&amp;gt; Enjoy Blu-ray 3D™ playback* in seamless Full HD 1080p quality thanks to real-time &lt;br /&gt;
GPU-accelerated decoding on your GeForce GPU&lt;br /&gt;
&amp;gt; View 3D photos instantly in slideshow or individually with included photo viewer&lt;br /&gt;
&amp;gt; Watch streaming 3D movie content using video-on-demand applications like Next3D™&lt;br /&gt;
NOTE: If you are an existing NVIDIA 3D Vision glasses owner or plan to purchase a kit, 3DTV Play Software will be included for free in a future software update.&lt;br /&gt;
&lt;br /&gt;
3D technology is driving advances in storytelling and visual experiences as we have seen from the successes of movies like Avatar and Alice in Wonderland. And while the 3D in your local movie theater is one thing, bringing it home to your living room is another. Over the past year or so, NVIDIA has been a major player in 3D digital entertainment technology, through our large ecosystem of display, content, and hardware partners.&lt;br /&gt;
&lt;br /&gt;
Upgrade your PC to a fully immersive stereoscopic 3D experience with NVIDIA® 3D Vision™. A combination of high-tech wireless glasses and advanced software, 3D Vision automatically transforms hundreds of PC games into full stereoscopic 3D. In addition, view movies and digital photographs in eye popping 3D. Now supporting full HD 1080p clarity with 23” Alienware OptX™ AW2310 3D Full HD Widescreen monitors and ACER GD245HQ and GD235HZ monitors.&lt;/div&gt;</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/6872509841901490621/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/6872509841901490621' title='0 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/6872509841901490621'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/6872509841901490621'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2010/03/3dtv-play.html' title='3DTV Play'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhf7RGCX88_w11ghyphenhyphenUO-7bHRS6V_CpgumXngvmb8yzao39jXJAuOhhKp-lgQjPgipqkuCtnWQp870gQLlGVGH7oZWvXI_9u9DS3kptwSaetXxHTDuDOI5mklWB-uIX5at_sY0uZjZXhZpKZ/s72-c/NV_3DVision_Header_Alienware.jpg" height="72" 
width="72"/><thr:total>0</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-8320864029990343880</id><published>2010-01-10T02:19:00.001-08:00</published><updated>2012-06-21T00:13:18.892-07:00</updated><title type='text'>Sapphire Radeon HD 5000 series review</title><content type='html'>&lt;div dir=&quot;ltr&quot; style=&quot;text-align: left;&quot; trbidi=&quot;on&quot;&gt;&lt;a href=&quot;http://shigmo10.blogspot.com/&quot;&gt;&lt;/a&gt;&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjTnXxCFD4fyT7g-AcKzObmSaLHaZvci3AEhcOaQmtMIO4YStUZZozH9N8YbfcnVpyhIEhyphenhyphenYdDB-LBjhb6xEK3o1tqDPbCjYSWt-ZHEtSGftZ7_7h8uJ5QjydyDeisNt66HldtPZUmvjxZc/s1600-h/Sapphire-5850.jpg&quot; onblur=&quot;try {parent.deselectBloggerImageGracefully();} catch(e) {}&quot;&gt;&lt;img alt=&quot;Sapphire Radeon HD 5000 series review&quot; border=&quot;0&quot; id=&quot;BLOGGER_PHOTO_ID_5425059017681679890&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjTnXxCFD4fyT7g-AcKzObmSaLHaZvci3AEhcOaQmtMIO4YStUZZozH9N8YbfcnVpyhIEhyphenhyphenYdDB-LBjhb6xEK3o1tqDPbCjYSWt-ZHEtSGftZ7_7h8uJ5QjydyDeisNt66HldtPZUmvjxZc/s200/Sapphire-5850.jpg&quot; style=&quot;cursor: hand; cursor: pointer; display: block; height: 196px; margin: 0px auto 10px; text-align: center; width: 200px;&quot; /&gt;&lt;/a&gt;&lt;br /&gt;
&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiV7XBGhyhn82BVxvCry-K1W-8LPrPg1IohwDEx-pdzXf2jDzOuV10Ld0SVo92KwG4BjRPyOBtDfqcV0W1pNk3MM6E0VRX-dISoD6EihNcO4BP3WV8lO-4Y1TFlTVTn3JNL0YarxgG2xtuZ/s1600-h/sapphire_hd5750_01_thumb.jpg&quot; onblur=&quot;try {parent.deselectBloggerImageGracefully();} catch(e) {}&quot;&gt;&lt;img alt=&quot;Sapphire Radeon HD 5000 series review&quot; border=&quot;0&quot; id=&quot;BLOGGER_PHOTO_ID_5425058898615107762&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiV7XBGhyhn82BVxvCry-K1W-8LPrPg1IohwDEx-pdzXf2jDzOuV10Ld0SVo92KwG4BjRPyOBtDfqcV0W1pNk3MM6E0VRX-dISoD6EihNcO4BP3WV8lO-4Y1TFlTVTn3JNL0YarxgG2xtuZ/s200/sapphire_hd5750_01_thumb.jpg&quot; style=&quot;cursor: hand; cursor: pointer; float: right; height: 134px; margin: 0 0 10px 10px; width: 200px;&quot; /&gt;&lt;/a&gt;&lt;br /&gt;
Over the course of the past two months, AMD has launched five different models as part of its HD 5000 series. The company kicked things off with its higher-end HD 5850 and HD 5870 cards, which simply put, reaffirmed the fact that the roaring success of the HD 4000 series wasn&#39;t going to be the last. Just last week, AMD ushered in the launch of its first dual-GPU card as part of the HD 5000 series, the HD 5970, and it&#39;s mind-bogglingly fast, especially when compared to NVIDIA&#39;s current offerings.&lt;br /&gt;
&lt;br /&gt;
While high-end cards are all fine and good for those who need the kind of power offered, there exists an even stronger market for lower-end components, and that&#39;s where the HD 5770 and HD 5750 come into play. I took a look at the former about a month ago, and was impressed with the overall value. No matter how you looked at it, the card offered fantastic performance and lower power consumption (and lower temps as a result), along with such perks as DirectX 11. There wasn&#39;t a single aspect not to like - well, except for the overclocking potential.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;table border=&quot;1&quot; cellpadding=&quot;4&quot;&gt;
&lt;tr&gt;&lt;th&gt;Model&lt;/th&gt;&lt;th&gt;Core MHz&lt;/th&gt;&lt;th&gt;Mem MHz&lt;/th&gt;&lt;th&gt;Memory&lt;/th&gt;&lt;th&gt;Bus Width&lt;/th&gt;&lt;th&gt;Processors&lt;/th&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Radeon HD 5870&lt;/td&gt;&lt;td&gt;850&lt;/td&gt;&lt;td&gt;1200&lt;/td&gt;&lt;td&gt;1024MB&lt;/td&gt;&lt;td&gt;256-bit&lt;/td&gt;&lt;td&gt;1600&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Radeon HD 5850&lt;/td&gt;&lt;td&gt;725&lt;/td&gt;&lt;td&gt;1000&lt;/td&gt;&lt;td&gt;1024MB&lt;/td&gt;&lt;td&gt;256-bit&lt;/td&gt;&lt;td&gt;1440&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Radeon HD 5770&lt;/td&gt;&lt;td&gt;850&lt;/td&gt;&lt;td&gt;1200&lt;/td&gt;&lt;td&gt;1024MB&lt;/td&gt;&lt;td&gt;128-bit&lt;/td&gt;&lt;td&gt;800&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Radeon HD 5750&lt;/td&gt;&lt;td&gt;700&lt;/td&gt;&lt;td&gt;1150&lt;/td&gt;&lt;td&gt;512 - 1024MB&lt;/td&gt;&lt;td&gt;128-bit&lt;/td&gt;&lt;td&gt;720&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Radeon HD 4890&lt;/td&gt;&lt;td&gt;850 - 900&lt;/td&gt;&lt;td&gt;975&lt;/td&gt;&lt;td&gt;1024MB&lt;/td&gt;&lt;td&gt;256-bit&lt;/td&gt;&lt;td&gt;800&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Radeon HD 4870&lt;/td&gt;&lt;td&gt;750&lt;/td&gt;&lt;td&gt;900&lt;/td&gt;&lt;td&gt;512 - 2048MB&lt;/td&gt;&lt;td&gt;256-bit&lt;/td&gt;&lt;td&gt;800&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Radeon HD 4850&lt;/td&gt;&lt;td&gt;625&lt;/td&gt;&lt;td&gt;993&lt;/td&gt;&lt;td&gt;512 - 1024MB&lt;/td&gt;&lt;td&gt;256-bit&lt;/td&gt;&lt;td&gt;800&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Radeon HD 4770&lt;/td&gt;&lt;td&gt;750&lt;/td&gt;&lt;td&gt;800&lt;/td&gt;&lt;td&gt;512MB&lt;/td&gt;&lt;td&gt;128-bit&lt;/td&gt;&lt;td&gt;640&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Radeon HD 4670&lt;/td&gt;&lt;td&gt;750&lt;/td&gt;&lt;td&gt;900 - 1100&lt;/td&gt;&lt;td&gt;512 - 1024MB&lt;/td&gt;&lt;td&gt;128-bit&lt;/td&gt;&lt;td&gt;320&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Radeon HD 4650&lt;/td&gt;&lt;td&gt;600&lt;/td&gt;&lt;td&gt;400 - 500&lt;/td&gt;&lt;td&gt;512 - 1024MB&lt;/td&gt;&lt;td&gt;128-bit&lt;/td&gt;&lt;td&gt;320&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
At $109 for the 512MB version, and $129 for the 1GB version, the HD 5750 is, without a doubt, a card designed for those who want performance far beyond what an integrated chip could offer, but don&#39;t want to pay an arm and a leg. The HD 5750 is capable of delivering on all fronts in that regard. As you can see below, Sapphire changes things up from the reference design just a wee bit. The board itself is identical, but the cooler is a little more robust, with a larger heatsink base.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Sapphire Radeon HD 5770 Vapor-X 1GB Video Card&lt;br /&gt;
&lt;br /&gt;
The Vapor-X series is one of Sapphire’s main lines when it comes to offering performance graphics cards. With its main focus on cooling, thanks to a completely new cooler compared to the stock one, we&#39;ve seen the series perform well, not only recently but ever since its introduction.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Recently we had a chance to look at the HD 5750 and HD 5870 Vapor-X models. After looking at both of them, there was a clear hole in the middle, with the much-loved HD 5770 and HD 5850 not being part of the lineup.&lt;br /&gt;
The good news is that the HD 5770 has now joined the Vapor-X series and if the whispers we&#39;re hearing are anything to go by, the HD 5850 shouldn&#39;t be all that far away either.&lt;br /&gt;
So let&#39;s take a look at the package of the HD 5770 Vapor-X before having a closer look at the card and the cooler it carries. Then we&#39;ll have a look at the specifications and get stuck into the benchmarks to see just how the model performs.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
If you haven&#39;t heard about the ATI 5 series video cards by now, you have probably been living under a rock. ATI took a swing for the fences and hit a home run in terms of the performance delivered by the Sapphire HD 5870 when compared to the best single-GPU card that Nvidia had to offer at the time, the GTX 285. The follow-up, the Sapphire HD 5850, just about cleaned house as well. Building on that performance lead, they solidified the high-end stranglehold with the Sapphire HD 5970, which simply crushed the GTX 295 and HD 4870x2. With the high end covered, the mid-range was not forgotten either, with the introduction of the 5700 series that included the Sapphire HD 5770 and 5750. The 5 series cards were the first true DirectX 11 video cards to market, but at launch there really were no games ready to show off this technology. This has now changed, with more than a few games ready and many more in the wings from a slew of developers. Sapphire, as ATI&#39;s largest partner, always brings something interesting to the table after the release of the reference ATI cards. Sapphire has a few lines geared more towards the enthusiast: the Atomic, Toxic and Vapor-X series. Each offers better component usage and some innovative cooling solutions. For instance, there was the use of a self-contained liquid cooling system on the Sapphire HD 4870x2 Atomic, and the first use of the Vapor-X technology from Microloops on the Sapphire HD 3870 Atomic back in January 2008.&lt;br /&gt;
&lt;br /&gt;
From that point forward Sapphire has made use of the technology not only in the Vapor-X line but also in the Toxic and Atomic lineups, to bring out the best cooling and noise performance from their video cards. The cooling is only part of the Vapor-X package, though. The Sapphire HD 5770 Vapor-X comes equipped not only with the additional cooling but is also built using solid high-polymer capacitors and &quot;Black Diamond&quot; chokes that use a built-in heat spreader to drop the operating temperatures by 10% while increasing efficiency by 25%. So what does this really net you? Because of the cooling used, Sapphire ups the clock speed on the 40nm Juniper core by 10 MHz, going from 850 MHz to 860 MHz, but with no increase on the GDDR5 memory clocks. At this point you have a card with better cooling, better component selection and higher clock speeds. Let&#39;s see if that translates into better performance and overclocking. If past history is any indication, this card should do well.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
HD 5850&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The graphics solution world is constantly evolving, with visual quality being stepped up at an ever-growing rate. The launch of Windows 7, and with it the tantalising prospect of DirectX 11 games, is perhaps not just a single step up but a whole flight of stairs designed to deliver unparalleled graphics quality.&lt;br /&gt;
&lt;br /&gt;
With NVIDIA sliding by the wayside, ATI is setting about monopolising the market, rolling out numerous members of its 5 series cards, each of which offers a different price/performance ratio, following the main launch of the 5870 – a card which packed some serious performance and certainly turned heads at the beginning of the DirectX 11 era.&lt;br /&gt;
&lt;br /&gt;
However, with DirectX 11 games still very much on the horizon and the price of the 5870 way over budget for many enthusiasts, the inevitable introduction of the 5850 followed shortly. Based on the same RV870 core, codenamed Cypress, but located inside the $250 price bracket, the 5850 should consolidate the performance achieved by its bigger brother, placing it in the upper mid-range section so as to push all the buttons for DX11 enthusiasts.&lt;br /&gt;
&lt;br /&gt;
Features&lt;br /&gt;
&lt;br /&gt;
Microsoft DirectX® 11 Support&lt;br /&gt;
ATI Eyefinity Technology&lt;br /&gt;
ATI Stream Technology&lt;br /&gt;
Designed for DirectCompute 5.0 and OpenCL&lt;br /&gt;
40 nm Process Technology&lt;br /&gt;
Advanced GDDR5 Memory Technology&lt;br /&gt;
2nd Generation TeraScale Engine&lt;br /&gt;
Microsoft Windows 7® Support&lt;br /&gt;
ATI CrossFireX™ Technology&lt;br /&gt;
Enhanced Anisotropic Filtering&lt;br /&gt;
Accelerated Video Transcoding&lt;br /&gt;
Display Flexibility, Supports DL-DVI, DP, HDMI and D-Sub&lt;br /&gt;
HDMI 1.3&lt;br /&gt;
Dolby® TrueHD and DTS-HD Master Audio™ Support&lt;br /&gt;
ATI PowerPlay™ Technology – Enhanced Support for GDDR5 memory&lt;br /&gt;
ATI Avivo™ Technology Enhanced Unified Video Decoder 2 (UVD 2)&lt;br /&gt;
Supports OpenGL 3.1&lt;br /&gt;
Specifications&lt;br /&gt;
&lt;br /&gt;
SKU Number: 21162-00-50R&lt;br /&gt;
I/O Output: Dual DL-DVI-I+DP+HDMI, Triple Display Support&lt;br /&gt;
Core Clock: 725 MHz&lt;br /&gt;
Memory Clock: Effective 4000 MHz&lt;br /&gt;
PCI Express 2.0 x16 bus interface&lt;br /&gt;
1024MB / 256-bit GDDR5 memory interface&lt;br /&gt;
Dual Slot Cooler with Auto Fan Control, 2 Ball Bearing&lt;br /&gt;
On-board HDMI, supports HDMI 1.3 with High Bitrate Audio&lt;br /&gt;
On-board DisplayPort&lt;br /&gt;
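From the memory spec above (effective 4000 MHz GDDR5 on a 256-bit bus), the card&#39;s peak memory bandwidth can be worked out directly; a small sketch of the standard calculation:

```python
# Peak memory bandwidth = effective clock (transfers/s) x bus width (bits) / 8 bits-per-byte.
effective_clock_hz = 4000e6  # "Effective 4000 MHz" GDDR5 data rate
bus_width_bits = 256         # 256-bit memory interface

bandwidth_gbs = effective_clock_hz * bus_width_bits / 8 / 1e9
print(f"{bandwidth_gbs:.0f} GB/s")  # 128 GB/s
```

This matches the HD 5850&#39;s quoted reference bandwidth; the same formula applies to any of the cards in the table further up, given their memory clock and bus width.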
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;span style=&quot;font-weight: bold;&quot;&gt;Sapphire Radeon HD 5970 2GB Overclocked&lt;/span&gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
We mentioned in our standalone review that there was really more to the HD 5970 than just what meets the eye. We also covered the fact that ATI has kind of pushed this whole thing about the HD 5970 being &quot;Unlocked.&quot; At first glance it&#39;s a little hard to figure out what exactly ATI means by this. Generally speaking, in the computer world the term &quot;Unlocked&quot; tends to refer to a CPU and the fact that it is multiplier unlocked.&lt;br /&gt;
&lt;br /&gt;
So, what does this whole &quot;Unlocked&quot; business mean for ATI’s latest giant card? Well, you can&#39;t enter some secret key combination and unlock more stream processors or ROPs, nor is there a secret to gaining extra memory. What ATI means by saying the model is &quot;Unlocked&quot; is its overclocking ability.&lt;br /&gt;
Initial thoughts on the overclocking potential of the model were that it would be alright, but not great. If it was going to be so good, why didn&#39;t the model come out of the factory faster? If the OC potential is so good, why has Sapphire strapped the overclock name to the model and only given us a 10MHz core bump and a 40MHz QDR memory bump?&lt;br /&gt;
To be honest, we&#39;re not sure what the answers to these questions are. All I can tell you is that many hours were spent trying to find the maximum overclock, and what we achieved was surprising to say the least; and when I say least, I mean the absolute very least.&lt;br /&gt;
&lt;br /&gt;
In an evolution of its predecessor, the 4870 X2, today we look at the Sapphire Radeon HD 5970 OC.  Yes, it has a new naming scheme. And yes, it is an overclocked version. For those who cannot satisfy their hunger for performance, this Sapphire card features software to let you tweak the voltages to push your overclock to the limit; no volt modding required, no voided warranty. &lt;br /&gt;
The Sapphire Radeon HD 5970 OC is an absolute behemoth, from its gaming horsepower to its size, weight, cooling technology, and price tag. It is big, it is brash, and it is entirely unapologetic. &lt;br /&gt;
&lt;br /&gt;
ATI is on a roll. There is no doubting it or denying the fact that the boys in red have managed to hammer a successive number of nails into NVIDIA’s DX11 aspirations by being first to market with not one but a whole series of brand new, segment-leading DX11 cards. The HD 5800-series was first on the scene and proved that these new cards could compete with the best of the best from the previous generation and then some. However, in many people’s opinions, there was one thing missing: ATI firmly marking their turf by laying claim to the fastest graphics card in the world. That’s where the HD 5970 2GB comes into the picture.&lt;br /&gt;
&lt;br /&gt;
At its most basic, the new HD 5970 is a dual GPU card that makes use of an on-board PLX bridge chip to handle the communication between the two cores. Each GPU core is able to address a whopping 1GB of GDDR5 memory which will hopefully make the bandwidth issues of the HD5800-series of cards a thing of the past. From a pure performance standpoint, this card’s potential is simply out of this world. &lt;br /&gt;
&lt;br /&gt;
We all remember the HD 4870 X2 and the older yet no less significant HD 3870 X2 dual GPU cards so some of you may be wondering where the “X2” moniker went. Well, ATI has decided to do away with old naming conventions for one reason or another and believe it or not, we welcome this change. It cements the HD 5900-series as the current high performance cards in ATI’s lineup while keeping a clear distinction between all of their product ranges. &lt;br /&gt;
&lt;br /&gt;
In this review we will be looking at something unique: a pre-overclocked ATI card being released right alongside the reference-clocked version. That’s right, at launch there will be two different HD 5970 cards released by the likes of Sapphire, XFX and other ATI board partners: one with standard speeds and another with some increased performance potential. Along with this somewhat shocking revelation, there are several other things that make the HD 5970 a cut above, but we will go into those a bit later in this review. &lt;br /&gt;
&lt;br /&gt;
Our introduction wouldn’t be complete without some speculation about the HD 5970’s pricing and availability, and on both fronts it isn’t pretty. We should be looking at an initial “launch” price of about $600 USD or $675 CAD, which will make it the most expensive card on the market by a long shot. However, this price is likely to skyrocket in the days following launch, since we hear it will be next to impossible to find. The retailers we have spoken to are all expecting fewer than 10 cards in total at launch, which makes this a paper launch that we are sure will be passed off as a hard launch. &lt;br /&gt;
&lt;br /&gt;
With NVIDIA’s Fermi cards firmly behind the iron curtain somewhere in Santa Clara, ATI has a clear path to complete market domination with their HD 5970. Let’s hope they make the most out of it.&lt;/div&gt;</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/8320864029990343880/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/8320864029990343880' title='0 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/8320864029990343880'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/8320864029990343880'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2010/01/sapphire-radeon-hd-5000-series-review.html' title='Sapphire Radeon HD 5000 series review'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjTnXxCFD4fyT7g-AcKzObmSaLHaZvci3AEhcOaQmtMIO4YStUZZozH9N8YbfcnVpyhIEhyphenhyphenYdDB-LBjhb6xEK3o1tqDPbCjYSWt-ZHEtSGftZ7_7h8uJ5QjydyDeisNt66HldtPZUmvjxZc/s72-c/Sapphire-5850.jpg" height="72" width="72"/><thr:total>0</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-8099864140040787305</id><published>2009-11-01T07:43:00.000-08:00</published><updated>2009-11-01T07:45:48.747-08:00</updated><title type='text'>DirectX 11 Graphic Cards Now Available: ATI HD 5870/5850</title><content type='html'>&lt;a onblur=&quot;try {parent.deselectBloggerImageGracefully();} catch(e) {}&quot; 
href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhT5ENKfug-8RtieGJ6-x266k4yR8KwPhj0XE6HlvMpf14ieccxZSqCJZxuAfsQx3HSxEkv_Dm8HInidDeMr-TZVmu2fYSro0mYEU-z7DAVABq0iK4hGa9UPuajxoeF1HiRfgyOg4GPTzFi/s1600-h/106498-ati-400.jpg&quot;&gt;&lt;img style=&quot;float:right; margin:0 0 10px 10px;cursor:pointer; cursor:hand;width: 200px; height: 158px;&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhT5ENKfug-8RtieGJ6-x266k4yR8KwPhj0XE6HlvMpf14ieccxZSqCJZxuAfsQx3HSxEkv_Dm8HInidDeMr-TZVmu2fYSro0mYEU-z7DAVABq0iK4hGa9UPuajxoeF1HiRfgyOg4GPTzFi/s200/106498-ati-400.jpg&quot; border=&quot;0&quot; alt=&quot;&quot;id=&quot;BLOGGER_PHOTO_ID_5399161899575690082&quot; /&gt;&lt;/a&gt;&lt;br /&gt;The next generation of graphic cards ATI Radeon HD 5870 and HD 5850 boasting of DirectX 11 support and ATI Eyefinity technology are now available, these video cards support upto 3 monitors and resolution upto 2560×1600 per display, 6 screen support will eventually come out in November with a new version of 5870 graphics card with 6 mini display ports. HD 5800 series of graphics cards also supports dual, triple and quad ATI CrossFireX multi-GPU technology.&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;ATI HD 5870 and 5850  GPUs are developed on the advanced 40nm fabrication technology containing massive 2.15 billion transistors and additionally the streaming processor units, texture units and ROPs are almost doubled in comparison to ATI 4870 and 4850 GPUs respectively. 
My 6-month-old Dual Sonic 4870 graphics card is already feeling old; I might even sell it to make way for the new GPU chips. The performance that these cards provide, at their price point, makes them the only graphics cards you should buy if you are in the market for a higher-end graphics solution right now.&lt;br /&gt;&lt;br /&gt;Power consumption&lt;br /&gt;&lt;br /&gt;Apart from doubled performance, ATI has attained another feat in power consumption: the 5870 graphics card consumes only 27 watts at idle and 188 watts at full load when used with a single display, whereas the 4850 consumes 27 watts at idle and 170 watts at full load with a single display; with two displays the idle power will be doubled.&lt;br /&gt;&lt;br /&gt;Price of ATI HD 5870 and HD 5850&lt;br /&gt;The new video cards are priced at a sweet $259 for the HD 5850 and $379 for the HD 5870; the prices will come down a bit once ATI releases new cards in November, or if NVIDIA comes out with its lineup of DirectX 11 cards.&lt;br /&gt;&lt;br /&gt;India pricing would be around Rs 17,000 for the HD 5850 and Rs 25,000 for the HD 5870&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;List of DirectX 11 Games for Windows 7 and Windows Vista&lt;br /&gt;&lt;br /&gt;Not that we need DirectX 11 so much, but AMD sure wants us to believe so, and is accelerating things with DirectX 11 games and graphics cards. Why is that? Simply 
because NVIDIA is yet to launch a DirectX 11 graphics card and ATI has its Evergreen GPU ready now, they already demonstrated its awesomeness and want to benefit as the first mover.&lt;br /&gt;&lt;br /&gt;&lt;br /&gt; &lt;br /&gt;&lt;br /&gt;With Microsoft launching DirectX 11 SDK for developers, you could well witness the first DirectX 11 game Dirt 2 in December.&lt;br /&gt;&lt;br /&gt;Here is a list of upcoming games for DirectX 11 for Windows 7 and Vista (the list will be updated frequently as soon as new games are announced)&lt;br /&gt;&lt;br /&gt;Games with DirectX 11 Support&lt;br /&gt;&lt;br /&gt;DiRT 2 : Direct X 11 support Confirmed&lt;br /&gt;S.T.A.L.K.E.R.: Call of Pripyat : Direct X 11 support Confirmed&lt;br /&gt;Aliens vs Predator : Direct X 11 support Confirmed&lt;br /&gt;Batteforge : Direct X 11 support Confirmed&lt;br /&gt;CRYSIS 2  : Not Confirmed, but the CryENGINE 3 supports DirectX 11</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/8099864140040787305/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/8099864140040787305' title='0 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/8099864140040787305'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/8099864140040787305'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2009/11/directx-11-graphic-cards-now-available.html' title='DirectX 11 Graphic Cards Now Available: ATI HD 5870/5850'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><media:thumbnail 
xmlns:media="http://search.yahoo.com/mrss/" url="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhT5ENKfug-8RtieGJ6-x266k4yR8KwPhj0XE6HlvMpf14ieccxZSqCJZxuAfsQx3HSxEkv_Dm8HInidDeMr-TZVmu2fYSro0mYEU-z7DAVABq0iK4hGa9UPuajxoeF1HiRfgyOg4GPTzFi/s72-c/106498-ati-400.jpg" height="72" width="72"/><thr:total>0</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-272572530670028298</id><published>2009-10-09T04:03:00.000-07:00</published><updated>2009-10-09T04:09:33.038-07:00</updated><title type='text'>Nvidia Ready to Enter The DirectX 11 Gaming With Upcoming 40nm GT300 GPU</title><content type='html'>&lt;a onblur=&quot;try {parent.deselectBloggerImageGracefully();} catch(e) {}&quot; href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiYRXv8vaBOmVyEONd-PiGztktm19FN9kpP5IWyxVm5pTWD5EAo73J5QT0qjfryfCXCHwgriYXPjwP8gHrlz0j-3jHtKXZLWPfJmfaGotf2RX4ykEZDp7x7MaHpxrhk7SQeY8hw60JiAyAG/s1600-h/Fermi_Summary.gif&quot;&gt;&lt;img style=&quot;float:right; margin:0 0 10px 10px;cursor:pointer; cursor:hand;width: 200px; height: 102px;&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiYRXv8vaBOmVyEONd-PiGztktm19FN9kpP5IWyxVm5pTWD5EAo73J5QT0qjfryfCXCHwgriYXPjwP8gHrlz0j-3jHtKXZLWPfJmfaGotf2RX4ykEZDp7x7MaHpxrhk7SQeY8hw60JiAyAG/s200/Fermi_Summary.gif&quot; border=&quot;0&quot; alt=&quot;&quot;id=&quot;BLOGGER_PHOTO_ID_5390555695058912658&quot; /&gt;&lt;/a&gt;&lt;br /&gt;Nvidia lagged behind rival AMD in the introduction of a Direct X 11 GPU but it seems that the company is ready to make an impressive come back with the upcoming GT300 chip, which bloggers already describe as a &quot;computational beast&quot;. &lt;br /&gt;&lt;br /&gt;Nvidia&#39;s upcoming 40nm GPU codenamed &quot;Fermi&quot; or simpler &quot;GT300&quot; is slated for a Q4 2009 launch, meaning that it will most probably show up around the end of November. 
&lt;br /&gt;&lt;br /&gt;Rumors about the new chip indicate that it will pack a lot of architectural changes compared to Nvidia&#39;s GT200 generation of GPUs. &lt;br /&gt;&lt;br /&gt;Manufactured by TSMC, the 40nm chip is expected to pack three billion transistors and 16 Streaming Multiprocessors (formerly called Shader Clusters). Each multiprocessor has 32 cores, each able to execute one integer or floating-point instruction per clock per thread. &lt;br /&gt;&lt;br /&gt;The chip also packs six 64-bit GDDR5 memory controllers for a grand total of 384 bits, matching AMD&#39;s current offering with the Radeon HD 5800 series. &lt;br /&gt;&lt;br /&gt;Possible configurations of the new boards should include 1.5GB, 3GB and 6GB of GDDR5 memory. &lt;br /&gt;&lt;br /&gt;Of course, the GT300 provides hardware support for CUDA 3.0, DirectX 11, OpenGL 3.1 and OpenCL. &lt;br /&gt;&lt;br /&gt;Nvidia must have been very busy with the design of the new architecture for the GT300 chip. According to early reports, the chip will be more than three times as powerful as the GT200 GPU. &lt;br /&gt;&lt;br /&gt;Nvidia made an official statement on the state of the GT300 yields last week, after some stories online reported that the yields were bad, with only nine working chips per wafer. An Nvidia product manager said that the company&#39;s 40nm yields were &quot;fine&quot; and described the information as &quot;baseless.&quot; &lt;br /&gt;&lt;br /&gt;It seems that the complicated design of the new GT300 chip was the reason behind its &quot;late&quot; introduction to the market. AMD has already made a strong statement with the release of its Radeon HD5800 series, but Nvidia believes it can make its DirectX 11 GPU even faster. 
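The rumored figures above are internally consistent, and a quick arithmetic check shows how they fit together (the 1Gbit-chip assumption in the last step is ours, purely for illustration):

```python
# Sanity-check the rumored GT300 / "Fermi" figures quoted above.
# All numbers are taken from the post; nothing here is official data.

sm_count = 16                 # Streaming Multiprocessors
cores_per_sm = 32             # cores per multiprocessor
total_cores = sm_count * cores_per_sm
print(total_cores)            # 512 -- the CUDA core count NVIDIA announced

controllers = 6               # 64-bit GDDR5 memory controllers
bus_width = controllers * 64
print(bus_width)              # 384 -- the 384-bit aggregate interface

# A 384-bit bus maps onto 12 x 32-bit memory chips, so with hypothetical
# 1Gbit (128MB) GDDR5 chips a board carries 12 * 128MB = 1.5GB -- hence
# the rumored 1.5GB / 3GB / 6GB configurations as chip density doubles.
print(12 * 128 / 1024)        # 1.5 (GB)
```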
&lt;br /&gt;&lt;br /&gt;All we have to do now is wait and see whether the GT300 can outperform the HD 5870, HD 5850X2 and HD 5870X2, in what should be a very interesting race in the GPU market.&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight:bold;&quot;&gt;NVIDIA Unveils Next Generation CUDA GPU Architecture - Codenamed &quot;Fermi&quot;&lt;/span&gt;&lt;br /&gt;&lt;br /&gt;Nvidia today officially introduced its next generation CUDA GPU architecture, codenamed &quot;Fermi&quot;. An entirely new ground-up design, the &quot;Fermi&quot; architecture is the foundation for the world&#39;s first computational graphics processing units (GPUs), promising to deliver breakthroughs in both graphics and GPU computing. &lt;br /&gt;&lt;br /&gt;&quot;NVIDIA and the Fermi team have taken a giant step towards making GPUs attractive for a broader class of programs,&quot; said Dave Patterson, director of the Parallel Computing Research Laboratory at U.C. Berkeley and co-author of Computer Architecture: A Quantitative Approach. &quot;I believe history will record Fermi as a significant milestone.&quot; &lt;br /&gt;&lt;br /&gt;Presented at the company&#39;s inaugural GPU Technology Conference in San Jose, California, &quot;Fermi&quot; delivers a feature set that accelerates performance on a wider array of computational applications than ever before. Joining NVIDIA&#39;s press conference was Oak Ridge National Laboratory, which announced plans for a new supercomputer that will use NVIDIA GPUs based on the &quot;Fermi&quot; architecture. &quot;Fermi&quot; also garnered the support of leading organizations including Bloomberg, Cray, Dell, HP, IBM and Microsoft. &lt;br /&gt;&lt;br /&gt;&quot;It is completely clear that GPUs are now general purpose parallel computing processors with amazing graphics, and not just graphics chips anymore,&quot; said Jen-Hsun Huang, co-founder and CEO of NVIDIA. 
&quot;The Fermi architecture, the integrated tools, libraries and engines are the direct results of the insights we have gained from working with thousands of CUDA developers around the world. We will look back in the coming years and see that Fermi started the new GPU industry.&quot; &lt;br /&gt;&lt;br /&gt;As the foundation for NVIDIA&#39;s family of next generation GPUs − GeForce, Quadro and Tesla − &quot;Fermi&quot; features a host of new technologies including: &lt;br /&gt;&lt;br /&gt;- C++ support, complementing existing support for C, Fortran, Java, Python, OpenCL and DirectCompute &lt;br /&gt;&lt;br /&gt;- ECC, a critical requirement for datacenters and supercomputing centers deploying GPUs on a large scale &lt;br /&gt;&lt;br /&gt;- 512 CUDA cores supporting the new IEEE 754-2008 floating-point standard, surpassing even the most advanced CPUs &lt;br /&gt;&lt;br /&gt;- 8x the peak double precision arithmetic performance of NVIDIA&#39;s last generation GPU. Double precision is critical for high-performance computing (HPC) applications such as linear algebra, numerical simulation, and quantum chemistry &lt;br /&gt;&lt;br /&gt;- NVIDIA Parallel DataCache - a cache hierarchy in a GPU that speeds up algorithms such as physics solvers, ray tracing, and sparse matrix multiplication, where data addresses are not known beforehand &lt;br /&gt;&lt;br /&gt;- NVIDIA GigaThread Engine with support for concurrent kernel execution, where different kernels of the same application context can execute on the GPU at the same time (e.g. PhysX fluid and rigid body solvers) &lt;br /&gt;&lt;br /&gt;- Nexus - a fully integrated heterogeneous computing application development environment within Microsoft Visual Studio&lt;br /&gt;&lt;br /&gt;Nvidia described the &quot;Fermi&quot; architecture as the most significant leap forward in GPU architecture since the original G80. G80 was Nvidia&#39;s initial vision of what a unified graphics and computing parallel processor should look like. 
GT200 extended the performance and functionality of G80. With Fermi, Nvidia has taken all they have learned from the two prior processors and all the applications that were written for them, and employed a completely new approach to design to create the world?s first computational GPU.</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/272572530670028298/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/272572530670028298' title='0 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/272572530670028298'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/272572530670028298'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2009/10/nvidia-ready-to-enter-directx-11-gaming.html' title='Nvidia Ready to Enter The DirectX 11 Gaming With Upcoming 40nm GT300 GPU'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiYRXv8vaBOmVyEONd-PiGztktm19FN9kpP5IWyxVm5pTWD5EAo73J5QT0qjfryfCXCHwgriYXPjwP8gHrlz0j-3jHtKXZLWPfJmfaGotf2RX4ykEZDp7x7MaHpxrhk7SQeY8hw60JiAyAG/s72-c/Fermi_Summary.gif" height="72" width="72"/><thr:total>0</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-3331268351279499493</id><published>2009-07-20T23:58:00.001-07:00</published><updated>2009-07-21T00:17:10.734-07:00</updated><title type='text'>NVIDIA GeForce GTX 295</title><content type='html'>&lt;a onblur=&quot;try 
{parent.deselectBloggerImageGracefully();} catch(e) {}&quot; href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEioDlN7oN09v43CYEoBZGX0ofs8CWasFeXQhl7TgXXKdjCyZ70clQs2IRE_-_YPA6alMn-XIgegBuMSPCNXuHtBpuKYA2d2lwcRDBrz4n2iUrmRsdcKpqMW52KIzFHkLJE94c3W-ENo9taj/s1600-h/295_greenie.jpg&quot;&gt;&lt;img style=&quot;float:right; margin:0 0 10px 10px;cursor:pointer; cursor:hand;width: 200px; height: 176px;&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEioDlN7oN09v43CYEoBZGX0ofs8CWasFeXQhl7TgXXKdjCyZ70clQs2IRE_-_YPA6alMn-XIgegBuMSPCNXuHtBpuKYA2d2lwcRDBrz4n2iUrmRsdcKpqMW52KIzFHkLJE94c3W-ENo9taj/s200/295_greenie.jpg&quot; border=&quot;0&quot; alt=&quot;&quot;id=&quot;BLOGGER_PHOTO_ID_5360808685205327682&quot; /&gt;&lt;/a&gt;&lt;br /&gt;&lt;a onblur=&quot;try {parent.deselectBloggerImageGracefully();} catch(e) {}&quot; href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjvKOMa_wTZqHkLIgBe-C96HHSNSm_hayXdrWsVeCSgV7RbghgAEz0V8XETdig1A897rIe3-DhnyYC-5zNb4Bo1ejjSTeiJSPY4lcTI06pO1p5FkluG5Y7vRvTRAUZ8Db8sOaPzkeJ0KvCT/s1600-h/9760-img8086s.jpg&quot;&gt;&lt;img style=&quot;float:right; margin:0 0 10px 10px;cursor:pointer; cursor:hand;width: 200px; height: 102px;&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjvKOMa_wTZqHkLIgBe-C96HHSNSm_hayXdrWsVeCSgV7RbghgAEz0V8XETdig1A897rIe3-DhnyYC-5zNb4Bo1ejjSTeiJSPY4lcTI06pO1p5FkluG5Y7vRvTRAUZ8Db8sOaPzkeJ0KvCT/s200/9760-img8086s.jpg&quot; border=&quot;0&quot; alt=&quot;&quot;id=&quot;BLOGGER_PHOTO_ID_5360808679488525698&quot; /&gt;&lt;/a&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;NVIDIA PhysX technology brings your games to life with massively destructible environments and ultra-realistic physical interaction in games such as Mirror&#39;s Edge&lt;br /&gt;&lt;br /&gt;Quad NVIDIA SLI technology and 1792MB of dedicated graphics memory deliver breathtaking frame rates and total graphics bliss with game settings and resolution maxed out (2560x1600)&lt;br 
/&gt;&lt;br /&gt;Experience ultimate graphics performance in the hottest DirectX 10 games such as Far Cry 2&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;Graphics Processor: NVIDIA GeForce GTX 295&lt;br /&gt;Memory Installed: 1792MB DDR3&lt;br /&gt;Interface: PCI Express x16 2.0&lt;br /&gt;Performance&lt;br /&gt;GPU:Dual NVIDIA GeForce GTX 295&lt;br /&gt;Core Clock: 576MHz&lt;br /&gt;Shader Clock: 1242MHz&lt;br /&gt;Shader Model: 4.0&lt;br /&gt;Texture Fill Rate: 92.2 Billion/sec. (Combined)&lt;br /&gt;Processor Cores: 480 (Combined)&lt;br /&gt;Memory:&lt;br /&gt;Video Memory: 1792MB (Combined)&lt;br /&gt;Memory Type: GDDR3&lt;br /&gt;Memory Data Rate: 1998MHz&lt;br /&gt;Memory Interface: 896-bit (Combined)&lt;br /&gt;Memory Bandwidth: 223.8GB/sec. (Combined)&lt;br /&gt;Connections&lt;br /&gt;Bus Type: PCI Express 2.0 (Backward compatible with PCI Express)&lt;br /&gt;Display Connectors: 2 Dual-Link DVI-I, HDMI Out&lt;br /&gt;RAMDACs: Dual 400MHz&lt;br /&gt;Multiple Monitor Support: Yes&lt;br /&gt;HDCP Capable: Yes, Dual link (Requires other compatible components which are HDCP capable. 
Designed to meet the output protection management (HDCP) and security specifications of the Blu-ray Disc and HD DVD formats, allowing the playback of encrypted movie content on PCs when connected to HDCP-compliant displays)&lt;br /&gt;HDMI Capable: Yes&lt;br /&gt;NVIDIA SLI Support: Yes, Quad&lt;br /&gt;Included In Box&lt;br /&gt;BFG NVIDIA GeForce GTX 295 1792MB graphics card&lt;br /&gt;Quick install guide&lt;br /&gt;DVI to VGA adapter&lt;br /&gt;HDMI cable (6 ft.)&lt;br /&gt;S/PDIF audio cable (1 ft.)&lt;br /&gt;Dual 4-pin peripheral to single 6-pin PCI Express power adapter&lt;br /&gt;Dual 6-pin PCI Express to single 8-pin PCI Express power adapter&lt;br /&gt;Driver CD, which includes: NVIDIA ForceWare unified graphics drivers and Full Multiple Language Installation Manual.pdf&lt;br /&gt;System Requirements&lt;br /&gt;2GB of system memory&lt;br /&gt;CD or DVD-ROM drive&lt;br /&gt;100MB of available hard disk drive space for basic driver installation&lt;br /&gt;Microsoft Windows Vista or XP operating system (Windows Vista required for Quad NVIDIA SLI)&lt;br /&gt;PCI Express or PCI Express 2.0-compliant system motherboard with one vacant PCI Express x16 slot&lt;br /&gt;One vacant add-in card slot below the PCI Express x16 slot. This graphics card physically occupies two slots&lt;br /&gt;680W PCI Express-compliant system power supply with a combined 12V current rating of 46A or more (Minimum system power requirement based on a PC configured with an Intel Core i7 965 Extreme Edition processor)&lt;br /&gt;One 8-pin and one 6-pin PCI Express supplementary power connector -or- Two 6-pin PCI Express and two 4-pin peripheral supplementary power connectors&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;There&#39;s been something of a trend in the computer component industry over the last few years with Intel and nVidia ruling the roost in terms of CPU and graphics card performance, respectively. 
Meanwhile AMD/ATI has gone through a bit of a rough patch but has recently come back strong with some competitively priced products that, while perhaps not the fastest, have proved to be worthwhile investments nonetheless. &lt;br /&gt;The one exception to this rule, however, was the ATI Radeon HD 4870 X2 graphics card, which actually proved to be at least equal to, and generally faster than, nVidia&#39;s then top-of-the-range card, the GTX 280. The obvious problem for ATI was that it had already needed to meld two graphics chips onto one board in order to get the performance needed to compete with the GTX 280&#39;s single chip, which raised the obvious question, &quot;If nVidia put two GTX 280 chips on one card, wouldn&#39;t that be faster?&quot;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;There&#39;s no doubt that the 3D graphics card business is what you&#39;d call a proverbial &quot;tough gig&quot;.  There was a time when discrete graphics card options were available from numerous GPU vendors, but over the years, the relentless pace of technology and fierce competition has homogenized the market down to virtually two primary suppliers.  In mainstream 3D graphics, there is but one mantra--keep pace or exceed, execute or die.  It&#39;s a simple equation that keeps product refreshes ongoing and a natural progression of the graphics food chain that results in continuously improved product offerings, at both the hardware and software levels.&lt;br /&gt;&lt;br /&gt;NVIDIA is obviously one of the few companies, along with AMD&#39;s ATI graphics division, that has executed amazingly well over the years.  The continuous strike / counter-strike battle that rages on between the two companies affords consumers increasingly more powerful products, as well as more realism in 3D games, as developers take advantage of each new technology update.  
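The combined memory bandwidth quoted in the GTX 295 spec list above follows directly from the data rate and bus width; a quick check (GB here meaning 10^9 bytes, as in the marketing figure):

```python
# Verify the GTX 295's combined memory bandwidth from its spec list:
# 1998MHz effective GDDR3 data rate on a 896-bit (2 x 448-bit) bus.

data_rate_hz = 1998e6          # effective data rate per pin (Hz)
bus_width_bits = 896           # combined memory interface width

# bandwidth = data rate x bus width, converted from bits/s to GB/s
bandwidth_gbs = data_rate_hz * bus_width_bits / 8 / 1e9
print(round(bandwidth_gbs, 1))   # 223.8 -- matches the quoted 223.8GB/sec
```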
In 2008, NVIDIA had the fastest single GPU solution on earth with the GeForce GTX 280, while AMD&#39;s ATI Radeon HD 4870 X2 took the most powerful single card performance spot with its dual, 55nm RV770 GPU solution.&lt;br /&gt;&lt;br /&gt;Each core on the 295 board runs at 576MHz. The memory&#39;s clocked at 999MHz.&lt;br /&gt;&lt;br /&gt;Today, fresh out of the gate for 2009, NVIDIA returns AMD&#39;s volley with its own optimized, multi-GPU, single card solution that aims to trump its rival once again.  NVIDIA&#39;s GeForce GTX 295 is unleashed today.  With a pair of 55nm GT200B GPUs under its hood in a pseudo-single card, dual slot height configuration, it&#39;s direct competition for the ATI Radeon HD 4870 X2.  We&#39;ll step you through the technology behind NVIDIA&#39;s new single card SLI-enabled beast and then clock it around the benchmark track with some of the latest, most popular game titles on the market.&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;Thermal and Power Specs: &lt;br /&gt;Maximum GPU Temperature (in C): 105 &lt;br /&gt;Maximum Graphics Card Power (W): 289&lt;br /&gt;Power Connectors: 6-pin x1, 8-pin x1&lt;br /&gt;&lt;br /&gt;Memory: &lt;br /&gt;Memory Clock (MHz DDR): 1998 MHz &lt;br /&gt;Total Memory Config: 1792 MB &lt;br /&gt;Memory Interface Width: 448-bit per GPU &lt;br /&gt;Total Memory Bandwidth: 223.8GB/s&lt;br /&gt;&lt;br /&gt;Display Support: &lt;br /&gt;Maximum Digital Resolution: 2560x1600 &lt;br /&gt;Maximum VGA Resolution: 2048x1536 &lt;br /&gt;&lt;br /&gt;&lt;br /&gt;The 285 is a single-GPU product that Nvidia claimed was 30 per cent faster than rival one-chip graphics cards. Boards based on the 295 design contain two GPUs. Both types of board are two-slot cards.&lt;br /&gt;&lt;br /&gt;The 285 contains 240 unified shader processors, with the core as a whole running at 648MHz. Its 1GB of GDDR3 is clocked at an effective 1242MHz. 
By comparison, Nvidia&#39;s existing GeForce 280, launched last summer, runs at 602MHz and 1107MHz, respectively.</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/3331268351279499493/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/3331268351279499493' title='0 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/3331268351279499493'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/3331268351279499493'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2009/07/nvidia-geforce-gtx-295.html' title='NVIDIA GeForce GTX 295'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEioDlN7oN09v43CYEoBZGX0ofs8CWasFeXQhl7TgXXKdjCyZ70clQs2IRE_-_YPA6alMn-XIgegBuMSPCNXuHtBpuKYA2d2lwcRDBrz4n2iUrmRsdcKpqMW52KIzFHkLJE94c3W-ENo9taj/s72-c/295_greenie.jpg" height="72" width="72"/><thr:total>0</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-8808910286298110460</id><published>2009-06-30T23:03:00.000-07:00</published><updated>2009-07-02T22:36:53.970-07:00</updated><title type='text'>MSI GeForce N260GTX Lightning</title><content type='html'>&lt;form action=&quot;//http://www.supremegraphiccards.blogspot.com/&quot; id=&quot;cse-search-box&quot;&gt;&lt;br /&gt;  &lt;div&gt;&lt;br /&gt;    &lt;input type=&quot;hidden&quot; name=&quot;cx&quot; value=&quot;partner-pub-4198184901345099:i5ihc0zid5s&quot; 
/&gt;&lt;br /&gt;    &lt;input type=&quot;hidden&quot; name=&quot;cof&quot; value=&quot;FORID:10&quot; /&gt;&lt;br /&gt;    &lt;input type=&quot;hidden&quot; name=&quot;ie&quot; value=&quot;ISO-8859-1&quot; /&gt;&lt;br /&gt;    &lt;input type=&quot;text&quot; name=&quot;q&quot; size=&quot;31&quot; /&gt;&lt;br /&gt;    &lt;input type=&quot;submit&quot; name=&quot;sa&quot; value=&quot;Search&quot; /&gt;&lt;br /&gt;  &lt;/div&gt;&lt;br /&gt;&lt;/form&gt;&lt;br /&gt;&lt;br /&gt;&lt;script type=&quot;text/javascript&quot; src=&quot;http://www.google.com/coop/cse/brand?form=cse-search-box&amp;amp;lang=en&quot;&gt;&lt;/script&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;This GeForce GTX 260 (with 216 shader cores) from MSI is cooled by the Twin Frozr, a GPU cooler with five heatpipes, a dual slot heatsink and two big fans. From the name and the images, this GPU cooler seems really efficient, but the reality is different: when the GPU is stressed, the cooler is loud at 52 dB! Unbearable! And besides being noisy, the cooler seems to have difficulty doing its job.&lt;br /&gt;&lt;br /&gt;With an external panel, customized PCB, heatpipe cooler, 1792 MB of memory, HDMI, 216 shader processors and sheer design versus price, meet the most unique GeForce GTX 260 on the market today.&lt;br /&gt;&lt;script src=&quot;http://feeds.feedburner.com/GraphicCardsNvidia?format=sigpro&quot; type=&quot;text/javascript&quot;&gt;&lt;/script&gt;&lt;noscript&gt;&lt;p&gt;Subscribe to RSS headline updates from: &lt;a href=&quot;http://feeds.feedburner.com/GraphicCardsNvidia&quot;&gt;&lt;/a&gt;&lt;br /&gt;Powered by FeedBurner&lt;/p&gt; &lt;/noscript&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;&lt;div id=&quot;cse-search-results&quot;&gt;&lt;/div&gt;&lt;br /&gt;&lt;script type=&quot;text/javascript&quot;&gt;&lt;br /&gt;  var googleSearchIframeName = &quot;cse-search-results&quot;;&lt;br /&gt;  var googleSearchFormName = &quot;cse-search-box&quot;;&lt;br /&gt;  var googleSearchFrameWidth = 800;&lt;br /&gt;  var 
googleSearchDomain = &quot;www.google.com&quot;;&lt;br /&gt;  var googleSearchPath = &quot;/cse&quot;;&lt;br /&gt;&lt;/script&gt;&lt;br /&gt;&lt;script type=&quot;text/javascript&quot; src=&quot;http://www.google.com/afsonline/show_afs_search.js&quot;&gt;&lt;/script&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;NVIDIA was looking for some more performance from the GTX 260 series -- it needed to give the card a little more bite. As such, NVIDIA introduced the GeForce GTX 260 SP216: the very same product (as the GTX 260 with 192 shader processors), only carrying an additional 24 shader processors.&lt;br /&gt;&lt;br /&gt;192 cores on the GeForce GTX 260&lt;br /&gt;216 cores on the GeForce GTX 260 SP216&lt;br /&gt;240 of them on the GeForce GTX 280/275/285.&lt;br /&gt;By themselves, the extra shader processors offer merely a tiny boost in performance. But what we learned from our initial review is that the slightly more expensive overclocked editions are where the real value is. Once the shader processors are clocked higher, performance instantly climbs much closer to GeForce GTX 280/285 levels. And at sub-$250, that&#39;s just not a bad deal.&lt;br /&gt;&lt;br /&gt;Many AIB and AIC partners jumped the gun and started offering pre-overclocked products; however, there are always a few manufacturers out there that want to do something special. Today we have such a product in our test lab.&lt;br /&gt;&lt;br /&gt;MSI has released two new GeForce GTX 260 video cards: the first is the MSI N260GTX Lightning, and the second a N260GTX Lightning Black Edition. Perhaps you remember them - both products were already showcased at CeBIT this year.&lt;br /&gt;&lt;br /&gt;These are probably the most feature-rich GeForce GTX 260 video cards you can find on the web. 
First off, the MSI N260GTX Lightning has twice the standard GeForce GTX 260 GDDR3 memory, summing up to an incredible 1792 MB, and utilizes Twin Frozr cooling with heatpipes, a dual slot heatsink and two big fans. The card uses a 10-phase PWM design, with 8 phases reserved for the GPU and 2 phases for the memory. Furthermore, this graphics card has important overclocking features like V-Check points to measure the voltages.&lt;br /&gt;&lt;br /&gt;&lt;a onblur=&quot;try {parent.deselectBloggerImageGracefully();} catch(e) {}&quot; href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiSq_uvKtLmVa0GO2Gq9Ii8xd-l2SKs-2NC1p2ZyVzZBT9jFs1kRsXa7CB1IUz0tQwJW9BIYj2w-4kmOjT8Io4TUHnMP0Spfk0BOx2WXAIF3K43QXjrl6GMu58wV6gwG98IE0f7LDueFnN2/s1600-h/imageview.php.jpg&quot;&gt;&lt;img style=&quot;float:left; margin:0 10px 10px 0;cursor:pointer; cursor:hand;width: 200px; height: 112px;&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiSq_uvKtLmVa0GO2Gq9Ii8xd-l2SKs-2NC1p2ZyVzZBT9jFs1kRsXa7CB1IUz0tQwJW9BIYj2w-4kmOjT8Io4TUHnMP0Spfk0BOx2WXAIF3K43QXjrl6GMu58wV6gwG98IE0f7LDueFnN2/s200/imageview.php.jpg&quot; border=&quot;0&quot; alt=&quot;&quot; id=&quot;BLOGGER_PHOTO_ID_5353368792521242306&quot; /&gt;&lt;/a&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;You will spot two versions; both are very similar. The Lightning Black Edition is paired with the company&#39;s AirForce Panel, an external touch panel that can change the card&#39;s voltages and clock speeds on the fly. 
That is not included in the regular Lightning edition.</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/8808910286298110460/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/8808910286298110460' title='0 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/8808910286298110460'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/8808910286298110460'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2009/06/msi-geforce-n260gtx-lightning.html' title='MSI GeForce N260GTX Lightning'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiSq_uvKtLmVa0GO2Gq9Ii8xd-l2SKs-2NC1p2ZyVzZBT9jFs1kRsXa7CB1IUz0tQwJW9BIYj2w-4kmOjT8Io4TUHnMP0Spfk0BOx2WXAIF3K43QXjrl6GMu58wV6gwG98IE0f7LDueFnN2/s72-c/imageview.php.jpg" height="72" width="72"/><thr:total>0</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-6598439492709623652</id><published>2009-06-30T22:46:00.000-07:00</published><updated>2009-07-02T19:17:01.748-07:00</updated><title type='text'>Asus EN6700GT</title><content type='html'>Asus EN6700GT/HTDI&lt;br /&gt;The graphics card on a media PC needs to present a workable compromise between noise output and bulk versus picture quality. For our builds we chose an Asus EN6700GT/HTDI with built-in HDMI (these retail for around $200). 
As we were finishing this story, we learned about Nvidia&#39;s newest line of 8600 processors, which feature a redesigned graphics processor and include HDMI/HDCP and other important media functions (especially TV/HDTV output) as well as Vista Ready status. Though only a few models are available right now, and none of them passively cooled, it looks like these should be ready by summer&#39;s end for somewhere between $200 and $220 (by that time, passive 7600 cards will probably cost $150 or less).&lt;br /&gt;&lt;br /&gt;Whenever we can find a suitable passively-cooled graphics card for a media PC, we immediately take that option because such cards contribute nothing to the system&#39;s overall noise output. For the previous generation of graphics processors, this meant that the Nvidia card of choice was a passively cooled 6600, and the highest-level ATI graphics processor was a 1600XT model. Today, lots of HDMI/HDCP options are available for Nvidia 7xxx and 88xx processors, as well as the upcoming R600 family from ATI (where ATI has a substantial leg up over Nvidia at present because it includes onboard sound as well as video circuitry). (For more information on these increasingly important acronyms and their underlying technologies, see the sidebar entitled &quot;HDMI, HDCP Gotchas and Workarounds&quot;).&lt;br /&gt;&lt;br /&gt;The EN7600GT is a great mid-range card. If you&#39;re looking to get into gaming but don&#39;t want to break the bank, you could do far worse.&lt;br /&gt;&lt;br /&gt;Physically, the card looks unremarkable. Asus sticker notwithstanding, it uses the standard Nvidia heatsink and fan, and only stands out because of some curious additions to its rear connection plate. 
There&#39;s an optical SPDIF port at the top of the plate and an HDMI port sitting between the more traditional DVI and S-Video ports.&lt;br /&gt;&lt;br /&gt;The card&#39;s graphics processing unit (GPU) runs at a clock speed of 400MHz and has 256MB of DDR3 memory running at an effective speed of 1,400MHz. Asus has chosen to play it safe by not overclocking the GPU or memory as standard, but like all Nvidia cards it can be overclocked by the end user with the aid of the accompanying driver and software. Not that there&#39;s a particular need to go overboard with overclocking; the EN7600GT should serve the needs of the vast majority of users. It has 12 pixel shader pipelines, five vertex shaders and a memory bandwidth of 22.4GBps, all of which allows ample, if hardly mind-blowing, performance.&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;The EN7600GT ran Doom 3 at 95fps at a resolution of 1,280x1,024 pixels and dropped to 73fps when running at 1,600x1,200 pixels. With 4x anti-aliasing (AA) and 8x anisotropic filtering (AF) enabled, the card scored 46.8fps and 35fps respectively at the above resolutions. 
These results indicate the card is perfectly adequate if you aren&#39;t too bothered by image quality enhancements, but struggles slightly if AA and AF are switched on.&lt;br /&gt;&lt;br /&gt;&lt;a onblur=&quot;try {parent.deselectBloggerImageGracefully();} catch(e) {}&quot; href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh3TxR0rzmUnCDEbr-h08pxZgNZI-pA_PLPwrqqf3ea-zcYPsmCt1ckm9OehOCuxWsbLzJ6z1ReMbKf7pdfzwz9rfAyYD4zZWmcZnA6M8hyby7Cbt8Hx8ddT6ymnhON5K4gxZlnQX1FtOx1/s1600-h/hauppauge-wintv-pvr-500mce.jpg&quot;&gt;&lt;img style=&quot;display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 148px; height: 200px;&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh3TxR0rzmUnCDEbr-h08pxZgNZI-pA_PLPwrqqf3ea-zcYPsmCt1ckm9OehOCuxWsbLzJ6z1ReMbKf7pdfzwz9rfAyYD4zZWmcZnA6M8hyby7Cbt8Hx8ddT6ymnhON5K4gxZlnQX1FtOx1/s200/hauppauge-wintv-pvr-500mce.jpg&quot; border=&quot;0&quot; alt=&quot;&quot;id=&quot;BLOGGER_PHOTO_ID_5353365575179565058&quot; /&gt;&lt;/a&gt;&lt;br /&gt;&lt;a onblur=&quot;try {parent.deselectBloggerImageGracefully();} catch(e) {}&quot; href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEih9kuU66wqXYJSOQ7egxt4iRlgxG5mbEn-m2-xd6499UhFyrrkYMdDyUZuX_YJVt4Xe1sXJVVfADQVfvIog8D_hWWS-eokFm8l5uNfXJa7J9iuDyGwf1FsEiyX_90Et0Msw8MoqiY6guDR/s1600-h/base_media.jpg&quot;&gt;&lt;img style=&quot;float:right; margin:0 0 10px 10px;cursor:pointer; cursor:hand;width: 200px; height: 150px;&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEih9kuU66wqXYJSOQ7egxt4iRlgxG5mbEn-m2-xd6499UhFyrrkYMdDyUZuX_YJVt4Xe1sXJVVfADQVfvIog8D_hWWS-eokFm8l5uNfXJa7J9iuDyGwf1FsEiyX_90Et0Msw8MoqiY6guDR/s200/base_media.jpg&quot; border=&quot;0&quot; alt=&quot;&quot;id=&quot;BLOGGER_PHOTO_ID_5353364242569495938&quot; /&gt;&lt;/a&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;The EN7600GT ran Half-Life 2 at 63fps at a resolution of 1,280x1,024 pixels, and at 60.9fps at 1,600x1,200 pixels. With 4x AA and 8x AF enabled it scored 58fps and 49fps respectively at the same resolutions.&lt;br /&gt;&lt;br /&gt;In 3DMark 2006, our synthetic benchmark test, the EN7600GT scored 3,360, which is better than the 2,532 our reference ATI Radeon X1600 XT achieved.</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/6598439492709623652/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/6598439492709623652' title='0 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/6598439492709623652'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/6598439492709623652'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2009/06/asus-en6700gt.html' title='Asus EN6700GT'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh3TxR0rzmUnCDEbr-h08pxZgNZI-pA_PLPwrqqf3ea-zcYPsmCt1ckm9OehOCuxWsbLzJ6z1ReMbKf7pdfzwz9rfAyYD4zZWmcZnA6M8hyby7Cbt8Hx8ddT6ymnhON5K4gxZlnQX1FtOx1/s72-c/hauppauge-wintv-pvr-500mce.jpg" height="72" 
width="72"/><thr:total>0</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-6564580726238591273</id><published>2009-04-04T22:24:00.000-07:00</published><updated>2009-04-04T22:37:56.159-07:00</updated><title type='text'>ATI RADEON 7500 Multi-monitor graphics card - 64 and Matrox Millennium G450 DualHead Multi-monitor graphics card - 16 MB - DDR SGRAM MB - DDR SDRAM</title><content type='html'>&lt;a onblur=&quot;try {parent.deselectBloggerImageGracefully();} catch(e) {}&quot; href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgJRXXWiUpml0uFkc-5k-GApv5XkuOZxMyiCTzNvLBxOHINFCQPQMgIQqRkzfFKh6ZTR4w-9EBQJtvmob5LPzGooZ-Py4a3fgJGdztxte7zk-Uj4fSjTtcoSS7vIdwdt-Ai_PLre5upl7Fi/s1600-h/base_media.jpg&quot;&gt;&lt;img style=&quot;float:right; margin:0 0 10px 10px;cursor:pointer; cursor:hand;width: 200px; height: 200px;&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgJRXXWiUpml0uFkc-5k-GApv5XkuOZxMyiCTzNvLBxOHINFCQPQMgIQqRkzfFKh6ZTR4w-9EBQJtvmob5LPzGooZ-Py4a3fgJGdztxte7zk-Uj4fSjTtcoSS7vIdwdt-Ai_PLre5upl7Fi/s200/base_media.jpg&quot; border=&quot;0&quot; alt=&quot;&quot;id=&quot;BLOGGER_PHOTO_ID_5321074779806416082&quot; /&gt;&lt;/a&gt;&lt;br /&gt;ATI Radeon 7500, DVI (DVI), AGP 4x, Plug-in card&lt;br /&gt; &lt;br /&gt;ATI RADEON 7500 is a powerful and versatile graphic solution. 64 MB of powerful DDR memory along with the RADEON 7500 GPU provide high-performance acceleration of today&#39;s demanding 3D graphic applications. Industry-leading DVD playback, support for dual independent displays, and support for digital flat panel (DVI-I) monitors meet the needs of a wide range of home and business graphic users. &lt;br /&gt;&lt;br /&gt;This card is an EXCELLENT value for the price. Performance is on par with a GF2 GTS or ULTRA or with a GF4 MX440. This can&#39;t dish out the power of a GF3, but for nearly HALF the price, you can&#39;t complain. 
2D quality is also very good, and features like DVI-I and S-Video out make this card an exceptionally good deal. I highly recommend it for old computers in need of an upgrade. For newer computers, I recommend getting either a Radeon 8500 or a GF4 Ti, but at this price the 7500 makes a perfect upgrade for older machines.&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;$32.00&lt;br /&gt;&lt;br /&gt;&lt;span style=&quot;font-weight:bold;&quot;&gt;Matrox Millennium G450 DualHead Multi-monitor graphics card - 16 MB DDR SDRAM&lt;br /&gt;&lt;/span&gt;&lt;br /&gt;Matrox MGA G450, AGP 4x, Plug-in card&lt;br /&gt; &lt;br /&gt;The new Matrox G450 256-bit DualBus graphics chip integrates more features than ever before. It boasts industry-defining 2D and 3D image quality and speed. DualHead Multi-Display will change the way you use your PC. Vibrant Color Quality rendering brings photo-realistic color to all your applications. Hardware Environment-Mapped Bump Mapping brings 3D textures to life like never before. DualHead DVDMax lets you output DVD movies to your TV independently of your PC desktop. The G450 also has comprehensive driver support.&lt;br /&gt;&lt;br /&gt;The low-profile design of the G450 and its support for all compliant AGP systems make it compatible with a wide variety of systems, and it combines DualHead support for running two monitors at a time with the reliability, stability, and features of the proven Millennium G-Series product line. Applications for this product include computer-aided dispatch, process control, and alarm monitoring. 
By helping to manage large amounts of information, Matrox multi-display technology can improve productivity and reduce errors.&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;MATROX GRAPHICS Parts List&lt;br /&gt;15941-00 - Matrox - Video cable - DB-15 (F) - 60 pin LFH (M) - 1 ft&lt;br /&gt;A8X256 - Matrox Parhelia graphics adapter - Parhelia-512 - 256 MB&lt;br /&gt;ADAPT18 - ADAPTOR DVI PLUG TO HD15FEM FOR PH-A128B&lt;br /&gt;ADAPT20 - ADAPTOR DVI PLUG TO HD15FEM FOR PH-A128B&lt;br /&gt;ADP-DVI-AF - DVI (male) to HD15 (female) Adapter&lt;br /&gt;CAB-DVI-VINF - Video-input cable for Matrox Parhelia 256MB PCI&lt;br /&gt;CAB-HD15-TVF - TV-output adapter cable for Millennium G450/G550 series, Millennium P650, Millen&lt;br /&gt;CAB-KX20-QTVF - KX20-TO-QUAD-TV ADAPTER UPGRADE CABLE&lt;br /&gt;CAB-L60-2XAF - LFH60 TO HD15 DUAL-MONITOR CABLE (1 FOOT) FOR G200/G450 MULTI-MONITOR SERIES, MI&lt;br /&gt;CAB-L60-2XD6F - 6ft DiGtal CBL provide Dual Monitor DVI&lt;br /&gt;CAB-L60-2XTVF - Dual TV output adapter (composite video and S-video, 1-foot) for Matrox G450 - M&lt;br /&gt;CAB-L60-4XAF - CABLE LFH60 M TO 4XHD15 F ANALOG&lt;br /&gt;CAB-XTO-5F - Fiber-optic Dual-LC 5-meter cable&lt;br /&gt;D2G-A2A-AJ - DualHead2Go - Dual Analog Edition&lt;br /&gt;D2G-A2A-IF - DUALHEAD2GO - DUAL ANALOG EDITION&lt;br /&gt;D2G-A2D-IF - DUALHEAD2GO DIGITAL EDITION&lt;br /&gt;DL-CAB-DVI - MATROX GRAPHICS DL-CAB-DVI Display cable / DVI-D&lt;br /&gt;EPI-TC2P32LPAF - EPICA TC2-LITE IS A LOW PROFILE, LOW WATTAGE, FANLESS PCI DISPLAY GRAPHICS CONTR&lt;br /&gt;EPI-TC2P64LPAF - EpicA TC2 is a high resolution, dual-display PCI graphics controller.</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/6564580726238591273/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/6564580726238591273' title='0 Comments'/><link 
rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/6564580726238591273'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/6564580726238591273'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2009/04/ati-radeon-7500-multi-monitor-graphics.html' title='ATI RADEON 7500 Multi-monitor graphics card - 64 and Matrox Millennium G450 DualHead Multi-monitor graphics card - 16 MB - DDR SGRAM MB - DDR SDRAM'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgJRXXWiUpml0uFkc-5k-GApv5XkuOZxMyiCTzNvLBxOHINFCQPQMgIQqRkzfFKh6ZTR4w-9EBQJtvmob5LPzGooZ-Py4a3fgJGdztxte7zk-Uj4fSjTtcoSS7vIdwdt-Ai_PLre5upl7Fi/s72-c/base_media.jpg" height="72" width="72"/><thr:total>0</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-1183807610742352930</id><published>2009-04-04T22:13:00.000-07:00</published><updated>2009-04-04T22:44:15.607-07:00</updated><title type='text'>ATi,s HD 4890</title><content type='html'>&lt;a onblur=&quot;try {parent.deselectBloggerImageGracefully();} catch(e) {}&quot; href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi5xw3G2pIIspKxGGQavFZDQUO9ppKEgMoksJ9iXucSbXdMgnLBcJZU24FzFkdkkqIESz3OwnXSWvcQ9Xxd7AHvBAcrjzGPG-k4bIgCK14yzKUv7NcjMsiQf6HJRcrkZF5KT2UjMz4TogqF/s1600-h/100733_eah4890.jpg&quot;&gt;&lt;img style=&quot;float:right; margin:0 0 10px 10px;cursor:pointer; cursor:hand;width: 200px; height: 158px;&quot; 
src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi5xw3G2pIIspKxGGQavFZDQUO9ppKEgMoksJ9iXucSbXdMgnLBcJZU24FzFkdkkqIESz3OwnXSWvcQ9Xxd7AHvBAcrjzGPG-k4bIgCK14yzKUv7NcjMsiQf6HJRcrkZF5KT2UjMz4TogqF/s200/100733_eah4890.jpg&quot; border=&quot;0&quot; alt=&quot;&quot;id=&quot;BLOGGER_PHOTO_ID_5321071341399368402&quot; /&gt;&lt;/a&gt;&lt;br /&gt;Asus Announces Radeon HD 4890 Graphics Cards:&lt;br /&gt;&lt;br /&gt;Asus has launched the Asus EAH4890 series graphics cards with Voltage Tweak technology. The two new Radeon HD 4890 cards - the EAH4890 TOP and the EAH4890 standard edition - are based on the ATI Radeon HD 4890 GPU. Asus will sell the EAH4890 TOP for Rs. 20,400 and the EAH4890 for Rs. 19,500; both prices are exclusive of taxes.&lt;br /&gt;&lt;br /&gt;The standard edition EAH4890 graphics card has 800 stream processors, a core clocked at 850MHz, and 1GB of GDDR5 video memory clocked at 3900MHz with a 256-bit interface.&lt;br /&gt;&lt;br /&gt;Asus has used its Voltage Tweak technology in the EAH4890 TOP graphics card, which is overclocked at the factory, with the core clocked at 900MHz and memory clocked at 4000MHz. Asus claims that game performance can be boosted by 15 percent with the SmartDoctor application on the EAH4890 TOP.&lt;br /&gt;&lt;br /&gt;Thanks to the SmartDoctor application, gamers no longer have to worry about re-flashing their BIOS to get a voltage and performance boost. &lt;br /&gt;&lt;br /&gt;The HD 4890 cards support the PCI Express 2.0 bus and offer two dual-link DVI outputs as well as HDMI out. A maximum resolution of 2560 x 1600 is supported. &lt;br /&gt;&lt;br /&gt;The Asus EAH4890 standard edition card will directly compete with Nvidia GeForce GTX 275 and 285 based graphics cards.&lt;br /&gt;&lt;br /&gt;As anyone remotely in tune with the tech sector can attest, the rivalry in the PC graphics card market between AMD / ATI and NVIDIA is as intense as ever. 
It used to be that one of the two companies would release a new product, only to have the other release a competing offering a few weeks, or maybe a few months later. But even in these gloomy economic times, AMD and NVIDIA continue to fight on and today both graphics giants are releasing new graphics cards aimed squarely at one another. Not a few weeks apart, but simultaneously on the very same day. Don&#39;t believe us? See here for our NVIDIA GeForce GTX 275 coverage.&lt;br /&gt;&lt;br /&gt;AMD is rolling out the ATI Radeon HD 4890 today, technically a new graphics card, but one that borrows heavily from the previous generation. The Radeon HD 4890 is based on an updated variant of the popular RV770 GPU which powers Radeon HD 4850 and 4870 cards, dubbed the RV790. We&#39;ve got some technical details regarding the RV790 GPU below and have more particulars regarding the actual cards and performance on the pages ahead. Read on to see what AMD has in store with the brand new Radeon HD 4890...&lt;br /&gt;&lt;br /&gt;956 million transistors on 55nm fabrication process&lt;br /&gt;PCI Express 2.0 x16 bus interface&lt;br /&gt;256-bit GDDR3/GDDR5 memory interface&lt;br /&gt;Microsoft DirectX 10.1 support&lt;br /&gt;&lt;br /&gt;Shader Model 4.1&lt;br /&gt;32-bit floating point texture filtering&lt;br /&gt;Indexed cube map arrays&lt;br /&gt;Independent blend modes per render target&lt;br /&gt;Pixel coverage sample masking&lt;br /&gt;Read/write multi-sample surfaces with shaders&lt;br /&gt;Gather4 texture fetching&lt;br /&gt;Unified Superscalar Shader Architecture&lt;br /&gt;&lt;br /&gt;800 stream processing units&lt;br /&gt;&lt;br /&gt;Dynamic load balancing and resource allocation for vertex, geometry, and pixel shaders&lt;br /&gt;Common instruction set and texture unit access supported for all types of shaders&lt;br /&gt;Dedicated branch execution units and texture address processors&lt;br /&gt;128-bit floating point precision for all operations&lt;br /&gt;Command 
processor for reduced CPU overhead&lt;br /&gt;Shader instruction and constant caches&lt;br /&gt;Up to 160 texture fetches per clock cycle&lt;br /&gt;Up to 128 textures per pixel&lt;br /&gt;Fully associative multi-level texture cache design&lt;br /&gt;DXTC and 3Dc+ texture compression&lt;br /&gt;High resolution texture support (up to 8192 x 8192)&lt;br /&gt;Fully associative texture Z/stencil cache designs&lt;br /&gt;Double-sided hierarchical Z/stencil buffer&lt;br /&gt;Early Z test, Re-Z, Z Range optimization, and Fast Z Clear&lt;br /&gt;Lossless Z &amp; stencil compression (up to 128:1)&lt;br /&gt;Lossless color compression (up to 8:1)&lt;br /&gt;8 render targets (MRTs) with anti-aliasing support&lt;br /&gt;Physics processing support&lt;br /&gt;Dynamic Geometry Acceleration&lt;br /&gt;&lt;br /&gt;High performance vertex cache&lt;br /&gt;Programmable tessellation unit&lt;br /&gt;Accelerated geometry shader path for geometry amplification&lt;br /&gt;Memory read/write cache for improved stream output performance&lt;br /&gt;Anti-aliasing features&lt;br /&gt;&lt;br /&gt;Multi-sample anti-aliasing (2, 4 or 8 samples per pixel)&lt;br /&gt;Up to 24x Custom Filter Anti-Aliasing (CFAA) for improved quality&lt;br /&gt;Adaptive super-sampling and multi-sampling&lt;br /&gt;Gamma correct&lt;br /&gt;Super AA (ATI CrossFireX configurations only)&lt;br /&gt;All anti-aliasing features compatible with HDR rendering&lt;br /&gt;Texture filtering features&lt;br /&gt;&lt;br /&gt;2x/4x/8x/16x high quality adaptive anisotropic filtering modes (up to 128 taps per pixel)&lt;br /&gt;128-bit floating point HDR texture filtering&lt;br /&gt;sRGB filtering (gamma/degamma)&lt;br /&gt;Percentage Closer Filtering (PCF)&lt;br /&gt;Depth &amp; stencil texture (DST) format support&lt;br /&gt;Shared exponent HDR (RGBE 9:9:9:5) texture format support&lt;br /&gt;OpenGL 2.0 support&lt;br /&gt;ATI PowerPlay&lt;br /&gt;&lt;br /&gt;Advanced power management technology for optimal performance and power 
savings&lt;br /&gt;Performance-on-Demand&lt;br /&gt;&lt;br /&gt;Constantly monitors GPU activity, dynamically adjusting clocks and voltage based on user scenario&lt;br /&gt;Clock and memory speed throttling&lt;br /&gt;Voltage switching&lt;br /&gt;Dynamic clock gating&lt;br /&gt;Central thermal management – on-chip sensor monitors GPU temperature and triggers thermal actions as required</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/1183807610742352930/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/1183807610742352930' title='0 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/1183807610742352930'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/1183807610742352930'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2009/04/atis-hd-4890.html' title='ATi,s HD 4890'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi5xw3G2pIIspKxGGQavFZDQUO9ppKEgMoksJ9iXucSbXdMgnLBcJZU24FzFkdkkqIESz3OwnXSWvcQ9Xxd7AHvBAcrjzGPG-k4bIgCK14yzKUv7NcjMsiQf6HJRcrkZF5KT2UjMz4TogqF/s72-c/100733_eah4890.jpg" height="72" width="72"/><thr:total>0</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-1552903343224806768</id><published>2009-02-14T03:14:00.000-08:00</published><updated>2009-02-14T03:18:01.442-08:00</updated><title type='text'>Nvidia GeForce 8800 GTX (768MB)</title><content 
type='html'>&lt;a onblur=&quot;try {parent.deselectBloggerImageGracefully();} catch(e) {}&quot; href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiMCoIAbx95XeH81DQSILVMN5yY_ifhsQuVtpYYTZ5Bodfvkplb7EbE3FdDzj8yOnrxIyousQbpP18_b6Z2ORUn-xvldXBi0dYz1bQ1Mx7bgg8HSr7mwSXzV56BlSmwCZP05nBAMBI5pOQe/s1600-h/32132889-2-440-overview-1.gif&quot;&gt;&lt;img style=&quot;float:right; margin:0 0 10px 10px;cursor:pointer; cursor:hand;width: 200px; height: 150px;&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiMCoIAbx95XeH81DQSILVMN5yY_ifhsQuVtpYYTZ5Bodfvkplb7EbE3FdDzj8yOnrxIyousQbpP18_b6Z2ORUn-xvldXBi0dYz1bQ1Mx7bgg8HSr7mwSXzV56BlSmwCZP05nBAMBI5pOQe/s200/32132889-2-440-overview-1.gif&quot; border=&quot;0&quot; alt=&quot;&quot;id=&quot;BLOGGER_PHOTO_ID_5302610540968919234&quot; /&gt;&lt;/a&gt;&lt;br /&gt;GeForce 8800 GTX brings tremendous processing power to current-generation games. It&#39;s also the first card to market that will support all of the 3D gaming-related features of Windows Vista and DirectX 10. The initial release of next-gen games is a bit far off. The poster child, the 3D shooter Crysis, is set to debut in March 2007, and even that game might not put all of the next-gen bells and whistles into play. Still, the GeForce 8800 GTX is so powerful, even compared to ATI&#39;s fastest dual card combination, that there&#39;s no reason to spend roughly $1,000 on a pair of Radeon cards when you can outperform them with a single $600 GeForce 8800 GTX. That and the fact that Nvidia has finally caught up to ATI&#39;s image-quality advantages earn Nvidia&#39;s newest card our Editors&#39; Choice award for high-end 3D graphics cards.&lt;br /&gt;&lt;br /&gt;Because of design changes in the GeForce 8800 GTX chip&#39;s new architecture, we need to consider some of this card&#39;s specs differently than we have in the past. The basics are the same. 
The GeForce 8800 GTX has a core clock speed of 575MHz, and it comes with 768MB of DDR3 RAM clocked to 900MHz with a 1,800MHz data rate. That memory rate is a significant uptick compared to the 800MHz RAM in Nvidia&#39;s last flagship card, the GeForce 7950 GX2. But one of the main differences in the GeForce 8800 GTX&#39;s architecture lies in how we consider its pipelines.&lt;br /&gt;&lt;br /&gt;With no DirectX 10 games available to test on at the moment, we can&#39;t speak to the GeForce 8800 GTX&#39;s next-generation performance, aside from the fact that it&#39;s the only card on the market that claims DirectX 10 compatibility. ATI&#39;s next-gen card, code-named R600, was rumored to be released in January 2007, but we haven&#39;t heard much about it so far. We imagine that ATI (whose acquisition by AMD has been finalized) will have a DirectX 10 card sooner or later, but right now Nvidia is the only vendor with something to show. And while we can&#39;t really say who will win the battle for next-generation performance, the GeForce 8800 GTX dominates every single other card on the market right now.&lt;br /&gt;&lt;br /&gt;One of the most important things to note about the GeForce 8800 GTX and its performance is that you would be smart to pair this card with a capable monitor that can go to resolutions of 1,600x1,200 or above. Nvidia calls this XHD (extreme high definition) gaming. Whatever you want to call it, if you&#39;re not playing at high resolutions with antialiasing, anisotropic filtering, and other image-quality tweaks cranked, you&#39;ll likely hit a CPU bottleneck, which means that you&#39;re not giving the card enough to do. 
But when you get up to those high-quality settings, the results are amazing.&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;The good:&lt;br /&gt;Dominating performance in current-generation games; catches up to ATI on current-gen image quality; first card out with support for DirectX10 and next-gen gaming features; amazing value proposition.&lt;br /&gt;&lt;br /&gt;The bad:&lt;br /&gt;Will likely require you to beef up your power supply in SLI mode.</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/1552903343224806768/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/1552903343224806768' title='0 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/1552903343224806768'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/1552903343224806768'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2009/02/geforce-8800-gtx-brings-tremendous.html' title='Nvidia GeForce 8800 GTX (768MB)'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiMCoIAbx95XeH81DQSILVMN5yY_ifhsQuVtpYYTZ5Bodfvkplb7EbE3FdDzj8yOnrxIyousQbpP18_b6Z2ORUn-xvldXBi0dYz1bQ1Mx7bgg8HSr7mwSXzV56BlSmwCZP05nBAMBI5pOQe/s72-c/32132889-2-440-overview-1.gif" height="72" 
width="72"/><thr:total>0</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-3236040441117006564</id><published>2009-02-14T03:13:00.001-08:00</published><updated>2009-02-14T03:14:08.390-08:00</updated><title type='text'>ATI Radeon HD 3450 Graphics Card</title><content type='html'>&lt;a onblur=&quot;try {parent.deselectBloggerImageGracefully();} catch(e) {}&quot; href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgmtEz6lRO7OvXls6eZLZBqCMP9WSiij1vJhjWGkTbmc4eAuiNbgSFyUrpHNMjjLAsNn_NDLduNb7tm6tt23uP-VbN251_QqT-uq0XAGwTTrJcEEc56VwKEA8V5nkR-8OTE3Zbu4l_sCsUF/s1600-h/ati-radeon-hd-3450-graphics-card1_large.jpg&quot;&gt;&lt;img style=&quot;float:right; margin:0 0 10px 10px;cursor:pointer; cursor:hand;width: 200px; height: 129px;&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgmtEz6lRO7OvXls6eZLZBqCMP9WSiij1vJhjWGkTbmc4eAuiNbgSFyUrpHNMjjLAsNn_NDLduNb7tm6tt23uP-VbN251_QqT-uq0XAGwTTrJcEEc56VwKEA8V5nkR-8OTE3Zbu4l_sCsUF/s200/ati-radeon-hd-3450-graphics-card1_large.jpg&quot; border=&quot;0&quot; alt=&quot;&quot;id=&quot;BLOGGER_PHOTO_ID_5302609672043602402&quot; /&gt;&lt;/a&gt;&lt;br /&gt;AMD&#39;s entry-level ATI Radeon HD 3450 graphics card is an amazing deal for those looking to juice up their PC&#39;s video-playback performance and quality. For just $49, you get support for HD video playback, high desktop resolutions, power enough to display all the bells and whistles of Windows Vista&#39;s Aero interface, and the ability to add additional cards to support more monitors. But while it supports the latest DirectX 10.1 (DX10.1) 3D features, the Radeon HD 3450 is decidedly not the card for you if you play games.&lt;br /&gt;&lt;br /&gt;Our sample card was a half-height, fanless PCI Express model with 256MB of DDR2 memory, intended for low-profile home theater cases. 
The card has only a single DVI port, as well as a component-video/S-Video connector and a High-Definition Multimedia Interface (HDMI) adapter for the DVI port. ATI also has a model available with both DVI and VGA ports, as well as one with a VGA port and the new DisplayPort connector.&lt;br /&gt;&lt;br /&gt;Though this is an inexpensive, entry-level card, the HD 3450 has the power to handle all of Vista&#39;s graphics effects, including desktop transparency, Flip 3D task switching, and the full suite of slide-show effects. That makes it a good replacement for entry-level PC graphics, such as the integrated graphics chips found on many motherboards, which are fast enough for basic effects such as transparency but must disable more sophisticated effects such as slide-show transitions.&lt;br /&gt;&lt;br /&gt;While its desktop performance was excellent, the HD 3450&#39;s gaming performance was dismal. It delivered slide-show-like frame rates of 11 frames per second (fps) in F.E.A.R. and 4.6fps in Company of Heroes (both at a resolution of 1,280x1,024). If you have a nostalgic bent, the card has enough power to handle 3D games from early in the decade at low resolutions, but its support of the DX10.1 standard used by the newest games is mostly there as a checkbox item for the promotional text on the card&#39;s box. Casual gamers should consider the Radeon HD 3650 as the bare minimum, with the HD 3850 a more suitable entry-level card for serious gamers.&lt;br /&gt;&lt;br /&gt;For the card&#39;s home-theater-PC target audience, though, the HD 3450 delivers. Its lack of a fan offers silent operation, and its video-playback performance and visual quality are top-notch. Despite its low price, the card offered flawless playback of 1080p HD content on a 1,920x1,200 24-inch monitor. It supports ATI Avivo HD video, with hardware decoding of MPEG and DivX, as well as the H.264 and VC-1 video codecs used on HD DVD and Blu-ray discs. 
This makes for a sharp, clear picture with smooth frame rates, even at high resolutions, without requiring a PC with a high-end CPU to decode the video. The HD 3000 series cards improve on the hardware decoder introduced in the HD 2000 series by reducing CPU utilization and increasing memory bandwidth to smooth playback at the highest resolutions. With the HDMI adapter and High-Bandwidth Digital Content Protection (HDCP) support, the HD 3450 will handle even copyright-protected HD video.&lt;br /&gt;&lt;br /&gt;If you&#39;re looking for better 3D-gaming performance, plan on spending more than the Radeon HD 3450&#39;s bargain price. But if you just want a graphics card that delivers good performance on the Windows desktop and excellent video-playback performance and quality, this card is hard to beat.&lt;br /&gt;&lt;br /&gt;Direct Price: $49</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/3236040441117006564/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/3236040441117006564' title='0 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/3236040441117006564'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/3236040441117006564'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2009/02/ati-radeon-hd-3450-graphics-card.html' title='ATI Radeon HD 3450 Graphics Card'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><media:thumbnail 
xmlns:media="http://search.yahoo.com/mrss/" url="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgmtEz6lRO7OvXls6eZLZBqCMP9WSiij1vJhjWGkTbmc4eAuiNbgSFyUrpHNMjjLAsNn_NDLduNb7tm6tt23uP-VbN251_QqT-uq0XAGwTTrJcEEc56VwKEA8V5nkR-8OTE3Zbu4l_sCsUF/s72-c/ati-radeon-hd-3450-graphics-card1_large.jpg" height="72" width="72"/><thr:total>0</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-160054500954872964</id><published>2008-11-08T10:46:00.000-08:00</published><updated>2008-11-08T10:51:50.849-08:00</updated><title type='text'>AMD ATI Radeon HD 4800</title><content type='html'>&lt;a href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjnetVoxIrc4MERk8qHBeecR5xCetHsVDPDxRy4Xbovp3i4Gptj_9m-kUIP0JU6JkWfSxtfuxcEUWAXhrFeE-rHeYPZYU_spPnzKIQhX0-xBGYun_JNJ61rpfmjv6PWduGTxkQKKitHctxR/s1600-h/7941-amd4850span.jpg&quot;&gt;&lt;img style=&quot;float:right; margin:0 0 10px 10px;cursor:pointer; cursor:hand;width: 200px; height: 136px;&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjnetVoxIrc4MERk8qHBeecR5xCetHsVDPDxRy4Xbovp3i4Gptj_9m-kUIP0JU6JkWfSxtfuxcEUWAXhrFeE-rHeYPZYU_spPnzKIQhX0-xBGYun_JNJ61rpfmjv6PWduGTxkQKKitHctxR/s200/7941-amd4850span.jpg&quot; border=&quot;0&quot; alt=&quot;&quot;id=&quot;BLOGGER_PHOTO_ID_5266360459676487554&quot; /&gt;&lt;/a&gt;&lt;br /&gt;&lt;br /&gt;With nVidia&#39;s latest graphics card having launched within the last week or so, it was almost inevitable that AMD would want to show its hand. Enter the AMD ATI Radeon HD 4800 series.&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;Apparently (but unofficially for now) the 4870 will offer a faster 750MHz core clock and use GDDR5 memory, which should be impressive. About a month later, an &#39;X2&#39; version will apparently also arrive, and this will, oddly enough, be the flagship card of the range. 
The main advantage AMD has is that using a 55nm manufacturing process allows the production costs of the 4800 series to be considerably lower than those of Nvidia&#39;s competing GTX 200-series cards, which in turn means the cards are vastly cheaper. We&#39;re told to expect the 4850 to retail in the £130-150 region, half the price of the GeForce GTX 260 with which it will likely compete.&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;AMD says it made a &quot;strategic decision to focus on GPU designs that maximized our efficiency and allowed us to provide enthusiasts, performance and mainstream users with the most compelling value proposition at every price point. The ATI Radeon 4800 series sets a new industry standard in key metrics such as performance-per-watt, performance-per-mm2 of chip die size, and performance-per-dollar.&quot; &lt;br /&gt;&lt;br /&gt;The ATI Radeon HD 4870, available immediately with a suggested retail price of USD$299, represents an unprecedented 1.2 teraFLOPS of visual compute power. It features a stock GPU core clock speed of 750 MHz, 512 MB of GDDR5 memory rated at 3.6 gigabits/second, and comes in a dual-slot PCI Express 2.0 configuration with a maximum board power of 160 watts. &lt;br /&gt;&lt;br /&gt;The ATI Radeon HD 4850, immediately available with a suggested retail price of USD$199, received an enthusiastic welcome from global graphics reviewers. 
The ATI Radeon HD 4850 is the world’s first teraFLOPS graphics chip, with 800 stream processing cores (identical to the ATI Radeon HD 4870), a stock GPU core clock speed of 625 MHz, 512 MB of GDDR3 memory rated at 2 gigabits/second, and comes in a single-slot PCI Express 2.0 configuration with a maximum board power of 110 watts.</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/160054500954872964/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/160054500954872964' title='0 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/160054500954872964'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/160054500954872964'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2008/11/amd-ati-radeon-hd-4800.html' title='AMD ATI Radeon HD 4800'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjnetVoxIrc4MERk8qHBeecR5xCetHsVDPDxRy4Xbovp3i4Gptj_9m-kUIP0JU6JkWfSxtfuxcEUWAXhrFeE-rHeYPZYU_spPnzKIQhX0-xBGYun_JNJ61rpfmjv6PWduGTxkQKKitHctxR/s72-c/7941-amd4850span.jpg" height="72" width="72"/><thr:total>0</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-2099267832133461424</id><published>2008-08-08T01:51:00.000-07:00</published><updated>2008-08-08T01:53:08.119-07:00</updated><title type='text'>ZOTAC-IT&#39;S time to play</title><content type='html'>&lt;strong&gt;ZOTAC-IT&#39;S time to 
play&lt;/strong&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;Bringing you a graphics revolution&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;9800 GTX AMP! Edition&lt;br /&gt;Engine clock: 750 MHz&lt;br /&gt;Memory clock: 2300 MHz&lt;br /&gt;Shader clock: 1890 MHz&lt;br /&gt;Memory interface: 256-bit&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;9600 GT Standard Edition&lt;br /&gt;Engine clock: 675 MHz&lt;br /&gt;Memory clock: 1800 MHz&lt;br /&gt;Shader clock: 1650 MHz&lt;br /&gt;Memory interface: 256-bit&lt;br /&gt;&lt;br /&gt;nForce 790i-Supreme&lt;br /&gt;Nvidia nForce 790i Ultra SLI&lt;br /&gt;Designed for Intel Pentium D/4, Core 2 Duo/Quad/Extreme and Penryn (FSB 1600 MHz)&lt;br /&gt;Four DDR3 sockets supporting up to 8 GB (DDR3 2000)&lt;br /&gt;3-way SLI graphics support&lt;br /&gt;Onboard dual Gigabit LAN&lt;br /&gt;Onboard IEEE 1394 and S/PDIF&lt;br /&gt;7 SATA (3 Gb/s) &amp; 1 eSATA ports&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;GeForce 8300&lt;br /&gt;Nvidia GeForce 8300&lt;br /&gt;Designed for AMD Athlon 64/64 FX/64 X2/Phenom (AM2/AM2+)&lt;br /&gt;Four DDR2 sockets supporting up to 8 GB (DDR2 1066)&lt;br /&gt;GeForce 8300 graphics engine with Hybrid SLI support&lt;br /&gt;VGA and DVI/HDMI ports</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/2099267832133461424/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/2099267832133461424' title='0 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/2099267832133461424'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/2099267832133461424'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2008/08/zotac-its-time-to-play.html' title='&lt;strong&gt;ZOTAC-IT&#39;S time to 
play&lt;/strong&gt;'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><thr:total>0</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-5983535376705944028</id><published>2008-04-22T08:07:00.001-07:00</published><updated>2008-08-08T02:05:47.523-07:00</updated><title type='text'>Quad SLI Review With Dual 9800 GX2</title><content type='html'>&lt;a onblur=&quot;try {parent.deselectBloggerImageGracefully();} catch(e) {}&quot; href=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjaC_Ef9Zf_2SJUf_5Ey9SsLxI9KhNBjPjGn-QujOKWNFtDFfRQ-8pkoB-NOsKrW3v7pYcDVG98cqEcOE9O_pm3I09rg9oTzHvjeAOjMOdU2ua3gNP5PeZCXGkYKB6RQhY1hNnvEfeDOnP7/s1600-h/all-7x3.jpg&quot;&gt;&lt;img style=&quot;float:left; margin:0 10px 10px 0;cursor:pointer; cursor:hand;&quot; src=&quot;https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjaC_Ef9Zf_2SJUf_5Ey9SsLxI9KhNBjPjGn-QujOKWNFtDFfRQ-8pkoB-NOsKrW3v7pYcDVG98cqEcOE9O_pm3I09rg9oTzHvjeAOjMOdU2ua3gNP5PeZCXGkYKB6RQhY1hNnvEfeDOnP7/s200/all-7x3.jpg&quot; border=&quot;0&quot; alt=&quot;&quot; id=&quot;BLOGGER_PHOTO_ID_5192088775157744562&quot; /&gt;&lt;/a&gt;&lt;br /&gt;This is exactly what we&#39;ll look at today. As with many of these articles, I&#39;d like to present it as a DIY (Do It Yourself) experience, because the few of you who go for Quad SLI will be building the system yourselves.&lt;br /&gt;&lt;br /&gt;What do you need for Quad SLI? 
A pretty beefy system, that&#39;s for sure. Here are your ingredients:&lt;br /&gt;Not one but two GeForce 9800 GX2 cards&lt;br /&gt;An nForce 680/780/790 mainboard&lt;br /&gt;2 GB of memory&lt;br /&gt;A Core 2 Duo/Quad processor, 2.67 GHz or faster&lt;br /&gt;Windows Vista, fully patched and updated (SP1 recommended)&lt;br /&gt;A quality power supply rated at 850 watts or higher&lt;br /&gt;A high-resolution monitor (1920x1200 or above)&lt;br /&gt;Positive thinking and a bit of luck&lt;br /&gt;Games, man!&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;Download the latest drivers for your XFX graphics card here:&lt;br /&gt;http://www.nvidia.com/Download/index.aspx?lang=en-us/&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;Nvidia GeForce 9800 GTX 512MB graphics card&lt;br /&gt;&lt;br /&gt;Ever since the launch of the GeForce 8800 GTX in November 2006, the discrete graphics card market (at least at the high end) has been a one-sided affair, with Nvidia dominating proceedings.&lt;br /&gt;&lt;br /&gt;Since then, things haven&#39;t really progressed an awful lot: the GeForce 8800 Ultra came to market almost eleven months ago, just a few short weeks before the launch of the ill-fated Radeon HD 2900 XT. It was nothing more than a speed-bumped GeForce 8800 GTX with a nice new cooling solution that Nvidia wanted consumers to pay over the odds for.&lt;br /&gt;&lt;br /&gt;The ATI Radeon HD 3870 X2 was AMD&#39;s next attempt to create some noise at the high end, and while it did a reasonable job, it didn&#39;t rumble as loudly as the troubled platform maker would have hoped. 
While it did win some battles, it didn’t win the war in our eyes, and after last year’s saga surrounding GeForce 7950 GX2 driver support, we paid particular attention to the 3870 X2’s drivers.&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;In terms of clock speeds, the reference GeForce 9800 GTX comes with a 675MHz core clock and a 1,688MHz shader clock, while the memory frequency is set to 1,100MHz (2,200MHz effective). This gives some reasonably good theoretical throughput: doing the calculations gives around 432 GigaFLOPS of compute power, 43.2 GigaTexels per second of bilinear texture filtering throughput, a fill rate of 10,800 Megapixels per second and 70.4GB per second of memory bandwidth.&lt;br /&gt;&lt;br /&gt;Even so, the GeForce 9800 GTX’s shader throughput has only been increased by 25 percent over the 8800 GTX. This is a little disappointing, as we’re used to some pretty big performance jumps from one generation to the next, especially considering the memory bandwidth and size reduction. 
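Those theoretical figures are easy to sanity-check. The sketch below reproduces them from the reference clocks quoted above, combined with the 9800 GTX's unit counts (128 stream processors, 64 texture units, 16 ROPs and a 256-bit memory bus), which are the card's published reference specs rather than numbers taken from this review:

```python
# Back-of-the-envelope theoretical throughput for the GeForce 9800 GTX.
# Clocks are the reference figures quoted above; unit counts (128 SPs,
# 64 TMUs, 16 ROPs, 256-bit bus) are the card's published specs.
core_mhz, shader_mhz, mem_mhz = 675, 1688, 1100
stream_processors, tmus, rops, bus_bits = 128, 64, 16, 256

# Each stream processor retires a MAD (2 FLOPs) per shader clock.
gigaflops = stream_processors * 2 * shader_mhz / 1000      # ~432 GFLOPS
gigatexels = tmus * core_mhz / 1000                        # ~43.2 GTexels/s
fillrate_mpix = rops * core_mhz                            # 10,800 MPixels/s
# DDR memory transfers twice per clock across a 256-bit (32-byte) bus.
bandwidth_gb = (mem_mhz * 2) * (bus_bits / 8) / 1000       # ~70.4 GB/s

print(gigaflops, gigatexels, fillrate_mpix, bandwidth_gb)
```

The same arithmetic with the 8800 GTX's 1,350MHz shader clock (also 128 SPs) gives roughly 346 GFLOPS, which is where the "only 25 percent faster" figure below comes from.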
Sure, texture throughput has been increased massively, but it’s only around four percent faster than the GeForce 8800 GTS 512 in that respect.</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/5983535376705944028/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/5983535376705944028' title='0 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/5983535376705944028'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/5983535376705944028'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2008/04/quad-sli-review-with-dual-9800-gx2.html' title='Quad SLI Review With Dual 9800 GX2'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjaC_Ef9Zf_2SJUf_5Ey9SsLxI9KhNBjPjGn-QujOKWNFtDFfRQ-8pkoB-NOsKrW3v7pYcDVG98cqEcOE9O_pm3I09rg9oTzHvjeAOjMOdU2ua3gNP5PeZCXGkYKB6RQhY1hNnvEfeDOnP7/s72-c/all-7x3.jpg" height="72" width="72"/><thr:total>0</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-4472720261087097732</id><published>2008-04-22T07:48:00.001-07:00</published><updated>2008-04-22T08:07:17.620-07:00</updated><title type='text'></title><content type='html'>PNY VERTO GeForce 9600 GT 512MB &lt;br /&gt;PCIe SLI-Ready Graphics Card &lt;br /&gt;VCG96512GXPB&lt;br /&gt;&lt;br /&gt;XFX GeForce 9600 GT PVT94PYDF4 
Video Card &lt;br /&gt;Interface: PCI Express - Card Chipset: Nvidia - Installed Memory: 512MB &lt;br /&gt;XFX GeForce 9600 GT Video Card&lt;br /&gt;The XFX NVIDIA GeForce 9600 GT 512MB graphics card offers a powerfully immersive entertainment experience designed for high-definition gaming and video playback. Play the hottest Microsoft DirectX 10 games with awesome speed and watch the latest HD DVD and Blu-ray Disc movies with brilliant clarity, powered by the revolutionary PureVideo HD engine. Couple two XFX NVIDIA GeForce 9600 GT graphics cards together with an NVIDIA nForce motherboard for an optimal graphics platform, giving you the gaming horsepower to handle the most demanding games with NVIDIA SLI technology. With the XFX NVIDIA GeForce 9600 GT, amazing graphics performance is now within your reach.&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;XFX GeForce 9600 GT Video Card &lt;br /&gt;Interface: PCI Express - Card Chipset: Nvidia - Installed Memory: 512MB &lt;br /&gt;XFX GeForce 9600 GT XXX Video Card - Alpha Dog Edition&lt;br /&gt;The NVIDIA GeForce 9600 GT GPU offers a powerfully immersive entertainment experience designed for extreme high-definition gaming and video playback. Play the hottest DirectX 10 games with awesome speed and watch the latest HD DVD and Blu-ray movies with brilliant clarity. 
Featuring next-generation GeForce and PureVideo HD technologies, the GeForce 9600 GT GPU puts amazing graphics performance within your reach.</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/4472720261087097732/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/4472720261087097732' title='0 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/4472720261087097732'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/4472720261087097732'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2008/04/pny-verto-geforce-9600-gt-512mb-pcie.html' title=''/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><thr:total>0</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-8805258776786052005</id><published>2008-04-22T07:48:00.000-07:00</published><updated>2008-08-08T02:02:07.965-07:00</updated><title type='text'>PNY VERTO GeForce 9600 GT 512MB</title><content type='html'>Download the latest drivers for your XFX graphics card here:&lt;br /&gt;http://www.nvidia.com/Download/index.aspx?lang=en-us&lt;br /&gt;&lt;br /&gt;PNY VERTO GeForce 9600 GT 512MB &lt;br /&gt;PCIe SLI-Ready Graphics Card &lt;br /&gt;VCG96512GXPB&lt;br /&gt;&lt;br /&gt;XFX GeForce 9600 GT PVT94PYDF4 Video Card &lt;br /&gt;Interface: PCI Express - Card Chipset: Nvidia - Installed Memory: 512MB &lt;br /&gt;XFX GeForce 9600 GT Video Card&lt;br /&gt;The XFX NVIDIA GeForce 9600 GT 512MB graphics card offers a powerfully immersive entertainment 
experience designed for high-definition gaming and video playback. Play the hottest Microsoft DirectX 10 games with awesome speed and watch the latest HD DVD and Blu-ray Disc movies with brilliant clarity.</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/8805258776786052005/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/8805258776786052005' title='0 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/8805258776786052005'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/8805258776786052005'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2008/04/download-latest-drivers-for-ur-xfx.html' title='PNY VERTO GeForce 9600 GT 512MB'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><thr:total>0</thr:total></entry><entry><id>tag:blogger.com,1999:blog-1394573190797201130.post-282533318767075238</id><published>2008-03-15T00:36:00.000-07:00</published><updated>2008-03-22T06:45:21.222-07:00</updated><title type='text'>Nvidia 9 series coming soon</title><content type='html'>Nvidia&#39;s PCI Express 9 series is launching soon.&lt;br /&gt;Whether it&#39;s rumour or not, according to sources its features include&lt;br /&gt;double the bandwidth, 1 GB of memory and doubled-up cooling;&lt;br /&gt;it may also be 128-bit, meaning the picture may look more realistic.&lt;br /&gt;&lt;p&gt;The Nvidia PCI Express 9 series may launch in June or September. 
&lt;/p&gt;&lt;br /&gt;&lt;br /&gt;&lt;br /&gt;Also visit:&lt;br /&gt;http://mobtheme.blogspot.com/&lt;br /&gt;&lt;br /&gt;Lyrics:&lt;br /&gt;http://lyricsbole.blogspot.com/</content><link rel='replies' type='application/atom+xml' href='http://supremegraphiccards.blogspot.com/feeds/282533318767075238/comments/default' title='Post Comments'/><link rel='replies' type='text/html' href='http://www.blogger.com/comment/fullpage/post/1394573190797201130/282533318767075238' title='0 Comments'/><link rel='edit' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/282533318767075238'/><link rel='self' type='application/atom+xml' href='http://www.blogger.com/feeds/1394573190797201130/posts/default/282533318767075238'/><link rel='alternate' type='text/html' href='http://supremegraphiccards.blogspot.com/2008/03/nvidia-9-series-comming-soon.html' title='Nvidia 9 series coming soon'/><author><name>bunny</name><uri>http://www.blogger.com/profile/17210530971679298329</uri><email>noreply@blogger.com</email><gd:image rel='http://schemas.google.com/g/2005#thumbnail' width='16' height='16' src='https://img1.blogblog.com/img/b16-rounded.gif'/></author><thr:total>0</thr:total></entry></feed>