All technical data and various benchmarks of the GIGABYTE GeForce GTX 1080 Xtreme Gaming WATERFORCE 8G are listed on this page. The graphics card is based on the GP104-400-A1 / GP104-410-A1 (Pascal) graphics chip, which is manufactured on a 16 nm process. The GIGABYTE GeForce GTX 1080 Xtreme Gaming WATERFORCE 8G can drive up to 4 displays at a maximum resolution of 7680x4320.
GPU
The GIGABYTE GeForce GTX 1080 Xtreme Gaming WATERFORCE 8G has 20 streaming multiprocessors with a total of 2560 shader units. This gives the graphics card a theoretical FP32 computing power of 9.91 TFLOPS.
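The 9.91 TFLOPS figure follows directly from the shader count and the boost clock, since each shader can execute one fused multiply-add (2 floating-point operations) per cycle. A minimal Python sketch of the calculation, using the values from this page:

```python
# Theoretical FP32 throughput: shaders x 2 ops (FMA) x clock.
shaders = 2560           # shader units of the GTX 1080
flops_per_clock = 2      # one fused multiply-add = 2 FP32 operations
boost_clock_ghz = 1.936  # factory boost clock of this model

tflops = shaders * flops_per_clock * boost_clock_ghz / 1000
print(f"{tflops:.2f} TFLOPS")  # -> 9.91 TFLOPS
```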
The GIGABYTE GeForce GTX 1080 Xtreme Gaming WATERFORCE 8G is equipped with 8 GB of GDDR5X graphics memory. The memory clock of the graphics card is 1.300 GHz and the effective data rate is 10.4 Gbps; the resulting bandwidth is worked out in the sketch after the list below.
Memory Size:
8 GB
Memory Type:
GDDR5X
Memory Clock:
1.300 GHz
Memory Speed:
10.4 Gbps
Memory Bandwidth:
332 GB/s
Memory Interface:
256 bit
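The bandwidth figure in the table follows from the interface width and the effective data rate. A minimal Python sketch, assuming the rounded values listed above:

```python
# Memory bandwidth: bus width in bytes x effective data rate per pin.
interface_bits = 256   # memory interface width
data_rate_gbps = 10.4  # effective GDDR5X data rate per pin

bandwidth_gb_s = interface_bits / 8 * data_rate_gbps
print(f"{bandwidth_gb_s:.1f} GB/s")  # -> 332.8 GB/s (listed as 332 GB/s)
```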
Clock Speeds
The base clock of this graphics card is 1.759 GHz, and the manufacturer specifies the maximum boost clock as 1.936 GHz. Since overclocking is supported (see below), performance can be increased further.
Base Clock:
1.759 GHz (+9 % vs. reference)
Boost Clock:
1.936 GHz (+12 % vs. reference)
Avg (Game) Clock:
1.898 GHz
Overclocking:
Yes
Thermal Design
The GIGABYTE GeForce GTX 1080 Xtreme Gaming WATERFORCE 8G draws power through 2 x 8-pin connectors. The TDP (Thermal Design Power) of the graphics card is 180 W.
TDP:
180 W
TDP (up):
--
Tjunction max:
94 °C
PCIe-Power:
2 x 8-Pin
Cooler & Fans
The graphics processor and graphics memory of the GIGABYTE GeForce GTX 1080 Xtreme Gaming WATERFORCE 8G are cooled by an all-in-one (AIO) water cooler.
Fan-Type:
Axial (Radiator)
Fan 1:
1 x 120 mm
Fan 2:
--
Cooler-Type:
AIO water cooling
Noise (Idle):
--
Noise (Load):
--
Connectivity
The GIGABYTE GeForce GTX 1080 Xtreme Gaming WATERFORCE 8G can drive up to 4 monitors simultaneously.
Max. Displays:
4
HDCP-Version:
2.2
HDMI Ports:
1x HDMI v2.0b
DP Ports:
3x DP v1.4
DVI Ports:
1x DVI
VGA Ports:
--
USB-C Ports:
--
Featureset
The GIGABYTE GeForce GTX 1080 Xtreme Gaming WATERFORCE 8G can drive a display at resolutions of up to 7680x4320 pixels. Furthermore, DirectX feature level 12_1 is supported.
Max. resolution:
7680x4320
DirectX:
12_1
Raytracing:
No
DLSS / FSR:
No / Yes
LED:
GIGABYTE RGB Fusion
Supported Video Codecs
This area lists the video codecs that the GIGABYTE GeForce GTX 1080 Xtreme Gaming WATERFORCE 8G can decode or encode in hardware. Offloading this work from the processor reduces CPU load and power consumption; a usage sketch follows the list below.
h264:
Decode / Encode
h265 / HEVC:
Decode / Encode
VP8:
Decode
VP9:
Decode
AV1:
No
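As an illustration of how this hardware support is used in practice, here is a minimal Python sketch that hands h265/HEVC encoding to the GPU via ffmpeg. It assumes an ffmpeg build compiled with NVENC support; the file names are hypothetical:

```python
# Transcode a video using the GPU's hardware decoder (NVDEC) and
# encoder (NVENC), keeping the CPU largely idle during the process.
import subprocess

subprocess.run([
    "ffmpeg", "-y",
    "-hwaccel", "cuda",    # decode on the GPU where the codec allows it
    "-i", "input.mp4",     # hypothetical source file
    "-c:v", "hevc_nvenc",  # encode h265/HEVC on the GPU
    "output.mp4",
], check=True)
```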
Dimensions
The GIGABYTE GeForce GTX 1080 Xtreme Gaming WATERFORCE 8G occupies 2 PCIe slots in a case. It is 264 mm long, 124 mm high and 41 mm wide.
Length:
264 mm
Height:
124 mm
Width:
41 mm
Width (Slots):
2 PCIe-Slots
Weight:
--
Additional data
The GIGABYTE GeForce GTX 1080 Xtreme Gaming WATERFORCE 8G was released in Q2/2016; the reference design launched at a price of $599. The graphics card, manufactured in 16 nm, is connected to the system via a PCIe 3.0 x16 interface.
3DMark is a benchmark program that measures the performance of specific components of a computer and reports the result as a numerical score.
Geekbench 6 is a cross-platform benchmark for main processors that also carries out 3 different graphics benchmarks and reports the results as numerical scores.
The theoretical single-precision (FP32, 32-bit) computing power of the graphics card, given in TFLOPS, indicates how many trillion floating-point operations the GPU can perform per second.
To determine the performance of a graphics card, so-called "benchmarks" are carried out: the benchmark software performs specific calculations whose execution speed yields a performance score. We use both theoretical (synthetic) benchmarks such as 3DMark and real game benchmarks. To ensure that results are truly comparable, we pay attention to the correct execution of the benchmarks as well as to the condition of the graphics card and the test system.
We use the following benchmarks to measure the performance of a graphics card: