Here we show you benchmarks and technical data for the GIGABYTE GeForce GTX 1650 SUPER D6 4G. The graphics card is based on the NVIDIA GeForce GTX 1650 SUPER (Turing) and has 4 GB GDDR6 graphics memory with a memory bandwidth of 192 GB/s. The GIGABYTE GeForce GTX 1650 SUPER D6 4G has 20 execution units with 1280 shaders.
GPU
The GIGABYTE GeForce GTX 1650 SUPER D6 4G has 1280 shaders spread across 20 execution units. At its boost clock, the graphics card achieves a theoretical FP32 compute performance of 4.42 TFLOPS.
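The 4.42 TFLOPS figure follows directly from the shader count and the boost clock, since each shader can execute one fused multiply-add (two FP32 operations) per cycle. A minimal sketch of the calculation, using the spec values from this page:

```python
# Theoretical FP32 throughput from the spec values above.
shaders = 1280           # CUDA cores / shaders
boost_clock_ghz = 1.725  # boost clock in GHz
ops_per_clock = 2        # one fused multiply-add counts as two FP32 operations

tflops = shaders * boost_clock_ghz * ops_per_clock / 1000
print(f"{tflops:.2f} TFLOPS")  # 4.42 TFLOPS
```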
The GIGABYTE GeForce GTX 1650 SUPER D6 4G is equipped with 4 GB of GDDR6 graphics memory. Combined with its 128-bit memory interface, this gives the card a memory bandwidth of 192 GB/s.
Memory Size:
4 GB
Memory Type:
GDDR6
Memory Clock:
1.500 GHz
Memory Speed:
12.0 Gbps
Memory bandwidth:
192 GB/s
Memory Interface:
128 bit
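The 192 GB/s bandwidth in the table above can be reproduced from the memory speed and bus width: the effective data rate per pin times the interface width in bytes. A short sketch:

```python
# Memory bandwidth from the memory speed and bus width listed above.
memory_speed_gbps = 12.0  # effective data rate per pin, in Gbps
bus_width_bits = 128      # memory interface width

bandwidth_gbs = memory_speed_gbps * bus_width_bits / 8  # bits -> bytes
print(f"{bandwidth_gbs:.0f} GB/s")  # 192 GB/s
```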
Clock Speeds
The base clock frequency of the GIGABYTE GeForce GTX 1650 SUPER D6 4G is 1.530 GHz. A high clock frequency (even in turbo mode) can greatly increase the speed of a graphics card.
Base Clock:
1.530 GHz
Boost Clock:
1.725 GHz
Avg (Game) Clock:
--
Overclocking:
Yes
Thermal Design
The TDP (Thermal Design Power) of the GIGABYTE GeForce GTX 1650 SUPER D6 4G is 100 W. The graphics card draws power through one 6-pin PCIe connector.
TDP:
100 W
TDP (up):
--
Tjunction max:
90 °C
PCIe-Power:
1 x 6-Pin
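The 100 W TDP fits comfortably within the card's power budget. Assuming PCIe-spec limits (a x16 slot supplies up to 75 W and a 6-pin auxiliary connector up to 75 W), a quick sketch of the available headroom:

```python
# Power budget sketch, assuming PCIe-spec limits:
# a x16 slot supplies up to 75 W, a 6-pin connector up to 75 W.
slot_w = 75
six_pin_w = 75
tdp_w = 100  # TDP from the table above

budget_w = slot_w + six_pin_w
headroom_w = budget_w - tdp_w
print(f"{budget_w} W available, {headroom_w} W headroom over the {tdp_w} W TDP")
```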
Cooler & Fans
The GIGABYTE GeForce GTX 1650 SUPER D6 4G uses a single axial fan to cool the graphics processor and graphics memory.
Fan-Type:
Axial
Fan 1:
1 x 90 mm
Fan 2:
--
Cooler-Type:
Air cooling
Noise (Idle):
--
Noise (Load):
--
Connectivity
Up to 3 screens can be connected to the GIGABYTE GeForce GTX 1650 SUPER D6 4G.
Max. Displays:
3
HDCP-Version:
2.2
HDMI Ports:
1x HDMI v2.0b
DP Ports:
1x DP v1.4
DVI Ports:
1
VGA Ports:
--
USB-C Ports:
--
Featureset
The GIGABYTE GeForce GTX 1650 SUPER D6 4G supports a maximum resolution of 7680x4320 pixels and DirectX 12 at feature level 12_1.
Max. resolution:
7680x4320
DirectX:
12_1
Raytracing:
No
DLSS / FSR:
Yes
LED:
No LED lighting
Supported Video Codecs
The video codecs accelerated in hardware by the GIGABYTE GeForce GTX 1650 SUPER D6 4G reduce the processor load and ensure lower energy consumption.
h264:
Decode / Encode
h265 / HEVC:
Decode / Encode
VP8:
Decode
VP9:
Decode
AV1:
No
Dimensions
The GIGABYTE GeForce GTX 1650 SUPER D6 4G is 172 mm long and 122 mm high. With a width of 40 mm, the card occupies 2 PCIe slots in the case.
Length:
172 mm
Height:
122 mm
Width:
40 mm
Width (Slots):
2 PCIe-Slots
Weight:
--
Additional data
The GIGABYTE GeForce GTX 1650 SUPER D6 4G, manufactured on a 12 nm process, was released in Q4/2019 at a reference price of $159.
GIGABYTE GeForce GTX 1650 SUPER D6 4G 1.530 GHz, 4 GB (100 W TDP)
3DMark is a benchmark program that measures the performance of specific computer components and reports the result as a numerical score.
The Last of Us Part One is Sony's remake of the 2013 PlayStation exclusive The Last of Us; it was released for the PC in early 2023. The benchmark values here were determined at high detail settings.
Geekbench 6 is a cross-platform benchmark for main processors that also runs three different graphics benchmarks and reports the results as numerical scores.
The theoretical computing power of the graphics card with single precision (32 bit) in TFLOPS indicates how many trillion FP32 floating point operations the graphics card (GPU) can perform per second.
To determine the performance of a graphics card, so-called "benchmarks" are carried out. The benchmark software runs specific workloads that measure the performance of a graphics card. We use both theoretical or synthetic benchmarks (e.g. 3DMark) and real game benchmarks. To ensure that the results are genuinely comparable, we pay attention to the correct execution of the benchmarks as well as the condition of the graphics card and the system.
We use the following benchmarks to measure the performance of a graphics card: