Here you will find all technical data as well as various benchmarks of the Gigabyte GeForce GTX 1650 MINI ITX 4G. Up to 3 screens with a maximum resolution of 7680x4320 can be driven by this graphics card. The maximum turbo clock of the Gigabyte GeForce GTX 1650 MINI ITX 4G is 1.665 GHz, giving the card a theoretical FP32 compute performance of 2.98 TFLOPS.
GPU
The Gigabyte GeForce GTX 1650 MINI ITX 4G is equipped with 14 execution units and 896 shader units. At single precision (FP32), this yields a theoretical compute performance of 2.98 TFLOPS.
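The FP32 figure follows directly from the shader count and the turbo clock: each shader unit can perform two floating-point operations per clock (a fused multiply-add). A minimal sketch of that calculation:

```python
# Theoretical FP32 throughput: shader units x 2 ops per clock (FMA) x turbo clock
shader_units = 896
turbo_clock_hz = 1.665e9  # 1.665 GHz

tflops = shader_units * 2 * turbo_clock_hz / 1e12
print(f"{tflops:.2f} TFLOPS")  # 2.98 TFLOPS
```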
The Gigabyte GeForce GTX 1650 MINI ITX 4G has a 128 bit wide memory interface, with which a memory bandwidth of 128 GB/s is achieved. In total, 4 GB GDDR5 graphics memory is available for the graphics card.
Memory Size:
4 GB
Memory Type:
GDDR5
Memory Clock:
2.0 GHz
Memory Speed:
8.0 Gbps
Memory bandwidth:
128 GB/s
Memory Interface:
128 bit
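The memory bandwidth above can be derived from the two values in the table: the 128 bit interface width multiplied by the 8.0 Gbps effective memory speed, divided by 8 bits per byte. A short sketch:

```python
# Memory bandwidth = interface width (bits) x effective data rate (Gbps) / 8 bits per byte
interface_width_bits = 128
memory_speed_gbps = 8.0  # effective GDDR5 data rate per pin

bandwidth_gb_s = interface_width_bits * memory_speed_gbps / 8
print(f"{bandwidth_gb_s:.0f} GB/s")  # 128 GB/s
```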
Clock Speeds
The manufacturer specifies the GPU clock of the Gigabyte GeForce GTX 1650 MINI ITX 4G as 1.485 GHz (base clock) or 1.665 GHz (turbo clock). If overclocking is supported (see below), the clock rate can be increased even further.
Base Clock:
1.485 GHz
Boost Clock:
1.665 GHz
Avg (Game) Clock:
Overclocking:
Yes
Thermal Design
The Gigabyte GeForce GTX 1650 MINI ITX 4G has one 6-pin connector, which supplies the graphics card with power. The maximum operating temperature of the card is 92 °C.
TDP:
75 W
TDP (up):
--
Tjunction max:
92 °C
PCIe-Power:
1 x 6-Pin
Cooler & Fans
The Gigabyte GeForce GTX 1650 MINI ITX 4G is equipped with one axial fan, which cools the graphics processor and the graphics memory.
Fan-Type:
Axial
Fan 1:
1 x 80 mm
Fan 2:
--
Cooler-Type:
Air cooling
Noise (Idle):
--
Noise (Load):
--
Connectivity
Up to 3 screens can be connected to the Gigabyte GeForce GTX 1650 MINI ITX 4G. The card supports HDCP copy protection in the backward-compatible version 2.2.
Max. Displays:
3
HDCP-Version:
2.2
HDMI Ports:
2x HDMI v2.0b
DP Ports:
1x DP v1.4
DVI Ports:
--
VGA Ports:
--
USB-C Ports:
--
Featureset
The Gigabyte GeForce GTX 1650 MINI ITX 4G supports a maximum resolution of 7680x4320 pixels and DirectX 12 at feature level 12_1.
Max. resolution:
7680x4320
DirectX:
12_1
Raytracing:
No
DLSS / FSR:
Yes
LED:
No LED lighting
Supported Video Codecs
The following list shows which video codecs the Gigabyte GeForce GTX 1650 MINI ITX 4G can decode or encode in hardware, offloading this work from the processor.
h264:
Decode / Encode
h265 / HEVC:
Decode / Encode
VP8:
Decode
VP9:
Decode
AV1:
No
Dimensions
The Gigabyte GeForce GTX 1650 MINI ITX 4G measures 152 mm in length, 114 mm in height and 36 mm in width, so it occupies 2 PCIe slots in a case.
Length:
152 mm
Height:
114 mm
Width:
36 mm
Width (Slots):
2 PCIe-Slots
Weight:
--
SFF-Ready:
No
Additional data
The Gigabyte GeForce GTX 1650 MINI ITX 4G is manufactured on a 12 nm process and uses a PCIe 3.0 x16 interface. The graphics card was released in Q2/2019.
3DMark is a benchmark program that measures the performance of certain computer components and reports the result as a numerical score.
Geekbench 6 is a cross-platform benchmark for processors that also runs three different graphics benchmarks and reports the results as numerical scores.
The theoretical single-precision (32-bit) compute performance of the graphics card, given in TFLOPS, indicates how many trillion FP32 floating-point operations the GPU can perform per second.
To determine the performance of a graphics card, so-called "benchmarks" are carried out: the benchmark software runs specific workloads and measures how the card performs. We use both theoretical or synthetic benchmarks (e.g. 3DMark) and real game benchmarks. To ensure real comparability of the results, we pay attention to the correct execution of the benchmarks as well as the condition of the graphics card and the test system.
We use the following benchmarks to measure the performance of a graphics card: