Friday, January 11, 2019

RTX 2060 is Officially Announced and G-Sync Will Work on FreeSync Monitors

CES is one of the most popular tech events of the year. This year, Nvidia was one of the first to take the stage and show us something new.

Unsurprisingly for everyone who had been following the leaks, which grew more frequent as CES approached, Nvidia announced its upper-midrange graphics card, the RTX 2060.

Although there were rumors that there would be three memory capacities (3 GB, 4 GB & 6 GB) across two different types of memory (GDDR5X & GDDR6), Nvidia only spoke about the 6 GB GDDR6 model.

Here are the specs:

GPU Engine Specs:
NVIDIA CUDA® Cores: 1920
RTX-OPS: 37T
Giga Rays/s: 5
Boost Clock: 1680 MHz
Base Clock: 1365 MHz

Memory Specs:
Memory Speed: 14 Gbps
Standard Memory Config: 6 GB GDDR6
Memory Interface Width: 192-bit
Memory Bandwidth: 336 GB/s

Technology Support:
Real-Time Ray Tracing: Yes
NVIDIA® GeForce Experience: Yes
NVIDIA® Ansel: Yes
NVIDIA® Highlights: Yes
NVIDIA® G-SYNC™ Compatible: Yes
Game Ready Drivers: Yes
Microsoft® DirectX® 12 API, Vulkan API, OpenGL 4.5: Yes
DisplayPort 1.4a, HDMI 2.0b: Yes
HDCP 2.2: Yes
NVIDIA® GPU Boost™: 4
VR Ready: Yes
Designed for USB Type-C™ and VirtualLink™: Yes

Display Support:
Maximum Digital Resolution: 7680x4320
Standard Display Connectors: DisplayPort, HDMI, USB Type-C™, DVI-DL
Multi Monitor: 4
HDCP: 2.2

Graphics Card Dimensions:
Height: 4.435” (112.6 mm)
Length: 9.0” (228.6 mm)
Width: 2-Slot

Thermal and Power Specs:
Maximum GPU Temperature: 88 °C
Graphics Card Power: 160 W
Recommended System Power: 500 W
Supplementary Power Connectors: 8-pin
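
The quoted memory bandwidth follows directly from the memory speed and bus width: 14 Gbps per pin across a 192-bit interface, divided by 8 to convert bits to bytes, gives 336 GB/s. Here is a minimal Python sketch of that arithmetic (the function name is our own, purely for illustration):

    def memory_bandwidth_gb_s(speed_gbps_per_pin: float, bus_width_bits: int) -> float:
        """Effective memory bandwidth in GB/s: per-pin data rate times bus width, bits converted to bytes."""
        return speed_gbps_per_pin * bus_width_bits / 8

    print(memory_bandwidth_gb_s(14, 192))  # 336.0 GB/s, matching the spec sheet above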

The launch price is set at $350, a whole $100 more than the GTX 1060 6GB model.
The price hike itself isn’t so bad if we consider the performance. Early reviews show the RTX 2060 performing quite similarly to the GTX 1070 Ti.
And if the rumors are to be believed, there will be a “GTX 1160” that is supposed to offer the same performance but without the fancy new features the RTX cards brought, like RT Cores and Tensor Cores. Omitting these features should also bring the price down a bit.

Another interesting thing Nvidia spoke about was its adaptive refresh rate technology, G-Sync, working on selected FreeSync monitors.
At the moment there is just a handful of FreeSync monitors that will officially support G-Sync, but the list will get longer as time progresses.

Nvidia announcing G-Sync Compatible monitors

Photo credit: Nvidia




from Cable Tech Talk http://bit.ly/2CfBnsp
via IFTTT

Thursday, November 15, 2018

What’s the Difference Between HDMI and DVI? Which One is Better?

Many people are confused by the abundance of different video cables you can buy today. If you’re among them, we’re here to help you out – today we’ll be taking a look at two of the most common types of video cable: HDMI and DVI.

Some fifteen years ago, figuring out how to connect a TV to a certain device was much easier – almost everybody had a VCR. The quality was worse and the screens were smaller, but everything was much simpler. Today, things are a little bit different – there’s a plethora of cables and devices, and it can be a real chore to select the one that will suit your needs best. We’ll help you find your way through this mess and understand the most important things about these two types.

The Layout

One of the most crucial differences between DVI and HDMI is their layout. DVI is larger, as it features a 24-pin set that’s quite similar to that of SCART or VGA connectors (even though some versions sport fewer pins). HDMI, on the other hand, looks more like a USB plug and is much more compact.

DVI can be purchased in a variety of layouts, each of them designed for a particular task. You can get it in the DVI-D (digital), DVI-A (analog), and DVI-I (both digital and analog) versions. It gets even more confusing – there are also dual-link and single-link varieties, which affect the maximum resolution and refresh rate.
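
If the naming is hard to keep straight, the differences boil down to which signal types each variant carries and how many data links the cable has. Here is a minimal Python sketch that summarizes it (the dictionary layout and the rough 60 Hz resolution limits are our own summary for illustration, not an official spec table):

    # Which signal types each DVI variant carries.
    DVI_VARIANTS = {
        "DVI-D": {"digital": True,  "analog": False},  # digital only
        "DVI-A": {"digital": False, "analog": True},   # analog only
        "DVI-I": {"digital": True,  "analog": True},   # "integrated": both
    }

    # Approximate maximum resolution at 60 Hz for each link type.
    LINK_LIMITS_60HZ = {
        "single-link": "about 1920x1200",
        "dual-link": "about 2560x1600",
    }

    for name, signals in DVI_VARIANTS.items():
        carried = " + ".join(s for s, ok in signals.items() if ok)
        print(f"{name}: carries {carried}")

    for link, limit in LINK_LIMITS_60HZ.items():
        print(f"{link}: {limit} at 60 Hz")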

It’s a little bit simpler with HDMI, as new versions of the standard stick to a simple numbering system (1.0, 2.0, 2.1, etc.).

One crucial difference here is that HDMI can carry up to eight audio channels, while DVI transfers only the video signal. Thus, if you want to use your cable to connect something with sound, you’ll have to use HDMI or run an extra audio cable.

The Compatibility

Compatibility is one of the biggest questions when it comes to buying cables. No one likes purchasing a powerful new monitor only to find out later that it doesn’t have the appropriate ports. This is one of HDMI’s greatest advantages – it’s more common, and thus more likely to fit almost all modern monitors, laptops, PCs, and gaming consoles. DVI, on the other hand, is a little bit rarer.

If your monitor has a DVI input while your laptop sports an HDMI connection, not all is lost – you can simply buy an adapter for video conversion and easily solve the problem.

The Refresh Rates

For anyone buying a new monitor, the refresh rate should be among the most important considerations. Higher rates provide a smoother experience, reduce things like headaches, and are much easier on the user’s eyes. This is very important if you’re spending long days in the office. In simple terms, the refresh rate is the number of FPS (frames per second) that the monitor can put out – 144Hz monitors, for example, are capable of delivering 144 FPS.

This is probably the biggest difference between DVI and HDMI – the latter officially supports only up to 120Hz, and only from version 1.3 onward. DVI, on the other hand, provides support for 144Hz, but only in its dual-link DVI-D form.
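
To put those numbers in perspective, here is a rough back-of-the-envelope calculation in Python showing how much raw data a cable has to move at a given resolution and refresh rate (a simplified sketch: it assumes 24 bits per pixel and ignores blanking intervals, so real figures are somewhat higher):

    def uncompressed_bandwidth_gbps(width: int, height: int, refresh_hz: int, bits_per_pixel: int = 24) -> float:
        """Approximate uncompressed video bandwidth in Gbit/s (blanking intervals ignored)."""
        return width * height * refresh_hz * bits_per_pixel / 1e9

    for hz in (60, 120, 144):
        gbps = uncompressed_bandwidth_gbps(1920, 1080, hz)
        print(f"1080p @ {hz} Hz ≈ {gbps:.1f} Gbit/s, frame time ≈ {1000 / hz:.2f} ms")

Going from 60 Hz to 144 Hz more than doubles the data rate, which is why higher refresh rates push older single-link connections past their limits.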

The Signal Quality

Since both of these interfaces are digital, there’s really no discernible difference between the signal qualities that they provide. If you’re using the full 24-pin configuration, your DVI cable will support 1080p resolution (1920 x 1080) just like HDMI.

The only significant difference here is that most DVI connections do not support HDCP copy protection – a system that prevents high-definition content from being played on all sorts of unauthorized devices.

The Verdict

Functionally, both of these cables are basically identical, and it all comes down to the user’s particular needs – you should just go with the cord that fits your existing hardware. However, we would recommend choosing HDMI, if possible, as it is more likely to remain compatible with all of your future upgrades. As for DVI, we can already consider it a legacy interface that is being replaced by the more modern and versatile DisplayPort.




from Cable Tech Talk https://ift.tt/2zdnYAm
via IFTTT

Tuesday, September 11, 2018

AMD Could Release 10C/20T Ryzen 7 2800X as a response to Intel’s 9900K

Even though the 2nd-gen AMD Ryzen CPUs were launched earlier this year, the company’s Ryzen 7 2800X is still nowhere to be seen. However, this doesn’t mean that it won’t be released soon – AMD’s former general manager, Jim Anderson, stated that it would appear “someday.”

He also stated that the company’s 2700X and 2700 models have great price and performance points and that they sufficiently cover the space in the first wave of Pinnacle Ridge CPUs. Indeed, one could safely say that these processors make recommending the competing Core i7-8700K almost impossible.

However, things are certainly about to change once Intel launches its octa-core processor in the upcoming months – its name is Intel Core i9-9900K, and it will be the company’s first mainstream 8-core/16-thread CPU. The model is expected to bring a noticeable increase in performance and may well become Intel’s best-selling top-of-the-line model.

The guys and girls at AMD won’t have headaches once that happens, though – they could have the brand new 2800X ready to roll out and become the 9900K’s main competitor. What will it bring to the table? The word is that the 2800X will be the world’s first mainstream CPU with ten cores and twenty threads, offering speeds of over 4GHz. In our opinion, this should be more than enough to keep the price reasonable and still successfully counter Intel.

However, we advise you to take this information with a grain of salt – no one knows precisely what the 2800X’s specifications will be, as these are just rumors stemming from an already-viral Cinebench photo someone took a couple of days ago. Realistically, there’s a higher chance that the Ryzen 7 2800X will be an octa-core model just like its competitor, but we’d definitely like AMD to surprise us by going for ten cores.

Source: Guru3d




from Cable Tech Talk https://ift.tt/2CJiTUt
via IFTTT