Over the last few years, the world of gaming monitors has changed considerably: faster panels, higher refresh rates, higher resolutions, and variable refresh rate technologies. Let's take a look at Nvidia G-Sync and AMD FreeSync and compare their advantages and disadvantages.
Evolution of vertical synchronization
As you may already know, the movement that you see on the screen, be it a movie, a TV show or a PC game, is just an illusion. What we see are static images that pass before our eyes 30 (or more) times per second. Our brain puts them together and “sees” movement.
To understand the problem that arises in games, let's first look at how a graphics card works. A GPU keeps two main buffers (or stores) in video memory. The secondary (back) buffer is where the GPU renders the current frame, while the primary (front) buffer holds a completed frame that is sent to the screen. When the GPU finishes a frame in the secondary buffer, that image is sent to the primary buffer, and the secondary buffer, in turn, begins receiving the next frame.
Meanwhile, the screen receives an image, displays it, and then resets for the next one during an interval called the vertical blanking period. If this process is out of sync with the GPU's buffer swaps, the image displayed on screen may contain parts of two consecutive, different frames (part of the previous buffer and part of the current one).
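The mechanism above can be sketched in a few lines of Python. This is a toy model with made-up frame IDs and a 10-line "screen", not real graphics code:

```python
# Toy model of scan-out: the monitor draws the front buffer line by line.
# If the GPU swaps buffers mid-scan-out, the lines below the swap point
# come from a newer frame -> a visible horizontal tear.
SCREEN_LINES = 10  # a real panel has 1080+ lines; 10 keeps the demo readable

def scan_out(front_frame_id, swap_at_line=None, new_frame_id=None):
    """Return which frame each screen line was drawn from."""
    frame = front_frame_id
    drawn = []
    for line in range(SCREEN_LINES):
        if line == swap_at_line:
            frame = new_frame_id  # unsynchronized buffer swap mid-refresh
        drawn.append(frame)
    return drawn

print(scan_out(1))                                  # clean frame: every line from frame 1
print(scan_out(1, swap_at_line=6, new_frame_id=2))  # tear between lines 5 and 6
```

Synchronizing the swap to the vertical blanking period is exactly what prevents the second, torn case.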
Because of the way screens draw images, the boundary between these two frames appears as a horizontal tear line, an artifact known as tearing. With little on-screen movement the effect is practically imperceptible, but with a lot of movement you will see something like the following image.
Vertical Sync, or V-Sync, tries to solve this problem by limiting the rate at which the GPU outputs frames. The idea is to cap GPU output at the screen's refresh rate and eliminate tearing, but this introduces additional issues that can be just as annoying: stuttering and input lag.
Most games can run with or without vertical sync. If we enable it on a 60 Hz screen, only 60 images per second can be displayed, so this setting caps GPU output at 60 frames per second. Stuttering occurs when the GPU cannot sustain that frame rate (it delivers fewer than 60 fps) and the screen has to reuse the same frame until the GPU sends a new one. This produces visible jumps and a jerky sensation, making for a very poor user experience.
Besides that, V-Sync has another problem: because the GPU has to hold completed frames while waiting for the monitor, there is a longer delay between input (keyboard and mouse) and the actions rendered on screen.
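A toy calculation illustrates both problems at 60 Hz. The 16.67 ms interval follows directly from the refresh rate; the render times are invented numbers for illustration:

```python
import math

# Toy timing model of V-Sync on a fixed 60 Hz screen (one refresh every
# ~16.67 ms). Render times below are invented for illustration.
REFRESH_MS = 1000 / 60

def refreshes_needed(render_ms):
    """How many whole refresh intervals a frame stays on screen before
    the next frame is ready to replace it."""
    return max(1, math.ceil(render_ms / REFRESH_MS))

print(refreshes_needed(10))  # 1 -> GPU keeps up, steady 60 fps
print(refreshes_needed(20))  # 2 -> frame shown twice: stutter, effective 30 fps
```

The same waiting is where the extra input lag comes from: a frame finished just after a refresh sits in the buffer for almost a full 16.67 ms interval before it can be shown.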
Nvidia G-Sync Vs. AMD FreeSync
To solve these problems, Nvidia launched its G-Sync technology in 2013. It synchronizes the monitor's refresh with the rate at which the graphics card generates each frame, even when that rate changes.
Soon after, AMD launched a similar system called FreeSync, which works much like G-Sync but is cheaper for monitor manufacturers to integrate.
Today, you will find dozens of monitors, even non-gaming ones, that have G-Sync, FreeSync, or even both.
G-Sync guarantees that you will never see tearing, even at very low frame rates. Below 30 fps, G-Sync monitors repeat frame renders (doubling the refresh rate) to stay inside the adaptive refresh range.
G-Sync also offers advanced features such as ULMB (Ultra Low Motion Blur), which reduces motion blur via backlight strobing. Some FreeSync monitors offer something similar under different names. The good news is that all G-Sync and G-Sync Ultimate monitors include it.
Nvidia's solution requires a proprietary hardware module that monitor makers must integrate, which is why G-Sync monitors are more expensive.
When G-Sync was released, monitors with it cost almost $200 more than comparable models without it. Today that difference has dropped considerably, to around 100 euros. In addition, there are now different certification tiers for monitors in different price ranges:
- G-Sync Compatible: the lowest tier; it only offers adaptive sync, above 60 fps. Many G-Sync Compatible monitors can also run FreeSync.
- G-Sync: adaptive sync at any frame rate, low motion blur, HDR, and better color support.
- G-Sync Ultimate: adaptive sync at any frame rate, low motion blur, HDR, better color support, peak brightness above 600 nits (previously 1,000 nits, lowered in January 2021), and refresh rates of 144 Hz or higher.
How to activate G-Sync
To use G-Sync, you need a G-Sync certified display and an Nvidia graphics card; the minimum compatible model is the GTX 650 Ti for G-Sync monitors, and a GTX 1050 for G-Sync Ultimate.
You also need a modern DisplayPort cable: DP 1.2 for G-Sync compatible monitors, and DP 1.4 for G-Sync Ultimate monitors.
Finally, to activate it, install the latest drivers, open the Nvidia Control Panel, and go to the Display section. There you should see the option “Configure G-SYNC.” Check the box to enable the setting and you're done.
FreeSync's main advantage is a cheaper implementation: it relies on an open standard created by VESA, Adaptive-Sync, which is part of the VESA DisplayPort specification. Any DisplayPort interface from version 1.2a onward can support adaptive refresh rates. A manufacturer may choose not to implement it, but the hardware is already there, so supporting FreeSync adds no production cost. FreeSync can also work over HDMI 1.4. As with G-Sync, there are several certification tiers:
- FreeSync: the lowest tier; it offers adaptive sync and may support HDR. Many FreeSync monitors can also run G-Sync.
- FreeSync Premium: adaptive sync, optional HDR support, Low Framerate Compensation (LFC), and refresh rates of 120 Hz or higher.
- FreeSync Premium Pro: adaptive sync, LFC, refresh rates of 120 Hz or higher, HDR, and better color support; although not specified, FreeSync Premium Pro monitors typically exceed 600 nits of peak brightness.
Due to its open nature, the implementation of FreeSync varies greatly between monitors. Many inexpensive monitors do not offer blur reduction, and the lower limit of the Adaptive-Sync range may be only 48 Hz. However, there are FreeSync displays that, like G-Sync ones, operate at 30 Hz or even lower.
One of G-Sync's benefits is that it continuously adjusts the monitor's overdrive to help eliminate ghosting. All G-Sync monitors also incorporate Low Framerate Compensation (LFC), which ensures that no glitches or image quality problems occur even when the frame rate drops. This feature is found on FreeSync Premium and Premium Pro monitors, but not always on standard FreeSync monitors.
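The basic idea behind LFC can be sketched as follows. This is a simplified model: the 48–144 Hz window is a hypothetical example range, and real implementations are driver-controlled and more nuanced:

```python
def lfc_refresh(fps):
    """Pick the smallest integer multiple of the frame rate that lands
    inside the panel's variable refresh window (48-144 Hz here is a
    hypothetical example range)."""
    VMIN = 48
    if fps >= VMIN:
        return fps  # already inside the window, refresh 1:1
    mult = 2
    while fps * mult < VMIN:
        mult += 1
    return fps * mult  # each frame is scanned out `mult` times

print(lfc_refresh(25))  # 50 -> each frame shown twice
print(lfc_refresh(20))  # 60 -> each frame shown three times
```

Repeating each frame this way keeps the panel inside its adaptive range, so low frame rates never fall back to fixed-refresh behavior (and its tearing or stutter).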
How to activate FreeSync
To use FreeSync you need a FreeSync-compatible display and one of the following options: a 2012 or newer AMD graphics card or APU, an Nvidia GeForce GTX 10-series graphics card or higher (you must use a DisplayPort cable), an Xbox One S or X, or an Xbox Series X or S. For FreeSync certified displays, make sure FreeSync is enabled through the monitor’s on-screen display.
For FreeSync TVs, simply activate Game Mode, usually through the settings menu.
For AMD Radeon graphics cards or AMD APUs, you can enable FreeSync through the AMD Radeon software, in the Display tab of the settings menu. Some users recommend capping your maximum FPS for a smoother experience; if you follow this advice, you can use Radeon Chill to limit FPS to about three to five frames below your monitor's maximum refresh rate.
In the case of Nvidia graphics cards, you need the latest Nvidia Game Ready drivers, although support for these displays started with driver version 417.71. After the latest drivers are installed, activate FreeSync through the on-screen display of the monitor. Then, in the Nvidia control panel, you can enable variable refresh rates through the “Configure G-SYNC” menu option.
On paper, Nvidia's G-Sync can deliver better results; in practice, however, FreeSync's performance is so close that the price difference is rarely justified.
Additionally, high-end monitors with FreeSync have proprietary technologies (not included in FreeSync) that improve image quality and make them as capable as the best G-Sync monitors.
However, because G-Sync certification is more demanding, it is easier to pick a good G-Sync monitor than a good FreeSync one. Let me explain: a G-Sync monitor will be good practically every time, whereas with a FreeSync monitor we have to check for other technical features that compensate for gaps in the FreeSync specification, such as low motion blur or low framerate compensation.
Ultimately, the best option for the vast majority of people is a good FreeSync monitor. The value for money is usually much better, and they work with both AMD and Nvidia cards. If money is no object, you want the best of the best, and you have an Nvidia card, go for a G-Sync monitor, knowing that you will normally pay a premium of about 100 euros for it.
How to Choose a Good Gaming Monitor
Comfort while gaming depends on a comfortable chair and desk, on the computer's performance, and on the monitor. The quality of the displayed image depends on the monitor's settings, its refresh rate, color reproduction, and many other characteristics.
Which PC Gaming Monitor Should You Choose? What to remember and what to pay attention to?
Choosing a gaming screen in the jungle of available models is not easy. Let's go through the most important things to keep in mind when buying a new gaming screen.
It is easy to treat buying a monitor like buying any other PC component, but that is a mistake. This is probably the most important hardware purchase you can make. It's not just about getting a good picture; it's about what kind of gaming experience you want.
This article provides answers to what to look for and what you can more or less ignore.
Gaming Monitor – Matrix Types
One of the first questions when choosing a gaming monitor is the matrix (panel) type. The display's basic parameters depend on this decision.
The matrix determines picture quality, color rendition, viewing angles, and so on. It is the main component of the device, its “filling,” and therefore deserves attention when buying a new monitor.
The most common matrices come in four types:
- TN – the “fastest” matrices, but their viewing angles and color reproduction are not very good;
- IPS – the leaders in picture quality, but only very expensive models can match TN in speed;
- VA – excellent speed and decent color reproduction, on a par with IPS in the same price category. In addition, VA matrices usually have two or even three times higher static contrast and more uniform backlighting;
- OLED – OLED screens stand out for their incredible brightness and contrast. Because each pixel emits its own light, even a barely lit scene shows a significant difference compared to LED-backlit matrices.
VA matrices are widely used in modern gaming monitors, including MSI's MPG and MAG series, which serve as our examples in this article. They are in demand for their excellent contrast, good color rendering, and wide viewing angles, making them arguably the best thing you can buy for gaming.
TN matrix for a gaming monitor
TN is a popular solution found in both budget monitors and more expensive models. TN panels are cheaper to manufacture but also have their drawbacks. The main advantages and disadvantages of TN matrices in gaming monitors are presented below.
Advantages of TN matrices:
- Low price – you can buy a gaming monitor with a TN matrix for under 150 USD.
- Fast response times and minimal delays.
- High refresh rate even on cheap monitors.
Disadvantages of TN matrices:
- Worse color rendering than monitors with VA or IPS matrices.
- Narrow viewing angles.
- Problems displaying deep blacks.
IPS matrix in a gaming monitor
An IPS gaming monitor is a popular choice, especially among gamers on a slightly larger budget who appreciate better color reproduction and wide viewing angles. IPS matrices are more expensive, but they make up for the price with image quality. Their most significant advantages and disadvantages are presented below.
Advantages of IPS matrices:
- Good color rendering.
- Wide viewing angles.
- Deep blacks and optimal contrast.
Disadvantages of IPS matrices:
- Higher purchase price than TN matrix.
- The response time is worse than with TN or VA matrices.
VA matrix in a gaming monitor
VA matrices in gaming monitors are a solution that can be viewed as a kind of compromise that combines the key advantages of TN and IPS matrices. The offer of gaming monitors with such a matrix is expanding, and its advantages and disadvantages are presented below.
Advantages of VA matrices:
- Reasonable price – VA matrix is more expensive than TN but cheaper than IPS.
- Good color rendering.
- Proper viewing angles – definitely wider than with TN.
Disadvantages of VA matrices:
- Response time – slower than TN but faster than IPS.
OLED matrix in a gaming monitor
OLED stands for Organic Light-Emitting Diode. The essence of the technology is that each pixel of the display is a separate light source. Small pixels allow for much higher resolution and pixel density, and therefore better image quality. And because each pixel emits light by itself, individual pixels or groups of pixels can be switched off completely, giving perfect blacks, rich colors, and energy savings.
Advantages of OLED matrices:
- Independent pixels. In terms of brightness and color, each pixel works entirely independently of the others, which helps display colors correctly in every area of the image.
- Viewing angles. The image is clearly visible from a wide range of angles without loss of quality.
- Flexibility. OLED matrices can take almost any shape and design; thanks to their extreme flexibility, such displays can be curved as desired.
Disadvantages of OLED matrices:
- High price. At the moment, OLED displays are very expensive; they occupy the premium class, which many potential buyers cannot afford.
- Burn-in. OLED displays are prone to pixel burn-in (especially the blue subpixels), which leads to color distortion and loss of contrast and shortens the display's lifespan.
After examining the types of matrices available, a decision needs to be made. It is best to do so based on the types of games you play most often and considering whether the monitor is mainly used for games or perhaps also for work or other applications.
Gaming monitor – which matrix for which games
If one type of matrix met the expectations of all buyers, there would be no point in producing other types of monitors. However, since each type of matrix has its advantages, it is worth considering which solution is best for a particular kind of game.
Buyers most often choose between a TN or IPS monitor, but it is also worth remembering the VA matrix, and, in addition, the type of matrix itself should be adapted to the specifics of the games used.
- A monitor for dynamic gaming – like action games, FPS, or FPP – can have a TN matrix, providing the fastest response times and possibly the best refresh rate compared to other monitors for the same price.
- A monitor for RPG, adventure, and strategy games – response time is not as important as perfect color reproduction, deep blacks, and effective contrasts, so an IPS monitor is the best choice.
For higher budgets, you can find an IPS monitor with good response times and high refresh rates.
Players who play games of all kinds and users looking to buy a monitor for gaming and work can also opt for a VA monitor that combines the features of IPS and TN monitors in terms of both quality and price.
What size gaming monitor to choose
No less important than the type of matrix and the resolution of the monitor is its size. This decision should be made based on the available space for the monitor – a small corner table is not suitable for a widescreen monitor with a large diagonal, and a small screen would be a poor choice for a large and spacious table.
What monitor size should you choose? Here are the most popular monitor sizes:
- 24 inches is a popular and very versatile gaming monitor size for work and many other applications. It fits easily on any desk. A monitor of this size performs well at Full HD resolution.
- 27 inches – a slightly larger screen provides a better experience when playing games or watching movies. A 27-inch or 28-inch gaming monitor usually has the most common aspect ratio (16:9) and performs well at Full HD and WQHD resolutions.
- 34 inches and above – monitors of 34 inches and up work great on a large desk. They are a good choice when looking for a widescreen or curved (usually large) screen. Monitors of this size come in various resolutions, but the most impressive models benefit from 4K.
The largest gaming monitors can be 65, 86, or even 98 inches – but these are expensive and far too large for a standard desktop. If you prefer to play on a big screen, it may be worth considering a TV instead.
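A useful number for relating size and resolution is pixel density (PPI). A quick sketch in Python; the diagonal sizes below are just common examples, not recommendations from this article:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: the pixel diagonal divided by the physical diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'24" Full HD: {ppi(1920, 1080, 24):.0f} ppi')  # ~92 ppi
print(f'27" WQHD:    {ppi(2560, 1440, 27):.0f} ppi')  # ~109 ppi
print(f'32" 4K UHD:  {ppi(3840, 2160, 32):.0f} ppi')  # ~138 ppi
```

This is why Full HD that looks sharp at 24 inches starts to look coarse on much larger panels, and why big screens pair naturally with higher resolutions.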
Gaming Monitor – What Resolution
Another essential issue when choosing a gaming monitor is its resolution. This decision should depend on your budget and on the capabilities of the computer itself.
The most popular solutions today include:
- Full HD resolution is still hugely popular, even though there is a huge range of higher-resolution monitors on the market. The advantages of a Full HD monitor are a lower price than WQHD or 4K models and good performance even with weaker graphics cards.
- WQHD resolution is a resolution that provides higher picture details than Full HD. WQHD gaming monitors are also not as expensive as 4K models, and we can easily buy a model with that resolution with a high refresh rate.
- 4K is the most expensive option, and many 4K monitors top out at a 60 Hz refresh rate, which is a big drawback for fast-paced action games. This resolution is recommended for monitors with a large diagonal and for computers with a good video card. A 4K gaming monitor is a good choice when shopping for a monitor for games and movies.
Screen resolution directly affects image detail. Still, the most advanced option will not always be the best – a lot depends on the budget available (it's hard to find a 4K gaming monitor for under 20,000 rubles).
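The GPU-load side of this trade-off follows from raw pixel counts, which a few lines of Python make concrete:

```python
# Pixels the GPU must render per frame for common gaming resolutions
resolutions = {"Full HD": (1920, 1080), "WQHD": (2560, 1440), "4K UHD": (3840, 2160)}
full_hd_px = 1920 * 1080

for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:,} pixels ({px / full_hd_px:.2f}x Full HD)")
```

4K pushes exactly four times as many pixels per frame as Full HD, which is why a card that comfortably drives Full HD can struggle badly at 4K.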
Curved screen – yes or no?
Now that we have analyzed the leading technical indicators, it is worth talking about the design features. For example, is it worth looking into curved screens? Many companies have been selling such monitors in stores for about five years.
Why is it getting popular? There are two reasons at once. Firstly, the curved screen gives the picture some volume. This is because the image surrounds the person and creates the effect of depth.
It looks especially great in games with a first-person camera, because in reality, what we see is also around us and not on a flat “canvas.”
Secondly, the curved design allows for an even wider viewing angle when building multi-monitor configurations. By putting three curved monitors together, you can get a panorama with a 180-degree field of view with little or no gaps. Ordinary flat screens can also be arranged three across, but then they sit at angles to each other, which disturbs the perception of the game.
Even though ordinary gamers still rarely use multiple monitors at once, a curved screen is an excellent feature that positively affects immersion in the game. So our answer is yes. And if you do want to try a trio of monitors, look for frameless (thin-bezel) ones.
As an example, take a multi-monitor configuration of 27-inch MSI Optix series monitors with a 144 Hz refresh rate. Among other things, this series boasts Anti-Flicker and Less Blue Light technologies, which handle screen flicker well and reduce the intensity of blue light.
Gaming Monitor – Response Time
Response time describes how quickly the screen's pixels change state – for example, from fully off to fully on, or between shades of gray – and it partly affects lag in the image. But since there is no uniform measurement standard, and it is a single figure taken out of a complex context, it may be time to bury that specification once and for all.
How is this parameter measured, and what should you know about monitor response time?
Refresh rate is one of the most important parameters of a gaming monitor, especially for dynamic action, FPS, and FPP games. Higher refresh rates combined with faster response times provide the smoothest image.
More hertz is always better – up to a certain limit.
Some myths surround refresh rates. It is hard to prove that a high refresh rate makes you a better gamer (though it almost does). At the same time, few computers can render as many frames per second (FPS) as these screens can display.
But it is precisely where these two meet that a high refresh rate comes into the picture. The higher the hertz, the lower the latency; and the lower the latency, the smoother the image – regardless of whether you are swinging between high-rise buildings at 250 km/h just outside Edinburgh or have your best friend in your sights for an approaching gib.
FPS vs. Hz
Let's take it from the beginning. FPS? Hertz? FPS stands for frames per second: how many frames the computer can render and send to the screen. It depends on your graphics quality settings and the resolution you have set.
Hertz (Hz), in turn, stands for “events per second” – in this case, how many times per second the screen updates the image. So we want the computer to pump out many frames per second, and we want the screen to be able to display at least as many, or more.
- 60 Hz is the frequency used in the cheapest monitors, but also in many 4K models. Keep in mind that the final refresh rate also depends on the cable (interface) used and the screen resolution. A 60 Hz refresh is recommended for work and for strategy and RPG games, not for fast-paced shooters.
- 75 Hz is a compromise for buyers interested in a variety of games but on a small budget. You can buy a 75 Hz refresh monitor for a little over £1,000.
- 144 Hz – Recommended refresh rate for players who prefer online battles and fast-paced games. A 144Hz gaming monitor performs well in day-to-day use, but keep in mind that this refresh rate is used primarily by Full HD and WQHD monitors and only a few higher resolution models.
- 240 Hz is an extremely high refresh rate used practically only in Full HD monitors. This type of monitor is recommended for fast-paced online gaming, especially when paired with a TN matrix and fast response times.
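The latency benefit behind these rates is easy to quantify: each refresh rate fixes the maximum time a finished frame may wait before it can appear on screen. A quick calculation:

```python
# Time budget per refresh: the longest a finished frame has to wait
# before the screen can show it, at a fixed refresh rate
for hz in (60, 75, 144, 240):
    print(f"{hz:3d} Hz -> a new image at most every {1000 / hz:.2f} ms")
```

Going from 60 Hz to 240 Hz cuts that worst-case wait from about 16.7 ms to about 4.2 ms, which is the "lower latency" gain described above.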
Gaming Monitor – What’s the Input Lag
Refresh rate and response time are not the only factors affecting the smoothness of the displayed image. An equally important issue is signal latency, that is, input latency. This parameter is specified in units of ms (similar to monitor response time) if manufacturers provide this information at all.
Since input lag is not a standard element of a monitor's specifications, many manufacturers omit it. Before buying a monitor, it is therefore worth looking for independent tests that measure it – the lower the input lag, the better.
We can assume that in good gaming monitors, input lag should not exceed 11ms.
Technologies developed by NVIDIA and AMD adjust the monitor's refresh rate to match the speed of frame rendering. Visually, we get a smoother picture without tearing at rendering speeds of 30-60 frames per second or lower. To work, they require a suitable video card from the corresponding manufacturer.
The industry has developed a flexible standard that lets the screen adapt to the number of frames the computer produces. It is called VESA Adaptive-Sync, though you may have heard of it under two other, slightly friendlier names: AMD FreeSync and Nvidia G-Sync.
This technology is built into basically all modern screens, and a screen that supports one will usually also support the other. The branding depends on which graphics card you have: FreeSync for AMD and G-Sync for Nvidia cards. In general, AMD graphics cards from 2014 onward can be expected to support it, as can all Nvidia cards from the 10, 16, 20, and 30 series.
However, you need to connect the monitor via a DisplayPort cable – on Nvidia cards in particular, this technology does not work over HDMI.
Gaming monitors in 2019 are equipped with HDMI and DisplayPort. The former is limited to a 144 Hz refresh rate, while the latter lets you squeeze the most out of your system.
Typically, gaming monitors above 144 Hz are equipped with DisplayPort. Note that your video card must also support this interface.
What else to look for when choosing a gaming monitor?
To choose the best gaming monitor, we need to analyze the parameters and technical details carefully.
It is not only the size, format or resolution of the screen that matters, but also the following:
- Wall Mount Adaptation – If you have an empty wall above your desk, you may want to consider wall-mounting your monitor, but then you need to find a VESA-compliant model.
- Built-in speakers – many monitors have them, but this is not the rule. If you don’t have separate speakers, you can opt for a monitor with built-in speakers, but don’t expect high sound quality from them.
- Rotating screen (pivot) – this option is useful both for gamers and for buyers who want a monitor for games and work (for example, graphics, design, or programming). The ability to rotate the screen vertically makes it comfortable to work with material in a portrait layout.
- Screen Height Adjustment – Many gaming monitors have a height-adjustable base, which is a great solution especially when users of different heights use the monitor – the ability to adjust the position of the monitor to the user’s height improves ergonomics.
- Synchronization – Many gaming monitors may be equipped with refresh rate synchronization (AMD FreeSync or NVIDIA G-Sync). This is a good solution when you have a compatible graphics card, because then the refresh rate matches the generated frames.
- HDR – this feature is found in more expensive gaming monitors. The selection of HDR gaming monitors is still small, but the technology is worth your attention: an HDR gaming monitor provides stronger backlighting and better reproduction of colors, highlights, and shadows.
- Brand – It’s worth looking at the manufacturer’s reputation because it often goes hand in hand with the quality of the solutions offered, or at least with the reliability of specifications and descriptions.
- Design – A gaming monitor should perform well in games and look good on a table. It is worth choosing the appearance of the monitor by your preferences.
- Price – many buyers start with an affordable budget. This is a good approach as it allows you to restrict the selection to a specific price range – perhaps one of the more expensive models will impress us with parameters, and we will decide to postpone the purchase until prices drop.
- Popularity with buyers – it is good to use social signals to check that your chosen monitor is well received by other users. Look for answers to questions such as which gaming monitor has the highest rating among real users.
- Release date. The latest gaming monitors are more likely to feature up-to-date solutions than a device released many years ago, which can be expected to have outdated interfaces and, in general, older technology.
How to set up your gaming monitor
Comfort in games is influenced by the quality of the selected monitor and by its settings. The position of the screen on the desk is as essential as the display parameters of the image.
What do you need to remember?
- The monitor must be in front of the user, any installation at an angle forces you to take an uncomfortable position, leading to back pain. In terms of ergonomics, monitors with adjustable height and tilt are the best choice.
- The monitor should be at a suitable distance from the user, especially when the screen is large – about 40 cm between the user's face and the screen surface is considered ergonomic (though it may be more).
- The monitor needs to be tuned in terms of image parameters – devices can have game modes ready, but you can also create your profile with your preferred settings for color, brightness, contrast, and even refresh rate.
The range of gaming monitors on offer can make you dizzy. There are so many models that it isn't easy to commit to a specific one, especially when the technical details are foreign to the buyer. Before buying, it is therefore worth spending a little time reading about the key parameters of gaming monitors and only then deciding on a specific model.
How to Install Windows 11 on an Unsupported PC
Microsoft has released a quick guide to installing the latest Windows 11 on computers with legacy hardware. Owners of machines without a TPM 2.0 module or with unsupported processors can now upgrade from Windows 10 to the “eleventh” – all it takes is adding one new key to the system registry. However, Microsoft itself does not recommend the method: it can be risky.
Microsoft has officially allowed its latest operating system, Windows 11, to be installed on computers that do not meet the minimum requirements. The company has posted instructions on its product support pages for working around some of the checks that prevent Windows 10 machines from upgrading to Windows 11.
The method proposed by Microsoft involves editing the system registry. It allows you to install the new OS on some machines without an activated TPM 2.0 module and with a processor that is not on the officially supported list. Keep in mind that you will still need at least a TPM 1.2 module.
TPM (Trusted Platform Module) is a module used to store cryptographic keys in computers. Among other implementations, it can be a separate chip installed on the motherboard.
Microsoft explained how to install Windows 11 on unsupported hardware.
The Windows 11 Installation Methods support page emphasizes that Microsoft does not recommend installing Windows 11 on devices that do not meet the minimum system requirements. Responsibility for the result of such an upgrade rests with the user.
“Your device may not function properly due to compatibility issues. Devices that do not meet these system requirements will no longer be guaranteed to receive updates, including but not limited to security updates,” warns Microsoft.
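The article does not spell the registry tweak out, but Microsoft's support page documents it. Shown here as a .reg file for reference; editing the registry is at your own risk, so back it up first:

```
Windows Registry Editor Version 5.00

; Value documented by Microsoft for upgrading via ISO on unsupported hardware
[HKEY_LOCAL_MACHINE\SYSTEM\Setup\MoSetup]
"AllowUpgradesWithUnsupportedTPMOrCPU"=dword:00000001
```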
Open the Microsoft support website and download both a Windows 10 disc image and a Windows 11 disc image. The Media Creation Tool is the easiest way to obtain the Windows 10 disk image. Then download the Windows 11 disc image:
Navigate to the Windows 11 ISO page, select Windows 11 from the menu under “Download Windows 11 disc image,” and click Download. Click the download button that appears, and the ISO file will download to your computer.
Extract both the Windows 10 and Windows 11 ISO files. Navigate to the extracted Windows 11 “sources” folder, then find and delete the install.esd file. From that same “sources” folder, copy install.wim, then navigate to the Windows 10 “sources” folder and paste it there.
Next upgrade to Windows 11.
Double-click the Setup file to begin the Windows 11 upgrade process.
Click the “Change how Windows Setup downloads updates” option and select “Not right now”.
Click the Accept button to accept the license terms.
Click the install button to upgrade while keeping your files and apps.
Once you complete the steps, the setup will continue upgrading the device to Windows 11.
What Is the aptX Audio Codec
The popularity of headphones raises the question of which models to choose and how they actually differ. Many people look at which codecs a pair of headphones supports: the more codecs there are, and the better those codecs are, the better the model is assumed to be. However, there are additional nuances to take into account.
AptX HD is arguably the best-balanced Bluetooth codec, transmitting sound at a quality close to CD. Across its set of parameters it offers the best trade-off between sound quality and connection stability: it loads the Bluetooth band significantly, but in return delivers very decent sound.
What is a codec
A codec is a program that compresses music or other content, reducing the size of the original file with minimal loss of quality.
File size is not critical when listening to music from a smartphone through wired headphones, since the digital data is converted to an analog signal inside the phone and sent through the wires as electric current.
With wireless audio, the music must be sent to the headphones digitally, and the bandwidth of a Bluetooth connection is very limited. In practice you usually cannot send music at more than about 1 Mbps, while “reference” CD quality requires 1.4 Mbps. That is why the file must be compressed with a special algorithm – a codec – and sent to headphones that “understand” that algorithm and can decode the compressed stream.
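The 1.4 Mbps figure for CD quality follows directly from the format's parameters (16-bit samples, two channels, 44.1 kHz sample rate); a quick sanity check in Python:

```python
# "Reference" CD quality: 16 bits per sample, 2 channels, 44,100 samples/sec.
cd_bitrate_bps = 44_100 * 16 * 2  # = 1,411,200 bps, i.e. about 1.4 Mbps

# Practical Bluetooth audio throughput, per the rough figure above.
bluetooth_budget_bps = 1_000_000  # ~1 Mbps

print(f"CD bitrate: {cd_bitrate_bps / 1e6:.2f} Mbps")
print(f"Shortfall vs Bluetooth: {(cd_bitrate_bps - bluetooth_budget_bps) / 1e6:.2f} Mbps")
```

The stream exceeds the available budget by roughly 0.4 Mbps, which is exactly the gap a codec has to close.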
Both devices – the smartphone and the headphones – must support a specific codec. For example, if your smartphone does not support aptX, buying aptX headphones is pointless.
When reading anything about Bluetooth, you invariably come across a jumble of numbers, letters, and other obscure designations. One such technology is aptX HD, which adds “HD” to the already mysterious name of the previous version of the codec. We all know that “HD” stands for “High Definition”, but what about “aptX”? And how does the original aptX differ from its HD successor?
AptX is increasingly mentioned as a requirement for high-quality wireless sound, but it is still far from the most widespread or affordable technology.
AptX is a licensed Qualcomm technology that must be supported on both ends: by the host device and by the wireless headphones.
Types of Codecs
There are quite a few Bluetooth audio codecs; the best known are SBC, AAC, aptX, and LDAC:
- SBC ← standardized in A2DP, supported by all devices
- MPEG-1/2 Layer 1/2/3 ← standardized in A2DP: the well-known MP3, the MP2 used in digital TV, and the little-known MP1
- MPEG-2/4 AAC ← standardized in A2DP
- ATRAC ← old codec from Sony, standardized in A2DP
- LDAC ← new codec from Sony
- aptX ← codec dating back to 1988
- aptX HD ← the same aptX, only with different encoding parameters
- aptX Low Latency ← a completely different codec, not just a software build of aptX with a reduced buffer
- aptX Adaptive ← another codec from Qualcomm
- FastStream ← pseudo-codec, a bidirectional SBC modification
- HWA LHDC ← new codec from Huawei
- Samsung HD ← supported by 2 devices
- Samsung Scalable ← supported by 2 devices
- Samsung UHQ-BT ← supported by 3 devices
It may seem as if aptX is some ultra-modern development, but it actually appeared back in 1988. At that point it was only patented; it began to be used in real products only in the 2000s, and even then aptX could hardly be called widespread.
Popularity came only in 2015, when Qualcomm acquired the codec and began licensing it. The codec uses its own compression algorithms, which preserve much more of the original frequency range of a track.
The SBC codec, a standard part of the A2DP profile, can transmit audio at bit rates up to 328 kbps at 48 kHz, while aptX can output up to 384 kbps.
With SBC, any music will sound like an average-quality MP3, while aptX approaches the best sound quality an MP3 can offer. In addition, aptX has lower latency: about 120 ms compared to 170-270 ms for SBC.
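Using the peak bitrates quoted above and the 1,411.2 kbps figure for CD quality, it is easy to see how hard each codec has to squeeze the stream; a rough back-of-the-envelope comparison:

```python
# Compression ratio each codec needs to fit CD-quality audio (1,411.2 kbps)
# into its peak bitrate; figures taken from the text above.
CD_KBPS = 1411.2

codecs = {"SBC": 328, "aptX": 384}

for name, kbps in codecs.items():
    ratio = CD_KBPS / kbps
    print(f"{name}: {kbps} kbps -> needs about {ratio:.1f}:1 compression")
```

SBC must compress roughly 4.3:1 versus about 3.7:1 for aptX, which is one reason aptX can afford to discard less of the signal.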
But aptX is not the king of the party yet. A year after Qualcomm took aptX under its wing, the company developed the souped-up aptX HD codec, which already allows you to listen to music at a bit rate higher than standard MP3.
It should be noted that only devices with a specialized chip support aptX; support for this codec cannot be added purely in software. AptX comes in several versions.
LDAC: Sony’s Hi-Res codec which, unfortunately, is not as good as advertised. It is marketed as an audiophile alternative to all existing Bluetooth audio codecs (bitrate up to 990 kbps). It splits the audio into 12 and 16 bands at 48 kHz and 96 kHz, respectively.
It is something of an audiophile showpiece, except that most of the bitrate is spent transmitting frequencies that the human ear simply cannot hear.
Because of this, the audible spectrum is not reproduced as carefully as it could be, and in many final characteristics LDAC is no better than aptX HD. The codec’s signal-to-noise ratio, however, is outstanding.
Almost all Android devices that ship with Android 8.0 or higher have LDAC support out of the box. LDAC-enabled headphones, however, are available only from Sony, so the codec is not that widespread. In addition, there are no software decoders in the public domain, which prevents a full analysis of DACs that support it.
Advanced Audio Coding (AAC) is another computationally complex codec with a variety of psychoacoustic masking techniques. Many people mistakenly believe that AAC belongs to Apple, but in fact the Cupertino company uses its own modified version of this codec. Apple’s AAC encoders are licensed, but only to manufacturers of equipment certified for the company’s technology. On Android, Fraunhofer FDK AAC is used.
Fraunhofer FDK AAC is inferior in quality to Apple’s AAC. Funnily enough, encoding quality also varies noticeably between Fraunhofer FDK AAC versions.
AAC has native tools for mixing multiple audio streams across different channels. This is used to reduce latency while simultaneously transmitting system sounds (notifications) and playing music.
As for music, AAC pushes it at 256 kbps, but thanks to gentler compression algorithms the sound quality is comparable to a 320 kbps MP3. Various psychoacoustic “tricks” are also used to improve the perceived sound across the entire frequency range.
Codecs differ in bit rate. For example, aptX-enabled devices can communicate at up to 384 kbps, and LDAC at up to 990 kbps. The higher the value, the less the original file needs to be compressed, which benefits sound quality.
The more a file has to be shrunk, the more information is lost. At the same time, the maximum data transfer rate requires a high-quality connection between the devices. Any radio interference or physical obstacle between the headphones and the smartphone (clothes, body parts, walls) will lead to stuttering and dropouts.
A codec is thus a trade-off between connection quality and sound quality. However, this is not true for all algorithms: some codecs support dynamic bit rates and can automatically lower the bit rate when signal quality drops, then raise it again when it recovers. These codecs include Samsung Scalable, aptX Adaptive, and SBC.
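As a rough illustration of how such a dynamic-bitrate codec might behave, here is a toy bitrate selector. The thresholds and bitrate steps are invented for illustration only; they do not come from any real codec specification:

```python
def pick_bitrate_kbps(link_quality: float) -> int:
    """Map a link-quality estimate (0.0 = unusable, 1.0 = perfect)
    to a target bitrate, dropping the rate as the link degrades.
    All thresholds and rates here are hypothetical."""
    if link_quality >= 0.8:
        return 420   # strong link: highest quality this toy codec offers
    elif link_quality >= 0.5:
        return 350   # middling link: back off moderately
    else:
        return 276   # weak link: keep audio flowing instead of stuttering

print(pick_bitrate_kbps(0.9))  # strong link -> highest bitrate
print(pick_bitrate_kbps(0.3))  # weak link -> lowest bitrate
```

A real codec makes this decision continuously, trading a little quality for an uninterrupted stream.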
Codecs also differ in their compression algorithms. Some use psychoacoustics; that is, they remove from the original those sounds a person cannot hear anyway. For example, quiet sounds masked by louder ones are discarded, as are sounds above a certain frequency, which mostly only children can hear, since the upper threshold of hearing declines with age.
The more complex the compression algorithm, the better the sound at the same bit rate: a crude codec discards information indiscriminately, while a smarter one removes only what is already inaudible, leaving the audible information without strong distortion.
How is AptX different from other audio codecs?
Conventional audio codecs are based on a technique called psychoacoustics, whereby fragments of the audio that the human ear is unlikely to notice are removed to compress the file.
Since a precise technical explanation is beyond the scope of this article, let’s briefly describe the process.
AptX works very differently. Instead of discarding inaudible content, it encodes the difference in the signal between successive time intervals and transmits that, resulting in clearer, cleaner sound.
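A toy example of the difference-coding idea described above (this only illustrates the general principle; the real aptX algorithm is far more sophisticated):

```python
def delta_encode(samples):
    """Store each sample as the difference from the previous one.
    Differences between neighboring audio samples are typically small,
    so they are cheaper to encode than the raw values."""
    prev, deltas = 0, []
    for s in samples:
        deltas.append(s - prev)
        prev = s
    return deltas

def delta_decode(deltas):
    """Rebuild the original samples by accumulating the differences."""
    total, out = 0, []
    for d in deltas:
        total += d
        out.append(total)
    return out

samples = [100, 102, 105, 103, 101]
deltas = delta_encode(samples)
print(deltas)                           # mostly small numbers
assert delta_decode(deltas) == samples  # lossless round trip
```

Encoding small differences instead of full samples is the core idea behind difference-based schemes; the codec then spends its bits on these residuals rather than on the raw signal.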
Benefits of AptX
- High-quality wireless sound
- Low-latency music playback
- Drains less battery
- Clear and crisp hands-free voice calling
Does AptX work with iOS devices?
Unfortunately, no. AptX is a licensed technology from Qualcomm and currently works only with Android devices.
Apple instead uses the AAC codec for Bluetooth streaming, which is significantly less efficient than aptX at audio processing.
How aptX, aptX HD and aptX Adaptive codecs differ
aptX, aptX HD, and aptX Adaptive are popular codecs from Qualcomm. Many smartphones support them, but they are used less often in headphones, since manufacturers must pay to license the codec.
These codecs do not use psychoacoustics but do support high bit rates: up to 384 kbps for aptX, 576 kbps for aptX HD, and 276-420 kbps for aptX Adaptive.
Only the newest and highest-quality codec, aptX Adaptive, supports a dynamic bit rate. Thanks to its state-of-the-art compression algorithm, at 420 kbps it delivers the same quality as aptX HD at 576 kbps, and if the connection is poor, the bit rate drops automatically to keep the connection stable.