ASUS has introduced three new inexpensive displays for gamers with full HD and 4K resolutions: the MG248Q, the MG28UQ and the MG24UQ. The new monitors support VESA’s Adaptive-Sync technology and thus should be compatible with video cards that feature AMD’s FreeSync dynamic refresh rate technology. While the monitors do not carry the FreeSync badge at this point, they will likely gain it eventually.

Specifications of ASUS MG-Series Displays
|                     | MG248Q                       | MG24UQ                                 | MG28UQ                                 |
|---------------------|------------------------------|----------------------------------------|----------------------------------------|
| Panel               | 24" TN                       | 23.6" IPS                              | 28" TN                                 |
| Resolution          | 1920 × 1080                  | 3840 × 2160                            | 3840 × 2160                            |
| Refresh Rate        | 40 Hz - 144 Hz               | 30 Hz - 60 Hz                          | 30 Hz - 60 Hz                          |
| Adaptive-Sync Range | unknown                      | 40 Hz - 60 Hz                          | 40 Hz - 60 Hz                          |
| Response Time       | 1 ms gray-to-gray            | 4 ms gray-to-gray                      | 1 ms gray-to-gray                      |
| Brightness          | 350 cd/m²                    | 300 cd/m²                              | 330 cd/m²                              |
| Contrast            | unknown                      | unknown                                | unknown                                |
| Viewing Angles      | 170°/160° (H/V)              | 178°/178° (H/V)                        | 170°/160° (H/V)                        |
| PPI                 | 92 ppi                       | 186 ppi                                | 157 ppi                                |
| Pixel Pitch         | 0.276 mm                     | 0.136 mm                               | 0.16 mm                                |
| Colors              | 16.7 million                 | 1.07 billion                           | 1.07 billion                           |
| Color Saturation    | unknown                      | unknown                                | unknown                                |
| Inputs              | DisplayPort 1.2, HDMI 1.4, DVI-D | DisplayPort 1.2, HDMI 2.0, 2 × HDMI 1.4 | DisplayPort 1.2, HDMI 2.0, 2 × HDMI 1.4 |
| Audio               | 2 × 2 W                      | 2 × 2 W                                | 2 × 2 W                                |
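
As a sanity check, the PPI and pixel-pitch rows follow directly from each panel's resolution and diagonal. A quick illustrative sketch (ours, not from ASUS's materials):

```python
import math

def ppi_and_pitch(h_px: int, v_px: int, diagonal_in: float) -> tuple[float, float]:
    """Return (pixels per inch, pixel pitch in mm) for a panel."""
    diagonal_px = math.hypot(h_px, v_px)  # panel diagonal, in pixels
    ppi = diagonal_px / diagonal_in       # pixel density along the diagonal
    pitch_mm = 25.4 / ppi                 # width of one pixel, in millimetres
    return ppi, pitch_mm

for name, w, h, diag in (("MG248Q", 1920, 1080, 24.0),
                         ("MG24UQ", 3840, 2160, 23.6),
                         ("MG28UQ", 3840, 2160, 28.0)):
    ppi, pitch = ppi_and_pitch(w, h, diag)
    print(f"{name}: {ppi:.1f} ppi, {pitch:.3f} mm pitch")
# -> MG248Q: 91.8 ppi, 0.277 mm; MG24UQ: 186.7 ppi, 0.136 mm; MG28UQ: 157.4 ppi, 0.161 mm
```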

The largest of the new displays is the ASUS MG28UQ, which is based on a TN panel with a 3840×2160 resolution and a 330-nit maximum brightness. The MG28UQ has a default refresh rate of 60 Hz and supports dynamic refresh rates between 40 and 60 Hz, which is typical for 4K monitors. The display is equipped with one DisplayPort 1.2 input and three HDMI inputs, a dual-port USB 3.0 hub with quick-charge support, as well as two 2 W speakers. The unit also features tilt, swivel, pivot and height adjustments and is compatible with VESA wall mounts.
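
To illustrate what a 40 Hz - 60 Hz Adaptive-Sync window means in practice, here is a minimal, purely illustrative model (not vendor code): inside the window, the panel refreshes in step with the GPU's frame rate; outside it, scan-out is clamped to the window's edge, where v-sync judder or tearing can reappear. A 60/40 ratio is also too narrow for FreeSync-style low-framerate compensation, which needs a considerably wider range.

```python
def effective_refresh_hz(frame_time_ms: float,
                         vrr_min_hz: float = 40.0,
                         vrr_max_hz: float = 60.0) -> float:
    """Illustrative model of a fixed VRR window (no low-framerate compensation).

    Inside the window the panel refreshes in step with the GPU; outside it,
    scan-out is clamped to the nearest edge of the window.
    """
    fps = 1000.0 / frame_time_ms
    return max(vrr_min_hz, min(vrr_max_hz, fps))

for frame_time in (12.0, 20.0, 33.0):  # ~83 fps, 50 fps, ~30 fps
    fps = 1000.0 / frame_time
    print(f"{fps:5.1f} fps -> panel refreshes at {effective_refresh_hz(frame_time):.1f} Hz")
```

By contrast, the MG248Q's quoted 40 Hz - 144 Hz window would leave far less of a typical frame-rate curve outside the range.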

The ASUS MG28UQ is available now for $549 from Amazon. The price does not seem particularly affordable for a TN-based display, possibly because ASUS charges a premium for the Adaptive-Sync feature. Nonetheless, the monitor is not prohibitively expensive either.

Next up, the MG24UQ is not as big as its larger sibling (it has a 23.6” diagonal), but it will be a more interesting option for those who prefer IPS panels with high pixel density. The monitor sports a 3840×2160 resolution with up to a 60 Hz refresh rate and a peak brightness of 300 nits. Adaptive-Sync works for refresh rates between 40 and 60 Hz, just as on the MG28UQ. The monitor features one DisplayPort 1.2 input and three HDMI inputs, as well as two 2 W speakers. The design of the MG24UQ is very similar to that of the MG28UQ (hence, it sports the same set of adjustments and VESA mounts), with the exception of its dimensions and the lack of a USB hub on the smaller model.

The ASUS MG24UQ can be pre-ordered now for $399 on Amazon.

Finally, the ASUS MG248Q is designed for gamers who value high dynamic refresh rates above all other features. This display will be the first 24” monitor from the company to support up to a 144 Hz refresh rate as well as Adaptive-Sync technology. The monitor uses a TN panel with a 1920×1080 resolution and a peak brightness of 350 nits, offering slightly better specifications and a more aggressive visual design compared to the VG247H and the VG248QE. The display supports dynamic refresh rates between 40 and 144 Hz, according to ASUS, which is a very decent range. As an added bonus, thanks to the extremely high refresh rate, the MG248Q could be used with NVIDIA's 3D Vision stereo-3D kit.

ASUS plans to start selling the MG248Q in the coming weeks for an undisclosed price. Typically, such monitors are not expensive; thus, the MG248Q could be used to build relatively affordable, ultra-fast multi-monitor setups with Adaptive-Sync.

ASUS is one of the leading suppliers of displays for gamers, with a huge market share, according to the company. The new MG-series monitors should help ASUS better address the segment of affordable gaming displays.

Source: ASUS

Comments

  • Flunk - Friday, April 15, 2016 - link

    These monitors are all pretty compelling, do we have any idea when Nvidia is going to support VESA adaptive-sync?
  • G0053 - Friday, April 15, 2016 - link

    It is silly of them not to support adaptive sync. It would just be another check box for them.
  • JoeyJoJo123 - Friday, April 15, 2016 - link

    It actually adds cost.

    The G-Sync module is used in place of (or bypasses any use of) the typical monitor's display scaler, and that's no exception for adaptive sync. So for monitors to be able to support both G-Sync and Adaptive Sync, it'd need to have both the usual display scaler AND the G-Sync module.

    Some laptop models have been shown to support G-Sync despite not having a G-Sync module, so I suppose it is possible to get both G-Sync and Adaptive Sync on the same monitor without the expensive G-Sync module, but you'd need some kind of mode switching between the two.

    In my opinion, buying G-Sync monitors today or video cards that don't support VESA Adaptive Sync (Nvidia cards) feels like a sunk cost in the long run. Only Adaptive Sync displays and video output will really have mileage 5 years down the road when a plethora of electronics have started adopting Adaptive Sync more commonly.

    Polaris can't come soon enough.
  • BurntMyBacon - Friday, April 15, 2016 - link

    @JoeyJoJo123: "It actually adds cost. The G-Sync module is used in place of (or bypasses any use of) the typical monitor's display scaler ..."

    I'm pretty sure the OP was talking about nVidia supporting VESA adaptive-sync on their video cards. There is no technical reason that they couldn't. There wouldn't be anything added to the Bill Of Materials, just a relatively short (trivial?) development time to support it in software.

    @JoeyJoJo123: "Some laptop models have been shown to support G-Sync despite not having a G-Sync module ..."

    Because they already know how to support "Sync" technology without their G-Sync module, it shouldn't be hard to support Adaptive Sync with their video cards. It probably isn't all that difficult (now that much of the work is done) to support G-Sync on most "Sync"-capable monitors without the module, but there are probably some tradeoffs.
  • Pantsu - Friday, April 15, 2016 - link

    There might be a couple of reasons for that. Their current gen graphics cards might not have the necessary hardware features in their DisplayPort implementation for Adaptive-Sync support. That would mean Nvidia could only support it with Pascal or newer, so it's understandable why they would not announce support before a good portion of their user base has supported cards.

    The more cynical view is that they have the capability in current cards, but see it as competition and try to stifle it in favor of their G-Sync implementation. In that case they'll probably wait to support it only if the rest of the industry chooses Adaptive-Sync. Intel's support would essentially doom G-Sync. At the moment Adaptive-Sync panels seem to have bigger restrictions with the refresh rate range, and it'll likely take a year or two before TCON manufacturers start offering boards that aren't hobbled with tight dynamic refresh rate ranges.
  • plopke - Friday, April 15, 2016 - link

    I would love to see this on Nvidia cards, but I guess for Nvidia it isn't that easy. Not sure if the G-Sync premium pricing is higher than AMD's FreeSync/Adaptive-Sync, but since they went the exclusive route they might be stuck with some tough choices and questions:
    - Can they run FreeSync on G-Sync hardware? Namely, the monitors might be a problem.
    - How do they promote it to the market if they drop G-Sync, and what about their G-Sync customers?
    - Do they try to improve/adapt G-Sync so they can still offer it as a premium over FreeSync?
  • plopke - Friday, April 15, 2016 - link

    Anyway, long live competition :P
  • Flunk - Friday, April 15, 2016 - link

    They don't need to stop supporting G-Sync to support adaptive sync.
  • valinor89 - Friday, April 15, 2016 - link

    Even better, just rename VESA adaptive-sync as G-sync Lite or something and offer the original G-sync as a premium product. Assuming original G-sync is better. Have not seen them both in action. This way they don't have to abandon the sunk costs on G-sync marketing.
  • Alexvrb - Saturday, April 16, 2016 - link

    This idea gets my vote. But good luck selling it to Nvidia. :D
