AMD announced Radeon Anti-Lag with the launch of the Navi 10 GPUs in July. Now they've brought it to all GCN and later discrete GPUs, as well as all Ryzen-based integrated GPUs. What it does is optimize rendering for latency rather than for frame rates. As AMD explains it, the goal is to make the lag between when you press a button and when the effect shows up on your monitor as small as possible. For competitive e-sports, or any twitchy competitive gaming, using Radeon Anti-Lag is an obvious decision--and a significant reason to prefer an AMD GPU to Nvidia even on an unlimited budget, at least until Nvidia offers an analogous feature. It does come at the expense of slightly lower average frame rates, sometimes by a fraction of a percent and other times by several percent.
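AMD hasn't published exactly how Anti-Lag works, but they describe it as pacing the CPU's work so that it doesn't run several frames ahead of the GPU. Here's a self-contained toy simulation (all numbers and names are mine, not AMD's) of why capping the number of frames in flight cuts input-to-display latency:

```python
# Toy simulation of why capping the render queue reduces input-to-display
# latency. The numbers are invented: the CPU builds a frame in 4 ms, the GPU
# renders one in 10 ms, so an unchecked CPU runs ahead and frames sit queued.
CPU_MS, GPU_MS, FRAMES = 4.0, 10.0, 200

def simulate(max_queue):
    gpu_done = 0.0    # when the GPU finishes its current frame
    cpu_time = 0.0    # CPU-side clock
    completions = []  # display time of every frame submitted so far
    latencies = []
    for _ in range(FRAMES):
        # Frames submitted but not yet displayed as of the CPU's clock.
        in_flight = sorted(c for c in completions if c > cpu_time)
        if len(in_flight) >= max_queue:
            cpu_time = in_flight[-max_queue]  # pace: wait for a queue slot
        sample = cpu_time       # input is sampled when the CPU starts the frame
        cpu_time += CPU_MS      # CPU builds the frame
        gpu_done = max(cpu_time, gpu_done) + GPU_MS  # GPU renders it
        completions.append(gpu_done)
        latencies.append(gpu_done - sample)  # button press to on-screen result
    return sum(latencies) / len(latencies)

print(simulate(3))  # deep queue: ~30 ms average latency
print(simulate(1))  # paced:       14 ms average latency
```

In this model, capping the queue at one frame drops the average latency from about 30 ms to 14 ms, but the GPU now idles for 4 ms per frame--the toy-model version of the small frame rate hit mentioned above, exaggerated here by the made-up numbers.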
Radeon Anti-Lag being built into AMD's drivers isn't new. What's new is that AMD has extended support for it all the way back to the Radeon HD 7000 series cards that launched way back in 2012. For video cards just shy of eight years old to still have driver support is hardly guaranteed. I can't think of another time that a GPU vendor actually added major new driver features to cards that old. For comparison, when Nvidia finally offered Adaptive Sync support around the start of this year, they limited it to Pascal and later GPUs, which basically meant only cards less than three years old.
One caveat to the old card support is that on GCN-based cards, AMD only offers Radeon Anti-Lag for DirectX 11, DirectX 12, and Vulkan. I'm not sure if the earliest GCN GPUs support DirectX 12 or Vulkan at all. The clear difference is that Navi GPUs also support Radeon Anti-Lag in DirectX 9 games, while the GCN-based GPUs (which include Polaris and Vega) do not.
That's not the only new feature, though. Another new feature is Radeon Boost, which can increase frame rates at the expense of reduced image quality. The idea is that the driver will track your mouse movement and dynamically reduce the render resolution during fast motion, restoring full resolution once things settle down. In slower-paced games, that's a dubious trade, as you'd have plenty of time to notice the whole screen going soft. But in fast, twitchy games, where the resolution only drops while you're whipping the view around, you won't notice the degraded image quality. You might notice increased frame rates, though.
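The driver's actual heuristics are surely more sophisticated, but the basic idea can be sketched like this (the threshold and scale factor are invented for illustration, not AMD's values):

```python
# Hypothetical sketch of a Radeon Boost-style heuristic: drop the render
# resolution while the mouse is moving quickly, restore it when it isn't.
FAST_PIXELS_PER_SEC = 1500  # mouse speed we'll treat as "fast motion"
BOOST_SCALE = 0.67          # render at 67% resolution during fast motion

def render_scale(mouse_dx, mouse_dy, dt):
    """Pick this frame's render-resolution scale from mouse velocity."""
    speed = (mouse_dx ** 2 + mouse_dy ** 2) ** 0.5 / dt
    return BOOST_SCALE if speed > FAST_PIXELS_PER_SEC else 1.0

print(render_scale(300, 0, 0.016))  # a fast flick over a 16 ms frame: 0.67
print(render_scale(10, 5, 0.016))   # small aiming adjustments: 1.0
```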
Nvidia has an analogous feature, which I think was originally added with Turing, a little over a year ago. But AMD is supporting theirs on very old GPUs. I'd bet that Nvidia doesn't support it on their old Kepler GPUs, which launched just after AMD's first GCN-based cards. They probably don't even support it on Pascal.
AMD is also offering integer display scaling. Some games really demand to run at one fixed resolution, whether or not it matches your monitor. These tend to be older games, though sometimes no-budget indie games using a bad engine (or misusing a good one) have the same problem. Having a game that runs at 1024x768 stretched to a 1920x1080 monitor looks bad, for multiple reasons. One is that the aspect ratio is wrong: a 4:3 image gets stretched to fill a 16:9 screen. Another is that the non-integer upscaling blurs pixels. This is especially bad for 2D, sprite-based games.
AMD's solution is to offer new full screen options that have the GPU handle how an image is scaled up rather than the monitor. One option preserves the aspect ratio by leaving part of the monitor unused, typically bars on the sides. Another insists that the game run at some integer multiple of its original resolution (e.g., 1280x960 on a 1920x1080 monitor for a game that wants to run at 640x480), centered on your monitor, potentially leaving unused space on all sides. Integer upscaling won't cause the blurring of the first option, but can still take advantage of more space than having a tiny 640x480 box in the center of your screen. And AMD gives you a choice, and will let you make and save different choices for different games if you like.
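The arithmetic behind the two options is simple enough to sketch; the function names here are mine, since the driver exposes these only as display settings:

```python
# Aspect-ratio-preserving scaling vs. centered integer scaling.
def aspect_fit(game_w, game_h, mon_w, mon_h):
    """Scale as large as possible without distorting the aspect ratio."""
    scale = min(mon_w / game_w, mon_h / game_h)  # often fractional, so blurry
    w, h = round(game_w * scale), round(game_h * scale)
    return w, h, (mon_w - w) // 2, (mon_h - h) // 2  # size plus centering offsets

def integer_fit(game_w, game_h, mon_w, mon_h):
    """Scale by the largest whole number that fits: sharp, but smaller."""
    scale = min(mon_w // game_w, mon_h // game_h)
    w, h = game_w * scale, game_h * scale
    return w, h, (mon_w - w) // 2, (mon_h - h) // 2

print(aspect_fit(1024, 768, 1920, 1080))   # (1440, 1080, 240, 0): bars on the sides
print(integer_fit(640, 480, 1920, 1080))   # (1280, 960, 320, 60): 2x, space all around
```

Note that the aspect-preserving fit scales 1024x768 by 1.40625x, and that fractional factor is exactly where the blur comes from; the integer fit sidesteps it by construction.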
There is one other caveat to some of the new features, which isn't advertised but is immediately obvious if you tinker with the drivers. Radeon Anti-Lag, Radeon Boost, and Radeon Chill are mutually exclusive features, in that turning one of them on will force the other two off. Radeon Chill is an older feature to save power by reducing your GPU clock speed and voltage in situations where you're rendering frames faster than the monitor can handle. All three of those features tinker with the rendering process in different ways, so it's not hard to imagine that they could trip over each other.
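If you want a compact mental model of what the settings UI does, it behaves like this toy sketch (my code, not AMD's):

```python
# Toy model of the observed behavior: enabling any one of the three features
# clears the other two.
EXCLUSIVE = {"anti_lag", "boost", "chill"}

class RadeonSettings:
    def __init__(self):
        self.enabled = set()

    def enable(self, feature):
        if feature in EXCLUSIVE:
            self.enabled -= EXCLUSIVE  # turning one on forces the others off
        self.enabled.add(feature)

settings = RadeonSettings()
settings.enable("chill")
settings.enable("anti_lag")      # silently disables chill
print(sorted(settings.enabled))  # ['anti_lag']
```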