AORUS, the premium gaming brand from GIGABYTE, has launched a completely new series of RTX 30 graphics cards, including the RTX 3090 Xtreme, RTX 3090 Master, RTX 3080 Xtreme, and RTX 3080 Master.
Besides excellent cooling and superior performance, LCD Edge View is another highlight of the AORUS RTX 30 series graphics cards. LCD Edge View is a small LCD located on the top of the graphics card. What can users do with this small LCD? Let's find out.
LCD Edge View is an LCD located on the top edge of the graphics card. You can use it to display GPU info including temperature, usage, clock speed, fan speed, VRAM usage, VRAM clock and total card power. All of this information can be cycled through one by one, or limited to just the items you select.
Besides that, there are three different display styles available, so users can choose their ideal one. And it is not just GPU info: the FPS (frames per second) of a game or other application can also be shown on the LCD Edge View.
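For context, the telemetry the LCD shows is the same data any monitoring tool can pull from the driver. As a rough illustration (this is NVIDIA's public NVML interface via the pynvml Python bindings, not the AORUS software itself), here is a minimal sketch that reads the metrics listed above:

```python
# Minimal sketch: read the GPU metrics the LCD Edge View can display,
# using NVIDIA's NVML API via the pynvml bindings (pip install nvidia-ml-py).
# Illustrative only -- this is not the AORUS/RGB Fusion software.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)      # deg C
util = pynvml.nvmlDeviceGetUtilizationRates(gpu)                              # .gpu / .memory, %
core_clock = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)   # MHz
vram_clock = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_MEM)        # MHz
fan = pynvml.nvmlDeviceGetFanSpeed(gpu)                                       # % of max
vram = pynvml.nvmlDeviceGetMemoryInfo(gpu)                                    # .used / .total, bytes
power = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0                          # milliwatts -> watts

print(f"{temp} C | {util.gpu}% GPU | {core_clock} MHz core | {vram_clock} MHz mem")
print(f"fan {fan}% | VRAM {vram.used / 2**20:.0f}/{vram.total / 2**20:.0f} MiB | {power:.0f} W")

pynvml.nvmlShutdown()
```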
The LCD Edge View can also show customized content, including text, pictures or even short GIF animations. Users can enter whatever text they prefer and set the font size, bold or italic; multiple languages are supported, so users can input any text they want.
As for pictures, LCD Edge View lets users upload a JPEG file, and the AORUS RGB Fusion software lets them choose which region of the picture is shown. The support for short GIF animations is the most interesting part.
Users can upload a short animation in GIF format to be shown on the LCD, making it easy to build a graphics card with their own style. All of the customizations above are done via the AORUS RGB Fusion software.
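RGB Fusion handles the upload itself, but if you want to pre-size an animation yourself, something like the following Pillow sketch works. Note that the 320x100 target is purely a hypothetical placeholder; Gigabyte does not publish the panel's exact resolution here, so check the software for the real value:

```python
# Minimal sketch: resize an animated GIF to a small, wide target before
# uploading it through RGB Fusion. The 320x100 size is a hypothetical
# placeholder -- check the RGB Fusion software for the panel's real resolution.
from PIL import Image, ImageSequence

TARGET = (320, 100)  # hypothetical panel size

src = Image.open("animation.gif")
frames = [f.copy().resize(TARGET) for f in ImageSequence.Iterator(src)]
frames[0].save(
    "animation_lcd.gif",
    save_all=True,                            # write all frames, not just the first
    append_images=frames[1:],
    loop=0,                                   # loop forever
    duration=src.info.get("duration", 100),   # ms per frame, keep source timing
)
```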
There's something even more interesting in LCD Edge View: the little CHIBI. CHIBI is a little falcon that lives digitally in the LCD Edge View and grows up as users spend more time with their graphics card. Users can always check on their little CHIBI through the LCD Edge View and watch it eat, sleep or fly around, which is quite interactive and fun.
In conclusion, LCD Edge View can display a range of useful GPU information, plus customized text, pictures, and animations, letting users style their graphics card their own way. Users also get more interaction with their card via the little CHIBI, the exclusive digital falcon living inside the LCD Edge View, which adds a bit of fun to owning the graphics card.
We have something special for you! We've tested the largest GeForce RTX 3080, one that even comes equipped with its own display. At least as interesting are the results with Resizable BAR, which makes its debut in our GeForce graphics card tests, so sit back and get ready. The gains (and losses) in performance, and how they compare to Radeon, are worth a look.
With the RTX 3080 Xtreme 10G, we are starting to test Resizable BAR on GeForce as well. Given how widespread support has become, this is practically mandatory by now. Nevertheless, it still makes sense to maintain and expand the database of results without ReBAR, and there are two reasons for this. The first is comparability with the existing base of measurements taken with Resizable BAR off.
The second reason not to abandon the standard measurements (i.e. with Resizable BAR off) is that the performance change is not always positive; in some cases it can even be negative. We have noticed this on AMD's graphics cards too, which have supported ReBAR for longer. In short, it is also good to know the situations in which ReBAR is currently not suitable. We have tested Radeons with ReBAR across all available GPUs from the RX 6000 series, and the RTX 3080 now gets the premiere among GeForce cards. Given its high popularity, it is probably the most appropriate choice for a start. We have this RTX 3080 in a non-traditional version from Gigabyte.
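A quick way to check whether ReBAR is actually active on a GeForce card is to look at the BAR1 aperture the driver reports: with Resizable BAR enabled, BAR1 spans (roughly) the whole VRAM rather than the classic 256 MiB window. A small sketch that parses nvidia-smi output follows; this is a heuristic for a single-GPU system, not an official ReBAR query:

```python
# Heuristic sketch: infer Resizable BAR state from the BAR1 aperture that
# nvidia-smi reports. With ReBAR on, BAR1 covers (roughly) all of the VRAM
# instead of the legacy 256 MiB window. Assumes one GPU in the system.
import re
import subprocess

out = subprocess.run(
    ["nvidia-smi", "-q", "-d", "MEMORY"], capture_output=True, text=True
).stdout

# nvidia-smi prints a "Total : <n> MiB" line for FB memory and one for BAR1.
totals = [int(m) for m in re.findall(r"Total\s+:\s+(\d+)\s+MiB", out)]
fb_total, bar1_total = totals[0], totals[1]

print(f"VRAM: {fb_total} MiB, BAR1 aperture: {bar1_total} MiB")
print("Resizable BAR likely", "ENABLED" if bar1_total >= fb_total else "DISABLED")
```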
The graphics card has the Nvidia GA102 core (200-K1-A1) with 8704 shaders and 10 GB of GDDR6X memory connected to a 320-bit bus. Also worth highlighting is the specified boost clock of 1905 MHz; in practice it will, as usual, be higher, and we'll see by how much. Detailed specifications of the Aorus RTX 3080 Xtreme 10G can be found in the table below.
The RTX 3080 Xtreme 10G is attractive and rare especially thanks to its video output selection. There are three HDMI ports (2× 2.1 + 1× 2.0) and the same number of DisplayPorts (1.4a). You can only use four of the six available connectors at a time, but that is still very useful, and this configuration gives you more freedom than other cards. Anyone who needs multiple HDMI outputs will appreciate it, since HDMI is usually limited to a single connector.
Gigabyte used the card's high profile to mount a 1.8″ display. It can show a variety of practical readouts such as clock speed, temperature or GPU usage, but you can also put any photo or GIF on it. Viewing angles are good, as it's an IPS panel. Still, it is an LCD, so in the dark the rectangular panel border will stand out due to imperfect blacks, although the backlight is quite even.
Ampere has finally landed on laptops, with Gigabyte's 2021 refresh of the Aorus 17G representing our first chance to look at how the RTX 3080 performs on mobile. This refresh also brings a 300Hz display to an Aorus laptop for the first time, plus sees the return of Aorus' mobile mechanical keyboard. But the GPU still steals the show here, for two particular reasons. The first is to see if mobile Ampere is enough to propel this machine onto our list of the best gaming laptops, and the second is that it represents yet another way to buy the notoriously rare RTX 30-series GPUs.
Still, the transition to mobile always comes with tradeoffs, so the question remains: does the mobile RTX 3080 live up to the reputation set by its full-size cousins, or will PC owners still be left without many ways to get their hands on the best Ampere has to offer?
Like Gigabyte’s other Aorus laptops, the 2021 refresh of the Aorus 17G unabashedly wears its gamer branding on its sleeve, with an angled hinge, copious vents and a full physical mechanical keyboard. It does look a little cluttered upon opening the lid, mostly due to all the stickers advertising this laptop’s new features, like an RTX GPU and 300Hz screen. But removing those stickers reveals a slick look that speaks to its gaming nature while still not coming across as embarrassing.
Take the lid, which has a simple matte black finish that mostly resists fingerprints and is accentuated only by a single logo in the center. And the hinge, while angled, is also pleasingly rounded. The webcam placement is questionable (it's under the screen), but that affects usefulness more than looks and at least allows for a thin bezel.
Despite looking reasonably restrained for gamer gear, the Aorus is still bulky enough to draw attention. Compared to other high-end gaming laptops, its 15.9 x 10.8 x 1 inch dimensions are matched only by the Asus ROG Strix Scar 17's 15.7 x 11.53 x 1.02 inches. The Razer Blade Pro 17, meanwhile, is far more compact at 15.5 x 10.24 x 0.78 inches, while the similarly RTX-equipped, 15-inch Alienware m15 R4 is also smaller at 14.2 x 10.9 x 0.7 to 0.8 inches (depending on the model).
Despite its girth, though, the Aorus is slightly lighter than its competition at 5.95 pounds. While it's by no means lightweight, only the Alienware's 5.25-pound weigh-in beat it. The Scar 17 and the Blade Pro 17, meanwhile, came in at 6.28 and 6.06 pounds, respectively.
The Aorus' size also means it has plenty of room for ports. On its left side, you'll find two USB 3.2 Gen 1 Type-A ports, an SD card reader, an RJ-45 Ethernet port and two separate 3.5 mm audio jacks, one for headphones and one for microphones. That last feature in particular is a nice upgrade from, well, pretty much every other laptop I've reviewed. The laptop's right side, meanwhile, has an additional USB 3.2 Gen 1 Type-A port, a single Thunderbolt 3 connection, plus both Mini DisplayPort 1.4 and HDMI 2.1. You'll also find the DC-in here, but there's no lock slot anywhere on the Aorus.
What makes the Aorus 17G's 2021 refresh special is that it's the first laptop we're looking at with a mobile RTX 3080 inside. Nvidia Control Center suggested the laptop is utilizing Max-Q technologies. The Aorus 17G is also packing an Intel Core i7-10870H CPU and 32GB of RAM. So, how does the Aorus compare to both the m15 R4, which has the same CPU and 16GB of RAM, as well as powerful Turing laptops like the Scar 17 (i9-10980HK, 2080 Super, 32GB RAM) and the Razer Blade Pro 17 (i7-10875H, 2080 Super Max-Q, 16GB RAM)?
In Assassin’s Creed Odyssey, the Aorus 17G hit an average 65 fps at 1080p on its highest settings, which was slightly behind the m15 R4’s 67 fps average but slightly above the 63 fps average of both the Scar 17 and the Blade Pro 17.
In Shadow of the Tomb Raider, both the Aorus 17G and the Scar 17 had average frame rates of 86 fps, while the m15 R4 was significantly lower at 77 fps and the Blade Pro 17 hit the bottom of the ranking at 75 fps.
Far Cry New Dawn was fairly close across most contenders, with the Aorus 17G scoring 92 fps, the Scar 17 hitting 95 fps and the m15 R4 lagging imperceptibly behind at 91 fps. The slowest contender here was the Blade Pro 17, at 87 fps.
I also personally played Control for about half an hour on the Aorus, using DirectX 12 and High settings. With ray tracing off, I tended to fall between 79 and 84 fps; turning ray tracing on at its high preset lowered that to 46 to 55 fps. The computer never felt hot to the touch during this time, nor did the fans get loud. The frame rate was also stable regardless of the amount of action on screen, though I did notice that it tended to start out at 94 to 105 fps before dropping a few minutes into play, presumably as more assets got loaded.
We also ran the Aorus through our typical Metro Exodus stress test, running the game's 1080p RTX benchmark on a loop 15 times in a row to simulate half an hour of intense gaming. The laptop scored an average frame rate of 59.6 fps, with a CPU clock speed of 3.47 GHz and a GPU clock speed of 1.19 GHz. The average CPU temperature during this time was 77.32 degrees Celsius (171.18 degrees Fahrenheit), while the average GPU temperature was 75.62 degrees Celsius (168.12 degrees Fahrenheit).
We’ve seen how the Aorus 17G handles games, but what about the productivity software that gaming laptops so frequently moonlight in? The Intel Core i7-10870H and 32GB of RAM provide a solid amount of power.
In Geekbench 5.0, which is a synthetic general productivity benchmark, the Aorus 17G scored 7,895 points on multi-core tests and 1,265 on single-core tests. That puts it above both the Razer Blade Pro 17's 5,776/1,179 points and the Alienware m15 R4's 7,642/1,252 points, but behind the Asus ROG Strix Scar 17's 8,708/1,290 points.
In Handbrake, where we track how long it takes laptops to transcode a 4K video down to FHD, the Aorus 17G jumped down to third place, with a time of 8:33. That’s slower than the Scar 17’s 7:06 and the m15 R4’s near-identical 7:07, but still beats the Blade Pro 17’s 9:31.
The Aorus 17G was also in third place in our file transfer test, where we measure the rate at which laptops can move 4.97GB of files. The Aorus did so at 845.02 MBps, roughly on par with the Blade Pro 17's 844 MBps but a far cry from the Scar 17's 1,570.76 MBps or even the m15 R4's 1,055 MBps.
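At its core, this test boils down to timing a bulk copy and dividing size by elapsed time. Here is a toy version of that measurement; the file names are placeholders, and our real test uses a fixed 4.97GB mix of files rather than a single file:

```python
# Toy version of the file-transfer test: time a copy and report MBps.
# SRC/DST are placeholder paths; the real test copies a 4.97GB file mix.
import os
import shutil
import time

SRC, DST = "testdata.bin", "copy_of_testdata.bin"  # placeholders

size_mb = os.path.getsize(SRC) / 1e6               # bytes -> MB
start = time.perf_counter()
shutil.copyfile(SRC, DST)                          # the timed bulk copy
elapsed = time.perf_counter() - start

print(f"Copied {size_mb:.0f} MB in {elapsed:.2f} s -> {size_mb / elapsed:.1f} MBps")
```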
Aside from its GPU, the other key addition to the Aorus 17G is its 300Hz IPS-level display. I tested this screen in two ways: first, I watched a trailer for WandaVision (one with color and widescreen, don't worry), and second, I played Overwatch on it.
In WandaVision, I was impressed by the color quality and even the depth of blacks, but found viewing angles and reflectivity to be a big problem. While vertical viewing angles held up almost completely, the screen's image washed out whenever I strayed more than 45 degrees off-axis horizontally. Even more problematic was glare: I had to be sure the screen was pointed away from any light source, or its reflection would bounce back at me even within perfect viewing angles.
Looking at our benchmarking results, I was surprised to see that the Aorus 17G actually covers less of the DCI-P3 color space than its competitors. It tops out at 79%, versus the Scar 17's 88.5%, the Razer Blade Pro 17's 84.1% and the Alienware m15 R4's whopping 149.5% (thanks to its slower OLED screen). This isn't something I noticed much in practice, though: while my colors didn't come across as flat, neither were they especially vivid.
The same pattern applied to brightness. The Aorus 17G had 300 nits of average brightness, while the Scar 17 had 336 nits, the Blade Pro had 304 nits and the m15 R4 had 362 nits. 300 nits of brightness was plenty for my purposes, and was a welcome increase over the unfortunately-dim 243 nits I saw on my last Aorus laptop.
While not new to the Aorus line, another key way this laptop differentiates itself from the competition is its full physical mechanical keyboard. It's got a number pad and full-size keys, plus an easy-to-read Arial font and media controls baked into its Fn row. Full per-key RGB and clicky low-profile Omron switches with 2.5 mm of key travel and a 1.6 mm actuation point make the gaming implications obvious, so I tested it in both general typing and Overwatch.
The Aorus 17G has the type of battery life you'd expect from a high-powered gaming laptop, which is to say "not much." It clocked in at 4:42 during our battery life test, which continuously streams video, browses the web and runs OpenGL tests over Wi-Fi at 150 nits of brightness. That's about on par with the Razer Blade Pro 17's 4:41 and longer than the Alienware m15 R4's 4:01, but still falls short of the ROG Strix Scar 17's 5:25.
We tested the Aorus 17G’s heat after 15 minutes of streaming video on YouTube, and found that it stays cool during non-gaming use. Its touchpad was the coolest touchpoint on the laptop at 71.4 degrees Fahrenheit (21.89 degrees Celsius), while the center of the keyboard (between the G&H keys) was slightly hotter at 75.2 degrees Fahrenheit (24 degrees Celsius). The bottom of the laptop generally hit 81.9 degrees Fahrenheit (27.72 degrees Celsius), but the center of the bottom, which is just below the vents, did hit 85.5 degrees Fahrenheit (29.72 degrees Celsius).
The Aorus 17G suffers from what we like to call a "nosecam." Placed below the screen rather than above it, this webcam has the unfortunate tendency to look directly up your nose. The idea is usually to save bezel space, but we have to wonder whether the unflattering angle is worth it. You can rectify it a little by stretching in uncomfortable ways, but if you're looking directly at your screen, be prepared to show off your nostrils, your chin and pretty much everything a good selfie avoids.
Quality is mixed, with accurate color and decent shadows, but fuzzy texture. On the plus side, the Aorus 17G’s webcam does come with a sliding privacy cover.
The Aorus 17G comes gracefully free of bloat, with the only examples we could find being standard Windows pre-installs like Microsoft Solitaire Collection and Spotify. In addition to these, you’ll also find utility apps like Nahimic Companion, Intel Graphics Command Center and Thunderbolt Control Center. These let you adjust and customize your audio and display as well as check what’s attached to your Thunderbolt ports.
The Aorus 17G comes in two configurations: one with an RTX 3080, dubbed the Aorus 17G YC, and one with an RTX 3070, listed as the Aorus 17G XC. We reviewed the 3080 configuration, which is $2,699. The configurations are otherwise identical, each packing an Intel Core i7-10870H, up to 64GB of DDR4-2933 RAM (our unit had 32GB) and 1TB of SSD storage. Both also have the same 17.3-inch, 300Hz IPS-level display.
I'm of two minds on the 2021 refresh of the Aorus 17G. While I was hoping for a laptop equipped with a mobile RTX 3080 to far outperform its 2080 and 2080 Super cousins, what I instead got was a machine largely on par with them in performance. However, the Aorus 17G is also about $1,000 cheaper than its competitors, even with the same CPU and memory/SSD loadouts.
Which brings us to the display. This is the first Aorus with a 300Hz option, and it’s just as responsive and satisfying as you’d think. The tradeoff here is that the screen is limited to FHD, and while it is IPS-level, its color and brightness don’t quite hit the peaks of its competitors. The 2021 refresh of the Aorus 17G also sees the return of its physical mechanical keyboard, though its featureless keycaps and awkward height leave it a little more useful for gaming than typing.
While I'd love to see an Ampere laptop pushing out significantly more frames than the competition, I have to compare it to what we have benchmarks for right now. And on that basis, it's still plenty enticing. The Aorus 17G gives you similar power to what you'd find in an Asus ROG Strix Scar 17 G732 or Razer Blade Pro 17 for almost $1,000 less, plus a 300Hz screen and a physical mechanical keyboard.
That said, we have recently reviewed another Ampere laptop, the Alienware m15 R4, which comes with a mobile RTX 3070 as opposed to a 3080. The upside here is that the Alienware lets you choose between a 300Hz screen or a 4K OLED, which drops the refresh rate to 60Hz but far eclipses the Aorus on color and brightness. You’ll also gain some performance on Handbrake and file transfer speed, but will generally be weaker on gaming. At $2,499 against the Aorus’ $2,699 (or $2,099 if you go for the RTX 3070 configuration), it’s up to you if those seem like worthwhile tradeoffs.
With the AORUS XTREME, Gigabyte brings a mightily impressive product to the market. It is deeply factory-tweaked, comes with an extended power allowance, and carries a cooler as thick as a brick. Aesthetically it is a very pleasing product too, with subtle RGB elements and, of course, that LCD info/animation screen. Now, you'd think that with the extra power available, the beefed-up VRM, and the increased boost clock frequency, this card would be tremendously faster than reference. Ehm, no. As we have stated many times already, boost frequency matters less these days, as the power limiter dumbs down performance the second your maximum wattage is reached. For this product in performance mode, that means 4% to 5% additional out-of-the-box performance over reference.
Our performance paragraph is a generic one used in all our RTX 3080 reviews, as performance is more or less the same for all cards and brands. Gaming it can do well, with exceptional values. Yes, at Full HD you'll quite often be bottlenecked and CPU limited. But even there, in some games with proper programming and the right API (DX12/async compute), the sheer increase in performance is staggering. The good old rasterizer engine comes first, as it is still the leading factor. Purely from a shading/rasterizing point of view, you're looking at 125% to 160% of the performance of the similarly priced GeForce RTX 2080 (SUPER), which is a tremendous step. The sheer number of shader processors is staggering. The new FP32/INT32 combo clusters remain a compromise that works exceptionally well in most use cases, but not all of them. Even then, there are so many shader cores that not once was the tested graphics card slower than an RTX 2080 Ti; in fact (and I do mean in GPU-bound situations), the RTX 3080 stays ahead of the RTX 2080 by a relative margin of at least 125%, but more often 150% and even 160%. Performance-wise we can finally say, hey, this is a true Ultra HD capable graphics card (aside from Flight Simulator 2020, haha, that title needs D3D12/async and some DLSS!). The good news is that any game using traditional rendering will run excellently at 3840x2160. Games that can ray trace and manage DLSS also become playable in UHD. A good example was Battlefield V with raytracing and DLSS enabled, now running in that 75 FPS bracket in Ultra HD. Well, you've seen the numbers in the review; I'll mute now. As for DXR raytracing and Tensor performance: the RTX 30 series has received new Tensor and RT cores, so don't let the RT and Tensor core counts confuse you. They sit close inside the rendering engine, have become more efficient, and that shows.
If we look at an RTX 2080 in Port Royal, we hit almost 30 FPS. The RTX 3080 nearly doubles that at 53 FPS. Tensor cores are harder to measure, but overall, from what we have seen, it's all in good balance. Overall though, the GeForce RTX 3080 starts to make sense at Quad HD resolution (2560x1440), but again, I deem this to be an Ultra HD targeted product. For 2560x1440, I'd see the GeForce RTX 3070 playing a more important role in terms of sense and value for money, and at Full HD, the inevitable GeForce RTX 3060, whenever that may be released. Games like Red Dead Redemption 2 will make you aim, shoot, and smile at 70 FPS in UHD resolution with the very best graphics settings. As always, comparing apples and oranges, the performance results vary here and there, as each architecture offers advantages and disadvantages in certain game render workloads. So, for the content creators among us, have you seen the Blender and V-Ray NEXT results? No? Go to page 30 of this review, and your eyes will pop out. The sheer compute performance has nearly doubled, a big step in the right direction. We need to stop for a second and talk VRAM, aka framebuffer memory. The GeForce RTX 3080 is fitted with new GDDR6X memory clocking in at 19 Gbps, which is a freakfest of memory bandwidth that the graphics card really likes. You'll get 10GB of it. I can also tell you that there are plans for a 20GB version; we think the 20GB version was initially meant to be the default, but for reasons none other than the bill of materials, it became 10GB. In the year 2020, that is a very decent amount of graphics memory. Signals are that the 20GB version may become available later for those who want to run Flight Simulator 2020; haha, that was a pun, sorry. We feel 10GB is fine right now, but with DirectX Ultimate, added scene complexity and raytracing becoming the new norm, I do not know if that's still enough two years from now.
The power draw under intensive gaming for the GeForce RTX 3080 remains significant. We measured close to 400 Watt at its peak, and the typical power draw under load is roughly 375 Watt. That is steep for an RTX 3080 rated at 320W at reference defaults. Idle power consumption was also high at 28W; we suspect the RGB and LCD setup is responsible for this. We advise a 750 Watt model at a minimum, as the rest of the system needs some juice and you will want some reserve.
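The 750 Watt recommendation follows from simple headroom math. A rough sketch below; the GPU figure is our measurement, while the CPU and rest-of-system figures are illustrative assumptions, not measured values:

```python
# Back-of-envelope PSU sizing. The GPU number is from our measurements;
# the CPU and "rest of system" figures are assumptions for illustration.
gpu_peak_w = 400        # measured peak draw of this card
cpu_peak_w = 150        # assumed high-end gaming CPU under load
rest_w = 75             # assumed fans, drives, RAM, motherboard, etc.

system_peak_w = gpu_peak_w + cpu_peak_w + rest_w   # ~625 W
psu_w = 750

print(f"Estimated system peak: {system_peak_w} W")
print(f"750 W PSU headroom: {psu_w - system_peak_w} W "
      f"({100 * (1 - system_peak_w / psu_w):.0f}% reserve)")
```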
This GeForce RTX 3080 hardly exhibited coil squeak, much less than the Founders card we tested. Is it disturbing? Well, no; it's at a level where you can hear it softly if you put your ear next to the card. In a closed chassis that noise would fade away into the background, though with an open chassis you can hear a bit of coil whine/squeak.
AIB products are deemed (and damned) to be called the more premium products. And as I already told you, that's no longer the case, as NVIDIA's Founders cards compete directly with the AIB products. In a perfect scenario, I would like to see the AIB product priced below the Founders Edition; that's not the case here. This card will be more expensive than the Founders Edition card. The price is currently set at 1350 EUR incl. VAT (in the Netherlands); this will vary per country and, of course, with availability. It is incredibly expensive for an RTX 3080, if you can find one to purchase at all.
The card actually tweaks well for an RTX 3080. Gigabyte has already maxed out the power limiter for you; add ~100 MHz on the GPU clock and you get observed boost frequencies towards 2100 MHz (this depends on and varies per game title/application). Remember that on the SILENT BIOS mode you could go even a bit higher. The memory was binned well too; we reached a beautiful 21 Gbps. All in all, that brings a very healthy 8% performance premium over the reference model.
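To put that 19 to 21 Gbps memory overclock in perspective, bandwidth scales directly with the effective data rate over the 320-bit bus. A quick check of the numbers (straightforward arithmetic, not a measurement):

```python
# Memory bandwidth from effective data rate and bus width:
# bandwidth (GB/s) = data rate (Gbps per pin) * bus width (bits) / 8
bus_bits = 320

stock = 19 * bus_bits / 8    # 760 GB/s at the stock 19 Gbps
tuned = 21 * bus_bits / 8    # 840 GB/s at the 21 Gbps we reached

print(f"Stock: {stock:.0f} GB/s, tuned: {tuned:.0f} GB/s "
      f"(+{100 * (tuned / stock - 1):.1f}%)")
```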
Gigabyte offers a gorgeous-looking product with the AORUS XTREME, really nice. Powered down it's a bit of a big brick to look at, but when you turn on that PC of yours, everything comes together. Gigabyte did things right with the factory tweak; I mean, 1905 MHz is the highest clocked value we have seen next to the MSI SUPRIM. There's no room left on the power limiter either; they opened it up completely at defaults for you. The product comes with a dual BIOS, and for good reason: we feel that performance mode, measured at 42 dBA, is a bit too loud for a product in this category and price range. At the cost of very little performance, the silent BIOS mode brings that back to roughly 38 dBA under gaming load. Truth be told, though, I did expect better acoustic values from this ginormous cooler. The two outer fans spin clockwise and the smaller middle one counter-clockwise "to prevent turbulence", but it is exactly that middle fan the noise is coming from, as the card becomes silent when I slow it down with my finger. Gigabyte really should look into its own thesis. Lovely is the RGB setup, and beautiful is the little LCD screen that can display a whole lot of things; you will need to activate it with Gigabyte's software suite, though.

So, all the extras like the newly defined looks, backplate, LCD, cooler, and dual BIOS: is it worth a price premium? We doubt that a little, but it is over-engineering at its best. Nvidia's project Greenlight dictates that all cards sit more or less in the same performance bracket, and that results in a meager 3~4% additional performance over the FE edition; that rule of thumb goes for all amped-up and beefed-up products. Make no mistake, it's lovely and fantastic, but is it worth the price premium? We doubt it. Gigabyte's challenge is the dBA values; they preferred a temperature of roughly 65 degrees C over acoustics. I think I would have been fine with, say, 75 degrees C and slightly lower acoustics, but that is a dilemma based on a personal and thus more subjective note.

We can only acknowledge that the sheer performance this card series brings to the table is nothing short of impressive. The new generational architecture tweaks for raytracing and Tensor cores are also significant. Coming from the RTX 2080, the RTX 3080 exhibited a roughly 85% performance increase, and that is going to bring hybrid raytracing towards higher resolutions. DXR will remain massively demanding, of course, but when you can play Battlefield V in Ultra HD with raytracing and DLSS enabled at over 70 FPS, hey, I'm cool with that. This card, in its default configuration, sits roughly 4% above Founders Edition performance. Of course, pricing will be everything, as the AIB/AIC partners need to compete with an excellent Founders Edition product. Gigabyte did a marvelous job with the AORUS XTREME, but in the end that choice rests at the end-user level: availability and pricing. It's over-engineered in all its ways but, granted, we do like that. This has to be a top pick.