I think you’re now trying to get video through the video card, when the mobo’s BIOS has actually been reset to use the onboard video (that happened when you replaced the PSU). You need to make sure you get video out through the output that is actually enabled in the BIOS. To do that, do the following:

Plug the monitor into the mobo’s VGA output (as opposed to the card’s output). After you do that, reset the BIOS to factory defaults (remove the CMOS battery for a little while and reinsert it; make sure the mobo is unplugged during all that), then plug the PC back in and power it on. It should let you go into the BIOS by default (again, using the mobo’s onboard video, not the card’s video output). Once in the BIOS, select the discrete card as the video output, save and power off. Plug the monitor into the card’s output and reboot. This time the video output should come through the discrete card.

As shared by @momomo_us on Twitter, ASRock has built a new accessory for PC builders that allows you to turn your PC chassis’ side panel into an LCD monitor. The gadget is a 13.3" side panel kit designed to be taped to the inside of your see-through side panel, giving users an additional display for monitoring system resources and temperatures, or for use as a secondary monitor altogether.

The screen is a 13.3-inch (diagonal) 16:9 IPS display running 1080p at 60Hz, essentially the equivalent of a laptop panel. It uses the same connection method as laptops, too, featuring an embedded DisplayPort (eDP) connector.

Unfortunately, this presents a problem for most PC users. The connector was originally designed specifically for mobile and embedded PC solutions, meaning it is not available on standard desktop motherboards or graphics cards.

As a result, only ASRock motherboards support the side panel, and only a few models at best, with fewer than ten motherboards featuring the eDP connector. The list includes the following motherboards: Z790 LiveMixer, Z790 Pro RS/D4, Z790M-ITX WiFi, Z790 Steel Legend WiFi, Z790 PG Lightning, Z790 Pro RS, Z790 PG Lightning/D4, H610M-ITX/eDP, and B650E PG-ITX WiFi.

Sadly, adapters aren’t a solution either, since eDP-to-DP (or any other display output) adapters don’t exist today. Furthermore, creating an adapter is problematic because eDP runs both power and video signals through a single cable.

It’s a shame this accessory won’t reach mainstream popularity due to these compatibility issues. But for the few users with the right motherboard, this side panel kit can provide a full secondary monitor that takes up no additional space on your desk. The only sacrifice you’ll make is blocking all the shiny RGB lighting inside your chassis.

Display image quality can be measured in many ways, but I have always maintained that contrast is the most important element. A broad range of tones from the deepest blacks to the brightest whites produces a picture that looks dimensional and realistic while best replicating how we see the world around us. Color is also a factor, and monitors with large color gamuts set themselves apart with rich, vibrant hues that come close to the limits of human vision.

The technology that best delivers these things is Organic Light-Emitting Diode, or OLED, and its variants. Today, we have a chance to check out a new Kickstarter project, the Zion Pro 15.6-inch portable AMOLED monitor. What makes this monitor stand out is that it brings AMOLED to a portable form factor while maintaining touch functionality, a large color gamut and an impressive 100,000:1 contrast ratio. Does the Zion Pro have the goods to back up its $600 price tag? Read on to find out.

The Zion Pro is an Active-Matrix OLED (AMOLED), which means in addition to its printed diode layer, it adds a thin-film transistor (TFT) layer to control the individual pixels. In theory, this should produce faster response and greater contrast. My initial measurements of the Zion Pro show no perceptible difference in contrast versus other OLED panels. Like every other OLED screen, it has a black level too low to detect with any instruments. Its contrast is therefore effectively infinite.

Response time is also difficult to quantify in this case. The Zion Pro runs at 60 Hz, which means I cannot make a fair comparison to an OLED gaming monitor running at 120 or 144 Hz. The Zion Pro also leaves out Adaptive Sync. It isn’t a gaming monitor in the traditional sense, but I found it took care of business in Doom Eternal and other titles better than expected.

Portability and image quality are the two design focuses here, and the Zion Pro excels in both. The 15.6-inch screen is thin (just 4mm with a 10mm bulge at the bottom) and light (less than 1kg). The sample I received came with a nice neoprene sleeve that slips into any laptop bag. The only thing it lacks on that front is a battery. It must be plugged in to operate using the included wall-wart power supply and one of the panel’s USB-C ports.

The resolution is Ultra HD, 3840x2160 at 60 Hz. The Zion Pro also supports HDR10 signals and has a huge color gamut covering over 110% of DCI-P3. Its connectivity includes HDMI 2.0 and USB-C, so it can easily mate with a laptop or desktop PC. A small flip-out stand props it up like an easel.

The Zion Pro is also a touch screen with a 10-point capacitive layer like a tablet. It interfaces with Windows touch functions and lets you use your finger instead of a mouse or trackball. Functions like pinch to zoom, hold for right-click, and object drag are all supported.

The Zion Pro sample I received is an early production model. The package included the panel, a power supply, HDMI, USB-C-to-C and USB-C-to-A cables, and the neoprene bag mentioned above. It was secured with flexible foam in a slick white box.

The first thing I noticed was the Zion Pro’s slimness. It is only 4mm thick for most of its height, with a 10mm bulge at the bottom to accommodate internal components, control keys and inputs. Considering that the holes for HDMI and USB-A ports are 6mm high, that bulge makes sense. The silver parts of the panel are all aluminum and very rigid. The screen is glass with an oleophobic coating like a phone or tablet. It’s very slippery, resistant to fingerprints and easy to wipe clean with a lint-free cloth. It also picks up every stray reflection, so you’ll want to be careful around bright lights and sunny windows. Unlike an LCD, an OLED’s image doesn’t wash out when ambient light is present. But you’ll see yourself in any black areas of the image if the environment is bright enough.

A small flip-out stand – also constructed of aluminum – is used to prop up the monitor. It has grippy rubber feet that keep it from sliding around. The stand also stays where you put it to adjust the angle easily and securely.

Inputs are on the left side and include one HDMI 2.0 and one USB-C. The second USB-C is for the included power supply, which has a very short cord. It can’t be removed from the wall wart, so you can’t substitute an aftermarket USB-C cable. However, I could power the Zion Pro from a laptop’s USB-C port, so there are options besides the included power supply.

There are two control keys on the right side, one that toggles power and activates the OSD and another rocker-style switch to change volume and menu settings. A USB-A downstream port can accommodate a mouse or keyboard if the Zion Pro connects to its host via USB-C.

The touchscreen is fully compatible with Windows touch functions once you configure it in Control Panel. You can use your finger to replace a mouse or other pointing device and the OSD is also touch-sensitive. Pinch to zoom, object drag and hold for right click all function properly.

To activate the OSD, press the smaller of the two control keys on the right side. It’s entirely icon-based with no text to help one decipher its functions. Most things are obvious, like brightness and contrast. However, the color modes are a bit esoteric. Some text labels would be a logical addition.

The OSD is divided into four sub-menus. The first controls brightness, contrast, color saturation & tint, and sharpness. The latter defaults to 1, which is fine, but 2 makes the picture a tad sharper. Higher settings cause edge enhancement, which detracts from image quality.

The second menu, the sun icon, controls color temperature. You can choose either 9300K, 6500K, or a user mode. User lets you slide the color temp up and down as a whole or tweak the individual RGB levels. The settings are quite coarse, so it’s hard to achieve a precise calibration. With that said, I came pretty close, and the Zion Pro has decent accuracy out of the box.

The third menu is for the picture modes. The icons are fairly obvious for game and movie. I’m assuming the bottom left one is photo, and the center looks like a landscape. The top left TV icon is the default mode, which I dubbed Standard. The two on the right (top and middle) flip the image 180 degrees. DCR is dynamic contrast, which makes little difference in a panel with infinite contrast. And HDR Auto does just what it says. HDR signals switch the Zion Pro automatically. This menu also has a volume slider you can move with your finger or the side rocker key.

Calibrating the Zion Pro isn’t strictly necessary, but it will improve picture quality. The default image is a bit warm in tone, but color saturation is so high that it’s hard to see an issue in most content. Contrast is tremendous in every case and gamma tracks 2.2 almost perfectly. I tweaked the RGB sliders and made a visible improvement. Whites are more neutral, and color pops just a bit more after adjustment. Check out my settings below if you want to tweak your Zion Pro.
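For context, gamma 2.2 tracking means the measured luminance at each signal level follows a 2.2 power law relative to peak white. A quick sketch of the target values the meter should see:

    # Target relative luminance under a 2.2 power law
    for signal in (0.2, 0.5, 0.8):
        print(f"{signal:.0%} input -> {signal ** 2.2:.1%} of peak white")
    # 20% input -> 2.9%, 50% input -> 21.8%, 80% input -> 61.2%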

I am spoiled by the high-performance gaming monitors that typically visit my lab. So, I didn’t expect the Zion Pro to compete with them. And that’s fair. Zion doesn’t bill this screen as a gaming monitor after all. But since that’s my focus, I spent plenty of time gaming on it.

The 60fps limit was not as noticeable as one would think. While the eerie smoothness of a 240 Hz screen cannot be duplicated, moving through first-person gaming environments was far better than I’ve seen from 60 Hz LCD panels. There is no stutter or visible lag. Fine detail goes a bit soft when moving the mouse quickly, but it doesn’t break up or ghost. In fact, there were no visible artifacts.

I haven’t gamed on an Ultra HD monitor smaller than 27 inches in a long time. This kind of pixel density is something normally seen only on a tablet or smartphone. 282ppi is definitely a level up, so I can see the attraction of gaming laptops when playing on the Zion Pro. That, coupled with OLED’s phenomenal contrast, delivers a picture that competing technology can’t duplicate.
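That 282ppi figure falls straight out of the resolution and screen size; here is a quick check of the arithmetic:

    import math

    def ppi(width_px, height_px, diagonal_inches):
        # Pixel density = diagonal pixel count / diagonal size in inches
        return math.hypot(width_px, height_px) / diagonal_inches

    print(round(ppi(3840, 2160, 15.6)))  # -> 282 for the Zion Pro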

The Zion’s enormous color gamut contributed to this as well. Doom Eternal is a great test of the color red. Most of its environments are drenched in blood, lava, or some other red substance. It was hard to stop playing, even though I had to squint to see small objects on the relatively tiny screen. This game looks great in SDR and HDR. I had to tweak the HDR calibration sliders a bit in the game’s menu, but the image was stunning in every respect.

I also didn’t miss Adaptive Sync, which surprised me. I could not get the image to tear or break up in any way when playing Doom Eternal or Call of Duty WWII. I chalk this up to the tremendous number of natural textures that have very few straight vertical lines to call attention to frame tears. I just never saw them.

Working with the Zion Pro was super fun and easy once I enabled Windows’ touchscreen support. I never needed the mouse when editing text or navigating spreadsheets. The panel is too small for photo editing beyond the basic level, but it serves that function well if you need to finish a letter or report.

My only complaints are ergonomic ones. The power cord is far too short when plugging the Zion Pro into a wall socket. It’s best to connect it via USB-C to another monitor, laptop or PC so you can place it where you like. I also found the OSD a bit clunky with its unlabeled icons. That said, it doesn’t need any tweaking once you’ve set it up. The speakers are also in the realm of an afterthought. They’re fine for system sounds but anything resembling music or effects sounds tinny.

Minor nitpicks aside, the Zion Pro is an excellent display with a stunning picture and surprising game performance. With a few tweaks to its firmware, it will be a home run.

There are plenty of great pre-built gaming PCs on the market, but there’s nothing quite like the satisfaction of using one that you create yourself. While the process of assembling a computer isn’t difficult, it’s daunting the first time you do it. For those embarking on their first build, we’ve put together a step-by-step guide to building a PC.

Before we start, know that this is a guide exclusively dedicated to assembly. That means you’ll need to pick out your parts first. Our lists of recommended cases, CPUs, GPUs, motherboards, SSDs, power supplies and RAM, along with our buying guides, can help you choose key components.

The other thing to know is that no two builds are identical. The order we’re going in here is based partly on preference and also based on the needs of the build. For instance, if you have a large aftermarket cooler that blocks the DIMM slots, you may have to go in a different order than we did, or backtrack and pull out a part here or there to make room for a particularly bulky part or cramped case. More advanced options like liquid cooling and RGB lighting, as well as high-end CPU platforms like Intel’s Core X and AMD’s Threadripper, also aren’t covered in this guide.

Before you start building a PC, you need to get your workspace ready. Make sure that you have all of your parts and tools at the ready. At the very least, you’ll want a Phillips-head screwdriver and some band-aids.

Some builders swear by anti-static equipment, including mats or wrist straps. But as long as you don’t live in a particularly dry environment, you’re not building on a metal surface (opt for wood or plastic) and you aren’t rubbing your socks on a carpet while building, you should be able to avoid shorting out your PC or parts. Of course, there’s nothing wrong with playing things safe. So if you’re worried about static, take the appropriate precautions.

While some prefer to mount the motherboard in the case before they do anything else, with many builds it’s easier to connect key components like the CPU and the RAM without leaning over inside the chassis.

The arrow/triangle on the top of the CPU needs to line up with one on the socket or the socket cover. Don’t attempt to install a CPU with the arrow facing the wrong direction, or you could damage your chip, your board, or both! Once your CPU and socket are properly aligned, you can drop the chip in place, and it will settle into the socket under its own weight. If it doesn’t, pick up the CPU and re-seat it. Don’t force the processor into the socket or you’ll almost certainly damage something. Once you’ve got the CPU settled correctly in the socket, press the tension lever back down (on Intel motherboards like the one in the image above, this will also include a metal plate that holds the CPU in).

Note that the above instructions pertain to the mainstream platforms for AMD (the AM4 socket) and Intel (socket LGA 1700). Enthusiast platforms like Intel’s Core X (LGA 2066) and AMD’s Threadripper (sTRX4) have different/more complex CPU installations, with the Intel chips involving two levers and Threadripper requiring Torx screws and a slide-in plastic bracket. The Threadripper CPU install process in particular is tricky and, given the price of chips and TRX40 motherboards, we would not recommend Threadripper as your first PC build platform.

Once the cooler is installed, plug the fan connector into its header on the motherboard. This is usually somewhere close to the CPU socket and labeled something like "CPU_FAN."

Stock coolers for Intel processors use push pins that go through holes in the motherboard. We recommend pushing opposite corners in to evenly spread the thermal paste, and to keep from putting uneven pressure on one side of the CPU. AMD stock coolers have metal arms that snap into notches on a plastic bracket on either side of the socket. Aftermarket coolers mount in various ways, so be sure to consult the instruction manual, as mounting aftermarket coolers can be surprisingly complicated, often involving a large backplate that has to be mounted behind the motherboard.

Installing RAM is a snap--literally. First, make sure that the latches for each memory slot are open. Some boards have latches on both sides of a RAM slot, while others--often budget boards--have a latch on one side, with the other end fixed in place. Once your latches are opened, look at each DIMM and position it over the slot such that the small divot on the bottom of the RAM stick is aligned with the matching bump on the board. Finally, push down on the DIMM on each edge until it snaps into place, causing the latches to close on their own. The process requires a bit of force, but if you’re having trouble, make sure that you’re not putting the module in backwards.

If you’re installing two RAM sticks in a board that has four slots, check the motherboard manual to make sure you’re installing your DIMMs in the right slots. If you put them in the wrong slots you may not get the best performance possible, or one of the sticks may not be recognized by the motherboard/operating system.

To install an M.2 SSD, remove the screw located across from the M.2 slot if one comes preinstalled, then slide the SSD in at an angle. Make sure the notch lines up with the slot, similar to RAM installation. If the notch doesn’t line up, your drive may not be compatible with that slot. Slowly lay the SSD flat and secure the mounting screw. This tiny screw is easy to drop, which is another reason to install M.2 drives before putting your motherboard into the case.

Now that we’ve built the core platform (minus the graphics card, which we’ll do later), we’re going to install the CPU and RAM-equipped motherboard in the case. If you haven’t yet, remove the side panels on your chassis. Most cases have thumb screws holding their panels in place, which makes it easy to remove them.

First, gather the standoffs that came with your case and find the proper place to install them. They’re likely marked on the case based on the size of the motherboard you chose. Many cases have standoffs preinstalled, so you may be able to skip this step. If standoffs are preinstalled in the wrong spot for your motherboard, you can use needle nose pliers to get them out.

The I/O shield, which covers the area around your rear ports, comes with your motherboard. You’ll need to fit the shield into the chassis before you install the motherboard itself, making sure it’s right-side up so that your motherboard ports will fit through the holes once both are installed. You’ll have to use some force to snap all four corners into place. Be careful of sharp edges (that’s why you have the band-aids) as well as metal bits that can block the ports--especially if you have a budget motherboard.

Now, it’s time to put the motherboard in. Make sure the holes on the motherboard line up with the standoffs you installed and that the ports line up with the cutouts on the I/O shield. Once the board is in, put the screws into the standoffs to anchor the motherboard in place.

The PSU is usually mounted at the back of the case. Sometimes you’ll find it at the top, but it’s usually mounted at the bottom, where it can pull in cool air from under the chassis. Once you put it in place, it’s generally as simple as screwing it in with four screws at the back of the case. Then, plug the 24-pin power connector and supplemental/CPU power connector into the motherboard.

We added M.2 storage earlier, so now it’s time for SATA drives, which could be a 2.5-inch SSD or hard drive, or a traditional 3.5-inch hard drive. Connect the SATA data cable to the motherboard and your drive or drives, then connect the SATA power connector from the PSU to your drive(s). Mount the hard drive or SSD in the appropriate bracket and screw or snap it into place. Note that bracket/drive mounting methods and placement vary by chassis model.

This is an optional step. If you’re using an Intel or AMD CPU with integrated graphics and don’t plan on serious gaming, you may not need or want a discrete graphics card. Many AMD CPUs, as well as high-end Intel models, don’t have on-board graphics, though, and will require a graphics card in order to connect and output to your monitor.

To install the GPU, you’ll likely have to remove some slot covers on the back of the case so that the HDMI, DVI and other ports show through, letting you connect your monitor(s) later.

Insert the GPU into the PCIe x16 slot on the motherboard (it’s the long one, and you’ll want to use the topmost one if there’s more than one on your motherboard). If necessary, plug the PCIe power connectors from the power supply into the card. (You may not need to do this on lower-end cards.)

Most motherboards come with an Ethernet port on them and many also have Wi-Fi built-in. However, if you need wireless access and your computer doesn’t come with a Wi-Fi card, you’ll need to install one in one of the PCIe slots, a short M.2 slot, or attach a USB Wi-Fi dongle. If you’re gaming, an Ethernet connection is probably your best bet for reliable connectivity.

OK, just a few more cables to go until we try turning the PC on. Make sure the connectors for any fans are plugged into the motherboard fan headers. Then, attach the front-panel audio cable, USB 2.0 and USB 3.0 case connectors to those headers. You’ll want to consult your motherboard manual for this, because their location varies by motherboard model.

Lastly, there are the tiny front-panel connectors, including power, reset, HDD activity light, etc. Those need to go to the appropriate pins on the motherboard (usually in the bottom-right corner if your motherboard is mounted in the traditional orientation). You’ll need to consult your motherboard manual to see where each should go, as this also can differ based on board make and model.

Double check that you’re using the right headers. These things are small (and so are their labels), so it’s easy to mess them up if you’re not paying close attention.

Once all that’s done, it’s a good idea to double check that there are no extra fan headers or power cables still waiting to be routed to the right connector. Then plug the PC in, plug in and connect your monitor (to one of the ports on the graphics card, if you’ve installed one) and your keyboard and mouse.

Hit the power button on your monitor, then turn the power supply switch on (on the back of the power supply) and then press your PC’s power button. If everything is working, the PC should turn on and run its POST (power-on self test). Since your operating system isn’t installed yet, you may get an error message about a missing boot drive, or you may get sent straight to the UEFI/BIOS.

Preferably before the build process, you’ll want to make a USB install drive for either Windows 10, Windows 11, or the Linux build of your choice. For Windows 10 or Windows 11, simply navigate to Microsoft’s Download page and click the “Download Tool Now” button. You’ll download and run the Media Creation Tool, which will turn any 8GB or larger USB drive into a Windows install disk. If you don’t already have a Windows 10 or 11 key, you can get one cheap or for free. If you have a problem with the OS, you can try to reset Windows to factory settings.

Once you’ve installed your operating system and connect to the internet for the first time, Windows 10 and Windows 11 are pretty good these days at fetching device drivers. However, you should still go to the manufacturers’ product pages for your parts to make sure you have the latest updates.

Finally, when your OS and drivers are all updated, it’s time to start using your PC! The one that you built. Install some games, stream some movies, edit some photos or videos, chat on Discord — whatever it is you like to do with your PC. And remember: whenever you’re ready to add more features or performance, you can always upgrade.

For several years, we included only monitors with Adobe RGB gamuts in our definition of the professional category. But lately it seems that some displays are fitting that description with only an sRGB colorspace. They offer everything else a working pro could need, like factory calibration, lots of color adjustment options, and high-quality IPS panels with solid build quality. Their advantage is that they cost less: sRGB screens with white LED backlights are often priced at less than half the cost of their wide-gamut counterparts.

Normally, we’d relegate monitors without Adobe RGB to the premium business-class category. But we’ve expanded our rules to include products that offer a factory calibration and focus on color conforming to industry-standard specifications. Today we’re looking at ViewSonic’s VP2771. It’s a 27” monitor with factory-certified preset color modes, a 14-bit internal lookup table, and an optional calibration kit that can create custom setups for just about any purpose.

The VP2771 is an sRGB display with extended bit-depth courtesy of that 14-bit internal lookup table. It has preset modes for sRGB, Rec.709, EBU, SMPTE-C, DICOM-SIM, and even iPhone. Factory certifications are included for the first four standards with color and white point errors under 2dE with correct gamma tracking. In addition, the monitor features ViewModes for different game types, movie watching, and other common tasks. You can also calibrate it yourself using the OSD or, with an optional kit, create up to three custom modes that are stored internally.
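The point of the extended bit depth is calibration headroom: each extra bit doubles the number of tonal steps the monitor can use internally, so adjustments don’t crush an 8- or 10-bit input signal into visible banding. The arithmetic is simple:

    # Tonal steps per channel at various LUT bit depths
    for bits in (8, 10, 14):
        print(f"{bits}-bit LUT: {2 ** bits:,} steps per channel")
    # 8-bit: 256, 10-bit: 1,024, 14-bit: 16,384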

The base panel is an IPS part with QHD resolution and a white LED backlight. The chassis sports solid build quality that befits a monitor with these intentions. It should certainly have appeal as a professional’s tool as long as you don’t need an extended gamut. Does it measure up to its impressive specs? Let’s take a look.

ViewSonic packs all its monitors in oversized cartons with large foam blocks surrounding the contents. The VP2771 comes partially assembled. Once unpacked, just hook the panel on the upright and install four screws. Yes, you will need a Phillips-head screwdriver; no snap-ons here.

The cable bundle includes only a DisplayPort-to-mini-DisplayPort cable and a USB 3.0 cable; there is no HDMI. You’ll need the USB connection to use the calibration kit, which is an add-on option. Our sample included it in the box. It’s called Colorbration and comes with an i1 Display Pro and the appropriate software. It’s based on X-Rite’s application that we’ve seen shipped with some Dell and Samsung monitors. It can create custom calibrations and save them to one of three internal memories.

The front layer is the same 3H-hardness plastic found in most LCD monitors today. It’s precisely fitted to provide maximum clarity through a small air gap. There was no light bleed on our sample. In fact, it had exceptional uniformity out of the box.

The bezel is extremely narrow at just 7mm on the top and sides. You can put multiple panels right up against each other with a very thin line between images. The bottom frame is wider at 13mm and features touch-sensitive controls at the right. They’re marked by tiny white lights that glow at an ideal level for visibility in dark or bright work environments. The power LED is a traditional blue. The controls respond to light pressure and are very precise.

The side view reveals a slim panel at a bit over two inches. There are no USB ports on either edge; they are found on the bottom input panel instead. Around back you’ll find ViewSonic’s familiar power bulge with generous ventilation across the top. There are no speakers built into the VP2771, but if you want sound, there is a headphone output on the bottom. The upright unbolts to reveal a 100mm VESA mount.

The input panel, starting at left, features one upstream and three downstream USB 3.0 ports. You’ll have to connect the monitor to your computer if you intend to use the Colorbration kit. Next is a USB-C port, which can carry video signals. The next two DisplayPorts are version 1.2; one is of the mini variety. Then you have a DisplayPort output for MST daisy-chaining. Finally, you can see the headphone output and the jack for the external power supply.

For many users, a 32-inch monitor is an ideal size for gaming, productivity and entertainment. It’s large enough to watch TV shows and movies in an office or dorm room and creates an immersive gaming experience at a distance of two or three feet. Workday tasks go by quickly thanks to all that screen real estate. Keeping multiple documents open and visible is a snap. Add in a curved panel and the immersion factor goes even higher.

32-inch monitors are usually a premium purchase, but Viotek says otherwise. Its new GNV32DBE is just $270 at this writing. That’s astounding when you consider that it also runs at QHD resolution and sports a 165 Hz refresh rate with Adaptive-Sync. There are plenty of 32-inch 60 Hz enterprise displays that cost more, as do the screens that grace our best gaming monitor list. But how does it perform?

The GNV32DBE starts with a 1500R curved VA QHD panel sporting a claimed 4,000:1 contrast ratio. I measured a good deal more than that, nearly 5,000:1 in fact, which is among the highest of any LCD panel in my experience. That, coupled with a large color gamut that covers almost 87% of DCI-P3, gives this monitor a vivid picture with strong saturation, deep blacks and a lot of dimension. It isn’t the brightest screen out there at less than 300 nits peak, but its size somewhat makes up for that.
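Contrast ratio is simply peak white luminance divided by black luminance, which shows how low the black level must be to produce that number on a sub-300-nit screen. A sketch with an assumed black level (my illustration, not a Viotek spec):

    peak_white_nits = 300    # approximate measured peak brightness
    black_level_nits = 0.06  # assumed black level, for illustration only
    print(f"{peak_white_nits / black_level_nits:,.0f}:1")  # -> 5,000:1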

Video processing receives the full treatment as well. The 165 Hz refresh rate is achieved without an overclock. Adaptive-Sync works equally well on Nvidia G-Sync and AMD FreeSync PCs, though the GNV32DBE has not been certified by Nvidia. It also supports HDR signals, though in that case I found it a bit lacking. Not only is there no dynamic contrast, but brightness is also slightly lower in HDR mode. Like most budget monitors, the HDR and SDR images aren’t that different.

Physically, the GNV32DBE is a simple screen with a solid but non-adjustable stand. There are no speakers or USB ports, nor is there LED lighting. But it delivers the features necessary for good gaming. Countdown timers, aiming points and a frame rate indicator are included in the OSD, which is controlled by a handy joystick.

The all-metal stand comes in two pieces which must be bolted together using the included fasteners and Phillips-head screwdriver. The panel snaps on, or you can use the 100 mm VESA lug pattern for your own mounting hardware. A DisplayPort cable is included, along with a small external power supply.

The GNV32DBE’s styling is understated. Only a logo, printed front and center on the panel’s trim, indicates the branding. The bezel is flush and thin at just 8 mm around the top and sides. A bright blue power LED shines at the bottom right. It flashes red in standby mode. Around the back right is the joystick, which is the monitor’s only control. A long press toggles power while a click calls up the OSD.

In the back, Viotek announces its logo more proudly and adds a few molded-in lines and textures along with red trim at the top and bottom. A round bit sets off the stand’s attachment point. The grills on the sides suggest speakers, but there are none, nor are there USB ports or LED lights. The stand is quite solid, made from crinkle-finished aluminum, and offers 5/20 degrees of tilt. There is no height adjustment, but the panel sits at a good position for typical desk furniture. I only had to tilt it slightly upward to meet my eyes. It also omits swivel and portrait mode.

The top-down view shows the 1500R curvature, which isn’t significant. You can get more tightly curved screens, especially if you get into the 21:9 aspect ratio. But the curve is ideal for the GNV32DBE’s size and shape. When doing text-based work, you don’t notice it. But it comes into play when gaming or watching video by creating a subtle wraparound effect. I found that it enhanced all forms of entertainment.

The input panel features two DisplayPort 1.2 inputs, which is unusual in a good way. You also get an HDMI 2.0 port, but the USB port is only for firmware updates; it does not support peripherals. A 3.5 mm jack accepts headphones or powered speakers.

The GNV32DBE’s OSD is almost identical to the menu found in the other Viotek monitors I’ve reviewed. It has a gaming look to it but does not sacrifice efficiency. At the top, you can always see the input resolution, refresh rate and Adaptive-Sync status.

The first menu has the brightness and contrast sliders along with a Black Equalize control to raise the black level for better visibility. You won’t need it, though; the GNV32DBE has deep blacks rich with detail. I never had a problem seeing in the dark when gaming.

There are seven picture presets, of which User is the default. It is also the only mode that lets you adjust brightness and other image parameters. You’ll find a full set of those in the Color menu where you can choose between five color temps and four gamma levels. During my tests, I could not calibrate the GNV32DBE to a standard better than its out-of-box condition, which has a minor grayscale issue. I’ll explain this in more detail below.

In Gaming Setup, you can toggle between Adaptive-Sync and MPRT, which is a blur-reducing backlight strobe. It cuts brightness by around 50% and adds some ringing to moving objects which I found distracting. Leaving Adaptive-Sync on with overdrive set to high was the best choice. Motion was smooth, and there were no frame tears. This menu also has an HDR control which should be set to Auto Detect if you plan to use HDR. The GNV32DBE switches automatically when an HDR10 signal is detected, but there is no real difference in image quality. DCR (dynamic contrast) is available for SDR content only.

Game Plus is a set of three gaming aids. You get a set of countdown timers, crosshairs in multiple shapes and colors, and a large frame rate indicator in the screen’s top right corner.

Once you’ve dialed in the GNV32DBE to your liking, save the settings in one of the three memory slots for easy recall. This is something I’d love to see included with every monitor, but the feature is surprisingly rare.

The GNV32DBE’s User mode is the only one that allows any image adjustment. Even brightness is grayed out in the other picture modes. Unfortunately, I could not find a combination of settings that improved the picture above its out-of-box condition. Grayscale tracking is slightly green in the Normal color temp. Other presets were either more green or blue in tone. I attempted to use the RGB sliders but could only get the 80% brightness step on target. This is an unusual response in my experience. In the end, I left the color-related settings at their factory values and lowered brightness a bit to set 200 nits. Luckily, gamma tracks well, so I didn’t notice any significant problems in actual content.

In the 32-inch monitor category, it’s hard to imagine getting much more for your money than what the GNV32DBE offers. For $270, it delivers all the gaming performance of any premium QHD 165 Hz screen I’ve played on, with quick response, smooth motion processing and excellent color. Its high contrast really makes the image pop even though HDR doesn’t offer any advantage over SDR. I could play games like Call of Duty WWII and Doom Eternal in either signal mode and they looked about the same, which is to say they looked good; there was just no difference between SDR and HDR.

Comparing gameplay between the GNV32DBE and an Ultra HD 144 Hz screen, the perception of resolution is about the same unless your game is slow-moving or static. The Viotek’s overdrive is enough to keep ghosting at bay and to keep moving objects in focus. 165fps at QHD resolution looks better than 144fps in Ultra HD. And remember that most Ultra HD games are running 120fps at best on the fastest video cards like the GeForce RTX 3090 I was using. The MPRT blur reduction feature cut brightness too much for my taste and created some ringing around moving objects which lowered motion resolution.
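Raw pixel throughput helps explain that trade-off: Ultra HD at 144fps asks the GPU to render roughly twice as many pixels per second as QHD at 165fps, which is why even the fastest cards often settle near 120fps at 4K. A quick comparison:

    def pixel_rate(width, height, fps):
        # Pixels the GPU must render per second
        return width * height * fps

    qhd = pixel_rate(2560, 1440, 165)  # ~608 million/s
    uhd = pixel_rate(3840, 2160, 144)  # ~1,194 million/s
    print(f"UHD@144 needs {uhd / qhd:.1f}x the pixel rate of QHD@165")  # ~2.0x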

Though I noted some grayscale issues in testing, the picture had good saturation and a natural look. Some medium gray tones looked a little green, but this was rare. I would like to see a fix from Viotek that gives the user better use of the RGB sliders but at this price, I won’t call the errors a deal-breaker. The large color gamut helps mitigate those errors and it’s pretty close to spec, so I doubt most users will perceive a problem.

The curved screen helped bring games and video into my peripheral vision while being a non-issue for productivity. The balance of screen size, shape and curvature is ideal for all uses. It enhances the image when you want it to and doesn’t get in the way when spreadsheets and word processing are the order of the day. And a 32-inch 16:9 screen is always great for web browsing. Though pixel density is a little low at 92ppi, I didn’t notice any jaggies or other distractions. You won’t see the pixel structure unless you sit very close, less than three feet, which would be a strain for my eyes.

The Raspberry Pi Model B has seen the same board layout since the Raspberry Pi B+ arrived in 2014. Sure, the Raspberry Pi 4 swapped the Ethernet and USB ports around, but the same basic design has persisted. So when we received a parcel from Raspberry Pi Trading and opened the box to find a keyboard, we were somewhat puzzled as to the contents. Inside this compact and well-designed keyboard is a Raspberry Pi 400, a variant of the Raspberry Pi 4 4GB designed specifically for this purpose.

Retailing as a single unit for $70 or as a complete $100 kit with mouse, power supply, cables, micro SD card and a copy of the Raspberry Pi Beginner’s Guide, the Raspberry Pi 400 could be the ideal way to introduce the Raspberry Pi to your home.

If you are of a “certain age” then you will remember a time when computers were hidden underneath a keyboard. Many of the early 1980s classic computers, from the Apple II to the Commodore 64, placed all of their internals in the same chassis as the keyboard.

The Raspberry Pi 400 brings that aesthetic back, and it works rather well. The compact keyboard measures just over 11 x 4.6 x 0.7 inches (283 x 120 x 20 mm) and is fairly comfortable to type on. We won’t be writing our next thesis with this keyboard, but it is more than capable for its intended purpose.

Despite the change in form factor, this is still a Raspberry Pi 4 4GB and, as such, it behaves in exactly the same manner, with one exception. The Raspberry Pi 400 lacks the CSI and DSI connectors used for the Camera and Official Touchscreen. Without these connectors there is no way to use those devices. The loss of the touchscreen connector is not such a big deal, but the loss of the camera connector is.

The Raspberry Pi 400 is, at heart, a Raspberry Pi 4 and as we know the Pi 4 can get rather warm in use. In general use, the CPU will stay cool enough to remain well under the thermal throttle limit of 85 Celsius, but overclockers eager to squeeze extra performance from the CPU will often push those limits.

The stock speed of the Pi 400 is 1.8 GHz, 300 MHz more than the standard Raspberry Pi 4, so it is surprising that the idle temperature of the Pi 400 is 31 Celsius, compared to our “control” Raspberry Pi 4, which sat at 40 Celsius.

During our Stressberry test we saw the Pi 400 reach a peak of 46 Celsius, almost 20 Celsius lower than a stock Raspberry Pi 4! We wanted to push the Raspberry Pi 400 a little harder and so we overclocked it to 2.1 GHz and repeated the tests. Idle temperature was still a cool 32 Celsius, with a peak of 54 Celsius during Stressberry.

These temps are great, so can we push it a little further? We were able to beat the Raspberry Pi 4’s all-time record of 2.147 GHz and hit a stable overclock of 2.2 GHz. Keep in mind that there’s no fan, but we still saw an idle temperature of 34 Celsius and a peak Stressberry temperature of 57 Celsius, both well below the 85-degree throttle point.
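For anyone who wants to try reproducing this overclock, the usual route is editing the Pi’s /boot/config.txt. This is a sketch using the standard Raspberry Pi overclocking parameters; the exact over_voltage value your board needs may differ, and an inadequate power supply or cooling will cause instability:

    # /boot/config.txt -- example 2.2 GHz overclock for the Pi 400
    over_voltage=6    # raises core voltage to keep the higher clock stable
    arm_freq=2200     # CPU clock in MHz (Pi 400 stock is 1800)

After a reboot, a tool like Stressberry (pip install stressberry, then stressberry-run) can reproduce the thermal test we used.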

But how can the Raspberry Pi 400 run at 1.8 GHz and yet remain so cool? We asked that question to Eben Upton, Chief Executive of Raspberry Pi Trading. “We are able to operate at a higher voltage, and thus generate more heat, because we have a more sophisticated thermal solution. So we took the decision to factory-qualify Pi 400 to the higher operating frequency. It is a unique capability for Pi 400.”

With this news spurring us on, we quickly took the Raspberry Pi 400 apart and immediately noticed an exceptionally large heatsink. The heatsink is a “shield” between the keyboard and the Pi 400 board, with a depression and thermal pad making contact with the system on chip (SoC). This heatsink is remarkable and largely responsible for the great temperatures and overclocking ability of the Raspberry Pi 400.

We asked Eben Upton to explain why the Raspberry Pi 400 was created, and his response was: “As the next step in our mission to provide cost-effective, high-performance, general-purpose PCs to everyone who needs/wants one. This form factor was very popular for non-mobile computers in the past, and we’ve sort of drifted away from it, in much the same way we drifted away from programmable computers. So this is a conviction product just like the Raspberry Pi itself: there’s a latent demand for this sort of compact PC.”

What does that mean for potential use cases? This is a Pi for learning. In a similar vein to 1980s home computers, this is a machine which we can simply connect to a television and power, then start learning to code. The Raspberry Pi 400 can also be used as a basic desktop computer and is more than capable of handling your general activities.

A departure from the norm, the Raspberry Pi 400 is a great all-rounder. We get the same Raspberry Pi experience, but with far fewer wires and less setup. As a $100 kit, the Raspberry Pi 400 is a great present for someone, particularly a child, taking their first steps into the Pi world, and it costs a little less than the official Raspberry Pi 4 Starter Kit ($115), which gives you the Pi 4 with a case and an external keyboard. Standalone units retail for $70, which is $15 more than a Raspberry Pi 4, but we get a built-in keyboard and a great CPU speed bump.

If money is no object and you’re buying a starter kit for a child or teen, the Elecrow CrowPi2 laptop ($269 without a Pi, $399 with the Pi and more), is also a great choice because not only is it a self-contained laptop, but it also comes with a smorgasbord of lights, motors and sensors built-in, along with tutorials to help you program them.

However, with the Raspberry Pi 400, you can always get whatever electronics you need and attach them, provided that you get a breakout cable and breadboard. The awkward GPIO access and lack of camera and display ports are the only negatives for the Raspberry Pi 400. All in all, this is a fantastic Raspberry Pi for those new to the platform.

Custom PC builder Xidax was eager to send us the newest version of its X-6 desktop, so we configured a new Z270 test bed and put the tempered-glass-adorned gaming rig through its paces. With extras, including LED lighting, sleeved power cables, and overclocked components (CPU and GPU), the Xidax X-6 looks the part of a beastly gaming system, but can the flashy chassis and boosted performance justify a $2,094 price tag on a Core i7 and GTX 1070-equipped PC? Let’s find out.

Xidax has a floating inventory of configurable components, meaning that month to month, or even week to week, the company often changes its configurators to reflect the parts it can currently offer. This mostly occurs with storage, GPU, power supply, and case options, and the pricing for the machines doesn’t vary; if something is on the configurator one day but isn’t the next, it’s usually replaced with an option at the same price.

The same holds true for its cases; the X-6 unit we received features the Xidax Vertex chassis (which is really just a Bitfenix Aurora case), and it’s currently unavailable on the company’s website (however, a white version is in stock). We reached out to Xidax to see what this was about, and we were told that the black version of its Vertex case would return to the configurator once the company gets more stock.

Despite the company’s tricky inventory system, the Xidax X-6 we have is gorgeous. The chassis features a steel and plastic construction, with the top of the case sporting a smooth surface and slim, styled exhaust vents on the lengthy edges of the panel. Tempered-glass side panels on either side of the case give it a sleek look, and the left side panel prominently bears a Xidax logo and gives you a full view of the interior components, made even more visible with the glow of a green LED strip and fans. The glass panel on the right side is tinted dark black on one side (the interior side of the glass) so that you don’t see the cables underneath.

The front I/O resides on the top of the case. There are two USB 3.0 and two USB 2.0 ports, in addition to mic-in and headphone-out 3.5mm audio jacks (one of each). The rear motherboard I/O panel has four USB 3.0 ports, two USB 2.0 ports, two USB 3.1 Gen2 ports (a Type-A and a Type-C), and a PS/2 combo port. Although the Type-C interface supports transfer speeds up to 10 Gb/s, Thunderbolt 3 connectivity isn’t part of the package.

The motherboard also features an RJ-45 Ethernet port (powered by an Intel i219-V gigabit Ethernet NIC), in addition to a DVI-D and HDMI 1.4 interface for display output. However, you won’t need the ports on the motherboard for your monitor; the primary GPU provides plenty of display connectivity with DVI-D, HDMI 2.0, and three DisplayPort 1.4 interfaces.

Opening the Xidax X-6 was as easy as it gets, with four thumbscrews (one in each corner) holding the glass side panels in place. We noticed that the chassis becomes a little wobbly once you remove one of the heavy tempered glass panels (the weight of the remaining panel makes it unbalanced), so it’s best to remove both panels if you want to work on the inside.

With the tempered glass panels removed, we’re greeted by a plethora of high-performance hardware and a handful of aesthetic bonuses. An Intel Core i7-7700K processor is socketed in an MSI Z270 SLI Plus ATX motherboard, with 16GB (2x8GB) of G.Skill Ripjaws V DDR4-2666 memory.

We were hard-pressed to find this motherboard in Xidax’s configurator as we finalized this review; the company swapped it out for an Asus TUF Mark II ATX motherboard for around the same price several weeks ago, after we received our review unit. However, with pricing largely unaffected and these types of inventory changes a regular occurrence for Xidax, we stuck with the previously available motherboard rather than starting over (although Xidax did offer to send us an updated model immediately).

The CPU is cooled by a 240mm Asetek 570LX all-in-one liquid cooler, which is mounted behind a pair of 120mm Thermaltake Riing Green LED fans at the front of the chassis (acting as an air intake). We were somewhat disappointed in this configuration considering that the light is mostly hidden by the solid front panel (which has only thin slits on the sides that act as air intakes). Mounting the radiator directly to the case and putting the fans on the inner side of it would have at least provided light to the interior. As is, the lighting seems to go to waste. Three more of the green LED fans are mounted in the case (two at the top exhaust, one rear exhaust), giving the internal components a gentle green glow and plenty of airflow with a total of five fans.

An MSI GeForce GTX 1070 Aero OC Edition graphics card resides in the motherboard’s primary PCIe x16 slot. It features 8GB of GDDR5 with an effective memory clock of 8008 MHz, in addition to a modest factory overclock. However, Xidax also overclocked the GPU (for an extra $25) to give it even more performance. This GPU should provide impressive 1080p and 1440p gaming performance.

The X-6 configuration Xidax sent us doesn’t have a lot of storage space, with a single 512GB Sandisk X400 SATA SSD (decorated with a Xidax logo) residing in one of the visible 2.5” drive bays. The capacity isn’t ideal for a gaming enthusiast looking to load their entire library into a new PC, but additional storage can be added in the company’s configurator, and Xidax told us it preferred not to include huge HDDs in a review unit “for the sake of it.” It makes sense to us, given that we don’t test secondary storage. We would, however, recommend filling the 3.5” bay at the bottom of the case with some kind of secondary storage. There’s also an empty M.2 interface on the motherboard and a 5.25” drive bay, but there’s no panel to expose an optical drive at the front of the chassis.

The Xidax X-6 we received gets its juice from an 850W Corsair RM850i 80 Plus Gold certified modular power supply, which is plenty of power for the GTX 1070 and Core i7 platform. It even gives you headroom to upgrade to an SLI configuration down the line. However, you could shave about $77 off the price tag if you go with a 550W PSU, which would also provide plenty of power for the components inside.

Curiously, the configurator no longer offers the RM850i option, replacing it with an RM850X at the same price. The latter doesn’t feature the i-series PSU’s USB connector, which allows the power supply to interface with Corsair Link, the company’s monitoring and control software. However, Xidax didn’t connect or install this feature, and we likely just got the last bit of the company’s previous 850W PSU stock.

The cable management is as good as possible in this chassis, with the main chamber clean of any excess wires and the backside (right side) neatly organized and tied down. The modular cables feeding into the back of the main chamber (right side) of the case are the stock Corsair cables, but they link to green, individually braided cable extensions that plug into the 4+4-pin CPU, 8-pin GPU, and 24-pin ATX power connectors on the motherboard. For an extra $40, Xidax offers a variety of colored cables that can bring a definite wow factor to any gaming PC.

Speaking of aesthetic add-ons, there’s a green LED light strip mounted to the case’s interior ceiling (top panel), and the single strand normally adds another $40 to the bill. However, Xidax is currently offering both of these eye-catching bonuses at no charge (so we didn’t include them in our total).

The only aesthetic bonus factored into the $2,094 price tag is the set of green LED fans, which cost an additional $85. It’s clear Xidax chose to showcase most of its mod shop add-ons with these flashy (or static, or breathing) eye-catching options. These add-ons won’t improve your system’s performance (except, perhaps, if the stock fans don’t perform as well as the LED replacements), but they certainly make the Xidax X-6 look the part of a beastly gaming PC.

Xidax took the liberty of providing an edgy company-centric desktop background for its X-6 gaming PC, but aside from that, there isn’t any “bloatware” to speak of. However, the company did install Chrome, Adobe Reader, and MSI Afterburner (for the GPU overclock), which could hardly be considered bothersome (depending on your feelings about Chrome, we suppose).

The accessory box comes with the motherboard’s user manual and driver disk, along with the power cable and all of the paperwork for your order. Also included are the PSU’s extra modular cables and a free Xidax T-shirt.

According to industry insiders talking to the Korea Times, Samsung has decided to close down its last LCD production lines six months sooner than expected. The June closure has reportedly been hastened by the precipitous fall in LCD pricing driven by competitors based in China and Taiwan. As the news comes via unnamed “industry insiders,” take it with a healthy amount of salt until an official statement is made.

The apparent sudden change of heart by Samsung needs little supplementary explanation if you cast your eyes over the chart above. The chart, courtesy of Display Supply Chain Consultants (DSCC), shows the TV panel price index plotted against year-on-year change from January 2015 to today. The steepness of the decline seen from late 2021 is enough to give an analyst vertigo. It may show some signs of bottoming out now, but note that pricing remains in negative territory, approaching -60% YoY.

Samsung Display is an important player in the flat panel display business. The South Korea-based firm makes some of the best screens for devices spanning tiny wearables to full wall-sized modular TVs. Some of its displays go into its own-branded PC monitors, and others are used by partners like Asus and Alienware.

It is expected that employees who were part of the LCD business operations will be reallocated to Samsung Display’s quantum dot (QD) and organic light-emitting diode (OLED) screen production. Hopefully the removal of the LCD lines, and the rebalancing into QD and OLED, will bring economies of scale to Samsung’s operations and help make these vibrant modern screen technologies more accessible.

Intel has been hyping up Xe Graphics for about two years, but the Intel Arc Alchemist GPU will finally bring some needed performance and competition from Team Blue to the discrete GPU space. This is the first “real” dedicated Intel GPU since the i740 back in 1998 — or technically, a proper discrete GPU after the Intel Xe DG1 paved the way last year. The competition among the best graphics cards is fierce, and Intel’s current integrated graphics solutions basically don’t even rank on our GPU benchmarks hierarchy (UHD Graphics 630 sits at 1.8% of the RTX 3090, based on just 1080p medium performance).

The latest announcement from Intel is that the Arc A770 is coming October 12, starting at $329. That’s a lot lower on pricing than what was initially rumored, but then the A770 is also coming out far later than originally intended. With Intel targeting better-than-RTX 3060 levels of performance, at a potentially lower price and with more VRAM, things are shaping up nicely for Team Blue.

Could Intel, purveyor of low-performance integrated GPUs — “the most popular GPUs in the world” — possibly hope to compete? Yes, it can. Plenty of questions remain, but with the official China-first launch of Intel Arc Alchemist laptops and the desktop Intel Arc A380 now behind us, plus plenty of additional details of the Alchemist GPU architecture, we now have a reasonable idea of what to expect. Intel has been gearing up its driver team for the launch, fixing compatibility and performance issues on existing graphics solutions, hopefully getting ready for the US and “rest of the world” launch. Frankly, there’s nowhere to go from here but up.

The difficulty Intel faces in cracking the dedicated GPU market can’t be overstated. AMD’s Big Navi / RDNA 2 architecture has competed with Nvidia’s Ampere architecture since late 2020. While the first Xe GPUs arrived in 2020, in the form of Tiger Lake mobile processors, and Xe DG1 showed up by the middle of 2021, neither one can hope to compete with even GPUs from several generations back. Overall, Xe DG1 performed about the same as Nvidia’s GT 1030 GDDR5, a weak-sauce GPU hailing from May 2017. It was also only a bit better than half the performance of 2016’s GTX 1050 2GB, despite having twice as much memory.

The Arc A380 did better, but it still only managed to match or slightly exceed the performance of the GTX 1650 (GDDR5 variant) and RX 6400. Video encoding hardware was a high point, at least. More importantly, the A380 is potentially about a quarter of the performance of the top-end Arc A770, so there’s still hope.

Intel has a steep mountain to ascend if it wants to be taken seriously in the dedicated GPU space. Here’s the breakdown of the Arc Alchemist architecture, a look at the announced products, and some Intel-provided benchmarks, all of which give us a glimpse into how Intel hopes to reach the summit. Truthfully, we’re just hoping Intel can make it to base camp, leaving the actual summiting for the future Battlemage, Celestial, and Druid architectures. But we’ll leave those for a future discussion.

Intel’s Xe Graphics aspirations hit center stage in early 2018, starting with the hiring of Raja Koduri from AMD, followed by chip architect Jim Keller and graphics marketer Chris Hook, to name just a few. Raja was the driving force behind AMD’s Radeon Technologies Group, created in November 2015, along with the Vega and Navi architectures. Clearly, the hope is that he can help lead Intel’s GPU division into new frontiers, and Arc Alchemist represents the result of several years’ worth of labor.

There’s much more to building a good GPU than just saying you want to make one, and Intel has a lot to prove. Here’s everything we know about the upcoming Intel Arc Alchemist, including specifications, performance expectations, release date, and more.

We’ll get into the details of the Arc Alchemist architecture below, but let’s start with the high-level overview. Intel has two different Arc Alchemist GPU dies covering three different product families: the 700-series, 500-series, and 300-series. The first letter denotes the family, so the A770 is an Alchemist part, and future Battlemage parts will likely be named Arc B770 or similar.

Here are the specifications for the various desktop Arc GPUs that Intel has revealed. All of the figures are now more or less confirmed, except for A580 power.

These are Intel’s official core specs on the full large and small Arc Alchemist chips. Based on the wafer and die shots, along with other information, we expect Intel to enter the dedicated GPU market with products spanning the entire budget-to-high-end range.

Intel has five different mobile SKUs, the A350M, A370M, A550M, A730M, and A770M. Those are understandably power constrained, while for desktops there will be (at least) A770, A750, A580, and A380 models. Intel also has Pro A40 and Pro A50 variants for professional markets (still using the smaller chip), and we can expect additional models for that market as well.

The Arc A300-series targets entry-level performance, the A500-series goes after the midrange market, and the A700-series is for the high-end offerings — though we’ll have to see where they actually land in our GPU benchmarks hierarchy when they launch. Arc mobile GPUs along with the A380 were available in China first, but the desktop A580, A750, and A770 should be full worldwide launches. Releasing the first parts in China wasn’t a good look, especially since one of Intel’s previous “China only” products was Cannon Lake, with the Core i3-8121U that basically only just saw the light of day before getting buried deep underground.

As shown in our GPU price index, the prices of competing AMD and Nvidia GPUs have plummeted this year. Intel would have been in great shape if it had managed to launch Arc at the start of the year with reasonable prices, which was the original plan (actually, late 2021 was at one point in the cards). Many gamers might have given Intel GPUs a shot if they were priced at half the cost of the competition, even if they were slower.

Intel has provided us with reviewer’s guides for both its mobile Arc GPUs and the desktop Arc A380. As with any manufacturer-provided benchmarks, you should expect that the games and settings used were selected to show Arc in the best light possible. Intel tested 17 games for laptops and desktops, but the game selection isn’t even identical, which is a bit weird. It then compared performance with two mobile GeForce solutions, and with the GTX 1650 and RX 6400 for desktops. There’s a lot of missing data, since the mobile chips represent the two fastest Arc solutions, but let’s get to the actual numbers first.

We’ll start with the mobile benchmarks, since Intel used its two high-end models for these. Based on the numbers, Intel suggests its A770M can outperform the RTX 3060 mobile, and the A730M can outperform the RTX 3050 Ti mobile. The overall scores put the A770M 12% ahead of the RTX 3060 and the A730M 13% ahead of the RTX 3050 Ti. However, looking at the individual game results, the A770M was anywhere from 15% slower to 30% faster, and the A730M was 21% slower to 48% faster.
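Intel doesn’t say exactly how those overall scores are aggregated, but vendor summaries like this are typically a geometric mean of per-game performance ratios. A hypothetical sketch with made-up numbers:

    import math

    # Hypothetical per-game ratios of A770M fps to RTX 3060 mobile fps
    ratios = [0.85, 1.02, 1.10, 1.18, 1.30]
    geomean = math.prod(ratios) ** (1 / len(ratios))
    print(f"Overall: {geomean:.2f}x")  # ~1.08x with these invented ratios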

That’s a big spread in performance, and tweaks to some settings could have a significant impact on the fps results. Still, overall the list of games and settings used here looks pretty decent. However, Intel used laptops equipped with the older Core i7-11800H CPU for the Nvidia cards, and then used the latest and greatest Core i9-12900HK for the A770M and the Core i7-12700H for the A730M. There’s no question that the Alder Lake CPUs are faster than the previous-generation Tiger Lake variants, though without doing our own testing we can’t say for certain how much CPU bottlenecks come into play.

There’s also the question of how much power the various chips used, as the Nvidia GPUs have a wide power range. The RTX 3050 Ti can run at anywhere from 35W to 80W (Intel used a 60W model), and the RTX 3060 mobile has a range from 60W to 115W (Intel used an 85W model). Intel’s Arc GPUs also have a power range, from 80W to 120W on the A730M and from 120W to 150W on the A770M. While Intel didn’t specifically state the power level of its GPUs, it would have to be higher in both cases.

Switching over to the desktop side of things, Intel provided the above A380 benchmarks. Note that this time the target is much lower, with the GTX 1650 and RX 6400 budget GPUs going up against the A380. Intel still has higher-end cards coming, but here’s how it looks in the budget desktop market.

Even with the usual caveats about manufacturer-provided benchmarks, things aren’t looking too good for the A380. The Radeon RX 6400 delivered 9% better performance than the Arc A380, with a range of -9% to +31%. The GTX 1650 did even better, with a 19% overall margin of victory and a range of just -3% up to +37%.

And look at the list of games: Age of E