Acer’s Predator X27 G-Sync HDR monitor is infused with every feature you could ask for

After teasing it at CES in January, Acer pulled back more of the curtain on one of the most badass, feature-filled PC monitors ever at the company’s Next@Acer event in New York on Thursday: the beastly Predator X27.

The Predator X27 is one of the debut G-Sync HDR displays, infused with Nvidia’s stutter-killing graphics technology and the glorious high-dynamic-range visuals that are only just starting to appear on PCs. While today’s brightest monitors top out around 400 nits of brightness, Acer’s flagship hits a whopping 1,000 nits via a backlight with 384 individually controlled zones, which helps it deliver the vibrant colors HDR is famous for, a vividness that makes HDR’s deep, deep blacks all the more impactful.
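
For a sense of scale, here’s some quick math on those specs (assuming the dimming zones evenly tile the 4K panel, which the spec sheet doesn’t spell out):

```python
# How coarse is a 384-zone backlight on a 4K panel?
# Assumes the zones evenly tile the screen (an assumption, not a published spec).
panel_pixels = 3840 * 2160
zones = 384

print(f"Total pixels: {panel_pixels:,}")                       # 8,294,400
print(f"Pixels per dimming zone: {panel_pixels // zones:,}")   # 21,600
```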

But it doesn’t end there. Acer’s loaded the Predator X27 with basically every feature you can ask for: 4K resolution, a blisteringly fast 144Hz refresh rate, 4ms response time, 99 percent coverage of the Adobe RGB color space, 178-degree viewing angles, Tobii eye-tracking, and bleeding-edge Quantum Dot enhancement film technology.

“First used on high-end HDR televisions, QDEF is coated with nano-sized dots that emit light of a very specific color depending on the size of the dot, producing bright, saturated, and vibrant colors through the whole spectrum, from deep greens and reds, to intense blues,” Nvidia’s original G-Sync HDR post explains. “This enables a far larger set of colors to be displayed, producing pictures that more accurately reflect the scenes and colors you see in real life.”

This is basically the holy grail of PC displays, folks. But it isn’t the only one; Asus is working on a G-Sync HDR monitor of its own with similar specs. Neither company has released crucial pricing or release-date info, however. Considering the premium on current G-Sync panels and the no-compromises list of luxurious features in the first G-Sync HDR panels, don’t expect the Predator X27 to be cheap—or the firepower needed to use it to its full capabilities. You’d need not one, but two $700 GeForce GTX 1080 Ti graphics cards to even come close to hitting 144Hz at 4K resolution in most games.
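
To put that firepower requirement in perspective, here’s a quick back-of-the-envelope comparison of raw pixel throughput (a rough illustration, not a benchmark; real performance depends on far more than pixel count):

```python
# Rough pixel-throughput math: why driving the Predator X27 is so demanding.
def pixels_per_second(width: int, height: int, refresh_hz: int) -> int:
    """Raw pixels a GPU must render each second at a given resolution and refresh rate."""
    return width * height * refresh_hz

x27_load = pixels_per_second(3840, 2160, 144)  # 4K at the X27's 144Hz ceiling
baseline = pixels_per_second(1920, 1080, 60)   # a common 1080p/60 target

print(f"4K @ 144Hz:   {x27_load / 1e9:.2f} billion pixels/sec")
print(f"1080p @ 60Hz: {baseline / 1e6:.0f} million pixels/sec")
print(f"Workload ratio: {x27_load / baseline:.1f}x")  # roughly 9.6x
```

Nearly ten times the pixel workload of 1080p/60 is why even a single flagship card struggles to feed this panel in demanding games.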

Radeon owners rebel when AMD drivers stealth drop Quake Champions links on desktops

AMD graphics card owners rebelled on Thursday against Radeon drivers that forced an ad for unrelated software onto users: the new Radeon 17.4.4 drivers automatically plopped a tracking-code-infused shortcut to the Quake Champions website onto desktops during installation, with no warning and no way to opt out.

The Internet exploded in outrage. Posts decrying the ads hit the top of every major PC gaming and hardware forum on Reddit, countless Twitter users screamed their displeasure directly at the company, and enthusiast forum-goers grabbed their proverbial pitchforks.

Fortunately, AMD responded just as quickly, removing the shortcut install before the end of the day. When I emailed AMD representatives with clarifying questions in the midst of the uproar, they remained silent for a few hours, then replied with the following statement:

“We’re very excited to be working with Bethesda and we wanted to make it easy for Radeon users to sign up for the Quake Champions beta program. Our installer placed a shortcut on gamers’ desktops – we’ve updated our 17.4.4 release and this shortcut install has been removed. We apologize if this has caused any inconvenience for anyone.”

The statement didn’t address whether AMD receives compensation for referrals generated by the link, or if this was simply related to the recent Bethesda/Radeon technology hook-up. Nor did AMD respond to my queries about why it decided to use a mandatory desktop shortcut to a website rather than the promotional space already front-and-center in the Radeon Settings software that opens when you update your drivers.

The story behind the story: Sure, AMD’s Quake Champions link could be deleted quickly enough, but it was still an unwanted and unavoidable intrusion onto users’ desktops. Even the most bundleware-infested freeware gives you the ability to opt out of those extras during installation, and if a user wasn’t paying attention during the Radeon driver upgrade, finding a random Bitly link on their desktop could stoke fears of a malware infestation.

That said, while we wouldn’t want to see this happen again, it maybe blew up a wee bit more than it should have. AMD’s underlying intent here, giving Radeon users access to a hot beta test, is still a good one. It’s a perk Nvidia offers too, just not via mandatory desktop shortcuts. Hopefully future extras like this are handled in a more overt way, preferably inside the Radeon Settings software itself, or at least with an opt-in prompt while your drivers are installing.
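
For illustration, here’s a minimal sketch of that opt-in pattern. Everything in it (function names, the placeholder URL) is hypothetical, not AMD’s actual installer code; the point is simply that the shortcut should require explicit, default-off consent.

```python
# Hypothetical installer step: the shortcut only appears if the user says yes.
def prompt_yes_no(question: str) -> bool:
    """Ask during installation; default to 'no' if the user just hits Enter."""
    return input(f"{question} [y/N] ").strip().lower() in ("y", "yes")

def create_desktop_shortcut(name: str, url: str) -> None:
    # Stand-in for the platform-specific shortcut-creation call.
    print(f"(would create desktop shortcut {name!r} pointing at {url})")

def offer_promo_shortcut() -> None:
    # The key difference from what shipped: explicit, default-off consent.
    if prompt_yes_no("Add a Quake Champions beta signup shortcut to your desktop?"):
        create_desktop_shortcut("Quake Champions Beta", "https://example.com/quake-beta")
    else:
        print("Skipping promotional shortcut.")

offer_promo_shortcut()
```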

This overclocked EVGA GTX 1050 Ti costs less than reference models

Today’s deal is one for the budget PC builders: Right now, Amazon’s selling an EVGA GeForce GTX 1050 Ti SC Gaming graphics card for $116. That’s $34 off MSRP for this factory overclocked model, and one of the best prices we’ve seen. It’s also the most hassle-free price available: Newegg is offering a discount on the same card, but it works out to $140 after a $10 mail-in rebate.

There is, however, a catch. In order to take advantage of this price, you have to be an Amazon Prime member. You can circumvent this issue by signing up for a free trial, though. Doing so will also net you free two-day shipping on the purchase (as well as most others during the trial period).

As a factory overclocked model, the EVGA GeForce GTX 1050 Ti SC Gaming comes with a bit of extra oomph out of the box, with a base clock of 1354MHz and a boost clock of 1468MHz. (Reference speeds are 1290MHz base clock and 1392MHz boost clock.) It also comes with 4GB of GDDR5 memory that runs at 7Gbps and uses a 128-bit bus.
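
If you’re curious how those numbers shake out, the factory overclock amounts to roughly 5 percent over reference, and peak memory bandwidth follows from the standard GDDR5 formula (effective rate per pin times bus width, divided by eight bits per byte). A quick sanity check:

```python
# Spec-sheet math for the EVGA GTX 1050 Ti SC Gaming.
base, boost = 1354, 1468          # factory clocks, MHz
ref_base, ref_boost = 1290, 1392  # Nvidia reference clocks, MHz

print(f"Base clock uplift:  {(base / ref_base - 1) * 100:.1f}%")    # ~5.0%
print(f"Boost clock uplift: {(boost / ref_boost - 1) * 100:.1f}%")  # ~5.5%

# Peak memory bandwidth: 7 Gbps per pin * 128-bit bus / 8 bits per byte
print(f"Memory bandwidth: {7 * 128 / 8:.0f} GB/s")  # 112 GB/s
```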

We reviewed this very graphics card when it launched, and even at full price we said the EVGA GTX 1050 Ti SC Gaming was the budget graphics card to buy. It runs cool, sips power, and blows AMD’s rival Radeon RX 460 out of the water in performance. The card’s powerful enough to deliver 1080p gaming performance with a mixture of Medium to High settings at 60 frames per second—outpowering both the Xbox One and the PlayStation 4. At this price, it’s even more compelling.

Though it doesn’t support VR, the 1050 Ti SC Gaming does support other features that launched with Nvidia’s Pascal architecture, like Nvidia Ansel. It also of course supports Nvidia G-Sync monitors and Nvidia GameStream. But what makes this particular 1050 Ti card great is that it’s ultra-compact—perfect for a mini-ITX build designed to travel.

Another nice touch: EVGA’s card doesn’t require a supplementary power pin connector, which makes it ideal for turning a boxed prebuilt PC into an impromptu gaming machine. Just slap it in your motherboard and start playing.

Nvidia’s offering three free VR games if you buy a GTX 10 Series card and an Oculus Rift + Touch

We’ve seen several deals and giveaways for virtual-reality headsets in recent months, but those mostly involved the HTC Vive. This time around, Nvidia and Oculus have teamed up for a Rift-flavored deal: Currently, if you buy a select GeForce GTX 10 Series card and the Oculus Rift + Touch, you’ll get three VR games for free.

The deal began on April 25 and lasts until Tuesday, June 13, 2017, or while supplies last. Nvidia says both Newegg and Amazon have this deal. However, while Newegg definitely offers the deal as described by Nvidia, Amazon seems to limit you to specific, preset bundles.

Nvidia’s eligible graphics cards include the GTX 1080 Ti, GTX 1080, GTX 1070, and GTX 1060. The offer covers buying the cards separately, as part of a system, or built into a laptop.

The free games include one of our personal favorites: Superhot. We reviewed this surreal, time-warping shooter in its traditional PC form, but I can only imagine it’s great in VR, too.

This is the closest thing Intel has built to a discrete GPU

Intel doesn’t make a discrete GPU of its own, but it has built something close: a card that specializes in processing 4K graphics. No, it isn’t powerful enough to run Crysis, if you were wondering.

The chipmaker showed off its Intel Visual Compute Accelerator 2 at the NAB show in Las Vegas this week. It has the build of a GPU but is designed for server applications and not for PCs.

The VCA 2 is aimed at streaming 4K video, graphics, and virtual reality content from the cloud. Servers with the graphics accelerator installed could be used to stream video or broadcast content.

The VCA 2 uses the 4K-capable Iris Pro Graphics P580 graphics chip and three Intel Xeon E3-1500 v5 processors. The P580 is also used in Intel’s mini-PC called Skull Canyon, which is designed for gaming.

Like GPUs, the VCA 2 plugs into PCI-Express 3.0 slots. It is not meant to be the main CPU or GPU for a PC.

It uses year-old GPU technology, not the newer graphics found in Intel’s 7th Generation Core chips.

You won’t be able to buy the VCA 2 off the shelf; instead, it’ll be sold directly to server and device makers. Intel declined to comment on when it’ll be available.

The VCA 2 is an upgrade from its predecessor, simply called the Visual Compute Accelerator. The original VCA paired the older Iris Pro Graphics P6300, found in Broadwell chips, with the older Xeon E3 v4 processors.

At NAB, Haivision showed its KB 4K Encoder, which is powered by Intel’s VCA 2. The device helped stitch and stream 360-degree content taken by Nokia’s OZO Live 4K virtual reality camera. The content is delivered to VR headsets.

Intel isn’t really known for its prowess in GPUs, though the integrated graphics in some of its new Kaby Lake chips supports 4K. PCs with Intel’s integrated GPUs can handle VR content but have a long way to go to compete with AMD or Nvidia GPUs.

Larrabee, from 2010, was Intel’s first attempt to make a discrete high-end GPU, but the product was unceremoniously abandoned after a prototype was shown. Technical challenges led to its cancellation, though its byproducts live on in the Xeon Phi supercomputing chips and in Intel’s integrated graphics.

A full-fledged GPU still doesn’t exist in Intel’s arsenal, and that’s a big hole because of the growing popularity of gaming, virtual reality, and machine learning. Nvidia and AMD GPUs are used in most servers involved in deep learning, natural language processing, and other machine-learning tasks.

Intel has said FPGAs could mimic GPUs to an extent. But GPUs are much more flexible than FPGAs, which are typically programmed to handle specific tasks.

Huawei and Google supercharge Android with a new Raspberry Pi-like board

Prepare to run Android at blazing-fast speeds on a new Raspberry Pi-like computer developed by Huawei.

Huawei’s HiKey 960 computer board is priced at $239 but has some of the latest CPU and GPU technologies. Google, ARM, Huawei, Archermind, and LeMaker all played roles in developing the board.

The HiKey 960 is meant to be a go-to PC for Android or a tool for developing software and drivers for the OS. Development of the board was backed by Linaro, an organization that produces software packages for the Android OS and the ARM architecture.

Linaro CEO George Grey recently said it was sad that Android developers had to write code on x86 chips. He encouraged the organization’s members to build a superfast computer so developers could build ARM software on ARM architecture. Intel has scaled back Android support on x86 PCs and isn’t making smartphone chips.

The HiKey 960 can be used to create robots, drones, and other smart devices. But it’s mainly intended to be an Android PC or a tool for developers who want to write and test applications.

The board can deliver performance similar to the latest smartphones and tablets. It has Huawei’s Kirin 960 octa-core chip, which pairs four high-performance ARM Cortex-A73 cores with four low-power Cortex-A53 cores. The Kirin 960 also powers the Huawei Mate 9 smartphone, which started shipping late last year.

The HiKey 960 has 32GB of storage and 3GB of LPDDR4 RAM. Its Mali-G71 GPU can render 4K graphics and is based on ARM’s latest Bifrost architecture. However, the board has only an HDMI 1.2a port, which limits display output to 1080p.

Other features include dual-band 802.11b/g/n/ac Wi-Fi and Bluetooth 4.1. The board has a PCIe M.2 slot for adding storage or wireless capabilities. It also has 40-pin and 60-pin expansion connectors and multiple high-speed interfaces so cameras can be connected to the board.

It will ship in the U.S., European Union, and Japan in early May. It will later ship worldwide.

The board will also support multiple Linux versions in the future.

You can load Android 7.1 on the board, but you will need to be technically savvy and have knowledge of command-line operations. Instructions to load Android 7.1 are on Google’s website.
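
To give a rough feel for what those command-line operations look like, here’s a hypothetical sketch that wraps the standard fastboot tool from Python. The partition list and image names below are illustrative placeholders; the authoritative steps and filenames are in Google’s instructions.

```python
# Hypothetical flashing helper for the HiKey 960. Assumes the board has been
# switched into fastboot mode and the fastboot tool is on your PATH.
# Partition/image names are placeholders; follow Google's official list.
import subprocess

IMAGES = {
    "boot": "boot.img",      # kernel and ramdisk
    "system": "system.img",  # the Android 7.1 system image
}

def flash(partition: str, image: str) -> None:
    # Invokes: fastboot flash <partition> <image>
    subprocess.run(["fastboot", "flash", partition, image], check=True)

for partition, image in IMAGES.items():
    flash(partition, image)

subprocess.run(["fastboot", "reboot"], check=True)  # reboot into Android
```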

Cars will get superior digital vision with ARM’s camera chip

Cars are turning into computers with a unique set of requirements.

One of the more important components is the camera, a secondary feature in PCs. Cameras already supplement mirrors and help cars park themselves, and they will serve as the eyes of autonomous cars, capturing and analyzing images.

The number of cameras on cars will only grow as drivers seek a better view of the vehicle’s interior and exterior. For car makers, the next big goal is to bring context and understanding to those images. Combined with data from radar, lidar, GPS, and other sensors, cameras can help cars and drivers make better decisions.

ARM has come up with a specialized camera chip for cars, with the goal of bringing context to images and improving driver and passenger safety. The Mali-C71 image signal processor analyzes every pixel coming from a car’s onboard cameras and, much like a human eye, reads the image to help make driving decisions.

For example, today’s self-parking cars are not good at identifying a person in view of the rear cameras. ARM’s chip will be able to identify a person and stop the car. That’s just the start: the chip will also help identify people crossing the street, traffic signals, and driving lanes in different lighting conditions.

The chip could also identify weather conditions, possibly with the help of information from GPS, which could help a car navigate safely through rough road conditions. A camera inside the car could even identify a drowsy driver and issue an alert.

GPUs from companies like Nvidia that are targeting autonomous vehicles could perform a similar function, but the ARM-based chip will be more power efficient. GPUs are considered better suited to more futuristic self-driving cars and may draw more power; today’s cars don’t need a full-blown GPU for tasks like self-parking.

The number of cameras in each car could exceed 10 in the coming years, and the reliance on them will only increase as cars become increasingly autonomous. The Mali-C71 supports up to four cameras in real time, and a car could carry multiple Mali-C71s. Vehicles with the chip installed could start appearing as early as next year.

The Mali-C71 is aimed at cars with drivers at the wheel, though it has features that could be used in autonomous cars. It supports images at resolutions up to 4096 by 4096 pixels.

Image signal processors aren’t new; they exist in mobile chips even today. But the Mali-C71 is different because of multiple reliability features that ensure pixels are reliably tagged and free of data errors. A small error could mean an accident.

The chip includes the features, image quality, and safety elements needed for systems ranging from simple backup cameras to multi-camera parking-assist setups and even fully autonomous vehicles, an ARM spokeswoman said.

It can be used with ARM or other architectures, the spokeswoman said. Chips based on the ARM, x86, Power, and MIPS architectures are all vying for a spot in cars, as are specialized ASICs, real-time chips, and FPGAs (field-programmable gate arrays).