AnandTech: This channel features the latest computer hardware related articles. http://www.anandtech.com (Copyright 2019 AnandTech)

ASUS ProArt PQ22UC 4K OLED Monitor: £4699, ~$5150 Anton Shilov

ASUS announced its first professional OLED display back at CES 2018, over a year ago. The compact and lightweight 21.6-inch 4K monitor, which covers 99% of the DCI-P3 color gamut and is aimed at professionals, attracted a lot of attention from various parties, but it has taken ASUS quite some time to perfect the product. Only this month did the company begin to sell the display in select markets, with broader availability expected going forward. Meanwhile, the price of the monitor is rather staggering.

https://www.anandtech.com/show/14123/asus-proart-pq22uc-4k-oled-monitor-5150-usd (Fri, 22 Mar 2019 17:00:00 EDT)
Thermalright Silver Arrow IB-E Extreme Rev. B: An Air Cooler for 320 W Anton Shilov

Announcements of new high-performance air CPU coolers have become rare these days. On the one hand, many enthusiasts have switched to closed-loop liquid coolers in recent years, which is why the market for high-end air coolers has shrunk. On the other hand, existing models of ‘mega coolers’ are powerful enough for the vast majority of CPUs. Nonetheless, makers known primarily for oversized air coolers continue to refine their offerings. This week Thermalright introduced the Silver Arrow IB-E Extreme Rev. B, a giant air cooler that can dissipate up to 320 W of heat.

https://www.anandtech.com/show/14120/thermalright-silver-arrow-ibe-extreme-rev-b-two-140mm-fans-8-heat-pipes-320-w (Fri, 22 Mar 2019 16:00:00 EDT)
OSS Unveils 5-Way PCIe 4.0 Backplane, Demonstrates PCIe 4.0 HPC Platform Anton Shilov

One Stop Systems this week introduced the industry’s first 5-way PCIe 4.0 backplane at NVIDIA’s GPU Technology Conference. The OSS 5 Slot Gen 4 Backplane is designed primarily for high-performance computing applications and enables the company to offer infrastructure building blocks necessary to build a PCIe 4.0-based HPC platform.

https://www.anandtech.com/show/14122/oss-unveils-5way-pcie-40-backplane-demonstrates-pcie-40-hpc-platform (Fri, 22 Mar 2019 15:00:00 EDT)
ZOTAC Mek Mini: A Small High Performance Gaming PC Anton Shilov

ZOTAC this week launched its small form-factor (SFF) Mek Mini desktop computer, aimed at gaming in a small volume. The system will be initially available in only one configuration but is designed to deliver good performance for gamers who prefer compact PCs.

https://www.anandtech.com/show/14119/zotac-mek-mini (Fri, 22 Mar 2019 13:00:00 EDT)
Samsung Galaxy S10 5G: Launch Date & Approximate Price Revealed Anton Shilov

When Samsung introduced its new family of flagship Galaxy S10 smartphones in February, the company disclosed prices and launch timeframes for all models except one, the Galaxy S10 5G. Based on two reports citing industry sources and Samsung, the manufacturer will release its first 5G handset in South Korea in early April.

https://www.anandtech.com/show/14121/samsung-galaxy-s10-5g-launch-date-approximate-price-revealed (Fri, 22 Mar 2019 11:30:00 EDT)
Club 3D Launches 2.5 GbE USB Type-A & USB Type-C Dongles Anton Shilov

Club 3D has introduced its 2.5 GbE dongles featuring a USB Type-A or a USB Type-C interface. The adapters are designed to add 2.5 Gbps wired Ethernet to PCs without internal Ethernet controllers, an omission that is becoming increasingly common on laptops.

Club 3D’s CAC-1420 (USB Type-A to 2.5 GbE) and CAC-1520 (USB Type-C to 2.5 GbE) are extremely simple devices: they feature an RJ-45 connector on one side and a USB 3.1 Gen 1 (5 Gbps) interface on the other. The dongles are USB-powered and therefore do not need any external power adapters. As for compatibility, they work with PCs running Apple’s macOS 10.6 through 10.14 as well as Microsoft’s Windows 8/10.
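
A quick back-of-the-envelope check (ours, not Club 3D’s) of why a 5 Gbps USB 3.1 Gen 1 uplink is plenty for 2.5 GbE but would bottleneck a 5 GbE adapter: Gen 1 uses 8b/10b line coding, and after protocol overhead the usable payload bandwidth lands well below 5 Gbps. The overhead percentages below are rough assumptions for illustration only.

```python
# Rough, illustrative arithmetic: why USB 3.1 Gen 1 (5 Gbps) comfortably
# carries 2.5 GbE but would choke a 5 GbE link. Overhead figures are
# ballpark assumptions, not vendor numbers.

USB31_GEN1_LINE_RATE = 5.0          # Gbps, raw signalling rate
ENCODING_EFFICIENCY  = 8 / 10       # Gen 1 uses 8b/10b line coding
PROTOCOL_OVERHEAD    = 0.90         # assume ~10% lost to USB/packet framing

usable = USB31_GEN1_LINE_RATE * ENCODING_EFFICIENCY * PROTOCOL_OVERHEAD
print(f"Usable USB payload bandwidth: ~{usable:.1f} Gbps")   # ~3.6 Gbps

for eth_rate in (2.5, 5.0):
    verdict = "fits" if eth_rate <= usable else "would be bottlenecked"
    print(f"{eth_rate} GbE {verdict} on a Gen 1 uplink")
```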

The manufacturer does not disclose which 2.5 GbE controller it uses, but it is highly likely that the dongles use Realtek’s RTL8156 controller specifically designed for such applications. The only other option is from Aquantia, who only offers a joint 2.5/5 GbE controller.

Apart from serving notebooks without a GbE port that have to work in corporate environments with wired networks (including those that use 2.5, 5, and 10 GbE), Club 3D’s new adapters can be used to upgrade older desktop PCs that need faster Ethernet connectivity.

Club 3D has not announced pricing of the 2.5 GbE CAC-1420 and CAC-1520 adapters.


Source: Club 3D (via Hermitage Akihabara)

https://www.anandtech.com/show/14115/club-3d-launches-25-gbe-usb-typea-usb-typec-dongles (Thu, 21 Mar 2019 16:00:00 EDT)
Xiaomi Black Shark 2 Gaming Phone: Snapdragon 855, 12 GB RAM, 240 Hz Polling Anton Shilov

The smartphone market is no longer growing as rapidly as it did several years ago, but it is actively segmenting as customers want handsets tailored to their needs. This presents opportunities for companies with R&D capabilities, as they can capitalize on special-purpose devices. A couple of years ago Xiaomi established its Black Shark subsidiary to address mobile gamers. Since then, Black Shark has introduced two gaming handsets. This week, the subsidiary introduced its third offering.

https://www.anandtech.com/show/14116/xiaomi-black-shark-2-gaming-phone (Thu, 21 Mar 2019 14:00:00 EDT)
The GIGABYTE Z390 Aorus Master Motherboard Review: Solid, But Not Special Gavin Bonshor The mainstream motherboard market is still predominantly focused on gamers and gaming features. From the useful to the inane, saying a device is 'gaming' is clearly bringing in the sales, and it becomes an all-out marketing war. Each company is clearly trying to build a gaming brand beyond the company name, even if it means always being confused about how to pronounce it (Ay-orus, or Or-us?). Nonetheless, it is clear that each motherboard company is piling on the R&D dollars, as well as the design dollars, to ensure that it can convince users to part with some hard-earned money in their next build. GIGABYTE's latest attempt is the Z390 Aorus Master, a motherboard that on paper sets its sights on features, aesthetics, and capability.

https://www.anandtech.com/show/14047/the-gigabyte-z390-aorus-master-motherboard-review (Thu, 21 Mar 2019 12:30:00 EDT)
Samsung Develops Smaller DDR4 Dies Using 3rd Gen 10nm-Class Process Tech Anton Shilov

Samsung has completed development of its 3rd-generation 10 nm-class manufacturing process for DRAM as well as the first 8 Gb DDR4 chip that uses the technology. The 1z-nm process technology is said to be the world’s smallest process node for memory, and will enable Samsung to increase productivity without needing to go to extreme ultraviolet lithography (EUVL) at this time. The company plans to start volume production using the technology in the second half of 2019.

https://www.anandtech.com/show/14118/samsung-develops-8-gb-drams-using-3rd-gen-10nmclass-process-technology (Thu, 21 Mar 2019 11:30:00 EDT)
Intel’s Xeon & Xe Compute Accelerators to Power Aurora Exascale Supercomputer Anton Shilov

Intel this week announced that its processors, compute accelerators, and Optane DC persistent memory modules will power Aurora, the first supercomputer in the US projected to feature a performance of one exaFLOP. The system is expected to be delivered in about two years, and goes beyond its initial Xeon Phi specification released in 2014.

https://www.anandtech.com/show/14112/intels-xeon-xe-compute-accelerators-to-power-aurora-exascale-supercomputer (Thu, 21 Mar 2019 09:00:00 EDT)
Intel Releases New Graphics Control Panel: The Intel Graphics Command Center Ryan Smith & Billy Tallis

Making their own contribution to this busy week of GPU and gaming news, this evening Intel took the wraps off of their previously teased new graphics control panel. Dubbed the Intel Graphics Command Center, the new control panel – or to be more technically accurate, the new app – is an effort from Intel to modernize a part of their overall graphics infrastructure, replacing the serviceable (but not necessarily loved) current iteration of the company’s control panel. At the same time however, it’s also the first step in part of a larger process to prepare Intel’s software stack and overall software ecosystem ahead of the company’s ambitious plans to enter the discrete GPU market in 2020.

Starting from the top, Intel’s Graphics Command Center is largely cut from the same cloth as other modern graphics control panels, such as NVIDIA’s GeForce Experience and AMD’s Radeon Settings application. Which is to say, it’s designed to offer a highly visible and streamlined approach to a GPU control panel, making various features easy to find, and overall offering a more user-friendly experience than the company’s current control panel. And while Intel doesn’t go so far as to name names, from their presentation it’s clear that they consider this kind of user-friendly functionality to now be a required, baseline feature for any GPU ecosystem; in which case Intel is (or rather now, was) the only PC GPU vendor lacking an equivalent application.

To that end, the company is launching the new Graphics Command Center as part of their efforts to better support their current users, as well as new users going forward. The Intel Graphics Command Center works with 6th Gen Core processors (Skylake) and later, which at this point is most Intel-powered systems sold in the last few years. The company calls it an “early access” release, and this is a fairly apt description for the utility as while it shows a level of polish and stability that comes with over a year’s work, Intel clearly isn’t done adding features to it yet.

But perhaps the most interesting tidbit about the Graphics Command Center is how it’s being distributed: rather than being bundled with Intel’s drivers, it’s being delivered through the Microsoft Store on Windows 10. Yes, it’s a full-on UWP application with all of the “modern” flourishes that come with it, and this is actually an important part of Intel’s strategy. Because Microsoft’s new DCH driver model requires drivers to be stripped down to the bare essentials and delivered in pieces – graphics control panels can’t be bundled – these sorts of applications instead need to be delivered separately. In which case, using the Microsoft Store lets Intel tap into the OS’s built-in software update functionality. It also means that the control panel isn’t contingent on the checkered driver update schedules of PC OEMs; users can always download the Graphics Command Center out of band.

Overall, the Graphics Command Center borrows a lot from other GPU control applications. Front and center is a games-centric approach to settings, with the application preferring to offer game-specific settings when possible (scanning to discover what games are installed). For one of the 100 or so games on Intel’s list of supported games, this is relatively straightforward, and each game gets its own page with familiar driver-enforced settings such as anti-aliasing, v-sync, and anisotropic filtering.

Meanwhile, Intel has also thrown in some functionality to better explain what these graphics settings do, as well as their performance impacts. A small question mark next to each setting describes what the setting does, and includes a photo demonstrating the concept as well. Meanwhile, toward the right of the control for that setting is an indicator signaling the performance impact of that setting, offering a basic level of guidance about what the current setting will likely do to game performance. This is actually dynamic with the setting itself, so higher levels of MSAA are flagged as causing a greater performance hit, etc.

Going one step further, however, for 30 of those games, Intel also includes support for one-click graphics optimizations, which is indicated by the lightning bolt logo. Similar to how this works with other control panels, this function will actually go into a game and alter its settings to Intel’s suggested settings for the host computer. This allows Intel to adjust game settings on a fine-grained level, adjusting texture and shadow quality, rendering distance, internal AA settings, etc.
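
Intel hasn’t documented how the one-click optimizer edits game files, but conceptually it amounts to a curated per-game, per-GPU settings profile being written into the game’s own configuration. The sketch below is purely illustrative: the game names, GPU names, setting keys, and INI layout are hypothetical, not Intel’s actual implementation.

```python
# Conceptual sketch only: how a "one-click optimize" feature might map a
# detected GPU to suggested settings and write them into a game's config.
# Game names, setting keys, and the INI format are hypothetical examples,
# not Intel's actual implementation.
import configparser

SUGGESTED_SETTINGS = {
    # (game, gpu) -> settings profile curated per hardware tier
    ("ExampleShooter", "Iris Plus 655"): {"texture_quality": "medium",
                                          "shadow_quality": "low",
                                          "render_distance": "70",
                                          "antialiasing": "FXAA"},
    ("ExampleShooter", "UHD 620"):       {"texture_quality": "low",
                                          "shadow_quality": "off",
                                          "render_distance": "50",
                                          "antialiasing": "off"},
}

def optimize(game: str, gpu: str, config_path: str) -> None:
    """Apply the curated profile for (game, gpu) to the game's INI file."""
    profile = SUGGESTED_SETTINGS.get((game, gpu))
    if profile is None:
        return  # no curated profile; leave the game's settings untouched
    cfg = configparser.ConfigParser()
    cfg.read(config_path)
    if "Graphics" not in cfg:
        cfg["Graphics"] = {}
    cfg["Graphics"].update(profile)
    with open(config_path, "w") as f:
        cfg.write(f)

# Example (hypothetical paths): optimize("ExampleShooter", "UHD 620", "ExampleShooter/settings.ini")
```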

I’m told that right now most of the work to determine these settings is being done by hand by Intel engineers – and of that, I assume a lot of it is being taken from Intel’s existing gameplay settings service. However with 3 generations of iGPUs supported and more coming, the use of automation is increasing as well. As to the quality of Intel’s suggestions, I haven’t had nearly enough time with the Graphics Command Center to get a feel for them, though Intel makes it pretty easy to undo it as necessary.

Beyond game settings, the Graphics Command Center also supports all of the other common features you’d expect to find in a graphics control panel. There’s monitor display settings such as resolution and refresh rate, as well as arranging monitors. There are also a series of video quality settings for adjusting color correction, deinterlacing, film detection, etc. Not unlike the graphics settings, there are demo/explanation features here as well, in order to demonstrate in real-time what the various settings do. And of course, there are info panels on the current software and hardware, supported features, etc. This latter part is admittedly nowhere near groundbreaking, but if this is a baseline feature, then it needs to be present regardless.

Past the current functionality, it’s clear that Intel doesn’t consider themselves to be done with the development of their new graphics control panel. Besides adding support for more games – both for detection and one-click optimizations – there are several other features the other GPU vendors regularly support, such as game recording, performance monitoring, and game streaming. So I would be surprised if Intel didn’t eventually move towards parity here as well.

But ultimately the launch of their Graphics Command Center is about more than just improving the present; it’s about laying the groundwork for the future. The company is gearing up to launch its Gen11 iGPU architecture this year, and all signs point to the most common GPU configurations being a good deal more powerful than the Skylake-era GT2 configurations. And next year, of course, is slated to be the launch of Intel’s first Xe discrete GPUs. Intel has grand ambitions here, and to compete with NVIDIA and AMD, they need to match their software ecosystems as well, not just match them on the hardware front. So their latest control panel is an important step forward in establishing that ecosystem.

For the time being, however, Intel is just looking to polish their new control panel. As part of their Odyssey community feedback/evangelism program, Intel is very much embracing the “early access” aspect of this release, and is courting user feedback on the application. And while I admittedly suspect that Intel already knows exactly what they want to do and work on, it certainly doesn’t hurt to solicit feedback on this long road to Xe.

https://www.anandtech.com/show/14117/intel-releases-new-graphics-control-panel-the-intel-graphics-command-center (Wed, 20 Mar 2019 23:30:00 EDT)
HP Reverb Virtual Reality Headset: A 4K HMD with 6DOF Anton Shilov

Numerous companies are making attempts to drive VR technology into the commercial space. HP this week introduced its first AR/VR headset designed from the ground up for both consumer and commercial/professional applications.

https://www.anandtech.com/show/14104/hp-reverb-virtual-reality-headset (Wed, 20 Mar 2019 17:00:00 EDT)
Oculus Rift S VR Headset: An Upgraded Virtual Reality Experience Anton Shilov

Oculus VR has introduced its new Oculus Rift S virtual reality PC-powered headset. The new head-mounted display (HMD) features inside-out tracking and does not require any external sensors. As a generational update, it has a higher-resolution screen than the original Oculus Rift. The new unit will ship this spring.

https://www.anandtech.com/show/14114/oculus-rift-s-vr-headset (Wed, 20 Mar 2019 14:45:00 EDT)
Samsung’s Space-Saving Monitors on Pre-Order: Up to 31.5-Inch Anton Shilov

Large displays tend to occupy a lot of desk space, something that is not appreciated by many. Samsung has developed a family of monitors featuring a minimalistic design that promises to save as much space as possible while still providing 27 or 31.5 inches of screen real estate. Announced early this year, Samsung’s Space Monitors are now available for pre-order and will ship in April.

https://www.anandtech.com/show/14108/samsung-27-and-315inch-space-monitors-available (Wed, 20 Mar 2019 14:00:00 EDT)
HP Unveils ProDesk 405 G4 Desktop Mini PC: An SFF Ryzen Pro Desktop Anton Shilov

Over the past few months we have seen increasing adoption of AMD Ryzen processors by manufacturers of ultra-compact form-factor (UCFF) desktops. At present, the number of UCFF systems powered by AMD’s Ryzen is not large, but it is growing. On Tuesday HP announced its first small form-factor commercial desktop powered by AMD’s Ryzen Pro 2000-series.

https://www.anandtech.com/show/14106/hp-unveils-prodesk-405-g4-desktop-mini-pc-an-sff-ryzen-probased-desktop (Wed, 20 Mar 2019 12:00:00 EDT)
Samsung HBM2E ‘Flashbolt’ Memory for GPUs: 16 GB Per Stack, 3.2 Gbps Anton Shilov

Samsung has introduced the industry’s first memory conforming to the HBM2E specification. The company’s new Flashbolt memory stacks increase performance by 33% and offer double the per-die as well as per-package capacity. Samsung introduced its HBM2E DRAMs at GTC, a fitting location since NVIDIA is one of the biggest HBM2 consumers due to its popular GV100 processor.
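
The per-stack bandwidth implied by those figures is easy to work out: HBM2-class stacks expose a 1024-bit interface (a figure from the HBM2 spec rather than from Samsung’s announcement), so 3.2 Gbps per pin works out to roughly 410 GB/s per stack, about a third more than a 2.4 Gbps HBM2 stack, consistent with the 33% claim.

```python
# Per-stack bandwidth from per-pin data rate; HBM2/HBM2E stacks expose a
# 1024-bit interface (assumed from the HBM2 spec, not stated in the article).
BUS_WIDTH_BITS = 1024

def stack_bandwidth(gbps_per_pin: float) -> float:
    return gbps_per_pin * BUS_WIDTH_BITS / 8  # GB/s

hbm2e = stack_bandwidth(3.2)   # Flashbolt data rate
hbm2  = stack_bandwidth(2.4)   # previous-generation 2.4 Gbps HBM2 speed grade
print(f"HBM2E: {hbm2e:.0f} GB/s per stack")          # ~410 GB/s
print(f"HBM2:  {hbm2:.0f} GB/s per stack")           # ~307 GB/s
print(f"Uplift: {100 * (hbm2e / hbm2 - 1):.0f}%")    # ~33%
```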

https://www.anandtech.com/show/14110/samsung-introduces-hbm2e-flashbolt-memory-16-gb-32-gbps (Wed, 20 Mar 2019 11:00:00 EDT)
Apple Launches 2nd Gen AirPods: Longer Talk Time & Hands-Free ‘Hey Siri’ Anton Shilov

Apple on Wednesday introduced its 2nd Generation AirPods. The new AirPods support hands-free ‘Hey Siri’ functionality, longer battery life for conversations, and faster connect times. The new headset will be available with both wireless and wired charging cases.

https://www.anandtech.com/show/14113/apple-launches-2nd-gen-airpods-longer-talk-time-handsfree-hey-siri (Wed, 20 Mar 2019 09:40:00 EDT)
Apple Upgrades iMac and iMac Pro: More Cores, More Graphics, More Memory Anton Shilov

Apple has updated its iMac all-in-one desktop computers to use Intel's latest-generation processors with up to eight cores plus AMD’s latest Pro graphics, and has equipped its iMac Pro with more memory and a faster GPU. Since Apple upgrades its iMac product line only every couple of years or so, the company has every right to claim that its top-of-the-range AIO PCs are now up to twice as fast as their predecessors.

https://www.anandtech.com/show/14107/apple-upgrades-imac-line (Tue, 19 Mar 2019 17:45:00 EDT)
SilverStone EP14: A Miniature USB-C Hub with HDMI, USB-A, 100 W Power Anton Shilov

With hundreds of different USB Type-C adapters and docks on the market, manufacturers are trying hard to make theirs more attractive. To that end, they now tend to design rather interesting products addressing focused use cases. SilverStone has introduced its new compact USB-C dock that has three USB-A ports, a display output, and can pass through up to 100 W of power to charge a laptop and/or devices connected to the USB-A ports, a rare feature for small docks.

https://www.anandtech.com/show/14093/silverstone-ep14-a-miniature-usbc-hub-with-hdmi-usba-100-w-power (Tue, 19 Mar 2019 14:00:00 EDT)
Google Announces Stadia: A Game Streaming Service Ian Cutress Today at GDC, Google announced its new video game streaming service, called Stadia. This builds on information from earlier this year that AMD was powering Project Stream (as it was then called) with Radeon Pro GPUs, and now Google is confirming that AMD is a primary partner, with the project using AMD GPUs. (Edit: AMD reached out to confirm that their press release only mentioned GPUs.)

Stadia is being advertised as the central community for gamers, creators, and developers. The idea is that people can play a wide array of games regardless of the hardware at hand. Back in October, Google debuted the technology showcasing a top-end AAA gaming title running at 60 FPS. Google wants a single place where gamers and YouTube creators can get together – no current gaming platform, according to Google, does this.

Ultimately Google wants to stream straight to the Chrome browser. Google has worked with leading publishers and developers to help build the system infrastructure. Google is one of the few companies with enough content delivery networks around the world to ensure that frame rates are kept high with very low latency.

Users will be able to watch a video about a game, then instantly hit ‘Play Now’ and start playing the game in under five seconds, without any download or lag. The idea is that a single code base can be enjoyed on any screen. At launch, desktops, laptops, TVs, tablets, and phones will be supported. With Stadia, the datacenter is the platform: no hardware acceleration is required on the device, and the experience can be transferred between devices, such as from a Chromebook to a smartphone.

One of the highlights of Google’s demonstration of Stadia was the platform working on Google-enabled TVs.

The platform allows users to bring any USB-connected controller, or a mouse and keyboard. Google will also be releasing its own Stadia Controller, available in three colors – white, black, and light blue. The controller connects via Wi-Fi straight to the cloud, and also detects which device is being used (it’s unclear exactly how this works).

The controller has two new buttons. The first allows saving and sharing the experience out to YouTube. The second is Google Assistant, using the integrated microphone in the controller. This allows game developers to integrate Google Assistant into their games. It also allows users to ask Google when they need help in a game - and the assistant will look for a guide to help.

Stadia uses the same datacenter infrastructure already in place at Google. There are 7,500+ edge nodes, which allows compute resources to sit closer to players for lower latency. Custom-designed, purpose-built hardware powers the experience, and interconnected racks have sufficient compute and memory for the most demanding games. The technology has been in development inside Google for years.

At launch, resolutions will be supported up to 4K at 60 fps with HDR and surround sound. Support for up to 8K streaming at 120 fps is planned for the future, and the platform has been built to scale to support it. While playing, the stream is duplicated in 4K for direct upload – you get rendering-quality video rather than what you capture locally.
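
To put that planned upgrade in perspective, 8K at 120 fps is roughly eight times the raw pixel rate of 4K at 60 fps, which is why the per-instance hardware has to scale so aggressively. A quick illustration (frame sizes assume the standard UHD and 8K UHD resolutions):

```python
# Raw pixel throughput comparison between the launch target and the future
# target. Resolutions assume UHD (3840x2160) and 8K UHD (7680x4320).
def pixel_rate(width: int, height: int, fps: int) -> float:
    return width * height * fps  # pixels per second

launch = pixel_rate(3840, 2160, 60)    # 4K60
future = pixel_rate(7680, 4320, 120)   # 8K120
print(f"4K60:  {launch / 1e9:.2f} Gpixel/s")
print(f"8K120: {future / 1e9:.2f} Gpixel/s")
print(f"Scale factor: {future / launch:.0f}x")   # 8x
```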

The platform is instance based, so Google can scale when needed. Game developers no longer have to worry about building to a specific hardware performance – the datacenter can scale as required.

Stadia is powered by a custom AMD GPU offering 10 TFLOPS of compute, paired with a custom CPU with AVX2 support; combined they create a single instance per player. The platform uses Linux and Vulkan, with full Unreal and Unity support, as well as Havok engine support. Tool companies are onboard.

At a high level, the specifications for the GPU are almost a perfect match for AMD's Radeon Vega 56, right down to the number of CUs and compute throughput. So while not confirmed, it's very likely that Google is using some kind of Vega 10 card; probably a variant of the Radeon Instinct MI25.
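
The arithmetic behind that comparison: a GCN compute unit contains 64 shaders, each capable of one fused multiply-add (two FLOPs) per clock, so 56 CUs at a sustained clock around 1.4 GHz lands almost exactly on Google's 10 TFLOPS figure. The clock value below is our assumption for illustration; Google only quoted the TFLOPS number.

```python
# Rough check of the "Vega 56-class" reasoning. The sustained clock is an
# assumption for illustration; Google only quoted the 10 TFLOPS figure.
CUS              = 56      # compute units in Vega 56
SHADERS_PER_CU   = 64      # GCN stream processors per CU
FLOPS_PER_SHADER = 2       # one fused multiply-add per clock
CLOCK_GHZ        = 1.40    # assumed sustained clock

tflops = CUS * SHADERS_PER_CU * FLOPS_PER_SHADER * CLOCK_GHZ / 1000
print(f"~{tflops:.1f} TFLOPS")   # ~10.0 TFLOPS, in line with Google's spec
```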

However, it's notable (and unusual) that Google is only announcing its partner for the GPU and not the CPU. With AMD capable of delivering solid products in both categories, one would consider them the obvious choice for the CPU as well since they're already providing the GPU. However Google's announcement took special care not to name the CPU partner, and even AMD emailed us that they could only confirm the use of AMD GPUs. So whether Google's CPU vendor is Intel or AMD remains to be seen. There are good arguments for each based on the vague specifications, though with AVX2 support listed, if it is AMD then that would mean Google has gotten its hands on some early Zen 2 CPUs.

One of the first games supported will be Doom Eternal from id Software, which will support 4K with HDR at 60 fps. Every user will get a single GPU with no other users.

UL Benchmarks (3DMark) has been working with Google to help benchmark the systems and measure the power of the infrastructure. It also appears that developers can use multiple GPUs if required.

Multiplayer is also supported, at least between different Stadia players. Distributed physics becomes possible, which means up to 1000 players in Battle Royale titles. There’s also the advantage, according to Google, of getting around hackers and cheaters.

Developers can support multi-platform multiplayer, and transfer save files between platforms. Game developers have already been working on MP demos with destructive environments using real-time rigid body physics, allowing for perfect synchronization.

Google also points out that split-screen gaming has not been a priority recently because of rendering two scenes at once. With Stadia, that problem disappears, as each player will be powered by a separate instance, reviving the idea of local co-op and squad based gaming. This also allows for multiple cameras for a single player to navigate a single map, for better tactics in certain types of games. Google says that this ability allows developers to create new types of games.

Built on Google’s platform, Stadia will also support machine learning. For developers that want to take advantage, they can incorporate Google and third-party libraries to help improve games over time and enhance the experience both on a per-user level and on a local/global scale.

The other focus of Stadia is the interaction with YouTube. Google points out that gaming has been a fundamental part of YouTube since its inception, and it is Google’s goal to help creators interact with (and monetize) their audience. The idea is that creators can livestream directly from Stadia, as well as play with creators through Stadia. ‘Crowd Play’ will allow users to drop directly into the server instance with the creator – it acts like a lobby, so players wait in line to play with their favorite creator. For example, the NBA 2K demo shown at the event displayed 'join this game (3rd in line)'.

Google states that any link from any location can act as a launch point for a title. This means that developers do not have to be limited to a single game store – games can be launched from almost anywhere, as long as the user is in an up to date Chrome browser. Google is also set to put extensive parental controls into the mix.

Google will be creating an entity called ‘Stadia Games and Entertainment’, headed up by Jade Raymond, enabling first-party studios to build for Stadia. Other partner studios will also work through the new division, as an outreach effort to enable game development on Stadia.

Developers who want to create for Stadia should go to stadia.dev to sign up for tools and resources; Stadia Partners is the program for distributors, and Stadia.com will be the hub for gamers.

Stadia will launch in 2019, in the US, Canada, UK, and most of Europe. No word on pricing yet, but Google will be announcing more in the Summer.

https://www.anandtech.com/show/14105/google-announces-stadia-a-game-streaming-service (Tue, 19 Mar 2019 13:12:00 EDT)
HP Reveals Envy x360 15 Laptops with AMD's Latest Ryzen APUs Anton Shilov

HP on Tuesday introduced its new 15.6-inch convertible notebooks based on AMD’s Ryzen Mobile 3000-series APUs. The new HP Envy x360 15 machines are positioned as inexpensive 15.6-inch-class laptops for productivity applications. In addition, the company announced new Intel-based HP Envy x360 15 PCs.

HP’s AMD Ryzen 3000 and Intel Core i5/i7-based Envy x360 15 convertibles use exactly the same sand-blasted anodized aluminum chassis and thus have the same dimensions (17 mm z-height) and weight (~2 kilograms). The only visual difference between the AMD and Intel-powered Envy x360 15 PCs is the color: the former features HP’s Nightfall Black finish, whereas the latter features HP’s Natural Silver finish. Overall, the new 15.6-inch Envy x360 convertibles feature a 28% smaller bezel than the previous generation, according to the manufacturer. Meanwhile, all the HP Envy x360 15 machines introduced today use the same 15.6-inch Full-HD IPS touch-enabled display panel featuring WLED backlighting.

Inside the new AMD-based HP Envy x360 15 convertible laptops are AMD’s quad-core Ryzen 5 3500U or Ryzen 7 3700U processors with integrated Radeon RX Vega 8/10 graphics. The APUs are accompanied by 8 GB of single-channel DDR4-2400 memory as well as a 256 GB NVMe/PCIe M.2 SSD. As for the Intel-powered Envy x360 15, it uses Core i5-8265U or Core i7-8565U CPUs.

As far as connectivity is concerned, everything looks rather standard: the systems feature an 802.11ac + Bluetooth 5.0/4.2 controller from Intel or Realtek, one USB 3.1 Gen 1 Type-C connector (with DP 1.4), two USB 3.1 Gen 1 Type-A ports, an HDMI output, a 3.5-mm audio connector for headsets, an SD card reader, and so on. The new Envy x360 15 also has an HD webcam with a dual-array microphone and a kill switch, a fingerprint reader, Bang & Olufsen-badged stereo speakers, and a full-sized keyboard.

When it comes to battery life, HP claims that its AMD Ryzen Mobile-powered Envy x360 15 convertibles offer exactly the same battery life as Intel-based machines: up to 13 hours of mixed usage when equipped with a 55.67 Wh battery.
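
HP's 13-hour figure implies a very low average system draw; dividing the battery capacity by the claimed runtime gives a sense of what "mixed usage" must look like in practice:

```python
# Average system power implied by HP's battery-life claim.
battery_wh = 55.67
claimed_hours = 13
print(f"Implied average draw: {battery_wh / claimed_hours:.1f} W")  # ~4.3 W
```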

HP will start sales of its Envy x360 15 convertible notebooks with AMD Ryzen Mobile inside this April. Pricing will start at $799.99. By contrast, a system featuring Intel’s Core i5-8265U with a generally similar configuration will cost $869.99.

HP Envy x360 15
  Models: Envy x360 15 (AMD): 15m-ds0011dx, 15m-ds0012dx | Envy x360 15 (Intel): 15m-dr0011dx, 15m-dr0012dx
  Display: 15.6-inch IPS, 1920x1080 (both)
  Processor (AMD): Ryzen 5 3500U (4C/8T, 2.1 GHz base, 3.7 GHz turbo) or Ryzen 7 3700U (4C/8T, 2.3 GHz base, 4.0 GHz turbo)
  Processor (Intel): Core i5-8265U (4C/8T, 1.6 GHz base, 3.9 GHz turbo) or Core i7-8565U (4C/8T, 1.8 GHz base, 4.0 GHz turbo)
  Graphics: Radeon Vega 8 or Vega 10 (AMD) | Intel UHD Graphics 620 (Intel)
  RAM: 8 GB DDR4-2400 (not user accessible)
  Storage: 256 GB PCIe/NVMe (AMD) | 256 GB PCIe/NVMe, or 512 GB PCIe/NVMe + 32 GB Optane (Intel)
  Network: Realtek 2x2 802.11ac + Bluetooth 4.2 (AMD) | Intel Wireless-AC 9560 2x2 802.11ac + Bluetooth 5.0 (Intel)
  Audio: Bang & Olufsen dual speakers
  Digital Media: SD card reader
  Keyboard: Full-size island-style backlit keyboard
  External Ports: 1 x USB Type-C 3.1 Gen 1, 2 x USB 3.1 Gen 1 Type-A, 1 x HDMI, 1 x 3.5 mm jack
  Dimensions / Weight: 14.13 x 9.68 x 0.67 inches | 2 kilograms (4.53 lbs)
  Battery / PSU: 3-cell 55.67 Wh Li-polymer, 65 W AC adapter
  Price: Starting at $799.99 (AMD) | Starting at $869.99 (Intel)


Source: HP

https://www.anandtech.com/show/14103/hp-reveals-envy-x360-15-laptops-with-amd-ryzen-mobile-3000series-apus (Tue, 19 Mar 2019 11:00:00 EDT)
Western Digital: Over Half of Data Center HDDs Will Use SMR by 2023 Anton Shilov

Western Digital said at the OCP Global Summit last week that over half of hard drives for data centers will use shingled magnetic recording (SMR) technology in 2023. At present Western Digital is the only supplier of host-managed SMR HDDs, but the technology is gaining support from hardware, software, and application vendors.

SMR technology boosts the capacity of hard drives fairly easily, but at the cost of some performance trade-offs due to the read-modify-write cycle introduced by shingled tracks. Since operators of datacenters are interested in maximizing their storage capacities, they are inclined to invest in software that can mitigate the peculiarities of SMR. As a result, several years after Western Digital introduced its first host-managed SMR HDDs, more and more companies are adopting them. Right now, the vast majority of datacenter hard drives are based on perpendicular magnetic recording (PMR) technology, but WD expects that in four years SMR HDDs will leave PMR drives behind.
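
The read-modify-write penalty comes from the fact that shingled tracks within a band (zone) overlap, so neither the drive nor host-managed software can update one track in place without rewriting everything shingled on top of it. The toy model below illustrates the kind of bookkeeping the host software layer takes on; the zone size and interface are illustrative, not Western Digital's actual zoned-storage API.

```python
# Toy model of a host-managed SMR zone: writes must be sequential within a
# zone, so "updating" data means appending a new copy and later rewriting
# (resetting) the zone to reclaim stale blocks. Sizes/API are illustrative.
class SMRZone:
    def __init__(self, blocks_per_zone: int = 256):
        self.capacity = blocks_per_zone
        self.write_pointer = 0          # next writable block index
        self.data = {}                  # block index -> payload

    def append(self, payload: str) -> int:
        """Sequential write at the zone's write pointer (the only legal write)."""
        if self.write_pointer >= self.capacity:
            raise IOError("zone full: host must reset (rewrite) the zone first")
        idx = self.write_pointer
        self.data[idx] = payload
        self.write_pointer += 1
        return idx

    def update(self, idx: int, payload: str) -> int:
        """No in-place updates: write a new copy; the old block becomes garbage."""
        return self.append(payload)

    def reset(self, live_payloads):
        """Read-modify-write of the whole zone: rewrite only the live data."""
        self.data, self.write_pointer = {}, 0
        return [self.append(p) for p in live_payloads]

zone = SMRZone()
a = zone.append("record A")
b = zone.update(a, "record A v2")    # old copy of A is now stale garbage
```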

Obviously, SMR will not be the only method used to increase the capacity of hard drives. Energy-assisted PMR technologies (e.g., MAMR, HAMR, etc.) will also be used by Western Digital. In the coming quarters the company intends to release MAMR-based HDDs featuring 16 TB (ePMR) and 18 TB (eSMR) capacities. The company also plans to introduce 20 TB HDDs in 2020.

High-capacity hard drives are not going to be replaced by high-capacity SSDs any time soon, according to Western Digital. HDDs will continue to cost significantly less than SSDs on a per-TB basis, and are therefore expected to store 6.5 times more data than datacenter SSDs in 2023.


Source: Western Digital Presentation at OCP, YouTube

https://www.anandtech.com/show/14099/western-digital-over-half-of-dc-hdds-will-use-smr-by-2023 (Tue, 19 Mar 2019 10:30:00 EDT)
Quick Note: NVIDIA’s “Einstein” Architecture Was A Real Project Ryan Smith

While it was never an official NVIDIA codename as far as roadmaps go, the name “Einstein” came up in rumors a few times earlier this decade. At the time, Einstein was rumored to be the architecture that would follow Maxwell in the NVIDIA lineup. And while we sadly didn’t find out anything new about NVIDIA’s future roadmap at this year’s show – or any sign of Ampere or other 7nm chips – I did inadvertently find out that the rumors about Einstein were true. At least, from a certain point of view.

While talking with NVIDIA’s research group this morning about some of their latest projects (more on this a bit later this week when I have the time), the group was talking about past research projects. And, as it turns out, one of those former research projects was Einstein.

Rather than just being a baseless rumor, Einstein was in fact a real project at NVIDIA. However, rather than being an architecture per se, it was a research GPU that the NVIDIA research group was working on. And although this research project didn’t bear fruit under the Einstein name, it did under another name that is far more well-known: Volta.

So while this means we can scratch Einstein off the list of names for potential future NVIDIA architectures, the project itself was real, and it was actually a big success for NVIDIA. Einstein morphed into what became the Volta architecture, which has become the cornerstone of all of NVIDIA’s current-generation GPUs for servers and clients. This includes both regular Volta and its graphics-enhanced derivative, Turing.

https://www.anandtech.com/show/14102/quick-note-nvidias-einstein-architecture-was-a-real-project (Tue, 19 Mar 2019 00:30:00 EDT)
Nvidia Announces Jetson Nano Dev Kit & Board: X1 for $99 Andrei Frumusanu

Today at GTC 2019 Nvidia launched a new member of the Jetson family: the Jetson Nano. The Jetson family of products represents Nvidia's focus on robotics, AI, and autonomous machine applications. A few months back we had the pleasure of taking a high-level look at the Jetson AGX as well as the Xavier chip that powers it.

The biggest concern with the AGX dev kit was its pricing – with a retail price of $1299, it is massively out of range for most hobbyist users, such as our readers.

The new Jetson Nano addresses the cost issue in quite a dramatic way. Here Nvidia promises to deliver a similar level of functionality to that of its more expensive Jetson products at a much lower price point, and of course at a lower performance point.

The Jetson Nano is a full-blown single-board computer in the form of a module. The module uses an SO-DIMM form-factor and connector, similar to Nvidia's past modules. The goal is to be as compact as possible, as the module is envisioned to be used in a wide variety of applications where a customer will design their own carrier board to best fit their design needs.

At the heart of the Nano module we find Nvidia’s “Erista” chip, the same silicon as the Tegra X1 that powered the Nvidia Shield as well as the Nintendo Switch. The variant used in the Nano is a cut-down version though, as the 4 A57 cores only clock up to 1.43 GHz and the GPU only has half the cores (128 versus 256 in the full X1) active. The module comes with 4 GB of LPDDR4 and a 16 GB eMMC module. The standalone Jetson Nano module for use in COTS production will be available to interested parties for $129/unit in quantities of 1000.

Naturally, because you can’t do much with the module itself, Nvidia also offers the Jetson Nano in the form of a complete computer: the Jetson Nano Developer Kit. Among the advantages of the kit are vastly better hardware capabilities compared to competing solutions, such as the performance of the SoC, or simply better connectivity: 4 full-size USB ports (3x 2.0 + 1x 3.0), HDMI, DisplayPort, and a Gigabit Ethernet port, along with the usual SDIO, I2C, SPI, GPIO, and UART connectors you’re used to on such boards. One even finds an M.2 connector for additional WiFi as well as a MIPI-CSI interface for cameras.
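
For readers wondering what driving that header looks like in practice, Nvidia provides a Jetson.GPIO Python library whose API is modeled on RPi.GPIO. A minimal sketch follows, assuming that package is installed; the pin number is just an example, so check the board pinout before wiring anything.

```python
# Minimal sketch of toggling a header pin on the Jetson Nano dev kit using
# Nvidia's Jetson.GPIO library (API modeled on RPi.GPIO). The pin number is
# an example; consult the board pinout before wiring anything.
import time
import Jetson.GPIO as GPIO

LED_PIN = 12                       # example board-numbered pin

GPIO.setmode(GPIO.BOARD)           # use physical header numbering
GPIO.setup(LED_PIN, GPIO.OUT, initial=GPIO.LOW)
try:
    for _ in range(5):
        GPIO.output(LED_PIN, GPIO.HIGH)
        time.sleep(0.5)
        GPIO.output(LED_PIN, GPIO.LOW)
        time.sleep(0.5)
finally:
    GPIO.cleanup()                 # release the pin on exit
```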


[Image: Jetson AGX Dev Kit vs Jetson Nano Dev Kit]
[Image: Jetbot with the Jetson Nano Dev Kit]

The Jetson Nano Development Kit can be had for only $99. One way Nvidia reaches this price is through the omission of on-board storage, and the kit is driven purely by microSD card. Availability starts today.

We have the Jetson Nano in-house and will be seeing what fun things Nvidia has cooked up for us soon!

https://www.anandtech.com/show/14101/nvidia-announces-jetson-nano (Mon, 18 Mar 2019 19:00:00 EDT)
NVIDIA To Bring DXR Ray Tracing Support to GeForce 10 & 16 Series In April Ian Cutress & Ryan Smith

During this week, both GDC (the Game Developers Conference) and GTC (NVIDIA's GPU Technology Conference) are happening in California, and NVIDIA is out in force. The company's marquee gaming-related announcement today is that, as many have been expecting, NVIDIA is bringing DirectX 12 DXR ray tracing support to the company's GeForce 10 series and GeForce 16 series cards.

https://www.anandtech.com/show/14100/nvidia-april-driver-to-support-ray-tracing-on-pascal-gpus-dxr-support-in-unity-and-unreal (Mon, 18 Mar 2019 18:40:00 EDT)
The NVIDIA GPU Tech Conference 2019 Keynote Live Blog (Starts at 2pm PT/21:00 UTC) Ryan Smith Kicking off a very busy week for tech events in California, my first stop for the week is NVIDIA's annual GPU Technology Conference in San Jose.

As always, CEO Jensen Huang will be kicking off the show proper with a 2-hour keynote, no doubt making some new product announcements and setting the pace for the company for the next year. The biggest question that's no doubt on everyone's minds is what NVIDIA plans to do for 7nm, as that process node is quickly maturing. Hopefully we'll find out the answer to that and more, so be sure to check in at 2pm Pacific to see what's next for NVIDIA.

https://www.anandtech.com/show/14097/the-nvidia-gpu-tech-conference-2019-keynote-live-blog (Mon, 18 Mar 2019 14:00:00 EDT)
Micron Introduces 2200 Client NVMe SSD With New In-House Controller Billy Tallis

Micron has announced the first product based on their new in-house client NVMe SSD controller. The Micron 2200 doesn't boast performance sufficient to compete with the top enthusiast-class NVMe drives on the retail market, but should be plenty fast enough for OEMs and system integrators to use it as a performance option in the business PCs it is intended for.

Micron has been notably slow about bringing NVMe to their client and consumer product lines. They initially planned to launch both client OEM and consumer retail drives built around the combination of their first-generation 32-layer 3D NAND and the Silicon Motion SM2260 controller, but those plans were shelved as it became clear that combination could not deliver high-end performance. Last fall Micron finally launched the Crucial P1 entry-level NVMe SSD with QLC NAND and the SM2263 controller, but no high-end product has been announced until now.

It's been no secret that Micron has been working on their own NVMe SSD controllers. Every other NAND manufacturer has either developed in-house controllers or acquired a controller vendor, and complete vertical integration has worked out extremely well for companies like Samsung. Micron has been the odd man out sourcing all their controllers from third parties like Silicon Motion, Marvell and Microsemi, but their 2015 acquisition of startup controller design firm Tidal Systems made their intentions clear. That acquisition and any other in-house controller design efforts bore no visible fruit until Flash Memory Summit last year, when a prototype M.2 client NVMe SSD was quietly included in their exhibits.

Micron 2200 Specifications
  Capacity: 256 GB / 512 GB / 1 TB
  Form Factor: M.2 2280, single-sided
  Interface: NVMe, PCIe 3.0 x4
  Controller: Micron in-house
  NAND: Micron 64-layer 3D TLC
  Sequential Read: 3000 MB/s
  Sequential Write: 1600 MB/s
  4KB Random Read: 240k IOPS
  4KB Random Write: 210k IOPS
  Power: 6 W active / 300 mW idle / 5 mW sleep
  Warranty Endurance: 75 TB (256 GB) / 150 TB (512 GB) / 300 TB (1 TB)

Micron has not yet shared details about their new NVMe controller, but the basic specs for the 2200 SSD are available. The 2200 uses Micron's 64-layer 3D TLC NAND flash memory and offers drive capacities from 256GB to 1TB as single-sided M.2 modules. The drive uses a PCIe gen 3 x4 interface and has the expected features for a Micron client drive, including power loss protection for data at rest and SKUs with or without TCG Opal self-encrypting drive (SED) capabilities.

The performance and write endurance ratings for the Micron 2200 don't match up well against top consumer drives, but compare favorably against entry-level NVMe SSDs. Endurance is actually lower than that of their Crucial MX500 mainstream consumer SATA drive, so any retail derivative of the 2200 will need to improve on that metric. No such retail version has been announced, but with the 2200 available now it is likely we'll be hearing from Crucial within a few months, though they may wait until later in the year to launch with 96-layer NAND instead of 64-layer.
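
To put the endurance figures in everyday terms, converting total bytes written into drive writes per day requires assuming a warranty period. Micron has not stated the term alongside these specs, so the five-year figure below is purely an assumption for illustration:

```python
# Convert the 2200's rated endurance (TBW) into drive writes per day (DWPD),
# assuming a 5-year warranty term -- an assumption, since Micron has not
# published the warranty length alongside these specs.
ASSUMED_WARRANTY_YEARS = 5

def dwpd(tbw: float, capacity_tb: float, years: int = ASSUMED_WARRANTY_YEARS) -> float:
    return tbw / (capacity_tb * years * 365)

for capacity_tb, tbw in ((0.256, 75), (0.512, 150), (1.0, 300)):
    print(f"{capacity_tb:.3f} TB: {dwpd(tbw, capacity_tb):.2f} DWPD")
    # ~0.16 DWPD across the range -- modest, as noted above
```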

https://www.anandtech.com/show/14098/micron-introduces-2200-client-nvme-ssd-with-new-inhouse-controller (Mon, 18 Mar 2019 13:00:00 EDT)
Apple Announces New 10.5" iPad Air, 7.9" iPad mini Andrei Frumusanu

Today in a surprise announcement, Apple unveiled refreshes of both the iPad Air and iPad mini lineups. The last releases in these lineups were the iPad Air 2 and the iPad mini 4, back in 2014 and 2015 respectively. We had thought Apple had abandoned the models, yet today’s release breathes fresh air into the devices with much-needed internal hardware upgrades as well as new functionality.

Apple iPad Comparison
  SoC:
    iPad Air 2: Apple A8X (3 x Typhoon @ 1.5 GHz)
    iPad mini 4: Apple A8 (2 x Typhoon @ 1.5 GHz)
    iPad Air (2019) / iPad mini (2019): Apple A12 Bionic (2 x Vortex @ 2.5 GHz + 4 x Tempest @ 1.59 GHz)
  Display:
    iPad Air 2: 9.7" 2048x1536 IPS LCD
    iPad mini 4: 7.9" 2048x1536 IPS LCD
    iPad Air (2019): 10.5" 2224x1668 IPS LCD, DCI-P3, True Tone
    iPad mini (2019): 7.9" 2048x1536 IPS LCD, DCI-P3, True Tone
  Dimensions / Weight:
    iPad Air 2: 240 x 169.5 x 6.1 mm, 437 g
    iPad mini 4: 203.2 x 134.8 x 6.1 mm, 298.8 g
    iPad Air (2019): 250.6 x 174.1 x 6.1 mm, 456 g / 464 g
    iPad mini (2019): 203.2 x 134.8 x 6.1 mm, 300 g / 308.2 g
  RAM: 2 GB LPDDR3 (Air 2, mini 4); not disclosed (2019 models)
  NAND: 16 / 64 / 128 GB (Air 2, mini 4); 64 / 256 GB (2019 models)
  Battery: 27.3 Wh (Air 2), 19.1 Wh (mini 4), 30.2 Wh (Air 2019), 19.1 Wh (mini 2019)
  Front Camera: 1.2 MP, F/2.2 (Air 2, mini 4); 7 MP, F/2.2 (2019 models)
  Rear Camera: 8 MP, F/2.4, 1.1 micron (Air 2, mini 4); 8 MP, F/2.4 (2019 models)
  Cellular: 2G / 3G / 4G LTE (Category 9) (Air 2, mini 4); UE Category 16 LTE (1 Gbps) with 4x4 MIMO and LAA (2019 models)
  SIM Size: NanoSIM (Air 2, mini 4); NanoSIM + eSIM (2019 models)
  Wireless: 802.11a/b/g/n/ac 2x2 MIMO, BT 4.2 LE, GPS/GLONASS (Air 2, mini 4); 802.11a/b/g/n/ac 2x2 MIMO, BT 5.0 LE, GPS/GLONASS (2019 models)
  Connectivity: Apple Lightning, 3.5 mm headphone (all models)
  Launch OS: iOS 9 (Air 2, mini 4); iOS 12 (2019 models)
  Launch Price:
    iPad Air 2: $499 (16 GB), $599 (64 GB), $699 (128 GB)
    iPad mini 4 (Wi-Fi / Cellular): $399/$529 (16 GB), $499/$629 (64 GB), $599/$729 (128 GB)
    iPad Air (2019) (Wi-Fi / Cellular): $499/$629 (64 GB), $649/$779 (256 GB)
    iPad mini (2019) (Wi-Fi / Cellular): $399/$529 (64 GB), $549/$679 (256 GB)

On the internal hardware side, both the new iPad Air (2019) and the new iPad mini (2019) make use of Apple’s new 7nm A12 chipset, which we’ve already seen in the iPhone XS and XR models. The A12X’s increased performance thus remains exclusive to the iPad Pro models this year.


iPad Mini 2019

The new iPad mini doesn’t change its design from its predecessor, which might not be to everybody’s liking in 2019, as the rather big bezels do feel a bit out of place compared to other, newer tablets. While the design hasn’t seen an update, the 7.9” 2048x1536 IPS display does see some significant changes, as it now supports Display P3 as well as True Tone.


iPad Air 2019

The new iPad Air, on the other hand, does see significant design changes with a slight reduction in bezels, offering more screen real estate. The new display now comes in at 10.5 inches diagonally and increases the resolution to 2224x1668. Like the new iPad mini, it also now supports P3 wide-gamut content as well as True Tone.
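
The pixel density works out essentially unchanged versus the 9.7-inch panel: 2224x1668 over a 10.5-inch diagonal is roughly 264-265 ppi, the same "Retina" density Apple has long used on its larger iPads, while the mini's 2048x1536 at 7.9 inches stays in the mid-320s (Apple's own spec sheets round these slightly differently).

```python
# Pixel density from resolution and diagonal size.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(f"iPad Air (2019): {ppi(2224, 1668, 10.5):.0f} ppi")    # ~265 ppi
print(f"iPad mini (2019): {ppi(2048, 1536, 7.9):.0f} ppi")    # ~324 ppi
```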

The new Air is ever so slightly bigger than its predecessor, being 10mm taller, 4.6mm wider and 19g heavier. The new battery does increase from 27.3Wh to 30.2Wh.

Interestingly, both devices still come with the home button and its capacitive fingerprint sensor, as well as a 3.5mm headphone jack (not that we're complaining), so this is probably Apple’s purest hardware-only refresh ever.

The one big new feature of the new iPads is that they are now compatible with the Apple Pencil. It should be noted that we’re talking about the first-generation Pencil, and not the second-generation unit found in 2018’s iPad Pros.

Overall, it’s interesting to see Apple refresh the iPad line-up, especially the often-forgotten iPad mini. Apple’s reluctance to make any major design changes to the products, even four years on, is quite odd, but then again: if it isn’t broken, don’t fix it.

The new iPad mini and iPad Air come in 64 and 256GB variants, starting at $399 for the iPad mini and $499 for the iPad Air. The extra storage costs you $150, and added cellular connectivity adds another $130.
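
Those two adders reproduce every price in the comparison table above, which is a simple way to sanity-check the lineup:

```python
# Reconstruct the 2019 iPad price matrix from base price + adders.
STORAGE_ADDER = 150    # 64 GB -> 256 GB
CELLULAR_ADDER = 130   # Wi-Fi -> Wi-Fi + Cellular

for model, base in (("iPad mini (2019)", 399), ("iPad Air (2019)", 499)):
    for storage, s_add in (("64 GB", 0), ("256 GB", STORAGE_ADDER)):
        for radio, c_add in (("Wi-Fi", 0), ("Cellular", CELLULAR_ADDER)):
            print(f"{model} {storage} {radio}: ${base + s_add + c_add}")
```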


https://www.anandtech.com/show/14096/apple-announces-new-105-ipad-air-79-ipad-mini (Mon, 18 Mar 2019 11:30:00 EDT)
Kingston Launches New Enterprise SATA SSDs Billy Tallis

Kingston is making a renewed effort in the enterprise storage market this year, starting with the launch of their DC500 family of enterprise SATA SSDs. The new DC500R and DC500M product lines are designed for read-intensive and mixed workloads respectively, with endurance ratings of 0.5 and 1.3 drive writes per day, respectively.

The target market for the DC500 family is second-tier cloud service providers and system integrators. The biggest cloud companies (Google, Microsoft, Amazon, etc.) have largely moved over to NVMe SSDs, but among the smaller datacenter players there is still a large market for SATA drives. These companies are already Kingston's biggest customers for DRAM, so Kingston already has a foot in the door.

The DC500 family continues Kingston's close relationship with Phison, incorporating the new Phison S12 SATA SSD controller. This provides all the usual features expected from an enterprise drive, including end-to-end data path protection, Phison's third-generation LDPC error correction, and power loss protection. The NAND flash Kingston is using this time is Intel's 64-layer 3D TLC, rated for 5000 Program/Erase cycles. Kingston most often uses Toshiba flash, especially given their investment in Toshiba Memory Corporation, but ultimately Kingston is still an independent buyer of memory, and at the moment they consider Intel to be a better option for their enterprise SSDs.

Performance ratings are typical for SATA drives with TLC NAND. Both the DC500R and DC500M will saturate the SATA link for sequential transfers or random reads. The DC500R's steady-state random write performance is rated for 12k-28k IOPS depending on capacity, while the DC500M with substantially more overprovisioning can sustain 58k-75k random write IOPS. Capacities for both tiers of DC500 will be 480GB up to 3.84TB. The DC500R is shipping starting today, while the DC500M will start shipping next week, except for the largest 3.84TB capacity that will arrive later in Q2.
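
DWPD ratings translate into total-bytes-written figures once a warranty term is fixed. Kingston's exact term isn't stated here, so the five-year period below is an assumption typical for drives in this class; under it, the mixed-use DC500M at 3.84 TB works out to several petabytes of rated writes.

```python
# Translate drive-writes-per-day into total rated writes over an assumed
# 5-year warranty term (an assumption; the announcement does not state it).
ASSUMED_WARRANTY_YEARS = 5

def rated_writes_tb(capacity_tb: float, dwpd: float,
                    years: int = ASSUMED_WARRANTY_YEARS) -> float:
    return capacity_tb * dwpd * 365 * years

for name, dwpd in (("DC500R", 0.5), ("DC500M", 1.3)):
    for capacity in (0.48, 3.84):
        tbw = rated_writes_tb(capacity, dwpd)
        print(f"{name} {capacity:.2f} TB: ~{tbw:,.0f} TB written")
        # e.g. DC500M 3.84 TB works out to roughly 9 PB under this assumption
```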

https://www.anandtech.com/show/14095/kingston-launches-new-enterprise-sata-ssds (Mon, 18 Mar 2019 09:08:00 EDT)
JapanNext 75 and 86-Inch 4K IPS HDR Monitors: What Separates TVs from Monitors, Anyhow? Anton Shilov

Just when you thought that NVIDIA-inspired 65-inch Big Format Gaming Displays (BFGDs) were huge, JapanNext has rolled out its new 75 and 86-inch monitors. The JN-IPS7500UHDR-KG and JN-IPS8600UHDR monitors are aimed mostly at multimedia enthusiasts who also need to get some work done, but both LCDs feature profiles for gaming too.

https://www.anandtech.com/show/14092/japannext-75-and-86-inch-4k-ips-hdr-monitors (Fri, 15 Mar 2019 16:00:00 EDT)