Club 3D Launches 2.5 GbE USB Type-A & USB Type-C Dongles

Club 3D has introduced a pair of 2.5 GbE dongles, one with a USB Type-A interface and one with USB Type-C. The adapters are designed to add 2.5 Gbps wired Ethernet to PCs that lack an internal Ethernet controller, an increasingly common situation with laptops.


Club 3D’s CAC-1420 (USB Type-A to 2.5 GbE) and CAC-1520 (USB Type-C to 2.5 GbE) are extremely simple devices: they feature an RJ-45 connector on one side and a USB 3.1 Gen 1 (5 Gbps) interface on the other. The dongles are USB-powered and therefore do not need any external power adapters. As for compatibility, they work with PCs running Apple’s Mac OS X/macOS (10.6 through 10.14) as well as Microsoft’s Windows 8/10.


The manufacturer does not disclose which 2.5 GbE controller it uses, but it is highly likely that the dongles rely on Realtek’s RTL8156, a controller designed specifically for such applications. The only other option would be Aquantia, which offers only a combined 2.5/5 GbE controller.
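
If one wanted to confirm the controller at home, the USB descriptor is the place to look. Below is a minimal sketch using pyusb; Realtek’s 0x0BDA vendor ID is standard, but the 0x8156 product ID is an assumption on our part, not something Club 3D has confirmed.

```python
# Illustrative sketch: check whether an attached USB NIC enumerates with
# Realtek's vendor ID. Requires pyusb (pip install pyusb) and a libusb backend.
import usb.core

REALTEK_VID = 0x0BDA        # Realtek Semiconductor's USB vendor ID
RTL8156_PID_GUESS = 0x8156  # assumed product ID for the RTL8156 (unconfirmed)

dev = usb.core.find(idVendor=REALTEK_VID, idProduct=RTL8156_PID_GUESS)
if dev is not None:
    print(f"Found likely RTL8156 device: {hex(dev.idVendor)}:{hex(dev.idProduct)}")
else:
    print("No device matching the assumed RTL8156 USB IDs was found.")
```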


Apart from notebooks without a GbE port that have to work in corporate environments with wired networks (including those that use 2.5, 5, and 10 GbE), Club 3D’s new adapters can be used to upgrade older desktop PCs that need faster Ethernet connectivity.


Club 3D has not announced pricing of the 2.5 GbE CAC-1420 and CAC-1520 adapters.


Source: Club 3D (via Hermitage Akihabara)



Source: AnandTech – Club 3D Launches 2.5 GbE USB Type-A & USB Type-C Dongles

Xiaomi Black Shark 2 Gaming Phone: Snapdragon 855, 12 GB RAM, 240 Hz Polling

The smartphone market is no longer growing as rapidly as it did several years ago, but it is actively segmenting as customers want handsets tailored to their needs. This presents opportunities for companies with R&D capabilities, as they can capitalize on special-purpose devices. A couple of years ago Xiaomi established its Black Shark subsidiary to address mobile gamers. Since then, Black Shark has introduced two gaming handsets, and this week the subsidiary introduced its third offering.



Source: AnandTech – Xiaomi Black Shark 2 Gaming Phone: Snapdragon 855, 12 GB RAM, 240 Hz Polling

The GIGABYTE Z390 Aorus Master Motherboard Review: Solid, But Not Special

The mainstream motherboard market is still predominantly focused on gamers and gaming features. From the useful to the inane, saying a device is ‘gaming’ is clearly bringing in sales, and it has become an all-out marketing war. Each company is clearly trying to build a gaming brand beyond the company name, even if that means constant confusion over how to pronounce it (Ay-orus, or Or-us?). Nonetheless, it is clear that each motherboard company is piling on the R&D dollars, as well as the design dollars, to ensure that it can convince users to part with some hard-earned money in their next build. GIGABYTE’s latest attempt is the Z390 Aorus Master, a motherboard that on paper sets its sights on features, aesthetics, and capability.



Source: AnandTech – The GIGABYTE Z390 Aorus Master Motherboard Review: Solid, But Not Special

Samsung Develops Smaller DDR4 Dies Using 3rd Gen 10nm-Class Process Tech

Samsung has completed development of its 3rd-generation 10 nm-class manufacturing process for DRAM as well as the first 8 Gb DDR4 chip that uses the technology. The 1z-nm process technology is said to be the world’s smallest process node for memory, and will enable Samsung to increase productivity without needing to go to extreme ultraviolet lithography (EUVL) at this time. The company plans to start volume production using the technology in the second half of 2019.



Source: AnandTech – Samsung Develops Smaller DDR4 Dies Using 3rd Gen 10nm-Class Process Tech

Intel’s Xeon & Xe Compute Accelerators to Power Aurora Exascale Supercomputer

Intel this week announced that its processors, compute accelerators, and Optane DC persistent memory modules will power Aurora, the first supercomputer in the US projected to deliver a performance of one exaFLOP. The system is expected to be delivered in about two years, and its design goes well beyond the machine’s initial Xeon Phi-based specification released in 2014.



Source: AnandTech – Intel’s Xeon & Xe Compute Accelerators to Power Aurora Exascale Supercomputer

Intel Releases New Graphics Control Panel: The Intel Graphics Command Center

Making their own contribution to this busy week of GPU and gaming news, this evening Intel took the wraps off of their previously teased new graphics control panel. Dubbed the Intel Graphics Command Center, the new control panel – or to be more technically accurate, the new app – is an effort from Intel to modernize a part of their overall graphics infrastructure, replacing the serviceable (but not necessarily loved) current iteration of the company’s control panel. At the same time however, it’s also the first step in part of a larger process to prepare Intel’s software stack and overall software ecosystem ahead of the company’s ambitious plans to enter the discrete GPU market in 2020.


Starting from the top, Intel’s Graphics Command Center is largely cut from the same cloth as other modern graphics control panels, such as NVIDIA’s GeForce Experience and AMD’s Radeon Settings application. Which is to say, it’s designed to offer a highly visible and streamlined approach to a GPU control panel, making various features easy to find, and overall offering a more user-friendly experience than the company’s current control panel. And while Intel doesn’t go so far as to name names, from their presentation it’s clear that they consider this kind of user-friendly functionality to now be a required, baseline feature for any GPU ecosystem; in which case Intel is (or rather now, was) the only PC GPU vendor lacking an equivalent application.



To that end, the company is launching the new Graphics Command Center as part of its efforts to better support current users, as well as new users going forward. The Intel Graphics Command Center works with 6th Gen Core processors (Skylake) and later, which at this point covers most Intel-powered systems sold in the last few years. The company calls it an “early access” release, and this is a fairly apt description: while the utility shows the polish and stability that come with over a year’s work, Intel clearly isn’t done adding features to it yet.


But perhaps the most interesting tidbit about the Graphics Command Center is how it’s being distributed: rather than being bundled with Intel’s drivers, it’s being delivered through the Microsoft Store on Windows 10. Yes, it’s a full-on UWP application with all of the “modern” flourishes that come with it, and this is actually an important part of Intel’s strategy. Because Microsoft’s new DCH driver model requires drivers to be stripped down to the bare essentials and delivered in pieces – graphics control panels can’t be bundled – these sorts of applications instead need to be delivered separately. In which case, using the Microsoft Store lets Intel tap into the OS’s built-in software update functionality. It also means that the control panel isn’t contingent on the checkered driver update schedules of PC OEMs; users can always download the Graphics Command Center out of band.



Overall, the Graphics Command Center borrows a lot from other GPU control applications. Front and center is a games-centric approach to settings, with the application preferring to offer game-specific settings when possible (scanning to discover what games are installed). For any of the 100 or so games on Intel’s list of supported titles, this is relatively straightforward, and each game gets its own page with familiar driver-enforced settings such as anti-aliasing, v-sync, and anisotropic filtering.



Meanwhile, Intel has also thrown in some functionality to better explain what these graphics settings do, as well as their performance impacts. A small question mark next to each setting describes what the setting does, and includes a photo demonstrating the concept as well. Towards the right of each setting’s control is an indicator signaling the performance impact of that setting, offering a basic level of guidance about what the current value will likely do to game performance. This is actually dynamic with the setting itself, so higher levels of MSAA are flagged as causing a greater performance hit, etc.
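
As a rough illustration of how such a dynamic indicator might work, consider the sketch below; the thresholds and labels are our own invention, not Intel’s actual logic.

```python
# Minimal sketch of a dynamic performance-impact indicator, in the spirit of
# what the article describes. Thresholds and labels are invented for
# illustration; Intel's actual implementation is not public.

def msaa_impact(samples: int) -> str:
    """Map an MSAA sample count to a coarse performance-impact label."""
    if samples <= 1:
        return "none"
    elif samples == 2:
        return "low"
    elif samples == 4:
        return "medium"
    else:  # 8x and above
        return "high"

for level in (1, 2, 4, 8):
    print(f"MSAA {level}x -> expected performance impact: {msaa_impact(level)}")
```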


Going one step further, however, for 30 of those games, Intel also includes support for one-click graphics optimizations, which is indicated by the lightning bolt logo. Similar to how this works with other control panels, this function will actually go into a game and alter its settings to Intel’s suggested settings for the host computer. This allows Intel to adjust game settings on a fine-grained level, adjusting texture and shadow quality, rendering distance, internal AA settings, etc.
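
Conceptually, a one-click optimizer of this sort boils down to rewriting a game’s own configuration file with suggested values. The sketch below is entirely hypothetical: real titles store settings in many different formats, and the file path, setting names, and values here are invented for illustration.

```python
# Hypothetical sketch of what a one-click optimizer does conceptually:
# overwrite selected keys in a game's own settings file with suggested values,
# keeping a backup so the change is easy to undo.
import json
from pathlib import Path

SUGGESTED = {"texture_quality": "medium", "shadow_quality": "low",
             "render_distance": 0.7, "internal_aa": "fxaa"}  # invented names

def apply_suggestions(config_path: Path) -> None:
    settings = json.loads(config_path.read_text())
    backup = config_path.with_suffix(".bak")       # keep an undo copy
    backup.write_text(json.dumps(settings, indent=2))
    settings.update(SUGGESTED)                     # apply tuned values
    config_path.write_text(json.dumps(settings, indent=2))

# Usage (hypothetical path):
# apply_suggestions(Path("~/Games/Example/settings.json").expanduser())
```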



I’m told that right now most of the work to determine these settings is being done by hand by Intel engineers – and of that, I assume a lot is being taken from Intel’s existing gameplay settings service. However, with three generations of iGPUs supported and more coming, the use of automation is increasing as well. As for the quality of Intel’s suggestions, I haven’t had nearly enough time with the Graphics Command Center to get a feel for them, though Intel makes it pretty easy to undo them as necessary.


Beyond game settings, the Graphics Command Center also supports all of the other common features you’d expect to find in a graphics control panel. There are monitor settings such as resolution and refresh rate, as well as tools for arranging monitors. There is also a series of video quality settings for adjusting color correction, deinterlacing, film detection, etc. Not unlike the graphics settings, there are demo/explanation features here as well, in order to demonstrate in real time what the various settings do. And of course, there are info panels on the current software and hardware, supported features, etc. This latter part is admittedly nowhere near groundbreaking, but if this is a baseline feature, then it needs to be present regardless.



Past the current functionality, it’s clear that Intel doesn’t consider itself done with the development of its new graphics control panel. Besides adding support for more games – both for detection and one-click optimizations – there are several other features the other GPU vendors regularly support, such as game recording, performance monitoring, and game streaming. So I would be surprised if Intel didn’t eventually move towards parity here as well.


But ultimately the launch of the Graphics Command Center is about more than just improving the present; it’s about laying the groundwork for the future. The company is gearing up to launch its Gen11 iGPU architecture this year, and all signs point to the most common GPU configurations being a good deal more powerful than the Skylake-era GT2 configurations. And next year, of course, is slated to bring the launch of Intel’s first Xe discrete GPUs. Intel has grand ambitions here, and to compete with NVIDIA and AMD, it needs to match their software ecosystems as well, not just their hardware. So this latest control panel is an important step forward in establishing that ecosystem.



For the time being, however, Intel is just looking to polish their new control panel. As part of their Odyssey community feedback/evangelism program, Intel is very much embracing the “early access” aspect of this release, and is courting user feedback on the application. And while I admittedly suspect that Intel already knows exactly what they want to do and work on, it certainly doesn’t hurt to solicit feedback on this long road to Xe.




Source: AnandTech – Intel Releases New Graphics Control Panel: The Intel Graphics Command Center

Oculus Rift S VR Headset: An Upgraded Virtual Reality Experience

Oculus VR has introduced its new Oculus Rift S virtual reality PC-powered headset. The new head-mounted display (HMD) features inside-out tracking and does not require any external sensors. As a generational update, it also has a higher-resolution screen than the original Oculus Rift. The new unit will ship this spring.



Source: AnandTech – Oculus Rift S VR Headset: An Upgraded Virtual Reality Experience

Samsung’s Space-Saving Monitors on Pre-Order: Up to 31.5-Inch

Large displays tend to occupy a lot of desk space, something that is not appreciated by many. Samsung has developed a family of monitors featuring a minimalistic design that promises to save as much space as possible while still providing 27 or 31.5 inches of screen real estate. Announced early this year, Samsung’s Space Monitors are now available for pre-order and will ship in April.



Source: AnandTech – Samsung’s Space-Saving Monitors on Pre-Order: Up to 31.5-Inch

HP Unveils ProDesk 405 G4 Desktop Mini PC: An SFF Ryzen Pro Desktop

Over the past few months we have seen increasing adoption of AMD Ryzen processors by manufacturers of ultra-compact form-factor (UCFF) desktops. At present, the number of UCFF systems powered by AMD’s Ryzen is not large, but it is growing. On Tuesday HP announced its first small form-factor commercial desktop powered by AMD’s Ryzen Pro 2000-series.



Source: AnandTech – HP Unveils ProDesk 405 G4 Desktop Mini PC: An SFF Ryzen Pro Desktop

Samsung HBM2E ‘Flashbolt’ Memory for GPUs: 16 GB Per Stack, 3.2 Gbps

Samsung has introduced the industry’s first memory that corresponds to the HBM2E specification. The company’s new Flashbolt memory stacks increase performance by 33% and offer double the per-die as well as per-package capacity. Samsung introduced its HBM2E DRAM at GTC, indicating that the gaming market is one target market for this memory.



Source: AnandTech – Samsung HBM2E ‘Flashbolt’ Memory for GPUs: 16 GB Per Stack, 3.2 Gbps

Apple Launches 2nd Gen AirPods: Longer Talk Time & Hands-Free ‘Hey Siri’

Apple on Wednesday introduced its 2nd Generation AirPods. The new AirPods support hands-free ‘Hey Siri’ functionality, longer talk time, and faster connect times. The new headset will be available with both wireless and wired charging cases.



Source: AnandTech – Apple Launches 2nd Gen AirPods: Longer Talk Time & Hands-Free ‘Hey Siri’

Apple Upgrades iMac and iMac Pro: More Cores, More Graphics, More Memory

Apple has introduced its updated iMac all-in-one desktop computers, which use Intel’s latest-generation processors with up to eight cores plus AMD’s latest Pro graphics, as well as an iMac Pro now equipped with more memory and a faster GPU. Since Apple upgrades its iMac product line only every couple of years or so, the company has every right to claim that its top-of-the-range AIO PCs are now up to twice as fast as their predecessors.


The new 21.5-inch and 27-inch Apple iMac AIO desktops come in the same sleek chassis as their predecessors and use the same 4K and 5K display panels featuring the P3 color gamut and 500 nits brightness. The systems are offered with Intel’s latest Core processors paired with up to 32 GB of DDR4-2666 memory, SSD storage or hybrid Fusion Drive storage (combining NAND flash for caching with a mechanical HDD), and a discrete AMD Radeon Pro GPU. Optionally, customers can equip their new iMacs with Intel’s eight-core Core i9 as well as AMD’s Radeon Pro Vega 48 8 GB GPU.



Since the new Apple iMac AIO desktops inherit quite a lot from their predecessors, they feature the same set of I/O capabilities, including an 802.11ac Wi-Fi + Bluetooth adapter, a GbE port, two Thunderbolt 3 connectors, four USB 3.1 Gen 2 ports, an SDXC card reader, a 3.5-mm audio jack, built-in speakers, and a webcam.


Apple iMac 2019 Brief Specifications

21.5″ model:
- Display: 21.5″, 4096 × 2304, 500 cd/m² brightness, DCI-P3 support
- CPU (default): Core i3 (4C/4T, 3.6 GHz) or Core i5 (6C/6T, 3.0 – 4.1 GHz)
- CPU (optional): Core i7 (6C/12T, 3.2 – 4.6 GHz)
- Graphics (default): Radeon Pro 555X or 560X
- Graphics (optional): Radeon Pro Vega 20
- Memory: 8 GB DDR4-2666 default; 16 – 32 GB optional
- Storage (default): 1 TB HDD or 1 TB Fusion Drive
- Storage (optional): 1 TB or 2 TB Fusion Drive; 256 GB, 512 GB, or 1 TB SSD
- Dimensions: 52.8 cm (20.8″) wide, 45 cm (17.7″) tall, 17.5 cm (6.9″) deep

27″ model:
- Display: 27″, 5120 × 2880, 500 cd/m² brightness, DCI-P3 support
- CPU (default): Core i5 (6C/6T, 3.0 – 4.1 GHz), Core i5 (6C/6T, 3.1 – 4.3 GHz), or Core i5 (6C/6T, 3.7 – 4.6 GHz)
- CPU (optional): Core i9 (8C/16T, 3.6 – 5.0 GHz)
- Graphics (default): Radeon Pro 570X, 575X, or 580X
- Graphics (optional): Radeon Pro Vega 48
- Memory: 8 GB DDR4-2666 default; 16 – 64 GB optional
- Storage (default): 1 TB or 2 TB Fusion Drive
- Storage (optional): 2 TB or 3 TB Fusion Drive; 256 GB, 512 GB, 1 TB, or 2 TB SSD
- Dimensions: 65 cm (25.6″) wide, 51.6 cm (20.3″) tall, 20.3 cm (8″) deep

Common to both models:
- PCH: ?
- Wi-Fi: IEEE 802.11ac Wi-Fi + Bluetooth 4.2
- Ethernet: 1 GbE
- Display outputs: 2 × Thunderbolt 3
- Audio: stereo speakers, integrated microphones, 1 × audio out
- USB/Thunderbolt: 2 × Thunderbolt 3/USB 3.1 Gen 2 Type-C; 4 × USB 3.1 Gen 2 Type-A (10 Gbps)
- Other I/O: FHD webcam, SDXC card reader
- PSU: ?
- OS: Apple macOS Mojave

Apple’s latest 21.5-inch iMac with Intel’s quad-core Core i3 “Coffee Lake” processor and AMD’s Radeon Pro 555X 2 GB graphics adapter will start at $1,299. Meanwhile, a more advanced 21.5-inch iMac with Intel’s six-core Core i5 chip and AMD’s Radeon Pro 560X 4 GB graphics will start at $1,499.


Apple’s 27-inch iMacs with Intel’s six-core Core i5 CPUs will cost from $1,799 to $2,299 depending on the configuration. Once upgraded to Intel’s eight-core Core i9, AMD’s Radeon Pro Vega 48 8 GB, and 16 GB RAM, the price of the system will increase to $3,349.



Also updated is the iMac Pro, which uses Intel’s Xeon-W line of processors. The biggest jump in this line of products is DRAM capacity, with Apple now offering a 256 GB DDR4 option. To get this option, users will have to pay an extra $5,200 above the cost of the default 32 GB configuration, which a number of users have pointed out is a lot of money, considering that an equivalent 4 × 64 GB memory layout can be purchased for around $2,500. Also offered is an upgrade to the Radeon Pro Vega 64X, although details on what this card offers (aside from 64 compute units) have not been disclosed at this point. Based on the ’12 TF Single Precision’ metric on the Apple Store, it appears that the frequency has increased by 9% over the ’11 TF Single Precision’ Radeon Pro Vega 64 model. The price difference between the two is $150.
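
The arithmetic behind those two figures is simple enough to verify; the values below are taken directly from the text.

```python
# Reproducing the article's arithmetic. Prices and TFLOPS figures come from
# the text; this just makes the comparison explicit.

vega64x_tflops = 12.0   # '12 TF Single Precision' per the Apple Store
vega64_tflops = 11.0    # '11 TF Single Precision' for the older model
uplift = vega64x_tflops / vega64_tflops - 1
print(f"Vega 64X vs Vega 64: {uplift:.1%} higher throughput")  # ~9.1%

apple_256gb_upgrade = 5200   # Apple's price for the 256 GB memory option
retail_4x64gb = 2500         # approximate retail for a 4 x 64 GB layout
print(f"Apple premium over retail: ${apple_256gb_upgrade - retail_4x64gb}")  # $2700
```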


A fully kitted-out iMac Pro now stands at $15,700, with an 18-core Xeon-W, a Vega 64X, 256 GB of DDR4 ECC memory, and a 4 TB SSD. The base model is $4,999 and comes with an 8-core Xeon-W, a Vega 56, 32 GB of DDR4 ECC memory, and a 1 TB SSD.


Source: Apple



Source: AnandTech – Apple Upgrades iMac and iMac Pro: More Cores, More Graphics, More Memory

SilverStone EP14: A Miniature USB-C Hub with HDMI, USB-A, 100 W Power

With hundreds of different USB Type-C adapters and docks on the market, manufacturers are trying hard to make theirs more attractive. To that end, they now tend to design rather interesting products addressing focused use cases. SilverStone has introduced its new compact USB-C dock that has three USB-A ports, a display output, and can pass through up to 100 W of power to charge a laptop and/or devices connected to the USB-A ports, a rare feature for small docks.



Source: AnandTech – SilverStone EP14: A Miniature USB-C Hub with HDMI, USB-A, 100 W Power

Google Announces Stadia: A Game Streaming Service

Today at GDC, Google announced its new video game streaming service, called Stadia. This builds on news from earlier this year that AMD was powering Project Stream (as the service was then known) with Radeon Pro GPUs, and Google is a primary partner for AMD’s next-generation CPUs and GPUs.



Stadia is being advertised as a central community for gamers, creators, and developers. The idea is that people can play a wide array of games regardless of the hardware at hand. Back in October, Google debuted the technology by showcasing a top-end AAA gaming title running at 60 FPS. Google wants a single place where gamers and YouTube creators can get together; no current gaming platform, according to Google, does this.


Ultimately Google wants to stream straight to the Chrome browser. Google has worked with leading publishers and developers to help build the system infrastructure.


Users will be able to watch a video about a game, instantly hit ‘Play Now’, and start playing in under five seconds, without any download or lag.


This is breaking news, please refresh in case there are updates.



Source: AnandTech – Google Announces Stadia: A Game Streaming Service

HP Reveals Envy x360 15 Laptops with AMD's Latest Ryzen APUs

HP on Tuesday introduced its new 15.6-inch convertible notebooks based on AMD’s Ryzen Mobile 3000-series APUs. The new HP Envy x360 15 convertibles are positioned as inexpensive 15.6-inch-class laptops for productivity applications. In addition, the company announced new Intel-based HP Envy x360 15 PCs.


HP’s AMD Ryzen 3000 and Intel Core i5/i7-based Envy x360 15 convertibles use exactly the same sandblasted anodized aluminum chassis and thus have the same dimensions (17 mm z-height) and weight (~2 kilograms). The only visual difference between the AMD- and Intel-powered Envy x360 15 PCs is the color: the former features HP’s Nightfall Black finish, whereas the latter features HP’s Natural Silver finish. Overall, the new 15.6-inch Envy x360 convertibles feature a 28% smaller bezel than the previous generation, according to the manufacturer. Meanwhile, all of the HP Envy x360 15 machines introduced today use the same 15.6-inch Full HD IPS touch-enabled display panel with WLED backlighting.



Inside the new AMD-based HP Envy x360 15 convertible laptops are AMD’s quad-core Ryzen 5 3500U or Ryzen 7 3700U processors with integrated Radeon RX Vega 8/10 graphics. The APUs are accompanied by 8 GB of single-channel DDR4-2400 memory as well as a 256 GB NVMe/PCIe M.2 SSD. As for the Intel-powered Envy x360 15, it uses Core i5-8265U or Core i7-8565U CPUs.



As far as connectivity is concerned, everything looks rather standard: the systems feature an 802.11ac + Bluetooth 5.0/4.2 controller from Intel or Realtek, one USB 3.1 Gen 1 Type-C connector (with DP 1.4), two USB 3.1 Gen 1 Type-A ports, an HDMI output, a 3.5-mm audio connector for headsets, an SD card reader, and so on. The new Envy x360 15 also has an HD webcam with a dual-array microphone and a kill switch, a fingerprint reader, Bang & Olufsen-badged stereo speakers, and a full-sized keyboard.



When it comes to battery life, HP claims that its AMD Ryzen Mobile-powered Envy x360 15 convertibles offer exactly the same battery life as the Intel-based machines: up to 13 hours of mixed usage from a 55.67 Wh battery.
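
As a quick sanity check on that claim, the battery capacity and runtime imply a very modest average power draw; both inputs are HP’s own figures.

```python
# Back-of-the-envelope check on HP's battery claim: a 55.67 Wh battery lasting
# 13 hours implies roughly this average system power draw.
battery_wh = 55.67
claimed_hours = 13
avg_draw_w = battery_wh / claimed_hours
print(f"Implied average draw: {avg_draw_w:.1f} W")  # ~4.3 W
```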



HP will start sales of its Envy x360 15 convertible notebooks with AMD Ryzen Mobile inside this April. Pricing will start at $799.99. By contrast, a system featuring Intel’s Core i5-8265U with a generally similar configuration will cost $869.99.


HP Envy x360 15″

Envy x360 15 (AMD), SKUs 15m-ds0011dx and 15m-ds0012dx:
- Display: 15.6-inch IPS, 1920 × 1080
- Processor: Ryzen 5 3500U (4C/8T, 2.1 GHz base, 3.7 GHz turbo) or Ryzen 7 3700U (4C/8T, 2.3 GHz base, 4.0 GHz turbo)
- Graphics: Radeon RX Vega 8 (Ryzen 5) or Vega 10 (Ryzen 7)
- Storage: 256 GB PCIe/NVMe SSD
- Network: Realtek 2×2 802.11ac, Bluetooth 4.2
- Price: starting at $799.99

Envy x360 15 (Intel), SKUs 15m-dr0011dx and 15m-dr0012dx:
- Display: 15.6-inch IPS, 1920 × 1080
- Processor: Core i5-8265U (4C/8T, 1.6 GHz base, 3.9 GHz turbo) or Core i7-8565U (4C/8T, 1.8 GHz base, 4.0 GHz turbo)
- Graphics: Intel UHD Graphics 620
- Storage: 256 GB PCIe/NVMe SSD, or 512 GB PCIe/NVMe SSD + 32 GB Optane
- Network: Intel Wireless-AC 9560, 2×2 802.11ac, Bluetooth 5.0
- Price: starting at $869.99

Common to both:
- RAM: 8 GB DDR4-2400 (not user accessible)
- Audio: Bang & Olufsen dual speakers
- Digital media: SD card reader
- Keyboard: full-size island-style backlit keyboard
- External ports: 1 × USB Type-C 3.1 Gen 1, 2 × USB 3.1 Gen 1 Type-A, 1 × HDMI, 1 × 3.5 mm jack
- Dimensions / weight: 14.13 × 9.68 × 0.67 inches; 2 kilograms (4.53 lbs)
- Battery: 3-cell 55.67 Wh Li-polymer, 65 W AC adapter

Source: HP



Source: AnandTech – HP Reveals Envy x360 15 Laptops with AMD’s Latest Ryzen APUs

Western Digital: Over Half of Data Center HDDs Will Use SMR by 2023

Western Digital said at the OCP Global Summit last week that over half of the hard drives used in data centers will use shingled magnetic recording (SMR) technology in 2023. At present Western Digital is the only supplier of host-managed SMR HDDs, but the technology is gaining support from hardware, software, and application vendors.


SMR technology boosts the capacity of hard drives fairly easily, but at the cost of some performance trade-offs due to the read-modify-write cycle introduced by shingled tracks. Since operators of data centers are interested in maximizing their storage capacity, they are inclined to invest in software that can mitigate the peculiarities of SMR. As a result, several years after Western Digital introduced its first host-managed SMR HDDs, more and more companies are adopting them. Right now, the vast majority of data center hard drives are based on perpendicular magnetic recording (PMR) technology, but WD expects SMR HDDs to overtake PMR drives within four years.
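
To make the read-modify-write penalty concrete, here is a toy model of a shingled band; the band size is an invented illustration and bears no relation to WD’s actual drive geometry.

```python
# Toy model of why SMR writes are expensive: tracks in a band overlap like
# shingles, so rewriting one track disturbs every track that follows it in
# that band, and all of them must be read and written back in order.

TRACKS_PER_BAND = 8  # assumed band size, purely for illustration

def smr_rewrite_cost(track_in_band: int) -> int:
    """Tracks that must be read-modified-written to update one track."""
    # Updating track k disturbs tracks k..end of band.
    return TRACKS_PER_BAND - track_in_band

for k in range(TRACKS_PER_BAND):
    print(f"Rewrite track {k}: {smr_rewrite_cost(k)} track(s) read-modify-written")
# A conventional (PMR) drive would touch exactly 1 track in every case.
```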



Obviously, SMR will not be the only method used to increase hard drive capacities: Western Digital will also use energy-assisted recording technologies (e.g., MAMR, HAMR, etc.). In the coming quarters the company intends to release MAMR-based HDDs featuring 16 TB (ePMR) and 18 TB (eSMR) capacities, and it also plans to introduce 20 TB HDDs in 2020.



High-capacity hard drives are not going to be replaced by high-capacity SSDs any time soon, according to Western Digital. HDDs will continue to cost significantly less than SSDs on a per-TB basis, and as a result they will be used to store 6.5 times more data than data center SSDs in 2023.



Source: Western Digital Presentation at OCP, YouTube




Source: AnandTech – Western Digital: Over Half of Data Center HDDs Will Use SMR by 2023

Quick Note: NVIDIA’s “Einstein” Architecture Was A Real Project

While it was never an official NVIDIA codename as far as roadmaps go, the name “Einstein” came up in rumors a few times earlier this decade. At the time, Einstein was rumored to be the architecture that would follow Maxwell in the NVIDIA lineup. And while we sadly didn’t find out anything new about NVIDIA’s future roadmap at this year’s show – or any sign of Ampere or other 7nm chips – I did inadvertently find out that the rumors about Einstein were true. At least, from a certain point of view.


While talking with NVIDIA’s research group this morning about some of their latest projects (more on this a bit later this week when I have the time), the group was talking about past research projects. And, as it turns out, one of those former research projects was Einstein.



Rather than just being a baseless rumor, Einstein was in fact a real project at NVIDIA. However, rather than being an architecture per se, it was a research GPU that the NVIDIA research group was working on. And although this research project didn’t bear fruit under the Einstein name, it did under another name that is far more well-known: Volta.


So while this means we can scratch Einstein off the list of names for potential future NVIDIA architectures, the project itself was real, and it was actually a big success for NVIDIA. Einstein morphed into what became the Volta architecture, which has become the cornerstone of all of NVIDIA’s current-generation GPUs for servers and clients alike. This includes both regular Volta and its graphics-enhanced derivative, Turing.



Source: AnandTech – Quick Note: NVIDIA’s “Einstein” Architecture Was A Real Project

Nvidia Announces Jetson Nano Dev Kit & Board: X1 for $99

Today at GTC 2019, Nvidia launched a new member of the Jetson family: the Jetson Nano. The Jetson family of products represents Nvidia’s focus on robotics, AI, and autonomous machine applications. A few months back we had the pleasure of taking a high-level look at the Jetson AGX dev kit as well as the Xavier chip that powers it.


The biggest concern with the AGX dev kit was its pricing: with a retail cost of $2,500 ($1,299 as part of Nvidia’s developer programme), it’s massively out of range for most hobbyist users, such as our readers.



The new Jetson Nano addresses the cost issue in quite a dramatic way. Here Nvidia promises to deliver a similar level of functionality to its more expensive Jetson products at a much lower price point, and of course at a lower performance point.


The Jetson Nano is a full-blown single-board computer in the form of a module. The module uses a SO-DIMM form factor and connector, similar to the company’s past modules. The goal is to be as compact as possible, as the module is envisioned for a wide variety of applications in which customers will design their own carrier boards to best fit their needs.



At the heart of the Nano module we find Nvidia’s “Erista” chip, the same silicon that powered the Tegra X1 in the Nvidia Shield as well as the Nintendo Switch. The variant used in the Nano is a cut-down version, though, as the four Cortex-A57 cores only clock up to 1.43 GHz and the GPU only has half its cores active (128 versus 256 in the full X1). The module comes with 4 GB of LPDDR4 and a 16 GB eMMC module, and will be available to interested parties for $129.



Naturally, because you can’t do much with the module itself, Nvidia also offers the Jetson Nano as part of a complete computer: the Jetson Nano Developer Kit. Among the advantages of the kit are vastly better hardware capabilities than competing solutions offer, such as the performance of the SoC, and simply better connectivity: four full-size USB ports (3 × USB 2.0 + 1 × USB 3.0), HDMI, DisplayPort, and a Gigabit Ethernet port, along with the usual SDIO, I2C, SPI, GPIO, and UART connectors you’re used to on such boards. There is even an M.2 connector for additional Wi-Fi as well as a MIPI-CSI interface for cameras.
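
As a taste of what those headers enable, below is a minimal sketch of toggling a GPIO pin from Python, assuming NVIDIA’s Jetson.GPIO library (which mirrors the familiar RPi.GPIO API); the pin choice is purely an example.

```python
# Minimal sketch of driving one of the Nano's GPIO header pins, assuming
# NVIDIA's Jetson.GPIO library is installed. Pin 12 is used here purely as
# an example output pin on the 40-pin header.
import time
import Jetson.GPIO as GPIO

LED_PIN = 12  # example pin number on the 40-pin header

GPIO.setmode(GPIO.BOARD)      # address pins by board header number
GPIO.setup(LED_PIN, GPIO.OUT)

try:
    for _ in range(5):        # blink an attached LED five times
        GPIO.output(LED_PIN, GPIO.HIGH)
        time.sleep(0.5)
        GPIO.output(LED_PIN, GPIO.LOW)
        time.sleep(0.5)
finally:
    GPIO.cleanup()            # release the pins on exit
```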




Jetson AGX Dev Kit vs Jetson Nano Dev Kit



Jetbot with Jetson Nano Dev Kit vs Jetson Nano Dev Kit


The Jetson Nano Developer Kit can be had for only $99. One way Nvidia reaches this price point is by omitting on-board storage: the kit is driven purely by a microSD card. Availability starts today.


We have the Jetson Nano in-house and will be seeing what fun things Nvidia has cooked up for us soon!



Source: AnandTech – Nvidia Announces Jetson Nano Dev Kit & Board: X1 for $99

NVIDIA’s April Driver to Support Ray Tracing on Pascal GPUs, DXR Support in Unity and Unreal

This week, both GDC (the Game Developers Conference) and GTC (NVIDIA’s GPU Technology Conference) are happening in California, and NVIDIA is out in force. One of today’s announcements concerns support for NVIDIA’s newest technologies on older graphics cards, as well as an increase in the capabilities of the two major game engines of the era: Unity and Unreal.




Source: AnandTech – NVIDIA’s April Driver to Support Ray Tracing on Pascal GPUs, DXR Support in Unity and Unreal