LG X Charge With Massive 4500 mAh Battery Lands On Comcast Xfinity Mobile

If you want a smartphone that can last all day on a charge, and don’t necessarily care about having all the latest high-end features, LG has a new phone that might be right up your alley. Appropriately named the X Charge, the mid-range smartphone’s claim to fame is its monstrous 4,500 mAh battery.

For comparison, the recently released Motorola

Source: Hot Hardware – LG X Charge With Massive 4500 mAh Battery Lands On Comcast Xfinity Mobile

InspiroBot Inspires You

Feeling a little bummed this morning? Well, try out this new chatbot, InspiroBot, which can take the blues away. By way of IFLScience, this AI-powered meme generator helps to inspire you toward your next goal in life. Looking like a green-tinted HAL 9000, the garbled messages are a bit eerie, but are about as accurate as a fortune cookie. This still beats talking to Dave from work about trying to nab some chromed spinners for his daughter’s third best-friend. Thanks for the advice, Albert E.



Ultimately though, who cares? This AI is so bad at its job that it turns out to be uplifting in the most inadvertent way possible. When a peaceful image of a couple holding hands is juxtaposed with the text “When the world ends, what we have strangled can’t be unstrangled” you can’t help but giggle at the madness of it all.


Source: [H]ardOCP – InspiroBot Inspires You

GrubHub trial may finally answer contractor vs. employee quandary

(credit: Bloomberg / Getty Images News)

SAN FRANCISCO—A federal judge has allowed a labor lawsuit filed against GrubHub to go forward, paving the way for a bench trial this fall.

On Thursday, US Magistrate Judge Jacqueline Scott Corley largely denied the startup’s attempt to have the case decided on summary judgment.

The case, which was first filed back in 2015, is one of a slew of ongoing cases filed against so-called “gig economy” firms. Many of these cases attempt to answer a basic question: should most gig economy workers be classified as contractors, or should they be considered employees?




Source: Ars Technica – GrubHub trial may finally answer contractor vs. employee quandary

How Android Beat the iPhone to World Domination

While everyone is gushing about the iPhone’s ten years of existence, an article from CNN’s Money blog takes its accomplishment down a peg. Launching in 2008, Google’s partnership with HTC helped produce what the world has come to know as the HTC Dream (first released as the T-Mobile G1 in the US). Running on Google’s mobile operating system, Android, it still maintained a BlackBerry-like slide-out keyboard. It ran on a 528 MHz Qualcomm chip with 192 MB of RAM and 256 MB of internal storage, and even had a microSD slot expandable up to 16 GB, giving it more possible storage space than the current iPhones.



“Google and Apple were working on developing the smartphone very much at the same time,” says Fred Vogelstein, author of Dogfight: How Apple and Google Went to War and Started a Revolution.

Then Jobs unveiled a radically different device on stage in January 2007. The head of Android, Andy Rubin, was in a car when the presentation kicked off. He asked the driver to pull over to watch it online, according to Dogfight.

“Android is very fragmented,” Jobs said on an earnings call in 2010. “The users will have to figure it all out.”
“I think Steve Jobs was in fact terribly worried that Google was going to do to him the same thing that Microsoft did,” Vogelstein says.


Source: [H]ardOCP – How Android Beat the iPhone to World Domination

New Study Finds How Much Sleep Fitbit Users Really Get

Fitbit has published the results of a study that uses their longitudinal sleep database to analyze millions of nights of Sleep Stages data to determine how age, gender, and duration affect sleep quality. (Sleep Stages is a relatively new Fitbit feature that “uses motion detection and heart rate variability to estimate the amount of time users spend awake and in light, deep, and REM sleep each night.”) Here are the findings: The average Fitbit user is in bed for 7 hours and 33 minutes but only gets 6 hours and 38 minutes of sleep. The remaining 55 minutes is spent restless or awake. That may seem like a lot, but it’s actually pretty common. That said, 6 hours and 38 minutes is still shy of the 7+ hours the CDC recommends adults get. For the second year in a row, Fitbit data scientists found women get about 25 minutes more sleep on average each night compared to men. The percentage of time spent in each sleep stage was also similar — until you factor in age. Fitbit data shows that men get a slightly higher percentage of deep sleep than women until around age 55, when women take the lead. Women win when it comes to REM, logging an average of 10 more minutes per night than men. Although women tend to average more REM than men over the course of their lifetime, the gap appears to widen around age 50.
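As a quick sanity check, the figures quoted above can be reproduced with a few lines of arithmetic; this is just an illustration using the article's numbers, not Fitbit's methodology:

```python
from datetime import timedelta

# Figures quoted above: 7h33m in bed, 6h38m actually asleep.
in_bed = timedelta(hours=7, minutes=33)
asleep = timedelta(hours=6, minutes=38)

restless = in_bed - asleep        # time spent restless or awake
efficiency = asleep / in_bed      # fraction of bed time actually asleep

print(restless)                   # 0:55:00
print(round(efficiency * 100, 1)) # 87.9
```

So the average user sleeps through roughly 88% of their time in bed, with the remaining 55 minutes matching the "restless or awake" figure above.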




Source: Slashdot – New Study Finds How Much Sleep Fitbit Users Really Get

Facebook's Drone Has Second Flight and Lands Successfully

Aquila, the first functional drone aircraft from Facebook, completed its second successful test flight, with a total flight time of 1 hour and 46 minutes. The drone flew above 3,000 feet and landed successfully this time, as opposed to the crash that ended the first test flight. The goal of the drone is to provide internet access in remote locations while flying on solar power for weeks at a time.

Check out the flight and landing below.





…this new process, which included locking the propellers horizontally to reduce damage, worked mostly as designed – though only one propeller on the craft actually locked horizontally, while the rest remained vertical until landing, as you can see in the clip above. All four motors stopped as intended, however, and the craft landed softly on a gravel surface, resulting in “a few minor, easily repairable dings,” which is a much better result than they had the first time around.


Source: [H]ardOCP – Facebook’s Drone Has Second Flight and Lands Successfully

There Is a Point At Which It Will Make Economical Sense To Defect From the Electrical Grid

Michael J. Coren reports via Quartz: More than 1 million U.S. homes have solar systems installed on their rooftops. Batteries are set to join many of them, giving homeowners the ability to not only generate but also store their electricity on-site. And once that happens, customers can drastically reduce their reliance on the grid. It’s great news for those receiving utility bills. It’s possible armageddon for utilities. A new study by the consulting firm McKinsey modeled two scenarios: one in which homeowners leave the electrical grid entirely, and one in which they obtain most of their power through solar and battery storage but keep a backup connection to the grid. Given the current costs of generating and storing power at home, even residents of sunny Arizona would not have much economic incentive to leave the electric-power system completely — full grid-defection, as McKinsey refers to it — until around 2028. But partial defection, where some homeowners generate and store 80% to 90% of their electricity on site and use the grid only as a backup, makes economic sense as early as 2020. [A]s daily needs for many are supplied instead by solar and batteries, McKinsey predicts the electrical grid will be repurposed as an enormous, sophisticated backup. Utilities would step up and supply power during the few days or weeks per year when distributed systems run out of juice.
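The crossover logic behind "grid defection" can be sketched as a toy break-even model: grid rates creep upward while the cost of home solar-plus-storage falls, and defection pays off in the first year the curves cross. Every number below is an illustrative placeholder, not a figure from the McKinsey study:

```python
# Toy break-even model for grid defection. All parameters are
# illustrative placeholders, NOT figures from the McKinsey study.
def crossover_year(grid_price, grid_growth, der_price, der_decline,
                   start_year=2017, horizon=30):
    """First year the home solar+storage cost per kWh drops to or below
    the grid price, or None if it never crosses within the horizon."""
    for i in range(horizon):
        year = start_year + i
        if der_price <= grid_price:
            return year
        grid_price *= 1 + grid_growth  # grid rates escalate annually
        der_price *= 1 - der_decline   # solar+battery costs decline annually
    return None

# e.g. $0.12/kWh grid rising 2%/yr vs $0.30/kWh solar+storage falling 8%/yr
print(crossover_year(0.12, 0.02, 0.30, 0.08))  # 2026
```

Partial defection crosses over earlier than full defection in such a model because the homeowner only needs on-site generation to beat the grid for 80-90% of their demand, keeping the grid as backup for the rest.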




Source: Slashdot – There Is a Point At Which It Will Make Economical Sense To Defect From the Electrical Grid

TechInsights Confirms Apple’s A10X SoC Is TSMC 10nm FF; 96.4mm2 Die Size

One of the more intriguing mysteries in the Apple ecosystem has been the question of what process the company would use for the A10X SoC, which is being used in the newly launched 2017 iPad Pro family. Whereas the A10 used in the iPhone was much too early to use anything but 16nm/14nm, the iPad Pro and its A10X arrive in the middle of the transition point for high-end SoCs. 16nm is still a high performance process, but if a company pushes the envelope, 10nm is available. So what would Apple do?


The answer, as it turns out, is that they’ve opted to push the envelope. The intrepid crew over at TechInsights has finally dissected an A10X and posted their findings, giving us our first in-depth look at the SoC. Most importantly then, TechInsights is confirming that the chip has been fabbed on TSMC’s new 10nm FinFET process. In fact, the A10X is the first TSMC 10nm chip to show up in a consumer device, a very interesting turn of events since that wasn’t what various production roadmaps called for (that honor would have gone to MediaTek’s Helio X30).



Image Courtesy TechInsights


Apple is of course known for pushing the envelope on chip design and fabrication; they have the resources to take risks, and the profit margins to cover them should they not pan out. Still, that the A10X is the first 10nm SoC is an especially interesting development because it’s such a high-end part. Traditionally, smaller and cheaper parts are the first out the door as these are less impacted by the inevitable yield and capacity challenges of an early manufacturing node. Instead, Apple seems to have gone relatively big with what amounts to their 10nm pipecleaner part.


I say “relatively big” here because while the A10X is a powerful part, and big for a 10nm SoC, in terms of absolute die size it’s not all that big of a chip. In fact, by Apple X-series SoC standards, it’s downright small: just 96.4mm2. This is 24% smaller than the 16nm A10 SoC (125mm2), and in fact is even 9% smaller than the A9 SoC (104.5mm2). So not only is it smaller than any of Apple’s 16nm SoCs, but it’s also about 20% smaller than the next-smallest X-series SoC, the A6X. Or, if you want to compare it to the previous A9X, Apple’s achieved a 34% reduction in die size. In other words, Apple has never made an iPad SoC this small before.
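A quick check of the die-size arithmetic from the figures quoted above; note the article's percentages appear to use slightly more generous rounding:

```python
def pct_smaller(new_mm2: float, old_mm2: float) -> float:
    """Percent die-area reduction going from the old die to the new one."""
    return (1 - new_mm2 / old_mm2) * 100

# Die sizes in mm^2, as quoted above
A10X, A10, A9 = 96.4, 125.0, 104.5

print(round(pct_smaller(A10X, A10), 1))  # 22.9, vs the ~24% quoted
print(round(pct_smaller(A10X, A9), 1))   # 7.8, vs the ~9% quoted
```

Either way, the direction of the comparison holds: the 10nm A10X is meaningfully smaller than both 16nm parts despite packing more hardware.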



One key difference here, however, is that the X-series SoCs have never before been the leading part for a new process node. It has always been iPhone SoCs that have led the charge – A9 at 16nm, A8 at 20nm, A7 at 28nm, etc. This does mean that as a pipecleaner part, Apple does need to be especially mindful of the risks. If an X-series SoC is to lead the charge for the 10nm generation, then it can’t be allowed to be too big. Not that this has stopped Apple from packing in three CPU cores and a 12-cluster GPU design.


Speaking of size, TechInsights’ estimates for area scaling are quite interesting. Based on their accounting, they believe that Apple has achieved a 45% reduction in feature size versus 16nm, which is consistent with a full node’s improvement. This matches TSMC’s earlier statements, but given the challenges involved in bringing newer processes to market, it’s nonetheless exciting to actually see it happening. For chip vendors designing products against 10nm and its 7nm sibling, this is good news, as small die sizes are the rule for pretty much everyone besides Apple.


A10X Architecture: A10 Enlarged


Diving a bit deeper, perhaps the biggest reason that A10X is as small as it is, is that Apple seems to have opted to be conservative with its design. Which again, for a pipecleaner part, is what you’d want to do.













Apple SoC Comparison

                       A10X                  A9X                      A8X                  A6X
CPU                    3x Fusion             2x Twister               3x Typhoon           2x Swift
                       (Hurricane + Zephyr)
CPU Clockspeed         ~2.36GHz              2.26GHz                  1.5GHz               1.3GHz
GPU                    12 Cluster GPU        PVR 12 Cluster Series7   Apple/PVR GXA6850    PVR SGX554 MP4
Typical RAM            4GB LPDDR4            4GB LPDDR4               2GB LPDDR3           1GB LPDDR2
Memory Bus Width       128-bit               128-bit                  128-bit              128-bit
Memory Bandwidth       TBD                   51.2GB/sec               25.6GB/sec           17.1GB/sec
L2 Cache               8MB                   3MB                      2MB                  1MB
L3 Cache               None                  None                     4MB                  N/A
Manufacturing Process  TSMC 10nm FinFET      TSMC 16nm FinFET         TSMC 20nm            Samsung 32nm

We know from Apple’s official specifications that the A10X has 3 Fusion CPU core pairs, up from 2 pairs on A10, and 2 Twister CPU cores on A9X, all with 8MB of L2 cache tied to the CPU. Meanwhile the GPU in A10X is relatively unchanged; A9X shipped with a 12 cluster GPU design, and so does A10X. This means that Apple hasn’t invested their die space gains from 10nm in much of the way of additional hardware. To be sure, it’s not just a smaller A9X, but it’s also not the same kind of generational leap that we saw from A8X to A9X or similar iterations.


Unfortunately TechInsights’ public die shot release isn’t quite big enough or clean enough to draw a detailed floorplan from, but at a very high level we can make out the 12 GPU clusters on the left, along with the CPU cores to the right. Significantly, there aren’t any real surprises here. TechInsights heavily compares it to the A9X and there’s good reason to do so. IP blocks have been updated, but the only major change is the CPU cores, and those don’t take up a lot of die space relative to the GPU cores. This is what allows A10X to be more powerful than A9X while enjoying such a significant die size decrease.


As for the GPU in particular, Apple these days is no longer officially specifying whether they’re using Imagination’s PowerVR architecture in their chips. Furthermore we know that Apple is developing their own GPU, independent from Imagination’s designs, and that it will be rolled out sooner rather than later. With that said, even prior to today’s die shot release it’s been rather clear that A10X is not that GPU, and the die shot further proves that.


Apple’s developer documentation has lumped in the A10X’s GPU with the rest of the iOS GPU Family 3, which comprises all of the A9 and A10 family SoCs. So from a feature-set perspective, A10X’s GPU isn’t bringing anything new to the table. As for the die shot, as TechInsights correctly notes, the GPU clusters in the A10X look almost exactly like the A9X’s clusters (and the A10’s, for that matter), further indicating it’s the same base design.



Image Courtesy TechInsights


Ultimately what this means is that in terms of design and features, A10X is relatively straightforward. It’s a proper pipecleaner product for a new process, and one that is geared to take full advantage of the die space savings as opposed to spending those savings on new features/transistors.


Otherwise I am very curious as to just what this means for power consumption – is Apple gaining much there, or is it all area gains? A10X’s CPU clockspeed is only marginally higher than A9X’s, and pretty much identical to A10, so we can see that Apple hasn’t gained much in the way of clockspeeds. So does that mean that Apple instead invested any process-related gains in reducing power consumption, or, as some theories go, has 10nm not significantly improved on power consumption versus 16nm? But the answer to that will have to wait for another day.



Source: AnandTech – TechInsights Confirms Apple’s A10X SoC Is TSMC 10nm FF; 96.4mm2 Die Size

Google Photos 3.0 Released, Bringing Smarter Sharing, Suggestions and Shared Libraries

Google is rolling out Google Photos 3.0, which features an AI-powered Suggested Sharing feature along with Shared Libraries, “both of which are designed to make the Google Photos app a more social experience, rather than just a personal collection of photo memories,” reports TechCrunch. From the report: With the addition of Suggested Sharing, Google Photos will now prompt you to share photos you took by pushing an alert to your smartphone. The feature will identify people in the photos using facial recognition technology and machine learning, which helps it understand who you typically share photos with, among other things. It also looks at the photos you’ve taken at a particular location, before organizing them in a ready-to-share album by selecting the best shots (e.g., removing blurry or dark photos). You can edit the album if you choose, then share with the people the app suggests, remove suggestions, or add others. Even if your friends or family don’t use Google Photos, you can share by sending them a link via text or email. A second feature called Shared Libraries is designed more for use with families or significant others. This lets you either share your entire photo collection with someone else, or you can configure it to share only selected photos — for example, photos of your children.
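The "removing blurry or dark photos" step can be illustrated with a classic sharpness heuristic, the variance of a Laplacian filter: sharp images have strong edges and therefore high variance. This is a hypothetical sketch of how such a filter might work, not Google's actual algorithm:

```python
# Toy blur detector (not Google's actual algorithm): variance of the
# 4-neighbour Laplacian over a grayscale image. Blurry images have weak
# edge responses, so their Laplacian variance is low.
def laplacian_variance(img):
    """img: 2D list of grayscale pixel values. Returns Laplacian variance."""
    h, w = len(img), len(img[0])
    vals = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (img[y-1][x] + img[y+1][x] + img[y][x-1] + img[y][x+1]
                   - 4 * img[y][x])
            vals.append(lap)
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

sharp  = [[0, 0, 255, 255]] * 4   # hard edge -> strong Laplacian response
blurry = [[0, 85, 170, 255]] * 4  # smooth gradient -> zero response

print(laplacian_variance(sharp) > laplacian_variance(blurry))  # True
```

A real pipeline would combine a score like this with exposure statistics (to drop dark shots) and a learned ranking model, but the thresholding idea is the same: keep the frames that score best.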




Source: Slashdot – Google Photos 3.0 Released, Bringing Smarter Sharing, Suggestions and Shared Libraries

Uber: Discovery shows Waymo has “zero evidence,” plays blame game




Source: Ars Technica – Uber: Discovery shows Waymo has “zero evidence,” plays blame game

NVIDIA Releases 384.76 WHQL Game Ready Driver

A few weeks have passed since driver version 382.53, and it’s time again for another driver update from NVIDIA. Now onto release 384 with driver version 384.76, NVIDIA brings us a good number of bug fixes, along with a Game Ready and Game Ready VR title.


Starting things off, the new Release 384 driver branch doesn’t bring anything new to the table as far as major features go – at least, nothing that NVIDIA has documented. Instead their efforts have been focused almost entirely on bug fixes and performance improvements. To that end, NVIDIA has addressed issues where Firefall would not run at all. GeForce GTX 1080/1070/1060 stuttering during gameplay in Prey (2017) was also addressed, as was texture corruption in No Man’s Sky when SLI was enabled. GTX 970 SLI can now be enabled while Norton 360 is running, as opposed to just when Norton 360 is disabled or in Safe mode. Glitches in Doom (2016) under the Vulkan API were also fixed.


NVIDIA also resolved CPU bottlenecks that occur when 3DVision is enabled, as well as issues with DirectX 12 titles failing to capture via GameStream. Issues with choppy video playback on the Windows Store video app while V-Sync was off were fixed. Last and probably least, a typo was corrected in NVIDIA Control Panel for the Command & Conquer Tiberium Alliances name on the Stereoscopic 3D Compatibility page.



The Game Ready headliner for this edition is the LawBreakers “Rise Up” Open Beta, while the Spider-Man: Homecoming VR Experience is the token Game Ready VR title. Slated to launch on August 8th, LawBreakers is developed by Cliff Bleszinski, whose pedigree includes games from the Unreal and Gears of War series, and is a class/character-based sci-fi ‘hero shooter,’ similar to games like Overwatch and Team Fortress 2. NVIDIA has polished their drivers in preparation for the “Rise Up” Open Beta, which runs from June 30th to July 3rd. As for Spider-Man, the Spider-Man: Homecoming VR Experience is a free tie-in to the upcoming Spider-Man: Homecoming film, and will be available for download on June 30th. While Homecoming VR is not exactly a full game, and is described in the title itself as a “VR experience,” NVIDIA has prepped their drivers in advance regardless.


Wrapping things up, NVIDIA has added an SLI profile for FIFA 17, as well as adding a “debug” option in the NVIDIA Control Panel Help menu, which removes all overclocking performance and power settings.


The updated drivers are available through the GeForce Experience Drivers tab or online at the NVIDIA driver download page. More information on this update and further issues can be found in the 384.76 release notes.



Source: AnandTech – NVIDIA Releases 384.76 WHQL Game Ready Driver

StarCraft Remastered devs unveil price, explain how much is being rebuilt




Source: Ars Technica – StarCraft Remastered devs unveil price, explain how much is being rebuilt