OK. My 14-year-old is in need of a new PC. It will likely be a Christmas present, so I have time, but I started researching now. One option that caught my eye was this all-in-one from Lenovo:
http://www.computershopper.com/desktops/reviews/lenovo-ideacentre-aio-y910
I have seen it for sale with a 1080 for $1,599.
I priced out a home-built system and I am getting to over $1,600 with a 1070 before I even add a monitor (sticking with Nvidia).
So my question is... since this AIO is more like a desktop, with easy access to upgradeable parts such as the CPU, memory, storage, etc., and it uses desktop parts for the CPU and video card... what would you think the downside is?
Only one I can think of is that if the screen goes for some reason, the whole PC is shot. Thoughts?
Most are glorified laptops, meaning extremely limited upgradeability and a tendency toward heat issues if you aren't careful.
The Lenovo you have linked may happen to have some upgradeability, but I'd still think about it just like I would a laptop.
A lot of people are fine gaming on laptops; some even prefer it due to the portability. Here, you're getting most of the downsides without the one great upside of a laptop.
That's my opinion about AIOs, at least insofar as it relates to gaming.
I've worked with them in the past, and despite their claims they are terrible for gaming. I won't own one.
Here is my quick pcpartpicker.com link. I didn't research specifics, but just wanted a general idea of what it will run (got it down a bit from my first pass):
https://pcpartpicker.com/list/RhqB3F
Note I'd have to add a monitor on top of that...
It's probable that the motherboard, power supply, and some other components aren't exactly of the caliber that you'd build in your own rig.
There's also the issue of monitors. You should be able to move a monitor from one computer to another, or get a new monitor without having to get a computer. With an all-in-one, you can't. I'm not sure if it's even possible to use multiple monitors with that all-in-one; with an ordinary desktop, it's trivial to connect more.
From the review, it looks like the CPU and GPU cooling really isn't very good. In particular, even their low-power CPU hit 93 C under a gaming load. Some other game or program might happen to push the CPU considerably harder, and then bad things happen.
Ridelynn is right: it's better to think of an all-in-one as a larger laptop, not a smaller desktop.
For example, you could save a lot of money on the power supply by getting this instead:
https://www.newegg.com/Product/Product.aspx?Item=N82E16817151136
And you'd still have a far superior power supply to what comes in that all-in-one, certainly by wattage and probably by quality, too.
I guess I'll focus on watching for the individual parts to come on sale and building it.
I always like that, and it's a learning experience for a 14-year-old... I just fear that possibility of a black screen on boot-up and the hours and hours of trying to figure out whether it was something I did or a bad part.
The problem with all-in-ones is that the monitor and other components typically last much longer than the computer. You can typically run the same monitor, speakers, keyboard, mouse, and microphone across 2~3 towers. That means in 3~4 years, when you need to upgrade, you will need to repurchase three of those as well.
Do you want to be my dad? lol
Speaking as a former Dell warranty tech: never get an AIO. I would rather have worked on a hardened laptop.
This is just common sense!
"The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."
- Friedrich Nietzsche
I was gonna mention, right now GPU prices are heavily inflated due to a sharp increase in demand from bitcoin miners. That should hopefully die down within a few months, and by Christmas you will probably have a little more to work with in the budget.
"The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."
- Friedrich Nietzsche
That is interesting; I was told it was bitcoin, and I'm not even familiar with Ethereum personally. However, yeah, it doesn't make sense that a 1080 would be worse than a 1070, for example. My understanding of that sort of dataset (assuming Ethereum is similar to bitcoin) is that it's not especially memory intensive. So, who knows.
"The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."
- Friedrich Nietzsche
That's generally considered a major flaw of bitcoin. Rather than coins being mined by millions of members of the general public, they're only mined in any meaningful volume by the handful of people with a bitcoin mining ASIC. No one else would make enough money by mining bitcoins to cover their power costs. I read one article a while ago that said that a majority of the entire world's bitcoin mining capability is owned by one person in China.
Ethereum was created with the explicit goal of avoiding that problem. Here's their hashing function:
https://github.com/ethereum/wiki/wiki/Ethash
The hashing function is dominated by the work of doing random table lookups into a table somewhat larger than 1 GB. That's large enough that no processor I'm aware of (interpreted loosely to include CPUs, GPUs, FPGAs, custom ASICs, etc.) can cache it on die, so the work comes down to random-access lookups to off-die memory.
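To make that structure concrete, here's a toy Python sketch of a memory-hard hash in the spirit of what's described above. To be clear, this is not the real Ethash algorithm: the table is shrunk so the example actually runs, and the mixing function is just a stand-in, but the defining feature (a chain of data-dependent random lookups into a big table) is the same.

import hashlib
import numpy as np

# Toy memory-hard hash -- NOT the real Ethash. The real table (DAG) is over
# 1 GB; this one is tiny so the example runs quickly.
TABLE_WORDS = 1 << 20          # stand-in table: ~1M 64-bit words
ACCESSES_PER_HASH = 64         # Ethash-style: 64 random lookups per hash

rng = np.random.default_rng(0)
table = rng.integers(0, 2**63, size=TABLE_WORDS, dtype=np.uint64)

def toy_hash(nonce: int) -> bytes:
    # Seed the mix from the nonce, then bounce around the table.
    mix = int.from_bytes(hashlib.sha256(nonce.to_bytes(8, "little")).digest()[:8], "little")
    for _ in range(ACCESSES_PER_HASH):
        idx = mix % TABLE_WORDS   # data-dependent index: you can't prefetch it
        mix = (mix * 0x100000001B3 ^ int(table[idx])) & (2**64 - 1)
    return hashlib.sha256(mix.to_bytes(8, "little")).digest()

print(toy_hash(12345).hex())

Each lookup depends on the previous one, so the hardware can't predict where the next read will land, and with a full-size table almost every read misses every cache and goes out to memory.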
GDDR5 memory requires 128-byte alignment, so any access costs as much as reading the full 128 bytes, even if you only grab 4 bytes and ignore the rest of the cache line. One hash takes 64 accesses, or 8 KB worth of reads. Some simple arithmetic gives you that a GPU with 256 GB/s of memory bandwidth could do up to 256/8 = 32 million hashes per second, or rather a little less than that, because 8 KB is 8192 bytes, not 8000. You can't actually exhaust a GPU's theoretical memory bandwidth, and there is a little extra work in occasionally setting up the table, but it's probably not a coincidence that the Radeon RX 480 and RX 570 and the GeForce GTX 1070, all rated at 256 GB/s, all score around 25 million hashes per second, not far under that theoretical cap.
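Working that arithmetic out explicitly, using the numbers from the paragraph above:

# Back-of-the-envelope hash-rate ceiling from memory bandwidth alone.
bandwidth_bytes_per_sec = 256e9   # ~256 GB/s (RX 480 / RX 570 / GTX 1070 class)
access_size_bytes = 128           # minimum useful GDDR5 access
accesses_per_hash = 64            # lookups per hash

bytes_per_hash = access_size_bytes * accesses_per_hash   # 8192 bytes
max_hashes_per_sec = bandwidth_bytes_per_sec / bytes_per_hash

print(f"{max_hashes_per_sec / 1e6:.1f} million hashes/s ceiling")
# -> about 31 million hashes per second, so the ~25 million those cards
#    actually get is not far under the cap.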
GPUs, like CPUs, have a variety of caches to try to avoid using excessive memory bandwidth. Ethash is explicitly designed to break the functionality of those caches and make you lean heavily on global memory bandwidth.
Ever since Fermi, Nvidia's GPUs have had more sophisticated memory controllers than AMD's. They do that to try to keep the memory controller from choking on realistic workloads whose memory accesses don't hit all of the physical memory chips evenly. If you're inclined to do so, it's pretty trivial to design an access pattern that makes every memory access on an AMD GPU hit the same memory channel, and then performance completely chokes. A milder version of that does happen in real software sometimes, too. Nvidia's more sophisticated memory controller makes it nearly immune to that particular problem, as you'd basically have to reverse-engineer a large chunk of their memory controller to come up with an access pattern that sends everything to the same channel.
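Here's a rough toy model of what that channel-conflict problem looks like; the channel count and interleave granularity are made-up illustrative numbers, not the real mapping of any actual GPU:

# Toy model of a memory controller that interleaves addresses across channels
# in fixed-size chunks. Real mappings are more complicated (and, as noted
# above, Nvidia's are deliberately hard to predict).
NUM_CHANNELS = 8          # illustrative only
INTERLEAVE = 256          # bytes sent to one channel before moving to the next

def channel_of(addr: int) -> int:
    return (addr // INTERLEAVE) % NUM_CHANNELS

# Friendly pattern: consecutive 128-byte reads spread across the channels.
good = [channel_of(i * 128) for i in range(16)]

# Hostile pattern: stride of NUM_CHANNELS * INTERLEAVE bytes, so every access
# lands on the same channel and that one channel becomes the bottleneck.
bad = [channel_of(i * NUM_CHANNELS * INTERLEAVE) for i in range(16)]

print("spread across channels:", good)
print("all on one channel:    ", bad)

On a simple interleaved controller like this, the hostile pattern gets roughly 1/8 of the total bandwidth, which is the kind of choking described above.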
But the extra complexity means that some Nvidia GPUs choke if you ask them to do random accesses to a large enough memory buffer. I'm not sure why that happens. The article that Cleffy linked only lists Pascal GPUs among Nvidia options. Had they tested older Fermi, Kepler, and Maxwell GPUs, I'd be surprised if they didn't see a lot of them choking with far lower performance than you'd expect from the paper specs, like the GTX 1080 does. I also wouldn't be surprised if the GTX 1060 and GTX 1070 similarly choked if you made the dataset larger--and the Ethereum dataset is designed to grow over time.
AMD has talked up the HBM2 on Vega constituting a "high bandwidth cache". I'm not sure exactly what that means, but I've interpreted it as meaning that Vega is moving to a more sophisticated memory controller like what Nvidia GPUs have had for several years now. I don't know if that will mean that Vega chokes like a GTX 1080 and some other Nvidia GPUs on random accesses to large datasets. But it wouldn't surprise me if it does.