|
Post by itsnoot on Oct 5, 2017 10:27:58 GMT -5
I'm feeling the itch, I won't lie. There are so many options on the table right now too, it's a beautiful thing. I watched the bottom fall out on z270/Kaby Lake pricing and almost jumped. Glad I didn't, because a couple days later all this Coffee Lake stuff starts floating in. Then there's AMD making themselves relevant with Ryzen for the first time in... what, 6, 7 years? Is this actual competition???
GPUs too. AMD seems to be eating up Nvidia in DX12/Vulkan titles but the Volta line is right around the corner. I see speculation that Volta will make big improvements there, looking forward to seeing what shakes out.
My current 4770K + GTX 970 have done very well for me but my gut tells me DX12 will start to chip away at that. The GPU if nothing else. 8700K on z370 is sooo tempting, but I really haven't felt a CPU choke yet - and that's at stock clocks. I haven't even pressed an overclock on this bad oscar yet. I might just upgrade my machine and push this setup down to my son. Or am I just looking for excuses to buy something? Heh.
Why are new toys so damn tempting?
|
|
|
Post by Simian on Oct 6, 2017 0:22:13 GMT -5
Yeah, Ryzen has been a great thing for competition in the CPU world. Intel finally looks like they're shaking up their product line in reaction to AMD gaining a lot of market share relatively quickly.

The thing with gaming right now is, unless you're playing at 1080p@144Hz, the GPU is almost entirely going to be the bottleneck, as most of the i5/i7s from the past 5 years have held up relatively well. As awesome as my Core 2 Duo E8400 was, after 3-4 years it was definitely showing its age. But with CPUs released after the Sandy Bridge generation, performance hasn't really increased all that much, and neither has the number of super demanding (on the CPU) PC games. I know that temptation to move on to a new system, though. I have an i5 4670K, the system is about 4 years old, and I just want to move to a new platform and build a new system, even though performance is awesome after upgrading my GPU to a 1080.

As for Coffee Lake, I think the i5 8600K is going to be the one to watch in terms of gaming performance. Benchmarks were showing it trading blows with the i7 7700K, and it looks like most of the chips reviewers were getting were overclocking to 5GHz relatively easily. So it's 6 cores, clocks well, and it's $259. Not bad. AMD is also apparently releasing a refresh of the Ryzen chips in February; hopefully they clock a little better (rumor has it that the 12nm process they're moving to should increase clocks by at least 10%).
|
|
|
Post by Coolverine on Oct 6, 2017 10:33:04 GMT -5
I've been using a Core i5 3570K (Ivy Bridge) for 5 years and it's still holding up great.
|
|
|
Post by Cop on Oct 6, 2017 10:33:17 GMT -5
Why are new toys so damn tempting? Because while it may seem huge at first, your sphincter quickly gets used to it and before you know you need something bigger to feel that same sensation... ...oh wait, you people were talking about computers... ...carry on, disregard what I said...
|
|
|
Post by itsnoot on Oct 6, 2017 10:41:07 GMT -5
Heh. The only practical reason I would have to upgrade my rig is to push my old parts down to my son. He's still rocking an overclocked Q9550, which has held up longer than I ever thought imaginable. With the FSB bumped to 400 MHz it gets close to Sandy Bridge single-thread performance, at least on the CPU-Z bench. No idea what's going on under the hood, especially in terms of instruction sets.
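For the curious, the Core 2 era clock math is just FSB times multiplier. A quick sanity check of that overclock, assuming the Q9550's stock 8.5x multiplier and 333 MHz FSB:

```python
# Core 2 era CPU clock = FSB x multiplier.
# Assumed figures: the Q9550's stock 8.5x multiplier and 333 MHz FSB.
STOCK_FSB_MHZ = 333
OC_FSB_MHZ = 400
MULTIPLIER = 8.5

stock_ghz = STOCK_FSB_MHZ * MULTIPLIER / 1000
oc_ghz = OC_FSB_MHZ * MULTIPLIER / 1000
gain = (oc_ghz - stock_ghz) / stock_ghz

print(f"stock: {stock_ghz:.2f} GHz, overclocked: {oc_ghz:.2f} GHz (+{gain:.0%})")
```

So the 400 MHz FSB works out to 3.40 GHz, about a 20% bump over the stock 2.83 GHz.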
I was reading 8700K reviews last night. Heat looks a little scary. The question I have that isn't being answered is: how worth it is a "K" model if the thing is so close to its heat ceiling at stock clocks? Also, why in the world is Intel still using TIM under the lid? I wish I was brave enough to delid...
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Oct 6, 2017 10:45:36 GMT -5
|
|
|
Post by ForRealTho on Oct 6, 2017 12:55:35 GMT -5
I haven't owned a desktop since my last one died in 2010 and I went laptop only.
Wanted to hold out for Ryzen laptops, but I needed a new machine for PUBG, so I am on a 17" Acer Predator: i7-6700HQ, 1070, 16 gigs DDR4-2400, 256 gig SSD for the OS, 1 TB SSD for games. Picked it up in early July.
Supposedly Ryzen laptops are coming out in a few months. I wish I could have held out, but the 960M in my old laptop simply wasn't up to playing PUBG.
I don't give a shit about most big AAA titles so I am hoping the 1070 will keep me going for a few years at least.
|
|
|
Post by Babel-17 on Oct 6, 2017 13:24:05 GMT -5
Eventually DX12 will mean higher efficiency for all setups. But because nVidia became experts at DX11 after pouring tons of effort into it and working closely with developers to use their optimizations, they sometimes do worse with DX12. Though with Vulkan, the OpenGL world's answer to DX12, they do better. I guess that's because of the influence of id Software, who really know how to work with nVidia drivers.

Feel free to upgrade, lol, but you don't have the excuse of having aging hardware. Ironically/hypocritically, I went from a GTX 970 (I gave that to my canvassing partner) to a GTX 1070, and an i7 2600 (non-K) to an i7 6700 (non-K). But I have a good excuse! I bought a 4k Sony at Best Buy, and the salesman helpfully pointed out the promotional offer of double points on a Best Buy credit card if I took one out with this purchase. And I also got a coupon for 5% off my next purchase within x amount of days.

New 4k set, and I got the FE GTX 1070 for $108.00 off. Not a bad deal considering that over a year later it's held its price. Granted, this is because the 1070s are better for mining than the higher-end cards from nVidia; regular GDDR5 somehow is less of a bottleneck than the better memory, GDDR5X, that they use. Having the 4k set, even though it did an amazing job of scaling games from the GTX 970, was tempting me to upgrade. It more or less is the difference between 1080p and 1440p, and for older games, straight-up 4k. Though that can require some tricks to get enabled in games that don't natively support it.

A few months later my Dell XPS mobo died, and on the same day my backup PC refused to boot. Like William Burroughs with an empty heroin cupboard, I again hustled down to Best Buy, and while I knew I was paying too much, I slapped down my CC for an iBuyPower PC that was on sale. Quite happy with it so far. It has an ASUS H170 Pro mobo, and it was easy to work with. I added some nice case fans, and tossed in my 1070 and my gold-rated PSU, which I could now mount from the bottom. The case has filters bottom and front that are held on magnetically. I was also able to add one of those newfangled M.2 SSDs, as it has a slot for that. It also came with a USB-C port, which I found a use for.

Later on I was able to boot my backup PC, a Dell XPS 420 with the Core 2 Quad Q6600 at 2.4GHz. Its HD crapped out later, but since I had the one from the other Dell free, I swapped it in. Still games well, though I noticed that Crysis 2 is less CPU-bound with an nVidia GPU. I'd been running my monstrously long HIS AMD HD 7950 in it (back when it was released, the XPS 420 represented a seriously high-end offering from Dell and had a very large case), but now I could toss in the GTX 1060 that came with my new PC (partly why I overpaid, I didn't need such a nice GPU), since that slot had gone to my new GTX 1070.

TL;DR: I used to use a benchmarking tool to keep tabs on the HD 7950's performance, and I let it store its records. Long story short, it was CPU-bound at 1080p. With the GTX 1060 I could get higher framerates in areas that tended to stress the CPU. I now have a 4k monitor, so 60 fps is all I'll need from my CPU, and anyway it will be a long time before GPUs in my price range can treat that resolution as normal for a AAA title. Your 4770K edges out my non-K CPU; you're a very long way away from needing to upgrade that, unless video encoding is your thing. The more cores the merrier for that.

HardOCP recently broke out an old i7 2600K, overclocked it, and benched it in games against the latest and greatest from AMD and Intel, with those also overclocked. All of them pumped out dazzling framerates at the non-GPU-bound resolution they used with a 1080 Ti. Even an i5 scarcely trails, as video games on the PC just don't use that many more threads, and they don't need all that much more oomph than their anemic console counterparts require. Unoptimized RTS games aside, until we see consoles with beefy CPUs, nobody with a decent PC CPU is in any danger of being left behind.

The only caveat to that is players of online AAA twitch shooters. Some like to use 1080p monitors at 144Hz, so they need a steady framerate to match. That framerate, plus the overhead of the online match, means nobody can guarantee years and years of continued performance like that with titles yet to be released. Not with older generations of CPUs. But that ain't me, nor is it at all likely to be me. I just want my 60 fps without breaking a sweat. I'd like to know I could hit 120 fps with my CPU, but I'm not envisioning any new TV in my immediate future that could support that, and it will be a long time before reasonably priced GPUs can support AAA titles at 4k and 120 fps. By that I mean AAA titles will keep getting more demanding. Today's AAA titles at 120 fps and at 4k might not be a very long time away from being possible on reasonably priced hardware, but I put that beyond the time where I'd be thinking of upgrading my PC and my HDTV.

Still too long, still didn't read: It's all good; what you and I have will provide magnificent PC gaming for a long time. Thank/blame the consoles, or just the fact that these rigs are so very capable. They represent more than enough power for all able game designers to produce excellent titles. I just got done playing F.E.A.R., its DLCs, and F.E.A.R. 2. Even at 4k they barely register on my CPU's and GPU's performance counters, but they're a lot of fun and demonstrate how a game is about the design. The use of light and shadows alone makes for an important part of the total.
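The CPU-bound vs GPU-bound point boils down to the framerate being capped by whichever stage is slower. A toy sketch of that logic (all numbers are invented for illustration, not benchmarks):

```python
# Framerate is roughly min(CPU-side fps, GPU-side fps). Raising the
# resolution slows the GPU stage but barely touches the CPU stage,
# which is why CPU differences fade at 4k. All numbers are invented.
def effective_fps(cpu_fps, gpu_fps_at_1080p, pixel_scale):
    gpu_fps = gpu_fps_at_1080p / pixel_scale  # crude: GPU time ~ pixel count
    return min(cpu_fps, gpu_fps)

for label, scale in (("1080p", 1.0), ("1440p", 1.78), ("4k", 4.0)):
    fps = effective_fps(cpu_fps=140, gpu_fps_at_1080p=200, pixel_scale=scale)
    print(f"{label}: ~{fps:.0f} fps")
```

At 1080p the CPU is the cap, which is why CPU reviews bench there; at 4k the GPU is, so an old 2600K and the latest chip land in roughly the same place.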
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Oct 6, 2017 14:44:39 GMT -5
An issue with laptops is that they've been hit hard by inflation over the last 10 years for some reason. Back in 2008, I bought a mid-range gaming laptop (on Newegg) for about $700. Today, a new mid-range laptop (with a 1060 GPU) on Newegg would set you back over $1100.
www.newegg.com/Product/Product.aspx?Item=2WC-000N-00068
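As a rough check on that comparison, adjusted for general inflation (the ~14% cumulative US CPI figure for 2008-2017 is an approximation, so treat the exact numbers loosely):

```python
# Compare the 2008 laptop price to 2017, adjusted for general inflation.
# The ~14% cumulative CPI factor for 2008-2017 is an approximation.
CPI_FACTOR = 1.14
price_2008 = 700
price_2017 = 1100

adjusted_2008 = price_2008 * CPI_FACTOR
real_increase = price_2017 / adjusted_2008 - 1

print(f"${price_2008} in 2008 is about ${adjusted_2008:.0f} in 2017 dollars;")
print(f"a ${price_2017} laptop is ~{real_increase:.0%} above plain inflation")
```

So mid-range gaming laptops really have outpaced ordinary inflation, not just tracked it.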
|
|
|
Post by itsnoot on Oct 6, 2017 16:09:19 GMT -5
I don't upgrade often enough to trust a laptop. Too much heat cooking the goods in there for my taste. I like to keep things very well cooled and then tune in a moderate overclock when I start to feel it choke. Not really doable with a laptop.
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Oct 6, 2017 16:41:45 GMT -5
Yeah, durability could be a problem if you choose the wrong laptop (one with a powerful GPU and weak cooling). In which case, if the GPU or some other vital component that can't be easily swapped goes out, you're pretty much screwed. Unless you get one of those massive "desktop replacement" laptops, with huge heatsinks and turbo fans. Or unless you go for a laptop with a GPU that's custom-made for laptops (i.e. a GeForce GTX 1070 8 GB with Max-Q design). But those types of laptops are even more expensive...
www.newegg.com/Product/Product.aspx?Item=N82E16834234727
Last month, I bought a Dell XPS 15 laptop with a 1050 4GB GPU. Heat isn't an issue and it's very quiet. Only good for light-duty gaming, of course. Also, earlier this year, I put together a custom desktop with a 1080 GPU and 7700K CPU, a very capable gaming rig.
|
|
|
Post by ForRealTho on Oct 6, 2017 18:51:41 GMT -5
My last laptop I used for almost exactly two years and sold it for half of what I paid for it. No issues.
|
|
|
Post by itsnoot on Oct 9, 2017 8:22:50 GMT -5
I appreciate the reality check. I think we are of the same mind, at least when I'm in my right mind, that is.
I don't upgrade as often as most enthusiasts, I just get tempted to when new hardware comes rolling out. I guess I'm about a 5-7 year guy, where I think many people are 2-3 year. It helps that I don't play twitch shooters; my style is more exploratory, where frame rates don't necessarily need to be high. I do likes my eye candy though, don't we all?

I may yet get something going, but it will be based on the needs of upgrading my second machine for the kid. As long as that overclocked C2Q is cranking it out for him, I have no reason to pull the trigger. Seriously, I really can't believe that thing still holds water 10 years later. If I didn't boot up the Destiny 2 beta and see it for myself, I don't think I would have believed it was playable. I think that chip is a killer sample because it runs a 20% OC at stock voltage. Not that I'm bragging, because I'm nowhere near hardcore; I basically raised the FSB to 400 without even trying. I've pressed it further, but the power supply in that machine really starts to cry around 430 FSB, so I backed it off. I'm not about to buy a new power supply to overclock a 10-year-old chip. I did buy a GTX 960 from a friend for $50 to plop in there, because the 560 Ti it was coupled with was definitely choking on eye candy.

As long as I'm rambling... I tried my hand at an OC of my 4770K. My luck got burned on that C2Q, because this thing is a dud, lol. It hits a brick wall at 4.3 GHz despite being well under thermals. I technically could have pushed some more voltage, but it was starting to get sketchy for hardware I intend to keep for 5-ish more years. I backed it off to 4.2, where the voltages and thermals were much more agreeable. A 15% OC is nothing to shake a stick at, I suppose. It just stings reading about the chips that rocket up to 4.7 and 4.8 with relative ease. *shakes fist*
|
|
|
Post by Babel-17 on Oct 9, 2017 10:55:47 GMT -5
Imo, a sensible upgrade path would be waiting for a next-generation video card. Prices, due to miners, haven't dropped enough to justify buying a new card with more-than-a-year-old nVidia technology, or AMD's tepid competition to it. Maybe mining will collapse, and/or nVidia will speed up the release of its next generation, in which case prices could start plummeting and you could snap up a bargain. But mining is still here, and both nVidia and AMD can easily get the prices they want for their GPUs.

4k HDTVs are coming down in price and offer reasonable quality. It looks like this will only get better for 2018. You may or may not want to hold out for an upgrade to the HDMI standard, or TVs offering a DisplayPort connection. Monitors already offer DisplayPort, but their HDTV competition packs in a lot of other features, including very large sizes, that monitors don't come with. Though if you have a nice sound system or game with headphones, and game close to the monitor, you can save a few hundred bucks by going that route. Sony regressed a bit with the panel it now offers (the 2017 X800E) in that it now uses IPS instead of being a VA panel like the one I bought prior to it (the 2016 X800D), but the features of the 2016 set I bought can be had in other models, and it will likely return for 2018. And we no longer have to accept bad lag in order to use a high-quality HDTV panel.
www.rtings.com/tv/reviews/sony/x800d

I expect that for 2018, at the price range I bought at, Sony and its competitors will offer something with some nice improvements. Fingers crossed for either the upcoming HDMI standard or a DisplayPort connector, but at least more brightness (which is important for HDR), more support for all the HDR variants, a snappier processor to handle the Android OS, and a massaging of the other criteria we look for in an HDTV. Anyway, that's my input: look ahead to a new card, and a new HDTV/monitor. Both of those easily carry over to when you go for a new PC build.

I think that by the end of 2018, the new standard that very high-end cards will be judged against will be reasonable framerates, using near-Ultra settings (minus MSAA), at 4k. More mainstream cards will be expected to perform well at 2560x1440. I base that on what will be the power available from consoles, including the upcoming one from Microsoft. The new reality is that consoles work a lot more like PCs, and both PCs and consoles are compatible with DX12. We are joined at the hip with each other. Developers will have both in mind when thinking of image quality. It will be interesting to see what PC-only AAA titles come down the pipe and break all the "rules".

Off topic, but it's no coincidence the console cycle has shrunk. Developers are chomping at the bit for more power, and since they often develop for the PC, the improved IQ that the PC can offer will only get more glaring, and over a shorter period of time than in the past. And this is another reason not to sweat the question of "Is my CPU future-proof?". The console makers are now all about compatibility between past, current, and upcoming versions of their current approach to making their devices. Even the upcoming new king of the hill from Microsoft pales in comparison to the PC in terms of its CPU. Savings had to come from somewhere, as they have to have a decent GPU and lots of memory. Now, when you take the power of the upcoming standard-bearer for the consoles, Microsoft's Project Scorpio/Xbox One X, into mind, and remember that both Sony and Microsoft have sworn to be backwards compatible with their oldest PS4/Xbox One products, you see what game designers are facing. They'll have to put a lot of effort into not having any sloppy code that demands a lot from the CPU. I recently played through a game that was extolled for its diabolically clever AI. It's from 2005, and today's game developers have much, much more headroom to play with, so it's not like the consoles' specs are holding them back.

Though I suppose someone might want to develop a unique kind of game that would benefit from lots of the calculation CPUs excel at, and would thus feel held back by not being able to market to console users, or those with less than the very top-of-the-line CPUs. Well, a bit of patience will fix this situation.
|
|
|
Post by ForRealTho on Oct 9, 2017 13:10:46 GMT -5
FEAR was a lot of fun but I never got around to playing 2 or 3.
|
|
|
Post by Babel-17 on Oct 9, 2017 15:20:58 GMT -5
I just finished 2. Good shooter: the added weapons are brief fun, improved visuals, nice choreography during the "big moments", and an OK story that only suffers because it's a sequel, and a bit less well paced. The AI is set to a lower intelligence imo, going by my playing on the medium setting in both the first one and this one. The enemy soldiers still work together, but some of those fleshing out the opposition don't act intelligently. Lots of content, nice mix of locations. It gets more consistently high quality in the later parts; I guess they called it a wrap before polishing all of it. No glitches I recall, just a bit less involving in parts. It's put together really well. It was made by the developer who did the original, though now part of Warner; the DLCs to the first one were outsourced, but still good shooters, just lacking in the story and the big moments. It will probably be on sale again soon, from either Steam, GOG, or Humble Bundle.
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Oct 10, 2017 10:53:40 GMT -5
If you have a newer laptop with USB-C (for video out) and an external monitor with mini-DP, I suggest ditching the HDMI cable for one of these:
www.amazon.com/Belinda-DisplayPort-Adapter-Aluminium-resolution/dp/B073S6V9CD/ref=sr_1_1?s=electronics&ie=UTF8&qid=1507650551&sr=1-1&keywords=USB+C+TO+Mini+dp+cable%2CBelinda+USB-C
On my setup, connecting my laptop to a 3440x1440 monitor, the refresh rate went from 30Hz (HDMI-to-HDMI cable) to 75Hz (USB-C to mini-DP cable). Mouse lag went away and fps in games went way up. It's one hell of an upgrade for less than $20. Of course, part of the problem was a limitation of my external monitor (HDMI 2.0 should be capable of 60Hz, but my monitor is only capable of 30Hz at 3440x1440 over HDMI; says so in the manual). Regardless, even without the monitor limitation, a USB-C to mini-DP cable improves the refresh rate.
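The 30Hz-vs-75Hz gap is basically link bandwidth. A back-of-the-envelope estimate, assuming 24-bit color and a rough ~20% blanking overhead (reduced-blanking modes need less):

```python
# Rough uncompressed video data rate: pixels x refresh x bits per pixel,
# padded by an assumed ~20% blanking overhead (approximation).
def data_rate_gbps(width, height, hz, bpp=24, blanking=1.20):
    return width * height * hz * bpp * blanking / 1e9

for hz in (30, 60, 75):
    print(f"3440x1440 @ {hz} Hz needs ~{data_rate_gbps(3440, 1440, hz):.1f} Gbit/s")

# HDMI 1.4 carries roughly 8 Gbit/s of video data, HDMI 2.0 roughly
# 14.4 Gbit/s, and DisplayPort 1.2 roughly 17.3 Gbit/s, so the same
# panel can do 75 Hz over DP while an HDMI 1.4 link is stuck at 30 Hz.
```

Which fits the observed behavior: 3440x1440@75 lands comfortably inside DP 1.2, but well past what an HDMI 1.4-class link can carry.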
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Oct 10, 2017 22:36:28 GMT -5
I noticed fps drop off in games while my laptop was unplugged, and fps went back up again when it was plugged in. In an attempt to get fps higher while the laptop is unplugged and running on battery, I tried tweaking the power management settings in Windows 10, but to no avail. Next, I looked at BIOS settings. I noticed a feature called Intel SpeedStep, and it's fairly obvious that it does something to the effect of throttling the CPU. I turned that off for a minute or two to see if doing so would boost fps (while the laptop is running on battery), and boy did that cause the keyboard to start cooking (on the left side, around the Caps Lock... evidently close to where the CPU is located). Disabling this feature didn't solve the fps issue. But lesson learned: don't disable Intel SpeedStep on a laptop, as it does a good job of keeping CPU temps under control.
|
|
|
Post by itsnoot on Oct 10, 2017 23:05:37 GMT -5
SpeedStep lets the CPU multiplier drop during idle times. The only reason to turn it off is pressing for high overclocks.
Like I said, I'm not a laptop man, BUT it would not surprise me a bit if the VGA driver (or a BIOS setting) chops the GPU clock down when unplugged.
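That multiplier-dropping behavior can be sketched as a toy model. The bus clock and multiplier table below are purely illustrative, not any real chip's P-state list:

```python
# Toy model of SpeedStep-style scaling: pick the lowest multiplier that
# covers the current load instead of running flat out. The bus clock and
# multiplier table are illustrative, not a real chip's P-states.
BUS_CLOCK_MHZ = 100
MULTIPLIERS = [8, 16, 24, 32, 35]  # idle ... max turbo

def pick_multiplier(load):
    """load is 0.0-1.0 of the chip's max throughput."""
    needed = load * MULTIPLIERS[-1]
    for m in MULTIPLIERS:
        if m >= needed:
            return m
    return MULTIPLIERS[-1]

for load in (0.05, 0.50, 1.00):
    m = pick_multiplier(load)
    print(f"load {load:.0%}: x{m} -> {m * BUS_CLOCK_MHZ} MHz")
```

Turning SpeedStep off effectively pins the chip at the top of that table, which is why the keyboard got toasty even when not much was running.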
|
|
|
Post by Babel-17 on Oct 11, 2017 9:33:01 GMT -5
If you have an Nvidia GPU, go into the Nvidia Control Panel, Global Settings, and see if changing "Power Management Mode" from "Optimal", the default, to "Prefer maximum performance" helps. It's generally good to leave it at the default, as it's pretty smart about lowering clock speeds and voltages when you're using v-sync and/or sitting in a game's menu. But it's OK to use it, and better still, you can apply "Prefer maximum performance" to only the individual game that you think needs it. So you can skip Global Settings and go straight to Program Settings. Two caveats: it takes time for the list of your games to load, and very often you'll have to browse to the game's directory and point out its executable. If you set the driver install to include Nvidia's extra feature that optimizes all your games' settings, you can probably do this from within that.
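One way to check whether that setting actually took effect is to watch the GPU's performance state while a game is running, e.g. via `nvidia-smi --query-gpu=pstate,clocks.sm --format=csv,noheader`. A small sketch that parses that output; the sample strings below are hypothetical readings, not captured from a real card:

```python
def parse_gpu_state(csv_line):
    """Parse one line of `nvidia-smi --query-gpu=pstate,clocks.sm
    --format=csv,noheader` output into (pstate, sm_clock_mhz)."""
    pstate, clock = [field.strip() for field in csv_line.split(",")]
    return pstate, int(clock.split()[0])

# "Prefer maximum performance" should hold the card in P0 at full clocks;
# the default "Optimal" lets it drop to P8 and a few hundred MHz at idle.
print(parse_gpu_state("P0, 1695 MHz"))  # ('P0', 1695)
print(parse_gpu_state("P8, 139 MHz"))   # ('P8', 139)
```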
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Oct 11, 2017 10:55:34 GMT -5
Thanks for the suggestion, but I already tried that and didn't see any changes in fps. The newer laptops with Nvidia dedicated graphics also include Intel HD Graphics (and you can switch between the two in the Nvidia Control Panel).
The inclusion of Intel integrated graphics means many options (such as Display) have been completely removed from the Nvidia Control Panel and are handled exclusively in "Intel Graphics Settings". It's one of the most ridiculous design choices I've come across when dealing with PCs: they deleted every section from the Nvidia Control Panel except 3D Settings. Basically, the Intel graphics settings get exclusive control of resolution and refresh rate (among most everything else) and the Nvidia drivers have zero say in it. It's utter BS and should be the other way around.

Also, I believe I found the Intel graphics feature that's causing this issue. Going to "Intel HD Graphics Control Panel" --> "Power" --> "On Battery" and selecting the "Balanced Mode" graphics power plan reveals a "Global Settings" box (which is hidden otherwise). In this box there's an option called "Extended Battery Life for Gaming". Clicking the "?" next to it shows a description of the BS it's doing to my gaming performance: "Select Enable to extend the battery life through dynamic control of the frame rate. Select Disable to turn off this option." Of course, I disabled it. But there was no effect, so I'm guessing the Intel graphics drivers are bugged and the feature is always on.

Yeah, if purchasing a new gaming laptop is in your plans, consider waiting for one with Ryzen and AMD graphics.
|
|
|
Post by ForRealTho on Oct 11, 2017 13:28:03 GMT -5
My Predator with a 1070 does not include Nvidia Optimus, thank god. It's pure Nvidia all the way. That means higher power usage, but you get the full Nvidia features and not the cut-down Optimus versions.
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Oct 11, 2017 13:38:28 GMT -5
You have a one-generation-prior laptop with an Intel 6700HQ CPU, right? Are there even any new-gen laptops with Nvidia graphics and without the Optimus nonsense? In either case, I don't recall my laptop being advertised with "Optimus" as a feature. So it seems the easiest way to be sure (without buying, testing, and possibly having to return a laptop) would be to get a prior-gen laptop or wait for Ryzen/AMD.
|
|
|
Post by ForRealTho on Oct 11, 2017 14:23:51 GMT -5
I don't think any of the Acer Predators have it.
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Oct 11, 2017 15:33:11 GMT -5
That's good.
I tried every setting. Obviously, the graphics drivers have a feature that's causing this and the drivers are bugged.
Oh well, it's not that big of a deal since I didn't purchase this laptop as a primary gaming rig. I suppose I'll just plug it in whenever I do use it for gaming.
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Oct 12, 2017 19:42:14 GMT -5
Incidentally, even though the Nvidia graphics get crippled down to Intel-integrated-level performance (there seems to be a 30fps cap) while the laptop is unplugged, it's still better to use the Nvidia graphics, because Intel graphics' anisotropic filtering is pure sh!t.
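A hard fps cap like that is easy to confirm from frame timestamps. A tiny sketch with synthetic frame times; in practice you'd log real per-frame durations with a tool like FRAPS or PresentMon, and the 1 fps tolerance is an arbitrary choice:

```python
def estimated_fps(frame_times_s):
    """Average fps over a list of per-frame durations in seconds."""
    return len(frame_times_s) / sum(frame_times_s)

def looks_capped(frame_times_s, cap_fps=30, tolerance=1.0):
    """A cap shows up as the average fps hovering right at a round number."""
    return abs(estimated_fps(frame_times_s) - cap_fps) <= tolerance

# 120 frames of ~33.3 ms each averages 30 fps -- consistent with a 30 fps cap.
times = [1 / 30] * 120
print(round(estimated_fps(times)))  # 30
print(looks_capped(times))          # True
```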
|
|
|
Post by ForRealTho on Oct 12, 2017 19:54:10 GMT -5
Also, you can use Nvidia Inspector to force certain features on an Optimus setup, such as Adaptive V-sync. Thankfully I'm on a G-Sync panel so I don't even bother.
|
|
|
Post by Coolverine on Oct 12, 2017 23:51:10 GMT -5
With Nvidia (even in the regular Nvidia Control Panel) you can also force SSAO in some games, including old ones. I've tried it with Mass Effect 1, Oblivion, and Skyrim. It's not available for most games, but for the ones it does work with, it makes them look a lot better.
As far as I know, only Nvidia has this feature, though with ENB you can pretty much have it in any game whether you use AMD or Nvidia.
|
|
|
Post by itsnoot on Oct 27, 2017 14:32:35 GMT -5
Right when I was content to let it be, the motherboard in the second machine goes poop. Southbridge, I think... That's what I get for buying a used piece from eBay. So, Kaby Lake for me and the Haswell gets handed down. Had to pass on Coffee Lake since it's out of stock and the wife wants her machine back. It's cool though; I saved a ton of cash on blowout prices.
On that note, B&H has amazing prices right now on the 7700K and the Maximus IX Hero if you're looking to jump in. Still expensive enough to push my GPU upgrade way out lol.
|
|
|
Post by Ambience on Jan 18, 2018 12:21:47 GMT -5
I'm getting ready to rip apart my 2008 Dell XPS; it's gotten to the point where the Core 2 Duo just isn't cutting the mustard any more. I'm also switching over to an SSD as well as a Win 10 upgrade. Replacing the CPU with an i5-8400, because aside from a brief stint in ancient times when I had one of the first dual-processor rigs with AMD, I've always been an Intel guy. My upgrades come next week, and I'm looking forward to the system building more than the actual system I'll have after... I bought a spendy GeForce GPU several months ago; time to see how much it really screams.
|
|