this post was submitted on 19 Jul 2023
34 points (88.6% liked)

PC Gaming


For PC gaming news and discussion. PCGamingWiki


 

The previous link was broken, so I've reposted a safer one with archive.org

top 35 comments
[–] vithigar@lemmy.ca 21 points 1 year ago (2 children)

This article needs a clearer title. I agree that upgrading from a 6000 or 3000 series card right now is almost completely pointless, and even going back another generation it's still not a great proposition. But I know people with "gaming PCs" rocking 1650s or even 1050s. Lots of folks with medium or low end several generations old hardware out there, for whom great upgrade options exist.

[–] vita_man@lemmy.world 6 points 1 year ago

In March, I upgraded my video card from 1660 to 6750. I am really happy with how much better things look now, especially while gaming.

[–] Mewtwo@lemmy.blahaj.zone 3 points 1 year ago (1 children)

I finished school and want to start gaming again. My PC has an AMD 370 I bought back in 2015. Is that still a decent GPU to play games on today?

[–] IntegrationLabGod@lemm.ee 5 points 1 year ago

The 370 will struggle with most anything recent since it's only 2GB VRAM. The Radeon 6600 would be an excellent upgrade there.

[–] nivenkos@lemmy.world 15 points 1 year ago (4 children)

Do that many people upgrade every generation?

I still use a 1070, so the GPU comparisons here aren't relevant.

The main issue I hit was deciding between DDR4 and DDR5 RAM since we're in an awkward transition phase - and that affects motherboard and so CPU choices too.

[–] RightHandOfIkaros@lemmy.world 9 points 1 year ago

Upgrading every generation is stupid. I try to upgrade every 5 years if I can afford it.

My 1080ti says the performance gap versus cost to upgrade is not affordable right now. So I gotta keep waiting.

[–] TheHighRoad@lemmy.world 5 points 1 year ago (2 children)

Well, I've had the same CPU/mobo/RAM for over ten years and only upgraded my GPU once, from a GTX 660 to a 5700 XT at the start of the pandemic. I'm finally seeing some issues with modern AAA content. Hogwarts Legacy won't really run at all, for example.

I also haven't wiped my system in the same amount of time, so that may be more the culprit than the system itself. Still going strong!

[–] LyD@lemmy.ca 4 points 1 year ago (1 children)

FYI it probably isn't the 5700XT that's causing issues in Hogwarts, mine works fine.

[–] TheHighRoad@lemmy.world 2 points 1 year ago

I think it's a memory issue, most likely due to the sorry state of my Windows installation. Need to knock off the lazy and wipe it, but it's pretty remarkable that it works as well as it does. I started with fresh Win7 and have survived upgrades to Win8 and Win10 in addition to the major feature updates that come now and again. I thought it was totally borked a few years back but some obscure automated tool managed to fix it.

IT BELONGS IN A MUSEUM!

[–] nivenkos@lemmy.world 3 points 1 year ago (2 children)

The CPU becomes the real issue though - which then means changing motherboard, which means changing RAM, etc. and then you might as well get an NVMe too etc.

[–] TheHighRoad@lemmy.world 2 points 1 year ago

I've come to realize that I don't really "upgrade" anything but the GPU and adding storage. I've never so much as dropped in a new CPU without going through the whole rigamarole you just described. Build them to last, folks.

[–] mangofromdjango@feddit.de 1 points 1 year ago* (last edited 1 year ago)

Sometimes you get around that for longer by upgrading to the highest possible configuration on that platform. Often for cheap second hand.

I replaced my 2017 Ryzen 1800x with a Ryzen 5800x3D recently which is supported on my x370 Motherboard. Huge upgrade, no platform change required. I think I can wait for DDR5 and a new motherboard for years to come.

[–] wccrawford@lemmyonline.com 3 points 1 year ago (1 children)

I used to upgrade every generation, and yeah, it was stupidly expensive. But it was my only hobby, and you could actually see performance increases each time.

But for the last 10 years or so, there's been much less point. Sometimes there are major advances (CUDA, RTX) that make a single-generation upgrade worthwhile, but mostly it's just a few FPS at the highest settings. So now I just upgrade every few years.

[–] TheHighRoad@lemmy.world 2 points 1 year ago

Back in the 90s and early 00s, frequent upgrades were kind of required to stay up to date with new games. The last 10-15 years have been muuuuch slower in that regard, thanks to consoles I guess. I'm not complaining, but I miss the sense of developers really pushing boundaries like they did in the old days.

[–] dandroid@dandroid.app 1 points 1 year ago

The only reason I upgraded my 10 series to a 30 series is that I'm a dummy and bought a monitor with only HDMI 2.1 and no DisplayPort, so I needed to upgrade my GPU or I would have no G-Sync. Otherwise, I probably would have waited at least 2 more generations to upgrade.

[–] Goret@lemmy.sdf.org 14 points 1 year ago (6 children)

Well, like some, I am still on the 10xx series (1060 3GB 🤣🥲) and starting to look at a future full system upgrade to an RX 79xx or 78xx when they're out. Targeting the Black Friday sales.

I would be curious to know if many others are on a refresh cycle of up to 4-5 years.

(Need to check how to create a poll in Lemmy)

[–] bionicjoey@lemmy.ca 8 points 1 year ago (2 children)

I'm on a 970 and it still plays all the games I want it to play.

[–] alessandro@lemmy.ca 3 points 1 year ago (1 children)

970 + PatientGamers = name a more iconic duo

(or current AAA in lowspecgamer mode XD )

[–] bionicjoey@lemmy.ca 1 points 1 year ago

Lol. My flair on pcmr used to be "GTX 970: definitely 4gb of VRAM". Which is itself a pretty outdated joke nowadays

[–] Goret@lemmy.sdf.org 2 points 1 year ago

That’s dedication (and to be fair, you're probably pulling more fps than I do 😂)

[–] Willifire@lemmy.world 4 points 1 year ago

I have recently ordered a 7900xtx to replace my 1080ti. It was a good companion but just doesn't cut it anymore. Originally wanted to upgrade with the last gen but scalpers made that impossible. And the used market is still fucked in my region.

[–] scutiger@lemmy.world 2 points 1 year ago

I upgraded last summer to a 6700 XT from a 1070ti. I didn't need the upgrade, since the 1070ti is still a solid performer even now. There's not much it wouldn't still run at reasonable settings. I really only upgraded because there was a decent sale, and I had some money burning a hole in my pocket. I could have easily waited another year and gone with a 6800 XT or better for a similar price.

[–] theblandone@kbin.social 2 points 1 year ago

I guess I'm on the 7-10 year cycle. I just upgraded from a GTX 1050 (non-Ti) and an i5-4570. It played almost everything I wanted to play just fine at 1080p, and some at 1440p. I tend to be a patient gamer and play mostly indies, so it was great.

This article feels like it was written in a language I don't understand. I get that other people are more into the hobby than I am (which is fine, no judgement, good for them), but it's just so far outside what I would consider normal for me that it caught me off guard.

[–] dan1101@lemmy.world 1 points 1 year ago

Exactly. I'm on a 1050Ti and I'm not sure Starfield will be happy with that. Cyberpunk wasn't too happy. And of course if I get a new card I will need new MB/CPU/RAM/etc.

[–] RedStrive@lemmynsfw.com 1 points 1 year ago

1070ti here. I think the fact that the needle hasn't moved significantly forward, as the article puts it, has decreased prices to the point where an upgrade to a more updated setup makes sense now for me personally.

I agree with the article if we're talking about an upgrade from a 30 series GPU, but things seem great for all other cases.

[–] testman@lemmy.ml 13 points 1 year ago (3 children)

> repost

you know that on Lemmy you can just edit the post, right? Title, url and text, all can be changed.

[–] jeeva@lemmy.world 8 points 1 year ago

I (am not op, but) didn't know that! Thanks!

[–] alessandro@lemmy.ca 3 points 1 year ago

Didn't know that. I was used to ol' reddit. Thanks for the info tho, it'll come in handy later!

[–] ArcticFox@lemmy.ca 2 points 1 year ago

The technology existed all this time. We just never knew.

[–] HidingCat@kbin.social 7 points 1 year ago* (last edited 1 year ago) (1 children)

The starting premise of the article is based on upgrading from the previous generation. What sane person does that? Aside from the one time I got a freebie, all my upgrades were at least two generations apart.

Edit: Also, with current prices on the RX 6000 series, as long as you're coming from three generations behind you'll get a good upgrade. I went from a GTX 1070 to an RX 6700 XT and felt a big improvement.

[–] scutiger@lemmy.world 5 points 1 year ago

They do make the point in the article that even upgrading from two generations back is a waste, as you're basically getting no real benefit from having waited two generations instead of one. You may as well upgrade to last generation instead of this one and save yourself some money.

If you're three generations behind, no matter what your upgrade path is, you're getting a significant upgrade, but it's still not worth upgrading to the current gen when last gen is much better value for a marginal performance difference.

The exception to all this is buying the absolute top-of-the-line, which is never good value, but is again significantly inflated in price from the previous gen.

[–] antony@lemmy.ca 2 points 1 year ago (1 children)

I'm halfway through a 10-year cycle, with a 1060 3GB on a 7th-gen i5. It's mostly Civ 6, Stellaris, and Rocksmith 2014 @ 1080p, so it's fine. The main problem is Windows 10 reaching end-of-life without an upgrade path on the current hardware, and Rocksmith doesn't work well on Linux. I'll probably keep it as-is and start from scratch... when I see a title that I want to play enough to drop big cash on hardware.

[–] Darkrai@kbin.social 3 points 1 year ago* (last edited 1 year ago) (1 children)

Clone Hero works on Linux. Not sure if that's the same type of game, since I'm just guessing Rocksmith is like Rock Band. https://clonehero.net/

[–] Smatt@lemmy.ca 5 points 1 year ago (1 children)

Rocksmith uses a real guitar and purports to teach you how to play... Not really like Rock band.
