
Thread: What Video Card Should I Buy?

  1. #4141

    Join Date
    April 18, 2011
    Posts
    3,532
    Depends on your reference point. If you're only going back one generation, the advance is, I hesitate to say fantastic, but at least very good. If you go back two generations or more it's good, maybe even better than average.


    Turing being such a shitty generation in everything from price hikes to puny performance increments really muddies the waters IMO.

  2. #4142
    Malcanis's Avatar
    Join Date
    April 12, 2011
    Posts
    17,916
    Jim makes a pretty good case that it's a mediocre advance on Turing at best, historically speaking. It's edifying to see the numbers broken down.
    Quote Originally Posted by Isyel View Post
    And btw, you're such a fucking asshole it genuinely amazes me on a regular basis how you manage to function.

  3. #4143
    Specially Pegged Donor Overspark's Avatar
    Join Date
    April 10, 2011
    Location
    NL fuck yeah
    Posts
    3,846
    Man I just can't watch his videos any more, they're so terrible. I don't think the 3000 series is such a great performance increase, because it's also a massive power usage increase. The performance per watt didn't go up much AFAIK. If it had, it would have been a different story IMHO.

  4. #4144

    Join Date
    December 15, 2011
    Location
    The Establishment
    Posts
    1,751
    It's an interesting way of looking at it. I didn't watch the entire thing, so I don't know if he touched upon it, but I think it'd be interesting to know why they can boost the number of cores in a 3080 by almost 200% compared to a 2080 and yet not get more than about 50% more performance.
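    A quick back-of-the-envelope sketch of that gap in Python. The CUDA core counts are Nvidia's published specs; the ~50% uplift is the rough figure from the post above, not a benchmark result:

        # Back-of-the-envelope: core count vs. performance, 2080 -> 3080.
        # Core counts are published specs; the ~50% uplift is assumed per the post.
        cores_2080 = 2944
        cores_3080 = 8704      # Ampere double-counts its FP32 units
        perf_uplift = 0.50     # assumed, per the post

        core_uplift = cores_3080 / cores_2080 - 1
        print(f"core increase: {core_uplift:.0%}")                            # ~196%
        print(f"perf per core: {(1 + perf_uplift) / (1 + core_uplift):.2f}")  # ~0.51
        # i.e. at these numbers, each Ampere "core" delivers roughly half the
        # per-core gaming performance of a Turing core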

  5. #4145
    Malcanis's Avatar
    Join Date
    April 12, 2011
    Posts
    17,916
    Quote Originally Posted by morpheps View Post
    It's an interesting way of looking at it. I didn't watch the entire thing, so I don't know if he touched upon it, but I think it'd be interesting to know why they can boost the number of cores in a 3080 by almost 200% compared to a 2080 and yet not get more than about 50% more performance.
    He actually covered that in his previous video IIRC. The ts;dw (too scottish; didn't watch) is that Ampere is optimised for compute rather than gaming, because Nvidia makes way more money from that side of its operations. It's only "good" for gaming because it has a jillion cores - which is why the die is about the size of a credit card.

    Quote Originally Posted by Overspark View Post
    Man I just can't watch his videos any more, they're so terrible. I don't think the 3000 series is such a great performance increase, because it's also a massive power usage increase. The performance per watt didn't go up much AFAIK. If it had, it would have been a different story IMHO.
    That's essentially his argument. IDK if that makes your posts terrible or his video better.

  6. #4146

    Join Date
    April 18, 2011
    Posts
    3,532
    Quote Originally Posted by Malcanis View Post
    Quote Originally Posted by morpheps View Post
    It's an interesting way of looking at it. I didn't watch the entire thing, so I don't know if he touched upon it, but I think it'd be interesting to know why they can boost the number of cores in a 3080 by almost 200% compared to a 2080 and yet not get more than about 50% more performance.
    He actually covered that in his previous video IIRC. The ts;dw (too scottish; didn't watch) is that Ampere is optimised for compute rather than gaming, because Nvidia makes way more money from that side of its operations. It's only "good" for gaming because it has a jillion cores - which is why the die is about the size of a credit card.
    I find it interesting that Nvidia is going that route with a unified architecture for gaming and compute at the same time as AMD is abandoning theirs, with Vega as the last GCN chip. AMD is now the company with separate compute and gaming architectures.

    Nvidia is in a wholly different position than AMD though when it comes to market penetration and software support, so I expect they'll do just fine.

  7. #4147
    Specially Pegged Donor Overspark's Avatar
    Join Date
    April 10, 2011
    Location
    NL fuck yeah
    Posts
    3,846
    Quote Originally Posted by Malcanis View Post
    IDK if that makes your posts terrible or his video better.
    Oh it's not what he says I have issues with, it's the way he says it. Half of every video is him defending himself against nebulous naysayers and "proving" how right he was all along. That 22 minute video could probably have been wrapped up in 5 minutes without all the "you're not attacking me, you're attacking my sources! look here!!! and here!!!11 and HERE!!!!11111111".

  8. #4148
    VARRAKK's Avatar
    Join Date
    September 27, 2011
    Location
    Sweden
    Posts
    7,030
    Radeon 6900XT specs leak. Competes with RTX 3090



    https://coreteks.tech/articles/index...with-rtx-3090/
    Why is it called earth, when it is mostly water???

  9. #4149
    walrus's Avatar
    Join Date
    April 9, 2011
    Location
    Fancomicidolkostümierungsspielgruppenzusammenkunft
    Posts
    6,742
      Spoiler:
    Quote Originally Posted by RazoR View Post
    But islamism IS a product of class warfare. Rich white countries come into developing brown dictatorships, wreck the leadership, infrastructure and economy and then act all surprised that religious fanaticism is on the rise.
    Also:
    Quote Originally Posted by Tellenta View Post
    walrus isnt a bad poster.
    Quote Originally Posted by cullnean View Post
    also i like walrus.
    Quote Originally Posted by AmaNutin View Post
    Yer a hoot

  10. #4150
    Malcanis's Avatar
    Join Date
    April 12, 2011
    Posts
    17,916
    It waits for the benchmarks or it gets hosed again

    Idc what AMD launch to compete with the 3090 because only an idiot with too much money would buy one for home use; I want to see what they launch to compete with the 3080 and 3070.

  11. #4151

    Join Date
    April 18, 2011
    Posts
    3,532
    It's not like the 3090 offers much more for gaming over a 3080. When are you ever going to notice 10% more fps?

  12. #4152
    Super Moderator Donor Global Moderator whispous's Avatar
    Join Date
    April 9, 2011
    Location
    Mails Tegg > go fuck yourself
    Posts
    4,878

  13. #4153
    VARRAKK's Avatar
    Join Date
    September 27, 2011
    Location
    Sweden
    Posts
    7,030
    Doesn't matter if the 6900XT is the fastest card or not.
    AMD being able to compete with the 3080/3090 so quickly is astonishing in itself!

    2020 was the year AMD sailed away from Intel. If they could do the same to Nvidia, it would be a miracle.
    Either way, Nvidia will have to up their pace if they want to remain the faster GPU manufacturer.

    For us consumers, this is the best possible news.

  14. #4154
    Malcanis's Avatar
    Join Date
    April 12, 2011
    Posts
    17,916
    Nvidia could almost certainly have buried AMD if they'd wanted to try, but they'd rather go for the sweet, sweet compute dollar (and also not give any dollars to TSMC).

    And TBH, much like Intel in the early 2010s, they may have calculated that if they really did bury AMD and AMD basically gave up, then they'd potentially be facing monopoly action. Better to just cash in and milk those fat margins for a couple of generations and let AMD nearly catch up so they can say "See! There is too competition!".

  15. #4155

    Join Date
    December 15, 2011
    Location
    The Establishment
    Posts
    1,751
    Quote Originally Posted by Spartan Dax View Post
    It's not like the 3090 offers much more for gaming over a 3080. When are you ever going to notice 10% more fps?
    For most people, never. Then there are those who either have so much disposable money that it doesn't matter, or who simply must have the best. Economically speaking, 10% extra performance for 100% extra money is stupid.

  16. #4156
    Smegs's Avatar
    Join Date
    April 2, 2012
    Posts
    1,111
    Quote Originally Posted by VARRAKK View Post
    Doesn't matter if the 6900XT is the fastest card or not.
    AMD being able to compete with the 3080/3090 so quickly is astonishing in itself!

    2020 was the year AMD sailed away from Intel. If they could do the same to Nvidia, it would be a miracle.
    Either way, Nvidia will have to up their pace if they want to remain the faster GPU manufacturer.

    For us consumers, this is the best possible news.
    I'm not sure it's AMD who did the rush job. Nvidia's R&D has been sat on its thumbs for approx 4 years, as they had such an architecture lead on AMD/Radeon that they were almost superfluous. But with AMD working through their architecture and refining it (atm it's better than Nvidia's, due to scalability in speed, PPW, die size and clusters), it had Nvidia shitting themselves as they suddenly realised their architecture was capping out/wrong just as AMD's was sorted out/ramping up.

    The Ampere release seems rushed, half-baked and desperate. Understanding that their architecture and process, as is, are not as competitive must have come as a shock to Nvidia, and they needed to maximise sales prior to very stiff competition coming from AMD. So I think it's Nvidia who deserve the plaudits for pulling out all the stops to get their cow to market first. AMD seem to have been just chugging along doing their thing, and it seems they were pleasantly surprised at how little Ampere had to offer...
    Shitting up eve for .... well, longer than most of you scumbags.

  17. #4157
    Malcanis's Avatar
    Join Date
    April 12, 2011
    Posts
    17,916
    P sure that Nvidia will offer the double-memory Super cards before Christmas. Maybe even before the end of the month.

  18. #4158

    Join Date
    April 9, 2011
    Location
    Chichester, UK
    Posts
    1,811
    Quote Originally Posted by Smegs View Post
    Quote Originally Posted by VARRAKK View Post
    Doesn't matter if the 6900XT is the fastest card or not.
    AMD being able to compete with the 3080/3090 so quickly is astonishing in itself!

    2020 was the year AMD sailed away from Intel. If they could do the same to Nvidia, it would be a miracle.
    Either way, Nvidia will have to up their pace if they want to remain the faster GPU manufacturer.

    For us consumers, this is the best possible news.
    I'm not sure it's AMD who did the rush job. Nvidia's R&D has been sat on its thumbs for approx 4 years, as they had such an architecture lead on AMD/Radeon that they were almost superfluous. But with AMD working through their architecture and refining it (atm it's better than Nvidia's, due to scalability in speed, PPW, die size and clusters), it had Nvidia shitting themselves as they suddenly realised their architecture was capping out/wrong just as AMD's was sorted out/ramping up.

    The Ampere release seems rushed, half-baked and desperate. Understanding that their architecture and process, as is, are not as competitive must have come as a shock to Nvidia, and they needed to maximise sales prior to very stiff competition coming from AMD. So I think it's Nvidia who deserve the plaudits for pulling out all the stops to get their cow to market first. AMD seem to have been just chugging along doing their thing, and it seems they were pleasantly surprised at how little Ampere had to offer...
    Forgive my ignorance here (and absolutely zero sarcasm implied), but is this referring to the massive uptick in CUDA cores but no similar uptick in relative performance?

    Using this source:


    The CUDA cores per FPS are as follows:

    3090 FE - 70.6
    3080 FE - 60.2
    2080 Ti FE - 30.3
    1080 Ti FE - 29.1

    To me this suggests that 30-series CUDA cores are dogshit lol. But I am an ignorant fool and probably comparing apples to keyboards here lol
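    A minimal Python sketch reproducing that arithmetic. The core counts are the published specs; the average-FPS figures are back-solved from the ratios above, since the source table isn't shown here, so treat them as placeholders rather than real benchmark numbers:

        # cores-per-FPS = CUDA cores / average FPS.
        # FPS values below are back-solved from the posted ratios (placeholders).
        cards = {
            "3090 FE":    (10496, 148.7),
            "3080 FE":    (8704, 144.6),
            "2080 Ti FE": (4352, 143.6),
            "1080 Ti FE": (3584, 123.2),
        }
        for name, (cuda_cores, avg_fps) in cards.items():
            print(f"{name}: {cuda_cores / avg_fps:.1f} CUDA cores per FPS")
        # 3090 FE: 70.6, 3080 FE: 60.2, 2080 Ti FE: 30.3, 1080 Ti FE: 29.1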

  19. #4159
    Malcanis's Avatar
    Join Date
    April 12, 2011
    Posts
    17,916
    My understanding after watching several youtube videos, thus making me a highly qualified IC engineer: It's not so much that they're "dogshit", it's that the core structure is strongly optimised for compute tasks, which love to nom on FP32 calculations, and weakly optimised for gaming tasks, which are heavily biased (70-80%) towards integer calculations. They're counting the pairs of FP32 units as "two" cores, but one of them is a dual FP32/INT unit, and it seems that if there is an INT calc to be done, the other core in the pair is effectively out of action until it has completed.

    So for 70-80% of gaming workloads, they effectively have half as many cores as the raw numbers Nvidia is shouting about. The 3090's "10,000 shaders" become something like 6,000-6,500. Likewise for the 3080. However, all those transistors are still there, gobbling die area, increasing cost, reducing yields.
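    A toy model of that halving argument in Python. The pairing behaviour and the 70-80% integer share are the post's claims, not measured figures; the point is just that they reproduce the 6,000-6,500 range quoted above:

        # Toy model: each counted pair of Ampere "cores" is one pure-FP32 unit
        # plus one dual FP32/INT unit; while the dual unit runs integer work,
        # its partner is treated as idle (the post's assumption).
        def effective_shaders(total_shaders: int, int_share: float) -> float:
            pairs = total_shaders / 2
            # a pair delivers 2 units of FP32 throughput on pure-FP work,
            # but only 1 unit while the dual unit is busy with integer work
            return pairs * (2 * (1 - int_share) + 1 * int_share)

        for int_share in (0.7, 0.8):
            print(f"3090 at {int_share:.0%} INT: {effective_shaders(10496, int_share):,.0f}")
        # 3090 at 70% INT: 6,822
        # 3090 at 80% INT: 6,298  -> roughly the 6,000-6,500 quoted above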

    This is a far more attainable bar for AMD to clear than it would have been had Ampere been a gaming-focused arch. Nvidia have (correctly) made the calculation that they'll make more money selling these dies to data centres (where they will be extremely powerful, btw) and that the slack-jawed gamerbois will still race to lap up their "8900 shaders!!!11" 350-watt 3080s.

  20. #4160

    Join Date
    April 9, 2011
    Location
    Chichester, UK
    Posts
    1,811
    Quote Originally Posted by Malcanis View Post
    My understanding after watching several youtube videos, thus making me a highly qualified IC engineer:
    I'm sold.

    Let's assume 2 is actually 1 in your analogy; that does somewhat help and keeps it in line with the 20 series.

    3090 FE gets 35.3
    3080 FE gets 30.1

    So I'll take it. Thanks!
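    The same arithmetic as a short sketch, under the "2 is actually 1" assumption from the post:

        # Halving Ampere's double-counted shaders halves the cores-per-FPS
        # ratios from earlier in the thread, putting them next to Turing's ~30:
        for name, ratio in (("3090 FE", 70.6), ("3080 FE", 60.2)):
            print(f"{name}: {ratio / 2:.1f} effective CUDA cores per FPS")
        # 3090 FE: 35.3, 3080 FE: 30.1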
