RTX 3080 benchmarks are out

Why would you want to get a secondhand 2080 Ti when you can get a new 3080 for a very similar price? Or a cheaper 3070, for that matter.

I can’t speak for others, but I have an Alienware and the 3080 won’t fit in my case. So it’s either wait for a version that fits and pay a premium, or get a new PC.

If the performance difference isn’t a huge leap and you can find a 2080 Ti at a good price, it makes sense to upgrade the card rather than spend thousands on a new PC.

Ah fair enough. Do you have the Aurora?

Yup, the R8.

I encourage you to look more closely at the power consumption numbers before you use them as a metric for a purchasing decision. In the GN video they measured power consumption at about 325 W for a stock FE card. Ignoring the rest of the computer, since we are only talking graphics card differences here: if you game 6 hours a day for all 365 days of the year, that is 325 W × 2,190 h = 711.75 kWh. Sounds like a scary number, right? Well, if the 135 W figure I saw for the 1060 6GB is accurate, the same 6 hours a day, 365 days a year comes to 295.65 kWh. That is significantly less than the 3080, but when you consider the difference in graphical power between the two cards, you have to ask yourself whether it is worth it.
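
To make that arithmetic explicit, here is a minimal sketch (the wattages are the figures quoted above, not my own measurements):

```python
# Annual energy use for a GPU at a steady power draw.
def annual_kwh(watts: float, hours_per_day: float = 6, days: int = 365) -> float:
    """Convert a constant draw in watts into kWh per year."""
    return watts * hours_per_day * days / 1000

print(annual_kwh(325))  # 3080 FE, GN's measured draw: 711.75 kWh
print(annual_kwh(135))  # 1060 6GB, figure quoted above: 295.65 kWh
```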

Now, I certainly don’t have the cheapest power around here, but my basic rate is this: the first 650 kWh per month is 9.6 cents per kWh. It goes up half a cent for the next 350 kWh if I go over that initial tier, which I don’t. At that rate this card would cost me all of 711.75 kWh × $0.096 ≈ $68.33 for the entire year if I were gaming 6 hours a day, every day. Obviously that is an extreme scenario; I bet you don’t have that much time to spend gaming every week, and I know I certainly don’t. So let’s contrast that with your current card: my yearly cost would be 295.65 kWh × $0.096 ≈ $28.38. Certainly cheaper to run.

Now let’s talk about my current card, a 1080 Ti, and your upgrade card, the 3070. The 3070 numbers here are based on Nvidia’s stated figure plus 5 W, since that is the gap between Nvidia’s claim and GN’s testing on the 3080.
My 1080 Ti uses about 283 W, which equates to roughly 620 kWh a year, or about $59.50 in yearly cost. Compared to the 3080 that isn’t much less, given we are talking about the high-end cards of their respective generations. The 3070 is expected to use 225 W, which equates to 492.75 kWh a year and would cost $47.30. That already puts you significantly higher than what you currently have. So my only point is: would another $21 a year really make that much of a difference to your wallet, on top of what you will already be paying, for what will amount to maybe a bit more performance than my current card?
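
If you want to check all four cards at once, here is a rough sketch of the whole comparison. It applies my first-tier rate flat, which works because even the 3080’s 711.75 kWh a year is only about 59 kWh a month, nowhere near the 650 kWh tier boundary:

```python
# Yearly running cost at my first-tier rate of $0.096/kWh.
RATE = 0.096  # $ per kWh, first 650 kWh each month

def annual_kwh(watts, hours_per_day=6, days=365):
    return watts * hours_per_day * days / 1000

cards = {
    "RTX 3080 (GN measured)":       325,
    "GTX 1060 6GB":                 135,
    "GTX 1080 Ti":                  283,
    "RTX 3070 (Nvidia spec + 5 W)": 225,
}

for name, watts in cards.items():
    kwh = annual_kwh(watts)
    print(f"{name}: {kwh:.2f} kWh/yr -> ${kwh * RATE:.2f}/yr")

# RTX 3080 (GN measured): 711.75 kWh/yr -> $68.33/yr
# GTX 1060 6GB: 295.65 kWh/yr -> $28.38/yr
# GTX 1080 Ti: 619.77 kWh/yr -> $59.50/yr
# RTX 3070 (Nvidia spec + 5 W): 492.75 kWh/yr -> $47.30/yr
```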

Of course you have other factors to consider as well. My only point here is that you might have gotten hung up on a factor that probably isn’t really as big of a concern as some people on the internet have made it out to be.

Render scaling 200 at 4K means 8K, so it’s a bit of a useless benchmark. There is no need to run at 200 scaling when you are already running 4K.

At the moment, from what I can see, what kills the performance is the JavaScript and the UI framework.

Any guesses on how two 2080 Tis in SLI would stack up against a 3080 in a 4K gaming scenario? This is assuming they would be used for games and applications that are SLI compatible.

Likely similar.

SLI scaling isn’t 2x the performance, so probably comparable, with wayyy more micro-stutter and worse application support.

More like, who doesn’t play 1080p? 4K is overkill on a monitor 30 inches or smaller. You won’t see anything on the screen lol!

9.6 cents per kWh?
What a dream.
Come to Germany. Here it’s 25 cents per kWh! :grin:

I believe that since the benchmark was about video card performance, they used 200% scaling to ensure the CPU had no major influence on the end results when comparing the FPS differences between the latest and previous cards (i.e. pure rendering power).
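
For the curious, the resolution math behind that point, as a minimal sketch (this assumes the scaling slider is a per-axis percentage, which matches the earlier claim that 200 at 4K means 8K):

```python
# Internal render resolution for a per-axis render-scaling percentage.
def internal_resolution(width: int, height: int, scale_percent: float):
    f = scale_percent / 100
    return int(width * f), int(height * f)

w, h = internal_resolution(3840, 2160, 200)
pixels = (w * h) / (3840 * 2160)
print(w, h, f"-> {pixels:.0f}x the pixels of native 4K")  # 7680 4320 -> 4x
```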

Me. I haven’t played in 1080p for four-ish years now. If you mean you won’t see anything on a small screen because of how small the UI elements and the like are, I can tell you that even on a 50" screen you aren’t going to see anything without scaling the UI. Then again, I don’t have the best vision, so maybe I just need things scaled no matter what. It did take a bit of getting used to when I first went from 1080p to 1440p, and again when I went from 1440p to 4K. You really have to fine-tune all your UI scaling and learn to get used to it.

Ouch. Yeah, that sucks. But like I said, the rate I pay is not even the lowest around where I live. I live in a rural area, so I am served by an electrical co-op with no power generation facilities of its own, which means the co-op has to buy power from someone else. I also currently pay the bill at another house about an hour away from where I live, and the cost there is only 5.7 cents per kWh for the first 650 kWh a month; I wish my house could get a rate that low. My understanding is that you have to buy a lot of your power from other countries due to switching to solar and wind, which can’t provide stable power at all times. Maybe that is incorrect, but if not, it could explain such a high price differential.

I don’t play in 1080p. 1440p is where it’s at, and 4K is just as beautiful, on a screen bigger than 27" of course.

Folks, don’t do this :grinning: I Paired an RTX 3080 With an AMD FX CPU... - YouTube

If you have a 1080 then you should buy a 30xx card. But if you are sitting on a 2080, or even a 2070, it’s not worth upgrading for a mere 5-10 FPS; at least I wouldn’t.

I have a 1660 card running at 2K, and it looks very good with a super stable 34-35 FPS on everything high.

The difference between Ultra and High is, for me personally, not worth it.

I want a stable FPS that does not fluctuate, and my current rig delivers that even over New York or London.

Yes, I agree, this sim has serious issues even with this next-gen card! I was hoping for more; it’s not much better than an RTX 2080 Ti, which is dismal. The RTX 3080 is around 20% quicker overall than the RTX 2080 Ti here, yet in most other games the RTX 3080 is around 50-60% quicker. This video points out the issues with the sim!

I agree. I’m in 4K; who the hell plays a beautiful sim like this in 1080p? Upgrade, ya cheap asses!

There’s no point in 2000-series users upgrading. If you want to upgrade something, upgrade your CPU. On a 1000 series or below, the upgrade makes much more sense.