frame rate drops... anyone got some input

Posted by amd-rules


I get frame rate drops ONLY in Call of Duty games like World at War and MW2, and I have yet to figure out why. I have an XFX 4890 and run my games at 60 FPS at 1920x1080; everything else plays fine without hiccups, but Call of Duty does not. My BIOS is up to date and my card has the latest driver. Any ideas why it is happening strictly in Call of Duty? Thanks.


I haven't played COD in quite a while, but IIRC one of the biggest frame rate cutters was the sound. Do not use EAX/EAX2, even if your card or motherboard is supposed to support it.

If I get a chance tomorrow, I'll ask some of the players at the V; they played some earlier tonight.


I would guess there are just a ton of explosions and effects going on that require a lot of processing power.

Also, the engine they used allows for bigger levels, so maybe the card is having to use its memory a lot more than most games require?


I don't know; I have the Sapphire version of the 4890 and no hiccups whatsoever. Even MW2 at the same res runs flawlessly.

What are your card's specs?



I thought you were running X-fire.



No, I can't (had to change mobos for now). I just have the one 4890, although it's the 2 GB version, running at 935 and 1135 MHz.



DADGUM! :eek:


Do you still have vertical sync enabled?

Read Stormy's link again and disable it. ;)

http://www.hardforum.com/showthread.php?t=928593
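If it helps, here is a minimal Python sketch of why v-sync on a 60 Hz monitor can turn a small slowdown into a hard drop to 30 or 20 FPS. It assumes plain double buffering with no triple buffering, and the render times are made up for illustration, not measured on a 4890:

```python
import math

# With double-buffered v-sync, a finished frame still has to wait for the
# next vertical refresh, so the displayed rate snaps to 60, 30, 20, 15...
REFRESH_HZ = 60
REFRESH_INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per refresh

def vsynced_fps(render_time_ms):
    """Displayed FPS when every frame waits for the next refresh."""
    intervals = max(1, math.ceil(render_time_ms / REFRESH_INTERVAL_MS))
    return REFRESH_HZ / intervals

for render_ms in (10.0, 16.0, 18.0, 25.0, 34.0):
    uncapped = 1000.0 / render_ms
    print(f"{render_ms:4.1f} ms/frame -> {uncapped:5.1f} FPS uncapped, "
          f"{vsynced_fps(render_ms):5.1f} FPS with v-sync")
```

A frame that takes 18 ms (about 55 FPS uncapped) gets held to 30 FPS with v-sync on, which is exactly the kind of sudden drop described in the first post; turning v-sync off (or using triple buffering) avoids that snap.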

 

Didn't know he had already asked... To be fair, they weren't all about frame rates in CoD: WaW; at least one was about frame rates in Crysis. :lol:

But amd-rules, there really isn't anything more anyone can tell you that hasn't already been said:

defrag the hard drive

keep the PC clean of viruses

if you're not happy with the frame rates, lower the resolution

lower other settings

etc., etc. It has all been said before.

That card is a great card and plays any game.

All you need to do is stop looking at the frame rate numbers in the corner (switch them off). ;)

:b33r:


Agree with Terry. I mean, the human eye cannot tell the difference above about 40 FPS (or is it 45?), so what do you care? Sure, your eyes can detect things up to 60 FPS, but it is really not noticeable. Just turn your head (as long as your monitor is at 60 Hz) and at 35 FPS you cannot see the flicker (use your peripheral vision to do this). My 4890 gets 60 easily in Crysis and pretty much all games, and even when it drops to 35 I can't tell the difference, and you shouldn't be able to either.

Edited by lugnut


"the human eye cannot tell the difference above about 40 FPS..."

So I thought too, but it's not true (it's over 200), and I won't go into that argument again here. :lol:

 

:b33r:

Edited by terry1966


I am going to have to correct you on that. The frame rate discussion is never-ending; the human eye supposedly cannot discern anything beyond 60 frames per second, but believe it or not, we can notice the difference between, say, 60 and 150 frames per second.


Oh, I agree now! I just didn't read up on it before I said it. Last I heard (a long time ago), we couldn't see beyond the 30 FPS mark, so that was my mistake.

Obviously we can see more than 30 FPS (I should have used common sense first), since the newer LED TVs are 240 Hz, so I should have taken it from there.

So no need for the correction; as I said above, I just read up on it again and the old myth is false. My mistake. :tup:


Lug, 30 FPS is usually the cutoff at which the human eye can tell smooth gameplay from choppy gameplay.

Between 30 and 60 our eyes can easily discern the difference between, say, 30 and 35, but at 60+ it looks ultra smooth. We can't really tell the difference between 60 and 65; the gap has to be bigger for us to detect, something like 60 to 120.
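One way to see why that is: what the eye actually responds to is frame time, and the milliseconds saved per frame shrink quickly as FPS climbs. A quick Python sketch of the arithmetic, using only the numbers from the posts above:

```python
# Frame time falls off non-linearly with FPS, which is why 30 -> 60
# is an obvious jump while 60 -> 65 is barely anything.

def frame_time_ms(fps):
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

for low, high in [(30, 35), (30, 60), (60, 65), (60, 120)]:
    saved = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low:3d} -> {high:3d} FPS: each frame arrives {saved:5.2f} ms sooner")
```

Going from 30 to 60 FPS shaves about 16.7 ms off every frame, while 60 to 65 saves barely 1.3 ms, which lines up with the point that the gap has to be something like 60 to 120 before it is easy to notice.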

