This might have been addressed already, and if so I apologize and look forward to the thread link. However, everything I find online regarding *Fire Rate* seems inconsistent. I've seen two different ways of looking at it:

**Fire Rate**: *Number of seconds the gun can be fired continuously before reloading*.

Obviously this one is wrong, because if it were true we'd want a *low* fire rate rather than a high one (e.g., a gun with a Fire Rate of 2.5 would empty its whole clip in 2.5 sec, dealing its damage faster than one that takes longer).

**Fire Rate**: *Number of bullets fired per second*.

This makes more sense, but doesn't seem to play out that way. For example, a gun with a Fire Rate of 7.0 and a clip of 55 bullets should take 7.86 sec (55 / 7.0) to fire all 55 bullets before reloading. However, when I hold the trigger and count it out, it sometimes runs faster than the math suggests and sometimes slower, depending on the gun type; so the pure math can't account for everything. (Obviously Badass points and skill-tree perks factor in; pretend the 7.0 already includes all of that.)

I've also read that I should ignore the definition and simply multiply Damage by Fire Rate (e.g., **DMG** of 100 × **FR** of 7.0 = **DPS** of 700); but to know over how many seconds that applies, you still need to take clip size into account (700 DPS × 7.86 sec ≈ 5502 total damage per clip).

I guess I'm wondering whether fire rate really matters at all: going through DPS I get a total of 5502, but if I simply multiply Clip (55) × DMG (100) I get 5500. (The 2-point gap is just rounding from truncating 7.857… to 7.86; exactly, DPS × (Clip / FR) = DMG × FR × Clip / FR = DMG × Clip.) Whichever way you calculate it, you get the same damage per clip; really the question is how fast you want to deal that damage.
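To sanity-check the arithmetic, here's a quick sketch using the hypothetical stats from my example (DMG 100, FR 7.0, clip 55). It shows the two calculations agree once you skip the intermediate rounding:

```python
# Hypothetical gun stats from the example above (not real in-game values).
damage = 100      # damage per bullet
fire_rate = 7.0   # bullets per second (definition 2)
clip = 55         # bullets per clip

# Time to empty the clip under "bullets per second".
empty_time = clip / fire_rate        # ≈ 7.857 s (don't round this yet)

# Damage per second, then total damage per clip two ways.
dps = damage * fire_rate             # 700.0
total_via_dps = dps * empty_time     # ≈ 5500, up to floating point
total_direct = damage * clip         # 5500

print(empty_time, dps, total_via_dps, total_direct)
```

The FR cancels out of the per-clip total, so fire rate only changes how fast the damage is delivered, not how much there is per clip.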

Thanks,

Kong