Optimizing Gaming

Gespenster Jager
Member


14 years ago
Posts: 36

So I got a new computer six months ago. I like gaming a lot and am trying to spend a lot more time gaming than in the past. Here is my problem: I know very little about computers. I was hoping to get a better processor, but I am unsure if that alone will make my computer run better.
Here are some specs:
Processor (current): Intel Core i7 @ 1.60 GHz
RAM: 6 GB
Hard drive: 600 GB (can't figure out the RPM)
Graphics card: don't know, but pretty good
Does anyone know if just getting a new processor is good enough?
Thanks


lagomorphilia!
Member


14 years ago
Posts: 2506

...Is this a laptop?

Also, find out your graphics card model. Go to Start, click Run, type dxdiag, press Enter, click the Display tab, and the card should be listed under "Device".
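If you'd rather not click through dxdiag, the same information can be pulled from the command line. A minimal sketch, assuming a Windows machine with Python installed (the built-in wmic tool does the actual work here):

[code]
# Query the installed graphics adapter name(s) via the built-in wmic tool.
import subprocess

result = subprocess.run(
    ["wmic", "path", "win32_VideoController", "get", "name"],
    capture_output=True, text=True,
)
print(result.stdout.strip())  # a "Name" header followed by the card model
[/code]

Running "wmic path win32_VideoController get name" directly in a Command Prompt works just as well.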


________________

This signature was recovered from Hades to serve in my rotting armies.

Gespenster Jager
Member


14 years ago
Posts: 36

Yes, sorry for not mentioning it. It's a Dell.


Timeless
Member


14 years ago
Posts: 527

In that setup, I doubt that your i7 is the bottleneck, but "i7" covers a wide range of chips, so could you please list the exact model? Which games do you find laggy or processor-heavy?

Ah, a laptop. Ignore this post.


Post #455594 - Reply To (#455591) by tearyouapart
lagomorphilia!
Member


14 years ago
Posts: 2506

Quote from tearyouapart

Yes, sorry for not mentioning it. It's a Dell.

Laptops aren't known for their ability to be upgraded, and aren't recommended as gaming machines.

If you'd like to swap out parts ("upgrade"), then you're probably out of luck since you're using a laptop. Sorry.


________________

This signature was recovered from Hades to serve in my rotting armies.

Post #455595 - Reply To (#455594) by x0mbiec0rp
Gespenster Jager
Member


14 years ago
Posts: 36

Quote from x0mbiec0rp

Quote from tearyouapart

Yes, sorry for not mentioning it. It's a Dell.

Laptops aren't known for their ability to be upgraded, and aren't recommended as gaming machines.

If you'd like to swap out parts ("upgrade"), then you're probably out of luck since you're using a laptop. Sorry.

Well, that's good to know. I do have a home computer, but right now I'm overseas for a few months.
Thank you for the help.


Member


14 years ago
Posts: 9

You could upgrade your laptop CPU to something like an i7-820QM, but it would cost you at least $300 for the processor alone while giving you negligible performance gains in games. A better GPU would give you more of a performance gain (again, it can be very costly); however, AFAIK, the Dell XPS (which is what I believe you have) doesn't have the MXM slot needed to do so. The GPU is soldered directly onto the motherboard, so like x0mbiec0rp said, you're out of luck in this case, sadly.

May I ask, have you tried playing games on it yet? If not, and you bought it within the last 6 months, I would assume you have something like the GT425M, GT435M, or GT445M (you didn't mention this, so I'm going off memory). The GT425M isn't a bad card; it's capable of playing a lot of games on at least high settings @ 1360x768, and more taxing modern games like Battlefield: Bad Company 2 on medium.


Case of Fumblitis
Member


14 years ago
Posts: 108

What you have will work, but don't expect to play at max settings without noticeable frames-per-second (FPS) issues. The goal is to maintain 30+ FPS, as anything more is virtually undetectable to the human eye. People who brag about 100+ FPS are doing just that: bragging.
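To put those thresholds in perspective, it helps to think in frame times, i.e. the millisecond budget each frame gets. A quick sketch (plain Python, just arithmetic):

[code]
# Frame-time budget: milliseconds available to render each frame at a given FPS.
for fps in (30, 50, 60, 100):
    print(f"{fps:>3} FPS -> {1000 / fps:.1f} ms per frame")

#  30 FPS -> 33.3 ms per frame
#  50 FPS -> 20.0 ms per frame
#  60 FPS -> 16.7 ms per frame
# 100 FPS -> 10.0 ms per frame
[/code]

Going from 30 to 60 FPS halves the frame time, while going from 60 to 100 only shaves off another ~7 ms, which is why the gains feel smaller and smaller at the top end.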


________________

There is a simple solution to every problem; finding the simple solution is the difficult problem.

[img]http://valid.canardpc.com/cache/banner/2519041.png[/img]

Post #463135 - Reply To (#461576) by JakeOrion
Member


14 years ago
Posts: 974

Quote from JakeOrion

What you have will work, but don't expect to play at max settings without noticeable frames-per-second (FPS) issues. The goal is to maintain 30+ FPS, as anything more is virtually undetectable to the human eye. People who brag about 100+ FPS are doing just that: bragging.

30 FPS isn't enough for decent gameplay; at the very least it should be in the 50s.

Killzone on PS3 runs at 30 FPS, and it's horrible 🤢


________________

"we are people because of other people"
"I am who I am because of who we all are"

Post #463173 - Reply To (#463135) by Domonkazu
Case of Fumblitis
Member


14 years ago
Posts: 108

Quote from Domonkazu

30 FPS isn't enough for decent gameplay; at the very least it should be in the 50s.

Killzone on PS3 runs at 30 FPS, and it's horrible 🤢

You cannot compare a game console to a computer. Different architecture, different build, etc. Also:

http://en.wikipedia.org/wiki/Frame_rate#Visible_frame_rate

The human visual system does not see in terms of frames; it works with a continuous flow of light information. A related question is, "how many frames per second are needed for an observer to not see artifacts?" However, this question also does not have a single straightforward answer. If the image switches between black and white each frame, the image appears to flicker at frame rates slower than 30 FPS (interlaced). In other words, the flicker-fusion point, where the eyes see gray instead of flickering, tends to be around 60 FPS. However, fast-moving objects may require higher frame rates to avoid judder (non-smooth motion) artifacts, and the retinal fusion point can vary between people, as it does under different lighting conditions. The flicker-fusion point only applies to digital images of absolute values, such as black and white, whereas a more analog representation can run at lower frame rates and still be perceived by a viewer. For example, motion blurring in digital games allows the frame rate to be lowered while the human perception of motion remains unaffected. This would be the equivalent of introducing shades of gray into the black-and-white flicker.

So in essence, while it is possible to "see" beyond 30 FPS, there is almost no justification for chasing much higher numbers. Again, bragging rights.
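As a toy illustration of that gray analogy: a crude motion blur can be modeled as blending successive frames, and a hard black/white flicker blends to steady gray. A minimal sketch with hypothetical per-frame brightness values (plain Python):

[code]
# Temporal blending: averaging adjacent frames approximates motion blur.
# An alternating black/white flicker (0 and 255) blends to mid-gray,
# which is why blurred motion can still look smooth at lower frame rates.
frames = [0, 255, 0, 255, 0]  # hypothetical per-frame brightness values

blended = [(a + b) / 2 for a, b in zip(frames, frames[1:])]
print(blended)  # [127.5, 127.5, 127.5, 127.5] -> a steady gray
[/code]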


________________

There is a simple solution to every problem; finding the simple solution is the difficult problem.

[img]http://valid.canardpc.com/cache/banner/2519041.png[/img]

Post #463176 - Reply To (#463173) by JakeOrion
lagomorphilia!
Member


14 years ago
Posts: 2506

Quote from JakeOrion

Quote from Domonkazu

30 FPS isn't enough for decent gameplay; at the very least it should be in the 50s.

Killzone on PS3 runs at 30 FPS, and it's horrible 🤢

You cannot compare a game console to a computer. Different architecture, different build, etc. Also:

http://en.wikipedia.org/wiki/Frame_rate#Visible_frame_rate

The human visual system does not see in terms of frames; it works with a continuous flow of light information. A related question is, "how many frames per second are needed for an observer to not see artifacts?" However, this question also does not have a single straightforward answer. If the image switches between black and white each frame, the image appears to flicker at frame rates slower than 30 FPS (interlaced). In other words, the flicker-fusion point, where the eyes see gray instead of flickering, tends to be around 60 FPS. However, fast-moving objects may require higher frame rates to avoid judder (non-smooth motion) artifacts, and the retinal fusion point can vary between people, as it does under different lighting conditions. The flicker-fusion point only applies to digital images of absolute values, such as black and white, whereas a more analog representation can run at lower frame rates and still be perceived by a viewer. For example, motion blurring in digital games allows the frame rate to be lowered while the human perception of motion remains unaffected. This would be the equivalent of introducing shades of gray into the black-and-white flicker.

So in essence, while it is possible to "see" beyond 30 FPS, there is almost no justification for chasing much higher numbers. Again, bragging rights.

The article you posted actually says we'd notice differences up to around 60 FPS.

Also, some people find motion blurring annoying. This happens because the player can move the mouse fast enough that the motion blur hinders their vision. That's mostly the result of bad implementation, yes, but I've yet to see an implementation I'd call good.


________________

This signature was recovered from Hades to serve in my rotting armies.

Member


14 years ago
Posts: 974

I can notice when an FPS game is running below 50; it feels like my aiming skills drop a bit because of the slow frame rate.

No 30 FPS game is worthy of a real tournament; those games are for casual play.

Yep, motion blurring is used very heavily in K3; I can't stand that game 🤢, other than to massacre console FPS gamers for easy wins.

I play with mouse + keyboard thanks to Eagle Eye 🤣


________________

"we are people because of other people"
"I am who I am because of who we all are"
