
t0night


First off, my specs:

 

6GB RAM

Intel i5 processor

AMD Radeon HD 7700M / Intel HD 4000 graphics

 

TF2 is obviously not a particularly demanding game graphically, but it runs at a stable 90 FPS, which I'm entirely happy with.

 

However, Fallout: New Vegas, while obviously a more demanding game, only manages a miserable 20 FPS or so on very low settings, below the game's own "recommended for your PC" settings. Looking at the options menu, the selected graphics adapter is the integrated Intel card, which would explain the poor performance. However, it's also the only one in the menu. I've tried disabling both adapters through Device Manager and re-enabling only the AMD adapter, but it's still not selectable in that menu.

 

The DirectX diagnostics (dxdiag) also only find the Intel card, with no mention of the AMD adapter at all. This is confusing, because I assume it must be using the AMD card for TF2 or I couldn't get 90 FPS on high settings; so how can I force it to recognise the far superior graphics card in both DirectX itself and F:NV?
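As a side note for anyone double-checking the same thing: dxdiag can dump its report to a text file with `dxdiag /t dxdiag.txt`, and each adapter appears there on a "Card name:" line. A minimal sketch for scanning such a dump (the sample text below is illustrative, not from my machine):

```python
# Parse a dxdiag text dump (created on Windows with: dxdiag /t dxdiag.txt)
# and list every display adapter it reports. If only the Intel card shows
# up here, DirectX itself isn't seeing the AMD GPU.

def list_adapters(dxdiag_text):
    """Return the value of every 'Card name:' line in a dxdiag dump."""
    adapters = []
    for line in dxdiag_text.splitlines():
        line = line.strip()
        if line.startswith("Card name:"):
            # Keep everything after the first colon, e.g. the adapter name.
            adapters.append(line.split(":", 1)[1].strip())
    return adapters

# Illustrative sample of the relevant lines from a dump:
sample = """\
Card name: Intel(R) HD Graphics 4000
Card name: AMD Radeon HD 7700M Series
"""
print(list_adapters(sample))
```

On an affected machine the AMD line would simply be missing from the real dump, which matches what the dxdiag GUI shows.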

 

Thanks for reading and thanks in advance for any help.

 

Here are some screenshots of the menus and information I'm referring to: http://imgur.com/Cm6fXwr,10XaHZN



A game like Team Fortress 2 isn't very demanding, and given your CPU, the Intel HD Graphics 4000 is more than enough to run TF2 at high settings.

 

As for Fallout: New Vegas, look into adding a per-application exception so the game runs on the dedicated GPU; AMD's switchable graphics settings in Catalyst Control Center should let you assign it.


Archived

This topic is now archived and is closed to further replies.
