PhysX Locked to CPU - fix to put it on the GPU.



  • Hiya guys.

    So my crappy computer (I have 2) with a decent GPU but a bit of a soft CPU has played Chiv like shit for quite some time. Even dropping all settings down to minimal, it still runs poorly. The best I could get was when I put ragdolls to 0.

    Anyway, I found out PhysX has been running on my CPU, even though I have a PhysX-enabled Nvidia card. I tried setting the Nvidia Control Panel to run PhysX on the GPU, but it wouldn’t stick for Chivalry.

    Then I found this gem:

    In baseengine.ini, found under the Engine/Config directory of Chivalry, there is a line:

    bDisablePhysXHardwareSupport=True

    This is bollocks if you have an Nvidia card. Set it to False, so it reads:

    bDisablePhysXHardwareSupport=False

    Why TB have it this way I don’t know; they probably have a reason. However, I haven’t had an issue with it, and Chiv now runs with PhysX on the GPU and seems to run much smoother. It could be all in my head, but I don’t think so.
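
    To put it all in one place (the path is as described above; nothing else in the file needs touching, and the exact neighbouring lines will differ between versions):

    ; <Chivalry install>\Engine\Config\baseengine.ini
    ; was:  bDisablePhysXHardwareSupport=True
    bDisablePhysXHardwareSupport=False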

    Enjoy.



  • O.o

    my hero



  • That’s a long ini file. Could you narrow it down, maybe with the category you found it under?

    EDIT: FOUND IT. It’s not under a category, but it’s roughly a quarter of the way down from the top of the file.



  • @quigleyer:

    That’s a long ini file. Could you narrow it down, maybe with the category you found it under?

    EDIT: FOUND IT. It’s not under a category, but it’s roughly a quarter of the way down from the top of the file.

    Ctrl+f?



  • @Slacka:

    @quigleyer:

    That’s a long ini file. Could you narrow it down, maybe with the category you found it under?

    EDIT: FOUND IT. It’s not under a category, but it’s roughly a quarter of the way down from the top of the file.

    Ctrl+f?

    Does that find things? Lol

    I was playing a TO match to try to find out whether it would run better with higher ragdolls (I have a GeForce GTX 660 Ti and pretty much couldn’t play with any ragdolls on).

    I didn’t get to the point where the computer slowed down, but this did happen, and shortly after it completely locked up on me: http://i247.photobucket.com/albums/gg123/EricQuigley/Banners_zps673c9931.jpg

    I’m about to try again, but lemme know if you guys see anything similar or have similar experiences.

    EDIT: Played a little longer. Honestly, I don’t see much of a difference. I’m almost certain this didn’t improve my performance at all, but thank you for finding it.



  • Open the console and type stat fps to find out whether it actually does improve your performance.
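
    Something like this, from memory (the console key is the usual UDK tilde/backtick, though it can vary with keyboard layout):

    ~            opens the console
    stat fps     toggles the frame rate / frame time readout
    stat fps     type it again to turn the readout off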



  • Dude this is fucking awesome. Great find.

    It did not give me a great increase in performance, but I can tell you it definitely frees up your CPU. I am now able to capture video without jerky in-game graphics. I recommend everyone with an NVIDIA card give it a try.



  • @quigleyer:

    I didn’t get to the point where the computer slowed down, but this did happen, and shortly after it completely locked up on me: http://i247.photobucket.com/albums/gg123/EricQuigley/Banners_zps673c9931.jpg

    The lines in your screenshot look like artifacts that usually show up when your graphics card fails for one reason or another. In your case, it probably overheated due to the immense load and insufficient cooling.



  • Guys, guys, GUYS!!!

    What about Radeon graphics cards? Mine is an HD 4850. Will it help too?



  • Wow, really great find; I didn’t even know Chivalry supported PhysX, lol. Going to test this and see if I get any gains on my 670 FTW. I never drop under 60 anyway, but perhaps my CPU temps will be lower now.



  • @funthomass:

    Guys, guys, GUYS!!!

    What about Radeon graphics cards? Mine is an HD 4850. Will it help too?

    No, PhysX is Nvidia-specific, so this won’t do anything for you. One reason I always go with NVIDIA.



  • Another tip: you can enable SLI with NVIDIA Inspector:

    1. Download NVIDIA Inspector.
    2. Click the settings button next to the driver version to open the Profile Inspector.
    3. Scroll down to UDK.exe’s profile.
    4. Change the SLI compatibility bits to 0x02406405 and apply the changes.

    Seems to be working well so far. Trying to test it on 64-player servers.



  • Interesting. I have a GTX 590, so I’ll give this a go. I usually have it on Auto-Select, so PhysX in the Nvidia panel usually defaults to the 2nd chip on that card. I have a beasty hex-core processor, but I’m guessing it will still be better having it run off the GPU?

    Also, isn’t it more appropriate to change this in UDKEngine.ini in MY DOCUMENTS > Chivalry > Config, rather than in the BaseEngine.ini file in the game directory? I thought the configs the game actually runs on are the ones in My Docs, which are just derived from the Base ones in the game directory.

    Switched on SLI too. Will report back when I have a blast tonight. :D

    It may be hard to tell a major difference though, as I only ever really have any slowdown when looking at a GIGANTIC pile of bodies (with ragdolls and blood on full) in the middle of Arena King of the Hill, heh.



  • @BobT36:

    Interesting. I have a GTX 590, so I’ll give this a go. I usually have it on Auto-Select, so PhysX in the Nvidia panel usually defaults to the 2nd chip on that card. I have a beasty hex-core processor, but I’m guessing it will still be better having it run off the GPU?

    Also, isn’t it more appropriate to change this in UDKEngine.ini in MY DOCUMENTS > Chivalry > Config, rather than in the BaseEngine.ini file in the game directory? I thought the configs the game actually runs on are the ones in My Docs, which are just derived from the Base ones in the game directory.

    Switched on SLI too. Will report back when I have a blast tonight. :D

    It may be hard to tell a major difference though, as I only ever really have any slowdown when looking at a GIGANTIC pile of bodies (with ragdolls and blood on full) in the middle of Arena King of the Hill, heh.

    Seems to run smoother with more players on the screen with SLI enabled, though occasionally the lighting on character models starts flickering; I may have to turn off some lighting setting or something.



  • It might be more appropriate; however, I just used the directory in Chiv, and the on-screen display saying PhysX -> GPU tells me it worked that way, so I’m happy with it. It will probably get reset with each patch, though?

    Keep in mind that offloading PhysX from the CPU to the GPU will only help those of us with weak CPUs that need every bit of headroom they can get. Heck, it could make people’s games worse if the GPU is bottlenecking their system.

    I would start chunking up pretty horribly at about 12+ players; with this enabled, it ran smoothly on a 16-player FFA server. It was the Throne Room map, which does usually run a bit better for me, though. The real test will be a full 16-player server on Arena; that map is almost unplayable for my little computer. 32-player TO servers are just not worth playing, and I haven’t tried one yet with this setting.

    If you use stat fps in the console you’ll get a frame rate and frame latency display in game. If you find you get no improvement in frames when you go from, say, medium settings to low (leave ragdolls the same), then you’re probably CPU bound.
    I’m using an AMD Athlon 64 X2 6000, which runs at about 3 GHz, with a 32-bit Windows 8 OS. It’s 32-bit because when I first installed Win 8, my GTX 650 wouldn’t work on 64-bit.

    I’m a bit disappointed that this system can’t run Chivalry on the lowest settings with anything more than 16 players, so any help I can find is good.

    I must say, though, after many years of building systems, going with AMD CPUs just isn’t worth the money saved, at least not for the CPUs of the last 4 years.



  • Bear in mind UDK*.ini files take precedence over Default*.ini files which take precedence over Base*.ini files. UDK ini files are what you should be changing.
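
    Roughly how that stacks up, as far as I understand it (locations as mentioned elsewhere in this thread, so treat them as approximate):

    UDKEngine.ini      My Documents > Chivalry > Config       takes precedence; this is the one to edit
    DefaultEngine.ini  shipped with the game                   the game’s defaults
    BaseEngine.ini     Chivalry install > Engine > Config      engine-level fallback

    Whichever file you end up editing, the line you want is still bDisablePhysXHardwareSupport=False.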



  • @Martin:

    Bear in mind UDK*.ini files take precedence over Default*.ini files which take precedence over Base*.ini files. UDK ini files are what you should be changing.

    What I experienced, after first changing BaseEngine.ini, playing the game, and then learning I should have been changing UDKEngine.ini, is that changing BaseEngine.ini automatically changed the UDK one to False. Just tested this with the Beta as well.



  • The SLI tweak still can’t help me in 64-player servers. I’m getting about 30-35 FPS when a lot of players are on the screen. It works just a tiny bit smoother in 32-player games.



  • Well, all I know is I have the on-screen display that says whether PhysX is using the CPU or GPU in big green letters at the top left of my screen. When I changed the baseengine.ini, the OSD said PhysX was using the GPU.

    I could try the UDK*.ini stuff, but why fix something that isn’t broken?

    Now, what I want to know is the devs’ reason for having the default set to CPU and not providing a UI option in the configuration to change it to GPU. In other words, what is their reason for disabling GPU support?



  • @[??:

    E1]SLI tweak still can’t help me in 64 player servers. Getting about 30-35 FPS when alot of players are on the screen. Works just a tiny bit smoother in 32 player games.

    Yeah, the game really wasn’t designed around 64 players, so it may not run too well for some.

    As to why they’ve set it that way, maybe it’s a default in the engine files? Dunno.

    Had to turn SLI off btw; it was making shadows and corpses flicker, which annoyed me, heh.

    Will have to play a bit more to see if it works better with the PhysX setting changed. Does switching it to False leave it up to the Nvidia Control Panel to decide? (I currently have it on Auto-Select.)

    I have an i7 3960X hex-core overclocked to 4.2 GHz and a GTX 590, so if it doesn’t use “Auto-Select”, I’m wondering whether, for me personally, it may be better to leave it on my CPU. If it takes advantage of my cores, that is.

