If this is your first visit, welcome to TrainSim.Com! This web site is for you if you are interested in railway simulations. We offer a library of downloadable files, forums to exchange messages, news and more. We do not sell train simulator software.
MSTS Cannot Use More Than 2GB of Memory, Or Can It?
There's actually a command-line option for WinXP to make it work with LAA, but apparently it isn't recommended. No matter, really. After all, with XP, Vista, 7, and 8, we're three iterations past XP. Might as well be running Windows 95 if you're still running XP...
conrail1: I was not aware of a 64-bit train sim? Am I missing something?
There's actually a command-line option for WinXP to make it work with LAA, but apparently it isn't recommended.
I've seen that too. Apparently, it will allow virtual memory to swap into a larger address space, but in real RAM it tends to cause more paging-out to the swap file, which can slow things down. It will keep memory-hungry applications happy and reduce crashes in things like Autodesk, which will open huge files and operate on them until memory runs dry. But intensive swapping isn't necessarily a good thing...
MSTS-Roundhouse (Technical Difficulties, Please Stand By)
Everything I've seen documented on XP-64 indicates that its WOW64 mode for 32-bit applications is a little strange compared to later versions.
I used it primarily for FSX right after SP2 was released. At the time, Windows XP 64-bit with SP2 handled extensive amounts of FSX add-on content better than Vista 64, which I was also using for the majority of my sims/games.
There was a HAL issue with Windows XP 64-bit Edition prior to the release of install media with SP2 integrated, but I never experienced it (both of my copies of Windows XP 64-bit already have SP2 integrated).
Of course, today, with more and more of the sims/games I run using DirectX 11, Windows XP is useless.
MSTS isn't the most efficient thing on the block, not compared to later MS Flight Simulator products.
Neither are any of the MS FS products; that's pretty well known.
conrail1: I was not aware of a 64-bit train sim? Am I missing something?
Robert
A 64-bit train sim? The current train games we have now barely make it into the 21st century, let alone go 64-bit, lol.
Asus Maximus VI Extreme Intel Z87, Intel Core i7-4770K, Corsair Dominator Platinum 16GB PC3-19200 (2400MHz), EVGA GeForce GTX TITAN SuperClocked, Samsung 840 Pro Series 128GB SSD (Windows 8.1 Pro 64)/Samsung 840 Pro Series 256GB SSD (Programs), CaseLabs Merlin SM8 with Corsair AX1200 PSU
Well, you know, that's true. I've even heard that there's a train game out there that's so bad it runs on Steam!
Robert
I know of two train games that use Steam and leave a lot to be desired, but it's not because of Steam, that's for sure, lol.
> One other thing that I noticed in Task Manager was the starting memory usage (commit size) for -mem:1024, -mem:2047 and -mem:3072 was always the same, at 1108MB. (start only, when train.exe first appeared in Task Manager)
This would suggest that train.exe cannot start with more than 1024MB extra memory.
I get precisely the same results; it never starts with more than the 1024MB. I've been running some of my really processor-intensive routes, like SFS Hannover-Berlin 2.0, and I get the same FPS no matter what. My rig is now 5+ years old (32-bit Dell XPS 710), with a middling graphics processor (though an SSD), and while the LAA didn't hurt, it didn't really help either. I think, given MSTS' now ancient roots, that some upward mobility in processor speed will give me the boost I need.
Curiously enough, the one trick that *did* work for me, and it was Eric who mentioned it some time ago (thanks, Eric!) was to use Prio to make certain that only one of my 4 cores was handling MSTS. That one little freebie app did more for MSTS stability than anything else I've ever done.
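For the curious, the affinity setting that Prio (or Task Manager's "Set Affinity") ultimately hands to Windows via SetProcessAffinityMask is just a bitmask, one bit per logical core. A minimal sketch of how such a mask is built (the function name is my own, purely illustrative):

```python
def affinity_mask(cores):
    """Build a CPU affinity bitmask: bit n set means the process may
    run on logical core n. This is the shape of mask that Windows'
    SetProcessAffinityMask expects."""
    mask = 0
    for core in cores:
        mask |= 1 << core
    return mask

# Pinning MSTS to core 0 only, as the Prio trick does:
print(bin(affinity_mask([0])))     # 0b1
print(bin(affinity_mask([0, 2])))  # 0b101
```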
Did you restart your computer before you did these tests? A memory refresh is crucial for MSTS: the longer you keep your computer on, the more "limits" the software will put on your FPS. I've seen limits of 43, 32, and 16, and this is a pretty powerful computer. When I restart the computer and open MSTS again, I get max framerates.
I did a further test after a reboot and I still get 16fps at the start of the activity.
Kind of funny, really. The first computer I ran MSTS on had 128 MB of RAM....
Robert
LOL. Same here - an AMD Duron @ 700MHz, 128 MB PC-100 SDRAM, 32 MB ATI Rage 128 Pro (Xpert 2000) graphics, and onboard sound. Ran like a champ at 25 - 59 fps with the default routes and rolling stock.
Without the "hard allocation" of memory to MSTS you're probably forcing the OS to use that damn paging file, thereby inducing some load stuttering as it spins the drive to find the data.
Easy way to find out is to turn off the page file.
I have 6 GB of RAM and run without the page file; no change to the stuttering.
When you start a new process, it requests a certain amount of memory. Windows assigns the process memory from the virtual memory pool (this is equal to the total of physical RAM and page file). For a 32-bit non-LAA process Windows can assign up to 2 GB of virtual memory. Windows then uses page tables to assign this virtual memory to physical memory. As far as each process is concerned it's addressing memory locations from 0 up to 2 GB. Because these are remapped by the OS, there will never be two processes trying to address the same physical memory location, because the page tables assign different locations in physical memory to each process.
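The per-process remapping described above can be sketched as a toy model (the class and names are illustrative, not how Windows actually implements page tables):

```python
PAGE_SIZE = 4096

class ToyProcess:
    """Toy model of per-process virtual memory: each process owns a
    page table mapping virtual page numbers to physical frame numbers."""

    def __init__(self, page_table):
        self.page_table = page_table  # {virtual_page: physical_frame}

    def translate(self, virtual_addr):
        page, offset = divmod(virtual_addr, PAGE_SIZE)
        frame = self.page_table[page]  # a missing entry would be a page fault
        return frame * PAGE_SIZE + offset

# Two processes both address virtual location 0x1000, but the OS has
# mapped that page to different physical frames, so they never collide.
a = ToyProcess({1: 7})
b = ToyProcess({1: 42})
assert a.translate(0x1000) != b.translate(0x1000)
```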
64-bit Windows works pretty much the same, except it can assign up to 4 GB to a 32-bit process if it is LAA, and it can remap the memory for a 32-bit process anywhere in physical memory it chooses, even above the 4 GB line. Incidentally, LAA works with some software. I've tried it with an old image-editing program: if I open enough large images, the program will use close to 4 GB. I tried LAA with MSTS last year. It seems to be more stable, but I've yet to see it use more than ~1.2 GB of RAM; when it tries to use more than that, it immediately crashes. Remember, at the time MSTS was made, 128 MB was considered a lot of RAM, and 1 GB was a pipe dream for most users.
The biggest limitation of MSTS in my opinion is the fact that it only uses one processor. MSTS is largely CPU-bound, not GPU-bound. If there was a way of getting it to use more than one processor I think this would help considerably more than allowing it to address more RAM.
The biggest limitation of MSTS in my opinion is the fact that it only uses one processor.
Absolutely. Unfortunately, it would need some serious re-coding to handle multithreading. That's where Open Rails really shows its advantage, as it is multithread-aware.
Multi-threading isn't really the panacea that everyone thinks it is... It works well with multitasking, but it has to return to the originating thread, since in reality, most programming is linear.
Most of the time threads end up waiting for each other anyway...
Unfortunately, it would need some serious re-coding to handle multithreading. That's where Open Rails really shows its advantage, as it is multithread-aware.
Multi-threading and multi-tasking use "time-slicing": doing COPY-PASTEs of two large folders at the same time slows each one down by 50%, just as browsing the Internet while downloading something, with two connections open at the same time, runs at less than normal speed.
Multi-core CPUs can do parallel processing, provided that Win 7 and the programmes running on it have been coded to use all of the cores.
My son, who is an expert on the subject, created patented parallel-processing code for various servers in Novell NetWare clusters, running at the same speed as would normally be the case with a single server. He now works for Fusion-io, writing parallel-processing code for their SSD drives used by Facebook, Oracle, and other cloud users.
This thread started with a way to tweak MSTS so it seems to be "large address aware". I've done that. I've noticed no real difference in operation between the tweaked & non-tweaked version, and both seem to use the same amount of memory as reported by Task Manager (Win 7 SP1 64-bit, 6GB RAM, Core 2 Extreme (4 real cores), MSTS w/o tweak has mem:1024 and with tweak has mem:3072). I still get some stuttering, which is not a function of memory but rather how fast the disk can be accessed & return data (I don't have a SSD - yet - but do have 7200rpm SATA3 drives). All of this makes sense because the limits on MSTS *are* the processor (as discussed above) and disk access (as discussed above and elsewhere). So the tweak seems to be something that causes no harm, but doesn't improve matters much either.
One thing that *did* help a little was plugging in a gift 4GB flash stick and telling Win7 to use it with "ReadyBoost." Makes sense, because Win7 uses it to cache more disk data than is normally done in RAM, and after running MSTS for a while it does seem to reduce the length of the disk-access pauses - though they still happen. Presumably running from a SSD to start with would reduce the stutters even more.
Parallel processing is fine (I'm considering an application of it at work), but to work well it needs a lot of independent tasks; any time you have to pull everything together and synchronize the results you have a bottleneck. There are many parallel processing systems out there, commercial and free/open source, and all of them have that "feature" - great until you want to collect the output and do something else with it (like generate a result). The spread/collect work is overhead that has to be amortized over a large number of parallel tasks to make sense. Parallelism in a PC is more along the lines of multiple processors letting background tasks (of which there are many in modern Windoze) have something else to run on while MSTS is monopolizing the core you're interested in. I did notice a very significant improvement going from an old dual-core (Pentium D) to the Core 2 Extreme (4 real cores - not 2 hyperthreaded - at near the same clock speed as the P-D). The Core 2 is a much more efficient computer per core than the old beast (which was really a pair of P4s), and I now have more of them for the general background work to spread over. i5s and i7s with 4 or more cores are even better. Funny, though, I didn't get a speedup (in fact, it went slower) when I set the processor affinity to a single core, so I suspect Windows is still spreading around a little of MSTS's work.
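The synchronization bottleneck described above is usually quantified with Amdahl's law: if only a fraction p of the work can run in parallel, n cores can never deliver more than 1/((1-p) + p/n) speedup. A quick illustration:

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Upper bound on speedup when only `parallel_fraction` of the
    work can be spread across `n_cores`; the rest stays serial."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# Perfectly parallel work scales with the core count...
print(amdahl_speedup(1.0, 4))   # 4.0
# ...but if half the work is serial, 4 cores top out at 1.6x,
# no matter how fast each core is.
print(amdahl_speedup(0.5, 4))   # 1.6
```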