Anyone else running into CLR InvalidProgramException when hitting native memory reads in C#?
I've been tinkering with manual module parsing and memory dumping, but I'm hitting a wall with the Common Language Runtime throwing an InvalidProgramException. The error pops up specifically during the ReadProcessMemory call, despite the base address and memory size being resolved correctly via module iteration.
The Stack Trace:
Code:
[ERROR] InvalidProgramException: Common Language Runtime detected an invalid program.
[STACK] at ReadProcessMemory(IntPtr hProcess, IntPtr baseAddress, IntPtr buffer, Int32 size, IntPtr& bytesRead)
The implementation details:
- Resolution of module base and size via ProcessModuleCollection.
- Memory allocation via Marshal.AllocHGlobal.
- Delegate-based wrapping of native ReadProcessMemory.
Code:
internal static bool ReadProcessMemory(IntPtr hProcess, IntPtr baseAddress, IntPtr buffer, int size, out IntPtr bytesRead)
{
    return _readProcMem(hProcess, baseAddress, buffer, (IntPtr)size, out bytesRead);
}
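For reference, this is the conventional `[DllImport]` declaration for `kernel32!ReadProcessMemory`. On the native side both `nSize` and `lpNumberOfBytesRead` are `SIZE_T`, so the pointer-sized `IntPtr` is correct for both; a 4-byte `Int32` size (as in the stack trace above) only matches on 32-bit:

```csharp
using System;
using System.Runtime.InteropServices;

internal static class Native
{
    // Conventional P/Invoke for kernel32!ReadProcessMemory.
    // nSize and lpNumberOfBytesRead are SIZE_T natively, so IntPtr
    // (pointer-sized) is the matching managed type for both.
    [DllImport("kernel32.dll", SetLastError = true)]
    internal static extern bool ReadProcessMemory(
        IntPtr hProcess,
        IntPtr lpBaseAddress,
        IntPtr lpBuffer,
        IntPtr nSize,
        out IntPtr lpNumberOfBytesRead);
}
```

With a plain `DllImport` the runtime generates the marshaling stub itself, which sidesteps the hand-rolled delegate entirely.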
The binary I'm targeting is a standard C/C++ module. I'm not dealing with any game-specific protection or obfuscated entry points yet, just trying to get the basics of memory access right without crashing the runtime. It feels like a marshaling issue with the out IntPtr bytesRead parameter or a signature mismatch in the delegate definition compared to the native function signature.
Has anyone seen the CLR trip over itself like this when P/Invoking memory functions? I'm thinking it might be related to IntPtr vs. UIntPtr in the size parameter, or possibly how the stack is cleaned up after the call.
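If the delegate route is kept, one thing worth ruling out is a delegate type declared without an explicit unmanaged calling convention or with a mismatched parameter width. A minimal sketch, assuming the function pointer is resolved via `GetProcAddress` (the resolution helper here is illustrative, not from the original post):

```csharp
using System;
using System.Runtime.InteropServices;

internal static class NativeDelegates
{
    // Being explicit about the calling convention rules out one class of
    // stack-cleanup mismatches when the delegate is invoked.
    [UnmanagedFunctionPointer(CallingConvention.StdCall, SetLastError = true)]
    internal delegate bool ReadProcessMemoryDelegate(
        IntPtr hProcess,
        IntPtr lpBaseAddress,
        IntPtr lpBuffer,
        IntPtr nSize,        // SIZE_T: pointer-sized, not Int32
        out IntPtr lpNumberOfBytesRead);

    [DllImport("kernel32.dll", SetLastError = true)]
    private static extern IntPtr GetModuleHandle(string lpModuleName);

    [DllImport("kernel32.dll", SetLastError = true)]
    private static extern IntPtr GetProcAddress(IntPtr hModule, string lpProcName);

    // Resolves kernel32!ReadProcessMemory into a typed managed delegate.
    internal static ReadProcessMemoryDelegate Resolve()
    {
        IntPtr kernel32 = GetModuleHandle("kernel32.dll");
        IntPtr fn = GetProcAddress(kernel32, "ReadProcessMemory");
        return Marshal.GetDelegateForFunctionPointer<ReadProcessMemoryDelegate>(fn);
    }
}
```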
Anyone have a reliable way to map these modules without triggering these CLR exceptions?
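Not a definitive answer, but a sketch of how module iteration plus a plain `DllImport` wrapper usually fits together, in case it helps isolate whether the delegate is the problem. Assumes the process handle from `Process.Handle` carries read access, and that the whole image is readable in one call (per-page reads may be needed if some regions are protected):

```csharp
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

internal static class ModuleDumper
{
    [DllImport("kernel32.dll", SetLastError = true)]
    private static extern bool ReadProcessMemory(
        IntPtr hProcess, IntPtr lpBaseAddress, IntPtr lpBuffer,
        IntPtr nSize, out IntPtr lpNumberOfBytesRead);

    // Finds a module by name in the target process and copies its image
    // into a managed array. Returns null if the module is not found.
    internal static byte[] DumpModule(Process target, string moduleName)
    {
        foreach (ProcessModule module in target.Modules)
        {
            if (!module.ModuleName.Equals(moduleName, StringComparison.OrdinalIgnoreCase))
                continue;

            IntPtr buffer = Marshal.AllocHGlobal(module.ModuleMemorySize);
            try
            {
                if (!ReadProcessMemory(target.Handle, module.BaseAddress, buffer,
                        (IntPtr)module.ModuleMemorySize, out IntPtr bytesRead))
                    throw new System.ComponentModel.Win32Exception();

                byte[] image = new byte[(long)bytesRead];
                Marshal.Copy(buffer, image, 0, image.Length);
                return image;
            }
            finally
            {
                Marshal.FreeHGlobal(buffer);
            }
        }
        return null;
    }
}
```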