Hi all,
I am trying to acquire a live memory dump from an Ubuntu system. This is what I do:
1. Download the fmem tool
2. Compile it with make and load the module with the ./run.sh script
3. A /dev/fmem device is created
I know this is a special file and that I have to tell dd how much to read. However, I either end up with a file that is too small or I get a Segmentation fault error.
The RAM is 2 GB. My commands are:

dd if=/dev/fmem of=./dumpfile.raw count=400
This works, but the resulting file is not complete.

dd if=/dev/fmem of=./dumpfile.raw bs=1MB count=2000
dd if=/dev/fmem of=./dumpfile.raw count=500
Both of these end with a "Segmentation fault" error.
The version of fmem is 1.5. On top of that, if I download version 1.6 and run make, it fails with compilation errors. :(
Any clue?
Thanks in advance!
In my experience there are more (slightly) different versions of dd than stars in the sky, so I wouldn't even THINK of using "a" dd without specifying all the needed parameters, and specifying them in the most "basic" way, unless I am very familiar with, and have thoroughly tested, that specific version in that specific environment. The "default" blocksize may differ from what is expected, and/or the "translation" from (say) 1MB to 1048576 bytes may simply not happen (i.e. the specific build/version may want 1M instead of 1MB, etc.).
Personally I would try
1) dd if=/dev/fmem of=./dumpfile.raw count=4194304 bs=512
2) dd if=/dev/fmem of=./dumpfile.raw count=2097152 bs=1024
3) dd if=/dev/fmem of=./dumpfile.raw count=1048576 bs=2048
4) dd if=/dev/fmem of=./dumpfile.raw count=524288 bs=4096
5) dd if=/dev/fmem of=./dumpfile.raw count=2048 bs=1048576
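If you'd rather not hard-code the size, something along these lines can derive the count from the machine itself. This is only a sketch: it assumes a Linux /proc/meminfo with a MemTotal line in kB, and note that MemTotal is usually slightly *less* than the physically installed RAM (the kernel reserves some), so the computed count may undershoot a full physical dump.

```shell
# Read total memory in kB from /proc/meminfo (assumption: standard Linux layout).
mem_kb=$(awk '/^MemTotal:/ {print $2}' /proc/meminfo)

# Pick an explicit blocksize and compute the matching count.
bs=4096
count=$(( mem_kb * 1024 / bs ))

# Print the dd command instead of running it, so it can be inspected first.
echo "dd if=/dev/fmem of=./dumpfile.raw bs=$bs count=$count"
```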
2 GB is 2147483648 bytes, so each count*bs pair above multiplies out to exactly that:
4194304 * 512 = 2147483648
2097152 * 1024 = 2147483648
1048576 * 2048 = 2147483648
524288 * 4096 = 2147483648
2048 * 1048576 = 2147483648
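You can check the arithmetic in the shell itself before trusting any of the commands:

```shell
# Verify that each count*bs pair covers the full 2 GB (2147483648 bytes).
for pair in "4194304 512" "2097152 1024" "1048576 2048" "524288 4096" "2048 1048576"; do
  set -- $pair
  echo "$1 * $2 = $(( $1 * $2 ))"   # each line prints ... = 2147483648
done
```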
Of course, the dump will be (in some cases not-so-slightly) faster the bigger the blocksize, so you may want to try the above list of commands in reverse order.
jaclaz