Kennedy, Ian (2010).
Abstract
The emerging techniques in volatile memory acquisition and analysis are ideally suited to malware analysis. However, context-based techniques such as the extraction of unpacked binaries, the recovery of data allocated to hidden processes, and the identification of terminated processes are developments in need of quantifiable evaluation. A greater level of science in tool and process evaluation arises from the move towards higher standards in the delivery of forensic science services. In this paper, we propose a number of experiments to gather data that will allow the derivation of a statistical measure of error for volatile memory acquisition. The design of these experiments takes into account the constantly changing nature of volatile memory, not only between acquisitions but also during the imaging process; piecewise hashing is applied to each of the files containing the RAM acquisitions to derive an error rate. Repeating the experiments for multiple tools and operating systems/patches will produce a more generalised error rate for the procedures alone. It is intended that the proposed experiments will be offered to postgraduate students following the Open University's Computer Forensics and Investigations course (M889). Therefore, the paper also discusses the risks and challenges associated with teaching malware analysis techniques in a distance-learning setting, along with the challenges relating to the technical skills required to perform malware analysis: programming, debugging, assembly language, and obfuscation, for example, are tasks often outside the student's skill set. For the student, the experiments demonstrate a quantifiable data-gathering technique to inform decisions regarding the validation of tools where reference data may be difficult or impossible to produce. Further, these experiments are an initial step towards a methodology for quantifiably evaluating other processes and tools associated with malware analysis.
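The piecewise-hashing step described in the abstract could be sketched along the following lines. This is an illustrative outline only, not the tooling used in the experiments: the block size, the choice of MD5 as the per-block hash, and the handling of unequal-length images are all assumptions made for the sketch.

```python
import hashlib

def piecewise_hashes(path, block_size=4096):
    """Hash a memory image in fixed-size blocks (classic piecewise hashing)."""
    hashes = []
    with open(path, "rb") as f:
        while True:
            block = f.read(block_size)
            if not block:
                break
            hashes.append(hashlib.md5(block).hexdigest())
    return hashes

def block_error_rate(image_a, image_b, block_size=4096):
    """Fraction of corresponding blocks whose hashes differ between two
    acquisitions of the same RAM -- a simple block-level error measure.
    Only the overlapping prefix of the two images is compared (assumption)."""
    ha = piecewise_hashes(image_a, block_size)
    hb = piecewise_hashes(image_b, block_size)
    compared = min(len(ha), len(hb))
    if compared == 0:
        return 0.0
    differing = sum(1 for x, y in zip(ha, hb) if x != y)
    return differing / compared
```

Comparing two acquisitions of the same machine block by block in this way yields the proportion of blocks that changed between (or during) imaging runs, which is the kind of statistic the proposed experiments would aggregate across tools and operating systems.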