Magnet Weekly CTF — Week 5

Some background

If you’re not familiar with Hadoop, this question probably required a bit of digging. Essentially, Hadoop is a distributed computing framework whose filesystem, HDFS, runs on top of your base operating system, and it needs Java to run. If you want to learn more, here you go.

Rabbit Hole of Death

I went down my first rabbit hole trying to recreate a Hadoop cluster locally and somehow mount the forensic image as its backing files, but that was a brick wall. To say I spent a lot of time on this is an understatement. I tried all sorts of things, from attempting to mount an HDFS cluster in Linux to setting up a Cloudera instance online (that cost money, so that’s where I drew the line). This was evidently NOT how to solve this challenge.


So after sleeping on it, I did a little more digging in FTK Imager only to find the following files in the Master E01 file:

Rabbit Hole #2

From the link above, it became evident I needed to be able to run the ‘hdfs’ or ‘hadoop’ commands locally on my Linux VM. I spent another large chunk of time attempting to set up a local Hadoop cluster when I had my second bit of luck: finding that you could simply install the Hadoop command-line interface:

It’s the darkest before the dawn

At this point I knew I needed 2 things:

  • Java SDK
  • Hadoop
export JAVA_HOME="/usr/lib/jvm/java-1.11.0-openjdk-amd64"
./hdfs --version
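For anyone retracing these steps, the setup roughly amounts to the following. This is a sketch assuming a Debian/Ubuntu VM; the OpenJDK package name and the Hadoop release/archive path are assumptions, so adjust them for your distro and whichever Hadoop version you download:

```shell
# Install a Java SDK (assumed Debian/Ubuntu package name)
sudo apt-get install -y openjdk-11-jdk

# Unpack a Hadoop release tarball fetched from an Apache mirror
# (version number here is a placeholder)
tar -xzf hadoop-3.3.6.tar.gz
cd hadoop-3.3.6/bin

# hdfs refuses to run until JAVA_HOME points at the JDK
export JAVA_HOME="/usr/lib/jvm/java-1.11.0-openjdk-amd64"
./hdfs --version
```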

Moment of Truth

At last I had all the pieces put together: my Linux VM gave me a correct response to the hdfs --version command and we were ready to go. I exported the first file I found to test it out and see what I got. Let’s be honest, I expected a full screen of error messages:

./hdfs oiv -i fsimage_0000000000000000024 -o fsimage24.xml -p XML
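With the XML out of the Offline Image Viewer, the file and directory names recorded in the fsimage are easy to pull out. A minimal sketch — the real input is the fsimage24.xml produced by the command above; a tiny hand-made sample in the same shape stands in for it here so the commands are self-contained:

```shell
# Hand-made stand-in for the oiv XML output (structure only, not real case data)
cat > fsimage_sample.xml <<'EOF'
<?xml version="1.0"?>
<fsimage>
  <INodeSection>
    <inode><id>16385</id><type>DIRECTORY</type><name></name></inode>
    <inode><id>16386</id><type>FILE</type><name>flag.txt</name></inode>
  </INodeSection>
</fsimage>
EOF

# Each <name> element holds one file or directory name; strip the tags
grep -o '<name>[^<]*</name>' fsimage_sample.xml | sed 's/<[^>]*>//g'
```

Against the real fsimage24.xml you would point the same grep/sed pipeline at that file instead of the sample.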


