Here is one example payload after base64 decoding it:
In Burp Suite:
It is clear that you cannot easily manipulate the binary data in the payload and just base64-encode it again: the payload is serialized with Protocol Buffers, so when you want to change a payload that has multiple parameters, doing it manually is nothing like editing JSON; it is practically impossible.
Here is another payload from a real target:
There are some tools for sending gRPC or gRPC-Web requests, but there is a problem: you need the .proto file! In a black-box test you naturally do not have it; all you have is the payload and the webpacked JavaScript files the browser uses to send gRPC-Web requests. At the end, I will also show a little about white-box testing with .proto files.
After decoding the base64-encoded payload and piping the output to the xxd command, we can see the hex data. The first 5 bytes are the gRPC-Web length prefix: a 1-byte compression flag followed by the 4-byte big-endian message length. In this example the length byte is 0x16 (16¹ * 1 + 16⁰ * 6 = 22), which means the message after the prefix is 22 bytes long.
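To make this framing concrete, here is a minimal Python sketch (just an illustration, not part of the suite) that decodes the example payload and parses that prefix:

import base64

# Example gRPC-Web payload as seen in the browser / Burp
payload_b64 = "AAAAABYSC0FtaW4gTmFzaXJpGDY6BVhlbm9u"
raw = base64.b64decode(payload_b64)

# gRPC-Web prefix: 1-byte compression flag + 4-byte big-endian message length
flag = raw[0]
length = int.from_bytes(raw[1:5], "big")
message = raw[5:]

print(flag, length, len(message))  # 0 22 22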
After removing the length prefix, we can pipe the payload to the Protoscope tool, which outputs a human-readable, editable version of the payload, similar to JSON but not exactly JSON. The message fields are identified by field numbers rather than field names, because Protocol Buffers works with field numbers on the wire; the field names only exist in the .proto file and the generated stub files.
gRPC Pentest Suite has 2 tools + 1 Burp Suite extension for hacking gRPC-Web:
This tool helps you manipulate payloads: it removes the length prefix, and it is also useful for examining responses from the server or doing response manipulation. You also need the Protoscope tool installed to make the gRPC Pentest Suite complete and usable.
This extension lets you use the gRPC Coder tool faster, decoding and encoding payloads with just one click.
This tool scans webpacked gRPC-Web JavaScript files and outputs the gRPC endpoints, services, methods, messages, fields, and field types. It helps a lot with finding hidden parameters or hidden endpoints, and in some situations you can even build a .proto file from its output.
First, pass the payload to the standard input of gRPC Coder with the --decode flag, then pipe the output to Protoscope and save the result to a file for editing:
echo "AAAAABYSC0FtaW4gTmFzaXJpGDY6BVhlbm9u" | python3 grpc-coder.py --decode | protoscope > out.txt
cat out.txt
2: {"Amin Nasiri"}
3: 54
7: {"Xenon"}
Edit the out.txt file:
cat out.txt
2: {"Amin Nasiri Xenon GRPC"}
3: 54
7: {"<script>alert(origin)</script>"}
Then use Protoscope to serialize the edited file and pass its output to the gRPC Coder tool with the --encode flag:
protoscope -s out.txt | python3 grpc-coder.py --encode
After that, we can send the payload with Burp Suite:
AAAAADoSFkFtaW4gTmFzaXJpIFhlbm9uIEdSUEMYNjoePHNjcmlwdD5hbGVydChvcmlnaW4pPC9zY3JpcHQ+
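Conceptually, the encode step is just the decode step in reverse: protoscope -s serializes the edited text back to Protobuf bytes, and gRPC Coder essentially prepends a fresh prefix and base64-encodes the result. The edited message above is 58 bytes, so the new prefix is 00 00 00 00 3a, which is why the re-encoded payload starts with AAAAADo. A rough Python sketch of that framing step (the script name is hypothetical):

import base64, sys

# Read the serialized Protobuf message, e.g.:  protoscope -s out.txt | python3 frame.py
message = sys.stdin.buffer.read()

# Prepend the gRPC-Web prefix: compression flag 0 + 4-byte big-endian length
prefix = b"\x00" + len(message).to_bytes(4, "big")

print(base64.b64encode(prefix + message).decode())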
I am sure you have realized that this is a time-consuming process: manipulating every single request by hand takes quite a few minutes. That is why I made the extension.
I have made a video on using this extension; in it, I exploit a sample lab that has client-side XSS protection:
You can easily encode and decode payloads with the extension. See the gRPC Pentest Suite repository for how to install the extension in Burp.
When you are working with a web application that uses gRPC-Web, you may see a main.js or somethingRandom.js file that contains the gRPC-Web related code.
Note: To find the correct JS file containing the gRPC-Web data, you can search for a gRPC-Web route across all Burp responses; for example, search for hidden.sqli.Searcher like this:
After finding the right JS file, download the file and scan it with gRPC Scan.
Here is the example from the Hidden-SQLi gRPC lab, after the client.js file has been webpacked:
It is a minified JavaScript file that contains useful information about the gRPC back-end endpoints and services. The gRPC Scan tool makes analyzing this file much easier:
python3 grpc-scan.py --file main.js
The output:
In the output, we can see 2 endpoints to which we can send requests.
There are also 3 messages; each of them has some fields, and each field has a field number and a field type. Note that the field names in the gRPC Scan output do not matter when we are manipulating payloads, because Protobuf works with field numbers. Field names are just small clues that help us guess what each parameter is for.
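If you want to go further, the scan output can sometimes be turned into a .proto file for regular gRPC tooling. The sketch below is purely illustrative: the package, service, and method names come from this lab, but the message and field definitions are invented placeholders that you would replace with the numbers and types gRPC Scan reports:

syntax = "proto3";

package hidden.sqli;

// Request/response structure is a guess for illustration only
service Searcher {
  rpc Search2(SearchRequest) returns (SearchResponse);
}

message SearchRequest {
  string query = 1;   // use the field numbers and types from the scan output
}

message SearchResponse {
  repeated string results = 1;
}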
Note that sometimes the application does not use all endpoints or all message fields; some of them may be optional or simply never sent by the client. You have to fuzz them to find possible vulnerabilities.
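For example, reusing the earlier out.txt, suppose the scan had reported an extra string field numbered 4 that the client never sends (a hypothetical field here). You can add it to the Protoscope text and re-encode the payload to see how the server reacts:

cat out.txt
2: {"Amin Nasiri Xenon GRPC"}
3: 54
4: {"FUZZ"}
7: {"Xenon"}

protoscope -s out.txt | python3 grpc-coder.py --encode

If the server accepts the extra field, you can test it like any other parameter.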
First, I decode the payload:
The result is this:
Then I change the route and put the SQLi payload inside it:
After making all changes to the payload, I encode it and send it to the server:
When I decode the response with the gRPC Coder extension, we see that there are no published posts, that the /Search2 route was not protected against the SQLi vulnerability, and that we get the flag :)
The complete video of exploiting hidden SQLi and XSS is here: