
SSL Not Working #5

Open
aderon2 opened this issue Jan 9, 2018 · 1 comment

aderon2 commented Jan 9, 2018

When scanning a host that runs Apache 2.4.18 on both 80 (for http) and 443 (for https), the scheme doesn't appear to change to https, causing the SSL enumeration scans to fail. My initial guess was that the service was being misidentified by the nmap scan, but I've attached the nmap results below (with some info omitted) and they correctly label the service as 'ssl/http'.

My other guess is that once 'http' is chosen as the scheme when port 80 is scanned first, no second scheme check is done when port 443 is scanned? Looking at the timestamps on the output files, it definitely looks like port 80 is scanned before port 443. Not sure, just throwing it out there. Please let me know if I can provide any additional info.

-rw-r--r-- 1 root root   447 Jan  9 17:56 443_http_dirb.txt
-rw-r--r-- 1 root root   741 Jan  9 17:56 443_http_index.html
-rw-r--r-- 1 root root   915 Jan  9 17:56 443_http_nikto.txt
-rw-r--r-- 1 root root  7537 Jan  9 17:56 443_http_nmap.txt
-rw-r--r-- 1 root root 17621 Jan  9 17:56 443_http_nmap.xml
-rw-r--r-- 1 root root   741 Jan  9 17:56 443_http_robots.txt
-rw-r--r-- 1 root root   761 Jan  9 17:55 80_http_dirb.txt
-rw-r--r-- 1 root root   368 Jan  9 17:55 80_http_index.html
-rw-r--r-- 1 root root   914 Jan  9 17:55 80_http_nikto.txt
-rw-r--r-- 1 root root  4215 Jan  9 17:55 80_http_nmap.txt
-rw-r--r-- 1 root root  8534 Jan  9 17:55 80_http_nmap.xml
-rw-r--r-- 1 root root   565 Jan  9 17:55 80_http_robots.txt

Nmap Results (ssl-cert and ssl-date NSE scripts omitted)

PORT    STATE SERVICE  REASON         VERSION
21/tcp  open  ftp      syn-ack ttl 64 vsftpd 3.0.3
80/tcp  open  http     syn-ack ttl 64 Apache httpd 2.4.18 ((Ubuntu))
| http-methods: 
|_  Supported Methods: POST OPTIONS GET HEAD
|_http-server-header: Apache/2.4.18 (Ubuntu)
|_http-title: Site doesn't have a title (text/html).
443/tcp open  ssl/http syn-ack ttl 64 Apache httpd 2.4.18 ((Ubuntu))
| http-methods: 
|_  Supported Methods: POST OPTIONS GET HEAD
|_http-server-header: Apache/2.4.18 (Ubuntu)
|_http-title: Site doesn't have a title (text/html).
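For what it's worth, here is a minimal sketch of the kind of per-port check I would expect (this is not the project's actual code, and the variable names are made up): choose the scheme from nmap's service field for each port instead of reusing whatever was picked for the first http port.

# Hypothetical sketch, not the tool's real code: decide the scheme per port
service="ssl/http"                  # e.g. nmap's service field for port 443
case "$service" in
  ssl/*|https) scheme="https" ;;
  *)           scheme="http"  ;;
esac
echo "$scheme://10.0.0.105:443/"    # -> https://10.0.0.105:443/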

Example 1: Missing https:// when scanning port 443 causes gobuster to return no results

root@kali:~/Documents/recon/results/10.0.0.105# cat 443_http_dirb.txt 

Gobuster v1.2                OJ Reeves (@TheColonial)
=====================================================
[+] Mode         : dir
[+] Url/Domain   : http://10.0.0.105:443/
[+] Threads      : 10
[+] Wordlist     : /usr/share/seclists/Discovery/Web_Content/common.txt
[+] Status codes : 200,204,301,302,307,403,500
[+] Expanded     : true
=====================================================
=====================================================
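For comparison, re-running the scan by hand with the https scheme (and -k to skip certificate validation for the self-signed cert) should produce results; something like:

gobuster -m dir -u https://10.0.0.105:443/ -w /usr/share/seclists/Discovery/Web_Content/common.txt -t 10 -e -k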

Example 2: Missing https:// in the index curl causes an invalid request to be sent to port 443

root@kali:~/Documents/recon/results/10.0.0.105# cat 443_http_index.html 
HTTP/1.1 400 Bad Request
Date: Tue, 09 Jan 2018 22:56:07 GMT
Server: Apache/2.4.18 (Ubuntu)
Strict-Transport-Security: max-age=63072000; includeSubdomains
X-Frame-Options: DENY
X-Content-Type-Options: nosniff
Content-Length: 439
Connection: close
Content-Type: text/html; charset=iso-8859-1

<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>400 Bad Request</title>
</head><body>
<h1>Bad Request</h1>
<p>Your browser sent a request that this server could not understand.<br />
Reason: You're speaking plain HTTP to an SSL-enabled server port.<br />
 Instead use the HTTPS scheme to access this URL, please.<br />
</p>
<hr>
<address>Apache/2.4.18 (Ubuntu) Server at vulnerable Port 443</address>
</body></html>
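The same request sent with the https scheme (and -k to ignore the self-signed cert) avoids the 400:

curl -sik http://10.0.0.105:443/     # reproduces the 400 above
curl -sik https://10.0.0.105:443/    # returns the actual index page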

Example 3: Missing https:// in the robots.txt curl causes an invalid request to be sent to port 443

root@kali:~/Documents/recon/results/10.0.0.105# cat 443_http_robots.txt
HTTP/1.1 400 Bad Request
Date: Tue, 09 Jan 2018 22:56:07 GMT
Server: Apache/2.4.18 (Ubuntu)
Strict-Transport-Security: max-age=63072000; includeSubdomains
X-Frame-Options: DENY
X-Content-Type-Options: nosniff
Content-Length: 439
Connection: close
Content-Type: text/html; charset=iso-8859-1

<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>400 Bad Request</title>
</head><body>
<h1>Bad Request</h1>
<p>Your browser sent a request that this server could not understand.<br />
Reason: You're speaking plain HTTP to an SSL-enabled server port.<br />
 Instead use the HTTPS scheme to access this URL, please.<br />
</p>
<hr>
<address>Apache/2.4.18 (Ubuntu) Server at vulnerable Port 443</address>
</body></html>

Nikto appears to work fine, but I believe it will automatically retry with https if http fails. I noticed you had a note in the code about '-C all' potentially slowing it down?

Perhaps the nikto_ssl variable is not being set to ' -ssl' for the same reason the scheme is wrong, and this is causing nikto to test with http first before falling back to https (causing the slow speeds)?
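If that's what's happening, forcing TLS on the 443 run should skip the plain-http attempt entirely, e.g.:

nikto -h 10.0.0.105 -p 443 -ssl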

'443_http_nmap.txt' doesn't require a scheme, so no issues there.


aderon2 commented Mar 19, 2018

Is this still being maintained?
