Cookie is not being set after CRLF injection in one domain but is set in another domain. How can I bypass this/set it?

OK, I am facing a very weird behaviour: the same trick both sets and doesn't set the cookie. I found CRLF injection in two domains. For the first one, when I go to the vulnerable URL, the cookie gets set in Firefox ESR, i.e. it works in the browser. The first vulnerable domain I encountered had this URL:

I can view the cookie using the developer tools, which I believe is the expected behaviour. The next domain I encountered had these vulnerable URLs, but it didn't work in the browser 🙁:

But when I visit any of its URLs, the cookie does not get set in the browser. Both domains, however, set the cookie in the curl response. This is what it looks like for both (redacted); the first one works in the browser and the second doesn't.
Working curl request:

root@kali-linux:~/redacted/# http

HTTP/2 301 
date: Thu, 13 Aug 2020 15:02:53 GMT
content-type: text/html
content-length: 185
set-cookie: dipesh=yadav
expires: Thu, 20 Aug 2020 15:02:53 GMT
cache-control: max-age=604800

HTTP/2 200 
date: Thu, 13 Aug 2020 15:02:53 GMT
content-type: text/html
content-length: 1452
vary: Accept-Encoding
last-modified: Tue, 04 Feb 2020 15:54:26 GMT
etag: "redacted"
expires: Thu, 20 Aug 2020 15:02:53 GMT
cache-control: max-age=604800
access-control-allow-origin: *
accept-ranges: bytes


root@kali-linux:~/redacted# http

HTTP/1.1 301 Moved Permanently
Server: nginx
Date: Thu, 13 Aug 2020 15:05:04 GMT
Content-Type: text/html
Content-Length: 162
Set-Cookie: bugbounty=bugbountyplz
Last-Modified: Thu, 13 Aug 2020 15:05:04 GMT
Cache-Control: private
Age: 0
X-Frame-Options: DENY
X-XSS-Protection: 1; mode=block
X-Content-Type-Options: nosniff
Connection: keep-alive

HTTP/2 200 
server: nginx
date: Thu, 13 Aug 2020 15:05:05 GMT
content-type: text/html; charset=UTF-8
vary: Accept-Encoding
access-control-allow-credentials: true
last-modified: Thu, 13 Aug 2020 15:05:05 GMT
cache-control: no-cache, private
age: 0
strict-transport-security: max-age=15768000
x-frame-options: DENY
x-xss-protection: 1; mode=block
x-content-type-options: nosniff
accept-ranges: bytes

Can anyone help me with this? What is the problem that doesn't let me set the cookie in the second domain, while I can set it in the first?
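One way to narrow this down is to look at the raw headers of each hop yourself, without curl's redirect-following in the way, and compare them with what the browser actually stores. Below is a minimal stdlib-Python sketch (the URL is a placeholder for the redacted vulnerable URL; `raw_set_cookies` is just an illustrative helper name, not an existing API):

```python
# Sketch: fetch a single URL WITHOUT following redirects and dump every
# Set-Cookie header exactly as the server sent it, so the raw response
# can be compared against what the browser is willing to store.
import http.client
from urllib.parse import urlsplit

def raw_set_cookies(url):
    parts = urlsplit(url)
    conn_cls = (http.client.HTTPSConnection
                if parts.scheme == "https" else http.client.HTTPConnection)
    conn = conn_cls(parts.hostname, parts.port)  # port=None -> scheme default
    path = parts.path or "/"
    if parts.query:
        path += "?" + parts.query
    conn.request("GET", path)
    resp = conn.getresponse()          # reads status line + headers only
    headers = resp.getheaders()        # list of (name, value) tuples
    conn.close()
    return resp.status, [v for k, v in headers if k.lower() == "set-cookie"]

# status, cookies = raw_set_cookies("https://redacted.example/vulnerable-path")
# print(status, cookies)
```

Things worth checking in that raw output: a `Domain` or `Path` attribute that doesn't match the page, a `Secure` flag on a plain-HTTP response, or stray control characters left over after the injected CRLF. curl prints all of these happily, while a browser rejects the cookie silently.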

Go to Source
Author: Dipesh Sunrait

What’s the deal with X25519 Support in Chrome/Firefox?

RFC 8446 (TLSv1.3) Section 9.1 says that "implementations SHOULD support X25519".

An online list of software supporting Curve25519 lists both Firefox and Chrome as supporting it for TLS.

I did an experiment and created a self-signed TLS certificate with Ed25519. Both Chromium 84 and Firefox 79 complain about not being able to negotiate the cipher list/version. I've also noticed that they initiate TLSv1.2 handshakes when connecting to localhost, but use TLSv1.3 handshakes when connecting to Google, for example. wget, on the other hand, has no problem connecting (I used --no-check-certificate, but AFAIK that shouldn't matter here).

I then looked at the TLSv1.3 handshakes. Neither browser offers Ed25519 as a signature algorithm in its ClientHello (even when connecting to Google via TLSv1.3). Again, wget does offer it as part of its ClientHello.

[Screenshot: Chromium 84.0 TLSv1.3 supported signatures]

So I figured this might be a platform issue with my distro (Fedora), but this blog post also claims that the major browsers don't support X25519, while ChromeStatus says it has been supported since Chrome 50 (I'm assuming Chrome and upstream Chromium do not differ in this).

I'm totally confused. What's the current state of X25519 support in the major browsers? Is it a Google Chrome vs. upstream Chromium issue?
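Part of the confusion here may be that X25519 and Ed25519 are different things built on the same curve: X25519 (RFC 7748) is a Diffie-Hellman key-agreement function, advertised in the TLS `supported_groups` extension, while Ed25519 (RFC 8032) is a signature scheme, which is what a certificate needs and what `signature_algorithms` advertises. Support for one says nothing about the other. To make the distinction concrete, here is an educational, non-constant-time sketch of the X25519 function straight from the RFC 7748 Montgomery-ladder pseudocode (not code any browser actually runs):

```python
# X25519 per RFC 7748: pure key agreement over Curve25519.
# Educational only -- Python big ints are not constant-time.
P = 2**255 - 19          # field prime
A24 = 121665             # (486662 - 2) / 4, ladder constant

def _decode_scalar(k: bytes) -> int:
    # "Clamping" per RFC 7748 section 5.
    k = bytearray(k)
    k[0] &= 248
    k[31] &= 127
    k[31] |= 64
    return int.from_bytes(k, "little")

def x25519(k: bytes, u: bytes) -> bytes:
    k_int = _decode_scalar(k)
    x1 = int.from_bytes(u, "little") & ((1 << 255) - 1)  # mask top bit
    x2, z2, x3, z3 = 1, 0, x1, 1
    swap = 0
    for t in reversed(range(255)):          # Montgomery ladder
        k_t = (k_int >> t) & 1
        swap ^= k_t
        if swap:
            x2, x3, z2, z3 = x3, x2, z3, z2
        swap = k_t
        A = (x2 + z2) % P;  AA = A * A % P
        B = (x2 - z2) % P;  BB = B * B % P
        E = (AA - BB) % P
        C = (x3 + z3) % P;  D = (x3 - z3) % P
        DA = D * A % P;     CB = C * B % P
        x3 = (DA + CB) % P; x3 = x3 * x3 % P
        z3 = (DA - CB) % P; z3 = z3 * z3 % P; z3 = z3 * x1 % P
        x2 = AA * BB % P
        z2 = E * (AA + A24 * E) % P
    if swap:
        x2, x3, z2, z3 = x3, x2, z3, z2
    return (x2 * pow(z2, P - 2, P) % P).to_bytes(32, "little")

# Demo: two parties derive the same shared secret (the DH property) --
# which is all X25519 provides: no signing, hence no use in certificates.
import os
a_priv, b_priv = os.urandom(32), os.urandom(32)
base = (9).to_bytes(32, "little")   # the curve's standard base point
a_pub, b_pub = x25519(a_priv, base), x25519(b_priv, base)
assert x25519(a_priv, b_pub) == x25519(b_priv, a_pub)
```

Note there is no `sign`/`verify` anywhere; an Ed25519-signed certificate exercises an entirely different code path in the browser than X25519 key exchange does.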

Go to Source
Author: Jim Landy

What a malicious website can do in the worst scenario on an upgraded system [closed]

I use the latest Debian stable (buster, as of June 2020).

  • system upgraded every day (and browser add-ons updated automatically)
  • Firefox 68.9.0esr (64-bit) (the one from the apt package system)
  • decent hardware (less than 5 years old)
  • Debian security upgrades enabled

I'm aware of the security concerns. I…

  • verify (before clicking an HTTP link) whether the link only looks like a legitimate one but in fact points somewhere else (I watch out for phishing and tracking)
  • watch out for untrusted X.509 certificates on HTTPS websites
  • avoid using untrusted Firefox add-ons
  • never open suspicious files from the web or from emails
  • don’t use weak passwords (and I never reuse the same password on two websites)
  • never run Firefox as root (who does that?)
  • use the HTTPS Everywhere, uBlock Origin, Ghostery and Decentraleyes Firefox add-ons

So my question:

  • what is the risk of opening a malicious website (one not in the Google Safe Browsing DB)? What can it do, in the worst case, apart from phishing? (I guess crypto-mining at least, or exploiting a Firefox vulnerability…)

Go to Source
Author: Gilles Quenot

Is there some method to track what changes a website makes to files?

I was trying to find out how the site pentest-tools tracks me without using my MAC address or even my IP address. The website gives you two “free scans”, letting you scan any two sites and find some of the basic vulnerabilities present in them.

There are plenty of websites out there (actually, only two) that provide these free scans. As a challenge, I wanted to find out whether I could fool the website into allowing me more than two scans by changing my identity, thus having the server think of me as a new user.

I tried all the basic methods of hiding my identity (BTW, I am not a hacker or anywhere near one), which included MAC and IP spoofing and clearing cookies. They didn’t work. So I had a few questions:

  1. Is there any way to track what file changes the site performs to identify me, so that I can find the cookies responsible for storing the number of scans I have used and delete them?

  2. Also, is there a program that automatically deletes whatever files a website places on the computer (tracked by the method above), something that even prevents supercookies?
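One low-tech way to approach question 1 is to snapshot the browser's profile directory before and after visiting the site and diff the two snapshots. A rough stdlib-Python sketch (the function names `snapshot`/`diff` are mine; in practice you would point it at e.g. `~/.mozilla/firefox`):

```python
# Sketch: record (mtime, size) for every file under a directory, visit
# the website, record again, and report what was added/removed/changed.
import os
from pathlib import Path

def snapshot(root):
    """Map each file path under root to its (mtime_ns, size)."""
    state = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            p = Path(dirpath) / name
            try:
                st = p.stat()
            except OSError:       # file vanished mid-walk; skip it
                continue
            state[str(p)] = (st.st_mtime_ns, st.st_size)
    return state

def diff(before, after):
    """Return (added, removed, changed) path lists between two snapshots."""
    added = sorted(set(after) - set(before))
    removed = sorted(set(before) - set(after))
    changed = sorted(p for p in before.keys() & after.keys()
                     if before[p] != after[p])
    return added, removed, changed

# Typical use (paths are examples):
# before = snapshot(os.path.expanduser("~/.mozilla/firefox"))
# ... visit the site in the browser ...
# print(diff(before, snapshot(os.path.expanduser("~/.mozilla/firefox"))))
```

Comparing mtime and size is coarse; hashing file contents is more reliable but slower. Note that in Firefox most of the interesting state lands inside database files such as `cookies.sqlite` and under the profile's `storage/` subtree, so a changed file still needs to be inspected with an SQLite browser to see which site wrote what.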


Neel Gupta

Go to Source
Author: neel g