I've gone back and forth on this since it was first announced.
It's a very tough pill to swallow and is absolutely going to break a lot of web content. There's just no way around that. And there are very few things that are worth breaking the web over.
This is probably one of them. The security benefits of this change are very, very high. It's fixing something that arguably was broken since the introduction of cookies.
I am, as a rule, fairly skeptical when Chrome/Chromium devs say they're going to break something for security. Usually I think they're wrong, this time I think they're right, and I'm grateful that they've pushed for the change.
Also to the Chromium team's credit, they're taking the lead on something that they are 100% going to get criticized about. There are going to be some very angry threads about how they did it wrong, and how a 2020 release is too soon. I don't think the rollout has been perfect either, but... somebody has to do it. Once everything dies down, Firefox is going to implement the same changes, and no one will bat an eye because everyone will have already adjusted to Chrome. There's very little upside to being the browser that makes these changes first, but one of the browsers needs to do it first.
A nontrivial portion of our privacy/security problems on the web come from some fundamentally bad sandboxing; and frankly, cookies are badly sandboxed. This change makes the default at least a little better, and will immediately block a large number of CSRF attacks.
So, yeah. I get that it's annoying, but I'm with Google on this one.
Firefox and Safari are way ahead of Chrome on this. Safari has blocked third-party cookies for years, with no way to opt-in to supporting them. Ad companies have worked around that restriction, using a number of circumventions that Chrome does not currently intend to block.
When ad companies attempt to circumvent Safari's third-party cookie policy, Safari tries to block them as part of their non-standard ITP (Intelligent Tracking Prevention) program. https://clearcode.cc/blog/intelligent-tracking-prevention/
Rather than fearing to break the web, Apple has broken the web's ad tech frequently and aggressively; users seem not to mind.
Browser vendors fear "breaking the web," in part because they're afraid that users will switch browsers if their favorite web site stops working. But it turns out that if you just break the web's ads, users tend to like it, if they even notice in the first place.
The point of this change isn't to break ads; any tracking benefits are at best incidental. Ad networks can just opt out anyway.
The point is to block CSRF attacks on ordinary websites for developers who don't know enough to secure their cookies.
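For developers who do want to secure their cookies explicitly rather than rely on the new browser default, here's a minimal sketch using Python's standard http.cookies module (the cookie name and value are placeholders; the samesite attribute requires Python 3.8+):

```python
from http.cookies import SimpleCookie

# Illustrative session cookie; name and value are placeholders.
cookie = SimpleCookie()
cookie["session_id"] = "abc123"
cookie["session_id"]["samesite"] = "Lax"  # explicit, instead of relying on the new default
cookie["session_id"]["httponly"] = True   # keep it away from document.cookie
cookie["session_id"]["secure"] = True     # only send over HTTPS

# The string you would emit as a Set-Cookie response header:
print(cookie["session_id"].OutputString())
```

The same attributes can of course be set through whatever web framework you use; most of them expose samesite/secure/httponly flags on their cookie-setting helpers.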
> But it turns out that if you just break the web's ads, users tend to like it, if they even notice in the first place.
Breaking just the web's ads wouldn't have the same impact on security. The reason this change is valuable is because it's not restricted to specific blacklists or domains or applied via an algorithm, it's universal. I really like the tracking protections that Safari/Firefox are adding, but comparing them to this is apples and oranges.
They're policies that are designed for different purposes.
Firefox's Enhanced Tracking Protection is not mandatory. It's just enabled by default now. You can configure it in your Firefox settings or disable for individual sites via the address bar's shield icon.
Firefox seems to have already implemented it behind prefs in Fx69 [0], and are just now waiting to toggle them to default true, presumably around the same time Chrome ships it, February 2020.
Thanks for the heads up, I was curious what Firefox's status was.
I think that's just a smart move in general. There's certainly no PR reason to preemptively turn this setting on, it's better to see what happens when Chrome does. If it turns out to be a disaster, you delay, if not, you flip the switch a few days later so you're not lagging behind on security. Again, I have a reasonable amount of respect for the amount of publicity that the Chromium team has done here, because I think that if things go wrong, they're the ones who will be blamed by most people.
It's good to bring up that Mozilla and Microsoft are also on board with this. It's not Google going off and doing something on their own or forcing the change despite objections, pretty much everyone including Mozilla seems to agree, and would like to see this change.
Not mentioned in the article, but critical for debugging and testing: Chrome has added a two-minute grace period, during which freshly set cookies are still allowed on non-GET cross-site requests even without a SameSite attribute. This is considered temporary and will be removed at some point in the future.
If you test with Chrome 78 and your app works, you can't know whether it'll still work once the two-minute grace period goes away. Please test your app using an older version of Chrome (76/77) with the relevant flags enabled.
There is a DevTools console warning that pops up if the cookie would have been rejected were it not for the two-minute threshold for POST requests (and friends).
This kills most Cross-Site Request Forgery vulnerabilities, since sites will now need to explicitly opt-in to allowing authentication cookies to be served in cross-site requests.
Of course, it'll probably be a long time before enough browsers support this that you can actually rely on this behavior, and even once support is universal there'll probably still be some sites doing dumb stuff like setting `SameSite=None` on authentication cookies without taking additional precautions, or allowing data to be modified with GET requests. But overall this is a huge win for security on the web.
>it'll probably be a long time before enough browsers support this
Will it? All major browsers now receive automatic or timely updates. Features roll out across all major browsers pretty rapidly nowadays, and unless you care about supporting IE you can expect new features to be ready to use within a year.
There is only so much you can do to support users on a discontinued browser that's many years out of date. Unless you want to be stuck in 2013 forever, you have to let them go eventually.
The problem is that this is a critical security feature, so the consequences for users are rather severe if you rely on it but their browser doesn't support it.
Instead of your site simply being broken or a certain feature on it not working, you'd instead be opening up all users on older browsers to a CSRF attack against your site.
For anyone else wondering: the new default SameSite=Lax should (continue to) work fine for cookies with the "Domain" attribute, for sharing with subdomains.
Thanks for mentioning this. I've pored over the docs on SameSite and none seem to mention subdomains or what setting is required for a cookie set on the root Domain to be shared with subdomains.
The other thing I had a really hard time finding an answer to is how SameSite handles sibling subdomains like www.foo.com and api.foo.com.
Apparently browsers use the public suffix list to determine the registrable (identifying) part of a domain; anything at or below that registrable domain is considered the same site.
So www.foo.com is the same site as api.foo.com, but user1.github.io is not the same site as user2.github.io, because github.io is on the public suffix list.
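As a toy sketch of that rule (this hardcodes two suffixes for illustration; a real implementation must consult the full Public Suffix List from publicsuffix.org):

```python
# Toy public-suffix set for illustration only; the real list has
# thousands of entries, including privately registered ones like github.io.
PUBLIC_SUFFIXES = {"com", "io", "github.io"}

def registrable_domain(host: str) -> str:
    """Return the public suffix plus one label, e.g. 'foo.com' for 'www.foo.com'."""
    labels = host.split(".")
    for i in range(len(labels)):
        if ".".join(labels[i:]) in PUBLIC_SUFFIXES:
            return ".".join(labels[max(i - 1, 0):])
    return host

def same_site(a: str, b: str) -> bool:
    """Two hosts are 'same site' when their registrable domains match."""
    return registrable_domain(a) == registrable_domain(b)

print(same_site("www.foo.com", "api.foo.com"))        # same registrable domain foo.com
print(same_site("user1.github.io", "user2.github.io"))  # github.io is a public suffix
```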
Correct, in that I think of a "Site" as an entity defined a layer above the "Domain". However the "Domain" attribute and the "SameSite" attribute control different behaviour.
"SameSite" affects sending the cookie in situations where top-level site in the browser context is different from the target site of the request where the browser is determining if it should send cookies. e.g. on example.site with an iframe to widget.site
"Domain" determine the the highest level domain to which cookies should be sent, regardless of the browsing context. e.g. on example.site an iframe on widgets.example.site or top-level navigation to accounts.example.site
Thank you for breaking it down and the reference link.
I have a couple of setups where an application has a single sign-on for root and subdomains. The shared cookie has the Domain attribute set to the root domain, but (so far) they have no explicit SameSite attribute.
I searched around and came to the conclusion that the above setup will behave the same way with new default SameSite=Lax. However, there wasn't a canonical reference that I could point to, to prove this works as I expect.
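As a sketch of the single-sign-on setup described above (domain and cookie names are illustrative, and this uses Python's standard http.cookies module, which supports the samesite attribute from 3.8 on):

```python
from http.cookies import SimpleCookie

# Hypothetical SSO cookie shared across subdomains of example.com.
sso = SimpleCookie()
sso["sso_token"] = "opaque-token"
sso["sso_token"]["domain"] = "example.com"  # sent to example.com and all its subdomains
sso["sso_token"]["samesite"] = "Lax"        # states the new Chrome default explicitly
sso["sso_token"]["secure"] = True

print(sso["sso_token"].OutputString())
```

Since accounts.example.com and app.example.com share the registrable domain example.com, they count as the same site, so SameSite=Lax does not interfere with the Domain attribute here.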
My website has one affiliate link to an Amazon page and a PayPal Buy It Now button (implemented as a form). Chrome's console message suggests I add 'SameSite=None' and 'Secure'. Any advice on where I would add those attributes, and whether they are the right values for these types of links? I didn't realize these types of links had anything to do with cookies.
Maybe I'm misunderstanding, but isn't the whole point that Amazon and PayPal need to update their cookies if they intend for these affiliate/pay now links to their domain to pass along their own cookies?
I don't know how Amazon and PayPal affiliate links work, whether they even send a cookie, or whether they just send you to a URL with your account ID embedded in the link. You could probably check by using the 'Network' tab in your browser's developer tools, or, as others have suggested, newer versions of Chrome allow testing of this feature.
If there are any cookies, you will need to add SameSite=None (or similar) to the Set-Cookie directives for the cookies that are intended to be shared cross-domain.
Ridiculous that they're trying to secure cookies but their blog requires you to access at least 6 domains to get more than a title. They push out other ad providers with the comfort of knowing they can use their CDNs to harvest user data.
This change makes a mockery of Google's commitment to a compatible web. There have been 5 SameSite drafts over two years. This change is strictly incompatible with two of the drafts that shipped in two different browsers (Chrome 66 and Safari 12). Users of LTS Linux distributions will find random websites broken because of this change. It's a great idea on paper, but by shipping so many incompatible versions in the draft stage, Google has virtually ensured breakage for certain user-agent/service combos. They should have held the change for another 18 months.
Can you give some examples with specifics on how their changes are causing problems? I only recently became aware of the SameSite setting and was considering using it on my apps, but might hold off
If you have cookies that need to be read in a 3p context (federated auth, for instance), you must UA-sniff to figure out whether setting SameSite=None is going to break your cookies, do nothing, or be required for your cookies to work - just across three versions of Chrome.
Then there are the apps that forked Chromium 66 and then renamed it, so you can't tell what version it is. We're planning to just break them, since it's impossible to tell what to do there.
If you return SameSite=None to Chrome 66, it ignores the cookie entirely. If you return SameSite=None to iOS 12, it treats it as SameSite=Strict. At the time each software was written, the behavior was blessed by the spec ("None" wasn't yet a valid value).
Here is Google's current list of incompatible clients. Someone should write an npm or Python module that encapsulates all these exceptions so sites can decorate their cookies correctly without worrying about these details.
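A rough sketch of what such a module might look like, based on the incompatible clients discussed above (the regexes are illustrative and nowhere near exhaustive; a real implementation should track the full list of known-broken UAs):

```python
import re

def should_send_samesite_none(user_agent: str) -> bool:
    """Return False for clients known to mishandle SameSite=None.
    Illustrative sketch only; covers three of the known-incompatible families."""
    # Chrome/Chromium 51-66 reject a cookie with SameSite=None outright.
    m = re.search(r"Chrom(?:e|ium)/(\d+)", user_agent)
    if m and 51 <= int(m.group(1)) <= 66:
        return False
    # WebKit on iOS 12 treats SameSite=None as SameSite=Strict.
    if re.search(r"\(iP.+; CPU .*OS 12[_\d]*.*\) AppleWebKit", user_agent):
        return False
    # Same bug in Safari on macOS 10.14 (but not Chrome on that OS).
    if re.search(r"\(Macintosh;.*Mac OS X 10_14[_\d]*.*\) AppleWebKit",
                 user_agent) and "Chrome" not in user_agent:
        return False
    return True
```

A server would then emit `SameSite=None; Secure` only when this returns True, and omit the attribute entirely for the legacy clients.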
On top of preventing CSRF, it forces labeling of cross site cookies, commonly used for tracking. There's an option in the latest builds to remove all 3p cookies (and add a site allow list)
I find it ironic that Microsoft will be implementing it when IE11, the latest version on a fully patched Windows 7 host, doesn't support SameSite.
I have no idea why the host OS is relevant, but the feature should never have been limited by it, and it’s a nasty move given it’s a security feature and not just some random shiny new button.
Windows 7 is end of life in about 2 months. There should be no expectation of future updates on an OS past its EoL, especially when it had such a long support period.
SameSite was implemented in IE some time ago, but they didn't implement it on Windows 7 (which, as you pointed out, has not yet been end-of-lifed).
It's still in support, and the feature shouldn't even care about the host OS; I think the lack of implementation was more likely completely intentional (on the level of 'if Windows.version < 10 return true'), which is an absolute dick move.
Well to start, there's browser history sync through your Google account. I'm not aware what the privacy policy is on that data, but I'd imagine it allows Google to get aggregate views at least at the domain level, if not the page level.
GA usually comes from a Google domain. Implementing GA on one of your properties always means using 3rd-party cookies (unless you configure CNAME records to point your own 1st-party tracking domain at GA's tracking domain). Where the script is executed does not matter in this case; the script just builds the tracking request to the tracking app.
In my opinion, cookies on cross-site requests (third-party cookies) should be sent to web servers under a different request header, since they're semantically different (do not necessarily imply user authorization and intent) from cookies attached to a request initiated by the user from a first-party page.
I thought "third-party cookies" generally referred to cookies that were set in a third-party context. Whereas you're talking about cookies being read in a third-party context.
What is a scenario that would illustrate the difference? On paper third party cookie just means “any cookie from another domain”. How would you read another domain’s cookie without some feature that explicitly allows it, like the Chrome feature in question?