this post was submitted on 14 Nov 2023
190 points (97.5% liked)

Technology

all 49 comments
[–] [email protected] 64 points 11 months ago

Fuck chrome, fuck Google.

[–] [email protected] 45 points 11 months ago (2 children)

So, if we're using Firefox, we're good to go?

[–] [email protected] 36 points 11 months ago (1 children)

Good in what sense? Firefox already blocks third-party cookies as part of its Enhanced Tracking Protection (which you should set to the "strict" level; go do that right now if you haven't already).
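
For the curious, the "strict" setting corresponds roughly to these `about:config` prefs (a sketch based on current Firefox pref names; exact values can shift between versions):

```js
// user.js in a Firefox profile (assumption: current pref names)
user_pref("browser.contentblocking.category", "strict");
// 5 = reject known trackers and partition remaining third-party storage
user_pref("network.cookie.cookieBehavior", 5);
```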

[–] [email protected] 5 points 11 months ago

Thank you. Just did it.

[–] [email protected] 34 points 11 months ago (4 children)

Kill third party everything. No more CDNs, no more tracking pixels, no more cookies, no more content from anything but the domain in the url bar.

[–] [email protected] 31 points 11 months ago (2 children)

No more CDNs is a bad fucking idea.

[–] [email protected] 5 points 11 months ago* (last edited 11 months ago) (2 children)

Any CDN worth its salt can run on your domain, so that's not an issue. The issue is that "no third-party anything" is pointless, as links will just change from nyt.adnetwork.com to adnetwork.nyt.com. I'd rather not encourage those kinds of DNS shenanigans.

[–] [email protected] 3 points 11 months ago (1 children)

What about a CDN for JS libraries?

What about YouTube embeds?

What about images from Imgur?

Why should all of this be handled by me, on my domain?

[–] [email protected] 3 points 11 months ago* (last edited 11 months ago)

You'd tell cloudflare DNS "yo, put your stuff on cloudflare-cdn.mydomain.foo". Embeds should be iframes, that is, different webpages; imgur could do the same, though yes, it's overkill. Another option would be imgur offering an automated API that lets cloudflare DNS tell it "here's a key, please get ready to serve on imgur-cdn.mydomain.foo".
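
In zone-file terms, the arrangement described above would look something like this (all CNAME targets here are invented for illustration):

```zone
; mydomain.foo zone (sketch; record targets are hypothetical)
cloudflare-cdn.mydomain.foo.  300  IN  CNAME  mydomain.foo.cdn-provider.example.
imgur-cdn.mydomain.foo.       300  IN  CNAME  serve.imgur-cdn.example.
```

The browser only ever sees `*.mydomain.foo`, while the traffic actually lands on the provider's edge servers.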

It can all be handled on your domain without you actually running the backing servers. It's also insanity.

[–] [email protected] 2 points 11 months ago (1 children)

Running a CDN on your domain effectively defeats the purpose of a CDN.

[–] [email protected] 2 points 11 months ago (1 children)

No. Things being on your domain doesn't mean that traffic hits your servers.

[–] [email protected] -1 points 11 months ago (1 children)

It doesn't, but it defeats the purpose of a CDN, because your users still hit your domain instead of the CDN's and cannot leverage the benefits of distributed caching. The browser cache is bound to a URL; you change one letter and it is invalidated.
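
The URL-bound cache behavior can be sketched in a few lines (a toy model, not real browser internals):

```python
# A cache keyed by the full URL, showing why identical bytes served
# from two different hostnames get downloaded and stored twice.
cache = {}

def fetch(url, origin):
    """Return the resource, hitting the origin only on a cache miss."""
    if url not in cache:
        cache[url] = origin(url)  # cache miss: go out to the network
    return cache[url]

# The same library behind two different (hypothetical) hostnames:
origin = lambda url: "contents-of-" + url.rsplit("/", 1)[-1]

fetch("https://cdn.example.com/jquery.min.js", origin)
fetch("https://cdn.mysite.example/jquery.min.js", origin)

print(len(cache))  # 2: one entry per URL, even though the bytes are identical
```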

[–] [email protected] 1 points 11 months ago* (last edited 11 months ago) (1 children)

Why would the URL change?

It won't share JS libraries and fonts and whatnot cross-site, but compared to a single image that should be negligible. At least if you don't pull in gazillions of superfluous dependencies and don't even run dead-code elimination over them. And anyway, that's more bandwidth usage between user and CDN, not user and you.

Also I already said that it's insanity. But it would work.

[–] [email protected] -1 points 11 months ago (1 children)

Because you're not using the CDN URL everyone else is using.

Savings are massive for the user. If you don't care about your users, please stop doing anything development related.

[–] [email protected] 1 points 11 months ago (1 children)

You know what's faster than a CDN? Vanilla JS.

And how often do I have to repeat that it's insanity? It's just that user network traffic doesn't even come close to the top of reasons why it's a bad idea.

[–] [email protected] -2 points 11 months ago (1 children)

Insanity is what you have in your head.

[–] [email protected] 2 points 11 months ago* (last edited 11 months ago)

I wasn't the one advocating to outlaw cross-site everything. I only said that it could be made to work... not well, but still. Also that it's a bad idea. Do you disagree with that?

But yes I'm also insane how could you tell.

[–] [email protected] 17 points 11 months ago (2 children)

CDNs like CloudFlare reduce load on smaller servers through caching and delivery of common assets, which reduces load times (helping to democratize sites as it's not just big companies that can afford quick websites). CDNs also prevent DDoS attacks and can improve uptime.

They're pretty critical pieces of internet architecture. Not that they're perfect, but banning all third party content from sites is kind of a baby/bathwater situation.
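
That caching is visible in ordinary response headers; a cache hit at a CDN edge looks roughly like this (values illustrative; the hit/miss header name varies by provider, e.g. Cloudflare's `cf-cache-status` vs. `x-cache` elsewhere):

```http
HTTP/2 200
cache-control: public, max-age=14400
age: 2370
cf-cache-status: HIT
```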

[–] [email protected] 13 points 11 months ago

CDNs also reduce load on the network. Why pull a resource from a server on the opposite side of the world when a CDN on my doorstep can provide a cached version of it?

[–] [email protected] 2 points 11 months ago

Ah. Thanks!!

[–] [email protected] 0 points 11 months ago

The purpose of a CDN is to better cache common resources between different web sites. For example, if you're using the Roboto font from Google's CDN on your web site, just like many other web sites do, a user who previously visited other sites with that font will load your web site much faster and spend less traffic, because they already have the font from the CDN in their cache. It also means that you save money on hosting.
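
The scenario described is the classic Google Fonts embed (standard snippet; the font family is just an example):

```html
<link rel="preconnect" href="https://fonts.googleapis.com">
<link href="https://fonts.googleapis.com/css2?family=Roboto&display=swap" rel="stylesheet">
```

Worth noting: major browsers now partition the HTTP cache by top-level site, so this cross-site cache reuse largely no longer happens, though the origin-offload and proximity benefits mentioned above still apply.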

If you remove CDN from the equation, you punish yourself and your users. That's a very dumb idea. Especially when CDNs are free to use.

[–] [email protected] 15 points 11 months ago (1 children)

What if everything is just routed through the backend and still is able to track you?

[–] [email protected] 13 points 11 months ago

But that is a lot harder to do and requires more resources.

If you have a tracking pixel now, the company directly knows your browser from you downloading that pixel. If they were to implement the single-backend stuff, the site would have to gather all that information itself and then send it to all the trackers. But they can't just send it anywhere, because then everyone could send bogus info to them, so you need verification and an API, and that is costly, and each company would build their own API, so you'd need to buy a program that speaks all those APIs... You get the point. It's a LOT more work than just pasting the text for some pixel somewhere on your site and letting the others do the rest.
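
That "text for some pixel" is typically a one-line image tag like this (domain and parameters hypothetical):

```html
<img src="https://tracker.example/pixel.gif?site=example-news&page=%2Farticle"
     width="1" height="1" alt="" style="display:none">
```

The tracker's server logs the request (IP, user agent, cookies on its own domain) the moment the browser fetches the 1×1 image.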

[–] [email protected] 4 points 11 months ago

Yes please.

[–] [email protected] 1 points 11 months ago

This person absolutely HATES Twitter embeds.

[–] [email protected] 17 points 11 months ago (2 children)

Would this destroy site analytics?

[–] [email protected] 7 points 11 months ago (1 children)

Getting rid of the competition.

[–] [email protected] 4 points 11 months ago* (last edited 11 months ago)

Google would never do that. Google is: "Streamlining products to ensure business owners can expand on potential revenue sources by providing single-channel access to advanced site analytics while helping people optimize access to the broad landscape of available data sources."

Randomly insert "Advanced Next Generation AI" anywhere in the above for full effect.

[–] [email protected] 15 points 11 months ago (1 children)

No, it would not. If you're talking GA, that's all first party.
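
"First party" here refers to the cookies: the standard GA snippet loads its script from Google's domain, but the cookies it creates (e.g. `_ga`) are set on the embedding site's own domain (the measurement ID below is a placeholder):

```html
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<!-- the _ga* cookies this creates are first-party: scoped to the site's own domain -->
```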

[–] [email protected] 2 points 11 months ago

I'm thinking more along the lines of the aggressive cookies sites like Facebook and Tiktok use. For FB that's like their whole model.

[–] [email protected] 10 points 11 months ago (1 children)

This is the best summary I could come up with:


With the publication of its notice of intent to deprecate and remove third-party cookies, those involved in the development of Google's Chrome browser and its associated Chromium open source project now have more specific guidance.

As Google senior software engineer Johann Hofmann observed in his aforementioned notice, the phaseout of third-party cookies and shift to Privacy Sandbox technology – in Chrome at least – is a significant change in the status quo.

The impact of replacing the technical foundation of internet advertising while marketers are still doing business on the premises hasn't been lost on regulators, who have been trying to ensure that Google builds a level playing field – something critical lobbying groups have disputed.

Thus Google has agreed to make specific commitments to the UK's Competition and Markets Authority (CMA) to allay concerns that the Privacy Sandbox could become a kill zone for competitors.

While it seems unlikely that watchdogs want to ensure that every marketer operates from an equal level of informational wealth, competitors have a unique opportunity to hamstring the ad giant by raising the alarm amid its antitrust trials and inquiries around the globe.

"The web in general is rapidly moving away from third-party cookies, with Firefox and Safari leading the way," said EFF senior staff technologist Jacob Hoffman-Andrews in an email to The Register.


The original article contains 1,739 words, the summary contains 218 words. Saved 87%. I'm a bot and I'm open source!

[–] [email protected] 2 points 11 months ago

Isn't it sad that they referred to them as "Google Chrome coders" and not "web developers"?

Tells me everything I need to know.
