this post was submitted on 23 Apr 2024
548 points (97.4% liked)

submitted 6 months ago* (last edited 6 months ago) by [email protected] to c/[email protected]
 

Edward Zitron has been reading Google's internal emails released as evidence in the DOJ's antitrust case against the company.

This is the story of how Google Search died, and the people responsible for killing it.

The story begins on February 5th, 2019, when Ben Gomes, Google’s head of search, had a problem. Jerry Dischler, then the VP and General Manager of Ads at Google, and Shiv Venkataraman, then the VP of Engineering, Search and Ads on Google properties, had called a “code yellow” for search revenue due to, and I quote, “steady weakness in the daily numbers” and the likelihood that it would end the quarter significantly behind.

HackerNews thread: https://news.ycombinator.com/item?id=40133976

MetaFilter thread: https://www.metafilter.com/203456/The-core-query-softness-continues-without-mitigation

[–] [email protected] 6 points 6 months ago (1 children)

I was thinking of something slightly different. It would be automatic; a bit more like a "federated Google" and less like old-style indexing sites. It would work something like this:

  • there are central servers with info about pages on the internet
  • you perform searches through a program or add-on (let's call it "the software")
  • as you perform searches through the software, it also crawls the web and assigns a "desirability" value to the pages it visits, feeding that score back to the server you're using
  • the algorithm is open and, if you so desire, you can create your own server and fork the algorithm

It would still be vulnerable to SEO, but less so than Google, because SEO tailored to the algorithm one server uses won't necessarily work well against another server's.

Please note, however, that this is "ideas guy" tier. I wouldn't be surprised if it's unviable for some reason I'm not aware of.
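
To make the idea concrete, here's a minimal sketch of what the client side could look like. Everything in it is hypothetical: the server URL, the `/search` and `/report` endpoints, and the desirability heuristic are placeholders, not a real protocol.

```python
import json
import re
import urllib.parse
import urllib.request

SEARCH_SERVER = "https://search.example.org"  # hypothetical central server


def search(query: str) -> list:
    """Ask the server for pages matching the query (assumes a JSON API)."""
    url = f"{SEARCH_SERVER}/search?{urllib.parse.urlencode({'q': query})}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)  # e.g. [{"url": ..., "title": ...}, ...]


def desirability(html: str) -> float:
    """Toy heuristic: favour text-heavy pages over markup-heavy ones.
    Forking "the algorithm" means replacing this function."""
    text = re.sub(r"<[^>]*>", "", html)  # crude tag stripping
    return len(text) / len(html) if html else 0.0


def crawl_and_report(page_url: str) -> None:
    """Visit a result page, score it locally, send the score back."""
    with urllib.request.urlopen(page_url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    payload = json.dumps({"url": page_url, "score": desirability(html)})
    req = urllib.request.Request(
        f"{SEARCH_SERVER}/report",  # hypothetical reporting endpoint
        data=payload.encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)


if __name__ == "__main__":
    for result in search("federated search"):
        crawl_and_report(result["url"])
```

The point of keeping desirability() small and swappable is that forking a server really would mean forking the scoring function, which is what makes one-size-fits-all SEO harder.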

[–] [email protected] 1 points 6 months ago

I think you could do it in Lemmy itself, combined with RSS feeds. The mods would curate a list of RSS feeds, and a bot would use each entry's meta keywords to decide what to post automatically (so if a programming blog published a post about windsurfing, it wouldn't show up, as long as the meta keywords didn't match). Mods could take suggestions each week for feeds to add or remove.
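
A rough sketch of that bot, assuming the feedparser library for the RSS side; the feed list, keyword list, and post_to_lemmy() stub are placeholders for whatever the mods actually settle on.

```python
import feedparser  # pip install feedparser

# Mods curate these two lists; suggestions reviewed weekly.
FEEDS = ["https://example.org/programming/feed.xml"]  # hypothetical feed
KEYWORDS = {"python", "rust", "compilers"}


def matches(entry) -> bool:
    """Match only on the feed's own tags/categories, so an off-topic post
    on an otherwise on-topic blog gets skipped."""
    tags = {t.get("term", "").lower() for t in entry.get("tags", [])}
    return bool(tags & KEYWORDS)


def post_to_lemmy(title: str, url: str) -> None:
    # Placeholder: a real bot would call the Lemmy HTTP API here.
    print(f"Would post: {title} -> {url}")


for feed_url in FEEDS:
    feed = feedparser.parse(feed_url)
    for entry in feed.entries:
        if matches(entry):
            post_to_lemmy(entry.get("title", ""), entry.get("link", ""))
```

Filtering on the feed's own tags rather than full-text matching is what keeps the windsurfing post out.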