

I had a similar experience where we had an entire class for Novell Directory Services. The reason our teacher gave for keeping the class in the curriculum? We MAY run into it in the workforce.
DNSSEC is a means of authenticating that the data received was not tampered with (e.g., by MITM attacks), thus ensuring data integrity. It uses PKI, but it's not an alternative to DoH or DoT, which encrypt the DNS traffic over HTTPS or TLS respectively, providing confidentiality.
DNSSEC can be used in conjunction with DoH or DoT to achieve confidentiality, integrity, and authenticity together.
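As a quick illustration of the split (a sketch, assuming `dig` and, for the encrypted query, `kdig` from knot-dnsutils are installed, and using Cloudflare's 1.1.1.1 purely as an example of a validating resolver):

```shell
# Request DNSSEC data for a signed zone from a validating resolver.
# The "ad" (Authenticated Data) flag in the reply header means the
# resolver verified the DNSSEC signatures (integrity/authenticity).
dig +dnssec example.com A @1.1.1.1

# The query above still travels in cleartext UDP/TCP. For
# confidentiality, send the same query over TLS (DoT) instead:
kdig +tls @1.1.1.1 example.com A
```

The validated `ad` flag plus the encrypted transport is exactly the integrity + confidentiality pairing described above.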
Considering the recent revelations about the shady, scummy and unethical business practices by Honey, I can’t say I’m surprised that one of the co-founders is doing more shady shit with their new endeavor.
Future Cop: LAPD
Though the game wasn’t groundbreaking, it was fun going around LA in a giant mech blowing stuff up.
I really liked the ability to transform from a bipedal mech to a fast hover car which also helps with the pacing of the game.
It did introduce me to a tower-defense PvP style of multiplayer that my best friend and I were hooked on for a solid couple of months.
Does Codeberg have anything that will prevent an influx of the bots or AI accounts that have plagued GitHub?
I ask because as the user base for Codeberg grows, the bots, AI, and nefarious actors will follow.
I like the idea of a federated source code hosting platform, especially since it removes lock-in to a single corporation and a de facto monopoly.
That in itself is a good enough reason to migrate, but regarding this particular issue (bots/AI and artificial project promotion for malicious intent), it feels like rearranging deck chairs on the Titanic.
It’s all good; we both clarified our* thoughts on the matter. And to be fair, using “ruined” instead of “ruining” or “started to ruin” indicates a completed process or final state rather than a continuous one.
I agree that previously one could construct a search to sort the noise out, but as you stated, this has become unfeasible without a sharp increase in the number of queries needed to refine results. That has shifted the discussion from questioning whether Google search is bad to a generally accepted belief that it is, to the point where people are trying to quantify the decline and provide evidence to back up the claim.
This article links to a research paper on the topic: https://www.fastcompany.com/91012311/is-google-getting-worse-this-is-what-leading-computer-scientists-say
*Fixed typo of ‘out’ to ‘our’
“Public” here has nothing to do with intelligence, but rather refers to people outside of the companies working on AI/LLMs or doing AI research. It’s why I mentioned it entering the zeitgeist.
I never mentioned a hard cutoff; I said they ruined it before LLMs were in use by the general public. Essentially I’m referring to the start of the degradation of Google’s search, when they made conscious decisions that deliberately put profit above all else.
Avid Amoeba is right that Google ruined their own search before LLMs entered the public consciousness (this does not mean LLMs didn’t exist before then, but that they were not widely available to the general public and had not yet become part of the zeitgeist).
If you don’t agree, please listen to the Better Offline podcast episode “The Man That Destroyed Google Search”. The episode goes through the rollbacks/changes Google made to their search algorithm well before AI was commonplace.
Better Offline: CZM Rewind: The Man That Destroyed Google Search: https://omny.fm/shows/better-offline/czm-rewind-the-man-that-destroyed-google-search
In contrast, Packt is on the crap end.