

LLMs work in some limited use cases with good data to draw from. Most companies, especially large corporations, have shit data. Current AI cannot fix bad data. They will never do what these guys are promising they will do.
Most have four buttons on the left and four on the right. The second or third button on the right side has been the mute button on every system I’ve tried.
You don’t have to have a total lack of self-control to be prone to overconsumption. That’s a caricature whose main function is to make you feel better about your own habits, and it keeps many people from ever examining their habits in any detail.
No you’re not. Advertising, propaganda, and overconsumption impact everyone.
Come back to what you’ve posted here in two years and read it again. You’re trusting people with data that you really shouldn’t. Perhaps that’s an acceptable risk to you, but be sure you can live without the privacy you think you have, because that’s a really bad bet.
I’m not going to touch the ethical and emotional minefield that is flirting with a chatbot, but I will say that those conversations are definitely not private. That whole industry is built on harvesting other people’s data. Don’t do anything with an LLM that you can’t handle other people finding out about, because there’s a very good chance they will.
It was a pretty convenient way for them to hand pick a nominee. Given how the last two played out I don’t think that’s an unfair lens to view the situation through.
They did the same in 2020, then did away with all pretense of input in 2024 and just told us who the candidate was going to be.
Why would capitalists give a fuck about any constraint? We’ve conditioned them for a century to believe they can do anything they want with impunity.