

It's like the problems around the ozone hole, or acid rain.
A lot of people scrambled and worked very hard to find an alternative that didn’t cause problems, and now it’s almost like they never existed, and people think it was much ado over nothing.
I don’t think it will. It hasn’t so far, despite there having been Linux phones in the past, and it’s got the Windows Phone/Symbian problem: users don’t go to it because it lacks support for the apps they use, and developers don’t support it because it doesn’t have much by way of users.
Or give an option to toggle. Surely letting people turn it off would save them even more resources, if they don’t have to bother with upscaling the video in the first place.
Honestly, that should have been for the better. If it’s meant to be a tool, I would much rather it behave like a tool, rather than trying to be my best friend, or an evil vizier trying to give me advice.
The fact that people got so attached to what is essentially a text generation algorithm that they were mourning its “death” is worrying, especially when it’s one that OpenAI has proven themselves to be more than able to modify as they wish.
Just as concerning is OpenAI rolling back the update to make their model “friendlier”, or that people were falling over themselves to throw money at the company in the hopes of getting their “friend” back.
That can’t possibly be good news, especially once the shareholders find out that the company has an iron grip over a portion of its users.
The worst part is that they backtracked a bit and made it “friendlier”, basically undoing that change.
Malvertising and scams just make that a surefire thing, especially since there’s a chance that just loading an ad could infect your machine.
And for less tech-savvy family members, it cuts down on the risk of them falling for scams or suchlike.
The website would still have to display to users at the end of the day. It’s a similar problem to trying to solve media piracy. If worst comes to worst, the crawlers could just read the page the way a person would.
Not without making real users mine bitcoin too, or driving them away from the site because their performance tanked.
They could have kept it at the same price, though.
Once it generates the response, there is a button you can click to make it use the reasoning model.
Why they did it that way instead of letting users set the model they want ahead of time boggles the mind. Surely it would be more efficient to let them choose a model up front, rather than generating something that’s just going to be regenerated with the desired model anyway.
Depends on it and its dependencies, probably. A lot of the core utilities are generally unchanged enough that they should still work despite being a decade old.
Can’t have issues on the issue tracker if you’re not allowing people to submit issues.
It’s only made worse by the people who treat it like the Master Computer from Star Trek, claim that it can solve all the problems, and thus attempt to shove it into anything and everything.
It’s baffling why my notepad needs to be hooked up to an LLM in the first place. It’s a notepad, for quick scribbling. If people want to write something serious in it, there are far better things for that.
In my experience, it’s been a bit of a mixed bag. There are some things that work in Linux, and some things that don’t, even after a bit of fiddling. My desktop’s front panel is completely unusable on Linux, for example.
Windows is at least widespread enough that it’s far more likely that parts will work on it at least to some degree. And sure enough, the front panel works fine there.
Although 10 years ago isn’t that long in computer terms any more. Those are machines that can still run Windows 10 without issue: older computers, but still perfectly usable these days.
Unless it wasn’t as low as they wanted it. It’s at least cheap enough to run that they can afford to drop the pricing on the API compared to their older models.
Could you not just get an eGPU dock, and do it that way?