

F-Zero X ran at 60 fps. Also Yoshi’s Story, Mischief Makers, and probably a few others.
Also the PS1 has many games that ran at 60 fps, too many to list here in a comment.
I don’t understand all the technicals myself, but it has to do with the way every pixel in an OLED is individually self-lit. Pixel transitions can be essentially instant, but due to the lack of any ghosting whatsoever, low frame rate motion can look very stilted.
Also, the inherent LCD latency thing is a myth; modern gaming monitors have little to no added latency even at 60 Hz, and at high refresh rates they’re faster than 60 Hz CRTs.
That’s a misunderstanding. CRTs technically don’t have refresh rates, outside of the speed of the beam. Standards were settled on based on power frequencies, but CRTs were equally capable of 75, 80, 85, 120Hz, etc.
The LCD latency has to do with input polling and how its timing interacts with display latency and refresh rates. There’s also added latency from things like wireless controllers.
The actual frame rate of the game isn’t necessarily relevant. If you run a 60 fps game on a 120 Hz display and enable black frame insertion, you get reduced input latency even at 60 fps, because doubling the display’s refresh rate increases the polling rate, which is tied to frame timing. Black frame insertion or frame doubling doubles the refresh, cutting input delay roughly in half (not quite, because of overhead, but hopefully you get the idea).
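To put rough numbers on that "roughly in half" claim, here's a quick back-of-the-envelope sketch. The numbers are illustrative, not measurements; the idea is just that the worst-case wait for the next scan-out is one refresh interval.

```python
# Illustrative sketch: worst-case wait for the next display refresh
# is one refresh interval, so doubling the refresh rate halves it.
def refresh_interval_ms(hz: float) -> float:
    """Time between display refreshes, in milliseconds."""
    return 1000.0 / hz

wait_60 = refresh_interval_ms(60)    # ~16.7 ms worst case at 60 Hz
wait_120 = refresh_interval_ms(120)  # ~8.3 ms worst case at 120 Hz

# The game can still render at 60 fps; this component of the
# input-to-display chain shrinks anyway.
print(f"60 Hz: {wait_60:.1f} ms, 120 Hz: {wait_120:.1f} ms")
```

This only models one link in the chain (display refresh), which is why the real-world improvement is a bit less than a clean halving.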
This is why, for example, the Steam Deck OLED has lower input latency than the original Steam Deck. It can run up to 90 Hz instead of 60, and even at a lowered refresh rate it has reduced input latency.
Also, regarding LCD, I was more referring to TVs since we’re talking about old games (I assumed consoles). Modern TVs have a lot of post-processing compared to monitors, and in a lot of cases there’s gonna be some delay because it’s not always possible to turn it all off. The lowest-latency TVs I know of are LG, as low as 8 or 9 ms, while Sony tends to be awful, between 20 and 40 ms even in “game mode” with processing disabled.
I hate that. I had my home built to spec a few years ago. The exterior siding is cedar shake stained a chocolatey brown with forest green trim, and the interior is white walls but with natural wood trim, pale golden laminate wood flooring, and two tone hickory wood cabinets, and the interior doors are all just natural wood unpainted.
I’ve leaned into the wood aesthetic with my DIY standing desk and custom pine desktop stained a dark red oak color, among various other earth tone color hints, and splashes of brighter decoration here and there.
Was going for “cozy cabin/cottage” and I think we nailed it. It’s very rustic.
I really hate the modern trends of white, black, steel, and glass.
A few things. First, frame timing is critical, and modern games aren’t programmed as close to the hardware as older games were.
Second is the shift from CRT to modern displays. LCDs have inherent latency that is exacerbated by lower frame rates (again, related to frame timing).
Lastly, with the newest displays like OLED, because of the way the screen updates, lower frame rates can look really jerky. It’s why TVs have all that post-processing and why there are no “dumb” TVs anymore. Removing the post-processing improves input delay, but also removes everything that makes the image smoother, so higher frame rates are your only option there.
Sounds great. I’m in my 40s with myopia, astigmatism, and more recently, presbyopia.
Progressive lenses don’t work for me, and needing two pairs of glasses is not ideal, even if it mostly works. Plus I can’t even just buy reading glasses off the shelf, even my short range office lenses need a prescription and are expensive as hell.
Autofocusing lenses sound like an awesome alternative.
As a former VMware employee this is just sad.
VMware was a great place to work, with a lot of people who cared about what they were building and supporting, and now it’s just a hollowed out vulture capitalist’s pump and dump. Anybody with any sense is migrating to alternatives, if they haven’t already.
This is a weirdly aggressive take without considering variables. Almost petulant seeming.
6” readers are relatively cheap no matter the brand, but cost goes up with size. $250 to $300 is what a 7.8” or 8” reader costs, but there’s not a single one I know of at 6” at that price.
There are also 10” and 13” models. Are you saying they should cost the same as a Kindle?
Not to mention, regarding Kindle, Amazon spent years building the brand but selling either at cost or possibly even taking a loss on the devices as they make money on the book sales. Companies who can’t do that tend to charge more.
Lastly, it’s not “feature creep” to improve the devices over time, many changes are quality of life. Larger displays for those that want them. Frontlit displays, and later the addition of warm lighting. Displays essentially doubled their resolution allowing for crisper fonts and custom fonts to render well. Higher contrast displays with darker blacks for text. More recently color displays as an option.
This is all progress, but it’s not free. Also, inflation is a thing and generally happens at a rate of 2% to 3% annually or thereabouts during “normal” times, and we’ve hardly been living in normal times over the last decade and a half.
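Inflation compounds, which people tend to underestimate. As a quick illustration (the 2.5% rate and the $140 base price are just assumptions for the example, using the K3 price mentioned elsewhere in this thread):

```python
# Illustrative only: compound an assumed 2.5% annual inflation over
# 14 years (roughly 2011 to 2025) to see how much a "same" price drifts.
base_price = 140.0   # e.g. a $140 reader bought in 2011
rate = 0.025         # assumed mid-range annual inflation
years = 14

adjusted = base_price * (1 + rate) ** years
print(f"${base_price:.0f} in 2011 is roughly ${adjusted:.0f} in 2025 dollars")
```

So a device that "costs more" today than its 2011 equivalent may not have gotten more expensive in real terms at all.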
I would rather pay more for a better device, and preferably not one from Amazon if I can help it. It’s only a matter of time before they start cracking down even more on sideloading.
They already started that technically with removing USB downloads. I got sick of their shit and jailbroke my Kindles. They live in KOReader now.
Is the price of an eReader that big of a deal? They practically pay for themselves with use over time, and they last a ridiculous number of years.
My first Kindle was the K3 Keyboard for $140 in 2011. It finally died in late 2018 after nearly 8 years of use. I regrettably binned it, as I didn’t know you could replace the battery at the time. Shame, I really liked that thing.
I bought a Kindle PW4 for “cheap” ($80 or $90?) in 2019 to replace it, but I hated it after spending some months reading on a larger tablet. I replaced it with a “premium” Boox Nova 2 eReader for $310, and I still use that one today. I plan to just get a cheap battery replacement when it kicks the bucket, as it’s easily user serviceable and a new battery for it is less than $15.
I also got a Kindle Paperwhite Signature in 2023 for $135 as an “upgrade” to the Boox, but it was more of a sidegrade. I alternate between the two today.
So I’ve on average paid about $48 a year on eReaders. Seems reasonable considering how many books I’ve gotten for free or very deep discounts via stuff like Bookbub, as well as “free” Prime First reads and Kindle Unlimited books I read over the years as a Prime subscriber, Project Gutenberg and Standard eBooks, as well as digital library access.
I’ve paid more than $48 in one month for subscription services at times that I used less than my eReaders, which see use daily. And you don’t have to be like me and buy multiple, you can buy one reader and use it pretty much indefinitely so long as the battery is user replaceable, so the upfront cost is sort of irrelevant over a long enough time span.
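For anyone who wants to check my math, here's the rough arithmetic behind the ~$48/year figure (the PW4 price is assumed to be the midpoint of the "$80 or $90?" range I gave above):

```python
# Rough cost-per-year check using the purchases listed above.
purchases = {
    "Kindle K3 Keyboard (2011)": 140,
    "Kindle PW4 (2019)": 85,   # assumed midpoint of "$80 or $90?"
    "Boox Nova 2 (2019)": 310,
    "Kindle PW Signature (2023)": 135,
}
years_of_use = 2025 - 2011  # 14 years and counting

total = sum(purchases.values())
per_year = total / years_of_use
print(f"Total ${total}, about ${per_year:.0f}/year")
```

And that per-year number keeps dropping as long as the devices stay in service.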
True, but even that is higher than the latency was on the original systems on CRT. My previous comments were specific to display tech, but there’s more to it.
Bear in mind I can’t pinpoint the specific issue for any given game but there are many.
Modern displays, even the fastest ones, have frame buffers for displaying color channels. That’s one link in the latency chain. Even if the output were otherwise equally fast as a CRT, this would cause more latency in 100% of cases, as CRT was an analogue technology with no buffers.
Your GPU has a frame buffer that’s essentially never less than one frame, and often more.
I mentioned TVs above re: post processing.
Modern games sometimes add delays to synchronize data between the CPU and GPU.
Older consoles were simpler and didn’t have shaders, frame buffers, or anything of that nature. In some cases the game’s display output would literally race the beam, altering display output mid-“frame.”
Modern hardware is much more complex and despite the hardware being faster, the complexity in communication on the board (CPU, GPU, RAM) and with storage can contribute to perceived latency.
Those are some examples I can think of. None of them alone would be that much latency, but in aggregate, it can add up.
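To make the "it adds up" point concrete, here's a hypothetical latency budget. Every number below is an illustrative guess, not a measurement; the point is that several small links produce a noticeable total compared to a near-zero chain on a CRT-era console.

```python
# Hypothetical latency budget: all numbers are illustrative guesses,
# just to show how individually small links add up in aggregate.
latency_chain_ms = {
    "wireless controller": 4.0,
    "game engine / CPU-GPU sync": 8.0,
    "GPU frame buffer (~1 frame @ 60 fps)": 16.7,
    "display processing / buffer": 5.0,
}

total = sum(latency_chain_ms.values())
for stage, ms in latency_chain_ms.items():
    print(f"{stage:40s} {ms:5.1f} ms")
print(f"{'total':40s} {total:5.1f} ms")  # vs. near-zero on a CRT console
```

None of those stages looks bad on its own, but in aggregate you're well past two frames of delay at 60 fps.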