JavaScript frameworks were invented because pure HTML and CSS suck for dynamically loaded pages, and vanilla JavaScript sucks in general.
What’s not shown is that the car doesn’t have an engine. Management was really eager to release it to the customer. Don’t worry, it’s planned to get fixed later (spoiler: it’s never going to get fixed).
If you’re concerned I think you should consider getting it checked out (or at least browse the ADHD communities to see if there are any other patterns you can spot).
I recently got diagnosed with autism at ~30. I’ve lived my entire life under the impression that I’m ok, only to realize I’ve never really been ok.
One of the diagnostic criteria of ADHD is that it’s lifelong, which means it can’t be acquired. However, it’s possible to acquire symptoms that are similar to ADHD, but then it’s probably something else.
Rust and Cargo were built to be in symbiosis with each other.
NPM is an afterthought of a rushed language.
Then you haven’t seen bad documentation (or had that sex you regret).
There’s also “we do machine learning”, which usually translates to “someone trained an SVM model 10 years ago”.
Desynchronization will likely happen, considering heart rate varies. Both hearts would somehow have to change rate by exactly the same amount; any slight variation will cause them to drift out of sync.
Once they’re out of sync, it’s going to be hard for them to get back in sync.
This is assuming both hearts are independent systems. It could be a different story if they’re dependent (like connected in series rather than in parallel), but in that case it’s conceptually no different from having one heart.
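A toy sketch (with made-up numbers, purely my own illustration) of how even a tiny constant rate difference accumulates into whole beats of drift:

```javascript
// Two independent "hearts" with a slightly different rate (made-up numbers).
const bpmA = 60;   // heart A: 60 beats per minute
const bpmB = 60.5; // heart B: only 0.5 bpm faster
for (let min = 0; min <= 4; min++) {
  // cumulative offset between the two, measured in beats
  const drift = (bpmB - bpmA) * min;
  console.log(`after ${min} min: ${drift.toFixed(1)} beats apart`);
}
// After 1 minute they're already half a beat apart (fully out of phase).
// Real heart rates fluctuate, so the offset wanders unpredictably instead
// of growing linearly, which is why resynchronizing is unlikely.
```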
Not often. Maybe once every half year.
Don’t worry, he’ll back down either way. He’ll just claim that he’s made a great deal with the EU even if no such deal has been made, and then lower the tariffs again.
The creator of curl just posted a rant about users submitting AI-slop vulnerability reports. It has gotten so bad that they will reject any report they deem AI slop.
So there’s some data.
There’s a mix of reasons to start a hobby project.
One reason for starting a hobby project is the learning experience, for example learning a new programming language or a particular tech stack. The goal isn’t to build something useful; often it’s rebuilding things that already exist, like a Minecraft server or a Game Boy emulator.
Another reason is to build something useful for you. Maybe you have an idea of a program you feel should exist. Or maybe a program exists, but not in the way you want. Building it yourself can bridge this gap. Hopefully someone else might find your program useful.
Then there are also the people who do it for fun. It’s kind of like building a model railway: the process of building it can be more fun and rewarding than actually using it.
Regarding Linux, it’s mostly a matter of preference. There are some things that are easier in Linux. Mac and Windows can sometimes be “overly protective” and prevent the user from doing particular things. Linux generally has fewer such barriers.
“Hello fellow kids” vibes
Fun fact: New Super Mario Bros turns 19 this year.
Yes, that means we’re close to the turning point where New Super Mario Bros becomes older than Super Mario Bros was when New Super Mario Bros came out. Super Mario Bros (1985) was about 21 years old when New Super Mario Bros launched in 2006, so the crossover happens around 2027.
Just an additional note: the xz backdoor is well known because it was found. It was found mostly because it’s FOSS. It’s doubtful it would’ve been found if it were closed source.
Imagine how many xz-like exploits are live today that haven’t been detected yet. Are such exploits more prevalent in open source or closed source software?
Most of that cost likely wasn’t for the hardware itself, but rather Nintendo’s greed. Most of it was probably for the early access to Nintendo’s next console, and possibly direct support from Nintendo.
Interesting paper. I skimmed through it quickly, but it seems like they wanted to avoid relying on ray tracing:

“Minimal ray tracing. Many non-local lighting effects can be approximated with texture maps. Few objects in natural scenes would seem to require ray tracing. Accordingly, we consider it more important to optimize the architecture for complex geometries and large models than for the non-local lighting effects accounted for by ray tracing or radiosity.”
Most of the paper is way above my understanding, so I’m not qualified to judge.
They used top-of-the-line hardware specialized for 3D rendering. Seems like they used Silicon Graphics workstations, which cost more than $10k back in the day. Not something the typical consumer would buy. The calculations are probably a bit off with this taken into account.
Then they likely relied on rendering techniques optimized for the hardware they had. I suspect modern GPUs aren’t exactly compatible with these old rendering pipelines.
So multiply by 10 or so and I think we have a more accurate number.
Did Toy Story use ray tracing back then?
AFAIK, A Bug’s Life is the first Pixar movie that used ray tracing to some extent, and that was for a few reflections. Monsters University is the first Pixar movie that was fully ray traced.
jQuery got popular because Internet Explorer, Firefox, Chrome and other browsers weren’t exactly cross-compatible. Writing vanilla JS was risky business in that sense.
It also supported AJAX across all major browsers, which meant a website could make API requests without reloading the entire page. It was super revolutionary to press a button and have it change only part of the page.
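A sketch of why that felt revolutionary: the pre-jQuery cross-browser dance versus the jQuery equivalent. The /api/items endpoint and element IDs here are made up for illustration:

```javascript
// Pre-jQuery: pick an XHR implementation by hand (old IE only shipped
// XMLHttpRequest as an ActiveX object).
var xhr = window.XMLHttpRequest
  ? new XMLHttpRequest()
  : new ActiveXObject('Microsoft.XMLHTTP');
xhr.onreadystatechange = function () {
  if (xhr.readyState === 4 && xhr.status === 200) {
    document.getElementById('items').innerHTML = xhr.responseText;
  }
};
xhr.open('GET', '/api/items', true);
xhr.send();

// With jQuery: one call that behaved the same in all major browsers.
$('#load').on('click', function () {
  $('#items').load('/api/items'); // fetch and inject without a page reload
});
```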
Then Angular and React took it a step further, and that’s where we are now.