I know, but it does let you sled off a cliff if you choose to.
o hai mark
Prefix the name with what it’s for. For example, I’ve previously had a `SoundFontError` that came from opening a soundfont file.
“Error” is already used by `std::error::Error`. It might not be imported by the code that imports your error type, but I think it’s better to give it a distinct name.
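As a rough sketch, a distinctly named error type might look like this (the variants here are made up for illustration):

```rust
use std::fmt;

// Distinct name: `SoundFontError`, not just `Error`.
#[derive(Debug)]
pub enum SoundFontError {
    FileNotFound,
    InvalidHeader,
}

impl fmt::Display for SoundFontError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            SoundFontError::FileNotFound => write!(f, "soundfont file not found"),
            SoundFontError::InvalidHeader => write!(f, "invalid soundfont header"),
        }
    }
}

// `Error` requires `Debug + Display`, both provided above.
impl std::error::Error for SoundFontError {}
```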
The other thing is that you might want to use more than one library. Typical imports at the top of the file might look like this:
use bingbong::{BingBong, BingBongError, SomethingElse};
use bogos::binted::crotchidizer::{Crotchidizer, CrotchidizerError};
If both libraries named their error enums just “Error”, the collision could be worked around, but it’s an unnecessary extra step:
// Not sure how renaming affects compiler hints.
use bingbong::{BingBong, Error as BingBongError, SomethingElse};
use bogos::binted::crotchidizer::{Crotchidizer, Error as CrotchidizerError};
or if you want to avoid renaming:
use bingbong::{BingBong, SomethingElse};
use bogos::binted::crotchidizer::{self, Crotchidizer};
/* ... */
match result {
    Ok(value) => return value,
    Err(bingbong::Error::MissionFailed) => panic!(),
    Err(bingbong::Error::UrMom) => todo!(),
    _ => unreachable!(),
}

if let Err(crotchidizer::Error::SomethingsWrong) = result2 {
    // ...
}
If the screenshot had followed conventions, the message would say something like this:
could not convert error type `BingBongError` to `MyAppError`
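And the conversion that message is about is normally supplied with a `From` impl, so `?` can do it implicitly. A minimal sketch, with the shape of `MyAppError` invented for illustration:

```rust
#[derive(Debug)]
pub enum BingBongError {
    MissionFailed,
    UrMom,
}

// Hypothetical app-level error wrapping the library error.
#[derive(Debug)]
pub enum MyAppError {
    BingBong(BingBongError),
}

// With this impl, `?` converts a `BingBongError` into a
// `MyAppError` automatically as the error propagates.
impl From<BingBongError> for MyAppError {
    fn from(err: BingBongError) -> Self {
        MyAppError::BingBong(err)
    }
}

fn bingbong_call() -> Result<(), BingBongError> {
    Err(BingBongError::MissionFailed)
}

fn run() -> Result<(), MyAppError> {
    bingbong_call()?; // error converted via the `From` impl
    Ok(())
}
```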
You can wrap everything in unsafe and keep living dangerously!
I don’t know if this will be attempted, but I helped. Godspeed, OP
You really should avoid naming your type plain “Error”
I don’t know what app you’re using, but that spoiler tag isn’t part of the spec.
Node: Did you say “Nerd”?
*checks out on jazz fire alarm*
Subscribed
The b is lower case, so it should be bits.
Romanticizing “past greatness” seems to always involve some very shit politics. It’s more obvious in these old empires, but it exists in more subtle forms elsewhere, too.
I was specifically talking about euros, but I guess a certain US president gets an honourable mention for his campaign slogan
At least here in Finland, the military part is optional.
C# is like Microsoft-branded Java. No real difference in the language, but some of the tooling for Java is worse.
That’s just basically looking up the answer. … I will not learn anything from it.
Looking up the answer is the way to do it. You’re of course supposed to pay at least some attention instead of copy-pasting without using your brain. As you keep doing things, you’ll develop a rough idea of how things are done.
Even if I find it (which is unlikely without asking an LLM)
But I don’t know how I did it, and I couldn’t recreate it by myself.
You mean building the thing without any reference? Except for the most basic stuff, you’re not supposed to memorize everything down to the smallest detail. Imagine asking a lawyer to know the details of every single law off the top of their head.
Seriously, go build that clock.
With basically no knowledge?
Well yeah. You find yourself some simple project and try to build it. When you don’t know how to do something, you look it up.
Like, make a command-line clock, for example. Figure out how to get the current time, and then how to print it. And after that, how to make it print the time once a second. (There’s a quick sketch below.)
Edit: probably the most important skill in programming is breaking the problem into smaller pieces that you can then figure out. With experience, getting stuck like this becomes much less of a problem.
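Here’s a minimal sketch of that clock, assuming the `chrono` crate for getting and formatting the local time:

```rust
use chrono::Local;
use std::{thread, time::Duration};

fn main() {
    loop {
        // Step 1: get the current time. Step 2: print it.
        println!("{}", Local::now().format("%H:%M:%S"));
        // Step 3: wait a second, then print again.
        thread::sleep(Duration::from_secs(1));
    }
}
```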
I don’t think it’s a linear progression where one ingredient is a “step.” I don’t really use fresh pasta, but I make the food myself instead of buying a canned sauce.
I’m sure Counter-Strike would be a decent option
I was vaguely aware that some ancient architectures had weird byte widths, but I did not know about this. Pretty interesting.
This paper cannot succeed without mentioning the PDP-10 (though noting that PDP-11 has 8-bit bytes), and the fact that some DSPs have 24-bit or 32-bit words treated as “bytes.” These architectures made sense in their era, where word sizes varied and the notion of a byte wasn’t standardized. Today, nearly every general-purpose and embedded system adheres to the 8-bit byte model. The question isn’t whether there are still architectures where bytes aren’t 8-bits (there are!) but whether these care about modern C++… and whether modern C++ cares about them.
Oh boy, now I can stop missing C++