The root cause here is poorly named settings.

If the original setting had been named something bool-y like `help.autocorrect_enabled`, then the request to accept an int (deciseconds) would've made no sense. Another setting, `help.autocorrect_accept_after_dsec`, would've been required. And `dsec` is so oddball that anyone who uses it would've had to look it up.

I insist on this all the time in code reviews. Variables must have units in their names if there's any ambiguity. For example, `int timeout` becomes `int timeout_msec`.

This is 100x more important when naming settings, because they're part of your public interface and you can't ever change them.
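A minimal sketch of what that looks like in practice (the names here are made up for illustration):

  #include <cstdio>

  // The unit is part of the identifier, so a bare number at the call site
  // can't silently change meaning.
  void Connect(int timeout_msec) {
    std::printf("connecting with a %d ms timeout\n", timeout_msec);
  }

  int main() {
    int connect_timeout_msec = 5000;  // unambiguously milliseconds
    Connect(connect_timeout_msec);
  }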

> I insist on this all the time in code reviews. Variables must have units in their names if there's any ambiguity. For example, `int timeout` becomes `int timeout_msec`.

Same here. I'm still torn about pushing this into the type system, but my general rule of thumb in a C++ context is:

  void FooBar(std::chrono::milliseconds timeout);
is OK, because that's a function signature and you'll see the type when you're looking at it, but with variables, `timeout` is not OK, as 99% of the time you'll see it used like:

  auto timeout = gl_timeout; // or GetTimeoutFromSomewhere().
  FooBar(timeout);
Common use of `auto` in C++ makes it a PITA to trace down the exact type when it matters.

(Yes, I use IDE or a language-server-enabled editor when working with C++, and no, I don't have time to stop every 5 seconds to hover my mouse over random symbols to reveal their types.)
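To make that concrete, a small sketch (reusing the `FooBar` signature above) of spelling out the chrono type at the declaration instead of using `auto`:

  #include <chrono>
  #include <cstdio>

  using namespace std::chrono_literals;

  // Same signature as above, with a toy body for the sake of the example.
  void FooBar(std::chrono::milliseconds timeout) {
    std::printf("timeout = %lld ms\n",
                static_cast<long long>(timeout.count()));
  }

  int main() {
    // Spelling out the duration type (instead of `auto`) keeps the unit
    // visible at the point of use.
    std::chrono::milliseconds timeout = 250ms;
    FooBar(timeout);

    // Lossless conversions (seconds -> milliseconds) are implicit, so a
    // caller holding seconds still compiles; a lossy one would not.
    std::chrono::seconds coarse = 5s;
    FooBar(coarse);
  }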

Yes, and it's made worse by using "deciseconds," a unit of time I've used literally 0 times in my entire life. If you saw a message saying "I'll execute in 1ms," you'd look straight to your settings!
> Variables must have units in their names if there's any ambiguity

Then you end up with something where you can write "TimeoutSec=60" as well as "TimeoutSec=1min" in the case of systemd :)

I'd argue they'd have been better off not putting the unit there. But yes, aside from that particular weirdness, I fully agree.

I do that, but I can't help thinking that it smells like Hungarian notation.

The best alternative I've found is to accept units in the values, "5 seconds" or "5s". Then just "1" is an incorrect value.
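A rough sketch of that approach in C++ (not git's actual parser; the function name and accepted spellings are made up):

  #include <chrono>
  #include <cstddef>
  #include <exception>
  #include <optional>
  #include <string>

  // Require an explicit unit such as "500ms", "5s", or "5 seconds";
  // a bare "1" is rejected.
  std::optional<std::chrono::milliseconds> ParseTimeout(const std::string& text) {
    std::size_t pos = 0;
    long value = 0;
    try {
      value = std::stol(text, &pos);
    } catch (const std::exception&) {
      return std::nullopt;  // no leading number at all
    }
    while (pos < text.size() && text[pos] == ' ') ++pos;  // allow "5 seconds"
    const std::string unit = text.substr(pos);

    if (unit == "ms") return std::chrono::milliseconds(value);
    if (unit == "s" || unit == "seconds") return std::chrono::seconds(value);
    if (unit == "min" || unit == "minutes") return std::chrono::minutes(value);
    return std::nullopt;  // bare "1" or unknown unit -> invalid
  }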

That’s not automatically bad. There are two kinds of Hungarian notation: systems Hungarian, which duplicates information that the type system should be tracking; and apps Hungarian, which encodes information you’d express in types if your language’s type system were expressive enough. [1] goes into the difference.

[1] https://www.joelonsoftware.com/2005/05/11/making-wrong-code-...

And this is exactly the kind of thing the language should have a type for: Duration.
Not really.

I don't want to have a type for an integer in seconds, a type for an integer in minutes, a type for an integer in days, and so forth.

Just like I don't want to have a type for a float that means width, and another type for a float that means height.

Putting the unit (as opposed to the data type) in the variable name is helpful, and is not the same as types.

For really complicated stuff like dates, sure make a type or a class. But for basic dimensional values, that's going way overboard.
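For what it's worth, C++'s std::chrono::duration shows this doesn't have to mean one type per unit: it's a single generic type with the unit as a parameter, and lossless conversions between units are implicit.

  #include <chrono>

  using namespace std::chrono_literals;

  // One generic duration type, many units; conversions that lose no
  // precision are implicit and checked at compile time.
  static_assert(std::chrono::milliseconds(2min) == 120000ms,
                "2 minutes is 120000 milliseconds");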

> I insist on this all the time in code reviews. Variables must have units in their names if there's any ambiguity. For example, `int timeout` becomes `int timeout_msec`.

Personally I flag any such use of int in code reviews, and instead recommend using value classes to properly convey the unit (think Second(2) or Millisecond(2000)).

This of course depends on the language, its capabilities, and its norms.
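A bare-bones sketch of that value-class idea in C++ (the type and function names are illustrative, not from any particular library):

  #include <cstdint>

  // The unit lives in the type, so a bare integer can never be passed by accident.
  class Milliseconds {
   public:
    explicit constexpr Milliseconds(std::int64_t count) : count_(count) {}
    constexpr std::int64_t count() const { return count_; }
   private:
    std::int64_t count_;
  };

  class Seconds {
   public:
    explicit constexpr Seconds(std::int64_t count) : count_(count) {}
    // Conversion is spelled out and lossless, so units can't be confused.
    constexpr Milliseconds ToMilliseconds() const { return Milliseconds(count_ * 1000); }
   private:
    std::int64_t count_;
  };

  // SetTimeout(2000) no longer compiles; SetTimeout(Milliseconds(2000)) does.
  void SetTimeout(Milliseconds timeout);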

Yes! As it is, '1' is ambiguous, as it can mean "True" or '1 decisecond', and deciseconds are not a common time division. The units commonly used are either seconds or milliseconds. Using uncommon units should have a very strong justification.
Though, ironically, msec is still ambiguous because that could be milli or micro. It's often milli so I wouldn't fault it, but we use micros just enough at my workplace that the distinction matters. I would usually do timeout_micros or timeout_millis.
We use "ms" because it's the standard SI symbol. Microseconds would be "us" to avoid the µ.

In fact, our French keyboards do have a "µ" key (as far as I remember, it was added so that all SI prefixes could be written easily), but using non-ASCII symbols is always a bit risky.

Shouldn't that be named "usec"? But then again, I can absolutely see someone typing msec to represent microseconds.
ms for microseconds would be a paddlin'. The micro prefix is μ, but a "u" is sufficient for ease of typing with an ASCII alphabet.
can also do usec for micro
What would you call the current setting that takes both string enums and deciseconds?
It's almost like Git is a version control system built by developers who only knew Perl and C.