Saturday, September 5, 2020

js - Comparison with 0 always returns true?

I'm just printing a rectangle to the console to get somewhat familiar with JS. I have a nested loop (pseudocode for i < this.height, for j < this.width). If j is 0 or width - 1, I'd like to print out a | for the border:

if (j == this.width - 1 || j == 0) {
    c = '|';
}

and then c is appended to my output. The comparison to 0 returns true regardless of the value of j. My best guess is that JS isn't treating 0 as the number type I expect, or that I've written the comparison in some way that makes JS treat it as a wildcard. For reference, my output looks like this:

--------
||||||||
||||||||
||||||||
||||||||
||||||||
||||||||
--------

The -'s at the top and bottom are specified in the outer loop.
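A minimal self-contained sketch of the loop described above (the surrounding code isn't shown, so the variable names and the `-` logic here are guesses). One plausible cause of the all-`|` rows, reproduced below: the comparison itself is fine, but if `c` is only assigned inside the `if` and never reset to a blank between columns, then once `j == 0` sets it to `'|'` it stays `'|'` for the rest of the row:

```javascript
// Hypothetical stand-ins for this.width / this.height.
const width = 8;
const height = 8;

let out = '';
for (let i = 0; i < height; i++) {
    let c = ' '; // reset once per row, NOT once per column
    let row = '';
    for (let j = 0; j < width; j++) {
        if (i === 0 || i === height - 1) {
            c = '-'; // top and bottom border, from the outer loop
        } else if (j === width - 1 || j === 0) {
            c = '|'; // the comparison works; c just never reverts
        }
        row += c;
    }
    out += row + '\n';
}
console.log(out);
```

Moving `let c = ' ';` inside the inner loop (or adding an `else { c = ' '; }` branch) would make the interior of the rectangle blank again, which suggests the bug is in the reset logic rather than in the `== 0` comparison.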
