I recently started learning the basics of coding on Codecademy and was given a project that involves applying something called the "Luhn algorithm" to validate credit card numbers. I think I understand the algorithm, but I seem to be getting something wrong in my JavaScript code.
I'm using this as a sample to test whether it works:

const invalid1 = [4, 5, 3, 2, 7, 7, 8, 7, 7, 1, 0, 9, 1, 7, 9, 5];

I used that array for this piece of code:
const validateCred = arr => {
  for (let i = arr.length - 1; i >= 0; i -= 2) {
    for (let j = arr.length - 2; j >= 0; j -= 2) {
      arr[j] *= 2;
      if (arr[j] > 9) {
        arr[j] -= 9;
      }
      const redFunc = (tot, num) => {
        return tot + num;
      }
      return total = arr.reduce(redFunc, 0);
    }
  }
}
console.log(validateCred(invalid1));

The answer that logs is 82. I checked the math manually and found that it should be 85.
I figured out that the problem is that this line isn't registering:

arr[j] *= 2;

I've been at it for hours but can't, for the life of me, figure out how to fix it. Please help.