I'm brand new to Java and writing a simple program that reads the output of another program from the command line and processes it in a loop. No matter how many times I look at it, I can't figure out why the 'if (minValue > value)' check never changes minValue.
Here is the output, given input that should produce a nonzero minimum:
Count:3
Minimum: 0 @
Maximum: 75 @ DummyDate3 DummyTime3
Average: 45.00
Is this a result of the while loop?
int minValue = 0;
int maxValue = 0;
String minValueTime = "";
String minValueDate = "";
String maxValueDate = "";
String maxValueTime = "";
int count = 0;
double average = 0;
/*
 * For as long as input keeps arriving, each iteration reads a date,
 * a time, and a value. If the value is smaller than the current
 * minimum, the minimum becomes that value; if the value is larger
 * than the current maximum, the maximum becomes that value.
 */
while (input.hasNext()) {
    String date = input.next();
    String time = input.next();
    int value = input.nextInt();
    if (minValue > value) {
        minValue = value;
        minValueDate = date;
        minValueTime = time;
    }
    if (maxValue < value) {
        maxValue = value;
        maxValueDate = date;
        maxValueTime = time;
    }
    count++;
    average = average + value;
}
input.close();
System.out.printf(
        "Count:%d%nMinimum: %d @ %s %s%nMaximum: %d @ %s %s%nAverage: %.2f%n",
        count, minValue, minValueDate, minValueTime,
        maxValue, maxValueDate, maxValueTime, average / count);
}
}
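From what I can tell, the problem may be that minValue starts at 0 and none of the readings are negative, so 'minValue > value' can never be true, while 'maxValue < value' fires as soon as any positive value arrives. Would starting minValue at Integer.MAX_VALUE (and, symmetrically, maxValue at Integer.MIN_VALUE) fix it, so that the first reading always replaces both? Below is a minimal, self-contained sketch of what I mean. The class name, the Scanner fed from a literal string, and the sample readings 20/40/75 are made-up stand-ins for the real input, chosen only so the count, maximum, and average match the output above.

import java.util.Scanner;

public class MinMaxSketch {
    public static void main(String[] args) {
        // Made-up sample input in the same "date time value" shape;
        // the real program presumably reads from System.in instead.
        Scanner input = new Scanner(
                "DummyDate1 DummyTime1 20 "
              + "DummyDate2 DummyTime2 40 "
              + "DummyDate3 DummyTime3 75");

        // Start the extremes at opposite ends of the int range so the
        // very first reading replaces both of them.
        int minValue = Integer.MAX_VALUE;
        int maxValue = Integer.MIN_VALUE;
        String minValueDate = "", minValueTime = "";
        String maxValueDate = "", maxValueTime = "";
        int count = 0;
        double sum = 0;

        while (input.hasNext()) {
            String date = input.next();
            String time = input.next();
            int value = input.nextInt();

            if (value < minValue) {   // now true for the first reading
                minValue = value;
                minValueDate = date;
                minValueTime = time;
            }
            if (value > maxValue) {
                maxValue = value;
                maxValueDate = date;
                maxValueTime = time;
            }
            count++;
            sum += value;
        }
        input.close();

        System.out.printf(
                "Count:%d%nMinimum: %d @ %s %s%nMaximum: %d @ %s %s%nAverage: %.2f%n",
                count, minValue, minValueDate, minValueTime,
                maxValue, maxValueDate, maxValueTime, sum / count);
    }
}

An alternative I've seen is to read the first date/time/value before the loop and seed minValue and maxValue from it, which avoids the sentinel values entirely.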