if (m.options.hicColors != null) {
    // Hard-coded mapping: 4 cutoffs select one of 5 colors.
    if (d.hic <= m.options.cutoffs[0]) {
        d.fillColor = m.options.hicColors[0];
        return m.options.hicColors[0];
    } else if (d.hic > m.options.cutoffs[0] && d.hic <= m.options.cutoffs[1]) {
        d.fillColor = m.options.hicColors[1];
        return m.options.hicColors[1];
    } else if (d.hic > m.options.cutoffs[1] && d.hic <= m.options.cutoffs[2]) {
        d.fillColor = m.options.hicColors[2];
        return m.options.hicColors[2];
    } else if (d.hic > m.options.cutoffs[2] && d.hic <= m.options.cutoffs[3]) {
        d.fillColor = m.options.hicColors[3];
        return m.options.hicColors[3];
    } else {
        // d.hic is above every cutoff, so the last color applies.
        d.fillColor = m.options.hicColors[4];
        return m.options.hicColors[4];
    }
}
The program I am working on takes a set of shapes and colors them based on two arrays: one containing the colors to use, the other containing the cutoff values that decide which color is applied. Previously there were always 5 colors and 4 cutoffs, but now I need to accept arrays of any length, and I'm not sure how to accomplish this. Does anyone have any suggestions on how I could do this?
Note that the length of the color array is always one more than the cutoff array.
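One way this could be generalized is to replace the hard-coded chain with a loop over the cutoff array, since each branch follows the same pattern: the first cutoff that d.hic does not exceed selects the color at the same index, and the extra final color covers values above every cutoff. Below is a minimal sketch under those assumptions; the function name pickHicColor is made up for illustration, it expects the same m and d objects as the snippet above, and it assumes m.options.cutoffs is sorted in ascending order with m.options.hicColors one element longer.

// Sketch: pick a fill color for d.hic from arbitrary-length arrays.
// Assumes cutoffs are sorted ascending and hicColors.length === cutoffs.length + 1.
function pickHicColor(d, m) {
    var cutoffs = m.options.cutoffs;
    var colors = m.options.hicColors;
    if (colors == null) {
        return;
    }
    // The first cutoff that d.hic does not exceed picks the matching color.
    for (var i = 0; i < cutoffs.length; i++) {
        if (d.hic <= cutoffs[i]) {
            d.fillColor = colors[i];
            return colors[i];
        }
    }
    // d.hic is above every cutoff, so use the last (extra) color.
    d.fillColor = colors[colors.length - 1];
    return colors[colors.length - 1];
}

Because the loop indexes both arrays by the same i, it works for any pair of arrays as long as the color array stays one element longer than the cutoff array.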