I wonder how I can combine two equations in an efficient way. The first is an analytic expression that, among other variables, depends on a temperature T. I write it in Python in the following way:

def Q_1(den, r, T):
    return 3./2 * (k * (T - eq(den, r)) * den) / (mu * time_constant)
The second expression comes from interpolating a table in which the values of T start at T = 10^3.8. That's why I am setting bounds_error=False, fill_value=0.:
f = interp1d(self.logT, self.logQ_2, bounds_error=False, fill_value=0.)

def Q_2(den, T):
    L = 10**f(T)
    n = den * X * mass**-1
    return L * n**2 * den**-1
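For reference, the fill behaviour can be checked on a toy log-log table (the grid values below are made up for illustration; np.interp with left/right fill values is used as a numpy stand-in for interp1d with bounds_error=False, fill_value=0.):

```python
import numpy as np

# Toy log-log cooling table (values made up for illustration);
# np.interp(..., left=0., right=0.) mimics interp1d with
# bounds_error=False, fill_value=0.
logT = np.linspace(3.8, 8.0, 10)
logQ = np.linspace(-23.0, -21.0, 10)

def f(T):
    return np.interp(T, logT, logQ, left=0., right=0.)

print(f(5.0))   # interpolated from the table
print(f(2.0))   # 0.0: below the table; note 10**f(2.0) is then 1, not 0
```

One gotcha this exposes: since the table is in log space, a fill value of 0 makes 10**f(T) evaluate to 1 outside the table, not 0.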
What I want to do is something like total_Q = Q_1 + Q_2, in such a way that Q_1 is applied where T < 10^3.8 and Q_2 where T >= 10^3.8. I want to avoid if statements, since I have to do this for hundreds of thousands of values of T, den, and r; i.e. what I want to avoid is the following:
def total_Q(den, r, T):
    Q = []
    # rad renamed from k to avoid shadowing the constant k used in Q_1
    for t, d, rad in zip(T, den, r):
        if t < 10**3.8:
            Q.append(Q_1(d, rad, t))
        else:  # t >= 10**3.8
            Q.append(Q_2(d, t))
    return Q
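For what it's worth, the selection above can be sketched fully vectorized with np.where. The branch functions below are placeholders, since the real constants (k, mu, eq, the cooling table) are not shown; the key point is that np.where evaluates both branches on the full arrays and only then selects element-wise, so the interpolated branch must be fed a clipped T:

```python
import numpy as np

T_SPLIT = 10**3.8

def Q_1(den, r, T):
    # Placeholder for the analytic branch
    return den * r * T

def Q_2(den, T):
    # Placeholder for the interpolated branch
    return den**2 * np.log10(T)

def total_Q(den, r, T):
    # np.where evaluates BOTH branches everywhere, then selects
    # element-wise; clipping keeps Q_2 away from out-of-range T.
    T_safe = np.maximum(T, T_SPLIT)
    return np.where(T < T_SPLIT, Q_1(den, r, T), Q_2(den, T_safe))

T = np.array([10**3.0, 10**5.0])
den = np.array([1.0, 1.0])
r = np.array([2.0, 2.0])
print(total_Q(den, r, T))   # first element uses Q_1, second uses Q_2
```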
What I tried so far is:
def total_Q(den, r, T):
    return Q_1(den, r, T) * (T < 10**3.8) + Q_2(den, T) * (T >= 10**3.8)
The problem is that this last form raises an error saying that the interpolation is out of bounds.
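One likely culprit: Q_2(den, T) is evaluated on the full T array before the boolean multiply ever happens, so out-of-range temperatures still reach the interpolator. A variant that only passes in-range values to each branch can be sketched with boolean indexing (again with placeholder branch functions, since the real constants aren't shown):

```python
import numpy as np

T_SPLIT = 10**3.8

def Q_1(den, r, T):
    # Placeholder for the analytic branch
    return den * r * T

def Q_2(den, T):
    # Placeholder for the interpolated branch
    return den**2 * np.log10(T)

def total_Q(den, r, T):
    Q = np.empty_like(T, dtype=float)
    low = T < T_SPLIT
    # Each branch only ever sees the temperatures it can handle
    Q[low] = Q_1(den[low], r[low], T[low])
    Q[~low] = Q_2(den[~low], T[~low])
    return Q

T = np.array([10**3.0, 10**5.0])
den = np.array([1.0, 1.0])
r = np.array([2.0, 2.0])
print(total_Q(den, r, T))   # first element uses Q_1, second uses Q_2
```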
Any ideas?