Tuesday, December 27, 2016

for loop taking too long to produce/export output in Python

This question is a follow-up to a previous question, "for loop taking too long to produce output", that I asked earlier today. As advised in a comment there, I used pandas instead of xlrd to read the Excel files. Here is the program I wrote:

   import pandas as pd
   import numpy as np

   no_of_columns = 10000

   # Raw strings so the backslashes in the Windows paths are not treated
   # as escape sequences; the keyword argument is "names", not "name".
   Book1 = pd.read_excel(r"D:\Python\Book1.xlsx", header=None, names=range(no_of_columns))
   Book2 = pd.read_excel(r"D:\Python\Book2.xlsx", header=None, names=range(no_of_columns))
   Book3 = pd.read_excel(r"D:\Python\Book3.xlsx", header=None, names=range(no_of_columns))

   # Rows are 0-indexed, so 11000 rows are 0..10999; range(1, 11001)
   # would skip the first row and run one past the end.
   for i in range(11000):
      for j in range(10000):
         if Book1.iloc[i, j] == 100 and Book2.iloc[i, j] == 150 and Book3.iloc[i, j] == 150:
            print(1)
         else:
            print(0)

But this also didn't solve the problem. The program is still running (it has been 5 hours), and the text file I am redirecting the output to is still 0 bytes. Again, is there anything wrong with the program? Why has the file size stayed the same since the start of execution? I have run similarly large loops in R, and whenever I exported the output to a text or Excel file, the file in my directory kept growing as the loop progressed. Why isn't that happening here? What should I do?
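(For context, the per-cell `iloc` loop above does roughly 110 million indexed lookups, which is why it runs for hours, and redirected `print` output is buffered, so the file can stay at 0 bytes until the buffer flushes. A minimal sketch of a vectorized alternative, using small hypothetical stand-in frames in place of the real `Book1`/`Book2`/`Book3` workbooks and an assumed output path `output.txt`:)

   import pandas as pd
   import numpy as np

   # Hypothetical 2x2 frames standing in for the real 11000x10000 workbooks.
   b1 = pd.DataFrame([[100, 1], [100, 100]])
   b2 = pd.DataFrame([[150, 2], [150, 150]])
   b3 = pd.DataFrame([[150, 3], [150, 150]])

   # Test all three conditions elementwise in one shot -> boolean DataFrame.
   mask = (b1 == 100) & (b2 == 150) & (b3 == 150)

   # Convert True/False to 1/0 and write the whole result in one call,
   # instead of printing cell by cell through a buffered stream.
   np.savetxt("output.txt", mask.astype(int).values, fmt="%d")

The comparisons run in compiled NumPy code rather than a Python-level double loop, and the single `savetxt` call writes and flushes the file at once.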
