Python – ModuleNotFoundError: No module named 'numpy.testing.nosetester'

Tags: import-error, machine-learning, nose, numpy, python

I was using a decision tree when this error was raised. The same thing happened when I used back-propagation. How can I solve it?

import pandas as pd
import numpy as np
a = np.test()
f = open('E:/lgdata.csv')
data = pd.read_csv(f,index_col = 'id')

x = data.iloc[:,10:12].as_matrix().astype(int)
y = data.iloc[:,9].as_matrix().astype(int)

from sklearn.tree import DecisionTreeClassifier as DTC
dtc = DTC(criterion='entropy')
dtc.fit(x,y)
x=pd.DataFrame(x) 

from sklearn.tree import export_graphviz
with open('tree.dot','w') as f1:
    f1 = export_graphviz(dtc, feature_names = x.columns, out_file = f1)

Traceback (most recent call last):
  File "<ipython-input-40-4359c06ae1f0>", line 1, in <module>
    runfile('C:/ProgramData/Anaconda3/lib/site-packages/scipy/_lib/_numpy_compat.py', wdir='C:/ProgramData/Anaconda3/lib/site-packages/scipy/_lib')
  File "C:\ProgramData\Anaconda3\lib\site-packages\spyder\utils\site\sitecustomize.py", line 710, in runfile
    execfile(filename, namespace)
  File "C:\ProgramData\Anaconda3\lib\site-packages\spyder\utils\site\sitecustomize.py", line 101, in execfile
    exec(compile(f.read(), filename, 'exec'), namespace)
  File "C:/ProgramData/Anaconda3/lib/site-packages/scipy/_lib/_numpy_compat.py", line 9, in <module>
    from numpy.testing.nosetester import import_nose

ModuleNotFoundError: No module named 'numpy.testing.nosetester'

Best Answer

This is happening due to a version incompatibility between numpy and scipy: recent numpy releases have deprecated (and eventually removed) numpy.testing.nosetester, while older scipy versions still try to import it.
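To confirm that such a mismatch is the cause on your machine, a quick version check helps. This is only a diagnostic sketch, not part of the fix itself:

# Print the installed versions to spot the numpy/scipy mismatch.
import numpy
import scipy

print("numpy:", numpy.__version__)
print("scipy:", scipy.__version__)

# With a recent numpy and an old scipy, importing scipy's submodules (or
# scikit-learn, which uses them) reaches the old
# `from numpy.testing.nosetester import import_nose` line and fails with
# the ModuleNotFoundError shown in the traceback above.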

Replicating the issue

pip install "numpy>=1.18"   # numpy 1.18 or newer
pip install "scipy<=0.19.0" # scipy 0.19 or older (quotes keep the shell from treating <= as a redirection)

and then running

from sklearn.tree import DecisionTreeClassifier as DTC

triggers the error.

Fixing the error

Upgrade scipy to a newer version that is compatible with your numpy, for example:

pip install numpy==1.18
pip install scipy==1.1.0
pip install scikit-learn==0.21.3

You are not limited to these exact versions: upgrading the above libraries to their latest stable releases should also get rid of this error.
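After reinstalling, re-running the import that failed before is a quick smoke test. A minimal sketch, assuming the versions above or newer:

# If this import succeeds, the numpy/scipy incompatibility is resolved.
from sklearn.tree import DecisionTreeClassifier as DTC

dtc = DTC(criterion='entropy')
print("DecisionTreeClassifier ready:", dtc)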
